
Amazon Web Services Advances Lambda Serverless with New Capabilities

By Sean Michael Kerner
Posted November 30, 2017


The Amazon cloud began as an idea about providing elastic capacity for virtualized server instances. In recent years that idea has expanded significantly, and one of the most active areas is the serverless category, which removes the need to run dedicated virtualized servers in order to deliver cloud services.

At the AWS re:Invent conference on Nov. 30, AWS CTO Werner Vogels announced multiple new serverless features and services to help further disaggregate the cloud from the soon-to-be legacy model of virtualized servers.

The cornerstone of the AWS serverless portfolio is the AWS Lambda service, which is now getting new features that further expand its capabilities. One of the new capabilities is AWS API Gateway VPC (Virtual Private Cloud) integration.

"You can run your VPCs and your gateways with Lambda," Vogels said during his re:Invent keynote.

Lambda is also getting better concurrency controls. Vogels explained that it is now easier for developers to configure how many instances of a given function can run in parallel. Additionally, Lambda now supports up to 3GB of memory for larger functions. Finally, Lambda is being expanded with support for the .NET Core 2.0 runtime and the Go programming language.
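The per-function limit can be set from the console or programmatically. Below is a minimal sketch, assuming the AWS SDK for Go and its PutFunctionConcurrency operation; the function name, region, and limit of 100 are placeholder values for illustration.

```go
package main

import (
	"fmt"
	"log"

	"github.com/aws/aws-sdk-go/aws"
	"github.com/aws/aws-sdk-go/aws/session"
	"github.com/aws/aws-sdk-go/service/lambda"
)

func main() {
	// Create a session using the default credential chain; region is a placeholder.
	sess := session.Must(session.NewSession(&aws.Config{Region: aws.String("us-east-1")}))
	svc := lambda.New(sess)

	// Reserve concurrency for a single function so it uses at most
	// 100 parallel executions of the account's concurrency pool.
	// "my-function" is a hypothetical function name.
	_, err := svc.PutFunctionConcurrency(&lambda.PutFunctionConcurrencyInput{
		FunctionName:                 aws.String("my-function"),
		ReservedConcurrentExecutions: aws.Int64(100),
	})
	if err != nil {
		log.Fatalf("failed to set reserved concurrency: %v", err)
	}
	fmt.Println("reserved concurrency set to 100 for my-function")
}
```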

"I'm very happy that we now have Go support; many of you have been asking for that," Vogels said.

AWS Serverless Application Repository

Looking beyond Lambda itself, Vogels said AWS wants to make it easier for organizations to build and deploy serverless workloads, which is where the AWS Serverless Application Repository comes into play.

"Anyone can put their serverless functions into the repository so they can be re-used," Vogels said. "Quite a few of our partners are putting their functions into the serverless repository."

Aurora Serverless

The Amazon Aurora database is now also going serverless, providing a new option for users who don't want or need to manage ongoing database capacity.

"The database automatically starts, scales, and shuts down based on application workload," Amazon stated in a press release. "Customers simply create an endpoint through the AWS Management Console, specify the minimum and maximum capacity needs of their application, and Amazon Aurora handles the rest."

Sean Michael Kerner is a senior editor at ServerWatch and InternetNews.com. Follow him on Twitter @TechJournalist.
