
Serverless

The latest News and Information on Serverless Monitoring, Management, Development and related cloud technologies.

The Complete AWS Lambda Handbook for Beginners (Part 1)

Welcome to the Serverless world. One of the first things you’ll hear about is AWS Lambda - and you’ll keep hearing about it! While an architecture can be serverless without Lambda functions involved, Lambda is very often the key component of a serverless application. In the first post of this 3-part AWS Lambda Handbook series, we run through what AWS Lambda is, dial back to basics on the terminology, and cover how to create a Lambda function and how to run it.
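To make the terminology concrete, here is a minimal sketch of a Lambda function written in Python; the file name, function name and event fields are illustrative rather than taken from the handbook itself:

    # handler.py - a minimal AWS Lambda function in Python.
    # Lambda calls lambda_handler with the invocation payload (event)
    # and runtime metadata (context) on every invocation.
    import json

    def lambda_handler(event, context):
        # Read a value from the incoming event, falling back to a default.
        name = event.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"Hello, {name}!"})
        }

Once deployed, a function like this can be run from the console’s test feature or from the command line with aws lambda invoke.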

Exploring AWS Lambda Deployment Limits

We have explored how to deploy Machine Learning models using AWS Lambda. Deploying ML models with AWS Lambda is well suited to early-stage projects, as Lambda functions come with certain limitations. However, that is no reason to worry if you need to use AWS Lambda to its full potential for your Machine Learning project. When working with Lambda functions, the size of the deployment package is a constant worry for developers.
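As a rough illustration of those limits, the snippet below uses boto3 (assumed to be installed, with AWS credentials configured) to print the package-size limits Lambda reports for an account:

    # Inspect the Lambda code-size limits reported for the current account.
    import boto3

    lambda_client = boto3.client("lambda")
    limits = lambda_client.get_account_settings()["AccountLimit"]

    # CodeSizeZipped: largest deployment package (.zip) you can upload directly.
    # CodeSizeUnzipped: largest size the unpacked code and dependencies may reach.
    print("Max zipped package size (bytes):  ", limits["CodeSizeZipped"])
    print("Max unzipped package size (bytes):", limits["CodeSizeUnzipped"])
    print("Total code storage (bytes):       ", limits["TotalCodeSize"])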

The Great Irony of Serverless Computing

Working with Serverless computing is like riding an electric bike. You get speed, flexibility and automatic assistance to scale with ease. Development is usually hassle-free because you can focus on code and only pay for what you use. Except when your users hit an error. Debugging that issue feels like your bike’s battery just died while climbing a steep hill.

How to Debug Slow Lambda Response Times

When you build your application on top of Lambda, AWS automatically scales the number of “workers” (think containers) running your code based on traffic. By default, your functions are deployed across three Availability Zones (AZs), which gives you a lot of scalability and redundancy out of the box. For API functions, every user request is processed by a separate worker, so API-level concurrency is handled by the platform.
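When response times do creep up, per-call timing inside the handler is often the quickest way to see where the time goes. The sketch below is illustrative; query_database stands in for whatever downstream dependency your function actually calls:

    # Time an individual downstream call so its duration shows up in CloudWatch Logs.
    import time
    import logging

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)

    def query_database(event):
        # Placeholder for a real dependency (DynamoDB, RDS, another API, ...).
        time.sleep(0.1)
        return {"items": []}

    def lambda_handler(event, context):
        start = time.perf_counter()
        result = query_database(event)
        elapsed_ms = (time.perf_counter() - start) * 1000
        # Compare this against the overall duration Lambda reports per invocation.
        logger.info("query_database took %.1f ms", elapsed_ms)
        return {"statusCode": 200, "body": str(result)}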

Serverless vs. Containers: Key Differences Explained

The “as a service” business model continues to grow rapidly, largely thanks to the rise of cloud computing. “As a service” offerings deliver IT products and technologies such as software, hardware, and data storage to consumers over the Internet, so they don’t have to install or manage those products themselves. Serverless and containers are two such “as a service” technologies that have seen increasing adoption in recent years.

The Ultimate Guide to Monitoring Serverless Applications

Serverless applications, more often than not, have logic distributed over multiple functions and services, and as they grow, with agents and wrappers attached, they can become more complex and costly. This is where Serverless monitoring comes in to help. But what is Serverless monitoring? Serverless monitoring gives developers important insight into what happens during each execution and event: errors become more easily visible, and resource consumption can be measured for each invocation.
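In its simplest form, that insight can come from structured, per-invocation logging inside the function itself; the field names below are illustrative, not the schema of any particular monitoring product:

    # Emit one structured log record per invocation, capturing timing,
    # memory configuration and any error, so each execution is visible in the logs.
    import json
    import time

    def lambda_handler(event, context):
        start = time.perf_counter()
        error = None
        try:
            result = {"ok": True}        # business logic goes here
            return result
        except Exception as exc:         # make failures visible instead of silent
            error = str(exc)
            raise
        finally:
            print(json.dumps({
                "request_id": context.aws_request_id,
                "memory_limit_mb": context.memory_limit_in_mb,
                "remaining_time_ms": context.get_remaining_time_in_millis(),
                "duration_ms": round((time.perf_counter() - start) * 1000, 1),
                "error": error,
            }))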

10 amazing benefits of serverless technology

If you’ve heard of the term “serverless architecture” but you have no idea what it means, don’t feel bad. This buzzword (buzz-phrase?) has been on the rise for the past few years, and it doesn’t look like it’s going to stop. More and more cloud companies are trying to promote serverless architectures for businesses. There are plenty of benefits to using a serverless architecture for your business.