State of the Web: Serverless Functions


Serverless functions (also known as Function as a Service, or FaaS) execute code statelessly in the cloud. This means they can do things like scale automatically with demand and run anywhere. There are many different serverless function providers, like AWS Lambda, Cloudflare Workers, and Vercel.



Background of Serverless Functions

Google created the first serverless model with its Google App Engine product, which offered auto-scaling, stateless code execution. App Engine differed from most more recent serverless function providers in that it ran full applications rather than individual functions, but it was the first product to try the idea. However, while it was used by companies like Snapchat, it did not catch on with the broader developer community.

The first Function as a Service (FaaS) provider to truly catch on was AWS Lambda, which Amazon released in November 2014. Lambda lets functions written in many different languages scale automatically in under a second, without users having to worry about the underlying hardware. Companies like Google, Microsoft, and Oracle later created their own serverless function services. Still, Lambda remains the most popular serverless function provider to this day, and since its release it has become faster, more flexible, and easier to use.

However, that is not the end of the story. Since then, many services have improved on AWS Lambda's model in areas like ease of use and performance. One of the first notable newer FaaS providers was Vercel (ZEIT Now at the time), released in April 2016, which was much simpler to use than Lambda. Another selling point was that it integrated well with Next.js, a React framework made by Vercel. Other newer services, like Begin, also aim for simplicity, although Vercel is still the most popular in that group.

The second major innovation in serverless functions was edge computing with lightweight isolates, pioneered by Cloudflare Workers, a serverless product released in September 2017. It promised to run your code at any of the many points of presence Cloudflare operates worldwide, and it used V8 isolates to cut cold-start times to a few milliseconds and, later, effectively to zero.



Why Serverless Functions are Significant



Performance

Many serverless function providers offer high-speed services. As discussed in the background section, edge computing has revolutionized serverless functions. Because serverless functions are stateless, they do not need to always run in the same place. This means they can work like CDNs and automatically serve requests from data centers close to users (the “edge”) rather than from one centralized location. Serving from the edge can make a huge difference in latency on large networks like Cloudflare’s. Not all serverless function providers support this, but a growing number do, like Netlify, Cloudflare Workers, Vercel, AWS Lambda@Edge, and more.
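
To make this concrete, here is a minimal sketch of an edge function written in the Cloudflare Workers module style. The cf.colo field is Cloudflare-specific request metadata naming the data center that handled the request; the exact shape of that object is simplified here, and other edge providers expose similar information differently.

```typescript
// Minimal sketch of an edge function in the Cloudflare Workers "module" style.
// cf is Cloudflare-specific request metadata; cf.colo names the point of
// presence that handled the request, so the same code responds from whichever
// edge location is closest to the user.
interface EdgeRequest extends Request {
  cf?: { colo?: string };
}

export default {
  async fetch(request: EdgeRequest): Promise<Response> {
    const colo = request.cf?.colo ?? "unknown";
    return new Response(`Hello from the edge (data center: ${colo})`, {
      headers: { "content-type": "text/plain" },
    });
  },
};
```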



Scalability

Unlike virtual machines, serverless functions can usually scale from zero to effectively unlimited capacity. This means you never get overloaded by requests, and you don’t waste money on computing power you are not using. Whenever a request hits a function’s HTTP endpoint, most serverless function providers check whether an already-running instance can process it; if not, a new instance is created. Additionally, instances that are not processing anything are automatically stopped. Some virtual machine and container services also offer autoscaling, but because virtual machines and containers take longer to start, the scaling is much less granular.
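
A conceptual sketch of that request-routing logic, not any provider’s actual implementation, might look like the following; the instance structure, timeout, and function names are all illustrative.

```typescript
// Conceptual sketch of FaaS request routing: reuse a warm instance if one is
// free, cold-start a new one otherwise, and reap instances that sit idle too
// long so capacity scales back toward zero. Real providers add queuing,
// per-instance concurrency limits, placement decisions, and more.

interface Instance {
  busy: boolean;
  idleSince: number;
}

const instances: Instance[] = [];
const IDLE_TIMEOUT_MS = 60_000; // hypothetical scale-down timeout

async function routeRequest(handler: () => Promise<void>): Promise<void> {
  // Warm start: an existing instance is free to take the request.
  let instance = instances.find((i) => !i.busy);
  if (!instance) {
    // Cold start: no free instance, so create one.
    instance = { busy: false, idleSince: Date.now() };
    instances.push(instance);
  }
  instance.busy = true;
  try {
    await handler();
  } finally {
    instance.busy = false;
    instance.idleSince = Date.now();
  }
}

// Called periodically by the platform to stop idle instances.
function reapIdleInstances(): void {
  const now = Date.now();
  for (let i = instances.length - 1; i >= 0; i--) {
    if (!instances[i].busy && now - instances[i].idleSince > IDLE_TIMEOUT_MS) {
      instances.splice(i, 1);
    }
  }
}
```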



Ease of Setup

Serverless functions usually abstract away the underlying hardware (hence “serverless”). Instead of you worrying about setting up servers and operating systems, the provider takes care of everything. This is not entirely unique to serverless functions, since containers and virtual machines also hide the physical hardware, but with serverless functions you don’t even need to worry about the operating system or the software running your code. The advantage of not managing hardware or an operating system is that you can get started much faster and have far less to worry about.
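
To illustrate how little there is to set up, here is what a complete serverless function can look like. The file location and exact handler signature vary by provider; this sketch assumes a runtime that accepts web-standard Request/Response handlers, which several modern providers do.

```typescript
// A complete deployable serverless function: one exported handler, with no
// server, operating system, or web framework configured anywhere in the
// project. The provider supplies the runtime and routes HTTP requests here.
export default async function handler(request: Request): Promise<Response> {
  const name = new URL(request.url).searchParams.get("name") ?? "world";
  return new Response(JSON.stringify({ message: `Hello, ${name}!` }), {
    headers: { "content-type": "application/json" },
  });
}
```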



The State of Serverless Functions



Language support

You can use almost any language with serverless functions. Whether you write JavaScript, Go, or C, most serverless function providers support it. However, if you use a V8-based serverless function setup (as most edge serverless platforms do), language support can be more limited. Since V8 is primarily a JavaScript engine, the best-supported language is JavaScript. Sometimes, though, you want to use a language that cannot compile to JavaScript. The solution is often WebAssembly, a portable assembly-like format that most modern languages support as a compilation target. WebAssembly has other advantages too, such as often delivering better performance. For more information, you can look at our article on WebAssembly.
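
As a rough illustration, here is how a TypeScript serverless function might call into a WebAssembly module compiled from another language. The module (one exporting an add function) and the way its bytes reach the function are assumptions, since bundling differs between providers.

```typescript
// Sketch: calling a WebAssembly module from a TypeScript serverless function.
// wasmBytes is assumed to be supplied by the bundler or provider; the module
// and its exported add(a, b) function are hypothetical.
declare const wasmBytes: ArrayBuffer;

export default {
  async fetch(_request: Request): Promise<Response> {
    // Compile and instantiate the module in one step.
    const { instance } = await WebAssembly.instantiate(wasmBytes);
    const add = instance.exports.add as (a: number, b: number) => number;
    return new Response(`2 + 3 = ${add(2, 3)}`);
  },
};
```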



Production readiness

Serverless functions are used by many different companies and are supported by some of the biggest names in web hosting, like AWS and Cloudflare. While serverless functions are relatively new, they are very much production-ready.



Running on the Edge

Edge serverless functions are, for the most part, still experimental. However, some services are battle-tested, namely AWS Lambda@Edge and Cloudflare Workers. These services have existed for multiple years and are used by companies like npm and Amazon. There are also more recent offerings from companies like Vercel, Netlify, and Fastly.



Conclusion

That is it! Hopefully, you now understand serverless functions, why they are helpful, and what state they are currently in. If you liked this article, be sure to sign up for the mailing list here. I hope you learned something, and thanks for reading.

