The serverless computing paradigm is a relatively new way of leveraging cloud infrastructure. Over the last few years, many applications have been conceived as serverless applications, and even the big cloud vendors are promoting the model heavily. The major benefit is cost, both in terms of development and ownership. Another advantage is the ease of development and maintenance.
If you or your organization is looking to evolve a legacy cloud-hosted application towards the serverless paradigm, then this blog post is for you. We will cover the generic use cases for which the serverless paradigm is well suited, along with a list of the major players offering serverless platforms. So if you are looking to build a serverless application, read on to find out whether your use case makes the cut.
First, a look at what is really meant by serverless.
What Exactly Do We Mean by Serverless Computing
Whether you are hosting a website or building a web-based application, you need computing resources to process incoming user requests. A typical computing infrastructure looks somewhat like this.
This is a generic system comprising an input, a processing and an output block. All information systems are based on this simple model: processing input information and storing it, then retrieving it back as output.
Each of the tasks performed by this generic system requires computing power. Depending upon the complexity and scalability of the application, it can be handled by one or more servers, wherein each server is assigned a fixed role.
In the serverless paradigm, servers are not static in their role. For the end user of the application, this does not matter at all. But this distinction is very important from a developer’s point of view.
In a traditional cloud computing environment, a bunch of servers support a platform and are each assigned a fixed, application-specific role. Take the case of a web application. The servers run a Linux platform (for example, a LAMP stack) and perform a certain function, split between the web, application and database servers, which results in some data being retrieved and processed for display on the end user's browser.
In a serverless environment, the server does not have an application-specific role. Rather, it can switch roles and perform any function, as long as that function is portable and supported by the server's underlying platform.
If you are a developer building a web application, then deploying it on a server the traditional way entails choosing and allocating the server and then installing and executing the application binaries. Instead, if you decide to go serverless, you would just focus on the application code and let the cloud platform handle the server allocation for you, on demand. As and when you demand, a server will be available to execute your application code.
This is a significant mind shift from the traditional cloud computing paradigm. Now, you do not rent the server's uptime based on calendar days. You only pay for the server's time slice that is used to process your application functionality while serving a user request.
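To make the shift concrete, here is a minimal sketch of a FaaS-style function handler, the only code you own in a serverless deployment. The `event`/`context` signature is typical of platforms like AWS Lambda, but the exact event shape varies by provider.

```python
# A minimal FaaS-style handler: the only code the developer owns.
# The platform allocates a server, invokes this function per request,
# and bills only for the execution time slice.
def handler(event, context=None):
    """Handle one request; 'event' carries the request payload."""
    name = event.get("name", "world")
    return {"statusCode": 200, "body": f"Hello, {name}!"}
```

There is no server to provision, no process to keep alive; the platform decides where and when this function runs.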
For a detailed lowdown on the technical, deployment and maintenance aspects of applications built with serverless, check out this resource.
How Can You Leverage Serverless Computing
The serverless computing paradigm is made possible by advances in containerized runtime environments. Docker is the best-known example of containerization technology in real-world deployment. A containerized system allows the operating system to be split into multiple independent partitions, each of which can execute an application.
Here is a nice article on what containerization is and how it differs from virtualization.
You might wonder why we need containerization for running separate applications when modern operating systems can run multiple applications anyway. Think of a container as a slice of the operating system's resources, carved out as a separate partition. If you slice the operating system into four containers, then you can install four independent applications in those containers, with independent file systems and configurations, without them interfering with one another.
The serverless paradigm leverages container-based application deployment and adds an orchestration layer that allows the servers to serve requests in a pooled fashion. So instead of a dedicated server per application, a set of pooled containers serves many applications based on incoming requests.
If serverless is so beneficial, then why don't we shift all applications to the serverless model? Like all technological innovations, it comes with trade-offs, and it is worthwhile to consider them before you jump on the serverless bandwagon.
1. Limited Server Customization : Firstly, adopting a serverless computing platform means that you are not in control of the server; you only control your application's business logic. The server is orchestrated by the cloud service provider, so you cannot tweak the server's system resources at will.
2. Diminishing Returns With Scale : You pay by the time slice of the server's computing time, not by the server's uptime. If your application makes 24 calls to the server in a day and each call results in a 1-minute computation, then you are paying only for 24 minutes of server time, which is minimal. However, if you are making 1440 calls a day, do the math: you are paying for a full day of compute. Serverless does not yield any benefit when your application function is invoked very frequently and is computationally heavy, so certain applications do not fit the bill. Examples include multimedia and real-time interactive streaming, and hugely scalable systems that handle millions of requests every second.
3. "Less" is a Misnomer : Serverless does not mean the absence of servers. It only means that the server serves you less, and only as and when required; the server is still present. So the provider of the serverless computing platform still has to bear the cost of the server infrastructure. If you are developing an application, it is in your best interest to leverage one of the cloud providers that offer serverless computing as a service, instead of hosting your own serverless platform.
4. Elastic Scalability Can Be Overkill : Containers are elastic and very flexible when it comes to spawning or releasing container instances. If your application consumes a fixed amount of computing resources, then you do not need this flexibility and can do away with containers. Examples include long-running processes and backup scripts.
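The "do the math" point in trade-off 2 can be sketched with a back-of-the-envelope calculation. The prices below are illustrative assumptions, not any provider's actual rate card.

```python
# Back-of-the-envelope cost comparison: pay-per-timeslice vs. a
# rented server. All prices are made up for illustration.
PRICE_PER_COMPUTE_SECOND = 0.0001  # assumed serverless rate, dollars
DEDICATED_SERVER_PER_DAY = 2.40    # assumed flat daily rental, dollars

def serverless_daily_cost(calls_per_day, seconds_per_call):
    return calls_per_day * seconds_per_call * PRICE_PER_COMPUTE_SECOND

# 24 one-minute calls a day: you pay for 24 minutes of compute.
light = serverless_daily_cost(24, 60)
# 1440 one-minute calls a day: the entire day is billed, and the
# pay-per-slice model now costs more than simply renting a server.
heavy = serverless_daily_cost(1440, 60)
```

Under these assumed numbers, the light workload costs about 14 cents a day, while the heavy workload overshoots the dedicated server's daily rent, which is exactly the diminishing-returns effect described above.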
Irrespective of the trade-offs, you can always find a way to optimize your server infrastructure by selectively adopting serverless computing. In most cases, you might be maintaining an existing application built on a legacy cloud architecture, under constant pressure to add new features. Instead of plugging the additional features into the existing monolith as chunks of new code, you can look at extending the application using the serverless paradigm.
There are multiple use cases where migrating to a serverless approach makes complete sense, within the realm of a traditional cloud computing environment.
All software and IT systems need to share data with third-party applications. Usually, these third-party integrations are customizable based on the user's preferences, which requires additional processing and decision logic to determine whether or not to share information with the third party.
Burdening your application with this additional processing may result in more complex business logic and additional processing time. It might be worthwhile to hand off such decisions to a serverless module.
This is like having a webhook to trigger external data sharing. The serverless module can be invoked as an HTTP webhook at certain trigger points within your application's business logic. It will handle the additional processing in parallel, without adding any overhead to the main business logic.
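A sketch of such a serverless sharing module is below. The preference keys and the partner URL are hypothetical, and the actual network call is elided; the point is that the opt-in decision logic lives outside the main application.

```python
import json
from urllib import request

def should_share(record, preferences):
    """Decision logic: share only categories the user opted in to."""
    return record.get("category") in preferences.get("opt_in", [])

def sharing_handler(event, context=None):
    """Serverless module invoked as a webhook from the main app."""
    record = event["record"]
    prefs = event["preferences"]
    if not should_share(record, prefs):
        return {"shared": False}
    # Hand off to the third party; the URL is a placeholder.
    req = request.Request(
        "https://partner.example.com/ingest",
        data=json.dumps(record).encode(),
        headers={"Content-Type": "application/json"},
    )
    # request.urlopen(req)  # network call elided in this sketch
    return {"shared": True}
```

The main application fires the webhook and moves on; the opt-in check and the third-party delivery happen out of band.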
Use Cases of Sharing Data using Serverless
Data syndication : Syndicating data to partners or third-party services that consume it for marketing and promotional purposes
Research : Sharing data with external consultants and experts for research purposes
Second-party sharing : Sharing data with your immediate external vendors and partners who are part of your business process
Notifications : Sending fanned-out notifications to external applications for handling specific requests
Events : Sharing data as an event payload triggered by specific logic, which may or may not depend on the data
Backup : Sharing data for backup on a third-party application
Gone are the days when IT systems were built as monolithic applications. Thanks to the API economy, many applications can be rapidly built by leveraging third-party APIs. This is a very common need, especially when you are building an application that relies on a niche set of services beyond the scope of the original system.
These days, a very frequent example of data manipulation is the integration of an AI/ML-based feature to enhance the data. Executing AI/ML-driven business logic is not as straightforward as the input, processing and output flow. It requires background processing to train and tune the system, and a massive amount of number crunching to produce the desired results. Chances are that you did not design your original system for such requirements. How do you integrate AI/ML workflows into your system?
The answer is simple: have an API that invokes the AI/ML service and use a serverless component to interact with that service. This way you can add processing flows to your main application workflow without burdening your core business logic.
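As a sketch, the serverless glue code can take a record, call out to the external AI/ML API, and return the enriched record. The `ml_client` parameter and the stub sentiment function below are hypothetical stand-ins for a real third-party API call.

```python
def enrich_handler(event, ml_client):
    """Serverless glue: call an external AI/ML API and merge the
    result back into the record, keeping the core app unburdened."""
    record = dict(event["record"])
    # 'ml_client' abstracts the third-party service; a real client
    # would make an HTTP call to the provider's API here.
    record["sentiment"] = ml_client(record["text"])
    return record

# A stub client standing in for the real API during development.
def fake_sentiment(text):
    return "positive" if "great" in text.lower() else "neutral"
```

Because the client is injected, the same handler works against the real service in production and a stub in tests.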
Use Cases of Data Manipulation using Serverless
Data Translation : Translating data from one form to another without changing the underlying meaning, for example, translating text from one language to another
Data Transformation : Transforming data from one format to another, for example, converting text to speech
ETL : Performing ETL processes for a data analysis pipeline to convert raw data into a useful form
Annotation : Annotating data by adding metadata for different purposes, for example, tagging a chat message with word tags
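The annotation use case above can be sketched in a few lines: tag a chat message with the known keywords it contains. The tag vocabulary here is made up for the example.

```python
# Illustrative annotation pass: enrich a chat message with metadata
# listing which known keywords appear in it.
KNOWN_TAGS = {"refund", "invoice", "shipping"}

def annotate(message):
    """Return a copy of the message with a sorted 'tags' field."""
    words = {w.strip(".,!?").lower() for w in message["text"].split()}
    return {**message, "tags": sorted(words & KNOWN_TAGS)}
```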
A heterogeneous information system receives input data from a variety of sources. Some of these sources are external and may not be under the control of the application provider. In such cases, it might be required to establish gating controls at the data ingestion points in the system to guard against any unsavory deluge of garbage data.
As the saying goes, garbage in, garbage out; hence it is important to safeguard your system by vetting all incoming data, especially when it comes from a third-party system beyond your control.
If you identify all those system checkpoints in your application that ingest data from third parties, then you can offload the filtering logic to a serverless component.
Use Cases of Data Filtering using Serverless
Data Filtering : Filtering data or metadata based on predefined rules or machine learning
Content Filtering : Filtering content based on images, audio and video
Interface Gating : Filtering data ingestion interfaces based on rules such as the volume, source or frequency of data
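A minimal interface-gating component, offloadable to a serverless module, might apply rules like the ones above. The sources, thresholds and rule set below are illustrative assumptions.

```python
import time

# Simple gating rules (thresholds are illustrative): reject oversized
# payloads, unknown sources, and too-frequent senders.
MAX_BYTES = 1_000_000
ALLOWED_SOURCES = {"partner-a", "partner-b"}
MIN_INTERVAL_SECONDS = 1.0

_last_seen = {}  # source -> timestamp of the last accepted payload

def gate(source, payload, now=None):
    """Return (accepted, reason) for one incoming payload."""
    now = time.monotonic() if now is None else now
    if source not in ALLOWED_SOURCES:
        return False, "unknown source"
    if len(payload) > MAX_BYTES:
        return False, "payload too large"
    last = _last_seen.get(source)
    if last is not None and now - last < MIN_INTERVAL_SECONDS:
        return False, "rate limited"
    _last_seen[source] = now
    return True, "accepted"
```

Each ingestion checkpoint in the main application would call this gate before passing data downstream.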
If data is the new oil, then data analysis applications are like the refineries that churn out valuable insights and knowledge nuggets from the crude. This requires additional processing, which is often executed in parallel with the main business logic of the application.
Whether you are analyzing the data for internal use or for external consumption, you need additional computing resources. Serverless is ideally suited for this kind of requirement. If you identify the functional flows in your application that tend to get bulky due to inline data analysis, it's best to offload those tasks to a serverless platform.
Use cases of Data Analysis using Serverless
Data Aggregation : Gathering data from multiple sources and combining them into one source of truth
Data Export : Exporting data to an external platform for reporting and visualization
Data Exploration & Mining : Transporting data to a third-party system to run AI/ML workflows and perform further analysis
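The aggregation use case can be sketched as a merge of records from several sources into one source of truth, keyed by record id, with later sources filling in missing fields. The source names in the usage below are hypothetical.

```python
# Aggregation sketch: merge records from several sources into one
# "source of truth" keyed by record id; later sources win ties.
def aggregate(*sources):
    merged = {}
    for source in sources:
        for record in source:
            merged.setdefault(record["id"], {}).update(record)
    return list(merged.values())

# Hypothetical feeds from two internal systems.
crm = [{"id": 1, "name": "Ada"}]
billing = [{"id": 1, "email": "ada@example.com"}]
truth = aggregate(crm, billing)
```

A serverless function running this merge can be triggered whenever any source publishes new records, keeping the main application out of the loop.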
The Best Serverless Platforms to Choose From
Serverless has been around for over four years now, and a lot of companies have emerged offering serverless platforms. The big traditional cloud service providers have also upgraded their offerings to include serverless computing within their bouquet of services.
All serverless platforms are offered as a development environment where functions are written. As a developer, you write a function containing your application's business logic. Once done, it gets automagically deployed and executed by the platform whenever a pre-configured trigger is invoked. The serverless platform thus executes the function without you worrying about server deployment. This is how a new category of as-a-Service offering has emerged.
Serverless is also known as Function-as-a-Service (FaaS).
1. AWS Lambda
AWS is, hands down, the pioneer of cloud computing. Launched in late 2014, AWS Lambda was one of the first commercially available serverless platforms. It supports many languages and runtimes, including Node.js, Python, Java, .NET and more.
If your application is hosted within the AWS platform, Lambda can be easily integrated with other AWS services. If not, you can create a webhook by coupling the Lambda function with the AWS API Gateway service and exposing it to the outside world.
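A webhook-style Lambda behind API Gateway might look like the sketch below. It assumes the common proxy-integration shape, where the HTTP body arrives as a JSON string in `event["body"]` and the response must carry a `statusCode` and a string body; consult the AWS docs for the exact contract.

```python
import json

# Sketch of a Lambda handler exposed via API Gateway as a webhook.
def lambda_handler(event, context=None):
    """Parse the incoming HTTP body and echo a field back."""
    payload = json.loads(event.get("body") or "{}")
    result = {"echo": payload.get("message", "")}
    return {"statusCode": 200, "body": json.dumps(result)}
```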
2. IBM Cloud Functions
IBM Cloud Functions offers FaaS based on Apache OpenWhisk. It is comparable with Lambda in terms of features and supported languages. You can also integrate custom Docker containers to host and execute your code in your preferred language.
IBM Cloud also offers an API service, named API Connect, which integrates with Cloud Functions to create backend webhooks that trigger functions.
3. PubNub Functions
If you are looking for real-time processing of data streams at scale, then PubNub Functions is well suited for the job. Backed by PubNub's globally distributed data streaming network, PubNub Functions offers a very reliable and resilient platform for hosting FaaS business logic that needs real-time processing at scale.
PubNub has a catalog of third-party services integrated with PubNub Functions, known as BLOCKS. You can check out the BLOCKS catalog for pre-built serverless modules around AI/ML, chatbots, IoT data handling and more.
4. Twilio Functions
Currently in beta, Twilio Functions is the serverless offering from Twilio, one of the largest cloud communication platforms.
Backed by Twilio’s programmable voice and SMS based services, Twilio Functions can add a serverless flavor to your application if it is heavily dependent on SMS and voice-based alerts.
5. Azure Functions
If you work with Microsoft technologies, then Azure Cloud is a good option. All services offered by Azure Cloud are comparable with those offered by AWS and IBM.
Azure Functions is the FaaS offering from Azure Cloud. Although it supports many programming languages, most of the samples on its website are in either .NET or Node.js.
6. Serverless Framework
If you are looking for a universal serverless framework that provides the flexibility to spawn serverless workloads on multiple clouds, then check out https://serverless.com. It supports AWS, IBM, Azure as well as Google Cloud. With a few commands, you can deploy your serverless functions on any one of these cloud services.
In terms of features, AWS, IBM and Azure offer a plethora of options to build serverless applications. All of them have a mature ecosystem of their own, and it is easy to combine other services, such as databases, networking and AI/ML workflows.
PubNub and Twilio are standalone FaaS platforms. They don't offer a rich set of services around the FaaS to build complete applications, but they are well suited to their specific niches and are easy to integrate with. PubNub also supports webhooks in its Functions, making them callable via an API without additional configuration.
Beware of Self-hosted Serverless Platforms
If you are looking to adopt any of the serverless migration strategies for your application, then one of these serverless platforms is your best bet. Be mindful of the fact that these are hosted platforms, not DIY serverless frameworks. There are also a lot of open-source serverless frameworks that enable you to host your own serverless platform. As an application provider, your best option is to avoid these self-hosted serverless platforms and go for a hosted one, as the cost of managing the servers for serverless hosting is still high.
Take Your Pick
Choosing the right platform for your application is a critical decision and involves scrutiny of many details at the specification level. We have not factored in the nuances of each platform's specification beyond language support and its specific niche. That is something we will possibly cover in a future post, so let us know if you want an elaborate post on the specification parameters for evaluating the performance of FaaS platforms.
In terms of cost, they are all comparable. With serverless, you pay for the hits to the server, or for the time slice. Each call costs a minuscule fraction of a cent; however, some platforms might charge an upfront monthly pre-paid subscription fee to enable the service.
There are other challenges as well. The serverless way of developing applications requires a different mindset when it comes to DevOps, tooling, release and software upgrade management. Each of these is an in-depth discourse of its own, with various opinionated views. Serverless is still very new compared to traditional legacy architecture, and it will take some time for these practices to evolve.
All said and done, serverless has opened up a new way of thinking about cloud architecture. It is the most efficient way of maintaining elasticity in a cloud deployment, has significant cost-saving potential in the long run, and spares developers from writing, debugging and testing infrastructure code, which used to be the norm even a decade back.