Is There a Mutual Future for Serverless and DevOps?

DevOps plays a prominent role in the culture and workflow of software engineering, handling crucial tasks such as building, testing, deployment, and monitoring, and facilitating collaboration among the teams writing the source code. DevOps also drives automation, applying continuous integration (CI) for automated testing and continuous delivery (CD) for automated deployment across the software development cycle.

Even with the ever-increasing adoption of cloud-based services, DevOps has remained integral to these development workflows. As serverless computing grows in popularity, the question arises: what does this mean for DevOps as we know it?

In the following article, we’ll dive into the architecture of serverless infrastructure and the benefits to organizations that choose to adopt this approach. At its most basic, serverless refers to an environment in which cloud providers are wholly responsible for managing the hardware and operating systems. With many management tasks “outsourced” to providers (and more and more tasks moving to an “X-as-a-service” model), many suggest that DevOps may become somewhat redundant.

Instead of viewing serverless as a threat to DevOps, let’s explore how the approach can support DevOps objectives, making life (and management) a lot easier while lowering costs.

What is Serverless?

Let’s start by defining what serverless is – and what it’s not.

Many incorrectly equate serverless with FaaS (Functions-as-a-Service), a model that provides a platform for developing, managing, and running applications. By building an app using the FaaS model, one can achieve a serverless architecture – but this doesn’t mean that FaaS = serverless. The FaaS model is one means of establishing a serverless environment.

Serverless refers to a cloud systems architecture in which any servers, virtual machines, or containers are abstracted from the operator or developer, meaning they don’t require provisioning or management. A serverless approach allows developers to write and deploy code without taking the underlying architecture into account, removing the need to own and manage the physical hardware that would otherwise support development efforts.

A serverless approach allows more flexibility, as developers are now able to purchase and leverage backend services on an on-demand basis, paying only for the services they use. The term serverless obviously shouldn’t be taken too literally – servers continue to provide this backend infrastructure. The difference, however, is that their functionality and maintenance are handled by the service provider.

Some Other Advantages of the Serverless Model:

Simplification – Developers building in a serverless environment do not need to deal with scaling policies, as their provider addresses all scaling on demand. By taking a FaaS approach to serverless, developers can write small, independent functions.

Lower Overhead – The serverless model helps lower operational and development costs: services are billed on a consumption basis, and with no hardware onsite, management and upkeep are no longer a concern.

Scalability – Serverless applications and services have auto-scaling built in, meaning there is no need for concern about scaling to meet demand.

Turnaround Time – Developers no longer need as complicated a process to roll out features and bug fixes, reducing time to market by making it possible to add and modify code in small increments.

CI/CD And Serverless

The term “serverless”, whether referring to accessing cloud-hosted services or to custom user code in a FaaS (Functions-as-a-Service) model, essentially means outsourcing server management. Infrastructure as a service is as old as the cloud and frees users from managing physical servers. “Serverless” takes things a step further by eliminating all server management, physical or virtual. Outsourcing server management simplifies operations, but it doesn’t eliminate the need to test and deploy software.

Continuous integration, which encompasses both unit and automated integration testing, still must occur. Unit testing is utterly unaffected, since by definition it tests software at a very low level in isolation. Integration tests are another matter. A serverless integration test involves deploying one or more serverless functions, running integration tests against them, and then tearing them down. This is not conceptually much different from standing up some servers in the cloud, deploying code, and running tests against it. The steps are the same; only the complexity of deployment, including scaling and healing scenarios, is greatly simplified.
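To make that deploy–test–teardown cycle concrete, here is a minimal sketch in Python using boto3 against AWS Lambda. It assumes credentials are configured and that an IAM role for Lambda already exists; the role ARN, function name, and payload are purely illustrative, not part of any real pipeline.

```python
# Minimal sketch of a serverless integration test: deploy a function,
# exercise it, and tear it down.  Assumes boto3 credentials are configured
# and ROLE_ARN points at an existing IAM role Lambda can assume; all names
# here are illustrative.
import io
import json
import zipfile

import boto3

ROLE_ARN = "arn:aws:iam::123456789012:role/lambda-test-role"  # hypothetical
FUNC_NAME = "ci-integration-test-fn"                          # hypothetical

HANDLER_SRC = (
    "def handle(event, context):\n"
    "    return {'echo': event.get('message')}\n"
)

def build_zip(source: str) -> bytes:
    """Package the handler source into an in-memory deployment zip."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        zf.writestr("handler.py", source)
    return buf.getvalue()

def run_integration_test():
    client = boto3.client("lambda")
    # 1. Deploy the function under test.
    client.create_function(
        FunctionName=FUNC_NAME,
        Runtime="python3.9",
        Role=ROLE_ARN,
        Handler="handler.handle",
        Code={"ZipFile": build_zip(HANDLER_SRC)},
    )
    client.get_waiter("function_active").wait(FunctionName=FUNC_NAME)
    try:
        # 2. Run the integration test against the live function.
        resp = client.invoke(
            FunctionName=FUNC_NAME,
            Payload=json.dumps({"message": "hello"}),
        )
        body = json.loads(resp["Payload"].read())
        assert body == {"echo": "hello"}, body
    finally:
        # 3. Tear the function down regardless of the outcome.
        client.delete_function(FunctionName=FUNC_NAME)

if __name__ == "__main__":
    run_integration_test()
```

In a real pipeline the same three steps would typically run inside the CI system rather than a standalone script, but the shape – provision, test, destroy – stays the same.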

Continuous delivery seeks to automate the deployment of production systems. The basic strategies for deploying to production are unchanged for serverless, at least conceptually. The process still means exposing new code to customer access, whether incrementally or all at once, and still requires a strategy to undo the rollout if things go sideways. The complexity is reduced with serverless, and is affected by the serverless platform in use, but the underlying challenges are the same.
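As one concrete illustration of incremental exposure and rollback on a serverless platform, the sketch below uses AWS Lambda alias traffic shifting via boto3. The function and alias names are hypothetical, and it assumes the function and a “live” alias already exist; other platforms offer analogous mechanisms.

```python
# Sketch of an incremental rollout (and rollback) for a Lambda-backed service
# using alias traffic shifting.  Names are illustrative; assumes the function
# and a "live" alias already exist and callers invoke the alias.
import boto3

client = boto3.client("lambda")
FUNC = "orders-api"   # hypothetical function name
ALIAS = "live"        # alias that callers are wired to

def canary(new_version: str, weight: float = 0.1):
    """Send a fraction of the alias traffic to the new version."""
    client.update_alias(
        FunctionName=FUNC,
        Name=ALIAS,
        RoutingConfig={"AdditionalVersionWeights": {new_version: weight}},
    )

def promote(new_version: str):
    """Route all traffic to the new version."""
    client.update_alias(
        FunctionName=FUNC,
        Name=ALIAS,
        FunctionVersion=new_version,
        RoutingConfig={"AdditionalVersionWeights": {}},
    )

def rollback(old_version: str):
    """Undo the rollout: point the alias back at the previous version."""
    client.update_alias(
        FunctionName=FUNC,
        Name=ALIAS,
        FunctionVersion=old_version,
        RoutingConfig={"AdditionalVersionWeights": {}},
    )
```

Whether the rollout is a canary like this, a blue/green switch, or an all-at-once cutover, the decision and the rollback plan still belong to the delivery pipeline – the platform only changes how cheaply they can be executed.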

Having a declarative platform like Cloudify that can adapt to serverless, conventional cloud, and multi-cloud deployments makes CI/CD provisioning centralized and versionable. You can’t really call serverless provisioning “infrastructure as code”, but it can (and should) be handled similarly.

The Cloudify Serverless Plugin

The Cloudify serverless plugin adds blueprint support for serverless workloads. The plugin is in incubation status and, for the moment, only works with Python handlers on AWS Lambda. The plugin defines a TOSCA node type, `cloudify.nodes.serverless.Service`, that represents one or more serverless functions. This type lets you deploy (and undeploy) serverless functions by referencing them in your blueprint. See the very simple example blueprint.
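For a sense of the kind of workload such a blueprint points at, below is a minimal Python Lambda handler of the sort a `cloudify.nodes.serverless.Service` node might reference. The handler name, event shape, and response format are illustrative only, not taken from the plugin’s example blueprint.

```python
# Illustrative Python handler of the kind a serverless blueprint might
# reference and deploy to AWS Lambda.  The event and response shapes are
# hypothetical.
import json

def handler(event, context):
    """Echo a greeting built from the incoming payload."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```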

Conclusion

While serverless technology makes the “Ops” in DevOps less challenging, it has no effect on the “Dev” part. In fact, testing a significant number of remotely hosted, asynchronous event handlers is quite challenging and calls for extensive automated testing and deployment. An unopinionated, declarative orchestrator like Cloudify can automate DevOps IaaS, service provisioning, and FaaS requirements in a familiar TOSCA-based way that can be managed like any other code in your environment.
