For a time, it appeared that the reach of serverless platforms had exceeded their grasp. After notable success stories like Capital One’s high-profile “serverless-first” strategy, the hype appeared to be dying down. This is partly because serverless has always struggled with unclear definitions, and despite having no infrastructure or application servers to wrangle, developers still needed some understanding of operating within a cloud-native platform. The proof is in the pudding: Around 2020, Google search metrics showed a significant drop-off in public interest in serverless platforms. But an interesting uptick started in 2022, and today, search metrics and our survey data show that serverless is more popular than ever. Here’s why serverless is trending again with customers:

  • A better definition has emerged as serverless becomes a path to developer productivity. The term “serverless” was always a misnomer and, even among end users and vendors, tended to mean different things depending on the product and use case. Just as the cloud is someone else’s computer, serverless is still someone else’s server. Today, things are much clearer: A serverless application is a software component that runs inside an environment that manages the underlying complexity of deployment, runtimes, protocols, and process isolation so that developers can focus on their code.
  • Enterprise success stories delivered proven, repeatable use case solutions. The initial hype around serverless centered on fast development cycles and back-end use cases where serverless functions acted as the glue between disparate cloud services. Many of these examples came from cloud-born early adopters or from vendors leveraging function as a service (FaaS) under the hood for their own platforms and services. Since then, many more enterprise customers have taken advantage of serverless.
  • An expanded ecosystem of ancillary services drives emerging use cases. The core use case of serverless remains building lightweight, short-running, ephemeral functions. But in recent years, serverless providers have expanded the ecosystem of data and integration services, such as serverless key-value stores, API frameworks, and serverless vector databases, enabling a larger variety of use cases (see the sketch after this list).
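
To make the first and third points concrete, here is a minimal sketch of what “focusing on the code” looks like in practice: an AWS Lambda-style Python handler that persists a record to a serverless key-value store. The table name, environment variable, and event shape are hypothetical; the point is that provisioning, scaling, and process isolation are left to the platform.

```python
import json
import os

import boto3

# The platform injects configuration at deployment time; there are no servers
# or connection pools to manage. TABLE_NAME is a hypothetical environment variable.
TABLE = boto3.resource("dynamodb").Table(os.environ["TABLE_NAME"])


def handler(event, context):
    """HTTP-triggered entry point (e.g., fronted by an API gateway).

    The developer writes only business logic; deployment, runtime,
    and process isolation are handled by the serverless platform.
    """
    body = json.loads(event.get("body") or "{}")

    # Persist to a serverless key-value store -- one of the ancillary
    # services that now surround FaaS.
    TABLE.put_item(Item={"pk": body.get("orderId", "unknown"), "payload": body})

    return {"statusCode": 200, "body": json.dumps({"stored": True})}
```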

A Maturing Serverless Ecosystem Drives New Ways To View The Marketplace

In our last Forrester Wave™ covering the serverless development space, we focused exclusively on FaaS platforms offered by hyperscale public cloud providers. While FaaS remains the core technology of serverless development, the ecosystem has expanded to focus more on supporting use cases than on a single opinionated platform approach. We will align our upcoming Landscape and Wave covering this market with these emerging trends:

  1. Versatile cloud services are replacing early serverless development use cases. Serverless computing is expanding beyond FaaS and now describes cloud-native development platforms that require little to no manual provisioning and that offer autoscaling, consumption-based micro-billing, and scale to zero. Early functions were used primarily as integration glue in use cases like filtering, routing, batch processing, and event enrichment. These glue-code patterns are giving way to flexible, turnkey cloud services built to be part of a “compose and consume” development pattern (see the first sketch after this list).
  2. AI use cases are breathing new life into the serverless computing model. AI is the topic du jour in most product categories and business units, and serverless computing is no different. Given the implications of widespread AI workloads in terms of power consumption and scarce GPU resources, as well as the inherent on-demand nature of generative AI prompting, the ephemeral serverless model has emerged as a strong fit for AI applications.
  3. WebAssembly (Wasm) has emerged as a powerful enabler of deployment-agnostic serverless platforms. Although much of the chatter comes from vendors rather than end users, Wasm might just be the future of serverless development platforms. It carries several built-in advantages: Cold start times are drastically reduced, with demonstrations highlighting spin-up times of under a millisecond. It is also significantly more portable than other approaches, with the Wasm binary format running across browsers, operating systems, and CPU architectures like Intel x86 and Arm (see the second sketch after this list).
  4. Edge and serverless development will converge on distributed use cases. For a while, the infrastructure that developers relied on, serverless or not, fell into one of two camps. On one end of the spectrum, you had public cloud data centers offering a dozen or more regional site options. On the other end, you had edge-focused providers, which usually emerged from content delivery network (CDN) companies and invested in thousands of smaller points of presence. We expect these models to converge and focus more on applications, use cases, and capabilities than on the underlying infrastructure paradigm.
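
As a first sketch of the “compose and consume” pattern, consider the older glue-function use case of forwarding object-storage events to a queue. With turnkey managed services, the two services can be wired together directly, with no function in between. This is a hedged illustration using boto3; the bucket name, queue ARN, and account ID are hypothetical.

```python
import boto3

s3 = boto3.client("s3")

# Instead of deploying a small "glue" function that copies events from the
# bucket to the queue, configure the bucket to publish directly to the queue.
s3.put_bucket_notification_configuration(
    Bucket="example-uploads-bucket",
    NotificationConfiguration={
        "QueueConfigurations": [
            {
                "QueueArn": "arn:aws:sqs:us-east-1:123456789012:uploads-queue",
                "Events": ["s3:ObjectCreated:*"],
            }
        ]
    },
)
```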
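The second sketch illustrates the Wasm point: a tiny module in WebAssembly text format executed with the open source wasmtime Python bindings. The module and function are illustrative, and exact APIs vary by runtime and version, but the same compiled artifact can run unchanged across operating systems and CPU architectures wherever a Wasm runtime exists.

```python
# Requires: pip install wasmtime
from wasmtime import Engine, Instance, Module, Store

# A minimal module in WebAssembly text format exporting an "add" function.
WAT = """
(module
  (func (export "add") (param i32 i32) (result i32)
    local.get 0
    local.get 1
    i32.add))
"""

engine = Engine()
store = Store(engine)
module = Module(engine, WAT)            # compile once; portable across hosts
instance = Instance(store, module, [])  # near-instant instantiation per request
add = instance.exports(store)["add"]

print(add(store, 2, 3))  # -> 5
```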

Serverless Development Platforms Are A Core Cloud-Native Technology

Because of all of the trends above, here is our new market definition. A serverless development platform is:

A cloud-native software development platform that abstracts away underlying cloud infrastructure, complex server configurations, runtime characteristics, and deployment patterns from the development process. FaaS is the most common implementation of serverless development and forms the core of serverless architecture, but any platform that meets the definition will be considered. A serverless development platform supports the deployment of arbitrary business logic, decouples state from the underlying compute, automatically scales on demand, offers micro-billing (often by the millisecond), runs on managed cloud infrastructure, and supports event-driven communication. In addition, extended capabilities expand the use cases a serverless development platform can accommodate, such as state/storage services, distributed managed infrastructure, asynchronous messaging, observability, and security.
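
To ground the definition, here is a hedged sketch of a handler that exhibits several of the listed properties: it is event driven (triggered by a managed queue), decouples state from compute (persisting results to an object store), and contains only business logic. The event shape follows the common SQS-to-Lambda pattern; the bucket and field names are hypothetical.

```python
import json
import os

import boto3

# State is decoupled from compute: results land in a managed object store.
# BUCKET is a hypothetical environment variable set at deployment time.
s3 = boto3.client("s3")
BUCKET = os.environ.get("BUCKET", "example-results-bucket")


def handler(event, context):
    """Event-driven entry point, invoked per batch of queue messages.

    The platform scales instances on demand (including to zero) and bills
    per invocation; the function holds no state between calls.
    """
    records = event.get("Records", [])
    for record in records:
        order = json.loads(record["body"])  # queue message payload
        result = {"orderId": order.get("id"), "status": "processed"}

        # Persist the outcome externally rather than in process memory.
        s3.put_object(
            Bucket=BUCKET,
            Key=f"results/{result['orderId']}.json",
            Body=json.dumps(result).encode("utf-8"),
        )

    return {"processed": len(records)}
```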

For more information or if you have any questions, please schedule an inquiry or guidance session with me.