Introducing Modus: code-first intelligent APIs

Introducing Modus, an open source, serverless framework for building intelligent APIs.

Today, we’re excited to unveil Modus, an open source, serverless framework for building intelligent APIs. Modus makes building backends delightful, freeing developers from infrastructure complexities and putting code at the heart of development.

Whether you’re using AI today or not, modern apps will require seamless integration of data, models, and business logic. Modus was crafted to enable developers to rapidly experiment with AI. And when you’re ready to launch, it eliminates the common barriers of moving AI into production by providing a sandboxed, high-performance execution environment powered by WebAssembly.

From startups to tech giants, Modus streamlines building smarter, more adaptive apps that are ready for production from day one.

How we got here

More than a decade ago, the cloud redefined how infra would be consumed. It formed around the primitives of compute, network, and storage. And not a lot has really changed since.

Developers have been forced to think about their apps through the lens of deployment, not the other way around. It’s nearly 2025 and even a “serverless” app still requires you to wire your lambda compute through custom networking to an API gateway that you need to configure and harden before exposing it to the world.

Two years ago, Vercel coined the term framework-defined infrastructure, characterizing frameworks and corresponding runtimes where the infrastructure conforms to the app (rather than the inverse). The implementation of this approach in Next.js revolutionized the frontend world. Airflow unlocked a similar level of productivity for data engineering. There were attempts to do the same for backends, but the container was never the endgame.

The latest wave of AI adds even more components. From language models to vector databases, it quickly became more infrastructure for devs to integrate. The first generation of AI libraries is best classified as “infrastructure-defined frameworks”: they attempt useful abstractions but push complexity onto the developer in the process. They made it quick to build POCs, but getting into production often led to rewrites into lower-level code.

[Modus enters the chat]

We’re excited to introduce Modus, an open source, serverless framework for building functions and APIs powered by WebAssembly. It simplifies the integration of AI models, data, and business logic with sandboxed execution, providing a code-first workbench for building AI-enabled apps.

With Modus, we’ve reimagined the app backend from the developer’s point of view. We worked backward from the live app and forward from the way teams write new apps and features from the first line. And we built it all with a deep understanding of the security requirements of putting AI into production.

The result is a polyglot framework (currently Go and AssemblyScript, with more language support to come) that unlocks rapid experimentation, is really fast and secure, and provides a workbench for building AI-enabled apps. Modus is batteries-included, yet extensible to fit the decisions you’ve made on data stores, hosting environments, and AI model providers.

“You finally found something useful to do with WebAssembly.”

– Robert Edwards, Head of Enterprise Architecture at Glidewell Dental.

Built for rapid experimentation

A backend framework built for developers means fundamentally starting with the code and crafting build tools and a runtime that fit its contours naturally. Remove the plumbing and toil and your code feels more powerful than ever. A change in your function is automatically reflected in strongly-typed interfaces and schema-agnostic data stores on every save.

When you write functions with Modus, your API is generated automatically. Queries and mutations are registered exactly as you’d expect based on the constructs of your code.
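
As a minimal sketch, an ordinary exported Go function like the one below (the name and signature are purely illustrative) is all it takes; Modus derives the corresponding query from the function’s signature, with no routing or handler boilerplate.

```go
package main

import "fmt"

// SayHello is a plain exported function. Modus can surface a function like
// this as a query on the generated API, deriving the parameter and return
// types from the Go signature.
func SayHello(name string) string {
	return fmt.Sprintf("Hello, %s!", name)
}
```

The matching query then appears on your app’s endpoint, typed according to the function’s signature.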

With fast refreshes, modus dev allows developers to quickly iterate when working locally. The Modus CLI builds and loads your updated modules into the runtime on every save. Within seconds, your app’s endpoint is updated with your changes.

Really fast and secure

Whether it’s scaling out to meet demand or raw processing speed, faster is generally better; who wouldn’t want a query or inference to run faster? But speed normally comes with trade-offs, like wasted capacity or reused execution environments. Developers need to either accept or mitigate these trade-offs, adding more to their overloaded plates.

Modus leans on the power of WebAssembly to ensure performance, scalability, and security. Let’s take a look at what happens when the Modus runtime receives a request on your API:

  1. your compiled code is loaded into a sandboxed execution environment with a dedicated memory space

  2. your code runs, aided by host functions that power the Modus APIs

  3. if needed, data and AI models are queried securely, without exposing credentials to your code

  4. the function responds via the API result and the execution environment is released

Each execution gets its own environment. After the execution environment is released, the associated memory is freed, wiped clean, and returned to the pool of available memory. With this design, there’s no risk of leaving data behind for future misuse or memory leaks slowing down your functions over time.

Furthermore, only the data stores, AI models, and external APIs that are defined in the Modus app manifest are accessible, limiting exfiltration risk with a secure-by-default execution environment.
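
As a rough illustration, the manifest is where those allowed resources are declared; the structure and field names below are a sketch rather than the authoritative schema.

```json
{
  "models": {
    "text-generator": {
      "sourceModel": "meta-llama/Llama-3.2-3B-Instruct",
      "provider": "hugging-face",
      "connection": "hypermode"
    }
  },
  "connections": {
    "product-api": {
      "type": "http",
      "baseUrl": "https://api.example.com/",
      "headers": { "Authorization": "Bearer {{API_TOKEN}}" }
    }
  }
}
```

Note that the credential is a placeholder supplied at runtime rather than embedded in code, and anything not declared in the manifest is simply unreachable from your functions.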

Fast function execution is made possible by the work Modus performs during the build and load steps of deployment. At build time, Modus extracts the metadata of your functions and then compiles the code to a WebAssembly module. Now, let’s take a closer look at what Modus does as it loads these inputs into the runtime environment:

  1. your code is ahead-of-time (AOT) compiled with optimizations based on the host environment

  2. the compiled module is cached in memory for fast retrieval

  3. an invocation plan is prepared for each function

  4. an API schema is generated and the endpoint is activated

  5. connections, models, and other configuration details are extracted from the app’s manifest

These performance and security benefits come for free to the developer when building with Modus. No configuration is required, and the overhead of establishing the sandbox for an average function is just a few milliseconds. We’ll share more benchmarks in a separate post comparing Modus to standards like AWS Lambda.

Workbench for modern apps

Modus simplifies integrating and validating all classes of models. You connect through an extensible interface designed for 100% feature completeness with the target model platform’s APIs, so you can try new models without needing to learn a new SDK with each iteration.
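
Here’s a sketch of that workflow from a Go function, assuming a chat model registered in the manifest under the name “text-generator”; the model name and surrounding details are illustrative, and the calls follow the pattern of the Modus Go SDK.

```go
package main

import (
	"strings"

	"github.com/hypermodeinc/modus/sdk/go/pkg/models"
	"github.com/hypermodeinc/modus/sdk/go/pkg/models/openai"
)

// GenerateText asks a chat model declared in the app manifest
// (assumed here to be named "text-generator") to respond to a prompt.
func GenerateText(prompt string) (string, error) {
	// Look up the model by the name it was given in the manifest.
	model, err := models.GetModel[openai.ChatModel]("text-generator")
	if err != nil {
		return "", err
	}

	// Build the chat input and invoke the model. Credentials and routing
	// are handled by the runtime, not by this code.
	input, err := model.CreateInput(
		openai.NewSystemMessage("You are a helpful assistant."),
		openai.NewUserMessage(prompt),
	)
	if err != nil {
		return "", err
	}

	output, err := model.Invoke(input)
	if err != nil {
		return "", err
	}

	return strings.TrimSpace(output.Choices[0].Message.Content), nil
}
```

The function never sees an API key or a provider-specific client; it refers to the model by its manifest name, and the runtime handles the rest.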

Language models are about more than generation, though. There are also classes of language models that embed text into vectors. When those vectors are plotted in a multi-dimensional space, similarity, clustering, and merging algorithms can be applied with great results. Seamlessly add natural language similarity search to your app without shuffling vector embeddings between disparate services. Instead, let the infrastructure respond to your app’s requirements while removing redundant network calls.
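
For a concrete sense of what “similarity” means here, comparing two embedding vectors typically comes down to a small numeric computation such as cosine similarity; a minimal, SDK-independent sketch in Go:

```go
package main

import "math"

// CosineSimilarity returns a score in [-1, 1] for two embedding vectors of
// equal length; values closer to 1 mean the underlying texts are more
// semantically alike.
func CosineSimilarity(a, b []float32) float64 {
	if len(a) == 0 || len(a) != len(b) {
		return 0
	}
	var dot, normA, normB float64
	for i := range a {
		dot += float64(a[i]) * float64(b[i])
		normA += float64(a[i]) * float64(a[i])
		normB += float64(b[i]) * float64(b[i])
	}
	if normA == 0 || normB == 0 {
		return 0
	}
	return dot / (math.Sqrt(normA) * math.Sqrt(normB))
}
```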

When you build with Modus, model invocations are automatically captured and stored for easy understanding and future model fine-tuning. You can even track token usage at the function level so you understand what code is driving your model serving costs.

Using AI isn’t required when building with Modus, but you’ll know it’s in your toolkit and ready when your app calls for it.

What’s next

Our near-term focus is continuing to refine the developer experience and usability of Modus. For example, we’re bringing an integrated API explorer to the fast refresh experience so that you can easily query your app’s endpoint with every change.

Additionally, over the coming months we’ll be introducing ModusDB, an embedded multi-model database built on the power of Dgraph’s query engine. Dgraph is the most popular open source graph database, with a distinct ability to efficiently query across schema-agnostic data. Developers will be able to simply write and read type-centric data in their functions and deploy without worrying about constant schema migrations or messy joins.

Get involved with Modus

We’ve seen increasingly complex apps get built with Modus over the past year by a set of early adopters. It’s encouraging to see complex use cases become simple implementations and the hurdle for AI adoption lowering every day. As we release Modus publicly, we’re excited to see what you build and encourage you to get involved via GitHub or Discord.

Want to learn more? Join Matt Johnson-Pint, co-creator of Modus and staff software engineer at Hypermode, for Modus: A Deep Dive, a virtual event on Thursday, November 21st.
