
Introduction to the Jamstack


Older Article

This article was published 7 years ago. Some information may be outdated or no longer applicable.

This article walks through the Jamstack: what it’s about, why it matters, and which services slot into the stack.

If you’re interested in a condensed version of this article, have a look at this video of my talk “Unleash the Power of the Jamstack” delivered at Front Conference.

Tamas Piros - Unleash the Power of the JAM Stack from Front Conference Zurich on Vimeo.

What is the Jamstack?

Jamstack stands for JavaScript, APIs and Markup. But there’s a lot more packed into the stack than that acronym lets on.

A quick search turns up dozens of definitions. The one that sticks with me comes from Phil Hawksworth (Netlify):

A modern architecture: Create fast, secure sites and dynamic apps with JavaScript, APIs and pre-rendered Markup, served without web servers.

That quote carries some heavy words: ‘fast’, ‘secure’, ‘dynamic’, ‘pre-rendered’. And the kicker? A Jamstack site gets served ‘without web servers’. Sounds impossible.

But swap “web server” for “origin server” and things click. One of the Jamstack’s defining traits is this: there’s no origin server. Sites get served from edge servers instead.

Origin Server vs Edge Server

An origin server is just a computer running an application (typically a web server like Apache) that responds to requests from browsers.

Picture the LAMP stack. You’ve got a server running Linux with Apache. There’s a database (probably MySQL) and a server-side language like PHP generating content that becomes HTML. Every time someone requests a page, a PHP script fires, maybe runs a query, grabs some data, renders HTML, and sends it to the browser.

Two things matter here. First, this process runs on each request. Second, code is being executed on the server.

We won’t go into scaling, maintenance or security for this kind of setup here (what happens when 100k users hit the site? what stops malicious code execution? what if the app server and database server are separated?) but seasoned developers know these pain points intimately.

A quick note on the MEAN/MERN stack: it also has an origin server. Node.js runs Express (which acts as a web server), so there’s still server-side code execution. The key difference is that the browser has become the primary execution engine.

How is an Edge Server Different?

An edge server is one of a set of computers distributed globally with one job: delivering content as fast as possible to the nearest requesting user.

Think about a CDN (Content Delivery Network). We already use CDNs to deliver static assets like JavaScript and CSS. CDNs excel at caching static assets and serving them from the nearest location. A user in Singapore gets JS and CSS from an edge server in Singapore. A user in Spain gets assets from one near Spain.

CDNs also handle load balancing, automatic scaling, and traffic redirection under heavy loads. All of this comes out of the box. As developers, we never need to think about it.

Here’s the interesting bit. If we already store CSS and JavaScript on CDN edge servers, why not store HTML there too? HTML is really just a static markup language. But how would that work? How could we have data and structure ready when someone requests a page? That’s where pre-rendering comes in.

Pre-rendered Markup

To get static HTML assets onto a CDN, we need to generate the markup somehow. A whole category of tools does exactly this: Static Site Generators (SSGs).

The key difference is that SSGs generate HTML ahead of time, at build time, not at request time. All the data the HTML pages need gets collected during the build. The result is valid, static HTML markup ready to sit on a CDN or any static hosting provider.

There’s a huge number of SSGs available today. Some build on popular frontend frameworks like React and Vue. Some use vanilla JavaScript. Others run on Ruby or Go.

You might be confused here. The Jamstack is JavaScript, APIs and Markup, so where do Ruby and Go fit? Generating (pre-rendering) static HTML is the job of a tool running server-side, and any programming language can generate HTML for us. That’s the pre-rendering part. The JavaScript in the Jamstack refers to progressive enhancement: using client-side JavaScript to call APIs. Don’t confuse the two.

Example Static Site Generators

| Name | Language | URL |
| --- | --- | --- |
| Gatsby | JavaScript (React) | https://gatsbyjs.org |
| Nuxt | JavaScript (Vue) | https://nuxtjs.org |
| Next | JavaScript (React) | https://nextjs.org |
| VuePress | JavaScript (Vue) | https://vuepress.vuejs.org |
| Gridsome | JavaScript (Vue) | https://gridsome.org |
| 11ty | JavaScript | https://11ty.io |
| Jekyll | Ruby | https://jekyllrb.com |
| Hugo | Go | https://gohugo.io |

There are many other Static Site Generators out there. For a more extensive list, check out https://staticgen.com

Static != Static

We’ve established that we can generate static HTML at build time. But these HTML pages aren’t static in the way you might think. We can hit APIs to make our HTML dynamic and interactive. The A in Jamstack refers exactly to this.

So what kind of APIs can we use? Any of them. Literally.

We can consume a custom-built REST API. Maybe your company exposes one. You pull that data at build-time and produce the final HTML.

There’s another type of API usage worth flagging. For progressive enhancements inside “static” pre-rendered HTML, we can drop in some client-side JavaScript and use the Fetch API to call any endpoint and append results to the page.
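As a sketch of that pattern, here’s what the enhancement could look like. The `/api/products` endpoint and the `#product-list` element are hypothetical:

```javascript
// Progressive enhancement of a pre-rendered, static page.

// Pure helper: turn fetched data into markup (easy to test in isolation).
function renderItems(items) {
  return items.map((item) => `<li>${item.name}</li>`).join('');
}

// After the static HTML has loaded, fetch fresh data and append it.
async function enhancePage() {
  const response = await fetch('/api/products'); // hypothetical endpoint
  const items = await response.json();
  document.querySelector('#product-list').innerHTML += renderItems(items);
}

// Only run in a browser; the static page works fine without this.
if (typeof document !== 'undefined') {
  enhancePage();
}
```

Note the design: the page is fully usable as delivered from the CDN, and the JavaScript layers dynamic data on top rather than being required to render anything at all.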

Standing on the Shoulders of Giants

We can also tap into APIs from third-party service providers.

Here’s the thing: loads of companies offer brilliant services and expose APIs or SDKs. These organisations are experts in their domain. They do one thing and they do it well. Need to accept payments? Use Stripe or PayPal. Need to serve, optimise and transform images? Use Cloudinary. Need authentication? Use Auth0. The list is endless. Borrowing Apple’s old line, “there’s an App for that”, it’s fair to say “there’s an API for that”.

And the beauty of it? Beyond being genuine experts, these providers handle all the infrastructure. They maintain it, they manage it, and we don’t worry about it. All we do is call an API.

But I Want to Execute Server-Side Code

If you need to run server-side code (remember, edge servers can’t do that) or bridge two APIs together, Functions As A Service and serverless computing fill that gap.

In short: you write a function, deploy it to a provider like AWS, and get a URL. Calling that URL invokes the function server-side and returns the result.
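As an illustration, a function written in the AWS Lambda handler style might look like this. The event shape and export convention vary by provider, so treat this as a sketch rather than a deployable artifact:

```javascript
// A minimal serverless function in the AWS Lambda handler style.
const handler = async (event) => {
  // Read an optional query-string parameter, with a fallback.
  const name = (event.queryStringParameters || {}).name || 'world';

  // Return an HTTP-style response object; the platform turns this into
  // a real HTTP response at the URL the function is deployed behind.
  return {
    statusCode: 200,
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ message: `Hello, ${name}!` }),
  };
};
```

The function only runs when its URL is called, and the provider handles provisioning and scaling, which is exactly the gap left by having no origin server of our own.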

If you’d like to learn more about FaaS and serverless computing, read the article titled Introduction to Serverless.

Content (Headless CMS)

Every web app runs on content. So where does content come from in the Jamstack? Where does a static site pull its data? The answer: headless CMS.

A traditional CMS (like WordPress) couples the content layer with the presentation layer. That’s a monolith. A headless CMS separates them. It doesn’t care how the data gets displayed. It just lets us store and manage the content.

Some headless CMSes provide a UI for managing data. Don’t confuse that with controlling the presentation. The presentation layer is still decoupled.

There are two flavours of headless CMS: API-based and Git-based. An API-based headless CMS stores data in a database (SQL or NoSQL) and exposes RESTful endpoints. A Git-based headless CMS stores data (typically Markdown files with YAML frontmatter) in Git (GitHub, GitLab or BitBucket).
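With a Git-based CMS, a content entry is typically just a Markdown file committed to the repository. The file below, including its field names, is a made-up example of the shape such an entry usually takes:

```markdown
---
title: Introducing our new product
date: 2019-01-15
author: Jane Doe
tags:
  - announcement
---

The body of the entry is plain Markdown, rendered to HTML
by the static site generator at build time.
```

Editing content then means committing a change to this file, which in turn can trigger a rebuild and redeploy of the site.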

Example Headless CMS

| Name | Git/API | URL |
| --- | --- | --- |
| Strapi | API | https://strapi.io |
| Netlify CMS | Git | https://netlifycms.org |
| Contentful | API | https://contentful.com |
| Cosmic JS | API | https://cosmicjs.com |
| Appernetic | Git | https://appernetic.io |

The above is just a short list. For a much more extensive one, visit http://headlesscms.org.

Benefits

We’ve covered the essential pieces of the Jamstack. Now let’s talk about why someone would actually choose this stack. Some benefits land on end-users. Others help developers and organisations.

Let’s start with the end-user perspective.

End-users

How do end-users benefit? They get a noticeably better experience. Performance is the big one. Static Site Generation and Server Side Rendering help, but the real speed comes from CDN edge servers.

The site delivered to users contains only HTML, CSS and JavaScript. There’s no server-side code execution to wait for, no database queries to run. The site is distributed globally by default and served from the nearest edge server.

Because of the decoupled architecture, users also benefit from faster, leaner deploy cycles. New features reach them quicker.

Developers

Developers (and businesses) benefit from the Jamstack in several ways.

First, there are real cost savings. Thinking “static first” means no physical servers and no cloud provisioning. The infrastructure shrinks, and the whole project becomes more cost-effective. And there’s another type of cost: developer time. Some organisations have dedicated staff for managing infrastructure (provisioning, patching, database installs). Others dump that work onto the engineers writing the code. Either way, it pulls people away from their actual goal: building the site or application.

With services like Netlify, you get atomic deploys. You can ship parts of the application and roll back any change with a single click. It’s an effortless operation.

Developers also get to work in familiar environments. The entire codebase lives in Git. Content can live in Git too. Deployments can be triggered by git pushes and commits.

And then there’s security. In a traditional stack like LAMP or MEAN, an origin server executes code. Application and database servers communicate with each other. That creates a wide attack surface with lots of moving parts. With the Jamstack, there’s no origin server, no server-side code execution, fewer moving parts. CDN providers handle the security considerations you might not have thought about.

Conclusion

We’ve covered the fundamentals of the Jamstack, and I hope the article gives you a solid overview. The Jamstack matters to developers because of git workflows, continuous deployment, and atomic deploys. It matters to end-users because of performance. Both sides win.