In the last few months, I have started to redesign some of our Open Source project websites. This includes the websites of the Node.js CSV project, the Node.js HBase client and the Nikita project, our homemade system deployment tool. I have used multiple static site generators in the past, but I wanted to try something new and performant, ideally rendering React.js or Vue.js on the server side. After assessing different tools including Nuxt and VuePress, I decided to go with Gatsby.js, with the additional benefit of being able to experiment with GraphQL.
Gatsby.js was created about 3 years ago by Kyle Mathews, at a time when React.js and GraphQL were gaining endorsement from the community. It expanded over time with a plugin system and a large, growing ecosystem. By backing 2 technologies at the right time, he attracted early adopters who helped push the project forward, involving people and organizations such as the Facebook and GraphQL teams as contributors. At the same time, headless CMSes and services such as Netlify were starting to gain traction as well. By the time v1 was released in July 2017, a core group had emerged with hundreds of contributors.
Gatsby.js is a blazing fast static site generator but also a full-blown startup, one which built not only a product but also a community. In May 2018, Gatsby.js raised a $3.8M seed round. How does a static site generator, based on technology which has been around for years without much innovation, raise $3.8M to become a big business?
In the traditional approach, the user requests a page from the server and the server builds the page before sending it back. With static site generators, everything is precompiled ahead of time and served to the end user. The limitation of this approach is that the time incurred by the necessary build step can become tremendous for larger websites, even though Gatsby.js is very fast and gets faster with each release. While taking a few seconds for the majority of websites, a build can take minutes to hours for the largest ones. In such cases, it becomes impossible to publish a new version of the website for every change: managing concurrent updates and waiting that long before publication is not always acceptable. Gatsby.js as a commercial product will focus on providing a tailored infrastructure and consulting services, as well as custom tools to support incremental builds.
Gatsby as an open source product shall remain unchanged. In everybody's mind, the word static means taking a bunch of data, processing it and spitting HTML out of it. Gatsby.js takes it a little further on multiple fronts.
To begin with, it is very performance focused. Furthermore, it is able to act as a universal data consumer.
When a website is built with Gatsby, it automatically handles code splitting, minification and all the optimizations that need to happen, such as preloading the next page in the background and processing and resizing images. The site that is built, without any specific intervention, already performs better than average as soon as you ship it. This matters even more nowadays since Google announced a few months ago that mobile performance would be used as an indicator in search engine ranking. This approach also goes hand in hand with the best practices promoted by the JAMstack movement:
- Entire Project on a CDN
- Everything Lives in Git
- Modern Build Tools
- Automated Builds
- Atomic Deploys
- Instant Cache Invalidation
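To give an idea of what this looks like in practice, here is a minimal sketch of a `gatsby-config.js` enabling some of the optimizations mentioned above. The plugin names are real Gatsby plugins, but the paths and options are assumptions to adapt to your own project:

```javascript
// gatsby-config.js (sketch, not a complete configuration)
module.exports = {
  plugins: [
    // Source local Markdown files so they become queryable GraphQL nodes
    {
      resolve: "gatsby-source-filesystem",
      options: { name: "content", path: `${__dirname}/content` },
    },
    // Image processing and resizing at build time
    "gatsby-plugin-sharp",
    "gatsby-transformer-sharp",
    // Manifest and service worker, which feed the Progressive Web App audits
    {
      resolve: "gatsby-plugin-manifest",
      options: { name: "My Site", short_name: "site", start_url: "/" },
    },
    "gatsby-plugin-offline",
  ],
}
```

Declaring the plugins is all it takes; the code splitting and preloading mentioned above require no configuration at all.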
Out of the box, a Gatsby.js site gets excellent results on the Lighthouse tests, and it is almost trivial to get every test to pass with the maximum score. Lighthouse is an open-source automated tool published by Google for improving the quality of web pages. It provides audits for performance, accessibility, progressive web apps, and more. To give an idea, here's a quick extract of some of the requirements checked by Lighthouse:
- Performance: no blocking scripts, lazy loads of images, cache policies, rendering timing, CPU usage, …
- Progressive Web App: usage of manifests, service worker and Cache API, splash screen, …
- Accessibility: correct usage of aria attributes, meta information, local settings, …
- Best Practices: HTTPS, HTTP/2, doctype, no error output, …
- SEO: mobile friendliness, data structure, HTTP status code, …
Google considers that any page taking more than 3s to load loses 50% of its visitors. On 3G, the rendering time often exceeds 12s to 15s. If you follow the project recommendations and use the available tooling, for example for image handling, a Gatsby.js site will take on average 1s to 5s on that same 3G connection.
Here are the performance scores of 3 websites we recently built, the third one being the Nikita site:

| Metric | Site 1 | Site 2 | Nikita |
|---|---|---|---|
| First Contentful Paint | 1,410 ms | 990 ms | 970 ms |
| Speed Index | 1,410 ms | 990 ms | 2,880 ms |
| Time to Interactive | 1,430 ms | 2,590 ms | 4,810 ms ** |
| First Meaningful Paint | 1,410 ms | 1,010 ms | 970 ms |
| First CPU Idle | 1,410 ms | 2,480 ms | 3,400 ms |
| Estimated Input Latency | 13 ms | 25 ms | 75 ms |

** Note, the high Time to Interactive result for Nikita is due to the canvas animation present on the homepage and only applies to this page.
Data may be sourced from any provider as long as it can be converted into a JSON object. This includes a database, a remote REST endpoint or Markdown files stored locally. Gatsby.js consumes this data by exposing it through a uniform, central GraphQL endpoint. The developer working on a Gatsby.js site thus has a uniform surface to work on: writing code in React and consuming data from GraphQL. As for content editors, the content team can keep working in Wordpress, the sales team in Salesforce and the developers can write their documentation in Markdown. Nobody has to change their workflow, because Gatsby.js can consume all the sources while the frontend team builds amazing sites without being concerned with where the data comes from.
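A minimal sketch of that idea, with hypothetical helper names (this is not the actual Gatsby source plugin API): whatever the provider, each record ends up as a plain JSON node with a common shape, which is what makes a single query surface possible.

```javascript
// Hypothetical sketch: heterogeneous sources normalized into one JSON shape.
// Neither helper is part of Gatsby; they only illustrate the idea.
const fromMarkdown = (file) => ({
  sourceName: "markdown",
  title: file.frontmatter.title,
  body: file.content,
});

const fromRestApi = (record) => ({
  sourceName: "rest",
  title: record.name,
  body: record.description,
});

// Once normalized, every node can be queried the same way,
// regardless of where it originally came from.
const nodes = [
  fromMarkdown({ frontmatter: { title: "Getting started" }, content: "..." }),
  fromRestApi({ name: "HBase client", description: "..." }),
];

const titles = nodes.map((node) => node.title);
console.log(titles); // [ 'Getting started', 'HBase client' ]
```

In Gatsby, source and transformer plugins play the role of these helpers, feeding the normalized nodes into the GraphQL layer.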
Introducing a GraphQL layer on top of a large project impacts the velocity of the team, which interacts with a centralized GraphQL service instead of multiple REST endpoints. It shifts the way the frontend is built: instead of writing multiple async requests, you declaratively define what you need, without dealing with access rights and authentication, data reconciliation, performance overhead or the request scheduling that breaks the developer's workflow. Gatsby is concerned with both how to get the data and how to display it, which makes it unique compared to other GraphQL solutions and static site generators.
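To make the contrast in call patterns concrete, here is a hypothetical sketch (the endpoints, the `api` object and the query shape are invented for illustration): with REST the client schedules several round trips itself, while with GraphQL the page declares its entire data need in one query string and lets the server resolve it.

```javascript
// REST style: one explicit async request per resource,
// scheduled and reconciled by the client.
async function viaRest(api) {
  const site = await api.get("/site");
  const pages = await api.get("/pages");
  return { title: site.title, slugs: pages.map((p) => p.slug) };
}

// GraphQL style: the same data need, declared as a single query.
const pageQuery = `{
  site { title }
  allPages { slug }
}`;

// Tiny in-memory stub standing in for a real REST server.
const stubApi = {
  get: async (path) =>
    path === "/site"
      ? { title: "Nikita" }
      : [{ slug: "/about" }, { slug: "/docs" }],
};

viaRest(stubApi).then((result) => console.log(result));
// { title: 'Nikita', slugs: [ '/about', '/docs' ] }
```

The declarative version moves the scheduling and reconciliation work to the server, which is exactly the shift described above.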
A lot of developers, including John Resig, the creator of jQuery, present GraphQL as a viable alternative to REST. With the growth of public and private APIs, editors find it hard to understand everything required to integrate that data into their applications, and GraphQL can help in this matter, offering a better developer experience than REST has provided.
In the last few months, efforts have been made to prepare and deliver version 2 of Gatsby.js. Along with this new major release, an entirely redesigned website was published to improve navigation and enrich the documentation.
Here are some expectations for the near future:
- Build time: While Gatsby is focused on the performance of the generated sites, it is not among the fastest engines in terms of content generation. The team has engaged multiple initiatives to optimize the build process. Incremental builds are among the features expected in the future.
- Internationalization: There is currently no official way of doing internationalization and users have to build their own. Work is anticipated to determine whether or not it is Gatsby's responsibility to handle such use cases and what the best practices are for doing it.
- Accessibility: It has been a main focus in the past, providing basics such as running the JSX accessibility plugin in the ESLint configuration. In the future, the team hopes to increase the support for accessibility, for example by finding a proper way to indicate navigation changes to users of screen readers.