Written by Samson Omojola
The Relay compiler is a GraphQL code generation tool used for React apps. Originally written in JavaScript, the compiler was recently rewritten in Rust. In this article, we'll explore its features, how it works, and why the rewrite was necessary.
Overview of Relay and its compiler
There are quite a number of GraphQL clients on the market, but Relay comes with some distinct features and advantages. One advantage Relay has is that it lets you think in terms of small scopes.
Using GraphQL fragments in Relay
For instance, when creating a component, you can tell Relay to only source the specific data required inside that particular component by creating a fragment.
This way, you never need to worry about the big picture. Each component can have its own fragment, and at compile time, all the fragments stitch together into a query that proceeds to fetch all the needed data.
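As a sketch of how this stitching works, assuming a hypothetical Article type and component names, two components might declare fragments like these, which the compiler composes into a single query at build time:

```graphql
# Each component declares only the fields it renders (names are hypothetical):
fragment ArticleHeader_article on Article {
  title
  author
}

fragment ArticleBody_article on Article {
  body
}

# At compile time, Relay stitches the fragments into one query:
query ArticlePageQuery($id: ID!) {
  article(id: $id) {
    ...ArticleHeader_article
    ...ArticleBody_article
  }
}
```

Each component stays concerned only with its own fragment; the composed query is a build artifact you never write by hand.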
The concept of a fragment living inside a component alongside its view code is called colocation. One advantage of colocation is that your components never over-fetch data, which helps your application perform better.
They also never under-fetch data, which prevents errors that might occur from missing data.
Another advantage of colocation is that a component only re-renders when the data specified in its fragment changes, thereby preventing unnecessary re-renders.
Relay’s data masking
Relay also provides its scope-management advantage at runtime; after the server responds to a query with data, compiler-generated files serve each component its own required data when the component is about to render.
This concept is called data masking. The advantage of this is that components cannot access data that wasn't specified in their GraphQL fragments. This way, changes made to a component and its data dependencies do not affect other components.
As one component cannot rely on another for data, data masking prevents data dependency bugs and provides your application with stability.
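Conceptually, data masking behaves as if each component's view of the server response were filtered down to just the fields its fragment declares. Here is a minimal plain-JavaScript sketch of that idea (an illustration of the concept, not Relay's actual implementation; all names are hypothetical):

```javascript
// Simulate data masking: expose only the fields a fragment declares.
// This illustrates the concept; Relay's real masking is generated by its compiler.
function maskData(data, fragmentFields) {
  const masked = {};
  for (const field of fragmentFields) {
    if (field in data) {
      masked[field] = data[field];
    }
  }
  return masked;
}

const response = { title: 'Relay in Rust', author: 'Sam', body: '...' };

// A component whose fragment only asks for `title` sees nothing else:
const headerData = maskData(response, ['title']);
console.log(headerData.title); // 'Relay in Rust'
console.log(headerData.body);  // undefined — not declared in the fragment
```

Because the component physically cannot reach fields it didn't declare, removing a field from another component's fragment can never break this one.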
Fragments can easily be used in multiple components, are easy to refactor, and make your application efficient.
Why Relay needs a compiler
Relay uses a compiler to improve runtime performance and guarantee stability. With Relay, much of the components' communication with GraphQL is carried out at build time, improving your application's runtime performance significantly.
Refetching and pagination
Tasks like refetching data and pagination can be tricky to implement in applications and are error-prone. Through APIs like useRefetchableFragment and usePaginationFragment, Relay takes advantage of its compiler to automate these tasks.
With Relay’s out-of-the-box pagination feature, you only need a few lines of code to implement pagination in your app, compared to implementing it manually.
The compiler helps you create the custom query needed for pagination and helps you keep track of information that’s often needed when paginating, like data that’s already loaded and the amount of data that has yet to load.
It hides away a lot of complexity, which is great if you simply want to put a quick pagination feature together.
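As a rough sketch of what the compiler works from, a pagination fragment typically combines the @refetchable and @connection directives; the compiler then generates the refetch query and the cursor bookkeeping for you (the Article schema and names here are hypothetical):

```graphql
fragment ArticleList_query on Query
@argumentDefinitions(
  count: { type: "Int", defaultValue: 10 }
  cursor: { type: "String" }
)
@refetchable(queryName: "ArticleListPaginationQuery") {
  articles(first: $count, after: $cursor)
    @connection(key: "ArticleList_query_articles") {
    edges {
      node {
        title
      }
    }
  }
}
```

From this one fragment, the compiler emits ArticleListPaginationQuery and tracks the cursor state, so usePaginationFragment can expose simple loadNext/hasNext handles in your component.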
Automatic type generation
The Relay compiler also enables automatic type generation to implement type safety in your application and prevent bugs.
The Relay compiler optimizes performance in your application by removing redundancies in queries and, as a result, reduces the size of your query payload. The compiler creates compact, optimized queries that run your app smoothly at runtime.
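As an illustration of the kind of redundancy the compiler can remove, suppose two colocated fragments select an overlapping field; after flattening, the generated query only needs to fetch it once (a simplified example with hypothetical names, not actual compiler output):

```graphql
# Before: two fragments on the same type each select `title`
fragment Header_article on Article {
  title
  author
}
fragment Preview_article on Article {
  title
  summary
}

# After flattening and deduplication, each field is selected once:
query ArticleQuery($id: ID!) {
  article(id: $id) {
    title
    author
    summary
  }
}
```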
It also helps to save your users’ bandwidth and improve your application’s performance by excluding the schema or string representation of GraphQL fragments from your application bundle.
Using a unique query ID
The Relay compiler helps users save bandwidth in another way, too: rather than sending a long query to your application's server, the compiler generates a unique query ID and uses that to request data from the server.
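The mechanics resemble a lookup table: at build time, each query's full text is stored under a short ID on the server, and at runtime the client sends only that ID plus variables. A simplified sketch of the idea (not Relay's actual persisted-query implementation; names are hypothetical):

```javascript
// Build time: store each query's full text under a short, stable ID.
// A real setup would derive the ID from a content hash on the server.
const persistedQueries = new Map();

function persistQuery(queryText) {
  const id = `q${persistedQueries.size + 1}`;
  persistedQueries.set(id, queryText);
  return id;
}

const queryText =
  'query ArticleQuery($id: ID!) { article(id: $id) { title } }';
const queryId = persistQuery(queryText);

// Runtime: the client sends only the short ID and variables over the wire.
const wirePayload = JSON.stringify({ queryId, variables: { id: '42' } });

// Server side: the ID resolves back to the stored query text.
const resolved = persistedQueries.get(queryId);
console.log(wirePayload.length < queryText.length); // true — the payload shrinks
```

The longer and more numerous your queries, the more this saves, since the query text never travels over the wire after build time.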
The limitations of JavaScript in Relay’s compiler
As mentioned above, the previous compiler was written in JavaScript. According to the React Relay team, JavaScript was originally picked for the compiler because it was the language that the Relay runtime and other GraphQL tools were written in.
But, in spite of all the attempts made to optimize the JavaScript compiler, its performance dwindled over time. The team’s biggest challenge with JavaScript was the fact that it’s a single-threaded language.
In Node.js, you can't run multiple threads with shared memory. Although worker threads that share memory can be created, given the size of the schemas Relay works with, this approach wouldn't have been efficient.
Why Relay uses Rust for the new compiler
With the previous JavaScript compiler, as the Relay codebase grew, it took increasingly more time to compile code.
According to the React Relay team, the constant increase in the number of queries in Relay’s codebase had been slowing down performance. It eventually became suboptimal for the problem it was created to solve.
When it became obvious JavaScript wouldn't cut it anymore, the team considered a number of languages that lacked JavaScript's single-threaded limitation and had strong internal support before landing on Rust.
C++ was eliminated for its steep learning curve and weak memory safety; Java, for not providing enough low-level control; and OCaml, for its inefficient concurrency.
In the end, the team settled on Rust for its speed, memory safety, and concurrency support, which lets large data structures be shared easily and safely across threads. The new Rust-based compiler is faster, has many new features, and was designed with scaling in mind.
Features of Relay’s new Rust compiler
The new compiler was created as a collection of independent modules that can be used in other GraphQL tools. Basically, the same modules used in the new React Relay compiler are also used internally in GraphQL tools for other platforms. The compiler comes with features like:
- TypeScript support
- Support for remote persisted queries
- The @no_inline directive, applied to fragments to prevent them from inlining
- The @required directive, which simplifies null checks
The @required directive can be added to fields in a GraphQL fragment to handle null values generated at runtime. Take the fragment below as an example:
import { graphql, useFragment } from 'react-relay';

function ArticleComponent({ article }) {
  const data = useFragment(
    graphql`
      fragment ArticleComponent_article on Article {
        tech_article @required(action: NONE) {
          title @required(action: NONE)
        }
      }
    `,
    article, // the fragment reference passed in from the parent component
  );
  // render with data.tech_article?.title here
}
Above, you have a basic fragment requesting the titles of tech articles. The @required directive attached to the title field performs a null check on it. If the title is null, Relay declares its parent field, tech_article, null as well.
The same thing happens with the @required directive applied to the tech_article field.
Now, the action parameter is the important part. This is where you specify what you want Relay to do when it finds a null field. When action is set to NONE, instead of throwing an error, your UI renders fine and nothing is displayed wherever the title value is used in your UI.
There are of course other options you can apply to your fields when handling null checks. This feature is especially useful when performing many null checks in your code.
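The null-bubbling behavior described above can be sketched in plain JavaScript (a conceptual model of @required(action: NONE), not Relay's actual implementation):

```javascript
// Model @required(action: NONE): when a required child field is null,
// the enclosing field becomes null instead of the app throwing an error.
function applyRequired(record, requiredFields) {
  if (record == null) return null;
  for (const field of requiredFields) {
    if (record[field] == null) {
      return null; // the null bubbles up to the enclosing field
    }
  }
  return record;
}

const response = { tech_article: { title: null } };

// title is required and null, so tech_article is declared null as well:
const techArticle = applyRequired(response.tech_article, ['title']);
console.log(techArticle); // null — the UI simply renders nothing for it
```

The payoff is that your component code can check one value for null instead of sprinkling null checks across every nested field.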
According to the React Relay team, the rewrite was also done to support future plans, like abstracting away more common complexities in apps and shipping more out-of-the-box features beyond pagination.
One tool that was built into the new compiler but is not public yet is a VS Code extension that makes using GraphQL easier by autocompleting field names as you type and showing you information on a field when you hover over it.
Why many use Rust to rewrite JavaScript tooling
It seems that a lot of JavaScript tooling is currently being rewritten in Rust. But why? Better speed, better performance, and better memory efficiency.
Instead of the garbage collection that JavaScript relies on, Rust uses an ownership-based memory management system that is enforced at compile time, making it faster and more memory-efficient.
The Next.js team recently added a Rust compiler to the framework to replace JavaScript tools like Babel and Terser. This was done to maximize performance and achieve faster builds and refresh rates. Their new Rust compiler is 17 times faster than Babel and seven times faster than Terser.
Fast and memory-efficient, Rust brings both low-level control and high-level ergonomics to the software world. Memory safety, one of Rust's most prominent features and biggest selling points, lets you identify and eliminate whole classes of bugs at compile time.
We are also starting to see Rust alternatives for tooling like Prettier, ESLint, and Webpack spring up.
Conclusion
Rust, which was voted the most-loved programming language for six years running (2016 through 2021), appears to complement JavaScript really well. With JavaScript's simplicity and ease of use, and Rust's speed and memory efficiency, I believe the two languages together would be unstoppable.
Full visibility into production React apps
Debugging React applications can be difficult, especially when users experience issues that are hard to reproduce. If you’re interested in monitoring and tracking Redux state, automatically surfacing JavaScript errors, and tracking slow network requests and component load time, try LogRocket.
LogRocket is like a DVR for web and mobile apps, recording literally everything that happens on your React app. Instead of guessing why problems happen, you can aggregate and report on what state your application was in when an issue occurred. LogRocket also monitors your app's performance, reporting with metrics like client CPU load, client memory usage, and more.
The LogRocket Redux middleware package adds an extra layer of visibility into your user sessions. LogRocket logs all actions and state from your Redux stores.
Modernize how you debug your React apps — start monitoring for free.