Hello World! My name is S and I am the head of growth here at Wundergraph. The article was written by our CEO / CTO Jens Neuse. Enjoy!
Versioning APIs is an essential part of the API lifecycle. Some API styles, like GraphQL, omit versioning entirely and call it a feature. Others, like RESTful APIs, give developers many different ways to implement versioning.
I think that versioning for APIs is important but also way too complex. It's important because backwards compatibility is critical in a world of inter-connected companies, using APIs as the bridge. At the same time, it's also a complex problem to solve for development teams.
More and more companies are starting to understand their APIs as products. The companies of tomorrow will not operate in isolation. Instead, they will be using APIs from 3rd parties while providing APIs to others themselves.
Relying on other companies' APIs will give these companies an advantage as they can be more focused on their own business. At the same time, providing their own APIs as a product to other companies will give them an advantage over those companies who don't let others easily integrate with them. All this will result in a win-win situation for those participating. I expect that this trend can only lead to exponential growth. The more problems are easily solvable by integrating with an API, the easier it becomes for others to build new business models on top, which, again, will add more APIs to the ecosystem.
We'll eventually reach a state where every problem can be solved by using an API.
So, what are the challenges ahead of us to get there?
If we want to be able to solve any problem with APIs, we have to make sure that all APIs involved are backwards compatible, forever. If any API in this interconnected mesh of APIs introduces breaking changes, the whole system could fail just like a house of cards.
Additionally, a lot of API consumers are not able to keep up with the changes you'd like to make to your API. Think of IoT devices, for example. It might not be possible to update them once deployed. Another example is native apps for iOS and Android. Users don't automatically update an app just because the developer decided to push an update. There's always a huge lag, up to a year or even more, between shipping an update and being able to deprecate the old version.
At the same time, breaking changes are important. Maintaining APIs forever is hard, especially if you're trying to move fast or are working in new uncharted territory with little experience. You'll probably not be able to get your API right with the first iteration. Having to maintain backwards compatibility for your API can be a huge burden, eating up a lot of resources while distracting you from working on something new and innovative, something that gives your users additional value.
Ideally, you could introduce breaking changes whenever you want, without breaking anything.
In this post, I'll explain a concept on how we can achieve exactly this. I want you to be able to break your API all the time, but without breaking any of your API clients.
You'll also see why we're going to be using GraphQL as the underlying API specification. Even though OpenAPI Specification has more adoption, we'll see why GraphQL is going to rule the integration market in the upcoming years.
You've probably read about the "advantages" of GraphQL over REST. Most of these blog posts are just trying to surf the hype wave. In this blog post, I'll present a real advantage, not the usual underfetching/overfetching fad. We're also not going to "generate" APIs today, even though that gives you a lot of dopamine in the first 5 minutes (and a lot of stress when you have to add custom business logic).
I hope the "REST enthusiasts" are still on board. You'll learn something cool today, I promise.
Versionless APIs
I call the concept I'm explaining today Versionless APIs. Versionless doesn't mean there are no versions. Versionless APIs is meant in the same way as Serverless.
Serverless is not about "no servers". Serverless means, you don't have to deal with servers.
Versionless means, you don't have to deal with versions.
Misconceptions about versioning GraphQL and REST APIs
I talked about versioning before but am happy to recap again.
When you read about the advantages of GraphQL over REST APIs, you'll hear quite often that GraphQL is better because you don't "have to version your API".
This statement is driving me nuts, because it makes absolutely no sense at all. GraphQL is not better in any sense when it comes to versioning. If you don't version your REST API, there's absolutely no difference between the two.
GraphQL simply doesn't offer a built-in solution to versioning. Although that's not entirely true: you could add a new field with a version suffix and deprecate the old one using the @deprecated directive.
Here's an example. Version 1:
type Query {
  hello: String
}
Version 2:
type Query {
  hello: String @deprecated(reason: "please use helloV2 instead")
  helloV2(arg: String!): String
}
What's the difference between the example above and adding a new endpoint to your REST API, with a version tag in the URL, as a Query Parameter or maybe a Header?
For both REST and GraphQL, you'd have to maintain two implementations, one for hello and one for helloV2.
There's also an IETF Draft by Erik Wilde on the Deprecation HTTP Header Field, which does essentially the same thing as the @deprecated directive. There's another Draft, again by Erik Wilde, on the Sunset HTTP Header, which helps developers understand when an API goes out of service. Erik seems to care about the lifecycle of APIs. Thank you, Erik!
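For comparison, here's a minimal Express sketch of the REST side of the same evolution: a v1 endpoint kept alive next to its v2 successor, with the Deprecation and Sunset headers announcing the old endpoint's lifecycle. The routes, payloads, and dates are hypothetical, and the exact header syntax is defined by the drafts/RFC, so treat this as an illustration rather than a reference.

const express = require('express');

const app = express();

// Old version, kept alive for backwards compatibility and marked as deprecated.
app.get('/v1/hello', (req, res) => {
  res.set('Deprecation', 'Sat, 01 Jan 2022 00:00:00 GMT'); // Deprecation header draft
  res.set('Sunset', 'Sun, 01 Jan 2023 00:00:00 GMT');      // Sunset header (RFC 8594)
  res.set('Link', '</v2/hello>; rel="successor-version"');
  res.json({ hello: 'world' });
});

// New version, the REST equivalent of the helloV2 field in the GraphQL schema above.
app.get('/v2/hello', (req, res) => {
  res.json({ hello: `hello ${req.query.arg ?? ''}` });
});

app.listen(3000);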
With all this, is there really any difference between REST and GraphQL when it comes to versioning? If you don't want to version your APIs, you could just not break them.
Additionally, you could also run multiple versions of your GraphQL API. Who said example.com/graphql/v2 is not ok? It might be hard to maintain because there's little tooling to support this use case, but it is possible, although I don't think it's a great idea.
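To make the point that this is merely unusual, not impossible, here's a minimal sketch using express-graphql that mounts two schema versions under versioned paths. The schemas and resolvers are hypothetical; it only illustrates that nothing technically stops you.

const express = require('express');
const { graphqlHTTP } = require('express-graphql');
const { buildSchema } = require('graphql');

const app = express();

// Two independent schema versions, mounted side by side.
const schemaV1 = buildSchema(`
  type Query {
    hello: String
  }
`);

const schemaV2 = buildSchema(`
  type Query {
    hello: String @deprecated(reason: "please use helloV2 instead")
    helloV2(arg: String!): String
  }
`);

app.use('/graphql/v1', graphqlHTTP({
  schema: schemaV1,
  rootValue: { hello: () => 'world' },
}));

app.use('/graphql/v2', graphqlHTTP({
  schema: schemaV2,
  rootValue: { hello: () => 'world', helloV2: ({ arg }) => `hello ${arg}` },
}));

app.listen(3000);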
To end this excursion about misconceptions, I'd like to make a point that I don't consider GraphQL by itself as Versionless. I'll discuss later what exactly is meant by Versionless.
First, let's talk about why GraphQL is such a great language for API integration.
Why GraphQL is going to take over the API integration market
This is the section you've probably been waiting for. I'm very happy to share this concept with you today. We're actively working on this right now; if you're interested in trying it out as early as possible, feel free to sign up for the early adopter programme.
Ok, what is it that GraphQL is actually better at, compared to REST? Actually, it's not just GraphQL. GraphQL alone is not enough; it's about Federation.
Federation allows you to extend types of another GraphQL API. The other feature that's going to help us is Interfaces, rarely used but extremely powerful.
Let's look at an example. Imagine we have two companies in our universe: the first provides an API to retrieve the Latitude and Longitude for a given address, the second offers an API to get the current weather for a Latitude/Longitude pair.
What could our universe of APIs look like?
First, let's look at the Geocoder company. What could we do to make it super easy to adopt?
Instead of forcing a company into vendor lock-in, could we design an abstract API? Yes, absolutely!
interface IGeoCoder {
  geoCode(address: String!): ILatLng
}

interface ILatLng {
  latitude: Float
  longitude: Float
}
This abstract GeoCoder specification could live in a git repository, e.g. github.com/graphql-schemas/geocoder, but that's just an implementation detail. Let's keep it high level for now.
Alright, how could the GeoCoder company implement this abstract GeoCoder?
type Query implements IGeoCoder {
  geoCode(address: String!): LatLng
}

type LatLng implements ILatLng @key(fields: "latitude longitude") {
  latitude: Float
  longitude: Float
}

interface IGeoCoder @specifiedBy(git: "github.com/graphql-schemas/geocoder") {
  geoCode(address: String!): ILatLng
}

interface ILatLng @specifiedBy(git: "github.com/graphql-schemas/geocoder") {
  latitude: Float
  longitude: Float
}
With this schema, the GeoCoder company made their API conform to the official GeoCoder standard.
A side note for those not so familiar with the Federation specification: the directive @key(fields: "latitude longitude") turns LatLng into an entity as per the Federation spec. This means any other service can look up a LatLng object using the fields latitude and longitude.
What's the benefit of this?
It's not just that we've solved the vendor lock-in problem. We've also made it very easy for a company to adopt an API. As someone who's looking to solve a problem through APIs, you look for an open standard, e.g. Open Banking, FHIR, or a simpler one like the GeoCoder above, search for companies that implement the spec, and integrate with them.
This will lead to an open market of APIs that have to compete on quality, latency, support, and so on, because vendors can be swapped easily. Compared to how things work today, this would be a huge step forward for API consumers. Nowadays, if you use a GeoCoder or want to send SMS or E-Mails via an API, you're very easily locked into a vendor, which doesn't have to fear competition much because swapping vendors is expensive.
There are even new startups whose entire focus is helping users migrate away from specific vendors. Ideally, you could just switch from one implementation to another and call it a day.
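To make that concrete, here's a hedged sketch of what swapping vendors could look like for a client of the abstract GeoCoder spec. The query only touches fields defined by the shared interface, so switching providers is just a change of endpoint URL (both URLs below are made up).

// Works against any vendor implementing the abstract GeoCoder spec,
// because it only selects fields defined by that spec.
const GEO_CODE_QUERY = `
  query ($address: String!) {
    geoCode(address: $address) {
      latitude
      longitude
    }
  }
`;

// Uses the global fetch available in Node 18+ (or bring your own fetch polyfill).
async function geoCode(endpoint, address) {
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: GEO_CODE_QUERY, variables: { address } }),
  });
  const { data } = await res.json();
  return data.geoCode;
}

// Swapping vendors is a one-line change (hypothetical endpoints):
// await geoCode('https://geocoder-a.example.com/graphql', 'Berlin');
// await geoCode('https://geocoder-b.example.com/graphql', 'Berlin');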
Alright, we're done with the GeoCoder. If you liked the idea of avoiding vendor lock-in and an open market for APIs, you'll be surprised by what comes next, because this very next thing is about true API collaboration.
Let's talk about the Weather API provider. How can they make sure to get as much exposure as possible? How can they be compatible with as many other APIs as possible?
Here's a draft of what the Weather API "contract" could look like:
interface IWeatherApi extends ILatLng
  @specifiedBy(git: "github.com/graphql-schemas/weather-api")
  @key(fields: "latitude longitude") {
  latitude: Float @external
  longitude: Float @external
  weatherInfo: IWeatherInfo
}

interface IWeatherInfo @specifiedBy(git: "github.com/graphql-schemas/weather-api") {
  temperature: ITemperature!
  summary: String!
}

interface ITemperature @specifiedBy(git: "github.com/graphql-schemas/weather-api") {
  Celsius: Float
  Fahrenheit: Float
}

interface ILatLng @specifiedBy(git: "github.com/graphql-schemas/geocoder") {
  latitude: Float
  longitude: Float
}
Let's assume we're storing this specification for a simple weather API in a git repository too: "github.com/graphql-schemas/weather-api"
The WeatherAPI provider can now implement the following schema:
type LatLng implements IWeatherApi @key(fields: "latitude longitude") {
  latitude: Float @external
  longitude: Float @external
  weatherInfo: WeatherInfo
}

type WeatherInfo implements IWeatherInfo {
  temperature: Temperature!
  summary: String!
}

type Temperature implements ITemperature {
  Celsius: Float
  Fahrenheit: Float
}

interface IWeatherApi extends ILatLng
  @specifiedBy(git: "github.com/graphql-schemas/weather-api")
  @key(fields: "latitude longitude") {
  latitude: Float @external
  longitude: Float @external
  weatherInfo: IWeatherInfo
}

interface IWeatherInfo @specifiedBy(git: "github.com/graphql-schemas/weather-api") {
  temperature: ITemperature!
  summary: String!
}

interface ITemperature @specifiedBy(git: "github.com/graphql-schemas/weather-api") {
  Celsius: Float
  Fahrenheit: Float
}

interface ILatLng @specifiedBy(git: "github.com/graphql-schemas/geocoder") {
  latitude: Float
  longitude: Float
}
You're probably wondering what's going on here. It's indeed a lot to unpack, so let's go through it step by step.
interface IWeatherApi extends ILatLng
  @specifiedBy(git: "github.com/graphql-schemas/weather-api")
  @key(fields: "latitude longitude") {
  latitude: Float @external
  longitude: Float @external
  weatherInfo: IWeatherInfo
}
We define a new contract, the IWeatherApi, which, like all other contracts, is just an abstract definition and therefore an Interface. It is specified in a fictitious git repository ("github.com/graphql-schemas/weather-api"). This Interface extends the ILatLng Interface, which, as we can see below, is defined by the GeoCoder spec. The directive @key(fields: "latitude longitude") declares latitude and longitude as the (foreign) key fields of the ILatLng entity. Furthermore, the @external directives mark the two fields as external, meaning that they come from the foreign service. The field weatherInfo has no directive attached, meaning our own service is going to provide it.
interface ILatLng @specifiedBy(git: "github.com/graphql-schemas/geocoder") {
  latitude: Float
  longitude: Float
}
While defining the IWeatherApi contract, we're making use of the ILatLng Interface. By using the @specifiedBy directive, we're making sure that we link to the correct specification.
By the way, it could be absolutely valid to implement multiple interfaces. If there are multiple standards, a service could implement one or more of them, allowing compatibility with all implemented (linked) specifications.
type LatLng implements IWeatherApi @key(fields: "latitude longitude") {
  latitude: Float @external
  longitude: Float @external
  weatherInfo: WeatherInfo
}
Finally, we're implementing the IWeatherApi contract with a non-abstract, concrete type definition.
So far, this should at least make some sense from a technical perspective. But what does all this mean from a business perspective?
Both the GeoCoder API provider and the Weather API provider implement open standards; we've touched on avoiding vendor lock-in before. But the Weather API is a special case because it's not implementing the Query type. Instead, it's extending the ILatLng interface, specified in another open standard.
Building links between open standards of API specifications is the future of the API economy.
Instead of pushing the work of integrating multiple APIs to the API consumer, the API provider can actually add these links to other open standards, making it easy for consumers of such open standards to integrate with additional APIs.
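To illustrate what such a Link buys the consumer, here's a hedged sketch of a query against a federated gateway that composes both providers (the gateway URL is made up): geocoding and weather come back in a single round trip, and under the hood the gateway resolves the LatLng entity against the Weather subgraph via the Federation _entities mechanism.

// One query against the composed graph: geocode an address, then follow
// the Link into the Weather API via the shared LatLng entity.
const WEATHER_FOR_ADDRESS = `
  query ($address: String!) {
    geoCode(address: $address) {
      latitude
      longitude
      weatherInfo {
        summary
        temperature {
          Celsius
        }
      }
    }
  }
`;

// Uses the global fetch available in Node 18+.
async function weatherForAddress(address) {
  const res = await fetch('https://gateway.example.com/graphql', { // hypothetical federated gateway
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ query: WEATHER_FOR_ADDRESS, variables: { address } }),
  });
  return (await res.json()).data.geoCode;
}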
API Mesh - building Links between standardized APIs, specified using open standards
Imagine a world that is not just "API first", a world where we don't just treat APIs as products. Imagine a world where we standardize on specific use cases, like GeoCoding, transferring money, sending SMS, and define these as open standards.
Imagine a world where we would not just define these open standards but also add links between them, a mesh of APIs or API mesh.
Imagine a world where every company is API first, implements open standards and has "Links" to implementations of other API providers.
Imagine the possibilities, how easily you'd be able to integrate APIs from 3rd parties. You'd look up the open standards you'd like to use, search for the best vendors and start using them.
Are you interested in becoming one of the ambassadors for such a world? Join our early access program and become part of a group of forward thinkers and API enthusiasts.
Versionless APIs - Why backwards compatible APIs are so important
I apologize if I drifted too far away from the core topic of this blog post. I'm going to do another write up on the concept of the API Mesh. That said, I think the stage is set to talk about why backwards compatible APIs are essential to make this future a reality.
Think about a mesh of thousands of public (not unprotected) APIs with Links between all of them. APIs can be stacked on top of one another. All this means there are a lot of dependencies between all the API providers. If the GeoCoder API provider decides to rename the latitude field, it's not just affecting their own API consumers but also the Weather API provider, whose contract would immediately break. In reality, the consequences of a small breaking change could ripple through the whole mesh of APIs.
So, I think it's fair to say that without guaranteed 100% backwards compatibility, it's not possible to turn this vision into reality.
How to add breaking changes to your GraphQL API without breaking clients
If you've made it this far, you're probably sold on the idea of an interconnected Mesh of GraphQL APIs and keen to see how it's possible to add breaking changes without breaking clients, or at least you're interested in a possible solution.
If you've read a few other posts on this blog, like this super popular one on GraphQL security, you're probably familiar with the concept of how WunderGraph uses JSON-RPC in front of a virtual GraphQL API.
For those not yet familiar with the concept, here's a short recap.
WunderGraph takes all your REST and GraphQL APIs, as well as APIs generated from your database, and merges them into one single GraphQL Schema. This GraphQL schema is never directly exposed to the public, which is why I call it the "Virtual Schema" or "Virtual API". Instead of directly exposing a GraphQL API, we're taking the approach used by companies like Facebook, Twitter & Co., with one small adjustment: we've turned their custom-built solutions into a ready-to-use product.
During development time, developers define the GraphQL operations they'd like to use in their application. These Operations will be compiled into something similar to "Prepared Statements", essentially removing GraphQL from the runtime and replacing it with JSON-RPC.
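Conceptually, the runtime difference looks roughly like the sketch below. The endpoint path and payload shape are illustrative, not WunderGraph's actual wire format.

// During development, the developer writes a named GraphQL operation, e.g.:
//
//   query Weather($address: String!) {
//     geoCode(address: $address) { weatherInfo { summary } }
//   }
//
// At runtime, the client no longer sends GraphQL at all. It calls a
// pre-compiled, JSON-RPC style endpoint for that operation and only
// transmits the variables (global fetch, Node 18+).
async function weather(address) {
  const res = await fetch(
    'https://api.example.com/operations/Weather?' + new URLSearchParams({ address }) // illustrative endpoint
  );
  return res.json();
}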
This comes with a lot of upsides. On top of the list comes security. Not allowing clients to define arbitrary Queries is the easiest way to improve security. If you want to dive deeper into this topic, this post on security is for you.
Pre-Compiling the Operations into efficient code also improves performance because a lot of complex computational steps, like validation or execution planning, can be skipped.
Additionally, we're able to extract JSON-Schema definitions for each "persisted" Operation, allowing both server and client to validate the user inputs easily.
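For example, with the JSON Schema extracted for an operation's variables, input validation on either side is only a few lines. The sketch below uses Ajv, and the schema shown is a made-up example for a single required String variable.

const Ajv = require('ajv');

// A JSON Schema that could be extracted for the variables of a persisted operation.
const variablesSchema = {
  type: 'object',
  properties: {
    address: { type: 'string' },
  },
  required: ['address'],
  additionalProperties: false,
};

const ajv = new Ajv();
const validateVariables = ajv.compile(variablesSchema);

function assertValidInput(variables) {
  if (!validateVariables(variables)) {
    throw new Error('Invalid input: ' + ajv.errorsText(validateVariables.errors));
  }
}

assertValidInput({ address: 'Berlin' }); // ok
// assertValidInput({});                 // throws: missing required property 'address'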
But there's another fantastic side effect of this JSON-RPC GraphQL facade architecture which comes in quite handy when it comes to making APIs versionless.
Coming back to the simple example from the beginning:
type Query {
  hello: String
}
If a client were consuming this API, it'd probably look like this: the client would create an RPC Endpoint that stores a Query with the field hello, expecting a response conforming to the following JSON Schema:
{
  "type": "object",
  "properties": {
    "data": {
      "type": "object",
      "properties": {
        "hello": {
          "type": "string"
        }
      },
      "additionalProperties": false
    }
  },
  "additionalProperties": false,
  "required": ["data"]
}
Here's the stored Query:
{ hello }
Remember, this client and the whole API Mesh is relying on this API. Now, let's introduce a breaking change. We'll rename the field hello to helloV2, no deprecation, just rename and deploy.
Whenever a client is generated, WunderGraph remembers which client understands which version of an API, like a Snapshot in time. If you keep a history of Schema Changes and know at which time a client was generated, you're able to tell which version of a Schema a client understands.
With this information, we're able to automatically prevent the breaking change from being deployed. But that's not all. We can also let you "auto-migrate" the client to the new Schema.
I call it a migration. Maybe the term is misleading, but I like the analogy of applying a set of migrations to a database until it reaches compatibility with the newest state.
So, whenever your intention is to break an API, we'll prevent you from breaking clients by automatically stopping the deployment. Then, we'll let you write a "migration" script to migrate older clients onto the new Schema to make them compatible again.
What would the migration look like in our scenario?
First, instead of querying the field hello, we rewrite the Query to use the field helloV2. This alone would still break the client because the response would no longer conform to the JSON Schema. So, in a second step, we'd have to rename the field data.helloV2 back to data.hello. Alternatively, we could also rewrite the Query with an alias:
{ hello: helloV2 }
With this migration in place, we're good to deploy our new Schema with the breaking change.
All clients with a timestamp older than the deployment time of the Schema will run through the migration.
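What could such a migration look like in code? Here's a hedged sketch; the hook names and shape are made up for illustration and are not WunderGraph's actual migration API. It rewrites the stored operation to the new field and maps the response back into the shape old clients expect.

// Hypothetical migration for clients generated before the hello -> helloV2 rename.
const migration = {
  // Only clients whose snapshot is older than this schema deployment run the migration.
  appliesToClientsGeneratedBefore: '2021-06-01T00:00:00Z',

  // Option 1: rewrite the stored operation to the new field,
  // aliased back to the old name so the response shape stays stable.
  rewriteOperation(operation) {
    return operation.replace('{ hello }', '{ hello: helloV2(arg: "legacy") }');
  },

  // Option 2: rename the field in the response so it still conforms to the
  // JSON Schema the old clients were generated against.
  rewriteResponse(response) {
    if (response.data && 'helloV2' in response.data) {
      response.data.hello = response.data.helloV2;
      delete response.data.helloV2;
    }
    return response;
  },
};

module.exports = migration;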
You can then look at your analytics and decide how many old versions of clients you'd like to support.
What does this mean for an API provider from the business perspective?
You can iterate a lot faster, break things, and move forward, all while not putting off your existing clients and users.
What does it mean to developers?
They've got a simple tool to migrate old clients. Thanks to the analytics, they can ship updates with confidence as they know they won't break any clients. This is going to be a game changer for those who have to support mobile clients. Mobile users will not immediately download and install your updated app. You might have to maintain old versions of your API for months or even years. With this approach, one big challenge is out of the way. You can use all the benefits of GraphQL while decoupling the client (which you cannot directly control) from the GraphQL Schema.
You could even completely swap out the Schema, all while maintaining compatibility with all clients by migrating them over.
Want to migrate from FaunaDB to dgraph or vice versa? We've got you covered!
What does it mean to the API Mesh as a whole?
As stated above, keeping the API Mesh as a whole intact, that is, not breaking it, is the key requirement to be able to build Links between the APIs and keep the API contracts between implementations and clients intact.
Without Versionless APIs, a Mesh of APIs isn't really possible.
Alternative solutions to keep your GraphQL API backwards compatible
I'd like to highlight one open source solution that tries to solve the same problem with a different approach. The library is called graphql-query-rewriter and does exactly what the name suggests: it's a Node.js-compatible middleware that allows you to rewrite GraphQL requests.
Isn't it ironic that some people in the GraphQL community claim that the absence of versioning features in the GraphQL specification is itself a feature, while the almost 400 stars on this library indicate that there's a real need for versioning?
The approach taken is slightly different from the one I've proposed in this post. The library supports a number of options for rewriting GraphQL requests:
FieldArgTypeRewriter
FieldArgNameRewriter
FieldArgsToInputTypeRewriter
ScalarFieldToObjectFieldRewriter
JsonToTypedObjectRewriter
NestFieldOutputsRewriter
The way it works is that it inspects the GraphQL Operation AST, finds matching rewrite rules, and applies them.
As we can see from the list above, there are quite a few options to choose from, but there will always be edge cases where a rewrite might not be possible.
The library README states that there are some limitations regarding aliased fields. There's also an issue with rewriting GraphQL documents containing multiple GraphQL Operations.
Here's a simple example of how to configure the rewriter:
// The rewriter classes come from graphql-query-rewriter; the Express middleware
// is provided by its Express integration (check the library README for the exact import paths).
const { FieldArgTypeRewriter } = require('graphql-query-rewriter');
const { graphqlRewriterMiddleware } = require('express-graphql-query-rewriter');

// Register the middleware before your GraphQL handler so incoming requests are
// rewritten (here: the `id` argument of `userById` is migrated from String! to ID!).
app.use('/graphql', graphqlRewriterMiddleware({
  rewriters: [
    new FieldArgTypeRewriter({
      fieldName: 'userById',
      argName: 'id',
      oldType: 'String!',
      newType: 'ID!'
    }),
  ]
}));
What I like about this library:
If you're already using a Node.js GraphQL server, this solution can get you pretty far without much effort. The configuration of the rules seems straightforward.
A few things to think about:
It seems to me that the rewrite rules are not fully typesafe. Type literals like String! (Non-Nullable String) are treated like plain strings. I guess you'd have to add additional tests to make sure that all rewrites are correct.
There's also no specific version tag or anything similar. This means the library treats all API clients the same. I think it would be beneficial to keep track of all the clients and their versions, but this seems out of scope for the library. My fear is that over time it can become quite messy if you don't know which clients are using which version of the schema, because there's no clear cut between versions. That is, if you remove one of the rewrites, it's quite unpredictable which clients will be affected.
Another problem I see with this approach is that it's a Node.js-only solution. If you're not using Node.js for your GraphQL server, you'd have to re-implement the logic in your language of choice or run a separate Node.js process to handle the rewrites.
In general, I believe that solutions like "rewriting requests" do not belong in the application itself. API Gateways or advanced proxies are the right place for these rules.
My biggest criticism, though, is about the rewrite strategy itself and has to do with the absence of version tags in the clients. Imagine there's a field foo on the type Query. In our second iteration, we add a new field called bar and remove the field foo. To not break any clients, we add a rewrite rule from foo to bar. Later on, we decide we want to add a new field called foo (again) but with a completely different meaning. Re-adding this field is not really possible because we're only allowed to add breaking changes in one direction. Without a timestamp or version tag in the client, we're not able to distinguish between old clients that wanted the old foo field (rewritten to bar) and new clients that actually want the new foo field without rewrites.
The approach taken by WunderGraph embeds a version hash into the client. This allows us to clearly identify the version of the GraphQL Schema the client understands so that we can correctly rewrite it.
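A minimal sketch of that idea follows; the header name, hashes, and lookup are illustrative, not WunderGraph's actual implementation. Every generated client sends the hash of the schema snapshot it was built against, and the server selects the chain of migrations based on it.

// Hypothetical server-side lookup: pick migrations based on the schema snapshot
// hash that the generated client embeds in every request.
const migrationsBySchemaHash = {
  '3f2a9c': [require('./migrations/hello-to-helloV2')], // hypothetical migration module, see sketch above
  '8d41be': [],                                         // clients built against the current schema need no rewrites
};

// `req` is an Express-style request object.
function selectMigrations(req) {
  const clientSchemaHash = req.header('X-Client-Schema-Hash'); // illustrative header name
  return migrationsBySchemaHash[clientSchemaHash] ?? [];
}

module.exports = { selectMigrations };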
To sum up this section: I think this library is a really smart solution. If you're aware of what it can do for you and where its limitations are, it can serve you well.
Summary and Conclusion
We've discussed why versioning of APIs is important and how it enables companies to move forward with their products. At the same time, we've looked into the challenges of maintaining backwards compatible APIs, especially with GraphQL.
We've then compared the differences of versioning between REST and GraphQL APIs. I hope I've made it clear that there isn't really much of a difference.
Next, we went on a small excursion into the topic I'm most excited about: enabling collaboration through APIs using open standards, and the ability to build Links between APIs.
This led to the core of the blog post: how we can make APIs Versionless, using JSON-RPC in combination with API snapshots and automatic client migrations as described above.
We've also looked into an alternative approach and discussed pros and cons of both solutions.
So, Versionless APIs are not just a smart approach to keeping APIs backwards compatible without huge overhead. They are an enabler for a whole new ecosystem of API collaboration.
If you're keen to try this out as soon as possible and want to work on this together with us, shaping the future of API collaboration, sign up for our early access programme!
Like what you read?
Interested in learning more about Wundergraph? Contact us here!
Top comments (3)
Backwards compatibility: it's not only a versioning and protocol thing, it's your whole ecosystem. Your example is not what would happen in the real world. You would just provide both fields and add a deprecation warning to the old one.
But renaming is a bad idea in any case. So first I would work on my workflows. Normally I only rename things when it's far from release and it's unclear which fields are needed. Once released, you live with it and change things later, just as described, with an additional field. In GraphQL it's pretty simple. Normally REST APIs are tightly coupled with your ORM, so it's a little bit more work, but of course also possible.
Is there another example of why I should use WunderGraph?
Why don't you email me at Growth@wundergraph.com? You bring up some interesting points and we'd love to chat with you. Let's set up a time to talk so we can show you the features of WunderGraph!
Thank you sir!