Thomas Reggi

APIs and Artificial Intelligence

This post is a port; it was originally authored on Jun 25, 2016.

TLDR: Web APIs act as one giant supercomputer, and if more API providers, from Stripe and Shopify to Twilio, utilized the same shared schema, it would be easier down the line for applications built with artificial intelligence to tap into and utilize these endpoints.

Humans are incredibly inconsistent and sloppy when it comes to writing code. While there are articles in Wired about the end of code, for the time being we're still doing the writing. Let's talk about APIs, schemas, and artificial intelligence.

What, fundamentally, is an API? The acronym floats around a lot; it stands for application programming interface, which is a pretty broad term. What do APIs do? What are they for?

The general idea is that an API is a server that lets computers send information to it and get information back. In principle it's a system that can do anything, though most APIs are abstractions over the CRUD operations you'd perform on a database. An API can contain one or many endpoints (URLs), and these endpoints can be protected with any number of authentication methods.

APIs generally return data in a format a computer can parse, such as JSON, XML, or CSV. This lets other computers talk to the API and perform tasks they wouldn't otherwise have access to.
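As a rough illustration, here's what consuming such an endpoint typically looks like: make a request, check the status, parse the JSON. Everything here is hypothetical; the URL and the `Order` shape are invented for the sketch.

```typescript
// Hypothetical sketch: request a resource from an API and parse the
// JSON response. The URL and Order shape are assumptions, not a real
// service.
interface Order {
  id: string;
  total: number;
}

async function fetchOrder(id: string): Promise<Order> {
  const res = await fetch(`https://api.example.com/orders/${id}`);
  if (!res.ok) {
    throw new Error(`Request failed with status ${res.status}`);
  }
  return (await res.json()) as Order;
}
```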

APIs are everywhere. They bucket into two camps: APIs that are for the public, and internal APIs that are specific to the API creator. Twilio is a company that provides endpoints that allow you to send text messages as well as make and receive phone calls. If I need to build an app with this functionality, I can make an authorized request to their server and build off the expertise they've developed. Right now I'm using the Medium Android app, which most likely uses some sort of private API under the hood.
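An authorized Twilio request might look roughly like the sketch below. The credentials are placeholders, and the exact endpoint and parameters should be checked against Twilio's own documentation; the point is only the pattern: basic auth plus a form-encoded POST.

```typescript
// Sketch of an authorized request to Twilio's SMS endpoint (Node.js).
// The SID and token are placeholders; verify the current API details
// against Twilio's documentation before relying on this.
const accountSid = "ACxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"; // placeholder
const authToken = "your_auth_token"; // placeholder

async function sendSms(to: string, from: string, body: string) {
  const url = `https://api.twilio.com/2010-04-01/Accounts/${accountSid}/Messages.json`;
  const auth = Buffer.from(`${accountSid}:${authToken}`).toString("base64");
  const res = await fetch(url, {
    method: "POST",
    headers: { Authorization: `Basic ${auth}` },
    // URLSearchParams makes fetch send application/x-www-form-urlencoded
    body: new URLSearchParams({ To: to, From: from, Body: body }),
  });
  return res.json();
}
```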

Now that you know a little about APIs and why and how they're used, step back and you can see a bigger picture. The Internet, with all of these APIs, acts like one global computer whose parts communicate over the network to accomplish tasks. What does this mean for artificial intelligence? AI is hard, and the only reasonable way to create an intelligent system is to outsource each of the smaller bits to different services. It's about breaking everything into individual pieces and then tackling them one at a time.

Here's the problem: APIs don't all speak a universal language. Each one is implemented in a unique, custom way. They differ in technologies, programming languages, authentication details, request validation, error message handling, response status codes, error codes, internationalization, and endpoint naming.

Enter the humble schema. Two things can change this chaos for the better: one is GraphQL and the other is Swagger.

Swagger allows you to build a document that contains all of the implementation details about an API, and I mean ALL. First and foremost, Swagger is a specification: a set of rules that dictate how things should behave. The benefits are manifold. Not only does it let you describe an API in detail; you can use Swagger tools to build documentation, you can run your server on top of a Swagger file and base the actual routing and validation on the file itself, and you can generate client libraries for accessing your API in different languages, say JavaScript, Ruby, or Python.
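To make that concrete, here's a minimal sketch of a Swagger 2.0 description covering a single endpoint. Real specs usually live in a YAML or JSON file; it's expressed here as a TypeScript object to stay consistent with the other examples, and the Orders API it describes is hypothetical.

```typescript
// A minimal Swagger 2.0 description of one endpoint, written as a
// TypeScript object (real specs are usually YAML or JSON files).
// The Orders API it describes is hypothetical.
const swaggerSpec = {
  swagger: "2.0",
  info: { title: "Orders API", version: "1.0.0" },
  paths: {
    "/orders/{id}": {
      get: {
        parameters: [
          { name: "id", in: "path", required: true, type: "string" },
        ],
        responses: {
          "200": {
            description: "A single order",
            schema: {
              type: "object",
              properties: {
                id: { type: "string" },
                total: { type: "number" },
              },
            },
          },
        },
      },
    },
  },
};
```

From a document like this, tooling can render docs, validate incoming requests, and generate client libraries, without any extra hand-written description.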

GraphQL is an emerging technology, open sourced by Facebook, that shifts the way APIs are written and used entirely. Intrinsic to its design is a schema and type system that builds the rules as you design the API. Unlike Swagger, where you have to detail endpoints, methods, objects, etc., GraphQL uses only one endpoint and primarily the POST method. With a classic REST API, you need to design individual endpoints and associate them with HTTP methods, and then, after the API is designed, write a whole bunch of documentation by hand.
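Here's what that single-endpoint, POST-driven model looks like from the client side. The `/graphql` path is a common convention rather than a requirement, and the `order` field assumes a hypothetical schema.

```typescript
// Sketch of a GraphQL request: one endpoint, a POST body that names
// exactly the fields the client wants. The endpoint URL and the order
// field are hypothetical.
async function getOrder(id: string) {
  const res = await fetch("https://api.example.com/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query: "query ($id: ID!) { order(id: $id) { id total } }",
      variables: { id },
    }),
  });
  const { data, errors } = await res.json();
  if (errors) throw new Error(errors[0].message);
  return data.order;
}
```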

Let's put this all together. APIs allow computers to talk to each other and execute operations remotely, and when an API has a proper, universal schema or spec, people and computers can understand how the API works without having to be explicitly told how things operate. This is a necessary step toward more automated and autonomous use of these APIs.

An apparent issue is that while schemas benefit the API creator, they have pros and cons from a capitalistic perspective. When different systems become compatible and documented, it becomes easier to migrate from one to another. Take two similar services, Shopify and Big Cartel: both have APIs that provide access to an Order object, but each is structured differently, using different types and keys for the same object. However, in a world where the schema is itself a versioned object, it's just a matter of mapping one schema to the other, which makes migration incredibly simple, as the sketch below illustrates.
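Suppose both shapes were published as versioned schemas. The field names below are illustrative assumptions, not the real Shopify or Big Cartel Order structures, but the point stands: migration collapses into a mapping function.

```typescript
// Hypothetical Order shapes for two services. These field names are
// assumptions for illustration, not the services' real schemas.
interface ShopifyOrder {
  id: number;
  total_price: string; // e.g. "19.99"
  created_at: string;
}

interface BigCartelOrder {
  id: string;
  total: number;
  created: string;
}

// With both schemas known and versioned, migrating data is just a
// deterministic mapping from one shape to the other.
function toBigCartelOrder(order: ShopifyOrder): BigCartelOrder {
  return {
    id: String(order.id),
    total: parseFloat(order.total_price),
    created: order.created_at,
  };
}
```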

The generation of schemas for APIs is happening whether we try to stop it or not. It's a common-sense way of managing and standardizing interoperable systems. The main question we need to ask is: who or what is this work going to benefit most? Us humans, or the systems that will use them?
