Roberts Guļāns

GraphQL with Absinthe

There are a ton of GraphQL tutorials already. This is more about solidifying my own understanding of the topic. If you want to learn it, there are probably better resources to follow.

Gotcha 1: Everything is a field.

Literally, everything is a "field". What do I mean?

Each field can be a predefined scalar value (string, boolean, float, id, or integer) or a custom scalar value.

defmodule MyApp.Schema do
  use Absinthe.Schema

  # All fields in this scope can be publicly accessible
  query do
    # Defining field `name` with type string
    field :name, :string
  end
end

Now we can execute the following query:

query {
  name
}

and expect the following output:

{
  "data": {
    "name": null
  }
}
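
If you want to poke at the schema without wiring up Absinthe.Plug and Phoenix, queries can be run directly against it with Absinthe.run/2, e.g. from IEx:

{:ok, result} = Absinthe.run("query { name }", MyApp.Schema)
# result is %{data: %{"name" => nil}} at this point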

The name comes back as null because we haven't told Absinthe how to resolve this field. For now, let's tell it to return a hardcoded value.

defmodule MyApp.Schema do
  use Absinthe.Schema

  # All fields in the query scope can be publicly accessible
  query do
    # Defining field `name` with type string and resolver for it
    field :name, :string, resolve: fn _, _, _ -> {:ok, "John Doe"} end
  end
end

Now, executing the same query, we get "John Doe" as the value of the name field instead of null:

{
  "data": {
    "name": "John Doe"
  }
}

Note that the resolver callback has to return a value matching the declared type: since name is declared as a :string, the resolver cannot return an object.
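
As an aside on types: besides the predefined scalars, Absinthe also lets you define custom ones with its scalar macro inside the schema module. A minimal sketch of a hypothetical :date scalar (my own illustration, not used anywhere else in this post):

  # Custom scalar that accepts ISO 8601 strings and resolves them to Elixir Date structs
  scalar :date do
    parse fn
      %Absinthe.Blueprint.Input.String{value: value} ->
        case Date.from_iso8601(value) do
          {:ok, date} -> {:ok, date}
          {:error, _} -> :error
        end

      _ ->
        :error
    end

    # On the way back out, turn the Date struct into an ISO 8601 string again
    serialize fn date -> Date.to_iso8601(date) end
  end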

Gotcha 2: Fields can also be complex structures (list, map)

Often a field will contain a more complex value than a scalar, especially if the field is at the root level:

defmodule MyApp.Schema do
  use Absinthe.Schema

  # All fields in the query scope can be publicly accessible
  query do
    # Defining field `user` with type `user`
    field :user, :user
  end

  # Custom defined object `user` that has one field `name` and resolver for it
  object :user do
    field :name, :string, resolve: fn _, _, _ -> {:ok, "John Doe"} end
  end
end

Now we can execute the following query:

query {
  user {
    name
  }
}

and expect the following output:

{
  "data": {
    "user": {
      "name": "John Doe"
    }
  }
}

but actually we get:

{
  "data": {
    "user": {
      "name": null
    }
  }
}

Hmmm... so it seems I cannot resolve deeply nested structures directly; I need to resolve at the root level. At least that's how it looks so far.

defmodule MyApp.Schema do
  use Absinthe.Schema

  # All fields in the query scope can be publicly accessible 
  query do
    # Defining field `user` with type `user` and resolver for it
    field :user, :user, resolve: fn _, _, _ -> {:ok, %{name: "John Doe"}} end
  end

  # Custom defined object `user` that has one field `name`
  object :user do
    field :name, :string
  end
end

Executing the previous query, we now indeed get the expected output:

{
  "data": {
    "user": {
      "name": "John Doe"
    }
  }
}

If the parent level doesn't provide the field, the child field's resolver callback isn't called. So we can get the same effect by combining resolvers from both levels:

defmodule MyApp.Schema do
  use Absinthe.Schema

  # All fields in the query scope can be publicly accessible
  query do
    # Defining field `user` with type `user` and resolver for it
    field :user, :user, resolve: fn _, _, _ -> {:ok, %{name: nil}} end
  end

  # Custom defined object `user` that has one field `name` and resolver for it
  object :user do
    field :name, :string, resolve: fn _, _, _ -> {:ok, "John Doe"} end
  end
end

We can even use the parent's value inside the child resolver and expect the same output:

defmodule MyApp.Schema do
  use Absinthe.Schema

  # All fields in the query scope can be publicly accessible
  query do
    # Defining field `user` with type `user` and resolver for it
    field :user, :user, resolve: fn _, _, _ -> {:ok, %{name: "John"}} end
  end

  # Custom defined object `user` that has one field `name` and resolver for it
  object :user do
    field :name, :string, resolve: fn %{name: name}, _, _ -> {:ok, name <> " Doe"} end
  end
end
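
To spell out what those three resolver arguments actually are: the first is the parent value (whatever the parent field resolved to), the second holds the field's arguments, and the third is the Absinthe.Resolution struct. Here is the same child resolver with the arguments written out (the names are mine, purely for clarity):

  object :user do
    field :name, :string,
      resolve: fn parent, _args, _resolution ->
        # `parent` is the map the parent field's resolver returned,
        # in this example %{name: "John"}
        {:ok, parent.name <> " Doe"}
      end
  end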

Gotcha 2.1: Complex structures can go deeper (and be transformed along the way)

All the tutorials I was looking into were tightly coupled with Ecto, so it somehow slipped my attention how exactly field structures work. Here you can see an even deeper nested structure: each user has a permissions field (a map) with two values, is_admin and role.

defmodule MyApp.Schema do
  use Absinthe.Schema

  # All fields in the query scope can be publicly accessible
  query do
    # Defining field `user` with type `user` and resolver for it
    field :user, :user, resolve: &resolve_user/3
  end

  # Custom defined object `user` that has two fields `name` and `permissions` and resolver for the latter
  object :user do
    field :name, :string
    field :permissions, :permissions, resolve: &resolve_permissions/3
  end

  # Custom defined object `permissions` that has two fields
  object :permissions do
    field :is_admin, :boolean
    field :role, :string
  end

  defp resolve_user(_, _, _) do
    # A DB query could happen here as well, or data could be read from any imaginable data source
    # For now we simulate that a JSON string was retrieved from the data source for the permissions field
    user = %{
      name: "John Doe",
      permissions: Jason.encode!(%{is_admin: true, role: "admin"})
    }

    {:ok, user}
  end

  defp resolve_permissions(%{permissions: permissions}, _, _) do
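    # Jason.decode/2 returns {:ok, map} on success, which is exactly the shape a resolver should return.
    # keys: :atoms! is safe here because :is_admin and :role already exist as atoms (the schema defines them)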
    Jason.decode(permissions, keys: :atoms!)
  end
end
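
With this schema we can query the nested structure (note that Absinthe's default adapter exposes snake_case fields in camelCase, so is_admin becomes isAdmin):

query {
  user {
    name
    permissions {
      isAdmin
      role
    }
  }
}

and, given the hardcoded data above, we would expect:

{
  "data": {
    "user": {
      "name": "John Doe",
      "permissions": {
        "isAdmin": true,
        "role": "admin"
      }
    }
  }
}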

Needless to say, such hierarchies can go as deep as necessary. There are some security concerns regarding query depth, though.

It is possible to retrieve data from one data source and, while resolving its children, transform its values (in this case decoding JSON) or even reach into another data source for extra data. Be careful though, N+1 queries can creep in very fast.
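
One way to keep N+1 in check is Absinthe's built-in batching helper. A minimal sketch inside the schema module, assuming users have an id and a hypothetical MyApp.Accounts.permissions_by_user_ids/2 that loads permissions for many users in a single query (none of this appears in the schema above):

  import Absinthe.Resolution.Helpers, only: [batch: 3]

  object :user do
    field :permissions, :permissions do
      resolve fn user, _, _ ->
        # Absinthe collects all requested user ids and calls the batch function once per batch
        batch({MyApp.Accounts, :permissions_by_user_ids}, user.id, fn results ->
          {:ok, Map.get(results, user.id)}
        end)
      end
    end
  end

  # Hypothetical batch function, somewhere in MyApp.Accounts:
  # def permissions_by_user_ids(_, user_ids), do: ... # one query, returning %{user_id => permissions}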

In conclusion

From the GraphQL perspective, there aren't any special root elements, root objects, or anything like that. Everything is a field, and you have complete control over what each field represents (a string, an integer, some custom scalar value, or an object) and how the data for a specific field is retrieved and processed.

Just as in Erlang/Elixir itself, where a concept as simple as processes with mailboxes has evolved into a remarkably stable environment, GraphQL went down a similar path by choosing to introduce as few new concepts as possible. When the right building blocks are refined, solutions built on top of them come together not only faster but also more stable.

I sort of understand the hype train around GraphQL. We'll see how my journey through mutations, pagination, and other seemingly more complex use cases goes, but that's a story for another day.

P.S. Feedback is welcome
