Building modern applications often feels like trying to speak multiple languages at once. One part of your system needs the straightforward, universal nature of REST. Another demands the precise, efficient queries of GraphQL. Internally, your services might scream for the raw speed of gRPC. For a long time, my team and I would build these as separate silos, leading to the same logic written three times over. It was messy, slow, and a nightmare to keep consistent. Then, we started weaving them together into a single, cohesive layer—a hybrid API.
This isn't about choosing a winner. It's about letting each tool do what it does best, all while sharing the same brain. Think of it like a well-organized workshop: you have a specific tool for cutting, another for joining, and another for finishing, but they all work on the same project plan. In Java, we can build this workshop. Here’s how we did it, broken down into five practical techniques.
The first step is building a single front door. You don't want your clients—be they a mobile app, a web frontend, or another service—to worry about where to knock. They should come to one place. We use an API Gateway to act as this concierge, inspecting each incoming request and guiding it to the right handler.
A tool like Spring Cloud Gateway is perfect for this. You configure routes that look at the request path or headers and send each request to the appropriate backend service. A request to /api/rest/orders goes to your REST controllers. A POST to /api/graphql with a query in the body is routed to your GraphQL endpoint. Even gRPC traffic, often handled over HTTP/2, can be proxied through this gateway. This gives you one place to manage security, logging, and rate-limiting for everything.
Here’s a simplified look at how you might set up those routes in Java. This code lives in your gateway service and acts as the traffic director.
@Bean
public RouteLocator apiRoutes(RouteLocatorBuilder builder) {
    return builder.routes()
        .route("rest-route", r -> r
            .path("/api/rest/**")
            .uri("lb://core-service"))
        .route("graphql-route", r -> r
            .path("/api/graphql")
            .uri("lb://core-service"))
        .route("grpc-route", r -> r
            .path("/grpc/**")
            .filters(f -> f.setRequestHeader("Content-Type", "application/grpc"))
            .uri("lb://grpc-service"))
        .build();
}
From the outside, clients see one hostname. Inside, the gateway quietly ensures each protocol's request finds its way home. It’s the foundational piece that makes the rest of this architecture possible.
Now, let's talk about the core of the matter: your business logic. The biggest mistake is writing your order processing code once for a REST controller, again for a GraphQL resolver, and a third time for a gRPC service. Duplication is the enemy. The solution is protocol translation.
The idea is simple. Your business rules live in plain Java service classes, completely unaware of HTTP, GraphQL, or protocol buffers. These services work with your own internal domain objects. Then, you build thin "adapter" layers around them that speak the specific protocol.
Imagine an OrderService that knows how to create an order. This service doesn't care where the request came from.
@Service
public class OrderService {
    private final OrderRepository orderRepository;

    public OrderService(OrderRepository orderRepository) {
        this.orderRepository = orderRepository;
    }

    public Order createOrder(CreateOrderCommand command) {
        // This is the only place your creation logic exists.
        // Validate items, check inventory, calculate tax, etc.
        Order newOrder = new Order(command);
        return orderRepository.save(newOrder);
    }
}
Now, you expose this single service through three different doors. The REST controller takes a JSON payload, converts it to a CreateOrderCommand, calls the service, and converts the resulting Order back to a JSON response.
@RestController
public class OrderRestController {
    private final OrderService orderService;

    public OrderRestController(OrderService orderService) {
        this.orderService = orderService;
    }

    @PostMapping("/api/rest/orders")
    public ResponseEntity<OrderResponse> create(@RequestBody RestOrderRequest request) {
        CreateOrderCommand command = convertToCommand(request);
        Order order = orderService.createOrder(command);
        return ResponseEntity.ok(convertToResponse(order));
    }
}
The GraphQL resolver does something similar, but it fits into the GraphQL execution model. It might fetch the order and then have separate methods to resolve related fields like the customer.
@Controller
public class OrderGraphQLResolver {
    private final OrderService orderService;
    private final CustomerService customerService;

    public OrderGraphQLResolver(OrderService orderService, CustomerService customerService) {
        this.orderService = orderService;
        this.customerService = customerService;
    }

    @QueryMapping
    public Order order(@Argument String id) {
        return orderService.findOrder(id);
    }

    @SchemaMapping(typeName = "Order", field = "customer")
    public Customer customer(Order order) {
        // Fetch customer details on demand, as the GraphQL query requests them.
        return customerService.findById(order.getCustomerId());
    }
}
Finally, the gRPC service implements the auto-generated gRPC interface. It takes the protocol buffer request, translates it to the same CreateOrderCommand, and sends the result back through the response observer.
@GrpcService
public class OrderGrpcServiceImpl extends OrderServiceGrpc.OrderServiceImplBase {
    private final OrderService orderService;

    public OrderGrpcServiceImpl(OrderService orderService) {
        this.orderService = orderService;
    }

    @Override
    public void createOrder(OrderProto.CreateRequest request,
                            StreamObserver<OrderProto.Order> responseObserver) {
        CreateOrderCommand command = fromProtoRequest(request);
        Order order = orderService.createOrder(command);
        responseObserver.onNext(toProtoResponse(order));
        responseObserver.onCompleted();
    }
}
The key is the shared OrderService. Whether the request arrives as JSON, a GraphQL query, or a binary gRPC call, it all funnels through the same business logic. Change a rule in the OrderService, and it's changed for every API. This is the heart of a maintainable hybrid system.
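To make that shared funnel concrete, here is a minimal sketch of the command object all three adapters translate into. The field names are illustrative, not taken from any framework:

```java
import java.util.List;

// Protocol-agnostic command object shared by every adapter.
// Field names are illustrative.
public class CreateOrderCommand {
    private final String customerId;
    private final List<String> itemSkus;

    public CreateOrderCommand(String customerId, List<String> itemSkus) {
        this.customerId = customerId;
        this.itemSkus = List.copyOf(itemSkus); // defensive copy keeps the command immutable
    }

    public String getCustomerId() { return customerId; }
    public List<String> getItemSkus() { return itemSkus; }
}
```

Because the command is immutable and framework-free, each adapter can build it from JSON, GraphQL arguments, or a protobuf message without pulling protocol types into the core.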
When you have multiple ways to describe the same thing, inconsistency creeps in. Your REST API might call a field customerName, your GraphQL schema might use customer_name, and your gRPC message might have customer_full_name. This causes confusion and bugs. To prevent this, we use a schema-first approach.
Instead of letting the code define the API, you define the API contract first using a neutral language. For the shape of your data, you can use tools like OpenAPI (for REST) and Protobuf IDL (for gRPC). The magic happens when you use these definitions to generate your Java code.
For example, you can write a .proto file for your core messages.
// order.proto
syntax = "proto3";

message Order {
  string id = 1;
  string customer_id = 2;
  repeated OrderItem items = 3;
  double total_amount = 4;
  OrderStatus status = 5;
}
Then, in your build process (using Gradle or Maven plugins), you generate the Java classes from this file. These generated classes become the shared model used by your gRPC service and can be used internally. You can do a similar trick with OpenAPI Generator to create REST model classes from an OpenAPI specification.
The goal is to have one source of truth for what an Order looks like. Your build script orchestrates this generation.
// In your build.gradle
plugins {
    id 'com.google.protobuf' version '0.9.4'
    id 'org.openapi.generator' version '6.6.0'
}

protobuf {
    protoc {
        artifact = "com.google.protobuf:protoc:3.21.12"
    }
    generateProtoTasks {
        all().each { task ->
            task.plugins {
                grpc { artifact = "io.grpc:protoc-gen-grpc-java:1.54.0" }
            }
        }
    }
}

openApiGenerate {
    generatorName = "java"
    inputSpec = "$rootDir/specs/openapi.yaml"
    outputDir = "$buildDir/generated/openapi"
}
Now, your OrderService can work with the Order class generated from your Protobuf definition, ensuring everyone agrees on the data structure from the start.
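A thin mapper keeps the generated class at the system boundary rather than letting it leak into the core. The sketch below uses a hand-written stand-in for the protoc-generated class (real generated classes are built with newBuilder() and expose getters in the same spirit); all names here are illustrative:

```java
// Hand-written stand-in for the class protoc would generate from order.proto.
class ProtoOrder {
    final String id;
    final String customerId;
    final double totalAmount;

    ProtoOrder(String id, String customerId, double totalAmount) {
        this.id = id;
        this.customerId = customerId;
        this.totalAmount = totalAmount;
    }
}

// Internal domain object used by the core service.
class DomainOrder {
    final String id;
    final String customerId;
    final double totalAmount;

    DomainOrder(String id, String customerId, double totalAmount) {
        this.id = id;
        this.customerId = customerId;
        this.totalAmount = totalAmount;
    }
}

// The only place where the two representations meet.
final class OrderMapper {
    static DomainOrder toDomain(ProtoOrder proto) {
        // Both sides derive from order.proto, so the fields line up one-to-one.
        return new DomainOrder(proto.id, proto.customerId, proto.totalAmount);
    }
}
```

Keeping the mapping in one class means a schema change breaks the build in exactly one place instead of three.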
How do you know all these different doors lead to the same room? You test it. Thoroughly. Testing a hybrid API requires a two-pronged approach: testing the shared core logic in isolation, and testing that each protocol adapter behaves correctly.
First, you test your OrderService directly. These are plain unit tests, ensuring the business rules work.
@Test
void createOrder_ValidCommand_ReturnsOrder() {
    CreateOrderCommand command = new CreateOrderCommand("cust-123", List.of(item1, item2));
    Order result = orderService.createOrder(command);
    assertNotNull(result.getId());
    assertEquals(OrderStatus.PENDING, result.getStatus());
}
But the crucial part is integration testing. You need to verify that hitting the REST endpoint, sending a GraphQL mutation, and calling the gRPC method all produce the same outcome. We write tests that use real clients for each protocol.
@SpringBootTest
class OrderCreationIntegrationTest {
    @Autowired TestRestTemplate restTemplate;
    @Autowired GraphQLTestTemplate graphQLTestTemplate;
    @Autowired GrpcClient grpcClient;

    @Test
    void allProtocolsCreateConsistentOrder() {
        // 1. Test via REST
        RestOrderRequest restRequest = new RestOrderRequest(...);
        ResponseEntity<OrderResponse> restResponse =
            restTemplate.postForEntity("/api/rest/orders", restRequest, OrderResponse.class);
        String createdOrderId = restResponse.getBody().getId();

        // 2. Test via GraphQL
        String graphqlMutation = """
            mutation { createOrder(input: {customerId: "cust-123"}) { id } }
            """;
        GraphQLResponse graphQLResponse = graphQLTestTemplate.post(graphqlMutation);
        String graphqlOrderId = graphQLResponse.get("$.data.createOrder.id");

        // 3. Test via gRPC (using a blocking stub for simplicity)
        OrderProto.CreateRequest grpcRequest = OrderProto.CreateRequest.newBuilder()...build();
        OrderProto.Order grpcResponse = grpcClient.createOrder(grpcRequest);
        String grpcOrderId = grpcResponse.getId();

        // Now, fetch the same order through REST and compare key fields
        OrderResponse fetchedOrder =
            restTemplate.getForObject("/api/rest/orders/" + createdOrderId, OrderResponse.class);

        // Assert that the IDs from all three calls match the fetched data's ID.
        // This proves all protocols interacted with the same core service.
        assertEquals(fetchedOrder.getId(), graphqlOrderId);
        assertEquals(fetchedOrder.getId(), grpcOrderId);
    }
}
These tests give us confidence. They prove that our translation layers are correct and that a client will get a functionally equivalent result regardless of the protocol they choose.
When something goes wrong in a system with multiple moving parts, you need clear visibility. Is the gRPC service slowing down? Are GraphQL queries failing? You need unified monitoring that cuts across protocol boundaries.
We achieve this by instrumenting our code to emit metrics and traces with consistent tags. Every request, no matter its origin, should be recorded with a protocol label (rest, graphql, or grpc).
Using Micrometer, which integrates seamlessly with Spring Boot, we can time every operation.
@Component
public class ApiMetrics {
    private final MeterRegistry meterRegistry;

    public ApiMetrics(MeterRegistry meterRegistry) {
        this.meterRegistry = meterRegistry;
    }

    public <T> T measure(String protocol, String operation, Supplier<T> task) {
        Timer.Sample sample = Timer.start(meterRegistry);
        try {
            return task.get();
        } finally {
            sample.stop(Timer.builder("api.request.duration")
                .tag("protocol", protocol)
                .tag("operation", operation)
                .register(meterRegistry));
        }
    }
}

// Used in a REST controller
@PostMapping
public ResponseEntity<?> createOrder(@RequestBody OrderRequest req) {
    return apiMetrics.measure("rest", "createOrder", () -> {
        // ... handle request
        return ResponseEntity.ok(...);
    });
}
For tracing, we use distributed tracing with Micrometer Tracing (the successor to Spring Cloud Sleuth) or OpenTelemetry. The gateway should initiate a trace, and that trace ID should be passed through to all internal services, whether they handle REST, GraphQL, or gRPC. This lets you see a single user's journey as it hops from a GraphQL query to a gRPC internal call and back.
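The tracing library handles the mechanics, but the core idea, one ID issued at the edge and carried on every hop, can be sketched in plain Java. The class and header names below are illustrative, not part of Sleuth or OpenTelemetry:

```java
import java.util.Map;
import java.util.UUID;

// Minimal sketch of trace-ID propagation; real systems delegate this
// to OpenTelemetry or Micrometer Tracing.
public final class TraceContext {
    public static final String HEADER = "X-Trace-Id"; // illustrative header name

    private static final ThreadLocal<String> CURRENT = new ThreadLocal<>();

    // Gateway side: continue an incoming trace or start a new one.
    public static String startOrContinue(Map<String, String> incomingHeaders) {
        String id = incomingHeaders.getOrDefault(HEADER, UUID.randomUUID().toString());
        CURRENT.set(id);
        return id;
    }

    // Downstream call side: attach the current ID to outgoing headers,
    // regardless of whether the next hop speaks REST, GraphQL, or gRPC.
    public static void inject(Map<String, String> outgoingHeaders) {
        String id = CURRENT.get();
        if (id != null) {
            outgoingHeaders.put(HEADER, id);
        }
    }

    public static String current() {
        return CURRENT.get();
    }
}
```

In a real setup the same propagation happens through HTTP headers for REST and GraphQL and through gRPC metadata for gRPC calls; the library's instrumentation does the injecting and extracting for you.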
Finally, a unified health check is vital. Your operations team needs one endpoint to tell if the entire API layer is healthy.
@Component
public class HybridApiHealthIndicator implements HealthIndicator {
    @Override
    public Health health() {
        Health.Builder builder = Health.up();
        // Check REST connectivity
        builder.withDetail("rest_endpoint", checkRest());
        // Check GraphQL schema loading
        builder.withDetail("graphql_schema", checkGraphQL());
        // Check gRPC server status
        builder.withDetail("grpc_server", checkGrpc());
        return builder.build();
    }
}
This health endpoint can be exposed via a standard REST call (like /actuator/health), giving a snapshot of the entire hybrid system's status.
Pulling all this together requires thoughtful design, but the payoff is immense. You get to offer the right API for each client—flexible queries for your complex web dashboard, simple REST for third-party integrations, and fast gRPC for internal microservices—without multiplying your development and maintenance work.
The code becomes cleaner because the business logic is centralized. The system is more robust because you can test for consistency. And you have the observability to understand how each part is performing. It turns a potential tangle of conflicting APIs into a coordinated, efficient system. In my experience, moving to this hybrid model wasn't just a technical upgrade; it was a fundamental shift that let our team build faster and our applications serve more diverse needs reliably.