Mark Heckler

Cosmos DB for Spring Developers, Part II: Using Cosmos DB as a MongoDB Database

In the first installment of this series, I discussed Spring Boot, Spring Data, and several of the underlying datastores supported by Spring Data. I also introduced Azure Cosmos DB and offered a glimpse of the various APIs it supports. I then showed how to create a Spring Boot application that uses Cosmos DB as a SQL database.

If you're using a SQL datastore, I trust that was a really useful introduction to how to take your existing Spring Boot relational database application and make it truly planetary scale. But what if you prefer a non-relational (NoSQL) document database? Short of rewriting your apps, are you constrained by what that particular vendor chooses to offer? Absolutely not!

In this installment, I'll show you how to use Cosmos DB as a MongoDB database.

Azure Cosmos DB for MongoDB

As mentioned in Part I, Azure Cosmos DB is a planetary-scale datastore that offers various ways and APIs to manage your critical data and provide access to it at breathtaking speed. Harnessing the power of Cosmos DB as a SQL datastore was a fairly straightforward affair, but here's a pleasant surprise: replacing a MongoDB database in your Spring Boot application is even easier.

Aside from the rather cumbersome (yet delightfully descriptive) name, Azure Cosmos DB for MongoDB is a simple, yet powerful, drop-in replacement for MongoDB in your Spring Boot apps. To illustrate this, I'll create another project very similar to the one I created in Part I, but using MongoDB as the initial choice of database.

Create a Spring Boot app

There are many ways to begin creating a Spring Boot application, but I prefer to begin with the Spring Initializr. Choose your build system (I opt for Maven most days, but your -- and my -- mileage may vary), choose a meaningful artifact name, change the group name if you so choose, then select the dependencies as shown in the following screen capture.

(Screen capture: Spring Initializr with the Reactive Web, Spring Data Reactive MongoDB, and Lombok dependencies selected)

Create and open the project

NOTE: All code is in a GitHub repository linked at the bottom of this article.

I use only three dependencies for this example:

  • Reactive Web - includes all dependencies needed to create reactive web apps with Spring WebFlux, using non-blocking, Reactive Streams-based APIs
  • Spring Data Reactive MongoDB - enables your app to access MongoDB using the Reactor (reactive streams) API
  • Lombok - boilerplate-reducing library, useful for domain classes, logging, and more

Next, click the "Generate" button to generate the project structure and download the compressed project files (.zip). Once downloaded, go to the directory where the file was saved, decompress it, and open the project in the Integrated Development Environment (IDE) or text editor of your choice. I use IntelliJ IDEA and Visual Studio Code (VSCode) for nearly all my dev work, and for this article I'll open the project in IntelliJ by navigating (using the Mac Finder) into the expanded project directory and double-clicking on the Maven build file, pom.xml.

Once the project is loaded, you can verify that the dependencies chosen from the Spring Initializr are present within the project by opening the pom.xml file. There will be additional ones brought in for testing, etc., but these represent the three we selected before generating the project structure:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-mongodb-reactive</artifactId>
</dependency>
<dependency>
    <groupId>org.projectlombok</groupId>
    <artifactId>lombok</artifactId>
    <optional>true</optional>
</dependency>

Code the application

This is a fairly simple (and a bit contrived) example, but I have expanded it a bit from the previous article in this series. Check back for the next installment to see where things go from here.

The domain

First, I code a domain class. In this example, I create a User class with an id, firstName, lastName, and address member variables. This class and its properties are annotated thusly:

  • @Document - indicates that each instance of this class can be treated as a document by the underlying datastore, allowing for CRUD (Create, Read, Update, and Delete) operations
  • @Data - Lombok annotation, instructs Lombok to consider this a "data class" and generate accessors (getters) and mutators (setters) for each member variable, along with equals(), hashCode(), and toString() methods
  • @NoArgsConstructor - Lombok annotation that instructs Lombok to generate a zero-argument constructor
  • @RequiredArgsConstructor - Lombok annotation that instructs Lombok to generate a constructor with a parameter for each "required" member variable, as designated by the @NonNull member variable annotation
  • @Id - indicates which member variable corresponds to the document's unique identifier (the _id field) in the underlying datastore
  • @NonNull - addressed under @RequiredArgsConstructor
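
Putting those pieces together, here is a minimal sketch of what the User class might look like, based on the description above (the actual class in the repository may differ slightly in details):

import lombok.Data;
import lombok.NoArgsConstructor;
import lombok.NonNull;
import lombok.RequiredArgsConstructor;
import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;

@Document
@Data
@NoArgsConstructor
@RequiredArgsConstructor
class User {
    @Id
    private String id;              // document identifier, assigned by MongoDB if not supplied
    @NonNull
    private String firstName;       // required: becomes a constructor parameter
    @NonNull
    private String lastName;        // required: becomes a constructor parameter
    @NonNull
    private String address;         // required: becomes a constructor parameter
}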

The repository

Spring Boot's autoconfiguration extends the power of Spring Data. By placing a database driver on the classpath (including it as a dependency in your build file ensures it is packaged with the deployable application) and extending a Spring Data-derived interface in your application code, you let Spring Boot's autoconfiguration create the beans necessary to provide a proxy to the desired underlying datastore. In our case, this is all we need to provide foundational reactive database capabilities:

interface UserRepository extends ReactiveCrudRepository<User, String> {}

In this single line of code, we are defining an interface called UserRepository that will inherit and potentially extend the capabilities of the ReactiveCrudRepository, storing objects of type User with identifiers (IDs) of type String. Spring Boot's autoconfiguration does the rest.

NOTE: We can do more, of course, such as defining custom query methods. But for this example, the provided functionality is sufficient.
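
For instance, a derived query method (purely hypothetical here, not part of this sample project) might look like the following; Spring Data builds the MongoDB query from the method name:

import org.springframework.data.repository.reactive.ReactiveCrudRepository;
import reactor.core.publisher.Flux;

interface UserRepository extends ReactiveCrudRepository<User, String> {
    // Hypothetical derived query: Spring Data generates the query from the method name
    Flux<User> findByLastName(String lastName);
}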

The API

Next, I define the Application Programming Interface (API) that provides the means and structure for external applications to interact with this service. Since this is our second app in this series, I expanded on the previous API a bit by adding a second GET endpoint and method to get the first user document stored, along with a POST endpoint and method, allowing for the creation and storage of a new user.

@RestController
@AllArgsConstructor
class CosmosMongoController {
    private final UserRepository repo;

    @GetMapping("/")
    Flux<User> getAllUsers() {
        return repo.findAll();
    }

    @GetMapping("/oneuser")
    Mono<User> getFirstUser() {
        return repo.findAll().next();
    }

    @PostMapping("/newuser")
    Mono<User> addUser(@RequestBody User user) {
        return repo.save(user);
    }
}

To revisit some fundamentals, the @RestController annotation is provided by the Spring Framework and combines the functionality of @Controller, to respond to requests, and @ResponseBody, to make the resultant object(s) the response body itself, rather than just providing access to the object(s) via a model variable, as is the typical MVC methodology.

The @AllArgsConstructor annotation instructs Lombok to create a constructor for the CosmosMongoController class with a parameter (and thus required argument) for every member variable. Since the only member variable is a UserRepository, Lombok generates a ctor with that single parameter.

NOTE: @AllArgsConstructor has the fortunate (or perilous) capability to update your constructor automatically if you simply add/remove member variables, so remember: With great power comes great responsibility. If you don't want this behavior, use @RequiredArgsConstructor instead, annotating with @NonNull each member variable you wish to require and thus have represented as a parameter in the constructor (see the sketch below).
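
A minimal sketch of that alternative for this controller follows. Worth noting: Lombok also treats final member variables as "required", so the final repo field becomes a constructor parameter even without @NonNull:

@RestController
@RequiredArgsConstructor
class CosmosMongoController {
    // final (and @NonNull) member variables become constructor parameters
    private final UserRepository repo;

    // ...endpoint methods unchanged from the listing above...
}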

The data

As in Part I, I create a Spring bean using the @Component annotation to populate some sample data. The @Component annotation instructs Spring Boot, upon application initialization, to create an instance of the annotated class and place that object (bean) into the application context, i.e. its Dependency Injection (DI) container. All beans in the DI container are managed by the Spring Boot application in terms of lifecycle and, of course, injection into other code as dependencies.

Once the bean is constructed, the loadData() method is executed automatically due to the @PostConstruct annotation. In this application, the loadData() method deletes all lingering data in the underlying datastore (from previous app executions), populates it with two sample User records, returns all User records now stored in the database, and logs them to the console for verification.

NOTE: The Reactive Streams API and the implementation of it provided by Spring WebFlux/Project Reactor is beyond the scope of this particular article. Please consult the 'Web on Reactive Stack' section of the Spring documentation, any of several sessions I've delivered via my YouTube channel, or the Reactive Streams and Project Reactor sites.

@Slf4j
@Component
@AllArgsConstructor
class DataLoader {
    private final UserRepository repo;

    @PostConstruct
    void loadData() {
        repo.deleteAll()    // clear out any data lingering from previous runs
                .thenMany(Flux.just(new User("Alpha", "Bravo", "123 N 45th St"),
                        new User("Charlie", "Delta", "1313 Mockingbird Lane")))
                .flatMap(repo::save)            // save both sample users
                .thenMany(repo.findAll())       // then retrieve everything now stored
                .subscribe(user -> log.info(user.toString()));  // log each user for verification
    }
}

Demo, take 1

NOTE: This demo assumes you have a MongoDB instance running locally on your machine. If you do not, please install and run MongoDB locally or, better yet, use the MongoDB Docker script provided at the end of this article to spin up a MongoDB instance in a Docker container.

NOTE: If you intend to use Docker to run a local MongoDB instance in a container, you must first install Docker on your machine. Please consult the Docker installation documentation for your operating system.
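
If you just want a quick, disposable local instance and Docker is already installed, a generic command along these lines will do the trick (the script in the repository may differ):

docker run --name dev-mongo -d -p 27017:27017 mongo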

Since we're running MongoDB locally, we do not need to set any properties for our Spring Boot app. Spring Data MongoDB will automatically connect to the MongoDB instance exposed via the default port of 27017 on localhost.

To verify the application is working properly and is able to access the local MongoDB instance, start the app (for example, with mvn spring-boot:run from the project directory), then execute the following command from a terminal window:

curl http://localhost:8080

Alternatively, you can access http://localhost:8080 from a browser tab or window.

NOTE: I use HTTPie instead of curl and greatly prefer it. Commands seem more logical and output more readable. HTTPie leverages default values for many properties, such as the hostname: if not provided, it is assumed to be localhost, a very reasonable assumption. In all of my examples, the http command you see is HTTPie.

The following should be displayed:

HTTP/1.1 200 OK
Content-Type: application/json
transfer-encoding: chunked

[
    {
        "address": "123 N 45th St",
        "firstName": "Alpha",
        "id": "639900044b494a1a26d1c1fd",
        "lastName": "Bravo"
    },
    {
        "address": "1313 Mockingbird Lane",
        "firstName": "Charlie",
        "id": "639900044b494a1a26d1c1fe",
        "lastName": "Delta"
    }
]

To exercise the second endpoint and return the first user in the database, execute the following command:

curl http://localhost:8080/oneuser

You should see something like this:

HTTP/1.1 200 OK
Content-Length: 98
Content-Type: application/json

{
    "address": "123 N 45th St",
    "firstName": "Alpha",
    "id": "639900044b494a1a26d1c1fd",
    "lastName": "Bravo"
}

To exercise the third endpoint and add a new user to the database, execute the following command:

http POST :8080/newuser firstName=Echo lastName=Foxtrot address="21 Chester Place"
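
If you prefer to stick with curl, the equivalent request looks roughly like this:

curl -X POST -H "Content-Type: application/json" -d '{"firstName": "Echo", "lastName": "Foxtrot", "address": "21 Chester Place"}' http://localhost:8080/newuser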

You should see something like this:

HTTP/1.1 200 OK
Content-Length: 102
Content-Type: application/json

{
    "address": "21 Chester Place",
    "firstName": "Echo",
    "id": "6399029c4b494a1a26d1c1ff",
    "lastName": "Foxtrot"
}

To double-check that all records are present in the database, you can once again access the first, summary endpoint via http :8080 or curl http://localhost:8080.

"Migrating" our app to Azure Cosmos DB for MongoDB

It's almost embarrassing how easy it is to migrate our app to Azure Cosmos DB for MongoDB. You'll see what I mean shortly, in the following sections.

Initialization and configuration

I use two scripts (repository link at the bottom of this article) to prepare the Azure environment generally and Cosmos DB specifically for this project: one to initialize the environment variables to be used, and one to create the Azure Cosmos DB for MongoDB target based upon those variables and to expose the two variables Spring Data MongoDB expects when connecting to a database at coordinates other than the defaults of localhost and port 27017.

The script CosmongoInitEnv.sh should be sourced by executing it in this manner from a terminal window:

source ./CosmongoInitEnv.sh

This assumes that the script resides in the current (.) directory; if not, you must specify the full path to the shell script. Sourcing the script sets the environment variables it contains in your current shell, so they remain part of the environment after the script finishes; executing it normally would spawn a separate process whose changes disappear upon completion.

To verify environment variables are set as expected, I like to check the environment using a command similar to the following:

env | sort

or

env | grep COSMOSDB_MON

To create the Azure resources we will need for this project, and to expose the requisite variables for our app to locate our new cloud-based database, we then source the CosmongoConfig.sh script as follows:

source ./CosmongoConfig.sh

Two things of note. First, this again assumes you are executing the script from the current (.) directory. Second, since we once again set environment variable values that we plan to use after the script finishes execution, we must source this script as well. Failure to do so isn't catastrophic, but it will result in empty variables and will require manually executing the final three lines (the exports) from CosmongoConfig.sh in the terminal in order to proceed.

You can verify the required environment variables are now set using this or a similar command:

env | grep SPRING_DATA

These env vars are essential to the app we're about to run:

  • SPRING_DATA_MONGODB_URI
  • SPRING_DATA_MONGODB_DATABASE

If both variables' values are present, we can proceed to the next step and execute our application to begin using Azure Cosmos DB for MongoDB.
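
For reference, the URI value is a standard MongoDB connection string pointing at your Cosmos DB account; it generally follows a shape like the one below (placeholder values shown; your actual string is produced by the script, and the exact query parameters may vary by account):

mongodb://<account-name>:<account-key>@<account-name>.mongo.cosmos.azure.com:10255/?ssl=true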

Changing the application

Just kidding! The app is already ready to go.

Yes, you read that correctly. Spring Data will automatically seek and incorporate the provided environment variables to connect to a database at coordinates other than the defaults. That's all that's required to migrate our app to Azure Cosmos DB for MongoDB!
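
Under the covers, this is Spring Boot's relaxed binding at work: SPRING_DATA_MONGODB_URI and SPRING_DATA_MONGODB_DATABASE map to the spring.data.mongodb.uri and spring.data.mongodb.database properties. If you preferred a file-based approach, the equivalent application.properties entries would simply reference those same variables:

spring.data.mongodb.uri=${SPRING_DATA_MONGODB_URI}
spring.data.mongodb.database=${SPRING_DATA_MONGODB_DATABASE}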

Demo, take 2

From the project directory, execute the following command to build and run the project:

mvn spring-boot:run
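
If you don't have Maven installed locally, the Maven wrapper generated by the Spring Initializr works just as well:

./mvnw spring-boot:run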

Exercising the endpoints produces results nearly identical to the ones before, except that the data is now stored in Azure Cosmos DB for MongoDB (and of course, the database-assigned IDs are different).

Retrieve all current users:

» http :8080
HTTP/1.1 200 OK
Content-Type: application/json
transfer-encoding: chunked

[
    {
        "address": "123 N 45th St",
        "firstName": "Alpha",
        "id": "639bbb1db945f0487e9a778a",
        "lastName": "Bravo"
    },
    {
        "address": "1313 Mockingbird Lane",
        "firstName": "Charlie",
        "id": "639bbb1db945f0487e9a778b",
        "lastName": "Delta"
    }
]

Retrieve the first user in the database:

» http :8080/oneuser                                                                
HTTP/1.1 200 OK
Content-Length: 98
Content-Type: application/json

{
    "address": "123 N 45th St",
    "firstName": "Alpha",
    "id": "639bbb1db945f0487e9a778a",
    "lastName": "Bravo"
}

Add a new user to the database:

» http POST :8080/newuser firstName=Echo lastName=Foxtrot address="21 Chester Place"
HTTP/1.1 200 OK
Content-Length: 102
Content-Type: application/json

{
    "address": "21 Chester Place",
    "firstName": "Echo",
    "id": "639bbb5cb945f0487e9a778c",
    "lastName": "Foxtrot"
}

Retrieve all users, verifying the new user we just added is present:

» http :8080
HTTP/1.1 200 OK
Content-Type: application/json
transfer-encoding: chunked

[
    {
        "address": "123 N 45th St",
        "firstName": "Alpha",
        "id": "639bbb1db945f0487e9a778a",
        "lastName": "Bravo"
    },
    {
        "address": "1313 Mockingbird Lane",
        "firstName": "Charlie",
        "id": "639bbb1db945f0487e9a778b",
        "lastName": "Delta"
    },
    {
        "address": "21 Chester Place",
        "firstName": "Echo",
        "id": "639bbb5cb945f0487e9a778c",
        "lastName": "Foxtrot"
    }
]

Summary

Thanks to the developer-first mindset of the Spring Boot, Spring Data, and Cosmos DB development teams, upgrading your Spring Boot app from using MongoDB to Azure Cosmos DB for MongoDB is as simple as changing a few environment variables and restarting your application. This enables you to go from concept to code to planetary-scale production in a matter of minutes...or seconds, if you don't stop for coffee along the way!

But don't worry, I won't tell your boss. Go ahead and grab that cup of java. You've earned it.

Resources
