Manu Radhakrishnan

Is there any way to compress the data while using mongo persistence with NEventStore?

I'm working with C#, .NET Core, and NEventStore (version 9.0.1), trying to evaluate the various persistence options it supports out of the box.

More specifically, when using the Mongo persistence, the payload is stored without any compression applied.

Note: payload compression works perfectly when using NEventStore's SQL persistence, but not with the Mongo persistence.
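
For comparison, the SQL persistence wireup where .Compress() does take effect looks roughly like this; it's only a sketch, and the exact UsingSqlPersistence overload, dialect, and serializer shown here are from memory and may differ from my actual setup:

private IStoreEvents CreateSqlEventStore(string connectionString)
{
    // Illustrative SQL Server wireup; here .Compress() gzips the serialized payload before it is stored.
    var store = Wireup.Init()
        .UsingSqlPersistence(System.Data.SqlClient.SqlClientFactory.Instance, connectionString)
        .WithDialect(new NEventStore.Persistence.Sql.SqlDialects.MsSqlDialect())
        .InitializeStorageEngine()
        .UsingJsonSerialization()
        .Compress()
        .Build();
    return store;
}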

I'm using the code below to create and initialize the event store:

private IStoreEvents CreateEventStore(string connectionString)
{
    var store = Wireup.Init()
        .UsingMongoPersistence(connectionString,
            new NEventStore.Serialization.DocumentObjectSerializer())
        .InitializeStorageEngine()
        .UsingBsonSerialization()
        .Compress()
        .HookIntoPipelineUsing()
        .Build();
    return store;
}

And I'm using the code below to store the events:

public async Task AddMessageTostore(Command command)
{
    using (var stream = _eventStore.CreateStream(command.Id))
    {
        stream.Add(new EventMessage { Body = command });
        stream.CommitChanges(Guid.NewGuid());
    }
}

The workaround I did: by implementing the PreCommit(CommitAttempt attempt) and Select methods of IPipelineHook and applying gzip compression logic there, I was able to get the events compressed in MongoDB.
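
A minimal sketch of that hook is below. It assumes the event body can be round-tripped through Newtonsoft.Json and stashes the CLR type name in the message headers so Select can rebuild the object; those details (and the GzipPipelineHook/BodyClrType names) are my own choices, not part of NEventStore:

using System;
using System.IO;
using System.IO.Compression;
using System.Text;
using NEventStore;
using Newtonsoft.Json;

public class GzipPipelineHook : PipelineHookBase
{
    private const string TypeHeader = "BodyClrType"; // illustrative header key, not an NEventStore convention

    // Compress each event body just before the commit attempt is persisted.
    public override bool PreCommit(CommitAttempt attempt)
    {
        foreach (var message in attempt.Events)
        {
            message.Headers[TypeHeader] = message.Body.GetType().AssemblyQualifiedName;
            message.Body = Compress(JsonConvert.SerializeObject(message.Body));
        }
        return true; // continue with the commit
    }

    // Decompress each event body when commits are read back from the store.
    public override ICommit Select(ICommit committed)
    {
        foreach (var message in committed.Events)
        {
            if (message.Body is byte[] compressed &&
                message.Headers.TryGetValue(TypeHeader, out var typeName))
            {
                var type = Type.GetType((string)typeName);
                message.Body = JsonConvert.DeserializeObject(Decompress(compressed), type);
            }
        }
        return committed;
    }

    private static byte[] Compress(string text)
    {
        using (var output = new MemoryStream())
        {
            using (var gzip = new GZipStream(output, CompressionMode.Compress))
            {
                var bytes = Encoding.UTF8.GetBytes(text);
                gzip.Write(bytes, 0, bytes.Length);
            }
            return output.ToArray();
        }
    }

    private static string Decompress(byte[] compressed)
    {
        using (var input = new MemoryStream(compressed))
        using (var gzip = new GZipStream(input, CompressionMode.Decompress))
        using (var reader = new StreamReader(gzip, Encoding.UTF8))
        {
            return reader.ReadToEnd();
        }
    }
}

The hook is then registered in the wireup shown earlier via .HookIntoPipelineUsing(new GzipPipelineHook()).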

Attaching data store images of both SQL and Mongo persistence:
[Image: SQL persistence data store]
[Image: Mongo persistence data store]

So, the questions are:

  1. Is there some other option or setting I'm missing so that the events get compressed while saving (i.e. the fluent way of calling the Compress method)?
  2. Is the workaround mentioned above sensible, or does it add a performance overhead?
