
Event Sourcing with CQRS on Azure Cosmos DB

Working at Zupa means working in an innovative and forward-thinking environment. You’ll have the opportunity to use the latest technologies to build solutions across eCommerce, eProcurement and Reporting tools.

That also means: you’ll find a lot of opportunities to advance your technical career with Zupa!

Introducing myself

I am Alberto, a Software Developer and proud new member of Team 2!

We are the team that owns the development of Agreement microservices, and today I want to present Azure Cosmos DB and how we use it.

Event Sourcing and CQRS

Our agreement microservice uses Event Sourcing coupled with CQRS (Command Query Responsibility Segregation) on Azure Cosmos DB. Event Sourcing holds that instead of storing only the current state of our data in tables, we should store the sequence of transactions that occur in our system.

Let’s make the effort to pretend we’re inside an Azure Cosmos DB container. What would we see?

I tried to answer this question myself, and I imagined a big sky full of stars like in the picture below.


What are those rectangles?

Each of those rectangles represents a transaction recorded by the agreement service.

Let’s pretend a user on zupaPlatform just started trading with a partner. The two actors would go through a set of stages. During each stage, they would perform some transactions that would change part of the agreement they are negotiating.

For example, one of the actors might want to change the name of one of their products. In a relational world that would mean ‘update a cell in a table’. In event sourcing it means ‘append an event to your store’, with a type such as “Product Name Changed” and a payload carrying the new name.
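To make that concrete, here is a minimal sketch of such an event and an append-only store. This is illustrative Python rather than our production code: the names (Event, EventStore, agreement-42, prod-7) are invented for this post, and in the real service the store would be a Cosmos DB container rather than an in-memory list.

```python
import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Event:
    stream_id: str          # the agreement the event belongs to
    event_type: str         # e.g. "ProductNameChanged"
    payload: dict           # e.g. {"productId": "prod-7", "newName": "Organic Flour 1kg"}
    event_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    occurred_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class EventStore:
    """In-memory stand-in for an append-only Cosmos DB container."""
    def __init__(self) -> None:
        self._events: list[Event] = []

    def append(self, event: Event) -> None:
        # Events are only ever appended, never updated or deleted.
        self._events.append(event)

    def read_stream(self, stream_id: str) -> list[Event]:
        return [e for e in self._events if e.stream_id == stream_id]

store = EventStore()
store.append(Event(
    stream_id="agreement-42",
    event_type="ProductNameChanged",
    payload={"productId": "prod-7", "newName": "Organic Flour 1kg"},
))
```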

This way of storing data provides flexibility: we are creating a time machine that lets us move back and forward in time and, when needed, reinterpret history as we like.

Do we want to debug how a certain value ended up on our screens? Done. Want to rewrite the application? Forget worrying about losing production data or running painful migrations. Audit logs? Provided for free. Need to reuse the data in a different domain? Just reinterpret the events differently.
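That “time machine” can be sketched as a simple fold over the event history. The function name and event shape below are illustrative assumptions (events carrying a type, a payload and a timestamp), not the real schema:

```python
from datetime import datetime
from typing import Optional

def current_product_name(events: list[dict], as_of: Optional[datetime] = None) -> Optional[str]:
    """Fold the events in order; stop at `as_of` to see the state at that moment."""
    name = None
    for event in sorted(events, key=lambda e: e["occurred_at"]):
        if as_of is not None and event["occurred_at"] > as_of:
            break  # ignore anything that happened after the point in time we care about
        if event["event_type"] == "ProductNameChanged":
            name = event["payload"]["newName"]
    return name

history = [
    {"event_type": "ProductNameChanged", "payload": {"newName": "Flour"},
     "occurred_at": datetime(2021, 1, 1)},
    {"event_type": "ProductNameChanged", "payload": {"newName": "Organic Flour 1kg"},
     "occurred_at": datetime(2021, 2, 1)},
]
print(current_product_name(history))                                # Organic Flour 1kg
print(current_product_name(history, as_of=datetime(2021, 1, 15)))   # Flour
```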

How do we read data?

The answer is, by using projections.

After data arrives in the append-only store, we create snapshots and store them in read-optimised datastores. This lets us present the data in the best possible shape for whatever will consume it.
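As a rough sketch (again in illustrative Python, with made-up names and an in-memory dict standing in for a real read-optimised container), a projection is just a function that folds each incoming event into a snapshot document:

```python
# A tiny projection: fold events into a read-optimised snapshot per agreement.
read_store: dict[str, dict] = {}

def project(event: dict) -> None:
    """Apply one event to the snapshot document used by the read side."""
    snapshot = read_store.setdefault(event["stream_id"], {"products": {}})
    if event["event_type"] == "ProductNameChanged":
        product = snapshot["products"].setdefault(event["payload"]["productId"], {})
        product["name"] = event["payload"]["newName"]

project({
    "stream_id": "agreement-42",
    "event_type": "ProductNameChanged",
    "payload": {"productId": "prod-7", "newName": "Organic Flour 1kg"},
})
print(read_store["agreement-42"]["products"]["prod-7"]["name"])  # Organic Flour 1kg
```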

Undesirable data duplication

One of the main ideas we are taught at university during our database courses is that duplication is undesirable when designing a good relational datastore.

When we approach Event Sourcing we learn that duplication is not always that bad, but handling it comes with challenges. For example, how do we keep data synchronised between our append-only store and our projections?

Azure Cosmos DB has an integrated ‘messaging system’ known as the change feed. That is what we use in Team 2 to keep our projections in sync!

Azure Cosmos DB Change Feeds

A message is a piece of information for which the sender has expectations regarding how it will be handled by the receiver.

Sender and receiver are coupled and losing a message would have an impact on the system. That is also why message brokers come in handy. Not only do they decouple senders and receivers by providing temporary storage for the messages, but they also offer means to configure how messages will be received (“At Least Once”, “At Most Once”, “Exactly Once”).

The Azure Cosmos DB change feed comes with an SDK that provides an “At Least Once” delivery guarantee.

How?

The SDK monitors whether the code receiving messages in your application throws an exception; if it does, it keeps retrying to deliver the message until it is handled successfully.
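The snippet below is not the real change feed SDK; it is just a sketch of what “At Least Once” behaviour implies: a failing handler is retried, so the same change can reach the handler more than once, which is why projection handlers should be idempotent.

```python
import time

def deliver_at_least_once(changes, handle, max_attempts=5, delay_seconds=1.0):
    """Deliver each change to `handle`, retrying on failure (illustrative only)."""
    for change in changes:
        for attempt in range(1, max_attempts + 1):
            try:
                handle(change)          # e.g. apply the event to a projection
                break
            except Exception:
                if attempt == max_attempts:
                    raise               # surface the failure instead of losing the change
                time.sleep(delay_seconds)

# Example: the second change fails once, then succeeds on retry.
seen = []
def flaky_handler(change):
    if change == "event-2" and change not in seen:
        seen.append(change)
        raise RuntimeError("transient failure")
    print("handled", change)

deliver_at_least_once(["event-1", "event-2"], flaky_handler, delay_seconds=0.0)
```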

How we apply this at Zupa

In Team 2 we keep the read side and the write side of our solution separate, following CQRS. On the write side, we store events. On the read side, we project those events into the shape that best suits the application that will consume the data.

I hope you enjoyed this article. If you have any questions or comments, please email me at alberto.denatale@zupa.co.uk.


Did you learn something from this post? Tell us on Twitter. To register your CV for Zupa job alerts in the software industry click here.
