Handling events and incoming Web API calls simultaneously

event-programming, microservices, web-api

I'm working on a project with an architecture of multiple microservices that communicate with each other via an event bus (RabbitMQ).

I have a Web API project set up for each service so the outside world can send commands using REST calls.

When my Web API receives a call, it sends the command down to my domain layer, where I process that command and persist any data if necessary. When that's done, I update the surrounding services by publishing an event with the corresponding data.
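
Roughly, the publishing side looks like this (a simplified sketch; the command, domain service, and event-bus types below are only placeholders for my actual ones):

```csharp
using System.Web.Http;

// Placeholder types standing in for the real command, domain service and bus.
public class CreateCustomerCommand { public string Name { get; set; } }
public class CustomerCreatedEvent
{
    public int CustomerId { get; set; }
    public string Name { get; set; }
}
public interface ICustomerService { int Create(CreateCustomerCommand command); }
public interface IEventBus { void Publish(object @event); }

public class CustomersController : ApiController
{
    private readonly ICustomerService _customers;
    private readonly IEventBus _eventBus;

    public CustomersController(ICustomerService customers, IEventBus eventBus)
    {
        _customers = customers;
        _eventBus = eventBus;
    }

    // POST api/customers
    public IHttpActionResult Post(CreateCustomerCommand command)
    {
        // The domain layer processes the command and persists the customer.
        var customerId = _customers.Create(command);

        // Notify the surrounding services by publishing an event on the bus.
        _eventBus.Publish(new CustomerCreatedEvent { CustomerId = customerId, Name = command.Name });

        return Ok(customerId);
    }
}
```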

The thing I'm struggling with is the receiving end of the event-driven communication. How can a microservice be ready to accept API calls from the outside world and have an event handler waiting for incoming events at the same time? It feels like this shouldn't be handled by some other service, but I can't think of any other way to deal with this problem.

I hope that the following image makes things clearer:

[architecture diagram]

For example, if we have a banking system and we want to create a customer, we need the Account Service to react to that event and create an account for that customer. How can we make sure that the Account Service is ready to receive API calls and simultaneously handle the events from the event bus?

Best Answer

From what you're describing, it strikes me that the problem is reconciling the threading model your event-processing loop (i.e. the RabbitMQ message handler) needs with how ASP.NET's concurrency works.

There are two ways to tackle this: in-process and out-of-process.

In-process:

Use a dedicated background thread for your event loop, one that is isolated from the threads that are processing HTTP requests.
Start it in Application_Start. That will give you the concurrency you're looking for: handling incoming HTTP requests and listening for messages at the same time.
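
A minimal sketch of that wiring in Global.asax, assuming the RabbitMQ.Client package (pre-7.x EventingBasicConsumer API); the queue name and the handler body are only illustrative:

```csharp
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

public class WebApiApplication : System.Web.HttpApplication
{
    private IConnection _connection;
    private IModel _channel;

    protected void Application_Start()
    {
        // ... usual Web API route/filter configuration goes here ...

        // Run the event loop on its own long-running thread, isolated from
        // the threads that serve HTTP requests.
        Task.Factory.StartNew(StartEventLoop, TaskCreationOptions.LongRunning);
    }

    private void StartEventLoop()
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        _connection = factory.CreateConnection();
        _channel = _connection.CreateModel();
        _channel.QueueDeclare(queue: "customer-created", durable: true,
                              exclusive: false, autoDelete: false, arguments: null);

        var consumer = new EventingBasicConsumer(_channel);
        consumer.Received += (sender, ea) =>
        {
            var json = Encoding.UTF8.GetString(ea.Body.ToArray());

            // Hand the message to the domain layer here, e.g. create the
            // account for the newly created customer, then acknowledge it.
            _channel.BasicAck(ea.DeliveryTag, multiple: false);
        };

        _channel.BasicConsume(queue: "customer-created", autoAck: false, consumer: consumer);
    }

    protected void Application_End()
    {
        _channel?.Close();
        _connection?.Close();
    }
}
```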

Unfortunately, you're doing this in an application framework that wasn't designed for this style of work processing. Phil Haack explores why a long-running background thread in ASP.NET is problematic, and an event-processing loop is exactly that. Depending on your context, you may be happy to ignore the problems surfaced in that post. The main one to consider is that application pools, by default, get torn down after 20 minutes of idle time. That's fundamentally incompatible with the always-on process model an event-polling loop needs.

Out-of-process:

Stop handling RabbitMQ messages in your application and expose each type of message processing as an HTTP-triggered REST-style API call instead. Introduce a RabbitMQ-to-HTTP bridge process (a Windows service, or even a console application) that listens for Rabbit messages and triggers the appropriate REST endpoint.
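
A hedged sketch of such a bridge as a console application, again assuming the RabbitMQ.Client package and a plain HttpClient; the queue name and target URLs are made up:

```csharp
using System;
using System.Linq;
using System.Net.Http;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

// Stand-alone RabbitMQ-to-HTTP bridge: consumes Rabbit messages and forwards
// them to the service's REST API. Queue name and URLs are illustrative only.
class Program
{
    static void Main()
    {
        var http = new HttpClient { BaseAddress = new Uri("http://localhost:5001/") };
        var factory = new ConnectionFactory { HostName = "localhost" };

        using (var connection = factory.CreateConnection())
        using (var channel = connection.CreateModel())
        {
            channel.QueueDeclare(queue: "customer-created", durable: true,
                                 exclusive: false, autoDelete: false, arguments: null);

            var consumer = new EventingBasicConsumer(channel);
            consumer.Received += (sender, ea) =>
            {
                var json = Encoding.UTF8.GetString(ea.Body.ToArray());

                // Forward the event to the service's HTTP endpoint.
                var response = http.PostAsync("api/accounts/customer-created",
                    new StringContent(json, Encoding.UTF8, "application/json")).Result;

                // Ack only on success so failed messages stay queued and
                // get redelivered.
                if (response.IsSuccessStatusCode)
                    channel.BasicAck(ea.DeliveryTag, multiple: false);
                else
                    channel.BasicNack(ea.DeliveryTag, multiple: false, requeue: true);
            };

            channel.BasicConsume(queue: "customer-created", autoAck: false, consumer: consumer);

            Console.WriteLine("Bridge running. Press [enter] to exit.");
            Console.ReadLine();
        }
    }
}
```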

The downside of this approach is increased deployment complexity, and you have to consider distribution as part of your failure scenarios. The nice bit is that your main application runs off HTTP only, and your RabbitMQ message processor itself is very straightforward, with no threading worries. All of this should facilitate development and testing, despite the increased level of distribution.

Variation: instead of going HTTP-centric for your main application, go Rabbit-centric and build an HTTP-to-RabbitMQ bridge instead, converting your existing REST calls into message-triggered equivalents. I don't recommend that, because you'd always depend on one very particular message broker to get work done, while HTTP is a ubiquitous integration technology.
