
Event-Driven Architecture Challenges with Business Central


Like many, I'm excited by the benefits that microservice architectures offer over traditional, monolithic systems. As always, the devil is in the detail, but many of us don't get a chance to learn about the detail until we've committed our organisations down a particular path.

I had an opportunity to look deeply into some of the implementation practicalities of such a re-architecture. At the time of writing, the team is looking at a variety of ERP vendors, but I've learned enough about the integration surfaces of Microsoft's Business Central in particular to jot down some findings, in the hope that others may benefit.

The context of these learnings is an IT transformation project where the client is seeking to move away from an on-premise ERP toward a cloud-based offering.

The client organisation has partitioned itself organically around shared concerns into several business domains: Production, Warehousing, Supplier Management and so forth. From the earliest discussions with the client, it was clear that staff in some areas were having a really Bad Time, but the monolithic nature of the incumbent ERP made any kind of change impractical. So the client is strongly motivated to move to an architecture that facilitates easy evolution. Dividing the software assets up into coarse-grained microservices ("mini-services"?) along the lines of the existing business domains seems like a good way to achieve this level of evolvability. Also, by not putting all the eggs in the one ERP basket, the door remains open to some of the highly-interactive, highly-graphical real-time tools the users are crying out for.

The nature of, and motivations behind event-driven, highly-decoupled microservices are covered pretty well around the internet. Suffice to say that one of the most fundamental flows that the new architecture must support is the ability to change a sales order in the cloud-based ERP and have an event-based mechanism multicast the relevant details of that change to other functional areas of the business.

For example, the Production team needs to know about new orders coming in so they can start planning when and how to fit them into their schedule.

(We could easily get side-tracked here by production-grade resilience concerns requiring event-sourcing and materialised views, but I just want to drill down on some more fundamental issues involved in raising events from Business Central and handling them in an Azure Function.)

Event subscription overview

At the time of writing, Business Central does not provide any support for the OpenAPI specification (it's planned for April 2019). This is a pity, because OpenAPI support would simplify integration with Microsoft Flow and Logic Apps - in particular, the Webhook object requires an OpenAPI definition before you can use it.

However, we can set up change notifications manually. Out of the box, Business Central provides a facility to subscribe to various change events for its basic business entities. By invoking a REST API you can create a 'subscription' object that causes Business Central to send a POST request to a URL of your choice whenever certain entities are changed or added.
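As a sketch of what creating one of these subscriptions involves, the snippet below builds the request body for a POST to the subscriptions endpoint. The base URL, company id and function URL here are placeholders, not values from a real tenant, and the exact API path varies by environment and version:

```python
import json

# Hypothetical endpoint for illustration; the real base URL depends on
# your tenant, environment name and API version.
BC_SUBSCRIPTIONS_URL = "https://api.businesscentral.dynamics.com/v1.0/api/v1.0/subscriptions"

def build_subscription(notification_url, resource, client_state):
    """Build the JSON body for creating a webhook subscription."""
    return {
        "notificationUrl": notification_url,  # where change notifications will be POSTed
        "resource": resource,                 # the entity set to watch
        "clientState": client_state,          # shared secret echoed back in each notification
    }

body = build_subscription(
    "https://myfunctionapp.azurewebsites.net/api/BcWebhook",
    "api/v1.0/companies(11111111-2222-3333-4444-555555555555)/salesOrders",
    "a-long-random-secret",
)
payload = json.dumps(body)  # POST this, with an authorized user's credentials
```

The POST itself must be made as an authorized Business Central user, as described below.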

My intention has been to set up some change subscriptions, catch the events emanating from Business Central and forward them to the Azure Event Grid for subsequent consumption by other parts of the organisation.

Note that we don't want to wire up Business Central events directly to subscribers - that would create a coupling that would eventually inhibit our ability to change. And in any case, the entity data that you get out of Business Central needs a bit of work done on it before it's in a suitable shape to send to event subscribers.
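To keep that decoupling, the intermediary reshapes the entity and wraps it in Event Grid's own event schema before publishing. A minimal sketch, with the event type and subject naming scheme being my own invented conventions rather than anything prescribed:

```python
import uuid
from datetime import datetime, timezone

def to_event_grid_event(entity, event_type, subject):
    """Wrap a reshaped Business Central entity in the Event Grid event schema.

    id, subject, eventType, eventTime, data and dataVersion are the fields
    Event Grid requires; the POST body it expects is a JSON array of these.
    """
    return {
        "id": str(uuid.uuid4()),
        "subject": subject,                  # e.g. "sales/orders/SO-1001" (our own convention)
        "eventType": event_type,             # e.g. "erp.salesOrder.changed" (our own convention)
        "eventTime": datetime.now(timezone.utc).isoformat(),
        "data": entity,                      # the reshaped entity, not raw BC output
        "dataVersion": "1.0",
    }
```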

Creating Business Central event subscriptions

You would hope that there would be some degree of authentication to protect against random internet strangers being able to set up subscriptions to your core data, and indeed there is. You have to be an authorized Business Central user to create a webhook subscription.

At the time that you create a subscription, Business Central also carries out a small test to improve the chances that you actually control the notificationUrl you are pointing it to: It sends a one-time POST to your notificationUrl with a query parameter called 'validationToken'. The notificationUrl must respond with the value of that parameter in the response body. It's a simple bit of logic and the motivations are understandable, however it stops you pointing a subscription directly at an Event Grid topic. This was initially disappointing but it turns out that there are other reasons why this would have been impractical.
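The handshake logic in your Azure Function amounts to a couple of lines. A sketch (the function and parameter names are mine, not part of any SDK):

```python
def handle_bc_request(query, body):
    """Handle an incoming POST from Business Central.

    When a subscription is created, BC sends a one-time POST carrying a
    'validationToken' query parameter; we must echo that value back as the
    plain-text response body with a 200 status. Anything else is treated
    as an ordinary change notification.
    Returns a (status_code, response_body) pair.
    """
    token = query.get("validationToken")
    if token is not None:
        return 200, token   # handshake: echo the token straight back
    # ...otherwise process 'body' as a change notification...
    return 202, ""
```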

Updating event subscriptions

The Business Central subscriptions need to be renewed every 3 days, or else they are removed. Once again, it's a matter of invoking a particular REST API to do this. I imagine that the Business Central team chose this as a way of reducing the number of zombie subscriptions firing off endlessly into an uncaring void. Given that most people will just set up automated timers to refresh them, I can't help but think they've just moved the problem to a different place.
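A timer-triggered function that renews well inside the 3-day window might be scheduled along these lines. The PATCH shape here is a sketch: my understanding is that renewal is a PATCH against the subscription carrying the latest etag in an If-Match header, but check this against the current documentation. The safety margin is my own choice:

```python
from datetime import datetime, timedelta, timezone

EXPIRY = timedelta(days=3)           # BC removes subscriptions after roughly 3 days
SAFETY_MARGIN = timedelta(hours=12)  # renew well before the deadline (arbitrary choice)

def next_renewal(last_renewed):
    """When a timer-triggered function should next renew the subscription."""
    return last_renewed + EXPIRY - SAFETY_MARGIN

def renewal_request(subscription_id, etag):
    """Sketch of the renewal call, expressed as a plain dict.

    The etag comes back when the subscription is created or read; BC
    expects it in an If-Match header on the renewal PATCH.
    """
    return {
        "method": "PATCH",
        "url": f"subscriptions('{subscription_id}')",
        "headers": {"If-Match": etag, "Content-Type": "application/json"},
        "body": {},  # an empty update is enough to reset the expiry
    }
```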

Deleting event subscriptions

At the time of writing, Business Central webhook subscriptions cannot be deleted. The documentation indicates that it's possible, but it doesn't work.

Protecting the Notification Url

Business Central doesn't provide facilities for its event notifier to authenticate with the notificationUrl. You can't configure it to send a particular Authorization header with each change notification, for example.

So your target (an Azure Function, in my case) needs to allow anonymous access. There may be ways to lock this down through IP restrictions or an API gateway, but I didn't pursue this.

Alternatively/additionally, there is a 'clientState' property that Business Central sends with each change notification as part of the POST body. It's a bit hacky, but it seems to be designed to be used as a shared secret by the Azure function. Ideally you would validate it as early as possible and if it doesn't match, bail out. You'd have to give some first-class thought to a mechanism to synchronize the update of these shared secrets in a scalable way.

Poke-pull problems

Business Central doesn't supply you with all the entity property values when a subscribed entity changes. Instead, you get a JSON document that says "a thing was updated or created, and here's a URL to query if you want to know more". Your code is then expected to reach back into Business Central (again, with all the hassles of authenticating against the API and maintaining secrets) to extract the entire description of the object that has changed. The Business Central documentation calls this "poke-pull".
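The "poke" half, as I've observed it, is a JSON body whose 'value' array names each changed resource and the kind of change. Extracting the URLs to pull back might look like this (the changeType values shown are the ones I'd expect from the documented created/updated/deleted vocabulary):

```python
import json

def resources_to_pull(notification_body):
    """Extract the entity URLs named in a change notification.

    These are the resources your code must then fetch from Business
    Central - the 'pull' half of poke-pull - with all the attendant
    authentication hassle. Deletions carry nothing worth fetching.
    """
    payload = json.loads(notification_body)
    return [
        item["resource"]
        for item in payload.get("value", [])
        if item.get("changeType") in ("created", "updated")
    ]
```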

But when you pull back the changed entity, you don't know what has changed. Was it one field, or many? In fact, much of the time, it will be one of the child entities that has changed. For example, if you are subscribing to changes in the SalesOrder object, the webhook will fire whenever any of the child Sales Order Lines change. You have no way of finding out which one it was, or which field changed within the Sales Order Line. Sadly, you can't subscribe to changes in SalesOrderLines, only the parent SalesOrder entity.

If you want to pass on fine-grained changes via Event Grid, it seems that the only way you can really do this is to maintain a private copy of the Sales Orders (and all the child entities) for comparison purposes. This has clear scalability problems and a generally Bad Smell to it (because it seems like a lot of work to achieve a pretty simple thing).
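The comparison step itself is simple enough; the Bad Smell is in having to keep the snapshots around at all. A sketch of the field-level diff, treating entities as flat dicts:

```python
def diff_entity(old, new):
    """Compare a cached snapshot with a freshly pulled entity.

    Returns {field: (old_value, new_value)} for every field that
    differs; a field absent from one side is treated as None.
    """
    changed = {}
    for key in set(old) | set(new):
        if old.get(key) != new.get(key):
            changed[key] = (old.get(key), new.get(key))
    return changed
```

Multiply this by every Sales Order and every child line you have to keep cached, and the storage and consistency burden becomes clear.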

Alternatively you could architect around not caring what has changed. You could just blast out all the details of all the objects you retrieve, whether they have changed or not. In my case, that would involve forwarding each Sales Order header record plus each Sales Order line out to the respective Event Grid endpoints. By doing this you push the what-has-changed problem out to your subscribers. There will be some unavoidable degree of duplication and wasted CPU cycles in this approach.

Chatty change events

Another potential issue to be aware of is that the change events are raised as part of autosave within the Business Central client UI. There is no "batching up changes in a modal dialog until an OK button is hit". If you make several changes to a line item (maybe you're on a phone call, and you make several adjustments, up and down, to the number of pieces in a line item over the course of a 5-minute conversation), then sure, the last one wins, but several change events could be raised.

(NB: I've seen the notifications arrive anywhere from 30 seconds to 3 minutes after the change is made in the UI)

These multiple change events are no problem if you're simply updating a couple of slaved database tables in other microservices. But in our case, each line item corresponds to a number of shipping containers. So changing the number of pieces in a line item will ultimately cause new shipping containers to be ordered or existing ones to be cancelled. This is non-trivial. Bookings will be made or cancelled with trucking companies causing carnage to their driver scheduling. Most of all, each container kicks off a heavyweight, change-averse process with port authorities, government biosecurity and customs bureaucracies. So we'll be batching up those changes, otherwise some innocent chopping and changing in the UI could unwittingly annoy a lot of external partners.
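One way to batch is to hold each entity's latest state until it has been quiet for a while, so a flurry of edits during a phone call collapses into a single downstream event. A sketch of that debounce logic (the class and the five-minute quiet period are my own choices, not anything Business Central or Azure prescribes):

```python
import time

class ChangeBatcher:
    """Hold back change events until an entity has gone quiet.

    Each incoming change overwrites the entity's pending payload and
    resets its timer; flush() releases the latest payload for every
    entity whose quiet period has elapsed, so only the final state of
    a burst of edits goes downstream.
    """

    def __init__(self, quiet_seconds=300.0):
        self.quiet_seconds = quiet_seconds
        self._pending = {}  # entity id -> (last_seen_timestamp, latest_payload)

    def record(self, entity_id, payload, now=None):
        now = time.time() if now is None else now
        self._pending[entity_id] = (now, payload)

    def flush(self, now=None):
        now = time.time() if now is None else now
        ready = {
            eid: payload
            for eid, (seen, payload) in self._pending.items()
            if now - seen >= self.quiet_seconds
        }
        for eid in ready:
            del self._pending[eid]
        return ready
```

In production the pending state would need to live somewhere durable (a table or queue) rather than in process memory, but the shape of the logic is the same.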

I hope this list of gotchas is useful to some people; you'd be unlikely to be aware of these concerns from the Business Central documentation. If nothing else, it might be the basis for some probing questions as you interview your own ERP vendors.



Mike Wiese


enquiry@ksc.net.au


0427 886 404, West Australian Time

© 2017 Keystone Software Consulting. All rights reserved.
