Serverless Integration design pattern on Azure to handle millions of transactions per second
Recently I was working on an integration project where the client had the following requirements:
- It should support millions of transactions per second.
- It should auto-scale based on demand and the number of requests.
- The client did not want to procure huge hardware up-front.
- It should support integration with multiple applications and stay open for future integrations with no or minimal changes.
- If any application is down for some time, the framework should have built-in retry logic that runs on a configured schedule.
We analyzed the following options for this integration framework:
1. BizTalk Server
2. Azure Functions, Logic Apps & Event Hubs
3. Third-party integration server
Every option has its own pros & cons. After analyzing the client's requirements against each option, we found option 2, the Azure Functions and Event Hubs approach, the most suitable for this type of integration.
Below are the main features that played a key role in selecting this framework for the integration:
Azure Functions-
Azure Functions is a serverless compute service that enables you to run code on-demand without having to explicitly provision or manage infrastructure.
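To make this concrete, here is a minimal sketch of an HTTP-triggered Azure Function in Python (v1 programming model), the kind of endpoint a partner could post data to directly. The payload shape and response codes are illustrative assumptions, not the production code.

import json
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Parse the incoming JSON payload; reject malformed requests.
    try:
        payload = req.get_json()
    except ValueError:
        return func.HttpResponse("Invalid JSON payload", status_code=400)

    # Hypothetical hand-off point: validation and forwarding to the
    # process event hub would happen here (see the later sketches).
    return func.HttpResponse(
        json.dumps({"accepted": True, "messageType": payload.get("type")}),
        mimetype="application/json",
        status_code=202)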
Azure Logic Apps-
Azure Logic Apps is a cloud service that helps you automate and orchestrate tasks, business processes, and workflows when you need to integrate apps, data, systems, and services across enterprises or organizations. Logic Apps simplifies how you design and build scalable solutions for app integration, data integration, system integration, enterprise application integration (EAI), and business-to-business (B2B) communication, whether in the cloud, on premises, or both.
Azure Event Hubs-
Azure Event Hubs is a Big Data streaming platform and event ingestion service, capable of receiving and processing millions of events per second.
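As a quick illustration of event ingestion, below is a minimal Python sketch that publishes a batch of events to an event hub using the azure-eventhub SDK (v5); the connection string, hub name, and payload are placeholder assumptions.

import json
from azure.eventhub import EventHubProducerClient, EventData

CONNECTION_STR = "<event-hubs-namespace-connection-string>"  # placeholder
EVENT_HUB_NAME = "process"  # hypothetical hub name

def publish_events(messages):
    # Create a producer, batch the messages, and send them in one call.
    producer = EventHubProducerClient.from_connection_string(
        CONNECTION_STR, eventhub_name=EVENT_HUB_NAME)
    with producer:
        batch = producer.create_batch()
        for msg in messages:
            batch.add(EventData(json.dumps(msg)))  # raises ValueError once the batch is full
        producer.send_batch(batch)

publish_events([{"clientId": "c-001", "type": "order", "amount": 100}])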
Cosmos DB -
Azure Cosmos DB is Microsoft's globally distributed, multi-model database service. It enables you to elastically and independently scale throughput and storage across any number of Azure's geographic regions, and to take advantage of fast, single-digit-millisecond data access using your favorite API among SQL, MongoDB, Cassandra, Tables, or Gremlin. Cosmos DB provides comprehensive service level agreements (SLAs) for throughput, latency, availability, and consistency.
SQL Azure-
SQL Database is a general-purpose relational database managed service in Microsoft Azure that supports structures such as relational data, JSON, spatial, and XML. SQL Database delivers dynamically scalable performance within two different purchasing models: a vCore-based purchasing model and a DTU-based purchasing model.
Let's discuss the processing and data flow in this design pattern.
Partners can send data in two ways:
1. Post XML files to the logic app
2. Post data directly to an Azure function
Partners who have existing XML-formatted files, do not want to make any changes to their existing applications, and are not able to send data to an Azure function can send their XML files to the logic app. Below is the process in the logic app:
1. The client posts XML files along with additional fields for the client ID and token.
2. Azure SQL contains the list of clients, client IDs, tokens, and supported integration details, e.g. the input schema, XSLT name, and target schema name.
3. Based on the configuration for the particular client and message type, the logic app picks the schema and XSLT from the integration account and, after processing, sends the message to an Azure function for further processing (see the configuration-lookup sketch after this list).
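Below is a minimal sketch of how such a configuration lookup against Azure SQL might look in Python with pyodbc; the IntegrationConfig table and its columns are hypothetical names for illustration, not the actual schema.

import pyodbc

# Placeholder connection string; substitute your server, database, and credentials.
CONN_STR = ("Driver={ODBC Driver 17 for SQL Server};"
            "Server=<server>;Database=<database>;Uid=<user>;Pwd=<password>")

def get_integration_config(client_id, message_type):
    # Look up the schema, XSLT, and target schema configured for this
    # client and message type (hypothetical table and column names).
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute(
            "SELECT InputSchema, XsltName, TargetSchema, Token "
            "FROM IntegrationConfig WHERE ClientId = ? AND MessageType = ?",
            client_id, message_type)
        row = cursor.fetchone()
    if row is None:
        raise ValueError("No integration configured for this client/message type")
    return {"input_schema": row.InputSchema, "xslt": row.XsltName,
            "target_schema": row.TargetSchema, "token": row.Token}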
Working of the Azure functions-
1. Based on the message type, the function first validates the input message.
2. After successful data validation, it sends the message to the process event hub.
3. Another Azure function picks up the message and does the processing based on the business logic. After the business processing it collects all the data and sends it to another event hub for the database update (dbupdate).
4. Another Azure function picks up the message from the dbupdate event hub and saves the data into Cosmos DB, Azure SQL, etc.
5. We have one more event hub which is used to post data to other applications asynchronously. Whenever we need to post data to another application, we send a message to the posttopartner event hub.
6. An Azure function picks up the message from posttopartner and sends it to the partner's endpoint with a security token, etc. (A sketch of one such event-hub-triggered hop follows this list.)
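To show what one hop in this pipeline looks like, here is a minimal sketch of an event-hub-triggered Azure Function in Python (v1 programming model) that consumes from the process hub and forwards the result to the dbupdate hub via an output binding; the binding names and the business step are illustrative assumptions.

import json
import azure.functions as func

# Assumes a function.json with an eventHubTrigger binding named "event"
# (eventHubName: "process") and an eventHub output binding named
# "outputHub" (eventHubName: "dbupdate").
def main(event: func.EventHubEvent, outputHub: func.Out[str]) -> None:
    message = json.loads(event.get_body().decode("utf-8"))

    # Hypothetical business-logic step: enrich/transform the message.
    message["processed"] = True

    # Hand the result to the next stage (the dbupdate event hub).
    outputHub.set(json.dumps(message))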
Exception and Retry management-
1. Every Azure function that works on an event has all of its code within a try-catch block.
2. Inside the catch block it validates the error code. If the error is related to partner system connectivity, it inserts a record into a separate retry table in Cosmos DB.
3. Another Azure function runs after a configured duration and checks whether any unprocessed records are available in the retry table; if so, it picks up those records and posts them to the respective event hub.
4. From the event hub, the respective Azure function picks up the failed message and starts the processing.
5. Every piece of business logic first checks whether the same message has already been processed, and only processes it if it has not. (A retry/idempotency sketch follows this list.)
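Below is a minimal sketch of this catch-and-park pattern together with the idempotency check, using the azure-cosmos SDK; the container names, document shapes, and the use of the message id as partition key are illustrative assumptions.

import json
from azure.cosmos import CosmosClient, exceptions

client = CosmosClient("<cosmos-account-url>", credential="<account-key>")  # placeholders
db = client.get_database_client("integration")           # hypothetical database
retry_container = db.get_container_client("retry")       # hypothetical containers
processed_container = db.get_container_client("processed")

def already_processed(message_id):
    # Idempotency check: has this message id been recorded as processed?
    try:
        processed_container.read_item(item=message_id, partition_key=message_id)
        return True
    except exceptions.CosmosResourceNotFoundError:
        return False

def handle_partner_post(message, post_to_partner):
    if already_processed(message["id"]):
        return  # skip duplicates re-delivered by the retry flow
    try:
        post_to_partner(message)
        processed_container.upsert_item({"id": message["id"]})
    except ConnectionError:
        # Partner system unreachable: park the message for the scheduled
        # retry function, which re-posts it to the respective event hub.
        retry_container.upsert_item({"id": message["id"],
                                     "payload": json.dumps(message),
                                     "status": "unprocessed"})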
I hope this helps you understand how to achieve high-performance integrations that can handle millions of requests per second with on-demand scaling and serverless technology.
Let me know if you have any queries.
