Introduction
Today, the Internet of Things (IoT) enables everything from autonomous vehicles and smart cities to more efficient supply chain management. IoT applications seem endless, and they have only just begun.
As a leader in the API management and integration landscape, MuleSoft has introduced ways to enable organisations to harness the power of IoT to improve their operational efficiency.
In this blog, I will walk through an example of achieving operational efficiency by integrating IoT device data between MuleSoft and AWS IoT Cloud services.
The example simulates monitoring of temperature sensors installed in manufacturing or warehouse facilities. The sensor data is captured, and an incident is escalated to the Operations team for action.
AWS IoT Core provides the capability to connect things (devices) to the AWS services ecosystem and is therefore the first layer to interface with the device/sensor data. The data then moves into the MuleSoft ecosystem via other AWS services such as SQS.
To better understand the solution presented here, the audience should be familiar with the concepts of:
- AWS services
  - EC2
  - IAM
  - EC2 Instance Connect
  - IoT Core
  - CloudWatch Logs
  - Amazon SQS
- MuleSoft Anypoint Platform
- ServiceNow Incident Management
High-Level Flow of our Integration Solution
In the above diagram, the first layer consists of the sensors/devices (we will create a virtual device to simulate real-world sensors) that send data to AWS IoT Core.
The data is processed by the AWS IoT Core message broker and passed to the rules engine. The rules engine applies a filter and routes matching messages to Amazon SQS (data can also be routed to other AWS services such as Lambda, S3 and Kinesis).
Once the data is in the SQS standard queue, the MuleSoft API is triggered and consumes the filtered data (the IoT rules engine in the previous step filters the temperature data and allows through only the messages matching the rule). Inside the API and integration layer, the data is logged, transformed and used to invoke the ServiceNow incident service to create incident tickets for the Operations team.
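Before diving into the configuration, the filter-and-escalate logic can be sketched in plain Python. This is only an illustration of the data flow: in the real solution the filtering happens inside the AWS IoT rules engine and the transformation inside MuleSoft, and the payload field names (`sensorId`, `temperature`) are assumptions about the device message format.

```python
import json

# Threshold taken from the IoT rule configured later in this post (208 °F)
TEMP_THRESHOLD_F = 208

def filter_reading(message: str):
    """Mimic the IoT rules engine: pass only readings above the threshold."""
    reading = json.loads(message)
    return reading if reading["temperature"] > TEMP_THRESHOLD_F else None

def to_incident(reading: dict) -> dict:
    """Mimic the MuleSoft transform: map a sensor reading to an incident payload."""
    return {
        "short_description": f"Sensor {reading['sensorId']} exceeded threshold",
        "comments": f"Reported temperature: {reading['temperature']} F",
        "urgency": "1",
    }

# One reading below and one above the threshold
below = json.dumps({"sensorId": "S-01", "temperature": 72})
above = json.dumps({"sensorId": "S-02", "temperature": 215})

assert filter_reading(below) is None          # dropped by the "rule"
incident = to_incident(filter_reading(above)) # escalated as an incident
print(incident["short_description"])
```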
To enable the above integration, we need to complete the following setup tasks.
Part 1 (AWS-specific configurations)
- Create a Free AWS account.
- Launch an EC2 instance.
* Public IP address is masked.
- Configure a virtual device (Thing)
* Root account is masked here and in relevant screenshots below
- Create an IAM role
IAM user with admin privileges (since it is a dev environment, it is safe to give the user admin privileges)
IAM role with the permissions below (a role with full privileges for SQS)
Trust permissions for the entities (below) that will assume the DataSQSRole.
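For an IoT rule action to write to SQS, the role's trust policy must allow the AWS IoT service to assume it. A minimal trust policy for the DataSQSRole would look like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "iot.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
```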
- Create a standard SQS queue in the London region (no dead-letter queue is attached)
- Configure an IoT rule
For this demonstration, we are only concerned with the cases where the sensor temperature exceeds the operational threshold and requires the Operations team's attention. We have therefore used the AWS IoT Core rules engine to filter incoming messages for such cases.
Here, the IoT routing rule is created, which gets the sensor data from the 'iot/topics/facility/sensor' topic and filters for readings where the temperature exceeds 208 degrees Fahrenheit.
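Based on the topic and threshold above, the rule's SQL statement would look roughly like this (the field name `temp` is an assumption about the device payload; adjust it to match your sensor's message format):

```sql
SELECT * FROM 'iot/topics/facility/sensor' WHERE temp > 208
```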
The rule action defines what should be done with the filtered data; here it is routed to the SQS standard queue. (Other AWS services, such as Lambda, S3 or Kinesis, could be targeted instead.)
The error action defines what should be done in case of broker/rules engine execution errors; here we log the errors to CloudWatch Logs for further investigation.
Part 2 (MuleSoft and other configurations)
- MuleSoft Implementation Flow
We have created a simple implementation here, where the Amazon SQS source connector subscribes to and consumes messages from the SQS queue – DataInQ. In the subflow, an incident ticket is raised by invoking the insert operation via the ServiceNow connector.
Both connector configurations are provided below.
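As an outline, the Mule flow described above would look roughly like the following XML. Treat this as a sketch rather than a drop-in config: element and attribute names vary between connector versions, the property placeholders are assumptions, and the DataWeave body is elided.

```xml
<flow name="iot-sensor-incident-flow">
  <!-- Source: poll the SQS standard queue populated by the IoT rule -->
  <sqs:receive-messages config-ref="Amazon_SQS_Configuration"
                        queueUrl="${sqs.dataInQ.url}"/>
  <logger level="INFO" message="#[payload]"/>
  <flow-ref name="create-incident-subflow"/>
</flow>

<sub-flow name="create-incident-subflow">
  <!-- Transform the sensor reading into a ServiceNow incident request -->
  <ee:transform>
    <ee:message>
      <ee:set-payload><![CDATA[%dw 2.0
output application/xml
---
{ ... }]]></ee:set-payload>
    </ee:message>
  </ee:transform>
  <!-- Invoke the insert operation on the incident service -->
  <servicenow:invoke config-ref="ServiceNow_Config"
                     service="incident" operation="insert"/>
</sub-flow>
```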
- ServiceNow instance setup.
Please follow the link to create a free developer instance – https://www.youtube.com/watch?v=ZdQKrbPcXuE
Please jot down the Service URL, Username and Password to be used later during MuleSoft ServiceNow connector configuration (Refer to Step 10)
- SQS Connector configuration
The Role ARN and Queue ARN represent the IAM role and the SQS queue created earlier.
The Access Key and Secret Key belong to the IAM user (created earlier), not to the root account.
The Region is the AWS region, i.e. EU (London) (eu-west-2).
- ServiceNow Connector configuration
Using the data to generate a ServiceNow ticket
Sensor data is generated by the IoT virtual device and sent to IoT Core, where it is filtered based on the rules defined in the rules engine.
The matching data is routed to the SQS standard queue, from where the MuleSoft API consumes it, transforms it and sends a request to create an incident in ServiceNow.
Stage 1 – Data generation from the Thing (virtual device). The script generates the temperature data for the topic.
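A minimal sketch of such a generator script is shown below. The payload shape (`sensorId`, `temperature`, `timestamp`) is an assumption for illustration; a real device script would additionally publish each payload to the `iot/topics/facility/sensor` MQTT topic via the AWS IoT Device SDK, which is omitted here.

```python
import json
import random
import time

TOPIC = "iot/topics/facility/sensor"

def make_reading(sensor_id: str) -> str:
    """Build one JSON temperature reading; some readings exceed the 208 °F threshold."""
    payload = {
        "sensorId": sensor_id,
        "temperature": round(random.uniform(180, 230), 1),
        "timestamp": int(time.time()),
    }
    return json.dumps(payload)

if __name__ == "__main__":
    # Emit a reading every second; a real device script would publish to TOPIC here.
    for _ in range(3):
        print(make_reading("S-01"))
        time.sleep(1)
```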
Stage 2 – MuleSoft Flow Execution
The above execution logs from Anypoint Studio show the request sent to ServiceNow and the response received from the service.
Stage 3 – ServiceNow ticket creation
The above screenshot shows the generated ticket and its details.
Note the values of the Caller, Short Description, Additional Comments and Urgency fields.
Conclusion
In conclusion, MuleSoft integration has helped achieve operational efficiency by not only capturing the temperature sensor data but also escalating an incident to the Operations team for action.
This solution could be enhanced to provide a monitoring dashboard, visualisations and insights from the sensor temperature data by importing all the sensor data into MuleSoft and using Tableau or another visualisation/insights product.
In my next blog, I will achieve the same outcome by introducing Anypoint MQ (as a reliable persistent layer to hold high-frequency sensor data streams) and MySQL & Tableau or AWS services.
To learn more about increasing operational efficiency with MuleSoft, get in touch with Devoteam and we’ll be happy to help.