
IoT in D365FO (architecture)

In today's article we are going to talk about IoT, specifically about IoT connected to D365FO, and about how to give our ERP a new layer of functionality that lets us control, monitor, and predict everything related to the connected devices in our business.



What is the purpose of this functionality?

Today we hear more and more about IoT (the Internet of Things): controlling devices from the cloud, monitoring them in real time, and so on. That is exactly what this functionality aims for: having a work environment with a series of IoT devices (signals, sensors, machines, telemetry data...) that send their information to D365FO, so that we can carry out actions and draw conclusions based on that data.

Some time ago we saw this functionality launched under the name "IoT Intelligence". It has since been improved and renamed "Sensor Data Intelligence".


What can we do with this feature?

Currently this functionality lets us analyze 5 different scenarios, although that number is expected to grow. The previous version only covered 3 of them.

Let's see them, along with what each one is for:

  • Equipment downtime: allows you to analyze the efficiency of a device by measuring its idle times.

  • Asset maintenance: minimize costs and extend the useful life of devices.

  • Machine status: allows you to analyze the performance of devices based on their readings.

  • Product quality: compare readings against ranges of values to ensure the correct quality of the products.

  • Production delays: compare the actual cycle time with the planned one to notify of delays in production operations.

Application examples:

  • A production sensor that weighs the pallets passing over it. If the weight is higher than expected, that pallet must be rejected (see the sketch after this list).

  • A presence reader on a production line that checks that production is still running. If there is a failure on the line, we are notified.

  • Measuring the usage time of a device to predict the point at which maintenance will be required.
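
As a toy illustration of the first example, here is a minimal Python sketch of the reject-by-weight logic (all weights and thresholds are made up):

import random
import time

# Simulated scale: weigh each pallet and reject any above the expected maximum.
EXPECTED_MAX_KG = 500.0  # illustrative threshold

def read_pallet_weight() -> float:
    # Stand-in for a real sensor reading.
    return random.uniform(450.0, 550.0)

for pallet_id in range(1, 6):
    weight = read_pallet_weight()
    status = "REJECT" if weight > EXPECTED_MAX_KG else "OK"
    print(f"Pallet {pallet_id}: {weight:.1f} kg -> {status}")
    time.sleep(0.1)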

So far, we could say, we have all the NON-technical information about this functionality. From this point on we analyze the technical aspects of its architecture and configuration, to understand the ins and outs of the system.


How can I activate this functionality?

This new version of the functionality changes in a very important way when it comes to activation and initial configuration. While the previous version was configured through an add-in from the LCS portal, the current version is configured directly from the D365FO environment.

So we will perform the following actions:

  1. Go to System Administration > Workspaces > Feature Management.

  2. On the All tab, use the Filter field to search for the feature called Sensor Data Intelligence.

  3. If the Sensor Data Intelligence feature is enabled on your system, select it in the list and then select Disable to disable it. You cannot use this old version of the feature together with the new one.

  4. Use the Filter field to search for the feature called (Preview) Sensor Data Intelligence.

  5. Select the feature in the list, and then select Enable Now to enable it.

Once the functionality is activated, we will configure it:

  • We log in to D365FO with an account that has administrator permissions.

  • We go to System administration > Setup > Sensor Data Intelligence > Deploy and connect Azure resources to open the wizard.

  • On the Deploy sample IoT solution to Azure page we select Deploy.


  • A new window will open that will redirect us directly to the Azure portal.

  • On the Custom Deployment page, in the Subscription field, we select the subscription that will host the resources.

  • We create a new resource group or reuse an existing one.

  • We set the Supply Chain Management Environment URL field and leave Reuse existing Azure IoT Hub unchecked.

  • We select Next: Review and create.

Once the resources have been created, we are going to make some final configurations in the D365FO environment.

If we have not closed the wizard, the next step will show these fields:


If we have closed the wizard, we only have to navigate to System administration > Setup > Sensor Data Intelligence > Sensor Data Intelligence parameters, where we will see a form with the same fields.


We need two pieces of information to fill in these fields. We go to the Azure portal and, among the resources that were created, look for these two values:


  • Client ID (of the managed identity)

  • Redis Cache (its connection string)

We copy these values and paste each one in its corresponding place.


Once these steps are done, the tool is correctly configured and we can use it. But we are not going to stop here; we are going to analyze what happened during this installation process and which elements are involved.


Azure Architecture

In the following image we can see the architecture that will be in Azure once we have implemented the resources mentioned above:

IoT sensors

First (on the left of the image) we have the physical sensors, which are in charge of sending their readings to an IoT Hub. These readings need to be sent to Azure in a particular format and must always be a "sensor-value" pair.

The format is JSON, with a structure like the following:

{
    "value": 50,
    "sensorId": "Machine1"
}

In this way, the IoT Hub is able to distribute the messages it receives among all the devices created in it.
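
As a minimal sketch of the device side, this is how a sensor could push such a reading with the Python azure-iot-device SDK (the connection string is a placeholder; the next section shows where it comes from):

import json
from azure.iot.device import IoTHubDeviceClient, Message

# Device connection string copied from the IoT Hub device; placeholder value.
CONN_STR = "HostName=<hub>.azure-devices.net;DeviceId=Machine1;SharedAccessKey=<key>"

client = IoTHubDeviceClient.create_from_connection_string(CONN_STR)
client.connect()

# Build the "sensor-value" pair exactly as in the JSON structure above.
msg = Message(json.dumps({"value": 50, "sensorId": "Machine1"}))
msg.content_type = "application/json"
msg.content_encoding = "utf-8"

client.send_message(msg)
client.shutdown()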


IoT Hub

This resource is in charge of storing and organizing all the IoT devices that we have. Conceptually, it should host as many IoT devices as real devices we have connected.

To do this we go to the "Devices" tab and create as many as we want:

The difference between them is that each one has its own connection string, and that is what we must use to configure the sending of its signals.
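
If we prefer to script this step instead of clicking through the portal, a sketch like the following, using the Python azure-iot-hub package with the hub's service connection string (all values are placeholders), could register the devices:

import base64
import os
from azure.iot.hub import IoTHubRegistryManager

# Service-side connection string of the hub (from "Shared access policies"); placeholder.
HUB_CONN_STR = "HostName=<hub>.azure-devices.net;SharedAccessKeyName=iothubowner;SharedAccessKey=<key>"

registry = IoTHubRegistryManager.from_connection_string(HUB_CONN_STR)

# One IoT Hub device per real sensor; the SAS keys are random base64 strings.
for device_id in ["Machine1", "Machine2", "Quality1"]:
    primary = base64.b64encode(os.urandom(32)).decode()
    secondary = base64.b64encode(os.urandom(32)).decode()
    registry.create_device_with_sas(device_id, primary, secondary, "enabled")

# Read back the key that goes into each device's connection string.
device = registry.get_device("Machine1")
print(device.authentication.symmetric_key.primary_key)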


Azure Stream Analytics

These Stream Analytics resources are responsible for performing the following processes:

1. They are the link between the IoT Hub (where the device signals arrive) and the destination of those signals. In particular, this information has 2 destinations:

  • Azure Redis Cache (metrics entry point to D365FO)

  • Service Bus Queue (notification management)

2. They receive data from D365FO in order to identify the scenarios and triggers configured there and to be able to fire notifications.


Thus, we can find a Stream Analytics resource for each type of scenario discussed above.

If we open one of them, we will see that there are inputs and outputs for these functions, common to all of them.

  • Inputs

    • IoT signals

    • D365FO configuration information

  • Outputs

    • Metrics to Redis Cache (to be able to visualize the signals in D365FO)

    • Messages to Service Bus (to manage possible notifications)

Azure Function

This resource is the simplest of all, but also essential. It is responsible for the actual transmission of the information between source and destination: it is the channel through which our signals travel from Stream Analytics to Redis Cache.
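
As an illustrative sketch (not the code of the deployed function), an HTTP-triggered Azure Function in Python could receive a batch of events from Stream Analytics and push them into Redis; the key scheme and the environment variable name are assumptions:

import json
import os

import azure.functions as func
import redis

# Stream Analytics posts a JSON array of events to its Azure Function output.
cache = redis.Redis.from_url(os.environ["REDIS_CONNECTION_STRING"])  # assumed setting

def main(req: func.HttpRequest) -> func.HttpResponse:
    for event in req.get_json():
        key = f"sensor:{event['sensorId']}:latest"  # hypothetical key layout
        cache.set(key, json.dumps(event), ex=3600)  # keep each value for one hour
    return func.HttpResponse(status_code=200)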


Redis Cache

Azure Cache for Redis is a fully managed in-memory cache that enables high-performance and scalable architectures. You can use it to build cloud or hybrid deployments that handle millions of requests per second with sub-millisecond latency, all with the configuration, security, and availability benefits of a managed service. It is therefore extremely useful when we are talking about transmitting simple data at a very high frequency, with real-time requirements.

In this resource we can configure the size of the cache and the lifetime of the data in it. You can think of it as a simple data store with automatic deletion.

It also offers a wide range of scaling options, as you can see in the following image:

Although, obviously, this affects the cost of the resource.

It is in this resource that our IoT data is stored so that it is accessible to D365FO. In fact, D365FO makes periodic requests to Redis Cache to fetch the data and build its graphs.
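
To get a feel for that read side, a minimal Python sketch, assuming the hypothetical key scheme used above, could poll the cache in a similar way (URL and key are placeholders):

import json
import redis

# Placeholder Azure Cache for Redis URL (TLS port 6380).
cache = redis.Redis.from_url("rediss://:<access-key>@<name>.redis.cache.windows.net:6380/0")

raw = cache.get("sensor:Machine1:latest")  # hypothetical key
if raw is not None:
    reading = json.loads(raw)
    print(reading["sensorId"], reading["value"])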


Storage Blob

This resource is simply a storage account where, through a Logic App, we store the D365FO configurations so that the Stream Analytics jobs can later compare the real data against those configurations and generate alerts.

The content of the storage account is as follows:

As you can see, each folder stores the information for one scenario, so that Azure has this data available.

For example, if we configure the product quality scenario, we will have a file similar to this one with the information of that configuration:

The content of one of these files is as follows:


[
    {
        "@odata.etag": "W/\"JzEsNTYzNzE0NjA4MTsxLDY4NzE5NDc2NzYyJw==\"",
        "SensorId": "MachineStatus1",
        "JobId": "005863",
        "JobDataAreaId": "usmf",
        "ItemNumber": "P0111",
        "MaximumAttributeTolerance": 30,
        "OptimalAttributeValue": 25,
        "JobRegistrationStartDateTime": "2022-09-22T02:50:40Z",
        "OrderId": "B000052",
        "IsJobCompleted": "No",
        "MinimumAttributeTolerance": 10,
        "JobRegistrationStopDateTime": "1900-01-01T00:00:00Z",
        "AttributeName": "Concentration"
    }
]


In it we can see the name of the sensor, the company, the item, the tolerance margins, the associated production order, the attribute we are going to measure...

With this information, all that is needed is a comparison with the value obtained from the sensor to know whether it is outside or within tolerances.
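
A minimal Python sketch of that comparison, using the fields from the file above (the file name and the sample signal are made up):

import json

# Load the first product quality configuration (illustrative file name).
with open("product-quality-config.json") as f:
    config = json.load(f)[0]

reading = {"value": 50, "sensorId": "MachineStatus1"}  # sample signal

if config["SensorId"] == reading["sensorId"]:
    in_tolerance = (
        config["MinimumAttributeTolerance"]
        <= reading["value"]
        <= config["MaximumAttributeTolerance"]
    )
    print("within tolerance" if in_tolerance else "OUT OF TOLERANCE -> notify")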


Logic App (streaming data from D365FO to Azure)

This resource is the most complex to analyze, although its function is quite simple to understand. This Logic App is responsible for taking the configurations from D365FO and transferring them to the storage mentioned above. How does it do it? Just take a look at the following diagram to appreciate the complexity of the matter.

It is impossible to make anything out in this image, so here is a spoiler: there is one branch per scenario. Let's focus on just one; we can then mentally replicate it for the others.

Let's take the example of the product quality scenario.

First of all, there is a series of requests that access D365FO and return the list of active, configured scenarios.

Once these are obtained (a step common to all scenarios), the flow performs its own actions for each scenario; in the case of product quality:

Without going into much detail, what this flow does is compare the configurations that already exist in the Storage and create whatever is needed to keep the stored configuration current. By default, the delay between configuration updates is 2 minutes, although we can easily modify it.
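
A rough Python sketch of the kind of OData request involved (the entity name is hypothetical, and the Azure AD bearer token is assumed to have been obtained separately):

import requests

D365FO_URL = "https://yourenvironment.operations.dynamics.com"  # placeholder
ENTITY = "SensorJobProductQualityConfigurations"  # hypothetical entity name
TOKEN = "<Azure AD bearer token for the D365FO resource>"

resp = requests.get(
    f"{D365FO_URL}/data/{ENTITY}",
    headers={"Authorization": f"Bearer {TOKEN}", "Accept": "application/json"},
    timeout=30,
)
resp.raise_for_status()

# Each entry has the same shape as the JSON file shown earlier.
for cfg in resp.json()["value"]:
    print(cfg["SensorId"], cfg["OrderId"], cfg["AttributeName"])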


Service Bus

This service is in charge of queuing all the notifications detected by Stream Analytics so that they can be sent to D365FO in order.


Logic App (notifications)

Once we have the notifications in the Service Bus, we need something to deliver these alerts to D365FO. This Logic App takes care of that.

Through a POST to D365FO it raises the alerts, and it marks the messages as completed so that the queue empties and no message is sent twice.
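
In Python, a rough equivalent of that receive-and-complete cycle with the azure-servicebus package could look like this (connection string and queue name are placeholders):

from azure.servicebus import ServiceBusClient

CONN_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<policy>;SharedAccessKey=<key>"

with ServiceBusClient.from_connection_string(CONN_STR) as client:
    with client.get_queue_receiver(queue_name="<notifications-queue>") as receiver:
        for msg in receiver.receive_messages(max_message_count=10, max_wait_time=5):
            print(str(msg))                 # here the Logic App would POST the alert to D365FO
            receiver.complete_message(msg)  # remove the message from the queue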


With this knowledge, could I already use IoT with D365FO?

Of course the answer is yes. But in this article we have not yet seen how it operates or how it is configured. For that, we invite you to read the following post, where there will also be some surprises.

In the next post we will configure scenarios, see how the results are displayed in D365FO, and also see how we can add a better visualization layer with external tools. And, as promised, a surprise or two will appear.
