MindConnect Services provide APIs that enable shop floor devices to send data securely and reliably. Custom applications (agents) collect and upload data, which is then stored in the cloud and used by cloud applications.
To access this service, you need the respective roles listed in MindConnect API roles and scopes.
Agents need a field-side network infrastructure that forwards and routes outbound HTTPS requests to the Internet.
MindConnect supports multiple agent device classes, from powerful hardware platforms to resource-constrained devices. All target agent platforms must meet the following minimum requirements:
- HTTP processing
- JSON parsing
- JSON Web Token (JWT) generation
- HMAC generation (preferably SHA2 based hashing)
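The last two requirements go together: agents typically sign a JSON Web Token with an HMAC. The sketch below shows what HS256 (HMAC-SHA256) JWT generation looks like using only the Python standard library. The claim names and values (`aud`, `ten`, the one-hour validity) are illustrative assumptions; the exact claim set an agent must send is defined by the MindConnect API reference.

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    """Base64url-encode without padding, as the JWT spec requires."""
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode("ascii")

def make_jwt(shared_secret: str, client_id: str, tenant: str) -> str:
    """Build an HS256-signed JWT. The claim set shown here is illustrative;
    consult the MindConnect API reference for the required claims."""
    header = {"alg": "HS256", "typ": "JWT"}
    now = int(time.time())
    claims = {
        "iss": client_id,      # issuer: the agent's client ID (assumed claim)
        "sub": client_id,
        "iat": now,
        "exp": now + 3600,     # one-hour validity (assumed)
        "ten": tenant,         # tenant claim (assumed name)
    }
    signing_input = (
        b64url(json.dumps(header).encode())
        + "."
        + b64url(json.dumps(claims).encode())
    )
    # HMAC-SHA256 over the signing input, keyed with the shared secret
    signature = hmac.new(
        shared_secret.encode(), signing_input.encode(), hashlib.sha256
    ).digest()
    return signing_input + "." + b64url(signature)

token = make_jwt("my-shared-secret", "agent-01", "tenant1")
```

Resource-constrained devices can implement the same three steps (base64url, JSON serialization, HMAC-SHA256) with any embedded crypto library that provides SHA-2 hashing.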
Users with IP-based filtering on their firewall can whitelist the following two static IPs for the data upload traffic:
Enabling these two IP addresses is sufficient for agents and clients accessing *.eu1.mindsphere.io.
Region China 1¶
To access *.cn1.mindsphere-in.cn, add the following IP address to your firewall rules instead:
If the agent or client uses certificate revocation list URLs, these must be whitelisted as well. Note that this does not cover the interactive login process via WebKey or native access to IDL (AWS S3).
An SSL/TLS implementation may check the revocation lists of Certificate Authorities. If it fetches the certificate revocation list at runtime, it needs access to an external URL, which the CA uses to distribute revoked certificates. In practice, however, revocation lists are more often updated through operating system or Java updates.
Data Source Configuration¶
A data source configuration is needed so that Insights Hub can interpret the data it receives from an agent. This configuration contains data sources and data points. Data sources are logical groups, e.g. a sensor or a machine, which contain one or more measurable data points, e.g. temperature or pressure.
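The grouping described above can be sketched as a JSON document: one data source (a sensor) containing two data points. The field names and types shown here are illustrative assumptions, not the exact schema of the MindConnect API.

```python
# Illustrative shape of a data source configuration; the exact field
# names and schema are defined by the MindConnect API and may differ.
data_source_configuration = {
    "configurationId": "config-001",          # hypothetical identifier
    "dataSources": [
        {
            "name": "TemperatureSensor",      # logical group: a sensor
            "description": "Sensor on spindle 1",
            "dataPoints": [
                {
                    "id": "DP-Temp-1",        # measurable data point
                    "name": "temperature",
                    "type": "DOUBLE",
                    "unit": "C",
                },
                {
                    "id": "DP-Pressure-1",
                    "name": "pressure",
                    "type": "DOUBLE",
                    "unit": "kPa",
                },
            ],
        }
    ],
}

# Every data point belongs to exactly one data source group.
all_points = [
    dp["id"]
    for ds in data_source_configuration["dataSources"]
    for dp in ds["dataPoints"]
]
```

The data point IDs are what the agent later references when uploading values, so they must be stable across the configuration and the uploaded payloads.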
Data Point Mapping¶
Data point mapping is required for Insights Hub to store the data it receives from an agent. It maps the data points from the data source configuration to properties of the digital entity that represents the agent. When Insights Hub receives data from an agent, it looks up which property each data point is mapped to and stores the data there.
Use the MindConnect Service for defining the data point mapping. For instructions refer to Creating a Data Point Mapping.
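The lookup described above can be sketched as a simple table from data point IDs to asset properties. The mapping fields (`entityId`, `propertySet`, `property`) and the routing function are hypothetical illustrations of the behavior, not the MindConnect API itself.

```python
# Hypothetical mapping table from data point IDs to asset properties;
# real mappings are created via the MindConnect data point mapping API.
mappings = {
    "DP-Temp-1": {
        "entityId": "asset-123",
        "propertySet": "Status",
        "property": "temperature",
    },
}

def route_value(data_point_id, value):
    """Mimic the lookup performed on ingest: find the property the data
    point is mapped to and return where the value would be stored."""
    m = mappings.get(data_point_id)
    if m is None:
        return None  # unmapped data points cannot be stored
    return (m["entityId"], f'{m["propertySet"]}.{m["property"]}', value)

routed = route_value("DP-Temp-1", 71.4)
```

One mapping per data point is the simple case; the mapping API determines whether a data point may feed multiple properties.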
Events originating from a device in the field are sent to Insights Hub using the MindConnect APIs and are stored in the corresponding agent asset. For example, if a field device is connected to Insights Hub via an MCLIB agent, all events originating from the field device are stored in the MCLIB (core.mclib type) agent. Using the event mapping APIs, these events can be mapped to the appropriate asset in Insights Hub. The API allows you to define mapping criteria, for example: if the event's source field contains "MyMachine", map the event to the asset given by the asset id field in the API. Once the mapping is in place, every event of the selected type that reaches the agent is automatically stored against the mapped asset. Otherwise, the event remains in the agent itself.
- If there are no event mappings matching an event, that event will be stored in the agent asset.
- Multiple mappings can match for an event uploaded. In such a case, all matching mappings will be applied for the event.
- An asset can have multiple event mappings.
- A maximum of 50 event mappings can be created per agent.
- A maximum of 5 event mappings can be created per agent for each event type.
Use the MindConnect Service for defining the event mapping. For instructions refer to Creating Event Mapping.
The MindConnect API allows agents to upload their data to Industrial IoT. The data can be of the following types:
- Time Series
The format conforms to a subset of the HTTP multipart specification, but only permits two levels of nesting. For instructions refer to Uploading Agent Data.
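A two-level multipart body means the outer multipart may contain parts that are themselves multipart, but no deeper. The sketch below builds such a body with plain string assembly; the boundary strings, part headers, and content types are illustrative assumptions rather than the exact format mandated by the exchange specification.

```python
import json

def part(headers: dict, body: str) -> str:
    """Render one MIME part: headers, blank line, body."""
    return "".join(f"{k}: {v}\r\n" for k, v in headers.items()) + "\r\n" + body

def multipart(boundary: str, parts: list) -> str:
    """Join parts with boundary delimiters and a closing delimiter."""
    return (
        "".join(f"--{boundary}\r\n{p}\r\n" for p in parts)
        + f"--{boundary}--\r\n"
    )

timeseries = json.dumps([{
    "timestamp": "2024-01-01T00:00:00Z",
    "values": [{"dataPointId": "DP-Temp-1", "value": "71.4"}],
}])

# Level 2: an inner multipart pairing a metadata part with a data part.
# Content types here are placeholders, not the official media types.
inner = multipart("inner", [
    part({"Content-Type": "application/json"}, "{}"),        # metadata
    part({"Content-Type": "application/json"}, timeseries),  # data
])

# Level 1: the outer multipart wraps the inner one; nesting the inner
# multipart any deeper would exceed the two-level limit.
body = multipart("outer", [
    part({"Content-Type": "multipart/related; boundary=inner"}, inner),
])
```

In practice an HTTP client library would set the outer `Content-Type` header with the boundary parameter; the string-building here only illustrates the nesting constraint.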
Standard data types¶
The MindConnect Service uses standard data types, which allow Industrial IoT to automatically process the data without additional configuration or coding. This means:
- The API defines how standard data types are transmitted, e.g. how metadata and production data need to be formatted as HTTPs payloads.
- Standard data types are automatically parsed and the information is stored to (virtual) assets.
- For each of the standard data types, there is a preconfigured mass data storage available.
- Data of standard types can be accessed and queried in a standardized way by applications and analytical tools.
The following standard data types for production data are supported:
- Time Series
Time Series are data point values that change constantly over time, e.g. values from analog sensors like a temperature sensor. This also applies to any other measured values that have an associated timestamp.
- Events
Events are based on machine events, e.g. emergency stops or machine failures. However, this mechanism can also be used to upload custom notifications, e.g. if you do on-site threshold monitoring and want to report a broken threshold.
- Files
Files of up to 10 MB can be uploaded per exchange call. The files are attached to the corresponding (virtual) asset, e.g. device log files or complex sensor structures. Files that are uploaded can be referenced by the parent (virtual) asset. The content of these files is not parsed; interpreting and visualizing the data requires custom applications or analytical tools.
- Data Models
Data models describe the agent-side asset hierarchy and configuration including measurement points.
The MindConnect Service exposes its API to agents to perform the following tasks:
- Upload time series
- Upload files
- Describe and upload asset data models
- Upload data of custom data types for custom handling
The manager of a wind farm wants to collect sensor data of a wind turbine. A developer implements a field application (agent) which collects the sensor data. The agent uses the MindConnect API for uploading the data to Industrial IoT.
Except where otherwise noted, content on this site is licensed under the Development License Agreement.