AWS IoT TwinMaker makes it easier for developers to create digital twins of real-world systems such as buildings and factories, with the ability to use existing data from multiple sources.
AWS IoT TwinMaker uses a connector-based architecture so that you can connect data from your own data source to AWS IoT TwinMaker without needing to re-ingest or move the data to another location. AWS IoT TwinMaker provides built-in data connectors for AWS services such as AWS IoT SiteWise and Amazon Kinesis Video Streams. You can also create custom data connectors to use with other AWS or third-party data sources, such as Amazon Timestream, Amazon DynamoDB, Snowflake, and Siemens MindSphere.
In this blog, you'll learn how to use your own data source in AWS IoT TwinMaker using the AWS IoT TwinMaker data connector interface.
Overview
The connection between a data source and AWS IoT TwinMaker is described in components. A component accesses an external data source by using a Lambda connector, which is a Lambda function that you specify in the component definition.
Here are the steps to create a data connector for Amazon DynamoDB, using a Schema initializer connector to fetch the properties from the underlying data source and a DataReader connector to get the time-series values of those properties. Once the data connector is created, you'll get instructions on how to create a component for this data connector and attach it to entities.
Amazon DynamoDB is used as the data source in this post, but the concepts described are applicable to any other data source.
Prerequisites
To set up and execute the steps in this blog, you need the following:
- An AWS account. If you don't have one, see Set up an AWS account.
- An AWS IAM Identity Center (successor to AWS Single Sign-On) user with the permissions to create the resources described in the blog.
- Read the section What is AWS IoT TwinMaker? of the documentation to understand the key concepts of AWS IoT TwinMaker.
Walkthrough
In this walkthrough, you'll perform 6 steps in order to connect your Amazon DynamoDB data source to AWS IoT TwinMaker:
- Create a DynamoDB table. This table is only for the purpose of this post. You can easily adapt the instructions to use an existing database.
- Create a Lambda function for the Schema initializer connector.
- Create a Lambda function for the DataReader. You'll have to give the function's execution role the permissions to read from the table.
- Create a TwinMaker workspace. You'll have to add to the workspace role the permissions to invoke both functions.
- Create a TwinMaker component.
- Test the component. Before testing the component, you'll create a TwinMaker entity and attach the component to the entity.
Step 1: Create a DynamoDB table
For the purpose of this post, you'll create a DynamoDB table named TwinMakerTable that contains the key thingName of type String as partition key and the key timestamp of type Number as sort key. See how to create a DynamoDB table for more information.
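As a sketch, the table's key schema corresponds to the following CreateTable parameters, shown here as a plain object you could pass to the AWS SDK's DynamoDB client or mirror in the console (the billing mode is an assumption, not part of this walkthrough):

```javascript
// Key schema for TwinMakerTable: thingName (String) as partition key,
// timestamp (Number) as sort key.
const createTableParams = {
  TableName: 'TwinMakerTable',
  KeySchema: [
    { AttributeName: 'thingName', KeyType: 'HASH' },  // partition key
    { AttributeName: 'timestamp', KeyType: 'RANGE' }  // sort key
  ],
  AttributeDefinitions: [
    { AttributeName: 'thingName', AttributeType: 'S' }, // String
    { AttributeName: 'timestamp', AttributeType: 'N' }  // Number
  ],
  BillingMode: 'PAY_PER_REQUEST' // assumption: on-demand capacity
};

console.log(createTableParams.TableName);
```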
The table you created will store air quality measurements from sensors. You'll keep it simple for this post and create items in the table corresponding to measurements from a sensor identified by its name (stored as the partition key thingName). In addition to the name of the sensor, each measurement has the following properties of type Number: timestamp (stored as the sort key timestamp, which is the Unix timestamp of the measurement in milliseconds), temperature, humidity, and co2.
Let's create 5 items in the table, corresponding to 5 measurements of a sensor named airTwin. For the timestamp, you can get the current timestamp in milliseconds from this website and then derive 5 timestamps by subtracting 10000 per measurement. You can then enter random values for the properties: temperature, humidity, and co2. See Write data to a table using the console to learn more.
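The 5 items described above can be sketched as follows. The helper below is hypothetical (not part of the console-based steps) and only illustrates the shape of the items; you can enter the same data by hand in the console:

```javascript
// Sketch: build 5 sample measurements for the airTwin sensor, 10000 ms apart.
function buildSampleItems(now) {
  const items = [];
  for (let i = 0; i < 5; i++) {
    items.push({
      thingName: 'airTwin',                 // partition key
      timestamp: now - i * 10000,           // sort key, Unix time in ms
      temperature: 20 + Math.random() * 5,  // random sample values
      humidity: 40 + Math.random() * 20,
      co2: 400 + Math.random() * 100
    });
  }
  return items;
}

console.log(buildSampleItems(Date.now()));
```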
Now that you have the table created with data, you'll create two Lambda functions: the first one for the Schema initializer connector and the second one for the DataReader connector.
Step 2: Create a Schema initializer connector
The Schema initializer connector is a Lambda function used in the component type or entity lifecycle to fetch the component type or component properties from the underlying data source. You'll create a Lambda function that returns the schema of the TwinMakerTable.
You create a Node.js Lambda function using the Lambda console.
- Open the Functions page.
- On the Lambda console, choose Create function.
- Under Basic information, do the following:
  - For Function name, enter TwinMakerDynamoSchemaInit.
  - For Runtime, confirm that Node.js 16.x is selected.
- Choose Create function.
- Under Function code, in the inline code editor, copy/paste the following code and choose Deploy:
exports.handler = async (event) => {
    let result = {
        properties: {
            temperature: {
                definition: {
                    dataType: {
                        type: "DOUBLE"
                    },
                    isTimeSeries: true
                }
            },
            humidity: {
                definition: {
                    dataType: {
                        type: "DOUBLE"
                    },
                    isTimeSeries: true
                }
            },
            co2: {
                definition: {
                    dataType: {
                        type: "DOUBLE"
                    },
                    isTimeSeries: true
                }
            }
        }
    }
    return result
}
This function returns the definition of each property of our table and specifies its type. In this case, all properties are of type "DOUBLE" and are time-series data. You can check the valid types in the documentation.
Note: here the properties are hard-coded in the function. You could design a function that automatically retrieves the properties and their types from an item, for example.
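Such a dynamic variant could be sketched as follows. This helper is hypothetical and assumes you have already fetched a sample item from the table; it simply maps the item's non-key attributes to TwinMaker property definitions:

```javascript
// Sketch: derive the schema from a sample item instead of hard-coding it.
// The table keys are excluded, since they are not component properties.
const TABLE_KEYS = ['thingName', 'timestamp'];

function schemaFromItem(item) {
  const properties = {};
  for (const [name, value] of Object.entries(item)) {
    if (TABLE_KEYS.includes(name)) continue;
    properties[name] = {
      definition: {
        // Assumption: numeric attributes become DOUBLE, others STRING.
        dataType: { type: typeof value === 'number' ? 'DOUBLE' : 'STRING' },
        isTimeSeries: true
      }
    };
  }
  return { properties };
}

// A sample measurement yields the same schema as the hard-coded function above.
const schema = schemaFromItem({
  thingName: 'airTwin', timestamp: 1, temperature: 21.5, humidity: 45, co2: 420
});
console.log(Object.keys(schema.properties));
```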
Now let's create the DataReader connector.
Step 3: Create a DataReader connector
DataReader is a data plane connector that's used to get the time-series values of properties in a single component.
You create a Node.js Lambda function using the Lambda console.
- Open the Functions page.
- On the Lambda console, choose Create function.
- Under Basic information, do the following:
  - For Function name, enter TwinMakerDynamoDataReader.
  - For Runtime, confirm that Node.js 16.x is selected.
- Choose Create function.
- Under Function code, in the inline code editor, copy/paste the following code and choose Deploy:
const TABLE = 'TwinMakerTable'
const aws = require('aws-sdk')
const dynamo = new aws.DynamoDB.DocumentClient()

exports.handler = async (event) => {
    try {
        let { workspaceId, entityId, componentName, selectedProperties, startTime, endTime } = event
        // Query the table for the selected properties in the requested time range
        const { Items } = await dynamo.query({
            TableName: TABLE,
            ProjectionExpression: `${selectedProperties}, #tmsp`,
            KeyConditionExpression: `thingName = :hashKey AND #tmsp BETWEEN :startTime AND :endTime`,
            ExpressionAttributeNames: {
                '#tmsp': 'timestamp'
            },
            ExpressionAttributeValues: {
                ':hashKey': entityId,
                ':startTime': (new Date(startTime)).getTime(),
                ':endTime': (new Date(endTime)).getTime()
            }
        }).promise()
        let results = { propertyValues: [] }
        let res = {}
        // Group the returned items by property name
        Items.forEach(item => {
            selectedProperties.forEach(prop => {
                if (!res[prop]) {
                    res[prop] = {
                        entityPropertyReference: {
                            propertyName: prop,
                            componentName,
                            entityId: event.entityId
                        },
                        values: []
                    }
                }
                res[prop].values.push({
                    time: (new Date(item['timestamp'])).toISOString(),
                    value: { doubleValue: item[prop] }
                })
            })
        })
        for (let key in res) {
            results.propertyValues.push(res[key])
        }
        console.log(results)
        return results
    } catch (e) {
        console.log(e)
    }
}
The TwinMaker component will use this DataReader connector to fetch the data from the DynamoDB table. The component provides two properties in the request, startTime and endTime (in ISO-8601 timestamp format), which the connector uses to fetch only the data in this time range. You can check the request and response interfaces in the Data connectors section of the documentation.
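To make those interfaces concrete, here is a rough sketch of the request event the component sends and the response shape the connector must return. All field values are illustrative, based on the names used in this walkthrough:

```javascript
// Illustrative request event sent to the DataReader Lambda connector
const sampleRequest = {
  workspaceId: 'AirWorkspace',
  entityId: 'airTwin',
  componentName: 'dynamoAirComponent',
  selectedProperties: ['temperature'],
  startTime: '2022-11-01T00:00:00Z', // ISO-8601
  endTime: '2022-11-02T00:00:00Z'
};

// Illustrative response shape the connector returns
const sampleResponse = {
  propertyValues: [
    {
      entityPropertyReference: {
        entityId: 'airTwin',
        componentName: 'dynamoAirComponent',
        propertyName: 'temperature'
      },
      values: [
        { time: '2022-11-01T10:00:00.000Z', value: { doubleValue: 21.5 } }
      ]
    }
  ]
};

console.log(sampleResponse.propertyValues.length);
```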
Before moving to the next step, you must grant the function access to the table. See Allow a Lambda function to access an Amazon DynamoDB table to learn more.
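For example, a minimal policy attached to the function's execution role could look like the following (the exact resource ARN and action set are assumptions; replace the region and account ID placeholders with your own values):

```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "dynamodb:Query",
            "Resource": "arn:aws:dynamodb:{{AWS_REGION}}:{{ACCOUNT_ID}}:table/TwinMakerTable"
        }
    ]
}
```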
Now you can move on to creating a workspace in AWS IoT TwinMaker.
Step 4: Create a Workspace in AWS IoT TwinMaker
On the AWS IoT TwinMaker console, create a workspace named AirWorkspace. You can follow the instructions in the section Create a workspace of the AWS IoT TwinMaker documentation.
Once the workspace is created, you should have an Amazon Simple Storage Service (Amazon S3) bucket created. AWS IoT TwinMaker will use this bucket to store information and resources related to the workspace.
You should also have an IAM role created. This role allows the workspace to access resources in other services on your behalf.
Before creating the component, you must give the workspace role permissions to invoke both Lambda functions (created in the previous steps). See Permissions for a connector to an external data source for an example of giving the service role permission to use a Lambda function.
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "lambda:InvokeFunction",
            "Resource": [
                "arn:aws:lambda:{{AWS_REGION}}:{{ACCOUNT_ID}}:function:TwinMakerDynamoDataReader",
                "arn:aws:lambda:{{AWS_REGION}}:{{ACCOUNT_ID}}:function:TwinMakerDynamoSchemaInit"
            ]
        }
    ]
}
Now you can create your component.
Step 5: Create an AWS IoT TwinMaker component
Select the workspace you have created. In the workspace, choose Component types and then choose Create component type.
Copy the following JSON document into the Request section, and replace the ARNs of the DataReader and Schema initializer functions with the ones you created before:
{
    "componentTypeId": "com.dynamodb.airQuality",
    "description": "Connector for DynamoDB – Use case Air Quality",
    "propertyDefinitions": {
    },
    "functions": {
        "dataReader": {
            "implementedBy": {
                "lambda": {
                    "arn": "arn:aws:lambda:{{AWS_REGION}}:{{ACCOUNT_ID}}:function:TwinMakerDynamoDataReader"
                }
            }
        },
        "schemaInitializer": {
            "implementedBy": {
                "lambda": {
                    "arn": "arn:aws:lambda:{{AWS_REGION}}:{{ACCOUNT_ID}}:function:TwinMakerDynamoSchemaInit"
                }
            }
        }
    }
}
Choose Create component type. Now that the component is created, you can create an entity to test it.
Step 6: Create an entity and test the component
You'll now create an entity and attach the component you created to it.
- On the Workspaces page, choose your workspace, and then in the left pane choose Entities.
- On the Entities page, choose Create, and then choose Create entity.
- In the Create an entity window, enter airTwin for both the entity name and the entity ID.
- Choose Create entity.
- On the Entities page, choose the entity you just created, and then choose Add component.
- Enter a name for the component. You can call it dynamoAirComponent.
- In Type, select the component com.dynamodb.airQuality created previously.
- Choose Add component.
The component is attached to the entity with the ID airTwin. Now the only step that remains is to test the component. When you test the component (or call the GetPropertyValueHistory API action), the component sends the DataReader Lambda connector a request that includes the entity ID. The Lambda connector uses this ID to query the measurements of the sensor whose name corresponds to the ID. In this case, these will be measurements from the airTwin sensor.
- On the Entities page, choose the entity airTwin, and then select the component com.dynamodb.airQuality.
- Choose Actions, then View component details.
- In the Test tab, select the properties you want to retrieve and a time range. Make sure the selected time range includes the timestamps of the measurements.
- Finally, choose Run test to test the component.
You should see the measurements of your sensors in the Time-series result section.
You can now call the GetPropertyValueHistory API action to retrieve the measurements from your sensors stored in your DynamoDB table.
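With the AWS CLI, such a call could look roughly like the following. The option names are assumptions derived from the GetPropertyValueHistory API parameters, and the time range is illustrative; check the CLI reference for your installed version:

```shell
# Sketch: fetch the history of the walkthrough's properties for airTwin
# (replace the time range with one that covers your measurements).
aws iottwinmaker get-property-value-history \
  --workspace-id AirWorkspace \
  --entity-id airTwin \
  --component-name dynamoAirComponent \
  --selected-properties temperature humidity co2 \
  --start-time "2022-11-01T00:00:00Z" \
  --end-time "2022-11-02T00:00:00Z"
```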
Cleaning up
To avoid incurring future charges, delete the resources created during this walkthrough.
Conclusion
AWS IoT TwinMaker provides a unified data access API to read from and write to your digital twin's source data. You can use your existing data sources without the need to move your data.
In this blog, you learned how to connect an Amazon DynamoDB table to AWS IoT TwinMaker. The concepts described are applicable to your other data sources. You can also combine multiple data sources to enrich your digital twin applications.
If you want to see an example of a solution using AWS IoT TwinMaker and Amazon S3 as a data source, watch the video Build a Digital Twin using the Smart Territory Framework and AWS IoT TwinMaker on YouTube. You can also visit the related GitHub repository to check the code.
About the Author
Ali is a Technology Evangelist for IoT and Smart Cities at Amazon Web Services. With over 12 years of experience in IoT and Smart Cities, Ali brings his technical expertise to enable and support AWS partners and customers in accelerating their IoT and Smart Cities projects. Ali also holds an executive MBA, giving him the ability to zoom out and support customers and partners at a strategic level.