Introduction
As industrial and manufacturing companies embark on their digital transformation journey, they want to leverage advanced technologies for increased efficiency, productivity, quality control, flexibility, cost reduction, supply chain optimization, and competitive advantage in the rapidly evolving digital era. AWS customers in the manufacturing and industrial space increasingly use AWS IoT SiteWise to modernize their industrial data strategy and unlock the full potential of their operational technology. AWS IoT SiteWise empowers you to efficiently collect, store, organize, and monitor data from industrial equipment at scale. It also enables you to derive actionable insights, optimize operations, and drive innovation through data-driven decisions.
The journey often begins with a Proof of Value (PoV) case study in a development environment. This approach gives you an opportunity to explore how data collection and asset modeling with a solution that includes AWS IoT SiteWise can help. As you become comfortable with the solution, you may scale more assets or facilities into a production environment from staging over time. This blog post provides an overview of the architecture and sample code to migrate the assets and data in AWS IoT SiteWise from one deployment to another, while ensuring data integrity and minimizing operational overhead.
Getting started with AWS IoT SiteWise
During the PoV phase, you establish data ingestion pipelines to stream near real-time sensor data from on-premises data historians, or OPC UA servers, into AWS IoT SiteWise. You can create asset models that digitally represent your industrial equipment to capture the asset hierarchy and important metadata within a single facility or across multiple facilities. AWS IoT SiteWise provides API operations that help you import your asset model data (metadata) in bulk from various systems, such as process historians, into AWS IoT SiteWise at scale. Additionally, you can define common industrial key performance indicators (KPIs) using the built-in library of operators and functions available in AWS IoT SiteWise. You can also create custom metrics that are triggered by equipment data on arrival or computed at user-defined intervals.
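As a concrete illustration of that modeling step, the following minimal sketch creates an asset model with a raw measurement and an hourly average metric using the boto3 CreateAssetModel operation. The model and property names (Boiler, Temperature, AvgTemperature) are hypothetical, only a subset of the API parameters is shown, and the by-name property reference in the metric variable assumes the behavior described in the CreateAssetModel documentation; substitute a property ID if your setup requires it.

import boto3

sitewise = boto3.client('iotsitewise')

# Hypothetical example: a boiler model with a raw temperature measurement
# and an hourly average metric computed by AWS IoT SiteWise.
response = sitewise.create_asset_model(
    assetModelName='Boiler',
    assetModelDescription='Example model for a boiler asset',
    assetModelProperties=[
        {
            'name': 'Temperature',
            'dataType': 'DOUBLE',
            'unit': 'Celsius',
            'type': {'measurement': {}},
        },
        {
            'name': 'AvgTemperature',
            'dataType': 'DOUBLE',
            'unit': 'Celsius',
            'type': {
                'metric': {
                    'expression': 'avg(temp)',
                    'variables': [
                        {
                            'name': 'temp',
                            # References the measurement above by name; use a
                            # property ID instead if needed.
                            'value': {'propertyId': 'Temperature'},
                        }
                    ],
                    'window': {'tumbling': {'interval': '1h'}},
                }
            },
        },
    ],
)
print(response['assetModelId'])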
Setting up multiple non-production environments on a factory floor can be challenging due to legacy networking and strict regulations associated with the plant floor, along with delays in hardware procurement. Many customers transition the same hardware from non-production to production by designating and certifying the hardware for production use after validation completes.
To accelerate and streamline the deployment process, you need a well-defined approach to migrate your AWS IoT SiteWise resources (assets, hierarchies, metrics, transforms, time series, and metadata) between AWS accounts as part of your standard DevOps practices.
AWS IoT SiteWise stores data across storage tiers that can support training machine learning (ML) models or historical data analysis in production. In this blog post, we outline how to migrate the asset models, asset hierarchies, and historical time series data from the development environment to the staging and production environments that are hosted on AWS.
Solution Walkthrough
Let's begin by discussing the technical aspects of migrating AWS IoT SiteWise resources and data between AWS accounts. We provide a step-by-step guide on how to export and import asset models and hierarchies using AWS IoT SiteWise APIs. We also discuss how to transfer historical time series data using Amazon Simple Storage Service (Amazon S3) and the AWS IoT SiteWise CreateBulkImportJob API operation.
By following this approach, you can promote your AWS IoT SiteWise setup and data through the development lifecycle as you scale your industrial IoT applications into production. The following is an overview of the process:
- AWS IoT SiteWise metadata transfer:
  - Export AWS IoT SiteWise models and assets from one AWS account (the development account) by running a bulk export job. You can use filters to export the models and/or assets.
  - Import the exported models and/or assets into a second AWS account (the staging account) by running a bulk import job. The import data must follow the AWS IoT SiteWise metadata transfer job schema.
- AWS IoT SiteWise telemetry data transfer. Use the following API operations to migrate telemetry data across accounts:
  - BatchGetAssetPropertyValueHistory retrieves historical telemetry data from the development account.
  - CreateBulkImportJob ingests the retrieved telemetry data into the staging account.
The data migration steps in our solution make the following assumptions:
- The staging account does not have AWS IoT SiteWise assets or models configured that use the same name or hierarchy as the development account (the sketch after this list shows one way to check this).
- You will replicate the AWS IoT SiteWise metadata from the development account to the staging account.
- You will move the AWS IoT SiteWise telemetry data from the development account to the staging account.
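As a quick sanity check for the first assumption, you can list the asset model names that already exist in the staging account before importing anything. This is a small illustrative sketch; the staging profile name is a hypothetical placeholder and assumes credentials for the staging account are configured locally.

import boto3

# Assumes a named AWS CLI profile for the staging account; adjust as needed.
session = boto3.Session(profile_name='staging')
sitewise = session.client('iotsitewise')

# Collect existing asset model names so you can spot collisions with the
# models you plan to import from the development account.
existing_models = []
paginator = sitewise.get_paginator('list_asset_models')
for page in paginator.paginate():
    existing_models.extend(summary['name'] for summary in page['assetModelSummaries'])

print(existing_models)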
1: Migrate AWS IoT SiteWise models and assets across AWS accounts
AWS IoT SiteWise supports bulk operations with assets and models. The metadata bulk operations help you to:
- Export AWS IoT SiteWise models and assets from the development account by running a bulk export job. You can choose what to export when you configure this job. For more information, see Export metadata examples. You can:
  - Export all assets and asset models, and filter your assets and asset models.
  - Export assets and filter your assets.
  - Export asset models and filter your asset models.
- Import AWS IoT SiteWise models and assets into the staging account by running a bulk import job. Similar to the export job, you can choose what to import. For more information, see Import metadata examples. The import data must follow a specific format; see the AWS IoT SiteWise metadata transfer job schema. A sketch of both jobs follows this list.
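The export and import jobs are both created with the CreateMetadataTransferJob API operation, which is exposed through the AWS IoT TwinMaker service endpoint. The following is a minimal sketch, not the complete solution: it exports all AWS IoT SiteWise models and assets from the development account to an S3 bucket and then imports them into the staging account. The bucket name, profile names, and job IDs are hypothetical, and the exact request fields (including optional filters) should be checked against the metadata transfer job documentation.

import boto3

# Hypothetical S3 bucket that both accounts can access.
BUCKET_ARN = 'arn:aws:s3:::sitewise-metadata-transfer-bucket'

# Export from the development account: AWS IoT SiteWise -> Amazon S3.
dev_twinmaker = boto3.Session(profile_name='development').client('iottwinmaker')
dev_twinmaker.create_metadata_transfer_job(
    metadataTransferJobId='sitewise-metadata-export',
    sources=[{'type': 'iotsitewise'}],
    destination={'type': 's3', 's3Configuration': {'location': BUCKET_ARN}},
)

# Import into the staging account: Amazon S3 -> AWS IoT SiteWise.
staging_twinmaker = boto3.Session(profile_name='staging').client('iottwinmaker')
staging_twinmaker.create_metadata_transfer_job(
    metadataTransferJobId='sitewise-metadata-import',
    sources=[{'type': 's3', 's3Configuration': {'location': BUCKET_ARN}}],
    destination={'type': 'iotsitewise'},
)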
2: Migrate AWS IoT SiteWise telemetry data across AWS accounts
AWS IoT SiteWise supports ingesting high-volume historical data using the CreateBulkImportJob API operation, which you can use to migrate telemetry data from the development account to the staging account.
2.1 Retrieve data from the development account using BatchGetAssetPropertyValueHistory
AWS IoT SiteWise has data and SQL API operations to retrieve telemetry results. You can use the export file from the bulk export job in the previous step to get a list of AWS IoT SiteWise asset IDs and property IDs to query using the BatchGetAssetPropertyValueHistory API operation. The following sample code demonstrates retrieving data for the last two days:
import boto3
import csv
import time
import uuid

"""
Connect to the AWS IoT SiteWise API and define the assets and properties
to retrieve data for.
"""
sitewise = boto3.client('iotsitewise')

# Limit of 10 assetId/propertyId/entryId combinations per API call
asset_ids = ['a1', 'a2', 'a3']
property_ids = ['b1', 'b2', 'b3']

"""
Get the start and end timestamps for the date range of historical data
to retrieve. Currently set to the last 2 days.
"""
# Convert current time to Unix timestamp (seconds since epoch)
end_time = int(time.time())
# Start date 2 days ago
start_time = end_time - 2*24*60*60

"""
Generate a list of entries to retrieve property value history.
Loops through the asset_ids and property_ids lists, zipping them
together to generate a unique entry for each asset-property pair.
Each entry contains a UUID for the entryId, the corresponding
assetId and propertyId, and the start and end timestamps for
the date range of historical data.
"""
entries = []
for asset_id, property_id in zip(asset_ids, property_ids):
    entry = {
        'entryId': str(uuid.uuid4()),
        'assetId': asset_id,
        'propertyId': property_id,
        'startDate': start_time,
        'endDate': end_time,
        'qualities': ["GOOD"],
    }
    entries.append(entry)

"""
Generate an entries dictionary to map entry IDs to the full entry data
for retrieving property values by entry ID.
"""
entries_dict = {entry['entryId']: entry for entry in entries}

"""
The snippet below retrieves asset property value history from AWS IoT SiteWise using the
`batch_get_asset_property_value_history` API call. The retrieved data is then
processed and written to a CSV file named 'values.csv'.
The script handles pagination by using the `nextToken` parameter to fetch
subsequent pages of data. Once all data has been retrieved, the script
exits the loop and closes the CSV file.
"""
token = None
with open('values.csv', 'w') as f:
    writer = csv.writer(f)
    while True:
        """
        Make the API call, passing entries and the token on subsequent calls.
        """
        if not token:
            property_history = sitewise.batch_get_asset_property_value_history(
                entries=entries
            )
        else:
            property_history = sitewise.batch_get_asset_property_value_history(
                entries=entries,
                nextToken=token
            )

        """
        Process success entries, extracting values into a list of dicts.
        """
        for entry in property_history['successEntries']:
            entry_id = entry['entryId']
            asset_id = entries_dict[entry_id]['assetId']
            property_id = entries_dict[entry_id]['propertyId']
            for history_values in entry['assetPropertyValueHistory']:
                value_dict = history_values.get('value')
                values_dict = {
                    'ASSET_ID': asset_id,
                    'PROPERTY_ID': property_id,
                    'DATA_TYPE': str(list(value_dict.keys())[0]).upper().replace("VALUE", ""),
                    'TIMESTAMP_SECONDS': history_values['timestamp']['timeInSeconds'],
                    'TIMESTAMP_NANO_OFFSET': history_values['timestamp']['offsetInNanos'],
                    'QUALITY': 'GOOD',
                    'VALUE': value_dict[list(value_dict.keys())[0]],
                }
                writer.writerow(list(values_dict.values()))

        """
        Check for the next token and break when pagination is complete.
        """
        if 'nextToken' in property_history:
            token = property_history['nextToken']
        else:
            break
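Before you move on to the next step, upload values.csv to an Amazon S3 bucket that the staging account can read, because the bulk import job ingests data from Amazon S3 rather than from your local file system.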
2.2 Ingest data into the staging account using CreateBulkImportJob
Use the values.csv file to import data into AWS IoT SiteWise using the CreateBulkImportJob API operation. Define the following parameters when you create an import job with CreateBulkImportJob. For a code sample, see CreateBulkImportJob in the AWS documentation; a minimal sketch of the call also follows the parameter list below.
- Replace the adaptive-ingestion-flag with true or false. For this exercise, set the value to true.
  - By setting the value to true, the bulk import job does the following:
    - Ingests new data into AWS IoT SiteWise.
    - Calculates metrics and transforms, and supports notifications for data with a timestamp that is within seven days.
  - If you set the value to false, the bulk import job ingests only historical data into AWS IoT SiteWise.
- Replace the delete-files-after-import-flag with true to delete the data from the Amazon S3 data bucket after it is ingested into AWS IoT SiteWise warm tier storage. For more information, see Create a bulk import job (AWS CLI).
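The following is a minimal boto3 sketch of that call in the staging account, under stated assumptions: the job name, bucket, prefix, and role ARN are hypothetical placeholders, and the columnNames list matches the column order written to values.csv in step 2.1.

import boto3

sitewise = boto3.Session(profile_name='staging').client('iotsitewise')

response = sitewise.create_bulk_import_job(
    jobName='migrate-historical-telemetry',
    # Role that lets AWS IoT SiteWise read the data bucket and write the error report.
    jobRoleArn='arn:aws:iam::111122223333:role/SiteWiseBulkImportRole',
    files=[{'bucket': 'sitewise-telemetry-transfer-bucket', 'key': 'values.csv'}],
    errorReportLocation={'bucket': 'sitewise-telemetry-transfer-bucket', 'prefix': 'errors/'},
    jobConfiguration={
        'fileFormat': {
            'csv': {
                'columnNames': [
                    'ASSET_ID', 'PROPERTY_ID', 'DATA_TYPE',
                    'TIMESTAMP_SECONDS', 'TIMESTAMP_NANO_OFFSET',
                    'QUALITY', 'VALUE',
                ]
            }
        }
    },
    adaptiveIngestion=True,        # the adaptive-ingestion-flag discussed above
    deleteFilesAfterImport=True,   # the delete-files-after-import-flag discussed above
)
print(response['jobId'])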
Clean Up
After you validate the results in the staging account, you can delete the data from the development account using the AWS IoT SiteWise DeleteAsset and DeleteAssetModel API operations; a sketch of these calls follows. Alternatively, you may continue to use the development account for other development and testing activities with the historical data.
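This is a minimal cleanup sketch, assuming the development account credentials are in a development profile, the asset and model IDs are hypothetical placeholders, and any child assets have already been disassociated; DeleteAssetModel only succeeds after every asset created from the model has been deleted.

import boto3

sitewise = boto3.Session(profile_name='development').client('iotsitewise')

asset_ids = ['a1', 'a2', 'a3']   # assets exported earlier (hypothetical IDs)
asset_model_ids = ['m1']         # their asset model(s) (hypothetical IDs)

# Delete assets first; an asset model cannot be deleted while assets created
# from it still exist.
for asset_id in asset_ids:
    sitewise.delete_asset(assetId=asset_id)
    # Wait until the asset is fully deleted before deleting its model.
    sitewise.get_waiter('asset_not_exists').wait(assetId=asset_id)

for asset_model_id in asset_model_ids:
    sitewise.delete_asset_model(assetModelId=asset_model_id)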
Conclusion
In this blog post, we addressed the challenge industrial customers face when scaling their AWS IoT SiteWise deployments. We discussed moving from PoV to production across multiple plants and production lines and how AWS IoT SiteWise addresses these challenges. Migrating metadata (such as asset models, asset/enterprise hierarchies, and historical telemetry data) between AWS accounts ensures consistent data context. It also helps you promote industrial IoT assets and data through the development lifecycle. For more details, see Bulk operations with assets and models.
Author biographies