ML predictions@Flying-Edge [drones]

January 11, 2021


Drone solutions offer value across industries. In agriculture, drones can help with yield estimation across large farm fields, in addition to spraying pesticides and fertilizer. In mining, they help with stockpile volume measurement and encroachment detection. In oil, gas, and infrastructure, drones assist with surveillance and construction process monitoring. In rural development, drones help with land records verification.

However, the majority of drone solutions in the market today are visual line of sight (VLOS) solutions, where the unmanned aerial vehicle (UAV) stays within the pilot's sight. We anticipate more widespread use of drones through beyond visual line of sight (BVLOS) solutions, where pilots have no visual reference to the aircraft. With BVLOS, the geographical span of drone operations increases, and base stations can operate drone flights to distant places as well.

Drones can capture high-definition images and videos, but they need to send this heavy data back to base station servers for analysis. Drone actions also depend on the results of these analyses and on the instructions coming from the base stations.

Drone solutions need to deal with the challenge of sending or streaming data to the base station for review and then receiving feedback on the analysis. Network disruption, communication latency, the energy required to maintain a consistent connection, and other interference add to the complexity of the solution. In BVLOS solutions, developers also need to deal with issues related to the drones being farther from the base stations than in VLOS solutions. The lack of a visual reference in BVLOS solutions amplifies the network, communication, and energy challenges.

By treating drones as Internet of Things (IoT) devices and running analysis logic and machine learning (ML) inference on the drones themselves, you remove the need to connect back to the base station for heavy data transfer. You can instead maintain connectivity to exchange control data and ML inference results with the base station over lightweight protocols.
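
For example, once inference runs on the drone, the result can travel to the base station as a small MQTT message. Below is a minimal sketch using the AWS IoT Device SDK v2 for Python; the endpoint, certificate paths, topic name, and payload fields are illustrative assumptions, not values from this post.

    # Minimal sketch: publish a compact inference result over MQTT instead of
    # streaming raw video to the base station. Endpoint, certificates, topic
    # and payload fields are placeholders.
    import json
    import time

    from awscrt import mqtt
    from awsiot import mqtt_connection_builder  # AWS IoT Device SDK v2 for Python

    connection = mqtt_connection_builder.mtls_from_path(
        endpoint="your-iot-endpoint-ats.iot.ap-south-1.amazonaws.com",  # placeholder
        cert_filepath="drone-01.cert.pem",
        pri_key_filepath="drone-01.private.key",
        ca_filepath="AmazonRootCA1.pem",
        client_id="drone-01",
        clean_session=False,
        keep_alive_secs=30,
    )
    connection.connect().result()

    # A few hundred bytes of JSON per detection, instead of megabytes of video.
    result = {
        "drone_id": "drone-01",
        "timestamp": int(time.time()),
        "label": "encroachment",
        "confidence": 0.93,
    }
    publish_future, _ = connection.publish(
        topic="drones/drone-01/inference",  # placeholder topic
        payload=json.dumps(result),
        qos=mqtt.QoS.AT_LEAST_ONCE,
    )
    publish_future.result()
    connection.disconnect().result()

The same connection can also subscribe to a command topic, so the base station can send lightweight control instructions back to the drone.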

The architecture below not only addresses the network, connectivity, and energy challenges of drone solutions but also, by running ML inference at the edge, unlocks many industry use cases in areas with low or limited internet connectivity. Depending on your use case, running ML at the edge requires little or no bandwidth, so you need fewer resources and reduce cost.
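
As a rough, purely illustrative comparison of the bandwidth involved (the bit rate, message size, and flight time below are assumptions, not measurements):

    # Back-of-envelope comparison; all numbers are illustrative assumptions.
    VIDEO_BITRATE_MBPS = 8      # assumed bit rate for streaming HD video
    RESULT_BYTES = 300          # assumed size of one JSON inference message
    RESULTS_PER_SECOND = 1      # assumed reporting rate
    FLIGHT_SECONDS = 30 * 60    # assumed 30-minute flight

    streaming_mb = VIDEO_BITRATE_MBPS / 8 * FLIGHT_SECONDS              # ~1800 MB
    edge_mb = RESULT_BYTES * RESULTS_PER_SECOND * FLIGHT_SECONDS / 1e6  # ~0.5 MB

    print(f"Streaming video to the base station: ~{streaming_mb:.0f} MB per flight")
    print(f"Sending edge-inference results only: ~{edge_mb:.2f} MB per flight")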

A drone is a special IoT device that is not stationary. This flying IoT device faces various network challenges, including low bandwidth and connectivity disruption. As a result, the solution requires an IoT platform that keeps the drone data in sync while addressing these network challenges. You will also need an ML platform to deploy models to the drone seamlessly.
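
One common way an IoT platform keeps device data in sync across connectivity drops is a device shadow: the drone reports its state whenever it is connected, and the cloud retains the last reported and desired state in between. A minimal sketch with boto3's iot-data client follows; the thing name, region, and state fields are placeholders, and on the drone itself you would more likely use the device SDK's shadow topics.

    # Device-shadow sketch; thing name, region and state fields are placeholders.
    import json

    import boto3

    iot_data = boto3.client("iot-data", region_name="ap-south-1")  # assumed region

    # Drone side (when connectivity is available): report current state.
    iot_data.update_thing_shadow(
        thingName="drone-01",
        payload=json.dumps({
            "state": {
                "reported": {
                    "battery_pct": 72,
                    "mission": "stockpile-survey",
                    "last_inference": {"label": "stockpile", "confidence": 0.91},
                }
            }
        }),
    )

    # Base-station side: read the last known state, even if the drone is offline.
    shadow = json.loads(
        iot_data.get_thing_shadow(thingName="drone-01")["payload"].read()
    )
    print(shadow["state"]["reported"]["battery_pct"])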

Examples of such ML platforms are the cloud-based Amazon SageMaker and the open-source H2O.ai. Options for the IoT platform include AWS IoT and Blynk, among others.

Reference architecture:

Solution flow

Take the following steps to build and deploy your own models on your drones:

  1. Configure AWS IoT Greengrass to communicate with the camera (and other devices) on the drone.
  2. (Optional) Send processed images to Amazon Simple Storage Service (Amazon S3). This step is needed when you want to collect data for further ML training. Alternatively, to mitigate connectivity challenges, you can fly a drone route solely to collect images on local drone storage and upload them to Amazon S3 later.
  3. Build, train, and tune your ML model with Amazon SageMaker based on the images in the Amazon S3 bucket (see the training sketch after this list).
  4. Deploy your model on AWS IoT Greengrass using Amazon SageMaker.
  5. Configure the AWS IoT Greengrass connector to communicate with Amazon Simple Notification Service (Amazon SNS).
  6. Configure an Amazon Kinesis Data Firehose delivery stream to store visual inspection data in an Amazon S3 bucket.
  7. Run ML inference at the edge (see the edge inference sketch after this list). [Reference: AWS IoT Greengrass ML Inference]
  8. Send production data and ML inference results to AWS IoT Core.
  9. Send notification to Amazon SNS. (You can subscribe various applications to notification at the backend.)
  10. Visualize your data and analysis using Amazon QuickSight on the Amazon Athena data source.
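
The steps above are an architecture outline rather than code, so two sketches follow. First, for steps 3 and 4, a rough example of training the SageMaker built-in image classification algorithm on labelled drone images in Amazon S3; the bucket names, IAM role, instance type, and hyperparameter values are illustrative assumptions.

    # Training sketch for steps 3-4; bucket names, role ARN, instance type and
    # hyperparameters are placeholders, not values from this post.
    import sagemaker
    from sagemaker import image_uris
    from sagemaker.estimator import Estimator
    from sagemaker.inputs import TrainingInput

    session = sagemaker.Session()
    role = "arn:aws:iam::123456789012:role/SageMakerExecutionRole"  # placeholder

    image = image_uris.retrieve("image-classification", session.boto_region_name)
    estimator = Estimator(
        image,
        role,
        instance_count=1,
        instance_type="ml.p3.2xlarge",
        output_path="s3://drone-ml-example/models/",   # placeholder bucket
        sagemaker_session=session,
    )
    estimator.set_hyperparameters(num_classes=3, num_training_samples=5000, epochs=10)
    estimator.fit({
        "train": TrainingInput("s3://drone-ml-example/train/",
                               content_type="application/x-recordio"),
        "validation": TrainingInput("s3://drone-ml-example/validation/",
                                    content_type="application/x-recordio"),
    })
    # The trained model artifact in output_path can then be attached to the
    # Greengrass group as a machine-learning resource (step 4).

Second, for steps 7 and 8, a sketch of what the edge piece can look like as an AWS IoT Greengrass (v1) Lambda function: load the locally deployed model, run inference on a captured frame, and publish only the compact result to AWS IoT Core. The model path, topic, event shape, and the use of ONNX Runtime for inference are assumptions; the post itself points to the AWS IoT Greengrass ML Inference connector.

    # Edge inference sketch for steps 7-8; model path, topic, event shape and
    # the choice of ONNX Runtime are illustrative assumptions.
    import json
    import time

    import greengrasssdk          # Greengrass Core SDK available on the core device
    import numpy as np
    import onnxruntime as ort     # assumption: the model was exported to ONNX

    MODEL_PATH = "/greengrass-machine-learning/model.onnx"   # placeholder local path
    TOPIC = "drones/drone-01/inference"                      # placeholder topic

    iot_client = greengrasssdk.client("iot-data")
    session = ort.InferenceSession(MODEL_PATH)
    input_name = session.get_inputs()[0].name


    def classify_frame(frame: np.ndarray) -> dict:
        """Run the model on one preprocessed camera frame, return a compact result."""
        scores = session.run(None, {input_name: frame.astype(np.float32)})[0][0]
        top = int(np.argmax(scores))
        return {
            "drone_id": "drone-01",
            "timestamp": int(time.time()),
            "class_index": top,
            "confidence": float(scores[top]),
        }


    def function_handler(event, context):
        """Greengrass invokes this handler with a frame (assumed event shape)."""
        frame = np.array(event["frame"], dtype=np.float32)
        result = classify_frame(frame)
        # Publish only the small JSON result; raw imagery stays on the drone and
        # can be uploaded to Amazon S3 later (step 2).
        iot_client.publish(topic=TOPIC, payload=json.dumps(result))
        return result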

Author: Sachin Punyani, Business Development Lead, Artificial Intelligence, Machine Learning, Analytics, Internet of Things, Amazon Internet Services Pvt. Ltd.

