Drones are in the news lately - most notably Cardi B breaking the Guinness World Record for the most albums delivered by drone in an hour.
Drones can also be used for even better things, like helping communities prepare for and recover from disasters.
Speaking of community, I had the pleasure of presenting at AWS Community Day and the AWS User Group in Brisbane in recent weeks.
In the sessions, I talked about how you can use drones for a good purpose, before, during and after a disaster.
At NTT, we provide drones to customers via our e-Drone Technology team, which deals with two types of drones: Anafi drones for aerial photography and Skydio drones for inspection.
The drones can be programmed for autonomous flight with 360-degree obstacle avoidance. They can also perform thermal and LiDAR imaging and collect comprehensive video and still imagery.
We also build our own drones for agricultural usage, such as inspection of crops for effective use of land.
Before disasters
As for helping before disasters, we have the predictive maintenance angle.
At NTT, we help customers do predictive maintenance across their hydraulic power plants.
Who uses solar panels for their home or business?
Although this is more of a predictive maintenance angle than a disaster-specific one, the use case of drones detecting faulty panels would be useful for Australian households and businesses.
During a disaster
For help during a disaster, we're able to assist firefighters to see where a fire has spread, help locate people who might be at risk, and even use drones to extinguish the fire.
After a disaster
As for post-disaster events, drones can help navigate difficult terrain that is unsafe for people and vehicles to reach. They can even restore internet connectivity by flying over landslides and re-laying fibre optic cables.
You can see our drones in action here on our YouTube channel.
In my session, I talked about how we can use AWS services to scale our existing ML image analysis model and enhance it with Amazon Bedrock. The idea is to create a React dashboard application hosted on AWS Amplify that, from the front-end user's perspective, shows the Brisbane-based bridges in scope for the analysis.
Each bridge is scored low, medium or high risk according to a combination of human-in-the-loop input (bridge inspectors) plus Amazon Bedrock and Anthropic Claude models.
I gave Claude the prompt “You are a bridge inspector. Assign a bridge risk status based on the data we have about the bridge”.
The data in this case is produced by the proprietary ML model that NTT eDrones runs, which outputs images with cracks and rust superimposed over the top, plus CSV and CAD data.
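To make the prompting concrete, here's a minimal sketch of how a Claude 3 call via the Bedrock Messages API might be assembled with that bridge-inspector prompt. The defect fields and model ID are illustrative assumptions, not NTT's actual schema.

```python
import json

# Hypothetical defect summary derived from the proprietary ML model's
# output (field names are illustrative, not the real schema).
defect_summary = {
    "bridge_id": "BNE-017",
    "crack_count": 12,
    "max_crack_width_mm": 3.4,
    "rust_area_pct": 8.2,
}

# Request body for Claude 3 via the Bedrock Messages API.
body = json.dumps({
    "anthropic_version": "bedrock-2023-05-31",
    "max_tokens": 1024,
    "system": "You are a bridge inspector. Assign a bridge risk status "
              "based on the data we have about the bridge.",
    "messages": [
        {"role": "user", "content": json.dumps(defect_summary)}
    ],
})

# With AWS credentials in place, the call itself would look like:
# import boto3
# bedrock = boto3.client("bedrock-runtime", region_name="ap-southeast-2")
# response = bedrock.invoke_model(
#     modelId="anthropic.claude-3-sonnet-20240229-v1:0", body=body)
```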
The key is that we have thousands of hours of high-quality drone imagery that we've captured around infrastructure such as bridges and piers.
I also got some help from Kiro to come up with baseline user stories for Bridge Inspectors and Engineers, Maintenance Co-ordinators and IT Administrators, and to generate code against an existing git repository.
Kiro is Amazon's spec-driven code generation tool which gathers requirements on your behalf.
I managed to create a Kiro coaster with punch needling technique. The eyes are a bit wonky, I'm a punch needling noob - this is an MVP.
I also happened to be a participant at AWS Community Day New Zealand (Aotearoa) in Wellington for an hour or so, in time to catch the last keynote by AWS Developer Advocate Donnie Prakoso.
The future is bright with the AWS New Zealand region just opening - I saw a lot of engaged Developers and Technologists in the Community Day session.
Thanks for the AWSome mouse pad - great idea.
Here's my actual session from a different AWS Community Day in Brisbane, Australia.
After the session, I spoke to an Amazonian who mentioned that I should look into incorporating Amazon Nova models and agents into the mix as an idea.
So I did - now with this high-level design.
What does the multi-modal agent AI capability look like in this demo solution?
AI components used
After that discussion I thought about why we might need to use AI agents for this solution.
Does it actually make sense to even use agents in this case?
We could potentially use them for multi-modal analysis, text analysis, supporting the human-in-the-loop approach, and comparing LLMs.
Amazon Bedrock + Claude 3: What I started off with
Think of Claude 3 as your trusted Bridge Engineer who never gets tired.
We feed it inspection data and get back proper engineering reports, based on a detailed prompt that puts it in the shoes of a Bridge Engineer.
Even the best AI needs a sanity check from real engineers, so we will always need input from Engineers in the field manually inspecting bridges.
We made sure there was a traffic-light system of risk classifications (High, Medium, Low) with confidence scores for each bridge.
It also gives you approximate figures for repairs, which is useful for budget planning in the short, medium and long term.
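The traffic-light mapping can be sketched as a small pure function. The score thresholds here are illustrative assumptions, not the production values:

```python
def classify_risk(score: float) -> tuple[str, str]:
    """Map a model risk score in [0, 1] to a traffic-light status.

    The 0.7 / 0.4 thresholds are illustrative, not the real ones.
    """
    if score >= 0.7:
        return "High", "red"
    if score >= 0.4:
        return "Medium", "amber"
    return "Low", "green"
```

In the actual solution the score would come from the model output alongside a confidence value, with the inspector able to override the result.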
Amazon Nova + AgentCore
I’ve used Amazon Nova models with AgentCore for multi-modal analysis, and also as a way to see whether the Nova models stack up against the Claude models.
By incorporating agents into the data analysis (image and text), the solution processes the data more efficiently than the original approach of calling Claude directly via Amazon Bedrock.
In terms of image processing, Nova Vision models can spot cracks in bridge photos better than most untrained humans.
Nova Text models can analyse CSV files of measurements of these defects and find patterns.
Nova Reasoning models combine everything into actionable recommendations based on the data collected.
AgentCore Orchestration
There are multiple agents working together to orchestrate the different LLMs.
It also manages the workflow for complex analysis processes.
Built-in fallbacks: when AgentCore doesn’t work, the solution gracefully falls back to direct Amazon Bedrock agent access.
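The fallback pattern itself is simple. A minimal sketch, assuming the two callables wrap the AgentCore orchestration and the direct Bedrock agent invocation respectively:

```python
def analyse_with_fallback(payload, agentcore_call, bedrock_call):
    """Try AgentCore orchestration first; on any failure, fall back
    to calling the Bedrock agent directly (graceful degradation).

    Both callables are placeholders for the real service wrappers.
    """
    try:
        return {"source": "agentcore", "result": agentcore_call(payload)}
    except Exception:
        return {"source": "bedrock-direct", "result": bedrock_call(payload)}
```

A production version would also log the failure and emit a metric so the degraded path is visible in monitoring.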
Multi-Modal Analysis
Image Processing (Nova Vision)
It does everything that the Claude models do on Bedrock, as per below: automatically analysing structural damage from images, classifying the risk level, tracking how damage develops over time, and mapping locations.
Data Analytics (Nova Text)
It uses CSV files as input to find trends in measurement data that humans would miss.
It also tracks how things change between inspection rounds for each bridge, important for predictive maintenance.
Data validation is part of the analytics as it catches measurement errors before they cause problems.
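The trend-tracking and validation steps can be sketched with plain CSV handling. The column names and the negative-width validation rule are illustrative assumptions about the measurement data:

```python
import csv
import io

# Two inspection rounds for the same defect IDs (illustrative data).
raw = """defect_id,round,crack_width_mm
D1,1,1.2
D1,2,1.9
D2,1,0.8
D2,2,0.7
D3,2,-5.0
"""

def widths_by_round(text):
    """Group crack widths by defect and inspection round."""
    rounds = {}
    for row in csv.DictReader(io.StringIO(text)):
        w = float(row["crack_width_mm"])
        if w < 0:  # basic validation: negative widths are sensor errors
            continue
        rounds.setdefault(row["defect_id"], {})[int(row["round"])] = w
    return rounds

def growth(rounds):
    """Change in crack width between round 1 and round 2."""
    return {d: r[2] - r[1] for d, r in rounds.items() if 1 in r and 2 in r}

deltas = growth(widths_by_round(raw))
# D1 grew between rounds; D3's invalid reading was dropped by validation.
```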
Synthesis Engine (Nova Reasoning)
This is where everything comes together, based on the data output by the text and multi-modal models.
Combining visual and measurement data for comprehensive analysis.
Gives you short, medium and long-term recommendations.
In terms of costs, it helps prioritise repairs based on risk vs. cost.
Event-driven architecture approach
This is where AWS Lambda helped: when a file lands in the Amazon S3 bucket, it triggers bridge analysis events with real-time status updates.
From a risk perspective, it was important to alert on any changes to the status for co-ordinating safety responses.
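A minimal sketch of the S3-triggered Lambda handler, assuming the standard S3 ObjectCreated event shape (the bucket name and downstream "queue the analysis" step are placeholders):

```python
import json
import urllib.parse

def handler(event, context):
    """Lambda entry point for S3 ObjectCreated events: extract each
    uploaded image key and queue a bridge analysis run for it."""
    results = []
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        # S3 event keys are URL-encoded (e.g. spaces become '+').
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        # In the real solution this would start the Nova/Claude analysis,
        # e.g. by invoking Step Functions or writing a job record.
        results.append({"bucket": bucket, "key": key, "status": "QUEUED"})
    return {"statusCode": 200, "body": json.dumps(results)}
```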
Real-Time Update System
From a customer perspective you can leverage some of the 5G, LTE network connectivity that our drones have.
At the Brisbane AWS User Group session, I talked about how Amazon's Project Kuiper could help with satellite connectivity across Australia's rural areas when it launches in 2026. It may even help with network coverage for drones working in rural areas.
We still need to consider the data transfer costs of real-time vs batch image processing. If we need to, say, set up a private LTE network, there will be costs associated with the networking hardware required.
From a front-end perspective, I used WebSocket integration for live updates of analysis results and alerting across multiple users and profiles.
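The shape of a live update message can be sketched as below. The message fields are illustrative assumptions, and the commented broadcast call shows how it might be pushed through the API Gateway WebSocket management API:

```python
import datetime
import json

def status_update(bridge_id: str, risk: str, confidence: float) -> str:
    """Build the JSON message pushed to connected dashboard clients
    over the WebSocket (field names are illustrative)."""
    return json.dumps({
        "type": "ANALYSIS_UPDATE",
        "bridgeId": bridge_id,
        "risk": risk,
        "confidence": confidence,
        "timestamp": datetime.datetime.now(
            datetime.timezone.utc).isoformat(),
    })

# Broadcasting would use the API Gateway management API, e.g.:
# apigw = boto3.client("apigatewaymanagementapi", endpoint_url=ws_endpoint)
# apigw.post_to_connection(ConnectionId=cid,
#                          Data=status_update("BNE-017", "High", 0.92))
```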
Intelligent Caching System
In terms of optimising performance, caching improves response times and reduces API costs by minimising repeated calls to Amazon Bedrock.
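The idea is simple content-based memoisation: hash the request payload and reuse the stored result for identical inputs. A minimal in-memory sketch (a real deployment would more likely use ElastiCache or DynamoDB with a TTL):

```python
import hashlib
import json

_cache: dict = {}

def cached_analysis(payload: dict, analyse) -> dict:
    """Return a cached result for identical payloads so repeated
    uploads of the same data don't re-invoke Amazon Bedrock.

    `analyse` is a placeholder for the real Bedrock call wrapper.
    """
    key = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()).hexdigest()
    if key not in _cache:
        _cache[key] = analyse(payload)
    return _cache[key]
```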
Fallback Mechanisms
Graceful degradation is part of the solution with structured fallback from AgentCore to direct Nova model access.
Monitoring and Observability
Using Amazon CloudWatch for real-time metric tracking for overall service health, for time, costs and error rates.
Edge Processing capability
This solution can handle on-device AI analysis when running ML at the edge. It can also continue autonomous operations offline, defaulting to batch processing if required.
Security and Compliance
Security is job zero for us, and that's what I used to say back at AWS. This solution is no exception with secure processing with encryption and access controls.
I used Amazon CloudWatch and CloudTrail for comprehensive logging for regulatory compliance and quality assurance.
We also applied least privilege using IAM role-based permissions with appropriate data access levels.
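As an illustration of what least privilege looks like here, a policy sketch scoped to reading the imagery bucket and invoking models (the bucket name and region are hypothetical placeholders):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ReadDroneImagery",
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::drone-imagery-bucket/*"
    },
    {
      "Sid": "InvokeModels",
      "Effect": "Allow",
      "Action": ["bedrock:InvokeModel"],
      "Resource": "arn:aws:bedrock:ap-southeast-2::foundation-model/*"
    }
  ]
}
```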
Let me walk through the front-end and the explanation around the ability to switch from Claude to Nova models.
What would the impact look like for the solution in the future?
We can look to integrate Real-time IoT sensor data integration for continuous monitoring capabilities.
In addition, we can look at GIS Integration with Spatial analysis and geographic information.
We can look at making the job easier for Bridge Inspectors out in the field with real-time guidance and support.
Also, from a regulatory compliance perspective, we can automate generation of summaries for compliance reporting.
How drones can help for good
By combining drones with AI analysis, we can provide communities globally with more efficient disaster planning and response - before, during and after the event.
The integration of Amazon Bedrock's Claude models with Amazon Nova's multi-modal agents creates an advanced level of intelligence in analysing infrastructure such as bridges and piers.
This solution provides actionable insights, cost estimates, and prioritised response plans that enable communities to recover faster and more effectively.
This solution demonstrates how emerging technologies can be used for social good, providing critical capabilities when communities need them most.
Stay tuned for part 2 of this blog where I dive into the FinOps and advanced cost analysis dashboard components.
If you've been working in this space, I'm sure you can imagine your Chief Finance Officer asking, “What's the Return on Investment with this AI workload?”