What It Actually Means for Edge AI
A recent Reuters report highlights a major shift in edge AI: real robotics platforms are now being designed specifically for generative AI workloads.
This is not just another partnership announcement. It shows how edge AI hardware, robotics, and generative models are starting to converge into real products.
What Happened
South Korean AI chip startup DEEPX is expanding its partnership with Hyundai Motor Group to build a new computing platform for robots powered by generative AI.
The key element is DEEPX’s upcoming DX-M2 chip, a second-generation low-power NPU designed for on-device AI.
Unlike cloud-based AI systems, these chips run models directly on robots — meaning:
- no dependency on cloud latency
- lower power consumption
- real-time decision making
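The latency benefit of on-device inference can be sketched with a back-of-the-envelope comparison. All the numbers below are hypothetical assumptions for illustration; the report gives no latency figures:

```python
# Illustrative latency comparison: cloud inference vs. on-device inference.
# Every figure here is an assumed placeholder, not a measured value.

CLOUD_RTT_MS = 80.0    # assumed network round trip to a cloud endpoint
CLOUD_INFER_MS = 15.0  # assumed server-side inference time
EDGE_INFER_MS = 30.0   # assumed on-device NPU inference time (slower chip)

def cloud_latency_ms(rtt_ms: float = CLOUD_RTT_MS,
                     infer_ms: float = CLOUD_INFER_MS) -> float:
    """Perceived latency when the model runs in the cloud: network + compute."""
    return rtt_ms + infer_ms

def edge_latency_ms(infer_ms: float = EDGE_INFER_MS) -> float:
    """Perceived latency when the model runs on the robot: compute only."""
    return infer_ms

if __name__ == "__main__":
    print(f"cloud: {cloud_latency_ms():.0f} ms, edge: {edge_latency_ms():.0f} ms")
```

Even when the edge chip is slower at raw inference, removing the network round trip can still win on end-to-end latency, and the edge path keeps working when connectivity drops.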
The chips will be manufactured using Samsung’s 2nm process, with mass production planned for next year.
Why This Is Important
This news confirms a major trend: generative AI is moving to the edge.
DEEPX is not building general-purpose GPUs. It focuses on NPUs optimized for:
- robotics
- autonomous systems
- industrial AI
According to the company, its current chips are up to 20× more power-efficient than NVIDIA's Jetson Orin, at a lower price.
That is a big deal for robotics, because power and heat are major constraints, especially for humanoid robots.
Generative AI Inside Robots
One of the most interesting parts of the report is how DEEPX describes the next generation of chips.
They are designed specifically for generative AI workloads, meaning robots can:
- learn from experience
- adapt to new environments
- improve behavior over time
This is closer to how large language models work, but applied to physical systems.
Instead of fixed logic, robots become systems that can evolve.
Hyundai’s Strategy: Scale Robotics Production
Hyundai is not experimenting — it is scaling.
The company plans to produce up to 30,000 robots per year by 2028, including humanoid platforms.
This partnership with DEEPX is part of a broader strategy:
- build an ecosystem of on-device AI
- reduce reliance on external compute
- control the full robotics stack
This is similar to what Apple did with its own silicon, but applied to robotics.
Why Low-Power NPUs Matter
Traditional AI hardware like GPUs is powerful but inefficient for edge systems.
Robots need:
- low latency
- low power consumption
- stable thermal behavior
DEEPX specifically targets these constraints.
For example, the company notes that lower power consumption helps prevent overheating in humanoid robots, a real problem for current designs (Investing.com).
This is where specialized AI accelerators outperform general-purpose hardware.
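How accelerator efficiency translates into battery runtime can be shown with a simple power-budget sketch. The battery capacity, baseline load, and the two accelerator wattages are assumed figures for illustration, not specs from the report:

```python
# Illustrative power-budget sketch for a battery-powered mobile robot.
# All figures are hypothetical assumptions chosen for the example.

BATTERY_WH = 500.0   # assumed onboard battery capacity (watt-hours)
BASE_LOAD_W = 150.0  # assumed draw of motors, sensors, and host compute

def runtime_hours(accelerator_w: float,
                  battery_wh: float = BATTERY_WH,
                  base_load_w: float = BASE_LOAD_W) -> float:
    """Estimated battery runtime given the AI accelerator's average draw."""
    return battery_wh / (base_load_w + accelerator_w)

# A 60 W GPU-class module vs. a 5 W NPU-class module (assumed figures):
gpu_runtime = runtime_hours(accelerator_w=60.0)  # ≈ 2.38 h
npu_runtime = runtime_hours(accelerator_w=5.0)   # ≈ 3.23 h
```

The gap widens further once cooling is counted: a hotter accelerator also needs fans or larger heat sinks, which draw power and add weight.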
How This Relates to Existing Edge AI Hardware
If you look at the current edge AI ecosystem, there are already several approaches:
- NVIDIA Jetson (high performance, higher power)
- Hailo accelerators (efficient NPU design)
- emerging players like DEEPX
What DEEPX is doing pushes this space beyond plain inference, toward generative AI running inside robots.
Bigger Picture
This news confirms several important shifts in edge AI:
- AI is moving from cloud → on-device
- robotics is becoming a primary use case
- NPUs are replacing GPUs in many edge scenarios
- generative AI is entering real-world systems
Most importantly, this is no longer theoretical.
These systems are being designed for mass production.
Conclusion
The DEEPX + Hyundai partnership is a clear signal that edge AI is entering a new phase.
This is not about benchmarks or demos.
It is about building real robots powered by on-device generative AI.
And once this model works, it will spread fast across:
- factories
- logistics
- autonomous machines
Edge AI is becoming physical.
