
Bob Jiang | awesomerobots


Awesome Robots Digest - Issue #1 - August 29, 2025

TL;DR πŸ“‹

Funding Frenzy: Robotics AI startup FieldAI raised a massive $405M to build "universal robot brains," signaling investor confidence in embodied AI. Meanwhile, soft robotics innovator SoftWear Automation stitched up $20M to automate apparel manufacturing.

Product & Tech Launches: NVIDIA unveiled the Jetson AGX Thor module, delivering 7.5Γ— the AI compute of its predecessor to power next-gen robots with real-time reasoning. Boston Dynamics and Toyota demonstrated Atlas performing complex tasks via a Large Behavior Model – a leap toward general-purpose humanoids.

Research & Open Source: NYU's EgoZero system uses smart glasses to train robots from human demonstrations, improving generalization without robot-collected data. Allen Institute's MolmoAct open-sourced a 7B "Action Reasoning" model that plans in 3D space, bridging language and robotics control.

Community & Events: The inaugural World Humanoid Robot Games in Beijing wowed audiences with robots dancing and sparring. ROS enthusiasts gathered at local meetups (e.g. ROS By-The-Bay) and gear up for ROSCon UK in September. Open-source contributions, from new ROS 2 drivers to DIY robot hands, continue to thrive.


Introduction πŸš€

This week, AI robotics showcased remarkable progress on all fronts. On the industry side, we're seeing huge investments in "robot brains" – exemplified by FieldAI's $405M war chest to develop foundation models that enable robots to adapt and learn on the fly. Major tech launches are accelerating the convergence of AI and robotics: NVIDIA's new Jetson Thor module promises unprecedented edge AI power for autonomous machines, and a Boston Dynamics–Toyota collaboration just demonstrated a humanoid robot performing complex packing tasks through a Large Behavior Model, without explicit programming.

In academia and open source, researchers unveiled cutting-edge tools like EgoZero (training robots via human-worn smart glasses) and MolmoAct (an open 3D action reasoning model), lowering barriers to deploying intelligent robots. The global robotics community remains vibrant – from humanoid robot competitions capturing the public imagination to grassroots developers building innovative DIY robots. It's clear that AI-driven robotics is charging ahead, breaking new ground in capability, investment, and community engagement.


Top News & Breakthroughs πŸ“°

🏒 Company News

  • Boston Dynamics & TRI – Atlas Levels Up: Boston Dynamics, in collaboration with Toyota Research Institute, demonstrated its Atlas humanoid completing a long sequence of tasks using a Large Behavior Model (LBM). In a new video, Atlas autonomously packs and moves boxes with whole-body dexterity, adapting on the fly to surprises (like a box lid being closed). This AI-powered feat – controlling locomotion and manipulation with one big neural model – marks a key step toward general-purpose humanoids.

  • Agility Robotics' Humanoid Hits 1-Year on the Job: Agility Robotics celebrated a milestone with its bipedal robot Digit: one year of continuous operation at a logistics warehouse. Deployed with 3PL provider GXO (fulfilling orders for apparel company Spanx), Digit works an 8-hour shift moving bins from autonomous mobile robots to conveyors, as part of the facility's daily workflow.

πŸš€ Product Launches

  • NVIDIA Jetson Thor – Edge AI Supercomputer: NVIDIA announced the Jetson AGX Thor, a new robotics computer module powered by its latest Blackwell GPU (128GB memory). Jetson Thor boasts 7.5Γ— the AI compute and 3.5Γ— the energy efficiency of its predecessor (Jetson Orin). This jump in performance enables robots to run multiple advanced models (including vision, language, and generative AI) simultaneously at the edge, supporting real-time perception and decision-making.

πŸ’° Funding & Investments

  • FieldAI Scores $405M for "Universal Robot Brains": Irvine-based startup FieldAI came out of stealth with a whopping $405 million raised across two rounds (the latest $314M led by Bezos Expeditions, Prysm, and Temasek). FieldAI is developing "foundational embodied AI models" – essentially general-purpose robot brains that can run on various platforms (humanoids, quadrupeds, self-driving cars) and enable them to quickly learn new tasks.

  • Sewbots Get a Boost – $20M for SoftWear Automation: Atlanta-based SoftWear Automation raised $20 million (Series B1) to expand its autonomous sewing robots. The startup's Sewbot systems use AI-driven machine vision and robotics to cut and sew garments with minimal human labor.


Research Spotlight πŸ”¬

πŸ“„ Research Papers

  • "Admissible Strategies" for Safe Human-Robot Teams – Lahijanian et al., IJCAI 2025: Researchers at University of Colorado Boulder introduced a game-theoretic approach to help robots collaborate safely with humans in unpredictable environments. Instead of optimizing for a robot to "win" (finish its task at all costs), the algorithm seeks an "admissible strategy," maximizing task completion while minimizing potential harm to humans.

  • Autonomous Breeding Robot – Xu et al., Cell (CAS Institute): A team from the Chinese Academy of Sciences published a groundbreaking paper in Cell detailing GEAIR, a robotic system for plant hybrid breeding. Combining robotics, AI vision, and biotechnology, GEAIR autonomously performs tasks like selecting parent plants, pollinating flowers, and managing growth environments – essentially automating the entire crop breeding pipeline.

πŸ”§ Open Source Projects

  • MolmoAct – Open 3D Reasoning for Robots (Allen AI): The Allen Institute for AI released MolmoAct, an open-source AI model that lets robots understand and act in 3D space. MolmoAct is built on a vision-language foundation model (AI2's Molmo) and extends it into an Action Reasoning Model (ARM).

  • NASA JPL Visual Perception Engine (VPE): NASA's Jet Propulsion Lab open-sourced a Visual Perception Engine designed for rover and robot vision. VPE is a fast, flexible framework that can run multiple perception models concurrently on resource-constrained hardware (like the NVIDIA Jetson platform).


Event Horizon πŸ“…

πŸ—“οΈ This Week

  • ROS By-The-Bay Meetup – Aug 28, 2025 (San Francisco, CA) – An informal gathering of ROS (Robot Operating System) developers and enthusiasts in the Bay Area.
  • Gazebo "Jetty" Tutorial Party – Aug 27, 2025 (Virtual) – An online mini-event hosted by Open Robotics to kick off testing of Gazebo Jetty, the upcoming release of the Gazebo robot simulator.

πŸ“… Next Week

  • ROSCon UK 2025 – Sept 15–17, 2025 (Edinburgh, UK) – The first-ever ROSCon UK user conference.
  • CoRL 2025 (Conference on Robot Learning) – Sept 27–30, 2025 (Seoul, South Korea) – A top academic conference merging ML and robotics.

🎯 Upcoming Deadlines

  • RAAI 2025 (Robotics, Automation, AI Conference) – Paper Submission Deadline: Sept 5, 2025
  • ICMSR 2025 (Mechatronics Systems and Robots) – Submission Deadline: Sept 5, 2025
  • Robot Art Competition – Entry Deadline: Sept 10, 2025

Tool/Resource of the Week πŸ› οΈ

🎯 Featured Resource: Foxglove Studio – Robotics Observability Platform

What It Is: Foxglove Studio is a powerful visualization and observability platform for robotics development. Think of it as a next-generation replacement for ROS's RViz and rqt tools – it allows you to visualize live or recorded robot data (sensors, messages, logs) in a flexible, web-based interface.

Key Features:

  • Multi-modal Data Visualization: Foxglove supports 20+ types of panels to view data – from 3D point clouds, camera feeds, and LiDAR scans to line plots for sensor readings and even log consoles.
  • Custom Layouts & Extensibility: Build dashboards tailored to your robot/application. Drag-and-drop panels (maps, images, plots, etc.) into a layout, and save/share it with your team.
  • Collaboration & Data Management: Foxglove enables cloud-based collaboration – team members can annotate data, share insights, and access robot telemetry remotely through a browser.

Getting Started:

  • Website: foxglove.dev
  • Documentation: docs.foxglove.dev
  • Community: Foxglove has an active Discord for support

Community Corner πŸ‘₯

πŸ’¬ Trending Discussions

  • Podcast – "Do Humanoid Robots Need Legs?" (Audrow Nash Podcast): A lively discussion has been making the rounds in robotics circles: should humanoid robots walk like humans, or would wheels suffice? In a recent episode of Audrow Nash's podcast, Agility Robotics CTO Pras Velagapudi explains why his team gave their robot Digit two legs.

  • Reddit – LLMs for Robotics #Discussion: On r/robotics this week, a post titled "Anyone using GPT-4 or similar for robot control?" gained traction. In it, a grad student described experiments using large language models to generate high-level task plans for a home robot.

πŸ› οΈ Cool Projects

  • Ambidextrous Robot Hand – by Kelvin Gonzalez Amador: A hobbyist engineer, Kelvin stunned the community with his 3D-printed robotic hand that can perform the entire American Sign Language alphabet in both left- and right-hand orientations.

  • "Open-Platform Da Vinci Instruments" by University X Robotics Club: A group of students and researchers has developed an adapter to use Da Vinci surgical robot instruments on common collaborative robots.

πŸŽ‰ Community Highlights

  • Canonical Joins Open Source Robotics Alliance (OSRA): In community governance news, Canonical (the company behind Ubuntu) became a Platinum member of the Open Source Robotics Foundation's alliance.

  • World Humanoid Robot Games Debut: A fun and inspiring community-driven event took place in mid-August – the 1st World Humanoid Robot Games in Beijing.

🌟 Spotlight: DIY Biped Community Project "Bobble-Bot"

In our community spotlight, we highlight Project Bobble-Bot, a low-cost open-source biped robot that has been gaining traction on social media. Created by a small team of robotics enthusiasts, Bobble-Bot is a 2-legged walking robot about 40cm tall, 3D-printed, and driven by hobby servos.


Conclusion 🎯

As we wrap up this inaugural issue of Awesome Robots Digest, one theme stands out: the pace of advancement in AI robotics is truly accelerating. From massive funding fueling ambitious "robot brain" startups to grassroots makers sharing ingenious DIY designs, every level of the ecosystem is brimming with progress. It's increasingly clear that the convergence of AI and robotics – whether through large behavior models controlling humanoids or smart algorithms that make robots safer teammates – is unlocking capabilities we only imagined a few years ago.

What does this mean for all of us? For one, there's never been a more exciting (or challenging) time to be in the field. New tools and open-source releases lower barriers for entry, while real-world deployments (like Digit's year on the job) prove that robots are graduating from labs to daily life. The community's energy, evident in forums, meetups, and competitions, continues to inspire innovation and collaboration across continents.

We'd love to hear your thoughts: What breakthrough or news item intrigued you the most this week? πŸ” Did something spark a question or an idea for your own project? Let's keep the conversation going – feel free to reach out or share your perspectives for next week's digest. After all, this newsletter is by and for a community passionate about robotics.

Stay tuned for Issue #2 – we'll dive into more developments (hint: there's a big conference around the corner and rumors of a major robot software update). Until then, keep exploring, keep building, and keep being awesome, robot fans!


πŸ“§ Stay Connected


This digest is curated by the Awesome Robots team. Have a robotics story or project to share? Or feedback on this issue? We'd love to hear from you! Feel free to reach out via our website's contact form or ping us on social media. Together, let's continue to learn and celebrate the amazing world of AI robotics.

Rendered by AI Robotics Digest Curator, your weekly guide to what's new in the robotics universe.
