When you think of a robot hand, you probably imagine it gripping objects with brute force or fumbling blindly. But what if robots could truly feel—detecting texture, pressure, and slip like humans do? That’s the vision behind Hong Kong-based DAIMON Robotics, a two-and-a-half-year-old startup that’s making headlines with its latest breakthrough: the Daimon-Infinity dataset. This isn’t just another data dump; it’s the largest omni-modal robotic dataset for physical AI, packed with high-resolution tactile sensing data from over 80 real-world scenarios. From folding laundry at home to assembling parts on a factory line, DAIMON is giving robot hands a sense of touch. Here are eight key things you need to know about this game-changing initiative.
1. The Daimon-Infinity Dataset: World’s Largest Omni-Modal Dataset
Released in April, the Daimon-Infinity dataset is described by DAIMON as the largest and most comprehensive omni-modal robotic dataset ever created. It scales up to a million hours of multimodal data, including ultra-high-resolution tactile feedback, and spans more than 2,000 human skills across 80+ real-world environments. Unlike typical datasets that focus only on vision, Daimon-Infinity integrates tactile, language, and action data alongside the visual stream, making it a goldmine for training robots to handle delicate tasks. The dataset was built using a distributed, out-of-lab collection network that can generate millions of hours of data annually. Partners including Google DeepMind, Northwestern University, and the National University of Singapore collaborated on the project, signaling a major shift toward open, collaborative AI research.
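To make “omni-modal” concrete, here is a minimal sketch of what a single timestep in such a dataset might contain. The field names and array shapes are illustrative assumptions, not DAIMON’s published schema; the point is simply that vision, touch, language, and action travel together in one record.

```python
from dataclasses import dataclass

import numpy as np

# Hypothetical record layout: every field name and shape below is an
# illustrative assumption, not DAIMON's actual release format.
@dataclass
class OmniModalStep:
    rgb: np.ndarray        # camera frame, e.g. (480, 640, 3) uint8
    tactile: np.ndarray    # per-fingertip touch maps, e.g. (2, 240, 320) float32
    instruction: str       # natural-language task description
    action: np.ndarray     # commanded joint/gripper targets, e.g. (7,) float32
    timestamp: float       # seconds since the episode started

# One fabricated step, just to show the modalities sitting side by side.
step = OmniModalStep(
    rgb=np.zeros((480, 640, 3), dtype=np.uint8),
    tactile=np.zeros((2, 240, 320), dtype=np.float32),
    instruction="fold the towel in half",
    action=np.zeros(7, dtype=np.float32),
    timestamp=0.033,
)
print(step.instruction, step.tactile.shape)
```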

2. Why Tactile Sensing Matters for Robot Manipulation
Most current robotic systems rely on vision-language-action (VLA) models to decide how to grasp objects. But that leaves a critical gap: insensitivity. Robots can’t tell if a glass is slippery, if a fabric is delicate, or if they’re applying too much pressure. Tactile sensing fills that void. DAIMON’s vision-tactile sensors pack over 110,000 effective sensing units into a fingertip-sized module, providing real-time feedback on contact forces, texture, and slip. This allows robots to adjust their grip on the fly, much as humans do without thinking; the sketch below shows the idea in miniature. By adding touch, DAIMON aims to achieve truly dexterous manipulation, enabling robots to handle everything from fragile eggs to flexible wires in manufacturing. For more on the brains behind this, jump to item 4.
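To see why a fast touch signal matters for control, here is a toy slip-detection loop. The threshold, the grip-update rule, and the function names are hypothetical simplifications, not DAIMON’s controller; the takeaway is that a dense, high-rate tactile stream lets a gripper tighten the moment the contact pattern starts to shift.

```python
import numpy as np

def detect_slip(prev_map, curr_map, threshold=0.05):
    # Crude slip proxy: how fast the touch map changes between frames.
    # A real controller would track shear fields, but the idea is the same.
    return float(np.mean(np.abs(curr_map - prev_map))) > threshold

def adjust_grip(force, slipping, step=0.1, max_force=5.0):
    # Tighten while slipping; otherwise relax slightly to avoid crushing.
    return min(force + step, max_force) if slipping else max(force - 0.5 * step, 0.0)

# Toy tactile frames: a sudden shift at t=3 simulates the object slipping.
rng = np.random.default_rng(0)
frames = [0.2 + 0.001 * rng.standard_normal((8, 8)) for _ in range(6)]
frames[3] = frames[3] + 0.3  # simulated slip event
force = 1.0
for prev, curr in zip(frames, frames[1:]):
    force = adjust_grip(force, detect_slip(prev, curr))
    print(f"grip force -> {force:.2f} N")
```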
3. The Vision-Tactile-Language-Action (VTLA) Architecture
DAIMON’s co-founder and chief scientist, Prof. Michael Yu Wang, pioneered the Vision-Tactile-Language-Action (VTLA) architecture. This elevates tactile data to a modality equal to vision and language in robotic decision-making. In traditional VLA models, touch is either absent or treated as a low‑priority signal. VTLA changes that by feeding high-definition tactile data directly into the AI pipeline, alongside visual inputs and natural language commands. The result is a more holistic understanding of the environment—robots can see an object, hear instructions, and feel its surface properties. This multimodal fusion is critical for unpredictable tasks, like sorting recyclables or assisting in surgery.
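As a rough illustration of the fusion idea, the sketch below encodes each modality separately and concatenates the embeddings before predicting an action. The layer sizes, the concatenation-based fusion, and the class name are assumptions made for clarity; DAIMON has not published the VTLA network’s internals.

```python
import torch
import torch.nn as nn

class VTLAPolicySketch(nn.Module):
    """Illustrative vision-tactile-language-action fusion, not the real VTLA."""
    def __init__(self, d_model=128, action_dim=7):
        super().__init__()
        self.vision_enc = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, d_model), nn.ReLU())
        # Touch enters as a first-class modality, on equal footing with vision.
        self.tactile_enc = nn.Sequential(nn.Flatten(), nn.Linear(16 * 16, d_model), nn.ReLU())
        self.lang_enc = nn.Sequential(nn.Linear(64, d_model), nn.ReLU())  # pooled text embedding
        self.head = nn.Sequential(nn.Linear(3 * d_model, d_model), nn.ReLU(), nn.Linear(d_model, action_dim))

    def forward(self, rgb, tactile, text_emb):
        z = torch.cat([self.vision_enc(rgb), self.tactile_enc(tactile), self.lang_enc(text_emb)], dim=-1)
        return self.head(z)

policy = VTLAPolicySketch()
action = policy(torch.randn(1, 3, 32, 32), torch.randn(1, 16, 16), torch.randn(1, 64))
print(action.shape)  # torch.Size([1, 7])
```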
4. Prof. Michael Yu Wang: The Brain Behind the Breakthrough
Prof. Michael Yu Wang isn’t new to robotics. He earned his PhD at Carnegie Mellon University under Matt Mason, a pioneer in robotic manipulation, then founded the Robotics Institute at Hong Kong University of Science and Technology. An IEEE Fellow and former Editor-in-Chief of IEEE Transactions on Automation Science and Engineering, Wang has spent nearly four decades tackling the “insensitivity” problem in robot hands. His vision for VTLA was born from the realization that robots need a sense of touch as rich as human skin to succeed in unstructured environments. He co-founded DAIMON Robotics to commercialize this research, and the Daimon-Infinity dataset is his latest push to democratize tactile capability for the entire AI community.
5. High-Resolution Tactile Sensors: Fingertip-Sized Powerhouses
DAIMON’s hardware is equally impressive. The flagship sensor is a monochromatic, vision-based design, yet it packs those 110,000-plus effective sensing units into a module the size of a fingertip. By measuring minute deformations in a soft gel layer with a high-speed camera, the sensor captures detailed contact geometry and force distribution in real time. This lets robots detect surface textures, edges, and even the direction of applied forces. Where other tactile sensors offer only sparse readings, DAIMON’s technology provides a dense, high-resolution touch map: essential for tasks like gripping a slippery fish or driving a screw without stripping the head.
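A stripped-down version of that camera-to-touch-map pipeline looks like the sketch below: subtract a no-contact reference frame from the current frame, threshold the difference to get contact geometry, and integrate it for a force estimate. The threshold and the linear force proxy are illustrative assumptions, not DAIMON’s calibration.

```python
import numpy as np

def tactile_readout(reference, frame, contact_thresh=10.0):
    # Deformation of the gel = current camera frame minus no-contact reference.
    deformation = frame.astype(np.float32) - reference.astype(np.float32)
    contact_mask = np.abs(deformation) > contact_thresh    # contact geometry
    grad_y, grad_x = np.gradient(deformation)              # edge / texture cues
    force_proxy = np.abs(deformation[contact_mask]).sum()  # crude normal-force estimate
    shear_proxy = (grad_x.sum(), grad_y.sum())             # crude tangential cue
    return contact_mask, force_proxy, shear_proxy

# Fake 64x64 monochrome frames: a circular indentation in the center.
ref = np.full((64, 64), 100, dtype=np.uint8)
yy, xx = np.mgrid[:64, :64]
press = ref.copy()
press[(yy - 32) ** 2 + (xx - 32) ** 2 < 100] += 40
mask, force, shear = tactile_readout(ref, press)
print(mask.sum(), "pixels in contact, force proxy =", force)
```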

6. Open-Sourcing 10,000 Hours of Data: Accelerating AI
To speed up real‑world deployment of embodied AI, DAIMON has open‑sourced 10,000 hours of its tactile‑rich data. This move is aimed at researchers and startups who may not have the resources to collect massive datasets from scratch. By providing a ready‑to‑use benchmark for tactile‑enabled manipulation, DAIMON hopes to drive innovation in robotic learning, especially for dexterous tasks. The open‑source data includes recordings from over 80 scenarios and 2,000 human skills—all annotated with tactile signals. This is a strategic choice to build an ecosystem around VTLA, as more data leads to better models and faster adoption in industries like logistics and healthcare.
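For anyone planning to build on the release, a loader might look like the sketch below. The directory layout, manifest file name, and metadata fields here are purely hypothetical; check the actual release for the real format.

```python
import json
from pathlib import Path

# Hypothetical layout (one JSON manifest per episode folder), used only to
# illustrate filtering for tactile-annotated episodes.
def iter_tactile_episodes(root):
    for manifest in sorted(Path(root).glob("*/episode.json")):
        meta = json.loads(manifest.read_text())
        if "tactile" in meta.get("modalities", []):
            yield meta

for ep in iter_tactile_episodes("daimon_infinity_open/"):
    print(ep.get("skill"), ep.get("scenario"), ep.get("duration_s"))
```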
7. Real-World Applications: From Hotels to Factories
Where will touch-enabled robots first make an impact? According to Prof. Wang, the initial deployments are likely to be in commercial environments across China, such as hotels, convenience stores, and factories. For example, a robot that can feel the texture of a towel can fold laundry neatly without tearing it. In retail, tactile sensing lets robots handle fragile items like eggs or glass bottles. In manufacturing, they can assemble components that require precise force control: think inserting a microchip or tightening a bolt. The dataset already covers many of these scenarios, so the path from lab to real world is shortening. As the technology matures, we may see tactile-aware robots in homes, assisting with cooking, cleaning, and caregiving.
8. Collaborative Efforts: Global Partners Driving Progress
No robot revolution happens in isolation. DAIMON’s Daimon-Infinity dataset is supported by a network of leading institutions including Google DeepMind, Northwestern University, and the National University of Singapore. These partners bring expertise in AI, materials science, and manipulation planning. The collaboration also spans multiple Chinese universities, creating a distributed data collection network that runs continuously. By pooling resources, the consortium aims to generate millions of hours of tactile data each year—far more than any single lab could achieve. This open, collaborative model mirrors the approach that made large language models successful, and it could be the key to unlocking general‑purpose robot dexterity.
DAIMON Robotics is giving robot hands a sense of touch—and with the Daimon-Infinity dataset, they’re handing the keys to the entire AI community. From Prof. Wang’s four decades of research to the cutting‑edge VTLA architecture, every piece of this puzzle works toward one goal: making robots that can feel the world as we do. Whether it’s folding laundry in a hotel or assembling electronics in a factory, tactile‑aware robots are on the verge of leaving the lab and entering our daily lives. The future of manipulation isn’t just visual—it’s tactile, and DAIMON is leading the charge.