The Event
Hong Kong-based DAIMON Robotics has released Daimon-Infinity, described on its official website as the world’s largest multimodal robotic dataset featuring high-resolution tactile sensing, designed to accelerate the deployment of general-purpose robotic foundation models.
Data & Context
- DAIMON Robotics claims Daimon-Infinity is the 'world's largest robotic tactile dataset,' combining high-resolution tactile sensing with multimodal inputs (according to the DAIMON Robotics website)
- The dataset is being open-sourced to enable training of general-purpose robotic grasping models (according to Xeber.world)
- Most industrial robots still rely on pre-programmed rules or low-resolution visual feedback; tactile feedback is used in fewer than 5% of real-world deployments (industry observation, no public data)
Hongshugu Analysis
Most assume robot grasping is limited by sensor resolution or mechanical design, but the real bottleneck is the scale and diversity of tactile data. Daimon-Infinity does more than improve resolution: it captures tens of thousands of real-world grasps as synchronized streams of touch, vision, and force, letting models learn the implicit relationships between material compliance, pressure distribution, and slip dynamics, patterns that rule-based systems cannot encode. Another indicator: most collaborative-robot vendors still train on simulated data or small, manually labeled samples, yielding failure rates above 40% in unstructured environments, while systems trained on proprietary high-quality tactile datasets have cut failure rates to under 15% in open settings. Within the next 12 to 18 months, buyers will stop asking 'how dexterous is it?' and start asking 'what data did it learn from?'
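The 'synchronized streams' framing can be made concrete with a minimal sketch. The schema below is purely illustrative: the field names, array shapes, and the toy slip heuristic are assumptions for exposition, not DAIMON's published format or method.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class GraspFrame:
    """One time step of a grasp episode (all fields hypothetical).

    Streams share a timestamp so a model can learn cross-modal
    relationships, e.g. how the pressure map changes just before
    visual evidence of slip appears.
    """
    timestamp_s: float
    tactile: np.ndarray   # (H, W) per-taxel pressure map
    rgb: np.ndarray       # (h, w, 3) wrist-camera image
    wrench: np.ndarray    # (6,) force/torque at the wrist [N, N*m]

def slip_onset(frames, drop_ratio=0.3):
    """Toy heuristic: flag the first frame where total tactile
    pressure drops sharply versus the previous frame, a crude proxy
    for slip. Data-driven models learn this implicitly instead."""
    prev = None
    for i, f in enumerate(frames):
        total = float(f.tactile.sum())
        if prev is not None and prev > 0 and total < prev * (1 - drop_ratio):
            return i
        prev = total
    return None

# Synthetic 3-frame episode: stable grip, then pressure collapses.
rng = np.random.default_rng(0)
frames = [
    GraspFrame(0.00, np.full((4, 4), 1.00), rng.random((8, 8, 3)), np.zeros(6)),
    GraspFrame(0.05, np.full((4, 4), 0.95), rng.random((8, 8, 3)), np.zeros(6)),
    GraspFrame(0.10, np.full((4, 4), 0.20), rng.random((8, 8, 3)), np.zeros(6)),
]
print(slip_onset(frames))  # slip flagged at frame index 2
```

The point of the sketch is the contrast it makes visible: the hand-written threshold rule only sees one signal and one fixed cutoff, whereas a model trained on large synchronized episodes can condition slip prediction on vision, force, and pressure jointly.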
As an early mover, DAIMON Robotics is building an industry standard through open data, much as Waymo did in autonomous driving. Its strategy is not selling hardware but locking in downstream model developers through a data ecosystem. For Chinese enterprises and industrial capital, the implication is concrete: when innovation teams at Huawei, BYD, or Haier evaluate robotic arms, suppliers that cannot disclose the source, scale, and update mechanism of their training data should be excluded from shortlists, however impressive their hardware specs. Tactile data is becoming the 'digital genome' of robotic systems; firms with a closed data loop will increasingly monopolize high-value deployments.
The winner in robotic grasping will be defined not by fingers but by data, because physical AI is the digitization of embodied experience; the market will split between systems trained on data and systems programmed with rules.
Reference: IEEE Spectrum


