Powering the Metaverse with Synthetic Data

The Datagen Platform provides high-quality, perfectly annotated 2D and 3D visual data for the computer vision tasks needed to build seamless, immersive experiences in AR/VR and the Metaverse.


As AR/VR and Metaverse products move toward wide market adoption, they require new capabilities that let humans interact seamlessly with the digital world. These include interacting with virtual objects, optimizing on-device rendering through accurate eye gaze estimation, representing users as photorealistic avatars, and creating a stable 3D digital overlay on top of the real world.


AI models are key to enabling these capabilities.


Obtaining visual data to train these AI models is extremely challenging. It requires a vast amount of face and full-body data with accurate 3D annotations for tasks such as hand pose and mesh estimation, full-body pose estimation, gaze analysis, SLAM, 3D environment reconstruction, and codec avatar creation. Manual collection and annotation of such data is slow, expensive, and hard to scale; it also raises privacy concerns and typically lacks the 3D annotations these tasks require.
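One reason synthetic data can carry exact 2D labels is that every frame is rendered with known camera parameters, so 3D ground-truth keypoints project deterministically into the image. A minimal sketch of that idea under an ideal pinhole camera model (the function and parameter names here are illustrative, not Datagen's API):

```python
import numpy as np

def project_points(points_3d, fx, fy, cx, cy):
    """Project 3D points (camera coordinates, z > 0) to 2D pixel coordinates
    using an ideal pinhole model: u = fx * x / z + cx, v = fy * y / z + cy."""
    points_3d = np.asarray(points_3d, dtype=float)
    x, y, z = points_3d[:, 0], points_3d[:, 1], points_3d[:, 2]
    u = fx * x / z + cx
    v = fy * y / z + cy
    return np.stack([u, v], axis=-1)

# A point on the optical axis projects to the principal point (cx, cy).
print(project_points([[0.0, 0.0, 1.0]], fx=600, fy=600, cx=320, cy=240))  # → [[320. 240.]]
```

Because the geometry is known exactly at render time, the resulting 2D keypoints contain no labeling noise, unlike manual annotation.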


Datagen data includes accurate representations of dynamic actions such as grasping, hand gestures, and eye movements. Teams can use the platform to generate simulated face and full-body data, letting them iterate quickly on their models and improve performance.

Use case examples

Egocentric Action Recognition

Scenario recognition (walking up to a door, cooking food, taking off the device) / Picking up an object / Typing on a virtual or physical keyboard

Eye Gaze Estimation

Eye gaze in dynamic sequences / Eyes focusing on a certain point
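Synthetic gaze data comes with exact 3D gaze direction vectors, so gaze models can be evaluated directly in angular terms. A minimal sketch of the standard angular-error metric between a predicted and a ground-truth gaze vector (a generic illustration, not Datagen-specific code):

```python
import numpy as np

def gaze_angular_error_deg(pred, gt):
    """Angular error in degrees between predicted and ground-truth 3D gaze vectors."""
    pred = pred / np.linalg.norm(pred, axis=-1, keepdims=True)
    gt = gt / np.linalg.norm(gt, axis=-1, keepdims=True)
    cos = np.clip(np.sum(pred * gt, axis=-1), -1.0, 1.0)  # clip guards against rounding
    return np.degrees(np.arccos(cos))

print(gaze_angular_error_deg(np.array([0.0, 0.0, 1.0]), np.array([0.0, 0.0, 1.0])))  # → 0.0
print(gaze_angular_error_deg(np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])))  # approximately 90 degrees
```

Reporting error in degrees makes results comparable across devices with different resolutions and eye-camera placements.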

Hand Tracking & Gesture Recognition

Dynamic gestures (e.g. pinching, grasping, or closing a fist) / Multi-hand tracking and pose estimation / 3D hand mesh reconstruction
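Hand-tracking and pose-estimation models trained on such data are commonly scored with the PCK metric (percentage of correct keypoints): a predicted joint counts as correct if it lies within a distance threshold of the ground-truth joint. A minimal, library-agnostic sketch:

```python
import numpy as np

def pck(pred_kp, gt_kp, threshold):
    """PCK: fraction of keypoints whose Euclidean distance to the
    ground-truth keypoint is at most `threshold` (same units as the inputs)."""
    pred_kp = np.asarray(pred_kp, dtype=float)
    gt_kp = np.asarray(gt_kp, dtype=float)
    dists = np.linalg.norm(pred_kp - gt_kp, axis=-1)
    return float(np.mean(dists <= threshold))

# Two of the three keypoints fall within 5 px of the ground truth.
pred = [[10.0, 10.0], [52.0, 48.0], [100.0, 120.0]]
gt = [[10.0, 12.0], [50.0, 50.0], [90.0, 90.0]]
print(pck(pred, gt, threshold=5.0))  # → 0.6666666666666666
```

With synthetic data the ground-truth keypoints are exact by construction, so the metric measures only model error, not annotator disagreement.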

Why Datagen for AR / VR / Metaverse

The right domain-specific data

Controllable camera devices

Automatic annotations

3D ground truth annotations and perfect 2D quality

Controllable data generation

Frictionless granular control by the computer vision engineer

Zero biases in the data distribution

Define the distribution of every part of the data explicitly, free of the biases inherent in manually collected datasets

Serving top Fortune 500 firms across AR/VR/Metaverse, Automotive, and IoT Security