Humans for In-Cabin Automotive

Recognize and Alert Drowsy or Distracted Drivers

In-Cabin Humans In Motion

Obtaining visual data to train high-quality in-cabin Driver Monitoring Systems (DMS) and Occupant Monitoring Systems (OMS) is incredibly challenging. Not only is the data itself highly complex, involving humans interacting with specific automotive environments, but it is increasingly difficult to capture given the fast-changing landscape of privacy regulation. Because DMS and OMS will be broadly deployed, it is also critical that the data is high-variance and sufficiently diverse to avoid bias.

The Datagen Platform provides high-quality, perfectly annotated visual data in the form of videos and images for tasks related to in-cabin driver action analysis, body keypoint estimation, gaze analysis, and hand pose analysis.

This includes accurate representations of the in-cabin environment with advanced motion and human-object interactions. Teams can use the platform to generate simulated face and full-body data to quickly iterate on their models and improve performance.
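For teams wiring this data into a training pipeline, a minimal sketch of consuming the generated annotations might look like the following (written in Python; the file layout, field names, and thresholds are illustrative assumptions, not the platform's actual annotation schema):

import json
import math
from pathlib import Path

# Hypothetical layout: one JSON file per generated frame, containing a gaze
# direction vector, an eye-openness value, and the path of the rendered image.
GAZE_YAW_LIMIT_DEG = 30.0  # assumed threshold for "eyes off the road"

def gaze_yaw_degrees(gaze_vector):
    """Yaw of a forward-facing gaze vector (x: right, z: forward)."""
    x, _, z = gaze_vector
    return math.degrees(math.atan2(x, z))

def label_frame(annotation):
    """Derive a coarse attention label from synthetic ground truth."""
    if annotation["eye_openness"] < 0.2:  # assumed 0..1 range
        return "drowsy"
    if abs(gaze_yaw_degrees(annotation["gaze_vector"])) > GAZE_YAW_LIMIT_DEG:
        return "distracted"
    return "attentive"

def load_labels(annotation_dir):
    """Build (frame_path, label) pairs for a downstream DMS classifier."""
    pairs = []
    for path in sorted(Path(annotation_dir).glob("*.json")):
        annotation = json.loads(path.read_text())
        pairs.append((annotation["frame_path"], label_frame(annotation)))
    return pairs

if __name__ == "__main__":
    for frame, label in load_labels("synthetic_cabin_annotations"):
        print(frame, label)

Because every simulated frame ships with exact ground truth, coarse labels like these can be derived programmatically rather than hand-annotated.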

Use case examples

Attention Analysis

The driver is falling asleep, or is turning and looking around.

Driving Distractions

The driver is playing with a mobile phone or is drinking.

Pose Analysis

The driver has both hands on the wheel, or is touching their face.

Why Datagen for In-Cabin Automotive

Driver Monitoring Systems (DMS)

DMS consist of small in-cabin cameras and sensors that monitor driver behavior and issue alerts or warnings when drivers show signs of drowsiness, distraction, or inattention.

Occupant Monitoring Systems (OMS)

OMS understand the occupants’ needs: they allow occupants to control media, recognize when a cell phone has been left in the car, and, in the case of an accident, adapt airbags to the occupants’ body shape and position.

Serving top Fortune 500 firms across AR/VR/Metaverse, Automotive, and IoT Security