The MIND Standard

A new, open data standard for studying human behaviour in context, connecting XR systems, embodied AI, robotics, and neurophysiology.

The Challenge

The advent of immersive devices and wearable biosensors offers unprecedented access to naturalistic behavioural data. Combined with advances in machine learning, such data promise new insights into human behaviour and could power AI systems that understand real-world environments. Yet integrating information across devices, platforms, and experiments remains a challenge:

  • Fragmented data formats across recording systems
  • Lack of standardized data and metadata schemas
  • Time-consuming data preprocessing and conversion steps
  • Inconsistent data quality and recording conventions
  • Difficulty sharing datasets between research groups
  • Limited interoperability with AI training pipelines

Our Solution

MIND addresses these challenges with a comprehensive data standard that puts researchers first.

Fully Open

Open by design, MIND ensures that researchers can freely adopt and build upon the standard.

Optimized for AI

Transforms complex multimodal data into a structured representation ready for AI model training.

Community-Driven

Developed with input from researchers and industry partners to meet real-world needs.

Engine Ready

Optimized for 3D experimental environments, with frameworks for virtual and augmented reality.

What is MIND?

MIND is designed to capture the full complexity of multimodal behavioural and embodied data through four core principles.

M

Multimodal

Captures diverse data streams including motion, gaze, physiological signals, environmental context, and user interactions in a unified format.

I

Interoperable

Integrates with existing tools, platforms, and AI frameworks while supporting conversion between different data formats.

N

N-dimensional

Defines the shape and dimensionality of multimodal data structures.

D

Dynamics

Coordinates synchronization and feedback across hardware components.
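The MIND specification itself is not yet published, so the details below are purely illustrative. As a sketch of what the four principles could look like in practice, the following Python snippet (with hypothetical names such as `Stream` and `resample_to`) represents two modalities recorded at different sample rates as N-dimensional arrays and aligns them onto a shared timeline, the kind of synchronization the Dynamics principle describes:

```python
from dataclasses import dataclass

import numpy as np


@dataclass
class Stream:
    """One modality: an (n_samples, n_channels) array with per-sample timestamps."""
    name: str
    timestamps: np.ndarray  # shape (n_samples,), seconds
    data: np.ndarray        # shape (n_samples, n_channels)


def resample_to(stream: Stream, target_ts: np.ndarray) -> np.ndarray:
    """Linearly interpolate each channel onto a shared timeline."""
    return np.stack(
        [np.interp(target_ts, stream.timestamps, stream.data[:, c])
         for c in range(stream.data.shape[1])],
        axis=1,
    )


# Two modalities captured at different native rates:
gaze = Stream("gaze", np.linspace(0, 1, 120), np.random.rand(120, 2))  # 120 Hz, (x, y)
ppg = Stream("ppg", np.linspace(0, 1, 64), np.random.rand(64, 1))      # 64 Hz, 1 channel

# Align both onto a common 100 Hz timeline, yielding one unified record.
timeline = np.linspace(0, 1, 100)
record = {s.name: resample_to(s, timeline) for s in (gaze, ppg)}
print(record["gaze"].shape, record["ppg"].shape)  # (100, 2) (100, 1)
```

A real standard would additionally specify metadata (units, coordinate frames, device provenance) for each stream; this sketch only shows the shape and synchronization aspects.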

Get Involved

Join us in building the future of multimodal behavioural and embodied data standards. Whether you're a researcher, developer, or organization, we want to hear from you.

  • Early access to the specification
  • Invitation to community working groups
  • Access to development tools and SDKs
  • Updates on partnerships and adoption

Stay Updated

Be the first to know when MIND launches and get involved in shaping the standard.