The Challenge
The advent of immersive devices and wearable biosensors offers unprecedented access to naturalistic behavioural data. Combined with advances in machine learning, these data promise new insights into human behaviour and the development of AI systems capable of understanding real-world environments. Yet integrating information across devices, platforms, and experiments remains a challenge:
- Fragmented data formats across recording systems
- Lack of standardized data and metadata schemas
- Time-consuming data preprocessing and conversion steps
- Inconsistent data quality and recording conventions
- Difficulty sharing datasets between research groups
- Limited interoperability with AI training pipelines
Our Solution
MIND addresses these challenges with a comprehensive data standard that puts researchers first.
Fully Open
Open by design, MIND ensures that researchers can freely adopt and build upon the standard.
Optimized for AI
Transforms complex multimodal data into a structured representation ready for AI model training (sketched after these highlights).
Community-Driven
Developed with input from researchers and industry partners to meet real-world needs.
Engine Ready
Optimized for 3D experimental environments, including those built with virtual and augmented reality frameworks.
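To give a feel for what "optimized for AI" can mean in practice, here is a minimal sketch of aligning heterogeneous recordings for model training. MIND's actual schema and tooling are not yet published, so every stream name, sampling rate, and helper function below is a hypothetical illustration, not the standard's API:

```python
import numpy as np

# Hypothetical streams: names and rates are illustrative, not MIND's schema.
gaze_t = np.arange(0.0, 10.0, 1 / 120)      # 120 Hz eye tracker timestamps (s)
gaze = np.random.rand(len(gaze_t), 2)       # (x, y) gaze position per sample
hr_t = np.arange(0.0, 10.0, 1 / 4)          # 4 Hz heart-rate timestamps (s)
hr = 60 + 5 * np.random.rand(len(hr_t), 1)  # beats per minute

def resample(t, x, clock):
    """Linearly interpolate each channel of x onto the shared clock."""
    return np.stack([np.interp(clock, t, x[:, c]) for c in range(x.shape[1])],
                    axis=1)

# Align both streams on one 60 Hz clock, then stack channels into a single
# (time, features) array that a model-training pipeline can consume directly.
clock = np.arange(0.0, 10.0, 1 / 60)
features = np.concatenate([resample(gaze_t, gaze, clock),
                           resample(hr_t, hr, clock)], axis=1)
print(features.shape)  # (600, 3): 10 s at 60 Hz, 2 gaze + 1 heart-rate channel
```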
What is MIND?
MIND is designed to capture the full complexity of multimodal behavioural and embodied data through four core principles, made concrete in the sketch that follows them.
Multimodal
Captures diverse data streams including motion, gaze, physiological signals, environmental context, and user interactions in a unified format.
Interoperable
Integrates with existing tools, platforms, and AI frameworks while supporting conversion between different data formats.
N-dimensional
Defines the shape and dimensionality of multimodal data structures.
Dynamics
Coordinates synchronization and feedback across hardware components.
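To make these four principles concrete, the sketch below shows one way such a recording could be organized. It is an assumption-laden illustration: MIND's real schema is unreleased, and the Stream and Recording structures here are invented for this example:

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class Stream:
    """One modality: an N-dimensional sample array plus per-sample timestamps."""
    data: np.ndarray        # shape (n_samples, ...) of any rank  -> N-dimensional
    timestamps: np.ndarray  # seconds on a shared session clock   -> Dynamics
    units: str

@dataclass
class Recording:
    """A session bundling heterogeneous streams and plain metadata."""
    streams: dict = field(default_factory=dict)   # name -> Stream  -> Multimodal
    metadata: dict = field(default_factory=dict)  # JSON-friendly   -> Interoperable

rec = Recording(metadata={"subject": "S01", "task": "navigation"})
rec.streams["motion"] = Stream(
    data=np.zeros((1000, 17, 3)),            # 1000 samples x 17 joints x xyz
    timestamps=np.linspace(0.0, 10.0, 1000),
    units="m",
)
rec.streams["gaze"] = Stream(
    data=np.zeros((1200, 2)),                # 1200 samples x (x, y)
    timestamps=np.linspace(0.0, 10.0, 1200),
    units="deg",
)
```

The shared timestamp axis is what lets downstream tools align streams recorded at different rates, as in the training sketch above.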
Get Involved
Join us in building the future of multimodal behavioural and embodied data standards. Whether you're a researcher, developer, or organization, we want to hear from you.
Stay Updated
Be the first to know when MIND launches and get involved in shaping the standard.
