OSMOSIS: Open-Source Multi-Organizational Collaborative Training for Societal-Scale AI Systems

Note: For details on continuing research on this topic, see the project page for Co-Ops: Collaborative Open-Source and Privacy-Preserving Training for Learning to Drive.

The goal of our project is to develop a novel framework and cloud-based implementation for facilitating collaboration across highly heterogeneous research, development, and educational settings. Currently, AI models for real-world intelligent systems are rarely trained as part of a collaborative process spanning multiple entities, yet collaboration among companies and institutions can increase both model robustness and resource efficiency. To enable more efficient development of AI systems at massive scale, we propose a general framework for AI model sharing, together with incentivization structures, that supports seamless collaboration across diverse models, devices, use cases, and underlying data distributions. By sharing AI models in a distributed, secure, privacy-preserving, and incentivized manner, the proposed framework significantly reduces system development costs while increasing system robustness and scalability.
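To make the idea of privacy-preserving, multi-organizational model sharing concrete, the sketch below illustrates one common pattern for this kind of collaboration: federated averaging, where each organization trains on its own private data and only model weights are exchanged. This is only an illustrative assumption for exposition; the section above does not specify the project's actual training protocol, and all names here (Organization, federated_round, NUM_ORGS, etc.) are hypothetical.

import numpy as np

NUM_ORGS = 3   # hypothetical number of collaborating organizations
DIM = 4        # feature dimension of the toy linear model
ROUNDS = 5     # number of collaborative training rounds

rng = np.random.default_rng(0)


class Organization:
    """One participant: keeps its raw data private, shares only model weights."""

    def __init__(self, n_samples: int):
        # Private local data; it never leaves the organization in this sketch.
        self.X = rng.normal(size=(n_samples, DIM))
        true_w = np.arange(1, DIM + 1, dtype=float)
        self.y = self.X @ true_w + rng.normal(scale=0.1, size=n_samples)

    def local_update(self, w: np.ndarray, lr: float = 0.05, steps: int = 10) -> np.ndarray:
        """Refine the shared weights on local data with plain gradient descent."""
        w = w.copy()
        for _ in range(steps):
            grad = 2 * self.X.T @ (self.X @ w - self.y) / len(self.y)
            w -= lr * grad
        return w


def federated_round(orgs, w_global):
    """Each organization trains locally; the coordinator averages the returned weights."""
    local_weights = [org.local_update(w_global) for org in orgs]
    return np.mean(local_weights, axis=0)


if __name__ == "__main__":
    orgs = [Organization(n_samples=50 + 25 * i) for i in range(NUM_ORGS)]
    w = np.zeros(DIM)
    for r in range(ROUNDS):
        w = federated_round(orgs, w)
        print(f"round {r + 1}: shared weights = {np.round(w, 3)}")

In this toy setup, only the weight vectors cross organizational boundaries, which captures the privacy-preserving flavor of the collaboration; the incentivization and security mechanisms described above would sit on top of such an exchange and are not modeled here.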


Project details on arXiv


Principal Investigator: Eshed Ohn-Bar


Student Sun Zhang was awarded a Collaboratory Student Research Award to conduct research on the associated sub-project "D-COLLECTIVE: Democratized Data Collection and Collaborative Training for Extreme-Scale Autonomous Systems".

This project is supported by the Red Hat Collaboratory at Boston University.
