Our guest, Andrew Trask, is building OpenMined – a platform that merges cryptographic techniques, such as homomorphic encryption and multi-party computation, with blockchain technology to make it possible to train ML models on private user data. OpenMined will allow the AI companies of the future to develop models, have them trained on user data without compromising user privacy, and incentivise users to contribute to that training. We walk through the OpenMined vision and its potential impact on AI business models and AI safety.
A significant part of the modern digital economy is underpinned by machine learning models trained to perform tasks such as facial recognition, content curation, and health diagnostics. Data to train machine learning models is the essential commodity of this century – a sentiment captured by epithets such as “Data is the new oil”. In today’s dominant AI paradigm, companies focus their efforts on gathering data from their users in order to train models and monetise usage of those models. This paradigm has many consequences: loss of privacy for users, consolidation of data in a handful of large companies, poor access to data for startups, and a fundamental impossibility of collecting sensitive data such as markers for depression.
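To give a flavour of the multi-party computation techniques discussed in the episode, here is a minimal sketch of additive secret sharing, one of MPC's basic building blocks. This is purely illustrative and is not OpenMined's actual API; the function names and the choice of modulus are assumptions made for the example.

```python
import random

# Illustrative field modulus (a Mersenne prime); real protocols
# choose parameters to suit the computation being performed.
PRIME = 2**31 - 1

def share(secret, n_parties=3):
    """Split `secret` into n additive shares mod PRIME.

    Any proper subset of the shares is uniformly random and
    reveals nothing about the secret on its own.
    """
    shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
    shares.append((secret - sum(shares)) % PRIME)
    return shares

def reconstruct(shares):
    """Recombine all shares to recover the secret."""
    return sum(shares) % PRIME

# Because sharing is additive, two users' private values can be
# summed share-by-share, so an aggregator only ever sees the total:
a_shares = share(42)
b_shares = share(58)
sum_shares = [(x + y) % PRIME for x, y in zip(a_shares, b_shares)]
total = reconstruct(sum_shares)  # 100, without revealing 42 or 58
```

The same additivity is what lets model updates computed on private data be aggregated without any single party seeing an individual user's contribution.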
Topics covered in this episode:
Episode links:
This episode is hosted by Brian Fabian Crain and Meher Roy. Show notes and listening options: epicenter.tv/217