PHOTO: ELIEL KILKKI
A new machine learning method developed by researchers at the University of Helsinki, Aalto University and Waseda University in Tokyo can learn from data held, for example, on cell phones while guaranteeing the privacy of data subjects.
Modern AI is based on machine learning, which creates models by learning from data. The data used in many applications, such as health and human behaviour, is private and needs protection. New privacy-aware machine learning methods have recently been developed based on the concept of differential privacy. They guarantee that the published model or result reveals only limited information about any individual data subject.
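To make that guarantee concrete, the sketch below shows the Laplace mechanism, one of the basic building blocks of differential privacy: each individual's contribution is clipped to a known range and the result is perturbed with calibrated noise, so the output changes only slightly whether or not any single person's data is included. This is a generic illustration with assumed parameters (the value range and the privacy budget epsilon), not the researchers' specific method.

```python
# Minimal sketch of the Laplace mechanism for a differentially private mean.
# The range [lower, upper] and the privacy budget epsilon are assumed values
# chosen for illustration.
import numpy as np

def private_mean(values, lower, upper, epsilon, seed=None):
    """Differentially private estimate of the mean of `values`.

    Clipping bounds each person's influence on the mean to
    (upper - lower) / n, and Laplace noise scaled to that sensitivity
    divided by epsilon hides any single contribution.
    """
    rng = np.random.default_rng(seed)
    values = np.asarray(values, dtype=float)
    clipped = np.clip(values, lower, upper)
    sensitivity = (upper - lower) / len(values)
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return clipped.mean() + noise

# Example: a private estimate of average daily phone screen time (hours).
screen_time = [2.5, 4.0, 3.2, 6.1, 1.8]
print(private_mean(screen_time, lower=0.0, upper=12.0, epsilon=1.0))
```

A smaller epsilon means more noise and a stronger privacy guarantee; the research challenge is keeping models accurate under that noise, especially on small datasets.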
Privacy-aware machine learning
“Previously you needed one party with unrestricted access to all the data. Our new method enables learning accurate models, for example using data on user devices, without the need to reveal private information to any outsider”, Assistant Professor Antti Honkela of the University of Helsinki says.
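The quote describes a distributed setting in which raw data never leaves the devices. As a rough illustration of the idea only, not the researchers' actual protocol, the toy example below has each device clip and perturb its own value before sharing it, so the aggregator only ever sees noisy reports; the names and parameters here are assumptions made for the sketch.

```python
# Toy sketch of locally perturbed reports from user devices.
# The aggregator receives only noisy values, never the raw data.
import numpy as np

def local_report(value, lower, upper, epsilon, rng):
    """A device clips its private value and adds Laplace noise before sending."""
    clipped = float(np.clip(value, lower, upper))
    sensitivity = upper - lower          # one user's maximum influence
    return clipped + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

def aggregate(reports):
    """The server averages the noisy reports; much of the noise cancels out."""
    return float(np.mean(reports))

rng = np.random.default_rng(0)
device_values = [3.1, 5.4, 2.2, 7.8, 4.9]     # private values held on devices
reports = [local_report(v, 0.0, 10.0, epsilon=1.0, rng=rng) for v in device_values]

print("noisy average seen by the server:", round(aggregate(reports), 2))
print("true average (never shared):", np.mean(device_values))
```

Perturbing each report independently costs more accuracy than adding noise once centrally, which is one reason distributed approaches often combine differential privacy with cryptographic techniques for secure aggregation.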
The group of researchers at the University of Helsinki and Aalto University, Finland, has applied the privacy-aware methods, for example, to predicting cancer drug efficacy from gene expression data.
“We have developed these methods with funding from the Academy of Finland for a few years, and now things are starting to look promising. Learning from big data is easier, but now we can also get results from smaller data”, Academy Professor Samuel Kaski of Aalto University says.
Learn more: New AI method keeps data private
The Latest on: Differential privacy
- What your company needs to understand about digital privacy (but probably doesn't) (August 3, 2022 at 5:23 am)
Digital privacy is often framed as an issue for consumers, but Ruslan Momot argues that companies need to consider the concept as a key element of their business.
- New Research on the Relationship Between Facebook Friendships and Economic Opportunity (August 1, 2022 at 5:30 am)
Harvard’s Opportunity Insights research group published new research using privacy-protected data on Facebook friendships to better understand the connection between social networks and economic ...
- Deep Dive: What will be the impact of a non-personal data framework on M2M communication? (July 29, 2022 at 2:50 am)
We talk to experts on how the framework governing Non Personal Data in India will impact M2M communication, and the concerns they have ...
- Surprise! The metaverse is going to suck for privacy (July 29, 2022 at 12:35 am)
More thought – or at least some thought – needs to be given to privacy protection in the promised metaverse of connected 3D virtual-reality worlds, experts have concluded. In a paper distributed via ...
- Does data anonymization really hide your identity? (July 10, 2022 at 3:37 am)
While most of the scenarios Xu discussed concern network privacy specifically, companies such as Apple and Google are trying new methods to protect user data including differential privacy ...
- Ant Group Makes Its Privacy-Preserving Computation Framework Open Source (July 6, 2022 at 12:06 am)
Ant Group has announced that its privacy-preserving Computation Framework (or the “Framework”) is now open source. This move aims to make the technologies more accessible to global developers ...
- Ant Group open sources privacy software (July 4, 2022 at 5:00 pm)
It integrates a range of Ant Group’s privacy computing technologies and covers the whole information life cycle, including secure multi-party computation, differential privacy, and homomorphic ...
- Ant Group's Privacy-Preserving Computation Framework Becomes Open Source (July 4, 2022 at 5:51 am)
Ant Group today announced that its privacy-preserving Computation Framework (or the “Framework”) becomes open source, aiming to make the technologies more accessible to global developers and speed up ...
- Individual re-identification from incomplete datasets protected by differential privacy (June 30, 2022 at 4:59 pm)
It is essential to remove explicit identifiers, sample population data, and apply differential privacy, which is the de facto standard privacy metric used by companies such as Google, Apple ...
- Microsoft and Harvard collaborate on differential privacy (June 25, 2022 at 5:00 pm)
Microsoft and the OpenDP Initiative at Harvard have collaborated on a new platform that will offer differential privacy for large datasets. Differential privacy allows researchers to analyze ...
via Google News and Bing News