New AI method uses differential privacy to keep data private

A new machine learning method developed by researchers at the University of Helsinki, Aalto University and Waseda University in Tokyo can, for example, make use of data on cell phones while guaranteeing the privacy of data subjects.

Modern AI is based on machine learning, which creates models by learning from data. The data used in many applications, such as health and human behaviour, is private and needs protection. New privacy-aware machine learning methods based on the concept of differential privacy have recently been developed. They guarantee that a published model or result reveals only limited information about each data subject.
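
To make the idea concrete, the sketch below shows the classic Laplace mechanism, a textbook building block of differential privacy: calibrated random noise is added to a published statistic so that the output reveals only limited information about any single data subject. This is a generic illustration, not the researchers' own method, and all names and parameter values are chosen for the example.

```python
import numpy as np

def laplace_mechanism(true_value, sensitivity, epsilon, rng=None):
    """Release a noisy statistic satisfying epsilon-differential privacy.

    sensitivity: how much the statistic can change if one person's
                 data is added or removed.
    epsilon:     the privacy budget; smaller values mean stronger privacy.
    """
    rng = rng or np.random.default_rng()
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise

# Example: publish the mean of a sensitive attribute bounded in [0, 1]
# for n people. Changing one record shifts the mean by at most 1/n.
data = np.random.default_rng(0).random(1000)
n = len(data)
private_mean = laplace_mechanism(data.mean(), sensitivity=1.0 / n, epsilon=0.5)
print(f"true mean: {data.mean():.4f}, private release: {private_mean:.4f}")
```

The smaller the privacy budget epsilon, the more noise is added and the less any one person's data can influence what an observer learns from the released value.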

Privacy-aware machine learning

“Previously, you needed one party with unrestricted access to all the data. Our new method enables learning accurate models using, for example, data on user devices, without the need to reveal private information to any outsider,” says Assistant Professor Antti Honkela of the University of Helsinki.
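
One common pattern for this kind of decentralised, privacy-aware learning (a minimal sketch of the general idea, not necessarily the method developed by the group) is for each device to clip and perturb its own model update before anything leaves the device, so the party that combines the updates never sees raw data. The function names and parameters below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

def local_private_update(local_gradient, clip_norm, noise_scale):
    """Clip a device's model update and add Gaussian noise on the device itself."""
    norm = np.linalg.norm(local_gradient)
    clipped = local_gradient * min(1.0, clip_norm / max(norm, 1e-12))
    return clipped + rng.normal(0.0, noise_scale * clip_norm, size=clipped.shape)

# Each of 100 simulated devices perturbs its own update locally;
# the server only ever sees the noisy updates and their average.
device_updates = [rng.normal(size=10) for _ in range(100)]
noisy_updates = [local_private_update(g, clip_norm=1.0, noise_scale=1.0)
                 for g in device_updates]
aggregated_update = np.mean(noisy_updates, axis=0)
print(aggregated_update.round(3))
```

Averaging over many devices cancels much of the added noise, which is why accurate models can still be learned even though no device ever reveals its raw data.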

The group of researchers at the University of Helsinki and Aalto University in Finland has applied the privacy-aware methods, for example, to predicting cancer drug efficacy from gene expression data.

“We have developed these methods with funding from the Academy of Finland for a few years, and now things are starting to look promising. Learning from big data is easier, but now we can also get results from smaller data sets,” says Academy Professor Samuel Kaski of Aalto University.

Learn more: New AI method keeps data private
