GDPR isn’t dangerous for machine learning, says GDPR Delivery Manager

When it comes to machine learning and the upcoming GDPR, which comes into force on 25 May 2018, there is a widespread belief that the regulation might kill machine learning because it brings an obligation to explain the algorithm to the user. Some say it will stop deep learning completely, because you cannot explain how a deep learning system evolves even if you want to.

According to Can Huzmeli, GDPR Delivery Manager at ICAN Consultancy, GDPR will neither stop nor endanger machine learning or deep learning.

“GDPR is focusing on what data you used as the input to the system and who you share the data with as a result of your processing. The ‘how’ part is only related to security,” said Huzmeli.

Data processing

“As long as the way you do your data processing is secure in terms of privacy, you can use any algorithm. In this case, as long as you have a lawful basis for the input of your system, meaning the data you collect, then you are fine. If you also do not illegally share the output of your algorithm, then you are safe.”

Another discussion is around the area of sensitive data. Article 22, paragraph 4 of GDPR states that decisions which produce legal effects, or similarly significant effects, shall not be based on special categories of personal data (such as political opinions, religious beliefs, genetic data and biometric data).

Filtering sensitive data

He continued: “This does bring some extra burden for machine learning systems because they usually use crawlers to gather the data. However, they already use filters to clean the data. So, the only extra work to be added for GDPR will be filtering the sensitive data before it ends up in the data set.

“GDPR is bringing in the necessary awareness to the ecosystem and pointing software companies in the right direction by reminding them how big of a responsibility they have in the privacy arena.”
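To make the filtering step Huzmeli describes more concrete, a minimal sketch might look like the following Python snippet. The field names, the list of sensitive categories and the function names are purely illustrative assumptions, not part of any specific GDPR tooling or of ICAN Consultancy's approach; the point is simply that crawled records can be passed through one extra filter before they join a training data set.

```python
# Hypothetical sketch: drop special-category (sensitive) fields from crawled
# records before they are added to a training data set.
# Field names and the category list below are illustrative only.

SENSITIVE_FIELDS = {
    "political_opinion",
    "religious_belief",
    "genetic_data",
    "biometric_data",
    "health_data",
}

def strip_sensitive_fields(record: dict) -> dict:
    """Return a copy of the record with special-category fields removed."""
    return {key: value for key, value in record.items()
            if key not in SENSITIVE_FIELDS}

def build_training_set(crawled_records):
    """Apply the extra GDPR filter to records already cleaned by existing filters."""
    return [strip_sensitive_fields(record) for record in crawled_records]

if __name__ == "__main__":
    crawled = [
        {"user_id": 1, "age": 34, "religious_belief": "example-value"},
        {"user_id": 2, "age": 27},
    ]
    print(build_training_set(crawled))
    # -> [{'user_id': 1, 'age': 34}, {'user_id': 2, 'age': 27}]
```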

Huzmeli also noted that, in the next decade, machine learning and AI will be dominant in our daily lives, and we can’t risk a “free from all rules” approach when it comes to personal data.

Written by Leah Alger
