

AI, Big Data and Privacy: (Fr)enemies?

Once considered a dystopian concept, artificial intelligence and big data have swiftly settled into the everyday lives of many. As great technological developments tend to come hand in hand with significant legislative changes, or at least attempts at them, this case is no exception. Innovation typically collides with concerns, leading to intense talks and negotiations, with international regulators trying to grasp every aspect that might require legal guidance. In such an attempt, the EU envisioned the adoption of the so-called AI Act, hailed as “the world’s first comprehensive AI law”1. On another, yet similar, front, the EU recently also welcomed the adoption of the so-called Data Act, a fresh piece of legislation regulating access to and use of data generated through the use of connected products or related services2.

While the AI Act focuses on various aspects of the design and development of systems processing large volumes of data, so that they can be considered “safe” to use across markets, and the Data Act strives to grant users more control over the data generated by their connected devices, the basic nature of these systems – processing vast volumes of data to deliver tailored solutions – simultaneously makes them an elusive privacy hazard. Thus, the adoption of comprehensive frameworks with global influence has made experts, as well as the public, (justifiably) interested and concerned.

On the privacy front, the EU’s GDPR has been able to follow and guide new technological developments and related frameworks to a certain extent, thanks to its highly abstract wording and the frequent publication of area-specific guidance. However, the rapid uptake of AI and big data keeps raising the stakes: AI relies heavily on big data to feed its algorithms, and privacy risks grow with the surge in data volumes consumed by AI systems. As a result, practice shows that companies developing or relying on various AI solutions and working with big data (such as developers of connected-device technologies) frequently underestimate privacy-related legal risks, often exposing themselves to regulatory scrutiny.

Despite the new frameworks being established, a few topics remain particularly challenging to grasp from a privacy perspective:

  • Lack of transparency in automated decision-making – Automated decision-making involves complex algorithms that often lack transparency, making it difficult for individuals to understand how decisions are reached, contrary to the transparency requirements of data privacy laws;
  • “Light speed” cross-border transfers – Novel technologies require data transfers to be made at light speed, introducing complexities in ensuring compliance with stringent data privacy rules on international data transfers (especially where data is transferred to so-called “third countries”);
  • Appropriateness of user consent – The increasing reliance on machine learning models, which constantly evolve and adapt based on new data inputs, challenges the principle of purpose limitation under the GDPR (and the respective national laws), as the consent initially granted by individuals may not cover unforeseen uses or developments in the data processing lifecycle;
  • Algorithmic bias and discrimination – AI systems can inadvertently introduce biases and discriminatory outcomes, especially if trained on biased datasets, posing challenges for compliance with the basic notion of fairness of processing under international standards.

While we wait to see whether the privacy framework will regulate these issues specifically (at both the international and national levels) or whether we will have to rely on existing legislation, individuals and companies engaged in big data and AI should not underestimate the legal pitfalls, but rather take care to ground their technologies in a privacy-compliant manner.

 

The information in this document does not constitute legal advice on any particular matter and is provided for general informational purposes only.