Earlier this year, the European Union passed the General Data Protection Regulation – landmark legislation addressing data protection and privacy concerns. The regulation, which was ten years in the making, takes effect in 2018 and allows residents to decide how and when their data is collected and used. For example, if a supermarket issues a loyalty card that tracks your purchasing behavior, the regulation gives you, as a customer, rights to control how that data is used.
However, buried within the legislation are new rules governing how companies use “automated individual decision making, including profiling.” These are among the first formal restrictions on how automated software can be used to deliver data-driven services, and compliance could have a huge impact on the world’s tech companies.
Two key provisions in the regulation (Articles 13(2)(f) and 22(1) and (4)) address how businesses can use collected data. Most importantly, they give customers a “right to explanation” when their personal data is used to drive an automated service. Automated decisions that “significantly affect” EU citizens come under scrutiny, including any automated assessment of a person’s “performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.”
For example, if someone applies for a loan online and the application is turned down, he or she has the right to ask the system for an explanation. At this point, businesses don’t know what constitutes a valid explanation; but if “textual (linguistic) explanations” are required, they will need to find an effective way to produce them.
When a human expert makes a decision, he or she can provide a reason; but many businesses now use algorithms to decide. Mortgage and insurance applications are routinely processed automatically, so much of the financial services industry could be affected by this regulation.
If the law allowed it, the user’s right to an explanation could be fulfilled by publishing one standard explanation for all users. At this stage, businesses simply have no legal precedents to go on. But if they are legally required to provide “meaningful information about the logic involved” in decision making, businesses will look for algorithms that can auto-generate explanations.
These stipulations could have enormous implications for global Internet services. Take any social media network, such as Facebook or Instagram, that uses the personal data it has collected to drive targeted advertisements. If those advertisements result in individuals being discriminated against because of their personal data, the new regulation kicks in to protect them. Organizations worldwide are adopting unprecedented levels of business process automation (BPA), so their exposure to the new regulation could be significant: each automated process is a potential vulnerability. From the utility company that calculates your monthly direct debit, to the banking services you use, to the recommender systems that offer you specific products, discounts or vouchers, every such data-driven service falls within the scope of the regulation. These systems are deeply ingrained in organizations’ daily operations, and all of them will now need to explain, in real time, the reasoning behind the decisions they make.
Explaining what goes into these automated decisions is incredibly complex. The algorithms behind them often weigh millions of pieces of data, and how they sort through that data raises a whole new set of questions that are not yet easy to answer.
Natural language generation (NLG) is one possible solution for auto-generating the explanations necessary to comply with the General Data Protection Regulation. NLG is artificial intelligence technology that translates data into text. In real time, it can absorb large sets of structured data, analyze them, and draw conclusions – generating summaries, detailed reports and even recommendations in a colloquial, narrative format.
In this instance, NLG could be incorporated into existing algorithms (of the right kind) to analyze the data being used and generate an explanation in an easily digestible format, as if it were written by a human. The technology is designed to receive, analyze, and make sense of information in a way that resembles human reasoning. The final reports could be produced in a number of different languages for different audience segments. Because the new regulation empowers individuals to seek personalized explanations, the user-driven configurability of NLG technology is its key feature here.
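To make the idea concrete, here is a minimal data-to-text sketch in Python. It is not Arria NLG’s technology; the record fields, wording, and template are hypothetical, and real NLG systems are far more sophisticated. It simply shows how a structured decision record could be rendered as a short narrative explanation:

```python
# Minimal data-to-text sketch: render a structured decision record as a
# short narrative explanation. Field names and phrasing are hypothetical.

def narrate_decision(record: dict) -> str:
    """Turn a loan-decision record into a colloquial explanation."""
    outcome = "approved" if record["approved"] else "declined"
    reasons = ", ".join(record["reasons"])
    return (
        f"Your application was {outcome}. "
        f"The main factors were: {reasons}."
    )

record = {
    "approved": False,
    "reasons": ["income below the required threshold",
                "a short credit history"],
}
print(narrate_decision(record))
```

A production system would select templates per audience and language, vary the phrasing, and pull the factors directly from the decision-making algorithm rather than from a hand-built record.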
Of course, organizations may choose simpler solutions than NLG, such as listing the business rules (if such rules drive the decision-making algorithm) that fired for a given user. Alternatively, visualizations of the business logic could be presented as explanations. But if users demand textual explanations, manually producing a personalized explanation for each and every user is impossible at scale.
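The rules-based alternative can be sketched as well. In the toy Python example below, the rule names, thresholds, and applicant fields are all invented for illustration; the point is only that a rule engine can record which rules fired, and that list can then serve as the raw material for an explanation:

```python
# Hypothetical rule-based loan decision: each rule is a (name, predicate)
# pair; the engine records which rules fired so they can be listed as the
# explanation for a given applicant. Thresholds are invented.

RULES = [
    ("income_too_low",       lambda a: a["income"] < 20_000),
    ("high_debt_ratio",      lambda a: a["debt"] / a["income"] > 0.5),
    ("short_credit_history", lambda a: a["credit_years"] < 2),
]

def decide(applicant: dict):
    """Return (approved, fired_rule_names) for an applicant."""
    fired = [name for name, predicate in RULES if predicate(applicant)]
    return (len(fired) == 0, fired)

approved, fired = decide({"income": 18_000, "debt": 10_000, "credit_years": 5})
# Both the income and debt-ratio rules fire, so the loan is declined.
print(approved, fired)
```

Listing `fired` verbatim satisfies the “which rules fired” approach; feeding it into a data-to-text step would produce the friendlier textual explanation the article discusses.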
Regulators are still navigating the potential fallout from implementing the new regulation. The day-to-day impact these changes will have on both businesses and individuals will depend on the practical legal interpretation of the law, the level of detail required in explanations and the severity of penalties for those that do not comply.
Regardless of these pending specifics, the General Data Protection Regulation is the start of a new conversation – a new era for legislation and for the future of human-computer interaction. What role should automation play in our day-to-day privacy and Internet use? And how will governments monitor and regulate this new and rapidly changing landscape? The regulation makes decision-making algorithms accountable to the people they affect. It remains to be seen whether individuals will exercise their new rights and demand explanations, but the history of regulations of similar scope, such as freedom-of-information laws, suggests that modern citizens are more likely to exercise their rights than not.
About the Author
Dr. Nikhil Ninan is the Head of Technical Business Development, Global, for Arria NLG. He engages directly with client domain experts to understand, develop and deliver data- and NLG-driven solutions to their problems. Prior to his current role, he was a Senior Data Scientist at Arria NLG. Nikhil is based in Aberdeen, UK, and received his PhD from the University of Aberdeen.