New regulations for AI decision-making and the ‘right to explanation’
In 2018, the EU’s General Data Protection Regulation (GDPR) will come into force, bringing new rules for machine-learning algorithms. Other jurisdictions are likely to follow. Under this legislation, any automated individual decision-making system must be able to provide an explanation of why a particular decision or outcome was reached.
Putting the customer first
The objective of this legislation is to empower end users, who have a ‘right to explanation’ for any decision made about them by automated software. In practice, machines will have to act more like humans, giving users the ability to query why or how a decision was reached. Ultimately, the ruling means that automation should have no detrimental impact on the user. Whether a decision is made by a machine or a human, the right to explanation remains. In 2018, ‘computer says no’ will no longer be an acceptable excuse for withholding the full story from consumers.
Putting the customer first is not new for industries such as Financial Services. In the UK, companies have a compliance obligation enforced by the FCA, whereby they must meet ‘Treating Customers Fairly’ (TCF) standards which state that customers have a right to ‘clear, fair and not misleading’ information about financial products and services. This, coupled with the new legislation due in 2018, will mean that consumers should be better equipped to make well-informed financial decisions based on information provided to them, whether it be from machines or humans. To ensure that a compliance trail is fully documented and can be readily available for audit, we’ll need to have articulate NLG systems in place.
How will the new legislation impact consumers?
Here’s an everyday example of how this legislation could affect you. An algorithm is used to decide whether to approve you for a loan, mortgage, or credit card. Typically, such intelligent machine decision-making is opaque to the consumer - you may not know why your application has been rejected, only that it has been rejected. With the new legislation, companies operating in the EU will be required to provide you with an explanation as to why you have been rejected.
A bigger concern has been the possibility for discriminatory decisions to be made based on intrinsically sensitive data held on an individual. This could be the area they live in, their ethnicity, faith or gender, and so on. The right to explanation will address this concern - companies will have to explain how the algorithm has arrived at a particular decision.
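To make the idea concrete, here is a minimal sketch of what an explainable decision procedure could look like. The rules, field names, and thresholds are entirely hypothetical; the point is that each check records the reason behind its outcome, so a rejection can be explained rather than merely reported.

```python
# Hypothetical loan-approval rules. All field names and thresholds are
# illustrative, not any real lender's criteria. Each failed check appends
# a human-readable reason, giving the applicant an explanation trail.

def assess_loan(application):
    """Return (approved, reasons) for an application dictionary."""
    reasons = []
    approved = True

    if application["credit_score"] < 620:
        approved = False
        reasons.append("credit score below the minimum of 620")
    if application["debt_to_income"] > 0.40:
        approved = False
        reasons.append("debt-to-income ratio above the 40% limit")

    if not reasons:
        reasons.append("all checks passed")
    return approved, reasons

approved, reasons = assess_loan({"credit_score": 580, "debt_to_income": 0.45})
# approved is False, with two recorded reasons for the rejection
```

Note that nothing here relies on the applicant’s ethnicity, faith, gender, or address: keeping sensitive attributes out of the rules, and surfacing the attributes that *were* used, is exactly what an explanation makes auditable.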
What this means for NLG systems
There is no doubt that the ‘right to explanation’ will pose both a challenge and an opportunity.
- The challenge will be for industries to be transparent in how they have come to a decision.
- The opportunity will be for decision-making software providers to lead in designing algorithms and evaluation frameworks which avoid discrimination.
Adding an NLG layer to the whole process could be a solution.
Arria’s future-proof technology
Because of the way Arria’s technology has been designed by our leading scientists, the ‘right to explanation’ legislation is not an issue for future NLG applications. In fact, it’s not even an issue for the technology we offer today.
The Arria NLG Engine is structured so that every one of our applications is built with the ability to query the output and dig deeper. This functionality not only meets the 2018 EU regulations, but is also vital for business reporting, where there is often a need to query data and ask the system to explain the results reported in a dashboard or a graph [read more on storytelling dashboards from Dr. Yaji Sripada]. This is where Arria NLG’s technology is so useful in explaining data: just like a human expert, it can provide intelligent answers to queries about its data and output.
How it works
There are two major components to the NLG Engine that make it the most powerful data storytelling machine on the market:
- Articulate Analytics™
First, the Engine contains a Data Analytics and Interpretation stage. This is where it takes the various sources of data that need to be explained and extracts and deduces the important facts and insights that should be communicated to an interested party. The results of this process are informational units we call messages.
- Advanced NLG
The second stage takes these messages and works out how to communicate the information they contain in an articulate and coherent manner, using natural language text and, where appropriate, graphical representations of the data. Voice output is also an option.
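As an illustration only (this is not Arria’s actual implementation), the two stages described above can be sketched as a small pipeline: stage one turns raw figures into message dictionaries, and stage two realises those messages as sentences. The message schema and trend rule are assumptions for the sketch.

```python
# Illustrative two-stage NLG pipeline, not Arria's actual code.
# Stage 1 (analytics/interpretation): derive "messages" - informational
# units worth reporting - from raw data.
# Stage 2 (realisation): express each message as natural language.

def extract_messages(sales):
    """Stage 1: reduce a series of sales figures to reportable facts."""
    change = sales[-1] - sales[0]
    return [{
        "type": "trend",
        "direction": "up" if change >= 0 else "down",
        "amount": abs(change),
    }]

def realise(messages):
    """Stage 2: turn messages into coherent sentences."""
    sentences = []
    for m in messages:
        if m["type"] == "trend":
            sentences.append(
                f"Sales went {m['direction']} by {m['amount']} units over the period."
            )
    return " ".join(sentences)

text = realise(extract_messages([100, 120, 135]))
# -> "Sales went up by 35 units over the period."
```

Separating the two stages is what makes explanation possible: the messages persist between analysis and wording, so the system can always point back to the facts a sentence came from.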
Because our Engine is structured in layers that contain business rules and processes, the Engine can work back through those layers to seek out answers or explanations. This Articulate Analytics™ functionality is simply not possible with the template-based NLG systems available on the market.
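One simple way to picture "working back through the layers" is to attach provenance to every generated statement, so the system can answer a "why?" query by returning the underlying facts. The class and field names below are hypothetical, used purely to illustrate the idea.

```python
# Illustrative sketch only: each generated statement carries the facts it
# was derived from, so an explanation can be produced on demand.

class TracedStatement:
    def __init__(self, text, derived_from):
        self.text = text
        self.derived_from = derived_from  # facts this sentence rests on

    def explain(self):
        """Answer a 'why?' query by working back to the source facts."""
        facts = "; ".join(f"{k} = {v}" for k, v in self.derived_from.items())
        return f"'{self.text}' was generated because: {facts}"

stmt = TracedStatement(
    "Revenue fell sharply in Q3.",
    {"q2_revenue": 1_200_000, "q3_revenue": 800_000},
)
explanation = stmt.explain()
```

A purely template-based system has no such trail: the template fills in blanks without retaining why those values were selected, which is precisely the gap the regulation targets.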
What to ask when looking for a future-proof NLG supplier
- Is your NLG output based on templates?
- Can your technology explain the reasoning behind its NLG output?
- How does your technology analyse and interpret data to structure the sentences it produces?
- How will your NLG technology meet the 2018 EU regulations granting a ‘right to explanation’ for algorithmic decisions?
Get future-proof NLG for your business
Request a demo to see our future-proof NLG Engine in action today and learn more about our unique Articulate Analytics™ methodology.
Author: Saad Mahamood, Senior Natural Language Generation Engineer at Arria NLG