Last updated 2 weeks ago by Michael Darmanin
There is nothing more frustrating than a vending machine eating your coins but refusing to deliver the snack or drink. In all seriousness, artificial intelligence (AI) systems are making more and more decisions for us. Examples include finding cancerous tumors on CT scans, predicting car malfunctions, and improving the efficiency of oil wells. Although helpful most of the time, these systems anger us when they fail to work, leaving us wondering what went wrong.
The European Union has acknowledged that AI systems need to be more transparent. Existing legislation already protects the right of European citizens to have decisions explained to them. A problem emerges in the case of AI. These tools use large databases and run complicated formulas to arrive at a decision, and those formulas are out of reach for most of us, who lack a background in computer science.
NL4XAI stands for “Interactive Natural Language Technology for Explainable Artificial Intelligence.” It is a European initiative aimed at training young scientists. Their goal is to innovate and learn how to build new AI systems that can automatically explain their decisions to regular, non-expert users.
Utrecht University joined the race
PhD candidate Alexandra Mayn, supervised by Prof. Kees van Deemter, is the Utrecht researcher who joined the NL4XAI program. AI uses complex logical formulas to analyze data and make a decision. Alexandra Mayn will focus on building systems that can translate those formulas into simple text. Through this text, in English, Dutch, or Chinese, users will gain insight into the “magic” behind the decisions made by the, for now, elusive AI systems.
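To give a flavor of the idea, here is a minimal toy sketch (not NL4XAI's or Mayn's actual method) of turning a simple machine-readable decision rule into an English sentence; the rule format and wording are illustrative assumptions:

```python
# Toy sketch: verbalizing a decision rule as plain English.
# The rule structure (conditions + outcome) is a hypothetical example,
# not a format used by the NL4XAI project.

def explain(rule):
    """Translate a rule with (feature, relation, value) conditions
    into a one-sentence English explanation."""
    conditions = " and ".join(
        f"{feature} is {relation} {value}"
        for feature, relation, value in rule["conditions"]
    )
    return f"The system decided '{rule['outcome']}' because {conditions}."

# Example inspired by the car-malfunction use case mentioned above.
rule = {
    "conditions": [("engine temperature", "above", "110 °C"),
                   ("oil pressure", "below", "20 psi")],
    "outcome": "schedule maintenance",
}
print(explain(rule))
# → The system decided 'schedule maintenance' because engine temperature
#   is above 110 °C and oil pressure is below 20 psi.
```

Real explanation generation is far harder, since the underlying logical formulas are much richer than this toy rule, and the output must read naturally in each target language.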
Source: Utrecht University