FRANKFURT SCHOOL

BLOG

Using AI to support operational compliance: expectations vs reality
Executive Education / 2 June 2023
Alumnus Certified Compliance Professional
Rachid Tahrioui is a passionate fighter against financial crime and has worked in banking-sector compliance for years. Having started out in management consulting before moving on to various in-house positions, he has gained extensive experience, particularly in AML compliance. In 2019, he completed the Certified Compliance Professional course at Frankfurt School, and he currently works as a Senior Manager at Hyundai Capital Europe Bank in Frankfurt.


Artificial intelligence (AI), machine learning (ML) and robotic process automation (RPA) are terms used – all too often – in today’s financial sector when discussing ways to optimise processes, cut costs and save time. For a while now, the market has been driven by a steady demand for the latest high-tech developments. In particular, the experts responsible for combating financial crime are begging to be allowed to implement the above-mentioned technologies. But what are the challenges involved in moving over to artificial intelligence? What’s the position of Germany’s Federal Financial Supervisory Authority (BaFin) on the issue? Finally, what’s the potential impact of these technologies on operational compliance? It’s time to take a closer, critical look at these questions.

Data is Key

Before embedding artificial intelligence in their monitoring and screening tools, financial institutions must first ensure that their technological infrastructure is suitably robust. One of the key criteria here is data quality. Unless a certain level of data quality is present, the use of artificial intelligence simply won’t work.

In a study entitled “Big data meets artificial intelligence” (Big Data trifft auf künstliche Intelligenz), BaFin defines the quality and quantity of available data as a strategic prerequisite for implementing artificial intelligence, alongside various other factors. According to their definition, the main criteria are the completeness, accuracy, consistency and timeliness of the data. (1)
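These criteria can be made operational. The following is a minimal sketch, assuming invented customer records and field names, of how completeness and timeliness might be measured before an AI rollout (accuracy and consistency need reference data and cross-system checks, so they are omitted here):

```python
from datetime import date

# Hypothetical customer records; None marks a missing field.
records = [
    {"name": "Acme GmbH", "country": "DE", "last_review": date(2023, 1, 10)},
    {"name": "Beta Ltd", "country": None, "last_review": date(2019, 6, 1)},
    {"name": "Gamma SA", "country": "FR", "last_review": None},
]

def completeness(records, fields):
    """Share of non-missing values across the given fields."""
    total = len(records) * len(fields)
    present = sum(1 for r in records for f in fields if r.get(f) is not None)
    return present / total

def timeliness(records, field, cutoff):
    """Share of records whose date field is present and no older than the cutoff."""
    fresh = sum(1 for r in records if r.get(field) and r[field] >= cutoff)
    return fresh / len(records)

print(round(completeness(records, ["name", "country", "last_review"]), 2))
print(round(timeliness(records, "last_review", date(2022, 1, 1)), 2))
```

Tracking such ratios per data domain gives an institution a concrete baseline against which "sufficient data quality" can actually be judged.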

However, as most recently confirmed in a survey conducted by auditing firm PricewaterhouseCoopers (PwC), a large number of financial services providers are complaining that they lack sufficient data to implement AI. (2) So it’s important to carry out very thorough checks to ensure that all necessary data on customers, transactions, products and partners has been imported into your ERP system prior to implementing, for example, robotic process automation.

Too complex for most people to use

Beyond the data issue, a general lack of industry expertise is another showstopper when attempting to implement AI as a tool for ensuring operational compliance. According to the PwC survey mentioned above, some 64% of respondents admitted they didn’t have the skills needed to implement the new technologies. Among other things, this complicates the traceability and explainability of the decision-making processes required for operational compliance – which could, in turn, have a significant impact on internal and external audits.

The BaFin perspective: neither pro nor con. Or maybe con?

Generally speaking, Germany’s top financial regulator does not operate a general approval process for algorithmic decision-making. Three years ago, however, BaFin did state that it was technology-neutral. (3) This essentially means that BaFin has not laid down any specific provisions for the types of technology that can – or cannot – be used to fulfil mandatory supervisory requirements concerning process organisation or documentation.

One thing is clear, however: the ultimate responsibility – hence also risk of liability – stays with senior management. (4) This also has implications for compliance officers who, along with the senior managers, are exposed to personal liability risk.

Potential impact on operational compliance and regulatory systems

Take, for example, the conventional statistical approach to transaction monitoring, which currently works on the basis of rules-based detection scenarios. If used in the suspicious activity reporting (SAR) process, AI could reduce the number of “false positives” – i.e. hits that trigger a false alarm once the analysis has been completed. And this is one of the core arguments for using AI in transaction monitoring. (5)
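The difference between the two approaches can be illustrated with a deliberately simplified sketch: a rules-based scenario alerts on every transaction above a fixed threshold, while a model-style score (here just a hand-written deviation ratio standing in for a trained classifier; all thresholds and field names are invented) suppresses alerts that match the customer's normal behaviour:

```python
# Toy rules-based scenario: flag any transaction above a fixed threshold.
THRESHOLD = 10_000  # illustrative value, not a regulatory figure

transactions = [
    {"id": 1, "amount": 12_000, "customer_avg": 11_500},  # large, but typical for this customer
    {"id": 2, "amount": 15_000, "customer_avg": 2_000},   # large AND atypical
    {"id": 3, "amount": 500,    "customer_avg": 600},
]

def rule_based_alerts(txns):
    return [t for t in txns if t["amount"] > THRESHOLD]

def score_alert(t):
    """Hypothetical risk score: deviation of the amount from the customer's baseline."""
    return t["amount"] / max(t["customer_avg"], 1)

def ml_filtered_alerts(txns, min_score=2.0):
    """Keep only alerts whose score suggests genuinely unusual behaviour."""
    return [t for t in rule_based_alerts(txns) if score_alert(t) >= min_score]

print([t["id"] for t in rule_based_alerts(transactions)])   # rule fires on every large amount
print([t["id"] for t in ml_filtered_alerts(transactions)])  # the typical large payment drops out
```

In a real deployment the score would come from a model trained on historical case outcomes, but the effect is the same: fewer alerts that analysts must close as false positives.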

It is important to start by considering where in the SAR process it would make most sense to use the above-mentioned technologies. Robotic process automation, for example, could become a major data-aggregation driver in the SAR process, boosting the immediacy of suspicious case processing – a key criterion. At present, the investigative software used in SAR processes is still characterised by a wide variety of standalone solutions. Quite apart from the monitoring system itself, most SAR processes must also gather information from in-house data sources (archiving and CRM systems) as well as popular external sources (such as Google, WorldCheck or Factiva).
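The aggregation step that RPA would automate can be sketched as follows. The sources and field names below are entirely hypothetical stand-ins for the monitoring, CRM and adverse-media systems an investigator would otherwise query by hand:

```python
# Hypothetical standalone sources, each normally queried separately by an analyst.
monitoring_alerts = {"C-42": {"scenario": "structuring", "amount": 9_900}}
crm_system = {"C-42": {"name": "Example Trading GmbH", "segment": "SME"}}
adverse_media = {"C-42": ["2022 press report on sanctions exposure"]}

def build_case_file(customer_id):
    """Aggregate everything known about one alert into a single case file."""
    return {
        "customer_id": customer_id,
        "alert": monitoring_alerts.get(customer_id),
        "customer": crm_system.get(customer_id),
        "adverse_media": adverse_media.get(customer_id, []),
    }

case = build_case_file("C-42")
print(case["customer"]["name"])
```

The point is not the code itself but the pattern: a bot compiles the complete case file before the human analyst opens it, which is where the time savings in suspicious case processing would come from.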

The extent to which such systems are influencing the rising number of suspicious activity reports related to potential money laundering submitted to Germany’s Financial Intelligence Unit (FIU) remains to be seen. (6) But there is reason to suspect that once artificial intelligence is implemented, risk aversion in the calibration of transaction monitoring will tend to increase. This in turn could further extend the (already extensive) SAR backlog that still needs to be processed by the FIU. (7)

Apart from the impact on day-to-day operations, it is also important to consider the implications in terms of regulatory requirements. For example, the use of machine learning could require the creation of a new role of algorithm officer – essentially analogous to the role of data protection officer. (8) Or perhaps BaFin will be obliged to publish new administrative regulations along the lines of “MaAlgo” (Mindestanforderungen an Algorithmen – Minimum Requirements for Algorithms)?

Practical tips

  • This post should not be interpreted as an indictment of the potential role of AI in operational compliance. It is simply intended to shed light on the challenges associated with what has, for some years now, been regarded as a sectoral trend, and reflect the current state of play.
  • Until all parties involved – supervisory authorities and the institutions under supervision – have acquired the necessary expertise and put suitable financial parameters in place, we should stick with the current software-based methods and solutions.
  • Rules-based scenarios should be tested and recalibrated for as long as it takes to achieve a balance between recoverable value and appropriate risk appetite.
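The recalibration loop from the last tip can be sketched as a simple threshold sweep over labelled historical alerts. All amounts, labels and candidate thresholds below are invented for illustration:

```python
# Labelled historical alerts: (amount, was it a genuinely suspicious case?)
history = [
    (11_000, False), (12_500, False), (14_000, True),
    (16_000, True), (9_500, False), (20_000, True),
]

def evaluate(threshold):
    """Count true hits caught and false positives raised at a given threshold."""
    hits = sum(1 for amount, suspicious in history if amount > threshold and suspicious)
    false_pos = sum(1 for amount, suspicious in history if amount > threshold and not suspicious)
    return hits, false_pos

# Sweep candidate thresholds; keep the highest one that still catches every true case.
candidates = [10_000, 12_000, 13_000, 15_000]
best = max(t for t in candidates if evaluate(t)[0] == 3)
print(best, evaluate(best))
```

Run against real alert histories, this kind of sweep is what "testing and recalibrating for as long as it takes" means in practice: each candidate setting is scored on detection versus false-positive load until the trade-off matches the institution's risk appetite.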

 

(1) “Big data meets artificial intelligence – Challenges and implications for the supervision and regulation of financial services” (BaFin, 2018).

(2) “How mature is AI adoption in financial services?” (PwC, 2020).

(3) “Does BaFin have a general approval process for algorithms? No, but there are exceptions” (BaFin publication, 2020).

(4) “Big data and artificial intelligence: Principles for the use of algorithms in decision-making processes” (BaFin, 2021).

(5) “Mit künstlicher Intelligenz Geldwäsche erkennen – bevor sie geschieht [Using artificial intelligence to detect money laundering – before it happens]” (KPMG, 2022).

(6) “Annual Report 2021” (Financial Intelligence Unit).

(7) “Wie Deutschland die Geldwäschebekämpfung verschleppt [How Germany is dragging its feet in the fight against money laundering]” (ZEIT ONLINE, 2022).

(8) “Big data meets artificial intelligence – results of the consultation on BaFin’s report” (BaFin, 2019).

 

This post was first published as a German-language article on the FCH-Gruppe website in February 2023.
