Given the changing technological landscape, ACCA and EY have explored the evolving role of the regulator, other government institutions and accounting professionals in combating economic crime in the digital age. A clear point arose from these discussions: the professionals interviewed for the report agreed that the overall objective of regulation, and the historic tension between innovation and its oversight, have not changed.
There is still a pressing need to create a regulatory environment that supports financial innovation while limiting the risks for consumers and businesses, supported by audited information to inform decision-making. Auditors could assess whether a company has put adequate controls in place to comply with the sector-relevant laws and regulations in the jurisdictions in which it operates.
These laws and regulations would not only be those relating, for example, to money laundering, corruption and tax evasion — but also those concerned with data protection, environmental impact and the treatment of the workforce, among other concerns.
For regulators and policymakers, economic crime in a digital age presents particular challenges beyond those they have faced before. These include, at a minimum, challenges of anonymity, accessibility and accountability.
Transacting online has given fraudsters avenues to avoid disclosing their identity, or to provide false ones. The idea of opaque fund holdings or complicated financial structures designed to obscure regulators’ view is not new in itself. And cash is, of course, the classic tool for facilitating untraceable, “under the table” payments.
Now, the anonymity associated with cryptocurrencies adds a new method of facilitating payments. These currencies are anonymous by design (pseudonymous, to be more exact). In other words, it is possible to see the addresses to which funds are going, but this provides no reliable confirmation of the identity of the actual counterparties or the individuals behind them.
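The mechanics of that pseudonymity can be illustrated with a minimal sketch in Python. This is a toy ledger, not any real blockchain API, and the addresses are invented placeholders: flows between addresses are fully visible to anyone, but the mapping from an address to a real-world identity is simply not on the ledger.

```python
from collections import defaultdict

# Toy public ledger: every transfer between pseudonymous addresses is visible.
# The addresses below are invented placeholders, not real ones.
ledger = [
    {"src": "addr_A", "dst": "addr_B", "amount": 5.0},
    {"src": "addr_B", "dst": "addr_C", "amount": 4.9},
]

def trace_outflows(address, ledger):
    """Sum what each counterparty address received from `address`."""
    totals = defaultdict(float)
    for tx in ledger:
        if tx["src"] == address:
            totals[tx["dst"]] += tx["amount"]
    return dict(totals)

# Anyone can follow the funds from address to address...
print(trace_outflows("addr_A", ledger))  # {'addr_B': 5.0}

# ...but the ledger carries no identity data at all. The address-to-person
# mapping lives off-chain, which is precisely the gap investigators face.
identities = {}  # empty by design: KYC data is never part of the ledger
print(identities.get("addr_B", "unknown"))  # unknown
```

The tracing function works perfectly, which is the point: traceability between addresses is easy, while attribution to a beneficial owner is impossible from the ledger alone.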
As a result, we see that “cryptocurrency has fueled secondary markets for criminal activity [because] of this inability to directly track and trace the beneficial owner of the funds,” observes Narayanan Vaidyanathan, Head of Business Insights, ACCA. “And that’s a challenge that we didn’t have in the same way previously.”
This increased anonymity creates new challenges for policymakers, regulators, compliance and audit professionals working to tackle activities such as money laundering. The related challenge of “know your customer” (KYC) lies at the heart of dealing with this issue. It is already a space keenly contested between those committing crimes and those seeking to catch them — and this will intensify as a key battleground for the future.
“The ability to identify and do proper KYC checks quickly and efficiently is still the holy grail of a financial crime compliance department,” says David Higginson, Partner, Ernst & Young LLP United Kingdom, Forensic & Integrity Services.
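One small part of why quick, reliable KYC remains elusive can be sketched with a toy name-screening example in Python. The watchlist names and the similarity threshold below are purely illustrative assumptions; real screening systems add transliteration, alias lists and secondary identifiers rather than relying on string similarity alone.

```python
import difflib

# Hypothetical watchlist entries, for illustration only.
watchlist = ["Ivan Petrov", "Acme Shell Holdings Ltd"]

def screen_name(name, watchlist, cutoff=0.85):
    """Naive screening: fuzzy-match a customer name against a watchlist.

    Returns candidate hits whose similarity ratio meets the cutoff.
    """
    return difflib.get_close_matches(name, watchlist, n=3, cutoff=cutoff)

print(screen_name("Ivan Petrov", watchlist))  # exact hit: ['Ivan Petrov']
print(screen_name("I. Petrov", watchlist))    # trivial variant slips through: []
```

An exact name matches immediately, but a routine abbreviation already falls below the similarity cutoff, which hints at why identification at speed and scale is so hard in practice.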
Financial crime has become a more accessible activity. Experts point to reduced barriers to entry for undertaking a financial crime, the proliferation of information on the internet, and the fact that few of the perpetrators conform to stereotypes of the “hardened criminal.”
The marketplace characteristics of digital crime mean that it may be possible, for example, to hire the services of cybercriminals without knowing how to commit the crime oneself. This level of accessibility presupposes no connection to the world of cyber hacking, nor an extensive network of contacts or knowledge in the area.
Craig highlighted that “You can have a 14-year-old in their bedroom download[ing] malware and recruit[ing] a botnet to take part in a sophisticated attack, perhaps unknowingly to some degree. There’s a worrying aspect of ‘gamification’ to cyber attacks.”
Another facet of the accessibility challenge is the significant increase in cross-border activity: the globalization of economic crime. Regulation, by contrast, remains jurisdictional by nature.
“You are as likely to be scammed from Jakarta, as you would be from Kiev, as you would be from any one of the islands within the Philippines,” warns Harbinson.
The locations of a criminal and their target could be in completely different jurisdictions, whose law enforcement agencies have no interaction with one another. This unprecedented access to a global pool of targets has given economic crime a previously unthinkable level of scalability. And the regulatory challenges to dealing with this are significant because it is difficult for government agencies to coordinate across borders at the same speed as the perpetrators of crime.
Increasingly, artificial intelligence (or, more specifically, augmented or assisted intelligence) has the potential to automate not just processes but also decisions. That raises difficult questions of judgment, given the sophistication of some of the technologies involved. Algorithmic models, for example, can be complex, and it may not be easy for compliance departments, or for the regulators overseeing them, to understand how and why these models behave as they do.
This, however, should not become a reason for corporate actors to absolve themselves of responsibility. Ultimately, legal structures as they stand recognize a legal “person” only as an individual or a corporate entity. The implication is that human oversight cannot be dispensed with simply by outsourcing responsibility to an algorithm.
“It’s great that we have algorithms that can help review false positives, or identify suspicious activity, or flag fraudulent activity or patterns. But it is difficult to rely completely on them because if they do not flag money laundering or dismiss a false positive, the risk remains with the firm,” says Sexton.
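The residual-risk point can be made concrete with a deliberately naive monitoring rule, sketched in Python. The threshold and transactions are invented for illustration: the rule catches the obvious case but misses structured payments entirely, and that false negative remains the firm's problem, not the algorithm's.

```python
THRESHOLD = 10_000  # illustrative figure, not a real regulatory threshold

def flag_large_transactions(txns, threshold=THRESHOLD):
    """Naive rule: flag any single transaction at or above the threshold."""
    return [t["id"] for t in txns if t["amount"] >= threshold]

txns = [
    {"id": 1, "amount": 12_000},  # caught by the rule
    {"id": 2, "amount": 9_500},   # structuring: each leg stays under the line
    {"id": 3, "amount": 9_500},
]

print(flag_large_transactions(txns))  # [1] -- transactions 2 and 3 slip past
# The model produced a false negative, but regulatory accountability for the
# missed activity sits with the firm and its human overseers.
```

However sophisticated the real models are compared with this sketch, the structural point is the same: a miss by the machine does not transfer liability away from the legal person deploying it.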
The important challenge here will be to strike the right regulatory balance between human oversight and reliance on the machine. Getting this wrong could breed wider discontent within organizations: when things go well, the technology gets the credit, but when things go wrong, the individual gets the blame. The task is to ensure proportionate and fair regulatory and accountability regimes that allow innovation to flourish while providing essential safeguards.
Should regulatory or legal responsibility be applied differently to a flawed credit model that is intentionally manipulated for fraudulent ends than to one that is unintentionally fed biased input data? Appreciating this distinction requires some granularity in regulators’ understanding of technological development and of how it translates into issues of accountability and integrity.