In the early 19th century, the Luddites were a secret oath-based organization that destroyed textile machinery in industrializing England, and they subsequently entered the history books as an example of anti-progress sentiment in society. Algorithms that exploit the value of big data to support decision-making in complex settings, combined with intelligent automation (IA) leveraging breakthroughs in Artificial Intelligence (AI) in areas such as Natural Language Processing (NLP) and image recognition, offer significant promise to all areas of economic and social activity. Yet, unless public service organizations and corporations can build trust with citizens, customers, and employees on the intrinsic virtues of these technologies, and on how they are being used, there is a risk that a Luddite-like backlash delays the effective adoption of promising technology.
Trust is multifaceted. It covers important concepts such as data privacy, transparency, accountability, and security of IT infrastructure, which we know well how to operationalize. But it also covers more general humanistic and societal aspects, including autonomy of the individual, fairness and absence of bias, and even the general wellbeing of people exposed to AI, whether they are aware of it or not.
Because of these broader humanistic dimensions, and the pervasive, ubiquitous application of AI, awareness, and even alarm, in society has heightened. Regulation is being contemplated, and given the huge variety of voices contributing to the current debate, it is likely to be fragmented.
Findings from a global study by EY in collaboration with The Future Society (TFS) show that organizations are focused on the technical and operationalizable aspects of AI, such as data privacy, while legislators and regulators emphasize its societal and humanistic impact.
Despite these ongoing debates, and the risk they entail for responsible management in both the private and public sector, there are a few basic principles that can profitably be applied to navigate the challenges, irrespective of the regulation that is likely to come.