Cybercrime is expected to cost companies $10.5 trillion globally in 2025,1 a figure projected to climb to almost $14 trillion by 2028.2 The average ransomware payout almost doubled in one year, from $812,380 in 2022 to $1,542,333 in 2023.3 And businesses shelled out an average of $4.88 million last year to address data breaches.4
The numbers are daunting. Whether you’re in Canada or Cameroon, mining gold and diamonds or extracting potash and gravel, no metals and minerals organization is completely safe from attack. But through collaboration and shared insights, the industry can keep its ear to the ground, focus on proven defense strategies, close the gap on vulnerabilities and minimize attack surfaces, adding an extra layer of protection to already sound cybersecurity action plans.
New horizons in AI
There’s no doubt that AI is catalyzing foundational change in business and across organizational operating models. The ready availability of data is exponentially advancing capabilities, and more businesses are looking to AI to generate value.
Interacting with AI no longer requires the advanced specialists it once did: human-executed processes, supported by data and powered by technology, are becoming technology-executed processes powered by data, with human oversight and governance. Metals and minerals organizations are benefiting from improved productivity, with use cases spanning exploration, operations, processing, transport, sales and marketing. AI is completing tasks such as subsurface geological modeling and viability assessments, core scanning, optimal extraction planning, predictive maintenance, environmental impact reduction and much more.
But these advances present risk. AI “hallucinations,” incorrect or misleading results generated by AI models, raise valid concerns around accuracy and trust. The appropriate collection and use of data raise questions about the sensitivity of information being made available in the public domain. And with potential new vulnerabilities being unearthed as we rely more heavily on AI to connect dots between IT and operational technology (OT), what do organizations need to do today to ready themselves for the future?
Defining solid data governance and management initiatives is a good start, as is AI governance, if we are to expect a positive outcome from any AI initiative. But while roundtable attendees had such security strategies in place, most admitted to approaching scenarios on a case-by-case basis or on a limited scale, for example using AI to track licenses and contracts, a task that could take human counterparts months to follow up on. Still, the possibility of data leakage and questions of go-forward governance topped their list of concerns.
And although shadow AI and the use of ChatGPT are raising eyebrows, most agreed they had identified and locked down sensitive information and were eager to explore new capabilities in a controlled environment, stressing the importance of training and awareness for their teams.
Phishing emails have become more believable than ever, and social engineering and synthetic media, such as AI-fueled deepfakes of high-value employees, are growing increasingly sophisticated.