The prudential regulator has written to super funds to warn them that their information security practices and governance are struggling to keep pace with an increasing number of AI threats – especially those posed by “high capability frontier models” like Anthropic Mythos.
“APRA found that, while AI is being actively adopted by all the entities we engaged with, there are differing levels of maturity across functions such as governance, risk management and operational resilience,” APRA executive board member Therese McCarthy Hockey said in a letter to industry on artificial intelligence.
“In addition, assurance practices are not keeping pace with the scale, speed and complexity of AI.”
APRA identified a number of concerns originating both outside and inside funds, including that identity and access management capabilities have not yet adjusted to nonhuman actors like AI agents, while the volume and speed of AI-assisted software development is “placing strain on the effectiveness of change and release management controls”.
The prudential regulator also observed gaps in the scope and coverage of security programs for both AI implementation and “responding to the AI-augmented threat environment”.
“APRA is also engaging across the sector on the potential for increased cyber threats from high capability AI frontier models such as Anthropic Mythos,” Hockey said.
“APRA has heard clear recognition from regulated entities of the need for a step change in cyber practices and a continuing uplift in capabilities to protect IT assets in an evolving threat environment. This uplift could also include the use of AI in identifying and resolving vulnerabilities.”
Hockey said that APRA is in the process of finalising its forward plan for supervision of AI risks and “taking a proportional approach to entity prudential reviews, thematic activities and AI supplier engagement.”
“APRA will continue to monitor the use of AI to assess potential prudential risks and consider whether further APRA policy action may be needed,” Hockey said.