Artificial intelligence (AI) is not just another technological wave; it is a structural force that is reshaping markets, societies and even systems of governance. Its ability to drive change is already outpacing that of the internet and mobile phone revolutions. Unlike previous innovations, AI’s impact is unavoidable. It affects people, economies and institutions whether they choose to engage with it or not.
As this power grows, so do the risks. Business models built on immature AI systems are fragile. Social challenges, such as polarisation and the spread of false information, are growing globally. Although AI safety has improved significantly, outcomes remain difficult to predict and malicious actors adapt rapidly. Cyberattacks can now be automated. False information spreads faster than ever before. Labour markets are facing significant disruption. And because AI development is concentrated in the hands of a small number of dominant companies, vulnerabilities are no longer isolated — they are systemic.
In this new landscape, investors cannot afford to sit back and watch. They are uniquely positioned to demand transparency, set governance standards and hold companies to account for the responsible development and deployment of AI. Stewardship is no longer simply about protecting capital; it is also about shaping the trajectory of a technology that will define the decades ahead.
Sarasin’s leadership role
We engage actively with companies on ethical AI, both unilaterally and as part of investor coalitions. We believe it is important to continuously foster collective industry action by connecting it to front-line science on AI-related risks and risk management.
We held our first investor seminar on ethical AI in December 2023, where we discussed the limitations of current technologies and stressed that stewardship focus must always be tailored to a product's role in the AI value chain.
At our second investor seminar, held in September, we brought together investors, academics and practitioners to examine the current state of these issues. The message was clear: AI is amplifying existing risks while creating entirely new ones that traditional governance approaches cannot handle.

Investors need to focus on the effectiveness of safeguards
In the age of AI, as companies' investment spending continues to soar, investors are unwittingly taking on significant risk. Proactive engagement is now a strategic necessity.
Investor engagement with AI has already demonstrated its impact by improving company disclosures. But this progress can still take the form of box-ticking, which is not sufficient. Investors must ask sharper, more strategic questions, challenging companies on the following:
- What safeguards are in place to prevent various types of misuse, and can they operate effectively at the scale of billions of users? How do you measure their effectiveness and progress?
- How are boards involved in key decisions that require deep technical knowledge, for example ensuring that AI models' source code is independently audited, ideally through a combination of expert review and automated tools? What expertise does your board have in this area?
- Even with strong oversight in place, how can companies avoid being caught off guard by newly evolving threats? And how can they contribute to industry-wide efforts to advance safe AI?
Stewardship is changing and must evolve faster
Investor stewardship must be long-term, multi-layered and ambitious. Engagement should cover both AI developers and deployers, recognising that risks exist at every point in the value chain. Deployers in particular must manage counterparty risks cascaded down to them, on top of their own challenges in deploying AI productively. To make the case persuasively, investors need to set out best-practice examples and demonstrate materiality for both developers and deployers.
Collaborative investor initiatives and benchmarking exercises are helping to raise standards across the market, and we have seen a number of positive moves from companies in response to investor expectations. But the pace of stewardship must accelerate dramatically if investors are to keep up with the rapid evolution of the technology. Persistence, creativity and escalation, whether through the media or by broadening contacts within a company, are essential when companies resist.
In addition to shareholder resolutions, voting against directors where AI oversight is absent is emerging as a powerful governance tool.
We cannot ignore the systemic dimension
AI is not only a corporate governance issue — it is also a global stability and environmental challenge:
- The concentration of AI capabilities among a small number of dominant companies raises concerns about systemic dependencies and regulatory capture.
- AI’s escalating energy use is both a governance and environmental issue.
- If AI projects fail to live up to inflated expectations, investment bubbles could emerge, destabilising markets.
These are not abstract concerns. They are material factors that investors need to integrate into their stewardship strategies now.
Frameworks such as the US National Institute of Standards and Technology (NIST) AI Risk Management Framework and the EU Digital Services Act are starting to encourage better-structured disclosure. However, the lack of independent assurance, quantitative metrics and comparability remains a significant challenge. The ISO 42001 standard is a useful tool, but certification can be too expensive for smaller players.
A stronger plan for AI stewardship
Three priorities emerged clearly from our seminar discussions:
- Engage at three levels — with developers, deployers and policymakers — to close governance gaps across the AI ecosystem.
- Support standardised quantitative measures of safeguards’ effectiveness and human rights impact assessments as key accountability tools.
- Treat energy intensity as a material issue, on a par with financial performance.
We all recognise that AI will shape the future of economies and societies. Whether it does so responsibly depends on the actions taken today. Investors are not mere observers in this story — they are the key protagonists.
This document is intended for retail investors and/or private clients. You should not act or rely on any information contained in this document without seeking advice from a professional adviser.
This document has been issued by Sarasin & Partners LLP, Juxon House, 100 St Paul’s Churchyard, London, EC4M 8BU. Registered in England and Wales, No. OC329859. Authorised and regulated by the Financial Conduct Authority (FRN: 475111). Website: www.sarasinandpartners.com. Tel: +44 (0)20 7038 7000. Telephone calls may be recorded or monitored in accordance with applicable laws.
This document has been prepared for informational purposes only and is not intended as promotional material in any respect. It is not a solicitation, or an offer to buy or sell any security. The information on which the material is based has been obtained in good faith, from sources that we believe to be reliable, but we have not independently verified such information and we make no representation or warranty, express or implied, as to its accuracy. All expressions of opinion are subject to change without notice.
This document should not be relied on for accounting, legal or tax advice, or investment recommendations. Reliance should not be placed on the views and information in this material when taking individual investment and/or strategic decisions.
Capital at risk. The value of investments and any income derived from them can fall as well as rise and investors may not get back the amount originally invested. If investing in foreign currencies, the return in the investor’s reference currency may increase or decrease as a result of currency fluctuations. Past performance is not a reliable indicator of future results and may not be repeated. Forecasts are not a reliable indicator of future performance.
Neither Sarasin & Partners LLP nor any other member of the J. Safra Sarasin Holding Ltd group accepts any liability or responsibility whatsoever for any consequential loss of any kind arising out of the use of this document or any part of its contents. The use of this document should not be regarded as a substitute for the exercise by the recipient of their own judgement. Sarasin & Partners LLP and/or any person connected with it may act upon or make use of the material referred to herein and/or any of the information upon which it is based, prior to publication of this document.
Where the data in this document comes partially from third-party sources the accuracy, completeness or correctness of the information contained in this publication is not guaranteed, and third-party data is provided without any warranties of any kind. Sarasin & Partners LLP shall have no liability in connection with third-party data.
© 2025 Sarasin & Partners LLP. All rights reserved. This document is subject to copyright and can only be reproduced or distributed with permission from Sarasin & Partners LLP. Any unauthorised use is strictly prohibited.
