Over the past few weeks, we have witnessed a growing scandal surrounding Elon Musk’s AI image-generation tool, Grok Imagine, which has been used to digitally undress photos of women and children and to create public, sexualised deepfake images without their consent. This powerful new technology was pushed out at speed and integrated into the social media platform X with weak or insufficiently tested safeguards, driven by the pursuit of growth, attention, and beating rivals to market[1]. It represents one of the most serious recent cases of child sexual abuse material (CSAM) facilitated through a social media platform.
Following widespread backlash, X limited Grok’s image-generation and editing capabilities. The company stated: “We take action against illegal content on X, including CSAM, by removing it, permanently suspending accounts, and working with local governments and law enforcement as necessary.” Elon Musk himself added: “Anyone using Grok to make illegal content will suffer the same consequences as if they upload illegal content.”[2] Yet these actions came only after harm had already occurred at scale.
When social media platforms are allowed to follow the instincts and values of their owners without robust governance constraints, developers can go virtually anywhere. Even when protections are installed, it is not always clear who is being protected, and from whom. We have repeatedly seen how appeals to freedom of speech can be used to justify serious violations of other rights, including the right to safety and the protection of children. At the same time, an absolutist approach to safety risks stifling innovation; trial and error is inevitable in such a fast-moving technological space, which is why clear definitions of harm, such as those embedded in the UK Online Safety Act, are essential.
Defining standards
One way to avoid a fundamental confusion of means and objectives is to articulate clear principles and values, develop policies to enforce them, and apply those policies consistently in practice. Investors have placed consistent and increasing pressure on companies to define these standards publicly, embed them operationally, train staff accordingly, and demonstrate how they are enforced. The work of the WBA Collective Impact Coalition, based on its published Investor Statement and recognised with the PRI 2025 Recognition for Action, has been instrumental in investor engagement on this responsible approach.
With all the Big Tech leaders demonstrating a sense of urgency, speedy launches of under-tested products in response to competitive pressure will continue. Governments, regulators, civil society, and users will respond – as seen in the blocking of Grok in Malaysia and Indonesia, and in the probes initiated in the EU, UK, Brazil, and India. In the US, Democratic senators have even called for Grok to be temporarily removed from app stores.
Had its customers, the advertisers, been more socially aware, this episode could have led to the collapse of xAI. Given the chain of interdependencies within Silicon Valley, such a collapse could in turn cause financial problems across the tech sector as a whole.
Investors have a distinct and powerful role to play in this rapidly evolving space. Through stewardship and engagement, we can shape expectations at the largest and most influential technology companies, those whose practices often become de facto industry standards. Investor pressure is typically reinforced by human rights-focused NGOs, which can also be instrumental in coordinating work to establish international standards.
Together, this can help ensure that innovation does not come at the cost of human dignity, safety, and fundamental rights.
Sarasin’s engagement with AI developers
At Sarasin, we engage both individually and collaboratively with AI developers across our portfolios to encourage them to clearly articulate their principles and policies, including defining what their models should never be permitted to do, and to establish accountability when those standards are not met. This includes advocating for regular audits, such as human rights impact assessments (HRIAs), and the transparent publication of their findings.
For open-source models, we believe robust responsible-use guidelines are essential, alongside mechanisms to monitor and enforce compliance. Highly sensitive products and services should be subject to rigorous pre-launch testing, overseen by specialised experts, such as responsible AI committees.
In the case of social media companies, we are closely scrutinising content moderation policies, particularly where the safety of children is concerned. In this area, progress can only be achieved through the consistent application of strong ethical safeguards. We believe these objectives are achievable when appropriate governance structures, processes, and tools are firmly in place.
We have seen encouraging responses and tangible progress from some companies with stronger governance structures, although this remains an ongoing journey.
[1], [2] The Guardian, 12 January 2026. https://www.theguardian.com/world/2026/jan/12/monday-briefing-how-elon-musks-grok-is-being-used-as-a-tool-for-digital-sexual-abuse
Important information
This document is intended for retail investors and/or private clients. You should not act or rely on any information contained in this document without seeking advice from a professional adviser.
This is a marketing communication. Issued by Sarasin & Partners LLP, Juxon House, 100 St Paul’s Churchyard, London, EC4M 8BU. Registered in England and Wales, No. OC329859. Authorised and regulated by the Financial Conduct Authority (FRN: 475111). Website: www.sarasinandpartners.com. Tel: +44 (0)20 7038 7000. Telephone calls may be recorded or monitored in accordance with applicable laws.
This document has been produced for marketing and informational purposes only. It is not a solicitation or an offer to buy or sell any security. The information on which the material is based has been obtained in good faith, from sources that we believe to be reliable, but we have not independently verified such information and we make no representation or warranty, express or implied, as to its accuracy. All expressions of opinion are subject to change without notice. This document should not be relied on for accounting, legal or tax advice, or investment recommendations. Reliance should not be placed on the views and information in this material when taking individual investment and/or strategic decisions.
Capital at risk. The value of investments and any income derived from them can fall as well as rise and investors may not get back the amount originally invested. If investing in foreign currencies, the return in the investor’s reference currency may increase or decrease as a result of currency fluctuations. Past performance is not a reliable indicator of future results and may not be repeated. Forecasts are not a reliable indicator of future performance.
Neither Sarasin & Partners LLP nor any other member of the J. Safra Sarasin Holding Ltd group accepts any liability or responsibility whatsoever for any consequential loss of any kind arising out of the use of this document or any part of its contents. The use of this document should not be regarded as a substitute for the exercise by the recipient of their own judgement. Sarasin & Partners LLP and/or any person connected with it may act upon or make use of the material referred to herein and/or any of the information upon which it is based, prior to publication of this document.
Where the data in this document comes partially from third-party sources the accuracy, completeness or correctness of the information contained in this publication is not guaranteed, and third-party data is provided without any warranties of any kind. Sarasin & Partners LLP shall have no liability in connection with third-party data.
© 2026 Sarasin & Partners LLP. All rights reserved. This document is subject to copyright and can only be reproduced or distributed with permission from Sarasin & Partners LLP. Any unauthorised use is strictly prohibited.