UK mental health charities handed sensitive data to Facebook for targeted ads

Some of Britain’s biggest charities providing support for people with mental health problems shared details of sensitive web browsing with Facebook for use in its targeted advertising system.

The data was sent via a tracking tool embedded in the charities’ websites and included details of webpages a user visited and buttons they clicked across content linked to depression, self-harm and eating disorders.

It also included details of when users requested support – such as clicking a link saying “I need help” – and when they viewed webpages to access online chat tools. Some of the pages that triggered data-sharing with Facebook were aimed specifically at children, including one for 11- to 18-year-olds offering advice on suicidal thoughts.

The data sent to Facebook during the Observer’s analysis did not include details of conversations between charities and users or messages sent via chat tools. All the charities said such messages were confidential and stressed that they took service user privacy extremely seriously.

However, it often related to browsing that users would usually expect to be private – including details of button clicks and page views across the websites of the mental health charities Mind, Shout and Rethink Mental Illness, and the eating disorder charity Beat.

The information was matched to the user’s IP address – an identifier that can usually be linked to an individual or household – and, in many cases, to their Facebook account ID.

Most of the charities have now removed the tracking tool, known as Meta Pixel, from their websites.

The findings come after an Observer investigation last week revealed that 20 NHS England trusts were sharing data with Facebook for targeted advertising – including information about browsing activity across hundreds of webpages linked to specific medical conditions, appointments, medication and referral requests.

A page about self-harm on the website for mental health charity Shout, which shared details of the browsing with Facebook via the Meta Pixel. Photograph: Screengrab

In one case, an NHS trust told Facebook when a user viewed a guide to HIV drugs. After being alerted to the presence of the tracking tool on their websites, and the nature of the data being shared, most of the trusts removed it and apologised to patients. All 20 have now stopped using it.

The NHS trusts had also been using Meta Pixel, a piece of code provided by Facebook that can be embedded in an organisation’s website. Facebook says it can help organisations gain “rich insights” into website performance and user behaviour.

But the company also uses the data sent to it via the pixel for its own business purposes, including improving its targeted advertising. In one guide, Facebook’s parent company, Meta, says it uses data collected by the pixel to improve users’ experiences, for example by showing them ads they “might be interested in”. “You may see ads for hotel deals if you visit travel websites,” it explains.
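
The mechanics are simple. In a typical integration, Facebook’s base code loads a script that exposes a global fbq() function, and the site then reports events through it. The sketch below is illustrative only: the pixel ID and the custom event name are placeholders, not taken from any of the charities’ sites.

```typescript
// Illustrative sketch of a typical Meta Pixel integration. The pixel ID and
// the custom event name are placeholders, not taken from any charity's site.
// In a real deployment, Facebook's base-code snippet loads
// https://connect.facebook.net/en_US/fbevents.js, which provides fbq().
declare const fbq: (...args: unknown[]) => void;

// Ties subsequent events to the site owner's pixel ID. Each hit sent to
// Facebook carries the visitor's IP address and, where present, cookies
// that can link it to a Facebook account.
fbq('init', '0000000000000000'); // placeholder pixel ID

// Fired on every page load by the standard base code - this is how the URL
// of, say, a self-harm advice page ends up in the data sent to Facebook.
fbq('track', 'PageView');

// Sites can also report interactions, such as clicks on a support button,
// as custom events.
document.querySelector('#i-need-help')?.addEventListener('click', () => {
  fbq('trackCustom', 'SupportButtonClick', { page: window.location.pathname });
});
```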

Facebook says it makes clear that organisations should not use Meta Pixel to collect or share sensitive data, such as information that could reveal details about a person’s health or data relating to children. It also says it has filters to weed out sensitive data it receives by mistake. But past research has suggested these don’t always work, and Facebook itself admits the system “doesn’t catch everything”.

The company has been accused of doing too little to monitor the information it is sent, and has faced questions over why it allows certain organisations – such as hospitals or mental health charities – to send it data in the first place.

In some cases, the UK organisations found to be using the tool said they were unaware of how information was being shared with Facebook.

One charity, Rethink Mental Illness, said it used Meta Pixel to “optimise communications” relating to events and fundraising.

Facebook was sent data on sensitive web browsing by several mental health charities. Photograph: Screengrab

“We were not aware that Meta was collecting and using personal identifiable information, such as IP addresses, to profile users,” it said. The charity said it was “very troubled” by the data use. It has removed the tool and apologised.

Shout, which runs a crisis text line for children and adults, said Meta Pixel was installed to “track the efficacy” of campaigns promoting its confidential service. Logs of data sent to Facebook by the charity show it included details of when users clicked links to view pages called “support with abuse”, “support with suicidal thoughts” and “support with self-harm”.

A spokesperson said the charity took the privacy and data protection of service users “very seriously” and was “not aware Meta was using data in this way”. It has removed the tool.

Beat said it used Meta Pixel to support advertising for fundraising, campaigning and promoting its support services. “We take the privacy of the people we support very seriously and we are investigating this as a matter of urgency,” a spokesperson said.

Cath Biddle, head of digital at Mind, said the privacy of visitors to its website was of “utmost importance” and that it was carrying out a “comprehensive audit” of its policies and data use. “We have immediately paused our use of Meta Pixel while we investigate this matter further,” she said.

In all, the Observer tested 32 charity websites and found seven using the tracking tool, in addition to the 20 of 213 NHS trust sites identified last week. As well as wide variation in reliance on tracking tools, the analysis revealed significant differences in how organisations informed service users about the data use and obtained their consent. While some of the charities mentioned Meta Pixel in their privacy policies, in every case the data transfer happened automatically when the charity’s webpage loaded, before the user had clicked to accept or decline cookies – meaning data was shared by default.
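
By contrast, a consent-aware setup holds back all tracking until the visitor has actively accepted cookies. A minimal sketch is below, assuming a hypothetical onCookieConsent() callback exposed by a site’s cookie banner; it does not describe how any of the charities’ sites were configured.

```typescript
// Sketch of consent-gated tracking. onCookieConsent() is a hypothetical hook
// assumed to be exposed by the site's cookie banner; the pixel ID is a
// placeholder. A real deployment would use Facebook's full base-code snippet.
declare const fbq: (...args: unknown[]) => void;
declare function onCookieConsent(callback: (accepted: boolean) => void): void;

function loadMetaPixel(pixelId: string): void {
  // The tracking script is only fetched here, so no request reaches
  // Facebook before the visitor has made a choice.
  const script = document.createElement('script');
  script.src = 'https://connect.facebook.net/en_US/fbevents.js';
  script.onload = () => {
    fbq('init', pixelId);
    fbq('track', 'PageView');
  };
  document.head.appendChild(script);
}

onCookieConsent((accepted) => {
  if (accepted) {
    loadMetaPixel('0000000000000000'); // placeholder pixel ID
  }
});
```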

Cori Crider, cofounder of Foxglove, a legal nonprofit specialising in data privacy law, said the findings were “alarming”. “People should be able to trust that when they’re going to a charity seeking help, that is going to stay private,” she said. She urged charities and other organisations that handle sensitive data to “get wise” to the way Facebook’s advertising systems work, and the fact it had “created an economy in trafficking people’s data”, and called for action from politicians and regulators. “They have been asleep at the wheel,” she said.

In the US, Facebook is facing legal action, accused of violating web users’ privacy by knowingly collecting and monetising “individually identifiable health information” via the Meta Pixel tool. Several hospitals have also been sued. Earlier this year, online therapy company BetterHelp was ordered to pay $7.8m by the US government for allegedly sharing sensitive data with Facebook and other social media firms for advertising.

A Meta spokesperson said: “Organisations should not send sensitive information about people through our business tools, doing so is against our policies.

“We educate advertisers on properly setting up business tools to prevent this from occurring and our system is designed to filter out potentially sensitive data it is able to detect.”

The UK regulator, the Information Commissioner’s Office, said it had “noted the new findings” regarding use of Meta Pixel on charity websites. It is already investigating use of pixels on NHS trust websites after the Observer investigation last week.

A spokesperson said: “Organisations must provide clear and comprehensive information to users when using cookies and similar technologies, especially where sensitive personal information is involved. We are continuing to review the findings and to investigate the potential extent of any personal data collected and shared with third parties via the use of pixels.”