Joe Rogan was left stunned after discovering that Facebook conducted a secret experiment on hundreds of thousands of people that allegedly drove some to attempt suicide.
Rebecca Lemov, a historian of science and expert on mind control at Harvard University, revealed on the Joe Rogan Experience podcast that the social media giant quietly altered the news feeds of nearly 700,000 users in 2012.
These changes made the news feeds either more positive or more negative, allowing Facebook to manipulate users' emotions and study the effect on their behavior.
The Mark Zuckerberg-owned platform claimed it carried out the one-week experiment to 'make the content people see on Facebook as relevant and engaging as possible.'
However, the experiment was slammed by the public as brainwashing because it covertly manipulated users by curating their news feed content, skewing their feeds toward emotionally charged stories and potentially causing mental harm.
Lemov said that the experiment was similar to the type of brainwashing that takes place in cults, which focuses on changing the way people think by having them 'catch' the 'contagious' emotions of others.
'It's not that it changed my thoughts, it's that it changed my feelings about my thoughts,' Lemov explained.
Facebook never informed the 689,003 people whose feeds were altered that they were involved, and Lemov added that at least one person claimed the negative news feed they saw during the study period pushed them to attempt suicide.

Despite fierce public outrage when the experiment became public in 2014, the mind control expert noted that Facebook users unknowingly agree to be part of such experiments when they sign up for the platform.
'Whenever you go on the platform, you agree to be tested or A-B testing,' revealed Lemov, the author of 'The Instability of Truth: Brainwashing, Mind Control and Hyper-Persuasion.'
'This is why there was an ethical debate when the experiment was published in 2014,' she added in the May 15 podcast.
The study was carried out by a Facebook data scientist together with researchers from Cornell University and the University of California, San Francisco.
The experiment secretly changed the news feed algorithm to prioritize either positive or negative posts for different groups of users, without their knowledge or explicit consent.
Their goal was to see how emotions spread through social networks and understand whether the emotional tone of certain content influences the viewer's own posts and emotional states.
Simply put, the scientists tried to see whether they could make people happier or more depressed by subjecting them to an altered, biased news feed designed to sway them in one direction or the other.
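To make the mechanics concrete: this was essentially an A/B test on feed ranking. The short Python sketch below shows one way such a test could probabilistically withhold positive or negative posts from different groups of users. It assumes a ready-made per-post sentiment score, and every name, rate, and function in it is a hypothetical illustration, not Facebook's actual code.

# Hypothetical sketch of sentiment-based feed filtering for an A/B test.
# Nothing here is Facebook's real implementation; the sentiment scores,
# experiment arms, and omission rate are all illustrative assumptions.
import random
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    sentiment: float  # assumed pre-computed score, -1.0 (negative) to +1.0 (positive)

def filter_feed(posts: list[Post], arm: str, omit_rate: float = 0.5) -> list[Post]:
    """Build a feed for one experiment arm.

    arm == "reduce_positive": probabilistically withhold positive posts.
    arm == "reduce_negative": probabilistically withhold negative posts.
    Any other value: control group, feed left unchanged.
    """
    kept = []
    for post in posts:
        too_positive = arm == "reduce_positive" and post.sentiment > 0
        too_negative = arm == "reduce_negative" and post.sentiment < 0
        if (too_positive or too_negative) and random.random() < omit_rate:
            continue  # withhold this post from the user's feed
        kept.append(post)
    return kept

feed = [Post("Great day!", 0.8), Post("Feeling awful.", -0.7), Post("Had lunch.", 0.0)]
print([p.text for p in filter_feed(feed, arm="reduce_positive")])

Comparing what users in each arm go on to post, relative to the control group, is then what lets researchers measure whether the emotional skew 'spreads'.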
The study found that users exposed to more positive content did indeed post more positive updates, while those shown more negative content posted more negative updates.

However, Lemov revealed that one person wrote on the research group's Facebook page in 2014 that they were in a hospital emergency room threatening to commit suicide at the time of this secret experiment. Others shared similar experiences.
'Of course, they could never know and it can't be traced backwards,' the expert explained, noting that neither this person nor anyone else had any way of finding out whether Facebook had manipulated their feed.
'Whenever people have power, unchecked power, and insane influence, particularly influence to manipulate people and influence over people's minds... You could get away with so much,' Rogan warned.
'No matter who you are, you're vulnerable, whether it's through society, whether it's through peer groups, whether it's through community, we're vulnerable. Everyone's vulnerable,' he added.
The public response to the study, published in the Proceedings of the National Academy of Sciences, even triggered an investigation by government officials.
In the US, no direct legal action was taken as the experiment was deemed to fall within Facebook's terms of service at the time, which allowed data use for research purposes.
The Electronic Privacy Information Center (EPIC) - a nonprofit research group focused on protecting privacy and freedom of expression online - filed a complaint with the Federal Trade Commission (FTC) in 2014.

EPIC alleged that Facebook deceived users by misrepresenting its data practices and violated a 2012 FTC consent order.
The complaint argued that the unannounced study manipulated users' emotions by altering the content of their news feeds, but it never escalated into a full-blown lawsuit against Facebook.
The UK's Information Commissioner's Office (ICO) also looked into whether Facebook's actions violated data protection laws.
Despite claims that Facebook's secret project was unethical, the ICO did not impose any sanctions on the company for the controversial study.
The 2012 experiment was not the only case of mind control and manipulation involving unsuspecting people online.
Rogan also referenced research by Robert Epstein, whom he interviewed in a September 2024 episode of his podcast, suggesting that internet search engine results can significantly sway undecided voters.
Epstein's research also suggested that a substantial portion of online discourse may be driven by automated bots shaping narratives for political or corporate agendas.
Rogan cited the 2016 presidential election, claiming that if someone searched Google for 'is Hillary Clinton a criminal?', the search engine would only produce results about Donald Trump being an alleged criminal.
'You wouldn't find things on Hillary Clinton. You had to keep digging and digging and digging. If you wanted to find positive things on Hillary Clinton, you could find them quite easily,' Rogan explained.
Rogan claimed that few internet search engines allow people to search for so-called 'controversial' topics.
According to the podcast host, popular search engines like Google alter their results in an attempt to censor what he called malinformation.
'Malinformation is information that is correct, but that would be ultimately harmful. They put vaccine side-effects under malinformation because it would cause vaccine hesitancy,' Rogan said.
In January 2025, Meta announced it would scrap its third-party fact-checking program, starting in the US, over continued concerns that its censorship of content on Facebook was hampering free speech.