
FACEBOOK WEAPONIZED - PSYCHOLOGICAL MOOD MANIPULATION

Facebook isn't just being used by the Illuminati to feed us false news. It's also being used to manipulate our emotions. The Illuminati uses Facebook as a mood weapon - to make certain groups depressed at certain times. If you don't want Democrats to show up to an election, depress them through their Facebook feeds before they go to the polls.

In 2014 Facebook published details of a vast experiment in which it manipulated information posted on 689,000 users' home pages and found it could make people feel more positive or negative through a process of "emotional contagion".

In a study with academics from Cornell and the University of California, Facebook filtered users' news feeds – the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users' exposure to their friends' "positive emotional content", resulting in fewer positive posts of their own. Another test reduced exposure to "negative emotional content" and the opposite happened.

The study concluded: "Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks."
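To see how such a manipulation can be engineered, here is a minimal, hypothetical sketch in Python: posts are crudely scored with word lists, a fraction of a friend's positive (or negative) posts is withheld from the feed, and the emotional tone of the user's own subsequent posts is then measured. The word lists, function names, and suppression rate are invented for illustration; the actual study relied on Facebook's internal systems and standard word-counting software, not this code.

```python
# Hypothetical illustration of an "emotional contagion" feed experiment.
# Word lists, data structures, and the suppression rate are invented
# assumptions, not Facebook's or the study authors' code.
import random

POSITIVE_WORDS = {"happy", "great", "love", "wonderful", "excited"}
NEGATIVE_WORDS = {"sad", "terrible", "hate", "awful", "depressed"}

def sentiment(post: str) -> str:
    """Crudely label a post as positive, negative, or neutral by counting words."""
    words = [w.strip(".,!?") for w in post.lower().split()]
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    if pos > neg:
        return "positive"
    if neg > pos:
        return "negative"
    return "neutral"

def filter_feed(posts, suppress, rate=0.5):
    """Withhold roughly `rate` of the posts whose sentiment matches `suppress`."""
    return [p for p in posts if sentiment(p) != suppress or random.random() > rate]

def emotional_tone(own_posts):
    """Outcome measure: fraction of the user's own posts that read as positive."""
    labels = [sentiment(p) for p in own_posts]
    return labels.count("positive") / len(labels) if labels else 0.0

# A user in the "reduced positive content" condition sees fewer upbeat posts;
# the experimenter then compares the tone of what that user goes on to write
# against a control group that saw an unfiltered feed.
friend_posts = ["I love this wonderful day!", "Feeling awful and sad.", "Lunch was fine."]
print(filter_feed(friend_posts, suppress="positive"))
print(emotional_tone(["Great news, so excited!", "Nothing to report."]))
```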

Lawyers, internet activists and politicians said this weekend that the mass experiment in emotional manipulation was "scandalous", "spooky" and "disturbing". This is the type of weapon the CIA has wanted for decades. The ability to manipulate mood is the core of PSYOPS. Facebook is the ultimate PSYOP weapon - it manipulates mood and can be targeted at individuals, particular groups, racial groups, or entire countries.

In 2014, a senior British MP called for a parliamentary investigation into how Facebook and other social networks manipulated emotional and psychological responses of users by editing information supplied to them.

Jim Sheridan, a member of the Commons media select committee, said the experiment was intrusive. "This is extraordinarily powerful stuff and if there is not already legislation on this, then there should be to protect people," he said. "They are manipulating material from people's personal lives and I am worried about the ability of Facebook and others to manipulate people's thoughts in politics or other areas. If people are being thought-controlled in this kind of way there needs to be protection and they at least need to know about it."

Commentators voiced fears that the process could be used for political purposes in the run-up to elections, or to encourage people to stay on the site by feeding them happy thoughts and so boosting advertising revenues.

In a series of Twitter posts, Clay Johnson, the co-founder of Blue State Digital, the firm that built and managed Barack Obama's online campaign for the presidency in 2008, said: "The Facebook 'transmission of anger' experiment is terrifying."

He asked: "Could the CIA incite revolution in Sudan by pressuring Facebook to promote discontent? Should that be legal? Could Mark Zuckerberg swing an election by promoting Upworthy [a website aggregating viral content] posts two weeks beforehand? Should that be legal?"

It was claimed that Facebook may have breached ethical and legal guidelines by not informing its users they were being manipulated in the experiment, which was carried out in 2012.

The study said altering the news feeds was "consistent with Facebook's data use policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research".

But Susan Fiske, the Princeton academic who edited the study, said she was concerned. "People are supposed to be told they are going to be participants in research and then agree to it and have the option not to agree to it without penalty."

James Grimmelmann, professor of law at the University of Maryland, said Facebook had failed to gain "informed consent" as defined by the US federal policy for the protection of human subjects, which demands explanation of the purposes of the research and the expected duration of the subject's participation, a description of any reasonably foreseeable risks and a statement that participation is voluntary. "This study is a scandal because it brought Facebook's troubling practices into a realm – academia – where we still have standards of treating people with dignity and serving the common good," he said on his blog.

It is not new for internet firms to use algorithms to select content to show to users, and Jacob Silverman, author of Terms of Service: Social Media, Surveillance, and the Price of Constant Connection, told Wire magazine on Sunday that the internet was already "a vast collection of market research studies; we're the subjects".

"What's disturbing about how Facebook went about this, though, is that they essentially manipulated the sentiments of hundreds of thousands of users without asking permission," he said. "Facebook cares most about two things: engagement and advertising. If Facebook, say, decides that filtering out negative posts helps keep people happy and clicking, there's little reason to think that they won't do just that. As long as the platform remains such an important gatekeeper – and their algorithms utterly opaque – we should be wary about the amount of power and trust we delegate to it."

Here is the only mention of “informed consent” in the paper: The research “was consistent with Facebook’s Data Use Policy, to which all users agree prior to creating an account on Facebook, constituting informed consent for this research.”

That is not how most social scientists define informed consent.

Here is the relevant section of Facebook’s data use policy: “For example, in addition to helping people see and find things that you do and share, we may use the information we receive about you ... for internal operations, including troubleshooting, data analysis, testing, research and service improvement.”

So there is a vague mention of “research” in the fine print that one agrees to by signing up for Facebook. As bioethicist Arthur Caplan told me, however, it is worth asking whether this lawyerly disclosure is really sufficient to warn people that “their Facebook accounts may be fair game for every social scientist on the planet.”

Any scientific investigation that receives federal funding must follow the Common Rule for human subjects, which defines informed consent as involving, among other things, “a description of any foreseeable risks or discomforts to the subject.” As Grimmelmann observes, nothing in the data use policy suggests that Facebook reserves the right to seriously bum you out by cutting all that is positive and beautiful from your news feed. Emotional manipulation is a serious matter, and the barriers to experimental approval are typically high. (Princeton psychologist Susan K. Fiske, who edited the study for PNAS, told the Atlantic that this experiment was approved by the local institutional review board. But even she admitted to serious qualms about the study.)

Facebook presumably receives no federal funding for such research, so the investigation might be exempt from the Common Rule. Putting aside the fact that obeying these regulations is common practice even for private research firms such as Gallup and Pew, the question then becomes: Did Cornell or the University of California–San Francisco help finance the study? As public institutions, both fall under the law’s purview. If they didn’t chip in but their researchers participated nonetheless, it is unclear what standards the experiment would legally have to meet, according to Caplan. (I reached out to the study authors, their universities, and Facebook, and will update this story if they reply.)

Even if the study is legal, it appears to flout the ethical standards spelled out in instructions to scientists who wish to publish in PNAS. “Authors must include in the Methods section a brief statement identifying the institutional and/or licensing committee approving the experiments,” reads one requirement on the journal’s website. (The study did not.) “All experiments must have been conducted according to the principles expressed in the Declaration of Helsinki,” reads another. The Helsinki standard mandates that human subjects “be adequately informed of the aims, methods, sources of funding, any possible conflicts of interest, institutional affiliations of the researcher, the anticipated benefits and potential risks of the study and the discomfort it may entail.”

Over the course of the study, it appears, the social network made some of us happier or sadder than we would otherwise have been. Now it’s made all of us more mistrustful.