Was DoD Behind Facebook’s Controversial Manipulation Study?


PurpleSkyz
Admin

Was DoD Behind Facebook’s Controversial Manipulation Study?

Posted on July 2, 2014 by Deus Nexus

Facebook reveals news feed experiment to control emotions

  • By Robert Booth | The Guardian

Protests over secret study involving 689,000 users in which friends’ postings were moved to influence moods.
Facebook, the world’s biggest social networking site, is facing a storm of protest after it revealed it had discovered how to make users feel happier or sadder with a few computer key strokes.
It has published details of a vast experiment in which it manipulated information posted on 689,000 users’ home pages and found it could make people feel more positive or negative through a process of “emotional contagion”.
In a study with academics from Cornell and the University of California, Facebook filtered users’ news feeds – the flow of comments, videos, pictures and web links posted by other people in their social network. One test reduced users’ exposure to their friends’ “positive emotional content”, resulting in fewer positive posts of their own. Another test reduced exposure to “negative emotional content” and the opposite happened.
The study concluded: “Emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.”
Lawyers, internet activists and politicians said this weekend that the mass experiment in emotional manipulation was “scandalous”, “spooky” and “disturbing”.  READ MORE
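In practical terms, the manipulation described in the excerpt above amounts to probabilistically withholding posts of one emotional class from a user’s feed before it is displayed. Here is a minimal sketch of that idea in Python; the field names, the 50% omission rate, and the filtering rule are illustrative assumptions, not Facebook’s actual implementation.

[code]
import random

def filter_feed(posts, suppress, omit_probability):
    """Drop each post whose (precomputed) sentiment matches `suppress`
    with probability `omit_probability`; everything else passes through."""
    kept = []
    for post in posts:
        if post["sentiment"] == suppress and random.random() < omit_probability:
            continue  # withheld from this user's feed for this viewing
        kept.append(post)
    return kept

feed = [
    {"author": "a", "text": "what a great day!", "sentiment": "positive"},
    {"author": "b", "text": "awful news again", "sentiment": "negative"},
    {"author": "c", "text": "making soup", "sentiment": "neutral"},
]

# One experimental arm sees fewer positive posts, the other fewer negative ones;
# a control arm would simply use omit_probability=0.
reduced_positive_arm = filter_feed(feed, suppress="positive", omit_probability=0.5)
reduced_negative_arm = filter_feed(feed, suppress="negative", omit_probability=0.5)
[/code]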
Was the Department of Defense Behind Facebook’s Controversial Manipulation Study?
Reposted from: Liberty Blitzkrieg | by Michael Krieger
I’ve spent pretty much all day reading as much as possible about the extremely controversial Facebook “emotional contagion” study in which the company intentionally altered its news feed algorithm to see if it could manipulate its users’ emotions. In case you weren’t aware, Facebook is always altering your news feed under the assumption that there’s no way they could fill your feed with all of your “friends’” pointless, self-absorbed, dull updates (there’s just too much garbage).
As such, Facebook filters your news feed all the time, something advertisers must find particularly convenient. In any event, the particular alteration in question occurred during one week in January 2012, when the company filled some people’s feeds with more positive posts while others were fed more negative ones.
Once the data was compiled, academics from the University of California, San Francisco and Cornell University were brought in to analyze the results. Their findings were then published in the prestigious Proceedings of the National Academy of Sciences. They found that:
For people who had positive content reduced in their News Feed, a larger percentage of words in people’s status updates were negative and a smaller percentage were positive. When negativity was reduced, the opposite pattern occurred. These results suggest that the emotions expressed by friends, via online social networks, influence our own moods, constituting, to our knowledge, the first experimental evidence for massive-scale emotional contagion via social networks.
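The outcome measure in the finding just quoted is simply the share of positive and negative words in each status update. A rough sketch of that kind of word-count measure is below; the published analysis used the LIWC word categories, so the tiny word sets here are placeholders, not the real lists.

[code]
# Placeholder word sets; the real analysis relied on the LIWC dictionaries.
POSITIVE_WORDS = {"happy", "great", "love", "good", "nice"}
NEGATIVE_WORDS = {"sad", "awful", "hate", "bad", "angry"}

def emotion_word_percentages(status_update):
    """Return (percent positive words, percent negative words) for one update."""
    words = [w.strip(".,!?").lower() for w in status_update.split()]
    if not words:
        return 0.0, 0.0
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    return 100.0 * pos / len(words), 100.0 * neg / len(words)

print(emotion_word_percentages("what a great and happy day"))    # positive share higher
print(emotion_word_percentages("awful news today, I hate this"))  # negative share higher
[/code]

Comparing the average of these percentages across the experimental arms is the kind of analysis behind the quoted result.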
You probably know most of this already, but here is where it starts to get really strange. Initially, the press release from Cornell highlighting the study said at the bottom: “The study was funded in part by the James S. McDonnell Foundation and the Army Research Office.” Once people started asking questions about this, Cornell claimed it had made a mistake, and that there was no outside funding. Jay Rosen, Journalism Professor at NYU, seems to find this highly questionable. He wrote on his Facebook page that:
Strange little turn in the story of the Facebook “emotional contagion” study. Last month’s press release from Cornell highlighting the study had said at the bottom: “The study was funded in part by the James S. McDonnell Foundation and the Army Research Office.”
Why would the military be interested? I wanted to know. So I asked Adam D.I. Kramer, the Facebook researcher, that question on his Facebook page, where he has posted what he called a public explanation. (He didn’t reply to my or anyone else’s questions.)
See: https://www.facebook.com/akramer/posts/10152987150867796
Now it turns out Cornell was wrong! Or it says it was wrong. The press release now reads: “Correction: An earlier version of this story reported that the study was funded in part by the James S. McDonnell Foundation and the Army Research Office. In fact, the study received no external funding.”
Why do I call this strange? Any time my work has been featured in an NYU press release, the PR officers involved show me drafts and coordinate closely with me, for the simple reason that they don’t want to mischaracterize scholarly work. So now we have to believe that Cornell’s Professor of Communication and Information Science, Jeffrey Hancock, wasn’t shown or didn’t read the press release in which he is quoted about the study’s results (weird) or he did read it but somehow failed to notice that it said his study was funded by the Army when it actually wasn’t (weirder).
I think I would notice if my university was falsely telling the world that my research was partially funded by the Pentagon… but, hey, maybe there’s an innocent and boring explanation that I am overlooking.
It gets even more interesting from here. The Professor of Communication and Information Science, Jeffrey Hancock, whom Mr. Rosen mentions above, has a history of working with the U.S. military, specifically the DoD’s Minerva Initiative. In case you forgot what this is, the Guardian reported on it earlier this year. It explained:
A US Department of Defense (DoD) research program is funding universities to model the dynamics, risks and tipping points for large-scale civil unrest across the world, under the supervision of various US military agencies. The multi-million dollar program is designed to develop immediate and long-term “warfighter-relevant insights” for senior officials and decision makers in “the defense policy community,” and to inform policy implemented by “combatant commands.”
Launched in 2008 – the year of the global banking crisis – the DoD ‘Minerva Research Initiative’ partners with universities “to improve DoD’s basic understanding of the social, cultural, behavioral, and political forces that shape regions of the world of strategic importance to the US.”
SCG News has written one of the best articles I have seen yet on the links between the Facebook study and the Department of Defense. It notes:
In the official credits for the study conducted by Facebook you’ll find Jeffrey T. Hancock from Cornell University. If you go to the Minerva Initiative website you’ll find that Jeffrey Hancock received funding from the Department of Defense for a study called “Cornell: Modeling Discourse and Social Dynamics in Authoritarian Regimes”. If you go to the project site for that study you’ll find a [url=http://minerva.henrian.com/?graph-generator=watts-strogatz&N=50&K=5&beta=0.3&m0=2&m=1&type=disease&seed_size=1&types=1&infectiousness=0.33&idea_p_true=0.66&threshold=rbeta%281.5%2C 2%29&interval_ms=300&visualize=true&=undefined]visualization program[/url] that models the spread of beliefs and disease.
Cornell University is currently being funded for another DoD study right now called “Cornell: Tracking Critical-Mass Outbreaks in Social Contagions” (you’ll find the description for this project on the Minerva Initiative’s funding page).
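For context, the visualization linked in the passage above is parameterized as a Watts-Strogatz small-world graph (N=50, K=5, beta=0.3) with a single seed node and an infectiousness of 0.33. A minimal sketch of that general style of contagion model, using the networkx library, is below; it illustrates the technique only and is not the Minerva project’s actual code.

[code]
import random
import networkx as nx  # third-party graph library

# Parameters mirroring those visible in the linked URL; the spreading rule
# below is an assumed, simple contagion process for illustration.
N, K, BETA = 50, 5, 0.3
INFECTIOUSNESS, SEED_SIZE, STEPS = 0.33, 1, 20

random.seed(42)
graph = nx.watts_strogatz_graph(N, K, BETA)           # small-world social network
adopted = set(random.sample(list(graph.nodes), SEED_SIZE))

for _ in range(STEPS):
    newly_adopted = set()
    for node in adopted:
        for neighbor in graph.neighbors(node):
            # Each exposed neighbor independently "catches" the belief/disease.
            if neighbor not in adopted and random.random() < INFECTIOUSNESS:
                newly_adopted.add(neighbor)
    adopted |= newly_adopted

print(f"{len(adopted)} of {N} nodes reached after {STEPS} steps")
[/code]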
So I went ahead and looked at the study mentioned above, and sure enough I found this:
[Screenshot: Minerva Initiative project listing showing Jeffrey T. Hancock]
There he is, Jeff Hancock, the same guy who analyzed the Facebook data for Cornell, which initially claimed funding from the Pentagon and then denied it.
I call bullshit. Stinking bullshit.
So it seems likely that Facebook and the U.S. military are working together to study civil unrest and to explore ways of manipulating the masses into apathy or misguided feelings of contentment in the face of continued banker and oligarch theft. That alone is extremely disturbing, but the affair is troubling even apart from any military connection.
For one thing, although governments and universities need to take certain precautions when conducting such “research,” private companies like Facebook apparently do not. Rather, all they have to do is get people to click “I accept” on a terms of service agreement they never read, which allows companies to do almost anything they want to you, your data and your emotions. What we basically need to do as a society is completely update our laws. For starters, if a private corporation is going to, let’s say, totally violate your most basic civil liberties as defined under the Bill of Rights, a simple terms of service agreement should not be sufficient. For more invasive violations of such rights, perhaps a one-page, simple-to-read document explaining clearly which of your basic civil liberties you are giving away should be mandatory.
For example, had Facebook not partnered at the university level to analyze this data, we wouldn’t even know this happened at all. So what sort of invasive, mind-fucking behavior do you think all these large corporations with access to your personal data are up to? Every. Single. Day.
The Faculty Lounge blog put it perfectly when it stated:
Academic researchers’ status as academics already makes it more burdensome for them to engage in exactly the same kinds of studies that corporations like Facebook can engage in at will. If, on top of that, IRBs didn’t recognize our society’s shifting expectations of privacy (and manipulation) and incorporate those evolving expectations into their minimal risk analysis, that would make academic research still harder, and would only serve to help ensure that those who are most likely to study the effects of a manipulative practice and share those results with the rest of us have reduced incentives to do so. Would we have ever known the extent to which Facebook manipulates its News Feed algorithms had Facebook not collaborated with academics incentivized to publish their findings?
We can certainly have a conversation about the appropriateness of Facebook-like manipulations, data mining, and other 21st-century practices. But so long as we allow private entities freely to engage in these practices, we ought not unduly restrain academics trying to determine their effects. Recall those fear appeals I mentioned above. As one social psychology doctoral candidate noted on Twitter, IRBs make it impossible to study the effects of appeals that carry the same intensity of fear as real-world appeals to which people are exposed routinely, and on a mass scale, with unknown consequences. That doesn’t make a lot of sense. What corporations can do at will to serve their bottom line, and non-profits can do to serve their cause, we shouldn’t make (even) harder—or impossible—for those seeking to produce generalizable knowledge to do.
If you read Liberty Blitzkrieg, you know I strongly dislike Facebook as a company. However, this is much bigger than just one Facebook experiment with apparent military ties. What this is really about is the frightening reality that these sorts of things are happening every single day without our knowledge. We need to decide to what extent we as a society are willing to be data-mined and experimented on by corporations with access to all of our private data. Until we do, we will continue to be violated and manipulated at will.
In Liberty,
Michael Krieger


Thanks to: http://deusnexus.wordpress.com
