When convicted Boston Marathon bomber Dzhokhar Tsarnaev’s lawyers filed a motion last month requesting a new trial at a different venue, a key reason was largely overlooked in superficial media coverage. The legal team based its filing in large part on evidence it says shows at least some of the jurors were exposed to “inflammatory” information on their social media feeds during the trial.
“The social media activity of individual jurors and of their social media contacts is highly relevant to the question of venue, because it further demonstrates a constitutionally-intolerable level of risk to the fairness of the trial …” the defense wrote.
We don’t yet know what that “social media activity” consists of because that section of Tsarnaev’s motion was redacted.
But could social media posts have actually influenced the outcome of the Boston Marathon Bombing trial?
We do know of one instance in which not seeing a particular news story may have had a literally fatal effect. Kevan Fagan, the only juror whose identity is publicly known, admitted that had he known some of the victims’ families were opposed to the death penalty, he “probably” would not have voted for it.
That’s a stunning admission. Had Fagan voted the other way, the resulting non-unanimous verdict would have ensured that Tsarnaev was not now on death row. His assertion also underscores the monumental power that “inflammatory” information can have over this deadly serious process. What if one of Fagan’s friends or family members had happened to post a headline about the families’ opposition on Facebook? He says he used Facebook while serving on the jury but somehow avoided discussions of the trial.
But what did other jurors see that may have influenced them to vote for the death penalty? Peer-reviewed studies have shown that social media can be extremely effective at influencing people’s thoughts.
And as far as criminal trials are concerned, lawyers spend vast amounts of time, paper, and ink fighting over what can and cannot be shown to jurors. It turns out social media has moved front and center in this particular court fight.
Remember those creepy Facebook experiments? Last summer it was revealed that Facebook had conducted “emotional experiments” on hundreds of thousands of unwitting Facebook users in an attempt to create what researchers described as “emotional contagion” — and it worked.
The experiment, published in the journal Proceedings of the National Academy of Sciences, was designed to determine whether experimenters could cause a measurable shift in users’ emotional states by tweaking which emotional trigger words appeared in their Facebook News Feeds.
In a separate study, researchers found that Facebook, by selectively adding graphics to users’ pages, could sway a close election. (A recent WhoWhatWhy article details another study which found that tweaking Google search algorithms also has the potential to swing elections.)
The aptly named 2014 Guardian article, “If Facebook can tweak our emotions and make us vote, what else can it do?” describes the subtle manipulation of voters:
It’s not only emotions Facebook can nudge. It can make you vote too. On presidential election day 2010 it offered one group in the US a graphic with a link to find nearby polling stations, along with a button that would let you announce that you’d voted, and the profile photos of six other of your “friends” who had already done so. Users shown that page were 0.39% more likely to vote than those in the “control” group, who hadn’t seen the link, button or photos.
The researchers reckoned they’d mobilised 60,000 voters — and that the ripple effect caused a total of 340,000 extra votes. That’s significantly more than George Bush won by in 2000, where Florida hung on just 537 votes.
In other words, simply adding or subtracting seemingly insignificant bits of information to people’s Facebook pages had a real-world influence on their behavior.
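A quick back-of-the-envelope check shows how those quoted numbers relate to one another. The inputs below are all figures reported above; the implied audience size is derived from them for illustration and is not a number stated in the article:

```python
# Back-of-the-envelope arithmetic using the figures quoted above.
# All inputs are as reported; "implied_users" is a derived estimate,
# not a number stated in the article.
direct_voters = 60_000        # users the researchers reckoned were directly mobilised
total_extra_votes = 340_000   # total including the ripple (social contagion) effect
turnout_lift = 0.0039         # 0.39% higher turnout among users shown the graphic

ripple_votes = total_extra_votes - direct_voters   # votes attributable to contagion alone
implied_users = direct_voters / turnout_lift       # audience size implied by the lift
florida_margin_2000 = 537                          # Bush's official 2000 Florida margin

print(f"ripple votes: {ripple_votes:,}")                          # 280,000
print(f"implied audience: {implied_users:,.0f}")                  # roughly 15.4 million users
print(f"multiple of Florida margin: {total_extra_votes / florida_margin_2000:.0f}x")
```

In other words, most of the claimed effect came not from the graphic itself but from the social ripple it set off, and the total dwarfs the margin that decided the 2000 election.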
Even creepier, some of the world’s clandestine agencies have weaponized social media, spending untold sums of money in an effort to sway the hearts and minds of various populations.
This is not to suggest that anyone’s social media was deliberately manipulated to change opinions about the Tsarnaev trial. But it does show that the kinds of information we are exposed to on social media can profoundly affect the way we think and feel about a topic — with tangible results.
There’s a growing recognition of the problem that the Internet in general, and social media in particular, pose for jury trials.
A juror in the federal corruption trial of Pennsylvania State Senator Vincent J. Fumo was found to be posting updates about the trial to Facebook and Twitter, a discovery that the defendant’s lawyers used as grounds for appeal. Although the appeal ultimately failed, it prompted one of the appellate judges to write a separate opinion highlighting the potential of social media to skew the fairness of jury trials.
And in the Jodi Arias murder trial, questions were raised about one juror’s social media use: according to the prosecutor, she had followed the Facebook pages of multiple news stations that covered the trial extensively. Particular suspicion arose because the juror was the only holdout against sentencing Arias to death. Her position created a behind-the-scenes battle, as the prosecutor and the other 11 jurors sought to have her removed. The judge refused and, after the deadlocked jury forced a mistrial in the penalty phase, Arias will instead serve life in prison.
A 2009 New York Times article explains the problem in general terms:
Jurors are not supposed to seek information outside of the courtroom. They are required to reach a verdict based on only the facts the judges decided are admissible, and they are not supposed to see evidence that has been excluded as prejudicial. But now, using their cell phones, they can look up the name of a defendant on the Web or examine an intersection using Google Maps, violating the legal system’s complex rules of evidence.
They can also be exposed to the information on Facebook — whether they’re looking for it or not.
With the inordinate expense of criminal trials, judges are likely reluctant to declare a mistrial except in the most egregious cases — as when a Florida judge threw out a 2009 drug case after discovering that nine of the jurors had been Googling information about the trial. But judges are clearly aware of the problem and, as a result, have been ramping up their admonitions against social media use in both scope and frequency.
Yet despite judges’ repeated warnings against “communicating” about the trial with others in any way, jurors cannot seem to resist the siren call of social media.
The judge in the Tsarnaev trial, US District Court Judge George O’Toole, told jurors explicitly that “they must not communicate about this case; or allow anyone to communicate about it with you, by telephone, text message, Skype, email, or via social media such as Twitter or Facebook [emphasis added],” according to Tsarnaev’s latest motion.
However, just opening one’s Facebook page allows friends, family and associates to “communicate” their thoughts and opinions about a host of topics — even if we don’t actively engage them.
We know the identity of only one juror because the judge in the Tsarnaev trial has taken the unprecedented step of withholding jurors’ names for more than three months since the trial ended — a fact that finally set off alarm bells among some in the Boston media about the highly unusual secrecy.
O’Toole stated in a decision filed last week that his choice to withhold the names is due mainly to the motion filed by Tsarnaev’s lawyers, which claimed jurors were exposed to “inflammatory” information on their social media feeds — allegations O’Toole says he needs time to investigate.
It will be interesting to see (if we do indeed get to see) what floated through Kevan Fagan’s Facebook feed, which brings us back to the question posed by the Guardian: “what else can it [Facebook] do?”
Vote for the death penalty — or not?
Related front page panorama photo credit: Courthouse (Tim Evanson / Flickr)