Dec 02, 2013, 12:55 pm
Facebook is developing a speech impediment. The recent fracas over beheading videos was marked by severe bouts of waffling from the social media giant. On one hand, it seems to want to allow unfettered expression. On the other hand, it's set itself up as the content police.
These two aspects often collide with disastrous results. Beheadings are a go, but breast cancer groups can't post photos of mastectomies. Recent partnerships with government agencies see Facebook willing to censor by proxy, even as it attempts to roll back its control in other areas. Giving 800+ million users access to a "report" button is well-intended, but the reality is more troubling. Something that's simply unpopular can be clicked into oblivion in nearly no time whatsoever.
Here's how Facebook's efforts to play both sides (free speech/content patrol) work in practice. [Caution: contains mandatory autoplay video because every TV news website in the US apparently believes we want to be informed without consent.]
Quote: Fitness advocate and mother of three, Maria Kang, is an avid blogger and Facebook user, but her most recent post didn't sit well with Facebook, and some people are now calling her a bully...
When Kang saw an article online from the Daily Mail showing plus-size women posing in lingerie, she logged onto her Facebook page and started writing.
"The popular and unrelenting support received to those who are borderline obese (not just 30-40lbs overweight) frustrates me as a fitness advocate who intimately understands how poor health negatively effects a family, a community and a nation," Kang said.
"While I think it's important to love and accept your body, I was a little peeved because I think that we're normalizing obesity in our society," Kang said.
One thing worth pointing out is that Kang posted this rant on her own page. She didn't head to a body acceptance group, the Daily Mail's post, or anywhere else where she would be, in effect, intruding. So, while people may have found it offensive, she didn't go out of her way (so to speak) to offend.
This ultimately didn't matter. Outraged readers reacted swiftly enough to get her post taken down and her account yanked in record time. Facebook explained its actions to Kang this way.
Quote: Just three hours after her post went viral, Facebook responded by removing the post and shutting down her account, calling the post "hate speech," according to Kang.
The line between protected expression and hate speech may occasionally be blurry, but there's a big difference between what Kang said and what most people would consider to be "hate speech." Something that angers others -- even many others -- is not "hate speech" solely because it provokes a strong reaction.
Shortly after Kang went public with this, Facebook offered a completely different explanation of what happened.
Quote:"A user reported content on her page, and it was mistakenly removed by Facebook. When we realized the error we corrected it immediately, and restored full access to Ms Kang. We apologize for the inconvenience."
Facebook restored Kang's account but did not restore the post, saying Kang could repost it if she chose to. At this point, the post is still deleted, along with every comment, Like, etc. that was attached to it. Reposting it fresh would pretty much be pointless. Kang made her point once and was treated to Facebook's baffling, inconsistent "policy" on controversial content.
Many readers agreed with Kang's sentiments. Many others felt her post amounted to bullying. There's no real "right" viewpoint here, because all Kang did was express her opinion. The correct response would have been to leave it alone and let the best weapon against offensive speech sort it out: more speech. Kang's post undoubtedly drew at least as much vitriol as she offered up. Speech battles speech better than weak, censorious efforts like Facebook's knee-jerk response.
The platform has nearly one billion users. To shut down speech because of a single report completely undermines the power inherent in one of the largest social networks in the world. Facebook: you have a billion voices. Let them talk it out. Hastily pulling the trigger at the behest of those offended only makes it harder for the offender to see how their words have affected others. It certainly doesn't teach anyone to be more tolerant. All it does is feed into their mindset by proving them "right" -- that the rest of the world can't handle their "truth."
Source