As the world watches the violence and anti-American protests in many Arab Spring countries, it is hard not to wonder whether this could have been avoided simply by YouTube exercising prior restraint on the anti-Muslim video created by a mysterious American. Since the development of the mass media, people have debated whether the media can truly incite action. Who can be held responsible for the negative outcome of a message disseminated through the mass (or, in recent cases, social) media? I believe this question has two components: first, the legal question (can a medium be held legally accountable?), and second, the ethical question (does a medium have an ethical duty to act in these cases?).
The Legal Question
In terms of the legal liability of media for the content they produce or disseminate, the U.S. Supreme Court has ruled consistently that unless a message contains specific instructions for violence, the media outlet is not at fault if violence ensues (see Brown v. EMA). In the United States, the First Amendment protects even offensive speech, and it also protects the medium through which such messages are communicated. It must be noted, however, that with the rise of social media, a nearly international medium, the legal question falls by the wayside, as different governments hold very different views of free speech and enforce them in very different ways.
The Ethical Question
In most cases, although legal action may be pursued by a government trying to restrict speech or obtain information, the question of whether to disseminate potentially inflammatory speech often comes down to organizational ethics. This is not as simple as the difference between 'right' and 'wrong', however. For example, YouTube willingly blocked access to the anti-Muslim video in certain countries, where it was most likely to incite violence. YouTube/Google believed its ethical obligation was to try to curb violence in an already inflamed situation.
Twitter, by contrast, is well known for its firm stance against infringements on speech. Earlier this year, the Pakistani government blocked Twitter after the company refused to remove certain content that had been disseminated through its channels. Although its reaction was the opposite of Google's, Twitter also believed it was acting ethically by standing up for free speech. YouTube takes the more utilitarian view of free speech ethics, acting in the interest of the greatest number of people (by attempting to stop violence), whereas Twitter takes the deontological view, seeing a moral duty to uphold the principles of free speech.
Although I am a supporter of free speech, I believe that social media outlets like Twitter, Facebook, and YouTube have an ethical obligation to curb the dissemination of inflammatory material. This is a difficult call for these organizations to make, however. In the United States, groups regularly poke fun at religious symbols and other 'sacred' images or figures. We would probably not think much of a video poking fun at a prophet, certainly not enough to begin massacring people and destroying things. In other countries, however, where free (and offensive) speech is not an everyday reality, this video elicited an unprecedented reaction of violence and hate. Social media outlets, being international entities, must tread carefully when balancing the social and political value of free speech against the equally important values of peace and stability.