Tag Archives: rumors

How to change minds

I’m a part of the March for Science Facebook group, for people who marched in the Marches for Science all across the USA in April 2017 or who supported them. A lot of the talk on the group has been about science education and public relations. There are individuals and communities all over the USA – and the world – fighting against science-based decision-making in public policies and against science education in schools, and many on the group feel this is because of poor wording and poor outreach by scientists and those who support science. In my ongoing quest to be a better communicator, I’ve watched these discussions closely.

Recently, someone posted the following regarding how we communicate about science. I think it’s a great testimony about what works, and what doesn’t, in swaying public opinion, changing people’s minds and fighting misinformation. I’m sharing it here, with her permission, but without her name to protect her identity:

I’m not a scientist. I’m not afraid of science but I also don’t have a strong grasp of most science-related jargon. I joined this group along with a few other science groups/pages as I heard more and more anti-science rhetoric from our govt. Although I don’t understand a lot of scientific things, that doesn’t mean I don’t realize the importance of science for our society and for our future.

I have learned SO MUCH from reading posts and comments. The reason I have learned so much? The reason I am no longer “afraid” of GMO’s? The reason I have changed my mind on other popular misconceptions? Because my fear was never the science. My fear was that I didn’t know what information to trust. Money talks. It’s hard to figure out who is paying. Do I trust a science study that was paid for by a big corporation? Do I trust a study that’s published but not peer reviewed? WHO do you trust?

The common thread I’ve found as I read posts and comments in order to learn more is how stupid I am. How dumb was I to not trust GMO’s. People’s comments were blatantly MEAN. And sure, I was completely uneducated about GMO’s. I read the wrong information. I trusted the wrong sources. But again, without hours of research to find out funding sources, etc HOW do I know what to trust?

This question was amazing. I always want to learn more. I want to understand about so many things – to give my kids the best future possible. The best food to eat. The best meds for my asthmatic child. The best environment for them to grow up in, etc. But here’s the thing. If I wasn’t determined to do the best for my kids . . . by the 100th ridiculing comment on a post I found interesting I would have stopped following and learning. Heck by the 20th I would have written off these sciences pages.

Even in this thread there are those using terms like “stupid,” “brainwashing,” etc. Very derogatory terms and grouping all people who don’t have a knack for science into one realm. I have a great head for business, finances and can analyze the heck out of any non-technical literature. I don’t make fun of or ridicule those people who don’t have that ability. It accomplishes nothing.

So thank you to those of you who answered this post thoughtfully. I’m certain there are many of you who diligently try over and over again to get your point across. Don’t give up. Changing people’s minds is never easy but in this case it’s worth the fight.

—end quoted text—

Also see:

Folklore, Rumors & Misinformation Campaigns Interfering with Humanitarian Efforts & Government Initiatives

UPDATED:

Preventing Folklore, Rumors, Urban Myths & Organized Misinformation Campaigns From Interfering with Development & Aid/Relief Efforts & Government Initiatives

Folklore, rumors and contemporary myths / legends often interfere with development aid activities and government initiatives, including public health programs – even bringing such to a grinding halt. They create ongoing misunderstandings and mistrust, prevent people from seeking help, encourage people to engage in unhealthy and even dangerous practices, and have even led to mobs of people attacking someone because of something they heard from a friend of a friend of a friend. With social media like Twitter and Facebook, as well as simple text messaging on cell phones, spreading misinformation is easier than ever.

Add to the mix fake news sites set up specifically to mislead people, as well as crowdsourced efforts by professional online provocateurs and automated troll bots pumping out thousands of comments, and countering misinformation has to be a priority for aid and development organizations, as well as government agencies.

Since 2004, I have been gathering and sharing examples of this phenomenon, along with recommendations on preventing folklore, rumors and urban myths from interfering with development and aid/relief efforts and government initiatives. I’ve recently updated the resource with new material on countering organized misinformation campaigns.

Anyone working in development or relief efforts, or in government organizations, needs to be aware of the power of rumor and myth-sharing, and be prepared to prevent and counter them. This page is an effort to help those workers:

  • cultivate trust in the community through communications, thereby creating an environment less susceptible to rumor-baiting
  • quickly identify rumors and misinformation campaigns that have the potential to derail humanitarian aid and development efforts
  • quickly respond to rumors and misinformation campaigns that could derail or are interfering with humanitarian aid and development efforts

And, FYI: I do this entirely on my own, as a volunteer, with no funding from anyone. I update the information as my free time allows.

Also see:

fake news, folklore & friendships

It wasn’t getting a journalism degree, or being a journalist, that made me a skeptic when it comes to sensational stories. It was a folklore class. Urban Folklore 371, to be exact. It was a very popular class at Western Kentucky University back in the late 1980s, both for people getting a degree in folklore studies and for people needing humanities courses for whatever their degree program was, like me. Class studies focused on contemporary, largely non-religious-based legends, customs and beliefs in the USA. One class might focus on watching a film about the games kids play on a playground and how those games explore the things they fear – marriage, childbirth, stranger danger, being ostracized by their peers, etc. Another class might review the different versions of the “vanishing hitchhiker” story and why such stories are so popular in so many different cultures, and how the story changes over time.

In nearly every class, I heard at least one student say, “That’s not a true story?! I always thought it was!” Because of that class, I realized there were legends being told as truth all around me, by friends, by family, even by newspapers. “I heard it from my cousin” or “My friend saw it in a newspaper” or “My Mom saw it on Oprah” was usually the preface to some outlandish story told as fact. But the class taught me that, in fact, no woman was ever killed by spiders nesting in her elaborate hairdo, that there has never been a killer with a hook for a hand who attacked a couple in a parked car in a nearby town, that no actor has ever had a gerbil removed from his anus, and on and on and on.

I became the “um – that’s not true” girl at various places where I worked. And then via email. And I still am, now on social media. And what I have learned from being little Ms. Debunker is that people REALLY do NOT like these stories debunked. In fact, pointing out the facts that prove these stories aren’t true, no matter how gently I try to do it, often makes people very angry.

Back in the 1990s, a friend sent me yet another forwarded email. This time, the text said the email was from Microsoft founder Bill Gates, that he’d written a program that would trace everyone to whom the email message was sent, and that he was beta testing the program. The email encouraged people to forward the message and said that if it reached 1,000 people, everyone on the list would receive $1,000. Of course, it wasn’t true – I knew it as soon as I saw it. She’d sent me several of these types of emails – one that said people who forwarded the message would get a free trip to Disney World, another said we’d all get free computers, and on and on. I had been deleting them, but I was tired of it. So I looked online, found a site that debunked the myth, and sent her the link. I didn’t make any judgment statements; I just said, “This is a myth. Here’s more info. You might want to let everyone know you sent it to, as well as the person you got it from,” or something similar.

She was not happy with me. In fact, it almost ended our friendship. She told me that the Internet was “a place for having fun” and “you can’t win if you don’t play” and what did she have to lose by forwarding the message even if it sounded fishy?

And that kind of reaction kept happening. Three new friends I made back in 2010, after I’d moved back to the USA, all unfriended me on Facebook the same day, outraged that I pointed out that several things they were posting as their status updates – about how Facebook was going to start charging users, about how putting up a disclaimer on your Facebook page would stop the company from being able to sell your information, and on and on – were all urban legends, all untrue. Their reaction was almost verbatim what that friend had said via email: Facebook is “a place for having fun” and “it’s better to be safe and share it” and what did they have to lose by sharing the message even if it sounded fishy? Also, they said they did not have time to “check every single thing online.”

Now, in 2016, I have friends who are furious with me for posting science-based web sites that debunk their posts from quack sites like the “Food Babe” claiming that GMOs cause cancer or that vaccines cause autism (to be clear, these are MYTHS). Two journalists – JOURNALISTS – were mad at me when I pointed out that a status update one had shared – it urged users to use the Facebook check-in function to say they were at Standing Rock in North Dakota, claiming this would somehow prevent the Morton County Sheriff’s Department there from geotargeting DAPL protesters – was promoting false information. I wasn’t just annoyed by the message – I found it imprudent, and yet another example of slacktivism: people truly wishing to assist the protesters were checking in on Facebook rather than doing something that would REALLY make a difference, like sending funds to support the protest efforts or writing their Congressional representatives in support of the protesters. It also misdirects people from the nefarious ways law enforcement really does surveil people on social media. I would have thought journalists would know better than to engage in such behavior.

Contemporary legends online cause harm, and this bothered me long before the Standing Rock/Facebook check-in myth. Since 2004, I have been gathering and sharing examples of how rumors and urban / contemporary myths often interfere with relief and development activities, and government initiatives, including public health initiatives — even bringing such to a grinding halt. These myths create ongoing misunderstandings among communities and cultures, prevent people from seeking help, encourage people to engage in unhealthy and even dangerous practices, cultivate mistrust of people and institutions, and have even led to mobs of people attacking someone for no reason other than something they heard from a friend of a friend of a friend. With the advent of social media like Twitter and Facebook, as well as simple text messaging on cell phones, spreading misinformation is easier than ever.

Based on my experience as a researcher and a communications practitioner, and everything I’ve read – and I read a LOT on this subject – rumors that interfere with development and aid/relief efforts and government health initiatives come from:

  • misinterpretations of what a person or community is seeing, hearing or experiencing,
  • previous community experiences or cultural beliefs,
  • willful misrepresentation by people who, for whatever reason, want to derail a development or relief activity,
  • unintentional but inappropriate or hard-to-understand words or actions by a communicator, or
  • the desire of an individual or community to believe an alternative narrative, a desire that is stronger than the facts

That list of bullet points was central to the long list I made of recommendations on preventing folklore, rumors and urban myths from interfering with such initiatives. I made that list to help aid workers, particularly people leading public health initiatives. For years, I’ve updated that list and felt really good about it being comprehensive and realistic, and I’ve employed some of the methods myself in my work.

But are these recommendations enough anymore? I’m not sure. BuzzFeed reported that fake news stories about the USA Presidential election this year generated more engagement on Facebook than the top election stories from 19 major news outlets COMBINED – outlets including The New York Times, The Washington Post, CNN and NBC News. And a new study from Stanford researchers evaluated students’ ability to assess information sources, and described the results as “dismaying,” “bleak” and a “threat to democracy,” as reported by NPR News. Researchers said students displayed a “stunning and dismaying consistency” in their responses, getting duped again and again. The researchers weren’t looking for high-level analysis of data but just a “reasonable bar” of, for instance, telling fake accounts from real ones, activist groups from neutral sources and ads from articles. And the students failed. Miserably. And then there’s my own experience seeing the reaction a lot of people have to references to sites like snopes.com or truthorfiction.com or hoax-slayer.com or the Pulitzer Prize-winning site PolitiFact that debunk myths; those people claim that “These sites aren’t true. They’re biased.” And that’s that – just a simple dismissal, so they can continue to cling to falsehoods.

National Public Radio did a story a few days ago about a man in Los Angeles who decided to build fake news sites that publish outrageous, blatantly false stories that extreme far-right groups in the USA (also known as the “alt-right”) would love to believe; he thought that when these stories were picked up by white supremacist web sites and promoted as true, he and others, particularly major media outlets, would be able to point out that the stories were entirely fiction, created only as bait, and that the white supremacists were promoting them as fact. But instead, thousands of people with no formal association with white supremacist groups shared these stories as fact – reaching millions more people. He wrote one fake story for one of his fake sites on how customers in Colorado marijuana shops were using food stamps to buy pot. Again, this story is NOT TRUE. But it led to a state representative in Colorado proposing actual legislation to prevent people from using their food stamps to buy marijuana; a state legislator was creating legislation and outrage based on something that had never happened.

BTW, to see these fake news sites for yourself, just go to Google and search for snopes is biased, and you will get a long list of links to fake news sites, most right-wing, all attacking fact-based debunking sites like Snopes. I refuse to name those fake news sites because I don’t want them to get any more traffic than they already do.

Competent decision-making depends on people – the decision-makers – having reliable, accurate facts put in a meaningful and appropriate context. Reason – the power of the mind to think, understand and form judgments by a process of logic – relies on being able to evaluate information for credibility and truth. But fact-based decision-making, the idea of being logical and using reason and intellect, has become something to eschew. The modus operandi for many is to go with your gut, not with the facts. Go not for truth, but truthiness.

I always thought that last bullet in my list of why people believe myths, “the desire of an individual or community to believe an alternative narrative, a desire that is stronger than the facts,” was easy to address. Now, given all the aforementioned, I’m not at all sure.

I’m going to keep calling out myths whenever I see them, and if it costs me Facebook friends, so be it. I prefer the truth, even when the truth hurts, even when the truth causes me to have to reconsider an opinion. There is a growing lack of media literacy and science literacy in the USA – and, indeed, the world. And the consequences of this could be catastrophic – if they haven’t been already. People need to be able to not just access information, but also to analyze it and evaluate the source. That’s just not happening. And I’ve no idea how to change things.

Also see:

8:10 am Nov. 28, 2016 Update: Filippo Menczer, Professor of Computer Science and Informatics and Director of the Center for Complex Networks and Systems Research at Indiana University, Bloomington, authored the article Why Fake News Is So Incredibly Effective, published in Time and The Conversation. Excerpts: “Our lab got a personal lesson in this when our own research project became the subject of a vicious misinformation campaign in the run-up to the 2014 U.S. midterm elections. When we investigated what was happening, we found fake news stories about our research being predominantly shared by Twitter users within one partisan echo chamber, a large and homogeneous community of politically active users. These people were quick to retweet and impervious to debunking information.” Also of note: “We developed the BotOrNot tool to detect social bots. It’s not perfect, but accurate enough to uncover persuasion campaigns in the Brexit and antivax movements… our lab is building a platform called Hoaxy to track and visualize the spread of unverified claims and corresponding fact-checking on social media. That will give us real-world data, with which we can inform our simulated social networks. Then we can test possible approaches to fighting fake news.”

1:05 pm Nov. 29, 2016 Updates:

Donald Trump and the Rise of Alt-Reality Media: You think the truth took a hit last year? It’s about to get worse. A lot worse. from Politico.

For Some, Scientists Aren’t The Authority On Science from NPR

Dec. 3, 2016 Updates:

Spread of Fake News Provokes Anxiety in Italy from The New York Times

Dec. 6, 2016 Updates:

A North Carolina man read online that a pizza restaurant in northwest Washington, DC, was harboring young children as sex slaves as part of a child-abuse ring, so he drove six hours from his home to the restaurant, and not long after arriving, he fired shots from an AR-15-style rifle. No one was injured, and he’s been arrested, but, as The New York Times notes, “the shooting underscores the stubborn lasting power of fake news and how hard it is to stamp out. Debunking false news articles can sometimes stoke the outrage of the believers, leading fake news purveyors to feed that appetite with more misinformation. Efforts by social media companies to control the spread of these stories are limited, and shutting one online discussion thread down simply pushes the fake news creators to move to another space online. The articles were exposed as false by publications including The New York Times, The Washington Post and the fact-checking website Snopes. But the debunking did not squash the conspiracy theories about the pizzeria — instead, it led to the opposite. ‘The reason why it’s so hard to stop fake news is that the facts don’t change people’s minds,’ said Leslie Harris, a former president of the Center for Democracy & Technology, a nonprofit that promotes free speech and open internet policies.”

Dec. 9, 2016 update

“Fakes, News and the Election: A New Taxonomy for the Study of Misleading Information within the Hybrid Media System”

Giglietto, Fabio and Iannelli, Laura and Rossi, Luca and Valeriani, Augusto

November 30, 2016. Convegno AssoComPol 2016 (Urbino, 15-17 Dicembre 2016), Forthcoming. Available at SSRN: https://ssrn.com/abstract=2878774

Abstract:
The widely unexpected outcome of the 2016 US Presidential election prompted a broad debate on the role played by “fake-news” circulating on social media during political campaigns. Despite a relatively vast amount of existing literature on the topic, a general lack of conceptual coherence and a rapidly changing news eco-system hinder the development of effective strategies to tackle the issue. Leveraging on four strands of research in the existing scholarship, the paper introduces a radically new model aimed at describing the process through which misleading information spreads within the hybrid media system in the post-truth era. The application of the model results in four different typologies of propagations. These typologies are used to describe real cases of misleading information from the 2016 US Presidential election. The paper discusses the contribution and implication of the model in tackling the issue of misleading information on a theoretical, empirical, and practical level.

Also see: Feuds in the nonprofit/NGO/charity world

Ukrainian journalism student project: Stopfake.org

For more than a decade, I’ve been informally studying how folklore, rumors & urban myths interfere with development/aid/relief efforts and government initiatives, & how these are overcome. I’m so fascinated with the subject that it was almost my Master’s thesis topic once upon a time – but I couldn’t find enough people to go on the record in interviews.

I have longed for myth-busting sites like snopes.com or the Straight Dope column by Cecil Adams in the USA to be replicated for developing and transitional countries (as well as home-grown versions of the show “MythBusters”), in local languages. I dream of winning the lottery just so I can fund such initiatives in various countries.

Imagine my thrill to discover this week that there IS such a thing in Ukraine! Fact-checking website Stopfake.org was launched on March 2, 2014 by alumni and students of Mohyla School of Journalism and of the Digital Future of Journalism professional program. “The main purpose of this community is to check facts, verify information, and refute distorted information and propaganda about events in Ukraine covered in the media,” according to the web site. The site is in both English and Russian.

It’s an all-volunteer site (and that includes ONLINE volunteers / virtual volunteering), verifying information, finding and translating and researching stories, etc. Though the site is meant to fact-check anti-Ukrainian bias in media, there are some articles that debunk pro-Ukrainian stories as well. What I particularly love is the article How to Identify a Fake.

I hope that, once the conflict between Ukraine and Russia has become non-violent and not quite so threatening and vitriolic, the focus of the Stopfake.org site can move to more everyday myths that float around Ukraine – about HIV/AIDS, or about Islamic or Jewish religions/cultures, for instance. Such myths can have serious, even deadly, consequences.

I also hope the site will start being updated again in English soon – as of the time of this blog’s writing, it hasn’t been updated in English since the end of August. I wonder if this program would qualify to use the UN’s Online Volunteering service to find online volunteers to translate articles from Russian to English. I certainly consider debunking rumors an essential part of development and aid work.

If you know of a similar myth-debunking site in countries other than the USA, please note such in the comments section on my blog.

And for a good source of information about the conflict in Ukraine, from a variety of sources (news, NGOs, UN agencies, etc.), my go-to site is ReliefWeb’s Ukraine site.

You have an obligation to be truthful online

Because of the Internet and text messaging, it has never been easier to share information – or misinformation.

Also because of the Internet and text messaging, we’ve all become mass communicators. This isn’t the same as passing around a Christmas letter to the family, sending cards to friends or showing a video of the company picnic at a gathering of co-workers. Posting a blog is publishing. Posting a Facebook status update is publishing. Posting a video on YouTube is broadcasting. Yes, it is. You may have set your privacy settings so that only your friends can see what you have published or broadcast, but they have the ability to cut and paste your ideas into their own publications or broadcasts.

And because of all of the aforementioned, you have an obligation in all of your publishing and broadcasting to be truthful – and that includes what you forward. I’m not talking about jokes or satire. I’m talking about “Here’s an article from The New York Times” that you are sharing because you saw it on someone else’s page – did you make sure it really is from The New York Times? Did you take 15 seconds or less to cut one sentence from the article, paste it into Google or Bing, and see what comes up – a NYT link or a Snopes article debunking the story? (I timed it – it really does take just 15 seconds or less.)

You don’t have to be a journalist to have ethics. And you still get to post all sorts of opinions and thoughts and dreams and hopes and fears and jokes and pretty pictures wherever you like, however you like, to whomever you like. But take just 15 seconds or less before you post that amazing story about a boy with cancer or a heroic dog or some outrageous action or comment by someone you don’t like, to make sure it’s true.
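That 15-second check can even be scripted. Here is a minimal sketch in Python showing one way to build the exact-phrase search described above; the function name `verification_search_url` is my own illustrative choice, not an established tool, and the search-engine URLs are the standard public query endpoints:

```python
from urllib.parse import quote_plus

def verification_search_url(sentence: str, engine: str = "google") -> str:
    """Build a search-engine URL for an exact-phrase lookup of one
    sentence from a story -- the quick verification trick described
    above: paste a sentence into a search engine and see whether the
    claimed source, or a debunking site, comes up."""
    bases = {
        "google": "https://www.google.com/search?q=",
        "bing": "https://www.bing.com/search?q=",
    }
    # Wrapping the sentence in double quotes asks the engine for an
    # exact-phrase match; quote_plus handles URL encoding.
    query = quote_plus(f'"{sentence.strip()}"')
    return bases[engine] + query

url = verification_search_url("A boy with cancer needs your shares")
print(url)
# → https://www.google.com/search?q=%22A+boy+with+cancer+needs+your+shares%22
```

Opening that URL in a browser takes seconds, and the first page of results usually makes clear whether the story is real or a known hoax.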

What are the consequences of NOT being a responsible citizen of cyberspace? These:

  • You cast doubt on everything you say, once people start to figure out they can’t trust something you post online.
  • You can be seen as careless, once people start to realize you didn’t verify an article before you posted it, an article they initially believed.
  • It’s disrespectful to your network – shouldn’t friends, family and colleagues expect you to respect them enough to verify the information you share with them?
  • You cast doubt on news that IS true. What if there really is a kid with cancer who needs donations, but people don’t believe it because they know that a story you posted about a kid with cancer wasn’t true?

Do you really want to be associated with untrustworthiness and carelessness? Don’t your friends and family deserve more?

What to do when you find out something you posted is not true? Take it down and replace it with correct information, along with an apology.

I’ve posted information a few times that I thought was true and that turned out not to be. As a trained journalist, I was mortified by my carelessness. I try to use each of those experiences to be a more responsible publisher and broadcaster. Because that’s what my friends, family and colleagues deserve from me.

Related subjects:

Folklore / text messaging interfering with development, aid/relief & public health initiatives

Rampant misinformation online re: Mumbai (from the archives)

Myths aren’t just annoying – they promote hatred

Citizen journalism/crowd-sourcing gone wrong?

Social media: cutting both ways since the 1990s

folklore / text messaging interfering with development, aid/relief & public health initiatives

I’ve recently updated a resource I started back in 2004:

Folklore, Rumors (or Rumours) & Urban Myths Interfering with Development & Aid/Relief Efforts, & Government Initiatives (& how these are overcome)

Folklore, rumors (or rumours) and urban myths / urban legends often interfere with relief and development activities, and government initiatives, including public health initiatives — even bringing such to a grinding halt. They create ongoing misunderstandings among communities and cultures, prevent people from seeking help, encourage people to engage in unhealthy and even dangerous practices, cultivate mistrust of people and institutions, and have even led to mobs of people attacking someone for no reason other than something they heard from a friend of a friend of a friend. And now, with cell phone text messaging and social media, it’s easier than ever to spread such misinformation.

This resource has examples of this happening – in developing countries but also in the USA – and how to prevent or address such. And I’ve had the opportunity to update it recently. It’s a project that I would love to devote more time to, but I really need that time funded.

rampant misinformation online re: Mumbai (from the archives)

This blog originally appeared on a different blog host on 28 November 2008. If any URL does not work, type it into archive.org to see if there is an archived version there. For more information about why I am republishing these old blogs, scroll down to the bottom of this blog entry.

I’m intensely interested in how rumors and myth derail humanitarian efforts — or affect our understanding of various events, both current and historical. So yesterday, as I watched CNN reporters trumpet again and again how easy it was for “ordinary people” to find and disseminate information regarding the Mumbai attacks via various Internet tools such as blogs and Twitter, as well as cell phone text messaging, I wondered how long it would be before CNN started reporting unverified items from these Internet sources and ended up repeating things that would turn out not to be true at all.

I think it took approximately 15 minutes after that thought before a reporter started retracting some of the things being reported online that CNN had repeated. Suddenly, cyberspace wasn’t such a great example of “citizen journalism” after all.

In CNN’s own story about this online phenomenon today, they admit that a vast number of the posts on Twitter amounted to unsubstantiated rumors and wild inaccuracies. As blogger Tim Mallon put it, “far from being a crowd-sourced version of the news it (Twitter) was actually an incoherent, rumour-fueled mob operating in a mad echo chamber of tweets, re-tweets and re-re-tweets… During the hour or so I followed on Twitter there were wildly differing estimates of the numbers killed and injured – ranging up to 1,000.”

Amy Gahran has posted Responsible Tweeting: Mumbai Provides Teachable Moment, which includes four excellent tips for people who want to micro-blog the news as it happens. It emphasizes checking sources and correcting information that you have found out is incorrect, and cautions journalists to remember that everything you read on the Internet or your cell phone isn’t necessarily true (how sad that they even have to be reminded…).

Sometimes misinformation is as bad as, or even worse than, no information at all. As with any communications tool, when it comes to instant networking tools like blogs, Twitter and cell phones, use with caution. And TV journalists — please re-read your journalism 101 textbooks.

Why I’ve republished this old blog:

I have long been passionate about debunking urban legends, and I’m very concerned at how easy online and phone-based tools, from email to Twitter, are making it to promote rumors and myths. Five to 10 years ago, I was blogging on this subject regularly. The web host where I published these blogs is long gone, and I’m now trying to find my many blogs on the subject of how folklore, rumors (or rumours) and urban myths interfere with development work, aid/relief efforts and community health initiatives, so I can republish them here. I’ll be publishing one or two of these every Saturday until they are all back online.


Citizen journalism/crowd-sourcing gone wrong?

They are well-meaning people who have not considered the moral weight of what they’re doing.* This is vigilantism, and it’s only the illusion that what we do online is not as significant as what we do offline that allows this to go on. Imagine if people were standing around in Boston pointing fingers at people in photographs and (roughly) accusing them of terrorism…

Investigating these bombings is just not a job for “the crowd,” even if technology makes such collaboration possible. Even if we were to admit that Reddit was “more efficient” in processing the influx of media around the bombing, which would be a completely baseless speculation/stretch/defense, it still wouldn’t make sense to create a lawless space in which self-appointed citizens decide which other citizens have committed crimes. This would be at the top of any BuzzFeed list of the tried-and-true lessons of modern civilization. We have a legal system for a reason.

from “Hey Reddit, Enough Boston Bombing Vigilantism” by Alexis Madrigal, senior editor at The Atlantic, where he oversees the Technology channel.

On a related note, I’m trying to figure out how to incorporate this and related items into my page on folklore, rumors and urban myths interfering with development and aid/relief efforts, and government initiatives.

Social media is such a great thing… until it’s not.

Social media: cutting both ways since the 1990s

Social media — those avenues to send instant, short, widely-distributed messages and images — cuts both ways:

  • It can be used to organize protesters, but it can also be used to identify and arrest protesters.
  • It can be used to spread information, but it can also be used to spread MISinformation.
  • It can be used to promote your organization and cause, but it can also be used to tear down your organization.

And it’s been used to organize protests since the 1990s – so can we stop now with how “new” it all is?

Back in 2001, while working for UNDP/UNV, I researched how handheld computer technologies were being used, or could be used, in community service / volunteering / advocacy. It wasn’t called “social media” or “micro volunteering” back then, but even without the snazzy jargon, I knew something very exciting was going on, something that was changing the way communities were engaged and mobilized. Among the discoveries in my research: grassroots advocates had used handheld computer or phone devices to help organize and direct protesters during the 1999 Seattle demonstrations against the World Trade Organization, and in 2001, protesters in the Philippines used cell-phone text messaging to mobilize demonstrators to help oust President Joseph Estrada. In addition, in China, also in 2001, tens of thousands of followers of the spiritual group Falun Gong continued to exist, despite a harsh crackdown, in a vibrant community fed by the Web and encrypted text messaging. I created a web page just on the subject of using text messaging for advocacy, but I was not the first to do so, as you will see on the page.

I also noted on that page that handheld technology can lead to widespread misinformation as well: “Musician and U.S.A. Green Party activist Jello Biafra noted in an article on Zdnet.Uk: ‘Be careful of the information gossip you get on the Internet, too. For example, late in 1997 I discovered out on the Internet that I was dead.'”

We’re not hearing enough about how effective Web 2.0 tools are at promoting misinformation and negative speech. For instance, micro-blogs, tweets, texts and other technology spread misinformation about and within Haiti, as well as other disaster zones (it will be interesting to see what misinformation gets spread in Japan). During the swine flu panic in the USA a while back, we saw Twitter’s power to misinform, and rumors still affect polio eradication campaigns. So-called “new” media has helped spread misinformation rapidly and efficiently, derailing government health initiatives here in the USA.

It’s not just the misinformation that’s a problem in trying to use social media to mobilize community activists and educate the public: in an interview with Radio Free Europe, Evgeny Morozov, author of The Net Delusion: The Dark Side of Internet Freedom, noted that internal security agencies welcome the use of new- and social-media tools. “The reason why the KGB wants you to join Facebook is because it allows them to learn more about you from afar,” he said. “It allows them to identify certain social graphs and social connections between activists. Many of these relationships are now self-disclosed by activists by joining various groups.” Al Jazeera profiled cases in Azerbaijan, Tunisia and Morocco where the government, or those opposed to any change in government, were indeed using Facebook accounts to anticipate protests and easily monitor and arrest protesters.

And then there’s social media, like YouTube and blogs, being used by GOTCHA media advocates, as I blogged about yesterday: there could be just one person in your community with a video camera and a dream of humiliating your organization right out of existence, and social media makes that easier than ever to do.

Don’t flood the comments saying I’m anti-social media. Don’t start pulling your hair and gnashing your teeth, chanting, “Jayne hates Web 2.0!” I love the Interwebs. But a reality check on all these “Twitter revolutions” is long overdue. Yes, there are lessons to be learned, but we’re not focusing on the right ones. Back in 2001, the Ruckus Society featured Longwire’s Communications Manual for Activists on its web site, including tips for using various handheld devices and channels in community organizing: two-way radios, CB radios, cell phones, pagers, satellite communications and more. Those lessons from a decade ago could teach current activists a lot about using social media tools effectively.