Tag Archives: fake news

New UN Initiative seeks “Information Volunteers”

Verified is a United Nations initiative to encourage people to check the validity of news, advice and information before sharing it. Verified is looking for “Information Volunteers” to sign up to receive a daily Verified briefing and then to share the fact-based advice and information with their networks.

You’re engaging right now in the biggest project of social collaboration the world has seen. Bigger than the moon landing, than the Olympics, than the building of the tallest skyscraper or longest bridge. Billions of people are working together – the doctor on the other side of the country. The parent homeschooling their child. The scientist working on the vaccine. The nurse working around the clock. You, reading this. Working towards one common goal: to look after each other.

In this crisis, sharing trusted and verified information will help keep everyone safe, while misinformation can put lives in danger. If you want to make sure the content you’re sharing helps the world, sign up to receive Verified content, and always look out for the Verified tick.

We’re doing this for each other – for everyone on the biggest team the world has ever seen.

The initiative is available in a variety of languages.

This is in addition to the UN’s main virtual volunteering initiative, the UN’s Online Volunteering Service.

Yup, I’ve signed up!

Also see:

You have an obligation to be truthful online

Updated: How Misinformation Can Derail Aid & Relief Efforts

There are many obstacles that can stand in the way of human, community and institutional development, aid and relief efforts, government health initiatives, or even elections. But there is one obstacle that, until recently, rarely got discussed: widespread misunderstanding and myth-spreading.

Folklore, rumors (or rumours) and urban myths / urban legends, as well as organized misinformation campaigns and “fake news”, often interfere with relief and development activities and government initiatives, including public health initiatives – even bringing such to a grinding halt. They also create ongoing misunderstandings among communities and cultures, prevent people from seeking help, encourage people to engage in unhealthy and even dangerous practices, cultivate mistrust of people and institutions, and have even led to mobs of people attacking others for no reason other than something they heard from a friend of a friend of a friend, motivated legislators to introduce laws to address something that doesn’t exist, and influenced elections. And with the advent of social media like Twitter and Facebook, as well as simple text messaging among cell phones, spreading misinformation is easier than ever.

Since 2004, I have been gathering and sharing examples of this phenomenon and compiling recommendations on preventing folklore, rumors and urban myths from interfering with development and aid/relief efforts and government initiatives. I do this research entirely on my own, as a volunteer, with no funding from anyone. I update the information as my free time allows.

Once upon a time, I had wanted it to be the topic of my Master’s Degree thesis, but back in 2004, I couldn’t get an agency to go on record to tell their story. The few representatives of organizations that I talked to didn’t want to give any attention to the misinformation campaigns that were targeting them. With the advent of social media and the proliferation of misinformation, government agencies and nonprofits are scrambling to address rumors before they get out of hand – and before people are killed as a result. For instance, in 2017, in the southern Indian state of Telangana, videos that had been staged or edited to appear to show children being abducted by a criminal gang were circulated in more than 400 villages via WhatsApp and an Indian messaging service called ShareChat. These videos claimed that the children were being abducted in order to harvest their organs. The claims were completely false. But because so many people believed what they saw in these videos, people stopped going out at night, several completely innocent people were attacked by mobs who accused them of being organ thieves, and at least 25 people were murdered – lynched – after being falsely accused of being part of the gang. Here’s more about the consequences of such misinformation campaigns and how the situation in India was addressed.

Also see:

Scammers target those that care about soldiers, world affairs

Aid workers need to help local staff avoid scams 

You have an obligation to be truthful online

My voluntourism-related & ethics-related blogs (and how I define scam)

I’ve been trying to warn about “fake news” since 2004

Since 2004, I have been gathering and sharing both examples of and recommendations for preventing folklore, rumors and urban myths from interfering with development and aid/relief efforts and government initiatives. And for years, I felt like the lone voice in the wilderness on this subject. It was almost my master’s thesis project, but while I could find examples of widespread misunderstanding and misinformation campaigns interfering with relief and development activities and government initiatives, including public health initiatives, I could not get enough people to go on record to talk about these circumstances and how they were being addressed. For a year, I contacted numerous organizations, particularly organizations promoting women’s health and access to abortion, trying to get them to talk about how these misinformation campaigns were affecting them, but if they replied at all to my emails or phone calls, they said they didn’t want to bring more attention to the problem, even if that attention was in an academic paper that people outside the institution might never read.

I went with another subject for my Master’s project, but I had gathered a lot of publicly available information, so I shared it all on my web site, and I have kept it updated over the years as my time has allowed. I have always easily found examples of myths and misinformation creating ongoing misunderstandings among communities and cultures, preventing people from seeking help, encouraging people to engage in unhealthy and even dangerous practices, and cultivating mistrust of people and institutions. I have easily found examples of misinformation leading to mobs of people attacking others for no reason other than something they heard from a friend of a friend of a friend, prompting legislators to introduce laws to address something that doesn’t exist, and influencing elections – long before such finally got noticed because of Brexit and the November 2016 elections in the USA.

In my original web pages, I said that this subject was rarely discussed, and for more than a decade, that was the truth: while I could find all of those examples, it was very difficult to find any online or published resources, outside of academic papers, about how to address or prevent misinformation campaigns designed to interfere with a relief or development effort, public health campaign, etc. Where was the practical information on how to deal with this? It was scarce. For many years, mine was the only web site tracking such.

How did I get interested in this subject? I noticed stories my friends and family told often turned out not to be true, everything from spiders or snake eggs found in a jacket of a friend of a cousin that lives in another state, to why a local store closed, to something they had heard about happening on a TV talk show but hadn’t actually seen themselves. Then, while attending Western Kentucky University for my undergrad degree, I took a very popular class, Urban Folklore 371, where we discussed these stories, how they were spread, how the story changes over time and why such stories are believed. I was hooked on the psychology of rumor-spreading.

When I worked at a United Nations agency from 2001 to 2005, I made a joke to a colleague about the outrageous mythologies about the UN that so many people believed back in the USA – I’m not going to repeat them here, on this blog, but they are easy to find online. She gave me a confused look and said she didn’t know what I was talking about. So I showed her various web sites that promote this misinformation. She stood there, with her mouth open and eyes wide, staring at the outrageous graphics and text. “Is this a joke?” she asked. No, I replied, this is very real. I showed her more. “I can’t believe this!” she said. I explained that we could stand there all day with me showing her these sites, and these were just the ones based in the USA – I had no idea how many there were in other countries, in other languages. And I admit I was starting to get angry, because not only did this seasoned UN staff member not know about this, but no one I worked with at the UN had ever heard of these myth-spreading web sites. Conspiracy theories, pre-social media, were already affecting our work, yet I seemed to be the first person talking about it, at least at my agency.

We have a saying in English: closing the barn door after the horses are already out. It means you are too late in trying to address an issue. Now, all these many years after trying to sound the alarm, I fear that there are entire generations of people who will never be convinced that global climate change is real and devastating to communities, particularly poor communities, or who will never believe that vaccinations do NOT cause autism or infertility, or that condoms can prevent HIV, or who will never accept fluoride in their water because they believe too many outrageous things I can’t even begin to list here, and on and on. I fear these generations are lost forever when it comes to basic scientific literacy. And I fear that if we don’t make a concentrated, sustained effort to educate young people about science and how to evaluate the information they are hearing and reading, more people will die, more communities will be devastated, more lives will be shattered.

Also see:

How to change minds

I’m a part of the March for Science Facebook group, for people who were in the Marches for Science all across the USA in April 2017 or who supported such. A lot of the talk on the group has been about science education and public relations. There are individuals and communities all over the USA – and the world – fighting against science-based decision-making in public policies and against science education in schools, and many on the group feel this is because of poor wording and poor outreach by scientists and those who support science. In my ongoing quest to be a better communicator, I’ve watched these discussions closely.

Recently, someone posted the following regarding how we communicate about science. I think it’s a great testimony regarding what works, and what doesn’t, regarding swaying public opinion, changing people’s minds and fighting misinformation. I’m sharing it here, with her permission, but without her name to protect her identity:

I’m not a scientist. I’m not afraid of science but I also don’t have a strong grasp of most science related jargon. I joined this group along with a few other science groups/pages as I heard more and more of anti-science rhetoric from our govt. Although I don’t understand a lot of scientific things that doesn’t mean I don’t realize the importance of science for our society and for our future.

I have learned SO MUCH from reading posts and comments. The reason I have learned so much? The reason I am no longer “afraid” of GMO’s? The reason I have changed my mind on other popular misconceptions? Because my fear was never the science. My fear was that I didn’t know what information to trust. Money talks. It’s hard to figure out who is paying. Do I trust a science study that was paid for by a big corporation? Do I trust a study that’s published but not peer reviewed? WHO do you trust?

The common thread I’ve found as I read posts and comments in order to learn more is how stupid I am. How dumb was I to not trust GMO’s. People’s comments were blatantly MEAN. And sure, I was completely uneducated about GMO’s. I read the wrong information. I trusted the wrong sources. But again, without hours of research to find out funding sources, etc HOW do I know what to trust?

This question was amazing. I always want to learn more. I want to understand about so many things – to give my kids the best future possible. The best food to eat. The best meds for my asthmatic child. The best environment for them to grow up in, etc. But here’s the thing. If I wasn’t determined to do the best for my kids . . . by the 100th ridiculing comment on a post I found interesting I would have stopped following and learning. Heck by the 20th I would have written off these sciences pages.

Even in this thread there are those using terms like “stupid,” “brainwashing,” etc. Very derogatory terms and grouping all people who don’t have a knack for science into one realm. I have a great head for business, finances and can analyze the heck out of any non-technical literature. I don’t make fun or ridicule those people who don’t have that ability. It accomplishes nothing.

So thank you to those of you who answered this post thoughtfully. I’m certain there are many of you who diligently try over and over again to get your point across. Don’t give up. Changing peoples’ minds is never easy but in this case it’s worth the fight.

—end quoted text—

Also see:

Folklore, Rumors & Misinformation Campaigns Interfering with Humanitarian Efforts & Government Initiatives

UPDATED:

Preventing Folklore, Rumors, Urban Myths & Organized Misinformation Campaigns From Interfering with Development & Aid/Relief Efforts & Government Initiatives

Folklore, rumors and contemporary myths / legends often interfere with development aid activities and government initiatives, including public health programs – even bringing such to a grinding halt. They create ongoing misunderstandings and mistrust, prevent people from seeking help, encourage people to engage in unhealthy and even dangerous practices, and have even led to mobs of people attacking others because of something they heard from a friend of a friend of a friend. With social media like Twitter and Facebook, as well as simple text messaging among cell phones, spreading misinformation is easier than ever.

Added to the mix: fake news sites set up specifically to mislead people, as well as crowdsourced efforts by professional online provocateurs and automated troll bots pumping out thousands of comments. Countering misinformation has to be a priority for aid and development organizations, as well as government agencies.

Since 2004, I have been gathering and sharing both examples of this phenomenon and recommendations on preventing folklore, rumors and urban myths from interfering with development and aid/relief efforts and government initiatives. I’ve recently updated this information with new material on countering organized misinformation campaigns.

Anyone working in development or relief efforts, or working in government organizations, needs to be aware of the power of rumor and myth-sharing, and be prepared to prevent and to counter such. This page is an effort to help those workers:

  • cultivate trust in the community through communications, thereby creating an environment less susceptible to rumor-baiting
  • quickly identify rumors and misinformation campaigns that have the potential to derail humanitarian aid and development efforts
  • quickly respond to rumors and misinformation campaigns that could derail or are interfering with humanitarian aid and development efforts

And, FYI: I do this entirely on my own, as a volunteer, with no funding from anyone. I update the information as my free time allows.

Also see:

fake news, folklore & friendships

It wasn’t getting a journalism degree, or being a journalist, that made me a skeptic when it comes to sensational stories. It was a folklore class. Urban Folklore 371, to be exact. It was a very popular class at Western Kentucky University back in the late 1980s, both for people getting a degree in folklore studies and for people needing humanities courses for whatever their degree program was, like me. Class studies focused on contemporary, largely non-religious legends, customs and beliefs in the USA. One class might focus on watching a film about the games kids play on a playground and how those games explore the things they fear – marriage, childbirth, stranger danger, being ostracized by their peers, etc. Another class might review the different versions of the “vanishing hitchhiker” story, why such stories are so popular in so many different cultures, and how the story changes over time.

At least once in every class, I heard a student say, “That’s not a true story?! I always thought it was!” Because of that class, I realized there were legends being told as truth all around me, by friends, by family, even by newspapers. “I heard it from my cousin” or “My friend saw it in a newspaper” or “My Mom saw it on Oprah” was usually the preface to some outlandish story told as fact. But the class taught me that, in fact, no woman was ever killed by spiders nesting in her elaborate hairdo, that there has never been a killer with a hook for a hand who attacked a couple in a parked car in a nearby town, that no actor has ever had a gerbil removed from his anus, and on and on and on.

I became the “um – that’s not true” girl at various places where I worked. And then via email. And I still am, now on social media. And what I have learned from being little Ms. Debunker is that people REALLY do NOT like these stories debunked. In fact, pointing out the facts that prove these stories aren’t true, no matter how gently I try to do it, often makes people very angry.

Back in the 1990s, a friend sent me yet another forwarded email. This time, the text said the email was from Microsoft founder Bill Gates, that he’d written a program that would trace everyone to whom the email message was sent, and that he was beta testing the program. The email encouraged people to forward the message and said that if it reached 1,000 people, everyone on the list would receive $1,000. Of course, it wasn’t true – I knew it as soon as I saw it. She’d sent me several of these types of emails – one said people who forwarded the message would get a free trip to Disney World, another said we’d all get free computers, and on and on. I had been deleting them, but I was tired of it. So I looked online, found a site that debunked the myth, and sent her the link. I didn’t make any judgment statements; I just said, “This is a myth. Here’s more info. You might want to let everyone you sent it to know, as well as the person you got it from,” or something similar.

She was not happy with me. In fact, it almost ended our friendship. She told me that the Internet was “a place for having fun” and “you can’t win if you don’t play” and what did she have to lose by forwarding the message even if it sounded fishy?

And that kind of reaction kept happening. Three new friends I made back in 2010, after I’d moved back to the USA, all unfriended me on Facebook the same day, outraged that I pointed out that several things they were posting as their status updates – about how Facebook was going to start charging users, about how putting up a disclaimer on your Facebook page would stop the company from being able to sell your information, and on and on – were all urban legends, all untrue. Their reaction was almost verbatim what that friend had said via email: Facebook is “a place for having fun” and “it’s better to be safe and share it” and what did they have to lose by sharing the message even if it sounded fishy? Also, they said they did not have time to “check every single thing online.”

Now, in 2016, I have friends who are furious with me for posting science-based web sites that debunk their posts from quack sites like the “Food Babe” claiming that GMOs cause cancer or that vaccines cause autism (to be clear, these are MYTHS). Two journalists – JOURNALISTS – were mad at me when I pointed out that a status update one had shared – it urged users to use the Facebook check-in function to say they were at Standing Rock in North Dakota, claiming this would somehow prevent the Morton County Sheriff’s Department there from geotargeting DAPL protesters – was promoting false information. I wasn’t just annoyed by the message – I found it imprudent, and yet another example of slackervism or slacktivism: people truly wishing to assist the protesters were checking in on Facebook rather than doing something that would REALLY make a difference, like sending funds to support the protest efforts or writing their Congressional representatives in support of the protesters. It also misdirects people from the nefarious ways law enforcement really does surveil people on social media. I would have thought journalists would know better than to engage in such behavior.

Contemporary legends online cause harm, and this has bothered me since long before the Standing Rock/Facebook check-in myth. Since 2004, I have been gathering and sharing examples of how rumors and urban / contemporary myths often interfere with relief and development activities and government initiatives, including public health initiatives – even bringing such to a grinding halt. These myths create ongoing misunderstandings among communities and cultures, prevent people from seeking help, encourage people to engage in unhealthy and even dangerous practices, cultivate mistrust of people and institutions, and have even led to mobs of people attacking others for no reason other than something they heard from a friend of a friend of a friend. With the advent of social media like Twitter and Facebook, as well as simple text messaging among cell phones, spreading misinformation is easier than ever.

Based on my experience as a researcher and a communications practitioner, and everything I’ve read – and I read a LOT on this subject – rumors that interfere with development and aid/relief efforts and government health initiatives come from:

  • misinterpretations of what a person or community is seeing, hearing or experiencing,
  • previous community experiences or cultural beliefs,
  • willful misrepresentation by people who, for whatever reason, want to derail a development or relief activity,
  • unintentional but inappropriate or hard-to-understand words or actions by a communicator, or
  • the desire of an individual or community to believe an alternative narrative, a desire that is stronger than the facts

That list of bullet points was central to the long list I made of recommendations on preventing folklore, rumors and urban myths from interfering with such initiatives. I made that list to help aid workers, particularly people leading public health initiatives. For years, I’ve updated that list and felt really good about it being comprehensive and realistic, and I’ve employed some of the methods myself in my work.

But are these recommendations enough anymore? I’m not sure, because BuzzFeed reported that fake news stories about the USA Presidential election this year generated more engagement on Facebook than the top election stories from 19 major news outlets COMBINED – outlets such as The New York Times, The Washington Post, CNN and NBC News. And a new study from Stanford researchers evaluated students’ ability to assess information sources and described the results as “dismaying,” “bleak” and a “threat to democracy,” as reported by NPR News. Researchers said students displayed a “stunning and dismaying consistency” in their responses, getting duped again and again. The researchers weren’t looking for high-level analysis of data but just a “reasonable bar” of, for instance, telling fake accounts from real ones, activist groups from neutral sources, and ads from articles. And the students failed. Miserably. And then there’s my own experience seeing the reaction a lot of people have to references to sites like snopes.com or truthorfiction.com or hoax-slayer.com or the Pulitzer Prize-winning site PolitiFact, all of which debunk myths; those people claim that “these sites aren’t true, they’re biased.” And that’s that – just a simple dismissal, so they can continue to cling to falsehoods.

National Public Radio did a story a few days ago about a man in Los Angeles who decided to build fake news sites that publish outrageous, blatantly false stories that extreme far-right groups in the USA (also known as the “alt-right”) would love to believe. He thought that when these stories were picked up by white supremacist web sites and promoted as true, he and others, particularly major media outlets, would be able to point out that the stories were entirely fiction, created only as bait, and that the white supremacists were promoting them as fact. But instead, thousands of people with no formal association with white supremacist groups shared these stories as fact – reaching millions more people. He wrote one fake story for one of his fake sites about customers in Colorado marijuana shops using food stamps to buy pot. Again, this story is NOT TRUE. But it led to a state representative in Colorado proposing actual legislation to prevent people from using their food stamps to buy marijuana: a state legislator was creating legislation and outrage based on something that had never happened.

BTW, to see these fake news sites for yourself, just go to Google and search for “snopes is biased,” and you will get a long list of links to fake news sites, most right-wing, all fighting against fact-based debunking sites like Snopes. I refuse to name those fake news sites because I don’t want them to get any more traffic than they already do.

Competent decision-making depends on people – the decision-makers – having reliable, accurate facts put in a meaningful and appropriate context. Reason – the power of the mind to think, understand and form judgments by a process of logic – relies on being able to evaluate information for credibility and truth. But fact-based decision-making – the idea of being logical and using reason and intellect – has become something to eschew. The modus operandi for many is to go with your gut, not with the facts. Go not for truth, but truthiness.

I always thought that last bullet in my list of why people believe myths, “the desire of an individual or community to believe an alternative narrative, a desire that is stronger than the facts,” was easy to address. Now, given all the aforementioned, I’m not at all sure.

I’m going to keep calling out myths whenever I see them, and if it costs me Facebook friends, so be it. I prefer the truth, even when the truth hurts, even when the truth causes me to have to reconsider an opinion. There is a growing lack of media literacy and science literacy in the USA – and, indeed, the world. And the consequences of this could be catastrophic – if they haven’t been already. People need to be able to not just access information, but also to analyze it and evaluate the source. That’s just not happening. And I’ve no idea how to change things.

Also see:

8:10 am Nov. 28, 2016 Update: Filippo Menczer, Professor of Computer Science and Informatics and Director of the Center for Complex Networks and Systems Research at Indiana University, Bloomington, authored the article Why Fake News Is So Incredibly Effective, published in Time and The Conversation. Excerpts: “Our lab got a personal lesson in this when our own research project became the subject of a vicious misinformation campaign in the run-up to the 2014 U.S. midterm elections. When we investigated what was happening, we found fake news stories about our research being predominantly shared by Twitter users within one partisan echo chamber, a large and homogeneous community of politically active users. These people were quick to retweet and impervious to debunking information.” Also of note: “We developed the BotOrNot tool to detect social bots. It’s not perfect, but accurate enough to uncover persuasion campaigns in the Brexit and antivax movements… our lab is building a platform called Hoaxy to track and visualize the spread of unverified claims and corresponding fact-checking on social media. That will give us real-world data, with which we can inform our simulated social networks. Then we can test possible approaches to fighting fake news.”

1:05 pm Nov. 29, 2016 Updates:

Donald Trump and the Rise of Alt-Reality Media: You think the truth took a hit last year? It’s about to get worse. A lot worse. from Politico.

For Some, Scientists Aren’t The Authority On Science from NPR

Dec. 3, 2016 Updates:

Spread of Fake News Provokes Anxiety in Italy from The New York Times

Dec. 6, 2016 Updates:

A North Carolina man read online that a pizza restaurant in northwest Washington, DC, was harboring young children as sex slaves as part of a child-abuse ring, so he drove six hours from his home to the restaurant, and not long after arriving, he fired shots from an AR-15-style rifle. No one was injured, and he has been arrested, but, as The New York Times notes, “the shooting underscores the stubborn lasting power of fake news and how hard it is to stamp out. Debunking false news articles can sometimes stoke the outrage of the believers, leading fake news purveyors to feed that appetite with more misinformation. Efforts by social media companies to control the spread of these stories are limited, and shutting one online discussion thread down simply pushes the fake news creators to move to another space online. The articles were exposed as false by publications including The New York Times, The Washington Post and the fact-checking website Snopes. But the debunking did not squash the conspiracy theories about the pizzeria — instead, it led to the opposite. ‘The reason why it’s so hard to stop fake news is that the facts don’t change people’s minds,’ said Leslie Harris, a former president of the Center for Democracy & Technology, a nonprofit that promotes free speech and open internet policies.”

Dec. 9, 2016 update

“Fakes, News and the Election: A New Taxonomy for the Study of Misleading Information within the Hybrid Media System”

Giglietto, Fabio; Iannelli, Laura; Rossi, Luca; and Valeriani, Augusto

November 30, 2016. Convegno AssoComPol 2016 (Urbino, 15-17 December 2016), forthcoming. Available at SSRN: https://ssrn.com/abstract=2878774

Abstract:
The widely unexpected outcome of the 2016 US Presidential election prompted a broad debate on the role played by “fake-news” circulating on social media during political campaigns. Despite a relatively vast amount of existing literature on the topic, a general lack of conceptual coherence and a rapidly changing news eco-system hinder the development of effective strategies to tackle the issue. Leveraging on four strands of research in the existing scholarship, the paper introduces a radically new model aimed at describing the process through which misleading information spreads within the hybrid media system in the post-truth era. The application of the model results in four different typologies of propagations. These typologies are used to describe real cases of misleading information from the 2016 US Presidential election. The paper discusses the contribution and implication of the model in tackling the issue of misleading information on a theoretical, empirical, and practical level.

Also see: Feuds in the nonprofit/NGO/charity world

rampant misinformation online re: Mumbai (from the archives)

This blog originally appeared on a different blog host on 28 November 2008. If any URL does not work, type it into archive.org to see if there is an archived version there. For more information about why I am republishing these old blogs, scroll down to the bottom of this blog entry.

I’m intensely interested in how rumors and myths derail humanitarian efforts – or affect our understanding of various events, both current and historical. So yesterday, as I watched CNN reporters trumpet again and again how easy it was for “ordinary people” to find and disseminate information regarding the Mumbai attacks via various Internet tools such as blogs and Twitter, as well as cell phone text messaging, I wondered how long it would be before CNN started reporting unverified items from these Internet sources and ended up repeating things that would turn out not to be true at all.

I think it took approximately 15 minutes after that thought before a reporter started retracting some of the things being reported online that CNN had repeated. Suddenly, cyberspace wasn’t such a great example of “citizen journalism” after all.

In CNN’s own story about this online phenomenon today, they admit that a vast number of the posts on Twitter amounted to unsubstantiated rumors and wild inaccuracies. As blogger Tim Mallon put it, “far from being a crowd-sourced version of the news it (Twitter) was actually an incoherent, rumour-fueled mob operating in a mad echo chamber of tweets, re-tweets and re-re-tweets… During the hour or so I followed on Twitter there were wildly differing estimates of the numbers killed and injured – ranging up to 1,000.”

Amy Gahran has posted Responsible Tweeting: Mumbai Provides Teachable Moment, which includes four excellent tips for people who want to micro-blog the news as it happens. It emphasizes checking sources and correcting information that you have found to be incorrect, and cautions journalists to remember that not everything you read on the Internet or your cell phone is true (how sad that they even have to be reminded…)

Sometimes misinformation is as bad as, or even worse than, no information at all. As with any communications tool, when it comes to instant networking tools like blogs, Twitter and cell phones, use with caution. And TV journalists – please re-read your Journalism 101 textbooks.

Why I’ve republished this old blog:

I have long been passionate about debunking urban legends, and I’m very concerned about how easy online and phone-based tools, from email to Twitter, make it to promote rumors and myths. Five to 10 years ago, I was blogging on this subject regularly. The web host where I published those blogs is long gone, and I’m now trying to find my many blogs on the subject of how folklore, rumors (or rumours) and urban myths interfere with development work, aid/relief efforts and community health initiatives, so I can republish them here. I’ll be publishing one or two of these every Saturday until they are all back online.

 

Social media: cutting both ways since the 1990s

Social media — those avenues to send instant, short, widely-distributed messages and images — cuts both ways:

  • It can be used to organize protesters, but it can also be used to identify protesters and arrest them.
  • It can be used to spread information, but it can also be used to spread MISinformation.
  • You can use it to promote your organization and cause, and others can use it to tear down your organization.

And it’s been used to organize protests since the 1990s – so can we stop now with how “new” it all is?

Back in 2001, while working for UNDP/UNV, I researched how handheld computer technologies were being used, or could be used, in community service / volunteering / advocacy. It wasn’t called “social media” or “micro volunteering” back then, but even without the snazzy jargon, I knew something very exciting was going on, something that was changing the way communities are engaged and mobilized. Among the discoveries in my research: grassroots advocates had used handheld computer or phone devices to help organize and direct protesters during the 1999 Seattle demonstrations against the World Trade Organization, and in 2001, protesters in the Philippines used cell-phone text messaging to mobilize demonstrators to help oust President Joseph Estrada. Also in 2001, in China, tens of thousands of followers of the spiritual group Falun Gong continued to exist – despite a harsh crackdown – in a vibrant community fed by the Web and encrypted text messaging. I created a web page just on the subject of using text messaging for advocacy – but I was not the first to do so, as you will see on the page.

I also noted on that page that handheld technology can lead to widespread misinformation as well: “Musician and U.S.A. Green Party activist Jello Biafra noted in an article on ZDNet UK: ‘Be careful of the information gossip you get on the Internet, too. For example, late in 1997 I discovered out on the Internet that I was dead.'”

We’re not hearing enough about how effective Web 2.0 tools are in promoting misinformation and negative speech. For instance, micro-blogs, tweets, texts and other technology spread misinformation about and within Haiti, as well as other disaster zones (it will be interesting to see what misinformation gets spread in Japan). During the swine flu panic in the USA a while back, we saw Twitter’s power to misinform, and rumors still affect polio eradication campaigns. So-called “new” media has helped spread misinformation to derail government health initiatives here in the USA rapidly and efficiently.

It’s not just the misinformation that’s a problem in trying to use social media to mobilize community activists and educate the public: in an interview with Radio Free Europe, Evgeny Morozov, author of The Net Delusion: The Dark Side of Internet Freedom, noted that internal security agencies welcome the use of new- and social-media tools. “The reason why the KGB wants you to join Facebook is because it allows them to learn more about you from afar,” he said. “It allows them to identify certain social graphs and social connections between activists. Many of these relationships are now self-disclosed by activists by joining various groups.” Al Jazeera profiled cases in Azerbaijan, Tunisia and Morocco where the government, or those opposed to any change in government, were indeed using Facebook accounts to anticipate protests and easily monitor and arrest protesters.

And then there’s social media, like YouTube and blogs, being used by GOTCHA media advocates, as I blogged about yesterday: there could be just one person in your community with a video camera and a dream of humiliating your organization right out of existence, and social media makes that easier than ever to do.

Don’t roll out the comments saying I’m anti-social media. Don’t start pulling your hair and gnashing your teeth, chanting, “Jayne hates Web 2.0!” I love the Interwebs. But a reality check on all these “Twitter revolutions” is long overdue. Yes, there are lessons to be learned – but we’re not focusing on the right lessons. Back in 2001, the Ruckus Society featured Longwire’s Communications Manual for Activists on its web site, which included tips for using various handheld devices and avenues – two-way radios, CB radios, cell phones, pagers, satellite communications and more – in community organizing. Those lessons from a decade ago could teach current activists a lot about using social media tools effectively.