Tag Archives: evaluation

A dare for nonprofit executive directors


Do you head a nonprofit or non-governmental organization (NGO)? I have a challenge for you. It’s a simple challenge, but a revealing one, and I’m daring you to do it:

Make this list entirely on your own, without consulting anyone else: the person at your organization you believe is supposed to be primarily responsible for each of these tasks:

  • responding to someone who emails or calls and says they want to volunteer.
  • meeting with or interviewing a prospective volunteer for the first time, collecting all the necessary paperwork from the new applicant, etc.
  • orienting or training a new volunteer, and deciding what that orientation or training consists of (watching a certain video? going over the employee policy manual? getting a tour of the facilities?).
  • inputting all of the volunteers’ information into a central database.
  • letting volunteers know about organization events or activities they would be welcomed to join or that they may be asked about from the public they work with.
  • following up with volunteers to see how their experience is going.
  • trouble-shooting on behalf of volunteers.
  • firing a volunteer.
  • recognizing and rewarding volunteers.
  • tracking volunteer contributions and reporting such to the organization.
  • interviewing volunteers that leave, to see why and to address issues.

Now that you have your list, ask your staff these same questions at your next staff meeting. You’ll learn two things:

  • Whether you are right.
  • Whether the staff who have these responsibilities knew they had them.

Don’t be surprised if, in fact, you are wrong about who is responsible for what, nor that there are staff who didn’t know these responsibilities were theirs. Reflect on these discrepancies and think about how you are going to support staff who didn’t know it was their responsibility to manage a piece of working with volunteers.

And then, finally, ask for a progress report on each of these tasks. And don’t be surprised to hear, again and again, “We’re behind on that. We’ve had other priorities. Sorry.” Because unless you have a dedicated manager of volunteers, someone whose sole responsibility is to support and engage volunteers, it’s very likely all those other people who are supposed to have at least a piece of volunteer engagement as a part of their roles – the marketing director, the fundraising manager, the thrift store manager, etc. – aren’t doing it regularly. And with that, you’ll finally understand why your organization doesn’t have all the volunteers it needs and why volunteers don’t stay.

And maybe then you’ll stop saying, “Well, people just don’t want to volunteer anymore!”


What is your social media manager doing?

This happens a lot. Too much, in fact:

I find a Twitter account for a subject in which I am very interested. I look at who the account follows, so I can see other, related accounts on the subject. Instead, I see a long list of celebrities the social media manager apparently follows out of personal interest: movie stars, athletes, bands, reality show celebrities, etc. Sometimes, I even see the account follows adult entertainment stars and highly controversial political figures. And I wonder: how much time does this social media manager spend on Twitter doing what personally interests them rather than on activities that benefit the organization?

It’s not just what you post on social media that sends a message about your organization: it’s also who you follow, what you “like”, what you retweet, etc.

The accounts that your Twitter account follows should be related to your organization’s mission or subjects your organization needs updates about, such as nonprofit financial management, corporate social responsibility, volunteer management, etc.

This isn’t to say your organization can’t follow a celebrity via its social media accounts. If a celebrity is vocal in supporting the issue central to your nonprofit’s mission and posts about it frequently, by all means like those posts – in fact, leverage them: reply to and retweet their messages with your own organization’s congratulations or point of view.

This isn’t to say your organization shouldn’t follow a politician: you absolutely should follow your area’s elected officials, even if you don’t agree with them, because what they do can affect your organization and clients. And again, reply to their posts, even if you disagree with them, if your message relates to what your organization tries to do as a part of its mission.

If a social media manager reports to you, you need to be supervising them! You do that by:

  • Following your organization’s account on Twitter via your own, personal Twitter account – an account you never, ever have to use to post anything at all – and reading that account regularly, certainly every week
  • Following your organization’s account on Facebook and reading the posts regularly
  • Asking how many people are coming to events or activities as a result of social media posts (and if they say they don’t know, tell them they need to start finding out)
  • Asking how many people engage with the organization’s social media (comment, ask questions, etc.), not just how many people “like” a social media post
  • Asking what the manager is doing to attract new followers on social media
  • Asking for an overview of who is following the organization on social media. People interested in attending events or obtaining services? Elected officials? Other area organizations?
  • Asking the social media manager to break down by percentage the categories posts might fall into: posts that are about marketing activities, posts that are about attracting donors, posts that are about promoting the organization’s accomplishments, posts meant to educate regarding the organization’s cause, etc. If 50% of posts are asking for money, should this be reduced, and the number of posts about accomplishments be increased?
  • Asking the manager how they engage with other accounts in their feed: what posts are they “liking” or commenting on, and have those interactions led to anything – new followers, questions, criticisms, etc.
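To make the percentage-breakdown question concrete: tallying post categories takes only a few lines of code. This is a hypothetical sketch – the category names and the `category_breakdown` helper are my own illustration, not part of any particular social media tool:

```python
from collections import Counter

def category_breakdown(posts):
    """Given (post_text, category) pairs, return the percentage of
    posts falling into each category (fundraising, events, etc.)."""
    counts = Counter(category for _, category in posts)
    total = sum(counts.values())
    return {cat: round(100 * n / total, 1) for cat, n in counts.items()}

# A tiny, made-up month of posts, pre-labeled by the social media manager
posts = [
    ("Please donate to our year-end campaign", "fundraising"),
    ("Our volunteers served 200 meals this week", "accomplishments"),
    ("Join us Saturday for the river cleanup", "events"),
    ("Why recycling matters: a short explainer", "education"),
]
print(category_breakdown(posts))
```

The labeling itself is the real work, and doing it by hand for a month of posts is often revealing enough; the arithmetic is the easy part.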

On a related note: please put the FULL name of your organization in your Twitter description, not your mission statement! I don’t want the only way to find you on Twitter to be to look on your web site – most people just give up rather than trying to hunt you down.

If you have benefited from this blog or other parts of my web site and would like to support the time that went into developing material, researching information, preparing articles, updating pages, etc., here is how you can help.


Reporting impact should be EASY – why do so many struggle with it?

I think the work of the United States Agency for International Development (USAID) is among the most important work my country, the USA, does.

I think foreign aid by the USA, or any other country, is vital to world economic stability and security. I believe foreign aid prevents wars and reduces human migration fueled by violence and poverty. I also believe foreign aid is just the right thing to do, to help people and our world.

Because I think USAID is so important, it’s difficult to see it stumble so badly, especially in a country I dearly love, Afghanistan. And that seems to be the case with Promote, an Afghanistan-based initiative that is USAID’s largest women’s empowerment program in the agency’s entire history. The Promote web site says:

The aim is to advance opportunities for Afghan women to become political, private sector, and civil society leaders and to build upon existing and previous programs for women and girls.

Three years after it launched, a US government watchdog agency reviewed the program and could not find any concrete data showing it has helped any women become political, private sector or civil society leaders.

The Special Inspector General for Afghan Reconstruction (SIGAR) was established by Congress to monitor spending by the USA in Afghanistan. In its report released last week, SIGAR cites a letter from USAID saying that the Promote program had “directly benefited 50,000 Afghan women with the training and support they need to engage in advocacy for women’s issues, enter the work force and start their own businesses.” The letter added that Promote had helped women “raise their voices and contribute to the peace and prosperity of their country.”

But the SIGAR report notes that these USAID claims for the program are not backed up by any measurable data, such as actual jobs, internships or additional trainings made possible because of Promote’s work.

The SIGAR report notes that:

  • The Promote program changed its performance indicators substantially in its first two years, greatly reducing the number of people it committed to serve.
  • Because it did not complete a baseline study early in its implementation, Promote lacks a starting point from which to monitor and evaluate the program’s progress over its first 2 years and to measure its overall impact in Afghanistan. In other words, evaluation was not baked in right from the beginning.
  • The Promote program delivers much of its programming through contractors, and SIGAR found that USAID/Afghanistan’s records on the contractors’ required deliverables were incomplete and inaccurate because management did not give contractors enough guidance on record keeping and tracking important information about deliverables in a consistent manner. In addition to such records being absolutely fundamental to being able to evaluate impact, the report notes that complete and accurate records are critical to documenting and maintaining institutional knowledge in a mission that experiences high staff turnover.
  • The report also notes that the program didn’t have feedback from contractors on the potential negative impacts of the proposed programming.

In some cases, attendance at a single gender empowerment class organized by Promote was counted as a woman benefiting from the program. One target was to help 20 women find leadership positions in the Civil Service, but none have so far, according to the SIGAR report. One of the few concrete results cited in a study of the Promote project was the promotion of 55 women to better jobs, but the SIGAR report says it is unclear whether the Promote program could be credited for those promotions.

Two people associated with the program that I have seen on social media have been very upset about the SIGAR report and the article in The New York Times about it. They say the data IS there – but neither could give me any links to it, say where the data is or how it was collected, etc. One said that the kind of data SIGAR is asking for is impossible to gather because of two things outside the program’s control: the security situation in Afghanistan and the conservative nature of the country. To which I say: NONSENSE. Neither of those factors is a reason not to have the data necessary to evaluate this program – if those issues didn’t prevent the program’s activities, they would not prevent gathering data about them.

Program results are not meetings, not trainings, not events, and not the number of people that participated in any of them. Those are activities, and mere activities can rarely be reported as program results. What happened because of the meeting or training or event? What changed? What awareness or skill was gained? What happened to the participant at the meeting, or because of the meeting, that met the program’s goals?

Here is just how easy it can be to evaluate a program: create a survey to be delivered before or at the start of a meeting, training or event for attendees. You can get answers to that survey as one big group exercise, as a series of small group exercises or in one-on-one interviews if it’s a low-literacy group or if you don’t believe the target audience will fill out a paper survey. Ask about their perceptions of various issues and the challenges they are facing in relation to the issues you want to address. Ask their expectations of your meeting, training or event. Then conduct a similar survey weeks or months later, with the same group, and compare the results. TA DA: YOU HAVE DATA FOR EVALUATION OF YOUR RESULTS. This is a very simplistic approach and just scratches the surface of all that the Promote program should have been gathering, but even just this would have been something. It would have given some indication as to whether or not the program was working.
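The before-and-after comparison described above can be sketched in a few lines. This is a hypothetical illustration – the question text and the simple 1-to-5 agreement scale are my own assumptions, not from any actual Promote instrument:

```python
def average_shift(baseline, follow_up):
    """Compare average ratings question by question between a baseline
    survey and a follow-up survey given to the same group later.
    Each argument maps a question to a list of 1-5 ratings."""
    shifts = {}
    for question, before in baseline.items():
        after = follow_up[question]
        avg_before = sum(before) / len(before)
        avg_after = sum(after) / len(after)
        shifts[question] = round(avg_after - avg_before, 2)
    return shifts

# Made-up responses from five participants, before and months after a training
baseline = {"I feel able to advocate for myself": [2, 3, 2, 1, 3]}
follow_up = {"I feel able to advocate for myself": [4, 3, 4, 5, 3]}
print(average_shift(baseline, follow_up))
```

A positive shift on a question is not proof of impact on its own, but it is exactly the kind of measurable starting point the SIGAR report found missing.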

Now, let’s be clear: this SIGAR report does NOT say the Promote program isn’t doing anything and should be ended. Rather, as the report itself says:

after 3 years and $89.7 million spent, USAID/Afghanistan has not fully assessed the extent to which Promote is meeting its overarching goal of improving the status of more than 75,000 young women in Afghanistan’s public, private, and civil society sectors. 

And then it makes recommendations to the USAID Administrator “to ensure that Promote will meet its goal in light of the program’s extensive changes and its mixed performance to date.” Those recommendations are:

1. Conduct an overall assessment of Promote and use the results to adjust the program and measure future program performance.

2. Provide written guidance and training to contracting officer’s representatives on maintaining records in a consistent, accurate manner.

3. Conduct a new sustainability analysis for the program.

Here are some tips regarding number 2:

  • give the representatives examples of what data should look like
  • explain the importance of reporting data that shows an activity has NOT worked in the way that was hoped for, and how reporting this data will not reflect poorly on the representative but, rather, show that the representative is being detailed, realistic and transparent, all key qualities for a program to actually work
  • engage the representatives in role-playing regarding gathering data. Have staff members do simple skits showing various data-gathering scenarios and overcoming various challenges when interviewing someone and how to address such. Then have representatives engage in exercises where they try these techniques, with staff playing the roles of government officials, NGO representatives, community leaders hostile to the program, women participating in the program, etc.
  • emphasize over and over that evaluation isn’t a separate activity from program delivery, done at the end of a project, and provide plenty of examples and demonstrations on what evaluation activities “baked in” to program delivery really looks like.

With a colleague in Afghanistan, I developed a comprehensive list of questions to answer in preparation for reporting to donors, the media and the general public. It helped the local staff at the government ministry where we worked know what information donors and UN agencies regularly asked for, and what we anticipated they might start asking for; what subjects the media regularly asked about or reported on, and what we anticipated they might start asking about or reporting on; and what information could be used for evaluation purposes later. It was part of our many efforts to build public sector staff communications capacities in the countries where I’ve served. We needed a way to rapidly bring staff up to speed on reporting – on EVALUATION – needs, and I think these kinds of efforts did that. I hope Promote will develop something similar for those delivering its services, and make sure the lists are understood.


History & Evaluation of UNV’s Early Years

Whilst trying to make a list of all of the Executive Coordinators of the United Nations Volunteers program since UNV began in 1970, to update UNV’s profile on Wikipedia, I found quite a delicious document from 1974, which provides the most detailed history of the origins of the UNV program that I have ever read – origins I don’t think most people are aware of, including most staff at UNV – as well as an evaluation of UNV’s first three years of operation.

Some things have changed quite a lot at UNV since this document was published – but some have stayed the same.

The article is The Platonic Acorn: A Case Study of the United Nations Volunteer. It’s by Robert A. Pastor who, at the time of this paper’s publication in 1974, was a graduate student at the Kennedy School of Government at Harvard University. Pastor was a former Peace Corps volunteer who went on to many high-profile international endeavors: he was a member of the National Security Council Staff during the administration of President Jimmy Carter, he was associated with various universities, and also served as a Senior Fellow at the Carter Center, where he established the programs on Latin America and the Caribbean, democracy and election-monitoring, and Chinese village elections. He died of colon cancer in 2014.

His paper about the first years of UNV appeared in the journal International Organization, published by Cambridge University Press on behalf of the International Organization Foundation, in Volume 28, Issue 3, July 1974, pp. 375-397, and it’s accessible online, for free, from JSTOR. The abstract of his paper:

This article presents both a history and an administrative analysis of the United Nations Volunteers, an international organization established by a General Assembly resolution in December 1970. The hope that the new organization would presage a new era of multinational volunteerism has proven groundless. In seeking to explain the ineffectiveness of the UN Volunteers, I look inside the organization and find that it has little or no control over its six principal functions. This extreme decentralization of responsibility is then explained not by a static description of the institutional but by focusing on the dynamic process by which state and transnational actors exercised influence during the different stages of the organization’s establishment and development. Those actors whose autonomy was most jeopardized by a new volunteer organization were most active in defining and limiting the scope of its operations. The relative lobbying advantages of state and transnational actors meshed with bureaucratic and budgetary constraints to ensure an enfeebled organization.

Whew!

Pastor is especially critical of UNV’s recruitment and placement processes, which slowed volunteer placement to a crawl. The problem was that, in the 1970s, each stage in the UNV selection process was managed by a different organization in a different location, resulting in 11 different stages between the volunteer-involving organization and the applicant. As a result, as of March 1973, UNV had filled just 93 posts from approximately 400 requests. In addition, 85 percent of volunteers were from least developed countries (LDCs). At the time, this was seen as a problem, because the program was supposed to be “universal”, with a significant number of young volunteers from industrialized countries:

The organizational process also helps to explain why there is such a high percentage of volunteers from LDCs, and may help predict why this is likely to continue. Many applicants from LDCs view the UNV as a step into the UN civil service, and thus they are willing to tolerate longer delays than their counterparts in the developed world who generally view volunteer service as precisely that. The result, that LDC volunteers currently count for nearly half of all volunteers, is a bit ironic since one of the original purposes of volunteerism was to exploit the skill surplus of the developed countries.

I have no idea what the timeline is now between the creation of a UNV assignment and the placement of a person into it, but mentalities regarding volunteers from developing countries have greatly changed: UNV now prides itself on its high percentage of volunteers from developing countries, the idea being that this reflects south-to-south cooperation. The average age of UNVs has also increased, from people in their 20s when the program started to 38 now – a program originally designed to channel the energies of youth has become something quite different.

Another criticism by Pastor is that “Although volunteers are supposed to work directly with host country people, they find themselves working with and accountable only to foreign experts.” In the last few years, UNV has focused on its capacity to be a low-cost staffing solution for UN agencies, so this criticism could still be made – and may become a greater issue.

Pastor questions UNV’s ability at the time to fulfill specialized requests for volunteers, and suspects the level of specialization requested is much higher than what is actually necessary. He provides imaginary, outrageous examples of such requests, such as for a “French-speaking sand dune fixation expert.” He says, “Assuming that these specialists exist, the likelihood of finding one who would volunteer is negligible, while the price of the search is exorbitant.” Pastor’s paper was written more than two decades before the Internet became widely used in the USA, and then grew exponentially globally; recruitment of highly-specialized candidates for volunteering is now easy for most situations, and the number of applicants for these assignments shows an abundance of experts willing to take on such volunteering roles.

Another criticism in the document is whether the UNV program was, in fact, a volunteer program, given the “high professional calibre” of volunteers – the idea being that the degree of expertise of the volunteers somehow makes them not really volunteers anymore. He notes that UNV “insists on selling its product as an inexpensive substitute for experts.” Since then, thankfully, the understanding of the word volunteer has changed, and it does not mean amateur, unskilled or inexperienced. But for UNV now, in 2017, what does volunteer mean? In the USA, a person is a volunteer at a nonprofit or other mission-based organization if he or she is not paid by that agency for services rendered. In fact, the federal agency in charge of regulating labor, the US Department of Labor, has strict guidelines on who may be called a volunteer – and who may not. As UNVs, especially national UNVs from the same country where they are serving, receive excellent compensation, called a stipend rather than a salary, what makes them volunteers? That I cannot answer.

Pastor’s review of UNV is a fascinating document which offers a lot of challenging questions about UNV – and some of these questions, IMO, need to be asked again.  I’m so sorry I can’t thank him for his paper, and talk with him about how UNV has evolved. I would have loved to hear what he thought of the Online Volunteering service in particular, which I think meets many of the goals originally set out for UNV but not realized.

In the course of my research, I also found the book The Role and Status of International Humanitarian Volunteers and Organization: The Rights and Duty to Humanitarian Assistance by Yves Beigbeder, ISBN 0-7923-1190-6. It was published in 1991, and from the pages available on Google, it seems to offer some similarly scathing analysis of UNV’s performance up to that date. There is scant information online about the author, though he seems to be a prolific writer; what little there is says he served at the Nuremberg Tribunal in 1946 and had a “long career in UN organizations as a senior official.” Beigbeder’s book is hard to get hold of; it is offered online for about $100, well beyond the budgets of most folks interested in evaluating volunteer placement agencies (and beyond my own budget as well).

In the book, Beigbeder says that the UNDP Governing Council asked the UNV administrator to undertake a review of the UNV program in 1986 and in 1987. The report was a mixed bag on UNV performance at that time: it was noted that, in Yemen, “UNVs are quickly operational, less demanding in support services and more adaptive to difficult, harsh and isolated working conditions than other technical assistance staff.” But in Papua New Guinea, results were mixed. “When UNVs have not done well, the cause was either poor project design, noninvolvement by supervisors in developing the job description, job duties imprecise or modified after the arrival of the UNV, wrong selection, or language deficiencies.” All of those can still be problems with UNV assignments – or for any international placement organization, for that matter. Addressing those problems is an ongoing issue.

Finally, my search also led me to the self-published book Not Only a Refugee: An American UN Volunteer in the Philippines by Eleanor Grogg Stewart, about her time in the early 1980s, specifically in and around 1982, when she worked in a refugee camp. Several pages from her book are available on books.google.com. It’s a detailed account of the early days of UNV, and of trying to navigate UN bureaucracy.

It’s a shame that early accounts and evaluations of nonprofit organizations, international aid agencies, government programs and other mission-based entities are forgotten. It’s so interesting to read how much has changed, how much has improved – and how far we still have to go. How can we know if we’re making a difference if we aren’t looking at what our agencies promised in the past?

One final note: On 31 May, 2017, the Executive Board of UNDP, UNFPA and UNOPS convened in New York to discuss the findings of an independent evaluation of the United Nations Volunteers (UNV) programme and UNV’s new Strategic Framework goals and objectives for 2018-2021. Here is a press release about the meeting, which says Nina Retzlaff is the independent evaluator of the UNV Strategic Framework 2014-2017 and that she elaborated on key points and early findings from her evaluation at the New York meeting, noting that there was a high level of satisfaction from UN partners on the work of UN Volunteers and that “91% of UN partners confirm UNV responds to their needs, [and that] 92% of the UN Volunteers report a satisfactory experience.” She also said that “UNV’s programmatic niche is in Youth and Volunteer Infrastructure.” I would love to read the evaluation but cannot find out whether it has even been finished, let alone published. It would be fascinating to read how it compares to these earlier assessments.


Measuring social media success? You’re probably doing it wrong.

A nonprofit buys billboard space on a major highway. Thousands of people drive by the billboard every day. After a week, the marketing director declares the billboard a huge success because of the number of people that are driving by the billboard. However, there is no significant gain in donations, volunteers or clients by the organization.

Does this sound like a ridiculous way to measure the success of a marketing activity? It is. Yet, that’s how I regularly hear people measure the success of social media use by a nonprofit, government agency or other mission-based initiative.

If your nonprofit is an animal shelter, or a farmer’s cooperative, or a community theater, or a health clinic, or any other nonprofit that serves a geographically-specific clientele, having thousands of Twitter followers is not an indication that you are having social media success. So what? That’s the same as the billboard out on the highway. It’s just a number, and if it’s not translating into something tangible, it’s a waste of money and effort.

For online activities to translate into something tangible, online action must create and support offline action or behavior. What could this look like?

  • An increase in the number of volunteers providing service to your organization
  • An increase in the number of volunteers who stay with your organization over a longer term
  • A greater diversity of volunteers providing service, with greater representation from under-represented groups
  • Greater numbers of donors
  • More repeat donors
  • New donors
  • Greater attendance to conferences, workshops, etc.
  • Greater attendance to events with an entrance fee, which creates greater revenues
  • Greater numbers of downloads or purchases of a publication or other product
  • Greater numbers of clients or people served
  • More repeat clients
  • A greater diversity of clients receiving services from your organization
  • Larger numbers of people writing government officials, corporate representatives or the media regarding the cause your organization promotes
  • Larger numbers of people filling out surveys that you will use in creating proposals, reports and publications regarding your organization’s work
  • More feedback from volunteers, donors, clients and the general public regarding your work
  • Volunteers and clients reporting a perception of greater support from your organization
  • Volunteers and clients reporting a new / changed perception that relates to your mission (for instance, those you engage with online reporting that they are no longer prejudiced against a particular group or community) or a change in behavior or practice that relates to your organization’s mission (for instance, if you were an organization that promotes recycling, and those you engage with online telling you they are recycling more)
  • Volunteers, clients, staff, the general public and/or the press reporting a perception of greater support from your organization, an improved perception of the organization’s impact, an increased awareness about the cause an organization promotes, etc.

A few hundred Twitter or Instagram followers may not sound impressive, but if most of those followers are in your geographic area, if there are lots of public officials and other nonprofit representatives and local people served by your organization among those followers, you’re doing well. If you are a nonprofit serving teens, and most of those followers are teens, you are doing VERY well. It’s not about the how many, it’s about the who.

How can you measure social media success? I talk about that on my web page Evaluating Online Activities: Online Action Should Create & Support Offline Action & Results. For most nonprofits, measuring is not a matter of a software choice; it’s going to take a more person-to-person approach, involving surveys and interviews. In other words, engagement.

Quit celebrating how many people have “liked” your organization’s Facebook page. Are discussions happening on that Facebook page? Are people asking questions? Are individual status updates being liked and shared? Celebrate engagement.


Measuring the Impact of Volunteers: book announcement

Want to make me cranky? Suggest that the best way to measure volunteer engagement is to count how many volunteers have been involved in a set period, how many hours they’ve given, and a monetary value for those hours. Such thinking manifests itself in statements like this, taken from a nonprofit in Oregon:

Volunteers play a huge role in everything we do. In 2010, 870 volunteers contributed 10,824 hours of service, the equivalent of 5.5 additional full-time employees!

Yes, that’s right: this nonprofit is proud to say that volunteer engagement allowed this organization to keep 5.5 people from being employed!

Another cringe-worthy statement about the value of volunteers – yes, someone really said this, although I’ve edited a few words to hide their identity:

[[Organization-name-redacted]] volunteers in [[name-of-city redacted]] put in $700,000 worth of free man hours last year… It means each of its 7,000 volunteers here contributed about $100 – the amount their time would have been worth had they been paid.

I have a web page talking about the dire consequences of this kind of thinking, as well as a range of blogs, listed at the end of this one. That same web page talks about much better ways to talk about the value of volunteers – but it really takes more than a web page to explain how an organization can measure the true value of volunteers.

That’s why I was very happy to get an alert from Energize, Inc. about a new book, Measuring the Impact of Volunteers: A Balanced and Strategic Approach, by Christine Burych, Alison Caird, Joanne Fine Schwebel, Michael Fliess and Heather Hardie. This book is an in-depth planning, evaluation and reporting tool. How refreshing to see volunteer value discussed in depth – not just as an add-on to yet another book on volunteer management.

But the book’s importance goes even further: it will be helpful not only to the person responsible for volunteer engagement at an organization; it will also push senior management to look at volunteer engagement as much, much more than “free labor” (which it isn’t, of course). Marketing managers need to read this book. The executive director needs to read this book. Program managers need to read this book.

The book is yet another justification for thinking of the person responsible for the volunteer engagement program at any agency as a volunteerism specialist – a person who needs ongoing training and support (including MONEY) to do her (or his) job. This book shows why the position – whether it’s called volunteer manager, community engagement director, coordinator of volunteers, whatever – is essential, not just nice, and why that person needs to be at the senior management table.

I really hope this book will also push the Independent Sector, the United Nations, other organizations and other consultants to, at last, abandon their push of a dollar value as the best measurement of volunteer engagement.

For more on the subject of the value of volunteer or community engagement, here are my blogs on the subject (yeah, it’s a big deal with me):

Valuing volunteer engagement: an imaginary case study


Imagine a nonprofit theater showing the value of its volunteer usher program by saying:

We involved 40 ushers in 2015, and they provided 100 hours of service, and since the Independent Sector says the value of a volunteer hour is $23.07, the value of our volunteer usher program in 2015 was $2,307.00.

Here’s what such a statement shows:

  • The value of volunteers is that the organization doesn’t have to pay them.
  • Volunteers save money, because they do work for free.
  • Volunteer time is, hour for hour, more valuable than that of every staff member who isn’t a director, because they are all paid far less than $23.07 an hour.
  • The organization could get even more value for its volunteer program if it could get more volunteers doing things it is currently paying staff to do.
  • The greater the number of volunteer hours, the greater the value of the volunteer engagement.

How would such a stated value of the volunteer usher program make the ushers feel? Make the receptionist feel? Make donors that are union members feel?

It’s an obviously awful idea. Yet, this is how so many consultants and organizations want nonprofits to state the value of volunteer engagement.

By contrast, I would find the value of a volunteer usher program through collecting data that could be measured against both the mission of the organization and the mission of the volunteer program. Let’s say the mission of the organization is “to provide theatrical works that entertain, enlighten, and have a transformative impact on our audiences, and build an appreciation of the arts in our community.” Yes, I just made that up. I have examples of mission statements for volunteer engagement programs here. Here’s how I would collect that data:

I would find out what impact being a volunteer at the theater had for the ushers. I would find this out through interviews and surveys, asking things like “Why did you want to be an usher at our organization?” and “What have you learned as an usher that you might not have known otherwise about our theater? Or about putting on theater productions?” I would also ask why they think volunteer ushers might be preferable for the theater to paying people to do the work.

I would survey new ushers before they began their volunteering, and then survey them after they had served a certain number of hours, asking them the same questions, to see if their perceptions about theater in general, and our theater, specifically, had changed.

I would ask audience members how ushers help their experience at our theater. I’d do this through surveys and interviews.

I would ask staff members how they believe hosting ushers benefits them, the audience, and the theater as a whole. I would also ask why they think volunteer ushers might be preferable to paying people to do the work.

I would look at the profiles of the ushers and see what range of age groups was represented, what range of zip codes was represented (based on home addresses) and, if possible, what range of ethnicities and other demographics was represented – data that could show how representative of our community the volunteer ushers are.

If I didn’t have time to do all of this data gathering and interviewing myself, I would talk to faculty members at area universities and colleges that teach classes in nonprofit management, sociology or psychology, to see if students in one of their classes could do the data collection as part of an assignment, or if a PhD student might want to oversee the project as part of his or her doctoral work. The students would get practical experience, and I would get interviewers to whom people might be willing to give more honest answers than to me, someone they know from the theater.

None of this is vague, feel-good data; it’s data that can be used not only to show the organization is meeting its mission through its volunteer engagement, but also testimonials that can be used in funding proposals and volunteer recruitment messages. It would also be data that could help the organization improve its volunteer engagement activities – something that monetary value also cannot do.

Whether your organization is a domestic violence shelter, an after-school tutoring program, a center serving the homeless, an animal rescue group, a community garden – whatever – there is always a better way to demonstrate volunteer value than a monetary value for hours worked. What a great assignment for a nonprofit management or volunteer management class…

For more on the subject of the value of volunteer or community engagement:

Get to know the Sustainable Development Goals (SDGs)

Goodbye, Millennium Development Goals (MDGs), hello, Sustainable Development Goals (SDGs).

The SDGs are the 17 goals towards which all United Nations efforts will now work, and the UN will encourage all NGOs and governments to work towards them as well, to make our world a better place. Like the MDGs before them, the SDGs will help UN initiatives better focus their work across various agencies, various partnerships, and various regions.

I congratulate my United Nations colleagues, especially those at the United Nations Development Programme (UNDP), who have been working on drafting these. I encourage you, if you work in addressing any of the areas mentioned by the SDGs, or you want to work in international development, to become familiar with these, and to start referencing them in your work.

The MDGs were introduced in 2000. I began at the UN in February 2001, and I found the MDGs incredibly helpful in approaching my work and making it more focused. In my opinion, the MDGs did an excellent job of focusing the work of the UN, various NGOs and governments, providing a framework for all of our initiatives. The MDGs are simple, with goals with which, for the most part, no one would disagree. The MDGs are “an explicit recognition of the reality that a large proportion of people in the world were deprived and poor.” They sought a time-bound reduction in poverty to improve the living conditions of those deprived and excluded, and they were an attempt to place this persistent problem, until then a largely national concern, on the development agenda for international cooperation.1 The MDGs specified a destination but, purposely, did not chart the journey, so that each country – indeed, each community – could develop its own way of reaching the goals.

But, as we all knew it would, the world has changed significantly since the MDGs were created in 2000. Notions of developed and developing have continued to evolve. Now, international development is less about the transfer of aid from rich to poor countries and more about progressive change from within, and about empowering local agents of change. The world has always been interconnected, but challenges and opportunities now cross borders so much more quickly, requiring incredibly rapid responses. The MDGs were always meant to be replaced as the world evolved, and now they have been. No, we didn’t reach the MDGs by 2015, but we did better target our work towards the world’s poorest.

The MDGs made no mention of human rights and did not specifically address economic development; the SDGs correct this. The SDGs apply to all countries, rich and poor alike, and the UN conducted the largest consultation in its history to gauge opinion on what the SDGs should include. I’ve read several comments that say the SDG framework brings together the different aspects of sustainable development – economic, social and environmental – in a much more integrated way than the MDGs, but I haven’t read any specific examples of that, or seen any illustrations of this yet.

The deadline for the SDGs is 2030. Will we reach the goals by then? Probably not. But we will make progress towards them, if we have the will to do so. Are the SDGs perfect? No. But they’re better than what we had, and better than nothing. I often think the arguments against the SDGs, like those against the MDGs, keep us from activities that the world desperately needs.

Footnotes

1. “The MDGs after 2015: Some reflections on the possibilities,” by Deepak Nayyar, for the UN System Task Team on the Post-2015 UN Development Agenda, April 2012

Finding out how many orgs are involving online volunteers

A followup to my last blog, where I whined that so many organizations charged with measuring volunteering in a region or country refuse to ask any questions related to virtual volunteering.

As I’ve said many times: when I do workshops on virtual volunteering, and describe all the different aspects of what online volunteering looks like, including microvolunteering, someone always raises a hand or comes up to me afterwards to say, “My organization has online volunteers and I didn’t even know it!” or “I’m an online volunteer and I didn’t know it!”

If you ask organizations, “Do you have virtual volunteering / microvolunteering at your organization?” most will say “No.” But if you ask the question differently, the answer is often “Yes!”

How would YOU ask the question of organizations to find out if they were engaging volunteers online?

Here’s one idea:

In the last 12 months, did any volunteers helping your organization work, in whole or in part, offsite on behalf of your organization, using their own Internet-enabled devices (computers, smartphones, notebooks) from home, work or elsewhere, to provide updates on their volunteering or the results of their volunteering?

What is your idea for ONE question? Please post it in the comments.

Challenges to getting answers:

  • There’s rarely just one person at an organization who works with volunteers; often, several employees or key volunteers each engage volunteers, but there may not be one person tracking all of this involvement. So if you ask this question of just one person at the organization, you might not get an accurate answer.
  • The word volunteer is contested. People will say, “Oh, we don’t have volunteers. We have pro bono consultants, we have unpaid interns, we have executives on loan, we have board members, but we do not have volunteers.” That means someone who is advising your HR manager regarding the latest legislation that might affect hiring, or your overworked marketing person regarding social media, and offering this advice unpaid, from the comfort of his or her home or office or a coffee shop, won’t be counted as an online volunteer – even though they are. In fact, I talked to the manager of an online tutoring program who brought students and what she called “subject matter experts” (SMEs) together online for school assignments, but because it never dawned on her that the SMEs were volunteers (unpaid, donating their service to a cause they believed in), she had no idea she was managing a volunteer program, let alone a virtual volunteering program.

This is not easy. I’ve been researching virtual volunteering since 1996 and, geesh, it’s still not easy! When does it get easier?!

volunteer managers: you are NOT psychic!

A colleague recently posted that one of the things that makes a great volunteer manager is going with your gut feeling.

UGH! Dislike!

In my trainings, I say just the opposite: do NOT assume your gut is telling you the truth.  

NEVER let your gut be your guide to decision-making.

I’ve had volunteer managers tell me that their gut reaction to volunteer applicants is their primary guide to keeping “bad” people out of their program. And so, I remind them of all of the many people who had no negative gut feeling about clergy, coaches or youth group leaders before or while those people abused children. And of all the many people who did not have a negative gut feeling about that boyfriend, spouse, family member or friend who, after years of knowing each other, turned out to be a liar, a cheat – even a killer.

Everyone in the Penn State/Second Mile scandal went with their gut instead of following good policy and procedures. Look where it got them!

Linda Graff once told me that one of the most chilling things you will ever do is sit in a courtroom and watch all of the many people ready to testify on behalf of their husbands, wives, sons, daughters, neighbors, co-workers, etc. – oh, no, that person could NOT do the things you have accused him/her of. It’s impossible. I KNOW this person. I don’t care what your evidence says – I know in my soul he/she is a good person. Those people’s guts told them one thing – and despite the facts, they preferred to listen to their guts.

I have almost let my gut feeling turn volunteer applicants away – and those people have turned out to be some of my best volunteers. What I was actually doing was listening to my prejudices: about age, about culture, or about education (or lack thereof). And I was honest enough to explore that and admit to it.

I have had people tell me, after working together for a couple of months, “You know, my first impression of you was insert-negative-comment-here. You have turned out not at all to be that way.” And I thank them for NOT going with their gut!

I’ve had endless numbers of volunteer managers tell me that their gut reaction to virtual volunteering is NO WAY IS THAT SOMETHING MY ORGANIZATION SHOULD DO.

In the course of my job, I never let my gut make decisions for me. Ever. Yes, my gut reaction might lead me in a direction, but if my gut is telling me something in the work place, such as don’t accept that person as a volunteer or that new idea just isn’t worth trying, I don’t make a decision based on that – I do more investigating and questioning. When it comes to effectively supporting and engaging volunteers, I need facts. Why am I having that feeling that such-and-such isn’t a good volunteer? Is it that he is being evasive in his answers? Is it that she seems too good to be true? Is it that he looks like an ex-boyfriend? When I start answering those questions honestly for myself, I either come to the concrete, fact-based reason I don’t want the person as a volunteer or I have to accept that my reluctance is more about prejudice than reality.

Volunteer managers: you are not psychic. There are no such things as psychics. Listen to your gut, but do NOT let it make your decisions, and if you haven’t said in the last three months, “Wow, my gut was wrong about that!” then you are NOT being honest with yourself!

Also see:

Dangerous Instincts: How Gut Feelings Betray Us by retired FBI profiler Mary Ellen O’Toole (with co-author Alisa Bowman)

Hard Facts, Dangerous Half-Truths & Total Nonsense: Profiting from Evidence-Based Management by Jeffrey Pfeffer and Robert I. Sutton

Beyond Police Checks: The Definitive Volunteer & Employee Screening Guidebook by Linda L. Graff