I think the work of the United States Agency for International Development (USAID) is some of the most important work that my country, the USA, does.
I think foreign aid by the USA, or any other country, is vital to world economic stability and security. I believe foreign aid prevents wars and reduces human migration fueled by violence and poverty. I also believe foreign aid is just the right thing to do, to help people and our world.
Because I think USAID is so important, it’s difficult to see it stumble so badly, especially in a country I dearly love, Afghanistan. And that seems to be the case with Promote, an Afghanistan-based initiative that is USAID’s largest women’s empowerment program in the agency’s entire history. The Promote web site says:
The aim is to advance opportunities for Afghan women to become political, private sector, and civil society leaders and to build upon existing and previous programs for women and girls.
Three years after it launched, a USA government watchdog agency has reviewed the program and cannot find any concrete data showing that it has helped any women become political, private sector or civil society leaders.
The Special Inspector General for Afghan Reconstruction (SIGAR) was established by Congress to monitor spending by the USA in Afghanistan. In its report released last week, SIGAR cites a letter from USAID saying that the Promote program had “directly benefited 50,000 Afghan women with the training and support they need to engage in advocacy for women’s issues, enter the work force and start their own businesses.” The letter added that Promote had helped women “raise their voices and contribute to the peace and prosperity of their country.”
But the SIGAR report notes that these USAID claims for the program are not backed up by any measurable data, such as actual jobs, internships or additional trainings made possible because of Promote’s work.
The SIGAR report notes that:
- The Promote program changed its performance indicators substantially in its first two years, greatly reducing the number of people it committed to serve.
- Because it did not complete a baseline study early in its implementation, Promote lacks a starting point from which to monitor and evaluate the program’s progress over its first 2 years and to measure its overall impact in Afghanistan. In other words, evaluation was not baked in right from the beginning.
- The Promote program delivers much of its programming through contractors, and SIGAR found that USAID/Afghanistan’s records on the contractors’ required deliverables were incomplete and inaccurate because management did not give contractors enough guidance on record keeping and tracking important information about deliverables in a consistent manner. In addition to such records being absolutely fundamental to being able to evaluate impact, the report notes that complete and accurate records are critical to documenting and maintaining institutional knowledge in a mission that experiences high staff turnover.
- The report also notes that the program didn’t have feedback from contractors on the potential negative impacts of the proposed programming.
In some cases, attendance at a single gender empowerment class organized by Promote was counted as a woman benefiting from the program. One target was to help 20 women find leadership positions in the Civil Service, but none have so far, according to the SIGAR report. One of the few concrete results cited in a study of the Promote project was the promotion of 55 women to better jobs, but the SIGAR report says it is unclear whether the Promote program could be credited for those promotions.
Two people associated with the program, whom I have seen commenting on social media, have been very upset about the SIGAR report and the article in The New York Times about it. They insist the data IS there – but neither could give me a link to it, say where the data is, or explain how it was collected. One said that the kind of data SIGAR is asking for is impossible to gather because of two things beyond the program’s control: the security situation in Afghanistan and the conservative nature of the country. To which I say: NONSENSE. Neither of those factors is a reason not to have the data necessary to evaluate this program – if those issues didn’t prevent the program’s activities, then they would not prevent data-gathering about those activities.
Program results are not meetings, not trainings, not events, and not the number of people who participated in any of them. Those are activities, and mere activities can rarely be reported as program results. What happened because of the meeting or training or event? What changed? What awareness or skill was gained? What happened to the participant at the meeting, or because of the meeting, that met the program’s goals?
Here is just how easy it can be to evaluate a program: Create a survey to be delivered before or at the start of a meeting, training or event. You can gather answers to that survey as one big group exercise, as a series of small group exercises, or in one-on-one interviews if it’s a low-literacy group or if you don’t believe the target audience will fill out a paper survey. Ask about their perceptions of the various issues and challenges they face in relation to the issues you want to address. Ask their expectations of your meeting, training or event. Then conduct a similar survey weeks or months later, with the same group, and compare the results. TA DA: YOU HAVE DATA FOR EVALUATION OF YOUR RESULTS. This is a very simplistic approach and just scratches the surface of all that the Promote program should have been gathering, but even just this would have been something. It would have given some indication as to whether or not the program was working.
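To show how little tooling this pre/post comparison actually requires, here is a minimal, hypothetical sketch in Python. The survey questions, the 1–5 agreement scale and all names are my own illustrations – none of this comes from the Promote program’s actual materials:

```python
# Hypothetical sketch: comparing baseline and follow-up survey scores.
# Assumes each question is answered on a 1-5 agreement scale, with one
# dict of {question: score} per participant. All names are illustrative.

def average_scores(responses):
    """Average each question's scores across all participants."""
    totals, counts = {}, {}
    for participant in responses:
        for question, score in participant.items():
            totals[question] = totals.get(question, 0) + score
            counts[question] = counts.get(question, 0) + 1
    return {q: totals[q] / counts[q] for q in totals}

def compare(baseline, followup):
    """Report the change in average score per question (positive = improvement)."""
    before = average_scores(baseline)
    after = average_scores(followup)
    return {q: round(after[q] - before[q], 2) for q in before if q in after}

# Example: two participants, surveyed before a training and months after.
baseline = [
    {"I feel able to advocate for myself": 2, "I know how to apply for a job": 1},
    {"I feel able to advocate for myself": 3, "I know how to apply for a job": 2},
]
followup = [
    {"I feel able to advocate for myself": 4, "I know how to apply for a job": 3},
    {"I feel able to advocate for myself": 4, "I know how to apply for a job": 4},
]

print(compare(baseline, followup))
# → {'I feel able to advocate for myself': 1.5, 'I know how to apply for a job': 2.0}
```

Even a simple change-in-average-score table like this, per question and per cohort, would be exactly the kind of measurable data the SIGAR report says is missing.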
Now, let’s be clear: this SIGAR report does NOT say the Promote program isn’t doing anything and should be ended. Rather, as the report itself says:
after 3 years and $89.7 million spent, USAID/Afghanistan has not fully assessed the extent to which Promote is meeting its overarching goal of improving the status of more than 75,000 young women in Afghanistan’s public, private, and civil society sectors.
And then it makes recommendations to the USAID Administrator “to ensure that Promote will meet its goal in light of the program’s extensive changes and its mixed performance to date.” Those recommendations are:
1. Conduct an overall assessment of Promote and use the results to adjust the program and measure future program performance.
2. Provide written guidance and training to contracting officer’s representatives on maintaining records in a consistent, accurate manner.
3. Conduct a new sustainability analysis for the program.
Here are some tips regarding recommendation number 2:
- give the representatives examples of what data should look like
- explain the importance of reporting data that shows an activity has NOT worked in the way that was hoped for, and make clear that reporting this data will not reflect poorly on the representative but, rather, show that the representative is being detailed, realistic and transparent – all key qualities for a program to actually work
- engage the representatives in role-playing regarding gathering data. Have staff members do simple skits showing various data-gathering scenarios and overcoming various challenges when interviewing someone and how to address such. Then have representatives engage in exercises where they try these techniques, with staff playing the roles of government officials, NGO representatives, community leaders hostile to the program, women participating in the program, etc.
- emphasize over and over that evaluation isn’t a separate activity from program delivery, done at the end of a project, and provide plenty of examples and demonstrations on what evaluation activities “baked in” to program delivery really looks like.
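For the first tip – giving representatives examples of what data should look like – even a tiny, hypothetical sketch of a consistent deliverables record can make the expectation concrete. The field names below are my own invention, not USAID’s actual record-keeping requirements:

```python
# Hypothetical sketch of a consistent deliverables record with a simple
# completeness check. Field names are illustrative only, not taken from
# USAID's actual record-keeping requirements.

REQUIRED_FIELDS = {
    "contractor", "deliverable", "due_date", "received_date",
    "accepted", "reviewer", "notes",
}

def validate_record(record):
    """Return a list of problems with a deliverable record (empty = OK)."""
    problems = [f"missing field: {f}" for f in sorted(REQUIRED_FIELDS - record.keys())]
    # Recording shortfalls honestly matters as much as recording successes.
    if record.get("accepted") is False and not record.get("notes"):
        problems.append("a rejected deliverable needs notes explaining why")
    return problems

record = {
    "contractor": "Example Implementing Partner",
    "deliverable": "Quarterly training attendance and outcomes report",
    "due_date": "2018-06-30",
    "received_date": "2018-07-05",
    "accepted": False,
    "reviewer": "COR (contracting officer's representative)",
    "notes": "",
}

print(validate_record(record))
# → ['a rejected deliverable needs notes explaining why']
```

The point isn’t the code – it’s that a shared, checkable template, enforced the same way by every representative, is what “maintaining records in a consistent, accurate manner” looks like in practice.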
With a colleague in Afghanistan, I developed this comprehensive list of questions to answer in preparation for reporting to donors, the media and the general public. The goal was to help the local staff at the government ministry where we worked know what information donors and UN agencies regularly asked for, and what we anticipated they might start asking for; what subjects the media regularly asked about or reported on, and what we anticipated they might start asking about or reporting on; and what information could be used for evaluation purposes later. It was part of our many efforts to build public sector staff communications capacities in countries where I’ve served. We needed a way to rapidly bring staff up to speed on reporting – on EVALUATION – needs, and I think we did with these kinds of efforts. I hope Promote will develop something similar for those delivering their services, and make sure the lists are understood.
Also see: