The Caribbean Coastal Area Management (C-CAM) Foundation is moving to bring sobriety to the raging debate over the proposed port development at the Goat Islands.
In a release issued earlier today, C-CAM, which does conservation work inside the Portland Bight Protected Area, revealed it is seeking funds to hire an international firm “to do a cost effectiveness comparison” of developing the port at the Goat Islands relative to one other site — “time permitting”.
Eleven civil society organisations and six journalists on Thursday joined forces to see how they could best communicate biodiversity and conservation issues in the Dominican Republic using both traditional and social media.
“We have never used social media before in our work.”
“We are excited to see how social media can make it easier for us to share what we do.”
“I hate the idea of social media but I will listen,” said an older participant.
These were some of the initial views expressed at the start of the two-day workshop hosted by the Critical Ecosystem Partnership Fund (CEPF) and Panos Caribbean in Santo Domingo from March 13-14, 2014.
This workshop is the third in a series organised under a regional project to bring civil society organisations together with members of the media for networking, and to build the media’s knowledge of Key Biodiversity Areas and the non-government organisations working to conserve them.
“We also wanted to train civil society organisations in communication skills and deepen their understanding of the media, their operations and motivations,” said Indi Mclymont-Lafayette, Regional Coordinator for Panos Caribbean.
Panos Caribbean is a regional non-government organization based in Haiti and Jamaica. It has received funding from the CEPF to implement the regional project titled, ‘Strengthening the Engagement of Caribbean Civil Society in Biodiversity Conservation Through Local and Regional Networking and Effective Sharing of Learning and Best Practices’.
Participants share ideas on how to make communicating biodiversity issues sexy to the media and other stakeholders on the first day of the CEPF – Panos workshop in Santo Domingo.
Juan Manual Diaz, Sustainability Director at the Instituto Dominicano de Desarollo Integral (IDDI), shares a successful project that his organization has done in Baharuco.
2013 marked the 40th anniversary of the signing of the Convention on International Trade in Endangered Species of Wild Fauna and Flora (CITES) on March 3, 1973. To mark the anniversary, March 3, 2014 has been chosen as the first-ever World Wildlife Day. It is an important moment in global biodiversity conservation, as CITES is a historic international agreement among the governments of 180 member countries to protect Earth’s species from the threat of exploitation through trade.
BirdsCaribbean joins in this 40th anniversary by reflecting on the loss of several poorly known Caribbean birds that historical sources indicate were lost to hunting, poaching, and trade. For example, in the last 400 years the region has lost an estimated 14 birds in the parrot family (most notably our macaws and parrots) to this cause of extinction. Today, poaching and trade remain a pervasive threat to many species of birds, among other vertebrate groups regionally. Caribbean birds, including hummingbirds, finches, and all the Amazon parrots, are targets of criminals willing to exploit our region’s biological resources.
These wildlife crimes are real crimes! They rob society of unique cultural heritages, destabilize ecosystems, and remove economic opportunities (nature tourism is the fastest growing segment of the global tourism market). Irrespective of its scale, we can all help to stop the trade in endangered species by not purchasing wildlife products (meats, souvenirs, pets); actively discouraging others from taking wild animals; and reporting offenses to the media, local authorities, and the police.
When more people take action to stop practices that permit the taking and trafficking of wildlife within our communities, the future of several of our most at-risk Caribbean species will be more secure. Add your voice to our call to treat wildlife crimes as no less important than the theft, exploitation or destruction of any of our other valuable natural resources.
Leo R. Douglas, Ph.D.
Department of Geography/Geology
University of the West Indies, Mona
Duration of consultancy: February 28, 2014 – April 28, 2014
Deadline for application: Friday February 21, 2014
Who can apply: Videographers/Video editors
Aim: Regional communication NGO Panos Caribbean is offering a video production consultancy to highlight lessons learnt and best practices inside key biodiversity areas in the Caribbean, notably Jamaica, Haiti and the Dominican Republic.
Requirements: Applicants must be able to demonstrate a commitment to complete the five-minute video on time and within budget.
Applicants should send a copy of their résumé and one recent example of their work.
Payment: The selected consultant(s) will be paid a total of US$1,500 for the creation and editing of the video. A small stipend will be provided to help to cover the cost of travel to gather information and any human or other resource the selected consultant(s) may need to help to complete the work.
Submission: Send applications to Petre Williams-Raynor at email@example.com, copied to Indi Mclymont Lafayette at firstname.lastname@example.org. Call 920-0070-1 to have any queries answered.
The video production consultancy is being offered as part of the project ‘Strengthening the Engagement of Caribbean Civil Society in Biodiversity Conservation Through Local and Regional Networking and Sharing of Learning and Best Practices’. The project, which is being implemented by Panos Caribbean, is designed to enable local and regional information sharing and networking on species, key biodiversity areas, biodiversity, critical ecosystems and approaches to conservation in the conservation corridors of Jamaica, the Dominican Republic and Haiti.
It has been made possible through funding from the Critical Ecosystem Partnership Fund (CEPF), which provides grants for both non-governmental and private sector entities to help protect biodiversity hotspots across the globe.
Panos Caribbean is an international information organisation established in 1986. Panos believes that information which is independent, accurate and timely is a key resource for development. Information needs to be locally generated in order to enable countries and communities to shape and communicate their own development agendas through informed public debate.
Its mission is to promote sustainable development in the Wider Caribbean region by empowering all sectors of society to articulate their own information and perspectives on development issues and broadcast them across language and political borders. In particular, Panos aims to disseminate, through the media, the voices of poor and marginalised people who are affected by certain development issues (farmers and fisher folk, women, children, people living with HIV/AIDS, persons with disabilities, among others). This encourages their full participation in shaping the development of their societies.
Since 1989, Panos has been working in the Caribbean through close alliance with the media in the region to raise awareness about select environmental issues.
By Valerie Gordon and Yacine Khelladi
24 JANUARY 2014. SANTO DOMINGO, Dominican Republic — The Caribbean has benefitted from significant development assistance in the more than four decades since most of its countries became independent of colonial powers. The form and dollar value of this assistance has varied considerably, supporting interventions ranging from infrastructure development to human and institutional capacity building, and from civil society support to environmental management. However, with most Caribbean countries now classified as “middle income” under the World Bank’s most recent designation, funding priorities for the region are changing. Many traditional funding agencies are choosing instead to spend their increasingly scarce development assistance dollars on low-income countries in Asia and Africa.
While a recent reflection on the Caribbean’s status by the UN Secretary General credited the region for demonstrating resilience in the face of external shocks, such as natural disasters and the global financial crisis, there was a concomitant call for the region to move forward with a development agenda that underlines equality and sustained growth.
This agenda must address issues such as climate change, crime and violence and youth unemployment while maximising opportunities for trade competitiveness and high productivity activities within a context of good governance, transparency, and accountability.
At this transitional juncture, many Caribbean societies face the challenge of employing the available technologies, and the data and research information generated by government and research institutions, to solve the region’s problems.
In seeking to meet this challenge, the Caribbean Open Institute (COI) was established in 2010 with the support of the International Development Research Centre (IDRC) out of Canada. The COI’s specific mandate is to engage and work with governments, researchers, journalists, technologists, NGOs, and academics to raise awareness, strengthen their capacity, and combine their efforts to adopt Open Development (OD) approaches in support of inclusion, participation and innovation.
Alongside the implementation of the various activities, there exists a need to understand and measure the effectiveness and impact of the process by which these initiatives transform into knowledge that informs critical decision-making, improves governance and contributes to sustained growth and development.
Assessing the results or outcomes of development interventions has traditionally been driven by donors needing to identify the respective returns on their investment and their need for accountability.
This type of monitoring and evaluation is implemented as discrete exercises at specific intervals by accountability interests external to the project’s principal stakeholders, and is adequate for measuring efficiency rates, outputs, and the extent to which the original objectives have been achieved.
Despite the stated focus on results or outcomes of such methodologies — for example, Results Oriented Monitoring (ROM) or Results-based Management (RBM) — and the familiarity of project teams with tools such as the “logical framework” of the project cycle management method, there is generally little appreciation of how outputs transform into outcomes, and of what incremental social changes, if any, result from the interventions.
Development practitioners involved in more research-oriented activities, and interested in understanding the effectiveness of processes and how to improve practices, are challenged to employ methodologies that are more learning-oriented.
Such was the environment in which the Outcome Mapping Methodology (OMM) was born. In the early 1990s, having supported development research for many years and provided partners the leeway to define and use their own evaluation mechanisms in the way which best suited their purposes, the IDRC was challenged to demonstrate the impact of its interventions in ICT for development (ICT4D).
In response, an activity was undertaken in collaboration with partners in Asia, Africa and Latin America and the Caribbean (LAC) and resulted in the development of a set of tools and guidelines suitable for planning, monitoring and evaluating social change interventions.
The approach supports the concept of research for development and decision-making as it is not only based on actor-centred development and behaviour change but also:
1) recognises that development change is non-linear, complex and occurs within dynamic systems;
2) supports continuous learning of the project team and partners; and
3) promotes participatory approaches, building and sustaining partnerships and accountability of all parties.
Figure 1: The three stages of Outcome Mapping
Source: IDRC publication
The OMM process involves three main stages:
- intentional design (vision, purpose of initiative, partners, etc);
- monitoring (identifies the desired change in behaviour of and relationships between the various target/partners groups); and
- the evaluation framework.
Throughout, the team collaboratively determines the activities necessary to promote the desired change and identifies progress markers by which the extent of change can be measured.
While the initial set up of an OMM process benefits from skilled facilitation, it is not necessary for participants to master the sometimes technical expertise required in preparing log frames which are the basis of ROM and RBM evaluations. However, implementing OMM demands, like any participatory, iterative and learning-focussed activity, time and commitment of internal staff and external partners.
This presents a challenge as in most research environments time is the most costly commodity, especially if there is inadequate flexibility in overall project timelines, and if the project outputs significantly outweigh learning outcomes from the perspective of the funding agency and project management.
A derivative of OMM is Outcome Harvesting (OH), which focuses on, as the name suggests, “harvesting” the outcomes of an initiative. It is not necessary to have utilised OMM as the evaluation methodology of choice for OH to be implemented. In fact, the method does not include measurement of progress towards predetermined outcomes or objectives, but rather collects evidence of what has been achieved, and works backward to determine whether and how the project or intervention contributed to the change.
A main characteristic is the involvement of stakeholders in a rigorous review of the outcomes or “substantiation” to validate and enhance the credibility of the findings. The outcomes are then organised in a database in order to make sense of them. The data is thereafter analysed and interpreted and, from this, the evidence-based answers to the useable harvesting questions are derived.
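As a rough illustration only (this is not part of the OH method itself, and all field and question names here are hypothetical), the “organise in a database, then analyse” step can be sketched as a small data exercise: store each outcome as a record, keep only the substantiated ones, and group the evidence under the harvesting questions it answers.

```python
# A minimal, hypothetical sketch of the Outcome Harvesting "organise,
# then analyse" step. Field names and example outcomes are invented
# for illustration; a real harvest would hold richer records.
from collections import defaultdict

# Each record: a described outcome, the actor whose behaviour changed,
# whether stakeholders substantiated it, and the harvesting question it informs.
outcomes = [
    {"description": "Ministry published open budget data",
     "actor": "Ministry of Finance", "substantiated": True,
     "question": "Did the project influence government practice?"},
    {"description": "Journalists cited project research in reporting",
     "actor": "Local media", "substantiated": True,
     "question": "Did the project improve public debate?"},
    {"description": "NGO drafted a data policy (unverified)",
     "actor": "Partner NGO", "substantiated": False,
     "question": "Did the project influence government practice?"},
]

def harvest(records):
    """Group substantiated outcomes under each harvesting question."""
    evidence = defaultdict(list)
    for rec in records:
        if rec["substantiated"]:  # only validated outcomes count as evidence
            evidence[rec["question"]].append(rec["description"])
    return dict(evidence)

if __name__ == "__main__":
    for question, findings in harvest(outcomes).items():
        print(question, "->", findings)
```

The point of the sketch is the direction of reasoning: the evidence is collected first, and only afterwards is it mapped back to the questions the intervention was meant to answer.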
The final step is proposing points for discussion to harvest users, including how the users might make use of the findings. In its role as a funder of research for development, and given the extent to which it has invested in ICT for development projects, the IDRC has been uniquely steadfast in its quest to have development practitioners and researchers engage in evaluation processes to support learning and understand the impact of ICT4D interventions.
The project “Developing Evaluation & Communication Capacity in Information Society Research” (DECI) commenced in 2007 with the primary objectives of learning about Utilisation Focused Evaluation (UFE), building evaluation capacity among select IDRC-PAN Asia Networking project partners through action research, and finding ways to make the approach relevant to the five disparate research project teams.
The UFE approach was first developed over 30 years ago, and has evolved significantly since that time, but with the central tenet of (evaluation driven by) “intended use by intended users” remaining intact. The approach facilitates “a learning process in which people in the real world apply evaluation findings and experiences to their work”.
It is a guiding framework rather than a methodology and, as such, does not prescribe any specific content, method or theory. Critical to the success of the process are the evaluator stepping into the role of facilitator rather than expert, and active stakeholder involvement.
In the context of the DECI project, case studies were carried out on the five participating projects, based on their experiences navigating the 12 steps involved in the process and on the final outcomes.
While all the steps are important, the process is not a linear one, as the diagram below shows.
Figure 2: The overlap among the 12 UFE steps
Source: Utilization Focused Evaluation: A Primer for Evaluators, by Ricardo Ramírez and Dal Brodhead
The UFE approach was applied to a limited extent in assessing the achievements of the COI over the 2010-2012 period. At its outset, the project did not benefit from the establishment of an M&E framework, but the assessment drew upon several UFE steps, such as the participatory definition, with major stakeholders and the PIUs, of the Key Evaluation Questions (KEQs).
It also applied OH principles, in terms of the harvesting of outcomes of the individual sub-projects, to answer the KEQs, and from those answers, derived the extent to which the overall initiative had achieved its initial objectives. Similar to the challenges found in the DECI project, it was challenging in the COI assessment to determine, amid several competing issues, what exactly to evaluate.
Among the achievements reported by the DECI projects were:
(a) change in the perception of evaluations, which were no longer seen as audits demanded by donors, but as learning opportunities; and
(b) enabling one organisation to reflect on its organisational shortcomings and stimulating the thinking of the leadership and stakeholders on a new strategic plan.
A significant outcome of DECI-1 was the Primer prepared with the support of the IDRC, which guides evaluators through the UFE process by demonstrating, with references to the case studies, how the steps were implemented. A second phase (DECI-2) is to be implemented with the objective of providing mentorship in both UFE and Research Communication to selected organisations/projects.
While supporting capacity building in these two fields, the project will test the assumption that simultaneous application of the two approaches will enhance the internal learning culture within projects, and enable projects to focus attention early on communication planning to enhance the reach and use of research outcomes. (There is no doubt that the COI can benefit from the outcomes of this project, given that the research communication component of the initiative needs strengthening.)
Another IDRC-supported initiative, titled “Exploring the Emerging Impacts of Open Data in Developing Countries” (ODDC), will be exploring the use of evaluation methods to understand the impact of Open Data on social change. The two-year project will develop a shared toolkit of research methods that can be used to understand the nature, use and emerging impacts of Open Data in a range of different country and governance contexts around the world. Implemented by the World Wide Web Foundation, the project will collaborate with 17 initiatives being undertaken in Africa, Asia and the LAC.
One initiative collaborating with the ODDC, the Open Data Barometer (ODB), focuses on the context, availability and emerging impacts of Open Government Data (OGD). It intends to support improved understanding of the development of Open Data globally among advocates, researchers and policy makers, while contributing to a growing evidence base on OGD. It builds on the inclusion of a number of Open Data questions in the 2012 Web Index, which covered 61 countries. For 2013, the ODB is a separate study, with an independent expert survey, extended questions and scope, and 81 participating countries.
It is clear that there is a growing body of knowledge and evidence of the utility of the various evaluation approaches, which have evolved over the last decade, and there are a number of tools that can support assessment of the impact of social changes due to various interventions, including ICT4D and Open Data projects.
The emerging approaches are more participatory, user-driven and user-oriented, and lend themselves to blending and fusion to meet the needs of myriad programme contexts and complexities. The main challenge remains for researchers and institutions intent on influencing policy to commit to engaging more with the policy community, to understand its needs and how policymakers are influenced, and to develop the communication skills that enable them to tell their stories in language responsive to those needs.
In the Caribbean, now that Open Data, Open Access and Open Government are being adopted by our governments, it is the right moment to get a handle on these M&E tools and harvest the best of these investments, along with the relevant emerging and contextually valid best practices.
Valerie Gordon is an independent development consultant. Yacine Khelladi is an economist and international ICT4D consultant and project management professional. You can e-mail him at email@example.com.
Recommended reading
Patton, M.Q. (2011) Developmental evaluation: Applying complexity concepts to enhance innovation and use. New York: Guilford Press.
Patton, M.Q. (2008) Utilization-focused evaluation. 4th ed. Thousand Oaks, CA: Sage.
Roche, C. (1999) Impact assessment for development agencies: learning to value change. Oxford, UK: Oxfam. http://policy-practice.oxfam.org.uk/publications/impact-assessment-for-development-agencies-learning-to-value-change-122808
Monitoring: The periodic and systematic collection of data regarding the implementation and results of a specific intervention.
Outcome: A change in the behaviour, relationships, actions, activities, policies, or practices of an individual, group, community, organisation, or institution.
Developmental evaluation: Informs and supports a change agent who is implementing innovative approaches in complex dynamic situations. The process applies evaluative thinking to project, programme or organisational development by asking evaluative questions, applying evaluation logic, and gathering and reporting evaluative data throughout the innovation process.
Utilisation Focussed Evaluation: An approach based on the principle that an evaluation should
be judged on its usefulness to its intended users.
Open Development: An emerging set of possibilities to catalyse positive change through “open” information-networked activities in international development.
Impact Evaluation: The systematic analysis of the significant changes — positive or negative, intended or not — in peoples’ lives brought about by a given action or series of actions.
Outcome Harvest: The identification, formulation, analysis, and interpretation of outcomes to answer useable questions.
United Nations Secretary-General Ban Ki-moon, in comments presented by Alicia Bárcena, ECLAC Chief, at the 34th Conference of Heads of Government of the Caribbean Community (CARICOM), Port of Spain, Trinidad, June 2013
Wilson-Grau, R. & Britt, H. (2012) Outcome Harvesting. Brief.
See list of recommended reading at the end of this paper
Patton, M.Q. (2008) Utilisation Focussed Evaluation. 4th ed.
The steps include: 1. Program/organisational readiness assessment; 2. Evaluator readiness and capability assessment; 3. Identification of primary intended users; 4. Situational analysis; 5. Identification of primary intended uses (PIUs); 6. Focusing the evaluation; 7. Evaluation design; 8. Data collection; 9. Data analysis; 10. Facilitation of use; and 12. Metaevaluation
Ramírez, R. & Brodhead, D. (2013) Utilisation Focused Evaluation: A Primer. http://evaluationandcommunicationinpractice.ca/ufe-primer-get-it-here/