Program Background
- The Peace Action for Rapid and Transformative Nigerian Early Response (PARTNER) project is a 5-year USAID-funded Activity implemented in North Central and Northwest Nigeria by Mercy Corps and consortium partners, including the West African Network for Peacebuilding (WANEP), the Plateau Peace Building Agency (PPBA), the Kaduna State Peace Commission (KSPC), and the Institute for Peace and Conflict Resolution (IPCR), supported by Stopping As Success (SAS+). The program runs from 1 October 2021 to 30 September 2026.
- PARTNER’s goal is to help Nigerian community, government, security, and civil society actors collaborate to increase the effectiveness, local ownership, and sustainability of an inclusive Early Warning Early Response (EWER) system for improved violence prevention, while ensuring a sustainable transition of program management from Mercy Corps to WANEP.
- The program is implemented using tiered geographic targeting: deep engagement in the Tier 1 states of Kaduna and Plateau; growth and expansion in the Tier 2 states of Benue, Kano, Katsina, and Nasarawa, capitalizing on gains made through the Mercy Corps-led, USAID-funded Community Initiatives to Promote Peace (CIPP) program; and planned bottom-up system strengthening in the Tier 3 states of Kogi, Niger, and Zamfara.
- Mercy Corps is expected to provide a sub-grant to WANEP and to develop WANEP's capacity so that the organization may become eligible for a direct award (also known as a “Transition Award”) from USAID or other donors in the future, potentially including a transition to receiving a direct award from USAID for the PARTNER activity.
- According to the terms of the Cooperative Agreement, Mercy Corps was expected to lead the program in a mentorship capacity in Years 1 and 2, working closely with WANEP, which assumes leadership beginning in Year 3. WANEP will coordinate with key resource partners, including the Institute for Peace and Conflict Resolution (IPCR), the Plateau Peace Building Agency (PPBA), and the Kaduna State Peace Commission (KSPC), in the implementation of PARTNER activities.
- PARTNER’s approach is predicated on the theory that IF local actors strengthen inclusive processes for dispute resolution and social cohesion and increase skills and cooperation around timely conflict monitoring, analyzing, and reporting, AND IF a robust national EWER system unifies disparate systems, THROUGH increased state and civil society capacity in and commitment to violence prevention, THEN there will be more proactive and timely responses to early warning, reducing violence and increasing long-term peace.
Purpose of the Learning Evaluation
- With the PARTNER Activity transitioning by September 2024, Mercy Corps, the lead implementer of this Activity, will commission a learning evaluation focused on assessing the merit and worth of the results achieved by the program to date, with results defined as intended or unintended outcomes influenced by the activities of the program through the USAID investment.
Specifically, taking stock of the key insights and learning from program implementation, the learning evaluation is planned to take place from 1–30 September 2024 to achieve the following three purposes:
- Assess the results achieved by the PARTNER Activity to date and establish the enabling and hindering factors behind that achievement. The learning evaluation is designed to 'harvest' PARTNER outcomes and assess how far they verify progress towards the program's strategic objectives in the operational (work) plan and logframe.
- Assess the potential of the PARTNER Program to contribute further to early warning and early response, peacebuilding, and resilience objectives for conflict-affected and vulnerable communities in North Central and North West Nigeria (across Mercy Corps and its implementing partners, state governments, donors, and project participants), and provide strategic recommendations on ideal design and implementation modalities.
- Contribute to broader consortium and organizational learning with regard to performance measurement and tracking for early warning and early response in North West and North Central Nigeria. Specifically:
- Provide insights and justifications on performance to date and recommendations for the final program evaluation.
- Profile the program’s key successes and insights, identifying potential research areas to explore in future early warning and early response programs in North West and North Central Nigeria.
- The findings of the learning evaluation will feed into the final evaluation, which is expected to run concurrently starting September 2024, by providing context-specific learnings and a deep understanding of programme implementation modalities and processes and their effects for the PARTNER Activity. The report will validate and provide context-specific lessons that complement the final evaluation and further strengthen its methodology and evaluation questions.
- Given the overall performance and achievements of the program to date, we hope that the learning evaluation will explore deeper contextual factors, demonstrate program learning towards adaptive implementation of program activities and principles that can be applied, or further refined, to inform collective action at the nexus of humanitarian, peace, and development programming that strengthens resilience in protracted crisis contexts.
- To enable this, the evaluation will review program and assessment reports and documents, state operational plans/strategies, and government documents to further understand the approaches applied during program implementation.
Learning Findings Primary Users:
The primary intended users and uses of the learnings are summarized below:
- USAID, as the main funder of the PARTNER Activity. We hope that the learning will be used to decide whether and how to further support similar early warning and early response initiatives in Northwest and North Central Nigeria.
- Mercy Corps Nigeria and the Consortium Partners. We hope that the evaluation findings will help guide review of program achievements and sustainability plans, refine key program documents (including the program logframe, performance indicators, and tools), develop a funding strategy, and build relationships with funders.
- Implementing partners and the broader international not-for-profit community. For implementing partners and not-for-profit organizations implementing such programs in Northwest and North Central Nigeria, we hope that the findings will be useful in assessing their role in designing and implementing similar programs (co-creation).
- In addition, we hope that the learnings will not only meet the purposes highlighted above but also serve as a learning experience, where the process of generating answers to the proposed learning questions gives Mercy Corps, the implementation team, and consortium members a new understanding of program achievements and informs an effective end-of-program evaluation for PARTNER.
Proposed Learning Evaluation Method and Design:
- While the proposed methodology is not exhaustive, the learning evaluation proposes a qualitative outcome harvesting methodology, adapting outcome mapping principles for this specific assignment. This methodology should provide a non-restrictive opportunity to harvest outcomes, collate successes, and provide contextual insights. In consultation with Mercy Corps and the implementation team, the evaluation team may add to and/or refine the proposed methodology to effectively respond to the proposed learning questions.
- At minimum, the evaluation team will review program documents, conduct in-depth key informant interviews with the program implementation team, and facilitate focus group discussions with program participants to understand program achievements and outcomes to date, generating the required insight and learning on their potential use for early warning and early response, social cohesion, and peacebuilding programming activities and strategies. Participants for these focus groups will be drawn from the three objectives of the Activity, with the breakdown below:
- FGDs with different target groups supported by different interventions: men, women, and male and female youth, across all LGAs covered (at least 8)
- KIIs with different target groups supported by different interventions (at least 6)
- KIIs with government stakeholders (at least 5)
- KIIs with select PARTNER staff (at least 3)
- The final sampling frame will be developed during inception together with the consultants.
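As a purely illustrative check, the minimum number of qualitative sessions implied by the breakdown above can be tallied as follows (the labels are paraphrased from this SOW, and the final sampling frame is agreed at inception):

```python
# Minimum qualitative sessions implied by the SOW breakdown above.
# Labels are paraphrased; the final sampling frame is set at inception.
MIN_SESSIONS = {
    "FGDs with target groups across LGAs": 8,
    "KIIs with target groups": 6,
    "KIIs with government stakeholders": 5,
    "KIIs with select PARTNER staff": 3,
}

def minimum_total(sessions):
    """Sum the per-category minimums into an overall floor."""
    return sum(sessions.values())

print(minimum_total(MIN_SESSIONS))  # 22
```

At least 22 sessions in total, before any additions agreed with the consultants during inception.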
Stakeholder evaluation matrix
| PARTNER Project Stakeholder | Evaluation Dimension | Evaluation Questions | Data Collection Methodology |
| --- | --- | --- | --- |
| Program Staff (CoP, DCoP, and Program Manager) | Design and Relevance | Is the program appropriate for the context where it is being implemented? Does the way the program is structured enable it to meet the objectives/outcomes? | Key Informant Interviews |
| Program Staff (EWER Advisors, Project Officers) | Efficiency | How cost-effective is the implementation of the program? Could the same results have been achieved with fewer resources? How appropriate are the execution and implementation modalities? How effective are support-cost arrangements with program implementing partners? | Key Informant Interviews |
| Program Participants (participants who benefited from trainings and capacity building) | Effectiveness | Has the program caused any particular change in the participants' lives or the lives of others, whether in terms of physical or social conditions, social cohesion, youth engagement, conflict management and peacebuilding outcomes, or other indicators of success? How significant was this change and how did it happen? What component(s) and elements of the program were responsible for these changes? | Surveys + Focus Group Discussions |
| Project Participants | Sustainability | Will the changes caused by the program continue beyond the life of the program? Why or why not? Has the program resulted in knowledge and insights that can help the intervention ensure sustainable impact for participants at scale? | Surveys + Focus Group Discussions |
- Understanding the program outcomes (as observable behavior changes of social actors to which the program contributed) should lead to the formulation of learning questions that serve as a starting point for unpacking the program outcomes further through the structured interviews proposed above (see the KII and FGD breakdown), conducted with all relevant program participants, including leadership and staff, to consolidate learning. The methodology should be participatory in approach, engaging those with the most knowledge of the observable changes that have taken place through the program's activity implementation. It should also provide for substantiation of the documented learnings, supplying information that can verify learning and insights to strengthen credibility and deepen understanding.
- Activities have been implemented using an RCT design aimed at testing the impact of PARTNER’s EWER systems under Program Objectives 1 and 2, along with the added impact of Objective 3. The program has pursued two learning themes: 1) when, how, and where relevant actors effectively respond to incidents (or not); and 2) whether communities participating in PARTNER experience reduced violence compared to control sites (and if so, by how much). The RCT has focused on the second theme, in order to assess whether EWER helps prevent, limit, and/or better mobilize responses to violence. We are also interested in understanding whether social cohesion initiatives help reduce violence, either by improving or amplifying the efficacy of EWER or by addressing structural drivers of conflict.
- A core hypothesis behind the RCT was that more reliable information about violence and conflict-related risks, if provided in a timely manner to appropriate authorities who have the tools and willingness to respond, will enable communities to stop, prevent, and/or mitigate outbreaks of violence (or further escalation). Another hypothesis, in line with the program’s theory of change, is that strengthening social cohesion will improve communities’ capacities in and commitment to violence prevention, and lay the groundwork for long-term peace.
These hypotheses are based on several critical assumptions, including:
- Local and community actors are willing to report conflicts and threats to conflict through established tracking mechanisms and have confidence and trust in those administering them.
- Violent events in the community are preceded by at least some indications that are detectable by local and community actors.
- Individuals working in the conflict management structures are the appropriate people, and communities know and understand the role and existence of the peace structures and reporting mechanisms.
- Individuals and agencies responsible for responding to warnings (e.g., police and security forces) are able and willing to respond.
- There is available local infrastructure to support public information campaigns, social media and traditional media channels remain accessible to all community members, including youth, women, and vulnerable groups, and participation does not put community members at risk.
Rationale of RCT:
- Increasingly sophisticated tools have been developed to predict violence and conflict events, and early warning and early response (EWER) systems aimed at preventing violence and mass atrocities have proliferated in recent years. The form and function of EWER systems vary by context, but there is little rigorous evidence regarding the impact of these interventions: specifically, whether they actually lead to meaningful reductions in violence or other conflict risks. Given the prevalence of these kinds of interventions, we need rigorous evidence regarding their effectiveness so we can replicate and adapt these systems and justify continued investment in them. Most studies of EWER systems are based on qualitative analysis of cases and observational data, which struggle to demonstrate causal relationships.
- Some assessments have indicated that these systems can be successful in generating early warnings, but may be less effective at generating the kinds of responses necessary to actually stop or prevent violence. This RCT therefore fills a major evidence gap in the literature on violence and conflict prevention. While Mercy Corps has employed EWER interventions in various programs, this would be the first explicit test of the approach. The study will also deepen our understanding of the impact of initiatives aimed at building social cohesion, using comparative analysis of data from treatment and control sites. Specifically, the RCT will test whether these initiatives are necessary for, or at least help improve the quality of, an effective EWER system, or whether they act as a long-term peace multiplier on top of interventions meant to stop or prevent violence (such as EWER).
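The RCT's core comparison — violence outcomes in treatment communities versus control sites — can be sketched as a simple permutation test. All community figures below are hypothetical placeholders, not program data:

```python
import random
from statistics import mean

# Hypothetical incident counts per community; NOT actual PARTNER data.
treatment = [4, 2, 5, 3, 1, 2]   # communities with EWER support
control = [7, 6, 5, 8, 4, 6]     # comparison sites

def permutation_pvalue(t, c, n_iter=10_000, seed=0):
    """Two-sided permutation test: share of random re-labelings whose
    mean gap is at least as large as the observed treatment-control gap."""
    rng = random.Random(seed)
    observed = abs(mean(t) - mean(c))
    pooled = list(t) + list(c)
    hits = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        gap = abs(mean(pooled[:len(t)]) - mean(pooled[len(t):]))
        if gap >= observed:
            hits += 1
    return hits / n_iter

effect = mean(treatment) - mean(control)
print(round(effect, 2))  # -3.17 (fewer incidents in treatment, in this toy data)
print(permutation_pvalue(treatment, control))
```

In the actual study, the estimate would of course come from the program's pre-registered analysis plan rather than a toy comparison like this; the sketch only illustrates the comparative logic of treatment versus control.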
Results and outcomes of interest include:
The following indicators are of interest to the PARTNER Activity:
- # of warnings/incidents reported through the EWER system;
- % of community members who are aware of EWER system;
- % of community members who heard an EWER warning at some point;
- Accuracy of warnings;
- Prevalence of rumors/false reports about violence;
- Specificity and timeliness of warnings (e.g., average time between reporting and responses);
- % of community members who feel that government, security, and justice actors are responsive to requests to prevent violence and/or effectively resolve violent situations;
- % of local reporting group, EWRG members, and EWER cluster members who find most of the information they need about a particular incident through the platform;
- % of members of reporting groups who feel confident in their ability to identify and report early warning signs;
- % people who report improved knowledge or skills in conflict resolution;
- Number of disputes resolved;
- Frequency and severity of violence;
- Timeliness and effectiveness of responses by security forces (e.g., how quickly police arrive when there is violence/tension/warning)
- % of community members who believe their communities are peaceful, safe, and secure;
- % of community members who believe that those responsible for conflict management in their area are effective at resolving conflicts in their community;
- Attitudes regarding peace and the use of violence;
- Perceptions of government and security forces (e.g., trust in, willingness to cooperate with, etc.);
- Social cohesion between communities in conflict.
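Many of the percentage indicators above reduce to the share of affirmative survey responses. A minimal sketch follows; the field name `aware_of_ewer` and the records are hypothetical, not from the actual PARTNER survey instrument:

```python
# Hypothetical survey records; "aware_of_ewer" is an illustrative field
# name, not one taken from the PARTNER instrument.
responses = [
    {"respondent": "R1", "aware_of_ewer": True},
    {"respondent": "R2", "aware_of_ewer": False},
    {"respondent": "R3", "aware_of_ewer": True},
    {"respondent": "R4", "aware_of_ewer": True},
]

def pct(records, field):
    """Percentage of records where `field` is truthy."""
    if not records:
        return 0.0
    return 100.0 * sum(1 for r in records if r[field]) / len(records)

print(pct(responses, "aware_of_ewer"))  # 75.0
```

The same helper would apply to any of the "% of community members who..." indicators, given a boolean field per indicator.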
Factors Affecting Successful Implementation and Results Achievement:
- Is program implementation and results achievement proceeding well and according to plan? Are there any obstacles, bottlenecks, or outstanding issues on the program, partner, or donor side that are limiting the successful implementation and results achievement of the program?
External factors:
- To what extent does the broader context and policy environment remain conducive to achieving intended results, including policy impact and replication of the lessons being learned from program implementation?
Specifically, in this regard, to what extent do critical assumptions (refer to log frame) on which program success depends still hold?
- Are there any other factors external to the program that are affecting successful implementation and results achievement?
- How well do local peace institutions involve/include all groups in processes to strengthen peace and resilience?
- How does relevant stakeholder inclusion in local peace institutions strengthen resilience and foster a collective peacebuilding agenda?
- How well are international agencies relating and working with government, security agencies and CSOs to promote peace and good governance?
Program-related factors:
Program design (relevance and quality):
Consider the following:
- Is the program concept/logic and design optimal to achieve the desired project objectives/outputs?
- In assessing design consider, among other issues:
- Were relevant gender issues adequately addressed in program design?
- Was the institutional focus of the program appropriate for the context of implementation? Was the context identified the most strategic and viable choice of unit for effective implementation of the program activities? Is the program sufficiently focused on other institutional levels in view of its objectives?
- Was the program preparation process (formulation, inception) and its products (log frame, Project Operations Plan, Annual Work plans) adequately designed to ensure quality implementation and measurement?
- Did the program document include adequate guidelines for implementation of the program?
- Do the program objectives remain valid and relevant? Will they result in strategic value added if they are achieved? Does the program design and document need to be reviewed and updated?
Project Management:
- Technical backstopping: Is technical assistance and back-stopping from PARTNER and its partners appropriate, adequate, and timely to support the program in achieving its objectives?
- Are there any other program-related factors that are affecting successful implementation and results achievement?
Partner Strategic Positioning and Partnerships
Is the PARTNER Activity and its engagement in the state and the country, optimally positioned strategically, with respect to:
- The donor/government efforts on the same sector in Tier 1 and 2 States?
- Implementing national priorities, as reflected in national development strategies?
- Donor priorities, and leveraging its comparative advantages to maximum effect?
- Is PARTNER leveraging its actual/potential partnerships to maximum effect?
- What level of value added and significance can be attributed to the program intervention in implementation?
Learning Evaluation Criteria:
- The learning questions developed for this evaluation will be further informed by the program design, the PARTNER Theory of Change, and the Logical Framework. In addition to generating learnings, the PARTNER Activity will be keen to understand the following:
Relevance and Design:
- Is the program appropriate for the context where it is being implemented? Does the way the program is structured enable it to meet the objectives/outcomes?
- To what extent has the program considered people's different needs according to age, gender, ethnicity, and other social identities? How has it adapted to meet those differing needs?
- How has the program ensured that participants' voices are heard and reflected in both program activities and, more broadly, in PARTNER's interaction with government bodies and other stakeholders?
- What aspects of the design should be changed to enhance performance of the program (in terms of implementation and management arrangements)?
Efficiency:
- How cost-effective is the implementation of the program?
- Could the same results have been achieved with fewer resources?
- How is the program leveraging insight and new knowledge in the process of implementation? How is this knowledge being shared and contextualized towards efficient project implementation?
- How appropriate are the execution and implementation modalities?
- How adequate is the support provided by Program Management Teams and PaQ?
- How effective are support-cost arrangements with program Implementing partners?
- Do stakeholders, particularly the direct participants, participate in the management of the program? If yes, what is the nature and extent of their participation, including by gender?
Effectiveness:
- Is PARTNER, as a consortium, implementing the program as planned, and if not, why not?
- Has the program caused any particular change in the participants' lives or the lives of others, whether in terms of physical or social conditions, social cohesion, youth engagement, conflict management and peacebuilding outcomes, or other indicators of success? How significant was this change and how did it happen?
- What component(s) and elements of the program were responsible for the change stated above?
- How useful are the activities and outputs to the needs of the direct and ultimate participants? Is there a significant gender differentiation in the usefulness of the outputs to direct and ultimate participants?
- Do the outputs contribute to the achievement of the immediate objectives of the program?
- What factors affect the implementation of the program?
Impact:
- Are there early indications of potential success? What observable changes are currently being manifested?
- What are the program's unintended effects and how have they influenced the outcomes?
- How has the program contributed to the development of the capacity of the direct participants to respond and adapt to shocks and stresses?
- Are there signs of likely impact of the program beyond the direct and ultimate participants/beneficiaries?
Sustainability:
- Will the changes caused by the program continue beyond the life of the program? Why or why not?
- Has the program resulted in knowledge and insights that can help the intervention to ensure sustainable impact for the participants at a scale?
- How has the program worked with partners to increase their capacity in a sustainable way?
- In addition to assessing the questions above, the evaluation team will analyze any other pertinent issues that need addressing or which may or should influence future program direction and PARTNER engagement in the states and country.
Transcription:
- Qualitative data collected from the field will be transcribed by the enumerators responsible for data collection, under the guidance of the Consultant. Each team will be responsible for transcribing the data using the guide from the analytical framework and the transcript tool that will be shared.
Implementation Strategy and Work Plan:
- Mercy Corps will manage the learning evaluation process in coordination with the evaluation team (external consultant), providing technical support in refining the evaluation methodology and data collection tools, along with inputs and all supporting documents to guide the design and finalization of the evaluation methodology and data collection instruments. As the evaluation is largely qualitative, the engagement of enumerators is not expected; however, Mercy Corps has a pool of enumerators ready to be engaged should the need arise.
- The learning evaluation is expected to commence and be completed by xxx Sept 2024. Below is an overview of the activities, their duration, and the stakeholders responsible during the learning evaluation process. The duration/level of effort is tentative and will be finalized once the evaluation team has been engaged.
| Duration | Activity | Stakeholder |
| --- | --- | --- |
| 1 Day | Entry and inception: review the learning evaluation SOW with the External Evaluators to clarify the timeframe, assignment objectives, and available budget; agree on assessment tools, time schedules, and delivery period for data collection activities. | PARTNER CoP, PARTNER MEL Advisor, Country MEL Manager, PaQ Director, PaQ Manager, and External Evaluator/Consultant |
| 2 Days | Develop and share inception report with detailed evaluation design for feedback and inputs from the PARTNER implementation teams. | External Evaluator/Consultant |
| 2 Days | Provide feedback on the inception report and assessment tools for the External Evaluator to incorporate. | PARTNER CoP, PARTNER MEL Advisor, Country MEL Manager, PaQ Director, PaQ Manager, and External Evaluator/Consultant |
| 2 Days | Provide final versions of the inception report and assessment tools; translate assessment tools as needed. Consultant/firm submits a complete draft of the Data Analysis Plan (DAP) to Mercy Corps' POC (a complete draft includes dummy tables, placeholders for charts/graphs/images, and a description of how data triangulation/synthesis will be conducted and how qualitative data will be analyzed). Mercy Corps' POC distributes the draft DAP to all Mercy Corps reviewers (and donor reviewers, if required) and returns consolidated feedback to the consultant/firm. Consultant/firm submits the final DAP to Mercy Corps' POC, having addressed all consolidated feedback. | External Evaluator/Consultant; PARTNER CoP, PARTNER MEL Advisor, Country MEL Manager, PaQ Director, and PaQ Manager |
| 10 Days | Initiate and conduct comprehensive data collection with programme teams and programme participants. | External Evaluator/Consultant |
| 2 Days | Encode and analyze data collected. | External Evaluator/Consultant |
| 2 Days | Prepare and share draft assessment report. | External Evaluator/Consultant |
| 2 Days | Provide detailed feedback on the draft report (one round of revisions). | PARTNER CoP, PARTNER MEL Advisor, Country MEL Manager, PaQ Director, PaQ Manager, and External Evaluator/Consultant |
| 1 Day | Hold a virtual feedback/validation workshop with key Mercy Corps staff to present the reviewed findings and collate final feedback. | External Evaluator/Consultant, PARTNER CoP, PARTNER MEL Advisor, Country MEL Manager, PaQ Director, and PaQ Manager |
| 4 Days | Finalize report (not more than 30 pages; all other additions as annexes), produce a presentation of findings, update key program documents, and share back with MC. Deliver data sets, code books, syntax, etc. to Mercy Corps' POC. | External Evaluator/Consultant |
| Total Days | 28 Days | |
Proposed Assessment Report Structure & Content
In line with Mercy Corps Branding guidelines and templates, the following is proposed for the evaluation deliverable documents.
- Cover Page, List of Acronyms
- Table of Contents
- Executive Summary: This section should be a clear and concise stand-alone document that gives readers the essential contents of the assessment report, including a summary of major constraints, opportunities, and recommendations.
- Methodology: This section should be sufficiently detailed to clarify the assessment methodology and support the accuracy of the report and its findings. It should address constraints and limitations of the methodology and the implications of these limitations for the findings, including whether and why any of the assessment findings are inconclusive.
- Results: This section should provide clear assessment findings. Reference should be made to any assessment/evaluation information as well as existing literature etc.
- Synthesis, Recommendations and Lessons Learned: This is space for the evaluation team to reflect on the data and results, make concrete recommendations for current or future programme design and improvements, and generally comment on the data and results.
- Conflicts of Interest: Disclose any conflicts of interest or the appearance of conflicts of interest, including the interest of programme staff in having a successful programme.
- Annexes: These should include a complete file of data collection instruments in English and translations to Hausa; list of stakeholder groups with number and type of interactions; SOW, qualitative protocols developed and used, any data sets (these can be provided in electronic format), any required photos, participant profiles or other special documentation needed.
- The evaluator (consulting firm) is expected to deliver a comprehensive, professional-quality final assessment report in soft copy (PDF and Word, submitted electronically), along with the analysis plan and the data set. The full report must not exceed 30 pages (exclusive of annexes and attachments).
Firm / Consultant Qualifications and Expertise
- Educational Qualification: A minimum of a Master's Degree in Peace and Conflict Studies, Development Studies, Economics, Social Sciences or a related field.
- Professional Experience: Extensive experience (at least 5 years) in conducting research and analysis in the peace, democracy, and governance field, particularly in early warning and early response and social cohesion.
- Proven track record of successfully leading similar assessments or studies, and an understanding of USAID regulations.
- Technical Skills: Proficiency in conducting literature reviews and qualitative data collection and analysis. Strong analytical skills to identify trends, challenges, and opportunities within peacebuilding interventions. Familiarity with qualitative and quantitative research methods, including surveys, interviews, and focus group discussions. Ability to develop comprehensive reports with actionable recommendations for stakeholders.
- Communication and Reporting Skills: Excellent communication and interpersonal skills to facilitate interviews, surveys, and focus group discussions, and proficiency in report writing and presentation.
- Project Management skills: Strong project management skills to oversee the entire research process from inception to final deliverables. Ability to develop work plans, manage timelines, and coordinate fieldwork activities. Attention to detail and ability to adhere to project deadlines and budgets.
- Language Proficiency: Proficiency in English and Hausa is required; knowledge of other local languages spoken in Plateau and Kaduna States would be an advantage for effective communication with stakeholders.
- Ethical Standards: Committed to upholding ethical standards in research, including obtaining informed consent, ensuring confidentiality, and respecting the rights and dignity of research participants.
Selection Criteria:
| S/N | Technical Areas | Score |
| --- | --- | --- |
| 1 | Proposal: The consultant must present a detailed proposal highlighting experience in a related field and areas of expertise. | 20 |
| 2 | Expertise & technical know-how: The consulting firm must have at least 5 years of professional experience in conducting literature reviews, data analysis, and learning assessments; be familiar with qualitative and quantitative research methods, including surveys, interviews, and focus group discussions; and be able to develop comprehensive reports with actionable recommendations for stakeholders, with demonstrated expertise in research methodologies (data collection, analysis, interpretation, and report writing), particularly in the peacebuilding sector. | 40 |
| 3 | Understanding of Context: The consultant must have an understanding of the cultural, social, and economic factors of Plateau and Kaduna States. | 20 |
| 4 | Budget: A detailed budget including deliverables and timelines of activities. | 20 |
| | Total | 100 |
Payment Terms:
| Deliverable | Payment Terms |
| --- | --- |
| Inception report with data collection tools | 30% |
| Final report | 70% |
Note
- The duration of the assignment begins when the consultant signs onto the consultancy.
Application:
Applicants should submit a proposal capturing the information below:
- Name of firm, name and contact of focal person, office address, registration type and number of the company, and a description of what the company does.
- Consultant(s) profile/CV, including details of academic and/or professional history
- Evidence of similar work conducted.
- A concept note of not more than five pages, including previous experience in related activities.
- Budget summary of total cost
- Program activities against a timeline (workplan), based on the activities proposed.