Ukraine Job Openings
International Rescue Committee
Final Evaluation of Integrated Emergency Response for Crisis-Affected Persons in Ukraine - Consultant
FULL TIME
August 27, 2024
a. Relevance/Appropriateness:
Were interventions appropriate and effective for the target group based on their needs? Which target groups and individuals were reached by the interventions?
b. Efficiency:
Are the most efficient approaches being used to convert inputs into outputs?
To what extent have the activity’s interventions adhered to planned implementation schedules?
What was the level of efficiency and timely delivery of the goods or services?
c. Effectiveness:
To what extent do the activity’s interventions appear to have achieved their intended outputs and outcomes?
Which interventions appeared to be more or less important to achieving activity outcomes? How did these results correspond to the changes hypothesized by the activity’s Theory of Change?
d. Impact:
To what extent did the activity help prevent individuals and households from adopting negative coping strategies?
e. Coordination:
How were IRC and partners’ health and protection activities integrated with each other, and did that approach to integration increase the effectiveness of the interventions?
f. Protection:
How do the interventions mainstream the protection of all groups that comprise the affected population?
Deliverables:
The selected consultant is expected to provide a detailed proposal indicating the methodologies that will be used for the evaluation of the project components. A mix of quantitative and qualitative methods is expected. To answer the key evaluation questions set out in the TOR, the consultant should propose a design and methods for collecting quantitative and qualitative data from primary and secondary sources, including sampling techniques, data collection tools and procedures, and methods for compiling, analyzing, and systematically presenting the data. IRC highly encourages the use of electronic data collection, especially for quantitative data.
The methods should address the sampling technique (sampling frame and sample size), quantitative and qualitative data collection techniques, secondary data inputs, measures for data quality assurance, data cleaning and management, the data analysis techniques and software to be used, ethical considerations and limitations, and report generation.
The methodology and framework for the evaluation should be designed based on a participatory approach with inputs sought from the stakeholders, capturing views of the target beneficiaries, including the perspectives of vulnerable communities and groups. Moreover, the consultant is expected to communicate and gather views from the beneficiary community, government stakeholders, and health facilities, using the relevant tools.
Data Collection Procedure: A mixed-methods approach to evaluation data collection is recommended; the data collection process should follow the procedures and steps below.
Desk review of key documents: Before the start of the fieldwork, the consultant must review the available key project documents, mainly the project proposal, periodic reports, and related studies, to understand the context and nature of the project as well as the key thematic areas and the log frame.
Field-based data collection: The consultant is expected to conduct field-based engagement with key project stakeholders, clients, and government representatives, including among others:
- IRC technical and field-based staff.
- Key government stakeholders, such as sectoral regional government offices, including health offices and facilities, the social protection office, etc.
- Clients, including sample host communities, returnees, and IDPs (women, men, boys, girls, and other vulnerable groups).
The consultant’s proposal should also cover:
- The evaluation design
- Direct consultation with relevant actors/experts
- The data collection process (sampling method, sampling frame, instruments, protocols, and procedures)
- Procedures for analyzing quantitative and qualitative data
- Data presentation/dissemination methods
- Report writing and sharing
Sampling Strategies:
This evaluation will employ a mixed methods approach that combines quantitative and qualitative data collection from both primary and secondary sources, to achieve more comprehensive coverage and in-depth analysis through triangulation of data from various sources. For the quantitative component, the consultant will follow the recommendation from BHA_emergency_M&E_guidance_February_2022. Accordingly, a one-stage simple random sampling (SRS) strategy will be used, as a list of all beneficiaries is available. This approach does not require advanced knowledge of survey statistics, and sampling weights are not needed, making it well suited to emergency contexts. The IRC and the consultant will also analyze the evolving security situation and its impact on the geographical locations to be included in the sampling strategy; depending on the security situation in the target locations, the sampling strategy may be changed to a two-stage cluster sampling strategy. Overall, the consultant is required to use an appropriate sampling strategy that is statistically representative of the population of interest (POI), and the actual sample size will be calculated accordingly (at a 95% confidence level and 5% margin of error).
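As a sketch of the sample-size arithmetic implied by the 95% confidence level and 5% margin of error, assuming the conservative 50% expected proportion and an illustrative beneficiary-list size (the TOR does not state either figure):

```python
import math

def srs_sample_size(population: int, z: float = 1.96,
                    margin: float = 0.05, p: float = 0.5) -> int:
    """Cochran's formula with a finite population correction.

    z=1.96 corresponds to a 95% confidence level; p=0.5 is the most
    conservative assumed proportion; the population size is illustrative.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2  # infinite-population size (384.16)
    n = n0 / (1 + (n0 - 1) / population)       # finite population correction
    return math.ceil(n)

# e.g. a beneficiary list of 2,000 households would need 323 completed surveys
print(srs_sample_size(2000))  # → 323
```

For a two-stage cluster design, the required size would additionally be inflated by a design effect, which this simple sketch does not include.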
For the qualitative data collection, purposive sampling will be used to enroll participants in the FGDs and KIIs. In addition, observation and document review will be used as part of the qualitative method. The evaluation team will conduct KIIs to assess the evaluation questions (relevance, effectiveness, efficiency, impact, coordination, and protection). At least one key informant will be selected per stakeholder group from the targeted oblasts/regions (Migration Office, Administrative Service Office, IDP Councils, Municipal and City Councils, Departments of Social Policy and Social Protection, Centers for Social Services for Families, National Police departments, Departments of Education, Centers for the Provision of Administrative Services, paramedic and obstetric stations, health points, etc.). Moreover, FGDs will be conducted with various groups of targeted beneficiaries (men, women, boys, girls, persons with disabilities, and elderly people) to collect qualitative information on the emergency health and protection interventions. Overall, the evaluation team will undertake 20 FGDs, depending on the target group by location, with a maximum of six types of community groups in the targeted locations/oblasts.
The above methods were designed in light of the fact that this project is implemented in an emergency setting with a rapidly changing context and frequent movement of people from place to place. According to the M&E plan, there are four outcome indicators, all of which are custom indicators suitable for an emergency setting.
The outcome indicators under this project are:
- % of clients who received psychosocial support or case management and report an improved sense of wellbeing
- % of surveyed children participating in IRC’s CP programs who report an improvement in their sense of safety and well-being
- % of surveyed women and girls participating in WPE programs who report knowing where someone can get support if they experience violence
- % of clients who are satisfied with IRC services per the standard satisfaction survey
This evaluation will employ a mixed methods approach that combines quantitative and qualitative data collection and analysis techniques. The quantitative data will consist of survey records, whereas the qualitative data will consist of key informant interviews, focus groups, observations, and document review.
The mixed methods approach will be employed to integrate quantitative and qualitative data to answer the evaluation questions. This will provide a more comprehensive and holistic picture of the evaluation context, process, and outcomes than either method alone. To elaborate on the analysis requirements, the evaluator will ensure that data from the target-based surveys is analyzed quantitatively using a standard statistical software package to generate descriptive statistics such as frequencies, proportions, and means. Thematic analysis, using key themes related to the evaluation objectives (grouping responses by theme), will be employed for qualitative data analysis.
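A minimal sketch of the descriptive statistics described above (frequencies, proportions, means), using only the standard library; the record layout and field names are illustrative assumptions, not the actual survey schema:

```python
from collections import Counter
from statistics import mean

# Illustrative survey records; the field names are hypothetical.
records = [
    {"oblast": "Kharkiv", "satisfied": True,  "wellbeing_score": 4},
    {"oblast": "Kharkiv", "satisfied": False, "wellbeing_score": 2},
    {"oblast": "Sumy",    "satisfied": True,  "wellbeing_score": 5},
]

# Frequencies by oblast, plus an overall proportion and mean.
freq = Counter(r["oblast"] for r in records)
proportion_satisfied = sum(r["satisfied"] for r in records) / len(records)
mean_score = mean(r["wellbeing_score"] for r in records)

print(freq)                            # Counter({'Kharkiv': 2, 'Sumy': 1})
print(round(proportion_satisfied, 2))  # 0.67
print(round(mean_score, 2))            # 3.67
```

In practice the same statistics would come from whichever statistical package the evaluator proposes; this only illustrates the outputs expected.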
The data analysis includes triangulation of quantitative data with qualitative findings, as well as data from primary sources with secondary data. The triangulation step uses multiple data sources and methods to validate and enrich the findings of the evaluation.
The steps that will be taken to triangulate data from different sources are:
- Data will be drawn from multiple sources (qualitative and quantitative), as described in the methodology section (surveys, key informant interviews, focus group discussions, observations, and document review).
- Data from different stakeholders, such as beneficiaries, staff (IRC and partner), and government offices will be collected to capture diverse perspectives.
- During the data analysis step, quantitative and qualitative data will be analyzed separately using appropriate methods for each (statistical analysis for quantitative data and thematic analysis for qualitative data).
- After the initial analysis, the results will be compared across different data sources and methods to identify converging themes or patterns.
- The findings from different data sources and methods will be integrated into a coherent narrative that highlights how different pieces of data complement each other and provide a fuller understanding of the research questions.
- Once the report is generated, the findings and interpretations will be validated and cross-checked with stakeholders, including through peer review involving IRC technical advisors, to confirm the analysis and ensure the conclusions are well supported by the data/evidence.
- To ensure the accuracy and reliability of collected data, several quality control measures will be implemented at different stages (survey design, data collection, and data cleaning).
- Quality control measures will be implemented at the survey design stage by creating a comprehensive plan that outlines the data collection methods, tools, timelines, etc., and by ensuring the alignment of the survey design with the evaluation questions.
- During the data collection stage, enumerators/data collectors will be trained on the questionnaire and tools, and all tools will be pilot tested before actual data collection.
- For the beneficiary survey, a digital data collection platform (KoboToolbox/CommCare) will be used to ensure data quality during collection by implementing validation criteria and real-time data quality checks.
- Supervisors will be assigned to oversee data collection in the field, providing real-time guidance and support to data collectors.
- The data collection process will be regularly monitored to ensure adherence to protocols, and the collected data will be regularly reviewed during the data collection phase to identify and correct errors, inconsistencies, or missing data early in the process.
- Finally, after data collection, a thorough data cleaning process will be performed to address any errors, outliers, or inconsistencies. As part of data cleaning, actions taken will include removing duplicate entries, correcting erroneous entries, and standardizing data formats.
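The cleaning actions listed above can be sketched as follows; the record layout, key names, and incoming date format are assumptions for illustration, not the project's actual data schema:

```python
from datetime import datetime

def clean(records):
    """Deduplicate, strip stray whitespace, and standardize date formats.

    Assumes each record is a dict with 'id', 'name', and 'date' keys,
    where 'date' may arrive as DD/MM/YYYY and is standardized to ISO.
    """
    seen, cleaned = set(), []
    for r in records:
        if r["id"] in seen:            # remove duplicate entries
            continue
        seen.add(r["id"])
        r = dict(r)
        r["name"] = r["name"].strip()  # correct stray whitespace
        # standardize DD/MM/YYYY to YYYY-MM-DD
        r["date"] = datetime.strptime(r["date"], "%d/%m/%Y").date().isoformat()
        cleaned.append(r)
    return cleaned

rows = [
    {"id": 1, "name": " Olena ", "date": "03/10/2024"},
    {"id": 1, "name": "Olena",   "date": "03/10/2024"},  # duplicate entry
    {"id": 2, "name": "Ivan",    "date": "05/10/2024"},
]
print(clean(rows))
```

A platform such as KoboToolbox can enforce some of these rules (e.g. date formats) at entry time via form constraints, reducing the cleaning burden afterwards.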
The evaluator will ensure that the different groups within the target beneficiaries are carefully sampled and represent the diverse groups that benefited from the project, in order to obtain meaningful insights and learn the specific needs of different subgroups. To this end, careful planning will be put in place during the design stage to ensure that the diverse groups within the target population are represented (gender, age, disability status, profile (IDP, host community), geographic location, etc.). In the data analysis stage, the data will be analyzed separately for each group to gain more insight into the perspectives of the different groups.
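Analyzing the data separately for each subgroup, as described above, can be sketched like this; the grouping fields mirror the disaggregation categories named in the text, but the records and scores are illustrative assumptions:

```python
from collections import defaultdict
from statistics import mean

# Illustrative responses; field names echo the disaggregation
# categories above (gender, profile) and are hypothetical.
responses = [
    {"gender": "female", "profile": "IDP",  "score": 4},
    {"gender": "female", "profile": "host", "score": 5},
    {"gender": "male",   "profile": "IDP",  "score": 3},
]

def disaggregate(rows, key):
    """Mean score computed separately for each subgroup under `key`."""
    groups = defaultdict(list)
    for r in rows:
        groups[r[key]].append(r["score"])
    return {g: mean(v) for g, v in groups.items()}

print(disaggregate(responses, "gender"))   # {'female': 4.5, 'male': 3}
print(disaggregate(responses, "profile"))  # {'IDP': 3.5, 'host': 5}
```

The same per-group computation extends to any of the other categories (age band, disability status, oblast) by adding the corresponding field.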
Schedule:
The consultant will evaluate the entire life of the project up until the period of evaluation. Required deliverables are the following:
Secondary Data: Review of secondary data from varied and relevant sources that must be specified in the consultant’s proposal, including, but not limited to, relevant project documents. However, the evaluator should also provide other key secondary documents to be used, relevant to the target localities. Activity documents available for evaluators include:
- Proposal documents (technical proposal narrative, budget, work plan, M&E plan, etc.)
- Program reports to date
- Performance monitoring data
- Documents as requested by the evaluator
Inception Report: Within ten working days of the contract award, the consultant must submit a detailed inception report indicating the evaluation design and operational work plan, including the methodology, sample size, and proposed data collection and analysis methods to address the key questions of the evaluation. The inception report shall also include questionnaires and interview protocols.
Evaluation Design: This will include the development of tools, data entry, and data analysis plan for designing the entire evaluation study.
Tools and Methodologies: The evaluator will be expected to share the tools and methodology to be employed in this evaluation with IRC, and incorporate any feedback provided.
Data Quality: The evaluator will ensure that, throughout the process of data collection, all necessary measures are undertaken to ensure that data quality management is adhered to. The proposal must clearly outline these quality control measures. The evaluator should also propose a digital data collection methodology to be used. All datasets and syntax files used in software analysis packages will be submitted to the IRC.
Data Analysis and Interpretation: The evaluator will have to ensure proper analysis of all quantitative and qualitative data/information collected. Data needs to be disaggregated by household profile, gender and any other social-economic categories, as applicable.
Regular Coordination: Throughout the study, the evaluator will maintain regular contact with the IRC team contact person to ensure the organization is aware of progress throughout the evaluation period.
Dissemination of Results: The evaluator, with support from IRC, will conduct three dissemination meetings. The first involves key staff, the second meeting focuses on review of the draft report to ensure transparency and consensus, and the final meeting includes additional stakeholders for final validation.
Evaluation Report: The evaluator will submit detailed and accurate reports (draft and final) as per the evaluation objectives to the IRC within the agreed timeframe, in the specified format outlined below.
The evaluation is expected to begin no later than September 23, 2024. This date and those below for the key deliverables under this consultancy may be adjusted in the final agreement that will be developed later, in consultation with the evaluator hired. The final evaluation report and summary and/or infographic brief(s) should be submitted to the Senior MEAL Coordinator no later than November 29, 2024.
Key Evaluation Activities and Due Dates:
- Kickoff meeting: 23-Sep-24
- Draft inception report (including tools): 4-Oct-24
- Final inception report and site selection: 8-Oct-24
- Field work (data collection): 9-Oct-24
- Presentation of the draft evaluation findings and first meeting: 15-Nov-24
- Draft of the final evaluation report and infographic brief(s): 21-Nov-24
- Meetings with the project technical team in the country office and field office and partner representatives: 25-Nov-24
- Third meeting, with IRC headquarters technical advisors, the country office senior management team, and the donor (as required): 26-Nov-24
- Final evaluation report and infographic brief(s): 29-Nov-24
Payment Rate: Applicants should provide a financial proposal for executing the deliverables, with a base rate and total amount.
Minimum Qualifications / Requirements:
Demonstrated senior-level experience (at least five years) in the design, implementation, and management of large-scale, multi-sectoral project evaluations; experience in quality evaluation, research, and surveys; and proven experience in designing and implementing quantitative and qualitative data collection methods, including but not limited to:
- Surveys (population-based or target-group-based), focus group discussions, key informant interviews, observation and document review, service delivery assessments, perception surveys, etc.
- Expertise in humanitarian program evaluation in hard-to-reach areas, with relevant thematic knowledge and in-depth experience evaluating protection and health in emergency contexts, in IDP and local community settings.
- Experience conducting evaluations in Ukraine (experience with USAID/BHA is preferable).
- Excellent research skills, including the ability to collect, collate, and analyze large amounts of qualitative data, identify critical aspects, and succinctly communicate complex subject matter (in written and oral form) to make it accessible to wider audiences.
- Proven experience in designing evaluation methodologies/tools, data analysis, report writing, etc.
- Capacity to work collaboratively with multiple stakeholders.
- Strong analytical, presentation, and writing skills in English; knowledge of Ukrainian and Russian is a plus.
Commitment to Gender, Equality, Diversity, and Inclusion: The IRC is committed to creating a diverse, inclusive, respectful, and safe work environment where all persons are treated fairly, with dignity and respect. The IRC expressly prohibits and will not tolerate discrimination, harassment, retaliation, or bullying of IRC persons in any work setting. We aim to increase the representation of women, people from the countries and communities we serve, and people who identify with races and ethnicities that are under-represented in global power structures.