Data Engineer
International Rescue Committee (IRC)
Full-time
Closes on 23 Feb 2026
Posted 12 hours ago
Job Description

Background/IRC Summary:

Technology and Operations supports the organization's work by providing reliable and scalable solutions for the IRC's offices around the world. The Data Team at IRC is responsible for the design and delivery of global data strategies and of the systems and products that deliver on them.

Job Overview/Summary:

The Data Engineer will support the implementation, configuration, and maintenance of data systems and pipelines across IRC’s data environment. This role assists in building and operating ETL/ELT processes, data integrations, and cloud-based data platforms such as Azure Databricks, Synapse, and Fabric.

The successful candidate will help maintain Lakehouse data environments by monitoring pipeline execution, supporting data loads, and assisting in data modeling tasks under guidance from senior team members. This is a hands-on technical role that requires foundational data engineering knowledge, willingness to learn, and strong collaboration skills.

The Data Engineer will work closely with senior engineers and architects but will not be responsible for deputizing for the Data Architect or owning critical security responsibilities.

Major Responsibilities:

1. Design, build, and maintain reliable ETL/ELT data pipelines for batch and near-real-time processing from internal and external sources, using tools such as Azure Data Factory or Databricks workflows.
2. Implement data validation, testing, and reconciliation checks (including dbt tests where applicable).
3. Monitor pipeline health, performance, and reliability.
4. Identify issues and escalate or collaborate with senior engineers to resolve them.
5. Write SQL and Python queries for data extraction and transformation.
6. Support documentation of processes, standards, and improvements.
7. Support solution design by preparing data samples, documentation, or prototype queries.
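The validation and reconciliation work in responsibility 2 can be sketched with a minimal row-count check between a staging table and its target, a common post-load test. This is an illustrative example only, using SQLite and hypothetical table names (`staging_donations`, `donations`), not the IRC's actual platform or schema:

```python
import sqlite3

def reconcile_row_counts(conn, source_table, target_table):
    """Compare row counts after a load; return (source, target, match)."""
    src = conn.execute(f"SELECT COUNT(*) FROM {source_table}").fetchone()[0]
    tgt = conn.execute(f"SELECT COUNT(*) FROM {target_table}").fetchone()[0]
    return src, tgt, src == tgt

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE staging_donations (id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE donations (id INTEGER, amount REAL)")
    rows = [(1, 50.0), (2, 75.0), (3, 20.0)]
    conn.executemany("INSERT INTO staging_donations VALUES (?, ?)", rows)
    conn.executemany("INSERT INTO donations VALUES (?, ?)", rows)
    # A count mismatch here would signal dropped or duplicated rows.
    print(reconcile_row_counts(conn, "staging_donations", "donations"))
```

In practice the same idea is expressed declaratively as a dbt test (e.g. comparing counts or checksums between models) rather than hand-written Python.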

Key Working Relationships:

  • Data Team
  • Business/Departmental Priority Setters
  • Enterprise Systems Owners

Position Reports to: Omar Bouidel

Travel Requirements:


Minimum Requirements:

  • Experience: 2–4 years of hands-on experience in data engineering, data processing, or software engineering.
  • Technical Skills:
    • SQL (advanced): joins, window functions, CTEs, performance tuning
    • Python: data processing, APIs, automation, PySpark basics
    • Data modeling: star/snowflake schemas, fact & dimension tables
    • ETL/ELT pipelines: building, monitoring, and optimizing pipelines
    • dbt Core / dbt Cloud: developing, scheduling, and maintaining models
  • CI/CD Tools: Familiarity with Git or other version control systems.
  • Problem-Solving: Strong problem-solving skills and attention to detail.
  • Communication: Good communication and teamwork skills.
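The SQL and data-modeling skills listed above (CTEs, window functions, star schemas with fact and dimension tables) can be illustrated together in one small sketch. The schema and data below are entirely hypothetical, built in SQLite for self-containment; they are not the IRC's actual model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Miniature star schema: one fact table keyed to two dimension tables.
conn.executescript("""
CREATE TABLE dim_program (program_key INTEGER PRIMARY KEY, program_name TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
CREATE TABLE fact_spend (
    program_key INTEGER REFERENCES dim_program(program_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    amount      REAL
);
""")
conn.executemany("INSERT INTO dim_program VALUES (?, ?)",
                 [(1, "Health"), (2, "Education")])
conn.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                 [(20240101, 2024, 1), (20240201, 2024, 2)])
conn.executemany("INSERT INTO fact_spend VALUES (?, ?, ?)",
                 [(1, 20240101, 100.0), (1, 20240201, 150.0), (2, 20240101, 80.0)])

# A CTE joins the star, then a window function computes a running total
# of spend per program across months.
rows = conn.execute("""
WITH spend AS (
    SELECT p.program_name, d.month, f.amount
    FROM fact_spend f
    JOIN dim_program p ON p.program_key = f.program_key
    JOIN dim_date    d ON d.date_key    = f.date_key
)
SELECT program_name, month, amount,
       SUM(amount) OVER (PARTITION BY program_name ORDER BY month) AS running_total
FROM spend
ORDER BY program_name, month
""").fetchall()
for r in rows:
    print(r)
```

The dimension tables hold descriptive attributes while the fact table holds measures and foreign keys, which is the essence of the star layout the posting asks for.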

Preferred Additional Requirements:

  • Experience with cloud platforms (Azure preferred), Azure Data Factory, or similar cloud data tools.

Working Environment:

  • Remote