Data Engineer job at Dfcu

Vacancy title:
Data Engineer

[Type: Full-time, Industry: Finance, Category: Science & Engineering]

Jobs at:
Dfcu

Deadline of this Job:
Wednesday, October 29, 2025

Duty Station:
Kampala | Uganda

Summary
Date Posted: Tuesday, October 21, 2025 | Base Salary: Not Disclosed


JOB DETAILS:

Reporting to the Head of Data & Insights, the role holder will be responsible for designing, building, and maintaining robust, scalable, and secure data pipelines and infrastructure across global data platforms. The role ensures that data from various systems is efficiently ingested, transformed, stored, and made available for advanced analytics, reporting, and machine learning use cases, in compliance with global data governance and privacy standards.

KEY ACCOUNTABILITIES:

  • Design, build, and maintain data pipelines to ingest data from structured and unstructured sources (internal and external).
  • Develop and optimize ETL/ELT processes to ensure reliability, scalability, and performance across large datasets.
  • Implement data warehousing and data lake architectures using cloud and on-prem technologies (e.g., Snowflake, Azure Synapse, BigQuery, AWS Redshift, Databricks, SSMS).
  • Create reusable data assets and frameworks for repeatable and standardized data integration.
  • Implement data validation, cleansing, and quality monitoring frameworks.
  • Integrate and support Master Data Management (MDM) and Metadata Management practices.
  • Partner with the Data Governance and Data Protection Officers to ensure compliance with Uganda's data protection and privacy laws and other applicable global data protection regulations.
  • Manage data lineage, cataloging, and access control using enterprise tools such as Azure Purview, Collibra, or Alation.
  • Build scalable data pipelines using tools such as Azure Data Factory, Apache Airflow, NiFi, or AWS Glue.
  • Develop real-time and batch data streaming solutions using Kafka, Event Hubs, or Kinesis.
  • Support API-based integrations and data sharing across systems and geographies.
  • Work closely with Data Scientists and Analysts to provision and prepare data for predictive and prescriptive modelling.
  • Collaborate with BI and reporting teams to ensure data consistency across dashboards and analytical layers.
  • Partner with cross-functional teams to define and implement data standards and reusable assets.
  • Research and implement best-in-class tools and frameworks for data engineering.
  • Lead or contribute to cloud modernization and data platform migration initiatives.
  • Ensure cost optimization and performance tuning of data workloads.
  • Stay updated on emerging technologies (AI-driven data management, Data Mesh, Data Fabric, and GenAI-enhanced data tools).

KNOWLEDGE, SKILLS, AND EXPERIENCE REQUIRED:

  • Bachelor’s Degree in Computer Science, Software Engineering, Statistics, Mathematics, Data Science, Information Systems, or other Quantitative fields.
  • Preferred: Master’s degree or equivalent experience in Data Engineering, Cloud Computing, or Analytics.
  • Certifications in one or more of the following:
      • Azure Data Engineer Associate / AWS Certified Data Analytics / Google Professional Data Engineer;
      • Databricks Certified Data Engineer;
      • Snowflake SnowPro Core / Advanced Architect.
  • Minimum 3–5 years' experience in Data Engineering or Data Platform Development.
  • Proficiency in SQL, Python, PySpark, or Scala for data transformation.
  • Experience with cloud data platforms (Azure, AWS).
  • Strong understanding of data modelling, data warehousing, and ETL orchestration.
  • Hands-on experience with data versioning, CI/CD for data pipelines, and Infrastructure as Code (IaC) using Terraform or ARM templates.
  • Familiarity with data governance frameworks and data privacy principles.
  • Experience with modern architecture patterns such as Data Mesh or Data Fabric is a plus.
  • Excellent communication, collaboration, and problem-solving skills in cross-functional, multicultural environments.

 

Work Hours: 8

Experience in Months: 36

Level of Education: Bachelor's degree

Job application procedure

Interested and qualified? Click here to apply

 


Job Info
Job Category: Engineering jobs in Uganda
Job Type: Full-time
Deadline of this Job: Wednesday, October 29, 2025
Duty Station: Kampala | Uganda
Posted: 21-10-2025
No of Jobs: 1
Start Publishing: 21-10-2025