Interswitch Limited is an integrated payment and transaction processing company that provides technology integration, advisory services, transaction processing and payment infrastructure to governments, banks and corporate organizations. Through its “Super Switch”, Interswitch provides online, real-time transaction switching that enables businesses and individuals to access their funds across the 24 banks in Nigeria and across a variety of payment channels such as Automated Teller Machines (ATMs), Point of Sale (PoS) terminals, mobile phones, kiosks, the web and bank branches.
About the job
- Design, build, and operate the MVNO’s data lakehouse and pipelines to ensure secure, reliable, and governable data flow for analytics, regulatory, and operational needs.
RESPONSIBILITIES
Data Pipeline Development and Management
- Architect, develop, and maintain batch and real-time ingestion pipelines (Kafka, Airflow/Flink) from BSS/OSS, CDRs, CRM, and external sources into the lakehouse.
- Collaborate with Analysts and Data Scientists to productionise transformations and feature engineering.
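The ingestion and feature-engineering duties above can be sketched in plain Python. The pipe-delimited CDR layout, field names, and the `is_long_call` feature below are illustrative assumptions for this posting, not an actual Interswitch or MVNO schema.

```python
from datetime import datetime, timezone

# Hypothetical pipe-delimited CDR layout: caller|callee|start_ts|duration_secs
CDR_FIELDS = ["caller", "callee", "start_ts", "duration_secs"]

def parse_cdr(line: str) -> dict:
    """Parse one raw CDR line into a typed record (illustrative format)."""
    record = dict(zip(CDR_FIELDS, line.strip().split("|")))
    record["start_ts"] = datetime.fromtimestamp(int(record["start_ts"]), tz=timezone.utc)
    record["duration_secs"] = int(record["duration_secs"])
    return record

def transform_batch(lines):
    """A productionised transformation step: drop malformed rows, derive features."""
    records = []
    for line in lines:
        try:
            rec = parse_cdr(line)
        except (ValueError, KeyError):
            continue  # in a real pipeline, route bad rows to a dead-letter sink
        rec["is_long_call"] = rec["duration_secs"] > 300  # example engineered feature
        records.append(rec)
    return records
```

In a Kafka/Airflow deployment, `transform_batch` would sit inside a consumer loop or an Airflow task rather than being called directly, but the parse-validate-derive shape is the same.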
Data Quality, Compliance and Monitoring
- Implement data quality, schema validation, and monitoring to guarantee data fitness for purpose and compliance with NCC data regulations.
- Monitor pipeline health purely through metadata (job status, row counts, latency) and surface SLA dashboards while guaranteeing data sovereignty.
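Metadata-only monitoring, as described above, can be as simple as evaluating each run's job status, row count, and latency against SLA thresholds. The threshold defaults and violation names here are illustrative assumptions, not NCC-mandated values.

```python
def check_pipeline_health(run_metadata: dict,
                          min_rows: int = 1,
                          max_latency_secs: float = 900.0) -> list:
    """Evaluate one pipeline run using metadata only (no record contents),
    which keeps monitoring compatible with data-sovereignty constraints.
    Returns a list of SLA violations; an empty list means healthy."""
    violations = []
    if run_metadata.get("status") != "success":
        violations.append("job_failed")
    if run_metadata.get("row_count", 0) < min_rows:
        violations.append("row_count_below_threshold")
    if run_metadata.get("latency_secs", 0.0) > max_latency_secs:
        violations.append("latency_sla_breached")
    return violations
```

A dashboard job could run this check per pipeline and surface the violation lists, so SLA visibility never requires inspecting subscriber data itself.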
Data Storage and Optimization
- Manage object storage, Delta/Iceberg tables, and warehouse schemas; optimize partitioning and performance.
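Partition optimization on object storage typically means laying out Delta/Iceberg table files under low-cardinality partition keys so the query engine can prune files. A minimal sketch of Hive-style partition paths follows; the bucket prefix and column names (`event_date`, `region`) are assumptions for illustration.

```python
from datetime import date

def partition_path(table_root: str, event_date: date, region: str) -> str:
    """Build a Hive-style partition path, e.g. for daily CDR loads.
    Partitioning by low-cardinality columns such as date and region lets
    the engine skip irrelevant files and keeps object listings cheap."""
    return f"{table_root}/event_date={event_date.isoformat()}/region={region}"
```

With Delta or Iceberg the table format manages these paths itself via `PARTITIONED BY` clauses, but the layout produced is the same shape.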
Infrastructure Optimization and DevOps
- Automate CI/CD deployment of data pipelines using Infrastructure as Code and GitOps best practices.
Governance & Security
- Own cost governance and access control policies across data platforms.
EDUCATION
General Education
- Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.
EXPERIENCE
General Experience
- Minimum of 7 years designing and operating data pipelines and lakehouses in telecom, fintech, or comparable data-intensive industries.
- Experience mentoring junior engineers or leading small project teams is desirable.
- Hands-on with streaming (Kafka/Flink/Pulsar) and batch orchestration (Airflow/Prefect).
- Expertise in SQL, Python/Scala, and Delta Lake/Iceberg/Hudi table formats.
- Proficiency in cloud object storage and data warehouse platforms (Snowflake, BigQuery, Redshift).
- Strong grasp of data quality frameworks (Great Expectations), CI/CD, Terraform, Docker/Kubernetes.