mylo is a fintech platform dedicated to helping millions of people and businesses thrive by providing accessible and responsible financial solutions. Whether you’re purchasing a mobile phone, a new jacket, a flight ticket, a comfy couch, or even covering school tuition, mylo enables you to buy now and pay later at thousands of points of sale across Egypt. Born out of B.TECH—Egypt’s leading electronics and appliances retailer with over 27 years of experience in offering buy now, pay later solutions—mylo brings a legacy of trust and innovation to the fintech space. All mylo products are fully Sharia-compliant, ensuring ethical and inclusive financial practices.
We are seeking a passionate and experienced Senior Data Engineer to join our team within the Fintech domain. This role is ideal for someone who thrives in a fast-paced environment and is excited to design, build, and scale secure, high-performing infrastructure to support a range of financial products.
Responsibilities
- Design, develop, and maintain large-scale, reliable data pipelines using Python, SQL, and big data technologies such as Apache Spark and Kafka.
- Build and optimize ETL/ELT processes for data transformation, loading, and integration from multiple sources.
- Develop and maintain data storage solutions using both relational and NoSQL databases, including SQL Server, PostgreSQL, MySQL, and MongoDB.
- Implement and manage CI/CD pipelines for data workflows, enabling automated deployments and version control.
- Work with AWS services to build, deploy, and monitor cloud-based, scalable data solutions.
- Leverage Apache Airflow for orchestrating workflows (see the sketch after this list) and PostHog for analytics tracking and event data.
- Manage and enhance data warehousing solutions to support business intelligence and analytics needs.
- Ensure data accuracy, consistency, and security across diverse systems and sources.
- Troubleshoot and optimize data systems for performance, scalability, and cost efficiency.
- Actively promote and contribute to a collaborative, innovative, and agile team culture.
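For a concrete picture of the day-to-day work, here is a minimal sketch of the kind of orchestrated ETL described above, assuming Airflow 2.x; the DAG id, task logic, and schedule are hypothetical illustrations, not a description of mylo's actual stack.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_orders(**context):
    """Pull a batch of raw order events from a source system (stubbed here)."""
    # A real task might read from Kafka, an OLTP replica, or an API.
    return [{"order_id": 1, "amount_egp": 2500.0}]


def load_to_warehouse(ti, **context):
    """Load the extracted batch into the warehouse (stubbed here)."""
    rows = ti.xcom_pull(task_ids="extract_orders")  # hand-off via XCom
    print(f"Loading {len(rows)} rows into the warehouse")


with DAG(
    dag_id="orders_daily_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # Airflow 2.4+ argument name
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
):
    extract = PythonOperator(task_id="extract_orders",
                             python_callable=extract_orders)
    load = PythonOperator(task_id="load_to_warehouse",
                          python_callable=load_to_warehouse)

    extract >> load  # extraction must finish before loading starts
```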
Requirements
- 5+ years of experience in data engineering, building and maintaining production-grade data pipelines and architectures.
- Proficient in Python and SQL.
- Hands-on experience with relational databases (SQL Server, PostgreSQL, MySQL) and NoSQL databases (MongoDB).
- Experience with big data and stream processing tools (e.g., Apache Spark, Kafka); a minimal streaming sketch follows this list.
- Skilled in implementing CI/CD pipelines for data workflows.
- Strong understanding of AWS services (S3, Redshift, Lambda, Glue).
- Experience with Apache Airflow for workflow orchestration.
- Familiarity with PostHog or Amplitude for analytics tracking and event management.
- Comfortable with Docker, Kubernetes, and Linux shell scripting.
- Solid grasp of data modeling, warehousing, scalability, and reliability best practices.
- Proven ability to ensure data quality, governance, and security.
- Strong communication skills and a collaborative mindset.
- Passion for continuous learning and staying updated on emerging technologies.
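To illustrate the stream processing experience listed above, the sketch below reads a Kafka topic with Spark Structured Streaming and lands the events for downstream ETL; the broker address, topic name, schema, and S3 paths are hypothetical, and the job assumes the spark-sql-kafka connector is available.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("payments_stream").getOrCreate()

# Expected shape of each JSON event on the topic (illustrative).
schema = StructType([
    StructField("order_id", StringType()),
    StructField("amount_egp", DoubleType()),
])

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "payments")                   # hypothetical topic
    .load()
    # Kafka delivers the payload as bytes; parse it into typed columns.
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

query = (
    events.writeStream
    .format("parquet")  # land raw events for downstream ETL
    .option("path", "s3a://example-bucket/payments/")            # hypothetical path
    .option("checkpointLocation", "s3a://example-bucket/ckpt/")  # required for fault tolerance
    .start()
)
query.awaitTermination()
```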