About the Role
We’re looking for a Senior Data Platform Engineer to play a key hands-on role in building and evolving our client’s cloud-native Data Platform. This platform is the foundation for how data is produced, processed, and consumed across the client’s organisation. The role centres on distributed systems, real-time processing, and platform reliability, and is ideal for someone who enjoys solving complex data engineering challenges and influencing architecture.
What You’ll Do
- Develop and enhance scalable data services using Java, Apache Flink, and Python.
- Design and implement robust streaming and batch data pipelines.
- Contribute to the evolution of our cloud-native data architecture (AWS preferred).
- Improve platform usability to enable teams to publish and consume data more effectively.
- Ensure systems are reliable, observable, and cost-efficient.
- Contribute to architectural decisions with a focus on resilience and maintainability.
- Implement automated testing, monitoring, and CI/CD best practices.
- Troubleshoot complex distributed systems issues in production.
- Respond to platform incidents and operational issues as they arise.
- Help build capabilities that make data trusted, discoverable, and reusable.
- Support event-driven and domain-oriented data patterns.
- Improve metadata, lineage, and quality visibility where relevant.
- Partner with Product Managers, Architects, and Analytics teams to translate business needs into technical solutions.
- Provide technical leadership through design reviews, documentation, and code quality.
- Contribute to a culture of continuous improvement and shared ownership.
What We’re Looking For
- Strong professional experience in backend or data engineering.
- Proficiency in Java for building distributed systems.
- Hands-on experience with Apache Flink (or similar).
- Solid Python experience for data processing and tooling.
- Experience building and operating production-grade data pipelines.
- Strong understanding of distributed systems concepts.
- Experience working in cloud environments (AWS preferred).
Nice to Have
- Experience with Kafka or event streaming ecosystems.
- Familiarity with infrastructure as code and containerised environments.
- Exposure to data governance, cataloguing, or data quality tooling.
- Experience working in a domain-oriented or federated data model.