Data Platform Engineer (DataOps) (m/f/x)
You will be joining the Platform Tribe at HelloTech. The tribe provides a self-service, low-friction set
of tools and infrastructure that is stable, scalable, and easy to use. This includes providing the
data infrastructure that helps engineers ship high-quality data products faster and more securely.
As well as building a great foundation, you will also be responsible for spreading the tribe's
knowledge throughout the other tribes, making sure everyone takes advantage of the easy-to-use
infrastructure, and applying best practices in Continuous Delivery, Containerisation,
Performance, Security, and more.
What you’ll do
● Deliver solid infrastructure automation, leveraging Terraform and CI/CD workflows via Concourse/GitHub Actions
● Build and maintain container clusters with Kubernetes, enabling engineers to design and operate their own workloads
● Support and improve our evolving data technology infrastructure, creating automation around Airflow, AWS EMR, and their connected technologies
● Implement best practices and optimise the operations of our CI/CD workflows
● Design, develop and support tools and frameworks to be used by data engineers, enabling them to autonomously build high quality data assets without having to worry about the underlying infrastructure
● Mentor engineers regarding best practices, concepts and patterns regarding our data infrastructure and support in setting up efficient DevOps, DataOps and MLOps processes
What you’ll bring
● You have strong experience working with AWS, infrastructure automation with Terraform, containerisation with Docker, and orchestration technologies such as Kubernetes
● You have strong experience with Python development in data-related roles and knowledge of data engineering practices
● You have experience building an enabling platform, supporting engineers, and helping them monitor and automate their workflows
● You have experience working with data platforms and know how to create and maintain frameworks and automation that leverage them
● You have an understanding of data processing frameworks like Apache Hadoop/Spark, orchestration (Airflow), scalable data processing (AWS EMR), and modern DWH technologies (AWS Redshift/Snowflake)
● You can be a point of contact and advisor for our Data teams on data technology and operational procedures