Data Engineer (AWS & Airflow)

Location: Amsterdam

What are you going to do?

The end client is a culture-sector company based in Amsterdam, ranked among the top 5 hottest scale-ups in the Netherlands. You’re a Data Engineer with a passion for architecting resilient and reliable data systems. You obsess over data quality and operational excellence, build clean and efficient data pipelines, and constantly look for ways to make systems better. You see the value in being a data-driven organization.

  • Work closely with other Data Engineers on the team to architect, build and improve our Data Platform and analytics infrastructure;
  • Design and implement fault-tolerant, resilient ELT/ETL pipelines using open-source tools like Airflow or hosted AWS/SaaS services (see the sketch after this list);
  • Build monitoring tools to help meet our data quality/availability SLAs and ensure high availability for our infrastructure;
  • Work independently without active supervision and take full ownership of what you build;
  • Support Data Scientists and Data Analysts on the team to productionize their workflows, while building tools and infrastructure to improve their speed and effectiveness.
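For illustration only, below is a minimal sketch of the kind of fault-tolerant ELT pipeline the role involves, assuming a recent Airflow 2.x installation. The DAG id, task names, and callables are hypothetical placeholders, not the end client's actual pipelines.

    # Minimal, hypothetical Airflow 2.x ELT DAG: extract a batch of raw events,
    # then load it into the warehouse, with retries for fault tolerance.
    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.python import PythonOperator


    def extract_events():
        """Pull a batch of raw events from a source system (placeholder)."""


    def load_to_warehouse():
        """Load the extracted batch into the warehouse (placeholder)."""


    with DAG(
        dag_id="events_elt",                 # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@hourly",
        catchup=False,
        default_args={
            "retries": 3,                    # retry transient failures
            "retry_delay": timedelta(minutes=5),
        },
    ) as dag:
        extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
        load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

        extract >> load                      # run extract before load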
Key technologies
  • AWS;
  • Airflow;
  • Docker;
  • Kubernetes.

What are we looking for?

  • 2+ years of experience working as an Engineer on data-intensive applications, architecting large-scale production systems;
  • Substantial experience with Python and other scripting languages with a track record of writing clean, efficient code;
  • Experience using core Data Engineering technologies like Airflow and dbt;
  • Solid understanding of relational databases like PostgreSQL, as well as data warehousing technologies such as Redshift, BigQuery, or Snowflake;
  • Sound understanding of SQL fundamentals and past experience with data warehousing, data modeling and schema design;
  • DevOps mindset; experience deploying applications using technologies like Kubernetes and Docker on AWS.


Interested? Contact Jesse van der Meer.
