Description
My client, a technology company, is currently seeking a Data Engineer with experience in Python, Spark, dbt, BigQuery and Google Cloud for a 6-month rolling project focused on the development of a brand-new function. You would be working in a highly capable product team with a large degree of autonomy.
This role is 100% Remote from anywhere inside Europe.
Rates between € per day, dependent on experience.
Required key competencies and qualifications:
* 5+ years of experience serving as a data engineer
* Strong skills with Python or a JVM language (Java, Scala)
* Advanced SQL knowledge and experience working with relational databases and query authoring
* Experience working with cloud data warehouses, specifically BigQuery or Snowflake
* Experience working with data platforms built on public cloud in a production setting
* Experience working with GCP native data toolset is preferred.
* Experience building ELT or ETL pipelines that process large volumes of data
* Experience with dbt or Airbyte is preferred.
* Experience developing production Spark or Hadoop applications
* Experience with a workload scheduling and orchestration tool such as Apache Airflow/Cloud Composer
* Familiarity with modern DataOps practices (e.g. quality engineering, testing, pipeline deployments) and IaC (e.g. Terraform)
* Experience with batch, micro-batch, and real-time streaming architectures
* Good internal and customer-facing communication skills
Interviews are happening as soon as this week, and my client is looking to onboard someone as soon as 10th July 2025.
If you are interested, please send me an up-to-date CV and your availability to speak at your earliest convenience.