Data Engineer (EU Candidates)

Hi there!

We are the biggest European Python Powerhouse, with over 17 years of experience, 8 offices in Poland, and a deep commitment to Agile principles. Join a group of 500+ professionals dedicated to helping customers build outstanding products.

Are you the NEXT one?

“The data revolution has enabled us to create software solutions that mimic intelligent human behaviour. I have always wanted to be part of it by building data and artificial intelligence products. For my team, I am looking for people who are not afraid of challenges and who want to solve difficult problems that ultimately help people through data engineering and machine learning.”

– Dr Krzysztof Sopyła, Head of Machine Learning and Data Engineering

 

SALARY

Regular+: up to 20 160 zł + VAT

Senior: up to 28 560 zł + VAT

 

What do we need from you?

Core values

We believe that every problem has a solution, often hidden and not so obvious. Our job is to work those solutions out, and the best ones are born from imagination, cooperation and craftsmanship.

How do we work?

We work with clients for their benefit and the benefit of their target users. We often act as consultants and architects, people who tear down the existing order, introducing changes and innovations. But just as often we act as craftsmen who must deliver software of the highest quality.

What will the daily work look like?

You will be assigned to a team working on a project for one of our foreign clients. Your responsibilities will include developing, inventing and implementing solutions to process large or distributed data.

What is expected of you?

As a Data Engineer, you will work on a daily basis with systems that process data, often on a large scale.

We expect knowledge of:

  • best practices for designing scalable data processing systems, including data pipelines, advanced ETL processes, data warehouses and data lakes
  • at least one cloud platform (AWS, Azure, GCP) and its solutions related to data processing
  • SQL and at least one relational database management system like MySQL or PostgreSQL
  • at least one NoSQL database like HBase, DynamoDB or MongoDB
  • working principles of distributed data processing systems (Apache Spark, Apache Flink or similar)
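To give a flavour of the kind of work the list above describes, here is a deliberately tiny, illustrative ETL sketch (not taken from any real project) in plain Python, using the standard-library sqlite3 module as a stand-in for a real data warehouse; all names in it are hypothetical:

```python
import sqlite3

def run_pipeline(raw_events):
    """Toy ETL: extract raw events, transform them, load into SQLite."""
    # Transform: keep valid events and normalise field names/types.
    cleaned = [
        (e["user"].strip().lower(), int(e["amount"]))
        for e in raw_events
        if e.get("user") and e.get("amount") is not None
    ]
    # Load into an in-memory SQLite database (warehouse stand-in).
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE events (user TEXT, amount INTEGER)")
    conn.executemany("INSERT INTO events VALUES (?, ?)", cleaned)
    # A simple aggregation, in the style of a downstream query.
    total = conn.execute("SELECT SUM(amount) FROM events").fetchone()[0]
    conn.close()
    return total

raw = [
    {"user": " Alice ", "amount": "10"},
    {"user": "bob", "amount": 5},
    {"user": "", "amount": 3},  # dropped by the transform: no user
]
print(run_pipeline(raw))  # → 15
```

Real pipelines differ mainly in scale: the same extract-transform-load shape is implemented with tools like Spark, Airflow and a cloud warehouse instead of in-process Python.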

Python and software development practices:

  • experience in Python 3.x
    • more advanced Python constructs such as lambda functions, generators, and list comprehensions
    • core principles of object-oriented programming
    • understanding of threading and multi-process computation in Python
  • experience in using code versioning tools, such as Git
  • day-to-day work experience with Docker
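As a minimal sketch (for illustration only, with made-up data), the Python constructs listed above fit together like this:

```python
def read_records(lines):
    """Generator: yields parsed records lazily, one at a time."""
    for line in lines:
        yield line.strip().split(",")

lines = ["1,alice", "2,bob", "3,carol"]

# List comprehension: collect user names from the generator.
names = [user for _, user in read_records(lines)]

# Lambda: a small inline key function for sorting.
by_length = sorted(names, key=lambda name: len(name))

print(names)      # → ['alice', 'bob', 'carol']
print(by_length)  # → ['bob', 'alice', 'carol']
```

Generators matter in data engineering because they process records one at a time instead of materialising whole datasets in memory.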

Soft skills:

  • good communication skills in English (minimum B2)
  • eagerness to develop yourself and learn new technologies
  • problem solving and analytical thinking

Your experience rating will also take into account other skills, such as:

  • knowledge of message broker systems like Apache Kafka or AWS Kinesis
  • experience in using orchestration tools like Apache Airflow or Dagster
  • knowledge of search systems like Elasticsearch or Solr
  • experience in using CI/CD tools like GitHub Actions
  • experience in implementing solutions using frameworks like Hadoop, Hive or Presto
  • experience in machine learning or statistics
  • other development skills like REST API or GraphQL
  • data scraping experience