
Data Engineer

Remote • Wrocław, Dolnośląskie, Poland

Job description

You’ll be joining a team of passionate developers to work on projects for local and international companies as well as in academic research. You’ll take part in every phase of end-to-end delivery: direct contact with the client, business analysis of the problem, designing an appropriate solution, implementing it, and moving it to the production environment. You’ll cooperate directly with machine learning team members, either leading a project or assisting at various stages.


Job requirements

  • MS/PhD (or BS plus 3–4 years of industry experience) in Computer Science or a related field

  • a minimum of 3 years of experience working on data-intensive projects

  • independent worker - as the company's only Data Engineer, you must be able to independently drive data priorities forward

  • expertise with relational databases and SQL querying and scripting

  • expertise in Python

  • demonstrated ability to write high-quality, production-ready code (readable, well-tested, with well-designed APIs)

  • familiarity with DevOps-related concepts / tools (e.g. Docker, Kubernetes, Terraform)

  • passion for architecting large distributed systems with elegant interfaces that scale easily

  • experience in areas relevant to data engineering, including data management, custom ETL design, and data modeling

  • ability to communicate effectively and collaborate with people of diverse backgrounds and job functions

  • familiarity with cloud computing services (AWS or GCP)

  • familiarity with web services and application frameworks (Django, Flask, FastAPI)

  • proficiency in a Linux environment (including shell scripting) and experience with version control practices and tools

  • experience working with machine learning and/or data science stakeholders to accelerate their workflows

  • experience with large data sets and associated technologies such as Spark, BigQuery, Hive, Flink, Kafka, etc.

Nice to have

  • expertise in additional general-purpose programming languages (such as Java, Scala, C/C++, or Go)

  • hands-on experience with a cloud data warehouse (Snowflake or Redshift) and Big Data technologies (e.g. S3, Hadoop, Hive, Spark, Flink, Kafka)

  • working knowledge of statistics and various flavors of statistical modeling techniques

  • experience with deploying ML models

  • experience with MLOps-related concepts / tools (e.g. MLflow, Neptune, Kubeflow, W&B)

Salary: 11 000 - 17 000 PLN + VAT (B2B)


We offer you:

  • working with the newest machine learning technologies
  • an annual self-development budget
  • the possibility to contribute to a variety of interesting projects
  • internal workshops
  • personal branding opportunities (articles, conference speaking, leading internal workshops)
  • flexible work hours
  • remote work possibility
  • chillout room / free beverages / team & company events
  • friendly atmosphere
  • MultiSport
  • LuxMed


