
Senior Data Engineer

DCG · Location: Poland

Employment type: Full-time
Experience level: Senior
Posted: 22 April 2026
Salary: To be negotiated

Responsibilities:

  • Design, develop, and maintain scalable data pipelines (ETL/ELT)
  • Build and enhance data warehouses and data lake / lakehouse solutions
  • Create and deploy data models supporting business and analytical needs
  • Write efficient and scalable code (Python / PySpark, optionally Scala or Java)
  • End-to-end ownership of pipelines feeding the Data Platform
  • Ensure high data quality, availability, and timeliness
  • Collaborate with the Data Governance team (GDPR, CISO, data quality built into pipelines)
  • Optimize existing solutions in terms of performance, cost, and stability
  • Work closely with product, analytics, and business teams
  • Define technical and architectural standards
  • Prototype and implement new approaches and technologies
  • Collaborate with the Data Product Manager to align data sources and requirements
  • Create and deliver the data roadmap for key datasets
  • Clearly communicate technical solutions to both technical and non-technical stakeholders
  • Mentor and support less experienced data engineers
  • Represent the Data & Analytics team in cross-functional initiatives
  • Promote the value of modern data solutions across the organization

Requirements:

  • Strong experience as a Data Engineer (Senior / Full Stack)
  • Proficiency in Python, PySpark, SQL, and Bash/Shell scripting
  • Experience with Snowflake, dbt, Kafka or Kinesis, Airflow, and AWS Glue
  • Experience working with data platforms such as Data Lake, Data Warehouse, or Lakehouse
  • Knowledge of NoSQL databases (e.g., MongoDB)
  • Hands-on experience with AWS and building cloud-based data platforms
  • Experience with Terraform (Infrastructure as Code)
  • Proficiency with Git/GitHub and CI/CD tools (e.g., GitHub Actions)
  • Strong understanding of modern data architectures (Data Lake, Data Warehouse, Lakehouse, Data Mesh)
  • Experience with data modelling, data governance, and cost optimization of data pipelines
  • Experience working in Agile environments (Scrum, Kanban)
  • Strong communication skills and ability to work with global stakeholders
  • English proficiency at B2+ level or higher

Nice to have:

  • Experience in cloud data platform transformations
  • Experience working with large-scale data environments

Offer:

  • Private medical care
  • Co-financed sports card
  • Ongoing support from a dedicated consultant
  • Employee referral program

Interested in this offer?

Apply now!