Data Engineer with Databricks
Appliscale Polska
Our client is one of the largest game studios, known for their very successful MOBA and FPS franchises.
As a Software Engineer on the Core Integrations and Architecture team, you will have the chance to shape how our client collects and uses data to improve the experience for their players and employees.
You’ll be a core contributor building reliable data solutions that handle petabytes of data. Challenges range from protecting players’ privacy and organizing and optimizing data warehouses with big data tools and cloud-based servers, to building a platform for ingesting data and serving real-time analytics, and creating guardrails and tooling that let teams across the company build their own data products.
You’ll bring your deep expertise in globally distributed systems and large-scale data to help us build efficient solutions.
Responsibilities:
- Please note that availability to attend afternoon/evening meetings is a requirement for this role, as most of the team is located on the US West Coast (LA and Seattle)
- Implement core features under the guidance of the Technical Lead/Engineering Lead
- Conduct code reviews for team members
- Collaborate with different teams across the company to incorporate customer feedback and provide elegant solutions
- Be part of an on-call rotation to support our live products (further details will be provided during the interview process)
- Build new data products on AWS/Databricks
- Manage Databricks platform deployment
- Create guardrails and best practices for using Databricks across the company
- Evaluate new Databricks offerings and how they can be leveraged
- Reduce ambiguity in complex problem spaces by leading technical discovery and prototyping efforts that have a strategic impact on the team
- Identify and investigate key problem and opportunity spaces, and formulate recommendations and strategies for whether and how to pursue them
- Prepare design docs and implementation strategies, and choose appropriate tools
- Hands-on work with live production systems
Required Qualifications:
- Experience in building data pipelines or data products using Databricks
- Experience in managing Databricks as a platform
- 5+ years of experience in Java/Scala or Python
- Bachelor’s or higher degree in Computer Science, Software Engineering, or a related field
- Fluency in English, as it's our daily business language
- Knowledge of Infrastructure as Code tooling, e.g. Terraform
- Good knowledge of and hands-on experience operating AWS services and networking, e.g. S3, EC2, SG, VPC, ASG, R53
- Experience with Spark/PySpark
- Experience with streaming technologies, e.g. Kafka Connect
- Effective communication and teamwork skills
Nice to Have:
- Experience in the gaming industry, particularly with online multiplayer games
- Experience working with cross-discipline organizations that build data products
- Proficient in large-scale data manipulation across various data types
- Demonstrated ability to troubleshoot and optimize complex ETL pipelines
- Familiarity with relational databases (e.g., MySQL, Postgres), and distributed storage systems (e.g., S3)
- Knowledge of data design patterns such as the medallion architecture
- Knowledge of infrastructure monitoring and alerting tools (Datadog, PagerDuty)
Interested in this offer?
Apply now!