Junior Data Engineer

Huddle Gaming d.o.o.

About the position

About Huddle


At Huddle, we’re building the next generation of scalable and reliable sports betting infrastructure, and data is at the heart of everything we do. We provide automated pricing and trading services with industry-leading uptime and accuracy, helping our sportsbook partners grow turnover and improve margins.

Our Data Engineering team works with large-scale sports data across the major US sports (American football, basketball, baseball, and ice hockey) as well as global sports such as soccer and tennis. If you’re passionate about sports, excited to learn, and keen to build your data engineering skills, you’ll thrive at Huddle.



About the role


As a Junior Data Engineer, you’ll support the design and maintenance of scalable sports data pipelines, working with Python, Airflow, Spark, and modern data platforms like Snowflake and Kafka. You’ll help transform raw data into actionable insights, build and maintain internal reports and tools, collaborate across teams, and contribute to improving data infrastructure while continuously learning best practices in data engineering.



What you’ll do


  • Collaborate with senior team members to design, build, and maintain scalable sports data pipelines while adhering to best practices for data security, code quality, and pipeline reliability
  • Maintain and support development of reports and dashboards for internal stakeholders
  • Write and maintain Python scripts, Airflow workflows, Spark jobs, and Airbyte data integrations
  • Work with relational databases (PostgreSQL) and streaming platforms (Kafka) to efficiently store and process data
  • Learn internal data structures and apply business logic to transform raw data into actionable insights
  • Develop proficiency in querying, analyzing, and optimizing performance in Snowflake
  • Assist in building and maintaining internal Streamlit tools for feature testing and data exploration
  • Support the modernization and improvement of large-scale data infrastructure and pipelines
  • Create and maintain clear technical documentation for processes, code, and workflows
  • Collaborate with Analytics, Quant, and other teams to meet business data requirements
  • Continuously learn new tools, technologies, and best practices in data engineering and sports analytics, and contribute ideas for system and process improvements
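To give a flavour of the "raw data into actionable insights" work described above, here is a minimal Python sketch of one such transformation — aggregating raw game results into per-team records. The team codes, field names, and `team_summary` helper are invented for illustration and are not Huddle's actual schema or tooling:

```python
from collections import defaultdict

# Hypothetical raw feed rows: one record per finished game.
RAW_GAMES = [
    {"home": "NYK", "away": "BOS", "home_pts": 104, "away_pts": 99},
    {"home": "BOS", "away": "NYK", "home_pts": 111, "away_pts": 108},
    {"home": "NYK", "away": "MIA", "home_pts": 97, "away_pts": 101},
]

def team_summary(games):
    """Aggregate raw game rows into per-team win/loss records."""
    table = defaultdict(lambda: {"wins": 0, "losses": 0})
    for game in games:
        home_won = game["home_pts"] > game["away_pts"]
        winner = game["home"] if home_won else game["away"]
        loser = game["away"] if home_won else game["home"]
        table[winner]["wins"] += 1
        table[loser]["losses"] += 1
    return dict(table)

print(team_summary(RAW_GAMES))
```

In practice a step like this would run inside an Airflow-orchestrated pipeline against data in PostgreSQL or Snowflake rather than on in-memory dictionaries, but the shape of the work — ingest raw records, apply business logic, emit a stakeholder-ready summary — is the same.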


Minimum qualifications


What we’re looking for



  • Bachelor’s degree in Computer Science, Software Engineering, Data Science, or a related STEM field
  • 0–3 years of experience in data engineering, analytics, or a related technical role, including internships, academic projects, or independent work
  • Strong analytical mindset with curiosity and a genuine eagerness to learn
  • Solid understanding of Python with experience building simple applications through personal or professional projects
  • Exposure to SQL and relational databases, preferably PostgreSQL
  • Interest in data pipelines, ETL processes, and business intelligence solutions
  • Strong communication and collaboration skills, with the ability to work effectively across teams
  • Willingness to actively participate in learning, experimentation, and team data initiatives
  • Openness to constructive feedback and a drive to continuously develop technical skills
  • Attention to detail and a proactive approach, with the willingness to take ownership of tasks and to seek guidance when needed


Bonus qualifications

Bonus points if you have

  • Familiarity with workflow orchestration tools such as Airflow
  • Exposure to data synchronization tools like Airbyte
  • Understanding of data streaming and messaging platforms such as Kafka
  • Experience or interest in big data processing frameworks like Spark
  • Exposure to cloud platforms such as AWS, OCI, or equivalent services
  • Basic knowledge of data warehousing tools like Snowflake, including an understanding of Terraform for permissions management
  • Experience with reporting and dashboarding tools such as Metabase or other BI platforms
  • Familiarity with internal tooling frameworks like Streamlit
  • Understanding of containerization concepts using Docker
  • Experience with version control and collaboration tools such as Git, GitLab, Jira, Argo, and Confluence




Why it’s great to work in this role

This is a great role for someone who wants hands-on experience with modern data technologies while working on large-scale, real-world sports data. You’ll be part of a small, autonomous team where data sits at the heart of the business and processes and tools are continuously improved. You’ll receive close guidance while having the freedom to learn, experiment, and make a real impact, especially if you’re passionate about sports.


Benefits

  • 25 days annual leave
  • Additional health insurance
  • Multisport card with 50% co-financing to help you stay active
  • Flexible working: combine working from home and from the office
  • Numerous opportunities for education, personal growth, and further training, supported by a personal education budget
  • Dog-friendly office