All our products, services and integrations produce a mountain of data, which we want to use to make better software. To do this, we need to collect relevant data, organize it into a data warehouse and make it available for analysis. In short, we need to manage our big data systems.

What you’ll do:

  • Develop software to automate the management and monitoring of our big data tooling, including Terraform, Hadoop, Spark, Kafka, Airflow, Redshift, and many others
  • Design and build efficient platforms and frameworks used by other technical colleagues (such as data engineers and data scientists) in their daily work
  • Implement integrations with new data sources
  • Mentor our internal users on how to get the most out of the data processing platform
  • Ensure that our solution can scale to billions of data records and hundreds of terabytes of data

We’re looking for someone with:

  • Excitement about the latest big data technologies and eagerness to integrate them in the cloud (AWS) to solve real business problems
  • A good grasp of several programming languages and experience writing real-world applications
  • Knowledge of Python, or willingness to learn it
  • A good handle on SQL and the ability to find your way around relational databases
  • Strong attention to detail
  • Ability to work with international teams and comfort with agile development

What we offer:

  • Competitive base pay
  • An international team with offices in Estonia, the US, the UK, Portugal, and the Czech Republic
  • Flexible work style and schedule
  • A place in our new Tallinn office (one of the coolest offices in all of Estonia!)