What you’ll work on:

We are looking for Senior Data Engineers to drive the architectural design, implementation plans, best practices, and testing plans for projects involving terabytes of data. This infrastructure will serve as the foundation for the advanced analytics and machine learning work performed by data scientists on top of it.

Primary Responsibilities:

  • Design and implement product features in collaboration with product owners, report developers, product analysts, architects, and business partners within an Agile / Scrum methodology.
  • Design and implement data platforms that meet large-scale, high-performance, and scalability requirements, integrating data from multiple sources, managing structured and unstructured data, and working with existing warehouse structures.
  • Analyze, diagnose, and identify bottlenecks in data workflows.
  • Participate in client demos, as well as in requirements elicitation and translation into system requirements (functional and nonfunctional).
  • Continuously monitor, refine, and report on the performance of data management systems.

Required Skills:

  • Strong general programming skills (Java, Scala).
  • Experience with stream-processing technologies such as Kafka and Spark Streaming.
  • Solid engineering foundations (good coding practices, strong data pipeline architecture and design skills).
  • Solid experience with Apache Spark.
  • 4+ years of experience with large-scale data engineering.
  • 2+ years of experience developing on the Hadoop ecosystem (HDFS and Hive).
  • Experience building scalable, real-time, high-performance data lake solutions in the cloud.
  • Proficiency in designing and implementing ETL (Extract, Transform, Load) processes that handle large data volumes (terabytes requiring distributed processing).
  • Advanced English proficiency.

Nice to have, but not required:

  • Experience working with SQL in advanced scenarios that require heavy optimization.
  • Experience with Elasticsearch.
  • Experience developing solutions with AWS services (EMR, EC2, RDS, Lambda, etc.).
  • Experience with NoSQL databases such as Apache HBase, MongoDB or Cassandra.

Our Vision and Mission:

Make a Difference in Our Community: At Wizeline we work hard and we work smart to make an impact for customers worldwide and for the tech ecosystem in our community.

Wizeline democratizes technology and innovation: Wizeline is a talent and software-as-a-service company for application design and development. We help enterprises develop and release intelligent software solutions. We tear down barriers to bring Silicon Valley opportunities and practices to people around the world. Anyone can innovate with Wizeline.