Data Pipeline Engineer

Job description

Smart, scalable, secure data pipelines are the lifeblood of the Nexla product. Come join a team of experienced entrepreneurs and seasoned engineers as Nexla continues to grow quickly and gain traction as an inter-company Data Operations platform.


Requirements
  • 5+ years of overall software development experience, with a minimum of 3 years working with core Java backend technologies
  • A strong understanding of algorithms and data structures, and their performance characteristics
  • Proficiency in working and developing on Linux
  • Experience supporting operations teams with deployments and debugging production issues
  • Experience responding to feature requests, bug reports, performance issues, and ad-hoc questions
  • Great interpersonal, written, and verbal communication skills, including the ability to create technical specifications, debate technical tradeoffs, and explain technical concepts to business users
  • Bachelor’s degree in Computer Science or Engineering (or equivalent professional experience)

Background and Skills

  • 2+ years of experience working with Hadoop, Kafka, HBase, or MapReduce in Java
  • Experience building stream processing systems with Spark, Storm, or other streaming technologies
  • Experience building back-end systems for an Internet startup or technology company
  • Experience with cloud infrastructure such as Amazon Web Services or Google Compute Engine
  • Experience with NoSQL data stores

Please note that we are only able to consider candidates who have a US work permit (Citizen, Green Card, EAD, H1B, etc.) and can work in the San Francisco Bay Area.