Big Data Software Engineer (DataRobot)

DataRobot is looking for experienced software engineers to join our team working on Big Data problems. As a team member, you will work on strengthening the DataRobot application to scale to new heights. The ideal candidate should have experience in distributed computing and storage architectures and be able to think at scale.
Skills / Requirements
  • 5+ years of experience building large-scale, highly available distributed computing systems
  • Bachelor’s degree in Computer Science, Engineering, or related field
  • Ability to design and produce high-quality, high-performance code that is ready to ship
  • Strong Python programming skills
  • Ability to solve performance and scalability problems
  • Strong knowledge of data structures, algorithms, and complexity analysis
  • Experience working in GNU/Linux environments
  • Understanding of software design principles
  • Strong oral and written communication skills
  • Ability to provide leadership to other team members
  • Hands-on experience with Big Data technologies (e.g., Hadoop MapReduce, Spark, Hive, Vertica, Netezza, Greenplum, Aster)

Bonus

  • PhD or Master’s degree in Computer Science, Engineering, or related field
  • Familiarity with the internals of distributed data processing engines such as Spark, Tez, or Dryad
  • Experience with performance optimization and implementing high-performance code
  • Experience developing fault-tolerant systems
  • Knowledge of cloud infrastructure (e.g., EC2, S3)

Job posted 2/9/2016