System Architect/Senior Consultant (Hortonworks)

Locations:

Paris, France; Munich/Frankfurt, Germany; London, UK; Stockholm, Sweden; or Copenhagen, Denmark

Brief Description:

The System Architect/Senior Consultant is part of the Professional Services organisation, working across a diverse range of industries and projects to enable our customers on their Big Data journey. You will engage with customers from the Proof of Concept (POC) stage through to the implementation of complex distributed production environments, working collaboratively with them to optimize performance, develop reference architectures, and form part of a team that fosters a long-standing relationship with our customer group.

Responsibilities:

  • Drive POCs with customers to successful completion
  • Analyze complex distributed production deployments, and make recommendations to optimize performance
  • Help develop reference Hadoop architectures and configurations
  • Write technical documentation and knowledge base articles
  • Work directly with prospective customers’ technical resources to devise and recommend solutions based on the understood requirements
  • Participate in the pre- and post-sales process, helping both the sales and product teams to interpret customers’ requirements
  • Work closely with Hortonworks teams at all levels to ensure rapid response to customer questions and project needs
  • Play an active role within the Open Source Community

 

Qualifications:

  • More than five years of Professional Services (customer-facing) experience architecting large-scale storage, data center, and/or globally distributed solutions
  • 2+ years designing and deploying 3-tier architectures or large-scale Hadoop solutions
  • Experience working with Apache Hadoop, including knowledge of how to create and debug Hadoop jobs
  • Ability to understand big data use cases and recommend standard design patterns commonly used in Hadoop-based deployments
  • Ability to understand and translate customer requirements into technical requirements
  • Experience implementing data transformation and processing solutions using Apache Pig
  • Experience designing queries against data stored in HDFS using tools such as Apache Hive
  • Experience implementing MapReduce jobs
  • Experience setting up multi-node Hadoop clusters
  • Strong experience implementing software and/or solutions in the enterprise Linux or Unix environment
  • Strong understanding of the Java ecosystem and enterprise offerings, including debugging and profiling tools (e.g. JConsole), logging and monitoring tools (e.g. Log4j, JMX), and security offerings (Kerberos/SPNEGO)
  • Familiarity with scripting tools such as Bash shell scripts, Python, and/or Perl
  • Demonstrable experience using R and the algorithms provided by Mahout
  • Hortonworks Certifications are an advantage but not essential

 

Job posted 10/6/2016