The document describes requirements for a Big Data Senior Developer role. It seeks candidates with 4-6 years of experience, including 2+ years working with big data technologies. Required skills include strong development experience with Hadoop, MapReduce, Hive, Pig and Impala. Preferred experience includes working with cloud computing platforms and APIs. The role requires technical skills in HDFS, MapReduce, Hive, HBase, Pig and other big data processing and streaming tools.
Big Data Experience: 2+ years

Role Description

Professional Experience Required
- Experience in handling huge volumes of data in a multi-node cluster environment.
- Strong development skills across the Hadoop ecosystem: MapReduce, Hive, Pig, Impala, Spark.
- Strong understanding of Hadoop internals, file formats such as Avro and JSON, and the available compression codecs.
- Ability to develop user-defined functions (UDFs) to provide custom Hive and Pig capabilities (a brief sketch follows the preferred-experience list below).
- Knowledge of Core Java.
- Ability to understand requirements and convert them into solutions.
- Familiarity with data warehousing, ETL, BI, visualization tools, and machine learning algorithms.
- Excellent problem solving, hands-on coding, and communication skills.
- Insight into the ways Big Data is affecting industry, and knowledge of best practices.

Professional Experience Preferred
- Knowledge of or experience with cloud computing infrastructure (e.g., Amazon Web Services EC2, Elastic MapReduce, Azure).
- Strong API experience.
- Cloudera, Hortonworks, or MapR certification is good to have (not mandatory).
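To make the UDF requirement above concrete, here is a minimal sketch of the kind of custom Hive UDF the role involves, written in Core Java against the classic org.apache.hadoop.hive.ql.exec.UDF base class. The class name NormalizeText and its behavior are hypothetical illustrations, not part of this role description.

import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

// Hypothetical example UDF: trims and lower-cases a string column so it
// can be called from HiveQL like a built-in function. Illustrative only.
public final class NormalizeText extends UDF {
    public Text evaluate(Text input) {
        // Hive passes NULL column values to the UDF as null; preserve them.
        if (input == null) {
            return null;
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}

Once packaged into a jar, such a function would be registered and invoked from HiveQL roughly as follows (jar path, function name, and table are placeholders):

ADD JAR /path/to/udfs.jar;
CREATE TEMPORARY FUNCTION normalize_text AS 'NormalizeText';
SELECT normalize_text(customer_name) FROM customers;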
Technical Skills Required
Any combination of the below technical skills:
Big Data : HDFS, MapReduce, Hive, HBase, Pig, Mahout, Avro, Oozie, Flume, Sqoop
Processing & Streaming : Spark, Storm, Samza, Kafka
NoSQL : Cassandra, MongoDB, HBase
Languages : Java, Scala, Shell Scripting, Perl/Python/PHP, Golang