Java / Big Data Architect

Job Description

Type : Full Time

Location : Noida

Experience Required : 8 Year(s)

Industry : IT-Software/Software Services

Preferred Skills : AWS, Big Data, Hadoop, HBase, HDFS, Java, NoSQL, SQL, ZooKeeper

Job Description :

The successful candidate will have 8+ years of experience implementing high-end software products.

- Provides technical leadership in the Big Data space (Hadoop stack: MapReduce, HDFS, Pig, Hive, HBase, Flume, Sqoop, etc.; NoSQL stores such as Cassandra and HBase) across engagements and contributes to open source Big Data technologies.

- Visualizes and evangelizes next-generation infrastructure in the Big Data space (batch, near real-time, and real-time technologies).

- Passionate about continuous learning, experimenting with, applying, and contributing to cutting-edge open source technologies and software paradigms.

- Expert-level proficiency in at least one programming language, Java in particular.

- Strong understanding of and experience with distributed computing frameworks, particularly Apache Hadoop 2.0 (YARN, MapReduce, and HDFS) and associated technologies: one or more of Hive, Sqoop, Avro, Flume, Oozie, ZooKeeper, etc.

- Hands-on experience with Apache Spark and its components (Streaming, SQL, MLlib); a minimal sketch appears after this list.

- Working knowledge of cloud computing platforms (AWS, especially the EMR, EC2, S3, and SWF services, and the AWS CLI).

- Experience working in a Linux environment and with command-line tools, including shell/Python scripting to automate common tasks.
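
To make the Spark expectation above concrete, here is a minimal sketch of a Spark SQL job in Java that reads JSON events from S3 and counts them per type. The bucket, paths, and column name are hypothetical placeholders; on EMR the cluster supplies the master configuration.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class EventCountJob {
    public static void main(String[] args) {
        // Spark session; on EMR the master (e.g. YARN) comes from the cluster config.
        SparkSession spark = SparkSession.builder()
                .appName("EventCountJob")
                .getOrCreate();

        // Read JSON events from S3 (bucket, path, and column name are hypothetical).
        Dataset<Row> events = spark.read().json("s3://example-bucket/events/2024/");

        // Count events per type with the Spark SQL DataFrame API.
        Dataset<Row> counts = events.groupBy("eventType")
                .count()
                .orderBy("eventType");

        // Write the aggregate back to S3 as Parquet.
        counts.write().mode("overwrite").parquet("s3://example-bucket/output/event_counts/");

        spark.stop();
    }
}
```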

What You Will Do:

- Evaluate and recommend the Big Data technology stack best suited to customer needs.

- Drive significant technology initiatives end to end, across multiple layers of the architecture.

- Provide strong technical leadership in adopting and contributing to open source Big Data technologies across multiple engagements.

- Design and architect complex, highly available, distributed, fail-safe compute systems that handle considerable volumes (GB/TB) of data (see the sketch after this list).

- Identify and incorporate non-functional requirements (performance, scalability, monitoring, etc.) into the solution.
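
As a simple illustration of the fail-safe design and non-functional concerns listed above, the sketch below shows bounded retries with exponential backoff and jitter around a remote call in Java. The helper names, attempt counts, and delays are hypothetical, not a prescribed implementation.

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ThreadLocalRandom;

public final class Retry {

    // Runs a remote call with bounded retries and exponential backoff plus jitter.
    // The attempt count and base delay are illustrative defaults, not recommendations.
    public static <T> T withBackoff(Callable<T> call, int maxAttempts, long baseDelayMs)
            throws Exception {
        Exception last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return call.call();
            } catch (Exception e) {
                last = e;
                if (attempt == maxAttempts) {
                    break;
                }
                // Exponential backoff with random jitter to avoid thundering herds.
                long delay = baseDelayMs * (1L << (attempt - 1))
                        + ThreadLocalRandom.current().nextLong(baseDelayMs);
                Thread.sleep(delay);
            }
        }
        throw last;
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical usage: wrap a flaky remote lookup in a bounded retry.
        String value = withBackoff(() -> fetchFromRemoteStore("user-42"), 5, 100);
        System.out.println(value);
    }

    // Stand-in for a call to a remote store (e.g. HBase, Cassandra); purely illustrative.
    private static String fetchFromRemoteStore(String key) {
        return "value-for-" + key;
    }
}
```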