
Job Title:  Hadoop Developer
Description of Work:  
  • Provide technical and development support to the government client to build and maintain a modernized Enterprise Data Warehouse (EDW) by expanding the current on-premises Hadoop cluster to accommodate an increased volume of data flowing into the enterprise data warehouse.
  • Perform data formatting, including cleaning up the data.
  • Assign schemas and create HIVE tables.
  • Apply other HDFS formats and structures (Avro, Parquet, etc.) to support fast data retrieval, user analytics, and analysis.
  • Assess the suitability and quality of candidate data sets for the Data Lake and the EDW.
  • Design and prepare technical specifications and guidelines.
  • Act as self-starter with the ability to take on complex projects and analyses independently.
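As an illustration of the Hive and storage-format duties listed above, a minimal sketch of assigning a schema and creating a Parquet-backed Hive table might look like the following. The database, table, column names, and HDFS path are all hypothetical examples, not part of the actual role:

```sql
-- Hypothetical example: external Hive table with an assigned schema,
-- stored as Parquet in HDFS for fast columnar retrieval
CREATE EXTERNAL TABLE IF NOT EXISTS edw.customer_events (
  event_id    BIGINT,
  customer_id STRING,
  event_ts    TIMESTAMP,
  payload     STRING
)
PARTITIONED BY (event_date DATE)   -- partitioning supports selective scans
STORED AS PARQUET
LOCATION '/data/edw/customer_events';
```

A similar `STORED AS AVRO` clause would produce an Avro-backed table where row-oriented storage is a better fit.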
Basic Qualifications:  
  • Master's degree and 12 years of related work experience; Bachelor's degree in computer science or a related field and 14 years of related work experience; or 18 years of related work experience in lieu of a degree.
  • 2+ years of proven experience with a range of big data architectures and frameworks, including the Hadoop ecosystem, Java MapReduce, Pig, Hive, Spark, Impala, etc.
  • 2 years of proven experience working with, processing, and managing large data sets (multi-TB scale).
  • Proven experience with ETL tools (Syncsort DMX-h, Ab Initio, IBM InfoSphere Data Replication, etc.) and mainframe skills, including JCL.