An expert with 4-5 years of experience in the Hadoop ecosystem, cloud platforms (AWS/Azure), relational data stores, data integration, XML, Python, Spark, and ETL techniques.

Key skills:

  • Experience working with in-memory computing using R, Python, Spark, PySpark, Kafka, and Scala.
  • Experience parsing and shredding XML and JSON, shell scripting, and SQL.
  • Experience working with the Hadoop ecosystem – HDFS, Hive.
  • Experience working with the AWS ecosystem – S3, EMR, EC2, Lambda, CloudFormation, CloudWatch, SNS/SQS.
  • Experience with Azure – Azure Data Factory (ADF).
  • Experience working with SQL and NoSQL databases.
  • Experience designing and developing data-sourcing routines using typical data quality functions: standardization, transformation, rationalization, linking, and matching.


Four Positions, Immediate Hire (3-4 Weeks)

Employment Type

W-2, C2C

Total Compensation

$45/hr on W-2; $55/hr on C2C


Location

Starkville, MS | Dover, DE | Minneapolis, MN | Off-shore Remote: India

Work Authorization

H1, GC, US Citizen

