Hadoop Developer

Apex Systems

Value Reimagined™

About the Company

Apex Systems, part of ASGN Inc., is the second-largest IT staffing agency in the U.S. With a strong focus on IT services, Apex serves thousands of clients globally and is known for its dedication to innovative solutions and for fostering a culture of collaboration, continuous learning, and excellence. As part of a world-class IT team, employees benefit from career resources, training, and development opportunities.

About the Role

Apex Systems is seeking a Data Engineer (Hadoop) for a hybrid position based in Iselin, NJ. In this role, you will design and optimize scalable data solutions using Hadoop, Spark, and AWS, play a key role in migrating structured and unstructured data across cloud platforms, and improve real-time data processing workflows.

Responsibilities

  • Design, optimize, and maintain Hadoop-based data solutions.
  • Work hands-on with ETL tools such as AWS Glue and PySpark for data transformation and migration.
  • Collaborate with cross-functional teams to ensure seamless integration with third-party systems using REST, SOAP, FIX, and secure file transfer protocols.
  • Write clean, efficient, and reusable code in Python, Java, and Scala.
  • Automate workflows using CI/CD and orchestration tools such as Jenkins and Airflow.
  • Maintain and optimize CI/CD pipelines for continuous delivery.
  • Troubleshoot, debug, and perform unit testing to ensure high-quality deliverables.
  • Contribute to data architecture design, ensuring high-performance and scalability.

Required Skills

  • 3+ years of experience with Hadoop, Hive, Spark, and related big data technologies.
  • Proficiency in Python, Java, Scala, and PySpark for developing and maintaining ETL processes.
  • Strong experience with AWS and cloud-native solutions.
  • Experience implementing and optimizing CI/CD and orchestration pipelines using Jenkins and Airflow.
  • Hands-on experience with version control systems like Git.
  • Familiarity with RESTful APIs and third-party integration technologies.
  • Strong debugging and problem-solving skills.

Preferred Qualifications

  • Experience with cloud platforms like AWS, Azure, and GCP.
  • Knowledge of relational and NoSQL databases, including experience with Postgres and DynamoDB.
  • Familiarity with containerization tools such as Docker and Kubernetes.
  • Previous experience in the financial or banking sector.
  • Experience in managing large-scale data processing systems.

