Senior Hadoop Engineer

Galent

AI-native digital engineering solutions that empower businesses with next-gen innovation, scalability, and transformation.

About the Company

Galent is an AI-native digital engineering firm with 60,000 professionals operating from more than 50 global delivery centers. Positioned at the forefront of the AI revolution, Galent helps enterprises integrate advanced technologies and unlock new growth opportunities. Through scalable AI-enabled solutions, digital engineering, and consulting, the company optimizes processes, enhances productivity, and enables organizations to thrive in a rapidly evolving digital world.

About the Role

Galent is seeking a Senior Hadoop Engineer to design and deliver enterprise-scale big data solutions. This hybrid role is available in Foster City (CA), Atlanta (GA), Houston (TX), and Dallas (TX). The position involves building and optimizing data pipelines, collaborating with stakeholders, and applying deep technical expertise to solve complex challenges across data platforms.

Responsibilities

  • Design, develop, test, and maintain large-scale data solutions and multi-tier software.
  • Lead the full software lifecycle from architecture to deployment.
  • Debug and optimize Spark jobs, ensuring scalability and high performance.
  • Collaborate with cross-functional teams and stakeholders on data-driven solutions.
  • Deliver high-quality, reusable, and well-tested code.
  • Drive feature delivery with strong problem-solving and system design skills.

Required Skills

  • 10+ years of hands-on software development experience.
  • Strong knowledge of algorithms, data structures, OOP, and design patterns.
  • Expertise in multi-threaded programming, debugging, and analytics.
  • Proficiency in SQL (BigQuery, Hive, Spark SQL).
  • Skilled in Python, PySpark, and Big Data tools (Hadoop, Hive, Airflow).
  • Google Cloud Platform (GCP) development experience.
  • Strong architectural skills for end-to-end data pipelines.
  • Proficiency in Unix and GitHub.

Preferred Qualifications

  • Experience in client and stakeholder management.
  • Proven leadership of offshore teams and delivery management.
  • Relevant certifications in big data or GCP.
  • Familiarity with unit testing frameworks for data pipeline validation (e.g., Go's built-in testing package).
