Hadoop Architect

About the Company

Concentrix is a global leader in technology and services, specializing in transforming and modernizing customer experiences. With a focus on intelligent, tech-powered, and solution-oriented services, Concentrix has been recognized as one of the best workplaces. The company works with some of the world's top brands to provide outstanding service, drive business growth, and deliver unparalleled customer experiences.

About the Role

Concentrix is seeking an experienced Hadoop Architect to lead the design, architecture, and implementation of solutions within the Hadoop ecosystem. This role requires hands-on experience with Hadoop-related technologies, cloud platforms, and a deep understanding of big data management and migration. The ideal candidate will collaborate with various teams, mentor junior developers, and ensure that projects are delivered on time with top-tier quality and zero defects. This is an onsite position based in Omaha, NE.

Responsibilities

  • Architect, design, and implement solutions within the Hadoop ecosystem, including HDFS, YARN, MapReduce, Hive, Spark, and HBase.
  • Manage and optimize on-prem Hadoop infrastructure for scalability and performance.
  • Collaborate with cross-functional teams to understand and address business requirements.
  • Facilitate the migration of Hadoop workloads to cloud platforms (e.g., Azure, Databricks, Snowflake).
  • Perform code reviews and mentor junior developers to ensure quality deliverables.
  • Develop and maintain ETL pipelines using Jupyter notebooks and other ETL tooling.
  • Utilize Linux for system operations and maintenance.
  • Contribute to agile development processes and foster continuous improvement.
  • Ensure the delivery of projects with a focus on high-quality, defect-free outcomes.

Required Skills

  • 10+ years of experience in the Hadoop ecosystem.
  • Proficiency in Hadoop, Hive, Spark, HBase, HDFS, YARN, MapReduce, Jupyter, and ETL tools.
  • Basic knowledge of Linux operations.
  • Strong analytical and problem-solving skills.
  • Experience with cloud platforms (Azure, Databricks, Snowflake).
  • Excellent verbal and written communication skills.
  • Experience with agile development methodology.
  • Ability to conduct code reviews and mentor junior developers.

Preferred Qualifications

  • Familiarity with cloud migration practices.
  • Expertise in big data technologies like Hadoop, Spark, and Hive.
  • Experience working in a highly collaborative environment.
