Hadoop Architect

About the Company

Saicon Consultants, Inc. is a leading IT consulting firm with over 20 years of experience providing IT professional services to Fortune 500 companies nationwide. With a focus on high-quality, cost-effective solutions, Saicon offers individualized attention to meet client needs. As a certified Minority/Women-Owned Enterprise and SBA 8(a) certified, Saicon is committed to delivering exceptional IT consulting services, specializing in ERP, client-server development, and administration.

About the Role

Saicon is seeking a Senior Hadoop Architect to lead and optimize Hadoop ecosystem architecture for clients, ensuring the efficient management of big data platforms. The role requires a deep understanding of Hadoop technologies and a strong background in cloud migrations. The ideal candidate will be responsible for overseeing the implementation and management of Hadoop platforms, working with cutting-edge tools like Hive, Spark, and Azure, and mentoring junior developers.

Responsibilities

  • Architect, implement, and manage on-prem Hadoop ecosystems and transitions to cloud environments.
  • Oversee the installation, configuration, and tuning of Hadoop platforms, ensuring high performance and scalability.
  • Migrate Hadoop workloads to the cloud, ensuring smooth and secure transitions.
  • Work with technologies like Hive, HBase, Spark, HDFS, YARN, MapReduce, and Jupyter for data processing.
  • Lead and mentor junior developers, ensuring high-quality, zero-defect project delivery.
  • Conduct code reviews and ensure adherence to development best practices.
  • Collaborate with cross-functional teams to integrate Hadoop solutions effectively within the organization.
  • Troubleshoot, resolve, and optimize Hadoop-related performance issues.
  • Provide strategic guidance for hosting big data workloads on cloud platforms such as Azure and Snowflake.

Required Skills

  • 10+ years of experience in the Hadoop ecosystem, particularly with Hadoop, Hive, Spark, HBase, HDFS, YARN, and MapReduce.
  • Strong experience in cloud migrations, particularly for Hadoop workloads to cloud platforms like Azure, Databricks, and Snowflake.
  • Hands-on experience with ETL tools and Jupyter for data analysis.
  • Proficiency in Linux fundamentals and data analytics tools such as Tableau.
  • Agile development methodology experience is a plus.
  • Strong analytical and problem-solving skills with the ability to troubleshoot complex issues.
  • Proven experience in leading code reviews and mentoring junior developers.
  • Excellent communication and leadership skills.
  • Ability to deliver projects that meet high quality standards with zero defects.

Preferred Qualifications

  • Background in leading cloud migrations and optimizing cloud-based big data platforms.
  • Previous experience in mentoring junior engineers and leading teams.
