20 Years of Connecting People with the Right Jobs
About the Company
Techgene, established in 2002 and ISO 9001:2008 certified, specializes in delivering innovative mobility solutions for enterprises and the consumer sector. Headquartered in Irving, Texas, with a state-of-the-art development center in Hyderabad, India, Techgene offers high-quality expertise in R&D and IT across web and mobile platforms. The company has successfully delivered solutions in verticals such as Healthcare, Education, Transportation, Telecom, and Utilities. With notable clients including AIG, Cisco, Dell, IBM, Microsoft, and Verizon, Techgene is committed to providing world-class mobile-enabled solutions tailored to each client's needs.
About the Role
Techgene is seeking a Senior Hadoop Developer with 10+ years of experience to work onsite in Omaha, Nebraska, for a long-term contract. This role involves leveraging extensive knowledge in Hadoop ecosystem tools and platforms to manage and optimize data solutions. The ideal candidate will have expertise in Hadoop, Hive, Spark, HBase, and related technologies, as well as experience working in both on-prem and cloud environments, particularly with Azure, Databricks, and Snowflake.
Responsibilities
- Design, develop, and manage Hadoop ecosystem architectures, focusing on Hadoop, Hive, HBase, and HDFS.
- Optimize big data processing frameworks, including MapReduce and YARN, to enhance performance and scalability.
- Utilize ETL tools and data analytics platforms like Tableau to integrate and analyze large datasets.
- Migrate on-prem Hadoop workloads to cloud environments, specifically Azure and Databricks.
- Conduct code reviews and mentor junior developers to ensure the delivery of high-quality solutions.
- Collaborate with cross-functional teams to troubleshoot issues and keep production systems running efficiently.
- Maintain a strong focus on delivering defect-free, high-quality solutions in production environments.
Required Skills
- 10+ years of hands-on experience with the Hadoop ecosystem, including Hadoop, Hive, HBase, and HDFS.
- Strong understanding of Spark, YARN, MapReduce, and Jupyter.
- Proficiency with Linux-based systems and command-line tools.
- In-depth experience with big data ETL tools and integration frameworks.
- Expertise in data analytics using tools such as Tableau.
- Strong problem-solving and analytical skills, with close attention to detail.
- Experience with cloud hosting and migration of Hadoop workloads to cloud platforms (Azure, Databricks, Snowflake).
- Ability to lead code reviews and mentor junior developers.
- Knowledge of Agile development methodologies.
Preferred Qualifications
- Excellent communication skills and the ability to work collaboratively in cross-functional teams.
- Background in delivering projects with a "first-time-right" approach to ensure zero defects in production.