About the Role:
We're seeking a Data Engineer (Big Data / Hadoop Developer) for a long-term contract (or contract-to-hire) role based in San Jose, CA. The ideal candidate must be local, as in-person interviews are required, and due to the sensitive nature of the work, visa-dependent candidates cannot be considered.
Key Focus Areas:
- Big Data engineering with a strong background in Snowflake, Hadoop, and Java.
- End-to-end development, from requirement gathering to deployment and bug fixing.
- Collaborating in Agile or Waterfall environments, using industry best practices in coding and data management.
Qualifications:
- 5+ years of software engineering experience.
- Degree in Computer Science, Engineering, or equivalent.
Required Skills:
- Expertise in Java/J2EE, SQL, Snowflake, Kafka, and big data tools.
- Strong grasp of data modeling, normalization, and dimensional design.
- Proficiency in Agile practices and test-driven development.
- Familiarity with tools and methodologies for optimizing data pipelines and storage systems.
Responsibilities:
- Translate complex requirements into scalable software solutions.
- Develop, test, and maintain high-performance data solutions.
- Produce detailed design specifications, perform code reviews, and mentor junior engineers.
- Troubleshoot technical issues and propose effective fixes.
- Stay current on emerging technologies and integrate them into projects when appropriate.
Additional Highlights:
- Opportunity to work in a technically advanced, fast-paced environment.
- Mentorship responsibilities for junior developers.
- Emphasis on collaboration, innovation, and best practices in data engineering.