Senior Hadoop Developer

About the Company

GuideWell and its subsidiaries, including Florida Blue, provide health solutions and services to Floridians. As a not-for-profit organization, GuideWell is dedicated to improving health outcomes for residents and businesses across Florida. Florida Blue is a member of the Blue Cross and Blue Shield Association and works alongside Magnit as its Managed Service Provider (MSP) and Employer of Record (EOR) to bring top talent to its teams.

About the Role

We are seeking a Senior Java and Hadoop Developer for a senior position on a fast-growing technology team. This role focuses on developing and maintaining scalable data solutions within the company's technology platform. The ideal candidate will have deep experience in Java, Hadoop, and related technologies, and the ability to collaborate with cross-functional teams to deliver high-quality results.

This is a contract role, with the opportunity to work in a dynamic, evolving company.

Responsibilities

  • Design, develop, and maintain real-time data processing systems with low latency and high throughput.
  • Develop batch processing systems to ensure reliable data processing.
  • Build and maintain data ingestion and integration pipelines, ensuring data quality and integrity.
  • Design and implement efficient and reliable data storage and retrieval systems.
  • Work closely with teams to ensure seamless integration of features and systems.
  • Thoroughly test, debug, and document all developed features.
  • Maintain the integrity of internal applications and ensure continuous improvements.
  • Develop and deliver scalable solutions within the Hadoop ecosystem using HDFS, Spark, Hive, etc.

Required Skills

  • 5+ years of experience with Java and the Hadoop ecosystem (HDFS, Spark, Hive).
  • Proficiency in the Java programming language and Java-based frameworks (e.g., Spring, Hibernate).
  • Strong experience with Spark (Scala/Java) for data processing.
  • Knowledge of SQL, NoSQL databases (e.g., MongoDB, Cassandra), and data modeling.
  • Experience with batch processing and real-time data systems.
  • Familiarity with data warehousing concepts and ETL tools (e.g., Informatica, Talend).
  • Strong understanding of object-oriented programming (OOP) and software design patterns.
  • Ability to use version control tools (e.g., Git) and perform CI/CD tasks.
  • Familiarity with RESTful APIs, SOAP, and web service development.
  • Excellent problem-solving, analytical, and troubleshooting skills.
  • Ability to collaborate with teams and communicate technical concepts effectively.

Preferred Qualifications

  • Experience with Agile, Waterfall, or hybrid development methodologies.
  • Familiarity with microservices architecture, design patterns, and API styles such as REST and gRPC.
  • Experience with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
  • Previous experience in the health care industry is a plus.
  • Experience with Linux-based systems and related development tools.
