Senior Hadoop Software Engineer

eBay

We connect people and build communities to create economic opportunity for all.

About the Company

eBay is a global leader in ecommerce, revolutionizing the way people shop and sell. Operating in over 190 markets, eBay connects millions of buyers and sellers, making it a platform where authenticity thrives and bold ideas are encouraged. The company’s mission is to create economic opportunity for all through innovative, data-driven solutions.

About the Role

The Hadoop team at eBay is integral to the company’s data infrastructure. The role involves overseeing and enhancing Hadoop-related projects to meet the demands of eBay’s massive data scale. The focus will be on building customer-facing tools, improving system performance, and ensuring seamless integration with other systems. This role offers an opportunity to directly influence eBay’s data strategy and drive innovation across the organization.

Responsibilities

  • System Optimization: Oversee and enhance Hadoop-related projects, optimizing for system availability and scalability to accommodate eBay’s growing data demands.
  • High-Impact Project Leadership: Lead initiatives to improve data processing capabilities, contributing to eBay’s competitive advantage.
  • Customer-Facing Tools Development: Develop tools that improve user experience and operational efficiency, enabling stakeholders to better leverage data insights.
  • System Integration: Ensure seamless integration of Hadoop with other platforms to foster a cohesive data ecosystem and improve decision-making.
  • Innovation: Drive continuous improvement in data accessibility, management, and operational performance.

Required Skills

  • Bachelor’s degree in Computer Science, Information Technology, or a related field.
  • Programming skills in Java, Scala, or Python, with an understanding of common algorithms and data structures.
  • Experience with Hadoop and associated frameworks for large-scale data processing.
  • Familiarity with Linux/Unix systems and system commands.
  • Analytical skills for diagnosing complex distributed-system problems and optimizing solutions.
  • Knowledge of networking fundamentals relevant to Hadoop’s distributed environment.

Preferred Qualifications

  • Experience with big data technologies and frameworks (Hadoop, Hive, Spark, etc.).
  • Contributions to open-source projects related to big data.
  • Familiarity with cloud environments (AWS, GCP, Azure).

Copyright © 2025 hadoop-jobs. All Rights Reserved.