Software Engineer – Hadoop Big Data Platform

  • Full Time
  • Kraków
Vertex Agility

We provide agile, on-demand tech teams.

About the Company

Vertex Agility operates a Hadoop-based big data platform that supports advanced data processing and analytics.

About the Role

We are looking for an experienced Software Engineer to contribute to the development and automation of tools for an on-premises Hadoop environment. This hybrid position is based in Kraków and requires office attendance approximately six days per month. Strong English communication skills at B2 level are required.

Responsibilities

  • Develop and enhance features within the Hadoop ecosystem, specifically the Cloudera distribution.
  • Automate platform onboarding, access management, and maintenance involving Linux, HDFS, Kerberos, and Spark.
  • Create internal APIs, user interfaces, and containerized services.
  • Optimize Spark jobs and manage CI/CD workflows (see the sketch after this list).
  • Collaborate with engineering and architecture teams to ensure platform scalability and resilience.
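
To illustrate the kind of Spark optimization work mentioned above, the sketch below shows a minimal PySpark job that tunes shuffle parallelism and repartitions a dataset before writing it back to HDFS. The application name, paths, and partition counts are hypothetical placeholders, not details of the actual platform.

    # Minimal PySpark sketch: read a dataset, tune shuffle parallelism,
    # and rewrite it with an explicit partition count. All paths and
    # settings are hypothetical placeholders.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("example-optimization-job")            # placeholder name
        .config("spark.sql.shuffle.partitions", "200")  # shuffle tuning
        .getOrCreate()
    )

    # Read an example Parquet dataset from HDFS.
    df = spark.read.parquet("hdfs:///data/example/events")

    # Repartition to control the number and size of output files.
    df.repartition(64).write.mode("overwrite").parquet(
        "hdfs:///data/example/events_optimized"
    )

    spark.stop()

In practice, shuffle and output partition counts would be derived from actual data volumes rather than fixed numbers.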

Required Skills

  • Minimum 5 years of experience with Big Data technologies, particularly the Cloudera stack.
  • In-depth knowledge of Hadoop ecosystem tools including Hive, Spark, Kafka, YARN, HDFS, and ZooKeeper.
  • Proficiency in Linux, shell scripting, Apache Ranger/Knox, and Kerberos (see the sketch after this list).
  • Experience with Jenkins, Ansible, Agile methodologies, and real-time data processing tools.
  • Familiarity with S3-compatible storage solutions such as VAST is advantageous but not mandatory.
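
As a rough illustration of the Kerberos and HDFS scripting skills listed above, the sketch below obtains a Kerberos ticket from a keytab and then lists an HDFS directory from Python. The keytab path, principal, and HDFS path are hypothetical placeholders.

    # Minimal sketch of scripting against a Kerberized HDFS cluster.
    # Keytab, principal, and path are hypothetical placeholders.
    import subprocess

    KEYTAB = "/etc/security/keytabs/svc_platform.keytab"
    PRINCIPAL = "svc_platform@EXAMPLE.COM"

    # Obtain a Kerberos ticket non-interactively using the keytab.
    subprocess.run(["kinit", "-kt", KEYTAB, PRINCIPAL], check=True)

    # List an HDFS directory with the acquired ticket.
    listing = subprocess.run(
        ["hdfs", "dfs", "-ls", "/data/example"],
        check=True, capture_output=True, text=True,
    )
    print(listing.stdout)

In a real automation flow, steps like these would typically be wrapped in Ansible tasks or an internal API rather than run as ad-hoc scripts.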
