We develop high-value technology that inspires and drives companies' digital progress
About the Company
Second Window is a leading Spanish IT services company focused on digital transformation, offering cutting-edge technological solutions to help businesses overcome emerging challenges. The company specializes in Data & Analytics, Cloud & DevOps, Full Stack Development, and Automation, working with various sectors such as finance, public services, industry, and telecommunications. Second Window is committed to fostering professional excellence in its talent while empowering clients to achieve business success.
About the Role
As a Big Data Developer at Second Window, you will contribute to building and optimizing big data solutions for clients. This remote position is ideal for candidates with experience in Hadoop ecosystems and cloud platforms, capable of designing and implementing data-driven solutions. You will focus on enhancing the scalability, performance, and reliability of data processes, using tools like Hadoop, Spark, and cloud services such as AWS, Azure, and GCP.
Responsibilities
- Develop, maintain, and optimize big data solutions using Hadoop, Spark, and Hive.
- Implement and manage ETL processes to extract, transform, and load large datasets.
- Work with cloud platforms (AWS, Azure, GCP) to implement big data solutions.
- Write clean, maintainable, and efficient code in languages such as Python, Scala, or Java.
- Collaborate with cross-functional teams to design and integrate solutions into existing architectures.
- Perform data analysis and optimization to improve system performance.
- Participate in designing and implementing cloud-native solutions for big data.
- Create and maintain technical documentation and user guides.
- Contribute to continuous improvement in processes, tools, and technologies.
Required Skills
- Minimum 4 years of experience as a Big Data Developer, Data Engineer, or similar role.
- Strong expertise in Hadoop, Spark, Hive, and related big data frameworks.
- Proficiency in programming languages such as Python, Scala, or Java.
- Hands-on experience with complex SQL queries and relational databases.
- Familiarity with cloud platforms like AWS, Azure, and GCP.
- Knowledge of distributed computing algorithms and systems.
- Strong problem-solving skills and the ability to optimize large-scale data processes.
- Good communication skills and ability to collaborate effectively in a remote environment.
Preferred Qualifications
- Master's Degree in Computer Science, Engineering, or related field.
- Experience with cloud-native big data services such as AWS EMR or GCP Dataproc.
- Experience with data visualization tools like Tableau.
- Familiarity with Docker, Kubernetes, and other containerization technologies.
- Exposure to machine learning and AI applications in big data environments.
- Knowledge of Data Warehousing and Business Intelligence systems.