About the Company
TD Securities is a leading provider of capital markets products and services, offering innovative solutions to corporate, government, and institutional clients. With over 6,900 professionals across 32 cities globally, TD Securities has been at the forefront of executing complex transactions and providing unparalleled client service. The firm continues to drive growth through a wide range of financial services, including underwriting, investment banking, and transaction banking.
About the Role
The Data Engineer will play a critical role in designing and implementing scalable data pipelines for TD Securities’ data-driven applications. This role involves working with cutting-edge technologies such as Hadoop, Apache Spark, and AWS, building real-time data processing solutions, and supporting key business objectives. The successful candidate will collaborate with various teams to optimize data workflows and ensure data quality and integrity.
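To give candidates a concrete sense of the pipeline work described above, here is a minimal batch ETL sketch in PySpark. It is illustrative only: the bucket, paths, and column names (e.g. `s3a://example-bucket/raw/trades/`, `trade_id`) are hypothetical placeholders, not details of TD Securities' actual systems.

```python
# Minimal batch ETL sketch in PySpark. Bucket, path, and column names
# are hypothetical placeholders for illustration only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("trades-batch-etl").getOrCreate()

# Extract: read raw CSV data from S3 (hypothetical bucket/path).
raw = (
    spark.read
    .option("header", True)
    .option("inferSchema", True)
    .csv("s3a://example-bucket/raw/trades/")
)

# Transform: basic deduplication, filtering, and enrichment.
clean = (
    raw.dropDuplicates(["trade_id"])                     # hypothetical key column
       .filter(F.col("notional") > 0)                    # drop invalid rows
       .withColumn("trade_date", F.to_date("trade_ts"))  # derive a partition column
)

# Load: write partitioned Parquet back to S3.
(
    clean.write
    .mode("overwrite")
    .partitionBy("trade_date")
    .parquet("s3a://example-bucket/curated/trades/")
)

spark.stop()
```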
Responsibilities
- Design and implement data pipelines using Scala, Python, Apache Spark, and AWS.
- Optimize batch and streaming data architectures to handle high-volume, high-velocity data (a streaming sketch follows this list).
- Collaborate with data scientists, analysts, and engineering teams to deliver effective data solutions.
- Develop and manage ETL processes for large-scale data integration and transformation.
- Implement data governance practices, ensuring compliance and quality standards.
- Work with DevOps teams to deploy CI/CD pipelines for efficient application delivery.
- Perform performance tuning and benchmarking of data applications.
- Contribute to the development and maintenance of infrastructure tools.
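As referenced in the streaming responsibility above, the following is a minimal Spark Structured Streaming sketch reading from Kafka. The broker address, topic name, schema, and paths are hypothetical, and it assumes Spark's Kafka integration package is on the classpath.

```python
# Minimal streaming sketch: Kafka -> Spark Structured Streaming -> Parquet.
# Broker, topic, schema, and paths are hypothetical; assumes the
# spark-sql-kafka integration package is available.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.appName("trades-streaming").getOrCreate()

# Hypothetical schema for JSON messages on the topic.
schema = (
    StructType()
    .add("trade_id", StringType())
    .add("notional", DoubleType())
    .add("trade_ts", TimestampType())
)

# Read raw events from a Kafka topic and parse the JSON payload.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "trades")                     # hypothetical topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Continuously append parsed events as Parquet, with checkpointing
# so the job can recover from where it left off after a restart.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3a://example-bucket/streaming/trades/")
    .option("checkpointLocation", "s3a://example-bucket/checkpoints/trades/")
    .outputMode("append")
    .start()
)

query.awaitTermination()
```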
Required Skills
- 5+ years of experience in data engineering or related roles.
- Strong expertise in Scala, Python, Apache Spark, and AWS.
- Hands-on experience with Hadoop, Kafka, Hive, and NoSQL databases.
- Proficiency in ETL/ELT pipeline design and implementation.
- Strong background in data modeling, data warehousing, and data migration.
- Experience with cloud technologies (AWS, GCP, or Azure).
- Knowledge of CI/CD practices and tools such as Git and Jenkins, plus workflow orchestration with Airflow (an orchestration sketch follows this list).
- Strong problem-solving skills and the ability to communicate technical concepts clearly.
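For the orchestration tooling mentioned above, a minimal Airflow DAG sketch that schedules a nightly Spark job might look like the following. The DAG id, schedule, and submitted script path are hypothetical, and it assumes Airflow 2.4+ (where `schedule` replaced `schedule_interval`).

```python
# Minimal Airflow DAG sketch: schedule a nightly spark-submit run.
# DAG id, schedule, and the submitted script path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="trades_batch_etl",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_etl = BashOperator(
        task_id="spark_submit_etl",
        bash_command="spark-submit /opt/jobs/trades_batch_etl.py",  # hypothetical path
    )
```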
Preferred Qualifications
- Experience with Apache Kafka or Confluent.
- Familiarity with machine learning and data analytics tools.
- Knowledge of real-time analytics and event-driven systems.
- Experience working in the financial services industry.
- Familiarity with cloud data platforms such as Google BigQuery or Azure Data Lake Storage (ADLS).
- Proficiency with data governance and security best practices.