Introduction
Join our dynamic team as a key contributor to the Financial Crimes Technology group, focusing on developing, enhancing, and maintaining applications that support AML and surveillance initiatives. This role demands deep data engineering expertise, strong business engagement skills, and the ability to thrive in a fast-paced Agile environment.
Top 3 Core Skills (Must-Have)
- Teradata Development (16.x) – Advanced SQL / BTEQ scripting, TPT, performance tuning, and optimization.
- ETL / Data Integration – Informatica 10.x, Autosys, strong experience in large-scale data sourcing, migrations, and conversions.
- Unix & Scripting – Strong proficiency in UNIX shell scripting for automation and integration.
Required Skills & Qualifications
- Strong Data Sourcing, Modeling & Provisioning: Expertise in large-scale AML / Financial Crimes monitoring and surveillance data assets.
- Teradata (ETL & Data Integration Expertise): Proficiency in Teradata 16.x, SQL / BTEQ scripting, TPT, Informatica 10.x, Autosys, UNIX shell scripting.
- Big Data & Cloud Platforms: Experience with Cloudera CDP, Hadoop (Hive, Impala, PySpark, Scala), and proven experience in data migrations, conversions, and performance tuning.
- 10 years of IT / Data Engineering experience in financial services or a related industry.
- Strong hands-on experience in Teradata, Informatica, SQL performance tuning, and data migrations.
- Advanced skills in the Hadoop ecosystem – Hive, Impala, PySpark, Scala.
- Proven ability to deliver in Agile / Scrum environments with experience in full SDLC and CI / CD pipelines.
- Strong knowledge of Unix / Linux scripting and automation.
- Excellent communication and stakeholder management skills.
- Ability to manage distributed teams and coordinate across global environments.
Preferred Skills & Qualifications
- AML / Financial Crimes domain knowledge.
Day-to-Day Responsibilities
- Develop complex requirements and deliver end-to-end application enhancements in the Financial Crimes / AML domain.
- Perform data analysis, sourcing, and modeling to build innovative data provisioning models.
- Participate in story refinement, define requirements, and estimate delivery effort.
- Contribute to design, coding, testing, debugging, and documentation of programs.
- Perform proof-of-concept (POC) work to test new ideas and mitigate risks.
- Set up and automate CI / CD pipelines using Jira, Bitbucket, Jenkins, Artifactory, and Ansible.
- Work closely with Production Support, Platform teams, and Business Partners on change management, platform upgrades, and integration with upstream / downstream applications.
- Ensure compliance with Enterprise Change Standards and promote data governance best practices.
- Engage with business stakeholders to gather requirements, define success measures, and support ongoing enhancements.
Company Benefits & Culture
- Inclusive and diverse work environment.
- Opportunities for professional growth and development.
- Supportive and collaborative team culture.
For immediate consideration, please click APPLY.