Senior Data Engineer
C&L Group — White Plains, NY, US (Full-time)
NYPA's current on-premises Enterprise Resource Planning (ERP) system, SAP ECC 6.0, has been in use for over a decade and is nearing technological obsolescence, so NYPA must select and implement a new ERP system. The objective is to deploy a system that integrates all business functions, including finance, operations, and human resources, into a cohesive platform. The implementation aims to enhance organizational efficiency, improve data accuracy, and provide real-time reporting capabilities, with the broader goals of streamlining processes, reducing operational costs, and supporting informed decision-making across all departments.
Job Functions & Responsibilities
- Cloud Data Engineering & Integration : Design and implement data pipelines across AWS, Azure, and Google Cloud. Develop SAP BTP integrations with cloud and on-premises systems. Ensure seamless data movement and storage between cloud platforms.
- ETL & Data Pipeline Development : Develop and optimize ETL workflows using Pentaho, Microsoft Azure Data Factory (ADF), or equivalent ETL tools. Design scalable and efficient data transformation, movement, and ingestion processes. Monitor and troubleshoot ETL jobs to ensure high availability and performance.
- API Development & Data Integration : Develop and integrate RESTful APIs to support data exchange between SAP and other platforms. Work with API gateways and authentication methods such as OAuth, JWT, and API keys. Implement API-based data extractions and real-time event-driven architectures.
- Data Analysis & SQL Development : Write and optimize SQL queries, stored procedures, and scripts for data analysis, reporting, and integration. Perform data profiling, validation, and reconciliation to ensure data accuracy and consistency. Support data transformation logic and business rules for ERP reporting needs.
- Data Governance & Quality (Ataccama, Collibra) : Work with Ataccama and Collibra to define and enforce data quality and governance policies. Implement data lineage, metadata management, and compliance tracking across systems. Ensure compliance with enterprise data security and governance standards.
- Cloud & DevOps (AWS, Azure, GCP) : Utilize Azure DevOps and GitHub for version control, CI/CD, and deployment automation. Deploy and manage data pipelines on AWS, Azure, and Google Cloud. Work with serverless computing (Lambda, Azure Functions, Google Cloud Functions) to automate data workflows.
- Collaboration & Documentation : Collaborate with SAP functional teams, business analysts, and data architects to understand integration requirements. Document ETL workflows, API specifications, data models, and governance policies. Provide technical support and troubleshooting for data pipelines and integrations.
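The extract-transform-load pattern named in the responsibilities above can be sketched in a few lines. This is a minimal illustration, not NYPA's actual pipeline: the table names (`staging_orders`, `orders_clean`) and the cents-normalization rule are hypothetical, and SQLite stands in for whichever warehouse the role targets.

```python
import sqlite3

def extract(conn):
    """Pull raw rows from a (hypothetical) staging table."""
    return conn.execute("SELECT id, amount FROM staging_orders").fetchall()

def transform(rows):
    """Normalize dollar amounts to integer cents; drop null records."""
    return [(i, int(round(a * 100))) for i, a in rows if a is not None]

def load(conn, rows):
    """Write cleaned rows to the target table."""
    conn.executemany(
        "INSERT INTO orders_clean (id, amount_cents) VALUES (?, ?)", rows)
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE staging_orders (id INTEGER, amount REAL)")
    conn.execute("CREATE TABLE orders_clean (id INTEGER, amount_cents INTEGER)")
    conn.executemany("INSERT INTO staging_orders VALUES (?, ?)",
                     [(1, 9.99), (2, None), (3, 5.5)])
    load(conn, transform(extract(conn)))
```

In a production tool such as Pentaho or ADF, each of these three functions would map to a pipeline activity, with the monitoring and retry behavior the posting mentions configured around them.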
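The bearer-token authentication the API bullet refers to (OAuth-style) amounts to attaching an `Authorization` header to each request. A minimal sketch, assuming a hypothetical endpoint and token; real integrations would also handle token refresh and error responses:

```python
import urllib.request

def bearer_request(url, token):
    """Build an authenticated GET request carrying an OAuth-style
    bearer token. The url and token are supplied by the caller;
    nothing here is specific to any one API."""
    req = urllib.request.Request(url)
    req.add_header("Authorization", f"Bearer {token}")
    req.add_header("Accept", "application/json")
    return req

# Usage (endpoint is hypothetical):
# resp = urllib.request.urlopen(bearer_request("https://example.com/api/v1/orders", token))
```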
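The data validation and reconciliation work described above often starts with something as simple as comparing row counts between source and target after a load. A sketch, with illustrative table names; production checks would extend this to checksums and column-level profiling:

```python
import sqlite3

def reconcile_counts(conn, source_table, target_table):
    """Return True when source and target hold the same number of rows.
    Table names are interpolated, so they must come from trusted config,
    never from user input."""
    q = ("SELECT (SELECT COUNT(*) FROM {}) - (SELECT COUNT(*) FROM {})"
         .format(source_table, target_table))
    diff = conn.execute(q).fetchone()[0]
    return diff == 0
```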
Skills
Required Skills & Experience :
- 7+ years of experience in Data Engineering, ETL, and SQL development.
- Hands-on experience with SAP BTP Integration Suite for SAP and non-SAP integrations.
- Strong expertise in Pentaho (PDI), Microsoft ADF, and API development.
- Proficiency in SQL (stored procedures, query optimization, performance tuning).
- Experience with Azure DevOps, GitHub, and CI/CD for data pipelines.
- Good understanding of data governance tools (Ataccama, Collibra) and data quality management.
- Experience with AWS, Azure, and Google Cloud (GCP) for data integration and cloud-based workflows.
- Strong problem-solving skills and ability to work independently in a fast-paced environment.
Preferred Qualifications :
- Experience with SAP S/4HANA and cloud-based ERP implementations.
- Familiarity with Python and PySpark for data processing and automation.
- Experience with Pentaho, Microsoft ADF, or equivalent ETL tools.
- Knowledge of event-driven architectures.
- Familiarity with CI/CD for data pipelines (Azure DevOps, GitHub Actions, etc.).
Education & Certifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related technical field. Nice-to-have certifications :
- Azure Data Engineer Associate
- SAP Certified Associate - Integration Developer