Sr Data Engineer
Ecolab is looking for a Sr Data Engineer based in St. Paul, MN or Naperville, IL to join the Analytics and Digital Solutions Team within Ecolab Digital in support of Global Supply Chain. If you're a passionate professional seeking growth and a rewarding career, we invite you to apply. This is an excellent opportunity to join a recognized global company offering competitive compensation, benefits, and significant career advancement.
As a technical lead, you'll drive continuous improvements in our digital capabilities and advanced analytics. You'll lead the development of new digital products, providing critical insights to solve business challenges. A key part of your role will be enhancing data utilization across the organization through improved processes, governance, and data management. We're looking for someone passionate about building a strong data foundation and adopting modern data architecture for current and future analytical needs.
What You Will Do:
- Lead data initiatives: Serve as a liaison among stakeholders to analyze and define data requirements for reporting and business process changes.
- Manage data infrastructure: Proactively manage Snowflake and SQL databases and analytical data models.
- Drive data excellence: Develop, test, and tune semantic models for enterprise reporting, ensuring compliance with IT security requirements.
- Advance data architecture: Lead the adoption of modern data architecture and identify opportunities to solve business problems with state-of-the-art solutions.
- Perform reverse engineering: Analyze and understand existing complex data structures and processes to facilitate migrations, integrations, or improvements.
- Foster best practices: Mentor peers on implementing and improving our data management and governance framework across technologies such as Microsoft Power BI, Snowflake, Microsoft Azure, and on-premises data sources.
- Promote Agile methodologies: Champion and follow Scrum/Agile frameworks.
Location: St. Paul, MN or Naperville, IL; hybrid (3 days in office)
Minimum Qualifications:
- Bachelor's degree in Mathematics, Statistics, Computer Science, Information Technology, or Engineering. Immigration sponsorship is not available for this position.
- 5 years in a data engineering, analytics, or business intelligence role.
- Experience working in applications within supply chain, especially the procurement domain.
- Strong experience with cloud data warehouses, specifically Snowflake (streams, tasks) and the Azure platform (SQL Server, Logic Apps, App Services, Data Factory, and Power BI including pipelines, Lakehouse, and Warehouse).
- Proficiency in ETL data engineering, dimensional data modeling, master data management, data governance, and end-to-end data lineage documentation.
- Advanced SQL skills (cursors, triggers, CTEs, procedures, functions, external tables, dynamic tables, security roles) and Python skills (object-oriented programming, handling JSON/XML).
- Experience building ETL or ELT data pipelines using Snowflake streams, tasks, and stored procedures, and Fivetran/dbt tools.
- Experience with the medallion data architecture framework.
- Expertise in analytical application development using Python, Streamlit, Flask, Node.js, Graph API, and Power Apps, including deploying applications in Azure App Services.
- 2 years of Agile/Scrum project management experience, including data requirements gathering and project ownership.
Preferred Qualifications:
- Master's degree in Mathematics, Statistics, Computer Science, Information Technology, or Engineering.
- Exceptional analytical and problem-solving abilities.
- Proven ability to manage and deliver multiple projects simultaneously in a dynamic environment.
- Experience creating, implementing, and continuously improving processes.
- Familiarity with data science areas such as time series analysis, forecasting, classification, and regression analysis.
- Experience with ERP systems (SAP preferred).
- Demonstrated ability to collaborate effectively with global cross-functional teams and individuals with varying technical expertise.
- Experience developing cloud-hosted applications.
- Experience with PySpark, Fivetran, dbt (ETL/ELT), Python, Streamlit, Flask, Node.js, Power Apps, Graph API, and Azure App Services.
- Experience with source control (Git).
The pay range for this position is $114,900.00 - $172,300.00. Many factors are taken into consideration when determining compensation, such as experience, education, training, and geography. We comply with all minimum wage and overtime laws.
Ecolab strives to provide comprehensive and market-competitive benefits to meet the needs of our associates and their families.
To meet customer requirements and comply with local or state regulations, applicants for certain customer-facing roles may need to:
- Undergo additional background screens and/or drug/alcohol testing for customer credentialing.
- Be fully vaccinated for COVID-19, including a booster if eligible, unless a religious or medical accommodation is requested by the applicant and approved by Ecolab.
Ecolab will provide reasonable accommodation (such as a qualified sign language interpreter or other personal assistance) with our application process upon request as required to comply with applicable laws. If you have a disability and require accommodation assistance in this application process, please visit the Recruiting Support link in the footer of each page of our career website.