Job Description
We are looking for an experienced Data Engineer to join our dynamic team in Wyoming, Michigan, for a Contract-to-Permanent position. In this role, you will play a key part in designing and managing data systems, developing data pipelines, and ensuring optimal data governance practices across multi-cloud environments. This position offers an exciting opportunity to contribute to cutting-edge healthcare data solutions while collaborating with cross-functional teams.
Responsibilities:
- Design and implement robust data architecture frameworks, including modeling, metadata management, and database security.
- Create and maintain scalable data models that support both operational and analytical needs.
- Develop and manage data pipelines to extract, transform, and load data from diverse sources into a centralized data warehouse (see the pipeline sketch after this list).
- Collaborate with various departments to translate business requirements into technical specifications.
- Monitor and optimize the performance of data assets, ensuring reliability and efficiency.
- Implement and enforce data governance policies, including data retention, backup, and security protocols.
- Stay updated on emerging technologies in data engineering, such as AI tools and cloud-based solutions, and integrate them into existing systems.
- Establish and track key performance indicators (KPIs) to measure the effectiveness of data systems.
- Provide mentorship and technical guidance to team members to foster a collaborative work environment.
- Evaluate and adopt new tools and technologies to enhance data capabilities and streamline processes.
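The pipeline work described above generally follows an extract-transform-load pattern. Below is a minimal sketch in Python using pandas and SQLAlchemy; the connection string, source file, column names, and table name are hypothetical placeholders for illustration, not details taken from this posting.

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string for the target warehouse (placeholder only).
engine = create_engine("postgresql://etl_user:password@warehouse-host:5432/analytics")

def extract(path: str) -> pd.DataFrame:
    # Pull raw records from a source file; real pipelines may also read from APIs or databases.
    return pd.read_csv(path)

def transform(df: pd.DataFrame) -> pd.DataFrame:
    # Standardize column names and drop rows missing the key field (illustrative rule).
    df.columns = [c.strip().lower() for c in df.columns]
    return df.dropna(subset=["patient_id"])

def load(df: pd.DataFrame, table: str) -> None:
    # Append the cleaned records to a centralized warehouse table.
    df.to_sql(table, engine, if_exists="append", index=False)

if __name__ == "__main__":
    load(transform(extract("daily_claims.csv")), "claims_staging")
```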
Requirements:
- Proficiency in managing NoSQL and graph databases in multi-cloud environments, such as Amazon Neptune, Neo4j, and Azure Cosmos DB, including graph query languages like Apache TinkerPop Gremlin (a minimal traversal sketch follows this list).
- Strong expertise in cloud data warehousing and integration technologies such as Snowflake, Azure Data Factory, or similar platforms.
- Experience building and maintaining data pipelines and ETL processes using tools such as Python and PowerShell.
- Advanced knowledge of data architecture principles, including database modeling and design.
- Familiarity with big data tools such as Apache Spark, Hadoop, and Kafka.
- Hands-on experience with cloud platforms including Amazon Web Services (AWS) and Microsoft Azure.
- Solid understanding of Agile methodologies and their application in data engineering projects.
- Exceptional analytical and problem-solving skills to address complex data challenges.
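For the graph database skills listed above, the snippet below is a minimal sketch of a Gremlin traversal using the gremlinpython client against an Amazon Neptune or other TinkerPop-compatible endpoint. The endpoint address, vertex and edge labels, and property names are hypothetical and shown only to illustrate the style of work.

```python
from gremlin_python.process.anonymous_traversal import traversal
from gremlin_python.driver.driver_remote_connection import DriverRemoteConnection

# Hypothetical Neptune/TinkerPop endpoint; replace with a real cluster address.
conn = DriverRemoteConnection("wss://my-neptune-cluster:8182/gremlin", "g")
g = traversal().withRemote(conn)

# Example traversal: find providers linked to a given patient vertex
# (labels and properties are illustrative, not from this posting).
providers = (
    g.V().has("patient", "patient_id", "P-1001")
         .out("treated_by")
         .values("provider_name")
         .toList()
)
print(providers)

conn.close()
```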