Talent.com
Data Engineer - AWS / Redshift

2Bridge PartnersDetroit, MI
Posted 30+ days ago
Job type: Permanent

2Bridge has been retained in a direct-hire search to find 11 Data Engineers to join our Detroit, Michigan-based e-commerce client. They are seeking multiple Senior Data Engineers to join their growing Data Team! As a Senior Data Engineer, you'll own a problem from end to end and will be empowered to take the lead on technology and implementation while joining a rare hyper-growth company. Our client offers a comprehensive package including base, bonus, paid relocation to Detroit, unlimited PTO, stocked kitchens, a casual and dynamic work environment, and the opportunity to work in a fast-growing, stable technology company.

Responsibilities:

  • Design and build mission-critical data pipelines with a highly scalable distributed architecture, including data ingestion (streaming, events, and batch), data integration, and data curation
  • Build and support reusable frameworks to ingest, integrate, and provision data
  • Automate the end-to-end data pipeline with metadata, data quality checks, and auditing
  • Build and support a big data platform on the cloud
  • Define and implement automation of jobs and testing
  • Optimize the data pipeline to support ML workloads and use cases
  • Support mission-critical applications and near real-time data needs from the data platform
  • Capture and publish metadata and new data to subscribed users
  • Work collaboratively with product managers, data scientists, and business partners, and actively participate in design thinking sessions
  • Participate in design and code reviews
  • Motivate, coach, and serve as a role model and mentor for other development team members who leverage the platform

Qualifications :

  • BS/BA degree in Computer Science, Physics, Mathematics, Statistics, or another Engineering discipline
  • 7+ years' experience in data warehouse / data lake technical architecture
  • Minimum 3 years with Big Data and Big Data tools such as Kafka, MapReduce, Spark, Python, and Hadoop
  • 3+ years' experience engineering in cloud environments (Google Cloud, AWS, Azure)
  • Experience with database architecture and schema design
  • Strong familiarity with batch processing and workflow tools such as Airflow and NiFi
  • Ability to work independently with business partners and management to understand their needs and exceed expectations in delivering tools and solutions
  • Strong interpersonal, verbal, and written communication skills, with the ability to present complex technical and analytical concepts to an executive audience
  • A strong business mindset with customer obsession; ability to collaborate with business partners to identify needs and opportunities for improved data management and delivery
  • Experience with data visualization tools such as Tableau, Looker, and Power BI
  • Experience with Hadoop implementation