City: Glendale, CA
Onsite / Hybrid / Remote: Hybrid (3 days a week onsite, Friday remote)
Duration: 12 months
Rate Range: Up to $85/hr on W2 depending on experience (no C2C, 1099, or subcontract)
Work Authorization: GC, USC, all valid EADs except OPT, CPT, H1B
Must Have:
- 5+ years Data Engineering
- Airflow
- Spark DataFrame API
- Databricks
- SQL
- API integration
- AWS
- Python or Java or Scala
Responsibilities
- Maintain, update, and expand Core Data platform pipelines.
- Build tools for data discovery, lineage, governance, and privacy.
- Partner with engineering and cross-functional teams to deliver scalable solutions.
- Use Airflow, Spark, Databricks, Delta Lake, Kubernetes, and AWS to build and optimize workflows.
- Support platform standards, best practices, and documentation.
- Ensure data quality, reliability, and SLA adherence across datasets.
- Participate in Agile ceremonies and continuous process improvement.
- Work with internal customers to understand needs and prioritize enhancements.
- Maintain detailed documentation that supports governance and quality.
Qualifications
- 5+ years in data engineering with large-scale pipelines.
- Strong SQL and one major programming language (Python, Java, or Scala).
- Production experience with Spark and Databricks.
- Experience ingesting and interacting with API data sources.
- Hands-on Airflow orchestration experience.
- Experience developing APIs with GraphQL.
- Strong AWS knowledge and infrastructure-as-code familiarity.
- Understanding of OLTP vs OLAP, data modeling, and data warehousing.
- Strong problem-solving and algorithmic skills.
- Clear written and verbal communication.
- Agile / Scrum experience.
- Bachelor’s degree in a STEM field or equivalent industry experience.