Client Name : Prosum
End Client Name : Early Warning
Job Title : Analytics Data Engineer
Location : Scottsdale, AZ or Chicago, IL (onsite; local candidates only; no parking provided in Chicago)
Work Type : Onsite
Job Type : Contract (5+ months)
Rate : Up to $50 / hr W2 (all-inclusive)
Notes :
- US Citizens and Green Card holders only
- Updated LinkedIn profile with a photo, created before 2021
- Interview Process :
  1. Video with Hiring Manager
  2. Onsite in-person with team members
  3. Video

MUST HAVES :
- Must be familiar with SAS and able to convert SAS code to Spark and Python.
- Ability to migrate data analysis, data manipulation, and statistical modeling tasks from SAS to Python.
- Hadoop administration (critical)
- Spark administration (critical)
- Linux (critical)
- Windows (critical)

NICE TO HAVES :
- AWS
- Tableau

QUESTIONS THAT NEED TO BE ANSWERED BY CANDIDATE :
Submission summaries need to address the "Must Haves" and "Nice To Haves".
JOB DESCRIPTION :
This role interfaces directly with Data Scientists and other stakeholders, so the candidate must be personable, able to work through issues, and comfortable dealing with people. The work is highly technical, but it also involves a great deal of internal coordination and communication.
The Data Science & Analytics team is undergoing a long-term tech transformation and migration effort. The team has many assets (data processing, reports, model development scripts) that were developed using legacy software, particularly SAS. Very few people on the team are cross-trained in both SAS and modern, higher-performance methods such as Spark or optimized Python.
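To illustrate the kind of SAS-to-Python migration this role involves, here is a minimal sketch of a PROC MEANS-style grouped aggregation rewritten in plain Python (the dataset, column names, and `summarize_by_group` helper are hypothetical, not Early Warning's actual assets):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical rows as a legacy SAS dataset might expose them.
rows = [
    {"region": "WEST", "balance": 120.0},
    {"region": "WEST", "balance": 80.0},
    {"region": "EAST", "balance": None},   # missing value, like SAS '.'
    {"region": "EAST", "balance": 50.0},
]

def summarize_by_group(rows, group_key, value_key):
    """Rough equivalent of PROC MEANS with a CLASS statement:
    drop missing values, then compute n and mean per group."""
    groups = defaultdict(list)
    for row in rows:
        value = row[value_key]
        if value is not None:  # SAS excludes missing values from statistics
            groups[row[group_key]].append(value)
    return {g: {"n": len(vals), "mean": mean(vals)} for g, vals in groups.items()}

print(summarize_by_group(rows, "region", "balance"))
# → {'WEST': {'n': 2, 'mean': 100.0}, 'EAST': {'n': 1, 'mean': 50.0}}
```

In a real migration the same grouping logic would typically move to a PySpark `groupBy().agg()` so it scales on the on-prem Hadoop cluster; the stdlib version above just shows the translation of the SAS semantics.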
The Data Science and Analytics teams, as well as the ML Ops team, are fully committed and need to augment our resources with external support to:
- Help convert legacy code-based assets to modern high-performance tools (SAS to Python):
  - Existing data processing scripts, including data movement, cleaning, and aggregation
  - The Value Testing process, which scores a potential customer's data through our models to help determine the value of EWS solutions
- SQL / Hive query performance tuning and enhancement
- Develop a shared toolkit to automate certain data science processes:
  - Data profiling
  - Feature importance and effectiveness evaluation
  - Automated documentation of model development processes
- Assist in upskilling the existing team

PROJECT SPECIFICS :
- Code modernization for the VT, MV&P, DS, DICA, and CIR teams: convert existing programs / processes from SAS / Hive to Python / Scala / PySpark / SQL or another modern, highly efficient technology that fits Early Warning's current on-prem environment, and set up an easy conversion path for the future state in ADP / Model Factory.
  - Coordinate with the MLOps team to onboard new data sources that exist in the SAS environment but not in Newton.
  - For new VTs, work with the relevant parties to ensure project plans account for MLOps engagement to build the capability (potentially other processes as well; key capabilities in general can be requested to be built by MLOps from scratch).
  - Train the team to ensure proper adoption / transition.
- Hive code efficiency evaluation and modernization:
  - Evaluate legacy, repeatedly run Hive queries commonly used by the analytics community.
  - Upgrade the legacy code to Scala / PySpark or another modern, highly efficient technology that fits Early Warning's current on-prem environment, and set up an easy conversion path for the future state in ADP / Model Factory.
  - Train the team to ensure proper adoption / transition.
- Analytics toolkit / capability (shared among all teams):
  - When existing open-source packages are unavailable or do not fit our modeling needs, create standard, reusable, highly efficient procedures for end-to-end model development, validation, and evaluation, for example:
    - Data profiling tool (evaluate missing data, value ranges, outliers, categorical features, etc.)
    - Feature effectiveness triaging toolset for XGBoost or other non-transparent models
    - Standard generation of outputs for the various model stages that aligns with model governance documentation requirements
    - Template for an efficient Python-based project structure that enables an efficient run, test, debug, and deploy pipeline
  - Engage with MLOps for design, code review, and approval (this is within MLOps roles / responsibilities, but this SOW will help bridge the short-term resource gap).
- Report automation:
  - Replace the current SAS / Visual Basic process with standard automated report generation using the modeling outputs. Collaborate with the tech writer and analytics team to standardize templates and outputs. This includes both the validation report and the initial model development report (auto-inserted with template); this may depend on when we have a DR replacement.
  - Engage with MLOps for design, code review, and approval (this is within MLOps roles / responsibilities, but this SOW will help bridge the short-term resource gap).
- Training / upskilling analytics teams:
  - Create training / onboarding materials and provide a hands-on practice training environment with target adoption outcomes.
  - Work with Corporate Learning & Development to develop a programming training path using existing platforms and tools (LinkedIn Learning and Udemy).
  - Provide office hours and troubleshooting support.
  - Conduct regular code guidance for the team in partnership with MLOps.
- Day 1 monitoring script:
  - Create a Day 1 model monitoring script when MLOps resources are not available.
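As a rough illustration of the data profiling tool mentioned under the shared analytics toolkit, here is a minimal stdlib-Python sketch (the `profile_column` helper, its thresholds, and the sample data are hypothetical, not Early Warning's actual tooling):

```python
from statistics import mean, stdev

def profile_column(values, categorical_threshold=10):
    """Summarize one column: row count, missing count, value range and a
    simple 3-sigma outlier count for numeric data, or category counts
    when the column is low-cardinality."""
    missing = sum(1 for v in values if v is None)
    present = [v for v in values if v is not None]
    profile = {"count": len(values), "missing": missing}
    if present and all(isinstance(v, (int, float)) for v in present):
        profile["min"], profile["max"] = min(present), max(present)
        if len(present) > 1 and stdev(present) > 0:
            m, s = mean(present), stdev(present)
            # Flag values more than 3 standard deviations from the mean.
            profile["outliers"] = sum(1 for v in present if abs(v - m) > 3 * s)
    elif len(set(present)) <= categorical_threshold:
        profile["categories"] = {c: present.count(c) for c in set(present)}
    return profile

print(profile_column([1.0, 2.0, None, 3.0]))
# → {'count': 4, 'missing': 1, 'min': 1.0, 'max': 3.0, 'outliers': 0}
```

A production version of this toolkit would more likely run as a PySpark job over Hive tables, but the per-column checks (missing rate, range, outliers, category counts) are the same ones the posting lists.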