Job Description
We are looking for a passionate, certified Data Engineer. The successful candidate will turn data into information, information into insight, and insight into business decisions.
Data Engineer responsibilities include conducting full-lifecycle analysis covering requirements, activities, and design. The Data Engineer will develop analysis and reporting capabilities, and will monitor performance and quality control plans to identify improvements.
Primary skillset: Python/PySpark and Azure, ADF, SQL, SQL Server, Data Warehousing, ETL
Secondary: Databricks
Nice to have: Informatica/ETL
Responsibilities
- Design and develop ETL processes based on functional and non-functional requirements, in Python/PySpark on the Azure platform
- Understand the full end-to-end development lifecycle, from design to go-live, for ETL development and the Azure platform
- Recommend and execute improvements
- Document component design for developers and for broader communication
- Understand and adopt an Agile (SCRUM-like) software development mindset
- Follow established processes, standards, and business technology architecture for development, release management, and deployment
- Execute and provide support during testing cycles and post-production deployment; engage in peer code reviews
- Elicit, analyze, and interpret business and data requirements to develop complete business solutions, including data models (entity relationship diagrams, dimensional data models), ETL and business rules, data life cycle management, governance, lineage, metadata, and reporting elements
- Apply automation and innovation on new and ongoing data platforms for development projects aligned to business or organizational strategies
- Design, develop, and implement reporting platforms (e.g. modeling, ETL, BI framework) and complex ETL frameworks that meet business requirements
- Deliver business or enterprise data deliverables (that adhere to enterprise frameworks) for various platforms, servers, applications, and systems
Requirements
- Proven working experience as a data engineer
- Bachelor's degree or equivalent in Computer Science
- Skilled in Python object-oriented programming
- Skilled in AWS Compute services such as EC2, Lambda, Beanstalk, or ECS
- Skilled in AWS Database products such as Neptune, RDS, Redshift, or Aurora
- Skilled in the AWS Management and Governance suite of products, such as CloudTrail, CloudWatch, or Systems Manager
- Skilled in Amazon Web Services (AWS) offerings, development, and networking platforms
- Skilled in SQL
- Skilled in Jenkins
- Skilled in JSON
- Skilled in discovering patterns in large data sets using relevant software such as Oracle Data Mining or Informatica
- Skilled in cloud technologies and cloud computing
- Experience using software and computer systems' architectural principles to integrate enterprise computer applications such as xMatters, AWS Application Integration, or WebSphere
- Experience determining causes of operating errors and taking corrective action
- Experience analyzing data to identify trends or relationships and inform conclusions about the data
- Skilled in creating and managing databases using relevant software such as MySQL, Hadoop, or MongoDB
- Programming skills, including coding, debugging, and using relevant programming languages
- Communication skills, including communicating in writing or verbally, copywriting, and planning and distributing communication
- Ability to frame ideas as systems and analyze their inputs, outputs, and processes
- Experience helping an organization plan and manage change in an effort to meet strategic objectives
- Adept at managing project plans, resources, and people to ensure successful project completion
- Ability to work respectfully and cooperatively with people of different functional expertise toward a common goal