We are looking for a Senior Software Engineer (Hadoop) who will play a critical role in data platform modernization and be responsible for developing technology solutions that support complex business processes, including data transformation, data modelling, reporting, analytics, and workflow needs.
In this role, you will:
- Lead moderately complex initiatives and deliverables within technical domain environments
- Contribute to large-scale strategic planning
- Design, code, test, debug, and document projects and programs associated with the technology domain, including upgrades and deployments
- Review moderately complex technical challenges that require an in-depth evaluation of technologies and procedures
- Resolve moderately complex issues and lead a team to meet the needs of existing clients or potential new clients, leveraging a solid understanding of the function, policies, procedures, and compliance requirements
- Collaborate and consult with peers, colleagues, and mid-level managers to resolve technical challenges and achieve goals
- Lead projects, act as an escalation point, and provide guidance and direction to less experienced staff
Required Qualifications:
- 4+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education
- 2+ years of experience in ETL in a big data tech stack such as Apache Spark, Hadoop, or Hive
- 2+ years of experience in data engineering using PySpark/Python, Hadoop, Hive, Scala
- 2+ years of RDBMS/database experience
- 1+ year of experience in UNIX/shell scripting

Desired Qualifications:
- 2+ years of experience with batch processing tools such as Autosys
- Experience working in Dremio
- Experience in cloud data engineering (Azure/GCP/AWS)
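To give a concrete sense of the kind of PySpark/Hive ETL work named in the qualifications above, here is a minimal, illustrative sketch; the job name, HDFS path, column names, and Hive table are hypothetical examples, not part of this posting or any specific project.

```python
# Illustrative only: a minimal PySpark ETL sketch of the kind of work described
# above. The source path, column names, and Hive table name are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("customer_orders_etl")   # hypothetical job name
    .enableHiveSupport()              # allow writing results to a Hive table
    .getOrCreate()
)

# Extract: read raw order events from HDFS (hypothetical path and schema)
orders = spark.read.parquet("hdfs:///data/raw/orders")

# Transform: basic cleansing and a daily revenue aggregate per customer
daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("customer_id", "order_date")
    .agg(F.sum("amount").alias("daily_revenue"))
)

# Load: persist as a partitioned Hive table for downstream reporting
(
    daily_revenue.write
    .mode("overwrite")
    .partitionBy("order_date")
    .saveAsTable("analytics.customer_daily_revenue")  # hypothetical table
)

spark.stop()
```

In practice, a job like this would typically be scheduled through a batch tool such as Autosys and parameterized per run date, in line with the desired qualifications above.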