Hello,
Hope you are doing great!
Application Development Principal (Glider assessment will be required)
2 years contract
Location : REMOTE
Must Have Skills :
Experience working with both business and IT leaders
Teradata
Databricks
Spark / PySpark
The ideal candidate has 13-15+ years of experience, takes initiative, and has command of the tools — someone who works in a consultative manner rather than waiting for direction and orders.
Duties :
Collaborate with business and technical stakeholders to gather and understand requirements.
Design scalable data solutions and document technical designs.
Develop production-grade, high-performance ETL pipelines using Spark and PySpark.
Perform data modeling to support business requirements.
Write optimized SQL queries using Teradata SQL, Hive SQL, and Spark SQL across platforms such as Teradata and Databricks Unity Catalog.
Implement CI / CD pipelines to deploy code artifacts to platforms like AWS and Databricks.
Orchestrate Databricks jobs using Databricks Workflows.
Monitor production jobs, troubleshoot issues, and implement effective solutions.
Actively participate in Agile ceremonies including sprint planning, grooming, daily stand-ups, demos, and retrospectives.
Skills :
Strong hands-on experience with Spark, PySpark, Shell scripting, Teradata, and Databricks.
Proficiency in writing complex and efficient SQL queries and stored procedures.
Solid experience with Databricks for data lake / data warehouse implementations.
Familiarity with Agile methodologies and DevOps tools such as Git, Jenkins, and Artifactory.
Experience with Unix / Linux shell scripting (KSH) and basic Unix server administration.
Knowledge of job scheduling tools like CA7 Enterprise Scheduler.
Hands-on experience with AWS services including S3, EC2, SNS, SQS, Lambda, ECS, Glue, IAM, and CloudWatch.
Expertise in Databricks components such as Delta Lake, Notebooks, Pipelines, cluster management, and cloud integration (Azure / AWS).
Proficiency with collaboration tools like Jira and Confluence.
Demonstrated creativity, foresight, and sound judgment in planning and delivering technical solutions.
Required Skills : Spark
PySpark
Shell Scripting
Teradata
Databricks
Additional Skills : AWS SQS
Foresight
Sound Judgment
SQL
Stored Procedures
Databricks for Data Lake / Data Warehouse Implementations
Agile Methodologies
GIT
Jenkins
Artifactory
Unix / Linux Shell Scripting
Unix Server Administration
CA7 Enterprise Scheduler
AWS S3
AWS EC2
AWS SNS
AWS Lambda
AWS ECS
AWS Glue
AWS IAM
AWS CloudWatch
Databricks Delta Lake
Databricks Notebooks
Databricks Pipelines
Databricks Cluster Management
Databricks Cloud Integration (Azure / AWS)
JIRA
Confluence
Creativity
Please feel free to reach out at gaurav09@kanakits.com
Application Development • AL, United States