Job Description
Description:
Knight Federal Solutions is a trusted provider to industry-leading prime contractors, the Department of Defense, and the Intelligence Community. We have established a company culture that supports our employees, their families, and the communities in which they live and work. When you join our team, you belong to a group of people who work hard, strive for greatness, and care about people.
Our hard work is evident in everything we do. Whether we are supporting large government programs in Simulation and Training, Information Technology, Intelligence, or Cyber Security, we always strive to be the best. It is for this reason that we have been recognized as a World Class Team Supplier by Northrop Grumman and named one of Florida's fastest-growing companies by Inc. Magazine.
As Knight Federal Solutions continues to grow, we look forward to hiring the best and the brightest to join us in our success!
Role Responsibilities
- Design and Implement Data Pipelines: Lead the development, optimization, and maintenance of robust Extract, Transform, Load (ETL) processes using programming languages like Python to ingest, process, and deliver high-quality data from diverse sources
- Build and Manage Scalable Data Solutions: Architect, develop, and maintain scalable data stores and big data models to support analytical and operational needs, ensuring data integrity, performance, and accessibility
- Develop Python-Based Data Applications: Write, test, and deploy efficient and well-documented Python scripts and programs for data manipulation, transformation, and integration, contributing to the core data platform
- Drive Cross-Functional Data Initiatives: Collaborate closely with cross-functional teams (e.g., data scientists, analysts, software engineers) to understand data requirements, design optimal data solutions, and ensure seamless integration within the broader ecosystem
- Data Conversion and Platform Support: Develop and implement scripts and programs to convert various data formats into usable structures. Provide essential support to project teams for scaling, monitoring, and operating data platforms, ensuring stability and performance
Requirements:
- TS/SCI with CI Polygraph
- 3+ years of experience utilizing programming languages for Extract, Transform, Load (ETL) processes, data analysis, or data modeling
- 3+ years of experience developing and maintaining scalable data stores and big data models
- 3+ years of experience developing in Python
- Experience creating solutions within a collaborative, cross-functional team environment
- Ability to develop scripts and programs for converting various types of data into usable formats, and to support project teams in scaling, monitoring, and operating data platforms
Desired
- Experience implementing code in an operational environment or for operational usage
- Experience in application development utilizing PostgreSQL
- Experience with Rust or Java
- Experience with DevSecOps tools, including GitHub
- Experience with a public cloud, including AWS, Microsoft Azure, or Google Cloud
- Experience working on real-time data
- Experience with Agile engineering practices