Summary
This position provides technical expertise in the development and deployment of machine learning and statistical models to support data-informed decision-making across the University. It also provides hands-on support in building and maintaining data integrations between enterprise applications to ensure accurate and timely data flow across systems. Working closely with cross-functional teams, this role ensures that both analytical models and integration solutions are reliable, scalable, and aligned with institutional needs. The position plays a critical role in enabling automation, improving data accessibility, and supporting operational efficiency through applied analytics and system connectivity.
The position requires flexibility to work evenings and weekends as needed for project implementations and special event support.
Essential Functions
Designs, develops, and maintains data integration solutions across enterprise systems. This position supports the full development lifecycle of integration workflows that ensure secure, reliable, and efficient data exchange across university platforms.
Translate integration requirements and specifications into technical designs and development plans.
Build, test, and deploy integration solutions using tools such as Boomi, SSIS, and other relevant platforms.
Implement integrations leveraging third-party APIs, custom scripts (e.g., Python), and standard data transfer protocols.
Collaborate with functional and technical stakeholders to assess integration needs and apply reusable patterns and frameworks.
Monitor, troubleshoot, and optimize existing integrations to ensure consistent and accurate data flow.
Maintain technical documentation, including data mappings, system dependencies, and solution architecture diagrams.
Ensure data integrations align with Snowflake architecture, support system interoperability, and adhere to standards outlined in the Data Cookbook.
Work effectively with team members and cross-functional units across all levels of the organization.
Develops and deploys machine learning and statistical models to support data-informed decision-making. This role applies data science techniques to solve institutional challenges and generate actionable insights using both custom and platform-native tools.
Design, build, and validate predictive models and statistical analyses using Python and libraries such as pandas, scikit-learn, statsmodels, and others.
Leverage Snowflake’s native machine learning capabilities, including UDFs, stored procedures, and SQL-based ML functions, to operationalize predictive models at scale.
Apply appropriate modeling techniques such as linear regression, logistic regression, time series forecasting, and classification models based on the use case.
Collaborate with data owners and functional units to define modeling goals, select relevant features, and assess model performance.
Develop automated workflows for model scoring, retraining, and performance monitoring.
Document methodologies, model assumptions, and results for reproducibility and transparency.
Ensure models align with data governance standards and responsible AI principles, in collaboration with the CDO.
Implements and maintains scalable integration and machine learning solutions with a focus on security, data integrity, and lifecycle management. This role ensures that both application integrations and machine learning models are developed, deployed, and maintained using best practices that prioritize reliability, compliance, and institutional alignment.
Implement secure and auditable processes for both integration workflows and deployed ML/statistical models.
Monitor and manage the full lifecycle of deployed models—from development and validation through deployment, retraining, and performance evaluation.
Evaluate and recommend emerging technologies in integration platforms and ML operations (MLOps) to support long-term scalability.
Collaborate with technical teams and data governance partners to ensure integration pipelines and ML outputs align with university policies on data quality, access, and privacy.
Stay current on release cycles and capabilities of tools such as Dell Boomi, Snowflake, and relevant Python libraries to identify enhancements for current and future-state solutions.
Proactively engage cross-functional teams and subject matter experts to ensure all solutions are technically sound, well-documented, and supportable over time.
Performs other job-related duties as assigned or required.
Qualifications/Requirements
Benefits: Barry University offers a comprehensive benefits package to full-time employees that includes health, dental, vision, and life insurance; retirement; tuition assistance; paid time off; and work/life balance initiatives such as wellness programs, spirituality in the workplace, and training and development.
Machine Learning and Data Solutions Developer • Main Campus, Miami Shores