Talent.com
Data / Information Architect

Robert Half • Irvine, CA, US

Job Description

We are seeking a Data Modeler & Platform Architect to design, validate, and optimize data models that support our organization’s analytics, reporting, and operational data needs. The ideal candidate will have experience with Snowflake and dbt, as well as with data observability, DataOps, and DevOps practices, supporting our cloud-based data strategy and modernization efforts. Performs all duties in accordance with the Company’s policies and procedures and all U.S. state and federal laws and regulations in the jurisdictions wherein the Company operates.

Data Modeling

  • Work with functional business owners to understand business processes and the data elements that tie to them.
  • Design conceptual models that showcase the entities, data elements, and their relationships.
  • Design models in 3NF, star schema, and snowflake schema as appropriate.
  • Design, build, and maintain conceptual, logical, and physical data models to support analytics, reporting, and operational workloads.
  • Ensure dimensional and relational data models support data warehousing and self-service analytics.
  • Develop and optimize data structures in Snowflake DB to ensure performance, scalability, and business alignment.
  • Utilize dbt to build and manage transformations for clean, structured, and reusable data models.
  • Conduct POCs to evaluate new tools, methodologies, and modeling techniques to improve performance, efficiency, and scalability.
  • Establish and enforce data standards, governance, and best practices for data across the organization.
  • Ensure data models comply with regulatory, security, and compliance requirements.
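To make the dimensional-modeling responsibility concrete, the sketch below builds a tiny star schema (one fact table joined to two dimension tables) in an in-memory SQLite database and runs a typical slice-and-aggregate query. All table and column names are illustrative assumptions, not this employer's actual schema; a production model would live in Snowflake and be managed through dbt.

```python
import sqlite3

# Minimal star-schema sketch: one fact table with foreign keys into two
# dimension tables. Names (dim_customer, fact_transactions, ...) are invented
# for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_customer (
    customer_key  INTEGER PRIMARY KEY,
    customer_name TEXT,
    segment       TEXT
);
CREATE TABLE dim_date (
    date_key       INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240115
    calendar_date  TEXT,
    fiscal_quarter TEXT
);
CREATE TABLE fact_transactions (
    transaction_id INTEGER PRIMARY KEY,
    customer_key   INTEGER REFERENCES dim_customer (customer_key),
    date_key       INTEGER REFERENCES dim_date (date_key),
    amount         REAL
);
""")

cur.executemany("INSERT INTO dim_customer VALUES (?, ?, ?)",
                [(1, "Acme Corp", "Commercial"), (2, "Beta LLC", "Retail")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240115, "2024-01-15", "FY24-Q3")])
cur.executemany("INSERT INTO fact_transactions VALUES (?, ?, ?, ?)",
                [(100, 1, 20240115, 250.0), (101, 2, 20240115, 75.5)])

# Typical dimensional query: aggregate measures from the fact table,
# sliced by a dimension attribute.
rows = cur.execute("""
    SELECT c.segment, SUM(f.amount)
    FROM fact_transactions f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.segment
    ORDER BY c.segment
""").fetchall()
print(rows)  # [('Commercial', 250.0), ('Retail', 75.5)]
```

The same fact/dimension split is what the snowflake-schema and 3NF variants refactor further: snowflake normalizes the dimensions, 3NF normalizes everything.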

Platform Architecture

  • Design and implement small-scale prototypes and evaluations to test approaches for data modeling, performance tuning, and architecture improvements.
  • Assess and compare data modeling techniques, integration strategies, and observability tools to recommend the best solutions for the enterprise.
  • Work closely with data engineering and analytics teams to assess new methodologies before full-scale implementation.
  • Document findings from evaluations and provide technical guidance on their adoption.
  • Implement data observability tools to monitor data health, lineage, and anomalies across pipelines.
  • Integrate DataOps principles to automate data quality checks, validation, and governance processes.
  • Work with DevOps teams to ensure CI / CD pipelines support automated deployments and version control for data models.
  • Define and enforce data validation, monitoring, and alerting mechanisms to proactively address issues before they impact stakeholders.
  • Work with data engineers, BI teams, and application developers to optimize data structures for various use cases.
  • Collaborate with data governance teams to define metadata, lineage, and data quality standards.
  • Provide technical guidance on data modeling, DataOps, and best practices for data architecture.
  • Evaluate and recommend modern data modeling tools and methodologies to improve efficiency and scalability.
  • Stay up-to-date on industry trends and cloud data technologies to enhance data architecture.
  • Support the data foundation initiative, contributing to the modernization of enterprise data platforms.
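The DataOps duties above (automated quality checks, validation, and alerting before issues reach stakeholders) can be sketched as simple batch checks. The thresholds, field names, and record layout below are assumptions for illustration; in practice these rules would be expressed as dbt tests or rules in a data observability tool rather than hand-rolled functions.

```python
from datetime import datetime, timedelta, timezone

def null_rate(records, field):
    """Fraction of records where `field` is missing or None."""
    if not records:
        return 0.0
    missing = sum(1 for r in records if r.get(field) is None)
    return missing / len(records)

def check_batch(records, max_null_rate=0.05, max_age_hours=24):
    """Return human-readable alerts for a batch: null-rate and freshness checks.

    Thresholds are illustrative defaults, not a specific tool's settings.
    """
    alerts = []
    rate = null_rate(records, "customer_id")
    if rate > max_null_rate:
        alerts.append(f"customer_id null rate {rate:.1%} exceeds {max_null_rate:.0%}")
    newest = max((r["loaded_at"] for r in records), default=None)
    if newest is None or datetime.now(timezone.utc) - newest > timedelta(hours=max_age_hours):
        alerts.append(f"stale data: no records loaded in the last {max_age_hours} hours")
    return alerts

# Fresh batch with one null customer_id: trips the null-rate rule only.
batch = [
    {"customer_id": 1, "loaded_at": datetime.now(timezone.utc)},
    {"customer_id": None, "loaded_at": datetime.now(timezone.utc)},
]
print(check_batch(batch))  # one alert: 50.0% null rate exceeds 5%
```

Wired into a CI/CD pipeline, checks like these gate deployments so that a model change which degrades data quality fails the build instead of reaching consumers.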

Requirements

  • 8+ years of experience in data modeling, platform architecture, or database design.
  • 6 - 8 years of experience in banking and financial services (commercial banking preferred).
  • Experience with data modeling tools (ER Studio, Erwin, or similar).
  • Experience creating conceptual, logical, and physical data models using various normal forms and star and snowflake schemas.
  • Expertise in dimensional (OLAP) and relational (OLTP) data modeling techniques.
  • Proficiency in SQL and experience optimizing queries for performance and scalability.
  • Hands-on experience with Snowflake, dbt, and data observability tools.
  • Experience implementing DataOps and DevOps practices in a data environment.
  • Strong understanding of data governance, data quality, and metadata management.
  • Experience with ETL / ELT frameworks and data pipeline best practices.
  • Knowledge of Python, Spark, or other data transformation tools.