We are seeking an experienced Data Architect to own the data strategy, architecture, and governance of the data platform underpinning our next-generation data center. This role is key to structuring, securing, and optimizing data flows from OT/IT systems, sensors, and business platforms into a unified, analytics- and AI-ready data environment. You will work closely with the Digital Architect, with a focus on systems integration, to deliver a seamless digital and data ecosystem.
Responsibilities:
- Own the end-to-end data architecture for operational and business datasets, including OT telemetry, IoT sensor data, asset performance metrics, and other KPIs.
- Design and maintain the enterprise data model, ensuring consistency across disparate systems (BMS, EPMS, CMMS, ERP, etc.).
- Define and oversee data ingestion and transformation pipelines for both real-time and batch processing.
- Architect and manage cloud-based data infrastructure (e.g., AWS Glue, GCP Dataflow, Azure Synapse) to ensure scalability and reliability.
- Select, implement, and optimize time-series, relational, and analytical data platforms (e.g., InfluxDB, Snowflake, BigQuery).
- Establish data lake and/or lakehouse architecture for scalable, flexible storage and analytics.
- Define and enforce data governance policies, including access control, data retention, and regulatory compliance.
- Implement and maintain data quality, lineage, and metadata management frameworks.
- Deploy and manage data cataloging tools to support data discoverability, self-service analytics, and compliance.
- Evaluate and pilot emerging tools and platforms to continuously improve data infrastructure and support innovation initiatives.
- Define API and integration standards for all consuming/producing systems.
- Collaborate with security teams to design secure data architectures and implement role-based access controls (RBAC).
- Define and manage versioning and lifecycle policies for data models, schemas, and APIs across environments.
- Collaborate cross-functionally to deliver high-quality, trusted datasets to power predictive analytics, simulation, digital twin initiatives, and business reporting.
Requirements:
- Bachelor’s or Master’s degree in Computer Science, Data Engineering, or a related field.
- 7+ years of data architecture or data engineering experience in mission-critical or industrial environments (data centers, manufacturing, smart buildings, energy systems).
- Experience with data modeling and data governance frameworks.
- Experience designing and implementing time-series data platforms and streaming architectures.
- Expertise in ETL/ELT and modern data stack tools (e.g., dbt, Kafka, Spark, Airflow) and platforms such as Snowflake, BigQuery, and Databricks.
- Familiarity with OT/IoT protocols (BACnet, Modbus, OPC-UA) and with integrating operational data into IT systems.
- Strong knowledge of cloud-native analytics platforms (AWS, GCP, Azure) and edge data processing strategies.
- Excellent cross-functional communication skills, with the ability to bridge engineering, operations, and analytics teams.