Data Modelling
The process of creating a visual or mathematical representation of data structures, relationships, and rules to organise and manage data effectively for analysis and application development.
In-Depth Explanation
Data modelling defines how data is structured, stored, and related within a system. Good data models make data easier to understand, query, and maintain, and are foundational to both application development and analytics.
Types of data models:
- Conceptual model: High-level overview of key entities and relationships (business-focused)
- Logical model: Detailed structure of entities, attributes, and relationships (technology-independent)
- Physical model: Implementation-specific design for a particular database technology
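To make the conceptual/logical/physical distinction concrete, here is a minimal sketch of a physical model, using an in-memory SQLite database and illustrative `customers`/`orders` tables (the names and types are assumptions, not from any particular system). The logical entities and their one-to-many relationship become tables with concrete column types, a primary key, and a foreign key.

```python
import sqlite3

# In-memory database standing in for a production engine.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs this opt-in

# Physical model: the logical entities "customer" and "order" become
# tables with concrete types, a primary key, and a foreign key.
conn.execute("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    )
""")
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        order_date  TEXT NOT NULL
    )
""")

conn.execute("INSERT INTO customers VALUES (1, 'Acme Pty Ltd')")
conn.execute("INSERT INTO orders VALUES (10, 1, '2024-01-15')")

# The foreign key enforces the relationship: an order cannot
# reference a customer that does not exist.
try:
    conn.execute("INSERT INTO orders VALUES (11, 99, '2024-01-16')")
except sqlite3.IntegrityError:
    print("rejected: no such customer")
```

The same logical model could be implemented quite differently on another engine (different types, partitioning, indexing), which is exactly why the logical and physical layers are kept separate.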
Data modelling approaches for analytics:
- Star schema: Central fact table surrounded by dimension tables (most common for BI)
- Snowflake schema: Normalised dimension tables for reduced redundancy
- Wide/flat tables: Denormalised tables for simplicity and query performance
- Data vault: Hub, link, and satellite tables for flexible, auditable warehousing
- One Big Table (OBT): Extremely denormalised single table for simple use cases
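The star schema above can be sketched in a few lines. This is an illustrative example only (the `dim_`/`fact_` names and sample data are assumptions): a central fact table holds numeric measures plus foreign keys into descriptive dimension tables, and a typical BI query joins and aggregates across them.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

conn.executescript("""
    -- Dimension tables hold descriptive attributes.
    CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, state TEXT);
    CREATE TABLE dim_product  (product_key  INTEGER PRIMARY KEY, name TEXT, category TEXT);

    -- Central fact table: one row per sale, a numeric measure plus
    -- foreign keys into each dimension.
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        customer_key INTEGER REFERENCES dim_customer(customer_key),
        product_key  INTEGER REFERENCES dim_product(product_key),
        amount       REAL
    );

    INSERT INTO dim_customer VALUES (1, 'Acme', 'NSW'), (2, 'Globex', 'VIC');
    INSERT INTO dim_product  VALUES (1, 'Widget', 'Hardware'), (2, 'Licence', 'Software');
    INSERT INTO fact_sales   VALUES (1, 1, 1, 100.0), (2, 1, 2, 250.0), (3, 2, 1, 75.0);
""")

# A typical BI query: join the fact table to a dimension and aggregate.
rows = conn.execute("""
    SELECT c.state, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_customer c ON c.customer_key = f.customer_key
    GROUP BY c.state
    ORDER BY c.state
""").fetchall()
print(rows)  # [('NSW', 350.0), ('VIC', 75.0)]
```

Because every query follows the same fact-to-dimension join pattern, star schemas are straightforward for both analysts and BI tools to work with.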
Key data modelling concepts:
- Entities: Objects or concepts about which data is stored (customers, orders, products)
- Attributes: Properties of entities (customer name, order date, product price)
- Relationships: Connections between entities (a customer places orders)
- Cardinality: The nature of relationships (one-to-one, one-to-many, many-to-many)
- Keys: Fields that uniquely identify records (primary keys, foreign keys)
- Normalisation: Reducing redundancy through table decomposition
- Denormalisation: Intentionally adding redundancy for query performance
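Several of these concepts meet in the standard handling of many-to-many cardinality: the relationship is resolved through a junction table whose composite primary key pairs two foreign keys. The sketch below assumes illustrative `orders`/`products` naming, where an order contains many products and a product appears on many orders.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

conn.executescript("""
    CREATE TABLE orders   (order_id   INTEGER PRIMARY KEY);
    CREATE TABLE products (product_id INTEGER PRIMARY KEY, name TEXT);

    -- Junction table resolving the many-to-many relationship; the
    -- composite primary key pairs the two foreign keys.
    CREATE TABLE order_items (
        order_id   INTEGER REFERENCES orders(order_id),
        product_id INTEGER REFERENCES products(product_id),
        quantity   INTEGER NOT NULL,
        PRIMARY KEY (order_id, product_id)
    );

    INSERT INTO orders   VALUES (1), (2);
    INSERT INTO products VALUES (1, 'Widget'), (2, 'Gadget');
    INSERT INTO order_items VALUES (1, 1, 2), (1, 2, 1), (2, 1, 5);
""")

# Each product can now be counted across all the orders it appears on.
rows = conn.execute("""
    SELECT p.name, SUM(i.quantity)
    FROM order_items i JOIN products p ON p.product_id = i.product_id
    GROUP BY p.name ORDER BY p.name
""").fetchall()
print(rows)  # [('Gadget', 1), ('Widget', 7)]
```

The junction table is itself an example of normalisation: quantities live in exactly one place, rather than being duplicated on either side of the relationship.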
Modern data modelling tools:
- dbt (data build tool) for transformation modelling
- Looker's LookML for semantic modelling
- ERD tools (draw.io, Lucidchart, dbdiagram.io)
- Database-specific modelling tools
Business Context
Good data modelling ensures that data is organised in ways that support efficient querying, accurate reporting, and scalable analytics, and helps avoid the costly rework that poorly modelled data typically requires.
How Clever Ops Uses This
Clever Ops designs data models for Australian businesses that balance performance, flexibility, and maintainability. We use modern approaches like star schemas and dbt-based transformation layers to create analytics-ready data structures that grow with the business.
Example Use Case
"A business redesigns its analytics data model from a collection of ad-hoc queries against operational databases to a properly modelled star schema in a data warehouse, reducing report generation time from hours to seconds."
Related Resources
Data Warehouse
A centralised repository that stores integrated data from multiple sources for r...
Data Governance
The framework of policies, processes, and standards for managing data assets. En...
ETL
A data integration process that extracts data from sources, transforms it to fit...
