DEV Community

Sunil
Understanding Salesforce Data 360 Objects: The Core of the Unified Customer Profile

Salesforce Data 360 (formerly Data Cloud) is becoming the foundational layer for almost every new Salesforce innovation, blending real-time data orchestration with enterprise-grade architecture. To successfully unify customer data and activate insights, it is essential to understand the unique data objects that form the platform's architecture.

Unlike traditional Salesforce Clouds, which rely on a common core platform and relational databases, Data Cloud uses a distinct technology stack built to handle petabyte-scale data, leveraging services such as DynamoDB for hot storage and Amazon S3 for cold storage. The physical architecture is structured as a set of layered objects that govern how data is ingested, harmonized, and ultimately activated.

The Data Object Hierarchy in Salesforce Data Cloud

Data flows through a defined sequence of objects as it moves from its external source into the unified Customer 360 profile:

  1. Data Source (The Origin)

The Data Source represents the initial layer, which is the platform or system where your data originates, outside of Data Cloud itself. Examples of these sources include Salesforce platforms (like Sales Cloud, Marketing Cloud, Commerce Cloud), object storage platforms (Amazon S3, Google Cloud Storage), SFTP, or Ingestion APIs.

  2. Data Stream (The Entity Path)

A Data Stream is an entity extracted from a specific Data Source—for instance, 'Contacts' from Sales Cloud or 'Orders' from Commerce Cloud. When setting up a Data Stream, it must be assigned a category: Profile, Engagement, or Other. A single Data Source can contain multiple Data Streams.
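As a mental model only (the class and field names below are illustrative, not the actual Data Cloud API), the source-to-stream relationship can be sketched in Python:

```python
from dataclasses import dataclass
from enum import Enum

class StreamCategory(Enum):
    """The three categories a Data Stream must be assigned at setup."""
    PROFILE = "Profile"        # identity-bearing records, e.g. Contacts
    ENGAGEMENT = "Engagement"  # time-stamped behavioral events
    OTHER = "Other"            # everything else, e.g. product catalogs

@dataclass
class DataStream:
    """One entity extracted from a specific Data Source."""
    source: str              # the originating platform
    entity: str              # the extracted entity
    category: StreamCategory

# A single Data Source can contain multiple Data Streams:
sales_cloud_streams = [
    DataStream("Sales Cloud", "Contacts", StreamCategory.PROFILE),
    DataStream("Sales Cloud", "Tasks", StreamCategory.ENGAGEMENT),
]
```

The category chosen here matters downstream, since DMOs later inherit a category from the first DLO mapped to them.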

  3. Data Source Object (DSO: The Raw Staging Area)

The Data Source Object (DSO) is where a Data Stream is first ingested. It acts as a physical, temporary staging store that holds the data in its raw, native file format (such as a CSV file). Minor transformations can be performed using formulas applied to fields at the time of data ingestion into the DSO.
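A rough Python sketch of that ingestion-time formula step (the field names and the formula itself are hypothetical, not Data Cloud formula syntax):

```python
import csv
import io

# A hypothetical ingestion formula: derive FullName from the raw
# FirstName and LastName fields as each row lands in the DSO.
def full_name_formula(row: dict) -> dict:
    row["FullName"] = f"{row['FirstName']} {row['LastName']}".strip()
    return row

# The DSO holds the stream's data in its raw, native file format (here, CSV).
raw_csv = "FirstName,LastName,Email\nAda,Lovelace,ada@example.com\n"

# Minor transformations are applied at ingestion time.
dso_rows = [full_name_formula(row) for row in csv.DictReader(io.StringIO(raw_csv))]
print(dso_rows[0]["FullName"])  # Ada Lovelace
```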

  4. Data Lake Object (DLO: The Prepared Data Store)

The Data Lake Object (DLO) is the next stage and represents the first object available for inspection and preparation. It is the product of a DSO and any transformations applied. DLOs provide a physical store, generally residing in storage containers in the data lake (Amazon S3) as Apache Parquet files, which are column-oriented for efficient storage and retrieval. DLOs are typed, schema-based, and materialized views.
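To see why the column-oriented layout matters, here is a toy Python comparison (the data is invented, and real DLOs use Parquet files rather than Python lists):

```python
# Row-oriented layout: each record is stored together, so scanning a
# single field still means reading every record.
rows = [
    {"id": 1, "amount": 120.0, "region": "EU"},
    {"id": 2, "amount": 75.5,  "region": "US"},
    {"id": 3, "amount": 200.0, "region": "EU"},
]

# Column-oriented layout (Parquet-style): each field is stored
# contiguously, so an aggregate over one column touches only that
# column's values — the key to efficient storage and retrieval.
columns = {
    "id": [1, 2, 3],
    "amount": [120.0, 75.5, 200.0],
    "region": ["EU", "US", "EU"],
}

total = sum(columns["amount"])  # reads one list, not every record
```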

Data Spaces, which are logical partitions used for security and organization, allow administrators to filter DLOs and assign them to specific groups of users, ensuring each group only accesses relevant data.

  5. Data Model Object (DMO: The Canonical View)

The Data Model Object (DMO) is critical for harmonization and activation. Unlike DSOs and DLOs, the DMO enables a virtual, non-materialized view into the data lake.

• Canonical Model: DMOs align with the Customer 360 Data Model, providing a standard, canonical data model with pre-defined attributes (standard objects). Custom DMOs can also be created.

• Virtual Nature: When a query is run on a DMO, the result is not stored; it is always based on the current data snapshot in the underlying DLOs.

• Inheritance and Relationships: DMOs inherit a category from the first DLO mapped to them. They can have standard or custom relationships (one-to-one or many-to-one) with other DMOs, similar to standard Salesforce objects. There are currently 89 standard DMOs, supporting various entity use cases.
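The virtual, non-materialized behavior can be illustrated with a small Python sketch (object and field names are made up for illustration): a DMO-style view stores nothing itself and recomputes from the current DLO contents on every query.

```python
# Hypothetical DLO: a mutable, materialized store of prepared rows.
dlo_individuals = [
    {"Id": "1", "FirstName": "Ada", "Country": "UK"},
]

# Hypothetical DMO: a virtual view. No result is stored; each query
# re-reads the current DLO snapshot and applies the canonical mapping.
def dmo_individual():
    for row in dlo_individuals:
        yield {"Individual.Id": row["Id"], "Individual.FirstName": row["FirstName"]}

first = list(dmo_individual())

# New data lands in the DLO...
dlo_individuals.append({"Id": "2", "FirstName": "Grace", "Country": "US"})

# ...and the next DMO query reflects it with no refresh step.
second = list(dmo_individual())
```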

Strategic Takeaways for Implementation

For an effective Data 360 implementation, understanding the role of these objects is crucial to the "Consistency" pillar: how you model your data.

• Focus on Traits, not Raw Events: Avoid the pitfall of "Shoving exhaust data into CRM objects". Instead, keep high-cardinality events (like clickstream data) in Data 360 (in the DLO layer) and publish only small summaries or key traits back to Salesforce CRM objects (like Opportunity or Case) for pages, reports, and coaching.

• Unification vs. Golden Record: Remember that Data 360’s unification process merges data from multiple sources into a unified view for activation without altering the original data (it is not a traditional "golden record" that overwrites source systems).

• Start Small and Model Wisely: Success depends on modeling data for activation, not just reporting. Adopt a crawl-walk-run approach: start with a single use case to learn how ingestion, unification, and activation work before scaling up.
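To make the "traits, not raw events" idea concrete, here is a minimal Python sketch (contact IDs, the event shape, and the trait names are all hypothetical):

```python
from collections import Counter

# High-cardinality clickstream events: these stay in the Data 360 DLO layer.
clickstream = [
    {"contact": "003A", "page": "/pricing"},
    {"contact": "003A", "page": "/pricing"},
    {"contact": "003A", "page": "/docs"},
    {"contact": "003B", "page": "/home"},
]

def summarize_traits(events):
    """Collapse raw events into the small per-contact traits worth
    publishing back to CRM objects; the events themselves stay put."""
    traits = {}
    for contact, visits in Counter(e["contact"] for e in events).items():
        pages = [e["page"] for e in events if e["contact"] == contact]
        traits[contact] = {
            "page_views": visits,
            "top_page": Counter(pages).most_common(1)[0][0],
        }
    return traits

traits = summarize_traits(clickstream)
```

Only the tiny `traits` summary would be written to CRM for pages, reports, and coaching; the four raw events never leave the lake.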

Understanding Data Cloud's unique layered object model (Source > Stream > DSO > DLO > DMO) is like understanding the manufacturing process for a complex product: each stage refines the raw material until it reaches the final, usable form that powers personalization and business action across the entire Salesforce ecosystem.
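That manufacturing metaphor — Source > Stream > DSO > DLO > DMO — can be caricatured end to end in a few lines of Python (all names and transformations are illustrative, not Data Cloud behavior):

```python
import csv
import io

# Source: raw CSV from an external system (contents are invented).
source = "Email,First\nada@example.com,Ada\n"

# Stream -> DSO: ingest the rows in their raw shape.
dso = list(csv.DictReader(io.StringIO(source)))

# DSO -> DLO: a prepared, typed copy (here: normalized names and casing).
dlo = [{"email": r["Email"].lower(), "first_name": r["First"]} for r in dso]

# DLO -> DMO: a virtual, canonical view ready for activation.
def dmo():
    return [{"Individual.FirstName": r["first_name"]} for r in dlo]
```

Each stage refines the previous one's output without destroying it — the raw `dso` rows survive even after the `dlo` and `dmo` views exist.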
