Activates pre-built SAP Business Content packages in Datasphere to deploy industry-specific data models, manage updates, handle prerequisites like Time Dimension and Currency Conversion, and align with LSA++ architecture.
npx claudepluginhub mariodefelipe/sap-datasphere-plugin-for-claude-cowork
This skill uses the workspace's default tool permissions.
SAP Business Content is a collection of pre-built, production-ready data models, analytical views, and data flows designed for specific industries and business domains. Rather than building your entire analytics solution from scratch, Business Content gives you:
| Benefit | Impact |
|---|---|
| Time-to-Value | Deploy analytics in weeks vs. months |
| Best Practices | Industry-standard data modeling patterns |
| Reduced Customization | 70-80% of requirements covered by content |
| Consistency | Standardized KPIs across the organization |
| Maintainability | SAP updates content; you benefit from innovations |
The SAP Datasphere Content Network is where you browse, preview, and select Business Content packages for activation.
Navigate to Content Network:
https://your-datasphere-instance/business-content/network
Search and Filter:
Preview Package:
Each package displays:
Before activating any Business Content, ensure these foundational elements are in place.
Time Dimension is the backbone of temporal analytics; Business Content relies on it heavily.
In Datasphere:
1. Go to Business Content > Administration > Prerequisites
2. Look for "Time Dimension" table
3. Check: Status = "Populated" (with date range)
If the Time Dimension table exists but is empty:
Download Time Dimension Data File:
Load Data:
Verify Population:
SELECT MIN(DATE), MAX(DATE), COUNT(*) FROM TIME_DIMENSION
| DATE | YEAR | QUARTER | MONTH | WEEK | DAY_OF_WEEK | FISCAL_YEAR | FISCAL_QUARTER | IS_HOLIDAY |
|---|---|---|---|---|---|---|---|---|
| 2024-01-01 | 2024 | Q1 | 1 | 1 | Monday | 2024 | Q1 | true |
| 2024-01-02 | 2024 | Q1 | 1 | 1 | Tuesday | 2024 | Q1 | false |
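If no SAP-provided Time Dimension file is at hand, rows in the shape shown above can be generated with a short script. This is a sketch, not SAP's loader: the column names mirror the sample table, the fiscal columns are assumed calendar-aligned, and IS_HOLIDAY is left false because holiday calendars are organization-specific.

```python
from datetime import date, timedelta

def time_dimension_rows(start: date, end: date):
    """Yield one row per day matching the sample TIME_DIMENSION layout."""
    d = start
    while d <= end:
        quarter = f"Q{(d.month - 1) // 3 + 1}"
        yield {
            "DATE": d.isoformat(),
            "YEAR": d.year,
            "QUARTER": quarter,
            "MONTH": d.month,
            "WEEK": d.isocalendar()[1],          # ISO week number
            "DAY_OF_WEEK": d.strftime("%A"),
            "FISCAL_YEAR": d.year,               # assumption: fiscal = calendar
            "FISCAL_QUARTER": quarter,           # assumption: fiscal = calendar
            "IS_HOLIDAY": False,                 # holiday calendar is org-specific
        }
        d += timedelta(days=1)

rows = list(time_dimension_rows(date(2024, 1, 1), date(2024, 1, 2)))
```

Write the rows out as CSV and load them through the Data Builder upload, then re-run the verification query above.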
Currency Conversion enables multi-currency reporting and harmonization. Business Content uses these views to convert transactional amounts to reporting currency.
Business Content packages typically require:
In Datasphere:
1. Go to Business Content > Administration > Prerequisites
2. Look for "Currency Conversion" (TCUR*)
3. Check: Status = "Available" (with rates populated)
Source Currency Master Data:
Create Currency Conversion View:
Verify Exchange Rates:
SELECT SOURCE_CURRENCY, TARGET_CURRENCY, RATE FROM TCURV
WHERE VALID_FROM <= TODAY() AND VALID_TO >= TODAY()
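The lookup that the verification query performs can be illustrated in a few lines. This is a sketch with hypothetical sample rates, not the actual TCURV logic: it picks the rate whose validity window covers the conversion date and applies it.

```python
from datetime import date

# Illustrative TCURV-like rate table (hypothetical sample rates).
rates = [
    {"SOURCE": "EUR", "TARGET": "USD", "RATE": 1.10,
     "VALID_FROM": date(2024, 1, 1), "VALID_TO": date(2024, 12, 31)},
    {"SOURCE": "GBP", "TARGET": "USD", "RATE": 1.27,
     "VALID_FROM": date(2024, 1, 1), "VALID_TO": date(2024, 12, 31)},
]

def convert(amount: float, source: str, target: str, on: date) -> float:
    """Convert amount using the rate valid on the given date."""
    if source == target:
        return amount
    for r in rates:
        if (r["SOURCE"] == source and r["TARGET"] == target
                and r["VALID_FROM"] <= on <= r["VALID_TO"]):
            return round(amount * r["RATE"], 2)
    raise LookupError(f"no rate for {source}->{target} on {on}")

print(convert(100.0, "EUR", "USD", date(2024, 6, 1)))  # 110.0
```

If the validity check raises, the rate table has a gap, which is exactly the failure the prerequisite check above is meant to catch before activation.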
Business Content often includes measurements (quantity, weight, volume). Unit of Measure (UOM) tables normalize these:
In Datasphere:
1. Go to Business Content > Administration > Prerequisites
2. Look for "Unit of Measure" table
3. Check: Status = "Available"
| UNIT_CODE | UNIT_NAME | CATEGORY | CONVERSION_FACTOR |
|---|---|---|---|
| KG | Kilogram | Weight | 1.0 |
| LB | Pound | Weight | 0.453592 |
| MTR | Meter | Length | 1.0 |
| KM | Kilometer | Length | 1000.0 |
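The conversion factors above normalize each unit to its category's base unit, so any two units in the same category convert with one multiply and one divide. A minimal sketch using the sample factors from the table:

```python
# Conversion factors to each category's base unit, mirroring the table above.
UOM = {
    "KG":  ("Weight", 1.0),
    "LB":  ("Weight", 0.453592),
    "MTR": ("Length", 1.0),
    "KM":  ("Length", 1000.0),
}

def convert_uom(value: float, from_unit: str, to_unit: str) -> float:
    """Convert via the base unit; units must share a category."""
    from_cat, from_factor = UOM[from_unit]
    to_cat, to_factor = UOM[to_unit]
    if from_cat != to_cat:
        raise ValueError(f"cannot convert {from_cat} to {to_cat}")
    return value * from_factor / to_factor
```

This is why the verification step below insists that every conversion factor is populated: a missing factor silently breaks cross-unit aggregation.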
Load standard UOM data:
1. Create data flow from SAP system → UOM table
2. Or upload CSV with standard UOM master data
3. Verify all conversion factors are populated
Depending on the Business Content package, also verify:
| Dependency | Purpose | Check |
|---|---|---|
| Organizational Hierarchy | Drill-down by division, region, department | Table populated? |
| Customer Master | Customer dimensions and attributes | Source system connectivity? |
| Product Master | Product hierarchies and classifications | UPC/SKU mappings available? |
| General Ledger Accounts | Chart of Accounts for financial analysis | GL account mapping available? |
Packages typically include:
Data Models (tables for raw data ingestion):
Analytical Views (pre-aggregated analytics):
Data Flows (ETL templates):
Decide where content will be activated:
Option A: Single Space (recommended for small teams)
Option B: Separate Spaces by Layer (recommended for large orgs using LSA++)
If objects already exist in the target space:
| Scenario | Action |
|---|---|
| First activation | Proceed (no conflicts) |
| Re-activating same version | Skip (use existing objects) |
| Activating new version | Choose: Overwrite or Keep |
| Custom modifications exist | Choose: Keep (preserve changes) |
Conflict Resolution Dialog:
Existing Object: SALES_ORDERS
┌─────────────────────────────────┐
│ Overwrite (replace with new) │
│ Keep (preserve customizations) │
│ Rename new (add suffix _v2) │
└─────────────────────────────────┘
Activation Progress:
Creating objects: [████████░░] 80% (40/50 objects)
Estimated time remaining: 2 minutes
After activation, content objects appear in the target space.
LSA++ is SAP's recommended architecture for enterprise data warehouses. Business Content is designed to fit seamlessly into LSA++ layers.
┌─────────────────────────────────────────────────────────┐
│ REPORTING LAYER (L3) │
│ Pre-aggregated analytics views for dashboards & reports │
│ Example: Revenue_Analysis, Customer_Metrics │
└─────────────────────────────────────────────────────────┘
↑
┌─────────────────────────────────────────────────────────┐
│ HARMONIZATION LAYER (L2) │
│ Cleansed, standardized, unified data model │
│ Example: Sales_Order_Harmonized, Customer_Unified │
└─────────────────────────────────────────────────────────┘
↑
┌─────────────────────────────────────────────────────────┐
│ PROPAGATION LAYER (L1) │
│ Document-level data, minimal transformation │
│ Example: Sales_Order_Raw, Customer_Raw │
└─────────────────────────────────────────────────────────┘
↑
┌─────────────────────────────────────────────────────────┐
│ INBOUND LAYER (L0) │
│ Raw data extracted from source systems (as-is) │
│ Example: SD_SALESDOCUMENT, MD_CUSTOMER │
└─────────────────────────────────────────────────────────┘
Inbound Layer Objects (L0 - Source extraction):
Propagation Layer Objects (L1 - Document level):
Harmonization Layer Objects (L2 - Unified):
Reporting Layer Objects (L3 - Analytics):
1. Separate Spaces by Layer (recommended):
Datasphere Spaces Structure:
├── INBOUND_LAYER (Space)
│ └── Raw data tables from source systems
│ └── Connections to SAP S/4HANA, Salesforce, etc.
├── HARMONIZATION_LAYER (Space)
│ └── Cleansed and standardized data
│ └── Data flows reading from INBOUND_LAYER
├── REPORTING_LAYER (Space)
│ └── Analytics views and dashboards
│ └── Analytical views reading from HARMONIZATION_LAYER
└── MASTERED_DATA (Space)
└── Reference data (Customer, Product, Organization)
└── Reusable by all layers
Access Control by Layer:
2. Organize Within a Space (alternative for smaller teams):
Single Analytics Space with layering via naming:
├── L0_SALESDOCUMENT (Inbound)
├── L0_CUSTOMER (Inbound)
├── L1_SALESDOCUMENT_PROPAGATED (Propagation)
├── L2_SALESDOCUMENT_HARMONIZED (Harmonization)
├── L3_REVENUE_ANALYSIS (Reporting)
└── L3_CUSTOMER_METRICS (Reporting)
3. Isolate Inbound from Harmonization
Critical principle: never build a data flow directly from a source system to the Reporting Layer.
Wrong (anti-pattern):
Source System → Reporting View
(No data quality checks)
Correct (LSA++ compliant):
Source System → Inbound Tables → Harmonization Layer → Reporting View
(staging) (cleansing) (optimization)
SAP regularly publishes new versions of Business Content. Decide whether to adopt updates.
Patch Update (e.g., 1.0.0 → 1.0.1):
Minor Update (e.g., 1.0 → 1.1):
Major Update (e.g., 1.0 → 2.0):
| Scenario | Recommendation | Notes |
|---|---|---|
| Patch update, no customizations | Overwrite | Apply immediately |
| Patch update, minor customizations | Keep, then merge | Merge changes manually after the update is available |
| Minor update, no customizations | Overwrite | Review new fields before updating |
| Minor update, significant customizations | Keep | Evaluate whether new features justify the rework |
| Major update, critical customizations | Keep | Plan a separate migration project |
| Production system, no customizations | Overwrite | Update after testing in non-prod |
| Development space, any customizations | Overwrite | Easier to re-customize than to maintain drift |
Step 1: Check for Updates
In Datasphere:
1. Go to Business Content > Manage Content
2. Look for "Update Available" badges
3. Click to view release notes and changelog
Step 2: Impact Analysis
For each outdated object:
1. Check if customizations exist (custom fields, flows)
2. Check if dependent views use this object
3. Test update in non-production space first
Step 3: Stage Update in Non-Prod
1. Clone prod space to test space (if using separate spaces)
2. Or create separate test package version
3. Activate updated package version in test space
4. Run regression tests (SQL queries, dashboards)
5. Validate calculated fields and aggregations
Step 4: Decide: Overwrite or Keep
If testing passes and no customizations exist:
Click Overwrite → All objects replaced with new version
If customizations are critical or testing failed:
Click Keep → Old version retained, new version labeled _v2
After keeping old version:
Step 5: Update Production
After non-prod validation:
1. Activate update in production space
2. Monitor performance and error logs
3. Validate dashboards and reports render correctly
4. Communicate update to business users
Package: Automotive Sales & Service Analytics
Package: Automotive Supply Chain
Package: Retail POS & Merchandise Analytics
Package: Retail Supply Chain
Package: Energy & Water Distribution
Package: General Ledger & Financial Reporting
Package: Banking Risk & Compliance
Package: Production & Costing
After activation, content is often tailored for organization-specific needs.
Pattern 1: Add Calculated Fields (Non-breaking)
Existing View: REVENUE_ANALYSIS
├── Base fields: Sales_Amount, Product, Customer
└── Add calculated fields:
├── Margin_Percent = Gross_Margin / Sales_Amount * 100
├── Days_to_Payment = Payment_Date - Invoice_Date
└── Customer_Segment = (custom logic based on revenue)
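The Pattern 1 fields can be sketched in plain code to make the edge cases explicit: Margin_Percent needs a divide-by-zero guard, and Days_to_Payment counts days elapsed from invoice to payment. Column names are taken from the example above:

```python
from datetime import date

def margin_percent(gross_margin: float, sales_amount: float):
    """Margin_Percent = Gross_Margin / Sales_Amount * 100, with a zero guard."""
    if not sales_amount:
        return None  # undefined for zero-revenue rows; surface as NULL
    return gross_margin / sales_amount * 100

def days_to_payment(invoice_date: date, payment_date: date) -> int:
    """Days elapsed from invoice to payment."""
    return (payment_date - invoice_date).days
```

Whatever the expression language in your view, keep the zero and NULL handling explicit; the post-activation checklist below tests exactly these fields for error values.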
Pattern 2: Create Extension Views (Recommended)
Instead of modifying existing views, create new views that extend them:
Business Content View: REVENUE_ANALYSIS (DON'T MODIFY)
↓
New Extension View: REVENUE_ANALYSIS_EXTENDED (your custom logic)
├── Extends: REVENUE_ANALYSIS
├── Adds: Additional dimensions and calculated fields
└── Data flows and dashboards consume _EXTENDED view
Benefits:
Pattern 3: Create Custom Dimensions
Extend master data tables with organization-specific attributes:
Content View: CUSTOMER_MASTER (standard SAP fields)
├── Customer_ID, Name, Industry, Region (standard)
└── Add via custom fields:
├── Account_Manager (org-specific)
├── Customer_Segment_Custom (org-specific classification)
└── Contract_Status (org-specific)
Anti-Pattern 1: Modify Content Objects Directly
❌ DON'T DO THIS:
1. Edit view REVENUE_ANALYSIS (from content)
2. Add custom fields directly
3. Problem: Update overwrites customizations
Anti-Pattern 2: Hard-code Values
❌ DON'T DO THIS:
Sales_Amount WHERE Country = 'USA'
^ Hard-coded country filter breaks for other regions
Better:
✓ DO THIS:
Create parameterized view with country input
Let business users select country via filter
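The parameterized alternative can be sketched as a small query builder: the country becomes a bind parameter supplied at run time instead of a literal baked into the view. REVENUE_ANALYSIS and the Country column are the hypothetical names from the example:

```python
def revenue_query(country=None):
    """Build a parameterized revenue query; no country means no filter."""
    sql = "SELECT Sales_Amount FROM REVENUE_ANALYSIS"
    params = []
    if country is not None:
        sql += " WHERE Country = ?"   # bind parameter, never a literal
        params.append(country)
    return sql, params
```

The same idea applies inside Datasphere itself via view input parameters: the filter value travels with the request, so one view serves every region.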
1. Add Company-Specific Hierarchies
Extend Organization hierarchy:
├── Region (from content)
└── Add: Sales Territory, Account Team (your custom)
Extended View: SALES_REVENUE_BY_TERRITORY
├── Base: REVENUE_ANALYSIS
└── Joined with: Your Territory_Master table
2. Align Chart of Accounts
GL Account mapping table (YOUR custom):
├── Content_GL_Account → Your_GL_Account
├── 400000 (Sales) → 4000 (Sales Revenue)
└── 410000 (Returns) → 4100 (Sales Returns)
Use mapping in data flow:
GL_Details → Map GL Account → Store in HARMONIZED table
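The mapping step in that flow amounts to a dictionary lookup with a failure path for unmapped accounts. A sketch using the sample account numbers above (your mapping table would be the source of truth):

```python
# Hypothetical mapping from content GL accounts to the org's chart of accounts,
# mirroring the example above.
GL_MAP = {
    "400000": "4000",  # Sales -> Sales Revenue
    "410000": "4100",  # Returns -> Sales Returns
}

def map_gl(record: dict) -> dict:
    """Replace the content GL account with the org account during the flow."""
    mapped = dict(record)
    account = record["GL_ACCOUNT"]
    if account not in GL_MAP:
        # Fail loudly: an unmapped account would silently distort reporting.
        raise ValueError(f"unmapped GL account: {account}")
    mapped["GL_ACCOUNT"] = GL_MAP[account]
    return mapped
```

Rejecting unmapped accounts (rather than passing them through) keeps gaps in the mapping table visible during the data flow run instead of surfacing later as wrong totals.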
3. Add Company Fiscal Calendar
If business uses non-Gregorian fiscal calendar:
1. Create custom fiscal calendar master
2. Extend Time Dimension joins with fiscal calendar
3. Reporting uses fiscal year / fiscal quarter
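The fiscal mapping in step 2 can be sketched as a pure function, assuming a fiscal year that starts in a configurable month and is labeled by its ending calendar year. An April start is used here purely as an example; adapt it to your fiscal calendar master:

```python
from datetime import date

def fiscal_period(d: date, fy_start_month: int = 4):
    """Map a calendar date to (fiscal_year, fiscal_quarter).

    Assumes the fiscal year starts in fy_start_month and is labeled
    by the calendar year in which it ends (FY2025 = Apr 2024 - Mar 2025
    with the default April start).
    """
    months_into_fy = (d.month - fy_start_month) % 12
    fiscal_year = d.year + (1 if d.month >= fy_start_month else 0)
    fiscal_quarter = months_into_fy // 3 + 1
    return fiscal_year, f"Q{fiscal_quarter}"
```

Joining this output onto the Time Dimension (step 2) lets every report group by fiscal period without repeating the calendar arithmetic in each view.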
| Error | Cause | Solution |
|---|---|---|
| "Prerequisite not met: Time Dimension" | Time Dimension table empty | Populate Time Dimension with date data |
| "Space quota exceeded" | Not enough memory/disk | Increase space allocation or split across spaces |
| "Object name conflict" | Object exists, conflict resolution not specified | Choose Overwrite or Rename in conflict dialog |
| "Connection test failed" | Source system unreachable | Verify connection credentials and network |
| "Permission denied" | Insufficient space access | Ensure user has space_admin role |
After failed activation, review logs:
In Datasphere:
1. Go to Business Content > Activation History
2. Find failed activation
3. Click View Logs
4. Search for ERROR lines
Log Example:
[2024-02-01 10:15:30] INFO: Activation started for package SALES_ANALYTICS v1.2
[2024-02-01 10:15:45] INFO: Creating objects...
[2024-02-01 10:16:02] ERROR: Failed to create object REVENUE_DAILY
[2024-02-01 10:16:02] ERROR: Reason: "Space SALES_ANALYTICS at capacity (1000 GB / 1000 GB)"
[2024-02-01 10:16:02] WARN: Rollback initiated. 12 objects created, 3 objects rolled back.
After fixing the underlying issue:
1. Go to Business Content > Manage Content
2. Find the package with failed activation
3. Click Retry Activation
4. Review conflict resolution settings
5. Click Confirm
After successful activation, verify everything is working:
Time Dimension table populated with correct date range
SELECT MIN(DATE), MAX(DATE), COUNT(*) FROM TIME_DIMENSION
Currency Conversion rates populated
SELECT COUNT(*) FROM TCURV WHERE VALID_FROM <= TODAY()
Master data tables have records
SELECT TABLE_NAME, COUNT(*) FROM [ACTIVATED_OBJECTS] GROUP BY TABLE_NAME
Data flow test load executed successfully
Key analytical views return data
SELECT TOP 100 * FROM REVENUE_ANALYSIS
-- Should return rows with expected columns
Calculated fields compute without errors
SELECT *, MARGIN_PERCENT FROM REVENUE_ANALYSIS
-- No NULL or error values for MARGIN_PERCENT
Aggregation views perform acceptably
Record a baseline of query times:
- REVENUE_ANALYSIS: 2.1 seconds
- CUSTOMER_METRICS: 1.8 seconds
- MARGIN_ANALYSIS: 3.2 seconds
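A baseline like the one above can be captured with a small timing harness. Here run_query is a placeholder for however you execute the view (for example, a DB-API cursor wrapper); taking the best of several runs smooths out cache warm-up:

```python
import time

def baseline(label, run_query, runs=3):
    """Record the best wall-clock time over several runs of a query callable."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        run_query()                          # placeholder: execute the view here
        timings.append(time.perf_counter() - start)
    best = min(timings)
    print(f"- {label}: {best:.1f} seconds")  # matches the baseline format above
    return best
```

Re-run the same harness after each content update; a regression against the recorded baseline is the earliest signal that an updated view needs attention.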
See references/content-catalog.md for complete prerequisite checklists, activation troubleshooting, industry-specific content listings, and post-activation validation templates.