Generates step-by-step connection setup guides for SAP Datasphere source systems.

Installation:

/plugin marketplace add secondsky/sap-skills
/plugin install sap-datasphere@sap-skills

Generate a connection setup guide for the specified source system. Supported sources:
- SAP S/4HANA Cloud Public Edition via API
- SAP S/4HANA On-Premise via Cloud Connector
- SAP BW/4HANA for model transfer and data replication
- SAP HANA database (Cloud or On-Premise)
- Amazon S3 for file-based data loading
- Azure Blob Storage and Azure Data Lake
- Google Cloud Storage
- Apache Kafka / Confluent for streaming data
- Generic OData services
- REST APIs via Generic HTTP connection
SAP S/4HANA Cloud setup. In the S/4HANA Cloud tenant, create a communication user (for example DATASPHERE_USER, described as "SAP Datasphere Integration"), then create a communication system:

System ID: DATASPHERE
System Name: SAP Datasphere
Host Name: <your-datasphere-tenant>.eu10.hcs.cloud.sap
Users for Inbound Communication:
- User: DATASPHERE_USER
- Authentication: User ID and Password
Next, create a communication arrangement for scenario SAP_COM_0531 (for data extraction):

Communication System: DATASPHERE
Inbound Communication User: DATASPHERE_USER
Outbound Services: Enable required services
In SAP Datasphere, create the connection:

Connection Name: S4HC_PROD
Description: S/4HANA Cloud Production
Connection Details:
Host: <s4hana-tenant>.s4hana.ondemand.com
Port: 443
Authentication:
Method: User Name and Password
User: DATASPHERE_USER
Password: <password>
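Before validating in Datasphere, you can confirm that the host is reachable and the communication user authenticates. A minimal Python sketch, assuming the `requests` library; the service name is a placeholder, not a specific SAP endpoint:

```python
import requests

# Placeholders: replace with your tenant host and communication user.
host = "<s4hana-tenant>.s4hana.ondemand.com"
auth = ("DATASPHERE_USER", "<password>")

# Any OData service exposed by the communication arrangement will do;
# <service-name> is a placeholder under the standard OData path prefix.
url = f"https://{host}/sap/opu/odata/sap/<service-name>/"

resp = requests.get(url, auth=auth, timeout=30)
# 200 means host and credentials work; 401 points at the communication user.
print(resp.status_code)
```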
Save and validate the S4HC_PROD connection.

Supported features:

| Feature | Supported |
|---|---|
| Remote Tables | Yes |
| Replication Flows | Yes |
| Real-Time Replication | Yes (ODP-enabled views) |
| Data Flows | Yes |
SAP S/4HANA On-Premise setup. Open the Cloud Connector administration UI at https://<scc-host>:8443 and add a system mapping:

Back-end Type: ABAP System
Protocol: RFC
Internal Host: <s4hana-host>
Internal Port: <RFC port, typically 33XX>
Virtual Host: s4hana-prod.virtual
Virtual Port: 443
Principal Propagation: Configure if needed
URL Path: /
Access Policy: Path and all sub-paths
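If the mapping later fails with timeouts, a plain TCP check from the Cloud Connector host helps separate network problems from configuration problems. A minimal sketch with placeholder host and port:

```python
import socket

# Placeholders: the backend's internal host and RFC port (33XX, XX = system number).
host, port = "<s4hana-host>", 3300

try:
    # A successful TCP handshake means routing and firewall rules are in order.
    with socket.create_connection((host, port), timeout=5):
        print(f"TCP connection to {host}:{port} succeeded")
except OSError as exc:
    print(f"Cannot reach {host}:{port}: {exc}")
```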
In SAP Datasphere, create the connection:

Connection Name: S4H_ONPREM
Description: S/4HANA On-Premise Production
Connection Details:
Use Cloud Connector: Yes
Location ID: <location-id> (if configured)
Virtual Host: s4hana-prod.virtual
Virtual Port: 443
Client: <SAP Client>
Authentication:
Method: User Name and Password
User: <RFC User>
Password: <password>
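To verify the RFC user independently of Datasphere and the Cloud Connector, you can log on directly from a machine inside the backend network. A sketch using SAP's PyRFC library (requires the SAP NW RFC SDK); all logon values are placeholders:

```python
from pyrfc import Connection

# Placeholders: use the internal host here, not the virtual host.
conn = Connection(
    ashost="<s4hana-host>",
    sysnr="00",              # system number; matches RFC port 33XX
    client="<SAP Client>",
    user="<RFC User>",
    passwd="<password>",
)

# STFC_CONNECTION is a standard test function module that echoes its input.
result = conn.call("STFC_CONNECTION", REQUTEXT="connectivity check")
print(result["ECHOTXT"])
conn.close()
```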
In S/4HANA, ensure the CDS views you want to extract are released for extraction:

@Analytics.dataExtraction.enabled: true

Troubleshooting:

| Issue | Solution |
|---|---|
| Connection timeout | Check Cloud Connector mapping and firewall |
| Authentication failed | Verify RFC user and authorizations |
| No objects found | Check ODP context and CDS view annotations |
| Replication fails | Enable delta queue for source |
Amazon S3 setup. Create an IAM policy that grants Datasphere access to the bucket:

{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:GetObject",
"s3:PutObject",
"s3:ListBucket",
"s3:GetBucketLocation"
],
"Resource": [
"arn:aws:s3:::your-bucket-name",
"arn:aws:s3:::your-bucket-name/*"
]
}
]
}
Attach the policy to a dedicated IAM user (for example, datasphere-integration) and generate an access key for it. Then, in SAP Datasphere, create the connection:

Connection Name: AWS_S3_DATA
Description: AWS S3 Data Lake
Connection Details:
Region: eu-west-1
Bucket: your-bucket-name
Root Path: /datasphere/input/ (optional)
Authentication:
Access Key ID: <access-key>
Secret Access Key: <secret-key>
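To confirm the access key and IAM policy before validating in Datasphere, a minimal sketch using boto3; bucket, region, and credentials are the placeholders from above:

```python
import boto3

# Placeholders from the connection definition above.
s3 = boto3.client(
    "s3",
    region_name="eu-west-1",
    aws_access_key_id="<access-key>",
    aws_secret_access_key="<secret-key>",
)

# ListBucket + GetObject are what Datasphere needs for reading.
resp = s3.list_objects_v2(Bucket="your-bucket-name",
                          Prefix="datasphere/input/", MaxKeys=5)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```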
Validate the AWS_S3_DATA connection, then reference files in flows with patterns such as *.csv or *.parquet.

Supported file formats:

| Format | Read | Write |
|---|---|---|
| CSV | Yes | Yes |
| Parquet | Yes | Yes |
| JSON | Yes | No |
| ORC | Yes | No |
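Since Parquet is supported for both read and write, a quick way to test end to end is to stage a small Parquet file yourself. A sketch using pyarrow; the schema and values are invented for illustration:

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Invented sample data purely for a smoke test.
table = pa.table({
    "order_id": [1001, 1002, 1003],
    "amount": [250.0, 99.9, 410.5],
    "currency": ["EUR", "EUR", "USD"],
})

# Write locally, then upload to the root path (e.g., with boto3 as above).
pq.write_table(table, "orders_sample.parquet")
```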
Azure Blob Storage setup. In SAP Datasphere, create the connection:

Connection Name: AZURE_BLOB_DATA
Description: Azure Blob Storage for Data Lake
Connection Details:
Storage Account: yourstorageaccount
Container: datasphere-input
Root Path: /raw/ (optional)
Authentication:
Method: Account Key
Key: <storage-account-key>
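To verify the account key and container before validating in Datasphere, a sketch using the azure-storage-blob package; account, container, and key are the placeholders from above:

```python
from azure.storage.blob import BlobServiceClient

# Placeholders from the connection definition above.
service = BlobServiceClient(
    account_url="https://yourstorageaccount.blob.core.windows.net",
    credential="<storage-account-key>",
)

container = service.get_container_client("datasphere-input")
# Listing under the root path confirms both the key and container access.
for blob in container.list_blobs(name_starts_with="raw/"):
    print(blob.name)
```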
For Azure Data Lake Storage Gen2, create a separate connection:

Connection Type: Microsoft Azure Data Lake Store Gen2
Connection Details:
Storage Account: yourdatalake
Container: analytics
Note: the storage account must have the hierarchical namespace enabled.
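The same kind of check for ADLS Gen2 uses the dfs endpoint instead of the blob endpoint. A sketch with the azure-storage-file-datalake package; names are the placeholders from above:

```python
from azure.storage.filedatalake import DataLakeServiceClient

# Note the dfs endpoint (not blob) for ADLS Gen2.
service = DataLakeServiceClient(
    account_url="https://yourdatalake.dfs.core.windows.net",
    credential="<storage-account-key>",
)

fs = service.get_file_system_client("analytics")
# get_paths() requires the hierarchical namespace to be enabled.
for path in fs.get_paths(recursive=False):
    print(path.name)
```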
Apache Kafka / Confluent setup. Gather the broker and security details first:

Bootstrap Servers: kafka.example.com:9092
Security Protocol: SASL_SSL (for Confluent Cloud)
SASL Mechanism: PLAIN
Topic Pattern: sales-events-*
In SAP Datasphere, create the Kafka connection:

Connection Name: KAFKA_EVENTS
Description: Event streaming from Kafka
Connection Details:
Bootstrap Servers: broker1.kafka.com:9092,broker2.kafka.com:9092
Security:
Protocol: SASL_SSL
SASL Mechanism: PLAIN
Authentication:
Username: <api-key>
Password: <api-secret>
Then create a replication flow that uses the Kafka connection as source:

Load Type: Initial and Delta
Data Format: JSON or Avro
Target: Local Table
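To produce a test event the replication flow can pick up, a sketch using the confluent-kafka Python client; brokers and credentials are the placeholders from above, and the topic name and payload are invented:

```python
import json
from confluent_kafka import Producer

# Placeholders from the connection definition above.
producer = Producer({
    "bootstrap.servers": "broker1.kafka.com:9092,broker2.kafka.com:9092",
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "<api-key>",
    "sasl.password": "<api-secret>",
})

def on_delivery(err, msg):
    # Called once the broker acknowledges (or rejects) the message.
    print(f"error: {err}" if err else f"delivered to {msg.topic()}")

# Invented payload; the topic matches the sales-events-* pattern above.
event = {"order_id": 1001, "status": "CREATED"}
producer.produce("sales-events-orders", json.dumps(event).encode("utf-8"),
                 callback=on_delivery)
producer.flush(10)
```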
For Confluent Cloud, use the cluster endpoint and API keys:

Bootstrap Servers: pkc-xxxxx.region.aws.confluent.cloud:9092
Security Protocol: SASL_SSL
SASL Mechanism: PLAIN
Username: <API_KEY>
Password: <API_SECRET>
Schema Registry URL: https://psrc-xxxxx.region.aws.confluent.cloud
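To confirm the Schema Registry URL and credentials, a short sketch with the same client library; the registry uses its own API key pair, passed as "key:secret":

```python
from confluent_kafka.schema_registry import SchemaRegistryClient

# Placeholders from the Confluent Cloud settings above.
client = SchemaRegistryClient({
    "url": "https://psrc-xxxxx.region.aws.confluent.cloud",
    "basic.auth.user.info": "<SR_API_KEY>:<SR_API_SECRET>",
})

# Listing subjects verifies URL and credentials in one call.
print(client.get_subjects())
```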
REST API setup. In SAP Datasphere, create a Generic HTTP connection:

Connection Name: REST_API_SOURCE
Description: External REST API
Connection Details:
Base URL: https://api.example.com/v1
Authentication:
Method: OAuth 2.0 Client Credentials
Token URL: https://auth.example.com/oauth/token
Client ID: <client-id>
Client Secret: <client-secret>
Reference the connection when defining the API call, for example in a data flow (see the sketch below):

Connection: REST_API_SOURCE
Method: GET
Path: /data/records
Headers:
Content-Type: application/json
Query Parameters:
startDate: ${startDate}
limit: 1000
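To replay the same call outside Datasphere, a sketch using `requests`; URLs and credentials are the placeholders from above, and the example startDate value is invented:

```python
import requests

# Step 1: fetch a token via the client-credentials grant.
token_resp = requests.post(
    "https://auth.example.com/oauth/token",
    data={
        "grant_type": "client_credentials",
        "client_id": "<client-id>",
        "client_secret": "<client-secret>",
    },
    timeout=30,
)
token = token_resp.json()["access_token"]

# Step 2: the same GET the data flow performs.
resp = requests.get(
    "https://api.example.com/v1/data/records",
    headers={"Authorization": f"Bearer {token}"},
    params={"startDate": "2024-01-01", "limit": 1000},  # example values
    timeout=30,
)
print(resp.status_code)  # payload shape depends on the API
```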
Supported authentication methods:

| Method | Configuration |
|---|---|
| None | No additional config |
| Basic Auth | Username + Password |
| OAuth 2.0 | Token URL + Client ID/Secret |
| API Key | Header name + Key value |
If the API sits behind an IP allowlist, obtain the Datasphere tenant's outbound IP addresses from your tenant administration and add them to the allowlist.

Troubleshooting:

| Error | Solution |
|---|---|
| Timeout | Check network, firewall, Cloud Connector |
| SSL Error | Upload certificate, check TLS version |
| Auth Failed | Verify credentials, check user locks |
| Not Found | Verify host/port, check virtual mapping |
Provide the appropriate guide based on the user's requested source type. Customize the configuration values based on their specific environment.