Microsoft Dynamics 365 Synapse Link
Microsoft Dynamics 365 Synapse Link is a powerful integration that connects Dynamics 365 applications with Azure Data Lake Storage Gen2. It enables near real-time data replication from Dataverse into Azure Data Lake Storage Gen2 for advanced analytics, reporting, and machine learning without impacting the performance of the source systems.
Prerequisites
Azure Synapse Link Configuration
Before connecting to Omnata, you must configure Azure Synapse Link for Dataverse in your Microsoft Dynamics 365 environment. This process involves linking your Dynamics 365 environment to Azure Data Lake Storage Gen2 and selecting which tables to export.
Key Configuration Requirements:
The Synapse Link must be configured with the incremental update folder structure enabled. This is critical for the Omnata plugin to function correctly:
When creating or editing your Synapse Link configuration, select Advanced, then Show advanced configuration settings
Enable the option Enable incremental update folder structure
In the Time interval field, choose your desired frequency for reading incremental data (this determines how the system partitions data into time-stamped folders)
Select the Dataverse tables you want to sync. You can also select Finance and Operations tables if applicable
Important Notes:
Data files are always appended
Data is partitioned yearly
The "Append only" and "Partition" options at the table level are ignored
Configuration Resources:
For detailed step-by-step instructions on configuring Azure Synapse Link, refer to Microsoft's official documentation:
Azure Synapse Link for Dataverse Overview - Comprehensive guide to setting up Synapse Link
Link Finance and Operations with Power Platform - Required for F&O environments
Enable Change Tracking - Configure SQL row version change tracking for F&O apps
Choose Finance and Operations Data - Select and configure F&O tables for export
Supported Products
Azure Synapse Link is supported across several Dynamics 365 products built on Microsoft Dataverse:
Dynamics 365 Customer Engagement (CE) Apps:
Dynamics 365 Sales
Dynamics 365 Customer Service
Dynamics 365 Marketing
Dynamics 365 Field Service
Dynamics 365 Project Operations
Dynamics 365 Finance and Operations (F&O) Apps:
Dynamics 365 Finance
Dynamics 365 Supply Chain Management
Dynamics 365 Commerce
Dynamics 365 Human Resources
System Requirements
Microsoft Dynamics 365 with a cloud-based Tier-2 or higher environment
Azure Data Lake Storage Gen2 account
Snowflake account with appropriate permissions
Azure AD tenant access
Required Permissions
Azure: Storage Blob Data Reader role on the target storage account
Snowflake: ACCOUNTADMIN or equivalent to create storage integrations
Dynamics 365: System Administrator role for Synapse Link configuration
Authentication
Azure side configuration
Omnata uses a two-step authentication approach to securely access your Azure Data Lake Storage. You need to complete these steps before you create the connection in Omnata.
Snowflake Storage Integration - For secure access to Azure Data Lake
Azure AD Service Principal - For programmatic access to storage resources
Step 1: Create Snowflake Storage Integration
Create a storage integration in your Snowflake account to enable secure access to your Azure Data Lake:
CREATE STORAGE INTEGRATION dynamics_365_integration
TYPE = EXTERNAL_STAGE
STORAGE_PROVIDER = 'AZURE'
ENABLED = TRUE
AZURE_TENANT_ID = '<your-tenant-id>'
STORAGE_ALLOWED_LOCATIONS = ('azure://<storage-account>.blob.core.windows.net/<container>/');
Step 2: Grant Integration Permissions
Grant usage permissions to the Omnata applications:
GRANT USAGE ON INTEGRATION dynamics_365_integration TO APPLICATION OMNATA_SYNC_ENGINE;
GRANT USAGE ON INTEGRATION dynamics_365_integration TO APPLICATION OMNATA_DYNAMICS_365_SYNAPSE_LINK;
Step 3: Retrieve Integration Details
Get the integration details needed for Azure configuration:
DESCRIBE INTEGRATION dynamics_365_integration;
Note the following values from the output:
AZURE_MULTI_TENANT_APP_NAME
AZURE_CONSENT_URL
Step 4: Authorize Storage Integration
Visit the AZURE_CONSENT_URL from Step 3 to grant consent in your Azure AD tenant. This authorizes Snowflake to access your Azure storage.
Step 5: Create Azure AD App Registration
Navigate to Azure Portal → Azure Active Directory → App registrations
Click New registration
Provide a descriptive name (e.g., "Omnata-Dynamics365-Sync")
Under Certificates & secrets, create a new client secret
Save the following values for the connection configuration:
Tenant ID
Client ID (Application ID)
Client Secret
Step 6: Assign RBAC Permissions
In your Azure Storage Account, assign the Storage Blob Data Reader role to both:
The App Registration created in Step 5 (using the Client ID)
The Snowflake Storage Integration (using the AZURE_MULTI_TENANT_APP_NAME from Step 3)
This can be done in the Azure Portal under your Storage Account → Access Control (IAM) → Add role assignment.
Create a connection in the Omnata UI
With the previous Azure-side steps complete, you can Create a Connection in the Omnata UI.
The connection form will require the following:
TenantId
ClientId
Client Secret
Storage Account
Container
Storage Integration
Inbound Syncs
The plugin supports ingestion of any entities exported to Azure Data Lake via Synapse Link.
Supported Sync Strategies
Incremental - Uses change tracking based on folder timestamps and record version numbers. Recommended for most entities to efficiently sync only new and changed data.
Full Refresh - Complete data replacement on each sync. Required for kernel/system tables that don't support incremental exports.
Supported Streams
All entities configured in your Azure Synapse Link export are available for syncing, including:
Business Entities (Incremental Sync):
Customer data (accounts, contacts, leads)
Transaction data (sales orders, invoices, payments)
Product data (items, categories, price lists)
Operational data (inventory, production, logistics)
Custom entities enabled in Synapse Link
Kernel/System Tables (Full Refresh Only):
The following system tables don't support incremental exports and must use Full Refresh:
dataarea, userinfo, securityrole, securityuserrole, sqldictionary,
partitions, securityprivilege, timezoneslist, securityduty,
securitysubrole, securityuserrolecondition, databaselog,
securityroleruntime, securityroleprivilegeexplodedgraph,
securityroledutyexplodedgraph, timezonesrulesdata,
securityroleexplodedgraph, userdataareafilter, sysinheritancerelations
Metadata Tracking
Microsoft Azure Synapse Link publishes metadata fields that track synchronization operations. The plugin provides flexible handling of this metadata to suit different use cases.
When Metadata Inclusion is Enabled:
The SinkCreatedOn field is included in the unique identifier
Creates separate records for each sync occurrence of the same data
Provides complete audit trail of all sync operations
Useful for change detection and historical analysis
When Metadata Inclusion is Disabled:
Only the latest version of each record is maintained
Subsequent syncs overwrite previous entries
Reduces storage overhead for high-frequency syncs
Suitable for real-time operational reporting
Additional Metadata Fields:
When metadata inclusion is enabled, these fields are added to the payload:
SinkCreatedOn: Record creation timestamp in Synapse (example: 2024-01-15 14:30:25)
SinkModifiedOn: Record modification timestamp (example: 2024-01-15 14:30:25)
source_stage: Snowflake stage containing the source file (example: dynamics.mystorage_container)
source_filename: Source CSV file name (example: account/20240115T143025Z.csv)
source_row_number: Row position in the source file (example: 1247)
This metadata enables complete data lineage tracking from Azure Data Lake through to Snowflake, supporting audit requirements and troubleshooting.
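When metadata inclusion is enabled and each sync occurrence produces a separate record, you can still recover a "latest version only" view downstream. A hedged sketch using Snowflake's QUALIFY clause; the table name D365_ACCOUNT and key column ACCOUNTID are placeholders, not names the plugin guarantees:

```sql
-- Illustrative only: substitute your synced table and its primary key column.
SELECT *
FROM D365_ACCOUNT
QUALIFY ROW_NUMBER() OVER (
    PARTITION BY ACCOUNTID           -- source primary key
    ORDER BY SinkModifiedOn DESC     -- most recent Synapse write wins
) = 1;
```

This gives the same logical result as disabling metadata inclusion, while keeping the full sync history available for audit queries.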
Advanced Features
Schema Management
The plugin automatically detects and maps schemas from Common Data Model (CDM) definitions:
Automatic Type Mapping: Converts CDM types to Snowflake-compatible data types
Primary Key Detection: Identifies and handles composite primary keys from CDM metadata
Column Ordering: Maintains field positions from the source system
Relationship Handling: Processes field constraints and relationships
Parallel Processing
For optimal performance with large datasets:
Configurable Batch Size: Default of 24 date folders per batch
Multi-threaded Processing: Concurrent folder processing for faster syncs
State Management: Maintains processing state across sync runs for failed folder recovery
Post-Processing Hooks
The Omnata Sync Post-hook enables downstream data processing:
SCD Type 2 Implementation: Track historical changes with effective dating
Data Materialization: Convert views to physical tables for performance
Late-Arriving Data: Handle out-of-sequence updates with custom logic
Data Quality: Apply validation rules and enrichment processes
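As one example of a post-hook, an SCD Type 2 view can be derived from the metadata-tracked output using window functions. This is a sketch under assumptions: D365_ACCOUNT and ACCOUNTID are placeholder names for your synced table and its primary key, and metadata inclusion is enabled so historical versions are retained:

```sql
-- Hedged SCD Type 2 sketch over metadata-tracked sync output.
CREATE OR REPLACE VIEW D365_ACCOUNT_SCD2 AS
SELECT
    t.*,
    SinkModifiedOn AS valid_from,
    -- the next version's timestamp closes this version's validity window
    LEAD(SinkModifiedOn) OVER (
        PARTITION BY ACCOUNTID ORDER BY SinkModifiedOn
    ) AS valid_to,
    -- no later version means this row is the current one
    LEAD(SinkModifiedOn) OVER (
        PARTITION BY ACCOUNTID ORDER BY SinkModifiedOn
    ) IS NULL AS is_current
FROM D365_ACCOUNT t;
```

Materializing such a view as a table is a common follow-up when query performance matters.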
Troubleshooting
Authentication Failures
Symptoms: ClientAuthenticationError or 403 Forbidden errors
Solutions:
Verify Azure AD app registration has a valid client secret
Check Storage Blob Data Reader role assignment for both the app registration and Snowflake integration
Ensure Snowflake storage integration is properly authorized via the consent URL
Validate tenant ID matches between Snowflake integration and Azure configuration
Missing Data or Empty Syncs
Symptoms: Sync completes successfully but no data appears in Snowflake
Solutions:
Verify model.json exists in the Azure Data Lake folders
Confirm entity names match between Dynamics 365 and plugin configuration
Check that Synapse Link export is active and has recent data
Ensure incremental update folder structure is enabled in Synapse Link configuration
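To confirm that Snowflake can actually see files in the data lake, you can list the contents of the stage backing the sync. The stage and entity folder names below are illustrative; use the source_stage value reported in the sync metadata:

```sql
-- Placeholder names: substitute your own stage and entity folder.
LIST @dynamics.mystorage_container/account/;
```

An empty result here points to an Azure-side problem (export not running, wrong container, or missing RBAC) rather than a plugin configuration issue.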
Performance Issues
Symptoms: Slow sync performance or timeouts
Solutions:
Adjust batch size for parallel processing based on data volume
Review Azure Data Lake bandwidth limits
Monitor Snowflake warehouse size and query performance
Consider reducing the number of concurrent streams
Schema Mismatches
Symptoms: Type conversion errors or missing columns
Solutions:
Refresh schema by re-running the connection test
Check CDM model version compatibility
Verify custom field exports are enabled in Synapse Link
Review column mapping in the generated schemas
Best Practices
Performance Optimization
Batch Processing: Start with the default batch size (24) and adjust based on data volume and performance monitoring
Stream Selection: Only sync required entities to reduce processing overhead
Sync Strategy: Use incremental sync for large, frequently changing entities; reserve full refresh for small reference tables
Security
Credential Rotation: Rotate Azure client secrets regularly (recommended: every 6-12 months)
Least Privilege: Use minimal required permissions for Azure AD applications
Monitoring: Review Azure access logs for unusual activity
Monitoring and Maintenance
Health Checks: Monitor sync success rates and review error logs for patterns
Data Validation: Periodically validate data completeness between source and destination
Capacity Planning: Monitor Snowflake credit consumption and Azure storage costs
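Snowflake credit consumption can be tracked with the ACCOUNT_USAGE share, for example per warehouse per day over the last week (note that ACCOUNT_USAGE views can lag real time by up to a few hours):

```sql
SELECT WAREHOUSE_NAME,
       DATE_TRUNC('day', START_TIME) AS usage_day,
       SUM(CREDITS_USED) AS credits
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY
WHERE START_TIME >= DATEADD('day', -7, CURRENT_TIMESTAMP())
GROUP BY 1, 2
ORDER BY usage_day, WAREHOUSE_NAME;
```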
Limitations
Current Limitations:
Outbound sync is not currently supported
Near real-time sync only (latency depends on the Time interval export frequency configured in the Synapse Link prerequisites)
Custom entities must be explicitly enabled in the Synapse Link export configuration
Some complex Dynamics 365 data types may require manual mapping
Scale Considerations:
Files larger than 100MB may impact sync performance
Very frequent exports may cause resource contention
Recommended maximum of 64 concurrent streams
Consider Azure Data Lake retention policies for cost optimization