How Can Dynamics 365 Teams Compress Cloud Files in SSIS ETL Integrations?

Dynamics 365 users who rely on SSIS-based ETL processes often need to move, package, archive, or unpack files across cloud storage platforms. With KingswaySoft SSIS Integration Toolkit v26.1, our Compression Task can now work with cloud connection managers for source and destination paths, allowing files to be compressed or decompressed directly in the cloud without first downloading them to a local file system.

Local staging can slow down file processing, increase temporary storage needs, and introduce unnecessary handling of sensitive files. By supporting cloud connection managers inside the Compression Task, we make compression a more practical part of modern Dynamics 365 integration workflows.


Key Highlights

  • KingswaySoft SSIS Integration Toolkit v26.1 adds cloud connection manager support to the Compression Task.
  • Files can be compressed or decompressed directly from cloud storage without local staging.
  • The task can read from one cloud platform, process files in memory, and write to another cloud platform.
  • Dynamics 365 ETL teams can use this for archiving, file delivery, and processing inbound ZIP files.
  • The examples below show Azure Blob Storage and Amazon S3 workflows.

What Changed in the KingswaySoft Compression Task?

The Compression Task in our SSIS Integration Toolkit (previously part of the SSIS Productivity Pack) already supported file compression and decompression. The important change in v26.1 is that it can now use cloud connection managers for either the source path or the destination path.

Before this update, working with cloud-based files required an SSIS package to download them to a local directory, compress or decompress them there, and then upload the results back to cloud storage. That approach worked, but it added extra steps and moved more data than many ETL workflows really needed.
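To make that legacy pattern concrete, the sketch below shows what the local-staging approach typically looks like in code. It is purely illustrative: it assumes the azure-storage-blob and boto3 Python packages, and every container, bucket, and path name in it is hypothetical. The Compression Task itself requires no code for either approach.

```python
# Illustrative sketch of the pre-v26.1 pattern: stage cloud files on local
# disk, compress them there, then upload the result back to cloud storage.
# Assumes azure-storage-blob and boto3; all names below are hypothetical.
import os
import tempfile
import zipfile

import boto3
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"], container_name="etl-output")
s3 = boto3.client("s3")

with tempfile.TemporaryDirectory() as staging:
    # Step 1: download every source blob into a local staging folder.
    for blob in container.list_blobs(name_starts_with="daily/"):
        local_path = os.path.join(staging, os.path.basename(blob.name))
        with open(local_path, "wb") as f:
            f.write(container.download_blob(blob.name).readall())

    # Step 2: compress the staged files into a local ZIP archive.
    zip_path = os.path.join(staging, "daily.zip")
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as archive:
        for name in os.listdir(staging):
            if name != "daily.zip":
                archive.write(os.path.join(staging, name), arcname=name)

    # Step 3: upload the finished archive back to cloud storage.
    s3.upload_file(zip_path, "partner-bucket", "outbound/daily.zip")
```

Every source file lands on local disk before it is compressed, which is exactly the overhead the v26.1 enhancement removes.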

With the new capability, the Compression Task can read files directly from cloud storage, compress or decompress them in memory, and write the output to another cloud location. For teams integrating Dynamics 365 with other systems, data lakes, storage services, or partner file exchanges, that makes compression easier to include in a clean Control Flow design.

Why Does Cloud Compression Matter for Dynamics 365 ETL Processes?

Cloud compression is useful when Dynamics 365 integration workflows need to package multiple files, reduce file size, or extract inbound archives before downstream processing. In practical terms, this can support scenarios such as archiving processed files, preparing files for external delivery, or unpacking ZIP files received from another system.

The main benefit is that files no longer have to pass through a local staging folder just to be compressed or decompressed. That can reduce latency, avoid extra storage overhead, and limit the need to place sensitive data on local disks, even temporarily.

For Dynamics 365 users, this is especially relevant when integrations touch ERP, CRM, reporting, cloud storage, or partner systems. File handling is often one part of a larger workflow, and removing unnecessary local steps helps keep the overall process simpler and easier to maintain.

Let’s examine a couple of scenarios that benefit from the enhanced functionality in version 26.1.

Example 1: Compressing a Cloud Directory into a ZIP File

In this example, the source is a directory of files stored in an Azure Blob Storage container. The goal is to collect those files into one ZIP archive and write that archive directly to an Amazon S3 bucket.

This pattern fits well when a prior process creates several output files that need to be bundled together. It can also be used for archival pipelines, file delivery workflows, or cases where a receiving party expects one compressed file instead of multiple separate files.

To configure the workflow, first create the required connection managers in the SSIS project. One connection manager should point to the Azure Blob Storage account, and the other should point to the Amazon S3 bucket. After those are available, add a Compression Task to the Control Flow and open the editor.

  1. On the General tab, set the Action to Compress and set the Compression Format to Zip.
  2. Under Source Directory/File Settings, choose the Azure Blob Storage Connection Manager from the Connection Manager drop-down. Set the Source Type to Directory, then enter the source directory in the Source Path field.
  3. Under Destination Directory/File Settings, choose the Amazon S3 Connection Manager. To define the destination path, click the ellipsis button beside the Destination Path field. This opens a file browser that shows the S3 bucket structure. From there, you can specify the destination ZIP file in one of two ways:
    1. Using the file browser: Navigate to the target folder and use the New File… button to create a new file with a .zip extension. Select the new file and click OK. Because the file now exists in the destination, enable Overwrite Existing Items so the task can write to it when the package runs.
    2. Typing the path manually: Select an existing file in the browser to populate the Destination Path field, then close the dialog and manually edit the field with the intended full path and file name, ending in .zip. Since that file does not already exist at the destination path, Overwrite Existing Items does not need to be enabled for the first run.
  4. If needed, enable Include Subdirectories so the task also includes files inside subfolders. You can also adjust the Compression Level and enter a Password under Advanced Settings if the ZIP archive must be password-protected.

When the package runs, the Compression Task streams the files from Azure Blob Storage, compresses them in memory, and writes the ZIP file to Amazon S3 as one direct cloud-to-cloud operation.
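For readers who want to picture what that single step replaces, here is a rough Python equivalent of the same cloud-to-cloud operation: an in-memory ZIP built from Azure Blob Storage and streamed to Amazon S3. This is a conceptual sketch only, assuming the azure-storage-blob and boto3 packages and hypothetical container, bucket, and path names; in the actual package, the Compression Task and its connection managers handle all of this with no code.

```python
# Conceptual sketch of a direct cloud-to-cloud ZIP build (no local staging).
# Assumes azure-storage-blob and boto3; all names below are hypothetical.
import io
import os
import zipfile

import boto3
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"], container_name="etl-output")
s3 = boto3.client("s3")

buffer = io.BytesIO()  # the archive lives in memory, never on local disk
with zipfile.ZipFile(buffer, "w", zipfile.ZIP_DEFLATED) as archive:
    for blob in container.list_blobs(name_starts_with="daily/"):
        data = container.download_blob(blob.name).readall()
        archive.writestr(blob.name.removeprefix("daily/"), data)

buffer.seek(0)
# Stream the in-memory archive straight into the destination bucket.
s3.upload_fileobj(buffer, "partner-bucket", "outbound/daily.zip")
```

The io.BytesIO buffer is the key detail: nothing is written to a local folder, which mirrors the direct cloud-to-cloud behavior described above. The task also resolves credentials through its connection managers rather than environment variables.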

Example 2: Decompressing a ZIP File into a Cloud Directory

The second example reverses the flow. In this case, the source is a ZIP file stored in a cloud storage platform such as an Amazon S3 bucket, and the destination is a directory in an Azure Blob Storage container.

This type of decompression workflow is common when external vendors or partner systems deliver files as compressed archives. They may do this to reduce transfer size, bundle multiple files into a single payload, or meet a required file exchange format. By extracting the archive directly into the target cloud location, the files are ready for the next step in the ETL process without an intermediate local folder.

The setup is similar to the compression example. With the cloud connection managers already configured, add or reconfigure a Compression Task and open the editor.

  1. On the General tab, set the Action to Decompress and set the Compression Format to Zip.
  2. Under Source Directory/File Settings, select the Amazon S3 Connection Manager and set the Source Type to File. Enter the path to the source ZIP file in the Source Path field.
  3. Under Destination Directory/File Settings, select the Azure Blob Storage Connection Manager and enter the destination directory path where the extracted files should be written.
  4. If files with the same names may already exist in the destination directory, enable Overwrite Existing Items. If the ZIP file is password protected, enter the password under Advanced Settings.

When the package runs, the Compression Task retrieves the ZIP file from Amazon S3 and extracts its contents into the specified Azure Blob Storage directory.
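As before, a minimal Python sketch can illustrate the equivalent operation, assuming boto3 and azure-storage-blob with hypothetical bucket, container, and path names; the Compression Task performs this as a single no-code step.

```python
# Conceptual sketch of extracting a cloud-hosted ZIP directly into another
# cloud directory. Assumes boto3 and azure-storage-blob; names hypothetical.
import io
import os
import zipfile

import boto3
from azure.storage.blob import ContainerClient

s3 = boto3.client("s3")
container = ContainerClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"], container_name="inbound")

# Pull the ZIP from S3 into memory rather than onto a local disk.
buffer = io.BytesIO()
s3.download_fileobj("partner-bucket", "inbound/vendor-files.zip", buffer)
buffer.seek(0)

with zipfile.ZipFile(buffer) as archive:
    for member in archive.namelist():
        if member.endswith("/"):
            continue  # skip directory entries; blob paths carry the hierarchy
        # overwrite=True plays the role of Overwrite Existing Items here.
        container.upload_blob(
            name=f"extracted/{member}",
            data=archive.read(member),
            overwrite=True)
```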

Which Cloud Storage Platforms Can This Support?

The examples above use Azure Blob Storage and Amazon S3, but the v26.1 enhancement is not limited to those two services. This capability applies across our broader cloud ecosystem, including Azure Data Lake, Google Cloud Storage, OneDrive, and other supported cloud platforms.

The important point is that the Compression Task is no longer tied to the local file system for these workflows. If the files already live in cloud storage, the task can handle compression or decompression where the files are, rather than forcing the package to stage them locally first.

FAQ

Q: What is the main benefit of using cloud connection managers with the Compression Task?

A: The main benefit is that files can be compressed or decompressed directly from cloud storage without being downloaded to a local staging folder first.

Q: How does this help Dynamics 365 users?

A: Dynamics 365 users running SSIS-based ETL processes can simplify file handling when integrations involve cloud storage, archived outputs, partner file exchanges, or downstream processing.

Q: Does the Compression Task support both compression and decompression?

A: Yes. The examples show both compressing a cloud directory into a ZIP file and decompressing a ZIP file into a cloud directory.

Q: Do these workflows require Azure Blob Storage and Amazon S3?

A: No. The examples use Azure Blob Storage and Amazon S3, but the capability also extends across other supported cloud platforms such as Azure Data Lake, Google Cloud Storage, and OneDrive.

Take the Next Step

If your Dynamics 365 ETL workflows still rely on local staging to compress or extract cloud-based files, this is a good time to review those package designs. With KingswaySoft SSIS Integration Toolkit, you can handle ZIP files where they already live, helping reduce unnecessary file movement and simplify cloud-focused integration processes.

Ready to upgrade? Download the latest release of the SSIS Integration Toolkit, and contact us today to learn more about our performant, cost-effective integration tools.

 

By KingswaySoft | www.kingswaysoft.com
