How to upload processed files from Azure Data Factory to cloud storage

Posted on May 30, 2021 • Updated on Nov 27, 2025

Using Microsoft Azure’s Data Factory, you can extract data from Amazon S3 and Google Cloud Storage into your data pipeline (ETL workflow). However, Azure Data Factory does not natively let you load (put/upload) files back into these platforms at the end of your extract, transform, load cycle.

This is where Couchdrop comes in: it works around this limitation, letting you load files back into these platforms.

In this guide, we'll show you how to use Couchdrop to upload processed files from Azure Data Factory to any cloud storage platform. 


Couchdrop and cloud storage

Couchdrop supports Google Cloud Storage, Amazon S3, SharePoint, Dropbox, and over 25 other platforms. Using Couchdrop with Azure Data Factory, you can pull data from any cloud storage platform, transform it, and then load it to the same or a different cloud platform, all through SFTP.

[Diagram: ETL with any platform]

Using Couchdrop with Azure Data Factory

Configuring Couchdrop is straightforward and only takes a couple of steps. You can create users who are locked to specific buckets and are limited to specific file operations (upload only, download only, read/write, etc.).

You can also configure webhooks that fire on upload/download events in certain folders. This enables you to trigger different workflows based on the folder a file was uploaded to and the user who uploaded it.

Couchdrop also offers an API to assist with onboarding users programmatically.

To get up and running with Couchdrop’s cloud SFTP server and integrate it into your ETL processes, follow these steps:

  • Step 1. Configure storage in Couchdrop
  • Step 2. Configure user(s)
  • Step 3. Connect to Couchdrop using the SFTP connector in Azure Data Factory

Step 1. Configure storage in Couchdrop

First, connect your storage endpoint. You can add a new integration from any page. For example, to connect a GCS bucket, choose Google Cloud, then add the information for your bucket and a service account JSON file. Test the connection, then choose the name the integration will appear under in Couchdrop, such as naming the virtual folder Google Cloud.

You can use any of the 30+ storage integrations as an endpoint for this process.

Step 2. Configure user(s)

Next, configure a user. You can do this manually by creating a username/password combination or have Couchdrop autogenerate a user.

When creating the user, be sure to isolate them to the folder configured above (/Google Cloud in our case). This user could, for example, be an external party uploading data. You could then create a second user with read/write access that pulls the data down into your workflow, triggered by a webhook event fired when the first user uploads a file.
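As a rough illustration of the webhook-driven pattern, the sketch below parses an upload event and decides whether it should kick off an ETL run. Note that the payload field names (`event`, `path`, `username`) are assumptions for illustration, not documented Couchdrop fields; adjust them to match the actual webhook body you receive.

```python
import json

WATCHED_PREFIX = "/Google Cloud/"  # the virtual folder configured in Step 1


def parse_event(body: bytes) -> dict:
    """Parse a webhook payload into the fields we care about.

    NOTE: the field names ('event', 'path', 'username') are assumptions,
    not documented Couchdrop fields -- adjust to the real payload.
    """
    event = json.loads(body)
    return {
        "action": event.get("event"),
        "path": event.get("path"),
        "user": event.get("username"),
    }


def should_trigger(parsed: dict, watched_prefix: str = WATCHED_PREFIX) -> bool:
    """Trigger the pipeline only for uploads into the watched folder."""
    return parsed["action"] == "upload" and (parsed["path"] or "").startswith(watched_prefix)


# Example payload an upload event might produce:
body = b'{"event": "upload", "path": "/Google Cloud/data.csv", "username": "external-party"}'
print(should_trigger(parse_event(body)))  # True
```

In practice you would run logic like this behind the HTTPS endpoint your webhook points at, and have it start the Data Factory pipeline that pulls the newly uploaded file down.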

Step 3. Connect to Couchdrop using the SFTP connector in Azure Data Factory

Because Couchdrop works as a standard SFTP server, you simply need the hostname (sftp.couchdrop.io) and the credentials of the user created in Step 2 to establish the connection.
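If you define the linked service as JSON rather than through the portal, it ends up looking something like the sketch below, which follows the documented shape of Data Factory's SFTP connector. The linked service name `CouchdropSftp` is our own choice, and the username/password placeholders should be replaced with the credentials from Step 2.

```json
{
    "name": "CouchdropSftp",
    "properties": {
        "type": "Sftp",
        "typeProperties": {
            "host": "sftp.couchdrop.io",
            "port": 22,
            "authenticationType": "Basic",
            "userName": "<user from Step 2>",
            "password": {
                "type": "SecureString",
                "value": "<password from Step 2>"
            }
        }
    }
}
```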

[Screenshot: Couchdrop SFTP connection in Data Factory]

Try Couchdrop free for 14 days

Couchdrop is a simple way to upload processed files from Azure Data Factory to cloud storage. A benefit of using Couchdrop is that the platform acts purely as a conduit: it does not store data, nor does it ‘sync’ data to storage platforms. It processes transfers in memory directly to your endpoints, with no temporary storage layer.

Want to try Couchdrop for yourself? You can get a 14-day free trial with no credit card and no required sales calls or other hoops to jump through. Simply sign up and go. Start your trial today.