Azure Data Factory Connection (pipelines and activities)

Scenario:

Client A plans to send day-to-day order history over SFTP to the internal team, which will store it in an Azure Storage account.

Features:

Azure Storage account, Azure Data Factory

Key Concepts:

Azure Data Factory pipelines and activities: A pipeline is a logical grouping of activities that together perform a task. For example, a pipeline could contain a set of activities that ingest and clean log data, and then kick off a mapping data flow to analyze the log data. The pipeline lets you manage the activities as a set instead of each one individually: you deploy and schedule the pipeline rather than the activities independently.
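As a sketch of the "deploy the pipeline, not the activities" idea, the snippet below registers a minimal single-activity pipeline with the Azure CLI. The resource group (rg-orders), factory name (adf-orders), pipeline name, and the dataset references (SftpOrdersDataset, BlobOrdersDataset) are illustrative assumptions, not names from this post; the datasets would have to be defined separately.

```shell
# Minimal pipeline definition: one Copy activity moving binary files
# from an SFTP source dataset to a Blob sink dataset.
# Dataset names below are placeholders and must exist in the factory.
cat > pipeline.json <<'EOF'
{
  "activities": [
    {
      "name": "CopyOrdersFromSftp",
      "type": "Copy",
      "inputs":  [ { "referenceName": "SftpOrdersDataset", "type": "DatasetReference" } ],
      "outputs": [ { "referenceName": "BlobOrdersDataset", "type": "DatasetReference" } ],
      "typeProperties": {
        "source": { "type": "BinarySource" },
        "sink":   { "type": "BinarySink" }
      }
    }
  ]
}
EOF

# Deploy the whole pipeline as one unit.
az datafactory pipeline create \
  --resource-group rg-orders \
  --factory-name adf-orders \
  --name CopyOrdersPipeline \
  --pipeline @pipeline.json
```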

Azure Blob Storage: Microsoft's object storage solution for the cloud. Blob Storage is optimized for storing massive amounts of unstructured data, that is, data that doesn't adhere to a particular data model or definition, such as text or binary data.

● Connecting Azure Data Factory requires allowing all related ports and endpoints through the firewall.
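If the storage account firewall denies traffic by default, Data Factory must be let through. One hedged sketch with the Azure CLI (the resource group and account names are placeholders):

```shell
# Option 1: allow a specific public IP (e.g. a self-hosted integration runtime).
az storage account network-rule add \
  --resource-group rg-orders \
  --account-name storders \
  --ip-address 203.0.113.10

# Option 2: lock the account down but let trusted Azure services
# (which include Data Factory when it authenticates with a managed identity) bypass the firewall.
az storage account update \
  --resource-group rg-orders \
  --name storders \
  --default-action Deny \
  --bypass AzureServices
```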

Workaround & Configuration

Creating the container and Blob storage
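The storage side can be provisioned roughly as follows. Note that SFTP support for Azure Blob Storage requires a hierarchical namespace, and the account, container, and local-user names here are assumptions for illustration:

```shell
# Storage account with hierarchical namespace + SFTP endpoint enabled.
az storage account create \
  --name storders \
  --resource-group rg-orders \
  --location eastus \
  --sku Standard_LRS \
  --enable-hierarchical-namespace true \
  --enable-sftp true

# Container that will receive the daily order files.
az storage container create \
  --name orders \
  --account-name storders \
  --auth-mode login

# SFTP local user for Client A, scoped to the orders container.
az storage account local-user create \
  --resource-group rg-orders \
  --account-name storders \
  --name clienta \
  --home-directory orders \
  --has-ssh-password true \
  --permission-scope permissions=rcwl service=blob resource-name=orders
```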

Setting up Data Factory
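Creating the factory itself is a short step. The `az datafactory` command group ships as a CLI extension; names below are the same illustrative placeholders as above:

```shell
# The datafactory commands live in an extension.
az extension add --name datafactory

# Create the Data Factory instance that will host the pipeline.
az datafactory create \
  --resource-group rg-orders \
  --factory-name adf-orders \
  --location eastus
```

After this, the SFTP and Blob Storage linked services and datasets would be defined (in the portal authoring UI or via `az datafactory linked-service create`) before the pipeline can run.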


By manually triggering the pipeline, you can verify the data transfer end to end.
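A manual trigger and verification could look like this from the CLI, assuming the placeholder factory and pipeline names used earlier:

```shell
# Kick off a run and capture its run ID.
runId=$(az datafactory pipeline create-run \
  --resource-group rg-orders \
  --factory-name adf-orders \
  --name CopyOrdersPipeline \
  --query runId --output tsv)

# Check the run status (Queued / InProgress / Succeeded / Failed).
az datafactory pipeline-run show \
  --resource-group rg-orders \
  --factory-name adf-orders \
  --run-id "$runId" \
  --query status

# Confirm the copied files landed in the container.
az storage blob list \
  --account-name storders \
  --container-name orders \
  --auth-mode login \
  --output table
```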

References:

SFTP support for Azure Blob Storage - Azure Storage | Microsoft Learn

Pipelines and activities - Azure Data Factory & Azure Synapse | Microsoft Learn





Disclaimer

This blog is not intended to be advice on how to manage your environment. These accounts are based on experiences in my own lab. Always approach information you find outside official documentation with skepticism, and follow the golden rule: never test in production.