Azure Data Factory - Pipeline Triggers
Trigger Types:
1. Schedule Trigger
Description: A Schedule Trigger allows you to invoke a pipeline on a wall-clock schedule. It is generally used to run pipelines at set times and intervals, such as once an hour or every day at a particular time.
Use-Case: A retail company might utilize a Schedule Trigger to run a pipeline that aggregates daily sales data from its e-commerce platform. It might be scheduled to run at a specific time after the closing of a business day (e.g., at midnight) to prepare reports and gain insights for the next business day.
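As a rough illustration, the sketch below defines such a daily trigger with the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, and pipeline names (my-rg, my-factory, DailySalesAggregation) are placeholders, and the model and method names assume a recent (track 2) version of the SDK.

```python
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference,
    ScheduleTrigger,
    ScheduleTriggerRecurrence,
    TriggerPipelineReference,
    TriggerResource,
)

# Placeholder names -- substitute your own subscription, resource group,
# factory, and pipeline.
adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Fire once a day at 00:00 UTC (the start_time below is a placeholder).
recurrence = ScheduleTriggerRecurrence(
    frequency="Day",
    interval=1,
    start_time=datetime(2024, 1, 1, 0, 0, tzinfo=timezone.utc),
    time_zone="UTC",
)

trigger = ScheduleTrigger(
    recurrence=recurrence,
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(reference_name="DailySalesAggregation")
        )
    ],
)

adf.triggers.create_or_update("my-rg", "my-factory", "DailySalesTrigger",
                              TriggerResource(properties=trigger))
# Triggers are created in a stopped state; start it so the schedule takes effect.
adf.triggers.begin_start("my-rg", "my-factory", "DailySalesTrigger").result()
```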
2. Tumbling Window Trigger
Description: A Tumbling Window Trigger fires at a periodic interval from a specified start time, processing data in a series of fixed-size, non-overlapping, contiguous time windows. This kind of trigger is useful for scenarios where you need to process data within fixed time slots.
Use-Case: An energy utility company might deploy a Tumbling Window Trigger to analyze hourly energy usage data from smart meters. The data could be segmented into one-hour non-overlapping windows, enabling the utility to analyze usage patterns, predict load, and optimize energy production.
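A minimal sketch along the same lines (same placeholder names and SDK assumptions as above) shows how an hourly tumbling window trigger can pass each window's start and end times into the pipeline, so every run processes exactly one one-hour slice:

```python
from datetime import datetime, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference,
    TriggerPipelineReference,
    TriggerResource,
    TumblingWindowTrigger,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# One-hour, non-overlapping windows; each run receives its own window bounds,
# which the pipeline can use to filter the smart-meter readings it processes.
trigger = TumblingWindowTrigger(
    pipeline=TriggerPipelineReference(
        pipeline_reference=PipelineReference(reference_name="HourlyMeterAnalysis"),
        parameters={
            "windowStart": "@trigger().outputs.windowStartTime",
            "windowEnd": "@trigger().outputs.windowEndTime",
        },
    ),
    frequency="Hour",
    interval=1,
    start_time=datetime(2024, 1, 1, tzinfo=timezone.utc),
    delay="00:15:00",    # wait 15 minutes for late-arriving readings
    max_concurrency=4,   # process/backfill up to 4 windows in parallel
)

adf.triggers.create_or_update("my-rg", "my-factory", "HourlyMeterTrigger",
                              TriggerResource(properties=trigger))
adf.triggers.begin_start("my-rg", "my-factory", "HourlyMeterTrigger").result()
```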
3. Event-based Trigger
Description: The Event-based Trigger enables the execution of pipelines in response to an event, like the creation or deletion of a blob (file) in Azure Blob Storage (including Azure Data Lake Storage Gen2).
Use-Case: An insurance company could implement an Event-based Trigger to initiate a pipeline whenever a new claim document is uploaded to an Azure Data Lake Storage Gen2 container. The pipeline would extract relevant information from the claim, perform initial validation, and then move the data to downstream applications for further processing, enabling an efficient, automated claims-processing workflow.
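A hedged sketch of such a storage event trigger with the azure-mgmt-datafactory Python SDK is shown below; the storage account, container path, and pipeline names are placeholders. The trigger passes the folder path and file name of the new blob into the pipeline as parameters.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobEventsTrigger,
    PipelineReference,
    TriggerPipelineReference,
    TriggerResource,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# ARM resource ID of the ADLS Gen2 / Blob Storage account that raises the events
# (placeholder values throughout).
storage_account_id = (
    "/subscriptions/<subscription-id>/resourceGroups/my-rg"
    "/providers/Microsoft.Storage/storageAccounts/claimsstore"
)

trigger = BlobEventsTrigger(
    scope=storage_account_id,
    events=["Microsoft.Storage.BlobCreated"],          # fire on new claim documents
    blob_path_begins_with="/claims/blobs/incoming/",   # container + folder filter
    blob_path_ends_with=".pdf",
    ignore_empty_blobs=True,
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(reference_name="ProcessClaimDocument"),
            parameters={
                # Folder and file name of the blob that raised the event.
                "sourceFolder": "@triggerBody().folderPath",
                "sourceFile": "@triggerBody().fileName",
            },
        )
    ],
)

adf.triggers.create_or_update("my-rg", "my-factory", "NewClaimTrigger",
                              TriggerResource(properties=trigger))
adf.triggers.begin_start("my-rg", "my-factory", "NewClaimTrigger").result()
```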
4. Custom Events Trigger
Description: A more flexible event-based trigger that reacts to events from Azure Event Grid, allowing integration with a wider range of event sources, including custom sources.
Use-Case: Imagine a manufacturing firm that produces customized products to order. A Custom Events Trigger might be used to start a pipeline when specific conditions are met, such as when an order for a customized product is placed and the materials are available in inventory. The resulting pipeline run could schedule the manufacturing process, update inventory, and inform the logistics team.
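The sketch below assumes the order system publishes events to an Azure Event Grid custom topic; the topic, event type (Orders.CustomOrderReady), subject prefix, and pipeline names are all hypothetical placeholders, and the SDK model names are the same assumptions as in the earlier sketches.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    CustomEventsTrigger,
    PipelineReference,
    TriggerPipelineReference,
    TriggerResource,
)

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# ARM resource ID of the Event Grid custom topic the order system publishes to
# (placeholder values).
topic_id = (
    "/subscriptions/<subscription-id>/resourceGroups/my-rg"
    "/providers/Microsoft.EventGrid/topics/manufacturing-orders"
)

trigger = CustomEventsTrigger(
    scope=topic_id,
    # Hypothetical event type and subject convention used by the order system.
    events=["Orders.CustomOrderReady"],
    subject_begins_with="orders/custom/",
    pipelines=[
        TriggerPipelineReference(
            pipeline_reference=PipelineReference(reference_name="ScheduleCustomOrder"),
            # Forward the event payload (order details) into the pipeline.
            parameters={"orderEvent": "@triggerBody().event.data"},
        )
    ],
)

adf.triggers.create_or_update("my-rg", "my-factory", "CustomOrderTrigger",
                              TriggerResource(properties=trigger))
adf.triggers.begin_start("my-rg", "my-factory", "CustomOrderTrigger").result()
```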
5. Manual Trigger
Description: Manual (on-demand) execution isn't a defined trigger resource like the others, but pipelines can be run on demand via the ADF UI, PowerShell, Azure CLI, REST API, or SDKs.
Use-Case: Consider a data analysis team in a research institution that occasionally needs to process large datasets. Manual runs suit their data processing pipeline, allowing team members to initiate processing whenever new data is ready for analysis, so that resources are consumed, and costs incurred, only when needed.
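For completeness, a minimal sketch of an on-demand run with the azure-mgmt-datafactory Python SDK (placeholder names as before): start the run with pipelines.create_run and poll pipeline_runs.get until it reaches a terminal state.

```python
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Kick off a single on-demand run of the team's processing pipeline
# (pipeline name and parameter are placeholders).
run = adf.pipelines.create_run(
    "my-rg", "my-factory", "ProcessResearchDataset",
    parameters={"inputPath": "raw/2024-06-batch"},
)

# Poll until the run reaches a terminal state (Succeeded, Failed, Cancelled).
while True:
    status = adf.pipeline_runs.get("my-rg", "my-factory", run.run_id)
    if status.status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Run {run.run_id} finished with status: {status.status}")
```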
Source: https://learn.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers