
Deleted DataFactory Trigger still executing daily for months after being disabled and deleted

Garrison Roberts 20 Reputation points
2026-04-29T22:38:09.0666667+00:00

We used to execute an SSIS package 2x daily on a Data Factory trigger that started an Integration Runtime, ran the package, and stopped the runtime. This was all supposedly disabled over a year ago.

Today I discovered from the logs that the trigger still fires twice daily and runs against our Azure database for 10 minutes each time... chewing up DTUs. When I go into Data Factory Studio, the Home tab lists the corresponding pipeline, but clicking it gives the error "Resource not found" for a deleted resource.

Clicking Monitor > Pipeline Runs shows that this "deleted resource" runs twice a day.

Clicking Monitor > Integration Runtimes shows the corresponding IR still Running - which should be stopped, as the package steps start the IR, execute the package, then issue a Stop command to the IR.

Clicking Author > Integration Runtimes also shows this runtime exists and is running. (Again, this shouldn't be.)

Clicking Author > Triggers lists no triggers. Nothing that should start the SSIS package 2x daily exists.

I'd like to completely disable the trigger that we thought we had disabled and deleted. I'd like to stop the IR and cannot. Essentially, I have no way to turn this process off because the Studio doesn't believe any of it exists, even though the logs and our database show it happily running on its old, original schedule.

Please advise as to what I should do.

Azure Data Factory

An Azure service for ingesting, preparing, and transforming data at scale.


Answer accepted by question author

  1. Q&A Assist
    2026-04-29T22:38:24.62+00:00
    AI-generated content. This question contains an answer created with AI from Q&A Assist.

    Use the management APIs and monitoring views to locate and stop the hidden trigger and its pipeline, then clean up the artifacts explicitly.

    1. Confirm which factory and pipeline are actually running
      • In Monitor → Pipeline runs, note the Data Factory name, pipeline name, and run IDs that are still executing twice daily.
      • Optionally, use PowerShell to query the run details:
             Get-AzDataFactoryV2PipelineRun -ResourceGroupName <rg-name> `
               -DataFactoryName <factory-name> `
               -PipelineRunId <run-id>
        
        This confirms the pipeline and factory that are still active.
    2. Check for triggers via API/PowerShell (even if Studio shows none)
      • Triggers have a runtime state of Started or Stopped, and deployment/publishing does not delete them automatically; they must be explicitly deleted via the API.
      • Use PowerShell or REST to list the factory's triggers and find the one still associated with the pipeline. If it is in the Started state, stop it first with the Trigger Stop API, then remove it with the Trigger Delete API.
      • Important behavior: deployment won’t delete artifacts; delete APIs must be called for each artifact type (pipeline, dataset, linked service, trigger) to fully clean up a factory.
    3. Stop and delete the Integration Runtime
      • In Monitor → Integration runtimes, confirm the Azure-SSIS IR that is still running.
      • If Studio won’t let it be stopped, use PowerShell/REST against the factory to:
        1. Stop the IR.
        2. Delete the IR definition so it cannot be started again by any lingering pipeline or trigger.
      • If there are capacity/queue issues or “no active worker”/unhealthy status errors, follow the IR monitoring guidance under Azure-SSIS integration runtime and the Integration Runtime Management Troubleshooting Guide.
    4. Clean up orphaned artifacts in the factory
      • Because deployment does not delete artifacts, there can be “ghost” pipelines or triggers that still exist in the backend even if Studio shows them as deleted.
      • Use the REST delete APIs to explicitly remove any leftover pipelines, datasets, linked services, and triggers tied to the old SSIS process.
      • This ensures no hidden artifact can continue to schedule or run the SSIS package.
    5. Ongoing monitoring and verification
      • After stopping/deleting the trigger and IR, verify in Monitor → Pipeline runs that no new runs are created at the old schedule.
      • Optionally, set up periodic monitoring (for example, with a Logic App) using the pipelineruns/querybyfactory endpoint to detect any unexpected runs and alert a ticketing system, as described in Query By Factory.

    If the factory is in a production environment with many live triggers, ensure only the specific offending trigger and pipeline are stopped/deleted so as not to impact other workloads.
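    The stop/delete calls in steps 2-3 can be sketched against the ARM REST endpoints. This is a minimal outline, not a definitive script: all names in angle brackets are placeholders you must substitute, and the curl calls are left commented out since they require a valid bearer token and real resource names.

    ```shell
    # Placeholders -- substitute your own values (all hypothetical).
    SUB="<subscription-id>"
    RG="<resource-group>"
    DF="<factory-name>"
    TRIGGER="<trigger-name>"
    IR="<integration-runtime-name>"

    # Base ARM URL for the factory; 2018-06-01 is the Data Factory REST api-version.
    FACTORY="https://management.azure.com/subscriptions/$SUB/resourceGroups/$RG/providers/Microsoft.DataFactory/factories/$DF"
    API="api-version=2018-06-01"

    # With TOKEN holding a management-plane bearer token, the calls would be:
    # curl -X POST   -H "Authorization: Bearer $TOKEN" "$FACTORY/triggers/$TRIGGER/stop?$API"        # stop the trigger
    # curl -X DELETE -H "Authorization: Bearer $TOKEN" "$FACTORY/triggers/$TRIGGER?$API"             # delete the trigger
    # curl -X POST   -H "Authorization: Bearer $TOKEN" "$FACTORY/integrationRuntimes/$IR/stop?$API"  # stop the Azure-SSIS IR
    # curl -X DELETE -H "Authorization: Bearer $TOKEN" "$FACTORY/integrationRuntimes/$IR?$API"       # delete the IR

    echo "$FACTORY/triggers/$TRIGGER/stop?$API"
    ```

    Stop before delete matters for both artifact types: a Started trigger and a Running IR generally cannot be deleted until they are stopped.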




1 additional answer

  1. Garrison Roberts 20 Reputation points
    2026-04-30T17:42:38.5133333+00:00

    This was "sort of" helpful. I think it's asinine that the UI won't show valid & running objects for us to manage, even though they were all created and originally managed through the UI.

    I ended up using CLI and Postman REST calls:

    1. Obtain a bearer token in a Command Prompt with Azure CLI installed
      1. az account get-access-token --resource https://management.azure.com/
    2. List triggers under my subscription (there were two, both invisible to the UI)
      1. GET https://management.azure.com/subscriptions/{sid}/resourceGroups/{rg name}/providers/Microsoft.DataFactory/factories/{df name}/triggers?api-version=2018-06-01
    3. Stop the one that was still running each day
      1. POST https://management.azure.com/subscriptions/{sid}/resourceGroups/{rg name}/providers/Microsoft.DataFactory/factories/{df name}/triggers/{trigger name}/stop?api-version=2018-06-01
    4. Delete once stopped
      1. DELETE https://management.azure.com/subscriptions/{sid}/resourceGroups/{rg name}/providers/Microsoft.DataFactory/factories/{df name}/triggers/{trigger name}?api-version=2018-06-01
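    Put together, the steps above can be scripted end to end with Azure CLI and curl. The resource names below are placeholders standing in for the `{sid}`/`{rg name}`/`{df name}`/`{trigger name}` values in the URLs, and the network calls are commented out since they need `az login` and real resources:

    ```shell
    # Placeholder values (hypothetical -- substitute your own).
    SID="<subscription-id>"
    RG="<rg-name>"
    DF="<df-name>"
    TRG="<trigger-name>"
    BASE="https://management.azure.com/subscriptions/$SID/resourceGroups/$RG/providers/Microsoft.DataFactory/factories/$DF"

    # 1. Bearer token via Azure CLI (requires az login):
    # TOKEN=$(az account get-access-token --resource https://management.azure.com/ --query accessToken -o tsv)

    # 2. List triggers -- they appear here even when the Studio UI shows none:
    # curl -H "Authorization: Bearer $TOKEN" "$BASE/triggers?api-version=2018-06-01"

    # 3. Stop the trigger that is still firing on its old schedule:
    # curl -X POST -H "Authorization: Bearer $TOKEN" "$BASE/triggers/$TRG/stop?api-version=2018-06-01"

    # 4. Delete it once stopped (delete fails while the trigger is Started):
    # curl -X DELETE -H "Authorization: Bearer $TOKEN" "$BASE/triggers/$TRG?api-version=2018-06-01"

    echo "$BASE/triggers?api-version=2018-06-01"
    ```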

    The vague guidelines provided by AI were ~okay~ but figuring out the specifics on my own without links to the MS articles I used took some time.

