
What happens to VMs running on the NVadsA10_v5 SKU with an NVIDIA driver below vGPU 18.x after 15 May 2026?

Rince Antony 21 Reputation points
2026-05-11T14:47:05.47+00:00

Hi Team,

We have 100+ VMs running on the NVadsA10_v5 SKU at any given point. We received a communication saying that Azure will roll out critical platform updates that include a new NVIDIA GPU host driver, vGPU 20.x (R595.x). The NVIDIA vGPU 20.x driver is backward compatible only with vGPU 18.x on virtual machines. If you're using vGPU 17.x (version 550.x), we require that you update the driver to vGPU 18.x (version 570.x) by 15 May 2026 to avoid potential interruptions once Azure starts the rollout of the platform updates.

What is the impact if a VM is still running the 17.x driver after 15 May 2026? Will the VM be shut down automatically, or will the driver simply stop working? We need to plan our remediation actions based on your feedback.

Azure Virtual Machines

An Azure service that is used to provision Windows and Linux virtual machines.


Answer accepted by question author

  1. Jilakara Hemalatha 13,340 Reputation points Microsoft External Staff Moderator
    2026-05-11T15:05:06.5666667+00:00

    Hello Rince,

    Thank you for reaching out and for providing the details of your current setup.

If a VM running on NVadsA10_v5 is still using NVIDIA vGPU driver version 17.x (550.x) after 15 May 2026, the VM itself will continue running as normal. Azure will not automatically shut down, deallocate, or force-reboot the virtual machine purely because of the driver version. From an infrastructure perspective, nothing stops the compute instance from running.

    However, the key impact will be on GPU functionality once Azure completes the rollout of the updated host-side driver stack (vGPU 20.x / R595.x). Since vGPU 20.x is only compatible with guest drivers 18.x (570.x) and above, any VM still on 17.x will no longer have a supported GPU communication path with the host. In practical terms, this means the GPU will likely fail to initialize inside the guest operating system, and any workload relying on CUDA, rendering, or GPU acceleration will stop functioning or start failing with driver-related errors. The operating system and non-GPU applications will continue to work normally, but without GPU capability.

    Another important point is that even if the VM appears stable after the rollout, GPU-related instability can surface during host maintenance events, redeployments, or service healing operations. In those situations, the GPU device may fail to attach correctly, which can lead to inconsistent behavior for GPU-dependent workloads.

    So in summary, there is no automatic shutdown or forced VM termination expected due to the older driver version, but continuing with vGPU 17.x beyond the enforcement date will result in loss of GPU functionality and potential service disruption for any GPU-based applications.

    To avoid this situation, the recommended approach is to upgrade all NVadsA10_v5 VMs to vGPU 18.x (570.x) or later before 15 May 2026 and to validate the workloads in a controlled environment before rolling the upgrade out across production systems.
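
    As part of that validation, the advisory's 570.x floor can be checked mechanically against the version string each VM reports. The sketch below is illustrative only: the helper name `needs_upgrade` and the assumption that the guest driver version begins with a numeric major (e.g. 550, 570) are mine, not from Azure's notice.

    ```shell
    # Hedged sketch: flag guest driver versions below the vGPU 18.x (570.x)
    # floor named in the advisory. Assumes version strings of the form
    # "major.minor[.patch]", e.g. "550.144.03" or "570.124.06".
    needs_upgrade() {
      local major="${1%%.*}"   # text before the first dot, e.g. "550"
      [ "$major" -lt 570 ]     # exits 0 (true) when an upgrade is still required
    }

    # On a real NVadsA10_v5 VM the input would come from:
    #   nvidia-smi --query-gpu=driver_version --format=csv,noheader
    for ver in 550.144.03 570.124.06; do
      needs_upgrade "$ver" && echo "$ver: UPGRADE REQUIRED" || echo "$ver: OK"
    done
    ```

    Running the same check fleet-wide (for example via a scheduled job or your configuration-management tooling) gives a simple list of VMs that still need remediation before the rollout.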

    Hope this helps! Please let me know if you have any queries in comments.


1 additional answer

  1. SUNOJ KUMAR YELURU 18,251 Reputation points MVP Volunteer Moderator
    2026-05-12T03:31:16.5533333+00:00

    Hello @Rince Antony,

    Azure will not shut down or deallocate your VMs. The VMs keep running — but the GPU stops working as soon as the host they land on is updated to the vGPU 20.x (R595.x) host driver.

    Remediation plan

    1. Inventory — for each VM, capture the current driver version:

       ```bash
       nvidia-smi --query-gpu=driver_version --format=csv,noheader
       ```

       Flag anything in the 550.x range.
    2. Upgrade to 570.x (vGPU 18.x) — minimum supported; 595.x (vGPU 20.x) also works and is more future-proof.
      • Linux: uninstall 550.x (nvidia-uninstall or apt purge of nvidia-vgpu-* / cuda-drivers-*), then install the GRID 18.x package from Azure's docs (NVIDIA-Linux-x86_64-570.*-grid-azure.run) or via the NVIDIA GPU Driver Extension for Linux/Windows which now pulls a compatible version.
      • Windows: install the GRID 18.x (570.x) Azure-branded installer, reboot.
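
    The Linux bullet above can be sketched as one reviewable function. Treat this as a template under assumptions rather than a tested procedure: the installer filename is the wildcard pattern quoted above (substitute the exact file you download), and the `--silent`/`-s` flags are standard options of NVIDIA's `.run` installer and `nvidia-uninstall`, but verify both against the specific 570.x package referenced in Azure's documentation.

    ```shell
    # Template for the Linux path of step 2; review before running as root.
    upgrade_to_vgpu18() {
      # Remove the old 550.x (vGPU 17.x) guest driver. Use whichever of the
      # two removal paths matches how the driver was originally installed
      # (.run installer vs. distribution packages).
      sudo /usr/bin/nvidia-uninstall --silent \
        || sudo apt-get purge -y 'nvidia-vgpu-*' 'cuda-drivers-*'

      # Install the GRID 18.x (570.x) package downloaded per Azure's docs.
      # Filename pattern is from the answer above; substitute the exact file.
      sudo sh ./NVIDIA-Linux-x86_64-570.*-grid-azure.run --silent

      # Reboot so the new kernel modules load, then re-check with nvidia-smi.
      sudo reboot
    }
    ```

    Wrapping the sequence in a function keeps the destructive steps from running on paste and makes the script easy to review before executing it with root privileges on each VM.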

