Azure VMs – Fix Connectivity Issues on Virtual Network

azure · virtual-machines · virtual-network

A couple of days ago, I created a cloud-only Azure virtual network (no cross-premises connectivity), provisioned a domain controller/DNS server inside it, and configured both my DNS server and the Azure DNS server in the vnet settings. It has worked fine: I have been spinning up new VMs via PowerShell script and joining them to the domain without any problems. All the machines can see and ping each other (I enable the ICMPv4 Echo Request firewall rule on each VM to allow this), and the DNS server resolves both internal and external names without issue.
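For reference, the provisioning flow described above looks roughly like this with the classic Azure Service Management PowerShell cmdlets. This is a sketch, not my exact script; the names, credentials, subnet, and the DC's IP address are placeholders:

```powershell
# Sketch only: all names, credentials, and 10.0.0.4 (the DC/DNS server's
# static internal IP) are placeholder values, not my real configuration.
$dns = New-AzureDns -Name "ContosoDC" -IPAddress "10.0.0.4"

New-AzureVMConfig -Name "member01" -InstanceSize Small -ImageName $imageName |
    Add-AzureProvisioningConfig -WindowsDomain `
        -AdminUsername "localadmin" -Password $localPwd `
        -JoinDomain "contoso.local" -Domain "CONTOSO" `
        -DomainUserName "domainadmin" -DomainPassword $domainPwd |
    Set-AzureSubnet -SubnetNames "Subnet-1" |
    New-AzureVM -ServiceName "contoso-svc" -VNetName "contoso-vnet" -DnsSettings $dns
```

With `-WindowsDomain`, `Add-AzureProvisioningConfig` joins the VM to the domain at provisioning time, which is the behavior that started failing below.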

Beginning this evening, new VMs provisioned on the vnet (using the exact same scripts and parameters I have been using all along) can't reach any of the other machines on the network, even by raw IP address.

If I configure new VMs to join the domain at provisioning time using the -WindowsDomain switch, they take a long time to provision, and when they finish they aren't actually domain-joined (I imagine the join retries for a while and then gives up). I can RDP into any of these VMs and try pinging around by raw IP address: they can't reach anything else on the network. Meanwhile, the VMs I created earlier can all still ping each other without issue.
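For anyone reproducing this, the checks I ran from inside an affected VM after RDPing in were roughly the following (the IP addresses and hostname are placeholders for the DC and one of the older, working VMs):

```powershell
# Run inside an affected VM; 10.0.0.4 and 10.0.0.5 are placeholder
# internal IPs for the DC/DNS server and an older, working VM.
Test-Connection -ComputerName 10.0.0.4 -Count 2    # ping the DC by raw IP
Test-Connection -ComputerName 10.0.0.5 -Count 2    # ping a known-good VM by raw IP
nslookup dc01.contoso.local 10.0.0.4               # does the DNS server answer at all?
```

On the broken VMs, even the raw-IP pings failed, which is what ruled out DNS and domain-join misconfiguration as the cause.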

All VMs are in the same affinity group and same cloud service. I have confirmed in the portal that they are all on the same subnet of the same vnet. What's going on here?

Best Answer

This has since disappeared through no action of my own - it looks like it was a transient, internal problem with Azure. VMs that never had connectivity are now connected, and newly provisioned VMs come up with working connectivity.
