Azure Security: Batch Account & Storage Account
Hello everyone, we hear a lot about security on the public cloud providers. Today I will talk about how to secure the link between a Storage Account and a Batch Account.
An Azure Batch Account has a feature called “Application Packages”. This feature relies on a linked Storage Account: it stores your application binaries there and pushes them to the compute nodes.
Unfortunately, to use Application Packages, we need to set the firewall of the Storage Account to “All networks”. At least, that’s what the Microsoft documentation says.
In this case, the Batch Account is able to reach the Storage Account, and Application Packages work as expected.
- Demonstration:
But your security guy won’t be happy with this open bar! So you need to close this open access by enabling the Storage Account’s firewall.
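For example, with the Az.Storage module, locking the account down and whitelisting only your own subnet could look like this (a minimal sketch; the resource group, account and subnet names are placeholders):

```powershell
# Minimal sketch: deny everything by default, then whitelist one subnet (placeholder names).
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName "my-rg" -Name "mystorageaccount" `
    -DefaultAction Deny -Bypass AzureServices

# The subnet must have the Microsoft.Storage service endpoint enabled.
$subnetId = "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.Network" +
            "/virtualNetworks/my-vnet/subnets/batch-subnet"
Add-AzStorageAccountNetworkRule -ResourceGroupName "my-rg" -Name "mystorageaccount" `
    -VirtualNetworkResourceId $subnetId
```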
If we leave this configuration as it is, Application Packages stop working: the Batch service can no longer communicate with the Storage Account. Why? Because there is now a firewall in front of the Storage Account, so only the whitelisted subnets and IPs are allowed in.
So the big question is: how do we secure the Storage Account while keeping it linked to the Batch Account?
Using compute nodes instead of Application Packages
This option needs some work (and a few scripts :D). The idea is to use the “Start Task” to fetch the binaries from the Storage Account and put them inside the compute nodes.
We have to configure the Start Task so that its Resource Files reference the binaries inside the Storage Account. Then we need a command that unzips all those binaries on the Compute Nodes being started.
Note that the Start Task runs before the Compute Node reaches its running state, so the binaries are in place before any task is scheduled on it.
Sounds cool, but how does the Start Task log in to the Storage Account to get its Resource Files? Good question! There are multiple ways to achieve this. Wilfried Woivre (Microsoft Azure MVP) wrote three important and well-explained articles about it:
Option 1 — Log in using Azure Active Directory: we can use Azure AD so that the Compute Node can access the Resource Files. Since our script must stay light, I invite you to avoid any additional libraries; you can generate the token with a plain REST call.
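Just to illustrate the idea, here is a minimal sketch of getting such a token over REST (the tenant, application ID, secret, account and blob names are placeholders, and the service principal is assumed to have a data-plane role such as Storage Blob Data Reader on the Storage Account):

```powershell
# Placeholder tenant/app values; replace them with your own service principal.
$tenantId     = "00000000-0000-0000-0000-000000000000"
$clientId     = "11111111-1111-1111-1111-111111111111"
$clientSecret = "<app-secret>"

# Get an Azure AD token for Azure Storage with a plain REST call (no extra libraries).
$body = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    resource      = "https://storage.azure.com/"
}
$token = (Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/token" -Body $body).access_token

# Download a blob with the bearer token (x-ms-version >= 2017-11-09 is required).
Invoke-WebRequest -Uri "https://mystorageaccount.blob.core.windows.net/binaries/app.zip" `
    -Headers @{ Authorization = "Bearer $token"; "x-ms-version" = "2019-02-02" } `
    -OutFile "app.zip"
```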
Option 2 — Log in using Key Vault: if you don’t want to burn your neurons with certificates and AD, you can take advantage of Azure Key Vault, as the article explains, and fetch a valid SAS key on each deployment.
Each option has its pros and cons. In my case I will get the SAS Key from the Key Vault.
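For example, the deployment pipeline can generate a short-lived SAS and store it in the Key Vault (a minimal sketch, assuming the Az.Storage and Az.KeyVault modules; the resource group, vault, secret and account names are placeholders):

```powershell
# Minimal sketch: generate a read/list account SAS and store it in Key Vault (placeholder names).
$storageKey = (Get-AzStorageAccountKey -ResourceGroupName "my-rg" -Name "mystorageaccount")[0].Value
$ctx        = New-AzStorageContext -StorageAccountName "mystorageaccount" -StorageAccountKey $storageKey

$sas = New-AzStorageAccountSASToken -Service Blob -ResourceType Service,Container,Object `
    -Permission "rl" -ExpiryTime (Get-Date).AddHours(12) -Context $ctx

Set-AzKeyVaultSecret -VaultName "my-keyvault" -Name "storage-sas" `
    -SecretValue (ConvertTo-SecureString $sas -AsPlainText -Force)
```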
Putting it all together
Let’s assume that I already have a versioned ZIP file inside my Storage Account, and that the Storage Account firewall is enabled:
Step 1: Deployment Script
Don’t forget: automation! First, I will write a PowerShell script that:
- Gets the SAS key of the Storage Account;
- Gets all the ZIPs, in case you need multiple versions inside your compute nodes;
- Sets the Resource Files;
- Sets the Start Task command line: a command that unzips all the Resource Files.
The idea is to update the Resource Files each time you want to publish a new version of the binaries: the script reconfigures the Start Task and its Resource Files, so you just have to add it to your deployment pipeline.
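Here is a minimal sketch of such a script. Every name below (vault, secret, accounts, container, pool) is a placeholder, and it assumes Linux compute nodes with unzip installed plus the Az.KeyVault, Az.Storage and Az.Batch modules:

```powershell
$keyVaultName   = "my-keyvault"
$sasSecretName  = "storage-sas"
$storageAccount = "mystorageaccount"
$container      = "binaries"
$batchAccount   = "mybatchaccount"
$poolId         = "mypool"

# 1. Get the SAS key of the Storage Account from the Key Vault
#    (-AsPlainText needs a recent Az.KeyVault; otherwise convert .SecretValue yourself)
$sasToken = (Get-AzKeyVaultSecret -VaultName $keyVaultName -Name $sasSecretName -AsPlainText).TrimStart('?')

# 2. Get all the ZIPs: one Resource File per versioned package in the container
$storageCtx    = New-AzStorageContext -StorageAccountName $storageAccount -SasToken $sasToken
$resourceFiles = Get-AzStorageBlob -Container $container -Context $storageCtx | ForEach-Object {
    New-AzBatchResourceFile `
        -HttpUrl  "https://$storageAccount.blob.core.windows.net/$container/$($_.Name)?$sasToken" `
        -FilePath $_.Name
}

# 3. Set the Start Task command line: unzip every Resource File when the node starts
$startTask = New-Object Microsoft.Azure.Commands.Batch.Models.PSStartTask
$startTask.CommandLine    = '/bin/bash -c "for f in *.zip; do unzip -o $f; done"'
$startTask.WaitForSuccess = $true
$startTask.ResourceFiles  = New-Object 'System.Collections.Generic.List[Microsoft.Azure.Commands.Batch.Models.PSResourceFile]'
$resourceFiles | ForEach-Object { $startTask.ResourceFiles.Add($_) }

# 4. Push the new Start Task to the existing pool
#    (Set-AzBatchPool needs a recent Az.Batch module; older versions have to patch the pool
#     through the Batch REST API or the .NET SDK instead)
$batchCtx = Get-AzBatchAccountKey -AccountName $batchAccount
$pool     = Get-AzBatchPool -Id $poolId -BatchContext $batchCtx
$pool.StartTask = $startTask
Set-AzBatchPool -Pool $pool -BatchContext $batchCtx
```

Note that the updated Start Task only applies to compute nodes that join the pool (or are rebooted) after the update.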
Step 2: Test it
Now that we have our script, we just need to run it and see the result:
Important: the subnet whitelisted on the Storage Account firewall must be the same subnet the Batch pool’s compute nodes are deployed in!
Using Batch Node Management IPs
Azure services each have a set of IP address ranges. If you want to avoid the Start Task and keep using Application Packages, you can add the IP ranges of BatchNodeManagement to the firewall of the Storage Account. For now, the Storage Account firewall does not support Service Tags. You can find the addresses here: https://www.microsoft.com/en-us/download/details.aspx?id=56519
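A minimal sketch of feeding those ranges into the firewall, assuming you have downloaded the “Azure IP Ranges and Service Tags” JSON file from the link above (the file name, region tag and resource names are placeholders):

```powershell
# Parse the downloaded Service Tags file and pick the BatchNodeManagement tag for your region.
$tags     = Get-Content ".\ServiceTags_Public.json" -Raw | ConvertFrom-Json
$batchTag = $tags.values | Where-Object { $_.name -eq "BatchNodeManagement.WestEurope" }

# Storage firewall IP rules only accept public IPv4 ranges
# (very small ranges such as /31 or /32 may need to be added as bare IPs).
$batchTag.properties.addressPrefixes |
    Where-Object { $_ -notmatch ":" } |   # drop IPv6 prefixes
    ForEach-Object {
        Add-AzStorageAccountNetworkRule -ResourceGroupName "my-rg" `
            -Name "mystorageaccount" -IPAddressOrRange $_
    }
```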
But (yes, a but), you have to maintain this list and keep an eye on it. Otherwise you might wake up one morning with an unusable Batch Account.
Bella Ciao,