Deployment inefficiencies

The Azure image is 1.3GB compressed.

Step 1: decompress the archive, which expands to ~64GB.
Step 2: convert the dynamic disk to a static (fixed) disk, creating another ~64GB file.
Step 3: round off the size (tiny file increase).
Step 4: convert the 64GB disk to a raw file, creating another ~64GB file.

On disk I now have 1.3GB + 64GB + 64GB + 64GB.
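For reference, here's roughly what that works out to, plus what I assume the "round off" step is doing (padding the virtual size up to a whole MiB, which I believe Azure expects for uploaded VHDs) — sizes are approximate, just to illustrate the scratch space:

```python
MIB = 1024 ** 2
GIB = 1024 ** 3

def round_up_to_mib(size_bytes: int) -> int:
    """Pad a size up to the next whole MiB; my understanding is Azure
    wants the uploaded VHD's virtual size MiB-aligned, hence step 3."""
    return -(-size_bytes // MIB) * MIB

# Example: a disk just over 64GiB gets padded by well under 1 MiB.
size = 64 * GIB + 123
print(round_up_to_mib(size) - size)  # bytes added by the round-off

# Approximate on-disk footprint of the old workflow (numbers from my post).
archive, extracted, fixed, raw = 1.3 * GIB, 64 * GIB, 64 * GIB, 64 * GIB
print(f"peak scratch space: ~{(archive + extracted + fixed + raw) / GIB:.1f} GiB")
```

So the workflow briefly needs roughly 193GB of free space for a single appliance.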

I’m on an OK machine, 8 cores and a 1 Gbps internet connection, but this seems excessive. I appreciate the image, but does the raw disk size need to be so big when disks and file systems can be expanded so easily?

Hi @Richard_Johnson. Welcome!

I admit I don’t use Azure personally, but I vaguely recall something about Azure not supporting sparse disks (or perhaps the build tool we use can’t create sparse disks). @simaishi might remember better.

That being said, I don’t recall steps 2-4 being necessary. I thought you could upload the .vhd file directly to Azure Resource Manager and then follow the steps here: ManageIQ

Thanks for the reply! It looks like I was following an older guide; between Google and flicking between tabs, I hadn’t noticed that at the top of the instructions there’s an option to select “latest”.

I will follow those instructions instead.

RTFM to myself it seems.

As @Fryguy mentioned, the resize/conversion steps are no longer needed. That change is in the kasparov/master builds.