I'm looking to establish performance baselines at each data stream (chunk) size. I just installed BE 2014 on Server 2012 Standard, and I primarily back up VMware VMs over 8Gb SAN transport. I assume the only clean way to do this is to create a deduplication disk storage, back up the VM, destroy the dedupe storage, and then rinse and repeat at every data stream size. If I already have a few backup sets in the dedupe storage, would it hurt to just change the data stream size, restart the services, and then back up a VM at each size? Or should I do as I said and completely recreate the dedupe storage every time I change the dedupe chunk size?