Hello all. First-time poster here.
I've recently been working on a script that gathers performance metrics with Get-Stat for a daily report, appending each day's data to monthly CSVs that later get turned into monthly reports with the Google Visualization / Google Charts API.
Gathering the datastore metrics alone currently takes the script a good 30-35 minutes to run.
I am not doing a TON of looping; I'm doing things like
$datastores = Get-Datastore
so as not to make a lot of separate calls to the VI Server.
However, I am looping over $datastores and running two Get-Stat calls inside that loop. There are 138 datastores total in this environment (though I'm also running a storage consolidation project that will reduce that drastically in the future)... that's what, 276 Get-Stat calls?
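For context, the loop looks roughly like this (the stat names and date range here are illustrative placeholders, not my exact ones):

```powershell
# Current approach: two Get-Stat calls per datastore -- 276 calls total.
# Stat names and the one-day window are placeholders.
$datastores = Get-Datastore
$report = foreach ($ds in $datastores) {
    $used     = Get-Stat -Entity $ds -Stat "disk.used.latest"     -Start (Get-Date).AddDays(-1)
    $capacity = Get-Stat -Entity $ds -Stat "disk.capacity.latest" -Start (Get-Date).AddDays(-1)
    # ...build the daily CSV row for this datastore from $used and $capacity...
}
```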
I wonder if I should instead do something like
$stats = $datastores | Get-Stat -Stat "blah"
and then look at the data within a loop.
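In other words, something like this sketch (again, the stat names are just examples), where the per-datastore split happens client-side after one bulk call:

```powershell
# Batched approach: pass the whole collection and all stat names to a single
# Get-Stat call, then group the results back out per datastore.
# Stat names are illustrative.
$datastores = Get-Datastore
$stats = Get-Stat -Entity $datastores `
                  -Stat "disk.used.latest", "disk.capacity.latest" `
                  -Start (Get-Date).AddDays(-1)

# Each Get-Stat result carries Entity, MetricId, Timestamp and Value,
# so Group-Object recovers the per-datastore view.
$byDatastore = $stats | Group-Object -Property { $_.Entity.Name }
foreach ($group in $byDatastore) {
    # $group.Name is the datastore name; $group.Group holds its stat samples.
}
```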
Thoughts?
Any other efficiency tips or recommendations would be great too.
I've not been using PowerShell / PowerCLI long, if you can't tell.
Perhaps another good question would be: how do you debug PowerCLI performance issues? Maybe with the Windows PowerShell ISE? I think you can set breakpoints there.
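From what I've read, the built-in Measure-Command cmdlet can at least time individual sections, which seems like a simple way to find where the minutes are actually going:

```powershell
# Time one section of the script to see how much it contributes to the runtime.
$elapsed = Measure-Command {
    $datastores = Get-Datastore
}
Write-Host ("Get-Datastore took {0:N1} seconds" -f $elapsed.TotalSeconds)
```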