Kaseya Community

Specific File Size Report


Reporting and Monitoring are not my strongest suit, so I'm reaching out for some help.  I need a report of all of the Outlook OST file sizes.  Something simple: just the computer name, the OST name, and the size of the file.

I saw a legacy forum post that suggested using the forfiles.exe command to create a log and then pull a report of that log information.

Execute Shell Command
     Parameter 1 : #AgentDrive#temp\forfiles.exe /d +0 /p "#AgentDrive#users" /s /m *.ost /c "cmd /c echo @file @path @fsize" >> #AgentDrive#temp\ost_report.log

That does produce a log with the file name, file location, and the size in bytes, but I'm hoping for something simpler that I can run on a semi-regular basis, or at least a few times so I can do a before-and-after.
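
For comparison, the whole thing can be done in a single PowerShell step (a minimal sketch, assuming profiles live under C:\Users and that C:\temp exists):

     # Recurse through every profile, pick up *.ost, and record the
     # computer name, file name, and size in bytes as a CSV.
     Get-ChildItem -Path C:\Users -Filter *.ost -Recurse -File -ErrorAction SilentlyContinue |
         Select-Object @{Name='Computer';Expression={$env:COMPUTERNAME}}, Name, Length |
         Export-Csv C:\temp\ost_report.csv -NoTypeInformation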

All Replies
  • Trevor, how do you want to get the info?  Email per machine?  Agent Procedure Log so you can run a report?  What if there are multiple OST files?

    We have a basic script that will email you per machine, but I'm not sure that is what you want.

    You could take your command above, read #AgentDrive#temp\ost_report.log into a variable, output that variable to the Procedure Log with a tag (we use something like $OST$), and then run a report on the tag; see the sketch at the end of this reply.

    (here is how to run a report with tags:   )

    That is probably the easiest way.  You could even schedule that report weekly or monthly.
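
    As a rough sketch of that middle step (the paths and the $OST$ tag are just examples), the PowerShell that the procedure runs could bake the tag into each line before getVariable and writeProcedureLogEntry pick it up:

        # Read the forfiles output, drop blank lines, and prefix each
        # line with the $OST$ tag so the Procedure Log entry is searchable.
        Get-Content C:\temp\ost_report.log |
            Where-Object { $_ -ne '' } |
            ForEach-Object { "`$OST`$ $_" } |
            Set-Content C:\temp\ost_tagged.log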

  • Ideally, a plain Excel report with columns for computer name, OST file name, and size (bytes is fine) for our org.  My boss and I are trying to make an argument for changing some policies because OST files are blowing up to the default 50GB limit.

    Basically what I need is a report of what computer has what OST at what size.  I'll definitely have some with multiple OST files per computer; heck, I'm pretty sure I'm going to run into a few instances of multiple OST files for a single user on the same computer.

    I'm currently looking at the Reporting feature to create a report that reads the log on the machine, though it might be easier to have the procedure write the log to the Procedure Log as a variable, as you've mentioned.

    Thanks for your reply!

  • I came up with a rather janky way to get what I needed.  It's not 100% complete, as I'm having an issue with forfiles.exe working on anything other than the current date.

    Run a shell command with forfiles.exe /p "C:\users" /s /m *.ost /d +0 /c "cmd /c echo @file @path @fsize" >> C:\temp\ost_report.txt to create a report in C:\temp on the local computer.

    Then run a PowerShell command gc c:\temp\ost_report.txt | where {$_ -ne ""} > c:\temp\ost_report_fixed.txt to remove a blank line at the start of the ost_report.txt.

    Then I used getVariable to import the data from ost_report_fixed.txt into the agent procedure as OSTLog.

    Then I use writeProcedureLogEntry with the OSTLog variable and add $OSTLog$ as a searchable tag.  So the report's "OST name" "OST file location" "size in bytes" lines get written to the log.

    Then I run a report from the Info Center looking for the $OSTLog$ tag and export it as an Excel file.

    Then just a bit of work in Excel to make it readable, and I get the report I need.  Now I just have to figure out why forfiles.exe is not finding the .ost files when I use anything for /d (date) other than +/- zero.  That limits it to finding only .ost files modified on the current day.  I just want to expand it to something like a week, but /d -7, when run locally, comes up with no files found.
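
    On the /d issue: per the forfiles documentation, /d -7 selects files last modified on or before seven days ago, not within the last seven days, so constantly-synced OST files never qualify.  A date filter in PowerShell sidesteps this entirely (a sketch, assuming the same C:\users and C:\temp paths):

        # Emit "name path size" for every OST touched in the last 7 days,
        # matching the forfiles echo format used above.
        Get-ChildItem C:\users -Filter *.ost -Recurse -File -ErrorAction SilentlyContinue |
            Where-Object { $_.LastWriteTime -gt (Get-Date).AddDays(-7) } |
            ForEach-Object { '{0} {1} {2}' -f $_.Name, $_.DirectoryName, $_.Length } |
            Set-Content C:\temp\ost_report.txt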

  • NTFSReader.zip

    Hello Trevor,

    When I want to search for a file in Windows I use a small utility, for which I have also written a wrapper, that accesses the NTFS master file table directly, making searches (even inside directory structures with millions of files) VERY fast (one or two seconds).

    Because it is so fast, you can search multiple drives (C:, D:) in one pass, or even detect first which drives a machine has and then scan all of its volumes.

    It works only on NTFS partitions, though (so if your OST is on an external drive formatted as FAT32 it won't work... but who has that?).

    I have attached both the source and a compiled version in the zip file so you can inspect it and re-compile it yourself for security.

    Once you have found the file paths/locations, it is trivial to have PowerShell check the properties of each file, discard what you don't need or want, and then do whatever you like with the rest.

    In your case, to get a list of OSTs with their sizes (regardless of location), you run as "Admin" (or, in Kaseya, as System) a PowerShell script similar to this

    (I assume you save the NtfsReaderRunner utility inside kworking here)

    -------------------

    Clear-Host

    # Pretty-print a raw byte count as TB/GB/MB/kB/B.
    Function Format-FileSize() {
        Param ([int64]$size)
        If     ($size -gt 1TB) {[string]::Format("{0:0.00} TB", $size / 1TB)}
        ElseIf ($size -gt 1GB) {[string]::Format("{0:0.00} GB", $size / 1GB)}
        ElseIf ($size -gt 1MB) {[string]::Format("{0:0.00} MB", $size / 1MB)}
        ElseIf ($size -gt 1KB) {[string]::Format("{0:0.00} kB", $size / 1KB)}
        ElseIf ($size -gt 0)   {[string]::Format("{0:0.00} B", $size)}
        Else                   {""}
    }

    # Ask the NTFS reader for every *.ost path on the volume, then split
    # its output into an array of paths, dropping empty lines.
    $OutputVariable = (C:\kworking\NtfsReaderRunner.exe *.ost) | Out-String
    $fArray = $OutputVariable.Split([Environment]::NewLine, [StringSplitOptions]::RemoveEmptyEntries)

    # For each path, emit the full name, a human-readable size, and the raw byte count.
    foreach ($s in $fArray)
    {
        Get-ChildItem -File $s | Select-Object FullName, @{Name="Size";Expression={Format-FileSize($_.Length)}}, Length
    }
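
    To land directly on the Excel-style output Trevor is after, the same array could feed Export-Csv instead of writing objects to the console (a variant sketch; the output path is just an example):

        # Same file list, flattened to Computer,FullName,Length as a CSV.
        $fArray | ForEach-Object { Get-ChildItem -File $_ } |
            Select-Object @{Name='Computer';Expression={$env:COMPUTERNAME}}, FullName, Length |
            Export-Csv C:\kworking\ost_report.csv -NoTypeInformation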

    How you bring this data back to Kaseya for reporting is then up to you.
    I am on premises, so I could even read the content of the results, parse it, and load it into a custom table, but since you seem to have that process worked out, I will not comment.

    Alex