Kaseya Community

Search for Long File Names

  • I am in the midst of planning a project for one of our clients, and I know I have faced issues with long filenames in the past. Does anyone know of a utility, or better yet a script, that searches for filenames exceeding the 260-character limitation on platforms prior to Windows 10 / Server 2012 R2, and that can produce a report for us? As a follow-up: is there a way to truncate these names to acceptable limits via a script?

  • Todd,

    I've got a tool we use for data migration planning that scans the disk and identifies file sizes exceeding ##, access date older than ##/##/##, and similar. It would be a snap to add a path length parameter to this.

    The script could certainly trigger a remediation - shorten the file name & log changes, or move to a lower path structure. The rules are probably harder to plan than the code. I have some scripts for this from a client file server migration a few years ago that might also be helpful, at least as a starting point.

    Call or email me for details and to discuss specifics.


  • I've always utilized robocopy with the /L option to report the folders & files found but without actually making any changes.

    Robocopy has the distinct advantage of not being limited by the path & filename restrictions imposed by the OS so it will consistently provide very accurate results regardless of what the OS reports.

    I've used it many times over the years to resolve problems caused by DFSR when Unix & Linux systems (Windows too) are writing to shares at deeper levels of the folder tree.
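
    For reference, a list-only run along those lines might look like this (the paths here are placeholders; adjust the switches to taste):

    ```powershell
    # /L = list only (no copying), /E = include all subfolders, /FP = full paths in output,
    # /NS /NC /NJH /NJS = suppress sizes, file classes, and job header/summary noise
    robocopy "D:\Shares" "C:\NullTarget" /L /E /FP /NS /NC /NJH /NJS /LOG:C:\Temp\robocopy-list.txt
    ```

    You can then filter the log for any line longer than your path-length limit.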

  • Search-LongFileNames.txt

    I've attached a PowerShell script that finds files and folders whose full paths are longer than nn characters, where nn is specified at script execution. This is really just to get you started with identification; you'd need to craft your own methodology or logic for renaming. The script differentiates between the two primary culprits: the parent directory path is too long, or the file name itself is too long.

    First, the script identifies all folders whose paths are too long and excludes their files from the LongFileNames list. The folder names in this case would probably need to be truncated in some way.

    Second, for all files whose folders are not too long, it identifies full paths that are too long; that is, the file name itself is what put the path over your specified limit. In that case, truncating the file name could resolve it (though there are tricky edge cases; if only one character remains after truncation, that wouldn't be a very good file name). Solving this case probably takes a mix of folder and file renaming.

    Third, a variable and a file are created listing all of the files affected by the first condition (folder names that were too long). This is really just informational, so you know which files would be affected.

    I should point out that any rename operation will break hyperlinks embedded in other documents or systems; a Word document, for instance, that contains links to external documents would be left with dead links.

    I hope this helps. If you have ideas on renaming schemes or methods, I'd be happy to mock them up in this script. The script uses parameters, so it can be called via Kaseya without any issue.

    I've included 4 parameters:

    -TopLevelFolder Specifies where you want the search to begin. This doesn't dictate where the file path calculation comes from (the full path is derived from the file system provider's value for FullName).

    -Length Specifies the number of characters each path is compared against.

    -OutputPath Specifies where you want the 3 CSV files to be placed. You should be able to use a Kaseya variable here.

    -ID Specifies an identifier appended to the names of the CSV files that are created (using Kaseya variables if you'd like).

    Rename the attachment from .txt to .ps1 before running it.
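
    In case the attachment disappears, the approach described above can be sketched roughly as follows. This is a minimal sketch only, not the attached script; the parameter names follow the list above, while the CSV file names are my own invention:

    ```powershell
    param(
        [Parameter(Mandatory)][string]$TopLevelFolder,
        [int]$Length = 260,
        [string]$OutputPath = ".",
        [string]$ID = ""
    )

    $dirs  = Get-ChildItem -Path $TopLevelFolder -Recurse -Directory -ErrorAction SilentlyContinue
    $files = Get-ChildItem -Path $TopLevelFolder -Recurse -File -ErrorAction SilentlyContinue

    # Case 1: the folder path itself is already over the limit
    $dirs | Where-Object { $_.FullName.Length -gt $Length } |
        Select-Object FullName |
        Export-Csv (Join-Path $OutputPath "LongFolders$ID.csv") -NoTypeInformation

    # Case 2: the folder is fine, but the file name pushes the full path over the limit
    $files | Where-Object { $_.DirectoryName.Length -le $Length -and $_.FullName.Length -gt $Length } |
        Select-Object FullName |
        Export-Csv (Join-Path $OutputPath "LongFileNames$ID.csv") -NoTypeInformation

    # Case 3 (informational): files affected only because their parent folder is too long
    $files | Where-Object { $_.DirectoryName.Length -gt $Length } |
        Select-Object FullName |
        Export-Csv (Join-Path $OutputPath "FilesInLongFolders$ID.csv") -NoTypeInformation
    ```

    Note that on older Windows/PowerShell versions Get-ChildItem can itself fail on paths beyond MAX_PATH, which is exactly why a robocopy /L pass is attractive for discovery.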

  • Hello.

    1. We use the following PS script to find long file names on a routine basis.

    2. We also use FastCopy rather than Robocopy or RichCopy, for various reasons and limitations.

    cd \
    Out-File "c:\ktemp\longfilepath.txt"    # start with an empty log
    cmd /c "dir /b /s /a" | ForEach-Object {
        if ($_.Length -gt 250) { $_ | Out-File -Append "c:\ktemp\longfilepath.txt" }
    }
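
    A pure-PowerShell equivalent would be something like the sketch below (with the caveat that Get-ChildItem can itself fail on paths beyond MAX_PATH on older Windows, which is one reason to prefer the cmd.exe dir /b /s approach above):

    ```powershell
    # List every path on C:\ longer than 250 characters
    Get-ChildItem -Path C:\ -Recurse -ErrorAction SilentlyContinue |
        Where-Object { $_.FullName.Length -gt 250 } |
        Select-Object -ExpandProperty FullName |
        Out-File "c:\ktemp\longfilepath.txt"
    ```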