PowerShell Script: Rename Files with Characters that are Invalid for OneDrive for Business and SharePoint Online

I’m a big fan of Microsoft’s Office 365 offerings, and one of my favorite components of that is the OneDrive storage that comes with each user. It’s a great way to make sure files are always backed up and available anywhere you have an internet connection.

One of the challenges you can encounter when using OneDrive for Business is that not all characters that are valid for filenames in a Windows environment are valid for use in OneDrive (and SharePoint Online files, for that matter). According to this article, the list of invalid characters is: \ / : * ? " < > | # %.
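
If you just want to spot-check a single name against that list before running the full script below, a one-line regular expression will do it. This is my own quick sketch (the pattern and the sample file name are not from the article):

$invalidCharPattern = '[\\/:*?"<>|#%]'
"Quarterly Report #2.xlsx" -match $invalidCharPattern    #returns True if the name contains any invalid character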

There are several other restrictions (such as file size and name length), but the invalid characters issue is the one I encounter the most frequently, especially when I migrate a user who has a lot of files to OneDrive for Business. In one case, there were over 1,000 files with invalid characters, and I wasn’t about to rename those by hand.

Looking online, there are several great resources for PowerShell scripts to solve this problem, although some were out of date and did not accurately reflect the current restrictions. In the end, there were two sources I liked a lot, so I combined them into a single script and modified that. It’s not a perfect or complete script, and I don’t consider myself even proficient with PowerShell. However, this script has worked well for me.

First, the sources:

Use PowerShell to check for illegal characters before uploading multiple files into SharePoint

Fix file names for Skydrive Pro syncing

I liked that the second one outputs to a TSV file that you can open with Excel and review. The first source just prints to the console, and when you have a ton of files, that isn’t very useful. Because the results go to a file, I was able to tweak things a little more, adding a few extras such as searching for files with “%20” in the name (a URL-encoded space).

function Check-IllegalCharacters ($Path, $OutputFile, [switch]$Fix, [switch]$Verbose)
{
    #The maximum allowed number of characters of a file's full path + name
    $maxCharacters = 400
    #The maximum file size
    $maxFileSize = 2147483648
    #A list of file types that can't be sync'd
    $invalidFileTypes = ".tmp", ".ds_store"
    #A list of file names that can't be sync'd
    $invalidFileNames = "desktop.ini", "thumbs.db", "ehthumbs.db"

    Write-Host "Checking files in $Path, please wait..."

    #Only run for a valid path
    if (!(test-path $path))
    {
        'Invalid path for file renames'
    }
    else
    {
        #if the output file exists empty it first
        if (test-path $outputFile)
        {
            clear-content $outputFile
        }
        #add headers to output file
        Add-Content $outputFile "File/Folder Name`tNew Name`tComments"

        #Get all files and folders under the path specified
        $items = Get-ChildItem -Path $Path -Recurse
        foreach ($item in $items)
        {
            #Keep a flag to indicate whether or not we can perform the updates (some problems are deal breakers)
            $valid = $true
            #Keep an array list for comments
            $comments = New-Object System.Collections.ArrayList
            
            if ($item.PSIsContainer) { $type = "Folder" }
            else { $type = "File" }
        
            #Check if the item's full path is longer than the max number of characters
            if ($item.FullName.Length -gt $maxCharacters)
            {
                [void]$comments.Add("$($type) $($item.Name) has a full path of $($item.FullName.Length) characters (max is $($maxCharacters)) and will need to be shortened")
                $valid = $false
            }

            if($item.Length -gt $maxFileSize)
            {
                [void]$comments.Add("$($type) $($item.Name) is $($item.Length / 1MB) MB (max is $($maxFileSize / 1MB)) and cannot be synchronized.")   
                $valid = $false
            }

            #-contains is case-insensitive, so Thumbs.db and thumbs.db both match
            if ($invalidFileNames -contains $item.Name)
            {
                [void]$comments.Add("$($type) $($item.Name) is not a valid filename for file sync.")
                $valid = $false
            }

            #Use the Extension property so short names don't break the check and longer extensions like .ds_store still match
            if ($invalidFileTypes -contains $item.Extension)
            {
                [void]$comments.Add("$($type) $($item.Name) type $($item.Extension) is not a valid file type for file sync.")
                $valid = $false
            }
           
            #Technically all of the following are illegal: \ / : * ? " < > | # %
            #However, all but the last two are already invalid Windows filename characters, so we don't have to worry about them
            #Match "%20" as a whole so an encoded space can be turned back into a real space instead of a dash
            $illegalChars = '%20|[#%]'
            filter Matches($pattern)
            {
                $item.Name | Select-String -AllMatches $pattern |
                Select-Object -ExpandProperty Matches
            }
            
            #Replace illegal characters with legal characters where found
            $newFileName = $item.Name
            Matches $illegalChars | ForEach-Object {
                if($Verbose){ [void]$comments.Add("Illegal string '$($_.Value)' found") }
                #These characters may be used on the file system but not SharePoint
                if ($_.Value -eq "%20") { $newFileName = ($newFileName -replace "%20", " ") }
                elseif ($_.Value -eq "#") { $newFileName = ($newFileName -replace "#", "-") }
                elseif ($_.Value -eq "%") { $newFileName = ($newFileName -replace "%", "-") }
            }

            if($comments.Count -gt 0)
            {
                #output the details
                Add-Content $outputFile "$($item.FullName)`t$($item.FullName -replace [regex]::Escape($item.Name), $newFileName)`t$($comments -join ', ')"
                if($Verbose)
                { 
                    Write-Host "$($type) $($item.FullName): $($comments -join ', ')" -ForegroundColor Red
                }
            }
                
            #Fix file and folder names if found and the Fix switch is specified
            if ($newFileName -ne $item.Name)
            {
                if($fix -and $valid)
                {
                    Rename-Item $item.FullName -NewName ($newFileName)
                    if($Verbose)
                    {
                        Write-Host "$($type) $($item.Name) has been changed to $($newFileName)" -ForegroundColor Yellow
                    }
                }
            }
        }
    }
    Write-Host "Done"
}

#Example: Check-IllegalCharacters -Path 'C:\Users\User\Downloads\Files With Errors' -OutputFile 'C:\Users\User\Desktop\RenamedFiles.tsv' -Verbose -Fix
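
If you’re not ready to let it rename anything yet, you can leave off the -Fix switch on the first pass; the function still writes the TSV report with the proposed new names, so you can review it in Excel before running the same command again with -Fix:

Check-IllegalCharacters -Path 'C:\Users\User\Downloads\Files With Errors' -OutputFile 'C:\Users\User\Desktop\RenamedFiles.tsv' -Verbose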

I noticed that I ran into some errors when files were deeply nested, for example when a file needed to be renamed inside a folder that also had to be renamed. Re-running the script a few times fixed that problem for me (each pass fixes one more layer of folders, so if you had a file inside three levels of folders that also needed renaming, it would take four passes to complete everything).
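
If you’d rather not kick the script off by hand for each pass, one workaround (my own sketch, not from either of the sources above) is to call the function in a loop with a pass count at least as deep as your worst folder nesting. Note that each pass clears and rewrites the TSV, so the report you end up with reflects only the final pass:

1..5 | ForEach-Object {
    Check-IllegalCharacters -Path 'C:\Users\User\Downloads\Files With Errors' -OutputFile 'C:\Users\User\Desktop\RenamedFiles.tsv' -Fix
}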

Quick Tip: How to get a text list of files in a Windows Directory

Every once in a while I need to get a list of just the file names in a directory. Sometimes there are a LOT of files and it would be a pain in the ass to type them out or write a program to manipulate them in some way.

The quick way to do this in Windows (the steps below are for Windows 10) is as follows:

  1. Open a Windows Explorer window (shortcut: hit Win + E, where “Win” is the Windows key) and navigate to the folder containing the files you want to list
  2. Hold the Shift key and right-click in the folder (don’t click on any actual files) and choose the option “Open PowerShell window here”

  3. In the PowerShell window that opens, type

    Get-ChildItem -name

That’s it! Now it prints a list of file names that you can copy/paste from the PowerShell window into something else (Excel, Notepad, Word, whatever).
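
If the list is long, you can also skip the copy/paste and send the output straight to a text file, or (on Windows 10 / PowerShell 5 and later) to the clipboard; the file name below is just an example:

Get-ChildItem -Name > FileList.txt
Get-ChildItem -Name | Set-Clipboard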

Alternate method: command prompt

  1. Open a command prompt (Hit the Windows key, type “cmd” and hit enter)
  2. Navigate to the folder containing the files (type cd “<your file path>”, for example: cd “C:\Users\Public\Public Pictures\Sample Pictures”)
  3. Type the command below:

    dir /b

  4. If you’re using Windows 10, you can simply copy/paste from the command prompt. With an older version of Windows, you need to:
    1. Right click in the command window and choose “Select All”
    2. Hit the Enter key. This will copy the contents of the command prompt to your paste buffer so you can paste using Ctrl + V or right-click -> Paste

Connecting to Synology DiskStation from Windows 10

I’ve owned a small Synology DiskStation for a few years and really love its features and capabilities, especially considering its cost. One of the primary roles of my DiskStation is to back up my home computers. After recently purchasing a Surface Pro 4 and applying the Creators Update, I was having trouble connecting to my DiskStation running DSM 6.1. After reading quite a few posts online about different problems, it turned out the solution I needed was really quite basic.

What was most curious about this problem was that I could not see my DiskStation appear under “Network” in Windows Explorer, and I received error code 53 (“System error 53 has occurred. The network path was not found.”) when I tried to map the network drive using the command prompt like so:

net use T: \\DiskStation

Running nbtstat -c and net view from the command line both listed my DiskStation with the UNC name I was expecting, and the nbtstat command also showed the correct IP (I have mine configured as a static address).

First, I made sure the SMB settings on the DiskStation were set to allow from SMB 1.0 to SMB 3.0 (DiskStation Control Panel -> File Services -> SMB -> Advanced Settings -> Maximum/Minimum SMB Protocol Settings).

Then, in Windows, if I opened Explorer and navigated to the IP address or the UNC name (\\DiskStation, for example), it would prompt me for a username and password. This was the primary point of failure for me earlier: I had forgotten that when logging into another server, whether it’s a Synology DiskStation or a Windows Server, you have to provide the server name AND the account name (or domain name\account name, in the event that you’re connected to a domain).

So the username wasn’t just “MyUserName”; it was “DiskStation\MyUserName”. Once I did that, my DiskStation appeared under Network.
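
The same idea applies if you map the drive from the command line instead of Explorer: point at an actual share rather than just the server name, and pass the account in the same DiskStation\MyUserName form. A sketch, with the share and account names as placeholders for your own (the * prompts for the password):

net use T: \\DiskStation\home * /user:DiskStation\MyUserName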

Killing all connections to a SQL Server Database

One issue I’ve run across frequently during development is restoring a database to a newer state. Often, when I want to perform the restore, there are active connections to my development database, so restoring will fail.

Of course, Stack Overflow had the answer to this, but I’ve searched for the solution to this problem enough times that it made sense for me to finally write it down.

Script to kill all connections to a database (More than RESTRICTED_USER ROLLBACK)

User AlexK posted this excellent solution:

For MS SQL Server 2012 and above

USE [master];

DECLARE @kill varchar(8000) = '';  
SELECT @kill = @kill + 'kill ' + CONVERT(varchar(5), session_id) + ';'  
FROM sys.dm_exec_sessions
WHERE database_id  = db_id('MyDB')

EXEC(@kill);

For MS SQL Server 2000, 2005, 2008

USE master;

DECLARE @kill varchar(8000); SET @kill = '';  
SELECT @kill = @kill + 'kill ' + CONVERT(varchar(5), spid) + ';'  
FROM master..sysprocesses  
WHERE dbid = db_id('MyDB')

EXEC(@kill); 
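
Since I usually need this right before a scripted restore, it can be handy to run the same statement from PowerShell as the first step of the restore script. This is just a sketch, assuming the SqlServer (or older SQLPS) module is installed so Invoke-Sqlcmd is available; the instance and database names are placeholders:

$killSql = @'
DECLARE @kill varchar(8000) = '';
SELECT @kill = @kill + 'kill ' + CONVERT(varchar(5), session_id) + ';'
FROM sys.dm_exec_sessions
WHERE database_id = db_id('MyDB');
EXEC(@kill);
'@

Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'master' -Query $killSql
#...run the restore here before anything else reconnects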

Fix: Visual Studio doesn’t remember last open documents

After installing Visual Studio 2017 a few months back, I noticed that some projects were loading strangely, while others loaded just fine. The two main issues I experienced were:

  • Documents I had open on my previous run of VS wouldn’t load upon running the Visual Studio 2017 application
  • Windows I had arranged in my multi-monitor layout were not loading where I expected them

A quick Stack Overflow search led me to the answer regarding the first: the .suo file had become corrupt. Once I knew that, the trick was finding the .suo file:

  1. From the directory containing your solution file (.sln), open the folder named “.vs”.
  2. In the “.vs” folder, open the folder that has a name matching your solution name.
  3. Inside the solution folder, there may be multiple folders, one for each version of Visual Studio
    1. v14 is for Visual Studio 2015
    2. v15 is for Visual Studio 2017

These folders will contain your .suo file, which is hidden by default in Windows, so you need to enable “Show hidden files, folders, and drives” in your Folder Options in order to see it. For instructions on that (Windows 7, 8, or 10), see the following article: https://www.howtogeek.com/howto/windows-vista/show-hidden-files-and-folders-in-windows-vista/
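
Alternatively, if you’d rather not change the Folder Options at all, you can list the hidden file from PowerShell; run this from the folder that contains your .sln (the -Force switch is what makes hidden items show up):

Get-ChildItem -Path .\.vs -Recurse -Force -Filter *.suo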

I still haven’t found a solution to my second issue (I will definitely write about it if I find one).

What the SUO (Solution User Options) file controls

After solving my problem, I decided to take a look at the responsibilities of the .suo file. Microsoft’s documentation (the VS 2015 version; 2017 isn’t available at the time of this writing) isn’t very forthcoming in detailing what exactly the SUO file does. Based on digging around on the web, it seems that the following are its responsibilities (among others):

  • Remembers last open files
  • Remembers breakpoints
  • Remembers expanded nodes in solution explorer
  • Remembers startup project
  • Remembers last open tool windows and their positions
  • Remembers watch window contents

The file is stored in a binary format and isn’t human-readable, so it’s not something you can simply hack around with like you can a solution (.sln) or project (.xxproj) file. It should not be added to version control.

How to Fix credential validation issue on Azure WebJob renewal of Let’s Encrypt Certificate

A while back, I posted about setting up SSL encryption for free with Azure and Let’s Encrypt: Let’s Encrypt + Azure = Win!

This has been working smoothly for me since I set it up, but I noticed that errors started popping up in the log recently. Here is part of the stack trace:

Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: Functions.RenewCertificate ---> Microsoft.IdentityModel.Clients.ActiveDirectory.AdalServiceException: AADSTS70002: Error validating credentials. AADSTS50012: Invalid client secret is provided. Trace ID: 958b11ab-839d-4a8d-97e6-fad1c3df0300 Correlation ID: e3f7c035-8978-4aa2-b01a-5c8fc74661ac Timestamp: 2017-05-31 14:14:26Z ---> System.Net.WebException: The remote server returned an error: (401) Unauthorized. at System.Net.HttpWebRequest.GetResponse() at Microsoft.IdentityModel.Clients.ActiveDirectory.HttpWebRequestWrapper.

It turns out that the API key I had set up for my application registration had expired. I had to create a new key with no expiration and then update my web application’s settings with the new client secret. The exact steps I took are listed below:

  1. Login to Azure
  2. Navigate to “App Registrations”
  3. Choose the Registration you need to update
  4. Click the “settings” icon (or “All Settings” button)
  5. Choose “Keys” under API Access
  6. Type a description into the new row, choose “Never” under the duration drop-down, and then hit “Save” above.
  7. Once saved, copy the value (it won’t be visible again if you don’t copy it now)
  8. (Optional) delete your old key
  9. Navigate to the Azure App Service that has the web job that registers your SSL certificate
  10. Choose “Application Settings” from the menu
  11. Scroll down to where you have a setting titled something like “letsencrypt:ClientSecret” (assuming you did the setup as in the article linked at the top) and paste the value you copied into the second text box
  12. Click “Save” above

Once you’re done, the web job should work the next time it runs. For another explanation with some pictures of the process, check out this blog post: Let’s Encrypt on Azure Web Apps – Key Expiration Issue.