I wanted to periodically back up my Dark Souls 3 save game files while playing, to prevent my save file from becoming irreparably corrupted if a cheating player with malicious intentions invades. The script should also work with almost any other game, and with scenarios outside gaming too.
This is how I'm using it locally, and below is how the script came to be, as well as the script itself.
.\Save-Files.ps1 -Source 'C:\Users\Benedek\AppData\Roaming\DarkSoulsIII' -Destination '<path-to-backup-folder>' -Filter "*.sl2" -Watch -SaveIntervalSeconds 60
I started with the following prompt to GitHub Copilot (in VS Code):
Generate a PowerShell script for the following requirements:
Accepts input parameters:
- Source is path to a local folder, required.
- Destination is also a path to a local folder, required.
- Prefix string.
- Filter string to filter on files in the source destination.
- A switch called Compress.
- A switch called Watch.
- An integer parameter named SaveIntervalSeconds, default value is 300.
The main logic of the script should be the following:
- Copies files from the source to the destination.
- If the parameter is valid, only copy the files or folders that match.
- If the Compress switch is present, a zip file containing matching content should be copied instead.
- If there's only a single matching file or the content is compressed, suffix the resulting file name with the current date and time.
- If there are multiple files matching, copy them into a subfolder under Destination and suffix its name with the current date and time.
- If the Watch switch is present, then the whole logic should be wrapped in a cycle and it should run every SaveIntervalSeconds amount of seconds, but only if at least one of the matching files has changed since the last cycle.
The result worked correctly and was about 95% of what I wanted. I followed up with:
Modify it so that when there's only one file matching, the Compressed file name is based on that file, instead of "Backup".
Then I noticed that deleted files were not picked up by the Has-Changed function, so I asked GHCP to fix it:
Modify it to be able to detect when a file is deleted in Source.
Finally, I asked it to add help messages to the parameters, then manually cleaned up the script's formatting and variable names. The end result is the following:
param(
    [Parameter(Mandatory = $true, HelpMessage = 'Path to the source folder.')]
    [string] $Source,
    [Parameter(Mandatory = $true, HelpMessage = 'Path to the destination folder.')]
    [string] $Destination,
    [Parameter(HelpMessage = 'Optional prefix string for output names.')]
    [string] $Prefix = '',
    [Parameter(HelpMessage = 'Filter string to match files (wildcards allowed).')]
    [string] $Filter = '*',
    [Parameter(HelpMessage = 'Switch: Compress output as zip if present.')]
    [switch] $Compress,
    [Parameter(HelpMessage = 'Switch: Watch for changes and repeat if present.')]
    [switch] $Watch,
    [Parameter(HelpMessage = 'Interval in seconds for watch mode (default: 300).')]
    [int] $SaveIntervalSeconds = 300
)

# Collect all files under Source that match the filter.
function Get-MatchingFiles
{
    param($Source, $Filter)
    Get-ChildItem -Path $Source -Filter $Filter -Recurse -File
}

# Build a snapshot of file paths mapped to their last-write times.
function Get-HashTable
{
    param($Files)
    $hash = @{}
    foreach ($file in $Files)
    {
        $hash[$file.FullName] = $file.LastWriteTimeUtc.Ticks
    }
    return $hash
}

# Compare the previous snapshot against the current file list.
function Has-Changed
{
    param($OldHash, $NewFiles)
    $newFileNames = $NewFiles | ForEach-Object { $_.FullName }
    # Detect new or changed files
    foreach ($file in $NewFiles)
    {
        if (-not $OldHash.ContainsKey($file.FullName) -or $OldHash[$file.FullName] -ne $file.LastWriteTimeUtc.Ticks)
        {
            return $true
        }
    }
    # Detect deleted files
    foreach ($oldFile in $OldHash.Keys)
    {
        if ($newFileNames -notcontains $oldFile)
        {
            return $true
        }
    }
    return $false
}

# Perform one backup pass: copy or zip the matching files into Destination.
function Save-Content
{
    param($Source, $Destination, $Prefix, $Filter, $Compress)
    $files = Get-MatchingFiles -Source $Source -Filter $Filter
    if ($files.Count -eq 0)
    {
        Write-Output 'No matching files found.'
        return
    }
    $timestamp = Get-Date -Format 'yyyyMMdd_HHmmss'
    $baseName = if ($Prefix) { "${Prefix}_" } else { '' }
    if ($Compress)
    {
        if ($files.Count -eq 1)
        {
            # Single file: name the zip after the file itself.
            $file = $files[0]
            $zipName = "${baseName}$($file.BaseName)_$timestamp.zip"
        }
        else
        {
            $zipName = "${baseName}Backup_$timestamp.zip"
        }
        $zipPath = Join-Path $Destination $zipName
        Compress-Archive -Path $files.FullName -DestinationPath $zipPath -Force
        Write-Output "Compressed backup saved to $zipPath"
    }
    elseif ($files.Count -eq 1)
    {
        $file = $files[0]
        $destName = "${baseName}$($file.BaseName)_$timestamp$($file.Extension)"
        $destPath = Join-Path $Destination $destName
        Copy-Item -Path $file.FullName -Destination $destPath -Force
        Write-Output "File copied to $destPath"
    }
    else
    {
        # Multiple files: mirror their relative paths under a timestamped subfolder.
        $folderName = "${baseName}Backup_$timestamp"
        $destinationFolder = Join-Path $Destination $folderName
        New-Item -ItemType Directory -Path $destinationFolder -Force | Out-Null
        foreach ($file in $files)
        {
            $relativePath = $file.FullName.Substring($Source.Length).TrimStart('\')
            $targetPath = Join-Path $destinationFolder $relativePath
            $targetDir = Split-Path $targetPath -Parent
            if (-not (Test-Path $targetDir))
            {
                New-Item -ItemType Directory -Path $targetDir -Force | Out-Null
            }
            Copy-Item -Path $file.FullName -Destination $targetPath -Force
        }
        Write-Output "Files copied to $destinationFolder"
    }
}

if ($Watch)
{
    # Re-run every SaveIntervalSeconds, but only back up when something changed.
    $lastHash = @{}
    while ($true)
    {
        $files = Get-MatchingFiles -Source $Source -Filter $Filter
        $newHash = Get-HashTable -Files $files
        if (Has-Changed -OldHash $lastHash -NewFiles $files)
        {
            Save-Content -Source $Source -Destination $Destination -Prefix $Prefix -Filter $Filter -Compress:$Compress
            $lastHash = $newHash
        }
        Start-Sleep -Seconds $SaveIntervalSeconds
    }
}
else
{
    Save-Content -Source $Source -Destination $Destination -Prefix $Prefix -Filter $Filter -Compress:$Compress
}