Send files over PSSession
If it's a small file, you could send the contents of the file and the filename as parameters:
$f="the filename"
$c=Get-Content $f
invoke-command -session $s -script {param($filename,$contents) `
set-content -path $filename -value $contents} -argumentlist $f,$c
If the file is too large to fit within the session's limits, you could read it in as chunks and use a similar technique to append them together at the target location.
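A minimal sketch of that chunked approach, assuming $s is an open session and $f holds the file path (both from the example above). Base64 keeps binary content intact in transit; note the whole file is still read into local memory here, unlike the streaming script further down:

# Start clean on the remote side so appends don't land on a stale file
Invoke-Command -Session $s -ScriptBlock {
    param($filename)
    Remove-Item $filename -ErrorAction SilentlyContinue
} -ArgumentList $f

$bytes = [IO.File]::ReadAllBytes((Resolve-Path $f))
$chunkSize = 1MB   # illustrative; pick something under the session's limits
for ($offset = 0; $offset -lt $bytes.Length; $offset += $chunkSize) {
    $count = [Math]::Min($chunkSize, $bytes.Length - $offset)
    $chunk = [Convert]::ToBase64String($bytes, $offset, $count)
    Invoke-Command -Session $s -ScriptBlock {
        param($filename, $chunk)
        $data = [Convert]::FromBase64String($chunk)
        $stream = [IO.File]::Open($filename, 'Append')
        try { $stream.Write($data, 0, $data.Length) }
        finally { $stream.Close() }
    } -ArgumentList $f, $chunk
}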
PowerShell 5+ has built-in support for this, described in David's answer below.
This is now possible in PowerShell / WMF 5.0: Copy-Item has -FromSession and -ToSession parameters. You can use one of these and pass in a session variable, e.g.:
$cs = New-PSSession -ComputerName 169.254.44.14 -Credential (Get-Credential) -Name SQL
Copy-Item Northwind.* -Destination "C:\Program Files\Microsoft SQL Server\MSSQL10_50.SQL2008R2\MSSQL\DATA\" -ToSession $cs
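The -FromSession parameter works the same way in the other direction; a quick sketch pulling a file from the remote machine (paths are illustrative):

# Copy a file from the remote session back to the local machine
Copy-Item -Path 'C:\Temp\backup.bak' -Destination 'C:\Local\' -FromSession $cs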
See more examples here, or check out the official documentation.
I faced the same problem a while ago and put together a proof-of-concept for sending files over a PS Remoting session. You'll find the script here:
https://gist.github.com/791112
#requires -version 2.0

[CmdletBinding()]
param (
    [Parameter(Mandatory=$true)]
    [string] $ComputerName,

    [Parameter(Mandatory=$true)]
    [string] $Path,

    [Parameter(Mandatory=$true)]
    [string] $Destination,

    [int] $TransferChunkSize = 0x10000
)

function Initialize-TempScript ($Path) {
    "<# DATA" | Set-Content -Path $Path
}

function Complete-Chunk () {
@"
DATA #>
`$TransferPath = `$Env:TEMP | Join-Path -ChildPath '$TransferId'
`$InData = `$false
`$WriteStream = [IO.File]::OpenWrite(`$TransferPath)
try {
    `$WriteStream.Seek(0, 'End') | Out-Null
    `$MyInvocation.MyCommand.Definition -split "``n" | ForEach-Object {
        if (`$InData) {
            `$InData = -not `$_.StartsWith('DATA #>')
            if (`$InData) {
                `$WriteBuffer = [Convert]::FromBase64String(`$_)
                `$WriteStream.Write(`$WriteBuffer, 0, `$WriteBuffer.Length)
            }
        } else {
            `$InData = `$_.StartsWith('<# DATA')
        }
    }
} finally {
    `$WriteStream.Close()
}
"@
}

function Complete-FinalChunk ($Destination) {
@"
`$TransferPath | Move-Item -Destination '$Destination' -Force
"@
}

$ErrorActionPreference = 'Stop'
Set-StrictMode -Version Latest

$EncodingChunkSize = 57 * 100
if ($EncodingChunkSize % 57 -ne 0) {
    throw "EncodingChunkSize must be a multiple of 57"
}

$TransferId = [Guid]::NewGuid().ToString()

$Path = ($Path | Resolve-Path).ProviderPath
$ReadBuffer = New-Object -TypeName byte[] -ArgumentList $EncodingChunkSize
$TempPath = ([IO.Path]::GetTempFileName() | % { $_ | Move-Item -Destination "$_.ps1" -PassThru }).FullName

$Session = New-PSSession -ComputerName $ComputerName
$ReadStream = [IO.File]::OpenRead($Path)

$ChunkCount = 0
$TransferIndex = 0   # chunk counter for the verbose output below
Initialize-TempScript -Path $TempPath

try {
    do {
        $ReadCount = $ReadStream.Read($ReadBuffer, 0, $EncodingChunkSize)
        if ($ReadCount -gt 0) {
            [Convert]::ToBase64String($ReadBuffer, 0, $ReadCount, 'InsertLineBreaks') |
                Add-Content -Path $TempPath
        }
        $ChunkCount += $ReadCount
        if ($ChunkCount -ge $TransferChunkSize -or $ReadCount -eq 0) {
            # send
            Write-Verbose "Sending chunk $TransferIndex"
            Complete-Chunk | Add-Content -Path $TempPath
            if ($ReadCount -eq 0) {
                Complete-FinalChunk -Destination $Destination | Add-Content -Path $TempPath
                Write-Verbose "Sending final chunk"
            }
            Invoke-Command -Session $Session -FilePath $TempPath
            $TransferIndex++

            # reset
            $ChunkCount = 0
            Initialize-TempScript -Path $TempPath
        }
    } while ($ReadCount -gt 0)
} finally {
    if ($ReadStream) { $ReadStream.Close() }
    $Session | Remove-PSSession
    $TempPath | Remove-Item
}
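Assuming the script is saved as Send-File.ps1 (the file name is my assumption, not part of the gist), a call would look something like:

.\Send-File.ps1 -ComputerName SERVER01 -Path .\Northwind.bak -Destination 'C:\Temp\Northwind.bak' -Verbose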
Some minor changes would allow it to accept a session as a parameter instead of starting a new one; a sketch of that change follows. I found that memory consumption of the Remoting service on the destination computer could grow quite large when transferring large files. I suspect PS Remoting wasn't really designed to be used this way.
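For reference, one way to make that change, assuming the rest of the script above stays as-is (untested adaptation):

# Accept an existing session instead of creating one inside the script
param (
    [Parameter(Mandatory=$true)]
    [System.Management.Automation.Runspaces.PSSession]
    $Session,

    [Parameter(Mandatory=$true)]
    [string] $Path,

    [Parameter(Mandatory=$true)]
    [string] $Destination,

    [int] $TransferChunkSize = 0x10000
)
# ...then remove the "$Session = New-PSSession -ComputerName $ComputerName" line,
# and drop "$Session | Remove-PSSession" from the finally block so the caller's
# session remains open after the transfer.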