Hello everyone,
sometimes we need to open big files and read data from them – and by big I mean really big. Often, though, we only need some of the last lines, perhaps the newest entries. For that you can use the following script. This way the whole file is not loaded into memory first, which is fast and also prevents crashes.
<#
.SYNOPSIS
    Gets the last lines of a file without reading the whole file.
.EXAMPLE
    Get-LastLinesFromFile -Path "C:\Temp\BigData.csv" -Last 20
#>
function Get-LastLinesFromFile {
    param (
        [string]$Path,
        [int]$Last = 10,
        [int]$ApproxCharsPerLine = 50
    )

    $item = Get-Item $Path -ErrorAction SilentlyContinue
    if (-not $item) { return }

    $stream = $item.Open([System.IO.FileMode]::Open,
                         [System.IO.FileAccess]::Read,
                         [System.IO.FileShare]::ReadWrite)
    $reader = New-Object System.IO.StreamReader($stream)

    try {
        # Start with a window of $Last * $ApproxCharsPerLine characters
        # at the end of the file and grow it until it contains enough lines.
        $window = $Last * $ApproxCharsPerLine
        while ($true) {
            if ($window -ge $item.Length) {
                # The window covers the whole file - read everything.
                [void]$reader.BaseStream.Seek(0, [System.IO.SeekOrigin]::Begin)
            }
            else {
                [void]$reader.BaseStream.Seek(-$window, [System.IO.SeekOrigin]::End)
            }
            # The StreamReader buffers internally, so its buffer must be
            # discarded after seeking the underlying stream.
            $reader.DiscardBufferedData()
            $content = $reader.ReadToEnd()

            $lines = $content -split "`n" -replace "\s+$", ""
            if ($lines.Count -ge $Last -or $window -ge $item.Length) {
                # Enough lines found (or the file is shorter than requested):
                # keep only the last $Last lines.
                return ($lines | Select-Object -Last $Last) -join "`n"
            }

            # Not enough lines in the window yet - double it and try again.
            $window *= 2
        }
    }
    finally {
        $reader.Close()
        $stream.Close()
    }
}
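As a side note: if you are on PowerShell 3.0 or later, the built-in -Tail parameter of Get-Content covers the common case and also reads from the end of the file instead of loading everything:

```
# Read only the last 20 lines; -Tail starts reading near the end of the
# file, so the whole file is not loaded (requires PowerShell 3.0+).
Get-Content -Path "C:\Temp\BigData.csv" -Tail 20
```

The script above is still useful when you need the shared read access or want to tune the behavior yourself.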
Have fun with it.
~David