Lars Rasmussen - @nanoDBA
Thursday, March 24, 2022
dbatools: Using default credential for certain functions and parameters
#Ideas mostly stolen from https://dbatools.io/defaults/
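As I understand the idea from that dbatools.io post, it boils down to setting $PSDefaultParameterValues once per session so every dbatools command picks up the same credential automatically. A minimal sketch, assuming a PSCredential stored in $cred and a placeholder instance name:

# Sketch of the dbatools.io/defaults idea - set a session-wide default SqlCredential
# $cred and localhost are placeholders for illustration
$cred = Get-Credential
$PSDefaultParameterValues['*-Dba*:SqlCredential'] = $cred

# dbatools commands now use $cred without -SqlCredential being specified each time
Get-DbaDatabase -SqlInstance localhost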
Monday, March 22, 2021
Deleting Data and Log Files Not In Use with Find-DbaOrphanedFile (dbatools)
Over the weekend I was paged because a SQL Server volume violated the 5% minimum free disk space threshold set by our alerting system on storage volumes. This time, additional storage had already been requested but had not yet been provisioned. After going down the often unwise rabbit hole of shrinking one data file by a mere 3.91GB and then running Ola's IndexOptimize to clean up the resulting fragmentation, a previously used cmdlet from the dbatools PowerShell module popped into my head, with a little help from PSReadLine's CTRL+R to search through my command line history...
Found it!
Find-DbaOrphanedFile
I ran this command and quickly found 5 data files on the volume that were not in use - candidates for deletion - along with their corresponding transaction log files on another volume:
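The command itself is a one-liner; something like this, with the instance name as a placeholder:

# Hypothetical example - point it at the instance that owns the nearly-full volume
Find-DbaOrphanedFile -SqlInstance localhost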
Caution: None of those “Are you sure you wish to delete this file? (Y/N)” safety prompts here…
Let’s dangerously pipe that to Remove-Item and poof! The files have been deleted.
Running the command to delete the orphaned MDF/NDF/LDF files…
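For reference, the dangerous one-liner looked roughly like the sketch below. The instance name is a placeholder, and adding -WhatIf to Remove-Item will preview the deletions before you commit to them.

# CAUTION: deletes the orphaned files with no confirmation prompt
Find-DbaOrphanedFile -SqlInstance localhost -RemoteOnly | Remove-Item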
Nearly 10GB freed from one volume in a single line. Nice. And no fragmented indexes to think about, either.
Get-Help Find-DbaOrphanedFile -ShowWindow
offers an explanation of what the Find-DbaOrphanedFile cmdlet does:
I typically run Find-DbaOrphanedFile with the -RemoteOnly switch to display UNC paths.
Running Get-Help Find-DbaOrphanedFile -Examples
will offer some helpful examples.
It took me a while to figure out how to group the results in PowerShell and display the total size of the orphaned files in each directory, but this post from Jess Pomfret put it together for me.
Find-DbaOrphanedFile -SqlInstance localhost -RemoteOnly |
    Get-Item |
    Select-Object FullName, Directory, Length |
    Group-Object Directory |
    Select-Object Name, @{Label='SizeGB';Expression={ ($_.Group.Length | Measure-Object -Sum).Sum/1GB }}
Friday, December 11, 2020
Impersonating an Application Name Using Connect-DbaInstance
Disclaimer: While writing this post I finally scrolled down a little further and read the ‘Testing your results’ portion of Chrissy LeMaire’s post, which uses that same ClientName parameter, making most of this post redundant. That would have saved me some time if I hadn’t skimmed past it for over 2.5 years! Discovering functionality within the dbatools module from the console is very rewarding, though.
I'm trying to decommission a SQL Server, so I'm monitoring logins to that instance to make sure it isn't still being used. However, our customized version of the login-monitoring script, built from Chrissy's backwards-compatible T-SQL and the dbatools module, excludes connections from certain applications (our performance monitoring software and dbatools itself) and from certain domain credentials. That made it hard to verify that the monitoring was actually writing the connections I cared about to the watchlogins table.
Using the Connect-DbaInstance cmdlet I can impersonate another account _and_ impersonate an application name, verifying that the monitoring script is inserting the desired connections into the watchlogins table. The SqlCredential parameter allows Windows logons or SQL logins to be used. The example below uses a domain logon stored in the $cred variable as a PSCredential object. The ClientName parameter is where the “fake” client application name is specified. Using a WAITFOR DELAY statement ensured the connection was active long enough for the polling from the collection script to catch it.
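Put together, the test looked something like the sketch below. The instance name, client name, and delay are placeholders, and Invoke-DbaQuery is just my choice for running the WAITFOR; any query issued over the connection would do.

# Sketch: connect as a different account with a fake application name, then hold the session open
$cred = Get-Credential                      # Windows or SQL login stored as a PSCredential
$server = Connect-DbaInstance -SqlInstance 'SQL01' -SqlCredential $cred -ClientName 'SomeRetiredApp'
Invoke-DbaQuery -SqlInstance $server -Query "WAITFOR DELAY '00:00:30';"   # long enough for the polling to catch it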
In the screenshot below the client name is displayed in the program_name column of Adam Machanic’s sp_WhoIsActive results in SQL Server Management Studio.
See Get-Help Connect-DbaInstance -Full
or https://dbatools.io/Connect-DbaInstance for more info/examples.
I’ve modified Chrissy’s script slightly so that it skips instances that may be down rather than waiting for the query to time out, and so that it writes some output that shows up in the job history when the PowerShell script is scheduled as a SQL Agent job.
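A rough sketch of the idea, not the actual script; the connection timeout, loop, and messages are illustrative:

foreach ($instance in $instances) {
    try {
        # Fail fast on unreachable instances instead of waiting for the default timeout
        $server = Connect-DbaInstance -SqlInstance $instance -ConnectTimeout 5 -EnableException
    }
    catch {
        Write-Output "Skipping $instance - could not connect: $($_.Exception.Message)"
        continue
    }
    Write-Output "Collecting login activity from $instance"
    # ...run the backwards-compatible T-SQL here and write the results to the watchlogins table...
}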