Friday, December 26, 2025

Bitwise operator

This snippet packs several phone-type flags into a single integer and reads them back with the bitwise AND operator. Each flag gets its own bit: POWER(2 * @flag, n) evaluates to 2^n when the flag is 1 and to 0 when it is 0.

DECLARE @phoneInd INT

DECLARE @home BIT
       ,@homeFax BIT
       ,@mobile BIT
       ,@office BIT
       ,@officeFax BIT
       ,@tollfreeOffice BIT
       ,@tollfreeFax BIT

-- turn on indicators 1=on 0=off
SELECT @home = 0
      ,@homeFax = 0
      ,@mobile = 1
      ,@office = 1
      ,@officeFax = 0
      ,@tollfreeOffice = 1
      ,@tollfreeFax = 0

-- POWER(2 * @flag, n) = 2^n when the flag is 1, otherwise 0
SET @phoneInd = POWER(2 * @home, 1)
             + POWER(2 * @homeFax, 2)
             + POWER(2 * @mobile, 3)
             + POWER(2 * @office, 4)
             + POWER(2 * @officeFax, 5)
             + POWER(2 * @tollfreeOffice, 6)
             + POWER(2 * @tollfreeFax, 7)

-- decode with bitwise AND: each constant matches the bit assigned above
SELECT @phoneInd AS phoneInd
      ,IIF((2   & @phoneInd) = 2,   1, 0) AS 'Has Home'
      ,IIF((4   & @phoneInd) = 4,   1, 0) AS 'Has Home Fax'
      ,IIF((8   & @phoneInd) = 8,   1, 0) AS 'Has Mobile'
      ,IIF((16  & @phoneInd) = 16,  1, 0) AS 'Has Office'
      ,IIF((32  & @phoneInd) = 32,  1, 0) AS 'Has Office Fax'
      ,IIF((64  & @phoneInd) = 64,  1, 0) AS 'Has Toll Free Office'
      ,IIF((128 & @phoneInd) = 128, 1, 0) AS 'Has Toll Free Fax'
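The POWER trick works, but since each flag is 0 or 1, multiplying the flag by its bit value reads more directly. This equivalent sketch assumes the same variables as the snippet above and produces the same @phoneInd:

```sql
-- equivalent: multiply each 0/1 flag by its bit value
SET @phoneInd = 2   * @home
             + 4   * @homeFax
             + 8   * @mobile
             + 16  * @office
             + 32  * @officeFax
             + 64  * @tollfreeOffice
             + 128 * @tollfreeFax   -- 8 + 16 + 64 = 88 for the values above
```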


Saturday, September 20, 2025

SQL Server Task Manager using TSQL

SQL Server Task Manager & System Memory Query

In this post, we’ll break down a SQL query that lists the processes running on the server (the same information Windows Task Manager shows) and fetches important system memory statistics from SQL Server. This combined approach helps system administrators monitor running tasks and memory usage on the server effectively. Let's dive right in!

1. Retrieving Task Information from the Task Manager

The first part of the query extracts the list of processes running on the Windows server that hosts SQL Server. It uses the xp_cmdshell extended stored procedure to execute the Windows tasklist command, which lists the running processes in CSV format. This data is then inserted into a temporary table.


-- Drop the temporary table if it exists
drop table if exists #TM

-- Create the temporary table for storing task data
Create table #TM (id int identity, tsk varchar(1000))

-- Insert tasklist output into the temporary table
insert into #TM (tsk) exec xp_cmdshell 'tasklist /v /FO csv'
    

Here’s a quick explanation of the steps in this code:

  • xp_cmdshell runs the tasklist command with the /v option for detailed information and /FO csv for CSV formatting.
  • The task list is inserted into the temporary table #TM, where each row contains the task details like process name, PID, memory usage, etc.

2. Cleaning and Formatting the Task Data

Once we have the raw task data, we need to clean it up. A Common Table Expression (CTE) is used here to remove unwanted characters like commas and double quotes from the CSV output, making it easier to extract individual task details later.


;with ct as (
    select *, 
           replace(replace(replace(tsk,'","','|'),'"',''),',','') ntsk
    from #TM where id >= 2
)
    

What’s happening here:

  • The replace function is applied multiple times to remove unwanted characters like quotes and commas from the task data.
  • The cleaned data is stored in a new column ntsk, which will be processed further.

3. Parsing the Data and Extracting Specific Information

Next, we use SQL Server’s JSON_VALUE function to parse the cleaned task data and extract specific fields, such as the task name, PID, memory usage, and others. We also use CROSS APPLY to convert the cleaned data into a JSON array format that’s easier to work with.


Select A.id,
       ImageName = nullif(JSON_VALUE(JS,'$[0]'),''),
       PID = nullif(JSON_VALUE(JS,'$[1]'),''),
       SessionName = nullif(JSON_VALUE(JS,'$[2]'),''),
       SessionNo = nullif(JSON_VALUE(JS,'$[3]'),''),
       MemUsage = cast(replace(nullif(JSON_VALUE(JS,'$[4]'),''), ' k','') as int),
       Status = nullif(JSON_VALUE(JS,'$[5]'),''),
       UserName = nullif(JSON_VALUE(JS,'$[6]'),''),
       CPUTime = nullif(JSON_VALUE(JS,'$[7]'),''),
       WindowTitle = nullif(JSON_VALUE(JS,'$[8]'),'')
From ct A
CROSS APPLY (values ('["'+replace(string_escape(ntsk,'json') ,'|','","')+'"]') ) B(JS)
order by MemUsage desc
    

Here’s how the data is processed:

  • The string_escape function prepares the data for JSON formatting by escaping special characters, and the replace function replaces pipes with commas to create a valid JSON array.
  • JSON_VALUE is then used to extract specific fields from the JSON array. For example, $[0] extracts the image name (task name), $[1] gets the PID, and so on.
  • We use nullif to handle empty strings and replace them with NULL.
  • The result is ordered by memory usage in descending order to highlight the processes that are consuming the most resources.

4. Retrieving System Memory Information

Finally, we fetch system memory details using the sys.dm_os_sys_memory DMV (Dynamic Management View). This provides crucial information about the physical memory available on the server.


-- Retrieve system memory information
SELECT   @@servername as servername,
         total_physical_memory_kb / 1024 AS total_physical_memory_mb,
         available_physical_memory_kb / 1024 AS available_physical_memory_mb,
         system_memory_state_desc
FROM     sys.dm_os_sys_memory;
    

What this query does:

  • total_physical_memory_kb and available_physical_memory_kb give the total and available physical memory in kilobytes. We divide these by 1024 to convert them into megabytes (MB).
  • system_memory_state_desc describes the current state of the system’s memory (whether it’s under stress, healthy, etc.).
  • @@servername returns the name of the SQL Server instance where this query is running.

5. Conclusion

This SQL query provides an effective way to monitor running tasks and system memory on your server. It allows you to:

  • Retrieve detailed task/process information from the Windows Task Manager.
  • Format and clean up raw task data into a readable structure.
  • Get important memory statistics to keep your server healthy and optimized.

By using this query, you can quickly identify high-memory processes and track overall system performance in a simple, easy-to-understand format.

Note: The xp_cmdshell extended stored procedure must be enabled for this query to work. Make sure your environment allows the execution of this command.
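If your security policy permits it, xp_cmdshell can be enabled with sp_configure (it is disabled by default, and enabling it requires ALTER SETTINGS permission):

```sql
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;
```

Consider disabling it again after use, since it allows arbitrary OS commands to run under the SQL Server service account.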

If you found this tutorial helpful, be sure to share it with your fellow SQL enthusiasts or subscribe for more beginner-friendly guides!

Tuesday, September 24, 2024

How to Remove Duplicate Records in SQL Server


If you're working with databases, one common problem you might face is duplicate data. Duplicates can cause a lot of headaches, especially when it comes to data analysis or reporting. But don’t worry! In this post, we’ll walk through a simple way to remove duplicate records in SQL Server using a stored procedure that you can use in your own projects.

We’ll break down the SQL code step-by-step to make sure you understand how it works. By the end of this tutorial, you’ll be able to use this technique to keep your data clean and organized.

Step 1: What is a Duplicate Record?

A duplicate record is when you have two or more rows in your table with the exact same data in one or more columns. For example, if you have a list of customers and two customers have the same name, email, and phone number, those would be considered duplicates.

Step 2: The Concept of the RemoveDuplicate Procedure

In SQL Server, you can use a stored procedure to automate the process of removing duplicates. A stored procedure is like a saved set of SQL commands that can be executed later. The RemoveDuplicate procedure that we’ll look at helps to:

  • Find duplicate records in a table.
  • Remove the extra copies, leaving only one unique row.

Step 3: Understanding the Code

Let’s break down the SQL code that removes duplicates. Don’t worry if it seems complex at first. We’ll explain it in a way that’s easy to follow.

Step 3.1: The Procedure and Parameters

The procedure is created with the following parameters:

CREATE PROCEDURE RemoveDuplicate @SCDID sysname, @IsDebug bit = 0
  • @SCDID: This is an ID that identifies the specific rule you want to apply for removing duplicates. It helps the procedure know which table to work with.
  • @IsDebug: This is a flag that lets you print out the SQL commands for debugging. If you're just starting out, you can use this to see the queries that will be executed.

Step 3.2: Using ROW_NUMBER() to Identify Duplicates

The first part of the code uses a Common Table Expression (CTE) and the ROW_NUMBER() function to label rows:

WITH ct AS (
    SELECT *, rn = ROW_NUMBER() OVER(PARTITION BY ColumnSearch ORDER BY ColumnSearch)
    FROM SourceDatabase.SourceSchema.SourceObject
)

The ROW_NUMBER() function assigns a unique number to each row, starting from 1. The key part is PARTITION BY, which groups rows based on a specific column. For example, if you're looking for duplicates in the "Email" column, all rows with the same email will be grouped together.

After the row numbers are assigned, we can delete the duplicates.

Step 3.3: Removing Duplicates

Now that the rows are numbered, we can delete the duplicates by selecting rows with rn >= 2 (anything that’s not the first occurrence):

DELETE FROM ct WHERE rn >= 2;

This means that if there are multiple rows with the same data in the ColumnSearch, all except the first one will be deleted.
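Here is a minimal, standalone sketch of the same pattern, using a hypothetical #Customers temp table (not part of the original procedure) so you can try it safely:

```sql
-- sample data: two identical rows for a@x.com
CREATE TABLE #Customers (Email varchar(100), Name varchar(100));
INSERT INTO #Customers VALUES
    ('a@x.com', 'Ann'), ('a@x.com', 'Ann'), ('b@x.com', 'Bob');

-- number the rows within each Email group, then delete everything past the first
WITH ct AS (
    SELECT *, rn = ROW_NUMBER() OVER (PARTITION BY Email ORDER BY Email)
    FROM #Customers
)
DELETE FROM ct WHERE rn >= 2;   -- removes the second 'a@x.com' row

SELECT * FROM #Customers;
DROP TABLE #Customers;
```

Deleting through the CTE works because the CTE is updatable: the DELETE is applied to the underlying table.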

Step 3.4: Handling Column Exclusions

Sometimes, you don’t want to compare all columns for duplicates. In that case, we use the ColumnExclude parameter to exclude certain columns from the comparison. The code dynamically builds a list of column names, excluding any columns specified in ColumnExclude. The snippet below is conceptual; in the full procedure the excluded names are expanded into a quoted, comma-separated list inside dynamic SQL, since NOT IN cannot match against a single variable holding a CSV string:

SELECT @rOut = STRING_AGG(c.name, ',')
    FROM SourceDatabase.sys.COLUMNS c
    WHERE c.name NOT IN (@ColumnExclude)

Step 3.5: Final Cleanup Using LAG()

In the final step, we use the LAG() function, which allows us to compare each row with the previous row in a group. If the current row has the same data as the previous one, it’s considered a duplicate and will be removed:

WITH ct AS (
    SELECT *, Ckp = CONCAT(@rval),
           PRv = LAG(CONCAT(@rval)) OVER(PARTITION BY ColumnSearch ORDER BY ColumnSearch)
    FROM SourceDatabase.SourceSchema.SourceObject
)
DELETE FROM ct WHERE ISNULL(ckp, '') = ISNULL(PRv, '');
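As a standalone illustration of the LAG() idea (again using a hypothetical temp table rather than the procedure's metadata-driven SQL):

```sql
CREATE TABLE #Readings (Sensor varchar(10), Val int);
INSERT INTO #Readings VALUES ('A', 1), ('A', 1), ('A', 2), ('B', 5);

-- Ckp is the current row's checkpoint value, PRv the previous row's;
-- when they match, the current row duplicates the one before it
WITH ct AS (
    SELECT *,
           Ckp = CONCAT(Sensor, '|', Val),
           PRv = LAG(CONCAT(Sensor, '|', Val))
                     OVER (PARTITION BY Sensor ORDER BY Val)
    FROM #Readings
)
DELETE FROM ct WHERE ISNULL(Ckp, '') = ISNULL(PRv, '');
-- the duplicate ('A', 1) row is removed; first rows survive because PRv is NULL

SELECT * FROM #Readings;
DROP TABLE #Readings;
```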

Step 4: Executing the Procedure

Once you understand how the procedure works, you can execute it to remove duplicates. Simply call the procedure with the SCDID (rule ID) and @IsDebug parameter (set to 1 for debugging):

EXEC RemoveDuplicate @SCDID = 'yourRuleID', @IsDebug = 1;

If @IsDebug is set to 1, the procedure will print out the SQL commands it is going to execute, so you can check them before they run.

Step 5: Original Table Design for SCD2_Rule

Here’s the original design of the SCD2_Rule table used in the stored procedure. This table contains information about your source and target tables and the columns used for comparison:

Column          Data type
SCDID           sysname
SourceDatabase  sysname
SourceSchema    sysname
SourceObject    sysname
TargetDatabase  sysname
TargetSchema    sysname
TargetObject    sysname
ColumnSearch    nvarchar(max)
ColumnExclude   nvarchar(max)

Step 6: Complete SQL Stored Procedure Code

Here’s the full SQL code for the RemoveDuplicate procedure, including everything we’ve discussed so far:


CREATE PROCEDURE RemoveDuplicate
    @SCDID sysname,
    @IsDebug bit = 0
AS
BEGIN
    DECLARE @Sql AS NVARCHAR(MAX), @rVal varchar(max)

    -- Fetch the SQL query template for the given SCDID
    SELECT @Sql = 'WITH ct AS (
        SELECT *, rn = ROW_NUMBER() OVER(PARTITION BY ' + ColumnSearch + ' ORDER BY ' + ColumnSearch + ')
        FROM ' + CONCAT(SourceDatabase, '.', SourceSchema, '.', SourceObject) + '
    ) DELETE FROM ct WHERE rn >= 2'
    FROM dbo.SCD2_Rule WHERE SCDID = @SCDID

    -- Debugging: Print the dynamic SQL if IsDebug is set to 1
    IF @IsDebug = 1
        PRINT @Sql

    -- Execute the dynamic SQL to remove duplicates
    EXEC sp_executesql @Sql

    -- Other duplicate handling: Exclude specified columns
    SELECT @Sql = 'SELECT @rOut = STRING_AGG(c.name, '','')
        FROM ' + SourceDatabase + '.sys.COLUMNS c
        JOIN ' + SourceDatabase + '.sys.tables t ON c.object_id = t.object_id
        WHERE t.name = ''' + SourceObject + ''' 
        AND c.name NOT IN (' + b.a + ') AND c.name <> ''DataValidTo'''
    FROM dbo.SCD2_Rule a
    OUTER APPLY (
        SELECT STRING_AGG(''' + value + ''', ',') a
        FROM STRING_SPLIT(a.ColumnSearch + ',' + a.ColumnExclude, ',') b1
    ) b
    WHERE SCDID = @SCDID

    -- Debugging: Print the dynamic SQL for column exclusions
    IF @IsDebug = 1
        PRINT CONCAT('@Sql : ', @Sql)

    -- Execute the query for excluding columns
    EXEC sp_executesql @Sql, N'@rOut varchar(max) OUTPUT', @rOut=@rVal OUTPUT;

    -- Debugging: Print the result of excluded columns
    IF @IsDebug = 1
        PRINT CONCAT('@rVal : ', @rVal)

    -- Final cleanup: Remove records with identical values in key columns
    SELECT @Sql = 'WITH ct AS (
        SELECT *, Ckp = CONCAT(' + @rVal + '),
            PRv = LAG(CONCAT(' + @rVal + ')) OVER(PARTITION BY ' + REPLACE(ColumnSearch, ',DataSystemDate', '') + ' ORDER BY ' + ColumnSearch + ')
        FROM ' + CONCAT(SourceDatabase, '.', SourceSchema, '.', SourceObject) + '
    ) DELETE FROM ct WHERE ISNULL(ckp, '''') = ISNULL(PRv, '''')'
    FROM dbo.SCD2_Rule WHERE SCDID = @SCDID

    -- Debugging: Print the final SQL for cleanup
    IF @IsDebug = 1
        PRINT @Sql

    -- Execute the final cleanup query
    EXEC sp_executesql @Sql
END
        

Step 7: Conclusion

You’ve just learned how to remove duplicate records from your SQL Server tables using a stored procedure. The RemoveDuplicate procedure is powerful because it allows you to automate the process of cleaning up your data.

Key Takeaways:

  • ROW_NUMBER() is used to assign a unique number to each row based on certain columns.
  • LAG() compares rows to detect duplicates.
  • Dynamic SQL is used to make the procedure flexible for different tables and columns.
  • Debugging is made easy with the @IsDebug flag to see the SQL commands before they run.

Now you can keep your SQL Server tables clean by removing duplicate data with ease. If you have any questions or need further clarification, feel free to ask in the comments!

Bonus Tip:

If you're working with large datasets, be sure to test the procedure on a smaller subset of your data before running it on your production database. Always back up your data first!

If you found this tutorial helpful, be sure to share it with your fellow SQL enthusiasts or subscribe for more beginner-friendly guides!

Monday, March 28, 2022

How to find unsaved file location for SQL query in management studio

Losing unsaved work can be frustrating, especially in SQL Server Management Studio (SSMS). Here's how to recover your work.

Default Location for Unsaved SQL Files

SSMS may store unsaved queries in temporary locations:

  • Windows XP:
    C:\Documents and Settings\<YourUsername>\My Documents\SQL Server Management Studio\Backup Files
  • Windows Vista / 7 / 10 / 11:
    %USERPROFILE%\Documents\SQL Server Management Studio\Backup Files
    OR
    %USERPROFILE%\AppData\Local\Temp

Replace <YourUsername> with your actual Windows username.

View Recently Executed SQL Queries

Run this SQL to view recent activity:

USE <Database Name>;
SELECT 
    execquery.last_execution_time AS [Date Time],
    execsql.text AS [Script]
FROM sys.dm_exec_query_stats AS execquery
CROSS APPLY sys.dm_exec_sql_text(execquery.sql_handle) AS execsql
ORDER BY execquery.last_execution_time DESC;

⚠️ Replace <Database Name> with the name of your working database.

Enable AutoRecover in SSMS

  1. Open SSMS.
  2. Go to Tools > Options.
  3. Navigate to Environment > AutoRecover.

Adjust settings like time interval and number of backups to keep.

(Screenshot: the SSMS AutoRecover options page)

What Happens on Restart?

SSMS prompts to recover unsaved queries after a crash:

(Screenshot: the "choose query to recover" prompt)

Conclusion

Enable AutoRecover and know where unsaved files go to prevent data loss in SSMS.

Sunday, January 2, 2022

Read specific errors using temp table


Ever need to quickly check recent errors in your SQL Server logs? This handy T-SQL snippet allows you to query the SQL Server error log and gather information about various processes and messages. It's a great way to get a snapshot of what's been happening in your SQL Server environment.

Querying SQL Server Error Logs

The following SQL code creates a temporary table and populates it by reading the last seven SQL Server error logs (log files 0 through 6, where 0 is the current log). The parameters to sp_readerrorlog are the log file number, the log type (1 = SQL Server error log), and a search string ('error' here, so only matching entries are returned). The final SELECT filters out the 'logon' process, helping you skip routine login messages and focus on actual errors or important events.

create table #t (dt datetime, process varchar(200), msg varchar(1000))
insert into #t exec sp_readerrorlog 0,1,'error'
insert into #t exec sp_readerrorlog 1,1,'error'
insert into #t exec sp_readerrorlog 2,1,'error'
insert into #t exec sp_readerrorlog 3,1,'error'
insert into #t exec sp_readerrorlog 4,1,'error'
insert into #t exec sp_readerrorlog 5,1,'error'
insert into #t exec sp_readerrorlog 6,1,'error'
select * from #t where process <> 'logon'
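The seven inserts can be collapsed into a loop; this equivalent sketch assumes the same #t table as above:

```sql
-- read error logs 0 (current) through 6 in a loop
DECLARE @log int = 0;
WHILE @log <= 6
BEGIN
    INSERT INTO #t EXEC sp_readerrorlog @log, 1, 'error';
    SET @log += 1;
END

SELECT * FROM #t WHERE process <> 'logon';
```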


Saturday, January 1, 2022

Query Plan read using T-SQL

How to Analyze SQL Server Execution Plans from a Saved XML File

If you have an execution plan saved as an XML file, here's how you can read and analyze it in SQL Server.

📄 Step 1: Load the Execution Plan File

Use the OPENROWSET function to load your .xml file into a variable.


DECLARE @plancontents VARCHAR(MAX), @xml XML;
SET @plancontents = (
    SELECT BulkColumn
    FROM OPENROWSET(BULK 'C:\TEMP\MyPlan.XML', SINGLE_CLOB) AS FileData
);
SET @xml = CAST(@plancontents AS XML);  -- used by the XQuery in Step 3

🔍 Step 2: Retrieve Recent Queries Matching a Specific Pattern

You can extract the top 100 recent SQL queries that match a specific text pattern, along with their execution times and plan handles.


SELECT TOP 100 
    execquery.last_execution_time AS [Date Time],
    execsql.TEXT AS [Script],
    execquery.plan_handle
FROM sys.dm_exec_query_stats AS execquery
CROSS APPLY sys.dm_exec_sql_text(execquery.sql_handle) AS execsql
WHERE execsql.TEXT LIKE '%WITH Temp_Accounts (%'
ORDER BY execquery.last_execution_time DESC;

🧠 Step 3: Analyze Execution Plan Nodes

Once you have the execution plan XML, you can use XQuery to dig into specific attributes of each node, such as cost estimates and operation types.


;WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
SELECT 
    c.value('@PhysicalOp', 'nvarchar(max)') AS PhysicalOp,
    c.value('@LogicalOp', 'nvarchar(max)') AS LogicalOp,
    c.value('@AvgRowSize', 'float') AS AvgRowSize,
    c.value('@TableCardinality', 'float') AS TableCardinality,
    c.value('@Parallel', 'nvarchar(max)') AS Parallel,
    c.value('@EstimateRebinds', 'float') AS EstimateRebinds,
    c.value('@EstimateRewinds', 'float') AS EstimateRewinds,
    c.value('@NodeId', 'int') AS NodeId,
    -- numeric types so the ORDER BY sorts by cost, not alphabetically
    c.value('@EstimatedTotalSubtreeCost', 'float') AS EstimatedTotalSubtreeCost,
    c.value('@EstimateRows', 'float') AS EstimateRows,
    c.value('@EstimateIO', 'float') AS EstimateIO,
    c.value('@EstimateCPU', 'float') AS EstimateCPU,
    c.query('.') AS ExecPlanNode
FROM @xml.nodes('//RelOp') AS T(c)
ORDER BY EstimatedTotalSubtreeCost DESC;


💡 Final Thoughts

By leveraging SQL Server's XQuery capabilities and DMVs, you can gain deep insights into query performance and execution behavior directly from saved plan files. This method is particularly useful when troubleshooting or optimizing complex queries.

Wednesday, December 29, 2021

Get tempdb version store space usage by database

Looking to understand how much space the version store is using in your SQL Server tempdb? The sys.dm_tran_version_store_space_usage dynamic management view is your go-to. It provides a table showing the total tempdb space consumed by version store records for each database.

This view is super efficient and inexpensive to run because it doesn't dig into individual version store records. Instead, it gives you an aggregated view of the version store space used in tempdb on a per-database basis. That makes it perfect for quick checks and monitoring!

Retrieve Version Store Space Usage

SELECT DB_NAME(database_id) AS [Database Name],
       reserved_page_count AS [Version Store Reserved Page Count],
       reserved_space_kb / 1024 AS [Version Store Reserved Space (MB)]
FROM sys.dm_tran_version_store_space_usage WITH (NOLOCK)
ORDER BY reserved_space_kb / 1024 DESC
OPTION (RECOMPILE);

This query will give you a clear breakdown of the version store space by database, ordered from largest to smallest.
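If the version store is unexpectedly large, a common next step (not part of the original query) is to look for long-running snapshot transactions that prevent version cleanup, using the sys.dm_tran_active_snapshot_database_transactions DMV:

```sql
-- snapshot transactions holding the version store, oldest first
SELECT TOP (5)
       session_id,
       transaction_id,
       elapsed_time_seconds
FROM sys.dm_tran_active_snapshot_database_transactions
ORDER BY elapsed_time_seconds DESC;
```

Killing or committing the oldest of these sessions typically allows tempdb's version store to shrink again.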

For more SQL Server insights, check out this article on getting your TempDB files count.

You can also find more detailed documentation on sys.dm_tran_version_store_space_usage on the Microsoft Docs website.
