Friday, October 7, 2011

Loopty-Loop

A recent encounter with my arch-enemy, The Tangled Loop, left me weary and scarred; I haven't yet fully recovered.  During the battle, I witnessed unspeakable things: lines of code that mangled T-SQL into something unrecognizable.  I feel guilty exposing you all to this, but it must be done, for you must know what to be on the lookout for.  The Tangled Loop is still out there, somewhere; he may surface in your workplace next.  What you're about to see is real, actual code that a developer wanted to deploy to our production server.

The code below, and I use the term "code" loosely, is intended to loop (yes, that's right, loop) through a table and, for certain rows, replace certain string combinations with something else.  Go ahead, look at it.  I'll give you a moment to stop vomiting and catch your breath.

-------------------------------------
-- declare variables
-------------------------------------
DECLARE @conversion TABLE (
  fromvalue VARCHAR(50),
  tovalue   VARCHAR(50))
DECLARE @rowsToUpdate TABLE (
  idtoupdate    INT,
  paramtoupdate VARCHAR(255))
DECLARE @paramName VARCHAR(20)
DECLARE @ruleID INT
DECLARE @ruleParam VARCHAR(255)
DECLARE @delim CHAR
DECLARE @delimIndex INT
DECLARE @delimWork VARCHAR(255)
DECLARE @oldValue VARCHAR(50)
DECLARE @newValue VARCHAR(50)
DECLARE @newValues VARCHAR(255)

---------------------------------------
---- Initialize variables
---------------------------------------
INSERT INTO @conversion
VALUES      ('E',
             'EMAIL')

INSERT INTO @conversion
VALUES      ('S',
             'SMS')

INSERT INTO @conversion
VALUES      ('X',
             'DO_NOT_Send')

INSERT INTO @conversion
VALUES      ('EMAIL',
             'EMAIL')

INSERT INTO @conversion
VALUES      ('SMS',
             'SMS')

INSERT INTO @conversion
VALUES      ('DO_NOT_Send',
             'DO_NOT_Send')

SET @delim = ','
SET @paramName = 'CP='

-------------------------------------
-- Procedure body
-------------------------------------
-- find all rows to process
INSERT INTO @rowsToUpdate
SELECT correspondencetemplateconfigurationselectionruleid,
       ruleparam
FROM   dbo.correspondencetemplateconfigurationselectionrule
WHERE  ruleparam LIKE @paramName + '%'

-- loop through the rows to process
SET rowcount 1

SELECT @ruleID = idtoupdate,
       @ruleParam = paramtoupdate
FROM   @rowsToUpdate

WHILE @@ROWCOUNT <> 0
  BEGIN
      PRINT N'Change: ' + CAST(@ruleParam AS VARCHAR(255))

      -- The rule parameter is like: CP=a,b,c  First we pull out the value portion (a,b,c)
      SET @delimWork = Substring(@ruleParam, Len(@paramName) + 1,
                       Len(@ruleParam))
      SET @newValues = @paramName

      -- loop through value list by delimiter (ex:  if values are a,b,c then we process a then b then c)
      SELECT @delimIndex = Charindex(@delim, @delimWork, 0)

      WHILE @delimIndex > 0
        BEGIN
            SELECT @oldValue = Ltrim(Rtrim(Substring(@delimWork, 0, @delimIndex)
                                     ))

            -- convert the old value to new new value
            SET @newValue = NULL

            SELECT @newValue = tovalue
            FROM   @conversion
            WHERE  fromvalue = @oldValue

            IF ( @newValue IS NOT NULL )
              BEGIN
                  -- save new value to new list of values
                  SET @newValues = @newValues + Ltrim(Rtrim(@newValue)) + @delim
              END

            -- setup for the next loop of delimited values
            SELECT @delimWork = Substring(@delimWork, @delimIndex + 1, Len(
                                @delimWork)
                                )

            SELECT @delimIndex = Charindex(@delim, @delimWork, 0)
        END

      -- need to pull the last item out of the value list
      IF Len(@delimWork) > 0
        BEGIN
            SET @oldValue = @delimWork
            -- convert the old value to new new value
            SET @newValue = NULL

            SELECT @newValue = tovalue
            FROM   @conversion
            WHERE  fromvalue = @oldValue

            IF ( @newValue IS NOT NULL )
              BEGIN
                  -- save new value to new list of values
                  SET @newValues = @newValues + Ltrim(Rtrim(@newValue))
              END
        END

      PRINT 'To: ' + @newValues

      PRINT '------------------------------------------'

      -- Update the table
      UPDATE dbo.correspondencetemplateconfigurationselectionrule
      SET    ruleparam = @newValues
      WHERE  correspondencetemplateconfigurationselectionruleid = @ruleID

      -- setup next loop of rows to update
      DELETE FROM @rowsToUpdate
      WHERE  idtoupdate = @ruleID

      SELECT @ruleID = idtoupdate,
             @ruleParam = paramtoupdate
      FROM   @rowsToUpdate
  END

SET rowcount 0 


We all know looping is not preferred within T-SQL, but this example takes it to the extreme.  Not only does it loop through the rows of a table, but when doing the string replacements, it loops through each string, ONE DELIMITED VALUE AT A TIME!

When The Tangled Loop launched this attack, I grabbed the nearest T-SQL reference book to use as a shield.  This brief respite allowed me the time to muster up a counter-attack:

UPDATE dbo.correspondenceconfiguration
SET    param = 'CP=' + REPLACE((SELECT map.newvalue AS [data()]
                                FROM   (SELECT '|E|'   AS oldvalue,
                                               'EMAIL' AS newvalue
                                        UNION ALL
                                        SELECT '|S|' AS oldvalue,
                                               'SMS' AS newvalue
                                        UNION ALL
                                        SELECT '|X|'         AS oldvalue,
                                               'DO_NOT_Send' AS newvalue) AS map
                                WHERE  REPLACE(REPLACE(
                                       correspondenceconfiguration.param + '|',
                                               '=',
                                               '=|'), ','
                                       , '|,|')
                                       LIKE '%' + map.oldvalue + '%'
                                FOR XML PATH('')), ' ', ',')
FROM   dbo.correspondenceconfiguration
WHERE  param LIKE 'CP=%'
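
If you haven't seen the FOR XML PATH('') trick before: selecting a column aliased as [data()] concatenates the row values into a single string, separated by spaces, and an outer REPLACE swaps those spaces for the delimiter you actually want.  A minimal, self-contained sketch (the sample values are made up for illustration):

```sql
-- [data()] with FOR XML PATH('') concatenates row values, space-separated;
-- REPLACE then converts the spaces into the comma delimiter we want.
SELECT REPLACE((SELECT v AS [data()]
                FROM   (VALUES ('EMAIL'), ('SMS')) AS t(v)
                FOR XML PATH('')), ' ', ',')
-- returns 'EMAIL,SMS'
```

Note that this relies on the individual values never containing spaces themselves, which holds for the conversion values in this case.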


The server was saved from a terrible fate, but unfortunately, The Tangled Loop escaped, and is still out there, somewhere.  I'm sure we haven't seen the last of him.

Automated Source Control: A Librarian For Your Database Objects, Part 2

I assume that by now you've tried the script from Part 1, and have seen how easy it is to script out your database objects using Powershell.  The resulting scripts are the standard DDL scripts that you can generate from Management Studio, and for most of us, are just fine right out of the box.

What if, however, we need to customize those scripts?  In my environment, our software deployments (including the database scripts) are automated, and we need certain elements to be included in the scripts for the automation to function properly.  I also like to include a certain degree of fail-safe in the scripts, to prevent accidentally deploying an object to the wrong database, or possibly wiping out a table full of data.

It turns out that it's a snap to customize the generated scripts using simple text piping, just like you've been doing in batch files for a decade or two.

(A quick scheduling note before we dive in: I'll resume this series after SQL PASS.  I'm not attending, but I will be taking some time off next week, and with all of the buzz around the conference, nobody will be paying attention to little ol' me anyway!)

Simply insert the following into the script from Part 1.  For your convenience, the entire script is repeated at the end of this post.

    #Add USE statement for deployment validation tool
    "USE $dbname" >  "$SavePath\$TypeFolder\$ScriptFile.SQL"

    #Add existing object handler (drop procedure, table exists, etc..)
    if ($TypeFolder -eq "StoredProcedure") {
    "IF OBJECT_ID('$ScriptFile', 'P') IS NOT NULL
        BEGIN
        
        DROP PROCEDURE $ScriptFile
        
        END
    GO
    " >> "$SavePath\$TypeFolder\$ScriptFile.SQL"
    } elseif ($TypeFolder -eq "View") {
    "IF OBJECT_ID('$ScriptFile', 'V') IS NOT NULL
        BEGIN
        
        DROP VIEW $ScriptFile
        
        END
    GO
    " >> "$SavePath\$TypeFolder\$ScriptFile.SQL"
    } elseif ($TypeFolder -eq "UserDefinedFunction") {
    "IF OBJECT_ID('$ScriptFile', 'FN') IS NOT NULL OR OBJECT_ID('$ScriptFile', 'TF') IS NOT NULL
        BEGIN
        
        DROP FUNCTION $ScriptFile
        
        END
    GO
    " >> "$SavePath\$TypeFolder\$ScriptFile.SQL"
    } elseif ($TypeFolder -eq "Trigger") {
    "IF OBJECT_ID('$ScriptFile', 'TR') IS NOT NULL
        BEGIN
        
        DROP TRIGGER $ScriptFile
        
        END
    GO
    " >> "$SavePath\$TypeFolder\$ScriptFile.SQL"
    } elseif ($TypeFolder -eq "Table") {
    "IF OBJECT_ID('$ScriptFile', 'U') IS NOT NULL
        BEGIN
        
        IF EXISTS(SELECT * FROM $ScriptFile)
            BEGIN
            
            USE tempdb
            RAISERROR('Table exists and contains data, cannot proceed.', 16, 1)
            RETURN
            
            END
        ELSE
            BEGIN
            
            DROP TABLE $ScriptFile
            
            END
        END
    GO
    " >> "$SavePath\$TypeFolder\$ScriptFile.SQL"
    }

The complete script now looks like this:

#Set server and database names here
$server = "REALServer"
$dbname = "REALDB"

#Create SMO connection to DB
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null
    $SMOserver = New-Object ('Microsoft.SqlServer.Management.Smo.Server') -argumentlist $server
    $db = $SMOserver.databases[$dbname]
 
#Build collection of objects to script
$Objects = $db.Tables
$Objects += $db.Views
$Objects += $db.StoredProcedures
$Objects += $db.UserDefinedFunctions

#Set destination for generated scripts
$SavePath = "C:\temp\db\Source\" + $($dbname)

#Loop through the object collection
foreach ($ScriptThis in $Objects | where {!($_.IsSystemObject)}) {
	#Create Scripter object
    $scriptr = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($SMOserver)
	
	#Set options for generated scripts
    $scriptr.Options.AppendToFile = $True
    $scriptr.Options.AllowSystemObjects = $False
    $scriptr.Options.ClusteredIndexes = $True
    $scriptr.Options.DriAll = $True
    $scriptr.Options.ScriptDrops = $False
    $scriptr.Options.IncludeHeaders = $False
    $scriptr.Options.ToFileOnly = $True
    $scriptr.Options.Indexes = $True
    $scriptr.Options.Permissions = $True
    $scriptr.Options.WithDependencies = $False

	#Create destination folder and subfolder, if necessary
    $TypeFolder=$ScriptThis.GetType().Name
    if (Test-Path -Path "$SavePath\$TypeFolder")
            {"Scripting Out $TypeFolder $ScriptThis"}
        else {new-item -type directory -name "$TypeFolder" -path "$SavePath"}
    $ScriptFile = $ScriptThis -replace "\[|\]"

	#Set output filename for script
    $scriptr.Options.FileName = "$SavePath\$TypeFolder\$ScriptFile.SQL"

    #Add USE statement for deployment validation tool
    "USE $dbname" >  "$SavePath\$TypeFolder\$ScriptFile.SQL"

    #Add existing object handler (drop procedure, table exists, etc..)
    if ($TypeFolder -eq "StoredProcedure") {
    "IF OBJECT_ID('$ScriptFile', 'P') IS NOT NULL
        BEGIN
        
        DROP PROCEDURE $ScriptFile
        
        END
    GO
    " >> "$SavePath\$TypeFolder\$ScriptFile.SQL"
    } elseif ($TypeFolder -eq "View") {
    "IF OBJECT_ID('$ScriptFile', 'V') IS NOT NULL
        BEGIN
        
        DROP VIEW $ScriptFile
        
        END
    GO
    " >> "$SavePath\$TypeFolder\$ScriptFile.SQL"
    } elseif ($TypeFolder -eq "UserDefinedFunction") {
    "IF OBJECT_ID('$ScriptFile', 'FN') IS NOT NULL OR OBJECT_ID('$ScriptFile', 'TF') IS NOT NULL
        BEGIN
        
        DROP FUNCTION $ScriptFile
        
        END
    GO
    " >> "$SavePath\$TypeFolder\$ScriptFile.SQL"
    } elseif ($TypeFolder -eq "Trigger") {
    "IF OBJECT_ID('$ScriptFile', 'TR') IS NOT NULL
        BEGIN
        
        DROP TRIGGER $ScriptFile
        
        END
    GO
    " >> "$SavePath\$TypeFolder\$ScriptFile.SQL"
    } elseif ($TypeFolder -eq "Table") {
    "IF OBJECT_ID('$ScriptFile', 'U') IS NOT NULL
        BEGIN
        
        IF EXISTS(SELECT * FROM $ScriptFile)
            BEGIN
            
            USE tempdb
            RAISERROR('Table exists and contains data, cannot proceed.', 16, 1)
            RETURN
            
            END
        ELSE
            BEGIN
            
            DROP TABLE $ScriptFile
            
            END
        END
    GO
    " >> "$SavePath\$TypeFolder\$ScriptFile.SQL"
    }

	#Generate script for this object
    $scriptr.Script($ScriptThis)
}

#Second loop through tables to script triggers
foreach ($Table in $db.Tables | where {!($_.IsSystemObject)}) {
	#Loop through triggers on this table
    foreach ($ScriptThis in $Table.Triggers) {
		#Create Scripter object
        $scriptr = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($SMOserver)
		
		#Set options for generated scripts
        $scriptr.Options.AppendToFile = $True
        $scriptr.Options.AllowSystemObjects = $False
        $scriptr.Options.ClusteredIndexes = $True
        $scriptr.Options.DriAll = $True
        $scriptr.Options.ScriptDrops = $False
        $scriptr.Options.IncludeHeaders = $False
        $scriptr.Options.ToFileOnly = $True
        $scriptr.Options.Indexes = $True
        $scriptr.Options.Permissions = $True
        $scriptr.Options.WithDependencies = $False
         
		#Create destination folder and subfolder, if necessary
		$TypeFolder=$ScriptThis.GetType().Name
        if (Test-Path -Path "$SavePath\Trigger")
                {"Scripting Out Trigger $ScriptThis"}
            else {new-item -type directory -name "Trigger" -path "$SavePath"}
        $ScriptFile = $ScriptThis -replace "\[|\]"

		#Set output filename for script
        $scriptr.Options.FileName = "$SavePath\Trigger\$ScriptFile.SQL"

		#Add USE statement for deployment validation tool
		"USE $dbname" >  "$SavePath\Trigger\$ScriptFile.SQL"

        #Add existing object handler (drop trigger, etc..)
        "IF OBJECT_ID('$ScriptFile', 'TR') IS NOT NULL
            BEGIN
            
            DROP TRIGGER $ScriptFile
            
            END
        GO
        " >> "$SavePath\Trigger\$ScriptFile.SQL"

		#Generate script for this trigger
        $scriptr.Script($ScriptThis)
	}
}
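
To make the effect of those additions concrete, here's roughly what the top of a generated stored procedure script ends up looking like (the procedure name here is hypothetical, and the SMO-generated CREATE statement is appended below the header):

```sql
USE REALDB
IF OBJECT_ID('SomeProcedure', 'P') IS NOT NULL
    BEGIN
    DROP PROCEDURE SomeProcedure
    END
GO
-- ...SMO-generated CREATE PROCEDURE script follows...
```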

Wednesday, October 5, 2011

Automated Source Control: A Librarian For Your Database Objects, Part 1

One of the biggest headaches in most development groups that do any sort of SQL Server work is that of source control.  There are some commercial products that attempt to solve this headache, and from what I understand, some of them do a pretty decent job.  The only one that I have personal experience with is Microsoft's Team Foundation doohickey, and I walked away from that unimpressed.

Being a "do-it-yourself" kind of guy, and looking for new and interesting ways to use Powershell, I've built a (pretty slick, if I say so myself) method of AUTOMATICALLY version-controlling my database objects.  No muss, no fuss, it just works, and nobody needs to remember to do anything.

This will be the first in a series of articles describing what my process does, how it does it, and why it does it, and at the end, you'll have a functional Powershell script that you can use yourself.  All of my examples are specific to Perforce, since that's what we use here, but they should be easily adaptable to your own system.

Powershell makes it very easy to connect to a SQL Server database, and to manipulate the objects within that database.  Virtually anything that you can do through Management Studio, you can also do through Powershell.

Below is the beginning of our source control script.  In this example, the script simply connects to the database of your choice and loops through certain objects in that database, generating SQL scripts for those objects.


#Set server and database names here
$server = "REALServer"
$dbname = "REALDB"

#Create SMO connection to DB
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.SMO") | out-null
    $SMOserver = New-Object ('Microsoft.SqlServer.Management.Smo.Server') -argumentlist $server
    $db = $SMOserver.databases[$dbname]
 
#Build collection of objects to script
$Objects = $db.Tables
$Objects += $db.Views
$Objects += $db.StoredProcedures
$Objects += $db.UserDefinedFunctions

#Set destination for generated scripts
$SavePath = "C:\temp\db\Source\" + $($dbname)

#Loop through the object collection
foreach ($ScriptThis in $Objects | where {!($_.IsSystemObject)}) {
	#Create Scripter object
    $scriptr = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($SMOserver)
	
	#Set options for generated scripts
    $scriptr.Options.AppendToFile = $True
    $scriptr.Options.AllowSystemObjects = $False
    $scriptr.Options.ClusteredIndexes = $True
    $scriptr.Options.DriAll = $True
    $scriptr.Options.ScriptDrops = $False
    $scriptr.Options.IncludeHeaders = $False
    $scriptr.Options.ToFileOnly = $True
    $scriptr.Options.Indexes = $True
    $scriptr.Options.Permissions = $True
    $scriptr.Options.WithDependencies = $False

	#Create destination folder and subfolder, if necessary
    $TypeFolder=$ScriptThis.GetType().Name
    if (Test-Path -Path "$SavePath\$TypeFolder")
            {"Scripting Out $TypeFolder $ScriptThis"}
        else {new-item -type directory -name "$TypeFolder" -path "$SavePath"}
    $ScriptFile = $ScriptThis -replace "\[|\]"

	#Set output filename for script
    $scriptr.Options.FileName = "$SavePath\$TypeFolder\$ScriptFile.SQL"

	#Generate script for this object
    $scriptr.Script($ScriptThis)
}

#Second loop through tables to script triggers
foreach ($Table in $db.Tables | where {!($_.IsSystemObject)}) {
	#Loop through triggers on this table
    foreach ($ScriptThis in $Table.Triggers) {
		#Create Scripter object
        $scriptr = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($SMOserver)
		
		#Set options for generated scripts
        $scriptr.Options.AppendToFile = $True
        $scriptr.Options.AllowSystemObjects = $False
        $scriptr.Options.ClusteredIndexes = $True
        $scriptr.Options.DriAll = $True
        $scriptr.Options.ScriptDrops = $False
        $scriptr.Options.IncludeHeaders = $False
        $scriptr.Options.ToFileOnly = $True
        $scriptr.Options.Indexes = $True
        $scriptr.Options.Permissions = $True
        $scriptr.Options.WithDependencies = $False
         
		#Create destination folder and subfolder, if necessary
		$TypeFolder=$ScriptThis.GetType().Name
        if (Test-Path -Path "$SavePath\Trigger")
                {"Scripting Out Trigger $ScriptThis"}
            else {new-item -type directory -name "Trigger" -path "$SavePath"}
        $ScriptFile = $ScriptThis -replace "\[|\]"

		#Set output filename for script
        $scriptr.Options.FileName = "$SavePath\Trigger\$ScriptFile.SQL"

		#Generate script for this trigger
        $scriptr.Script($ScriptThis)
	}
}

Wonder Twin Powers, Activate!

Every once in a while, two very different things will come together to produce something new, something unexpected, something awesome. Take these two guys that I know. One, a complete nerd, was bitten by a spider, and now spends his nights running around the city in tights, slinging strands of sticky goo all over the place. The other dude, also a nerd, who has always had a bit of a temper, got too close to the microwave or something, and now whenever he gets mad, turns into a muscular green no-nonsense kinda guy.

I recently had my own "coming together" experience, sadly nothing like these guys, I'm still a nerd, I don't turn green, and I refuse to wear tights. I did, however, develop a new ability through this experience - I can teleport, through time AND space! Ok, not me exactly, but I can teleport my work from one place and time to another. Seriously! By combining the teleportation powers of Dropbox with the time-travelling abilities of SSMS Tools Pack, I can leap from my current location to any other, and even backwards in time, to recover a SQL query or Management Studio session that I previously ran. This is an incredible new power!

My memory's not the greatest, inhibited by "shiny thing" syndrome and the ravages of age. There are often days where I leave work unfinished, intending to pick up where I left off the next day. To help remind myself of what I was doing and where I stopped, I'll leave Management Studio open on my laptop, put the laptop into sleep mode, and head home. Next morning, I simply wake the laptop up and pick up where I stopped the day before. This works well enough, but what happens if the laptop battery dies, or the laptop refuses to wake up? What if I want to continue working on my home machine, or on the "machine under the desk" that I have at work? Thanks to my new powers, these are no longer problems for me.

The first piece of this is Dropbox. If you're not using it, you're missing out. Prior to discovering my new ability, I was using Dropbox for all sorts of things. All of my utility scripts are in there, some eBooks, backups of my Android phone, lots of things. Start using it - you won't be sorry.

The second piece is the SSMS Tools Pack, developed by Mladen Prajdic. This is a tool that extends Management Studio to do some very cool things. In particular, it automatically saves your Management Studio sessions and tabs at regular intervals, making it possible to retrieve virtually any past query or session that you've run in Management Studio. By combining this capability with the syncing that Dropbox offers, those past sessions and queries can now be accessed from any machine on which you're running Dropbox. Pure genius!

Assuming that you have both Dropbox and SSMS Tools Pack installed, a simple configuration change is all that is needed to inherit my unique space/time-traveling ability: in the SSMS Tools Pack options, point its save location at a folder inside your Dropbox.

Soon you'll see an SSMSTools folder appear in your Dropbox. Make the same configuration change on your other workstations, and you now have integrated tab and session synchronization, across workstations!

Monday, October 3, 2011

T-SQL Tuesday 23: Nested Joins

You don't see it mentioned very often, but it's one of my favorite tuning tricks - the nested join.  There's nothing groundbreaking here, just my modest contribution to T-SQL Tuesday.

Allow me to present to you an actual query that surfaced on my production server today.  On the surface, this looks like a query that could be stripped down and highly tuned.  The complication is the CandidateBiometricIdentityWorkStatus table - there can be multiple rows in this table for a given OrderItemID, and we don't want to return ANY OrderItemID values that have a status of 'NEW', 'IN_PROCESS', or 'ERROR'.  In a classic case of non-set-based thinking, the developer produced this query:

UPDATE dbo.ResultUploadCompleteQueue
SET    RequestMonitorID = 12
WHERE  RequestMonitorID IS NULL
	   AND OrderItemID IN
	   	(
	   	    -- Find all OrderItemID in the work table that are ready for continued results processing
			SELECT DISTINCT work1.OrderItemID
			FROM dbo.CandidateBiometricIdentityWork work1
			INNER JOIN dbo.ResultUploadCompleteQueue resultQ
				ON work1.OrderItemID = resultQ.OrderItemID
			INNER JOIN dbo.CandidateBiometricIdentityWorkType AS workType
				ON work1.CandidateBiometricIdentityWorkTypeID = workType.CandidateBiometricIdentityWorkTypeID
			INNER JOIN dbo.CandidateBiometricIdentityRequestType AS requestType
				ON work1.CandidateBiometricIdentityRequestTypeID = requestType.CandidateBiometricIdentityRequestTypeID
			WHERE workType.WorkType = 'IDENTIFY' 
				AND requestType.RequestType = 'BATCH'
				AND work1.InResultsProcessing = 1
			     AND NOT EXISTS
				(
				    -- Find all work for the OrderItemID that is not complete and exclude from results
					SELECT 1
					FROM dbo.CandidateBiometricIdentityWork AS work2
					INNER JOIN dbo.CandidateBiometricIdentityWorkType AS workType
						ON work2.CandidateBiometricIdentityWorkTypeID = workType.CandidateBiometricIdentityWorkTypeID
					INNER JOIN dbo.CandidateBiometricIdentityRequestType AS requestType
						ON work2.CandidateBiometricIdentityRequestTypeID = requestType.CandidateBiometricIdentityRequestTypeID
					INNER JOIN dbo.CandidateBiometricIdentityWorkStatus AS workStatus
						ON work2.CandidateBiometricIdentityWorkStatusID = workStatus.CandidateBiometricIdentityWorkStatusID
					WHERE work2.OrderItemID = work1.OrderItemID
						  AND workType.WorkType = 'IDENTIFY' 
						  AND requestType.RequestType = 'BATCH'
						  AND work2.InResultsProcessing = 1
						  AND workStatus.status IN ('NEW','IN_PROCESS','ERROR')
				)
	   	)

This is the estimated plan for the query: hard to follow, inefficient (Eager Spool, anyone?), and ripe for improvement.



Not only can the UPDATE be improved with a join, but the NOT EXISTS filter can be replaced with, my favorite, a nested join.  I rewrote the query, introduced the developer to nested joins (something he'd never seen before), and produced a much cleaner, much more efficient query plan.

UPDATE ResultUploadCompleteQueue
SET RequestMonitorID = 12
FROM dbo.CandidateBiometricIdentityWork
INNER JOIN dbo.ResultUploadCompleteQueue
	ON CandidateBiometricIdentityWork.OrderItemID = ResultUploadCompleteQueue.OrderItemID
INNER JOIN dbo.CandidateBiometricIdentityWorkType
	ON CandidateBiometricIdentityWork.CandidateBiometricIdentityWorkTypeID = CandidateBiometricIdentityWorkType.CandidateBiometricIdentityWorkTypeID
INNER JOIN dbo.CandidateBiometricIdentityRequestType
	ON CandidateBiometricIdentityWork.CandidateBiometricIdentityRequestTypeID = CandidateBiometricIdentityRequestType.CandidateBiometricIdentityRequestTypeID
LEFT JOIN dbo.CandidateBiometricIdentityWork AS CandidateBiometricIdentityWorkPending
	INNER JOIN dbo.CandidateBiometricIdentityWorkType AS CandidateBiometricIdentityWorkTypePending
		ON CandidateBiometricIdentityWorkPending.CandidateBiometricIdentityWorkTypeID = CandidateBiometricIdentityWorkTypePending.CandidateBiometricIdentityWorkTypeID
	INNER JOIN dbo.CandidateBiometricIdentityRequestType AS CandidateBiometricIdentityRequestTypePending
		ON CandidateBiometricIdentityWorkPending.CandidateBiometricIdentityRequestTypeID = CandidateBiometricIdentityRequestTypePending.CandidateBiometricIdentityRequestTypeID
	INNER JOIN dbo.CandidateBiometricIdentityWorkStatus AS CandidateBiometricIdentityWorkStatusPending
		ON CandidateBiometricIdentityWorkPending.CandidateBiometricIdentityWorkStatusID = CandidateBiometricIdentityWorkStatusPending.CandidateBiometricIdentityWorkStatusID
	ON CandidateBiometricIdentityWork.OrderItemID = CandidateBiometricIdentityWorkPending.OrderItemID
		AND CandidateBiometricIdentityWorkPending.InResultsProcessing = 1
		AND CandidateBiometricIdentityWorkTypePending.WorkType = 'IDENTIFY'
		AND CandidateBiometricIdentityRequestTypePending.RequestType = 'BATCH'
		AND CandidateBiometricIdentityWorkStatusPending.Status IN ('NEW','IN_PROCESS','ERROR')
WHERE CandidateBiometricIdentityWorkType.WorkType = 'IDENTIFY' 
	AND CandidateBiometricIdentityRequestType.RequestType = 'BATCH'
	AND CandidateBiometricIdentityWork.InResultsProcessing = 1
	AND ResultUploadCompleteQueue.RequestMonitorID IS NULL
	AND CandidateBiometricIdentityWorkPending.OrderItemID IS NULL
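
If the nested join syntax is new to you, the shape is easier to see stripped of the long table names (the tables and columns below are made up for illustration):

```sql
-- The INNER JOIN between b and c is logically resolved first, and that
-- combined result is then LEFT JOINed to a (note the second ON clause).
-- Rows of a with no match survive with NULLs, so the IS NULL filter
-- turns the whole thing into an anti-join, replacing a NOT EXISTS subquery.
SELECT a.id
FROM   a
       LEFT JOIN b
              INNER JOIN c
                      ON b.c_id = c.id
              ON a.id = b.a_id
WHERE  b.a_id IS NULL
```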