- Jeevan Wijerathna
Problem: Deleting millions of rows at once can cause the following issues:
- Lock Escalation
- Large transaction log usage
To avoid these issues, we can delete in batches. Furthermore, we can use a transaction and commit after each iteration, so if something goes wrong we don't have to start over.
```sql
DECLARE @Rows INT, @BatchSize INT, @Completed INT, @Total INT, @Message NVARCHAR(MAX)

SET @BatchSize = 10000
SET @Rows = @BatchSize
SET @Completed = 0

SELECT CommentId AS Id INTO #targetIds
FROM Comments
WHERE Comments.Month < 202101
ORDER BY Id

SELECT @Total = @@ROWCOUNT

CREATE TABLE #batchIds (Id INT);

WHILE EXISTS (SELECT 1 FROM #targetIds)
BEGIN
    -- Take the next batch of ids out of the work queue
    DELETE TOP (@BatchSize)
    FROM #targetIds
    OUTPUT deleted.Id INTO #batchIds

    DELETE FROM Comments WHERE CommentId IN (SELECT Id FROM #batchIds)
    SET @Rows = @@ROWCOUNT
    SET @Completed = @Completed + @Rows

    -- Print progress using RAISERROR to avoid SQL buffering issue
    SELECT @Message = 'Completed ' + CAST(@Completed AS VARCHAR(10)) + '/' + CAST(@Total AS VARCHAR(10))
    RAISERROR(@Message, 0, 1) WITH NOWAIT

    TRUNCATE TABLE #batchIds;
END

IF OBJECT_ID(N'tempdb..#batchIds') IS NOT NULL
BEGIN
    DROP TABLE #batchIds
END

IF OBJECT_ID(N'tempdb..#targetIds') IS NOT NULL
BEGIN
    DROP TABLE #targetIds
END
```
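The script above relies on each statement auto-committing. To make the commit-after-each-iteration idea explicit, the loop body can be wrapped in its own transaction; a sketch of that variant (reusing the same temp tables and variables as above, with TRY/CATCH added so a failed batch is rolled back while earlier batches stay committed):

```sql
WHILE EXISTS (SELECT 1 FROM #targetIds)
BEGIN
    BEGIN TRY
        BEGIN TRANSACTION;

        -- Take the next batch of ids out of the work queue
        DELETE TOP (@BatchSize)
        FROM #targetIds
        OUTPUT deleted.Id INTO #batchIds;

        DELETE FROM Comments WHERE CommentId IN (SELECT Id FROM #batchIds);
        SET @Completed = @Completed + @@ROWCOUNT;

        -- Commit this batch: rows deleted so far stay deleted
        -- even if a later batch fails
        COMMIT TRANSACTION;

        TRUNCATE TABLE #batchIds;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0 ROLLBACK TRANSACTION;
        THROW;  -- rerunning the script resumes from the remaining #targetIds
    END CATCH
END
```

Keeping the transaction scoped to one batch also keeps the lock footprint and log growth per commit small, which is the point of batching in the first place.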