How to batch update large amounts of data in SQL Server

The problem:

We needed to update 20 million rows in one go. The UPDATE statement itself is simple, but modifying that much data in a single transaction fills up the database's transaction log, and the statement fails with this error:

 [ErrorCode: 9002, SQL State: S0004] The transaction log for database 'XXXXData' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases
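As the message itself suggests, you can first check what is preventing log truncation (a minimal diagnostic sketch; 'XXXXData' is the database name from the error above):

	-- Why can't the transaction log space be reused?
	SELECT name, log_reuse_wait_desc
	FROM sys.databases
	WHERE name = 'XXXXData'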

The solution:

Since the failure comes from updating everything in a single operation, the natural fix is to update the data in batches. The approach I came up with is to update a fixed number of rows at a time, the same way a paged query does. Because each batch commits in its own transaction, the log space can be reused between batches (at a checkpoint under the SIMPLE recovery model, or after a log backup under FULL) instead of growing until it is full.

SQL

	DECLARE 
		@pagesize INT,
		@pages INT,
		@offset INT,
		@maxresult INT
		
		
	SELECT @pagesize = 100000, @pages = 1	-- 100,000 rows per batch
	
	-- 1 collect the ids of all affected rows, numbered for paging
	CREATE TABLE #Tbl_Affected_ID
	(	
		id INT NOT NULL,
		rowNum INT NOT NULL,
		modified CHAR(1) DEFAULT 'N' NOT NULL
	)
	
	INSERT INTO
		#Tbl_Affected_ID(id, rowNum)
	SELECT
		aa.id,
		ROW_NUMBER() OVER(ORDER BY aa.id) AS rowNum
	FROM
		TBLxxx aa
	WHERE aa.userId IS NULL
	
	
	-- 2 batch update: process @pagesize rows per iteration
	WHILE EXISTS (SELECT 1 FROM #Tbl_Affected_ID WHERE modified = 'N')
	BEGIN

		SELECT @offset = (@pages - 1) * @pagesize
		SELECT @maxresult = @offset + @pagesize

		-- update the target table for the current page of ids;
		-- rowNum starts at 1, so the half-open range (@offset, @maxresult]
		-- avoids updating the boundary row twice
		UPDATE TBLxxx SET
			userId = 'test_user'
		FROM #Tbl_Affected_ID tmp
		WHERE
			tmp.id = TBLxxx.id
		AND tmp.rowNum > @offset AND tmp.rowNum <= @maxresult

		-- mark this page as done so the WHILE condition can eventually fail
		UPDATE #Tbl_Affected_ID SET
			modified = 'Y'
		WHERE
			rowNum > @offset AND rowNum <= @maxresult

		SELECT @pages = @pages + 1

	END

	-- clean up
	DROP TABLE #Tbl_Affected_ID
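For reference, when the UPDATE's own predicate removes rows from the result set (as here, where updated rows no longer satisfy userId IS NULL), a simpler batching pattern needs no temp table at all: loop on UPDATE TOP until a batch affects zero rows. A minimal sketch under the same assumed TBLxxx/userId schema as above:

	DECLARE @batch INT = 100000	-- rows per batch, same as @pagesize above

	WHILE 1 = 1
	BEGIN
		-- each UPDATE commits as its own transaction,
		-- so log space can be reused between batches
		UPDATE TOP (@batch) TBLxxx SET
			userId = 'test_user'
		WHERE userId IS NULL

		-- stop once a batch finds nothing left to update
		IF @@ROWCOUNT = 0 BREAK
	END

The paged temp-table approach above is still the right tool when the update does not change the filtered column, since UPDATE TOP would then keep hitting the same rows.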

  

Original post: https://www.cnblogs.com/nidongde/p/5197323.html