Generating Millions of Rows in SQL Server [Code Snippets]
Generating millions of rows in SQL Server can be helpful for testing purposes or for performance tuning.
Often, we need to generate and insert many rows into a SQL Server table, for example for testing purposes or performance tuning. It might be useful to imitate production volume in the testing environment, or to check how our queries behave when challenged with millions of rows.
Below you will find example code for generating primary key columns, random ints, and random nvarchars in the SQL Server environment.
Table:
CREATE TABLE dbo.Table1 (
id int PRIMARY KEY
,number int
,name nvarchar(10)
);
Procedure:
IF OBJECT_ID('dbo.addRows', 'P') IS NOT NULL
    DROP PROCEDURE dbo.addRows;
GO
CREATE PROCEDURE dbo.addRows
    @rowsNumber int
AS
BEGIN
    SET NOCOUNT ON;
    -- start point for adding rows
    DECLARE @id INT = ISNULL((SELECT MAX(id) FROM dbo.Table1) + 1, 1);
    DECLARE @iteration INT = 0;
    WHILE @iteration < @rowsNumber
    BEGIN
        -- get a random int from 0 to 99
        DECLARE @number INT = CAST(RAND() * 100 AS INT);
        -- generate a random nvarchar from ASCII chars 65 to 127
        DECLARE @name NVARCHAR(10) = N''; -- start with an empty string
        DECLARE @length INT = CAST(RAND() * 10 AS INT); -- random length, 0 to 9
        WHILE @length <> 0 -- append one random char per iteration
        BEGIN
            SET @name = @name + CHAR(CAST(RAND() * 63 + 65 AS INT));
            SET @length = @length - 1;
        END
        -- insert the generated row
        INSERT INTO dbo.Table1 (id, number, name)
        VALUES (@id, @number, @name);
        SET @iteration += 1;
        SET @id += 1;
    END
    SET NOCOUNT OFF;
END
Using this procedure will enable us to add the requested number of random rows.
Sample:
EXEC dbo.addRows 1000 --elapsed time ~0.11 s
EXEC dbo.addRows 10000 --elapsed time ~1.1 s
EXEC dbo.addRows 100000 --elapsed time ~9.64 s
EXEC dbo.addRows 1000000 --elapsed time ~2 min 11.88 s
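As the timings above suggest, a row-by-row loop gets slow around the million-row mark. If speed matters more than the loop-based approach shown in this article, a set-based insert is a common alternative. The sketch below is an assumption on my part (not from the original article): it builds a numbers set from a cross join of `sys.all_objects` and uses `CHECKSUM(NEWID())` for per-row randomness, since `RAND()` is evaluated only once per query in a set-based statement.

```sql
-- Hypothetical set-based alternative: insert 1,000,000 rows in one statement.
;WITH n AS (
    SELECT TOP (1000000)
        ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS rn
    FROM sys.all_objects AS a
    CROSS JOIN sys.all_objects AS b
)
INSERT INTO dbo.Table1 (id, number, name)
SELECT
    rn + ISNULL((SELECT MAX(id) FROM dbo.Table1), 0),       -- continue the key sequence
    ABS(CHECKSUM(NEWID())) % 100,                           -- random int, 0 to 99
    LEFT(CONVERT(nvarchar(36), NEWID()),
         ABS(CHECKSUM(NEWID())) % 10 + 1)                   -- random string, length 1 to 10
FROM n;
```

On typical hardware this kind of single-statement insert finishes in seconds rather than minutes, because the engine avoids one million separate `INSERT` executions.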
Sample data:
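The original article showed a screenshot of the generated rows here. A simple way to inspect them yourself:

```sql
-- Peek at the first few generated rows.
SELECT TOP (10) id, number, name
FROM dbo.Table1
ORDER BY id;
```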
And that is it!
Published at DZone with permission of Mateusz Komendołowicz, DZone MVB. See the original article here.