Our custom application development always involves a few static tables in the database that need to ship with default values. These tables often hold a fair amount of data because they serve as slowly changing dimension lookups. So when it is time to transfer this data, the question arises of how to do it. There are many tools available for this, but they all cost money, so I looked for another way around the problem and came up with a simple way of scripting bulk INSERT statements.
I had a static table containing country names, and it was fairly large, around 200 rows. This is how I did it.
The Country table had the following schema:
CREATE TABLE [dbo].[Country](
    [CountryID] [uniqueidentifier] ROWGUIDCOL NOT NULL CONSTRAINT [DF_Country_CountryID] DEFAULT (newid()),
    [CountryName] [varchar](50) COLLATE SQL_Latin1_General_CP1_CI_AS NULL,
    CONSTRAINT [PK_Country] PRIMARY KEY CLUSTERED
    (
        [CountryID] ASC
    ) WITH (PAD_INDEX = OFF, IGNORE_DUP_KEY = OFF) ON [PRIMARY]
) ON [PRIMARY]
Now, the script I wrote to generate the INSERT statements from the data was:

SELECT 'INSERT Country (CountryName) SELECT ''' +
       REPLACE(CONVERT(VARCHAR(50), CountryName), '''', '''''') +
       ''''
FROM Country

Note the doubled single quotes: in T-SQL, a quote inside a string literal is escaped by writing it twice, and the REPLACE call doubles any quote embedded in the data itself (e.g. a name like Cote d'Ivoire) so the generated statement stays valid.
Executing the above script produced a result set containing one generated INSERT statement for every row in the Country table.
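For illustration, the result set would look something like this (the country names shown here are placeholder examples, not the actual table contents):

```sql
-- Hypothetical sample of the generated result set; the real table
-- held around 200 country names.
INSERT Country (CountryName) SELECT 'Australia'
INSERT Country (CountryName) SELECT 'Brazil'
INSERT Country (CountryName) SELECT 'Canada'
```

Each row of the result set is a complete statement, so the whole set can be copied out and run as one script against the target database.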
This is a basic technique for generating mass queries easily. You can customize the above script to generate any type of query.
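As a sketch of such a customization, here is the same pattern extended to a hypothetical two-column lookup table (the State table and its columns are illustrative, not from the example above), including NULL handling:

```sql
-- Hypothetical table: State(StateCode CHAR(2) NULL, StateName VARCHAR(50) NULL)
-- Each source row becomes one INSERT statement; string values are
-- wrapped in quotes (with embedded quotes doubled via REPLACE), and
-- NULL values are emitted as the literal keyword NULL via ISNULL.
SELECT 'INSERT State (StateCode, StateName) SELECT '
     + ISNULL('''' + REPLACE(StateCode, '''', '''''') + '''', 'NULL')
     + ', '
     + ISNULL('''' + REPLACE(StateName, '''', '''''') + '''', 'NULL')
FROM State
```

The same concatenation approach works for any column list; non-string columns just need a CONVERT to a character type before being appended.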