Friday, March 9, 2012

Opinions on using DELETE then BULK INSERT

Hi. Just looking for your opinions on a technique I have seen for using BULK
INSERT. It reads through the flat file first, deleting any rows in the target
table that it finds. It then issues the BULK INSERT command on the same flat
file. The net effect is an insert-or-update.
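For reference, one way to express that pattern in T-SQL is sketched below. All names are hypothetical (dbo.Target, dbo.Staging, the key column KeyCol, and the file path), and the staging table is just one way to drive the deletes from the file's keys:

```sql
-- Hypothetical objects: dbo.Staging mirrors the file layout, dbo.Target is
-- the destination, KeyCol is the matching key. File path is illustrative.

-- Load the flat file into a staging table so its keys can drive the delete.
BULK INSERT dbo.Staging
FROM 'C:\data\feed.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');

-- Delete any target rows whose keys appear in the file...
DELETE t
FROM dbo.Target AS t
JOIN dbo.Staging AS s ON s.KeyCol = t.KeyCol;

-- ...then bulk insert the same file into the target, so every file row
-- ends up present exactly once (insert-or-update overall).
BULK INSERT dbo.Target
FROM 'C:\data\feed.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
```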
The DBA reports that the tables tend to become more fragmented with the
deletes than they would if the rows were updated in place.
Are there any pitfalls to this approach? Does it sound OK?
Are there approaches other than Transact-SQL for inserting or updating large
amounts of flat file data?
Thanks!!
McGy
[url]http://mcgy.blogspot.com[/url]

"McGy" <anon@.anon.com> wrote in message
news:eudcIdGPGHA.3984@.TK2MSFTNGP14.phx.gbl...
It depends on how big the tables are and how big the flat file is.
Personally, I would probably BULK INSERT the flat file into a working table
first. Then, using a chunking method, I would update/insert about 10,000
rows at a time until the task is complete. This has the advantage of letting
SQL Server do what it does best, set-based logic for determining updates and
inserts, while doing it in small enough chunks that other processing can
continue with only slight interruption.
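A sketch of that chunked approach (table and column names here, dbo.Work, dbo.Target, KeyCol, Col1, are all hypothetical; assumes SQL Server 2005 or later for TOP (n) in DML statements):

```sql
-- dbo.Work is the working table already populated via BULK INSERT.
DECLARE @batch int;
SET @batch = 10000;

-- Pass 1: update matching rows in chunks of ~10,000.
WHILE 1 = 1
BEGIN
    UPDATE TOP (@batch) t
    SET t.Col1 = w.Col1
    FROM dbo.Target AS t
    JOIN dbo.Work AS w ON w.KeyCol = t.KeyCol
    WHERE t.Col1 <> w.Col1;          -- only touch rows that actually changed

    IF @@ROWCOUNT < @batch BREAK;    -- nothing (or little) left to update
END;

-- Pass 2: insert rows whose keys are not yet in the target, in chunks.
WHILE 1 = 1
BEGIN
    INSERT INTO dbo.Target (KeyCol, Col1)
    SELECT TOP (@batch) w.KeyCol, w.Col1
    FROM dbo.Work AS w
    WHERE NOT EXISTS (SELECT 1 FROM dbo.Target AS t
                      WHERE t.KeyCol = w.KeyCol);

    IF @@ROWCOUNT < @batch BREAK;    -- all new rows inserted
END;
```

Keeping each batch in its own implicit transaction is what lets other sessions slip in between chunks, rather than blocking behind one long-running statement.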
Just my .02
Rick Sawtell
MCT, MCSD, MCDBA

Thanks for the pointer Rick.
McGy
[url]http://mcgy.blogspot.com[/url]
