I need to insert a text file that might contain duplicate data into
a table with primary keys using a DTS package. Is there anyone out
there who could help me? I have tried a couple of ways to get around
it, but it still doesn't work.
I would load the text file into a table with the correct structure, but with no PK constraint. Then use SELECT DISTINCT to retrieve the data you need and insert it into the final table (INSERT INTO ... SELECT DISTINCT ... FROM can be helpful for that part).
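For example, here is a minimal sketch of that second step, assuming the DTS Transform Data task has already loaded the file into a staging table called LoadStage and that the final table is FinalTable with key column KeyCol (all of these names are placeholders):

-- Copy only distinct rows from the PK-less staging table into the real table.
INSERT INTO FinalTable (KeyCol, Col1, Col2)
SELECT DISTINCT KeyCol, Col1, Col2
FROM LoadStage

Note that DISTINCT only collapses rows that are identical in every column; if two rows share a key but differ in another column, the insert will still violate the primary key.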
Mike John
"Matt" <tkiansoon@.yahoo.com> wrote in message =news:7a4ed84d.0308270840.25fe7496@.posting.google.com...
> I need to insert a text file that might contain duplicate data into
> a table with primary keys using a DTS package. Is there anyone out
> there who could help me? I have tried a couple of ways to get around
> it, but it still doesn't work.|||You could load it into a staging table first and then de-dupe it, or
put an index with IGNORE_DUP_KEY on the staging table (but this would slow
the load unless the text file is ordered the same as the index, which would
need to be a unique clustered index). This assumes that the entire row is a
duplicate if the key is.
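For example, the ignore-duplicates variant could be set up before the DTS load runs. This is only a sketch, assuming a staging table LoadStage keyed on KeyCol and a final table FinalTable (placeholder names):

-- Unique clustered index that silently discards rows with duplicate keys
-- as the text file is loaded into the staging table.
CREATE UNIQUE CLUSTERED INDEX IX_LoadStage_Key
ON LoadStage (KeyCol)
WITH IGNORE_DUP_KEY

-- After the load, every key in LoadStage is unique, so the final insert
-- no longer conflicts with the primary key.
INSERT INTO FinalTable (KeyCol, Col1, Col2)
SELECT KeyCol, Col1, Col2
FROM LoadStage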
--
HTH
Jasper Smith (SQL Server MVP)
I support PASS - the definitive, global
community for SQL Server professionals -
http://www.sqlpass.org
"Matt" <tkiansoon@.yahoo.com> wrote in message
news:7a4ed84d.0308270840.25fe7496@.posting.google.com...
> I need to insert a text file that might contain duplicate data into
> a table with primary keys using a DTS package. Is there anyone out
> there who could help me? I have tried a couple of ways to get around
> it, but it still doesn't work.