To the Collective (Quest Gurus), Matt in Michigan, and David Christian at a
range somewhere popping some rounds!!!
I have 437,035 rows of data in a table in the Yonder database that I needed to
extract and place into a like table in the Hither database.
A backup, zip, file transfer, unzip, restore was not an option since I am a very
old man, and I want to be alive to see this finish…
So I used Toad-SS Export to delimited and Import on a pretty much idle system
(my Windows 7-64 Prof desktop) and had enough system resources to actually get
this done, AND in 15 minutes… Freak-n-amazing, I am shocked it actually worked.
David, I believe you or Henrik wrote this code years ago, so I will give you
credit.
I also tried exporting the file to a SQL Script, which also worked, but
importing a SQL Script… did not go so well.
Meaning it did not work: on opening the exported SQL script file in the Toad
SQL Server Editor, I got a Toad-SS fault, “out of system resources (Memory)”.
Imagine that. Which was expected, because the file is 360,631 KB…
Now I know that you know that I know that I could have run the SQL script with
OSQL and it would have worked just fine, but what I REALLY wanted was a way to
run that large set of INSERTs in Toad.
I do not believe this is part of Import, and maybe it should be… I am just
asking the question: “Does anyone have any ideas on how else I could have run
this exported SQL script as an import, so as to fire off the INSERT statements
within the file, as seen below?”
Otherwise, I am thinking of asking for a new feature that would read the file
10,000 rows at a time and process each batch until end-of-file as part of
Import.
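For what it's worth, the batching idea above can be sketched in a few lines of Python: stream the exported script line by line, group complete statements into batches of 10,000, and hand each batch off for execution, so the 360 MB file is never held in memory all at once. This is a minimal sketch under two assumptions that are not confirmed by the post: the export writes one INSERT per line terminated by a semicolon, and `execute_batch` is a hypothetical callback (in practice it might join the batch and run it through a pyodbc cursor).

```python
def run_script_in_batches(path, execute_batch, batch_size=10_000):
    """Stream a large SQL script and pass statements to execute_batch
    in groups of batch_size, so the whole file never sits in memory."""
    batch = []
    statement = []
    with open(path, encoding="utf-8") as f:
        for line in f:
            statement.append(line)
            # Assumption: each exported INSERT ends with a semicolon.
            if line.rstrip().endswith(";"):
                batch.append("".join(statement))
                statement = []
                if len(batch) >= batch_size:
                    execute_batch(batch)   # flush a full batch
                    batch = []
    if statement:          # trailing statement with no final newline
        batch.append("".join(statement))
    if batch:              # flush the final partial batch
        execute_batch(batch)
```

With a real connection you would pass something like `lambda stmts: cursor.execute("".join(stmts))` as the callback and commit after each batch; the names here are illustrative only.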
Note: This is not a: