Export gives memory problems

We have a table with about 200 columns. When we try to export 2 million rows
as a delimited text file through Toad 10.6, we get an out-of-memory error.

Then we get a catastrophic error and Toad shuts down. We have about 48 GB of
memory on the machine.

Is there another way to export this large of a file?

You’re running out of memory on your PC, not the server.

Do you have ‘Display all results in grid’ enabled? If so, turn that
off; it should help.

If during the export you checked the option to have the grid mirror what
you write to the file, you’ll get this error. In older versions of Toad
this was called ‘clone cursor’. It was renamed and I don’t remember offhand
what it’s now called – but look into it. I’m guessing this is
the issue …

Yes, that should do it.

Also, if a bunch of those columns are big (like VARCHAR2(4000)), you’ll
need to drop the OCI Array Buffer Size (in Options -> Oracle) down to a
smaller value, like 25, 50, or 100. It’s hard to guess what the value
should be without knowing more about your table. The default of 500 is generally
a good setting, but for really wide tables you have to decrease it: as a rough
estimate the fetch buffer needs (array size) x (bytes per row), so at 200
columns averaging even 1,000 bytes each, 500 rows comes to about 100 MB per fetch.

By the way, even if you have that much memory on your PC, it won’t matter,
because Toad is a 32-bit app, so it can only address about 2 GB of memory.

One other thought (not related to the memory issue): if you adjust your
SQL*Net settings for SDU (session data unit), you can increase the packet size
so more rows fit per packet (less network traffic) ….

Bert/everyone,

Appreciate the help.

I just found out that it is a query that they are trying to export.

The query generates the results in a grid, and then we right-click to export. We
did turn off the display of results, but that doesn’t help.

Although I would like to have TOAD do this, is there a way to do this without
TOAD ? (I’m so TOAD dependent- it’s almost an addiction!)

Increasing the packet size would help on the network end of things.

select owner, sum(bytes)/(1024*1024) "MB"
from dba_segments
where segment_type = 'TABLE'
group by owner;

Then round the size of the table you are exporting up to the nearest MB (in this
example the table is taken to be about 5 MB, matching FILESIZE=5M).
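To size just the one table being exported rather than the whole schema, a variant of the query above can filter on segment_name as well; the owner and table name are placeholders:

```sql
select segment_name, round(sum(bytes)/(1024*1024), 1) as mb
from dba_segments
where owner = 'UWCLASS'          -- placeholder owner
  and segment_name = 'BIG_TABLE' -- placeholder table name
group by segment_name;
```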

expdp uwclass/uwclass SCHEMAS=uwclass DIRECTORY=data_pump_dir
DUMPFILE=demo%U.dmp COMPRESSION=none FILESIZE=5M

Martin



I will send you a dynamic SQL script to export a table – actually it might
already be included in Toad’s Script Manager, called either ‘export’ or
‘dump table’ – but I’ll send it to you just in case.

Did you try lowering the OCI Array Buffer Size? It sounds like that is the
culprit. Knock it all the way down to 25.

SQL*Plus and SPOOL?
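A minimal SQL*Plus sketch of that approach; the column list, delimiter, table name, and file path are placeholders for your own query:

```sql
-- Suppress everything except the data rows, then spool to a file.
set heading off feedback off termout off
set pagesize 0 linesize 32767 trimspool on
spool /tmp/export.txt
select col1 || '|' || col2 || '|' || col3   -- pipe-delimited output
from big_table;
spool off
```

Because SQL*Plus streams rows to the file instead of holding them in a grid, memory use stays flat no matter how many rows the query returns.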



I would run it on the server…

Jeff