Toad export to Excel slowness problem

You wouldn't necessarily need an ORDER BY in your query - sometimes that results in a full table scan even when you don't need the ordering...

Now my biggest problem is my view. How can I solve that? Is there any way to see where my mistake is in the view?

Because the query results from my view don't match the real data... :no_mouth:

IMO the fastest method of exporting data from the db into Excel (and assuming you will be creating the extract on a periodic basis): create a PL/SQL procedure that uses UTL_FILE to write the contents of the view/table to an Oracle directory. SCP the output file from the directory to your local machine, then import that file into Excel.
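A minimal sketch of that approach, assuming a directory object (EXTRACT_DIR here, a made-up name) has already been created and granted, and a view called my_view with two columns:

    -- Sketch only: EXTRACT_DIR, my_view, and the column list are assumptions.
    CREATE OR REPLACE PROCEDURE dump_view_to_file IS
      f UTL_FILE.FILE_TYPE;
    BEGIN
      f := UTL_FILE.FOPEN('EXTRACT_DIR', 'my_view.csv', 'w', 32767);
      FOR r IN (SELECT code, amount FROM my_view) LOOP
        UTL_FILE.PUT_LINE(f, r.code || ',' || r.amount);
      END LOOP;
      UTL_FILE.FCLOSE(f);
    EXCEPTION
      WHEN OTHERS THEN
        -- Don't leave the handle open if the query or write fails.
        IF UTL_FILE.IS_OPEN(f) THEN
          UTL_FILE.FCLOSE(f);
        END IF;
        RAISE;
    END;
    /

Since the procedure runs inside the database, the rows never cross the network to your desktop until the finished file does, which is where most of the speed comes from.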

Can you be more clear?

Hi crashed,

I deal daily with tables coming from Oracle, ranging in size from hundreds to 500,000 rows per table. The data comes from a remote Oracle server where I can read tables and write queries, but cannot CREATE TABLE.

We used to receive huge text extracts that we then managed in Excel, then in Excel + Access. Once I got Toad for Oracle, I could generate my own "extracts" with Toad. Excel and Access were non-responsive when the files got big, so I installed mySQL Community (free) and imported my data into my local mySQL instance, where I control the permissions, do my analysis and filtering, and only then export to Excel, usually as a simple .csv file.

This gets around all the quirky Excel text-vs.-number behavior, date conversions, etc. For example, one ongoing company-wide issue with Excel is the conversion of a commonly used four-character code that starts with a digit but can end with digits or letters. Excel will take what should be '7E10', read it as scientific notation, and not-so-helpfully convert it to 70,000,000,000.

I have had great success with Automation Designer > Export Action > Delimited Text. I then read the big tables into a local instance of mySQL Community (free), which has an excellent new parallel import utility. I do analysis and filtering in mySQL, and only output to Excel when shipping results off.

Here's how I optimized my TOAD exports:

  1. Reduced query complexity on slow-running queries. In some complex cases, I export two or more tables and rejoin them on the desktop with mySQL. It's faster, and I don't have to worry about a long-running query attracting attention from the prod Oracle admins.

  2. If the row count doesn't start climbing within a few seconds of clicking "Run" in the Properties window, I comment out JOINs and their corresponding SELECT fields one by one and observe the difference in how fast the rows count up.

  3. ROWNUM < 100 (or even 10000) in the WHERE clause is your friend. You can test a query quickly without waiting for the whole result set; see the test-query sketch after this list.

  4. No ORDER BY; sorting can really slow the query.

  5. The query is only for export to Delimited Text, so "Display Results in Grid" must be OFF.

  6. Misc tips with smaller but measurable effects: I use Unix-style line endings (LF only) to save a character per row, UTF-8 encoding, ASCII 31 (FS) as the field separator, and ASCII 30 (RS) at the end of the last column. On import into mySQL on the desktop, I use RS+LF (0x1E0A) as the end-of-record marker; see the import sketch after this list. I don't quote anything, and I still never have trouble with special characters.
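A test-query sketch for steps 2-4 above (table and column names are made up; the point is the shape of the test, not the schema):

    SELECT t.code, t.amount
           -- , j.descr                -- commented out along with its JOIN below
    FROM   big_table t
    -- JOIN lookup_tbl j ON j.id = t.lookup_id
    WHERE  t.load_date >= DATE '2024-01-01'
    AND    ROWNUM < 100;               -- quick test; raise or drop for the real export

Note there is no ORDER BY; once the stripped-down query counts up fast, add the JOINs back one at a time and drop the ROWNUM limit for the full export.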
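And a sketch of the matching import on the desktop mySQL side (the file and table names are assumptions; the separators from tip 6 are written as hex literals, which MySQL accepts). The parallel utility mentioned above is presumably MySQL Shell's util.importTable, which takes the same separator options; a plain LOAD DATA doing the same thing looks like:

    -- extract.txt and my_extract are made-up names.
    LOAD DATA LOCAL INFILE 'extract.txt'
    INTO TABLE my_extract
    CHARACTER SET utf8mb4
    FIELDS TERMINATED BY 0x1F    -- ASCII 31 (FS) between columns
    LINES TERMINATED BY 0x1E0A;  -- ASCII 30 (RS) + LF ends each record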

PS: For the pesky 7E10 that Excel converts to a number, I encode it in SQL as

CONCAT('=CONCAT("',fieldname,'")')

The outer CONCAT runs in SQL; in the exported file the field shows up as =CONCAT("7E10"), which Excel treats as a formula returning the text 7E10 and never converts to a number.
