AV from exporting 2M rows to Excel

Hey all,

In 12.1.0.22, our lovely auditors can’t figure out how to import a pipe-delimited file (since the data they’re requesting contains both commas and quotes), so now I need to re-extract all 2M rows into one or more Exhell spreadsheets. Because that’s efficient.

Well, Toad gets to about 1.1M rows and AVs with:

Access violation at address 000000000697E295 in module ‘Toad.exe’. Read of address 0000000000000008

I’m going to attempt to break up the query a little more, but thought I’d throw this out here. I’ve got a support bundle just itching to be set free…

Thanks!
Rich

Did you check to see if you ran out of memory? Usually Toad is good about reporting this, but not perfect.


Hey Brad,

Yeah, SysInternals reported peak physical usage at 7.2/8.0GB and system commit at 7.0/20.0GB. Even after closing everything else (and bouncing Toad), it bombed at the same point (I think?) with only 5.8GB used.

I tried slimming down the report to “only” 1M rows, but the export fails on the write of the file with:

Error on writing record # 23

A disk error occurred during a write operation

Doh!

Rich

Nuts. Even at ~400K rows I get the Error 23. And saving to an Exhell instance bombs as well, but at 1M rows, which I might be able to work with.

Could this have anything to do with me being saddled with Office 2003 (Win7 pro 64-bit)?

Is the “Export all results to grid” option unchecked? It uses less memory that way.

Or maybe it’s data related if the error keeps referring to that 23rd record.

Are you hitting a size limit on the file system by any chance?

Or, slightly obvious, did you run out of disc space?

Cheers,

Norm. [TeamT]

Sent from my Samsung Galaxy Note III - powered by Three

In 12.1.0.22, our lovely auditors can't figure out how to import a pipe-delimited file (since the data they're requesting contains both commas and quotes), so now I need to re-extract all 2M rows into one or more Exhell spreadsheets.

I had the same situation several years ago. That is why I wrote this post:

damir-vadas.blogspot.com/.../export-table-to-filecsv.html

With this method I was able to export 60,000,000 records and produce a 41 GB file with no problem. It is pure CSV (with a custom separator character; in my case it was tab), and they could import it into any DB or Excel they want.
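
To give the general shape of that approach without clicking through: it is basically a PL/SQL loop writing rows through UTL_FILE on the database server. Below is only a minimal sketch, not the code from that post; the DUMP_DIR directory object and the gl_detail table with its columns are invented placeholders.

```sql
-- Minimal UTL_FILE sketch; DUMP_DIR, gl_detail and its columns are hypothetical.
DECLARE
  l_file UTL_FILE.FILE_TYPE;
  l_sep  CONSTANT VARCHAR2(1) := CHR(9);  -- tab as the separator
BEGIN
  -- open a server-side file via an existing directory object
  l_file := UTL_FILE.FOPEN('DUMP_DIR', 'gl_detail.txt', 'w', 32767);
  UTL_FILE.PUT_LINE(l_file, 'ACCOUNT' || l_sep || 'POSTED' || l_sep || 'AMOUNT');
  FOR r IN (SELECT account, posted, amount FROM gl_detail) LOOP
    UTL_FILE.PUT_LINE(l_file,
      r.account || l_sep || TO_CHAR(r.posted, 'YYYY-MM-DD') || l_sep || TO_CHAR(r.amount));
  END LOOP;
  UTL_FILE.FCLOSE(l_file);
EXCEPTION
  WHEN OTHERS THEN
    -- close the handle on failure so the file is not left open on the server
    IF UTL_FILE.IS_OPEN(l_file) THEN
      UTL_FILE.FCLOSE(l_file);
    END IF;
    RAISE;
END;
/
```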

Later, the auditors were wondering how to import that file, but that was not my problem. I had delivered them the whole general ledger, and our duty was finished.

Hope this helps you.

Well, I’ve got 40+GB free, so that’s not an issue. I have all options in the export unchecked, except for “Include column headers”. I’ve tried with/without “Use only General cell formatting”.

I’ve also just downloaded beta 12.5.0.58 (yay!) but I get the same results. :frowning:

Damir, I already have a delimited file, but the auditors (one of the major multinational ones) are rejecting it because they say they can’t read it. I needed to use a pipe (" | ") to delimit the file instead of a comma, because the fields contain commas, single quotes, and double quotes.

Could this be an issue with Office 2003? Is Office required to be installed to export to Exhell?

Thanks!

Rich

A newer version might help with the Excel Instance format, but it won’t make any difference in the “Excel File” format. Excel isn’t used when we write to file (we use a 3rd party component for that).

It’s kind of a pain, but might work - can you export to tab or pipe delimited and then import that into excel?

Funny you should bring that up, John, because I did try that, using a VB hack from MS, as 2003 will only import 65K rows by default. And it bombs around the 1M row mark…

Everything’s pointing to 2K3 as being the issue. I’m asking for 2010 now. (sigh)

I’ll post the silly/happy/ugly/scary results from what undoubtedly will be a messy upgrade here. :slight_smile:

Thanks!

Rich

1M is the rows-per-sheet limit of the xlsx format; that’s probably why it bombed…it didn’t want to just go to the next sheet for you.

I wonder if there is a way to import to a particular sheet…


It’s 2K3, so no xlsx yet. It’s 65K rows for me.

MS Orifice integration is nice, except when needing to upgrade…

Rich

Oh, right, that’s not until 2007. I was thinking xlsx was 2003. I’m with you now.

Yeah, that’s like, 11 years ago, Rich. You’re due.


I needed to use a pipe (" | ") to delimit the file instead of a comma, because the fields contain commas, single quotes, and double quotes.

When I say "csv" I mean any delimited file with any kind of separator character (as in my example). In your case I would use tab (^t, ASCII chr(9)). This is the best delimiter for any kind of CSV file.
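
As a rough illustration of the delimiter/quoting point (the table and column names here are invented, not from Rich's data): you can build the delimited line in the query itself, wrap the text fields in double quotes, and double any embedded quotes, so commas and quotes inside the data cannot break the file:

```sql
-- Hypothetical table/columns. Text fields are wrapped in double quotes and any
-- embedded double quotes are doubled, so commas, single quotes and double
-- quotes inside the data stay harmless. CHR(9) = tab separator.
SELECT '"' || REPLACE(vendor_name, '"', '""') || '"' || CHR(9)
    || '"' || REPLACE(memo,        '"', '""') || '"' || CHR(9)
    || TO_CHAR(amount) AS delimited_line
  FROM gl_detail;
```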

And as for outputting to an Excel file because that is what your auditors want: your real need is to deliver them the data (please allow me to tell you that I have huge experience with auditors). So CSV is also an OK format, and they should accept that.

If you are in deeper trouble (nobody cares that a huge Excel file is a problem) and do not want to use your CSV solution, then use a query that splits the data into several sets, and import each set into its own Excel file.
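
One possible way to do that split in SQL, sketched with a made-up table and sort key (adjust the bucket size to the row limit of the Excel format in play, e.g. 65,536 for .xls or 1,048,576 for .xlsx):

```sql
-- Number the rows and bucket them into Excel-sized chunks; export each
-- chunk_no to its own file or worksheet. gl_detail and journal_id are
-- placeholders; change the divisor to match your Excel version's row limit.
SELECT *
  FROM (SELECT t.*,
               CEIL(ROW_NUMBER() OVER (ORDER BY journal_id) / 1048576) AS chunk_no
          FROM gl_detail t)
 WHERE chunk_no = 1;   -- then 2, 3, ... for the remaining sets
```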

Later, ask your auditors to merge the data into one Excel file themselves if they claim it is easy.

Hope this helps

And from the official Microsoft site for Excel:

office.microsoft.com/…/excel-specifications-and-limits-HA103980614.aspx

Worksheet size: 1,048,576 rows by 16,384 columns

So it is clear that 2M rows cannot fit in one sheet.

:slight_smile:

And that’s not for 2003, where the limits are 65K x 256, but Toad’s export breaks large Excel exports into multiple worksheets as needed.

I ended up breaking the query into six parts, which the beta was able to handle (I got the Error 23 using 12.1 GA yesterday). I accidentally did one export with a .xlsx extension, which took much longer and created a smaller file with more rows in each sheet. I’m guessing that the 3rd-party component automagically uses the 2K7/2K10 (compressed?) format and its row limits with that extension.

In any case, I’m gonna send this mess to the auditors, who, for the size of their company, should have been able to open a silly delimited text file…

Back to the fun tornado of a company merger with a double ERP upgrade. :wink:

Thanks all!

Rich

If you choose xlsx as your file extension, yes, we write to the 2007/2010 format.

Regarding the size - those xlsx files are actually zip files. If you rename the file, you can unzip it and find a lot of files inside.

Rich,

Are you able to install LibreOffice? I remember an Excel spreadsheet that W2K couldn’t open; I was able to install Open Office (as it was called then), open the file, and export it in Excel format.

That will probably get you around the 1,000,000 row limit, but you will have to export it as delimited, as I very much doubt that the Toad Excel component will talk to LO.

Just a thought.

Cheers,

Norm [TeamT]

Sent from my Samsung Galaxy Note III - powered by Three

Toad can write to xls and xlsx files without Excel even being installed. Excel itself is only used when the “Excel Instance” format is selected. So, no we don’t talk to LO, but we don’t need to.

Hey Norm,

For kicks (and because I’ll need to do this again next audit), I installed LO4, but when I tried to open the pipe-delimited file that I originally created with Toad, it bombs like Exhell does at the 1M row limit for a single worksheet.

Then again, next audit I should have more time/ability/will to push back for them to use the %&%#&@ properly delimited text file we gave them in the first place. At least it’s not like our last auditors where the analyst called me and asked if I could break up the csv into 100 pieces because he couldn’t load the file in Notepad.

Notepad.

Note-freaking-pad.

There oughtta be a law…

Happy Friday! Arrrrrrrrr!

Rich