Is there a Limit to the Amount of Data Loaded into a Table?

Group,

I’ve got a very large Excel file holding about 50 MB of data that I need to upload into my schema. I can’t seem to get this to happen: when I attempt the load via the “Import Table Data” option, the program shows “Not Responding” at the top of the window. Is this because there is a limit to the size of the file I’m using for the data?

In advance, thanks for the assistance.

Don

With large XLSX files, it sometimes takes Toad a while to open them. I believe it will finish if you just let it keep working (“Not Responding” usually just means it is still busy).

If you have the option of getting this file as a comma-delimited text file (you could save it in that format from Excel; any delimiter would be fine), the import might go faster. Another thing that might speed it up is saving the file as XLS rather than XLSX, although that isn’t an option if your spreadsheet has more than 65,536 rows, which is the XLS row limit.
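If saving to CSV from within Excel is slow or awkward with a file that size, something along these lines might work instead. This is just a minimal sketch, assuming you have Python with the pandas and openpyxl packages available; the file names are placeholders for your own.

```python
# Hypothetical example: convert a large .xlsx workbook to a comma-delimited
# file outside of Excel, so it can be imported as delimited text instead.
import pandas as pd

# Read the first worksheet of the workbook (openpyxl handles .xlsx files).
df = pd.read_excel("large_data.xlsx", engine="openpyxl")

# Write it back out as CSV, without the pandas row index column.
df.to_csv("large_data.csv", index=False)

print(f"Wrote {len(df)} rows to large_data.csv")
```

You would then point Toad’s import at the resulting .csv file rather than the original workbook.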

Also, if you have the option of upgrading Toad, the import will probably go a lot faster in 12.8 due to some improvements in our Excel read/write component.