Exporting Large Datasets

I’m trying to export a dataset containing 4 million rows to a CSV. I keep getting a message asking whether I want to load everything into memory or lose rows. If I load everything into memory, TDA bombs out. Any help is greatly appreciated.

What is your entry point to export? Are you right-clicking from the data tab of the Explorer? If that’s the case, open the Export Wizard from the Tools menu instead and enter ‘Select * from tablename’ in the query section, or select the table from the list. This bypasses the loading issue.

Debbie

Maybe I’m misunderstanding. I ran a query that returned a result set of 4 million rows. Will your suggestion still work?

When you use the data tab or an editor, it retrieves the first chunk of data and then pauses; it only pulls all of the data if you scroll through it. When you right-click and choose Send to Export, it can’t send just a portion of the data, so it tries to retrieve all 4 million rows and then pass them to the Export.

If you just let the Export do all of the work, it processes the data as it goes and has a much better chance of completing without running out of memory.
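Conceptually, the difference is similar to the two approaches sketched below in Python. This is only a minimal illustration of the streaming idea, not anything TDA does internally; the database connection, table name, batch size, and output path are all placeholder assumptions.

```python
# Minimal sketch: loading an entire result set vs. streaming it to CSV in batches.
# The connection, query, and file names below are hypothetical placeholders.
import csv
import sqlite3  # stand-in for any DB-API driver (pyodbc, cx_Oracle, etc.)

conn = sqlite3.connect("example.db")        # hypothetical database
query = "SELECT * FROM big_table"           # hypothetical multi-million-row query

# Approach 1: fetch everything first (analogous to right-click > Send to Export).
# All rows sit in memory at once, which is what exhausts memory on 4 million rows.
# rows = conn.execute(query).fetchall()

# Approach 2: stream the result set, writing each batch as it arrives.
# Memory use stays roughly constant regardless of the total row count.
cursor = conn.execute(query)
with open("export.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow([col[0] for col in cursor.description])  # header row
    while True:
        batch = cursor.fetchmany(10_000)  # pull a manageable chunk at a time
        if not batch:
            break
        writer.writerows(batch)
conn.close()
```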

I would give it a try after closing the current editor that holds the 4 million rows.

Debbie