Data Script for large database: huge amount of temp files

I’m trying to create a 100-row version of a 600 GB SQL database (~28 million records in the source DB). Toad 5.0.0.544 runs along and after 90 minutes has not finished, but it has generated 70 GB worth of “temp” files. At this point I kill the process before it fills up my hard drive.

Is this expected behaviour for a source DB that large?

Sorry, I don’t understand. What is a “100-row version”? How exactly do you create the database, by restoring? Can you please rephrase your question or add more specifics?

Thanks,

Igor.

When you select an existing database in Toad and then right-click, one of the options is to create a data script. The next dialog allows you to choose which tables to include, etc., and also how many rows you want. I want to create a small copy of the source database, so I entered 100 rows.

When I kill the script process after a while, I see it has already generated the script to create a new database and all of the tables. So far the inserts to create the 100 rows have not completed because of the huge amount of temporary files Toad creates. I’m now trying it again on a server with 900 GB free space, but I was wondering if it’s normal for this to create so many temporary files.
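For what it’s worth, the row-limited insert script itself should be cheap to produce. Here is a minimal sketch of what a “100 rows per table” data script amounts to, using Python’s sqlite3 module as a stand-in (this is not Toad’s actual implementation; the table and function names are illustrative only). Streaming rows from a cursor means nothing needs to be buffered to temp files:

```python
import sqlite3

def script_top_rows(conn, table, limit=100):
    """Yield INSERT statements for the first `limit` rows of `table`.

    A stand-in for what a data-script generator produces; rows are
    streamed from the cursor rather than buffered to disk.
    """
    cur = conn.execute(f"SELECT * FROM {table} LIMIT ?", (limit,))
    cols = ", ".join(d[0] for d in cur.description)
    for row in cur:  # iterate lazily instead of fetchall()
        vals = ", ".join(
            "NULL" if v is None
            else str(v) if isinstance(v, (int, float))
            else "'" + str(v).replace("'", "''") + "'"  # escape quotes
            for v in row
        )
        yield f"INSERT INTO {table} ({cols}) VALUES ({vals});"

# Demo against a throwaway in-memory table
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, name TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(500)])
for stmt in script_top_rows(conn, "t", limit=2):
    print(stmt)
```

The point of the sketch is only that generating 100 inserts per table is a tiny amount of I/O; the 70 GB of temp files presumably comes from how the tool reads or sorts the source tables, not from the output script itself.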

Okay, now I got it :-). Toad does use temp files during data script generation, but I can’t say how much. I’ll forward your question to the Generate Data Script team.

Thanks,

Igor.

Hi,
unfortunately, this is a limitation of the Generate Data Script dialog for now. We are aware of this issue (CR77564) but were not able to resolve it in the 5.0 release. For now you can try to use Data Compare for this; hope it can help you. Please let us know if you need some advice with the Data Compare module (it can be a bit tricky).