DataGen in Batch Mode

Hi Team:

I am trying to automate a data generation process. I can create a Toad Datagen JSON metadata file with the Datagen option in Toad, and I would like to populate similar JSON files and use them to generate test data in batch mode. I have 250+ tables and have extracted all of the plans; from those I want to build the JSON files and feed them to the batch process or command-line scripts. It isn't clear to me how to pass the input file to the data gen script. I have some idea from playing around with the scheduler, but if someone has information or can point me to the documentation, I would appreciate it. I am using the latest version of TOR.

Thanks, Don

I'll take a crack at a couple of ways to do this, but first: if there's a good reason to insist on JSON-formatted files, you'll run into an issue straight away, as Toad for Oracle supports JSON when exporting, but not when importing. Not sure why, but my guess is that Dev is working on getting this into a future release.

That said, and assuming that your desired end result is to refresh data in those 250+ tables, here are a couple of things to consider...

  1. If you don't need JSON format, then use CSV or another format that is supported by both the Export and Import utilities within Toad.

  2. If random test data is OK, you may want to use Toad's Test Data Generator, which generates real-world data for many popular domain types (company names, addresses, etc.). See the snap below, which also shows that there's even a convenient "Watch" icon in the lower left of the Test Data Generator Wizard to create an Automation App that can be scheduled on your Windows workstation. BTW, I just posted a blog that talks about this icon and its sister icon, the "Camera," here.

  3. If you still need those JSON (or other format) files, and your test data needs to be the same each time it's imported, then it still might be possible to use Toad's Automation Designer to create an Automation Job, but you'll need to do some work. Specifically, you'll likely need to define a script or utility that takes your JSON input and populates the table it targets (a rough sketch of one such script is shown after this list). If you can do that, then the Automation Designer has looping flow-of-control constructs that let you perform the test data import for every table you have.
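
For illustration only, here's a minimal sketch of what that "script or utility" from option 3 might look like in Python with the python-oracledb driver. It assumes each exported file is a flat JSON array of row objects (the real layout of a Toad export may well differ), and the table name, credentials, and DSN below are placeholders, not anything Toad produces for you.

```python
# Hedged sketch: assumes each JSON file is a flat array of row objects,
# e.g. [{"EMP_ID": 1, "NAME": "Acme"}, ...] -- Toad's actual export layout may differ.
import json
import sys

import oracledb  # pip install oracledb


def load_json_into_table(conn, json_path, table_name):
    """Insert every row object found in json_path into table_name."""
    with open(json_path, encoding="utf-8") as f:
        rows = json.load(f)
    if not rows:
        return 0
    columns = list(rows[0].keys())
    placeholders = ", ".join(f":{c}" for c in columns)
    sql = f"INSERT INTO {table_name} ({', '.join(columns)}) VALUES ({placeholders})"
    with conn.cursor() as cur:
        cur.executemany(sql, rows)  # bind by name against each row dict
    conn.commit()
    return len(rows)


if __name__ == "__main__":
    # Example usage: python load_json.py SCOTT.EMP emp.json
    table, path = sys.argv[1], sys.argv[2]
    connection = oracledb.connect(user="test_user", password="test_pwd",
                                  dsn="dbhost/orclpdb1")  # placeholder credentials
    count = load_json_into_table(connection, path, table)
    print(f"Loaded {count} rows into {table}")
```

Something along these lines could then be called once per table from the looping construct in the Automation Designer. For real use you'd still want to handle data types like dates and CLOBs, batching, and errors, but the shape of the job is the same.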

I'm hoping that one of the first two options meets your needs, as they are the easiest to implement.