I'll take a crack at a couple of ways to do this. First, though: if there's a good reason to insist on JSON-formatted files, you'll run into an issue straight away, as Toad for Oracle supports JSON when exporting, but not when importing. I'm not sure why, but my guess is that Dev is working on getting this into a future release.
That said, and assuming your end goal is to refresh the data in those 250+ tables, here are a couple of things to consider...
If you don't need JSON format, use CSV or another format that both of Toad's Export and Import utilities support.
If random test data is OK, you may want to use Toad's Test Data Generator, which generates realistic data for many popular domain types (company names, addresses, etc.). See the snap below, which also shows that there's a convenient "Watch" icon in the lower left of the Test Data Generator Wizard for creating an Automation App that can be scheduled on your Windows workstation. BTW, I just posted a blog that talks about this icon and its sister icon, the "Camera", here.
If you still need those JSON (or other format) files, and your test data needs to be the same each time it's imported, it still might be possible to use Toad's Automation Designer to create an Automation Job, but you'll need to do some work. Specifically, you'll likely need to write a script or utility that takes your JSON input and populates the table it targets. If you can do that, the Automation Designer has looping flow-of-control constructs that let you perform the test data import for every table you have.
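To give a rough idea of what such a script might look like, here's a minimal Python sketch (file and table names are hypothetical, and it assumes each JSON file holds an array of flat row objects). It converts the JSON rows into INSERT statements you could run through Toad's Editor or SQL*Plus; a real job would instead bind values through a driver, but this shows the shape of the work:

```python
import json

def json_to_inserts(json_path, table_name):
    """Read a JSON array of row objects and return a list of INSERT statements.

    Assumes every object in the array uses the same keys, and that values
    are simple scalars (strings, numbers, or null).
    """
    with open(json_path) as f:
        rows = json.load(f)
    stmts = []
    for row in rows:
        cols = ", ".join(row.keys())
        vals = ", ".join(
            "NULL" if v is None
            else str(v) if isinstance(v, (int, float)) and not isinstance(v, bool)
            else "'" + str(v).replace("'", "''") + "'"  # escape single quotes
            for v in row.values()
        )
        stmts.append(f"INSERT INTO {table_name} ({cols}) VALUES ({vals});")
    return stmts

# Usage (hypothetical names):
#   for stmt in json_to_inserts("employees.json", "EMPLOYEES"):
#       print(stmt)
```

An Automation Job could loop over your table list, call something like this once per JSON file, and then execute the generated script against the target schema.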
I'm hoping that one of the first two options meets your needs, as they are the easiest to implement.