Our application requires generating and storing many binary .pdfs. We could use a facility that exports each row in a result set to its own .pdf, just as if we manually selected a row in the result set, displayed it in single-record format, printed it to a .pdf, and finally loaded it into a table as a BLOB. An automation utility that does this would be ideal. Thanks.
I have to be honest - unless this idea gets a lot of votes, I don't see us implementing it as a whole. It just seems like a very specific thing that only you would use. However, maybe we can piece it together.
You could fairly easily generate a report that would save all records to a single PDF, one record per page. I didn't see a way, using our report generator, to create a separate file for each page/record, but I know that in Adobe Acrobat (and likely other software) you can split a PDF into multiple files.
So that's exporting.
Now, regarding the import. We have had a few requests to import multiple files into blob columns.
For that part of the request, would we just create a new row and insert data into only that one column? Or, if the target rows already exist and we're just supposed to update a column, how would we know which row each file is supposed to go with?
I'm waiting for more feedback from my colleagues on loading the resultant .pdfs into a table. But preliminarily, yes, we would need a key (also from the result set) to load a foreign key along with the blob so it attaches to the correct parent record. Getting complicated! I'll get back to you.
I think for that to happen, we'd need two new features:
- A "report" action in Automation Designer. This would let you run a report that's in Reports Manager and save/print/email the result.
- A "blob import" wizard (it could do inserts, or key off of the file name for updates).
Once those features exist in Toad, you could:
- Add a query iterator action in automation designer (this could supply the data and file name for the blob import)
- Add a report action under the query iterator (it would run once for each row in the query iterator)
- Add a blob import action after that
This is about the only way I could see all of this happening in a way that meets your requirements while letting me develop something whose "pieces" others could use for their own purposes.
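To make the "key off of file name" idea concrete, here's a minimal sketch using Python's sqlite3 as a stand-in for the target database. The table and column names (`invoices`, `id`, `pdf`) are hypothetical; the assumption is that each exported file is named `<primary_key>.pdf`, so `1042.pdf` updates the row whose key is 1042:

```python
# Blob import keyed off file name: each <key>.pdf updates the matching row.
# sqlite3 stands in for the real database; table/column names are illustrative.
import pathlib
import sqlite3

def import_blobs(conn: sqlite3.Connection, folder: str) -> int:
    """Load each *.pdf in folder into the matching row's BLOB column.

    Returns the number of rows actually updated, so a caller can spot
    files whose key didn't match any existing row.
    """
    updated = 0
    for path in sorted(pathlib.Path(folder).glob("*.pdf")):
        cur = conn.execute(
            "UPDATE invoices SET pdf = ? WHERE id = ?",
            (path.read_bytes(), path.stem),  # file stem carries the key
        )
        updated += cur.rowcount
    conn.commit()
    return updated
```

The same loop could do inserts instead of updates; the wizard option in the list above would just switch the statement.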