Toad World® Forums

Performance Improvements Needed


I have reported performance problems with TDM back to version 4.3.

While there are many improvements in the current Beta over 4.3, performance still lags the original CASE Studio product in many areas.

Converting from the XML storage format to the binary format appears to have improved basic usage performance.

When generating a script to create tables, views, types, packages, roles, etc., the actual measured times for TDM, from 4.3 through the current Beta, lag a full order of magnitude behind CASE Studio.

The benchmark I used was a reverse engineering of 5 schema owners and all of their objects as above. Following the reverse engineering, I asked each product to generate a script to create those objects.

Current TDM Beta time to generate the script: just over 40 minutes
Case Studio 2.25 time to generate the script: 3 minutes

Case Studio is an older product with support stopping at Oracle 10g. Still, no items were missed in this particular test.



Can you send me the models too, please?

I did a comparison of the same model in both products and Toad Data Modeler generated the SQL code significantly faster than CASE Studio 2.

My model was created using CASE Studio 2 reverse engineering.
It had 139 tables, 1272 columns and 52 relationships.
I generated the SQL script in CASE Studio 2 and the output was generated in 1 minute and 28 seconds. Then I saved the model as a dm2 file and opened the same file in Toad Data Modeler. I generated the script for all the objects in 36 seconds in Toad Data Modeler.

Please try to open the CASE Studio 2 file in Toad Data Modeler and generate the script. This way you can compare the same model in both products easily. I think you might have reverse engineered more objects using Toad Data Modeler, and that is why the comparison results differ.

The reason why I ask for the models is to better understand why the generation time is so different in your case.

Thank you,




I’m sorry but the model and what it represents are proprietary and can’t be released. Where the time consistently appears to be consumed is in the generation of packages. I noticed that in the statistics you quote you say nothing of packages.

I’ve included a screen shot so that you can see the actual figures reported in the generation log.

Our particular model has 232 tables, 33 views, 94 packages with ~220000 lines of script, and results in 233374 total lines of script from TDM and 230605 total lines of script using Case Studio. I’ve compared both files to ensure that they create the same packages.

As per your suggestion I opened the dm2 file generated by Case Studio with the TDM beta and had TDM generate the script. That is the result you see in the screen shot.

It may be interesting for you to note that all of the packages are created using Toad 11.6.

One of my “pet peeves” with Toad is the excruciating amount of time it spends formatting script. It has made me wonder if the same formatting time is being spent in TDM. There are many cases where Toad’s formatting is a nuisance I would like to simply turn off completely, particularly when it reverse engineers something already in the database, like views.

Thanks for your support,

Alan

Edit: I overstated the number of packages, as I counted the CREATE statement for both the package body and the package itself.

94 packages is the correct number.


I have done a little bit of RDB abuse today. I RE’d a SQL Server 2008 DB. It comprises about 180 tables and 52 views. I also RE’d 2 stored procedures and about a dozen or so functions.


I haven’t saved the model, so I can’t say what would happen if I saved it to the binary format. However, given that it is all in memory already, I doubt the file save format will make a whole lot of difference.

This is on the “All Items” workspace view. I will also try on more limited views. But, given the moderate size of the DB, this seems a bit disappointing.

This also reminds me of a conversation I had recently with an experienced data architect. He said they tried to use TDM on a project for an extremely large customer; the DB was 800-900 tables covering 7 schemas. However, once they hit 80-90 entities, TDM slowed down so much that they could no longer use it. So he put together a cost-justification proposal for management and got them to move to a much more expensive competitor. That product handled the DB with no problems and kept moving just fine, even with nearly 1,000 entities in it. So Dell (or Quest, then) lost this extremely large customer, and the competitor gained it, despite being 7 times as expensive.

TDM has speed issues that need to be resolved, beyond “How fast can I open a file?”



Thank you, Alan and J Fisher! This is very valuable feedback. There are known performance issues, and we are already working on improvements that go beyond saving in binary format. In the next commercial version, the forms for entity, attribute, and relationship properties will be totally rewritten. Reasons:

a] better user experience and reduction of steps required to make basic operations with tables and columns

b] performance. In the past we used JavaScript for form customizations. In the next version, JavaScript is not used until users customize the forms. As a result, this should represent a significant improvement in performance (forms should open quickly, UI changes will be limited, etc.)

Currently we are investigating the performance issues related to processing Packages. We know what needs to be improved and are working on a solution. I will write details later.

Re the big deal: I believe TDM will manage large models in time. We are actively working on development, of course. The main focus so far has been on user experience; performance is another important item on our to-do list. We know how to generate SQL faster, we plan changes in features related to change management, etc.

Thank you again,



Good news. The generation of Packages has been optimized, and the next version will generate code for packages that contain long texts pretty quickly. The problem was in processing application variables in long texts (by the way, one of the features that didn’t exist in CASE Studio 2).
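For readers curious about this kind of bottleneck: a common culprit when substituting variables into long texts is rescanning the entire text once per variable instead of once in total. Here is a minimal Python sketch of that pattern, using a hypothetical %Name% variable syntax; this is illustrative only, not TDM’s actual implementation:

```python
import re

# Hypothetical application variables and a long script body
# (stand-ins, not real TDM data).
variables = {f"Var{i}": f"value{i}" for i in range(50)}
text = "some package source line %Var7% and %Var42%\n" * 20000

def substitute_naive(text, variables):
    # One full scan of the long text PER variable:
    # cost grows as (text length) x (number of variables).
    for name, value in variables.items():
        text = text.replace(f"%{name}%", value)
    return text

def substitute_single_pass(text, variables):
    # One compiled pattern resolves every variable in a
    # single scan of the text.
    pattern = re.compile("%(" + "|".join(map(re.escape, variables)) + ")%")
    return pattern.sub(lambda m: variables[m.group(1)], text)

# Both approaches produce identical output; only the cost differs.
assert substitute_naive(text, variables) == substitute_single_pass(text, variables)
```

For a ~220,000-line script with dozens of variables, the naive loop rescans the whole text for every variable, while the single-pass version touches each character roughly once. Fixing that kind of repeated scanning is typically the sort of change that turns minutes of generation time into seconds.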


Vaclav & TDM team