Limits on "Generate Change Script"?

We are able to run the "Generate Change Script" for all our PostgreSQL 9.5 schemas except one, which has the largest number of tables and workspaces. Is there a limit on the number of tables and/or workspaces for "Generate Change Script"?

Thank you,
Steven Smith

Hi Steven,
There are no known exact limits, but practical limits do exist. The 64-bit version of TDM should have no memory limit, but performance can degrade, and the time needed to generate a change script can become very long for very big models. It depends on the number of each object type (Entity, Attribute, Index...).
Could you share the type of model (Oracle, PostgreSQL...) and its statistics (how many Entities, Attributes, etc. it contains)? You can find this information in the Model Properties dialog.

Thanks
Petr

I asked the developer who reported the problem to me to answer your questions. Here is his response.

Hi Steven,

The ‘ptl’ schema model is a PostgreSQL model. Below are some stats on this model:

Entities: 162
Data Types: 56
Relationships: 188
Attributes: 1451
Keys: 164
Indexes: 14

Thanks,

Rayesh Upadhyay

443-348-1502
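As a side check, statistics like these can also be cross-checked against the live PostgreSQL catalog. Below is a rough sketch in Python that builds count queries for a schema named "ptl" (the name from this thread); the mapping of TDM object types to catalog views is my own assumption, and actually running the queries would need a driver such as psycopg2, which is not shown here.

```python
# Sketch: count queries that roughly correspond to TDM model statistics.
# The mapping (Entities ~ base tables, Attributes ~ columns, etc.) is an
# assumption, not TDM's definition.
SCHEMA = "ptl"

QUERIES = {
    # Entities ~ base tables in the schema
    "entities": (
        "SELECT count(*) FROM information_schema.tables "
        "WHERE table_schema = %s AND table_type = 'BASE TABLE'"
    ),
    # Attributes ~ columns in the schema
    "attributes": (
        "SELECT count(*) FROM information_schema.columns "
        "WHERE table_schema = %s"
    ),
    # Relationships ~ foreign-key constraints
    "relationships": (
        "SELECT count(*) FROM information_schema.table_constraints "
        "WHERE table_schema = %s AND constraint_type = 'FOREIGN KEY'"
    ),
    # Keys ~ primary-key and unique constraints
    "keys": (
        "SELECT count(*) FROM information_schema.table_constraints "
        "WHERE table_schema = %s "
        "AND constraint_type IN ('PRIMARY KEY', 'UNIQUE')"
    ),
    # Indexes ~ entries in the pg_indexes view
    "indexes": "SELECT count(*) FROM pg_indexes WHERE schemaname = %s",
}

def render_queries(schema):
    """Substitute the schema name into each query so the SQL can be
    pasted straight into psql. (With a driver, pass (schema,) as the
    parameter tuple instead of string substitution.)"""
    return {name: sql.replace("%s", f"'{schema}'") + ";"
            for name, sql in QUERIES.items()}

if __name__ == "__main__":
    for name, sql in render_queries(SCHEMA).items():
        print(f"-- {name}")
        print(sql)
```

Comparing these counts with the numbers in Model Properties can reveal whether the model and the database have drifted apart.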

Hi Steven,
this is not such a big model; we have tested much bigger ones. Of course, there may be some unexpected constructs that degrade performance or cause other problems. If you are willing to share your model with me, I can analyze it; you can send it to petr.daricek@quest.com.
Otherwise, I recommend these steps:

  1. Run the Test Model function (if it reports bugs, you can use Repair Model). It is in the model's popup menu; you have to switch on Expert Mode in the options first. The results are displayed in the Message Explorer.

  2. Select the "Log Progress to a File" option on the "Comparison Settings" step of the Change Script wizard. It creates a log file showing which objects are processed. If you repeat the run several times and it always stops at the same point, you can identify the problematic object.

  3. Try generating the change script with only a subset of objects selected.
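For step 2, the useful information in a hung or failed run is usually the last object the log mentions. A minimal sketch of that check (the log format below is entirely made up; adapt the parsing to whatever TDM actually writes):

```python
def last_processed_object(log_text):
    """Return the last non-empty line of a progress log, which in a
    hung or failed run is typically the object being processed when
    the run stopped. The log format is an assumption; TDM's actual
    format may differ."""
    lines = [ln.strip() for ln in log_text.splitlines() if ln.strip()]
    return lines[-1] if lines else None

# Example with a made-up log excerpt:
sample = """\
Processing entity: customers
Processing entity: orders
Processing index: idx_orders_created_at
"""
print(last_processed_object(sample))
# -> Processing index: idx_orders_created_at
```

If several runs always end on the same line, that object is the one worth excluding (step 3) or inspecting with Test Model (step 1).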

Regards
Daril