I don't agree that performance is a subjective issue, and most of your future
Schema Compare users with a large number of objects in their schemas probably
won't agree either.
I am available to give you all the information you need to investigate the issue
and possibly fix it.
The client PC is a P4 2.40GHz with 512 MB of RAM running Windows XP SP2 with the latest patches.
Both servers are Sun-Fire-880 machines with 4 SPARC CPUs and about 10 GB of RAM running SunOS 5.9.
The database versions are the same, that is 126.96.36.199.0, and there was no significant activity on them when I ran the compare.
All the servers and the clients are on a fast LAN.
I tried to compare the HR schema of an Oracle Apps installation (that includes HR and Payroll)
with an empty one (the SCOTT schema), so that I could generate the script to
recreate all the existing 4870 objects.
I think the complexity of the HR schema makes for a good real-life test case.
I could have tried the APPS schema, but I thought it was a little too much to
start with, since it includes over 138000 objects.
We already spoke about the high CPU usage on the client PC, and I understand that
the task runs at a low priority, so it shouldn't affect the response time of the
other running programs too much.
So I left it running in the background, but after about 50 minutes of the
compare running I started having problems when switching applications or
opening new windows. That is very likely related to the amount of memory in
use, although I don't remember Schema Compare itself consuming an outstanding amount of RAM.
In any case, it's a fact that after 1 hour and 20 minutes it was still comparing columns.
Since it was already late, I stopped it and went home.
I could leave it running overnight, but I don't know how to check the elapsed
time of the process when it's left unattended.
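One way around this, assuming the compare can be launched from a command line (which I haven't verified for Schema Compare), would be a small wrapper that starts the process and periodically appends the elapsed time to a log file, so the total duration survives even if nobody is watching. A minimal sketch in Python, where the command line and log file name are just placeholders:

```python
import subprocess
import sys
import time

def run_and_log(cmd, logfile, interval=60):
    """Launch `cmd`, append elapsed-time checkpoints to `logfile`
    every `interval` seconds, and record the total run time at the end.
    Returns the elapsed time in seconds."""
    start = time.time()
    proc = subprocess.Popen(cmd)
    # Poll until the process exits, leaving a trail of timestamps.
    while proc.poll() is None:
        time.sleep(interval)
        with open(logfile, "a") as f:
            f.write("still running after %.0f s\n" % (time.time() - start))
    elapsed = time.time() - start
    with open(logfile, "a") as f:
        f.write("finished in %.0f s (exit code %d)\n" % (elapsed, proc.returncode))
    return elapsed

if __name__ == "__main__":
    # Hypothetical invocation; the real compare command would go here.
    run_and_log([sys.executable, "-c", "import time; time.sleep(1)"],
                "compare_elapsed.log", interval=0.5)
```

This wouldn't fix the performance problem, of course, but it would at least let me leave the compare running overnight and read the final elapsed time off the log the next morning.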
I should probably wait for the new release to check whether the fixed memory issue makes a difference.