Toad World® Forums

Schema Compare performance problem


#1

Hello,
I think you will need to improve the performance of Schema Compare a little during the analysis of a schema.

More than an hour ago I started a comparison against a schema with 4870 various objects, and as I write this it is still gathering information on the columns…

At the same time, the database server is being hammered quite a bit, and my workstation feels sluggish, since its CPU is constantly at 100%.

Thanks,
Paolo

capture26-2-2007-19.20.24.jpeg


#2

Hi Paolo,

Performance is a very subjective issue, and we need a lot more information from you to be able to benefit from what you are experiencing.

  • What are the specs on the client PC?
  • What are the specs on the server PC?
  • Which Oracle database version(s) are you comparing?
  • Are your client and server on a LAN or a WAN?

Some background on our performance testing…
Our main test database consists of approximately 1,000 objects. These objects are very diverse and are designed to give us extensive coverage of object types, dependencies, etc.
This 1,000 object test schema (on 10g Release 2 and 9i) takes approximately 1 minute to compare (on a fast LAN).

We have a pseudo ‘Ora Apps’ schema that we have done testing on…
This schema has approximately 60,000 objects (many packages and views).
This 60,000 object schema takes approximately 20 minutes to compare (on a fast LAN).

10g Release 1 takes a long time to compare against, due to the nature of its dictionary tables. We haven’t worked on improving performance for 10g Release 1 at this stage; we may look at it in the future.

We’ve found that the specs of the PCs can make a huge difference to how long the compare takes, as can the speed of the network (obviously) and the performance of the databases involved (also obvious).

Although we have experienced high CPU usage on the client PC during various stages of the compare process, we haven’t seen the client PC become sluggish, unless the PC has 1 GB of memory or less and the compare is quite large.
Currently we do have an issue with excessive memory consumption, but we are working on this, and the 2nd release of Schema Compare will have dramatic improvements in the area of memory usage.

We are glad to have you on board, Paolo; working in HR, you should be used to working with some large databases. So, more detailed feedback from you will be most beneficial for us and the Schema Compare product!

Jaime


#3

Jaime,
I do not agree that performance is a subjective issue, and most of your future
Schema Compare users with a huge number of objects in their schemas probably
won’t agree either.

I am available to give you all the information you need to investigate the issue
and possibly fix it.

The client PC is a P4 2.40 GHz with 512 MB of RAM running Windows XP SP2 with the latest patches.
Both servers are Sun Fire 880s with 4 SPARC CPUs and about 10 GB of RAM running SunOS 5.9.
The database versions are the same (9.2.0.7.0), and there was no significant activity on them when I ran the compare.
All the servers and the clients are on a fast LAN.

I tried to compare the HR schema of an Oracle Apps installation (which includes HR and Payroll)
against an empty one (the SCOTT schema), so that I could generate the script to
recreate all 4870 existing objects.

I think the complexity of the HR schema makes quite a good real-life test case.
I could have tried the APPS schema, but I thought that was a bit much to start
with, since it includes over 138000 objects.

We already spoke about the high CPU usage on the client PC, and I understand that the
task is not a high-priority one, so it shouldn’t affect the response time of the
other running programs too much.

So I left it running in the background, but after about 50 minutes of the
compare running I started having problems switching applications or
opening new windows. That is very likely related to the amount of memory in
use, although I don’t remember Schema Compare consuming an outstanding amount of RAM.

In any case, after 1 hour and 20 minutes it was still comparing columns.
Since it was already late, I stopped it and went home.

I could leave it running, but I don’t know how to check the elapsed time of
the process if it’s left unattended.
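For what it’s worth, one simple way to time an unattended run (a minimal sketch in Python, not a Schema Compare feature; the `format_elapsed` helper is my own invention) is to record a start timestamp before kicking the task off and compute the elapsed wall-clock time whenever you check back:

```python
import time
from datetime import timedelta

def format_elapsed(start: float) -> str:
    """Return the wall-clock time elapsed since `start` as H:MM:SS."""
    return str(timedelta(seconds=round(time.monotonic() - start)))

# Record the start time before launching the long-running task...
start = time.monotonic()
# ...the long-running work (e.g. the compare) runs here, unattended...
# ...then, whenever you come back to the machine, check the elapsed time:
print(f"Elapsed: {format_elapsed(start)}")
```

Using `time.monotonic()` rather than the wall clock avoids surprises if the system clock is adjusted mid-run.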

I should probably wait for the new release to check whether the memory fix makes a difference.

Thanks,
Paolo


#4

Thanks for this detailed reply Paolo,

Perhaps I used the wrong word (subjective). Performance is probably the most important issue we face in creating an enterprise-level tool (which is our aim). What I meant to say is that it is very hard to obtain reliable metrics in relation to performance: people’s environments vary so much.

I feel certain that the sluggish performance on your client PC is due to the amount of RAM that you have (i.e. 512 MB).
As I mentioned earlier, this first cut of Schema Compare does have an issue with excess memory usage.

Unfortunately, the second cut is still some time off. What I will do is make a preliminary build available to you, so that you can test the memory upgrades in your environment. This will be very helpful for establishing whether we are heading in the right direction with the memory upgrades.

I will send you an e-mail (within the next 24 hours) with a link to the download of the new version.

Thanks for your help with this!!

Jaime


#5

Jaime,
I am still alive, but lately I haven’t been able to test Schema Compare properly, since one of the two test instances is not available right now (another project is going live). I hope to be able to test the special build you sent me next week.

I also saw that there is a specific forum for Schema Compare, so I will start posting my findings there.

Thanks,
Paolo


#6

Thanks for the update Paolo,

Whenever you have time to test is fine with us. We fully understand your situation.
We’re just glad to have you on-board!!

Jaime