Toad World® Forums

plan cost limitation when generating rewrite queries


#1

Many of the queries I am asked to tune are very complex (10 or more tables). When I run auto-optimize, the optimizer generates 200 or more alternative rewrites for my query and then begins to execute all of them. The large majority of these have a plan cost in the 6 to 9 digit range, while the best tuned alternatives have a plan cost of only 3 or 4 digits at most. A nice enhancement would be an option to generate (or at least execute) only rewrites whose plan cost is below a user-specified value. Otherwise my only option is to run the rewrite step manually, delete the alternatives with a plan cost above 10000, and then start the batch run by hand.
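
For reference, the plan cost I am filtering on is just the optimizer cost Oracle reports for the statement. A minimal sketch of the manual check I do today, assuming the default PLAN_TABLE is installed; the SELECT and the STATEMENT_ID are placeholders, not one of my real 10-table queries:

```sql
-- Explain one generated rewrite under its own statement ID
-- (the query below is only a stand-in for a real rewrite).
EXPLAIN PLAN SET STATEMENT_ID = 'REWRITE_42' FOR
SELECT e.last_name, d.department_name
  FROM employees e
  JOIN departments d ON d.department_id = e.department_id;

-- The root row (ID = 0) carries the total optimizer cost for the statement,
-- which is the number I compare against my 10000 cutoff.
SELECT statement_id, cost
  FROM plan_table
 WHERE statement_id = 'REWRITE_42'
   AND id = 0;
```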


#2

Since the product is designed for tuning problematic SQL statements, my first assumption is that the cost estimation will also be wrong, so I do not encourage people to test run only the lower-cost SQL alternatives. In certain environments I have found that a higher-cost alternative may actually be better, especially for statements whose cost is significantly misestimated. Please see my blog for details:

http://www.toadworld.com/BLOGS/tabid/67/EntryId/349/10-Common-Misconceptions-in-SQL-Tuning-1.aspx

If you really want to delete the higher-cost SQL statements after the rewrite, you can order the alternatives by cost, then range-select the higher-cost ones and delete them in a batch.

Richard


#3

If you are trying to satisfy a one-size-fits-all environment, I can agree. But if your intention is to design the tool to be versatile enough to work out of the box for different kinds of tuning requirements, saving the user time and making the tool more valuable, then auto-optimize needs to be more versatile as well. We are about 85% web based, which makes this an OLTP-type environment requiring very fast, low-I/O-cost queries for our many clients. In that setting, a query with a cost of 100000 or more is typically detrimental. If, on the other hand, we were tuning for a DSS database environment, your response to my suggestion would be plausible. I have also found that when I use auto-optimize as is, I almost always end up with many queries faster than my original, and the one thing they have in common is that their plan costs are at the lower end.


#4

Yes, you are right for your environment. When we designed the new auto-optimize function, we intended to help common (inexperienced) users find better solutions without having to manage too many option settings. Because the cost estimation error varies by environment and by SQL statement, and can be huge, we did not want users to skip potentially good alternatives by test running only the lower-cost ones just to save time. We have actually implemented an intelligent test run order, which can significantly reduce test run time. A user-defined test run criterion such as a cost threshold would probably have to be a percentage (test run only the lowest-cost portion of the rewritten SQL), since we never know the distribution and scale of the costs before the SQL is rewritten.
A better solution for an experienced user like you: I will propose enhancing the Rewrite function to execute a Batch Run automatically once the rewrite is finished, and to let the Batch Run accept a cost threshold for that SQL (an absolute value rather than a percentage, since we assume the user knows the SQL very well and every SQL has its own cost scale).
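
Conceptually, the pre-screen before the Batch Run could look something like this sketch (illustration only, against the standard PLAN_TABLE; the 'REWRITE%' statement IDs and the 10000 cutoff are just examples taken from your post, not the actual implementation):

```sql
-- List only the rewrites whose total estimated cost is under the user's
-- threshold; these would be the candidates handed to the Batch Run.
SELECT statement_id, cost
  FROM plan_table
 WHERE id = 0                       -- root row = total estimated cost of the plan
   AND statement_id LIKE 'REWRITE%' -- one statement ID per generated rewrite
   AND cost < 10000                 -- user-supplied threshold for this SQL
 ORDER BY cost;
```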

What do you think?

Richard


#5

This solution sounds like it will be very helpful. I look forward to seeing it added to the tool. Thank you.