Toad World® Forums

Dropping Schema Objects


#1

Hi all,
Over the years I’ve done this routine tasks hundreds of times - drop all of the objects in a schema (in development or test) and then re-import all of the objects from the equivalent production database. From time to time I’ve tried to drop all of the objects using TOAD - but it always fails because the number of objects in the schema (40,000) exceeds TOADs limit.

Does anyone out there have any thoughts on the most efficient way to perform this task using TOAD?

Thanks,

Kevin


#2

Database subset wizard - allows you to create or recreate a set of objects and a user defined percentage of their data (and a relationally correct subset at that). This is by far the easiest way and has been in TOAD for many years.


#3

DSS will likely bomb out with a schema that has 40,000 objects. I’d suggest Data Pump.


#4

Hi Kevin,

Hi all, Over the years I’ve done this routine tasks hundreds
of times - drop all of the objects in a schema (in
development or test) and then re-import all of the objects
from the equivalent production database.

notwithstanding the replies from Bert and John, I’d seriously advise
against dropping ALL the objects in a schema prior to a refresh from
production. You specifically mention an import from production.

Here’s my problem - I have a system where both test and production use
database links to connect to other databases, the test system to another
test database and the production one to another production database.

When you drop all your objects in test, you are dropping the database
link. On import, you will recreate it with the details of the production
database - your test database will now be accessing a production
database. This is not a good thing!
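Before and after a refresh it’s worth checking where the schema’s links actually point. A quick look at the standard USER_DB_LINKS dictionary view is enough (a sketch; the exact columns available vary slightly by Oracle version):

```sql
-- List the schema's database links and the hosts they resolve to,
-- so you can confirm test links haven't been replaced by production ones.
SELECT db_link, username, host, created
  FROM user_db_links
 ORDER BY db_link;
```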

I have to admit that when I need a full refresh of a test database, I
use the following script in TOAD, connected as the user I’m about to
refresh (ie, not SYS or SYSTEM!), rather than trying to drop all
objects:

begin
  FOR x IN (SELECT table_name FROM user_tables) loop
    execute immediate 'drop table ' || x.table_name || ' cascade constraints purge';
  end loop;
end;

The above assumes 10g and above; on 9i or earlier, drop the PURGE keyword:

begin
  FOR x IN (SELECT table_name FROM user_tables) loop
    execute immediate 'drop table ' || x.table_name || ' cascade constraints';
  end loop;
end;

Once the tables (indexes, constraints and triggers) have been done, the
following will tidy up everything else, leaving only the database links:

begin
  FOR x IN (SELECT object_type, object_name FROM user_objects
            WHERE object_type NOT IN ('PACKAGE BODY', 'UNDEFINED', 'DATABASE LINK')) loop
    execute immediate 'drop ' || x.object_type || ' ' || x.object_name;
  end loop;
end;

It’s a SQL Cache killer, but I’m about to kill it with the import anyway! UNDEFINED in the above is a catcher for materialised views in older 9i (or 8i) versions - when Oracle switched from SNAPSHOTs to MVIEWs, the object_type was left set to UNDEFINED. The following then drops those, if required:

begin
  FOR x IN (SELECT object_type, object_name FROM user_objects
            WHERE object_type = 'UNDEFINED') loop
    execute immediate 'drop snapshot ' || x.object_name;
  end loop;
end;

You are now left with a pristine test schema ready to import a full copy
of production, except for any database links which will remain as they
were and will still be pointing/connecting to a test database.
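As a sanity check after running the drop scripts above, a quick count by object type should confirm that nothing but the links survived (a sketch using the standard USER_OBJECTS view):

```sql
-- Should return only DATABASE LINK rows once the cleanup is complete.
SELECT object_type, COUNT(*) AS remaining
  FROM user_objects
 GROUP BY object_type
 ORDER BY object_type;
```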

Cheers,
Norm. [TeamT]

Norman Dunbar
Contract Oracle DBA
CIS Engineering Services
Internal : 7 28 2051
External : 0113 231 2051

Information in this message may be confidential and may be legally privileged. If you have received this message by mistake, please notify the sender immediately, delete it and do not copy it to anyone else. We have checked this email and its attachments for viruses. But you should still check any attachment before opening it. We may have to make this message and any reply to it public if asked to under the Freedom of Information Act, Data Protection Act or for litigation. Email messages and attachments sent to or from any Environment Agency address may also be accessed by someone other than the sender or recipient, for business purposes. If we have sent you information and you wish to use it please read our terms and conditions which you can get by calling us on 08708 506 506. Find out more about the Environment Agency at www.environment-agency.gov.uk



#5

In my environment we have the luxury of being able to keep all of the data that exists in production in the dev/test environment - so I’d actually want all of the tables and all of the data in the tables. So - no problem if my “subset” is actually just a complete copy, right?

The things that are different in dev/test are things like the privs given to the various user accounts, public database links, etc. - i.e. not things that are in the schema being duplicated.


#6

Data subset wizard allows a subset from 1 to 100% - so a subset by definition can be the whole enchilada.

As for wizard not being able to handle 40,000 objects - I hate to disagree with John because he’s 99.99% right since he writes much of the code - but this might be the .01% where he’s not spot on (rare - but possible - I mean heck, my Cowboys won a playoff game this weekend - so anything is possible).

The wizard simply constructs a SQL DDL script to do all the work - and that work is done on the server even if run from the client. So as long as the Toad editor can handle a huge script you’re fine - or you can gen the script and then run it in SQL*Plus. So I’m betting this will work …


#7

I should have been clearer. It doesn’t matter how many rows are in the table… yes, we construct SQL statements for that and the server does all the work for table data. But if the metadata for those 40,000 objects takes up 2 GB or more, then Toad will report an “out of memory” error before the operation can complete. This will happen with an APPS schema, for example. So it really depends on what those 40,000 objects are.


#8

In my case it is a PeopleSoft Financials schema.

Incidentally, have you ever noticed that the vast majority of tables in a PeopleSoft Financials schema have zero rows? What a waste…

Kevin


#9

Morning Kevin,

Incidentally, have you ever noticed that the vast majority
of tables in a PeopleSoft Financials schema have zero rows.
What a waste… Kevin

Oracle 11g R2 (possibly R1) has Deferred Segment Creation. You can
create a table and until you actually come to use it to store data, no
disc space is used up.

At present each empty table uses a minimum of 64 KB of disc space.
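For illustration, deferred segment creation looks like this on 11gR2 (the table name here is made up; on that release DEFERRED is also the default when DEFERRED_SEGMENT_CREATION=TRUE):

```sql
-- No extent is allocated at CREATE time; the initial 64 KB segment
-- only appears when the first row is inserted.
CREATE TABLE gl_scratch (
  id     NUMBER,
  descr  VARCHAR2(100)
) SEGMENT CREATION DEFERRED;

-- Returns no rows until the first INSERT against the table.
SELECT segment_name, bytes
  FROM user_segments
 WHERE segment_name = 'GL_SCRATCH';
```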

Cheers,
Norm. [TeamT]



#10

I’ll keep that in mind. Right now I’m running a mix of 9i and 10g - but I will definitely start the transition to 11gR2 over the course of the next year.

Even if a table has zero rows - and, perhaps in 11g uses zero space - it still takes a finite amount of time to create that table definition when you run the import back. I’ve often thought it might save some time if I wrote a script to determine which tables had non-zero rows - and then just do a table import on that set. Of the 40K tables I’m guessing that less than 1,000 tables actually have data - though I’ve never really counted that.
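A sketch of that idea - assuming statistics are reasonably current, USER_TABLES.NUM_ROWS gives you the non-empty set without counting 40,000 tables one by one:

```sql
-- Tables with data, according to the last stats gathering.
-- NUM_ROWS is NULL or stale without fresh statistics; for exact answers
-- you'd have to COUNT(*) each table, which is much slower at this scale.
SELECT table_name
  FROM user_tables
 WHERE num_rows > 0
 ORDER BY table_name;
```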

Anyone ever tried something like that?


#11

I wrote a script to determine which tables had non-zero rows - and then just
do a table import on that set

Just remember not to drop the zero-rowed tables :wink:

They may not currently be in use but that doesn’t prevent their use in the
future. If they exist, someone saw a purpose to have them created.

RAS


#12

You can use DB Links based on tns names which are configured in tnsnames.ora on
your servers and point to the proper environments. DB Links would LOOK exactly
the same this way, but will point to different destinations.
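For example (the link and alias names here are hypothetical) - the DDL is identical on test and production, and only the tnsnames.ora entry behind the alias differs per server:

```sql
-- Same statement everywhere; APP_LINK resolves via the local tnsnames.ora,
-- so each environment reaches its own counterpart database.
CREATE DATABASE LINK app_link
  CONNECT TO app_user IDENTIFIED BY app_password
  USING 'APP_TNS_ALIAS';
```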