Select only so many rows

I need to load data for a process that can only handle 9999 rows at a time. I
can easily select the first 9999 with:

where rownum between 1 and 9999

But when I try to get the next 9999 with:

where rownum between 10000 and 19999

I get zip. :-(

I’d be happy to let Linux ‘split’ a big file for me, but the
process needs calculations based on data in each subset.

Any advice (including go fly a kite) would be much appreciated.

Thanks much,

Dan

Daniel B Madvig
Computer Technologies

Northwestern College & Northwestern Media
3003 Snelling Ave.
St. Paul, MN 55113
www.nwc.edu

651.631.5323


Morning Dan,

where rownum between 10000 and 19999
I get zip. :-(
That's because rownum is assigned, starting at 1 and incrementing by 1,
only for rows in the final output set.
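That numbering behaviour is easy to demonstrate in SQL*Plus (dual is just a convenient one-row table):

    -- rownum is assigned as each candidate row passes the predicate,
    -- so a filter that excludes rownum = 1 can never be satisfied:
    select * from dual where rownum = 2;
    -- no rows selected: the candidate row is numbered 1, fails
    -- "rownum = 2" and is discarded; the next candidate would be
    -- numbered 1 again, so nothing ever qualifies.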

What you need is something like:

select * from (
    select rownum as r, t.stuff
    from my_table t
    where ....
) where r between 10000 and 19999;

Oracle "knows" about this construct and will "short circuit" the inner
select when it has enough rows. Unless, of course, you have an ORDER BY
clause, in which case it must read everything.

Although if I was paging through a huge table in this manner, I'd
probably parameterise the thing with binds, parse once, execute many!

select * from (
    select rownum as r, t.stuff
    from my_table t
    where ....
) where r between :first and (:first + 9998); -- Yes, 9998 is correct!
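One way to sketch that parse-once, execute-many loop in PL/SQL, using a
parameterised cursor in place of the bind (my_table and the column names
are placeholders, not anything from Dan's schema):

    declare
        cursor c_page (p_first number) is
            select * from (
                select rownum as r, t.*
                from my_table t
            ) where r between p_first and (p_first + 9998);
        v_first number := 1;
        v_seen  boolean;
    begin
        loop
            v_seen := false;
            for rec in c_page (v_first) loop
                v_seen := true;
                -- process rec here, 9999 rows per batch
            end loop;
            exit when not v_seen;
            v_first := v_first + 9999;
        end loop;
    end;
    /

The cursor is parsed once and re-executed with a new p_first each time
round the outer loop, which is the point of the bind-variable approach.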

HTH

Cheers,
Norm. [TeamT]

Information in this message may be confidential and may be legally privileged. If you have received this message by mistake, please notify the sender immediately, delete it and do not copy it to anyone else. We have checked this email and its attachments for viruses. But you should still check any attachment before opening it. We may have to make this message and any reply to it public if asked to under the Freedom of Information Act, Data Protection Act or for litigation. Email messages and attachments sent to or from any Environment Agency address may also be accessed by someone other than the sender or recipient, for business purposes. If we have sent you information and you wish to use it please read our terms and conditions which you can get by calling us on 08708 506 506. Find out more about the Environment Agency at www.environment-agency.gov.uk


In your select statement, include rownum as a column. Do not use rownum in a
where clause… take that out. Take that select statement and create a
table. Your data will then be in a table numbered from 1 to however many… then
you can use your between 1 and 9999, followed by between 10000 and 19999, and it
will work.
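A sketch of that staging-table approach (my_table, staged_rows and
some_key are placeholder names; the ORDER BY in the inner query is only
needed if the numbering has to be deterministic):

    create table staged_rows as
        select rownum as r, t.*
        from (select * from my_table order by some_key) t;

    -- each batch is then a simple range predicate on r:
    select * from staged_rows where r between 1 and 9999;
    select * from staged_rows where r between 10000 and 19999;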

Let me know if this helps,

Gene

Information Systems and Integration

Jackson State University

On Tue, Jun 15, 2010 at 12:10 PM, Madvig, Daniel B wrote:


can only handle 9999 rows at a time

Good Morning;

As an alternative to Norm’s suggestion, if PL/SQL is an option (rather than
straight SQL), and your database is late enough (8.0 or above if memory serves),
you could use the LIMIT clause with BULK COLLECT. The key thing to keep in mind
is that %NOTFOUND won’t work as you expect, so you’ll need to restructure
your loop slightly.

Example (the cursor needs a priming fetch before the loop, and the next
fetch goes at the bottom so the count test sees the latest batch):

OPEN cursor_set;

FETCH cursor_set
    BULK COLLECT INTO tab_set
    LIMIT 9999;

WHILE tab_set.count > 0
LOOP

    -- some other processing inside the loop here

    FETCH cursor_set
        BULK COLLECT INTO tab_set
        LIMIT 9999;

END LOOP;

CLOSE cursor_set;

Usual stuff: