Hi,
I have a requirement where an Oracle table holds almost 40 million rows. I need to process this data and write it out to various flat files that will be given to a client. The problem is that whenever I fetch records from this table, I get a memory fault error after about 10-15 million rows have been fetched. I am therefore looking for a solution that fetches the result set in batches, frees the memory, and then gets the next batch, so that I can avoid the memory error.
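Is something like the following the right approach? This is only a minimal sketch of the batched fetching I have in mind, assuming Python with the python-oracledb driver; the connection details, the table name big_table, and the output file naming are just placeholders.

import oracledb

BATCH_SIZE = 50_000  # rows kept in memory per batch

# Placeholder connection details.
conn = oracledb.connect(user="scott", password="tiger", dsn="dbhost/orclpdb")

with conn.cursor() as cur:
    cur.arraysize = BATCH_SIZE              # rows per network round trip
    cur.execute("SELECT * FROM big_table")  # placeholder table
    part = 0
    while True:
        # Only BATCH_SIZE rows are held in memory at any time.
        rows = cur.fetchmany(BATCH_SIZE)
        if not rows:
            break
        part += 1
        # Write each batch to its own pipe-delimited flat file.
        with open(f"output_part_{part:04d}.txt", "w") as fh:
            for row in rows:
                fh.write("|".join("" if v is None else str(v) for v in row) + "\n")

conn.close()

The idea is that fetchmany() together with a modest arraysize keeps memory use bounded regardless of the total row count, and each batch is flushed to its own flat file before the next one is fetched.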
Thanks in advance
Manish