Large data file read

Greenhorn
Posts: 1
Hello,
I'm trying to read a file containing very large records on a Unix system.
I have set up the following script:
DECLARE
   file_id  UTL_FILE.FILE_TYPE;
   linedata VARCHAR2(28000);
BEGIN
   file_id := UTL_FILE.FOPEN('/tmp', 'largefile.txt', 'r');
   BEGIN
      LOOP
         UTL_FILE.GET_LINE(file_id, linedata);
         DBMS_OUTPUT.PUT_LINE(linedata);
      END LOOP;
   EXCEPTION
      WHEN NO_DATA_FOUND THEN
         NULL;  -- end of file reached; fall through to close the file
   END;
   UTL_FILE.FCLOSE(file_id);
EXCEPTION
   WHEN UTL_FILE.READ_ERROR THEN
      UTL_FILE.FCLOSE(file_id);
      RAISE_APPLICATION_ERROR(-20001, 'utl_file.read_error');
END;
I get a utl_file.read_error (ORA-06512) when trying to read a line of about 6000 bytes. Smaller lines are OK. Is there some limit, or a workaround?
I'm using Oracle 8i.
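One thing I noticed is that FOPEN appears to take an optional fourth parameter, max_linesize, and the default line buffer may be smaller than my records. This is just a guess I have not yet verified against the 8i documentation, but perhaps the open call should look something like:

```sql
-- Untested guess: pass max_linesize (up to 32767) so GET_LINE can
-- buffer lines longer than FOPEN's default limit.
file_id := UTL_FILE.FOPEN('/tmp', 'largefile.txt', 'r', 32767);
```

Can anyone confirm whether this parameter exists in 8i and whether it is the right fix?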
Any help, please?
Thank you
Jon