File Size Limitation??

 



JohnQPolish
Deleted

Jul 30, 2000, 10:31 PM

Post #1 of 2 (752 views)
File Size Limitation??

I am attempting to parse a web log file that's 2.3 GB. The script basically opens the file, sorts through it, and writes certain lines to other files. Everything works great, except that it stops before reaching the end of the file (around the 2 GB mark). I found a bug report stating that Perl doesn't support files larger than 2 GB; it was suggested to configure Perl using -Duse64bits. How do I do this? Any other suggestions? I am attempting to run this on Windows NT (I know, it should be on Unix), so any advice on getting this to work would be greatly appreciated.
Thanks in advance!
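For reference, the -Duse64bits flag mentioned above is a Configure option used when building Perl from source on a Unix-like system; a rough sketch of that build is below. Treat the exact flag spelling as version-dependent (later Perls split it into -Duse64bitint/-Duse64bitall and added -Duselargefiles, which became the default from 5.6 onward), and note that on Windows NT there is no Configure script: you would instead edit the build options in win32/Makefile and rebuild with your compiler.

```shell
# Rebuild Perl from source with 64-bit / large-file support.
# Flag names vary by Perl version; -Duse64bits is the 5.005-era
# spelling, newer sources use -Duselargefiles / -Duse64bitint.
./Configure -des -Duse64bits
make
make test
make install
```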


914
Deleted

Aug 2, 2000, 10:30 PM

Post #2 of 2 (752 views)
Re: File Size Limitation?? [In reply to]

Well, it's a real kludge, but since most NT machines need to be vastly overspecified anyway...

As long as you have the disk space, you could simply have the script split the file. I.e.:

1. Open the file and write up to one gig to a new file.
2. Make a copy of the original file, then open the copy and discard the first gig.
3. Run your operation on the two temp files serially.

This means you'd need about 1.5 times the size of the file available for temporary use, plus whatever else the script generates.

I know, it's terrible... but it's likely easy, and as long as you've got the disk space...
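The splitting step above might look something like the following sketch. All names here (the split_file routine, the chunk-file naming) are made up for illustration, and it assumes sequential reads stay reliable below the 2 GB boundary; it streams the input into numbered chunk files of at most a given size, breaking only at line boundaries, so each chunk can then be processed serially.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Split $in_file into chunk files of at most $limit bytes each,
# never splitting a line across two chunks. Returns the list of
# chunk file names, in order. (Hypothetical helper, not from the
# thread -- the chunk naming scheme is an assumption.)
sub split_file {
    my ($in_file, $limit) = @_;
    my @chunks;
    my ($num, $bytes) = (0, 0);
    open my $in, '<', $in_file or die "Can't open $in_file: $!";
    my $out;
    while (my $line = <$in>) {
        # Start a new chunk on the first line, or when this line
        # would push the current chunk past the size limit.
        if (!defined $out || $bytes + length($line) > $limit) {
            close $out if defined $out;
            my $name = "$in_file.chunk" . $num++;
            open $out, '>', $name or die "Can't write $name: $!";
            push @chunks, $name;
            $bytes = 0;
        }
        print {$out} $line;
        $bytes += length $line;
    }
    close $out if defined $out;
    close $in;
    return @chunks;
}
```

You'd call it with a limit of around one gig (e.g. `split_file('access.log', 2**30)`), then run the original parsing loop over each returned chunk in turn.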



Powered by Gossamer Forum v.1.2.0