
Laurent_R
Veteran / Moderator
Dec 15, 2012, 2:31 PM
Post #8 of 15
Re: [omega] cant output through perl or shell
[In reply to]
In Reply To:
> On another site someone mentioned the issue might be the text editor not being able to read a multi-million-line file, which a quick
>     open COUNTDOWN, '<', 'countdown.txt';
>     while ( <COUNTDOWN> ) { print "$_\n"; }
> showed to be true. My issue now is that, without being able to physically read the file or load it into an array or hash (which I am unable to do because it overloads my RAM), the output is unusable. Any ideas?

Unless I missed something, you did not show anywhere in your original code any statement to open a file. How are we supposed to guess that?

Besides, I don't get it: reading the file with an iterator, as the syntax of your snippet indicates, does not load the full file into memory, but just one line at a time.
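For what it's worth, here is how I would write that loop today. This is only a sketch: I am assuming the file really is called countdown.txt as in your snippet, and I have added strict/warnings, a lexical filehandle and error checking.

    use strict;
    use warnings;

    # Three-argument open with a lexical filehandle;
    # report the OS error ($!) if the file cannot be opened.
    open my $countdown, '<', 'countdown.txt'
        or die "Could not open countdown.txt: $!";

    while ( my $line = <$countdown> ) {
        chomp $line;       # remove the record separator
        print "$line\n";   # only one line is held in memory at a time
    }

    close $countdown;

Note, by the way, that your original print "$_\n" adds a second newline after the one already contained in $_, which will double-space the output; chomping first avoids that.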
Done this way, you should be able to read a file of just about any size. I have read files having sizes of several hundred gigabytes with such a method and never encountered any problem.

Well, actually, you might have a problem if the file being read does not contain any record separators (carriage-return or newline characters), meaning that the while loop cannot break the input into records and you end up trying to put the whole file into the $_ variable, which might of course exceed the memory capacity of your platform. So the problem might be with the file format.

For example, one possible source of the problem: under Windows, lines usually end with a combination of two characters (ASCII 13 and ASCII 10, i.e. carriage return followed by line feed), whereas a file generated under Unix uses only ASCII 10 (the newline, \n) as its record separator. If the separator Perl is looking for does not match what is actually in the file, your program might not be able to break up your input into records and might try to slurp the whole content of the file into $_. This Windows/Unix format issue is just one example; there could be other reasons why Perl is not recognizing individual records correctly. In such a case, there are various ways to preprocess your file to guarantee that the format is right, or you can explicitly change the default input record separator (the $/ variable).

Once this is correct, there should be absolutely no reason why you would not be able to read a file having trillions of lines (except of course that it might take quite a bit of time). Well, tell us more about the file you are reading; we did not even know you were reading one.
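If a mismatched record separator does turn out to be the culprit, you can tell Perl explicitly what to split on. Again only a sketch: here I am assuming the file uses Windows-style CRLF line endings, but you should set $/ to whatever separator your file actually contains:

    use strict;
    use warnings;

    open my $countdown, '<', 'countdown.txt'
        or die "Could not open countdown.txt: $!";

    {
        # Assumption: the file uses Windows-style CRLF endings.
        # local confines the change to $/ to this block.
        local $/ = "\r\n";

        while ( my $line = <$countdown> ) {
            chomp $line;     # chomp removes whatever $/ currently is
            print "$line\n";
        }
    }

    close $countdown;

Using local rather than assigning to $/ directly means the default separator is restored as soon as the block is left, so any other reads elsewhere in the program are unaffected.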