
Home: Perl Programming Help: Beginner:
When does size matter?

 



jacqui
Novice

Apr 16, 2000, 1:43 AM

Post #1 of 3 (887 views)
When does size matter?

 
Hi,

I'm storing data in a tab delimited text file, each line probably having about 1kB worth of info.

What is the max size I can let the file grow to before open(FILE, ...), while (<FILE>), etc. become too slow or clunky?


Admin
Deleted

Apr 16, 2000, 7:35 PM

Post #2 of 3 (887 views)
Re: When does size matter? [In reply to]

The answer depends on how you read the data into your program...

If you slurp the entire file into an array or hash, that will start slowing down pretty quickly, depending on your server's resources. Tossing the contents into hashes will generally slow down the program more than arrays.
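For contrast, here is a minimal sketch of the slurp approach being warned against (the file name and field layout are made up for the demo): the entire file is read into @lines before any processing happens, so memory use grows in step with file size.

```perl
use strict;
use warnings;

# Hypothetical demo file; any tab-delimited file would do.
my $path = "demo.txt";
open(my $out, ">", $path) or die "Couldn't write $path: $!";
print $out "alice\t42\n", "bob\t17\n";
close $out;

# The slurp approach: every line of the file is held in @lines at once.
open(my $in, "<", $path) or die "Couldn't open $path: $!";
my @lines = <$in>;
close $in;

my %age;
foreach my $line (@lines) {
    chomp $line;
    my ($name, $age) = split(/\t/, $line);
    $age{$name} = $age;
}
unlink $path;
```

This is fine for small files, but for a file with millions of 1kB lines, @lines alone would need gigabytes of memory.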

The best way to handle large files is not to slurp them at all -- process each line one at a time. For example:

code:

open (FILE, "<file.txt") or die "Couldn't open file.txt: $!";
while (<FILE>) {
    chomp;
    my @whatever = split(/\t/, $_);
}
close FILE;

Essentially, you read the file line by line. Because Perl cleans up after itself, lexical variables declared inside the loop (like @whatever) go out of scope at the end of each iteration, and the memory they used is freed for reuse before the next line is read.

Hope this helps!


jacqui
Novice

Apr 17, 2000, 10:34 AM

Post #3 of 3 (887 views)
Re: When does size matter? [In reply to]

Thanks :-)

Jac
XXXX

 
 


Powered by Gossamer Forum v.1.2.0