
Out of memory during "large" request error

 



ianilkumar
Novice

Nov 13, 2013, 2:22 AM

Post #1 of 5
Out of memory during "large" request error

Hi,

I have a Perl script that deploys a tar file and changes configuration settings. For that, I am opening a lot of files and modifying them with the following snippet:

next if (-l $file);

##############################
# Open the configuration file.
##############################

unless (open(CONF, "$file"))
{
    print "Couldn't open the conf file.";
    &myExit(1);
}
my @conf_lines = <CONF>;
close(CONF);



Then I sometimes get the following error:


The configuration of above files are in progress...
Out of memory during "large" request for 69632 bytes at /quest/sie/tools/cloning_tool/clonemaker line 1341, <CONF> line 30972673.

Can somebody help me resolve this issue?


FishMonger
Veteran / Moderator

Nov 13, 2013, 5:13 AM

Post #2 of 5
Re: [ianilkumar] Out of memory during "large" request error

The code you posted won't generate that error message. It would be helpful if you posted the code that is actually generating the error.

One thing you can do to reduce the memory footprint is to not slurp the entire file into an array. Instead, use a while loop to process it line by line.
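For example, a minimal sketch (assuming $file holds the path, as in your snippet; the loop body is just a placeholder for your real processing):

Code
open(my $conf_fh, '<', $file) or die "Couldn't open $file: $!";
while (my $line = <$conf_fh>)
{
    # only the current line is held in memory
    # ... process $line here ...
}
close($conf_fh);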


ianilkumar
Novice

Nov 14, 2013, 2:00 AM

Post #3 of 5
Re: [FishMonger] Out of memory during "large" request error

Maybe this will help:

my @conf_files       = `find $ENV{HOME}/opt -name "*.conf*"`;
my @config_files     = `find $ENV{HOME}/opt -name "*config.values*"`;
my @prop_files       = `find $ENV{HOME}/opt -name "*Infranet.properties*"`;
my @op_prop_files    = `find $ENV{HOME}/opt -name opcoder.properties`;
my @queue_name_files = `find $ENV{HOME}/opt -name ifw_sync_queuenames`;
@conf_files = (@conf_files, @config_files, @prop_files, @op_prop_files, @queue_name_files);

print @conf_files;
print "\n\n The configuration of above files are in progress...\n";

foreach my $file (@conf_files)
{
    chomp($file);
    next if (-l $file);

    ##############################
    # Open the configuration file.
    ##############################

    unless (open(CONF, "$file"))
    {
        print "Couldn't open the conf file.";
        &myExit(1);
    }
    my @conf_lines = <CONF>;
    close(CONF);

    foreach my $line (@conf_lines)
    {
        next if ($line =~ /^\s*#/);
        next if ($line =~ /^\s*$/);
        next if ($line =~ /($skip_string)/);


BillKSmith
Veteran

Nov 14, 2013, 7:31 AM

Post #4 of 5
Re: [ianilkumar] Out of memory during "large" request error


Quote
my @conf_lines = <CONF>;
close(CONF);

foreach my $line (@conf_lines)
{


This is exactly what FishMonger meant by 'slurp'. Read the file with a while loop instead, so that only one line is held in memory at a time:

Code
while (my $line = <CONF>)
{
    # process $line here
}
close(CONF);


There is no reason to use the shell 'find' command. (Refer: perldoc -f grep)
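For a single directory, Perl's built-in opendir/readdir plus grep can replace the backticks (a rough sketch; note that readdir does not recurse, so for a whole directory tree you would want File::Find instead):

Code
opendir(my $dh, "$ENV{HOME}/opt") or die "Can't open $ENV{HOME}/opt: $!";
my @conf_files = grep { /\.conf/ } readdir($dh);
closedir($dh);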
Good Luck,
Bill


FishMonger
Veteran / Moderator

Nov 14, 2013, 8:33 AM

Post #5 of 5
Re: [ianilkumar] Out of memory during "large" request error

That's a start, but you neglected to show us which line is generating the error.

If you need to traverse a directory tree, then use the File::Find or File::Find::Rule module instead of shelling out to the system's find command.
http://search.cpan.org/~rjbs/perl-5.18.1/lib/File/Find.pm
http://search.cpan.org/~rclamp/File-Find-Rule-0.33/lib/File/Find/Rule.pm

How many files end up in @conf_files after all of those find commands? You could use the Devel::Size module to find out how much memory @conf_files is using.
http://search.cpan.org/~nwclark/Devel-Size-0.79/lib/Devel/Size.pm
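For example (a quick sketch; total_size counts the array plus everything it references):

Code
use Devel::Size qw(total_size);
print "\@conf_files uses ", total_size(\@conf_files), " bytes\n";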

Since @conf_files is a combined copy of each of the other arrays, you just doubled the amount of memory needed to store that list of filenames.

Instead of putting all of those filenames into an array, use the File::Find module and process them as they are found.
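Something along these lines (a sketch only; process_file() is a hypothetical stand-in for your per-file configuration logic, and the pattern mirrors the names in your find commands):

Code
use File::Find;

find(sub {
    return unless /\.conf|config\.values|Infranet\.properties|opcoder\.properties|ifw_sync_queuenames/;
    return if -l $_;    # skip symlinks, as the original code does
    process_file($File::Find::name);
}, "$ENV{HOME}/opt");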

I already pointed out the issue with slurping the file, and Bill has shown you how to fix it.

I suspect that if you follow these recommendations, you'll be able to drastically reduce the script's memory footprint.

There are a couple of other issues in that code, unrelated to the memory problem, which I might cover in a separate post.

 
 

