Perl over C with SWIG - memory allocation question

 



vampyr (New User)
Feb 21, 2008, 5:31 AM | Post #1 of 1

Hi everyone,

I'm currently writing an object-oriented wrapper over libhdfs, the C API for accessing the Hadoop filesystem (http://hadoop.apache.org/core/). I'll hopefully have it up on CPAN in a week or so.

My problem right now is memory allocation. Some functions in the C API expect a pre-allocated buffer, in particular the file read & write functions. The wrapper will of course hide this requirement, but I'm looking for an elegant way to do it. Right now, taking a call to get the current working directory as an example, I'm doing something along these lines:


Code
$self->{working_directory} = ' ' x $self->{buffer_size}; # Poor man's memory allocation 
HDFS::hdfsGetWorkingDirectory( $self->{_fs}, $self->{working_directory}, $self->{buffer_size} );
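One wrinkle I'll also have to hide: the Perl scalar keeps its full padded length after the call, so the result needs trimming. A minimal sketch, assuming the SWIG binding writes the NUL-terminated C string into the scalar in place:


Code
# Everything from the C string's NUL terminator onward is still our
# space padding, so cut it off.
$self->{working_directory} =~ s/\0.*\z//s;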


This is also the way I plan to implement buffered reading, for text files for instance. The problem disappears for writes, or when reading a predetermined number of bytes, since in those cases the exact buffer size is known up front.
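For illustration, the buffered read would look something along these lines (a sketch only: HDFS::hdfsRead here stands for the SWIG-generated binding of libhdfs's hdfsRead, and I'm assuming it fills the pre-allocated scalar and returns the byte count, mirroring the C signature):


Code
sub read_chunk {
    my ( $self, $file ) = @_;
    my $buffer = "\0" x $self->{buffer_size};    # pre-allocate again
    my $read   = HDFS::hdfsRead( $self->{_fs}, $file,
                                 $buffer, $self->{buffer_size} );
    return undef if $read < 0;                   # libhdfs signals errors with -1
    return substr( $buffer, 0, $read );          # trim to the bytes actually read
}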

Any ideas on an elegant solution to this problem, or is this a hack I'll have to live with?


(This post was edited by vampyr on Feb 21, 2008, 5:51 AM)

 
 

