Replying to: Re: [FishMonger] Text file parsing by Grant McCormack

Hi FishMonger,

Many thanks for your reply.

The txt files we receive contain cases; there can be up to 2000 cases per txt file. Our script parses out each case, then runs another parse on the extracted case for extra values, then creates an HTML file and finally inserts everything into a DB.
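
Roughly speaking, the per-file loop looks something like this (a simplified sketch with made-up delimiters and field patterns; our real format is more involved). It reads line by line so the whole file isn't held in memory at once:

#!/usr/bin/perl
use strict;
use warnings;

my $file = shift @ARGV or die "usage: $0 cases.txt\n";
open my $fh, '<', $file or die "Can't open $file: $!";

my @case_lines;
my $in_case = 0;

while (my $line = <$fh>) {
    chomp $line;
    if ($line =~ /^CASE START/) {       # hypothetical start marker
        $in_case    = 1;
        @case_lines = ();
        next;
    }
    if ($line =~ /^CASE END/) {         # hypothetical end marker
        process_case(\@case_lines);
        $in_case = 0;
        next;
    }
    push @case_lines, $line if $in_case;
}
close $fh;

sub process_case {
    my ($lines) = @_;
    # Second parse: pull the extra values out of the case text
    # (placeholder pattern - the real ones match our actual fields).
    my %case;
    for my $l (@$lines) {
        $case{$1} = $2 if $l =~ /^(\w+):\s*(.+)$/;
    }
    # ...then build the HTML and do the DB insert here.
}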

If we run 1 file, it takes around 2 mins, and whilst it runs memory and CPU usage are high. That isn't a problem in itself, but the issue is the volume of txt files we receive: we can get around 2000 a week to process for each client.

At present we are still in setup, so we are processing just for test purposes, but when we go live we could receive 500 a day for each client, so in effect we could have multiple scripts all running at the same time. We are running single-threaded, so each script processes one file at a time.

Does Perl use fewer resources on the server than other languages?

With the possibility of so many scripts running, I need to find the best way to process multiple files at the same time for various clients without placing a burden on the server.
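
I've been looking at Parallel::ForkManager. Would something like this be a sensible way to cap how many files are processed at once? A rough sketch, assuming a process_file() routine like the loop above and a hypothetical incoming/ spool directory:

#!/usr/bin/perl
use strict;
use warnings;
use Parallel::ForkManager;

# Cap the number of simultaneous workers so the server isn't overloaded.
my $max_workers = 4;                   # tune to the server's CPU/RAM
my $pm = Parallel::ForkManager->new($max_workers);

my @files = glob('incoming/*.txt');    # hypothetical spool directory

for my $file (@files) {
    $pm->start and next;    # parent: move on to the next file
    process_file($file);    # child: handle one file, then exit
    $pm->finish;
}
$pm->wait_all_children;

sub process_file {
    my ($file) = @_;
    # per-file parse / HTML / DB work goes here
}

I assume each child would need to open its own DB connection rather than sharing the parent's handle across the fork, but I'd welcome any correction on that.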

Hope this makes sense.

Grant

