Oct 26, 2012, 3:37 AM
Post #3 of 4
Many thanks for your reply
The txt files we receive contain cases; there could be up to 2000 cases per txt file. Our script parses out each case, then runs another parse on the extracted case for extra values, then creates an HTML file and finally inserts everything into a DB.
If we run one file it takes around 2 minutes, and while it runs, memory and CPU usage are high. That isn't a problem in itself, but the issue is the volume of txt files we receive: we can get around 2000 a week to process for each client.
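For reference, the case-splitting step could be done in a streaming way so memory stays low even on big files. This is just a sketch of the idea in Python (our real script is Perl, and the "CASE" marker below is a hypothetical stand-in for whatever delimiter the files actually use):

```python
def iter_cases(path, marker="CASE"):
    """Yield one case at a time instead of slurping the whole file.

    Assumes each case begins with a line starting with `marker`
    (a hypothetical format, not the real one).
    """
    case = []
    with open(path) as fh:
        for line in fh:
            # A new marker line closes the previous case, if any.
            if line.startswith(marker) and case:
                yield "".join(case)
                case = []
            case.append(line)
    # Flush the final case at end of file.
    if case:
        yield "".join(case)
```

Because it yields cases one by one, only a single case is ever held in memory, however many thousand cases the file contains.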
At present we are still in setup, so we are processing for test purposes only, but when we go live we could receive 500 a day for each client, so in effect we could have multiple scripts all running at the same time. The script is single-threaded, so it processes one file at a time.
Does Perl use fewer resources on the server than other languages?
With the possibility of so many scripts running, I need to find the best way to process multiple files at the same time for various clients without placing too heavy a burden on the server.
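The kind of thing I have in mind is a fixed-size worker pool: many files can be in flight, but the number running at once is capped, so the server load stays bounded no matter how many files arrive. A minimal sketch in Python (the real script is Perl; the 4-worker cap and blank-line case delimiter are placeholder assumptions):

```python
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

def process_file(path):
    # Stand-in for the real pipeline: split the file into cases,
    # parse each case, build the HTML, insert into the DB.
    text = Path(path).read_text()
    # Hypothetical blank-line delimiter between cases.
    cases = [c for c in text.split("\n\n") if c.strip()]
    return str(path), len(cases)

def process_all(paths, max_workers=4):
    # Cap concurrency: never more than max_workers files being
    # processed at once, however long the incoming queue is.
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        return dict(pool.map(process_file, paths))
```

For CPU-heavy Perl work, the equivalent would be a fixed number of forked child processes rather than threads (e.g. the Parallel::ForkManager module takes a maximum-children count and blocks new forks until a slot frees up).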
Hope this makes sense.