Oct 1, 2007, 12:57 PM
Post #1 of 2
Writing into a file at once (not record by record)
We use Perl to fetch records from an Oracle table, validate each column in each record, and build a hash. This hash now contains more than 100K records.
We then write each row to a file one at a time, and this takes more than 3 hours. Is there a way in Perl to write 10K records to the file in one shot, and keep writing in chunks of 10K, so that there are fewer I/O operations and the files are created more quickly?
If so, could you please shed some light on this?
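One way to sketch the chunked approach (a minimal example, not the original script: the hash contents, file name, and chunk size here are all placeholders): accumulate formatted lines in an array and emit them with a single print per chunk. Note that Perl's file I/O is already buffered, so per-record print is often not the real bottleneck; it may be worth profiling the validation step as well.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical data standing in for the validated Oracle rows.
my %records = map { sprintf("row%06d", $_) => "col1,col2,col3" } 1 .. 100_000;

my $chunk_size = 10_000;          # write in chunks of 10K records
open my $fh, '>', 'output.txt' or die "Cannot open output.txt: $!";

my @buffer;
for my $key (sort keys %records) {
    push @buffer, "$key,$records{$key}\n";
    if (@buffer >= $chunk_size) {
        print {$fh} join('', @buffer);    # one large write per chunk
        @buffer = ();
    }
}
print {$fh} join('', @buffer) if @buffer; # flush any remainder
close $fh or die "Cannot close output.txt: $!";
```

If the writes really are the slow part, joining 10K lines into one string before printing reduces the number of calls into the I/O layer; otherwise the time is probably going into the per-column validation, which buffered writes will not fix.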