Fredrik Kronhamn
2008-05-19 14:01:03 UTC
I am using a fairly simple m-file to process numerical data
from a 200 MB+ (3M lines) text file. Values (text) are read
from a source file, some calculations are done, and the
results are written to a target text file. I have been
experimenting with different alternatives for reading the
in-data, and found a quite comfortable and speedy (20000
lines/sec) solution using the textscan function.
Unfortunately, writing the out-data is utterly slow
regardless of which function I use (fprintf, dlmwrite,
etc.). Process Monitor (SysInternals) reveals that MATLAB is
writing chunks that are only between 9 and 13 bytes long,
which is disastrous for the performance of my script. The
input file is, however, read in 512-byte chunks. Does anyone
know a way of boosting file I/O performance? I am using
MATLAB 7.1.
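For reference, a minimal sketch of the batching idea I am considering: build many output lines into one string with sprintf and write them in a single call, so the OS sees large writes instead of 9-13 byte chunks. File names, the three-column format, and the calculation are placeholders, not my real data.

    % Sketch: process the input in blocks and write each block in one call.
    fin  = fopen('indata.txt',  'r');
    fout = fopen('outdata.txt', 'w');

    blockSize = 10000;                       % lines per read/write batch
    while ~feof(fin)
        % read a block of three numeric columns (adjust format to the data)
        C = textscan(fin, '%f %f %f', blockSize);
        x = C{1}; y = C{2}; z = C{3};
        if isempty(x), break; end

        result = x + y .* z;                 % stand-in for the real calculation

        % build the whole block as one string, then write it at once
        buf = sprintf('%.6f\n', result);
        fwrite(fout, buf, 'char');           % or: fprintf(fout, '%s', buf)
    end

    fclose(fin);
    fclose(fout);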