Using large .daq files

Chris on 4 Mar 2011
How do I figure out how many data points are in a large .daq file if the file is too large to open at once? My files range from 2 GB to 28 GB. I can only read a portion of them at a time by using:
daqread('Myfile.daq', 'Samples', [1 10000000]);
I want to loop through the large file in increments, but I don't know when to terminate the loop. Without knowing how many data points there are, my loop will inevitably attempt to access a point that does not exist.
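To make the goal concrete: if the total number of samples (N below, which is exactly the value I cannot obtain) were known in advance, terminating the loop would be trivial:
% N = total number of samples in the file -- unknown, and exactly what I am after
ChunkSize = 1e7;
for k = 0:ceil(N/ChunkSize) - 1
    first = k*ChunkSize + 1;
    last  = min((k+1)*ChunkSize, N);
    Data  = daqread('Myfile.daq', 'Samples', [first last]);
    % ... process Data ...
end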

Accepted Answer

Chris on 7 Mar 2011
It turns out that daqread() will accept sample ranges larger than what actually exists in the file being read, so my solution for processing the data is the following:
DataChunkSize = 1e7;                 % samples requested per read
ChunksProcessed = 0;
while DataChunkSize == 1e7           % a short read means the end of the file was reached
    Data = daqread('Myfile.daq', 'Samples', ...
        [DataChunkSize*ChunksProcessed + 1, ...
         DataChunkSize*(ChunksProcessed + 1)]);
    DataChunkSize = numel(Data);     % samples actually returned (single-channel data)
    %%%% My data processing
    ChunksProcessed = ChunksProcessed + 1;
end
This works fine, except that the last iteration causes a warning to be printed in the command window. It is slightly frustrating that the warning message tells you how many "samples" are in the .daq file, yet there seems to be no direct way to get this information from MATLAB. Oh well. At least the code can run to completion. Thanks for your help, Walter.
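Since the warning text apparently contains the count, one possible workaround (untested -- it assumes the count shows up as the first number in the warning message) would be to grab the text with lastwarn and parse it out:
warnMsg = lastwarn;                          % text of the warning daqread just raised
counts  = regexp(warnMsg, '\d+', 'match');   % every run of digits in the message
if ~isempty(counts)
    TotalSamples = str2double(counts{1});    % assumes the first number is the sample count
end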

More Answers (1)

Walter Roberson on 4 Mar 2011
When I look at the documentation, I do not see any easy way to calculate or display the number of samples available. I would have expected daqread('Myfile.daq','info') to return information such as that, but it does not appear to.
What happens when you attempt to read a sample that does not exist?
If need be, you could put the daqread() in a try/catch block -- though narrowing in on the actual number of samples might possibly be a nuisance.
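For what it's worth, here is a rough sketch of that try/catch idea, bisecting on the highest sample index that can actually be read. It assumes an out-of-range request either errors or comes back empty, which is only a guess about daqread's behavior:
lo = 1;
hi = 2^40;                               % upper bound larger than any expected file
while lo < hi
    mid = ceil((lo + hi)/2);
    try
        d  = daqread('Myfile.daq', 'Samples', [mid mid]);
        ok = ~isempty(d);
    catch
        ok = false;
    end
    if ok
        lo = mid;                        % sample mid exists, search higher
    else
        hi = mid - 1;                    % sample mid does not exist, search lower
    end
end
NumSamples = lo;                         % last index that could be read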
