Is there a faster way than str2double() to convert a string array into a matrix of doubles?

Hi, I am working with large .txt files that I imported as a string array. A big part of each .txt file contains numeric values that I want to convert to doubles. Since the array is quite large (500,000 x 25), it takes MATLAB a very long time to convert these strings into doubles using str2double(). Is there a faster way to convert a string array into a numeric matrix?

8 Comments

For your case, would it be easier to get the numerical data directly from the text file? Did you try importdata()?
Share the code you've actually tried, or at least a reasonable mock-up of it. As Fangjun mentioned, str2double might not be your problem. We can't know without seeing how you imported the file and how you are trying to convert the strings.
You do not need to import the file as strings; you can import it as doubles directly. The code depends on the file's structure, so please attach your .txt file.
The faster way would be to import the numeric data as numeric data. Upload your file by clicking the paperclip button.
In this case, it was very hard to import the files using importdata(), since the .txt file has a different number of columns for some rows. Using the built-in MATLAB import tool, it works fine. I don't understand why.
However, the reason I did not import the data as doubles directly is that the .txt file also contains some "real" strings (for instance, time indications and names of the measured locations). I want to keep this information, so I imported the file as a string array first and then took all values needed for plotting and converted those. The strings become NaN in the conversion, but I still have the imported files with my strings.
@Bjorn Sauren: MATLAB has more file importing functions than just importdata, and plenty of volunteers here have experience using them. As you have been asked before, please upload the data file by clicking the paperclip button.
  • "import tool, it works fine. I don't understand why" — did you give the GUI a helping hand?
  • 500000 * 25 * 8 / 1e6 = 100 MB, which shouldn't be a problem
  • See Import Large Text File Data in Blocks
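As a rough sketch of the block-wise reading suggested above (the format string, delimiter, and block size are assumptions, since the actual file layout is unknown):

```matlab
% Sketch: read a large numeric text file in blocks with textscan.
% Assumed layout: 25 semicolon-separated numbers per line; adjust to the real file.
fid = fopen('data.txt', 'r');
blockSize = 10000;                         % lines per block (tunable)
fmt = repmat('%f', 1, 25);
allBlocks = {};
while ~feof(fid)
    block = textscan(fid, fmt, blockSize, 'Delimiter', ';', 'CollectOutput', true);
    allBlocks{end+1} = block{1};           %#ok<AGROW> few blocks, so growth is cheap
end
fclose(fid);
Data = vertcat(allBlocks{:});              % combine blocks into one numeric matrix
```

Reading in blocks keeps memory usage bounded while avoiding a per-line parsing loop.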
I reduced the .txt file and gave you the locations of the strings. Actually, the file is much larger.


 Accepted Answer

Importing the data as strings first is a detour. The structure of the file looks easy, so what about using fscanf?
fid = fopen(FileName, 'r');
line1 = fgetl(fid);                   % keep the first two header lines
line2 = fgetl(fid);
fgetl(fid);                           % skip the third line
Head = cell(1e6, 1);                  % pre-allocate generously
Data = cell(1e6, 1);
iData = 0;
while ~feof(fid)
    iData = iData + 1;
    Head{iData} = fscanf(fid, '%s');  % or: strrep(fgetl(fid), ';', '')
    Data{iData} = fscanf(fid, '%g;%g;%g;%g', [4, 25]);
end
Head = Head(1:iData);                 % crop the unused pre-allocated cells
Data = Data(1:iData);
fclose(fid);
Note that text files are useful if they need to be edited or read by a human. Storing 500,000 x 25 numbers in text mode is a really weak design. Storing them in binary format would make the processing much more efficient.
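A minimal sketch of the binary alternative (file name and matrix size are placeholders): write the matrix once with fwrite and read it back with fread, which avoids text parsing entirely.

```matlab
% Sketch: store and reload a numeric matrix in binary instead of text.
M = rand(500000, 25);                 % example data

fid = fopen('data.bin', 'w');
fwrite(fid, size(M), 'double');       % store the dimensions first
fwrite(fid, M, 'double');             % then the matrix, column-major
fclose(fid);

fid = fopen('data.bin', 'r');
dims = fread(fid, 2, 'double');
M2 = fread(fid, [dims(1), dims(2)], 'double');
fclose(fid);

isequal(M, M2)                        % lossless round trip
```

Because fread maps the bytes straight into a double matrix, there is no string parsing at all, which is where the text-based workflow spends its time.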

More Answers (1)

3 Comments

Thank you. This should be the accepted answer, since the OP asks for a faster way of converting strings to doubles, not for a way to make the file reading faster.
You are right. The OP had speed problems and thought that a faster STR2DOUBLE would solve them. But avoiding the need to call STR2DOUBLE at all is even faster.
The FEX submission suffers from some severe conversion problems:
str2doubleq('Inf') % NaN instead of Inf
str2doubleq('.i5') % 5 instead of NaN
str2doubleq('i') % 0 instead of 0 + 1i
str2doubleq('1e1.4') % 0.4 instead of NaN
str2doubleq('--1') % -1 instead of NaN
s = '12345678901234567890';
str2doubleq(s) - str2double(s) % 2048
s = '123.123e40';
str2doubleq(s) - str2double(s) % 1.547e26
str2double('2.236')-str2doubleq('2.236') % is not 0 ('2.235' is fine)
str2double('1,1')-str2doubleq('1,1') % 9.9 instead of 0
isreal(str2doubleq('1')) % 0 instead of 1
Part of the speed-up is based on a missing memory cleanup. This function leaks memory, because it allocates strings with mxArrayToString without freeing them. With large cell arrays this exhausts gigabytes of RAM in seconds, and you have to restart MATLAB to free it.
This tool is fast, but not reliable enough for scientific or production work.


Asked: on 30 Nov 2017

Commented: on 17 Feb 2021
