How to speed up large data calculations

Wenhai on 7 Jun 2013
Answered: Sagar Zade on 12 Dec 2019
Hi, I'm working on a big array, data, with up to N = 10,000,000 points, together with its time index, t. From it I am trying to form a new 2*N-point array, Intp, by interpolating the original array. I use a Gaussian kernel to select a sub-array (only 100 to 200 data points) for each output sample, as the code below shows. Because the data set is so large, the calculation may take days. At first glance it looks like a convolution, but it is not. If the data were small I could use meshgrid, etc. Right now I am using a for loop, and on my Windows 7 system with MATLAB R2013a and 8 GB of memory it takes 2.5 hours to compute only 1 million points. I am wondering whether I can replace this loop with vector or matrix operations, or use the Parallel Computing Toolbox (PCT). My code is listed below (here I use a sinusoidal signal in place of my raw data). The program needs too much computation time, so I want to increase its speed. Does anybody have a better idea to speed up the code? Thanks a lot for your help.
clear all
N = 1e5;                        % number of samples (the real data has up to 1e7 points)
f = 500;
t = linspace(0,1,N);            % time index
NA = 5e+05;
data = sin(2*pi*f*t);           % sinusoidal stand-in for the raw data
k = 1:2*N;
Intp = 0*k;                     % preallocate the 2*N-point output
for k = 1:2*N
    Comp = abs(t - (k-1)/(2*N));                     % distance from the k-th output location
    idx = find(0.0025 - Comp >= 0);                  % samples within the kernel window
    Intp(k) = sum(data(idx).*exp(-0.00005*Comp(idx).^2));
end
%figure(1)
%plot(Intp)
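For reference, here is a minimal sketch of the vectorized-window idea, under the assumption that t stays uniformly spaced (as linspace produces): the indices of the samples within 0.0025 of each output location can be computed by index arithmetic instead of calling find over the full array on every iteration. The names dt, halfW, tk, iLo, and iHi below are mine for illustration; results may differ from the original loop by at most an edge sample because of floating-point rounding at the window boundaries.

N = 1e5;
f = 500;
t = linspace(0,1,N);            % uniform time grid
data = sin(2*pi*f*t);           % stand-in for the raw data
dt = t(2) - t(1);               % uniform sample spacing, 1/(N-1)
halfW = 0.0025;                 % kernel half-width used in the original loop
Intp = zeros(1, 2*N);           % preallocate the output
for k = 1:2*N
    tk  = (k-1)/(2*N);                          % k-th output location
    iLo = max(1, ceil((tk - halfW)/dt) + 1);    % first sample with |t - tk| <= halfW
    iHi = min(N, floor((tk + halfW)/dt) + 1);   % last sample in the window
    Comp = t(iLo:iHi) - tk;                     % offsets inside the window
    Intp(k) = sum(data(iLo:iHi) .* exp(-0.00005*Comp.^2));
end

If the Parallel Computing Toolbox is available, the for in this sketch could also be changed to parfor, since each Intp(k) is computed independently of the others.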

Answers (1)

Sagar Zade on 12 Dec 2019
