Effective way to estimate the big O runtime?
To estimate the big-O runtime of MATLAB's find function, I simply plotted its runtimes for arrays of various lengths n, then compared the slope of the plot against known big-O growth rates. This method worked for MATLAB's sort function, whose runtime I estimated to be O(n*log(n)). For the find function, however, it is considerably harder to get a clean estimate this way, so I was wondering whether there is a more effective approach.
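The timing-and-slope approach described above can be sketched as follows. This is only an illustration of the idea, not code from the thread: the array sizes are arbitrary, and it assumes `timeit` is available (it averages several runs internally). Fitting a line to the log-log data gives the exponent b in t ~ n^b, so a slope near 1 suggests O(n) growth.

```matlab
% Sketch: time find() over a range of array lengths and estimate the
% growth exponent from the log-log slope (sizes chosen arbitrarily).
ns = round(logspace(4, 7, 10));   % array lengths to test
t  = zeros(size(ns));
for k = 1:numel(ns)
    x = rand(1, ns(k)) > 0.5;     % logical input for find()
    f = @() find(x);
    t(k) = timeit(f);             % timeit averages multiple runs
end
p = polyfit(log(ns), log(t), 1);  % least-squares fit: log t = b*log n + c
fprintf('Estimated exponent b: %.2f\n', p(1));
loglog(ns, t, 'o-');
xlabel('n'); ylabel('elapsed time (s)');
```

A fitted exponent close to 1 would be consistent with find scanning the array once, which is why comparing slopes on a log-log plot is usually easier than eyeballing curves on linear axes.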
Accepted Answer
More Answers (1)
Youssef Khmou
on 6 May 2013
Edited: Youssef Khmou
on 6 May 2013
0 votes
hi,
The computational complexity can be worked out with pencil and paper by following a few counting rules, for example: https://sites.google.com/site/youssefkhmou/flops
For the "real", measured complexity, however, you have to use the functions 'tic' and 'toc' to record the elapsed time of the code. Run it several times and take the average, because each time you run the code other processes may be running and will influence the elapsed time. To see, for example, O(n*log(n)) behavior, use 'tic' and 'toc' for each case (matrices of size 4x4, 16x16, 500x500, ...), plot the saved elapsed times as a function of size, and you will be able to deduce the O.
Note: there are other, more sophisticated ways to estimate the O.
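The tic/toc averaging procedure described in this answer might look like the sketch below. The matrix sizes, repetition count, and the `find(A > 0.5)` operation being timed are all illustrative choices, not taken from the thread:

```matlab
% Sketch: average tic/toc timings over several repetitions to smooth out
% interference from other processes, then plot time against size.
sizes = [4 16 64 256 500 1000];    % n for n-by-n test matrices
reps  = 20;                        % repetitions to average over
avgT  = zeros(size(sizes));
for i = 1:numel(sizes)
    A = rand(sizes(i));            % square matrix, as in the examples
    total = 0;
    for r = 1:reps
        tic;
        idx = find(A > 0.5);       % the operation being measured
        total = total + toc;
    end
    avgT(i) = total / reps;
end
plot(sizes, avgT, 'o-');
xlabel('matrix size n (n-by-n)'); ylabel('average elapsed time (s)');
```

Averaging over repetitions matters because a single tic/toc measurement can be dominated by whatever else the machine is doing at that moment; the mean of many runs is a much steadier estimate.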