It really is true that if algorithm A is o(algorithm B), then for large problem sizes A will take much less time than B.
Definition: If (the number of operations in) algorithm A is o(algorithm B), we call A asymptotically faster than B.
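In limit form, the standard little-oh definition reads (assuming g(n) > 0 for all large n):

\[
  f(n) = o(g(n)) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0
\]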
Example: The following sequence of functions is ordered by growth rate, i.e., each function is little-oh of the next one.
log(log(n)), log(n), (log(n))^2, n^(1/3), n^(1/2), n, n·log(n), n^2/log(n), n^2, n^3, 2^n, 4^n, n!
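As a quick numeric sanity check, here is a minimal Python sketch (the pairs and sample sizes chosen here are illustrative, not from the original post): if f is o(g), the ratio f(n)/g(n) should shrink toward 0 as n grows.

import math

# Adjacent pairs from the sequence above (exponential and factorial terms
# are omitted so plain floats suffice). If f is o(g), f(n)/g(n) -> 0.
pairs = [
    ("log(n)",   math.log,                  "n^(1/2)", lambda n: n ** 0.5),
    ("n^(1/2)",  lambda n: n ** 0.5,        "n",       lambda n: n),
    ("n·log(n)", lambda n: n * math.log(n), "n^2",     lambda n: n ** 2),
    ("n^2",      lambda n: n ** 2,          "n^3",     lambda n: n ** 3),
]

for name_f, f, name_g, g in pairs:
    # Evaluate the ratio at increasingly large n; each row should decrease.
    ratios = [f(n) / g(n) for n in (10 ** 2, 10 ** 4, 10 ** 6)]
    print(f"{name_f} / {name_g}:", " -> ".join(f"{r:.2e}" for r in ratios))

Each printed row decreases toward 0, which is exactly what the little-oh relationship between consecutive functions predicts.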
One way to compare two functions is to use L'Hospital's rule. Here is a good explanation:
Growth Rates of Functions and L'Hospital's Rule
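For example, applying the rule twice (differentiating numerator and denominator, treating n as a continuous variable) shows that n^2 = o(2^n):

\[
  \lim_{n \to \infty} \frac{n^2}{2^n}
  = \lim_{n \to \infty} \frac{2n}{2^n \ln 2}
  = \lim_{n \to \infty} \frac{2}{2^n (\ln 2)^2}
  = 0
\]

So n^2 grows strictly slower than 2^n, even though n^2 is larger for small n.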
Here is a snapshot of how the growth rates look, in increasing order:

1 | log(log(n)) | log(n) | n | n·log(n) | n^2 | 2^n | n! | n^n
References:
https://cs.nyu.edu/courses/fall02/V22.0310-002/class-notes.html
http://www.cs.odu.edu/~cs381/cs381content/function/growth.html