Algorithm Analysis

📐 Algorithm complexity

Time complexity

Time complexity describes how the running time of an algorithm grows as a function of its input size. It is most often stated as an upper bound (Big O) on the running time.

Space complexity

Space complexity specifies the total amount of memory required to execute an algorithm as a function of the size of the input.
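A minimal sketch of both ideas (illustrative code, not from the original): summing a list takes time proportional to its length but only constant extra memory.

```python
def total(numbers):
    """O(N) time: one pass over the input.
    O(1) extra space: a single accumulator, regardless of input size."""
    result = 0
    for n in numbers:  # N iterations -> linear time
        result += n    # one variable of extra memory -> constant space
    return result
```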

📈 Order Of Growth

Classifications

| order of growth | name         | description        | example           |
|-----------------|--------------|--------------------|-------------------|
| 1               | Constant     | statement          | add two numbers   |
| log N           | Logarithmic  | divide in half     | binary search     |
| N               | Linear       | loop               | find the maximum  |
| N log N         | Linearithmic | divide and conquer | mergesort         |
| N^2             | Quadratic    | double loop        | check all pairs   |
| N^3             | Cubic        | triple loop        | check all triples |
| 2^N             | Exponential  | exhaustive search  | check all subsets |
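Two rows of the table, sketched in Python as illustration (these helpers are assumptions, not part of the original):

```python
def binary_search(sorted_items, target):
    """The 'log N' row: halves the search range on each step."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        if sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

def has_duplicate_pair(items):
    """The 'N^2' row: a double loop that checks all pairs."""
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False
```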

Time Complexity Chart

Simplification

  1. Drop constants. Constant factors do not affect the algorithm’s growth rate relative to its input size.
  2. Drop lower-order terms. Focus on the dominant term, which contributes the most to the growth rate.
  3. Use summation for sequential operations. Operations performed one after another, each with its own time complexity, are summed.
  4. Drop non-dominant terms in sums. After summing, keep only the term that dominates the growth rate.
  5. Use multiplication for nested loops. The complexities of nested loops are multiplied.
  6. Consider the worst-case scenario. Big O notation describes the worst case, so simplify based on worst-case analysis.
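The summation and simplification rules above can be sketched on a single (hypothetical) function:

```python
def analyze(items):
    """Three sequential phases: O(N) + O(N) + O(N^2).
    Dropping constants and non-dominant terms leaves O(N^2)."""
    total = sum(items)                        # O(N): one pass
    maximum = max(items) if items else None   # O(N): another pass
    ordered_pairs = 0
    for a in items:                           # O(N^2): double loop dominates
        for b in items:
            if a < b:
                ordered_pairs += 1
    return total, maximum, ordered_pairs
```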

🔭 Asymptotic Notations

There are mathematical notations used to describe an algorithm’s performance in relation to the amount of input data:

  • Big O - upper bound on cost (the algorithm grows no faster than this; commonly used for the worst case)
  • Big Ω (Omega) - lower bound on cost (the algorithm grows at least this fast; commonly used for the best case)
  • Big Θ (Theta) - tight bound on cost (bounded above and below by the same growth rate; not the same thing as the average case)

Big O

Big O notation represents an algorithm’s worst-case complexity. It describes how the performance of the algorithm changes as the input size grows.
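A small illustration of the worst case (an assumed example, not from the original): linear search is O(N) because the target may sit at the end of the list, or not be present at all, forcing a full scan.

```python
def linear_search(items, target):
    """Worst case O(N): target at the end, or absent, scans every element."""
    for index, item in enumerate(items):
        if item == target:
            return index
    return -1
```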
