ADM

RAM Model of Computation

  • stands for Random Access Machine
  • the RAM model assumes every basic operation (addition, multiplication, comparison) takes constant time
  • memory access takes constant time, regardless of where data is located
  • while these assumptions are simplifications, the model is a practical abstraction for predicting how an algorithm's performance scales with input size
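As a sketch of how the RAM model is applied, the hypothetical function below tallies each basic operation while summing a list; since every operation is charged one unit of cost, the total grows linearly with input size.

```python
def sum_with_op_count(values):
    """Sum a list while tallying RAM-model basic operations.

    Each assignment, addition, and memory access is charged one
    unit of cost, per the RAM model's constant-time assumption.
    """
    total = 0          # 1 assignment
    ops = 1
    for v in values:   # 1 memory access + 1 loop step per element
        total += v     # 1 addition per element
        ops += 3
    return total, ops

# Doubling the input roughly doubles the operation count: linear growth.
_, small = sum_with_op_count(list(range(100)))
_, large = sum_with_op_count(list(range(200)))
```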

Big O Notation

  • Big O: Describes the upper bound of an algorithm's growth rate
    • it is most often used to characterize worst-case performance
    • it ignores constant factors and lower-order terms
  • Ω (Omega): Represents the lower bound of performance
  • Θ (Theta): Describes a tight bound, where the growth rate is bounded both above and below by the same function (i.e. the algorithm is both O(g(n)) and Ω(g(n)))
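These bounds have precise definitions: f(n) = O(g(n)) means there exist constants c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀. The sketch below checks this numerically for an example function f(n) = 3n + 10 with witness constants chosen here (c = 4, n₀ = 10).

```python
# f(n) = 3n + 10 is O(n): pick witnesses c = 4 and n0 = 10.
# Then f(n) <= c * n for every n >= n0, since 3n + 10 <= 4n iff n >= 10.
def f(n):
    return 3 * n + 10

c, n0 = 4, 10

# Spot-check the definition over a finite range of n >= n0.
assert all(f(n) <= c * n for n in range(n0, 10_000))
```

The constant factor 3 and the lower-order term 10 disappear into the choice of c and n₀, which is exactly why Big O ignores them.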

Types of Algorithm Analysis

  • Best Case Complexity

    • The minimum time required for an algorithm
    • e.g. if the first element is the target in a search
  • Worst Case Complexity

    • The maximum number of steps an algorithm might take
    • Acts as a guarantee on performance
  • Average Case Complexity

    • Expected time over all possible inputs
    • Often harder to compute accurately
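Linear search makes the three cases concrete. In this sketch, the best case is one comparison (target is first), the worst case is n comparisons (target is last or absent), and the average over all positions is about n/2.

```python
def linear_search(items, target):
    """Scan for target; return (index, comparisons), index -1 if absent."""
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons

data = list(range(1, 101))           # [1, 2, ..., 100]
_, best = linear_search(data, 1)     # best case: 1 comparison
_, worst = linear_search(data, 100)  # worst case: 100 comparisons
# Average over all possible targets: (1 + 2 + ... + 100) / 100 = 50.5
avg = sum(linear_search(data, t)[1] for t in data) / len(data)
```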

Growth Rates and Dominance Relations

  • The efficiency of algorithms depends heavily on growth rates
  • Growth rates refer to how quickly the runtime or resource usage of an algorithm increases with the size of an input
  • Dominance relations help us compare two algorithms and predict which will perform better as the input size grows

Common Types of Growth Rates

| Order | Big O Notation | Meaning | Example |
| --- | --- | --- | --- |
| Constant Time | O(1) | Time is independent of input size | Accessing an element in an array |
| Logarithmic Time | O(log n) | Time grows by a constant each time the input size doubles | Binary search |
| Linear Time | O(n) | Time grows directly with input size | Iterating through an array |
| Log-Linear Time | O(n log n) | Common in sorting; faster than quadratic | Merge sort, Heap sort |
| Quadratic Time | O(n²) | Time grows with the square of input size | Bubble sort with nested loops |
| Exponential Time | O(2ⁿ) | Time doubles for each additional input element | Recursive Fibonacci |
| Factorial Time | O(n!) | Time grows with the factorial of input size | Traveling Salesman Problem |
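To make the gap between O(n) and O(log n) concrete, this sketch counts the probes binary search makes on a sorted list of a million elements: at most about lg(1,000,000) ≈ 20, versus up to a million comparisons for a linear scan of the same list.

```python
def binary_search(sorted_items, target):
    """Halve the search range each step; return (index, probes), index -1 if absent."""
    lo, hi, probes = 0, len(sorted_items) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        probes += 1
        if sorted_items[mid] == target:
            return mid, probes
        elif sorted_items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, probes

data = list(range(1_000_000))
# Even near the worst case, the probe count stays at or below ~lg(n).
_, probes = binary_search(data, 999_999)
```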

Growth Rates Measured in Nanoseconds

| n | lg n | n | n lg n | n² | 2ⁿ | n! |
| --- | --- | --- | --- | --- | --- | --- |
| 10 | 0.003 μs | 0.01 μs | 0.033 μs | 0.1 μs | 1 μs | 3.63 ms |
| 20 | 0.004 μs | 0.02 μs | 0.086 μs | 0.4 μs | 1 ms | 77.1 years |
| 30 | 0.005 μs | 0.03 μs | 0.147 μs | 0.9 μs | 1 sec | 8.4 × 10¹⁵ yrs |
| 40 | 0.005 μs | 0.04 μs | 0.213 μs | 1.6 μs | 18.3 min | - |
| 50 | 0.006 μs | 0.05 μs | 0.282 μs | 2.5 μs | 13 days | - |
| 100 | 0.007 μs | 0.1 μs | 0.644 μs | 10 μs | 4 × 10¹³ yrs | - |
| 1,000 | 0.010 μs | 1.00 μs | 9.66 μs | 1 ms | - | - |
| 10,000 | 0.013 μs | 10 μs | 130 μs | 100 ms | - | - |
| 100,000 | 0.017 μs | 0.10 ms | 1.67 ms | 10 sec | - | - |
| 1,000,000 | 0.020 μs | 1 ms | 19.93 ms | 16.7 min | - | - |
| 10,000,000 | 0.023 μs | 10 ms | 0.23 sec | 11.6 days | - | - |
| 100,000,000 | 0.027 μs | 0.10 sec | 2.66 sec | 115.7 days | - | - |
| 1,000,000,000 | 0.030 μs | 1 sec | 29.90 sec | 31.7 years | - | - |
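The entries follow from assuming one nanosecond per operation; the sketch below reproduces a couple of them (n² at n = 10 gives 100 ns = 0.1 μs, and n lg n at n = 1,000,000 gives about 19.93 ms).

```python
import math

NS_PER_OP = 1  # assumed cost of one basic operation, in nanoseconds

def runtime_us(ops):
    """Convert an operation count to microseconds at 1 ns per operation."""
    return ops * NS_PER_OP / 1_000

# n^2 at n = 10: 100 operations -> 0.1 μs, matching the table.
quadratic_us = runtime_us(10 ** 2)

# n lg n at n = 1,000,000: ~19.93 ms, matching the table.
n = 1_000_000
loglinear_ms = n * math.log2(n) * NS_PER_OP / 1_000_000
```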

Dominance Relations for Common Growth Rates

n! ≫ 2ⁿ ≫ n³ ≫ n² ≫ n log n ≫ n ≫ log n ≫ 1
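This ordering can be checked numerically: even at a modest n, each function in the chain dwarfs the next, as this sketch verifies at n = 30.

```python
import math

n = 30
growth = [
    math.factorial(n),  # n!
    2 ** n,             # 2^n
    n ** 3,             # n^3
    n ** 2,             # n^2
    n * math.log2(n),   # n log n
    n,                  # n
    math.log2(n),       # log n
    1,                  # constant
]

# Each rate strictly dominates the next one in the chain at n = 30.
assert all(a > b for a, b in zip(growth, growth[1:]))
```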

Take-Home Lessons

  • Big O notation simplifies analysis

    • it provides a way to ignore constant factors and focus on how algorithms scale
  • Worst-case analysis is crucial

    • while best-case scenarios are ideal, real-world usage demands reliability even in unfavorable cases
  • Dominance relations

    • algorithms with lower growth rates dominate those with higher rates as input size grows
  • Trade-offs in algorithm choice

    • an algorithm with a better growth rate may be more complex to implement
    • choosing the right algorithm involves balancing implementation complexity and efficiency
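As one illustration of this trade-off, the sketch below compares a few-line insertion sort (O(n²)) against a longer merge sort (O(n log n)) by counting element comparisons on a reversed list; the simpler algorithm does far more work as n grows, though it can still win on very small inputs.

```python
def insertion_sort(items):
    """Simple O(n^2) sort; return (sorted_copy, comparison_count)."""
    a, comps = list(items), 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comps += 1
            if a[j - 1] <= a[j]:
                break
            a[j - 1], a[j] = a[j], a[j - 1]
            j -= 1
    return a, comps

def merge_sort(items):
    """O(n log n) sort with more moving parts; return (sorted_copy, comparison_count)."""
    if len(items) <= 1:
        return list(items), 0
    mid = len(items) // 2
    left, lc = merge_sort(items[:mid])
    right, rc = merge_sort(items[mid:])
    merged, comps = [], lc + rc
    i = j = 0
    while i < len(left) and j < len(right):
        comps += 1
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged, comps

data = list(range(200, 0, -1))   # reversed: worst case for insertion sort
_, quad = insertion_sort(data)   # ~n^2/2 comparisons
_, loglin = merge_sort(data)     # ~n lg n comparisons
```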