Big O notation is a mathematical way to describe the efficiency of an algorithm in terms of time complexity (how fast it runs) and space complexity (how much memory it uses). It helps us understand how an algorithm scales as the input size increases.
Why Big O is needed
- Helps compare different algorithms for efficiency.
- Helps in selecting the best algorithm for a problem.
- Important in system performance optimization.
Time Complexity
Time complexity is a measure of how the execution time of an algorithm increases as the input size grows. It tells us how efficiently an algorithm performs when handling large amounts of data.
The time complexity of an algorithm depends on several factors:
- Number of operations: The more operations an algorithm performs, the longer it takes to execute.
- Input size: If an algorithm processes more data, it generally takes more time. Some algorithms scale well with larger inputs, while others slow down significantly.
- Loops and recursion: Algorithms with nested loops or deep recursion tend to have higher time complexity because they perform more computations (see the sketch after this list).
- Type of operations: Basic operations like addition and multiplication are fast, but expensive operations like sorting or matrix multiplication can increase time complexity.
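As a minimal sketch of how loop structure drives the operation count (the function names sum_array and sum_all_pairs are illustrative, not part of any library), compare a single loop with a nested loop over the same input size n:
long sum_array(const int arr[], int n) {
    // Roughly n basic operations: time grows linearly with the input size.
    long total = 0;
    for (int i = 0; i < n; i++)
        total += arr[i];
    return total;
}
long sum_all_pairs(const int arr[], int n) {
    // Roughly n * n basic operations: time grows quadratically with the input size.
    long total = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            total += (long)arr[i] + arr[j];
    return total;
}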
Effects
Time complexity directly affects the performance of a program. If an algorithm has a high time complexity, it will take longer to execute as the input size grows. In real-world applications, this can lead to slow performance, increased processing power consumption, and delays in system response time. Optimizing time complexity is crucial for handling large-scale data efficiently, especially in applications like databases, search engines, and real-time systems.
Space Complexity
Space complexity is a measure of how much memory an algorithm uses as the input size increases. It considers both the space needed for storing input data and any additional memory required for temporary variables, recursion stacks, or data structures.
The space complexity of an algorithm depends on:
- Input storage: If an algorithm requires storing large amounts of input, it will have higher space requirements.
- Auxiliary space: Additional memory used for temporary variables, arrays, or data structures affects space complexity (see the sketch after this list).
- Recursion depth: Recursive algorithms consume stack memory for each function call. If recursion depth is high, space usage also increases.
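As a minimal sketch (the function names reverse_in_place and reversed_copy are illustrative), compare an in-place array reversal, which needs only O(1) auxiliary space, with a copy-based reversal, which allocates O(n) extra memory:
#include <stdlib.h>

// O(1) auxiliary space: reverses the array in place using a single temporary variable.
void reverse_in_place(int arr[], int n) {
    for (int i = 0, j = n - 1; i < j; i++, j--) {
        int tmp = arr[i];
        arr[i] = arr[j];
        arr[j] = tmp;
    }
}

// O(n) auxiliary space: allocates a second array of the same size for the result.
int *reversed_copy(const int arr[], int n) {
    int *out = malloc(n * sizeof *out);
    if (out == NULL) return NULL;
    for (int i = 0; i < n; i++)
        out[i] = arr[n - 1 - i];
    return out; // caller is responsible for calling free()
}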
Effects
Space complexity impacts a system by determining how much memory is needed to run a program efficiently. If an algorithm has a high space complexity, it may cause memory shortages, slow performance, or even system crashes in memory-constrained environments. Optimizing space complexity is especially important in embedded systems, mobile applications, and cloud computing, where memory resources are limited.
Common Big O Notations
O(1) – Constant Time
- The execution time does not change with input size.
- Example: Accessing an element in an array by index.
- Example Code (C):
int arr[] = {1, 2, 3, 4, 5};
int x = arr[2]; // Always takes the same time
O(n) – Linear Time
- The execution time increases directly with input size.
- Example: Linear Search.
- Example Code (C):
int linearSearch(int arr[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (arr[i] == key) return i;
    }
    return -1;
}
O(log n) – Logarithmic Time
- The execution time grows slowly as the input grows, because the search space is halved at each step.
- Example: Binary Search (requires a sorted array).
- Example Code (C):
int binarySearch(int arr[], int left, int right, int key) {
    while (left <= right) {
        int mid = left + (right - left) / 2; // avoids overflow of (left + right)
        if (arr[mid] == key) return mid;
        if (arr[mid] < key) left = mid + 1;  // search the right half
        else right = mid - 1;                // search the left half
    }
    return -1;
}
O(n log n) – Linearithmic Time
- Common in efficient sorting algorithms like Merge Sort and (on average) Quick Sort.
- Example Code (C) – Merge Sort:
void merge(int arr[], int left, int mid, int right) {
    int i, j, k;
    int n1 = mid - left + 1;
    int n2 = right - mid;
    int L[n1], R[n2]; // temporary arrays for the two halves

    for (i = 0; i < n1; i++) L[i] = arr[left + i];
    for (j = 0; j < n2; j++) R[j] = arr[mid + 1 + j];

    // Merge the two sorted halves back into arr[left..right].
    i = 0; j = 0; k = left;
    while (i < n1 && j < n2) {
        if (L[i] <= R[j]) arr[k++] = L[i++];
        else arr[k++] = R[j++];
    }
    while (i < n1) arr[k++] = L[i++]; // copy any leftovers from L
    while (j < n2) arr[k++] = R[j++]; // copy any leftovers from R
}

void mergeSort(int arr[], int left, int right) {
    if (left < right) {
        int mid = left + (right - left) / 2;
        mergeSort(arr, left, mid);      // sort the left half
        mergeSort(arr, mid + 1, right); // sort the right half
        merge(arr, left, mid, right);   // merge the two sorted halves
    }
}
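A short usage sketch (the sample values are illustrative): to sort an entire array of n elements, call mergeSort over the full index range.
int data[] = {5, 2, 9, 1, 7};
int n = sizeof(data) / sizeof(data[0]);
mergeSort(data, 0, n - 1); // data is now {1, 2, 5, 7, 9}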
O(n²) – Quadratic Time
- The execution time grows quadratically as input size increases.
- Example: Bubble Sort, Selection Sort.
- Example Code (C) – Bubble Sort:
void bubbleSort(int arr[], int n) {
    for (int i = 0; i < n - 1; i++) {
        for (int j = 0; j < n - i - 1; j++) {
            if (arr[j] > arr[j + 1]) {
                // Swap adjacent elements that are out of order.
                int temp = arr[j];
                arr[j] = arr[j + 1];
                arr[j + 1] = temp;
            }
        }
    }
}
O(2ⁿ) – Exponential Time
- The execution time roughly doubles with each unit increase in input size.
- Example: Recursive Fibonacci.
- Example Code (C):
int fibonacci(int n) {
    if (n <= 1) return n;
    return fibonacci(n - 1) + fibonacci(n - 2);
}
O(n!) – Factorial Time
- The execution time grows extremely fast; common in brute-force algorithms like the Traveling Salesman Problem (TSP).
- Example: Generating all permutations of a set (a sketch follows below).
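A minimal sketch of generating all permutations by recursive swapping (the function names permute and swap are illustrative); for n elements it prints n! arrangements, which is why the running time grows factorially:
#include <stdio.h>

// Swap two integers through pointers.
static void swap(int *a, int *b) {
    int tmp = *a;
    *a = *b;
    *b = tmp;
}

// Print every permutation of arr[start..n-1].
void permute(int arr[], int start, int n) {
    if (start == n - 1) {
        for (int i = 0; i < n; i++) printf("%d ", arr[i]);
        printf("\n");
        return;
    }
    for (int i = start; i < n; i++) {
        swap(&arr[start], &arr[i]); // choose element i for position start
        permute(arr, start + 1, n); // permute the remaining positions
        swap(&arr[start], &arr[i]); // undo the choice (backtrack)
    }
}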
Time vs Space Complexity
Let's take a few examples to understand time and space complexity in more depth.
Iterative Fibonacci (Lower Time and Space Complexity)
int fibonacci_iterative(int n) {
    if (n <= 1) return n;
    int a = 0, b = 1, c;
    for (int i = 2; i <= n; i++) {
        c = a + b;
        a = b;
        b = c;
    }
    return b;
}
Time Complexity: O(n) (One loop runs n times)
Space Complexity: O(1) (Uses only three integer variables)
Recursive Fibonacci (Higher Time and Space Complexity)
int fibonacci_recursive(int n) {
    if (n <= 1) return n;
    return fibonacci_recursive(n - 1) + fibonacci_recursive(n - 2);
}
Time Complexity: O(2ⁿ) (Each call makes two recursive calls, leading to exponential growth)
Space Complexity: O(n) (Recursive calls consume stack space proportional to n)
| Aspect | Time Complexity | Space Complexity |
| --- | --- | --- |
| Measures | Execution time | Memory usage |
| Depends on | Number of iterations | Extra variables, recursion stack |
| Goal | Minimize execution time | Minimize memory usage |