Big O Notation
Introduction to Big O Notation
Big O notation is a way to describe how the efficiency of an algorithm or program changes as the input size grows. Imagine you're solving a problem, like finding a specific item in a list.
Big O notation helps you understand how the time, processing power, and storage space (memory) required to solve this problem increase as the list gets larger.
It uses a special notation, typically written as O(something), where “something” shows how the algorithm’s resource usage grows in relation to the input size.
The most common complexities are constant, linear, quadratic, and logarithmic.
Big O Time Complexity Visualization
O(1)
O(1) means the time/space it takes to solve the problem doesn't depend on the input size - it doesn't matter how big the input size is, the time is the same. It's super fast and doesn't get slower as the list gets longer.
Example - Ringing the school bell
If a school doubles in size from 200 to 400 pupils, the school still only needs to ring the bell once for each period in the day.
Algorithm Example - Indexed-based array access
Given an index, retrieving an element from an array takes constant time because it directly maps to a memory address.
O(1) Code Example
Algorithms that perform a fixed number of steps, regardless of the problem size, are usually O(1).
def get_first_element(arr):
    return arr[0]
This function returns the first element regardless of input size.
O(n)
O(n) means the time/space it takes increases linearly with the input size. If the list doubles in size, it will take roughly twice as long to find the item.
Everyday example - queuing for the school canteen
If a school doubles in size from 200 to 400 students then the time it takes to serve all the pupils food will double.
Algorithm Example - Linear Search
Finding an element in an unsorted list by checking each element until the target is found or the end of the list is reached.
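The linear search described above can be sketched in Python as follows (the function name and sample list are illustrative):

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if it is not present."""
    for i, value in enumerate(arr):
        if value == target:
            return i  # found the target: stop early
    return -1  # reached the end without finding the target

print(linear_search([4, 2, 7, 1], 7))  # prints 2
```

In the worst case every element is checked once, which is why linear search is O(n).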
O(n) Code Example
Algorithms that run once per item (a single loop or consecutive loops) are linear time.
def print_all_elements(arr):
    for item in arr:
        print(item)
O(n^2)
O(n^2) means the work grows much faster than the input. If the list doubles in size, the algorithm takes roughly four times as long (or uses four times as much space).
Everyday Example - Classroom Introductions
Imagine that each student in a class greets every other student each morning. If the number of students in the class doubled, the number of required greetings would roughly quadruple!
Algorithm Example - Bubble Sort
An elementary sorting algorithm that compares and swaps adjacent elements.
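A minimal bubble sort sketch in Python, matching the description above (the early-exit optimisation is omitted for simplicity):

```python
def bubble_sort(arr):
    """Sort arr in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(arr)
    for i in range(n - 1):          # each pass bubbles the next-largest value into place
        for j in range(n - 1 - i):  # nested loop over the items => O(n^2) comparisons
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr

print(bubble_sort([5, 1, 4, 2]))  # prints [1, 2, 4, 5]
```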
O(n^2) Code Example
An algorithm with a nested loop is usually quadratic time if the iterations of both loops are dependent on the number of items to be processed.
Quadratic Time Example
def print_pairs(arr):
    for i in arr:
        for j in arr:
            print(i, j)
Algorithms where one of the nested loops has a fixed number of iterations are simply linear time.
Non Quadratic Example
def print_pairs(arr):
    for i in arr:
        for j in range(3):
            print(i, j)
O(log N)
O(log N) is different: it describes super-efficient algorithms. When the input size grows, the work grows only a little. Doubling the input adds roughly one extra step, so even huge datasets barely slow the algorithm down.
Everyday example - School gossip
If a student knows some gossip, all they need to do is pass it on to 2 other students, who each pass it on to 2 more, and soon the whole school knows. This is why gossip spreads so quickly: the number of rounds needed to reach everyone grows logarithmically with the size of the school.
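The gossip analogy can be simulated with a short sketch (the spreading model below, where every knower tells 2 new students each round, is an illustrative assumption):

```python
def gossip_rounds(school_size):
    """Count rounds until everyone has heard, assuming each student who
    already knows the gossip tells 2 new students per round (so the
    number of informed students triples each round)."""
    informed, rounds = 1, 0
    while informed < school_size:
        informed *= 3  # every knower reaches 2 more students
        rounds += 1
    return rounds

print(gossip_rounds(200))  # prints 5
print(gossip_rounds(400))  # prints 6
```

Doubling the school from 200 to 400 students adds only a single round, which is exactly the slow, logarithmic growth described above.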
Algorithm Example - Binary Search
Efficiently finds an element in a sorted array by repeatedly dividing the search interval in half.
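A standard iterative binary search in Python, matching the description above (the sample list is illustrative):

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent."""
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1   # target is larger: discard the lower half
        else:
            high = mid - 1  # target is smaller: discard the upper half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # prints 3
```

Because each comparison halves the remaining search interval, the number of steps grows as O(log N).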
Time vs Space Complexity
When we are talking about the efficiency of an algorithm, it is important to consider two aspects of complexity:
Time Complexity
Time complexity refers to how an algorithm's execution time changes with input size, usually measured in terms of basic operations.
Space Complexity
Space complexity is about how an algorithm's memory or storage usage changes with input size, often measured in memory units like bytes.
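The distinction can be seen by contrasting two small sketches (both function names are illustrative): the first uses O(1) extra space, the second O(n).

```python
def sum_in_place(arr):
    """O(1) extra space: only a single running total is stored,
    no matter how long the input list is."""
    total = 0
    for x in arr:
        total += x
    return total

def doubled_copy(arr):
    """O(n) extra space: builds a new list as large as the input."""
    return [x * 2 for x in arr]

print(sum_in_place([1, 2, 3]))  # prints 6
print(doubled_copy([1, 2, 3]))  # prints [2, 4, 6]
```

Both functions are O(n) in time, yet they differ in space: an algorithm's time and space complexity need not be the same.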
When we talk about time and space complexity we also usually consider the best case, average case and worst case scenarios for each.
Best Case
The best-case scenario refers to the situation in which an algorithm performs optimally and requires the least amount of resources or time.
Example
If a teacher was looking for a student in a school with 20 classrooms, the best case would be 1 classroom would need checking.
Average Case
The average-case scenario represents the expected performance of an algorithm when considering all possible inputs, typically using statistical analysis.
Example
If a teacher was looking for a student in a school with 20 classrooms, the average case would be 10.5 classrooms would need checking.
Worst Case
The worst-case scenario describes the situation in which an algorithm performs the least efficiently, requiring the most resources or time to complete.
Example
If a teacher was looking for a student in a school with 20 classrooms, the worst case would be 20 classrooms would need checking.
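The classroom search from the three examples above can be simulated to confirm all three figures (the function and its parameters are illustrative):

```python
def classrooms_checked(student_location, num_classrooms=20):
    """Check classrooms 1, 2, 3, ... in order and count how many
    are visited before the student is found."""
    checks = 0
    for room in range(1, num_classrooms + 1):
        checks += 1
        if room == student_location:
            return checks
    return checks

print(classrooms_checked(1))   # best case: prints 1
print(classrooms_checked(20))  # worst case: prints 20

# Average over every possible location: (1 + 2 + ... + 20) / 20 = 10.5
average = sum(classrooms_checked(r) for r in range(1, 21)) / 20
print(average)  # prints 10.5
```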
Review: Fill in the Blanks
O(1) means the time/space it takes to solve the problem doesn't depend on the input - it doesn't matter how big the input size is, the time is the same. An example of O(1) is ringing the school bell, where the action occurs only once regardless of the number of pupils. In contrast, O(n) means the time/space it takes increases linearly with the input size. For example, queuing for the school canteen demonstrates O(n) as the serving time will double if the student population doubles.
When analyzing algorithms, we also discuss other complexities, such as O(n^2), which indicates that the time required quadruples when the input size doubles, the classic example being classroom introductions. Moreover, O(log N) is very efficient, as a growing input size does not substantially affect the time an algorithm takes, similar to how quickly gossip spreads in a school.
Finally, it's crucial to consider the best case, average case, and worst case scenarios when evaluating time and space complexity. The best-case scenario refers to optimal performance requiring the least resources, while the average case represents expected performance across various inputs. Conversely, the worst-case scenario denotes the least efficient performance requiring the most resources or time.