Big O Notation (Easy Explanation)
Big O notation characterizes functions according to their growth rates.
As a basic explanation: it is a simplified analysis of an algorithm’s efficiency.
It is a way to describe how long your function takes to run. Honestly, that depends on many factors: how fast your computer is, whether other programs are running in parallel, which programming language you use, and so on. But none of those is the main factor.
The main factor is how the runtime of your function grows as the size of the input grows.
First, let’s find out what a better algorithm actually means:
- Fast: Time Complexity
- Less Memory: Memory Complexity
- Easy to Read
- Fewer lines of code
- etc.
Only 2 of these really matter for complexity analysis: the function may take longer to run (time complexity), or it may require more space (memory complexity).
Basically there are 3 types of measurement:
- O(N), Big O: worst case
- θ(N), Big Theta: average case
- Ω(N), Big Omega: best case
Typically we are going to look at Big O, the worst-case scenario.
Some main rules have to be followed to simplify the Big O calculation:
- Ignore constants: 5 + 3*N² = N² = O(N²)
- Drop dominated terms: O(1) < O(log N) < O(N) < O(N²)
- 1 for loop = N, 2 nested for loops = N², etc.
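As a rough sketch of the loop rule (the function names here are made up for illustration, not from the original):

```javascript
// One loop over N items → the body runs N times → O(N)
const oneLoop = (items) => {
  let count = 0
  for (const item of items) count++ // runs N times
  return count
}

// Two nested loops over the same N items → the body runs N * N times → O(N²)
const nestedLoops = (items) => {
  let count = 0
  for (const a of items) {
    for (const b of items) count++ // runs N * N times
  }
  return count
}

oneLoop([1, 2, 3])     // 3 iterations
nestedLoops([1, 2, 3]) // 9 iterations
```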
Main types of Big O Notation
- Constant O(1)
- Linear O(N)
- Quadratic O(N²)
- Logarithmic O(log N)
Basic Theory
2 main factors have to be found to define a Big O notation:
- Find the fastest-growing term.
- Take out its coefficient.
Let’s find the time complexity based on these 2 factors.
Linear Time Complexity: O(N)
T = a*N + b
N is the number of items in the array; a and b are just constants.
- The fastest-growing term is “a*N” in this case.
- “a” is the coefficient here. Let’s take “a” out.
Now we have just “N” left: T = O(N)
const sum = (itemList) => {
  let total = 0 // O(1)
  itemList.forEach(item => total += item) // O(N)
  return total // O(1)
}
T = O(1) + O(N) + O(1) = a₁ + N + a₂ = a₃ + N = N = O(N)
Quadratic Time Complexity: O(N²)
T = a*N² + b*N + c
- The fastest-growing term is “a*N²” in this case.
- “a” is the coefficient here. Let’s take “a” out.
Now we have just “N²” left: T = O(N²)
const array_2d = [["A", "B", "C"], ["D", "E"]];
const sum = (arrays) => {
  let str = "" // O(1)
  arrays.forEach(arr => { // O(N²)
    arr.forEach(item => {
      str += item + "-" // O(1)
    })
  })
  return str // O(1)
}
sum(array_2d) // A-B-C-D-E-
T = O(1) + O(N²) + O(1) + O(1) = a₁ + N² + a₂ + a₃ = a₄ + N² = N² = O(N²)
const array_2d = [["A", "B", "C"], ["D", "E"]];
const sum = (arrays) => {
  let str = "" // O(1)
  arrays.forEach(arr => { // O(N²)
    arr.forEach(item => {
      str += item + "-" // O(1)
    })
  })
  arrays.forEach(arr => { // O(N²)
    arr.forEach(item => {
      str += item + "*" // O(1)
    })
  })
  return str // O(1)
}
sum(array_2d) // A-B-C-D-E-A*B*C*D*E*
T = O(1) + O(N²) + O(1) + O(N²) + O(1) + O(1) = a + 2N² = 2N² = N² = O(N²)
Constant Time Complexity: O(1)
T = a*1
- The fastest-growing term is “a*1” in this case.
- “a” is the coefficient here. Let’s take “a” out.
Now we have just “1” left: T = O(1)
const func = (given_array) => {
  const value = 0 // O(1)
  return value // O(1)
}
T = O(1) + O(1) = a₁ + a₂ = a₃ = a₃ * 1 = O(1)
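Another common constant-time case is reading an element by index: the cost stays the same no matter how large the array gets. This extra example is an illustration, not from the original:

```javascript
// Reading by index is O(1): the runtime does not depend on array size
const firstItem = (givenArray) => givenArray[0] // O(1)

firstItem([5, 10, 15])                // 5
firstItem(new Array(1000000).fill(7)) // 7 — same cost on a much bigger array
```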
Logarithmic Time Complexity: O(log N)
const logFunc = (n) => {
  while (n > 1) {
    n = Math.floor(n / 2)
    console.log(n)
  }
}
logFunc(8)
- iteration 1: n = 8/2 = 4
- iteration 2: n = 4/2 = 2
- iteration 3: n = 2/2 = 1
We know that in computer science, by default, log N means log₂ N.
In our case it will be: O(log 8) = O(log₂ 8).
Since 2³ = 8, log₂ 8 = 3, so the loop runs 3 iterations.
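A classic algorithm with this halving behavior is binary search: each step discards half of a sorted array, so at most about log₂ N steps are needed. This sketch is an added illustration, not part of the original:

```javascript
// Binary search on a sorted array: the search range is halved each
// iteration, so the loop runs at most ~log₂(N) times → O(log N)
const binarySearch = (sortedArray, target) => {
  let low = 0
  let high = sortedArray.length - 1
  while (low <= high) {
    const mid = Math.floor((low + high) / 2) // middle of current range
    if (sortedArray[mid] === target) return mid
    if (sortedArray[mid] < target) low = mid + 1 // discard left half
    else high = mid - 1                          // discard right half
  }
  return -1 // target not found
}

binarySearch([1, 3, 5, 7, 9, 11, 13, 15], 7) // 3 (index of 7)
```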