Little Introduction to Time and Space Complexity, Big-O-Notation

A short story explaining what Big-O is all about



There are multiple ways to implement a function. How do you know which one is 'better'? How do you determine what 'better' is?

Is it the one that runs faster? Takes up less memory? Is more readable? You are the decision-maker here! But quite frankly, it's the first two.


What is "Big O"?

Big O notation is a mathematical notation that describes the limiting behavior of a function when the argument tends towards a particular value or infinity.


Don't let it go to your head.

It basically says ⁠— as your function input grows, how does the growth affect the efficiency of the function?

Will it still be as efficient? Will it get slower and/or use more resources? Will the efficiency reduce and then become stable? Will it fold miserably? And so on.

Well, whichever one happens, here's an algebraic way to denote it: \(O()\). Feel free to express yourself.

Time Complexity

Time complexity is the computational complexity that describes the amount of computer time it takes to run an algorithm.


It goes on to point out that time complexity is commonly estimated by counting the number of elementary operations performed.

You see! Not the time it takes to execute an operation, but the actual number of times each operation is executed!
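To make that concrete, here's a minimal sketch (not from the article, names are made up) that counts how many times the elementary operation runs instead of timing it:

```python
def count_operations(items):
    """Count how many times the elementary operation runs,
    rather than how long it takes."""
    operations = 0
    for item in items:
        # one elementary operation per item
        operations += 1
    return operations

# Doubling the input doubles the operation count:
count_operations(range(10))   # 10 operations
count_operations(range(20))   # 20 operations
```

Whether each operation takes a nanosecond or a minute, the *count* grows the same way with the input, and that growth is what Big O describes.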

Space Complexity

Space complexity is the memory required by an algorithm to execute a program and produce output.


Kinda straightforward.

A Short Story

My biscuits were stolen! I was caught lacking. End of story.

The problem was I had missing biscuits and I wanted to catch the culprit who stole them 😑. Now, the solution was to carry out an investigation to catch the person that stole my biscuits. There are many different ways of solving this, and how I choose to investigate the crime is the algorithm. You can also say the investigation is a function.

I decided to give myself some tasks (operations) to carry out. Tasks like:

  • interrogating the suspects
  • inspecting the crime scene twice
  • predicting the future

I got a book to document my findings (memory) and a list of suspects to interrogate (input).

I then went on to interrogate the suspects separately.

Important Questions

What is the time complexity of the interrogation task?

How long it took to complete a task is irrelevant! The number of tasks I completed is also irrelevant. What matters is the number of times I performed each task (operation)!

It is \(O(n)\)

Because I interrogated the suspects n number of times ⁠— n is the input size.

Peep that I interrogated every single suspect separately. If the list had 188 suspects on it, I would have had to perform an interrogation task 188 times.

In big O, we're looking at the number of times operations are performed because of the input size.

So looking back, the number of interrogations will increase as the number of suspects increases. How it increases is how you determine the time complexity.

The tasks that did not depend on the number of suspects, or that I performed a fixed number of times ⁠— like inspecting the crime scene ⁠— were done in constant time. This is denoted as \(O(1)\).
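The two kinds of tasks can be sketched like this (a toy example with hypothetical names, where we just count how often each task runs):

```python
interrogations = 0
inspections = 0

def interrogate(suspect):
    global interrogations
    interrogations += 1

def inspect_crime_scene():
    global inspections
    inspections += 1

def investigate(suspects):
    # O(n): one interrogation per suspect -- grows with the input
    for suspect in suspects:
        interrogate(suspect)
    # O(1): the crime scene is inspected exactly twice,
    # no matter how many suspects there are
    inspect_crime_scene()
    inspect_crime_scene()

investigate(["Ada", "Bob", "Cleo"])
# interrogations is now 3 (grows with input); inspections is 2 (constant)
```

Hand the function 188 suspects and `interrogations` becomes 188, while `inspections` stays at 2.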

What is the time complexity of the investigation?

Now that we know the time complexity of each operation:

  • Interrogating the suspects ⁠— O(n)
  • Inspecting the crime scene ⁠— O(1)
  • Predicting the future ⁠— O(1)

We take the fastest-growing term or the term with the highest order:

$$ O(n) + O(1) + O(1) $$

It is \(O(n)\)

What if there are multiple \(O(n)\)?

Well, you can add them together to make it \(O(2n)\), \(O(5n)\) or something else. But big O does not care whether I interrogated the suspects twice every time or predicted the future 2 million times; it just describes the growth rate. So it does not care about the coefficient, whether it be 1 or 1000.

It is still \(O(n)\)
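A quick sketch of why the coefficient gets dropped (illustrative, not from the article): two full passes over the suspects is \(2n\) operations, but the growth is still linear.

```python
def investigate_twice(suspects):
    """Interrogate every suspect twice: 2n operations, still O(n)."""
    count = 0
    for s in suspects:
        count += 1   # first round of interrogations: n operations
    for s in suspects:
        count += 1   # second round: another n operations
    return count     # 2n total -- big O drops the 2

investigate_twice(range(100))   # 200 operations, but growth is linear
```

Double the suspects and the count doubles, exactly as it would with one pass. The *shape* of the growth is what Big O keeps.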

How about \(O(2)\)?

I mean, I inspected the crime scene twice, but big O does not care. Big O sees it as \(O(1)\). It happens in constant time. If the input changes, this operation will not be directly affected. Any fixed number of operations is constant time, and thus \(O(500)\) is \(O(1)\).

It is \(O(1)\)

How would \(O(n^2)\) come about?

Let's say every time I interrogated a suspect, I went on to cross-check with every other suspect to make sure the stories check out. Well, that would be \(O(n * n)\).

If, every time I interrogated a suspect, I went on to inspect the crime scene, then that would be \(O(n * 1)\), which is \(O(n)\).
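The difference between the two cases is whether the inner work depends on the input. A sketch, with hypothetical names:

```python
def cross_check(suspects):
    """For each suspect, compare their story with every other
    suspect: n * n comparisons -- O(n^2)."""
    comparisons = 0
    for a in suspects:
        for b in suspects:       # inner loop also grows with input
            comparisons += 1
    return comparisons

def interrogate_and_inspect(suspects):
    """For each suspect, do a fixed amount of work (one interrogation,
    one inspection): n * 1 operations -- O(n)."""
    operations = 0
    for s in suspects:
        operations += 2          # constant work per suspect
    return operations

cross_check(range(10))             # 100 comparisons
interrogate_and_inspect(range(10)) # 20 operations
```

With 10 suspects the nested version already does 100 comparisons; with 100 suspects it does 10,000, while the linear version does just 200 operations.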

What is the space complexity of the investigation?

During the investigation, I had a book where I documented my findings. I had to use:

  • a page each for every suspect
  • 3 pages for the crime scene
  • 4 pages for predictions

Maybe if the book was fatter, or the pages had more lines or the pen I used was acrylic, or if I used a crayon instead, the pattern would be different.

With this pattern, I would always use:

  • n number of pages for the interrogation task ⁠— \(O(n)\)
  • 3 pages for crime scene inspection ⁠— \(O(1)\) constant space
  • 4 pages for future predictions ⁠— \(O(1)\) constant space

Constant space because the page size does not change as the input (suspects) changes. It will always be 3 and 4 pages used for crime scene inspection and future predictions respectively.
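The page counts above can be sketched as (a toy model, names hypothetical):

```python
def pages_used(suspects):
    """Model the notebook's memory usage."""
    interrogation_pages = [f"notes on {s}" for s in suspects]  # O(n): one page per suspect
    crime_scene_pages = ["sketch", "photos", "measurements"]   # O(1): always 3 pages
    prediction_pages = ["p1", "p2", "p3", "p4"]                # O(1): always 4 pages
    return len(interrogation_pages) + len(crime_scene_pages) + len(prediction_pages)

pages_used(["Ada", "Bob"])   # 2 + 3 + 4 = 9 pages
```

Add more suspects and only the first term grows; the 3 and 4 stay fixed, which is why the space complexity is dominated by the \(O(n)\) term.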

Taking the fastest-growing term, we have:

$$ O(n) + O(1) + O(1) $$

It is \(O(n)\)
What if I interrogated the suspects all at once?

Well, then the interrogation task would only occur once, giving a better time complexity of \(O(1)\).

However, the circumstances for using the book wouldn't change. I would still need a page to document my findings on a single suspect, leaving the space complexity as before ⁠— \(O(n)\).

How about calling a method/function inside the current function?

Imagine during the investigation, I chose to use forensics ⁠— another function with its own tasks ⁠— to test the fingerprint of each suspect against the hair follicles of every suspect. This gives \(O(n * n)\).
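A nested call brings its own complexity along. A sketch of the idea (hypothetical names; each function just counts its operations):

```python
def forensics(suspects):
    """Test each suspect's fingerprint against every suspect's
    hair follicles: n * n tests -- O(n^2)."""
    tests = 0
    for fingerprint_owner in suspects:
        for hair_owner in suspects:
            tests += 1
    return tests

def investigation_with_forensics(suspects):
    operations = 0
    for s in suspects:                   # interrogations: O(n)
        operations += 1
    operations += 2                      # inspection + prediction: O(1) each
    operations += forensics(suspects)    # nested call contributes its own O(n^2)
    return operations

investigation_with_forensics(range(5))   # 5 + 2 + 25 = 32 operations
```

The caller's complexity is the sum of its own work and the work of every function it calls, which is why the \(O(n^2)\) term appears in the total below.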

The new time complexity of the investigation would be:

$$ O(n) + O(1) + O(1) + O(n^2) $$

Taking the fastest-growing term, we would have:

It is \(O(n^2)\)
You can use bigocheatsheet to view the space and time Big-O complexities of common algorithms.

Big-O estimates and gives an approximation. It provides a reference.

Let's conversate in the comment section.