How to measure the performance of an algorithm

Let's say you have an algorithm that takes a list and does some processing on it.
How long does it take?

Part 1: Stopwatch

Your first instinct might be to simply measure the running time with a stopwatch, as in the sketch below.
That is not a good idea on its own.
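
For example, a quick wall-clock measurement in Python might look like this (a minimal sketch; process is a hypothetical stand-in for your algorithm):

  import random
  import time

  def process(items):
      # Hypothetical stand-in for the algorithm being measured.
      return sorted(items)

  data = [random.randint(0, 1_000_000) for _ in range(100_000)]

  start = time.perf_counter()              # start the "stopwatch"
  process(data)
  elapsed = time.perf_counter() - start    # stop it
  print(f"processing took {elapsed:.4f} seconds on this machine")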

Why?

  1. It depends on the computer you're running the software on.
  2. It's hard to intuitively see how the algorithm scales.
  3. It's difficult to compare against other algorithms, because implementation details might skew the results.

How can we do better?

Part 2: Time complexity

You might instead try to express the running time as a single expression in a variable N, representing the input size.
By counting abstract operations rather than seconds, the unit becomes effectively independent of machine performance.
This is better, but still not good.
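
As a sketch, take a hypothetical duplicate check over the list and count its elementary comparisons:

  def has_duplicate(items):
      # Compare every pair of elements, counting each comparison as one step.
      steps = 0
      n = len(items)
      for i in range(n):
          for j in range(i + 1, n):
              steps += 1
              if items[i] == items[j]:
                  return True, steps
      return False, steps

  # With no duplicates present, this performs N*(N-1)/2 comparisons, so the
  # running time is roughly T(N) = c*N*(N-1)/2 + d for some machine- and
  # implementation-dependent constants c and d.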

Why?

  1. It's hard to compute precisely, because the expression varies with the characteristics of the data you're working on (best, average, and worst cases can all differ).
  2. It's bulky, littered with machine- and implementation-dependent constants.

How can we do better?

Part 3: Big-O notation

You might try to simplify the time complexity by stripping the constants and dropping every term except the fastest-growing one.
This is called Big-O notation, and it's commonly written as O(expression).
This notation doesn't capture absolute performance, only how the running time grows as the input gets bigger.
It still provides useful intuition about performance.
Big-O is currently one of the most common ways to express the performance of algorithms.
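
Continuing the sketch from Part 2, the count c*N*(N-1)/2 + d simplifies to O(N^2); a hypothetical set-based alternative shows how the notation makes comparisons easy:

  # Quadratic pair-wise check from before:
  # T(N) = c*N*(N-1)/2 + d  ->  keep only the fastest-growing term and drop
  # constant factors  ->  O(N^2).
  def has_duplicate_quadratic(items):
      n = len(items)
      for i in range(n):
          for j in range(i + 1, n):
              if items[i] == items[j]:
                  return True
      return False

  # A set-based check performs roughly c'*N + d' steps  ->  O(N).
  def has_duplicate_linear(items):
      seen = set()
      for item in items:
          if item in seen:
              return True
          seen.add(item)
      return False

The labels O(N^2) and O(N) immediately tell you which version scales better as the list grows, regardless of constants or hardware.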
