Asymptotic Time Complexity and Big-O Notation

(漸近的計算量と O 記法)

Data Structures and Algorithms

3rd lecture, October 10, 2019

http://www.sw.it.aoyama.ac.jp/2019/DA/lecture3.html

Martin J. Dürst

AGU

© 2009-19 Martin J. Dürst 青山学院大学

Today's Schedule

 

Schedule for the Next Few Weeks

 

Summary of Last Lecture

 

Homework Collection

 

Last Week's Homework 1: Example for Asymptotic Growth of Number of Steps

number of steps (counting additions and divisions)
n (number of data items)        1       8      64     512   4'096  32'768  262'144
linear search                   1       8      64     512   4'096  32'768  262'144
binary search                   1      10      19      28      37      46       55
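
The exact step counts depend on what is counted and on how the search is coded. The following Ruby sketch (my own illustration, not the lecture's pseudocode) instruments a simple linear and binary search for the worst case; its binary-search constants differ slightly from the table above, but the growth pattern is the same: proportional to n versus roughly proportional to log_2 n.

  # Count additions/divisions in simple linear and binary search
  # (worst case: the key is larger than all elements).

  def linear_search_steps(n)
    steps = 0
    i = 0
    while i < n            # scans all n elements in the worst case
      i += 1               # one addition per element
      steps += 1
    end
    steps
  end

  def binary_search_steps(n)
    steps = 0
    low, high = 0, n - 1
    while low <= high
      mid = (low + high) / 2   # one addition and one division
      steps += 2
      low = mid + 1            # worst case: always continue to the right
      steps += 1
    end
    steps
  end

  [1, 8, 64, 512, 4096, 32768, 262144].each do |n|
    printf("%8d %8d %4d\n", n, linear_search_steps(n), binary_search_steps(n))
  end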

 

How to Derive Steps from (Pseudo)Code

 

Comparing Execution Times: From Concrete to Abstract

Very concrete

Very abstract

 

Estimate Worst Case Number of Steps

 

Thinking in Terms of Asymptotic Growth

⇒ Independent of hardware, implementation details, step counting details

⇒ Simple expression of essential differences between algorithms

 

Last Week's Homework 2: Example for Asymptotic Growth of Number of Steps

Fill in the following table
(use engineering notation (e.g. 1.5E+20) if the numbers get very big;
round liberally, the magnitude of the number is more important than the exact value)

n              1      10      100      1'000       10'000      100'000
5n             5      50      500      5'000       50'000      500'000
n^1.2          1    15.8    251.2      3'981       63'096    1'000'000
n^2            1     100   10'000  1'000'000  100'000'000        1e+10
n log_2 n      0    33.2    664.4      9'966      132'877    1'660'964
1.01^n      1.01  1.1046      2.7     20'959    1.636e+43   1.372e+432
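
The table can be reproduced with a few lines of Ruby (my own sketch, not part of the original slides). Note that 1.01^100'000 overflows ordinary floating-point numbers, which is exactly why the homework suggests engineering notation; the sketch therefore computes the exponential row from its base-10 logarithm.

  # Reproduce the table above. 1.01**100_000 overflows Float, so the
  # exponential row is printed in engineering notation via its logarithm.

  def pow_to_s(base, n)
    log = n * Math.log10(base)          # log10 of base**n
    exp = log.floor
    format('%.3fe+%d', 10**(log - exp), exp)
  end

  ns = [1, 10, 100, 1_000, 10_000, 100_000]
  puts 'n:         ' + ns.map(&:to_s).join('  ')
  puts '5n:        ' + ns.map { |n| (5 * n).to_s }.join('  ')
  puts 'n^1.2:     ' + ns.map { |n| '%.4g' % n**1.2 }.join('  ')
  puts 'n^2:       ' + ns.map { |n| '%.4g' % n**2 }.join('  ')
  puts 'n log2 n:  ' + ns.map { |n| '%.4g' % (n * Math.log2(n)) }.join('  ')
  puts '1.01^n:    ' + ns.map { |n| pow_to_s(1.01, n) }.join('  ')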

 

Solution to Homework 3: Compare Function Growth

Which function of each pair (left/right column) grows larger if n increases?

left         right        answer
100n         n^2          right (n ≥ 100)
1.1^n        n^20         left (n ≥ 1541)
5 log_2 n    10 log_4 n   same (log_2 x = 2 log_4 x)
20^n         n!           right (n ≥ 52)
100·2^n      2.1^n        right (n ≥ 95)
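
The crossover points in the answer column can be checked mechanically. Below is a Ruby sketch (my own, defining a small helper crossover): it uses exact Integer/Rational arithmetic and reports the smallest n from which the eventually larger function stays at least as large as the other, checked up to a chosen limit. The expected outputs in the comments are the values from the table.

  # Find the smallest n from which `big` stays >= `small` (checked up to `limit`).
  # Exact Integer/Rational arithmetic avoids any overflow or rounding.
  def crossover(small, big, limit)
    last_below = (1..limit).select { |n| big.call(n) < small.call(n) }.max
    last_below ? last_below + 1 : 1
  end

  factorial = ->(n) { (1..n).reduce(1, :*) }

  p crossover(->(n) { 100 * n },    ->(n) { n**2 },                200)   # => 100
  p crossover(->(n) { n**20 },      ->(n) { Rational(11, 10)**n }, 2000)  # => 1541
  p crossover(->(n) { 20**n },      factorial,                     100)   # => 52
  p crossover(->(n) { 100 * 2**n }, ->(n) { Rational(21, 10)**n }, 200)   # => 95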

 

Using Ruby to Compare Function Growth

Caution: Use only when you understand which function will eventually grow larger
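
The slide's original code is not reproduced here; as one possible illustration (my own sketch), Ruby's arbitrary-precision integers and rationals make it easy to evaluate both functions at a single large n and compare the exact values. As the caution above says, such a comparison only illustrates the trend; it proves nothing about which function eventually grows larger.

  # Compare two functions at one large n, using Ruby's exact integer and
  # rational arithmetic so nothing overflows.
  n = 10_000
  puts(Rational(11, 10)**n > n**20)        # 1.1^n    vs n^20:   true
  puts(100 * 2**n > Rational(21, 10)**n)   # 100*2^n  vs 2.1^n:  false
  puts((1..n).reduce(1, :*) > 20**n)       # n!       vs 20^n:   true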

 

Classification of Functions by Asymptotic Growth

Various growth classes with example functions (in increasing order of growth): constant (1), logarithmic (log n), linear (n), n log n, quadratic (n^2), cubic (n^3), exponential (2^n), factorial (n!).

 

Big-O Notation: Set of Functions

Big-O notation is a notation for expressing the order of growth of a function (e.g. time complexity of an algorithm).

O(g): the set of functions whose order of growth is lower than or the same as that of the function g

Example:
Set of functions that grow more slowly than, or as fast as, n^2:
O(n^2)

Usage examples:
3n^1.5 ∈ O(n^2),   15n^2 ∈ O(n^2),   2.7n^3 ∉ O(n^2)

 

Exact Definition of O

∃c>0: ∃n_0≥0: ∀n≥n_0:   f(n) ≤ c·g(n)   ⇔   f(n) ∈ O(g(n))

 

Example Algorithms

 

Comparing the Execution Time of Algorithms

(from last lecture)

Possible questions:

Problem: These questions do not have a single answer.

When we compare algorithms, we want a simple answer.

The simple and general answer is using big-O notation:
Linear search is O(n), binary search is O(log n).

Binary search is faster than linear search (for inputs of significant size)
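
As a rough illustration (not from the slides), the following Ruby sketch times linear search (Array#include?) against binary search (Array#bsearch) on sorted arrays of increasing size. Absolute times are machine-dependent; what matters is that the gap widens as n grows.

  require 'benchmark'

  # Time 1'000 lookups with linear search and with binary search.
  [10_000, 100_000, 1_000_000].each do |n|
    data    = (0...n).to_a
    queries = Array.new(1_000) { rand(n) }
    Benchmark.bm(20) do |bm|
      bm.report("linear,  n = #{n}") { queries.each { |q| data.include?(q) } }
      bm.report("binary,  n = #{n}") { queries.each { |q| data.bsearch { |x| x >= q } } }
    end
  end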

 

Additional Examples for O

 

Confirming the Order of a Function

 

Method 1: Use the Definition

We want to check that 2n+15 ∈ O(n)

The definition of Big-O is:

∃c>0: ∃n_0≥0: ∀n≥n_0:   f(n) ≤ c·g(n)   ⇔   f(n) ∈ O(g(n))

We have to find a c and an n_0 so that ∀n≥n_0: f(n) ≤ c·g(n)   (here f(n) = 2n+15 and g(n) = n)

Example 1: n_0 = 5, c = 3

∀n≥5: 2n+15 ≤ 3n ⇒ wrong, either n_0 or c (or both) is not big enough

Example 2: n_0 = 10, c = 4

∀n≥10: 2n+15 ≤ 4n ⇒ okay, so this proves that 2n+15 ∈ O(n)
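
A quick, informal Ruby spot check of the two candidate (n_0, c) pairs (my own sketch; sampling a finite range is of course not a proof, the algebra above is):

  f = ->(n) { 2 * n + 15 }

  # Example 1: n_0 = 5, c = 3 -- the bound 2n+15 <= 3n only holds from n = 15 on
  p((5..20).count { |n| f.call(n) > 3 * n })      # => 10  (violated for n = 5..14)

  # Example 2: n_0 = 10, c = 4 -- 2n+15 <= 4n holds for every sampled n >= 10
  p((10..1_000).all? { |n| f.call(n) <= 4 * n })  # => true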

 

Method 2: Use the Limit of a Function

We want to check which of 3n^1.5, 15n^2, and 2.7n^3 are ∈ O(n^2)

lim_(n→∞) (3n^1.5 / n^2) = 0   ⇒   O(3n^1.5) ⊊ O(n^2),  3n^1.5 ∈ O(n^2)

lim_(n→∞) (15n^2 / n^2) = 15   ⇒   O(15n^2) = O(n^2),  15n^2 ∈ O(n^2)

lim_(n→∞) (2.7n^3 / n^2) = ∞   ⇒   O(n^2) ⊊ O(2.7n^3),  2.7n^3 ∉ O(n^2)
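
A numerical illustration in Ruby (my own sketch; sampling cannot prove a limit, it only shows the trend): the three ratios f(n)/n^2 approach 0, 15 and infinity as n grows.

  # Evaluate the three ratios for growing n.
  [10, 1_000, 100_000, 10_000_000].each do |n|
    puts format('n = %-10d  3n^1.5/n^2 = %-10.6f  15n^2/n^2 = %-5.1f  2.7n^3/n^2 = %g',
                n, 3 * n**1.5 / n**2, 15.0 * n**2 / n**2, 2.7 * n**3 / n**2)
  end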

 

Method 3: Simplification of Big-O Notation

 

Ignoring Lower Terms in Polynomials

Concrete Example:   500n^2 + 30n ∈ O(n^2)

Derivation for the general case: f(n) = d·n^a + e·n^b ∈ O(n^a)   [a > b > 0]

Definition of O: f(n) ≤ c·g(n)   [n > n_0; n_0, c > 0]

d·n^a + e·n^b ≤ c·n^a   [a > 0 ⇒ n^a > 0]

d + e·n^b/n^a = d + e·n^(b-a) ≤ c   [b-a < 0 ⇒ lim_(n→∞) e·n^(b-a) = 0]

Some possible values for c and n_0 in the general case: for example c = d + e and n_0 = 1 (assuming d, e > 0), because e·n^(b-a) ≤ e for n ≥ 1

Some possible values for the concrete example (500n^2 + 30n): for example c = 530 and n_0 = 1, because 500n^2 + 30n ≤ 530n^2 for all n ≥ 1

In general: a > b > 0 ⇒ O(n^a + n^b) = O(n^a)
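
A one-line Ruby spot check (my own) of the concrete bound suggested above, c = 530 and n_0 = 1:

  # 500n^2 + 30n <= 530n^2 holds for every n >= 1 (checked here on a finite range).
  p((1..100_000).all? { |n| 500 * n**2 + 30 * n <= 530 * n**2 })   # => true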

 

Ignoring Logarithm Base

How do O(log_2 n) and O(log_10 n) differ?

(Hint: log_b a = log_c a / log_c b = log_c a · log_b c)

log_10 n = log_2 n · log_10 2 ≅ 0.301 · log_2 n

O(log_10 n) = O(0.301... · log_2 n) = O(log_2 n)

a>1, b>1:   O(log_a n) = O(log_b n) = O(log n)
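
A small Ruby illustration (my own sketch) that the two logarithms differ only by the constant factor log_10 2 ≅ 0.301, independent of n:

  # log10(n) / log2(n) is the same constant for every n > 1.
  [2, 100, 10_000, 1_000_000].each do |n|
    puts format('n = %-9d  log10(n)/log2(n) = %.6f', n, Math.log10(n) / Math.log2(n))
  end
  puts Math.log10(2)   # the constant itself, ~ 0.301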

 

Additional Notations: Ω and Θ

Examples:
3n^1.5 ∈ O(n^2),   15n^2 ∈ O(n^2),   2.7n^3 ∉ O(n^2)
3n^1.5 ∉ Ω(n^2),   15n^2 ∈ Ω(n^2),   2.7n^3 ∈ Ω(n^2)
3n^1.5 ∉ Θ(n^2),   15n^2 ∈ Θ(n^2),   2.7n^3 ∉ Θ(n^2)

 

Exact Definitions of Ω and Θ

Definition of Ω

∃c>0: ∃n_0≥0: ∀n≥n_0:   c·g(n) ≤ f(n)   ⇔   f(n) ∈ Ω(g(n))

Definition of Θ

∃c_1>0: ∃c_2>0: ∃n_0≥0: ∀n≥n_0:
c_1·g(n) ≤ f(n) ≤ c_2·g(n)   ⇔   f(n) ∈ Θ(g(n))

Relationships between O, Ω, and Θ

f(n) ∈ Θ(g(n))  ⇔  f(n) ∈ O(g(n)) ∧ f(n) ∈ Ω(g(n))

Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
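
For instance (a worked example added here, not from the slides): 15n^2 ∈ Θ(n^2), because with c_1 = 1, c_2 = 20 and n_0 = 1 we have n^2 ≤ 15n^2 ≤ 20n^2 for all n ≥ 1. In contrast, 3n^1.5 ∉ Θ(n^2), because 3n^1.5 ∉ Ω(n^2): for any c > 0, the bound c·n^2 ≤ 3n^1.5 fails as soon as n > (3/c)^2.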

 

Use of Order Notation

In general usage as well as in this course, mainly O will be used.

 

Summary

 

Homework

(no need to submit)

Review this lecture's material and the additional handout every day!

On the Web, find algorithms with time complexity O(1), O(log n), O(n), O(n log n), O(n^2), O(n^3), O(2^n), O(n!), and so on.

 

Report: Manual Sorting

Deadline: October 23, 2019 (Wednesday), 19:00.

Where to submit: Box in front of room O-529 (building O, 5th floor)

Format:

Problem: Propose and describe an algorithm for manual sorting, for the following two cases:

  1. One person sorts 4'000 pages
  2. 16 people together sort 50'000 pages

Each page is an A4 sheet of paper on which a 10-digit number is printed in large type.

The goal is to sort the pages in increasing order of these numbers. Nothing is known about the distribution of the numbers.

You can use the same algorithm for both cases, or a different algorithm.

Details:

  

Glossary

big-O notation
O 記法 (O そのものは漸近記号ともいう)
asymptotic growth
漸近的な増加
approximate
近似する
essence
本質
constant factor
一定の係数、定倍数
eventually
最終的に
linear growth
線形増加
quadratic growth
二次増加
cubic growth
三次増加
logarithmic growth
対数増加
exponential growth
指数増加
Omega (Ω)
オメガ (大文字)
capital letter
大文字
Theta (Θ)
シータ (大文字)
asymptotic upper bound
漸近的上界
asymptotic lower bound
漸近的下界
appropriate
適切
limit
極限
polynomial
多項式
term
(式の) 項
logarithm
対数
base
(対数の) 底