Asymptotic Time Complexity and Big-O Notation

(漸近的計算量と O 記法)

Data Structures and Algorithms

3rd lecture, October 5, 2017

http://www.sw.it.aoyama.ac.jp/2017/DA/lecture3.html

Martin J. Dürst

AGU

© 2009-17 Martin J. Dürst 青山学院大学

Today's Schedule

Leftovers from Last Lecture

 

Summary of Last Lecture

 

Last Week's Homework 1: Example of Asymptotic Growth of the Number of Steps

 

Last Week's Homework 2: Example of Asymptotic Growth of the Number of Steps

 

Solution to Homework 3: Compare Function Growth

 

Using Ruby to Compare Function Growth

Caution: Use only when you understand which function will eventually grow larger
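As an illustration (the functions and sample values below are my own, not from the slides), such a comparison might look like this:

# Compare two functions by tabulating their values for increasing n.
# As cautioned above, this only shows a trend; it cannot prove which
# function eventually grows larger.
f = ->(n) { 100 * n }           # linear, but with a large constant factor
g = ->(n) { n * Math.log2(n) }  # n log n, with a small constant factor

[10, 100, 1_000, 10_000, 100_000, 1_000_000].each do |n|
  printf("n = %9d    100n = %12d    n log2 n = %14.1f\n", n, f.call(n), g.call(n))
end

For every n in this table 100n is larger, even though n·log_2 n eventually overtakes it (only once log_2 n > 100, i.e. for astronomically large n); this is exactly why the caution above matters.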

 

Classification of Functions by Asymptotic Growth

Various growth classes with example functions:
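The slide's list is not reproduced here; as a rough sketch, the usual classes (the same ones that appear in the homework at the end) can be written down with one example function each, here as Ruby lambdas:

# Typical growth classes from slowest to fastest, each with one example function.
GROWTH_CLASSES = {
  'constant    O(1)'       => ->(n) { 1 },
  'logarithmic O(log n)'   => ->(n) { Math.log2(n) },
  'linear      O(n)'       => ->(n) { n },
  'n log n     O(n log n)' => ->(n) { n * Math.log2(n) },
  'quadratic   O(n^2)'     => ->(n) { n**2 },
  'cubic       O(n^3)'     => ->(n) { n**3 },
  'exponential O(2^n)'     => ->(n) { 2**n },
  'factorial   O(n!)'      => ->(n) { (1..n).reduce(1, :*) },
}

GROWTH_CLASSES.each { |name, f| printf("%-24s n = 20  ->  %g\n", name, f.call(20)) }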

 

Big-O Notation: Set of Functions

Big-O notation expresses the order of growth of a function (e.g. the time complexity of an algorithm).

O(g): the set of functions with a lower or the same order of growth as the function g

Example:
Set of functions that grow more slowly than, or as slowly as, n^2: O(n^2)

Usage examples:
3n^1.5 ∈ O(n^2),   15n^2 ∈ O(n^2),   2.7n^3 ∉ O(n^2)

 

Exact Definition of O

∃c>0: ∃n0≥0: ∀n≥n0:   f(n) ≤ c·g(n)   ⇔   f(n) ∈ O(g(n))
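To make the quantifiers concrete, here is a small numeric check of the defining inequality for a witness pair c and n0 chosen by hand (a finite check like this can support, but never prove, membership):

# Check f(n) <= c * g(n) for all n0 <= n <= limit.
def o_witness_holds?(f, g, c:, n0:, limit: 10_000)
  (n0..limit).all? { |n| f.call(n) <= c * g.call(n) }
end

f = ->(n) { 3 * n**1.5 }   # 3n^1.5
g = ->(n) { n**2 }         # n^2
puts o_witness_holds?(f, g, c: 3, n0: 1)   # true: 3n^1.5 <= 3n^2 for all n >= 1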

 

Example Algorithms

 

Comparing the Execution Time of Algorithms

(from last lecture)

Possible questions:

Problem: These questions do not have a single answer.

When we compare algorithms, we want a simple answer.

The simple and general answer is:
Linear search is O(n), binary search is O(log n).
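As a sketch of where these two classes come from, here are step-counting versions of the two searches on a sorted array (the step counters are added for illustration and are not part of the slides):

# Linear search: in the worst case every element is checked  ->  O(n) comparisons.
def linear_search_steps(array, key)
  steps = 0
  array.each do |element|
    steps += 1
    return steps if element == key
  end
  steps
end

# Binary search on a sorted array: the range is halved each step  ->  O(log n) comparisons.
def binary_search_steps(array, key)
  low, high, steps = 0, array.length - 1, 0
  while low <= high
    steps += 1
    mid = (low + high) / 2
    if array[mid] == key
      return steps
    elsif array[mid] < key
      low = mid + 1
    else
      high = mid - 1
    end
  end
  steps
end

data = (1..1_000_000).to_a
puts linear_search_steps(data, 1_000_000)   # 1000000 steps (worst case for linear search)
puts binary_search_steps(data, 1_000_000)   # about 20 steps (log2 of 1000000 is about 19.9)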

 

Additional Examples for O

 

Additional Notations: Ω and Θ

Examples:
3n^1.5 ∈ O(n^2),   15n^2 ∈ O(n^2),   2.7n^3 ∉ O(n^2)
3n^1.5 ∉ Ω(n^2),   15n^2 ∈ Ω(n^2),   2.7n^3 ∈ Ω(n^2)
3n^1.5 ∉ Θ(n^2),   15n^2 ∈ Θ(n^2),   2.7n^3 ∉ Θ(n^2)

 

Exact Definitions of Ω and Θ

∃c>0: ∃n0≥0: ∀n≥n0:   c·g(n) ≤ f(n)   ⇔   f(n) ∈ Ω(g(n))

∃c1>0: ∃c2>0: ∃n0≥0: ∀n≥n0:   c1·g(n) ≤ f(n) ≤ c2·g(n)   ⇔   f(n) ∈ Θ(g(n))

f(n) ∈ O(g(n)) ∧ f(n) ∈ Ω(g(n))   ⇔   f(n) ∈ Θ(g(n))

Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
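A numeric counterpart of Θ(g) = O(g) ∩ Ω(g), with witness constants c1, c2 and n0 chosen by hand (again only a finite check, not a proof):

# Check c1 * g(n) <= f(n) <= c2 * g(n) for all n0 <= n <= limit,
# i.e. numeric evidence for f(n) ∈ Θ(g(n)).
def theta_witness_holds?(f, g, c1:, c2:, n0:, limit: 10_000)
  (n0..limit).all? { |n| c1 * g.call(n) <= f.call(n) && f.call(n) <= c2 * g.call(n) }
end

f = ->(n) { 15 * n**2 }
g = ->(n) { n**2 }
puts theta_witness_holds?(f, g, c1: 15, c2: 15, n0: 1)   # true: 15n^2 ∈ Θ(n^2)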

 

Use of Order Notation

In general, as well as in this course, mainly O is used.

 

Confirming the Order of a Function
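The slide body is not included above; one common way to check the order of a function numerically (related to the glossary entry "limit") is to watch the ratio f(n)/g(n) as n grows. A minimal sketch:

# If f(n)/g(n) stays bounded, that is evidence for f ∈ O(g);
# if it also stays away from 0, that is evidence for f ∈ Θ(g).
f = ->(n) { 3 * n**1.5 }
g = ->(n) { n**2 }

[10, 100, 1_000, 10_000, 100_000].each do |n|
  printf("n = %7d    f(n)/g(n) = %.6f\n", n, f.call(n) / g.call(n))
end
# The ratio 3/n^0.5 tends to 0: 3n^1.5 ∈ O(n^2), but 3n^1.5 ∉ Ω(n^2) and ∉ Θ(n^2).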

 

Simplification of Big-O Notation

 

Ignoring Lower Terms in Polynomials

Concrete example:   500n^2 + 30n ∈ O(n^2)

Derivation: f(n) = d·n^a + e·n^b ∈ O(n^a)   [a > b > 0]

Definition of O: f(n) ≤ c·g(n)   [n ≥ n0;  n0, c > 0]

d·n^a + e·n^b ≤ c·n^a   [a > 0 ⇒ n^a > 0, so both sides can be divided by n^a]

d + e·n^b/n^a ≤ c

d + e·n^(b−a) ≤ c   [b − a < 0 ⇒ e·n^(b−a) → 0 as n grows]

Some possible values for c and n0:   c = d + e with n0 = 1,   or   c = d + 1 with n0 = e^(1/(a−b))

Some possible values for the concrete example (500n^2 + 30n):   c = 530 with n0 = 1,   or   c = 501 with n0 = 30

In general: a > b > 0 ⇒ O(n^a + n^b) = O(n^a)
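A quick numeric check of one of those witness pairs (c = 530, n0 = 1) for the concrete example:

# Verify 500n^2 + 30n <= 530 * n^2 for 1 <= n <= 100_000
# (supports, but does not prove, 500n^2 + 30n ∈ O(n^2)).
violations = (1..100_000).reject { |n| 500 * n**2 + 30 * n <= 530 * n**2 }
puts violations.empty?   # true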

 

Ignoring Logarithm Base

How do O(log_2 n) and O(log_10 n) differ?

(Hint: log_b a = log_c a / log_c b = log_c a · log_b c)

log_10 n = log_2 n · log_10 2 ≅ 0.301 · log_2 n

O(log_10 n) = O(0.301... · log_2 n) = O(log_2 n)

a > 1, b > 1:   O(log_a n) = O(log_b n) = O(log n)
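Because the two logarithms differ only by the constant factor log_10 2, their ratio is the same for every n, which a short check makes visible:

# log10(n) / log2(n) is the constant log10(2) ≈ 0.301 for every n > 1.
[10, 1_000, 1_000_000, 10**9].each do |n|
  printf("n = %12d    log10(n)/log2(n) = %.6f\n", n, Math.log10(n) / Math.log2(n))
end
# A constant factor is absorbed by c in the definition of O,
# so O(log_10 n) = O(log_2 n) = O(log n).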

 

Summary

 

Homework

(no need to submit)

Review this lecture's material every day!

On the Web, find algorithms with time complexity O(1), O(log n), O(n), O(n log n), O(n^2), O(n^3), O(2^n), O(n!), and so on.

 

Glossary

big-O notation
O 記法 (O そのものは漸近記号ともいう)
asymptotic growth
漸近的な増加
approximate
近似する
essence
本質
constant factor
一定の係数、定倍数
eventually
最終的に
linear growth
線形増加
quadratic growth
二次増加
cubic growth
三次増加
logarithmic growth
対数増加
exponential growth
指数増加
Omega (Ω)
オメガ (大文字)
capital letter
大文字
Theta (Θ)
シータ (大文字)
asymptotic upper bound
漸近的上界
asymptotic lower bound
漸近的下界
appropriate
適切
limit
極限
polynomial
多項式
term
(式の) 項
logarithm
対数
base
(対数の) 底