COL351: Slides for Lecture Component 18
SLIDE 1

Thanks to Miles Jones, Russell Impagliazzo, and Sanjoy Dasgupta at UCSD for these slides.

COL351: Slides for Lecture Component 18

SLIDE 2

Master Theorem

  • How do you solve a recurrence of the form T(n) = aΒ·T(n/b) + O(n^d)? We will use the Master Theorem.

SLIDE 3

Summation Lemma

Consider the summation Ξ£_{i=0}^{n} c^i. It behaves differently for different values of c.

SLIDE 4

Summation Lemma

Consider the summation Ξ£_{i=0}^{n} c^i. It behaves differently for different values of c. If c < 1 then this sum converges. This means that the sum is bounded above by some constant k. Therefore if c < 1, then

Ξ£_{i=0}^{n} c^i < k for all n, so Ξ£_{i=0}^{n} c^i ∈ O(1).

SLIDE 5

Summation Lemma

Consider the summation Ξ£_{i=0}^{n} c^i. It behaves differently for different values of c. If c = 1 then this sum is just summing 1 over and over, n + 1 times. Therefore if c = 1, then

Ξ£_{i=0}^{n} c^i = Ξ£_{i=0}^{n} 1 = n + 1 ∈ O(n).

SLIDE 6

Summation Lemma

Consider the summation Ξ£_{i=0}^{n} c^i. It behaves differently for different values of c. If c > 1 then this sum is exponential with base c. If c > 1, then

Ξ£_{i=0}^{n} c^i < kΒ·c^n for all n, so Ξ£_{i=0}^{n} c^i ∈ O(c^n), where k > c/(c βˆ’ 1).

SLIDE 7

Summation Lemma

Consider the summation Ξ£_{i=0}^{n} c^i. It behaves differently for different values of c:

Ξ£_{i=0}^{n} c^i ∈ O(1) if c < 1; O(n) if c = 1; O(c^n) if c > 1.
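The three regimes of the lemma can be sanity-checked numerically; this is our own sketch (the function name geometric_sum is not from the slides):

```python
# Quick check of the summation lemma: sum_{i=0}^{n} c^i falls into
# O(1), O(n), or O(c^n) depending on c.

def geometric_sum(c, n):
    """Compute sum_{i=0}^{n} c**i directly."""
    return sum(c**i for i in range(n + 1))

# c < 1: bounded by the constant 1/(1 - c) for every n.
assert geometric_sum(0.5, 1000) < 1 / (1 - 0.5) + 1e-9
# c = 1: exactly n + 1.
assert geometric_sum(1, 1000) == 1001
# c > 1: bounded by k*c^n with k = c/(c - 1).
c, n = 2, 30
assert geometric_sum(c, n) < (c / (c - 1)) * c**n
```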

SLIDE 8

Master Theorem

Master Theorem: If T(n) = aΒ·T(n/b) + O(n^d) for some constants a > 0, b > 1, d β‰₯ 0, then

T(n) ∈ O(n^d) if a < b^d; O(n^d log n) if a = b^d; O(n^{log_b a}) if a > b^d.
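The three cases can be wrapped in a small helper for checking examples; this classifier is our own sketch (the function name and output format are assumptions, not from the slides):

```python
import math

def master_theorem(a, b, d):
    """Classify T(n) = a*T(n/b) + O(n^d) per the Master Theorem.

    Assumes a > 0, b > 1, d >= 0; returns the O-bound as a string.
    """
    if a < b**d:
        return f"O(n^{d})"                          # top-heavy: root dominates
    if a == b**d:
        return f"O(n^{d} log n)"                    # equal work per level
    return f"O(n^{round(math.log(a, b), 2)})"       # bottom-heavy: leaves dominate

# Grade-school recursive multiply: T(n) = 4T(n/2) + O(n) -> O(n^2)
assert master_theorem(4, 2, 1) == "O(n^2.0)"
# Karatsuba: T(n) = 3T(n/2) + O(n) -> O(n^1.58)
assert master_theorem(3, 2, 1) == "O(n^1.58)"
```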

SLIDE 9

Master Theorem: Solving the recurrence

T(n) = aΒ·T(n/b) + O(n^d)

Size n: 1 subproblem. Size n/b: a subproblems. Size n/bΒ²: aΒ² subproblems. … Size 1: a^{log_b n} subproblems. Depth: log_b n.

SLIDE 10

Master Theorem: Solving the recurrence

After k levels, there are a^k subproblems, each of size n/b^k.

So, during the kth level of recursion, the time complexity is O((n/b^k)^d Β· a^k) = O(n^d Β· (a/b^d)^k).

SLIDE 11

Master Theorem: Solving the recurrence

After k levels, there are a^k subproblems, each of size n/b^k. So, during the kth level, the time complexity is O((n/b^k)^d Β· a^k) = O(n^d Β· (a/b^d)^k).

After log_b n levels, the subproblem size is reduced to 1, which usually is the size of the base case.

So the entire algorithm is a sum over the levels:

T(n) = O(n^d Β· Ξ£_{k=0}^{log_b n} (a/b^d)^k)

SLIDE 12

Master Theorem: Proof

T(n) = O(n^d Β· Ξ£_{k=0}^{log_b n} (a/b^d)^k)

Case 1: a < b^d. Then we have that a/b^d < 1 and the series converges to a constant, so

T(n) = O(n^d).

SLIDE 13

Master Theorem: Proof

T(n) = O(n^d Β· Ξ£_{k=0}^{log_b n} (a/b^d)^k)

Case 2: a = b^d. Then we have that a/b^d = 1 and so each term is equal to 1, so

T(n) = O(n^d Β· log_b n) = O(n^d log n).

SLIDE 14

Master Theorem: Proof

T(n) = O(n^d Β· Ξ£_{k=0}^{log_b n} (a/b^d)^k)

Case 3: a > b^d. Then the summation is exponential and grows proportionally to its last term, (a/b^d)^{log_b n}, so

T(n) = O(n^d Β· (a/b^d)^{log_b n}) = O(n^{log_b a}).

SLIDE 15

Master Theorem

Theorem: If T(n) = aΒ·T(n/b) + O(n^d) for some constants a > 0, b > 1, d β‰₯ 0, then

T(n) ∈ O(n^d) if a < b^d (top-heavy); O(n^d log n) if a = b^d (steady-state); O(n^{log_b a}) if a > b^d (bottom-heavy).

SLIDE 16

Master Theorem Applied to Multiply

The recursion for the runtime of Multiply is T(n) = 4T(n/2) + cn. So we have that a = 4, b = 2, and d = 1. In this case a > b^d, so T(n) ∈ O(n^{log_2 4}) = O(nΒ²). Not any improvement on the grade-school method.

T(n) ∈ O(n^d) if a < b^d; O(n^d log n) if a = b^d; O(n^{log_b a}) if a > b^d.

SLIDE 17

Master Theorem Applied to MultiplyKS

The recursion for the runtime of MultiplyKS (Karatsuba) is T(n) = 3T(n/2) + cn. So we have that a = 3, b = 2, and d = 1. In this case a > b^d, so T(n) ∈ O(n^{log_2 3}) = O(n^{1.58}). An improvement on the grade-school method!

T(n) ∈ O(n^d) if a < b^d; O(n^d log n) if a = b^d; O(n^{log_b a}) if a > b^d.

SLIDE 18

Poll: What is the fastest known integer multiplication time?

  • O(n^{1.58})
  • O(n log n log log n)
  • O(n log n Β· 2^{O(logβˆ— n)})
  • O(n log n)
  • O(n)
SLIDE 19

Poll: What is the fastest known integer multiplication time? All have/will be correct.

  • O(n^{1.58}): Karatsuba
  • O(n log n log log n): SchΓΆnhage-Strassen, 1971
  • O(n log n Β· 2^{c logβˆ— n}): FΓΌrer, 2007
  • O(n log n): Harvey and van der Hoeven, 2019
  • O(n): you, tomorrow?
SLIDE 20

Can we do better than n^{1.58}?

  • Could any multiplication algorithm have a faster asymptotic runtime than O(n^{1.58})?
  • Any ideas?
SLIDE 21

Can we do better than n^{1.58}?

  • What if instead of splitting the number in half, we split it into thirds?
  • x = xL, xM, xR (the three thirds of x's digits)
  • y = yL, yM, yR
SLIDE 22

Can we do better than n^{1.58}?

  • What if instead of splitting the number in half, we split it into thirds?
  • x = 2^{2n/3}Β·xL + 2^{n/3}Β·xM + xR
  • y = 2^{2n/3}Β·yL + 2^{n/3}Β·yM + yR
SLIDE 23

Multiplying trinomials

  • (axΒ² + bx + c)(dxΒ² + ex + f)
SLIDE 24

Multiplying trinomials

  • (axΒ² + bx + c)(dxΒ² + ex + f) = adΒ·x⁴ + (ae + bd)Β·xΒ³ + (af + be + cd)Β·xΒ² + (bf + ce)Β·x + cf

9 multiplications means 9 recursive calls. Each multiplication is 1/3 the size of the original.
SLIDE 25

Multiplying trinomials

  • (axΒ² + bx + c)(dxΒ² + ex + f) = adΒ·x⁴ + (ae + bd)Β·xΒ³ + (af + be + cd)Β·xΒ² + (bf + ce)Β·x + cf

9 multiplications means 9 recursive calls. Each multiplication is 1/3 the size of the original.

T(n) = 9T(n/3) + O(n)
SLIDE 26

Multiplying trinomials

  • (axΒ² + bx + c)(dxΒ² + ex + f) = adΒ·x⁴ + (ae + bd)Β·xΒ³ + (af + be + cd)Β·xΒ² + (bf + ce)Β·x + cf

T(n) = 9T(n/3) + O(n)

T(n) ∈ O(n^d) if a < b^d; O(n^d log n) if a = b^d; O(n^{log_b a}) if a > b^d.

With a = 9, b = 3, d = 1: 9 > 3ΒΉ, so T(n) = O(n^{log_3 9}) = O(nΒ²).

SLIDE 27

Multiplying trinomials

  • (axΒ² + bx + c)(dxΒ² + ex + f) = adΒ·x⁴ + (ae + bd)Β·xΒ³ + (af + be + cd)Β·xΒ² + (bf + ce)Β·x + cf
  • There is a way to reduce from 9 multiplications down to just 5!
  • Then the recursion becomes
  • T(n) = 5T(n/3) + O(n)
  • So by the master theorem…
SLIDE 28

Multiplying trinomials

  • (axΒ² + bx + c)(dxΒ² + ex + f) = adΒ·x⁴ + (ae + bd)Β·xΒ³ + (af + be + cd)Β·xΒ² + (bf + ce)Β·x + cf
  • There is a way to reduce from 9 multiplications down to just 5!
  • Then the recursion becomes
  • T(n) = 5T(n/3) + O(n)
  • So by the master theorem, T(n) = O(n^{log_3 5}) β‰ˆ O(n^{1.46})
SLIDE 29

Dividing into k subproblems

  • What happens if we divide into k subproblems, each of size n/k?
  • (b_{k-1}Β·x^{k-1} + b_{k-2}Β·x^{k-2} + β‹― + b_1Β·x + b_0)(c_{k-1}Β·x^{k-1} + c_{k-2}Β·x^{k-2} + β‹― + c_1Β·x + c_0)
  • How many terms (multiplications) are there?
SLIDE 30

Dividing into k subproblems

  • What happens if we divide into k subproblems, each of size n/k?
  • (b_{k-1}Β·x^{k-1} + b_{k-2}Β·x^{k-2} + β‹― + b_1Β·x + b_0)(c_{k-1}Β·x^{k-1} + c_{k-2}Β·x^{k-2} + β‹― + c_1Β·x + c_0)
  • How many terms (multiplications) are there?
  • There are kΒ² multiplications. The recursion is

T(n) = kΒ²Β·T(n/k) + O(n). With a = kΒ², b = k, d = 1: T(n) = O(n^{log_k kΒ²}) = O(nΒ²).
SLIDE 31

Cook-Toom algorithm

  • In fact, if you split up your number into k equally sized parts, then you can combine them with 2k βˆ’ 1 multiplications instead of the kΒ² individual multiplications.
  • This means that you can get an algorithm that runs in
  • T(n) = (2k βˆ’ 1)Β·T(n/k) + O(n)
SLIDE 32

Cook-Toom algorithm

  • In fact, if you split up your number into k equally sized parts, then you can combine them with 2k βˆ’ 1 multiplications instead of the kΒ² individual multiplications.
  • This means that you can get an algorithm that runs in
  • T(n) = (2k βˆ’ 1)Β·T(n/k) + O(n)
  • T(n) = O(n^{log(2k βˆ’ 1)/log k}) time!

SLIDE 33

Cook-Toom algorithm

T(n) = (2k βˆ’ 1)Β·T(n/k) + O(n)

  • T(n) = O(n^{log(2k βˆ’ 1)/log k}) time.
  • So we can have a near-linear time algorithm if we take k to be sufficiently large. But the constant hidden in the O(n) term of the recursion grows with k. So is it worth it to make k very large?
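How the exponent log(2k βˆ’ 1)/log k shrinks toward 1 as k grows can be checked directly; a small sketch (the function name is ours):

```python
import math

# Exponent log_k(2k - 1) for splitting into k parts (Cook-Toom),
# versus log_k(k^2) = 2 for the naive k^2 multiplications.

def cook_toom_exponent(k):
    return math.log(2 * k - 1) / math.log(k)

assert abs(cook_toom_exponent(2) - math.log2(3)) < 1e-12  # Karatsuba, ~1.585
assert cook_toom_exponent(3) < cook_toom_exponent(2)      # ~1.465: better
assert cook_toom_exponent(100) < 1.2                      # approaches 1 as k grows
```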

SLIDE 34

CSE101: Algorithm Design and Analysis

Russell Impagliazzo, Sanjoy Dasgupta, Ragesh Jaiswal (thanks for slides: Miles Jones)

Week-06 Lecture 24: Divide and Conquer (Trees and Computational Geometry)

SLIDE 35

Divide and Conquer Trees

  • Let’s say we have a full and balanced binary tree (all parents have two children and all leaves are on the bottom level).

SLIDE 36

Divide and Conquer Trees

  • Notice that each child’s subtree is half of the problem, so we get a nice divide and conquer structure.

SLIDE 37

Divide and Conquer Trees

  • If the tree is uneven, we can still use the same strategy, but we need to take a bit of care when calculating runtime.

SLIDE 38

Least common ancestor

  • Given a binary tree with n vertices, we wish to compute LCA(u, v) for each pair of vertices u, v.
  • LCA(u, v) is the least common ancestor of u and v, or in other words, the β€œyoungest” common ancestor of u and v.
  • For example, the LCA of me and my brother is our parent. The LCA of me and my uncle is my grandparent (his parent). A vertex can be its own ancestor, so the LCA of me and my father is my father.

SLIDE 39

Least common ancestor

  • What pairs of vertices will have the root r as their least common ancestor?

SLIDE 40

Least common ancestor

  • What pairs of vertices will have the root r as their least common ancestor?
  • For each vertex w, set lca(w, r) = r.
  • For each pair of vertices v, w such that v is in the left subtree and w is in the right subtree, set lca(v, w) = r.
  • Now what? Are we done?
  • Recurse on the left and right subtrees!
SLIDE 41

Pseudocode

def LCA(r):
    Lsubtree = explore(r.lc)
    Rsubtree = explore(r.rc)
    for all vertices v in Lsubtree: lca(v, r) = r
    for all vertices w in Rsubtree: lca(r, w) = r
    for all vertices v in Lsubtree:
        for all vertices w in Rsubtree: lca(v, w) = r
    LCA(r.lc)
    LCA(r.rc)
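A runnable version of this pseudocode, using our own Node class and a dictionary as the lca table (both are scaffolding we introduce, not part of the slides; we also record lca(r, r) = r, which the pseudocode leaves implicit):

```python
# Runnable sketch of the slides' LCA pseudocode.

class Node:
    def __init__(self, key, lc=None, rc=None):
        self.key, self.lc, self.rc = key, lc, rc

def explore(r):
    """Collect all vertices in the subtree rooted at r."""
    return [] if r is None else [r] + explore(r.lc) + explore(r.rc)

def LCA(r, lca):
    if r is None:
        return
    Lsubtree, Rsubtree = explore(r.lc), explore(r.rc)
    lca[(r.key, r.key)] = r.key          # a vertex is its own ancestor
    for v in Lsubtree:
        lca[(v.key, r.key)] = r.key
    for w in Rsubtree:
        lca[(r.key, w.key)] = r.key
    for v in Lsubtree:                   # cross pairs: the root is the LCA
        for w in Rsubtree:
            lca[(v.key, w.key)] = r.key
    LCA(r.lc, lca)
    LCA(r.rc, lca)

#        1
#       / \
#      2   3
root = Node(1, Node(2), Node(3))
table = {}
LCA(root, table)
assert table[(2, 3)] == 1
assert table[(2, 1)] == 1
```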

SLIDE 42

Pseudocode (runtime)

def LCA(r):
    Lsubtree = explore(r.lc)
    Rsubtree = explore(r.rc)
    for all vertices v in Lsubtree: lca(v, r) = r
    for all vertices w in Rsubtree: lca(r, w) = r
    for all vertices v in Lsubtree:
        for all vertices w in Rsubtree: lca(v, w) = r
    LCA(r.lc)
    LCA(r.rc)

If the binary tree is balanced, then each recursive call is of size (n βˆ’ 1)/2, or roughly half.

How long does the non-recursive part take?

SLIDE 43

Pseudocode (runtime)

def LCA(r):
    Lsubtree = explore(r.lc)
    Rsubtree = explore(r.rc)
    for all vertices v in Lsubtree: lca(v, r) = r
    for all vertices w in Rsubtree: lca(r, w) = r
    for all vertices v in Lsubtree:
        for all vertices w in Rsubtree: lca(v, w) = r
    LCA(r.lc)
    LCA(r.rc)

If the binary tree is balanced, then each recursive call is of size (n βˆ’ 1)/2, or roughly half.

How long does the non-recursive part take? O(nΒ²), so

T(n) = 2Β·T((n βˆ’ 1)/2) + O(nΒ²)

Using the master theorem with a = 2, b = 2, d = 2: T(n) = O(nΒ²).

SLIDE 44

Pseudocode (runtime, uneven)

def LCA(r):
    Lsubtree = explore(r.lc)
    Rsubtree = explore(r.rc)
    for all vertices v in Lsubtree: lca(v, r) = r
    for all vertices w in Rsubtree: lca(r, w) = r
    for all vertices v in Lsubtree:
        for all vertices w in Rsubtree: lca(v, w) = r
    LCA(r.lc)
    LCA(r.rc)

If the binary tree is uneven, then the runtime recurrence is

T(n) = T(L) + T(R) + O(LΒ·R)

where L is the size of the left subtree and R is the size of the right subtree. What do you think the total runtime will be? Take a guess and we can check it!

SLIDE 45

Uneven DC runtime

  • T(n) = T(L) + T(R) + O(LΒ·R)
  • We guess that it would take O(nΒ²). So let’s try to prove this using induction.
  • Claim: T(n) ≀ cΒ·nΒ² for all n β‰₯ 1 and for some constant c that is bigger than T(1) and bigger than the coefficient in the O(LΒ·R) term.

SLIDE 46

Uneven DC runtime

  • Base case: T(1) < cΒ·1Β². True by choice of c.
  • Suppose that for some n > 1, T(k) < cΒ·kΒ² for all k such that 1 ≀ k < n.
  • Then

T(n) < T(L) + T(R) + cΒ·LΒ·R ≀ cΒ·LΒ² + cΒ·RΒ² + cΒ·LΒ·R < cΒ·LΒ² + cΒ·RΒ² + 2cΒ·LΒ·R = cΒ·(L + R)Β² = cΒ·(n βˆ’ 1)Β² < cΒ·nΒ²

SLIDE 47

Make Heap

  • Problem: Given a list of n elements, form a heap containing all elements.
SLIDE 48

Divide and conquer strategy

  • Assume n = 2^k βˆ’ 1. (Add blank elements if needed.)
  • Divide the list into two lists of size (n βˆ’ 1)/2 and a left-over element.
  • Make heaps with both (in the sub-trees of the root).
  • Put the left-over element at the root.
  • β€œTrickle down” the top element to reinstate the heap property.
SLIDE 49

Time analysis

  • To solve one problem, we solve two problems of half the size, and then spend constant time per depth of the tree.
  • T(n) = T( ) + O( )
SLIDE 50

Time analysis

  • To solve one problem, we solve two problems of half the size, and then spend constant time per depth of the tree.
  • T(n) = 2Β·T(n/2) + O(log n)
  • Doesn’t fit the master theorem.
SLIDE 51

Time analysis: sandwiching

  • To solve one problem, we solve two problems of half the size, and then spend constant time per depth of the tree.
  • T(n) = 2Β·T(n/2) + O(log n)
  • Define L(n) = 2Β·L(n/2) + O(1) and H(n) = 2Β·H(n/2) + O(n^{1/2}); since 1 ∈ O(log n) and log n ∈ O(n^{1/2}),
  • L(n) < T(n) < H(n)
  • Apply the master theorem: both L(n) and H(n) are O(n),
  • so T(n) is O(n)
SLIDE 52

Minimum distance

  • Given a list of coordinates, [(x₁, y₁), …, (xβ‚™, yβ‚™)], find the distance between the closest pair.
  • Brute force solution?

min = ∞
for i from 1 to n-1:
    for j from i+1 to n:
        if min > distance((xα΅’, yα΅’), (xβ±Ό, yβ±Ό)):
            min = distance((xα΅’, yα΅’), (xβ±Ό, yβ±Ό))
return min
SLIDE 53

Example

(Figure: a set of points in the plane, with axes y and x.)

SLIDE 54

Example

(Figure: the same point set, with the median x-coordinate marked.)

SLIDE 55

Divide and conquer

  • Partition the points by x, according to whether they are to the left or right of the median.
  • Recursively find the minimum-distance points on the two sides.
  • Need to compare to the smallest β€œcross distance” between a point on the left and a point on the right.
  • Only need to look at β€œclose” points.
SLIDE 56

Combine

  • How will we use this information to find the distance of the closest pair in the whole set?
  • We must consider if there is a closest pair where one point is in the left half and one is in the right half.
  • How do we do this?
  • Let Ξ΄ = min(Ξ΄_L, Ξ΄_R) and compare only the points (xα΅’, yα΅’) such that x_m βˆ’ Ξ΄ ≀ xα΅’ and xα΅’ ≀ x_m + Ξ΄, where x_m is the median x-coordinate.

SLIDE 57

Example

(Figure: the point set with the dividing line and the strip S of width 2Ξ΄ around it.)
SLIDE 58

Combine

  • How will we use this information to find the distance of the closest pair in the whole set?
  • We must consider if there is a closest pair where one point is in the left half and one is in the right half.
  • How do we do this?
  • Let Ξ΄ = min(Ξ΄_L, Ξ΄_R) and compare only the points (xα΅’, yα΅’) such that x_m βˆ’ Ξ΄ ≀ xα΅’ and xα΅’ ≀ x_m + Ξ΄.
  • Worst case, how many points could this be?
SLIDE 59

Combine step

  • Given a point (x, y) in the strip S, let’s look in a 2δ×δ rectangle with that point at its upper boundary.
  • There could not be more than 8 points total, because if we divide the rectangle into 8 squares of size Ξ΄/2 Γ— Ξ΄/2, then there can never be more than one point per square.
  • Why?

SLIDE 60

Combine step

  • So instead of comparing (x, y) with every other point in S, we only have to compare it with at most a constant c points lower than it (smaller y).
  • To gain quick access to these points, let’s sort the points in S by y values.
  • The points above must be within the c points before our current point in this sorted list.
  • Now, if there are k vertices in S, we have to sort the vertices in O(k log k) time and make at most cΒ·k comparisons in O(k) time, for a total combine step of O(k log k).
  • But we said that in the worst case there are n vertices in S, and so in the worst case the combine step takes O(n log n) time.

SLIDE 61

Time analysis

  • But we said that in the worst case there are n vertices in S, and so in the worst case the combine step takes O(n log n) time.
  • Runtime recursion:

T(n) = 2Β·T(n/2) + O(n log n)

This gives T(n) = O(n (log n)Β²).

Pre-processing: sort by both x and y, and keep pointers between the sorted lists. Maintaining the sorted order in the recursive calls reduces the combine step to O(n), so the recurrence becomes T(n) = 2Β·T(n/2) + O(n), and T(n) is O(n log n).
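The whole scheme can be sketched in one function: pre-sort by x and y, split at the median x, recurse on both halves while maintaining y-order, then scan the strip in y-order comparing each point only with a bounded number of successors. This is our own sketch assuming distinct points, not the slides' exact code:

```python
import math

# Divide-and-conquer closest pair. Sorting by y is maintained through the
# recursion, so the strip scan is linear per level.

def closest_pair(pts):
    px = sorted(pts)                              # sorted by x
    py = sorted(pts, key=lambda p: (p[1], p[0]))  # sorted by y

    def rec(px, py):
        n = len(px)
        if n <= 3:                                # base case: brute force
            return min((math.dist(a, b)
                        for i, a in enumerate(px) for b in px[i+1:]),
                       default=math.inf)
        mid = n // 2
        xm = px[mid][0]
        left = set(px[:mid])
        pyl = [p for p in py if p in left]        # keep y-order in O(n)
        pyr = [p for p in py if p not in left]
        d = min(rec(px[:mid], pyl), rec(px[mid:], pyr))
        strip = [p for p in py if abs(p[0] - xm) < d]
        for i, p in enumerate(strip):             # only a few successors matter
            for q in strip[i+1:i+8]:
                d = min(d, math.dist(p, q))
        return d

    return rec(px, py)

pts = [(0, 0), (3, 4), (1, 1), (10, 10), (1.5, 1.2)]
assert closest_pair(pts) == math.dist((1, 1), (1.5, 1.2))
```

The slice strip[i+1:i+8] is the β€œat most a constant c points” from the slides: in a 2Ξ΄-wide strip, at most 7 later points in y-order can be within distance d.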