Perfect Matchings on 3-Regular, Bridgeless Graphs, by Marcelo Siqueira (PowerPoint presentation)

SLIDE 1

Perfect Matchings on 3-Regular, Bridgeless Graphs

Marcelo Siqueira

DMAT-UFRN mfsiqueira@mat.ufrn.br

SLIDE 5

Preliminaries

◮ Let G = (V, E) be a finite and undirected graph.

◮ A matching M on G is any subset of the set E of edges of G such that no two edges of M share a vertex in the set V of vertices of G.

[Figure: a four-vertex example with matching M = {{v1, v4}, {v2, v3}}.]

◮ Edges in M are called matching edges.

◮ Vertices of matching edges are said to be matched, or covered, by M.
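The condition in this definition is easy to check mechanically. A minimal sketch, not from the slides, with each edge written as a pair of vertex names:

```python
def is_matching(edges):
    """True iff no two of the given edges share a vertex."""
    seen = set()
    for u, v in edges:
        if u in seen or v in seen:
            return False          # some vertex appears in two edges
        seen.add(u)
        seen.add(v)
    return True

# The slide's example on vertices v1, ..., v4:
M = [("v1", "v4"), ("v2", "v3")]
print(is_matching(M))                      # True
print(is_matching(M + [("v1", "v2")]))     # False: v1 is reused
```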

SLIDE 8

Preliminaries

◮ A matching M on G is a maximum cardinality matching on G if and only if |M| ≥ |M′| for every matching M′ on G.

◮ The set M = {{v1, v4}, {v2, v3}} is a maximum cardinality matching on the four-vertex example above.

◮ A matching M on G is said to be perfect if and only if all vertices of G are matched by M. So, the matching M in the above example is perfect.
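Both notions on this slide reduce to a few lines of bookkeeping. A minimal sketch, not from the slides:

```python
def is_perfect_matching(vertices, matching):
    """True iff `matching` is a matching that covers every vertex."""
    covered = [v for edge in matching for v in edge]
    # No repeats (the matching property) and no vertex left uncovered.
    return len(covered) == len(set(covered)) and set(covered) == set(vertices)

V = ["v1", "v2", "v3", "v4"]
M = [("v1", "v4"), ("v2", "v3")]
print(is_perfect_matching(V, M))           # True: M is perfect here
print(is_perfect_matching(V + ["v5"], M))  # False: v5 is unmatched
```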

SLIDE 12

Preliminaries

◮ Every perfect matching is a maximum cardinality matching.

◮ The converse is not true:

[Figure: a five-vertex example with maximum cardinality matching M = {{v1, v3}, {v4, v5}}; vertex v2 is left unmatched.]

◮ There are graphs that always admit perfect matchings (and they show up in graphics applications): the so-called 3-regular, bridgeless graphs.

SLIDE 17

Preliminaries

Theorem (Petersen, 1891)

Every 3-regular and bridgeless graph admits a perfect matching.

[Figure: a 3-regular, bridgeless example graph on vertices v1, ..., v6.]

◮ G is 3-regular if and only if every vertex of G has degree 3.

◮ Recall that an edge e of a graph G is said to be a bridge (or a cut edge) of G if and only if G − e has more connected components than G.

◮ G is bridgeless if and only if G has no bridges.
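These definitions translate directly into a deliberately naive test, a sketch not from the slides: 3-regularity by degree counting, and bridgelessness by deleting each edge in turn and recounting connected components, exactly as defined above. Vertices are 0..n−1; parallel edges and loops are allowed.

```python
from collections import defaultdict, deque

def components(n, edges, skip=None):
    """Number of connected components, optionally ignoring edge index `skip`."""
    adj = defaultdict(list)
    for i, (u, v) in enumerate(edges):
        if i != skip:
            adj[u].append(v)
            adj[v].append(u)
    seen, count = set(), 0
    for s in range(n):
        if s in seen:
            continue
        count += 1
        seen.add(s)
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
    return count

def is_cubic_bridgeless(n, edges):
    """True iff every vertex has degree 3 (a loop counts twice) and
    deleting any single edge leaves the component count unchanged."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    if any(deg[v] != 3 for v in range(n)):
        return False
    base = components(n, edges)
    return all(components(n, edges, skip=i) == base for i in range(len(edges)))

K4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(is_cubic_bridgeless(4, K4))                        # True
print(is_cubic_bridgeless(2, [(0, 1), (0, 1), (0, 1)]))  # True: a triple edge
```

This costs O(m(n + m)); later slides replace the connectivity check with a single DFS and then with dynamic data structures.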

SLIDE 20

Problem Statement

◮ Input:

A 3-regular, bridgeless graph G = (V , E).

◮ Output:

A perfect matching M on G.

◮ Assumption:

G is finite and connected.

[Figure: the example graph on vertices v1, ..., v6.]

SLIDE 29

A Bit of History

◮ J. Edmonds’ blossom-shrinking algorithm (CJM, 1965).

◮ The original algorithm finds maximum cardinality matchings on general graphs in O(n^4) time, where n = |V| is the number of vertices of G.

◮ Careful implementations can lower the above upper bound:

◮ O(n^3) by Gabow (J. of ACM, 1976).
◮ O(mnα(m, n)) by Tarjan (1985), where m = |E| is the number of edges of G.
◮ O(mn) by Gabow and Tarjan (J. of ACM, 1991).

◮ New ideas incorporated by Micali and Vazirani yielded the best known upper bound for the blossom-shrinking algorithm: O(m√n) (FOCS, 1980).

◮ For cubic graphs, m = Θ(n). So, we can compute a perfect matching on cubic, bridgeless graphs in O(n^1.5) time using an algorithm for general graphs!

◮ Can we do better? YES.

SLIDE 33

A Bit of History

◮ The main idea behind the previous algorithms is to try to find an augmenting path on G with respect to a current matching M on G.

Theorem (Berge, 1957)

Let G be a graph without loops, and let M be any matching on G. Then M is a maximum cardinality matching on G if and only if G has no augmenting path with respect to M.

◮ All improvements on the upper bound of the original Edmonds blossom-shrinking algorithm come from how “efficiently” an augmenting path (with respect to the current matching) can be found, if one exists.

◮ Note: the best known upper bound has stood for about 35 years!
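Berge’s theorem also gives the basic step these algorithms repeat: once an augmenting path is found, the matching grows by one edge via a symmetric difference. A minimal sketch of that single step (not the blossom-shrinking algorithm itself), with edges as frozensets:

```python
def augment(matching, path_edges):
    """Replace M by M △ P for an augmenting path P; the result has one more edge."""
    M = {frozenset(e) for e in matching}
    P = {frozenset(e) for e in path_edges}
    return M ^ P                                     # symmetric difference

M = [("v2", "v3")]                                   # one matched edge
P = [("v1", "v2"), ("v2", "v3"), ("v3", "v4")]       # augmenting path v1-v2-v3-v4
print(sorted(sorted(e) for e in augment(M, P)))      # [['v1', 'v2'], ['v3', 'v4']]
```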

SLIDE 39

Frink’s Theorem

◮ It took a completely different paradigm to produce an algorithm for the class of 3-regular, bridgeless graphs with a better upper bound than O(n^1.5).

◮ The new “paradigm” comes from a constructive proof of Petersen’s theorem given by Frink Jr. in 1926 (Petersen’s original proof has some mistakes).

◮ From now on, assume that G is a connected, 3-regular, bridgeless graph.

◮ G is not necessarily simple.

◮ Let e be a simple edge (i.e., neither a loop nor a parallel edge) of G.

Teaser

What can you say about G if no edge of G is simple?

SLIDE 41

Frink’s Theorem

◮ A reduction operation is the key idea behind Frink’s proof:

[Figure: the reduction at the simple edge e = {u, w}. Deleting u and w with their incident edges e1, e2, e, e3, e4 and reconnecting the endpoints x1, x2, x3, x4 yields G1 (new edges e13 = {x1, x3} and e24 = {x2, x4}) or G2 (new edges e14 = {x1, x4} and e23 = {x2, x3}).]
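In code, the reduction can be sketched directly on an edge list. An illustrative sketch under the slide’s assumptions (the graph is cubic, e = {u, w} is simple, and neither u nor w carries a loop, so each has exactly two other edge endpoints, the x1, x2, x3, x4 of the figure):

```python
def frink_reduce(edges, u, w):
    """Delete u, w and their five incident edges, then reconnect the four
    remaining endpoints in the two possible ways, producing G1 and G2."""
    rest, ends_u, ends_w = [], [], []
    for a, b in edges:
        if {a, b} == {u, w}:
            continue                         # the reduction edge e itself
        if u in (a, b):
            ends_u.append(b if a == u else a)
        elif w in (a, b):
            ends_w.append(b if a == w else a)
        else:
            rest.append((a, b))
    x1, x2 = ends_u                          # endpoints of e1 and e2
    x3, x4 = ends_w                          # endpoints of e3 and e4
    G1 = rest + [(x1, x3), (x2, x4)]         # new edges e13 and e24
    G2 = rest + [(x1, x4), (x2, x3)]         # new edges e14 and e23
    return G1, G2

K4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
G1, G2 = frink_reduce(K4, 0, 1)
print(G1)   # [(2, 3), (2, 2), (3, 3)]  -- an edge plus a loop at each end
print(G2)   # [(2, 3), (2, 3), (3, 2)]  -- a triple edge
```

On this K4 example G1 has a bridge (the edge {2, 3} between the two loops), while G2 is bridgeless, so G2 is the copy guaranteed by the theorem on the next slides.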

SLIDE 43

Frink’s Theorem

◮ Frink proved the following:

Frink’s Theorem (Frink, 1926)

At least one of G1 or G2 is connected, 3-regular, and bridgeless.

[Figure: the reduction at e producing G1 and G2, as before.]

SLIDE 49

Frink’s Algorithm

◮ So what?

◮ Suppose G1 satisfies Frink’s theorem.

◮ By Petersen’s theorem, G1 admits a perfect matching.

◮ Let M1 be such a matching.

Key idea:

Build a perfect matching M on G from M1.

[Figure: the three possible cases for the new edges e13 and e24 with respect to M1.]
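The first two cases are mechanical; only the third needs the alternating-cycle trick of the next slides. A hedged sketch of the lifting step, with edges as frozensets and names following the figure (e = {u, w}; e1 = {u, x1}, e2 = {u, x2}, e3 = {w, x3}, e4 = {w, x4}; e13, e24 the edges added by the reduction):

```python
def expand(M1, e, e1, e2, e3, e4, e13, e24):
    """Lift a perfect matching M1 of the reduced graph G1 back to G,
    assuming the 'third case' (both e13 and e24 in M1) has already been
    eliminated by reversing an alternating cycle."""
    M = set(M1)
    if e13 in M and e24 in M:
        raise ValueError("reverse an alternating cycle through e13 or e24 first")
    if e13 in M:
        M.remove(e13)
        M.update({e1, e3})       # u pairs with x1 and w pairs with x3
    elif e24 in M:
        M.remove(e24)
        M.update({e2, e4})       # u pairs with x2 and w pairs with x4
    else:
        M.add(e)                 # neither new edge is matched: use e = {u, w}
    return M

fs = frozenset
e = fs({"u", "w"})
e1, e2 = fs({"u", "x1"}), fs({"u", "x2"})
e3, e4 = fs({"w", "x3"}), fs({"w", "x4"})
e13, e24 = fs({"x1", "x3"}), fs({"x2", "x4"})
print(expand({e13}, e, e1, e2, e3, e4, e13, e24) == {e1, e3})   # True
print(expand(set(), e, e1, e2, e3, e4, e13, e24) == {e})        # True
```

The fragments passed in here are not full perfect matchings; edges of M1 away from the reduction site simply carry over unchanged.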

SLIDE 55

Frink’s Algorithm

◮ How can we solve the third case?

[Figure: the third case, in which both edges added by the reduction belong to M1.]

A nice property of matching edges:

Every matching edge belongs to an alternating cycle.

◮ So, reverse an alternating cycle containing either {x1, x3} or {x2, x4}.

SLIDE 61

Frink’s Algorithm

◮ Reversing the alternating cycle produces one of the following:

[Figure: the possible configurations of the matching at x1, x2, x3, x4 after the reversal.]

◮ Good, but...

◮ How can we find the alternating cycle in the first place?

◮ By finding an augmenting path on G1 − {x1, x3} or G1 − {x2, x4}.

◮ What is the cost?

◮ O(m) amortized time (be careful here!).
SLIDE 63

Frink’s Algorithm

◮ A “little” problem remains unsolved:

◮ How can we decide which graph (G1 or G2) satisfies Frink’s theorem?

[Figure: the reduction at e producing G1 and G2, as before.]

SLIDE 66

Frink’s Algorithm

◮ Counting biconnected components of G1 and G2.

◮ This can be done with a DFS in O(n + m) = O(n) time (not the best bound).

◮ So, we can compute a perfect matching on G in O(n^2) time.

[Figure: the reduction at e producing G1 and G2, as before.]
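The DFS the slide refers to is the classical Hopcroft–Tarjan traversal. A sketch, not the talk’s implementation: it counts biconnected components by stacking edges and popping one component whenever a low-link value reveals an articulation point (or the root). On a cubic graph m = 3n/2, so O(n + m) = O(n).

```python
from collections import defaultdict

def count_biconnected_components(n, edges):
    """Count biconnected components of an undirected multigraph in O(n + m)."""
    adj = defaultdict(list)
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    disc, low = {}, {}
    timer, count, estack = [0], [0], []

    def dfs(u, parent_eid):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        for v, eid in adj[u]:
            if eid == parent_eid:
                continue                     # don't walk the tree edge back
            if v not in disc:
                estack.append(eid)
                dfs(v, eid)
                low[u] = min(low[u], low[v])
                if low[v] >= disc[u]:        # u separates v's subtree:
                    while estack.pop() != eid:
                        pass                 # pop exactly one component
                    count[0] += 1
            elif disc[v] < disc[u]:          # a back edge, pushed only once
                estack.append(eid)
                low[u] = min(low[u], disc[v])

    for s in range(n):
        if s not in disc:
            dfs(s, -1)
    return count[0]

K4 = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
print(count_biconnected_components(4, K4))                # 1
print(count_biconnected_components(3, [(0, 1), (1, 2)]))  # 2: each bridge is its own component
```

Because parallel edges get distinct ids, a doubled edge is correctly treated as a cycle rather than a bridge.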

SLIDE 71

Avoiding Alternating Cycle Reversal

◮ We can lower the O(n^2) upper bound to O(n lg^4 n) by making two changes to the previous algorithm (Biedl, Bose, Demaine, Lubiw, 2001).

◮ First, fix an edge f of the input graph G such that

◮ f is adjacent to the reduction edge before a reduction,
◮ f becomes one of the new edges right after the reduction, and
◮ f never becomes a matching edge.

[Figure: a reduction at e in which the fixed edge f survives as one of the new edges of G1 and of G2.]

SLIDE 74

Avoiding Alternating Cycle Reversal

◮ What if there is no simple edge adjacent to f ?

[Figure: a configuration on vertices w, x, y, z with edges f, g, and h.]

◮ No big deal...

[Figure: the transformed configuration, which yields a new simple edge j.]

◮ So, each reduction takes constant time now!

SLIDE 75

A Faster Biconnectivity Test

◮ Recalling...

◮ How can we decide which graph (G1 or G2) satisfies Frink’s theorem?

[Figure: the reduction at e producing G1 and G2, as before.]

SLIDE 86

A Faster Biconnectivity Test

◮ Resort to a dynamic graph connectivity data structure (Holm et al., 2001).

◮ Consider G1.

◮ Solve the 2-edge-connectivity problem for each pair in {x1, x2, x3, x4}.

◮ Each test takes O(lg^4 n) amortized time.

◮ This gives us the O(n lg^4 n) amortized time upper bound.

◮ Can we do better? YES: O(n lg^2 n).

◮ This is due to Diks and Stanczyk’s improvements.

◮ A student of mine implemented Diks and Stanczyk’s algorithm.

◮ His source code is freely and publicly available.

◮ We can talk about Diks and Stanczyk’s algorithm some other time...
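For intuition, the query the dynamic structure answers can be stated statically. A naive sketch, not Holm et al.’s structure: two vertices are 2-edge-connected iff no single edge deletion disconnects them. This costs O(m^2) per query, versus the O(lg^4 n), or O(lg^2 n), amortized bounds above.

```python
from collections import defaultdict, deque

def two_edge_connected(edges, x, y):
    """True iff x and y stay connected after deleting any one edge."""
    def connected_without(skip):
        adj = defaultdict(list)
        for i, (u, v) in enumerate(edges):
            if i != skip:
                adj[u].append(v)
                adj[v].append(u)
        seen, queue = {x}, deque([x])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        return y in seen
    return all(connected_without(i) for i in range(len(edges)))

triangle = [(0, 1), (1, 2), (2, 0)]
print(two_edge_connected(triangle, 0, 2))          # True
print(two_edge_connected([(0, 1), (1, 2)], 0, 2))  # False: both edges are bridges
```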