PROBABILISTIC ANALYSIS OF AN EXHAUSTIVE SEARCH ALGORITHM IN RANDOM GRAPHS

Hsien-Kuei Hwang, Academia Sinica, Taiwan
(joint work with Cyril Banderier, Vlady Ravelomanana, Vytas Zacharovas)

AofA 2008, Maresias, Brazil
April 14, 2008

MAXIMUM INDEPENDENT SET

Independent set

An independent (or stable) set in a graph is a set of vertices, no two of which are joined by an edge.

[Figure: an example graph on vertices 1–7 with MIS = {1, 3, 5, 7}]

Maximum independent set (MIS)

The MIS problem asks for an independent set of largest size.

NP-hard!

MAXIMUM INDEPENDENT SET

Equivalent versions

The same problem as MAXIMUM CLIQUE on the complementary graph (a clique is a complete subgraph).

Since the complement of a vertex cover in any graph is an independent set, MIS is equivalent to MINIMUM VERTEX COVER. (A vertex cover is a set of vertices such that every edge is incident to at least one of them.)

Among Karp's (1972) original list of 21 NP-complete problems.

THEORETICAL RESULTS

Random models: Erdős–Rényi's G_{n,p}

Vertex set = {1, 2, ..., n}, and all edges occur independently, each with the same probability p.

The cardinality of an MIS in G_{n,p}

Matula (1970), Grimmett and McDiarmid (1975), Bollobás and Erdős (1976), Frieze (1990): if pn → ∞, then (with q := 1 − p)

|MIS_n| ∼ 2 log_{1/q}(pn)  whp,

and there exists k = k_n such that |MIS_n| = k or k + 1 whp.

A GREEDY ALGORITHM

Adding vertices one after another, whenever possible

The size of the resulting IS satisfies

S_n =_d 1 + S_{n-1-Binom(n-1;p)}  (n ≥ 1), with S_0 ≡ 0.

Equivalent to the length of the right arm of random digital search trees.

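As a quick illustration (not part of the original talk), here is a minimal Python sketch that both runs the greedy scan on a G(n,p) sampled edge-by-edge and samples S_n directly from the distributional recurrence above; the function and parameter names are ours, and both estimates should concentrate around log_{1/q} n.

```python
import random

def greedy_is_size(n, p):
    """Scan the vertices of G(n,p), sampling edges lazily: keep a vertex
    iff it has no edge to any previously kept vertex."""
    kept = 0
    for _ in range(n):
        if all(random.random() >= p for _ in range(kept)):  # no edge to kept set
            kept += 1
    return kept

def sample_S(n, p):
    """Sample S_n from S_n = 1 + S_{n-1-Binom(n-1,p)}, S_0 = 0:
    keep one vertex, discard its neighbours, recurse on the rest."""
    size = 0
    while n >= 1:
        size += 1
        n = sum(random.random() >= p for _ in range(n - 1))  # non-neighbours
    return size

n, p, reps = 1000, 0.5, 2000
print(sum(greedy_is_size(n, p) for _ in range(reps)) / reps,
      sum(sample_S(n, p) for _ in range(reps)) / reps)  # both near log_2 1000 ≈ 9.97
```
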
ANALYSIS OF THE GREEDY ALGORITHM

Easy for people in this community

Mean: E(S_n) ∼ log_{1/q} n + a bounded periodic function.

Variance: V(S_n) ∼ a bounded periodic function.

Limit distribution does not exist:

E e^{(S_n - \log_{1/q} n) y} \sim F(\log_{1/q} n; y),

where

F(u; y) := \frac{1 - e^{y}}{\log(1/q)} \prod_{\ell \ge 1} \frac{1 - e^{y} q^{\ell}}{1 - q^{\ell}} \sum_{j \in \mathbb{Z}} \Gamma\Bigl(-\frac{y + 2j\pi i}{\log(1/q)}\Bigr) e^{2j\pi i u}.

A BETTER ALGORITHM?

Goodness of GREEDY IS

Grimmett and McDiarmid (1975), Karp (1976), Fernandez de la Vega (1984), Gazmuri (1984), McDiarmid (1984): asymptotically, the GREEDY IS is half optimal.

Can we do better?

Frieze and McDiarmid (1997, RSA), Algorithmic theory of random graphs, Research Problem 15: construct a polynomial-time algorithm that finds an independent set of size at least (1/2 + ε)|MIS_n| whp, or show that such an algorithm does not exist modulo some reasonable conjecture in the theory of computational complexity such as, e.g., P ≠ NP.

JERRUM'S (1992) METROPOLIS ALGORITHM

A degenerate form of simulated annealing

Sequentially increase the size of the clique K: (i) choose a vertex v u.a.r. from V; (ii) if v ∉ K and v is connected to every vertex of K, then add v to K; (iii) if v ∈ K, then remove v from K with probability λ^{-1}.

He showed: for every λ ≥ 1, there exists an initial state from which the expected time for the Metropolis process to reach a clique of size at least (1 + ε) log_{1/q}(pn) exceeds n^{Ω(log pn)}. (Note that n^{log n} = e^{(log n)^2}: superpolynomial, yet subexponential.)

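The dynamics is easy to sketch. The following Python fragment is our own illustrative rendering (the graph representation, λ, and step count are arbitrary choices), not Jerrum's exact formulation.

```python
import random

def gnp(n, p):
    """Sample G(n,p) as a dict of adjacency sets."""
    adj = {v: set() for v in range(n)}
    for u in range(n):
        for v in range(u + 1, n):
            if random.random() < p:
                adj[u].add(v)
                adj[v].add(u)
    return adj

def metropolis_clique(adj, lam, steps):
    """Run the Metropolis clique dynamics; return the largest clique seen."""
    K, best, vertices = set(), 0, list(adj)
    for _ in range(steps):
        v = random.choice(vertices)                 # (i) pick v u.a.r.
        if v not in K and K <= adj[v]:              # (ii) v joined to all of K
            K.add(v)
        elif v in K and random.random() < 1 / lam:  # (iii) delete w.p. 1/lambda
            K.remove(v)
        best = max(best, len(K))
    return best

print(metropolis_clique(gnp(300, 0.5), lam=1.5, steps=200_000))
```
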
POSITIVE RESULTS

Exact algorithms

A huge number of algorithms has been proposed in the literature; see Bomze et al.'s survey (in Handbook of Combinatorial Optimization, 1999).

Special algorithms

Wilf's (1986) Algorithms and Complexity describes a backtracking algorithm enumerating all independent sets with time complexity n^{O(log n)}. Chvátal (1977) proposes an exhaustive algorithm for which, for almost all G_{n,1/2}, at most n^{2(1+\log_2 n)} subproblems are created.

Pittel (1982):

\mathbb{P}\Bigl( n^{\frac{1-\varepsilon}{4}\log_{1/q} n} \le \text{Time used by Chvátal's algo} \le n^{\frac{1+\varepsilon}{2}\log_{1/q} n} \Bigr) \ge 1 - e^{-c \log^2 n}.

AIM: A MORE PRECISE ANALYSIS OF THE EXHAUSTIVE ALGORITHM

An MIS contains a given vertex v or it does not; branching on the two cases gives

X_n =_d X_{n-1} + X_{n-1-Binom(n-1;p)}  (n ≥ 2), with X_0 = 0 and X_1 = 1.

Special cases

If p is close to 1, then the second term is small, so we expect a polynomial time bound.

If p is sufficiently small, then the second term is large, and we expect an exponential time bound.

What happens for p in between?

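To get a feel for X_n, one can sample the recurrence directly. This is a hypothetical Monte Carlo sketch (names and parameters ours): excluding v leaves n − 1 vertices, while including v removes v and its Binom(n − 1, p) neighbours.

```python
import random

def sample_X(n, p):
    """One draw of X_n from X_n = X_{n-1} + X_{n-1-Binom(n-1,p)},
    with X_0 = 0 and X_1 = 1 (the subproblem count of the search)."""
    if n <= 1:
        return n
    deg = sum(random.random() < p for _ in range(n - 1))  # Binom(n-1, p)
    return sample_X(n - 1, p) + sample_X(n - 1 - deg, p)

random.seed(1)
print([sample_X(40, 0.5) for _ in range(5)])
```
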
MEAN VALUE

The expected value µ_n := E(X_n) satisfies

\mu_n = \mu_{n-1} + \sum_{0 \le j < n} \binom{n-1}{j} p^{j} q^{n-1-j} \mu_{n-1-j},

with µ_0 = 0 and µ_1 = 1.

Poisson generating function

Let \tilde f(z) := e^{-z} \sum_{n \ge 0} \mu_n z^n/n!. Then

\tilde f'(z) = \tilde f(qz) + e^{-z}.

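The recurrence is straightforward to evaluate exactly; here is a small sketch (ours) using rational arithmetic to avoid round-off in the binomial convolution.

```python
from fractions import Fraction
from math import comb

def mean_values(N, p):
    """mu_0..mu_N via mu_n = mu_{n-1} + sum_j C(n-1,j) p^j q^{n-1-j} mu_{n-1-j}."""
    q = 1 - p
    mu = [Fraction(0), Fraction(1)]
    for n in range(2, N + 1):
        mu.append(mu[n - 1]
                  + sum(comb(n - 1, j) * p**j * q**(n - 1 - j) * mu[n - 1 - j]
                        for j in range(n)))
    return mu

mu = mean_values(60, Fraction(1, 2))
print(float(mu[40]), float(mu[60]))
```
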
RESOLUTION OF THE RECURRENCE

Laplace transform

The Laplace transform of \tilde f,

L(s) = \int_0^\infty e^{-xs}\, \tilde f(x)\, dx,

satisfies

s\, L(s) = \frac{1}{q}\, L\Bigl(\frac{s}{q}\Bigr) + \frac{1}{s+1}.

Exact solution

L(s) = \sum_{j \ge 0} \frac{q^{\binom{j+1}{2}}}{s^{j+1}(s + q^{j})}.

RESOLUTION OF THE RECURRENCE

Exact solutions

L(s) = \sum_{j \ge 0} \frac{q^{\binom{j+1}{2}}}{s^{j+1}(s + q^{j})}.

Inverting gives

\tilde f(z) = \sum_{j \ge 0} \frac{q^{\binom{j+1}{2}}}{j!}\, z^{j+1} \int_0^1 e^{-q^{j} u z} (1-u)^{j}\, du.

Thus

\mu_n = \sum_{1 \le j \le n} \binom{n}{j} (-1)^{j} \sum_{1 \le \ell \le j} (-1)^{\ell} q^{j(\ell-1) - \binom{\ell}{2}},

or

\mu_n = n \sum_{0 \le j < n} \binom{n-1}{j} q^{\binom{j+1}{2}} \sum_{0 \le \ell < n-j} \binom{n-1-j}{\ell} \frac{q^{j\ell} (1-q^{j})^{n-1-j-\ell}}{j + \ell + 1}.

Neither is useful for numerical purposes for large n.

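As a sanity check (our own, with arbitrary small parameters), the alternating double sum can be compared against the recurrence; exact rationals are essential here, since the alternating sum cancels catastrophically in floating point.

```python
from fractions import Fraction
from math import comb

def mu_closed(n, q):
    """mu_n via the alternating double sum above."""
    return sum(comb(n, j) * (-1)**j
               * sum((-1)**l * q**(j*(l - 1) - l*(l - 1)//2)
                     for l in range(1, j + 1))
               for j in range(1, n + 1))

q = Fraction(1, 2)
p = 1 - q
mu = [Fraction(0), Fraction(1)]
for n in range(2, 21):
    mu.append(mu[n - 1] + sum(comb(n - 1, j) * p**j * q**(n - 1 - j) * mu[n - 1 - j]
                              for j in range(n)))
assert all(mu_closed(n, q) == mu[n] for n in range(1, 21))
print(mu[20])
```
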
QUICK ASYMPTOTICS

Back-of-the-envelope calculation

Take q = 1/2. Since Binom(n − 1; 1/2) has mean ≈ n/2, we roughly have

\mu_n \approx \mu_{n-1} + \mu_{\lfloor n/2 \rfloor}.

This is reminiscent of Mahler's partition problem. Indeed, if \varphi(z) = \sum_n \mu_n z^n, then

\varphi(z) \approx \frac{1+z}{1-z}\, \varphi(z^{2}) \approx \prod_{j \ge 0} \frac{1}{1 - z^{2^{j}}}.

So we expect (de Bruijn, 1948; Dumas and Flajolet, 1996)

\log \mu_n \approx c\, (\log_2 n)^{2} + c'\, \log n + c''\, \log\log n + \text{Periodic}(n).

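A crude numeric check of this prediction (ours; no constant is claimed): iterate the simplified recurrence and watch log µ_n / (log_2 n)^2 settle down slowly.

```python
import math

mu = [0.0, 1.0]
for n in range(2, 1 << 16):
    mu.append(mu[n - 1] + mu[n // 2])  # simplified Mahler-type recurrence

for k in (8, 12, 15):
    n = 1 << k
    print(n, math.log(mu[n]) / math.log2(n) ** 2)  # slowly varying ratio
```
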
ASYMPTOTICS OF µ_n

Poisson heuristic (de-Poissonization, saddle-point method)

\mu_n = \frac{n!}{2\pi i} \oint_{|z|=n} z^{-n-1} e^{z}\, \tilde f(z)\, dz
\approx \sum_{j \ge 0} \frac{\tilde f^{(j)}(n)}{j!} \cdot \frac{n!}{2\pi i} \oint_{|z|=n} z^{-n-1} e^{z} (z-n)^{j}\, dz
= \tilde f(n) + \sum_{j \ge 2} \frac{\tilde f^{(j)}(n)}{j!}\, \tau_j(n),

where \tau_j(n) := n! [z^n]\, e^{z} (z-n)^{j} = j! [z^j]\, (1+z)^{n} e^{-nz} (Charlier polynomials). In particular, τ_0(n) = 1, τ_1(n) = 0, τ_2(n) = −n, τ_3(n) = 2n, and τ_4(n) = 3n^2 − 6n.

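The two coefficient representations of τ_j are easy to cross-check symbolically; a small sympy sketch (ours):

```python
import sympy as sp

z, n = sp.symbols('z n')

def tau_charlier(j):
    """tau_j(n) = j! [z^j] (1+z)^n e^{-nz}, as a polynomial in n."""
    ser = sp.series((1 + z)**n * sp.exp(-n*z), z, 0, j + 1).removeO()
    return sp.expand(sp.factorial(j) * ser.coeff(z, j))

def tau_contour(j, nval):
    """tau_j(n) = n! [z^n] e^z (z-n)^j, for a concrete n = nval."""
    ser = sp.series(sp.exp(z) * (z - nval)**j, z, 0, nval + 1).removeO()
    return sp.factorial(nval) * ser.coeff(z, nval)

print([tau_charlier(j) for j in range(5)])  # 1, 0, -n, 2*n, 3*n**2 - 6*n
assert all(tau_charlier(j).subs(n, 10) == tau_contour(j, 10) for j in range(5))
```
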
A MORE PRECISE EXPANSION FOR \tilde f(x)

Asymptotics of \tilde f(x)

Let ρ = 1/log(1/q) and let R be defined by R log R = x/ρ. Then, as x → ∞,

\tilde f(x) \sim \frac{R^{\rho+1/2}\, e^{(\rho/2)(\log R)^{2}}\, G(\rho \log R)}{\sqrt{2\pi\rho \log R}} \Bigl(1 + \sum_{j \ge 1} \frac{\phi_j(\rho \log R)}{(\rho \log R)^{j}}\Bigr),

where G(u) := q^{(\{u\}^{2} + \{u\})/2}\, F(q^{-\{u\}}),

F(s) = \sum_{-\infty < j < \infty} \frac{q^{j(j+1)/2}\, s^{j+1}}{1 + q^{j} s},

and the φ_j(u) are bounded, 1-periodic functions of u involving the derivatives F^{(j)}(q^{-\{u\}}).

A MORE EXPLICIT ASYMPTOTIC APPROXIMATION

Here R = (x/ρ)/W(x/ρ), where W is Lambert's W-function,

W(x) = \log x - \log\log x + \frac{\log\log x}{\log x} + \frac{(\log\log x)^{2} - 2\log\log x}{2(\log x)^{2}} + \cdots,

so that

\tilde f(x) \sim \frac{x^{\rho+1/2}\, G\bigl(\rho \log \frac{x/\rho}{\log(x/\rho)}\bigr)}{\sqrt{2\pi\rho}\, \bigl(\rho \log \frac{x}{\rho}\bigr)^{\rho+1/2} \sqrt{\log x}}\, \exp\Bigl(\frac{\rho}{2} \Bigl(\log \frac{x/\rho}{\log(x/\rho)}\Bigr)^{2}\Bigr).

Method of proof: a variant of the saddle-point method applied to

\tilde f(x) = \frac{1}{2\pi i} \int_{1-i\infty}^{1+i\infty} e^{sx}\, L(s)\, ds.

JUSTIFICATION OF THE POISSON HEURISTIC

Four properties are sufficient

The following four properties are enough to justify the Poisson–Charlier expansion:

– \tilde f'(z) = \tilde f(qz) + e^{-z};
– F(s) = s F(qs), where F(s) = \sum_{j \in \mathbb{Z}} q^{j(j+1)/2} s^{j+1}/(1 + q^{j} s);
– \tilde f^{(j)}(x)/\tilde f(x) \sim \bigl(\rho \log x / x\bigr)^{j};
– |f(z)| ≤ f(|z|), where f(z) := e^{z} \tilde f(z).

Thus (with ρ = 1/log(1/q))

\mu_n \sim \frac{n^{\rho+1/2}\, G\bigl(\rho \log \frac{n/\rho}{\log(n/\rho)}\bigr)}{\sqrt{2\pi\rho}\, \bigl(\rho \log \frac{n}{\rho}\bigr)^{\rho+1/2} \sqrt{\log n}}\, \exp\Bigl(\frac{\rho}{2} \Bigl(\log \frac{n/\rho}{\log(n/\rho)}\Bigr)^{2}\Bigr).

VARIANCE OF X_n

σ_n := √V(X_n)

\sigma_n^{2} = \sigma_{n-1}^{2} + \sum_{0 \le j < n} \pi_{n,j}\, \sigma_{n-1-j}^{2} + T_n, \qquad \pi_{n,j} := \binom{n-1}{j} p^{j} q^{n-1-j},

where T_n := \sum_{0 \le j < n} \pi_{n,j}\, \Delta_{n,j}^{2} with \Delta_{n,j} := \mu_{n-1-j} + \mu_{n-1} - \mu_n.

Asymptotic transfer: a_n = a_{n-1} + \sum_{0 \le j < n} \pi_{n,j}\, a_{n-1-j} + b_n

If b_n \sim n^{\beta} (\log n)^{\kappa}\, \tilde f(n)^{\alpha}, where α > 1 and β, κ ∈ ℝ, then

a_n \sim \sum_{j \le n} b_j \sim \frac{n}{\alpha \rho \log n}\, b_n.

ASYMPTOTICS OF THE VARIANCE

Asymptotics of T_n: by elementary means,

T_n \sim \frac{p \rho^{4}}{q}\, n^{-3} (\log n)^{4}\, \tilde f(n)^{2}.

Applying the asymptotic transfer,

\sigma_n^{2} \sim C\, n^{-2} (\log n)^{3}\, \tilde f(n)^{2}, \qquad \text{where } C := \frac{p \rho^{3}}{2q}.

Hence

\frac{\text{Variance}}{\text{Mean}^{2}} \sim \frac{C (\log n)^{3}}{n^{2}}.

ASYMPTOTIC NORMALITY OF X_n

Convergence in distribution

The distribution of X_n is asymptotically normal:

\frac{X_n - \mu_n}{\sigma_n} \to_d \mathcal{N}(0, 1),

with convergence of all moments.

Proof by the method of moments

Derive a recurrence for E(X_n − µ_n)^m.

Prove by induction (using the asymptotic transfer) that

E(X_n - \mu_n)^{m} \sim \frac{m!}{(m/2)!\, 2^{m/2}}\, \sigma_n^{m} \quad \text{if } 2 \mid m, \qquad E(X_n - \mu_n)^{m} = o(\sigma_n^{m}) \quad \text{if } 2 \nmid m.

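A rough Monte Carlo illustration (ours; n and the sample size are arbitrary, and since the convergence is slow the standardized moments are only loosely close to 0 and 3). The sampler from the AIM slide is repeated here for self-containedness.

```python
import random

def sample_X(n, p):
    """One draw of X_n = X_{n-1} + X_{n-1-Binom(n-1,p)}, X_0 = 0, X_1 = 1."""
    if n <= 1:
        return n
    deg = sum(random.random() < p for _ in range(n - 1))
    return sample_X(n - 1, p) + sample_X(n - 1 - deg, p)

random.seed(2)
xs = [sample_X(40, 0.5) for _ in range(1000)]
m = sum(xs) / len(xs)
s = (sum((x - m)**2 for x in xs) / len(xs)) ** 0.5
z = [(x - m) / s for x in xs]
# Standardized third and fourth moments: near 0 and 3 under near-normality.
print(sum(v**3 for v in z) / len(z), sum(v**4 for v in z) / len(z))
```
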
A STRAIGHTFORWARD EXTENSION

For b = 1, 2, ...:

X_n =_d X_{n-b} + X_{n-b-Binom(n-b;p)},

with X_n = 0 for n < b and X_b = 1.

For example, MAXIMUM TRIANGLE PARTITION:

X_n =_d X_{n-3} + X_{n-3-Binom(n-3;p^{3})}.

The same tools we developed apply

X_n is asymptotically normally distributed, with mean and variance of the same order as in the case b = 1.

A NATURAL VARIANT

What happens if X_n =_d X_{n-1} + X_{Uniform[0,n-1]}?

The mean

\mu_n = \mu_{n-1} + \frac{1}{n} \sum_{0 \le j < n} \mu_j

satisfies \mu_n \sim c\, n^{-1/4} e^{2\sqrt{n}}. Note: the approximation µ_n ≈ µ_{n−1} + µ_{n/2} fails here.

Limit law not Gaussian (by the method of moments)

\frac{X_n}{\mu_n} \to_d X,

where g(z) := \sum_{m \ge 1} E(X^{m})\, z^{m}/(m \cdot m!) satisfies

z^{2} g'' + z g' - g = z g g'.

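A quick numeric check of the growth of µ_n (ours; the initial conditions µ_0 = 0, µ_1 = 1 are carried over from the MIS recurrence, and the constant c is not claimed):

```python
import math

mu, running = [0.0, 1.0], 1.0      # running = mu_0 + ... + mu_{n-1}
for n in range(2, 20001):
    mu.append(mu[n - 1] + running / n)
    running += mu[n]

for n in (1000, 10000, 20000):
    print(n, mu[n] / (n**-0.25 * math.exp(2 * math.sqrt(n))))  # roughly constant
```
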
CONCLUSION

Random graph algorithms: a rich source of interesting recurrences.

Obrigado! (Thank you!)
