Markov Random Fields


Umamahesh Srinivas

iPAL Group Meeting

February 25, 2011

Outline

1. Basic graph-theoretic concepts
2. Markov chain
3. Markov random field (MRF)
4. Gauss-Markov random field (GMRF), and applications
5. Other popular MRFs

02/25/2011 iPAL Group Meeting 2

References

1. Charles Bouman, Markov random fields and stochastic image models. Tutorial presented at ICIP 1995.
2. Mario Figueiredo, Bayesian methods and Markov random fields. Tutorial presented at CVPR 1998.


Basic graph-theoretic concepts

A graph G = (V, E) is a finite collection of nodes (or vertices) V = {n_1, n_2, ..., n_N} together with a set of edges E, where each edge is an unordered pair of distinct nodes.
We consider only undirected graphs.
Neighbors: two nodes n_i, n_j ∈ V are neighbors if (n_i, n_j) ∈ E.
Neighborhood of a node: N(n_i) = {n_j : (n_i, n_j) ∈ E}.
Neighborhood is a symmetric relation: n_i ∈ N(n_j) ⇔ n_j ∈ N(n_i).
Complete graph: ∀ n_i ∈ V, N(n_i) = {n_j : j ∈ {1, 2, ..., N} \ {i}}, i.e., every pair of distinct nodes is joined by an edge.
Clique: a complete subgraph of G.
Maximal clique: a clique to which no other node can be added while still retaining complete connectedness.




Illustration

V = {1, 2, 3, 4, 5, 6}
E = {(1, 2), (1, 3), (2, 4), (2, 5), (3, 4), (3, 6), (4, 6), (5, 6)}
N(4) = {2, 3, 6}
Examples of cliques: {1}, {2, 5}, {3, 4, 6}
Set of all cliques: every singleton and every edge, plus {3, 4, 6}

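The illustration can be checked mechanically. A minimal Python sketch (the helper names `neighbors` and `is_clique` are illustrative, not from the slides):

```python
from itertools import combinations

# The graph from the illustration, edges stored as sorted unordered pairs
nodes = {1, 2, 3, 4, 5, 6}
edges = {(1, 2), (1, 3), (2, 4), (2, 5), (3, 4), (3, 6), (4, 6), (5, 6)}

def neighbors(n):
    # N(n) = {m : (n, m) in E}, treating edges as unordered pairs
    return {b if a == n else a for (a, b) in edges if n in (a, b)}

def is_clique(S):
    # A clique is a node subset whose members are pairwise adjacent
    return all(tuple(sorted(p)) in edges for p in combinations(S, 2))

print(neighbors(4))          # {2, 3, 6}, as on the slide
print(is_clique({3, 4, 6}))  # True
print(is_clique({1, 2, 3}))  # False: (2, 3) is not an edge
```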

Separation

Let A, B, C be three disjoint subsets of V.
C separates A from B if every path from a node in A to a node in B contains some node in C.
Example: C = {1, 4, 6} separates A = {3} from B = {2, 5}.

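Separation can be tested by deleting C and checking whether any A-to-B path survives. A minimal sketch (the helper `separates` is hypothetical; the graph is the one from the illustration):

```python
def separates(C, A, B, nodes, edges):
    """True iff every path from A to B passes through C,
    i.e., A and B are disconnected once C is removed."""
    remaining = set(nodes) - set(C)
    adj = {n: set() for n in remaining}
    for a, b in edges:
        if a in remaining and b in remaining:
            adj[a].add(b)
            adj[b].add(a)
    # Graph search from A within the graph with C removed
    seen = set(A) & remaining
    frontier = list(seen)
    while frontier:
        n = frontier.pop()
        for m in adj[n] - seen:
            seen.add(m)
            frontier.append(m)
    return not (seen & set(B))

nodes = {1, 2, 3, 4, 5, 6}
edges = {(1, 2), (1, 3), (2, 4), (2, 5), (3, 4), (3, 6), (4, 6), (5, 6)}
print(separates({1, 4, 6}, {3}, {2, 5}, nodes, edges))  # True, as on the slide
```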

Markov chains

Graphical model: associate each node of a graph with a random variable (or a collection thereof).
Homogeneous 1-D Markov chain: p(x_n | x_i, i < n) = p(x_n | x_{n−1})
Probability of a sequence: p(x) = p(x_0) ∏_{n=1}^{N} p(x_n | x_{n−1})

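The sequence factorization can be sketched directly; the 2-state initial distribution and transition matrix below are hypothetical:

```python
# Hypothetical homogeneous 2-state chain over states {0, 1}
p0 = [0.5, 0.5]            # initial distribution p(x_0)
P = [[0.9, 0.1],           # P[i][j] = p(x_n = j | x_{n-1} = i)
     [0.2, 0.8]]

def sequence_prob(x):
    # p(x) = p(x_0) * prod_{n=1}^{N} p(x_n | x_{n-1})
    prob = p0[x[0]]
    for prev, cur in zip(x, x[1:]):
        prob *= P[prev][cur]
    return prob

print(sequence_prob([0, 0, 1]))  # p(0) * P[0][0] * P[0][1] = 0.5 * 0.9 * 0.1
```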

2-D Markov chains

Advantages:

Simple expressions for probability

Simple parameter estimation

Disadvantages:

No natural ordering of image pixels

Anisotropic model behavior


Random fields on graphs

Consider a collection of random variables x = (x_1, x_2, ..., x_N) with associated joint probability distribution p(x).
Let A, B, C be three disjoint subsets of V, and let x_A denote the collection of random variables indexed by A.
Conditional independence: A ⊥⊥ B | C ⇔ p(x_A, x_B | x_C) = p(x_A | x_C) p(x_B | x_C)
Markov random field: an undirected graphical model in which each node corresponds to a random variable (or a collection of random variables), and the edges identify conditional dependencies.



Markov properties

Pairwise Markovianity: (n_i, n_j) ∉ E ⇒ x_i and x_j are independent when conditioned on all other variables:
p(x_i, x_j | x_{V\{i,j}}) = p(x_i | x_{V\{i,j}}) p(x_j | x_{V\{i,j}})
Local Markovianity: given its neighborhood, a variable is independent of the rest of the variables:
p(x_i | x_{V\{i}}) = p(x_i | x_{N(i)})
Global Markovianity: let A, B, C be three disjoint subsets of V. If C separating A from B implies p(x_A, x_B | x_C) = p(x_A | x_C) p(x_B | x_C), then p(·) is globally Markov w.r.t. G.

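The pairwise property can be verified by brute-force enumeration on a tiny chain x_1 − x_2 − x_3, where (1, 3) is not an edge. A sketch with a hypothetical pairwise potential that favors agreement:

```python
from itertools import product

# Chain graph 1 - 2 - 3; (1, 3) is NOT an edge, so x1 and x3
# should be independent once we condition on x2.
def phi(a, b):
    return 2.0 if a == b else 1.0   # hypothetical edge potential

# Unnormalized Gibbs joint over three binary variables
joint = {x: phi(x[0], x[1]) * phi(x[1], x[2])
         for x in product((0, 1), repeat=3)}
Z = sum(joint.values())
p = {x: v / Z for x, v in joint.items()}

def cond_indep_given_x2(x2):
    # Check p(x1, x3 | x2) == p(x1 | x2) * p(x3 | x2) for all x1, x3
    px2 = sum(v for x, v in p.items() if x[1] == x2)
    ok = True
    for x1, x3 in product((0, 1), repeat=2):
        p13 = p[(x1, x2, x3)] / px2
        p1 = sum(v for x, v in p.items() if x[0] == x1 and x[1] == x2) / px2
        p3 = sum(v for x, v in p.items() if x[2] == x3 and x[1] == x2) / px2
        ok &= abs(p13 - p1 * p3) < 1e-12
    return ok

print(cond_indep_given_x2(0) and cond_indep_given_x2(1))  # True
```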



Hammersley-Clifford Theorem

Consider a random field x on a graph G such that p(x) > 0. Let C denote the set of all maximal cliques of the graph.
If the field has the local Markov property, then p(x) can be written as a Gibbs distribution:
p(x) = (1/Z) exp{ −∑_{C∈C} V_C(x_C) },
where Z, the normalizing constant, is called the partition function, and the V_C(x_C) are the clique potentials.
If p(x) can be written in Gibbs form for the cliques of some graph, then it has the global Markov property.
Fundamental consequence: every Markov random field can be specified via clique potentials.

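The Gibbs form can be made concrete on a tiny model. A sketch with a hypothetical graph (cliques {0,1} and {1,2}) and a disagreement-penalizing clique potential, computing the partition function Z by exhaustive enumeration:

```python
from itertools import product
from math import exp

# Hypothetical tiny MRF on nodes {0, 1, 2} with maximal cliques (0,1) and (1,2);
# V_C(x_C) = 1 if the clique's variables disagree, else 0
cliques = [(0, 1), (1, 2)]

def energy(x):
    # sum_C V_C(x_C)
    return sum(1.0 if len({x[i] for i in C}) > 1 else 0.0 for C in cliques)

states = list(product((0, 1), repeat=3))
Z = sum(exp(-energy(x)) for x in states)          # partition function
p = {x: exp(-energy(x)) / Z for x in states}      # Gibbs distribution

print(abs(sum(p.values()) - 1.0) < 1e-12)  # True: p is normalized
print(p[(0, 0, 0)] > p[(0, 1, 0)])         # True: fewer disagreements, higher probability
```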



Regular rectangular lattices

V = {(i, j) : i = 1, ..., M, j = 1, ..., N}
Order-K neighborhood system:
N_K(i, j) = {(m, n) : (i − m)² + (j − n)² ≤ K}

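The order-K system is easy to enumerate. A sketch that excludes the center pixel itself (as pixel neighborhoods are usually defined) and clips at the lattice boundary:

```python
def order_K_neighborhood(i, j, K, M, N):
    # N_K(i, j) = {(m, n) : (i - m)^2 + (j - n)^2 <= K}, excluding (i, j),
    # restricted to the M x N lattice (1-based indices as on the slide)
    return {(m, n)
            for m in range(1, M + 1) for n in range(1, N + 1)
            if (m, n) != (i, j) and (i - m) ** 2 + (j - n) ** 2 <= K}

# K = 1 gives the 4 nearest neighbors; K = 2 adds the diagonals (8-neighborhood)
print(len(order_K_neighborhood(3, 3, 1, 5, 5)))  # 4
print(len(order_K_neighborhood(3, 3, 2, 5, 5)))  # 8
```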

Auto-models

Only pairwise interactions.
In terms of clique potentials: |C| > 2 ⇒ V_C(·) = 0.
The simplest possible neighborhood models.


Gauss-Markov Random Fields (GMRF)

Joint probability density (assuming zero mean):
p(x) = (1 / ((2π)^{N/2} |Σ|^{1/2})) exp{ −(1/2) xᵀ Σ⁻¹ x }
Quadratic form in the exponent:
xᵀ Σ⁻¹ x = ∑_i ∑_j x_i x_j (Σ⁻¹)_{i,j} ⇒ auto-model
The neighborhood system is determined by the potential matrix Σ⁻¹.
Local conditionals are univariate Gaussian.

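The last two points can be checked numerically using the standard GMRF identity x_i | x_rest ∼ N(−(1/Q_ii) ∑_{j≠i} Q_ij x_j, 1/Q_ii), where Q = Σ⁻¹: zero off-diagonal entries of Q mark non-neighbors. A sketch with a hypothetical tridiagonal precision matrix for a 4-node chain:

```python
import numpy as np

# Hypothetical precision matrix Q = Sigma^{-1} for a 4-node chain (tridiagonal):
# Q[i, j] = 0 means nodes i and j are not neighbors
Q = np.array([[ 2., -1.,  0.,  0.],
              [-1.,  2., -1.,  0.],
              [ 0., -1.,  2., -1.],
              [ 0.,  0., -1.,  2.]])

x = np.array([0.3, -0.2, 0.5, 0.1])
i = 1
# Conditional of x_i given the rest: mean -(1/Q_ii) * sum_{j != i} Q_ij x_j,
# variance 1/Q_ii (adding x[i] back removes the j = i term from the dot product)
cond_mean = -np.dot(Q[i], x) / Q[i, i] + x[i]
cond_var = 1.0 / Q[i, i]

# Node 1's conditional mean depends only on its neighbors 0 and 2:
print(np.isclose(cond_mean, (x[0] + x[2]) / 2))  # True
print(cond_var)  # 0.5
```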


Gauss-Markov Random Fields

Specification via clique potentials:
V_C(x_C) = (1/2) (∑_{i∈C} α_i^C x_i)² = (1/2) (∑_{i∈V} α_i^C x_i)², as long as i ∉ C ⇒ α_i^C = 0
The exponent of the GMRF density becomes:
−∑_{C∈C} V_C(x_C) = −(1/2) ∑_{C∈C} (∑_{i∈V} α_i^C x_i)² = −(1/2) ∑_{i∈V} ∑_{j∈V} (∑_{C∈C} α_i^C α_j^C) x_i x_j = −(1/2) xᵀ Σ⁻¹ x.


GMRF: Application to image processing

Classical image "smoothing" prior.
Consider an image to be a rectangular lattice with first-order pixel neighborhoods.
Cliques: pairs of vertically or horizontally adjacent pixels.
Clique potentials: squares of first-order differences (an approximation of the continuous derivative):
V_{{(i,j),(i,j−1)}}(x_{i,j}, x_{i,j−1}) = (1/2) (x_{i,j} − x_{i,j−1})²
Resulting Σ⁻¹: block-tridiagonal with tridiagonal blocks.

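The structure of Σ⁻¹ can be seen by assembling it explicitly: summing (1/2)(x_a − x_b)² over adjacent pairs gives an exponent (1/2) xᵀ(DᵀD)x, where D stacks first-order differences. A minimal sketch on a hypothetical 3×4 image:

```python
import numpy as np

# Assemble Sigma^{-1} for a small M x N image from the smoothing prior:
# sum over adjacent pixel pairs of (1/2)(x_a - x_b)^2 = (1/2) x^T (D^T D) x
M, N = 3, 4

def idx(i, j):
    return i * N + j          # flatten (i, j) to a vector index

rows = []
for i in range(M):
    for j in range(N):
        if j + 1 < N:         # horizontal pair (i, j) - (i, j+1)
            d = np.zeros(M * N); d[idx(i, j)] = 1; d[idx(i, j + 1)] = -1
            rows.append(d)
        if i + 1 < M:         # vertical pair (i, j) - (i+1, j)
            d = np.zeros(M * N); d[idx(i, j)] = 1; d[idx(i + 1, j)] = -1
            rows.append(d)

D = np.array(rows)
Q = D.T @ D                   # Sigma^{-1}: block-tridiagonal with tridiagonal blocks

# Q[a, b] is nonzero only for a pixel and itself or its horizontal/vertical neighbor
print(Q[idx(1, 1), idx(1, 2)])  # -1.0: horizontal neighbor
print(Q[idx(1, 1), idx(2, 2)])  #  0.0: diagonal pixels are not neighbors
```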

Bayesian image restoration with GMRF prior

Observation model: y = Hx + n, n ∼ N(0, σ²I)
Smoothing GMRF prior: p(x) ∝ exp{ −(1/2) xᵀ Σ⁻¹ x }
MAP estimate: x̂ = [σ² Σ⁻¹ + Hᵀ H]⁻¹ Hᵀ y

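The closed-form MAP estimate is a single linear solve. A minimal 1-D sketch (problem size, blur operator H, noise level, and the tridiagonal smoothing precision are all hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 20
H = np.eye(n)                                   # identity "blur": pure denoising
Qprior = (np.diag(2 * np.ones(n))               # Sigma^{-1}: 1-D smoothing precision
          - np.diag(np.ones(n - 1), 1)
          - np.diag(np.ones(n - 1), -1))
sigma2 = 0.5

x_true = np.linspace(0, 1, n)                   # smooth ground truth
y = H @ x_true + rng.normal(0, np.sqrt(sigma2), n)

# x_hat = [sigma^2 Sigma^{-1} + H^T H]^{-1} H^T y, computed via a linear solve
A = sigma2 * Qprior + H.T @ H
x_hat = np.linalg.solve(A, H.T @ y)

print(np.allclose(A @ x_hat, H.T @ y))  # True: x_hat solves the normal equations
```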

Bayesian image restoration with GMRF prior

Figure: (a) Original image, (b) blurred and slightly noisy image, (c) restored version of (b), (d) no blur, severe noise, (e) restored version of (d).
Deblurring: good.
Denoising: oversmoothing; "edge discontinuities" are smoothed out.
How to preserve discontinuities? Other prior models:
Hidden/latent binary random variables
Robust potential functions (e.g., L₂ vs. L₁ norm)



Compound GMRF

Insert binary variables v to "turn off" clique potentials.
Modified clique potentials:
V(x_{i,j}, x_{i,j−1}, v_{i,j}) = (1/2) (1 − v_{i,j}) (x_{i,j} − x_{i,j−1})²
Intuitive explanation:
v = 0 ⇒ clique potential is quadratic ("on")
v = 1 ⇒ V_C(·) = 0 → no smoothing; the image has an edge at this location.
Can choose separate latent variables v and h for vertical and horizontal edges respectively:
p(x | h, v) ∝ exp{ −(1/2) xᵀ Σ⁻¹(h, v) x }
MAP estimate: x̂ = [σ² Σ⁻¹(h, v) + Hᵀ H]⁻¹ Hᵀ y



Discontinuity-preserving restoration

Convex potentials:
Generalized Gaussians: V(x) = |x|^p, p ∈ [1, 2]
Stevenson: V(x) = x² for |x| < a, and V(x) = 2a|x| − a² for |x| ≥ a
Green: V(x) = 2a² log cosh(x/a)
Non-convex potentials:
Blake-Zisserman: V(x) = (min{|x|, a})²
Geman-McClure: V(x) = x² / (x² + a²)

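These potentials are plain scalar functions, so their qualitative difference is easy to see numerically: the non-convex ones saturate for large |x|, penalizing edges far less. A sketch:

```python
from math import log, cosh

# The potentials above as functions of x with scale parameter a
def gen_gaussian(x, p):        # convex: |x|^p, p in [1, 2]
    return abs(x) ** p

def stevenson(x, a):           # convex: quadratic near 0, linear beyond a
    return x * x if abs(x) < a else 2 * a * abs(x) - a * a

def green(x, a):               # convex and smooth: 2 a^2 log cosh(x / a)
    return 2 * a * a * log(cosh(x / a))

def blake_zisserman(x, a):     # non-convex: truncated quadratic
    return min(abs(x), a) ** 2

def geman_mcclure(x, a):       # non-convex and bounded: x^2 / (x^2 + a^2)
    return x * x / (x * x + a * a)

# Non-convex potentials saturate for large |x|; convex ones keep growing:
print(blake_zisserman(100.0, 1.0))  # 1.0
print(stevenson(100.0, 1.0))        # 199.0
```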

Ising model (2-D MRF)

V(x_i, x_j) = β δ(x_i ≠ x_j), where β is a model parameter.
Energy function: ∑_{C∈C} V_C(x_C) = β · (boundary length)
Longer boundaries are less probable.

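The energy-equals-boundary-length identity can be sketched directly: each disagreeing horizontal or vertical neighbor pair contributes β, so counting disagreements counts boundary segments (the test images below are hypothetical):

```python
def ising_energy(img, beta):
    # sum_C V_C(x_C) = beta * (number of disagreeing neighbor pairs)
    #                = beta * (boundary length)
    M, N = len(img), len(img[0])
    disagreements = 0
    for i in range(M):
        for j in range(N):
            if j + 1 < N and img[i][j] != img[i][j + 1]:
                disagreements += 1
            if i + 1 < M and img[i][j] != img[i + 1][j]:
                disagreements += 1
    return beta * disagreements

flat  = [[0, 0], [0, 0]]          # no boundary
split = [[0, 1], [0, 1]]          # one vertical boundary of length 2
print(ising_energy(flat, 1.0))    # 0.0
print(ising_energy(split, 1.0))   # 2.0: longer boundaries cost more energy
```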

Application: Image segmentation

Discrete MRF used to model the segmentation field.
Each class is represented by a value X_s ∈ {0, ..., M−1}.
Joint distribution: P{Y ∈ dy, X = x} = p(y|x) p(x)
(Bayesian) MAP estimation:
X̂ = arg max_x p_{x|Y}(x|Y) = arg max_x (log p(Y|x) + log p(x))



MAP optimization for segmentation

Data model: p_{y|x}(y|x) = ∏_{s∈S} p(y_s | x_s)
Prior (Ising) model: p_x(x) = (1/Z) exp{ −β t₁(x) },
where t₁(x) is the number of horizontal and vertical neighbor pairs in x having different values.
MAP estimate: x̂ = arg min_x { −log p_{y|x}(y|x) + β t₁(x) }
Hard optimization problem.



Some proposed approaches

Iterated conditional modes (ICM): iterative minimization w.r.t. each pixel
Simulated annealing: generate samples from the prior distribution
Multi-scale resolution segmentation

Figure: (a) Synthetic image with three textures, (b) ICM, (c) simulated annealing, (d) multi-resolution approach.

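ICM can be sketched in a few lines: sweep the image, and at each pixel pick the label minimizing the local cost (Gaussian data term plus β times neighbor disagreements). Everything below (class means, noise level, β, seed, image) is hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# ICM sketch for 2-class segmentation: minimize -log p(y|x) + beta * t1(x)
# with a hypothetical Gaussian data model per class
means = [0.0, 1.0]; sigma2 = 0.05; beta = 1.0

truth = np.zeros((8, 8), dtype=int); truth[:, 4:] = 1
y = np.array(means)[truth] + rng.normal(0, np.sqrt(sigma2), (8, 8))

x = (y > 0.5).astype(int)                 # init: pixelwise ML estimate
M, N = x.shape
for _ in range(5):                        # a few ICM sweeps
    for i in range(M):
        for j in range(N):
            costs = []
            for label in (0, 1):
                data = (y[i, j] - means[label]) ** 2 / (2 * sigma2)
                nbrs = [x[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                        if 0 <= a < M and 0 <= b < N]
                prior = beta * sum(label != nb for nb in nbrs)
                costs.append(data + prior)
            x[i, j] = int(np.argmin(costs))   # greedy pixelwise update

print((x == truth).mean())  # fraction of correctly labeled pixels
```

Each update can only lower the objective, so ICM converges quickly, but only to a local minimum, which is why the slide compares it against simulated annealing and multi-resolution approaches.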

Summary

Graphical models study probability distributions whose conditional dependencies arise out of specific graph structures.
A Markov random field is an undirected graphical model with special factorization properties.
2-D MRFs have been widely used as priors in image processing problems.
The choice of potential functions leads to different optimization problems.

