Visual Analytics Course

Bordeaux roadmap 2013-2014 / Guillaume Baud-Berthier & Thibault Godin's corner

Project source code

Project: Comparing Graph Structure

Introduction

Let $G = (V, E)$ be a simple graph with $n$ nodes. We define its Laplacian matrix (also known as the Kirchhoff matrix) by $L = D - A$, where $D$ is the diagonal matrix of the vertex degrees and $A$ is the adjacency matrix of $G$.
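
For example, with this convention, the path graph on three vertices $1 - 2 - 3$ gives

$$ A = \begin{pmatrix} 0 & 1 & 0 \\ 1 & 0 & 1 \\ 0 & 1 & 0 \end{pmatrix}, \qquad D = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 1 \end{pmatrix}, \qquad L = D - A = \begin{pmatrix} 1 & -1 & 0 \\ -1 & 2 & -1 \\ 0 & -1 & 1 \end{pmatrix}. $$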

Observation 1.

$L$ is a symmetric, positive semi-definite matrix. Indeed $L = M M^T$, where $M$ is a signed incidence matrix of $G$. Hence $L$ is diagonalisable and all its eigenvalues are non-negative. In fact $0$ is always an eigenvalue, associated with the eigenvector $(1, \dots, 1)^T$, since every row of $L$ sums to zero.

One of the first applications of this matrix was the computation of the number of spanning trees, given by the Kirchhoff Matrix-Tree theorem below, which is proved in the appendix.

Kirchhoff Matrix-Tree theorem. The number of spanning trees of $G$ is equal to any cofactor of the Laplacian matrix, i.e. to $\det(L_{i,i})$, where $L_{i,i}$ is the matrix $L$ in which row $i$ and column $i$ have been deleted.
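
For instance, for the triangle $K_3$,

$$ L = \begin{pmatrix} 2 & -1 & -1 \\ -1 & 2 & -1 \\ -1 & -1 & 2 \end{pmatrix}, \qquad \det(L_{1,1}) = \det\begin{pmatrix} 2 & -1 \\ -1 & 2 \end{pmatrix} = 3, $$

which is indeed its number of spanning trees (remove any one of the three edges).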

Number of connected components. The multiplicity of the eigenvalue $0$ of the Laplacian matrix is the number of connected components of $G$.

Indeed, after reordering the vertices, the Laplacian matrix can be written as a block-diagonal matrix with one block per connected component. The characteristic polynomial is then the product of the characteristic polynomials of the blocks; since each block is the Laplacian of a connected graph, $0$ has multiplicity $1$ in each of them, so its total multiplicity is the number of blocks.

Connectivity Number

In 1973, Miroslav Fiedler introduced in his article Algebraic Connectivity of Graphs an interpretation, in terms of connectivity, of the second smallest eigenvalue of $L$, which we will denote $\lambda_2$.

First, consider the matrix $B = \Delta I - L$, where $\Delta$ is the maximum degree of $G$: it is symmetric, has non-negative coefficients and admits $\Delta$ as its greatest eigenvalue, with eigenvector $(1, \dots, 1)^T$. As $B$ is irreducible when $G$ is connected, the Perron-Frobenius theorem tells us that this greatest eigenvalue has multiplicity one, so $0$ is a simple eigenvalue of $L$. On the other hand, if $G$ is not connected, $0$ clearly has an associated eigenspace of dimension at least $2$. Hence $\lambda_2 > 0$ if and only if $G$ is connected.

The eigenvector associated with $\lambda_2$, often called the Fiedler vector, can be used as a measure of the connectivity of a given vertex.
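
For the path on three vertices used as an example above, the eigenvalues of $L$ are $0$, $1$ and $3$, so $\lambda_2 = 1$ and a Fiedler vector is $(1, 0, -1)^T$: the two end vertices get the extreme values, while the central vertex, through which all the connectivity of the path passes, gets $0$.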

Our Work

We decided to write a Python script that builds the Laplacian matrix and adds two metrics to the graph. The first one is the set of eigenvalues: node $n$ receives the $n$-th eigenvalue (counted with multiplicity), so it is easy to visualise the distribution of the eigenvalues as a histogram using Tulip's histogram panel. One can try to understand the structure of the graph by looking at that histogram: the multiplicity of $0$ is, as said previously, the number of connected components, and the further to the right and the more tightly grouped the other values are, the better connected the graph is (extremal case: for the complete graph, all non-zero eigenvalues are equal to $n$). The second metric is more informative about an individual node: each node $n$ receives the $n$-th coordinate of a fixed Fiedler vector. This metric somehow represents how strongly $n$ is attached to the most central block (the set of connected nodes whose coordinate is close to zero): the smaller the absolute value of the metric, the more central the node. Once again it allows us to visualise the structure of the graph from a histogram.
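
The following is a minimal sketch of this computation with numpy; it is not the authors' lapla.py, the Tulip-specific parts (writing the values into metric properties) are omitted, and the example graph and variable names are ours.

<code python>
import numpy as np

def laplacian(n, edges):
    """Return the Laplacian matrix L = D - A of a simple undirected graph."""
    L = np.zeros((n, n))
    for u, v in edges:
        L[u, u] += 1
        L[v, v] += 1
        L[u, v] -= 1
        L[v, u] -= 1
    return L

# Example: a path 0 - 1 - 2 - 3 with an extra chord (0, 2).
n = 4
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
L = laplacian(n, edges)

# eigh returns the eigenvalues in ascending order and orthonormal eigenvectors.
eigenvalues, eigenvectors = np.linalg.eigh(L)

# Metric 1: node i gets the i-th eigenvalue (counted with multiplicity).
eigenvalue_metric = {i: eigenvalues[i] for i in range(n)}

# Metric 2: node i gets the i-th coordinate of the Fiedler vector
# (eigenvector associated with the second smallest eigenvalue).
fiedler_vector = eigenvectors[:, 1]
fiedler_metric = {i: fiedler_vector[i] for i in range(n)}

print("eigenvalues:", np.round(eigenvalues, 3))
print("Fiedler metric:", {i: round(v, 3) for i, v in fiedler_metric.items()})
</code>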

However, histogram visualisation can be quite obscure (especially when the nodes are mixed on it) and does not allow someone without knowledge of spectral analysis and the Fiedler vector to understand the graph structure. Even for someone who understands the basics of the subject, it is hard to actually figure out the graph's structure from these histograms. That is why we thought it could be interesting to use this data to create a layout algorithm for simple, undirected, connected graphs.

Our first idea was, starting from a random layout, to set the x-coordinate of each node to its Fiedler-vector coordinate. This splits the graph, with the nodes whose coordinate is negative on the left and those with a positive coordinate on the right, and pushes towards each extreme the nodes that matter least for connectivity. This is implemented in lapla.py, and it indeed reduces the visual complexity of the drawing; a sketch of the idea follows.
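
A minimal sketch of this first layout, under the same assumptions as the previous snippet (numpy in place of the Tulip API; the function name is ours):

<code python>
import numpy as np

def spectral_x_layout(L, seed=0):
    """Return (x, y) positions: x = Fiedler coordinate, y = random."""
    rng = np.random.default_rng(seed)
    _, vecs = np.linalg.eigh(L)      # eigenvectors, eigenvalues in ascending order
    fiedler = vecs[:, 1]             # eigenvector of the second smallest eigenvalue
    return [(fiedler[i], rng.uniform(-1.0, 1.0)) for i in range(L.shape[0])]
</code>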

We first thought that this idea could be applied recursively on the two components separated by the previous algorithm, while limiting the movement of the nodes so as to preserve the two-sided structure. Before that, however, we tried a much simpler idea: the previous algorithm only changes the x-coordinate, so what about the y-coordinate? We decided to set it to the $n$-th component of the eigenvector associated with the third smallest eigenvalue (counted with multiplicity, so it can be a Fiedler vector too, but independent of our first one). We were pleased to see that this simple idea is very effective, as can be seen in the table below. Yet we have not been able to give a proper justification of this phenomenon, which may be the subject of further investigation in the future. lapla2.py implements this second method.
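
Under the same assumptions as above (numpy only, not the authors' lapla2.py), the second method only changes the choice of the y-coordinate:

<code python>
import numpy as np

def spectral_xy_layout(L):
    """Return (x, y) positions taken from the 2nd and 3rd Laplacian eigenvectors."""
    _, vecs = np.linalg.eigh(L)      # columns ordered by ascending eigenvalue
    return [(vecs[i, 1], vecs[i, 2]) for i in range(L.shape[0])]
</code>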

Layout Examples
Each type of graph below was drawn three times: with a random layout, with lapla.py and with lapla2.py (one screenshot per layout):

* planar graph (30 nodes, 84 edges)

* complete tree of degree 3 and depth 5

* random simple graph with 100 nodes
Bibliography

* Miroslav Fiedler, Algebraic Connectivity of Graphs, Czechoslovak Mathematical Journal, 23 (98), 1973

* Sukanta Pati, Laplacian Matrix of a Graph, Matrix Information Geometries conference, 2011, Palaiseau, France

Appendix

* Proof of the Kirchhoff Matrix-Tree theorem

We know that $L = M M^T$, where $M$ is the $n \times m$ signed incidence matrix of $G$ ($m$ being the number of edges). The same holds for $L_{i,i} = M_i M_i^T$, with $M_i$ the matrix $M$ from which the same row $i$ as in $L$ has been deleted. The Cauchy-Binet formula tells us that if $n - 1 \le m$ (i.e. $G$ has at least $n-1$ edges) then $\det(M_i M_i^T) = \sum_{S} \det(M_{i,S}) \det(M_{i,S}^T) = \sum_{S} \det(M_{i,S})^2$, where $S$ ranges over the subsets of $\{1, \dots, m\}$ of size $n-1$ and $M_{i,S}$ (resp. $M_{i,S}^T$) is the $(n-1) \times (n-1)$ matrix obtained from $M_i$ (resp. $M_i^T$) by removing the columns (resp. rows) not in $S$.

Here it leads us to $\det(L_{i,i}) = \sum_S \det(M_{i,S})^2$. Each subset $S$ of $n-1$ columns can be seen as a sub-graph of $G$ with $n-1$ edges. If this sub-graph is not a spanning tree, then either some row of $M_{i,S}$ is zero or we can find a cycle among its columns, so $\det(M_{i,S}) = 0$. If it is a spanning tree, using Gaussian elimination we can prove that $\det(M_{i,S}) = \pm 1$ (the matrix can be brought to a triangular form whose diagonal entries are $1$ or $-1$). Hence $\det(L_{i,i})$ is exactly the number of spanning trees of $G$.
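
As a quick numerical illustration (not one of the project scripts), one can check the theorem on the complete graph $K_4$, which has $4^{4-2} = 16$ spanning trees by Cayley's formula:

<code python>
# Check the Matrix-Tree theorem on K4: any diagonal cofactor of its Laplacian
# should equal 16, the number of spanning trees given by Cayley's formula.
import numpy as np
from itertools import combinations

n = 4
L = np.zeros((n, n))
for u, v in combinations(range(n), 2):   # all edges of the complete graph K4
    L[u, u] += 1; L[v, v] += 1
    L[u, v] -= 1; L[v, u] -= 1

cofactor = np.linalg.det(np.delete(np.delete(L, 0, axis=0), 0, axis=1))
print(round(cofactor))                   # prints 16
</code>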
