=== van Ham and van Wijk's DOA ===
  
Furnas' ideas were elegantly extended to work with a hierarchy of clusters for a graph (it may work just as well with a hierarchy of clusters of high-dimensional data). In this situation, we are given a tree of clusters $T = (W, F)$ where this time the nodes of the tree correspond to clusters of a graph $G = (V, E)$. That is, we may think of $w \in W$ as a subset ${\bf C}_w \subset V$, with the additional condition that ${\bf C}_w \subset {\bf C}_{w'}$ when $w'$ is an ancestor of $w$.
  
Now, we need to compute a value for each node of the tree reflecting its //a priori importance//. We may think of leaf nodes (elements of $V$) as having the least //a priori// importance. What we need is to compute the //a priori// importance of a cluster obtained from merging sub-clusters. Looking at the whole tree of clusters, what we need is a function $API: W \to \mathbb R^+$ that grows along any path going towards the root of the tree. For reasons that will become clear, we require the function to be positive.
  
  * Recall the [[high-dimensional_data_and_ward_hyugen_s_principle|Huygens theory we exposed when looking at high-dimensional data]]. A good example of what the //a priori// importance may be is the internal inertia of a cluster. We then have an //a priori// importance that increases towards the root, by virtue of the Huygens theorem.
  * Another example could be to take, as //a priori// importance of a cluster ${\bf C}_w$, the //average distance between the two clusters// ${\bf C}_{w'}, {\bf C}_{w''}$ that were merged to obtain ${\bf C}_w = {\bf C}_{w'} \cup {\bf C}_{w''}$ (see the sketch right after this list).
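A minimal sketch of how such an $API$ function might be computed, in Python. This is not code from the original page: the ''ClusterNode'' class and the choice of the merge distance as importance are assumptions made for illustration; taking the max with the children's values merely enforces growth towards the root.

<code python>
# Hypothetical cluster tree node: C_w is the subset of V covered by node w.
class ClusterNode:
    def __init__(self, elements, children=None, merge_distance=0.0):
        self.elements = elements          # the subset C_w of V
        self.children = children or []    # the sub-clusters that were merged
        self.merge_distance = merge_distance
        self.api = 0.0                    # a priori importance, filled in below

def compute_api(node):
    """Post-order traversal: API grows along any leaf-to-root path."""
    if not node.children:
        node.api = 0.0                    # leaves get the least importance
        return node.api
    child_api = max(compute_api(child) for child in node.children)
    # taking the max with the merge distance guarantees monotonicity
    node.api = max(node.merge_distance, child_api)
    return node.api
</code>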
van Ham and van Wijk define things a bit differently from Furnas, in that they don't directly threshold the values associated with nodes of the tree. They instead define what they call a $DOA$: a degree of abstraction. When selecting a degree of abstraction $DOA = 1$, what we should get is a total abstraction, that is a single metanode corresponding to the whole dataset. That is indeed the most abstract view we may have on the dataset. Conversely, when setting $DOA = 0$ we should get the least abstract view possible, that is a view with points on the screen that correspond to the data elements themselves. The core procedure is to decide how to vary the selection of clusters as $DOA$ goes from 0 to 1.

The criterion they define to decide whether to show a cluster is the following. Denote as $r$ the root node of the tree. They select from the cluster tree the clusters ${\bf C}_w$, with father cluster ${\bf C}_{w'}$, whenever we have:

$$
API({\bf C}_w) \leq DOA \cdot API(r) < API({\bf C}_{w'})
$$

That is, we select clusters whose //a priori// importance is at or below the $DOA$ fraction of the root's //a priori// importance, but whose father cluster is too abstract with respect to the selected $DOA$ threshold (its importance exceeds that fraction).
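As a minimal sketch (reusing the hypothetical ''ClusterNode'' from above, with $API$ values already computed), the criterion amounts to a simple top-down traversal that stops as soon as a node falls below the threshold:

<code python>
def select_clusters(root, doa):
    """Select the C_w with API(C_w) <= doa * API(r) < API(C_w')."""
    threshold = doa * root.api
    selected = []

    def visit(node):
        if node.api <= threshold:
            # the caller (the father) was above the threshold,
            # so node belongs to the selected cut
            selected.append(node)
        else:
            for child in node.children:
                visit(child)

    if root.api <= threshold:     # DOA = 1: a single metanode, the whole dataset
        selected.append(root)
    else:
        for child in root.children:
            visit(child)
    return selected
</code>

With $DOA = 1$ the test succeeds at the root itself, yielding the single metanode; with $DOA = 0$ the traversal descends all the way to the data elements, as expected.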
== Varying $DOA$ according to a focus node ==

Just as with Furnas, we may adapt the previous calculations to select nodes according to their 'distance' to a focus cluster node $f$. The idea is relatively simple, and consists in taking an increasing function $DOA: \mathbb R^+ \to [0, 1]$ instead of a constant $DOA$ and feeding it the distance $d(w, f)$ to the focal node -- measured in whatever manner is appropriate. Obviously, we need to require that $DOA(0) = 0$. Different candidate functions are possible (see the sketch after this list):
  * A linear $DOA$ bounded above by 1, that is $DOA(x) = x$ for $x \leq 1$ and $DOA(x) = 1$ when $x > 1$.
  * Any other non-linear increasing function will do, like $DOA(x) = x^2$ when $x \in [0, 1]$ and $DOA(x) = 1$ when $x > 1$.
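The two candidates above, written out in Python (again a sketch, not code from the page):

<code python>
def doa_linear(x):
    return min(x, 1.0)        # DOA(x) = x on [0, 1], clamped to 1 beyond

def doa_quadratic(x):
    return min(x * x, 1.0)    # any non-linear increasing shape with DOA(0) = 0 works
</code>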
The condition for showing a cluster then becomes the conjunction of two distinct conditions: $API({\bf C}_w) \leq DOA(d(w, f)) \cdot API(r)$ //and// $DOA(d(w', f)) \cdot API(r) < API({\bf C}_{w'})$. So the selection of clusters depends on the chosen focus point, as well as on the way the $DOA()$ function deals with distances.
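A minimal sketch of the focus-dependent selection, again under the assumptions above; the ''distance'' argument stands for whatever cluster-to-focus distance $d(w, f)$ is deemed appropriate and is assumed to be supplied by the caller:

<code python>
def select_clusters_with_focus(root, focus, distance, doa_fn=doa_linear):
    """Select C_w with API(C_w) <= DOA(d(w, f)) * API(r)
    and DOA(d(w', f)) * API(r) < API(C_w')."""
    selected = []

    def visit(node):
        if node.api <= doa_fn(distance(node, focus)) * root.api:
            selected.append(node)
        else:
            # node is still too abstract given its distance to the focus:
            # this is exactly the father-side condition for its children
            for child in node.children:
                visit(child)

    visit(root)
    return selected
</code>

Near the focus the threshold vanishes (since $DOA(0) = 0$) and the traversal descends to the data elements; far from it the threshold approaches $API(r)$ and large metanodes are kept.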