Friday, December 7, 2012 - 3:05pm
1 hour (actually 50 minutes)
The k-core of a (hyper)graph is its unique maximal induced subgraph in which every vertex has degree at least k. It provides one measure of how dense a graph is: a sparse graph will tend to have a smaller k-core than a dense graph, and will have an empty k-core for more values of k. I will survey results on the k-core of various random graph models. A common feature is that the size of the k-core undergoes a phase transition as the density parameter passes a critical threshold. I will also describe how these results connect to a class of problems that at first glance do not appear to involve random graphs. Among these is an interesting problem arising from probabilistic number theory in which the random hypergraph model has "non-homogeneous" vertex degrees---some vertices have larger expected degree than others.
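As an aside (not part of the talk itself), the k-core of a finite graph can be computed by the standard "peeling" procedure: repeatedly delete any vertex of degree less than k until none remain. A minimal sketch, assuming the graph is given as an adjacency dictionary:

```python
from collections import deque

def k_core(adj, k):
    """Return the vertex set of the k-core of an undirected graph.

    adj: dict mapping each vertex to the set of its neighbours.
    Repeatedly peels off vertices of degree < k; what survives
    is the (possibly empty) k-core.
    """
    degree = {v: len(nbrs) for v, nbrs in adj.items()}
    queue = deque(v for v, d in degree.items() if d < k)
    removed = set(queue)
    while queue:
        v = queue.popleft()
        for u in adj[v]:
            if u not in removed:
                degree[u] -= 1
                if degree[u] < k:
                    removed.add(u)
                    queue.append(u)
    return set(adj) - removed

# Example: a triangle {1,2,3} with a pendant vertex 4 attached to 3.
# The 2-core is the triangle; the 3-core is empty.
adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3}}
print(k_core(adj, 2))  # {1, 2, 3}
print(k_core(adj, 3))  # set()
```

The peeling order does not matter: the k-core is unique, so any schedule of deletions reaches the same subgraph.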