Monday, February 22, 2010
An O(log n / log log n)-approximation algorithm for the Asymmetric Travelling Salesperson problem
Monday, February 15, 2010
Monday, February 8, 2010
Twice Ramanujan Sparsifiers
Given a weighted graph G, a natural question to ask is whether there exists another graph H on the same vertex set such that every cut has similar weight in G and H. Since the weight of a cut can be written as $\sum_{uv \in E(G)}w_{uv}(x_u-x_v)^2$ where x is a 0-1 indicator vector, one can ask the same question when x is allowed to take arbitrary real values. A spectral sparsifier of a graph G is a sparse graph H for which these two quadratic forms are close for every assignment of x.
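For concreteness, here is a small numpy sketch (the graphs and vectors are made up for illustration) comparing the quadratic form $x^T L x = \sum_{uv \in E} w_{uv}(x_u - x_v)^2$ on two graphs, both for a 0-1 cut vector and for a random real vector:

```python
import numpy as np

def laplacian(n, edges):
    """Graph Laplacian L with x^T L x = sum over edges (u,v,w) of w*(x_u - x_v)^2."""
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    return L

# Toy graphs, made up for illustration: G is a 4-cycle with unit weights,
# H keeps two of its edges with doubled weight.
n = 4
L_G = laplacian(n, [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 0, 1.0)])
L_H = laplacian(n, [(0, 1, 2.0), (2, 3, 2.0)])

# 0-1 vectors correspond to cuts; arbitrary real vectors give the spectral notion.
for x in [np.array([1.0, 1.0, 0.0, 0.0]), np.random.default_rng(0).standard_normal(n)]:
    print(x @ L_G @ x, "vs", x @ L_H @ x)
```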
Algorithmically, graph sparsifiers were first studied by Spielman and Teng as a subroutine in their near-linear time Laplacian solvers. Their algorithm partitions the graph into expanders by finding approximately sparse cuts, and then randomly samples the edges within each expander.
Follow-up work has explored other methods for constructing sparsifiers with better guarantees. One approach is to sample edges with probability proportional to their effective resistance, but it relies on the near-linear time solver to run quickly. This work takes a similar approach with a different goal in mind: it asks for the best approximation ratio achievable with only about dn edges, and achieves twice the ratio given by Ramanujan graphs, which are d-regular graphs whose spectral gap is as large as possible.
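As a rough illustration of the effective-resistance approach, here is a sketch in the style of Spielman-Srivastava sampling; it computes resistances from a dense pseudo-inverse instead of a fast solver, so it is simple but slow:

```python
import numpy as np

def sparsify_by_effective_resistance(n, edges, q, seed=0):
    """Sample q edges with probability proportional to w_e * R_e (weight times
    effective resistance) and reweight, so the sampled Laplacian is an unbiased
    estimate of the original.  Resistances are computed from a dense pseudo-inverse
    here; the near-linear time versions replace this step with a fast Laplacian solver."""
    rng = np.random.default_rng(seed)
    L = np.zeros((n, n))
    for u, v, w in edges:
        L[u, u] += w; L[v, v] += w
        L[u, v] -= w; L[v, u] -= w
    Lplus = np.linalg.pinv(L)
    # Effective resistance of edge (u, v): R_uv = (e_u - e_v)^T L^+ (e_u - e_v).
    R = np.array([Lplus[u, u] + Lplus[v, v] - 2 * Lplus[u, v] for u, v, _ in edges])
    wts = np.array([e[2] for e in edges])
    p = wts * R / np.sum(wts * R)
    counts = rng.multinomial(q, p)
    return [(edges[i][0], edges[i][1], counts[i] * wts[i] / (q * p[i]))
            for i in range(len(edges)) if counts[i] > 0]
```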
I will use the first half of the talk to cover background topics such as condition numbers and spectral decompositions, and to survey the approaches used in recent work on sparsifiers. If time permits, I will try to sketch why one cannot do better than the ratio given by Ramanujan graphs. The second half of the talk will go through the proof of the key theorem of the paper, which roughly states the following:
Given a set of rank-one matrices that sum to the identity, one can pick a small reweighted subset of them such that the ratio between the largest and smallest eigenvalues of their sum is small.
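To see what the statement is saying, here is a small numerical sketch. It is not the paper's construction: it picks the reweighted subset by leverage-score sampling, which needs O(n log n) matrices, while the deterministic argument in the paper gets away with only O(n):

```python
import numpy as np

rng = np.random.default_rng(1)

# Rank-one matrices v_i v_i^T summing to the identity: the rows of any tall
# matrix with orthonormal columns give such a decomposition.
n, m = 20, 2000
V, _ = np.linalg.qr(rng.standard_normal((m, n)))   # rows v_1, ..., v_m; sum v_i v_i^T = I_n

# Randomized stand-in for the selection step: keep a reweighted random subset,
# sampled with probability proportional to ||v_i||^2.
lev = np.sum(V**2, axis=1)
p = lev / lev.sum()
k = 8 * n * int(np.log(n))
idx = rng.choice(m, size=k, p=p)
A = sum(np.outer(V[i], V[i]) / (k * p[i]) for i in idx)

eigs = np.linalg.eigvalsh(A)
print("largest / smallest eigenvalue:", eigs[-1] / eigs[0])
```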
Tuesday, February 2, 2010
The Role of Randomization in Mechanism Design
- A universally truthful randomized mechanism is simply a probability distribution over deterministic truthful mechanisms. The mechanism gets to flip some coins, but no matter how the coins land, no agent could have benefited by lying, even in hindsight.
- A truthful in expectation mechanism only guarantees that no agent can benefit in expectation by lying, before the mechanism's coins are flipped. For some outcomes of the coins, agents could, in hindsight, have benefited by lying (a toy example appears after this list).
- For any epsilon > 0, there exists a truthful in expectation mechanism that achieves a (1+epsilon)-approximation to social welfare and runs in time poly(1/epsilon, log m).
- For any constant epsilon > 0, any universally truthful mechanism that achieves a (2-epsilon)-approximation to social welfare must run in time Omega(m).
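To make the distinction between the two notions of truthfulness concrete, here is a hypothetical toy example (not from the paper): a single-item second-price auction, and a randomized variant in which the winner pays either twice the second-highest bid or nothing, each with probability 1/2. The expected payment equals the second price, so the variant is truthful in expectation, but a winner who ends up charged twice the second price may regret not having underbid, so it is not universally truthful.

```python
import random

def second_price(bids):
    """Deterministic second-price auction: truthful as a deterministic mechanism,
    hence trivially universally truthful."""
    winner = max(range(len(bids)), key=lambda i: bids[i])
    price = max(b for i, b in enumerate(bids) if i != winner)
    return winner, price

def randomized_variant(bids, rng=random.Random(0)):
    """Winner pays twice the second price or nothing, each with probability 1/2.
    The expected payment equals the second price, so no bidder gains in
    expectation by lying (truthful in expectation), but after an unlucky coin
    flip the winner may regret not having underbid to lose, so the mechanism
    is not universally truthful."""
    winner, price = second_price(bids)
    payment = 2 * price if rng.random() < 0.5 else 0.0
    return winner, payment

# Bidder 0 has value 3 and bids truthfully; with probability 1/2 it pays 4 > 3.
print(randomized_variant([3.0, 2.0, 1.0]))
```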