In my research I love to think about the design and analysis of (randomized) algorithms for combinatorial problems. Often I mix this broad and classical area with ideas from distributed computing, information theory, and (network) coding theory. I am always on the lookout for new topics and problems to play with.
A common theme in my research is the use of randomness. It is surprising and sometimes almost unbelievable how much simpler and more efficient algorithms can become if allowed to make random decisions.
Some examples of my work include:
Network Coding and Gossip Algorithms (or how to spread information quickly and reliably):
We show that often the most efficient way to disseminate information in a (distributed) network is to blindly forward messages to some randomly chosen contacts. Moreover, if more information is available than fits into a single message, sending random mixtures (e.g., XORs) of the available pieces leads to optimal protocols. The throughput achieved in this way is often drastically and provably higher than what is feasible with classical routing or store-and-forward protocols. In addition to being extremely simple, distributed, and efficient, these so-called gossip and random network coding algorithms also provide a remarkable amount of reliability against network failures and even the most hostile network dynamics.
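As a rough illustration of the "random mixtures" idea, here is a toy sketch (the function names and setup are my own, purely for illustration, not code from the papers): a sender gossips XORs of random subsets of k messages, and a receiver recovers all k messages by Gaussian elimination over GF(2) once enough mixtures have arrived.

```python
# Illustrative sketch only: toy "random XOR mixture" coding
# (random linear network coding over GF(2)); names are hypothetical.
import random

def random_mixture(messages, rng):
    """Sender: XOR a random nonempty subset of the k messages.
    Returns (coefficient_vector, coded_payload)."""
    k = len(messages)
    coeffs = [rng.randint(0, 1) for _ in range(k)]
    while not any(coeffs):
        coeffs = [rng.randint(0, 1) for _ in range(k)]
    payload = 0
    for c, m in zip(coeffs, messages):
        if c:
            payload ^= m
    return coeffs, payload

def decode(packets, k):
    """Receiver: incremental Gauss-Jordan elimination over GF(2).
    Returns the k original messages once the packets span GF(2)^k, else None."""
    pivots = {}                      # pivot column -> (reduced coeffs, payload)
    for coeffs, payload in packets:
        coeffs = coeffs[:]
        for col, (bc, bp) in pivots.items():
            if coeffs[col]:          # cancel already-known pivot columns
                coeffs = [a ^ b for a, b in zip(coeffs, bc)]
                payload ^= bp
        piv = next((i for i, c in enumerate(coeffs) if c), None)
        if piv is None:
            continue                 # redundant packet, carries no new information
        for col in list(pivots):     # keep all stored rows fully reduced
            bc, bp = pivots[col]
            if bc[piv]:
                pivots[col] = ([a ^ b for a, b in zip(bc, coeffs)], bp ^ payload)
        pivots[piv] = (coeffs, payload)
    if len(pivots) < k:
        return None
    return [pivots[i][1] for i in range(k)]

rng = random.Random(0)
messages = [0b1011, 0b0110, 0b1100]                # three 4-bit messages
packets = []
while decode(packets, len(messages)) is None:
    packets.append(random_mixture(messages, rng))  # "gossiped" coded packets
decoded = decode(packets, len(messages))           # recovers all three messages
```

The point of the sketch is that the sender never needs to know which pieces the receiver is still missing: a random mixture is, with good probability, informative to everyone, which is exactly what makes these protocols robust and fully distributed.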
Coding for Interactive Communication (or how to have a conversation despite noise):
Very recently, I got excited about designing coding schemes that make interactive communications robust to noise, in the same way error-correcting codes provide reliability for one-way information transfers. The difficulty with conversations/interactive communications is that what one party wants to say depends heavily on the other party's responses, which allows small misunderstandings to completely derail a conversation. This makes it much harder to add redundancy efficiently. Still, a great number of new ideas have led to basic proofs of existence for such coding schemes. Over the last year we have made tremendous progress in understanding how much noise can be tolerated, how one can obtain computationally efficient coding schemes, and what communication rates are achievable.
Algorithms for the Lovász Local Lemma:
The probabilistic method tells us that an amazing array of beautiful and useful combinatorial structures exist, but it often fails to yield explicit examples or efficient constructions. This is particularly true for proofs involving the Lovász Local Lemma, which states that a large number of bad events can be simultaneously avoided as long as they are not too dependent on each other. My work, together with that of many others, has led to simple (even deterministic) algorithms that make almost all applications of this powerful tool constructive.
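The simplest such algorithm is the celebrated Moser-Tardos resampling procedure: sample everything at random, and as long as some bad event occurs, resample just the variables of that event. Here is a minimal sketch on a toy satisfiability instance (the formula and function names are my own illustration; under the Local Lemma's condition the loop terminates quickly in expectation):

```python
# Minimal sketch of Moser-Tardos resampling on a made-up toy 3-SAT instance.
import random

def moser_tardos(n_vars, clauses, seed=0):
    """Start from a uniformly random assignment; while some bad event
    (here: a violated clause) occurs, resample only its variables.
    Literals: +i means variable i is true, -i means it is false (1-indexed)."""
    rng = random.Random(seed)
    assign = [rng.random() < 0.5 for _ in range(n_vars)]

    def violated(clause):
        # A clause is a bad event exactly when every one of its literals is false.
        return all(assign[abs(l) - 1] != (l > 0) for l in clause)

    while True:
        bad = next((c for c in clauses if violated(c)), None)
        if bad is None:
            return assign            # no bad event left: desired structure found
        for l in bad:                # resample just the bad event's variables
            assign[abs(l) - 1] = rng.random() < 0.5

clauses = [[1, 2, 3], [-1, 4, 5], [-2, -4, 6], [3, -5, -6]]
assignment = moser_tardos(6, clauses)  # every clause ends up satisfied
```

The striking feature is how local the repair step is: a bad event is fixed by rerolling only its own variables, yet the process provably converges whenever the Local Lemma's dependency condition holds.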