
VENKATESAN GURUSWAMI

My research interests span several topics in theoretical computer science, including algorithmic coding theory, the role of randomness in computation, pseudorandomness and explicit combinatorial constructions, the theory of probabilistically checkable proofs, and the computational complexity of approximate optimization. Below I briefly describe some of these research directions.

Coding theory: Error-correcting codes provide a systematic way to add redundancy to data so that errors that occur during storage or transmission can be tolerated. In addition to their obvious importance to reliable storage and communication, codes have found many "extraneous" applications in complexity theory and cryptography, and are by now indispensable items in the toolkit of theoretical computer scientists. The basic principle in the design of codes is to ensure that distinct messages get encoded into codewords that are far apart, so that one will not be confused with another even in the presence of some errors. The broad challenges in coding theory are threefold: (i) formally specifying the noise model and understanding the potential and limits of error-correction for that model; (ii) finding and explicitly specifying good configurations of codewords; and (iii) exploiting the structure of the designed code to efficiently recover the transmitted codeword from a noisy received word. The game thus is to construct codes with low redundancy together with fast algorithms to correct large amounts of noise. I have been working towards this long-term goal for several years, and in particular have extensively studied the algorithmic, combinatorial, and complexity-theoretic aspects of a notion of error-correction called "list decoding," which allows recovery from approximately twice as many errors as traditional error-correction algorithms.
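The contrast between traditional (unique) decoding and list decoding can be made concrete with a toy example. The sketch below is purely illustrative and not drawn from any specific construction in this statement; all names (hamming, decode_unique, decode_list) are hypothetical, and the code it uses is a tiny repetition code rather than the algebraic codes studied in the research itself.

```python
def hamming(a, b):
    """Hamming distance between two equal-length bit strings."""
    return sum(x != y for x, y in zip(a, b))

# Toy code: 2-bit messages encoded by 3-fold repetition
# (block length 6, minimum distance 3).
code = {msg: msg * 3 for msg in ("00", "01", "10", "11")}

def decode_unique(received):
    """Traditional decoding: succeeds only if some codeword lies
    within radius floor((d-1)/2) = 1 of the received word."""
    best = min(code, key=lambda m: hamming(code[m], received))
    return best if hamming(code[best], received) <= 1 else None

def decode_list(received, radius):
    """List decoding: output ALL messages whose codewords lie within
    the given radius, allowing recovery beyond the unique bound."""
    return sorted(m for m in code if hamming(code[m], received) <= radius)
```

With two errors the received word can sit within distance 2 of two different codewords; unique decoding must give up (or answer wrongly), while list decoding returns the small set of candidates, one of which is the transmitted message.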
I am also excited by the practical potential of list decoding and am interested in experiments to validate its utility for realistic noise models.

Theory of approximation algorithms: Many computational tasks arising in practice can be cast as optimization problems, where the goal is to find a solution subject to some constraints that optimizes a certain objective value. Unfortunately, for most interesting optimization problems, finding an optimal solution is NP-hard. One versatile approach to cope with this intractability is to settle for approximation algorithms that find approximately optimal solutions with provable guarantees on quality (for example, a solution of cost at most 20% more than the optimum). Ideally, for each problem of interest, we would like to have an algorithm with a provable performance ratio along with a hardness result ruling out a better performance ratio, thereby identifying its approximation threshold. Tremendous progress has been made on this topic in the last two decades, thanks both to algorithmic techniques such as linear and semidefinite programming and metric embeddings, and to breakthroughs on the hardness side using the machinery of probabilistically checkable proofs (PCPs) and new coding-theoretic, probabilistic, and (Fourier) analytic tools. My own (ongoing) research has made many contributions to this subject using a wide range of techniques, proving strong, and often near-optimal, bounds on the approximability of a broad class of fundamental optimization problems, including constraint satisfaction problems, ranking problems, graph-theoretic problems such as coloring and independent sets, disjoint paths and low-congestion routing, graph partitioning, etc. The power of semidefinite programming and its relation to the Unique Games (and related) problems is an important focus of our current research.
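A small, standard example of an approximation algorithm with a provable performance ratio is the classic factor-2 algorithm for minimum vertex cover via a maximal matching. This sketch is illustrative only and is not one of the results described above; the function name is hypothetical.

```python
def vertex_cover_2approx(edges):
    """Greedy maximal matching: for each edge with both endpoints
    uncovered, add both endpoints to the cover. Every edge ends up
    covered, and since any cover must contain at least one endpoint
    of each matched edge, the output is at most 2x the optimum."""
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover
```

On a star graph with center 0 and leaves 1, 2, 3 the algorithm outputs a cover of size 2 while the optimum has size 1, showing the factor-2 bound can be (nearly) tight.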
Pseudorandomness: Pseudorandomness is a broad area that deals with efficiently generating objects that exhibit the desirable properties of "random-like" objects despite being constructed either explicitly or with limited randomness. Such pseudorandom constructions are important in the study of error-correcting codes, complexity theory, combinatorics, cryptography, and high-dimensional geometry. My research in recent years has addressed some of these challenges and led to good constructions of error-correcting codes, expander graphs, randomness extractors, Euclidean sections, compressed sensing matrices, etc. Despite the seemingly different definitions and motivations for the study of these objects, much of this progress was based on insights uncovering intimate connections between them. I am interested in strengthening these constructions when possible, and more broadly in expanding this body of work to find explicit constructions of other pseudorandom objects that arise in emerging applications (a salient example is the concept of subspace-evasive sets, which found a surprising application in some of our recent work on list decoding).

