My main research interest is the interaction between algorithms and programming languages, mostly in the context of parallel computing; my work has been both theoretical and experimental. As programming languages become higher level, implementations become more complex, and parallelism becomes pervasive, users are naturally becoming more removed from the hardware and its costs. Rather than trying to bring programmers down to the level of the machine to understand and obtain good performance, however, I believe that we should be trying to bring languages and cost models up to the level of the programmer. My research therefore centers on questions of how to model costs (e.g., time and space) for very high-level programming constructs (e.g., dynamic parallelism, futures, garbage collection), how to design systems so these costs have meaning, and how to make use of these features in effective algorithm design.
My recent work includes the PSCICO project with Gary Miller, Bob Harper, and Peter Lee, in which we are examining how to use very high-level programming constructs in geometric and scientific algorithms. We hope this project will give guidance to future language design and will identify new ways of thinking about algorithm implementation. I also work on applied algorithms, parallel garbage collection, parallel scheduling, and efficient parallel algorithms, and I continue to work, to some extent, on the NESL programming language, a parallel language that my students and I developed in the early 1990s.