Book review -- Lectures on Discrete Geometry by Jiří Matoušek


I have a number of books on the shelves above my desk which were left there by previous residents of my office. Recently I was thinking about symplectic ball packing and feeling a little bit unhappy about mathematics. In an attempt to find some solace, I decided to leaf through one of the books on my shelf which looked as unrelated to my work as possible.

It turned out that I had stumbled upon a classic.

“Lectures on Discrete Geometry” by Jiří Matoušek is a book about simple things: polytopes, hyperplane arrangements, convex sets, graphs. It is a model of mathematical exposition. I recommend it to anyone who has ever thought that polytopes are interesting.

The aspect of the book that most impresses me is its efficiency. The book contains almost no proofs that take more than a printed page, and not a single proof feels difficult. Yet, almost every theorem is amazing. The book offers meaningful insight with every page. It does not waste your time. If you’re used to slogging your way through Sobolev spaces, the book almost reads like a novel.

This efficiency is a symptom of a very deep understanding of the subject. Proving things is much easier than proving things well. Making difficult things elementary is almost impossible.

I think someone once said that one of the remarkable things about Mikhail Gromov was that he understood every part of the mathematics that he did in the simplest possible way. Matousek is a master, and on every page gives the simplest possible proof. That makes this book a treasure, because trying to learn how to prove most of the theorems in this book by looking through the literature would probably involve weeks or months of effort.

Did you know that slicing a high-dimensional convex bread loaf $B$ along two parallel hyperplanes $H_0, H_1$ gives a lower bound on the volume of the slice along any intermediate parallel hyperplane $H_t$? Did you know that Brunn–Minkowski gives a one-line proof of the isoperimetric inequality? Did you ever realize that the Johnson–Lindenstrauss Lemma deserves to be a lemma, because it’s actually easy? Did you ever think about whether one can actually sort a set faster if one has some partial knowledge of the ordering information on the set?
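That one-line isoperimetric argument runs roughly as follows (a standard sketch, not quoted from the book). Brunn–Minkowski says that for compact sets $A, B \subset \mathbb{R}^n$,

$$\operatorname{vol}(A + B)^{1/n} \ge \operatorname{vol}(A)^{1/n} + \operatorname{vol}(B)^{1/n}.$$

Take $B = \varepsilon B^n$ a small ball, expand $(\operatorname{vol}(A)^{1/n} + \varepsilon \operatorname{vol}(B^n)^{1/n})^n$ to first order in $\varepsilon$, and let $\varepsilon \to 0$:

$$\operatorname{area}(\partial A) = \lim_{\varepsilon \to 0} \frac{\operatorname{vol}(A + \varepsilon B^n) - \operatorname{vol}(A)}{\varepsilon} \ge n \operatorname{vol}(A)^{(n-1)/n} \operatorname{vol}(B^n)^{1/n},$$

with equality when $A$ is a ball — which is exactly the isoperimetric inequality.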

The “bread loaf” above, by the way, is taken from the book:

A picture that only a mathematician, or a baker, would love.
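The Johnson–Lindenstrauss claim above is also easy to see numerically: project points down with a random Gaussian matrix and pairwise distances survive. Here is a minimal sketch; the dimensions, the $\varepsilon$, and the (generous) constant in the target dimension are illustrative choices of mine, not taken from the book.

```python
import numpy as np

# Johnson-Lindenstrauss sketch: n points in dimension d, projected down to
# k ~ O(log n / eps^2) dimensions by a random Gaussian matrix scaled by
# 1/sqrt(k). Pairwise distances should be preserved up to a (1 +/- eps) factor.
rng = np.random.default_rng(0)
n, d, eps = 50, 10_000, 0.5
k = int(np.ceil(24 * np.log(n) / eps**2))  # generous constant for the demo

points = rng.standard_normal((n, d))
proj = rng.standard_normal((d, k)) / np.sqrt(k)  # random projection
low = points @ proj

def pairwise_dists(x):
    # All pairwise Euclidean distances, as an (n, n) matrix.
    diffs = x[:, None, :] - x[None, :, :]
    return np.sqrt((diffs ** 2).sum(-1))

hi, lo = pairwise_dists(points), pairwise_dists(low)
mask = ~np.eye(n, dtype=bool)          # ignore the zero diagonal
ratios = lo[mask] / hi[mask]           # distortion of each pair
print(ratios.min(), ratios.max())      # typically within (1 - eps, 1 + eps)
```

Note that $k$ depends only on $n$ and $\varepsilon$, not on the ambient dimension $d$ — that independence is the whole point of the lemma.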


Sometimes, I feel jealous of computer scientists and applied mathematicians, because I have the impression that they just get to play around with fun mathematics.

First, some cranky complaints. As a physicist, you get credit in your community for proposing a [physical approach to the Riemann hypothesis](https://journals.aps.org/prl/abstract/10.1103/PhysRevLett.118.130201), even if your approach does not go significantly beyond the ideas of the Hilbert–Polya conjecture and is almost certainly less insightful than the visions of Connes, Deninger, Hesselholt, et al. In machine learning you can publish a NIPS paper by applying some relatively trivial mathematics to “explain” some phenomenon in nonlinear optimization; in fact, you can publish a paper by literally plotting some 2d cuts of an energy function which exhibit behavior which can be proven with two lines of calculus.

But maybe more seriously, in statistics you can use fairly elementary mathematical tools to help people understand more deeply the structures they work with. In machine learning you can think about fun little high-dimensional geometry questions, and then test them out on a computer – and all of a sudden people appreciate these things because they can “see” their effects! Finally, I have the impression that the basic job of a computer science theory grad student is to do fun analysis and publish cool little algorithmic and geometric tricks. This is not a negative statement – I am saying that this approach to exploring mathematics is obviously extremely valuable. Focusing on the basics is great!

In some sense, it’s a question of alignment. In a metaphor, mathematicians try to ‘climb mountains’, mounting ‘expeditions’ to stage ‘attacks’ on problems, while applied mathematicians wander around in a ‘forest’ of mathematical phenomena and show off cool things they find and try to explain what they see; it’s not so important if the things they see are in the evergreen forest rather than on the rocky, sun-drenched, freezing peak of an unexplored mountain range.

I think that there are parts of mathematics which work much more like computer science – for example, I imagine John Conway’s approach to mathematics being that of the “ideal computer scientist”, in a sense. (Maybe I’ll do a review of his sphere-packing book some day.) Analysis has always seemed to be close at heart to computer science; and indeed Bourgain, Sarnak, and Margulis feature heavily in Matousek’s chapters on metric embeddings, and the widely applied ‘compressed sensing’ technique came out of the harmonic analysis community. Assaf Naor is in the mathematics department, but he is almost a computer scientist! Yet the first time I heard about any of these metric geometry questions was from Gromov’s book; and Gromov is a mathematician’s mathematician. Many of the deepest questions in discrete geometry on polytopes and hyperplane arrangements have been answered by relating them to questions about intersection cohomology and mixed Hodge structures and applying standard theorems from algebraic geometry. One is never too far from this very pure mathematics when wandering around in the forest.

Maybe I’m just complaining that I have to think about coherence conditions on perturbation data for Floer’s equations instead of fun analysis things all day. None of the proofs in this book involve the monstrous combinatorial and analytic manipulations needed to prove even the simplest theorems in symplectic topology. Yet, they all prove stuff that matters. Why isn’t the usual math PhD about doing more of that? There are so many simple things that people are confused about.

Anyway, buy Matousek’s book.