This webpage describes my research in detail up to 2007. For a description of my more recent results, and a list of publications, see here.

Ages ago, I started my career in the Department of Mathematics at the University of Caen. My interests then gradually shifted from "pure" number theory to computational number theory and related problems in cryptography, most notably integer factorization and lattice reduction. In this area, I obtained average-case bounds for the celebrated LLL algorithm and once held the world record for the best proven bound on the complexity of integer factorization. Currently, the core of my work deals with the analysis of algorithms. I am especially interested in applications of dynamical systems theory to some of the major algorithms and data structures studied by theoretical computer scientists. I am, for instance, responsible for the introduction of transfer operators in this area. My most recent contributions deal with the average-case analysis of lattice reduction in two or more dimensions, the exact analysis of a rich class of continued fraction algorithms, the application of transfer operators to analytic information theory, and the related analysis of digital trees ("tries") under nonuniform input models.

My personal research logo: the discs associated with continued fraction expansions of real numbers.


Festoons appearing in the analysis of lattice reduction.



The 3-D landscape of the number of "turns" of the Euclidean algorithm,
whose LSB version has to do with the tortoise and the hare race.
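As a small illustration (my own sketch, not taken from the analyses themselves): the "number of turns" is simply the number of division steps performed by the classical Euclidean algorithm, which can be counted as follows.

```python
def euclid_steps(u, v):
    """Run the classical Euclidean algorithm on (u, v), u >= v >= 0.

    Returns (gcd, steps), where steps is the number of division
    steps ("turns") performed -- the quantity whose average-case
    behaviour such analyses study.
    """
    steps = 0
    while v != 0:
        u, v = v, u % v   # one division step: replace (u, v) by (v, u mod v)
        steps += 1
    return u, steps
```

For instance, consecutive Fibonacci numbers such as (34, 21) exhibit the worst-case behaviour: `euclid_steps(34, 21)` returns `(1, 7)`. The LSB (least-significant-bit first) variant mentioned above processes the operands from their low-order bits instead, which changes the step-count statistics.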


A plot of the maximum-digit-average function for binary representations.


The German 10DM banknote with the effigy of Gauss, the discoverer of the (in)famous Bell curve.



Hidden words are everywhere to be found. But what does finding a particular hidden word really mean?


Fundamental squares are fundamental when comparing two fractions, but also occasionally in computational geometry!


Apéry's famous continued fraction for zeta(3). What is a typical continued fraction like? And what does it have to say about algorithms?
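For reference, Apéry's continued fraction is usually quoted in the literature in the following form (recalled here for the reader's convenience, not reproduced from the figure):

```latex
\[
  \zeta(3) \;=\;
  \cfrac{6}{\,5 - \cfrac{1^{6}}{\,117 - \cfrac{2^{6}}{\,535 - \cdots\,}}\,}\,,
  \qquad
  \text{with $n$-th partial denominator } 34n^{3} + 51n^{2} + 27n + 5 .
\]
```

Indeed, the polynomial gives 5, 117, 535 for n = 0, 1, 2, and the partial numerators are -n^6. Such "non-typical" continued fractions contrast with the expansions of random reals studied in the metric theory.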


"Moby Dick" serves to compare theoretical estimates to simulations in the case of trie algorithms.

One of Devroye's random tree simulations.


Three transformations of the interval whose iterates give rise to binary representations, standard continued fractions, and centred continued fractions.
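A minimal Python sketch (my own illustration, under the standard definitions) of the first two of these maps: iterating the doubling map T(x) = 2x mod 1 reads off binary digits, and iterating the Gauss map T(x) = {1/x} reads off continued fraction digits. The centred variant replaces the floor of 1/x by the nearest integer.

```python
def binary_digits(x, n):
    """First n binary digits of x in [0, 1), via the doubling map T(x) = 2x mod 1."""
    digits = []
    for _ in range(n):
        d = int(2 * x)        # the digit is read off before applying the map
        digits.append(d)
        x = 2 * x - d         # T(x) = 2x mod 1
    return digits

def cf_digits(x, n):
    """First n partial quotients of x in (0, 1), via the Gauss map T(x) = {1/x}."""
    digits = []
    for _ in range(n):
        if x == 0:            # expansion of a rational terminates
            break
        q = int(1 / x)        # the digit is the integer part of 1/x
        digits.append(q)
        x = 1 / x - q         # T(x) = fractional part of 1/x
    return digits
```

For instance, `binary_digits(0.625, 3)` yields `[1, 0, 1]`, and `cf_digits(0.5, 3)` yields `[2]`, since 1/2 = [0; 2]. (With floating-point inputs the digits are only reliable for a few iterations; an exact-rational version would use the same two lines per step.)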

Back to my Home Page.