MERL / Princeton Face Detail Database

The MERL / Princeton Face Detail Database is a set of statistics describing the high-frequency facial geometry of over a hundred scanned subjects. These statistics describe facial geometry as a spatially-varying Heeger-Bergen texture. We are making the statistics available to the research community for applications in face analysis and synthesis. Our face model is described in this paper:

A Statistical Model for Synthesis of Detailed Facial Geometry
A. Golovinskiy, W. Matusik, H. Pfister, S. Rusinkiewicz, and T. Funkhouser
ACM Transactions on Graphics (Proc. SIGGRAPH), 25(3), July 2006.

The sections below link to the data and to visualizations of it. Then, in lieu of complete documentation, we provide the code. Since these statistics are useful only after following our remeshing procedure, the remainder of this page describes that process in greater detail than the paper does.

If you use this data, you implicitly agree to our copyright notice. We would appreciate it if you cited our paper in any publications that result from the use of this data.

Data

We provide the statistics of 113 subjects: both the full Heeger-Bergen (HB) statistics and "short" statistics, which use only the standard deviations of tiles. We also provide a questionnaire that contains some annotations of the subjects.

To figure out the format, it is easiest to look at the code with which these files were created (linked to and described below). LocalICDF.saveAs() and loadFrom() describe the format in which the full HB stats (inverse CDFs of the histograms) are saved. LocalShortStats.saveAs() and loadFrom() describe the format of the reduced/short statistics, which is what we use (the mean is recorded here, but not used). Note that the byte order with which Java saves binary files (big-endian) may not be what one expects.

More specifically, the format is as follows. All files are written in binary, in Java's default big-endian byte order. Each file contains the number of filters, followed by the number of tiles on a side (both as regular 4-byte ints). We use 4 orientations and 4 scales; along with the high-pass and low-pass residues, this makes 18 filters. The order of the filters (which can be verified in SimoncelliPyramid.java) is: the high pass, then the low pass, then the bandpass filters sorted by decreasing resolution and then by orientation (for filtered images that are, in order, of side 1024, 512, 256, and finally 128). The low-pass filter is not used, since its statistics are at a lower resolution than we would like to adjust. We use 16 x 16 tiles, and the tiles are written in horizontal and then vertical (row-major) order. The data are written in single-precision floating point (4 bytes).

The Heeger-Bergen stats are stored as inverse cumulative distributions with 256 bins. Each tile, then, contains 256 floats, the ith float being the value in the filtered tile whose cumulative probability is i / 256.
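As a minimal sketch of reading the layout just described, the following reader (a hypothetical helper, not part of the released code) loads a full HB stats file: two 4-byte ints (filter count and tiles per side), then, for each filter, the tiles in row-major order, each holding 256 floats. Java's DataInputStream reads big-endian values, matching the DataOutputStream calls that wrote the files.

```java
import java.io.*;

// Hypothetical reader for the full Heeger-Bergen statistics files,
// assuming the layout described above.
public class HBStatsReader {
    public final int numFilters;
    public final int tilesPerSide;
    // icdf[filter][tileY][tileX][bin] -- 256 inverse-CDF samples per tile
    public final float[][][][] icdf;

    public HBStatsReader(InputStream in) throws IOException {
        DataInputStream d = new DataInputStream(new BufferedInputStream(in));
        numFilters = d.readInt();      // big-endian 4-byte int
        tilesPerSide = d.readInt();
        icdf = new float[numFilters][tilesPerSide][tilesPerSide][256];
        for (int f = 0; f < numFilters; f++)
            for (int y = 0; y < tilesPerSide; y++)        // vertical order
                for (int x = 0; x < tilesPerSide; x++)    // horizontal order
                    for (int b = 0; b < 256; b++)
                        icdf[f][y][x][b] = d.readFloat(); // 4-byte float
    }
}
```

The reduced ("short") files follow the same header, but with two floats (mean, standard deviation) per tile instead of 256.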

The reduced stats contain two floats for each tile: the mean and then the standard deviation of the pixels under that tile of that filter. The mean is not used anywhere in the paper.

As a check that the statistics are loaded correctly, the user may verify that the reduced stats contain (approximately) the standard deviations of the histograms in the full stats. The user can also draw visualizations of the statistics and make sure they match the pictures provided below.
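This consistency check can be sketched as follows: since the 256 inverse-CDF samples of a tile are evenly spaced quantiles, their sample mean and standard deviation approximate the mean and standard deviation stored in the reduced stats for the same tile. (IcdfCheck is an illustrative name, not part of the released code.)

```java
// Estimate mean and standard deviation from a tile's 256-bin inverse CDF;
// these should approximately match the two floats stored for that tile
// in the reduced statistics.
public class IcdfCheck {
    public static double[] meanStd(float[] icdf) {
        double mean = 0;
        for (float v : icdf) mean += v;
        mean /= icdf.length;
        double var = 0;
        for (float v : icdf) var += (v - mean) * (v - mean);
        var /= icdf.length;
        return new double[] { mean, Math.sqrt(var) };
    }
}
```

For example, a tile whose inverse CDF is that of a uniform distribution on [0, 1] should yield a standard deviation near sqrt(1/12) ≈ 0.289.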

The Heeger-Bergen (inverse histogram) statistics are available here, and the reduced statistics here.

Visualizations

Also available are visualizations of the reduced statistics, and some Matlab files used to create them (visualizeShortStats('meanImage.img', getAllFiles) created the set of pictures in ShortStatPictures). Visualizations laid out in HTML can be viewed here. (One of the Matlab files can be used to create such HTML pages.)

Code

The code is provided not for immediate use, but to supplement the documentation. It is set up to work with our directory structure, contains large unused chunks, and will require heavy modification to be usable in outside applications. However, some parts of it may be useful, and much of it helps describe exactly what we do.

The Mesh class provides functions for manipulating meshes, and the Image classes provide those for manipulating images. The Loader (and ImageLoader) classes take mesh/image file(s) as input and allow the user to apply some of these functions interactively. BatchUtil contains a set of static functions for operating on sets of meshes in batch mode; it is the highest-level entry point to the code.

Creating the marker mesh

We assume incoming meshes are at a proper scale. If they are to be interpolated, they should also be aligned. (Since reparameterization is done relative to markers, alignment is not necessary to simply separate detail, extract statistics, modify the displacement image, etc.)

To register face meshes and re-parameterize them, we place markers on the face mesh at pre-defined positions. These markers can be added using Loader (ctrl-click to add a marker, backspace to delete the previous one, 'm' to save), or any other application, and saved to a .mkr file. Here is an example marker file. It is a text file that contains the number of markers (21 for the standard base meshes), followed by one line per marker giving its x y z position and vertex number. The markers denote the following parts of the face:

They are connected in the following manner to form the marker mesh:

The marker mesh should look like this (21 vertices, 30 triangles)
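A parser for the .mkr text format described above can be sketched as follows (MarkerFile is a hypothetical helper, not part of the released code): it reads the marker count, then one x y z position and vertex number per marker.

```java
import java.io.*;
import java.util.*;

// Hypothetical parser for the .mkr text format: first the marker count,
// then one "x y z vertexNumber" line per marker.
public class MarkerFile {
    public static class Marker {
        public final double x, y, z;
        public final int vertex;
        Marker(double x, double y, double z, int vertex) {
            this.x = x; this.y = y; this.z = z; this.vertex = vertex;
        }
    }

    public static List<Marker> parse(Reader in) {
        Scanner s = new Scanner(in);
        s.useLocale(Locale.US);       // parse "0.1" regardless of system locale
        int n = s.nextInt();          // number of markers (21 for base meshes)
        List<Marker> markers = new ArrayList<>();
        for (int i = 0; i < n; i++)
            markers.add(new Marker(s.nextDouble(), s.nextDouble(),
                                   s.nextDouble(), s.nextInt()));
        return markers;
    }
}
```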

Re-meshing and separating displacement image

Starting with the marker mesh, we subdivide and re-project onto the original mesh 4 times (2 linear subdivisions, then 2 Loop), then do 3 more Loop subdivisions (forming the "base mesh") and re-project. This series of steps may yield meshes that look like this:

(showing the marker mesh, the mesh after subdivisions 1, 2, 3, and 4, the base mesh (after 3 x Loop), and the re-parameterized original)

The resulting re-meshed mesh should have 246,401 vertices and 491,520 triangles.
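These counts follow from the 7 total subdivision steps: each step (linear or Loop) splits every triangle into 4, doubles the boundary edges, and adds one new vertex per edge. Assuming the marker mesh is topologically a disk (21 vertices, 30 triangles gives 50 edges by Euler's formula, of which 10 are on the boundary), the arithmetic can be checked with a short sketch (SubdivisionCount is an illustrative name, not part of the released code):

```java
// Track vertex/triangle counts through n 1-to-4 subdivision steps,
// starting from a disk-like mesh with v vertices, t triangles, and
// b boundary edges.
public class SubdivisionCount {
    public static long[] counts(long v, long t, long b, int n) {
        long e = (3 * t + b) / 2;   // each interior edge shared by 2 triangles
        for (int i = 0; i < n; i++) {
            v += e;                 // one new midpoint vertex per edge
            t *= 4;                 // every triangle splits into 4
            b *= 2;                 // every boundary edge splits into 2
            e = (3 * t + b) / 2;
        }
        return new long[] { v, t };
    }
}
```

Starting from 21 vertices, 30 triangles, and 10 boundary edges, 7 steps yield exactly 246,401 vertices and 491,520 triangles, matching the counts above.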

The last step yields a scalar for each vertex that we encode into the detail displacement image. To do this encoding, we begin with the marker mesh re-parameterized on a rectangle. This parameterization was chosen by hand, and looks like this:

A .ply mesh representing it is here. We then follow the sequence of subdivisions above, except that border vertices are forced not to move, so that Loop subdivision does not pull the mesh boundary inward from the rectangle's edges. This gives us an "image" mesh.

To rasterize the displacement image, for each image pixel we use the linearly interpolated value from the vertices of the image-mesh triangle containing that pixel. We now have the displacement image (we use 1024 x 1024 resolution), which looks like this:

The code for subdividing, reprojecting, and saving detail coefficients is in BatchUtil.createDetails(). The code for applying detail coefficients, saving re-meshed mesh, and creating displacement image is in BatchUtil.createReMeshes().
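The per-pixel interpolation used during rasterization is ordinary barycentric interpolation; the following is an illustrative sketch of that step (not the original rasterizer), interpolating per-vertex scalars at a point inside a 2D triangle:

```java
// Barycentric (linear) interpolation of per-vertex scalars va, vb, vc
// at point (px, py) inside triangle (ax,ay)-(bx,by)-(cx,cy).
public class DisplacementRaster {
    public static double interpolate(double ax, double ay, double bx, double by,
                                     double cx, double cy,
                                     double va, double vb, double vc,
                                     double px, double py) {
        double det = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy);
        double wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / det;
        double wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / det;
        double wc = 1.0 - wa - wb;      // weights sum to 1
        return wa * va + wb * vb + wc * vc;
    }
}
```

For example, at the midpoint of the edge between the second and third vertices, the result is the average of their two values.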

Extracting/matching statistics

The code to extract statistics can be followed from BatchUtil.createHBStats(). The exact filters we use are in SimoncelliPyramid (ported to Java from Simoncelli's code).

The code to adjust statistics can be followed from BatchUtil.adjust(). Image.matchStatisticsOf(...) is the main function to adjust the images; sections the user may want to modify are addNoise() (to vary the amount of noise), the number of iterations, and which channels to match (we do not match the low-frequency residue).
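The core of each Heeger-Bergen matching iteration is histogram matching against the stored inverse CDFs. The following is a hedged sketch of that idea (not the original Image.matchStatisticsOf): each value is remapped to the target quantile with the same rank.

```java
import java.util.Arrays;

// Sketch of histogram matching via a 256-bin inverse CDF: remap the
// values of `data` so their empirical distribution follows `icdf`.
public class HistogramMatch {
    public static float[] match(float[] data, float[] icdf) {
        int n = data.length;
        Integer[] order = new Integer[n];
        for (int i = 0; i < n; i++) order[i] = i;
        // sort indices by value, so order[r] is the index of the r-th smallest
        Arrays.sort(order, (i, j) -> Float.compare(data[i], data[j]));
        float[] out = new float[n];
        for (int r = 0; r < n; r++) {
            int bin = (int) ((long) r * icdf.length / n); // rank -> quantile bin
            out[order[r]] = icdf[bin];
        }
        return out;
    }
}
```

In the actual pipeline this matching is applied per tile and per filter band, blended across tiles, and iterated, with the low-frequency residue left unmatched as noted above.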

© Copyright 2006 Mitsubishi Electric Research Laboratories (MERL) & Princeton University.
All Rights Reserved.

Permission to use, copy and modify this software and/or data and its documentation without fee for educational, research and non-profit purposes, is hereby granted.

To request permission to incorporate this software and/or data into commercial products contact: Vice President of Marketing and Business Development; Mitsubishi Electric Research Laboratories (MERL), 201 Broadway, Cambridge, MA 02139 or license@merl.com.

IN NO EVENT SHALL MERL OR PRINCETON UNIVERSITY BE LIABLE TO ANY PARTY FOR DIRECT, INDIRECT, SPECIAL, INCIDENTAL, OR CONSEQUENTIAL DAMAGES, INCLUDING LOST PROFITS, ARISING OUT OF THE USE OF THIS SOFTWARE AND ITS DOCUMENTATION, EVEN IF MERL AND/OR PRINCETON UNIVERSITY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES.

MERL AND PRINCETON UNIVERSITY SPECIFICALLY DISCLAIM ANY WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE SOFTWARE AND/OR DATA PROVIDED HEREUNDER IS ON AN "AS IS" BASIS, AND MERL/PRINCETON UNIVERSITY HAVE NO OBLIGATIONS TO PROVIDE MAINTENANCE, SUPPORT, UPDATES, ENHANCEMENTS OR MODIFICATIONS.