[Screenshot: a media player showing the headshot of an adult female, with a face-detection frame overlaid and a label that reads 'Adult'.]


Much to the surprise of even Microsoft's own software engineers, their How Old Do I Look website became a viral phenomenon a couple of weeks ago. At the time of writing this post, the Microsoft age recognition site had 2.1 million Facebook shares, and many internet and media commentators have written articles about it. Much of the discussion has centered on how successful, or otherwise, its algorithms are at accurately detecting the photo subjects' actual ages.

If nothing else, this flurry of attention has brought to the fore the topic of determining a best guess at someone's age using facial detection software.


The Microsoft How Old Do I Look? Product


To build the site, Microsoft hooked an existing face-recognition API up to an engine that analyzes people's faces. The user uploads an image of whoever they wish the site to identify and presses a button to send it to Microsoft's servers for analysis. Once the calculations are complete, the site returns a "best guess" for the age and gender of each face in the photo.

The Microsoft engine performs a thorough analysis of the image, looking for patterns that indicate faces, and then uses a complex set of algorithms to determine the gender of any people in the photograph. After that, it makes a best attempt at estimating their ages.

Judging by much of the press coverage that How Old Do I Look has generated, it is still early days for the age detection application. Many of the suggested ages have been off by laughable amounts. The Microsoft data scientists were still testing the original API and appear to have been somewhat unprepared for the vast interest their product generated.

The project apparently began as a simple experiment among Microsoft data scientists. They were testing their new face detection software and decided to ask some people to trial it for them. Within hours, the 50 testers they expected had become 30,000 people, and a couple of hours later, 350,000 users. The numbers continued to grow.

The age detection results are decidedly patchy at the moment. The software is still learning. At this stage, the results differ depending on whether a face is photographed close up or appears more distant in the picture. Strange results have also been reported when faces in the photos have facial hair.

This app is part of Microsoft’s Project Oxford, which in the future will offer facial, image and speech-recognition APIs.


What other options are there for detecting age?


As Microsoft has undoubtedly discovered, it is incredibly difficult to detect a person's age truly accurately. Indeed, their decision to narrow the result down to an exact age is highly atypical of industry practice.

Although Kairos includes age detection in its three newest products (which all detect facial information from a video feed), it does recognize the limitations of the technology. Like most other firms that offer age detection, Kairos has preferred to use age bands.

The Kairos products return best guesses for age, in four basic age ranges:

  • child (0 - 13 years)

  • young adult (14 - 35 years)

  • adult (35 - 65 years)

  • senior (65 +)
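As a rough illustration, mapping a numeric age estimate into bands like these is straightforward. The sketch below is minimal and hypothetical (the published ranges overlap at 35, so a cut-off on one side has to be chosen; the function name is invented):

```python
def kairos_age_band(age: float) -> str:
    """Map a numeric age estimate to one of the four Kairos-style bands."""
    if age < 14:
        return "child"
    elif age < 35:
        return "young adult"
    elif age < 65:
        return "adult"
    else:
        return "senior"

print(kairos_age_band(9))   # child
print(kairos_age_band(22))  # young adult
```

Returning a band rather than a single number means the product's claim stays honest even when the underlying estimate is off by a few years.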

Other websites offer age detection too, and most of these also recognize the current technical limitations. One such page is named Age Detector. Like the Microsoft site, it is based on analysis of still images that you can upload. It does include an explicit disclaimer: "this tool is for entertainment purpose only. The age is estimated based on facial features, and there is no guarantee that it matches the real age of the depicted person(s)".


How does age detection work?


Age detection works on the same basis as any other computer vision task. Large amounts of training data (pictures of faces at known ages) are fed into the system to build a model. New incoming images are then compared against the model to produce an age prediction.

Quite detailed statistical modeling goes on behind the scenes to determine which age bracket any particular face best fits. This modeling includes the use of a Support Vector Machine (SVM) algorithm to help segregate the different types of faces into their age categories.
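The train-then-classify flow can be illustrated with a deliberately simplified stand-in: instead of an SVM, the sketch below uses a nearest-centroid rule over made-up two-dimensional "facial feature" vectors, but the overall shape — build a model from labeled examples, then assign new faces to the closest category — is the same. All numbers and labels here are invented for illustration.

```python
import math

# Invented training data: (feature vector, age band).
# The two features might stand for, say, relative eye size and wrinkle density.
TRAIN = [
    ([0.9, 0.1], "child"), ([0.8, 0.2], "child"),
    ([0.4, 0.5], "adult"), ([0.35, 0.55], "adult"),
    ([0.1, 0.9], "senior"), ([0.15, 0.85], "senior"),
]

def centroids(data):
    """Average the feature vectors for each age band (this is the 'model')."""
    sums, counts = {}, {}
    for vec, label in data:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def classify(model, vec):
    """Assign the band whose centroid is closest to the new face's features."""
    return min(model, key=lambda lbl: math.dist(model[lbl], vec))

model = centroids(TRAIN)
print(classify(model, [0.85, 0.15]))  # child
```

A real SVM instead learns separating boundaries between the categories, which copes far better with overlapping, high-dimensional feature data than simple centroids do.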


It's all very well being able to separate faces using complicated mathematical formulae. But it raises the question: how do people's faces actually change with age?

The Marquardt Beauty Analysis provides a clear breakdown of how they see our faces change as we age. They also split our ages into four categories (that don't exactly match the Kairos groupings but are similar):

  1. Babies

  2. Children

  3. Young Adults

  4. Old Age (which scarily, for beauty purposes, they define as being anyone over 24)

A typical baby face is considered to differ from an adult face because it has:

  • a larger head compared to its face

  • bigger eyes proportionate to its face

  • an overall rounder face

  • plumper cheeks

  • short, flat eyebrows

  • a short, small nose

From about the age of two, babies grow into children. Some of the key features of children's faces are:

  • teeth "falling out", so they may be missing or spaced

  • freckles and acne begin to appear

  • eyebrows, although still flat, become fuller than a baby’s

  • ears can appear proportionately oversized

  • the nose still looks somewhat small, short and broad

  • the cheeks become relatively flatter and less well defined

  • there can still be “baby fat” throughout the face

Young adults, from their teens until their mid-20s, tend to have what is considered the "mask" face. This is the age where people are considered to be at their most attractive. Females are considered to have a better mask than males, as we have previously discussed in our article about detecting gender.

From our mid-20s onwards, the signs of aging set in. Some of the key facial signs of this are:

  • our cheeks sag, ultimately resulting in clear jowls developing

  • the corners of the mouth sag resulting in a slight frown look

  • the tissue around the eyes sag

  • both upper and lower eyelids sag

  • the tissue of the forehead droops, creating wrinkles and dropping the eyebrows downward. This gives them a flatter appearance

  • the nose may lengthen and the tip may enlarge and drop slightly. It can even develop a dorsal hump

  • the face starts to wrinkle in numerous places.

Clearly it should be comparatively easy to determine an approximate age for babies, children, and young adults - there are quite clear facial differences at each of these three stages. The hard part is that from about age 25 our facial beauty starts to degenerate (there is a whole industry selling products designed to delay, or at least mask, this process), and we all age at different rates. That makes older people's ages much harder to estimate.

An article in Scholarpedia on Facial Age Estimation sees facial aging effects as being "mainly attributed to bone movement and growth and skin related deformations associated with the introduction of wrinkles and reduction of muscle strength". Bone movement is the primary reason for the changes to faces from baby to child to young adult. Skin-related deformations drive the changes from young adult onwards.


What Practical Uses are There for Age Detection?


Although there are clear technical issues with training technology to determine age, the reality is that we humans are not all that good at working out ages ourselves. That is why ID is needed to enforce age-based restrictions. Even the best-trained hotel bouncer struggles to distinguish an 18-year-old from a 17-year-old if they do not carry proof of age with them. Some tests have shown, however, that electronic age detection is more accurate than human age detection.

There have been a number of suggestions of how age detection could be of value.

Age-based access control: In most cases age-based access control is enforced using human judgment (as described above), based on the presentation of identification documents such as a driver's license or passport. An alternative would be to use automatic facial age estimation to provide an objective, accurate and non-invasive decision on the age of a person seeking access to a restricted area. This could be a physical situation (for instance an alcohol store) or a virtual one (for example access to a gambling website).
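Because facial age estimates are approximate, a sensible access-control rule would fall back to a human ID check whenever the estimate sits near the legal threshold. A hedged sketch (the function name, threshold, and margin values are all invented for illustration):

```python
def allow_entry(estimated_age: float, legal_age: int = 18, margin: float = 5.0) -> str:
    """Decide access from a facial age estimate, deferring to an ID check
    when the estimate is too close to the legal threshold to trust."""
    if estimated_age >= legal_age + margin:
        return "allow"
    if estimated_age < legal_age - margin:
        return "deny"
    return "check ID"  # estimate too uncertain near the boundary

print(allow_entry(30))  # allow
print(allow_entry(10))  # deny
print(allow_entry(19))  # check ID
```

The margin encodes exactly the weakness discussed above: the system is confident about 10-year-olds and 30-year-olds, but a 17-versus-18 call still needs a document.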

Age-Adaptive Human-Computer Interaction (HCI): In some circumstances, the way a person interacts with a machine should differ depending on their age. With automatic age estimation, based on facial detection, the user interface would alter to suit the needs of the user's age group: young children could activate an icon-based interface, young adults a small-print graphical interface, and older users a large-font text-based interface.

Age-adaptive HCI has real uses for publicly available resources such as information kiosks and marketing displays. Facial detection can determine the ages of the particular people looking at a display, and you could then show somebody the system detects as an older person a completely different message than you would show a young adult.
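The kiosk logic itself could be as simple as a lookup from detected age band to interface mode. A minimal sketch (the interface names are invented for illustration):

```python
def interface_for(age_band: str) -> str:
    """Pick a UI mode for a public kiosk based on a detected age band."""
    return {
        "child": "icon-based interface",
        "young adult": "small-print graphical interface",
        "adult": "standard interface",
        "senior": "large-font text-based interface",
    }.get(age_band, "standard interface")  # safe default if detection fails

print(interface_for("child"))   # icon-based interface
print(interface_for("senior"))  # large-font text-based interface
```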

Data mining and organization: This is already happening in some photography applications, where an age estimation system enables age-based retrieval and classification of face images. This is what programs like Google's Picasa do when they offer age-based automatic sorting and image retrieval from photo albums as well as the internet.
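Once each photo carries an estimated subject age, retrieval by age range is just a filter. A minimal sketch (the filenames and age estimates are invented):

```python
# Hypothetical estimates produced by an age-detection pass over an album.
photos = {"beach.jpg": 7, "wedding.jpg": 29, "reunion.jpg": 71}

def photos_in_range(estimates, low, high):
    """Return photo names whose estimated subject age falls in [low, high]."""
    return sorted(name for name, age in estimates.items() if low <= age <= high)

print(photos_in_range(photos, 0, 13))   # ['beach.jpg']
print(photos_in_range(photos, 14, 35))  # ['wedding.jpg']
```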

Age Invariant Person Identification: This is what police use for missing person reconstructions. Age-progression techniques that deform the face of a subject are used to predict what the subject will look like in the future.




The reality is that Microsoft was highly ambitious in trying to predict an exact age. Most face detection APIs use age ranges, not exact ages, because of the difficulty of estimating age correctly. How Old Do I Look got a lot of bad press about every mistake it made, especially with celebrities. In my case, it predicted my age almost exactly.

Perhaps the one real mistake they made was not including a form on the website where you could enter the actual age of the face in each scanned photo. That would have helped the program's learning process.

Most users of age detection aren’t looking for an exact age anyway. Think about its uses for advertising: it doesn’t matter whether an ad targets 22-year-olds or 23-year-olds, but it might matter whether it targets 22-year-olds or 82-year-olds. There is still probably some way to go before we have a face scanner on the hotel door, letting 18-year-olds pass but barring 17-year-olds from entry.



Verify people in your apps—Integrate face recognition with our easy-to-code API.



Discover the benefits of Kairos Face Recognition—Let's connect.

