The study, published in the journal Scientific Reports, revealed that personality predictions based on female faces appeared to be more reliable than those for male faces.
The technology could be used to find the 'best matches' in customer service, dating or online tutoring, said the researchers from HSE University and Open University in Russia.
Studies asking human raters to make personality judgments based on photographs have produced inconsistent results, suggesting that our judgments are too unreliable to be of any practical importance.
According to the study, there are strong theoretical and evolutionary arguments to suggest that some information about personality characteristics, particularly those essential for social communication, might be conveyed by the human face.
After all, face and behaviour are both shaped by genes and hormones, and social experiences resulting from one's appearance may affect one's personality development.
However, recent evidence from neuroscience suggests that instead of looking at specific facial features, the human brain processes images of faces in a holistic manner.
For the study, the researchers teamed up with a Russian-British business start-up, BestFitMe, to train a cascade of artificial neural networks to make reliable personality judgments based on photographs of human faces.
The performance of the resulting model exceeded that reported in previous studies using machine learning or human raters.
The artificial intelligence was able to make above-chance judgments about conscientiousness, neuroticism, extraversion, agreeableness, and openness based on 'selfies' the volunteers uploaded online.
The resulting personality judgments were consistent across different photographs of the same individuals.
The study was conducted on a sample of 12,000 volunteers who completed a self-report questionnaire measuring personality traits based on the "Big Five" model and uploaded a total of 31,000 'selfies'. The respondents were randomly split into a training group and a test group.
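Splitting by respondent, rather than by photograph, keeps all selfies of one person on the same side of the split, so the test group remains genuinely unseen. A minimal sketch of such a split (the 10% test fraction and fixed seed are illustrative assumptions, not figures from the study):

```python
import random

def split_respondents(respondent_ids, test_fraction=0.1, seed=42):
    """Randomly split respondent IDs into training and test groups.

    Splitting by respondent (not by photo) ensures every selfie of a
    given person lands in the same group, avoiding leakage between
    training and evaluation.
    """
    ids = list(respondent_ids)
    rng = random.Random(seed)        # fixed seed for reproducibility
    rng.shuffle(ids)
    n_test = int(len(ids) * test_fraction)
    return set(ids[n_test:]), set(ids[:n_test])  # (train, test)

train_ids, test_ids = split_respondents(range(12_000))
```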
A series of neural networks were used to preprocess the images to ensure consistent quality and characteristics and exclude faces with emotional expressions, as well as pictures of celebrities and cats.
Next, an image classification neural network was trained to decompose each image into 128 invariant features, followed by a multi-layer perceptron that used image invariants to predict personality traits.
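The two-stage design described above can be sketched as follows. Everything here is a hypothetical stand-in: the feature extractor is a fixed random projection rather than the study's trained classification network, and the MLP weights are random placeholders where the study learned them from the training group's self-reported scores. Only the shapes (128 invariant features in, five trait scores out) follow the article.

```python
import numpy as np

rng = np.random.default_rng(0)

IMAGE_PIXELS = 64 * 64  # assumed input resolution, for illustration
# Fixed projection standing in for the trained image network:
PROJECTION = rng.standard_normal((IMAGE_PIXELS, 128))

# Hypothetical MLP weights; the study learned these from the
# training respondents' Big Five questionnaire scores.
W1, b1 = rng.standard_normal((128, 64)) * 0.1, np.zeros(64)
W2, b2 = rng.standard_normal((64, 5)) * 0.1, np.zeros(5)

TRAITS = ["openness", "conscientiousness", "extraversion",
          "agreeableness", "neuroticism"]

def extract_invariants(image):
    """Stage 1: map a face image to a 128-dimensional feature vector.

    Deterministic, so different photos of the same person processed
    identically yield identical features (the 'invariant' property).
    """
    return image.ravel() @ PROJECTION

def predict_traits(features):
    """Stage 2: multi-layer perceptron, 128 invariants -> 5 trait scores."""
    hidden = np.maximum(0.0, features @ W1 + b1)  # ReLU hidden layer
    return dict(zip(TRAITS, hidden @ W2 + b2))

scores = predict_traits(extract_invariants(rng.random((64, 64))))
```

The split into a reusable feature extractor plus a small predictor is a common design choice: the expensive image network runs once per photo, and the lightweight MLP can be retrained cheaply on its outputs.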
Compared with meta-analytic estimates of the correlation between self-reported and observer-rated personality traits, the findings indicate that an artificial neural network relying on static facial images outperforms an average human rater who meets the target in person without prior acquaintance.
Conscientiousness proved more easily recognizable than the other four traits. Personality predictions based on female faces appeared to be more reliable than those for male faces, the study said.