'Cloned faces' offer a new look at behaviour

Thursday, 15 October, 2009


Computer scientists have developed a way of cloning facial expressions during live conversations, helping researchers better understand what influences our behaviour when we communicate with others.

This latest technique tracks people's facial expressions and head movements in real time during a videoconference and maps these movements to models of faces - producing a 'cloned' face.

These facial expressions and head movements can be manipulated live to alter the apparent expressiveness, identity, race, or even gender of a talker. Moreover, these visual cues can be manipulated such that neither participant in the conversation is aware of the manipulation.
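To make the idea concrete, the sketch below shows the general pipeline in its simplest form: track facial landmarks in real time from live video, then retarget the tracked movements onto another face model, with a gain parameter that attenuates or exaggerates expressiveness. This is only an illustration of the concept using MediaPipe Face Mesh and OpenCV; the researchers' actual system is based on more sophisticated face models, and names such as `gain` and `target_neutral` are hypothetical.

```python
# Illustrative sketch only: real-time landmark tracking plus a simple linear
# retargeting step. Assumes mediapipe, opencv-python and numpy are installed.
import cv2
import mediapipe as mp
import numpy as np

mp_face_mesh = mp.solutions.face_mesh


def retarget(source_pts, source_neutral, target_neutral, gain=1.0):
    """Map the source face's deviation from its neutral pose onto a target face.

    gain < 1 attenuates expressiveness, gain > 1 exaggerates it -- the kind of
    manipulation described in the article, in its simplest possible form.
    """
    displacement = source_pts - source_neutral
    return target_neutral + gain * displacement


cap = cv2.VideoCapture(0)     # live video source (e.g. a videoconference feed)
source_neutral = None         # neutral pose of the tracked speaker
target_neutral = None         # neutral pose of the 'cloned' face model

with mp_face_mesh.FaceMesh(max_num_faces=1, refine_landmarks=True) as fm:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        result = fm.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if not result.multi_face_landmarks:
            continue
        pts = np.array([[p.x, p.y, p.z]
                        for p in result.multi_face_landmarks[0].landmark])
        if source_neutral is None:
            # Treat the first tracked frame as the neutral pose; a real system
            # would calibrate this properly and use a separate target model.
            source_neutral = pts.copy()
            target_neutral = pts.copy()
        cloned = retarget(pts, source_neutral, target_neutral, gain=0.5)
        # `cloned` would drive the rendered avatar; rendering is omitted here.
        print(cloned.shape)   # (478, 3) landmark coordinates per frame

cap.release()
```

In a full system the retargeted landmarks would drive a photorealistic face model rendered back into the video stream, so that neither participant notices the substitution or the manipulation.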

Developed by Dr Barry-John Theobald of the School of Computing Sciences at Britain's University of East Anglia - in collaboration with Dr Iain Matthews (Disney Research), Prof Steven Boker (University of Virginia) and Prof Jeffrey Cohn (University of Pittsburgh) - the facial expression cloning technique is already being trialled by psychologists in the US to challenge preconceived assumptions about how humans behave during conversations.

For example, it is well known that people move their heads differently when speaking to a woman than when speaking to a man. The new software has helped show that this difference is due not to the conversational partner's appearance but to the way they move.

If a person appears to be a woman but moves like a man, others will respond with movements similar to those made when speaking to a man. The technique is also likely to find applications in the entertainment industry, where life-like animated characters may be required.

"Spoken words are supplemented with non-verbal visual cues to enhance the meaning of what we are saying, signify our emotional state or provide feedback during a face-to-face coversation," said Dr Theobald, the leading author of the new paper called Mapping and Manipulating Facial Expression.

"Being able to manipulate these properties in a controlled manner allows us to measure precisely their effects on behaviour during conversation," he added.

"This technology allows us to manipulate faces in this way for the first time. Many of these effects would otherwise be impossible to achieve, even using highly skilled actors," said Dr Theobald.

The work is funded by the UK's Engineering and Physical Sciences Research Council and the US National Science Foundation.
