Update: The story has been updated to reflect all the companies involved in Project Siren.

One of Tencent’s latest games is a robot-zombie battle royale shooter. So far, so generic, and the underwhelming reports from players who tried the demo at Gamescom don’t help.

So why is SYNCED: Off Planet getting so much attention? Well, just look at these faces.

I gotta admit, this looks pretty amazing. (Picture: NEXT Studios)

Synced is using the impressive face rendering tech behind Project Siren, a collaborative project from 3Lateral, Cubic Motion, Epic Games, Tencent and Vicon. The effect is definitely noticeable.

The ultra-lifelike facial animations in SYNCED are a far cry from the complete lack of faces in older video games. It even seems to be a leap ahead of Rockstar's L.A. Noire from 2011, which was widely praised at the time for its lifelike animation, and Sony's Uncharted 4 from 2016, whose faces reportedly had 500 bones to make them more expressive. (For the record, a real human face has just 14 bones.)

According to Clark Jiayang Yang, a producer on the game, Project Siren uses several cameras to take multiple photos of a person, which are then used to model the characters.

“We have a photogrammetry lab in the studio. Every character -- all the faces, all the models -- is done through this photogrammetry lab,” Yang told IGN.

The Siren tech has been praised since it debuted as the highlight of last year's Game Developers Conference. While the character was modeled on a Chinese actress, it was performed live by a British actress on the show floor at GDC 2018.

Tencent has been experimenting with high-fidelity facial animation for a while now. It owns 40% of Epic Games, which created the Unreal Engine. 

In 2017, a consortium that includes Tencent unveiled a VR program called MEETMIKE at SIGGRAPH. It allowed host Mike Seymour to interview industry veterans as a digital human in real time while audience members watched through their own VR headsets.

The same group built Siren the following year.

These facial expressions look a little more empty. (Picture: NEXT Studios)

Now, according to Yang, NEXT Studios is working on a system called “Anyface,” which uses a database of different facial features to construct lifelike animations within games.

Companies have long used improved facial animation as a way of getting developers to adopt their game engines. Back in 2004, Valve's Source engine, which launched with Half-Life 2, was a favorite among developers because of its ability to create better facial animation.

But impressive as they can be, lifelike faces in games aren't necessarily what gamers care about. The hyper-realistic facial animations of L.A. Noire struck some players as falling into the "uncanny valley." The MotionScan capture technology behind them was also prohibitively expensive at the time, so it never saw wide adoption.

Instead of adopting the aggressively photorealistic approach of L.A. Noire, companies like Epic and Naughty Dog now aim for a more holistic, context-sensitive style in their facial animations.

If NEXT Studios can properly walk the line between the real and the unreal so the faces suit their environment, it might avoid the uncanny valley and impress a lot of gamers. But hopefully the studio is working on the rest of the game too, because great faces won't mean anything if it isn't fun to play.