Sony Unveils Revolutionary Lip Sync Technology for Localizing NPCs in Video Games

Sony has recently secured a patent for a lip-syncing system designed for NPCs in video games. As reported by various media outlets, this technology, referred to as the Translation Language Evaluation Device, aims to streamline the adaptation of mouth animations to different languages. In the patent, Sony states: “When the audio of the translated text does not match the character’s lip movements, the player or viewer of the video may experience discomfort.”

The system assesses the lip movements of NPCs and computes a “similarity index” between the animation and the translated audio. It then automatically adjusts the animation to align with dialogue in any language. For instance, if a translated line runs longer or shorter than the original phrase, the engine modifies the timing and lip movements to maintain a natural appearance. This addresses a long-standing audio-visual mismatch problem, seen notably in titles like The Witcher 3: Wild Hunt.
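To make the idea above concrete, here is a minimal sketch of how such a check-and-retime step could work. Everything in it — the names, the syllable-count proxy, and the scoring formula — is an illustrative assumption; the patent does not disclose Sony's actual method.

```python
# Hypothetical sketch: score how well a translated line fits the original
# mouth animation, then rescale the animation timing when durations diverge.
from dataclasses import dataclass

@dataclass
class DialogueLine:
    duration_s: float     # length of the voiced line in seconds
    syllable_count: int   # rough proxy for the number of mouth movements

def similarity_index(original: DialogueLine, translated: DialogueLine) -> float:
    """Return a 0..1 score; 1.0 means the translation fits the animation exactly.
    The equal weighting of duration and syllable ratios is an assumption."""
    duration_ratio = (min(original.duration_s, translated.duration_s)
                      / max(original.duration_s, translated.duration_s))
    syllable_ratio = (min(original.syllable_count, translated.syllable_count)
                      / max(original.syllable_count, translated.syllable_count))
    return 0.5 * duration_ratio + 0.5 * syllable_ratio

def retime_animation(keyframes: list[float], original: DialogueLine,
                     translated: DialogueLine) -> list[float]:
    """Linearly stretch or compress keyframe timestamps to the translated length."""
    scale = translated.duration_s / original.duration_s
    return [t * scale for t in keyframes]

# Example: a German line that runs 25% longer than the English original.
en = DialogueLine(duration_s=2.0, syllable_count=8)
de = DialogueLine(duration_s=2.5, syllable_count=10)

score = similarity_index(en, de)                          # 0.8
frames = retime_animation([0.0, 0.5, 1.0, 2.0], en, de)   # [0.0, 0.625, 1.25, 2.5]
```

A production system would of course operate on phoneme-level (viseme) data rather than whole-line durations, but the shape of the pipeline — score the mismatch, then retime when the score drops — follows what the patent describes.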

The company evidently has a penchant for patenting a wide array of technologies: a quick search for “Sony has patented” yields numerous results confirming the trend.

Whether the system will be used in any upcoming projects remains uncertain. Sony's nearest release, however, is the PC port of The Last of Us Part II Remastered, set to launch on Steam and EGS on April 3, complete with full Russian localization.