This is the documentation for Luppet (Ver.2.0.5).
Perfect Sync provides a mechanism for reflecting the detailed facial expression parameters captured by iOS devices onto individual blendshapes of an avatar.
This allows for richer expressions than the minimum set required by the VRM standard.
In Luppet, Perfect Sync has been available as part of the iFacialMocap integration since Ver.2.0.0.
This refers to a model that implements all 52 BlendShapeLocation parameters obtainable from ARKit on iOS as VRM BlendShapeClips with the same names.
Have you implemented all of the BlendShapes?
Only models that implement all of these VRM BlendShapeClips are treated as Perfect Sync compatible models in Luppet, as described below.
The name of the BlendShapeClip is case-sensitive, so make sure to enter it correctly.
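As a rough illustration of this requirement, the sketch below (a hypothetical Python helper, not part of Luppet) checks a list of clip names against the 52 ARKit BlendShapeLocation names with exact-case matching. The name list follows Apple's ARKit `ARFaceAnchor.BlendShapeLocation` set; how you obtain your model's clip names is up to your tooling.

```python
# Minimal sketch: verify that a model exposes all 52 ARKit
# BlendShapeLocation names as VRM BlendShapeClips, case-sensitively.
# The names below follow Apple's ARKit BlendShapeLocation set.

ARKIT_BLENDSHAPES = frozenset([
    "browDownLeft", "browDownRight", "browInnerUp",
    "browOuterUpLeft", "browOuterUpRight",
    "cheekPuff", "cheekSquintLeft", "cheekSquintRight",
    "eyeBlinkLeft", "eyeBlinkRight",
    "eyeLookDownLeft", "eyeLookDownRight",
    "eyeLookInLeft", "eyeLookInRight",
    "eyeLookOutLeft", "eyeLookOutRight",
    "eyeLookUpLeft", "eyeLookUpRight",
    "eyeSquintLeft", "eyeSquintRight",
    "eyeWideLeft", "eyeWideRight",
    "jawForward", "jawLeft", "jawOpen", "jawRight",
    "mouthClose", "mouthDimpleLeft", "mouthDimpleRight",
    "mouthFrownLeft", "mouthFrownRight", "mouthFunnel",
    "mouthLeft", "mouthLowerDownLeft", "mouthLowerDownRight",
    "mouthPressLeft", "mouthPressRight", "mouthPucker", "mouthRight",
    "mouthRollLower", "mouthRollUpper",
    "mouthShrugLower", "mouthShrugUpper",
    "mouthSmileLeft", "mouthSmileRight",
    "mouthStretchLeft", "mouthStretchRight",
    "mouthUpperUpLeft", "mouthUpperUpRight",
    "noseSneerLeft", "noseSneerRight",
    "tongueOut",
])

def missing_clips(clip_names):
    """Return the ARKit names absent from clip_names (exact-case match)."""
    return sorted(ARKIT_BLENDSHAPES - set(clip_names))
```

Because the comparison is case-sensitive, a model whose clip is named `BrowDownLeft` (capital B) would still be reported as missing `browDownLeft`.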
All 52 parameters should be implemented manually.
Please refer to the following link (Japanese article).
クリックで実装！パーフェクトシンク BY HANA Tool ("Implement Perfect Sync with a Click! by HANA Tool")
(To give you an overview: the idea is to port the facial expression data from an existing Perfect Sync-enabled model to your own VRoid model.)
Please download and use the following models.