Avatar SDK 2.0.0
Realistic avatar generation toolset for Unity3D
How to use Oculus LipSync with Avatar SDK

1) Download the "Oculus LipSync Unity" plugin and import it into the project that contains the Avatar SDK:
Oculus LipSync

2) Run the "05_parameters_sample_cloud" sample located under "itseez3d/avatar_sdk/sample_cloud/05_parameters_sample_cloud/scenes".

3) Select only the "base/visems_17" blendshape set and generate an avatar.

blendshapes_sets.jpg

4) Once the avatar is loaded, press the "Create prefab" button. A prefab containing the avatar will be created; this prefab can be used for lip sync.

5) Create an empty scene.

6) Add the created avatar prefab to the scene.

7) Add the "LipSyncInterface" prefab (located under "Oculus/LipSync/prefabs") to the scene.

8) Add the following scripts to the avatar game object: "OVR Lip Sync Context", "OVR Lip Sync Mic Input", and "OVR Lip Sync Context Morph Target".

ovr_scripts.jpg
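If you prefer to attach these components from code rather than through the Inspector, a minimal sketch could look like the following. It assumes the component class names from the Oculus LipSync Unity plugin (OVRLipSyncContext, OVRLipSyncMicInput, OVRLipSyncContextMorphTarget); this is an illustrative alternative to step 8, not part of the sample itself.

```csharp
using UnityEngine;

// Sketch: attaches the three OVR LipSync scripts to the avatar game
// object at startup, as an alternative to adding them in the Inspector.
public class LipSyncSetup : MonoBehaviour
{
    void Awake()
    {
        // Analyzes incoming audio and computes viseme weights.
        gameObject.AddComponent<OVRLipSyncContext>();
        // Feeds microphone input into the attached AudioSource.
        gameObject.AddComponent<OVRLipSyncMicInput>();
        // Maps the computed visemes onto the skinned mesh blendshapes.
        gameObject.AddComponent<OVRLipSyncContextMorphTarget>();
    }
}
```

Attach this script to the avatar game object (the instance of the created prefab); the components it adds still need to be configured as described in step 9.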

9) Configure the blend targets for the "OVR Lip Sync Context Morph Target" script:
- Set "itSeez3D Head" as the Skinned Mesh Renderer.
- Set each entry of "Viseme To Blend Targets" to the index of the corresponding blendshape on the "itSeez3D Head" mesh.

oculus_visems.jpg
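The same mapping can be filled in from code by looking blendshapes up by name. The sketch below assumes the public fields exposed by the plugin's morph target component (skinnedMeshRenderer, visemeToBlendTargets); the blendshape names themselves are placeholders that you would replace with the actual names from the "base/visems_17" set.

```csharp
using UnityEngine;

// Sketch: fills "Viseme To Blend Targets" by looking up blendshape
// indices on the head mesh by name instead of typing them manually.
public class BlendTargetMapper : MonoBehaviour
{
    // Blendshape names in Oculus viseme order (sil, PP, FF, ...).
    // These are hypothetical placeholders; use your avatar's actual names.
    public string[] visemeBlendshapeNames;

    void Start()
    {
        var morphTarget = GetComponent<OVRLipSyncContextMorphTarget>();
        var mesh = morphTarget.skinnedMeshRenderer.sharedMesh;

        for (int i = 0; i < visemeBlendshapeNames.Length; i++)
        {
            // GetBlendShapeIndex returns -1 when the name is not found.
            int index = mesh.GetBlendShapeIndex(visemeBlendshapeNames[i]);
            if (index >= 0)
                morphTarget.visemeToBlendTargets[i] = index;
        }
    }
}
```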

10) Everything is now configured, and you can run the scene in VR mode. Audio is captured from the microphone, and the avatar's blendshapes are animated in real time to match the speech.
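For reference, the morph target script essentially polls the lip sync context every frame and copies the viseme weights onto the blendshapes. A simplified, hypothetical version of that loop (assuming the plugin's GetCurrentPhonemeFrame() call, which returns a frame with a Visemes weight array) might look like this:

```csharp
using UnityEngine;

// Simplified sketch of what the morph target script does each frame:
// read the current viseme weights and apply them as blendshape weights.
public class ManualVisemeDriver : MonoBehaviour
{
    public OVRLipSyncContext context;   // computes the viseme weights
    public SkinnedMeshRenderer head;    // the "itSeez3D Head" mesh
    public int[] visemeToBlendTargets;  // viseme index -> blendshape index

    void Update()
    {
        OVRLipSync.Frame frame = context.GetCurrentPhonemeFrame();
        if (frame == null)
            return;

        for (int i = 0; i < frame.Visemes.Length; i++)
        {
            // Unity blendshape weights are in the 0..100 range,
            // while viseme weights are normalized to 0..1.
            head.SetBlendShapeWeight(visemeToBlendTargets[i],
                                     frame.Visemes[i] * 100f);
        }
    }
}
```

You would not normally need this script, since "OVR Lip Sync Context Morph Target" already performs this mapping; it is shown only to clarify how the viseme weights drive the blendshapes.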

11) The sample Unity package can be downloaded here: Avatar SDK with Oculus LipSync Unity sample