Merge pull request #12 from HakuyaLabs/MISSINGNO
WIP: Merge pull request #11 from HakuyaLabs/MISSINGNO
Nekotora authored Jun 26, 2024
2 parents 25ae815 + 315e9d9 commit c319378
Showing 9 changed files with 376 additions and 52 deletions.
@@ -4,22 +4,81 @@ sidebar_position: 20

# Face Tracking

Warudo currently supports 5 face tracking methods.
## Blueprint Customization

* [**OpenSeeFace (Beta)**](openseeface.md): a webcam-based face tracking method. Captures basic BlendShapes, as well as head rotation and movement. Tracking quality is roughly comparable to [VSeeFace](https://www.vseeface.icu/).
* [**iFacialMocap**](ifacialmocap.md): a face tracking method based on Apple ARKit. Requires an iOS device that supports [Face ID](https://support.apple.com/zh-cn/HT208109) and the paid [iFacialMocap](https://apps.apple.com/cn/app/id1489470545) app (40 CNY on the Chinese App Store). Captures 52 ARKit BlendShapes, as well as head rotation and movement. <b style={{color: "green"}}>**Best accuracy and expressiveness.**</b>
* [**RhyLive**](rhylive.md): a face tracking method based on Apple ARKit. Requires an iOS device that supports [Face ID](https://support.apple.com/zh-cn/HT208109) and the free [RhyLive](https://apps.apple.com/us/app/rhylive/) app. Captures 52 ARKit BlendShapes, but only head rotation. <b style={{color: "green"}}>**Best accuracy.**</b>
* [**VMC**](vmc.md): an external application sends [VirtualMotionCapture data](https://protocol.vmc.info/english) to Warudo. VMC is rarely used to send face tracking data.
* [**Rokoko**](rokoko.md): [Rokoko Studio](https://www.rokoko.com/products/studio) sends face tracking data to Warudo.
During onboarding, you can click **Customize Face Tracking...** to customize the blueprint.

Face tracking methods in development:
![](/doc-img/en-mocap-3.png)
<p class="img-desc">Customizing face tracking.</p>

* **NVIDIA Maxine**: a webcam-based face tracking method that requires an NVIDIA RTX (or other Turing-architecture) GPU. Captures 51 ARKit BlendShapes.
The following options are available:

* **BlendShape Mapping:** Select the BlendShape mapping that fits your model. For example, if your model has "Perfect Sync"/ARKit blendshapes, select **ARKit**; if your model was converted from an MMD model, select **MMD**; otherwise, select **VRM**. Warudo tries to detect the model type automatically, but you can also change it manually here.
* **Enable Head and Body Movements:** If enabled, Warudo will also move the model's head and body based on the tracking data. The onboarding assistant sets this option to **off** when you are using a full-body motion capture system.
* **Head Idle Animations (Auto Blink/Auto Eye Movements/Auto Head Swaying):** If enabled, Warudo adds subtle head and eye movements to the character.
* **Look At:** If enabled, your character's gaze follows a specified direction (by default, toward the camera). This helps you maintain eye contact with the audience from any camera angle, while still allowing your head to turn freely.
* **Lip Sync**: If enabled, Warudo will animate your character's mouth based on your speech. You can choose to enable lip sync only when tracking is lost, or always enable it.

:::info
The options above affect the generated blueprint; to change them after onboarding, you need to re-run the onboarding, or manually modify the face tracking blueprint. **Character → Motion Capture → Blueprint Navigation** provides shortcuts to the specific parts of the blueprint that you may want to modify:

![](/doc-img/en-mocap-6.png)
:::

## Tracker Customization

After onboarding, you can go to the corresponding face tracking asset (e.g., iFacialMocap Receiver, MediaPipe Tracker) to customize the tracking data itself. The following options are available:

* **Mirrored Tracking:** If enabled, Warudo will mirror the tracking data.
* **[BlendShape](../tutorials/3d-primer#blendshape) Sensitivity:** Adjust the sensitivity of the character's facial expressions. If your expressions are too subtle, increase the sensitivity; if your expressions are too exaggerated, decrease the sensitivity.
* **Configure BlendShapes Mapping:** Adjust each individual blendshape's threshold and sensitivity. This is useful if you want to disable certain blendshapes, or if you want to adjust the sensitivity of each blendshape individually.
![](/doc-img/en-mocap-5.png)
The left two values are the range of the input blendshape value from the tracker, and the right two values are the range of the output blendshape value to the character. For example, if the input range is 0-1 and the output range is 0-2, then when the input blendshape value is 0.40, the output blendshape value will be 0.80.
- To disable a blendshape, set the top right value to 0.
- To make a blendshape more sensitive, increase the top right value; to make a blendshape less sensitive, decrease the top right value.
- To trigger a blendshape at a higher threshold (e.g., your character mouth is already slightly opened but your mouth is still closed), increase the bottom right value. To trigger a blendshape at a lower threshold, decrease the bottom right value.
* **Head Movement/Rotation Intensity:** Adjust the intensity of the head movement/rotation.
* **Body Movement Intensity:** Adjust the intensity of the automatic body movement due to head movement.
* **Body Rotation Type:** Makes the body rotate naturally when the head rotates. When set to None, the body does not rotate with the head. When set to Normal, the body rotates in the same direction as the head. When set to Inverted, the body rotates in the same direction as the head on the X and Y axes, but in the opposite direction on the Z axis. You can also set **Body Rotation Type** to Custom, which allows you to customize both the direction and magnitude of each rotation axis.
* **Eye Movement Intensity:** Adjust the intensity of the pupil movement.
* **Eye Blink Sensitivity:** Adjust the sensitivity of the eye blink; this is a shortcut for adjusting the sensitivity of the eye blink blendshape in **Configure BlendShapes Mapping**.
* **Linked Eye Blinking:** If enabled, force both eyes to blink at the same time. Useful if your tracking is not accurate enough to blink each eye individually.
* **Use Bones/BlendShapes for Eye Movement:** Whether to use eye bones or blendshapes for eye movement. If your model has eye bones, it is recommended to use eye bones, as they are more accurate and allow for [IK](../tutorials/3d-primer#IK). There are two cases where you may want to enable **Use BlendShapes For Eye Movement**:
- Your model does not have eye bones.
- Your model's eye blendshapes are supplementary to the eye bones, i.e., the eye blendshapes change the shape of the eyes, in addition to the eye bones moving the pupils. (Your modeler should be able to tell you whether this is the case.)
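The input/output ranges in **Configure BlendShapes Mapping** describe a linear remap of each tracked blendshape value. The sketch below illustrates the arithmetic only; the function name and the final clamp to [0, 1] are assumptions for illustration, not Warudo's actual implementation:

```python
def remap_blendshape(value, in_min=0.0, in_max=1.0, out_min=0.0, out_max=1.0):
    """Linearly map a tracked blendshape value from the input range to the
    output range, then clamp to the typical [0, 1] blendshape range."""
    t = (value - in_min) / (in_max - in_min)  # position within the input range
    out = out_min + t * (out_max - out_min)   # linear remap to the output range
    return max(0.0, min(1.0, out))            # clamp (assumed behavior)

# Worked example from the text: input range 0-1, output range 0-2,
# so an input of 0.40 maps to 0.80.
print(remap_blendshape(0.40, 0.0, 1.0, 0.0, 2.0))  # 0.8

# Raising the threshold: with the output minimum at -0.25, small inputs
# such as 0.2 clamp to 0, so a slightly open tracked mouth keeps the
# character's mouth closed.
print(remap_blendshape(0.2, 0.0, 1.0, -0.25, 1.0))  # 0.0
```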

## Frequently Asked Questions {#FAQ}

### When I move my head, my character's body also moves. How do I disable this?

This is caused by the **Body Movement Intensity** option in the motion capture receiver asset. Set it to 0 to disable body movement.

### How do I enable lip sync?

During the onboarding process, you can enable lip sync by clicking **Customize Face Tracking...** and enabling **Lip Sync**.

You can adjust lip sync settings after onboarding by clicking **Character → Motion Capture → Blueprint Navigation → Lip Sync Settings**.

### My model's mouth is slightly open even when my mouth is closed.

This is usually caused by applying ARKit tracking on a non-ARKit-compatible model. To fix this, you can either:

* Add ["Perfect Sync" / ARKit blendshapes](../tutorials/3d-primer#arkit) to your model. This is recommended since ARKit-based face tracking is much more expressive, and you already have the tracking data.
* Click **Configure BlendShapes Mapping** in the tracker asset, and increase the threshold (i.e., set the bottom right value to a negative number like `-0.25`) of the ARKit mouth blendshapes `jawOpen`, `mouthFunnel`, `mouthPucker`.

### My body rotates in the opposite direction when I turn my head.

By default, **Body Rotation Type** is set to Inverted to mimic Live2D models and achieve a more anime look. If you want to disable this, set **Body Rotation Type** to Normal or None.
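The per-axis signs described above can be sketched as follows. This is a hypothetical illustration of the setting's behavior; the function and its `weight` parameter are assumptions, not Warudo's code:

```python
def body_rotation(head_euler, rotation_type="Inverted", weight=1.0):
    """Derive a body rotation (Euler angles) from a head rotation.

    Illustrates the Body Rotation Type setting:
      None     -> the body does not rotate with the head
      Normal   -> the body follows the head on all axes
      Inverted -> the body follows on X and Y, counter-rotates on Z
    """
    x, y, z = head_euler
    if rotation_type == "None":
        return (0.0, 0.0, 0.0)
    if rotation_type == "Normal":
        return (weight * x, weight * y, weight * z)
    if rotation_type == "Inverted":
        return (weight * x, weight * y, -weight * z)
    raise ValueError(f"unknown Body Rotation Type: {rotation_type}")
```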

<AuthorBar authors={{
creators: [
{name: 'HakuyaTira', github: 'TigerHix'},
],
translators: [
{name: 'MISSINGNO', github: 'MISSINGNO'},
],
}} />
@@ -0,0 +1,51 @@
---
sidebar_position: 70
---

# Leap Motion Controller

Use a [Leap Motion Controller](https://leap2.ultraleap.com/leap-motion-controller-2/) for hand tracking.

## Setup

For more accurate and stable hand tracking, we recommend the newer Leap Motion Controller 2, but Warudo also supports the original Leap Motion Controller.

You will need to download and install the latest [Gemini software](https://leap2.ultraleap.com/gemini-downloads/) to connect your Leap Motion Controller to Warudo.

Warudo supports all three tracking modes offered by the Leap Motion Controller: **Desktop** (device placed on the desk), **Screen Top** (device mounted on top of the screen), and **Chest Mounted** (device worn on the chest). For the best experience in chest-mounted mode, we recommend using a [neck mount](https://www.etsy.com/market/leap_motion_mounting).

## Calibration {#Calibration}

Generally, the Leap Motion Controller works without calibration, but you can adjust **Controller Position Offset**, **Controller Rotation Offset**, and **Controller Scale** in the **Leap Motion Controller asset** to tune the tracking. A virtual Leap Motion Controller is displayed in the interface to help you visualize the adjustments.

![](/doc-img/en-leapmotion-1.png)
<p class="img-desc">Adjusting the virtual Leap Motion Controller.</p>

## Options

* **Controller Position Offset**: Adjusts the sensor's position in 3D space. A positive X value moves the sensor to the left, a negative value to the right; a positive Y value moves it up, a negative value down; a positive Z value moves it forward, a negative value backward.
* **Controller Rotation Offset**: Rotates the virtual Leap Motion Controller to adjust the sensor's angle.
* **Controller Scale**: Adjusts the sensor's distance scaling, which changes how far the model's hands reach relative to its body. You can also enable **Per-Axis Scale** to fine-tune the sensitivity on each axis individually.
* **Fix Finger Orientation Weight**: Because every model is rigged differently, the finger orientations reported by the Leap Motion Controller may not match the model's fingers. This parameter corrects the finger orientations so the model matches the captured motion; 0 applies no correction, and 1 applies the maximum correction. Adjust it until the model looks natural.
* **Fix Shoulder Rotation Weight**: Changes the rotation of the model's shoulders; 0 applies no correction, and 1 fully inverts the rotation. Adjust it until the model looks natural.
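As a rough illustration of how the position offset and scale options above combine, consider the sketch below. The axis conventions follow the description above (positive X left, positive Y up, positive Z forward); the function itself is hypothetical, not Warudo's implementation:

```python
def transform_tracked_point(point, position_offset=(0.0, 0.0, 0.0),
                            scale=(1.0, 1.0, 1.0)):
    """Scale a tracked hand position per axis, then shift it by the
    controller position offset (hypothetical illustration)."""
    (px, py, pz) = point
    (ox, oy, oz) = position_offset
    (sx, sy, sz) = scale
    return (px * sx + ox, py * sy + oy, pz * sz + oz)
```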

## Frequently Asked Questions {#FAQ}

For general questions, see the [Motion Capture Overview](overview#FAQ) and [Body Tracking](body-tracking#FAQ) pages.

### The Leap Motion tracker shows "Tracker not started."

Make sure you have downloaded and installed the latest version of the [Gemini software](https://leap2.ultraleap.com/gemini-downloads/), and that it is running in the background.

### My model's wrists/fingers look weird.

Try adjusting the **Fix Finger Orientation Weight** option in the **Leap Motion Controller asset**. You may also need to adjust the **Wrist Rotation Offset** and **Global Finger Rotation Offset** options. (You can enable them with the checkbox to the left of each option.)

<AuthorBar authors={{
creators: [
{name: 'HakuyaTira', github: 'TigerHix'},
],
translators: [
{name: 'MISSINGNO', github: 'MISSINGNO'},
],
}} />
50 changes: 50 additions & 0 deletions i18n/zh/docusaurus-plugin-content-docs/current/mocap/mocopi.md
@@ -0,0 +1,50 @@
---
sidebar_position: 80
---

# Sony Mocopi

Use [Sony Mocopi](https://electronics.sony.com/more/mocopi/all-mocopi/p/qmss1-uscx) for motion capture.

## Setup

First, follow the [official Sony tutorial](https://www.sony.com/electronics/support/articles/00298063) to set up Mocopi.

After the initial setup, open the settings page in the Mocopi mobile app, select **External device connection settings**, set **Transfer Format** to **mocopi (UDP)**, and enter your computer's IP address under **IP address**.

:::tip
If you do not know your computer's IP address, you can find it on the settings page of the **Mocopi Receiver** asset.

![](/doc-img/en-ifacialmocap-1.png)

If multiple IP addresses are listed, you will need to try them one by one. Addresses assigned by your router usually start with `192.168`; in the example image, you would try `192.168.1.151` first.
:::
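If you would rather look up the address yourself, the snippet below shows a common way to find the machine's LAN IPv4 address. It is a generic sketch unrelated to Warudo; connecting a UDP socket sends no packets and merely asks the OS which local address would be used for that route:

```python
import socket

def guess_local_ipv4():
    """Return the IPv4 address of the interface used for LAN traffic,
    falling back to the loopback address if no route is available."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # Connecting a UDP socket transmits nothing; it only selects a route.
        s.connect(("192.168.1.1", 9))
        return s.getsockname()[0]
    except OSError:
        return "127.0.0.1"  # no usable network route
    finally:
        s.close()

print(guess_local_ipv4())
```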

Go back to the main screen, tap **Motion** in the top menu bar, then tap the **SAVE** icon to enter **SEND** mode. Tap the green send button at the bottom to start streaming data.

![](/doc-img/en-mocopi-1.png)

## Calibration

Sony Mocopi calibration is done in the Mocopi app.

## Options

* **Motion Buffering**: Whether to enable Mocopi's built-in motion buffering, which smooths the captured motion. Since Warudo already smooths the tracking data itself, this option usually does not need to be enabled.

## Frequently Asked Questions

For general questions, see the [Motion Capture Overview](overview#FAQ) and [Body Tracking](body-tracking#FAQ) pages.

### My tracking drifts over time

Drift that accumulates gradually over time is a common limitation of inertial motion capture systems. Reduce electromagnetic interference around the tracking devices to minimize drift.

<AuthorBar authors={{
creators: [
{name: 'HakuyaTira', github: 'TigerHix'},
],
translators: [
{name: 'MISSINGNO', github: 'MISSINGNO'},
],
}} />
35 changes: 35 additions & 0 deletions i18n/zh/docusaurus-plugin-content-docs/current/mocap/noitom.md
@@ -0,0 +1,35 @@
---
sidebar_position: 120
---

# Noitom Axis

Use the [Noitom Axis Studio](https://neuronmocap.com/pages/axis-studio) or [Axis Neuron](https://neuronmocap.com/pages/axis-neuron) software for motion capture. This method requires a [Noitom Perception Neuron](https://neuronmocap.com/) motion capture suit.

## Setup

Open [Noitom Axis Studio](https://neuronmocap.com/pages/axis-studio) and enable BVH broadcasting under **Settings → BVH Broadcasting**, as shown below:

![](/doc-img/en-noitom-1.png)

## Calibration

Calibration of the Noitom motion capture suit is done in [Axis Studio](https://neuronmocap.com/pages/axis-studio) or [Axis Neuron](https://neuronmocap.com/pages/axis-neuron).

## Frequently Asked Questions

For general questions, see the [Motion Capture Overview](overview#FAQ) and [Body Tracking](body-tracking#FAQ) pages.

### My tracking drifts over time

Drift that accumulates gradually over time is a common limitation of inertial motion capture systems. Reduce electromagnetic interference around the tracking devices to minimize drift.

<AuthorBar authors={{
creators: [
{name: 'HakuyaTira', github: 'TigerHix'},
],
translators: [
{name: 'MISSINGNO', github: 'MISSINGNO'},
],
}} />