This repository has been archived by the owner on Sep 14, 2021. It is now read-only.
It would be great if the ARCore iOS Augmented Faces SDK was supported in Unity as well.
We have found several specific cases where ARCore Face Tracking on iOS is preferable to ARKit. In native Xcode testing, the ARCore for iOS SDK tracked faces much more reliably while users were wearing face masks/coverings (something increasingly common in our projects), and because it is TensorFlow based rather than dependent on depth sensors, ARCore also supports face tracking on older iOS devices that lack a TrueDepth camera.
It would be great if there were an option to build ARCore Unity projects for iOS (instead of ARKit) with support for the two ARCore features currently available on iOS (Augmented Faces & Cloud Anchors).
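For reference, the kind of native test described above looks roughly like the Swift sketch below. It follows the pattern of Google's AugmentedFacesExample for the ARCore iOS SDK: front-camera RGB frames are fed into a GARAugmentedFaceSession and the face pose is read back each frame. The `FaceTrackingProbe` wrapper is purely illustrative, and the exact GARAugmentedFaceSession initializer and update signatures may differ between SDK versions, so treat this as a sketch rather than the actual test code.

```swift
import ARCore        // ARCore SDK for iOS ("ARCore/AugmentedFaces" pod)
import CoreVideo
import simd

/// Illustrative wrapper that feeds front-camera frames into ARCore Augmented Faces.
/// Tracking runs on RGB frames only; no TrueDepth data is required.
final class FaceTrackingProbe {
    private var faceSession: GARAugmentedFaceSession?

    /// `fieldOfView` is the front camera's field of view in degrees
    /// (e.g. AVCaptureDevice.activeFormat.videoFieldOfView).
    init(fieldOfView: CGFloat) {
        faceSession = try? GARAugmentedFaceSession(fieldOfView: fieldOfView)
    }

    /// Call once per captured frame. `rotationDegrees` tells ARCore which way
    /// is "up" for the face (derived from device motion in Google's sample).
    func process(_ pixelBuffer: CVPixelBuffer, timestamp: TimeInterval, rotationDegrees: UInt) {
        faceSession?.update(with: pixelBuffer, timestamp: timestamp,
                            recognitionRotation: rotationDegrees)

        guard let face = faceSession?.currentFrame?.face else { return }

        // Pose of the face center and of one attachment region (nose tip);
        // the full face mesh is also available via face.mesh for rendering.
        let center: simd_float4x4 = face.centerTransform
        let nose: simd_float4x4 = face.transform(for: .nose)
        print("face center: \(center.columns.3)  nose: \(nose.columns.3)")
    }
}
```

Exposing an equivalent of this flow through the ARCore Unity SDK's iOS build path is what we are asking for here.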