Information about the camera position and imaging characteristics for a captured video frame in an AR session.
Handling Tracking Status
trackingState
The general quality of position tracking available when the camera captured a frame.
ARTrackingState
Possible values for position tracking quality.
Tracking States
ARTrackingStateNotAvailable
Camera position tracking is not available.
ARTrackingStateLimited
Tracking is available, but the quality of results is questionable.
Discussion
In this state, the positions and transforms of anchors in the scene (especially detected planes) may not be accurate or consistent from one captured frame to the next.
See the associated ARTrackingStateReason value for information you can present to the user for improving tracking quality.
ARTrackingStateNormal
Camera position tracking is providing optimal results.
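A minimal Objective-C sketch of reacting to this state, assuming a hypothetical object that serves as the session's ARSessionDelegate:

#import <ARKit/ARKit.h>

// Hypothetical delegate object; the class name and responses are illustrative only.
@interface TrackingStateObserver : NSObject <ARSessionDelegate>
@end

@implementation TrackingStateObserver
// Called for each captured frame while this object is the session's delegate.
- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    switch (frame.camera.trackingState) {
        case ARTrackingStateNotAvailable:
            // No position tracking; hide or freeze AR content.
            break;
        case ARTrackingStateLimited:
            // Results are questionable; consult trackingStateReason (below) for guidance.
            break;
        case ARTrackingStateNormal:
            // Tracking is providing optimal results.
            break;
    }
}
@end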
trackingStateReason
A possible diagnosis for limited position tracking quality as of when the camera captured a frame.
ARTrackingStateReason
Possible causes for limited position tracking quality.
Reason Values
ARTrackingStateReasonNone
The current tracking state is not limited.
ARTrackingStateReasonInitializing
The AR session has not yet gathered enough camera or motion data to provide tracking information.
Discussion
This value occurs temporarily after starting a new AR session or changing configurations.
ARTrackingStateReasonRelocalizing
The AR session is attempting to resume after an interruption.
Discussion
ARKit cannot track device position or orientation when the session has been interrupted (for example, by dismissing the view hosting an AR session or switching to another app). When resuming the session after an interruption, you cannot be certain that the world coordinate system (used for placing anchors) matches the device's real-world environment.
If your session or view delegate implements the sessionShouldAttemptRelocalization: method and returns YES, ARKit attempts to reconcile pre- and post-interruption world tracking state. During this process, called relocalization, world tracking quality is ARTrackingStateLimited, with ARTrackingStateReasonRelocalizing as the reason for limited quality.
If successful, the relocalization process ends after a short time, tracking quality returns to the ARTrackingStateNormal state, and the world coordinate system and anchor positions generally reflect their state before the interruption.
However, the speed and success rate of relocalization can vary depending on real-world conditions. You may wish to hide AR content or disable UI during relocalization, and reset tracking if relocalization doesn't succeed within a time frame appropriate for your app.
ARTrackingStateReasonExcessiveMotion
The device is moving too fast for accurate image-based position tracking.
ARTrackingStateReasonInsufficientFeatures
The scene visible to the camera does not contain enough distinguishable features for image-based position tracking.
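Continuing the delegate sketch above (the responses in the comments are placeholders, not prescribed behavior), these reasons can be handled in the ARSessionObserver callbacks:

// Called when the camera's tracking state changes.
- (void)session:(ARSession *)session cameraDidChangeTrackingState:(ARCamera *)camera {
    if (camera.trackingState != ARTrackingStateLimited) {
        return;
    }
    switch (camera.trackingStateReason) {
        case ARTrackingStateReasonNone:
            break;
        case ARTrackingStateReasonInitializing:
            // The session is still gathering data; this usually resolves on its own.
            break;
        case ARTrackingStateReasonRelocalizing:
            // The session is resuming after an interruption; consider hiding AR content.
            break;
        case ARTrackingStateReasonExcessiveMotion:
            // Suggest that the user move the device more slowly.
            break;
        case ARTrackingStateReasonInsufficientFeatures:
            // Suggest that the user point the camera at a more detailed surface.
            break;
    }
}

// Opt in to relocalization after an interruption (see the relocalization discussion above).
- (BOOL)sessionShouldAttemptRelocalization:(ARSession *)session {
    return YES;
}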
Examining Camera Geometry
transform
The position and orientation of the camera in world coordinate space.
Discussion
World coordinate space in ARKit always follows a right-handed convention, but is oriented based on the session configuration. For details, see About Augmented Reality and ARKit.
This transform creates a local coordinate space for the camera that is constant with respect to device orientation. In camera space, the x-axis points to the right when the device is in UIDeviceOrientationLandscapeRight orientation—that is, the x-axis always points along the long axis of the device, from the front-facing camera toward the Home button. The y-axis points upward (with respect to UIDeviceOrientationLandscapeRight orientation), and the z-axis points away from the device on the screen side.
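For example (a sketch; frame is assumed to be the session's current ARFrame), the camera's world-space position can be read from the translation column of this transform:

simd_float4x4 cameraTransform = frame.camera.transform;
// The fourth column of the transform holds the camera's translation in world coordinates.
simd_float3 cameraPosition = simd_make_float3(cameraTransform.columns[3].x,
                                              cameraTransform.columns[3].y,
                                              cameraTransform.columns[3].z);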
eulerAngles
The orientation of the camera, expressed as roll, pitch, and yaw values.
Examining Imaging Parameters
imageResolution
The width and height, in pixels, of the captured camera image.
intrinsics
A matrix that converts between the 2D camera plane and 3D world coordinate space.
Discussion
The intrinsic matrix (commonly represented in equations as K) is based on physical characteristics of the device camera and a pinhole camera model. You can use the matrix to transform 3D coordinates to 2D coordinates on an image plane.
The values fx and fy are the pixel focal length, and are identical for square pixels. The values ox and oy are the offsets of the principal point from the top-left corner of the image frame. All values are expressed in pixels.
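As a sketch of that layout (K is assumed to come from the current frame's camera; camera-space sign conventions are simplified), the focal lengths and principal point can be read directly from the matrix columns:

simd_float3x3 K = frame.camera.intrinsics;
float fx = K.columns[0].x;   // pixel focal length along x
float fy = K.columns[1].y;   // pixel focal length along y
float ox = K.columns[2].x;   // principal point offset along x
float oy = K.columns[2].y;   // principal point offset along y

// Pinhole projection of a camera-space point (x, y) at depth z into pixel coordinates:
//   u = fx * (x / z) + ox
//   v = fy * (y / z) + oy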
Applying Camera Geometry
projectionMatrix
A transform matrix appropriate for rendering 3D content to match the image captured by the camera.
Discussion
Reading this property's value is equivalent to calling the projectionMatrixForOrientation:viewportSize:zNear:zFar: method, using the camera's imageResolution and intrinsics properties to derive size and orientation, and passing default values of 0.001 and 1000.0 for the near and far clipping planes.
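In other words, the two values below are expected to match (a sketch; the landscape-right orientation is an assumption about the camera's native image orientation):

simd_float4x4 fromProperty = frame.camera.projectionMatrix;
simd_float4x4 fromMethod =
    [frame.camera projectionMatrixForOrientation:UIInterfaceOrientationLandscapeRight
                                    viewportSize:frame.camera.imageResolution
                                           zNear:0.001
                                            zFar:1000.0];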
- projectionMatrixForOrientation:viewportSize:zNear:zFar:
Returns a transform matrix appropriate for rendering 3D content to match the image captured by the camera, using the specified parameters.
Parameters
orientation
The orientation in which the camera image is to be presented.
viewportSize
The size, in points, of the view in which the camera image is to be presented.
zNear
The distance from the camera to the near clipping plane.
zFar
The distance from the camera to the far clipping plane.
Return Value
A projection matrix that provides aspect-fill coverage and rotation appropriate for the specified viewport size and orientation.
Discussion
This method has no effect on ARKit, and the zNear and zFar parameters have no relationship to ARKit camera state. Instead, this method uses those parameters, together with the camera's state, to construct a projection matrix for use in your own rendering code.
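For example (a sketch; the orientation, viewport size, and clipping distances are placeholder values for a full-screen portrait view), a custom renderer can request a matching projection matrix:

CGSize viewportSize = CGSizeMake(390.0, 844.0);   // size of the presenting view, in points
simd_float4x4 projectionMatrix =
    [frame.camera projectionMatrixForOrientation:UIInterfaceOrientationPortrait
                                    viewportSize:viewportSize
                                           zNear:0.01
                                            zFar:100.0];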
- viewMatrixForOrientation:
Returns a transform matrix for converting from world space to camera space.
Parameters
orientation
The orientation in which the camera image is to be presented.
Return Value
A view matrix appropriate for the camera with the specified orientation.
Discussion
This method has no effect on ARKit. Instead, it uses the orientation parameter and the camera's state to construct a view matrix for use in your own rendering code.
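Combined with a projection matrix from the method above (a sketch; the orientation should match the one used for that projection matrix), this gives the usual view-projection setup for custom rendering:

simd_float4x4 viewMatrix =
    [frame.camera viewMatrixForOrientation:UIInterfaceOrientationPortrait];
// Conventional order: clip-space position = projection * view * world-space position.
simd_float4x4 viewProjectionMatrix = simd_mul(projectionMatrix, viewMatrix);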
- projectPoint:orientation:viewportSize:
Returns the projection of a point from the 3D world space detected by ARKit into the 2D space of a view rendering the scene.
Parameters
point
The 3D world-space point to project into 2D view space.
orientation
The orientation in which the camera image is to be presented.
viewportSize
The size, in points, of the view in which the camera image is to be presented.
Return Value
The projection of the specified point into a 2D pixel coordinate space whose origin is in the upper left corner and whose size matches that of the viewportSize parameter.
Discussion
If you display AR content with SceneKit, the ARSCNView class provides an otherwise equivalent projectPoint: method that requires fewer parameters (because the view can infer its orientation and size).
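For example (a sketch; the anchor and viewport size are illustrative), projecting an anchor's world-space position into the coordinate space of a portrait view:

// World-space position of the anchor (translation column of its transform).
simd_float4x4 anchorTransform = anchor.transform;
simd_float3 worldPoint = simd_make_float3(anchorTransform.columns[3].x,
                                          anchorTransform.columns[3].y,
                                          anchorTransform.columns[3].z);

CGPoint viewPoint = [frame.camera projectPoint:worldPoint
                                   orientation:UIInterfaceOrientationPortrait
                                  viewportSize:CGSizeMake(390.0, 844.0)];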
Inherits From NSObject
Conforms To NSCopying