A Depth Frame
Understanding the Depth type
A Depth frame is an in-memory buffer that provides GPU and CPU access to individual depth points. It is typically streamed in realtime by a CameraDepthFrameOutput, or attached to a Photo (see Photo.depth) if enabled on the CameraPhotoOutput.
See "The Depth Output" for more information about streaming Depth frames.
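For illustration, here is a hedged sketch of reading a Photo's attached Depth frame. The PhotoDepth and Photo shapes below are minimal assumptions for the sketch, not the library's real type definitions:

```typescript
// Minimal assumed shapes - not the library's real type definitions.
interface PhotoDepth {
  pixelFormat: string
}
interface Photo {
  // only present if depth was enabled on the CameraPhotoOutput
  depth?: PhotoDepth
}

// Describes the attached Depth frame's format, if any.
function describePhotoDepth(photo: Photo): string {
  if (photo.depth != null) {
    return `Photo has depth data (${photo.depth.pixelFormat})`
  }
  return 'Photo has no depth data attached'
}
```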
Native Plugins
Similarly to "A Frame", you would typically process Depth frames in "Native Frame Processor Plugins" to avoid touching depth data in JS.
orientation and isMirrored
The Depth's orientation describes its rotation relative to the output's outputOrientation.
Similarly, isMirrored describes if the Depth is considered to be mirrored, relative to the output's mirrorMode.
Consider both orientation and isMirrored as a "recipe" to get the Depth's intended presentation.
Why a Depth isn't rotated/mirrored automatically
The Camera pipeline does not physically rotate or mirror buffers automatically, as this is computationally expensive.
Instead, it is much more efficient to pass orientation and isMirrored along as metadata so consumers can apply rotation/mirroring logic themselves.
Tip
If you need your buffers to be correctly rotated and mirrored already, enable enablePhysicalBufferRotation. With it enabled, orientation is always 'up' and isMirrored is always false, indicating that no rotation or mirroring is necessary to get the Depth's intended presentation.
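The "recipe" described above can be sketched as a small helper. This is a hedged illustration: the Orientation values and the metadata shape below are assumptions, not the library's exact types.

```typescript
// Assumed orientation values - the library's actual union type may differ.
type Orientation = 'up' | 'right' | 'down' | 'left'

interface DepthPresentation {
  orientation: Orientation
  isMirrored: boolean
}

// Returns the clockwise rotation (in degrees) and horizontal-mirror flag
// a consumer should apply to get the Depth's intended presentation.
function presentationRecipe(depth: DepthPresentation): { rotation: number; mirror: boolean } {
  const rotations: Record<Orientation, number> = { up: 0, right: 90, down: 180, left: 270 }
  return { rotation: rotations[depth.orientation], mirror: depth.isMirrored }
}
```

With enablePhysicalBufferRotation, orientation is always 'up' and isMirrored is always false, so the recipe collapses to `{ rotation: 0, mirror: false }` (a no-op).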
Understanding Depth Formats
A Depth frame has a GPU-backed buffer that contains its depth data. Its data layout is described by the Depth's pixelFormat.
const depth = ...
console.log(depth.pixelFormat) // 'depth-16-bit'
Disparity vs Depth
Some Depth frames are not computed via time-of-flight (true distance measuring), but instead via disparity - which computes distance by triangulating pixel shift using multiple physical Camera devices.
Those Depth frames should be considered less accurate, but can still be used just like true depth frames.
Accessing Depth Data in JS
A Depth frame exposes its native, GPU-backed depth data buffer via getDepthData(), which provides zero-copy access into the Depth's actual buffer:
const depth = ...
const buffer = depth.getDepthData()
Warning
The buffer is only valid as long as the Depth frame is valid (see Depth.isValid).
Once the Depth frame is disposed (see dispose()), the buffer must no longer be used.
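One way to respect this lifetime rule is to scope all buffer access to a helper that disposes the frame afterwards. This is a hedged sketch assuming the isValid, getDepthData() and dispose() members described above; it is not the library's own API:

```typescript
// Assumed minimal Depth shape, based on the members described above.
interface Depth {
  readonly isValid: boolean
  getDepthData(): ArrayBuffer
  dispose(): void
}

// Runs `use` with the Depth's buffer, then disposes the frame so the
// (now invalid) buffer cannot accidentally be used afterwards.
function withDepthData<T>(depth: Depth, use: (buffer: ArrayBuffer) => T): T {
  if (!depth.isValid) {
    throw new Error('Depth frame is no longer valid!')
  }
  try {
    // zero-copy view into the Depth's native buffer
    return use(depth.getDepthData())
  } finally {
    // once disposed, the buffer must no longer be used
    depth.dispose()
  }
}
```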
Interpreting Depth Data
The depth data's layout depends on the Depth's pixelFormat. For example, in 'depth-16-bit', pixels are laid out as 16-bit floats - one float per "pixel".
const depth = ...
const buffer = depth.getDepthData()
if (depth.pixelFormat === 'depth-16-bit') {
  const pixels = new Float16Array(buffer)
  const distanceToFirstPixel = pixels[0]
  console.log(`Distance to first pixel:`, distanceToFirstPixel)
}
Tip
Typically, a single 16-bit float in 'depth-16-bit' data represents a distance in meters.
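Float16Array is a relatively recent JavaScript addition. As a hedged fallback sketch, 'depth-16-bit' values can also be decoded by hand, assuming the standard IEEE 754 half-precision layout and meters as the unit (both assumptions, as noted above):

```typescript
// Decodes one IEEE 754 half-precision (binary16) value from its raw bits.
function halfToFloat(h: number): number {
  const sign = (h & 0x8000) ? -1 : 1
  const exp = (h >> 10) & 0x1f
  const frac = h & 0x3ff
  if (exp === 0) return sign * frac * 2 ** -24          // subnormal
  if (exp === 31) return frac ? NaN : sign * Infinity   // NaN / Infinity
  return sign * (1 + frac / 1024) * 2 ** (exp - 15)
}

// Averages all depth values in a 'depth-16-bit' buffer (assumed meters).
function averageDepthMeters(buffer: ArrayBuffer): number {
  const raw = new Uint16Array(buffer)
  let sum = 0
  for (let i = 0; i < raw.length; i++) sum += halfToFloat(raw[i])
  return sum / raw.length
}
```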
Converting between Depth Formats
You can convert Depth to a different DepthPixelFormat by using Depth.convert(...). For example, if we need 'depth-32-bit', we can convert to it (if available):
let depth = ...
if (depth.pixelFormat !== 'depth-32-bit') {
  if (!depth.availableDepthPixelFormats.includes('depth-32-bit')) {
    throw new Error(`Depth frame does not support converting to depth-32-bit!`)
  }
  depth = depth.convert('depth-32-bit')
}
console.log(depth.pixelFormat) // 'depth-32-bit'