/* * Copyright (C) 2013 The Android Open Source Project * * Licensed under the Apache License, Version 2.0 (the "License"); * you may not use this file except in compliance with the License. * You may obtain a copy of the License at * * http://www.apache.org/licenses/LICENSE-2.0 * * Unless required by applicable law or agreed to in writing, software * distributed under the License is distributed on an "AS IS" BASIS, * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. * See the License for the specific language governing permissions and * limitations under the License. */ package android.hardware.camera2; import android.annotation.NonNull; import android.annotation.Nullable; import android.hardware.camera2.impl.CameraMetadataNative; import android.hardware.camera2.impl.PublicKey; import android.hardware.camera2.impl.SyntheticKey; import android.hardware.camera2.utils.TypeReference; import android.util.Rational; import java.util.Collections; import java.util.List; /** *
The properties describing a * {@link CameraDevice CameraDevice}.
* *These properties are fixed for a given CameraDevice, and can be queried * through the {@link CameraManager CameraManager} * interface with {@link CameraManager#getCameraCharacteristics}.
* *{@link CameraCharacteristics} objects are immutable.
* * @see CameraDevice * @see CameraManager */ public final class CameraCharacteristics extends CameraMetadataFor example, to get the stream configuration map:
*
*
* StreamConfigurationMap map = cameraCharacteristics.get(
* CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
*
To enumerate over all possible keys for {@link CameraCharacteristics}, see * {@link CameraCharacteristics#getKeys()}.
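* <p>As an illustrative sketch (not part of the original documentation), assuming a
* {@code characteristics} instance obtained from
* {@link CameraManager#getCameraCharacteristics}, the supported keys can be enumerated
* and logged like this:</p>
* <pre><code>
* for (CameraCharacteristics.Key<?> key : characteristics.getKeys()) {
*     Object value = characteristics.get(key);
*     Log.d("CameraInfo", key.getName() + " = " + value);
* }
* </code></pre>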
* * @see CameraCharacteristics#get * @see CameraCharacteristics#getKeys() */ public static final class KeyBuilt-in keys exposed by the Android SDK are always prefixed with {@code "android."}; * keys that are device/platform-specific are prefixed with {@code "com."}.
* *For example, {@code CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP} would * have a name of {@code "android.scaler.streamConfigurationMap"}; whereas a device * specific key might look like {@code "com.google.nexus.data.private"}.
* * @return String representation of the key name */ @NonNull public String getName() { return mKey.getName(); } /** * {@inheritDoc} */ @Override public final int hashCode() { return mKey.hashCode(); } /** * {@inheritDoc} */ @SuppressWarnings("unchecked") @Override public final boolean equals(Object o) { return o instanceof Key && ((Key{@code "CameraCharacteristics.Key(%s)"}, where {@code %s} represents * the name of this key as returned by {@link #getName}.
* * @return string representation of {@link Key} */ @NonNull @Override public String toString() { return String.format("CameraCharacteristics.Key(%s)", mKey.getName()); } /** * Visible for CameraMetadataNative implementation only; do not use. * * TODO: Make this private or remove it altogether. * * @hide */ public CameraMetadataNative.KeyThe field definitions can be * found in {@link CameraCharacteristics}.
* *Querying the value for the same key more than once will return a value * which is equal to the previous queried value.
* * @throws IllegalArgumentException if the key was not valid * * @param key The characteristics field to read. * @return The value of that key, or {@code null} if the field is not set. */ @Nullable publicThe list returned is not modifiable, so any attempts to modify it will throw * a {@code UnsupportedOperationException}.
* *Each key is only listed once in the list. The order of the keys is undefined.
* *Note that there is no {@code getAvailableCameraCharacteristicsKeys()} -- use * {@link #getKeys()} instead.
* * @return List of keys supported by this CameraDevice for CaptureRequests. */ @SuppressWarnings({"unchecked"}) @NonNull public ListThe list returned is not modifiable, so any attempts to modify it will throw * a {@code UnsupportedOperationException}.
* *Each key is only listed once in the list. The order of the keys is undefined.
* *Note that there is no {@code getAvailableCameraCharacteristicsKeys()} -- use * {@link #getKeys()} instead.
* * @return List of keys supported by this CameraDevice for CaptureResults. */ @SuppressWarnings({"unchecked"}) @NonNull public ListThe list returned is not modifiable, so any attempts to modify it will throw * a {@code UnsupportedOperationException}.
* *Each key is only listed once in the list. The order of the keys is undefined.
* * @param metadataClass The subclass of CameraMetadata that you want to get the keys for. * @param keyClass The class of the metadata key, e.g. CaptureRequest.Key.class * * @return List of keys supported by this CameraDevice for metadataClass. * * @throws IllegalArgumentException if metadataClass is not a subclass of CameraMetadata */ privateList of aberration correction modes for {@link CaptureRequest#COLOR_CORRECTION_ABERRATION_MODE android.colorCorrection.aberrationMode} that are * supported by this camera device.
*This key lists the valid modes for {@link CaptureRequest#COLOR_CORRECTION_ABERRATION_MODE android.colorCorrection.aberrationMode}. If no * aberration correction modes are available for a device, this list will solely include * OFF mode. All camera devices will support either OFF or FAST mode.
*Camera devices that support the MANUAL_POST_PROCESSING capability will always list * OFF mode. This includes all FULL level devices.
*LEGACY devices will always only support FAST mode.
*Range of valid values:
* Any value listed in {@link CaptureRequest#COLOR_CORRECTION_ABERRATION_MODE android.colorCorrection.aberrationMode}
This key is available on all devices.
* * @see CaptureRequest#COLOR_CORRECTION_ABERRATION_MODE */ @PublicKey public static final KeyList of auto-exposure antibanding modes for {@link CaptureRequest#CONTROL_AE_ANTIBANDING_MODE android.control.aeAntibandingMode} that are * supported by this camera device.
*Not all of the auto-exposure anti-banding modes may be * supported by a given camera device. This field lists the * valid anti-banding modes that the application may request * for this camera device with the * {@link CaptureRequest#CONTROL_AE_ANTIBANDING_MODE android.control.aeAntibandingMode} control.
*Range of valid values:
* Any value listed in {@link CaptureRequest#CONTROL_AE_ANTIBANDING_MODE android.control.aeAntibandingMode}
This key is available on all devices.
* * @see CaptureRequest#CONTROL_AE_ANTIBANDING_MODE */ @PublicKey public static final KeyList of auto-exposure modes for {@link CaptureRequest#CONTROL_AE_MODE android.control.aeMode} that are supported by this camera * device.
*Not all the auto-exposure modes may be supported by a * given camera device, especially if no flash unit is * available. This entry lists the valid modes for * {@link CaptureRequest#CONTROL_AE_MODE android.control.aeMode} for this camera device.
*All camera devices support ON, and all camera devices with flash * units support ON_AUTO_FLASH and ON_ALWAYS_FLASH.
*FULL mode camera devices always support OFF mode, * which enables application control of camera exposure time, * sensitivity, and frame duration.
*LEGACY mode camera devices never support OFF mode. * LIMITED mode devices support OFF if they support the MANUAL_SENSOR * capability.
*Range of valid values:
* Any value listed in {@link CaptureRequest#CONTROL_AE_MODE android.control.aeMode}
This key is available on all devices.
* * @see CaptureRequest#CONTROL_AE_MODE */ @PublicKey public static final KeyList of frame rate ranges for {@link CaptureRequest#CONTROL_AE_TARGET_FPS_RANGE android.control.aeTargetFpsRange} supported by * this camera device.
*For devices at the LEGACY level or above:
* For constant-framerate recording, for each normal
* {@link android.media.CamcorderProfile CamcorderProfile}, that is, a
* {@link android.media.CamcorderProfile CamcorderProfile} that has
* {@link android.media.CamcorderProfile#quality quality} in
* the range [{@link android.media.CamcorderProfile#QUALITY_LOW QUALITY_LOW},
* {@link android.media.CamcorderProfile#QUALITY_2160P QUALITY_2160P}], if the profile is
* supported by the device and has
* {@link android.media.CamcorderProfile#videoFrameRate videoFrameRate} x, this list will
* always include (x, x).
* Also, a camera device must either not support any
* {@link android.media.CamcorderProfile CamcorderProfile},
* or support at least one
* normal {@link android.media.CamcorderProfile CamcorderProfile} that has
* {@link android.media.CamcorderProfile#videoFrameRate videoFrameRate} x >= 24.
* For devices at the LIMITED level or above:
* This list will always include (min, max) and (max, max) where min <= 15 and
* max = the maximum output frame rate of the maximum YUV_420_888 output size.
* Units: Frames per second (FPS)
*This key is available on all devices.
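* <p>A minimal sketch (not from the original documentation) of choosing a fixed 30fps range for
* preview; it assumes {@code characteristics} was obtained from
* {@link CameraManager#getCameraCharacteristics} and {@code previewBuilder} is an existing
* {@link CaptureRequest.Builder}:</p>
* <pre><code>
* Range<Integer>[] fpsRanges =
*         characteristics.get(CameraCharacteristics.CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES);
* Range<Integer> chosen = fpsRanges[0];
* for (Range<Integer> range : fpsRanges) {
*     // Prefer a fixed 30fps range for preview if the device advertises one.
*     if (range.getLower() == 30 && range.getUpper() == 30) {
*         chosen = range;
*         break;
*     }
* }
* previewBuilder.set(CaptureRequest.CONTROL_AE_TARGET_FPS_RANGE, chosen);
* </code></pre>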
* * @see CaptureRequest#CONTROL_AE_TARGET_FPS_RANGE */ @PublicKey public static final KeyMaximum and minimum exposure compensation values for * {@link CaptureRequest#CONTROL_AE_EXPOSURE_COMPENSATION android.control.aeExposureCompensation}, in counts of {@link CameraCharacteristics#CONTROL_AE_COMPENSATION_STEP android.control.aeCompensationStep}, * that are supported by this camera device.
*Range of valid values:
Range [0,0] indicates that exposure compensation is not supported.
* For LIMITED and FULL devices, the range must meet the following requirements if exposure
* compensation is supported (range != [0, 0]):
* Min. exposure compensation * {@link CameraCharacteristics#CONTROL_AE_COMPENSATION_STEP android.control.aeCompensationStep} <= -2 EV
* Max. exposure compensation * {@link CameraCharacteristics#CONTROL_AE_COMPENSATION_STEP android.control.aeCompensationStep} >= 2 EV
* LEGACY devices may support a smaller range than this.
*This key is available on all devices.
* * @see CameraCharacteristics#CONTROL_AE_COMPENSATION_STEP * @see CaptureRequest#CONTROL_AE_EXPOSURE_COMPENSATION */ @PublicKey public static final KeySmallest step by which the exposure compensation * can be changed.
* This is the unit for {@link CaptureRequest#CONTROL_AE_EXPOSURE_COMPENSATION android.control.aeExposureCompensation}. For example, if this key has
* a value of 1/2, then a setting of -2 for {@link CaptureRequest#CONTROL_AE_EXPOSURE_COMPENSATION android.control.aeExposureCompensation} means
* that the target EV offset for the auto-exposure routine is -1 EV.
One unit of EV compensation changes the brightness of the captured image by a factor * of two. +1 EV doubles the image brightness, while -1 EV halves the image brightness.
*Units: Exposure Value (EV)
*This key is available on all devices.
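* <p>For illustration only (a sketch, not part of the original documentation): converting a desired
* EV offset into the integer value expected by
* {@link CaptureRequest#CONTROL_AE_EXPOSURE_COMPENSATION}, assuming {@code characteristics} and a
* {@code requestBuilder} ({@link CaptureRequest.Builder}) already exist:</p>
* <pre><code>
* Rational step = characteristics.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_STEP);
* Range<Integer> range = characteristics.get(CameraCharacteristics.CONTROL_AE_COMPENSATION_RANGE);
* double desiredEv = -1.0;  // darken the image by one EV
* int index = (int) Math.round(desiredEv * step.getDenominator() / step.getNumerator());
* requestBuilder.set(CaptureRequest.CONTROL_AE_EXPOSURE_COMPENSATION, range.clamp(index));
* </code></pre>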
* * @see CaptureRequest#CONTROL_AE_EXPOSURE_COMPENSATION */ @PublicKey public static final KeyList of auto-focus (AF) modes for {@link CaptureRequest#CONTROL_AF_MODE android.control.afMode} that are * supported by this camera device.
*Not all the auto-focus modes may be supported by a * given camera device. This entry lists the valid modes for * {@link CaptureRequest#CONTROL_AF_MODE android.control.afMode} for this camera device.
* All LIMITED and FULL mode camera devices will support OFF mode, and all
* camera devices with adjustable focuser units
* ({@link CameraCharacteristics#LENS_INFO_MINIMUM_FOCUS_DISTANCE android.lens.info.minimumFocusDistance} > 0) will support AUTO mode.
* LEGACY devices will support OFF mode only if they support
* focusing to infinity (by also setting {@link CaptureRequest#LENS_FOCUS_DISTANCE android.lens.focusDistance} to 0.0f).
Range of valid values:
* Any value listed in {@link CaptureRequest#CONTROL_AF_MODE android.control.afMode}
This key is available on all devices.
* * @see CaptureRequest#CONTROL_AF_MODE * @see CaptureRequest#LENS_FOCUS_DISTANCE * @see CameraCharacteristics#LENS_INFO_MINIMUM_FOCUS_DISTANCE */ @PublicKey public static final KeyList of color effects for {@link CaptureRequest#CONTROL_EFFECT_MODE android.control.effectMode} that are supported by this camera * device.
*This list contains the color effect modes that can be applied to * images produced by the camera device. * Implementations are not expected to be consistent across all devices. * If no color effect modes are available for a device, this will only list * OFF.
*A color effect will only be applied if * {@link CaptureRequest#CONTROL_MODE android.control.mode} != OFF. OFF is always included in this list.
*This control has no effect on the operation of other control routines such * as auto-exposure, white balance, or focus.
*Range of valid values:
* Any value listed in {@link CaptureRequest#CONTROL_EFFECT_MODE android.control.effectMode}
This key is available on all devices.
* * @see CaptureRequest#CONTROL_EFFECT_MODE * @see CaptureRequest#CONTROL_MODE */ @PublicKey public static final KeyList of scene modes for {@link CaptureRequest#CONTROL_SCENE_MODE android.control.sceneMode} that are supported by this camera * device.
*This list contains scene modes that can be set for the camera device. * Only scene modes that have been fully implemented for the * camera device may be included here. Implementations are not expected * to be consistent across all devices.
*If no scene modes are supported by the camera device, this * will be set to DISABLED. Otherwise DISABLED will not be listed.
* FACE_PRIORITY is always listed if face detection is
* supported (i.e. {@link CameraCharacteristics#STATISTICS_INFO_MAX_FACE_COUNT android.statistics.info.maxFaceCount} > 0).
Range of valid values:
* Any value listed in {@link CaptureRequest#CONTROL_SCENE_MODE android.control.sceneMode}
This key is available on all devices.
* * @see CaptureRequest#CONTROL_SCENE_MODE * @see CameraCharacteristics#STATISTICS_INFO_MAX_FACE_COUNT */ @PublicKey public static final KeyList of video stabilization modes for {@link CaptureRequest#CONTROL_VIDEO_STABILIZATION_MODE android.control.videoStabilizationMode} * that are supported by this camera device.
*OFF will always be listed.
*Range of valid values:
* Any value listed in {@link CaptureRequest#CONTROL_VIDEO_STABILIZATION_MODE android.control.videoStabilizationMode}
This key is available on all devices.
* * @see CaptureRequest#CONTROL_VIDEO_STABILIZATION_MODE */ @PublicKey public static final KeyList of auto-white-balance modes for {@link CaptureRequest#CONTROL_AWB_MODE android.control.awbMode} that are supported by this * camera device.
*Not all the auto-white-balance modes may be supported by a * given camera device. This entry lists the valid modes for * {@link CaptureRequest#CONTROL_AWB_MODE android.control.awbMode} for this camera device.
*All camera devices will support ON mode.
*Camera devices that support the MANUAL_POST_PROCESSING capability will always support OFF * mode, which enables application control of white balance, by using * {@link CaptureRequest#COLOR_CORRECTION_TRANSFORM android.colorCorrection.transform} and {@link CaptureRequest#COLOR_CORRECTION_GAINS android.colorCorrection.gains}({@link CaptureRequest#COLOR_CORRECTION_MODE android.colorCorrection.mode} must be set to TRANSFORM_MATRIX). This includes all FULL * mode camera devices.
*Range of valid values:
* Any value listed in {@link CaptureRequest#CONTROL_AWB_MODE android.control.awbMode}
This key is available on all devices.
* * @see CaptureRequest#COLOR_CORRECTION_GAINS * @see CaptureRequest#COLOR_CORRECTION_MODE * @see CaptureRequest#COLOR_CORRECTION_TRANSFORM * @see CaptureRequest#CONTROL_AWB_MODE */ @PublicKey public static final KeyList of the maximum number of regions that can be used for metering in * auto-exposure (AE), auto-white balance (AWB), and auto-focus (AF); * this corresponds to the maximum number of elements in * {@link CaptureRequest#CONTROL_AE_REGIONS android.control.aeRegions}, {@link CaptureRequest#CONTROL_AWB_REGIONS android.control.awbRegions}, * and {@link CaptureRequest#CONTROL_AF_REGIONS android.control.afRegions}.
*Range of valid values:
* Value must be >= 0 for each element. For full-capability devices
* this value must be >= 1 for AE and AF. The order of the elements is:
* (AE, AWB, AF).
This key is available on all devices.
* * @see CaptureRequest#CONTROL_AE_REGIONS * @see CaptureRequest#CONTROL_AF_REGIONS * @see CaptureRequest#CONTROL_AWB_REGIONS * @hide */ public static final KeyThe maximum number of metering regions that can be used by the auto-exposure (AE) * routine.
*This corresponds to the maximum allowed number of elements in * {@link CaptureRequest#CONTROL_AE_REGIONS android.control.aeRegions}.
*Range of valid values:
* Value will be >= 0. For FULL-capability devices, this
* value will be >= 1.
This key is available on all devices.
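* <p>A hedged sketch (assuming {@code characteristics}, a {@code requestBuilder}, and region
* coordinates {@code x}, {@code y}, {@code width}, {@code height} are defined elsewhere) of setting
* an AE metering region only when the device supports at least one:</p>
* <pre><code>
* Integer maxAeRegions = characteristics.get(CameraCharacteristics.CONTROL_MAX_REGIONS_AE);
* if (maxAeRegions != null && maxAeRegions >= 1) {
*     MeteringRectangle region = new MeteringRectangle(
*             x, y, width, height, MeteringRectangle.METERING_WEIGHT_MAX);
*     requestBuilder.set(CaptureRequest.CONTROL_AE_REGIONS, new MeteringRectangle[] { region });
* }
* </code></pre>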
* * @see CaptureRequest#CONTROL_AE_REGIONS */ @PublicKey @SyntheticKey public static final KeyThe maximum number of metering regions that can be used by the auto-white balance (AWB) * routine.
*This corresponds to the maximum allowed number of elements in * {@link CaptureRequest#CONTROL_AWB_REGIONS android.control.awbRegions}.
*Range of valid values:
* Value will be >= 0.
This key is available on all devices.
* * @see CaptureRequest#CONTROL_AWB_REGIONS */ @PublicKey @SyntheticKey public static final KeyThe maximum number of metering regions that can be used by the auto-focus (AF) routine.
*This corresponds to the maximum allowed number of elements in * {@link CaptureRequest#CONTROL_AF_REGIONS android.control.afRegions}.
*Range of valid values:
* Value will be >= 0. For FULL-capability devices, this
* value will be >= 1.
This key is available on all devices.
* * @see CaptureRequest#CONTROL_AF_REGIONS */ @PublicKey @SyntheticKey public static final KeyList of available high speed video size, fps range and max batch size configurations * supported by the camera device, in the format of (width, height, fps_min, fps_max, batch_size_max).
*When CONSTRAINED_HIGH_SPEED_VIDEO is supported in {@link CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES android.request.availableCapabilities}, * this metadata will list the supported high speed video size, fps range and max batch size * configurations. All the sizes listed in this configuration will be a subset of the sizes * reported by {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputSizes } * for processed non-stalling formats.
*For the high speed video use case, the application must * select the video size and fps range from this metadata to configure the recording and * preview streams and setup the recording requests. For example, if the application intends * to do high speed recording, it can select the maximum size reported by this metadata to * configure output streams. Once the size is selected, application can filter this metadata * by selected size and get the supported fps ranges, and use these fps ranges to setup the * recording requests. Note that for the use case of multiple output streams, application * must select one unique size from this metadata to use (e.g., preview and recording streams * must have the same size). Otherwise, the high speed capture session creation will fail.
*The min and max fps will be multiples of 30fps.
*High speed video streaming places significant performance pressure on camera hardware; to achieve efficient high speed streaming, the camera device may have to aggregate * multiple frames together for processing, where the request * controls are the same for all the frames in this batch. Max batch size indicates * the max possible number of frames the camera device will group together for this high * speed stream configuration. This max batch size will be used to generate a high speed * recording request list by * {@link android.hardware.camera2.CameraConstrainedHighSpeedCaptureSession#createHighSpeedRequestList }. * The max batch size for each configuration will satisfy the following conditions:
*The camera device doesn't have to support batch mode to achieve high speed video recording; * in that case, batch_size_max will be reported as 1 in each configuration entry.
*The fps ranges in this configuration list can only be used to create requests * that are submitted to a high speed camera capture session created by * {@link android.hardware.camera2.CameraDevice#createConstrainedHighSpeedCaptureSession }. * The fps ranges reported in this metadata must not be used to set up capture requests for a * normal capture session, or it will cause request errors.
*Range of valid values:
For each configuration, the fps_max >= 120fps.
*Optional - This value may be {@code null} on some devices.
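* <p>Since this key is hidden, applications typically query the equivalent information through the
* public {@link android.hardware.camera2.params.StreamConfigurationMap} accessors; a minimal sketch
* (assuming {@code characteristics} is available):</p>
* <pre><code>
* StreamConfigurationMap map =
*         characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
* for (Size size : map.getHighSpeedVideoSizes()) {
*     for (Range<Integer> fpsRange : map.getHighSpeedVideoFpsRangesFor(size)) {
*         Log.d("HighSpeed", size + " supports " + fpsRange);
*     }
* }
* </code></pre>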
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES * @hide */ public static final KeyWhether the camera device supports {@link CaptureRequest#CONTROL_AE_LOCK android.control.aeLock}
*Devices with MANUAL_SENSOR capability or BURST_CAPTURE capability will always
* list true. This includes FULL devices.
This key is available on all devices.
* * @see CaptureRequest#CONTROL_AE_LOCK */ @PublicKey public static final KeyWhether the camera device supports {@link CaptureRequest#CONTROL_AWB_LOCK android.control.awbLock}
*Devices with MANUAL_POST_PROCESSING capability or BURST_CAPTURE capability will
* always list true. This includes FULL devices.
This key is available on all devices.
* * @see CaptureRequest#CONTROL_AWB_LOCK */ @PublicKey public static final KeyList of control modes for {@link CaptureRequest#CONTROL_MODE android.control.mode} that are supported by this camera * device.
*This list contains control modes that can be set for the camera device. * LEGACY mode devices will always support AUTO mode. LIMITED and FULL * devices will always support OFF, AUTO modes.
*Range of valid values:
* Any value listed in {@link CaptureRequest#CONTROL_MODE android.control.mode}
This key is available on all devices.
* * @see CaptureRequest#CONTROL_MODE */ @PublicKey public static final KeyRange of boosts for {@link CaptureRequest#CONTROL_POST_RAW_SENSITIVITY_BOOST android.control.postRawSensitivityBoost} supported * by this camera device.
*Devices that support post RAW sensitivity boost will advertise
* the {@link CaptureRequest#CONTROL_POST_RAW_SENSITIVITY_BOOST android.control.postRawSensitivityBoost} key for controlling
* post RAW sensitivity boost.
*This key will be null for devices that do not support any RAW format
* outputs. For devices that do support RAW format outputs, this key will always
* be present, and if a device does not support post RAW sensitivity boost, it will
* list (100, 100) in this key.
Units: ISO arithmetic units, the same as {@link CaptureRequest#SENSOR_SENSITIVITY android.sensor.sensitivity}
*Optional - This value may be {@code null} on some devices.
* * @see CaptureRequest#CONTROL_POST_RAW_SENSITIVITY_BOOST * @see CaptureRequest#SENSOR_SENSITIVITY */ @PublicKey public static final KeyList of edge enhancement modes for {@link CaptureRequest#EDGE_MODE android.edge.mode} that are supported by this camera * device.
*Full-capability camera devices must always support OFF; camera devices that support * YUV_REPROCESSING or PRIVATE_REPROCESSING will list ZERO_SHUTTER_LAG; all devices will * list FAST.
*Range of valid values:
* Any value listed in {@link CaptureRequest#EDGE_MODE android.edge.mode}
Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CaptureRequest#EDGE_MODE * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL */ @PublicKey public static final KeyWhether this camera device has a * flash unit.
*Will be false if no flash is available.
If there is no flash unit, none of the flash controls do * anything. * This key is available on all devices.
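* <p>An illustrative sketch (assuming {@code characteristics} and a {@code requestBuilder} exist)
* of enabling the torch only when a flash unit is present:</p>
* <pre><code>
* Boolean hasFlash = characteristics.get(CameraCharacteristics.FLASH_INFO_AVAILABLE);
* if (Boolean.TRUE.equals(hasFlash)) {
*     requestBuilder.set(CaptureRequest.FLASH_MODE, CameraMetadata.FLASH_MODE_TORCH);
* }
* </code></pre>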
*/ @PublicKey public static final KeyList of hot pixel correction modes for {@link CaptureRequest#HOT_PIXEL_MODE android.hotPixel.mode} that are supported by this * camera device.
*FULL mode camera devices will always support FAST.
*Range of valid values:
* Any value listed in {@link CaptureRequest#HOT_PIXEL_MODE android.hotPixel.mode}
Optional - This value may be {@code null} on some devices.
* * @see CaptureRequest#HOT_PIXEL_MODE */ @PublicKey public static final KeyList of JPEG thumbnail sizes for {@link CaptureRequest#JPEG_THUMBNAIL_SIZE android.jpeg.thumbnailSize} supported by this * camera device.
*This list will include at least one non-zero resolution, plus (0,0) for indicating that no
* thumbnail should be generated.
*The following conditions will be satisfied for this size list:
*All non-(0, 0) sizes will have non-zero widths and heights.
* This key is available on all devices.List of aperture size values for {@link CaptureRequest#LENS_APERTURE android.lens.aperture} that are * supported by this camera device.
*If the camera device doesn't support a variable lens aperture, * this list will contain only one value, which is the fixed aperture size.
*If the camera device supports a variable aperture, the aperture values * in this list will be sorted in ascending order.
*Units: The aperture f-number
*Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#LENS_APERTURE */ @PublicKey public static final KeyList of neutral density filter values for * {@link CaptureRequest#LENS_FILTER_DENSITY android.lens.filterDensity} that are supported by this camera device.
*If a neutral density filter is not supported by this camera device, * this list will contain only 0. Otherwise, this list will include every * filter density supported by the camera device, in ascending order.
*Units: Exposure value (EV)
*Range of valid values:
Values are >= 0
*Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#LENS_FILTER_DENSITY */ @PublicKey public static final KeyList of focal lengths for {@link CaptureRequest#LENS_FOCAL_LENGTH android.lens.focalLength} that are supported by this camera * device.
*If optical zoom is not supported, this list will only contain * a single value corresponding to the fixed focal length of the * device. Otherwise, this list will include every focal length supported * by the camera device, in ascending order.
*Units: Millimeters
*Range of valid values:
Values are > 0
*This key is available on all devices.
* * @see CaptureRequest#LENS_FOCAL_LENGTH */ @PublicKey public static final KeyList of optical image stabilization (OIS) modes for * {@link CaptureRequest#LENS_OPTICAL_STABILIZATION_MODE android.lens.opticalStabilizationMode} that are supported by this camera device.
*If OIS is not supported by a given camera device, this list will * contain only OFF.
*Range of valid values:
* Any value listed in {@link CaptureRequest#LENS_OPTICAL_STABILIZATION_MODE android.lens.opticalStabilizationMode}
Optional - This value may be {@code null} on some devices.
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#LENS_OPTICAL_STABILIZATION_MODE */ @PublicKey public static final KeyHyperfocal distance for this lens.
*If the lens is not fixed focus, the camera device will report this * field when {@link CameraCharacteristics#LENS_INFO_FOCUS_DISTANCE_CALIBRATION android.lens.info.focusDistanceCalibration} is APPROXIMATE or CALIBRATED.
*Units: See {@link CameraCharacteristics#LENS_INFO_FOCUS_DISTANCE_CALIBRATION android.lens.info.focusDistanceCalibration} for details
*Range of valid values:
* If lens is fixed focus, >= 0. If lens has focuser unit, the value is
* within (0.0f, {@link CameraCharacteristics#LENS_INFO_MINIMUM_FOCUS_DISTANCE android.lens.info.minimumFocusDistance}]
Optional - This value may be {@code null} on some devices.
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CameraCharacteristics#LENS_INFO_FOCUS_DISTANCE_CALIBRATION * @see CameraCharacteristics#LENS_INFO_MINIMUM_FOCUS_DISTANCE */ @PublicKey public static final KeyShortest distance from frontmost surface * of the lens that can be brought into sharp focus.
*If the lens is fixed-focus, this will be * 0.
*Units: See {@link CameraCharacteristics#LENS_INFO_FOCUS_DISTANCE_CALIBRATION android.lens.info.focusDistanceCalibration} for details
*Range of valid values:
* >= 0
Optional - This value may be {@code null} on some devices.
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CameraCharacteristics#LENS_INFO_FOCUS_DISTANCE_CALIBRATION */ @PublicKey public static final KeyDimensions of lens shading map.
*The map should be on the order of 30-40 rows and columns, and * must be smaller than 64x64.
*Range of valid values:
* Both values >= 1
Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @hide */ public static final KeyThe lens focus distance calibration quality.
*The lens focus distance calibration quality determines the reliability of * focus related metadata entries, i.e. {@link CaptureRequest#LENS_FOCUS_DISTANCE android.lens.focusDistance}, * {@link CaptureResult#LENS_FOCUS_RANGE android.lens.focusRange}, {@link CameraCharacteristics#LENS_INFO_HYPERFOCAL_DISTANCE android.lens.info.hyperfocalDistance}, and * {@link CameraCharacteristics#LENS_INFO_MINIMUM_FOCUS_DISTANCE android.lens.info.minimumFocusDistance}.
*APPROXIMATE and CALIBRATED devices report the focus metadata in
* units of diopters (1/meter), so 0.0f represents focusing at infinity,
* and increasing positive numbers represent focusing closer and closer
* to the camera device. The focus distance control also uses diopters
* on these devices.
*UNCALIBRATED devices do not use units that are directly comparable
* to any real physical measurement, but 0.0f still represents farthest
* focus, and {@link CameraCharacteristics#LENS_INFO_MINIMUM_FOCUS_DISTANCE android.lens.info.minimumFocusDistance} represents the
* nearest focus the device can achieve.
*Possible values:
Optional - This value may be {@code null} on some devices.
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#LENS_FOCUS_DISTANCE * @see CaptureResult#LENS_FOCUS_RANGE * @see CameraCharacteristics#LENS_INFO_HYPERFOCAL_DISTANCE * @see CameraCharacteristics#LENS_INFO_MINIMUM_FOCUS_DISTANCE * @see #LENS_INFO_FOCUS_DISTANCE_CALIBRATION_UNCALIBRATED * @see #LENS_INFO_FOCUS_DISTANCE_CALIBRATION_APPROXIMATE * @see #LENS_INFO_FOCUS_DISTANCE_CALIBRATION_CALIBRATED */ @PublicKey public static final KeyDirection the camera faces relative to * device screen.
*Possible values: *
This key is available on all devices.
* @see #LENS_FACING_FRONT * @see #LENS_FACING_BACK * @see #LENS_FACING_EXTERNAL */ @PublicKey public static final KeyThe orientation of the camera relative to the sensor * coordinate system.
*The four coefficients that describe the quaternion * rotation from the Android sensor coordinate system to a * camera-aligned coordinate system where the X-axis is * aligned with the long side of the image sensor, the Y-axis * is aligned with the short side of the image sensor, and * the Z-axis is aligned with the optical axis of the sensor.
*To convert from the quaternion coefficients (x, y, z, w)
* to the axis of rotation (a_x, a_y, a_z) and rotation
* amount theta, the following formulas can be used:
* theta = 2 * acos(w)
* a_x = x / sin(theta/2)
* a_y = y / sin(theta/2)
* a_z = z / sin(theta/2)
*To create a 3x3 rotation matrix that applies the rotation
* defined by this quaternion, the following matrix can be used:
* R = [ 1 - 2y^2 - 2z^2,       2xy - 2zw,       2xz + 2yw,
*             2xy + 2zw, 1 - 2x^2 - 2z^2,       2yz - 2xw,
*             2xz - 2yw,       2yz + 2xw, 1 - 2x^2 - 2y^2 ]
*This matrix can then be used to apply the rotation to a
* column vector point with
* p' = Rp
* where p is in the device sensor coordinate system, and
* p' is in the camera-oriented coordinate system.
*Units: Quaternion coefficients
*Optional - This value may be {@code null} on some devices.
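* <p>A minimal sketch (not part of the original documentation) of expanding the quaternion returned
* by this key into the 3x3 rotation matrix shown above, assuming the key is non-null:</p>
* <pre><code>
* float[] q = characteristics.get(CameraCharacteristics.LENS_POSE_ROTATION);  // {x, y, z, w}
* float x = q[0], y = q[1], z = q[2], w = q[3];
* float[] rotation = {
*     1 - 2*y*y - 2*z*z, 2*x*y - 2*z*w,     2*x*z + 2*y*w,
*     2*x*y + 2*z*w,     1 - 2*x*x - 2*z*z, 2*y*z - 2*x*w,
*     2*x*z - 2*y*w,     2*y*z + 2*x*w,     1 - 2*x*x - 2*y*y
* };
* </code></pre>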
*/ @PublicKey public static final KeyPosition of the camera optical center.
*The position of the camera device's lens optical center,
* as a three-dimensional vector (x, y, z), relative to the
* optical center of the largest camera device facing in the
* same direction as this camera, in the {@link android.hardware.SensorEvent Android sensor coordinate
* axes}. Note that only the axis definitions are shared with
* the sensor coordinate system, but not the origin.
*If this device is the largest or only camera device with a
* given facing, then this position will be (0, 0, 0); a
* camera device with a lens optical center located 3 cm from
* the main sensor along the +X axis (to the right from the
* user's perspective) will report (0.03, 0, 0).
*To transform pixel coordinates between two cameras
* facing the same direction, first the source camera
* {@link CameraCharacteristics#LENS_RADIAL_DISTORTION android.lens.radialDistortion} must be corrected for. Then
* the source camera {@link CameraCharacteristics#LENS_INTRINSIC_CALIBRATION android.lens.intrinsicCalibration} needs
* to be applied, followed by the {@link CameraCharacteristics#LENS_POSE_ROTATION android.lens.poseRotation}
* of the source camera, the translation of the source camera
* relative to the destination camera, the
* {@link CameraCharacteristics#LENS_POSE_ROTATION android.lens.poseRotation} of the destination camera, and
* finally the inverse of {@link CameraCharacteristics#LENS_INTRINSIC_CALIBRATION android.lens.intrinsicCalibration}
* of the destination camera. This obtains a
* radial-distortion-free coordinate in the destination
* camera pixel coordinates.
*To compare this against a real image from the destination * camera, the destination camera image then needs to be * corrected for radial distortion before comparison or * sampling.
*Units: Meters
*Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#LENS_INTRINSIC_CALIBRATION * @see CameraCharacteristics#LENS_POSE_ROTATION * @see CameraCharacteristics#LENS_RADIAL_DISTORTION */ @PublicKey public static final KeyThe parameters for this camera device's intrinsic * calibration.
*The five calibration parameters that describe the
* transform from camera-centric 3D coordinates to sensor
* pixel coordinates:
* [f_x, f_y, c_x, c_y, s]
* Where f_x and f_y are the horizontal and vertical
* focal lengths, [c_x, c_y] is the position of the optical
* axis, and s is a skew parameter for the sensor plane not
* being aligned with the lens plane.
*These are typically used within a transformation matrix K:
* K = [ f_x,   s, c_x,
*         0, f_y, c_y,
*         0,   0,   1 ]
* which can then be combined with the camera pose rotation
* R and translation t ({@link CameraCharacteristics#LENS_POSE_ROTATION android.lens.poseRotation} and
* {@link CameraCharacteristics#LENS_POSE_TRANSLATION android.lens.poseTranslation}, respectively) to calculate the
* complete transform from world coordinates to pixel
* coordinates:
* P = [ K 0   * [ R t
*       0 1 ]     0 1 ]
* and with p_w being a point in the world coordinate system
* and p_s being a point in the camera active pixel array
* coordinate system, and with the mapping including the
* homogeneous division by z:
* p_h = (x_h, y_h, z_h) = P p_w
* p_s = p_h / z_h
* so [x_s, y_s] is the pixel coordinates of the world
* point, z_s = 1, and w_s is a measurement of disparity
* (depth) in pixel coordinates.
*Note that the coordinate system for this transform is the
* {@link CameraCharacteristics#SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE android.sensor.info.preCorrectionActiveArraySize} system,
* where (0,0) is the top-left of the
* preCorrectionActiveArraySize rectangle. Once the pose and
* intrinsic calibration transforms have been applied to a
* world point, then the {@link CameraCharacteristics#LENS_RADIAL_DISTORTION android.lens.radialDistortion}
* transform needs to be applied, and the result adjusted to
* be in the {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize} coordinate
* system (where (0, 0) is the top-left of the
* activeArraySize rectangle), to determine the final pixel
* coordinate of the world point for processed (non-RAW)
* output buffers.
*Units: Pixels in the
* {@link CameraCharacteristics#SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE android.sensor.info.preCorrectionActiveArraySize}
* coordinate system.
*Optional - This value may be {@code null} on some devices.
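* <p>A hedged sketch (assuming {@code characteristics} is available, this key is non-null, and
* {@code X}, {@code Y}, {@code Z} are the camera-space coordinates of a point) of projecting that
* point to pre-correction pixel coordinates with the intrinsic parameters:</p>
* <pre><code>
* float[] k = characteristics.get(CameraCharacteristics.LENS_INTRINSIC_CALIBRATION);
* float fx = k[0], fy = k[1], cx = k[2], cy = k[3], s = k[4];
* // Homogeneous projection followed by division by z.
* float xPix = (fx * X + s * Y) / Z + cx;
* float yPix = (fy * Y) / Z + cy;
* </code></pre>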
* * @see CameraCharacteristics#LENS_POSE_ROTATION * @see CameraCharacteristics#LENS_POSE_TRANSLATION * @see CameraCharacteristics#LENS_RADIAL_DISTORTION * @see CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE * @see CameraCharacteristics#SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE */ @PublicKey public static final KeyThe correction coefficients to correct for this camera device's * radial and tangential lens distortion.
*Four radial distortion coefficients [kappa_0, kappa_1, kappa_2,
* kappa_3] and two tangential distortion coefficients
* [kappa_4, kappa_5] that can be used to correct the
* lens's geometric distortion with the mapping equations:
* x_c = x_i * ( kappa_0 + kappa_1 * r^2 + kappa_2 * r^4 + kappa_3 * r^6 ) +
*       kappa_4 * (2 * x_i * y_i) + kappa_5 * ( r^2 + 2 * x_i^2 )
* y_c = y_i * ( kappa_0 + kappa_1 * r^2 + kappa_2 * r^4 + kappa_3 * r^6 ) +
*       kappa_5 * (2 * x_i * y_i) + kappa_4 * ( r^2 + 2 * y_i^2 )
* Here, [x_c, y_c] are the coordinates to sample in the
* input image that correspond to the pixel values in the
* corrected image at the coordinate [x_i, y_i]:
* correctedImage(x_i, y_i) = sample_at(x_c, y_c, inputImage)
* The pixel coordinates are defined in a normalized
* coordinate system related to the
* {@link CameraCharacteristics#LENS_INTRINSIC_CALIBRATION android.lens.intrinsicCalibration} calibration fields.
* Both [x_i, y_i] and [x_c, y_c] have (0,0) at the
* lens optical center [c_x, c_y]. The maximum magnitudes
* of both x and y coordinates are normalized to be 1 at the
* edge further from the optical center, so the range
* for both dimensions is -1 <= x <= 1.
*Finally, r represents the radial distance from the
* optical center, r^2 = x_i^2 + y_i^2, and its magnitude
* is therefore no larger than |r| <= sqrt(2).
*The distortion model used is the Brown-Conrady model.
*Units: Unitless coefficients.
*Optional - This value may be {@code null} on some devices.
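* <p>A minimal sketch (assuming {@code characteristics} is available, this key is non-null, and
* {@code xi}, {@code yi} are normalized coordinates as described above) that evaluates the mapping
* equations for one pixel:</p>
* <pre><code>
* float[] d = characteristics.get(CameraCharacteristics.LENS_RADIAL_DISTORTION);
* float k0 = d[0], k1 = d[1], k2 = d[2], k3 = d[3], k4 = d[4], k5 = d[5];
* float r2 = xi * xi + yi * yi;
* float radial = k0 + k1 * r2 + k2 * r2 * r2 + k3 * r2 * r2 * r2;
* float xc = xi * radial + k4 * (2 * xi * yi) + k5 * (r2 + 2 * xi * xi);
* float yc = yi * radial + k5 * (2 * xi * yi) + k4 * (r2 + 2 * yi * yi);
* </code></pre>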
* * @see CameraCharacteristics#LENS_INTRINSIC_CALIBRATION */ @PublicKey public static final KeyList of noise reduction modes for {@link CaptureRequest#NOISE_REDUCTION_MODE android.noiseReduction.mode} that are supported * by this camera device.
*Full-capability camera devices will always support OFF and FAST.
*Camera devices that support YUV_REPROCESSING or PRIVATE_REPROCESSING will support * ZERO_SHUTTER_LAG.
*Legacy-capability camera devices will only support FAST mode.
*Range of valid values:
* Any value listed in {@link CaptureRequest#NOISE_REDUCTION_MODE android.noiseReduction.mode}
Optional - This value may be {@code null} on some devices.
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#NOISE_REDUCTION_MODE */ @PublicKey public static final KeyIf set to 1, the HAL will always split result * metadata for a single capture into multiple buffers, * returned using multiple process_capture_result calls.
*Does not need to be listed in static * metadata. Support for partial results will be reworked in * future versions of camera service. This quirk will stop * working at that point; DO NOT USE without careful * consideration of future support.
*Optional - This value may be {@code null} on some devices.
* @deprecated * @hide */ @Deprecated public static final KeyThe maximum numbers of different types of output streams * that can be configured and used simultaneously by a camera device.
*This is a 3 element tuple that contains the max number of output simultaneous
* streams for raw sensor, processed (but not stalling), and processed (and stalling)
* formats respectively. For example, assuming that JPEG is typically a processed and
* stalling stream, if the max raw sensor format output stream number is 1, the max YUV streams
* number is 3, and the max JPEG stream number is 2, then this tuple should be (1, 3, 2).
*This lists the upper bound of the number of output streams supported by * the camera device. Using more streams simultaneously may require more hardware and * CPU resources that will consume more power. The image format for an output stream can * be any supported format provided by android.scaler.availableStreamConfigurations. * The formats defined in android.scaler.availableStreamConfigurations can be categorized * into the 3 stream types as below:
*Range of valid values:
For processed (and stalling) format streams, >= 1.
*For Raw format (either stalling or non-stalling) streams, >= 0.
*For processed (but not stalling) format streams, >= 3
* for FULL mode devices ({@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} == FULL);
* >= 2 for LIMITED mode devices ({@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} == LIMITED).
This key is available on all devices.
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @hide */ public static final KeyThe maximum numbers of different types of output streams
* that can be configured and used simultaneously by a camera device
* for any RAW formats.
*This value contains the max number of output simultaneous * streams from the raw sensor.
*This lists the upper bound of the number of output streams supported by
* the camera device. Using more streams simultaneously may require more hardware and
* CPU resources that will consume more power. The image format for this kind of output stream can
* be any RAW and supported format provided by {@link CameraCharacteristics#SCALER_STREAM_CONFIGURATION_MAP android.scaler.streamConfigurationMap}.
*In particular, a RAW format is typically one of:
*LEGACY mode devices ({@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} == LEGACY)
* never support raw streams.
Range of valid values:
>= 0
*This key is available on all devices.
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CameraCharacteristics#SCALER_STREAM_CONFIGURATION_MAP */ @PublicKey @SyntheticKey public static final KeyThe maximum numbers of different types of output streams * that can be configured and used simultaneously by a camera device * for any processed (but not-stalling) formats.
*This value contains the max number of output simultaneous * streams for any processed (but not-stalling) formats.
*This lists the upper bound of the number of output streams supported by
* the camera device. Using more streams simultaneously may require more hardware and
* CPU resources that will consume more power. The image format for this kind of output stream can
* be any non-RAW and supported format provided by {@link CameraCharacteristics#SCALER_STREAM_CONFIGURATION_MAP android.scaler.streamConfigurationMap}.
*Processed (but not-stalling) is defined as any non-RAW format without a stall duration. * Typically:
*For full guarantees, query {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration } with a * processed format -- it will return 0 for a non-stalling stream.
*LEGACY devices will support at least 2 processing/non-stalling streams.
*Range of valid values:
>= 3 for FULL mode devices ({@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} == FULL);
* >= 2 for LIMITED mode devices ({@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} == LIMITED).
This key is available on all devices.
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CameraCharacteristics#SCALER_STREAM_CONFIGURATION_MAP */ @PublicKey @SyntheticKey public static final KeyThe maximum numbers of different types of output streams * that can be configured and used simultaneously by a camera device * for any processed (and stalling) formats.
*This value contains the max number of output simultaneous * streams for any processed (and stalling) formats.
*This lists the upper bound of the number of output streams supported by
* the camera device. Using more streams simultaneously may require more hardware and
* CPU resources that will consume more power. The image format for this kind of output stream can
* be any non-RAW and supported format provided by {@link CameraCharacteristics#SCALER_STREAM_CONFIGURATION_MAP android.scaler.streamConfigurationMap}.
A processed and stalling format is defined as any non-RAW format with a stallDurations * > 0. Typically only the {@link android.graphics.ImageFormat#JPEG JPEG format} is a * stalling format.
*For full guarantees, query {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration } with a * processed format -- it will return a non-0 value for a stalling stream.
*LEGACY devices will support up to 1 processing/stalling stream.
*Range of valid values:
>= 1
*This key is available on all devices.
* * @see CameraCharacteristics#SCALER_STREAM_CONFIGURATION_MAP */ @PublicKey @SyntheticKey public static final KeyThe maximum numbers of any type of input streams * that can be configured and used simultaneously by a camera device.
*When set to 0, it means no input stream is supported.
*The image format for an input stream can be any supported format returned by {@link android.hardware.camera2.params.StreamConfigurationMap#getInputFormats }. When using an * input stream, there must be at least one output stream configured to receive the * reprocessed images.
*When an input stream and some output streams are used in a reprocessing request, * only the input buffer will be used to produce these output stream buffers, and a * new sensor image will not be captured.
*For example, for Zero Shutter Lag (ZSL) still capture use case, the input * stream image format will be PRIVATE, the associated output stream image format * should be JPEG.
*Range of valid values:
0 or 1.
*Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL */ @PublicKey public static final KeySpecifies the number of maximum pipeline stages a frame * has to go through from when it's exposed to when it's available * to the framework.
*A typical minimum value for this is 2 (one stage to expose, * one stage to readout) from the sensor. The ISP then usually adds * its own stages to do custom HW processing. Further stages may be * added by SW processing.
*Depending on what settings are used (e.g. YUV, JPEG) and what * processing is enabled (e.g. face detection), the actual pipeline * depth (specified by {@link CaptureResult#REQUEST_PIPELINE_DEPTH android.request.pipelineDepth}) may be less than * the max pipeline depth.
*A pipeline depth of X stages is equivalent to a pipeline latency of * X frame intervals.
*This value will normally be 8 or less; however, for a high speed capture session, * the max pipeline depth will be up to 8 x the size of the high speed capture request list.
*This key is available on all devices.
* * @see CaptureResult#REQUEST_PIPELINE_DEPTH */ @PublicKey public static final KeyDefines how many sub-components * a result will be composed of.
*In order to combat the pipeline latency, partial results * may be delivered to the application layer from the camera device as * soon as they are available.
*Optional; defaults to 1. A value of 1 means that partial * results are not supported, and only the final TotalCaptureResult will * be produced by the camera device.
*A typical use case for this might be: after requesting an * auto-focus (AF) lock the new AF state might be available 50% * of the way through the pipeline. The camera device could * then immediately dispatch this state via a partial result to * the application, and the rest of the metadata via later * partial results.
*Range of valid values:
* >= 1
Optional - This value may be {@code null} on some devices.
*/ @PublicKey public static final KeyList of capabilities that this camera device * advertises as fully supporting.
*A capability is a contract that the camera device makes in order * to be able to satisfy one or more use cases.
*Listing a capability guarantees that the whole set of features * required to support a common use will all be available.
*Using a subset of the functionality provided by an unsupported * capability may be possible on a specific camera device implementation; * to do this query each of android.request.availableRequestKeys, * android.request.availableResultKeys, * android.request.availableCharacteristicsKeys.
*The following capabilities are guaranteed to be available on
* {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} == FULL devices:
Other capabilities may be available on either FULL or LIMITED * devices, but the application should query this key to be sure.
*Possible values: *
This key is available on all devices.
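* <p>An illustrative sketch (assuming {@code characteristics} is available) of checking whether the
* RAW capability is advertised before configuring a RAW output stream:</p>
* <pre><code>
* int[] capabilities = characteristics.get(CameraCharacteristics.REQUEST_AVAILABLE_CAPABILITIES);
* boolean supportsRaw = false;
* for (int capability : capabilities) {
*     if (capability == CameraMetadata.REQUEST_AVAILABLE_CAPABILITIES_RAW) {
*         supportsRaw = true;
*         break;
*     }
* }
* </code></pre>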
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see #REQUEST_AVAILABLE_CAPABILITIES_BACKWARD_COMPATIBLE * @see #REQUEST_AVAILABLE_CAPABILITIES_MANUAL_SENSOR * @see #REQUEST_AVAILABLE_CAPABILITIES_MANUAL_POST_PROCESSING * @see #REQUEST_AVAILABLE_CAPABILITIES_RAW * @see #REQUEST_AVAILABLE_CAPABILITIES_PRIVATE_REPROCESSING * @see #REQUEST_AVAILABLE_CAPABILITIES_READ_SENSOR_SETTINGS * @see #REQUEST_AVAILABLE_CAPABILITIES_BURST_CAPTURE * @see #REQUEST_AVAILABLE_CAPABILITIES_YUV_REPROCESSING * @see #REQUEST_AVAILABLE_CAPABILITIES_DEPTH_OUTPUT * @see #REQUEST_AVAILABLE_CAPABILITIES_CONSTRAINED_HIGH_SPEED_VIDEO */ @PublicKey public static final KeyA list of all keys that the camera device has available * to use with {@link android.hardware.camera2.CaptureRequest }.
*Attempting to set a key into a CaptureRequest that is not * listed here will result in an invalid request and will be rejected * by the camera device.
*This field can be used to query the feature set of a camera device * at a more granular level than capabilities. This is especially * important for optional keys that are not listed under any capability * in {@link CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES android.request.availableCapabilities}.
*This key is available on all devices.
* * @see CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES * @hide */ public static final KeyA list of all keys that the camera device has available * to use with {@link android.hardware.camera2.CaptureResult }.
*Attempting to get a key from a CaptureResult that is not
* listed here will always return a null value. Getting a key from
* a CaptureResult that is listed here will generally never return a null
* value.
*The following keys may return null unless they are enabled:
(Those sometimes-null keys will nevertheless be listed here * if they are available.)
*This field can be used to query the feature set of a camera device * at a more granular level than capabilities. This is especially * important for optional keys that are not listed under any capability * in {@link CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES android.request.availableCapabilities}.
*This key is available on all devices.
* * @see CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES * @see CaptureRequest#STATISTICS_LENS_SHADING_MAP_MODE * @hide */ public static final KeyA list of all keys that the camera device has available * to use with {@link android.hardware.camera2.CameraCharacteristics }.
*This entry follows the same rules as * android.request.availableResultKeys (except that it applies for * CameraCharacteristics instead of CaptureResult). See above for more * details.
*This key is available on all devices.
* @hide */ public static final KeyThe list of image formats that are supported by this * camera device for output streams.
*All camera devices will support JPEG and YUV_420_888 formats.
*When set to YUV_420_888, application can access the YUV420 data directly.
*Optional - This value may be {@code null} on some devices.
* @deprecated * @hide */ @Deprecated public static final KeyThe minimum frame duration that is supported * for each resolution in android.scaler.availableJpegSizes.
*This corresponds to the minimum steady-state frame duration when only * that JPEG stream is active and captured in a burst, with all * processing (typically in android.*.mode) set to FAST.
*When multiple streams are configured, the minimum * frame duration will be >= max(individual stream min * durations)
*Units: Nanoseconds
*Range of valid values:
* TODO: Remove property.
Optional - This value may be {@code null} on some devices.
* @deprecated * @hide */ @Deprecated public static final KeyThe JPEG resolutions that are supported by this camera device.
*The resolutions are listed as (width, height) pairs. All camera devices will support the
* sensor maximum resolution (defined by {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize}).
Range of valid values:
* TODO: Remove property.
Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE * @deprecated * @hide */ @Deprecated public static final KeyThe maximum ratio between both active area width * and crop region width, and active area height and * crop region height, for {@link CaptureRequest#SCALER_CROP_REGION android.scaler.cropRegion}.
*This represents the maximum amount of zooming possible by * the camera device, or equivalently, the minimum cropping * window size.
*Crop regions that have a width or height that is smaller * than this ratio allows will be rounded up to the minimum * allowed size by the camera device.
*Units: Zoom scale factor
*Range of valid values:
* >=1
This key is available on all devices.
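* <p>A hedged sketch (assuming {@code characteristics} and a {@code requestBuilder} exist) of
* computing a centered crop region for a requested zoom factor, clamped to this maximum:</p>
* <pre><code>
* Float maxZoom = characteristics.get(CameraCharacteristics.SCALER_AVAILABLE_MAX_DIGITAL_ZOOM);
* Rect active = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
* float zoom = Math.min(2.0f, maxZoom);  // ask for 2x, but never exceed the device limit
* int cropW = (int) (active.width() / zoom);
* int cropH = (int) (active.height() / zoom);
* int left = active.left + (active.width() - cropW) / 2;
* int top = active.top + (active.height() - cropH) / 2;
* requestBuilder.set(CaptureRequest.SCALER_CROP_REGION,
*         new Rect(left, top, left + cropW, top + cropH));
* </code></pre>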
* * @see CaptureRequest#SCALER_CROP_REGION */ @PublicKey public static final KeyFor each available processed output size (defined in * android.scaler.availableProcessedSizes), this property lists the * minimum supportable frame duration for that size.
*This should correspond to the frame duration when only that processed * stream is active, with all processing (typically in android.*.mode) * set to FAST.
*When multiple streams are configured, the minimum frame duration will * be >= max(individual stream min durations).
*Units: Nanoseconds
*Optional - This value may be {@code null} on some devices.
* @deprecated * @hide */ @Deprecated public static final KeyThe resolutions available for use with * processed output streams, such as YV12, NV12, and * platform opaque YUV/RGB streams to the GPU or video * encoders.
*The resolutions are listed as {@code (width, height)} pairs.
For a given use case, the actual maximum supported resolution * may be lower than what is listed here, depending on the destination * Surface for the image data. For example, for recording video, * the video encoder chosen may have a maximum size limit (e.g. 1080p) * smaller than what the camera (e.g. maximum resolution is 3264x2448) * can provide.
*Please reference the documentation for the image data destination to * check if it limits the maximum size for image data.
*Optional - This value may be {@code null} on some devices.
* @deprecated * @hide */ @Deprecated public static final KeyThe mapping of image formats that are supported by this * camera device for input streams, to their corresponding output formats.
*All camera devices with at least 1 * {@link CameraCharacteristics#REQUEST_MAX_NUM_INPUT_STREAMS android.request.maxNumInputStreams} will have at least one * available input format.
*The camera device will support the following map of formats, * if its dependent capability ({@link CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES android.request.availableCapabilities}) is supported:
* | Input Format                                      | Output Format                                      | Capability           |
* |----------------------------------------------------|-----------------------------------------------------|----------------------|
* | {@link android.graphics.ImageFormat#PRIVATE }      | {@link android.graphics.ImageFormat#JPEG }          | PRIVATE_REPROCESSING |
* | {@link android.graphics.ImageFormat#PRIVATE }      | {@link android.graphics.ImageFormat#YUV_420_888 }   | PRIVATE_REPROCESSING |
* | {@link android.graphics.ImageFormat#YUV_420_888 }  | {@link android.graphics.ImageFormat#JPEG }          | YUV_REPROCESSING     |
* | {@link android.graphics.ImageFormat#YUV_420_888 }  | {@link android.graphics.ImageFormat#YUV_420_888 }   | YUV_REPROCESSING     |
PRIVATE refers to a device-internal format that is not directly application-visible. A * PRIVATE input surface can be acquired by {@link android.media.ImageReader#newInstance } * with {@link android.graphics.ImageFormat#PRIVATE } as the format.
*For a PRIVATE_REPROCESSING-capable camera device, using the PRIVATE format as either input * or output will never hurt maximum frame rate (i.e. {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration getOutputStallDuration(ImageFormat.PRIVATE, size)} is always 0).
*Attempting to configure an input stream with output streams not * listed as available in this map is not valid.
*Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES * @see CameraCharacteristics#REQUEST_MAX_NUM_INPUT_STREAMS * @hide */ public static final KeyThe available stream configurations that this * camera device supports * (i.e. format, width, height, output/input stream).
*The configurations are listed as {@code (format, width, height, input?)} tuples.
For a given use case, the actual maximum supported resolution * may be lower than what is listed here, depending on the destination * Surface for the image data. For example, for recording video, * the video encoder chosen may have a maximum size limit (e.g. 1080p) * smaller than what the camera (e.g. maximum resolution is 3264x2448) * can provide.
*Please reference the documentation for the image data destination to * check if it limits the maximum size for image data.
*Not all output formats may be supported in a configuration with * an input stream of a particular format. For more details, see * android.scaler.availableInputOutputFormatsMap.
*The following table describes the minimum required output stream * configurations based on the hardware level * ({@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel}):
* | Format                 | Size                                                              | Hardware Level | Notes                       |
* |------------------------|--------------------------------------------------------------------|----------------|-----------------------------|
* | JPEG                   | {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize} | Any | |
* | JPEG                   | 1920x1080 (1080p)                                                 | Any            | if 1080p <= activeArraySize |
* | JPEG                   | 1280x720 (720p)                                                   | Any            | if 720p <= activeArraySize  |
* | JPEG                   | 640x480 (480p)                                                    | Any            | if 480p <= activeArraySize  |
* | JPEG                   | 320x240 (240p)                                                    | Any            | if 240p <= activeArraySize  |
* | YUV_420_888            | all output sizes available for JPEG                               | FULL           |                             |
* | YUV_420_888            | all output sizes available for JPEG, up to the maximum video size | LIMITED        |                             |
* | IMPLEMENTATION_DEFINED | same as YUV_420_888                                               | Any            |                             |
Refer to {@link CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES android.request.availableCapabilities} for additional * mandatory stream configurations on a per-capability basis.
*This key is available on all devices.
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES * @see CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE * @hide */ public static final KeyThis lists the minimum frame duration for each * format/size combination.
*This should correspond to the frame duration when only that * stream is active, with all processing (typically in android.*.mode) * set to either OFF or FAST.
*When multiple streams are used in a request, the minimum frame * duration will be max(individual stream min durations).
*The minimum frame duration of a stream (of a particular format, size) * is the same regardless of whether the stream is input or output.
*See {@link CaptureRequest#SENSOR_FRAME_DURATION android.sensor.frameDuration} and * android.scaler.availableStallDurations for more details about * calculating the max frame rate.
*(Keep in sync with * {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputMinFrameDuration })
*Units: (format, width, height, ns) x n
*This key is available on all devices.
* * @see CaptureRequest#SENSOR_FRAME_DURATION * @hide */ public static final KeyThis lists the maximum stall duration for each * output format/size combination.
*A stall duration is how much extra time would get added * to the normal minimum frame duration for a repeating request * that has streams with non-zero stall.
*For example, consider JPEG captures which have the following * characteristics:
*In other words, using a repeating YUV request would result * in a steady frame rate (let's say it's 30 FPS). If a single * JPEG request is submitted periodically, the frame rate will stay * at 30 FPS (as long as we wait for the previous JPEG to return each * time). If we try to submit a repeating YUV + JPEG request, then * the frame rate will drop from 30 FPS.
*In general, submitting a new request with a non-0 stall time * stream will not cause a frame rate drop unless there are still * outstanding buffers for that stream from previous requests.
*Submitting a repeating request with streams (call this {@code S})
* is the same as setting the minimum frame duration from
* the normal minimum frame duration corresponding to {@code S}, added with
* the maximum stall duration for {@code S}.
*If interleaving requests with and without a stall duration, * a request will stall by the maximum of the remaining times * for each can-stall stream with outstanding buffers.
*This means that a stalling request will not have an exposure start * until the stall has completed.
*This should correspond to the stall duration when only that stream is * active, with all processing (typically in android.*.mode) set to FAST * or OFF. Setting any of the processing modes to HIGH_QUALITY * effectively results in an indeterminate stall duration for all * streams in a request (the regular stall calculation rules are * ignored).
*The following formats may always have a stall duration:
*The following formats will never have a stall duration:
*All other formats may or may not have an allowed stall duration on * a per-capability basis; refer to {@link CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES android.request.availableCapabilities} * for more details.
*See {@link CaptureRequest#SENSOR_FRAME_DURATION android.sensor.frameDuration} for more information about * calculating the max frame rate (absent stalls).
*(Keep up to date with * {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputStallDuration } )
*Units: (format, width, height, ns) x n
*This key is available on all devices.
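*A sketch of how an application can query the equivalent per-stream values through the public {@link android.hardware.camera2.params.StreamConfigurationMap}, assuming {@code characteristics} was obtained from {@link CameraManager#getCameraCharacteristics}:
* StreamConfigurationMap map =
*         characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
* Size jpegSize = map.getOutputSizes(ImageFormat.JPEG)[0];
* long minFrameNs = map.getOutputMinFrameDuration(ImageFormat.JPEG, jpegSize);
* long stallNs = map.getOutputStallDuration(ImageFormat.JPEG, jpegSize);
* // Worst-case frame duration while a JPEG capture of this size is outstanding:
* long worstCaseNs = minFrameNs + stallNs;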
* * @see CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES * @see CaptureRequest#SENSOR_FRAME_DURATION * @hide */ public static final KeyThe available stream configurations that this * camera device supports; also includes the minimum frame durations * and the stall durations for each format/size combination.
*All camera devices will support sensor maximum resolution (defined by * {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize}) for the JPEG format.
*For a given use case, the actual maximum supported resolution * may be lower than what is listed here, depending on the destination * Surface for the image data. For example, for recording video, * the video encoder chosen may have a maximum size limit (e.g. 1080p) * smaller than what the camera (e.g. maximum resolution is 3264x2448) * can provide.
*Please reference the documentation for the image data destination to * check if it limits the maximum size for image data.
*The following table describes the minimum required output stream * configurations based on the hardware level * ({@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel}):
* | Format                                             | Size                                                              | Hardware Level | Notes                       |
* |-----------------------------------------------------|--------------------------------------------------------------------|----------------|-----------------------------|
* | {@link android.graphics.ImageFormat#JPEG }          | {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize} (*1) | Any | |
* | {@link android.graphics.ImageFormat#JPEG }          | 1920x1080 (1080p)                                                 | Any            | if 1080p <= activeArraySize |
* | {@link android.graphics.ImageFormat#JPEG }          | 1280x720 (720p)                                                   | Any            | if 720p <= activeArraySize  |
* | {@link android.graphics.ImageFormat#JPEG }          | 640x480 (480p)                                                    | Any            | if 480p <= activeArraySize  |
* | {@link android.graphics.ImageFormat#JPEG }          | 320x240 (240p)                                                    | Any            | if 240p <= activeArraySize  |
* | {@link android.graphics.ImageFormat#YUV_420_888 }   | all output sizes available for JPEG                               | FULL           |                             |
* | {@link android.graphics.ImageFormat#YUV_420_888 }   | all output sizes available for JPEG, up to the maximum video size | LIMITED        |                             |
* | {@link android.graphics.ImageFormat#PRIVATE }       | same as YUV_420_888                                               | Any            |                             |
Refer to {@link CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES android.request.availableCapabilities} and {@link android.hardware.camera2.CameraDevice#createCaptureSession } for additional mandatory * stream configurations on a per-capability basis.
**1: For JPEG format, the sizes may be restricted by the following conditions:
*This key is available on all devices.
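*For example, a sketch of picking the largest SurfaceTexture output size no larger than 1080p; the 1080p cap is this example's own choice, and {@code characteristics} is assumed to come from {@link CameraManager#getCameraCharacteristics}:
* StreamConfigurationMap map =
*         characteristics.get(CameraCharacteristics.SCALER_STREAM_CONFIGURATION_MAP);
* Size best = null;
* for (Size s : map.getOutputSizes(SurfaceTexture.class)) {
*     if (s.getWidth() <= 1920 && s.getHeight() <= 1080
*             && (best == null
*                 || s.getWidth() * s.getHeight() > best.getWidth() * best.getHeight())) {
*         best = s;
*     }
* }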
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES * @see CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE */ @PublicKey @SyntheticKey public static final KeyThe crop type that this camera device supports.
*When passing a non-centered crop region ({@link CaptureRequest#SCALER_CROP_REGION android.scaler.cropRegion}) to a camera * device that only supports CENTER_ONLY cropping, the camera device will move the * crop region to the center of the sensor active array ({@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize}) * and keep the crop region width and height unchanged. The camera device will return the * final used crop region in metadata result {@link CaptureRequest#SCALER_CROP_REGION android.scaler.cropRegion}.
*Camera devices that support FREEFORM cropping will support any crop region that * is inside of the active array. The camera device will apply the same crop region and * return the final used crop region in capture result metadata {@link CaptureRequest#SCALER_CROP_REGION android.scaler.cropRegion}.
*LEGACY capability devices will only support CENTER_ONLY cropping.
*Possible values:
This key is available on all devices.
* * @see CaptureRequest#SCALER_CROP_REGION * @see CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE * @see #SCALER_CROPPING_TYPE_CENTER_ONLY * @see #SCALER_CROPPING_TYPE_FREEFORM */ @PublicKey public static final KeyThe area of the image sensor which corresponds to active pixels after any geometric * distortion correction has been applied.
*This is the rectangle representing the size of the active region of the sensor (i.e. * the region that actually receives light from the scene) after any geometric correction * has been applied, and should be treated as the maximum size in pixels of any of the * image output formats aside from the raw formats.
*This rectangle is defined relative to the full pixel array; (0,0) is the top-left of * the full pixel array, and the size of the full pixel array is given by * {@link CameraCharacteristics#SENSOR_INFO_PIXEL_ARRAY_SIZE android.sensor.info.pixelArraySize}.
*The coordinate system for most other keys that list pixel coordinates, including
* {@link CaptureRequest#SCALER_CROP_REGION android.scaler.cropRegion}, is defined relative to the active array rectangle given in
* this field, with {@code (0, 0)} being the top-left of this rectangle.
The active array may be smaller than the full pixel array, since the full array may * include black calibration pixels or other inactive regions, and geometric correction * resulting in scaling or cropping may have been applied.
*Units: Pixel coordinates on the image sensor
*This key is available on all devices.
* * @see CaptureRequest#SCALER_CROP_REGION * @see CameraCharacteristics#SENSOR_INFO_PIXEL_ARRAY_SIZE */ @PublicKey public static final KeyRange of sensitivities for {@link CaptureRequest#SENSOR_SENSITIVITY android.sensor.sensitivity} supported by this * camera device.
*The values are the standard ISO sensitivity values, * as defined in ISO 12232:2006.
*Range of valid values:
* Min <= 100, Max >= 800
Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
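*A sketch of clamping a requested sensitivity to this range before building a manual request, where {@code characteristics} and {@code requestBuilder} (a {@link CaptureRequest.Builder}) are assumed to exist:
* Range&lt;Integer&gt; isoRange =
*         characteristics.get(CameraCharacteristics.SENSOR_INFO_SENSITIVITY_RANGE);
* if (isoRange != null) {
*     // Clamp an example target of ISO 800 into the supported range.
*     requestBuilder.set(CaptureRequest.SENSOR_SENSITIVITY, isoRange.clamp(800));
* }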
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#SENSOR_SENSITIVITY */ @PublicKey public static final KeyThe arrangement of color filters on sensor; * represents the colors in the top-left 2x2 section of * the sensor, in reading order.
*Possible values:
Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see #SENSOR_INFO_COLOR_FILTER_ARRANGEMENT_RGGB * @see #SENSOR_INFO_COLOR_FILTER_ARRANGEMENT_GRBG * @see #SENSOR_INFO_COLOR_FILTER_ARRANGEMENT_GBRG * @see #SENSOR_INFO_COLOR_FILTER_ARRANGEMENT_BGGR * @see #SENSOR_INFO_COLOR_FILTER_ARRANGEMENT_RGB */ @PublicKey public static final KeyThe range of image exposure times for {@link CaptureRequest#SENSOR_EXPOSURE_TIME android.sensor.exposureTime} supported * by this camera device.
*Units: Nanoseconds
*Range of valid values:
* The minimum exposure time will be less than 100 us. For FULL
* capability devices ({@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} == FULL),
* the maximum exposure time will be greater than 100ms.
Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
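*A sketch of clamping a 10 ms exposure to this range (same assumed {@code characteristics} and {@code requestBuilder} as in the other examples; manual sensor control must also be supported):
* Range&lt;Long&gt; exposureRange =
*         characteristics.get(CameraCharacteristics.SENSOR_INFO_EXPOSURE_TIME_RANGE);
* if (exposureRange != null) {
*     // 10 ms expressed in nanoseconds, clamped into the supported range.
*     requestBuilder.set(CaptureRequest.SENSOR_EXPOSURE_TIME, exposureRange.clamp(10_000_000L));
* }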
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#SENSOR_EXPOSURE_TIME */ @PublicKey public static final KeyThe maximum possible frame duration (minimum frame rate) for * {@link CaptureRequest#SENSOR_FRAME_DURATION android.sensor.frameDuration} that is supported by this camera device.
*Attempting to use frame durations beyond the maximum will result in the frame * duration being clipped to the maximum. See that control for a full definition of frame * durations.
*Refer to {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputMinFrameDuration } * for the minimum frame duration values.
*Units: Nanoseconds
*Range of valid values:
* For FULL capability devices
* ({@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} == FULL), at least 100ms.
Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
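*A sketch of deriving the slowest supported frame rate from this key, assuming {@code characteristics} was obtained from {@link CameraManager#getCameraCharacteristics}:
* Long maxFrameDurationNs =
*         characteristics.get(CameraCharacteristics.SENSOR_INFO_MAX_FRAME_DURATION);
* if (maxFrameDurationNs != null && maxFrameDurationNs > 0) {
*     // e.g. a maximum duration of 100 ms corresponds to a minimum rate of 10 FPS.
*     double slowestFps = 1e9 / maxFrameDurationNs;
* }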
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#SENSOR_FRAME_DURATION */ @PublicKey public static final KeyThe physical dimensions of the full pixel * array.
*This is the physical size of the sensor pixel * array defined by {@link CameraCharacteristics#SENSOR_INFO_PIXEL_ARRAY_SIZE android.sensor.info.pixelArraySize}.
*Units: Millimeters
*This key is available on all devices.
* * @see CameraCharacteristics#SENSOR_INFO_PIXEL_ARRAY_SIZE */ @PublicKey public static final KeyDimensions of the full pixel array, possibly * including black calibration pixels.
*The pixel count of the full pixel array of the image sensor, which covers * {@link CameraCharacteristics#SENSOR_INFO_PHYSICAL_SIZE android.sensor.info.physicalSize} area. This represents the full pixel dimensions of * the raw buffers produced by this sensor.
*If a camera device supports raw sensor formats, either this or * {@link CameraCharacteristics#SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE android.sensor.info.preCorrectionActiveArraySize} is the maximum dimensions for the raw * output formats listed in {@link CameraCharacteristics#SCALER_STREAM_CONFIGURATION_MAP android.scaler.streamConfigurationMap} (this depends on * whether or not the image sensor returns buffers containing pixels that are not * part of the active array region for blacklevel calibration or other purposes).
*Some parts of the full pixel array may not receive light from the scene, * or be otherwise inactive. The {@link CameraCharacteristics#SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE android.sensor.info.preCorrectionActiveArraySize} key * defines the rectangle of active pixels that will be included in processed image * formats.
*Units: Pixels
*This key is available on all devices.
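*A sketch of estimating the pixel pitch from this key together with {@link CameraCharacteristics#SENSOR_INFO_PHYSICAL_SIZE android.sensor.info.physicalSize}; this is an approximation for illustration only, with {@code characteristics} assumed:
* SizeF physicalMm = characteristics.get(CameraCharacteristics.SENSOR_INFO_PHYSICAL_SIZE);
* Size pixelArray = characteristics.get(CameraCharacteristics.SENSOR_INFO_PIXEL_ARRAY_SIZE);
* if (physicalMm != null && pixelArray != null) {
*     // Width in millimeters divided by pixel count, converted to micrometers.
*     float pixelPitchUm = physicalMm.getWidth() * 1000f / pixelArray.getWidth();
* }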
* * @see CameraCharacteristics#SCALER_STREAM_CONFIGURATION_MAP * @see CameraCharacteristics#SENSOR_INFO_PHYSICAL_SIZE * @see CameraCharacteristics#SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE */ @PublicKey public static final KeyMaximum raw value output by sensor.
*This specifies the fully-saturated encoding level for the raw * sample values from the sensor. This is typically caused by the * sensor becoming highly non-linear or clipping. The minimum for * each channel is specified by the offset in the * {@link CameraCharacteristics#SENSOR_BLACK_LEVEL_PATTERN android.sensor.blackLevelPattern} key.
*The white level is typically determined either by sensor bit depth * (8-14 bits is expected), or by the point where the sensor response * becomes too non-linear to be useful. The default value for this is * the maximum representable value for a 16-bit raw sample (2^16 - 1).
*The white level values of captured images may vary for different * capture settings (e.g., {@link CaptureRequest#SENSOR_SENSITIVITY android.sensor.sensitivity}). This key * represents a coarse approximation for such cases. It is recommended * to use {@link CaptureResult#SENSOR_DYNAMIC_WHITE_LEVEL android.sensor.dynamicWhiteLevel} for captures when supported * by the camera device, which provides more accurate white level values.
*Range of valid values:
* > 255 (8-bit output)
Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_BLACK_LEVEL_PATTERN * @see CaptureResult#SENSOR_DYNAMIC_WHITE_LEVEL * @see CaptureRequest#SENSOR_SENSITIVITY */ @PublicKey public static final KeyThe time base source for sensor capture start timestamps.
*The timestamps provided for captures are always in nanoseconds and monotonic, but * may not be based on a time source that can be compared to other system time sources.
*This characteristic defines the source for the timestamps, and therefore whether they * can be compared against other system time sources/timestamps.
*Possible values:
This key is available on all devices.
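*A sketch of checking whether capture timestamps share the {@link android.os.SystemClock#elapsedRealtimeNanos} time base, with {@code characteristics} assumed:
* Integer timestampSource =
*         characteristics.get(CameraCharacteristics.SENSOR_INFO_TIMESTAMP_SOURCE);
* boolean comparableToElapsedRealtime = timestampSource != null
*         && timestampSource == CameraCharacteristics.SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME;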
* @see #SENSOR_INFO_TIMESTAMP_SOURCE_UNKNOWN * @see #SENSOR_INFO_TIMESTAMP_SOURCE_REALTIME */ @PublicKey public static final KeyWhether the RAW images output from this camera device are subject to * lens shading correction.
*If TRUE, all images produced by the camera device in the RAW image formats will * have lens shading correction already applied to them. If FALSE, the images will * not be adjusted for lens shading correction. * See {@link CameraCharacteristics#REQUEST_MAX_NUM_OUTPUT_RAW android.request.maxNumOutputRaw} for a list of RAW image formats.
*This key will be {@code null} for all devices that do not report this information.
* Devices with RAW capability will always report this information in this key.
Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#REQUEST_MAX_NUM_OUTPUT_RAW */ @PublicKey public static final KeyThe area of the image sensor which corresponds to active pixels prior to the * application of any geometric distortion correction.
*This is the rectangle representing the size of the active region of the sensor (i.e. * the region that actually receives light from the scene) before any geometric correction * has been applied, and should be treated as the active region rectangle for any of the * raw formats. All metadata associated with raw processing (e.g. the lens shading * correction map, and radial distortion fields) treats the top, left of this rectangle as * the origin, (0,0).
*The size of this region determines the maximum field of view and the maximum number of * pixels that an image from this sensor can contain, prior to the application of * geometric distortion correction. The effective maximum pixel dimensions of a * post-distortion-corrected image is given by the {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize} * field, and the effective maximum field of view for a post-distortion-corrected image * can be calculated by applying the geometric distortion correction fields to this * rectangle, and cropping to the rectangle given in {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize}.
*E.g. to calculate the position of a pixel, (x,y), in a processed YUV output image with the * dimensions in {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize} given the position of a pixel, * (x', y'), in the raw pixel array with dimensions given in * {@link CameraCharacteristics#SENSOR_INFO_PIXEL_ARRAY_SIZE android.sensor.info.pixelArraySize}:
*Apply any geometric distortion correction to (x', y') to get the corrected position (x_i, y_i);
* the position in the output image is then
* {@code (x_i - activeArray.left, y_i - activeArray.top)},
* when the top, left coordinate of that buffer is treated as (0, 0).
*Thus, for pixel x',y' = (25, 25) on a sensor where {@link CameraCharacteristics#SENSOR_INFO_PIXEL_ARRAY_SIZE android.sensor.info.pixelArraySize} * is (100,100), {@link CameraCharacteristics#SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE android.sensor.info.preCorrectionActiveArraySize} is (10, 10, 100, 100), * {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize} is (20, 20, 80, 80), and the geometric distortion * correction doesn't change the pixel coordinate, the resulting pixel selected in * pixel coordinates would be x,y = (25, 25) relative to the top,left of the raw buffer * with dimensions given in {@link CameraCharacteristics#SENSOR_INFO_PIXEL_ARRAY_SIZE android.sensor.info.pixelArraySize}, and would be (5, 5) * relative to the top,left of the post-processed YUV output buffer with dimensions given in * {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize}.
*The currently supported fields that correct for geometric distortion are:
* - {@link CameraCharacteristics#LENS_RADIAL_DISTORTION android.lens.radialDistortion}
*If all of the geometric distortion fields are no-ops, this rectangle will be the same * as the post-distortion-corrected rectangle given in * {@link CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE android.sensor.info.activeArraySize}.
*This rectangle is defined relative to the full pixel array; (0,0) is the top-left of * the full pixel array, and the size of the full pixel array is given by * {@link CameraCharacteristics#SENSOR_INFO_PIXEL_ARRAY_SIZE android.sensor.info.pixelArraySize}.
*The pre-correction active array may be smaller than the full pixel array, since the * full array may include black calibration pixels or other inactive regions.
*Units: Pixel coordinates on the image sensor
*This key is available on all devices.
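*A sketch of the coordinate translation above for the no-op distortion case; {@code xRaw} and {@code yRaw} are hypothetical raw-buffer pixel coordinates, and {@code characteristics} is assumed:
* Rect active = characteristics.get(CameraCharacteristics.SENSOR_INFO_ACTIVE_ARRAY_SIZE);
* // With no geometric distortion correction applied, (x_i, y_i) == (xRaw, yRaw):
* int xActive = xRaw - active.left;
* int yActive = yRaw - active.top;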
* * @see CameraCharacteristics#LENS_RADIAL_DISTORTION * @see CameraCharacteristics#SENSOR_INFO_ACTIVE_ARRAY_SIZE * @see CameraCharacteristics#SENSOR_INFO_PIXEL_ARRAY_SIZE * @see CameraCharacteristics#SENSOR_INFO_PRE_CORRECTION_ACTIVE_ARRAY_SIZE */ @PublicKey public static final KeyThe standard reference illuminant used as the scene light source when * calculating the {@link CameraCharacteristics#SENSOR_COLOR_TRANSFORM1 android.sensor.colorTransform1}, * {@link CameraCharacteristics#SENSOR_CALIBRATION_TRANSFORM1 android.sensor.calibrationTransform1}, and * {@link CameraCharacteristics#SENSOR_FORWARD_MATRIX1 android.sensor.forwardMatrix1} matrices.
*The values in this key correspond to the values defined for the * EXIF LightSource tag. These illuminants are standard light sources * that are often used for calibrating camera devices.
*If this key is present, then {@link CameraCharacteristics#SENSOR_COLOR_TRANSFORM1 android.sensor.colorTransform1}, * {@link CameraCharacteristics#SENSOR_CALIBRATION_TRANSFORM1 android.sensor.calibrationTransform1}, and * {@link CameraCharacteristics#SENSOR_FORWARD_MATRIX1 android.sensor.forwardMatrix1} will also be present.
*Some devices may choose to provide a second set of calibration * information for improved quality, including * {@link CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT2 android.sensor.referenceIlluminant2} and its corresponding matrices.
*Possible values:
Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_CALIBRATION_TRANSFORM1 * @see CameraCharacteristics#SENSOR_COLOR_TRANSFORM1 * @see CameraCharacteristics#SENSOR_FORWARD_MATRIX1 * @see CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT2 * @see #SENSOR_REFERENCE_ILLUMINANT1_DAYLIGHT * @see #SENSOR_REFERENCE_ILLUMINANT1_FLUORESCENT * @see #SENSOR_REFERENCE_ILLUMINANT1_TUNGSTEN * @see #SENSOR_REFERENCE_ILLUMINANT1_FLASH * @see #SENSOR_REFERENCE_ILLUMINANT1_FINE_WEATHER * @see #SENSOR_REFERENCE_ILLUMINANT1_CLOUDY_WEATHER * @see #SENSOR_REFERENCE_ILLUMINANT1_SHADE * @see #SENSOR_REFERENCE_ILLUMINANT1_DAYLIGHT_FLUORESCENT * @see #SENSOR_REFERENCE_ILLUMINANT1_DAY_WHITE_FLUORESCENT * @see #SENSOR_REFERENCE_ILLUMINANT1_COOL_WHITE_FLUORESCENT * @see #SENSOR_REFERENCE_ILLUMINANT1_WHITE_FLUORESCENT * @see #SENSOR_REFERENCE_ILLUMINANT1_STANDARD_A * @see #SENSOR_REFERENCE_ILLUMINANT1_STANDARD_B * @see #SENSOR_REFERENCE_ILLUMINANT1_STANDARD_C * @see #SENSOR_REFERENCE_ILLUMINANT1_D55 * @see #SENSOR_REFERENCE_ILLUMINANT1_D65 * @see #SENSOR_REFERENCE_ILLUMINANT1_D75 * @see #SENSOR_REFERENCE_ILLUMINANT1_D50 * @see #SENSOR_REFERENCE_ILLUMINANT1_ISO_STUDIO_TUNGSTEN */ @PublicKey public static final KeyThe standard reference illuminant used as the scene light source when * calculating the {@link CameraCharacteristics#SENSOR_COLOR_TRANSFORM2 android.sensor.colorTransform2}, * {@link CameraCharacteristics#SENSOR_CALIBRATION_TRANSFORM2 android.sensor.calibrationTransform2}, and * {@link CameraCharacteristics#SENSOR_FORWARD_MATRIX2 android.sensor.forwardMatrix2} matrices.
*See {@link CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT1 android.sensor.referenceIlluminant1} for more details.
*If this key is present, then {@link CameraCharacteristics#SENSOR_COLOR_TRANSFORM2 android.sensor.colorTransform2}, * {@link CameraCharacteristics#SENSOR_CALIBRATION_TRANSFORM2 android.sensor.calibrationTransform2}, and * {@link CameraCharacteristics#SENSOR_FORWARD_MATRIX2 android.sensor.forwardMatrix2} will also be present.
*Range of valid values:
* Any value listed in {@link CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT1 android.sensor.referenceIlluminant1}
Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_CALIBRATION_TRANSFORM2 * @see CameraCharacteristics#SENSOR_COLOR_TRANSFORM2 * @see CameraCharacteristics#SENSOR_FORWARD_MATRIX2 * @see CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT1 */ @PublicKey public static final KeyA per-device calibration transform matrix that maps from the * reference sensor colorspace to the actual device sensor colorspace.
*This matrix is used to correct for per-device variations in the * sensor colorspace, and is used for processing raw buffer data.
*The matrix is expressed as a 3x3 matrix in row-major-order, and * contains a per-device calibration transform that maps colors * from reference sensor color space (i.e. the "golden module" * colorspace) into this camera device's native sensor color * space under the first reference illuminant * ({@link CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT1 android.sensor.referenceIlluminant1}).
*Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT1 */ @PublicKey public static final KeyA per-device calibration transform matrix that maps from the * reference sensor colorspace to the actual device sensor colorspace * (this is the colorspace of the raw buffer data).
*This matrix is used to correct for per-device variations in the * sensor colorspace, and is used for processing raw buffer data.
*The matrix is expressed as a 3x3 matrix in row-major-order, and * contains a per-device calibration transform that maps colors * from reference sensor color space (i.e. the "golden module" * colorspace) into this camera device's native sensor color * space under the second reference illuminant * ({@link CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT2 android.sensor.referenceIlluminant2}).
*This matrix will only be present if the second reference * illuminant is present.
*Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT2 */ @PublicKey public static final KeyA matrix that transforms color values from CIE XYZ color space to * reference sensor color space.
*This matrix is used to convert from the standard CIE XYZ color * space to the reference sensor colorspace, and is used when processing * raw buffer data.
*The matrix is expressed as a 3x3 matrix in row-major-order, and * contains a color transform matrix that maps colors from the CIE * XYZ color space to the reference sensor color space (i.e. the * "golden module" colorspace) under the first reference illuminant * ({@link CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT1 android.sensor.referenceIlluminant1}).
*The white points chosen in both the reference sensor color space * and the CIE XYZ colorspace when calculating this transform will * match the standard white point for the first reference illuminant * (i.e. no chromatic adaptation will be applied by this transform).
*Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT1 */ @PublicKey public static final KeyA matrix that transforms color values from CIE XYZ color space to * reference sensor color space.
*This matrix is used to convert from the standard CIE XYZ color * space to the reference sensor colorspace, and is used when processing * raw buffer data.
*The matrix is expressed as a 3x3 matrix in row-major-order, and * contains a color transform matrix that maps colors from the CIE * XYZ color space to the reference sensor color space (i.e. the * "golden module" colorspace) under the second reference illuminant * ({@link CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT2 android.sensor.referenceIlluminant2}).
*The white points chosen in both the reference sensor color space * and the CIE XYZ colorspace when calculating this transform will * match the standard white point for the second reference illuminant * (i.e. no chromatic adaptation will be applied by this transform).
*This matrix will only be present if the second reference * illuminant is present.
*Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT2 */ @PublicKey public static final KeyA matrix that transforms white balanced camera colors from the reference * sensor colorspace to the CIE XYZ colorspace with a D50 whitepoint.
*This matrix is used to convert to the standard CIE XYZ colorspace, and * is used when processing raw buffer data.
*This matrix is expressed as a 3x3 matrix in row-major-order, and contains * a color transform matrix that maps white balanced colors from the * reference sensor color space to the CIE XYZ color space with a D50 white * point.
*Under the first reference illuminant ({@link CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT1 android.sensor.referenceIlluminant1}) * this matrix is chosen so that the standard white point for this reference * illuminant in the reference sensor colorspace is mapped to D50 in the * CIE XYZ colorspace.
*Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT1 */ @PublicKey public static final KeyA matrix that transforms white balanced camera colors from the reference * sensor colorspace to the CIE XYZ colorspace with a D50 whitepoint.
*This matrix is used to convert to the standard CIE XYZ colorspace, and * is used when processing raw buffer data.
*This matrix is expressed as a 3x3 matrix in row-major-order, and contains * a color transform matrix that maps white balanced colors from the * reference sensor color space to the CIE XYZ color space with a D50 white * point.
*Under the second reference illuminant ({@link CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT2 android.sensor.referenceIlluminant2}) * this matrix is chosen so that the standard white point for this reference * illuminant in the reference sensor colorspace is mapped to D50 in the * CIE XYZ colorspace.
*This matrix will only be present if the second reference * illuminant is present.
*Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_REFERENCE_ILLUMINANT2 */ @PublicKey public static final KeyA fixed black level offset for each of the color filter arrangement * (CFA) mosaic channels.
*This key specifies the zero light value for each of the CFA mosaic * channels in the camera sensor. The maximal value output by the * sensor is represented by the value in {@link CameraCharacteristics#SENSOR_INFO_WHITE_LEVEL android.sensor.info.whiteLevel}.
*The values are given in the same order as channels listed for the CFA * layout key (see {@link CameraCharacteristics#SENSOR_INFO_COLOR_FILTER_ARRANGEMENT android.sensor.info.colorFilterArrangement}), i.e. the * nth value given corresponds to the black level offset for the nth * color channel listed in the CFA.
*The black level values of captured images may vary for different * capture settings (e.g., {@link CaptureRequest#SENSOR_SENSITIVITY android.sensor.sensitivity}). This key * represents a coarse approximation for such cases. It is recommended to * use {@link CaptureResult#SENSOR_DYNAMIC_BLACK_LEVEL android.sensor.dynamicBlackLevel} or use pixels from * {@link CameraCharacteristics#SENSOR_OPTICAL_BLACK_REGIONS android.sensor.opticalBlackRegions} directly for captures when * supported by the camera device, which provides more accurate black * level values. For raw capture in particular, it is recommended to use * pixels from {@link CameraCharacteristics#SENSOR_OPTICAL_BLACK_REGIONS android.sensor.opticalBlackRegions} to calculate black * level values for each frame.
*Range of valid values:
* >= 0 for each.
Optional - This value may be {@code null} on some devices.
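*A sketch of normalizing one raw sample with the static black and white levels; {@code rawSample}, {@code column}, and {@code row} are hypothetical, and per-frame dynamic values are preferred when available:
* BlackLevelPattern blackPattern =
*         characteristics.get(CameraCharacteristics.SENSOR_BLACK_LEVEL_PATTERN);
* Integer whiteLevel = characteristics.get(CameraCharacteristics.SENSOR_INFO_WHITE_LEVEL);
* if (blackPattern != null && whiteLevel != null) {
*     // Black level offset for the CFA channel at (column, row) within the 2x2 pattern.
*     int black = blackPattern.getOffsetForIndex(column, row);
*     float normalized = (rawSample - black) / (float) (whiteLevel - black);
* }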
* * @see CaptureResult#SENSOR_DYNAMIC_BLACK_LEVEL * @see CameraCharacteristics#SENSOR_INFO_COLOR_FILTER_ARRANGEMENT * @see CameraCharacteristics#SENSOR_INFO_WHITE_LEVEL * @see CameraCharacteristics#SENSOR_OPTICAL_BLACK_REGIONS * @see CaptureRequest#SENSOR_SENSITIVITY */ @PublicKey public static final KeyMaximum sensitivity that is implemented * purely through analog gain.
*For {@link CaptureRequest#SENSOR_SENSITIVITY android.sensor.sensitivity} values less than or * equal to this, all applied gain must be analog. For * values above this, the gain applied can be a mix of analog and * digital.
*Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#SENSOR_SENSITIVITY */ @PublicKey public static final KeyClockwise angle through which the output image needs to be rotated to be * upright on the device screen in its native orientation.
*Also defines the direction of rolling shutter readout, which is from top to bottom in * the sensor's coordinate system.
*Units: Degrees of clockwise rotation; always a multiple of * 90
*Range of valid values:
* 0, 90, 180, 270
This key is available on all devices.
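*One common way to use this key is to compute a JPEG rotation from the current device rotation; the sketch below assumes {@code deviceRotation} is a hypothetical value of 0, 90, 180, or 270 and {@code characteristics} is assumed:
* int sensorOrientation = characteristics.get(CameraCharacteristics.SENSOR_ORIENTATION);
* Integer facing = characteristics.get(CameraCharacteristics.LENS_FACING);
* // Front-facing sensors rotate in the opposite direction relative to the display.
* int rotation = (facing != null && facing == CameraCharacteristics.LENS_FACING_FRONT)
*         ? -deviceRotation : deviceRotation;
* int jpegOrientation = (sensorOrientation + rotation + 360) % 360;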
*/ @PublicKey public static final KeyList of sensor test pattern modes for {@link CaptureRequest#SENSOR_TEST_PATTERN_MODE android.sensor.testPatternMode} * supported by this camera device.
*Defaults to OFF, and always includes OFF if defined.
*Range of valid values:
* Any value listed in {@link CaptureRequest#SENSOR_TEST_PATTERN_MODE android.sensor.testPatternMode}
Optional - This value may be {@code null} on some devices.
* * @see CaptureRequest#SENSOR_TEST_PATTERN_MODE */ @PublicKey public static final KeyList of disjoint rectangles indicating the sensor * optically shielded black pixel regions.
*In most camera sensors, the active array is surrounded by some * optically shielded pixel areas. By blocking light, these pixels * provide a reliable black reference for black level compensation * in the active array region.
*This key provides a list of disjoint rectangles specifying the * regions of optically shielded (with metal shield) black pixel * regions if the camera device is capable of reading out these black * pixels in the output raw images. In comparison to the fixed black * level values reported by {@link CameraCharacteristics#SENSOR_BLACK_LEVEL_PATTERN android.sensor.blackLevelPattern}, this key * may provide a more accurate way for the application to calculate * the black level of each captured raw image.
*When this key is reported, the {@link CaptureResult#SENSOR_DYNAMIC_BLACK_LEVEL android.sensor.dynamicBlackLevel} and * {@link CaptureResult#SENSOR_DYNAMIC_WHITE_LEVEL android.sensor.dynamicWhiteLevel} will also be reported.
*Optional - This value may be {@code null} on some devices.
* * @see CameraCharacteristics#SENSOR_BLACK_LEVEL_PATTERN * @see CaptureResult#SENSOR_DYNAMIC_BLACK_LEVEL * @see CaptureResult#SENSOR_DYNAMIC_WHITE_LEVEL */ @PublicKey public static final KeyList of lens shading modes for {@link CaptureRequest#SHADING_MODE android.shading.mode} that are supported by this camera device.
*This list contains lens shading modes that can be set for the camera device. * Camera devices that support the MANUAL_POST_PROCESSING capability will always * list OFF and FAST mode. This includes all FULL level devices. * LEGACY devices will always only support FAST mode.
*Range of valid values:
* Any value listed in {@link CaptureRequest#SHADING_MODE android.shading.mode}
This key is available on all devices.
* * @see CaptureRequest#SHADING_MODE */ @PublicKey public static final KeyList of face detection modes for {@link CaptureRequest#STATISTICS_FACE_DETECT_MODE android.statistics.faceDetectMode} that are * supported by this camera device.
*OFF is always supported.
*Range of valid values:
* Any value listed in {@link CaptureRequest#STATISTICS_FACE_DETECT_MODE android.statistics.faceDetectMode}
This key is available on all devices.
* * @see CaptureRequest#STATISTICS_FACE_DETECT_MODE */ @PublicKey public static final KeyThe maximum number of simultaneously detectable * faces.
*Range of valid values:
* 0 for cameras without available face detection; otherwise:
* {@code >=4} for LIMITED or FULL hwlevel devices or
* {@code >0} for LEGACY devices.
This key is available on all devices.
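*A sketch of turning on face detection when any faces can be detected; {@code characteristics} and {@code requestBuilder} are assumed, and SIMPLE should also be checked against {@link CameraCharacteristics#STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES android.statistics.info.availableFaceDetectModes}:
* Integer maxFaces = characteristics.get(CameraCharacteristics.STATISTICS_INFO_MAX_FACE_COUNT);
* if (maxFaces != null && maxFaces > 0) {
*     requestBuilder.set(CaptureRequest.STATISTICS_FACE_DETECT_MODE,
*             CameraMetadata.STATISTICS_FACE_DETECT_MODE_SIMPLE);
* }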
*/ @PublicKey public static final KeyList of hot pixel map output modes for {@link CaptureRequest#STATISTICS_HOT_PIXEL_MAP_MODE android.statistics.hotPixelMapMode} that are * supported by this camera device.
*If no hotpixel map output is available for this camera device, this will contain only
* {@code false}.
*ON is always supported on devices with the RAW capability.
*Range of valid values:
* Any value listed in {@link CaptureRequest#STATISTICS_HOT_PIXEL_MAP_MODE android.statistics.hotPixelMapMode}
Optional - This value may be {@code null} on some devices.
* * @see CaptureRequest#STATISTICS_HOT_PIXEL_MAP_MODE */ @PublicKey public static final KeyList of lens shading map output modes for {@link CaptureRequest#STATISTICS_LENS_SHADING_MAP_MODE android.statistics.lensShadingMapMode} that * are supported by this camera device.
*If no lens shading map output is available for this camera device, this key will * contain only OFF.
*ON is always supported on devices with the RAW capability. * LEGACY mode devices will always only support OFF.
*Range of valid values:
* Any value listed in {@link CaptureRequest#STATISTICS_LENS_SHADING_MAP_MODE android.statistics.lensShadingMapMode}
Optional - This value may be {@code null} on some devices.
* * @see CaptureRequest#STATISTICS_LENS_SHADING_MAP_MODE */ @PublicKey public static final KeyMaximum number of supported points in the * tonemap curve that can be used for {@link CaptureRequest#TONEMAP_CURVE android.tonemap.curve}.
*If the actual number of points provided by the application (in {@link CaptureRequest#TONEMAP_CURVE android.tonemap.curve}*) is * less than this maximum, the camera device will resample the curve to its internal * representation, using linear interpolation.
*The output curves in the result metadata may have a different number * of points than the input curves, and will represent the actual * hardware curves used as closely as possible when linearly interpolated.
*Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
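*A sketch of building a gamma-like contrast curve that respects this limit; the gamma value and 64-point cap are this example's own choices, CONTRAST_CURVE mode must also be supported, and {@code characteristics} and {@code requestBuilder} are assumed:
* Integer maxPoints = characteristics.get(CameraCharacteristics.TONEMAP_MAX_CURVE_POINTS);
* if (maxPoints != null && maxPoints >= 2) {
*     int n = Math.min(maxPoints, 64);
*     float[] curve = new float[2 * n];
*     for (int i = 0; i < n; i++) {
*         float in = i / (float) (n - 1);
*         curve[2 * i] = in;                                 // P_in
*         curve[2 * i + 1] = (float) Math.pow(in, 1 / 2.2);  // P_out, gamma 2.2
*     }
*     requestBuilder.set(CaptureRequest.TONEMAP_MODE, CameraMetadata.TONEMAP_MODE_CONTRAST_CURVE);
*     requestBuilder.set(CaptureRequest.TONEMAP_CURVE, new TonemapCurve(curve, curve, curve));
* }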
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#TONEMAP_CURVE */ @PublicKey public static final KeyList of tonemapping modes for {@link CaptureRequest#TONEMAP_MODE android.tonemap.mode} that are supported by this camera * device.
*Camera devices that support the MANUAL_POST_PROCESSING capability will always contain * at least one of the mode combinations below:
*This includes all FULL level devices.
*Range of valid values:
* Any value listed in {@link CaptureRequest#TONEMAP_MODE android.tonemap.mode}
Optional - This value may be {@code null} on some devices.
*Full capability - * Present on all camera devices that report being {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_FULL HARDWARE_LEVEL_FULL} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#TONEMAP_MODE */ @PublicKey public static final KeyA list of camera LEDs that are available on this system.
*Possible values:
Optional - This value may be {@code null} on some devices.
* @see #LED_AVAILABLE_LEDS_TRANSMIT * @hide */ public static final KeyGenerally classifies the overall set of the camera device functionality.
*The supported hardware level is a high-level description of the camera device's
* capabilities, summarizing several capabilities into one field. Each level adds additional
* features to the previous one, and is always a strict superset of the previous level.
* The ordering is {@code LEGACY < LIMITED < FULL < LEVEL_3}.
*Starting from {@code LEVEL_3}, the level enumerations are guaranteed to be in increasing
* numerical value as well. To check if a given device is at least at a given hardware level,
* the following code snippet can be used:
* // Returns true if the device supports the required hardware level, or better.
* boolean isHardwareLevelSupported(CameraCharacteristics c, int requiredLevel) {
* int deviceLevel = c.get(CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL);
* if (deviceLevel == CameraCharacteristics.INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY) {
* return requiredLevel == deviceLevel;
* }
* // deviceLevel is not LEGACY, can use numerical sort
* return requiredLevel <= deviceLevel;
* }
*
* At a high level, the levels are:
* - {@code LEGACY} devices operate in a backwards-compatibility mode for older
*   Android devices, and have very limited capabilities.
* - {@code LIMITED} devices represent the
*   baseline feature set, and may also include additional capabilities that are
*   subsets of {@code FULL}.
* - {@code FULL} devices additionally support per-frame manual control of sensor, flash, lens and
*   post-processing settings, and image capture at a high rate.
* - {@code LEVEL_3} devices additionally support YUV reprocessing and RAW image capture, along
*   with additional output stream configurations.
*See the individual level enums for full descriptions of the supported capabilities. The * {@link CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES android.request.availableCapabilities} entry describes the device's capabilities at a * finer-grain level, if needed. In addition, many controls have their available settings or * ranges defined in individual {@link android.hardware.camera2.CameraCharacteristics } entries.
*Some features are not part of any particular hardware level or capability and must be * queried separately. These include:
*==
REALTIME)==
CALIBRATED)Possible values: *
This key is available on all devices.
* * @see CameraCharacteristics#CONTROL_AVAILABLE_VIDEO_STABILIZATION_MODES * @see CameraCharacteristics#LENS_INFO_AVAILABLE_OPTICAL_STABILIZATION * @see CameraCharacteristics#LENS_INFO_FOCUS_DISTANCE_CALIBRATION * @see CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES * @see CameraCharacteristics#SENSOR_INFO_TIMESTAMP_SOURCE * @see CameraCharacteristics#STATISTICS_INFO_AVAILABLE_FACE_DETECT_MODES * @see #INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED * @see #INFO_SUPPORTED_HARDWARE_LEVEL_FULL * @see #INFO_SUPPORTED_HARDWARE_LEVEL_LEGACY * @see #INFO_SUPPORTED_HARDWARE_LEVEL_3 */ @PublicKey public static final KeyThe maximum number of frames that can occur after a request * (different than the previous) has been submitted, and before the * result's state becomes synchronized.
*This defines the maximum distance (in number of metadata results), * between the frame number of the request that has new controls to apply * and the frame number of the result that has all the controls applied.
*In other words this acts as an upper boundary for how many frames * must occur before the camera device knows for a fact that the new * submitted camera settings have been applied in outgoing frames.
*Units: Frame counts
*Possible values:
Available values for this device:
* A positive value, PER_FRAME_CONTROL, or UNKNOWN.
This key is available on all devices.
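*A sketch of using this key to decide how many results to discard after changing settings; the 30-frame fallback for UNKNOWN is this example's own guess, and {@code characteristics} is assumed:
* Integer maxLatency = characteristics.get(CameraCharacteristics.SYNC_MAX_LATENCY);
* int framesToSkip;
* if (maxLatency == null || maxLatency == CameraCharacteristics.SYNC_MAX_LATENCY_UNKNOWN) {
*     framesToSkip = 30;  // no guarantee from the device; assumed upper bound
* } else {
*     framesToSkip = maxLatency;  // SYNC_MAX_LATENCY_PER_FRAME_CONTROL == 0
* }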
* @see #SYNC_MAX_LATENCY_PER_FRAME_CONTROL * @see #SYNC_MAX_LATENCY_UNKNOWN */ @PublicKey public static final KeyThe maximal camera capture pipeline stall (in unit of frame count) introduced by a * reprocess capture request.
*The key describes the maximal interference that one reprocess (input) request * can introduce to the camera simultaneous streaming of regular (output) capture * requests, including repeating requests.
*When a reprocessing capture request is submitted while a camera output repeating request * (e.g. preview) is being served by the camera device, it may preempt the camera capture * pipeline for at least one frame duration so that the camera device is unable to process * the following capture request in time for the next sensor start of exposure boundary. * When this happens, the application may observe a capture time gap (longer than one frame * duration) between adjacent capture output frames, which usually exhibits as preview * glitch if the repeating request output targets include a preview surface. This key gives * the worst-case number of frame stall introduced by one reprocess request with any kind of * formats/sizes combination.
*If this key reports 0, it means a reprocess request doesn't introduce any glitch to the * ongoing camera repeating request outputs, as if this reprocess request is never issued.
*This key is supported if the camera device supports PRIVATE or YUV reprocessing ( * i.e. {@link CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES android.request.availableCapabilities} contains PRIVATE_REPROCESSING or * YUV_REPROCESSING).
*Units: Number of frames.
*Range of valid values:
* <= 4
Optional - This value may be {@code null} on some devices.
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CameraCharacteristics#REQUEST_AVAILABLE_CAPABILITIES */ @PublicKey public static final KeyThe available depth dataspace stream * configurations that this camera device supports * (i.e. format, width, height, output/input stream).
*These are output stream configurations for use with
* dataSpace HAL_DATASPACE_DEPTH. The configurations are
* listed as {@code (format, width, height, input?)} tuples.
Only devices that support depth output for at least * the HAL_PIXEL_FORMAT_Y16 dense depth map may include * this entry.
*A device that also supports the HAL_PIXEL_FORMAT_BLOB
* sparse depth point cloud must report a single entry for
* the format in this list as {@code (HAL_PIXEL_FORMAT_BLOB,
* android.depth.maxDepthSamples, 1, OUTPUT)} in addition to
* the entries for HAL_PIXEL_FORMAT_Y16.
Optional - This value may be {@code null} on some devices.
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @hide */ public static final KeyThis lists the minimum frame duration for each * format/size combination for depth output formats.
*This should correspond to the frame duration when only that * stream is active, with all processing (typically in android.*.mode) * set to either OFF or FAST.
*When multiple streams are used in a request, the minimum frame * duration will be max(individual stream min durations).
*The minimum frame duration of a stream (of a particular format, size) * is the same regardless of whether the stream is input or output.
*See {@link CaptureRequest#SENSOR_FRAME_DURATION android.sensor.frameDuration} and * android.scaler.availableStallDurations for more details about * calculating the max frame rate.
*(Keep in sync with {@link android.hardware.camera2.params.StreamConfigurationMap#getOutputMinFrameDuration })
*Units: (format, width, height, ns) x n
*Optional - This value may be {@code null} on some devices.
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @see CaptureRequest#SENSOR_FRAME_DURATION * @hide */ public static final KeyThis lists the maximum stall duration for each * output format/size combination for depth streams.
*A stall duration is how much extra time would get added * to the normal minimum frame duration for a repeating request * that has streams with non-zero stall.
*This functions similarly to * android.scaler.availableStallDurations for depth * streams.
*All depth output stream formats may have a nonzero stall * duration.
*Units: (format, width, height, ns) x n
*Optional - This value may be {@code null} on some devices.
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL * @hide */ public static final KeyIndicates whether a capture request may target both a * DEPTH16 / DEPTH_POINT_CLOUD output, and normal color outputs (such as * YUV_420_888, JPEG, or RAW) simultaneously.
*If TRUE, including both depth and color outputs in a single * capture request is not supported. An application must interleave color * and depth requests. If FALSE, a single request can target both types * of output.
*Typically, this restriction exists on camera devices that * need to emit a specific pattern or wavelength of light to * measure depth values, which causes the color image to be * corrupted during depth measurement.
*Optional - This value may be {@code null} on some devices.
*Limited capability - * Present on all camera devices that report being at least {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL_LIMITED HARDWARE_LEVEL_LIMITED} devices in the * {@link CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL android.info.supportedHardwareLevel} key
* * @see CameraCharacteristics#INFO_SUPPORTED_HARDWARE_LEVEL */ @PublicKey public static final Key