Class Camera
Defined in File Camera.hpp
Inheritance Relationships
Base Type
public dai::NodeCRTP<Node, Camera, CameraProperties> (Template Class NodeCRTP)
Class Documentation
-
class Camera : public dai::NodeCRTP<Node, Camera, CameraProperties>
Camera node. Experimental node that supports both mono and color sensor types.
Public Functions
Constructs Camera node.
-
void setBoardSocket(CameraBoardSocket boardSocket)
Specify which board socket to use.
- Parameters:
boardSocket – Board socket to use
-
CameraBoardSocket getBoardSocket() const
Retrieves which board socket is in use.
- Returns:
Board socket to use
-
void setCamera(std::string name)
Specify which camera to use by name.
- Parameters:
name – Name of the camera to use
-
std::string getCamera() const
Retrieves the name of the camera in use.
- Returns:
Name of the camera to use
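As a usage illustration, a minimal pipeline sketch in the spirit of the DepthAI C++ API: the socket choice CAM_A, the stream name "video", and the chosen size/fps are arbitrary placeholders, and a connected device is needed to actually run it.

```cpp
#include <depthai/depthai.hpp>

int main() {
    dai::Pipeline pipeline;

    // Create the Camera node and pick a sensor, by socket or by name.
    auto cam = pipeline.create<dai::node::Camera>();
    cam->setBoardSocket(dai::CameraBoardSocket::CAM_A);
    // Alternatively: cam->setCamera("color");

    // Desired resolution and frame rate (see setSize / setFps below).
    cam->setSize(1920, 1080);
    cam->setFps(30.0f);

    // Link the 'video' output so frames can be consumed on the host.
    auto xout = pipeline.create<dai::node::XLinkOut>();
    xout->setStreamName("video");
    cam->video.link(xout->input);
    return 0;
}
```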
-
void setImageOrientation(CameraImageOrientation imageOrientation)
Set camera image orientation.
-
CameraImageOrientation getImageOrientation() const
Get camera image orientation.
-
void setSize(std::tuple<int, int> size)
Set desired resolution. Sets sensor size to best fit.
-
void setSize(int width, int height)
Set desired resolution. Sets sensor size to best fit.
-
void setPreviewSize(int width, int height)
Set preview output size.
-
void setPreviewSize(std::tuple<int, int> size)
Set preview output size, as a tuple <width, height>
-
void setVideoSize(int width, int height)
Set video output size.
-
void setVideoSize(std::tuple<int, int> size)
Set video output size, as a tuple <width, height>
-
void setStillSize(int width, int height)
Set still output size.
-
void setStillSize(std::tuple<int, int> size)
Set still output size, as a tuple <width, height>
-
void setFps(float fps)
Set the rate at which the camera produces frames.
- Parameters:
fps – Rate in frames per second
-
void setIsp3aFps(int isp3aFps)
Set the rate of the ISP 3A algorithms (auto focus, auto exposure, auto white balance, camera controls, etc.). The default (0) matches the camera FPS, meaning 3A runs on every frame. Reducing the 3A rate lowers CPU usage on CSS, but also slows 3A convergence. Note that camera controls are processed at this rate: e.g. if the camera runs at 30 fps and a camera control is sent on every frame while 3A fps is set to 15, the control messages are processed at only 15 fps, which leads to queueing.
-
float getFps() const
Get the rate at which the camera produces frames.
- Returns:
Rate in frames per second
-
std::tuple<int, int> getPreviewSize() const
Get preview size as tuple.
-
int getPreviewWidth() const
Get preview width.
-
int getPreviewHeight() const
Get preview height.
-
std::tuple<int, int> getVideoSize() const
Get video size as tuple.
-
int getVideoWidth() const
Get video width.
-
int getVideoHeight() const
Get video height.
-
std::tuple<int, int> getStillSize() const
Get still size as tuple.
-
int getStillWidth() const
Get still width.
-
int getStillHeight() const
Get still height.
-
std::tuple<int, int> getSize() const
Get sensor resolution as size.
-
int getWidth() const
Get sensor resolution width.
-
int getHeight() const
Get sensor resolution height.
-
void setMeshSource(Properties::WarpMeshSource source)
Set the source of the warp mesh or disable.
-
Properties::WarpMeshSource getMeshSource() const
Gets the source of the warp mesh.
-
void loadMeshFile(const dai::Path &warpMesh)
Specify a local filesystem path to the undistort mesh calibration file.
When a mesh calibration is set, it overrides the camera intrinsics/extrinsics matrices and the useHomographyRectification behavior. Mesh format: a sequence of (y,x) points stored as ‘float’, with coordinates from the input image to be mapped to the output. The mesh can be subsampled, as configured by setMeshStep. With a 1280x800 resolution and the default (16,16) step, the required mesh size is:
width: 1280 / 16 + 1 = 81
height: 800 / 16 + 1 = 51
-
void loadMeshData(span<const std::uint8_t> warpMesh)
Specify mesh calibration data for undistortion. See loadMeshFile for the expected data format.
-
void setMeshStep(int width, int height)
Set the distance between mesh points. Default: (32, 32)
-
std::tuple<int, int> getMeshStep() const
Gets the distance between mesh points.
-
void setCalibrationAlpha(float alpha)
Set calibration alpha parameter that determines FOV of undistorted frames.
-
tl::optional<float> getCalibrationAlpha() const
Get calibration alpha parameter that determines FOV of undistorted frames.
-
void setRawOutputPacked(bool packed)
Configures whether the camera raw frames are saved MIPI-packed to memory. The packed format is more efficient, consuming less memory on device and sending less data to the host: RAW10 stores 4 pixels in 5 bytes, RAW12 stores 2 pixels in 3 bytes. When packing is disabled (false), data is saved LSB-aligned: e.g. a RAW10 pixel is stored as a uint16, on bits 9..0: 0b0000’00pp’pppp’pppp. Default is auto: enabled for standard color/monochrome cameras, where the ISP can work with both packed and unpacked data, but disabled for other cameras such as ToF.
Public Members
-
CameraControl initialControl
Initial control options to apply to sensor
-
Input inputConfig = {*this, "inputConfig", Input::Type::SReceiver, false, 8, {{DatatypeEnum::ImageManipConfig, false}}}
Input for ImageManipConfig message, which can modify crop parameters at runtime.
Default queue is non-blocking with size 8
-
Input inputControl = {*this, "inputControl", Input::Type::SReceiver, true, 8, {{DatatypeEnum::CameraControl, false}}}
Input for CameraControl message, which can modify camera parameters at runtime.
Default queue is blocking with size 8
-
Output video = {*this, "video", Output::Type::MSender, {{DatatypeEnum::ImgFrame, false}}}
Outputs ImgFrame message that carries NV12 encoded (YUV420, UV plane interleaved) frame data.
Suitable for use with VideoEncoder node
-
Output preview = {*this, "preview", Output::Type::MSender, {{DatatypeEnum::ImgFrame, false}}}
Outputs ImgFrame message that carries BGR/RGB planar/interleaved encoded frame data.
Suitable for use with NeuralNetwork node
-
Output still = {*this, "still", Output::Type::MSender, {{DatatypeEnum::ImgFrame, false}}}
Outputs ImgFrame message that carries NV12 encoded (YUV420, UV plane interleaved) frame data.
The message is sent only when a CameraControl message with the captureStill command set arrives at inputControl.
-
Output isp = {*this, "isp", Output::Type::MSender, {{DatatypeEnum::ImgFrame, false}}}
Outputs ImgFrame message that carries YUV420 planar (I420/IYUV) frame data.
Generated by the ISP engine, and the source for the ‘video’, ‘preview’ and ‘still’ outputs
-
Output raw = {*this, "raw", Output::Type::MSender, {{DatatypeEnum::ImgFrame, false}}}
Outputs ImgFrame message that carries RAW10-packed (MIPI CSI-2 format) frame data.
Captured directly from the camera sensor, and the source for the ‘isp’ output.
-
Output frameEvent = {*this, "frameEvent", Output::Type::MSender, {{DatatypeEnum::ImgFrame, false}}}
Outputs metadata-only ImgFrame message as an early indicator of an incoming frame.
It’s sent on the MIPI SoF (start-of-frame) event, just after the exposure of the current frame has finished and before the exposure for the next frame starts. It can be used to synchronize various processes with camera capture. Fields populated: camera id, sequence number, timestamp
Public Static Functions
-
static int getScaledSize(int input, int num, int denom)
Computes the scaled size given a numerator and denominator.
Public Static Attributes
-
static constexpr const char *NAME = "Camera"
Protected Functions
-
virtual Properties &getProperties()