- Added a new benchmarking tool in the zed-ros2-examples repository to get frequency and bandwidth information from a subscribed topic and plot the data.
- The parameter `general.sdk_verbose` is now an integer accepting different SDK verbose levels.
- Moved the Object Detection parameters from the camera configuration files to `common.yaml`.
- Moved the sensor parameters from the camera configuration files to `common.yaml`.
- RGB/Depth data publishing moved from a timer to a thread.
- Sensor data publishing moved from a timer to a thread.
- New data thread configuration to maximize the data publishing frequency.
- Fixed random errors when closing the node.
- Improved TF broadcasting at the grabbing frequency.
- Improved IMU/Left Camera TF broadcasting at the IMU frequency.
- Fixed the data grabbing frame rate when publishing is set to a lower value.
- The ZED node no longer publishes topics with TRANSIENT LOCAL durability. Thx.
- Enabled Intra Process Communication (IPC).
- Set the `read_only` flag in the parameter descriptor for non-dynamic parameters.
- Setting `depth.quality` to 0 now correctly disables depth extraction and all the sub-modules that depend on it.
- Added a sub-set of parameters for debugging, with the new parameters `debug_mode` and `debug_sensors`.
- Fixed a wrong behavior of the `set_pose` service: the initial odometry is now coherent with the new starting point passed as a parameter. Thx.
- Added support for ROS2 Humble Hawksbill; the two ROS2 LTS releases are now supported simultaneously.
- Added the option to open the camera by its serial number.
- Improved the installation script with better Python package folder detection (reported by a GitHub user).
- Released a new plugin compatible with Unreal Engine 5.
- The ZED SDK installers are now half the size and twice as fast to extract; the `zstd` package must be installed (`sudo apt install zstd`) to use them.
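As an illustration of the wrapper parameters mentioned above, a `common.yaml` excerpt might look like the following. This is a hypothetical sketch: the exact key names and nesting depend on the wrapper version, so check the shipped configuration files.

```yaml
# Hypothetical excerpt from common.yaml (key names may differ by wrapper version)
general:
  sdk_verbose: 1        # now an integer: 0 disables, higher values increase SDK verbosity
depth:
  quality: 0            # 0 disables depth extraction and every sub-module that depends on it
debug:
  debug_mode: false     # extra node-level debug logging
  debug_sensors: false  # extra sensor-data debug logging
```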
- On Linux, the installers now use zstandard as the default compression (previously gzip).
- Added a support function, `sl::convertImage`, to convert the image format from `sl::MAT_TYPE::U8C4` to `sl::MAT_TYPE::S8C4`, mainly for Unreal Engine.
- The `sl::Mat` class has been improved to allow faster transfers: `updateGPUfromCPU` and `updateCPUfromGPU`, as well as `copyTo`, now include a `cudaStream_t` parameter to asynchronously retrieve the data when using GPU memory.
- Introduced a new `sl::POSITIONAL_TRACKING_STATE::SEARCHING_FLOOR_PLANE` state, reported when you use `sl::PositionalTrackingParameters::set_floor_as_origin`, to know whether the positional tracking has started or not.
- Improved floor plane detection stability.
- Added the `ObjectDetectionParameters::prediction_timeout_s` parameter to tune the prediction time when an object disappears. When an object is no longer detected, the SDK predicts its position for a short period of time before its state switches to SEARCHING; this prevents jittering of the object state after a short misdetection.
- Added the `PositionalTrackingParameters::depth_min_range` parameter to change the minimum range used by the odometry. It may be useful to remove some fixed objects in near range.
- Added the `PositionalTrackingParameters::set_gravity_as_origin` parameter. It allows you to set the odometry world position using sensor data: if true (default), the `initial_world_transform` is aligned with the IMU gravity while keeping the user's yaw.
- Improved the sharpening behavior: the lowest setting now disables it, and the processing no longer produces color artifacts.
- Improved the Depth Viewer rendering pipeline, especially on Jetson.
- New int8 quantized inference mode: it provides 4x the throughput of fp32 on supported operations and decreases memory usage. The accuracy loss should not exceed 1-2% on compatible models; for now, only the HUMAN_BODY_xxxx models support it.
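To show how the positional tracking options above fit together, here is a minimal sketch. It assumes the ZED SDK headers and a connected camera, so it is not runnable standalone; the parameter and state names come from the notes above, but exact signatures may differ between SDK versions.

```cpp
// Sketch only: requires the ZED SDK (sl/Camera.hpp) and a connected camera.
#include <sl/Camera.hpp>

int main() {
    sl::Camera zed;
    if (zed.open() != sl::ERROR_CODE::SUCCESS) return 1;

    sl::PositionalTrackingParameters tracking_params;
    tracking_params.set_floor_as_origin = true;   // place the origin on the detected floor plane
    tracking_params.set_gravity_as_origin = true; // default: align initial_world_transform with IMU gravity
    tracking_params.depth_min_range = 0.5f;       // ignore fixed near-range objects in the odometry
    zed.enablePositionalTracking(tracking_params);

    sl::Pose pose;
    while (zed.grab() == sl::ERROR_CODE::SUCCESS) {
        auto state = zed.getPosition(pose, sl::REFERENCE_FRAME::WORLD);
        // New state: tracking has not started while the floor plane is still being searched
        if (state == sl::POSITIONAL_TRACKING_STATE::SEARCHING_FLOOR_PLANE) continue;
        break; // tracking started
    }
    zed.close();
    return 0;
}
```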
- Added a new `ObjectDetectionParameters::allow_reduced_precision_inference` parameter. It enables AI inference with quantization to improve runtime while allowing a slight decrease in accuracy. By default, all AI models run in fp16 (half), with no accuracy difference, when the GPU supports it (requires a Jetson, or a Volta GPU or newer), and fall back to fp32 otherwise.
- Two new samples are now available.
- Added a new `Camera::setRegionOfInterest` function. It defines a region of interest with a mask that the whole SDK focuses on, discarding the other parts, and is applied to all GPU processing. It can be useful to filter out known irrelevant objects, such as vehicle hoods or drone propellers, that can degrade positional tracking or depth estimation by adding noise.
- Improved the Neural depth runtime by up to 50%: Depth FPS now reaches 60 FPS on platforms such as Jetson AGX Orin, and 30 FPS for 2 cameras. This speed-up is a combination of network optimization, pruning, and improved usage of FP16 inference for supported configurations.
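The region-of-interest and reduced-precision options above could be combined along these lines. Again a hedged sketch, not a definitive implementation: it requires the ZED SDK and a camera, and the mask type (assumed here to be a single-channel `U8` image) should be checked against the SDK documentation for your version.

```cpp
// Sketch only: requires the ZED SDK and a connected camera; names follow the notes above.
#include <sl/Camera.hpp>

int main() {
    sl::Camera zed;
    if (zed.open() != sl::ERROR_CODE::SUCCESS) return 1;

    // Quantized (reduced precision) inference for compatible detection models
    sl::ObjectDetectionParameters od_params;
    od_params.allow_reduced_precision_inference = true; // int8 where supported, ~1-2% max accuracy loss
    od_params.prediction_timeout_s = 0.2f;              // keep predicting a lost object for 200 ms
    zed.enableObjectDetection(od_params);

    // Mask out a fixed region (e.g. a vehicle hood); masked pixels are discarded SDK-wide
    sl::Mat roi_mask(zed.getCameraInformation().camera_configuration.resolution,
                     sl::MAT_TYPE::U8_C1); // assumed mask format
    // ... fill roi_mask: non-zero = keep, zero = discard ...
    zed.setRegionOfInterest(roi_mask);

    zed.close();
    return 0;
}
```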