Abstract: Under ROS Kinetic, capture registered RGB-D images with a RealSense D435 depth camera, synthesize a point cloud from them, inspect it in rviz, and finally save it as a PCD file.
1. Assorted bugs
After the code compiled successfully, I opened rviz and added a PointCloud2 display. When I subscribed to the synthesized point cloud, visualization failed and the display reported:
1) Data size (9394656 bytes) does not match width (640) times height (480) times point_step (32). Dropping message.
Interpretation: running rostopic echo /pointcloud_topic on the raw point cloud published by the camera node shows that each raw frame contains 307200 points. The synthesized cloud contains 9394656 / 32 = 293583 points, roughly 290,000. So, presumably, every frame of the synthesized cloud was being dropped for some reason.
Cause: I had fixed the size of the synthesized cloud in advance to height = 480, width = 640. During synthesis, however, invalid measurements (d = 0) are discarded, so the actual number of points no longer matches the declared width × height, and the synthesized cloud's data is therefore dropped.
Solution: declare the cloud size like this instead: cloud->height = 1; cloud->width = cloud->points.size();
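A minimal sketch of that fix in context (cloud here is the pcl::PointCloud<pcl::PointXYZRGB>::Ptr that has just been filled point by point, as in the program in section 2):

// Declare the cloud as unorganized after skipping invalid depths:
// height = 1 and width = actual number of points, so that
// width * height * point_step matches the data size rviz receives.
cloud->height = 1;
cloud->width  = cloud->points.size();
cloud->is_dense = false;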
2) transform xxxxx;
Interpretation: rostopic echo /pointcloud_topic shows that the raw point cloud carries the following header information:
header:
  seq: 50114
  stamp:
    secs: 1528439158
    nsecs: 557543003
  frame_id: "camera_color_optical_frame"
From this I inferred that the synthesized cloud was missing its reference frame, header.frame_id. The XYZ coordinates of the points are expressed relative to some coordinate frame, so the point cloud has to state which frame that is.
Solution: add the header information to the point cloud message:
pub_pointcloud.header.frame_id = "camera_color_optical_frame"; pub_pointcloud.header.stamp = ros::Time::now();
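One subtlety about ordering: pcl::toROSMsg fills the message header from the PCL cloud's (empty) header, so these two lines should come after the conversion and before publishing. A minimal sketch, using the variable names from the program in section 2:

pcl::toROSMsg(*cloud, pub_pointcloud);                          // PCL cloud -> ROS message
pub_pointcloud.header.frame_id = "camera_color_optical_frame";  // frame the XYZ values refer to
pub_pointcloud.header.stamp = ros::Time::now();                 // timestamp used by rviz / TF
pointcloud_publisher.publish(pub_pointcloud);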
3) When saving the synthesized point cloud as a PCD file, I hit the following error:
[ INFO] [1528442016.931874649]: point cloud size = 0
terminate called after throwing an instance of 'pcl::IOException'
  what():  : [pcl::PCDWriter::writeASCII] Input point cloud has no data!
Aborted (core dumped)
After a lot of searching I worked out a trick, which I share below. I never fully pinned down the root cause and would welcome pointers from more experienced readers; judging from the log, though, the exception appears to be thrown because the cloud is still empty (point cloud size = 0) on the first pass through the loop, before any image callback has fired.
高博's original code is shown below, where cloud is a PCL point cloud type:
pcl::io::savePCDFile( "./pointcloud.pcd", *cloud );
My version, shown below, first converts the PCL cloud into a ROS message with pcl::toROSMsg() and then writes that out as a PCD file, which sidesteps the error.
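Roughly, the relevant lines look like the sketch below. On top of the trick itself I also guard against the empty cloud hinted at by the log, and hand the data to the PCD writer as pcl::PCLPointCloud2 via pcl_conversions::toPCL, which is the type pcl::io::savePCDFile accepts in the PCL shipped with Kinetic; variable names follow the program in section 2:

pcl::toROSMsg(*cloud, pub_pointcloud);                        // PCL cloud -> ROS message (the "trick")
if (!cloud->points.empty())  // the PCD writer aborts with "Input point cloud has no data!" on an empty cloud
{
    pcl::PCLPointCloud2 pcl_cloud2;
    pcl_conversions::toPCL(pub_pointcloud, pcl_cloud2);       // ROS message -> pcl::PCLPointCloud2 for the writer
    pcl::io::savePCDFile("./pointcloud.pcd", pcl_cloud2);     // write out the PCD file
}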
4) Camera intrinsics
Because the depth image is registered to the color image, i.e. the color camera's coordinate frame is used, the color camera's intrinsics must be used when synthesizing the point cloud (when back-projecting pixel coordinates into camera coordinates).
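For reference, the back-projection is just the pinhole model applied with the color intrinsics; a minimal sketch, with the same variable names as the program in section 2 (m/n are the pixel row/column, camera_factor scales the raw depth value to meters):

float d  = depth_pic.ptr<float>(m)[n];        // raw depth at pixel (row m, column n)
double z = double(d) / camera_factor;         // depth value -> meters
double x = (n - camera_cx) * z / camera_fx;   // back-project with the COLOR intrinsics,
double y = (m - camera_cy) * z / camera_fy;   // because depth is aligned to the color frame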
RealSense officially provides a tool that lists the intrinsics of every video stream.
gordon@gordon-5577:/usr/local/bin$ ./rs-sensor-control
Part of its output is shown below; a sketch that reads the same intrinsics from the ROS camera_info topic follows the listing.
84 : Color #0 (Video Stream: Y16 640x480@ 60Hz)
85 : Color #0 (Video Stream: BGRA8 640x480@ 60Hz)
86 : Color #0 (Video Stream: RGBA8 640x480@ 60Hz)
87 : Color #0 (Video Stream: BGR8 640x480@ 60Hz)
88 : Color #0 (Video Stream: RGB8 640x480@ 60Hz)
89 : Color #0 (Video Stream: YUYV 640x480@ 60Hz)
90 : Color #0 (Video Stream: Y16 640x480@ 30Hz)
91 : Color #0 (Video Stream: BGRA8 640x480@ 30Hz)
92 : Color #0 (Video Stream: RGBA8 640x480@ 30Hz)
93 : Color #0 (Video Stream: BGR8 640x480@ 30Hz)
94 : Color #0 (Video Stream: RGB8 640x480@ 30Hz)
95 : Color #0 (Video Stream: YUYV 640x480@ 30Hz)
96 : Color #0 (Video Stream: Y16 640x480@ 15Hz)
97 : Color #0 (Video Stream: BGRA8 640x480@ 15Hz)
98 : Color #0 (Video Stream: RGBA8 640x480@ 15Hz)
99 : Color #0 (Video Stream: BGR8 640x480@ 15Hz)
100: Color #0 (Video Stream: RGB8 640x480@ 15Hz)
101: Color #0 (Video Stream: YUYV 640x480@ 15Hz)
102: Color #0 (Video Stream: Y16 640x480@ 6Hz)
103: Color #0 (Video Stream: BGRA8 640x480@ 6Hz)
104: Color #0 (Video Stream: RGBA8 640x480@ 6Hz)
105: Color #0 (Video Stream: BGR8 640x480@ 6Hz)
106: Color #0 (Video Stream: RGB8 640x480@ 6Hz)
107: Color #0 (Video Stream: YUYV 640x480@ 6Hz)
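Alternatively, the color intrinsics can be read at runtime from the camera_info topic published by the ROS wrapper. A minimal sketch (the topic name /camera/color/camera_info is the realsense2_camera default and may differ with your launch settings):

#include <ros/ros.h>
#include <sensor_msgs/CameraInfo.h>

// K is the row-major 3x3 intrinsic matrix: [fx 0 cx; 0 fy cy; 0 0 1]
void info_Callback(const sensor_msgs::CameraInfoConstPtr& msg)
{
    ROS_INFO("fx = %.3f, fy = %.3f, cx = %.3f, cy = %.3f",
             msg->K[0], msg->K[4], msg->K[2], msg->K[5]);
}

int main(int argc, char** argv)
{
    ros::init(argc, argv, "color_info_listener");
    ros::NodeHandle nh;
    ros::Subscriber sub = nh.subscribe("/camera/color/camera_info", 1, info_Callback);
    ros::spin();
    return 0;
}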
5) Converting the depth image from the ROS message type to the OpenCV type
See the linked reference.
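In case that link is unavailable: the core of it is a cv_bridge::toCvCopy call whose target encoding matches how the driver publishes depth, 16UC1 in millimeters or 32FC1 in meters (REP 118). A minimal sketch inside the depth callback, where depth_msg is the sensor_msgs::ImageConstPtr argument and (m, n) an example pixel:

// 16UC1: depth in millimeters (typically how realsense2_camera publishes aligned depth)
cv_bridge::CvImagePtr ptr = cv_bridge::toCvCopy(depth_msg, sensor_msgs::image_encodings::TYPE_16UC1);
ushort d_mm = ptr->image.ptr<ushort>(m)[n];

// 32FC1: depth in meters; keep float access instead
// cv_bridge::CvImagePtr ptr = cv_bridge::toCvCopy(depth_msg, sensor_msgs::image_encodings::TYPE_32FC1);
// float d_m = ptr->image.ptr<float>(m)[n];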
2. Program code
#include <ros/ros.h>
#include <image_transport/image_transport.h>
#include <opencv2/highgui/highgui.hpp>
#include <cv_bridge/cv_bridge.h>
#include <sensor_msgs/image_encodings.h>
#include <sensor_msgs/PointCloud2.h>
// PCL headers
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>
#include <pcl/visualization/pcl_visualizer.h>
#include <pcl_conversions/pcl_conversions.h>

// Point cloud type
typedef pcl::PointCloud<pcl::PointXYZRGB> PointCloud;

using namespace std;
//namespace enc = sensor_msgs::image_encodings;

// Color camera intrinsics (see section 1, item 4)
const double camera_factor = 1000;   // scale between the raw depth value and meters
const double camera_cx = 321.798;
const double camera_cy = 239.607;
const double camera_fx = 615.899;
const double camera_fy = 616.468;

// Globals: latest images as cv::Mat
cv_bridge::CvImagePtr color_ptr, depth_ptr;
cv::Mat color_pic, depth_pic;

void color_Callback(const sensor_msgs::ImageConstPtr& color_msg)
{
    try
    {
        cv::imshow("color_view", cv_bridge::toCvShare(color_msg, sensor_msgs::image_encodings::BGR8)->image);
        color_ptr = cv_bridge::toCvCopy(color_msg, sensor_msgs::image_encodings::BGR8);
        cv::waitKey(1050); // refresh the window; the argument is the delay in ms
    }
    catch (cv_bridge::Exception& e)
    {
        ROS_ERROR("Could not convert from '%s' to 'bgr8'.", color_msg->encoding.c_str());
        return; // skip this frame so color_ptr is never dereferenced while null
    }
    color_pic = color_ptr->image;

    // output some info about the rgb image in cv format
    cout << "output some info about the rgb image in cv format" << endl;
    cout << "rows of the rgb image = " << color_pic.rows << endl;
    cout << "cols of the rgb image = " << color_pic.cols << endl;
    cout << "type of rgb_pic's element = " << color_pic.type() << endl;
}

void depth_Callback(const sensor_msgs::ImageConstPtr& depth_msg)
{
    try
    {
        //cv::imshow("depth_view", cv_bridge::toCvShare(depth_msg, sensor_msgs::image_encodings::TYPE_16UC1)->image);
        //depth_ptr = cv_bridge::toCvCopy(depth_msg, sensor_msgs::image_encodings::TYPE_16UC1);
        cv::imshow("depth_view", cv_bridge::toCvShare(depth_msg, sensor_msgs::image_encodings::TYPE_32FC1)->image);
        depth_ptr = cv_bridge::toCvCopy(depth_msg, sensor_msgs::image_encodings::TYPE_32FC1);
        cv::waitKey(1050);
    }
    catch (cv_bridge::Exception& e)
    {
        ROS_ERROR("Could not convert from '%s' to '32fc1'.", depth_msg->encoding.c_str());
        return; // skip this frame so depth_ptr is never dereferenced while null
    }
    depth_pic = depth_ptr->image;

    // output some info about the depth image in cv format
    cout << "output some info about the depth image in cv format" << endl;
    cout << "rows of the depth image = " << depth_pic.rows << endl;
    cout << "cols of the depth image = " << depth_pic.cols << endl;
    cout << "type of depth_pic's element = " << depth_pic.type() << endl;
}

int main(int argc, char **argv)
{
    ros::init(argc, argv, "image_listener");
    ros::NodeHandle nh;
    cv::namedWindow("color_view");
    cv::namedWindow("depth_view");
    cv::startWindowThread();

    image_transport::ImageTransport it(nh);
    image_transport::Subscriber sub  = it.subscribe("/camera/color/image_raw", 1, color_Callback);
    image_transport::Subscriber sub1 = it.subscribe("/camera/aligned_depth_to_color/image_raw", 1, depth_Callback);
    ros::Publisher pointcloud_publisher = nh.advertise<sensor_msgs::PointCloud2>("generated_pc", 1);

    // Point cloud variables: an empty cloud held by a smart pointer (released automatically)
    // and the ROS message it is converted into for publishing.
    PointCloud::Ptr cloud(new PointCloud);
    sensor_msgs::PointCloud2 pub_pointcloud;

    double sample_rate = 1.0;       // 1 Hz, i.e. one cloud per second
    ros::Rate naptime(sample_rate); // used to regulate the loop rate

    cout << "depth value of depth map : " << endl;

    while (ros::ok())
    {
        // Walk over the depth image
        for (int m = 0; m < depth_pic.rows; m++)
        {
            for (int n = 0; n < depth_pic.cols; n++)
            {
                // Depth value at (m, n)
                float d = depth_pic.ptr<float>(m)[n]; // ushort d = depth_pic.ptr<ushort>(m)[n];
                // d == 0 means no measurement: skip this pixel
                if (d == 0)
                    continue;

                // Valid measurement: add a point to the cloud
                pcl::PointXYZRGB p;
                // Back-project into camera coordinates using the color intrinsics
                p.z = double(d) / camera_factor;
                p.x = (n - camera_cx) * p.z / camera_fx;
                p.y = (m - camera_cy) * p.z / camera_fy;
                // Take the color from the RGB image; the cv::Mat is 3-channel BGR,
                // hence the channel order below
                p.b = color_pic.ptr<uchar>(m)[n * 3];
                p.g = color_pic.ptr<uchar>(m)[n * 3 + 1];
                p.r = color_pic.ptr<uchar>(m)[n * 3 + 2];
                // Append p to the cloud
                cloud->points.push_back(p);
            }
        }

        // Declare the cloud as unorganized (see bug 1) and convert it for publishing
        cloud->height = 1;
        cloud->width = cloud->points.size();
        ROS_INFO("point cloud size = %u", cloud->width);
        cloud->is_dense = false;

        pcl::toROSMsg(*cloud, pub_pointcloud);
        pub_pointcloud.header.frame_id = "camera_color_optical_frame"; // see bug 2
        pub_pointcloud.header.stamp = ros::Time::now();

        // Save the cloud as a PCD file, skipping empty clouds (see bug 3);
        // the ROS message is converted to pcl::PCLPointCloud2, which the PCD writer accepts
        if (!cloud->points.empty())
        {
            pcl::PCLPointCloud2 pcl_cloud2;
            pcl_conversions::toPCL(pub_pointcloud, pcl_cloud2);
            pcl::io::savePCDFile("./pointcloud.pcd", pcl_cloud2);
        }

        cout << "publish point_cloud height = " << pub_pointcloud.height << endl;
        cout << "publish point_cloud width = " << pub_pointcloud.width << endl;

        // Publish the synthesized point cloud
        pointcloud_publisher.publish(pub_pointcloud);

        // Clear the cloud for the next frame
        cloud->points.clear();
        ros::spinOnce();  // allow data update from the callbacks
        naptime.sleep();  // wait for the remainder of the specified period
    }
    cv::destroyWindow("color_view");
    cv::destroyWindow("depth_view");
}
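To try the node out, the camera driver has to publish the aligned depth stream used above. With the realsense2_camera wrapper that is typically enabled through the align_depth launch argument (argument name taken from the wrapper's launch files; check your version):

roslaunch realsense2_camera rs_camera.launch align_depth:=true

after which this node can be started with rosrun from whatever package it is built in.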