The Normal Distributions Transform (NDT) algorithm is a registration algorithm that applies a statistical model to 3-D points and uses standard optimization techniques to determine the optimal match between two point clouds. Because it does not rely on computing and matching point-correspondence features during registration, it is faster than other methods. For details of the algorithm, see NDT(Normal Distributions Transform)算法原理與公式推導. The matchScans function in the MATLAB Robotics System Toolbox uses the NDT algorithm to match two lidar scans and obtain the relative transformation between them.
The matchScans function matches two lidar scans and outputs the pose transformation between them. The main usages are:
pose = matchScans(currScan,refScan)
% Finds the relative pose between a reference lidarScan and a current
% lidarScan object using the normal distributions transform (NDT).

pose = matchScans(currRanges,currAngles,refRanges,refAngles)
% Finds the relative pose between two laser scans specified as ranges and angles.

[pose,stats] = matchScans(___)
% Returns additional statistics about the scan match result using the
% previous input arguments.
Output arguments:
pose — Pose of the current scan relative to the reference scan, returned as [x y theta], where [x y] is the translation in meters and theta is the rotation in radians.

Score — Numeric scalar representing the NDT score while performing scan matching. This score is an estimate of the likelihood that the transformed current scan matches the reference scan. Score is always nonnegative. Larger scores indicate a better match.

Hessian — 3-by-3 matrix representing the Hessian of the NDT cost function at the given pose solution. The Hessian is used as an indicator of the uncertainty associated with the pose estimate.
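One way the Hessian can be put to use (a hedged sketch, not from the MATLAB documentation): a common heuristic treats the inverse Hessian at the solution as an approximate covariance of the pose estimate. Assuming currScan and refScan already exist:

[pose, stats] = matchScans(currScan, refScan);
% Heuristic (assumption): approximate the pose covariance by the inverse
% Hessian of the NDT cost; only meaningful if the Hessian is well conditioned.
approxCov = inv(stats.Hessian);
poseStd = sqrt(diag(approxCov))'   % rough 1-sigma uncertainty of [x y theta]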
Other input arguments (optional name-value pairs):

'SolverAlgorithm' — Optimization solver, specified as 'trust-region' (default) or 'fminunc'.

'ScoreTolerance' — Tolerance on the NDT score, specified as a numeric scalar (default 1e-6). The NDT score is stored in the Score field of the output stats structure. Between iterations, if the score changes by less than this tolerance, the algorithm converges to a solution. A smaller tolerance results in more accurate pose estimates, but requires a longer execution time.

'CellSize' — matchScans uses the cell size to discretize the space for the NDT algorithm. Tuning the cell size is important for proper use of the NDT algorithm. The optimal cell size depends on the input scans and the environment of your robot. Larger cell sizes can lead to less accurate matching with poorly sampled areas. Smaller cell sizes require more memory and less variation between subsequent scans. Sensor noise influences the algorithm with smaller cell sizes as well. Choosing a proper cell size depends on the scale of your environment and the input data.
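As a minimal sketch of passing these name-value pairs (the specific values here are illustrative, not tuning recommendations):

refScan = lidarScan(5*ones(1,300), linspace(-pi/2,pi/2,300));
currScan = transformScan(refScan, [0.3 0.1 0.05]);
[pose, stats] = matchScans(currScan, refScan, ...
    'CellSize', 0.5, ...          % smaller cells for a finer NDT grid
    'ScoreTolerance', 1e-7);      % tighter convergence tolerance
stats.Score                       % larger score indicates a better match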
First, use the lidarScan function to create an object for storing 2-D lidar scan information. There are three main ways to use it:
scan = lidarScan(ranges, angles)
% Creates a lidarScan object from the ranges and angles that represent the
% data collected from a lidar sensor. The ranges and angles inputs are vectors
% of the same length and are set directly to the Ranges and Angles properties.

scan = lidarScan(cart)
% Creates a lidarScan object using the input Cartesian coordinates as an
% n-by-2 matrix. The Cartesian property is set directly from this input.

scan = lidarScan(scanMsg)
% Creates a lidarScan object from a LaserScan ROS message object.
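As a small aside, the Cartesian form can describe the same semicircle of points used in the range-angle example below (hypothetical data):

angles = linspace(-pi/2, pi/2, 300);
cart = [5*cos(angles)' 5*sin(angles)'];  % n-by-2 [x y] points on a 5 m arc
scanC = lidarScan(cart);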
Let's create a reference scan in range-angle form:
refRanges = 5*ones(1,300);
refAngles = linspace(-pi/2,pi/2,300);
refScan = lidarScan(refRanges,refAngles);
Then use transformScan to transform the reference scan to another pose:
currScan = transformScan(refScan,[0.5 0.2 0]); % transforms the laser scan by using the specified relative pose
Now use the matchScans function to compute the relative transformation between the two scans:
pose = matchScans(currScan,refScan);
The result is pose = -0.4999 -0.2092 0.0185, which is approximately the inverse of the [0.5 0.2 0] transform applied above, as expected.
Finally, use transformScan again to transform the second scan by the computed pose and see how well it overlaps the first scan:
currScan2 = transformScan(currScan,pose);

subplot(2,1,1);
hold on
plot(currScan)
plot(refScan)
title('Original Scans')
hold off

subplot(2,1,2);
hold on
plot(currScan2)
plot(refScan)
title('Aligned Scans')
xlim([0 5])
hold off
As the figure below shows, the two scans match very well:
Specify a reference laser scan as ranges and angles:
refRanges = 5*ones(1,300);
refAngles = linspace(-pi/2,pi/2,300);
Use the transformScan function to transform the reference data and generate a second laser scan:
[currRanges,currAngles] = transformScan(refRanges,refAngles,[0.5 0.2 0]);
When calling matchScans, you can also specify other parameters, such as the solver algorithm, the initial pose estimate, and the number of iterations. (Specify optional comma-separated pairs of Name,Value arguments. Name is the argument name and Value is the corresponding value. Name must appear inside single quotes (' '). You can specify several name and value pair arguments in any order as Name1,Value1,...,NameN,ValueN.)
pose = matchScans(currRanges,currAngles,refRanges,refAngles,'SolverAlgorithm','fminunc');
The result is pose = -0.5417 0.0165 -0.4096; the error is clearly quite large.
The estimate can be improved by supplying an initial pose estimate:
pose = matchScans(currRanges,currAngles,refRanges,refAngles,'SolverAlgorithm','fminunc','InitialPose',[-0.4 -0.1 0]);
The result is pose = -0.5100 -0.1834 -0.0330, much better than without an initial estimate.
Finally, transform the second scan back using the computed result and compare:
[currRanges2,currAngles2] = transformScan(currRanges,currAngles,pose);

[x1,y1] = pol2cart(refAngles,refRanges);
[x2,y2] = pol2cart(currAngles,currRanges);
[x3,y3] = pol2cart(currAngles2,currRanges2);

subplot(1,2,1)
plot(x1,y1,'o',x2,y2,'*r')
title('Original Scans')

subplot(1,2,2)
plot(x1,y1,'o',x3,y3,'*r')
title('Aligned Scans')
The goal of scan matching is to find the relative pose (or transform) between the two robot positions where the scans were taken. The scans can be aligned based on the shapes of their overlapping features. To estimate this pose, NDT subdivides the laser scan into 2D cells and each cell is assigned a corresponding normal distribution. The distribution represents the probability of measuring a point in that cell. Once the probability density is calculated, an optimization method finds the relative pose between the current laser scan and the reference laser scan. To speed up the convergence of the method, an initial guess of the pose can be provided. Typically, robot odometry is used to supply the initial estimate.
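To make the per-cell model concrete, here is a conceptual sketch (my own illustration, not the toolbox implementation) of the normal distribution assigned to a single cell, using made-up points:

% Hypothetical scan points that fall inside one NDT cell
pts = [4.9 0.10; 5.1 -0.05; 5.0 0.15; 4.8 -0.10];
mu = mean(pts, 1);     % cell mean
Sigma = cov(pts);      % cell covariance
% Probability density of measuring a point x in this cell
x = [5.0 0.1];
d = x - mu;
p = exp(-0.5 * (d / Sigma) * d') / (2*pi*sqrt(det(Sigma)))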
If you apply scan matching to a sequence of scans, you can use it to recover a rough map of the environment that the robot traverses. Scan matching also plays a crucial role in other applications, such as position tracking and Simultaneous Localization and Mapping (SLAM).
We can estimate the robot's pose by matching laser scans. Let's look at an example. Change the working directory to Program Files\MATLAB\R2017b\toolbox\robotics\robotexamples\robotalgs:
Load the laser scan data file:
filePath = fullfile(fileparts(mfilename('fullpath')), 'data', 'scanMatchingData.mat');
load(filePath);
The laser data was collected by a mobile robot indoors; the robot's approximate path is shown in the figure below:
Next, pick two scans from laserMsg that are close together in the sequence and plot them:
referenceScan = lidarScan(laserMsg{180});
currentScan = lidarScan(laserMsg{202});

currScanCart = currentScan.Cartesian;
refScanCart = referenceScan.Cartesian;

figure
plot(refScanCart(:,1), refScanCart(:,2), 'k.');
hold on
plot(currScanCart(:,1), currScanCart(:,2), 'r.');
legend('Reference laser scan', 'Current laser scan', 'Location', 'NorthWest');
Because these two scans were taken close together in the sequence, they share many common features.
Use matchScans to compute the transformation between the two scans:
transform = matchScans(currentScan, referenceScan)
transform =
0.5348 -0.0065 -0.0336
To verify that the computed relative transformation is correct, use the transformScan function to transform currentScan and check whether the two scans overlap. The transformed laser scan can be used to visualize the result.
transScan = transformScan(currentScan, transform);
Plot the two laser scans:
figure
plot(refScanCart(:,1), refScanCart(:,2), 'k.');
hold on
transScanCart = transScan.Cartesian;
plot(transScanCart(:,1), transScanCart(:,2), 'r.');
legend('Reference laser scan', 'Transformed current laser scan', 'Location', 'NorthWest');
If the computed transformation is accurate, the two scans will be well aligned.
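Besides visual inspection, match quality can be checked quantitatively with the stats output. The per-point threshold below is the same heuristic used in the mapping loop later in this post; the value 1.0 is arbitrary:

[transform, stats] = matchScans(currentScan, referenceScan);
if stats.Score / currentScan.Count < 1.0   % heuristic per-point score threshold
    disp('Possible low-quality match');
end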
Build Occupancy Grid Map Using Iterative Scan Matching
If we apply scan matching to a sequence of consecutive scans, we can recover a rough map of the environment from that information. Let's look at an example. Create a new empty 15m-by-15m map with the grid origin at [-7.5 -7.5]:
map = robotics.OccupancyGrid(15, 15, 20);
map.GridLocationInWorld = [-7.5 -7.5]
map =
OccupancyGrid with properties:
OccupiedThreshold: 0.6500
FreeThreshold: 0.2000
ProbabilitySaturation: [0.0010 0.9990]
GridSize: [300 300]
Resolution: 20
XWorldLimits: [-7.5000 7.5000]
YWorldLimits: [-7.5000 7.5000]
GridLocationInWorld: [-7.5000 -7.5000]
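Before the full loop below, here is a minimal illustration of how a single scan is integrated into such a grid (using a throwaway map, with a made-up pose and max range):

demoMap = robotics.OccupancyGrid(15, 15, 20);
demoMap.GridLocationInWorld = [-7.5 -7.5];
scan = lidarScan(5*ones(1,100), linspace(-pi/2,pi/2,100));
insertRay(demoMap, [0 0 0], scan, 10);  % pose [x y theta], assumed 10 m max range
show(demoMap)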
Each laser scan corresponds to a robot pose, and we infer the robot's motion from the matches between consecutive scans. Take the first scan as the reference position and assume an initial robot pose of [0 0 0]. Preallocate a matrix poseList to store the positions and orientations along the robot's path:
numScans = numel(laserMsg);
initialPose = [0 0 0]; % Typically, robot odometry is used to supply the initial estimate.
poseList = zeros(numScans,3);
poseList(1,:) = initialPose;
transform = initialPose;
The following code loops over all the laser scans, computes the transformation between each pair of adjacent scans, uses the exampleHelperComposeTransform function to obtain the robot's absolute pose relative to the initial reference point, and updates the map with the robotics.OccupancyGrid.insertRay function:
% Loop through all the scans and calculate the relative poses between them
for idx = 2:numScans
    % Process the data in pairs.
    referenceScan = lidarScan(laserMsg{idx-1});
    currentScanMsg = laserMsg{idx};
    currentScan = lidarScan(currentScanMsg);

    % Run scan matching. Note that the scan angles stay the same and do
    % not have to be recomputed. To increase accuracy, set the maximum
    % number of iterations to 500. Use the transform from the last
    % iteration as the initial estimate.
    [transform, stats] = matchScans(currentScan, referenceScan, ...
        'MaxIterations', 500, 'InitialPose', transform);

    % The |Score| in the statistics structure is a good indication of the
    % quality of the scan match.
    if stats.Score / currentScan.Count < 1.0
        disp(['Low scan match score for index ' num2str(idx) '. Score = ' num2str(stats.Score) '.']);
    end

    % Maintain the list of robot poses.
    absolutePose = exampleHelperComposeTransform(poseList(idx-1,:), transform);
    poseList(idx,:) = absolutePose;

    % Integrate the current laser scan into the probabilistic occupancy grid.
    insertRay(map, absolutePose, currentScan, double(currentScanMsg.RangeMax));
end
The exampleHelperComposeTransform function is defined as follows:
function composedTransform = exampleHelperComposeTransform(baseTransform, relativeTransform)
%exampleHelperComposeTransform Compose two transforms
%   The RELATIVETRANSFORM is added to the BASETRANSFORM and the composed
%   transform is returned in COMPOSEDTRANSFORM.
%   BASETRANSFORM is the transform from laser scan 1 to world and
%   RELATIVETRANSFORM is the transform from laser scan 2 to laser scan 1
%   (as returned by matchScans).

% Concatenate the 4x4 homogeneous transform matrices for the base and
% relative transforms.
tform = pose2tform(baseTransform) * pose2tform(relativeTransform);

% Extract the translational vector
trvec = tform2trvec(tform);

% Extract the yaw angle from the resulting transform
eul = tform2eul(tform);
theta = eul(1);  % The default order for Euler angle rotations is 'ZYX'

% Composed transform has structure [x y theta(z-axis)]
composedTransform = [trvec(1:2) theta];
end
function tform = pose2tform(pose)
%pose2tform Convert [x y theta] pose into homogeneous transform
%   TFORM is returned as a 4x4 matrix.

x = pose(1);
y = pose(2);
theta = pose(3);
tform = trvec2tform([x y 0]) * eul2tform([theta 0 0]);
end
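A quick sanity check of these helpers with made-up values: a robot at (1,0) facing the +y direction moves 0.5 m forward in its own frame.

base = [1 0 pi/2];   % base pose [x y theta]
rel = [0.5 0 0];     % relative motion in the robot frame
composed = exampleHelperComposeTransform(base, rel)
% Expected: approximately [1.0000 0.5000 1.5708]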
Display the completed map:
figure
show(map);
title('Occupancy grid map built using scan matching results');
Finally, plot the absolute robot poses calculated by the scan matching algorithm; this shows the path the robot took through the map of the environment:
hold on
plot(poseList(:,1), poseList(:,2), 'bo', 'DisplayName', 'Estimated robot position');
References:
Estimate Robot Pose with Scan Matching
Compose a Series of Laser Scans with Pose Changes
NDT(Normal Distributions Transform)算法原理與公式推導
使用正態分佈變換(Normal Distributions Transform)進行點雲配準
The Normal Distributions Transform: A New Approach to Laser Scan Matching