This example shows how to track objects in front of the ego vehicle by fusing data from vision and millimeter-wave radar sensors, and how to issue a forward collision warning when the situation becomes dangerous.

42.1 Overview

Forward collision warning (FCW) is an important feature in driver assistance and automated driving systems, where the goal is to provide correct, timely, and reliable warnings to the driver before an impending collision with the vehicle in front. To achieve this goal, the vehicle is equipped with forward-facing vision and millimeter-wave radar sensors. Sensor fusion is required to increase the probability of an accurate warning and to minimize the probability of a false warning.

In this example, a test vehicle (the ego vehicle) was equipped with various sensors and their outputs were recorded. The sensors used in this example are:

1. A vision sensor, which provides a list of observed objects with their classification, together with lane boundary information. The object list is reported 10 times per second, i.e., every 100 ms. Lane boundaries are reported 20 times per second, i.e., every 50 ms.
2. A millimeter-wave radar sensor with medium- and long-range modes, which provides a list of unclassified observed objects. The object list is reported 20 times per second, i.e., every 50 ms.
3. An IMU, which reports the ego vehicle's speed and turn rate 20 times per second, i.e., every 50 ms.
4. A camera, which records a video clip of the scene in front of the car. Note: this video is not used by the tracker; it only serves to display the tracking results on the video for verification, similar to a dashcam.

The process of providing a forward collision warning comprises the following steps:

1. Obtain the data from the sensors.
2. Fuse the sensor data to get a list of tracks, i.e., estimated positions and velocities of the objects in front of the car.
3. Issue warnings based on the tracks and FCW criteria. The FCW criteria are based on the Euro-NCAP AEB test procedure and take into account the relative distance and relative speed to the object in front of the car.

The visualization in this example is done using monoCamera and birdsEyePlot. For brevity, the functions that create and update the display were moved to helper functions outside this example. This example is a script; the main body is shown here, and the helper routines appear as local functions in later sections. For more details about local functions, see "Add Functions to Scripts".

% Set up the display
[videoReader, videoDisplayHandle, bepPlotters, sensor] = helperCreateFCWDemoDisplay('01_city_c2s_fcw_10s.mp4', 'SensorConfigurationData.mat');

% Read the recorded detections file
[visionObjects, radarObjects, inertialMeasurementUnit, laneReports, ...
    timeStep, numSteps] = readSensorRecordingsFile('01_city_c2s_fcw_10s_sensor.mat');

% An initial ego lane is calculated. If the recorded lane information is
% invalid, define the lane boundaries as straight lines half a lane
% distance on each side of the car
laneWidth = 3.6; % meters
egoLane = struct('left', [0 0 laneWidth/2], 'right', [0 0 -laneWidth/2]);

% Prepare some time variables
time = 0;        % Time since the beginning of the recording
currentStep = 0; % Current timestep
snapTime = 9.3;  % The time to capture a snapshot of the display

% Initialize the tracker
[tracker, positionSelector, velocitySelector] = setupTracker();

while currentStep < numSteps && ishghandle(videoDisplayHandle)
    % Update scenario counters
    currentStep = currentStep + 1;
    time = time + timeStep;

    % Process the sensor detections as objectDetection inputs to the tracker
    [detections, laneBoundaries, egoLane] = processDetections(...
        visionObjects(currentStep), radarObjects(currentStep), ...
        inertialMeasurementUnit(currentStep), laneReports(currentStep), ...
        egoLane, time);

    % Using the list of objectDetections, return the tracks, updated to time
    confirmedTracks = updateTracks(tracker, detections, time);

    % Find the most important object and calculate the forward collision
    % warning
    mostImportantObject = findMostImportantObject(confirmedTracks, egoLane, positionSelector, velocitySelector);

    % Update video and birds-eye plot displays
    frame = readFrame(videoReader); % Read video frame
    helperUpdateFCWDemoDisplay(frame, videoDisplayHandle, bepPlotters, ...
        laneBoundaries, sensor, confirmedTracks, mostImportantObject, positionSelector, ...
        velocitySelector, visionObjects(currentStep), radarObjects(currentStep));

    % Capture a snapshot
    if time >= snapTime && time < snapTime + timeStep
        snapnow;
    end
end

42.2 Create the Multi-Object Tracker

The multiObjectTracker tracks the objects around the ego vehicle based on the object lists reported by the vision and radar sensors. By fusing information from both sensors, the probability of a false collision warning is reduced.

The setupTracker function returns the multiObjectTracker. When creating a multiObjectTracker, consider the following:
1. FilterInitializationFcn: the likely motion and measurement models. In this case, the objects are expected to have a constant-acceleration motion. Although you could configure a linear Kalman filter for this model, initConstantAccelerationFilter configures an extended Kalman filter. See the "Define a Kalman Filter" section.
2. AssignmentThreshold: how far detections may fall from tracks and still be assigned to them. The default value of this parameter is 30. If there are detections that are not assigned to tracks, but should be, increase this value. If there are detections that get assigned to tracks that are too far away, decrease this value. This example uses 35.
3. DeletionThreshold: when a track is confirmed, it should not be deleted on the first update in which no detection is assigned to it. Instead, it should be coasted (predicted) until it is clear that the track is not getting any sensor information to update it. The logic is that a track should be deleted if it is missed P out of Q times. The default value of this parameter is 5 out of 5. In this case, the tracker is called 20 times a second and there are two sensors, so there is no need to modify the default.
4. ConfirmationThreshold: the parameters for confirming a track. Every unassigned detection initializes a new track. Some of these detections may be false, so all tracks are initialized as tentative. To confirm a track, it has to be detected at least M times in N tracker updates. The choice of M and N depends on the visibility of the objects. This example uses the default of 2 detections out of 3 updates.

The outputs of setupTracker are:
- tracker: the multiObjectTracker configured for this example.
- positionSelector: a matrix that specifies which elements of the state vector are the position, so that position = positionSelector * State.
- velocitySelector: a matrix that specifies which elements of the state vector are the velocity, so that velocity = velocitySelector * State.
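To illustrate how the selector matrices work, here is a short Python/NumPy sketch (for illustration only; it is not part of the original MATLAB example): each selector simply picks the position or velocity elements out of the constant-acceleration state [x; vx; ax; y; vy; ay].

```python
import numpy as np

# Constant-acceleration state: [x; vx; ax; y; vy; ay]
state = np.array([10.0, 2.0, 0.5, -1.5, 0.1, 0.0])

# Selector matrices, matching the MATLAB example
position_selector = np.array([[1, 0, 0, 0, 0, 0],
                              [0, 0, 0, 1, 0, 0]], dtype=float)
velocity_selector = np.array([[0, 1, 0, 0, 0, 0],
                              [0, 0, 0, 0, 1, 0]], dtype=float)

position = position_selector @ state  # -> [x, y]  = [10.0, -1.5]
velocity = velocity_selector @ state  # -> [vx, vy] = [2.0, 0.1]
```

The same matrices are later passed to getTrackPositions and getTrackVelocities, which apply them to every track state.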
function [tracker, positionSelector, velocitySelector] = setupTracker()
    tracker = multiObjectTracker(...
        'FilterInitializationFcn', @initConstantAccelerationFilter, ...
        'AssignmentThreshold', 35, 'ConfirmationThreshold', [2 3], ...
        'DeletionThreshold', 5);

    % The State vector is:
    %   In constant velocity:     State = [x;vx;y;vy]
    %   In constant acceleration: State = [x;vx;ax;y;vy;ay]

    % Define which part of the State is the position. For example:
    %   In constant velocity:     [x;y] = [1 0 0 0; 0 0 1 0] * State
    %   In constant acceleration: [x;y] = [1 0 0 0 0 0; 0 0 0 1 0 0] * State
    positionSelector = [1 0 0 0 0 0; 0 0 0 1 0 0];

    % Define which part of the State is the velocity. For example:
    %   In constant velocity:     [vx;vy] = [0 1 0 0; 0 0 0 1] * State
    %   In constant acceleration: [vx;vy] = [0 1 0 0 0 0; 0 0 0 0 1 0] * State
    velocitySelector = [0 1 0 0 0 0; 0 0 0 0 1 0];
end
42.3 Define a Kalman Filter

The multiObjectTracker defined in the previous section uses the filter initialization function defined in this section to create a Kalman filter (linear, extended, or unscented). This filter is then used for tracking each object around the ego vehicle.
function filter = initConstantAccelerationFilter(detection)
% This function shows how to configure a constant acceleration filter. The
% input is an objectDetection and the output is a tracking filter.
% For clarity, this function shows how to configure a trackingKF,
% trackingEKF, or trackingUKF for constant acceleration.
%
% Steps for creating a filter:
%   1. Define the motion model and state
%   2. Define the process noise
%   3. Define the measurement model
%   4. Initialize the state vector based on the measurement
%   5. Initialize the state covariance based on the measurement noise
%   6. Create the correct filter

    % Step 1: Define the motion model and state
    % This example uses a constant acceleration model, so:
    STF = @constacc;     % State-transition function, for EKF and UKF
    STFJ = @constaccjac; % State-transition function Jacobian, only for EKF
    % The motion model implies that the state is [x;vx;ax;y;vy;ay]
    % You can also use constvel and constveljac to set up a constant
    % velocity model, constturn and constturnjac to set up a constant turn
    % rate model, or write your own models.

    % Step 2: Define the process noise
    dt = 0.05; % Known timestep size
    sigma = 1; % Magnitude of the unknown acceleration change rate
    % The process noise along one dimension
    Q1d = [dt^4/4, dt^3/2, dt^2/2; dt^3/2, dt^2, dt; dt^2/2, dt, 1] * sigma^2;
    Q = blkdiag(Q1d, Q1d); % 2-D process noise

    % Step 3: Define the measurement model
    MF = @fcwmeas;     % Measurement function, for EKF and UKF
    MJF = @fcwmeasjac; % Measurement Jacobian function, only for EKF

    % Step 4: Initialize a state vector based on the measurement
    % The sensors measure [x;vx;y;vy] and the constant acceleration model's
    % state is [x;vx;ax;y;vy;ay], so the third and sixth elements of the
    % state vector are initialized to zero.
    state = [detection.Measurement(1); detection.Measurement(2); 0; detection.Measurement(3); detection.Measurement(4); 0];

    % Step 5: Initialize the state covariance based on the measurement
    % noise. The parts of the state that are not directly measured are
    % assigned a large measurement noise value to account for that.
    L = 100; % A large number relative to the measurement noise
    stateCov = blkdiag(detection.MeasurementNoise(1:2,1:2), L, detection.MeasurementNoise(3:4,3:4), L);

    % Step 6: Create the correct filter.
    % Use 'KF' for trackingKF, 'EKF' for trackingEKF, or 'UKF' for trackingUKF
    FilterType = 'EKF';

    % Creating the filter:
    switch FilterType
        case 'EKF'
            filter = trackingEKF(STF, MF, state, ...
                'StateCovariance', stateCov, ...
                'MeasurementNoise', detection.MeasurementNoise(1:4,1:4), ...
                'StateTransitionJacobianFcn', STFJ, ...
                'MeasurementJacobianFcn', MJF, ...
                'ProcessNoise', Q ...
                );
        case 'UKF'
            filter = trackingUKF(STF, MF, state, ...
                'StateCovariance', stateCov, ...
                'MeasurementNoise', detection.MeasurementNoise(1:4,1:4), ...
                'Alpha', 1e-1, ...
                'ProcessNoise', Q ...
                );
        case 'KF' % The ConstantAcceleration model is linear and KF can be used
            % Define the measurement model: measurement = H * state
            % In this case:
            %   measurement = [x;vx;y;vy] = H * [x;vx;ax;y;vy;ay]
            %   So, H = [1 0 0 0 0 0; 0 1 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 1 0]
            %
            % Note that ProcessNoise is automatically calculated by the
            % ConstantAcceleration motion model
            H = [1 0 0 0 0 0; 0 1 0 0 0 0; 0 0 0 1 0 0; 0 0 0 0 1 0];
            filter = trackingKF('MotionModel', '2D Constant Acceleration', ...
                'MeasurementModel', H, 'State', state, ...
                'MeasurementNoise', detection.MeasurementNoise(1:4,1:4), ...
                'StateCovariance', stateCov);
    end
end
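The process-noise construction in Step 2 can be reproduced in a few lines of Python/NumPy (a sketch for illustration, not part of the MATLAB example): the one-dimensional covariance Q1d is the standard discrete white-noise model for a [position; velocity; acceleration] state, and the 2-D matrix Q is block-diagonal because the x and y axes are assumed independent.

```python
import numpy as np

dt = 0.05    # known timestep size, matching the 20 Hz sensor rate
sigma = 1.0  # magnitude of the unknown acceleration change rate

# Process noise along one dimension, for the substate [pos; vel; acc]
Q1d = np.array([[dt**4/4, dt**3/2, dt**2/2],
                [dt**3/2, dt**2,   dt],
                [dt**2/2, dt,      1.0]]) * sigma**2

# Block-diagonal 2-D process noise for the state [x;vx;ax;y;vy;ay],
# equivalent to MATLAB's blkdiag(Q1d, Q1d)
Q = np.zeros((6, 6))
Q[:3, :3] = Q1d
Q[3:, 3:] = Q1d
```

A covariance matrix must be symmetric, and here the x and y blocks carry no cross terms; both properties are easy to verify on Q.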
42.4 Process and Format the Detections

The recorded information must be processed and formatted before it can be used by the tracker. This has the following steps:

1. Filter out unnecessary radar clutter detections. The radar reports many objects that correspond to fixed objects, which include guardrails, the road median, traffic signs, and so on. If these detections are used in the tracking, they create false tracks of fixed objects at the edges of the road, so they must be removed before calling the tracker. Radar objects are considered nonclutter if they are either stationary in front of the car or moving in its vicinity.
2. Format the detections as input to the tracker, i.e., an array of objectDetection elements. See the processVideo and processRadar supporting functions at the end of this example.
function [detections, laneBoundaries, egoLane] = processDetections...
        (visionFrame, radarFrame, IMUFrame, laneFrame, egoLane, time)
    % Inputs:
    %   visionFrame - objects reported by the vision sensor for this time frame
    %   radarFrame  - objects reported by the radar sensor for this time frame
    %   IMUFrame    - inertial measurement unit data for this time frame
    %   laneFrame   - lane reports for this time frame
    %   egoLane     - the estimated ego lane
    %   time        - the time corresponding to the time frame

    % Remove clutter radar objects
    [laneBoundaries, egoLane] = processLanes(laneFrame, egoLane);
    realRadarObjects = findNonClutterRadarObjects(radarFrame.object, ...
        radarFrame.numObjects, IMUFrame.velocity, laneBoundaries);

    % Return an empty list if no objects are reported
    % Counting the total number of objects
    detections = {};
    if (visionFrame.numObjects + numel(realRadarObjects)) == 0
        return;
    end

    % Process the remaining radar objects
    detections = processRadar(detections, realRadarObjects, time);

    % Process video objects
    detections = processVideo(detections, visionFrame, time);
end
42.5 Update the Tracker

To update the tracker, call the updateTracks method with the following inputs:

1. tracker - the multiObjectTracker that was configured earlier.
2. detections - the list of objectDetection objects created by processDetections.

The output from the tracker is a struct array of tracks.

The most important object (MIO) is defined as the track that is in the ego lane and closest in front of the car, i.e., with the smallest positive x value. To lower the probability of false alarms, only confirmed tracks are considered. Once the MIO is found, the relative speed between the car and the MIO is calculated. The relative distance and relative speed determine the forward collision warning. There are 3 cases of FCW:

1. Safe (green): there is no car in the ego lane (no MIO), the MIO is moving away from the car, or the distance to the MIO remains constant.
2. Caution (yellow): the MIO is moving closer to the car, but is still at a distance above the FCW distance. The FCW distance is calculated using the Euro-NCAP AEB Test Protocol. Note that this distance varies with the relative speed between the MIO and the car, and is greater when the closing speed is higher.
3. Warn (red): the MIO is moving closer to the car, and its distance is less than the FCW distance.

The Euro-NCAP AEB Test Protocol defines the FCW distance as the sum of the distance traveled during the driver's reaction delay and the braking distance at the maximum comfortable deceleration: d_FCW = 1.2 * |v_rel| + v_rel^2 / (2 * a_max), where v_rel is the relative speed between the two cars and a_max = 0.4g is the maximum deceleration (this matches the calculation in the code below).
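As a rough cross-check, the FCW distance and the three warning cases can be sketched in Python (for illustration only; the function and variable names here are my own, not part of the original MATLAB example):

```python
def fcw_distance(rel_speed, max_decel=0.4 * 9.8, delay_time=1.2):
    """Euro NCAP AEB braking distance: distance covered during the
    driver's reaction delay plus the distance needed to brake away
    the closing speed at the maximum deceleration."""
    return abs(rel_speed) * delay_time + rel_speed**2 / (2 * max_decel)

def classify(x, rel_speed):
    """Classify the threat for an MIO at longitudinal distance x [m]
    closing at rel_speed [m/s] (negative means getting closer)."""
    if rel_speed >= 0:          # moving away or holding distance
        return 'safe'
    d = fcw_distance(rel_speed)
    return 'warn' if x <= d else 'caution'
```

For example, closing at 5 m/s gives d_FCW = 5 * 1.2 + 25 / (2 * 3.92), roughly 9.2 m, so an MIO 40 m ahead is 'caution' while one 8 m ahead is 'warn'.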
function mostImportantObject = findMostImportantObject(confirmedTracks, egoLane, positionSelector, velocitySelector)

    % Initialize outputs and parameters
    MIO = [];               % By default, there is no MIO
    trackID = [];           % By default, there is no trackID associated with an MIO
    FCW = 3;                % By default, if there is no MIO, then FCW is 'safe'
    threatColor = 'green';  % By default, the threat color is green
    maxX = 1000;  % Far enough forward so that no track is expected to exceed this distance
    gAccel = 9.8; % Constant gravity acceleration, in m/s^2
    maxDeceleration = 0.4 * gAccel; % Euro NCAP AEB definition
    delayTime = 1.2; % Delay time for a driver before starting to brake, in seconds

    positions = getTrackPositions(confirmedTracks, positionSelector);
    velocities = getTrackVelocities(confirmedTracks, velocitySelector);

    for i = 1:numel(confirmedTracks)
        x = positions(i,1);
        y = positions(i,2);

        relSpeed = velocities(i,1); % The relative speed between the cars, along the lane

        if x < maxX && x > 0 % No point checking otherwise
            yleftLane = polyval(egoLane.left, x);
            yrightLane = polyval(egoLane.right, x);
            if (yrightLane <= y) && (y <= yleftLane)
                maxX = x;
                trackID = i;
                MIO = confirmedTracks(i).TrackID;
                if relSpeed < 0 % Relative speed indicates object is getting closer
                    % Calculate expected braking distance according to
                    % Euro NCAP AEB Test Protocol
                    d = abs(relSpeed) * delayTime + relSpeed^2 / 2 / maxDeceleration;
                    if x <= d % 'warn'
                        FCW = 1;
                        threatColor = 'red';
                    else % 'caution'
                        FCW = 2;
                        threatColor = 'yellow';
                    end
                end
            end
        end
    end
    mostImportantObject = struct('ObjectID', MIO, 'TrackIndex', trackID, 'Warning', FCW, 'ThreatColor', threatColor);
end
42.6 Summary

This example showed how to create a forward collision warning system for a vehicle equipped with vision, millimeter-wave radar, and IMU sensors. It used objectDetection objects to pass the sensor reports to the multiObjectTracker object, which fused them and tracked the objects in front of the ego vehicle.

Try using different parameters for the tracker to see how they affect the tracking quality. Try modifying the tracking filter to use trackingKF or trackingUKF, or to define a different motion model, e.g., constant velocity or constant turn. Finally, you can try to define your own motion model.

42.7 Supporting Functions

readSensorRecordingsFile reads recorded sensor data from a file:
function [visionObjects, radarObjects, inertialMeasurementUnit, laneReports, ...
    timeStep, numSteps] = readSensorRecordingsFile(sensorRecordingFileName)
% Read Sensor Recordings
% The |ReadDetectionsFile| function reads the recorded sensor data file.
% The recorded data is a single structure that is divided into the
% following substructures:
%
% # |inertialMeasurementUnit|, a struct array with fields: timeStamp,
%   velocity, and yawRate. Each element of the array corresponds to a
%   different timestep.
% # |laneReports|, a struct array with fields: left and right. Each element
%   of the array corresponds to a different timestep.
%   Both left and right are structures with fields: isValid, confidence,
%   boundaryType, offset, headingAngle, and curvature.
% # |radarObjects|, a struct array with fields: timeStamp (see below),
%   numObjects (integer) and object (struct). Each element of the array
%   corresponds to a different timestep.
%   |object| is a struct array, where each element is a separate object,
%   with the fields: id, status, position(x;y;z), velocity(vx,vy,vz),
%   amplitude, and rangeMode.
%   Note: z is always constant and vz = 0.
% # |visionObjects|, a struct array with fields: timeStamp (see below),
%   numObjects (integer) and object (struct). Each element of the array
%   corresponds to a different timestep.
%   |object| is a struct array, where each element is a separate object,
%   with the fields: id, classification, position (x;y;z),
%   velocity(vx;vy;vz), size(dx;dy;dz). Note: z = vy = vz = dx = dz = 0
%
% The timeStamp for recorded vision and radar objects is a uint64 variable
% holding microseconds since the Unix epoch. Timestamps are recorded about
% 50 milliseconds apart. There is a complete synchronization between the
% recordings of vision and radar detections, therefore the timestamps are
% not used in further calculations.

A = load(sensorRecordingFileName);
visionObjects = A.vision;
radarObjects = A.radar;
laneReports = A.lane;
inertialMeasurementUnit = A.inertialMeasurementUnit;

timeStep = 0.05;                 % Data is provided every 50 milliseconds
numSteps = numel(visionObjects); % Number of recorded timesteps
end
processLanes converts sensor-reported lanes to parabolicLaneBoundary lanes and maintains a persistent ego lane estimate:
function [laneBoundaries, egoLane] = processLanes(laneReports, egoLane)
% Lane boundaries are updated based on the laneReports from the recordings.
% Since some laneReports contain invalid (isValid = false) reports or
% impossible parameter values (-1e9), these lane reports are ignored and
% the previous lane boundary is used.
leftLane = laneReports.left;
rightLane = laneReports.right;

% Check the validity of the reported left lane
cond = (leftLane.isValid && leftLane.confidence) && ...
    ~(leftLane.headingAngle == -1e9 || leftLane.curvature == -1e9);
if cond
    egoLane.left = cast([leftLane.curvature, leftLane.headingAngle, leftLane.offset], 'double');
end

% Update the left lane boundary parameters or use the previous ones
leftParams = egoLane.left;
leftBoundaries = parabolicLaneBoundary(leftParams);
leftBoundaries.Strength = 1;

% Check the validity of the reported right lane
cond = (rightLane.isValid && rightLane.confidence) && ...
    ~(rightLane.headingAngle == -1e9 || rightLane.curvature == -1e9);
if cond
    egoLane.right = cast([rightLane.curvature, rightLane.headingAngle, rightLane.offset], 'double');
end

% Update the right lane boundary parameters or use the previous ones
rightParams = egoLane.right;
rightBoundaries = parabolicLaneBoundary(rightParams);
rightBoundaries.Strength = 1;

laneBoundaries = [leftBoundaries, rightBoundaries];
end
findNonClutterRadarObjects removes radar objects that are considered part of the clutter:
function realRadarObjects = findNonClutterRadarObjects(radarObject, numRadarObjects, egoSpeed, laneBoundaries)
% The radar objects include many objects that belong to the clutter.
% Clutter is defined as a stationary object that is not in front of the
% car. The following types of objects pass as nonclutter:
%
% # Any object in front of the car
% # Any moving object in the area of interest around the car, including
%   objects that move at a lateral speed around the car

    % Allocate memory
    normVs = zeros(numRadarObjects, 1);
    inLane = zeros(numRadarObjects, 1);
    inZone = zeros(numRadarObjects, 1);

    % Parameters
    LaneWidth = 3.6;           % What is considered in front of the car
    ZoneWidth = 1.7*LaneWidth; % A wider area of interest
    minV = 1;                  % Any object that moves slower than minV is considered stationary

    for j = 1:numRadarObjects
        [vx, vy] = calculateGroundSpeed(radarObject(j).velocity(1), radarObject(j).velocity(2), egoSpeed);
        normVs(j) = norm([vx,vy]);
        laneBoundariesAtObject = computeBoundaryModel(laneBoundaries, radarObject(j).position(1));
        laneCenter = mean(laneBoundariesAtObject);
        inLane(j) = (abs(radarObject(j).position(2) - laneCenter) <= LaneWidth/2);
        inZone(j) = (abs(radarObject(j).position(2) - laneCenter) <= max(abs(vy)*2, ZoneWidth));
    end
    realRadarObjectsIdx = union(...
        intersect(find(normVs > minV), find(inZone == 1)), ...
        find(inLane == 1));
    realRadarObjects = radarObject(realRadarObjectsIdx);
end
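To make the gating rule concrete, here is a hedged Python re-implementation of the same nonclutter test for a single object (names and the scalar, per-object formulation are my own; the MATLAB version above operates on arrays of radar objects):

```python
import math

def is_clutter(pos, ground_v, lane_center, lane_width=3.6,
               zone_width=1.7 * 3.6, min_v=1.0):
    """Mirror of the example's rule: an object is kept (nonclutter) if it
    is inside the ego lane, or if it is moving faster than min_v inside
    the wider zone of interest; everything else is clutter.

    pos       : (x, y) object position in the ego frame [m]
    ground_v  : (vx, vy) ground-frame object velocity [m/s]
    lane_center : lateral lane-center coordinate at the object's x [m]
    """
    lateral = abs(pos[1] - lane_center)
    in_lane = lateral <= lane_width / 2
    moving_in_zone = (math.hypot(ground_v[0], ground_v[1]) > min_v and
                      lateral <= max(abs(ground_v[1]) * 2, zone_width))
    return not (in_lane or moving_in_zone)
```

A stationary guardrail reflection 5 m to the side is rejected as clutter, while a stationary car directly in the lane, or a moving car in the adjacent lane, is kept.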
calculateGroundSpeed calculates the true ground speed of a radar-reported object from the relative speed and the ego vehicle speed:
function [Vx,Vy] = calculateGroundSpeed(Vxi,Vyi,egoSpeed)
% Inputs
%   (Vxi,Vyi) : relative object speed
%   egoSpeed  : ego vehicle speed
% Outputs
%   [Vx,Vy]   : ground object speed

    Vx = Vxi + egoSpeed;    % Calculate longitudinal ground speed
    theta = atan2(Vyi,Vxi); % Calculate heading angle
    Vy = Vx * tan(theta);   % Calculate lateral ground speed
end
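The same ground-speed recovery, as a Python sketch (for illustration only; note it preserves the heading angle of the relative velocity when rescaling to the ground frame, exactly as the MATLAB version does):

```python
import math

def ground_speed(vxi, vyi, ego_speed):
    """Recover the ground-frame speed of a radar object.

    vxi, vyi  : object velocity relative to the ego vehicle [m/s]
    ego_speed : ego vehicle speed [m/s]
    """
    vx = vxi + ego_speed          # longitudinal ground speed
    theta = math.atan2(vyi, vxi)  # heading angle of the relative velocity
    vy = vx * math.tan(theta)     # lateral ground speed, same heading
    return vx, vy
```

For example, a preceding car measured as receding at 5 m/s relative to an ego vehicle doing 10 m/s has a ground speed of 15 m/s.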
processVideo converts reported vision objects to a list of objectDetection objects:
function postProcessedDetections = processVideo(postProcessedDetections, visionFrame, t)
% Process the video objects into objectDetection objects
numRadarObjects = numel(postProcessedDetections);
numVisionObjects = visionFrame.numObjects;
if numVisionObjects
    classToUse = class(visionFrame.object(1).position);
    visionMeasCov = cast(diag([2,2,2,100]), classToUse);
    % Process Vision Objects:
    for i=1:numVisionObjects
        object = visionFrame.object(i);
        postProcessedDetections{numRadarObjects+i} = objectDetection(t, ...
            [object.position(1); object.velocity(1); object.position(2); 0], ...
            'SensorIndex', 1, 'MeasurementNoise', visionMeasCov, ...
            'MeasurementParameters', {1}, ...
            'ObjectClassID', object.classification, ...
            'ObjectAttributes', {object.id, object.size});
    end
end
end
processRadar converts reported radar objects to a list of objectDetection objects:
function postProcessedDetections = processRadar(postProcessedDetections, realRadarObjects, t)
% Process the radar objects into objectDetection objects
numRadarObjects = numel(realRadarObjects);
if numRadarObjects
    classToUse = class(realRadarObjects(1).position);
    radarMeasCov = cast(diag([2,2,2,100]), classToUse);
    % Process Radar Objects:
    for i=1:numRadarObjects
        object = realRadarObjects(i);
        postProcessedDetections{i} = objectDetection(t, ...
            [object.position(1); object.velocity(1); object.position(2); object.velocity(2)], ...
            'SensorIndex', 2, 'MeasurementNoise', radarMeasCov, ...
            'MeasurementParameters', {2}, ...
            'ObjectAttributes', {object.id, object.status, object.amplitude, object.rangeMode});
    end
end
end
fcwmeas is the measurement function used in this forward collision warning example:
function measurement = fcwmeas(state, sensorID)
% The example measurements depend on the sensor type, which is reported by
% the MeasurementParameters property of the objectDetection. The following
% two sensorID values are used:
%   sensorID = 1: video objects, the measurement is [x;vx;y].
%   sensorID = 2: radar objects, the measurement is [x;vx;y;vy].
% The state is:
%   Constant velocity     state = [x;vx;y;vy]
%   Constant turn         state = [x;vx;y;vy;omega]
%   Constant acceleration state = [x;vx;ax;y;vy;ay]

    if numel(state) < 6 % Constant turn or constant velocity
        switch sensorID
            case 1 % video
                measurement = [state(1:3); 0];
            case 2 % radar
                measurement = state(1:4);
        end
    else % Constant acceleration
        switch sensorID
            case 1 % video
                measurement = [state(1:2); state(4); 0];
            case 2 % radar
                measurement = [state(1:2); state(4:5)];
        end
    end
end
fcwmeasjac is the Jacobian of the measurement function used in this forward collision warning example:
function jacobian = fcwmeasjac(state, sensorID)
% The example measurements depend on the sensor type, which is reported by
% the MeasurementParameters property of the objectDetection. We choose
% sensorID = 1 for video objects and sensorID = 2 for radar objects. The
% following two sensorID values are used:
%   sensorID = 1: video objects, the measurement is [x;vx;y].
%   sensorID = 2: radar objects, the measurement is [x;vx;y;vy].
% The state is:
%   Constant velocity     state = [x;vx;y;vy]
%   Constant turn         state = [x;vx;y;vy;omega]
%   Constant acceleration state = [x;vx;ax;y;vy;ay]

    numStates = numel(state);
    jacobian = zeros(4, numStates);

    if numel(state) < 6 % Constant turn or constant velocity
        switch sensorID
            case 1 % video
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,3) = 1;
            case 2 % radar
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,3) = 1;
                jacobian(4,4) = 1;
        end
    else % Constant acceleration
        switch sensorID
            case 1 % video
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,4) = 1;
            case 2 % radar
                jacobian(1,1) = 1;
                jacobian(2,2) = 1;
                jacobian(3,4) = 1;
                jacobian(4,5) = 1;
        end
    end
end
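A quick way to sanity-check this Jacobian is to note that the measurement is linear in the state, so multiplying the Jacobian by the state must reproduce the measurement exactly. A Python sketch of the constant-acceleration branch (for illustration only; the function names here are my own, not part of the MATLAB example):

```python
import numpy as np

def fcw_meas(state, sensor_id):
    """Measurement for a constant-acceleration state [x,vx,ax,y,vy,ay]."""
    x, vx, _, y, vy, _ = state
    if sensor_id == 1:                   # vision: [x; vx; y], padded with 0
        return np.array([x, vx, y, 0.0])
    return np.array([x, vx, y, vy])      # radar: [x; vx; y; vy]

def fcw_meas_jac(state, sensor_id):
    """Jacobian of fcw_meas with respect to the 6-element state."""
    J = np.zeros((4, 6))
    J[0, 0] = J[1, 1] = J[2, 3] = 1.0    # x, vx, y rows
    if sensor_id == 2:
        J[3, 4] = 1.0                    # vy row, radar only
    return J

s = np.array([10.0, 2.0, 0.1, -1.0, 0.3, 0.0])
# Linear measurement: J @ state reproduces the measurement for both sensors
assert np.allclose(fcw_meas_jac(s, 2) @ s, fcw_meas(s, 2))
```

The same identity holds for the vision case, where the fourth measurement row is all zeros.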