Tracking objects using GNN¶
Object tracking has a wide range of applications, from autonomous vehicles and video surveillance systems to radar and sensor monitoring. One of the key challenges in this area is the association of measurements with tracks, especially under conditions of uncertainty, noise and possible trajectory intersections.
We consider an implementation of the classical Global Nearest Neighbor (GNN) tracking algorithm.
The presented code includes the following steps.
- Modelling the motion of objects along hyperbolic trajectories.
- Implementation of a Kalman filter to predict and update track states.
- Association of measurements with tracks using the GNN method.
- Visualisation of the results as an animated plot.
The main goal is to demonstrate the effectiveness of the algorithm in processing noisy measurements and reconstructing the trajectories of moving objects, in this case objects following hyperbolic paths.
Next, let us move on to the algorithm and start with the declaration of data structures and a constant for the object tracking system.
- Track - a mutable structure storing the object state (coordinates and velocity), the covariance matrix (uncertainty), the track age, the number of missed measurements and the state history used for visualisation.
- Detection - an immutable structure for storing measurements (coordinates and time).
- H - the 2×4 observation matrix that extracts the positional components (x, y) from the state vector.
These structures form the basis for the implementation of the algorithm.
using LinearAlgebra
# Mutable track structure
mutable struct Track
    state::Vector{Float64}             # State vector [x, y, vx, vy]
    covariance::Matrix{Float64}        # Covariance matrix
    age::Int                           # Track age (number of updates)
    missed::Int                        # Number of missed measurements
    history::Vector{Vector{Float64}}   # State history for plotting
end
# Convenience constructor that initialises the history with the current state
Track(state, covariance, age, missed) = Track(state, covariance, age, missed, [copy(state)])
# Measurement: position [x, y] and timestamp
struct Detection
    position::Vector{Float64}
    time::Float64
end
const H = [1 0 0 0; 0 1 0 0]; # Observation matrix
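As a quick, hypothetical illustration (the numerical values below are made up for demonstration), a track and a measurement can be constructed as follows; the four-argument constructor seeds the history with a copy of the initial state.
# Hypothetical example: a track at the origin moving with velocity (1, 0), and one noisy measurement
example_track = Track([0.0, 0.0, 1.0, 0.0], Matrix(1.0I, 4, 4), 1, 0)
example_detection = Detection([0.1, -0.05], 0.0)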
Next, let us consider the implementation of the Kalman filter for tracking.
- predict! updates the track state based on a constant velocity (CV) model, predicting the new position and increasing the uncertainty (covariance).
- update! corrects the track state with a measurement using the Kalman gain matrix (K), reducing uncertainty and preserving the state history; the standard equations for both steps are summarised below.
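In standard Kalman filter notation (state transition matrix F, process noise Q, observation matrix H, measurement noise R), these two steps are:

$$\hat{x}_{k|k-1} = F\,\hat{x}_{k-1}, \qquad P_{k|k-1} = F\,P_{k-1}\,F^\top + Q$$

$$y = z - H\,\hat{x}_{k|k-1}, \qquad S = H\,P_{k|k-1}\,H^\top + R, \qquad K = P_{k|k-1}\,H^\top S^{-1}$$

$$\hat{x}_k = \hat{x}_{k|k-1} + K\,y, \qquad P_k = (I - K\,H)\,P_{k|k-1}$$

In the code below, Q = diagm([0.1, 0.1, 0.01, 0.01]) and R = diagm([0.5, 0.5]).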
# Predict the track state one time step ahead
function predict!(track::Track, dt::Float64)
    F = [1 0 dt 0; 0 1 0 dt; 0 0 1 0; 0 0 0 1]  # Transition matrix (constant velocity model)
    track.state = F * track.state
    track.covariance = F * track.covariance * F' + diagm([0.1, 0.1, 0.01, 0.01])  # Add process noise Q
    return track
end
# Update the track state with a measurement
function update!(track::Track, detection::Detection)
    y = detection.position - H * track.state          # Innovation (residual)
    S = H * track.covariance * H' + diagm([0.5, 0.5]) # Innovation covariance (measurement noise R)
    K = track.covariance * H' / S                     # Kalman gain
    track.state += K * y
    track.covariance = (I - K * H) * track.covariance
    track.age += 1
    track.missed = 0
    push!(track.history, copy(track.state))           # Save the state history
    return track
end
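Continuing the hypothetical example from above, a single predict/update cycle on one track looks like this:
# One predict/update cycle on the example track (hypothetical values)
predict!(example_track, 1.0)               # propagate the state one second ahead, inflating the covariance
update!(example_track, example_detection)  # correct the prediction with the measurement
println(example_track.state)               # corrected state [x, y, vx, vy]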
Let us move on to the tracking function itself. Global Nearest Neighbor (GNN) is a simple method based on greedily matching measurements to the nearest tracks. In our code, the trackerGNN function implements the GNN algorithm to associate tracks with measurements. The algorithm consists of the following parts.
- Prediction: copies all tracks and predicts their new states (predict!).
- Distance computation: builds a distance matrix between tracks and measurements based on the Mahalanobis statistical distance (see the formula after this list).
- Greedy matching: repeatedly selects the track-measurement pair with the minimum distance, checks it against the threshold (gate) and excludes the matched track and measurement from further consideration.
- Update and filtering: updates the matched tracks with their measurements (update!) and removes tracks that have missed three or more consecutive measurements.
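The statistical distance used in the second step is the squared Mahalanobis distance between a measurement z and a predicted track state x̂:

$$d^2 = y^\top S^{-1}\,y, \qquad y = z - H\hat{x}, \qquad S = H\,P\,H^\top$$

(A full implementation would add the measurement noise R to S; the code below keeps the simpler form.)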
function trackerGNN(detections::Vector{Detection}, existing_tracks::Vector{Track}, gate::Float64)
    predicted = [predict!(deepcopy(t), 1.0) for t in existing_tracks]  # Predict track states
    # Distance matrix (squared Mahalanobis distance between every track and every measurement)
    dist = zeros(length(predicted), length(detections))
    for (i, t) in enumerate(predicted), (j, d) in enumerate(detections)
        y = d.position - H * t.state
        dist[i, j] = y' / (H * t.covariance * H') * y
    end
    # Greedy assignment
    tracks_assigned, dets_assigned = Int[], Int[]
    for _ in 1:min(length(predicted), length(detections))
        i, j = argmin(dist).I
        dist[i, j] > gate && break            # No remaining pair passes the gate
        push!(tracks_assigned, i)
        push!(dets_assigned, j)
        dist[i, :] .= dist[:, j] .= Inf       # Exclude the matched track and measurement
    end
    # Update matched tracks, count misses for the rest, drop stale tracks
    updated = deepcopy(predicted)
    foreach(((i, j),) -> update!(updated[i], detections[j]), zip(tracks_assigned, dets_assigned))
    foreach(i -> updated[i].missed += 1, setdiff(eachindex(updated), tracks_assigned))
    filter!(t -> t.missed < 3, updated)
end
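As a standalone sanity check (the values here are hypothetical), the tracker can be exercised on a single frame of two measurements and two existing tracks:
# Two noisy measurements near (1, 0) and (-1, 0), matched against two tracks
demo_frame = [Detection([1.05, 0.02], 1.0), Detection([-0.98, -0.03], 1.0)]
demo_tracks = [Track([1.0, 0.0, 0.1, 0.5], Matrix(1.0I, 4, 4), 1, 0),
               Track([-1.0, 0.0, -0.1, 0.5], Matrix(1.0I, 4, 4), 1, 0)]
demo_tracks = trackerGNN(demo_frame, demo_tracks, 50.0)  # returns the updated track list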
The plot_tracks function visualises the tracking process, allowing the algorithm's output to be compared with the true trajectories as they evolve. It plots the following elements.
- Blue dots (scatter) - current measurements (detections).
- Blue dashed lines - real trajectories of objects (obj1, obj2).
- Red squares - GNN tracks with history (solid line).
using Plots
function plot_tracks(tracks::Vector{Track}, detections::Vector{Detection}, obj1, obj2, step)
    # Current measurements
    scatter([d.position[1] for d in detections], [d.position[2] for d in detections],
            label="Detections", color=:blue, markersize=6)
    # True trajectories
    plot!([p[1] for p in obj1[1:step]], [p[2] for p in obj1[1:step]], label="", color=:blue, linestyle=:dash)
    plot!([p[1] for p in obj2[1:step]], [p[2] for p in obj2[1:step]], label="", color=:blue, linestyle=:dash)
    # Helper function for drawing tracks
    function plot_tracks!(tracks, label, color, marker)
        for (i, t) in enumerate(tracks)
            scatter!([t.state[1]], [t.state[2]], label=i==1 ? label : "", color=color, markersize=8, marker=marker)
            # Draw the track history
            if length(t.history) > 1
                plot!([h[1] for h in t.history], [h[2] for h in t.history],
                      label="", color=color, linestyle=:solid, linewidth=2)
            end
        end
    end
    # GNN tracks with history
    plot_tracks!(tracks, "GNN Track", :red, :square)
    plot!(legend=:topleft, title="Step: $step", xlims=(-5, 5), ylims=(-5, 5))
end
The generate_hyperbola_trajectories function generates two symmetric hyperbolic trajectories: it creates a uniformly spaced range of the parameter t from -3 to 3 and then computes the coordinates of the points for two branches.
- First branch: x = a*cosh(t), y = b*sinh(t)
- Second branch: x = -a*cosh(t), y = b*sinh(t)
Since cosh²(t) - sinh²(t) = 1, every point satisfies x²/a² - y²/b² = 1, i.e. it lies on a hyperbola. This makes the function well suited to testing tracking algorithms on trajectories that pass close to each other, simulating the movement of objects along hyperbolic paths.
# Generate two symmetric hyperbolic trajectories
function generate_hyperbola_trajectories(steps, a=1.0, b=1.0)
    t = range(-3, 3, length=steps)
    [(a*cosh(ti), b*sinh(ti)) for ti in t], [(-a*cosh(ti), b*sinh(ti)) for ti in t]
end
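For example, a hypothetical call with five steps returns two vectors of (x, y) tuples, one per branch:
right_branch, left_branch = generate_hyperbola_trajectories(5)
right_branch[1]  # ≈ (10.07, -10.02), the point at t = -3
left_branch[3]   # (-1.0, 0.0), the vertex of the left branch at t = 0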
Now let's perform simulation and visualisation of tracking hyperbolic trajectories.
The simulation cycle includes the following steps.
- Adds Gaussian noise to the true positions at each step to form the measurements.
- Updates the GNN tracker and visualises the current state via plot_tracks.
- Saves the last frame as a PNG image and creates an animated GIF (5 fps) of the entire tracking process.
steps = 50
obj1, obj2 = generate_hyperbola_trajectories(steps)
# Initialise the tracks (the constructor seeds the history with the initial state)
initial_state1 = [obj1[1][1], obj1[1][2], 0.5, 0.5]
initial_state2 = [obj2[1][1], obj2[1][2], -0.5, 0.5]
tracks_gnn = [Track(initial_state1, Matrix(1.0I, 4, 4), 1, 0),
              Track(initial_state2, Matrix(1.0I, 4, 4), 2, 0)]
# Build the GIF
anim = @animate for step in 1:steps
    # Generate noisy measurements for the current step
    noise = 0.3 * randn(4)
    detections = [Detection([obj1[step][1] + noise[1], obj1[step][2] + noise[2]], step),
                  Detection([obj2[step][1] + noise[3], obj2[step][2] + noise[4]], step)]
    # Run the GNN tracker
    updated_tracks_gnn = trackerGNN(detections, deepcopy(tracks_gnn), 50.0)
    # Carry the updated tracks over to the next step
    global tracks_gnn = updated_tracks_gnn
    # Visualisation
    plot_tracks(tracks_gnn, detections, obj1, obj2, step)
    if step == steps
        savefig("last_frame.png")
    end
end
# Save the GIF
gif(anim, "tracking_simulation.gif", fps=5)
Conclusion¶
Results:
- The last simulation step shows that the algorithm successfully tracks both objects despite the added noise in the measurements.
- Track trajectories (red lines) are close to the real trajectories of the objects (blue dashed lines), which confirms the accuracy of the algorithm. At the same time, the largest inaccuracies appear exactly where the two trajectories pass closest to each other.
- The gate value of 50.0 (the threshold for the Mahalanobis distance) indicates the stability of the algorithm against false positives.
using Images
println("Last simulation step:")
img = load("last_frame.png")
The presented algorithm demonstrates high efficiency for object tracking problems with nonlinear trajectories. The key advantages are:
- robustness to noise due to the Kalman filter;
- the simplicity of the GNN method for associating measurements with tracks;
- clear visualisation for easy interpretation of the results.
The algorithm successfully solves the tracking problem and can be applied in areas such as autonomous systems, video surveillance and robotics. To improve it further, one could consider more sophisticated motion models (e.g., a constant acceleration model) or association methods such as JPDA or MHT, which cope better with high object density.