Engee documentation
Notebook

ECG monitoring using continuous radiation radar and deep learning (Part 2)

An ECG is used to monitor the heart: electrodes are placed on the skin and the electrical activity of the heart is recorded. However, the short recording duration, the discomfort of adhesive electrodes, and the risk of motion artifacts make this method not always convenient.

Radar systems offer non-contact monitoring: they detect micro-movements of the chest from heartbeat and breathing, without restricting the patient's movements and without requiring skin contact. Using neural networks, it is possible to reconstruct an electrocardiogram from the mechanical vibrations of the radar signal, turning the movement of the chest into the usual P-QRS-T waves.

As part of the demo example, we will focus on bioradars for heart monitoring and the use of AI algorithms to transform the radar signal into the equivalent of an ECG.

To get acquainted with the first part of the project, devoted to the analysis and comparison of the ECG signal and the reflected radar echo, follow the link.

Preparation for work

Importing the necessary libraries

We import the necessary Julia and Python libraries (skimage, scipy, neurokit2) and include the utils.jl file, which contains all the helper functions used in the script.

In [ ]:
include("$(@__DIR__)/install_packages.jl");

For the script to work, you need to install several Python packages whose functions are called via PyCall. To install them, open the command line and press the ";" key to switch to shell mode, then run:

pip install -r requirements.txt

The requirements.txt file is located in the working directory.
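Alternatively, the same installation can be triggered without leaving Julia; a minimal sketch, assuming pip is available on the system PATH and requirements.txt lies next to the notebook:

# Hedged sketch: run pip from Julia instead of switching to shell mode.
run(`pip install -r $(joinpath(@__DIR__, "requirements.txt"))`)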

In [ ]:
using Glob
using DSP
using CUDA
using Flux
using Statistics
using cuDNN
using Plots   # plotting is used throughout the notebook (may already be loaded by utils.jl)
using JSON    # used below to save the training parameters
using BSON: @save, @load
In [ ]:
using Measures

Since the functions used contain a fairly large number of lines of code, they have been moved to separate .jl files, which are included below.

The model.jl file implements the architecture of the model. The dataset.jl file implements the dataset logic, prepare_data.jl contains functions that prepare the data for training, and utils.jl contains all the other functions needed for the project to work.

In [ ]:
Dirpath = "$(@__DIR__)"
include(joinpath(Dirpath, "model.jl"))
include(joinpath(Dirpath, "dataset.jl"))
include(joinpath(Dirpath, "prepare_data.jl"))
include(joinpath(Dirpath, "utils.jl"));

Data preparation

The work uses an open dataset presented by Schellenberger et al. (Scientific Data, 2020). The dataset contains about 24 hours of synchronized recordings from a radar and reference sensors (ECG, impedance cardiography, continuous blood pressure) collected from 30 healthy volunteers. One scenario was extracted from the dataset for investigation.
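For reference, an individual recording can be inspected with the MAT.jl package; a minimal sketch, in which the file name and the variable key are hypothetical placeholders (the real files are handled by prepareData below):

using MAT

# Hedged sketch: open one recording and list the variables it contains.
# The file name and the "radar_i" key are hypothetical placeholders.
file = matopen(joinpath("data", "GDN0001", "GDN0001_1_Resting.mat"))
vars = names(file)                      # variables stored in the .mat file
# radar_i = read(file, "radar_i")       # read a single variable by name
close(file)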

Create a folder where the results of signal processing will be saved.

In [ ]:
prepareRadarData = joinpath(Dirpath, "prepareRadarData")
isdir(prepareRadarData) || (mkdir(prepareRadarData); println("Папка prepareRadarData создана"));

Next, we apply to the raw ECG and radar data all the transformations that were carried out in the first part of the project when analyzing these data, namely:

  1. Ellipse reconstruction
  2. Extraction of vital signs: breathing and pulse
  3. Calculation of chest acceleration and subsequent filtering
  4. ECG filtering

These steps are implemented in the prepare_data.jl file.
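As an illustration of step 4, below is a minimal sketch of zero-phase band-pass filtering of the ECG with DSP.jl; the 0.5–40 Hz band is an assumption made here for illustration, the values actually used by prepareData are defined in prepare_data.jl.

using DSP

# Hedged sketch of ECG band-pass filtering (step 4 above).
# The 0.5–40 Hz band is an illustrative assumption, not the project's setting.
function bandpass_ecg(x::AbstractVector, fs; low = 0.5, high = 40.0, order = 6)
    bp = digitalfilter(Bandpass(low, high; fs = fs), Butterworth(order))
    return filtfilt(bp, x)              # zero-phase filtering avoids phase distortion
end

ecg_filtered = bandpass_ecg(randn(10_000), 2000)   # 2000 Hz, synthetic signal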

In [ ]:
root = joinpath(Dirpath, "data")

pat = r"^GDN\d{4}_.+\.mat$"    # recording files are named like GDN0001_..., as in the dataset
order_filter = 6               # filter order used during preprocessing
files = [p for p in glob("**/*.mat", root) if occursin(pat, basename(p))]
save_path = joinpath(Dirpath, "prepareRadarData")
isdir(save_path) || mkdir(save_path)
for file in files
    prepareData(file, save_path, order_filter)   # preprocess and save each recording
end

Using the block above, we processed the data and saved it in a separate folder.

Creating a data loader

Next, we will create data loaders that feed the data to the model in batches. In the block below, we initialize the dataset, split the data into training, validation, and test sets, and create the data loaders.

The dataset logic is written in the dataset.jl file.

The dataset slices the ECG and radar signals into overlapping windows. Slicing reduces the load that long sequences would place on the neural network, while the overlap preserves the temporal dependencies between neighboring windows.
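Below is a minimal sketch of such overlapping windowing; the window length and step are illustrative assumptions, the actual values are defined in dataset.jl.

# Hedged sketch: cut a 1-D signal into overlapping windows.
# `win` (window length) and `step` (hop size) are illustrative values.
function sliding_windows(x::AbstractVector; win::Int = 1024, step::Int = 256)
    [x[i:i+win-1] for i in 1:step:(length(x) - win + 1)]
end

sig  = randn(Float32, 5_000)
wins = sliding_windows(sig)
println(length(wins), " windows of length ", length(first(wins)))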

In [ ]:
ds = RadarECG("prepareRadarData");
N = length(ds)
train_len = floor(Int, 0.7 * N)
val_len   = floor(Int, 0.15 * N)
test_len  = N - train_len - val_len


train_idx = 1:train_len
val_idx   = (train_len+1):(train_len+val_len)
test_idx  = (train_len+val_len+1):N

X_train = [ ds[i][1] for i in train_idx ]  
Y_train = [ ds[i][2] for i in train_idx ]

X_val   = [ ds[i][1] for i in val_idx   ]
Y_val   = [ ds[i][2] for i in val_idx   ]

X_test  = [ ds[i][1] for i in test_idx  ]
Y_test  = [ ds[i][2] for i in test_idx  ]


batch_size = 64
train_loader = Flux.DataLoader((X_train, Y_train);
    batchsize = batch_size,
    shuffle   = false,
    partial   = true,
    parallel  = true,
    collate   = mycollate
)
val_loader  = Flux.DataLoader((X_val, Y_val);
    batchsize = batch_size, shuffle = true, parallel = true, collate = mycollate)
test_loader = Flux.DataLoader((X_test, Y_test);
    batchsize = batch_size, shuffle = true, parallel = true, collate = mycollate);

Let's see what the ECG and radar signals look like, cut into overlapping windows.

In [ ]:
radar_batch, ecg_batch = first(train_loader)  
radar_batch = cpu(radar_batch)
ecg_batch = cpu(ecg_batch)
win, B = size(radar_batch)

p = plot(size=(500,500), layout = (2,1), margin=15mm)

for (arr, name) in zip((radar_batch, ecg_batch), ("radar", "ecg"))
    target = name == "radar" ? 1 : 2                                   # subplot index
    ylab   = name == "radar" ? "Ускорение, м/с^2" : "Амплитуда, мВ"    # axis label per signal
    for j in 0:3                                                       # first four overlapping windows
        x = (1:win) .+ j*ds.step
        y = arr[:, j+1]
        plot!(p, x, y; subplot=target, lw=2, label="", alpha=0.7,
              xlabel="Отсчет", ylabel=ylab)
    end
end
title!(p, "Радар", subplot=1)
title!(p, "ЭКГ",   subplot=2)



display(p)

After that, we determine the device on which the model will be trained. If a GPU is available, training can be accelerated by orders of magnitude.

In [ ]:
device = CUDA.functional() ? gpu : cpu;
if device == gpu
    train_loader = gpu.(train_loader)
    val_loader   = gpu.(val_loader)
    test_loader  = gpu.(test_loader)
end;

The architecture of the model is defined in the model.jl file. We initialize the model and move it to the selected device.

Initialization of the model, optimizer, loss function, and other parameters

In [ ]:
model = Radar2ECG_UNet_SE(32);
device == gpu ? model = gpu(model) : nothing;
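For orientation, below is a small, hedged sketch of a one-dimensional squeeze-and-excitation (SE) block, illustrating the idea behind the "SE" part of Radar2ECG_UNet_SE; the actual architecture is defined in model.jl and may differ.

using Flux, Statistics

# Hedged, illustrative sketch of a 1-D squeeze-and-excitation block.
struct SEBlock
    fc1
    fc2
end
Flux.@functor SEBlock

SEBlock(channels::Int; r::Int = 8) =
    SEBlock(Dense(channels => max(channels ÷ r, 1), relu),
            Dense(max(channels ÷ r, 1) => channels, sigmoid))

function (se::SEBlock)(x)                               # x :: (length, channels, batch)
    s = dropdims(mean(x; dims = 1); dims = 1)           # squeeze: (channels, batch)
    w = se.fc2(se.fc1(s))                               # excitation weights in (0, 1)
    return x .* reshape(w, 1, size(w, 1), size(w, 2))   # channel-wise rescaling
end

se = SEBlock(32)
x  = randn(Float32, 256, 32, 4)
size(se(x))                                             # (256, 32, 4)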

Let's define a few parameters: the learning rate, the number of epochs, and the model name. The model name is used to automatically create a directory in the workspace where the model weights and plots of various metrics will be stored during training.

We also initialize the optimizer and the loss function (implemented in the utils.jl file) and use the auxiliary function get_training_dir, which creates the directories needed to save the training results.
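As a rough illustration, a helper like get_training_dir could pick the next free trainN subfolder under model/<model_name>, create it, and return both the full path and the folder name; the sketch below is only an assumption, the real implementation lives in utils.jl.

# Hedged sketch of a get_training_dir-style helper (the actual one is in utils.jl).
function get_training_dir_sketch(root::AbstractString, model_name::AbstractString)
    base = joinpath(root, "model", model_name)
    mkpath(base)
    n = 1
    while isdir(joinpath(base, "train$(n)"))            # find the first unused trainN
        n += 1
    end
    run_dir = joinpath(base, "train$(n)")
    mkdir(run_dir)
    return run_dir, "train$(n)"
end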

In [ ]:
fs_ecg = 2000;   # sampling rate of the ECG signal, Hz
In [ ]:
learning_rate = 1e-4
num_epochs = 100
model_name = "trainedModel"

lossFunction = loss;
# opt = Flux.Adam(learning_rate)
opt = Flux.Optimiser(WeightDecay(1e-6), Adam(learning_rate))

epoch_number = 0
train_dir, name_new_dir = get_training_dir(Dirpath, model_name)
print(train_dir, " ", name_new_dir)
/user/Диплом/AI/model/trainedModel/train1 train1

Let's define the lists where the computed metrics will be stored. In this example, we calculate MAE, RMSE, R², and accuracy at the tau threshold.
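As a rough reference, these metrics could be computed as in the sketch below; the actual definitions are in utils.jl, and the Acc@tau formula here is an assumption (the share of samples whose absolute error is below tau).

using Statistics

# Hedged sketch of the metrics; names and the Acc@tau definition are illustrative.
mae(ŷ, y)  = mean(abs.(ŷ .- y))
rmse(ŷ, y) = sqrt(mean(abs2.(ŷ .- y)))
r2(ŷ, y)   = 1 - sum(abs2.(y .- ŷ)) / sum(abs2.(y .- mean(y)))
acc_tau(ŷ, y; tau = 0.05) = mean(abs.(ŷ .- y) .< tau)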

We also initialize the params variable, which stores some parameters of the learning cycle.

In [ ]:
tau = 0.05
best_val_loss = Inf
train_losses = Float64[]   
valid_losses = Float64[]  
MAEList      = Float64[]
RMSEList     = Float64[]
R2List       = Float64[]
AccList      = Float64[]
params = Dict("epochs"=> num_epochs, "batch_size"=>batch_size, "learning_rate"=> learning_rate, "tau"=>tau, "model"=>"Radar2ECG_UNet", "info"=>"Train model without scheduler." );

Model training

Next, we start the training loop of the model. The train! and validate functions are located in the utils.jl file. After training completes, the results are saved to the created folder.
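For reference, one training epoch could look roughly like the sketch below; the real train! in utils.jl has a different signature and also reports progress, and the loss call lossfn(ŷ, y) is an assumption about its interface.

using Flux

# Hedged sketch of a single training epoch over a data loader.
function train_epoch_sketch!(model, loader, opt, lossfn, device)
    ps = Flux.params(model)                         # implicit-params API, matching Flux.Optimiser
    total, n = 0.0, 0
    for (x, y) in loader
        x = reshape(x, size(x, 1), 1, size(x, 2))   # (window, channel, batch), as in the inference cell
        x, y = device(x), device(y)
        l, gs = Flux.withgradient(() -> lossfn(model(x), y), ps)
        Flux.Optimise.update!(opt, ps, gs)
        total += l; n += 1
    end
    return opt, total / n
end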

In [ ]:
no_improve_epochs = 0
best_model    = nothing

try
    for epoch in 1:num_epochs
        println("-"^50 * "\n")
        println("EPOCH $(epoch):")

        opt, train_loss = train!(
            model, train_loader, opt,
            lossFunction, device, 1, num_epochs
        )

        val_loss, metrics = validate(model, val_loader, lossFunction, device)
        @show metrics.R2

        if val_loss < best_val_loss
            best_val_loss = val_loss
            no_improve_epochs = 0
        else
            no_improve_epochs += 1
            if no_improve_epochs >= 10
                old_lr = opt[2].eta
                opt[2].eta *= 0.9
                @info "LR reduced: $old_lr$(opt[2].eta) at epoch $epoch"
                no_improve_epochs = 0
            end
        end

        println("Epoch $epoch/$num_epochs | train $(round(train_loss, digits=4)) | val $(round(val_loss, digits=4))")
        println("Current LR = $(opt[2].eta)")

        push!(train_losses, train_loss)
        push!(valid_losses, val_loss)
        push!(MAEList,  metrics.MAE)
        push!(RMSEList, metrics.RMSE)
        push!(R2List,   metrics.R2)
        push!(AccList,  metrics[Symbol("Acc@0.05")])


        if val_loss == best_val_loss
            best_model = deepcopy(model)
            @info "Валидационная потеря упала" val_loss=best_val_loss
        end

    end

catch e
    if e isa InterruptException
        println("Training interrupted by user.")
    else
        rethrow(e)
    end

finally
    best_model = cpu(best_model)
    output_dir_models = joinpath(train_dir, "models_bson")
    isdir(output_dir_models) || mkdir(output_dir_models)
    best_path = joinpath(output_dir_models, "model_best.bson")
    @info "Saving best model (val_loss=$(best_val_loss)) to $best_path"
    @save best_path best_model
    save_plots!(MAEList, RMSEList, R2List, AccList,
                train_losses, valid_losses;
                model_dir = "model",
                run_name  = name_new_dir,
                plot_name = model_name)

    params_path = joinpath(train_dir, "params.json")
    open(params_path, "w") do io
        JSON.print(io, params)
    end
    println("All done. Artifacts saved in $train_dir")
end

Model inference on test data

Let's check how the model works on the test data.

In [ ]:
radar, ecg = first(test_loader)        # take one batch from the test loader

radar_  = reshape(radar, size(radar,1), 1, size(radar,2)) |> device   # (window, 1 channel, batch)
ecg_gpu = ecg |> device

pred_gpu = model(radar_)  


radar_cpu = Array(radar)                        
pred_cpu  = Array(dropdims(pred_gpu; dims=2))  
ecg_cpu   = Array(ecg_gpu)                     

win_len, nbatch = size(radar_cpu)
rows, cols = 4, 2
nplots      = min(rows*cols, nbatch)   # in case the batch holds fewer than 8 examples

plt = plot(layout       = (rows, cols),   # 4×2
           size         = (1500, 1500),
           margin       = 15mm,
           legend       = :outertopright,
           legendfontsize = 8)

for i in 1:nplots
    # add legend labels only once, to avoid duplicate entries
    lbls = i == 1 ? ("Радар", "Восстановленная ECG", "Эталонная ECG") : ("", "", "")

    plot!(plt, radar_cpu[:, i + 2], lw = 2, alpha = 0.7,
          label = lbls[1], subplot = i)

    plot!(plt, pred_cpu[:,  i + 2], lw = 2,
          label = lbls[2], subplot = i)

    plot!(plt, ecg_cpu[:,   i + 2], lw = 2, linestyle = :dash,
          label = lbls[3], subplot = i)

    title!(plt, "Пример $i", subplot = i)
    xlabel!(plt, "Время [s]", subplot = i)
    ylabel!(plt, "Амплитуда, мВ", subplot = i)
end

display(plt)

Evaluation of the trained model

Let's plot the loss curves for the training and validation sets.

In [ ]:
plot(train_losses, label="train loss", lw=3)
plot!(valid_losses, label="valid loss", lw = 3)
xlabel!("Эпоха")
ylabel!("Ошибка")
Out[0]:

Let's visualize the metrics accumulated during training: R², RMSE, MAE, and accuracy.

In [ ]:
metricsList  = [RMSEList, MAEList, R2List, AccList]
metric_names = ["RMSE", "MAE", "R²", "Accuracy"]

plt = plot(
    layout = (1, 4),
    size   = (1000, 300),
    margin = 5Plots.mm,
    legend = false,
)

for (i, name) in enumerate(metric_names)
    plot!(plt, metricsList[i]; subplot = i, lw = 2, alpha = 0.8, label = "")
    title!(plt,  name,     subplot = i)
    xlabel!(plt, "Epoch",  subplot = i)
    ylabel!(plt, name,     subplot = i)
end

display(plt)

Conclusion

In this work, a custom U-Net-like neural network was trained on preprocessed radar data to reconstruct the ECG. The model copes with the task well, achieving good values on the computed metrics.