Open Access
1 February 2022
Real-time tracking of a diffuse reflectance spectroscopy probe used to aid histological validation of margin assessment in upper gastrointestinal cancer resection surgery
Abstract

Significance: Diffuse reflectance spectroscopy (DRS) allows discrimination of tissue type. Its application is limited by the inability to mark the scanned tissue and the lack of real-time measurements.

Aim: This study aimed to develop a real-time tracking system to enable localization of a DRS probe to aid the classification of tumor and non-tumor tissue.

Approach: A green-colored marker attached to the DRS probe was detected using hue-saturation-value (HSV) segmentation. A live, augmented view of tracked optical biopsy sites was recorded in real time. Supervised classifiers were evaluated in terms of sensitivity, specificity, and overall accuracy. Custom-developed software was used for data collection, processing, and statistical analysis.

Results: The measured root mean square error (RMSE) of DRS probe tip tracking was 1.18 ± 0.58 mm and 1.05 ± 0.28 mm for the x and y dimensions, respectively. The diagnostic accuracy of the system to classify tumor and non-tumor tissue in real time was 94% for stomach and 96% for the esophagus.

Conclusions: We have successfully developed a real-time tracking and classification system for a DRS probe. When used on stomach and esophageal tissue for tumor detection, the accuracy derived demonstrates the strength and clinical value of the technique to aid margin assessment in cancer resection surgery.

1.

Introduction

Cancers of the gastrointestinal (GI) tract remain a major contributor to the global cancer burden, with 4.8 million new cases of GI cancer worldwide in 2018.1 These malignancies continue to pose a major threat to public health. The aim of surgery is complete resection of the tumor with clear margins, while preserving as much surrounding healthy tissue as possible.2 A positive circumferential resection margin (CRM) is associated with local recurrence of the tumor and poorer long-term survival.3,4 The five-year survival of patients with esophageal cancer with positive margins has been reported as 13.8% compared to 46.3% for those with negative margins.5 In patients with stomach cancer, local recurrence has been shown to be 16% after a positive margin following surgery.6 For this reason, it is paramount to establish tissue margins accurately.7

Currently, the gold-standard intra-operative technique for CRM assessment is frozen sections.8,9 However, this technique is at risk of sampling errors, plus it is time-consuming, labor intensive, and lengthens the operative time, affecting both patient outcome and theater efficiency.10

Diffuse reflectance spectroscopy (DRS) is a technique that allows discrimination of normal and abnormal tissue based on spectral data and presents a promising advancement in cancer diagnosis.11–14 When compared to sophisticated micro-endoscopic probes, DRS has lower costs and is simpler because it does not require lasers or magnification optics. The main limitation of the clinical use of DRS is that, although it can discriminate tissue types, it provides only single-point spectral measurements and leaves no marks on the tissue during scanning.15 As a result, the area that has been in contact with the probe cannot be localized once the optical biopsy has taken place, making it difficult for the surgeon to determine the resection margin. This is particularly challenging when DRS is used endoscopically or during minimally invasive surgery, where the ergonomics of scanning and viewing the DRS probe site are even more demanding.

To overcome this limitation, the tip of the DRS probe should ideally be tracked to enable localization of the biopsy site.16 Several surgical tool tracking approaches are available, of which optical tracking has been the most widely evaluated during surgical practice.17,18 This involves using color images from one or more cameras to determine a tool’s pose and has been particularly used during minimally invasive surgery.19 Many optical tracking systems involve markers such as fiducials or color labels such that the tip of the instrument can be recognized and tracked. For instance, a dual-pattern hybrid marker, incorporating both circular dots and chessboard vertices, provided high detection rates and accurate pose estimation of a laparoscopic gamma probe achieving a mean translation error of 1.81 mm.20 In another study, biocompatible color markers appropriate for real-time use during surgery achieved 75% sensitivity and 90% specificity for a multiple instrument tracking system.21

Other studies have developed systems to delineate resection margins; however, these approaches face significant limitations. Raman spectroscopy has shown much promise in this field, but it has not yet been possible to perform probe tracking in real time, although a rate of 2 to 5 frames per second has been achieved.22 Similarly, although fluorescence-guided surgery has the potential for real-time margin assessment, quantifiable fluorescence imaging information is difficult to obtain using open-field devices due to uncontrollable parameters, such as the variable light intensity, working distance, and influence of ambient light on the instruments in the operating room.23

This paper presents a tracking system to enable localization of the tip of a handheld DRS probe. The system allowed tracking of the two-dimensional (2D) position and orientation (x and y axes, angle) of the spectroscopic probe using a color marker to aid real-time assessment of tumor margins when used on ex vivo cancer specimens. The aim of this study was to support real-time differentiation of tumor and non-tumor tissue—and thereby to improve existing methods for intra-operative margin assessment—by introducing a system to track a DRS probe during surgery.

2.

Methods

2.1.

Marker Design, Detection and Segmentation, and Calculation of Tip Position

We first describe the tracking system for the DRS probe, with details about the probe itself and the ex vivo experiments presented in later sections. Accurate detection and segmentation are essential for DRS probe tracking. To calculate the 2D probe tip position, we needed to find the location of the probe in the image plane. The midline of the DRS probe could be found from the two edges, with the tip point lying somewhere along this line.

Because the probe's own color is not unique in the surgical scene, an artificial color marker was used to identify it. To choose a distinct marker color, we analyzed the color distribution of real laparoscopic images that contained all possible colors that may be present during the course of an operation. The hue-saturation-value (HSV) color space (Fig. 1) was used since the hue values are directly related to the color signature. Figure 1(a) shows a typical color distribution for biological tissue, ranging from yellow to purple. For the coded color to be most distinguishable from those present in the image, two alternative colors, cyan and green, were chosen to construct a marker in the form of plastic tape, 70 mm in length, wrapped close to the tip of the cylindrical probe (Fig. 1).

Fig. 1

(a) Color distribution of the biological tissue, with cyan color circled as a potential color for our marker. (b) DRS fiber probe with the green color marker attached.


The overall methodology for using a monocular red/green/blue (RGB) camera (C920, Logitech International S.A., Lausanne, Switzerland) in a 2D color-based optical marker tracking system is summarized in Fig. 2. A 2D image calibration method was employed24 that only requires the observation of a planar pattern (a checkerboard) from different orientations and positions. A set of pre-captured images from the camera was analyzed to estimate the 3×3 intrinsic matrix K (containing the focal length and optical center), the radial distortion coefficients of the lens, and the extrinsic matrix that consisted of the 3×3 rotation matrix, R, and the 3×1 translation vector.
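The quantities estimated by such a calibration correspond to the standard pinhole projection model (a textbook formulation, not reproduced from the paper): in homogeneous coordinates, a world point (X, Y, Z) maps to pixel coordinates (u, v) as

```latex
s \begin{pmatrix} u \\ v \\ 1 \end{pmatrix}
  = \underbrace{\begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}}_{K}
    \, [\, R \mid t \,]
    \begin{pmatrix} X \\ Y \\ Z \\ 1 \end{pmatrix}
```

where f_x, f_y are the focal lengths in pixels, (c_x, c_y) is the optical center, and s is an arbitrary scale factor; radial lens distortion is modeled by separate coefficients applied alongside this projection.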

Fig. 2

DRS fiber probe tracking procedure.


The input video frames were resized from 1920×1080 to 640×480 for faster processing and an increase in frames-per-second. Then, the frame was blurred to reduce high spatial frequency information while maintaining visibility of the flat-textured DRS probe. The image was converted from RGB to HSV color space, and thresholded based on predefined hue (h) and saturation (s) upper and lower boundaries for either cyan or green color. The segmented frame was converted back to RGB, and morphological erosion and dilation operations were applied to eliminate negative impulsive noise and positive noise, and to fill possible gaps in the mask.
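The hue/saturation thresholding step can be sketched in plain Python/NumPy as follows. This is a minimal illustration rather than the paper's OpenCV implementation; the hue bounds, saturation threshold, and toy frame are assumptions (green sits near hue 1/3 on a 0 to 1 scale), and the blur and morphological clean-up steps are omitted.

```python
import colorsys

import numpy as np


def hsv_mask(rgb, h_range, s_min):
    """Binary mask of pixels whose hue lies in h_range and saturation >= s_min.

    rgb: (H, W, 3) float array with channels in [0, 1].
    h_range: (low, high) hue bounds in [0, 1]; pure green is near 1/3.
    """
    flat = rgb.reshape(-1, 3)
    hsv = np.array([colorsys.rgb_to_hsv(r, g, b) for r, g, b in flat])
    h, s = hsv[:, 0], hsv[:, 1]
    mask = (h >= h_range[0]) & (h <= h_range[1]) & (s >= s_min)
    return mask.reshape(rgb.shape[:2])


# 2x2 toy frame: one pure-green marker pixel among tissue-like reds
frame = np.array([[[1.0, 0.2, 0.2], [0.0, 1.0, 0.0]],
                  [[0.8, 0.3, 0.3], [0.9, 0.1, 0.1]]])
mask = hsv_mask(frame, h_range=(0.25, 0.45), s_min=0.5)
```

In the real pipeline the resulting mask would then be eroded and dilated before contour extraction.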

The various contours in the mask representing segmented objects were identified and ordered, and the largest one was retained as the likely position of the detected color marker. Corner points of the bounding box of the contour were extracted and used alongside the Hough transform to detect the edges of the probe. The midline of the probe was calculated from the detected edges using the standard description of a line in the xy plane: y = mx + b.

The tip position of the probe was located by conducting a one-dimensional (1D) edge detection along the midline of the probe in the segmented image. From Fig. 3(d) it is clear that there are two possible locations that could correspond to the tip position, one located to the bottom left of the white segmented probe, and the other one to the top right. To determine which direction to search for the location of the probe tip along the midline, it was assumed that the probe would only be moved in the range 0 deg to 180 deg in the yaw axis during ex vivo benchtop sampling.
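The midline and tip-search steps can be sketched as follows. This is an illustrative reconstruction, not the paper's code: it assumes the two Hough-detected edges are near-parallel (so the midline shares their slope and averages their intercepts) and that the yaw assumption fixes the search direction; the toy mask and function names are hypothetical.

```python
import numpy as np


def midline_from_edges(m, b_top, b_bottom):
    # For two (near-)parallel probe edges y = m*x + b, the midline shares the
    # slope and averages the intercepts.
    return m, (b_top + b_bottom) / 2.0


def tip_along_midline(mask, m, b, search_left_to_right=True):
    """1D search along y = m*x + b in a binary mask; the tip is the last
    mask-positive pixel encountered in the search direction."""
    h, w = mask.shape
    xs = range(w) if search_left_to_right else range(w - 1, -1, -1)
    tip = None
    for x in xs:
        y = int(round(m * x + b))
        if 0 <= y < h and mask[y, x]:
            tip = (x, y)
    return tip


# Toy mask: a horizontal "probe" occupying columns 2..6 of row 5
mask = np.zeros((10, 12), dtype=bool)
mask[5, 2:7] = True
m_mid, b_mid = midline_from_edges(0.0, 4.5, 5.5)  # edges just above/below row 5
tip = tip_along_midline(mask, m_mid, b_mid)
```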

Fig. 3

DRS probe detection workflow. (a) Example input video frame. (b) Blurred frame. (c) Hue channel of the blurred input frame. (d) Binary mask result of HSV segmentation. (e) Detected edge lines of the probe. (f) Detected tip point of the probe.


2.2.

DRS Probe Tip Tracking

To deal with real-world situations such as partial occlusion, changing illumination, and changes in probe velocity, we introduced a standard Kalman filter (KF) to estimate the location of the tip of the probe based on its location in previous frames.25 The task of the KF was divided into two steps: prediction and correction. The KF tracked the state matrix that contained the current value of the probe tip location, while the process covariance matrix contained the predictive error of those measurements. For each new time frame, the state transition matrix propagated the state and process covariance matrices, estimating a new position and a new covariance. The correction step consisted of computing the Kalman gain and updating the current state using the gain in conjunction with the detected probe tip position measured at that time frame.
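The predict/correct cycle can be sketched with a constant-velocity Kalman filter in NumPy. This is a generic sketch, not the paper's implementation: the state layout, process noise q, and measurement noise r are illustrative assumptions.

```python
import numpy as np


class TipKalman:
    """Constant-velocity Kalman filter for a 2D point: state [x, y, vx, vy]."""

    def __init__(self, q=1e-2, r=1.0):
        self.F = np.eye(4)
        self.F[0, 2] = self.F[1, 3] = 1.0   # x += vx, y += vy each frame
        self.H = np.eye(2, 4)               # only position is measured
        self.Q = q * np.eye(4)              # process noise (assumed)
        self.R = r * np.eye(2)              # measurement noise (assumed)
        self.x = np.zeros(4)
        self.P = np.eye(4)

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]

    def update(self, z):
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ (np.asarray(z) - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P
        return self.x[:2]


kf = TipKalman()
for t in range(20):                # probe tip drifting 1 px/frame in x, fixed y
    kf.predict()
    est = kf.update((float(t), 5.0))
```

During occlusion, the `predict()` output alone can serve as the tip estimate until detection resumes.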

2.3.

Graphical User Interface

A user interface (UI) was developed using Python 3.6 (Python Software Foundation, Wilmington) and the Qt5 toolkit to integrate all the functionalities (Fig. 4) on a standard PC (Intel i3 processor at 3.30 GHz with 8 GB RAM). The UI consists of different functions, such as "start tracking" for the initiation of probe tracking and "get spectrum" for spectral data acquisition. Before starting the acquisition, the user entered patient metadata, spectrometer integration time, spectral range, tissue type, DRS calibration, etc. in a dialog box. A single frame of the tissue sample being examined was captured without the DRS probe; this frame was annotated with all the probed locations at the end of the acquisition process. The live video feed from the RGB camera was annotated in real time with the probed positions, and the current spectrum at each location was displayed in the main GUI window. A colormap was used to visually indicate the probability of a data point belonging to a tissue-type class (green for normal, pink for tumor; classification described below). Real-time classification was added to the GUI following the interim analysis of the data. When the experiment was finished, all data were saved for further correlation with the histological data. The system worked at 30 frames per second with an image resolution of 640 × 480 pixels using a laptop with code parallelization. Finally, the DRS probe was able to acquire 80 spectra per second through the developed GUI using multi-threading.
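The producer/consumer pattern behind such multi-threaded acquisition can be sketched with the standard library alone. This is a hedged illustration of the general idea, not the paper's GUI code: `acquire_spectrum` is a hypothetical stand-in for the hardware-specific spectrometer read, and the sentinel-based shutdown is one of several reasonable designs.

```python
import queue
import threading


def acquire_spectrum(i):
    # Stand-in for the spectrometer read (hypothetical; the real call is
    # hardware-specific). Returns a fake spectrum label.
    return f"spectrum-{i}"


def acquisition_worker(n_spectra, out_q):
    # Producer thread: pushes spectra while the main thread stays free to
    # annotate video frames and refresh plots.
    for i in range(n_spectra):
        out_q.put(acquire_spectrum(i))
    out_q.put(None)                      # sentinel: acquisition finished


spectra_q = queue.Queue()
worker = threading.Thread(target=acquisition_worker, args=(5, spectra_q))
worker.start()

collected = []
while True:
    item = spectra_q.get()               # a real GUI loop would also redraw here
    if item is None:
        break
    collected.append(item)
worker.join()
```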

Fig. 4

Python software for hardware system control, data acquisition and real-time classification. (a) Control panel, allowing for execution of data acquisition protocol, probe tracking, hardware calibration and light source control. (b) Augmented live video feed from the RGB camera. Real-time classification is displayed on the screen via text and a colored overlay on the tissue, with a graduated color scale indicating the probability of belonging to a particular class (Normal = 100% green – 100% pink = Tumor). (c) Reflectance data in the 420- to 1000-nm spectral range displayed in real-time.


2.4.

Ex Vivo Data Acquisition

The study was performed with approval from the Harrow Research Ethics Committee (ref. no. 08/H0719/37) and was undertaken at Imperial College NHS trust. We collected data from patients undergoing upper GI cancer resection surgery between July 2020 and March 2021 who gave their written consent to partake in the study.

The DRS system consisted of a reflection probe (Ocean Optics Inc., QR600-7-SR-125F) containing six peripheral illumination fibers around one central light-collection fiber (600 μm diameter) within a 0.125-in. ferrule. A tungsten halogen light source (360 to 2400 nm, typical output power 8.8 W; Ocean Optics Inc., HL-2000-HP) was connected to the six illumination fibers, while the central fiber was connected to a spectrometer (Ocean Optics Inc., USB4000) [Fig. 5(a)]. The spectrometer was controlled by the developed GUI and acquired data in the spectral range of 400 to 1000 nm. To protect the DRS probe tip from tissue contamination, a sterile plastic camera drape (365 Healthcare Ltd.) was chosen for its low optical interference, low cost, and sterility.

Fig. 5

(a) DRS instrumentation and (b) workflow for ex vivo data acquisition.


Data were collected from esophageal and/or stomach specimens ex vivo immediately after surgical resection. The hand-held DRS probe and tracking system was used on suspected normal tissue within the sample and on macroscopically suspected cancerous or fibrosed tissue to obtain spectral information.

DRS measurements of the suspected macroscopic tumor site and normal tissue were taken on the outer (serosal) layer. "Normal" tissue measurements were taken as far from the suspected cancerous location as possible and close to the resection margin, based on visual and haptic inspection of the macroscopic tissue by a surgeon. Twenty spectra were acquired per tissue point; these were averaged and displayed in real time (Fig. 4). A minimum of 200 different locations per tissue type (normal or tumor) were sampled, depending on the size of the region. When possible, DRS recordings of stomach and esophagus were taken from the same specimen. The data acquisition protocol lasted 10 min.

To account for inter-patient, background light, and signal quality variability, spectral normalization and noise reduction were performed at the beginning of each sampling session using white reflectance standard and dark-field readings:

λDRS = (λRAW − λDN) / (λWST − λDN),
where λRAW is the raw signal acquired from the spectrometer, λDN is the dark-field signal, λWST is the white reflectance standard signal, and λDRS is the final reflectance value. DRS data were then processed using a Savitzky–Golay (or digital smoothing polynomial) filter to perform noise reduction and preserve higher order moments (spectral characteristics) of the original spectrum. The workflow of the ex vivo data acquisition and analysis is shown in Fig. 5(b).
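The normalization and smoothing steps can be sketched in NumPy. The normalization follows the formula above directly; the Savitzky–Golay smoother below is a minimal per-window polynomial fit (the study likely used a library filter, and the window length and polynomial order here are illustrative assumptions).

```python
import numpy as np


def normalize_drs(raw, dark, white):
    # lambda_DRS = (lambda_RAW - lambda_DN) / (lambda_WST - lambda_DN)
    return (raw - dark) / (white - dark)


def savgol(y, window=7, order=2):
    """Minimal Savitzky-Golay smoother: fit a local polynomial in each window
    and evaluate it at the window center. Parameters are illustrative."""
    half = window // 2
    ypad = np.pad(y, half, mode="edge")
    x = np.arange(window) - half
    out = np.empty(len(y))
    for i in range(len(y)):
        coeffs = np.polyfit(x, ypad[i:i + window], order)
        out[i] = np.polyval(coeffs, 0.0)
    return out


# Synthetic example: 505 samples over 468-720 nm, as in the processed spectra
wavelengths = np.linspace(468, 720, 505)
raw = 0.2 + 0.5 * np.exp(-((wavelengths - 560) / 40.0) ** 2)
dark = np.full_like(raw, 0.05)
white = np.full_like(raw, 0.95)
reflectance = savgol(normalize_drs(raw, dark, white))
```

A property worth noting: because the filter fits a degree-2 polynomial, any signal that is locally quadratic passes through unchanged, which is how the filter preserves peak shapes while suppressing noise.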

2.5.

Histopathology Correlation

Following the acquisition of all spectra, the suspected tumor location was painted with yellow tissue dye (Cancer Diagnostics Inc., Durham) and another still image was recorded to allow correlation with standard histopathological analysis, which was used as a reference test. The specimen was passed to the histopathology department, where it was sliced and photos were taken of these slices which included the yellow painted region. The tissue was then processed according to standard protocols (Fig. 6), with the resulting histology slides marked as either normal or a range of different tumor types including esophageal adenocarcinoma, esophageal squamous cell carcinoma, or gastric adenocarcinoma.

Fig. 6

Histopathological analysis of the specimen for purposes of correlation of suspected tumor location. Histopathological correlation workflow is shown on an example of GAC. The correlation was performed via yellow tissue paint [marked with yellow arrow across panels (a)–(f)]. The ex vivo tissue specimen (stomach) (a) with tracked optical biopsy sites of the suspected “tumor” area; (b) after painting of macroscopic suspected “tumor” tissue with yellow paint; and (c) having been placed in formalin. (d) Macroscopic slice of stomach tissue. (e) Microscopic H&E slice and confirmed GAC (red outlined area). (f) The ex vivo tissue specimen with green colored optical biopsy sites of confirmed tumor [based on (b)] after manual labelling and removal of tracked optical biopsy sites that fell outside of tumor area.


Once the painted area was confirmed as tumor, the annotated video frames containing the optical biopsy-tracked positions of suspected "tumor" tissue were manually labeled as tumor (Fig. 6). These labels were considered ground-truth labels for training of machine learning classifiers. Measurements at locations that could not be confirmed by histopathology (e.g., if the yellow paint was not clear on the sliced specimen or if the paint could not be detected on the H&E slide) were excluded.

2.6.

Spectral Classification

Data from all patients were combined into separate esophagus and stomach datasets. The features of the collected data were first standardized by removing the mean and scaling to unit variance. An automated method was implemented for the detection and removal of outliers (e.g., erroneous measurements or air interference). For this method, the median of the residuals was calculated, along with the 25th and 75th percentiles. The difference between each value and the median of the residuals was then calculated; if a value lay more than 1.5 times the median absolute deviation away from the median of the residuals, it was classified as an outlier. Additionally, the ground truth provided by histopathology correlation was used for validation of the method. To reduce overfitting and improve accuracy, feature selection was performed on each of the esophagus and stomach datasets. Data were then divided into training and testing sets using the repeated stratified k-fold cross-validation (CV) method, with five folds and five repeats. This CV method was chosen based on a literature review and the nature of the data used in this study. Stratified five-fold CV gives a more stable and trustworthy result, since training and testing are performed on several different parts of the dataset, and it copes well with imbalanced data. Using this CV method, the dataset is split into five folds such that each fold contains approximately the same percentage of samples of each target class as the complete set. Compared with stratified five-fold CV, leave-one-out CV requires building n models instead of five, where n is the number of samples in the dataset, making it considerably more computationally expensive. The spectral data analysis process is shown in Fig. 7.
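The median-absolute-deviation rule described above can be sketched as follows. The 1.5 threshold comes from the text; the toy intensity values and the exact form of the residuals are illustrative assumptions.

```python
import numpy as np


def mad_outlier_mask(values, k=1.5):
    """Flag values lying more than k median-absolute-deviations from the median."""
    values = np.asarray(values, dtype=float)
    med = np.median(values)
    mad = np.median(np.abs(values - med))
    return np.abs(values - med) > k * mad


# Toy reflectance values at one wavelength; the last reading simulates
# air interference from poor probe-tissue contact.
intensities = np.array([0.50, 0.51, 0.49, 0.50, 0.51, 0.49, 0.95])
outliers = mad_outlier_mask(intensities)
clean = intensities[~outliers]
```

Median-based statistics are preferred here because a single extreme reading barely shifts the median, whereas it would drag a mean-and-standard-deviation rule toward itself.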

Fig. 7

Spectral data analysis workflow.


Binary classification into normal and tumor tissue was performed using various supervised machine learning classifiers, such as linear support vector machine (SVM), multi-layer perceptron (MLP), light gradient boosting machine (LGBM), and extreme gradient boosting (XGB).26

Machine learning classifiers were evaluated in terms of sensitivity, specificity, overall accuracy, and the area under the ROC curve (AUC). Python 3.6 was used for data processing, visualization, machine learning classification, and statistical analysis.
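The evaluation metrics reduce to counts from the binary confusion matrix. A minimal NumPy sketch (label convention 1 = tumor, 0 = normal; the toy labels are illustrative):

```python
import numpy as np


def binary_metrics(y_true, y_pred):
    """Sensitivity, specificity, and overall accuracy for binary labels (1 = tumor)."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))   # tumor called tumor
    tn = np.sum((y_true == 0) & (y_pred == 0))   # normal called normal
    fp = np.sum((y_true == 0) & (y_pred == 1))   # normal called tumor
    fn = np.sum((y_true == 1) & (y_pred == 0))   # tumor called normal
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / len(y_true)
    return sensitivity, specificity, accuracy


y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 0, 1]
sens, spec, acc = binary_metrics(y_true, y_pred)
```

In the margin-assessment setting, sensitivity (tumor detection rate) is usually the metric of greatest clinical concern, since a false negative risks leaving tumor behind.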

3.

Experimental Evaluation and Results

3.1.

DRS Probe Tracking Evaluation

For the evaluation of tracking accuracy, two videos of tissue specimens were analyzed: the probe's 2D tip position (x, y) was manually labeled by two different individuals (Labelbox Inc., California) and subsequently compared to the tip location estimated by the proposed algorithm (Fig. 8). The root mean squared error (RMSE) between the estimated and manually labeled tip positions was 1.18 ± 0.58 mm (4.21 ± 2.07 pixels) and 1.05 ± 0.28 mm (3.75 ± 0.97 pixels) for the x and y directions, respectively, as shown in Table 1. The maximum measured error across all 1050 frames from both videos was 1.76 mm (6.28 pixels) at a working distance of 30 cm. In Fig. 8, a frame from each video is overlaid with the manually labeled and estimated probe tip positions in two different colors.
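The per-axis RMSE used here can be computed directly from the paired position arrays. A short NumPy sketch (the position values are illustrative, not the study's data):

```python
import numpy as np


def per_axis_rmse(est, gt):
    """Root mean squared error per axis between estimated and ground-truth
    (x, y) tip positions, both shaped (n_frames, 2)."""
    est, gt = np.asarray(est, dtype=float), np.asarray(gt, dtype=float)
    return np.sqrt(np.mean((est - gt) ** 2, axis=0))


# Illustrative trajectories: estimates alternate 1 px around the ground truth
gt = np.array([[10.0, 20.0], [11.0, 21.0], [12.0, 22.0], [13.0, 23.0]])
est = gt + np.array([[1.0, -1.0], [-1.0, 1.0], [1.0, -1.0], [-1.0, 1.0]])
rmse_x, rmse_y = per_axis_rmse(est, gt)
```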

Fig. 8

(a) and (b) Illustration of probe tracking error with overlaid ground truth (blue line) and estimated (yellow line) probe tip position for two video sequences (600 and 450 frames, respectively). (c)–(f) The ground truth and estimated projected DRS probe tip pixel position for the x axis and y axis for each of the video sequences. Blue and red lines indicate the estimated and ground-truth positions, respectively.


Table 1

Projection error for DRS probe’s tip location.

Dimension | RMSE (mm) | Maximum error (mm) | Minimum error (mm)
x | 1.18 ± 0.58 | 1.76 | 0.00
y | 1.05 ± 0.28 | 1.39 | 0.00

For further validation, the detection limits and detection rates were calculated by experimentally recording the maximal distance and rotation angle of the probe (Table 2). The distance was measured from the camera to the probe, and the limits of rotation were defined about the probe's local coordinate axes (roll, pitch, and yaw). When testing the distance limits, the probe was translated along the axis of the camera until detection failed. To identify the rotational motion limits, the probe was placed 30 cm from the camera, a typical distance for practical ex vivo tissue scanning.

Table 2

Rotation angle and maximum detectable distance from the camera to track the DRS probe.

Rotation axis | Rotation angle and distance
Roll | 360 deg
Pitch | 360 deg
Yaw | 0 deg to 180 deg
Distance to camera | 10 to 100 cm

The image-based tracking framework was used to map all spatio-temporally tracked biopsy sites and re-project them back onto the image plane to provide a live augmented view of the ex vivo spectral acquisition procedure. The biopsy site was defined by the probe tip location, taken from the tracking procedure, and the geometrical characteristics of the probe.

3.2.

Real-time Tracking and Sampling of Ex Vivo Tissue

A total of 32 patients were recruited into this study. Seven patients were excluded following histopathology assessment, as the tumor was reported as benign or regressed, resulting in 26 patients being included in the final analysis. The median age was 68, with the majority of patients being male (n=24, 75%). Overall, 20 distinct sets of normal stomach data, 12 of normal esophagus data, 8 of gastric cancer data, and 7 of esophageal cancer data were recorded. Ex vivo specimen labelling errors were identified by retrospectively reviewing videos of real-time data acquisition and analyzing the DRS probe position over the tissue areas using the color tracking system. Probe contact artifacts (e.g., lack of probe contact with the tissue) were removed using the automated outlier detection and removal method described in Sec. 2.6.

A total of 4628 mean spectra were collected for normal stomach, 2305 mean spectra for gastric cancer, 2990 mean spectra for normal esophagus, and 1939 mean spectra for esophageal cancer. Each processed mean spectrum contained 505 equally spaced intensity measurements in the 468- to 720-nm spectral range (resolution 0.5 nm), with data in the 420- to 468-nm and 720- to 1000-nm ranges excluded following interim analysis. The means of all spectra for each of the tissue classes are shown in Fig. 9.

Fig. 9

Average (lines) and standard deviations (shaded areas) of the mean spectra for (a) stomach and (b) esophagus.


3.3.

Classification of Tumor versus Non-Tumor

Results of the classification for stomach and esophagus spectral data are presented in Table 3. XGB was the best-performing machine learning algorithm for both stomach and esophagus, achieving an overall normal-versus-cancer diagnostic accuracy of 93.86 ± 0.66% for stomach and 96.22 ± 0.50% for esophagus. The sensitivity and specificity of the classifier were 91.31% and 95.13% for stomach and 94.60% and 97.28% for esophagus. For the XGB algorithm, a step-size shrinkage of 0.3 was used in the update phase to prevent overfitting, while the maximum tree depth was set to 6, as a higher value would make the model more complex and more likely to overfit. The XGB and LGBM algorithms are both ensemble tree methods that apply the principle of boosting weak learners using a gradient descent architecture. However, XGB improves upon the base gradient boosting framework through parallelized tree building, depth-first tree pruning, regularization to avoid overfitting, and built-in CV capability. Compared with the SVM model, the XGB algorithm generally showed better performance in terms of accuracy, sensitivity, and specificity. In addition, the XGB model showed much higher computation speed than all the other algorithms used in this study, due to its inherent parallel processing, requiring only 3.5 s over both training and validation phases.

Table 3

Performance metrics for the spectral data classification using XGB. Data presented as mean ± standard deviation. Overall accuracy calculated as proportion of correctly identified spectra over total number of spectra.

Tissue type | Classifier | Accuracy (%) | Sensitivity (%) | Specificity (%) | AUC (%)
Stomach | XGB | 93.86 ± 0.66 | 91.31 ± 1.59 | 95.13 ± 0.84 | 98.50 ± 0.28
Stomach | LGBM | 93.63 ± 0.72 | 91.00 ± 1.66 | 94.94 ± 0.79 | 98.40 ± 0.28
Stomach | MLP | 90.06 ± 0.96 | 86.27 ± 4.40 | 91.74 ± 1.41 | 96.09 ± 0.50
Stomach | SVM | 87.67 ± 0.81 | 83.47 ± 1.27 | 89.53 ± 1.05 | 94.23 ± 0.52
Esophagus | XGB | 96.22 ± 0.50 | 94.60 ± 0.92 | 97.28 ± 0.63 | 99.24 ± 0.19
Esophagus | LGBM | 96.26 ± 0.26 | 94.38 ± 0.98 | 97.47 ± 0.57 | 99.27 ± 0.18
Esophagus | MLP | 91.97 ± 2.90 | 85.18 ± 4.09 | 95.97 ± 2.90 | 96.74 ± 0.45
Esophagus | SVM | 88.35 ± 0.95 | 82.19 ± 4.30 | 92.34 ± 2.33 | 94.19 ± 0.53

Fig. 10

Illustration of real-time DRS probe tracking and classification at different frames during scanning of a gastric ex vivo tissue specimen. (a)–(c) Real-time visual correlation of scanned optical biopsy sites on tissue (green dots on each frame) using DRS probe. (d)–(f) Real-time tracking at each optical biopsy site coupled with binary classification of each site as either normal (100% green) or tumor tissue (100% pink) via a graduated color map. Tissue type highlighted on screen in real time (bottom left of each frame).


4.

Discussion

We have successfully developed a real-time tracking system for a DRS probe when used on stomach and esophageal tissue for tumor detection. Our tracking system showed a measured root mean square error of 1.18 and 1.05 mm for the x and y directions, respectively, and a maximum measured error of 1.76 mm. The color green achieved excellent performance when used as a marker for probe tracking. This tracking system allowed us to classify spectral data as tumor or non-tumor tissue in real time, with an overall diagnostic accuracy of 94% for stomach and 96% for the esophagus, highlighting the clinical value of our technique.

Our handheld DRS probe and tracking system was able to acquire 80 spectra per second while providing real-time diagnostic information through direct visualization of the areas on the specimen that were probed (Fig. 10). Probe tracking was performed in real time at 30 frames per second.

Different types of optical spectroscopy techniques (e.g., Raman spectroscopy, fluorescence, and DRS) are being investigated for sensing tissue types.27 Diffuse reflectance is well suited to real-time characterization of tissue properties and acquisition of information in a short space of time together with ease of use.22,28–30 The real-time tracking method we have developed in this study, and the technique for accurate classification of tumor tissue, can also be applied to other optical spectroscopy probes, such as rapid evaporative ionization mass spectrometry (REIMS) technology,31,32 fluorescence spectroscopy33 and Raman spectroscopy.34,35 In this way, the ergonomics, ease of use and validation of data collection for these optical techniques can be improved.

In our study, we were able to probe the ex vivo specimen within 15 min of tissue resection, with the real-time spectral data acquisition protocol lasting 10 min. This meant that the tissue was probed as close to its in vivo setting as possible and ensured minimal disruption to specimen handling and processing.

There are a number of limitations to this study. First, this is an ex vivo study in which specimens were placed on a white background and colors in the surrounding field could be controlled. Application in vivo may not allow this and can introduce differences in color and artefacts (e.g., active bleeding and other instruments), leading to difficulties in tracking the color marker. Second, the accuracy of the tracking method could be further increased by the use of modern deep learning models for probe tip detection and tracking.36–39 Third, the tracking system used in this study provided two-dimensional (2D) visual information only, although we are currently collecting three-dimensional (3D) stereoscopic data to aid clinical application in the future.40,41 We are also concurrently developing further classification statistics based on benign or regressed tumor to increase the clinical value of the system, since this study only focused on tumor and non-tumor tissue. Finally, in laparoscopic surgery, probe tracking will have to occur with a moving laparoscopic camera.42,43 Moreover, occlusions of the probe caused by other instruments and the tissue will provide additional optical tracking challenges.

5.

Conclusion

The proposed real-time DRS tracking method has been validated on ex vivo data with histological ground truth, and the accuracy derived demonstrates the strength and clinical value of the technique. The method allows real-time tracking and accurate classification with short data acquisition time to aid margin assessment in cancer resection surgery and has potential to be applied in routine surgical practice. The methods developed can provide an aid for tissue tracking and validation against other diagnostic methods, improving the process of evaluating new spectroscopy modalities.

Disclosures

AD is Chair of the Health Security initiative at Flagship Pioneering UK Ltd. The remaining authors declare no conflicts of interest.

Acknowledgments

This paper presents independent research funded by the National Institute for Health Research (NIHR) Imperial Biomedical Research Centre (BRC) and the Cancer Research UK (CRUK) Imperial Centre. We would like to thank Marta Jamroziak for all her incredible help, support, and advice with the histopathology validation for this study. We would also like to thank all the theatre staff and surgical team at Hammersmith Hospital for their support during data collection.

Code, Data, and Materials Availability

Raw data are available upon reasonable request.

References

1. M. Arnold et al., “Global burden of 5 major types of gastrointestinal cancer,” Gastroenterology 159(1), 335–349.e15 (2020). https://doi.org/10.1053/j.gastro.2020.02.068

2. A. Biondi, “R0 resection in the treatment of gastric cancer: room for improvement,” World J. Gastroenterol. 16(27), 3358–3370 (2010). https://doi.org/10.3748/wjg.v16.i27.3358

3. I. J. Adam et al., “Role of circumferential margin involvement in the local recurrence of rectal cancer,” Lancet 344(8924), 707–711 (1994). https://doi.org/10.1016/S0140-6736(94)92206-3

4. S. P. L. Dexter et al., “Circumferential resection margin involvement: an independent predictor of survival following surgery for oesophageal cancer,” Gut 48(5), 667–670 (2001). https://doi.org/10.1136/gut.48.5.667

5. J. Javidfar et al., “Impact of positive margins on survival in patients undergoing esophagogastrectomy for esophageal cancer,” Ann. Thorac. Surg. 101(3), 1060–1067 (2016). https://doi.org/10.1016/j.athoracsur.2015.09.005

6. L. D. Juez et al., “Influence of positive margins on tumor recurrence and overall survival after gastrectomy for gastric cancer,” ANZ J. Surg. 91(7–8), E465–E473 (2021). https://doi.org/10.1111/ans.16937

7. A. M. Zysk et al., “Intraoperative assessment of final margins with a handheld optical imaging probe during breast-conserving surgery may reduce the reoperation rate: results of a multicenter study,” Ann. Surg. Oncol. 22(10), 3356–3362 (2015). https://doi.org/10.1245/s10434-015-4665-2

8. R. M. Gomes et al., “Role of intraoperative frozen section for assessing distal resection margin after anterior resection,” Int. J. Colorectal Dis. 30(8), 1081–1089 (2015). https://doi.org/10.1007/s00384-015-2244-4

9. J. Spicer et al., “Diagnostic accuracy and utility of intraoperative microscopic margin analysis of gastric and esophageal adenocarcinoma,” Ann. Surg. Oncol. 21(8), 2580–2586 (2014). https://doi.org/10.1245/s10434-014-3669-7

10. R. F. Gandour-Edwards, P. J. Donald and D. A. Wiese, “Accuracy of intraoperative frozen section diagnosis in head and neck surgery: experience at a university medical center,” Head Neck 15(1), 33–38 (1993). https://doi.org/10.1002/hed.2880150108

11. M. S. Nogueira et al., “Evaluation of wavelength ranges and tissue depth probed by diffuse reflectance spectroscopy for colorectal cancer detection,” Sci. Rep. 11(1), 798 (2021). https://doi.org/10.1038/s41598-020-79517-2

12. S. Akter et al., “Medical applications of reflectance spectroscopy in the diffusive and sub-diffusive regimes,” J. Near Infrared Spectrosc. 26(6), 337–350 (2018). https://doi.org/10.1177/0967033518806637

13. E. J. M. Baltussen et al., “Diffuse reflectance spectroscopy as a tool for real-time tissue assessment during colorectal cancer surgery,” J. Biomed. Opt. 22(10), 106014 (2017). https://doi.org/10.1117/1.JBO.22.10.106014

14. A. Keller et al., “Diffuse reflectance spectroscopy of human liver tumor specimens – towards a tissue differentiating optical biopsy needle using light emitting diodes,” Biomed. Opt. Express 9(3), 1069–1081 (2018). https://doi.org/10.1364/BOE.9.001069

15. P. Mountney et al., “Optical biopsy mapping for minimally invasive cancer screening,” Lect. Notes Comput. Sci. 5761, 483–490 (2009). https://doi.org/10.1007/978-3-642-04268-3_60

16. S. Speidel et al., “Tracking of instruments in minimally invasive surgery for surgical skill analysis,” Lect. Notes Comput. Sci. 4091, 148–155 (2006). https://doi.org/10.1007/11812715_19

17. R. Elfring, M. de la Fuente and K. Radermacher, “Assessment of optical localizer accuracy for computer aided surgery systems,” Comput. Aided Surg. 15(1–3), 1–12 (2010). https://doi.org/10.3109/10929081003647239

18. D. Bouget et al., “Vision-based and marker-less surgical tool detection and tracking: a review of the literature,” Med. Image Anal. 35, 633–654 (2017). https://doi.org/10.1016/j.media.2016.09.003

19. M. P. S. F. Gomes et al., “Tool tracking for endoscopic surgery,” Trans. Inst. Meas. Control 25(4), 281–292 (2003). https://doi.org/10.1191/0142331203tm087oa

20. B. Huang et al., “Tracking and visualization of the sensing area for a tethered laparoscopic gamma probe,” Int. J. Comput. Assist. Radiol. Surg. 15(8), 1389–1397 (2020). https://doi.org/10.1007/s11548-020-02205-z

21. L. Bouarfa et al., “In-vivo real-time tracking of surgical instruments in endoscopic video,” Minim. Invasive Ther. Allied Technol. 21(3), 129–134 (2012). https://doi.org/10.3109/13645706.2011.580764

22. C. C. Horgan et al., “Image-guided Raman spectroscopy probe-tracking for tumor margin delineation,” J. Biomed. Opt. 26(3), 036002 (2021). https://doi.org/10.1117/1.JBO.26.3.036002

23. S. van Keulen et al., “The clinical application of fluorescence-guided surgery in head and neck cancer,” J. Nucl. Med. 60(6), 758–763 (2019). https://doi.org/10.2967/jnumed.118.222810

24. Z. Zhang, “Flexible camera calibration by viewing a plane from unknown orientations,” in Proc. Seventh IEEE Int. Conf. Comput. Vision (1999). https://doi.org/10.1109/ICCV.1999.791289

25. G. Bishop and G. Welch, “An introduction to the Kalman filter,” in Proc. SIGGRAPH, Course 41 (2001).

26. E. J. M. Baltussen et al., “Optimizing algorithm development for tissue classification in colorectal cancer based on diffuse reflectance spectra,” Biomed. Opt. Express 10(12), 6096–6113 (2019). https://doi.org/10.1364/BOE.10.006096

27. D. Evers et al., “Optical spectroscopy: current advances and future applications in cancer diagnostics and therapy,” Future Oncol. 8(3), 307–320 (2012). https://doi.org/10.2217/fon.12.15

28. E. R. St John et al., “Rapid evaporative ionisation mass spectrometry of electrosurgical vapours for the identification of breast pathology: towards an intelligent knife for breast cancer surgery,” Breast Cancer Res. 19(1), 59 (2017). https://doi.org/10.1186/s13058-017-0845-2

29. I. Pence and A. Mahadevan-Jansen, “Clinical instrumentation and applications of Raman spectroscopy,” Chem. Soc. Rev. 45(7), 1958–1979 (2016). https://doi.org/10.1039/C5CS00581G

30. M. S. Bergholt et al., “Fiber-optic Raman spectroscopy probes gastric carcinogenesis in vivo at endoscopy,” J. Biophotonics 6(1), 49–59 (2013). https://doi.org/10.1002/jbio.201200138

31. J. Balog et al., “Intraoperative tissue identification using rapid evaporative ionization mass spectrometry,” Sci. Transl. Med. 5(194), 194ra93 (2013). https://doi.org/10.1126/scitranslmed.3005623

32. J. Zhang et al., “Mass spectrometry technologies to advance care for cancer patients in clinical and intraoperative use,” Mass Spectrom. Rev. 40(5), 692–720 (2021). https://doi.org/10.1002/mas.21664

33. A. Pradhan, P. K. Pandey and P. Singh, “Overview of fluorescence spectroscopy and imaging for early cancer detection,” in Neurophotonics and Biomedical Spectroscopy, Elsevier (2019).

34. M. S. Bergholt et al., “Simultaneous fingerprint and high-wavenumber fiber-optic Raman spectroscopy enhances real-time in vivo diagnosis of adenomatous polyps during colonoscopy,” J. Biophotonics 9(4), 333–342 (2016). https://doi.org/10.1002/jbio.201400141

35. J. J. Wood et al., “Evaluation of a confocal Raman probe for pathological diagnosis during colonoscopy,” Colorectal Dis. 16(9), 732–738 (2014). https://doi.org/10.1111/codi.12664

36. X. Du et al., “Articulated multi-instrument 2-D pose estimation using fully convolutional networks,” IEEE Trans. Med. Imaging 37(5), 1276–1287 (2018). https://doi.org/10.1109/TMI.2017.2787672

37. R. Nachabé et al., “Diagnosis of breast cancer using diffuse optical spectroscopy from 500 to 1600 nm: comparison of classification methods,” J. Biomed. Opt. 16(8), 087010 (2011). https://doi.org/10.1117/1.3611010

38. Z.-L. Ni et al., “RASNet: segmentation for tracking surgical instruments in surgical videos using refined attention segmentation network,” in 41st Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 5735–5738 (2019).

39. C. I. Nwoye et al., “Weakly supervised convolutional LSTM approach for tool tracking in laparoscopic videos,” Int. J. Comput. Assist. Radiol. Surg. 14(6), 1059–1067 (2019). https://doi.org/10.1007/s11548-019-01958-6

40. Z. Zhao, “Real-time 3D visual tracking of laparoscopic instruments for robotized endoscope holder,” in Proc. 11th World Congr. Intell. Control and Autom. (2014). https://doi.org/10.1109/WCICA.2014.7053773

41. I. Laina et al., “Concurrent segmentation and localization for tracking of surgical instruments,” Lect. Notes Comput. Sci. 10434, 664–672 (2017). https://doi.org/10.1007/978-3-319-66185-8_75

42. A. M. Cano et al., “Laparoscopic tool tracking method for augmented reality surgical applications,” in Biomedical Simulation, Springer, Berlin, Heidelberg (2008).

43. Z. Zhao et al., “Real-time tracking of surgical instruments based on spatio-temporal context and deep learning,” Comput. Assist. Surg. 24(Suppl. 1), 20–29 (2019). https://doi.org/10.1080/24699322.2018.1560097

Biography

Ioannis Gkouzionis received his Diploma (Dipl.-Ing.) of Engineering in electrical and computer engineering (ECE) from the Technical University of Crete (TUC) in 2017 and his MSc degree in ECE from TUC in August 2019. He is currently pursuing his PhD in clinical medicine research at the Hamlyn Centre and the Department of Surgery and Cancer at Imperial College London. His research focuses on spectroscopy and machine learning for circumferential resection margin assessment in gastrointestinal cancer surgery, under the supervision of Professor Daniel S. Elson and Mr. Christopher Peters.

Scarlet Nazarian received her medical degree (MBBS) in 2014 from University College London (UCL) and her BSc degree in pharmacology from UCL in 2011. She became a member of the Royal College of Surgeons in 2017 and obtained a general surgery training number in London in 2018. She is currently pursuing her PhD in surgical innovation under the supervision of Lord Ara Darzi at Imperial College London and is collaborating with the Hamlyn Centre to develop intraoperative margin mapping techniques.

Michal Kawka is a penultimate-year medical student at Imperial College London. He received his iBSc degree in surgical design, technology and innovation in 2021. He has a special interest in surgical technologies and their clinical translation.

Ara Darzi is a codirector of the Institute of Global Health Innovation at Imperial College London and holds the Paul Hamlyn Chair of Surgery. He is a consultant surgeon at Imperial College NHS Trust and the Royal Marsden NHS Foundation Trust. He is a nonexecutive director of NHS England, chair of the accelerated access collaborative, and codirector of the NHS Digital Academy. He is also chair for the pre-emptive medicine and health security initiative at Flagship Pioneering UK plc.

Nisha Patel is a consultant gastroenterologist at Imperial College Healthcare NHS Trust. She is an interventional endoscopist with a special interest in technology, innovation, and research. Her clinical interests include colonoscopy, complex polypectomy, and third space endoscopy. She has major research interests in endoscopy development including imaging techniques and robotics within the Institute of Global Health Innovation.

Christopher J. Peters went to medical school in Leeds and, after completing basic surgical training, moved to Cambridge to carry out a PhD with Professor Rebecca Fitzgerald, during which he developed a four-gene signature to predict outcome in oesophageal adenocarcinoma. After moving to London to complete his higher surgical training, he was appointed as a clinical senior lecturer and consultant upper GI surgeon at Imperial College London, with an academic research programme built around understanding how new technologies can support better treatment decisions for cancer patients.

Daniel S. Elson is a professor of surgical imaging and biophotonics in the Hamlyn Centre for Robotic Surgery, Institute of Global Health Innovation and Department of Surgery and Cancer, Imperial College London. His research interests lie in the development and application of photonics technology to surgical imaging, including different forms of spectroscopy, fluorescence, multispectral and polarization-resolved endoscopy and the use of surgical vision techniques to improve system usability.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Ioannis Gkouzionis, Scarlet Nazarian, Michal Kawka, Ara W. Darzi, Nisha Patel, Christopher J. Peters, and Daniel S. Elson "Real-time tracking of a diffuse reflectance spectroscopy probe used to aid histological validation of margin assessment in upper gastrointestinal cancer resection surgery," Journal of Biomedical Optics 27(2), 025001 (1 February 2022). https://doi.org/10.1117/1.JBO.27.2.025001
Received: 16 September 2021; Accepted: 10 January 2022; Published: 1 February 2022
Keywords: Tissues, Tissue optics, Tumors, Cancer, Stomach, Optical tracking, Diffuse reflectance spectroscopy