Perspective review of optical imaging in welfare assessment in animal-based research
Carina B. Pereira, Janosch Kunczik, André Bleich, Christine Haeger, Fabian Kiessling, Thomas Thum, René Tolba, Ute Lindauer, Stefan Treue, Michael Czaplik
Abstract
To refine animal research, vital signs, activity, stress, and pain must be monitored. In chronic studies, some measures can be assessed using telemetry sensors. Although this methodology provides high-precision data, an initial surgery for device implantation is necessary, potentially leading to stress, wound infections, and restriction of motion. Recently, camera systems have been adapted for animal research. We give an overview of parameters that can be assessed using imaging in the visible, near-infrared, and thermal spectrum of light. The overview focuses on heart activity, respiration, oxygen saturation, and motion, as well as on wound analysis. For each parameter, we offer recommendations on the minimum technical requirements of appropriate systems, regions of interest, and light conditions, among others. In general, these systems perform well. For heart and respiratory rate, the errors were below 4 beats/min and 5 breaths/min, respectively. Furthermore, the systems are capable of tracking animals during different behavioral tasks. Finally, studies indicate that an inhomogeneous temperature distribution around wounds might be an indicator of (pending) infections. In sum, camera-based techniques have several applications in animal research. As vital parameters are currently only assessed in sedated animals, the next step should be the integration of these modalities into home-cage monitoring.

1. Introduction

Animal studies have made a great contribution to medical advancement and play a vital role in current scientific work, as they replicate key aspects of human physiological and pathophysiological processes.1,2 They allow a deeper understanding of various disease mechanisms as well as of genetic predispositions to certain diseases. Furthermore, they have been improving medication and treatments, allowing for a higher quality of life for people all over the world.3 In 2011, nearly 11.5 million animals were used for scientific purposes in the EU, of which 76.8% were rodents (including mice, rats, guinea pigs, and hamsters), whereas only 0.67% were pigs, 0.25% were sheep, and 0.05% were primates.4 The remaining 22.23% correspond to other animal species (including birds, reptiles, and amphibians).4

In the last decades, the use of animals in medical and scientific research has been an issue of public concern and debate.3 In 2010, the European legal framework for responsible and ethically acceptable animal research was updated and brought to the same standard across all EU member states. The European Parliament passed Directive 2010/63/EU, which stresses the 3R principles of animal research, as proposed by Russell and Burch:5,6 replacement, reduction, and refinement. First, animal research shall be replaced with other methods whenever possible. Second, the number of animals used for scientific purposes shall be reduced as much as possible. Third, the procedures shall be refined to avoid or minimize stress and to improve the well-being of the animals.

In order to comply with the principle of refinement, the animals must be kept under constant surveillance concerning their vital parameters [for instance, heart rate (HR), respiratory rate (RR), and oxygen saturation], mobility/activity, wounds, and emotional state.7 In chronic animal experiments, vital signs are commonly assessed by implanting a telemetry sensor. Although this methodology provides high-precision data, its use is associated with several critical drawbacks. First, an initial implantation surgery is required.8 According to Braga and Burmeister,9 animals require up to 5 to 7 days to recover their normal circadian rhythms. Second, the implanted device itself might cause distress and discomfort, particularly in small species.9 In addition, it can have significant adverse physiological effects; e.g., the increased volume in the abdominal viscera can compromise the movement of the diaphragm and, as a result, alter breathing patterns (depth and/or rhythm).9 Third, telemetry is a very expensive method; in addition to the initial acquisition costs, periodic expenses for telemeter refurbishment are high. Fourth, the battery life of the transmitters is usually warrantied for only six months of continuous use.10 In acute experiments with animals undergoing anesthesia, vital signs are assessed using electrocardiography (ECG) or photoplethysmography (PPG). Although accurate, these methods require wires, which often hinder the researcher.11 In contrast, other parameters such as locomotion capability or activity, emotional state, and pain are assessed subjectively.8 Finally, unwanted infections or surgical wound complications occur occasionally during animal trials. These are often detected late, since there are currently no technologies available to sensitively predict inflammation or infection.

Innovative monitoring modalities, which offer contactless and unobtrusive surveillance of laboratory animals, could refine and improve research procedures and help to reduce the number of research animals.7,12 In recent years, new monitoring applications based on camera systems [imaging in the visible, near-infrared (NIR), and thermal spectrum of light] have been developed for human medicine. Following these positive experiences, the techniques have recently been adapted for animal research. Infrared thermography (IRT) (long-wavelength infrared/thermal band of the electromagnetic spectrum, 7 to 14 μm) records heat emission without any harmful radiation or light source.13 It is used in various medical fields for measuring, e.g., HR and RR,13,14 thermal regulation,15 and circulation and perfusion dynamics.16 NIR and visible imaging systems have a higher spatial resolution than IRT. In addition to HR and RR, these technologies allow monitoring of peripheral oxygen saturation.

Based on the drawbacks of the current monitoring modalities and considering the new directives on the protection and welfare of animals, this review aims to give an overview of new camera-based methods (visible, NIR, and IRT imaging) for unobtrusive and remote monitoring of research animals. The paper focuses not only on vital parameters that can be estimated (such as heart activity, respiration, and oxygen saturation), but also on the assessment of motion activity and wound infection (elements analyzed frequently during animal experiments). For each parameter and species (mice, rats, pigs, and primates), we enumerate the imaging systems and algorithms that can be applied. Additionally, the paper indicates the regions of interest (ROIs) to be used as well as suitable camera properties and light conditions. Finally, for each application, previously published as well as preliminary results of our working groups are presented in order to demonstrate the potential of these techniques.

2. Animals, Materials, and Methods

2.1. Heart Activity

Heart activity can be monitored with visible spectrum (VIS), NIR, or IRT imaging. These techniques are based on different principles and thus require different methods to estimate HR. In VIS/NIR videos, this vital parameter can be extracted using one of the following approaches: motion-based (presented by Balakrishnan et al.17 and Li et al.18) or color-based (presented by Poh et al.19).

The former uses optical flow to measure the weak vibrations of the body caused by the beating heart. In short, feature points (within the ROI) are identified in the first frame of the video and tracked over time (applying, e.g., a template-based point tracker). Afterward, their trajectories are processed to extract the chest motion related to cardiac function. This step consists of filtering the trajectories of the feature points to remove noise outside the expected signal frequency range. Using blind source separation, based either on principal or on independent component analysis, the trajectories are decomposed into subsignals (principal components) to isolate the heart-related component. The principal component that most closely resembles the heart signal is selected; its main frequency in the frequency spectrum corresponds to the HR.18
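To make the processing chain concrete, a minimal Python sketch of such a motion-based HR estimator is given below. It assumes a hypothetical 60-Hz video file, an arbitrary chest/abdomen ROI, and an expected HR band of 2 to 8 Hz, and it simply keeps the first principal component instead of an explicit component-selection step; it is an illustration under these assumptions, not the implementation used in the cited studies.

```python
# A minimal sketch of the motion-based HR pipeline, assuming a hypothetical
# 60-Hz video file, an arbitrary chest/abdomen ROI, and an expected HR band of
# 2-8 Hz; the first principal component is used instead of an explicit
# component-selection step.
import cv2
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.decomposition import PCA

VIDEO, FPS = "rodent_chest.avi", 60.0       # hypothetical recording and frame rate
ROI = (100, 100, 200, 200)                  # x, y, width, height (illustrative)
BAND = (2.0, 8.0)                           # expected HR band in Hz (120-480 beats/min)

cap = cv2.VideoCapture(VIDEO)
_, first = cap.read()
prev = cv2.cvtColor(first, cv2.COLOR_BGR2GRAY)
mask = np.zeros_like(prev)
x, y, w, h = ROI
mask[y:y + h, x:x + w] = 255

# 1. Identify feature points inside the ROI in the first frame.
pts = cv2.goodFeaturesToTrack(prev, maxCorners=100, qualityLevel=0.01,
                              minDistance=5, mask=mask)
traj = [pts.reshape(-1, 2)]

# 2. Track the points over time with Lucas-Kanade optical flow.
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    nxt, _, _ = cv2.calcOpticalFlowPyrLK(prev, gray, pts, None)
    traj.append(nxt.reshape(-1, 2))
    pts, prev = nxt, gray
traj = np.stack(traj)                        # shape: (frames, points, 2)

# 3. Band-pass filter the vertical trajectories to the expected frequency range.
b, a = butter(3, [BAND[0] / (FPS / 2), BAND[1] / (FPS / 2)], btype="band")
signals = filtfilt(b, a, traj[:, :, 1], axis=0)

# 4. Decompose the filtered trajectories into principal components.
comp = PCA(n_components=min(5, signals.shape[1])).fit_transform(signals)[:, 0]

# 5. The dominant spectral peak within the band corresponds to the HR.
spec = np.abs(np.fft.rfft(comp))
freqs = np.fft.rfftfreq(len(comp), d=1.0 / FPS)
band = (freqs >= BAND[0]) & (freqs <= BAND[1])
print(f"Estimated HR: {60.0 * freqs[band][np.argmax(spec[band])]:.1f} beats/min")
```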

The latter (color-based) approach uses color or brightness changes of the skin due to blood flow to quantify heart activity (it is an extension of classical reflective PPG). With this method, an ROI enclosing the subject's skin must be identified in the first frame of the video and tracked over time for motion compensation. Thereafter, the ROI is separated into the three RGB channels, and each channel is spatially averaged to obtain the raw RGB traces (i.e., the mean color pixel intensity of the ROI). Blind source separation (independent component analysis) is also used in this approach in order to decompose the traces into three independent source signals. Finally, the component containing the heart signal is selected; the frequency with the highest power of the spectrum, within the operational frequency band, corresponds to the pulse frequency.19
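The color-based counterpart can be sketched in a similarly compact way. The example below assumes a motion-compensated skin ROI already cropped to each frame, a 30-Hz recording, and a 2 to 8 Hz operational band; the file name, band limits, and component-selection rule are illustrative assumptions rather than the published pipeline.

```python
# A minimal sketch of the color-based approach, assuming a motion-compensated
# skin ROI already cropped to each frame, a 30-Hz recording, and a 2-8 Hz
# operational band; file name, band, and selection rule are illustrative.
import cv2
import numpy as np
from sklearn.decomposition import FastICA

FPS, BAND = 30.0, (2.0, 8.0)
cap = cv2.VideoCapture("skin_roi.avi")       # hypothetical pre-cropped ROI video

# Spatially average each color channel to obtain the raw RGB traces.
traces = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    traces.append(frame.reshape(-1, 3).mean(axis=0))
traces = np.asarray(traces)                  # shape: (frames, 3), BGR order in OpenCV

# Normalize the traces and decompose them into three independent source signals.
traces = (traces - traces.mean(axis=0)) / traces.std(axis=0)
sources = FastICA(n_components=3, random_state=0).fit_transform(traces)

# Select the source with the strongest spectral peak inside the operational band;
# its peak frequency corresponds to the pulse frequency.
freqs = np.fft.rfftfreq(len(sources), d=1.0 / FPS)
band = (freqs >= BAND[0]) & (freqs <= BAND[1])
best_peak, hr_bpm = 0.0, float("nan")
for s in sources.T:
    spec = np.abs(np.fft.rfft(s))
    if spec[band].max() > best_peak:
        best_peak = spec[band].max()
        hr_bpm = 60.0 * freqs[band][np.argmax(spec[band])]
print(f"Estimated pulse rate: {hr_bpm:.1f} beats/min")
```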

Both approaches were primarily developed for human medicine and have recently been applied in animal research. Regarding thermal videos, only the motion-based approach can be applied. Table 1 presents a detailed overview of the imaging systems and their properties, applicable algorithms (whether color-based or motion-based), ROIs, and light conditions necessary to extract HR from four animal species: mice, rats, pigs, and primates. Some of the commercially available systems capable of detecting this vital parameter are listed in Table 2.

Table 1

Detailed overview of the imaging systems and properties to extract HR from four animal species: mice, rats, pigs, and primates. Algorithmic approach (color-based or motion-based), ROIs, frame rate, spatial and thermal resolution, as well as required light conditions are specified.

Visible, motion-based approach
Species: mice, rats, pigs, primates
ROI: chest/abdomen (all species); head (primates)
Illumination: visible light

Visible, color-based approach
Species: mice, rats, pigs, primates
ROI: shaved chest/abdomen (mice, rats); tail (mice, rats)20; snout, chest, abdomen (pigs)20; face (primates)22
Illumination: visible light

NIR, motion-based approach
Species: mice, rats, pigs, primates
ROI: chest/abdomen (all species); head (primates)
Illumination: active illumination with visible/IR light (power: e.g., 9 W)

NIR, color-based approach
Species: mice, rats, pigs, primates
ROI: shaved chest/abdomen (mice, rats); tail (mice, rats); snout, chest, abdomen (pigs)21; face (primates)22
Illumination: active illumination at 940 nm (power: e.g., 9 W)

IRT, motion-based approach
Species: pigs
ROI: chest (pigs)
Illumination: not required

All techniques
Frame rate: mice and rats, 60 Hz; pigs and primates, 30 Hz
Spatial resolution: visible/NIR, at least 1280×720 pixels; IRT, at least 1024×768 pixels
Temperature resolution (IRT only): 0.03 K

Table 2

Examples of imaging systems.

Visible
Mako G-223B (Allied Vision GmbH, Stadtroda, Germany)
IDS UI-3270LE/GV-5240/UI-5240 color (IDS GmbH, Obersulm, Germany)
Grasshopper 3 GS3-U3-51S5C-C (FLIR Systems, Wilsonville, Oregon)
Chameleon 3 U3-13Y3C (FLIR Systems, Wilsonville, Oregon)

NIR
Manta G-223B NIR (Allied Vision GmbH, Stadtroda, Germany)
IDS UI-3270LE/GV-5240/UI-5240 NIR (IDS GmbH, Obersulm, Germany)
Grasshopper 3 GS3-U3-41C6NIR-C (FLIR Systems, Wilsonville, Oregon)

IRT
FLIR T1020 (FLIR Systems, Wilsonville, Oregon)
VarioCAM head 800 (InfraTec GmbH, Dresden, Germany)

2.2. Respiration

Respiration can be assessed by measuring the displacement of the chest or abdomen using one of three imaging systems: VIS,23 NIR,18 or IRT.24 For this purpose, the motion-based approach described in the previous section can be used.

In thermal videos, the temperature variation around the nostrils during the respiratory cycle also offers insight into this physiological process. It results from the inhalation of cold air from the environment and the exhalation of warm air from the lungs.25 In this approach, the nose (ROI) is segmented in the first frame of the thermal video. Afterward, rough tracking over time is performed to compensate for motion. The filtered mean temperature of the ROI, which represents the respiratory waveform, is then computed. Finally, the respiratory frequency can be extracted either by selecting the frequency with the highest power of the spectrum or by using breath-to-breath estimation algorithms.
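As a concrete illustration, the following Python sketch derives the RR from a precomputed mean nose temperature trace. The file name, sampling rate, and frequency band are hypothetical assumptions; the sketch is not the algorithm of the cited studies.

```python
# A minimal sketch of nostril-based RR extraction, assuming the per-frame mean
# nose temperature has already been computed and saved as a 1-D array sampled at
# 30 Hz; file name, sampling rate, and band limits are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt

FPS, BAND = 30.0, (0.3, 3.0)                 # approximately 18-180 breaths/min
nose_temp = np.load("nose_mean_temp.npy")    # hypothetical mean-ROI temperature trace

# Band-pass filter the trace to obtain the respiratory waveform.
b, a = butter(3, [BAND[0] / (FPS / 2), BAND[1] / (FPS / 2)], btype="band")
resp = filtfilt(b, a, nose_temp - nose_temp.mean())

# RR from the dominant spectral peak; a breath-to-breath estimator could be used
# instead for instantaneous rates.
spec = np.abs(np.fft.rfft(resp))
freqs = np.fft.rfftfreq(len(resp), d=1.0 / FPS)
band = (freqs >= BAND[0]) & (freqs <= BAND[1])
print(f"Estimated RR: {60.0 * freqs[band][np.argmax(spec[band])]:.1f} breaths/min")
```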

In addition to the RR, these techniques allow us to monitor breathing patterns26 (such as tachypnea, Kussmaul breathing, Cheyne–Stokes respiration, and apnea phases) and tidal volume.23 As with heart activity, respiration assessment with imaging systems has primarily focused on human medicine, and in recent years these techniques have been translated to animal research.7,11,20,27 Table 3 presents a detailed overview of the imaging systems and their properties, relevant algorithms, ROIs, and light conditions necessary to extract the RR from four animal species: mice, rats, pigs, and primates. For this application, the cameras presented in Table 2 can be used. However, if the focus is only on respiration, cameras with lower frame rates and resolutions are an option as well, as the amplitude of the thorax movement is prominent and the respiratory frequency is lower than the HR. Two examples are the webcam Logitech C270HD (Logitech International S.A., Lausanne, Switzerland) for RGB and the FLIR T640 (FLIR Systems, Wilsonville, Oregon) for IRT.

Table 3

Detailed overview of the imaging systems and properties to extract RR from four animal species: mice, rats, pigs, and primates. Algorithmic approach (color-based or motion-based), ROIs, frame rate, spatial and thermal resolution, as well as required light conditions are specified.

Visible/NIR, motion-based approach
Species: mice, rats, pigs, primates
ROI: chest, abdomen, and back (all species)
Illumination: visible light

IRT, motion-based approach
Species: mice,11 rats,11 pigs, primates
ROI: chest,11 abdomen,11 and back (all species)
Illumination: not required

IRT, color-based approach
Species: mice,7 rats, pigs, primates
ROI: nostrils (all species)
Illumination: not required

All techniques
Frame rate: mice and rats, 30 Hz; pigs and primates, 10 Hz
Spatial resolution: visible/NIR, at least 1280×720 pixels; IRT, at least 640×480 pixels
Temperature resolution (IRT only): 0.03 K

2.3. Oxygen Saturation Monitoring

In addition to heart activity and respiration, NIR imaging also permits the assessment of arterial oxygen saturation (SpO2) in blood by adapting the principles of conventional pulse oximetry. As is well known, oxyhemoglobin (HbO2) and deoxyhemoglobin (HHb) have different absorption spectra: HHb absorbs more light in the red band of the electromagnetic spectrum (600 to 750 nm), whereas HbO2 absorbs more light in the infrared band (850 to 1000 nm). Therefore, conventional pulse oximeters contain two LEDs of different wavelengths (usually red and infrared). By capturing a PPG at two suitable wavelengths, the relative concentrations of HbO2 and HHb can be derived and, consequently, the SpO2. The latter is defined as the ratio of HbO2 to the total hemoglobin (sum of HbO2 and HHb), as given by

Eq. (1)

SpO2 = [HbO2] / ([HHb] + [HbO2]) × 100%,

where [HbO2] and [HHb] are the concentrations of oxyhemoglobin and deoxyhemoglobin, respectively.28

The measurement of SpO2 using NIR cameras is again an extension of the classical pulse oximetry method. In addition to the monochromatic/RGB camera, a trigger-controlled LED array with different wavelengths is required. Alternatively, two or more cameras with different optical filters can be used. The application of NIR imaging for SpO2 measurement was primarily tested on human subjects.29,30 In recent years, this technique has also been used in animal research to measure cortical blood oxygenation.31 For this application, publications report the use of monochromatic/RGB cameras (sensitive also in the infrared region of the electromagnetic spectrum) with a resolution of 1024×768 pixels or higher, a dynamic range of 12 bits, and a frame rate of 30 Hz (e.g., AD-080GE, Jai A/S, Denmark).28,31 The illumination unit should be composed of at least two LED arrays with two appropriate wavelengths (e.g., red and NIR); Shao et al.28 demonstrated that the combination of orange/infrared LEDs is the best choice considering the PPG signal-to-noise ratio. Finally, the measuring system should include a control unit to trigger the LEDs and the camera, since the red/orange and infrared LEDs must flash alternatingly.28,31 It is important to note that SpO2 assessment is based on the color-based approach (presented in Sec. 2.1). Thus, to extract the signal of interest, a portion of skin or tissue must be visible and illuminated.
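The underlying ratio-of-ratios computation can be sketched as follows. The PPG traces, the synthetic signal parameters, and in particular the calibration constants A and B are hypothetical placeholders (a real system would be calibrated against a reference oximeter); the sketch only illustrates the principle, not the systems used in the cited studies.

```python
# A minimal sketch of the ratio-of-ratios principle behind camera-based SpO2.
# The PPG traces below are synthetic, and the calibration constants A and B are
# hypothetical placeholders that would normally come from a calibration against
# a reference pulse oximeter.
import numpy as np

def ratio_of_ratios(ppg_red: np.ndarray, ppg_nir: np.ndarray) -> float:
    """AC/DC ratio at the red/orange wavelength divided by the AC/DC ratio at NIR."""
    return (ppg_red.std() / ppg_red.mean()) / (ppg_nir.std() / ppg_nir.mean())

def spo2_from_ror(ror: float, A: float = 110.0, B: float = 25.0) -> float:
    """Empirical linear calibration SpO2 = A - B * RoR (A and B are placeholders)."""
    return A - B * ror

# Usage with synthetic traces: a pulsatile component riding on a DC offset.
t = np.linspace(0.0, 10.0, 300)
ppg_red = 100.0 + 1.0 * np.sin(2 * np.pi * 2.0 * t)   # weaker relative pulsatility
ppg_nir = 80.0 + 1.2 * np.sin(2 * np.pi * 2.0 * t)
print(f"SpO2 estimate: {spo2_from_ror(ratio_of_ratios(ppg_red, ppg_nir)):.1f} %")
```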

2.4. Motion and Activity

In addition to vital parameters, camera-based techniques are frequently used for monitoring activity and exploratory behavior, mostly in rodents. Movement and location of animals, e.g., during open field tests, are major markers for anxiety, stress, and pain.

Motion activity can be remotely assessed using one of three imaging systems: VIS, NIR, or IRT. Although all systems are suited to this application, IRT has some additional advantages. First, this technique does not depend on any light source, whereas visible and NIR systems require artificial or natural light; this property is particularly important for circadian rhythm monitoring. Second, IRT identifies defecation and urination (which can also be affected by stress or fear) as hot (and later cold) spots. Third, animals can be easily separated from the background, which is significantly colder (animals appear as hot spots in IRT videos), and the color of the animals does not influence the detection/segmentation algorithm. Despite these advantages, IRT is still an expensive technology in comparison with visible and NIR imaging.

For motion analysis, two main steps may influence the performance of the system: the detection algorithm and the tracking algorithm. In fact, robust detection and tracking are the key factors of reliable software. From the tracked position, several parameters can be assessed, including position, velocity, acceleration, total distance travelled, and the time spent in different regions of the open field, among others. Furthermore, for visualization of the exploratory behavior, motion heat maps can be estimated.
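As an illustration of what can be derived from a tracked centroid, the sketch below computes distance, speed, an occupancy heat map, and zone dwell time. The frame rate, pixel-to-centimeter factor, file name, and zone boundaries are hypothetical assumptions.

```python
# A minimal sketch of trajectory-derived activity metrics, assuming the animal
# centroid has already been detected and tracked per frame; frame rate, the
# pixel-to-centimeter factor, file name, and zone limits are illustrative.
import numpy as np

FPS, CM_PER_PX = 30.0, 0.1
xy = np.load("centroid_track.npy")           # hypothetical array of shape (frames, 2)

step = np.diff(xy, axis=0) * CM_PER_PX       # per-frame displacement in cm
dist = np.linalg.norm(step, axis=1)
speed = dist * FPS                           # instantaneous speed in cm/s
accel = np.diff(speed) * FPS                 # acceleration in cm/s^2
print(f"Total distance: {dist.sum():.1f} cm, mean speed: {speed.mean():.2f} cm/s")

# Motion heat map: a 2-D occupancy histogram, i.e., cumulative time per spatial bin.
heatmap, _, _ = np.histogram2d(xy[:, 0], xy[:, 1], bins=50)
heatmap /= FPS                               # seconds spent in each bin

# Time spent in a region of the open field, e.g., a hypothetical center zone.
in_center = (xy[:, 0] > 100) & (xy[:, 0] < 400) & (xy[:, 1] > 100) & (xy[:, 1] < 400)
print(f"Time in center zone: {in_center.sum() / FPS:.1f} s")
```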

Regarding the measurement setup, the camera should be aligned perpendicularly (90 deg) to the cage or open field. When using an IRT camera, it should have a resolution of 1024×768 pixels or higher as well as a frame rate of at least 30 Hz (e.g., FLIR T1020, FLIR Systems, Wilsonville, Oregon; or VarioCAM head 800, InfraTec GmbH, Dresden, Germany). Visible or NIR cameras should have similar properties: the ANY-maze 60516 (Dublin, Ireland), the Logitech C270HD (Logitech International S.A., Lausanne, Switzerland), and the Sony DCR-SR60 (Sony Corporation, Tokyo, Japan) are examples used for these applications.32 Equally important is camera calibration to remove lens distortion.
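A standard way to remove lens distortion is a checkerboard calibration, sketched below with OpenCV; the board size, file names, and output paths are illustrative assumptions rather than a prescribed setup.

```python
# A minimal sketch of lens-distortion correction with a checkerboard calibration,
# assuming a set of calibration images "calib_*.png" of a 9x6 checkerboard; board
# size and file names are illustrative.
import glob
import cv2
import numpy as np

PATTERN = (9, 6)                                    # inner corners of the board
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2)

obj_pts, img_pts, shape = [], [], None
for path in glob.glob("calib_*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    shape = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, PATTERN)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Estimate intrinsics and distortion coefficients, then undistort arena frames.
_, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, shape, None, None)
frame = cv2.imread("open_field_frame.png")           # hypothetical arena image
undistorted = cv2.undistort(frame, K, dist)
cv2.imwrite("open_field_undistorted.png", undistorted)
```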

2.5. Wound Monitoring

The cardinal symptoms of inflammation are rubor (redness), calor (heat), dolor (pain), tumor (swelling), and functio laesa (disturbance of function).33 Unfortunately, these signs are hard to diagnose by visual inspection alone. Fur and thick skin complicate the diagnosis even further, because they hide rubor and tumor. Touching the wound area can be useful to detect calor, dolor, and tumor, but should be avoided because it can be painful for the animal and for hygienic reasons.

The use of various cameras with subsequent image processing can contribute to contactless monitoring of the cardinal symptoms (rubor, calor, and tumor) that are not sensitively assessable by existing (clinical) methods. IRT can display the temperature distribution around the wound area, which can be examined for homogeneity, size, and shape. A normally healing wound is expected to show a homogeneously warm area around the wound, with margins equidistant from the wound origin. The presence of localized hot areas or cold spots can imply problems in the healing process. Visual- and depth-sensing cameras can be used as a replacement for manual visual inspection. Using computer algorithms for wound analysis, the results can be made quantifiable and more objective. Such cameras can be used to detect rubor (visual cameras) and tumor (depth cameras), e.g., by comparing successive images taken of a wound.
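The homogeneity check described above can be sketched as a simple peri-wound statistic. The thermogram file, wound coordinates, and thresholds below are hypothetical and are not validated clinical cutoffs; the sketch only illustrates the idea of flagging hot and cold spots relative to the surrounding area.

```python
# A minimal sketch of a temperature-homogeneity check around a wound in a single
# thermogram, assuming the thermogram is available as a 2-D array in deg C and
# the wound center/radii are given manually; thresholds are illustrative only.
import numpy as np

thermo = np.load("wound_thermogram.npy")   # hypothetical (H, W) array in deg C
cy, cx, r_wound, r_margin = 120, 160, 15, 45

yy, xx = np.mgrid[0:thermo.shape[0], 0:thermo.shape[1]]
dist = np.hypot(yy - cy, xx - cx)
ring = (dist > r_wound) & (dist <= r_margin)   # peri-wound annulus

mean_t, std_t = thermo[ring].mean(), thermo[ring].std()
print(f"Peri-wound mean {mean_t:.1f} C, spread {std_t:.2f} C")

# Flag localized hot areas and cold spots relative to the ring mean.
hot = ring & (thermo > mean_t + 2 * std_t)
cold = ring & (thermo < mean_t - 2 * std_t)
print(f"Hot-spot pixels: {hot.sum()}, cold-spot pixels: {cold.sum()}")
```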

For wound monitoring, we use two classes of IRT cameras: high-end and entry-level devices. All camera systems contain additional integrated visual cameras. The use of depth-sensing camera technology is currently a subject of research. An exemplary entry-level IRT camera offers a 160×120 pixel image resolution, a horizontal field of view (HFOV) of 50 deg, and a minimum object distance of 30 cm (e.g., FLIR One Pro, FLIR Systems, Wilsonville, Oregon). Thus, a maximum resolution of about 6 pixel/cm can be achieved. Assuming that four pixels suffice to resolve a feature, structures of around 0.7 cm can be represented. High-end cameras deliver the most flexibility, as they feature high image resolutions, e.g., 464×384 pixels, and changeable lenses (e.g., FLIR E95, FLIR Systems, Wilsonville, Oregon). Resolutions of 22 pixels/cm and the detection of structures larger than 0.2 cm are easily possible.
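The quoted pixel-per-centimeter figure follows from a simple pinhole estimate, reproduced below for the entry-level values stated above; the function name and the four-pixel criterion are used only for illustration.

```python
# Worked example of the pixel-per-centimeter estimate quoted above, assuming a
# pinhole model with the stated 160-pixel horizontal resolution, 50-deg HFOV,
# and 30-cm minimum object distance; the 4-pixel criterion is the one named in
# the text.
import math

def px_per_cm(h_pixels: int, hfov_deg: float, distance_cm: float) -> float:
    width_cm = 2 * distance_cm * math.tan(math.radians(hfov_deg) / 2)
    return h_pixels / width_cm

res = px_per_cm(160, 50.0, 30.0)           # about 5.7 pixel/cm, i.e., roughly 6 pixel/cm
print(f"{res:.1f} pixel/cm -> smallest structure about {4 / res:.2f} cm (4-pixel rule)")
```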

3. Results

This section presents the principal findings from previous publications, as well as some of our own preliminary results from camera-based monitoring systems for laboratory animals.

3.1. Heart and Respiratory Rate

Tables 4 and 5 present the performance of the imaging systems for monitoring HR and RR in different animal species. Because the application of such techniques in animal research is very recent, these tables cannot be filled completely; studies are still lacking. Visible imaging was the first system analyzed. To date, its feasibility has only been tested in rodents (mice and rats) and nonhuman primates. For HR and RR, promising results were obtained: the root-mean-square errors (RMSEs) averaged <4 beats/min and 1.5 breaths/min, respectively. Whereas in rodents the vital signs were derived from the motion of the thorax/abdomen, in primates HR was estimated by measuring variations in blood volume and the resulting changes in skin color (color-based approach). Blanik et al.21 were, to the best of our knowledge, the only researchers studying the application of NIR imaging in animal research. In their study, the authors assessed HR and RR in the snout of pigs using the color-based approach and motion, respectively. Unfortunately, instead of a quantitative data analysis (comparison with a gold standard method), only a qualitative analysis was carried out. According to the publication, a good agreement between NIR and the gold standard was obtained. Finally, regarding IRT, there are more studies applying this technique in animals, especially for the assessment of RR.7,11,27 For the selected studies, the mean absolute error (εabs) was <3.5 beats/min and 5 breaths/min for HR and RR, respectively. Note that HR was only assessed in thermal videos of pigs. Figure 1(a) shows a Bland–Altman plot comparing the HR estimated with visual imaging (motion-based approach) and the gold standard (ECG). These results were presented by Kunczik et al.34 As illustrated, close agreement with ECG was obtained, with the limits of agreement between −3.2 and 1.7 beats/min. Figure 1(b) illustrates the regions of the rodent (rat) used to extract HR; regions with a signal-to-noise ratio smaller than a specific threshold were not used to compute this vital parameter.
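For readers unfamiliar with the metric, the bias and limits of agreement in such a Bland–Altman comparison are obtained as sketched below; the paired values are synthetic placeholders, not data from the cited studies.

```python
# A minimal sketch of how Bland-Altman bias and limits of agreement are computed,
# using synthetic paired HR estimates (video vs. ECG) as placeholders.
import numpy as np

hr_video = np.array([372.0, 380.5, 365.2, 390.1, 401.3])   # hypothetical values
hr_ecg   = np.array([373.1, 379.8, 366.0, 392.0, 400.2])

diff = hr_video - hr_ecg
bias = diff.mean()
loa_low = bias - 1.96 * diff.std(ddof=1)
loa_high = bias + 1.96 * diff.std(ddof=1)
print(f"Bias: {bias:.2f} beats/min, limits of agreement: "
      f"[{loa_low:.2f}, {loa_high:.2f}] beats/min")
```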

Table 4

Accuracy of three imaging systems for the assessment of HR.

Mice
Visible: N = 6; mean HR = 386±41 beats/min; mean RMSE = 1.4±0.9 beats/min; ROI/method: thorax/abdomen, motion; reference: Kunczik et al. (2019)34

Rats
Visible: N = 6; mean HR = 374±46 beats/min; RMSE = 1.3±0.6 beats/min; ROI/method: thorax/abdomen, motion; reference: Kunczik et al. (2019)34

Pigs
NIR: N unknown; HR unknown; εabs unknown; ROI/method: snout, color; observation: good agreement; reference: Blanik et al. (2019)21
IRT: N = 17; HR = 131±23 beats/min; mean εabs = 3.4±3.0 beats/min; ROI/method: thorax, motion; reference: Pereira et al. (2019)35

Nonhuman primates
Visible: N = 4; HR = 140 beats/min; RMSE = 3.8±1.5 beats/min; ROI/method: face, color; reference: Unakafov et al. (2018)22

Note: HR, mean heart rate; RMSE, mean root-mean-square error; εabs, mean absolute error. No studies were available for the remaining technique/species combinations.

Table 5

Accuracy of three imaging systems for the assessment of RR.

Mice
Visible: N = 6; RR = 98±35 breaths/min; RMSE = 1.4±1.0 breaths/min; ROI/method: thorax/abdomen, motion; reference: Kunczik et al. (2019)34
IRT: N = 5; RR = 113±30 breaths/min; εabs = 5.3±3.8 breaths/min; ROI/method: thorax/abdomen, motion; reference: Pereira et al. (2018)11

Rats
Visible: N = 6; RR = 50±7 breaths/min; RMSE = 0.3±0.2 breaths/min; ROI/method: thorax/abdomen, motion; reference: Kunczik et al. (2019)34
IRT: N = 5; RR = 52±7 breaths/min; RMSE = 0.4±0.1 breaths/min; εabs = 0.1±0.1 breaths/min; ROI/method: thorax/abdomen, motion; reference: Pereira et al. (2018)11

Pigs
NIR: N unknown; RR unknown; εabs unknown; ROI/method: snout, motion; observation: good agreement; reference: Blanik et al. (2019)21
IRT: N = 17; RR = 31±3 breaths/min; εabs = 0.3±0.5 breaths/min; ROI/method: thorax, motion; reference: Pereira et al. (2019)35

Note: RR, mean respiratory rate; RMSE, mean root-mean-square error; εabs, mean absolute error. No studies were available for the remaining technique/species combinations.

Fig. 1

(a) Bland–Altman plot comparing the HR estimated with visual imaging (motion-based approach) and the gold standard (ECG).34 The limits of agreement range from −3.2 to 1.7 beats/min, with a mean bias of −0.7 beats/min. (b) Regions of the rat used to extract HR.


Figure 2(a) shows, in turn, a Bland–Altman plot comparing the RR estimated with IRT and the gold standard (ECG-derived RR). In this example, good agreement between the two techniques is also visible, with the limits of agreement ranging from −0.8 to 0.6 breaths/min. Figure 2(b) illustrates the thermogram and the feature points used for motion tracking.

Fig. 2

(a) Bland–Altman plot comparing the RR estimated with IRT and the gold standard (ECG-derived RR).11 The limits of agreement range from −0.8 to 0.6 breaths/min, with a mean bias of −0.1 breaths/min. (b) Thermogram of a rat and the feature points used for motion tracking.


3.2. Oxygen Saturation

Van Gastel et al.30 conducted a study with human subjects to compare two camera-based SpO2 extraction algorithms: the classical ratio-of-ratios (ROR) algorithm used in pulse oximeters and their own method (APBV), which improves the raw signal quality through signal fusion and a priori knowledge.30 Four minutes of video were recorded with static and moving subjects. The contactless SpO2 was extracted from the subjects' heads. A classical pulse oximeter was used as the reference.

For the static experiment, the RMSE averaged 3% for the ROR algorithm and 2% for the APBV algorithm. Once free head movements were allowed, the RMSE of the ROR algorithm increased to 27%, whereas the APBV approach maintained an RMSE of 3%.30

In a rat model, changes in cerebral blood flow and oxygen consumption were examined during a 16-s electrical forepaw stimulation. By thinning the bone to translucency, a cranial window was prepared, enabling the recording of image sequences. Blood flow increased by about 15%, while the deoxyhemoglobin concentration dropped and the oxyhemoglobin concentration rose by the same amount. Not only summed levels but also spatially resolved blood flow and oxygenation maps could be obtained (Fig. 3).

Fig. 3

Superficial tissue imaging system (STIS) for combined RGB reflectometry and laser speckle contrast analysis (STIS provided by Kohl-Bareis and Steimers, Biomedical Optics Laboratory, RheinAhrCampus Remagen, Germany). Blood flow and blood oxygenation responses to a 16-s electrical forepaw stimulation were recorded through a cranial window preparation (bone thinned to translucency) in rats anesthetized with isoflurane plus fentanyl. Laser speckle contrast, HbO2, and HHb images were calculated online from raw images of the speckle pattern and of the red, green, and blue LED illumination using STIS and stored at 0.75 Hz (LabVIEW 2010, National Instruments Inc.) for later offline calculation of cortical blood flow from the speckle contrast.


3.3. Motion Activity

Several approaches for assessing motion activity have been published.11,32,36,37 In these studies, the tracking performance was evaluated qualitatively. They demonstrated the ability to accurately detect and track the animal during different behavioral tasks, including the open field test, the Y-maze task, and the Barnes maze, among others. In these approaches, animal detection was performed automatically. However, because automated identification is still a challenging task (especially when the animal and background have similar colors), some processing algorithms require the position of the animal to be defined manually by the user. As demonstrated in Fig. 4(a), this task is simplified with IRT, as the animal appears as a hot spot in the thermal video (the background is much colder than the animal). Figure 4(b) shows a motion heat map, which is usually used to demonstrate the cumulative time spent in different parts of the arena; yellow denotes more time and blue less or no time.

Fig. 4

(a) Thermogram of the open field arena. The animal is the hot spot in the image. (b) Motion heat map.11


3.4. Wound Monitoring

Figure 5 shows exemplary results for contactless wound monitoring. The two upper images show visual images of laparotomy wounds of two different male Wistar rats. Apart from the incision itself, no redness can be observed. The right wound image seems to show better healing, as the visible remainder of the incision is much smaller.

Fig. 5

(a) and (b) Images of two laparotomy wounds. The lower images present, in addition, an IRT overlay showing the temperature distribution around the wound area in “false colors.”


Below, the same images are presented with an IRT overlay, which shows the temperature distribution around the wound area in "false colors." Here, Fig. 5(b) shows strong temperature fluctuations around the wound, whereas Fig. 5(a) presents a much more homogeneous temperature distribution. Large cold spots can also be observed in Fig. 5(b), which suggest the secretion of exudate.

4. Discussion

The first steps toward the application of camera-based systems in medical research were taken in the late 1990s. However, the field expanded rapidly around 2010, when different approaches for the measurement of vital parameters (HR and RR) in humans were presented. More recently, these techniques have been adapted to animal research.

As presented in Tables 4 and 5, the performance of imaging systems has only been tested in four animal species (mice, rats, pigs, and nonhuman primates) to date. In general, a high agreement between video-based vital parameters and the references was obtained. Surprisingly, these techniques were even capable of quantifying cardiac activity in rodents, although these species have high HRs and exhibit very little precordial excursion of the chest during respiration and heart activity.

Of the three techniques, NIR was the least explored. In contrast to Blanik et al.,21 who reported a good agreement between the HR estimated with NIR and the gold standard (study on pigs), Unakafov et al.22 could not extract this parameter (HR) from NIR videos of nonhuman primates. However, the latter group did not use active illumination (ideally at 940 nm) during video acquisition. As a result, ambient illumination might have negatively affected the signal-to-noise ratio.

Except for one study, all video acquisitions were conducted on anesthetized animals. Although the nonhuman primates in the study of Unakafov et al.22 were awake, no reward was handed out during the video recording in order to minimize facial motion. The presence of motion in videos plays an instrumental role in the quality of the extracted signal and, consequently, in the estimation of vital parameters. To date, no approach described in the literature is capable of compensating for strong motion artifacts. Therefore, for long-term monitoring, new concepts or strategies must be developed to avoid measurement errors. For instance, the monitoring system could include a motion index: if the index decreases below a predefined threshold or equals zero, the animal is still and the vital signs can be assessed; if the motion index surpasses the threshold, the animal is moving and no valid estimation can be performed.
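A minimal sketch of such a motion-index gate is given below; the frame-differencing index, the threshold value, and the file name are hypothetical assumptions rather than a validated scheme.

```python
# A minimal sketch of motion-index gating, assuming frame differencing as the
# index and an arbitrary threshold; file name and threshold are illustrative.
import cv2
import numpy as np

THRESHOLD = 2.0                             # hypothetical mean absolute difference
cap = cv2.VideoCapture("home_cage.avi")     # hypothetical recording
_, prev = cap.read()
prev = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

still_frames, moving_frames = 0, 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    motion_index = np.mean(cv2.absdiff(gray, prev))
    prev = gray
    if motion_index < THRESHOLD:
        # Animal is (nearly) still: vital-sign estimation may proceed on this segment.
        still_frames += 1
    else:
        # Animal is moving: discard or flag this segment.
        moving_frames += 1
print(f"Still frames: {still_frames}, moving frames: {moving_frames}")
```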

In addition to movement, the chosen ROI also influences the outcome. Tables 1 and 3 show the best regions for each species. Although there are some differences between species, the thorax is the most common and reliable area for HR and RR measurement. For the color-based approach, the animal's skin must be visible so that the varying light absorption can be captured. Therefore, the animals usually need to be shaved.

The technical requirements of the cameras are further factors conditioning the performance of the camera-based systems. In addition to resolution and frame rate, the focal length of the lens must be taken into consideration, which strongly depends on the experimental setup. For assessment of vital parameters, the most important criterion to be met is that the ROI or the body of the animal should represent a large portion of the image. Thus the ideal focal length must result from the trade-off between the distance of the camera to the object and the area of the ROI (ideally, a long-focus lens for great distances between camera and subject; a normal/wide-angle lens for smaller distances). For applications such as open field tests, wide-angle lenses are recommended.

Finally, the strengths and weaknesses of the three optical channels (visible, NIR, and thermal) must be considered. When measuring respiration, all three techniques deliver similar outcomes, since the respiratory signal is very prominent; their relative strengths and weaknesses therefore come down to cost and the need for an external light source. For monitoring HR, however, additional factors matter. In contrast to visible and NIR imaging, IRT delivers images with less contrast and sharpness, since it relies on temperature contrasts to define objects. Thus, this technique is less effective in situations where sharp definition is needed but the surface temperatures of objects are very uniform. This is the case for body temperature, which is relatively uniform, making it difficult to find body landmarks or to track feature points, as the edges are not "sharp." Weak edges/contours might lead to inaccurate tracking of feature points and, consequently, reduce the signal-to-noise ratio or even compromise the whole measurement. This problem can be minimized by further improving detector sensitivity. RGB and NIR cameras offer, in turn, better contrast but are sensitive to illumination changes. In NIR, these variations can be significantly reduced, and the signal quality enhanced, by actively illuminating the subject with narrow-bandwidth NIR light (e.g., 940 nm) and by applying a matching bandpass filter on the camera. However, although the intensity changes due to blood flow are already very small compared with the absolute pixel intensities in the visible spectrum, in the NIR band the already weak pulsatile signal is attenuated further: hemoglobin absorption is significantly lower there than at the absorption peak in the green range of the light spectrum. On top of that, the sensitivity of camera sensors decreases in the NIR band, so the strength of the pulse signal is much lower than in the visible spectrum. It is important to note that illumination variance is not an issue for the motion-based approach, for which there should not be any difference between visible and NIR imaging. To sum up, when selecting the imaging system as well as the algorithmic approach (either motion- or color-based), all of these factors should be weighed.

Remote PPG (the color-based approach; contact-free monitoring of the optical absorption changes in the skin due to varying blood volume) enables the extension of the algorithms to a new feature: the assessment of oxygenation, or SpO2 (contact-free pulse oximetry). Van Gastel et al.30 reported outstanding results; in their study, the RMSE between NIR and the gold standard averaged 2%. Although this parameter has not been assessed in animal trials, a similar technique was used by Steimers et al.31 to quantify cortical HbO2 and HHb changes. Note that HbO2 and HHb are the parameters used to quantify SpO2 (after a calibration). Their system was capable of detecting HbO2 and HHb changes with high temporal and spatial resolution during different conditions (including ischemic stroke and hypercapnia). In addition, it proved to be highly sensitive.

Regarding motion activity, several commercial and noncommercial systems permit tracking the animal and analyzing its motion. In fact, to help other researchers, some groups (such as Patel et al.32) have published the source code of the tracking and analysis software alongside the journal article. These approaches were mainly developed for behavioral tasks such as open field tests, where the animal is always in the camera's field of view. Although the monitoring of animals in their home cage is widely desired (to capture circadian rhythms), it is still a challenging task. The motion of a single animal can be easily computed by comparing a background model against the current image or by establishing a reference image. In cages with several animals, the task is more complex when the aim is to analyze individual movements: multitracking approaches are not capable of distinguishing similar/identical objects, and occlusion is a major problem in image processing.

Close observation of wounds is necessary to detect inflammation at an early stage and to avoid systemic consequences such as sepsis. Laboratory parameters respond slowly and can be very unspecific, while the reliability of visual inspection depends on the observer's experience and the accessibility of the wounds. The combination of cameras sensitive in the infrared and visible spectrum is therefore highly promising for deriving new insights into wound pathophysiology. Moreover, image processing might lead to objective and comparable parameters and, thus, also to wound status scores. Although several morphological aspects have already been noted, clinical validation is pending and potential clinical benefits still have to be proven. However, a few studies already report the clinical application of IRT. In acute third-degree burn wounds, IRT has been capable of identifying unsalvageable tissue, similar to the widely used indocyanine green angiography.38 In another study, a temperature elevation of 4°C to 5°C between the wound region and healthy skin was noted, leading to a correct differentiation between patients suffering from infected wounds and control patients.39

5. Conclusions

To date, severity assessment typically requires animal handling or even the application of invasive methods (e.g., implantation of transponders). However, for various application areas, contactless alternatives are available to acquire information such as respiration rate, HR, oxygenation, motion activity, and potential wound infection. The requirements on an appropriate camera system highly depend on the type of application. Thermal imaging (IRT) has the advantage of being completely passive, meaning that no ambient light is required. In this context, it is noteworthy that only movements and thermal effects (cooling or warming) are detectable, whereas color changes, e.g., caused by perfusion, are not. Imaging systems operating in the visible spectrum (or NIR) are relatively cheap and permit the analysis of movements as well as color changes. However, they require ambient light or even active illumination to improve the signal-to-noise ratio (especially when NIR is used).

Disclosures

M. C. is a co-founder and CEO of Docs in Clouds GmbH; J. K. and C. B. P. are employed by Docs in Clouds GmbH, providing applications for camera-based vital detection and telemedicine. T. T. is a shareholder of Cardior Pharmaceuticals GmbH.

Acknowledgments

This study was supported by the German Research Foundation (DFG; research group FOR 2591, Project Nos. 321137804, CZ 215/3-1, BL953/10-1, TH903/22-1, TO 542/5-1, KI1072/20-1, LI 588/5-1, and TR 447/5-1).

References

1. V. Baumans, "Use of animals in experimental research: an ethical dilemma?," Gene Ther., 11(Suppl 1), S64–S66 (2004). https://doi.org/10.1038/sj.gt.3302371
2. Institute of Medicine, Science, Medicine, and Animals, The National Academies Press, Washington, DC (1991).
3. S. Festing and R. Wilkinson, "The ethics of animal research. Talking point on the use of animals in scientific research," EMBO Rep., 8, 526–530 (2007). https://doi.org/10.1038/sj.embor.7400993
4. European Commission, "Seventh report on the statistics on the number of animals used for experimental and other scientific purposes in the member states of the European Union," (2013).
5. P. Roelfsema and S. Treue, "Basic neuroscience research with nonhuman primates: a small but indispensable component of biomedical research," Neuron, 82, 1200–1204 (2014). https://doi.org/10.1016/j.neuron.2014.06.003
6. W. M. S. Russell and R. L. Burch, The Principles of Humane Experimental Technique, Methuen, London (1959).
7. K. Mutlu et al., "IR thermography-based monitoring of respiration phase without image segmentation," J. Neurosci. Methods, 301, 1–8 (2018). https://doi.org/10.1016/j.jneumeth.2018.02.017
8. J. E. Niemeyer, "Telemetry for small animal physiology," Lab Anim., 45, 255–257 (2016). https://doi.org/10.1038/laban.1048
9. V. A. Braga and M. A. Burmeister, "Applications of telemetry in small laboratory animals for studying cardiovascular diseases," in Modern Telemetry, 183–196, 1st ed., InTech, Rijeka, Croatia (2011).
10. R. Hoyt et al., "Mouse physiology," in Normative Biology, Husbandry, and Models, 2nd ed., Academic Press, Oxford, UK (2007).
11. C. Barbosa Pereira et al., "Remote welfare monitoring of rodents using thermal imaging," Sensors, 18, 3653 (2018). https://doi.org/10.3390/s18113653
12. C. González-Sánchez et al., "Capacitive sensing for non-invasive breathing and heart monitoring in non-restrained, non-sedated laboratory mice," Sensors (Basel), 16(7), 1052 (2016). https://doi.org/10.3390/s16071052
13. C. B. Pereira et al., "Remote monitoring of breathing dynamics using infrared thermography," Biomed. Opt. Express, 6, 4378–4394 (2015). https://doi.org/10.1364/BOE.6.004378
14. C. Barbosa Pereira et al., "Monitoring of cardiorespiratory signals using thermal imaging: a pilot study on healthy human subjects," Sensors, 18, 1541 (2018). https://doi.org/10.3390/s18051541
15. R. B. Knobel, B. D. Guenther and H. E. Rice, "Thermoregulation and thermography in neonatal physiology and disease," Biol. Res. Nurs., 13, 274–282 (2011). https://doi.org/10.1177/1099800411403467
16. C. B. Pereira et al., "Contact-free monitoring of circulation and perfusion dynamics based on the analysis of thermal imagery," Biomed. Opt. Express, 5, 1075–1089 (2014). https://doi.org/10.1364/BOE.5.001075
17. G. Balakrishnan, F. Durand and J. Guttag, "Detecting pulse from head motions in video," in IEEE Conf. Comput. Vision and Pattern Recognit., 3430–3437 (2013).
18. M. H. Li, A. Yadollahi and B. Taati, "Noncontact vision-based cardiopulmonary monitoring in different sleeping positions," IEEE J. Biomed. Health Inf., 21, 1367–1375 (2017). https://doi.org/10.1109/JBHI.2016.2567298
19. M.-Z. Poh, D. J. McDuff and R. W. Picard, "Non-contact, automated cardiac pulse measurements using video imaging and blind source separation," Opt. Express, 18, 10762–10774 (2010). https://doi.org/10.1364/OE.18.010762
20. F. Zhao et al., "Remote measurements of heart and respiration rates for telemedicine," PLoS One, 8(10), e71384 (2013). https://doi.org/10.1371/journal.pone.0071384
21. N. Blanik et al., "Remote photopletysmographic imaging of dermal perfusion in a porcine animal model," in 15th Int. Conf. Biomed. Eng., IFMBE Proc., 92–95 (2014).
22. A. M. Unakafov et al., "Using imaging photoplethysmography for heart rate estimation in non-human primates," PLoS One, 13(8), e0202581 (2018). https://doi.org/10.1371/journal.pone.0202581
23. B. A. Reyes et al., "Tidal volume and instantaneous respiration rate estimation using a volumetric surrogate signal acquired via a smartphone camera," IEEE J. Biomed. Health Inf., 21, 764–777 (2017). https://doi.org/10.1109/JBHI.2016.2532876
24. C. Pereira et al., "Noncontact monitoring of respiratory rate in newborn infants using thermal imaging," IEEE Trans. Biomed. Eng., 66(4), 1105–1114 (2019). https://doi.org/10.1109/TBME.2018.2866878
25. J. Fei and I. Pavlidis, "Thermistor at a distance: unobtrusive measurement of breathing," IEEE Trans. Biomed. Eng., 57, 988–998 (2010). https://doi.org/10.1109/TBME.2009.2032415
26. C. Barbosa Pereira et al., "Estimation of breathing rate in thermal imaging videos: a pilot study on healthy human subjects," J. Clin. Monit. Comput., 31, 1241–1254 (2017). https://doi.org/10.1007/s10877-016-9949-y
27. B. G. Vainer, "A novel high-resolution method for the respiration rate and breathing waveforms remote monitoring," Ann. Biomed. Eng., 46, 960–971 (2018). https://doi.org/10.1007/s10439-018-2018-6
28. D. Shao et al., "Noncontact monitoring of blood oxygen saturation using camera and dual-wavelength imaging system," IEEE Trans. Biomed. Eng., 63, 1091–1098 (2016). https://doi.org/10.1109/TBME.2015.2481896
29. M. Kumar, A. Veeraraghavan and A. Sabharwal, "DistancePPG: robust non-contact vital signs monitoring using a camera," Biomed. Opt. Express, 6, 1565–1588 (2015). https://doi.org/10.1364/BOE.6.001565
30. M. van Gastel, S. Stuijk and G. de Haan, "New principle for measuring arterial blood oxygenation, enabling motion-robust remote monitoring," Sci. Rep., 6, 38609 (2016). https://doi.org/10.1038/srep38609
31. A. Steimers et al., "Simultaneous imaging of cortical blood flow and haemoglobin concentration with LASCA and RGB reflectometry," Adv. Exp. Med. Biol., 789, 427–433 (2013). https://doi.org/10.1007/978-1-4614-7411-1
32. T. P. Patel et al., "An open-source toolbox for automated phenotyping of mice in behavioral tasks," Front. Behav. Neurosci., 8, 349 (2014). https://doi.org/10.3389/fnbeh.2014.00349
33. M. O. Freire and T. E. Van Dyke, "Natural resolution of inflammation," Periodontology 2000, 63, 149–164 (2013). https://doi.org/10.1111/prd.2013.63.issue-1
34. J. Kunczik et al., "Remote vitals monitoring using visual imaging," Biomed. Opt. Express (2019).
35. C. Barbosa Pereira et al., "Contactless monitoring of heart and respiratory rate in pigs using infrared thermography," PLoS One.
36. A. Rodriguez et al., "ToxTrac: a fast and robust software for tracking organisms," Methods Ecol. Evol., 9(3), 460–464 (2018). https://doi.org/10.1111/mee3.2018.9.issue-3
37. M. A. Nashaat et al., "Pixying behavior: a versatile real-time and post hoc automated optical tracking method for freely moving and head fixed animals," eNeuro, 4, 1–13 (2017). https://doi.org/10.1523/ENEURO.0245-16.2017
38. E. Y. Xue et al., "Use of FLIR ONE smartphone thermography in burn wound assessment," Ann. Plast. Surg., 80, S236–S238 (2018). https://doi.org/10.1097/SAP.0000000000001363
39. A. Chanmugam et al., "Relative temperature maximum in wound infection and inflammation as compared with a control subject using long-wave infrared thermography," Adv. Skin Wound Care, 30, 406–414 (2017). https://doi.org/10.1097/01.ASW.0000522161.13573.62

Biography

Carina Barbosa Pereira received her MSc degree in biomedical engineering from the University of Minho (Braga, Portugal) in 2011, and her Dr.-Ing. degree from the RWTH Aachen University (Aachen, Germany) in 2018. In the same year, she was appointed senior scientist at the Department of Anesthesiology of the RWTH Aachen University. Her research interests include infrared thermography, imaging and biosignal processing as well as feedback control in medicine.

Janosch Kunczik is an engineering scientist at the University Hospital of RWTH Aachen. He received his MSc degree in electrical engineering, focusing on system technology and automation, from RWTH Aachen University in 2017. After gaining in-depth experience in robust control for medical applications during his graduate studies, he is now focusing on contactless biomedical sensing techniques.

André Bleich heads the Institute for Laboratory Animal Science and Central Animal Facility, Hannover Medical School. Besides, he also represents the animal welfare commissioner at his university. His major research interests include animal welfare, microbiome and gnotobiology, and animal models of inflammatory bowel disease, as well as infection models.

Christine Haeger studied biology at the University of Bremen, Germany (2000–2005), and completed a doctoral thesis on lung inflammation and immune responses at the Hannover Medical School (MHH), Germany, obtaining a Dr. rer. nat. in 2008, followed by a postdoc focusing on vascular inflammation and regeneration, MHH (2009–2012). Since 2013, she has been a postdoc in the Institute for Laboratory Animal Science, MHH, focusing on murine models for inflammatory bowel disease. Since 2017, she has served as principal investigator within the DFG-funded research unit 2591: severity assessment in animal-based research.

Fabian Kiessling studied medicine in Heidelberg. Until 2008, he worked in the Departments of Radiology and Medical Physics in Radiology at the German Cancer Research Center. In parallel, he did his clinical training at Heidelberg University and received the board certification as radiologist. Since 2008 he has led the Institute for Experimental Molecular Imaging at RWTH Aachen University. He has authored over 300 publications and book chapters, edited three books, and received multiple research awards.

Thomas Thum studied medicine in Hannover, with state examination in 2001 and board examination for internal medicine/cardiology in 2009/2010. He received his PhD at the Imperial College, London (2008). He is a full professor and director of the Institute of Molecular and Translational Therapeutic Strategies at the Hannover Medical School, and visiting professor at the Imperial College. He has coauthored over 300 publications and is a distinguished reviewer, board member, patent holder, and founder.

René Tolba studied medicine in Bonn (Germany) and Pittsburgh (USA). In 2002, he received his PhD from the University of Bonn. From 1999 to 2007, he worked in the Department of Surgery at the University Hospital of Bonn. Nowadays, he holds a professorship at the RWTH Aachen University and is director of the Institute for Laboratory Animal Science. His research focuses on visceral organ transplantation and preservation as well as animal models for various disciplines.

Ute Lindauer completed the state examination in veterinary medicine in 1988 and obtained her doctorate in 1991 from Ludwig-Maximilians-University in Munich. She has worked as a postdoctoral fellow in Berlin in the Experimental Neurology Lab at University Hospital Charité—Universitätsmedizin Berlin in Ulrich Dirnagl’s group, where she finished her professional dissertation (habilitation) in experimental neurology in 2001. She worked as an associate professor at the Technical University in Munich, from 2009 to 2014, and is now an associate professor of translational neurosurgery and neurobiology at the Medical Faculty of RWTH Aachen University, Germany. Her research focuses on mechanisms of cerebrovascular regulation in health and disease.

Stefan Treue holds a professorship for cognitive neuroscience at the University of Goettingen, Germany, and is director of the German Primate Center. He received his PhD in systems neuroscience in 1992 from MIT. His research interest is focused on the role of attention in cortical processing of visual information in primates (www.dpz.eu/cnl). Additionally, he is interested in advancing the well-being of primates in research and is engaged in public outreach about animal research.

Michael Czaplik is working for the Department of Anesthesiology, University Hospital RWTH Aachen as senior consultant in anesthesiology. While heading the interdisciplinary section “Medical Technology,” he has been leading diverse research projects combining medical and technical expertise since 2014. His main research focus includes camera-based monitoring modalities in terms of anesthesiology, as well as interoperability and telemedicine. Besides his academic affiliation, he is also cofounder and CEO of Docs in Clouds TeleCare GmbH.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 Unported License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Carina B. Pereira, Janosch Kunczik, André Bleich, Christine Haeger, Fabian Kiessling, Thomas Thum, René Tolba, Ute Lindauer, Stefan Treue, and Michael Czaplik "Perspective review of optical imaging in welfare assessment in animal-based research," Journal of Biomedical Optics 24(7), 070601 (8 July 2019). https://doi.org/10.1117/1.JBO.24.7.070601
Received: 27 March 2019; Accepted: 30 May 2019; Published: 8 July 2019
KEYWORDS: Cameras, Imaging systems, Near infrared, Heart, Optical imaging, Forward looking infrared, Visible radiation
