Near infrared spectroscopy (NIRS) has been used extensively in recent years as a noninvasive tool for investigating cerebral hemodynamics and oxygenation.1, 2, 3, 4 The technique exploits the different absorption spectra of oxy-hemoglobin (HbO2), deoxy-hemoglobin (HHb), and cytochrome oxidase (CytOx) in the near infrared region to measure chromophore concentration levels in cerebral tissue. By making simultaneous NIRS measurements at multiple brain sites, optical topography (OT) provides a spatial map of the hemoglobin concentration changes from specific regions of the cerebral cortex. Over the last decade or so, many studies have been published describing the use of the OT technique to map functional brain activation.5, 6 In comparison with fMRI, since the optodes are attached to the head, OT offers the possibility of studying the neuronal response in a wider range of experimental situations and in a wider range of subjects, especially those for whom other monitoring options are limited, e.g., infants and elderly adults.7, 8 In addition, the instrumentation does not interfere with other imaging modalities, so OT, with its superior temporal resolution, can be recorded simultaneously alongside these other techniques.9, 10
However, interpretation of the physiological signals remains a complex task due to the intrinsic physiological noise and systemic interference signals coupled into the functional response. Inference about these observed responses requires a rigorous statistical approach to the analysis of the spatially extended data. While there are various ways of analyzing the optical signals, largely dependent upon the degree of noise in the data and the form in which the diffuse light is measured (i.e., the mode of OT measurement), few methods are available that can simultaneously test for the temporal and spatial distribution of the hemodynamic response attributable to functional activation of the brain. The classical approach to the analysis of functional OT data is to employ a Student's t-test to compare two different states of brain activity (e.g., "rest" versus "task"). The rest period is usually defined as a fixed time period before the stimulus onset. Due to the unique hemodynamic behavior (i.e., different areas under the hemodynamic response curve) of individuals, it can be difficult to derive a common task period for statistical group testing. In such cases, it has to be assumed that the maximum activation period has been accounted for in the conventional block t-test approach, and any remaining temporal information in the hemodynamic response is ignored. While a simplistic approach of this kind helps to provide a quick assessment of the hemodynamic response to the task, it often produces an underestimate of significance because the spatial coherence of functional OT data is not taken into consideration. Clearly, analysis methods that depend upon the comparison of rest versus task are highly susceptible to how these time periods are defined.
In trying to decorrelate the physiological noise (cardiac, respiratory, and vasomotion-related fluctuations) from the evoked hemodynamic response, various groups have looked at the possibility of applying blind source separation, in the form of principal component analysis11 and independent component analysis12, 13 methods, to the time-series analysis of functional data. However, these methods require various assumptions of orthogonality in order to separate out the signal of interest from the physiological noise, or they require prior constraints to be applied in order to derive the time course of the hemodynamic response.14 While such approaches offer the possibility of reducing the effects of physiological noise to a certain degree, they require an estimation of the eigenstructure based on prior information about the degree of noise, or visual inspection with step-by-step removal of eigenvectors, to avoid reducing the magnitude of the task-related response. In addition, these methods require a clear distinction between the evoked response and the interference. Barbour et al. looked at the possibility of separating out the signal of interest in space given prior information about its distinctive frequency response in time.15 Previous spectroscopic studies have successfully exploited the advantages of using the general linear model (GLM) to analyze the hemodynamic response in rats.16, 17 More recently, GLM analysis has been applied to OT data of functional activation; however, a simple box-car model was used as the temporal basis function for changes in HbO2 and HHb, and no attempt was made to examine the spatial coherence of the OT data.18
As the changes in cerebral hemodynamics during functional activation have become well-characterized,19, 20 it is possible to fit the signal to a hemodynamic response function (HRF) and compute statistics based on the variance of the fitted models. There are clearly a number of advantages of applying a GLM approach to the analysis of OT data over a direct comparison between chromophore changes during activation. The comparison of the time course with the modeled HRF negates the need for user-defined rest and task periods. The fitted model does not depend on the absolute magnitude of the changes, which for OT data may be susceptible to varying optical path length arising from partial volume effects21 of light attenuation in multilayered structures such as the adult head. Conventional statistical t-test methods treat each optical channel as a spatially independent measurement. Given that brain activity is spatially correlated, this reduces the sensitivity of the tests and, without correction for the multiple comparisons involved, increases the probability of false positives. It is important to quantify these spatial correlations by estimating the smoothness of the statistical process, because global systemic changes affect the statistical results. In this paper, we describe an approach that has been used in the analysis of fMRI, PET/SPECT, and EEG/MEG data and show how it can be applied to the analysis of functional OT data. Statistical parametric mapping (SPM)22 refers to the construction and assessment of spatially extended statistical processes used to test hypotheses about functional neuroimaging data. Since the SPM approach offers the flexibility to model varying hemodynamic changes, it is well-suited to the analysis of OT data.
While it is common practice for biomedical optics research groups and companies to develop software for specific systems,23, 24 there are a limited number of packages that allow analysis of the spatially resolved data collected from a range of different instruments with varying measurement configuration. The development of such software should be based upon building programs using robust and validated components that encapsulate the basic algorithms and provide an interface that the user feels comfortable with. Ideally, software of this kind should be released as open source, thus allowing it to be modified to suit individuals’ needs. Since there are now a number of different OT systems available, both commercially and in research laboratories, it was felt necessary to develop a software package that provides a more widely applicable and flexible approach to the analysis of multichannel NIRS data as well as incorporating a robust statistical approach that takes into consideration both the temporal and spatial correlations of the OT data.
This paper describes the development of an analytical software package, functional optical signal analysis (fOSA), which, as well as providing a range of processing tools, incorporates the SPM method for the analysis of OT data. It is designed to serve as a platform for the user to perform real-time analysis of OT data from different optical imaging systems using algorithms that incorporate both standard MATLAB libraries (The Math Works, Inc.) as well as those developed by the user. It is an open-source program and together with its documentation is available online.25 The software offers a range of processing procedures and statistical analysis methods. Specific features of this software include the following:
• It enables direct application of the SPM package for the analysis of spatially correlated data to functional activation data from optical systems.
• It analyzes data from a number of different types of optical imaging systems with varying source detector geometries.
• It displays and exports the preliminary results at any stage, thereby enabling its use as a “first-pass” assessment during the performance of an experimental study, allowing the user to determine the suitability of the chosen protocols.
• It provides a flexible processing sequence and allows additional plug-in options for user-defined processing procedures.
In this paper, we first describe the flexible structure of the program and the various processing procedures available in fOSA. We then compare the two statistical analysis methods currently incorporated in fOSA. While the SPM package is made up of several different components, this paper focuses on the methodology and considerations of applying SPM to optical data, as well as providing a description of the processes necessary to analyze OT data sequences. In functional studies, it is necessary to be able to compare, with statistical reliability, the signal change across brain regions, across time periods, and among experimental conditions. We describe the application of the software using a synthetic data set and published data from a finger-tapping motor task.26
Each of the fOSA features will be described in the following sections. In summary, deriving functional activation-related hemodynamic changes from optical data involves a number of procedures that generally include the conversion of optical attenuation changes into chromophore concentration changes, filtering and detrending any observable noise with a known frequency spectrum, and testing the significance of differences between activation states using standard statistical tests.26, 27, 28 Since these operations are linear, fOSA offers flexibility in the order in which the procedures are performed. While it allows other programs to be implemented within the fOSA environment, the processed results can also be exported in a format accessible to most software, allowing the data to be analyzed separately. The fOSA software was initially developed for the analysis of optical data from the continuous-wave (cw) Hitachi systems (ETG-100 and ETG-4000), and hence the presentation of the results in this paper is specific to the Hitachi OT configuration. The software has however also been used for the analysis of optical data from other cw OT systems, as discussed in Sec. 3.3.
Optical Signal Conversion
The first preprocessing stage involves the conversion of the measured light attenuation into the relevant hemoglobin concentration levels. While it is possible to perform the conversion at a later stage,23, 26, 27 fOSA requires the conversion of the absolute light intensities from the cw measurement at any two distinct wavelengths into HbO2 and HHb concentrations before proceeding to other signal processing. This enables the user to inspect the hemoglobin data as a "first-pass" check on signal size, optode positioning and contact stability, etc. The conversion algorithm used in fOSA follows the modified Beer-Lambert law,29 which considers the light scattering between each light source-detector pair to be homogeneous both spatially and temporally throughout the study period. While the detailed algorithms for the conversion can be found elsewhere,21, 27 in summary the derivation of the HbO2 and HHb concentrations (in micromoles of hemoglobin per liter of tissue) for each time sample can be simplified in matrix form if measurements are made at a minimum of two wavelengths:

\[ \begin{bmatrix} \Delta[\mathrm{HbO_2}] \\ \Delta[\mathrm{HHb}] \end{bmatrix} = \frac{1}{d \cdot \mathrm{DPF}} \begin{bmatrix} \varepsilon_{\mathrm{HbO_2}}(\lambda_1) & \varepsilon_{\mathrm{HHb}}(\lambda_1) \\ \varepsilon_{\mathrm{HbO_2}}(\lambda_2) & \varepsilon_{\mathrm{HHb}}(\lambda_2) \end{bmatrix}^{-1} \begin{bmatrix} \Delta A(\lambda_1) \\ \Delta A(\lambda_2) \end{bmatrix}, \]

where \( \varepsilon \) is the specific extinction (absorption) coefficient and \( \Delta A \) the measured attenuation change. The total hemoglobin concentration, HbT, is the summation of the two chromophore concentrations. The additional distance the light travels as a result of the highly diffuse nature of biological tissue results in the actual optical path length being longer than the geometric optode distance d.29 It has previously been shown, both from the modeling of light transport in the adult head using diffusion theory and from experimental measurement, that to a first approximation, the actual path length can be regarded as the product of the geometric optode spacing d and a differential path-length factor (DPF), which is wavelength30 and age dependent.31 The software allows the use of an optical path-length multiplier to be optional, and its value is defined by the user. In Sec.
3.3, we describe the use of fOSA to process a set of data from an in-house optical imaging system with a complex optode configuration having distributed source detector spacings.
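As an illustration, the two-wavelength conversion described above can be sketched in a few lines of NumPy. The extinction coefficients, optode spacing, and DPF below are placeholder values for illustration only, not those used by fOSA:

```python
import numpy as np

# Hypothetical extinction coefficients (per mM per cm) at two wavelengths;
# illustrative numbers only -- a real analysis uses tabulated spectra.
E = np.array([[0.69, 1.55],   # lambda_1: [eps_HbO2, eps_HHb]
              [1.15, 0.78]])  # lambda_2: [eps_HbO2, eps_HHb]

def mbll_convert(dA, d_cm=3.0, dpf=6.0):
    """Convert attenuation changes dA (n_samples x 2 wavelengths) into
    concentration changes (n_samples x 2: [dHbO2, dHHb]) via the
    modified Beer-Lambert law, assuming homogeneous, constant scattering."""
    path = d_cm * dpf                       # d * DPF: effective path length
    return (np.linalg.inv(E) @ (dA.T / path)).T

dA = np.array([[0.01, 0.02]])               # one sample, two wavelengths
dHb = mbll_convert(dA)
dHbT = dHb.sum(axis=1)                      # total hemoglobin change
```

The matrix inverse exists whenever the two wavelengths give linearly independent extinction spectra, which is why wavelengths are chosen on either side of the hemoglobin isosbestic point.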
fOSA Data Processing Procedures
In principle, the three possible forms of noise that can interfere with the optical signals are instrumentation noise, experimental noise, and physiological noise. The characteristics of the system noise are often well-behaved, so it can be removed easily, and the experimental noise can often be minimized with a properly designed task paradigm. However, very often the physiological noise (e.g., spontaneous vasomotion) overlaps very closely in frequency with the expected activation-related brain response.32, 33 The major emphasis of the filtering and curve-fitting procedures in the data processing is to reduce non-task-related effects and to correct for any observable noise in the physiological measurements. Since the software is designed to include basic data correction techniques that then allow a real-time assessment of the activation signals, fOSA incorporates three different types of digital filters with distinctive roll-off and passband characteristics. While the elliptic filter gives a steep roll-off, the Chebyshev filter produces a gradual roll-off with fewer ripples in the passband, and the Butterworth filter produces a "maximally flat" passband with optimum roll-off. fOSA enables the removal of noise with a known frequency spectrum using low-, high-, and/or band-pass digital filters, resampling of the data points, and/or smoothing of the data by means of a moving average. To view the range of frequencies in the physiological signal, its power spectral density can be plotted in the frequency domain using a Fast Fourier Transform (FFT). The FFT function in fOSA is defined using Welch's averaged periodogram method, dividing the whole time course into five Hanning windows (i.e., the window duration is set as one-fifth of the signal duration) with no overlap and no zero padding; the sampling frequency (of the instrument) is set by the user.
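As a sketch of this processing chain, the following SciPy fragment low-pass filters a synthetic channel with a Butterworth filter and computes a Welch periodogram with five non-overlapping Hann windows; the signal, sampling rate, and cutoff are invented for illustration:

```python
import numpy as np
from scipy import signal

fs = 10.0                                   # sampling frequency (Hz), user-set
t = np.arange(0, 300, 1 / fs)
rng = np.random.default_rng(0)
# Synthetic channel: slow task-related wave + cardiac-like 1 Hz component + noise
x = (np.sin(2 * np.pi * 0.025 * t)
     + 0.3 * np.sin(2 * np.pi * 1.0 * t)
     + 0.1 * rng.standard_normal(t.size))

# Butterworth low-pass ("maximally flat" passband); filtfilt for zero phase
b, a = signal.butter(4, 0.5, btype='low', fs=fs)
x_lp = signal.filtfilt(b, a, x)

# Welch PSD: five Hann windows spanning the record, no overlap
nperseg = x.size // 5
f, pxx = signal.welch(x, fs=fs, window='hann', nperseg=nperseg, noverlap=0)
```

Plotting `pxx` against `f` shows the cardiac-like peak before filtering; repeating the PSD on `x_lp` confirms its suppression while the slow component is preserved.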
Inappropriate detrending can introduce artificial peaks in the low-frequency part of the power spectrum that could be misinterpreted as physiological oscillations.34 The fitting procedure implemented in fOSA uses a predefined rest period to obtain a curve that best fits its time course and then subtracts it from the remainder of the data points, where the level of the residual depends on the selected fitting order. Block averaging over the repeated trials is another effective way of reducing noise: its direct effect is to remove any uncorrelated high-frequency noise, although it has very little effect on slow variations and the overall mean. These residual effects are usually dealt with when the data are detrended from the baseline.
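A minimal sketch of rest-period detrending and block averaging; the function names, polynomial order, and data are hypothetical rather than fOSA's API:

```python
import numpy as np

def detrend_from_rest(y, rest_idx, order=1):
    """Fit a polynomial of the given order to the rest-period samples and
    subtract the extrapolated trend from the whole time course."""
    t = np.arange(y.size)
    coeffs = np.polyfit(t[rest_idx], y[rest_idx], order)
    return y - np.polyval(coeffs, t)

def block_average(y, onsets, length):
    """Average the response over repeated trials of equal length."""
    return np.mean([y[o:o + length] for o in onsets], axis=0)

# Example: linear drift superimposed on a repeated boxcar response
t = np.arange(600)
y = 0.002 * t + np.concatenate([np.r_[np.zeros(50), np.ones(50)]] * 6)
y_d = detrend_from_rest(y, rest_idx=np.arange(50), order=1)
avg = block_average(y_d, onsets=range(0, 600, 100), length=100)
```

Because the drift here is linear and the fit is made on a rest period free of the response, the detrend recovers the boxcar exactly; real data leave a residual whose size depends on the fitting order.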
fOSA offers the possibility to view the processed signals after each procedure. Since all the operations are linear, the sequence of the processing does not affect the overall results. Depending on the optode configuration, the results can be displayed in the form of topographic images (with linear interpolation between the measurement channels) or spatially configured hemoglobin time courses (HbO2, HHb, or both). It is also possible to focus on a specific measurement channel for more detailed analysis. An option is available to visualize the topographic Hb maps either at a particular time slice or in a playback mode, in which the whole time series of topographic images is displayed in a movie format. There are a variety of display options for interpreting the statistical results (p-values and t-values), which can be viewed in the form of a topographic map or in a numerical table. To allow the user to carry out further analysis using other software, fOSA provides an option that allows the results to be exported as spreadsheets. In addition, fOSA offers the flexibility to process a batch of optical signals with the same experimental protocol by applying the same processes to the group data.
A classical statistical approach to comparing the difference in signal means between periods of rest and activation is to use a t-statistic that derives its probability from the standard deviation of the mean differences. Two different t-statistical approaches have been incorporated into fOSA: a Student's t-test method typically used in the analysis of functional OT data26, 28 and the implementation of the SPM-OT method that uses the SPM approach to make inferences about the spatial coherence of the functional OT data. The former approach provides a quick assessment of the significantly active brain area, as each measurement channel is treated independently of its neighbors and the epoch design means that the task and rest periods are specified. Given the sluggish temporal nature of the hemodynamic response,35 fOSA considers the temporal variation of the response function and provides the user with the option to specify the task and rest periods for statistical comparison. While more complex analyses involving multiple levels of comparison required in some studies can be conducted outside the fOSA environment, the current software provides a single-factorial comparison by allowing the choice of one of the following options:
• one-sample t-test in which the mean differences between the task and rest periods are statistically tested;
• two-sample t-test in which the averaged task period is compared against the averaged rest period; and
• paired t-test in which each trial is considered separately and a comparison is made between different trials per condition.
The open-source software means that it is also possible to modify the existing scripts and customize a user-specified statistical test.
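For illustration, a channel-wise paired t-test of the kind listed above can be sketched with SciPy on entirely synthetic per-trial means (the channel count, trial count, and effect size are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_trials, n_ch = 12, 24                    # e.g., a 24-channel layout
# Hypothetical per-trial mean HbO2 levels during rest and task blocks
rest = rng.normal(0.0, 1.0, (n_trials, n_ch))
task = rest + rng.normal(1.0, 1.0, (n_trials, n_ch))   # average task effect

# Paired t-test per channel: each trial contributes one task-rest pair
t_vals, p_vals = stats.ttest_rel(task, rest, axis=0)
active = p_vals < 0.05                     # uncorrected; channels independent
```

Note that each channel is tested in isolation here, which is exactly the limitation the SPM-OT approach described in the next section is meant to address.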
Statistical Parametric Mapping of OT Data
During functional activation, the neurovascular responses observed from multiple brain regions are often spatially and temporally correlated in some way.20 In the conventional statistical test approaches used in most optical imaging studies, each measurement point is treated as independent of its nearest neighbors, assuming that the regional activated areas are localized and independent of systemic influences. These tests also require prior knowledge about the hemodynamic response in order to specify the most relevant periods during which to compare the changes in HbO2 and HHb, and they cannot easily account for the intersubject variability in the time course of the hemodynamic response. While it is possible to determine the time course of the hemodynamic response, modeling of spatial correlations is also required, given that OT measurement channels are not spatially independent. SPM offers the flexibility to analyze sequential functional neuroimaging data by modeling the hemodynamic response and correcting for the spatial smoothness of the functional data.36 The methodology uses the mass univariate approach, where a statistic value is calculated for every voxel and the resulting statistics are then assembled into a map, i.e., an SPM. A detailed description of SPM theory and its algorithms is given elsewhere.37 This paper focuses on the specific SPM procedures necessary for the analysis of OT data. The statistical approach adopted by SPM involves two basic steps. First, statistics reflecting evidence against a null hypothesis of no effect at each voxel are computed, where the effect is specified by a so-called contrast vector.38 An image resulting from these statistics is produced. This image can be three-dimensional (3-D—fMRI, PET) or two-dimensional (2-D—M/EEG, OT). Second, this image is assessed, using p-values, by performing voxel-wise tests to determine whether there is activation.
While there are various approaches to controlling the family-wise error rate39 of this mass testing of many voxels, random field theory (RFT) has been shown to be appropriate for the analysis of functional neuroimaging data.40 Gaussian random field theory treats the statistical images as sampled versions of continuous, spatially resolved data. p-values, adjusted for multiple comparisons, are computed by estimating the probability that some peak in an image surpasses some user-specified threshold, given the null hypothesis of no activation. For typical, smooth images, it has been observed that the RFT provides much more sensitive tests than the Bonferroni correction. This is because the RFT is explicitly informed about image smoothness (i.e., spatial correlations), while the Bonferroni correction is not. It might be asked whether the RFT is applicable to OT data. To the best of our knowledge, OT data fulfill all the assumptions needed for the RFT. The results of the RFT are asymptotically valid (for higher and higher thresholds) for any type of correlation structure, as long as the local correlation is strong. (We arrive at this conclusion through personal communication with Tom Nichols and Stefan Kiebel; see also below.) This holds for OT data. Similar discussions about its applicability exist in the M/EEG field, where we observe that no definitive review has yet been written on the applicability of the RFT to M/EEG data. (M/EEG is similar to OT data in that it is a spatial mixture of the expressions of underlying brain sources.)
SPM Modeling and Inference
In the following, we will go through some basic techniques that SPM uses to model spatiotemporal data, by which we mean that both temporal and spatial correlations are modeled. The main point is that the assumptions involved in these SPM procedures hold not only for fMRI, PET, and MEG/EEG data but also for OT data.
SPM involves the construction of a statistical model and the making of inferences about its parameter estimates based on the spatiotemporal neuroimaging data and the predicted model. In order to capture the sluggish nature of the neurovascular response, SPM treats each scan as a dependent observation of the others and models the time-continuous data for each voxel. Here, each of the brain activation voxels is expressed as a combination of different basis functions that best represent the functional response using the general linear model (GLM). In simpler terms, the GLM expresses the observed response Y as a combination of the explanatory variables plus a well-behaved error term (i.e., Y = Xβ + ε).22 Translating the model into matrix form, each row in Y is treated as a scan of time-series data. The design matrix X is where the experimental knowledge about the expected signal is quantified. The matrix contains the regressors (e.g., designed effects or confounds), where each row represents an observation and each column corresponds to an experimental effect, including effects that are of no interest and pertain to confounds (e.g., physiological effects) that are modeled explicitly. The parameter β associated with each column of X indicates the effect of interest (e.g., the effect of a particular cognitive condition or the regression coefficient of the hemodynamic response). The convolution model for the hemodynamic response takes a stimulus function encoding the supposed neuronal responses convolved with a defined canonical hemodynamic response function (HRF) to form a regressor that enters into the design matrix.20 SPM also allows the modeling of latency and dispersion derivatives as additional regressors to its canonical HRF in order to treat the variability of hemodynamic responses due to different sorts of events. Other forms of explanatory data (e.g., blood pressure and/or heart rate measurements) can be defined as additional regressors in the design matrix.
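A minimal sketch of such a convolution model, building one task regressor from a boxcar stimulus function and a double-gamma canonical HRF (the HRF parameters are the commonly used defaults, assumed here; block timing is invented):

```python
import numpy as np
from scipy import stats

TR = 1.0                                    # effective scan interval (s)
n_scans = 200
frame_times = np.arange(n_scans) * TR

# Canonical double-gamma HRF shape (usual SPM-style defaults, assumed)
th = np.arange(0, 32, TR)
hrf = stats.gamma.pdf(th, 6) - stats.gamma.pdf(th, 16) / 6.0
hrf /= hrf.sum()                            # normalize to unit area

# Boxcar stimulus function: alternating 20 s rest / 20 s task blocks
stim = ((frame_times // 20) % 2 == 1).astype(float)
regressor = np.convolve(stim, hrf)[:n_scans]

# Design matrix X = [task regressor, constant term]
X = np.column_stack([regressor, np.ones(n_scans)])
```

Temporal and dispersion derivatives would simply add further columns to `X`, one per additional basis function.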
Unmodeled effects are then treated as residual errors. Since the number of parameters (columns) is usually less than the number of observations (rows), some method of estimating the parameters that best fit the observed response is needed. In SPM, the parameter estimation is achieved using a mixture of restricted maximum likelihood (ReML) and ordinary least squares (OLS) methods. In estimating the error covariance matrix, SPM assumes that the pattern of serial correlations is the same over all voxels (of interest) but that its amplitude differs at each voxel. To incorporate nonspherical error distributions, SPM uses the ReML procedure to estimate the model coefficients and error variance iteratively.41 In essence, the ReML method deals with linear combinations of the observed values whose expectations are zero. The relationship between experimental manipulations and observed data may consist of multiple effects, all of which are contained within the design matrix. To test for a specific effect, a contrast is defined to focus on a particular characteristic of the data. SPM allows the specification of different contrast vectors (i.e., univariate linear combinations of the parameter estimates) for the same design matrix to test for multiple effects without the need to refit the model.
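The estimation and contrast steps can be illustrated with plain OLS; SPM's ReML treatment of serial correlations is deliberately omitted from this sketch, and the design and data are synthetic:

```python
import numpy as np

def glm_contrast_t(Y, X, c):
    """OLS parameter estimates and contrast t-statistics per voxel.
    Y: (n_scans, n_voxels); X: (n_scans, n_regressors); c: contrast vector.
    A plain OLS sketch -- SPM itself handles serial correlations with ReML."""
    beta = np.linalg.pinv(X) @ Y              # parameter estimates
    resid = Y - X @ beta                      # unmodeled effects (residuals)
    dof = X.shape[0] - np.linalg.matrix_rank(X)
    sigma2 = (resid**2).sum(axis=0) / dof     # error variance per voxel
    var_c = c @ np.linalg.pinv(X.T @ X) @ c   # contrast variance factor
    return (c @ beta) / np.sqrt(sigma2 * var_c), beta

rng = np.random.default_rng(2)
X = np.column_stack([np.sin(np.linspace(0, 6 * np.pi, 120)), np.ones(120)])
# Hypothetical data: one responding "voxel" and one pure-noise "voxel"
Y = np.column_stack([2.0 * X[:, 0] + rng.normal(0, 1, 120),
                     rng.normal(0, 1, 120)])
t_vals, beta = glm_contrast_t(Y, X, np.array([1.0, 0.0]))
```

Testing a different effect only requires a new contrast vector `c`; the model fit itself is reused, as described above.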
Regional changes in error variance and spatial correlation in the data can induce profound nonsphericity in the error terms (as opposed to the assumption of identically and independently distributed error terms). This nonsphericity would require large numbers of parameters to be estimated for each voxel using conventional multivariate techniques. However, using SPM, the parameterization is minimized to just two parameters for each voxel—error variance and smoothness estimators. This is made possible because SPM uses the RFT to resolve the multiple comparison problem, which then implicitly imposes constraints on the nonsphericity implied by the spatially extended data. SPM improves the sensitivity of the statistical test by considering the fact that neighboring voxels are not independent by virtue of spatial correlations in the original data and employs the RFT approach to adjust the p-significance. RFT correction often expresses its search volume as a function of smoothness (or resolution elements, "resels," that define the block of correlated pixels). It relies on the expected Euler characteristic (EC), which leads directly to the expected number of clusters above the threshold; the aim is to derive this statistic height threshold once the smoothness has been estimated. A relatively straightforward expression for the expected EC, which depends on the α-threshold and the number of resels contained in the volume of voxels under analysis, is given by Worsley.42 In practice, the correction needs to take into consideration the search volume. Restricting the search region to a small volume (i.e., defining a region of interest) within the statistical map is likely to reduce the height threshold for a given family-wise error rate. The smoothness is in turn calculated from the residual values of the statistical analysis.
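As a rough illustration of how an RFT height threshold follows from the resel count, the fragment below uses only the 2-D term of the expected-EC expression for a Gaussian field; SPM's actual correction also includes lower-dimensional terms and t-field forms:

```python
import numpy as np

def ec_2d_gaussian(t, resels):
    """Expected Euler characteristic of a thresholded 2-D Gaussian random
    field (2-D resel term only; a simplified sketch of Worsley's result)."""
    return resels * 4 * np.log(2) * (2 * np.pi)**-1.5 * t * np.exp(-t**2 / 2)

def rft_height_threshold(resels, alpha=0.05):
    """Smallest threshold u on a fine grid with expected EC(u) <= alpha."""
    grid = np.arange(1.0, 6.0, 1e-3)
    ec = ec_2d_gaussian(grid, resels)
    return grid[np.argmax(ec <= alpha)]

u50 = rft_height_threshold(resels=50)
u200 = rft_height_threshold(resels=200)     # larger search volume
```

The threshold rises with the resel count, which is the quantitative form of the statement above that restricting the search region to a small volume reduces the height threshold.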
Kiebel et al. have shown that it is possible to estimate the smoothness based on the residual components in the GLM, given sufficient degrees of freedom and the size of the spatial filter kernel.43 Using the same analogy, the voxel-based inference can be extended to a larger framework involving cluster-level and set-level inference.
Both fMRI and OT measure the temporal neurovascular changes induced by the increased oxygen demand of neuronal activity and hence share a similar hemodynamic response, which makes it possible to utilize some of the existing functions in SPM designed for the analysis of fMRI data and apply them to the treatment of the optical signals. In addition to the signal-processing features implemented in fOSA, a conversion program named SPM-OT has been incorporated into fOSA that allows the OT data processed in fOSA to be analyzed in SPM like other neuroimaging modalities.44 The approach used by SPM-OT to analyze the OT data is simple:
1. Configure the brain activity signals (HbO2, HHb, and/or HbT) as topographic images such that each channel is represented by a pixel in the SPM map.
2. Perform spatial analysis and statistical inference on the planar images, knowing that this is what is being measured using the OT technique.
3. Provide an option to interpret the statistics in three-dimensional (3-D) space, given the coordinate information about these pixels.
The third option produces images where the two-dimensional (2-D) measurement patches are embedded into a 3-D head space. This is important for co-registration of the OT analysis with other modalities like fMRI or EEG. Similar SPM approaches to analyze 2-D data embedded into 3-D space have been described for fMRI.45, 46 However, this is not the approach we take in this paper. We simply analyze the OT data on a 2-D patch, which can then post hoc be co-registered to brain space.
Spatial preprocessing of OT data
A series of 2-D brain maps are generated for each time bin, where each measurement point is represented by a pixel corresponding to the HbO2 and/or HHb parameter of the OT data. Figure 1 shows one possible statistical map based on the standard configuration of the 24-channel Hitachi OT system (Fig. 2). To improve the sensitivity and facilitate intersubject comparisons, an interpolation process using nearest neighbors is used to produce a finer grid. As the interoptode spacing (IOS) is often larger than the spatial resolution afforded by the measurement system (the Hitachi OT system has a fixed IOS), it is more appropriate to break down each channel into smaller pixels to better identify any focal change in the hemodynamics. It is important to consider the spatial information that can be revealed by the existing OT technique and to produce a smoothed map of the brain activity that facilitates the RFT correction for multiple comparisons. The position of each optode can be measured using an electromagnetic tracking device that provides the 3-D coordinates of the optode pixels (in red and blue, as shown in Fig. 1). The same interpolation technique can be applied to derive the 3-D positions of the channels (in yellow), where each point is assumed to be at the center between each source-detector pair.47 To mask out any irrelevant pixels (outside the two squares shown in Fig. 1), a "not-a-number" (NaN) value is assigned on the configured grid. This allows SPM to identify regions that are not spatially correlated during the analysis. In considering the typical neurovascular response, SPM-OT provides an option to resample the optical data to a level similar to the repetition time (TR) used in fMRI. Each SPM brain map is tagged sequentially, and it is vital that the images are generated in the correct sequence, as each of these maps is selected based on its header information during the specification of the design matrix.
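The regridding and masking steps can be sketched as follows: a hypothetical 3x3 channel layout is interpolated onto a finer pixel grid (linear interpolation is used here for simplicity, where the paper describes nearest-neighbor interpolation), with out-of-patch pixels set to NaN:

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical 3x3 grid of channel positions with measured dHbO2 values
ch_xy = np.array([(x, y) for y in range(3) for x in range(3)], dtype=float)
ch_val = np.arange(9, dtype=float)          # here: value = x + 3*y

# Interpolate onto a finer pixel grid; pixels outside the measurement
# patch are filled with NaN so the statistics package can ignore them
gx, gy = np.meshgrid(np.linspace(-0.5, 2.5, 31), np.linspace(-0.5, 2.5, 31))
img = griddata(ch_xy, ch_val, (gx, gy), method='linear', fill_value=np.nan)
```

Each time bin would yield one such image, saved in strict temporal order so that the design-matrix specification picks up the maps in the correct sequence.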
Modeling of the spatiotemporal OT data
Previous fMRI-OT studies have shown some similarities between the BOLD and HbO2 signals,9, 10 and given the better signal-to-noise ratio typically seen in HbO2 signals during functional activation, the HbO2 signal has often been used as an indicator of regional cerebral blood flow.48 As the NIRS measurement is able to provide a higher sampling rate than fMRI, in principle it allows a wider range of the oscillatory spectrum to be captured. Specifying these measurable physiological oscillations in the design matrix enables a better modeling of the temporal hemodynamic response in SPM. During functional activation, the hemodynamic response of the HbO2 signal resembles a gamma function, with a time to peak of several seconds and a slight undershoot before and after the bulk response.49, 50 A similar response function has been observed during functional activation in the fMRI-BOLD signal, and this is used in SPM as a temporal basis function for the modeling of the brain response.20 Figure 3 shows the representation of the canonical hemodynamic response function (HRF) in SPM. To accommodate the sluggish feature of the HRF and the variability between different voxels due to different events, SPM allows a combination of different basis functions to model the observed response from the OT data, with the inclusion of temporal and dispersion derivatives to give additional weightings to the canonical HRF.51 An advantage of using the OT technique to map the neurovascular response is that it is able to monitor both the HbO2 and HHb signals simultaneously. Hence in the analysis of OT data, SPM-OT offers the possibility to model the two chromophores (and possibly HbT) in the GLM. By assuming that the two responses take the form of a gamma function but in different directions during functional activation, SPM-OT offers an alternative route to making inferences about the functional OT data. Using the same SPM procedures in the model specification, Fig.
4b shows the inverse canonical HRF prescribed in SPM-OT to model the signal in an experimental data set. A positive t-statistic from the SPM analysis will hence indicate that the particular pixel location is significantly deactivated. However, it should be noted that while this option offers an alternative to compare each of the chromophores between different conditions, an inference between the two chromophores will not be possible since the response functions in these signals are thought to be different, with each function having different time latencies and amplitude changes.35 These differences would require explicit (nonlinear) models that are not available with the linear methods in SPM.
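The canonical HRF and its sign-flipped counterpart can be sketched as a difference of two gamma densities. The peak delay (6 s), undershoot delay (16 s), and undershoot ratio (1/6) below follow SPM's well-known defaults; the sampling resolution is an assumption, and this Python fragment is illustrative rather than the SPM-OT implementation.

```python
# Sketch of an SPM-style canonical HRF (difference of two gamma densities)
# and its inverted form, used here to model the HHb decrease.
import numpy as np
from scipy.stats import gamma

def canonical_hrf(t, peak=6.0, under=16.0, ratio=1.0 / 6.0):
    """Difference-of-gammas HRF sampled at times t (seconds), peak-normalized."""
    h = gamma.pdf(t, peak) - ratio * gamma.pdf(t, under)
    return h / h.max()

t = np.arange(0.0, 32.0, 0.1)        # 32 s window, 0.1 s resolution (assumed)
hrf_oxy = canonical_hrf(t)           # models the HbO2 increase
hrf_deoxy = -canonical_hrf(t)        # inverted HRF models the HHb decrease
```

The gamma density with shape 6 peaks near 5 s, matching the time to peak quoted above, and the shape-16 term produces the late undershoot.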
Unlike in the SPM analysis of fMRI data, where a global normalization is recommended to threshold the BOLD response against the extracerebral volume, this procedure is not necessary for the analysis of OT data. For each voxel, SPM estimates the size of the experimental effects from the residuals, given the full set of explanatory regressors, using restricted maximum likelihood (ReML); one or more hypotheses can then be specified. The associated parameter estimates are the coefficients that determine the mixture of basis functions that best models the HRF for the voxel in question. A classical univariate test statistic can then be formed for each voxel, and the resulting statistical parameters are assembled into an SPM image. Here, a within-subject effect is specified in the first-level analysis, and the intersubject comparison is carried forward to the second-level analysis. In a group analysis, a precheck on the variance between optode positions across subjects is often necessary.
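The mass-univariate step can be illustrated for a single pixel's time series. The sketch below uses ordinary least squares and a synthetic regressor; SPM itself uses ReML with serial-correlation modeling, so this is a simplified stand-in rather than the actual estimation scheme.

```python
# Sketch: fit the GLM y = X @ beta + e for one pixel and form a t-statistic
# for a contrast of the parameter estimates. Data and design are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 200
regressor = np.sin(np.linspace(0, 6 * np.pi, n)) ** 2   # stand-in task regressor
X = np.column_stack([regressor, np.ones(n)])            # task effect + constant
y = 2.0 * regressor + 0.5 + 0.1 * rng.standard_normal(n)

beta, _, _, _ = np.linalg.lstsq(X, y, rcond=None)       # parameter estimates
resid = y - X @ beta
dof = n - np.linalg.matrix_rank(X)
sigma2 = resid @ resid / dof                            # error variance estimate

c = np.array([1.0, 0.0])                                # contrast: task effect
t_stat = c @ beta / np.sqrt(sigma2 * c @ np.linalg.inv(X.T @ X) @ c)
```

Repeating this fit independently at every pixel and assembling the t-values into an image is what produces the statistical parametric map.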
Statistical inference on OT data
In order to make statistical inferences about the volume of statistical values, RFT has been applied to solve the problem of multiple comparisons by defining the height threshold, for a smooth statistical map, that gives the required family-wise error rate. Two assumptions are made in using RFT to analyze the functional data: the error fields are a reasonable lattice approximation to an underlying random field, and the random fields are multivariate Gaussian with a continuously differentiable autocorrelation function.52 To satisfy these requirements, the OT data are sufficiently smoothed during preprocessing.
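The smoothing step that makes the lattice data an acceptable approximation to a smooth random field can be sketched as a Gaussian convolution of the pixel map. The FWHM value and map size below are assumptions for illustration.

```python
# Sketch: smooth a pixel map with a Gaussian kernel of a given FWHM, the
# preprocessing step that supports the RFT assumptions. Values are assumed.
import numpy as np
from scipy.ndimage import gaussian_filter

fwhm_pixels = 3.0
sigma = fwhm_pixels / (2.0 * np.sqrt(2.0 * np.log(2.0)))  # FWHM -> sigma

rng = np.random.default_rng(1)
raw_map = rng.standard_normal((32, 32))      # stand-in unsmoothed pixel map
smooth_map = gaussian_filter(raw_map, sigma)
```

The FWHM (in pixels) also enters the RFT threshold calculation, since the corrected p-values depend on the estimated smoothness of the field.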
OT measurements are 3-D: channels (space), time, and type of measurement (oxy- and deoxy-hemoglobin). To derive appropriate statistics, one has to model the correlations within and among these data dimensions. A similar problem has been addressed in the M/EEG literature, and we follow the arguments of Kiebel and Friston.53 In our scheme, time is modeled using the GLM, and the channel data are modeled using results from Gaussian RFT.42 This leaves the problem of how to model the third dimension, the HbO2 and HHb measurements. Note that between-dimension correlations are subsumed by assuming a factorization of the correlation matrix (over all three dimensions). In the SPM framework, there are two ways of doing this: modeling it as a third dimension using Gaussian RFT, or modeling it as a factor with two levels in the design matrix of the GLM. Given that there are only two measurements of interest, we choose not to model space and oxy/deoxy as a 2-D random field. Also, because SPM is a mass-univariate framework, doing so would exclude all inferences about a combination of HbO2 and HHb measurements; it is conceivable (although not exploited in the present paper) that such inferences are of interest to the community. Therefore, we model the two types of measurement as a factor with two levels. In the SPM framework, it is straightforward to model different variances and the covariance between the two measurements using the appropriate covariance components. A typical model for both oxy- and deoxy-hemoglobin uses a two-block design matrix, where each block models the time series of one measurement type. The noise variance and serial correlations are modeled by two covariance components for each measurement type. Note that the assumption of different variances is necessary because we assume that the two measurement types have different noise characteristics.
The crucial covariance component is between measurement types, which in its simplest case would be composed of an identity matrix modeling the covariance over time. Note that this within-session model is necessary only if one is interested in making within-subject inferences. For group inferences, it is sufficient to fit a general linear model to each measurement type and model the covariances between measurements, at the second level, in a random effects model.
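The two-block design and its per-type variance components can be sketched schematically. The regressor, block length, and component layout below are assumed toy values, not the fOSA/SPM internals.

```python
# Schematic sketch of the two-level factor model for the oxy/deoxy
# dimension: one design-matrix block per measurement type, plus a separate
# noise-variance component for each type. All values are toy examples.
import numpy as np
from scipy.linalg import block_diag

n = 100                                          # time points per measurement type
task = (np.arange(n) % 20 < 10).astype(float)    # toy boxcar task regressor
X_type = np.column_stack([task, np.ones(n)])     # task effect + constant

# One block per measurement type (HbO2, HHb) stacked on the diagonal
X = block_diag(X_type, X_type)                   # shape (2n, 4)

# Covariance components: separate noise variance for each measurement type
Q_oxy = block_diag(np.eye(n), np.zeros((n, n)))
Q_deoxy = block_diag(np.zeros((n, n)), np.eye(n))
```

A between-type component (in the simplest case an identity block on the off-diagonal, as described above) would be added to this set when within-subject inferences across the two chromophores are required.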
The statistical results from the SPM analysis are plotted as planar statistical maps in SPM. The advantage of having the results displayed in SPM is that it thresholds the significantly active pixels after adjusting the p-values. fOSA provides the facility to import and display the statistics from SPM and to facilitate more complex analyses. SPM-OT also provides an option to display the 2-D statistical results in 3-D space, given the measured optode positions. The channel and optode positions are registered as vertex points, and the faces defined on these vertices are rendered from the SPM results.
The programs in fOSA were validated using both simulated data and experimental data from the Hitachi ETG-100 OT system. A set of synthetic data containing realistic functional activation-related changes in HbO2 and HHb was generated and used to test the conversion and processing algorithms in fOSA. The applicability of fOSA was then demonstrated with a published experimental data set acquired during a finger-tapping task with optodes placed over the motor cortex.26
Comparison Using Simulated Data and Results
Validating algorithms on experimental in vivo data is difficult, since there is uncertainty about the true chromophore concentration changes; therefore, a set of simulated data was used to validate the software.54 The artificial data set consists of a time series of Hb values generated using known values for the tissue absorption and scattering coefficients, together with a diffusion theory model to calculate the expected light attenuation.55 To generate the time series data, we took measured in vitro spectra of HbO2 and HHb and combined these at time-varying concentration levels to simulate the different experimental conditions.56 Figure 5a (top) shows the simulated data together with an imposed linear drift and simulated noise over a known frequency band starting at 0.1 Hz. The task and rest periods were set to the same fixed duration. Figure 5a (bottom) shows the simulated HHb and HbO2 concentrations, each varying over a known range.
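The round-trip idea behind this validation can be sketched with the modified Beer-Lambert law: forward-model attenuation changes at two wavelengths from known concentration changes, then invert with the same extinction matrix. The extinction coefficients and pathlength below are hypothetical round numbers, not the in vitro spectra or diffusion-model values used in the paper.

```python
# Sketch: forward-model two-wavelength attenuation changes from known
# HbO2/HHb concentration changes via the modified Beer-Lambert law, then
# invert to recover them. Coefficients and pathlength are hypothetical.
import numpy as np

E = np.array([[1.0, 3.0],      # wavelength 1: [eps_HbO2, eps_HHb] (hypothetical)
              [2.5, 0.8]])     # wavelength 2 (hypothetical)
pathlength = 20.0              # source-detector distance x DPF (hypothetical)

t = np.linspace(0.0, 60.0, 601)
true_c = np.column_stack([0.5 * np.sin(0.1 * t) + 0.5,   # HbO2 change
                          -0.2 * np.sin(0.1 * t)])        # HHb change

dA = true_c @ E.T * pathlength                 # forward MBLL: delta attenuation
recovered = np.linalg.solve(E, dA.T).T / pathlength
```

A conversion routine is validated when `recovered` matches `true_c` to numerical precision; drift and noise terms can then be superimposed on `dA` to exercise the filtering stages.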
The results showed good agreement between the simulated data set and the fOSA output. There was a small offset between the two data sets that could be due to the wavelength-dependent effects of scattering, which are not currently taken into account in the fOSA application of the modified Beer-Lambert law (although this could be added by the user if desired). By applying a fifth-order Butterworth low-pass filter and a linear curve fit, we were able to effectively eliminate the simulated noise in this example. Figure 5b illustrates the comparison between the simulated and processed data after the averaging process. The slight difference in waveform shape is due to the effect of the filtering: although not shown here, the two signals share the same waveform if the comparison is made with a similarly filtered version of the simulated signal. This test validated the correctness of the algorithms used to convert the light intensity signals, as well as some of the basic signal processing routines.
Comparison Using Experimental Data and Results
The application of the fOSA software was tested using published experimental data.26 In this study, left (L-ftap) and right (R-ftap) finger-tapping tasks were conducted in two separate trials. The HbO2 and HHb concentrations were measured using a 24-channel OT system (ETG-100, Hitachi Medical Corporation, Japan).57 Figure 2 shows the optode configuration used in the study, with the optodes placed close to the sensorimotor cortex (C3 and C4 areas). In summary, the subject was guided to perform the finger-tapping tasks at a regular pace over alternating task and rest periods, and each trial was repeated five times.
Figures 6a, 6c, 6e show the time course results for one finger-tapping task over channel 16, and Figs. 6b, 6d, 6f show the time course results for the other task over channel 9. By applying the algorithms employed in the fOSA software, the two-wavelength light intensities were converted to relative HbO2 and HHb concentrations. Figures 6a and 6b show the concentrations after smoothing the data with a moving average with a window span of 10 samples and downsampling. Figures 6c and 6d show the concentration changes after modeling out the slow drift using a linear fit. Last, the data were averaged over the five repeated blocks to suppress any remaining high-frequency components, as shown in Figs. 6e and 6f.
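These three steps can be sketched as below. The 10-sample window comes from the text; the sampling rate, downsampling factor, and block length are assumptions, and the white-noise series is a stand-in for the measured concentrations.

```python
# Sketch of the three preprocessing steps: 10-sample moving average,
# downsampling, and averaging over the five repeated task blocks.
# Rates, factors, and block length are assumed values.
import numpy as np

block_len = 300                           # samples per task+rest block (assumed)
n_blocks = 5
rng = np.random.default_rng(2)
series = rng.standard_normal(block_len * n_blocks)   # stand-in concentration data

# 1) moving average with a window span of 10 samples
kernel = np.ones(10) / 10.0
smoothed = np.convolve(series, kernel, mode='same')

# 2) downsample by a factor of 5 (assumed)
down = smoothed[::5]

# 3) block-average over the five repeated blocks
per_block = down.reshape(n_blocks, block_len // 5)
block_mean = per_block.mean(axis=0)
```

Block averaging after smoothing and downsampling leaves one mean response per within-block time point, which is the trace plotted for each channel.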
The results of the analysis of the experimental time course data processed using fOSA can be compared with those obtained using the analysis methods described by Sato et al.26 in the study from which the data set was taken. In general, the hemodynamic response was observed in both analyses, i.e., an increase in HbO2 with a slight decrease in HHb after the stimulus onset, with the increase in HbO2 maintained throughout the activation period. However, there is a slight variation in the time course that could be due to the processing procedures used. First, intensity conversion and block averaging were the only preprocessing steps in the original study; fOSA performed additional preprocessing, namely linear detrending and moving-average smoothing to remove the slow drift and high-frequency components. Second, Sato et al. used a “peak search” technique to select a maximum absolute mean change (i.e., the maximum averaged area under the curve within a window). In their analysis, a paired t-test between the predefined rest and selected activation periods was conducted over the five repeated blocks to determine the significance for each task.
Since the experimental data were measured using the 24-channel Hitachi ETG-100 system, the optode configuration shown in Fig. 1 was used to generate the spatial map for the SPM analysis. As the two tasks were conducted in separate trials, SPM-OT first combined the data sequences as one continuous vector for each of the HbO2 and HHb concentration data. Note that SPM provides a flexible approach to estimating the experimental data; it is able to accommodate different data lengths with varying stimulus onset periods in its model. A canonical HRF (as implemented in SPM) plus its latency and dispersion derivatives was used as the basis set for the HbO2 response model. To illustrate the flexibility of the software and validate the activation response, a second, similar but inverted canonical HRF plus its two derivatives was used for the HHb response model. Figure 4a shows the specification of the design matrix: the horizontal axis corresponds to the six regressors for the two tasks plus a constant error term, and the vertical axis represents the 700 topographic images generated by SPM-OT. A t-statistic was computed for each of the L-ftap [Figs. 7a and 7c] and R-ftap [Figs. 7b and 7d] tasks. The reported t-results were corrected using family-wise error correction and analyzed at the “pixel” level. The statistical results showed a significant effect over the right cortex when the subject performed the L-ftap task and over the left cortex when performing the R-ftap task. These results show good agreement in terms of laterality with the published results26 and previous OT studies.27 To illustrate the significantly activated regions over the left sensorimotor area, Fig. 8 shows the 2-D t-results extracted from SPM and superimposed onto an SPM brain map using estimated optode positions (the positions were not measured in the study).
Application with Other OT Systems
fOSA has been designed to process and analyze data from both fixed and flexible array systems and is not restricted to use with the Hitachi ETG-100 system. To date, the software has been used to process OT data from two other cw systems: the ETG-4000 (Hitachi Medical Systems; 24 channels, two wavelengths)28 and an in-house cw OT system (84 channels, two wavelengths).58 Figure 9 shows the optode array design of the in-house system. Eight sources and eight detectors are positioned to provide five different optode spacings, with 42 channels in each array. This distribution of optode spacings is designed to provide some depth discrimination in the measured signals.59 Software extensions to read data from other OT formats, including the OXYMON 24-channel OT system (Artinis Ltd.; http://www.artinis.com) and the Hamamatsu OT system (Hamamatsu Photonics KK), are under active development.
Optical imaging has been used extensively in the past few years to map functional brain activity occurring in the cortex. The advantages of optical imaging include its relatively high temporal resolution and its ability to distinguish between changes in both HbO2 and HHb. Given its noninvasive and portable nature, it is particularly well suited to investigating subject and patient groups such as neonates, young infants, and children, who may not be able to tolerate other scanning techniques. As optical imaging is increasingly used to address more complex cerebral processes (e.g., neurodevelopment and cognitive behavior), a robust and accurate analysis tool is essential to interpret the observed hemodynamic responses. The fOSA software presented in this paper is one such tool.
The accuracy and validity of the algorithms used in fOSA have been assessed using a set of simulated data with known changes, and the application of fOSA has been illustrated using a previously published experimental data set. The slight difference between the simulated data and the fOSA results is due to the effects of temporal filtering. The validation test has shown the ability of fOSA to process various forms of time-continuous OT data under different experimental conditions. A single experimental data set has been used to compare results from the analysis using fOSA with those of the method described by Sato et al.26 The slight temporal variation in the resulting hemodynamic response can be attributed to the processing procedures used in each analysis and to the procedure for selecting the most significantly active channel. We have shown for one of the finger-tapping tasks that the most significant channel is close to channel 16 when using the SPM-OT technique, compared with channel 18, which was chosen in the original study; the two channels are spatially close on the probe array. While this paper is not intended to compare different statistical test methods but rather to provide an alternative approach to the analysis of functional OT data, it is questionable whether it is appropriate to selectively use the period of maximum area under the curve during activation for this finger-tapping study. This approach could mean that different activation periods would be used for the group analysis due to intersubject variations in the hemodynamic response. In addition, for the hemodynamic changes associated with more subtle cognitive processing tasks, this peak-search approach may not be suitable.
This problem of data selection is one reason why alternative methods for the analysis of OT data need to be investigated. In addition to data selection issues, one also needs to consider the spatial correlation between neighboring channels and the incorporation of other physiological measurements into the analysis to reduce the likelihood of false positives. We have shown in this paper the possibility of using SPM-OT to model both the HbO2 and the HHb responses for each measurement channel while accounting for varying temporal responses by including the latency and dispersion derivatives in the GLM. The advantage of having two different parameters with which to compute the statistical difference in a single measurement, as shown in this paper, is that it allows one to pinpoint, from the locations of the optical channels, the particular cerebral (cortical) area that shows functional activation (i.e., positive t-values on both the HbO2 and HHb maps). In addition, the use of RFT to adjust single-voxel p-values to family-wise corrected p-values means that fewer false positives are observed. From the statistical t-results using SPM-OT, several observations can be made about the analysis of the experimental data from the finger-tapping task:
1. The positive t-results demonstrated contralateral activation for both the L-ftap and R-ftap tasks, suggesting a significant increase in HbO2 and decrease in HHb during the tasks, in agreement with previous studies.26, 27
2. While the highest-significance pixel was found on the same channel in both the HbO2 and HHb maps for one task, there appeared to be some variability for the other task, more so in one of the two chromophore signals.
3. The higher t-values shown on both the HbO2 and HHb maps for the R-ftap task would suggest that the subject had a more pronounced hemodynamic response to this motor task when the right hand was used.
The fOSA software has been developed to meet the needs of OT analysis, which include providing:
1. a systematic but flexible approach to the analysis of multichannel OT data;
2. an interface with validated and standard MATLAB tools to preprocess the OT data and provide a real-time assessment of the measured signals;
3. a platform on which to run other MATLAB-based functions and conduct further analysis of the fOSA data using other software;
4. a robust statistical technique, using SPM, for the analysis of functional OT data; and
5. analysis of data acquired from different cw OT systems with flexible optode configurations.
As the source code of the data processing software supplied with the Hitachi ETG systems cannot be modified by the user, fOSA complements this system software by providing flexible processing tools. The software has undergone numerous tests and is in extensive use in our laboratory and those of our collaborators. It is due to be released as an open-source program with online supporting documentation, with emphasis on its general application to a range of OT systems with distributed optode geometries.
fOSA also provides a platform from which to access the SPM software and statistically analyze the functional OT data. Unlike conventional voxel-wise t-test methods, the mass-univariate approach used in SPM models each cortical pixel while taking local spatial correlations into account via RFT. This procedure enables valid statistical inference. In this paper, we propose a way to model OT data in the same way as other neuroimaging data, using SPM.
Given the distinctive advantage of OT over fMRI in being able to simultaneously measure HbO2 and HHb concentrations, it has been shown that the two signals can be modeled using different basis functions derived from the existing canonical HRF. This technique will hopefully serve as a useful tool for making inferences about the OT signal, selecting channels with a significant increase in oxy-hemoglobin and a significant decrease in deoxy-hemoglobin during functional activation. There are also improvements to be made in the analytical approach to the NIRS data and in the presentation of the SPM results. A feasible way to improve SPM-OT will be to incorporate functional and anatomical constraints (from other imaging modalities) as priors in the SPM-OT analysis. To reduce the ambiguity of the source localization problem, SPM uses the informed basis function approach to employ spatial priors and reduce the solution space;60 this can be done in a similar fashion for OT data. Another ongoing development in our group is the real-time estimation of the distributed hemodynamic response, which will allow informed decisions to be made about the suitability of the chosen protocols, the placement of optodes, and other considerations at the beginning of a study.
The fOSA package represents an improvement on currently available OT data processing software, both in its flexible and adaptable approach to preprocessing and in its incorporation of the SPM approach to multichannel OT data. fOSA is open source and designed for use with a wide range of optical topography systems.
This work was funded with grants from EPSRC/MRC (Grant No. 14248/01). P. Koh was supported on an EPSRC CASE studentship in collaboration with Hitachi Advanced Research Laboratory, Japan. We would like to thank the SPM team at UCL for their technical support and Hitachi Medical Ltd for providing the ETG-100 OT system.