Breast-conserving surgery is a well-accepted breast cancer treatment. However, accurately localizing the tumor during surgery remains challenging, and the guidance provided by current methods is one-dimensional distance information, which is indirect and unintuitive. These shortcomings contribute to a high re-excision rate and prolonged surgical time. To address these problems, we have developed a fiber-delivered optoacoustic guide (OG), which mimics the traditional localization guide wire and is preoperatively placed into the tumor mass, together with an augmented reality (AR) system that provides real-time visualization of the tumor location with sub-millimeter variance. Through a nano-composite light-diffusion sphere and a light-absorbing layer formed on the tip of an optical fiber, the OG creates an omnidirectional acoustic source inside the tumor mass under pulsed laser excitation. The generated optoacoustic signal has a high dynamic range (~58 dB) and spreads over a large apex angle of 320 degrees. An acoustic radar with three ultrasound transducers is then attached to the breast skin and triangulates the location of the OG tip. With the AR system sensing the location of the acoustic radar, the position of the OG tip inside the tumor relative to the AR display is calculated and rendered. This provides direct visual feedback of the tumor location to surgeons, which greatly eases surgical planning during the operation and saves surgical time. A proof-of-concept experiment using a tablet and a stereo-vision camera is demonstrated, achieving 0.25 mm tracking variance.
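The triangulation step above can be sketched as classic three-sphere trilateration: each transducer's time of flight gives a distance to the OG tip, and the tip is recovered as the sphere intersection below the sensor plane. This is a minimal numpy sketch under assumed geometry (transducer positions and the speed-of-sound conversion from time of flight to distance come from calibration, which is not shown); it is not the authors' actual implementation.

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Locate a point from three sensor positions (p1..p3, 3-vectors)
    and measured distances (r1..r3), e.g. speed_of_sound * time_of_flight."""
    # Local frame: p1 at origin, p2 along the x-axis, p3 in the x-y plane
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = ex @ (p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = ey @ (p3 - p1)
    # Subtracting the sphere equations pairwise yields linear equations in x, y
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    # Two mirror solutions exist; take the one below the sensor plane (inside tissue)
    return p1 + x * ex + y * ey - z * ez
```

With all three transducers on the skin surface, the remaining up/down ambiguity is resolved by always choosing the solution on the tissue side of the plane.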
Atherosclerotic plaque at the carotid bifurcation is the underlying cause of the majority of ischemic strokes. Noninvasive imaging and quantification of the compositional changes that precede gross anatomic changes within the arterial wall are essential for diagnosis of the disease. Current imaging modalities such as duplex ultrasound, computed tomography, and positron emission tomography are limited by a lack of compositional contrast and by detection only of flow-limiting lesions. Although high-resolution magnetic resonance imaging has been developed to characterize atherosclerotic plaque composition, its accessibility for wide clinical use is limited. Here, we demonstrate a fiber-based multispectral photoacoustic tomography system for excitation of lipids and external acoustic detection of the generated ultrasound. Using sequential ultrasound imaging of ex vivo preparations, we achieved ~2 cm imaging depth and chemical selectivity for assessment of human arterial plaques. A multivariate curve resolution alternating least squares (MCR-ALS) analysis was applied to resolve the major chemical components, including intravascular lipid, intramuscular fat, and blood. These results show the promise of detecting carotid plaque in vivo through esophageal fiber-optic excitation of lipids and external acoustic detection of the generated ultrasound. This imaging system has great potential to serve as a point-of-care device for early diagnosis of carotid artery disease in the clinic.
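The MCR-ALS decomposition mentioned above factors the multispectral data matrix into nonnegative concentration maps and component spectra by alternating least-squares updates. The following is a minimal sketch of the idea (a plain clipped-ALS loop on a matrix D of pixels x wavelengths), not the specific constrained implementation used in the study:

```python
import numpy as np

def mcr_als(D, n_components, n_iter=300, seed=0):
    """Approximate D (pixels x wavelengths) as C @ S with nonnegativity.
    Returns C (pixels x components) and S (components x wavelengths)."""
    rng = np.random.default_rng(seed)
    S = rng.random((n_components, D.shape[1]))       # random initial spectra
    for _ in range(n_iter):
        C = np.clip(D @ np.linalg.pinv(S), 0, None)  # least-squares concentrations, clipped
        S = np.clip(np.linalg.pinv(C) @ D, 0, None)  # least-squares spectra, clipped
        S /= np.linalg.norm(S, axis=1, keepdims=True) + 1e-12  # resolve scale ambiguity
    return C, S
```

In practice MCR-ALS implementations add further constraints (equality constraints on known spectra, closure, initial estimates from purest variables) to reduce rotational ambiguity; the clipped loop above only illustrates the alternating structure.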
Compound eyes in arthropods demonstrate imaging characteristics distinct from those of human eyes, including a wide-angle field of view, low aberrations, high acuity to motion, and an infinite depth of field. Artificial imaging systems with similar geometries and properties are of great interest for many applications. However, the challenges in building such systems, with hemispherical, compound apposition layouts, cannot be met through established planar sensor technologies and conventional optics. We present our recent progress in combining optics, materials, mechanics, and integration schemes to build fully functional artificial compound eye cameras. Nearly full hemispherical shapes (about 160 degrees) with densely packed artificial ommatidia were realized. The number of ommatidia (180) is comparable to those of the eyes of fire ants and bark beetles. The devices combine elastomeric compound optical elements with deformable arrays of thin silicon photodetectors, which were fabricated in planar geometries and then integrated and elastically transformed into hemispherical shapes. Imaging results and quantitative ray-tracing-based simulations illustrate key features of operation. These general strategies appear applicable to other compound eye devices, such as those inspired by moths and lacewings (refracting superposition eyes), lobster and shrimp (reflecting superposition eyes), and houseflies (neural superposition eyes).
Photoacoustic imaging employing molecular overtone vibration as a contrast mechanism opens a new avenue for bond-selective imaging of deep tissues. Broad use of this modality is, however, hampered by the extremely low conversion efficiency of optical parametric oscillators at the overtone transition wavelengths. To overcome this barrier, we demonstrate the construction and use of a compact, barium nitrate crystal-based Raman laser for photoacoustic imaging of C–H overtone vibrations. Using a 5-ns Nd:YAG laser as the pump source, up to 21.4 mJ of pulse energy at 1197 nm was generated, corresponding to a conversion efficiency of 34.8%. Using the 1197 nm pulses, three-dimensional photoacoustic imaging of intramuscular fat was demonstrated.
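For context, the 1197 nm output line is consistent with first-Stokes Raman conversion: the Stokes wavenumber is the pump wavenumber minus the Raman shift of the crystal (~1047 cm⁻¹ for Ba(NO₃)₂). A quick check, assuming the standard 1064 nm Nd:YAG pump line:

```python
def stokes_wavelength(pump_nm, shift_cm1):
    """First-Stokes wavelength (nm) from pump wavelength and Raman shift."""
    pump_cm1 = 1e7 / pump_nm            # pump wavenumber in cm^-1
    return 1e7 / (pump_cm1 - shift_cm1)  # subtract the Raman shift, convert back

# Ba(NO3)2 Raman shift ~1047 cm^-1, Nd:YAG pump at 1064 nm
print(round(stokes_wavelength(1064, 1047), 1))  # ~1197.4 nm, matching the abstract
```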
Photoacoustic imaging employing molecular overtone vibration as a contrast mechanism opens a new avenue for deep-tissue imaging with chemical bond selectivity. Here, we demonstrate vibration-based photoacoustic tomography with an imaging depth on the centimeter scale. To provide sufficient pulse energy at the overtone transition wavelengths, we constructed a compact, barium nitrate crystal-based Raman laser for excitation of the second overtone of the C–H bond. Using a 5-ns Nd:YAG laser as the pump source, up to 105 mJ of pulse energy at 1197 nm was generated. Vibrational photoacoustic spectroscopy and tomography of a phantom (polyethylene tube) immersed in whole milk were performed. With a pulse energy of 47 mJ on the milk surface, a penetration depth of up to 2.5 cm was reached with a signal-to-noise ratio of 12.