This paper presents a selection of image processing methods and algorithms needed to enable the reliable
automation of robotic tasks at the micro- and nanoscale. Application examples include the automatic assembly of novel
nanoscale electronic elements and the automatic testing of material properties. Because of the very small object
dimensions targeted here, the scanning electron microscope (SEM) is the appropriate image sensor. The methods
described in this paper fall into two categories: object recognition and object tracking. Object recognition deals with
the problem of finding and labeling nanoscale objects in an image scene, whereas tracking is the process of continuously
following the movement of a specific object. Carried out in sequence, the two methods enable fully automated robotic
tasks at the micro- and nanoscale. A selection of algorithms is demonstrated and shown to be suitable.
Depth estimation in the scanning electron microscope (SEM) is an important topic, especially for automation
purposes. The SEM delivers only two-dimensional (2D) images, which makes manipulation processes difficult.
In spite of the SEM's high depth of focus, it is still possible to use depth from focus as a depth estimation
technique for nanomanipulation applications. This article deals with the extraction of depth information from
SEM images using focus-based methods, and with possibilities to improve the performance of these algorithms. A
new approach is presented that combines 2D object tracking with focus-based depth estimation in order
to enable a limited form of three-dimensional tracking.
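Depth from focus, as mentioned above, works by acquiring an image stack at several focus settings, scoring each image with a focus measure, and taking the sharpest one's focus position as the depth estimate. The sketch below illustrates the general principle under assumptions not stated in the abstract: a gradient-energy focus measure (one common choice; variance of a Laplacian behaves similarly) and parabolic interpolation around the peak for sub-step resolution. Function names are hypothetical.

```python
import numpy as np

def focus_measure(img):
    """Gradient-energy focus measure: sharper images score higher."""
    gy, gx = np.gradient(img.astype(float))
    return float((gx * gx + gy * gy).mean())

def depth_from_focus(stack, z_values):
    """Return the focus position z whose image is sharpest, refined by
    fitting a parabola to the focus curve around the maximum."""
    scores = np.array([focus_measure(img) for img in stack])
    i = int(np.argmax(scores))
    if 0 < i < len(scores) - 1:
        s0, s1, s2 = scores[i - 1:i + 2]
        denom = s0 - 2 * s1 + s2
        if denom != 0:
            # sub-step peak offset from three-point parabola fit
            offset = 0.5 * (s0 - s2) / denom
            dz = z_values[i + 1] - z_values[i]
            return z_values[i] + offset * dz
    return z_values[i]

# Synthetic stack: a checkerboard whose contrast peaks at z = 0
pattern = (np.indices((16, 16)).sum(axis=0) % 2).astype(float)
z_values = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
stack = [np.exp(-z * z) * pattern for z in z_values]
print(depth_from_focus(stack, z_values))  # → 0.0
```

The SEM's high depth of focus flattens this focus curve, which is precisely why the abstract notes it as a challenge: a flat peak makes the maximum harder to localize, and the interpolation step becomes more important.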
This paper describes the implementation of several key components used to build a prototype of a
versatile camera system operating inside a scanning electron microscope. For the precise alignment of the camera
inside the vacuum chamber, a stick-slip-based actuator was developed that generates a high torque despite its
small dimensions. The camera is mounted on a rail-and-carriage system, and the implemented combination of
absolute and relative optical sensors is described. Finally, several object tracking scenarios are defined and first
results of the implemented tracking algorithms are presented.
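Stick-slip actuators of the kind mentioned above are typically driven by a sawtooth voltage: a slow ramp moves the rotor together with the piezo element (stick), and an abrupt flyback lets the piezo snap back while the rotor's inertia holds it in place (slip), yielding a net step per cycle. The paper's actual drive electronics and parameters are not given here; the generator below is a generic illustration, with all names and values chosen for the example.

```python
import numpy as np

def sawtooth_drive(steps, samples_per_step=100, amplitude=1.0, slip_fraction=0.05):
    """Piecewise-linear sawtooth drive signal for a stick-slip actuator.
    `slip_fraction` sets how much of each cycle the fast flyback occupies.
    All parameters are illustrative, not taken from the paper."""
    n_ramp = int(samples_per_step * (1 - slip_fraction))  # slow stick phase
    n_slip = samples_per_step - n_ramp                    # fast slip phase
    ramp = np.linspace(0.0, amplitude, n_ramp, endpoint=False)
    flyback = np.linspace(amplitude, 0.0, n_slip, endpoint=False)
    one_step = np.concatenate([ramp, flyback])
    return np.tile(one_step, steps)

signal = sawtooth_drive(steps=3)
print(signal.shape)  # → (300,)
```

Reversing the ramp and flyback directions drives the actuator the other way, which is how bidirectional camera alignment is achieved with a single waveform generator.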