This paper addresses the problem of generating a panoramic view and panoramic depth maps using only a single camera. The
proposed approach first estimates the egomotion of the camera. Based on this information, a particle filter approximates
the 3D structure of the scene; hence, 3D scene points are modeled probabilistically. These points are accumulated in a
cylindrical coordinate system. The probabilistic representation of the 3D points is used to handle the problem of visualizing
occluding and occluded scene points in a noisy environment, yielding a stable data visualization. This approach can easily
be extended to calibrated multi-camera applications (even with non-overlapping fields of view).
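The core steps summarized above can be sketched minimally: map each 3D scene point into cylindrical panorama coordinates, then fuse noisy depth observations per cell probabilistically. The following Python sketch is illustrative only, not the authors' implementation; the precision-weighted mean update is a simplified stand-in for the particle-filter depth estimation, and all names (`to_cylindrical`, `DepthCell`) are hypothetical.

```python
import math

def to_cylindrical(x, y, z):
    """Map a camera-centered 3D point to cylindrical panorama coordinates:
    azimuth angle, height on a unit-radius cylinder, and radial depth."""
    theta = math.atan2(x, z)   # azimuth in [-pi, pi]
    r = math.hypot(x, z)       # radial distance to the cylinder axis
    h = y / r                  # height projected onto the unit cylinder
    return theta, h, r

class DepthCell:
    """One panorama cell: fuses noisy depth observations as a
    precision-weighted running mean (a simple probabilistic stand-in
    for the per-point particle filters described in the paper)."""
    def __init__(self):
        self.mean = 0.0
        self.precision = 0.0  # accumulated inverse variance

    def update(self, depth, variance):
        # Standard incremental precision-weighted mean update.
        w = 1.0 / variance
        self.precision += w
        self.mean += (depth - self.mean) * (w / self.precision)

# Fuse three noisy depth measurements for one cell.
cell = DepthCell()
for d, var in [(5.2, 0.5), (4.8, 0.5), (5.0, 0.25)]:
    cell.update(d, var)
```

Because each cell keeps a precision alongside its mean, low-confidence observations (e.g. of occluded points) contribute less, which is one way a probabilistic representation can stabilize the visualization.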