In this paper we propose "browsing by 3D scene" and "rendering by photographs" based on a viewpoint-based approach. The idea is that by linking 3D models and photographs via spatial information, the "viewpoint" in particular, we can use each as a reference for the other when browsing photographs or walking through the 3D scene. We express the viewpoint with camera parameters. Each photograph carries its extrinsic camera parameters as metadata, defined in the same coordinate system as the 3D model, so we can compare the viewpoints of a photograph and the 3D scene and judge their similarity. Unlike content-based image retrieval, viewpoint-based search is robust to differences in image features such as color and shape. The browsing-by-3D-scene method allows users to retrieve images that contain the same object but show it with different appearances, and to browse images taken from similar viewpoints in groups. On the other hand, when a user wants to see a particular 3D scene, the user specifies a sample image by selecting a photograph from the archive. The system then renders the 3D scene from a viewpoint similar to that of the selected photograph.
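A viewpoint comparison along these lines can be sketched as follows. The paper does not specify the similarity measure, so the function below is an illustrative assumption: it represents each viewpoint by a camera position and a unit viewing direction in the shared model coordinate system (both recoverable from the extrinsic parameters), and combines the positional and angular gaps into a single score.

```python
import numpy as np

def viewpoint_similarity(pos_a, dir_a, pos_b, dir_b, pos_scale=1.0):
    """Score the similarity of two viewpoints from extrinsic parameters.

    Each viewpoint is a camera position (3-vector) and a viewing
    direction (3-vector), both expressed in the model's coordinate
    system. Returns a value in (0, 1]; higher means more similar.
    The particular combination of distance and angle terms is a
    hypothetical choice, not the paper's exact metric.
    """
    pos_a, pos_b = np.asarray(pos_a, float), np.asarray(pos_b, float)
    dir_a = np.asarray(dir_a, float) / np.linalg.norm(dir_a)
    dir_b = np.asarray(dir_b, float) / np.linalg.norm(dir_b)
    dist = np.linalg.norm(pos_a - pos_b)               # positional gap
    angle = np.arccos(np.clip(dir_a @ dir_b, -1.0, 1.0))  # angular gap (rad)
    return 1.0 / (1.0 + dist / pos_scale + angle)
```

With such a score, "browsing by 3D scene" amounts to ranking the archive's photographs against the user's current viewpoint, and "rendering by photographs" amounts to handing the selected photograph's extrinsic parameters to the renderer; `pos_scale` sets how strongly positional distance is weighted relative to viewing angle.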