This paper discusses the experimental results of our visualization model for data extracted from sensors. Its objective is to find a computationally efficient method to produce a real-time rendering of a large amount of data. We develop a visualization method to monitor the temperature variance of a data center. Sensors are placed on three layers and do not cover the whole room. We use a particle paradigm to interpolate the sensor data: particles model the "space" of the room. In this work we partition the particle set using two mathematical methods, Delaunay triangulation and Voronoï cells, both presented by Avis and Bhattacharya. Particles provide information on the room temperature at different coordinates over time.
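The following sketch (ours, not code from the paper) illustrates the interpolation idea: sparse sensor readings are triangulated with Delaunay and then evaluated at particle positions. SciPy is assumed as the numerical library, and all sensor coordinates and temperatures are hypothetical placeholders.

```python
# Minimal sketch: interpolate sparse sensor readings onto a coarse particle
# grid through a Delaunay triangulation. All values below are placeholders.
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

# Hypothetical sensor positions (x, y, z in metres) and temperatures (deg C).
sensor_xyz = np.array([[0.0, 0.0, 0.5], [4.0, 0.0, 0.5], [0.0, 3.0, 2.0],
                       [4.0, 3.0, 2.0], [2.0, 1.5, 3.5]])
sensor_temp = np.array([21.0, 23.5, 22.0, 24.0, 26.5])

# Triangulate the sensor positions; each particle is interpolated from the
# simplex (tetrahedron) that contains it.
tri = Delaunay(sensor_xyz)
interp = LinearNDInterpolator(tri, sensor_temp)

# "Particles" filling the room on a coarse regular grid.
xs, ys, zs = np.meshgrid(np.linspace(0, 4, 5),
                         np.linspace(0, 3, 4),
                         np.linspace(0.5, 3.5, 3))
particles = np.column_stack([xs.ravel(), ys.ravel(), zs.ravel()])

temps = interp(particles)  # NaN for particles outside the sensors' convex hull
print(temps.reshape(xs.shape)[0])
```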
To locate particles relative to the sensors and to update the particles' data, we define a computational cost function. To evaluate this function efficiently, we use a client-server paradigm: the server computes the data and the client displays it on different kinds of hardware.
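A minimal sketch of this compute/display split is shown below; it is an assumption-laden illustration, with a Python multiprocessing pipe standing in for the real network link, a fabricated per-frame update in place of the paper's cost function, and printing in place of rendering.

```python
# Minimal sketch of the client-server split: the "server" process computes
# per-frame particle temperatures, the "client" consumes and displays them.
import multiprocessing as mp

def server(conn, n_frames=3):
    """Server side: compute (here, fake) particle temperatures for each frame."""
    for frame in range(n_frames):
        temperatures = [20.0 + 0.1 * frame + 0.01 * i for i in range(8)]  # placeholder update
        conn.send((frame, temperatures))
    conn.send(None)  # end-of-stream marker
    conn.close()

def client(conn):
    """Client side: receive the data and display it (printing stands in for rendering)."""
    while (msg := conn.recv()) is not None:
        frame, temperatures = msg
        print(f"frame {frame}: {temperatures[:4]} ...")

if __name__ == "__main__":
    parent_conn, child_conn = mp.Pipe()
    p = mp.Process(target=server, args=(child_conn,))
    p.start()
    client(parent_conn)
    p.join()
```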
This paper is organized as follows. The first part presents related algorithms used to visualize large flows of data. The second part presents the different platforms and methods that were evaluated in order to determine the best solution for the proposed task. The benchmark uses the computational cost of our algorithm, which is based on locating particles relative to the sensors and on updating particle values. It was run on a personal computer using CPU, multi-core, GPU, and hybrid GPU/CPU programming.
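To make the serial versus multi-core comparison concrete, the sketch below times a particle-update pass on one core and on a process pool. It is only an assumed stand-in: the kernel here is simple inverse-distance weighting rather than the paper's cost function, and the GPU and hybrid variants are not shown.

```python
# Minimal benchmark sketch: update particle values serially and with a
# multiprocessing pool, then compare wall-clock times. Data are placeholders.
import time
import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(0)
sensors = rng.random((64, 3))            # placeholder sensor positions
values = rng.random(64) * 10 + 20        # placeholder sensor temperatures
particles = rng.random((100_000, 3))     # placeholder particle positions

def update_chunk(chunk):
    """Inverse-distance update of one chunk of particles (stand-in kernel)."""
    d = np.linalg.norm(chunk[:, None, :] - sensors[None, :, :], axis=2) + 1e-9
    w = 1.0 / d
    return (w @ values) / w.sum(axis=1)

if __name__ == "__main__":
    t0 = time.perf_counter()
    serial = update_chunk(particles)
    t1 = time.perf_counter()

    chunks = np.array_split(particles, 8)
    with Pool(processes=8) as pool:
        parallel = np.concatenate(pool.map(update_chunk, chunks))
    t2 = time.perf_counter()

    print(f"serial: {t1 - t0:.2f}s  multi-core: {t2 - t1:.2f}s  "
          f"match: {np.allclose(serial, parallel)}")
```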
GPU programming is a growing area of research; it makes real-time rendering possible instead of precomputed rendering. To improve our results, we also ran our algorithm on a High Performance Computing (HPC) platform; this benchmark was used to improve the multi-core method. HPC is commonly used in data visualization (astronomy, physics, etc.) to improve rendering and achieve real-time performance.