Touch sensing is ubiquitous in consumer electronic products: users expect to interact with a display simply by touching its surface with a finger. Yet the actual mechanics and physics of the touch process are little understood, as they depend on many independent variables. These range from the structure of the fingertip itself, composed of ridges, valleys, and pores over a few layers of skin and flesh covering the bone, to sweat glands and surface wetting, which, as we will show, are critical as well. On the mechanical side, the pressure with which one touches the screen, and the manner in which the surface responds to this pressure, have a major impact on touch sensing. In addition, different sensing methods, such as capacitive or optical, depend on different variables: the color of the finger may affect the latter, for example, whereas the former is insensitive to it. In this paper we describe a system that captures multiple modalities of the touch event and synchronizes them in post-processing. This enables us to look for correlations between the various effects and to uncover their influence on the performance of touch sensing algorithms. Moreover, investigating these relations allows us to improve various sensing algorithms and to find areas where they complement each other. We conclude by pointing to possible future extensions and applications of this system.