Fifty years ago, while Claude Shannon was developing the Mathematical Theory of Communication for reliable data transmission, which evolved into the subject of information theory, another discipline was developing around the feedback control of dynamical systems, which evolved into a scientific subject dealing with decision, stability, and optimization. More recently, a separate discipline dealing with the robustness of uncertain systems was born in response to the need to codify high performance and reliability in the presence of modeling uncertainties. In principle, robustness in dynamical systems is captured through power dissipation via induced norms and dynamic games, while reliable data transmission is captured through measures of information via entropy, relative entropy, and certain laws of large deviations theory. The main ingredient of large deviations is the rate functional (the action functional, in the terminology of classical mechanics), often identified through the Cramér or Legendre-Fenchel transform. The robustness of stochastic uncertain systems, on the other hand, is currently under development, using information-theoretic as well as statistical-mechanics concepts such as partition functions, free energy, relative entropy, and the entropy rate functional. This lecture will summarize certain connections between fundamental concepts of robustness, information theory, and statistical mechanics, and offer some projections on the future convergence of these disciplines.
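
For concreteness, the two mathematical objects named above can be sketched as follows (an illustrative aside, with standard notation assumed rather than taken from the abstract): in Cramér's theorem the rate functional is the Legendre-Fenchel transform of the logarithmic moment generating function, and the link between robustness and statistical mechanics rests on the variational duality between free energy and relative entropy.

```latex
% Cramér's theorem: the rate functional I is the Legendre-Fenchel
% transform of the logarithmic moment generating function \Lambda
I(x) = \sup_{\theta \in \mathbb{R}} \bigl\{ \theta x - \Lambda(\theta) \bigr\},
\qquad
\Lambda(\theta) = \log \mathbb{E}\bigl[ e^{\theta X} \bigr].

% Duality between free energy and relative entropy
% (Donsker--Varadhan variational formula): for bounded measurable f,
\log \mathbb{E}_{P}\bigl[ e^{f} \bigr]
  = \sup_{Q \ll P} \bigl\{ \mathbb{E}_{Q}[f] - D(Q \,\|\, P) \bigr\}.
```

The second identity is what allows induced-norm and dynamic-game formulations of robustness to be re-expressed in terms of partition functions and relative entropy.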