This paper presents an innovative integrated visual approach for indexing music and automatically composing personalized playlists for radio stations or chain stores. To efficiently index hundreds of music titles by hand with artistic descriptors, the user simply drags and drops them onto a dynamic music landscape. To assist the user, we propose several dynamic visualization tools, such as the semantic spectrum and semantic field lenses. An algorithm then propagates the artistic values embedded in the landscape to the titles being indexed; several propagation algorithms are tested and compared. The dynamic composition methodology is then described, including its class n-gram algorithm and its personalization mechanism, which relies on the same music map as the visual indexing method. The tools and techniques presented in this paper turn the musical experience into an integrated visual experience that may generate new music knowledge and emotion.