In recent years, phenotyping of various crops has gained widespread popularity owing to its ability to capture variations in the effects of different genotypes of a particular crop on growth, yield, biomass, and related traits. Such an application requires extensive data collection and analysis at high spatial and temporal resolution, which can be attained using multiple sensors onboard Unmanned Aerial Vehicles (UAVs). In this study, we focus on harnessing information from a variety of sensors, such as RGB cameras, LiDAR units, and push-broom hyperspectral sensors covering the Short-wave Infrared (SWIR) and Visible Near Infrared (VNIR) ranges. The major challenge to be overcome in this regard is ensuring an accurate integration of information captured across several days from the different sensor modalities. Moreover, the payload constraint of UAVs restrains us from mounting all the sensors simultaneously during a single flight mission, thus necessitating data capture from different sensors mounted on separate platforms that are flown individually over the agricultural field of interest. The first step towards integrating the different data modalities is the generation of georeferenced products from each flight mission, which is accomplished with the help of Global Navigation Satellite System (GNSS) and Inertial Navigation System (INS) units mounted on the UAVs and time-synchronized with the onboard LiDAR units, cameras, and/or hyperspectral sensors. Furthermore, accurate georeferencing is achieved by developing robust calibration approaches dedicated to the accurate estimation of the mounting parameters of the involved sensors. Finally, the geometric and spectral characteristics derived from the different sensors, such as canopy cover and leaf count, are used to devise a model for analyzing the phenotypic traits of crops. The preliminary results indicate that the proposed calibration techniques can attain an accuracy of up to 3 cm.
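The GNSS/INS-based georeferencing mentioned above is conventionally expressed through the direct georeferencing model, where a point observed in the sensor frame is mapped into the mapping frame using the time-dependent GNSS/INS pose together with the mounting parameters (lever arm and boresight rotation) whose estimation is the goal of the calibration. The sketch below illustrates this model only; the function and variable names are illustrative and not taken from the study:

```python
import numpy as np

def rotation_from_euler(omega, phi, kappa):
    """Rotation matrix from Euler angles (radians): R = Rz(kappa) @ Ry(phi) @ Rx(omega)."""
    cx, sx = np.cos(omega), np.sin(omega)
    cy, sy = np.cos(phi), np.sin(phi)
    cz, sz = np.cos(kappa), np.sin(kappa)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def georeference(p_sensor, r_body_map, R_body_map, lever_arm, R_boresight):
    """Direct georeferencing of a sensor-frame point into the mapping frame.

    Implements p_map = r_b^m(t) + R_b^m(t) @ (r_s^b + R_s^b @ p_sensor), where
    r_b^m(t), R_b^m(t) come from the time-synchronized GNSS/INS trajectory and
    r_s^b (lever arm), R_s^b (boresight) are the calibrated mounting parameters.
    """
    return r_body_map + R_body_map @ (lever_arm + R_boresight @ p_sensor)

# Example: a LiDAR return 1 m ahead of the sensor, platform at (10, 20, 30) m,
# identity attitude, zero lever arm, boresight rotated 90 deg about the z-axis.
p_map = georeference(np.array([1.0, 0.0, 0.0]),
                     np.array([10.0, 20.0, 30.0]),
                     np.eye(3),
                     np.zeros(3),
                     rotation_from_euler(0.0, 0.0, np.pi / 2))
```

Errors in the lever arm shift every georeferenced point by a constant offset, while boresight errors grow with range, which is why rigorous mounting-parameter calibration drives the achievable georeferencing accuracy.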