Precise and functional phenotyping is a limiting factor for crop genetic improvement. Because of its ease of application, however, imagery-based phenomics represents the next breakthrough for improving rates of genetic gain in field crops. Currently, crop breeders lack the know-how and computational tools to include such traits in breeding pipelines. Fully automatic, user-friendly data management, together with more powerful and accurate interpretation of results, should increase the use of field high-throughput phenotyping platforms (HTPPs) and, therefore, increase the efficiency of crop genetic improvement to meet the needs of future generations. The aim of this study is to develop a methodology for high-throughput phenotyping based on temporal multispectral imagery (MSI) collected from Unmanned Aerial Systems (UAS) over soybean crops. In this context, ‘Triple S’ (Statistical computing of Segmented Soybean multispectral imagery) is developed as an open-source software tool that statistically analyzes the pixel values of soybean end-members and computes canopy cover area and the number and length of soybean rows from georeferenced multispectral images. During the 2017 growing season, a soybean experiment was carried out at the Agronomy Center for Research and Education (ACRE) in West Lafayette (Indiana, USA). Periodic images were acquired with a Parrot Sequoia multispectral sensor on board a senseFly eBee. The results confirm the feasibility of the proposed methodology, providing scalability to comprehensive analysis of crop extent and supporting continuous operational improvement and proactive management.
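To make the core idea concrete, the following is a minimal sketch of how canopy cover can be derived from segmented multispectral imagery. It is not the Triple S implementation; the function name, the NDVI threshold, and the ground sample distance are illustrative assumptions, and the segmentation here uses a simple NDVI cutoff rather than an end-member analysis.

```python
import numpy as np

def canopy_cover(nir, red, ndvi_threshold=0.4, gsd_m=0.1):
    """Segment vegetation pixels with an NDVI threshold and summarize cover.

    nir, red       : 2-D reflectance arrays (e.g., Parrot Sequoia bands).
    ndvi_threshold : illustrative cutoff separating canopy from soil.
    gsd_m          : assumed ground sample distance (metres per pixel side).
    Returns (cover fraction, cover area in m^2, NDVI values of canopy pixels).
    """
    nir = nir.astype(float)
    red = red.astype(float)
    ndvi = (nir - red) / (nir + red + 1e-9)  # small epsilon avoids division by zero
    canopy_mask = ndvi > ndvi_threshold      # boolean segmentation of canopy pixels
    cover_fraction = canopy_mask.mean()      # fraction of plot covered by canopy
    cover_area_m2 = canopy_mask.sum() * gsd_m ** 2  # pixel count times pixel area
    return cover_fraction, cover_area_m2, ndvi[canopy_mask]

# Synthetic 4x4 plot: left half canopy (high NIR reflectance), right half soil
nir = np.array([[0.8, 0.8, 0.3, 0.3]] * 4)
red = np.array([[0.1, 0.1, 0.25, 0.25]] * 4)
frac, area, canopy_ndvi = canopy_cover(nir, red)  # half the pixels exceed the threshold
```

On this synthetic plot, half the pixels are classified as canopy, so the cover fraction is 0.5 and, at the assumed 0.1 m ground sample distance, the cover area is 8 pixels × 0.01 m² = 0.08 m². The per-pixel NDVI values retained for the canopy mask are the kind of segmented pixel distribution on which summary statistics can then be computed.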