Timely and accurate recognition of health conditions in crops enables prompt treatment of the affected plants. Automatically localizing these conditions in an image helps estimate their spread and severity, saving precious resources. Automated disease detection that combines recognition with localization can identify multiple diseases in a single image and is a small step toward robotic farm surveying and spraying. Recent developments in Deep Neural Networks have drastically improved the accuracy of object localization and identification. We leverage neural network based methods to perform accurate and fast detection of diseases and pests in tea leaves. With the goal of identifying a detector that is accurate yet efficient in terms of speed and memory, we evaluate various feature extraction networks and detection architectures. The images used to train and evaluate the models vary in resolution, quality, brightness and focus, as they were captured through a participatory sensing approach with mobile phones having different cameras. The experimental results show that the detection system effectively identifies and locates health conditions on tea leaves against complex backgrounds and under occlusion. We have evaluated YOLO based detection methods with different feature extraction architectures. Detection using YOLOv3 achieves a mAP of about 86% at 50% IoU while keeping the system usable in real time.
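The mAP figure above is computed at a 50% IoU threshold: a predicted box counts as a correct detection only when its intersection-over-union with a ground-truth box is at least 0.5. The sketch below illustrates that overlap criterion; the box coordinates are hypothetical and not taken from the paper's dataset.

```python
def iou(box_a, box_b):
    """Compute IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    # Coordinates of the intersection rectangle.
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# Hypothetical prediction overlapping a ground-truth lesion box.
pred = (10, 10, 50, 50)
gt = (20, 20, 60, 60)
print(iou(pred, gt))  # ~0.391: below 0.5, so not counted as a true positive
```

Mean average precision (mAP) then averages, over all disease and pest classes, the area under each class's precision-recall curve built from detections matched at this threshold.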