Deep learning-based approaches have proven highly successful at categorizing digitized biopsy samples. The common setting is to train convolutional neural networks on data sets in which all images share the same size. However, clinical practice in breast histopathology requires multi-class categorization of regions of interest (ROI) in biopsy samples, where these regions can have arbitrary shapes and sizes. The typical solution is to aggregate the classification results of fixed-size patches cropped from an image to obtain image-level classification scores. A further limitation of these approaches is that individual patches are processed independently, so the rich contextual information in complex tissue structures is not sufficiently exploited. We propose a generic methodology that incorporates local inter-patch context through a graph convolutional network (GCN) operating on a graph-based ROI representation. The proposed GCN model propagates information over neighboring patches in a progressive manner to classify the whole ROI into a diagnostic class. Experiments on a challenging data set for a 4-class ROI-level classification task, together with comparisons against several baseline approaches, show that incorporating spatial context through graph convolutional layers outperforms commonly used fusion rules.
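
The patch-graph idea can be illustrated with a minimal sketch: each patch becomes a node carrying a feature vector (e.g. a CNN embedding), edges connect spatially adjacent patches, a standard graph-convolution update propagates information over neighbors, and the patch outputs are pooled into a single 4-class ROI prediction. All names, dimensions, and the toy adjacency below are illustrative assumptions, not details of the proposed model.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: ReLU(D^{-1/2} (A+I) D^{-1/2} H W).

    A: (n, n) patch adjacency, H: (n, d) patch features, W: (d, d') weights.
    """
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]  # symmetric normalization
    return np.maximum(A_norm @ H @ W, 0.0)

# Toy ROI: 4 patches on a 2x2 grid, 8-dim patch features (stand-ins for CNN embeddings).
rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # edges between spatially adjacent patches
H = rng.standard_normal((4, 8))             # per-patch feature vectors
W1 = rng.standard_normal((8, 16))           # illustrative (untrained) weights
W2 = rng.standard_normal((16, 4))           # 4 diagnostic classes

H1 = gcn_layer(A, H, W1)                    # propagate features over neighboring patches
logits = gcn_layer(A, H1, W2).mean(axis=0)  # pool patch outputs to an ROI-level score
probs = np.exp(logits - logits.max())
probs /= probs.sum()                        # softmax over the 4 classes
```

In practice the graph structure lets the ROI-level decision depend on how neighboring patches relate, rather than treating each patch score as an independent vote as common fusion rules (e.g. averaging or majority voting) do.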