The extreme learning machine (ELM), a single-hidden-layer feedforward neural network, has shown highly effective performance in pattern analysis and machine intelligence; however, several limitations constrain its performance, such as multicollinearity in the data. The generalization capability of ELM can deteriorate significantly when multicollinearity is present in the hidden layer output matrix, causing the matrix to become singular or ill-conditioned. Ridge regression can be used to overcome this problem. The conventional way to avoid multicollinearity in ELM is to tune the ridge constant precisely, a trial-and-error process that does not guarantee an optimal value. In this paper, we present a method for finding a satisfactory ridge constant by incorporating variance inflation factors (VIF) when computing the output weights of ELM; we term this technique ELM-VIF. Experimental results on handwritten digit recognition show that the proposed ELM-VIF, compared with the original ELM, has better stability and generalization performance.
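The procedure outlined above — ridge-regularized ELM output weights, with the ridge constant increased until the VIFs of the hidden-layer outputs fall below a threshold — could be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the sigmoid activation, the ridge-VIF formula (diagonal of the squared resolvent of the correlation matrix), the VIF threshold of 10, the tenfold growth schedule for the ridge constant, and all function names are assumptions made for the sketch.

```python
import numpy as np

def ridge_vif(H, lam):
    """VIFs of the hidden-layer output columns under ridge constant lam."""
    # Standardize columns so VIFs are computed from the correlation matrix.
    Z = (H - H.mean(0)) / H.std(0)
    R = (Z.T @ Z) / (len(H) - 1)                  # correlation matrix
    A = np.linalg.inv(R + lam * np.eye(R.shape[1]))
    return np.diag(A @ R @ A)                     # assumed ridge-VIF formula

def elm_vif_fit(X, T, n_hidden=50, vif_max=10.0, seed=0):
    """Fit an ELM, growing the ridge constant until all VIFs <= vif_max."""
    rng = np.random.default_rng(seed)
    W = rng.standard_normal((X.shape[1], n_hidden))   # random input weights
    b = rng.standard_normal(n_hidden)                 # random biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))            # sigmoid hidden outputs
    lam = 1e-8
    while ridge_vif(H, lam).max() > vif_max:          # multicollinearity check
        lam *= 10.0                                   # assumed growth schedule
    # Ridge-regularized output weights: (H'H + lam*I)^{-1} H'T
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
    return W, b, beta, lam

def elm_predict(model, X):
    W, b, beta, _ = model
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta
```

For digit recognition, `T` would be one-hot label rows and the predicted class the argmax of `elm_predict`; the loop always terminates because the ridge VIFs shrink toward zero as the ridge constant grows.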