In this paper, we design an optimal learning rule for the Hopfield associative memory (HAM) based on three well-recognized criteria: every desired attractor must be made not only isolatedly stable but also asymptotically stable, and the number of spurious stable states should be kept as small as possible. These criteria are crucial for constructing a satisfactory associative memory. We first analyze the real cause of the unsatisfactory performance of the Hebb rule and of many other existing learning rules designed for HAMs, and then show that the three criteria in fact amount to widely expanding the basin of attraction around each desired attractor. One effective way to widely expand the basins of attraction of all desired attractors is to appropriately dig a steep kernel basin of attraction around each of them. To this end, we introduce a concept called Hamming stability. Surprisingly, we find that Hamming stability for all desired attractors can be reduced to a moderately expanded linear-separability condition at each neuron, so that the well-known Rosenblatt perceptron learning rule is exactly the right rule for learning Hamming stability. Extensive experiments were conducted, convincingly showing that the proposed perceptron Hamming-stability learning rule does take good care of all three optimality criteria.
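To make the per-neuron reduction concrete, the following is a minimal illustrative sketch (not the paper's exact algorithm) of training HAM weights with Rosenblatt's perceptron rule: each neuron's weight row is trained so that every stored bipolar pattern satisfies a margin-strengthened linear-separability condition, which makes the stored patterns stable fixed points of the recall dynamics. The margin, learning rate, and epoch limit are assumed illustrative parameters, and the margin here stands in only loosely for the paper's Hamming-stability condition.

```python
import numpy as np

def perceptron_train_hopfield(patterns, margin=1.0, lr=0.1, max_epochs=100):
    """Train HAM weights so each stored pattern is a stable fixed point.

    For each neuron i, runs Rosenblatt's perceptron rule on the condition
    x_i * (w_i . x) > margin for every stored bipolar pattern x,
    keeping the self-connection at zero.
    """
    patterns = np.asarray(patterns, dtype=float)   # shape (P, N), entries +/-1
    _, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(max_epochs):
        converged = True
        for x in patterns:
            h = W @ x                              # local fields h_i = w_i . x
            for i in range(N):
                if x[i] * h[i] <= margin:          # margin violated at neuron i
                    W[i] += lr * x[i] * x          # perceptron update on row i
                    W[i, i] = 0.0                  # keep zero self-connection
                    converged = False
        if converged:
            break
    return W

def recall(W, state, steps=50):
    """Synchronous sign-update recall dynamics of the HAM."""
    s = np.sign(np.asarray(state, dtype=float))
    for _ in range(steps):
        new = np.sign(W @ s)
        new[new == 0] = 1                          # break ties toward +1
        if np.array_equal(new, s):                 # reached a fixed point
            break
        s = new
    return s
```

After training, each stored pattern satisfies its margin condition at every neuron, so one synchronous update maps the pattern to itself; states within the carved basin are drawn toward it.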