A single-layer perceptron with four input nodes and two output nodes is constructed to simulate learning the comparison of two 2-bit values. After training, it is observed that the weight on the node receiving the higher-order bit is significantly larger than the weight on the node receiving the lower-order bit; more precisely, the higher-order weight is twice the lower-order weight. This exactly reflects the significance of each bit in the binary representation of a value. When a hard limiter with an offset θ is used as the non-linearity, the output produces the correct comparison result as expected. However, when the difference between the two output nodes is examined and a piecewise-linear activation function is applied to that difference, the outcome is striking: the perceptron computes the difference of the two 2-bit values. In other words, it does more than compare; it subtracts. We conclude that the perceptron acts not only as a classifier, but as a classifier that also provides a gradient (magnitude) feature.
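The behavior described above can be illustrated with a minimal sketch. The weight values, the grouping of the four inputs into one output node per operand, and the function names below are assumptions for illustration, not the paper's actual trained network: each output node is taken to weight its operand's higher-order bit twice as heavily as its lower-order bit, so the node activations equal the operand values and their difference equals the subtraction result.

```python
import numpy as np

# Hypothetical post-training weights: the higher-order bit carries
# twice the weight of the lower-order bit, mirroring bit significance.
W = np.array([[2.0, 1.0, 0.0, 0.0],   # output node for operand A
              [0.0, 0.0, 2.0, 1.0]])  # output node for operand B

def compare(a_bits, b_bits, theta=0.0):
    """Compare two 2-bit values given as (high_bit, low_bit) tuples."""
    x = np.array(list(a_bits) + list(b_bits), dtype=float)  # (a1, a0, b1, b0)
    out = W @ x                     # activations equal the operand values
    diff = out[0] - out[1]          # = value(A) - value(B)
    # Hard limiter with offset theta: 1 when A exceeds B, else 0.
    decision = 1 if diff > theta else 0
    # A piecewise-linear activation (identity on this range) exposes
    # the raw difference, i.e. the subtraction result.
    return decision, diff

# A = 3 (bits 1,1), B = 1 (bits 0,1): A > B, and A - B = 2
print(compare((1, 1), (0, 1)))  # → (1, 2.0)
```

The same forward pass thus yields both outputs: thresholding the difference gives the comparison, while reading the difference directly gives the subtraction.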