A single-layer perceptron neural network with four input nodes and two output nodes is constructed to simulate the learning of comparisons between two 2-bit values. After training, we observe that the weights connected to the higher-order bit inputs are significantly larger than those connected to the lower-order bit inputs; more precisely, each higher-order weight is twice the corresponding lower-order weight, which exactly reflects the significance of the different bit positions in a positional binary representation. When a hard limiter with an offset θ is used as the nonlinearity, the network behaves as expected, correctly classifying which of the two values is larger. However, when the values of the two output nodes are subtracted and a piecewise-linear activation function is applied to the difference, the outcome is surprising: the single-layer perceptron computes the difference of the two 2-bit values. In other words, it does more than compare; it subtracts. We conclude that the perceptron acts not merely as a classifier, but as a classifier with a graded output.
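To make the setup concrete, the following is a minimal sketch (not the authors' implementation) of such a perceptron trained with the standard perceptron learning rule on all sixteen pairs of 2-bit values. The target coding (node 0 fires when A > B, node 1 when B > A), learning rate, epoch count, and initialization are assumptions added for illustration; the printed weights and the difference of the two net inputs can then be inspected against the behaviour described above.

```python
# Minimal sketch (not the authors' code): a single-layer perceptron with four
# inputs (the bits a1, a0, b1, b0 of two 2-bit numbers A and B) and two output
# nodes, trained with the classic perceptron rule. Targets, learning rate and
# epoch count are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Training set: every pair of 2-bit values, bits ordered [a1, a0, b1, b0].
X, T = [], []
for A in range(4):
    for B in range(4):
        X.append([A >> 1, A & 1, B >> 1, B & 1])
        T.append([int(A > B), int(B > A)])   # node 0: "A > B", node 1: "B > A"
X, T = np.array(X, float), np.array(T, float)

W = rng.uniform(-0.1, 0.1, size=(2, 4))      # weights: 2 output nodes x 4 inputs
theta = rng.uniform(-0.1, 0.1, size=2)       # offsets used by the hard limiter

def hard_limit(net):
    return (net > 0).astype(float)

# Perceptron learning rule: w += eta * (target - output) * input
eta = 0.1
for epoch in range(200):
    for x, t in zip(X, T):
        net = W @ x - theta
        y = hard_limit(net)
        W += eta * np.outer(t - y, x)
        theta -= eta * (t - y)

print("Learned weights (rows = output nodes, columns = a1 a0 b1 b0):")
print(np.round(W, 2))

# The higher-order bit weights tend to be roughly twice the lower-order ones,
# so the difference of the two net inputs tracks A - B (up to a scale factor).
for A in range(4):
    for B in range(4):
        x = np.array([A >> 1, A & 1, B >> 1, B & 1], float)
        net = W @ x - theta
        print(f"A={A} B={B}  net0-net1={net[0] - net[1]:6.2f}  A-B={A - B}")
```

Taking the difference of the raw net inputs, rather than the hard-limited outputs, is what exposes the graded behaviour: the hard limiter only preserves the sign of the comparison, while the linear part of the network carries the magnitude.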