This paper describes the implementation of neural network processing layers using basic current-mode operating modules. The work focuses on the implementation of neural networks based on the Adaptive Resonance Theory developed by S. Grossberg and G.A. Carpenter. The ART-based neural network chosen for development is MART, proposed by F. Delgado, because of its complex architecture and self-adaptive learning process, which is able to discard meaningless categories. Our presentation starts by introducing the behaviour of MART together with an analysis of its structure. The development described in this work focuses on the monochannel block included in the main signal
processing part of the MART neural network. A description of the computing algorithm of the layers inside a monochannel block is also provided in order to show which operational current-mode modules are
needed (multiplier, divider, square-rooter, adder, subtractor, absolute value, maximum and minimum evaluators, etc.). Descriptions at the schematic and layout levels of all the processing layers are given. All of them have been designed using AMS 0.35 µm technology with a supply voltage of 3.3 V. The modules are designed to handle input currents in the range of 20 to 50 µA, showing linear behaviour and an output error of less than 10%, which is sufficient for neural signal processing systems. The maximum operating frequency is around 200 kHz. Simulation results are included to show that the operation performed by the designed hardware matches the behaviour described by the MART neural network. For testing purposes we show the design of a monochannel block hardware implementation restricted to five inputs and three categories.
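As an illustration only, the current-mode modules listed above could be modelled behaviourally as follows. This is a sketch under stated assumptions: the translinear-style relations for the multiplier, divider, and square-rooter, and the normalization current `I_REF`, are assumptions chosen for illustration, not specifications taken from the paper.

```python
# Hypothetical behavioural model of the current-mode modules listed in the
# abstract. Currents are in amperes; the operating range of interest is
# 20-50 uA. I_REF is an assumed normalization current, not from the paper.
I_REF = 50e-6  # assumed normalization current (A)

def mul(i_a, i_b):
    # Translinear-style multiplier: I_out = I_a * I_b / I_REF
    return i_a * i_b / I_REF

def div(i_a, i_b):
    # Translinear-style divider: I_out = I_a * I_REF / I_b
    return i_a * I_REF / i_b

def square_root(i_a):
    # Square-rooter: I_out = sqrt(I_a * I_REF), keeping units in amperes
    return (i_a * I_REF) ** 0.5

def add(i_a, i_b):
    # Addition comes for free in current mode (Kirchhoff's current law)
    return i_a + i_b

def sub(i_a, i_b):
    # Subtraction, e.g. via a current mirror
    return i_a - i_b

def abs_val(i_a):
    # Absolute-value (full-wave rectifier) module
    return abs(i_a)

def max_eval(i_a, i_b):
    # Maximum evaluator
    return i_a if i_a > i_b else i_b

def min_eval(i_a, i_b):
    # Minimum evaluator
    return i_a if i_a < i_b else i_b
```

Such a model only captures the ideal transfer functions; the reported <10% output error and 200 kHz bandwidth of the actual circuits are not represented here.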