### The Learning Problem for Discrete-Time Cellular Neural Networks as a Combinatorial Optimization Problem

### H. Magnussen, J.A. Nossek and Leon O. Chua
EECS Department

University of California, Berkeley

Technical Report No. UCB/ERL M93/88

1993

"Global" learning algorithms for Discrete-Time Cellular Neural Networks (DTCNNs) are a class of learning algorithms in which the algorithm itself designs the trajectory of the network. For binary input patterns, the global learning problem for DTCNNs is a combinatorial optimization problem, and properties of its solution space can be deduced from the available theory on linear threshold logic. Results on the required accuracy of the network parameters are given. The problem of deciding whether a given task can be learned by a given DTCNN architecture (the feasibility problem) is conjectured to be NP-complete. A cost function is defined that measures the errors made in mapping a set of input images onto the desired output images, and simulated annealing is used to minimize this cost function. The algorithm is applied to find the parameters of a standard DTCNN architecture for a simple pattern recognition task.
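The combination of an error-counting cost function and simulated annealing described above can be illustrated with a minimal sketch. This is not the report's algorithm: instead of a full DTCNN with cloning templates, it anneals the three parameters of a single linear threshold cell (the elementary unit of threshold logic) so that it reproduces the Boolean AND of two ±1 inputs; the cost counts misclassified patterns. All function names, parameter values, and the cooling schedule are illustrative assumptions.

```python
import math
import random

def anneal(cost, x0, neighbor, t0=1.0, alpha=0.95, steps=2000, seed=0):
    """Generic simulated annealing: minimize cost(x) starting from x0."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best_x, best_c = x, c
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        cy = cost(y)
        # Accept downhill/equal moves always; uphill moves with
        # Boltzmann probability exp(-(delta cost)/temperature).
        if cy <= c or rng.random() < math.exp((c - cy) / t):
            x, c = y, cy
            if c < best_c:
                best_x, best_c = x, c
        t *= alpha  # geometric cooling schedule
    return best_x, best_c

# Toy task: a single threshold cell  y = sgn(w1*u1 + w2*u2 + b)
# should realize the logical AND of two binary (+1/-1) inputs.
patterns = [((-1, -1), -1), ((-1, 1), -1), ((1, -1), -1), ((1, 1), 1)]

def cost(params):
    """Number of patterns the cell maps to the wrong output."""
    w1, w2, b = params
    errors = 0
    for (u1, u2), d in patterns:
        y = 1 if w1 * u1 + w2 * u2 + b >= 0 else -1
        errors += y != d
    return errors

def neighbor(params, rng):
    """Perturb one randomly chosen parameter by a small amount."""
    p = list(params)
    p[rng.randrange(3)] += rng.uniform(-0.5, 0.5)
    return tuple(p)

best, err = anneal(cost, (0.0, 0.0, 0.0), neighbor)
print(best, err)  # remaining pattern errors; 0 means the task was learned
```

Because the cost is integer-valued and piecewise constant in the parameters, gradient methods do not apply directly; this is one reason a combinatorial search such as simulated annealing is a natural fit for the global learning problem.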

BibTeX citation:

    @techreport{Magnussen:M93/88,
        Author = {Magnussen, H. and Nossek, J.A. and Chua, Leon O.},
        Title = {The Learning Problem for Discrete-Time Cellular Neural Networks as a Combinatorial Optimization Problem},
        Institution = {EECS Department, University of California, Berkeley},
        Year = {1993},
        URL = {http://www2.eecs.berkeley.edu/Pubs/TechRpts/1993/2464.html},
        Number = {UCB/ERL M93/88},
        Abstract = {"Global" Learning algorithms for Discrete-Time Cellular Neural Networks (DTCNNs) are a class of learning algorithms where the algorithm designs the trajectory of the network. For binary input patterns, the global learning problem for DTCNNs is a combinatorial optimization problem. Properties of the solution space can be deduced using the available theory on Linear Threshold Logic. Results on the required accuracy of the network parameters are given. The problem of deciding whether a given task can be learned for a DTCNN architecture (feasibility problem) is conjectured to be NP-complete. A cost function is defined that is a measure of the errors in the mapping process from a set of input images onto the desired output images. Simulated Annealing methods are used to minimize this cost function. The algorithm is used to find the parameters of a standard DTCNN architecture for a simple pattern recognition task.}
    }

EndNote citation:

    %0 Report
    %A Magnussen, H.
    %A Nossek, J.A.
    %A Chua, Leon O.
    %T The Learning Problem for Discrete-Time Cellular Neural Networks as a Combinatorial Optimization Problem
    %I EECS Department, University of California, Berkeley
    %D 1993
    %@ UCB/ERL M93/88
    %U http://www2.eecs.berkeley.edu/Pubs/TechRpts/1993/2464.html
    %F Magnussen:M93/88