APPENDIX E
Neural network architectures

One of the important points to consider when building a neural network is the design of an architecture suited to the proposed goal. This is one of the aspects in which the craftsmanship of the designer shows: his mark, his "signature". In practice there are as many architectures as there are designers, since each researcher tries to add some element that gives the network a distinctive characteristic. Nevertheless, some standard network models exist which can be used as benchmarks. If the proposed network does not perform better than these networks, the effort must continue; if, on the contrary, the new network improves on their performance, a fruitful solution to the problem has been reached.

It can be said that the back-propagation network with one hidden layer, or BPN (3-layer Back Propagation Network), is the most popular structure and the usual benchmark. It consists of three main layers: the first is the input layer, to which weights and an activation function are applied, producing the hidden (second) layer. Multiplying the hidden layer by its weights and, possibly, applying its activation function, yields the output (third) layer. Panel A5.1 shows the basic scheme of a BPN with one hidden layer.

Panel A5.1 DFD of a 3-layer BPN

BPN of four layers (Four-layer Back Propagation Network): Its configuration is similar to the 3-layer BPN except for the addition of an extra hidden layer. It therefore has an input layer, two hidden layers and an output layer. This network must be tested against a 3-layer BPN to determine which of the two gives the best performance.

Jump Connection Networks: Similar to a BPN, but differing in that each layer is fed by all the previous layers.
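The forward pass of the 3-layer BPN described above can be sketched as follows. This is a minimal illustration, not the author's implementation; the layer sizes, the sigmoid activation and the random weights are assumptions made for the example.

```python
import numpy as np

def sigmoid(z):
    # A common activation function for BPN layers (illustrative choice)
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 5, 2          # illustrative layer sizes
W1 = rng.normal(size=(n_in, n_hidden))   # weights: input -> hidden layer
W2 = rng.normal(size=(n_hidden, n_out))  # weights: hidden -> output layer

x = rng.normal(size=n_in)                # one input pattern

hidden = sigmoid(x @ W1)                 # hidden (second) layer
output = sigmoid(hidden @ W2)            # output (third) layer

print(output.shape)                      # (2,)
```

Training a BPN then consists of adjusting W1 and W2 by back-propagating the output error, which is not shown here.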
In a 3-layer jump connection network we have:
Input layer: fed by the network inputs.
Hidden layer: fed by the input layer.
Output layer: fed by the hidden layer and the input layer.
Conceptually, the connection scheme for a network of up to 6 layers is the one shown in Panel A5.2.

PANEL A5.2 Connection scheme of a 6-layer jump connection network

In this diagram each circle represents a layer of the network, and the arrows emanating from it represent the connections. Each new layer receives connections from all the previous layers, so training and execution of the network become noticeably slower as layers are added.

Networks with hidden layers in parallel: In this type of network there are one or several hidden layers, each containing a different activation function. For example, a network may have two parallel hidden layers, each with a different activation function, which together contribute to the output layer. Panel A5.3 illustrates this architecture.

PANEL A5.3 Scheme of a network with parallel hidden layers

Circle 1 represents the input layer, circles 2 and 3 the parallel hidden layers, and circle 4 the output layer.

Self-organizing map networks (Kohonen network and similar): This type of network is characterized by having two layers, the input layer and the output layer. Each element of the input layer ends up associated with an element of the output layer, which is especially useful for grouping data into categories or classes. Each category then corresponds to an element of the output layer.
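The 3-layer jump connection forward pass can be sketched as below. As before, the sizes, sigmoid activation and random weights are illustrative assumptions; the point is that the output layer receives both the hidden layer and the raw inputs.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 5, 2          # illustrative layer sizes

W1 = rng.normal(size=(n_in, n_hidden))
# The output layer is fed by the hidden layer AND the input layer,
# so its weight matrix takes the concatenation of both.
W2 = rng.normal(size=(n_hidden + n_in, n_out))

x = rng.normal(size=n_in)
hidden = sigmoid(x @ W1)                        # fed by the inputs
output = sigmoid(np.concatenate([hidden, x]) @ W2)  # fed by hidden + inputs

print(output.shape)                      # (2,)
```

Each extra layer enlarges the concatenated input of every later layer, which is why training and execution slow down as layers are added.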
Panel A5.4 shows a diagram of the operation of this network.

Panel A5.4 Self-organizing map network

The classic examples handled with this network structure are problems where the common link between patterns must be found. For example: given the symptoms of a disease, the name of the pathology is required.

Bidirectional Associative Memory networks (BAM): This type of network consists of two layers, the input layer and the output layer. Its graphical structure is similar to that of Panel A5.4. This network tries to return the original pattern when presented with a defective input; that is, given an incomplete or corrupted version of a pattern, the network returns the complete pattern. For example, if we look for "Jose Rodriguez" but by mistake enter "Jose Radriguez", a BAM will return the correct "Jose Rodriguez". A BAM associated with a computerized telephone directory makes it possible to type an incorrect name and have the network return the name and address of everyone with a similar name. Panel A5.5 shows the DFD of a BAM associated with a computerized telephone directory.

Panel A5.5 DFD of a BAM associated with a telephone directory
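The error-correcting recall of a BAM can be sketched with the classic bipolar construction: the weight matrix is the sum of outer products of the stored pattern pairs, and recall bounces the state between the two layers until it stabilizes. The two stored pairs here are made-up toy patterns, not data from the text.

```python
import numpy as np

def sign(v):
    # Bipolar threshold: map activations to -1/+1 (treat 0 as +1)
    return np.where(v >= 0, 1, -1)

# Two illustrative bipolar pattern pairs (input pattern, associated output)
X = np.array([[ 1, -1,  1, -1],
              [-1,  1, -1,  1]])
Y = np.array([[ 1,  1, -1],
              [-1, -1,  1]])

# Hebbian weight matrix: sum of outer products of the stored pairs
W = X.T @ Y

def recall(x, steps=5):
    # Alternate between the two layers until the state settles
    for _ in range(steps):
        y = sign(x @ W)       # input layer -> output layer
        x = sign(y @ W.T)     # output layer -> input layer
    return x, y

# Corrupt one bit of the first stored input (the "Jose Radriguez" case)
noisy = X[0].copy()
noisy[1] = -noisy[1]
x_rec, y_rec = recall(noisy)
print(x_rec)  # recovers the stored pattern [ 1 -1  1 -1]
```

Despite the corrupted bit, recall converges back to the stored pair, which is exactly the behavior the telephone-directory example relies on.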