Hücresel nöral ağlar ve uygulamaları
Cellular neural networks and applications
- Thesis No: 19322
- Advisor: PROF. DR. I. CEM GÖKNAR
- Thesis Type: Master's
- Subjects: Computer Engineering and Computer Science and Control
- Keywords: Not specified.
- Year: 1991
- Language: Turkish
- University: İstanbul Teknik Üniversitesi
- Institute: Fen Bilimleri Enstitüsü
- Department: Not specified.
- Discipline: Not specified.
- Number of Pages: 99
Abstract
Artificial neural networks have been the focus of rapidly growing interest over the last ten years. Humans and animals can easily perform certain kinds of functions that are very hard to realize with the digital circuits of today's computers. While scientists try to understand how the human brain works, they also use the knowledge they gain to build new and more powerful computers and machines. In this thesis, after some general information on artificial neural networks, Cellular Neural Networks, a special class of neural networks, are examined. Cellular neural networks are analog, parallel circuits developed to fill certain gaps in neural network theory and successfully applied to image processing and recognition problems. After the theoretical structure of CNNs is examined, their applications in various fields are presented. An original application of cellular networks is also carried out in this thesis: the signal decomposition problem, one of the important problems of signal processing, is solved by finding a new cloning template.
Abstract (Translation)
SUMMARY: CELLULAR NEURAL NETWORKS AND APPLICATIONS

Technology consists of ideas that people observe in nature and try to copy in order to ease human life. Since the earliest ages, researchers have taken examples from nature, from animals and from humans themselves to build machines, produce chemical materials, grow plants and even construct intelligent systems. Building a machine that can think and behave like a human has always been one of the main research issues in the history of mathematics and engineering. Mathematicians, physicians and psychologists produced many ideas on thinking machines, but until the second half of the 20th century they were not able to prove their ideas experimentally because of the barriers of technology. Semiconductor and microelectronic technology gave rise to research on thinking machines and allowed researchers to carry their ideas from paper to the laboratory. People then realized that conventional circuit and computer theory was not the right answer to human-like perception and decision problems. Even the fastest computers in the world were not able to recognize a simple object faster than a higher animal. Scientists turned to brain research to answer the question "How does the brain work?".

Several years of brain research showed that the brain works in a completely different way from conventional computer electronics: computers are digital and sequential, whereas the brain is analog and parallel. The brain consists of an enormous number of simple cells called neurons, and the neurons work in a highly parallel structure. The elements of a simple neuron can be seen in Fig. 1.

[Fig. 1. Neuron: dendrites, synapses, axon]

A neuron consists of dendrites, the input ports; an axon, the output port; and synapses forming the connections between neurons. A neuron can be described as a controlled pulse generator which produces a variable-frequency pulse train under the controlling inputs of its dendrites. What was surprising is that although neurons are slower than an average logic gate, working in parallel with other neurons they produce outputs a hundred or more times faster. This fact led scientists to a new circuit theory, named Artificial Neural Network Theory. Artificial neural networks consist of simple analog units with non-linear characteristics. Many researchers have focused on neural networks for nearly twenty years and have successfully adapted real-life problems, for which conventional computers were inefficient, to neural circuits. Today the term neural network, or more properly artificial neural network, has come to mean any computing architecture that consists of a massively parallel interconnection of simple neural processors.

The human nervous system is anatomically subdivided into the central nervous system, which comprises the brain and spinal cord, and the peripheral nervous system, which comprises the spinal and cranial nerves. The brain is subdivided into five major anatomical units called the cerebrum (cortex), cerebellum, midbrain, pons and medulla. Functionally, the brain can be divided into smaller parts such as the motor cortex, visual cortex, auditory cortex, hippocampus and various nuclei groups in the brain stem. The average human brain consists of about 1.5x10^10 neurons of various types, with each neuron receiving signals through as many as 10^4 synapses. With that kind of complexity, the human brain is considered to be the most complex piece of machinery on earth.
Neurobiologists have taken the bottom-up approach by studying the stimulus-response characteristics of single neurons and of networks of neurons. Psychologists, on the other hand, have taken the top-down approach by studying brain function at the cognitive and behavioral level. Engineers seek solutions to problems that are difficult for today's digital computing technology but easily solved by people and animals. They therefore keep one eye on brain studies and the other on making use of available models and paradigms, and possibly on developing new ones. Engineers try to build more brain-like computers out of neuron-like parts. It was this view that led to the development of many of the earlier models of neurons and neural networks.

McCulloch and Pitts showed in the 1940s that the neuron can be modeled as a simple threshold device to perform logic functions. Their work was very significant because it showed that simple units can perform complex functions when working in parallel. By the late 1950s and early 1960s, neuron models were further refined into Rosenblatt's Perceptron, Widrow and Hoff's Adaline and Steinbuch's Learning Matrix. The perceptron generated considerable excitement when it was first introduced because of its conceptual simplicity. The excitement was short-lived, however, when Minsky and Papert proved mathematically that it cannot be used to realize complex logic functions which are not linearly separable. The fate of the Adaline, on the other hand, was quite different: because of its linear and adaptive nature, the technique has developed into a powerful tool for adaptive signal processing. The present interest in neural network research is due in part to the paper Hopfield published in 1982, in which he presented a model of neural computation based on the interaction of neurons. The model consisted of a set of first-order (non-linear) differential equations that minimize a certain energy function. The three articles of Hopfield and Tank started a new generation of neural network research which led the world to a new kind of thinking.

The main advantages of neural networks over conventional computers are their adaptation and learning capability and, of course, their speed. Furthermore, because of their non-linear nature, neural networks can perform functional approximation and signal filtering operations which are beyond optimal linear techniques. Neural networks can be used in pattern classification by defining non-linear regions in feature space. Like all engineering phenomena, neural networks also have their limits. Their massively parallel nature prevents the use of discrete implementation techniques, because the number of connections grows rapidly as the number of neurons is increased. In VLSI implementations, scientists can try to reduce the size of the neurons, but the connections still cover most of the silicon surface. Wafer-scale integration is another alternative, where scientists try to eliminate the thermal effects of large networks.

Cellular Neural Networks (CNNs) are a new generation of neural networks introduced by Chua and Yang in 1988. Their aim was to design a neural network structure which is close to the human nervous system and eliminates the disadvantages encountered in VLSI realizations. The main difference of cellular neural networks, compared to a fully connected Hopfield net, is their nearest-neighbor connection structure.
In a conventional cellular neural network, neurons are placed in a two-dimensional rectangular grid, although the theory can be extended to higher dimensions. Neurons are connected only to their nearest neighbors in the network matrix, and the r-neighborhood of a cell C(i,j) is defined as

N_r(i,j) = \{ C(k,l) \mid \max(|k-i|, |l-j|) \le r,\; 1 \le k \le M,\; 1 \le l \le N \}, \qquad r \in \mathbb{N}.

This approach reduces both the silicon surface used by the connections between neurons and the power consumption. The brain also has a similar connection structure: an average of about 1000 neurons are in close connection, which we call microscopic connections, and these groups of neurons are in turn connected to other groups through long synapses, which we call macroscopic connections.

The internal structure of a cell can be seen in Fig. 2.

[Fig. 2. Typical cell structure]

A typical cell in a cellular neural network has three nodes u, x and y, corresponding to the input, state and output of the cell, respectively. The cell input is a constant voltage source E_{ij} which is supplied externally. The cell has the following components: a constant current source I, a linear capacitor C, two linear resistors R_x and R_y, a number of voltage-controlled current sources which represent the effect of the nearest-neighbor neurons, and a piecewise-linear voltage-controlled current source. Connections with the neighbors and the inputs are made through weighted currents of the following type:

I_{xy}(i,j;k,l) = A(i,j;k,l)\, v_{ykl}, \qquad \forall\, C(k,l) \in N_r(i,j),
I_{xu}(i,j;k,l) = B(i,j;k,l)\, v_{ukl}, \qquad \forall\, C(k,l) \in N_r(i,j).

The only non-linear element of the cell is the voltage-controlled current source defined by

I_{yx} = \frac{1}{R_y} f(v_{xij}), \qquad f(v) = \tfrac{1}{2}\big( |v+1| - |v-1| \big).

Cellular neural networks can be programmed to perform several tasks through their neighborhood connection weights, which are called cloning templates. A cloning template is a space-invariant matrix that gives the connection weights of the cells in a neighborhood. The connection weights are twofold: the matrix A, which holds the weights for the output voltages of the neighbor cells and is also called the feedback operator, and the matrix B, which holds the weights for the input voltages of the neighbor cells, the control operator. A cellular neural network is programmed with the A and B matrices and the constant current I; these three together are usually called a cloning template, or the analog algorithm of the network.

The theory of cellular neural networks depends on the dynamic routes of the individual cells; the state, output and input equations of a cell and the circuit constraints can be given as follows:

C \frac{dv_{xij}(t)}{dt} = -\frac{1}{R_x} v_{xij}(t) + \sum_{C(k,l) \in N_r(i,j)} A(i,j;k,l)\, v_{ykl}(t) + \sum_{C(k,l) \in N_r(i,j)} B(i,j;k,l)\, v_{ukl}(t) + I,

v_{yij}(t) = \tfrac{1}{2}\big( |v_{xij}(t)+1| - |v_{xij}(t)-1| \big),

v_{uij} = E_{ij}, \qquad 1 \le i \le M,\; 1 \le j \le N,

|v_{xij}(0)| \le 1, \qquad |v_{uij}| \le 1, \qquad A(i,j;k,l) = A(k,l;i,j), \qquad C > 0, \; R_x > 0.

When cellular neural network theory was first introduced, the elements of the cloning templates were all constant real numbers and stability depended on the reciprocity of the cloning templates. A cloning template is said to be reciprocal iff any two neurons are connected to each other by the same weight; stated mathematically, A(i,j;k,l) = A(k,l;i,j). The first stability analysis relied on this property and on showing that the energy function

E(t) = -\tfrac{1}{2} \sum_{i=1}^{M} \sum_{j=1}^{N} \sum_{C(k,l) \in N_r(i,j)} A(i,j;k,l)\, v_{yij}(t)\, v_{ykl}(t) + \frac{1}{2 R_x} \sum_{i=1}^{M} \sum_{j=1}^{N} v_{yij}^2(t) - \sum_{i=1}^{M} \sum_{j=1}^{N} \sum_{C(k,l) \in N_r(i,j)} B(i,j;k,l)\, v_{yij}(t)\, v_{ukl} - \sum_{i=1}^{M} \sum_{j=1}^{N} I\, v_{yij}(t)

is a Lyapunov function for the network.
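As an illustration of the dynamics above (not part of the original thesis), the short sketch below integrates the Chua-Yang state equation with forward Euler steps, assuming C = R_x = 1 and 3x3 cloning templates; the function names, the step size and the boundary handling are illustrative choices rather than anything specified in the source.

```python
import numpy as np

def cnn_output(x):
    # Piecewise-linear output: f(v) = 0.5 * (|v + 1| - |v - 1|)
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def cnn_simulate(x0, u, A, B, I, dt=0.01, steps=2000):
    """Forward-Euler integration of the Chua-Yang state equation
    dx/dt = -x + (A * y) + (B * u) + I, assuming C = Rx = 1.
    x0, u : 2-D arrays (initial state and input image); A, B : 3x3 templates."""
    x = x0.astype(float).copy()
    for _ in range(steps):
        y = cnn_output(x)
        # Zero-pad so border cells see virtual neighbors held at zero.
        yp = np.pad(y, 1)
        up = np.pad(u, 1)
        fb = np.zeros_like(x)   # feedback term  (A applied to outputs)
        ctl = np.zeros_like(x)  # control term   (B applied to inputs)
        for di in range(3):
            for dj in range(3):
                # Cross-correlate the 3x3 templates with the padded arrays.
                fb += A[di, dj] * yp[di:di + x.shape[0], dj:dj + x.shape[1]]
                ctl += B[di, dj] * up[di:di + u.shape[0], dj:dj + u.shape[1]]
        x += dt * (-x + fb + ctl + I)
    return cnn_output(x)
```

Because the template is space-invariant, the neighborhood sum reduces to a cross-correlation of the padded array with the 3x3 matrix; the zero padding stands in for boundary cells held at zero, one common boundary-condition choice.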
In 1990, Chua and Roska proved that a more relaxed condition, having positive or negative cell-linking templates, is enough to guarantee complete stability. At the same time, non-linear and delay-type templates were introduced by the same authors, and some examples were given for a silicon retina in a following paper. Cellular neural networks seem very promising as analog computing arrays, with cloning templates serving as analog macros and analog algorithms consisting of the sequential application of several cloning templates.

Cellular neural networks have been applied to image processing problems with great success. The data corresponding to an image are given as the initial state or applied to the inputs of the cellular neural network, where each neuron corresponds to a pixel of the image. Then, using different types of cloning templates, several features can be extracted independently of the image itself. Many cloning templates for feature extraction can be found in the literature, such as corner and edge detection, image inversion, shadow detection, connected component detection, hole filling, etc. Besides single-image processing, cellular neural networks have been applied to some intelligent tasks based on image processing, like small object counting and motion detection. Although some research has been done on finding the proper cloning template for a given task, no analytical or algorithmic results have been obtained; most cloning templates are found experimentally, and their effect on the network's operation is not well explained.

As an original contribution of the thesis, cellular neural network theory is applied to a different area of signal processing, the signal decomposition problem, which can be defined as follows: given a function sampled at N different time points, find the components of the original function in terms of Gaussian functions. The shape of the Gaussian functions allows the nearest-neighbor feature of CNNs to be exploited, and the necessary cloning template depends only on the sampling period and is independent of the function itself, in contrast to the solution originally suggested by Hopfield using a Hopfield net. The main advantage of using cellular neural networks for this problem is their fast response, which is independent of the network size and is limited to a few microseconds.
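The abstract does not reproduce the thesis's decomposition template, so the following sketch only illustrates the problem setup under stated assumptions: the signal is modeled as a sum of Gaussians of known width centred at the sample points, so the design matrix depends only on the sampling period (and the assumed width) and is effectively banded, which is what makes a nearest-neighbor connection pattern plausible. The sketch solves the resulting linear system conventionally; it is not the thesis's CNN solution, and all names and parameter values are hypothetical.

```python
import numpy as np

# Decomposition setup (illustration only): approximate a sampled signal s[n] as
#   s[n] ~ sum_k c[k] * g((n - k) * T),   g(t) = exp(-t**2 / (2 * sigma**2)).
# The matrix G[n, k] = g((n - k) * T) depends only on the sampling period T
# (and the assumed width sigma), not on the signal itself, and it decays fast
# away from the diagonal -- the banded structure a nearest-neighbor CNN can cover.

T, sigma, N = 1.0, 1.0, 64          # sampling period, Gaussian width, sample count
n = np.arange(N)
G = np.exp(-((n[:, None] - n[None, :]) * T) ** 2 / (2.0 * sigma ** 2))

# Example signal: two Gaussian bumps plus a little noise.
rng = np.random.default_rng(0)
true_c = np.zeros(N)
true_c[20], true_c[27] = 1.0, 0.6
s = G @ true_c + 0.01 * rng.standard_normal(N)

# Conventional least-squares solution of G c = s; the thesis instead obtains the
# coefficients with a CNN whose cloning template is derived from the band of G.
c, *_ = np.linalg.lstsq(G, s, rcond=None)
print(np.round(c[18:30], 2))
```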
Similar Theses
- Hücresel sinir ağları (CNN) modelleri
Cellular neural networks (CNN) models
GÜNAY KURTULDU
Master's
Turkish
2002
Electrical and Electronics Engineering; İstanbul Üniversitesi; Department of Electrical and Electronics Engineering
ASSOC. PROF. DR. OSMAN NURİ UÇAN
- Rastgele markov alanları ve hücresel sinir ağları ile görüntü işleme
Image processing with Markov random fields and cellular neural networks
MAHMUT ŞAMİL SAĞIROĞLU
Master's
Turkish
2001
Electrical and Electronics Engineering; İstanbul Üniversitesi; Department of Electronics Engineering
ASSOC. PROF. DR. OSMAN NURİ UÇAN
- Mimari tasarımda yapay zeka: Evrişimli yapay sinir ağlarının vaziyet planı tasarımında kullanımı
Artificial intelligence in architectural design: The use of convolutional neural networks in site plan design
MUSTAFA KEMAL KAYIŞ
Master's
Turkish
2019
Computer Engineering and Computer Science and Control; İstanbul Teknik Üniversitesi; Department of Real Estate Development
PROF. DR. HAKAN YAMAN
- Large scale wireless propagation channel characterization of air-to-air and air-to-ground drone communications
Hava-hava ve hava-yer drone haberleşmesi için büyük ölçekli kablosuz yayılım kanalı karakterizasyonu
UBEYDULLAH ERDEMİR
Master's
English
2024
Electrical and Electronics Engineering; İstanbul Teknik Üniversitesi; Department of Electronics and Communication Engineering
PROF. DR. HAKAN ALİ ÇIRPAN
- Derin öğrenme ile nöronal veri kodunun çözülmesi
Neuronal data decoding with deep learning
FATMA ÖZCAN
PhD
Turkish
2023
Biophysics; Kahramanmaraş Sütçü İmam Üniversitesi; Department of Electrical and Electronics Engineering
PROF. DR. AHMET ALKAN