Neural Networks and Analog Computation : Beyond the Turing Limit
by Hava T. Siegelmann
Humanity's most basic intellectual quest, to decipher nature and master it, has led to numerous efforts to build machines that simulate the world or communicate with it [Bus70, Tur36, MP43, Sha48, vN56, Sha41, Rub89, NK91, Nyc92]. The computational power and dynamic behavior of such machines is a central question for mathematicians, computer scientists, and occasionally, physicists. Our interest is in computers called artificial neural networks. In their most general framework, neural networks consist of assemblies of simple processors, or "neurons," each of which computes a scalar activation function of its input. This activation function is nonlinear, and is typically a monotonic function with bounded range, much like neural responses to input stimuli. The scalar value produced by a neuron affects other neurons, which then calculate a new scalar value of their own. This describes the dynamical behavior of parallel updates. Some of the signals originate from outside the network and act as inputs to the system, while other signals are communicated back to the environment and are thus used to encode the end result of the computation.
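The model described above, where every neuron simultaneously applies a bounded, monotonic nonlinearity to a weighted sum of neuron states and external inputs, can be sketched in a few lines. This is a minimal illustration, not the book's formal construction; the weight matrices `W` and `B`, the bias `c`, and the use of a saturated-linear activation are illustrative assumptions.

```python
import numpy as np

def sigma(x):
    # A monotonic activation with bounded range, here the
    # saturated-linear function clipping values to [0, 1].
    return np.clip(x, 0.0, 1.0)

def step(x, u, W, B, c):
    # One parallel update: all neurons simultaneously compute
    # sigma of a weighted sum of the current neuron states x
    # and the external input signals u, plus a bias c.
    return sigma(W @ x + B @ u + c)

# Hypothetical 2-neuron network with one external input line.
W = np.array([[0.0, 0.5],
              [0.5, 0.0]])   # neuron-to-neuron weights
B = np.array([[1.0],
              [0.0]])        # input-to-neuron weights
c = np.array([0.0, 0.25])    # biases

x = np.zeros(2)              # initial neuron states
u = np.array([1.0])          # constant external input
for _ in range(3):
    x = step(x, u, W, B, c)  # parallel update of all neurons
```

After a few updates the state settles; in a computation, some designated neurons would be read off as the output returned to the environment.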
  • Dimensions: 9.7 x 6.36 x 0.6 inches
  • Print pages: 181
  • Publisher: Birkhauser
  • Publication date: December 1, 1998
  • Language: English
  • ISBN: 9780817639495