1. INTRODUCTION

Neural networks have recently received a great deal of attention in many fields of study. The excitement stems from the fact that these networks are attempts to model the capabilities of the human brain. People are naturally attracted by attempts to create human-like machines, a Frankenstein obsession, if you will. On a practical level, the human brain has many features that are desirable in an electronic computer: the ability to generalize from abstract ideas, recognize patterns in the presence of noise, quickly recall memories, and withstand localized damage. From a statistical perspective, neural networks are interesting because of their potential use in prediction and classification problems.

Neural networks have been used for a wide variety of applications where statistical methods are traditionally employed. They have been used in classification problems such as identifying underwater sonar contacts (Gorman and Sejnowski 1988). Lippmann (1987) provides an excellent overview of neural networks for the signal processing community. There are also a number of good introductory books on neural networks, with Hertz, Krogh, and Palmer (1991) providing a good mathematical description, Smith (1993) explaining backpropagation in an applied setting, and Freeman (1994) using examples and code to explain neural networks. There have also been papers relating neural networks and statistical methods (Buntine and Weigend 1991; Ripley 1992; Sarle 1994; Werbos 1991). One of the best for a general overview is Ripley (1993).
This article intends to provide a short, basic introduction to neural networks for scientists, statisticians, engineers, and professionals with a mathematical and statistical background. We achieve this by contrasting regression models with the most popular neural network tool, a feedforward multilayered network trained using backpropagation. This paper provides an easy-to-understand introduction to neural networks, avoiding the overwhelming complexities of many other papers comparing these techniques.
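To make the contrast concrete, a feedforward network with one hidden layer can be viewed as a nonlinear extension of a linear regression model: the linear predictor is replaced by a linear combination of logistic transformations of linear predictors. The sketch below is illustrative only (it is not the paper's code, and all function and parameter names are ours); it assumes a logistic hidden layer with a linear output, one common choice among several.

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_regression_predict(x, b0, b):
    # Ordinary linear regression: y_hat = b0 + x @ b
    return b0 + x @ b

def sigmoid(z):
    # Logistic activation used for the hidden units
    return 1.0 / (1.0 + np.exp(-z))

def feedforward_predict(x, W, a, c, c0):
    # Single-hidden-layer feedforward network:
    # y_hat = c0 + sigmoid(x @ W + a) @ c
    # Each hidden unit is itself a logistic function of a linear predictor.
    hidden = sigmoid(x @ W + a)   # hidden-layer activations, shape (n, H)
    return c0 + hidden @ c        # linear combination at the output

# Example: n = 5 observations, 3 inputs, H = 4 hidden units
x = rng.normal(size=(5, 3))
b0, b = 0.5, rng.normal(size=3)
W, a = rng.normal(size=(3, 4)), rng.normal(size=4)
c, c0 = rng.normal(size=4), 0.1

print(linear_regression_predict(x, b0, b).shape)   # (5,)
print(feedforward_predict(x, W, a, c, c0).shape)   # (5,)
```

With no hidden layer (or with identity activations), the network collapses to the regression model; the hidden layer is what adds the flexibility to fit nonlinear surfaces, a point developed in Section 4.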
Section 2 discusses the history of neural networks. Section 3 explains the nomenclature unique to the neural network community, and provides a detailed derivation of the backpropagation learning algorithm. Section 4 shows an equivalence between regression and neural networks. It demonstrates the methods on three examples. Two examples are simulated data where the underlying functions are known, and the third is on data from the Department of Veterans Affairs Continuous Improvement in Cardiac Surgery Program (Hammermeister, Johnson, Marshall, and Grover 1994). These examples demonstrate the ideas in the paper, and clarify when one method would be preferred over the other.