Advances in neural information processing systems : proceedings of the ... conference  1 ~ 32, v. 20

edited by David S. Touretzky

The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation. It draws a diverse group of attendees -- physicists, neuroscientists, mathematicians, statisticians, and computer scientists. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning and control, emerging technologies, and applications. Only thirty percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This volume contains all the papers presented at the 2003 conference.

From "Nielsen BookData"

The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation and machine learning. It draws a diverse group of attendees -- physicists, neuroscientists, mathematicians, statisticians, and computer scientists -- interested in theoretical and applied aspects of modeling, simulating, and building neural-like or intelligent systems. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning, and applications. Only twenty-five percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This volume contains the papers presented at the December 2006 meeting, held in Vancouver.

From "Nielsen BookData"

The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation. It draws a diverse group of attendees -- physicists, neuroscientists, mathematicians, statisticians, and computer scientists. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning and control, emerging technologies, and applications. Only twenty-five percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This volume contains the papers presented at the December 2004 conference, held in Vancouver.

From "Nielsen BookData"

The annual Neural Information Processing Systems (NIPS) conference is the flagship meeting on neural computation. It draws a diverse group of attendees--physicists, neuroscientists, mathematicians, statisticians, and computer scientists. The presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, brain imaging, vision, speech and signal processing, reinforcement learning and control, emerging technologies, and applications. Only twenty-five percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This volume contains the papers presented at the December 2005 meeting, held in Vancouver.

From "Nielsen BookData"

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes computer science, neuroscience, statistics, physics, cognitive science, and many branches of engineering, including signal processing and control theory. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. These proceedings contain all of the papers that were presented.

From "Nielsen BookData"

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. It draws preeminent academic researchers from around the world and is widely considered to be a showcase conference for new developments in network algorithms and architectures. The broad range of interdisciplinary research areas represented includes neural networks and genetic algorithms, cognitive science, neuroscience and biology, computer science, AI, applied mathematics, physics, and many branches of engineering. Only about 30% of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. All of the papers presented appear in these proceedings.

From "Nielsen BookData"

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. The conference is interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, vision, speech and signal processing, reinforcement learning and control, implementations, and diverse applications. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. These proceedings contain all of the papers that were presented at the 2000 conference.

From "Nielsen BookData"

The annual conference on Neural Information Processing Systems (NIPS) is the flagship conference on neural computation. The conference is interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, vision, speech and signal processing, reinforcement learning and control, implementations, and diverse applications. Only about 30 percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. These proceedings contain all of the papers that were presented at the 2001 conference.

From "Nielsen BookData"

The past decade has seen greatly increased interaction between theoretical work in neuroscience, cognitive science and information processing, and experimental work requiring sophisticated computational modeling. The 152 contributions in NIPS 8 focus on a wide variety of algorithms and architectures for both supervised and unsupervised learning. They are divided into nine parts: Cognitive Science, Neuroscience, Theory, Algorithms and Architectures, Implementations, Speech and Signal Processing, Vision, Applications, and Control.

Chapters describe how neuroscientists and cognitive scientists use computational models of neural systems to test hypotheses and generate predictions to guide their work. This work includes models of how networks in the owl brainstem could be trained for complex localization function, how cellular activity may underlie rat navigation, how cholinergic modulation may regulate cortical reorganization, and how damage to parietal cortex may result in neglect.

Additional work concerns development of theoretical techniques important for understanding the dynamics of neural systems, including formation of cortical maps, analysis of recurrent networks, and analysis of self-supervised learning. Chapters also describe how engineers and computer scientists have approached problems of pattern recognition or speech recognition using computational architectures inspired by the interaction of populations of neurons within the brain. Examples are new neural network models that have been applied to classical problems, including handwritten character recognition and object recognition, and exciting new work that focuses on building electronic hardware modeled after neural systems.

A Bradford Book

From "Nielsen BookData"

The annual Neural Information Processing Systems (NIPS) meeting is the flagship conference on neural computation. The conference draws a diverse group of attendees--physicists, neuroscientists, mathematicians, statisticians, and computer scientists--and the presentations are interdisciplinary, with contributions in algorithms, learning theory, cognitive science, neuroscience, vision, speech and signal processing, reinforcement learning and control, implementations, and applications. Only about thirty percent of the papers submitted are accepted for presentation at NIPS, so the quality is exceptionally high. This volume contains all the papers presented at the 2002 conference.

From "Nielsen BookData"

[Table of Contents]

  • Part 1 Cognitive science. Part 2 Neuroscience. Part 3 Theory. Part 4 Algorithms and architectures. Part 5 Implementations. Part 6 Speech and signal processing. Part 7 Vision. Part 8 Applications. Part 9 Control.

From "Nielsen BookData"

[Table of Contents]

  • Part 1 Cognitive science: synchronized auditory and cognitive 40 Hz attentional streams and the impact of rhythmic expectations on auditory scene analysis, Bill Baird
  • on parallel versus serial processing - a computational study of visual search, Eyal Cohen, Eytan Ruppin
  • task and spatial frequency effects on face specialization, Matthew N. Dailey, Garrison W. Cottrell
  • neural basis of object-centred representations, Sophie Deneve, Alexandre Pouget
  • a neural network model of naive preference and filial imprinting in the domestic chick, Lucy E. Hadden
  • adaptation in speech motor control, John F. Houde, Michael I. Jordan
  • learning human-like knowledge by singular value decomposition - a progress report, Thomas K. Landauer et al
  • multi-modular associative memory, Nir Levy et al
  • serial order in reading aloud - connectionist models and neighbourhood structure, Jeanne C. Milostan, Garrison W. Cottrell
  • a superadditive-impairment theory of optic aphasia, Michael C. Mozer et al
  • a hippocampal model of recognition memory, Randall C. O'Reilly et al
  • correlates of attention in a model of dynamic visual recognition, Rajesh P.N. Rao
  • recurrent neural networks can learn to implement symbol-sensitive counting, Paul Rodriguez, Janet Wiles
  • comparison of human and machine word recognition, Markus Schenkel et al. Part 2 Neuroscience: coding of naturalistic stimuli by auditory midbrain neurons, Hagai Attias, Christoph E. Schreiner
  • refractoriness and neural precision, Michael J. Berry II, Markus Meister
  • statistical models of conditioning, Peter Dayan, Theresa Long
  • characterizing neurons in the primary auditory cortex of the awake primate using reverse correlation, R. Christopher de Charms, Michael M. Merzenich
  • using Helmholtz machines to analyze multi-channel neuronal recordings, Virginia R. de Sa et al
  • instabilities in eye movement control - a model of periodic alternating nystagmus, Ernst R. Dow, Thomas J. Anastasio
  • hippocampal model of rat spatial abilities using temporal difference learning, David J. Foster et al
  • gradients for retinotectal mapping, Geoffrey J. Goodhill
  • a mathematical model of axon guidance by diffusible factors, Geoffrey J. Goodhill
  • computing with action potentials (invited talk), John J. Hopfield et al
  • a model of early visual processing, Laurent Itti et al
  • perturbative M-sequences for auditory systems identification, Mark Kvale, Christoph E. Schreiner
  • effects of spike timing underlying binocular integration and rivalry in a neural model of early visual cortex, Erik D. Lumer. (Part contents).

From "Nielsen BookData"

Book information

Title: Advances in neural information processing systems : proceedings of the ... conference
Contributors: Alspector, Joshua
Cowan, J. D.
Giles, C. Lee
Hanson, Stephen José
Hanson, Steve J.
Hasselmo, Michael E.
IEEE Conference on Neural Information Processing Systems
Jordan, Michael I.
Kearns, Michael J.
Leen, Todd
Lippmann, Richard P.
Moody, John E.
Mozer, Michael C.
Petsche, Thomas
Schölkopf, Bernhard
Solla, Sara A.
Tesauro, Gerald
Touretzky, David S.
Becker, Suzanna
Bottou, Leon
Cohn, David A.
Dietterich, Thomas G.
Ghahramani, Zoubin
Hofmann, Thomas
Kearns, Michael S.
Lafferty, John
Leen, Todd K.
Müller, Klaus-Robert
Obermayer, Klaus
Platt, John
Saul, Lawrence K.
Thrun, Sebastian
Tresp, Volker
Weiss, Yair
Alternate title: ... annual conference on neural information processing systems ...

Advances in neural information processing systems
Volumes: 1
2
3
4
5
6
7
8
9
10
11
12
13
14
14, v. 1
14, v. 2
15
16
17
18
19
20, v. 1
20, v. 2
20, v. 3
21, v. 1
21, v. 2
21, v. 3
22, v. 1
22, v. 2
22, v. 3
23, v. 1
23, v. 2
23, v. 3
24, v. 1
24, v. 2
24, v. 3
25, v. 1
25, v. 2
25, v. 3
25, v. 4
26, v. 1
26, v. 2
26, v. 3
26, v. 4
27, v. 1
27, v. 2
27, v. 3
27, v. 4
28, v. 1
28, v. 2
28, v. 3
28, v. 4
29, v. 1
29, v. 2
29, v. 3
29, v. 4
29, v. 5
29, v. 6
29, v. 7
30, v. 1
30, v. 2
30, v. 3
30, v. 4
30, v. 5
30, v. 6
30, v. 7
30, v. 8
30, v. 9
30, v. 10
31
31, v. 1
31, v. 2
31, v. 3
31, v. 4
31, v. 5
31, v. 6
31, v. 7
31, v. 8
31, v. 9
31, v. 10
31, v. 11
31, v. 12
31, v. 13
31, v. 14
31, v. 15
32
32, v. 1
32, v. 2
32, v. 3
32, v. 4
32, v. 5
32, v. 6
32, v. 7
32, v. 8
32, v. 9
32, v. 10
32, v. 11
32, v. 12
32, v. 13
32, v. 14
32, v. 15
32, v. 16
32, v. 17
32, v. 18
32, v. 19
32, v. 20
Publisher: M. Kaufmann Publishers
Year of publication: c1989-
Pages: v.
Size: 24-28 cm
ISBN 0262025507
0262042061
026204207X
0262042088
0262100657
0262100762
0262112450
0262122413
0262194503
0262195348
0262195682
0262201046
0262201070
0262201526
0262232537
1558601007
1558601848
1558602224
1558602747
1558603220
1558600159
9781713807933
9781510884472
ISSN 1049-5258
NCID BA06791660
Language: English
Country of publication: United States