Mathematical Neuroscience Laboratory
RIKEN, Tokyo, Japan
Biography: After completing his professorship at The University of Tokyo, Prof. Shun-ichi AMARI moved to The Institute of Physical and Chemical Research (RIKEN), where he holds the positions of vice-president of the Brain Science Institute, director of the Brain-Style Information Systems Group, and team leader of the Mathematical Neuroscience Laboratory. He also serves on the boards of numerous scientific journals and committees.
Abstract: The signals we process are often governed by a probability distribution that is unknown in many cases. Information geometry studies the invariant structures of a manifold of probability distributions, equipped with the Fisher information Riemannian metric and a pair of dual affine connections. These structures are useful for designing signal processing systems, where the generalized Pythagorean and dual projection theorems play a fundamental role. We apply information geometry to obtain useful insights and algorithms for the following problems: 1) pattern classification, 2) robust clustering, 3) learning systems and the natural gradient, 4) systems manifolds, 5) ICA, and 6) sparse signal analysis.
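The natural gradient mentioned in item 3 rescales the ordinary gradient by the inverse Fisher information, making the update invariant to how the model is parametrized. A minimal illustrative sketch (not code from the talk, and the variable names are our own) for the simplest case, fitting the mean of a Gaussian with known variance, where the Fisher information for the mean is F = 1/sigma^2:

```python
import numpy as np

# Hypothetical illustration of natural-gradient descent:
# fit the mean mu of a Gaussian with known variance sigma^2 by descending
# the average negative log-likelihood. The Fisher information for mu is
# F = 1/sigma^2, so the natural gradient multiplies the ordinary gradient
# by F^{-1} = sigma^2.

rng = np.random.default_rng(0)
sigma2 = 4.0
data = rng.normal(loc=3.0, scale=np.sqrt(sigma2), size=1000)

def grad_nll(mu):
    # Ordinary gradient of the average negative log-likelihood w.r.t. mu.
    return (mu - data.mean()) / sigma2

mu_plain, mu_natural = 0.0, 0.0
eta = 1.0
for _ in range(5):
    mu_plain -= eta * grad_nll(mu_plain)               # ordinary gradient
    mu_natural -= eta * sigma2 * grad_nll(mu_natural)  # F^{-1} * gradient

# The natural-gradient iterate jumps to the sample mean in a single step,
# while the ordinary gradient crawls when sigma^2 is large.
print(mu_natural, mu_plain, data.mean())
```

The one-step convergence here is special to this toy model; in general the natural gradient gives a parametrization-invariant steepest-descent direction on the statistical manifold.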
Department of Statistics, University of Oxford, and DeepMind, United Kingdom
Biography: Yee Whye Teh is a Professor of Statistical Machine Learning at the Department of Statistics, University of Oxford, and a Research Scientist at Google DeepMind, with an interest in developing foundational methodologies for statistical machine learning. He is also a European Research Council Consolidator Fellow and an Alan Turing Institute Faculty Fellow.
Abstract: Probabilistic and Bayesian reasoning is one of the principal theoretical pillars of our understanding of machine learning. Over the last two decades, it has inspired a whole range of successful machine learning methods and influenced the thinking of many researchers in the community. On the other hand, in the last few years the rise of deep learning has completely transformed the field and led to a string of phenomenal, era-defining successes. In this talk I will explore the interface between these two perspectives on machine learning and, through a number of projects I have been involved in, address questions such as: Can probabilistic thinking help us understand deep learning methods and lead us to interesting new methods? Conversely, can deep learning technologies help us develop advanced probabilistic methods?
Department of Physics, University of Tokyo, Japan
Biography: Professor at the Department of Physics, University of Tokyo, since 2012. Principal Investigator at the Kavli Institute for the Physics and Mathematics of the Universe and at the Japan Science and Technology Agency.
Abstract: Astronomy has been driving data science for over four hundred years, since Tycho Brahe conducted accurate and comprehensive astronomical and planetary observations, whose data Johannes Kepler used to develop his three laws of planetary motion. Modern telescopes collect an enormous amount of data every night by systematically scanning virtually the entire sky. Analysis of such big data may ultimately provide deep insight into fundamental physics, as well as discoveries of new astronomical objects.
The night sky is dynamic and deep. There are astronomical objects that vary in brightness on timescales of milliseconds to years. There are also distant galaxies that were born at the dawn of the universe. Detecting these objects in big imaging data is a real challenge, but also an excellent playground for modern technologies such as machine learning and statistical inference.
In this talk, I introduce recent developments in observational cosmology and future prospects in its data-science aspects. I present results from our ongoing project that uses Japan's Subaru telescope to detect distant supernovae and galaxies and to probe the distribution of matter in the Universe. Ongoing and future sky surveys will deliver data of exabyte volume. A concerted use of physics, statistics, computer science, simulations, powerful computers, and more will be needed in the coming decade.