
  • WHAT MY DEEP MODEL DOESN'T KNOW...
    I recently spent some time trying to understand why dropout deep learning models work so well – trying to relate them to new research from the last couple of years. I was quite surprised to see how close these were to Gaussian processes. I was even more surprised to see that we can get uncertainty information from these deep learning models for free – without changing a thing.
    [Post]

  • DROPOUT AS A BAYESIAN APPROXIMATION
    05/06/2015

    Two new papers on dropout as a Bayesian approximation, with applications to model uncertainty in deep learning [1] and Bayesian convolutional neural networks [2], have been added to the publications list.

SOFTWARE

OPEN SOURCE PROJECTS I'M CURRENTLY WORKING ON

VSSGP
An implementation of the Variational Sparse Spectrum Gaussian Process using Theano (a Python package for symbolic differentiation).
[Software] [Paper]

CLGP
An implementation of the Categorical Latent Gaussian Process using Theano (a Python package for symbolic differentiation).
[Software] [Paper]

GPARML
A light-weight and minimal Python implementation of parallel inference for the Bayesian Gaussian process latent variable model and GP regression.
[Software] [Paper]

UNIVERSITY OF CAMBRIDGE PRESENTATION TEMPLATE
This is a presentation template with the colour scheme of the University of Cambridge. The beamer template is based on cambridge-beamer, with changes to the colour scheme and page layout.
[Software] [Example]

GIZA#
An optimised C++ extension of Giza++ (a word alignment software package) implementing the hierarchical Pitman-Yor process alignment models.
[Software] [Paper]

PUBLICATIONS

BAYESIAN CONVOLUTIONAL NEURAL NETWORKS WITH BERNOULLI APPROXIMATE VARIATIONAL INFERENCE


We present an efficient Bayesian convolutional neural network (convnet). The model offers better robustness to over-fitting on small data and achieves a considerable improvement in classification accuracy compared to previous approaches. We give state-of-the-art results on CIFAR-10 following our insights.
Yarin Gal, Zoubin Ghahramani
In submission, 2015 [arXiv] [BibTex]

DROPOUT AS A BAYESIAN APPROXIMATION: REPRESENTING MODEL UNCERTAINTY IN DEEP LEARNING

We show that dropout in multilayer perceptron models (MLPs) can be interpreted as a Bayesian approximation. Results are obtained for modelling uncertainty for dropout MLP models - extracting information that has been thrown away so far, from existing models. This mitigates the problem of representing uncertainty in deep learning without sacrificing computational performance or test accuracy.
Yarin Gal, Zoubin Ghahramani
In submission, 2015 [arXiv] [Appendix] [BibTex]
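The "free" uncertainty described above can be sketched in a few lines: keep dropout switched on at test time and average several stochastic forward passes; the sample mean gives the prediction and the sample variance an uncertainty estimate. The tiny one-hidden-layer network and its weights below are hypothetical, made up purely so the sketch runs; this is an illustration of the idea, not the paper's implementation.

```python
import math
import random

def forward(x, w1, w2, p_drop, rng):
    """One stochastic forward pass of a tiny one-hidden-layer net,
    with dropout kept ON at test time (Bernoulli mask on hidden units)."""
    hidden = []
    for w_row in w1:
        h = math.tanh(sum(wi * xi for wi, xi in zip(w_row, x)))
        # Bernoulli dropout: keep a unit with probability 1 - p_drop
        mask = 1.0 if rng.random() > p_drop else 0.0
        hidden.append(h * mask / (1.0 - p_drop))  # inverted-dropout scaling
    return sum(wi * hi for wi, hi in zip(w2, hidden))

def mc_dropout_predict(x, w1, w2, p_drop=0.5, n_samples=200, seed=0):
    """Monte Carlo dropout: average many stochastic forward passes.
    Returns (predictive mean, predictive variance)."""
    rng = random.Random(seed)
    ys = [forward(x, w1, w2, p_drop, rng) for _ in range(n_samples)]
    mean = sum(ys) / n_samples
    var = sum((y - mean) ** 2 for y in ys) / n_samples
    return mean, var

# Hypothetical fixed weights, chosen only to make the sketch runnable.
w1 = [[0.5, -0.3], [0.1, 0.8], [-0.6, 0.2]]
w2 = [0.4, -0.7, 0.3]
mean, var = mc_dropout_predict([1.0, 2.0], w1, w2)
```

The non-zero variance across passes is exactly the uncertainty information that a single deterministic forward pass throws away.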

DROPOUT AS A BAYESIAN APPROXIMATION: INSIGHTS AND APPLICATIONS

Deep learning techniques lack the ability to reason about uncertainty over the features. We show that a multilayer perceptron (MLP) with arbitrary depth and non-linearities, with dropout applied after every weight layer, is mathematically equivalent to an approximation to a well known Bayesian model. This paper is a short version of the appendix of "Dropout as a Bayesian Approximation: Representing Model Uncertainty in Deep Learning".
Yarin Gal, Zoubin Ghahramani
Deep Learning Workshop, ICML, 2015 [PDF] [BibTex]

AN INFINITE PRODUCT OF SPARSE CHINESE RESTAURANT PROCESSES

We define a new process that gives a natural generalisation of the Indian buffet process (used for binary feature allocation) into categorical latent features. For this we take advantage of different limit parametrisations of the Dirichlet process and its generalisation the Pitman–Yor process.
Yarin Gal, Tomoharu Iwata, Zoubin Ghahramani
10th Conference on Bayesian Nonparametrics (BNP), 2015 [Talk] [BibTex]
We thank BNP for the travel award.
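As background for the constructions above, the basic Chinese restaurant process (the sequential partition view of the Dirichlet process, which the Pitman–Yor process generalises) can be sampled in a few lines. This is the standard textbook construction, not code from the paper.

```python
import random

def crp_sample(n_customers, alpha, seed=0):
    """Sample a partition from the Chinese restaurant process:
    customer i joins existing table k with probability proportional
    to its occupancy, or opens a new table with prob. proportional
    to the concentration parameter alpha."""
    rng = random.Random(seed)
    tables = []       # tables[k] = number of customers at table k
    assignments = []  # table index of each customer, in arrival order
    for i in range(n_customers):
        weights = tables + [alpha]          # existing tables, then "new table"
        r = rng.random() * (i + alpha)      # total weight is i + alpha
        acc = 0.0
        for k, w in enumerate(weights):
            acc += w
            if r < acc:
                break
        if k == len(tables):
            tables.append(1)                # open a new table
        else:
            tables[k] += 1                  # join an existing one
        assignments.append(k)
    return assignments

part = crp_sample(100, alpha=1.0)
```

With larger `alpha` the partition fragments into more, smaller blocks; the rich-get-richer weighting on occupied tables is what produces the power-law-like cluster sizes.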

IMPROVING THE GAUSSIAN PROCESS SPARSE SPECTRUM APPROXIMATION BY REPRESENTING UNCERTAINTY IN FREQUENCY INPUTS

Standard sparse pseudo-input approximations to the Gaussian process (GP) cannot handle complex functions well. Sparse spectrum alternatives attempt to answer this but are known to over-fit. We use variational inference for the sparse spectrum approximation to avoid both issues. We extend the approximate inference to the distributed and stochastic domains.
Yarin Gal, Richard Turner
ICML, 2015 [PDF] [Software] [BibTex]

LATENT GAUSSIAN PROCESSES FOR DISTRIBUTION ESTIMATION OF MULTIVARIATE CATEGORICAL DATA

Multivariate categorical data occur in many applications of machine learning. One of the main difficulties with these vectors of categorical variables is sparsity: the number of possible observations grows exponentially with vector length, but dataset diversity might be poor in comparison. Recent models have achieved significant improvements on supervised tasks with such data by embedding observations in a continuous space to capture similarities between them. Building on these ideas, we propose a Bayesian model for the unsupervised task of distribution estimation of multivariate categorical data.
Yarin Gal, Yutian Chen, Zoubin Ghahramani
Workshop on Advances in Variational Inference, NIPS, 2014 [PDF] [Poster] [Presentation] [BibTex]
ICML, 2015 [PDF] [Software] [BibTex]
We thank Google DeepMind for the travel award.

DISTRIBUTED VARIATIONAL INFERENCE IN SPARSE GAUSSIAN PROCESS REGRESSION AND LATENT VARIABLE MODELS

We develop parallel inference for sparse Gaussian process regression and latent variable models. These processes are used to model functions in a principled way and for non-linear dimensionality reduction in linear time complexity. Using parallel inference we allow the models to work on much larger datasets than before.
Yarin Gal, Mark van der Wilk, Carl E. Rasmussen
Workshop on New Learning Models and Frameworks for Big Data, ICML, 2014
[arXiv] [Presentation] [Software] [BibTex]
NIPS, 2014 [PDF] [BibTex]
We thank NIPS for the travel award.

FEATURE PARTITIONS AND MULTI-VIEW CLUSTERINGS

We define a new combinatorial structure that unifies Kingman's random partitions and Broderick, Pitman, and Jordan's feature frequency models. This structure underlies non-parametric multi-view clustering models, where data points are simultaneously clustered into different possible clusterings. The de Finetti measure is a product of paintbox constructions. Studying the properties of feature partitions allows us to understand the relations between the models they underlie and share algorithmic insights between them.
Yarin Gal, Zoubin Ghahramani
International Society for Bayesian Analysis (ISBA), 2014 [Link] [Poster]
We thank ISBA for the travel award.

DIRICHLET FRAGMENTATION PROCESSES

We introduce a new class of models over trees based on the theory of fragmentation processes. The Dirichlet Fragmentation Process Mixture Model is an example model derived from this new class. This model has efficient and simple inference, and significantly outperforms existing approaches for hierarchical clustering and density modelling.
Hong Ge, Yarin Gal, Zoubin Ghahramani
In submission, 2014 [PDF] [BibTex]

PITFALLS IN THE USE OF PARALLEL INFERENCE FOR THE DIRICHLET PROCESS

We show that the recently suggested parallel inference for the Dirichlet process is conceptually invalid. The Dirichlet process is important for many fields such as natural language processing. However, the suggested inference would not work in most real-world applications.
Yarin Gal, Zoubin Ghahramani
Workshop on Big Learning, NIPS, 2013 [PDF] [Presentation] [BibTex]
ICML, 2014 [PDF] [Talk] [Presentation] [Poster] [BibTex]

VARIATIONAL INFERENCE IN THE GAUSSIAN PROCESS LATENT VARIABLE MODEL AND SPARSE GP REGRESSION – A GENTLE TUTORIAL

We present an in-depth and self-contained tutorial for sparse Gaussian Process (GP) regression. We also explain GP latent variable models, a tool for non-linear dimensionality reduction. The sparse approximation reduces the time complexity of the models from cubic to linear but its development is scattered across the literature. The various results are collected here.
Yarin Gal, Mark van der Wilk
Tutorial, 2014 [arXiv] [BibTex]
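For context, the exact GP regression predictive equations that the sparse approximation targets are the standard ones (assuming Gaussian observation noise $\sigma^2$ and kernel matrix $K$ over the $n$ training inputs):

```latex
\mu_* = k_*^\top \left(K + \sigma^2 I\right)^{-1} \mathbf{y},
\qquad
\sigma_*^2 = k(\mathbf{x}_*, \mathbf{x}_*) - k_*^\top \left(K + \sigma^2 I\right)^{-1} k_* .
```

Solving the $n \times n$ linear system costs $\mathcal{O}(n^3)$; sparse approximations with $m \ll n$ inducing points replace the $n \times n$ algebra with $m \times m$ algebra at $\mathcal{O}(nm^2)$ cost, which is the "cubic to linear" reduction in $n$ mentioned above.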

SEMANTICS, MODELLING, AND THE PROBLEM OF REPRESENTATION OF MEANING – A BRIEF SURVEY OF RECENT LITERATURE

Over the past 50 years many have debated what representation should be used to capture the meaning of natural language utterances. Recently, research has raised new requirements for such representations. Here I survey some of the interesting representations suggested to meet these new needs.
Yarin Gal
Literature survey, 2013 [arXiv] [BibTex]

A SYSTEMATIC BAYESIAN TREATMENT OF THE IBM ALIGNMENT MODELS

We used a non-parametric process in models that align words between pairs of sentences. These alignment models are used at the core of all machine translation systems. We obtained a significant improvement in translation using the process.
Yarin Gal, Phil Blunsom
Association for Computational Linguistics (NAACL), 2013 [PDF] [Presentation] [BibTex]

RELAXING HMM ALIGNMENT MODEL ASSUMPTIONS FOR MACHINE TRANSLATION USING A BAYESIAN APPROACH

We used a non-parametric process to relax some of the restricting assumptions often used in machine translation. When a long history of translation words is not available, the process falls back onto shorter histories in a principled way.
Yarin Gal
Master's Dissertation, 2012 [PDF] [BibTex]

OVERCOMING ALPHA-BETA LIMITATIONS USING EVOLVED ARTIFICIAL NEURAL NETWORKS

We trained a feed-forward neural network to play checkers. The network acts as both the value function for a min-max algorithm and a heuristic for pruning tree branches in a reinforcement learning setting. We used no supervised signal for training: a set of networks was assessed by playing against each other, and the winning networks' weights were changed slightly.
Yarin Gal, Mireille Avigal
Machine Learning and Applications (IEEE), 2010 [PDF] [BibTex]

TALKS


LATENT GAUSSIAN PROCESSES FOR DISTRIBUTION ESTIMATION OF MULTIVARIATE CATEGORICAL DATA

We discuss the issues with representing high dimensional vectors of discrete variables, and existing models that attempt to estimate the distribution of such. We then present our approach which relies on a continuous latent representation for the discrete data.
Yarin Gal
Invited talk: Microsoft Research, Cambridge, 2015
Invited talk: NTT Labs, Kyoto, Japan, 2015
[Presentation] [Video]

REPRESENTATIONS OF MEANING

We discuss various formal representations of meaning, including Gentzen sequent calculus, vector spaces over the real numbers, and symmetric closed monoidal categories.
Yarin Gal
Invited talk: Trinity College Mathematical Society, University of Cambridge, 2015 [Presentation]

SYMBOLIC DIFFERENTIATION FOR RAPID MODEL PROTOTYPING IN MACHINE LEARNING AND DATA ANALYSIS – A HANDS-ON TUTORIAL

We talk about the theory of symbolic differentiation and demonstrate its use through the Theano Python package. We give two example models, logistic regression and a deep network, and then discuss rapid prototyping of probabilistic models with stochastic variational inference (SVI). The talk is based in part on the Theano online tutorial.
Yarin Gal
MLG Seminar, 2014 [Presentation]
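Theano itself is the tool the tutorial covers; as a tool-agnostic illustration of the underlying idea, here is a minimal forward-mode automatic differentiation sketch in pure Python using dual numbers. Each arithmetic operation propagates the derivative alongside the value, so gradients come out of the program mechanically, with no hand-derived calculus. This is my own toy example, not Theano's API.

```python
class Dual:
    """Forward-mode automatic differentiation with dual numbers:
    carry a value and its derivative together through every operation."""

    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)

    __rmul__ = __mul__

def grad(f, x):
    """Derivative of f at x, computed by seeding dx/dx = 1."""
    return f(Dual(x, 1.0)).deriv

# d/dx (3x^2 + 2x) = 6x + 2, so at x = 2.0 the derivative is 14.0.
print(grad(lambda x: 3 * x * x + 2 * x, 2.0))  # prints 14.0
```

Theano does the reverse-mode, graph-based version of this at scale, compiling the symbolic graph to fast native code, but the principle of mechanically composing local derivative rules is the same.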

RAPID PROTOTYPING OF PROBABILISTIC MODELS USING STOCHASTIC VARIATIONAL INFERENCE

In data analysis we often have to develop new models, which can be a lengthy process. We need to derive appropriate inference, which often involves cumbersome implementation that changes regularly. Rapid prototyping answers similar problems in manufacturing, where it is used for the quick fabrication of scale models of physical parts. We present stochastic variational inference (SVI) as a tool for rapid prototyping of probabilistic models.
Yarin Gal
Short talk, 2014 [Presentation]
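The core SVI update (in the standard form of Hoffman et al.) replaces the full-data coordinate update of the global variational parameter $\lambda$ with a noisy, minibatch-based step:

```latex
\lambda^{(t)} = (1 - \rho_t)\,\lambda^{(t-1)} + \rho_t\,\hat{\lambda}_t,
\qquad
\rho_t = (t + \tau)^{-\kappa}, \quad \kappa \in (0.5, 1],
```

where $\hat{\lambda}_t$ is the optimal global parameter computed as if the sampled minibatch were replicated to the full dataset size, and the step sizes $\rho_t$ satisfy the Robbins–Monro conditions. This is what makes prototyping rapid: only the minibatch update needs deriving, and the data never has to fit in memory.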

DISTRIBUTED INFERENCE IN BAYESIAN NONPARAMETRICS – THE DIRICHLET PROCESS AND THE GAUSSIAN PROCESS

I present distributed inference methodologies for two major processes in Bayesian nonparametrics. Pitfalls in the use of parallel inference for the Dirichlet process are discussed, and distributed variational inference in sparse Gaussian process regression and latent variable models is presented.
Yarin Gal
Invited talk: NTT Labs, Kyoto, Japan, 2014

EMERGENT COMMUNICATION FOR COLLABORATIVE REINFORCEMENT LEARNING

Slides from a seminar introducing collaborative reinforcement learning and how learning communication can improve collaboration. We use game theory to motivate the use of collaboration in a multi-agent setting. We then define multi-agent and decentralised multi-agent Markov decision processes. We discuss issues with these definitions and possible ways to overcome them. We then transition to emergent languages. We explain how the use of an emergent communication protocol could aid in collaborative reinforcement learning. Reviewing a range of emergent communication models developed from a linguistic motivation to a pragmatic view, we finish with an assessment of the problems left unanswered in the field.
Yarin Gal, Rowan McAllister
MLG Seminar, 2014 [Presentation]
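For reference, the multi-agent MDP mentioned above is conventionally written as a tuple (this is the standard textbook formulation, not notation taken from the slides):

```latex
\mathcal{M} = \langle S, \{A_i\}_{i=1}^{n}, T, R \rangle,
\qquad
T(s' \mid s, a_1, \dots, a_n), \quad R : S \times A_1 \times \dots \times A_n \to \mathbb{R},
```

where all $n$ agents share the state $s$ and the common reward $R$. The decentralised variants restrict each agent $i$ to a local observation of $s$, which is where the coordination difficulties discussed in the talk arise, and where communication can help.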

THE BOREL–KOLMOGOROV PARADOX

Slides from a short talk explaining the Borel–Kolmogorov paradox, alluding to possible pitfalls in probabilistic modelling. The slides are partly based on Jaynes, E. T. (2003), "Probability Theory: The Logic of Science".
Yarin Gal
Short talk, 2014 [Presentation]
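A concrete form of the paradox (a standard illustration, in the spirit of Jaynes's treatment): conditioning on the measure-zero event $Y = 0$ gives different answers depending on which family of positive-probability events shrinks to it,

```latex
p(x \mid |Y| \le \varepsilon) \xrightarrow{\varepsilon \to 0}
\frac{p(x, 0)}{\int p(x', 0)\, dx'},
\qquad
p(x \mid |Y/X| \le \varepsilon) \xrightarrow{\varepsilon \to 0}
\frac{|x|\, p(x, 0)}{\int |x'|\, p(x', 0)\, dx'} .
```

Both limits are legitimate, yet they disagree: a conditional density on a null set is only defined relative to a limiting operation, which is the pitfall for probabilistic modelling the talk alludes to.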

BAYESIAN NONPARAMETRICS IN REAL-WORLD APPLICATIONS: STATISTICAL MACHINE TRANSLATION AND LANGUAGE MODELLING ON BIG DATASETS

Slides from a seminar introducing statistical machine translation and language modelling as real-world applications of Bayesian nonparametrics. We give a friendly introduction to statistical machine translation and language modelling, and then describe how recent developments in the field of Bayesian nonparametrics can be exploited for these tasks. The first part of the presentation is based on the lecture notes by Dr Phil Blunsom.
Yarin Gal
MLG Seminar, 2013 [Presentation]
