<h1 id="notes-on-gaussian-process-title">Notes on Gaussian Process</h1>
<p><em>Xin Chen (chen_xin@g.harvard.edu), 2020-07-29</em></p>
<p>My study notes on Gaussian Process and some useful resources.</p>
<script id="MathJax-script" async="" src="<url-to-your-site>/mathjax/tex-chtml.js"></script>
<h2 id="useful-resources">Useful Resources</h2>
<ul>
<li>
<p>GP Toolbox in Matlab: <a href="http://www.gaussianprocess.org/gpml/code/matlab/doc/">GPML</a>. A GP-themed website is available <a href="http://www.gaussianprocess.org/">here</a>.</p>
</li>
<li>
<p>A good GP textbook: <a href="http://www.gaussianprocess.org/gpml/chapters/RW.pdf">Gaussian Processes for Machine Learning</a>.</p>
</li>
</ul>
<h2 id="notes-on-gaussian-process"><a href="https://xinchen236.github.io/files/ACC2020slides.pdf">Notes on Gaussian Process</a></h2>
<h1 id="i-introduction">I. Introduction</h1>
<p>Gaussian process (GP) is a non-parametric supervised machine learning method that has also been widely used to model nonlinear system dynamics. GP infers an unknown function \(y = f(x)\) from a training set \(\mathcal{D}:= \{(x_i, y_i): i=1,\cdots,n\}\) of \(n\) noisy observations. Compared with other machine learning techniques, GP has the following main merits:</p>
<ul>
<li>
<p>In addition to the predictive mean used as the point prediction, GP provides an estimate of uncertainty (confidence) in the predictions through the predictive variance.</p>
</li>
<li>
<p>GP can work well with small datasets.</p>
</li>
<li>
<p>As a Bayesian learning method, GP incorporates prior domain knowledge of the unknown system through the choice of the kernel covariance function and its hyperparameters.</p>
</li>
</ul>
<p>Formally, a GP is defined as a collection of random variables, any finite number of which have a joint Gaussian distribution. A GP is fully specified by a mean function \(m(x)\) and a (kernel) covariance function \(k(x,x')\), and is denoted as
\begin{align}
f(x)\sim\mathcal{GP}(m(x),k(x,x'))
\end{align}</p>
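To make the definition concrete, here is a minimal sketch (illustrative, not from the original notes; in Python with NumPy, whereas the GPML toolbox linked above is in Matlab) of specifying a GP prior with a zero mean function and the squared-exponential (RBF) kernel \(k(x,x') = \exp(-(x-x')^2/(2\ell^2))\), and drawing sample paths from it. The length scale \(\ell = 1\) and the input grid are arbitrary choices for the example.

```python
# Illustrative sketch: zero-mean GP prior with an RBF kernel.
import numpy as np

def rbf_kernel(x1, x2, length_scale=1.0):
    """Covariance k(x, x') between two scalar inputs."""
    return np.exp(-((x1 - x2) ** 2) / (2.0 * length_scale ** 2))

def cov_matrix(xs1, xs2):
    """Covariance matrix whose (i, j)-component is k(xs1[i], xs2[j])."""
    return np.array([[rbf_kernel(a, b) for b in xs2] for a in xs1])

# Draw a few sample paths f(x) ~ GP(0, k) on a grid of test points.
xs = np.linspace(0.0, 5.0, 50)
K = cov_matrix(xs, xs) + 1e-10 * np.eye(len(xs))  # small jitter for numerical stability
samples = np.random.multivariate_normal(np.zeros(len(xs)), K, size=3)
```

Each row of <code>samples</code> is one function drawn from the prior; under the RBF kernel nearby inputs are strongly correlated, so the draws come out smooth.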
<p>GP regression aims to infer the function value \(f(x_*)\) at a new point \(x_{*}\) based on the observations \(\mathcal{D}\). By the definition above, the collection \((\boldsymbol f_{\mathcal{D}}, f(x_*))\) follows a joint Gaussian distribution with</p>
\[[\boldsymbol f_{\mathcal{D}}; f(x_*)] \sim \mathcal{N} \Big( [ \boldsymbol m_{\mathcal{D}}; m(x_*) ], [ K_{\mathcal{D},\mathcal{D}}, \boldsymbol k_{ *,\mathcal{D}}; \boldsymbol k_{ *,\mathcal{D}}^\top, k(x_*,x_*) ] \Big)\]
<p>where the vector \(\boldsymbol k_{*, \mathcal{D}}:= [ k(x_*,x_1); \cdots; k(x_*, x_n)]\), and the matrix \(K_{\mathcal{D},\mathcal{D}}\) is the covariance matrix whose \((i,j)\)-component is \(k(x_i,x_j)\).
Then, conditioning on the observations \(\mathcal{D}\), the posterior distribution \(f(x_*)\,|\,(\boldsymbol f_{\mathcal{D}} =\boldsymbol y_{\mathcal{D}})\) is also Gaussian, \(\mathcal{N}(\mu_{*|\mathcal{D}}, \sigma^2_{*|\mathcal{D}} )\), with the closed form</p>
<p>\begin{align*}
\mu_{*|\mathcal{D}} & = m(x_*) + \boldsymbol k_{*,\mathcal{D}}^\top K_{\mathcal{D},\mathcal{D}}^{-1} \left(\boldsymbol y_{\mathcal{D}} - \boldsymbol m_{\mathcal{D}}\right) \\
\sigma^2_{*|\mathcal{D}} & = k(x_*,x_*) - \boldsymbol k_{*,\mathcal{D}}^\top K_{\mathcal{D},\mathcal{D}}^{-1} \boldsymbol k_{*,\mathcal{D}}
\end{align*}</p>
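The closed-form posterior above can be sketched in code as follows, assuming for simplicity a zero prior mean \(m(x)\equiv 0\), noise-free conditioning (no observation-noise term added to \(K_{\mathcal{D},\mathcal{D}}\)), and the illustrative RBF kernel; the training points and length scale below are made up for the example.

```python
# Illustrative sketch of GP posterior inference at a single test point.
import numpy as np

def rbf(a, b, length_scale=1.0):
    """RBF covariance between (arrays of) inputs a and b."""
    return np.exp(-np.subtract.outer(a, b) ** 2 / (2.0 * length_scale ** 2))

def gp_posterior(x_star, xs_train, ys_train):
    """Posterior mean and variance of f(x_star) given D = (xs_train, ys_train)."""
    K = rbf(xs_train, xs_train)                               # K_{D,D}
    k_star = rbf(x_star, xs_train)                            # k_{*,D}
    K_inv = np.linalg.inv(K + 1e-10 * np.eye(len(xs_train)))  # jitter for stability
    mu = k_star @ K_inv @ ys_train                            # posterior mean (m = 0 assumed)
    var = rbf(x_star, x_star) - k_star @ K_inv @ k_star       # posterior variance
    return mu, var

xs_train = np.array([0.0, 1.0, 2.0])
ys_train = np.sin(xs_train)
mu, var = gp_posterior(1.0, xs_train, ys_train)
# At a training input the noise-free posterior collapses onto the observed
# value: mu is close to sin(1.0) and var is close to 0.
```

This directly mirrors the two formulas: the mean correction \(\boldsymbol k_{*,\mathcal{D}}^\top K_{\mathcal{D},\mathcal{D}}^{-1}\boldsymbol y_{\mathcal{D}}\) and the variance reduction from the prior \(k(x_*,x_*)\). (In practice one would use a Cholesky solve rather than an explicit inverse, and add the noise variance to the diagonal of \(K_{\mathcal{D},\mathcal{D}}\).)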