From 7d17bcf0b44a30d8269495242e9263fde1d726e1 Mon Sep 17 00:00:00 2001
From: Pineal Servo
Date: Sat, 14 Sep 2013 22:25:10 -0600
Subject: [PATCH] An un-summarized change.

---
 scratch.page | 22 +++++++++++++++++++++-
 1 file changed, 21 insertions(+), 1 deletion(-)

diff --git a/scratch.page b/scratch.page
index a65e0ce..76f7f4f 100644
--- a/scratch.page
+++ b/scratch.page
@@ -18,6 +18,26 @@ the RSS values of each of the vector elements
 using $C_i$ as the weight. It is calculated in this manner:
 
 $$
-\sigma_{F_s} = \frac{\sum_{i \in F_s} \sigma_i C_i}
+\sigma_{F_s} = \frac{\sum_{i \in F_s} \sigma_i\, C_i}
 {\sum_{i \in F_s} C_i}
 $$
+# Bayesian Regression
+
+First, specify a set of probabilistic models of the data.
+
+Let a member of this set be denoted by $\mathcal{H}_\alpha$.
+
+$\mathcal{H}_\alpha$ has a *prior* probability $P(\mathcal{H}_\alpha)$.
+
+On observation of the data $\mathcal{D}$, the *likelihood* of hypothesis
+$\mathcal{H}_{\alpha}$ is
+$P(\mathcal{D}|\mathcal{H}_{\alpha})$.
+
+The *posterior* probability of $\mathcal{H}_{\alpha}$ is then proportional to
+$P(\mathcal{H}_{\alpha})\, P(\mathcal{D}|\mathcal{H}_{\alpha})$.
+
+This follows from **Bayes' Theorem**, which says
+
+$$
+P(A|B) = \frac{P(B | A)\, P(A)}{P(B)}
+$$
\ No newline at end of file
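
For reference, a sketch of the normalized form of the posterior implied by the patch above, assuming the hypothesis set is discrete so that the evidence $P(\mathcal{D})$ can be written as a sum over all hypotheses (the index $\beta$ is introduced only for this illustration):

$$
P(\mathcal{H}_\alpha | \mathcal{D}) = \frac{P(\mathcal{H}_\alpha)\, P(\mathcal{D} | \mathcal{H}_\alpha)}
{\sum_{\beta} P(\mathcal{H}_\beta)\, P(\mathcal{D} | \mathcal{H}_\beta)}
$$

Since the denominator is the same for every hypothesis, the posterior is proportional to the prior times the likelihood, which is the statement made in the patch.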