<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Posts on Gabriel Dennis</title><link>https://gden173.github.io/gabrieldennis/posts/</link><description>Recent content in Posts on Gabriel Dennis</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Sun, 15 Jan 2023 22:02:22 +1000</lastBuildDate><atom:link href="https://gden173.github.io/gabrieldennis/posts/index.xml" rel="self" type="application/rss+xml"/><item><title>Poisson distribution</title><link>https://gden173.github.io/gabrieldennis/posts/poisson-distribution/</link><pubDate>Sun, 15 Jan 2023 22:02:22 +1000</pubDate><guid>https://gden173.github.io/gabrieldennis/posts/poisson-distribution/</guid><description>Poisson Distribution The general form of the pmf for the Poisson distribution is
$$ f(x; \lambda) = \mathbb{P}(X = x) = \frac{\lambda^x e^{-\lambda}}{x!} $$
In this instance we say that the random variable \(X \sim P(\lambda)\).
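As a quick sanity check (a minimal Python sketch added for illustration; the function name is my own), the pmf above can be evaluated directly and summed over its support:

```python
import math

def poisson_pmf(x: int, lam: float) -> float:
    """P(X = x) for X ~ Poisson(lam): lam^x * exp(-lam) / x!"""
    return lam**x * math.exp(-lam) / math.factorial(x)

# Sanity check: the pmf sums to 1 over the support (truncated at 100,
# which is more than enough mass for lam = 3).
total = sum(poisson_pmf(x, 3.0) for x in range(100))
```

The truncation point is arbitrary; for larger \(\lambda\) the upper limit would need to grow accordingly.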
PGF The probability generating function of a Poisson distribution has the following form
$$ \begin{aligned} \mathcal{G}_{X}(z) &amp;amp;= \sum_{k = 0}^{\infty} z^k\frac{\lambda^k e^{-\lambda}}{k!} \\ &amp;amp;= e^{-\lambda}\sum_{k = 0}^{\infty} \frac{(\lambda z)^{k}}{k!} \\ &amp;amp;= e^{-\lambda}e^{\lambda z} \\ &amp;amp;= e^{\lambda(z - 1)} \end{aligned} $$</description></item><item><title>Honours Thesis</title><link>https://gden173.github.io/gabrieldennis/posts/honours-thesis/</link><pubDate>Sat, 14 Jan 2023 18:39:00 +1000</pubDate><guid>https://gden173.github.io/gabrieldennis/posts/honours-thesis/</guid><description>Honours Thesis This short post contains links to my honours thesis and honours presentation.
My thesis developed a computational framework for a semi-parametric vector generalized linear model, building on the earlier work of my supervisor, who had established the theoretical foundations of this model over a series of papers with other collaborators.
Papers https://www.tandfonline.com/doi/abs/10.1080/01621459.2013.824892 R Package for Univariate Case Univariate model R package CRAN Link Relevant Links The following links contain some of my work on this topic.</description></item><item><title>Quadratic Equation</title><link>https://gden173.github.io/gabrieldennis/posts/quadratic-equation/</link><pubDate>Mon, 26 Sep 2022 21:31:24 +1000</pubDate><guid>https://gden173.github.io/gabrieldennis/posts/quadratic-equation/</guid><description>Introduction In this very short blog post I want to go through the derivation of the quadratic equation as a simple teaching exercise. As most of us are aware from high school, when solving for the solutions to the following equation $$ ax^2 + bx + c = 0,\qquad a \neq 0 $$ one can use the quadratic equation, which has the general form: $$ x_{1, 2} = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a} $$</description></item><item><title>Beta Distribution</title><link>https://gden173.github.io/gabrieldennis/posts/beta-distribution/</link><pubDate>Tue, 29 Mar 2022 22:25:22 +1000</pubDate><guid>https://gden173.github.io/gabrieldennis/posts/beta-distribution/</guid><description>Beta Regression Today at work we had to deal with the problem of regressing on some fractional data, i.e., it lay in the interval \(x \in (0,1)\).
It was suggested that this should be modeled using logistic regression; however, I thought this would be a good opportunity to use a Beta regression model, given the efficiency advantage it would have if the modeled data did indeed originate from a Beta distribution.</description></item><item><title>Normal Distribution</title><link>https://gden173.github.io/gabrieldennis/posts/normal-distribution/</link><pubDate>Tue, 08 Mar 2022 23:44:00 +1000</pubDate><guid>https://gden173.github.io/gabrieldennis/posts/normal-distribution/</guid><description>The Normal Distribution The normal or Gaussian distribution is perhaps the most common distribution occurring in nature. This is due to the special properties it exhibits for larger sample sizes. However, these properties will be the topic of another blog post. Instead, in this post we will simply outline the basic mathematical structure of the normal distribution.
Probability Density Function The probability density function of a \( \mathcal{N}(\mu, \sigma^2) \) normal distribution is</description></item><item><title>Bernoulli and Geometric Distribution PGF</title><link>https://gden173.github.io/gabrieldennis/posts/probability-generating-functions/</link><pubDate>Tue, 08 Mar 2022 22:15:07 +1000</pubDate><guid>https://gden173.github.io/gabrieldennis/posts/probability-generating-functions/</guid><description>In this post we are going to go over the derivation of the probability generating function of the Bernoulli distribution.
Probability Generating Function The probability generating function, often referred to as the PGF, has the following definition
$$ \mathcal{G}_{X} (z) = \mathbb{E}[z^{X}] = \sum_{k = 0}^{\infty} P(X = k)z^k $$
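The definition above can be checked numerically. Here is a minimal Python sketch (an illustration, not part of the original post) applying it to the Bernoulli distribution, whose finite support \(\{0, 1\}\) makes the sum trivial to evaluate:

```python
def bernoulli_pmf(k: int, p: float) -> float:
    """P(X = k) for X ~ Bernoulli(p): 1 - p at k = 0, p at k = 1, else 0."""
    return {0: 1.0 - p, 1: p}.get(k, 0.0)

def pgf(z: float, p: float) -> float:
    """G_X(z) = sum over k of P(X = k) * z^k, taken over the support {0, 1}."""
    return sum(bernoulli_pmf(k, p) * z**k for k in (0, 1))
```

For \(X \sim \text{Bernoulli}(p)\) this reduces to the closed form \(\mathcal{G}_X(z) = (1 - p) + pz\), and \(\mathcal{G}_X(1) = 1\), as must hold for any PGF.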
Bernoulli As can be recalled from previous posts, the Bernoulli distribution has the pmf
$$ f(x) = p^x (1 - p)^{1 - x}, \qquad x \in \{0, 1\} $$</description></item></channel></rss>