## Visualising Networks in ASOIAF - Part II

Posted on October 14, 2018

This is the second post of a character network analysis of George R. R. Martin’s A Song of Ice and Fire (ASOIAF) series, as well as my first submission to the R Bloggers community. A warm welcome to all readers out there! In my first post, I covered the tidygraph package for manipulating graph data frames and ggraph for network visualisation, along with some tricks for fixing node positions when plotting multiple graphs that share the same node set and for labelling nodes based on polar coordinates.

[Read More]
#r
#notes
#visualisation
#graph-theory
#networks
## Visualising Networks in ASOIAF

Posted on September 9, 2018

While waiting for The Winds of Winter to arrive, there is plenty of time to revisit the five books. One of my favourite aspects of the series is the character and world building. As the A Song of Ice and Fire universe is so big, many characters are mentioned only in passing, while the major characters meet each other only occasionally. I thought it would be interesting to see how the various characters are connected and how that progresses through the series.

[Read More]
#r
#notes
#visualisation
#graph-theory
#networks
## Applications of DAGs in Causal Inference

Posted on August 9, 2018

Introduction: Two years ago I came across Pearl’s work on using directed acyclic graphs (DAGs) to model the problem of causal inference, and followed the debate between academics over Pearl’s framework versus Rubin’s potential outcomes framework. I found it quite intriguing, from a scientific methods and history perspective, how two different formal frameworks could be developed to address a common goal. I read a few papers on the DAG approach but, without fully understanding how it could be useful to my work, filed it away in the back of my mind (and computer folder).

[Read More]
#r
#DAGs
#notes
#musings
#causal-inference
## Notes on Regression - Approximation of the Conditional Expectation Function

Posted on February 26, 2018

The final instalment in my ‘Notes on Regression’ series! For a review of the ways to derive the Ordinary Least Squares formula, as well as various algebraic and geometric interpretations, check out the previous five posts:

- Part 1 - OLS by way of minimising the sum of squared errors
- Part 2 - Projection and Orthogonality
- Part 3 - Method of Moments
- Part 4 - Maximum Likelihood
- Part 5 - Singular Value Decomposition

[Read More]
#regression
#ols
#notes
## Notes on Graphs and Spectral Properties

Posted on December 25, 2017

Here is the first in a series of notes which I jotted down over the past two months as I tried to make sense of algebraic graph theory. This one focuses on the basic definitions and some properties of the matrices related to graphs. Having all the symbols and main properties on a single page is a useful reference as I delve deeper into applications of the theory. It also saves me time googling and checking the relationships between these objects.

[Read More]
#graph-theory
#notes
## Choosing a Control Group in an RCT with Multiple Treatment Periods

Posted on November 18, 2017

Came across a fun little problem over the past few weeks that is related to the topic of policy impact evaluation - a long-time interest of mine! Here’s the setting: we have a large population of individuals and a number of treatments whose effectiveness we want to gauge. The treatments are not necessarily the same, but each is targeted towards certain sub-segments of the population. Examples of such situations include online ad targeting or marketing campaigns.

[Read More]
#r
#notes
#simulation
#metrics
## Notes on Regression - Singular Value Decomposition

Posted on October 21, 2017

Here’s a fun take on the OLS that I picked up from The Elements of Statistical Learning. It applies the Singular Value Decomposition, the method that also underlies principal component analysis, to the regression framework.
Singular Value Decomposition (SVD): First, a little background on the SVD. The SVD can be thought of as a generalisation of the eigendecomposition. An eigenvector \(v\) of a matrix \(\mathbf{A}\) is a vector that is mapped to a scaled version of itself: \[ \mathbf{A}v = \lambda v \] where \(\lambda\) is known as the eigenvalue.
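As a quick numerical illustration of this relationship (my own snippet, not code from the post), here is how the eigendecomposition and the SVD line up for a small symmetric matrix, a case where the two coincide up to signs:

```r
# A small symmetric matrix: its eigendecomposition and SVD agree (up to signs)
A <- matrix(c(2, 1,
              1, 2), nrow = 2, byrow = TRUE)

# Eigendecomposition: A v = lambda v
e <- eigen(A)
e$values  # eigenvalues 3 and 1

# SVD: A = U D V'
s <- svd(A)
s$d       # singular values 3 and 1 (|eigenvalues| for a symmetric matrix)

# Reconstruct A from its SVD factors
A_rec <- s$u %*% diag(s$d) %*% t(s$v)
max(abs(A - A_rec))  # numerically zero
```

For a general (non-symmetric, even non-square) matrix the eigendecomposition may not exist, but the SVD always does, which is what makes it useful in the regression setting.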

[Read More]
#regression
#ols
#notes
## Comparing the Population and Group Level Regression

Posted on October 1, 2017

I was planning to write a post that uses region-level data to infer the underlying relationship at the population level. However, after thinking through the issue over the past few days and working out the math (below), I realised that the question I wanted to answer could not be solved using the aggregate data at hand. Nonetheless, here is a formal description of the problem, outlining the assumptions needed to infer population-level trends from more aggregated data.

[Read More]
#regression
#notes
## Notes on Regression - Maximum Likelihood

Posted on September 21, 2017

Part 4 in the series of notes on regression analysis derives the OLS formula through the maximum likelihood approach. Maximum likelihood involves finding the value of the parameters that maximises the probability of the observed data, under an assumed distributional form.
Bernoulli example: Take, for example, a dataset consisting of the results from a series of coin flips. The coin may be biased, and we want to find an estimator for the probability of the coin landing heads.
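To make the idea concrete, here is a small illustrative R snippet (simulated flips, not data from the post) confirming that numerically maximising the Bernoulli likelihood recovers the closed-form MLE, the sample proportion of heads:

```r
set.seed(123)
flips <- rbinom(200, size = 1, prob = 0.7)  # simulated biased coin flips

# Negative log-likelihood of heads-probability p under the Bernoulli model
nll <- function(p) -sum(dbinom(flips, size = 1, prob = p, log = TRUE))

# Numerical minimiser vs the closed-form MLE (the sample mean)
p_opt <- optimise(nll, interval = c(1e-6, 1 - 1e-6))$minimum
p_hat <- mean(flips)
# p_opt and p_hat agree up to the optimiser's tolerance
```

The agreement between `p_opt` and `p_hat` is exactly what the analytical derivation in the post establishes: setting the derivative of the log-likelihood to zero yields the sample mean.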

[Read More]
#regression
#ols
#notes
## Using Leaflet in R - Tutorial

Posted on September 13, 2017

Here’s a tutorial on using Leaflet in R. While the leaflet package supports many options, the documentation is not the clearest and I had to do a bit of googling to customise the plot to my liking. This walkthrough documents the key features of the package which I find useful in generating choropleth overlays. Compared to the simple tmap approach documented in the previous post, creating a visualisation using leaflet gives more control over the final outcome.

[Read More]
#Singapore
#r
#spatial
#visualisation
#notes