Bayesian models in R

If there is one thing that has always frustrated me, it is not fully understanding Bayesian inference. Sometime last year, I came across an article about a TensorFlow-supported R package for Bayesian analysis, called greta. Back then, I searched for greta tutorials and stumbled on this blog post that praised a textbook called Statistical Rethinking: A Bayesian Course with Examples in … Continue reading Bayesian models in R


The tidy caret interface in R

Among the most popular off-the-shelf machine learning packages available for R, caret ought to stand out for its consistency. It reaches out to a wide range of dependencies that deploy and support model building using a uniform, simple syntax. I have been using caret extensively for the past three years, with a previous partial least squares (PLS) tutorial in … Continue reading The tidy caret interface in R

Linear mixed-effect models in R

Statistical models generally assume that:

1. All observations are independent from each other
2. The distribution of the residuals follows $\mathcal{N}(0, \sigma^2)$, irrespective of the values taken by the dependent variable y

When either of the two assumptions is violated, more sophisticated modelling approaches are necessary. Let's consider two hypothetical problems that violate the two respective assumptions, … Continue reading Linear mixed-effect models in R
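The contrast between the two settings can be sketched in equations. The first line is the ordinary linear model with i.i.d. normal residuals; the second is the standard linear mixed-effect formulation, where the random effects $u$ (with design matrix $Z$ and covariance $G$) induce correlation among observations that share a grouping level. The symbols follow textbook convention, not the post itself:

```latex
% Ordinary linear model: independent, homoscedastic normal residuals
y = X\beta + \varepsilon,
\qquad \varepsilon \sim \mathcal{N}(0, \sigma^2 I)

% Linear mixed-effect model: random effects u add a second
% source of variation, correlating observations within groups
y = X\beta + Zu + \varepsilon,
\qquad u \sim \mathcal{N}(0, G),
\quad \varepsilon \sim \mathcal{N}(0, \sigma^2 I)
```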

Partial least squares in R

My last entry introduces principal component analysis (PCA), one of many unsupervised learning tools. I concluded the post with a demonstration of principal component regression (PCR), which essentially is an ordinary least squares (OLS) fit using the first $k$ principal components (PCs) from the predictors. This brings about many advantages: There is virtually no … Continue reading Partial least squares in R
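The PCR recipe described above — take the first $k$ PCs of the predictors, then run OLS on them — can be sketched with base R alone. The dataset (mtcars) and the choice of k are illustrative assumptions, not taken from the post:

```r
# Principal component regression (PCR) sketch, base R only:
# OLS on the first k principal components of the scaled predictors.
X <- scale(as.matrix(mtcars[, -1]))  # predictors, centred and scaled
y <- mtcars$mpg                      # response

pca <- prcomp(X)                     # PCs of the predictor matrix
k <- 3                               # illustrative number of components
scores <- pca$x[, 1:k]               # first k component scores

pcr_fit <- lm(y ~ scores)            # OLS fit on the k PCs
summary(pcr_fit)$r.squared           # variance in y explained by k PCs
```

Because the scores are orthogonal, dropping or adding a component does not change the coefficients of the others — one of the advantages the post alludes to.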

Principal Component Analysis in R

Principal component analysis (PCA) is routinely employed on a wide range of problems. From the detection of outliers to predictive modeling, PCA has the ability to project the observations described by $p$ variables into a few orthogonal components defined where the data 'stretch' the most, rendering a simplified overview. PCA is particularly powerful in dealing with multicollinearity … Continue reading Principal Component Analysis in R
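A minimal base-R sketch of that projection, using the built-in iris measurements as an illustrative stand-in for the $p$ variables: `prcomp` returns orthonormal loadings, so the components are mutually orthogonal by construction.

```r
# PCA sketch with base R: project observations described by p = 4
# standardised variables onto orthogonal components (iris is an
# illustrative choice, not from the post).
X <- scale(iris[, 1:4])
pca <- prcomp(X)

# The loadings (rotation matrix) are orthonormal:
round(crossprod(pca$rotation), 10)   # ~ 4x4 identity matrix

# Share of total variance captured by each component
summary(pca)$importance["Proportion of Variance", ]
```

Inspecting the proportion of variance per component is the usual way to decide how few components still give a faithful simplified overview.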