<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Machine Learning From Scratch on Standard error</title><link>https://t-redactyl.io/series/machine-learning-from-scratch/</link><description>Recent content in Machine Learning From Scratch on Standard error</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Mon, 02 Mar 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://t-redactyl.io/series/machine-learning-from-scratch/index.xml" rel="self" type="application/rss+xml"/><item><title>Linear regression from scratch in Python</title><link>https://t-redactyl.io/posts/2026-03-02-ols-regression-from-scratch-in-python/</link><pubDate>Mon, 02 Mar 2026 00:00:00 +0000</pubDate><guid>https://t-redactyl.io/posts/2026-03-02-ols-regression-from-scratch-in-python/</guid><description>&lt;p&gt;Almost six years ago (which means this blog is &amp;hellip; old), I wrote what has become &lt;a href="https://t-redactyl.io/posts/2020-07-13-linear-algebra-ols-regression/"&gt;one of my favourite blog posts&lt;/a&gt;, explaining the linear algebra approach to linear regression (specifically OLS, or ordinary least squares regression). Rereading it recently, I realised that I skipped some steps which might not be intuitive for people without a background in linear algebra. In this blog post, I will therefore build the algorithm from scratch in Python, showing you step by step how it works. To really show you what's going on under the hood, I'll be building all methods from scratch rather than using the corresponding NumPy operations.&lt;/p&gt;</description></item></channel></rss>