Breathing in the crisp autumn air and staring out over what had been his home, neighborhood, close friend, and therapist for the past two years, Henry David Thoreau lightly closed the fragile cedar-wood door of his cabin for the last time.
A few years ago, I remember being faced with an introductory physics problem. It was the first time I had encountered the idea of acceleration, and I was puzzled by the notation for it.
This is a continuation of my Linear Algebra series, which should be viewed as an extra resource while going along with Gilbert Strang’s class 18.06 on OCW. This can be closely matched to Lecture 16 in his series.
This article requires an understanding of the four fundamental subspaces of a matrix, projection, projection of vectors onto planes, projection matrices, orthogonality, orthogonality of subspaces, elimination, transposes, and inverses. I would highly recommend understanding everything in Lecture 15.
In a previous article, I wrote about fitting a line to data points on a two-dimensional plane in the context of linear regression with gradient descent…
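As a rough sketch of the kind of fit that article builds toward, here is a minimal NumPy gradient-descent loop for a line y = mx + b. The toy data, learning rate, and iteration count are my own placeholders for illustration, not values taken from the article.

```python
import numpy as np

# Toy data scattered around the line y = 2x + 1 (made up for illustration).
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 50)
y = 2 * x + 1 + rng.normal(scale=0.5, size=x.shape)

m, b = 0.0, 0.0   # slope and intercept, both initialized at zero
lr = 0.01         # learning rate (a hypothetical choice)

for _ in range(2000):
    error = m * x + b - y
    # Gradients of the mean squared error with respect to m and b.
    grad_m = 2 * np.mean(error * x)
    grad_b = 2 * np.mean(error)
    m -= lr * grad_m
    b -= lr * grad_b

print(f"fitted line: y = {m:.2f}x + {b:.2f}")
```

The loop repeatedly nudges the slope and intercept opposite to the gradient of the squared error, which is the core idea behind fitting a line with gradient descent.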
James Watson and Francis Crick first published their groundbreaking study on the double-helix structure of deoxyribonucleic acid (DNA) and how it expresses our genes, which until then had been mere abstractions lacking a concrete biological definition.
Since then, we have discovered exactly how amino acids, and by extension proteins, form from different combinations of nucleotide base pairs ordered along our DNA, which resides in the nucleus of all our 3 trillion non-red blood cells.
The entirety of our DNA is a single string of the nucleotide bases Adenine, Thymine, Guanine and Cytosine (A, T, G, C)* which, in trios (e.g. TCA)…
The 1960s were a time of great expectations in the still-youthful artificial intelligence community. AI was not yet industry-ready, but it did not fail to capture the attention of both the scientific community and the public.
Machine learning, although first suggested in a paper by Alan Turing in 1950, did not come to fruition until abundant computing power became available in the ’90s and 2000s. For most of AI’s 75-year history, artificial intelligence was not machine learning, but rule-based or expert systems. …
It was hard to predict just how influential the theory of natural selection would be when it first came to a young Charles Darwin in 1837, six months after his return from his pivotal journey on The Beagle.
Scribbling in his journal with the vigor and penmanship of a madman, he drew his first evolutionary tree, with the words “I think” directly above.
Twenty-one years later, On the Origin of Species was published, and the way in which we viewed psychology, philosophy, biology, and our place (or lack thereof) in nature was challenged, and eventually changed, forever.
Evolutionary biology, which found…
This is a continuation of my Linear Algebra series, which should be viewed as an extra resource while going along with Gilbert Strang’s class 18.06 on OCW. This can be closely matched to Lectures 9 and 10 in his series.
Today we tackle a topic that we’ve already seen but haven’t discussed formally. It is possibly the most important idea on this side of linear algebra: the rank of a matrix. The two other ideas, basis and dimension, will mostly fall out of it.
To put it simply, the rank of a matrix represents…
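For a quick numerical sanity check (my own example, not one from the article), NumPy can report the rank directly. Here the third row is the sum of the first two, so only two rows, and equivalently two columns, are independent:

```python
import numpy as np

# Third row = first row + second row, so only two rows are independent.
A = np.array([[1, 2, 3],
              [4, 5, 6],
              [5, 7, 9]])

print(np.linalg.matrix_rank(A))  # prints 2: two pivots, column space is a plane in R^3
```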
This is a continuation of my Linear Algebra series, which should be viewed as an extra resource while going along with Gilbert Strang’s class 18.06 on OCW. This can be closely matched to Lecture 8 in his series.
Here, we continue our discussion of solving linear systems with elimination. In my Gaussian Elimination series, we explored how square, invertible systems can be solved by elimination and row exchanges, but we never delved into solving rectangular, non-invertible systems.
In the last lesson, we explored how non-square systems can be solved using Gaussian elimination. …
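To sketch what elimination on a rectangular matrix looks like in code, here is a small NumPy routine that forward-eliminates to echelon form with row exchanges. The function name `to_echelon` and the 3×4 example matrix are my own choices for illustration, not from the article.

```python
import numpy as np

def to_echelon(A, tol=1e-12):
    """Forward-eliminate a (possibly rectangular) matrix to echelon form U.

    Uses row exchanges (partial pivoting) and returns U plus the pivot columns.
    A toy sketch for illustration, not a library-grade routine.
    """
    U = A.astype(float).copy()
    m, n = U.shape
    pivots = []
    row = 0
    for col in range(n):
        if row >= m:
            break
        # Pick the largest entry in this column, at or below `row`, as the pivot.
        p = row + np.argmax(np.abs(U[row:, col]))
        if abs(U[p, col]) < tol:
            continue                        # no pivot in this column; move on
        U[[row, p]] = U[[p, row]]           # row exchange
        # Subtract multiples of the pivot row to zero out everything below it.
        U[row + 1:] -= np.outer(U[row + 1:, col] / U[row, col], U[row])
        pivots.append(col)
        row += 1
    return U, pivots

# A 3x4 rectangular example with only two pivots.
A = np.array([[1, 2, 2, 2],
              [2, 4, 6, 8],
              [3, 6, 8, 10]])
U, pivots = to_echelon(A)
print(U)
print(pivots)  # pivot columns [0, 2]; the other two columns are free
```

The number of pivots found this way is the rank, and the free columns are what give rectangular systems their families of solutions.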
This is a continuation of my Linear Algebra series, which should be viewed as an extra resource while going along with Gilbert Strang’s class 18.06 on OCW.
Let’s introduce a small additional concept in linear algebra: the Reduced Row Echelon Form (RREF) of a matrix. I’d highly recommend reading the article before this one, which details how to solve Ax = 0 in general.
Picking up right where we left off, we eliminated our A matrix into something in echelon form, symbolized by the variable U. …
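To see the end product, here is a hedged SymPy sketch that takes a similar rectangular matrix all the way to RREF and reads off the special solutions to Ax = 0. The example matrix is mine, and SymPy’s `rref` and `nullspace` stand in for the by-hand elimination the article walks through.

```python
from sympy import Matrix

# A rectangular example: eliminate A all the way to reduced row echelon form.
A = Matrix([[1, 2, 2, 2],
            [2, 4, 6, 8],
            [3, 6, 8, 10]])

R, pivot_cols = A.rref()   # R is the RREF; pivots are 1 with zeros above and below
print(R)
print(pivot_cols)          # (0, 2): two pivot columns, two free columns

# One special solution to Ax = 0 per free column.
for v in A.nullspace():
    print(v.T)
```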
This article assumes a working knowledge of basic matrix and vector manipulation concepts, such as dot products and linear combinations, as well as ideas involving span, vector spaces and subspaces, and column spaces.
Continuing from the last article, where we introduced the idea of the null space.
To review, the null space is the vector space of all vectors x that satisfy Ax = 0. …
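As a small illustration (my own example, using SciPy’s `null_space` rather than anything from the article), the matrix below has rank 1, so its null space in R³ is two-dimensional:

```python
import numpy as np
from scipy.linalg import null_space

# Second row is twice the first, so the rank is 1 and two directions are "free".
A = np.array([[1., 2., 3.],
              [2., 4., 6.]])

N = null_space(A)              # columns form an orthonormal basis of {x : Ax = 0}
print(N.shape)                 # (3, 2): two independent null-space directions
print(np.allclose(A @ N, 0))   # True: every basis vector really satisfies Ax = 0
```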
15-year-old learning about machine learning, as well as a lifelong naturalist. Climate activist in Vancouver. Writer.