Borges's Library of Babel is my favorite treatment of the “what does it mean to say that numbers really exist?” question.
I'll check that out.
Jon, the real-ness of numbers reminded me of the twins in Oliver Sacks's "The Man Who Mistook His Wife for a Hat"... The relevant section is posted here:
https://empslocal.ex.ac.uk/people/staff/mrwatkin/isoc/twins.htm
Some are skeptical of the story... But there's got to be patterns we humans don't (yet?) perceive...
Thanks! Yeah, that's a great story. Someone mentioned it on Twitter, too. That book has been on my list for a while.
Just coming back to this piece after a year or two and wanted to say how much this helped me. And the appendix at the end is just icing on the cake. Trying to bend my mind around what latent space entails from a metaphysics standpoint (seems like a lot of good ground to be covered by the Critical Realists here).
For those looking for more on the concept of latent space:
a detailed treatment:
https://towardsdatascience.com/understanding-latent-space-in-machine-learning-de5a7c687d8d
and, of course:
https://en.wikipedia.org/wiki/Latent_space
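If a concrete toy helps with the concept: in a latent space, items become vectors, and "similar" just means "geometrically close." Here's a minimal Ruby sketch (the three embeddings below are invented for illustration, not taken from any real model):

```ruby
# Hypothetical 3-D "latent" vectors, made up for this example.
# A real model would learn these; here we just hand-pick them so
# that cat and dog land near each other and car lands far away.
EMBEDDINGS = {
  "cat" => [0.9, 0.1, 0.0],
  "dog" => [0.8, 0.2, 0.1],
  "car" => [0.0, 0.9, 0.8],
}

# Cosine similarity: 1.0 means pointing the same way, 0.0 means unrelated.
def cosine(a, b)
  dot = a.zip(b).sum { |x, y| x * y }
  dot / (Math.sqrt(a.sum { |x| x * x }) * Math.sqrt(b.sum { |x| x * x }))
end

puts cosine(EMBEDDINGS["cat"], EMBEDDINGS["dog"]).round(3) # ~0.984 (nearby)
puts cosine(EMBEDDINGS["cat"], EMBEDDINGS["car"]).round(3) # ~0.083 (far apart)
```

The whole metaphysical puzzle is that the model learns those coordinates on its own; nobody tells it which axes mean what.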
Thanks, Jon, for the useful framework. I’m stoked for more of the series and to use it as inspiration / guide to get my hands dirty on some of these models. Super relevant content.
Great primer. This might be interesting for you or your readers: https://github.com/rickhull/backprop
It covers the basics of neural nets, backpropagation, and gradient descent, along with minimal, simple Ruby code based on Andrej Karpathy's micrograd.
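For a flavor of the micrograd approach, here is a minimal sketch of my own (a toy in the same spirit, not code from the linked repo): a scalar Value records its parents and a chain-rule closure, backward walks the graph in reverse, and one gradient-descent step nudges each input against its gradient.

```ruby
# A scalar value that remembers how it was computed, so gradients
# can flow backward through + and * via the chain rule.
class Value
  attr_accessor :data, :grad, :back
  attr_reader :parents

  def initialize(data, parents = [])
    @data = data
    @grad = 0.0
    @parents = parents
    @back = -> {}
  end

  def +(other)
    out = Value.new(@data + other.data, [self, other])
    out.back = lambda do
      @grad += out.grad        # d(a+b)/da = 1
      other.grad += out.grad   # d(a+b)/db = 1
    end
    out
  end

  def *(other)
    out = Value.new(@data * other.data, [self, other])
    out.back = lambda do
      @grad += other.data * out.grad  # d(a*b)/da = b
      other.grad += @data * out.grad  # d(a*b)/db = a
    end
    out
  end

  # Topologically sort the graph, then apply the chain rule in reverse.
  def backward
    topo, seen = [], {}
    visit = lambda do |v|
      next if seen[v.object_id]
      seen[v.object_id] = true
      v.parents.each { |p| visit.call(p) }
      topo << v
    end
    visit.call(self)
    @grad = 1.0
    topo.reverse_each { |v| v.back.call }
  end
end

# Backprop through a toy "loss" L = a*b + c, then take one
# gradient-descent step on the inputs.
a, b, c = Value.new(2.0), Value.new(-3.0), Value.new(10.0)
loss = a * b + c
loss.backward
puts a.grad               # => -3.0 (dL/da = b)

lr = 0.01
[a, b, c].each { |v| v.data -= lr * v.grad }
```

That's essentially all backprop is; a real network just has many more of these little nodes.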
I actually have this in my Notion db to dig into later :) I'm a Rubyist and had been looking for something like this, so I'll definitely have to get into it and maybe write about it.
Thank you, Jon. I'm eager for the next one.
Great read. I wrote a small piece on the creator economy here: https://clarioncall.substack.com/p/the-book-that-predicted-the-creator