
Nick Fisher
I am the founder and lead developer of Polyvox, a virtual language-learning partner.
In my last post, I wrote about using FParsec to build a user input validator with parser combinators.
For most of us, the first instinct when manipulating or interpreting a chunk of plain text is to reach for regular expressions.
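
As a rough illustration of that instinct (the pattern, input text, and class name below are mine, not taken from the post), here is the kind of one-liner a regex makes easy:

```csharp
using System;
using System.Text.RegularExpressions;

// Tiny example of "reach for a regex": pulling ISO dates out of plain text.
class RegexDemo
{
    static void Main()
    {
        const string text = "Invoices due 2024-03-01 and 2024-04-15.";

        foreach (Match m in Regex.Matches(text, @"\d{4}-\d{2}-\d{2}"))
            Console.WriteLine(m.Value);   // 2024-03-01, then 2024-04-15
    }
}
```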
A number of recent papers have explored learning in deep neural networks without backpropagation, often motivated by its apparent biological implausibility.
I’m obliged to issue a severe warning to anyone who found their way here.
Gradient descent is a spectacularly effective optimization technique, but it’s not the only method for optimizing non-convex functions.
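
As a minimal sketch of what that means in practice (the toy function, learning rate, and class names are my own choices, not the post's), plain gradient descent on a non-convex function lands in whichever local minimum is downhill from its starting point:

```csharp
using System;

// f(x) = x^4 - 3x^2 + x has two local minima; which one gradient descent
// finds depends entirely on where it starts.
class Descent
{
    static double F(double x) => x * x * x * x - 3 * x * x + x;
    static double Grad(double x) => 4 * x * x * x - 6 * x + 1;

    static double Descend(double x, double lr = 0.01, int steps = 2000)
    {
        for (int i = 0; i < steps; i++)
            x -= lr * Grad(x);   // step against the gradient
        return x;
    }

    static void Main()
    {
        foreach (var start in new[] { 2.0, -2.0 })
        {
            var x = Descend(start);
            // One start reaches the global minimum near -1.30,
            // the other only a local minimum near 1.13.
            Console.WriteLine($"start {start,5} -> x = {x:F3}, f(x) = {F(x):F3}");
        }
    }
}
```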
Let’s say our WPF application has an ItemsControl whose ItemsSource is bound to an ObservableCollection.
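
For concreteness, here is a minimal code-only sketch of that setup (it assumes a WPF project without a generated App.xaml; the item strings and class name are placeholders of mine). In a real app the ItemsSource would normally be set with a XAML `{Binding}` to a view-model property, but wiring it in code keeps the sketch self-contained:

```csharp
using System;
using System.Collections.ObjectModel;
using System.Windows;
using System.Windows.Controls;

public class Demo
{
    [STAThread]
    public static void Main()
    {
        var items = new ObservableCollection<string> { "alpha", "beta" };

        // ItemsControl displays whatever its ItemsSource holds.
        var list = new ItemsControl { ItemsSource = items };

        // ObservableCollection raises CollectionChanged, so the ItemsControl
        // picks up this new item with no extra wiring.
        items.Add("gamma");

        new Application().Run(new Window { Title = "Demo", Content = list });
    }
}
```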
Let’s assume we have a sequence of words, and we want to predict, as accurately as possible, whether each word is a name, verb, or some other part of speech.
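
To make the task concrete (this toy lookup baseline is only my framing of the problem, not the model the post builds), each word in the sequence gets one label:

```csharp
using System;
using System.Collections.Generic;

// Toy sequence-labeling setup: every word is tagged NAME, VERB, or OTHER.
// A lookup table stands in for whatever model actually does the predicting.
class SequenceLabelingDemo
{
    static void Main()
    {
        var lexicon = new Dictionary<string, string>
        {
            ["alice"] = "NAME",
            ["runs"]  = "VERB",
            ["fast"]  = "OTHER",
        };

        string[] sentence = { "alice", "runs", "fast" };

        foreach (var word in sentence)
        {
            // Unknown words fall back to OTHER in this baseline.
            var tag = lexicon.TryGetValue(word, out var t) ? t : "OTHER";
            Console.WriteLine($"{word} -> {tag}");
        }
    }
}
```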