micrograd overview
- starting the core Value object of micrograd and its visualization
- derivative of a function with multiple inputs
- derivative of a simple function with one input
- implementing the backward function for each operation
- fixing a backprop bug when one node is used multiple times
- implementing the backward function for a whole expression graph
- preview of a single optimization step
- counting bigrams in a 2D torch tensor ("training the model")
- breaking up a tanh, exercising with more operations
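The chapter titles above revolve around micrograd's core Value object, its per-operation backward functions, the backward pass over a whole expression graph, and the gradient-accumulation bug that appears when one node is used multiple times. As a rough illustration of how those pieces fit together, here is a minimal sketch in the spirit of micrograd — not the exact code from the lecture, just an assumed-similar API with `.data`, `.grad`, and `.backward()`:

```python
# Minimal micrograd-style autograd sketch (illustrative, not the
# lecture's exact implementation).
class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0               # d(output)/d(this node)
        self._backward = lambda: None # per-operation backward function
        self._prev = set(_children)   # parents in the expression graph

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # Accumulating with += (instead of assigning with =) is the
            # fix for the bug that appears when one node is used more
            # than once in the expression graph.
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Backward pass over the whole expression graph: visit nodes in
        # reverse topological order starting from this output node.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for node in reversed(topo):
            node._backward()

a = Value(2.0)
b = a * a + a   # 'a' is used multiple times in the graph
b.backward()
print(a.grad)   # d(a^2 + a)/da = 2a + 1 = 5.0
```

Because gradients are accumulated with `+=`, the two paths through `a` (via `a * a` and via the `+ a` term) both contribute, giving the correct derivative even when a node feeds into several operations.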