Project idea: backprop using call/cc

If anyone wants a fun project, let me suggest "differentiable Snap!." The idea is that you should be able to take the derivative of Snap! programs with respect to their inputs by using reverse-mode automatic differentiation (i.e., backpropagation). Then you can do all sorts of fun things with gradient descent, such as training small neural networks or optimizing functions.

This much on its own would be kind of unexciting (gradient descent is nothing new), so why do it in Snap!? Here's the novelty: there's been a fair amount of recent research on how to do backprop using continuations, i.e., call/cc. See, e.g., https://www.cs.purdue.edu/homes/rompf/papers/wang-nips18.pdf or https://arxiv.org/pdf/1803.10228.pdf.
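To give a rough flavor of the trick, here's a tiny sketch of my own (in Python, with explicit continuation-passing instead of true call/cc; the names `NumR`, `add`, `mul`, and `grad` are just made up for illustration). Each operation takes the rest of the computation as a continuation, runs it forward, and only after the continuation returns does it push gradients back to its inputs; the return path of the continuation *is* the backward pass.

```python
class NumR:
    """A number carrying its value and an adjoint (gradient slot)."""
    def __init__(self, x, d=0.0):
        self.x = x  # primal value, computed on the way forward
        self.d = d  # gradient, accumulated on the way back

def add(a, b, k):
    """a + b in continuation-passing style: k is 'the rest of the program'."""
    y = NumR(a.x + b.x)
    k(y)           # run the rest of the computation forward
    a.d += y.d     # after it returns, propagate y's gradient to the inputs
    b.d += y.d

def mul(a, b, k):
    """a * b in continuation-passing style."""
    y = NumR(a.x * b.x)
    k(y)
    a.d += b.x * y.d
    b.d += a.x * y.d

def grad(f, x):
    """Gradient of f at x, where f is written in CPS as f(input, continuation)."""
    z = NumR(x)
    def seed(y):
        y.d = 1.0  # dy/dy = 1 seeds the backward sweep
    f(z, seed)
    return z.d

# Example: f(x) = x*x + x, so f'(x) = 2x + 1
def f(x, k):
    mul(x, x, lambda x2: add(x2, x, k))

print(grad(f, 3.0))  # prints 7.0
```

The papers above do essentially this, but hide the CPS plumbing behind delimited continuations (shift/reset) in Scala; presumably Snap!'s own continuation blocks could play the same role, which is what would make the Snap! version interesting.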

My impression is that we still don't have a clean, beautiful presentation of this idea, maybe because most people doing backpropagation aren't immersed in the beauty of functional programming. But maybe if you all thought really hard about it, you could whip up an extremely elegant Snap! program that captures the essence of backprop-by-call-cc. :slight_smile: