4 years ago by pishpash
Also, the most recent state of progress: Predictive Coding Can Do Exact Backpropagation on Any Neural Network (2021)
4 years ago by obstbraende
Here's a link with the reviewer comments: https://openreview.net/forum?id=PdauS7wZBfC (praise to OpenReview!)
4 years ago by pishpash
The reviewers' decision reasoning is super helpful for putting things in context. The arbitrary binary "accept" vs. "reject" decision, especially the snooty "high bar for acceptance at ICLR", is laughable in a world of free information access.
4 years ago by yewenjie
AstralCodexTen (formerly SlateStarCodex) has discussed this here - https://astralcodexten.substack.com/p/link-unifying-predicti...
He mostly points to this post on LessWrong - https://www.lesswrong.com/posts/JZZENevaLzLLeC3zn/predictive...
4 years ago by amelius
If backprop is not needed, would this finding make automatic-differentiation functionality obsolete in DL frameworks, allowing these frameworks to become much simpler? Or is there still some constant factor that makes backprop favorable?
4 years ago by lumost
From the OpenReview discussion, the current understanding is that this approach is dramatically more computationally intensive than standard backprop, which limits its utility.
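For intuition on where the extra cost comes from: predictive coding replaces the single backward pass with an iterative relaxation of the activity nodes, run for a number of inner steps per example before any weights change. Below is a minimal numpy sketch, not from the paper - the layer sizes, step size lr_x, and iteration count T are arbitrary assumptions - comparing one backprop update with one vanilla predictive-coding update on a tiny linear network. Vanilla predictive coding only approximates the backprop gradients (closely, when the output error is small); the paper's contribution is making the correspondence exact, but the inner loop remains.

    import numpy as np

    rng = np.random.default_rng(0)
    d_in, d_hid, d_out = 4, 8, 2
    W1 = rng.normal(scale=0.1, size=(d_hid, d_in))
    W2 = rng.normal(scale=0.1, size=(d_out, d_hid))
    x = rng.normal(size=d_in)
    y = W2 @ (W1 @ x) + 0.1 * rng.normal(size=d_out)  # target near the net's output, so errors stay small

    # Backprop: one forward pass, one backward pass.
    h = W1 @ x
    y_hat = W2 @ h
    e_out = y_hat - y                     # gradient of 0.5*||y_hat - y||^2 w.r.t. y_hat
    grad_W2 = np.outer(e_out, h)
    grad_W1 = np.outer(W2.T @ e_out, x)

    # Predictive coding: relax the activity nodes for T inner steps,
    # then update weights with purely local rules.
    T, lr_x = 50, 0.1                     # assumed values for this sketch
    h_pc = W1 @ x                         # start hidden activity at the forward prediction
    out = y.copy()                        # clamp the output layer to the target
    for _ in range(T):
        e1 = h_pc - W1 @ x                # hidden-layer prediction error
        e2 = out - W2 @ h_pc              # output-layer prediction error
        h_pc -= lr_x * (e1 - W2.T @ e2)   # descend the energy 0.5*(||e1||^2 + ||e2||^2)

    e1 = h_pc - W1 @ x
    e2 = out - W2 @ h_pc
    pc_grad_W2 = np.outer(-e2, h_pc)      # local update; approximates grad_W2
    pc_grad_W1 = np.outer(-e1, x)         # local update; approximates grad_W1

    print("backprop grad_W2:\n", grad_W2.round(4))
    print("pred.cod grad_W2:\n", pc_grad_W2.round(4))

Each example costs T relaxation sweeps instead of a single backward pass, which is the overhead the reviewers flag.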
4 years ago by p1esk
Backprop is simpler.
4 years ago by coolness
Needs [2020] in the title. Interesting work nevertheless.
4 years ago by l33tman
This (and its follow-up papers) has already been discussed multiple times here. I don't have the links handy, though.