
For a more realistic view on AI accelerators that doesn't overhype analog computing, I enjoyed the series by Adi Fuchs: https://medium.com/@adi.fu7/ai-accelerators-part-i-intro-822...

At the end of the day, specialized hardware, particularly on the analog side (neuromorphic, optical, etc.), locks development into the path of highly uniform feedforward networks by optimizing large matrix multiplications, and it is unclear whether that tradeoff is worth it while we still have so much to figure out about which methods will drive progress in AI.
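To make the "optimizing large matrix multiplications" point concrete, here is a minimal sketch (my own illustration, not from the linked series) of why a dense feedforward layer is exactly the workload these accelerators target: one big matmul plus a cheap elementwise nonlinearity. The sizes are arbitrary.

```python
import numpy as np

# A dense feedforward layer computes y = f(W @ x + b): one large
# matrix multiplication plus an elementwise activation. Matmul-centric
# accelerators (systolic arrays, analog crossbars, optical meshes)
# are built around exactly this operation.

rng = np.random.default_rng(0)

def dense_layer(x, W, b):
    # ReLU activation; the matmul dominates the cost
    return np.maximum(W @ x + b, 0.0)

# Illustrative sizes only.
W = rng.standard_normal((256, 512))
b = np.zeros(256)
x = rng.standard_normal(512)

y = dense_layer(x, W, b)
```

Architectures that deviate from this pattern (sparse, recurrent, or event-driven models) map far less cleanly onto such hardware, which is the lock-in concern.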



Neuromorphics aren't analog FWIW. They're just asynchronous.


Depends ;-)

First of all, there is no single accepted definition of "neuromorphic" [1]. Still, as a point in favour of the "neuromorphic systems are analogue" crowd: the seminal paper by Carver Mead that (to my knowledge) coined the term "neuromorphic" specifically talks about analogue neuromorphic systems [2].

Right now, there are some research "analogue" (or, more precisely, "mixed-signal") neuromorphic systems being developed [3, 4]. It is correct, however, that there are no commercially available analogue systems that I am aware of. Unfortunately, the same can be said for digital neuromorphics (Intel's Loihi is perhaps the closest to a commercial product, and yes, it is an asynchronous digital neuromorphic system).
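For readers unfamiliar with what "asynchronous digital neuromorphic" means in practice, here is a toy leaky integrate-and-fire simulation (my own sketch, not Loihi's actual programming model): neurons integrate incoming spike events and fire only when a threshold is crossed, rather than running dense matmuls on a global clock.

```python
import numpy as np

# Toy event-driven computation: each neuron leaks, integrates weighted
# input spikes, and emits a spike when its membrane potential crosses
# a threshold. All parameters and sizes are illustrative.

def lif_step(v, spikes_in, weights, leak=0.9, threshold=1.0):
    """One leaky integrate-and-fire update.

    Returns the new membrane potentials and the output spike pattern.
    """
    v = leak * v + weights @ spikes_in   # integrate weighted input events
    spikes_out = v >= threshold          # fire where the threshold is crossed
    v = np.where(spikes_out, 0.0, v)     # reset neurons that fired
    return v, spikes_out

rng = np.random.default_rng(1)
weights = rng.uniform(0.0, 0.5, size=(4, 8))
v = np.zeros(4)
for _ in range(10):
    spikes_in = rng.random(8) < 0.3      # sparse random input events
    v, spikes_out = lif_step(v, spikes_in, weights)
```

The point of the sketch is that activity is sparse and event-driven, which is why such models fit asynchronous hardware better than matmul-centric accelerators.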

[1] https://iopscience.iop.org/article/10.1088/1741-2560/13/5/05...

[2] https://authors.library.caltech.edu/53090/1/00058356.pdf

[3] https://brainscales.kip.uni-heidelberg.de/

[4] https://web.stanford.edu/group/brainsinsilicon/documents/ANe...



