.@originalmaderix just got Apple's Neural Engine to do training, not just inference. He reverse-engineered the private ANE stack and ran the forward and backward passes directly on the ANE. (Apple exposes the ANE mainly through Core ML and publishes no public training API or documentation for it.) If this scales, local fine-tuning and always-on experimentation become cheap, quiet, and private. It's still an early PoC: a single-layer demo, some CPU fallback (the weight gradient dW and the Adam update), and private APIs that could break at any time. But the efficiency upside is real. Super exciting!
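To make the split concrete, here is a minimal sketch of what single-layer training with that division of labor looks like. This is an illustration only: the real PoC dispatches the forward and backward passes to the ANE through private APIs that aren't public, so everything below runs on CPU with NumPy, and all names and hyperparameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: fit a single linear layer y = xW to a random target.
x = rng.standard_normal((32, 16)).astype(np.float32)
t = rng.standard_normal((32, 4)).astype(np.float32)
W = rng.standard_normal((16, 4)).astype(np.float32) * 0.1

# Adam state -- per the tweet, this bookkeeping stays on the CPU in the PoC.
m = np.zeros_like(W)
v = np.zeros_like(W)
beta1, beta2, lr, eps = 0.9, 0.999, 1e-2, 1e-8

loss_before = float(np.mean((x @ W - t) ** 2))

for step in range(1, 201):
    y = x @ W                    # forward pass (runs on the ANE in the PoC)
    dy = 2.0 * (y - t) / len(x)  # backprop through MSE (ANE in the PoC)
    dW = x.T @ dy                # weight gradient (CPU fallback in the PoC)

    # Adam update with bias correction (CPU fallback in the PoC).
    m = beta1 * m + (1 - beta1) * dW
    v = beta2 * v + (1 - beta2) * dW**2
    m_hat = m / (1 - beta1**step)
    v_hat = v / (1 - beta2**step)
    W -= lr * m_hat / (np.sqrt(v_hat) + eps)

loss_after = float(np.mean((x @ W - t) ** 2))
print(loss_before, "->", loss_after)
```

The point of the sketch is the boundary: the two matmul-heavy steps (forward and dy) are the ones worth pushing to the accelerator, while dW and the optimizer state update are what the demo still leaves on the CPU.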
Sector: From Twitter
Disclaimer: The content above is only the author's opinion which does not represent any position of Followin, and is not intended as, and shall not be understood or construed as, investment advice from Followin.