A New Kind of Computer (April 2025) (lightmatter.co)
40 points by gkolli 4 days ago | 17 comments
btilly 6 hours ago [-]
This paradigm for computing was already covered three years ago by Veritasium in https://www.youtube.com/watch?v=GVsUOuSjvcg.

Maybe not the specific photonic system that they are describing, which I'm sure has some significant improvements over what existed then. But the core idea is the same: use analog approximations of existing neural net AI models to run them far more cheaply, with far less energy.

Whether or not this system is the one that wins out, I'm very sure that AI run on an analog system will have a very important role to play in the future. It will allow technologies like guiding autonomous robots with AI models running on hardware inside of the robot.
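A minimal sketch of the idea in Python (the noise model, sizes, and numbers here are made up for illustration, not Lightmatter's design): treat the analog hardware as a matrix-vector multiply that picks up device noise, and see how far the result drifts from the exact float32 answer.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "layer": 256 inputs -> 256 outputs, exact float32 result as the reference.
W = (rng.standard_normal((256, 256)) / 16).astype(np.float32)
x = rng.standard_normal(256).astype(np.float32)
exact = W @ x

def analog_matmul(W, x, noise=0.01):
    """Crude stand-in for an analog multiply-accumulate: each weight picks up
    a multiplicative error (device mismatch) and the summed output picks up
    additive readout noise. Both noise levels are assumptions."""
    gain_error = 1.0 + noise * rng.standard_normal(W.shape)
    y = (W * gain_error) @ x
    return y + noise * np.abs(y).mean() * rng.standard_normal(y.shape)

approx = analog_matmul(W, x)
rel_err = np.linalg.norm(approx - exact) / np.linalg.norm(exact)
print(f"relative error vs float32: {rel_err:.2%}")  # ~1% error for ~1% device noise
```

A neural net that tolerates that level of error keeps working, which is the whole bet behind this class of hardware.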

boznz 6 hours ago [-]
Weirdly complex to read yet light on key technical details. My TL;DR (as an old, clueless electronics engineer): the compute part is photonic/analog (lasers and waveguides), yet we still need 50 billion transistors for the non-compute parts such as ADCs, I/O, memory, etc. The bottom line is 65 TOPS at <80W, with the optical processing consuming 1.65W and the 'helper electronics' consuming the rest, so scaling the optical part should not hit the thermal bottlenecks of a purely transistor-based processor. Parallelism in the optical part, using different wavelengths of light as threads, may also be possible. Nothing about problems, costs, or whether the helper electronics can eventually use photonics too.
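Taking the quoted numbers at face value, the efficiency split works out roughly as below (simple arithmetic on the figures above, nothing from the paper itself):

```python
# Figures quoted above: 65 TOPS total, <80 W for the whole card, 1.65 W optical.
tops = 65
total_watts = 80           # upper bound from the article
optical_watts = 1.65
helper_watts = total_watts - optical_watts

print(f"whole system : {tops / total_watts:.2f} TOPS/W")    # ~0.81 TOPS/W
print(f"optical alone: {tops / optical_watts:.1f} TOPS/W")  # ~39.4 TOPS/W
print(f"helper share : {helper_watts / total_watts:.0%} of the power budget")  # ~98%
```

In other words, nearly all of the power goes to the electronics around the optics, which is why the scaling argument focuses on the optical part.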

I remember a TV programme in the UK from the 70s (Tomorrow's World, I think) that talked about this, so I am guessing silicon was just more cost effective until now. Still, taking it at face value, I would say it is quite an exciting technology.

Animats 5 hours ago [-]
Interesting. Questions, the Nature paper being expensively paywalled:

- Is the analog computation actually done with light? What's the actual compute element like? Do they have an analog photonic multiplier? Those exist, and have been scaling up for a while.[1] The announcement isn't clear on how much compute is photonic. There are still a lot of digital components involved. Is it worth it to go D/A, generate light, do some photonic operations, go A/D, and put the bits back into memory (a rough sketch of that roundtrip is below the reference)? That's been the classic problem with photonic computing. Memory is really hard, and without memory, pretty soon you have to go back to a domain where you can store results. Pure photonic systems do exist, such as fiber optic cable amplifiers, but they are memoryless.

- If all this works, is loss of repeatability going to be a problem?

[1] https://ieeexplore.ieee.org/document/10484797
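Rough sketch of that D/A -> photonic MAC -> A/D roundtrip (the bit depths and sizes are assumptions, and the optical multiply is modeled as ideal; only the conversions introduce error here):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed converter resolutions, not figures from the paper.
DAC_BITS, ADC_BITS = 8, 8

def quantize(x, bits):
    """Uniform quantization to the given bit depth over the signal's range."""
    scale = np.abs(x).max() / (2 ** (bits - 1) - 1)
    return np.round(x / scale) * scale

W = rng.standard_normal((128, 128)) / 12
x = rng.standard_normal(128)

# One "tile" of work: convert, compute in the analog/optical domain, convert back.
x_analog = quantize(x, DAC_BITS)                 # DAC
y_analog = quantize(W, DAC_BITS) @ x_analog      # optical MAC (ideal in this sketch)
y_digital = quantize(y_analog, ADC_BITS)         # ADC, result stored back in memory

exact = W @ x
print(np.linalg.norm(y_digital - exact) / np.linalg.norm(exact))  # conversion error
```

The open question the parent raises is whether the energy and latency of those two conversions per tile leave enough of the analog advantage intact.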

Anduia 6 hours ago [-]
> Critically, this processor achieves accuracies approaching those of conventional 32-bit floating-point digital systems “out-of-the-box,” without relying on advanced methods such as fine-tuning or quantization-aware training.

Hmm... what? So it is not accurate?

btilly 6 hours ago [-]
It's an analog system, which means that accuracy is naturally limited.

However, a single analog math operation requires roughly the same energy as a single bit flip in a digital computer, and it takes a lot of bit flips to do a single floating-point operation. So a digital calculation can be approximated in analog with far less energy and hardware. And neural nets don't need digital precision to produce useful results.
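Back-of-envelope version of that claim (every number below is an assumed placeholder, not a measured figure):

```python
# All values are assumptions chosen only to illustrate the ratio being argued.
bit_flips_per_fp32_mac = 5_000   # assumed switching activity of a 32-bit multiply-add
energy_per_bit_flip_fj = 1.0     # assumed femtojoules per bit transition
energy_per_analog_op_fj = 1.0    # the parent's claim: ~one bit flip's worth of energy

digital_mac_fj = bit_flips_per_fp32_mac * energy_per_bit_flip_fj
print(f"digital FP32 MAC: {digital_mac_fj:.0f} fJ (assumed)")
print(f"analog MAC      : {energy_per_analog_op_fj:.0f} fJ (assumed)")
print(f"ratio           : ~{digital_mac_fj / energy_per_analog_op_fj:.0f}x")
```

Whatever the exact numbers, the argument is that the ratio is orders of magnitude, and that neural nets can absorb the lost precision.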

B1FF_PSUVM 6 hours ago [-]
> neural nets don't need digital precision to produce useful results.

The point - as shown by the original implementation...

bee_rider 5 hours ago [-]
It seems weirdly backwards. They don't use techniques like quantization-aware training to increase the accuracy of the coprocessor, right? (I mean, that's nonsense.) They use those techniques to allow them to use less accurate coprocessors, I thought.

I think they are just saying the coprocessor is pretty accurate, so they don’t need to use these advanced techniques.

croemer 6 hours ago [-]
I stopped reading after "Soon, you will not be able to afford your computer. Consumer GPUs are already prohibitively expensive."
kevin_thibedeau 6 hours ago [-]
This is always a hilarious take. If you inflation-adjust a 386 PC from the early 90s, when 486s were on the market, you'd find they range in excess of $3,000, and the 486s are in the $5,000 zone. Computers are incredibly cheap now. What isn't cheap is the bleeding edge, a place fewer and fewer people have to be at, which leads to lower demand and higher prices to compensate.
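Roughly what that adjustment looks like (the nominal prices and the CPI multiplier below are assumptions for illustration, not historical data):

```python
# Assumed early-90s sticker prices and an assumed ~2.2x CPI multiplier to today's dollars.
cpi_multiplier = 2.2
nominal_prices = {"386 PC (early 90s)": 1_500, "486 PC (early 90s)": 2_500}

for name, price in nominal_prices.items():
    print(f"{name}: ${price:,} then ~= ${price * cpi_multiplier:,.0f} today")
```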
ge96 5 hours ago [-]
It is crazy that you can buy a used laptop for $15 and do something meaningful with it, like writing code (meaningful as in making money).

I used to have this weird obsession with doing this: buying old Chromebooks and putting Linux on them. With 4GB of RAM they were still useful, but I realize nowadays that for "ideal" computing, 16GB seems to be the minimum.

ge96 4 hours ago [-]
It's like the black Mac from 2007: I know its tech is outdated, but I want it.
Animats 5 hours ago [-]
That's related more to NVidia's discovery that they could get away with huge margins, and the China GPU projects for graphics being years behind.[1]

[1] https://www.msn.com/en-in/money/news/china-s-first-gaming-gp...

TedDallas 6 hours ago [-]
It was kind of that way in the early days of high-end personal computing. I remember seeing an ad in the early 90s for a 486 laptop that was $6,000. Historically, prices have always gone down; you just have to wait. SotA is always going to go for a premium.
ghusto 5 hours ago [-]
That irked me too. "Bleeding edge" consumer GPUs are prohibitively expensive, sure, but you wait 6 months and you have it at a fraction of the cost.

It's like saying "cars are already prohibitively expensive" whilst looking at Ferraris.

quantadev 5 hours ago [-]
In 25 years we'll have #GlassModels: a "chip" that is a passive device (just a complex lens) made only of glass or graphene, which can do an "AI inference" simply by shining the "input tokens" (i.e. arrays of photons) through it. In other words, the "numeric value" at one MLP "neuron input" will be the amplitude of the light (the number of simultaneous photons).

All addition, multiplication, and tanh functions will be done by photon superposition/interference effects, and it will consume zero power (since it's only a complex "lens").

It will probably do parallel computations where each photon frequency range will not interfere with other ranges, allowing multiple "inferences" to be "Shining Thru" simultaneously.

This design will completely solve the energy crisis, and each inference will take the same time as it takes light to travel a centimeter, i.e. essentially instantaneous.
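A toy numerical model of that kind of passive layer (everything here is made up: inputs encoded as field amplitudes, a fixed complex "lens" matrix doing the mixing by interference, photodetector intensities as the outputs):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy passive "glass" inference layer: a fixed complex transmission matrix
# stands in for the lens; interference performs the weighted sums.
n_in, n_out = 64, 16
lens = rng.standard_normal((n_out, n_in)) + 1j * rng.standard_normal((n_out, n_in))

def optical_layer(amplitudes, lens):
    field_out = lens @ amplitudes      # superposition of weighted fields
    return np.abs(field_out) ** 2      # photodetectors measure intensity

tokens = rng.random(n_in)              # "input tokens" as light amplitudes
print(optical_layer(tokens, lens)[:4]) # first few "neuron" outputs
```

Wavelength parallelism would amount to running several independent copies of this through the same lens at once, one per non-interfering frequency band.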

gcanyon 3 hours ago [-]
For years I've been fascinated by those little solar-powered calculators. In a weird way, they're devices that enable us to cast hand shadows to do arithmetic.
quantadev 2 hours ago [-]
Lookup "Analog Optical Computing". There was recently a breakthrough just last week where optical computing researchers were able to use photon interference effects to do mathematical operations purely in analog! That means no 0s and 1s, just pure optics. Paste all that into Gemini to learn more.