Artificial intelligence (AI) has experienced a revival of pretty large proportions in the last decade. We’ve gone from AI being mostly useless to letting it ruin our lives in obscure and opaque ways. We’ve even given AI the task of crashing our cars for us.
AI experts will tell us that we just need bigger neural networks and the cars will probably stop crashing. You can get there by adding more graphics cards to an AI, but the power consumption becomes excessive. The ideal solution would be a neural network that can process and shovel data around at near-zero energy cost, which may be where we are headed with optical neural networks.
To give you an idea of the scale of energy we’re talking about here, a good GPU uses 20 picojoules (1 pJ is 10⁻¹² J) for each multiply-and-accumulate operation. A purpose-built integrated circuit can reduce that to about 1 pJ. But if a team of researchers is correct, an optical neural network might reduce that number to an incredible 50 zeptojoules (1 zJ is 10⁻²¹ J).
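To put those per-operation figures in perspective, here is a minimal back-of-the-envelope sketch. The per-MAC energies come from the article; the one-billion-MAC workload size is an illustrative assumption, not a figure from the source.

```python
# Energy per multiply-and-accumulate (MAC), in joules, from the article.
GPU_PER_MAC = 20e-12       # ~20 pJ on a good GPU
ASIC_PER_MAC = 1e-12       # ~1 pJ on a purpose-built integrated circuit
OPTICAL_PER_MAC = 50e-21   # ~50 zJ for the proposed optical neural network

MACS = 1e9  # hypothetical workload: one billion MACs per inference

for name, energy in [("GPU", GPU_PER_MAC),
                     ("ASIC", ASIC_PER_MAC),
                     ("optical", OPTICAL_PER_MAC)]:
    print(f"{name}: {energy * MACS:.2e} J per inference")
```

At that assumed workload, the GPU spends about 0.02 J per inference while the optical network would spend around 5 × 10⁻¹¹ J, roughly a 400-million-fold difference.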