Optical chips?

Will there ever be a chip that uses light instead of electrical current? Is it even possible?

The idea of using light instead of electricity inside computer chips is moving from science fiction to reality. Photonic chips, or optical processors, transmit information using photons rather than electrons. Because photons travel at the speed of light and, unlike current in resistive wires, dissipate essentially no heat in transit, they promise large gains in speed and energy efficiency. Instead of metal wires, these chips use waveguides to steer light, modulators to encode data onto it, and photodetectors to read the signals back out.
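
To make those three building blocks concrete, here is a toy Python sketch of an on-off-keyed optical link: a "modulator" turns bits into light intensity, the "waveguide" attenuates the signal, and a "photodetector" thresholds the received power back into bits. All the numbers (loss, noise level, threshold) are made up for illustration, not taken from any real part.

```python
import random

def modulate(bits, p_on=1.0):
    """Modulator: encode each bit as optical power (on-off keying)."""
    return [p_on if b else 0.0 for b in bits]

def waveguide(powers, loss=0.5, noise=0.05):
    """Waveguide/link: attenuate the signal and add detector-side noise."""
    return [p * loss + random.gauss(0.0, noise) for p in powers]

def detect(powers, threshold=0.25):
    """Photodetector: threshold received power back into bits."""
    return [1 if p > threshold else 0 for p in powers]

bits = [random.randint(0, 1) for _ in range(16)]
received = detect(waveguide(modulate(bits)))
print(bits)
print(received)  # matches bits unless noise pushes a sample past the threshold
```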

Companies such as Intel, IBM, Ayar Labs, and Lightmatter are pioneering silicon photonics, which integrates optical components alongside traditional transistors. These hybrid parts are already used in data centers to speed up communication and cut power consumption. Researchers are also experimenting with purely optical systems that perform computations through light interference, which could transform AI and supercomputing.
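
For a feel of how "computation through interference" works, the sketch below models a Mach-Zehnder interferometer, the basic cell in photonic matrix processors of the kind Lightmatter builds: two 50:50 beam splitters with a programmable phase shift between them. Varying the phase steers light between the two output ports as sin²(θ/2) and cos²(θ/2), the kind of tunable linear operation that meshes of such cells compose into matrix multiplication. The transfer matrices are the textbook ones; framing a single cell this way is my own simplification.

```python
import numpy as np

BS = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)   # 50:50 beam splitter

def mzi(theta):
    """Mach-Zehnder interferometer: splitter, phase shift, splitter."""
    phase = np.diag([np.exp(1j * theta), 1.0])   # phase shifter on one arm
    return BS @ phase @ BS

light_in = np.array([1.0, 0.0])                  # all power in the top port
for theta in (0.0, np.pi / 2, np.pi):
    out = mzi(theta) @ light_in
    power = np.abs(out) ** 2                     # detected intensities
    print(f"theta={theta:.2f}  port powers={power.round(3)}")
    # interference splits the power as sin^2(theta/2), cos^2(theta/2)
```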

Between optical chips and wafer-scale parts like Cerebras's giant chip, could this change the whole game for the size and location of data centers, and even bring that kind of power to offices and homes, cutting future demand for data centers by a significant amount?
 

It is certainly possible. I remember learning to program in LISP during my undergraduate work; it was billed as an AI language, and I thought for sure AI would change the world. 45 years later, I am right!
 
The (silicon photonics) optical chips are used for data transport, and for switching in Google's case, but not for processing the data; processing is still all-electronic, and there's no obvious route to changing that.

The data center game, and with it the demand for optical transport and silicon photonics, is changing because the massive AI models need data centers that are simply too big and power-hungry to build in one place. The AI processing therefore has to be distributed over a number of smaller (but still massive!) data centers spread across a local region, typically 10-100 km apart, which need to behave like one big data center as far as the AI workload is concerned; this is what is being referred to as "scale-across". And what it needs is a monumentally huge data bandwidth between these data centers, far bigger than for any normal networking application.
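
Rough numbers make the point; here is a back-of-envelope sketch in Python. Light in silica fiber travels at about c/1.47, so each kilometre costs roughly 5 µs one way; that part is physics. The model size and sync interval below are purely hypothetical stand-ins, not figures from any real deployment.

```python
C = 299_792_458          # speed of light in vacuum, m/s
N_FIBER = 1.47           # refractive index of silica fiber

def one_way_latency_ms(km):
    """Propagation delay over km of fiber, in milliseconds."""
    return km * 1000 / (C / N_FIBER) * 1000

for km in (10, 100):
    print(f"{km:>3} km: {one_way_latency_ms(km):.2f} ms one way")

# Hypothetical scale-across sync: a 1-trillion-parameter model with
# 2-byte (fp16) gradients exchanged once per second between sites.
params = 1e12
bytes_per_sync = params * 2                                # 2 TB per exchange
print(f"sustained: {bytes_per_sync * 8 / 1e12:.0f} Tb/s")  # ~16 Tb/s
```

Even with those charitable assumptions, a single synchronization stream dwarfs ordinary inter-site networking, which is why the bandwidth between these sites, rather than the latency, is the headline problem.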

Distributing it further into homes and offices simply doesn't work; that is essentially "edge computing", and while there's a need for it, it's nowhere near the AI data center demand.
 