What's the Latest Development?
Computer chips are beginning to consume more power than either the chips themselves or the electricity grid can comfortably support. Faster data transfers currently require more electricity, which causes transistors to overheat and reduces a chip's efficiency. By 2025, information technology in Japan alone is projected to consume "250 billion kilowatt-hours' worth of electricity per year, or roughly what the entire country of Australia consumes today." One alternative is the optical chip, which uses beams of light rather than electrical signals to transfer data, cutting power consumption.
What's the Big Idea?
The good news is that microchip manufacturing is an area where America still enjoys a comparative advantage. The bad news is that current chip-making processes are not compatible with the additional requirements of light-based chips. "The most intuitive way to add optics to a microprocessor’s electronics would be to build both directly on the same piece of silicon, a technique known as monolithic integration." Working at IBM facilities, MIT researchers are looking for ways to harmonize the two processes.
Photo credit: shutterstock.com