Why do computers use electricity instead of light? We know that electricity travels 100 times slower than light. So, wouldn’t computers be much faster if they used light instead of electricity for their operation?

First off, you're wrong about the speed at which electricity travels. Individual electrons drift slowly, but you're rarely concerned with the motion of an individual electron (ballistic transistors being the exception, of course). What matters is the collective motion of the group and the resulting electromagnetic wave (the propagating voltage wave), which travels at a large fraction of the speed of light, depending on the material.
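To make the distinction concrete, here is a rough back-of-the-envelope sketch comparing electron drift velocity with signal propagation speed in a copper wire. The current, wire cross-section, and 0.7c signal speed are illustrative assumptions, not figures from the post:

```python
# Electron drift velocity: v = I / (n * q * A)
I = 1.0           # current in amperes (assumed)
A = 1e-6          # wire cross-section in m^2, roughly 1 mm^2 (assumed)
n = 8.5e28        # free-electron density of copper, electrons per m^3
q = 1.602e-19     # elementary charge in coulombs

drift = I / (n * q * A)   # individual electrons: micrometres per second
c = 3.0e8                 # speed of light in vacuum, m/s
signal = 0.7 * c          # typical signal speed in a cable, assumed ~0.7c

print(f"drift velocity ~ {drift:.1e} m/s")   # tens of micrometres per second
print(f"signal speed  ~ {signal:.1e} m/s")   # hundreds of millions of m/s
```

The electrons themselves crawl, yet the voltage wave they carry crosses a board in nanoseconds; that is the gap the answer is pointing at.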

But I digress. The reasons we choose electricity over light are many.

1.) Silicon electronics are inexpensive. Silicon is one of the most heavily engineered materials in history; the only materials with more accumulated man-hours are probably steel and concrete, and only because they've been in use so much longer. Optics, by contrast, are costly at the scales where silicon electronics excel. It's quite possible optics will never be feasible at the 10 nm scale, or even the 100 nm scale. (Plasmonics might be able to shrink things to that scale, but I have my doubts right now.)
2.) We can make silicon electronics that are really tiny. With optics, it's not that easy: there are fundamental limitations to contend with, like diffraction.
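The diffraction point can be put in numbers with the Abbe limit, d = λ / (2·NA): even with a very optimistic numerical aperture, light can't be focused much below half its wavelength. The wavelength and NA below are illustrative assumptions:

```python
# Abbe diffraction limit: smallest resolvable/focusable feature d = lambda / (2 * NA)
wavelength = 1550e-9   # common telecom wavelength, metres (assumed choice)
NA = 1.0               # numerical aperture; 1.0 is already very optimistic

d = wavelength / (2 * NA)
print(f"smallest focusable spot ~ {d * 1e9:.0f} nm")  # 775 nm
```

Compare that ~775 nm spot with transistor features measured in tens of nanometres: optics is fighting physics roughly two orders of magnitude before it even starts.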

3.) Silicon electronic devices are simple. Not simple in the sense that a child can master them, but simple in that we genuinely understand the underlying mechanisms and how they behave when we create billions of devices on a single sheet of material. Optics, however, are quite difficult. Integrated optics is an intriguing research field precisely because it's so hard, and we can't yet understand it in the same way. Yes, we can create "optical diodes" and "optical transistors" using a variety of techniques, but we don't know enough to make the equivalent of 10 million of them from the same piece of material and ensure that ~99.999 percent of them function. And then there are the interconnects. They're easy in silicon, since it's essentially just laying down metal. With optics, alignment can be an absolute nightmare: you're lucky to get about 50% of the power from one end of the connection to the other, and for interfaces that aren't identical it's generally more like a tenth.
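Those coupling figures sound almost tolerable until you express them in decibels, the unit optical links are actually budgeted in. A minimal sketch of the conversion (the 50% and 10% figures are the ones from the point above):

```python
import math

def loss_db(p_out, p_in):
    """Power loss in dB between input and output of a link."""
    return -10 * math.log10(p_out / p_in)

print(f"50% coupling: {loss_db(0.5, 1.0):.1f} dB loss per connection")  # 3.0 dB
print(f"10% coupling: {loss_db(0.1, 1.0):.1f} dB loss per connection")  # 10.0 dB
```

Losses in dB add up along a chain, so a handful of 3 dB joints already eats most of your optical power budget; at 10 dB per mismatched interface, two hops leave you with 1% of the light.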

4.) CMOS, the most common transistor technology used in computers, draws power only when things are switching. There's also leakage, but the majority of the power consumption comes from switching. The result is that it sips power in tiny quantities.
Optics has no equivalent of CMOS; one way or another, it has to draw power continuously.
Additionally, most optical switching relies on nonlinear optics, which consumes a lot of energy. The power savings promised by some researchers probably won't materialize unless someone invents a "light capacitor" that stores light. (And whoever invents that light capacitor, I assure you, will win a Nobel Prize.)
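The "draws power only when switching" property has a standard formula behind it: dynamic CMOS power is P = α·C·V²·f, where α is the fraction of gates toggling each cycle. The numbers below are illustrative assumptions, not measurements:

```python
# Dynamic (switching) power of a CMOS circuit: P = alpha * C * V^2 * f
alpha = 0.1   # activity factor: fraction of capacitance switched per cycle (assumed)
C = 1e-9      # total switched capacitance in farads (assumed)
V = 1.0       # supply voltage in volts (assumed)
f = 3e9       # clock frequency in hertz (assumed)

P = alpha * C * V**2 * f
print(f"dynamic power ~ {P:.2f} W")  # 0.30 W; drops to zero as alpha -> 0
```

The key term is α: when nothing toggles, the dynamic term vanishes entirely (leaving only leakage). That is the "free when idle" behavior the answer says optics has no counterpart for.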

I could go on and on, but this covers enough of the basics to get you started.

I believe the field is interesting and there are a lot of exciting applications (in-network computing, for example), but I'm not sure it'll ever get an official seat at the general-purpose computing table.

