Saturday, April 24, 2010

Optical computing

ABSTRACT
We live in interesting times. By exceedingly good fortune, we happen to live in an era where computing is taking on new meanings and manifesting itself in new spheres of activity by the day. In these circumstances, the need for processing power will not be denied. By the limitations of nature, silicon can aid us in our quest for dominion only to a certain extent. Overcoming maudlin sentimentality will be our biggest challenge in the coming generation, for the future belongs elsewhere. Nanotechnology, molecular technology, and quantum computing have all shown tremendous promise as the next big wave of the future. However, while the former two are further branches down the same dead-end road, the latter has yet to break the shackles of science fiction in a convincing way. An intermediate step down this road, we believe, lies in the direction of optical computing. With the exciting discoveries that have captured the attention of the whole world in the last year, we can no longer afford to let optical computing remain a term in the dictionary. If we are to remain competitive, the future lies that way.
INTRODUCTION
Squeezed light, holograms, and lasers sound like things you'd find in a science-fiction novel, but they can also be found in labs around the world, where they are used in the "thinking" machines of tomorrow: optical computers. Because they are based on light-wave technology, optical computers can process information a million or more times faster than electronic computers. They are inherently parallel processors and almost completely immune to interference.
Optical computers use laser beams in place of wires. Unlike wires, laser beams can cross and intersect without affecting one another. Furthermore, multiple beams can converge on a single switching point, with any combination of one or more beams triggering the switch. An electronic equivalent of such a multiple-input switch is much more complex. Optical computers have all these advantages because of the fundamental nature of light.

Photons: Quantum theory tells us that light has the properties of both waves and particles. When discussing its particle nature, we call the particles "photons." However, because of light's wave-like properties, photons can do things that are impossible for typical particles such as electrons. For example, thousands of photons can pass through a single point simultaneously without interfering with one another. Photons can also travel faster than electrons, which makes faster computational speeds possible.
As we'll discuss later on, light can also be used to represent information in many different ways. For example, one could modulate the brightness (photons per second) of a beam of light, which would produce an amplitude-modulated (AM) signal for analog computing. AM signals can also be used to transmit binary data: you just need to treat brightnesses above a chosen threshold as ones and brightnesses below it as zeroes. Furthermore, we can frequency-modulate (FM) light; changing the frequency is equivalent to changing its color. More advanced methods of light manipulation, such as spatial modulation and holograms, are discussed below. All these intriguing possibilities have been tempting scientists since the 1950s, but the technology to support them only began to appear with breakthrough research dating back to the eighties.
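To make the thresholding idea concrete, here is a minimal sketch in Python; the detector readings, the threshold value, and the function name are illustrative inventions rather than part of any real system:

```python
# Minimal sketch: decoding an amplitude-modulated light signal into bits by
# thresholding brightness samples. All values below are illustrative.

def decode_am(brightness_samples, threshold):
    """Map each brightness reading (photons per second, arbitrary units)
    to a binary digit: at or above the threshold is a one, below is a zero."""
    return [1 if sample >= threshold else 0 for sample in brightness_samples]

# A hypothetical stream of detector readings: bright, dim, bright, bright, ...
samples = [0.93, 0.07, 0.88, 0.91, 0.05, 0.12, 0.95, 0.02]
print(decode_am(samples, threshold=0.5))  # [1, 0, 1, 1, 0, 0, 1, 0]
```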
The Early Days: The early optical-computer research in the 1950s was performed using mercury-arc lamps and sunlight. The method proved less than effective. Today, the laser (invented in 1960) is the key to optical computing. A laser produces a single coherent beam of light (all the light has the same frequency, energy, phase, and direction) that is used to transmit optical information in a concise, coherent, and controlled manner.
Various attempts at building optical computers during the seventies and eighties met with some small successes, but the real advances had to wait for optical switches and semiconductor lasers.
The problem with most lasers is that they are somewhat large. An optical computer may need thousands or even millions of controlled laser beams. We can create them by splitting a single beam into as many beams as necessary, but that is a messy approach. A better solution is provided by the semiconductor laser.
The first semiconductor lasers worked by applying a current through alternating layers of gallium arsenide (GaAs) semiconductor material; the moving electrons generate in-phase photons, which emerge from the edge of the layered semiconductor material as a coherent laser beam.
CURRENT OPTICAL TECHNOLOGY:
Recent semiconductor lasers take advantage of quantum effects that result from the physical layout of chip layers. This technology has given us "quantum-well" lasers. Although these laser chips put out only a few milliwatts of power, they are useful in CD players, laser-based "tape measures," and optical telephone circuits.
Quantum wire laser: One step beyond the quantum-well laser is the quantum-wire laser. Quantum-wire lasers are composed of alternating layers of GaAs and aluminum gallium arsenide (AlGaAs). These efficient diode lasers are smaller and more powerful than their predecessors, producing about 10 milliwatts of output power. Optical computing requires this greater power because the beam must remain sufficiently strong even after it is split.
The problem with quantum-wire lasers is the high cost of growing the quantum wires found in the AlGaAs lasers. The price should fall dramatically in the near future with improved manufacturing techniques and larger quantities. Scientists in Japan's Basic Research Labs have predicted that quantum-wire lasers should be able to switch on and off at rates up to 100 GHz.
As mentioned earlier, there are many basic methods of sending signals by light. The simplest technique is to turn the beam on and off, like Morse code. As previously stated, the presence of a beam could denote a one and its absence a zero.
That is the binary method used in the most widely known optical computer, built at AT&T Bell Labs by Alan Huang. Huang has been working in the optical-computer field for over thirty years. When he started thinking about optical computers, lasers and semiconductor chips were both relatively new developments. In the beginning, he had to work with crude technology. He then had to wait for many new developments, such as better lasers, ICs, and optical switches.
SEEDs: The switches, known as Self Electro-optic Effect Devices (SEEDs), are key to the computer's operation. A control laser beam turns each switch on or off. The switch controls the passage of a second laser beam, the signal beam, based on the presence or absence of the control beam.
There are two classes of optical switch: transmissive and reflective. A transmissive switch either blocks the signal beam or allows it to pass to its destination. A reflective switch reflects the signal beam to a destination or prevents it from getting there, either absorbing it or permitting it to pass through to somewhere else.
Regardless of its type, when a switch is on, the signal beam can continue to travel. When it's off, the signal beam is stopped, so a SEED acts exactly like a transistor in an electronic computer. In fact, an optical computer works like any other computer; it just uses the optical switches and laser beams in place of transistors and electric currents, respectively.
Although David Miller (also of Bell Labs) developed the switches in 1986, it still took five years to build an optical computer. Alan Huang and twelve colleagues built an optical computer at Bell Labs early in the nineties. It had 8,000 optical switches, each only ten micrometers (0.0004 inch) wide. Huang's optical computer used only a small percentage of its thousands of switches. It only counted, but even that was significant for a completely optical computer. It proved the theory behind optical computing.
Huang's computer used the SEED switches, connected as NOR gates, to form two eight-bit counters. Each NOR gate has a switching time of one nanosecond. That compares favorably to electronic NOR gates, which switch in 5 to 50 nanoseconds.
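To see how counting can emerge from nothing but NOR gates, the toy model below may help. It is a sketch of the principle only, not of the actual Bell Labs circuit: a SEED-style switch that passes the signal beam only when neither control beam is present behaves as a NOR gate, and NOR is functionally complete, so adders and counters can be composed from it.

```python
# Toy model of counting with NOR gates. A SEED-like switch that passes the
# signal only when both control beams are absent acts as a NOR gate; every
# other gate below is built from that one primitive. Illustrative sketch only.

def nor(a, b):
    """Signal passes (1) only if both control inputs are absent (0)."""
    return 0 if (a or b) else 1

# Standard NOR-only constructions of the other logic gates.
def not_(a):    return nor(a, a)
def or_(a, b):  return not_(nor(a, b))
def and_(a, b): return nor(not_(a), not_(b))
def xor(a, b):  return or_(and_(a, not_(b)), and_(not_(a), b))

def increment(bits):
    """Add one to a little-endian list of bits using half adders
    (sum = XOR, carry = AND), all ultimately built from NOR gates."""
    carry = 1
    out = []
    for b in bits:
        out.append(xor(b, carry))
        carry = and_(b, carry)
    return out  # the final carry is dropped, as in a fixed-width counter

# Count an eight-bit register upward from zero.
counter = [0] * 8
for _ in range(5):
    counter = increment(counter)
print(counter)  # [1, 0, 1, 0, 0, 0, 0, 0] -> 5 in little-endian binary
```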
The computer also uses two ten-milliwatt lasers and various lenses, beam splitters, and pattern masks. Optical computers have one problem that electronic computers do not: alignment. You can't do much computing if a beam misses a switch, and it takes considerable work to line up all the beams precisely. Alignment difficulties are among the reasons Huang's computer uses only part of its capability. That isn't a problem in a standard computer, since the electrons travel within conductors (mask-registration difficulties during IC wafer fabrication notwithstanding). Once an IC chip is built and tested, it will always work without further alignment adjustments.
As we said earlier, the AT&T computer was a straightforward reproduction of existing computer architecture on a different medium--light. There are other ways of using light to compute; let's look at some of the alternatives.
Spatial Light Modulators: Spatial Light Modulators (SLMs) take advantage of light's unique properties. They direct multiple beams in multiple directions to permit parallel-processing operation. SLMs are like a cross between a piece of photographic film and a Liquid-Crystal Display (LCD). They are made up of many tiny squares, and electronics or light controls each square. A square allows all, some, or none of the signal beam to pass.
One of their primary uses is pattern matching. An input signal controls one SLM. The result comes from comparing its output to a second SLM controlled by the computer. This method can determine exact matches or near misses. It also gives the answer, literally, at the speed of light, allowing for easier and faster "fuzzy-logic" matching than today's computers.
Like other optical switches, SLMs can be either transmissive or reflective. The transmissive type either passes or stops the light. The reflective type either reflects or absorbs (redirects) the light. The reflective type requires beam splitters to direct the reflected light.
As mentioned, SLMs can store reference patterns. These patterns might be actual images, numbers, or any other encoded information. They can hold binary numbers by encoding them positionally along the squares. With proper encoding and positioning, they perform extremely fast mathematical calculations. Using two SLMs and passing light through their associated squares allows them to add, subtract, multiply, or divide. The nice thing is that the calculation takes place immediately, regardless of the length of the number. In a digital computer, calculations usually take a considerable amount of digit shifting and manipulation. An optical computer operates on the entire number simultaneously. It is limited only by the number of squares and the complexity of the SLM.
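The following rough numerical analogue, written with NumPy, may make the parallel nature of this clearer. The arrays stand in for the grids of squares; element-wise multiplication models light passing through two stacked SLMs, and summing the result models a detector reading the total transmitted light as a match score. The 3x3 patterns are invented purely for illustration.

```python
# Rough software analogue of two stacked spatial light modulators.
# Each array element is a square's transmittance (0 = opaque, 1 = clear).
import numpy as np

def transmit(slm_a, slm_b, illumination=1.0):
    """Light through two SLMs in series: each square passes the product
    of the two transmittances, for every square at once."""
    return illumination * slm_a * slm_b

def match_score(pattern, reference):
    """Total light reaching the detector; larger means a closer match."""
    return transmit(pattern, reference).sum()

reference = np.array([[1, 0, 1],
                      [0, 1, 0],
                      [1, 0, 1]], dtype=float)

exact = reference.copy()
near = reference.copy()
near[0, 0] = 0  # one square differs from the reference

print(match_score(exact, reference))  # 5.0 -- every bright square lines up
print(match_score(near, reference))   # 4.0 -- a near miss scores lower
```

Every square is compared in a single element-wise operation, which is the software analogue of the all-at-once calculation described above.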
Holograms: Holographic computers work similarly to SLMs, but with greater accuracy. Such computers can compare a holographic image with a reference hologram. The reference hologram must be created specifically for the task and can be either computer generated or created from real-world input, such as an image or other signal.
To use a holographic computer, you apply a holographic input signal to the reference hologram, which is used as a filter. The resulting light pattern is usually monitored by a charge-coupled device (CCD) array. The CCD is a digital-imaging unit, like a television camera, that is used for optical imaging in camcorders, telescopes, and other devices. A CCD produces a digital output representing any image focused on its surface. This combination of holographic filtering and CCD matching and monitoring can identify faces, fingerprints, or parts on an assembly line.
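A numerical stand-in for this kind of holographic matched filtering is sketched below. In an optical correlator, the reference hologram acts as a filter in the Fourier plane and the CCD records the resulting correlation pattern; here NumPy FFTs play the role of the lenses, and the 8x8 "images" are random arrays invented for illustration.

```python
# Software stand-in for a holographic matched filter: correlate a scene with
# a reference pattern via the Fourier plane and read off the peak, the way a
# CCD would read the bright spot in the correlation plane.
import numpy as np

def correlate(scene, reference):
    """Circular cross-correlation of scene with reference via FFTs."""
    filt = np.conj(np.fft.fft2(reference))           # the "hologram" filter
    corr = np.fft.ifft2(np.fft.fft2(scene) * filt)   # back to the output plane
    return np.abs(corr)                              # what the CCD would record

rng = np.random.default_rng(0)
reference = rng.random((8, 8))
scene = np.roll(reference, shift=(2, 3), axis=(0, 1))  # same pattern, shifted

ccd = correlate(scene, reference)
peak = np.unravel_index(np.argmax(ccd), ccd.shape)
print(peak)  # (2, 3): a strong correlation peak at the pattern's offset
```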
Holograms are also being used to aid in data transfer and storage. Smart-Pixel-Array (SPA) modules use hologram arrays to help direct light sent by tiny Vertical-Cavity Surface-Emitting Lasers (VCSELs). Researchers at the University of Colorado at Boulder are currently working with SPA modules for their ongoing optical-computer research.
Quantum Limits: According to the Heisenberg Uncertainty Principle, the more precisely you know a photon's energy and momentum, the less precisely you can know its position in space and time. Since a laser beam consists of photons that have approximately the same energy and frequency, we know the energy of the photons pretty well. That limits the certainty with which we can know where a particular photon is in space.
Because the most we can say about a photon's location is that it will be within a given area, we must allow for detection of photons over the entire area. That limits the minimum size and the applications of optical devices. Even so, the limit is so small that it is usually not a problem. By the time we reach the point where we must deal with the positions of single photons, we may have completely new computing methods or have learned enough that the uncertainty doesn't matter.
Scientists are using "squeezed light" to reduce some of the uncertainty. They do this by controlling a laser beam to create areas of greater uncertainty at certain points along the beam. Since the overall uncertainty is conserved, this process results in areas of lower uncertainty elsewhere in the beam. In other words, there are points along the beam where the photons are restricted to a smaller area than average; we are more certain where they are. By concentrating most of the uncertainty in one part of the beam, we can work more precisely with the rest.
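For reference, the trade-off can be written compactly. Below, x and p denote position and momentum, while X_1 and X_2 denote the two quadratures of the light field that squeezing plays off against one another; the 1/4 bound follows from the common convention in which the quadrature commutator is i/2, and other conventions simply rescale the constant:

\[
  \Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
  \qquad
  \Delta X_1 \,\Delta X_2 \;\ge\; \frac{1}{4}.
\]

Squeezed light pushes \(\Delta X_1\) below its symmetric value at the cost of a larger \(\Delta X_2\); the product still respects the bound, which is the precise sense in which the uncertainty is only redistributed, never eliminated.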
Researchers in Colorado have managed to steer rubidium atoms through fibers as narrow as 10 microns. Advances in particle control may lead to the "painting" of circuits on an atomic scale--something far more practical than IBM's demonstration of writing the letters "IBM" with individual gold atoms.
ADVANTAGES OF OPTICS
FASTER TRANSMISSION:
Coherent light, which permits a whole range of processing capabilities, may be generated inexpensively by laser diodes, whose price has dropped rapidly in the last few years due to mass production. A cheap CD player in the home contains several of them. Laser diodes can be modulated at 30 GHz. The advantage of optics over electronics is the higher bandwidth, which enables more information to be carried. This is because electronic communication along copper requires charging a capacitance that varies with length. In contrast, optical signals in optical fibers, optical ICs, and free space don't have to charge a capacitor and are therefore faster. This faster transmission is important because transmission time between units is often the limiting factor for performance on high-speed machines. Faster transmission permits faster computational elements to be used. Very high-speed machines use additional power to provide speed and have elements located close to one another to limit transmission time.
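A back-of-the-envelope comparison makes the scaling difference explicit. For a copper line modelled as a distributed RC interconnect with resistance r and capacitance c per unit length, the classic Elmore estimate of the charging delay grows with the square of the length \(\ell\), whereas an optical signal in a medium of refractive index n simply propagates at \(c_0/n\):

\[
  t_{\text{copper}} \;\approx\; \tfrac{1}{2}\, r\, c\, \ell^{2},
  \qquad
  t_{\text{optical}} \;=\; \frac{n\,\ell}{c_{0}} .
\]

Doubling the length of the copper interconnect roughly quadruples its charging delay, while doubling the optical path merely doubles its delay.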

LESS INTERFERENCE:
Another advantage of optics comes from the properties of photons: they don't interact with one another the way electrons do. Consequently, light beams may pass through one another without distorting the information carried. This suggests that optical memory may be able to avoid the difficulties of memory contention, at least during reads. Loops of connections are difficult to avoid in massively parallel systems. In the case of electrons, loops will generate noise voltage spikes whenever the electromagnetic fields through the loops change. Furthermore, high-frequency or fast-switching pulses cause interference in neighboring wires. Signals in adjacent fibers or in optical integrated channels do not interfere with each other, nor do they pick up noise due to loops.

PARALLELISM:
Yet another advantage of optics is that images are arrays of pixels that may be handled in parallel. Thus, it is conceivable to process a million elements or more in parallel by formulating a problem as a sequence of steps on a 2-D array.
In the past, the lack of interference between photons made it difficult to use a small signal to control a large signal to produce gain, as in a transistor. Recently, however, very high-speed, low-switching-energy devices have been demonstrated by increasing the non-linearity using GaAs quantum wells and other structures. Consequently, optical switches now have performance comparable to electronic ones. The movement from silicon to GaAs for high-speed electronics encourages the use of optics for on-chip and between-chip interconnection and switching, avoiding the need, and the energy cost, of converting optical signals on fibers into electronic form for telephone switching.

SUPERIOR STORAGE CAPABILITY:
A further advantage of optics for computers results from the superior storage capacity and accessibility of optical materials over magnetic materials. Magnetic disks require pick-up coils to float within one micron of the surface. Optical disks use focused laser beams to read the information, so the light source does not have to be as close to the storage material.

In short, optics is:
* Immune to electromagnetic interference
* Free from electrical short circuits
* Able to have low-loss transmission
* Able to provide large bandwidth (capable of communicating several channels in parallel without interference)
* Capable of propagating signals within the same or adjacent fibers with no interference
* Compact, lightweight, and inexpensive to manufacture

WHY THE NEED FOR SPEED?
Present-day computers are limited by the inherent constraints of electronics. They just aren't fast enough to keep up with the demands of the modern Internet world. Currently, the information in a computer is passed through copper wires. In an optical computer, information would be carried by light beams at the speed of light, which is 186,000 miles per second. That's pretty fast.
Manjari Mehta of the Information Systems Research Center (ISRC) at the University of Houston explains the "inefficiency" of electronic computers best through the following comparison.
"Take, for example, e-mail. Today a message is first converted from electronic to photonic form and then transmitted over fiber-optic cables. The light signal at the other end must then be converted back into electronic form for processing by the receiving computer. These conversions are inefficient and limit the instantaneous nature of computing. If we can find ways of storing the optical message in photonic format and then processing those light encoded signals, there would be no need to convert from electronic to photonic form. In simplified terms, this describes an optical computer--performing computations, operating, storing, and transmitting data--using only light."
So again, why do we need computers to be so darn fast? According to Dr. Hossin Abdeldayem of NASA, terabit speeds, or one trillion bits per second, are needed to accommodate the growth rate of the Internet and the increasing demand for bandwidth-intensive data streams.

FACT OR FANTASY:
Optical Computers, long a dream of computer scientists, have moved a step closer to reality. Researchers at AT&T Bell Labs developed a 2-kbit photonic integrated circuit that they say could be used to build basic optical computing systems.
"Optical computers have been mostly on paper. Now real experimental prototypes can be built," says Leo Chirovsky, the AT&T researcher who designed the new photonic IC. "This device provides the first viable building blocks for optical computers,' he adds.
SEEDS OF TECHNOLOGY:
The 2 mm x 2 mm device is made up of an array of gallium-arsenide and aluminum-gallium-arsenide multiple-quantum-well structures. The chip has 2,048 elements, and subnanosecond switching speeds requiring only 2.5 pJ of energy have been measured.
Each element on the chip is a symmetric self electro-optic effect device (S-SEED), a technology first developed at Bell Labs in 1987. Each element can operate as a logic gate, a memory cell, or a switch. The chip has all-optical inputs and outputs.
The most sophisticated photonic chip prior to AT&T's was a 1-kbit chip developed by NEC. It does not have optical inputs and functions only as a memory device. The optical I/O capabilities of the AT&T device mean data can be moved in and out simultaneously, greatly increasing its processing speed.
Speed is optical computing's greatest asset. But the speed of any computer is determined by the speed of the input and output of data. Like a large city served only by one-lane highways, a chip's processing efficiency is reduced when its I/O ports create data gridlock.
The researchers are trying to bring I/O up to speed with processing capability. Faster I/O speeds allow the chip to achieve massive connectivity, and thus a high degree of parallel processing. Until now, parallel processing at the chip level simply has not been possible in photonics or electronics.
A key to the new device is that it overcomes some of the stumbling blocks encountered with photonic ICs. Problems with bistability and cascadability of the device have held back development of photonic ICs. But the Bell researchers say they have overcome these problems and made a more robust chip.
The device acts just like a transistor, operating as a three-terminal device rather than just a bistable one. This makes it much easier to operate, because the device is tolerant of non-uniformities and of the biases applied to it.
Scientists at Intel have created another device made of silicon that can encode data onto beams of light at very fast speeds. The device, called an optical modulator, will lead to dramatically faster and more powerful microchips that shuffle data around the Internet using light instead of electricity.
The optical modulator transmits data by turning on and off, just like electronic pulses create the ones and zeroes that compose the binary language of computers. But instead of electrons, the modulator creates flickers in light beams that can operate at a speed of roughly 1 gigahertz -- meaning it can cycle about 1 billion times a second. Other optical modulators work at even higher rates -- in the tens of gigahertz -- but they are not based on silicon.
Until now, optical devices have mostly been built out of exotic materials such as lithium niobate and gallium arsenide, which are hard to manufacture and very expensive. Intel is doing it all on silicon. The modulator can be made alongside other products and, in the long term, have electronics integrated onto it; no dedicated facility is needed. The ability to use existing manufacturing processes and to apply personal-computing economics to these devices will dramatically reduce costs and sizes.
Previously, the best silicon optical modulators operated at the comparatively slow speed of 20 megahertz, or 20 million cycles per second. The new device is about 50 times faster than the previous world record in silicon. This also makes it plausible for designers to consider replacing existing optical modulators with ones made of silicon.
An optical modulator works by directing a laser beam into two waveguides -- the equivalent of wires for light. By also feeding electric current into the modulator, the researchers altered the physical properties of one of the two waveguides, causing light to pass through the altered guide more slowly.
The peaks and troughs of the slowed-down waves interfere with those in the untouched waveguide. This results in a canceled light signal -- the digital equivalent of a zero.
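A minimal sketch of that two-waveguide interference follows, assuming an ideal, lossless 50/50 split and recombination (the cosine-squared relation is the textbook result for such an interferometer; the specific values are illustrative):

```python
# Output of a two-arm interferometric modulator as a function of the phase
# shift picked up in the slowed waveguide. Ideal 50/50 split assumed.
import math

def output_intensity(phase_shift, input_intensity=1.0):
    """Recombined intensity: I_out = I_in * cos^2(phase_shift / 2)."""
    return input_intensity * math.cos(phase_shift / 2.0) ** 2

print(output_intensity(0.0))      # 1.0: arms in phase, full brightness -> a "one"
print(output_intensity(math.pi))  # ~0.0: half-wave shift, cancellation -> a "zero"
```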
After all these quotations, if you are still in disbelief about the practicality of this technology, the following facts should make you a believer.

- Optical digital signal processors the size of an IC have been developed that far outstrip their electronic competition.
- Carbon nanotubes have been shown to act as optical antennas, which will make optical transmission a cinch.
- Optical modulators, which until recently were the size of big shoeboxes, are now as small as their electronic counterparts.
So as you can see, optical computing is well out of the fiction books and into your homes.
CONCLUSION:
Although no single person has the correct or official perspective on a field as dynamic as optical computing, it may be useful to consider the ruminations of two greenhorns still struggling to think through its implications. In that spirit, we have offered this purely personal perspective.
As we hope we have made abundantly clear in the last few pages, optical computing can no longer be relegated to the shoe closet in which it has been stagnating since the end of the previous decade. With the advent of groundbreaking technology, the stuff of dreams is coming true. Last year was a particularly successful year for the optical computing community, with a number of amazing breakthroughs across many aspects of optics. Intel, Lumera, Lenslet, and Boulder Non-Linear Systems all came forward with new innovations that could radically change the innards, if not the face, of computing as we know it. In this context, ignorance of this topic would be tantamount to a sin. This paper is, at heart, an attempt to bring the subject out into the spotlight to take its rightful place alongside the hotly pursued topics of today, such as Bluetooth, nanotechnology, and the like. We hope our small, stuttering baby steps have accomplished something worthwhile.
BIBLIOGRAPHY:
• Poptronics, "Computing with Light," HighBeam Research.
• R&D, "Microprocessor timeline encompasses multiple technologies."
• Poptronics, "Optical Computing: The Wave of the Future," HighBeam Research.
• H. John Caulfield, "Perspectives on Optical Computing," Northeast Photosciences Inc.
