The never-ending quest for faster, smaller computers that can do more has led manufacturers to design ever tinier transistors that are now packed into computer chips by the tens of billions.
And so far, this tactic has worked. Computers have never been more powerful than they are now. But there are limits: Traditional silicon transistors can only get so small because of difficulties in manufacturing devices that are, in some cases, only a few dozen atoms wide. In response, researchers have begun developing computing technologies, like quantum computers, that do not rely on silicon transistors.
Another avenue of research is photonic computing, which uses light in place of electricity, similar to how fiber optic cables have replaced copper wires in computer networks. New research by Caltech's Alireza Marandi, assistant professor of electrical engineering and applied physics, uses optical hardware to realize cellular automata, computer models consisting of a "world" (a gridded area) containing "cells" (the squares of the grid) that can live, die, reproduce, and evolve into multicellular creatures with their own unique behaviors. These automata have been used to perform computing tasks and, according to Marandi, they are ideally suited to photonic technologies.
"If you compare an optical fiber with a copper cable, you can transfer information much faster with an optical fiber," Marandi says. "The big question is can we utilize that information capacity of light for computing as opposed to just communication? To address this question, we are particularly interested in thinking about unconventional computing hardware architectures that are a better fit for photonics than digital electronics."
Cellular Automata
To fully grasp the hardware Marandi's group designed, it is important to understand what cellular automata are and how they work. Technically speaking, they are computational models, but that term does little to help most people understand them. It is more helpful to think of them as simulated cells that follow a very basic set of rules (each type of automaton has its own set). From these simple rules, incredibly complex behaviors can emerge. One of the best-known cellular automata, the Game of Life (also known as Conway's Game of Life), was developed by English mathematician John Conway in 1970. It has just four rules, applied to a grid of cells that can each be either alive or dead. Those rules are:
- Any live cell with fewer than two live neighbors dies, as if by underpopulation.
- Any live cell with more than three live neighbors dies, as if by overcrowding.
- Any live cell with two or three live neighbors lives to the next generation.
- Any dead cell with exactly three live neighbors will come to life, as if by reproduction.
A computer running the Game of Life repeatedly applies these rules to the world in which the cells live at regular intervals, with each interval counted as a generation; a minimal software version is sketched below. Within a few generations, those simple rules lead to the cells organizing themselves into complex forms with evocative names like loaf, beehive, toad, and heavyweight spaceship.
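To make the rules concrete, here is a minimal sketch of the Game of Life in Python. The grid size, the "glider" starting pattern, and the number of generations are illustrative choices and are not details of the photonic hardware described later; this is simply the conventional digital simulation of the automaton.

```python
# A minimal sketch of Conway's Game of Life, illustrating the four rules above.
# The grid, the glider pattern, and the generation count are illustrative choices.

def step(grid):
    """Apply the four Game of Life rules once, returning the next generation."""
    rows, cols = len(grid), len(grid[0])
    nxt = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            # Count live cells among the eight immediate neighbors (wrapping at the edges).
            neighbors = sum(
                grid[(r + dr) % rows][(c + dc) % cols]
                for dr in (-1, 0, 1)
                for dc in (-1, 0, 1)
                if (dr, dc) != (0, 0)
            )
            if grid[r][c] == 1:
                # A live cell survives only with two or three live neighbors;
                # otherwise it dies of underpopulation or overcrowding.
                nxt[r][c] = 1 if neighbors in (2, 3) else 0
            else:
                # A dead cell with exactly three live neighbors comes to life.
                nxt[r][c] = 1 if neighbors == 3 else 0
    return nxt

# A "glider": a five-cell pattern that crawls diagonally across the grid.
grid = [[0] * 10 for _ in range(10)]
for r, c in [(1, 2), (2, 3), (3, 1), (3, 2), (3, 3)]:
    grid[r][c] = 1

for generation in range(4):
    print(f"generation {generation}")
    print("\n".join("".join("#" if cell else "." for cell in row) for row in grid))
    grid = step(grid)
```

Running it prints the glider moving one step diagonally with each generation, one of the simplest of the self-organizing patterns the article mentions.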
Basic, or "elementary," cellular automata like The Game of Life appeal to researchers working in mathematics and computer science theory, but they can have practical applications too. Some of the elementary cellular automata can be used for random number generation, physics simulations, and cryptography. Others are computationally as powerful as conventional computing architectures—at least in principle. In a sense, these task-oriented cellular automata are akin to an ant colony in which the simple actions of individual ants combine to perform larger collective actions, such as digging tunnels, or collecting food and taking it back to the nest. More "advanced" cellular automata, which have more complicated rules (although still based on neighboring cells), can be used for practical computing tasks such as identifying objects in an image.
Marandi explains: "While we are fascinated by the type of complex behaviors that we can simulate with a relatively simple photonic hardware, we are really excited about the potential of more advanced photonic cellular automata for practical computing applications."
Ideal for Photonic Computing
Marandi says cellular automata are well suited to photonic computing for a couple of reasons. Because information processing happens at an extremely local level (in cellular automata, cells interact only with their immediate neighbors), they eliminate the need for much of the hardware that makes photonic computing difficult: the various gates, switches, and devices that are otherwise required for moving and storing light-based information. And the high-bandwidth nature of photonic computing means cellular automata can run incredibly fast. In traditional computing, cellular automata might be written in a programming language, which is built on a layer of machine language below it, which itself sits atop the binary zeroes and ones that make up digital information.
In contrast, in Marandi's photonic computing device, the cellular automaton's cells are just ultrashort pulses of light, which can allow operation up to three orders of magnitude faster than the fastest digital computers. As those pulses of light interact with each other in a hardware grid, they can process information on the fly without being slowed down by the layers that underlie traditional computing. In essence, traditional computers run digital simulations of cellular automata, but Marandi's device runs the actual cellular automata.
"The ultrafast nature of photonic operations, and the possibility of on-chip realization of photonic cellular automata could lead to next-generation computers that can perform important tasks much more efficiently than digital electronic computers," Marandi says.
The paper describing the work, titled "Photonic Elementary Cellular Automata for Simulation of Complex Phenomena," appears in the May 30 issue of the journal Light: Science & Applications. The lead author is Gordon H.Y. Li (MS '22), a graduate student in applied physics; co-authors are Christian R. Leefmans, a graduate student in applied physics, and James Williams, a graduate student in electrical engineering.
Funding for the research was provided by the U.S. Army Research Office, the Air Force Office of Scientific Research, and the National Science Foundation.