Building a computer that solves practical problems at the speed of light



The end of Moore’s Law

In 1965, the engineer (and a founder of Intel) Gordon Moore predicted that the number of transistors in an integrated circuit would double every year. He later changed his prediction to every two years, and for decades, the capacity of computers has increased at roughly that rate, getting progressively faster and smaller without getting more expensive. But in the past decade, the trend has plateaued. At the same time, demand for computing capacity and speed has only grown.

Kirill Kalinin with Christos Gkantsidis and Hitesh Ballani
From left, Kirill Kalinin, Christos Gkantsidis and Hitesh Ballani, researchers at the Microsoft Research Lab in Cambridge, U.K., discuss the transaction settlement problem posed by Lee Braine of Barclays. Photo by Chris Welsch for Microsoft.

“The problem is, once you get past this inflection point, it becomes much more difficult to sustain that kind of growth,” says Hitesh Ballani, one of the Microsoft researchers working on the optical computer, explaining the urgency behind developing alternative technologies like optics. “Because we had already been working on optical storage and networking, it was kind of organic to move to optical computing, although that is the toughest nut to crack.”

The lab in Cambridge has had some success with optical storage: the team developed a system for storing enormous amounts of data embedded in pieces of glass.

In a meeting room at the lab, Ballani speaks rapidly and cheerfully. He explains the basics of optical computing and why the team brought in a mathematician to help develop a new type of algorithm to solve optimization problems. He uses a red marker to cover a whiteboard and then two sets of floor-to-ceiling windows with notes, equations and graphs to illustrate his points.

“It is not a general-purpose computer,” he says. “But it is very, very useful for accelerating applications where these mathematical operations, linear algebra and non-linear algebra, are the key operational bottlenecks.”

For nearly 50 years, light has been used to transmit data through fiber-optic cables. Photons do not interact with each other, but when passed through an intermediary, like the sensor in your smartphone camera, they can, in a sense, be read.

In the case of AIM, short for Analog Iterative Machine, different intensities of light can be used to add and multiply, the operations at the heart of optimization problems. Operating at the speed of light, advanced versions of AIM should be about a hundred times faster than binary computers, Ballani says. Further, computation and storage happen in the same place in AIM, unlike binary computers, which need memory in one location and compute in another to function.
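As a rough illustration of the principle (not of the AIM hardware itself), a weighted sum can be pictured as beams of light passing through attenuators onto a single detector: each attenuation acts as a multiplication, and the detector, by collecting all the arriving intensities at once, performs the addition. The Python sketch below models that idea with made-up numbers.

```python
# Toy model of an optical multiply-accumulate (illustrative only, not the AIM design).
# Each input value is encoded as a light intensity, and each weight as an
# attenuation factor between 0 and 1. A detector collecting all the attenuated
# beams effectively sums them, so one pass of light yields a whole weighted sum.

inputs = [0.8, 0.3, 0.5]    # light intensities encoding the input values
weights = [0.9, 0.2, 0.7]   # attenuation factors encoding the weights

# Multiplication happens as each beam passes through its attenuator...
attenuated = [x * w for x, w in zip(inputs, weights)]

# ...and addition happens when the detector collects all the beams at once.
detector_reading = sum(attenuated)

print(detector_reading)     # 0.8*0.9 + 0.3*0.2 + 0.5*0.7 = 1.13
```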

Breaking new ground in algorithms

As an example of the kind of problem AIM could solve, Ballani cites an exchange with a Microsoft health researcher about ways to reduce the time needed to conduct an MRI scan without sacrificing resolution. (Typically, scans take between 15 and 90 minutes, depending on the size of the area being scanned.) Some techniques to shorten that time are already in use but involve compromises. Running what is now a time-consuming optimization equation would theoretically bring more accuracy and speed. “If we are able to solve the optimization problem very, very quickly, it might be possible to do an MRI in less than a minute,” Ballani says.

Francesca Parmigiani
Francesca Parmigiani leads the team that built the hardware of the new optical computer at the Microsoft Research Lab in Cambridge, U.K. Photo by Chris Welsch for Microsoft.

Francesca Parmigiani, the third primary researcher on AIM, did her Ph.D. in the field of optical communication. Now she is leading the effort to build the optical computer itself. She and her small team are currently developing an upgraded version that will operate with 48 variables, greatly expanding the complexity of problems the optical computer can solve. Eventually, they hope to build a version of AIM with thousands of variables.

The AIM team is building, and now upgrading, AIM from components that already exist and are already manufactured at scale, from fiber-optic cables to modulators to micro-LED lights. As it exists now, the computer sits on a metal bench about the size of a dining room table, with tangles of wires emerging from modulators and linking to what the researchers sometimes refer to as a “projector,” a device similar to a multimedia projector that stores and computes the data.

“As I pivoted to building this computer, I had to learn a lot,” Parmigiani says. “I had no clue about optimization.”

The process of building AIM and mapping problems onto its novel form has involved a large amount of give-and-take between Parmigiani’s optical and analog team, which works on the hardware, and Ballani, Gkantsidis and the mathematician Kirill Kalinin, who work on the algorithms and software that will run on it. The researchers say the innovations in math and algorithms they have developed are as critical to solving optimization problems as the machine itself. The novel type of algorithm used in AIM is known as QUMO, for quadratic unconstrained mixed optimization, and its use with the optical computer is what makes AIM unique in the world.
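In broad terms, a QUMO problem asks for values of a mix of binary and continuous variables that minimize a quadratic cost function. The sketch below evaluates one such objective with made-up coefficients; it is purely illustrative and not the team’s actual formulation.

```python
import numpy as np

# Illustrative QUMO-style objective: a quadratic cost over mixed variables,
# where some entries of x are binary (0 or 1) and the rest are continuous.
# Q and h are made-up coefficients used only for this example.
Q = np.array([[ 0.0, -1.0,  0.5],
              [-1.0,  0.0,  2.0],
              [ 0.5,  2.0,  0.0]])
h = np.array([1.0, -0.5, 0.25])

def qumo_objective(x):
    """Quadratic cost x^T Q x + h^T x, which the solver tries to minimize."""
    return float(x @ Q @ x + h @ x)

# One candidate assignment: first two variables binary, third continuous.
x = np.array([1.0, 0.0, 0.3])
print(qumo_objective(x))
```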

“The story has been changing as we move forward because we learn what makes sense and what doesn’t make sense,” Parmigiani says. “We realized we really need to work very hard to figure out how to co-design the hardware with the algorithm.”

Grace Brennan surrounded by computer equipment and wires
Grace Brennan, part of the team building an upgraded version of AIM, at the Microsoft Research Lab in Cambridge, U.K. Photo by Chris Welsch for Microsoft.

Working ‘at the leading edge’

The AIM team is now turning its attention to testing the device and the QUMO algorithm on problems proposed by industry experts and academics. They are opening a service built around an AIM simulator that solves large optimization problems on a graphics processing unit (GPU). The team wants more test cases to help them learn about the potential of the tool they have built.

The transaction settlement problem proposed by Lee Braine of Barclays is a priority.

The problem is difficult to solve because of the sheer volume of transactions. Braine says these transactions are usually described as delivery versus payment. A simple example is the delivery of a security for a cash payment – 100 shares in a company for $1,000. The difficulty is that every transaction and every player is subject to various constraints, including regulations and the balances available.
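To make the shape of the problem concrete, a toy version can be written down directly: choose which delivery-versus-payment transactions to settle so that the settled value is as large as possible while no party’s securities or cash balance goes negative. The brute-force sketch below uses invented parties and balances; the real problem involves far too many transactions and constraints for exhaustive search, which is what makes it a candidate for AIM.

```python
from itertools import combinations

# Toy delivery-versus-payment settlement problem (illustrative only).
# Each transaction: (seller, buyer, shares delivered, cash paid).
transactions = [
    ("A", "B", 100, 1000),   # A delivers 100 shares to B for $1,000
    ("B", "C", 50, 600),
    ("C", "A", 80, 900),
]

# Starting balances per party: (shares held, cash held) -- made-up numbers.
start = {"A": (100, 500), "B": (0, 1000), "C": (60, 700)}

def feasible(subset):
    """Settling the subset simultaneously must leave no negative balance."""
    shares = {p: s for p, (s, _) in start.items()}
    cash = {p: c for p, (_, c) in start.items()}
    for seller, buyer, qty, price in subset:
        shares[seller] -= qty
        shares[buyer] += qty
        cash[buyer] -= price
        cash[seller] += price
    return min(shares.values()) >= 0 and min(cash.values()) >= 0

# Brute force over every subset, keeping the one that settles the most cash value.
best = max(
    (s for r in range(len(transactions) + 1)
     for s in combinations(transactions, r) if feasible(s)),
    key=lambda s: sum(t[3] for t in s),
)
print([f"{t[0]}->{t[1]}" for t in best])   # all three settle thanks to netting
```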
