What Is Optical Computing (Computing At The Speed of Light)

Hi, thanks for tuning into Singularity Prosperity. This video is the eighth in a multi-part series discussing computing, and the first discussing non-classical computing. In this video, we'll be discussing what optical computing is and the impact it will have on the field of computing.

The speed of computation is limited by two factors: how fast information can be moved (data transfer) and how fast that information can be processed (data computation). Currently, this limit is imposed by the properties of electricity, the flow of electrons. There is, however, another field of computing focused on a different paradigm: optical computing, also called photonic computing. This refers to the use of light, the flow of photons, in computing and electronics. Light, more specifically infrared light with wavelengths around 1500 nanometers, is currently used for data transfer and communication over long distances; this is referred to as fiber optics. Beyond long-distance data transfer, the principle of using light for data manipulation extends to computing as well:

I work in the area of fiber optics, specifically building new types of logic that can compute all-optically as opposed to electronically. I'm basically taking fiber optics and putting it into your computer to do logic and CPU components. The results of my research will be increasing the bandwidth of computers from gigahertz speeds to terabit-per-second speeds, roughly 1,000 times greater! This type of advancement won't be in your computer tomorrow, but it's the building block for creating the next generation of technology, which you will see years down the road.

Unlike the longer infrared wavelengths needed for long-distance communication to avoid signal degradation, the light used for computation will be in the visible part of the electromagnetic spectrum, with wavelengths in the range of 450 to 700 nanometers.
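As a quick aside, a wavelength maps to an optical frequency via f = c / λ. The short sketch below (plain arithmetic, no special libraries) shows that telecom infrared sits near 200 THz while the visible band mentioned above spans roughly 430 to 670 THz:

```python
# f = c / wavelength: convert the wavelengths quoted above into optical frequencies.
C = 3.0e8  # speed of light in vacuum, m/s

def frequency_thz(wavelength_nm: float) -> float:
    """Optical frequency in terahertz for a wavelength given in nanometres."""
    return C / (wavelength_nm * 1e-9) / 1e12

print(f"1500 nm (telecom infrared): {frequency_thz(1500):.0f} THz")
print(f" 700 nm (red edge):         {frequency_thz(700):.0f} THz")
print(f" 450 nm (blue edge):        {frequency_thz(450):.0f} THz")
```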
This is because when working with small-scale distances, signal degradation isn't an issue but computing speed is. Electronic circuits operate based on millions, even billions, of switches that alternate between an on and an off state; this switching process alone induces latency. Photonics instead uses wave propagation, the interference pattern of waves, to determine a result. This allows for near-instantaneous computations, without the delay induced by switch latency:

Optical computing is actually the science, or the art, of using photons instead of electrons to do computation. So what we do here is we try to process data signals in the optical domain instead of the electronic domain, and that's actually what is cool about optical computing: because we process data while it's traveling, we don't stop the data movement or the data flow, and we process it. This is a bit similar to what we do in memory-driven computing, where we bring the processor closer to the memory. Here we bring the processing closer to the data while it's in flight. So if you compare that with what we do today, every time we send or receive information from an optical fiber we need to convert it between the electronic domain and the optical domain; what is key here is that this technology allows us to avoid that. So what we have to build when we do optical computing is a logic gate, and to make that using tabletop optics like the one that you see over here, with a lot of lenses and mirrors and so on, becomes really difficult at a macroscopic level; the issue that we have there is interference. Now the interesting thing is, if you go to the microscopic level, and that's what you see here, then this interference effect actually becomes key to solving the problem, so that is the thing that we use. Let me now give an example of how such a logic gate works; that's what you see here. Let's take this one here.
This thing here is a logic AND gate: it has two inputs and one output, and we designed it to do a Boolean operation where we have an output only in the case when both of the inputs are on. We have a two-stage process for that. The first thing that we do is use interference in this optical combiner; we make sure that we have a strong field only when both of the inputs are on. The next stage is this micro-ring, and the micro-ring allows us to make a strong distinction between the on and the off level, so that the next gates which listen to this gate can understand the signal.

The ability to compute data while it is being transferred, and to cut out switch delay, will completely change how computer architecture is designed and thought about. When the computer industry moved from the vacuum tube to the transistor, latency decreased from the order of microseconds to nanoseconds. Photonics promises to reduce latency by orders of magnitude again, to the order of femtoseconds or less; a femtosecond is one quadrillionth of a second! This speed factor alone would radically transform the computing industry; however, optical computing has many other pros as well. Classical computers operate in serial, with each calculation being performed one after the other. Scaling to more complex problems requires more processors, which equates to increased power consumption and more complex data management. Optical computers can operate in parallel to tackle complex problems through light reflection, and they have increased bandwidth compared to electron-based computers due to the ability to transport multiple wavelengths of light at the same time; photons are also massless, meaning they require much less power to excite. These factors, increased parallelism and bandwidth, translate to extremely scalable systems which are much more powerful, use less power, and don't come coupled with complex data-management issues.
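The two-stage gate described above can be caricatured in a few lines of code. This is only a toy numeric model, not the actual device physics: the combiner is modeled as constructive interference of two in-phase fields (intensity proportional to |E1 + E2|²), and the micro-ring's on/off discrimination is reduced to a simple intensity threshold, with the cutoff of 2.0 being an arbitrary choice between the one-input level (1) and the both-inputs level (4):

```python
# Toy model of the two-stage all-optical AND gate described above.
# Stage 1: optical combiner, two in-phase fields interfere constructively,
#          so output intensity ~ |E1 + E2|^2.
# Stage 2: stand-in for the micro-ring, reduced to an intensity threshold
#          separating the "both on" level (4) from the "one on" level (1).

def combiner_intensity(e1: float, e2: float) -> float:
    """Intensity after constructive interference of two in-phase fields."""
    return abs(e1 + e2) ** 2

def ring_discriminator(intensity: float, cutoff: float = 2.0) -> int:
    """Crude threshold standing in for the micro-ring's on/off distinction."""
    return 1 if intensity > cutoff else 0

def optical_and(a: int, b: int) -> int:
    return ring_discriminator(combiner_intensity(float(a), float(b)))

for a in (0, 1):
    for b in (0, 1):
        print(f"{a} AND {b} = {optical_and(a, b)}")
```

Only when both inputs are on does the interference produce an intensity above the threshold, so the truth table comes out as AND.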
This reduction in data management also has a major impact on security. Our modern computers have data travel all over the place: from storage to different levels of memory to the cache, then finally read by the CPU, which determines whether it can process the data or has to send it to another computing device, and then the process repeats all the way back to storage. This exposes a lot of points in the computer where data is vulnerable. With optical computers, since data can be computed while it is in motion, less data is exposed, which translates to increased security. As you can see, optical computers present many benefits over classical computers. Coming up, we'll cover some of the various optical computing initiatives, as well as how this technology is to be integrated with current computing systems.

[Music]

As discussed in previous videos in this computing series, data transfer is one of the largest bottlenecks in terms of computing performance; however, optical computing, with its ability to compute data in motion, solves this problem. Unfortunately, solving one problem creates another: computing devices will now end up becoming the bottleneck. To solve this, there are various initiatives currently in research and development to push the field of optical computing devices forward. One of the most widely known is by a company called Optalysys. They are designing an optical co-processor; referring back to the last video in this series, this is exactly what we discussed: new types of chips that will work within a heterogeneous system architecture. This optical co-processor will benefit many sectors, and due to the parallelism in photonics, many of the tasks it will excel at are the exact same tasks currently offloaded to GPUs. This optical processor and GPUs working together will yield performance boosts like we've never seen before: Our approach is completely novel, because we use light to process data, not electricity.
The Optalysys system connects to and turbocharges existing computer setups, whether they are standard desktop computers or a larger high-performance computing cluster. It essentially transforms a desktop machine into an HPC system, and increases the processing capabilities of an existing supercomputer beyond what current and even future systems can perform. It does this by taking on certain mathematical processes that can be performed faster in the optical domain. Beyond optical computation devices, another current area of research and development is memory, more specifically optical RAM. While data can be computed in motion, accessing data would still be a significant bottleneck unless memory enters the optical domain as well. In R&D, through a joint collaboration by various European nations, optical RAM promises to be over 30 times faster than SRAM (aka the CPU cache) and 1,000 times faster than DRAM. This equates to memory latencies on the order of picoseconds, unheard-of speeds for memory! For more in-depth information on memory, GPUs and heterogeneous system architecture, be sure to check out the previous videos in this computing series. Back on topic, there are many other optical and optoelectronic devices in research and development that we haven't even covered, with optoelectronic devices mixing both electron- and photon-based computation: for example, transistors that can switch between both electron and photon domains, and Intel's optical multiplexers that convert between optical and electronic signals. The final topic in optical devices that we'll cover revolves around silicon photonics, essentially fiber optics but on a smaller scale: The X1 photonic module, and just as a comparison, this is a 1.2 terabit photonic module and this is a 1.2 terabit electronic cable.
You can see the difference, and it's more than just the incredible weight and consumption of material resources it takes for the electronic communication; the thing that's really amazing is that I can go this 10 centimeters or I can go a thousand meters for exactly the same amount of energy, and that's the breakthrough, from the system-level designer's point of view, that's so amazing about this kind of highly integrated, very low-cost technology. Silicon photonics like that demonstrated by HP can transfer data at a rate of 1.2 terabits per second at distances up to 100 meters; in fact, even extending out to 50 kilometers, the transfer rate is still 200 gigabits per second. For comparison, Thunderbolt 3 taps out at 10 gigabits per second, and the current fastest Ethernet connections in data centers at 100 gigabits per second. While at first this technology will be limited to the enterprise side of computing, in data centers, we will immediately begin to see improvements through increased cloud computing speeds. In time, as silicon photonics improves, it will move down to the consumer level, and terabit speeds will be as simple as plugging in a wire at the back of your computer. In the grand scope of things, fiber optic speeds will be slowed down no longer. From a fiber internet connection to photonic wiring and then to photonic computing, computing at the speed of light is the long-term goal of this field of computing, and it will produce massive performance and efficiency gains. Optical computing is a field that has been talked about for quite some time, since the 1960s, and has produced many advances in various technologies. Now, however, after decades of research and development, it is yielding tangible results that can accelerate computing performance. Photonic-based computing will also play a significant role in quantum computing, due to the wave-particle duality of light. We'll cover this topic in the next video in this computing series!
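To put those link rates in perspective, here is a back-of-the-envelope comparison, a sketch using the figures quoted above and decimal units (1 TB = 8×10¹² bits), of how long moving one terabyte would take over each link:

```python
# How long does it take to move 1 terabyte over each of the link rates
# quoted above? (Decimal units: 1 TB = 8e12 bits, 1 Gb/s = 1e9 bits/s.)

def seconds_to_move(terabytes: float, gigabits_per_s: float) -> float:
    """Transfer time in seconds for a payload at a given link rate."""
    bits = terabytes * 8e12
    return bits / (gigabits_per_s * 1e9)

for name, rate_gbps in [("Silicon photonics (1.2 Tb/s)", 1200),
                        ("100 Gb/s data-center Ethernet", 100),
                        ("Thunderbolt 3 (10 Gb/s)", 10)]:
    print(f"{name}: {seconds_to_move(1.0, rate_gbps):.1f} s per terabyte")
```

At the quoted rates, the photonic link moves a terabyte in under seven seconds, versus over a minute for 100 Gb/s Ethernet and over ten minutes for Thunderbolt 3.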
[Music] At this point the video has come to a conclusion; I'd like to thank you for taking the time to watch it. If you enjoyed it, consider supporting me on Patreon to keep this channel growing, and if you want me to elaborate on any of the topics discussed or have any topic suggestions, please leave them in the comments below. Consider subscribing for more content, follow my Medium publication for accompanying blogs, and like my Facebook page for more bite-sized chunks of content. This has been Ankur, you've been watching Singularity Prosperity, and I'll see you again soon! [Music]



  • Thelonious2Monk

    Gee, I did not exactly understand your line of thought, but I have a few comments as a person working in fiber optics for many, many years:
    1. Light velocity in a glass fiber is actually slower than signal velocity in a good copper conductor. The term "doing X at the speed of light" is wrong. The problem with copper lies in the increased attenuation (loss of power) as the frequency increases, due to the skin effect. In light, the attenuation is insensitive to the communication frequency.
    2. What you call "signal degradation" in a fiber is not due to slow or fast velocity, or to loss of energy (attenuation), but mostly due to dispersion, the fact that different wavelengths travel at different speeds in the glass.
    3. In fiber optics we use a variety of wavelengths, between 850 nm and 1600 nm. We could easily use visible wavelengths, but at those wavelengths attenuation is quite high due to Rayleigh scattering.
    Therefore, I think that the advantage of photonic computing lies in the interaction (as you showed) and not in the latency. Would love to read your comments.

  • Aatish Hake

    Can you please change your accent? It sounds like you are just reading the statement, not explaining... and it's not an engaging accent either.

  • Venkatesh babu

    Things work at the level of 12. Twelve smaller ones make bigger ones. Twelve bigger ones make still bigger ones. This progressive layering calls for split and movement of light.

  • CattleRustler

    How does optical processing compare with regards to heat? I'm assuming there would be less heat created because photons have no mass.

  • Robert Shackleferd

    The scientists and engineers working with this now will be in history classrooms for thousands of years. The founding fathers, so to speak.

  • soylentgreenb

    These things take >20 years. Even fairly trivial shit like moving from aluminium to copper traces, or SOI, low-k, FinFET, took about 20 years. So we are stuck with silicon for the next 20 years or so, and it doesn't get that much better than what it currently is. Scaling of performance slowed down greatly with the end of Dennard scaling ca. 2006.

    Moore's law gives us more transistors, but Dennard scaling made them useful. Without Dennard scaling we hardly get any more speed from transistors, except by adding more cores and hoping programmers will use them.

    Compounding this: what Andy giveth, Bill taketh away; Windows 98 on a fucking mechanical POS drive and a Pentium II felt snappier than Windows 10 on a decent M.2 SSD, and I'm not sure I got much utility from that part of OS development at all; Win 98 to XP was the big leap in terms of stability, then very little of actual consequence.

  • Andrew Netherton

    I'm pretty skeptical about the idea of optical computing. A simple thing to consider is the microring HPE shows in their AND gate demo. Best case scenario, the waveguide material is Si (meaning they're operating at IR wavelengths, since Si absorbs in the visible); the minimum diameter of that ring is on the order of 15-20 microns, because your ability to tightly bend a waveguide depends on how well the light is trapped in the waveguide, and Si is one of your best options in that regard. One electronic AND gate can be made of 3 transistors, which means, if we estimate one transistor as consuming 50 nm x 50 nm, the optical ring-based solution is about 8400 times larger for the same device. If the optical computing platform is supposed to operate in the visible, as the narrator suggests, a material like silicon nitride would probably be used as a waveguide; the bend radius of the ring would probably be close to an order of magnitude larger, because silicon nitride doesn't confine light as well as silicon. Therefore, I'm not convinced you'll actually achieve a higher computation density (operations/mm^2) in optical computing (especially in the visible domain), and wouldn't that be the whole point of changing paradigms?

  • darrick steele

    I know this isn't a cosmology channel, but this is what I heard... so might it be that the universe is not expanding like we think? Could it be that in the infrared part of the spectrum the information is much more easily transmitted from the far reaches? That in the higher frequencies the information degrades before it gets here, so it's not that the universe stretched out the signal to make everything become red-shifted, but we just can't see the higher frequencies and infrared is all that is left? The galaxies are not moving away, but the microwaves are degrading.

  • Harley Me

    Wow, are they saying an LED and a light sensor are much better than a transistor? Hmm, I had the same idea when I was 14.
    I'm 39 now... about time they're cluing in.

  • johneygd

    Just imagine if quantum computers use light rather than electron streams. Who knows how much faster a quantum computer will be? I can't wait to see that happen. I want to know how much more powerful the most powerful super quantum computer based on light transmission will be. I am very curious about this.

  • hordnes

    So I have a dream of becoming an electrical engineer and working with computer hardware... with optical computing that seems kinda useless... or???

  • Joe Duke

    Our newest devices have nothing to do with any of this digital processing stuff… but, if broken down into the same form, function and digital nomenclature, we process 4096 x 4096 parallel fields with a 1024 dynamic contrast range, at the speed of light for a 10 micron distance.

  • Bobcat665

    Does anyone here recall the great difficulties scientists (and corporations that backed them) had with developing LCD "flat panel" displays? Development of them, in earnest, began in the early 1960s and the running joke was that LCD TVs were always "just ten years away". It was over 40 years later that LCD displays (finally) started displacing traditional CRTs as the dominant tech in the marketplace. The problem was the enormous challenges scientists were facing with developing the materials and manufacturing techniques required to make full-colour LCDs good enough to compete with CRTs. So far, researchers trying to develop photonic computing devices are facing obstacles that are equally as daunting (if not even more-so). What I've heard is that getting optical gates to work reliably is really hard to do and, again, it comes down to shortcomings in materials technologies. It looks like graphene-based chips will hit the market a lot sooner.

  • Antares

    And if you could somehow imbue it with phototaxic materials, you could end up with a computer hardware that arranges itself in different shapes. The potential of this technology is staggering.

  • Ivan Guerra

    I don't understand. This is an Optical Micro Fiber Electrical Computer, so you'd better call it that, or OMFEC. You know, I'm inventing the Light Powered Computer, or LPC, that works with solar energy. No energy waste. Have you forgotten the photoelectric cell invented in the 60's? Also, I'm inventing the Electro Magnetic Computer, or EMC, following the Jacques Bergier example; the NPU is a floating magnetic sphere that changes its position every time, better than a Quantum Computer. Please call me if you don't have ideas. Thanks.

  • rade

    Interesting stuff, transformed into bullshit by background "music". A frequent story on YouTube. MAN, WE CAN'T HEAR YOU.

  • science teacher

    I had this idea when I was in high school. I am happy to see it's actually in the research stage. Good work. Support from India.

  • Fred Cornish

    This presentation is like quite a number of others – great info and graphics spoiled by an inhuman velocity. If you must not exceed a specific time, please consider multiple shorter segments allowing the viewer time to absorb the content.

  • Its Me

    Maybe in the next 30 years this will be usable for consumers, but for now, don't even try to think about it. NO USE.

  • KidRiver

    Performance will be capped intentionally and we will see very slow incremental upgrades to this tech just like with traditional computing. Capitalism HOoO!

  • Matthew Thomson

    Wouldn't optical electronics also mean that little to no heat is produced? Meaning that not only is it more energy efficient, servers wouldn't need expensive mass cooling, and it would also allow for high-end fanless laptops and desktops. It would change computing as we know it.

  • Del Sydebothom

    But wouldn't whatever is generating the photon stream have to be electrically controlled? Why wouldn't that create a speed bottleneck?

  • Bobbito Chicon

    If we're going to use light to transfer data in a compact space... then, like the human body and how it uses biophotons, you don't even need to use fiber, or classical hardware.

  • Askejm

    My grandma paid lots of money to this company that put fiber cables in the ground to her house only for her to buy a 10 Mbit/s internet plan

  • ♫♪Ludwig van Beethoven♪♫

    Can't believe my child will play on his 10 THz PC and I'll tell him stories about my good old 4 GHz PC 😂

  • Bikini Man

    Understand that the public only uses the oldest models with backdoors all over the place so we can be surveilled at all times.

  • Brian

    Great video, but there were a few points where it was hard to understand what you were saying. Just slow it down a bit 🙂

  • Fucked Gplus

    Oh well, remember the cathode ray tube, and how they used magnets to redirect the beam... Jeez, imagine a photon-based computer in which any magnetic field could fuck up everything... I mean, we are talking atom-size computing here.

  • Richard Deese

    Thanks! This is a fascinating field, and one that, strangely, doesn't seem to get much press. I often hear about Moore's Law for transistors, and various ways to tackle that issue, but I seldom hear optical even mentioned as a possibility! In a world where we expect to be a full Solar System species in the not-too-distant future, it's vitally important that we do discuss and consider optical, as it will be the way we calculate, and communicate, in such a future. Looking forward to more! Rikki Tikki.

  • Dino DiNaso

    This presentation moves FAST and is PACKED with good information.

    Normally, tech features are slow with little real content.


  • Goran B

    "Speed of light" and "flow of electrons" (0:40) is very misleadnig, since electric signal is faster than optical signal (due to the permeability) and it has nothing to do with the "flow of electrons"! The faster "speed" is due to bandwidth and not due to speed of electrons/photons.

  • Jeong-hun Sin

    Instead of trying to create an optical computer a thousand times faster than the current ones, can't they try to create one that has a similar speed? Isn't the latter more achievable than the former? Faster speed is good, but the thing I want is less heat. Even if the speed is the same, if a computer or a smartphone generates significantly less heat, people will buy it. Damn heat....

  • Tim Montgomery

    Why do you try to make something new only in part? Why conform the language to the old way? 1/0 is no good. We must create a new language with new infrastructure to support it. An 'a' could be sent as an 'a': whole values, not separated into bits.

  • Yolo4ever777 King

    I have two questions about oVPNs, about using Bell states in optical communications: is using an oVPN with superdense coding possible? Would the oscillation device need to have an oVPN?

  • Dreglanoth

    Plus, you didn't mention power requirements and temperature. I'm just assuming, but you could power these babies with LED lights, which use very little power. Phones as well as PCs would use so little power that your PC would basically run on a small battery. Also, no cooling needed because it's light, so no more bulky, noisy coolers.
