This study numerically and experimentally investigates a photonic approach to microwave time delay, which takes advantage of the redshift of the laser cavity resonance induced by external optical injection into a semiconductor laser. The strong enhancement around the redshifted cavity resonance not only amplifies the power but also shifts the phase of the microwave signals carried by the optical injection. This microwave phase shift is approximately linear over a few gigahertz, yielding a constant microwave time delay across that frequency range. Different time delays can be achieved by simply adjusting the injection power or frequency. For microwave frequencies up to 40 GHz investigated in this Letter, a continuously tunable time-delay range of more than 80 ps is achieved over an instantaneous bandwidth of approximately 7 GHz. The quality of the data carried by the microwave signals is largely preserved after the time delay: a bit-error ratio down to 10^-9 at 2.5 Gb/s is achieved, with a possible detection sensitivity improvement of 5 dB.
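The abstract's central relation is that a phase shift linear in frequency corresponds to a constant (true) time delay, since the group delay is the negative phase slope, tau = -dphi/(2*pi*df). A minimal numerical sketch of this relationship, using an assumed 80 ps delay and an illustrative frequency band (not values taken from the experiment itself):

```python
import numpy as np

# A linear-in-frequency phase shift implies a constant group delay:
#   tau = -(1 / 2*pi) * dphi/df
# The delay value and band edges below are illustrative assumptions.
f = np.linspace(1e9, 8e9, 8)        # frequency points spanning ~7 GHz (Hz)
tau = 80e-12                        # assumed delay of 80 ps
phi = -2 * np.pi * f * tau          # linear microwave phase response (rad)

# Recover the group delay from the phase slope; it is flat across the band.
group_delay = -np.gradient(phi, f) / (2 * np.pi)
print(np.allclose(group_delay, tau))  # True: linear phase -> constant delay
```

Because the phase is exactly linear here, the finite-difference slope reproduces the assumed delay at every frequency point; in the experiment, deviations from phase linearity would appear as delay ripple across the band.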