The potential for efficient high-data-rate communications using the laser has been recognized since its initial development more than 50 years ago. Free-space optical communications using lasers offer significant advantages over radio frequency (RF) or microwave systems for both airborne and satellite platforms, including high-capacity trunk links or dedicated point-to-point links for high-data-rate sensors. Communication links using laser beams are inherently resistant to both tapping and jamming. Short optical wavelengths allow high antenna gains for establishing links over extremely long distances and enable the use of shoebox-sized transmitters and receivers. Although several major initiatives have matured system-level designs, a full-scale operational free-space lasercom link is not yet available.
EARLY SYSTEM DEMONSTRATIONS: PROGRAM 405B
In 1971, the U.S. Air Force (USAF) initiated Program 405B to develop a system capable of demonstrating an unprecedented 1-Gbps data rate on a downlink from a geosynchronous satellite to a ground-based terminal. The program emphasized the development of the critical technologies (laser modulation, pointing, acquisition, tracking, and signal detection) required to support a system design capable of operating from a geosynchronous satellite.
During the Engineering Feasibility Model phase of the program, two competing system approaches were developed: one based on coherent communications using the CO2 laser (developed by Lockheed-Martin) and the other using short-pulse communications techniques employing the frequency-doubled Nd:YAG laser operating at 0.532 µm (developed by McDonnell-Douglas Astronautics). The Engineering Feasibility Model developed by McDonnell-Douglas successfully met program objectives, and McDonnell-Douglas was therefore awarded the follow-on program in 1975 to develop the Space Flight Test System (SFTS).
The SFTS program was to have developed and qualified a satellite terminal architecture supporting a wideband (1-Gbps) downlink to either a terrestrial ground station or a receiver terminal in low-earth orbit. It was also to have demonstrated the capability of critical technologies to survive and operate on orbit for a typical mission lifetime. A launch was scheduled for 1979; however, shortly after the Preliminary Design Review in 1976, funding was reallocated at the Air Force Research Laboratory (AFRL), and the scope of the program was changed from a space-to-ground demonstration to an aircraft-to-ground demonstration.
Although building hardware for an airborne demonstration posed far fewer challenges than developing a terminal to operate in space, closing an air-to-ground link was in many ways technically more difficult. The Airborne Flight Test System (AFTS) demonstration program transitioned the hardware originally intended for a satellite-borne terminal to a C-135 aircraft. The ground terminal (with a 48-inch telescope aperture) was initially located at the Air Force Test Station at Cloudcroft, NM, but it was later transitioned to a facility (with a 14-inch telescope aperture and a two-axis beam director) at White Sands Missile Range, NM (see Figure 1).
Figure 1: Airborne and Ground Terminal Platforms for the AFTS Demonstration at White Sands Missile Range, NM. Top: Airborne High-Data-Rate Transit Terminal Installed in KC-135 (With Ground-Based Test Equipment); Bottom: Airborne High-Data-Rate Ground Station Receiver Located at Cowan Site.
Several technological advances required for the original, space-based terminal were abandoned, including:
- A solar-pumped doubled Nd:YAG mode-locked laser.
- Radiation-hardened optics and electronics.
- A lightweight, gimballed beryllium mirror for precision pointing.
However, several significant advances in electro-optic technology were retained in the reduced-scope program, including:
- Low-noise silicon avalanche photodiodes (APDs).
- Wide-band precision fast-steering mirrors.
- A novel quaternary short-pulse modulation and communications data link format for high-data-rate transfer in a scintillating environment.
- A potassium-rubidium high-intensity discharge-pumped Nd:YAG laser for low-power operation.
- Diffraction-limited, beryllium telescope optics (with a 190-mm aperture).
Even though submicroradian tracking and pointing accuracy was not required to establish the link as part of the airborne demonstration, the pointing and tracking elements of the system were designed, built, and tested to meet the 0.6-µrad pointing accuracy required in an orbital environment. Because of the relaxed volume constraints, the program emphasized development of system architecture and electro-optical elements for the system (Figure 2).
Figure 2: AFTS Electro-Optics Package (Left) and Overall System Block Diagram (Right).
Overall, the program was highly successful in meeting its objectives, including:
- A diffraction-limited 5-µrad transmit beam from a beryllium optical telescope.
- Submicroradian wideband (300 Hz) tracking and pointing under simulated satellite dynamic environments.
- Low-noise, high-gain silicon APDs.
- A high-power (50-mW average output power) TEM00, mode-locked, frequency-doubled Nd:YAG laser efficiently pumped with a potassium-rubidium, stable, high-intensity-discharge lamp.
- Near-theoretical bit error rate (BER) performance of a 1-Gbps optical link.
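As a back-of-the-envelope check, the 5-µrad diffraction-limited transmit beam from the 190-mm telescope noted above can be compared with the theoretical divergence at 0.532 µm. The sketch below uses the common first-Airy-null convention (2.44λ/D); the reported 5-µrad figure presumably reflects a somewhat tighter beam-width definition, so the two are consistent in order of magnitude.

```python
import math

def diffraction_limited_divergence(wavelength_m, aperture_m):
    """Full-angle divergence to the first Airy null: theta = 2.44 * lambda / D."""
    return 2.44 * wavelength_m / aperture_m

# AFTS values from the text: 0.532-um doubled Nd:YAG, 190-mm beryllium telescope.
theta = diffraction_limited_divergence(0.532e-6, 0.190)
print(f"{theta * 1e6:.1f} urad full-angle")  # ~6.8 urad to the first null
```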
In addition, although not an original objective, a wealth of data was obtained regarding the performance of optical links from an airborne terminal operating through the aircraft boundary layer. By collecting scintillation and beam-wander data at both the ground and airborne terminals, the program revealed much about the impact of the aircraft boundary layer and link turbulence on the design of lasercom terminals. These findings would prove valuable for later system applications.
THE LASER CROSS-LINK SYSTEM (LCS) FOR DEFENSE SUPPORT PROGRAM (DSP)
As the final AFTS air-to-ground demonstrations were concluding, studies were under way for modernization of legacy DSP systems. Since the initial DSP satellite launch in 1970, the DSP satellite system has provided the United States with ballistic missile launch detection. Originally deployed with three operational satellites—one generally over the Atlantic Ocean, another over the Pacific Ocean, and a third over Europe—the DSP provided early warning of missile threats to the continental United States. With the original constellation, the western satellite relied on remote ground terminals located on foreign territory to receive threat data, which were then relayed to the continental United States for processing and evaluation; the eastern satellite downlinked data directly to a U.S. ground station. As part of the modernization effort, the USAF sought to eliminate reliance on foreign ground stations by establishing a cross-link between the two operational satellites. At that time, only two viable technologies existed for practical cross-linking between satellites in geosynchronous orbit: lasercom and 60-GHz RF communication. The USAF therefore funded a trade study comparing 60-GHz and laser communications for the DSP cross-link implementation. Because of the relative maturity of lasercom technology, and because lasercom was considered the long-term solution for secure relay of U.S. intelligence data, this technology was selected for implementation of the LCS. In 1980, McDonnell-Douglas was contracted to develop a lasercom terminal for the DSP satellite.
McDonnell-Douglas conducted an initial trade study of two optical cross-link technologies: a directly modulated diode link and a short-pulse modulation format based on the Nd:YAG laser. Its engineers determined that the Nd:YAG-based link was the only technology mature enough to satisfy the requirements for rapid link acquisition, nuclear survivability, and operation with the sun in the receiver field of view. Consequently, McDonnell-Douglas was awarded a research, development, test, and evaluation (RDT&E) contract in 1980.
The DSP LCS was required to transmit primary sensor data (Link 5) in one direction and satellite command and control information (Link 6) in the opposite direction. For the primary sensor to image the full field of regard, the DSP satellite rotates constantly, with the axis of rotation pointed toward the center of the earth. Thus, for the cross-link to maintain a constant connection to the opposite satellite, a gimballed telescope was mounted on a boom extending from the satellite body (Figure 3). Although it added complexity, the counter-rotating gimballed telescope that McDonnell-Douglas developed was an elegant solution. By necessity, however, this solution limited cross-link operation to a narrow set of satellite on-orbit stations, which ultimately led to the program’s termination.
Figure 3: Artist’s Concept of an LCS Integrated Onto the DSP Satellite.
The LCS was required to be operational 100% of the time the satellite was on station because of the mission criticality of the cross-linked sensor data. Therefore, the terminal had to operate at full performance over a wide range of thermal conditions (-50 °C to +80 °C) and with the sun pointing down the optical axis of the telescope. Other key performance parameters for the LCS included:
- Link 5 data: 1.28 Mbps at 10⁻⁷ BER (a 6-dB margin at end of life).
- Link 6 data: 4 kbps at 10⁻⁶ BER (a >6-dB margin at end of life).
- Link acquisition in <500 s.
- 3-year on-orbit mission life (0.935 reliability).
- No single-point failures (all electronics and electro-optic elements redundant).
The system design that was implemented featured two diode-pumped Nd:YAG lasers, dual sets of acquisition and tracking detectors, and dual sets of electronics. The imaging optical assembly comprised dual optical paths for both fine tracking and point-ahead beam-steering mirrors (Figure 4).
Figure 4: LCS Showing Individual System Configuration.
At the time of its development, the LCS was one of the most complex electro-optical systems ever built. Several demanding component-level requirements were successfully addressed to meet DSP satellite integration constraints on size (54 ft³), weight (300 lb), and power (200 W).
LCS Laser Technology
Early design trade studies resulted in the selection of a pulsed Nd:YAG laser operating at the fundamental 1.064-µm wavelength. A pulse-interval modulation (PIM) format was chosen to support the 1.28-Mbps data rate. However, this choice demanded a convolutional code to minimize the burst errors associated with this modulation, and the link also had to tolerate false alarms caused by the radiation environment. The Iwadare-Massey convolutional error-correcting code was determined to be optimum. The PIM coding with forward error correction required the Nd:YAG laser to operate in cavity-dumped mode at a stable 356 kpps. Because this pulse rate falls in the middle of the instability region of Nd:YAG lasers, special active intracavity control techniques were required to maintain <1% pulse amplitude stability for a diffraction-limited output pulse energy of 0.2 µJ.
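The basic idea of pulse-interval modulation is that each symbol's value sets the spacing between successive laser pulses. The sketch below is a minimal illustration using 2 bits per pulse interval (quaternary PIM); the function names, slot framing, and parameters are illustrative assumptions, not the actual LCS waveform, and the Iwadare-Massey FEC layer is omitted.

```python
def pim_encode(bits, bits_per_symbol=2):
    """Pulse-interval modulation: each symbol's value sets the number of empty
    slots before the next pulse. Returns a slot stream (0 = no pulse, 1 = pulse)."""
    slots = []
    for i in range(0, len(bits), bits_per_symbol):
        symbol = int("".join(map(str, bits[i:i + bits_per_symbol])), 2)
        slots.extend([0] * symbol + [1])  # 'symbol' empty slots, then a pulse
    return slots

def pim_decode(slots, bits_per_symbol=2):
    """Recover bits by counting empty slots between pulses."""
    bits, gap = [], 0
    for s in slots:
        if s == 1:
            bits.extend(int(b) for b in format(gap, f"0{bits_per_symbol}b"))
            gap = 0
        else:
            gap += 1
    return bits

data = [1, 0, 1, 1, 0, 0]
assert pim_decode(pim_encode(data)) == data  # round trip recovers the data
```

Because the information rides on pulse timing rather than amplitude, PIM tolerates the amplitude scintillation of a free-space channel well, but a single missed or spurious pulse corrupts a run of symbols, which is why the burst-error-correcting convolutional code was essential.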
Laser pump diode reliability presented another challenge. LCS represented the first application of diode pumping for an operational program (satellite or airborne). Extensive life testing was required for the mounting and polishing of the laser diode pump arrays to certify adequate lifetime. Thousands of hours of reliability testing for thousands of laser diodes were required to finally validate the allocated reliability for on-orbit operation.
LCS Telescope and Optics
Extremely high antenna gain was required to establish the 84,000-km link with only 0.2 µJ of pulse energy. The gimballed telescope featured a 190-mm aperture, CERVIT optics, and an Invar shell. The telescope was designed and qualified to provide λ/10 wavefront quality (with an antenna gain of >112 dB) after exposure to launch vibrations (80 grms at the secondary mirror) and thermal extremes of -50 °C to 90 °C. Unique and complex optical coatings were developed for a solar window to reduce the solar energy entering the telescope, both to limit damage to downstream optics and to minimize temperature swings.
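The >112-dB figure can be sanity-checked with the standard optical antenna gain relation G = η(πD/λ)². The efficiency value below is an illustrative assumption (covering obscuration, wavefront, and truncation losses), not a published LCS parameter.

```python
import math

def antenna_gain_db(aperture_m, wavelength_m, efficiency=1.0):
    """Optical antenna gain G = eta * (pi * D / lambda)^2, expressed in dB."""
    g = efficiency * (math.pi * aperture_m / wavelength_m) ** 2
    return 10 * math.log10(g)

# LCS values from the text: 190-mm aperture, 1.064-um Nd:YAG wavelength.
print(f"ideal aperture:   {antenna_gain_db(0.190, 1.064e-6):.1f} dB")       # ~115 dB
print(f"50% efficiency:   {antenna_gain_db(0.190, 1.064e-6, 0.5):.1f} dB")  # ~112 dB
```

An ideal 190-mm aperture at 1.064 µm yields about 115 dB, so the specified >112 dB corresponds to a realistic aperture efficiency of roughly 50% or better.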
LCS Pointing and Tracking
To support the pointing of the high-gain antenna, a wideband tracking loop was required that could maintain pointing to better than 3.8 µrad, 3σ (including open-loop point-ahead error). The torque-motor-driven beam-steering mirrors first developed for the AFTS program had to be redesigned to reduce both noise and uncompensated drift over temperature.
Quadrant silicon APDs that could meet the sensitivity and noise requirements for operation at 1.06 µm had not yet been developed. To implement a null-seeking tracking detector, an arrangement of four silicon APDs was positioned around a four-sided pyramid. The physical spot size at the focal plane was 100 µm, requiring a pyramid tip of better than 10 µm; the introduction of diamond-turned optics met this need. The silicon APDs had to exhibit extremely high quantum efficiency at 1.06 µm as well as extremely low dark noise. Additionally, because of the natural and man-made radiation environments, special changes were required to both the diode physical structure and the transimpedance amplifier to achieve a responsivity of better than 1×10⁶ V/W with a noise equivalent power (NEP) of less than 1 nW at 1.06 µm.
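The null-seeking principle behind the four-APD pyramid arrangement is that of a quadrant detector: the pyramid splits the focal spot among the four detectors, and normalized sum-and-difference combinations of their signals form two-axis error signals that the tracking loop drives to zero. The sketch below shows the standard quadrant arithmetic; the facet-to-axis mapping is a hypothetical layout, not the documented LCS wiring.

```python
def quadrant_error(a, b, c, d):
    """Normalized two-axis pointing error from four quadrant-detector signals.
    a/b/c/d: optical powers on the +x+y, -x+y, -x-y, +x-y facets (assumed layout).
    At null (spot centered on the pyramid tip), both errors are zero."""
    total = a + b + c + d
    if total == 0:
        return None  # no signal: fall back to acquisition/scan mode
    x = ((a + d) - (b + c)) / total  # right-half minus left-half power
    y = ((a + b) - (c + d)) / total  # top-half minus bottom-half power
    return x, y

print(quadrant_error(1.0, 1.0, 1.0, 1.0))  # centered spot -> (0.0, 0.0)
print(quadrant_error(2.0, 1.0, 1.0, 2.0))  # spot displaced toward +x
```

Normalizing by the total power makes the error signals insensitive to received-power fluctuations, which matters on a scintillating free-space link.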
The RDT&E program required 4 years to complete. The follow-on production program, including full qualification, ensued after successful RDT&E testing and proceeded until 1993. During the Gulf War, the satellites had been repositioned for early warning of Scud missile launches in the Middle East, and the new on-orbit stations required operational angles outside the cross-link design limits. By that time, the ground station processing facilities had been dramatically reduced in size, facilitating the development of mobile ground station terminals that could be placed wherever the data were required and thereby obviating the need to cross-link the data to one centralized processing center. Components for all eight terminals were assembled; three terminals were fully integrated and passed acceptance and qualification testing, and two were integrated on satellites. Unfortunately, when the program was terminated after the expenditure of nearly $0.5 billion, the USAF decided to remove the terminals from the two satellites, and they were never flown.
TRANSFORMATIONAL SATELLITE COMMUNICATIONS SYSTEMS
The DSP-LCS program was the last major lasercom initiative for several decades. A number of studies followed, such as the cross-link studies for the Follow-On Early Warning System (FEWS) program (which became the Space-Based Infrared System); that study was halted in 1993. MIT Lincoln Laboratory began an effort to develop key component and system technologies under the Lasercom Intersatellite Transmission Experiment (LITE) program, which comprised a number of phases that ultimately resulted in a satellite-based demonstration of capability. The Defense Advanced Research Projects Agency funded Terahertz Optical Reach Back (THOR) in 2002 and the Optical RF Combined Link Experiment (ORCLE) in 2004. Both NASA and the European Space Agency funded demonstration programs during this period.
The Transformational Communications Architecture (TCA) initiative, a major lasercom program with defined goals, schedule, and U.S. government funding, began in 2003 in response to the growing demand for bandwidth on the battlefield. The TCA included a variety of communications systems (Figure 5) intended to satisfy military bandwidth needs through the middle of the 21st century.
Figure 5: The TCA Integrated Numerous Existing RF Communications Missions With a New Lasercom-Based Network of TSAT Satellites.
The centerpiece of the TCA was a new satellite-based network, the Transformational Communications Satellite (TSAT) Network. The TSAT Network featured a top-down design to promote backward compatibility with existing RF systems. The architecture consisted of five geosynchronous satellites that could be interconnected in a variety of physical topologies, although the logical topology was a mesh network; the onboard routers used the IPv6 protocol. Each satellite carried two terminals supporting optical transport network (OTN)/synchronous optical network (SONET) framing for a 10-Gbps cross-link or a 2.5-Gbps downlink to airborne intelligence, surveillance, and reconnaissance (ISR) terminals.
Lasercom terminal designs leveraged the significant advances achieved in commercial fiber-optic technology, including high-power (5-W) erbium-doped fiber lasers and commercial communications link protocols and hardware. The pointing and tracking subsystem designs incorporated the latest advances in InGaAs receiver technologies, fast beam steering, and lightweight carbon fiber optics. However, a few significant technology issues for both the satellite and airborne terminal segments required resolution: a robust switching capability was necessary to support dynamic bandwidth and resource allocation, and a low-noise modem was needed to accommodate multiple modulation waveforms optimized for the various links. Lockheed-Martin (Sunnyvale) and Boeing Space Systems were each awarded contracts in excess of $500 million for satellite lasercom terminal definition and risk reduction.
The Airborne Lasercom Terminal was to be hosted on high-value ISR platforms, including the U-2, the E-10 (MC2), and the RQ-4 Global Hawk. Because of the aerodynamic characteristics of the U-2 and Global Hawk, stringent limits were placed on how far the terminal aperture could project into the windstream; therefore, a significant investment was made in the development of “conformal” beam-steering apertures. BAE Systems (Nashua, NH), Raytheon (Marlborough, MA), Lockheed-Martin Integrated Systems and Solutions (San Jose, CA), and Northrop-Grumman (Linthicum, MD) were awarded contracts to define, mature, and demonstrate the critical technologies required for an Airborne Lasercom Terminal that could be integrated onto each of the target aircraft and operate with the TSAT lasercom terminals. Although the terminal architectures differed slightly, all of them selected Risley prism technology for the conformal beam-steering aperture.
Unfortunately, amid growing concerns over overall TCA system risk and growing estimated deployment costs (~$26 billion), the Department of Defense cancelled the program in its 2010 budget request. The total spent on the program to that point was $1.5 billion, which primarily funded lasercom terminal development (both satellite and airborne). The design and qualification of the high-power erbium-glass fiber amplifier for space use was probably the most significant advance provided by the TSAT development program prior to its cancellation.
LASERCOM: LOOKING FORWARD
The need for secure bandwidth on the battlefield continues. With the need for improved targeting accuracy and assured target prosecution, the demand for high-resolution imagery to and from the dismounted soldier is even more pressing now than it was 5 years ago. Today, airborne and space assets carrying lidar, high-resolution imaging systems, and multispectral imaging systems generate massive volumes of data that must be exfiltrated, processed, and redistributed to a myriad of users, all of whom are fully networked as an integrated fighting force. However, no major operational program for implementing a wideband, secure data link (lasercom) is even in the planning stages. Fortunately, the same demands for instant connectivity exist in the commercial sector, which may ultimately solve the problem. An airborne lasercom terminal, like the one being developed by General Atomics to operate with the European Data Relay System, may eventually provide the solution.