Friday, July 6, 2012

FPA - ACRONYM


FPA - focal-plane array, an array of light detectors placed in the focal plane of a lens or optical system to record images, such as the sensor chip that takes the place of film in a digital camera. The term most often refers to infrared detector arrays, but it can also be applied to arrays that respond to other parts of the spectrum, particularly the visible and radio bands.

The concept of a focal-plane array is a natural one in the age of digital photography, but a Google Books search finds 10 references dating from the 1960s, mostly referring to infrared detectors for military or astronomical applications. The term may have originated in the military; one reference was to a 1963 government document on the defense budget. Extending the search to the start of 1980 found another 450 publications that used the term, too many to examine in detail.

FPA wasn't a common term even then. It's not listed in the index of my 1978 edition of The Infrared Handbook. The only sub-entries under "array" are "silicon diode vidicon," "staring CCD signal processing," and "in scanners." But those entries do show how imaging technology was evolving toward the FPA concept.

The vidicon was an imaging tube used in analog television, which recorded images as charges on a photoconductor and read them out with an electron beam. By 1978, arrays of silicon photodiodes were available and had begun to be used for electronic imaging, leading to odd hybrid terms, such as the "silicon diode vidicon," that marked the transition from television-type camera tubes to silicon diode arrays in the near infrared.

Scanning and staring refer to different ways of recording images with a limited number of detectors. Today pixels are cheap, so it's natural to put a chip containing a million or more light-detecting elements into the focal plane, where it stares at the scene being imaged. In the 1970s detectors were much more expensive, especially in the infrared, so electronic images often were recorded by scanning a linear array of detectors across the image plane (or by scanning the image across the detector array).

Both technologies live on today. Scanning arrays are standard in flatbed scanners, which drag a linear sensing array across a page. Staring arrays have gone much farther. They are standard in cameras for applications including high-resolution scientific imaging in the near infrared. A gigapixel FPA with response in three bands stretching from 250 to 1100 nm will be launched next year on the European Space Agency's Gaia satellite.


An FPA composed of Geiger-mode avalanche photodiodes, packaged in a ceramic case with a microlenslet array, readout circuit, and thermoelectric cooler by Princeton Lightwave (Cranbury, NJ).

Thursday, June 21, 2012

POF - ACRONYM


POF - Plastic Optical Fiber (also Polymer Optical Fiber): An optical fiber made from a polymer or plastic.

Plastic light guides have a surprisingly long history. Soon after its invention in 1928, the transparent plastic polymethyl methacrylate (PMMA) began replacing quartz in a variety of applications, including bent rods used as dental illuminators. Early developers of fiber optics tested transparent plastics as well as glass in the 1950s, and plastic was used as a cladding on some of the first clad optical fibers.

For many applications, plastic had important advantages over glass. Plastic is lightweight, inexpensive, and flexible rather than brittle. Thin sheets of clear plastic, like thin sheets of window glass, seem quite transparent. But reducing the attenuation of plastic proved far more difficult than improving the clarity of glass, and plastic was left in the dust when the first low-loss silica fibers were demonstrated in the 1970s.

Developers of plastic fibers turned to other light-guiding applications where fiber loss was less important than in telecommunications. By the early 1970s, bundles of plastic fibers were being used in decorative lamps, with the fiber ends splayed out to sparkle with light at their ends. Fiber-optic pioneer Will Hicks strung plastic fibers through a plastic Christmas tree, which he hoped to sell for holiday decoration until it failed an impromptu fire test at a New York trade show. Despite such reverses, plastic fiber-optic decorations live on.

POF communications also survives in short-distance applications where the low cost and easy termination of plastic fibers offset their high attenuation. One example is the Media Oriented Systems Transport (MOST) network for automobiles, which uses red LED transmitters to send signals through up to 10 meters of POF linking electronic systems in cars. Auto mechanics don't need an expensive fusion splicer to connect the large-core step-index fibers. Japanese researchers have developed graded-index POFs with bandwidths of 4.4 gigahertz over lengths up to 50 m at 670 to 680 nm, which developers hope could lead to applications in home networking.

Plastic optical fiber does have plenty of competition for the acronym POF, including polymer optical fiber and another optical term, "plane of focus." Acronymfinder.com lists 34 possible definitions ranging from the journal Physics of Fluids and the Pakistan Ordnance Factory to the dating site "Plenty of Fish" and "pontificating old fart." But plastic fiber fans can take heart -- the site ranks Plastic Optical Fiber as the most-used definition of POF.


Modern decorative fiber-optic lamp, courtesy of Keck Observatory.

Monday, June 11, 2012

QCL - ACRONYM

QCL: Quantum Cascade Laser, a semiconductor laser without a junction, in which electrons pass through a series of quantum wells. In each quantum well, the electron emits a photon on an inter-subband transition before tunneling through to the next well. QCLs are important sources in the mid- and far-infrared, including the terahertz band.

"Semiconductor laser" and "diode laser" were long treated as synonyms after the demonstration of the first semiconductor diode laser in 1962, although other types had been proposed, and lasing had been demonstrated in junction-free semiconductors pumped optically or with electron beams.

Russian physicists Rudolf Kazarinov and R. A. Suris took the first step toward the QCL in 1971 by suggesting electrons in a superlattice could tunnel between adjacent quantum wells, but the technology needed to make them was not yet available. The development of molecular beam epitaxy (MBE) revived interest in such complex semiconductor structures, and in 1986 Federico Capasso, Khalid Mohammed, and Alfred Cho of Bell Labs suggested that electrons tunneling through stacks of quantum wells might be used to make infrared lasers. 

Their 1986 paper clearly shows the basic idea, but demonstrating QCLs took eight years, as long as it took to go from the first pulsed cryogenic diode laser to room-temperature operation. Not until 1994 did Jerome Faist, Capasso, Cho, and Deborah Sivco report the first QCL in a Science paper where they coined the evocative phrase "quantum cascade" to describe its operation; a Google Books search fails to find any earlier use of the phrase. Their device produced 8 milliwatt (mW) pulses at 4.2 micrometers (µm), but like the first diode lasers it required cryogenic cooling, reaching its highest power at 10 kelvins (K) and operating at up to 90 K with a threshold of 14 kiloamperes per square centimeter (kA/cm²), comparable to the threshold of the first diode lasers.

Today, QCLs are in the mainstream of laser technology, operating continuous-wave at room temperature with multiwatt output in the mid-infrared. Available commercially, QCLs operate through much of the infrared all the way to the terahertz band.

Electron emits a cascade of photons as it tunnels through a series of quantum wells in this simplified view of a QCL.

Monday, June 4, 2012

DWDM/CWDM - ACRONYM


DWDM: Dense Wavelength Division Multiplexing: Wavelength-division multiplexing with signals closely spaced in frequency.

CWDM: Coarse Wavelength Division Multiplexing: Wavelength-division multiplexing with signals broadly spaced in wavelength.


The division of the radio spectrum into broadcast channels made the idea of wavelength-division multiplexing (WDM) obvious to any serious student of communications by the time the laser was invented. But how to divide the optical spectrum was far from obvious. In the early 1980s AT&T picked three widely spaced channels for the first commercial system linking Boston to Washington: GaAlAs lasers at 825 and 875 nm, and an InGaAsP LED at 1300 nm. But single-mode fiber transmission quickly eclipsed its capacity, and WDM was largely abandoned.


Invention of the erbium-doped fiber amplifier (EDFA) in the late 1980s revived interest in WDM because it could amplify multiple signals across a range of wavelengths with little crosstalk. The question quickly became how tightly wavelengths could be packed across the 1550 nm erbium-fiber gain band. That required developing new filter technology to slice the spectrum finely. By 2000, channel spacing was down to about 0.8 nm, or 100 GHz.


To make systems compatible, the International Telecommunication Union (ITU) defined a standard dense frequency grid spanning the erbium gain band. Channels on the grid were spaced 100 GHz apart, with the standard specifying them in frequency units, such as 193.10, 193.20, and 193.30 THz, although optical engineers translated them into wavelengths (1552.52, 1551.72, and 1550.92 nm, respectively).
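The translation is simple arithmetic, wavelength = c/frequency. Here is a minimal Python sketch (my own illustration of the arithmetic, not any standard tool) that reproduces the wavelengths above from the grid frequencies:

# Convert ITU DWDM grid frequencies to vacuum wavelengths: lambda = c / f.
C = 299792458.0  # speed of light, m/s

def thz_to_nm(f_thz):
    """Vacuum wavelength in nanometers for an optical frequency in THz."""
    return C / (f_thz * 1e12) * 1e9

for f_thz in (193.10, 193.20, 193.30):
    print("%.2f THz -> %.2f nm" % (f_thz, thz_to_nm(f_thz)))
# Prints 1552.52, 1551.72, and 1550.92 nm; note that a uniform 100 GHz
# frequency grid gives channels about 0.8 nm apart near 1550 nm.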


DWDM was designed for expensive high-performance long-haul systems, but WDM also could enhance the capacity of shorter fiber systems if costs could be cut by using cheaper optics with less-demanding specifications. That led the ITU to develop a "coarse" grid for metro and access networks, specifying CWDM channels in wavelength units, spaced 20 nm apart from 1271 to 1611 nm.
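Because the coarse grid is uniform in wavelength rather than frequency, no conversion is needed; a one-line sketch in the same vein enumerates it:

# ITU CWDM grid: channels every 20 nm from 1271 to 1611 nm.
cwdm_nm = list(range(1271, 1612, 20))
print(len(cwdm_nm), cwdm_nm)  # 18 channels: 1271, 1291, ... 1591, 1611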


That's the official CWDM grid, but it hasn't stopped designers from multiplexing other combinations of widely spaced WDM signals, such as cable-television or fiber-to-the-home (FTTH) systems transmitting downstream at 1550 and (sometimes) 1480 nm, and upstream at 1310 nm.


So these divisions of the spectrum do have standard meanings -- even if not everyone sticks to them.


Comparison of CWDM and DWDM spacing. 

Thursday, May 31, 2012

SWIR - ACRONYM


SWIR - Short-Wavelength (or Short-Wave) Infrared, wavelengths at or near the short-wave end of the infrared.

You might think the infrared band stretching from the red end of the visible spectrum to three micrometers (µm) is used widely enough that it would have a well-accepted definition. Dream on. Everybody has their own take, and they can't even agree on whether that part of the spectrum should contain one or two bands.

Wikipedia's "Infrared" entry lists several definitions of SWIR, without trying to resolve the conflicts or explain the differences. It first cites a "commonly used subdivision scheme" that defines SWIR as wavelengths from 1.4 to 3 µm, where water absorption is high, with a separate near-infrared (NIR) band at 0.75 to 1.4 µm where water absorption is low. It's not a bad definition, but why is it referenced to a book titled Unexploded Ordnance Detection and Mitigation?  The International Commission on Illumination makes a similar split at 1.4 µm because shorter wavelengths (the IR-A band) can reach the retina, but longer wavelengths (the IR-B band) are absorbed within the eye. Both divisions make sense from the standpoint of illumination.

Other divisions are based on sensor response. Wikipedia cites a split between NIR extending to the long-wave limit of silicon detectors near 1.0 µm, and SWIR from 1.0 to 3.0 µm, given in Miller and Friedman's Photonics Rules of Thumb. A Raytheon wall chart similarly defines NIR as 0.7 to 1 µm, and SWIR as 1 to 2.7 µm. However, makers of other detectors pick other dividing points. An article by Nova Sensors (Solvang, CA) defines SWIR as 0.9 to 1.7 µm, the response range of the InGaAs sensors used in their SWIR cameras. Sometimes the two ranges even overlap: Headwall Photonics (Fitchburg, MA) lists its NIR hyperspectral imaging sensor as responding to 0.9 to 1.7 µm and its SWIR version as responding to 0.95 to 2.5 µm.

Some don't bother splitting the band at all. The International Organization for Standardization (ISO) 20473 standard calls 0.78 to 3 µm NIR, and the McGraw-Hill Dictionary of Scientific and Technical Terms calls NIR 0.75 to 3 µm. Neither lists SWIR.

I could go on in agonizing, nit-picking detail, but the lesson is clear -- SWIR (and NIR) can be useful labels for parts of the infrared, but you need to check the numbers to be sure what they mean.


Tuesday, May 22, 2012

VCSEL - ACRONYM


VCSEL - Vertical Cavity Surface-Emitting Laser, a type of semiconductor laser in which the resonant cavity is perpendicular to the junction layer and light is emitted from the surface of the chip.

All early diode lasers oscillated in the plane of the junction or active layer and emitted from the edge of the semiconductor chip. The design is logical because it keeps laser oscillation in the plane of the active layer, where recombination of current carriers produces a population inversion, so the round-trip gain in the cavity is high. However, because the active layer is very thin, the beams from edge-emitting diode lasers diverge rapidly, particularly in the direction perpendicular to the active layer.

VCSELs oscillate vertically, in a cavity formed by reflective layers on the top and bottom of the chip. Single-pass gain is much lower than in edge emitters because only a very thin layer of gain material lies between the cavity mirrors, but the emitting aperture typically is much wider than the active layer is thick, producing a higher-quality, circular beam. First demonstrated in 1979, VCSELs went through a series of structural refinements to improve their performance and fabrication processes. In current designs, one or often both of the reflectors are multilayer Bragg reflectors containing the tens of pairs of layers needed to produce the very high reflectivity required to sustain oscillation with only a thin gain medium.
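The need for so many pairs follows from the textbook quarter-wave-stack result: each high/low-index pair multiplies the effective admittance seen through the stack by (nH/nL)², and peak reflectivity approaches 1 only as that product grows large. A minimal Python sketch, using rough GaAs/AlGaAs-like index values chosen purely for illustration (not taken from any particular device design):

# Peak reflectivity of a quarter-wave Bragg stack at its design wavelength.
# Transform the substrate admittance through N high/low-index pairs,
# then apply the normal-incidence Fresnel formula.
def dbr_peak_reflectivity(n_hi, n_lo, n_sub, n_inc, pairs):
    y = n_sub * (n_hi / n_lo) ** (2 * pairs)  # effective admittance
    return ((n_inc - y) / (n_inc + y)) ** 2

for pairs in (10, 20, 30):  # illustrative indices, roughly GaAs/AlGaAs
    r = dbr_peak_reflectivity(n_hi=3.5, n_lo=2.9, n_sub=3.5, n_inc=1.0, pairs=pairs)
    print("%2d pairs: R = %.5f" % (pairs, r))
# ~0.974 at 10 pairs, ~0.9994 at 20, ~0.99999 at 30: with modest index
# contrast it takes tens of pairs to reach the 99.9%+ a VCSEL mirror needs.

Real mirrors add complications such as graded interfaces and absorption, but the scaling is the point: reflectivity converges toward 1 exponentially in the number of pairs.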

The short length of VCSEL cavities brings some advantages, including allowing direct current modulation at speeds up to 40 gigabits per second (Gbit/s) without the mode hopping possible in edge emitters. Yet ironically, a commercial attraction of the more complex VCSEL design is that it makes the lasers more economical. All the hard parts of VCSEL production are done by highly automated semiconductor manufacturing techniques. The resulting VCSELs can be tested on the wafer, unlike edge emitters, which can't be tested until the wafer is diced into chips. That, combined with their larger emitting area, greatly reduces packaging expenses, which account for more of the finished product's cost than the laser chips do. So the more complex design winds up being cheaper, as well as better for many applications.

And VCSEL types and applications keep growing. Recently, a VCSEL-type cavity was used in optically pumped colloidal quantum dot lasers emitting red, green and blue light.


Complexity in a VCSEL: Beam Express (Lausanne, Switzerland) makes 1310 nm VCSELs by bonding AlGaAs/GaAs distributed Bragg reflectors on the top and bottom of an InAlGaAs/InP gain layer containing strained quantum wells and a tunnel junction, because high-contrast Bragg reflectors are not practical in InP-based materials.

Tuesday, May 15, 2012

EUV/XUV - ACRONYM

EUV or XUV:  Extreme Ultraviolet, the short-wavelength or high-energy end of the ultraviolet spectrum, from 120 (or 200) nanometers to about 10 nanometers (nm).

Atmospheric transmission in the ultraviolet drops sharply toward shorter wavelengths, and air absorption is so strong that wavelengths shorter than 150 to 200 nm must be studied in a vacuum. The first explorers of the EUV spectrum were astronomers using satellite instruments. The quest to squeeze more and smaller transistors onto semiconductor chips has changed that by shrinking chip features to dimensions on the scale of EUV wavelengths. The semiconductor industry is now testing the first wave of EUV photolithography systems operating at 13.5 nm.


Such a sudden burst of technological interest in a long-neglected part of the spectrum is a great recipe for muddied definitions of spectral bands. Astronomers consider the 121 nm Lyman-alpha line of hydrogen to be the major landmark in the EUV spectrum, so they picked 120 nm as the long-wave end of the EUV band. The major technological landmark for the semiconductor industry, however, is the 13.5 nm lithography wavelength, so the industry often simply equates EUV with 13.5 nm. The laser community uses a broader definition of 10 to 120 or 200 nm as it explores a wider range of applications, enabled by new techniques such as high-harmonic generation.

The EUV largely overlaps the older vacuum-ultraviolet (VUV) band, usually defined as 10 to 200 nm. A draft document from the International Organization for Standardization (developed for space observations) also defines two other overlapping bands: the far-ultraviolet (FUV) at 122 to 200 nm and the germicidal Ultraviolet C (UVC) band at 100 to 280 nm.


XUV often is an alternative abbreviation for extreme ultraviolet, substituting the fashionable X for the relatively drab E, but the ISO lists it as an abbreviation for soft X rays at 0.1 to 10 nm. A quick Google search gives the impression XUV is the more popular form, with 5.6 million hits compared to 3.2 million for EUV and 1.8 million for VUV. But that's misleading because XUV also is shorthand for "crossover utility vehicle," a sport-utility vehicle based on a car rather than a truck. That usage may be popular on the Internet, but it was new to me--and for years I've been driving a Toyota RAV4, which is classed as an XUV.