SURVEY OF ADVANCED TECHNOLOGIES IN IMAGING SCIENCE FOR REMOTE SENSING

By:

Erich Hernandez-Baquero

B.S. Physics, U.S. Air Force Academy

Rochester Institute of Technology

(1997)

Table of Contents

List of Figures and Tables

Abstract

1.0 Introduction

1.1 Applications

1.1.1 Environmental Science

1.1.2 Military

1.1.3 Commercial

1.2 The Imaging Chain

2.0 Approach

2.1 Field of View

2.2 Data Transfer and Storage

2.3 Quality of Imagery

2.3.1 Atmospheric Distortions

2.3.2 Sensor Performance

2.3.3 Platform-Induced Distortions

2.3.4 Image Processing Algorithms

3.0 Results

3.1 Multispectral Imaging

3.2 Hyperspectral Imaging

3.3 Synthetic Aperture Radar

4.0 Conclusions

Appendix

References

List of Figures and Tables

Figure 1. View of El Niño from the TOPEX/Poseidon satellite.

Figure 2. Comparison of CORONA and SPOT satellite imagery.

Figure 3. Atmospheric correction of Landsat-TM data by image processing.

Figure 4. MSS scanning geometry and image projection

Figure 5. Typical hyperspectral image cube

Figure 6. 3-D topographical SAR imagery

Table 1. UVISI sensor suite parameters.

Table 2. Spaceborne Hyperspectral Imagers

Table 3. Airborne Hyperspectral Imagers

Abstract

Various imaging science technologies used for remote sensing are described. Although this report is not all-inclusive, the technologies presented are diverse and represent the most prominent fields in remote sensing imaging. Strengths and weaknesses are evaluated as they pertain to specific applications (either airborne or spaceborne), a brief description of the theory behind each technique is provided, and a vision for the future of remote sensing is offered.

SURVEY OF ADVANCED TECHNOLOGIES IN IMAGING SCIENCE FOR REMOTE SENSING

1.0 Introduction

Remote sensing is a natural extension of humanity's need to explore and understand its environment. Through advances in technology, we have been able to extend the way we see the world to a perspective never before possible. Using airborne and spaceborne platforms, complex imaging systems that surpass the limitations of the human eye are used to observe the Earth. Through these systems, we can now see in spectral regions that were previously invisible to the unaided eye.

The ability to extract information about our world and present it in ways that our visual perception can comprehend is the ultimate goal of imaging science in remote sensing. In all applications--from environmental monitoring to intelligence data gathering--the need to obtain more accurate information in a timely and efficient manner continues to grow exponentially. It is precisely because of this rapid growth that a broad range of technologies is presented in this report.

1.1 Applications

1.1.1 Environmental Science

Clearly one of the largest and most prominent applications of remote sensing is the study of the Earth's ecosystem. The synoptic view obtained from airborne and spaceborne imaging platforms provides an opportunity to understand weather systems, climate changes, geological phenomena, and the like from a global perspective. Not only are we able to view the Earth as a single ecosystem, but the amount and quality of information that we can gather far exceeds that of other methods of observation.

Figure 1. View of El Niño from the TOPEX/Poseidon satellite.

Figure 1 is an example of the kind of imagery that is available to anyone almost instantaneously over the Internet. The scale describes the height of the ocean surface (which is directly correlated with temperature) compared to the previous year's measurements. El Niño is seen as a mass of "red water" accumulating along the eastern Pacific. The data contained in this single image from space would otherwise have required hundreds of boats and instruments and could have taken a considerable amount of time to process and distribute1.

1.1.2 Military

Perhaps one of the areas in which the greatest advances in imaging technology have occurred is the field of intelligence data gathering in support of military operations and national security. The need for accurate and timely data cannot be overemphasized here, since the lives of military personnel can be saved by a better understanding of enemy force locations and activities. In addition, international treaties involving nuclear disarmament and biological/chemical warfare can be enforced without actually having to send in a team of inspectors. High-flying aircraft such as the SR-71 and U-2 and satellite platforms such as the recently declassified CORONA provide this type of information. The resolution available from these systems is far greater than that of their civilian counterparts. The CORONA satellite, for example, could obtain images with resolutions of approximately 6 feet! This technology, although dating to the 1960s, is still better than most currently operating civilian/commercial spaceborne imaging systems such as the French SPOT (see Figure 2).

1.1.3 Commercial

During the 1980s, the federal government decided that private industry should operate satellite space systems and manage the data generated by them. As a result, many companies began to sponsor or even develop remote sensing capabilities of their own. Their customers were the scientific community and the government itself. Other customers included local utility companies that would provide their customers with information about their energy use; for example, thermal infrared sensors can determine how efficiently a home or building uses its energy3. Much of this commercial imagery is available for sale over the Internet, making it very accessible to the public.

Figure 2. Comparison of CORONA and SPOT satellite imagery2.

1.2 The Imaging Chain

Before we can start analyzing and comparing remote sensing systems, we must first establish an approach for visualizing these imaging systems. At first glance, one might assume that the caliber and performance of an imaging system rests solely on the quality of its optical system in terms of resolution and accuracy. In fact, in order to fully characterize a system we must look at it from an end-to-end perspective. A satellite, for example, could be equipped with the most advanced hardware available, but if the images generated by that system cannot be processed (or interpreted), then it is useless. We therefore look at systems from an imaging chain approach.

The imaging chain simply consists of all the steps (which can be thought of as links in a chain) required to bring an image to an end-user. It is important to note that the end-user may not only be a human looking at a picture or movie, but may also be a control system used in an automated process. The imaging chain takes us through the steps of capturing a scene; storing, manipulating, and transmitting the data; and finally displaying the image. Clearly, a system may generate good data that can be processed to yield accurate information, but without a good system to display it, the whole process suffers. The chain analogy applies here: the whole chain is only as strong as its weakest link3.

The scope of this report is to analyze emerging technologies in imaging science for remote sensing using the imaging chain approach. However, no discussion on display systems is provided. It is assumed that the systems presented in this report can generate processed imagery that can be properly digitized and displayed on a moderate resolution CRT or incorporated into automated systems. Thus, using the imaging chain approach, a system can be evaluated by the scene that it can capture, the data transfer and storage capability, and the quality of the produced imagery.

2.0 Approach

The following parameters will be used to evaluate the performance of the imaging systems presented in this report:

2.1 Field of View

The scene a system can capture is driven mainly by its Field Of View (FOV). In particular, we are interested in an imaging sensor’s instantaneous FOV (IFOV), the ground IFOV (GIFOV), and the height of the imaging sensor platform. The relationship is given by

GIFOV = H∙IFOV [1]

where H is the height of the platform and the IFOV is the size of the detector element at the image plane divided by the effective focal length of the optical system. How much total ground coverage is achieved depends on the GIFOV (which in turn depends on the orbital parameters of a satellite platform or the flying altitude of an aircraft) and the dwell time on a particular scene. Depending on the sensor configuration, the total FOV may range from only 15° to 120°3,4. A larger FOV is not necessarily the best solution, however, since it is more susceptible to geometric distortions and often results in poorer spatial resolution. These parameters continue to improve as new electro-optics technologies develop.
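
As a concrete illustration, Eq. [1] can be evaluated in a few lines of Python. This is a minimal sketch; the detector size, focal length, and altitude are illustrative values (the 86-µrad IFOV and ~917-km altitude in the usage example are roughly those of the Landsat MSS discussed in Section 3.1):

import math

def ifov(detector_size_m, focal_length_m):
    """IFOV (radians): detector element size over effective focal length."""
    return detector_size_m / focal_length_m

def gifov(height_m, ifov_rad):
    """Ground-projected IFOV (meters) at nadir, per Eq. [1]."""
    return height_m * ifov_rad

# An 86-microradian IFOV viewed from ~917 km projects to roughly 79 m:
print(gifov(917e3, 86e-6))  # -> 78.862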

2.2 Data Transfer and Storage

The data transfer and storage capability of a system depends on the electronic configuration at the imaging platform. Ultimately, the photons generated or reflected by a scene reach the sensor and the energy is turned into an electrical signal (or recorded on film). Since emerging technologies involve the use of electro-optical imaging sensors, we will look at the requirements for storing and transferring the electrical data. Because of weight limitations, satellite systems usually send the data in real-time or near real-time via telemetry to ground stations which then record the data in optical drives, CD’s, or serial tape drives. Airborne platforms, however, may be able contain the data storage hardware onboard. In many cases, the distribution and storage of data is handled by government organizations or by private industry. This is another area where technology continues to improve and become more affordable allowing real-time delivery of large volumes of imagery data with minimal loss or distortion of data.
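
To see why telemetry is a driving requirement, consider a rough data-rate budget for a multispectral line scanner. All of the parameters in this sketch are hypothetical assumptions, not the budget of any actual system:

# Back-of-the-envelope downlink rate for a multispectral line scanner.
pixels_per_line = 6000    # samples across the swath (assumed)
lines_per_second = 700    # scan-line rate (assumed)
bands = 4                 # spectral bands (assumed)
bits_per_sample = 8       # quantization depth (assumed)

rate_bps = pixels_per_line * lines_per_second * bands * bits_per_sample
print(f"{rate_bps / 1e6:.0f} Mbit/s")  # -> 134 Mbit/s

Rates of this magnitude are consistent with the 150 Mbit/s figure quoted for SPOT-5 in Section 3.1.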

2.3 Quality of Imagery

The quality of the imagery mainly depends on the atmospheric distortions, sensor performance, platform-induced distortions, and the effectiveness of image processing algorithms. Of all of these, the atmosphere is the most dynamic, and consequently, the most difficult source of image degradation to compensate for.

2.3.1 Atmospheric Distortions

Slight variations in the atmosphere change the effective index of refraction of the atmospheric medium along any given optical path between the scene and the sensor. The effect of the atmosphere is typically seen as a blurring and loss of contrast in an image. When looking at spectral data, the atmosphere alters the spectral profile that a sensor "sees" by blocking or introducing different frequency bands, thus generating inaccuracies in the image segmentation and classification process. In the extreme case, heavy cloud cover may completely obscure a scene from a remote sensing system. The degradation process is difficult to characterize because of the large number of physical processes occurring in the atmosphere that affect the transmission of light through it. In general, atmospheric effects are compensated for through a complex model of the atmosphere. The U.S. Air Force Phillips Laboratory Geophysics Directorate has developed a widely accepted database of atmospheric constituents that allows the user to estimate the atmospheric effects on the image acquisition process. Other approaches to atmospheric compensation include speckle imaging, range-angle interferometry, and adaptive optics5. Interestingly, many of these developments in atmospheric compensation originated within the astronomical community.
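
In its simplest form, model-based compensation amounts to inverting a linear radiance equation: the at-sensor radiance is the ground-leaving radiance attenuated by the atmospheric transmittance, plus an additive path radiance. The sketch below assumes that an atmospheric model such as the one described above supplies per-band transmittance and path-radiance estimates; all of the numbers are invented for illustration only:

import numpy as np

# Invert L_sensor = tau * L_ground + L_path for each spectral band.
l_sensor = np.array([52.0, 41.0, 33.0])  # measured at-sensor radiance (assumed)
tau      = np.array([0.78, 0.85, 0.90])  # modeled transmittance per band (assumed)
l_path   = np.array([12.0,  7.0,  3.0])  # modeled path radiance per band (assumed)

l_ground = (l_sensor - l_path) / tau     # estimated ground-leaving radiance
print(l_ground)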

2.3.2 Sensor Performance

There are so many aspects of sensor design and operation that contribute to a sensor's overall performance that it would be beyond the scope of this paper to discuss all of them. A more complete treatment of sensor design and performance is found in Chapter 8 of the Manual of Remote Sensing6 and Volumes 3 and 4 of The Infrared & Electro-Optical Systems Handbook7. For our discussion, it is sufficient to mention that sensor performance is driven mainly by the signal-to-noise ratio (SNR), spectral response, throughput, and ease of calibration. The SNR is simply how well the sensor can distinguish a signal of interest from the electronic or thermal noise associated with the hardware. It is the parameter that drives the need for longer dwell times: the sensor must collect enough photons, or flux, to create a signal above its noise floor. Although the detector usually dominates the noise, it is possible to have a system where the signal conditioning electronics are the major noise source. The spectral response describes how well the sensor can "see" in a specific spectral band. If the optical wavelength of interest falls in the near-infrared, for example, and the sensor has no spectral response in this region, then it will not register a signal. Another example is the spectral response of the eye, which can only see in the visible spectrum. The throughput is a measure of how well the incoming flux of radiation propagates through the sensor optics. Clearly, poor mirror coatings and lens aberrations will cause light to be scattered or attenuated as it propagates through the sensor system, limiting the number of photons that reach the detector. Finally, no sensor can provide high-quality imagery without proper calibration. A well-calibrated system increases confidence in the accuracy of the data; temperature readings from an airborne radiometer may well be useless if they cannot be compared to some absolute measurement. Calibration is a far greater issue for spaceborne sensors: a system can be well calibrated in the laboratory, but once it reaches space, the zero-gravity environment can change the hardware to the point where the instrument operates differently than it did on the ground. Advances in remote calibration techniques include the use of ground-based radar and lasers8.
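
The dependence of SNR on dwell time can be made concrete with a simple photon-counting model: the signal grows linearly with integration time while shot noise grows only as its square root, so SNR improves roughly as the square root of dwell time. The rates below are assumptions chosen only to show the trend:

import math

def snr(photon_rate, dwell, read_noise=30.0, dark_rate=100.0):
    """Shot-noise-limited SNR for a detector (all quantities in electrons)."""
    signal = photon_rate * dwell
    noise = math.sqrt(signal + dark_rate * dwell + read_noise ** 2)
    return signal / noise

for t in (1e-4, 1e-3, 1e-2):
    print(f"dwell {t:.0e} s -> SNR {snr(1e7, t):.0f}")
# Ten times the dwell time buys roughly a 3x (sqrt of 10) gain in SNR.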

2.3.3 Platform-Induced Distortions

Another source of distortions on an image is the imaging platform itself. This is particularly true of non-stabilized platforms such as aircraft. As the airplane pitches, yaws, and rolls, the direction in which the sensor is pointing changes, causing the FOV of sequential frames to be different3. This creates geometric distortions on the image. Although this distortion is not extremely complex, it can lead to loss of data and it must be taken into consideration. One of the major advantages of spaceborne over airborne sensors is the inherent stability of spaceborne platforms.

Figure 3. Atmospheric correction of Landsat-TM data by image processing.

2.3.4 Image Processing Algorithms

As computer technology continues to improve dramatically, the use of image processing algorithms is becoming more viable. Through image processing, sensor data can be represented in a meaningful format that allows the extraction of vital information. Image processing can also help fill in the "gaps" caused by missing data through the use of application-dependent interpolation or extrapolation techniques. Processing also allows the user to compensate for atmospheric distortions (assuming that model predictions can be correlated to the observed distortions) and to display an image free of atmospheric effects. Figure 3 shows the dramatic improvement that can be accomplished through the use of image processing techniques for atmospheric correction: the image on the right is the corrected version of the image shown on the left9. As new sensor systems are developed, image processing algorithms must be identified in order to generate high-quality imagery.

3.0 Results

3.1 Multispectral Imaging

One of the biggest advances in remote sensing was the launch of the Landsat series of satellites beginning in 1972. These satellites were equipped with multispectral sensors that provided repetitive global coverage. The first three satellites carried the multispectral scanner (MSS), which collected images in four broad spectral bands from the visible green to the near-infrared. The sensor consisted of an oscillating mirror that scanned the ground in the cross-track direction using six simultaneous line scans (one line scan per detector per spectral band). Figure 4 shows the image projection on the ground and the scanning arrangement for the MSS. Landsat 3 carried an MSS with a fifth band (designated "Band 8") in the thermal-infrared. These MSS had an IFOV of 86 µrad (258 µrad for Band 8), a GIFOV of approximately 79 m (237 m for Band 8), and SNRs ranging from 72 to 123. Landsat 4 and 5 were equipped with an MSS and a thematic mapper (TM).

Figure 4. MSS scanning geometry and image projection4.

The TMs operated over the same spectral bands as the MSS but provided 16 detectors per spectral band (as opposed to six in the MSS) and 4 detectors for the thermal-IR scan, at a spatial resolution of 30 m (IFOV of 42 µrad) and 120 m (IFOV of 170 µrad) for the thermal-IR band. The TM sensor also provides more radiometric information than the MSS. The next generation of multispectral sensors comparable to these is the Enhanced Thematic Mapper Plus (ETM+), which will be carried on Landsat 7 (scheduled for launch in July 1998). This sensor will acquire images over 7 spectral bands with the same IFOV and spatial resolution as the TM except for the thermal-IR band, which will have an improved resolution of 60 m. The on-board radiometric calibration of the sensor is improved and a panchromatic band is included10.

Multispectral sensors continue to improve. The French SPOT satellite series consistently provides 20-m resolution multispectral and 10-m resolution panchromatic imagery that is available over the Internet. While the currently operational SPOT satellites have three spectral bands, SPOT-5 will add a mid-infrared band, an increased multispectral resolution of 10 m, and a panchromatic resolution of 5 m. The data rate required to transmit all of these data is on the order of 150 Mbit/s11!

3.2 Hyperspectral Imaging

While multispectral sensors continue to improve in spatial resolution, hyperspectral sensors are gaining popularity in remote sensing. These sensors are capable of providing data in very narrow spectral bands (on the order of nanometers). The advantage of a hyperspectral sensor is that we can now determine not only what an object is, but what it is made of. Since the reflectance and emissivity characteristics of an object vary with wavelength and with the object's composition, obtaining a spectrum of reflected and self-emitted photons should provide some information as to what the object's constituents are. Because each spectral band is narrow, longer dwell times are needed in order to obtain a high enough SNR. Decreasing the spectral bandwidth also has the adverse effect of decreasing the spatial resolution. Because of this, image processing algorithms are being developed to fuse hyperspectral data with high-resolution multispectral or panchromatic images. The resulting product is an image cube that is made up of voxels (volume elements) instead of pixels (picture elements)12. Figure 5 is an example of a typical image cube generated from hyperspectral imaging. Notice that the spectral information lies along the vertical axis of the image and that there are gaps where no detector exists for the corresponding spectral band.

Figure 5. Typical hyperspectral image cube.
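
In software, such a cube is naturally represented as a three-dimensional array with two spatial axes and one spectral axis. The sketch below builds a synthetic cube with NumPy and scores every pixel against a reference spectrum using cosine similarity (akin to a spectral angle measure); the shapes and data are stand-ins, not output from any real sensor:

import numpy as np

rows, cols, bands = 128, 128, 224         # AVIRIS-like band count; sizes assumed
cube = np.random.rand(rows, cols, bands).astype(np.float32)

spectrum = cube[100, 64, :]               # full spectrum of one ground pixel
band_image = cube[:, :, 42]               # one spectral "slice" of the cube

# Score each pixel's spectrum against a reference spectrum:
reference = spectrum
flat = cube.reshape(-1, bands)
cosine = flat @ reference / (np.linalg.norm(flat, axis=1) * np.linalg.norm(reference))
match_map = cosine.reshape(rows, cols)    # per-pixel similarity image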

Currently, civilian operational hyperspectral imagers are limited to airborne systems. These systems, however, provide very high resolution spectral data at the expense of a wide synoptic view. The AISA Airborne Imaging Spectrometer, for example, can resolve spectral differences of less than 2 nm in the visible to near-infrared spectrum of 430-990 nm while obtaining spatial resolutions of about one meter from an aircraft flying at an altitude of 1000 m. The FOV is 21° cross-track and 0.055° along-track13. This is all accomplished with a lightweight, portable package of about 35 pounds. Other airborne sensors include the NASA AVIRIS (with a spectral range of 400-2450 nm) and the ITRES CASI (430-870 nm). Rochester Institute of Technology (RIT) is currently developing the MISI sensor, which has 69 bands ranging from the visible to the far-infrared. The spectral bandwidth of this instrument in the visible range is on the order of 10 nm, which is considered state-of-the-art14. The Appendix contains a more extensive list of airborne and spaceborne hyperspectral imagers15.

The Department of Defense (DOD) recently launched the Midcourse Space Experiment (MSX) satellite. This satellite carries a hyperspectral imaging suite of sensors built by the Johns Hopkins University Applied Physics Laboratory (JHU/APL) called the Ultraviolet and Visible Imagers and Spectrographic Imagers (UVISI). Table 1 shows the characteristics of this sensor suite.

Over the next decade, several hyperspectral sensors will be placed in orbit. This is largely due to the Mission to Planet Earth (MTPE) initiative headed by NASA. The next spaceborne hyperspectral sensor is the Moderate Resolution Imaging Spectrometer (MODIS), which will fly on board the EOS-1 (Earth Observing Satellite). The sensor will image the whole globe every 1 to 2 days with a spectral resolution of 10-15 nm in the visible and near-infrared and 0.1-0.3 µm from the mid-infrared to the far-infrared16.

Table 1. UVISI sensor suite parameters17.

Characteristic | UV-VIS Imager | Spectrographic Imager
Wavelength coverage (nm) | 110-300 UV, 300-900 VIS | 110-900
Field of view (°) | 1.6 x 1.3 NFOV; 13.1 x 10.5 WFOV | 1.0 x 1.0
Optical collecting area (cm²) | 130 NFOV, 25 WFOV | 110
Filters | 5-position selectable | 3
Spatial resolution (°) | 0.01, 0.10 | 0.025
Spectral resolution (nm) | - | 0.5-4.3
Sensitivity (photons/cm²·s) | 1-5 | 1-6

Total weight: 456 lb; total power: 105 W.

Perhaps the most critical technological advance that has made hyperspectral imaging from space a reality is the advent of highly sensitive HgCdTe focal plane arrays (FPAs). These allow the narrow-band, "photon-starved" detectors to achieve a high enough SNR and/or NEDT (noise equivalent change in temperature) to register a signal. Also, improvements in the readout electronics result in a much higher detector quantum efficiency (a measure of how many electrons the detector generates per incident photon). A host of other technologies is involved in the proper operation of a hyperspectral imager (i.e., calibration modules, cooling systems, precise dispersion elements, optical coatings, etc.) that must be considered but is omitted here due to the limited scope of this report16.

Recall from Figure 5 that the image product of a hyperspectral sensor is an image cube. The large volume of data inherent in this product is a concern for spaceborne sensors, since on-board recording systems are discouraged by payload weight limitations. Advances in telemetry must therefore keep up with the increasing demand for real-time and near-real-time display of imagery. A promising technology appears to be laser communications, where data rates of gigabytes per second are possible. Other areas of current development are the incorporation of hyperspectral images from space into current imaging technologies and the use of hyperspectral data for automated target identification18.
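
The scale of the problem is easy to quantify. Under the purely illustrative cube dimensions and link rates assumed in the sketch below, a single cube occupies on the order of ten gigabits:

# Size of one hyperspectral image cube and the time to downlink it.
rows, cols, bands, bits = 2048, 2048, 224, 12     # all assumed values
cube_bits = rows * cols * bands * bits            # ~11.3 Gbit

for rate_bps, label in ((150e6, "150 Mbit/s RF link"),
                        (10e9, "10 Gbit/s optical link")):
    print(f"{label}: {cube_bits / rate_bps:.1f} s per cube")
# -> ~75 s over the RF link versus ~1.1 s over the optical link.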

3.3 Synthetic Aperture Radar

One of the major limitations of multispectral and hyperspectral sensors is that they cannot "see" through clouds, heavy fog, or haze. Synthetic Aperture Radar (SAR) overcomes this limitation because it operates in the microwave region of the electromagnetic spectrum (1-10 GHz), where there is no absorption by water molecules. At frequencies below 1 GHz, the characteristics of the returned signal are dominated by ground interference and ionospheric disturbances, while molecular absorption bands dominate frequencies above 100 GHz, which yield information about the atmosphere but not about the Earth's surface. Because of its all-weather capability, and because it provides its own active source of illumination (and is therefore independent of the sun), SAR technology is very appealing for the continuous observation of global patterns19.

Although SAR technology has existed since 1951, it was not seriously considered as a high-resolution imaging system from space until recently. The major limitation of SAR was the inherently large number of computations needed to analyze the data. Because of the limited computational power available until recently, SAR imagery was of poor quality and had a short dynamic range. SAR systems are also particularly susceptible to geometric distortions generated by uneven terrain and to large radiometric distortions inherent in the system design. The development of accurate calibration techniques and their operational implementation has made SAR technology more viable19.

The ingenuity behind SAR imaging systems is that the aperture is formed synthetically. Normally, operation in the microwave region would require large antennae with dimensions on the order of hundreds of meters. With SAR, the moving platform emits ranging pulses and collects the returns as it moves; the distance the aircraft or spacecraft travels over the time of transmission forms the synthetic aperture20. Range resolution in the cross-track direction (the direction in which the imaging system is looking) is determined by the time it takes for the radar pulse to return. Azimuthal (along-track) resolution is obtained from the Doppler shift associated with a target return. For a point target located at range R in the cross-track direction and coordinate x in the along-track direction, the Doppler frequency associated with that point is

fD = 2∙Vst∙sin(θ)/λ = 2∙Vst∙x/(λ∙R) [2]

where θ is the angle between the cross-track direction and the target, Vst is the relative velocity of the platform, and λ is the wavelength of the SAR pulse. Thus, for each along-track location x there is an associated Doppler frequency that allows the point to be resolved19.
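
Equation [2] is straightforward to evaluate. In the sketch below, the platform speed, wavelength, and slant range are assumed values loosely representative of an airborne C-band system; each along-track offset x maps to a distinct Doppler frequency:

import math

v_st = 250.0   # platform velocity (m/s, assumed)
lam  = 0.057   # radar wavelength (m), roughly C-band (assumed)
R    = 10e3    # slant range to the target (m, assumed)

def doppler(x):
    """f_D = 2 * V_st * x / (lam * R) for a target at along-track offset x."""
    return 2.0 * v_st * x / (lam * R)

for x in (-100.0, 0.0, 100.0):
    print(f"x = {x:6.1f} m -> f_D = {doppler(x):7.1f} Hz")
# Targets fore and aft of broadside are separated by their Doppler shifts.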

The number of spaceborne SAR systems is very limited. In the United States, the Seasat-A satellite (which was operational for only 100 days) and the Shuttle Imaging Radar (SIR) are the only spaceborne SAR systems. Because of the limited amount of spaceborne SAR data, the next generation of EOS satellites, which will carry SAR systems similar to SIR-C and X-SAR, should generate interesting results. In 1991, the European Space Agency (ESA) launched the European Remote Sensing satellite (ERS-1), which carries the Active Microwave Instrument (AMI) SAR system. The AMI operates at 5.3 GHz (C-band) and has a spatial resolution of 30 m over a 99-km swath21. This is comparable to the Landsat TM and ETM+ sensors. The system is a prototype for the Advanced SAR (ASAR) sensor that will fly on board the environmental monitoring and atmospheric chemistry satellite ENVISAT. The Canadian Radarsat also employs a C-band SAR22.

Figure 6. 3-D topographical SAR imagery.

Airborne SAR systems are more common and have provided the necessary tools to make spaceborne SAR possible. Sandia National Laboratories routinely obtains 3-D topographic imagery from interferometric airborne SAR operations. SAR interferometry uses two passes that are correlated to form a synthetic interferometer. Figure 6 is an example of the type of imagery possible with these systems, even through heavy clouds20. NASA's Jet Propulsion Laboratory (JPL) operates the AIRSAR system, which is flown on a DC-8 and operates at P-band (438.75 MHz), L-band (1237.4 MHz), and C-band (5287.5 MHz)23.
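
A heavily simplified sketch of the two-pass geometry follows. After the flat-earth phase contribution is removed, the remaining interferometric phase of a repeat-pass system is approximately proportional to terrain height; the geometry values are assumed for illustration, and a real processor must also unwrap the measured phase:

import math

lam    = 0.057                  # radar wavelength (m), roughly C-band (assumed)
R      = 12e3                   # slant range (m, assumed)
theta  = math.radians(45.0)     # look angle (assumed)
b_perp = 100.0                  # perpendicular baseline between passes (m, assumed)

def height_from_phase(phi):
    """Approximate repeat-pass relation: h = phi*lam*R*sin(theta)/(4*pi*B_perp)."""
    return phi * lam * R * math.sin(theta) / (4.0 * math.pi * b_perp)

print(height_from_phase(math.pi))  # height for half a fringe (~1.2 m here)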

As can be seen, SAR systems provide a unique form of imagery that is weather- and sun-independent. This is a clear advantage over optical systems such as Landsat, which provides an average of only 6 to 10 images per year for any non-desert land area on Earth because of weather limitations24. Furthermore, the microwave nature of the active illumination allows SAR systems to characterize surface properties (e.g., soil moisture). SAR technology also serves as the basis for emerging technologies, including, but not limited to, passive interferometric range-angle imaging and optical range-Doppler imaging5.

4.0 Conclusions

The imaging technologies covered in this report are by no means all-encompassing, but they represent the major efforts in remote sensing today. The tradeoffs between these systems are numerous. Multispectral sensors provide high spatial resolution but are limited in spectral information. Hyperspectral sensors, on the other hand, can generate nearly continuous spectra of objects but have poor spatial resolution. Finally, SAR systems can overcome weather limitations but are difficult to calibrate and to maintain operationally. Since no single system provides an end-all solution, the answer clearly lies in taking the information that can be obtained from each of these systems and combining it into a single product. This is where remote sensing is headed. The real technological advance will occur when information technology catches up with all the available sources of satellite and airborne imagery and can fuse them together in a way that is available to the user in real time. Advances in data transmission and computational speed will have to occur, along with display technologies that can accurately represent the fused imagery. The power of having this capability is far-reaching in all applications. The future of remote sensing is just beginning, and the reward for our efforts is a better understanding of our own home: the Earth.

Appendix

Table 2. Spaceborne Hyperspectral Imagers

Sensor (Agency) | Number of Bands | Spectral Coverage (nm) | Band Width at FWHM (nm) | GIFOV (mrad) (m) | FOV (deg) (km) | Data Product | Tentative Launch Date
NIMS (NASA/JPL) | 504 | 700-5100 | 10 | 0.5 | 20 pixels | Full Cube | flown (extra-terrestrial mission)
VIMS (NASA/JPL) | 320 | 400-5000 | 15 | 0.5 | 70 pixels | Full Cube | flown (extra-terrestrial mission)
UVISI (US military) | > 200 | 380-900; 110-900 | 1-3 | (100-1000) | (25) | Full Cube | MSX spacecraft (1994)
Min Map (?) | 192 | 350-2400 | 12.5 | 0.45 | 5.8 | Full Cube | 1996?
MODIS (NASA/EOS) | 36 | 415-2130; 3750-4570; 6720-14240 | 10-500 | (250-1000) | (2330) | Sub-Cube | EOS AM platform (1998); EOS PM platform (2000)
MERIS (ESA/EOS) | 15 (selectable) | 400-1050 | 2.5-10 (selectable) | (300) | (1450) | Sub-Cube | ESA-POEM 1 AM platform (1998)
PRISM (ESA/EOS) | ~150-200; 1; 3 | 450-2350; 3800; 8000-12300 | 10-12; 600; 1000 | (50) | (50) | Full Cube | Design stage
CIS (China) | 30; 6 | VNIR; SWIR/MWIR/TIR | 20 | (402) | 90 | Full Cube | Design stage
HSI (TRW) | 128; 256 | 400-1000; 900-2500 | 5.00; 6.38 | (30) | (7.7) | Full Cube | LEO s/c platform (1996)

Table 3. Airborne Hyperspectral Imagers

Sensor (Agency/Company) | Number of Bands | Spectral Coverage (nm) | Band Width at FWHM (nm) | IFOV (mrad) (GIFOV (m)) | FOV (°) (km) | Data Product | Period of Operation
AAHIS (SAIC) | 288 | 433-832 | 6.0 | 1.0 x 0.5 | 11.4 | Image Cube | since 1994
AHS (Daedalus) | 48 | 440-12700 | 20-1500 | 2.5 | 86 | Image Cube | since 1994
AIS-1 (NASA/JPL) | 128 | 900-2100; 1200-2400 | 9.3 | 1.91 | 3.7 | Image Cube | 1982-1985
AIS-2 (NASA/JPL) | 128 | 800-1600; 1200-2400 | 10.6 | 2.05 | 7.3 | Image Cube | 1985-1987
AISA (Karelsilva Oy) | 286 | 450-900 | 1.56-9.36 | 1.0 | 21.0 | Image Cube | since 1993
AMSS (GEOSCAN) | 32; 8; 6 | 490-1090; 2020-2370; 8500-12000 | 20.0-71.0; 60.0; 550-590 | 2.1 x 3.0 | 92.0 | Image Cube | since 1985
ARES (Lockheed) | 75 | 2000-6300 | 25.0-70.0 | 1.17 | 3 x 3 | Image Cube | since 1985
ASAS (NASA/GSFC) | 29 | 455-873 | 15.0 | 0.80 | 25.0 | Image Cube (7 viewing angles, +45°/-45°) | 1987-1991
upgraded ASAS (NASA/GSFC) | 62 | 400-1060 | 11.5 | 0.80 | 25.0 | Image Cube (up to 10 viewing angles, +75°/-55°) | since 1992
ASTER Simulator (DAIS 2815) (GER) | 1; 3; 20 | 700-1000; 3000-5000; 8000-12000 | 300.0; 600-700; 200 | 1.0, 2.5 or 5.0 | 28.8, 65.0 or 104.0 | Image Cube | since 1992
AVIRIS (NASA/JPL) | 224 | 400-2450 | 9.4-16.0 | 1.0 (20) | 30.0 (12) | Image Cube | since 1987
CASI (Itres Research) | 288; up to 15 | 430-870 (nominal) | 2.9 | 1.2 | 35.0 | Profiles; Image | since 1989
CAMODIS (China) | 64; 24; 1; 2 | 400-1040; 2000-2480; 3530-3940; 10500-12500 | 10.0; 20.0; 410.0; 1000.0 | 1.2 x 3.6; 1.2 x 1.8; 1.2 x 1.2; 1.2 x 1.2 | 80.0 | Image Cube | since 1993
DAIS-7915 (GER/DLR/JRC) | 32; 8; 32; 1; 6 | 498-1010; 1000-1800; 70-2450; 3000-5000; 8700-12300 | 16.0; 100.0; 15.0; 2000.0; 600.0 | 3.3, 2.2 or 1.1 | 78.0 | Full Cube | since 1994
DAIS-16115 (GER) | 76; 32; 32; 6; 12; 2 | 400-1000; 1000-1800; 2000-2500; 3000-5000; 8000-12000; 400-1000 | 8.0; 25.0; 16.0; 333.0; 333.0 | 3 | 78.0 | Full Cube; Stereo | since 1994
DAIS-3715 (GER) | 32; 1; 2; 1; 1 | 360-1000; 1000-2000; 2175-2350; 3000-5000; 8000-12000 | 20; 1000; 50; 2000; 4000 | 5.0 | 90.0 | Full Cube | since 1994
FLI/PMI (MONITEQ) | 288; 8 | 430-805 | 2.5 | 0.66/0.80 | 70.0 | Full Cube (Profiles); Sub-Cube | 1984-1990
GERIS (GER) | 24; 7; 32 | 400-1000; 1000-2000; 2000-2500 | 25.4; 120.0; 16.5 | 2.5, 3.3 or 4.5 | 90.0 | Full Cube | since 1986
HSI (SAIC) | 128 | 400-900 | 4.3 | 0.14 x 1.0 | 8.0 | Full Cube | until 1994
HYDICE (Naval Research Laboratory) | 206 | 400-2500 | 7.6-14.9 | 0.5 | 8.94 | Full Cube | since 1995
ISM (DES/IAS/OPS) | 64; 64 | 800-1700; 1500-3000 | 12.5; 25.0 | 3.3/11.7 | 40.0 (selectable) | Full Cube | since 1991
MAS (Daedalus) | 9; 16; 16; 9 | 529-969; 1395-2405; 2925-5325; 8342-14521 | 31-55; 47-57; 142-151; 352-517 | 2.5 | 85.92 | Full Cube | since 1993
MAIS (China) | 32; 32; 7 | 450-1100; 1400-2500; 8200-12200 | 20; 30; 400-800 | 3; 4.5; 3 | 90 | Full Cube | 1990
MEIS (McDonnell Douglas) | > 200 | 350-900 | 2.5 | 2.5 | - | Full Cube | since 1992
MISI (RIT) | 60; 1; 1; 3; 4 | 400-1000; 1700; 2200; 3000-5000; 8000-14000 | 10; 50; 50; 2000; 2000 | 1; 1 or 2 | ±45 | Full Cube | from 1996
MIVIS (Daedalus) | 20; 8; 64; 10 | 433-833; 1150-1550; 2000-2500; 8200-12700 | 20.0; 50.0; 8.0; 400.0/500.0 | 2.0 | 70.0 | Full Cube | since 1993
MUSIC (Lockheed) | 90; 90 | 2500-7000; 6000-14500 | 25-70; 60-1400 | 0.5 | 1.3 | Full Cube | since 1989
ROSIS (MBB/GKSS/DLR) | 84; 30 | 430-850 | 4.0/12.0 | 0.56 | 16.0 | Full Cube; Sub-Cube | since 1993
RTISR (Surface Optics Corp.) | 20 or 30 | 400-700 (900) | 7.0-14.0 (19.0) | 0.2-2.0 | 29.0 x 22.0 | Full Cube | since 1994
SFSI (CCRS) | 120 | 1200-2400 | 10.0 | 0.33 (~0.5) | 9.4 | Full Cube; Sub-Cube | since 1994
SMIFTS (U. of Hawaii) | 75; 35 | 1000-5200; 3200-5200 | (100 cm-1); (50 cm-1) | 0.6 | 6.0 | Full Cube | since 1993
TRWIS-A (TRW) | 128 | 430-850 | 3.3 | 1.0 | 13.8 | Full Cube | since 1991
TRWIS-B (TRW) | 90 | 430-850 | 4.8 | 1.0 | 13.8 | Full Cube | since 1991
TRWIS-II (TRW) | 99 | 1500-2500 | 11.7 | 0.5/1.0 | 6.9/13.8 | Full Cube | since 1991
TRWIS-III (TRW) | 396 | 400-2500 | 5.0/6.25 | 0.9 | 13.2 | Full Cube | since 1991
Hybrid VIFIS (U. of Dundee) | 30; 30 | 440-640; 620-890 | 10-14; 14-18 | 1.0; 1.0 | 31.5; 31.5 | Full Cube | since 1994
WIS-FDU (Hughes SBRC) | 64 | 400-1030 | 10.3 | 1.36 | 10.0 & 15.0 | Full Cube | 1992
WIS-VNIR (Hughes SBRC) | 17; 67 | 400-600; 600-1000 | 9.6-14.4; 5.4-8.6 | 0.66 | 19.1 | Full Cube | 1995
WIS-SWIR (Hughes SBRC) | 41; 45 | 1000-1800; 1950-2500 | 20.0-37.8; 18.0-25.0 | 0.66 | 12.0 | Full Cube | 1995

References

1. NASA/JPL. "TOPEX/Poseidon." Internet. 14 Nov. 1997. Available http://topex-www.jpl.nasa.gov/

2. Vick, C.P. "FAS Intelligence Resource Program: CORONA Products." (17 Mar. 1997). Internet. 14 Nov. 1997. Available http://www.fas.org/irp/imint/is-nuclear.htm

3. Schott, J.R. Remote Sensing: The Image Chain Approach. New York: Oxford UP, 1997.

4. "Multispectral Scanner Landsat Data." U.S. Geological Survey's EROS Data Center. Internet. 14 Nov. 1997. Available http://edcwww.cr.usgs.gov/glis/hyper/guide/landsat.

5. Robinson, S.R., ed. Emerging Systems and Technologies. SPIE: Optical Engineering Press and Environmental Research Institute of Michigan, 1993. Vol. 8 of The Infrared and Electro-Optical Systems Handbook. Eds. J.S. Accetta and D.L. Shumaker. 8 vols. 1993.

6. Colwell, R.N. Manual of Remote Sensing. Vol. 1. Virginia: American Society of Photogrammetry, 1983.

7. Accetta, J.S., and D.L. Shumaker, eds. The Infrared and Electro-Optical Systems Handbook. 8 vols. SPIE: Optical Engineering Press and ERIM, 1993.

8. Schott, J.R. Personal interview. October 1997.

9. Vandle, J.R., et al. "LTER/NASA Collaboration on Atmospheric Correction of Remotely Sensed Data: Draft Workshop Report." (16-18 Aug. 1996). Internet. 14 Nov. 1997. Available http://lternet.edu/nasa/atmcor/atmcor96/.

10. NASA. "ETM+ Image Formation." Internet. 13 Nov. 1997. Available http://ls7pm3.gsfc.nasa.gov/Science.html

11. "The SPOT Satellites." Internet. 13 Nov. 1997. Available http://www.spot.com/anglaise/system/satel/ss_tdata.htm

12. "Image Cube." Digital Imaging and Remote Sensing Group, Center for Imaging Science, Rochester Institute of Technology.

13. "AISA Airborne Imaging Spectrometer." Internet. 9 Oct. 1997. Available http://www.specim.fi/aisa.html

14. Schott, J.R. Personal interview. October 1997.

15. Staenz, K. "Airborne and Spaceborne Imaging Spectrometers." Canadian Activities in Terrestrial Imaging Spectroscopy. Internet. 13 Nov. 1997. Available http://eol.ists.ca/documents/IS-Team-Canada/Can-Activities-ImagSpec.book_99.html#HEADING98

16. Ward, K. "MODIS Instrument." NASA/MTPE. Internet. 9 Oct. 1997. Available http://modarch.gsfc.nasa.gov/MODIS/INSTRUMENT/

17. "UVISI." U.S. Navy Research Laboratories / Johns Hopkins University Applied Physics Laboratory. Internet. 13 Nov. 1997. Available http://msx.nrl.navy.mil/

18. "Leveraging the Infosphere: Surveillance and Reconnaissance in 2020." Air University. Vol. 1 of Spacecast 2020. June 1994.

19. Curlander, J.C., and R.N. McDonough. Synthetic Aperture Radar: Systems and Signal Processing. New York: John Wiley & Sons, Inc., 1991.

20. Walker, B. "How SAR Works." Sandia National Laboratories. (8 Jan. 1996). Internet. 13 Nov. 1997. Available http://www.sandia.gov/RADAR/sar_sub/

21. "AMI Sensor." ESA/NASDA. Internet. 13 Nov. 1997. Available http://hdsn.eoc.nasda.go.jp/guide/guide/satellite/serdata/ami_e.html

22. Asrar, G., and R. Greenstone, eds. 1995 MTPE EOS Reference Handbook. NASA Goddard Space Flight Center.

23. Maldonado, L., Jr., curator. "AIRSAR General Reference Manual." (18 Jul. 1997). AIRSAR, Jet Propulsion Laboratory. Internet. 13 Nov. 1997. Available http://airsar.jpl.nasa.gov/techinfo/techinfo.html

24. Sellers, P.J., et al. "Earth Science, Landsat, and the Earth Observing System." Land Satellite Information in the Next Decade: Conference Proceedings of the American Society of Photogrammetry and Remote Sensing, Vienna, Virginia, 25-28 September 1995. Maryland: American Society of Photogrammetry and Remote Sensing, 1995.