See Plankton, Aerosol, Cloud, ocean Ecosystem (PACE)
Pacific Disaster Center (PDC)
A civil authority for disaster risk reduction and relief in the Pacific region, subordinate to the US Department of Defense. Organizationally and technologically, the PDC is regarded as a model for global, national and local disaster management.
Pan image sharpening
Engl. pansharpening; the fusion of high-resolution panchromatic image data with lower-resolution multispectral data in a numerical process. The result combines high spatial resolution with multispectral color information. The method is common with digital aerial cameras and remote sensing satellites.
Pan image sharpening with images from the Quickbird satellite
The left, panchromatic image has a spatial resolution of 0.6 m, the middle true-color image of 2.4 m. Combining them yields a high-resolution color image, here with the help of the HighView software from Geosage. Source: DigitalGlobe / Geosage
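As a sketch of the principle, the simple Brovey (band-ratio) transform is one common pansharpening method. (The HighView software mentioned above uses its own, proprietary algorithm; this is only an illustrative stand-in, and all names below are assumptions.)

```python
import numpy as np

def brovey_pansharpen(pan, ms):
    """Fuse a high-resolution panchromatic band with multispectral bands
    (already resampled to the panchromatic pixel grid) via the Brovey
    band-ratio transform: each band is rescaled by pan / mean(ms)."""
    intensity = ms.mean(axis=0)                # synthetic low-resolution intensity
    ratio = pan / np.maximum(intensity, 1e-6)  # guard against division by zero
    return ms * ratio                          # sharpened bands keep their color ratios

# Toy scene: uniform pan band and two uniform multispectral bands
pan = np.full((4, 4), 100.0)
ms = np.stack([np.full((4, 4), 40.0), np.full((4, 4), 60.0)])
sharp = brovey_pansharpen(pan, ms)  # band means become 80 and 120
```

Because the bands are only rescaled, their ratios (and hence the color impression) are preserved while the spatial detail of the panchromatic band is injected.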
panchromatic
Term for the broadband spectral sensitivity of a sensor or film material. Panchromatic means that the sensor is sensitive over the entire range of the human eye, from around 400 nm (blue-violet) to 780 nm (deep red). The gradation of the gray values of panchromatic data is thus comparable to that of typical black-and-white images. The HRV sensors of the SPOT satellite series, for example, can be operated in panchromatic mode.
The main differences between panchromatic, multispectral and hyperspectral data acquisition are the width and number of the spectral bands recorded. While panchromatic sensors work with a single wide band, multi- and hyperspectral instruments use a larger number of narrower channels to increase the spectral resolution. Hyperspectral sensors can have up to several hundred closely spaced bands.
Since panchromatic films reproduce all colors in gray tones corresponding to the brightness perception of the human eye, they are the most widely used recording material for aerial photographs.
A somewhat lower sensitivity to green is typical of panchromatic films. This corresponds to the sensitivity of the human eye, to which green - unlike the signal colors red and yellow - appears calm and gentle precisely because of this lower sensitivity. Images from panchromatic color or black-and-white films therefore also look more natural to the human eye than infrared images. Furthermore, panchromatic films show more detail in shadowed areas. This advantage is particularly relevant in regions with steep topography and/or tall, dense forests and thus large shaded areas. With the particularly problematic black-and-white infrared film, shaded areas appear deep black and no detail is visible in them. Panchromatic films are also sensitive to objects below the water surface, and their resolution is usually higher than that of infrared films.
The collection of light from a wide range of wavelengths allows more energy to be collected and therefore high resolution images (up to 30 cm resolution on the best commercially available satellite instruments).
Camera or sensor with a sensitivity in the range of visible light without differentiating individual spectral ranges, so it is equally sensitive to the entire spectrum of visible light. This means that in panchromatic recordings, the visible light reflected from objects is reproduced in grayscale. The gradations correspond to the human perception of brightness. Although colors cannot be reproduced, panchromatic recordings can achieve high spatial resolutions and are therefore often used for mapping purposes.
panchromatic color film
Panchromatic color films were developed in the 1930s. Initially they were rarely used for aerial photography because of their high cost and low resolution. Falling costs, increased light sensitivity, improved image quality and the higher information content of color images, however, have made them increasingly attractive.
Panchromatic color films consist of three photosensitive layers, each layer being sensitive to a specific spectral or color range. The top layer has a blue-sensitive (0.4 to 0.5 μm), the middle one a green-sensitive (0.5 to 0.6 μm) and the bottom one a red-sensitive (0.6 to 0.7 μm) emulsion. Since the green- and red-sensitive emulsion is also sensitive to blue light, a blue-absorbing yellow filter must be inserted between the first and second layer, which is later removed during development.
panchromatic black and white film
Black and white film that is sensitive to all colors of the visible spectrum and to ultraviolet light and converts these into gray tones that correspond to the brightness of the human eye. The strong atmospheric scattering of UV and blue radiation reduces the contrast of panchromatic black and white images. Therefore, a short-wave radiation absorbing filter is usually placed in front of the camera lens. Panchromatic black and white films can also be used to record selected wavelength ranges. For example, if you only want to display green light, filters must be used that absorb the rest of the visible and ultraviolet light.
panoramic distortion
Engl. panoramic distortion, French distortion panoramique; according to DIN 18716, the "systematic image distortion that occurs when data recorded with special opto-mechanical scanners are reproduced directly as images".
As a result of line-by-line digitization at constant time intervals, image elements recorded with opto-mechanical scanners (or corresponding microwave radiometers) are enlarged transversely to the flight direction. However, they are displayed at a uniform size corresponding to the nadir projection and thus appear compressed. This distortion is corrected (panorama correction) by assuming image elements of equal size across the entire strip width and calculating each pixel position back into the compressed original image. The corresponding gray values are determined by one-dimensional interpolation from neighboring gray values (e.g. nearest neighbor).
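The correction described above can be sketched for a single scan line, assuming flat terrain and nearest-neighbor resampling (function and parameter names are illustrative, not from any particular software):

```python
import math

def panorama_correct(scanline, fov_deg, n_out=None):
    """Resample one scan line recorded at constant angular increments
    into pixels of uniform ground size (nadir projection), using
    nearest-neighbor interpolation.  Assumes flat terrain; ground
    positions are expressed in units of the flying height."""
    n = len(scanline)
    half_fov = math.radians(fov_deg) / 2
    n_out = n_out or n
    half_width = math.tan(half_fov)  # ground half-width of the swath
    out = []
    for i in range(n_out):
        # uniform ground position across the swath
        x = -half_width + (i + 0.5) * 2 * half_width / n_out
        theta = math.atan(x)  # look angle corresponding to that position
        # fractional index into the angularly sampled source line
        src = (theta + half_fov) / (2 * half_fov) * n
        out.append(scanline[min(n - 1, max(0, int(src)))])
    return out
```

For a narrow field of view the mapping is nearly the identity; with a wide field of view, the source pixels near the swath edges are repeated in the output, reflecting their larger ground footprint.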
parallelepiped classification
Syn. cuboid or box classification; image classification method that uses a parallelepiped shape to assign values to a specific object class. The parallelepiped represents a multi-dimensional space defined by value ranges in the different spectral bands. For each spectral channel, an upper and a lower gray-value threshold are defined, which together form cuboids in feature space. During classification, each image element is assigned to the cuboid that contains the element's gray-value combination across the spectral channels. The cuboids must not overlap, i.e. the gray values of the individual channels should not be strongly correlated. The pixels of an image are thus classified according to which parallelepiped they fall into; pixels that fall outside all parallelepipeds are classified as unknown/unassignable.
Among supervised classification methods, the parallelepiped method is the simplest and involves moderate computational effort.
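The decision rule can be sketched as follows (a minimal illustration; the array layout and names are assumptions):

```python
import numpy as np

def parallelepiped_classify(pixels, boxes, unknown=-1):
    """Assign each pixel vector to the first class whose per-band
    [lower, upper] gray-value thresholds it satisfies; pixels falling
    outside every parallelepiped remain 'unknown'.

    pixels : (n_pixels, n_bands) gray values
    boxes  : (n_classes, n_bands, 2) lower/upper thresholds per band
    """
    labels = np.full(len(pixels), unknown)
    for c, box in enumerate(boxes):
        inside = np.all((pixels >= box[:, 0]) & (pixels <= box[:, 1]), axis=1)
        labels[inside & (labels == unknown)] = c
    return labels

# Two classes in two spectral channels; the third pixel fits neither box
boxes = np.array([[[0, 50], [0, 50]],       # class 0: dark in both bands
                  [[51, 100], [51, 100]]])  # class 1: bright in both bands
pixels = np.array([[10, 20], [60, 70], [10, 90]])
labels = parallelepiped_classify(pixels, boxes)  # → [0, 1, -1]
```

Note that with non-overlapping boxes, as the entry requires, the "first match wins" rule is equivalent to an unambiguous assignment.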
parameter
A measurable or inferred variable that is represented by data, e.g. sea surface temperature, ice thickness, relative humidity.
The CEOS EO Handbook offers an overview of the parameters and related instruments observed by satellites, divided into the large groups of atmosphere, land, ocean, snow and ice as well as gravity and magnetic fields. Their operating time is visualized with the help of timelines.
PARASOL
Engl. acronym for Polarization and Anisotropy of Reflectances for Atmospheric Science coupled with Observations from a Lidar; CNES mission to determine the microphysical and radiative properties of clouds and aerosols. This information is important for determining and modeling the influence of clouds and aerosols on the radiation budget. The main instrument is POLDER (Polarization and Directionality of the Earth's Reflectances), an imaging radiometer/polarimeter with a wide field of view. PARASOL is the second mission in the Myriade microsatellite series.
The satellite, with a nominal mission duration of two years, had been in a sun-synchronous orbit at an altitude of 700 km since December 2004; one orbit takes 98.8 minutes. In November 2011, PARASOL was lowered 9.5 km below the A-Train, from where it continued to observe clouds and aerosols. PARASOL was shut down at the end of 2013.
PARIS
Engl. acronym for Passive Reflectometry and Interferometry System; instrument payload under development by ESA to measure the roughness of the ocean surface using GPS/GNSS radio signals.
pass
Engl. term for a satellite overflight (satellite flyover).
passive sensor
Engl. passive sensor, French capteur passif; recording system for remote sensing that receives naturally occurring radiation or radiation emitted by already existing artificial external sources.
Passive sensors are, for example, different types of radiometers (instruments that quantitatively measure the intensity of electromagnetic radiation in selected frequency bands) and spectrometers (devices that are designed to detect, measure and analyze the spectral content of reflected electromagnetic radiation). Most passive systems used in remote sensing applications operate in the visible, infrared, thermal infrared, and microwave regions of the electromagnetic spectrum. These sensors measure land and sea surface temperature, vegetation properties, cloud and aerosol properties, and other physical properties.
Most passive sensors cannot penetrate thick cloud cover and therefore have limitations when observing areas such as the tropics where thick cloud cover is common.
Scheme of a passive sensor compared to an active sensor
List of types of passive remote sensing sensors:
- Accelerometer - An instrument that measures acceleration (change in speed per unit of time). There are two general types of accelerometers. One measures translational accelerations (changes in linear movements in one or more dimensions), the other measures angular accelerations (changes in the rate of rotation per unit of time).
- Hyperspectroradiometer - An advanced multispectral sensor that detects hundreds of very narrow spectral bands across the visible, near, and mid-infrared range of the electromagnetic spectrum. The very high spectral resolution of this sensor makes it easy to distinguish between different targets based on their spectral response in each of the narrow bands.
- Imaging radiometer - A radiometer with a scanning function that provides a two-dimensional array of pixels from which an image can be generated. Scanning can be done mechanically or electronically using an array of detectors.
- Radiometer - An instrument that quantitatively measures the intensity of electromagnetic radiation in some bands within the spectrum. Usually a radiometer is further identified by the part of the spectrum it covers, e.g. visible, infrared or microwave.
- Sounder - An instrument that measures vertical distributions of atmospheric parameters such as temperature, pressure and composition from multispectral information.
- Spectrometer - A device that is designed to record, measure and analyze the spectral content of incident electromagnetic radiation. Conventional imaging spectrometers use gratings or prisms to split the radiation for spectral differentiation.
- Spectroradiometer - A radiometer that measures the intensity of radiation in multiple wavelength bands (i.e. multispectral). Often the bands have a high spectral resolution and are designed for remote sensing of specific geophysical parameters.
passive (remote sensing) system
Engl. passive remote sensing system, French système de télédétection passive; a system which, in contrast to an active (remote sensing) system, is only sensitive to electromagnetic radiation that
- is emitted by the observed object (e.g. the thermal radiation given off by every body due to its surface temperature), or
- is reflected by the object (e.g. solar radiation reflected from the earth's surface),
and which is not itself the source of that radiation.
The radiation values received by the system can then be converted into geophysical values, e.g. temperature, using sometimes complex conversion processes.
DIN 18716 defines the term as a "recording system that uses natural electromagnetic radiation".
Passive systems are used in the range from visible light to the infrared. In this region of the electromagnetic spectrum, a cloud-free sky is required. The sensors of these systems record the radiation that is emitted or reflected by objects and their surroundings - usually reflected solar radiation.
It must be taken into account that in this area of the electromagnetic spectrum every form of radiation on its way from the sun to the object and finally to the sensor is influenced by the atmosphere in a wide variety of ways. Both solar radiation and radiation reflected from the earth are scattered, reflected or absorbed by particles in the atmosphere. In particular, carbon dioxide, ozone and water vapor absorb strongly. For remote sensing, however, the reflection of clouds is the biggest problem, since the radiation cannot penetrate the clouds in this part of the spectrum.
Passive sensors are installed, for example, on the satellites of the Sentinel-2 mission, the Landsat series and RapidEye (measuring reflected solar radiation), as well as on Landsat 8 (measuring emitted thermal radiation). Passive optical sensors record radiation from the visible to the infrared range of the electromagnetic spectrum, separated by wavelength into so-called spectral channels; such systems are therefore also referred to as multispectral remote sensing. The product of each spectral channel is a gray-scale image that reproduces the intensity of the recorded radiation - the sensor only records differences in brightness. In the visual evaluation of the multispectral data, specific colors are then assigned to the various spectral channels (bands) and the gray-scale images are combined with one another.
The more channels a remote sensing sensor has, the higher the spectral resolution of the satellite. Sensors that have more than 20 up to 200 or even more spectral channels are referred to as hyperspectral systems.
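The channel-to-color assignment described above can be sketched like this (the band order and indices are illustrative assumptions, not a fixed convention):

```python
import numpy as np

def color_composite(bands, r, g, b):
    """Combine single-channel gray-value images into an RGB composite by
    assigning three spectral channels to the display colors red, green
    and blue (e.g. NIR/red/green indices give a classic false-color image).

    bands : (n_bands, H, W) array with reflectances scaled to [0, 1]
    """
    rgb = np.stack([bands[r], bands[g], bands[b]], axis=-1)
    return np.clip(rgb, 0.0, 1.0)  # (H, W, 3) image ready for display

# Four toy channels: blue, green, red, near-infrared
bands = np.zeros((4, 2, 2))
bands[3] = 0.9  # strong near-infrared response, e.g. healthy vegetation
img = color_composite(bands, r=3, g=2, b=1)  # vegetation appears red
```

Assigning the near-infrared channel to red is the classic false-color composite; a true-color image would instead map the red, green and blue channels to their natural display colors.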
passive microwave system
Sensor system that detects the natural microwave radiation emitted by the earth's surface, e.g. a limb-probing microwave sensor.
Control point (GCP)
Syn. reference point, fixed point, control point; engl. ground control point (GCP), French point d'amer, point de calage; in photogrammetry and remote sensing, a point in an image or a photogrammetric model whose object coordinates have been determined using geodetic (usually GPS) or photogrammetric methods or taken from maps. Control points are used to determine the exact spatial position and orientation of the aerial or satellite image relative to the ground and to transform the image into a specified coordinate system. Object points that can be identified in an image (e.g. building corners, individual rocks) and whose coordinates in object space are known are suitable as control points. If control points cannot be identified in the image with sufficient certainty, the terrain points must be signalized (marked) before the image flight.
There are usually three types of control points:
- Full control points - all three coordinates are known (spatial coordinates X, Y and Z)
- Position control points - the horizontal position in the object space is known (position coordinates X and Y)
- Height control points - the point height is given in the height reference system (height coordinate Z)
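A sketch of how such control points can be used to estimate a simple planar transform between image and object coordinates (a least-squares affine fit; the coordinate values below are made up for illustration):

```python
import numpy as np

def fit_affine(image_xy, ground_xy):
    """Least-squares 2-D affine transform mapping image (pixel)
    coordinates to ground coordinates, from three or more control
    points. Returns a (3, 2) coefficient matrix M such that
    [x, y, 1] @ M = [X, Y]."""
    A = np.hstack([image_xy, np.ones((len(image_xy), 1))])
    M, *_ = np.linalg.lstsq(A, ground_xy, rcond=None)
    return M

# Hypothetical full control points: pixel coordinates -> map coordinates (m)
img = np.array([[0, 0], [100, 0], [0, 100]], dtype=float)
gnd = np.array([[500000, 5400000], [500200, 5400000], [500000, 5399800]],
               dtype=float)
M = fit_affine(img, gnd)

# Transform an arbitrary image point into the map coordinate system
pt = np.array([50.0, 50.0, 1.0]) @ M  # → approx. [500100, 5399900]
```

A real georeferencing workflow would typically use more control points than unknowns and inspect the residuals; rigorous photogrammetric orientation additionally models the central perspective rather than a planar affine transform.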
Control point determination
Engl. measuring ground control points; Determination of the object coordinates of task-specific selected control points of the object to be recorded. The coordinates can be determined geodetically for a limited number of points and photogrammetrically to a greater extent using the points determined in this way. The geodetic control point determination includes the determination of coordinates, usually with GPS, the identification of the points in the aerial or satellite image and the preparation of a calibration sketch as a basis for a reliable identification in the photogrammetric image evaluation.
path
Engl. term for a satellite's ground track or orbit.
PAZ
Radar satellite mission of the Spanish company Hisdesat. Together with the optical earth observation satellite INGENIO, PAZ forms the Programa Nacional de Observación de la Tierra por Satélite (PNOTS). PAZ was launched on February 22, 2018 with a Falcon 9 rocket from Vandenberg Air Force Base in California; INGENIO is scheduled to follow in 2019. The satellite PAZ (Spanish for "peace") moves in the same orbit as TerraSAR-X and TanDEM-X. PAZ was developed as a dual-use mission - primarily to meet defense and security needs, but also for civilian applications in high-resolution earth observation. The applications are varied: surveillance of the earth's surface, border control, tactical support for missions abroad, crisis management and risk assessment of natural disasters, environmental monitoring, surveillance of the marine environment, etc.
PAZ is structurally identical to TerraSAR-X and, like it, was built at the German Astrium site in Immenstaad. PAZ uses a synthetic aperture radar in the X-band; the instrument was developed and built by Astrium (now Airbus Defence and Space) at the Barajas site. The data can be held in a 256 GB on-board memory and then sent to earth at a rate of 300 Mbit/s, likewise in the X-band at approx. 9.65 GHz. So that the downlink transmission does not interfere with the radar instrument, the transmitting antenna is mounted on a boom that is deployed only in space. The total mass of the satellite is around 1,200 kg, the electrical power of the solar arrays 850 W. The instrument's flat phased-array antenna with printed radiators, designed for different operating modes, is 4.8 m long and 0.7 m wide and works with different swath widths and resolutions of up to one meter.
PAZ orbits the earth 15 times a day in a sun-synchronous polar orbit at an altitude of 514 km. It will deliver an average of 200 images per day and be able to scan an area of around 300,000 km².
Different resolutions are available:
- Spotlight: 10 km × 5 km with 1 m resolution or 10 km × 10 km with 2 m resolution
- ScanSAR: strips of 100 km swath width at 15 m resolution
- Stripmap: strips of 30 km swath width with 3 m resolution (single polarization) or 15 km swath width with 6 m resolution (double polarization)
|PAZ satellite constellation|
The Spanish earth observation satellite PAZ, launched in February 2018, complements the radar satellite constellation of TerraSAR-X and TanDEM-X. With a third satellite in the same orbit, the constellation will in future deliver a higher quality of service and offer users numerous advantages.
The ground station for PAZ and INGENIO is in Torrejón de Ardoz near Madrid, with Maspalomas in the Canary Islands acting as a backup station. Both stations are operated by the Instituto Nacional de Técnica Aeroespacial (INTA). PAZ's ground segment builds on technologies developed by DLR's Remote Sensing Technology Institute (IMF) for the two twin satellites. A constellation of three satellites can thus make recordings available more quickly in the future.
PAZ, like the two satellites TerraSAR-X and TanDEM-X, which are flying in close formation, can map almost every scene on the earth's surface within three days. On average, the satellites can capture a location or repeat a recording within 24 hours. Every eleven days they fly over the same point on the ground with exactly the same recording geometry. The additional use of PAZ enables higher recording capacities and shorter repetition rates in the future - recordings can thus be made available more quickly. The three radar satellites offer the same imaging modes and provide identical image properties.
The data recorded by PAZ will also be incorporated into Copernicus, the European Union's environmental monitoring system.
P-band
Frequency range between 0.3 and 1 GHz (wavelength 30 to 100 cm) within the microwave segment of the electromagnetic spectrum. The P-band is used for SAR. Measurements in the P-band are not hindered by atmospheric effects; it is able to see through heavy rain showers. The penetration capabilities of P-band SAR are of great importance for studies of vegetation cover, glacier and sea ice, and soil.
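Wavelength λ and frequency f of electromagnetic radiation are linked by λ = c/f, which makes it easy to convert between the two ways of stating band limits:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def wavelength_to_frequency(wavelength_m):
    """Frequency (Hz) of electromagnetic radiation with the given
    free-space wavelength (m), via f = c / lambda."""
    return C / wavelength_m

# Band limits for wavelengths of 30 cm and 100 cm:
f_upper = wavelength_to_frequency(0.30)  # ~1.0 GHz
f_lower = wavelength_to_frequency(1.00)  # ~0.3 GHz
```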
perigee
Engl. perigee, French périgée; in an elliptical orbit, the point at which a satellite is closest to the earth (center of the earth). Subtracting the earth's radius from this distance gives the minimum height of the satellite orbit above the earth's surface. (Opposite: apogee)
|Apogee and perigee|
Source: mercat (R.o.)
perihelion
The point closest to the Sun in the orbit of a planet or satellite orbiting the Sun.
period
Time it takes a satellite to complete one orbit.
pericenter
Point on the elliptical orbit of a spacecraft at which it is least distant from the body it is orbiting. If this body is the earth, the term perigee is used; in the case of the sun, the term is perihelion. (Opposite: apocenter)
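For an elliptical Earth orbit, the quantities from the last few entries follow from the semi-major axis a and eccentricity e: the perigee distance is a(1−e), the apogee distance a(1+e), and the period follows from Kepler's third law. A small sketch (the example orbit is illustrative):

```python
import math

MU_EARTH = 3.986004418e14  # m^3/s^2, Earth's standard gravitational parameter
R_EARTH = 6_371_000.0      # m, mean Earth radius

def orbit_summary(a, e):
    """Perigee altitude, apogee altitude (m above the mean Earth radius)
    and orbital period (s) for an elliptical Earth orbit with
    semi-major axis a (m, measured from the Earth's center) and
    eccentricity e."""
    perigee_alt = a * (1 - e) - R_EARTH  # closest approach minus Earth radius
    apogee_alt = a * (1 + e) - R_EARTH   # farthest point minus Earth radius
    period = 2 * math.pi * math.sqrt(a**3 / MU_EARTH)  # Kepler's third law
    return perigee_alt, apogee_alt, period

# Near-circular orbit at about 700 km altitude (cf. the PARASOL entry):
peri, apo, T = orbit_summary(R_EARTH + 700_000.0, 0.0)  # T ≈ 98.6 min
```

The result of roughly 98.6 minutes agrees well with the orbital period quoted for PARASOL's 700 km sun-synchronous orbit.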
PerúSAT-1
Earth observation satellite of the Peruvian space agency CONIDA, launched with a Vega launcher on September 16, 2016 from the European spaceport in Kourou, French Guiana.
The three-axis stabilized PerúSAT-1 is in a sun-synchronous polar orbit at an altitude of around 695 km. Using its state-of-the-art silicon carbide optical instrument, the satellite will provide images with a resolution of 70 cm (2 m resolution in color with four channels). The swath width is 20 km. The recordings are used, among other things, in the fields of agriculture, climate monitoring, urban planning, cartography, mining, geology, hydrology, border control and the fight against drug trafficking as well as to support the management of humanitarian aid operations and to evaluate natural disasters. The data transmission to earth takes place in the X-band with 180 to 310 Mbit / s. The telemetry and control takes place in the S-band. PerúSAT-1 will be the most powerful earth observation satellite in Latin America.
|Open pit copper mine Cuajone (southern Peru)|
The first image from PerúSAT-1
PerúSAT-1 has a silicon carbide optical instrument with a resolution of 70 cm and is based on the AstroBus-S platform. The image data is used in many areas, from agriculture to disaster relief.
The satellite, which was built by Airbus Defense and Space in a record time of less than 24 months, is based on the highly flexible, compact AstroBus-S platform. The satellite was integrated into a multi-payload dispenser, also developed and built by Airbus DS, in order to launch multiple satellites into space with a single launch. In addition to PerúSAT-1, there were also four optical microsatellites from Terra Bella on board the Vega launcher, which were brought into orbit one after the other.
phenology
Discipline that analyzes the influence of climate (mainly temperature conditions) on the biosphere with regard to the occurrence of certain developmental stages and expressions of life. The results are presented in phenological maps and allow, for example, site-differentiated cultivation planning, especially for special crops.
Regular monitoring of plant development over the entire vegetation period can be important for the documentation and optimization of plant production. Often it is not possible to survey a large area with a large number of fields to check the plant population. Through the use of satellite data, which are available several times a year, the phenological course of each field can be observed during the vegetation period.
Crop monitoring can address questions that help sugar beet factories plan and assess the processing campaign:
On the left, a sequence of satellite images showing sugar beet fields and their visible phenological development over the growth phase. Source: Vista-geo.de
photodetector
Engl. photodetector; syn. light sensor; detectors used in remote sensing that react to changes in the incident photon flux. From the ultraviolet to the near infrared (approx. 1 μm), silicon photodiodes are used; between 1 and 12 μm, materials such as PbS (lead sulfide), InSb (indium antimonide) and HgCdTe (mercury cadmium telluride). Photodetectors are typically operated photovoltaically (PV = voltage changes) or photoconductively (PC = resistance changes). Each photodetector reacts to radiation of a certain wavelength range. Photodetectors for the near or thermal infrared range (> 1 μm) have to be cooled to reduce their inherent noise - the longer the wavelength, the colder the detector has to be.
photogeology
Engl. photogeology, French photogéologie; name for the methods of geological aerial and satellite image evaluation within the scope of various tasks, in particular regional and local geological mapping, ore and oil prospecting, and hydrological and engineering geology projects. The evaluation rests on the description and interpretation of spatial variations of the earth's surface in color, gray tone and hue, geometry, relative relief, surface texture, outcropping geological layers, lineaments, drainage systems, vegetation patterns, etc. These impressions are often enhanced by filters, color transformations and the like.
Since there are close connections between the surface forms of landscape-shaping formations and the geological subsurface, various conclusions about the rock types and the tectonic structure of a landscape can be drawn from the forms and features visible in aerial and satellite images. This is particularly true for arid and semi-arid regions, where surface forms are not obscured by vegetation. Owing to its high practical importance, photogeology has developed into a sub-discipline, and it is partly taken into account in the design of satellites and sensor systems (e.g. channel 7 of the Landsat Thematic Mapper).
photogrammetry
Engl. photogrammetry, French photogrammétrie; production of maps from aerial photographs or satellite images as an independent method of geo-remote sensing. Primarily, geometric information (shape, size, position, etc.) is extracted from the images in order to precisely record the topography and quantifiable topologies. With the help of photogrammetry, basic data are obtained for the creation and updating of topographic and thematic maps as well as for further processing in geographic information systems. The imaging methods were originally analog (photography); digital recordings are increasingly used.
If high-resolution remote sensing data, especially aerial images, are available or can be created, photogrammetric methods have a high potential for interpreting visible phenomena on the earth's surface and determining their absolute geometry. The focus is on the observation of three-dimensional environments and processes. Photogrammetry sensors are analog and digital photographic systems as well as scanners.
The features of photogrammetry are:
- contactless (remote sensing)
- flexible recording time
- short recording time, combined with a high recording frequency (recording of dynamic processes possible)
- Storage of all surface information "visible" for the sensor up to the resolution of the sensor (approx. 30-40 lines / mm for photographic film, i.e. a few cm / pixel)
- very effective and geometrically stable storage
- Information can be extracted at any time and analyzed in a variety of ways
- extensive information and mathematical analysis
The disadvantages of photogrammetry are:
- only surface data (depth information together with e.g. geophysics and models)
- no cloud penetration
- Problems with lighting, shadows, lack of contrast
- Distortions (central perspective)
- high temporal resolution for aerial photos is only possible with great effort
- high image flight and evaluation effort
In the case of digital aerial cameras, the complex calibration of the camera, the large storage capacities required on board the sensor platform (aircraft) and the backup (data security) must also be considered.
Aerophotogrammetry, in which aerial images (moving sensor) are analyzed, is particularly important for the geosciences; the images can be approximated as vertical recordings. Its counterpart is terrestrial photogrammetry, which analyzes images taken from the ground under constant recording conditions, though the recording geometry is usually far from the "ideal" vertical view.
When looking at the History of photogrammetry one has to go back to the middle of the 19th century, when the theory of photogrammetry was developed in France and Prussia in parallel with the emerging photography. At the beginning of the 20th century, a national planning and military need for correct cartographic representations met the advancing development of aircraft and improved camera technology and film material. The analog recordings of that time were converted into maps with the help of opto-mechanical processes. Stereo-optical recordings were used to measure the height of the topography and objects.
The first earth surveying systems from space were also based on the principle of analog cameras and the subsequent photogrammetric evaluation (e.g. the European metric camera on the Space Shuttle mission of 1983). The first civil electronic systems with a geometric resolution of 80 m (Landsat-1, 1972), later 30 m (Landsat-4, 1982) did not yet have the geometric accuracy to produce detailed maps on a large scale. It was not until the French SPOT-1 satellite (1986), with its geometric resolution of 10 m and its swiveling sensor, that three-dimensional maps could be created on a larger scale for the first time. In addition, the previously analog photogrammetric process could be converted to new digital algorithms and processes.
With the end of the East-West confrontation, images with geometric resolutions better than 10 m also became available for civil use. In addition to the first Russian data, the launch of the civil US satellite IKONOS (1999) marked the beginning of a new era of high-resolution satellites with resolutions of less than 1 m.
The sensors on these satellites are designed entirely for rapid mapping. The high geometric resolution is usually achieved only in a panchromatic channel (b/w image); a few additional channels with somewhat poorer geometric resolution allow display as a color image. The accuracy of the position control of the sensor or satellite is also important for use as a map basis. Complex control systems allow both fast and precise alignment, mostly of the entire satellite. In addition, a large number of orbit and attitude determination sensors ensure that the position of an image point on the earth can be determined with an accuracy of a few meters. A further improvement into the meter and sub-meter range can only be achieved by linking to known terrestrial control points, such as road crossings.

The rapid change in the alignment of some satellites also allows a scene to be viewed again from a different oblique angle, and thus a stereo image pair to be obtained for three-dimensional analysis. If two cameras with different orientations are used on one satellite, such stereo-optical recordings can be generated permanently. This is possible, for example, with the satellites of the Indian CARTOSAT series.
In addition to the creation and tracking of topographic maps, the spectral information from these satellite sensors also enables detailed thematic mapping. In the civil sector, planning bases are laid, e.g. for large structures, or new roads are recorded for navigation systems. High-resolution mapping of crops allows the European Commission to monitor subsidies for agriculture and the United Nations to provide evidence of illegal drug cultivation (UNODC and illicit crop monitoring). With the help of the spectral information, the UN also receives information about activities in the nuclear field.
Rapidly implemented mappings based on high-resolution satellite data also increasingly support humanitarian aid operations in crisis areas and after natural disasters (disaster management). By comparing it with older data, current observations can show the destruction of infrastructure, e.g. after an earthquake, or the extent of flooding. In view of the urgency of such information and because of its independence from clouds and times of day, geometrically high-resolution radar systems (e.g. TerraSAR-X, COSMO-SkyMed) are increasingly being used for these purposes. In Germany, this task is the responsibility of the Center for Satellite-Based Crisis Information (ZKI) of the DLR. The ZKI is involved in European and worldwide programs (e.g. International Charter for Space and Natural Disasters).
Evaluation procedure for aerial and satellite images in which the determination of geometric parameters is the main concern. The interpretation of image content is involved only to the extent that it serves to identify the quantities to be measured.
Image evaluation consists in deriving object properties from images or image sequences. These include the geometry (3D position, size, shape and, for moving objects, the trajectory, direction of movement and speed, each as a function of time), the brightness distribution (orthophotos and other visualizations, spectral signature) and the semantic information (class, attributes) of the individual objects shown. Relationships between the objects are also among the derivable properties.
Engl. photography, photograph; French photographie; the technology and science of producing permanent images by means of visible, ultraviolet and infrared radiation through photochemical conversion in radiation-sensitive layers; also the term for the result of such a process (a photograph).
Photographic recording is a passive process that captures electromagnetic radiation in the wavelength range from 0.3 to 1.2 µm, i.e. from the near ultraviolet (UV) through visible light to the near infrared (IR). Photography occupies a special position among the recording processes of remote sensing: it is the only process in which the radiation-sensitive material, the photographic layer, also serves as the storage medium. It allows the simultaneous recording and storage of large amounts of data in a small space at low cost. This significant advantage is offset by serious disadvantages: the radiometric calibration of photographic systems is difficult and uncertain, the photographically detectable spectral range is rather narrow, and the photographic process is a cumbersome intermediate step when the recorded data are to be processed computationally.
Photographic images contain the information in analog form, i.e. as continuous physical quantities. A black-and-white image can therefore be understood as a continuous two-dimensional image function that assigns a gray value to each point of the image area; one therefore speaks of a gray-value or intensity image. A color image, on the other hand, contains a corresponding continuous function in each of its photographic layers, which together produce the color image.
If a photograph is to be reproduced in digital form, the (b/w) image is divided into small, equally sized and uniform areas (picture elements), the brightness of each of which is denoted by a numerical value. The computer then displays each digital value as a different brightness level. In contrast, sensors that record electromagnetic energy electronically store this radiation as a pattern of numbers in digital form from the start.
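The digitization described above can be sketched in a few lines of code. The continuous image function and all parameters below are purely illustrative, not from the source:

```python
# Illustrative sketch: sampling a continuous (analog) brightness function
# into a grid of picture elements, each quantized to an 8-bit digital number.
import math

def brightness(x, y):
    """Hypothetical continuous image function with values in [0, 1]."""
    return 0.5 + 0.5 * math.sin(3 * x) * math.cos(2 * y)

def digitize(f, width, height, levels=256):
    """Sample f over the unit square at pixel centers and quantize each
    sample to an integer digital number in 0..levels-1."""
    image = []
    for row in range(height):
        y = (row + 0.5) / height
        image.append([min(int(f((col + 0.5) / width, y) * levels), levels - 1)
                      for col in range(width)])
    return image

img = digitize(brightness, 8, 8)
print(img[0])  # first row of digital numbers, each between 0 and 255
```

A display then maps each digital number back to a brightness level, which is the reverse of the quantization step.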
Photographic films consist of a transparent, largely dimensionally stable layer support made of polyester and one or more photo layers (emulsions) on it. The emulsion is made up of light-sensitive silver salts, which are embedded in a gelatin layer. The silver salt crystals come in different sizes. Depending on the mixing ratio of the different crystal sizes, an emulsion has certain properties. Thus, the average size of silver salt crystals is an important characteristic of any photographic film. On the one hand, it has a significant influence on how much light is required to produce a photographic image. On the other hand, it determines the graininess of a film, which in turn influences the geometric resolution. Fine-grain emulsions have a low sensitivity to light and a high geometric resolution. Coarse-grained emulsions, on the other hand, are very light-sensitive, but have a low geometric resolution.
|Conversion of an analog photograph into a digital image|
A photograph can also be represented in digital form by dividing the image into small areas of equal size and shape called picture elements, or pixels, and representing the brightness of each area with a numerical value or digital number. In fact, this is exactly what has been done with the photo on the left: it is a digital image of the original photograph. The photo was scanned and divided into pixels, with each pixel assigned a digital number representing its relative brightness. The computer displays each digital value as a different level of brightness.
Source: Natural Resources Canada
In photographic systems, an image of the object to be recorded is projected through a lens for usually only a short time onto a light-sensitive layer, which is thereby changed in such a way that a permanent image is created by the photographic process.
Photographic systems are passive systems that record radiation in visible light and in the near infrared (from approx. 0.4 to 1.0 µm).
A collection of aerial photographs joined together to provide a coherent overview of an area covered by a survey flight.
photosynthetically active radiation (PAR)
Engl. photosynthetically active radiation (PAR, PhAR); electromagnetic radiation (380-780 nm) used by plants for biochemical processes, mainly photosynthesis. This range largely coincides with the range of radiation visible to humans (380-780 nm), which makes up about 50% of global radiation. PAR is absorbed by chlorophyll and other pigments, which absorb mainly in the red and blue spectral ranges, while green is reflected. PAR is usually given in W/m² over the range 400-700 nm.
Since PAR controls the primary production and thus the carbon fixation of terrestrial and marine vegetation, it influences the energy and water exchange (evapotranspiration) between the vegetation and the atmosphere. In the field of climate research, photosynthetically active radiation is an important parameter for calculating the carbon balance of terrestrial and marine vegetation.
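In practice, PAR given in W/m² is often converted to a photosynthetic photon flux density (PPFD). A minimal sketch, using the commonly cited approximate conversion factor of about 4.57 µmol of photons per joule for daylight; the exact factor depends on the spectral composition of the light and is an assumption here, not a value from the source:

```python
# Sketch: converting PAR irradiance (W/m², 400-700 nm) to photosynthetic
# photon flux density (PPFD, µmol photons per m² per second).
PAR_TO_PPFD = 4.57  # approximate µmol of photons per joule for daylight

def ppfd_from_par(par_w_m2):
    """PPFD corresponding to a given PAR irradiance."""
    return par_w_m2 * PAR_TO_PPFD

# A clear-sky midday PAR of 400 W/m²:
print(round(ppfd_from_par(400)))  # -> 1828
```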
French solar research satellite of the CNES, which was launched on a Dnepr-1 launcher on June 15, 2010 from the Jasny rocket launch site together with two other satellites. PICARD is a small satellite (mass 150 kg) of the MYRIADE series for determining solar radiation, the diameter and shape of the sun and the structure of the sun using helioseismic methods. The sun-synchronous orbit fluctuates between 730 km and 750 km altitude. A mission period of at least 2 years is planned.
The findings are intended to improve our knowledge of the solar propulsion of the earth's climate as well as our knowledge of the physics of the sun and its internal structure.
PICARD's payload consists of an imaging telescope, one radiometer and three sun photometers.
Left: Artist's impression of the satellite
The PICARD mission is named after the French astronomer Jean Picard, who carried out the first accurate measurement of the sun's diameter. His measurements are especially important as they were made during the Maunder Minimum. This period was characterized by the absence of sunspots and a distinctly cold climate.
Right: Lithograph of Jean Picard (1620-1682)
Source: CNES
Original name for the CALIPSO mission.
Engl. acronym for Parameter Information Locator Tool; this tool enables searches of the DAAC databases using the search criterion "parameters". Related to it are the Parameter Information Pages.
Small, usually red latex balloon filled with helium or hydrogen, carrying no further instruments and rising at a constant rate, used for the optical determination of the cloud base during the daytime. The balloon is filled so that its rate of ascent is known; the height of the cloud base above ground can then be calculated from the time between launch and the balloon's entry into the cloud. By tracking its trajectory (formerly with a theodolite, today with radar or GPS), the direction and speed of upper-level winds can also be determined. At night, the balloon can be fitted with a lamp on its underside.
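Both measurements described above reduce to simple arithmetic: cloud base height is ascent rate times time, and the mean wind follows from two tracked ground positions. A minimal sketch with illustrative numbers:

```python
# Sketch: pilot balloon evaluation. All numbers are illustrative.
import math

def cloud_base_height(ascent_rate_m_s, time_to_cloud_s):
    """Cloud base above ground = known ascent rate x time until the
    balloon disappears into the cloud."""
    return ascent_rate_m_s * time_to_cloud_s

def mean_wind(p0, p1, dt_s):
    """Mean horizontal wind speed (m/s) and direction (degrees) from two
    tracked positions (x east, y north, in metres), dt_s seconds apart.
    Direction follows the meteorological convention: where the wind
    blows FROM (wind from the west = 270 degrees)."""
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    speed = math.hypot(dx, dy) / dt_s
    direction = math.degrees(math.atan2(-dx, -dy)) % 360
    return speed, direction

print(cloud_base_height(2.5, 480))        # 2.5 m/s for 8 min -> 1200.0 m
speed, direction = mean_wind((0, 0), (600, 0), 60)  # 600 m eastward drift
print(round(speed), round(direction))     # -> 10 270 (10 m/s from the west)
```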
Portmanteau of picture element; in satellite images, the term for a single, usually square image point corresponding to a certain ground area, as the smallest unit of the image. This correspondence is a measure of the ability of a sensor to resolve objects of different sizes: the higher the number of pixels per unit area, the higher the resolution of the image. The totality of all pixels forms a data set, an image.
The Enhanced Thematic Mapper Plus on Landsat 7 has a maximum resolution of 15 m; each pixel thus represents an area of 15 x 15 m. Higher resolution (smaller pixel area) means that the sensor can distinguish smaller objects. The area of a scene can be calculated by multiplying the number of pixels in the image by the ground area of a single pixel. Likewise, counting the green pixels in a false-color image yields the total area covered with vegetation.
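The area calculation just described is the pixel count times the ground area of one pixel. A minimal sketch, with an illustrative pixel count:

```python
# Sketch: area covered by a set of pixels. pixel_size_m is the ground
# sample distance, e.g. 15 m for the Landsat 7 ETM+ panchromatic band.
def area_km2(n_pixels, pixel_size_m):
    """Ground area of n_pixels square pixels, in square kilometres."""
    return n_pixels * pixel_size_m ** 2 / 1e6

# A classified scene with 2,000,000 "vegetation" pixels at 15 m resolution:
print(area_km2(2_000_000, 15))  # -> 450.0 km²
```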
|Pixels and spatial resolution|
The spatial resolution describes the ground area represented by a pixel. High spatial resolution: 0.41-4 m; low spatial resolution: 30 to >1000 m.
Source: RapidEye
Engl. pixel graphics; graphics made up of individual pixels, each pixel representing a digital value. The range of these values depends on the color model used and on the quantization.
European satellite mission to study the cosmic background radiation; it has now ended. Planck's detectors were designed to measure, over a wide frequency range and as a function of angular scale, cosmological temperature differences that appear only in the fifth or sixth decimal place. The satellite thus determined temperature fluctuations in the background radiation on the order of a millionth of a degree.
Since the radiation previously interacted with matter (protons, electrons), conclusions can be drawn about the early distribution of matter and the parameters that describe cosmic development can be determined with great accuracy.
|Planck Spacecraft Build-up|
The picture shows from top left to bottom right the detailed structure of the focal plane units (FPUs) of the high frequency instrument (HFI) and the low frequency instrument (LFI) as well as the Planck cooling system and the main components of the spacecraft, from the close-up in the focal plane to the entire spacecraft with telescope, baffle and service module.
The satellite has two different instruments for observing the radiation, the "High Frequency Instrument" (HFI) for the higher frequency range and the "Low Frequency Instrument" (LFI) for the lower frequency range. After the instruments were calibrated, the telescope began regular observation on August 13, 2009. The first complete image of the sky was completed in June 2010, but post-processing was necessary to achieve full accuracy. The first results were published in January 2011.
Planck's aim was to record the faint residual radiation of the Big Bang, the so-called cosmic microwave background radiation, in parallel at nine frequencies between 30 and 857 GHz. This background radiation shows the universe in its state around 380,000 years after the Big Bang and provides details of the original conditions that led to the universe we live in today.
Due to cosmic expansion, however, the energy of the photons has decreased so much that today they are received in the microwave range, with a temperature of only about 2.7 Kelvin. This radiation nevertheless offers a true picture of the universe as it looked about 13.4 billion years ago, exactly at the time it became transparent. Planck's ability to measure the temperature of the coldest dust particles provides an important indicator of these physical processes and also a better understanding of stellar evolution.
With values between 4 arc minutes for the highest and 33 arc minutes for the lowest frequencies, Planck's angular resolution is much better than in the comparable previous projects COBE and WMAP.
At the same time, observations of the foreground radiation of the Milky Way and of other galaxies are obtained. These interfering effects must be known very well in order to determine the background radiation, but they are also of scientific interest in their own right, e.g. for a deeper understanding of stellar evolution.
|Planck depicts a galactic network of cold dust|
The picture on the left shows part of the sky spanning approx. 55°. It is a three-color composite made from Planck's two highest frequency channels (557 and 857 GHz, corresponding to wavelengths of 540 and 350 µm) and a shorter-wavelength image (100 µm) recorded with the Infrared Astronomical Satellite (IRAS). This combination effectively traces the dust: reddish tones correspond to temperatures of about 13 degrees above absolute zero, white tones to significantly warmer regions (of the order of a few tens of degrees) where giant stars are just forming. Overall, the picture shows local dust structures within 500 light years of the sun.
Source: ESA
The Planck telescope, weighing 1921 kg, was launched into space on May 14, 2009 together with the Herschel infrared telescope on an Ariane 5 ECA from Kourou. After burnout of the upper stage, the Planck satellite was released at 13:40 UTC, a few minutes after the Herschel telescope, into a highly elliptical earth orbit between 270 and 1,197,080 km altitude, inclined 5.99° to the equator, from which it reached its Lissajous orbit around the Lagrange point L2 of the earth-sun system with a small orbit maneuver.
The ESA member states had provided key technologies such as the innovative cooling mechanism, which kept the mission instruments constantly cooled to just a tenth of a degree above absolute zero (-273.15 °C) so that the received signals were not distorted by the satellite's own heat. In this way, temperature fluctuations of a few millionths of a degree could be recorded in the cosmic microwave background radiation.
The cooling of the instruments to these extreme temperatures could not, however, continue indefinitely, and the liquid helium coolant supply of the high-frequency instrument (HFI) ran out, as expected, in January 2012.
On August 14, 2013, after 1554 days of operation, the telescope was withdrawn from the L2 point and placed in an orbit that ensures it will not be captured by the earth for the next 300 years. The last command was sent to Planck on October 23, 2013.
Two full sky surveys were planned as the original goal of the mission. In fact, five complete surveys were carried out with both instruments, and the LFI completed its eighth survey of the entire sky in mid-August.
Planck's law of radiation
Engl. Planck's radiation law, syn. Planck's formula, French formule de Planck; law named after Max Planck (1858-1947): every body with a temperature greater than absolute zero (0 K / -273.15 °C) emits electromagnetic radiation whose spectrum is determined by the temperature of the body and the wavelength:

M_λ(λ, T) = (2·π·h·c² / λ⁵) · 1 / (exp(h·c / (λ·k·T)) − 1)

where h is Planck's constant, c the speed of light in vacuum and k the Boltzmann constant.
This describes the spectral energy distribution of radiation from a black body.
In remote sensing, Planck's radiation law is important, among other things, for the design of sensors. It is used to determine the emission maxima of radiating bodies (sun, earth), since passive remote sensing methods only record the reflected or emitted portions of this radiation. Planck's radiation law makes clear that at higher temperatures the maximum of the spectral emission shifts to shorter wavelengths. The maximum of extraterrestrial solar radiation (T ≈ 5900 K) lies at about 0.47 µm, while the earth (T ≈ 290 K) has its radiation maximum at approx. 9.7 µm. The black-body radiation curves in the following figure illustrate this.
|Spectral radiation distribution at different surface temperatures|
Any body with a temperature greater than absolute zero emits electromagnetic radiation that is related to the body's temperature and wavelength. In remote sensing, Planck's law of radiation is important for the design of sensors, among other things.
Source: Lexicon of Geosciences
The integration over the entire wavelength range leads to the Stefan-Boltzmann law.
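These relationships can be checked numerically: Wien's displacement law gives emission maxima close to the values quoted above, and integrating Planck's law over all wavelengths reproduces the Stefan-Boltzmann law. A sketch using the exitance form of Planck's law; the integration grid and wavelength limits are pragmatic choices, not part of the law:

```python
# Sketch: Planck's law, Wien's displacement law and a numerical check of
# the Stefan-Boltzmann law.
import math

H = 6.62607015e-34       # Planck constant (J s)
C = 2.99792458e8         # speed of light (m/s)
K = 1.380649e-23         # Boltzmann constant (J/K)
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant (W m^-2 K^-4)

def planck_exitance(wavelength_m, T):
    """Spectral radiant exitance M_lambda of a black body (W m^-3)."""
    x = H * C / (wavelength_m * K * T)
    if x > 700:          # avoid overflow; contribution is negligible there
        return 0.0
    return 2 * math.pi * H * C**2 / wavelength_m**5 / math.expm1(x)

def wien_peak_um(T):
    """Wavelength of maximum emission (µm) from Wien's displacement law."""
    return 2897.77 / T

def total_exitance(T, n=20000):
    """Trapezoidal integration of Planck's law over 0.01 to 1000 µm,
    on a logarithmically spaced wavelength grid."""
    lo, hi = math.log(1e-8), math.log(1e-3)
    xs = [math.exp(lo + (hi - lo) * i / n) for i in range(n + 1)]
    total = 0.0
    for l0, l1 in zip(xs, xs[1:]):
        total += 0.5 * (planck_exitance(l0, T) + planck_exitance(l1, T)) * (l1 - l0)
    return total

print(round(wien_peak_um(5900), 2))  # sun: ~0.49 µm (close to the ~0.47 µm above)
print(round(wien_peak_um(290), 1))   # earth: ~10.0 µm
print(round(total_exitance(290)))    # ≈ SIGMA * 290**4 ≈ 401 W/m²
```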
Planet Labs, Inc. (formerly Cosmogia, Inc.) is a private American earth-observation company headquartered in San Francisco, with additional offices in Berlin; Lethbridge, Canada; Bellevue, Washington; and Washington, DC. It was founded in 2010 by three former NASA scientists and had around 430 full-time employees in 2019. Planet's goal is to image the entire planet daily in order to observe changes and identify trends.
Planet designs and builds so-called Triple-CubeSat miniature satellites (size: 10x10x30 cm, 3.3 kg), which are called "Doves" because of their swarm-like deployment. They are carried into space as secondary payloads on launches of other satellites. Each Dove satellite continuously scans the earth and transmits its data to a ground station as it passes over. Together, the Doves form a satellite constellation that delivers a complete picture of the earth at a resolution of 3-5 m. Thanks to their small size and low cost, the company can quickly develop and test new prototypes according to customer needs, avoiding major losses in the event of a launch failure. The Dove images are made available under an open-data access policy. They provide up-to-date information relevant to climate monitoring, harvest forecasts, urban planning and disaster relief.
With the acquisition of BlackBridge (2015), Planet Labs had 87 Dove and 5 RapidEye satellites in space. In June 2016 the company's name was changed from Planet Labs to Planet.
In February 2017, Planet launched another 88 Dove satellites. In the same year, Google sold its subsidiary Terra Bella and its SkySat satellite constellation to Planet; as part of the deal, Google acquired an equity stake in Planet and entered into a multi-year agreement to purchase SkySat imagery. In October 2017, Planet launched 6 more SkySats (8-13) and 4 Doves (Flock 3m) into sun-synchronous orbits on an Orbital ATK Minotaur-C rocket from Vandenberg Air Force Base.
The number of satellites Planet has in orbit changes frequently: Planet keeps launching new satellites, while older ones re-enter the atmosphere at the end of their lives and burn up. Planet has successfully launched 351 satellites since 2013. Today (2019) there are about 140 in orbit: more than 120 Doves, 15 SkySats and 5 RapidEye satellites. Together they collect image data covering over 250 million square kilometers every day.
Planet currently serves more than 30,000 users and 400 customers in over 40 countries. The company is a leading provider of geospatial data for use in agriculture, government and commercial mapping, and serves clients in the areas of trade, geopolitics, energy and infrastructure, the environment and other important markets.
Planet has made a conscious decision to position its satellites at lower altitudes, both to avoid the congested higher altitudes and to ensure the timely and safe deorbiting of its satellites at the end of their lives.
planetary remote sensing
Planetary remote sensing deals with remote sensing of the earth's moon, the planets, their moons and other celestial bodies in our solar system such as asteroids. The aim of planetary remote sensing is to record and interpret the reflected and emitted electromagnetic radiation of the observed body, whereby space probes, orbiters and landing modules are used as platforms. The results are used for planetology, which deals with the formation and development of our planetary system and its individual objects.
As with remote sensing of the earth, both active and passive sensors are used in planetary remote sensing. Active systems transmit electromagnetic radiation toward the body under investigation and receive the reflected portion. Passive sensors such as cameras and spectrometers, on the other hand, record the solar radiation reflected from the object's surface in a single wavelength range (monochromatic or panchromatic) or in several different wavelength ranges (RGB, multispectral, hyperspectral) of the electromagnetic spectrum. The recorded wavelengths extend from visible light through the near infrared to the mid-infrared; thermal infrared radiation, which consists mainly of the (heat) radiation emitted by an object's surface, is also partly recorded. Occasionally, cameras and imaging spectrometers covering the UV range are also used.
From stereoscopic image recordings, individual 3D points are determined for geodetic reference networks, and in the case of extensive evaluation, digital terrain models (DTM) of the surface are generated. Products derived from these are orthophotos and orthophoto mosaics, which are expanded into topographic and thematic maps through automatic or manual interpretation and managed in spatial information systems. The data products have a wide variety of applications in geology, mineralogy, volcanology, geophysics, the exploration of landing sites, etc. Photo-realistic visualizations, both static and dynamic, complement the products derived from the images.
Plankton, aerosol, cloud, ocean ecosystem (PACE)
NASA satellite mission in preparation to investigate ocean ecology and chemistry and to provide a more detailed understanding of the role of aerosols and clouds in the climate system. The PACE sensor provides scientists with ocean color data from the ultraviolet to the near infrared. From these, precise measurements of biological and chemical ocean properties are obtained, such as phytoplankton biomass and the composition of phytoplankton communities. The mission helps to better understand the response of marine resources to climate change and the role of marine phytoplankton in the global carbon cycle.
Furthermore, PACE will take measurements of cloud cover, as well as of small airborne particles such as dust, smoke and other aerosols, to supplement and continue measurements from existing NASA missions.
PACE will be directed from NASA's Goddard Space Flight Center in Greenbelt, Maryland. The mission is scheduled to launch in 2022.
Engl. acronym for PLAnetary Transits and Oscillations of stars; a medium-sized science mission within ESA's Cosmic Vision 2015-2025 program, a space-based observatory for the detection of exoplanets orbiting other stars, scheduled for launch in 2026.
The mission will address two of the most important questions of the Cosmic Vision program: under what conditions do planets form and life arise and how does the solar system work? PLATO will explore relatively nearby stars and look for tiny, regular losses of light that occur when their planets fly past them and temporarily block out a small part of the starlight.
Using 34 independent small telescopes and cameras, PLATO will search for planets around approximately one million stars spread over half the sky. In addition, the seismic activity of the stars will be investigated, enabling an exact characterization of the host star of each discovered planet, including its mass, radius and age. Together with ground-based radial-velocity observations, PLATO's measurements will allow the mass and radius of a planet, and thus its density, to be calculated, permitting conclusions about its composition. The mission will locate and examine thousands of exoplanetary systems, focusing on the discovery and characterization of Earth-sized planets and super-earths in the habitable zone of their host star, the distance at which liquid surface water could exist.
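The "tiny, regular losses of light" that transit photometry looks for scale with the square of the planet-to-star radius ratio. A minimal sketch of this transit depth, with round illustrative radii:

```python
# Sketch: approximate transit depth, i.e. the relative dip in stellar
# brightness when a planet crosses the stellar disk. Radii in km are
# round illustrative values.
def transit_depth(r_planet_km, r_star_km):
    """Fractional flux loss during transit, ~(R_planet / R_star)^2."""
    return (r_planet_km / r_star_km) ** 2

R_SUN = 696_000  # approximate solar radius in km

print(f"{transit_depth(6_371, R_SUN):.2e}")   # Earth-like planet: ~8.4e-05
print(f"{transit_depth(69_911, R_SUN):.2e}")  # Jupiter-like planet: ~1.0e-02
```

The Earth-Sun case shows why a photometric precision of better than a ten-thousandth is needed to detect Earth-sized planets.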
Syn. remote sensing platform; engl. platform, French plate-forme; static or moving support structure on which sensor systems for remote sensing or photogrammetry are installed, exclusively or among other purposes, for photographic or electronic image recording, for radar acquisitions or for geophysical measurements of an object. In the simplest case this is a small observation platform mounted on a vehicle; in the vast majority of cases, however, these are flying platforms, ranging from balloons, kites, hang gliders, drones/UAVs of various sizes, airplanes, helicopters, gyrocopters and airships to spacecraft and satellites.