{"id":37,"date":"2016-08-01T14:14:47","date_gmt":"2016-08-01T13:14:47","guid":{"rendered":"http:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/?page_id=37"},"modified":"2019-09-12T12:39:53","modified_gmt":"2019-09-12T11:39:53","slug":"2-1-characteristics-of-a-remote-sensing-system","status":"publish","type":"page","link":"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/2-electromagnetic-radiation-and-electromagnetic-spectrum\/2-1-characteristics-of-a-remote-sensing-system\/","title":{"rendered":"2.1. Characteristics of a Remote Sensing System"},"content":{"rendered":"<p><strong>What is Remote Sensing?<\/strong><\/p>\n<p>Curran 1985 defines it as:<\/p>\n<p>&#8220;The use of <i>electromagnetic radiation sensors <\/i>to <i>record images <\/i>of the environment, which can be interepreted to yield <i>useful information<\/i>&#8220;.<\/p>\n<p>As an example of useful information, consider the changes to the landscape of Southampton University in this graphic:<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"640\" height=\"420\" class=\"alignnone wp-image-54 size-large\" src=\"http:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.1-1024x672.jpg\" srcset=\"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.1-1024x672.jpg 1024w, https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.1-300x197.jpg 300w, https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.1-768x504.jpg 768w, https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.1.jpg 1044w\" sizes=\"auto, (max-width: 640px) 100vw, 640px\" \/><\/p>\n<hr \/>\n<p><strong>Reflection<\/strong><\/p>\n<p>At this moment you are\u00a0 probably looking at a computer screen.\u00a0 What is the analogy between what you and the computer at the moment and a remote sensing system as defined by the three italiacised points in Curran&#8217;s 
definition above?<\/p>\n<script>\/\/ <![CDATA[\r\nfunction showFunct2() { document.getElementById(\"hideshow2\").style.display = \"block\"; } function hideFunct2() { document.getElementById(\"hideshow2\").style.display = \"none\"; }\r\n\/\/ ]]><\/script>\r\n\r\n<p style=\"text-decoration:underline; color:#336699;\" onclick=\"showFunct2()\">Show Answer<\/p>\r\n<div id=\"hideshow2\" style=\"display:none;\">\r\n<p>You are 'sampling' the light (electromagnetic radiation) coming from the screen, which is forming an image on the back of your eye. Your brain is then interpreting the information so you can understand what we are communicating to you.<\/p>\r\n<p style=\"text-decoration:underline; color:#336699; text-align:right;\" button=\"\" onclick=\"hideFunct2()\">Hide Answer<\/p>\r\n<\/div>\n<hr \/>\n<p><strong>Activity<\/strong><\/p>\n<p>Using <a href=\"http:\/\/earth.google.com\/userguide\/v5\/tutorials\/timeline.html\" target=\"_blank\" rel=\"noopener noreferrer\">this tutorial<\/a> and Google Earth version 5 or later, fly to this location (Lat: 50.6359, Lon: -1.4133) on the Isle of Wight and observe what is happening to the coastline by stepping through the historical remotely sensed images.<\/p>\n<hr \/>\n<p><strong>Characteristics of Remote Sensing<\/strong><\/p>\n<p>Remote sensing is characterised by:<\/p>\n<ol>\n<li>Sensor Stage (satellite, plane, kite, ground based)<\/li>\n<li>View (angle of view)<\/li>\n<li>Type of radiation sensed (visible light, infrared, radar)<\/li>\n<li>Time of capture<\/li>\n<\/ol>\n<p>It can also be used or re-used for many different purposes. We will look at each of these characteristics in turn.<\/p>\n<p><strong>1. Stages<\/strong><\/p>\n<p>As might be expected, the higher the platform, the lower the resolution of the imagery tends to be. Satellites vary in altitude but generally sit above aircraft, which themselves range from high-flying reconnaissance aircraft to low-flying light aircraft. 
The image below illustrates the difference in resolution that can often be seen in the Google Earth base imagery; satellite imagery is gradually being replaced by higher-resolution aircraft-based imagery.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"295\" height=\"300\" class=\"alignnone wp-image-65 size-medium\" src=\"http:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.2-295x300.jpg\" srcset=\"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.2-295x300.jpg 295w, https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.2.jpg 491w\" sizes=\"auto, (max-width: 295px) 100vw, 295px\" \/><\/p>\n<p>Recently a number of very low altitude platforms (UAVs, or drones) have started to be used in remote sensing, though there are calibration issues with the images taken by these platforms that still need to be resolved.<\/p>\n<hr \/>\n<p><strong>Reflection<\/strong><\/p>\n<p>What are the two major types of satellite orbit and what are their relative altitudes?<\/p>\n<p>\r\n<script>\/\/ <![CDATA[\r\nfunction showFunct() { document.getElementById(\"hideshow\").style.display = \"block\"; } function hideFunct() { document.getElementById(\"hideshow\").style.display = \"none\"; }\r\n\/\/ ]]><\/script>\r\n<\/p>\r\n<p style=\"text-decoration: underline; color: #336699;\" onclick=\"showFunct()\">Show Answer<\/p>\r\n<div id=\"hideshow\" style=\"display: none;\">\r\n<p>Geostationary satellites orbit at a high altitude with an orbital period of 24 hours. This means they can remain fixed over a particular location and provide a stream of data from the same view. 
All other orbits are generally below this. Because these satellites orbit faster than the Earth rotates, they cannot be kept in one location, but they have the advantage of being able to take higher-resolution imagery because of their lower orbits.<\/p>\r\n<p style=\"text-decoration: underline; color: #336699; text-align: right;\" button=\"\" onclick=\"hideFunct()\">Hide Answer<\/p>\r\n<\/div>\n<hr \/>\n<p><strong>2. Platform View Angle<\/strong><\/p>\n<p>Sensors can view the Earth&#8217;s surface at different angles, and this reveals different information, such as the relative heights of objects. This can create confusing representations in Google Earth: the image on the left shows one remotely sensed image of a high-rise building from an oblique view. On the other side of the road the image has been taken from a similar angle but from the opposite direction.<\/p>\n<p><a href=\"http:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.3.jpg\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"227\" class=\"alignnone wp-image-66 size-medium\" src=\"http:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.3-300x227.jpg\" srcset=\"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.3-300x227.jpg 300w, https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.3.jpg 751w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/> <\/a><\/p>\n<p>Note that both images have been processed so that ground positions are correct, since oblique images are foreshortened in the direction of tilt; this is illustrated in the image below.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"524\" height=\"233\" class=\"alignnone wp-image-67 size-full\" 
src=\"http:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.4.jpg\" srcset=\"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.4.jpg 524w, https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.4-300x133.jpg 300w\" sizes=\"auto, (max-width: 524px) 100vw, 524px\" \/><\/p>\n<hr \/>\n<p><strong>Reflection<\/strong><\/p>\n<p>Apart from changing the angle of view, what other characteristic of sensors sampling visible light could be used to reveal object height information?<\/p>\n<p>\r\n<script>\/\/ <![CDATA[\r\nfunction showFunct3() { document.getElementById(\"hideshow3\").style.display = \"block\"; } function hideFunct3() { document.getElementById(\"hideshow3\").style.display = \"none\"; }\r\n\/\/ ]]><\/script>\r\n<\/p>\r\n<p style=\"text-decoration: underline; color: #336699;\" onclick=\"showFunct3()\">Show Answer<\/p>\r\n<div id=\"hideshow3\" style=\"display: none;\">\r\n<p>If the image is taken  early morning or late evening when the sun is visible shadow length will illustrate object height and shape.<\/p>\r\n<p style=\"text-decoration: underline; color: #336699; text-align: right;\" button=\"\" onclick=\"hideFunct3()\">Hide Answer<\/p>\r\n<\/div>\n<hr \/>\n<p><strong>3. Spectral Bands<\/strong><\/p>\n<p>Your eyes sample in three colours, red, green and blue, any colour you perceive is a mix of these three primary colours. Remote sensing platforms are not limited to these colours or bands and can sense different wavelengths. 
For example, they can measure near infrared, which shows clearly different colours for vegetation and non-vegetation, as can be seen in the images below: the right-hand image is near infrared, and vegetation appears bright red and pink in this image.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"199\" class=\"alignnone wp-image-68 size-medium\" src=\"http:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.5-300x199.jpg\" srcset=\"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.5-300x199.jpg 300w, https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.5.jpg 543w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/> <img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"198\" class=\"alignnone wp-image-69 size-medium\" src=\"http:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.6-300x198.jpg\" srcset=\"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.6-300x198.jpg 300w, https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.6.jpg 540w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/p>\n<p>Such images can be combined to form &#8216;false colour images&#8217;, where primary colours in the display stand in for the original bands that were sensed; e.g. red stands in for near infrared in the example image.<\/p>\n<p><strong>4. Time of Capture<\/strong><\/p>\n<p>The time of capture is useful information to gather, as it can be used to track spatial changes over time. 
Examine the spread of Las Vegas between 1970 and 2000 in the images below.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"225\" class=\"alignnone wp-image-71 size-medium\" src=\"http:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.8-300x225.jpg\" srcset=\"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.8-300x225.jpg 300w, https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.8.jpg 320w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/> <img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"225\" class=\"alignnone wp-image-70 size-medium\" src=\"http:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.7-300x225.jpg\" srcset=\"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.7-300x225.jpg 300w, https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-content\/uploads\/sites\/106\/2016\/08\/2.7.jpg 320w\" sizes=\"auto, (max-width: 300px) 100vw, 300px\" \/><\/p>\n<hr \/>\n<p><strong>Reflection<\/strong><\/p>\n<div class=\"block\" id=\"ta40_8\">\n<p>Open Google Earth, enter &#8216;Rondonia Brazil&#8217; in the Search box, then click the magnifying glass to fly in. Using <a href=\"http:\/\/earth.google.com\/userguide\/v5\/tutorials\/timeline.html\" target=\"_blank\" rel=\"noopener noreferrer\">this tutorial<\/a> and Google Earth version 5 or later, use the timeline to see the changes in land use. 
Use the <a href=\"http:\/\/earth.google.com\/intl\/en_uk\/userguide\/v4\/ug_measuring.html\" target=\"_blank\" rel=\"noopener noreferrer\">ruler tool<\/a> to get a sense of the scale of the changes.<\/p>\n<p>Repeat the exercise for &#8216;Lake Chad, Nigeria&#8217;.<\/p>\n<p>\r\n<script>\/\/ <![CDATA[\r\nfunction showFunct4() { document.getElementById(\"hideshow4\").style.display = \"block\"; } function hideFunct4() { document.getElementById(\"hideshow4\").style.display = \"none\"; }\r\n\/\/ ]]><\/script>\r\n<\/p>\r\n<p style=\"text-decoration: underline; color: #336699;\" onclick=\"showFunct4()\">Show Answer<\/p>\r\n<div id=\"hideshow4\" style=\"display: none;\">\r\n<p>The main area that can be viewed on screen for Rondonia is 150 km across, so the changes illustrate a large-scale change in tropical forest cover.<\/p>\r\n<p style=\"text-decoration: underline; color: #336699; text-align: right;\" button=\"\" onclick=\"hideFunct4()\">Hide Answer<\/p>\r\n<\/div>\n<\/div>\n<hr \/>\n<p><strong>Examples of Use<\/strong><\/p>\n<p>The following examples use one or more types of remote sensing to solve spatial problems:<\/p>\n<div class=\"O\">\n<ul>\n<li><span lang=\"EN-GB\">Meteorology: <i>Weather forecasting, Climate studies, Global change<\/i><\/span><\/li>\n<li><span lang=\"EN-GB\">Hydrology: <i>Water balance, Energy balance, Agro hydrology<\/i><\/span><\/li>\n<li><span lang=\"EN-GB\">Soil science: <i>Soil mapping<\/i><\/span><\/li>\n<li><span lang=\"EN-GB\">Biology\/conservation: <i>Vegetation mapping\/monitoring, Forestry inventories, mapping, de\/re-forestation, forest fires<\/i><\/span><\/li>\n<li><span lang=\"EN-GB\">Environmental studies: <i>Sources\/effects of pollution, Agriculture, land use development, water management, erosion<\/i><\/span><\/li>\n<li><span lang=\"EN-GB\">Physical planning: <i>Planning scenarios<\/i><\/span><\/li>\n<li><span lang=\"EN-GB\">Land surveying: <i>Topography, spatial data models, GIS<\/i><\/span><\/li>\n<\/ul>\n<hr \/>\n<p><strong>Reflection<\/strong><\/p>\n<p>Think of ways the four characteristics of remote sensing described above could be applied to the problem of forest fires.<\/p>\n<p>\r\n<script>\/\/ <![CDATA[\r\nfunction showFunct5() { document.getElementById(\"hideshow5\").style.display = \"block\"; } function hideFunct5() { document.getElementById(\"hideshow5\").style.display = \"none\"; }\r\n\/\/ ]]><\/script>\r\n<\/p>\r\n<p style=\"text-decoration: underline; color: #336699;\" onclick=\"showFunct5()\">Show Answer<\/p>\r\n<div id=\"hideshow5\" style=\"display: none;\">\r\n<p>I'm not an expert in the remote sensing of forest fires, so your ideas are probably as valid as mine, but here are the ideas I came up with for potential uses of remote sensing for this problem:<\/p>\r\n<p>Stages: Forest fires can be remotely sensed on a large scale to identify any fires starting; once one starts, it can be sensed on a smaller scale to track its progress more accurately.<\/p>\r\n<p>View angle: This could be used to assess the height of trees in certain areas, which will affect the progress of a forest fire.<\/p>\r\n<p>Wavelength: Using infrared, the pockets of fire can be easily picked out, while visible wavelengths can be used to mark the areas burnt out by the fire.<\/p>\r\n<p>Temporal sensing: Fires spread quickly; by taking multiple images of the same area, a fire's spread and progress can be gauged and dealt with.<\/p>\r\n<p style=\"text-decoration: underline; color: #336699; text-align: right;\" button=\"\" onclick=\"hideFunct5()\">Hide Answer<\/p>\r\n<\/div>\n<\/div>\n<hr \/>\n<p><strong>References<\/strong><\/p>\n<p>BBC explanation of <a href=\"http:\/\/news.bbc.co.uk\/1\/hi\/world\/africa\/6261447.stm\" target=\"_blank\" rel=\"noopener noreferrer\">vanishing Lake 
Chad.<\/a><\/p>\n<hr \/>\n","protected":false},"excerpt":{"rendered":"<p>What is Remote Sensing? Curran (1985) defines it as: &#8220;The use of electromagnetic radiation sensors to record images of the environment, which can be interpreted to yield useful information&#8221;. As an example of useful information, consider the changes to the landscape of Southampton University in this graphic: Reflection At this moment you are probably looking [&hellip;]<\/p>\n","protected":false},"author":1726,"featured_media":0,"parent":35,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-37","page","type-page","status-publish","hentry"],"jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-json\/wp\/v2\/pages\/37","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-json\/wp\/v2\/users\/1726"}],"replies":[{"embeddable":true,"href":"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-json\/wp\/v2\/comments?post=37"}],"version-history":[{"count":24,"href":"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-json\/wp\/v2\/pages\/37\/revisions"}],"predecessor-version":[{"id":790,"href":"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-json\/wp\/v2\/pages\/37\/revisions\/790"}],"up":[{"embeddable":true,"href":"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-json\/wp\/v2\/pages\/35"}],"wp:attachment":[{"href":"https:\/\/generic.wordpress.soton.ac.uk\/rs4eo\/wp-json\/wp\/v2\/media?parent=37"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}