Saturday, October 24, 2015

Field Activity #5: Development of a Field Navigation Map

Introduction

This week's activities required us to create two field navigation maps.  We will be using the maps in the following weeks, along with a compass, to locate a number of points within the area of the Priory in Eau Claire, WI.  The Priory is a wooded 120 acre off-campus property with residence halls, owned by the University of Wisconsin-Eau Claire.  The ample space makes it a prime location for navigational training.  In addition to creating the navigation maps, we will also be learning a skill called pace counting.

Dr. Hupy set up a geodatabase with a plethora of data for us to use while constructing our navigation maps.  One of the few requirements was that one map had to have a grid in the Universal Transverse Mercator (UTM) coordinate system with 50 meter spacing (or finer), and the other map had to have a grid in the Geographic Coordinate System (GCS).  The only other requirement was that the maps be created in landscape format and sized for 11x17 paper.  The rest of the design details were left up to us, using the given data.

The UTM coordinate system is broken into 60 different zones, each of which is 6 degrees of longitude wide.  Each zone is also split into a northern and a southern half, the split coming at the equator.

(Fig. 1) UTM zone layout (http://www.gpsinformation.org/utm-zones.gif)
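To illustrate the zone layout, here is a short Python sketch (not part of the lab) that computes the zone number from a longitude; the Eau Claire longitude below is an approximate value I am assuming:

```python
def utm_zone(lon_deg):
    """Return the standard UTM zone number (1-60) for a longitude in
    decimal degrees.  Zones are 6 degrees wide, starting with zone 1
    at 180-174 degrees west.  (The special-case zones around Norway
    and Svalbard are ignored.)"""
    if not -180 <= lon_deg <= 180:
        raise ValueError("longitude must be between -180 and 180")
    # Longitude 180 E belongs to zone 60, not a nonexistent zone 61
    return min(int((lon_deg + 180) // 6) + 1, 60)

# Eau Claire, WI sits near 91.5 degrees W (an assumed, approximate value),
# which places the Priory in zone 15 of the northern hemisphere (15N).
print(utm_zone(-91.5))  # → 15
```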

The GCS is different in that it uses latitude and longitude to determine your location on the earth.  This location is stated in decimal degrees, which can be converted to degrees, minutes, seconds with simple math.  One issue with using GCS for navigation is that its grid gives no direct measure of distance, so you cannot easily tell how far you have traveled or have yet to travel.

(Fig. 2) Geographic Coordinate System layout. (http://newdiscoveryzhou.blogspot.com/2006_08_01_archive.html)
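The "simple math" for converting decimal degrees to degrees, minutes, seconds can be sketched in a few lines of Python (the sample latitude below is an assumed value, not a surveyed coordinate):

```python
def dd_to_dms(dd):
    """Convert a decimal-degree value to (degrees, minutes, seconds).
    The sign is carried on the degrees term."""
    sign = -1 if dd < 0 else 1
    dd = abs(dd)
    degrees = int(dd)
    minutes_full = (dd - degrees) * 60
    minutes = int(minutes_full)
    seconds = round((minutes_full - minutes) * 60, 2)
    return sign * degrees, minutes, seconds

# 44.7672 degrees N (an assumed sample latitude)
print(dd_to_dms(44.7672))  # → (44, 46, 1.92)
```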



Methods

Pace Counting

Before we started the construction of the maps, Dr. Hupy discussed a basic navigation skill called pace counting.  Pace counting is determining the distance you have walked based on the number of steps you have taken.  After the introduction to pace counting, our entire class went outside to practice and learn our personal pace count for a 100 meter distance.

Using a laser distance finder, Dr. Hupy measured a 100 meter distance along the sidewalk of the Davies parking lot on campus.  The weather was not particularly pleasant, as it was raining lightly and the temperature was around 55 degrees Fahrenheit.  Dr. Hupy had some trouble getting the laser distance finder to function properly at the beginning of our test.  After a couple of attempts he felt he had an accurate reading from the laser.

We were instructed to walk from point A to point B (Fig. 3) while counting only the steps we took with our right leg, then do the same thing returning from point B to point A to make sure we were consistent.  I ended up with 57 & 58 right leg paces for the 100 meter distance.

(Fig. 3) Point locations for the 100 meter distance.  (Basemap snipped from Google Maps)
Pace counting on a flat surface like the sidewalk gives us a good base to start from.  However, things are dramatically different when you attempt to count your paces in the woods.  Stepping over logs and going up hills while trying to maintain your direction are just a few of the struggles you will encounter, all while trying to count and keep your paces the same length as you practiced.
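The arithmetic behind pace counting is straightforward; here is a quick Python sketch using my 57 and 58 pace counts from above:

```python
def pace_length(distance_m, pace_counts):
    """Average length of one (right-leg) pace from one or more trial counts."""
    avg_count = sum(pace_counts) / len(pace_counts)
    return distance_m / avg_count

def paces_needed(target_m, distance_m, pace_counts):
    """Estimated pace count to cover a target distance."""
    return round(target_m / pace_length(distance_m, pace_counts))

# 57 and 58 right-leg paces over the measured 100 m course
length = pace_length(100, [57, 58])
print(round(length, 2))                  # → 1.74 m per pace
print(paces_needed(250, 100, [57, 58]))  # → 144 paces for a 250 m leg
```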

Map Design

We were assigned groups for this navigation project, but we were to construct our maps separately.  After everyone in our group (3 of us total) had completed a map design, we had to choose which group member's maps we would have printed and use for our navigation in the following weeks.

Exploring the data Dr. Hupy had provided was the first step of the map design portion of the exercise.  From reading the blogs of past classes I had a good idea of what I wanted to design, but I still perused the data within the geodatabase.

The basemap was the first element I explored within ArcMap.  There were a few raster images in the data to choose from: a black and white aerial (Fig. 4), a color aerial taken in the fall (Fig. 5), and a scanned topographic map (Fig. 6).

(Fig. 4) Black and white aerial image of the Priory.

(Fig. 5) Color aerial image of the Priory.

(Fig. 6) Topographic map of the Priory.
After a brief examination it was clear to me the color image was definitely the one to choose.  The main factor was its level of detail, which will allow my group to discern different objects during our navigation.  ArcMap also has the option of importing different basemaps.  So, using the boundary from the geodatabase we were given, I imported the Imagery basemap from ArcMap (Fig. 7).  As you can see in the photo, the vegetation is very thick and green.  Our area was already experiencing fall, and the majority of leaves were in the process of changing colors.  For these reasons I did not feel this basemap would suit our needs.

(Fig. 7) Imagery basemap of the Priory.

Contour line feature classes were the next map element I explored within the data.  I found 2 different contour line files to examine.  One had 5 meter spacing between contour lines and the other had 2 ft (0.61 meter) spacing.  The 2 ft. contour file was far too tight to see any detail on the map when overlaid on the basemap (Fig. 8).  The tightness of the lines would have hindered navigation by making the map far too "busy" to understand where on the map we were located.  Additionally, I knew another set of lines was going to be added for the coordinate system grid, rendering this option useless.

(Fig. 8) Color aerial image with 2 ft. contour lines.
Examining the 5 meter contour line data, I felt the lines were spaced more appropriately (Fig. 9).  However, my initial feeling was they could possibly be spaced too far apart.  Dr. Hupy then informed us the data used to create the 5 meter contour lines was far from the best quality, and he felt there were better options to be created or found.  He suggested talking with one of our classmates, Peter Sawall, who has been doing a lot of work with Lidar.

(Fig. 9) Color aerial image with 5 meter contour lines.
Talking with Peter, he suggested using the latest Lidar imagery from Eau Claire County to create a new contour map at an interval I felt was appropriate.  I told him this sounded like a great idea, but I have had very little exposure to Lidar, nor did I have any idea how to process the data to produce a contour map.  Peter assured me the process was easy and he would walk me through the steps.  Peter's "walk" through was more like a solid run for me.  Peter was working on his maps at the same time, so I did not take the time to slow him down to fully understand the process.  Peter had already done extensive work at the Priory, so we already knew which files I needed to complete the task.  We have the current Lidar information from the county on our server at the university.  Looking at the geoprocessing history in ArcMap, we used the following tools: Create LAS Dataset, Make LAS Dataset Layer, and LAS Dataset to Raster.  The final tool was the Contour tool, which creates the contour lines from the raster image.  Within the Contour tool you can set the contour spacing.  I ran the tool three times, setting 5 ft, 7 ft, and 12 ft spacings.

(Fig. 10) 12 ft. contour created from a raster image.
After creating the contour lines, I inspected the different spacings over my basemap of the Priory.  Both the 5 ft. and 7 ft. spacings were too tight for my desired visibility of the basemap.  The 12 ft. contour was exactly what I had envisioned.  Now I had to clean up the map area a little bit.  Within the data we were given was a feature class with the boundary of the area we would be navigating.  The extent of the contour lines fell outside the navigational boundary, so I used the Clip tool within ArcMap to eliminate the lines which fell outside it.  Eliminating the excess lines helps clean up the clutter before adding the grid lines in the next step.  I took one last look through the data and did not see any other features of interest for my map.

(Fig. 11) Color aerial image with the created 12 ft. contour lines.
Now to add the grids, which will be one of the biggest aids in our navigation.  ArcMap has a built-in feature to add the coordinate grid of your map over top of the features.  To accomplish this you must be in the layout view of ArcMap.  Once in the layout view, open the Data Frame properties and go to the Grids tab.  From within this tab you can create a new measured grid using your coordinate system.  Adjusting the properties after you create the grid is probably the most challenging part.  Per the recommendations of Dr. Hupy, I changed the settings to show the critical numbers in the grid in black and the leading digits (which stay the same on a map of this scale) in gray (Fig 12).  I also adjusted the spacing to what I felt was appropriate.  The last adjustment I made was the amount of "precision" in the coordinate system labels.  After creating the grid for my UTM map, I created a new ArcMap file, added the same basemap, boundary, and contour lines, and proceeded to add the grid for my GCS map.  I had to tinker with the property settings to get the grid spacing, label colors, and numeric precision the same way I did for the UTM map.

(Fig. 12)  Layout & Design for my grid labels.

The last step was to put all of the finishing touches on the map to make it useful and cartographically pleasing.  After adding the standard map elements (title, scale, north arrow, legend, sources, my name), I added the coordinate system as well.  I decided on 2 different scales for the map, not knowing exactly what we will need once we are in the field.  I adjusted the colors of the grid so they would not be as dominant a color, allowing more of the other information on the map to stand out.  I also adjusted the transparency of the contour lines to allow more of the basemap to show through.  On the GCS map I made the lines even more transparent than on the UTM map.  On the UTM map I added labels to the contour lines.  These maps will be printed back to back, so I feel having some variation between the two will be beneficial.



(Fig. 13)  UTM navigation map of the Priory.


(Fig. 14) GCS navigation map of the Priory.

Discussion

Pace counting and compass navigation are not new to me.  Years ago, I was involved with forestry judging.  We were required to navigate trails by using a compass, azimuth directions, and distances.  The difference in this exercise is that I will have a map of the area with points plotted on it, and I will have to determine the actual steps it will take to get there.  I am very comfortable walking through the woods and feel I have a good sense of distance while walking over, around, and through obstacles.

The part I am not as confident with is the map portion.  Though I have used maps and aerial photographs for hunting and trapping purposes, I based my navigation off landmarks and just walked until I found them.   Using a map with a known distance should make the navigation easier and keep the excess wandering to a minimum.

Though these two maps look very similar, they are very different when it comes down to how they will be used for navigation.  My initial feeling is the UTM map will be the easier map to navigate with, due to having actual distances on the grid.  The GCS map will be useful when using a GPS.  The GPS will give us decimal degrees, allowing us to know exactly where on the map we are located.  After a few test pacings with the GPS, I feel our group will be able to devise a good plan for knowing how far we have traveled.
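For the GPS side, the ground distance between two decimal degree readings can be estimated with the standard haversine formula; a Python sketch follows, with made-up sample coordinates for illustration:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two points given in
    decimal degrees, assuming a spherical Earth of radius 6,371 km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = (sin((lat2 - lat1) / 2) ** 2
         + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371000 * asin(sqrt(a))

# Two hypothetical GPS fixes a short walk apart (made-up coordinates):
print(round(haversine_m(44.7670, -91.5085, 44.7678, -91.5085)))  # roughly 89 m
```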


Conclusion

Designing a navigation map when you have never used one for that purpose can pose a challenge.  Taking notes on what others had written about the project, along with ideas from our professor, proved to be the most valuable information for this exercise.  Combining that information with my preferences for maps when navigating outside, I feel I came up with the best map I could for my level of understanding.  After the actual navigation exercise, I am sure I will come up with some changes I would have made.  The best way to learn what works is through experimentation.

Thank you

I would like to say thank you to Peter Sawall for his help with the contour line creation.  For more information about Peter hop over to his blog.

Saturday, October 17, 2015

Field Activity #4: Unmanned Aerial System Mission Planning

Introduction

In week number 4 our class was introduced to unmanned aerial systems (UAS).  The purpose of the class was to give us a basic understanding of the different types of UAS's, the variety of applications, and the software programs used to plan missions and interpret data collected by UAS's.

UAS Overview 

UAS's are not a brand new idea.  However, with recent advances in technology, UAS's are far more accessible to the commercial market than ever before.  The introduction gave us an overview of the different available types of UAS's and the pros and cons of each platform.  Two types of UAS's covered in our introduction were "fixed wing" and "multicopter".  Fixed wing UAS's are similar to an airplane/glider style craft (Fig. 1).  Multicopter UAS's are a multiple blade helicopter style aircraft (Fig. 2).

(Fig. 1) This photo is an example of a fixed wing UAV.  The UAV pictured here is similar to the one we were shown in class. 
One big misconception about UAS's is that they are nothing more than a radio controlled (RC) airplane.  RC airplanes are not equipped with a number of the components which comprise a UAS.  The fixed wing UAS we were introduced to was equipped with a Pixhawk flight controller.  The flight controller on the UAS communicates with a base station on the ground.  The base station can be loaded with software to fly the airplane without human control when properly programmed.  The Pixhawk flight controller is compatible with an open source program called Mission Planner, which is available for free from ArduPilot.  The open source aspect is relatively rare in the UAS field at this time.  The UAS is also equipped with a modem and a radio control receiver (RX).  The modem is used for long range data link communication with the base station.  The RX is used when flying the UAS manually.

All of the UAS's introduced to us were battery operated.  The majority of UAS's available to the public and commercial sector are powered by a lithium polymer (LiPo) battery pack.  LiPo batteries are very volatile, and if not properly cared for can explode.  Dr. Hupy shared a few instances of people he knows who have had these batteries blowup due to improper storage or misuse.

Applications
 
The uses for UAS's are endless.  UAS's can be fitted with a multitude of cameras and sensors to fit any application: search and rescue, land cover analysis, tracking animal populations, and the list just keeps going.


Technical & Pros and Cons

To effectively launch the UAS pictured in Fig. 1, you need to have a few things in place.  One is a fairly large area, not only to launch but also to land the craft.  Due to the weight of the UAS, the other item needed is a launching device.  Dr. Hupy showed us a bungee launching system he uses to launch the craft.  To understand how the launch system is configured, take a look at this video link (RV JET Bungee Launch).  Another troublesome issue with the fixed wing craft is the turning radius.  If you were trying to fly a grid pattern over a small field, it would take a greater area for the UAS to turn around and get in line with the grid, thus adding to your flight time.  I will go into these issues in greater detail later in this blog post when I talk about the Mission Planner software.

The above issues are a few of the disadvantages of using a fixed wing UAS.  One advantage of the fixed wing UAS is its long flight time.  The UAS our class was introduced to can fly for up to one and a half hours, depending on a number of variables such as wind and the payload of the UAS.  This advantage allows you to cover long distances and gather a lot of data in one flight if desired.

In conclusion the fixed wing UAS is not good for a rapid deployment in confined areas like cities or urban areas.  They are more suitable for use in rural areas when covering great distances which require long flight times.


(Fig. 2) This photo is an example of a multicopter.  The UAV pictured here is a quadcopter due to the four blades used for propulsion.

The multicopters are very similar to their fixed wing brothers in their controlling features, hardware, and software.  Multicopters are constructed using a varying number of blades, most commonly between 4 and 8.

The majority of multicopters are rigged with a gimbal.  A gimbal allows the operator to attach a camera or other type of sensor to the UAS.  The gimbal can be rotated and tilted through the controller, allowing the user to view in a multitude of directions.  Gimbals come in either a 2-axis or a 3-axis design.


(Fig. 3) This photo is an example of a gimbal.  This particular gimbal is designed by 3D Robotics (3DR) to carry and operate a GoPro camera.
Certain multicopters can be fitted with high torque rotors.  These high torque rotors give the multicopter the ability to carry a higher payload.  However, it takes more energy to turn the rotors and carry the payload, thus reducing your available flight time.  Overall, the available flight time of a multicopter is shorter than that of a fixed wing UAS.  The biggest advantage of the multicopter is the capability to launch just about anywhere you desire, even with limited space.  Also, when running a grid pattern over a field, the area needed to turn the multicopter around is essentially zero.  This improves the efficiency of the UAS, in turn reducing flight times.

Methods


Following the introduction, Dr. Hupy took the class out to the site of Field Activity 1 (Blog of Field Activity 1) to fly a DJI Phantom (Fig. 4) as a demonstration of the abilities and functions of UAS's.  The DJI is a quadcopter (four blade) style UAS.  The Phantom had a camera attached to a 2-axis gimbal.  Dr. Hupy attached a small tablet to the controller of the UAS so he would be able to see exactly what the camera was seeing while flying.

(Fig. 4) This a photo of the actual DJI Phantom flown by Dr. Hupy.

(Fig. 5)  An example photo of a tablet attached to the Phantom controller.
Once Dr. Hupy had launched the DJI, he demonstrated the maneuverability and stability of the craft.  Dr. Hupy was able to completely remove his hands from the joysticks and the Phantom would hover.  He then flew up and down the river's edge, taking photos every 5 to 10 seconds.  On the last trip back, one of my fellow students spotted a bird nest in a tree.  Dr. Hupy piloted the DJI above the bird nest and we were able to look directly into the nest to see it was empty.

During this time Dr. Hupy was discussing the many uses for UAS's.  He discussed how bridge inspections could greatly benefit from the abilities of a UAS.  We were directly beneath a bridge which had to be inspected a few years ago.  At that time the inspectors had to tie off with harnesses and physically climb over the edge of the railing to inspect the underside of the bridge.  Dr. Hupy then flew the UAS over to the bridge and displayed how, with the proper equipment, the inspection could have been done equally effectively without endangering the life of the inspector.

Later we had the opportunity to fly the Phantom if we desired.  I took my turn flying the DJI over the edge of the river.  The controls were as sensitive as I expected them to be.  Small movements of the joystick resulted in immediate movement of the craft.   I found the Phantom incredibly easy to control and maneuver for my first time flying the unit.  While I felt in control, a few times I bumped the control in the wrong direction, and by simply letting go of the joystick it would stabilize itself and hover.  I felt the DJI Phantom was a great first UAS to pilot.

The DJI has a warning device which alerts you when the battery is running low.  Dr. Hupy inspected the meter on the controller and told us we still had at least another 10-15 minutes of flight time.

After everyone had an opportunity to fly, Dr. Hupy landed and changed the battery on the Phantom.  Then he piloted the DJI around a few more areas of interest along the river bank, taking pictures.  He flew over one of the terrain models from exercise 1, which was mostly still intact, and took many photographs for later use.  Dr. Hupy also flew over a "24" which had been crafted along the river's edge with rocks.

The flight time in total took no more than 20-30 minutes, and in that time we collected 322 photographs.  Below you will find a few examples of photographs taken during the flight.



(Fig. 6) This is a photograph of our class watching Dr. Hupy fly the DJI Phantom.

(Fig. 7)  A photograph taken of a surface terrain from exercise one.

(Fig. 8) A photograph of the 24 along the river.

Mission Planner Software

Flying the UAS by hand is not the most effective method to cover a specific area.  After the flight demonstration, Dr. Hupy introduced us to the Mission Planner program.  Mission Planner allows you to create a custom, fully automated flight for a UAS.  With the program you have the ability to adjust every aspect of the flight.  You set the area you will be collecting data for and then set the parameters to cover the area.

The main parameters you can set are the altitude and the resolution (spacing of the grid).  Deciding how to set the parameters depends on the equipment you are using and the goal of the flight.  If your objective is high resolution imagery of the ground cover, you will likely fly lower, producing a smaller ground distance per pixel (which actually makes the resolution finer).  Adjusting the altitude has a direct effect on the resolution of the flight, and the higher the resolution, the longer the flight time.

Looking at figures 9 & 10 below, you will see the differences discussed above.  Both flight plans cover the same approximately 7 acre square.  Figure 9 has a very high resolution, but to obtain this resolution the flight will take 24 minutes and record 558 pictures.  Figure 10 shows the same flight coverage area with the altitude increased by 20 meters.  The increase has reduced the resolution, but not significantly.  The altitude change also reduced the flight time by 11 minutes and the picture count by 409 pictures to cover the same area.  Depending on the objective of the flight you may need the flight plan in figure 9; however, if the detail exceeds what the objective requires, you are just using up unneeded time and resources.

(Fig. 9) Higher resolution and lower altitude flight plan from Mission Planner



(Fig. 10) Lower resolution due to higher altitude flight plan from Mission Planner.
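The altitude-to-resolution tradeoff Mission Planner is working with can be approximated by the standard ground sample distance (GSD) formula.  A minimal Python sketch follows; the sensor width, focal length, and pixel count below are rough Canon S110-class specs I am assuming, not values taken from the lab:

```python
def gsd_cm(altitude_m, sensor_width_mm=7.6, focal_length_mm=5.2,
           image_width_px=4000):
    """Approximate ground sample distance (cm/pixel) of a straight-down
    photo.  Defaults are rough specs for a Canon S110-class camera
    (1/1.7-inch sensor); GSD scales linearly with altitude."""
    return 100 * sensor_width_mm * altitude_m / (focal_length_mm * image_width_px)

print(round(gsd_cm(50), 2))  # → 1.83 cm/pixel at 50 m
print(round(gsd_cm(70), 2))  # → 2.56 cm/pixel after climbing 20 m
```

With these assumed specs, 50 m of altitude works out to about 1.83 cm/pixel, and doubling the altitude doubles (coarsens) the GSD.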

Real Flight Flight Simulator

The next program Dr. Hupy introduced us to was Real Flight Flight Simulator.  The simulator program gave us practice in all aspects of flying fixed wing and multicopter craft.  The objective of this portion of the lab was to pick one fixed wing and one multicopter aircraft and fly each of them for 30 minutes.

In the flight simulator program there was a vast number of options to choose from.  I tried out a number of different aircraft before deciding on the X8 Quadcopter 1260.  Though the X8 is technically a quadcopter frame, it actually has 8 blades and 8 motors.  The additional blades made it incredibly stable and easy to fly.  However, the additional blades seemed to hamper its turning ability slightly; the X8 was slower to maneuver and change directions.  The additional blades also increased its payload capacity.   In the program it was hauling around what looked like a full size SLR digital camera.  The downfall to additional blades is that they require additional power to turn, thus reducing the available flight time.  This craft would be great for high resolution work over a smaller area.

(Fig. 11)  Screenshot from Real Flight Flight Simulator while flying the X8 Quadcopter.
The fixed wing aircraft I chose was a Pipercub Float Plane.  In comparison, the Pipercub was considerably faster than the X8 Quadcopter.  The Pipercub was also considerably more responsive and quicker turning than the multicopter.  However, when attempting to turn around to land in a narrow space, you had to make sure you made a wide swing to get lined up with the landing area.  I found it easier to land the Pipercub on land than in the water.  Though a true Pipercub is a gas powered airplane, I can definitely tell there are advantages and disadvantages to using a fixed wing aircraft.  Surveying long, narrow fields or long stretches of roadway would be in the wheelhouse of a fixed wing aircraft.  One tough part of flying the Pipercub was that the program did not have a gimbal with a camera attached to it.  Many of the other UAS styled craft had a camera on the nose, and you could switch your flying view to the camera.  Flying the Pipercub was a lot more like flying an RC airplane than flying a UAS or conventional plane.

(Fig. 12) Screenshot from Real Flight Flight Simulator while flying the Pipercub. (Yes I crashed into the water shortly after taking the screenshot.)

Pix4D

The final section of our lab session introduced us to a computer program called Pix4D.  Pix4D converts images taken by almost any method into georeferenced maps, mosaics, and 3D models.  There are multiple versions of Pix4D; the version I used was Pix4Dmapper Pro.  The objective was to create a 3 dimensional view of the landscape along the river's edge using a digital elevation model (DEM) and an ortho-mosaic image created in Pix4D.

The Pix4D program is probably one of the simplest programs I have ever used.  Once the program is open, you select new project.  In the new project menu, you select where you want the generated information saved, and you select the photos to be used.  Selecting the photos is the most difficult part.  I had to make sure I selected enough photos, and those photos had to overlap one another enough for the program to properly depict the landscape.

Since I had time and access to more than one computer, I chose to run all 165 of the photos taken on the east side of the bridge (first part of the flight from earlier).  Had I been in a time crunch I could have achieved satisfactory results with fewer images.  However, my goal was to have the highest resolution depiction of the landscape possible.  The entire process took approximately 4 hours (I was not at the computer when it finished).  After the program worked its magic, I opened the created ortho-mosaic in ArcScene.

(Fig. 13)  Ortho-mosaic created in ArcScene from UAS collected imagery.

To allow you to better understand the detail projected in this image, I took the original image created in Pix4D and used ArcMap to measure a detail on the map.  The object I selected is the LL & the heart just off to the right of the center of the above image.  I measured the end of the second L.  It measured approximately 0.4947 meters (1.62 ft).  The object is clearly discernible in the ortho-mosaic.  The image captured by the UAS displays the individual rocks used to create the LL & heart pattern.

(Fig. 14) Comparison of the image created in ArcScene (top) and the original image created by Pix4D displayed in ArcMap (bottom).

Scenario

For the last part of the lab, I have to provide a consulting report for a scenario.  My consulting report is directed to my supervisor, who will write the final report for the customer.  The scenario is as follows:

   "A Canadian crop consultant who manages 300,000 acres of land in the plains provinces is looking to do drainage modeling, and to monitor the overall health of the crops. She doesn’t have access to anything but coarse DEM data, and the IR satellite imagery is not at a resolution that suits decision making beyond a per field basis. They would like to start out with a pilot project by doing small 80 acre test plots, and then scale up from there if the project works. How might they best implement this plan."

Canadian crop consultation

We definitely have the tools to fully assist this customer.   Without knowing the specific layout of the customer's property or the initial 80 acre test plot, I have used a known 80 acre field to develop a few options.  Using the Mission Planner software, I created two scenarios to consider: one flight using the fixed wing 3D Robotics Aero, and one using the 3D Robotics X8.

3D Robotics Aero

Specs
Altitude: 50 m
Speed: 23 m/sec
Camera: Canon S110-S120

Results
Ground Resolution: 1.83 cm/pixel
Flight Time: 17 min
Number Images: 494

3D Robotics X8

Specs
Altitude: 50 m
Speed: 8m/sec
Camera: Canon S110-S120

Results
Ground Resolution: 1.83 cm/pixel
Flight Time: 37 min  (This would need to be broken into at least 2 flights)
Number Images: 489

(Fig. 15)  Flight plan for the 3D Robotics Aero.

The ideal situation would be to fly the Aero, for the simple fact that the job could be done in one flight.  If we used the X8, we would need to split the flight at least in half and make 2-3 flights to cover the 80 acres.  It is possible to increase the altitude to accomplish either flight in a shorter time frame, but we would compromise the ground resolution considerably.
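As a sanity check on the two flight times, a rough straight-line estimate can be made from the platform speeds.  In the sketch below, the speeds and the 1.83 cm/pixel GSD come from the specs above, while the 4000-pixel image width and 80% sidelap are assumed; time spent turning between lines and swapping batteries is ignored, so these numbers only roughly bracket Mission Planner's estimates:

```python
ACRE_M2 = 4046.86  # square meters per acre

def survey_minutes(area_acres, speed_m_s, gsd_m=0.0183,
                   image_width_px=4000, sidelap=0.8):
    """Rough minutes of flying to photograph an area in parallel lines.
    Effective swath = image footprint width * (1 - sidelap); turnaround
    time between lines is ignored, so real flights run longer."""
    swath_m = image_width_px * gsd_m * (1 - sidelap)
    line_meters = area_acres * ACRE_M2 / swath_m
    return line_meters / speed_m_s / 60

aero = survey_minutes(80, 23)  # fixed wing Aero at 23 m/s
x8 = survey_minutes(80, 8)     # X8 multicopter at 8 m/s
print(round(aero), round(x8))  # the fixed wing is roughly 3x faster
```

The gap between the two estimates comes almost entirely from cruise speed, which is why the fixed wing can finish the plot in a single flight.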

These results are just a rough approximation.  The following questions would need to be answered for me to properly prepare an accurate plan.

  • Is there ample room to launch a fixed wing UAS, or would we need to utilize one of our multicopter platforms?
  • Is the 80 acre test plot just one field, or multiple small fields which add up to 80 acres?

Either way we approach the flight, we can definitely produce the data the customer is looking for.  Below you will find some sample images of products we could produce for this customer.  Figure 16 shows an array of aerial imagery from which the consultant will be able to see the exact health of the fields, through plant health, nutrient needs, and moisture content.  Looking at figure 17, you will see an elevation and slope model to better address the water drainage of each field.

(Fig. 16) Crop health imagery we could provide for the customer. (Ascending Technologies)

(Fig. 17) Topographic map for drainage modeling. (NCSU)

Discussion

UAS technology is exploding, and the ability for the public to obtain one has never been easier.   The variations available to the public/commercial field are endless.  The options I spoke of were found through recommendations from Dr. Hupy and searching the internet on my own.  The field of UAS use is just gaining speed, and I look forward to seeing all the uses we haven't even thought of yet.

One use already in place which is fascinating and practical is the use of UAS for precision agriculture.  This technology is being used to assist farmers in proper fertilization and watering practices.  It is my hope that through this type of technology we will be able to display the functionality of no-till farming practices and rotational planting to optimize soil health.  Though farmers across the country are successfully reducing their reliance on pesticides and chemical fertilizers, many do not believe it will work for them.  I feel that through the use of this great technology we would be able to physically show them the effects these practices could have on their property and formulate calculations for savings and reduced time in the field.

One thing not discussed much in this lab is the legal issues surrounding UAS use.  From my understanding, UAS regulations vary from state to state, and all are subject to regulations from the Federal Aviation Administration (FAA).  These regulations limit the use, and the life-saving ability, that UAS platforms can provide.

Conclusion

Overall, the lab did a great job of introducing us to UAS platforms and giving us a very small taste of their uses.  I am excited to learn more about applications for UAS, and I look forward to taking Dr. Hupy's UAV class in one of the upcoming semesters to learn a great deal more about the ins and outs.  I feel privileged to be at one of the few universities teaching UAS technology to undergraduates.  Having even the most basic knowledge of their use has me continually trying to think of new uses for UAS platforms.

Sources

Ascending Technologies
3D Robotics
NCSU
Pix4D

Sunday, October 4, 2015

Field Activity #3: Conducting a Distance Azimuth Survey

Introduction

In this week's lab adventure we will be conducting field research with less technology than one would normally prefer.  Far too often, situations come up in the field which leave you to improvise your methods.  The window for data collection in the field can be short, so going back to get the appropriate equipment may not always be feasible.  Our instructor Dr. Hupy explained how he had traveled to a foreign country and had his high-end GPS unit seized at customs.  In this type of situation you can't just run back to the office and grab another one, so you need to be prepared to improvise.  Dr. Hupy's example is just one of a multitude of possibilities which could happen to someone doing field research.  Knowing how to collect location data accurately in improvised situations is the focus of this lab exercise.

Methods

To start this exercise, Dr. Hupy took our entire class outside to demonstrate how to use the equipment and to make sure we had the data recorded in the proper format for importing into ArcMap.  He also randomly put us into groups of two to work with for the remainder of this field activity.  My partner was Casey Aumann. (Casey's Blog)

Our class was given instruction on how to use the TruPulse 360, a laser range finder which can also tell you the azimuthal direction you are facing in degrees.  You locate the target in the viewfinder and fire the laser with the button on the top of the unit.  To see your azimuthal direction, you must use one of the two buttons on the side to toggle through the selections until AZ is displayed in the viewfinder.  The unit has multiple settings for distance; our class was instructed to use the SD (Slope Distance) setting.

(Fig. 1)  The TruPulse 360 is on the left; the picture on the right is an example of the view within the viewfinder.


After recording just a handful of points, we returned to the computer lab to input the points into Excel and then import the table into ArcMap, creating a feature class.  Upon adding the feature class to the data frame, along with a basemap in the background, we noticed something was askew: the point of origin was off, and the direction of the lines was off enough to notice at a quick glance.  By the end of the class period, our class and Dr. Hupy were unable to resolve the issue.  Assuming the calibration was off, Dr. Hupy went out the following day with the TruPulse 360 and attempted to recalibrate the unit.  He went to the same location our class had used the day before, with terrible results.  He had a couple of recurring problems: the unit would fail to calibrate, or the compass would point to a different north each time.  After all the trouble, Dr. Hupy relocated to a wooded area and was able to calibrate the unit with no issues.  He believes there was electrical interference somewhere on the corner of the building, or underground, which was causing the incorrect readings.  Using this information, my partner and I chose to survey a different location on campus.
(Fig. 2) The results of the trial run with Dr. Hupy.  The red dot is where the points were actually taken from, and the pink dots are where the green lines should line up to.  Not all the pink dots can be seen at this extent.



For this exercise Casey and I used the TruPulse 360 unit to obtain our distance and azimuth information.  Additionally, we used my personal Garmin 62s to obtain our point of origin.  On the afternoon of Wednesday, September 30th, 2015, my partner and I headed to the campus courtyard on the University of Wisconsin-Eau Claire campus.  We chose the courtyard because of its plethora of large stone benches, light posts, and other features to measure.  The area is wide open, though it does fall between a number of buildings; we were hoping to reduce the chance of being hampered by electrical interference.

When we arrived at the courtyard, we picked a location in its center with a view of all the features to be surveyed.  The spot we chose was on a paved surface divided into individual squares.  We picked a square to stand in and recorded our point of origin there.  The square helped us make sure we remained in the exact same location as we recorded our data.

(Fig. 3) My lab partner standing in the square we chose to record our data from.
For this lab exercise we had to record the distance, azimuth, and object attribute for at least 100 different points in our study area.  As suggested by Dr. Hupy, we double-checked our angles with a compass to make sure we weren't getting any interference.  Everything appeared to be in check, so we proceeded to collect our data.  The courtyard contains just over 50 stone benches, so we decided to use them as our first features to measure.  It was a beautiful afternoon, so a number of people were utilizing the courtyard and the benches.  This made it difficult to obtain an accurate reading from some of the benches, as people were sitting on or leaning against them.  We were also standing in a lower spot of the courtyard, which limited the view of some benches.  With only a small portion of a bench visible, it was tough to get an accurate reading, especially at any distance.  The TruPulse 360 has a 7x magnification, which amplifies any movement or shaking in the viewfinder, making it even harder to lock on to a small object for an accurate reading.

We entered the data directly into an Excel file on a laptop, eliminating the step of transferring the data later had we recorded it on paper.  Having done a trial run in class a few days before, we already knew the format required to achieve the correct results.

(Fig. 4) The formatting used to record the data in Excel.
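As a rough sketch of the kind of table layout involved, the snippet below builds a small survey table as a CSV in Python.  The field names and values here are my assumptions for illustration, not the exact headers or points we recorded in Excel:

```python
import csv
import io

# Hypothetical survey records: an origin point (decimal degrees),
# an azimuth in degrees, a distance in meters, and a feature type.
rows = [
    {"OriginX": -91.4985, "OriginY": 44.7990, "Azimuth": 45.0,
     "Distance": 25.3, "Type": "bench"},
    {"OriginX": -91.4985, "OriginY": 44.7990, "Azimuth": 112.5,
     "Distance": 40.1, "Type": "light pole"},
]

# Write the table to an in-memory CSV so the layout is easy to inspect.
buf = io.StringIO()
writer = csv.DictWriter(
    buf, fieldnames=["OriginX", "OriginY", "Azimuth", "Distance", "Type"])
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```

The key point is that every record carries the same origin coordinates, since all measurements were taken from the single square we stood in.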
After recording the data for all of the benches, we decided the light poles within the courtyard would be our next feature.  The light poles are not very large in diameter, so getting an accurate reading from them each time proved to be a challenge.  We focused on the head of each lamp, which was considerably larger in diameter than the post itself.  After collecting and recording all of the light pole information, we chose to survey the larger trees and ground signs around the courtyard.  When completed, we ended up with 107 points.

Before we could convert the data to a feature class in ArcMap, we needed to take the point of origin from the GPS and convert it from degrees, minutes, seconds to decimal degrees.  Dividing the minutes by 60 gave us the decimal portion we needed to accurately display our point of origin.
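The conversion is simple arithmetic: decimal degrees = degrees + minutes/60 + seconds/3600.  A minimal sketch in Python (the coordinates below are hypothetical, not our actual origin point):

```python
def dms_to_dd(degrees, minutes, seconds=0.0):
    """Convert degrees/minutes/seconds to decimal degrees.

    A negative 'degrees' value (southern or western hemisphere)
    makes the whole result negative.
    """
    sign = -1.0 if degrees < 0 else 1.0
    return sign * (abs(degrees) + minutes / 60.0 + seconds / 3600.0)

# Example: 44 degrees 47.5 minutes north latitude (hypothetical point)
lat = dms_to_dd(44, 47.5)
print(round(lat, 4))  # 44.7917
```

Because our GPS reported degrees and decimal minutes, only the minutes term was needed; the seconds argument is there for completeness.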

With all the data in the Excel file, I imported the table into ArcCatalog.  The next step was to open a new blank file in ArcMap.  To display our data, I ran a tool called Bearing Distance To Line, using the table I had imported as the input data.  The tool is located in ArcToolbox under Data Management Tools, then Features.  Using the data within the table, the tool plotted the distances as lines from the point of origin at the given angles on the map surface.  See figure 5 for the results from this tool.
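The tool itself handles the coordinate system, but the geometry behind it can be sketched in plain Python as a planar approximation: each line's far end is offset from the origin by the distance, with the azimuth measured clockwise from north.  The bench values below are made-up examples, not our actual measurements:

```python
import math

def endpoint(origin_x, origin_y, azimuth_deg, distance):
    """Return the (x, y) of a line's far end, given an origin,
    an azimuth measured clockwise from north, and a distance.
    Planar approximation; real tools account for the projection."""
    az = math.radians(azimuth_deg)
    dx = distance * math.sin(az)  # east component
    dy = distance * math.cos(az)  # north component
    return origin_x + dx, origin_y + dy

# A hypothetical bench 25 m due east (azimuth 90 degrees) of the origin:
x, y = endpoint(0.0, 0.0, 90.0, 25.0)
print(round(x, 2), round(y, 2))  # 25.0 0.0
```

Note the swap from the usual math convention: because azimuth is measured from north rather than from the x-axis, sine gives the east offset and cosine the north offset.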
(Fig. 5)  The result after running the Bearing Distance To Line tool.

(Fig. 6)  Illustration of how the bearing distance to line tool works in ArcMap.

Once I had utilized the Bearing Distance To Line tool, I converted the end of each line opposite the point of origin to a point using the Feature Vertices To Points tool within ArcMap.  This tool is found in the same Features toolset as the Bearing Distance To Line tool.  Giving each line an endpoint helps you visualize where the line ends and gives the feature a determinable location on the map.

(Fig. 7)  The result of running the Feature Vertices to Points tool.

Once both of the above tools had been run, I loaded a basemap of the study area into ArcMap.  The basemap allowed me to overlay the feature classes I created, determine how accurate our data collection was, and produce a map of my results.  I was unable to use the standard basemap available in ArcMap, as my study area was built after the basemap photo was taken.  Using data available to me through the university, I loaded a raster image from the City of Eau Claire file that contained my study area.  Since the basemap used an Eau Claire County-specific projection, I projected both my line and endpoint feature classes to the same projection for the best results.  My final step was to complete the metadata.

(Fig. 8)  The finished map of our study area in the courtyard of UWEC campus.

(Fig. 9) Metadata completed.


Discussion

This lab provided good exposure to collecting data and the issues which arise in the field while doing so.  The purpose of the lab was to expose us to a "basic" survey technique we could use should our "high end" equipment fail us for one reason or another.  I realize everything is relative; however, the TruPulse 360 we were using costs in the neighborhood of $1,500+.  While this is nowhere near the most expensive surveying tool on the market, it costs enough that not everyone will be carrying one around in their pocket as a "backup".  I feel a compass and a long tape reel are probably more "basic".  Knowing this tool exists, I would definitely add it to the list of tools handy for conducting field surveys, especially if you are working alone.

With all technology comes limitations; the electromagnetic interference we experienced is a prime example.  Looking at my final map, I believe we ran into some issues with the technology we used.  First, our point of origin is off by a couple of feet, which could be caused by a number of issues: it is possible more interference affected the location, and the accuracy of a personal GPS unit is at best ±3 meters.  Secondly, the point angles are off from their actual locations.  This is partially due to the origin point being off, but that is not the complete answer; my feeling is that electromagnetic interference played games with us.  The points on the west side of the point of origin are off more than the points on the east side of the map, and I have no plausible explanation for this at the moment.

The collection of the data went smoothly.  I used the TruPulse 360 to collect the measurements for all of the benches, so we would have less chance of confusion over which ones had already been done.  Also, given the benches were arranged in rows, we decided to work from right to left and finish a complete row before moving on to the next one.  We implemented the same process for the other features, except that I entered the data into the Excel file while my partner collected the measurements.

Conclusion

This lab activity proved to be very educational in many different areas.  Learning this new method of locating and plotting points will be very beneficial in the future should technology let me down, as I know it will.  Using a survey tape measure and compass, as discussed before, would be a good way to double-check the accuracy of your equipment.  The TruPulse 360 is now another tool I am familiar with and can put to use if available.  Continued use of Excel and importing tables into ArcMap is a great way to understand the proper way to format data so it will display correctly the first time.