Monday, December 16, 2013

Lab 5: Final Mini Project

Introduction:
For this project I chose my research question to be “Where is a suitable camping area in Ashland County, Wisconsin?” I chose this question because last year I got a tent for my birthday and haven't really been able to use it as often as I would like. My sister, who has frequently offered to take me camping, lives in Ashland, Wisconsin. She is an experienced camper/backpacker, but I have a few concerns. For starters, my sister is allergic to bees and, it seems, some other bugs as well. I'm sure she is responsible enough to bring her EpiPen along, but just to be safe I set up some criteria for possible camping areas: the site must be within county or national forest land, at least 2 miles from a highway, at least a mile from any fire occurrence within the last 10 years, and within 64 km of a hospital. These criteria help me find optimal camping areas with easy access to roads; in case my sister has to use her EpiPen, I want to make sure a hospital is no more than about 45 minutes away.
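As a rough sanity check on the hospital criterion, 64 km at an assumed rural-highway average of about 85 km/h (my assumption, not from the data) works out to roughly 45 minutes:

```python
# Rough check: does 64 km correspond to ~45 minutes of driving?
# The 85 km/h average speed is an assumed figure for rural highways.
distance_km = 64
avg_speed_kmh = 85
minutes = distance_km / avg_speed_kmh * 60
print(round(minutes))  # 45
```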

Data Source:
I got my data from ESRI and the Wisconsin DNR. I used ESRI's highways and hospitals layers, and the WI DNR's county forest, national forest, fire occurrence, and Wisconsin state and county boundary layers. My main concern with some of this data is its temporal accuracy: when was the data collected, and is it still useful to me? The fire occurrence data's most recent record was from 2009. It would be nice to have more up-to-date data, but for a project in an introductory ArcGIS course I'm okay with it. Here is a link to the Wisconsin DNR metadata: http://dnr.wi.gov/maps/gis/metadata.html. I was also concerned about the completeness of some of the data. For the fire occurrences, for instance, were all fire instances recorded, or did a fire have to meet certain criteria to be included in the data set? Overall I am pretty happy with my data sources and the metadata.

Methods:
For this assignment I used the Clip tool to limit the necessary data to inside Ashland County. I used the Buffer tool with Clip on the hospital and highway features to “build” my area of interest, and I used Buffer with the Erase tool to take away unwanted areas. I also used the Union tool to combine the county and national forests in the county. Here is my data flow model:
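The tool chain boils down to set operations on areas: Union is a set union, Clip is an intersection, and Erase is a difference. A minimal sketch of that logic using Python sets of hypothetical grid-cell IDs (not the actual ArcGIS tools or data):

```python
# Suitability overlay expressed as set algebra over hypothetical grid cells.
# Clip = intersection, Union = union, Erase = difference.
county = {1, 2, 3, 4, 5, 6, 7, 8}
county_forest = {2, 3}
national_forest = {3, 4, 5}
near_hospital = {1, 2, 3, 4, 5, 6}      # within 64 km of a hospital
highway_buffer = {5, 6}                 # within 2 miles of a highway
fire_buffer = {4}                       # within 1 mile of a past fire

parks = county_forest | national_forest            # Union tool
suitable = parks & near_hospital & county          # Clip tool(s)
suitable = suitable - highway_buffer - fire_buffer # Erase tool
print(sorted(suitable))  # [2, 3]
```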


Results and Evaluation:
Here is a map of my final product, run from the data flow model. As you can see, I am not limited for choices in the Ashland County area. I feel like there are some things I could fix or change next time. For instance, there is a hospital in the next county, not visible on the map, that would be closer than the one shown; I could have skipped clipping the hospitals to Ashland County, and more area would have been available for camping, but I was limited to one county. Still, I am pretty satisfied with the outcome. If I had to do this project again I would ask a different question, one that required me to go out of my comfort zone and look for data online. I had thought of siting a wind farm in Eau Claire County but decided to go with a simpler project for the sake of time. I really enjoyed this project, and I feel more confident answering spatial questions.



Friday, December 6, 2013

Lab 4: Suitable Bear Habitat


The goal of this lab was to demonstrate and further explore our knowledge of geoprocessing and data management, apply them to specific criteria, and make a data flow map of the process.

Background: map suitable bear habitat areas and compare them to DNR management areas in Marquette County, Michigan.

Methods:
Objective One: Map GPS black bear locations in Michigan from an MS Excel file. First, I explored the properties of bear_locations_geog$ and previewed the file, noting the file type and coordinate system. The file had X and Y coordinates, and in order to map them I needed to add them as an event theme. So I set the X and Y fields to the corresponding Point_X and Point_Y columns and set the coordinate system to NAD 1983 HARN Michigan GeoRef (meters). I then exported the points to my geodatabase.
Objective Two: Determine the forest types where black bears are found in central Marquette County, based on GPS locations. For this objective I joined Land Cover with bear_location to get bear_cover; I then summarized MINOR_TYPE and found that the top three habitat types are mixed forest lands, forested wetlands, and evergreen forest lands.
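The join-and-summarize step amounts to counting bear points per cover type. A sketch with collections.Counter (the cover labels mirror my results, but the point counts here are made up):

```python
from collections import Counter

# Hypothetical MINOR_TYPE value for each joined bear location
bear_cover = ["mixed forest lands", "forested wetlands", "mixed forest lands",
              "evergreen forest lands", "forested wetlands", "mixed forest lands",
              "shrub lands"]

# Equivalent of summarizing MINOR_TYPE: count points per cover type
summary = Counter(bear_cover)
for cover, n in summary.most_common(3):
    print(cover, n)
```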
Objective Three: Determine whether bears are found near streams. First I buffered the streams and then performed a spatial query with bear_location, finding that about 72% of the bear population at that point in time was near a stream (according to biologists, the proportion must be above 30% to be considered an important habitat characteristic).
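The spatial-query result reduces to a simple proportion check against the 30% threshold (the counts below are illustrative, not the lab's actual numbers):

```python
# Illustrative counts: bears selected by the stream-buffer query vs. all bears
bears_total = 68
bears_near_stream = 49

pct = bears_near_stream / bears_total * 100
print(f"{pct:.0f}% near streams")      # 72% near streams
print("important habitat:", pct > 30)  # important habitat: True
```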

Objective Four: Find suitable bear habitat based on two criteria. Using bear_cover, I performed a query to limit it to the top three habitats and intersected the result with stream_buffer. These show up as overlapping polygons of the same feature, so I used the Dissolve tool to make them all continuous.
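Dissolve removes the shared boundaries between overlapping pieces of the same feature. The same idea in one dimension, merging overlapping intervals (a simplified stand-in for polygon dissolve, not the ArcGIS tool itself):

```python
def dissolve(intervals):
    """Merge overlapping or touching intervals, the way Dissolve
    removes inner borders between overlapping polygons."""
    merged = []
    for start, end in sorted(intervals):
        if merged and start <= merged[-1][1]:
            merged[-1][1] = max(merged[-1][1], end)  # extend the last piece
        else:
            merged.append([start, end])              # start a new piece
    return [tuple(m) for m in merged]

print(dissolve([(0, 4), (3, 7), (9, 12)]))  # [(0, 7), (9, 12)]
```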
Objective Five: Find all areas of suitable bear habitat within areas managed by the Michigan DNR. For this one I used Clip to limit the available features to inside the study area, then used Clip again to see which DNR management areas fell within the suitable bear habitat just determined. Because the DNR's management areas are split into small units, I used Dissolve to get rid of the inner borders.
Objective Six: Eliminate areas near urban or built-up lands. For this last objective I used Select by Attribute on land cover to isolate urban and built-up areas within the county and followed that up with a buffer. From there I used Erase to eliminate any bear habitat and DNR management area within 5 km of those areas.
Results:
As seen on the map, I include land cover type in the bear habitat areas and the DNR management areas within those habitats. There is a good cluster of bears in the northwest, but no DNR management areas around it. Just south of that, there is a good overlap of DNR-managed land and bear locations. Something that needs to be recognized is that these bear locations are frozen in time; the bears aren't glued in place and are probably mobile. I feel this southern area would be the best place for the Michigan DNR to utilize its resources with its given management areas at this point in time, unless it is interested in obtaining more management areas to the north.

 And here is my Data Flow Map!


Sources: All of the data were downloaded from the Michigan Center for Geographic Information.
   Land cover (USGS NLCD): http://www.mcgi.state.mi.us/mgdl/nlcd/metadata/nlcdshp.html
   DNR management units: http://www.dnr.state.mi.us/spatialdatalibrary/metadata/wildlife_mgmt_units.htm
   Streams: http://www.mcgi.state.mi.us/mgdl/framework/metadata/Marquette.html


Wednesday, October 30, 2013

Lab 2: Downloading and Mapping Data From Online


The goal of this Lab was to learn how to download and map data from the U.S. Census Bureau.

Methods:
I first performed an advanced search on the US Census Bureau's website: http://factfinder2.census.gov/faces/nav/jsf/pages/searchresults.xhtml?refresh=t and chose Topics > People > Basic Count/Estimate > Population Total. Then, from the Geography tab, I chose Counties > County 050, Wisconsin > All Counties within Wisconsin. From here the data was offered for download in two data sets, SF1 and ACS. SF1 data sets are the most basic, based on the decennial census mandated every 10 years by the U.S. Constitution; SF1 provides an accurate count of all persons living in the U.S. in order to reapportion seats in the House of Representatives. The ACS (American Community Survey) provides detailed population estimates derived from a number of surveys across numerous geographic areas, so there is some error in the estimates. I chose and downloaded the P1 TOTAL POPULATION table from the 2010 SF1 data set. From there I extracted the zip files. I then viewed the CSV files in MS Excel, looking at the metadata and the tabular data: DEC_10_SF1_P1_with_ann.csv (P1), which I saved as an Excel Workbook. I then downloaded a Wisconsin shapefile from the website and unzipped it.


In ArcMap I confirmed that the data came over properly. I then opened the WI shapefile table and the P1 table and joined them using the common attribute GEO#id as the key; the shapefile was my destination table and the P1 table was the source table. I confirmed that the join worked by opening the attribute table. From here I could map the population data. In the WI shapefile's symbology properties I chose Quantities > Graduated Colors, selected a pleasing color scheme and number of classes, and chose D001 (Total Population) as the value to be mapped. I also adjusted the projection of the data frame to one more appropriate for the State of Wisconsin: NAD 1983 (2011) Wisconsin TM (US Feet). From here I added other elements to complete the map.
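The attribute join works like a key lookup: each shapefile record pulls its matching census row by the shared GEO id. A plain-Python sketch of that mechanic (the field names are simplified placeholders, and the rows are illustrative):

```python
# Hypothetical rows standing in for the shapefile table (destination)
# and the P1 census table (source)
shapefile_rows = [{"GEOID": "55001", "NAME": "Adams"},
                  {"GEOID": "55003", "NAME": "Ashland"}]
p1_rows = [{"GEOID": "55001", "D001": 20875},
           {"GEOID": "55003", "D001": 16157}]

# Build a lookup on the source table, then join onto the destination table
p1_by_geoid = {row["GEOID"]: row for row in p1_rows}
joined = [{**shp, **p1_by_geoid[shp["GEOID"]]} for shp in shapefile_rows]
print(joined[1]["NAME"], joined[1]["D001"])  # Ashland 16157
```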

 
I was also able to download a variable of my choice, and I chose the P1 SEX BY AGE table from the 2010 SF1 100% Data. I followed the same process as with the TOTAL POPULATION data, unzipping the files and joining the data to the WI shapefile in order to map it in a new data frame. For my value to be mapped I had to normalize the data: I mapped D033 (Age 21) normalized by D026 (Females) and chose to display the data as a percentage.
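Normalizing one field by another is just a ratio expressed as a percent. For example, with made-up counts for one county (D033 and D026 playing the same roles as in the lab):

```python
# Hypothetical counts for one county
d033_age21_female = 1240   # D033: females age 21
d026_total_female = 50600  # D026: total female population

pct = d033_age21_female / d026_total_female * 100
print(f"{pct:.2f}%")  # 2.45%
```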
 
Results:
 
These are the finished maps of the data I downloaded from the US Census website. Overall the process was quite simple and even fun. I have Total Population by county on the left, and the percentage of 21-year-olds (female), normalized by total female population, by county. I thought that would be an interesting value to map given my knowledge of where Wisconsin universities and other major colleges are located. Overall I am quite happy with the results of this lab.
 

Friday, October 25, 2013

Lab 3: Introduction to GPS


The objective of this lab was to become familiar with the Trimble Juno GPS unit: create a geodatabase with feature classes, prepare that geodatabase for deployment to the Trimble Juno, collect field data, check in the collected attributes, and make a cartographically pleasing map using the data collected.

 Methods:

The first step was to use ArcCatalog to create a new geodatabase and add feature classes (points, lines, and polygons). The feature classes were assigned the coordinate system NAD_1983_HARN_Wisconsin_TM (meters), and I added a Type field, set to text, to all feature classes. Next, a shapefile of the campus buildings and a raster of campus were imported from existing files. All elements of the new geodatabase were added to the map, and the symbology of the feature classes was changed so they would be easily distinguishable on the Trimble Juno unit. Then I saved the map.

Second, I prepared the geodatabase for deployment to the Trimble Juno for field data collection using the ArcPad Data Manager toolbar. I clicked Get Data for ArcPad, used the Action menu to change the defaults, chose AXF layer for the Background Layer Format, and set background layer editing to Editing Allowed. Finally I chose Checkout All Geodatabase Layers and Copyout All Other Layers (making sure all layers were set to check out). I named a folder to store my data and adjusted the path where the files would be stored. I then finalized deployment by clicking Create the ArcPad Data on This Computer Now, then Finish.

Third, I loaded the geodatabase onto the Trimble Juno using a USB cable, cutting the folder with the new geodatabase out of the directory and pasting it into the Geog335CHUPY folder on the storage card. I then confirmed that everything was checked out properly by opening my map in ArcPad on the Trimble Juno and seeing the campus image and buildings, along with all the layers I created, in the table of contents.

Fourth, I collected point, line, and polygon features in the field using ArcPad on the Trimble Juno GPS. The objective was to collect six polygons (three using point averaging and another three using point streaming), one line along the footbridge, and six points (three with type “tree” and another three with type “light post”). Collecting points was easy: simply tap the point feature and point averaging begins; after that I filled out the attribute form, clicked OK, and moved on to the next point I wished to collect. For polygons I tapped Add GPS Vertex, which creates the first vertex after point averaging, and repeated this step for each desired vertex. When the polygon was complete I tapped Proceed to Attribute and filled out the attribute form. I also used this method for my line feature. For point streaming I tapped Add GPS Vertex Continuously, walked the perimeter of my polygon, and tapped Proceed to Attribute to complete the polygon.
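Point averaging reduces GPS noise by averaging many repeated fixes taken while standing at one location. A minimal version of that idea (the coordinates below are invented, not my actual field data):

```python
# Repeated GPS fixes (lon, lat) collected while standing at a single point
fixes = [(-91.4986, 44.7996), (-91.4988, 44.7994),
         (-91.4984, 44.7998), (-91.4986, 44.7996)]

# The averaged position smooths out random error in the individual fixes
n = len(fixes)
avg_lon = sum(lon for lon, lat in fixes) / n
avg_lat = sum(lat for lon, lat in fixes) / n
print(round(avg_lon, 4), round(avg_lat, 4))  # -91.4986 44.7996
```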

Finally, I reconnected the Trimble Juno to my computer, copied my folder from the storage card, and pasted it back into my Lab3 folder. From the ArcPad Data Manager toolbar I used the Get Data From ArcPad tool, checked all of my feature classes, and clicked Check In. The data I collected in the field could now appear in ArcMap. From here I designed a layout with appropriate map elements.
Results:
Here are the final results of my data collection using the Trimble Juno GPS unit. Unfortunately the raster image is an outdated image of campus, which has since undergone renovation, as can be seen from the overlap between the campus buildings and my collected data. I'm pretty sure I collected the wrong footbridge: I was supposed to collect the one crossing the Chippewa River instead of Little Niagara Creek. The three polygons using point averaging include the largest grass area and the two grass areas immediately to its right. These polygons have straight edges, as opposed to the point-streaming polygons, which allow for more curvature and show the true shape of the grass areas. At the beginning of my data collection session I had a PDOP of around 3.8, probably due to the cloudiness and my orientation to nearby campus buildings. As time went by I was able to connect to more satellites, and my PDOP eventually went down to 1.4, which I was quite satisfied with given the weather; it was very cold and windy! As shown, the GPS unit has some inherent inaccuracy; for instance, I know for a fact that the lower-right light post is located at the edge of the polygon and not on the path. This was probably because it was the first attribute I collected, when my PDOP wasn't as low. Nevertheless, the lesson learned is that a GPS is rarely 100% accurate, and that should be taken into account when collecting data.
 

 

Friday, September 27, 2013

Lab 1: Eau Claire Confluence Project


 
The goal of this lab is to become familiar with various spatial data sets used in public land management, administration, and land use, and to prepare base maps for the Confluence Project. This involved understanding legal descriptions of property and then developing a legal description of the Eau Claire Confluence Project proposal. It also further developed digitizing skills, with emphasis on the Public Land Survey System (PLSS), Wisconsin's township-range system, and civil divisions; with this basic understanding I had to develop six thematic maps of Eau Claire.

Here is a brief description of the Confluence Project, around which Lab 1 was centered: “In 2014, a public-private partnership between local developers, UW-Eau Claire and the Eau Claire Regional Arts Center (dubbed Haymarket, LLC) will break ground on a new community arts center/university student housing and commercial retail complex in downtown Eau Claire. The Arts Center will be home to three performance spaces, galleries, offices, classrooms, studios, and much more”.
http://volumeone.org/news/1/posts/2012/05/15/3134_arts_center 


Methods:

The first method covered in this lab was to digitize the site of the proposed Confluence Project. I first had to create a new geodatabase, add a polygon feature class for the proposed site, import the parameters of the Eau Claire County Coordinate System from an existing feature class's projection, and add the World Imagery basemap (an aerial view). Next I digitized the two proposed parcels purchased by UWEC using the snapping tool, the parcel_area feature class, and a legal outline photo for further reference. (Fig. 1.1)
Fig 1.1
 
Next we developed legal descriptions of the parcels of the proposed Confluence Project. I used the Identify tool to obtain the Parcel NO, a standard number used to identify parcels for taxation purposes. I entered this number into Eau Claire's Property and Assessment Search website: http://www.bis-net.net/cityofeauclaire/search.cfm
From here I was able to complete a legal description (Fig. 1.2)

Fig 1.2


 
The final step, making six maps showing major features, simply used knowledge I already had. I created six new data frames in the table of contents and applied the relevant feature classes to each data frame's layers. I used basic techniques to make the maps visually pleasing and inserted labels, legends, scale bars, neat lines, and shout outs. I also used guides to help me align and size the maps appropriately. Then I cited my sources at the bottom and included my name. (shown in Fig. 1.3)

  • Civil Division map shows county boundaries and the City of Eau Claire along with the townships.  
  • Census map shows Block groups and Population per square mile.
  • PLSS map shows Eau Claire with the dividing lines of the Public Land Survey System.
  • Parcel map shows parcels, roads, and water around the proposed Confluence site.
  • Zoning map shows locations of public property, conservancy, industrial, central business, residential, and shopping district zones. Road center lines are also included for visual reference.
  • Voter District map shows boundaries and the district number for most of the City of Eau Claire.
Results:

 
Fig 1.3
 
Sources
Eau Claire Confluence Project:
PLSS - Legal Descriptions: