GIS 540

Student Project Gallery



Spring 2015



Calculation and modeling of groundwater elevation and BTEX analytical concentrations - Betsy Bouton Part 1, Part 2

Monitoring groundwater elevations and the analytical concentrations of potentially harmful contaminants is important for delineating the nature and extent of the contamination, assessing the stability of the plume, and developing trend data. Groundwater contours and plume models have previously been created manually by the environmental scientists collecting the data. This tool automates the process: it takes CSV files of field-measured depth to water along with laboratory analysis results, interpolates groundwater contour lines and a raster surface of the contamination, and outputs to an HTML webpage the maximum, minimum, and average groundwater elevation and analytical result values together with a map. Inputting multiple years' worth of data allows comparison of the flow of the contamination over time. All original source data used as example input are Government owned and are the property of the Government, with all rights and privileges of ownership/copyright belonging exclusively to the Government.
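The summary and HTML reporting step the abstract describes can be sketched in plain Python. The column names below are hypothetical, and the interpolation itself, done with ArcGIS tools, is omitted:

```python
import csv
import io

def summarize_groundwater(csv_text):
    """Compute max, min, and mean groundwater elevation from CSV rows.

    Assumes a hypothetical column layout: well id, ground surface
    elevation, and field-measured depth to water (both in feet).
    Groundwater elevation = surface elevation - depth to water.
    """
    elevations = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        surface = float(row["surface_elev_ft"])
        depth = float(row["depth_to_water_ft"])
        elevations.append(surface - depth)
    return {
        "max": max(elevations),
        "min": min(elevations),
        "avg": sum(elevations) / len(elevations),
    }

def stats_to_html(stats):
    """Render the summary values as a minimal HTML table fragment."""
    rows = "".join(
        "<tr><td>{}</td><td>{:.2f}</td></tr>".format(k, v)
        for k, v in stats.items()
    )
    return "<table>{}</table>".format(rows)
```

In the actual tool the same statistics would also be computed for the laboratory analytical results and embedded in the output page alongside the map.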

Keywords: CSV, groundwater, groundwater elevation, groundwater contamination, interpolation, contour lines, BTEX, HTML



Gridded Coastline Simplification of Postal Code Polygons - Bill Morelli Part 1, Part 2

In the package delivery industry, postal code polygons are used for complex geospatial operations to analyze asset movement data and make business decisions. Data purchased from vendors are highly accurate, which results in a large number of vertices per dataset, particularly in countries with long coastlines. This complexity causes problems for spatial operations, due to memory and network capacity demands, when they are performed using GIS services. My process and tools create a varied-size grid coastline for postal polygons, based on square area size, in countries that fit this description, while maintaining the original accuracy and integrity of the inland postal code boundaries. This tremendously reduces vertex counts in coastal water areas that have no residents.

Keywords: Postal Codes, Polygon Simplification, Grid Coastlines, Vertex Reduction, Spatial Selections, HTML Creation



Retrieving and Manipulating Map Tiles from a Webservice - Kenneth Clawson Part 1, Part 2

This Python tool allows the user to query Google's Static Maps API with predefined parameters to obtain map tiles. By building the query parameters, the user can choose to include or remove map features such as buildings, roads, and points of interest. After obtaining the map tiles, the script displays them in Esri's ArcMap desktop software.
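The query-building step can be sketched with the standard library. The parameter names follow Google's public Static Maps API, but treat the exact `style` syntax as an assumption to verify against the current documentation:

```python
from urllib.parse import urlencode

def build_static_map_url(center, zoom, size, api_key, hide_features=()):
    """Build a Google Static Maps API request URL.

    `hide_features` lists feature types (e.g. "poi", "road") to hide
    via the API's style parameter; other parameters follow the public
    Static Maps API, but check current docs before relying on them.
    """
    params = [
        ("center", center),
        ("zoom", str(zoom)),
        ("size", "{}x{}".format(*size)),
        ("key", api_key),
    ]
    params += [("style", "feature:{}|visibility:off".format(f))
               for f in hide_features]
    return "https://maps.googleapis.com/maps/api/staticmap?" + urlencode(params)
```

The returned URL would then be fetched with `urllib.request` (the keywords mention urllib) and the image added to the ArcMap document.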

Keywords: Map tiles, web services, python, urllib, Google Maps



Importation of National Severe Storms Laboratory WDSSII Custom NetCDF Data Formats to ArcGIS - Robert Toomey Part 1, Part 2

At the National Severe Storms Laboratory we use custom NetCDF data formats as part of the WDSSII project. Being able to read these formats into ArcGIS is useful for working with this data within a GIS system and for data presentation. We create ArcGIS toolboxes to allow reading of custom NetCDF data into ArcGIS GRID format, and for batch creation of HTML and PNG map images.

Keywords: Data Conversion, NetCDF, Weather, Rasters, NSSL, WDSSII, Zipfile



Forecast Fetcher: Generating Weather Information with Geospatial Data for NC State Parks - Jason Brown Part 1, Part 2

Weather events can have severe impacts on recreational areas, so park managers and park visitors alike must stay informed. The current alert system for weather-related warnings in North Carolina State Parks relies on field staff observations and manual updates to the online messaging that advises the public. By integrating National Weather Service geospatial data, an analysis is automated to determine specific alerts for precise areas of the state park system. This project differentiates summer and winter weather events and identifies the parks in the forecast area. The script also prepares data for a statewide map of all weather watches and warnings within state park land.

Keywords: weather, alert, state park, warning, public safety, site closure, weather hazard


Spring 2014



Vector-based Shadow Analysis - Reza Amindarbari

The shading condition of buildings, that is, how much of their facades and roofs are in shadow or, conversely, receive solar radiation at a given time, strongly affects their thermal performance, daylight quality, and capacity for using solar panels. Very few GIS tools exist for shadow analysis, such as the Sun Shadow Volume tool, and none of them are vector based. Taking a building footprint polygon shapefile that contains building height data as input, the Shadow Analysis toolbox computes the in-shadow area and ratio of the rooftop and facades for every feature in the dataset at a given solar altitude and azimuth. Additionally, it creates two polygon shapefiles for the shadows cast on the ground and on rooftops.

Keywords: Building Footprint, Rooftop Shadow, Facade Shadow, Geometry Object



Vehicle Routing Problem Map and Directions Automation - Stephanie A. Wendel

Finding the optimal route for a fleet of vehicles helps speed up deliveries, but unless the drivers are told where to go, the route will be ineffective. Drivers need an easy-to-read interpretation of the optimal route, and the easiest way to deliver that is a map with written directions. This solution automates solving the vehicle routing problem and then deploys a set of written directions in a PDF map as well as in web map form for the drivers to use, saving time for both the analyst and the drivers. The mapbooks are built and uploaded automatically, and each driver has a set of easy-to-follow instructions accessible via their ArcGIS Online account.

Keywords: Vehicle Routing Problem, directions, fleet, mapbook, ArcGIS online, deliveries



Processing Daymet Data for Intersection with the WaSSI Model - Jonah Freedman

Developed by The Eastern Forest Environmental Threat Assessment Center (EFETAC), the WaSSI (Water Supply Stress Index) Ecosystem Services Model (http://www.fs.fed.us/ccrc/tools/wassi.shtml) can be used to project the effects of land cover change, climate change, population change, and water withdrawals on river flows, water supply stress, and ecosystem productivity. Daymet (http://daymet.ornl.gov/gridded) is a collection of algorithms and computer software designed to interpolate and extrapolate from daily meteorological observations to produce gridded estimates of daily weather parameters. Some of these parameters are used in the Priestley-Taylor (PT) method for calculating PET (Potential Evapo-transpiration) which is a variable missing from the WaSSI model. The processes required for preparing Daymet data for intersection with the WaSSI model are presented here.

Keywords: water supply model, Water Supply Stress Index Model (WaSSI), ecosystem services, climate, precipitation, temperature, solar radiation, vapor pressure, spatial analysis, scaling, evapotranspiration (ET), potential evapotranspiration (PET), actual evapotranspiration (AET), Priestley-Taylor (PT) method, watershed, HUC12



Automate and Email Daily Movement of Collared Nigerian Elephants - Janet L. Loomis

Human-elephant conflict is a major issue in Nigeria. As part of an ongoing effort to reduce this conflict, Nigerian officials use satellite data from two collared elephants to monitor the locations of the two herds in Yankari National Park. The collars transmit location data on a daily basis which is used by the North Carolina Zoo to create elephant location maps. These maps are then emailed to counterparts in Nigeria to aid in their efforts. This tool automates the process by parsing the satellite data, creating a final map image, and emailing the final image.
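The emailing step can be sketched with Python's standard email package. Addresses, subject text, and filenames below are placeholders, and the actual SMTP send via smtplib is omitted:

```python
from email.message import EmailMessage

def build_location_email(sender, recipients, png_bytes, date_str):
    """Assemble a daily elephant-location email with a PNG map attached.

    Addresses and subject text are placeholders; sending would use
    smtplib.SMTP against a real mail server, which is not shown here.
    """
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = "Elephant locations for {}".format(date_str)
    msg.set_content("Attached: daily herd location map.")
    msg.add_attachment(png_bytes, maintype="image", subtype="png",
                       filename="herds_{}.png".format(date_str))
    return msg
```

The map image itself would come from the arcpy mapping export that the tool runs after parsing the satellite collar data.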

Keywords: elephant satellite location data, automated email, parsing text, daily movement


Fall 2013


Eye Tracking Data Visualization - Michelle Glatz Part 1, Part 2

An eye tracker is a device that tracks the locations where the eye's gaze dwells (fixations) and the movements from one fixation to another (saccades). The eye tracker can be used with ArcMap to gather data that specifies where on a map a user is focusing, when and how long their gaze lingers, as well as the size of their pupils' dilation. The ability to analyze various properties of this data and recognize patterns in it would benefit many applications as a tool for determining the effectiveness of a map presentation. An analysis of map eye tracking data is presented here. The data is extracted from the eye tracker output file (.csv) and displayed on the map using symbology based on length of gaze or pupil dilation to visualize the level of viewer interest. A space-time grouping analysis is conducted to indicate areas of high focus both spatially and temporally.
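The symbology step, binning fixations by dwell time, can be sketched in plain Python. The CSV column names and the duration thresholds here are illustrative assumptions, not the eye tracker's actual export format:

```python
import csv
import io

def classify_fixations(csv_text, short_ms=200, long_ms=500):
    """Bin fixation records by dwell time for graduated symbology.

    Assumes a hypothetical eye-tracker export with x, y, and
    duration_ms columns; the thresholds are illustrative only.
    """
    classes = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        d = float(row["duration_ms"])
        if d < short_ms:
            label = "glance"
        elif d < long_ms:
            label = "dwell"
        else:
            label = "focus"
        classes.append((float(row["x"]), float(row["y"]), label))
    return classes
```

Each class would then be drawn with its own symbol size in ArcMap; pupil dilation could be binned the same way.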

Keywords: eye tracking, grouping analysis, space-time analysis, visualization


Climate Change Vulnerability Index Toolset - Brian T. Watson Part 1, Part 2

NatureServe has developed the Climate Change Vulnerability Index (CCVI) to rapidly assess the vulnerability of animal and plant species to climate change using a scoring system based on predicted exposure and sensitivity to climate change. The CCVI incorporates the overlay and classification of species ranges with both predicted and historical GIS climate data as factors of a species' exposure and sensitivity to climate change. The CCVI Toolset is a suite of tools that calculates exposure to predicted change in both moisture and temperature for single or multiple species in varying formats. Future plans for the toolset include tools to calculate historical temperature and precipitation factors, exporting and/or creating summary reports, and interface improvements.

Keywords: Climate Change Vulnerability Index, CCVI, climate change, NatureServe, climate model, raster


Crime Analysis Population Environment - Kevin Williams Part 1, Part 2

It has been said that crime has an inherent geographical quality. Yet simply looking at crime density creates a biased view of high-crime areas. The key aspect in crime analysis is people, who serve as victims, offenders, and guardians. A population environment that overlays crime data and displays hot spots is presented here to visualize the true rate of crime.

Keywords: crime, population, crime rate, analysis, database, Tennessee


Air Force Incident Management (AFIM) Aggregator - Kevin Dunlop Part 1, Part 2

The U.S. Air Force has many air bases located overseas. During a wartime invasion, the airmen at each base track incidents, attacks, damage, and readiness status for their base. This information is summarized and reported to the numbered Air Force headquarters (e.g., 7th Air Force) responsible for their region. The AFIM Aggregator is an application designed to standardize the summary process for each base and then aggregate the results into a regional picture, freeing up manpower at the base level while providing a more accurate and concise view of the region.

Keywords: military readiness status, incidents, building damage, air field damage, reporting, overseas, air force


Automated FAWN Weather Data Download, Cleanup and Geoprocessing - Jeannette Atkinson Part 1, Part 2

There are 36 weather stations in the state of Florida that collect daily data, such as temperature and rainfall. The data cannot be processed in its raw form due to issues such as missing values. An application for automated data download, cleanup, and geoprocessing is presented here. The FAWN CSV files are updated and automatically loaded into ArcGIS for an interpolated visual display.

Keywords: FAWN, Weather, Spline Interpolation


GPX, KML and Shapefile: One Tool to Rule them All - Gregory B. Dunnigan Part 1, Part 2

ArcGIS provides many tools for data conversion, but basic ArcMap users may not be familiar with exactly where these tools are located. These users' time is more valuably spent out in the field than on time-consuming data conversion processes. With the power of Python scripting, a tool can be built that greatly streamlines basic conversion tasks for these users, as well as heavily customized processes.

Keywords: Data Conversion, GPX, GPS, KML, KMZ, Shapefile, Zipfile




Spring 2013


GoldenEye....tracker: GIS Analysis of Eye Tracking to Study Data Visualizations - Stewart Rouse Part 1, Part 2

Maps enable viewers to inspect, interpret, and analyze geospatial data sets. Effective maps are designed to engage viewers by directing attention in response to visual stimuli, but research indicates that meaningfulness also plays a role in controlling visual attention. In this study we explore how data visualizations attract a map viewer's attention. Using LIDAR data from Jockey's Ridge, eye tracking hardware, and GIS, we can change the manner in which the data is displayed and track the motion of the user's eyes to see where on the map the user's attention is drawn.

Keywords: tracking, Jockey’s Ridge, CSV file, data visualization, clip


Quantifying changes in snow cover using Landsat imagery - Stephen G. Smith Part 1, Part 2

As global climate change continues, one of its most apparent consequences has been the diminishing volume of snow and ice in many areas of the world. Quantifying these changes in Earth's cryosphere aids researchers in developing models to predict the future impacts of climate change. One method of assessing changes in ice and snow cover is the analysis of satellite imagery. Because manually mapping the extent of snow cover from satellite imagery is time-consuming, this programming tool was developed to automate the process of importing Landsat imagery into ArcGIS software, extracting areas of snow cover, and quantifying the results. Analyzing these results may provide an effective means of elucidating the impact of climate change in a given area.

Keywords: snow cover, Landsat, image classification, satellite imagery, raster classification, mapping, climate change


Generation of Vector Field(s) using Partial Derivatives from ASCII Text Files - Emily R. Russ Part 1, Part 2

Coastal areas are highly dynamic, and their surfaces constantly evolve. LiDAR data is therefore an invaluable resource for displaying terrain change in coastal areas and for understanding coastal processes such as storm impacts. However, it can be difficult to see how a surface is changing spatially using typical surface analyses such as slope or aspect maps. The purpose of this project is to show the fastest rate of change in a high-resolution DEM by generating a gradient vector field, to more clearly visualize coastal evolution. Mapping the direction and magnitude of spatial surface change using vector fields can increase understanding of the processes acting in coastal areas and highlight the vulnerability of certain areas.
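The gradient vector field reduces to central-difference partial derivatives on the DEM grid. A minimal sketch of the math only, ignoring edge cells and real-world cell sizes:

```python
def gradient_field(dem, cell_size=1.0):
    """Compute gradient vectors on the interior cells of a DEM grid.

    `dem` is a list of rows of elevations; central differences give
    the partial derivatives dz/dx and dz/dy, whose vector points in
    the direction of fastest elevation change.
    """
    rows, cols = len(dem), len(dem[0])
    field = {}
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cell_size)
            dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2 * cell_size)
            field[(i, j)] = (dzdx, dzdy)
    return field
```

In the project, each (dz/dx, dz/dy) pair would become an arrow feature (magnitude and direction), drawn with the XY to Line step named in the keywords.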

Keywords: Rodanthe, NC, LiDAR, ASCII text files, vector fields, partial derivatives, XY to Line


Investigating Appropriate Sites for the NSF Olympex Radar Observing System - Lani Clough Part 1, Part 2

The Olympic Peninsula in the state of Washington experiences orographic precipitation within its mountain ranges, with precipitation rates at the crests 50-70% higher than the annual precipitation means. Current knowledge of orographic precipitation is greatly limited by the lack of large rain gauge and radar networks able to monitor small-scale precipitation events. Radar antennas are difficult to place in mountains due to topographic features that can block radar beam patterns. This tool identifies appropriate sites with minimal blockage for placing the National Science Foundation (NSF) Olympex radar observing system in Washington's Olympic Peninsula.

Keywords: Radar Systems, Orographic Precipitation, Viewshed, National Science Foundation, Spatial Reference, Site Analysis, Select By Location, Distance Between Points, Arcpy Mapping


Rapid Prototyping for Tangible GIS - Brendan Harmon Part 1, Part 2

The research objective is to create physical 3D models of Jockey's Ridge for Tangible GIS. GIS is coupled with a physical model through an iterative cycle of 3D scanning and projection. A rapid prototyping tool analyzes a DEM and exports it for laser cutting, 3D modeling, and rapid prototyping. A time series tool performs time series analysis and exports point clouds for 3D modeling and rapid prototyping.

Keywords: Tangible GIS, rapid prototyping, lidar, time series analysis


Landsat Forest Change Tools (LandsatFCT) - David G. Jones Part 1, Part 2

The N.C. Forest Service (NCFS) has identified a simple methodology for locating forest harvest sites for forestry Best Management Practices implementation monitoring, using Landsat satellite imagery and collateral GIS datasets. LandsatFCT is a suite of Python-based tools to assist with identifying available Landsat scenes, batch downloading, TAR file unzipping, and geoprocessing to identify areas on the landscape where forest change has occurred. Presently, change analysis is primarily done through Band 7 differencing. Future plans for LandsatFCT include a more user-friendly interface, user customizations, and additional change detection methods (e.g., NDVI, NDMI).

Keywords: Landsat, Forest Change, Band 7 Differencing, Raster, Parse, GeoRSS, TAR, TIF, TXT


Automated Assignment Grading - Joshua Verkerke Part 1, Part 2

The GIS 520 course at North Carolina State University is an advanced look at some of the tools and methods available through ESRI's ArcInfo software, and its enrollment has been increasing in recent years. The course takes students through 13 exercises involving many different geoprocessing methods, with deliverables including shapefiles, databases, toolboxes, and similar data, typically with a structured naming convention for each component. Converting the grading of such files from a manual or visual check to an automated system would help minimize the time TAs spend grading and free them up to be more responsive in the student forums. The tool presented here partially automates grading of the “Suitability Analysis and Weighted Overlay” assignment, checking for the inclusion of the appropriate files with their proper names, as well as checking for valid results at several steps of the assignment.

Keywords: Grading, Rasters, Toolbox Model, Suitability Analysis


Area-Time Inundation Index Model (ATIIM) - Chris R. Vernon Part 1, Part 2

The Area-Time Inundation Index Model (ATIIM) was created to quantitatively assess site restoration potential. The ATIIM produces two-dimensional inundation extents, determines average bankfull elevation, and generates a suite of other site metrics using hourly water-surface elevation data and a terrain surface created from Light Detection and Ranging (LiDAR) data.

Keywords: Area, Time, Inundation, LiDAR, Index, Model





Fall 2012



Converting Weather Collection Data into Displayable Raster Data - Brian McLean (2 part videos) Part 1, Part 2

FAWN (Florida Automated Weather Network) is a series of 36 weather stations spread throughout the state of Florida that collect various types of weather data. This weather data can be a useful source of supplemental information for farm managers making production decisions. The information is available for download in CSV format. This tool converts and interpolates the data across the state based on user input.

Keywords: Raster, Interpolation, Spline, Weather, CSV file, Conversion, Florida, Temperature, Evapotranspiration


Suitability Analysis for Potential Rifle Range Locations at the S.C. Department of Natural Resources (SCDNR) - Geoff Schwitzgebel (2 part videos) Part 1, Part 2

The SCDNR builds and maintains various shooting ranges throughout South Carolina for public use. This script takes various input layers (soils, wetlands, slope, flood zones, and proximity to roads) and produces a final raster dataset that shows which areas are most suitable for the construction of a new shooting range. As a result, significant cost savings are realized, and the Engineering Department can go directly to specific areas for further field verification before determining a final site location.

Keywords: Suitability Analysis, Reclassification, Site Analysis, Raster, Raster Addition, Select by Location, Extract by Mask


Potential Core Forest Conservation Areas in the Southern Blue Ridge Ecoregion - Rachel Albritton (2 part videos) Part 1, Part 2

Conservation-focused non-profit organizations such as land trusts often complete large-scale regional analyses to identify areas of high conservation value. Results from these processes are often shared within an agency and sometimes between agencies, which requires extracting subgroups of shapefiles and related information from larger datasets. Additionally, the individuals receiving the information are often not GIS experts and benefit from information that is easily accessed and digested. The purpose of this project was to create a tool that extracts previously identified areas of high conservation value (polygons), along with relevant information (e.g., land ownership, species diversity), according to user-selected county boundaries; maps these areas; and automatically generates a summary report covering the acreage of the conservation areas, species diversity for the selected areas, and acreage by landowner type.

Keywords: Conservation areas, Southern Blue Ridge ecoregion, batch processing, automated reporting



Integrating GPS data and geotagged photographs into ArcMap and Google Earth for watershed analysis - Megan Culler Part 1, Part 2

This project converts GPS data and geotagged photographs to shapefiles for analysis of a watershed's health, and converts the data to KML files so it is accessible to the public in Google Earth. GPX files containing waypoints and trackpoints are converted to shapefiles and joined with notes collected during watershed monitoring and attributes from shapefiles. Shapefiles relating to the watershed are then exported to KML, along with the geotagged photographs.

Keywords: GPS, GPX, geotagging, photographs, KML, Google Earth, watershed


Fire Incident Response Planning and Site Analysis - Joshua Tanner Part 1

The time it takes to respond to a fire incident is greatly affected by proper planning and knowledge of the site. Using geospatial technologies and concepts, situational awareness can be improved, potentially providing faster response times. Two main factors in providing a prompt and efficient response are (1) the shortest route from the fire station to the geocoded address and (2) the closest fire hydrant to the incident. The Python script in this project identifies both the shortest route and the closest hydrant and outputs them to an HTML page to help firefighters better respond to an incident.
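The closest-hydrant step can be sketched as a nearest-neighbor search, assuming projected planar coordinates. The network-based shortest route requires a routing engine and is omitted:

```python
import math

def closest_hydrant(incident, hydrants):
    """Return the hydrant nearest the incident by planar distance.

    Coordinates are assumed to be in a projected (planar) system such
    as State Plane feet; lat/lon pairs would need a geodesic distance.
    """
    return min(
        hydrants,
        key=lambda h: math.hypot(h[0] - incident[0], h[1] - incident[1]),
    )
```

With many hydrants, a spatial index (e.g. a grid or k-d tree) would replace the linear scan, but the linear version matches the scale of a single station's service area.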

Keywords: Shortest Path, KML, Web Mapping, Fire Stations, Fire Hydrants, Google Maps





Spring 2012



Within-stand Variation Analysis in Southeast Forests - Kyle J. Marion (2 part videos) Part 1, Part 2

The spatial distribution of forest stand attributes, such as diameter, height, species, and basal area per acre, is important for assessing how much variability is present within a stand. Knowing this variability helps forest managers make management decisions such as harvest planning, inventory design, and gauging regeneration success. This program provides a tool for mass-producing stand maps that depict this variation for any stand attribute available in past inventory data.

Keywords: forestry, forest stand, inventory, attribute variability, distribution, map creation


Hurricane Irene Erosion Analysis of the Outer Banks in Carteret County, NC Using LiDAR Data - Leslie John Sox (2 part videos) Part 1, Part 2

Hurricane Irene was a large and powerful Atlantic hurricane that left extensive flood and wind damage along its path through the Caribbean and the United States East Coast in August 2011. This project demonstrates the conversion of Hurricane Irene LiDAR data obtained from the NOAA Digital Coast Data Access Viewer to raster data using Python and Esri ArcGIS 10.0. The converted raster data is used to perform a geospatial analysis locating areas of erosion immediately following Hurricane Irene in the Outer Banks of Carteret County, North Carolina.

Keywords: Carteret County, NC, Outer Banks, Erosion, Hurricane Irene, GIS, LiDAR, LAS, Raster


Regional Snowfall Index Batch Processing Tool - Jon Burroughs (2 part videos) Part 1, Part 2

The Regional Snowfall Index (RSI) is a method to categorize snow storm events based on their impacts to regional populations. This project develops an ArcGIS tool that can be used to batch process one or more snow storm events and calculate the RSI values and categories for each event. The analysis includes interpolating snowstorm events, classifying snowfall extents based on regional differences and comparing these extents to population. The final output includes gridded snowfall totals and tables containing storm categories from zero (minimal impact) to five (large impact) for each climate region.
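The interpolation step (the keywords name IDW) can be sketched independently of the batch workflow. This is the textbook inverse-distance-weighting estimate, not the project's exact parameters:

```python
def idw(point, samples, power=2):
    """Inverse-distance-weighted estimate at `point`.

    `samples` is a list of (x, y, value) tuples; a sample located
    exactly at the target point is returned directly to avoid
    division by zero. A sketch of the interpolation step only.
    """
    num = den = 0.0
    for x, y, value in samples:
        d2 = (x - point[0]) ** 2 + (y - point[1]) ** 2
        if d2 == 0:
            return value
        w = 1.0 / d2 ** (power / 2.0)  # weight = 1 / distance**power
        num += w * value
        den += w
    return num / den
```

Evaluating this at every grid cell over the snowfall point totals yields the gridded snowfall surface that is then compared against gridded population by region.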

Keywords: RSI, Regional Snowfall Index, snowfall, snow storm, eastern United States, IDW, zonal statistics, snowfall point totals, gridded snowfall, gridded population, regional polygons.


Clip By Region - Jamie Hammermann (2 part videos) Part 1, Part 2

Use this script to clip one or more feature classes by user-specified regions. For each region, a directory is created containing a file geodatabase or shapefiles of the region boundary along with the clipped feature classes. In each clipped feature class, a region field is added and populated with the region name. Additionally, an output report detailing all of the processing steps and outputs is generated in HTML.

Keywords: clip, clip by region, geoprocessing, data extraction, data management


Creating Map on the Cell Phone - Ali Ihsan Durmaz (2 part videos) Part 1, Part 2

Total stations, GPS units, and theodolites are the most common surveying tools, and technological improvements now allow very accurate maps to be drawn with this equipment. In this project I create map elements (point, line, and polygon vector data types) using less accurate cell phone GPS data. To this end, I coded two Python scripts: the first gathers GPS data on the cell phone, and the second creates the map elements on the computer.

Keywords: Cell Phone, GPS, Mobile Mapping, Total Station, Surveying





Fall 2011



Making Election District Data Accessible - Jason Baker (2 part video) Part 1, Part 2

Of the great amount of data available about voting precincts, very little is presented in a format easily accessible to the general public. While questions about ease of access to polling places, relative demographics between precincts, or even which district a precinct belongs to are often trivial for a GIS professional to answer, this information is often not published in a user-friendly format. Because polling places and demographic information change frequently, it is important to have an easy method to update the data. This project creates a solution with a Python-based tool for ArcGIS to quickly create webpages containing relevant information in an easy-to-read format.

Keywords: elections, precincts, districts, shapefile, voting, demographics, html, public participation, open government


CAD Layers to Feature Classes - A Tool for Converting CAD data to GIS format - Bates Rambow (2 part video) Part 1, Part 2

CAD data of engineering drawings or surveys, known for high precision and accuracy, can be used to develop or improve GIS-based inventories of infrastructure assets. However, differences in the structures of CAD and GIS data have long been a barrier to interoperability, limiting the ability to import and process CAD data in a GIS environment. The tool presented here provides a method for converting CAD data into geodatabase feature classes based on the layer names assigned in the CAD file(s).

Keywords: CAD, Infrastructure, Asset Inventory, As-Builts, Survey, Engineering, Geodatabase, Data Conversion


Polygon Classification Analysis - Jeremy Baynes (2 part video) Part 1, Part 2

This script runs a select analysis for all unique values in a given field. Its main purpose is to separate large polygon land classification datasets into an individual feature class for each classification. Options for this script will process an entire directory of files, export the files to a geodatabase, and perform a reclassification on the datasets. The reclassification is unique to this project, but the code can easily be removed or edited for other projects.
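The select-by-unique-values idea can be sketched in plain Python as grouping records by a field value. In the actual script each group would become its own feature class via a Select analysis:

```python
from collections import defaultdict

def split_by_field(records, field):
    """Group records into one list per unique value of `field`.

    Mirrors the per-classification split in plain Python; the records
    here stand in for rows read with a search cursor.
    """
    groups = defaultdict(list)
    for rec in records:
        groups[rec[field]].append(rec)
    return dict(groups)
```

Iterating over the returned dict gives one group per land classification, ready to be written out or reclassified.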

Keywords: Land Use, Land Classification, Selection analysis, vector, polygon, Search Cursor, Update Cursor


Displaying Coincident Point Features: A Jitter Tool - Margueritte Cox (2 part video) Part 1, Part 2

Despite the value of geospatial analyses to public health, the availability of data on healthcare facility locations varies by state and region. As a result, state- and national-level analyses of access to care are often infeasible. The most readily available and reliable proxy for an exact coordinate location is the zip code; however, a zip code may contain more than one hospital, which ArcMap displays as a single point. This project provides a method, called jittering, for approximating hospital locations by adding random noise to points in order to break ties.
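A minimal jitter function might look like the following, assuming planar coordinates and a uniform offset within a disc (the project's exact noise distribution is not specified):

```python
import math
import random

def jitter(points, radius, seed=None):
    """Offset each point by random noise within `radius` of its origin.

    Breaks ties among coincident points (e.g. hospitals sharing a zip
    code centroid) so each one is visible on the map.
    """
    rng = random.Random(seed)
    out = []
    for x, y in points:
        r = radius * math.sqrt(rng.random())  # sqrt => uniform over the disc
        theta = rng.uniform(0, 2 * math.pi)
        out.append((x + r * math.cos(theta), y + r * math.sin(theta)))
    return out
```

The `sqrt` on the radial draw keeps the offsets uniformly distributed over the disc rather than clustered near the center; a fixed seed makes the displacement reproducible.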

Keywords: public health, hospital locations, healthcare access, feature to point, table join, add random noise, jittering


gInterpolate - Batch processing of GIS data for DEM interpolation using Google Maps - James J. Kim (2 part video) Part 1, Part 2

Many county GIS departments provide regular updates to their GIS data for free download and use. Compared to vector GIS data (e.g., shapefiles), raster data such as DEMs (digital elevation models) are divided into many pieces because of their large size, which grows with resolution. This program can query elevation data from Google Maps using shapefiles containing address fields, dynamically create an elevation field in the shapefiles, and generate a DEM using interpolation. Its accuracy is limited compared to conventional DEM creation (LiDAR or SRTM), but it can be used as a quick reference without manually downloading massive data.

Keywords: network, google, maps, api, interpolation, dem, json






Spring 2011



sedDEM: Sediment Transport Research Test Bed - Nathan J. Lyons

Research of sediment transport dynamics, a persistent topic since the early twentieth century, provides invaluable insight into fluvial processes, engineering applications, and aquatic habitat. The complex and stochastic nature of stream systems provides numerous challenges to researchers and new studies can benefit from a dynamic geospatial test bed that contains tools as well as other researchers’ models to test their hypotheses. An application of sediment transport dynamics, the control of bedload sediment size on salmonid distribution, is presented here in an effort to create this test bed.

Keywords: Sediment Transport, Grain Size, Fish Habitat, Stream Restoration, Hydraulic Model


Measuring Impervious Surface for Raleigh Properties - JR Greco (2 part video) Part 1, Part 2

The City of Raleigh has its planimetric layer updated annually from aerial photography. This tool takes that data and measures the amount of impervious surface for specified properties. The result is a list of the properties and their impervious surface areas broken into categories, sub-categories, and sub-types.

Keywords: Impervious Surface, Storm Water, Planimetric Data, City of Raleigh, Property, Union, Select Layer by Attributes, Select Layer by Location


Every Fifth Percentile for Z-value derived from the LIDAR data - Makiko Shukunobe (2 part video) Part 1, Part 2

To facilitate Dr. Emanuel's further analysis, this project produces text-formatted data containing a Z-value from the raster data at every 5th percentile (5%, 10%, 15%, … 100%), in ascending order, for each polygon.

Keywords:  Reforestation, Fifth Percentile, Z-value, LIDAR data
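
Once the raster's Z-values for a polygon have been collected into a list, the percentile extraction needs no GIS libraries at all. This is an illustrative reimplementation (linear interpolation between ranks), not the project's own code.

```python
def percentile(sorted_vals, p):
    """Linear-interpolated percentile (p in 0..100) of pre-sorted values."""
    if not sorted_vals:
        raise ValueError("no values")
    k = (len(sorted_vals) - 1) * p / 100.0  # fractional rank
    lo, hi = int(k), min(int(k) + 1, len(sorted_vals) - 1)
    frac = k - lo
    return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

def every_fifth_percentile(z_values):
    """(percentile, Z-value) pairs at 5%, 10%, ..., 100%, in ascending order."""
    vals = sorted(z_values)
    return [(p, percentile(vals, p)) for p in range(5, 101, 5)]
```

The output rows can then be written out as text, one polygon per file or per block.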


Semi-automatic Land Cover Classification Using Natural Breaks Method - Wei-Lun Tsai (2 part video) Part 1, Part 2

The main objective of this project is to develop a simple method of land cover classification for the top 50 cities in the U.S. using satellite imagery. The major challenge is that the same analysis procedure must be repeated across many satellite images of an area. The procedure reclassifies images with the Natural Breaks method, converts raster to point, and adds and updates field values. A Python tool is developed to reclassify satellite imagery by the Natural Breaks method and assign a land cover type to each class produced by the reclassification.

Keywords: Image reclassification, Land cover classification, Natural Breaks 


Parcel Relevance to 911 Phone Calls - Brian Ross (2 part video) Part 1, Part 2

One of the largest issues since the advent of cellular phones is how to uniformly handle emergency calls originating from wireless devices. Wireless devices are required by law to provide a latitude and longitude for the caller, but unless the caller is out in the open or relays an actual address, locating an emergency can be time consuming and inefficient. This project helps determine which parcels are most likely associated with the latitude and longitude provided by wireless carriers, in order to prioritize locations when searching for callers.

Keywords: Wireless Phone Calls, Emergency Services, 911, Geolocation, Parcel Selection


Analyzing weekly chlorophyll a data in the Gulf of Mexico - Amy Nau (2 part video) Part 1, Part 2

Satellite data can be used to analyze chlorophyll a concentrations in ocean waters to help identify the presence and characteristics of harmful algal blooms. Weekly (8-day) chlorophyll a data from the SeaWiFS satellite are available from NASA's GES-DISC Interactive Online Visualization And aNalysis Infrastructure (Giovanni) website. Python was used to convert these data files to rasters and, ultimately, to either a GIF image time series or a composite TIFF file.

Keywords: Chlorophyll a, Gulf of Mexico, SeaWiFS, Multi-band Raster


More Spring 2011 Projects...



Fall 2010



Raster Extractor - John Lloyd (2 part video) Part 1, Part 2

When analyzing study sites, a researcher often needs to extract the portions of rasters covering specific areas. The Pine Beetle Forest Fragmentation study at NCSU has a set of National Land Cover Datasets (NLCD) from which it needs to create a raster covering exactly the study area. In some instances the study area can be extracted from one raster; in others, portions of many rasters must be extracted and mosaicked together. The distribution and amount of pine forest can then be determined from the single raster aligned with the study area. This project implements a tool that uses a shapefile to define the study area, clips a directory of rasters to that area, and finally mosaics the clipped rasters. While the tool was created for the Pine Beetle Forest Fragmentation study, it is applicable to any project using rasters.


Critical Areas Map Generator - Matthew Whitehead (2 part video) Part 1, Part 2

One major area of concern for any municipality is the availability and protection of clean water sources for its citizens. The map created by this project incorporates buffered areas around water bodies along with impervious surfaces, roads, toxic waste sites, large industrial buildings, etc. After the buffers, merges, and intersects, the resulting map displays areas of water quality concern. This is achieved by running a script that merges files based on shape type and/or keywords. The merged files are then buffered to create areas of effect and subsequently intersected in various ways to create different area-of-concern maps. Ultimately, a critical areas map is created from the intersection of all the buffers.


Mill Production Analysis - Ray Urban (2 part video) Part 1, Part 2

A major forest product company owns mills in North Carolina and Virginia; raw material is harvested from many nearby counties for these mills.  A map displaying the total tons produced from each county allows wood procurement management to quickly see the origin of the raw material.  These maps are not routinely produced due to the time required if created manually.

This project used Python to sum the individual load receipts for each county for the selected mill, product, and time period. The sum of each county's tons is displayed on the map using symbology.

The ability to quickly display production by county has business, strategic, and tactical implications. The receipts-by-county data are used to determine severance tax payments, to provide harvest information to the U.S. Forest Service for Forest Inventory Analysis purposes, and to understand short-term wood flows. It is anticipated that data entry accuracy will improve with the ability to routinely review the map for each mill.

However, the primary benefit of the maps will be to evaluate and compare wood flows, especially over different seasons.
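
The county summation is a straightforward group-and-sum over the receipt records. A sketch with hypothetical field names (mill, product, date, county, tons) might look like:

```python
import csv
from collections import defaultdict

def tons_by_county(receipt_rows, mill, product, start, end):
    """Sum load-receipt tons per county for one mill, product, and date range.
    Dates are ISO yyyy-mm-dd strings, which compare correctly as text.
    The field names (mill, product, date, county, tons) are hypothetical."""
    totals = defaultdict(float)
    for row in receipt_rows:
        if (row["mill"] == mill and row["product"] == product
                and start <= row["date"] <= end):
            totals[row["county"]] += float(row["tons"])
    return dict(totals)

# Reading from a CSV export of the receipts table:
# with open("receipts.csv", newline="") as f:
#     totals = tons_by_county(csv.DictReader(f), "Mill A", "pine",
#                             "2010-01-01", "2010-12-31")
```

The per-county totals would then drive the map symbology (e.g. graduated symbols).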


Batch Reproject and Merge of LiDAR Data - Bradley Neish (2 part video) Part 1, Part 2

At work, we have a large collection of LiDAR data that is used for a variety of projects and is regularly updated and added to. Currently, a footprint shapefile for each spatially explicit region of LiDAR data is stored in a series of folders and subfolders. These are individually reprojected and merged to create a single shapefile showing the extent of all LiDAR data. The objective of this project is to create a script that automates the entire process. The master shapefile shows the total extent; one of its main uses is as an input to an algorithm that produces a national dataset, so it is necessary to see where LiDAR coverage exists. The script can be generalized and posted in the Esri script gallery.
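
The first stage — walking the folder tree to collect every footprint shapefile — can be sketched as below; the reprojection and merge steps would then be applied to the returned paths (e.g. with arcpy's Project and Merge tools). The folder layout and file naming here are assumptions.

```python
import os

def find_footprints(root):
    """Recursively collect footprint shapefile paths under root.

    Assumes each spatially explicit LiDAR region keeps its footprint as a
    .shp somewhere below root (the exact layout is hypothetical)."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if name.lower().endswith(".shp"):
                hits.append(os.path.join(dirpath, name))
    return sorted(hits)
```

Sorting makes the batch order deterministic, which helps when re-running the script after new tiles are added.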


Application of DWQ Headwater Stream Models - Susan Gale

The NC Division of Water Quality Headwater Stream Spatial Dataset project has produced several statistical models to predict the locations of small headwater streams within specific ecoregions, based on terrain characteristics derived from Digital Elevation Models (DEMs). The output of these models is a set of GIS shapefiles depicting headwater streams of consistent and known accuracy. Applying these models requires the production of a relatively large number of rasters (20-30) from the base DEM, followed by a number of manipulations to produce the final vector dataset. Automating this process allows batch processing of multiple DEMs, takes less time to complete, and greatly reduces the risk of errors inherent in a multi-step process.


More Fall 2010 Projects...



Spring 2010



Wireless Sensory Network Recording Animal Movements - Lauren Charles (2 part video) Part 1, Part 2

Wireless sensor network (WSN) units have been developed to record interactions between individual moving animals via collar devices (WSN mobile nodes) and designated set stations (WSN stable nodes). The information from these WSNs includes a variety of sensor readings: time, location, power setting (relative distance) from other WSN units, temperature, and light. The aim of this project is to convert the raw WSN output, preprocess the data using Python, and then import it into ArcGIS. This is the first step in creating a user-friendly VBA interface in ArcGIS where the geographic and sensory data can be visualized, manipulated, and tested against different models of interest.


Surface Profile Analyzer - Paul Paris

My research looks at how barrier island morphology adjusts to changing environmental conditions through time. Theory holds that these islands are self-sustaining, adjusting in accord with external forcing, but new research calls these long-standing notions into question.

One key measure of morphological character is the cross-island profile. The cross-island profile is simply an x-y plotted sample of the terrain surface (elevation) at regular intervals along a prescribed transect. Existing tools do not incorporate the time dimension. This tool generates time-based statistics---minimum, maximum, and mean surface, and range profiles---for two or more co-located digital elevation surfaces computed along a user-defined transect.
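
The profile statistics reduce to sampling each surface at the same stations along the transect and taking per-station minimum, maximum, and mean. A sketch, with each DEM represented abstractly as a callable z = f(x, y) standing in for a raster sampler:

```python
def sample_profile(surface, start, end, n_samples):
    """Sample a surface z = f(x, y) at n_samples (>= 2) evenly spaced
    stations along the transect from start to end (both (x, y) tuples)."""
    (x0, y0), (x1, y1) = start, end
    profile = []
    for i in range(n_samples):
        t = i / (n_samples - 1)  # 0 at start, 1 at end
        profile.append(surface(x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return profile

def profile_stats(surfaces, start, end, n_samples):
    """Per-station min, max, and mean elevation across co-located surfaces
    (one surface per survey epoch), along the same transect."""
    profiles = [sample_profile(s, start, end, n_samples) for s in surfaces]
    stations = list(zip(*profiles))  # group values by station across epochs
    return ([min(s) for s in stations],
            [max(s) for s in stations],
            [sum(s) / len(s) for s in stations])
```

The range profile mentioned above is simply max minus min at each station.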


Scripting North Carolina Annexation Laws - James Armstrong

Involuntary municipal annexation in North Carolina requires a four-prong test that can take many hours of computation and counting by hand. Usually, several iterations of the entire test are needed before a combination is found that satisfies all of the required tests. The test includes: 1) the area proposed for annexation must be contiguous; 2) at least 1/8 of the aggregate external boundary must coincide with the current municipal limit; 3) no part of the area can be within another municipality; and 4) the number of people per acre within the region must meet a threshold. The project automates these tests for a set of selected properties and provides a graphical user interface (GUI) to elicit user input. The results indicate whether the selected area meets the statutory standards for annexation.
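
Once the geometric measurements are in hand, the four prongs reduce to simple comparisons. A sketch, with a placeholder density threshold (the actual standard is set by N.C. statute, and contiguity and overlap would come from spatial queries rather than booleans):

```python
def annexation_checks(shared_boundary_len, total_boundary_len,
                      population, area_acres,
                      is_contiguous, overlaps_other_municipality,
                      min_density=2.3):
    """Evaluate the four statutory prongs for a proposed annexation area.

    min_density (persons per acre) is a hypothetical placeholder; consult
    the N.C. General Statutes for the real figure. The two boolean inputs
    stand in for spatial tests run against the parcel and municipal layers."""
    return {
        "contiguous": is_contiguous,
        # At least 1/8 of the external boundary must touch the current limit.
        "one_eighth_boundary": shared_boundary_len >= total_boundary_len / 8.0,
        "outside_other_municipalities": not overlaps_other_municipality,
        "density": (population / area_acres) >= min_density,
    }
```

An area passes only if every entry in the returned dict is True, which mirrors the iterate-until-all-prongs-pass workflow described above.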


Glacial Buzz Saw DEMs - Sean Gallen (3 part video) Part 1, Part 2, Part 3

The glacial buzz saw theory proposes that glaciers are such effective agents of erosion that they are able to subdue most mountain range elevations to at or near the snowline altitude. If this is true, a topographic signature marked by both a concentration of land area and a decrease in hillslope steepness near the snowline altitude should be identifiable. Many studies around the world have tested this hypothesis using both hypsometry (the proportion of land area per unit elevation) and mean slope per unit elevation to support or refute evidence for a glacial buzz saw. However, no standard or easily reproduced method has been developed to construct these data plots from digital elevation models (DEMs). This project solves that problem with an easy-to-use script that generates data tables from DEMs for constructing hypsometry and mean slope vs. elevation plots.
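
Both plots come from binning DEM cells by elevation: hypsometry is the share of cells per elevation bin, and the slope curve is the mean slope per bin. A minimal sketch over flattened cell arrays (the bin size is an arbitrary illustration, and real cell values would come from the DEM and a slope raster):

```python
from collections import defaultdict

def hypsometry(elevations, slopes, bin_size=100.0):
    """Per elevation bin: (bin lower edge, proportion of land area as a
    cell fraction, mean slope), from flattened, paired DEM cell arrays."""
    count = defaultdict(int)
    slope_sum = defaultdict(float)
    for z, s in zip(elevations, slopes):
        b = int(z // bin_size) * bin_size  # lower edge of the elevation bin
        count[b] += 1
        slope_sum[b] += s
    n = len(elevations)
    return sorted((b, count[b] / n, slope_sum[b] / count[b]) for b in count)
```

A buzz-saw signal would show up as a spike in the area column and a dip in the mean-slope column near the snowline bin.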


Matching Address IDs with Land Parcel IDs - Josh Frederick

In the Wayne County Planning Department, one ongoing task is maintaining addresses using GIS. This is done simply with high-resolution orthographic imagery and the Editor toolbar in ArcMap, using points as address locators. Address points coincide with land parcels, which are maintained as a separate shapefile within the tax department. The accuracy of the parcel lines within the GIS is important. Each parcel has an auto-generated, unique Parcel Identification Number (PIN). When address points are created and/or modified by the Planning Department, the PIN attribute of each point must be manually entered or updated according to the PIN of the parcel in which it sits. Sometimes these PINs go unmatched by mistake, so this project aims to provide a quick way to correct the problem using a simple Python script, VBA code, and a custom interface in ArcMap.




More Spring 2010 Projects...

Fall 2009



Real Estate Property Evaluation - Doug Browning (3 part video) Part 1, Part 2, Part 3

This tool helps a real estate agent build a watch list for properties of interest for subdivision development. The user selects a desirable parcel, and then clicks a button which launches a form to allow the user to add notes about the parcel and automatically add the parcel and notes to a watch list stored as a personal geodatabase.  Original parcel fields are included in the new watch list table entry. Current county aerial photos are useful for determining parcel desirability.  The user can press a button to launch a script which downloads up-to-date aerial photo data from a website, unzips the downloaded files, and adds them to the display as a group layer.



Polygon Repair Tool - Holly Brackett Part 1

Polygon shapefiles can be generated automatically from photography via Feature Analyst software. The software analyzes pixel contrast, draws polygons around similar areas, and assigns a classification value to each polygon. The photographs are often a set of tiled images. Many of the polygons fall across tile boundaries, so they are split along those boundaries. This project automates the process of recovering the true polygon geometry. It dissolves and merges the quads into one drawing, performs a multipart-to-singlepart conversion on the polygons, exports the forest polygons, and finally computes and adds new fields (area, length, ratio, and acreage) for each output polygon.

Invasive Species Selection - JJ Scott (2 part video) Part 1, Part 2

NC State Parks need a tool to aid in prioritizing invasive species removal by allowing the user to select any state park in North Carolina, select prioritization criteria, assign prioritization values, and add the desired files to a map for further analysis. This project begins to develop that tool by displaying, within selected parks, the habitats of species that NC Natural Heritage has identified as ecologically significant. The user selects a state park, then selects the types of significant species habitats of interest, and the desired files are automatically added to the map.



Analyzing Shoreline Movement - Onur Kurum (3 part video) Part 1, Part 2, Part 3

Beach erosion is a chronic problem along most open-ocean shores of the United States. As coastal populations continue to grow and community infrastructure is threatened by erosion, there is increased demand for accurate information regarding past and present shoreline movement rates and trends. This study calculates erosion rates at the coast by comparing the earliest recorded shoreline position, dating to around 1857, with sets of shoreline positions recorded since then.


Trends in Parks Tourism - Stacy Supak

Tourism is a billion-dollar industry worldwide. Critical to tourism planning is research to better understand who is traveling where, when, and from how far. Using zip code information contained in a database of all online reservations made for overnight stays in federal parks since 1999, this project creates a shapefile for each year containing park and customer locations and the distance traveled between them. These files will be useful in a multitude of future analyses, including examining the annual and monthly point density of parks and customers to aid planning and marketing efforts for these facilities.
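
The distance-traveled field is a great-circle distance between the customer's zip-code centroid and the park location. A standard haversine sketch (spherical-Earth assumption; the project's actual distance method is not specified):

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0  # mean Earth radius, spherical approximation

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points,
    e.g. a customer's zip-code centroid and a park location."""
    phi1, phi2 = radians(lat1), radians(lat2)
    dphi = radians(lat2 - lat1)
    dlam = radians(lon2 - lon1)
    a = sin(dphi / 2) ** 2 + cos(phi1) * cos(phi2) * sin(dlam / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))
```

Computed once per reservation record, the result populates the distance attribute of each yearly shapefile.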






Spring 2009


Calculating Area Percentage - Matt Sumner

This tool calculates the area percentage of biodiversity and wildlife habitat assessment (BWHA) polygons within North Carolina census tracts. The BWHA shapefile is 1.39 GB with over four million records and causes Esri's geoprocessing tools to fail with memory limitation errors. This application solves the problem by means of Python scripting.

NPS Fish Species Application - Sarah Nelson (3 part video) Part 1, Part 2, Part 3

The National Park Service conducted a regional study covering several National Park Service units with similar ecological characteristics. During this study, data were collected on fish species found at various sites across Virginia and Pennsylvania between August 2002 and June 2005. The research data have been compiled into an Access database containing site IDs that correspond to two shapefiles representing the physical locations of those sites. This project involves building an interface in ArcMap that queries the database for locations where a particular fish species was found. The resulting sites are added to the map as a new shapefile. The National Park Service plans to use this application as a way for individual parks to show the locations of fish of interest. Researchers or park personnel will have a better understanding of where a collection point or transect is located and of the surrounding landscape.


Proportional Attribute Aggregator Tool - Jeff Essic

Often, people seeking demographic data (population, household numbers, income, etc.) want to know the demographic profile within a certain radius of a set location, or they may want demographic data for a particular, distinctly defined area. Both of these cases (a buffer or an irregular polygon) have typically involved selecting the Census or other demographic boundary data available for the area of interest and summarizing as best as possible, because the demographic data polygons and study area polygons seldom align. This tool allows the user to overlay a set of polygons and determine the proportional Census information for each polygon (assuming an even distribution).
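
The proportional allocation itself is simple arithmetic: each census polygon contributes its attribute value scaled by the fraction of its area that overlaps the study polygon. A sketch (the areas would come from an intersection overlay; the even-distribution assumption is the tool's own):

```python
def apportion(census_value, census_area, overlap_area):
    """Share of a census polygon's attribute falling inside the study area,
    assuming the attribute is evenly distributed over the polygon."""
    return census_value * (overlap_area / census_area)

def aggregate(study_overlaps):
    """Total apportioned value over (value, census_area, overlap_area)
    tuples, one per census polygon intersecting the study area."""
    return sum(apportion(v, a, o) for v, a, o in study_overlaps)
```

For example, a tract of 100 people half inside a buffer contributes 50 to the buffer's estimated population.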

Terrestrial Vertebrate Query for Gettysburg National Military Park and Eisenhower National Historic Site - Brent Fogleman

The National Park Service (NPS) conducted a terrestrial vertebrate survey in Gettysburg National Military Park and Eisenhower National Historic Site in Pennsylvania between 1992 and 1996. A Microsoft Access database was established in 2002 as a record of the vertebrate species. The NPS requires a user interface that allows the user to select a vertebrate by Common Name or Species Code from a drop-down menu. When the selection is complete, known locations for the species are displayed as a layer in an ArcMap map document.

More Spring 2009 Projects...