More student projects
Fall 2012
|
Comma Separated Value File Conversion - Brad Bogovich |
This project focuses on the conversion of a CSV table of fire ignition data to a single shapefile representing the entire data set.
Keywords: CSV, comma separated value file, Make XY Event Layer, Feature Class to Shapefile Conversion
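A minimal arcpy sketch of the CSV-to-shapefile workflow named in the keywords, assuming a hypothetical fires.csv with LATITUDE/LONGITUDE columns; the project's actual paths and field names may differ:
import arcpy

csv_file = r"C:\data\fires.csv"          # hypothetical CSV of ignition records
out_folder = r"C:\data\shapefiles"
sr = arcpy.SpatialReference(4326)        # assumes WGS 1984 lat/long coordinates

# Build an in-memory XY event layer from the table, then persist it as a shapefile.
arcpy.MakeXYEventLayer_management(csv_file, "LONGITUDE", "LATITUDE", "fires_layer", sr)
arcpy.FeatureClassToShapefile_conversion(["fires_layer"], out_folder)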
|
NatureServe's remote sensing/GIS Ecological Integrity Analysis method evaluates conservation sites in their landscape setting. Inputs are shapefiles of sites and an input raster of landcover, generally NatureServe’s National Map of Ecological Systems. Sites are evaluated on landscape connectivity, landuse, size, and buffers. Example data is from wetlands in New Jersey.
Keywords: ArcGIS 10.1, buffer, conservation sites, ecological integrity index, ecological system, EIA, landcover, tabulate area, landscape integrity, raster, shapefile, table to dBase, wetlands, New Jersey
|
Takes a single polygon shapefile, greedily partitions its constituent polygons on the basis of a user-defined numeric field, and stores the resulting districts in a new File Geodatabase.
Keywords: Gerrymander, Redistrict, Adjacent, Adjacency, Sort, Tuples, Dictionaries, Array, Lambda, List Comprehension, Selection, Select By Location, SearchCursor
|
Tool that takes a Cumberland County parcel shapefile and LAS points to create an IDW DEM. The files can be uploaded into Google Earth and mapped out using a cell phone GPS.
Keywords: DEM, Property Line, Parcel, Address, Cumberland County
|
Synopsis: Gathers zipped DEM files from an online FTP directory, creates its own working directory and geodatabase, extracts the elevation data pertinent to the grey space being evaluated, and exports GRID files along with a graphic to the interface.
Keywords: DEM, greyspace, slope, extract
|
In addition to monitoring and forecasting tropical cyclones during the current storm season, the National Hurricane Center (NHC) also provides storm track data for every tropical cyclone in the Atlantic Basin since 1851. The State Climate Office of North Carolina stores these historical observations in its CRONOS database, which it provides online through its Hurricane Search interface. While statewide tropical cyclone statistics are readily available through the climate office website, many users desire statistics down to the county scale to determine how often storms affect their location. The storm track data provided by the NHC does not include the storm’s diameter, so a buffer distance from the storm’s track must be estimated to determine whether the county “was affected by” a given storm. With a given buffer distance provided, an analysis of the storm track data can then be performed to obtain a frequency of tropical cyclones that have affected the given county.
Keywords: tropical cyclones, hurricanes, climatology, climate
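A hedged sketch of the county-scale frequency count described above, assuming hypothetical track and county shapefiles and a 75-mile influence radius (the abstract notes this distance must be estimated):
import arcpy

tracks = r"C:\data\nhc_tracks.shp"      # hypothetical NHC storm track lines
county = r"C:\data\county.shp"          # hypothetical county boundary
buffers = r"C:\data\track_buffers.shp"

# Buffer each track by the assumed influence distance, then count how many
# buffered storms intersect the county of interest.
arcpy.Buffer_analysis(tracks, buffers, "75 Miles")
arcpy.MakeFeatureLayer_management(buffers, "buf_lyr")
arcpy.SelectLayerByLocation_management("buf_lyr", "INTERSECT", county)
count = int(arcpy.GetCount_management("buf_lyr").getOutput(0))
print("Storms affecting the county: {0}".format(count))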
|
The National Park Service requires fire management data to be made available in visual format. This enables park officials to quickly identify areas affected by fire damage as well as the date and time of fire ignition. In order to transfer the raw data into visual format, csv data files need to be converted to shapefiles so that this information can be viewed in ArcMap. As there are numerous records to transfer, it is beneficial to create a programming script that performs batch processing on the data files. The csv2shp script will perform this task and allow a GIS technician to manipulate the resulting data in a map document.
Keywords: csv, Comma Separated Values, shp, Shapefile, ArcMap, TableToTable Conversion, Import X,Y Data, Write File, NPS Fire Data, ReportingUnitID
|
Spring 2012
A construction company wants to automate the process of location selection for its projects.
Keywords: shapefile, boundary, Optimal, Location, Parcel, roads, schools
|
A food desert is any area in the industrialized world where healthful, affordable food is difficult to obtain. Food deserts are prevalent in rural as well as urban areas and are most prevalent in low-socioeconomic minority communities. They are associated with a variety of diet-related health problems. This analysis will attempt to identify the highest risk areas in Charlotte-Mecklenburg, NC.
Keywords: Food Deserts, Poverty, Health, Grocers, Fastfood, bad diets, shapefiles, rasters, reclassify, weighted overlay
|
New Mexico is a significant producer of construction sand, gravel, and stone. Thousands of construction companies compete to capture potential customers and market share. The goal is to estimate the market share in Albuquerque and surrounding areas by using different geoprocessing tools in order to identify potential benefit or loss opportunities for construction materials.
Keywords: Business plan, Weighted Overlay, Euclidean distance, Construction Industry, Geocoding, highway, and shapefile
|
CSV to SHP - Robert A. Hurd |
Oftentimes, files need to be modified and converted to different file formats. The script created for this project modifies a user-defined CSV file and then creates multiple shapefiles for pre-selected data within the CSV file. An ArcMap button is also created, which prompts the user to select the particular CSV file for conversion and the output folder where the shapefiles will be saved.
Keywords: CSV, Comma-Separated Values, SHP, Shapefile, ArcMap, Button, File Conversion, Excel, Table, Database, Spreadsheet
|
Production of high accuracy digital elevation models (DEMs) can be achieved using laser radar data, also known as lidar. Current providers of free raw lidar data, such as the United States Geological Survey (CLICK website), make the retrieval of large data sets convenient and timely. Unfortunately, these files are not immediately usable in the ArcGIS software suite. To make use of this data, conversion to an appropriate format (such as the point data produced by the LAS to Multipoint geoprocessing tool) is necessary. This process is fairly straightforward when handling single LAS files or multiple LAS files located in one folder. However, when multiple pages of files are downloaded from the CLICK website, the resulting data set consists of numerous zipped folders. These, in turn, contain multiple zipped LAS files, all of which are stored in one primary folder. My script performs batch processing on such folders and converts the raw LAS data into multipoint features. It also provides the user the option to convert and mosaic the new multipoint files into two separate DEMs, using two different interpolation methods.
Keywords: lidar, DEM, batch, alamance, IDW
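A minimal sketch of the batch unzip-and-convert step, assuming hypothetical folder paths and a 2-meter average point spacing; the mosaicking and interpolation options described above are omitted:
import arcpy, glob, os, zipfile

zip_dir = r"C:\data\click_downloads"    # hypothetical folder of zipped LAS tiles
las_dir = r"C:\data\las"
arcpy.CheckOutExtension("3D")

# Unzip every archive into one folder, then convert all LAS files to multipoint.
for z in glob.glob(os.path.join(zip_dir, "*.zip")):
    zipfile.ZipFile(z).extractall(las_dir)

las_files = glob.glob(os.path.join(las_dir, "*.las"))
arcpy.LASToMultipoint_3d(las_files, r"C:\data\lidar.gdb\multipoints", 2)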
|
The City of Charlotte regularly inspects where the sewer mains cross the streams to ensure that they are structurally sound and not leaking. Currently these “Critical Assets” add up to about 650, and each one of these is inspected on an annual, semi-annual, or quarterly basis. These inspections add up quickly, resulting in a long day of post-processing when the inspection unit is downloaded. This project is meant to automate the post-processing and create the required shapefiles to prepare the inspection unit for the next round of inspections.
Keywords: Point, Append, Copy, Select by Attributes, Calculate
|
The National Park Service manages natural and historic properties; impacts from construction adjacent to park lands can modify the integrity of national park lands. The agency will utilize the application to identify potential impacts to national park lands early in the planning process to avoid any loss of natural or historic properties from either direct or indirect impacts of construction activities.
Keywords: National Park Service, national landmarks, historic properties, environmental impact, federal lands, anthropogenic activity, National Environmental Policy Act
|
An ongoing dispute in the meteorology community is whether one of the most prominent weather observing stations, known as ASOS (Automated Surface Observing System), is being affected by the urban heat island effect. The RDU ASOS unit has been the primary focus of this argument, so I plan to analyze this by looking at the building footprint and land usage in a 3-mile radius around the station. I will measure the area of developed land in this radius and then compare it to the same data at a nearby Wake County weather station (the Lake Wheeler ECONet station, LAKE).
Keywords: meteorology, urban heat island, weather stations, land cover, buffer, clip
|
This study will help us identify some factors that play a role in teenage pregnancy in Wake County with reference to ZIP codes (environmental influence), mother’s education and age, and ethnicity. About 4% of teenage girls become mothers each year, and for some it is not their first birth.
Keywords: teenage, pregnancies, wake county, zipcodes, maternal health
|
Prior to the placement of a natural gas production well, a pre-drill survey is performed to establish a record of the current water quality conditions of the properties surrounding the proposed drilling location. The surrounding land parcels must be identified and tracked through the sampling and reporting process. Due to overlapping pad radii, an individual parcel may need to be tracked for multiple well pads. The aim of this project is to develop an efficient process to identify each well pad radius that a series of parcels intersects and generate a csv file with a record and unique identifier for each pad a parcel is identified with.
Keywords: Parcels, Sampling Survey, GIS, Exploration and Production, Natural Gas, Parcel Tracking, Buffering, CSV
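A sketch of the parcel-tracking idea under stated assumptions (hypothetical shapefile paths, a half-mile pad radius, and OID-based identifiers standing in for the project's real unique IDs):
import arcpy, csv

pads = r"C:\data\well_pads.shp"       # hypothetical well pad points
parcels = r"C:\data\parcels.shp"      # hypothetical parcel polygons
buffers = r"C:\data\pad_radii.shp"

arcpy.Buffer_analysis(pads, buffers, "2640 Feet")   # assumed pre-drill radius
arcpy.MakeFeatureLayer_management(buffers, "pad_lyr")
arcpy.MakeFeatureLayer_management(parcels, "parcel_lyr")

with open(r"C:\data\pad_parcels.csv", "wb") as f:
    writer = csv.writer(f)
    writer.writerow(["PAD_FID", "PARCEL_FID"])
    with arcpy.da.SearchCursor(buffers, ["OID@"]) as pad_cur:
        for (pad_oid,) in pad_cur:
            # Isolate one pad radius, then record every parcel it intersects;
            # a parcel can therefore appear under more than one pad.
            arcpy.SelectLayerByAttribute_management("pad_lyr", "NEW_SELECTION",
                                                    "FID = {0}".format(pad_oid))
            arcpy.SelectLayerByLocation_management("parcel_lyr", "INTERSECT", "pad_lyr")
            with arcpy.da.SearchCursor("parcel_lyr", ["OID@"]) as par_cur:
                for (parcel_oid,) in par_cur:
                    writer.writerow([pad_oid, parcel_oid])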
|
Comma-separated Values to Shapefile - Matthew Carter |
Use this script to convert CSV files into Shapefiles. Files used by the end user might need to be modified before converting to another file format. A button located within ArcMap will prompt the end user to select which CSV file needs to be converted and the final location of the newly created Shapefile.
Keywords: Database, Excel, CSV, Shapefile, Make XY Event Layer, ArcMap, File I/O, GUI
|
Maritime Information Safety Bulletins are issued by the United States Coast Guard, imposing restrictions and providing travel warnings to help maintain the integrity of the US maritime transportation system. This script allows the user to import a spreadsheet containing bulletin information and converts the data to KML to be displayed in Google Earth.
Keywords: Maritime Transportation, Coast Guard, Boating Safety
|
CSV to Shapefile Tool - Scott Hicks |
This script tool converts a csv file containing NPS fire ignition data into point shapefiles. A master shapefile will be created that contains all data points from the input csv file. Additional shapefiles will be created for each unique reporting unit (National Park, Historic Site, etc.). All output shapefiles will be written to a directory specified by the user.
Keywords: csv, shapefile, csv conversion, shapefile creator, fire, NPS, National Park Service, fire ignitions, data conversion, data management
|
Raw multibeam sonar data is constantly coming into the office, and the time it takes to process this data can be very extensive. The raw multibeam data is just a small piece of the puzzle; to make it usable for planning purposes, it must be turned into final products. A GRID for surface analysis is the first product, from which several others are then created, such as contours, hillshades, aspect rasters, and slope rasters. For all these final products to be created, the raw multibeam data must first be converted into a format that Esri will accept. To help with the data formatting and the multiple geoprocessing executions, we need a customized tool for the production process.
Keywords: Multibeam Survey Data, XYZ, ASCII 3D, GRID, Surface Analysis, Slope, Aspect, Contour, Hydrographic Survey, Bathymetry
|
Beginner ArcMap students often have ArcGIS files scattered in random folders due to lack of organization. This tool searches a given root directory, finds all of the commonly used ArcMap files, and outputs an HTML file showing where they all are. It also gives the option to make a copy of the files and place them in an organized folder.
Keywords: File Organization, ArcGIS, Feature Class
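A minimal sketch of the directory scan and HTML report, assuming a hypothetical root folder and a short list of file types standing in for the "commonly used" ArcMap files:
import os

root = r"C:\student_work"                        # hypothetical root directory
extensions = (".shp", ".mxd", ".lyr", ".gdb")    # assumed file types of interest

found = []
for dirpath, dirnames, filenames in os.walk(root):
    # File geodatabases are folders, so check directory names as well.
    for name in filenames + dirnames:
        if name.lower().endswith(extensions):
            found.append(os.path.join(dirpath, name))

with open(os.path.join(root, "inventory.html"), "w") as report:
    report.write("<html><body><h1>ArcGIS files found</h1><ul>\n")
    for path in found:
        report.write("<li>{0}</li>\n".format(path))
    report.write("</ul></body></html>\n")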
|
The National Park Service needs a tool that can identify whether any anthropogenic changes would affect areas of interest. Areas of interest, for them, include National Natural Landmarks, National Trails, Wild and Scenic Rivers, and so on. I will be creating two tools: one that allows them to interactively select an area and see whether any areas of interest in the near vicinity of the selected area would be affected. The second tool does almost the same thing, but instead of checking in the vicinity of the selected area, it checks for areas of interest within the selected area.
Keywords: NPS, Interagency Compliance, Parks Analysis, Buffer, Clip, shapefiles, vector data
|
In the Pacific Northwest, oil spill response equipment is listed in a regional database called the Western Resource Response List (WRRL). This database is used to order and track resources in the event of a large spill, compare equipment locations and capabilities to oil spill contingency plan requirements, and provide information to the public and other stakeholders. This project developed a script that reads information from the online database and creates equipment layers for all the organizations and maps equipment locations based on latitude and longitude. It also creates a summary file describing the capabilities of each organization towards Washington State contingency plan requirements for containment boom, oil skimmers, and oil storage.
Keywords: Oil Spill, Cleanup, Equipment, Washington State, WRRL, database, XY Event Layer, Planning Standards
|
This project is going to find favorable locations for black bears in the Great Smoky Mountains. This would help reduce interference of the bears with visitors to the park. A weighted overlay analysis would be carried out to determine the favorable locations. Several factors such as proximity to roads, streams, trails, and certain vegetation types would help in determining these favorable locations.
Keywords: Suitability analysis, shapefile, raster, feature class, weighted overlay, Euclidean distance, slope, elevation, reclassify, Great Smoky Mountains.
|
Each region of the US Fish and Wildlife Service is responsible for maintaining a Cadastral Geodatabase; these are joined to create the national FWS Cadastral Geodatabase available on the Service’s webpage. The Southeast region recently transitioned from a File Geodatabase to an Enterprise Geodatabase. The naming conventions of the Enterprise geodatabase limit use of certain tools that take multiple feature classes as inputs. This project creates a tool to solve those problems by generating a File Geodatabase with specified feature classes and fields.
Keywords: Enterprise Geodatabase, Create Geodatabase, Export to Feature class, Cadastral Geodatabase, National Wildlife Refuge.
|
This tool cleans the Wake County parcels data in order to make them usable. The parcels are reorganized based on APA criteria into four property types.
Keywords: Land Values, Raleigh, Wake County, Property Type, Raster
|
This project reads in *.csv files of precipitation data and converts them into shapefiles. Next, it performs various interpolations (Kriging, Radial Basis, IDW) and exports these to geostatistical layers. It then compares root mean square errors by means of cross-validation. Finally, it exports the interpolation with the lowest RMSE to a raster and extracts that raster using a North Carolina mask. (So this is limited to NC for now.)
Keywords: precipitation, interpolation, shapefile, .csv, RMSE
|
National Centers for Environmental Prediction (NCEP) surface wind data are applied to the SWAN ocean wave model to conduct wave hindcasting. When the hindcasting is conducted, the wind data are converted to match the wind input of the wave model, and several different spatial and temporal scales are used. A certain period of wind speed data is necessary, so a certain amount of data needs to be converted. The purpose of this study is to create code for conducting these cumbersome operations.
Keywords: netCDF data, ASCII data, ArcGIS, SWAN wave model
|
For each severe weather event, local storm reports are taken and saved into an Excel document. This project creates an xy layer and, ultimately, a shapefile of the storm reports. It then finds each unique value in the event field, selects each event type (hail, tornado, wind, etc.), and creates a layer for each event. Each event will then be counted to determine the number of reports (this can help determine if the reports are legitimate).
Keywords: Storm, Reports, Weather, Hail, Tornado, Wind, Lightning, Flood, Time, Latitude, Longitude, GetCount, xy Layer,
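A hedged sketch of the per-event split and count, assuming the reports have already been exported to a hypothetical storm_reports.shp with an EVENT field:
import arcpy

reports = r"C:\data\storm_reports.shp"   # hypothetical shapefile built from the Excel sheet
event_field = "EVENT"                    # assumed field holding hail/tornado/wind, etc.

# Collect the unique event types, then export and count each one.
events = {row[0] for row in arcpy.da.SearchCursor(reports, [event_field])}
for event in events:
    out_fc = r"C:\data\reports_{0}.shp".format(event.replace(" ", "_"))
    where = "{0} = '{1}'".format(arcpy.AddFieldDelimiters(reports, event_field), event)
    arcpy.Select_analysis(reports, out_fc, where)
    count = int(arcpy.GetCount_management(out_fc).getOutput(0))
    print("{0}: {1} reports".format(event, count))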
|
Using economic data from the Bureau of Economic Analysis, this script calculates a (somewhat iffy) index of economic activity, called the County Similarity Index, for counties in the continental US. The script populates a new field of index values in the counties shapefile's attribute table and produces a map, color-coded by index groups, using a predefined template.
Keywords: economics, counties, US, fuzzy maths, cursors, map
|
This project unzips Bare Earth LiDAR data files sitting in an input folder. The files are then used to create a Digital Elevation Model (DEM). The DEM is used to generate a watershed boundary and a stream network using Terrain Preprocessing steps.
Keywords: Streams, Stream Network, Watershed Boundary, Basin, Terrain, Terrain Preprocessing, Bare Earth Data, LiDAR, Lidar, Hydrology, Unzip, Zip Files
|
Many parents within the Wake County Public School System are disappointed with the manner in which school assignment is conducted. Many advocate a diverse student body, even if it means that some students have to travel significant distances to their assigned school. Others promote "community schools," where students attend schools closer to their homes and retain a sense of community with their neighbors and those in close proximity. This project incorporates a tool that informs residents of their local elementary, middle, and high schools as designated by some level of community (i.e., census tracts).
Keywords: Zoning, Neighborhood District, School Assignment, School System, Community
|
New Hanover County has decided to charge phone companies property tax based on the type of land use the lines cross. New Hanover County has asked for a tool to help estimate the amount of revenue that could be raised. This tool will allow the user to select a property layer, a phone line layer, and a buffer distance. The tool will then produce an output feature with the cost for each section of buffered phone line inside the county.
Keywords: Telephony, Property Tax, Land Use
|
A teleconnection pattern is a geopotential height anomaly which exhibits some sort of impact elsewhere. This project looks at geopotential height anomalies at the time of a tornado outbreak, constraining the field to a latitude/longitude area of 20-90° latitude and 140-340° longitude. I will be using NetCDF data and will break up the file to only look at the 500-hPa level. This project will look at current geopotential height levels by breaking up the NetCDF file into individual rasters, and it will take the overall change of geopotential height over a 5-day period prior to the tornado outbreak.
Keywords: NETCDF, Raster, Geopotential Height, Tornado Outbreak, Fujita Miles
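A sketch of extracting the 500-hPa slice from a NetCDF file as a raster; the file, variable, and dimension names (hgt, lon, lat, level) are assumptions and should be checked against the actual dataset:
import arcpy

nc_file = r"C:\data\hgt.nc"   # hypothetical reanalysis geopotential height file
arcpy.MakeNetCDFRasterLayer_md(nc_file, "hgt", "lon", "lat", "hgt500_lyr",
                               "", [["level", "500"]], "BY_VALUE")
# Persist the in-memory layer as a raster for later anomaly/change calculations.
arcpy.CopyRaster_management("hgt500_lyr", r"C:\data\hgt500.tif")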
|
This script takes sample point data and interpolates a raster surface. The script allows the user to select one or multiple methods of interpolation and computes them along with the Hillshade representation of each. Methods include Natural Neighbor, Inverse Distance Weighted, Spline, and Kriging.
Keywords: Interpolation, IDW, Kriging, Natural Neighbor, Spline
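A minimal Spatial Analyst sketch of the multi-method interpolation plus hillshade step, assuming hypothetical sample points with a VALUE field:
import arcpy
from arcpy.sa import (Idw, Kriging, KrigingModelOrdinary, NaturalNeighbor,
                      Spline, Hillshade)

arcpy.CheckOutExtension("Spatial")
points = r"C:\data\samples.shp"     # hypothetical sample point shapefile
zfield = "VALUE"                    # assumed field holding the measured value

surfaces = {
    "idw": Idw(points, zfield),
    "natural_neighbor": NaturalNeighbor(points, zfield),
    "spline": Spline(points, zfield),
    "kriging": Kriging(points, zfield, KrigingModelOrdinary("SPHERICAL")),
}
for name, surface in surfaces.items():
    surface.save(r"C:\data\{0}.tif".format(name))
    # Pair each interpolated surface with its hillshade for visual comparison.
    Hillshade(surface).save(r"C:\data\{0}_hs.tif".format(name))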
|
The housing market in Forsyth has seen a continuing rise in foreclosures over the past few years. Industry standards state that, in market areas in which >20% of total sales are foreclosure-related, those sales must be taken into consideration for purposes of property valuation. This project attempts to programmatically identify areas hit hardest by foreclosures and to identify some trends among them.
Keywords: property tax, assessment, market analysis, cama, iaao, parcels, sales, foreclosures, arcgis, python
|
Demonstrates the use of ArcHydro tools to process DEMs. This project will allow you to create DEM models by creating terrain data sets. That DEM will be used to run the 9 basic ArcHydro add-in tools to create catchment polygons and resulting drainage lines. Geoprocessing will be done on the resulting output to find the longest drainage lines. This would be beneficial in floodplain analysis and wetland studies.
Keywords: ArcHydro, DEM, terrain, drainage lines, catchment polygons.
|
This project works with a database of the community gardens in the Chapel Hill/Carrboro area of Orange County, NC. This database needs to be regularly updated when new gardens are constructed, removed, or altered (such as becoming public or private, or increasing capacity). My scripts are designed for people with minimal ArcGIS experience so they can update all of the garden attribute information in our database and produce maps using script tools on shapefiles in the database. These maps include individual garden maps, a master map with locations of all gardens, and the overall area within walking distance of a garden (for finding ideal places for prospective gardeners).
Keywords: Community Gardens, Urban Farms, Orange County, North Carolina, dbf, shp, script tools
|
A program capable of deriving a timber price based upon spatial data. Utilizing timber price data as the dependent variable, the model uses the number of sawmills within a county or region and the corresponding timber production as the explanatory variables to generate linear regression outputs. The data for the program is limited (odd years from 2001 to 2009), but does provide for insightful analysis.
Keywords: Regression, Timber, Commodity, Wood, Pricing Model
|
This project deals with the development of a Digital Elevation Model (DEM) using bare earth lidar data. The study area is part of Wake County, NC, and the corresponding data, in the form of ASCII files, is downloaded from “ncfloodmaps.com”. The ASCII files are converted into a multipoint feature class using the ASCII 3D to Feature Class tool. Then, the DEM is developed using the ArcGIS tools Point File Information, Create Terrain, Adding Pyramid Levels, Add Feature Class to Terrain, Build Terrain, and Terrain to Raster. This program can be extended to any area in NC, and it can also be extended to other regions with an adjustment of the coordinate system.
Keywords: DEM, Terrain, Hydrology, Water Resources, 3D ASCII to Feature Class, Build Terrain, Terrain to Raster, Watershed Delineation, ArcHydro, Raster, ASCII
|
The implementation and restoration of riparian buffers serve many land and water quality conservation purposes. These vegetation areas filter sediment and pollutants, stabilize banks, and reduce streambed scouring. Wooded riparian buffers also improve wildlife and aquatic habitats. This application provides a visualization of the riparian buffer as well as the cost of a proposed buffer project.
Keywords: Riparian Buffer, Trees, Fish Habitat, Stream Restoration, Water Quality, Bank Stabilization.
|
Fall 2011
This project geo-references tabular No Child Left Behind data from the IES National Center for Educational Statistics. This script can be run on the 2009 data provided (all public schools, including charter, magnet, and even specialized schools in all 50 states, with 19 fields of interest) or on new data selected for other years available at the IES web site. The script creates a map document with a selected template, a PDF of the map, and a map package to share with others.
Keywords: Schools, School Districts, No Child Left Behind, NCLB, IES, Education, Educational Statistics,
|
When statistically sampling large landscapes, a study area is often divided into small, surveyable pieces which are then surveyed based on a random selection protocol. However, simply dividing up an irregular landscape into plots using automated methods often results in small, irregular plots. This tool assists the user in dividing their landscape into plots and subsequently merging the small plots to eliminate plots under a specified minimum. In addition, the script allows the user to automatically add data from other feature classes to the generated survey area output by using a user-created text file with the data locations.
Keywords: Wildlife, Sampling, Survey,
|
The ability of emergency personnel to precisely map the location of potentially harmful airborne contaminant releases and rapidly calculate the range of potentially affected population centers allows first responders to assess evacuation alternatives and extract information on special needs populations (schools, nursing homes, etc.). This application provides a quick and simple evaluation tool that can be used in the field to calculate multiple wind vector polygons using air release locations and real-time meteorological data. These polygons serve as wedge-shaped areas of risk that account for variations in wind speed and direction over time. Risk areas can be displayed over background maps and images of pertinent features as well as used for more in-depth spatial analysis of exposure populations.
Keywords: Wind vectors, Exposure Assessment, Emergency Response, Meteorology, GIS
|
Vaccines have changed some of history’s most infectious and debilitating diseases from being widespread epidemics into mostly relics of the past. However, outbreaks of vaccine-preventable diseases still occur, with children being the most vulnerable. According to the Centers for Disease Control and Prevention, in 2010 over nine thousand cases of the vaccine-preventable disease pertussis, better known as whooping cough, were reported in California, a 63-year record high. This project uses data collected by the California Department of Public Health, which records the immunization coverage of 2010 kindergarten classes, to visualize where vaccination rates are strong or lacking.
Keywords: California, Vaccination, Public Health, Immunization, Preventable Diseases
|
Tomato Spotted Wilt Virus (TSWV) is a major source of lost crop yields and money for North Carolina tobacco farmers. In order to better manage this disease, farmers and extension personnel need to understand where the disease is most likely to affect crops in any given year. In order to determine this, fields throughout North Carolina are surveyed yearly for this disease and predictions about disease severity are made by county. This project uses a Python script to compile survey data from multiple sources into xy event layers in an ArcMap document.
Keywords: TSWV, tobacco, disease, xy event, coordinate conversion
|
This script creates a least-cost polygon using a base Digital Elevation Model (DEM) and then goes on to add the result to itself and multiply it by itself. By doing this we can gain three different understandings of the way ancient people might have moved through the landscape. This script was created to allow Cultural Resource Managers to locate areas which could fall within the catchment area of a given archaeological site.
Keywords: Least cost, DEM, Archaeology, CRM
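A hedged sketch of one way to build an accumulated-cost surface from the DEM; using slope as the movement cost is an assumption here, and the add/multiply steps described above would follow on the saved raster:
import arcpy
from arcpy.sa import Raster, Slope, CostDistance

arcpy.CheckOutExtension("Spatial")
dem = Raster(r"C:\data\dem.tif")    # hypothetical base DEM
site = r"C:\data\site.shp"          # hypothetical archaeological site location

cost = Slope(dem, "DEGREE") + 1     # +1 keeps every cell's cost positive
cost_surface = CostDistance(site, cost)
cost_surface.save(r"C:\data\cost_from_site.tif")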
|
The ATT Coverage Viewer is a web-based application developed by Esri. The application is designed as AT&T’s solution to publicly communicate network coverage to its customers. Using the Coverage Viewer, customers and employees can look up a specific address or latitude/longitude and view the AT&T coverage for that location.
Keywords: mapinfo file, repair geometry on feature class, projection-transformation, Clip_analysis, PolygonToRaster_conversion.
|
Charleston County, part of the South Carolina Lowcountry region, is home to a diverse group of historically relevant sites dating from the 17th century. The region also suffers from severe erosion risk. In order to better understand how to preserve and manage the sites, an application that determines each site’s erosion risk based on soil type has been developed. Additionally, a separate list of sites is created specifying which sites fall within a given distance from major rivers in the area.
Keywords: Cultural Resource Management, erosion, buffer, intersect, spatial join, calculate field, soils, historic sites
|
This project establishes a parcel-based system for zoning and land use data within the Winterville Planning and Zoning Jurisdiction.
Keywords: Parcel, Land Use, Zoning, Winterville, Shapefile, Feature Layer, Feature Class, DBF, Table, Attribute
|
Finding the perfect hiking route is often a difficult task, with multiple new trails branching off a current trail. This tool takes a starting point, often a parking lot, and computes the different possible hiking routes that can be taken to return to this same point. The results are summarized with the distance and the elevation of the hike displayed.
Keywords: Parking Lot, Hiking, Park, Distance, Elevation, Trails, Join Paths
|
Using collected crime statistics from Charlotte, NC and a data file of Charlotte area schools, I will locate the violent crimes (assaults, homicides, etc.) that occurred within a 2 mile radius of Charlotte schools.
Keywords: Charlotte crimes, violent crimes, Charlotte area schools, buffers, intersection.
|
Spring 2011
The North Carolina Wildlife Resources Commission Aquatics Wildlife Diversity Program samples streams throughout the state for macroinvertebrates. As part of their data collection, they collect two sets of latitude and longitude coordinates to define the sample area. The purpose of this project is to use these coordinate points to create linear and polygon sample sites based on an existing linear streams and waterbody polygon feature dataset.
Keywords: polyline-creation, stream delineation, feature selections, SelectLayerByAttribute, SelectLayerByLocation
In this project, I created two tools; each generates a report from the results of a hydraulic water model. The water model was used to determine which water sources contribute to sampling stations throughout a water distribution system. The first tool creates a spreadsheet that shows which water sources contribute to each sampling station. The second tool creates a map that displays how each water source is distributed throughout the water system.
Keywords: water, hydraulic model, source trace, sampling station, pipe, buffer, join, selection
Data collection has advanced over time, developing new methods and technology for the collection and processing of data. With these advances come errors that are unintentionally introduced to the data and need to be removed before processing and analysis of the data itself. These scripts, Get_Systematic_Errors.py and Fixing_Systematic_Errors.py, calculate and remove systematic errors (vertical shifts of elevation) from raster digital elevation models (DEMs) based on baseline data (benchmark data or roads that contain x, y, and z values) that are used as the shifting variables. Get_Systematic_Errors.py calculates systematic errors, while Fixing_Systematic_Errors.py calculates systematic errors and shifts the DEMs in order to remove the errors from the data.
Keywords: LiDAR, Systematic Errors, Digital Elevation Models (DEMs), Historical Data, Jockey's Ridge, North Carolina Outer Banks
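A simplified sketch of the vertical-shift correction, assuming hypothetical benchmark points with an ELEV field; the real scripts also handle road baselines and reporting:
import arcpy
from arcpy.sa import Raster, ExtractValuesToPoints

arcpy.CheckOutExtension("Spatial")
dem = r"C:\data\historical_dem.tif"        # hypothetical historical DEM
benchmarks = r"C:\data\benchmarks.shp"     # hypothetical control points (ELEV field assumed)
sampled = r"C:\data\bm_sampled.shp"

# Sample the DEM at the benchmarks, compute the mean vertical offset, and
# shift the whole surface by that amount.
ExtractValuesToPoints(benchmarks, dem, sampled)
offsets = [dem_z - bm_z for dem_z, bm_z in
           arcpy.da.SearchCursor(sampled, ["RASTERVALU", "ELEV"])]
mean_offset = sum(offsets) / len(offsets)
(Raster(dem) - mean_offset).save(r"C:\data\dem_corrected.tif")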
The purpose of this project was to design a tool that could be used to help find bicycle-friendly routes for commuting around Raleigh. I split the project into two major steps: generating a bicycle-oriented network from streets and greenway data, and then using that, along with elevation data and other optional factors, to create a cost surface and generate a least cost path. Overall, the goal was to output a bicycle-friendly route, so the end result was to return a ‘best route.’ The final result also includes an option to export in KML format for use with Google for route planning.
Keywords: Cycling, Commuting, Least Cost Path, KML, Slope, Raster Calculations
This project uses a script tool to combine data from multiple MS Access transportation asset management databases and add it to a map. The original tabular data include multiple field inspection databases from several jurisdictions. These are combined into a single table for each jurisdiction, and converted to spatial data using GPS coordinates for each asset. The resulting layers are then added to a map.
Keywords: XY Event Layer, Tabular Data, MS Access, Asset Management, National Park Service
Allocating limited resources for prescribed burning is a challenge in Western North Carolina. Priority ranking of land management burn units can help efficiently allocate resources for these activities. The ranking of burn units is computed based on local ecological criteria. This script tool calculates this ranking based on two categories of user inputs: burn units shapefiles, and assigned weights for ecological criteria.
Keywords: The Nature Conservancy, Natural Heritage Project, Western North Carolina, South Mountains, FLN, Fire Learning Network, eco-math, burn unit, prescribed burn, fire, fire-adapted, ecozone, Steve Simon
This project uses geoprocessing to extract certain data from a folder of zip files and organizes that data according to river basins in North Carolina. Shapefiles of floodplain breaklines, LAS, and XYZ point elevation data are extracted from each zip file and stored in a folder according to which river basin they belong to. These features are merged and used to create a dataset for each river basin which contains multipoint elevation and line feature breaklines for that basin. Lastly, a terrain is built using the point and line features for each dataset.
Keywords: floodplain, breakline, las, xyz, multipoint, terrain, 3d, lidar, zip, merge, clip, dissolve, geodatabase, dataset, feature class, pyramid
Public schools are currently under increasing financial strain due to rising budget cuts and an economic recession. These pressures can result in job loss and declining quality of education as class sizes rise and arts and music programs are omitted from curricula. One significant expenditure for large buildings is energy costs. This model streamlines the process of modeling the potential for solar radiation collection for individual schools across North Carolina.
Keywords: Solar, Radiation, School, Renewable Energy, Unzip, DEM, Clip
I work in the utilities industry on a pilot project that passes sound waves through sewer line segments to return information on the condition of each pipe. The problem faced was that there are large amounts of this acoustic data that require downloading weekly. This script takes user-provided parameters and automates the process of uploading large amounts of acoustical sewer line testing data. The data is downloaded in .csv format, and the output of this script is a shapefile containing only the sewer line segments which were tested, along with their corresponding acoustical test data.
Keywords: Area: Charlotte, North Carolina; Data types: CSV, DBF, Acoustical Monitoring, polyline; Tools: Join, Append, CalculateField, Select, TableToTableConversion, DeleteIdentical
This script analyzes DBF files, generates text reports, and maps Distance Learning and Traditional student locations.
Keywords: Student Geocode, High School Analysis, State Statistics, Curriculum Analysis
This project uses National Hydrology Dataset data to determine the route that surface water will take as it moves from one sub-basin to the next in the conterminous United States. Scripts were written to join, merge, and organize data from 57 separate tables. Then tools were created to select all watersheds downstream from any given watershed, create a new layer from the selected watersheds, and sum the area of all downstream watersheds.
Keywords: watershed, basin, connectivity, US, runoff, flow, route, routing, hydrology, hydrologic, huc, surface, water, conterminous, contiguous, lower, 48, nhd, wbd, delineate, national, interbasin, transfer
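A pure-Python sketch of the downstream traversal, assuming the joined tables boil down to a HUC-to-downstream-HUC lookup (the field names HUC and TOHUC are assumptions):
import arcpy

downstream = {huc: tohuc for huc, tohuc in
              arcpy.da.SearchCursor(r"C:\data\wbd_flow.dbf", ["HUC", "TOHUC"])}

def hucs_downstream(start_huc):
    # Return every watershed downstream of start_huc by following TOHUC links.
    found = []
    current = downstream.get(start_huc)
    while current and current not in found:   # the membership test guards against loops
        found.append(current)
        current = downstream.get(current)
    return found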
The Town of Mooresville Water Sewer Maintenance Department (W/S Maint.) requested help from the Engineering Department to set up a way to track where the sewer cleaning crew has been each month. The Town had already completed a field survey of the entire sewer infrastructure, so using the unique identifiers for manholes (MHID) we were able to leverage the survey data and dbf files, which contained the cleaning data. Shapefiles with the monthly breakdowns were created, along with a calculated frequency of how often the sewer main has been cleaned. The data provided a new method for the Town to examine the spatial location of the problem areas and a tool for the cleaning crew to plan their daily cleaning routes.
Keywords: Sanitary Sewer, Sewer Lines, Manholes, Joins, Lists, Search Cursor, Creating Text Files, Frequency, Layer Files, Shapefiles
This tool was created to help GIS Technicians at a local electric cooperative clip the total miles of line in all of the service counties for accounting and North Carolina’s Electric Cooperatives (NCEMC) for reporting. In the past, GIS Technicians would perform this task manually. Now, this tool can be activated by the user in ArcMap with a simple click of a button. The final dbf output is then opened in Excel and a simple sum/5280 is run on the total shape length field to calculate the miles of line.
Keywords: Electric Cooperative, Clip, Electric, Distribution Line
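A short sketch of the clip-and-total step, assuming hypothetical line and county shapefiles in a foot-based coordinate system; it sums lengths directly rather than exporting a dbf to Excel:
import arcpy

lines = r"C:\data\distribution_lines.shp"   # hypothetical electric line layer
county = r"C:\data\service_county.shp"      # hypothetical county boundary
clipped = r"C:\data\lines_clipped.shp"

arcpy.Clip_analysis(lines, county, clipped)
feet = sum(row[0] for row in arcpy.da.SearchCursor(clipped, ["SHAPE@LENGTH"]))
print("Miles of line in county: {0:.2f}".format(feet / 5280.0))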
The objective of this project is to parse RSS feeds containing Air Quality Index values in Python. These up-to-date feeds are converted to a CSV table, where they can be joined to an ArcGIS shapefile composed of 6 cities across North Carolina. Once the index data is in ArcMap, it is geoprocessed with 25-mile buffers around each city. Appropriate symbology is applied to these buffer zones based on their index values, which is a good way to inform the public of the current air quality for their region.
Keywords: Air Quality Index, Ozone Pollution, RSS Feeds, XML, CSV, EPA, EnviroFlash, North Carolina
My project was to convert a very large volume of 3-dimensional X, Y, and Z text data to DXF file format. The procedures within commercial tools are not able to handle large volumes of data and crash while trying to convert the text data to DXF file format. I developed a generic Python script tool to convert the large volume of 3-dimensional X, Y, and Z point text data to a DXF file. Further, this Python script is extended and automated with Esri procedures to convert the DXF file to a geodatabase and shapefile.
Keywords: DXF Writer, Txt to DXF, Txt, DXF, Geodatabase and Shapefile, DXF converter, Geodatabase writer, Shapefile writer,
Field biologists rely heavily on field data collection for environmental analysis and resulting reports. Unfortunately, the data processing flow can be tedious and labor intensive as the standard process for creating line features requires manual connection from point to point. Automating the field data conversion of points to polyline (streams) and/or points to polygon (wetlands and water bodies) will greatly increase project efficiency, reduce human error, and allow the inexperienced GIS user to conduct the conversion process.
Keywords: points, lines, polylines, polygon, convert, conversion, gps, wetland, streams, ponds, lakes, water
Three separate scripts are provided as part of a single model. findFiles.py finds, renames, and combines XYZ elevation files into a single point shapefile. updateMXD copies a reference TIN layer file, updates the dataset and data source, inserts it into a map document, and removes the reference TIN layer file. mxdTextUpdate reads a text file containing metadata, assigns those values to a dictionary based on list position, and updates text elements in a map document.
Keywords: MapDocument, ListDataframes, InsertLayer, ListLayoutElements, RefreshActiveView, SaveACopy, Coordinate System, SetProgressorLabel, ASCII3DToFeatureClass_3d, GetParameterAsText, def, procedure, for, if, len(, xyz, TIN
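A minimal arcpy.mapping sketch of the text-element update performed by mxdTextUpdate, with a hypothetical template path and metadata dictionary standing in for the parsed text file:
import arcpy

mxd = arcpy.mapping.MapDocument(r"C:\maps\template.mxd")          # hypothetical template
metadata = {"title": "Survey 2011-04", "surveyor": "J. Smith"}    # assumed element names/values

# Match layout text elements to metadata entries by element name and update them.
for elem in arcpy.mapping.ListLayoutElements(mxd, "TEXT_ELEMENT"):
    if elem.name in metadata:
        elem.text = metadata[elem.name]
mxd.saveACopy(r"C:\maps\survey_2011_04.mxd")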
Short-term (acute) exposure air concentrations are compared to health reference values to determine the potential for health effects from such exposures. The hazard quotient, which is the ratio of the acute exposure concentration to a health reference value, is one measure of this potential. This project uses a Python script and air dispersion model output data to calculate hazard quotients, and displays spatial layers of these quotients that allow a determination of where health effects could occur.
Keywords: Air Dispersion Modeling, Risk Assessment, Air Pollution, Acute Exposures, Hazard Quotients, Python
Maps out Appalachian Trail shelters and demographic data for the southeastern states (GA, NC, TN). Distances are measured between nearest shelters, and appropriate segment shapefiles are generated.
Keywords: Appalachian Trail, Haversine Formula, Distance, Dictionary, arcpy
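For reference, the Haversine formula named in the keywords as a small self-contained function (the sample coordinates are made up):
from math import radians, sin, cos, asin, sqrt

def haversine_miles(lat1, lon1, lat2, lon2):
    # Great-circle distance between two lat/long points, in miles.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = sin(dlat / 2) ** 2 + cos(lat1) * cos(lat2) * sin(dlon / 2) ** 2
    return 2 * 3958.8 * asin(sqrt(a))   # 3958.8 = Earth's mean radius in miles

print(haversine_miles(35.56, -83.50, 35.61, -83.42))   # distance between two hypothetical shelters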
Calculated flood extents based on a Neuse River hydraulic model are analyzed with the NC Roads data set in order to identify the roads at high risk of flooding, along with the development of an animation showing the possible behavior of a flood event.
Keywords: Inundation Mapping, Hydraulic model, 2D Animation, Intersection Analysis
Oftentimes, at the beginning stages of a large land development project, it can be challenging to compile all the site analysis data while keeping the costs low for the developer. Information that is readily available for a potential development site through the local governments is only suitable for the preliminary stages of the project. More detailed information such as surveys, tree locations and sizes, and watershed areas is needed to proceed. Throughout the process, base mapping information is needed and constantly has to be updated as new information is received. The ripple effect of these changes can be quite time consuming to fix.
Keywords: Perennial, Intermittent, Parcel (PIN)
This project addresses some issues that have arisen during the field season while monitoring Indiana bat (Myotis sodalis) movements at Fort Drum Military Installation in northern New York. Up-to-date shapefiles and reports must be generated on a regular basis by people unfamiliar with ArcGIS. The first script reads through a roost tree database and creates two outputs: a CSV file of all entries with missing information and a CSV file of the roost trees of interest following certain criteria. The second script generates a summary report (in .txt format) that is easy to create in order to send to the local biologists on base to inform them of our progress.
Keywords: Indiana bats, Myotis sodalis, convert CSV to shapefile, calculate area from raster, summary report, tally count, change code to name using dictionary
Fall 2010
I work for the Electric Department of the Town of Edenton, NC. Over the past 18 months I have developed a GIS for the purpose of managing, maintaining and upgrading the electric infrastructure. One of my goals is to provide the Electric Utilities Director an ArcMap interface with a capacity to easily edit map features according to changes made in the utility infrastructure. Through the use of Python and VBA programming, I have made considerable progress toward this goal. A customized ArcMap interface with toolbar buttons will, depending on the button, display a dialog window, prompt for appropriate user input or simply call a Python script to perform an editing task. The main feature of this project is a button labeled ‘Add New Data.’ It calls AddNewData.py which adds new records to one of three ArcMap document feature class layers based on data collected using a GPS device.
When developers contact Peachtree City Water and Sewerage Authority (PCWASA) to request a new sewer connection for a proposed project, PCWASA employees currently use Microsoft Excel spreadsheets to calculate the estimated new sewer outflow expected to be produced from the new establishment and the fee that PCWASA will need to charge for the new sewer connection. The goal of this project is to automate this process: to create a tool that can be updated easily by PCWASA, as fee and flow calculations may change over time, and that has the functionality to save, export, and view calculation results.
During my research, I have been attempting to estimate flow rates associated with rainfall rates for a small tributary of Crabtree Creek. The ultimate goal is to predict flooding events more accurately using modeled rainfall rates and to back out rainfall rates where flow rate sensors exist. This information would be very valuable to meteorologists and hydrologists, and would give them more tools to better warn the public of impending flood events. Often, I have found it very time consuming to clip each shapefile in a directory to my desired research area. It would be beneficial to have a script perform this tedious task. Furthermore, I would like to select specific elevations and create new shapefiles for further processing. Finally, the shapefiles that were made were added to the map display. These scripts automate several of the tedious tasks and cut the data conditioning time to a minimum.
The Fire Management Office (FMO) of the Shenandoah National Park (SHEN) has several years’ worth of fuels data stored in a database and thousands of photos of the collection sites. The FMO would like to consolidate and visualize this data (create shapefiles) and possibly include reference to the photos in the shapefile as a column with a relative path to the location of the photo. The data is geo-referenced, but must be converted to standard shapefiles. The consolidated data in the resultant shapefile will be used by FMO to help plan containment strategies in the event of a wild land fire.
The purpose of this project was to explore the use of Python scripting, Visual Basic for Applications (VBA) scripting, and Esri ArcGIS Desktop ArcObjects to develop an application that displays hiking, biking, paddling and bridle trails located within a user-selected North Carolina open space. The user is presented with a list of open space names that can be filtered by county, owner and manager. Once a selection is made, the open space boundary and associated trails, hydrology, roads and points of interest (POI) are draped over a 3D scene. The 3D scene is created using a hillshade raster created from an elevation raster associated with the open space. Esri ArcScene is the application used to render and manipulate the 3D scene.
The purpose of this application is to help outdoor enthusiasts identify trails that they may want to experience while visiting an area within North Carolina.
Greece has long been a hub for economic activity and flourishing civilizations. A combination of geographical location and accessibility to a range of natural resources made her a dominant force within the ancient Mediterranean world. Greece is traditionally described as having a rugged and harsh terrain. It is curious then, how these ancient people developed into such complex societies and were constantly importing and exporting goods. To better understand how the people of Greece made use of their trade routes an analysis on two of her predominant natural resources (marble and lead-silver) will be performed. By examining the locations in relation with their surrounding terrain and their distance to the heart of Greece, Athens, a relative understanding of the amount of difficulty it took for Grecian people to move goods can be derived.
My project is to write a script to convert an XYZ file format to the IGES format.
A research group at NCSU is investigating the use of tangible terrain user interfaces. They are converting topographic data into physical 3D models which are then altered by hand. It is much easier to change a complex 3D model by hand than with a computer model. I will be working with graduate student Katherine Weaver, who is looking at Jockey’s Ridge on the Outer Banks of North Carolina. She created her foam board and clay model of the dune by hand using 2007 LIDAR data. She would like to have an automated machine carve out the models, but the data is in XYZ format and the machine requires IGES or DXF.
My project is to write the code to accomplish this conversion.
Archaeological projects usually involve sites within a well-defined area. Some projects have a prehistoric component with hundreds, if not thousands, of cultural features (posts, pits, etc.). Accurate spatial information is recorded using a total station and then downloaded to computers as digital spreadsheets. As analysis takes place in the lab, these spreadsheets are often updated, ruling out some features as natural occurrences and appending data to include other features or information about those cultural features. The spreadsheets from fieldwork have north and east values truncated to 4 digits, rather than the necessary 6 or 7. This project was undertaken to find the most recent version of the fieldwork spreadsheet and automate the generation of shapefiles from the data. Base layers (soils, streams, roads, and contours) need to be clipped to a manageable extent surrounding the project area, rather than the varied extents usually inherent in data from numerous sources.
The Wake County GIS department develops and maintains GIS resources for internal and public consumption. Among their many products is an ongoing shapefile of Wake County parcel level property data. This massive database contains one feature for every property parcel in Wake County and has about 320,000 features as of December 2010. Additionally, there are 49 attributes covering a range of characteristics regarding geographic location, ownership, structure properties and parcel properties. The shapefile is updated monthly and released to the website for public consumption. Although the target users of the parcel shapefile are finance and planning professionals in local government, it is an important resource for analyzing trends in local real estate. It is helpful to process the data for use in this context to eliminate the unnecessary information. The Wake County GIS Property Parcel Real Estate Cleaning Tool performs several data management actions, deleting unnecessary features and attributes and calculating the unit sale price on the remaining parcels.
In an effort to study the impact of human traffic in the back country on bear behavior at Yellowstone National Park, the bear management office has collected GPS tracks from volunteers using hand-held GPS units and GPS tracking collars. As a part of the study, the office would like to check for points that are near in time as well as space. This tool facilitates this analysis by computing epoch time and consolidating all of the raw data into two files, one for bears and one for people. Both files are then converted to shapefiles. The shapefiles are then used to compute a near table with the bears as in features and the bears as near features. After processing, the shapefiles can be joined to the near table, and the epoch time field can be used to determine the amount of time that separates near features, allowing the bear management office to look for bear interaction.
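A hedged sketch of the epoch-time and near-table steps, assuming a consolidated point shapefile with a text TIMESTAMP field and a 100-meter search radius; both are stand-ins for the project's real parameters:
import arcpy, calendar, time

tracks = r"C:\data\bear_points.shp"     # hypothetical consolidated GPS points

# Convert a text timestamp to epoch seconds so near features can be compared in time.
arcpy.AddField_management(tracks, "EPOCH", "DOUBLE")
with arcpy.da.UpdateCursor(tracks, ["TIMESTAMP", "EPOCH"]) as cursor:
    for stamp, _ in cursor:
        epoch = calendar.timegm(time.strptime(stamp, "%Y-%m-%d %H:%M:%S"))
        cursor.updateRow([stamp, epoch])

# Build the near table (bears as both in and near features, as described above).
arcpy.GenerateNearTable_analysis(tracks, tracks, r"C:\data\near.dbf", "100 Meters")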
Dr. Yu-Fai Leung at NCSU is working on the issue of problem reporting for trail resources.
Increasing reliance is placed on users of the trails to report problems as park managers have fewer resources to patrol the trails. Trail users frequently have GPS units with which they can record the locations of problems. Dr. Leung is interested in problems that are line features as opposed to point locations. Lines describe certain problems better than points, e.g., a trail being washed out over a length. This provides more information to park managers when dispatching resources to address the problem, but presents a problem of matching a line segment generated by a set of input points to the curve of a trail.
The Asian gypsy moth (Lymantria dispar) is an invasive exotic species, not yet established in the United States, known to damage forest species significantly. Studying the biology of the insect can help understand potential impacts it might cause in the United States. Climate affects insect biology directly, and climate data can be one of the important factors to indicate the suitability of the insect life cycle. Phenology models can help understand the timing of the insect developmental stages. A degree-day model measures insect development in response to temperature; it estimates the potential number of generations and the specific insect stage for a particular time period. The objectives of this project are to calculate degree days for the Asian gypsy moth for a user-specified time period and to create a raster data layer to indicate the degree days for the time period in the eastern United States.
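A minimal raster sketch of the simple-average degree-day formula, DD = max(0, (Tmax + Tmin)/2 - base), with hypothetical daily temperature grids and an assumed 10 °C threshold:
import arcpy
from arcpy.sa import Raster, Con

arcpy.CheckOutExtension("Spatial")
tmax = Raster(r"C:\data\tmax_day1.tif")   # hypothetical daily maximum temperature grid
tmin = Raster(r"C:\data\tmin_day1.tif")   # hypothetical daily minimum temperature grid
base = 10.0                               # assumed developmental threshold (deg C)

mean_t = (tmax + tmin) / 2.0
daily_dd = Con(mean_t > base, mean_t - base, 0)   # negative accumulations clamp to zero
daily_dd.save(r"C:\data\degree_days_day1.tif")
# Summing these daily grids over the user-specified period gives accumulated degree days.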
Proper soil conservation and forest regeneration require site preparation that maximizes seedling survival and growth while minimizing damage to soil and other natural resources. Herbicide application, ground preparation, and prescribed burning are the three main practices used in site preparation. These practices are often part of a prescription plan which can greatly benefit from the wealth of natural resources data that we have collected and stored in our databases. However, there is a need to come up with GIS tools that automate the process of combining several data layers, sorting areas of interest, generating statistics according to the desired forest site preparation practices, and providing output for cost analysis.
Spring 2010
When modeling Red Cockaded Woodpecker (RCW) clustering dynamics and potential impact to RCW habitat it is important to have accurate forest stand and tree information. In order to obtain tree information at the necessary level of spatial accuracy, surveyors collect data on Tree type and tree size (dbh) using total stations and provide that data to the GIS modeler in georeferenced CAD file format. This tool delivers a software based workflow management system for importing surveyed CAD data into a Geodatabase, appending relevant CAD Annotation to CAD derived feature class tables, and adding code – value fields to the CAD derived feature class tables for use in definition queries, analysis, and cartographic representation.
The Oregon South Coast and Lower Rogue Watershed Councils have developed a Landowner Road Inventory protocol, and crossing locations were derived from this data. To evaluate the potential for drainage overflow, erosion damage, and fish passage concerns, the watershed for each crossing needs to be measured and evaluated. Currently, this is being done manually, and there are 1,120 such crossings that need to be plotted. This project automates the process.
The City of Raleigh has been collecting data on all of the street trees found within the city’s right of way since May 2009 for the Street Tree Management Plan, which addresses new tree planting, stewardship and maintenance of current trees, and storm damage mitigation of the street tree population. When street tree inventory interns collect data in the field, they must check out point features created in a previous inventory, collect new data elements on those points, and add new points to the database. Keeping track of progress on this project and similar projects is essential to a timely completion. Data is collected in batches of one to two per week and stored within an “Archive” folder. Combining each of these distributed shapefiles into one on a monthly basis will allow administrators to monitor the progress of the project and provides a comprehensive current database of all collected information.
Measuring biomass is important for understanding and managing natural areas. These measures are used in calculating an area’s carbon value and also in monitoring fire fuel loads. This study develops a model to predict biomass levels using multi-return LiDAR (Light Detection and Ranging) data. LiDAR was examined and a model developed based on the relationship of measured field plots to the same locations represented as LiDAR-derived vegetation surfaces. A method was then developed to estimate the spatial distribution of pocosin biomass. This model was applied to the entire pocosin in Hofmann Forest to create a surface representing the tons per acre of this area.
SLEUTH (Slope, Land use, Exclusion, Urban extent, Transportation, Hillshade), a cellular automata model, is being used to predict the extent of urbanization into the 22nd century. Among other datasets, SLEUTH requires at least 4 timestamps of urban extent. For a large extent, such as the entire southeastern US, this is problematic since datasets such as the National Landcover Database (NLCD) are not frequently produced, so road density is substituted. A low density threshold is used to extract NLCD 2000 development as base areas. A higher density threshold captures areas not classified as urban, but which may be considered so from a conservation perspective.
Further complicating matters, Developed Open Space (DOS) represents manicured lawns, ball fields, and grassy road medians and shoulders. Extra steps were needed to minimize the inclusion of DOS in the urban extent, since its inclusion results in roadways “growing” the same as defined urbanization.
NCSU campus police reported a sharp rise in bicycle thefts on campus during the Fall 2009 semester. Campus police provided us with a record of larceny incidents on the NCSU campus between 2004 and 2009 so that we could investigate this trend. This project reads the incident reports in Excel spreadsheet format and geocodes the incidents identified as bicycle larcenies. Then analysis is carried out to examine the spatial location pattern and identify the most vulnerable locations, analyzing event behavior such as the day of the week. By automating this procedure we can continue to track trends as new incidents are logged.
This project examines urban areas’ racial composition and different minority groups’ access to public open space. Three Python scripts are developed and applied to a sample set of data from Wake County, North Carolina. The objectives are to use readily available data from the county (open space shapefiles) and sociodemographic data from the U.S. Census Bureau (census block shapefiles) to produce a series of coefficient surface maps illustrating the local variations in urban minorities’ access to open space.
Neighborhood Electric Vehicles (NEVs) are electric powered vehicles that are classified as “low speed vehicles”. This means that they can only legally travel on roads with speed limits of 35 mph or less. Places with large areas of low speed limits, like university campuses or army bases, are ideal for NEV usage. I have created a tool that takes road and building data and determines suitability for NEV travel in a city or an area of that city, based on the data provided. The tool analyzes the roads to see which fall under the suitable speed limit and the buildings near those roads to see if they are purely residential areas. Areas where 50% or more of the roads have speed limits below 35 mph and less than 50% residential usage are suitable for NEV usage, as NEVs can travel around and to attractions outside of residential areas.
Often, data downloaded from various sources is not in the most convenient format for certain tasks. The USGS currently allows public downloads of discrete-return LiDAR data from its CLICK website at http://lidar.cr.usgs.gov/LIDAR_Viewer/viewer.php. This project streamlines the process from data download to data usability. Based on a user-supplied area of interest, the correct LiDAR tiles are automatically downloaded (if available), unzipped, and converted from LAS format to multipoint format.
Spring 2009
This project examines the use of customization and automation of geoprocessing tasks within ArcMap using Python and VBA to create a usable county-wide trail data layer and create a point layer for healthy eating choices that can be updated easily.
The tool helps land managers determine the current trends in forest land cover change in the Munessa-Shashemane Forest between 1986 and 2000 to facilitate decision-making about appropriate intervention actions.
Zoning data in different counties usually contains a huge number of entries, which makes it time-consuming for users to query the specific data they desire. I have multi-county zoning data for the North Carolina Sandhills region that has more than 85,000 entries. This tool identifies open space located in any of the counties per the user’s request and adds the selected features to the map as a layer.
The purpose of this project is to automate the process of merging LIDAR tiles into one large LIDAR point shapefile that is projected in State Plane Meters and converting the merged LIDAR file into a DEM for further analysis. These objectives are accomplished with a combination of Python scripting, an ArcGIS toolbox, and VBA GUIs.
The National Park Service (NPS) has conducted a field study and created a MS Access database of bird species in George Washington Birthplace National Monument. From ArcGIS, the NPS would like to be able to input a species’ common name and have layers added to ArcMap showing the locations where each species was found. This project involves accepting input from the user to determine the file location and the species’ common name, using this input to query the database, and retrieving the output shapefiles for inclusion in ArcMap.
Climate change is projected to increase global temperatures between 1 and 6 degrees Celsius and consequently raise sea levels between 300 and 800 mm by 2100, according to the Intergovernmental Panel on Climate Change. This tool uses geoprocessing and batch processing in Python to model sea level rise in coastal areas. The simple 'bathtub' model takes an input DEM, an IPCC sea level rise scenario, and a time interval and produces a predictive DEM based on the given input.
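A minimal sketch of the 'bathtub' rule with a hypothetical coastal DEM in metres and an assumed 500 mm scenario; cells at or below the projected sea level are raised to it, while dry cells keep their elevation:
import arcpy
from arcpy.sa import Raster, Con

arcpy.CheckOutExtension("Spatial")
dem = Raster(r"C:\data\coastal_dem.tif")   # hypothetical coastal DEM (metres)
rise_m = 0.5                               # assumed scenario: 500 mm rise by 2100

predicted = Con(dem <= rise_m, rise_m, dem)
predicted.save(r"C:\data\dem_slr_500mm.tif")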