ECSU URE Summer 2011 Internships  

Michael Jefferson
Senior - CS
michaeljefferso@yahoo.com

 

Development of a 3D Model for the Assessment of Vulnerability Due to Sea Level Rise on the Historic Strawbery Banke
Mentor: Michael Routhier
University of New Hampshire (UNH) Research and Discover Program

Abstract
Climate change is now widely researched around the world; one prominent exception is the discipline of historic preservation. With climate change likely to raise sea levels over the decades to come, historic preservationists are now looking for data and information that can help them mitigate potential threats to the cultural heritage along our seacoasts. Information helpful in understanding these threats includes geographic data, such as the locations of artifacts, fossils, and historic structures, as well as their vertical elevation above mean sea level. In an effort to build a set of protocols to help preservationists study these threats, our work currently focuses on a living history museum site known as Strawbery Banke in Portsmouth, New Hampshire. This poster features a subset of this work that was completed through undergraduate student internships funded by the Joan and James Leitzel Center at the University of New Hampshire. This subset of the work focused on the creation of a 3D model of the study site. Two aspects of creating this model were the completion of a topographic ground survey and the 3D digital mapping of the site itself. The ground survey was completed using standard surveying techniques and tools, and the 3D digital mapping was completed with ArcScene, part of the ArcGIS software suite. This work was completed in conjunction with a larger study funded by the National Geographic Society to better understand how sea level rise and the effects of storm surges are putting the historic structures at Strawbery Banke at risk.

 


Hall
Cedric Hall
Senior - Mathematics
cedriclhall@yahoo.com

Poster

A Study to Understand the Potential Vulnerabilities to the Foundations of Historic Structures in Coastal Areas
Mentor: Michael Routhier
University of New Hampshire

Abstract
Sea level rise, one of the most important manifestations of climate change, is expected to result in increased coastal erosion and storm surge flooding. This work reports on a project undertaken to assess the vulnerability of the foundations of historic structures in coastal areas to the potential consequences of climate change, in particular increased storm surge flooding.
Foundations of historic buildings are especially vulnerable to seepage and groundwater intrusion. Many historic structures are themselves used to house valuable historic collections, making it all the more important to preserve and protect them. Data collected from a subsurface geophysical field survey, a geospatial field survey, and a simulation of the hydrological conditions of the site are useful for building an understanding of the threats to foundations of historic structures located on the coast.
The study site chosen for this project is the Strawbery Banke Living History Museum in Portsmouth, NH. Located very close to the banks of the Piscataqua River at its meeting point with the Atlantic Ocean, it comprises many buildings of historical importance dating as far back as 1690. Upstream of the river lies the Great Bay, a tidal estuary formed by the confluence of seven major rivers, making it a highly dynamic, tide-dominated body of fresh and saltwater. Strawbery Banke is representative of historic structures on the northeastern coast of the US that will likely be impacted by climate change. The primary threat to the site's substructures is Puddle Dock, a tidal inlet at the heart of the facility that once provided direct river access to Strawbery Banke and was filled in around 1900.
Results from the long-term deployment of in-situ water level loggers in test wells, temperature and relative humidity sensors, and a ground-penetrating radar survey of Puddle Dock, along with a detailed fine-scale elevation survey of the site, provide a window into the seasonality of groundwater levels and the dynamic equilibrium of fresh and saltwater.
Initial analysis of the data indicates that at present Puddle Dock acts as a conduit for tidal flows, albeit in a restrained manner. This can be attributed to the haphazard way in which it was filled, perhaps with garbage and other waste. It is anticipated that the principles applied in this work can be replicated for assessing the vulnerability of other at-risk historic structures in the northeastern US.
Support for the purchase of hardware came from the National Geographic Society's Waitt Grant program, and undergraduate scholarship support was provided by a grant from the University of New Hampshire's Joan and James Leitzel Center for Mathematics, Science and Engineering Education.

 

JerNettie Burney
Senior - CS
jaburney@mail.ecsu.edu

Robyn Evans
Senior - Mathematics
RREVANS@mail.ecsu.edu

Paper
 

A Comparison of Job Duration Utilizing High Performance Computing on a Distributed Grid
Mentors: Seung Hee Bae and Jong Youl Choi
Indiana University Bloomington

Abstract
Parallel computing is defined as carrying out large-scale calculations simultaneously through the use of multiple computing units, such as processors or cores, that work together to devise a solution. During the summer of 2011, undergraduate research students participating in the Science, Technology, Engineering, and Mathematics Initiative at Indiana University Bloomington were partnered with graduate students to examine the efficiency of parallel computing. To this end, the team created a program that divides the mathematical computation of multiplying large-scale matrices among x nodes, or computer cores.

Overall, the team's goal was three-fold: i) to understand parallel computing and parallel algorithms for large-scale computing in a shared-memory system; ii) to understand the cutting-edge computing technologies needed to maximize the power of multi-core processors; and iii) to learn the standard performance measurement methods for parallel computing. To these ends, the team implemented parallel matrix multiplication algorithms for a shared-memory system and compared their computational performance. The team planned to make comparisons between C, C++, and C#; however, due to time constraints, comparisons were only made between C and C#. These comparisons were made not only to find the fastest program but also to study the efficiency of each one, that is, how well the execution was performed.

A program was first developed to compute sequential matrix multiplication and then was rewritten to include multiple workers to solve the problem; the method for including multiple workers in a program is known as threading. The Open Multi-Processing (OpenMP) and Task Parallel Library (TPL) libraries were used in C and C#, respectively, to specify the number of workers needed to compute the program. The code was then submitted to the designated compute node, cn04 for C and cn05 for C#, on the cluster system Storm. It was during this step that both the matrix size and the number of desired workers were specified. The maximum number of threads that could have been used was 24; however, 16 was chosen as the maximum for research purposes. The thread counts used for calculations were 1, 2, 4, 8, and 16, while the matrix dimensions multiplied were 2048 x 2048 and 4096 x 4096. Finally, the time in seconds it took each job to complete was recorded and a comparison was made.
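The partitioning scheme described above, dividing the output matrix's rows among a fixed number of worker threads, can be sketched as follows. The study's actual implementations used C with OpenMP and C# with TPL; this self-contained Java sketch is only an illustrative stand-in for the same idea, and the matrix size and worker count below are small demonstration values, not the study's parameters.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class ParallelMatMul {
    // Multiply two n x n matrices using `workers` threads, each thread
    // computing a contiguous band of rows of the result matrix.
    static double[][] multiply(double[][] a, double[][] b, int workers) {
        int n = a.length;
        double[][] c = new double[n][n];
        ExecutorService pool = Executors.newFixedThreadPool(workers);
        int band = (n + workers - 1) / workers;    // rows per worker (last band may be short)
        for (int w = 0; w < workers; w++) {
            final int start = w * band;
            final int end = Math.min(n, start + band);
            pool.execute(() -> {
                for (int i = start; i < end; i++)
                    for (int k = 0; k < n; k++)    // k-outer ordering improves cache locality
                        for (int j = 0; j < n; j++)
                            c[i][j] += a[i][k] * b[k][j];
            });
        }
        pool.shutdown();
        try {
            pool.awaitTermination(1, TimeUnit.MINUTES);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return c;
    }

    public static void main(String[] args) {
        int n = 256;    // small demonstration size; the study used 2048 and 4096
        double[][] a = new double[n][n], b = new double[n][n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) { a[i][j] = 1.0; b[i][j] = 1.0; }
        long t0 = System.nanoTime();
        double[][] c = multiply(a, b, 4);    // 4 workers, one of the thread counts tested
        long ms = (System.nanoTime() - t0) / 1_000_000;
        // With all-ones inputs, every entry of c equals n.
        System.out.println("c[0][0] = " + c[0][0] + ", elapsed " + ms + " ms");
    }
}
```

Timing the same multiplication at each thread count in the sequence 1, 2, 4, 8, 16 reproduces the kind of scaling comparison the team performed.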

 

Ryan Lawrence
Junior - Chemistry
ryan.d.lawrence@gmail.com

PowerPoint

 

Developing a Method for Estimating Accumulation Rates Using CReSIS Airborne Snow Radar from West Antarctica
Mentors: Dr. Ian Joughin and Brooke Medley
University of Washington - CReSIS Visualization

Abstract
For more than 50 years, scientists have retrieved ice cores from the world's ice sheets to study ice dynamics as well as past and present climatic and atmospheric conditions, including the accumulation rate. The ice-sheet accumulation rate is not only an important climate indicator but also a significant component of ice-sheet mass balance, which is the total mass gained or lost over a prescribed period of time [1]. Snow accumulation is the primary mass contributor while ice flux into the ocean and surface melting are the primary mass loss mechanisms. As concern over sea-level change and ice-sheet stability increases, more accurate and spatially complete estimates of the accumulation rate are required. Therefore, the sparse point estimates of the accumulation rate (i.e., ice cores) no longer give sufficient data for regional mass balance estimates because of their limited spatial coverage, but remain important paleoclimate records due to their exceptional temporal resolution. In order to constrain current mass balance estimates at the regional scale, improvement in the spatial resolution of accumulation rate estimates is necessary [2].
West Antarctica in particular is seriously lacking in point-based measurements of the accumulation rate, whether through snow pit or ice core analysis. Climate models show the region along the Amundsen Coast receives snowfall amounts unprecedented across most of the continent, yet these models lack any ground-truthing and are limited in their spatial resolution [3]. Thus, any estimates of mass balance over this region are ill-constrained and are in need of much better estimates of the snow accumulation rate.
Using the CReSIS-developed Snow Radar, which is capable of imaging near-surface layers in the uppermost part of the ice sheet at very fine vertical resolution, estimates of very recent firn accumulation rates over Thwaites Glacier along the Amundsen Coast of West Antarctica were calculated using data from Flight One, Segment 02 of the 10/18/2009 flight.
The "short" segment layer (~130.76 km) mean accumulation rate (temporal variation, i.e., the accumulation rate changing with time) was 0.40 m/y, and the "long" segment layer (~297.874 km) was 0.44 m/y. The standard deviation (how much the accumulation rate varies in space) was 0.052 m for the "short" segment layer and 0.068 m for the "long" segment layer. The derived estimates are within the range of previous estimates; however, the continent-wide published estimates do not correspond well with each other or with the specific dataset for Thwaites Glacier.
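For reference, the summary statistics reported above (a mean rate and a spatial standard deviation) are straightforward to compute once a per-trace accumulation rate has been derived from each radar layer. The sketch below uses made-up illustrative rates, not the Snow Radar data.

```java
public class AccumulationStats {
    // Mean and (population) standard deviation of per-trace accumulation
    // rates, e.g. annual-layer thickness divided by the layer's age in years.
    static double[] meanAndStd(double[] rates) {
        double sum = 0;
        for (double r : rates) sum += r;
        double mean = sum / rates.length;
        double var = 0;
        for (double r : rates) var += (r - mean) * (r - mean);
        return new double[] { mean, Math.sqrt(var / rates.length) };
    }

    public static void main(String[] args) {
        // Illustrative per-trace rates in m/y; not actual Thwaites data.
        double[] rates = { 0.35, 0.40, 0.42, 0.45, 0.38 };
        double[] ms = meanAndStd(rates);
        System.out.printf("mean = %.3f m/y, std = %.3f m/y%n", ms[0], ms[1]);
    }
}
```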

 


Ya'Shonti Bridgers
Sophomore - Mathematics
ymbridges970@mail.ecsu.edu

PowerPoint
Paper

 

 


Temporal Reduction of the Ice Shelf in Pine Island Bay Antarctica: 1972 - 2003
Mentor: Dr. Malcolm LeCompte
ECSU Research Experience for Undergraduates in Ocean, Marine and Polar Sciences

Abstract
In an effort to determine whether the Antarctic ice sheet is growing or diminishing over long time intervals, Dr. Robert Bindschadler led an international team of glaciologists and computer scientists, including Elizabeth City State University (ECSU) students, to obtain an accurate measure of the area of the Antarctic ice sheet. Before the ice sheet's area was determined, the grounding line (GL), or boundary dividing the ice sheet resting on land from floating ice, was located by combining 2003 Landsat imagery and satellite-based laser altimetry.

Landsat image data contemporary with that used to create the grounding line was compared to earlier Landsat imagery of the same area. A small ice shelf, now known as the ECSU Ice Shelf, near the eastern entrance to Pine Island Bay was previously identified as having diminished over an approximately 31-year span, and the progressive reduction of its area was qualitatively characterized. Here, the area loss of the ECSU Ice Shelf is quantified over time from 1972 to its disappearance in 2003.

Departures from perfect geographic pixel registration in Landsat imagery of the ECSU Ice Shelf collected before 2003 were corrected with ITT Visual Information Solutions' ENVI image processing software, using a 2003 Landsat 7 Enhanced Thematic Mapper (ETM) image as a reference. Older images from the Landsat 4 and 5 Thematic Mapper (TM) and Landsat 7 ETM were registered to conform to common fixed geographic control points visible on both images. By overlaying the GL on the registered (warped) images, the area changes in the ice shelf were computed. An average ice shelf area was determined from four independent measurement trials for each of the pre-2003 Landsat images. Landsat images from 2003 used in creating the GL were obtained from the United States Geological Survey (USGS) archive. The older, cloud-free Landsat 4, 5 TM and 7 ETM images of the Pine Island Glacier region were obtained from another USGS archive.
Results provided: 1. A quantitative description of the disappearance of the ECSU Ice Shelf from 1972 through 2003; 2. Validation of the grounding line's actual location; and 3. A survey of Antarctic coastal features that may have experienced climate-related change.

 

Joyce Bevins
Senior - Computer Science
blessed1989@hotmail.com
Autumn Like
Sophomore - Computer Science
aymluke2010@gmail.com

Analyzing MapReduce Frameworks Hadoop and Twister
Mentors: Thilina Gunathne, Stephen Wu, Bingjing Zhang
Indiana University

Abstract
The primary focus of this research was to analyze the attributes of MapReduce frameworks for data intensive computing and to compare two different MapReduce frameworks, Hadoop and Twister. MapReduce is a data processing framework that allows developers to write applications that can process large sets of data in a timely manner with the use of distributed computing resources. One of its main features is the ability to partition a large computation into a set of discrete tasks to enable parallel processing of the computation. Google, the most popular search engine on the internet, uses MapReduce to simplify data processing on its large clusters. We analyze the performance of Hadoop and Twister using the Word Count application and compare the scalability and efficiency of the two frameworks for this particular application.
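For context, the Word Count benchmark follows the canonical MapReduce pattern: the map phase emits a (word, 1) pair for every word, and the reduce phase sums the counts for each word. The sketch below simulates that data flow in plain Java without a cluster; the actual runs used Hadoop's and Twister's own APIs, which are not shown here.

```java
import java.util.AbstractMap;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class WordCount {
    // Map phase: emit a (word, 1) pair for every word in every line.
    static List<Map.Entry<String, Integer>> map(List<String> lines) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String line : lines)
            for (String word : line.toLowerCase().split("\\W+"))
                if (!word.isEmpty())
                    pairs.add(new AbstractMap.SimpleEntry<>(word, 1));
        return pairs;
    }

    // Reduce phase: group the pairs by key and sum the counts.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> p : pairs)
            counts.merge(p.getKey(), p.getValue(), Integer::sum);
        return counts;
    }

    public static void main(String[] args) {
        List<String> input = Arrays.asList("the quick brown fox", "the lazy dog");
        // Prints {brown=1, dog=1, fox=1, lazy=1, quick=1, the=2}
        System.out.println(reduce(map(input)));
    }
}
```

In a real deployment, the framework partitions the input among many mappers, shuffles the intermediate pairs by key, and runs the reducers in parallel; the scalability of that shuffle stage is one point where Hadoop and Twister differ.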
Poster

Glen Koch
Junior - Computer Science
gmkoch877@mail.ecsu.edu

Poster

Hybrid Cloud Security: Replication and Direction of Sensitive Data Blocks
Mentors: Dr. Xiaofeng Wang, Kehuan Zhang
School of Informatics and Computing, Indiana University, Bloomington Indiana

Abstract
The primary focus of this research was to analyze the hybrid cloud security platform proposed by Dr. Xiaofeng Wang and his research team. Large-scale data sets in cloud computing environments carry inherent security concerns. The proposed platform involves code implementation and modification within the Hadoop Distributed File System (HDFS) as it pertains to parallel data processing. A hybrid cloud solution separates sensitive data, which is confined to a private domain (private cloud), from non-sensitive data (public cloud). The specific research task was to create and modify Java source code within HDFS to implement alternative replication factors, and then to test and verify that data was replicated to the proper domain (public or private) based on its security tag. This project is supported in part by National Science Foundation Grant CNS-0716292.
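The placement rule being verified, that sensitive blocks replicate only within the private domain while non-sensitive blocks may go to the public domain, can be sketched conceptually. This is not the actual modified HDFS code, which alters Hadoop's internal block-placement and replication logic; it is a self-contained illustration of the tag-based routing idea, and all class and field names below are hypothetical.

```java
import java.util.ArrayList;
import java.util.List;

public class TaggedPlacement {
    enum Domain { PUBLIC, PRIVATE }

    // Hypothetical model of a data block carrying a sensitivity tag.
    static class Block {
        final String id;
        final boolean sensitive;
        Block(String id, boolean sensitive) { this.id = id; this.sensitive = sensitive; }
    }

    // Choose target domains for `replication` copies of a block:
    // sensitive blocks are confined to the private cloud, while
    // non-sensitive blocks are placed on public nodes.
    static List<Domain> place(Block b, int replication) {
        List<Domain> targets = new ArrayList<>();
        for (int i = 0; i < replication; i++)
            targets.add(b.sensitive ? Domain.PRIVATE : Domain.PUBLIC);
        return targets;
    }

    public static void main(String[] args) {
        System.out.println(place(new Block("blk_1", true), 3));   // all PRIVATE
        System.out.println(place(new Block("blk_2", false), 2));  // all PUBLIC
    }
}
```

A test like the project's would write tagged data, vary the replication factor, and then inspect which domain each replica actually landed in.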

Jean Bevins
Senior - CS
twin-j@hotmail.com

Michael Austin
Senior - Computer Science
maaustin@mail.ecsu.edu

Poster

Testing Windows Azure Cloud Computing Service Efficiency
Mentor: Xiaoming Gao
Indiana University

Abstract
The primary focus of this research project was to test cloud computing services using Windows Azure. Windows Azure is a Microsoft cloud platform used to host, scale, and balance web applications through Microsoft datacenters. Clouds are machines accessed via the internet to store files or perform computation. CloudBerry Explorer was used to manage the files in Microsoft Azure Blob Storage. The cloud's efficiency and effectiveness were tested by sending and receiving data at peak and off-peak times. The file types tested included text, JPEG, music, video, and software files. The upload and download times of the files were recorded along with the transfer speed and file size.
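The measurement itself, timing a transfer and dividing the bytes moved by the elapsed time, can be sketched as follows. This Java snippet times an in-memory stream copy as a stand-in for an upload; the actual tests transferred real files to Azure Blob Storage through a client tool, and the payload size here is illustrative.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;

public class TransferTiming {
    // Time a stream-to-stream copy and return throughput in MB/s.
    // In-memory streams stand in for the actual upload/download here.
    static double throughputMBps(ByteArrayInputStream in, ByteArrayOutputStream out, long bytes) {
        byte[] buf = new byte[8192];
        long t0 = System.nanoTime();
        int n;
        while ((n = in.read(buf, 0, buf.length)) != -1)
            out.write(buf, 0, n);
        double seconds = Math.max(System.nanoTime() - t0, 1) / 1e9;
        return (bytes / (1024.0 * 1024.0)) / seconds;
    }

    public static void main(String[] args) {
        byte[] payload = new byte[4 * 1024 * 1024];   // illustrative 4 MB "file"
        double mbps = throughputMBps(new ByteArrayInputStream(payload),
                                     new ByteArrayOutputStream(), payload.length);
        System.out.printf("copied 4 MB at %.1f MB/s%n", mbps);
    }
}
```

Running such a measurement at peak and off-peak times for each file type yields the comparison table the project recorded.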

 