Geospatial modeling approach to monument construction using Michigan from A.D. 1000–1600 as a case study
Building monuments was one way that past societies reconfigured their landscapes in response to shifting social and ecological factors. Understanding the connections between those factors and monument construction is critical, especially when multiple types of monuments were constructed across the same landscape. Geospatial technologies enable past cultural activities and environmental variables to be examined together at large scales. Many geospatial modeling approaches, however, are not designed for presence-only (occurrence) data, which can be limiting given that many archaeological site records are presence only. We use maximum entropy modeling (MaxEnt), which works with presence-only data, to predict the distribution of monuments across large landscapes, and we analyze MaxEnt output to quantify the contributions of spatioenvironmental variables to predicted distributions. We apply our approach to co-occurring Late Precontact (ca. A.D. 1000–1600) monuments in Michigan: (i) mounds and (ii) earthwork enclosures. Many of these features have been destroyed by modern development, and therefore, we conducted archival research to develop our monument occurrence database. We modeled each monument type separately using the same input variables. Analyzing variable contribution to MaxEnt output, we show that mound and enclosure landscape suitability was driven by contrasting variables. Proximity to inland lakes was key to mound placement, and proximity to rivers was key to sacred enclosures. This juxtaposition suggests that mounds met local needs for resource procurement success, whereas enclosures filled broader regional needs for intergroup exchange and shared ritual. Our study shows how MaxEnt can be used to develop sophisticated models of past cultural processes, including monument building, with imperfect, limited, presence-only data.
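In practice, presence-only modeling of this kind is often approximated by contrasting presence points against randomly drawn "background" points with a logistic model. The sketch below illustrates that idea on synthetic data; the variable names (dist_to_lake, dist_to_river) and all values are hypothetical, not the study's data.

```python
# Sketch of presence-only suitability modeling in the spirit of MaxEnt:
# presence points are contrasted against random "background" points with
# a logistic model, a common practical approximation of MaxEnt. The data
# and the variable names (dist_to_lake, dist_to_river) are hypothetical.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic landscape variables: presences are biased toward short
# distances to inland lakes; distance to rivers is uninformative here.
presence = np.column_stack([rng.exponential(1.0, 200),    # dist_to_lake (km)
                            rng.exponential(5.0, 200)])   # dist_to_river (km)
background = np.column_stack([rng.exponential(5.0, 1000),
                              rng.exponential(5.0, 1000)])

X = np.vstack([presence, background])
y = np.concatenate([np.ones(len(presence)), np.zeros(len(background))])

# Standardize features, add an intercept, and fit by gradient descent.
X = (X - X.mean(axis=0)) / X.std(axis=0)
Xb = np.column_stack([np.ones(len(X)), X])
w = np.zeros(Xb.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w -= 0.1 * Xb.T @ (p - y) / len(y)

# |weight| serves here as a crude proxy for each variable's contribution
# to the predicted suitability surface.
contribution = dict(zip(["dist_to_lake", "dist_to_river"], np.abs(w[1:])))
print(contribution)
```

A real analysis would use the MaxEnt software or an equivalent package, many background points drawn across the study region, and permutation-based measures of variable contribution rather than raw weights.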
An Evaluation Of A Multigrid Algorithm Error Spike Correction Method
This study is an evaluation of a method of improving the multigrid process by correcting error spikes which are generated when moving from a coarser to a finer level. The correction method was tested on nine one-dimensional problems governed by second-order differential equations. Tests were performed with an accommodative, full approximation scheme, full multigrid algorithm.
Results indicate that appropriate implementation of the correction can increase solution accuracy. Accuracy was increased in 75% of cases in which a single correction was applied to a point in the central portion of the grid. Single corrections performed on points with error greater than the average error were effective 86% of the time. Further study is required to determine a method of identifying this scenario.
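The general shape of such a scheme can be sketched as a two-grid cycle in which points whose residual exceeds the average residual receive one extra local relaxation after the coarse-grid correction is interpolated back. This is a generic illustration under those assumptions, not the specific correction method evaluated in the study.

```python
# Hedged sketch of a 1D two-grid cycle for -u'' = f on [0, 1] with zero
# Dirichlet boundaries, plus a simple "spike" treatment: after the
# coarse-grid correction is interpolated back to the fine grid, points
# whose residual magnitude is above average receive one extra local
# relaxation. A generic illustration, not the study's method.
import numpy as np

def residual(u, f, h):
    r = np.zeros_like(u)
    r[1:-1] = f[1:-1] - (2 * u[1:-1] - u[:-2] - u[2:]) / h**2
    return r

def jacobi(u, f, h, sweeps):
    u = u.copy()
    for _ in range(sweeps):
        u[1:-1] = 0.5 * (u[:-2] + u[2:] + h**2 * f[1:-1])
    return u

def two_grid(u, f, h):
    u = jacobi(u, f, h, 3)                          # pre-smooth
    r = residual(u, f, h)
    rc = r[::2]                                     # restrict by injection
    ec = jacobi(np.zeros_like(rc), rc, 2 * h, 200)  # approximate coarse solve
    fine = np.arange(len(u))
    u = u + np.interp(fine, fine[::2], ec)          # linear prolongation
    # Spike correction: one local relaxation wherever the residual
    # magnitude exceeds the average interior residual magnitude.
    r = residual(u, f, h)
    idx = np.where(np.abs(r[1:-1]) > np.abs(r[1:-1]).mean())[0] + 1
    u[idx] = 0.5 * (u[idx - 1] + u[idx + 1] + h**2 * f[idx])
    return jacobi(u, f, h, 3)                       # post-smooth

n = 65
h = 1.0 / (n - 1)
x = np.linspace(0.0, 1.0, n)
f = np.pi**2 * np.sin(np.pi * x)                    # exact solution: sin(pi*x)
u = np.zeros(n)
for _ in range(30):
    u = two_grid(u, f, h)
err = np.max(np.abs(u - np.sin(np.pi * x)))
print(err)
```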
Cybersecurity on My Mind: Protecting Georgia Consumers from Data Breaches
In a world where vast amounts of personal information are obtained and stored by countless organizations and businesses in the public and private sector, data breaches, due to negligence or nefarious hacking, are a far too common occurrence. The results of a data breach can be serious and widespread, from public humiliation to identity theft and national security crises. In an effort to protect consumers from the potentially devastating effects of data breaches, the Federal Trade Commission has begun to take enforcement action against businesses whose data security practices are alleged to be unfair and deceptive. Theoretically, states can take similar actions under their Little FTC Acts or data breach notification laws.
This Note argues that Georgia's Little FTC Act, the Fair Business Practices Act, and data breach notification law, the Georgia Personal Identity Protection Act, provide insufficient protection from data breaches for Georgia consumers and insufficient recourse for those harmed by breaches. This Note also proposes several changes in Georgia's statutory scheme that would incentivize organizations to implement stronger data security measures and provide better remedies for injured consumers.
Teen Perceptions of Sexual Health Education in Marin County
Sexual health education in the United States has generally been taught using two approaches: abstinence-only or comprehensive sex education. Abstinence-only-until-marriage programs teach abstinence as the preferred option for expressing sexuality and reducing the risk of negative outcomes. This type of education usually censors information regarding contraception and other barrier methods for protecting against sexually transmitted diseases (advocatesforyouth.org). Comprehensive sex education also teaches that abstinence is the best method of preventing unintended pregnancies, but it additionally explores preventive methods for people who choose to engage in sexual behavior. Although larger studies have been conducted on comprehensive and abstinence education, adolescent perceptions are rarely incorporated into the research.
My research will explore perceptions of the effectiveness of the sexual health education offered in Marin County schools. This research will be conducted through interviews with adolescents who attend local high schools in Marin County. I will investigate the various factors that contribute to how adolescents perceive the sexual health education being offered. Through analysis of the interviews, I will provide recommendations for health education professionals, so that they can provide services that teens in Marin County perceive as relevant or effective.
Reliability History and Improvements to the ANL 50 MeV H- Accelerator
The H- Accelerator consists of a 750 keV Cockcroft-Walton preaccelerator and an Alvarez-type 50 MeV linac. The accelerator has been in operation since 1961. Since 1981, it has been used as the injector for the Intense Pulsed Neutron Source (IPNS), a national user facility for neutron scattering. The linac delivers about 3.5x10^12 H- ions per pulse, 30 times per second (30 Hz), for multi-turn injection to a 450 MeV Rapid Cycling Synchrotron (RCS). IPNS presently operates about 4,000 hours per year, and operating when scheduled is critical to meeting the needs of the user community. For many years the IPNS injector/RCS has achieved an average reliability of 95%, helped in large part by the preaccelerator/linac, which has averaged nearly 99%. To maintain and improve system reliability, records need to show what each subsystem contributes to the total down time. The history of source and linac subsystem reliability, and improvements that have been made to improve reliability, will be described. Plans to maintain or enhance this reliability for at least another ten years of operation will also be discussed.
Comment: 3 pages, 1 figure
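The subsystem downtime bookkeeping described in this abstract can be sketched as a simple roll-up of a downtime log into percent reliability against scheduled hours. The subsystem names and hours below are illustrative placeholders, not actual IPNS records.

```python
# Sketch of per-subsystem downtime accounting: roll a downtime log up
# into overall reliability against scheduled hours, plus each
# subsystem's share. Names and hours are illustrative, not IPNS data.
scheduled_hours = 4000.0

# Hypothetical downtime log: (subsystem, hours lost)
downtime = [
    ("ion source", 12.5),
    ("preaccelerator", 6.0),
    ("linac RF", 20.0),
    ("ion source", 3.5),
]

by_subsystem = {}
for name, hours in downtime:
    by_subsystem[name] = by_subsystem.get(name, 0.0) + hours

total_down = sum(by_subsystem.values())
reliability = 100.0 * (scheduled_hours - total_down) / scheduled_hours
print(f"overall reliability: {reliability:.2f}%")
for name, hours in sorted(by_subsystem.items(), key=lambda kv: -kv[1]):
    print(f"  {name}: {hours:.1f} h ({100 * hours / scheduled_hours:.2f}% of schedule)")
```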
Effects of live-bait shrimp trawling on seagrass beds and fish bycatch in Tampa Bay, Florida
The use of live shrimp for bait in
recreational fishing has resulted in
a controversial fishery for shrimp in
Florida. In this fishery, night collections
are conducted over seagrass
beds with roller beam trawls to capture
live shrimp, primarily pink
shrimp, Penaeus duorarum. These
shrimp are culled from the catch on
sorting tables and placed in onboard
aerated “live” wells. Beds of
turtlegrass, Thalassia testudinum,
a species that has highest growth
rates and biomass during summer
and lowest during the winter (Fonseca
et al., 1996) are predominant
areas for live-bait shrimp trawling
(Tabb and Kenny, 1969).
Our study objectives were 1) to
determine effects of a roller beam
trawl on turtlegrass biomass and
morphometrics during intensive
(up to 18 trawls over a turtlegrass
bed), short-term (3-hour duration)
use and 2) to examine the mortality
of bycatch finfish following capture
by a trawl
Methods and materials for detection of multiple sclerosis
Methods and materials for diagnosis of a multiple sclerosis disease state. Antigenic blood fractions from patients clinically diagnosed for multiple sclerosis are employed to generate heterologous species antibodies. Novel antibody preparations are employed to detect the presence or absence, in a blood sample of a patient to be tested, of immunologically significant components specifically associated with a multiple sclerosis disease state.
Contact Interface Verification for DYNA3D Scenario 1: Basic Contact
A suite of test problems has been developed to examine contact behavior within the nonlinear, three-dimensional, explicit finite element analysis (FEA) code DYNA3D (Lin, 2005). The test problems address the basic functionality of the contact algorithms, including the behavior of various kinematic, penalty, and Lagrangian enforcement formulations. The results from the DYNA3D analyses are compared to closed form solutions to verify the contact behavior. This work was performed as part of the Verification and Validation efforts of the LLNL W Program within the NNSA's Advanced Simulation and Computing (ASC) Program. DYNA3D models the transient dynamic response of solids and structures including the interactions between disjoint bodies (parts). A wide variety of contact surfaces are available to represent the diverse interactions possible during an analysis, including relative motion (sliding), separation and gap closure (voids), and fixed relative position (tied). The problem geometry may be defined using a combination of element formulations, including one-dimensional beam and truss elements, two-dimensional shell elements, and three-dimensional solid elements. Consequently, it is necessary to consider various element interactions for each contact algorithm being verified. Most of the contact algorithms currently available in DYNA3D are examined; the exceptions are the Type 4--Single Surface Contact and Type 11--SAND algorithms. It is likely that these algorithms will be removed since their functionality is embodied in other, more robust, contact algorithms. The automatic contact algorithm is evaluated using the Type 12 interface. Two other variations of automatic contact, Type 13 and Type 14, offer additional means to adapt the interface domain, but share the same search and restoration algorithms as Type 12. The contact algorithms are summarized in Table 1.
This report and associated test problems examine the scenario where one contact surface exists between two disjoint bodies. These test problems focus on whether a particular contact algorithm properly represents the interactions along the interface. A companion report (McMichael, 2006) and test problems address the multi-contact scenario in which multiple bodies interact with each other via multiple interfaces. The multi-contact test problems examine whether any ordering issues exist in the contact logic. The test problems are analyzed using version 5.2 (compiled on 12/22/2005) of DYNA3D. The analytical results are used to form baseline solutions for subsequent regression testing.
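The verify-then-regress pattern this report describes, comparing a computed result first to a closed-form solution and then to a recorded baseline, can be sketched generically. The "model" below is a stand-in constant-velocity block with hypothetical tolerances, not DYNA3D output.

```python
# Minimal sketch of the verification-and-regression pattern: compare a
# numerical result against a closed-form solution, then against a stored
# baseline from a previously accepted run. The "model" here is a
# stand-in constant-velocity block, not DYNA3D output.
import math

V0 = 1.0        # prescribed velocity
DT = 1e-3       # explicit time step
T_FINAL = 0.5

def closed_form(t):
    # Closed-form displacement of a block moving at constant velocity V0.
    return V0 * t

def simulated(t):
    # Stand-in explicit time integration of the same motion.
    x = 0.0
    for _ in range(int(round(t / DT))):
        x += V0 * DT
    return x

exact = closed_form(T_FINAL)
computed = simulated(T_FINAL)

# Verification: numerical result vs closed form, within a tolerance.
rel_err = abs(computed - exact) / abs(exact)
assert rel_err < 1e-6, f"verification failed: rel_err={rel_err:.3e}"

# Regression: numerical result vs the recorded baseline solution.
baseline = 0.5   # value recorded from an earlier accepted run
assert math.isclose(computed, baseline, rel_tol=1e-9)
print("verification and regression checks passed")
```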