The Prevalence and Control of Bacillus and Related Spore-Forming Bacteria in the Dairy Industry
peer-reviewed
Milk produced in the udder is sterile but, due to its high nutrient content, it can be a good growth substrate for contaminating bacteria. The quality of milk is monitored via somatic cell counts and total bacterial counts, with prescribed regulatory limits to ensure quality and safety. Bacterial contaminants can cause disease or spoilage of milk and its secondary products. Aerobic spore-forming bacteria, such as those from the genera Sporosarcina, Paenisporosarcina, Brevibacillus, Paenibacillus, Geobacillus and Bacillus, are a particular concern in this regard, as they are able to survive industrial pasteurization and form biofilms within pipes and on stainless steel equipment. These single- or multiple-species biofilms become a reservoir of spoilage microorganisms, and a cycle of contamination can be initiated. Indeed, previous studies have highlighted that these microorganisms are highly prevalent in dead ends, corners, cracks, crevices, gaskets, valves and joints of the stainless steel equipment used in dairy manufacturing plants. Hence, adequate monitoring and control measures are essential to prevent spoilage and ensure consumer safety. Common control approaches include specific cleaning-in-place processes, chemical and biological biocides, and other novel methods. In this review, we highlight the problems caused by these microorganisms, and discuss issues relating to their prevalence, monitoring and control with respect to the dairy industry.
NG is funded by the Teagasc Walsh Fellowship Scheme and through the Irish Dairy Levy funded project ‘Thermodur-Out.
Token Coherence: A New Framework for Shared-Memory Multiprocessors
Commercial workload and technology trends are pushing existing shared-memory multiprocessor coherence protocols in divergent directions. Token Coherence provides a framework for new coherence protocols that can reconcile these opposing trends.
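The core idea behind token counting can be illustrated with a toy sketch. This is a hypothetical illustration, not code from the paper: the names and the four-token system are invented here. The invariant is that each memory block has a fixed number of tokens; a cache may read a block only while holding at least one token, and may write it only while holding all of them.

```python
# Toy sketch of the Token Coherence safety invariant (illustrative only,
# not the paper's implementation): a block has TOTAL_TOKENS tokens;
# >= 1 token permits reading, all tokens permit writing.

TOTAL_TOKENS = 4  # e.g., one token per core in this toy system


class Cache:
    def __init__(self):
        self.tokens = 0

    def can_read(self):
        return self.tokens >= 1

    def can_write(self):
        return self.tokens == TOTAL_TOKENS


def transfer(src, dst, n):
    """Move n tokens between caches; tokens are conserved, never created."""
    assert src.tokens >= n
    src.tokens -= n
    dst.tokens += n


a, b = Cache(), Cache()
a.tokens = TOTAL_TOKENS      # cache A starts as the sole owner
assert a.can_write()

transfer(a, b, 1)            # share the block: one token to cache B
assert a.can_read() and b.can_read()
assert not a.can_write()     # a write now requires collecting all tokens

transfer(b, a, 1)            # reclaim the token
assert a.can_write()
```

Because tokens are conserved, a reader and a writer can never coexist regardless of message ordering, which is what lets the framework decouple correctness from the performance policy.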
Why On-Chip Cache Coherence is Here to Stay
Today’s multicore chips commonly implement shared memory with cache coherence as low-level support for operating systems and application software. Technology trends continue to enable the scaling of the number of (processor) cores per chip. Because conventional wisdom says that coherence does not scale well to many cores, some prognosticators predict the end of coherence. This paper refutes this conventional wisdom by showing one way to scale on-chip cache coherence with bounded costs by combining known techniques such as: shared caches augmented to track cached copies, explicit cache eviction notifications, and hierarchical design. Based upon our scalability analysis of this proof-of-concept design, we predict that on-chip coherence, and the programming convenience and compatibility it provides, are here to stay.
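Two of the techniques the abstract names, tracking cached copies at the shared cache and explicit eviction notifications, can be sketched as a toy directory. This is a hypothetical illustration under assumed names (Directory, read, write, evict are invented here), not the paper's design: the point is that an exact sharer set bounds invalidation traffic to the cores that actually hold a copy.

```python
# Toy sketch (illustrative, not from the paper) of a shared cache
# augmented to track cached copies per block. Explicit eviction
# notifications keep the sharer set exact, so a write invalidates
# only the cores that actually hold the block.

class Directory:
    def __init__(self, num_cores):
        self.num_cores = num_cores
        self.sharers = {}                    # block address -> set of core ids

    def read(self, core, addr):
        """A core fetches a shared copy; record it in the sharer set."""
        self.sharers.setdefault(addr, set()).add(core)

    def write(self, core, addr):
        """Invalidate exactly the tracked copies; return who was invalidated."""
        invalidated = self.sharers.get(addr, set()) - {core}
        self.sharers[addr] = {core}
        return invalidated

    def evict(self, core, addr):
        """Explicit notification: the sharer set never over-approximates."""
        self.sharers.get(addr, set()).discard(core)


d = Directory(num_cores=8)
d.read(0, 0x100)
d.read(1, 0x100)
d.evict(1, 0x100)                 # core 1 tells the directory it evicted
assert d.write(2, 0x100) == {0}   # only core 0 still holds a copy
```

With inexact tracking (or no eviction notifications), the write would have had to invalidate core 1 as well; exact tracking is one reason the paper argues coherence costs stay bounded as core counts grow.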
Accelerating Science: A Computing Research Agenda
The emergence of "big data" offers unprecedented opportunities for not only
accelerating scientific advances but also enabling new modes of discovery.
Scientific progress in many disciplines is increasingly enabled by our ability
to examine natural phenomena through the computational lens, i.e., using
algorithmic or information processing abstractions of the underlying processes;
and our ability to acquire, share, integrate and analyze disparate types of
data. However, there is a huge gap between our ability to acquire, store, and
process data and our ability to make effective use of the data to advance
discovery. Despite successful automation of routine aspects of data management
and analytics, most elements of the scientific process currently require
considerable human expertise and effort. Accelerating science to keep pace with
the rate of data acquisition and data processing calls for the development of
algorithmic or information processing abstractions, coupled with formal methods
and tools for modeling and simulation of natural processes as well as major
innovations in cognitive tools for scientists, i.e., computational tools that
leverage and extend the reach of human intellect, and partner with humans on a
broad range of tasks in scientific discovery (e.g., identifying, prioritizing
and formulating questions; designing, prioritizing and executing experiments
designed to answer a chosen question; drawing inferences and evaluating the
results; and formulating new questions, in a closed-loop fashion). This calls
for a concerted research agenda aimed at: Development, analysis, integration,
sharing, and simulation of algorithmic or information processing abstractions
of natural processes, coupled with formal methods and tools for their analyses
and simulation; Innovations in cognitive tools that augment and extend human
intellect and partner with humans in all aspects of science.
Comment: Computing Community Consortium (CCC) white paper, 17 pages
Advanced Cyberinfrastructure for Science, Engineering, and Public Policy
Progress in many domains increasingly benefits from our ability to view the
systems through a computational lens, i.e., using computational abstractions of
the domains; and our ability to acquire, share, integrate, and analyze
disparate types of data. These advances would not be possible without the
advanced data and computational cyberinfrastructure and tools for data capture,
integration, analysis, modeling, and simulation. However, despite, and perhaps
because of, advances in "big data" technologies for data acquisition,
management and analytics, the other largely manual and labor-intensive aspects
of the decision making process, e.g., formulating questions, designing studies,
organizing, curating, connecting, correlating and integrating cross-domain data,
drawing inferences and interpreting results, have become the rate-limiting
steps to progress. Advancing the capability and capacity for evidence-based
improvements in science, engineering, and public policy requires support for
(1) computational abstractions of the relevant domains coupled with
computational methods and tools for their analysis, synthesis, simulation,
visualization, sharing, and integration; (2) cognitive tools that leverage and
extend the reach of human intellect, and partner with humans on all aspects of
the activity; (3) nimble and trustworthy data cyberinfrastructures that
connect and manage a variety of instruments, multiple interrelated data types and
associated metadata, data representations, processes, protocols and workflows;
and enforce applicable security and data access and use policies; and (4)
organizational and social structures and processes for collaborative and
coordinated activity across disciplinary and institutional boundaries.Comment: A Computing Community Consortium (CCC) white paper, 9 pages. arXiv
admin note: text overlap with arXiv:1604.0200