Three computer codes to read, plot and tabulate operational test-site recorded solar data
Computer programs used to process data for the evaluation of collector efficiency and solar system performance are described. The program TAPFIL reads data from an IBM 360 tape containing information (insolation, flow rates, temperatures, etc.) from 48 operational solar heating and cooling test sites. Two other programs, CHPLOT and WRTCNL, plot and tabulate the data from the direct-access, unformatted TAPFIL file. The methodology of the programs, their inputs, and their outputs are described.
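The read-then-tabulate pipeline the abstract describes can be sketched in modern terms. The record layout, field names, and units below are illustrative assumptions, not the actual IBM 360 tape format or the FORTRAN of TAPFIL/WRTCNL:

```python
# Hypothetical sketch of a TAPFIL-style read followed by WRTCNL-style
# tabulation. Field names and the whitespace-delimited layout are invented.

from dataclasses import dataclass

@dataclass
class SolarRecord:
    site_id: int
    insolation_w_m2: float   # solar irradiance
    flow_rate_l_min: float   # collector loop flow rate
    inlet_temp_c: float
    outlet_temp_c: float

def read_records(lines):
    """Parse whitespace-delimited measurement lines (stand-in for the tape read)."""
    records = []
    for line in lines:
        site, ins, flow, t_in, t_out = line.split()
        records.append(SolarRecord(int(site), float(ins), float(flow),
                                   float(t_in), float(t_out)))
    return records

def tabulate(records):
    """Produce a simple fixed-width table, in the spirit of WRTCNL."""
    header = f"{'SITE':>4} {'INSOL':>8} {'FLOW':>6} {'T_IN':>6} {'T_OUT':>6}"
    rows = [f"{r.site_id:>4} {r.insolation_w_m2:>8.1f} {r.flow_rate_l_min:>6.2f} "
            f"{r.inlet_temp_c:>6.1f} {r.outlet_temp_c:>6.1f}" for r in records]
    return "\n".join([header] + rows)

raw = ["7 845.0 3.50 24.1 61.3", "7 612.5 3.40 25.0 55.2"]
print(tabulate(read_records(raw)))
```

A plotting step (the CHPLOT role) would consume the same parsed records; it is omitted here to keep the sketch self-contained.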
Who Needs Guidance?
A key issue in guidance provision is how to make services flexible and responsive to client need. A model is presented which distinguishes between individuals with high, medium and low levels of readiness for career decision-making. It is suggested that those with high levels of readiness can be referred to self-help services; those with moderate readiness to brief staff-assisted services; and those with low readiness to individual case-managed services. The theoretical basis for the model, the use of diagnostic instruments within the model, its implications for career resource rooms and Internet websites, and its staffing implications are discussed. Elements of the model are currently being applied in a careers service setting in Coventry. The main principles of this work are described, and the more general relevance of the model to current policy issues in the UK is examined.
The Centre for Guidance Studies was created in 1998 by the University of Derby and five careers service companies (the Careers Consortium (East Midlands) Ltd.). The centre aims to bridge the gap between guidance theory and practice. It supports and connects guidance practitioners, policy-makers and researchers through research activities and learning opportunities, and by providing access to resources related to guidance and lifelong learning.
Emergent (In)Security of Multi-Cloud Environments
As organizations increasingly use cloud services to host their IT
infrastructure, there is a need to share data among these cloud-hosted
services and systems. A majority of IT organizations have workloads spread
across different cloud service providers, growing their multi-cloud
environments. As an organization grows its multi-cloud environment, the threat
vectors and vulnerabilities for its cloud systems and services grow as well. The increase
in the number of attack vectors creates a challenge of how to prioritize
mitigations and countermeasures to best defend a multi-cloud environment
against attacks. Using multiple industry-standard risk analysis tools, we
conducted an analysis of multi-cloud threat vectors, enabling calculation and
prioritization of the identified mitigations and countermeasures. The
prioritization showed that authentication and architecture are the
highest-risk threat vectors. Armed with this data, IT managers can more
appropriately budget cybersecurity expenditure to implement the most
impactful mitigations and countermeasures.
Systemic Risk and Vulnerability Analysis of Multi-cloud Environments
With the increasing use of multi-cloud environments, security professionals
face challenges in configuration, management, and integration due to uneven
security capabilities and features among providers. As a result, a fragmented
approach toward security has been observed, leading to new attack vectors and
potential vulnerabilities. Other research has focused on single-cloud platforms
or specific applications of multi-cloud environments. Therefore, there is a
need for a holistic security and vulnerability assessment and defense strategy
that applies to multi-cloud platforms. We perform a risk and vulnerability
analysis to identify attack vectors from software, hardware, and the network,
as well as interoperability security issues in multi-cloud environments.
Applying the STRIDE and DREAD threat modeling methods, we present an analysis
of the ecosystem across six attack vectors: cloud architecture, APIs,
authentication, automation, management differences, and cybersecurity
legislation. We quantitatively determine and rank the threats in multi-cloud
environments and suggest mitigation strategies.
Comment: 27 pages, 9 figures
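As a sketch of how a DREAD-style quantitative ranking works, the snippet below scores a few of the attack vectors named in the abstract: each threat gets a 1-10 rating on the five DREAD factors, and threats are ranked by their mean score. The ratings are invented placeholders, not the paper's data:

```python
# Illustrative DREAD scoring and ranking of multi-cloud attack vectors.
# All factor ratings below are made up for demonstration purposes.

DREAD_FACTORS = ("damage", "reproducibility", "exploitability",
                 "affected_users", "discoverability")

def dread_score(ratings):
    """Mean of the five DREAD factor ratings (each on a 1-10 scale)."""
    return sum(ratings[f] for f in DREAD_FACTORS) / len(DREAD_FACTORS)

threats = {
    "authentication": {"damage": 9, "reproducibility": 8, "exploitability": 7,
                       "affected_users": 9, "discoverability": 6},
    "cloud architecture": {"damage": 8, "reproducibility": 7, "exploitability": 7,
                           "affected_users": 8, "discoverability": 7},
    "APIs": {"damage": 7, "reproducibility": 6, "exploitability": 6,
             "affected_users": 7, "discoverability": 8},
}

# Rank threats from highest to lowest DREAD score.
ranked = sorted(threats, key=lambda t: dread_score(threats[t]), reverse=True)
for name in ranked:
    print(f"{name}: {dread_score(threats[name]):.1f}")
```

With these placeholder ratings, authentication (7.8) outranks cloud architecture (7.4) and APIs (6.8), mirroring the kind of ordering the analysis produces.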
Corporate Social Responsibility and Its Reporting From a Management Control System Perspective
Corporate social responsibility (CSR) is a response to stakeholder concerns
and signals organisational legitimacy. We propose CSR reporting from a
comprehensive and integrated management control system perspective.
Reporting parameters start with stated people, planet, and profit goals supported
by objectives achieved through legal, ethical/moral, economic, and
giving practices. Objectives are measured, assessed, and reported through
key performance indicators. These objectives are quantified, sufficiently
specific, have a timeline, and identify targeted stakeholder group(s). CSR
strategy and its reporting are consistent with organisational mission, values,
and strategy. CSR, like most business processes, is a dynamic process
occurring over time and adjusting to circumstances, sometimes involving
trade-offs. CSR reporting ideally reflects this process by providing
context and visual depictions of goals, practices, and performance evaluation
that demonstrate not only a single period in time but also trends
that may present a more complete picture of an organisation's CSR performance.
CSR reporting parameters are proposed.
Twitter Watch: Leveraging Social Media to Monitor and Predict Collective-Efficacy of Neighborhoods
Sociologists associate the spatial variation of crime within an urban
setting with the concept of collective efficacy. The collective efficacy of a
neighborhood is defined as social cohesion among neighbors combined with their
willingness to intervene on behalf of the common good. Sociologists measure
collective efficacy by conducting survey studies designed to measure
individuals' perception of their community. In this work, we employ the curated
data from a survey study (ground truth) and examine the effectiveness of
substituting costly survey questionnaires with proxies derived from social
media. We enrich a corpus of tweets mentioning a local venue with several
linguistic and topological features. We then propose a pairwise learning to
rank model with the goal of identifying a ranking of neighborhoods that is
similar to the ranking obtained from the ground truth collective efficacy
values. In our experiments, we find that our generated ranking of neighborhoods
achieves 0.77 Kendall tau-x ranking agreement with the ground truth ranking.
Overall, our results are up to 37% better than traditional baselines.
Comment: 10 pages, 7 figures
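To illustrate the ranking-agreement metric: plain Kendall tau counts concordant versus discordant item pairs between two orderings (the tau-x variant used in the paper additionally handles ties). The neighborhood scores below are invented for demonstration:

```python
# Plain Kendall tau between two score lists over the same items.
# The paper reports Kendall tau-x, a variant that also accounts for ties;
# this sketch covers only the tie-free case.

def kendall_tau(a, b):
    """Fraction of concordant minus discordant pairs, in [-1, 1]."""
    n = len(a)
    concordant = discordant = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (a[i] - a[j]) * (b[i] - b[j])
            if s > 0:
                concordant += 1
            elif s < 0:
                discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Toy example: ground-truth collective-efficacy scores vs. model scores
# for five hypothetical neighborhoods (one pair is ordered differently).
truth = [0.9, 0.7, 0.6, 0.4, 0.2]
model = [0.8, 0.5, 0.75, 0.45, 0.1]
print(kendall_tau(truth, model))  # 0.8 (9 concordant, 1 discordant of 10 pairs)
```

A value of 1.0 means identical orderings and -1.0 a fully reversed ordering, so the 0.77 agreement reported above indicates the learned ranking closely tracks the survey-derived one.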
The UK guidelines for management and surveillance of Tuberous Sclerosis Complex.
Background: The severity of Tuberous Sclerosis Complex (TSC) can vary among affected individuals. Complications of TSC can be life threatening, with significant impact on patients' quality of life. Management may vary depending on the treating physician, local and national policies, and funding. There are no current UK guidelines. We conducted a Delphi consensus process to reach agreed guidance for the management of patients with TSC in the UK. Methods: We performed a literature search and reviewed the 2012/13 international guideline for TSC management. Based on these, a Delphi questionnaire was formed. We invited 86 clinicians and medical researchers to complete an online survey in two rounds. All the people surveyed were based in the UK. Clinicians were identified through the regional TSC clinics, and researchers were identified through publications. In round one, 55 questions were asked. In round two, 18 questions were asked in order to obtain consensus on the outstanding points that had been contentious in round one. The data were analysed by a core committee and subcommittees, which consisted of UK experts in different aspects of TSC. The Tuberous Sclerosis Association was consulted. Results: 51 TSC experts took part in this survey. Two rounds were required to achieve consensus. The responders were neurologists, nephrologists, psychiatrists, psychologists, oncologists, general paediatricians, dermatologists, urologists, radiologists, clinical geneticists, neurosurgeons, and respiratory and neurodisability clinicians. Conclusions: These new UK guidelines for the management and surveillance of TSC patients provide consensus guidance for delivery of best clinical care to individuals with TSC in the UK.
Review of the Tuberous Sclerosis Renal Guidelines from the 2012 Consensus Conference: Current Data and Future Study.
Renal-related disease is the most common cause of tuberous sclerosis complex (TSC)-related death in adults, and renal angiomyolipomas can lead to complications that include chronic kidney disease (CKD) and hemorrhage. International TSC guidelines recommend mammalian target of rapamycin (mTOR) inhibitors as first-line therapy for management of asymptomatic, growing angiomyolipomas >3 cm in diameter. This review discusses data regarding patient outcomes that were used to develop current guidelines for embolization of renal angiomyolipomas and presents recent data on two available mTOR inhibitors, sirolimus and everolimus, in the treatment of angiomyolipoma. TSC-associated renal angiomyolipomas can recur after embolization. Both sirolimus and everolimus have shown effectiveness in reduction of angiomyolipoma volume, with an acceptable safety profile that includes preservation of renal function with long-term therapy. The authors propose a hypothesis for mTORC1 haploinsufficiency as an additional mechanism for CKD and propose that preventive therapy with mTOR inhibitors might have a role in reducing the number of angiomyolipoma-related deaths. Because mTOR inhibitors target the underlying pathophysiology of TSC, patients might benefit from treatment of multiple manifestations with one systemic therapy. Based on recent evidence, new guidelines should be considered that support the earlier initiation of mTOR inhibitor therapy for the management of renal angiomyolipomas to prevent future serious complications, rather than trying to rescue patients after the complications have occurred.
SURFACE PREPARATION OF STEEL SUBSTRATES USING GRIT-BLASTING
The primary purpose of grit blasting for thermal spray applications is to ensure a strong mechanical bond between the substrate and the coating by the enhanced roughening of the substrate material. This study presents statistically designed experiments conducted to investigate the effect of abrasives on roughness for A36/1020 steel. The experiments were conducted using a Box statistical design of experiment (SDE) approach. Three grit blasting parameters and their effect on the resultant substrate roughness were investigated: blast media, blast pressure, and working distance. The substrates were characterized for roughness using surface profilometry. These attributes were correlated with the changes in operating parameters. Twin-Wire Electric Arc (TWEA) coatings of aluminum and zinc/aluminum were deposited on the grit-blasted substrates. These coatings were then tested for bond strength. Bond strength studies were conducted utilizing a portable adhesion tester following ASTM standard D4541.
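A minimal sketch of how main effects are estimated in such a designed experiment, assuming a simple two-level coding of the three blasting parameters. The runs and roughness values below are invented, and the actual study used a Box statistical design, of which this full factorial is a simplified stand-in:

```python
# Hypothetical two-level factorial on the three grit-blasting parameters,
# with surface roughness (um Ra) as the response. All numbers are invented.

FACTORS = ("media", "pressure", "distance")

# Each run: coded levels (-1 = low, +1 = high) plus a measured roughness.
runs = [
    ({"media": -1, "pressure": -1, "distance": -1}, 4.2),
    ({"media": +1, "pressure": -1, "distance": -1}, 6.1),
    ({"media": -1, "pressure": +1, "distance": -1}, 5.0),
    ({"media": +1, "pressure": +1, "distance": -1}, 7.3),
    ({"media": -1, "pressure": -1, "distance": +1}, 3.9),
    ({"media": +1, "pressure": -1, "distance": +1}, 5.8),
    ({"media": -1, "pressure": +1, "distance": +1}, 4.7),
    ({"media": +1, "pressure": +1, "distance": +1}, 7.0),
]

def main_effect(factor):
    """Mean response at the high level minus mean response at the low level."""
    high = [y for levels, y in runs if levels[factor] == +1]
    low = [y for levels, y in runs if levels[factor] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

for f in FACTORS:
    print(f"{f}: {main_effect(f):+.2f}")
```

In this made-up data, blast media has the largest main effect on roughness (+2.10 um), pressure a moderate one (+1.00 um), and working distance a small negative one (-0.30 um); the study's actual conclusions come from its own measured runs.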