Interface refactoring in performance-constrained web services
This paper presents the development of REF-WS, an approach that enables a Web Service provider to reliably evolve their service through the application of refactoring transformations. REF-WS is intended to aid service providers, particularly in reliability- and performance-constrained domains, as it permits upgraded 'non-backwards-compatible' services to be deployed into a performance-constrained network where existing consumers depend on an older version of the service interface. For this to be successful, the refactoring and message mediation must occur without affecting functional compatibility with the service's consumers, and must operate within the performance overhead expected of the original service, introducing as little latency as possible. Furthermore, compared to a manually programmed solution, the presented approach enables the service developer to apply and parameterize refactorings with confidence that they will not produce an invalid or 'corrupt' transformation of messages. This is achieved through the use of preconditions for the defined refactorings.
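A minimal sketch of the precondition-guarded style of refactoring described above, assuming a simple field-rename refactoring and dictionary-shaped messages; the names and message format are illustrative, not the REF-WS specification.

```python
# Illustrative only: the refactoring, precondition, and message format below
# are assumptions, not the REF-WS specification.

def precondition_rename(old_name, new_name, schema_fields):
    """A field-rename refactoring is valid only if the old field exists and
    the new name does not collide with an existing field."""
    return old_name in schema_fields and new_name not in schema_fields

def mediate_rename(message, old_name, new_name):
    """Rewrite a consumer's old-format message into the upgraded format,
    so existing consumers keep working against the evolved interface."""
    mediated = dict(message)
    mediated[new_name] = mediated.pop(old_name)
    return mediated

schema = {"customerId", "orderDate"}
if precondition_rename("customerId", "clientId", schema):
    upgraded = mediate_rename({"customerId": 42, "orderDate": "2024-01-01"},
                              "customerId", "clientId")
    print(upgraded)  # {'orderDate': '2024-01-01', 'clientId': 42}
```

The precondition check is what gives the developer confidence that applying the refactoring cannot corrupt mediated messages; an invalid rename is rejected before any message is transformed.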
A confluence of new technology and the right to water: Experience and potential from South Africa's constitution and commons
South Africa's groundbreaking constitution explicitly confers a right of access to sufficient water (section 27). But the country is officially 'water-stressed' and around 10% of the population still has no access to on-site or off-site piped or tap water. It is evident that a disconnect exists between this right and the reality for many; however, the reasons for the continuation of such discrepancies are not always clear. While barriers to sufficient water are myriad, one significant factor contributing to insufficient and unpredictable access to water is the high percentage of broken water pumps. Previous studies have reported that between 20% and 50% of all hand-operated water pumps installed on the African continent are broken or out of use. Monitoring and maintenance of pumps, which in South Africa is the responsibility of local municipalities, is often ineffective, in part due to the distances between municipal centres and rural communities and the consequent costs of site visits, as well as breakdowns within the local bureaucratic system. The emergence of new telemetry tools that can remotely monitor water applications constitutes a novel and cost-efficient alternative to undertaking regular site visits. Sustainable, appropriate, low-cost telemetry systems are emerging that could be used to monitor the operational performance of water pumps, or a wide range of other field parameters, and to communicate this information swiftly and cheaply to water service providers using SMS messages. Data on the performance of water pumps could also be made available to the public online. This is an example of how ICT can be used for water resources management and environmental regulation, as well as in the governance of socio-economic rights: helping to optimize water allocation by improving communication and strengthening accountability. © 2014 Springer Science+Business Media Dordrecht.
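As a rough illustration of the low-cost pump telemetry envisaged above, the sketch below flags a pump that has gone silent or stopped pumping so that an SMS-style alert could be raised; the thresholds, pump identifier, and alert text are assumptions, not a description of any deployed system.

```python
# Hypothetical sketch only: thresholds, the pump identifier, and the alert
# text are assumptions, not any system discussed in the paper.
from datetime import datetime, timedelta

def pump_alert(pump_id, last_reading_time, litres_last_hour, now=None):
    """Flag a pump as possibly broken if it has gone silent for six hours
    or reported zero flow, so an SMS alert can be sent to the provider."""
    now = now or datetime.utcnow()
    silent = now - last_reading_time > timedelta(hours=6)
    no_flow = litres_last_hour == 0
    if silent or no_flow:
        return f"ALERT {pump_id}: no recent flow reported; maintenance needed"
    return None

print(pump_alert("pump-17", datetime.utcnow() - timedelta(hours=8), 0))
```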
When is personal data rendered anonymous? Interpreting recital 26 of Directive 95/46/EC.
Outlines the scope of Council Directive 95/46. Discusses whether the principles of data protection apply to data rendered anonymous. Examines the difficulty in applying sufficient protection to data once it has been rendered anonymous and stresses the importance of data controllers informing data subjects of any anticipated anonymisation.
When Can the Child Speak for Herself? The Limits of Parental Consent in Data Protection Law for Health Research.
Draft regulatory guidance suggests that if the processing of a child's personal data begins with the consent of a parent, then there is a need to find and defend an enduring consent through the child's growing capacity and on to their maturity. We consider the implications for health research of the UK Information Commissioner's Office's (ICO) suggestion that the relevant test for maturity is the Gillick test, originally developed in the context of medical treatment. Noting the significance of the welfare principle to this test, we examine the implications for the responsibilities of a parent to act as proxy for their child. We argue, contrary to draft ICO guidance, that a data controller might legitimately continue to rely upon parental consent as a legal basis for processing after a child is old enough to provide her own consent. Nevertheless, we conclude that data controllers should develop strategies to seek fresh consent from children as soon as practicable after the data controller has reason to believe they are mature enough to consent independently. Techniques for effective communication, recommended to address challenges associated with Big Data analytics, might have a role here in addressing the dynamic relationship between data subject and processing. Ultimately, we suggest that fair and lawful processing of a child's data will be dependent upon data controllers taking seriously the truism that consent is ongoing, rather than a one-time event: the core associated responsibility is to continue to communicate with a data subject regarding the processing of personal data.
Contemporary Wiradjuri relatedness in Peak Hill, New South Wales.
Wiradjuri Aboriginal people in Peak Hill, a small, economically declining town in central rural New South Wales, have been subjected to a century of government policies including segregation, assimilation, and forced relocations. Despite this local colonial history, Peak Hill Wiradjuri continue to experience daily life in a distinctively Wiradjuri way. To ‘be Wiradjuri’ is to be embedded within a complex web of close relationships that are socially, morally and emotionally developed with both kin and friends, human and non-human subjects. Despite dramatic social and cultural transformations in Wiradjuri meanings and practices of relatedness, the Wiradjuri social world and Wiradjuri ways of self-experience remain informed by past practices. To understand contemporary socialities, and thus the significance of these transformations, this thesis examines the ways in which a moral and emotional order governs relatedness, where the daily lived experience of shared emotional states can be understood in terms of a language for the self and a moral framework. Specifically, this thesis explores how Wiradjuri people negotiate relatedness in a space in which shared and contrasting Wiradjuri and non-Wiradjuri inter-subjectivities are experienced. This study draws on historical research and ethnographic fieldwork to move beyond an analysis of kinship in terms of structures, roles or values and to explore the deeper foundations of emotions and states of being in everyday life.
PROV-TE: A Provenance-Driven Diagnostic Framework for Task Eviction in Data Centers
Cloud computing allows users to control substantial computing power for complex data processing, generating huge and complex datasets. However, the virtual resources requested by users are rarely utilized to their full capacities. To mitigate this, providers often over-commit resources to maximize profit, which can result in node overloading and consequent task eviction. This paper presents a novel framework that mines the huge and growing volume of historical usage data generated by Cloud data centers to identify the causes of overloads. Provenance modelling is applied to add contextual meaning to the data, and the PROV-TE diagnostic framework provides algorithms to efficiently identify the causality of task eviction. Using simulation to reflect real-world scenarios, our results show that the diagnostic algorithms achieve a precision of 83% and a recall of 90%, demonstrating a high level of accuracy in identifying causes.
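A minimal sketch of provenance-driven eviction diagnosis in the spirit of the abstract, assuming simple usage records and a single "overload shortly before eviction on the same node" rule; this is not the PROV-TE algorithm itself.

```python
# Minimal sketch only: the record format and the causal rule below are
# assumptions, not the PROV-TE algorithms.

def diagnose_evictions(usage_records, evictions, cpu_threshold=0.9, window=300):
    """Attribute each task eviction to the most recent overload on the same
    node within `window` seconds, mimicking a provenance 'was caused by' edge."""
    causes = {}
    for task_id, node, evict_time in evictions:
        overloads = [r for r in usage_records
                     if r["node"] == node
                     and r["cpu"] >= cpu_threshold
                     and 0 <= evict_time - r["time"] <= window]
        if overloads:
            causes[task_id] = max(overloads, key=lambda r: r["time"])
    return causes

usage = [{"node": "n1", "time": 100, "cpu": 0.95},
         {"node": "n2", "time": 110, "cpu": 0.40}]
print(diagnose_evictions(usage, [("taskA", "n1", 150)]))
# {'taskA': {'node': 'n1', 'time': 100, 'cpu': 0.95}}
```

Provenance modelling contributes the contextual links (which task ran on which node, when, under what load) that make this kind of backward tracing possible over large historical logs.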
Association between neutrophil-lymphocyte ratio and lymph node metastasis in gastric cancer: a meta-analysis
Introduction and Aim: The prognostic role of the neutrophil-to-lymphocyte ratio (NLR) has been explored extensively in the literature. The aim of this meta-analysis was to evaluate the link between NLR and lymph node metastasis in gastric cancer. A method for increasing the specificity and sensitivity of pre-treatment staging has implications for treatment algorithms and survival.
Search Strategy: The relevant databases were searched as per the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) flowchart. After selection, 12 full-text articles that met the inclusion criteria were included for quantitative analysis. 2 × 2 contingency tables were generated using lymph node positive/negative and NLR high/low data. The effect size for each study was calculated using the DerSimonian-Laird random-effects model (a worked example of this pooling step follows this abstract). P values were calculated using the chi-square method. Finally, publication bias was evaluated. All statistics were calculated using RStudio.
Results: Meta-analysis showed an odds ratio of 1.90 (95% CI 1.52-2.38) for positive lymph node status with a high neutrophil-to-lymphocyte ratio. This has significant implications for cancer screening and staging, as NLR is a highly reproducible, cost-effective, and widely available prognostic factor for gastric cancer patients. Additionally, high or low NLR values may have implications for management pathways. Patients with lymph node metastasis can be offered neoadjuvant chemotherapy, avoiding salvage therapy in the form of adjuvant chemoradiotherapy, which is poorly tolerated.
Conclusion: This meta-analysis shows an association between NLR and positive lymph node status in gastric cancer patients, with implications for staging as well as preoperative personalisation of therapy.
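To make the pooling step concrete, the sketch below combines log odds ratios from 2 × 2 tables with the DerSimonian-Laird random-effects model; the three tables are invented for illustration and are not the twelve included studies.

```python
# Hedged sketch of DerSimonian-Laird random-effects pooling of log odds
# ratios; the 2x2 tables are invented, not the studies in the meta-analysis.
import math

# Each study: (node-positive & high NLR, node-positive & low NLR,
#              node-negative & high NLR, node-negative & low NLR)
tables = [(60, 40, 30, 50), (45, 35, 25, 45), (80, 60, 40, 70)]

logs, weights = [], []
for a, b, c, d in tables:
    log_or = math.log((a * d) / (b * c))   # per-study log odds ratio
    var = 1/a + 1/b + 1/c + 1/d            # its approximate variance
    logs.append(log_or)
    weights.append(1 / var)

# Fixed-effect estimate and Cochran's Q, then the DL between-study variance tau^2
fixed = sum(w * y for w, y in zip(weights, logs)) / sum(weights)
q = sum(w * (y - fixed) ** 2 for w, y in zip(weights, logs))
c_term = sum(weights) - sum(w ** 2 for w in weights) / sum(weights)
tau2 = max(0.0, (q - (len(tables) - 1)) / c_term)

# Random-effects weights and pooled odds ratio with 95% CI
re_w = [1 / (1/w + tau2) for w in weights]
pooled = sum(w * y for w, y in zip(re_w, logs)) / sum(re_w)
se = math.sqrt(1 / sum(re_w))
print(f"pooled OR = {math.exp(pooled):.2f}, "
      f"95% CI {math.exp(pooled - 1.96*se):.2f}-{math.exp(pooled + 1.96*se):.2f}")
```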
Holistic Data Centres: Next Generation Data and Thermal Energy Infrastructures
Digital infrastructure is becoming more distributed and requires more power to operate. At the same time, many countries are working to decarbonise their energy supply, which will require electrical generation of heat for populated areas. What if this heat generation were combined with digital processing?
Seismic Response to Injection Well Stimulation in a High-Temperature, High-Permeability Reservoir
Fluid injection into the Earth's crust can induce seismic events that cause damage to local infrastructure but also offer valuable insight into seismogenesis. The factors that influence the magnitude, location, and number of induced events remain poorly understood but include injection flow rate and pressure as well as reservoir temperature and permeability. The relationship between injection parameters and injection-induced seismicity in high-temperature, high-permeability reservoirs has not been extensively studied. Here we focus on the Ngatamariki geothermal field in the central Taupō Volcanic Zone, New Zealand, where three stimulation/injection tests have occurred since 2012. We present a catalog of seismicity from 2012 to 2015 created using a matched-filter detection technique. We analyze the stress state in the reservoir during the injection tests from first-motion-derived focal mechanisms, yielding an average direction of maximum horizontal compressive stress (SHmax) consistent with the regional NE-SW trend. However, there is significant variation in the direction of maximum compressive stress (σ1), which may reflect geological differences between wells. We use the ratio of injection flow rate to overpressure, referred to as injectivity index, as a proxy for near-well permeability and compare changes in injectivity index to spatiotemporal characteristics of seismicity accompanying each test. Observed increases in injectivity index are generally poorly correlated with seismicity, suggesting that the locations of microearthquakes are not coincident with the zone of stimulation (i.e., increased permeability). Our findings augment a growing body of work suggesting that aseismic opening or slip, rather than seismic shear, is the active process driving well stimulation in many environments.
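For clarity, the injectivity-index proxy used above is simply injection flow rate divided by overpressure; the toy numbers below are illustrative, not Ngatamariki well data.

```python
# Toy illustration of the injectivity-index proxy (flow rate / overpressure);
# the values and units are invented, not measurements from the study.
flow_rate = [50.0, 80.0, 120.0]      # injection flow rate (e.g. t/h)
overpressure = [2.0, 2.5, 2.5]       # wellhead overpressure (e.g. MPa)

injectivity_index = [q / p for q, p in zip(flow_rate, overpressure)]
print(injectivity_index)  # [25.0, 32.0, 48.0]
# A rising injectivity index is taken as a proxy for increasing near-well
# permeability, which the study compares against the seismicity patterns.
```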
Adaptive Speculation for Efficient Internetware Application Execution in Clouds
Modern Cloud computing systems are massive in scale, featuring environments that can execute highly dynamic Internetware applications with huge numbers of interacting tasks. This has led to a substantial challenge: the straggler problem, whereby a small subset of slow tasks significantly impede parallel job completion. This problem results in longer service responses, degraded system performance, and late timing failures that can easily threaten Quality of Service (QoS) compliance. Speculative execution (or speculation) is the prominent method deployed in Clouds to tolerate stragglers by creating task replicas at runtime. The method detects stragglers by specifying a predefined threshold to calculate the difference between individual tasks and the average task progression within a job. However, such a static threshold debilitates speculation effectiveness as it fails to capture the intrinsic diversity of timing constraints in Internetware applications, as well as dynamic environmental factors such as resource utilization. By considering such characteristics, different levels of strictness for replica creation can be imposed to adaptively achieve specified levels of QoS for different applications. In this paper we present an algorithm to improve the execution efficiency of Internetware applications by dynamically calculating the straggler threshold, considering key parameters including job QoS timing constraints, task execution progress, and optimal system resource utilization. We implement this dynamic straggler threshold within the YARN architecture to evaluate its effectiveness against existing state-of-the-art solutions. Results demonstrate that the proposed approach can reduce parallel job response times by up to 20% compared to the static threshold, and achieves a higher speculation success rate of up to 66.67%, compared to 16.67% for the static method.
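A minimal sketch of dynamic straggler detection in the spirit of the abstract, where the threshold tightens when QoS slack is small and relaxes as cluster utilization rises; the weighting formula is an assumption, not the paper's algorithm.

```python
# Hedged sketch only: the threshold formula and its bounds are assumptions,
# not the dynamic threshold calculation evaluated in the paper.

def dynamic_threshold(base=0.5, qos_slack=1.0, utilization=0.5):
    """Tighten the threshold (speculate earlier) when QoS slack is small,
    and relax it when cluster utilization makes extra replicas costly."""
    threshold = base * qos_slack * (1.0 + utilization)
    return min(max(threshold, 0.1), 0.9)

def find_stragglers(task_progress, qos_slack, utilization):
    """Flag tasks whose progress lags the job average by more than the
    dynamically computed fraction of that average."""
    avg = sum(task_progress.values()) / len(task_progress)
    thr = dynamic_threshold(qos_slack=qos_slack, utilization=utilization)
    return [t for t, p in task_progress.items() if avg - p > thr * avg]

# Example: a tight deadline (small slack) and moderate utilization
print(find_stragglers({"t1": 0.9, "t2": 0.85, "t3": 0.3},
                      qos_slack=0.4, utilization=0.6))  # ['t3']
```

Flagged tasks would then be handed to the speculation mechanism for replica creation, with the adaptive threshold deciding how aggressively replicas are launched for each application.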