
    Exploring internal child sex trafficking networks using social network analysis

    This article explores the potential of social network analysis (SNA) as a tool to support the investigation of internal child sex trafficking in the UK. In doing so, it uses only data, software, and training already available to UK police. Data from two major operations are analysed using in-built centrality metrics, designed to measure a network’s overarching structural properties and identify particularly powerful individuals. This work addresses victim networks alongside offender networks. The insights generated by SNA inform ideas for targeted interventions based on the principles of Situational Crime Prevention. These harm-reduction initiatives go beyond traditional enforcement to cover prevention, disruption, prosecution, and related activities. The article ends by discussing how SNA can be applied and further developed in frontline policing, strategic policing, prosecution, and policy and research.
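
    As a rough illustration of the kind of centrality analysis described (the abstract does not name the specific police software), the sketch below uses the open-source networkx library on a made-up contact network; the node labels and edges are placeholders, not case data.

```python
# Illustrative sketch only: networkx and the edge list below are assumptions,
# not the software or data referred to in the article.
import networkx as nx

# Hypothetical contact network: an edge links two individuals who appear
# together in operational data.
edges = [
    ("A", "B"), ("A", "C"), ("B", "C"), ("C", "D"),
    ("D", "E"), ("E", "F"), ("C", "F"),
]
G = nx.Graph(edges)

# Standard centrality metrics of the kind the article relies on:
# degree (direct contacts), betweenness (brokerage between groups),
# eigenvector (connection to other well-connected individuals).
metrics = {
    "degree": nx.degree_centrality(G),
    "betweenness": nx.betweenness_centrality(G),
    "eigenvector": nx.eigenvector_centrality(G),
}

for name, scores in metrics.items():
    top = max(scores, key=scores.get)
    print(f"{name:12s} highest: {top} ({scores[top]:.2f})")
```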

    On the Experimental Evaluation of Vehicular Networks: Issues, Requirements and Methodology Applied to a Real Use Case

    One of the most challenging fields in vehicular communications has been the experimental assessment of protocols and novel technologies. Researchers usually tend to simulate vehicular scenarios and/or partially validate new contributions by using constrained testbeds and carrying out minor tests. Along these lines, the present work reviews the issues that pioneers in the area of vehicular communications and, in general, in telematics have to deal with if they want to perform a good evaluation campaign through real-world testing. The key needs for a good experimental evaluation are proper software tools for gathering test data, post-processing that data to generate relevant figures of merit and, finally, presenting the most important results clearly. For this reason, a key contribution of this paper is the presentation of an evaluation environment called AnaVANET, which covers these needs. Using this tool and a reference case study, a generic testing methodology is described and applied. In this way, the use of the IPv6 protocol over a vehicle-to-vehicle routing protocol, with support for IETF-based network mobility, is tested while the main features of the AnaVANET system are presented. This work contributes to laying the foundations for a proper experimental evaluation of vehicular networks and will be useful for many researchers in the area. Comment: in EAI Endorsed Transactions on Industrial Networks and Intelligent Systems, 201
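
    The abstract does not describe AnaVANET's data formats, so the following is only a hedged sketch of the post-processing step it refers to: deriving figures of merit such as packet delivery ratio and end-to-end delay from a hypothetical per-packet CSV log.

```python
# Minimal sketch of post-processing per-packet test logs into figures of merit.
# The CSV layout (packet_id, tx_time, rx_time) is a hypothetical format, not
# AnaVANET's actual one.
import csv

def summarise(log_path: str) -> dict:
    sent = received = 0
    delays = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            sent += 1
            if row["rx_time"]:                      # empty field = packet lost
                received += 1
                delays.append(float(row["rx_time"]) - float(row["tx_time"]))
    return {
        "packet_delivery_ratio": received / sent if sent else 0.0,
        "avg_delay_s": sum(delays) / len(delays) if delays else None,
    }

# Example usage (file name is made up): print(summarise("v2v_ipv6_run1.csv"))
```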

    Individual cognitive stimulation therapy for dementia: a clinical effectiveness and cost-effectiveness pragmatic, multicentre, randomised controlled trial

    Background Group cognitive stimulation therapy programmes can benefit cognition and quality of life for people with dementia. Evidence for home-based, carer-led cognitive stimulation interventions is limited. Objectives To evaluate the clinical effectiveness and cost-effectiveness of carer-delivered individual cognitive stimulation therapy (iCST) for people with dementia and their family carers, compared with treatment as usual (TAU). Design A multicentre, single-blind, randomised controlled trial assessing clinical effectiveness and cost-effectiveness. Assessments were at baseline, 13 weeks and 26 weeks (primary end point). Setting Participants were recruited through Memory Clinics and Community Mental Health Teams for older people. Participants A total of 356 caregiving dyads were recruited and 273 completed the trial. Intervention iCST consisted of structured cognitive stimulation sessions for people with dementia, completed up to three times weekly over 25 weeks. Family carers were supported to deliver the sessions at home. Main outcome measures Primary outcomes for the person with dementia were cognition and quality of life. Secondary outcomes included behavioural and psychological symptoms, activities of daily living, depressive symptoms and relationship quality. The primary outcome for the family carers was mental/physical health (12-item Short Form questionnaire). Health-related quality of life (European Quality of Life-5 Dimensions), mood symptoms, resilience and relationship quality comprised the secondary outcomes. Costs were estimated from health and social care and societal perspectives. Results There were no differences in any of the primary outcomes for people with dementia between intervention and TAU [cognition: mean difference –0.55, 95% confidence interval (CI) –2.00 to 0.90; p-value = 0.45; self-reported quality of life: mean difference –0.02, 95% CI –1.22 to 0.82; p-value = 0.97 at the 6-month follow-up]. iCST did not improve mental/physical health for carers. People with dementia in the iCST group experienced better relationship quality with their carer, but there was no evidence that iCST improved their activities of daily living, depression or behavioural and psychological symptoms. iCST seemed to improve health-related quality of life for carers but did not benefit carers’ resilience or their relationship quality with their relative. Carers conducting more sessions had fewer depressive symptoms. Qualitative data suggested that people with dementia and their carers experienced better communication owing to iCST. Adjusted mean costs were not significantly different between the groups. From the societal perspective, both health gains and cost savings were observed. Conclusions iCST did not improve cognition or quality of life for people with dementia, or carers’ physical and mental health. Costs of the intervention were offset by some reductions in social care and other services. Although there was some evidence of improvement in terms of the caregiving relationship and carers’ health-related quality of life, iCST does not appear to deliver clinical benefits for cognition and quality of life for people with dementia. Most people received fewer than the recommended number of iCST sessions. Further research is needed to ascertain the clinical effectiveness of carer-led cognitive stimulation interventions for people with dementia.
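
    For readers unfamiliar with the statistics reported above, the sketch below shows how a between-group mean difference and its 95% confidence interval are typically computed; it uses synthetic data and a simple unadjusted comparison, not the trial's actual adjusted analysis.

```python
# Synthetic illustration only: the numbers here are made up and the trial
# itself used adjusted models on real outcome data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
icst = rng.normal(loc=20.0, scale=5.0, size=100)  # hypothetical cognition scores
tau = rng.normal(loc=20.5, scale=5.0, size=100)   # treatment-as-usual group

diff = icst.mean() - tau.mean()
se = np.sqrt(icst.var(ddof=1) / len(icst) + tau.var(ddof=1) / len(tau))
dof = len(icst) + len(tau) - 2          # simple approximation of the degrees of freedom
t_crit = stats.t.ppf(0.975, dof)        # two-sided 95% critical value
print(f"mean difference {diff:.2f}, "
      f"95% CI {diff - t_crit * se:.2f} to {diff + t_crit * se:.2f}")
```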

    Wearable Computing for Health and Fitness: Exploring the Relationship between Data and Human Behaviour

    Health and fitness wearable technology has recently advanced, making it easier for individuals to monitor their behaviours. Previously, self-generated data has been used to interact with the user to motivate positive behaviour change, but issues arise when relating this to the long-term use of wearable devices. Previous studies within this area are discussed. We also consider a new approach in which data is used to support rather than motivate, through monitoring and logging that encourage reflection. Based on the issues highlighted, we then make recommendations on the directions in which future work could be most beneficial.

    A Brief History of Web Crawlers

    Web crawlers visit internet applications, collect data, and learn about new web pages from the pages they visit. Web crawlers have a long and interesting history. Early web crawlers collected statistics about the web. In addition to collecting statistics about the web and indexing applications for search engines, modern crawlers can be used to perform accessibility and vulnerability checks on an application. The quick expansion of the web and the complexity added to web applications have made crawling a very challenging process. Throughout the history of web crawling, many researchers and industrial groups have addressed the different issues and challenges that web crawlers face, and different solutions have been proposed to reduce the time and cost of crawling. Performing an exhaustive crawl remains a challenging problem, and capturing the model of a modern web application and extracting data from it automatically is another open question. What follows is a brief history of the different techniques and algorithms used from the early days of crawling up to the present day. We introduce criteria to evaluate the relative performance of web crawlers. Based on these criteria we plot the evolution of web crawlers and compare their performance.
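
    As a toy illustration of the basic loop such crawlers share (visit a page, collect data, discover new links), here is a minimal breadth-first crawler using only the Python standard library; real crawlers add politeness policies (robots.txt, rate limits), large-scale deduplication, and handling of script-heavy applications.

```python
# Minimal breadth-first crawler sketch; not tied to any crawler in the survey.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects href values from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed: str, max_pages: int = 10):
    seen, queue, visited = {seed}, deque([seed]), 0
    while queue and visited < max_pages:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except Exception:
            continue                       # skip unreachable or non-text pages
        visited += 1
        parser = LinkParser()
        parser.feed(html)
        print(f"{url}: {len(parser.links)} links found")   # the "collect data" step
        for href in parser.links:
            absolute = urljoin(url, href)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

# Example: crawl("https://example.org", max_pages=5)
```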

    Wellness Protocol: An Integrated Framework for Ambient Assisted Living : A thesis presented in partial fulfilment of the requirements for the degree of Doctor of Philosophy in Electronics, Information and Communication Systems at the School of Engineering and Advanced Technology, Massey University, Manawatu Campus, New Zealand

    Listed in the 2016 Dean's List of Exceptional Theses. Smart and intelligent homes of today and tomorrow are committed to enhancing the security, safety and comfort of their occupants. At present, most smart home protocols are limited to controlled-environment activities for Ambient Assisted Living (AAL) of the elderly and convalescent. The aim of this research is to develop a Wellness Protocol that forecasts the wellness of any individual living in the AAL environment. It is based on wireless sensor networks combined with data mining and machine learning to monitor activities of daily living. Heterogeneous sensor and actuator nodes based on WSNs are deployed in the home environment. These nodes generate real-time data on object usage and other movements inside the home, which is used to forecast the wellness of an individual. The new protocol has been designed and developed to be suitable specifically for smart home systems, and it is reliable, efficient, flexible, and economical for WSN-based AAL. In line with consumer demand, Wellness Protocol-based smart home systems can be easily installed in existing households without any significant changes and offer a user-friendly interface. Additionally, the Wellness Protocol has been extended to the design of a smart building environment for apartments. In smart home design and implementation, the Wellness Protocol addresses large-scale data handling and interference mitigation. A wellness-based smart home monitoring system applies automation, integrated with the facilities of the accommodation, to improve the everyday life of an occupant.
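
    The thesis defines its own protocol, which this abstract does not reproduce; the sketch below only illustrates the general idea of comparing an occupant's daily object-usage durations (as reported by WSN nodes) against a learned baseline to flag possible anomalies. All names, values, and the implied threshold are hypothetical.

```python
# Illustrative only: not the thesis's Wellness Protocol, just the general idea
# of comparing observed daily usage against a learned per-occupant baseline.
from statistics import mean

# Hypothetical baseline: average minutes of daily usage learned over past weeks.
baseline_minutes = {"kettle": 12, "fridge": 9, "toilet": 18, "bed": 460}

def wellness_score(today_minutes: dict) -> float:
    """Return 1.0 when behaviour matches the baseline, lower for larger deviation."""
    ratios = []
    for item, expected in baseline_minutes.items():
        observed = today_minutes.get(item, 0)
        ratios.append(min(observed, expected) / max(observed, expected, 1))
    return mean(ratios)

today = {"kettle": 2, "fridge": 8, "toilet": 20, "bed": 600}  # hypothetical readings
score = wellness_score(today)
print(f"wellness score: {score:.2f}")   # a low score could trigger a carer alert
```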

    Proactive cloud management for highly heterogeneous multi-cloud infrastructures

    Various studies in the literature have demonstrated that the cloud computing paradigm can help to improve the availability and performance of applications subject to software anomalies. Indeed, the cloud resource provisioning model enables users to rapidly access new processing resources, even distributed over different geographical regions, which can be promptly used in the case of, e.g., crashes or hangs of running machines, as well as to balance the load away from overloaded machines. Nevertheless, managing a geographically distributed cloud deployment can be a complex and time-consuming task. The Autonomic Cloud Manager (ACM) Framework is an autonomic framework for supporting proactive management of applications deployed over multiple cloud regions. It uses machine learning models to predict failures of virtual machines and to proactively redirect the load to healthy machines or cloud regions. In this paper, we study different policies for performing efficient proactive load balancing across cloud regions in order to mitigate the effects of software anomalies. These policies use predictions of the mean time to failure of virtual machines. We consider the case of heterogeneous cloud regions, i.e. regions with different amounts of resources, and we provide an experimental assessment of these policies in the context of the ACM Framework.
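
    The specific policies evaluated are not detailed in this abstract; as a hedged sketch of the general idea, the snippet below routes new load to the region with the best combination of predicted mean time to failure (MTTF) and spare capacity. The Region fields and the weighting are illustrative assumptions, not the ACM Framework's actual policy.

```python
# Sketch of one possible MTTF-aware placement policy for heterogeneous regions.
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    capacity: int            # VMs the region can run (regions are heterogeneous)
    load: int                # VMs currently assigned
    predicted_mttf_h: float  # ML-predicted mean time to failure, in hours

def choose_region(regions: list[Region]) -> Region:
    candidates = [r for r in regions if r.load < r.capacity]
    # Favour long predicted MTTF and spare capacity; saturated regions are excluded.
    return max(candidates, key=lambda r: r.predicted_mttf_h * (1 - r.load / r.capacity))

regions = [
    Region("eu-west", capacity=40, load=30, predicted_mttf_h=120.0),
    Region("us-east", capacity=80, load=50, predicted_mttf_h=35.0),
]
print("route next request to:", choose_region(regions).name)
```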

    Browser-based Analysis of Web Framework Applications

    Although web applications have evolved into mature solutions providing a sophisticated user experience, they have also become complex for the same reason. Complexity primarily affects the server-side generation of dynamic pages, as pages are aggregated from multiple sources and there are many possible processing paths depending on parameters. Browser-based tests are an adequate instrument for detecting errors within generated web pages while treating the server-side processing and path complexity as a black box. However, these tests do not identify the cause of an error, which instead has to be located manually. This paper proposes generating metadata on the paths and parts involved in server-side processing to facilitate backtracking the origins of detected errors at development time. While there are several possible points of interest to observe for backtracking, this paper focuses on the user interface components of web frameworks. Comment: In Proceedings TAV-WEB 2010, arXiv:1009.330
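
    As a hedged illustration of the idea (not the paper's actual implementation), the sketch below tags each server-rendered fragment with the name of the UI component that produced it, so a browser-based test can map a failing DOM node back to its origin; the component class and render() signature are hypothetical.

```python
# Illustrative sketch: annotate server-side output with origin metadata.
def traced(render):
    """Wrap a component's HTML output in an element naming the producing component."""
    def wrapper(self, *args, **kwargs):
        html = render(self, *args, **kwargs)
        return f'<div data-origin-component="{type(self).__name__}">{html}</div>'
    return wrapper

class LoginForm:                      # hypothetical framework component
    @traced
    def render(self):
        return '<form id="login"><input name="user"/></form>'

print(LoginForm().render())
# A browser-based test that finds a broken element can read the nearest
# data-origin-component attribute to report which component to inspect first.
```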