    An Evaluation of the Factors Used to Predict Writing Ability at the Air Force Institute of Technology

    A study of 574 students at the Air Force Institute of Technology compared performance, education, and experience factors (the latter two as stated by the students themselves) to a locally developed estimate of true writing ability (WGPA). This exploratory research was additionally intended to assess the effectiveness of AFIT's current student writing-skill diagnostic and instructional system. Direct (essay evaluation) and indirect (objective test) evaluations of AFIT student writing ability were analyzed for their predictive impact. The statistical analysis procedures used in this study included the factor analysis of a survey, ANOVA, the adjustment of multiple correlations for measurement error and range attenuation, and a regression analysis using both the raw data and the adjusted correlation matrix. The results of this study indicate that AFIT's direct evaluation portion (essay examination) is useful for determining writing ability; the indirect portion (objective test) did not contribute significantly to the model. Due to the combination of independent variables chosen for the predictive model, the study was unable to identify the immediate benefits of the written communications review course on AFIT performance.
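
The correlation adjustment mentioned above can be illustrated with Spearman's classic correction for attenuation, which rescales an observed correlation by the reliabilities of the two measures. A minimal sketch with illustrative numbers (the study's actual reliabilities and correlations are not reproduced here):

```python
import math

def disattenuate(r_xy, rel_x, rel_y):
    """Spearman's correction for attenuation: estimate the correlation
    between true scores from an observed correlation and the two
    measures' reliability coefficients."""
    return r_xy / math.sqrt(rel_x * rel_y)

# Illustrative values: an observed r of .40 with reliabilities .70 and .80
print(round(disattenuate(0.40, 0.70, 0.80), 3))  # 0.40 / sqrt(0.56) ≈ 0.535
```

Range restriction would be handled by a separate adjustment (e.g. Thorndike's case II), which the study applied before the regression on the corrected matrix.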

    Overview of healthcare in the UK

    The National Health Service (NHS) in the UK has evolved to become one of the largest healthcare systems in the world. At the time of writing this review (August 2010), the UK government, in its 2010 White Paper “Equity and excellence: Liberating the NHS”, has announced a strategy on how it will “create a more responsive, patient-centred NHS which achieves outcomes that are among the best in the world”. This review article presents an overview of the UK healthcare system as it currently stands, with emphasis on Predictive, Preventive and Personalised Medicine elements. It aims to serve as the basis for future EPMA articles that will expand on and present the changes to be implemented within the NHS in the forthcoming months.

    Architecture of a Web-based Predictive Editor for Controlled Natural Language Processing

    In this paper, we describe the architecture of a web-based predictive text editor being developed for the controlled natural language PENG^{ASP}. This controlled language can be used to write non-monotonic specifications that have the same expressive power as Answer Set Programs. In order to support the writing process for these specifications, the predictive text editor communicates asynchronously with the controlled natural language processor, which generates lookahead categories and additional auxiliary information for the author of a specification text. The text editor can simultaneously display multiple sets of lookahead categories for different possible sentence completions, handles anaphoric expressions, and supports the addition of new content words to the lexicon.
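
The lookahead mechanism described above can be sketched with a toy grammar: the editor parses the sentence prefix typed so far and offers only the word categories the grammar allows next. This is an illustrative simplification, not PENG^{ASP}'s actual grammar or processor API:

```python
# A toy right-linear grammar: each state maps an accepted token class
# to the next state; the lookahead set at any point is simply the set
# of token classes accepted from the current state.
GRAMMAR = {
    "S":   {"determiner": "NP"},
    "NP":  {"noun": "VP"},
    "VP":  {"verb": "OBJ", "copula": "ADJ"},
    "OBJ": {"determiner": "NP2"},
    "NP2": {"noun": "END"},
    "ADJ": {"adjective": "END"},
    "END": {},
}

LEXICON = {
    "a": "determiner", "the": "determiner",
    "dog": "noun", "cat": "noun",
    "chases": "verb", "is": "copula",
    "small": "adjective",
}

def lookahead_categories(tokens):
    """Parse the prefix and return the set of categories that may follow."""
    state = "S"
    for tok in tokens:
        cls = LEXICON.get(tok)
        if cls is None or cls not in GRAMMAR[state]:
            return set()  # prefix not accepted by the grammar
        state = GRAMMAR[state][cls]
    return set(GRAMMAR[state])

print(sorted(lookahead_categories(["the", "dog"])))  # ['copula', 'verb']
```

In the actual system this computation runs server-side in the language processor, and the browser editor fetches the lookahead sets asynchronously after each token.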

    Proposal of a P300-based BCI Speller using a predictive Text system

    This paper presents a P300-based BCI speller system that uses a virtual 4 x 3 keyboard based on the T9 interface developed for mobile phones in order to increase writing speed. To validate the effectiveness of the proposed BCI, we compared it with two adaptations of the classical Farwell and Donchin speller, which is based on a 6 x 6 symbol matrix. Three healthy subjects took part in the experiment. The preliminary results confirm the effectiveness of the T9-based speller, since the time needed to spell words and complete sentences was considerably reduced. This work was partially supported by the Innovation, Science and Enterprise Council of the Junta de Andalucía (Spain), project P07-TIC-03310, the Spanish Ministry of Science and Innovation, project TEC 2011-26395, and by the European fund ERDF.
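
The T9 idea behind the 4 x 3 keyboard can be sketched in a few lines: each key covers several letters, and a dictionary disambiguates the digit sequence into candidate words, so one selection per letter suffices. The keypad layout below is the standard phone mapping; the word list is illustrative:

```python
# Standard phone keypad: each digit key covers several letters.
T9_KEYS = {
    "2": "abc", "3": "def", "4": "ghi", "5": "jkl",
    "6": "mno", "7": "pqrs", "8": "tuv", "9": "wxyz",
}
LETTER_TO_KEY = {ch: key for key, letters in T9_KEYS.items() for ch in letters}

# Illustrative word list; a real speller would use a full dictionary.
WORDS = ["hello", "help", "good", "home", "gone"]

def encode(word):
    """Map a word to its digit-key sequence, e.g. 'home' -> '4663'."""
    return "".join(LETTER_TO_KEY[ch] for ch in word)

def candidates(keys):
    """All dictionary words matching a key sequence, for the user to pick from."""
    return [w for w in WORDS if encode(w) == keys]

print(candidates("4663"))  # ['good', 'home', 'gone']
```

The speed gain in the P300 setting comes from the smaller matrix: fewer rows and columns to flash per selection means fewer stimulation cycles per letter.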

    A MPC Strategy for the Optimal Management of Microgrids Based on Evolutionary Optimization

    In this paper, a novel model predictive control (MPC) strategy with a 24-h prediction horizon is proposed to reduce the operational cost of microgrids. To overcome the complexity of the optimization problems arising from the operation of the microgrid at each step, an adaptive evolutionary strategy with a satisfactory trade-off between exploration and exploitation capabilities was added to the model predictive control. The proposed strategy was evaluated on a representative microgrid that includes a wind turbine, a photovoltaic plant, a microturbine, a diesel engine, and an energy storage system. The achieved results demonstrate the validity of the proposed approach, which outperforms a global scheduling planner based on a genetic algorithm by 14.2% in terms of operational cost. In addition, the proposed approach also better manages the use of the energy storage system. This work was supported by Ministerio de Economía y Competitividad, project DPI2016-75294-C2-2-R, and the European Union (Horizonte 2020 Programme), project 76409.
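
The receding-horizon structure described above can be sketched as follows: at each hour, a dispatch plan covering the full 24-h horizon is optimised by an evolutionary search, but only the first set-point is applied before the horizon slides forward. The cost model, power bound, and forecast here are illustrative stand-ins, not the paper's microgrid model:

```python
import random

HORIZON = 24     # 24-h prediction horizon, hourly steps
P_MAX = 50.0     # assumed dispatchable-generation limit (kW)

def cost(plan, demand):
    """Toy operational cost: linear fuel cost plus a quadratic
    penalty for unmet demand."""
    fuel = sum(0.2 * p for p in plan)
    unmet = sum(max(d - p, 0.0) ** 2 for p, d in zip(plan, demand))
    return fuel + unmet

def evolve(demand, pop_size=30, gens=200, sigma=2.0, seed=0):
    """Greedy-initialised (1+lambda)-style evolutionary search:
    mutate the incumbent plan with Gaussian noise, keep strict
    improvements. A stand-in for the paper's adaptive strategy."""
    rng = random.Random(seed)
    best = [min(d, P_MAX) for d in demand]
    best_cost = cost(best, demand)
    for _ in range(gens):
        for _ in range(pop_size):
            cand = [min(max(p + rng.gauss(0, sigma), 0.0), P_MAX)
                    for p in best]
            c = cost(cand, demand)
            if c < best_cost:
                best, best_cost = cand, c
    return best

def mpc_step(forecast):
    """Optimise over the whole horizon, apply only the first set-point;
    next hour the horizon slides forward and the problem is re-solved."""
    return evolve(forecast)[0]

forecast = [20 + 10 * (h % 12) / 11 for h in range(HORIZON)]
setpoint = mpc_step(forecast)
```

Re-solving every step is what lets the controller absorb forecast errors in renewable generation and demand, at the price of one optimisation run per control interval.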

    Predictive Analytics in the Criminal Justice System: Media Depictions and Framing

    Artificial intelligence and algorithms are increasingly becoming commonplace in crime-fighting efforts. For instance, predictive policing uses software to forecast likely offenders and the areas where crime is most likely to happen. Risk-assessment software is employed in sentencing and other courtroom decisions, and it is also being applied to prison overpopulation by assessing which inmates can be released. Public opinion on the use of predictive software is divided: many police and state officials support it, crediting it with lowering crime rates and improving public safety. Others, however, have questioned its effectiveness, citing civil liberties concerns as well as the possibility of perpetuating systemic discrimination. According to the Prison Policy Initiative, over 2.3 million Americans were incarcerated in 2017 [1]. Of this population, 60 percent were people of color. African-American men are disproportionately targeted by the U.S. judicial system; they are more likely to be stopped and frisked by police, as well as to receive stiffer sentences than white men for the same crimes [2]. In light of these facts, using algorithms and predictive methods to make decisions, especially ones that may affect the freedom of individuals, requires further study. Investigating the increasingly intertwined relationship between technology and human liberties can help develop a better understanding of how artificial intelligence can make lives more efficient and the judicial system more transparent. The news media plays a significant role in shaping opinions on controversial issues. Articles and reports on predictive policing not only inform the public, but also influence how people perceive the use of artificial intelligence in law enforcement, and ultimately how we, as citizens, want to be policed.
    This study evaluates the role of news media in shaping public opinion on two fronts: (a) the use of predictive analytics in the justice system, and (b) the integration of artificial intelligence in everyday life. Working with a corpus of articles from major journalistic outlets, we apply a qualitative methodology based on grounded theory to identify the key frames that govern media representation of predictive policing. This study makes the following contributions:
    - A survey of current predictive policing techniques, including hot-spot analysis, regression methods, near-repeat analysis, and spatiotemporal analysis
    - Application of grounded theory methods to a qualitative analysis of a corpus of 51 online articles on the U.S. criminal justice system's use of predictive software and algorithms
    - Identification of the two frames most commonly adopted by elite journalists writing for national news outlets
    Two dominant frames were identified from the corpus of 51 articles: fear of the future and fear of the past. The first frame elaborates on the potential consequences of implementing predictive algorithms in policing efforts, using specific examples to emphasize the difficulty of removing bias from software systems and the likelihood of perpetuating racial discrimination. The second frame argues that using data effectively can help combat rising crime rates, especially in metropolitan areas like Chicago and New York City. It bolsters this claim by pointing to the ability of predictive analytics to forecast crime, as well as national threats, before they happen; its focus is on preventing crime as opposed to combating it.

    Exposure Fusion Framework in Deep Learning-Based Radiology Report Generator

    Writing a radiology report is time-consuming and requires experienced radiologists; hence a technology that could generate reports automatically would be beneficial. The key problem in developing an automated report-generating system is producing coherent predictive text. To accomplish this, it is important to ensure the image has good quality so that the model can learn to interpret the relevant parts of the image, especially for medical images, which tend to be noise-prone during acquisition. This research uses the Exposure Fusion Framework (EFF) method to enhance the quality of medical images and thereby increase the model's performance in producing coherent predictive text. The model is an encoder-decoder with visual feature extraction using a pre-trained CheXNet, Bidirectional Encoder Representations from Transformers (BERT) embeddings for text features, and a Long Short-Term Memory (LSTM) network as the decoder. With EFF enhancement, the model obtained a result 7% better than without enhancement processing, as measured by the Bilingual Evaluation Understudy (BLEU) score with 4-grams. It can be concluded that the enhancement method effectively increases the model's performance.
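
The BLEU score with 4-grams used above can be sketched compactly: clipped n-gram precisions for n = 1..4 are combined by a geometric mean and scaled by a brevity penalty. This is a simplified single-reference version (sentences assumed at least four tokens long), not the exact evaluation script of the paper:

```python
import math
from collections import Counter

def ngrams(tokens, n):
    """Multiset of n-grams in a token list."""
    return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

def bleu4(candidate, reference):
    cand, ref = candidate.split(), reference.split()
    precisions = []
    for n in range(1, 5):
        cand_ng, ref_ng = ngrams(cand, n), ngrams(ref, n)
        # Clipped counts: a candidate n-gram only matches up to the
        # number of times it occurs in the reference.
        overlap = sum(min(c, ref_ng[g]) for g, c in cand_ng.items())
        total = max(sum(cand_ng.values()), 1)
        precisions.append(max(overlap, 1e-9) / total)   # avoid log(0)
    # Brevity penalty discourages overly short candidates.
    bp = 1.0 if len(cand) > len(ref) else math.exp(1 - len(ref) / max(len(cand), 1))
    return bp * math.exp(sum(math.log(p) for p in precisions) / 4)

ref = "no acute cardiopulmonary abnormality is seen"
print(round(bleu4(ref, ref), 2))  # an identical sentence scores 1.0
```

Corpus-level BLEU, as typically reported, aggregates the n-gram counts over all report pairs before taking precisions rather than averaging per-sentence scores.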

    Pickup usability dominates: a brief history of mobile text entry research and adoption

    Text entry on mobile devices (e.g. phones and PDAs) has been a research challenge since devices shrank below laptop size: mobile devices are simply too small to have a traditional full-size keyboard. There has been a profusion of research into text entry techniques for smaller keyboards and touch screens, some of which have become mainstream while others have not lived up to early expectations. As the mobile phone industry moves to mainstream touch-screen interaction, we review the range of input techniques for mobiles, together with the evaluations that have taken place to assess their validity, from theoretical modelling through to formal usability experiments. We also report initial results on iPhone text entry speed.