
    Provenance analysis for instagram photos

    As a feasible device fingerprint, sensor pattern noise (SPN) has proven effective in the provenance analysis of digital images. However, with the rise of social media, millions of images are uploaded to and shared through social media sites every day. An image downloaded from a social network may have gone through a series of unknown manipulations, so the trustworthiness of SPN has been challenged in the provenance analysis of images downloaded from social media platforms. In this paper, we investigate the effects of the pre-defined Instagram image filters on SPN-based image provenance analysis. We identify two groups of filters that affect the SPN in quite different ways, with Group I consisting of filters that severely attenuate the SPN and Group II consisting of filters that largely preserve the SPN in the images. We further propose a CNN-based classifier to perform filter-oriented image categorization, aiming to exclude the images manipulated by the Group I filters and thus improve the reliability of SPN-based provenance analysis. The results on about 20,000 images and 18 filters are very promising, with an accuracy higher than 96% in differentiating the filters in Group I from those in Group II.
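    The core SPN matching step described above can be sketched in a few lines. This is a toy illustration, assuming a simple Wiener denoiser stands in for the paper's fingerprint extractor; the camera pattern, scene statistics, and sizes below are all synthetic, not the paper's data or pipeline.

```python
import numpy as np
from scipy.signal import wiener

def noise_residual(img):
    # The SPN estimate is the residual left after denoising the image.
    return img - wiener(img, mysize=3)

def ncc(a, b):
    # Normalized cross-correlation between two noise residuals.
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(0)
pattern = 5.0 * rng.standard_normal((128, 128))   # hypothetical camera SPN
scene = lambda: rng.normal(128, 20, (128, 128))   # synthetic scene content

ref_res = noise_residual(scene() + pattern)       # reference fingerprint
img_same = scene() + pattern                      # query from the same camera
img_other = scene()                               # query from another camera
print(ncc(noise_residual(img_same), ref_res),
      ncc(noise_residual(img_other), ref_res))
```

    An image from the same camera correlates noticeably with the reference fingerprint, while an unrelated image does not; Instagram's Group I filters would suppress exactly this shared pattern.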

    Detecting hierarchical and overlapping community structures in networks

    2014-2015 refereed conference paper (Version of Record, published)

    An EEG-based brain-computer interface for dual task driving detection

    The development of brain-computer interfaces (BCI) for multiple applications has undergone extensive growth in recent years. Since distracted driving is a significant cause of traffic accidents, this study proposes an EEG-based BCI system for detecting distracted driving. The removal of artifacts and the selection of useful brain sources are essential and critical steps in any electroencephalography (EEG)-based BCI. In the first module, artifacts are removed and useful brain sources are selected based on independent component analysis (ICA). In the second module, all distracted and concentrated EEG epochs are recognized with a self-organizing map (SOM). The BCI system automatically identifies independent components containing artifacts for removal and detects distracted driving through the specific brain sources, which are also selected automatically. The accuracy of the proposed system was approximately 90% in recognizing EEG epochs of distracted and concentrated driving, based on the selected frontal and left motor components. © 2013
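    The SOM-based epoch recognition named above can be sketched with a toy winner-take-all map (zero-neighborhood, so effectively online vector quantization); the two-dimensional "band-power" features and the distracted/concentrated clusters below are hypothetical, not the paper's EEG data.

```python
import numpy as np

def train_som(data, n_units=2, epochs=50, lr=0.5, seed=1):
    # Toy SOM: prototypes initialized from the data, winner-take-all updates.
    rng = np.random.default_rng(seed)
    w = data[rng.choice(len(data), n_units, replace=False)].astype(float)
    for t in range(epochs):
        alpha = lr * (1.0 - t / epochs)                 # decaying learning rate
        for x in data:
            bmu = int(np.argmin(((w - x) ** 2).sum(axis=1)))
            w[bmu] += alpha * (x - w[bmu])              # move the winner toward x
    return w

def assign(w, x):
    # Label an epoch by its best-matching unit.
    return int(np.argmin(((w - x) ** 2).sum(axis=1)))

rng = np.random.default_rng(0)
concentrated = rng.normal([1.0, 0.0], 0.05, (20, 2))    # toy epoch features
distracted = rng.normal([0.0, 1.0], 0.05, (20, 2))
w = train_som(np.vstack([concentrated, distracted]))
print(assign(w, concentrated[0]), assign(w, distracted[0]))
```

    After training, epochs from the two clusters map to different units, which is the classification role the SOM plays in the proposed system.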

    The Clinical Application of Anti-CCP in Rheumatoid Arthritis and Other Rheumatic Diseases

    Rheumatoid arthritis (RA) is a common rheumatic disease in Caucasians and in other ethnic groups. Diagnosis is based mainly on clinical features. Before 1998, the only serological laboratory test that could contribute to the diagnosis was that for rheumatoid factor (RF); the disease activity markers used to evaluate clinical symptoms or treatment outcome were the erythrocyte sedimentation rate (ESR) and C-reactive protein (CRP). In practice, the diagnosis of early RA is very difficult, as the clinical criteria are insufficient at the beginning stage of the disease. In 1998, Schellekens reported that a high percentage of RA patients had a specific antibody that could interact with a synthetic peptide containing the amino acid citrulline. Owing to its high specificity (98%) for RA, this new serological marker, the anti-cyclic citrullinated peptide antibody (anti-CCP antibody), can be detected early in RA, before the typical clinical features appear. The presence or absence of this antibody can readily distinguish other rheumatic diseases from RA. Additionally, the titer of anti-CCP can be used to predict the prognosis and treatment outcome after DMARD or biological therapy. Therefore, with improved sensitivity, the anti-CCP antibody will be widely used as a routine laboratory test in clinical practice for RA.

    Evolution of the second lowest extended state as a function of the effective magnetic field in the fractional quantum hall regime

    It has been shown that, at a Landau level filling factor ν = 1/2, a two-dimensional electron system can be mathematically transformed into a composite fermion system interacting with a Chern-Simons gauge field. At ν = 1/2, the average of this Chern-Simons gauge field cancels the external magnetic field B_ext, so that the effective magnetic field B_eff acting on the composite fermions is zero. Away from ν = 1/2, the composite fermions experience a net effective magnetic field B_eff. We present the first study of the evolution of the second lowest extended state in a vanishing effective magnetic field in the fractional quantum Hall regime. Our result shows that the evolution of the second lowest extended state depends linearly on the effective magnetic field B_eff, consistent with the composite fermion picture.
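    The mean-field flux-attachment bookkeeping behind these statements can be written compactly, with electron density n and flux quantum φ₀:

```latex
B_{\mathrm{eff}} = B_{\mathrm{ext}} - 2 n \phi_0, \qquad \phi_0 = \frac{h}{e},
\qquad \nu = \frac{n \phi_0}{B_{\mathrm{ext}}}
\;\Longrightarrow\; B_{\mathrm{eff}} = 0 \ \text{at}\ \nu = \tfrac{1}{2}.
```

    Each composite fermion carries two attached flux quanta, so exactly at half filling the attached flux cancels the external field on average, and away from ν = 1/2 the residual B_eff grows linearly with the detuning.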

    Learning activation functions from data using cubic spline interpolation

    Neural networks require careful design in order to perform properly on a given task. In particular, selecting a good activation function (possibly in a data-dependent fashion) is a crucial step, which remains an open problem in the research community. Despite a large amount of investigation, most current implementations simply select one fixed function from a small set of candidates, which is not adapted during training and is shared among all neurons throughout the different layers. However, neither of these assumptions can be assumed optimal in practice. In this paper, we present a principled way to achieve data-dependent adaptation of the activation functions, performed independently for each neuron. This is done by leveraging past and present advances in cubic spline interpolation, allowing for local adaptation of the functions around their regions of use. The resulting algorithm is relatively cheap to implement, and overfitting is counterbalanced by the inclusion of a novel damping criterion, which penalizes unwanted oscillations away from a predefined shape. Experimental results validate the proposal over two well-known benchmarks. Comment: submitted to the 27th Italian Workshop on Neural Networks (WIRN 2017).
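    The idea of a per-neuron, spline-parameterized activation can be sketched as follows. This is a minimal illustration, not the paper's implementation: the grid, the tanh initialization, and the quadratic stand-in for the damping criterion are all assumptions for the example.

```python
import numpy as np
from scipy.interpolate import CubicSpline

class SplineActivation:
    """Activation defined by control values on a fixed grid; in training,
    the control values would be the learnable parameters of one neuron."""

    def __init__(self, grid, values):
        self.grid = np.asarray(grid, float)
        self.values = np.asarray(values, float)
        self._spline = CubicSpline(self.grid, self.values)

    def __call__(self, x):
        # Clamp inputs to the grid so the spline is only evaluated
        # inside its region of definition.
        return self._spline(np.clip(x, self.grid[0], self.grid[-1]))

    def damping_penalty(self, reference_values, lam=1e-3):
        # Toy stand-in for the damping criterion: penalize deviation of
        # the control points from a reference shape (e.g. tanh).
        return lam * float(((self.values - reference_values) ** 2).sum())

grid = np.linspace(-3, 3, 15)
act = SplineActivation(grid, np.tanh(grid))   # initialized to a tanh shape
print(act(np.array([-1.0, 0.0, 1.0])))
```

    Because the spline interpolates the control points exactly, the initial activation reproduces tanh at the nodes, and gradient updates to the control values deform it only locally.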

    The impact of environmental and human factors on urban heat and microclimate variability

    Urbanization is known to cause noticeable changes in the properties of local climate. Studies have shown that urban areas, compared to rural areas with fewer artificial surfaces, register higher local temperatures as a result of Urban Heat Islands (UHIs). Hong Kong is one of the most densely populated cities in the world, and a high proportion of its population, residing in densely built high-rise areas, experiences some degree of thermal discomfort. This study selected Mong Kok and Causeway Bay, two typical urban communities in Hong Kong, to gather evidence of microclimate variation and sources of thermal discomfort. UHIs were estimated from 58 logging sensors placed at strategic locations to take temperature and humidity measurements over 17 consecutive days each in the summer/hot and winter/cool periods. By employing geographic information and global positioning systems, these measurements were geocoded and plotted over the built landscape to convey microclimate variation. The empirical data were further aligned with distinct environmental settings to identify possible factors contributing to UHIs. This study established the existence and extent of UHI microclimate variation within urban communities of different environmental configurations and functional uses. The findings provide essential groundwork for further studies of UHI effects, informing the sources of local thermal discomfort and better planning and design to safeguard environmental health in public areas.

    Evaluating extreme rainfall changes over Taiwan using a standardized index

    The annual daily maximum precipitation (rx1day) is widely used to represent extreme events and is an important parameter in climate change studies. However, the climate variability in rx1day is sensitive to outliers, and rx1day has difficulty representing the characteristics of large areas. In this study, we propose the probability index (PI), based on the cumulative distribution function (CDF) of a fitted generalized extreme value (GEV) distribution, to standardize rx1day and represent extreme event records. A good correlation between the area-averaged PIs of the observed stations and those of the gridded dataset is found over Taiwan. From the past PI records, there is no distinct trend in western Taiwan before the end of the 20th century, but a climate regime change occurred during 2002-2003. Dual change effects, from both the variance and the linear trend of extreme events, are identified over the northeastern and southern parts of Taiwan, as well as the island's central and southern regions, which show abrupt changes of differing trend and intensity. The PI can also be calculated from climate projection data to represent the characteristics of future extreme changes. The climate variability of PIs under the present (ALL) and future (RCP4.5 and RCP8.5) scenarios was evaluated using 16 models from the Coupled Model Intercomparison Project Phase 5 (CMIP5). The simulated present-day fluctuations in PIs are smaller than those of actual observations. In the 21st century, the RCP8.5 scenario shows that the PI significantly increases by 10% during the first half of the century and by 14% at the end of the century.
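    The standardization step described above (fit a GEV to the rx1day record, then map each value through its CDF) can be sketched directly with SciPy; the record below is synthetic, and the GEV parameters are arbitrary illustration values, not Taiwan's.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic 60-year rx1day record (mm), drawn from an arbitrary GEV.
rng = np.random.default_rng(42)
rx1day = genextreme.rvs(c=-0.1, loc=100, scale=30, size=60, random_state=rng)

# Fit a GEV to the block-maxima record, then standardize each year's
# rx1day into a probability index PI = F(x) in [0, 1].
c, loc, scale = genextreme.fit(rx1day)
pi = genextreme.cdf(rx1day, c, loc=loc, scale=scale)
print(pi.min(), pi.max())
```

    The PI is bounded and comparable across stations regardless of their local rainfall magnitudes, which is what makes area averaging and outlier-robust trend analysis possible.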

    PMH29 ATTENTION DEFICIT HYPERACTIVITY DISORDER MEDICATION CLINICAL PRIOR AUTHORIZATION PROGRAM'S IMPACT ON PRESCRIPTION DRUG UTILIZATION AND COSTS


    Attribute Equilibrium Dominance Reduction Accelerator (DCCAEDR) Based on Distributed Coevolutionary Cloud and Its Application in Medical Records

    © 2013 IEEE. Aimed at the tremendous challenge of attribute reduction for big data mining and knowledge discovery, we propose a new attribute equilibrium dominance reduction accelerator (DCCAEDR) based on the distributed coevolutionary cloud model. First, the framework of an N-population distributed coevolutionary MapReduce model is designed to divide the entire population into N subpopulations, sharing the reward of different subpopulations' solutions under a MapReduce cloud mechanism. Because the adaptive balancing between exploration and exploitation can be achieved in a better way, the reduction performance is guaranteed to be the same as that obtained using the whole independent data set. Second, a novel Nash equilibrium dominance strategy of elitists under the N bounded rationality regions is adopted to assist the subpopulations in attaining a stable status of Nash equilibrium dominance. This further enhances the accelerator's robustness against complex noise in big data. Third, an approximation parallelism mechanism based on MapReduce is constructed to implement rule reduction by accelerating the computation of attribute equivalence classes. Consequently, the entire attribute reduction set with the equilibrium dominance solution can be achieved. Extensive simulation results illustrate the effectiveness and robustness of the proposed DCCAEDR accelerator for attribute reduction on big data. Furthermore, the DCCAEDR is applied to attribute reduction for traditional Chinese medical records and to segmenting cortical surfaces in neonatal brain 3-D MRI records, where it shows superior results compared with representative algorithms.
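    The equivalence-class computation that the third contribution accelerates is the standard rough-set building block, sketched here serially for a tiny decision table; the table, attribute indices, and dependency measure are illustrative assumptions, not the paper's distributed MapReduce implementation.

```python
from collections import defaultdict

def equivalence_classes(table, attrs):
    """Group object indices by their values on the chosen condition attributes."""
    classes = defaultdict(list)
    for i, row in enumerate(table):
        classes[tuple(row[a] for a in attrs)].append(i)
    return list(classes.values())

def dependency(table, attrs, decision):
    """Fraction of objects whose equivalence class on `attrs` is
    consistent on the decision attribute (the positive region size)."""
    consistent = 0
    for cls in equivalence_classes(table, attrs):
        labels = {table[i][decision] for i in cls}
        if len(labels) == 1:
            consistent += len(cls)
    return consistent / len(table)

# Toy decision table: columns 0-2 are condition attributes, column 3 is the decision.
table = [
    (0, 1, 0, 'y'),
    (0, 1, 1, 'y'),
    (1, 0, 0, 'n'),
    (1, 0, 1, 'n'),
    (1, 1, 0, 'y'),
]
print(dependency(table, [0], 3))      # attribute 0 alone
print(dependency(table, [0, 1], 3))   # attributes {0, 1}
```

    An attribute subset whose dependency reaches that of the full attribute set is a candidate reduct; the MapReduce scheme in the paper parallelizes exactly this grouping over subpopulations.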