
    Online classification for time-domain astronomy

    The advent of synoptic sky surveys has spurred the development of techniques for real-time classification of astronomical sources in order to ensure timely follow-up with appropriate instruments. Previous work has focused on algorithm selection or improved light curve representations, naively converting light curves into structured feature sets without regard for the time span or phase of the light curves. In this paper, we highlight the violation of a fundamental machine learning assumption that occurs when archival light curves with long observational time spans are used to train classifiers that are applied to light curves with fewer observations. We propose two solutions to deal with the mismatch in the time spans of training and test light curves. The first is the use of classifier committees, where each classifier is trained on light curves of a different observational time span. Only the committee member whose training set matches the test light curve time span is invoked for classification. The second solution uses hierarchical classifiers that are able to predict source types both individually and by sub-group, so that the user can trade off an earlier, more robust classification against classification granularity. We test both methods using light curves from the MACHO survey, and demonstrate their usefulness in improving performance over similar methods that naively train on all available archival data. Comment: Astroinformatics workshop, IEEE International Conference on Data Mining 201
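
    A minimal sketch of the committee approach described above, assuming scikit-learn-style classifiers, a hypothetical time-span binning, and pre-extracted feature vectors (none of which are specified in the abstract):

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    # Hypothetical time-span bins in days; the paper's actual binning is not given here.
    SPAN_BINS = [(0, 50), (50, 200), (200, 1000), (1000, float("inf"))]

    def span_bin(time_span_days):
        """Return the bin whose range contains the given observational time span."""
        for lo, hi in SPAN_BINS:
            if lo <= time_span_days < hi:
                return (lo, hi)

    class SpanMatchedCommittee:
        """One classifier per time-span bin; only the matching member classifies a test curve."""
        def __init__(self):
            self.members = {b: RandomForestClassifier(n_estimators=200) for b in SPAN_BINS}

        def fit(self, features, labels, spans):
            # Train each committee member only on archival light curves whose
            # observational time span falls inside that member's bin.
            for b, clf in self.members.items():
                mask = np.array([span_bin(s) == b for s in spans])
                if mask.any():
                    clf.fit(features[mask], labels[mask])

        def predict(self, feature_vector, time_span_days):
            # Invoke only the member whose training span matches the test light curve.
            clf = self.members[span_bin(time_span_days)]
            return clf.predict(np.asarray(feature_vector).reshape(1, -1))[0]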

    EVALUATION AND OPTIMIZATION OF ANTIOXIDANT POTENTIALITY OF XYLARIA FEEJEENSIS HMJAU22039

    Objective: Antioxidants neutralize free radicals generated in the human body and prevent them from causing damage. The present study aims to evaluate and optimize the antioxidant activity of Xylaria feejeensis HMJAU22039, an endophyte isolated from Tectona grandis. Methods: The antioxidant potential was measured by 1,1-diphenyl-2-picryl-hydrazyl (DPPH) radical scavenging assay and nitric oxide (NO) scavenging assay. Total phenolic content (TPC) of the samples was measured by a Folin–Ciocalteu reagent based assay, and the values were obtained from the regression equation: y = 0.006141x + 0.157733. Results: Potato dextrose yeast extract broth medium was selected as the basal medium as it showed high antioxidant activity. The basal medium was optimized with 2% dextrose, 0.2% yeast extract, and 200 g/L potato infusion. Dextrose and yeast extract were selected as carbon and nitrogen sources, respectively, as they yielded high antioxidant activity. The antioxidant activity and phenolic content were highest under static conditions (18.14 mg/g). An incubation temperature of 30°C and an initial medium pH of 6 were found to be the optimum conditions for high antioxidant activity. An incubation period of 20 days gave the highest antioxidant activity and phenolic content. The methanol extract recorded the highest antioxidant activity by DPPH (73.86%), followed by the ethyl acetate and chloroform fractions. Conclusion: The study highlights the importance of different physicochemical parameters in the production of secondary metabolites having antioxidant properties. The results reveal a significant positive correlation between the DPPH radical scavenging assay, NO scavenging assay, and TPC. Keywords: 1,1-diphenyl-2-picryl-hydrazyl, Total phenolic content, Endophytic fungus, Tectona grandis
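
    A worked illustration of how the reported calibration y = 0.006141x + 0.157733 might be inverted to express total phenolic content in mg gallic acid equivalents per gram of extract; the interpretation of y as absorbance, x as µg/mL gallic acid equivalents, and the unit handling are assumptions, not values stated in the abstract:

    def total_phenolic_content(absorbance, extract_mass_g, volume_ml):
        """Invert the reported calibration y = 0.006141x + 0.157733 to estimate TPC.
        Assumes y is the Folin-Ciocalteu absorbance and x is the gallic acid
        equivalent concentration in ug/mL; units are illustrative only."""
        gae_ug_per_ml = (absorbance - 0.157733) / 0.006141
        total_gae_mg = gae_ug_per_ml * volume_ml / 1000.0   # ug -> mg over the assay volume
        return total_gae_mg / extract_mass_g                 # mg GAE per g of extract

    # Example: an absorbance of 0.60 measured in a 10 mL assay of 0.04 g of extract
    print(round(total_phenolic_content(0.60, extract_mass_g=0.04, volume_ml=10), 2))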

    STATISTICAL OPTIMIZATION OF MEDIUM COMPONENTS BY PLACKETT BURMAN DESIGN AND RESPONSE SURFACE METHODOLOGY FOR ENHANCED ANTIOXIDANT ACTIVITY BY XYLARIA FEEJEENSIS HMJAU22039

    Objective: To enhance the antioxidant potential of Xylaria feejeensis by statistical optimization. The components of potato dextrose yeast extract broth (PDYEB) medium (potato infusion, dextrose, and yeast extract) were investigated to optimize antioxidant activity by Plackett–Burman design (P-B design). The interaction between the critical components of the medium and the incubation time was further investigated by response surface methodology (RSM), and the culture conditions were optimized for enhanced antioxidant activity. A central composite design (CCD) was employed, and a quadratic response surface model was used to analyze the data using analysis of variance (ANOVA). Methods: The antioxidant potential was measured by 1,1-diphenyl-2-picryl-hydrazyl (DPPH) radical scavenging assay and nitric oxide (NO) scavenging assay. The antioxidant activity was compared with the total phenolic content (TPC) measured by a Folin–Ciocalteu (FC) reagent based assay. Results: The Plackett–Burman design revealed dextrose and yeast extract to be the most significant components of the PDYEB medium (p<0.0001). ANOVA showed that the model was highly significant (p<0.0001) for antioxidant activity measured by DPPH scavenging, NO scavenging, and TPC. The model was validated by applying the optimized conditions, which yielded values of 87.38%, 56.43%, and 27.28 mg/g for DPPH scavenging, NO scavenging, and TPC, respectively. The estimated optimum levels of the variables for antioxidant activity and phenolic content were 12.72 g of dextrose, 1.67 g of yeast extract, and an incubation period of 23.43 days. Conclusion: The study highlights the importance of the various components of the medium and the physicochemical constraints on the antioxidant potential and phenol production of the fungal strain.
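
    A minimal sketch of fitting the kind of quadratic response-surface model used with a central composite design, assuming an illustrative coded design and synthetic responses rather than the paper's actual assay data:

    import itertools
    import numpy as np

    def quadratic_design_matrix(X):
        """Full quadratic RSM model: intercept, linear, two-way interaction, and squared terms."""
        x1, x2, x3 = X.T
        return np.column_stack([np.ones(len(X)), x1, x2, x3,
                                x1 * x2, x1 * x3, x2 * x3, x1**2, x2**2, x3**2])

    # Hypothetical coded central composite design: 8 factorial + 6 axial + 1 center point,
    # for three factors (dextrose, yeast extract, incubation time).
    factorial = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
    axial = np.array([[a if i == j else 0.0 for j in range(3)]
                      for i in range(3) for a in (-1.68, 1.68)])
    X = np.vstack([factorial, axial, np.zeros((1, 3))])

    # Illustrative responses (DPPH % scavenging); real values would come from the assays.
    rng = np.random.default_rng(0)
    y = (85 - 3 * X[:, 0]**2 - 4 * X[:, 1]**2 - 2 * X[:, 2]**2
         + 1.5 * X[:, 0] * X[:, 1] + rng.normal(0, 1, len(X)))

    beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
    print(beta)  # fitted surface coefficients; the stationary point gives the optimum settings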

    Statistical Prediction of [CII] Observations by Constructing Probability Density Functions using SOFIA, Herschel, and Spitzer Observations

    We present a statistical algorithm for predicting the [CII] emission from Herschel and Spitzer continuum images using probability density functions between the [CII] emission and continuum emission. The [CII] emission at 158 μm is a critical tracer in studying the life cycle of the interstellar medium and galaxy evolution. Unfortunately, its frequency is in the far infrared (FIR), which is opaque through the troposphere and cannot be observed from the ground except for highly red-shifted sources (z ≳ 2). Typically, [CII] observations of closer regions have been carried out using suborbital or space observatories. Given the high cost of these facilities and limited time availability, it is important to have highly efficient observations and operations in terms of maximizing science returns. This requires accurate prediction of the strength of emission lines and, therefore, the time required for their observation. However, [CII] emission has been hard to predict due to a lack of strong correlations with other observables. Here we adopt a new approach to making accurate predictions of [CII] emission by relating this emission simultaneously to several tracers of dust emission in the same region. This is done using a statistical methodology utilizing probability density functions (PDFs) among [CII] emission and Spitzer IRAC and Herschel PACS/SPIRE images. Our test toward a star-forming region, RCW 120, demonstrates that our methodology delivers high-quality predictions with less than 30% uncertainties over 80% of the entire observation area, which is more than sufficient to test observation feasibility and maximize science return. The pickle dump files storing the PDFs and the trained neural network module are accessible upon request and will support future far-infrared missions, for example, GUSTO and the FIR Probe. Comment: 7 figures
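
    A minimal sketch of the PDF-based prediction idea using a single continuum tracer and a binned joint PDF; the paper combines several Spitzer/Herschel bands and a trained neural network, so this reduced, synthetic-data version is illustrative only:

    import numpy as np

    def build_joint_pdf(continuum, cii, bins=64):
        """Estimate the joint PDF P(continuum, [CII]) from pixels where both maps exist."""
        hist, cont_edges, cii_edges = np.histogram2d(continuum, cii, bins=bins, density=True)
        return hist, cont_edges, cii_edges

    def predict_cii(continuum_value, hist, cont_edges, cii_edges):
        """Predict [CII] as the conditional expectation E[[CII] | continuum]."""
        i = np.clip(np.searchsorted(cont_edges, continuum_value) - 1, 0, hist.shape[0] - 1)
        conditional = hist[i]
        if conditional.sum() == 0:
            return np.nan
        centers = 0.5 * (cii_edges[:-1] + cii_edges[1:])
        return np.sum(centers * conditional) / conditional.sum()

    # Toy example with a synthetic correlation between continuum and [CII] intensity
    rng = np.random.default_rng(1)
    continuum = rng.lognormal(0, 0.5, 10000)
    cii = 2.0 * continuum + rng.normal(0, 0.3, 10000)
    pdf = build_joint_pdf(continuum, cii)
    print(predict_cii(1.0, *pdf))  # close to 2.0 for this toy relation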

    Collaborative Supervised Learning for Sensor Networks

    Collaboration methods for distributed machine-learning algorithms involve the specification of communication protocols for the learners, which can query other learners and/or broadcast their findings preemptively. Each learner incorporates information from its neighbors into its own training set, and the learners are thereby able to bootstrap each other to higher performance. Each learner resides at a different node in the sensor network and makes observations (collects data) independently of the other learners. After being seeded with an initial labeled training set, each learner proceeds to learn in an iterative fashion. New data are collected and classified. The learner can then either broadcast its most confident classifications for use by other learners, or query neighbors for their classifications of its least confident items. As such, collaborative learning combines elements of both passive (broadcast) and active (query) learning. It also uses ideas from ensemble learning to combine the multiple responses to a given query into a single useful label. This approach has been evaluated against current non-collaborative alternatives, including training a single classifier and deploying it at all nodes with no further learning possible, and permitting learners to learn from their own most confident judgments, absent interaction with their neighbors. On several data sets, active collaboration has consistently been found to be the best strategy for a distributed learner network. The main advantages include the ability for learning to take place autonomously by collaboration rather than by requiring intervention from an oracle (usually human), and the ability to learn in a distributed environment, permitting decisions to be made in situ and yielding faster response times.
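
    A minimal sketch of one collaboration round as described above, with hypothetical confidence thresholds and majority-vote ensembling of neighbor replies (details the abstract does not specify):

    import numpy as np
    from collections import Counter
    from sklearn.linear_model import LogisticRegression

    class Node:
        """One learner in the sensor network, seeded with a small labeled set (>= 2 classes)."""
        def __init__(self, X_seed, y_seed):
            self.clf = LogisticRegression(max_iter=1000).fit(X_seed, y_seed)
            self.X, self.y = list(X_seed), list(y_seed)

        def confidence(self, x):
            return self.clf.predict_proba([x]).max()

        def collaborate(self, X_new, neighbors, broadcast_thr=0.95, query_thr=0.60):
            for x in X_new:
                c = self.confidence(x)
                if c >= broadcast_thr:
                    # Passive/broadcast step: share a confident self-label with neighbors.
                    label = self.clf.predict([x])[0]
                    for nb in neighbors:
                        nb.X.append(x); nb.y.append(label)
                elif c <= query_thr:
                    # Active/query step: ask neighbors, take the majority (ensemble) answer.
                    votes = [nb.clf.predict([x])[0] for nb in neighbors]
                    label = Counter(votes).most_common(1)[0][0]
                    self.X.append(x); self.y.append(label)
            self.clf.fit(np.array(self.X), np.array(self.y))  # retrain on the grown set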

    Characterization of dimensional changes of cement pastes and mortars in fresh state applying an interferometric technique

    The effect produced by the incorporation of additives in Portland cement based materials on the dimensional changes occurring during the setting process was evaluated employing a fiber optic Fizeau interferometric sensor. The sensor system employed a broadband light source (SLED) centered at 1550 nm, whose spectral emission was modulated by the interferometer formed between the material surface and the end of the optical fiber used to illuminate the sample. An optical spectrum analyzer was used to monitor the variation of the modulated spectrum while the setting process took place. The expansion or contraction experienced by materials with different compositions was observed and quantified. The results obtained demonstrate the accuracy and the potential of the technique.
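
    For context, a minimal sketch of the standard Fizeau relation that connects the period of the spectral modulation around the 1550 nm SLED center to the fiber-to-surface gap; the relation and the example numbers are general interferometry, not values quoted in the abstract:

    def fizeau_gap_length(center_wavelength_nm, fringe_period_nm, n_medium=1.0):
        """Estimate the Fizeau cavity length (fiber end to sample surface) from the
        wavelength period of the modulated spectrum: 2*n*d = lambda^2 / delta_lambda."""
        d_nm = center_wavelength_nm**2 / (2.0 * n_medium * fringe_period_nm)
        return d_nm * 1e-3  # micrometres

    # Example: a 4.0 nm fringe period around 1550 nm corresponds to a gap of roughly 300 um;
    # tracking this gap over time gives the expansion or contraction of the paste surface.
    print(round(fizeau_gap_length(1550.0, 4.0), 1))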

    The IPAC Image Subtraction and Discovery Pipeline for the intermediate Palomar Transient Factory

    We describe the near real-time transient-source discovery engine for the intermediate Palomar Transient Factory (iPTF), currently in operations at the Infrared Processing and Analysis Center (IPAC), Caltech. We coin this system the IPAC/iPTF Discovery Engine (or IDE). We review the algorithms used for PSF-matching, image subtraction, detection, photometry, and machine-learned (ML) vetting of extracted transient candidates. We also review the performance of our ML classifier. For a limiting signal-to-noise ratio of 4 in relatively unconfused regions, "bogus" candidates from processing artifacts and imperfect image subtractions outnumber real transients by ~ 10:1. This can be considerably higher for image data with inaccurate astrometric and/or PSF-matching solutions. Despite this occasionally high contamination rate, the ML classifier is able to identify real transients with an efficiency (or completeness) of ~ 97% for a maximum tolerable false-positive rate of 1% when classifying raw candidates. All subtraction-image metrics, source features, ML probability-based real-bogus scores, contextual metadata from other surveys, and possible associations with known Solar System objects are stored in a relational database for retrieval by the various science working groups. We review our efforts in mitigating false-positives and our experience in optimizing the overall system in response to the multitude of science projects underway with iPTF. Comment: 66 pages, 21 figures, 7 tables, accepted by PAS
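
    A minimal sketch of how an operating point like the one quoted above (completeness at a maximum tolerable false-positive rate) can be read off machine-learned real-bogus scores; the scores and labels below are synthetic, not iPTF candidates:

    import numpy as np

    def efficiency_at_fpr(scores, labels, max_fpr=0.01):
        """Choose the loosest score threshold whose false-positive rate stays below max_fpr,
        and report the resulting efficiency (completeness) on real transients."""
        scores, labels = np.asarray(scores), np.asarray(labels).astype(bool)
        thresholds = np.unique(scores)[::-1]          # scan from strictest to loosest cut
        best = (1.0, 0.0)                             # (threshold, efficiency)
        for t in thresholds:
            selected = scores >= t
            fpr = np.mean(selected[~labels]) if (~labels).any() else 0.0
            if fpr <= max_fpr:
                best = (t, np.mean(selected[labels]))
            else:
                break
        return best

    # Toy real-bogus scores: True = real transient, False = bogus artifact
    rng = np.random.default_rng(2)
    labels = rng.random(5000) < 0.1                   # roughly 10:1 bogus-to-real, as in the text
    scores = np.where(labels, rng.beta(8, 2, 5000), rng.beta(2, 8, 5000))
    print(efficiency_at_fpr(scores, labels, max_fpr=0.01))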

    Small Near-Earth Asteroids in the Palomar Transient Factory Survey: a Real-Time Streak-detection System

    Near-Earth asteroids (NEAs) in the 1–100 meter size range are estimated to be ~1,000 times more numerous than the ~15,000 currently cataloged NEAs, most of which are in the 0.5–10 kilometer size range. Impacts from 10–100 meter size NEAs are not statistically life-threatening, but may cause significant regional damage, while 1–10 meter size NEAs with low velocities relative to Earth are compelling targets for space missions. We describe the implementation and initial results of a real-time NEA-discovery system specialized for the detection of small, high angular rate (visually streaked) NEAs in Palomar Transient Factory (PTF) images. PTF is a 1.2-m aperture, 7.3 deg^2 field of view (FOV) optical survey designed primarily for the discovery of extragalactic transients (e.g., supernovae) in 60-second exposures reaching ~20.5 visual magnitude. Our real-time NEA discovery pipeline uses a machine-learned classifier to filter a large number of false-positive streak detections, permitting a human scanner to efficiently and remotely identify real asteroid streaks during the night. Upon recognition of a streaked NEA detection (typically within an hour of the discovery exposure), the scanner triggers follow-up with the same telescope and posts the observations to the Minor Planet Center for worldwide confirmation. We describe our 11 initial confirmed discoveries, all small NEAs that passed 0.3–15 lunar distances from Earth. Lastly, we derive useful scaling laws for comparing streaked-NEA-detection capabilities of different surveys as a function of their hardware and survey-pattern characteristics. This work most directly informs estimates of the streak-detection capabilities of the Zwicky Transient Facility (ZTF, planned to succeed PTF in 2017), which will apply PTF's current resolution and sensitivity over a 47-deg^2 FOV.
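
    A minimal sketch of the simplest ingredient of such scaling laws: the trail length on a single exposure as a function of angular rate, exposure time, and pixel scale (the numbers are illustrative; the paper's scaling laws also fold in sensitivity and survey-pattern terms):

    def streak_length_pixels(angular_rate_deg_per_day, exposure_s, pixel_scale_arcsec):
        """Length of an asteroid trail on a single exposure, in pixels."""
        rate_arcsec_per_s = angular_rate_deg_per_day * 3600.0 / 86400.0
        return rate_arcsec_per_s * exposure_s / pixel_scale_arcsec

    # Example: an NEA moving 10 deg/day in a 60 s exposure at ~1 arcsec/pixel
    print(round(streak_length_pixels(10.0, 60.0, 1.0), 1))  # ~25 pixel streak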