
    Non-minimal Derivative Coupling Scalar Field and Bulk Viscous Dark Energy

    Inspired by thermodynamical dissipative phenomena, we consider bulk viscosity for the dark fluid in a spatially flat two-component Universe. Our viscous dark energy model exhibits Phantom crossing while avoiding a Big-Rip singularity. We propose a non-minimal derivative coupling scalar field with zero potential that leads to accelerated expansion of the Universe in the framework of the bulk viscous dark energy model. In this approach, the coupling constant ($\kappa$) is related to the viscosity coefficient ($\gamma$) and the present-day energy density of dark energy ($\Omega_{\rm DE}^0$). The coupling is bounded as $\kappa \in [-1/(9H_0^2(1-\Omega_{\rm DE}^0)),\, 0]$, and $\gamma = 0$ leads to $\kappa = 0$. To perform a robust analysis, we implement recent observational data sets, including the Joint Light-curve Analysis (JLA) for SNIa, Gamma Ray Bursts (GRBs) as the most luminous astrophysical objects at high redshifts, Baryon Acoustic Oscillations (BAO) from different surveys, the Hubble parameter from the HST project, and {\it Planck} data for the CMB power spectrum and CMB lensing. A joint analysis of JLA+GRBs+BAO+HST shows that $\Omega_{\rm DE}^0 = 0.696 \pm 0.010$, $\gamma = 0.1404 \pm 0.0014$ and $H_0 = 68.1 \pm 1.3$ at the $1\sigma$ confidence interval. The {\it Planck} TT observation gives $\gamma = 0.32^{+0.31}_{-0.26}$ at the $68\%$ confidence limit for the viscosity coefficient. The tension in the Hubble parameter is alleviated in this model. The cosmographic distance ratio indicates that current observational data prefer increased bulk viscosity. Finally, the competition between Phantom and Quintessence behavior of the viscous dark energy model can accommodate cosmologically old objects reported as a sign of the age crisis in the $\Lambda$CDM model. Comment: 21 pages and 18 figures, some typos in equations fixed.
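    For orientation, the bound on the coupling can be written out explicitly and evaluated at the quoted best fit; the numerical value below is an illustration computed from the abstract's $\Omega_{\rm DE}^0 = 0.696$, not a result stated in the paper.

```latex
\kappa \in \left[-\frac{1}{9H_0^2\,\bigl(1-\Omega_{\rm DE}^0\bigr)},\; 0\right],
\qquad
-\frac{1}{9H_0^2\,(1-0.696)} = -\frac{1}{2.736\,H_0^2} \approx -0.37\,H_0^{-2},
\qquad
\gamma = 0 \;\Rightarrow\; \kappa = 0 .
```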

    Artificial Intelligence Meets Seafood Supply Chain Management

    Seafood supply chains (SSCs) face intricate challenges throughout the food journey, from harvest locations to our tables. This complexity demands a holistic view of supply chain management (SCM) to ensure that the interests of each entity within the chain are met. The perishable nature of seafood products complicates distribution efforts for industry owners, who grapple with mismanagement and inefficient operational logistics strategies that lower profits and hinder the goal of meeting market demand. Amidst these challenges, emerging technologies offer promising solutions for decision-makers to control the flow of physical goods in their businesses. Using the Design Science Research (DSR) method, this study developed an artifact to highlight the potential of artificial intelligence (AI) in decision-making scenarios. In this regard, a mathematical model (MM) is developed to deepen understanding of the SSC environment and make it possible to apply novel machine learning (ML) techniques that help authorities make better, profit-maximizing decisions about their logistics operations.

    High-velocity stars in the cores of globular clusters: The illustrative case of NGC 2808

    We report the detection of five high-velocity stars in the core of the globular cluster NGC 2808. The stars lie on the red giant branch and show total velocities between 40 and 45 km/s. For a core velocity dispersion sigma_c = 13.4 km/s, this corresponds to up to 3.4 sigma_c. These velocities are close to the estimated escape velocity (~ 50 km/s) and suggest an ejection from the core. Two of these stars have been confirmed in our recent integral field spectroscopy data, and we discuss them in more detail here. These two red giants are located at a projected distance of ~ 0.3 pc from the center. According to their positions on the color-magnitude diagram, both stars are cluster members. We investigate several possible origins for the high velocities of the stars and conceivable ejection mechanisms. Since the velocities are close to the escape velocity, it is not obvious whether the stars are bound or unbound to the cluster, so we consider both cases in our analysis. We perform numerical simulations of three-body dynamical encounters between binaries and single stars and compare the resulting velocity distributions of escapers with the velocities of our stars. We compare the predictions for a single dynamical encounter with a compact object with those of a sequence of two-body encounters due to relaxation. If the stars are unbound, the encounter must have taken place recently, when the stars were already in the giant phase. After including binary fractions, black-hole retention fractions, projection effects, and detection probabilities from Monte-Carlo simulations, we estimate the expected numbers of detections for all the different scenarios. Based on these numbers, we conclude that the most likely scenario is that the stars are bound and were accelerated by a single encounter between a binary of main-sequence stars and a ~ 10 M_sun black hole. Comment: 13 pages, 12 figures, accepted for publication in A&A.
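    As a quick arithmetic check of the quoted significance (an illustration, not a figure taken from the paper):

```latex
\frac{v_{\max}}{\sigma_c} \;=\; \frac{45~\mathrm{km\,s^{-1}}}{13.4~\mathrm{km\,s^{-1}}} \;\approx\; 3.4
```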

    Tree ensemble kernels for Bayesian optimization with known constraints over mixed-feature spaces

    Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search, as they achieve good predictive performance with little or no manual tuning, naturally handle discrete feature spaces, and are relatively insensitive to outliers in the training data. Two well-known challenges in using tree ensembles for black-box optimization are (i) effectively quantifying model uncertainty for exploration and (ii) optimizing over the piecewise-constant acquisition function. To address both points simultaneously, we propose using the kernel interpretation of tree ensembles as a Gaussian process prior to obtain model variance estimates, and we develop a compatible optimization formulation for the acquisition function. The latter further allows us to seamlessly integrate known constraints to improve sampling efficiency by incorporating domain knowledge in engineering settings and modeling search-space symmetries, e.g., hierarchical relationships in neural architecture search. Our framework performs as well as state-of-the-art methods for unconstrained black-box optimization over continuous/discrete features and outperforms competing methods on problems combining mixed-variable feature spaces and known input constraints. Comment: 27 pages, 9 figures, 4 tables.
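    The kernel interpretation mentioned above can be made concrete with a small sketch: two points are treated as similar when many trees route them to the same leaf, and that agreement fraction is used as a GP covariance to obtain a posterior variance for exploration. This is a minimal illustration assuming scikit-learn random forests; the function names and toy data are ours, and the paper's actual formulation (in particular the acquisition optimization) is not reproduced here.

```python
# Minimal sketch (not the paper's implementation) of the tree-ensemble kernel
# used as a GP prior to obtain variance estimates for Bayesian optimization.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def tree_kernel(forest, A, B):
    """k(a, b) = fraction of trees that place a and b in the same leaf."""
    leaves_A = forest.apply(A)          # shape (n_A, n_trees)
    leaves_B = forest.apply(B)          # shape (n_B, n_trees)
    return (leaves_A[:, None, :] == leaves_B[None, :, :]).mean(axis=2)

def gp_posterior(forest, X_train, y_train, X_query, noise=1e-6):
    """GP posterior mean/variance under the tree-agreement kernel."""
    K = tree_kernel(forest, X_train, X_train) + noise * np.eye(len(X_train))
    k_star = tree_kernel(forest, X_query, X_train)      # (n_query, n_train)
    mean = k_star @ np.linalg.solve(K, y_train)
    # Diagonal of the posterior covariance; k(x, x) = 1 for this kernel.
    var = 1.0 - np.einsum('ij,ij->i', k_star, np.linalg.solve(K, k_star.T).T)
    return mean, np.clip(var, 0.0, None)

# Toy usage: fit the ensemble, then score candidate points by mean/variance.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(40, 3))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
forest = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
X_cand = rng.uniform(-2, 2, size=(5, 3))
mu, var = gp_posterior(forest, X, y, X_cand)
```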

    Practical Path-based Bayesian Optimization

    There has been a surge in interest in data-driven experimental design, with applications to chemical engineering and drug manufacturing. Bayesian optimization (BO) has proven to be adaptable to such cases, since we can model the reactions of interest as expensive black-box functions. Sometimes, the cost of these black-box functions can be separated into two parts: (a) the cost of the experiment itself, and (b) the cost of changing the input parameters. In this short paper, we extend the SnAKe algorithm to deal with both types of costs simultaneously. We further propose extensions to the case of a maximum allowable input change, as well as to the multi-objective setting. Comment: 6 main pages, 12 with references and appendix. 4 figures, 2 tables. To appear in NeurIPS 2023 Workshop on Adaptive Experimental Design and Active Learning in the Real World.
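    As a rough illustration of how an input-change cost and a maximum allowable input change can enter a BO loop, the sketch below penalizes a standard expected-improvement acquisition by the distance moved from the previous input. This is a generic sketch, not the SnAKe algorithm or the paper's extension; `lambda_move` and `max_step` are assumed names.

```python
# Illustrative movement-cost-aware acquisition for Bayesian optimization.
# The posterior mean/std (mu, sigma) would come from whatever surrogate is used.
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, best_y):
    """Standard EI for minimization at a single candidate point."""
    sigma = max(sigma, 1e-12)
    z = (best_y - mu) / sigma
    return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def movement_penalized_acquisition(x, x_prev, mu, sigma, best_y,
                                   lambda_move=0.1, max_step=0.5):
    """EI minus a cost proportional to the input change; large moves are barred."""
    step = np.linalg.norm(np.asarray(x) - np.asarray(x_prev))
    if step > max_step:              # maximum allowable input change
        return -np.inf
    return expected_improvement(mu, sigma, best_y) - lambda_move * step
```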

    Benchmarking signal quality and spatiotemporal distribution of interictal spikes in prolonged human iEEG recordings using CorTec wireless brain interchange

    Neuromodulation through implantable pulse generators (IPGs) represents an important treatment approach for neurological disorders. While the field has observed the success of state-of-the-art interventions, such as deep brain stimulation (DBS) or responsive neurostimulation (RNS), implantable systems face various technical challenges, including the restriction of recording from a limited number of brain sites, power management, and limited continuous external access to the assessed neural data. To the best of our knowledge, for the first time in this study, we investigated the feasibility of recording human intracranial EEG (iEEG) using a benchtop version of the Brain Interchange (BIC) unit of CorTec, which is a portable, wireless, and externally powered implant with sensing and stimulation capabilities. We developed a MATLAB/Simulink-based rapid prototyping environment and a graphical user interface (GUI) to acquire and visualize the iEEG captured from all 32 channels of the BIC unit. We recorded prolonged iEEG (~ 24 h) from three human subjects with externalized depth leads using the BIC and commercially available clinical amplifiers simultaneously in the epilepsy monitoring unit (EMU). The iEEG signal quality of both streams was compared, and the results demonstrated comparable power spectral density (PSD) across all systems in the low-frequency band (< 80 Hz). However, notable differences were observed primarily above 100 Hz, where the clinical amplifiers had a lower noise floor (BIC -17 dB vs. clinical amplifiers < -25 dB). We employed an established spike detector to assess and compare the spike rates in each iEEG stream and observed over 90% conformity between the spike rates and their spatial distribution captured with the BIC and clinical systems. Additionally, we quantified the packet-loss characteristic of the iEEG signal during wireless data transfer and conducted a series of simulations to compare the performance of different interpolation methods for recovering the missing packets in signals at different frequency bands. We noted that simple linear interpolation has the potential to recover the signal and reduce the noise floor at modest packet-loss levels of up to 10%. Overall, our results indicate that, while the tethered clinical amplifiers exhibited a noticeably better noise floor above 80 Hz, epileptic spikes can still be detected successfully in the iEEG recorded with the externally powered wireless BIC unit, opening the way for future closed-loop neuromodulation applications with continuous access to brain activity.
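    The packet-recovery step can be illustrated with a short sketch of the simplest of the compared approaches, linear interpolation across samples flagged as lost. The masking scheme and toy signal below are our assumptions, not the study's simulation code.

```python
# Minimal sketch of recovering lost-packet samples with linear interpolation.
import numpy as np

def interpolate_lost_packets(signal, lost_mask):
    """Linearly interpolate samples flagged as lost (1-D signal, boolean mask)."""
    signal = np.asarray(signal, dtype=float)
    good = ~lost_mask
    idx = np.arange(signal.size)
    recovered = signal.copy()
    # np.interp fills each gap from the nearest valid neighbours on either side.
    recovered[lost_mask] = np.interp(idx[lost_mask], idx[good], signal[good])
    return recovered

# Toy usage: drop ~10% of samples in contiguous "packets" and recover them.
rng = np.random.default_rng(1)
fs = 1000                                   # assumed sampling rate, Hz
t = np.arange(0, 2.0, 1 / fs)
x = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(t.size)
mask = np.zeros(t.size, dtype=bool)
for start in rng.choice(t.size - 20, size=10, replace=False):
    mask[start:start + 20] = True           # simulate lost packets of 20 samples
x_rec = interpolate_lost_packets(np.where(mask, 0.0, x), mask)
```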

    A Sparse Representation Strategy to Eliminate Pseudo-HFO Events From Intracranial EEG for Seizure Onset Zone Localization

    Objective. High-frequency oscillations (HFOs) are considered a biomarker of the epileptogenic zone in intracranial EEG recordings. However, automated HFO detectors confound true oscillations with spurious events caused by the presence of artifacts. Approach. We hypothesized that, unlike pseudo-HFOs with sharp transients or arbitrary shapes, real HFOs have a signal characteristic that can be represented using a small number of oscillatory bases. Based on this hypothesis, and using a sparse representation framework, this study introduces a new classification approach to distinguish true HFOs from the pseudo-events that mislead seizure onset zone (SOZ) localization. Moreover, we further classified the HFOs into ripples and fast ripples by introducing an adaptive reconstruction scheme using sparse representation. By visualizing the raw waveforms and time-frequency representations of events recorded from 16 patients, three experts labelled 6400 candidate events that passed an initial amplitude-threshold-based HFO detector. We formed a redundant analytical multiscale dictionary built from smooth oscillatory Gabor atoms and represented each event with orthogonal matching pursuit, using a small number of dictionary elements. We used the approximation error and residual signal at each iteration to extract features that can distinguish HFOs from any type of artifact, regardless of its source. We validated our model on sixteen subjects with thirty minutes of continuous interictal iEEG recording from each. Main Results. We showed that the accuracy of SOZ detection after applying our method was significantly improved. In particular, we achieved 96.65% classification accuracy on labelled events and a 17.57% improvement in SOZ detection on continuous data. Our sparse representation framework can also distinguish between ripples and fast ripples. Significance. We show that, by using a sparse representation approach, we can remove pseudo-HFOs from the pool of events, improve the reliability of detected HFOs in large data sets, and minimize manual artifact elimination.
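    The sparse-representation idea can be sketched briefly: build a dictionary of Gaussian-windowed oscillatory atoms, greedily approximate each candidate event with orthogonal matching pursuit, and use how quickly the residual error decays as a feature; genuine HFOs are expected to decay fast, impulsive artifacts slowly. The atom parameters, event length, and feature choice below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative OMP over a Gabor-like dictionary with per-iteration residual error.
import numpy as np

def gabor_dictionary(n, freqs, widths, fs=2000.0):
    """Unit-norm atoms: Gaussian-windowed sinusoids at several scales/phases."""
    t = (np.arange(n) - n / 2) / fs
    atoms = []
    for f in freqs:
        for w in widths:
            for phase in (0.0, np.pi / 2):
                atom = np.exp(-(t / w) ** 2) * np.cos(2 * np.pi * f * t + phase)
                atoms.append(atom / np.linalg.norm(atom))
    return np.array(atoms).T                       # shape (n, n_atoms)

def omp_residual_curve(x, D, n_iters=10):
    """Greedy OMP; returns the relative residual energy after each selected atom."""
    residual, selected, errors = x.copy(), [], []
    for _ in range(n_iters):
        selected.append(int(np.argmax(np.abs(D.T @ residual))))
        A = D[:, selected]
        coef, *_ = np.linalg.lstsq(A, x, rcond=None)
        residual = x - A @ coef
        errors.append(np.linalg.norm(residual) / np.linalg.norm(x))
    return np.array(errors)    # fast decay suggests a genuine oscillation

# Toy usage: a true oscillation decays faster than a sharp-transient artifact.
fs, n = 2000.0, 512
D = gabor_dictionary(n, freqs=np.arange(80, 500, 40), widths=(0.01, 0.02, 0.05), fs=fs)
t = np.arange(n) / fs
hfo = np.exp(-((t - 0.128) / 0.02) ** 2) * np.sin(2 * np.pi * 150 * t)
spike = np.zeros(n)
spike[256] = 1.0                                   # impulsive artifact
print(omp_residual_curve(hfo, D)[:5], omp_residual_curve(spike, D)[:5])
```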