
    The Impact of Dyslexia on the Effectiveness of Online Learning: A Systematic Literature Review

    Dyslexia can affect online learning outcomes, yet few studies have examined the association between dyslexia and online learning effectiveness. This systematic review examined the effects of dyslexia on online learning effectiveness across three major categories: analysis of the impact, analysis of influencing factors, and analysis of interventions. A screening of two bibliographic databases identified 37 articles that met the inclusion criteria. The review found that dyslexic learners are uncomfortable with online learning and experience diminished self-confidence and decreased academic performance during the learning process; these difficulties also affect how learners evaluate themselves and reduce their self-efficacy. Research on influencing factors falls into two dimensions, internal and external: dyslexic learners are influenced both by the type of dyslexia and their psychological characteristics, and by teachers, teaching strategies, online educational environments, and educational media. These factors point to intervention strategies, such as developing customized online learning systems for dyslexic learners and exploring interventions in telerehabilitation medicine. However, no existing intervention strategies address adjustments to dyslexic learners' internal psychology together with external support systems. More research is therefore needed to explore the differential impact of dyslexia on online learning and the factors that produce it, so as to provide a theoretical basis and direction for instructional strategies for dyslexic learners and for adapting online learning to their needs.

    The Performance of Corporate Social Responsibility Communication in the Web2.0 Era: A Bibliometric Analysis of CSR Communication in Social Media Field

    With the advent of the Web 2.0 era, the role of CSR communication in social media has become a hot topic in both research and business. However, research in this area is still relatively new and far from fully explored. In this study, the CiteSpace software is used to analyze the literature in this field. The main results show that the use of social media for CSR communication is a long-term and significant research topic. An analysis of 490 research articles published in this field from 2007 to 2023 shows a year-by-year upward trend, with explosive growth beginning in 2017. The United States, China, and Spain are the countries that contribute the most to this field, and Perez Andrea, Camilleri Mark Anthony, and others are among the most influential authors. The co-author network is decentralized, while transnational cooperation takes the form of institutions and groups. Twelve clusters of high concern were identified, with "institutional theory," "web," and "citizenship" having been present the longest. The changes in 19 burst terms over 17 years (2007 to 2023) trace the evolution of the research frontier in the field, moving from the earliest "institutional theory" to "web" and "image," then to "citizenship," "perspective," "loyalty," "satisfaction," and others. This evolution can be divided into three phases: the initial phase (2007-2011), the debate phase (2012-2017), and the research specialization phase (2018-2023). Finally, the contributions, limitations, and further research directions of this paper are discussed.

    An Improved and Optimized Gated Recurrent Unit and Long Short-Term Memory Model for Fake News Detection

    This study presents a novel approach for detecting counterfeit news, employing an advanced hybrid model that integrates Enhanced Gated Recurrent Unit and Long Short-Term Memory networks, termed IGRU-LSTM. The dataset is assembled from the Information Security and Object Technology (ISOT) database and Wikipedia databases, and real and fake news items are identified from their news reviews. Because the dataset may contain unwanted content such as URLs and special symbols, which hampers efficient fake news detection, pre-processing steps are applied: special symbol removal, URL removal, upper- to lower-case conversion, and contraction replacement. The pre-processed data is then passed to a word-embedding procedure, and the IGRU-LSTM classifier is finally used to classify news as real or fake. Within the combined GRU-LSTM framework, the Enhanced Wild Horse Optimisation (EWHO) algorithm is incorporated to select optimal weighting parameters. The method is implemented in MATLAB. To evaluate its effectiveness, key performance metrics such as precision, recall, and accuracy are analyzed and compared with established methods including CNN-PSO, CNN-FO, and standard CNN.
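
    To make the described pipeline concrete, the sketch below re-creates its main steps in Python/Keras rather than the authors' MATLAB implementation: the named pre-processing operations, a word-embedding layer, and a stacked GRU-LSTM classifier. The vocabulary size, sequence length handling, and layer widths are assumed values, and the EWHO weight-optimisation step is not reproduced.
```python
# Illustrative Python/Keras sketch of the pipeline described above, not the
# paper's MATLAB implementation. Hyperparameters are assumptions; the EWHO
# weight-optimisation step is omitted.
import re
import tensorflow as tf

CONTRACTIONS = {"won't": "will not", "can't": "cannot", "n't": " not"}  # partial list (assumption)

def preprocess(text: str) -> str:
    """Apply the pre-processing steps named in the abstract."""
    text = text.lower()                                  # upper- to lower-case conversion
    text = re.sub(r"https?://\S+|www\.\S+", " ", text)   # URL removal
    for contraction, expansion in CONTRACTIONS.items():  # contraction replacement
        text = text.replace(contraction, expansion)
    text = re.sub(r"[^a-z\s]", " ", text)                # special symbol removal
    return re.sub(r"\s+", " ", text).strip()

VOCAB_SIZE, EMBED_DIM = 20_000, 128                      # assumed values

model = tf.keras.Sequential([
    tf.keras.Input(shape=(None,), dtype="int32"),        # padded token-id sequences
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),    # word embedding
    tf.keras.layers.GRU(64, return_sequences=True),      # GRU stage
    tf.keras.layers.LSTM(64),                            # LSTM stage
    tf.keras.layers.Dense(1, activation="sigmoid"),      # fake (1) vs real (0)
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy",
                       tf.keras.metrics.Precision(),
                       tf.keras.metrics.Recall()])

print(preprocess("BREAKING: you won't believe this, read https://example.com now!!!"))
```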

    How does financial communication affect millennials’ online shares purchase intention?

    The main aim of this article is to identify the influence of online financial communication disseminated through social media on brand attitude and on millennials' intention to purchase Shariah-compliant shares in Malaysia. A convenience sampling procedure using online questionnaires was used to test seven hypotheses derived from prior literature, and the data were analysed with the partial least squares structural equation modelling (PLS-SEM) technique in the SmartPLS 3 tool. The findings highlight that user-generated content (UGC) and firm-generated content (FGC) positively influence brand attitude. Drawing on the theory of reasoned action and signalling theory, the results also contradict previous literature in one respect: UGC has no direct effect on the intention to purchase Shariah-compliant shares online. This study suggests that companies should better understand how their official social media pages shape their brand, using these pages to foster a more positive attitude towards the brand.
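
    As a rough illustration of how such a hypothesis is assessed, the sketch below bootstraps a single structural path (UGC to purchase intention) estimated by ordinary least squares on synthetic data. It is a simplified stand-in for the PLS-SEM bootstrap run in SmartPLS, and the sample size and effect size are invented.
```python
# Simplified stand-in for the PLS-SEM bootstrap: percentile confidence interval
# for one path coefficient (UGC -> purchase intention) on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n = 300                                        # assumed sample size
ugc = rng.normal(size=n)                       # standardized UGC exposure scores
intention = 0.02 * ugc + rng.normal(size=n)    # weak/no direct effect, as reported

def path_coefficient(x, y):
    """OLS slope of y on x (mean-centred), a crude stand-in for a PLS path weight."""
    x, y = x - x.mean(), y - y.mean()
    return float(x @ y / (x @ x))

boot = []
for _ in range(5000):
    idx = rng.integers(0, n, n)                # resample respondents with replacement
    boot.append(path_coefficient(ugc[idx], intention[idx]))
low, high = np.percentile(boot, [2.5, 97.5])

print(f"path = {path_coefficient(ugc, intention):+.3f}, 95% CI [{low:+.3f}, {high:+.3f}]")
print("hypothesis supported" if low > 0 or high < 0 else "no direct effect detected")
```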

    Disentangling correlated scatter in cluster mass measurements

    The challenge of obtaining galaxy cluster masses is increasingly being addressed by multiwavelength measurements. As scatters in measured cluster masses are often sourced by properties of or around the clusters themselves, correlations between mass scatters are frequent and can be significant, with consequences for errors on mass estimates obtained both directly and via stacking. Using a high resolution 250 Mpc/h side N-body simulation, combined with proxies for observational cluster mass measurements, we obtain mass scatter correlations and covariances for 243 individual clusters along ~96 lines of sight each, both separately and together. Many of these scatters are quite large and highly correlated. We use principal component analysis (PCA) to characterize scatter trends and variations between clusters. PCA identifies combinations of scatters, or variations more generally, which are uncorrelated or non-covariant. The PCA combination of mass measurement techniques which dominates the mass scatter is similar for many clusters, and this combination is often present in a large amount when viewing the cluster along its long axis. We also correlate cluster mass scatter, environmental and intrinsic properties, and use PCA to find shared trends between these. For example, if the average measured richness, velocity dispersion and Compton decrement mass for a cluster along many lines of sight are high relative to its true mass, in our simulation the cluster's mass measurement scatters around this average are also high, its sphericity is high, and its triaxiality is low. Our analysis is based upon estimated mass distributions for fixed true mass. Extensions to observational data would require further calibration from numerical simulations, tuned to specific observational survey selection functions and systematics.
    Comment: 18 pages, 12 figures; final version to appear in MNRAS; helpful changes from referee and others incorporated.
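
    The PCA step described above can be sketched briefly: form the covariance matrix of the per-line-of-sight mass scatters from several proxies and diagonalize it to find the uncorrelated combinations that carry most of the variance. The proxies, correlation structure, and number of lines of sight below are synthetic placeholders, not values from the simulation.
```python
# Minimal PCA sketch on synthetic per-line-of-sight mass scatters for one
# cluster; proxy names and the injected correlation are assumptions.
import numpy as np

rng = np.random.default_rng(1)
n_los, proxies = 96, ["richness", "vel_disp", "compton_y"]

common = rng.normal(size=n_los)                      # shared (e.g. orientation-driven) term
scatter = np.column_stack([
    0.30 * common + 0.10 * rng.normal(size=n_los),   # ln(M_est/M_true), richness proxy
    0.25 * common + 0.15 * rng.normal(size=n_los),   # velocity-dispersion proxy
    0.20 * common + 0.05 * rng.normal(size=n_los),   # Compton-decrement proxy
])

cov = np.cov(scatter, rowvar=False)                  # covariance of the mass scatters
eigval, eigvec = np.linalg.eigh(cov)                 # eigenvalues in ascending order
order = eigval.argsort()[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

for k, (lam, vec) in enumerate(zip(eigval, eigvec.T)):
    combo = " ".join(f"{w:+.2f}*{p}" for w, p in zip(vec, proxies))
    print(f"PC{k}: {lam / eigval.sum():5.1%} of variance, direction {combo}")
```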

    Analysis of non-Gaussian CMB maps based on the N-pdf. Application to WMAP data

    We present a new method based on the N-point probability distribution (pdf) to study non-Gaussianity in cosmic microwave background (CMB) maps. Likelihood and Bayesian estimation are applied to a local non-linear perturbed model up to third order, characterized by a linear term described by a Gaussian N-pdf, and second and third order terms proportional to the square and the cube of the linear one. We also explore a set of model selection techniques (the Akaike and Bayesian Information Criteria, the minimum description length, the Bayesian Evidence and the Generalized Likelihood Ratio Test) and their application to decide whether a given data set is better described by the proposed local non-Gaussian model rather than by the standard Gaussian temperature distribution. As an application, we consider the analysis of the WMAP 5-year data at a resolution of around 2 degrees. At this angular scale (the Sachs-Wolfe regime), the non-Gaussian description proposed in this work defaults (under certain conditions) to an approximate local form of the weak non-linear coupling inflationary model (e.g. Komatsu & Spergel 2001) previously addressed in the literature. For this particular case, we obtain an estimate for the non-linear coupling parameter of -94 < F_nl < 154 at 95% CL. Model selection criteria likewise indicate that the Gaussian hypothesis is favored against the particular local non-Gaussian model proposed in this work. This result is in agreement with previous findings obtained for equivalent non-Gaussian models and with different non-Gaussian estimators. However, our estimator based on the N-pdf is more efficient than previous estimators and, therefore, provides tighter constraints on the coupling parameter at degree angular resolution.
    Comment: 13 pages, 4 figures; minor changes; accepted for publication in MNRAS.
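
    As a toy illustration of the model-selection machinery listed above, the sketch below compares a Gaussian fit with a skew-normal alternative on simulated data using maximum likelihood, AIC, BIC, and a generalized likelihood ratio test. The skew-normal model is only a stand-in; the paper's local non-Gaussian N-pdf model is not reproduced.
```python
# Toy model-selection sketch (AIC, BIC, likelihood ratio) on simulated data.
# The skew-normal alternative is an assumed stand-in, not the paper's model.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(size=4000)                      # toy "temperature" sample, truly Gaussian

fits = {
    "Gaussian":   (stats.norm,     stats.norm.fit(x)),
    "non-Gauss.": (stats.skewnorm, stats.skewnorm.fit(x)),
}
n = x.size
lnL = {name: dist.logpdf(x, *pars).sum() for name, (dist, pars) in fits.items()}
k   = {name: len(pars) for name, (_, pars) in fits.items()}

for name in fits:
    aic = 2 * k[name] - 2 * lnL[name]
    bic = k[name] * np.log(n) - 2 * lnL[name]
    print(f"{name:11s} lnL = {lnL[name]:9.2f}  AIC = {aic:9.2f}  BIC = {bic:9.2f}")

# Generalized likelihood ratio test: 2*(lnL_alt - lnL_null) ~ chi2(extra params).
glrt = 2.0 * (lnL["non-Gauss."] - lnL["Gaussian"])
p_value = stats.chi2.sf(max(glrt, 0.0), df=k["non-Gauss."] - k["Gaussian"])  # clip tiny negatives
print(f"GLRT = {glrt:.2f}, p = {p_value:.3f} (the Gaussian model is favored when p is large)")
```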

    First Low-Latency LIGO+Virgo Search for Binary Inspirals and their Electromagnetic Counterparts

    Aims. The detection and measurement of gravitational waves from coalescing neutron-star binary systems is an important science goal for ground-based gravitational-wave detectors. In addition to emitting gravitational waves at frequencies that span the most sensitive bands of the LIGO and Virgo detectors, these sources are also amongst the most likely to produce an electromagnetic counterpart to the gravitational-wave emission. A joint detection of the gravitational-wave and electromagnetic signals would provide a powerful new probe for astronomy. Methods. During the period between September 19 and October 20, 2010, the first low-latency search for gravitational waves from binary inspirals in LIGO and Virgo data was conducted. The resulting triggers were sent to electromagnetic observatories for follow-up. We describe the generation and processing of the low-latency gravitational-wave triggers. The results of the electromagnetic image analysis will be described elsewhere. Results. Over the course of the science run, three gravitational-wave triggers passed all of the low-latency selection cuts. Of these, one was followed up by several of our observational partners. Analysis of the gravitational-wave data leads to an estimated false alarm rate of once every 6.4 days, falling far short of the requirement for a detection based solely on gravitational-wave data.
    Comment: 13 pages, 13 figures. For a repository of data used in the publication, go to: http://dcc.ligo.org/cgi-bin/DocDB/ShowDocument?docid=P1100065 Also see the announcement for this paper on ligo.org at: http://www.ligo.org/science/Publication-S6CBCLowLatency
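
    For orientation, a false alarm rate of this kind is typically estimated by counting time-shifted (background) coincidences at or above the candidate's ranking and dividing by the effective background livetime. The sketch below uses invented counts chosen only so that the result lands near the once-per-6.4-days figure quoted above.
```python
# Back-of-the-envelope false-alarm-rate estimate; the counts and livetime are
# invented numbers, not values from the LIGO-Virgo search.
import math

background_triggers = 1562         # time-shifted coincidences at or above the candidate's rank (assumed)
background_livetime_days = 10_000  # effective livetime accumulated over many time slides (assumed)

far_per_day = background_triggers / background_livetime_days
print(f"FAR ~ once every {1 / far_per_day:.1f} days")

# Probability of at least one background event this significant during the run,
# treating background triggers as a Poisson process.
observing_days = 31                # roughly September 19 to October 20, 2010
p_false = 1.0 - math.exp(-far_per_day * observing_days)
print(f"P(>=1 false alarm in {observing_days} days) ~ {p_false:.2f}")
```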

    Planck 2015 results. XVIII. Background geometry and topology of the Universe

    Maps of cosmic microwave background (CMB) temperature and polarization from the 2015 release of Planck data provide the highest-quality full-sky view of the surface of last scattering available to date. This enables us to detect possible departures from a globally isotropic cosmology. We present the first searches using CMB polarization for correlations induced by a possible non-trivial topology with a fundamental domain that intersects, or nearly intersects, the last-scattering surface (at comoving distance χ_rec), both via a direct scan for matched circular patterns at the intersections and by an optimal likelihood calculation for specific topologies. We specialize to flat spaces with cubic toroidal (T3) and slab (T1) topologies, finding that explicit searches for the latter are sensitive to other topologies with antipodal symmetry. These searches yield no detection of a compact topology with a scale below the diameter of the last-scattering surface. The limits on the radius ℛ_i of the largest sphere inscribed in the fundamental domain (at log-likelihood ratio Δlnℒ > −5 relative to a simply-connected flat Planck best-fit model) are: ℛ_i > 0.97 χ_rec for the T3 cubic torus; and ℛ_i > 0.56 χ_rec for the T1 slab. The limit for the T3 cubic torus from the matched-circles search is numerically equivalent, ℛ_i > 0.97 χ_rec at 99% confidence level from polarization data alone. We also perform a Bayesian search for an anisotropic global Bianchi VII_h geometry. In the non-physical setting, where the Bianchi cosmology is decoupled from the standard cosmology, Planck temperature data favour the inclusion of a Bianchi component with a Bayes factor of at least 2.3 units of log-evidence. However, the cosmological parameters that generate this pattern are in strong disagreement with those found from CMB anisotropy data alone. Fitting the induced polarization pattern for this model to the Planck data requires an amplitude of −0.10 ± 0.04 compared to the value of +1 if the model were to be correct. In the physically motivated setting, where the Bianchi parameters are coupled and fitted simultaneously with the standard cosmological parameters, we find no evidence for a Bianchi VII_h cosmology and constrain the vorticity of such models to (ω/H)₀ < 7.6 × 10⁻¹⁰ (95% CL).
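
    A small arithmetic aside on the Bayes factor quoted above: a difference of 2.3 units of log-evidence corresponds to odds of roughly 10 to 1 in favour of the Bianchi component, assuming equal prior odds for the two models.
```python
# Converting a log-evidence difference into a Bayes factor and a posterior
# model probability, assuming equal prior odds (an assumption of this note).
import math

delta_ln_Z = 2.3                               # "at least 2.3 units of log-evidence"
bayes_factor = math.exp(delta_ln_Z)            # ~ 10
posterior_prob = bayes_factor / (1.0 + bayes_factor)
print(f"Bayes factor ~ {bayes_factor:.1f}, posterior probability ~ {posterior_prob:.2f}")
```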

    Dark matter interpretations of ATLAS searches for the electroweak production of supersymmetric particles in √s = 8 TeV proton-proton collisions

    A selection of searches by the ATLAS experiment at the LHC for the electroweak production of SUSY particles is used to study their impact on the constraints on dark matter candidates. The searches use 20 fb⁻¹ of proton-proton collision data at √s = 8 TeV. A likelihood-driven scan of a five-dimensional effective model focusing on the gaugino-higgsino and Higgs sector of the phenomenological minimal supersymmetric Standard Model is performed. This scan uses data from direct dark matter detection experiments, the relic dark matter density and precision flavour physics results. Further constraints from the ATLAS Higgs mass measurement and SUSY searches at LEP are also applied. A subset of models selected from this scan is used to assess the impact of the selected ATLAS searches in this five-dimensional parameter space. These ATLAS searches substantially impact those models for which the mass m(χ̃₁⁰) of the lightest neutralino is less than 65 GeV, excluding 86% of such models. The searches have limited impact on models with larger m(χ̃₁⁰) due to either heavy electroweakinos or compressed mass spectra, where the mass splittings between the produced particles and the lightest supersymmetric particle are small.
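
    The exclusion statement above amounts to simple bookkeeping over the scanned model points, as the toy sketch below illustrates; the scan range, exclusion probabilities, and model counts are invented, not taken from the ATLAS study.
```python
# Toy bookkeeping sketch: tag each scanned model point with its lightest-
# neutralino mass and an excluded/allowed flag, then quote the excluded
# fraction in the light-neutralino region. All numbers are invented.
import numpy as np

rng = np.random.default_rng(3)
n_models = 10_000
m_chi10 = rng.uniform(10.0, 500.0, n_models)               # GeV, assumed scan range
excluded = rng.random(n_models) < np.where(m_chi10 < 65.0, 0.86, 0.05)

light = m_chi10 < 65.0
print(f"excluded fraction for m(chi_1^0) < 65 GeV: {excluded[light].mean():.0%} "
      f"({excluded[light].sum()} of {light.sum()} models)")
```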

    Multimessenger Search for Sources of Gravitational Waves and High-Energy Neutrinos: Results for Initial LIGO-Virgo and IceCube

    We report the results of a multimessenger search for coincident signals from the LIGO and Virgo gravitational-wave observatories and the partially completed IceCube high-energy neutrino detector, including periods of joint operation between 2007-2010. These include parts of the 2005-2007 run and the 2009-2010 run for LIGO-Virgo, and IceCube's observation periods with 22, 59 and 79 strings. We find no significant coincident events, and use the search results to derive upper limits on the rate of joint sources for a range of source emission parameters. For the optimistic assumption of a gravitational-wave emission energy of 10⁻² M⊙c² at ~150 Hz with ~60 ms duration, and a high-energy neutrino emission of 10⁵¹ erg, comparable to the isotropic gamma-ray energy of gamma-ray bursts, we limit the source rate to below 1.6 × 10⁻² Mpc⁻³ yr⁻¹. We also examine how combining information from gravitational waves and neutrinos will aid discovery in the advanced gravitational-wave detector era.
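
    The rate limit above follows the usual zero-detection logic: with no coincident events and negligible background, the 90% CL Poisson upper limit on the expected number of events is about 2.3, which is then divided by the product of the sensitive volume and the coincident livetime. The sketch below uses placeholder values for those two quantities, chosen only so that the toy result is of the same order as the quoted limit.
```python
# Zero-event Poisson rate limit; the sensitive volume and livetime are
# placeholders, not the values used in the LIGO-Virgo-IceCube analysis.
import math

mu_90 = -math.log(0.10)            # ~2.30: P(0 events | mu_90) = 10%
sensitive_volume_mpc3 = 500.0      # assumed volume within which the assumed emission is detectable
coincident_livetime_yr = 0.29      # assumed joint livetime

rate_limit = mu_90 / (sensitive_volume_mpc3 * coincident_livetime_yr)
print(f"R_90 < {rate_limit:.1e} Mpc^-3 yr^-1")
```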