
    An Analysis of Renewable Energy Usage by Mobile Data Network Operators

    The exponential growth in mobile data traffic has resulted in massive energy usage and has therefore increased the carbon footprint of the Internet. Data network operators have taken significant initiatives to mitigate the negative impacts of carbon emissions (CE). Renewable Energy Sources (RES) have emerged as the most promising way to reduce carbon emissions. This article presents the role of renewable energy (RE) in minimizing the environmental impact of mobile data communications and achieving a greener environment. An analysis of selected mobile data network operators' energy consumption (EC) is presented. Based on the current statistics of different mobile network operators, future energy values are estimated. These estimates of carbon emissions are based on the predicted data traffic in the coming years and the percentage of energy each network operator consumes from renewable sources. The analysis presented in this article should help in developing and implementing energy policies that accelerate the growth of renewable shares in total energy requirements. Increasing the share of renewable energy in total energy requirements can be a way forward to reach Goal 7 of the United Nations Sustainable Development Goals (SDGs).
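
    The projection logic the abstract describes lends itself to a short worked example. Below is a minimal Python sketch of estimating future emissions from a traffic-driven energy forecast and a renewable share; every parameter value and function name is hypothetical, not taken from the article.

```python
# Hypothetical sketch: projecting network energy use and carbon emissions
# from an assumed traffic growth rate and renewable-energy share.
# Every numeric parameter below is illustrative, not taken from the article.

def project_emissions(base_energy_gwh: float, traffic_growth: float,
                      re_share: float, co2_per_gwh: float, years: int) -> list[float]:
    """Estimated CO2 emissions (tonnes) for each of the next `years` years."""
    emissions = []
    energy = base_energy_gwh
    for _ in range(years):
        energy *= 1 + traffic_growth          # energy consumption tracks traffic
        fossil = energy * (1 - re_share)      # only the non-renewable share emits
        emissions.append(fossil * co2_per_gwh)
    return emissions

# 10 GWh baseline, 25% annual traffic growth, 40% renewables,
# 400 t CO2 per GWh of grid electricity, five-year horizon.
print(project_emissions(10.0, 0.25, 0.40, 400.0, 5))
```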

    Proposing Logical Table Constructs for Enhanced Machine Learning Process

    Machine learning (ML) has shown enormous potential in various domains with wide variations in the underlying data types. Because of the miscellany of data sets and features, ML classifiers often suffer from challenges such as feature misclassification, unfit algorithms, low accuracy, overfitting, underfitting, extreme bias, and high predictive error. Through the lens of related studies and the latest progress in the field, this paper presents a novel scheme to construct a logical table (LT) unit with two internal sub-modules for algorithm blending and feature engineering. The LT unit works in the deepest layer of an enhanced ML engine engineering (eMLEE) process. eMLEE consists of several low-level modules that enhance the ML classifier's progression. A unique engineering approach is adopted in eMLEE to blend various algorithms, enhance the feature engineering, construct a weighted performance metric, and augment the validation process. The LT is an in-memory logical component that governs the progress of eMLEE, regulates the model metrics, improves parallelism, and keeps track of each eMLEE module as the classifier learns. Optimum model fitness is obtained with a parallel "check, validate, insert, delete, and update" mechanism in 3-D logical space via structured schemas in the LT. The LT unit is developed with Python, C#, and R libraries and tested using miscellaneous data sets. Results are created using GraphPad Prism, SigmaPlot, Plotly, and MS Excel software. To support the design and implementation of the proposed scheme, complete mathematical models, algorithms, and the necessary illustrations are provided in this paper. To show the practicality of the proposed scheme, several simulation results are presented with a comprehensive analysis of the outcomes for the model metrics that the LT regulates. https://doi.org/10.1109/ACCESS.2018.286604
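
    The "check, validate, insert, delete, and update" mechanism suggests a simple mental model. The Python sketch below is one hypothetical reading of an in-memory logical table that records per-module metrics during training; the class, method, and field names are illustrative assumptions, not the paper's actual LT schema.

```python
# A hypothetical reading of the in-memory logical table (LT): a keyed store of
# per-module metrics exposing the "check, validate, insert, delete, update"
# surface the abstract names. Details are assumptions, not the paper's schema.

class LogicalTable:
    def __init__(self):
        self.rows = {}                                   # module/fold key -> metric record

    def check(self, key) -> bool:
        return key in self.rows                          # does a record exist?

    def validate(self, key, bounds=(0.0, 1.0)) -> bool:
        rec = self.rows.get(key, {})
        return all(bounds[0] <= v <= bounds[1] for v in rec.values())

    def insert(self, key, **metrics):
        self.rows[key] = dict(metrics)

    def update(self, key, **metrics):
        self.rows.setdefault(key, {}).update(metrics)

    def delete(self, key):
        self.rows.pop(key, None)

lt = LogicalTable()
lt.insert("fold-1", accuracy=0.91, overfit=0.07)
lt.update("fold-1", bias=0.02)                           # metrics evolve as the classifier learns
print(lt.check("fold-1"), lt.validate("fold-1"))         # True True
```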

    Proposing Enhanced Feature Engineering and a Selection Model for Machine Learning Processes

    Machine Learning (ML) requires a certain number of features (i.e., attributes) to train the model. One of the main challenges is to determine the right number and type of such features out of a given dataset's attributes. It is not uncommon for the ML process to use a dataset of all available features without computing the predictive value of each. Such an approach makes the process vulnerable to overfitting, predictive errors, bias, and poor generalization. Each feature in the dataset has a unique predictive value, a redundant value, or an irrelevant value. However, the key to better accuracy and fitting in ML is to identify the optimum set (i.e., grouping) of the right features with the finest matching of feature values. This paper proposes a novel approach to enhance the Feature Engineering and Selection (eFES) optimization process in ML. eFES is built using a unique scheme to regulate error bounds and parallelize the addition and removal of a feature during training. eFES also introduces local gain (LG) and global gain (GG) functions, using 3D visualization techniques to assist the feature grouping function (FGF). FGF scores and optimizes the participating features, so the ML process can evolve into deciding which features to accept or reject for improved generalization of the model. To support the proposed model, this paper presents mathematical models, illustrations, algorithms, and experimental results. Miscellaneous datasets are used to validate the model-building process in the Python, C#, and R languages. Results show the promising state of eFES as compared to the traditional feature selection process. http://dx.doi.org/10.3390/app804064
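
    As a rough analogue of the accept/reject decision the FGF makes, here is a hedged Python sketch of greedy forward feature selection: a feature is kept only if it raises cross-validated accuracy, a stand-in gain criterion. The paper's actual local and global gain functions are defined differently.

```python
# A stand-in for the accept/reject decision: greedy forward selection that keeps a
# feature only when it raises cross-validated accuracy. The paper's local gain (LG)
# and global gain (GG) functions are defined differently; this is only an analogue.

from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)

selected, best = [], 0.0
for j in range(X.shape[1]):
    candidate = selected + [j]
    score = cross_val_score(LogisticRegression(max_iter=5000),
                            X[:, candidate], y, cv=5).mean()
    if score > best:                     # positive "gain": accept the feature
        selected, best = candidate, score

print(f"kept {len(selected)} of {X.shape[1]} features, CV accuracy {best:.3f}")
```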

    Dickson polynomial-based secure group authentication scheme for Internet of Things

    The Internet of Things (IoT) paves the way for modern smart industrial applications and cities. A Trusted Authority acts as the sole control point for monitoring and maintaining communications between IoT devices and the infrastructure. Communication between IoT devices passes from one trusted entity of an area to another by way of generated security certificates. Establishing trust by generating security certificates for the IoT devices in a smart-city application can be costly. To address this, a secure group authentication scheme that creates trust amongst a group of IoT devices owned by several entities is proposed. Most existing authentication techniques are designed for individual device authentication and are merely reused for group authentication; the Dickson polynomial-based secure group authentication scheme, by contrast, is a dedicated solution for group authentication. The secret keys used in the proposed authentication technique are generated using the Dickson polynomial, which enables the group to authenticate without generating excessive network traffic overhead. Blockchain technology is employed to enable secure, efficient, and fast data transfer among the unique IoT devices of each group deployed at different places. The proposed scheme is also resistant to replay, man-in-the-middle, tampering, side-channel, signature-forgery, impersonation, and ephemeral key secret leakage attacks; to accomplish this, a hardware-based physically unclonable function is implemented. The implementation has been carried out in the Python language and deployed and tested on a blockchain using the Ethereum Goerli Testnet framework. A performance analysis against various benchmarks shows that the proposed framework outperforms its counterparts across several metrics, with better performance in terms of computation, communication, storage, and latency.
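
    The key-generation step rests on a well-known property of Dickson polynomials: with parameter a = 1 they commute under composition, D_m(D_n(x, 1), 1) = D_mn(x, 1). The Python sketch below illustrates that property with toy parameters; it is not the paper's full group-authentication protocol.

```python
# Dickson polynomials over Z_p via the standard recurrence:
#   D_0(x, a) = 2,  D_1(x, a) = x,  D_n(x, a) = x*D_{n-1} - a*D_{n-2}  (mod p)
# With a = 1 they commute under composition, D_m(D_n(x,1),1) = D_{mn}(x,1),
# the algebraic property that key-agreement constructions rely on.
# All parameters below are toy values, not the paper's protocol constants.

def dickson(n: int, x: int, a: int, p: int) -> int:
    d_prev, d_curr = 2 % p, x % p        # D_0 and D_1
    if n == 0:
        return d_prev
    for _ in range(n - 1):
        d_prev, d_curr = d_curr, (x * d_curr - a * d_prev) % p
    return d_curr

p, x = 8191, 1234                        # small prime modulus and public value (toy)
m, n = 57, 91                            # two parties' secret indices (toy)
shared_1 = dickson(m, dickson(n, x, 1, p), 1, p)
shared_2 = dickson(n, dickson(m, x, 1, p), 1, p)
assert shared_1 == shared_2              # both sides derive the same secret
print(shared_1)
```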

    Nuclear data for medical applications: An overview of present status and future needs

    A brief overview of nuclear data required for medical applications is given. The major emphasis is on radionuclides for internal applications, both for diagnosis and therapy. The status of the presently available data is discussed and some of the emerging needs are outlined. Most of the needs are associated with the development of non-standard positron emitters and novel therapeutic radionuclides. Some new developments in the application of radionuclides, e.g. the theranostic approach, multimode imaging, and radionanoparticles, are described, and the related nuclear data needs are discussed. The possible use of newer irradiation technologies for medical radionuclide production, e.g. intermediate-energy charged-particle accelerators, high-power electron accelerators for photon production, and spallation neutron sources, will place heavy demands on nuclear data.

    An evaluation of biosecurity compliance levels and assessment of associated risk factors for highly pathogenic avian influenza H5N1 infection of live-bird-markets, Nigeria and Egypt

    A live bird market (LBM) is an integral component in the perpetuation of HPAI H5N1, while biosecurity is crucial to the prevention and control of infectious diseases. Biosecurity compliance levels and associated risk factors were assessed in 155 LBMs in Nigeria and Egypt through the administration of a 68-item biosecurity checklist, scored based on modifications of previous qualitative data and analysed for degree of compliance. LBMs were scored as having "complied with a biosecurity item" if they had good to very good scores (4). All scores were coded and analysed using descriptive statistics, and risk or protective factors were determined using univariable and multivariable logistic regression at p≤0.05. Trading of wild birds and other animals in the LBMs (odds ratio (OR)=34.90; p=0.01) and claims of hand disinfection after slaughter (OR=31.16; p=0.03) were significant risk factors, while mandatory routine disinfection of markets (OR=0.13; p≤0.00), fencing and gates for the live bird market (OR=0.02; p≤0.01), and hand washing after slaughter (OR=0.41; p≤0.05) were protective factors against infection of Nigerian and Egyptian LBMs with the HPAI H5N1 virus. Almost all the LBMs complied poorly with most of the variables in the checklist (p≤0.05), but pathways to improved biosecurity in the LBMs exist. We conclude that LBM operators play a critical role in disrupting transmission of H5N1 virus infection through improved biosecurity, and that a participatory epidemiology and multidisciplinary approach is needed. http://www.elsevier.com/locate/actatropica
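
    For readers unfamiliar with how such odds ratios arise, the Python sketch below fits a univariable logistic regression to simulated exposure data and recovers OR = exp(coefficient); the data are synthetic, not the Nigeria/Egypt survey records.

```python
# How a univariable odds ratio falls out of logistic regression: OR = exp(beta).
# The exposure data here are simulated, not the Nigeria/Egypt survey records.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
wild_birds = rng.integers(0, 2, 155)                     # exposure: market trades wild birds
p_infect = 1 / (1 + np.exp(-(-1.0 + 1.5 * wild_birds)))  # true log-odds: -1 + 1.5*exposure
infected = (rng.random(155) < p_infect).astype(int)

# A very large C effectively disables regularization, so exp(coef) approximates
# the classical (unpenalized) odds ratio.
model = LogisticRegression(C=1e6).fit(wild_birds.reshape(-1, 1), infected)
print("OR =", float(np.exp(model.coef_[0, 0])))          # OR > 1 flags a risk factor
```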

    Carnegie Supernova Project-II: Extending the Near-Infrared Hubble Diagram for Type Ia Supernovae to z ∼ 0.1

    The Carnegie Supernova Project-II (CSP-II) was an NSF-funded, four-year program to obtain optical and near-infrared observations of a "Cosmology" sample of ∼100 Type Ia supernovae located in the smooth Hubble flow (0.03 ≲ z ≲ 0.10). Light curves were also obtained of a "Physics" sample composed of 90 nearby Type Ia supernovae at z ≤ 0.04 selected for near-infrared spectroscopic time-series observations. The primary emphasis of the CSP-II is to use the combination of optical and near-infrared photometry to achieve a distance precision of better than 5%. In this paper, details of the supernova sample, the observational strategy, and the characteristics of the photometric data are provided. In a companion paper, the near-infrared spectroscopy component of the project is presented.
    Comment: 43 pages, 10 figures, accepted for publication in PAS
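
    As a quick unit check on the quoted precision goal (my own arithmetic, not the paper's): with distance modulus μ = 5 log10(d / 10 pc), a 5% distance error corresponds to roughly 0.11 mag of scatter, as the small Python computation below shows.

```python
# My own unit check (not from the paper): with distance modulus
# mu = 5 * log10(d / 10 pc), a fractional distance error sigma_d/d maps to
# sigma_mu = (5 / ln 10) * (sigma_d / d), so 5% in distance is ~0.11 mag.

import math

sigma_frac = 0.05                               # the 5% distance-precision goal
sigma_mu = 5 / math.log(10) * sigma_frac
print(f"5% in distance ~ {sigma_mu:.2f} mag of scatter in the distance modulus")
```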