
    Young Consumers’ Attitude towards Halal Food Outlets and JAKIM's Halal Certification in Malaysia

    The attitude of Muslims towards halal food is imperative in determining their behavior in consuming halal food. Several studies on consumers' attitudes towards purchasing halal food in Malaysia focus on consumers in general, but less attention has been given to young Muslim consumers' attitudes towards halal food outlets and the halal certification of Malaysia's Department of Islamic Development (JAKIM). This paper therefore focuses on young people, who will determine the future of the halal industry in this country. The main objectives of this paper are (a) to identify the attitude of young Muslim consumers towards halal food outlets, (b) to determine their attitude towards the halal certification issued by JAKIM, and (c) to identify the relationship of subjective norm and perceived behavioral control to the attitude of young consumers in choosing halal food outlets. The framework of consumers' attitude in this study is based on Ajzen's Theory of Planned Behavior, which postulates three conceptually independent determinants of behavioral intention: attitude, subjective norm, and perceived behavioral control. Data were collected through self-administered questionnaires, and the respondents comprised Muslim students between 16 and 35 years old from private higher learning institutions. The findings of this study reveal young Muslims' positive attitude towards halal food outlets and JAKIM's certification. However, subjective norm has a less significant influence than attitude and perceived behavioral control on young consumers' choice of halal food outlets. The study suggests that, to strengthen the positive attitude of Muslim consumers, relevant bodies and the media should augment the promotion and publicity of the halal certification issued by JAKIM so as to increase awareness among young Muslim consumers in Malaysia. Future research may look into the relationship between the three independent variables of the Theory of Planned Behavior and young consumers' intention to choose halal food products in different geographical locations in Malaysia.

    Exploring the substructure of nucleons and nuclei with machine learning

    Perturbative quantum chromodynamics (QCD) ceases to be applicable at low interaction energies due to the rapid increase of the strong coupling. In that limit, the non-perturbative regime determines the properties of quarks and gluons (partons) in terms of parton distribution functions (PDFs) or nuclear PDFs, depending on whether they are confined within nucleons or nuclei, respectively. Related non-perturbative dynamics describe the hadronisation of partons into hadrons and are encoded in the fragmentation functions (FFs). This thesis focuses on the detailed study of PDFs in protons and nuclei, as well as the charged pion FFs, by means of a statistical framework based on machine learning algorithms. The key ingredients are the Monte Carlo method for error propagation and artificial neural networks that act as universal, unbiased interpolators. The main topics addressed are the inference of proton PDFs with theoretical uncertainties and the impact of dijet cross sections on the gluon PDF; a global determination of nuclear PDFs exploiting the constraints from proton-lead collisions at the LHC and using NNLO calculations for the first time; a new determination of FFs from single-inclusive annihilation and semi-inclusive deep-inelastic scattering data; and a quantitative assessment of the impact of future colliders such as the High-Luminosity LHC and the Electron-Ion Collider on the proton and nuclear PDFs.
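
    As a rough illustration of the Monte Carlo error propagation and neural-network interpolation mentioned above, the sketch below fits an ensemble of small networks to fluctuated pseudo-data; the functional form, grid, uncertainties, and scikit-learn usage are assumptions chosen for illustration, not the thesis code.

        # Minimal sketch of the Monte Carlo replica method with neural-network
        # interpolators. Each data replica is fitted by an independent network,
        # and the spread of the resulting curves propagates the experimental error.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(42)

        x = np.linspace(1e-3, 0.9, 40)                 # momentum-fraction grid
        truth = x**-0.1 * (1.0 - x)**4                 # stand-in for a PDF shape
        sigma = 0.05 * truth + 1e-3                    # assumed uncorrelated errors
        data = truth + rng.normal(0.0, sigma)          # central pseudo-measurements

        fits = []
        for _ in range(50):                            # 50 Monte Carlo replicas
            replica = data + rng.normal(0.0, sigma)    # fluctuate data within errors
            net = MLPRegressor(hidden_layer_sizes=(25, 20), activation="tanh",
                               max_iter=5000)
            net.fit(x.reshape(-1, 1), replica)         # the network acts as an
            fits.append(net.predict(x.reshape(-1, 1))) # unbiased interpolator

        central = np.mean(fits, axis=0)                # central value = replica mean
        uncertainty = np.std(fits, axis=0)             # error = replica std. dev.
        print(uncertainty[:5])

    The replica mean plays the role of the central fit, and the replica standard deviation propagates the experimental uncertainty to the fitted function.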

    Towards Ultimate Parton Distributions at the High-Luminosity LHC

    Since the start of data taking, the LHC has provided an impressive wealth of information on the quark and gluon structure of the proton. Indeed, modern global analyses of parton distribution functions (PDFs) include a wide range of LHC measurements of processes such as the production of jets, electroweak gauge bosons, and top-quark pairs. In this work, we assess the ultimate constraining power of LHC data on the PDFs that can be expected from the complete dataset, in particular after the High-Luminosity (HL) phase starting around 2025. The huge statistics of the HL-LHC, delivering $\mathcal{L}=3$ ab$^{-1}$ to ATLAS and CMS and $\mathcal{L}=0.3$ ab$^{-1}$ to LHCb, will lead to an extension of the kinematic coverage of PDF-sensitive measurements as well as to an improvement in their statistical and systematic uncertainties. Here we generate HL-LHC pseudo-data for different projections of the experimental uncertainties, and then quantify the resulting constraints on the PDF4LHC15 set by means of the Hessian profiling method. We find that HL-LHC measurements can reduce PDF uncertainties by up to a factor of 2 to 4 compared to state-of-the-art fits, leading to few-percent uncertainties for important observables such as the Higgs boson transverse momentum distribution in gluon fusion. Our results illustrate the significant improvement in the precision of PDF fits achievable from hadron collider data alone, and motivate the continuation of the ongoing successful program of PDF-sensitive measurements by the LHC collaborations.
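
    For orientation, a commonly used form of the profiled chi-squared minimized in Hessian profiling is sketched below (a standard convention used in profiling studies; whether it matches this paper's exact definition, including the treatment of the tolerance, is an assumption):

        \chi^2(\beta) \;=\; \sum_{i,j}\Big(D_i - T_i(\beta)\Big)\,(\mathrm{cov}^{-1})_{ij}\,\Big(D_j - T_j(\beta)\Big) \;+\; t^2\sum_k \beta_k^2 ,
        \qquad
        T_i(\beta) \;=\; T_i^{(0)} + \sum_k \beta_k\,\frac{T_i^{(k+)} - T_i^{(k-)}}{2}.

    Here the D_i are the (pseudo-)data, T_i^{(0)} and T_i^{(k±)} the predictions from the central and eigenvector PDF members, β_k nuisance parameters associated with the Hessian eigenvectors, and t the tolerance; the shifted values and reduced spread of the β_k after minimization quantify the constraining power of the pseudo-data on the PDF set.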

    Modeling of Preemptive RTOS Scheduler with Priority Inheritance

    This work describes an approach to generating an accurate system-level model of embedded software on a targeted Real-Time Operating System (RTOS). We design an RTOS emulation layer, called RTOS_SC, on top of the SystemC kernel. The system-level model can be used for software optimization in the early stages of processor design. The model's precision is obtained by integrating key features provided by typical RTOS schedulers. We first discuss a case study that shows the impact of the implemented features on a priority-driven scheduler. We then present the abstraction of task scheduling and communication mechanisms. To validate the accuracy of our model, we use the task response time metric with industrial-size examples such as MP3, Vocoder, and JPEG encoders. The experimental results show a significant improvement compared to existing RTOS models.
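
    To make the priority-inheritance feature concrete, here is a minimal sketch of the rule itself (class names and the lower-number-means-higher-priority convention are assumptions for illustration; this is not the RTOS_SC code):

        # Priority inheritance: when a higher-priority task blocks on a mutex,
        # the current owner temporarily inherits that priority.
        class Task:
            def __init__(self, name, priority):
                self.name = name
                self.base_priority = priority      # priority assigned at creation
                self.active_priority = priority    # may be boosted by inheritance

        class Mutex:
            def __init__(self):
                self.owner = None                  # task currently holding the lock
                self.waiters = []                  # tasks blocked on the lock

            def lock(self, task):
                if self.owner is None:
                    self.owner = task
                    return True                    # acquired immediately
                # Contention: boost the owner to the waiter's priority if higher.
                self.waiters.append(task)
                if task.active_priority < self.owner.active_priority:
                    self.owner.active_priority = task.active_priority
                return False                       # caller must block until unlock()

            def unlock(self, task):
                assert self.owner is task
                task.active_priority = task.base_priority   # drop inherited priority
                if self.waiters:
                    # Hand the lock to the highest-priority waiter.
                    nxt = min(self.waiters, key=lambda t: t.active_priority)
                    self.waiters.remove(nxt)
                    self.owner = nxt
                else:
                    self.owner = None

        low, high = Task("logger", priority=7), Task("control", priority=1)
        m = Mutex()
        m.lock(low)                    # low-priority task acquires the lock
        m.lock(high)                   # high-priority task blocks on it
        print(low.active_priority)     # -> 1: "logger" now runs at inherited priority
        m.unlock(low)                  # lock handed to "control"; "logger" back to 7

    With this rule, a low-priority task holding the mutex is temporarily boosted while a higher-priority task waits for it, so a medium-priority task cannot preempt the holder and cause unbounded priority inversion.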

    Nuclear parton distributions from neural networks

    In this contribution we present a status report on the recent progress towards an analysis of nuclear parton distribution functions (nPDFs) using the NNPDF methodology. We discuss how the NNPDF fitting approach can be extended to account for the dependence on the atomic mass number A, and introduce novel algorithms to improve the training of the neural network parameters within the NNPDF framework. Finally, we present preliminary results of an nPDF fit to neutral current deep-inelastic lepton-nucleus scattering data, and demonstrate how one can validate the new fitting methodology by means of closure tests.
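
    As a toy illustration of making the fit depend on the mass number, the sketch below trains a single network on inputs (x, ln A) so that one model interpolates across nuclei; the data, functional form, and scikit-learn usage are invented for illustration and are not the NNPDF implementation.

        # One network for all nuclei: the mass number enters as an extra input.
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(1)

        # Toy "data": a nuclear modification that grows mildly with A (illustrative).
        xs = rng.uniform(0.01, 0.7, 500)
        As = rng.choice([12, 56, 131, 208], size=500)          # C, Fe, Xe, Pb
        y = xs**-0.1 * (1 - xs)**3 * (1 - 0.05 * np.log(As))   # fabricated target
        y += rng.normal(0.0, 0.01, size=y.shape)

        features = np.column_stack([xs, np.log(As)])           # inputs: (x, ln A)
        net = MLPRegressor(hidden_layer_sizes=(30, 20), activation="tanh",
                           max_iter=5000)
        net.fit(features, y)

        # A single trained model can now be queried for any nucleus, e.g. lead:
        query = np.column_stack([np.linspace(0.01, 0.7, 5), np.full(5, np.log(208))])
        print(net.predict(query))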

    Gene pyramiding for horizontal resistance to nematodes via gene silencing

    Undergraduate thesis (graduação), Universidade de Brasília, Faculdade de Agronomia e Medicina Veterinária, 2016. The root-knot nematode Meloidogyne incognita is a pest that causes major impacts on several agricultural crops around the world. It is one of the most relevant pathogens today, and in Brazil it damages cotton, soybean, coffee, sugarcane, and many other commodities. The research carried out at the Plant-Pest Molecular Interaction Laboratory (LIMPP) at Embrapa Recursos Genéticos e Biotecnologia aims to pyramid genes for the control of this phytonematode. Sequences for the expression of double-stranded RNA (dsRNA) aimed at silencing (via RNAi) two target genes essential to the metabolism of M. incognita were previously validated by qPCR and bioassays. This work consists of pyramiding, that is, combining these sequences through crossing strategies between tobacco plants (Nicotiana tabacum) carrying the dsRNA expression cassettes of the selected genes that showed the greatest efficiency in controlling M. incognita. The pyramiding is expected to produce an additive deleterious effect and to prevent, or at least delay, the emergence of resistance in the pathogen. Plants expressing dsRNA for isocitrate lyase (IL), which acts in the glyoxylate cycle, and plants expressing dsRNA for heat shock protein 90 (hsp90), which plays an essential role in protein folding and transport, were used.

    Economic and monetary consequences of the Lebanese national debt

    Lebanon's financial condition is dire. The ratio of national debt to GDP is rising to alarming levels, the government deficit is large and expected to grow in the coming years, government spending keeps increasing unchecked while government revenues decline, inflation is rising, and economic growth is falling. The fallout from these conditions directly affects the Lebanese pound and threatens its value. Despite the assurances of the governor of the Lebanese central bank and other financial officials, a prudent observer can sense the risk bearing down on the Lebanese pound and cannot assume that it will forever remain the mountain that no wind can bend. Lebanon must take rapid corrective action to save the country and its currency from a looming disaster. Corrective economic, financial, and political measures and direct reforms must be studied and implemented immediately, before time runs out and regret is of no use.

    Video-based motion analysis for the overhead badminton forehand stroke

    Badminton is a game that involves many quick movements and fast responses. It is the fastest racket sport in the world: the badminton forehand smash can drive the shuttlecock to speeds of up to 223 kph. Research shows that arm movement contributes substantially to the overhead badminton forehand stroke. The objective of this study is to use a low-cost alternative for video-based motion analysis of the overhead badminton forehand stroke, in particular to find the arm angle and the velocity and acceleration of the badminton player. The findings of this project will help badminton players develop a training system or protocol that improves the effectiveness of training and enhances performance. Furthermore, the study also observes the movement of the player's upper limb while performing the overhead badminton forehand stroke.
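
    For a sense of the quantities involved, the sketch below computes an elbow angle from three tracked 2D points and the wrist speed and acceleration by finite differences between frames; the joint names, coordinates, and frame rate are hypothetical, and the study's own tracking pipeline is not reproduced here.

        # Arm angle and finite-difference kinematics from tracked 2D joint positions.
        import numpy as np

        fps = 120.0                                   # assumed camera frame rate
        dt = 1.0 / fps

        def joint_angle(shoulder, elbow, wrist):
            """Angle at the elbow (degrees) formed by shoulder-elbow-wrist."""
            u = np.asarray(shoulder) - np.asarray(elbow)
            v = np.asarray(wrist) - np.asarray(elbow)
            cos_a = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
            return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

        # Example: wrist positions (metres) tracked over consecutive frames.
        wrist_xy = np.array([[0.10, 1.52], [0.14, 1.60], [0.21, 1.71], [0.31, 1.83]])

        velocity = np.diff(wrist_xy, axis=0) / dt             # m/s between frames
        speed = np.linalg.norm(velocity, axis=1)
        acceleration = np.diff(speed) / dt                    # tangential accel., m/s^2

        print(joint_angle([0.0, 1.4], [0.1, 1.5], wrist_xy[0]))
        print(speed, acceleration)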

    Dynamic model for price of wheat in Bangladesh

    Wheat is the second staple food of Bangladesh. In this paper we construct a dynamic model for the wheat price: a single-equation autoregressive integrated moving average (ARIMA) model of the quarterly wholesale wheat price. Standard ARIMA analysis rests on the simplifying assumption that the time series is stationary, so the stationarity of the series is checked first. A seasonal ARIMA(1,1,0)(2,1,1)_4 model (seasonal period 4, i.e. quarterly) is constructed based on the autocorrelation and partial autocorrelation functions. Finally, forecasts are made based on the model developed.
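
    A seasonal ARIMA of this order can be fitted, for example, with statsmodels; the sketch below is a minimal illustration in which the price series is a synthetic stand-in, not the paper's data.

        # Fit an ARIMA(1,1,0)(2,1,1) model with seasonal period 4 (quarterly data).
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.statespace.sarimax import SARIMAX

        # Synthetic stand-in for the quarterly wholesale wheat price series.
        rng = np.random.default_rng(0)
        index = pd.date_range("1990-01-01", periods=80, freq="QS")
        trend = np.linspace(800, 1600, 80)                      # upward price trend
        season = np.tile([20, -10, -25, 15], 20)                # quarterly pattern
        prices = pd.Series(trend + season + rng.normal(0, 30, 80), index=index)

        model = SARIMAX(prices,
                        order=(1, 1, 0),                        # non-seasonal (p, d, q)
                        seasonal_order=(2, 1, 1, 4))            # seasonal (P, D, Q, s)
        result = model.fit(disp=False)

        print(result.summary())
        print(result.forecast(steps=8))                         # two years of forecasts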

    Identification of atrial fibrillation biomarkers using statistical methods and artificial intelligence

    Master's thesis in Bioinformatics and Computational Biology. Cardiac arrhythmias account for a considerable share of the morbidity and mortality of heart disease, causing more than a quarter of a million deaths per year in the United States. Arrhythmias can occur at an early age or can arise later due to disease or ageing. The most common test used to diagnose an arrhythmia is the electrocardiogram (ECG), which records the electrical potential differences generated by the heart. Certain alterations in the normal pattern of the heart's electrical activity are indicative of cardiac pathologies. Among the different types of cardiac arrhythmia, Atrial Fibrillation (AF) is the most common, and it is associated with ageing. In this project, more than 320,000 electrocardiograms (ECGs), recorded in XML format in the database of the Hospital Universitario de La Princesa since 2007, were analysed with the aim of determining biomarkers and building predictive models of AF from normal ECGs. The work began with a study of the structure of the XML files and the identification of the information of interest and of the sensitive information that could identify the patient. Using a Bash script, the database was anonymised, removing all information that could identify patients and generating new identification numbers in a separate database. Subsequently, mass-analysis tools were used to identify, in anonymised form, patients who have at least one ECG in AF together with earlier ECGs in normal Sinus Rhythm (SR) (case group), as well as patients with only SR ECGs on record (control group). The mass analysis of more than 444 ECG variables in SR between the control and case groups was carried out by sex and age (40 to 49, 50 to 59, 60 to 69, 70 to 79, over 80, and the full cohort), taking into account the time between ECGs. Once the study groups were established, a statistical analysis was performed to determine whether the groups differed significantly in age, sex, and time between ECGs, and they were adjusted to remove such differences. A univariate analysis was then carried out to identify, among the more than 444 variables, those showing significant differences between cases and controls, and predictive models were built from these variables using the Extreme Gradient Boosting (XGBoost) and Support Vector Machine (SVM) algorithms. The accuracy obtained in these experiments is around 60%. To improve the results, the Sequential Forward Floating Selection (SFFS) method, another approach to selecting the set of relevant variables, was applied, yielding an improvement in accuracy of around 2
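
    To illustrate the modelling and feature-selection step described above, the sketch below applies XGBoost and an SVM to a table of ECG variables and then runs sequential forward floating selection via mlxtend; the synthetic data, column names, and the use of these particular libraries are assumptions, not the thesis code.

        # Baseline classifiers plus SFFS feature selection on ECG-style features.
        import pandas as pd
        from sklearn.datasets import make_classification
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC
        from xgboost import XGBClassifier
        from mlxtend.feature_selection import SequentialFeatureSelector as SFS

        # Synthetic stand-in: rows are patients, columns are ECG-derived variables,
        # label 1 = patient later developed AF, 0 = control.
        X, y = make_classification(n_samples=600, n_features=60, n_informative=8,
                                   random_state=0)
        X = pd.DataFrame(X, columns=[f"ecg_var_{i}" for i in range(X.shape[1])])

        for clf in (XGBClassifier(n_estimators=200, max_depth=4, eval_metric="logloss"),
                    SVC(kernel="rbf", C=1.0)):
            acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy").mean()
            print(type(clf).__name__, f"baseline accuracy: {acc:.3f}")

        # Sequential Forward Floating Selection: forward=True, floating=True.
        sffs = SFS(XGBClassifier(n_estimators=100, eval_metric="logloss"),
                   k_features=10, forward=True, floating=True,
                   scoring="accuracy", cv=5)
        sffs = sffs.fit(X, y)
        print("selected variables:", sffs.k_feature_names_)
        print("cross-validated accuracy:", sffs.k_score_)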