Evolution engine technology in exhaust gas recirculation for heavy-duty diesel engine
In recent years, engineers have been researching ways to minimise vehicle emissions for a cleaner environment. Diesel engines can recirculate part of the exhaust gas in order to reduce emissions such as NOx, a major contributor to air pollution. In this paper, we present a study showing that installing EGR in a vehicle is beneficial because it suppresses the formation of highly toxic gases, lowering NOx levels. However, applying EGR demands additional cooling capacity and installation space, which increases cost. The research also found that engine fuelling affects how much emission reduction the EGR delivers, and that EGR reduces performance efficiency when the vehicle load is low.
Travel Package Recommendation
Location Based Social Networks (LBSNs) benefit users by allowing them to share their locations and life moments with their friends. Users can also review the locations they have visited. Classical recommender systems provide users with a ranked list of single items. This is not suitable for applications like trip planning, where the recommendations should contain multiple items in an appropriate sequence. Generating such recommendations is challenging due to several critical aspects, including user interest, budget constraints, and high sparsity in the available data. In this paper, we propose a graph-based approach to recommend a set of personalized travel packages. Each recommended package comprises a sequence of multiple Points of Interest (POIs). Given the current location and spatio-temporal constraints, our goal is to recommend a package that satisfies the constraints. The approach uses data collected from LBSNs to learn user preferences and also models location popularity.
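To make the problem concrete, here is a toy sketch of package recommendation under a time budget. This is not the paper's graph algorithm: the POI names, preference scores, and travel times are all invented, and brute-force enumeration of sequences stands in for the graph-based search.

```python
# Toy package recommender: pick the POI sequence with the highest total
# preference score that fits within the user's time budget.
from itertools import permutations

# Hypothetical POIs with user-preference scores (e.g. learned from LBSN check-ins).
score = {"museum": 0.9, "park": 0.6, "cafe": 0.4, "gallery": 0.7}

# Invented symmetric travel times (minutes) between POIs.
travel = {
    ("museum", "park"): 15, ("museum", "cafe"): 10, ("museum", "gallery"): 20,
    ("park", "cafe"): 5, ("park", "gallery"): 25, ("cafe", "gallery"): 12,
}

def travel_time(a, b):
    return travel.get((a, b)) or travel.get((b, a))

def best_package(budget_minutes, visit_minutes=60):
    """Exhaustively score POI sequences that fit the time budget."""
    best, best_score = (), 0.0
    pois = list(score)
    for k in range(1, len(pois) + 1):
        for seq in permutations(pois, k):
            total = visit_minutes * len(seq) + sum(
                travel_time(a, b) for a, b in zip(seq, seq[1:]))
            if total <= budget_minutes:
                s = sum(score[p] for p in seq)
                if s > best_score:
                    best, best_score = seq, s
    return best, best_score
```

A real system would replace the exhaustive search with pruned search over the POI graph, but the spatio-temporal constraint (the budget check) and the preference objective play the same roles.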
Understanding construction delay analysis and the role of pre-construction programming
Copyright © 2013, American Society of Civil Engineers. This is the author's accepted manuscript. Modern construction projects commonly suffer from delays in their completion. The resolution of the time and cost claims that consequently flow from such delays continues to be a difficult undertaking for all project parties. A common approach relied on by contractors and their employers (or their representatives) to resolve this matter involves applying various delay analysis techniques, all of which are based on the construction programs originally developed for managing the project. However, evidence from the literature suggests that the reliability of these techniques in ensuring successful claims resolution is often undermined by the nature and quality of the underlying program used. As part of wider research carried out on delay and disruption analysis in practice, this paper reports on an aspect of the study aimed at exploring preconstruction-stage programming issues that affect the resolution of delay claims. This aspect is based on in-depth interviews with experienced construction planning engineers in the United Kingdom, conducted after an initial large-scale survey on the usage of delay and disruption techniques. Key findings and conclusions include: (1) most contractors prefer the linked bar chart format for their baseline programs over conventional critical path method (CPM) networks; (2) baseline programs are developed using planning software packages, some of which pose difficulties when employed for most delay analysis techniques, except the simpler ones; (3) manpower loading graphs are not commonly developed as part of the main deliverables during preconstruction-stage planning. As a result, most programs are not subjected to resource loading and leveling, so they do not accurately reflect planned resource usage on site.
This practice has detrimental effects on the reliability of baseline programs when they are used to resolve delay claims; and (4) baseline program development involves many different experts within construction organizations, as expected, but with very little involvement of the employer or its representative. Active client involvement is, however, quite important, as it would facilitate quick program approval/acceptance before construction, a necessary requirement for early settlement of delay claims, which are otherwise often left unresolved long after the delaying events, with the potential of escalating into expensive disputes. The study results provide a better understanding of the key issues that need attention if improvements are to be made in the resolution of delay claims. Additional research focusing on testing these results using a much larger sample and rigorous statistical analysis for generalization purposes would be helpful in advancing the limited knowledge of this subject.
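Since the findings contrast linked bar charts with critical path method (CPM) networks, a minimal CPM pass may help illustrate what such a network computes. The activities, durations, and dependencies below are invented for illustration; this is a sketch of the standard forward/backward pass, not any particular planning package.

```python
# Minimal critical path method (CPM) pass over a small activity network.
# activity -> (duration_days, list_of_predecessors); names are invented.
network = {
    "A": (3, []),          # e.g. site setup
    "B": (5, ["A"]),       # foundations
    "C": (2, ["A"]),       # procurement
    "D": (4, ["B", "C"]),  # superstructure
}

def cpm(network):
    # Forward pass: earliest start (es) and earliest finish (ef).
    es, ef = {}, {}
    for act in network:  # insertion order is a valid topological order here
        dur, preds = network[act]
        es[act] = max((ef[p] for p in preds), default=0)
        ef[act] = es[act] + dur
    project_end = max(ef.values())
    # Backward pass: latest start (ls), latest finish (lf), total float.
    ls, lf = {}, {}
    for act in reversed(list(network)):
        succs = [a for a in network if act in network[a][1]]
        lf[act] = min((ls[s] for s in succs), default=project_end)
        ls[act] = lf[act] - network[act][0]
    total_float = {a: ls[a] - es[a] for a in network}
    critical = [a for a in network if total_float[a] == 0]
    return project_end, total_float, critical
```

Activities with zero total float form the critical path; delay analysis techniques of the kind surveyed above rest on exactly this computation, which is why the quality of the underlying baseline program matters.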
Recommendation domains for pond aquaculture
This publication introduces the methods and results of a research project that developed a set of decision-support tools to identify places and sets of conditions under which a particular target aquaculture technology is considered feasible and therefore worth promoting. The tools also identify the nature of constraints to aquaculture development and thereby shed light on appropriate interventions to realize the potential of the target areas. The project results will be useful for policy planners and decision makers in national, regional and local governments and development funding agencies, aquaculture extension workers in regional and local governments, and researchers in aquaculture systems and rural livelihoods. (Document contains 40 pages.)
DeepSphere: Efficient spherical Convolutional Neural Network with HEALPix sampling for cosmological applications
Convolutional Neural Networks (CNNs) are a cornerstone of the Deep Learning
toolbox and have led to many breakthroughs in Artificial Intelligence. These
networks have mostly been developed for regular Euclidean domains such as those
supporting images, audio, or video. Because of their success, CNN-based methods
are becoming increasingly popular in Cosmology. Cosmological data often comes
as spherical maps, which make the use of the traditional CNNs more complicated.
The commonly used pixelization scheme for spherical maps is the Hierarchical
Equal Area isoLatitude Pixelisation (HEALPix). We present a spherical CNN for
analysis of full and partial HEALPix maps, which we call DeepSphere. The
spherical CNN is constructed by representing the sphere as a graph. Graphs are
versatile data structures that can act as a discrete representation of a
continuous manifold. Using the graph-based representation, we define many of
the standard CNN operations, such as convolution and pooling. With filters
restricted to being radial, our convolutions are equivariant to rotation on the
sphere, and DeepSphere can be made invariant or equivariant to rotation. This
way, DeepSphere is a special case of a graph CNN, tailored to the HEALPix
sampling of the sphere. This approach is computationally more efficient than
using spherical harmonics to perform convolutions. We demonstrate the method on
a classification problem of weak lensing mass maps from two cosmological models
and compare the performance of the CNN with that of two baseline classifiers.
The results show that the performance of DeepSphere is always superior or equal
to that of both baselines. For high noise levels and for data covering only a
small fraction of the sphere, DeepSphere typically achieves 10% better
classification accuracy than those baselines. Finally, we show how learned
filters can be visualized to introspect the neural network.
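The graph-based convolution described above can be sketched in miniature. The code below is an illustrative toy, not DeepSphere itself: a 6-vertex ring graph stands in for the HEALPix pixel-neighbour graph, and the filter is a polynomial of the graph Laplacian (DeepSphere uses Chebyshev polynomials of the Laplacian; plain monomials are used here for brevity).

```python
# Toy graph convolution: pixels become vertices, neighbours are connected,
# and a "radial" filter is a polynomial of the graph Laplacian.
import numpy as np

n = 6
A = np.zeros((n, n))
for i in range(n):  # ring graph: each pixel linked to its two neighbours
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

deg = A.sum(axis=1)
L = np.diag(deg) - A  # combinatorial graph Laplacian

def graph_conv(x, coeffs):
    """Filter the signal x with a polynomial of L: sum_k coeffs[k] * L^k @ x.

    Because the filter depends only on L, it commutes with any symmetry of
    the graph -- the analogue of rotation equivariance on the sphere graph.
    """
    out = np.zeros_like(x)
    Lx = x.copy()
    for c in coeffs:
        out += c * Lx
        Lx = L @ Lx
    return out
```

On the ring, rotating the input signal and then filtering gives the same result as filtering and then rotating, which mirrors the rotation equivariance claimed for radial filters on the sphere.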
The Computational Diet: A Review of Computational Methods Across Diet, Microbiome, and Health.
Food and human health are inextricably linked. As such, revolutionary impacts on health have been derived from advances in the production and distribution of food relating to food safety and fortification with micronutrients. During the past two decades, it has become apparent that the human microbiome has the potential to modulate health, including in ways that may be related to diet and the composition of specific foods. Despite the excitement and potential surrounding this area, the complexity of the gut microbiome, the chemical composition of food, and their interplay in situ remain daunting to understand fully. However, recent advances in high-throughput sequencing, metabolomics profiling, compositional analysis of food, and the emergence of electronic health records provide new sources of data that can contribute to addressing this challenge. Computational science will play an essential role in this effort, as it will provide the foundation to integrate these data layers and derive insights capable of revealing and explaining the complex interactions between diet, the gut microbiome, and health. Here, we review the current knowledge on the diet-health-gut microbiota relationship, relevant data sources, bioinformatics tools, machine learning capabilities, as well as the intellectual property and legislative regulatory landscape. We provide guidance on employing machine learning and data analytics, identify gaps in current methods, and describe new scenarios to be unlocked in the next few years in the context of current knowledge.