Numerical studies of the thermal design sensitivity calculation for a reaction-diffusion system with discontinuous derivatives
The aim of this study is to find a reliable numerical algorithm for calculating thermal design sensitivities of a transient problem with discontinuous derivatives. The thermal system of interest is a transient heat conduction problem related to the curing process of a composite laminate. A logical function which can smoothly approximate the discontinuity is introduced to modify the system equation. Two commonly used methods, the adjoint variable method and the direct differentiation method, are then applied to find the design derivatives of the modified system. Comparisons of the numerical results obtained by these two methods demonstrate that the direct differentiation method is the better choice for calculating thermal design sensitivities.
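The abstract does not give the logical function's form, but a common choice for smoothly approximating a jump discontinuity is a logistic blend. The sketch below is purely illustrative (the step location, property values, and sharpness parameter `k` are hypothetical, not taken from the paper): it replaces a discontinuous material property with a differentiable approximation so that design derivatives remain well defined.

```python
import math

def smooth_step(x, x0, y_low, y_high, k=50.0):
    """Differentiable logistic approximation of a step from y_low to y_high at x0.

    Larger k gives a sharper (more step-like) transition; the function and its
    derivative are smooth everywhere, unlike the original discontinuity.
    """
    s = 1.0 / (1.0 + math.exp(-k * (x - x0)))
    return y_low + (y_high - y_low) * s
```

Away from the step the blend is essentially flat, and at `x = x0` it returns the midpoint, so sensitivity calculations can differentiate through it without encountering a jump.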
Spatiotemporal and temporal forecasting of ambient air pollution levels through data-intensive hybrid artificial neural network models
Outdoor air pollution (AP) is a serious public health threat which has been linked to severe respiratory and cardiovascular illnesses, and premature deaths especially among those residing in highly urbanised cities. As such, there is a need to develop early-warning and risk management tools to alleviate its effects. The main objective of this research is to develop AP forecasting models based on Artificial Neural Networks (ANNs) according to an identified model-building protocol from existing related works. Plain, hybrid and ensemble ANN model architectures were developed to estimate the temporal and spatiotemporal variability of hourly NO2 levels in several locations in the Greater London area. Wavelet decomposition was integrated with Multilayer Perceptron (MLP) and Long Short-term Memory (LSTM) models to address the issue of high variability of AP data and improve the estimation of peak AP levels. Block-splitting and cross-validation procedures have been adapted to validate the models based on Root Mean Squared Error (RMSE), Mean Absolute Error (MAE), and Willmott’s index of agreement (IA). The results of the proposed models present better performance than those from the benchmark models. For instance, the proposed wavelet-based hybrid approach provided 39.15% and 28.58% reductions in RMSE and MAE indices, respectively, on the performance of the benchmark MLP model results for the temporal forecasting of NO2 levels. The same approach reduced the RMSE and MAE indices of the benchmark LSTM model results by 12.45% and 20.08%, respectively, for the spatiotemporal estimation of NO2 levels in one site at Central London. The proposed hybrid deep learning approach offers great potential to be operational in providing air pollution forecasts in areas without a reliable database.
The model-building protocol adapted in this thesis can also be applied to studies using measurements from other sites.
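The three validation metrics named in the abstract (RMSE, MAE, and Willmott's index of agreement) have standard definitions, sketched below in plain Python. This is a generic illustration of the metrics themselves, not code from the thesis; `obs` and `pred` are hypothetical observed and predicted NO2 series.

```python
import math

def rmse(obs, pred):
    """Root Mean Squared Error: penalises large errors quadratically."""
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(obs, pred)) / len(obs))

def mae(obs, pred):
    """Mean Absolute Error: average magnitude of the errors."""
    return sum(abs(o - p) for o, p in zip(obs, pred)) / len(obs)

def willmott_ia(obs, pred):
    """Willmott's index of agreement: 1 for perfect agreement, 0 for none."""
    o_mean = sum(obs) / len(obs)
    num = sum((p - o) ** 2 for o, p in zip(obs, pred))
    den = sum((abs(p - o_mean) + abs(o - o_mean)) ** 2 for o, p in zip(obs, pred))
    return 1.0 - num / den
```

The reported percentage reductions can then be read as, e.g., `(rmse_benchmark - rmse_hybrid) / rmse_benchmark * 100` for the RMSE index.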
On processing development for fabrication of fiber reinforced composite, part 2
Fiber-reinforced composite laminates are used in many aerospace and automobile applications. The magnitudes and durations of the cure temperature and the cure pressure applied during the curing process have significant consequences for the performance of the finished product. The objective of this study is to exploit the potential of applying optimization techniques to the cure cycle design. Using the compression molding of a filled polyester sheet molding compound (SMC) as an example, a unified Computer Aided Design (CAD) methodology, consisting of three uncoupled modules (i.e., optimization, analysis, and sensitivity calculations), is developed to systematically generate optimal cure cycle designs. Various optimization formulations for the cure cycle design are investigated. The uniformity of the temperature and degree-of-cure distributions obtained with the optimal cure cycles is compared with that resulting from conventional isothermal processing conditions with pre-warmed platens. Recommendations with regard to further research in the computerization of the cure cycle design are also addressed.
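The three-module structure described above (optimization driving an analysis module via sensitivity information) can be sketched as a minimal gradient-based loop. Everything below is a hypothetical stand-in, not the paper's model: the quadratic "cure non-uniformity" objective, the cure temperature of 150, and the step size are invented solely to show how the uncoupled modules interact.

```python
def analyze(t_cure):
    # Analysis module (stand-in): a scalar cure non-uniformity measure,
    # here a simple quadratic with its minimum at a hypothetical 150 degrees.
    return (t_cure - 150.0) ** 2 + 4.0

def sensitivity(t_cure):
    # Sensitivity module: design derivative of the stand-in analysis,
    # obtained here by direct differentiation of the quadratic.
    return 2.0 * (t_cure - 150.0)

def optimize(t0, lr=0.1, steps=200):
    # Optimization module: gradient descent on the cure temperature,
    # calling the other two modules without knowing their internals.
    t = t0
    for _ in range(steps):
        t -= lr * sensitivity(t)
    return t
```

Keeping the modules uncoupled in this way means the analysis or sensitivity method can be swapped (e.g., adjoint vs. direct differentiation) without touching the optimizer.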
AI-Based Innovation in B2B Marketing: An Interdisciplinary Framework Incorporating Academic and Practitioner Perspectives
Artificial intelligence (AI) rests at the frontier of technology, service, and industry. AI research is helping to reconfigure innovative businesses in the consumer marketplace. This paper addresses existing literature on AI and presents an emergent B2B marketing framework for AI innovation as a cycle of the critical elements identified in cross-functional studies that represent both academic and practitioner strategic orientations. We contextualize the prevalence of AI-based innovation themes by utilizing bibliometric and semantic content analysis methods across two studies and drawing data from two distinct sources: academics and industry practitioners. Our findings reveal four key analytical components: (1) IT tools and resource environment, (2) innovative actors and agents, (3) marketing knowledge and innovation, and (4) communications and exchange relationships. The academic literature and industry material analyzed in our studies imply that as markets integrate AI technology into their offerings and services, a governing opportunity emerges to better foster and encourage mutually beneficial co-creation in the AI innovation process.
Detecting Mutations in the Mycobacterium tuberculosis Pyrazinamidase Gene pncA to Improve Infection Control and Decrease Drug Resistance Rates in Human Immunodeficiency Virus Coinfection.
Hospital infection control measures are crucial to tuberculosis (TB) control strategies within settings caring for human immunodeficiency virus (HIV)-positive patients, as these patients are at heightened risk of developing TB. Pyrazinamide (PZA) is a potent drug that effectively sterilizes persistent Mycobacterium tuberculosis bacilli. However, PZA resistance associated with mutations in the nicotinamidase/pyrazinamidase coding gene, pncA, is increasing. A total of 794 patient isolates obtained from four sites in Lima, Peru, underwent spoligotyping and drug resistance testing. In one of these sites, the HIV unit of Hospital Dos de Mayo (HDM), an isolation ward for HIV/TB coinfected patients opened during the study as an infection control intervention: circulating genotypes and drug resistance pre- and postintervention were compared. All other sites cared for HIV-negative outpatients: genotypes and drug resistance rates from these sites were compared with those from HDM. HDM patients showed high concordance between multidrug resistance, PZA resistance according to the Wayne method, the two most common genotypes (spoligotype international type [SIT] 42 of the Latin American-Mediterranean (LAM)-9 clade and SIT 53 of the T1 clade), and the two most common pncA mutations (G145A and A403C). These associations were absent among community isolates. The infection control intervention was associated with 58-92% reductions in TB caused by SIT 42 or SIT 53 genotypes (odds ratio [OR] = 0.420, P = 0.003); multidrug-resistant TB (OR = 0.349, P < 0.001); and PZA-resistant TB (OR = 0.076, P < 0.001). In conclusion, pncA mutation typing, with resistance testing and spoligotyping, was useful in identifying a nosocomial TB outbreak and demonstrating its resolution after implementation of infection control measures.
Small Differences in Experience Bring Large Differences in Performance
In many life situations, people choose sequentially between repeating a past action in expectation of a familiar outcome (exploitation), or choosing a novel action whose outcome is largely uncertain (exploration). For instance, in each quarter, a manager can budget advertising for an existing product, earning a predictable boost in sales. Or she can spend to develop a completely new product, whose prospects are more ambiguous. Such decisions are central to economics, psychology, business, and innovation; and they have been studied mostly by modelling in agent-based simulations or examining statistical relationships in archival or survey data. Using experiments across cultures, we add unique evidence about causality and variations. We find that exploration is boosted by three past experiences: when decision-makers fall below top performance; undergo performance stability; or suffer low overall performance. In contrast, individual-level variables, including risk and ambiguity preferences, are poor predictors of exploration. The results provide insights into how decisions are made, substantiating the micro-foundations of strategy and assisting in balancing exploration with exploitation.
Racial attention deficit
Despite efforts toward equity in organizations and institutions, minority members report that they are often ignored and their contributions undervalued. Against this backdrop, we conduct a large-sample, multiyear experimental study to investigate patterns of attention. The findings provide causal evidence of a racial attention deficit: Even when it is in their best interest, White Americans pay less attention to Black peers. In a baseline study, we assign an incentivized puzzle to participants and examine their willingness to follow the example of their White and Black peers. White participants presume that Black peers are less competent—and fail to learn from their choices. We then test two interventions: Providing information about past accomplishments reduces the disparity in evaluations of Black peers, but the racial attention deficit persists. When Whites can witness the accomplishments of Black peers, rather than being told about them, the racial attention deficit subsides. We suggest that such a deficit can explain racial gaps documented in science, education, health, and law.