Quantifying cognitive workload and defining training time requirements using thermography
Effective mental workload measurement is critical because mental workload significantly affects human performance. A non-invasive and objective workload measurement tool is needed to overcome the limitations of current mental workload measures. Further, training/learning increases mental workload during skill or knowledge acquisition, followed by a decrease in mental workload, though sufficient training times are unknown. The objectives of this study were to: (1) investigate the efficacy of using thermography as a non-contact physiological measure to quantify mental workload, (2) quantify and describe the relationship between mental workload and learning/training, and (3) introduce a method to determine a sufficient training time and an optimal human performance level for a novel task by using thermography. Three studies were conducted to address these objectives. The first study investigated the efficacy of using thermography to quantify the relationship between mental workload and facial temperature changes while learning an alpha-numeric task. Thermography measured and quantified the mental workload level successfully, and strong, significant correlations were found among thermography, performance, and subjective workload measures (MCH and SWAT ratings). The second study investigated the utility of a psychophysical approach to determine workload levels that maximize performance on a cognitive task; it consisted of an adjustment session (participants adjusted their own workload levels) and a work session (participants worked at the chosen workload level). Participants fell into two performance groups (low and high performers by accuracy rate), and the results differed significantly between the groups. Thermography indicated whether each group had found its optimal workload level. The last study investigated the efficacy of using thermography to quantify mental workload in a complex training/learning environment. Experienced drivers' performance data were used as criteria to indicate whether novice drivers had mastered the driving skills. Strong and significant correlations were found among thermography, subjective workload measures, and performance measures in novice drivers. This study verified that thermography is a reliable and valid way to measure workload as a non-invasive and objective method. Thermography also provided more practical results than subjective workload measures for simple and complex cognitive tasks and showed the capability to identify a sufficient training time for simple or complex cognitive tasks.
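The correlation analysis described above can be illustrated with a minimal sketch; the values, variable names, and the assumption that facial temperature change tracks workload are hypothetical stand-ins, not the study's measurements:

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical illustration only: correlate a facial-temperature change signal
# with subjective workload (SWAT-like) ratings, as in the first study's analysis.
rng = np.random.default_rng(0)
swat = np.linspace(1, 9, 30) + rng.normal(0, 0.5, 30)    # assumed subjective ratings
temp_change = -0.05 * swat + rng.normal(0, 0.05, 30)     # assumed temperature deltas (deg C)

r, p = pearsonr(temp_change, swat)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```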
Meso/micro-porous graphitic carbon for highly efficient capacitive deionization
Due to the scarcity of drinking water caused by population growth, global warming, and increasing water consumption, capacitive deionization (CDI) has received considerable attention as a promising desalination technology. A fast desalination rate and a high electrosorption capacity are important elements of desalination performance in the CDI field. Here, meso/micro-porous graphitic carbon spheres (mm-PGS) were fabricated using poly(vinyl alcohol), nickel chloride, and fumed silica. The fabrication strategy of mm-PGS realized gram-scale production (>1.0 g of mm-PGS per batch), a high surface area (1492.8 m2 g-1), and a large pore volume (5.1198 cm3 g-1). In a 100 ppm NaCl solution, the mm-PGS CDI electrodes showed a rapid electrosorption rate (2.79 mg g-1 min-1) and a remarkable electrosorption capacity (9.37 mg g-1) within 10 minutes, compared with activated carbon electrodes (1.01 mg g-1 min-1 and 8.07 mg g-1 over 23 minutes). The mm-PGS materials also exhibited a high rate constant (0.07146) under the pseudo-second-order kinetic model, indicating that the meso/micro-porous structure of mm-PGS is favorable for ion transport during the desalination process. Furthermore, the mm-PGS CDI electrode consumed little electrical energy per removed ion (33.17 kJ mol-1).
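For context, the pseudo-second-order kinetic model cited above is the standard adsorption-kinetics form (general expression, not reproduced from the paper): dq_t/dt = k_2 (q_e - q_t)^2, which linearizes to t/q_t = 1/(k_2 q_e^2) + t/q_e, where q_t and q_e are the electrosorption capacities (mg g-1) at time t and at equilibrium, and k_2 is the fitted rate constant corresponding to the 0.07146 value reported above.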
Learning to Compose Task-Specific Tree Structures
For years, recursive neural networks (RvNNs) have been shown to be suitable
for representing text as fixed-length vectors and have achieved good performance
on several natural language processing tasks. However, the main drawback of
RvNNs is that they require structured input, which makes data preparation and
model implementation hard. In this paper, we propose Gumbel Tree-LSTM, a novel
tree-structured long short-term memory architecture that efficiently learns how
to compose task-specific tree structures from plain text data alone. Our model
uses the Straight-Through Gumbel-Softmax estimator to dynamically decide the
parent node among candidates and to calculate gradients of the discrete
decision. We evaluate the proposed model on natural language inference and
sentiment analysis, and show that our model outperforms or is at least
comparable to previous models. We also find that our model converges
significantly faster than other models.
Comment: AAAI 2018
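As a rough illustration of the Straight-Through Gumbel-Softmax selection step (a minimal sketch with made-up tensors, not the authors' implementation):

```python
import torch
import torch.nn.functional as F

# Minimal sketch: pick one parent candidate with a hard (one-hot) sample in the
# forward pass while gradients flow through the soft Gumbel-Softmax probabilities.
scores = torch.randn(1, 5, requires_grad=True)   # composition scores for 5 candidates
selection = F.gumbel_softmax(scores, tau=1.0, hard=True)

# Use the one-hot selection to gather the chosen candidate's representation.
candidates = torch.randn(1, 5, 8)                # 5 candidate parent vectors (dim 8)
chosen = torch.einsum('bc,bcd->bd', selection, candidates)

chosen.sum().backward()                          # straight-through: scores.grad is defined
print(selection, scores.grad)
```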
Valuing flexibilities in large-scale real estate development projects
Thesis (S.M.)--Massachusetts Institute of Technology, Dept. of Urban Studies and Planning, 2004. This electronic version was submitted by the student author. The certified thesis is available in the Institute Archives and Special Collections. Includes bibliographical references (leaves 147-150). This thesis aims to develop a set of strategic tools for real estate development projects. Conventional tools such as the Discounted Cash Flow (DCF) method fail to incorporate the dynamics of real estate development processes; as a result, their application to real-world situations is quite limited. Two methods are introduced to deal with this inadequacy of the DCF method. Decision Tree Analysis (DTA) employs a management science approach to analyze flexibilities and corresponding strategies from a management decision-making perspective. Real Options Analysis (ROA) applies theories of valuing financial derivatives to real assets, allowing investors to quantitatively analyze flexibilities. Each technique has advantages and shortcomings and should only be used in appropriate situations. DTA is suited to analyses of project-specific risks that are not directly related to the overall market, whereas ROA is a superior tool when risks originate from market uncertainties. Applying both tools in practice requires rather simplified assumptions, and it is crucial to understand them to make the analyses meaningful. The thesis finds that incorporating flexibility in decision making into an analysis is especially important for large-scale, multi-phase projects. The DCF method treats the later-phase projects as if they were fully committed at the present time. This assumption of full commitment is rarely the case in real-world practice, and as a result, the DCF method systematically undervalues future phases in multi-phase projects. The case study of New Songdo City reveals that the value of flexibility is a critical factor in the analysis of large-scale projects, especially when many market uncertainties are involved. Based on the conventional DCF method, New Songdo City has a hugely negative NPV and should not be pursued. However, the ROA and DTA approaches show that it has the potential to create enormous value by incorporating the flexibilities of the project. by Jihun Kang. S.M.
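As a toy illustration of why the DCF treatment of later phases understates value relative to a decision-tree treatment (all numbers below are hypothetical and not drawn from the New Songdo City case):

```python
# Hypothetical two-phase project: DCF commits to phase 2 today,
# while a decision-tree (DTA-style) valuation keeps the option to skip it.
r = 0.10                                   # assumed discount rate
phase1_npv = -50 + 80 / (1 + r)            # invest now, cash flow in year 1

good, bad, p_good = 150, 40, 0.5           # assumed year-2 payoffs and probability
cost2 = 100                                # assumed phase-2 cost, paid in year 1

# Conventional DCF: phase 2 treated as fully committed regardless of the market.
dcf_phase2 = (-cost2 + (p_good * good + (1 - p_good) * bad) / (1 + r)) / (1 + r)

# Decision tree: invest in phase 2 only when its year-1 NPV is positive.
val_good = max(0.0, -cost2 + good / (1 + r))
val_bad = max(0.0, -cost2 + bad / (1 + r))
dta_phase2 = (p_good * val_good + (1 - p_good) * val_bad) / (1 + r)

print(f"DCF total NPV:          {phase1_npv + dcf_phase2:6.1f}")
print(f"With flexibility (DTA): {phase1_npv + dta_phase2:6.1f}")
```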
Efficient Conversion of Acetate to 3-Hydroxypropionic Acid by Engineered Escherichia coli
Acetate, an abundant carbon source, is a potential feedstock for microbial processes that produce diverse value-added chemicals. In this study, we produced 3-hydroxypropionic acid (3-HP) from acetate with engineered Escherichia coli. For the efficient conversion of acetate to 3-HP, we initially introduced heterologous mcr (encoding malonyl-CoA reductase) from Chloroflexus aurantiacus. Then, the acetate assimilation pathway and the glyoxylate shunt were activated by overexpressing acs (encoding acetyl-CoA synthetase) and deleting iclR (encoding the glyoxylate shunt repressor). Because the key precursor malonyl-CoA is also consumed for fatty acid synthesis, we decreased the carbon flux to fatty acid synthesis by adding cerulenin. We found that inhibiting fatty acid synthesis dramatically improved 3-HP production (3.00 g/L of 3-HP from 8.98 g/L of acetate). The results indicate that acetate can be used as a promising carbon source for microbial processes and that 3-HP can be produced from acetate with a high yield (44.6% of the theoretical maximum yield).
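A quick arithmetic check of the reported yield, using only the figures quoted in the abstract:

```python
# Numbers from the abstract; the implied theoretical maximum is back-calculated.
titer_3hp = 3.00                       # g/L 3-HP produced
acetate_consumed = 8.98                # g/L acetate consumed
observed_yield = titer_3hp / acetate_consumed      # ~0.334 g 3-HP per g acetate

# 44.6% of the theoretical maximum implies a theoretical yield of roughly 0.75 g/g.
implied_theoretical_max = observed_yield / 0.446
print(f"observed: {observed_yield:.3f} g/g, "
      f"implied theoretical max: {implied_theoretical_max:.2f} g/g")
```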
DeepCompass: AI-driven Location-Orientation Synchronization for Navigating Platforms
In current navigation platforms, the user's orientation is typically
estimated from the difference between two consecutive locations. In other
words, the orientation cannot be identified until the second location is taken.
This asynchronous location-orientation identification often leads to a familiar
real-life question: why does my navigator show the wrong direction for my car
at the beginning of a trip? We propose DeepCompass, which identifies the user's
orientation by bridging the gap between street-view and user-view images. First,
we explore suitable model architectures and design the corresponding input
configurations. Second, we demonstrate artificial transformation techniques
(e.g., style transfer and road segmentation) that minimize the disparity between
the street view and the user's real-time view. We evaluate DeepCompass
extensively under various driving conditions. DeepCompass requires no additional
hardware and, in contrast to magnetometer-based navigators, is not susceptible
to external interference. This highlights the potential of DeepCompass as an
add-on to existing sensor-based orientation detection methods.
Comment: 7 pages with 3 supplemental pages
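To make the street-view/user-view matching idea concrete, here is a hypothetical sketch (random placeholder embeddings, not the DeepCompass architecture): score each candidate heading by the similarity between the user-view feature and the street-view feature rendered at that heading.

```python
import numpy as np

rng = np.random.default_rng(0)
headings = np.arange(0, 360, 45)                          # candidate orientations (degrees)
street_feats = rng.normal(size=(len(headings), 128))      # placeholder street-view embeddings
user_feat = street_feats[3] + 0.1 * rng.normal(size=128)  # simulated user-view embedding

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

scores = [cosine(user_feat, f) for f in street_feats]
print("estimated heading:", headings[int(np.argmax(scores))], "degrees")
```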
Prediction of type 2 diabetes using genome-wide polygenic risk score and metabolic profiles: A machine learning analysis of population-based 10-year prospective cohort study
Background: Previous work on predicting type 2 diabetes by integrating clinical and genetic factors has mostly focused on Western populations. In this study, we used a genome-wide polygenic risk score (gPRS) and serum metabolite data for type 2 diabetes risk prediction in an Asian population.
Methods: Data from 1425 participants in the Korean Genome and Epidemiology Study (KoGES) Ansan-Ansung cohort were used. For the gPRS analysis, genotypic and clinical information from the KoGES health examinee (n = 58,701) and KoGES cardiovascular disease association (n = 8105) sub-cohorts were included. Linkage disequilibrium analysis identified 239,062 genetic variants that were used to determine the gPRS, while the metabolites were selected using the Boruta algorithm. We used bootstrapped cross-validation to evaluate logistic regression and random forest (RF)-based machine learning models. Finally, associations of the gPRS and selected metabolites with the values of homeostatic model assessment of beta-cell function (HOMA-B) and insulin resistance (HOMA-IR) were estimated.
Findings: During the follow-up period (8.3 ± 2.8 years), 331 participants (23.2%) were diagnosed with type 2 diabetes. The areas under the curve of the RF-based models were 0.844, 0.876, and 0.883 for the model using only demographic and clinical factors, the model including the gPRS, and the model with both gPRS and metabolites, respectively. Incorporating the additional parameters in the latter two models improved the classification by 11.7% and 4.2%, respectively. While the gPRS was significantly associated with the HOMA-B value, most metabolites had a significant association with the HOMA-IR value.
Interpretation: Incorporating both gPRS and metabolite data enhanced type 2 diabetes risk prediction by capturing distinct etiologies of type 2 diabetes development. An RF-based model using clinical factors, gPRS, and metabolites predicted type 2 diabetes risk more accurately than the logistic regression-based model.
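A minimal sketch of the model-comparison step described above (synthetic features, not the KoGES data; the bootstrap scheme and hyperparameters are assumptions):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.utils import resample

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))    # stand-ins for clinical factors, gPRS, metabolites
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=500) > 0.8).astype(int)

aucs = {"random_forest": [], "logistic_regression": []}
for b in range(20):               # bootstrapped validation rounds (illustrative count)
    boot = resample(np.arange(len(y)), random_state=b)   # in-bag indices
    oob = np.setdiff1d(np.arange(len(y)), boot)          # out-of-bag indices
    models = {"random_forest": RandomForestClassifier(n_estimators=200, random_state=b),
              "logistic_regression": LogisticRegression(max_iter=1000)}
    for name, model in models.items():
        model.fit(X[boot], y[boot])
        aucs[name].append(roc_auc_score(y[oob], model.predict_proba(X[oob])[:, 1]))

print({name: round(float(np.mean(v)), 3) for name, v in aucs.items()})
```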
Guanabenz Acetate Induces Endoplasmic Reticulum Stress-Related Cell Death in Hepatocellular Carcinoma Cells
Background: Development of chemotherapeutics for the treatment of advanced hepatocellular carcinoma (HCC) has been lagging. Screening of candidate therapeutic agents using patient-derived preclinical models may facilitate drug discovery for HCC patients.
Methods: Four primary cultured HCC cells from surgically resected tumor tissues and six HCC cell lines were used for high-throughput screening of 252 drugs from the Prestwick Chemical Library. The efficacy and mechanisms of action of the candidate anti-cancer drug were analyzed via cell viability and cell cycle assays and western blotting.
Results: Guanabenz acetate, which has been used as an antihypertensive drug, was identified as a candidate anti-cancer agent for HCC through a drug sensitivity assay using the primary cultured HCC cells and HCC cell lines. Guanabenz acetate reduced HCC cell viability through apoptosis and autophagy. This occurred via inhibition of growth arrest and DNA damage-inducible protein 34, increased phosphorylation of eukaryotic initiation factor 2α, increased activating transcription factor 4, and cell cycle arrest.
Conclusions: Guanabenz acetate induces endoplasmic reticulum stress-related cell death in HCC and may be repositioned as an anti-cancer therapeutic agent for HCC patients.
Optimal planning target margin for prostate radiotherapy based on interfractional and intrafractional variability assessment during 1.5T MRI-guided radiotherapy
Introduction: We analyzed daily pre-treatment (PRE) and real-time motion-monitoring (MM) MRI scans of patients receiving definitive prostate radiotherapy (RT) with 1.5 T MRI guidance to assess interfractional and intrafractional variability of the prostate and to suggest an optimal planning target volume (PTV) margin.
Materials and methods: Rigid registration between PRE-MRI and planning CT images was performed based on the pelvic bone and prostate anatomy. The interfractional setup margin (SM) and interobserver variability (IO) were assessed by comparing the centroid values of prostate contours delineated on the PRE-MRIs. MM-MRIs were used for internal margin (IM) assessment, and the PTV margin was calculated using the van Herk formula.
Results: We delineated 400 prostate contours on PRE-MRI images. SM was 0.57 ± 0.42, 2.45 ± 1.98, and 2.28 ± 2.08 mm in the left-right (LR), anterior-posterior (AP), and superior-inferior (SI) directions, respectively, after bone localization, and 0.76 ± 0.57, 1.89 ± 1.60, and 2.02 ± 1.79 mm in the LR, AP, and SI directions, respectively, after prostate localization. IO was 1.06 ± 0.58, 2.32 ± 1.08, and 3.30 ± 1.85 mm in the LR, AP, and SI directions, respectively, after bone localization, and 1.11 ± 0.55, 2.13 ± 1.07, and 3.53 ± 1.65 mm in the LR, AP, and SI directions, respectively, after prostate localization. The average IM was 2.12 ± 0.86, 2.24 ± 1.07, and 2.84 ± 0.88 mm in the LR, AP, and SI directions, respectively. The calculated PTV margin was 2.21, 5.16, and 5.40 mm in the LR, AP, and SI directions, respectively.
Conclusions: Movements in the SI direction were the largest source of variability in definitive prostate RT, and interobserver variability was a non-negligible source of margin. The optimal PTV margin should also account for the internal margin.
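For reference, the van Herk margin recipe used above combines systematic (Sigma) and random (sigma) error components as M = 2.5*Sigma + 0.7*sigma. The sketch below uses placeholder per-direction values, since the paper's exact component breakdown is not restated here:

```python
import numpy as np

# Assumed example components (mm) for one direction; not the paper's inputs.
systematic_sds = [1.6, 0.6]   # e.g. interfractional setup and interobserver variability
random_sds = [1.1]            # e.g. intrafractional (internal) motion

Sigma = np.sqrt(np.sum(np.square(systematic_sds)))   # quadrature sum of systematic SDs
sigma = np.sqrt(np.sum(np.square(random_sds)))       # quadrature sum of random SDs
margin = 2.5 * Sigma + 0.7 * sigma                   # van Herk PTV margin
print(f"PTV margin = {margin:.2f} mm")
```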
- …