841 research outputs found
Rational Choice of Machining Tools Using Prediction Procedures
The main aim of this article is to introduce methods and procedures of predictive analysis into the design process for a variety of machining tools (MT) of metal-cutting machines. A sequence for selecting the prediction object (PO) as the initial stage of the search for promising designs is proposed. The "Tree of objectives" apparatus is effective in this regard: it is used to form the many possible ways of improving an MT, selecting the progressive ones (and thereby reducing the dimension of the problem) at each level of the hierarchy of the constructed graph-tree. A procedure for selecting the prediction method (PM), as the means of generating forecast data, is developed. The task of choosing a method is structured in detail and uses "information supply" as the main criterion. To this end, assessment scales for the choice criteria have been formed, on the basis of which their effectiveness for the PM selection process can be evaluated. Rules for PO coding are introduced through a three-element information code covering the information source classes: static data, expert estimates and patent data. Forecasting of the MT components is carried out by the method of engineering forecasting on the basis of a representative patent fund. A General Definition Table (GDT "Machining tools") has been built and estimates of the prospects of the design solutions have been obtained. A fragment of a database of 3D models of promising MT designs in the KOMPAS-3D computer-aided design system is proposed.
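As a rough illustration of the level-by-level selection described above (not the article's actual procedure), the sketch below prunes a hypothetical "Tree of objectives", keeping only the highest-scoring ways of improving an MT at each level of the hierarchy; the `Objective` class and the scores are invented for the example.

```python
# Minimal sketch (not the article's implementation): pruning a "Tree of
# objectives" level by level, keeping only the most promising ways of
# improving a machining tool and thereby reducing the problem's dimension.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Objective:
    name: str
    score: float = 0.0          # hypothetical expert/patent-based prospect estimate
    children: List["Objective"] = field(default_factory=list)


def prune_tree(node: Objective, keep: int = 2) -> Objective:
    """Keep only the `keep` highest-scoring children at every level."""
    best = sorted(node.children, key=lambda c: c.score, reverse=True)[:keep]
    node.children = [prune_tree(c, keep) for c in best]
    return node


root = Objective("Improve machining tool", children=[
    Objective("Increase rigidity", 0.8),
    Objective("Reduce tool wear", 0.9, children=[
        Objective("New coating", 0.7),
        Objective("Internal cooling", 0.85),
        Objective("Geometry change", 0.4),
    ]),
    Objective("Lower cost", 0.3),
])
prune_tree(root, keep=2)
```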
Scheduling and Allocation of Non-Manifest Loops on Hardware Graph-Models
We address the problem of scheduling non-manifest, data-dependent periodic loops for high-throughput DSP applications based on a streaming data model. In contrast to manifest loops, non-manifest data-dependent loops are loops in which the number of iterations needed to perform a calculation is data dependent and hence not known at compile time. For manifest loops, static scheduling techniques have been devised that produce near-optimal schedules. Due to the lack of exact run-time execution knowledge for non-manifest loops, these static scheduling techniques are not suitable for scheduling DSP algorithms with non-manifest loops embedded in them. We consider the case where (a) a-priori knowledge of the data distribution and (b) the worst-case execution time of the non-manifest loop are known, and a constraint on the total execution time has been given. Under these conditions, dynamic schedules of the non-manifest data-dependent loops within the DSP algorithm are possible. We show how to construct hardware which dynamically schedules these non-manifest loops. The sliding-window execution of the constructed hardware, i.e. the execution of a non-manifest loop as the data streams through it, will guarantee real-time performance for the worst-case situation, namely when each non-manifest loop requires its maximum number of iterations.
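A minimal sketch of the feasibility reasoning behind such worst-case real-time guarantees, assuming only per-iteration latencies and worst-case iteration counts; the function name, cycle counts and budget are illustrative and not taken from the paper.

```python
# Hedged sketch, not the paper's hardware construction: a worst-case
# feasibility check for non-manifest loops under a total execution-time
# budget. Each loop is described by its per-iteration latency (in cycles)
# and its worst-case iteration count; a dynamic schedule can only guarantee
# real-time behaviour if the worst-case total fits the budget.

def worst_case_feasible(loops, budget_cycles):
    """loops: list of (per_iteration_cycles, worst_case_iterations) tuples."""
    worst_case = sum(cycles * iters for cycles, iters in loops)
    return worst_case <= budget_cycles


# Example: two non-manifest loops sharing a 1000-cycle budget per data sample.
print(worst_case_feasible([(10, 40), (5, 80)], budget_cycles=1000))  # True
```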
Team PhyPA: Brain-Computer Interfacing for Everyday Human-Computer Interaction
Brain-computer interfaces can provide an input channel from humans to computers that depends only on brain activity, bypassing traditional means of communication and interaction. This input channel can be used to send explicit commands, but also to provide implicit input to the computer. As such, the computer can obtain information about its user that not only bypasses, but also goes beyond what can be communicated using traditional means. In this form, implicit input can potentially provide significant improvements to human-computer interaction. This paper describes a selection of work done by Team PhyPA (Physiological Parameters for Adaptation) at the Technische Universität Berlin to use brain-computer interfacing to enrich human-computer interaction.
DEEM: Enabling microservices via DEvice edge markets
Native applications running on handheld devices have an irreplaceable role in users' daily activities. That said, recent studies show that users download on average zero new applications per month, which suggests that new apps can face discoverability issues. In this work, we aim for web-based, download/installation-free access to native application features through microservices (µServices) that are shared between user devices in a peer-to-peer (P2P) manner. Such a P2P approach is self-scalable and requires no investment for µService deployment, unlike mobile edge computing or data centre approaches. We introduce DEEM, a DEvice Edge Market design that makes device-hosted µServices available to end-users. In DEEM, µService-based markets act as rendezvous points between available µService instances and clients. DEEM ensures i) the assignment of instances to the users that value them the most in terms of QoS gain, and ii) the maximisation of the devices' income. Our evaluation on synthetic settings demonstrates DEEM's capability to exploit the pool of device instances to improve application QoS in terms of latency.
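For illustration only, the following sketch assigns µService instances to clients greedily by QoS gain; the greedy rule, function name and numbers are assumptions and do not reproduce DEEM's actual market mechanism.

```python
# Illustrative sketch only (not the DEEM protocol): assign available
# device-hosted microservice instances to the clients that value them most
# in terms of QoS gain (e.g. latency saved), one instance per client.

def greedy_market(qos_gain):
    """qos_gain: dict mapping (client, instance) -> expected QoS gain.
    Returns a client -> instance assignment, highest gains first."""
    assignment, used_clients, used_instances = {}, set(), set()
    for (client, instance), gain in sorted(
            qos_gain.items(), key=lambda kv: kv[1], reverse=True):
        if client not in used_clients and instance not in used_instances:
            assignment[client] = instance
            used_clients.add(client)
            used_instances.add(instance)
    return assignment


gains = {("u1", "devA"): 30, ("u1", "devB"): 10,
         ("u2", "devA"): 25, ("u2", "devB"): 20}
print(greedy_market(gains))  # {'u1': 'devA', 'u2': 'devB'}
```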
The interactive effects of various nitrogen fertiliser formulations applied to urine patches on nitrous oxide emissions in grassland
Pasture-based livestock agriculture is a major source of the greenhouse gas (GHG) nitrous oxide (N2O). Although a body of research is available on the effect of urine-patch N or fertiliser N on N2O emissions, limited data are available on the effect of fertiliser N applied to patches of urinary N, which can cover up to a fifth of the yearly grazed area. This study investigated whether the sum of N2O emissions from urine and a range of N fertilisers, calcium ammonium nitrate (CAN) or urea ± urease inhibitor ± nitrification inhibitor, applied alone (disaggregated and re-aggregated) approximated the N2O emission of urine and fertiliser N applied together (aggregated). Application of fertiliser to urine patches did not significantly increase either the cumulative yearly N2O emissions or the N2O emission factor in comparison to urine and fertiliser applied separately with the emissions re-aggregated. However, there was a consistent trend for approximately 20% underestimation of the N2O loss when fertiliser and urine were applied separately compared to when they were applied together. N2O emission factors from the fertilisers were 0.02%, 0.06%, 0.17% and 0.25% for urea + dicyandiamide (DCD), urea + N-(n-butyl) thiophosphoric triamide (NBPT) + DCD, urea + NBPT and urea, respectively, while the emission factor for urine alone was 0.33%. Calcium ammonium nitrate and urea did not interact differently with urine, even when the urea included DCD. N2O losses could be reduced by switching from CAN to urea-based fertilisers.
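For reference, a commonly used definition of the N2O emission factor, consistent with the percentages reported above; the study's exact calculation may differ in detail (e.g. in the cumulative period used), so this is a general formula rather than the authors' own.

```latex
% General N2O emission factor definition (not necessarily the study's exact computation):
\[
  \mathrm{EF}\,(\%) \;=\;
  \frac{\text{N}_2\text{O-N emitted (treatment)} - \text{N}_2\text{O-N emitted (control)}}
       {\text{N applied}} \times 100
\]
```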
Assessment of mental workload across cognitive tasks using a passive brain-computer interface based on mean negative theta-band amplitudes
Brain-computer interfaces (BCI) can provide real-time and continuous assessments of mental workload in different scenarios, which can subsequently be used to optimize human-computer interaction. However, the assessment of mental workload is complicated by the task-dependent nature of the underlying neural signals, so classifiers trained on data from one task do not generalize well to other tasks. Previous attempts at classifying mental workload across different cognitive tasks have therefore only been partially successful. Here we introduce a novel algorithm to extract frontal theta oscillations from electroencephalographic (EEG) recordings of brain activity and show that it can be used to detect mental workload across different cognitive tasks. We use a published data set from a study of subject-dependent task transfer based on Filter Bank Common Spatial Patterns. Our approach enables a binary classification of mental workload with performances of 92.00% and 92.35% for low and high workload, respectively, versus an initial no-workload condition, significantly better than the previous approach. It does not, however, perform beyond chance level when comparing high vs. low workload conditions. Likewise, when an independent component analysis was performed on the data first (before any additional preprocessing), we achieved more stable classification results above chance level across all tasks, but did not perform better than the previous approach. These mixed results illustrate that while the proposed algorithm cannot replace previous general-purpose classification methods, it may outperform state-of-the-art algorithms in specific (workload) comparisons.
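A rough sketch of the general idea of a mean-negative-theta feature, assuming a 4-7 Hz band-pass filter, a fixed sampling rate and an invented threshold; this is not the published algorithm, only an illustration of the kind of feature named in the title.

```python
# Sketch under stated assumptions (not the paper's algorithm): band-pass
# filter one frontal EEG channel to the theta band (4-7 Hz), average the
# negative deflections, and threshold the result as a trivial classifier.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # sampling rate in Hz (assumed)


def mean_negative_theta(eeg_channel, fs=FS):
    b, a = butter(4, [4 / (fs / 2), 7 / (fs / 2)], btype="band")
    theta = filtfilt(b, a, eeg_channel)
    negative = theta[theta < 0]
    return negative.mean() if negative.size else 0.0


# Synthetic example: 10 s of noise with an added 6 Hz component.
t = np.arange(0, 10, 1 / FS)
eeg = 0.5 * np.sin(2 * np.pi * 6 * t) + 0.1 * np.random.randn(t.size)
feature = mean_negative_theta(eeg)
is_high_workload = feature < -0.2   # threshold is illustrative only
```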
The SBP2 protein central to selenoprotein synthesis contacts the human ribosome at expansion segment 7L of the 28S rRNA.
SBP2 is a pivotal protein component in selenoprotein synthesis. It binds the SECIS stem-loop in the 3' UTR of selenoprotein mRNA and interacts with both the specialized translation elongation factor and the ribosome at the 60S subunit. In this work, our goal was to identify the binding partners of SBP2 on the ribosome. Cross-linking experiments with bifunctional reagents demonstrated that the SBP2-binding site on the human ribosome is mainly formed by the 28S rRNA. Direct hydroxyl radical probing of the entire 28S rRNA revealed that SBP2 bound to 80S ribosomes or 60S subunits protects helix ES7L-E in expansion segment 7 of the 28S rRNA. Diepoxybutane cross-linking confirmed the interaction of SBP2 with helix ES7L-E. Additionally, binding of SBP2 to the ribosome led to increased reactivity toward chemical probes of a few bases in ES7L-E and in the universally conserved helix H89, indicative of conformational changes in the 28S rRNA in response to SBP2 binding. This study revealed for the first time that SBP2 makes direct contacts with a discrete region of the human 28S rRNA.
A novel insight into the mechanism of mammalian selenoprotein synthesis
The amino acid selenocysteine is encoded by UGA, usually a stop codon, thus requiring a specialized machinery to enable its incorporation into selenoproteins. The machinery comprises the tRNASec, a 3′-UTR mRNA stem-loop termed the SElenoCysteine Insertion Sequence (SECIS), which is mandatory for recoding UGA as a Sec codon, the SECIS Binding Protein 2 (SBP2), and other proteins. Little is known about the molecular mechanism and, in particular, when, where, and how the SECIS and SBP2 contact the ribosome. Previous work by others used the isolated SECIS RNA to address this question. Here, we developed a novel approach using instead engineered minimal selenoprotein mRNAs containing SECIS elements derivatized with photoreactive groups. Cross-linking experiments in rabbit reticulocyte lysate provided new information about the SBP2 and SECIS contacts with components of the translation machinery at various translation steps. In particular, we found that SBP2 was bound only to the SECIS in 48S pre-initiation and 80S pre-translocation complexes. In the complex where the Sec-tRNASec was accommodated in the A site but transpeptidation was blocked, SBP2 bound the ribosome and possibly the SECIS element as well, and the SECIS made flexible contacts with the 60S ribosomal subunit involving several ribosomal proteins. Altogether, our findings broaden our understanding of the unique mechanism of selenocysteine incorporation in mammals.
A Deblurring/Denoising Corrected Scintigraphic Planar Image Reconstruction Model for Targeted Alpha Therapy
Scintigraphy is a common nuclear-medicine method to image a molecular target's bio-distribution and pharmacokinetics through the use of radiotracers and gamma cameras. The patient images are obtained using a pair of opposing large flat gamma-ray detectors equipped with parallel-hole lead or tungsten collimators that preferentially detect gamma-rays emitted perpendicular to the plane of the detector. The resulting images form anterior/posterior (A/P) planar image pairs. The obtained images are contaminated by noise and contain artifacts caused by gamma-ray attenuation, collimator penetration, scatter and other detrimental factors. Post-filtering of the images can reduce the noise, but at the cost of a loss of spatial resolution, and cannot remove any of the aforementioned artifacts. In this study, we introduced a new image reconstruction-based method to recover a single planar scintigraphic patient image corrected for attenuation, system spatial resolution and collimator penetration, using the A/P image pair (two conjugate views) as data. To accomplish this task, we used a system model based on the physical properties of the gamma-camera detectors and applied a regularization method based on sparse image representation to control noise while preserving spatial resolution. In this proof-of-concept study, we evaluated the proposed approach using simple numerical phantoms. The images were evaluated for simulated-lesion contrast and background variability. Our initial results indicate that the proposed method outperforms conventional methods. We conclude that the proposed approach is a promising methodology for improved planar scintigraphic image quality and warrants further exploration.
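As a conceptual sketch of this kind of regularized, conjugate-view reconstruction (not the paper's system model or regularizer), the following minimizes a sparsity-penalized least-squares objective with a few ISTA iterations on a toy 1-D problem; the blur matrix, noise level and penalty weight are invented.

```python
# Conceptual sketch only: recover a single corrected "image" x from a stacked
# anterior/posterior data pair y = [y_A; y_P] by minimizing
#   0.5 * ||A x - y||^2 + lam * ||x||_1
# with ISTA (proximal gradient) iterations.
import numpy as np


def ista(A, y, lam=0.1, iters=200):
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1 / Lipschitz constant
    for _ in range(iters):
        grad = A.T @ (A @ x - y)                  # gradient of the data term
        z = x - step * grad
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return x


# Toy stacked system: identical blur for the A and P views of a 1-D "image".
n = 32
blur = np.eye(n) + 0.5 * np.eye(n, k=1) + 0.5 * np.eye(n, k=-1)
A = np.vstack([blur, blur])                       # conjugate A/P views
x_true = np.zeros(n); x_true[10] = 1.0            # one "lesion"
y = A @ x_true + 0.01 * np.random.randn(2 * n)
x_rec = ista(A, y)
```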
Detailed Analysis of Scatter Contribution from Different Simulated Geometries of X-ray Detectors.
Scattering is one of the main issues left in planar mammography examinations, as it degrades the quality of the image and complicates the diagnostic process. Although widely used, anti-scatter grids have been found to be inefficient: they increase the delivered dose and the equipment price while not eliminating all of the scattered radiation. Alternative scatter-reduction methods, based on post-processing algorithms using Monte Carlo (MC) simulations, are being developed to substitute anti-scatter grids. Idealized detectors are commonly used in these simulations for the sake of simplification. In this study, the scatter distribution of three detector geometries is analyzed and compared: Case 1 uses an idealized detector geometry, Case 2 uses a scintillator plate and Case 3 uses a more realistic detector simulation, based on the structure of an indirect mammography X-ray detector. This paper demonstrates that common configuration simplifications may introduce up to 14% underestimation of the scatter in the simulation results.
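The 14% figure can be read as a relative difference between the scatter estimated with the idealized and the realistic geometries; the sketch below shows that arithmetic with made-up scatter fractions, not the paper's data.

```python
# Illustrative arithmetic only (the scatter fractions below are invented):
# how an idealized detector model can underestimate scatter relative to a
# realistic indirect-detector model.

def underestimation(scatter_idealized, scatter_realistic):
    """Relative underestimation of scatter by the idealized geometry, in %."""
    return 100.0 * (scatter_realistic - scatter_idealized) / scatter_realistic


print(f"{underestimation(0.43, 0.50):.0f}% underestimation")  # 14%
```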
- …