Multilocular Cystic Renal Cell Carcinoma: An Unusual Gross Appearance
Multilocular Cystic Renal Cell Carcinoma (MCRCC) represents a rare variant of clear cell (conventional) renal cell carcinoma. Owing to its distinct prognosis and natural history, MCRCC was recognised as a separate subtype of renal cell carcinoma in the 2004 WHO classification of adult renal tumors. We report a case of MCRCC diagnosed from an antemortem surgical specimen, notable both for its unusual gross appearance and for being a rare clinical entity.
On hamiltonian colorings of block graphs
A hamiltonian coloring c of a graph G of order p is an assignment of colors
to the vertices of G such that D(u,v) + |c(u) - c(v)| >= p - 1 for every two
distinct vertices u and v of G, where D(u,v) denotes the detour distance
between u and v. The value hc(c) of a hamiltonian coloring c is the maximum
color assigned to a vertex of G. The hamiltonian chromatic number, denoted by
hc(G), is min{hc(c)} taken over all hamiltonian colorings c of G. In this
paper, we present a lower bound for the hamiltonian chromatic number of block
graphs and give a sufficient condition to achieve the lower bound. We
characterize symmetric block graphs achieving this lower bound. We present two
algorithms for optimal hamiltonian coloring of symmetric block graphs.

Comment: 12 pages, 1 figure. A conference version appeared in the proceedings of WALCOM 201
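
As an illustration of the definition above (this is not one of the paper's algorithms, and the graph and coloring below are made up for the example), here is a brute-force Python check that a coloring satisfies D(u,v) + |c(u) - c(v)| >= p - 1 and, if so, reports its value hc(c). Detour distances are computed exhaustively, so this only scales to very small graphs.

# Brute-force sketch: verify a hamiltonian coloring and report hc(c).
from itertools import permutations

def detour_distance(adj, u, v):
    """Longest u-v path length (in edges), by exhaustive search; small graphs only."""
    n = len(adj)
    best = -1
    others = [w for w in range(n) if w not in (u, v)]
    for k in range(len(others) + 1):
        for mid in permutations(others, k):
            path = (u, *mid, v)
            if all(path[i + 1] in adj[path[i]] for i in range(len(path) - 1)):
                best = max(best, len(path) - 1)
    return best

def hc_value(adj, c):
    """Return hc(c) if c is a hamiltonian coloring of the graph, else None."""
    p = len(adj)
    for u in range(p):
        for v in range(u + 1, p):
            if detour_distance(adj, u, v) + abs(c[u] - c[v]) < p - 1:
                return None
    return max(c)

# Toy block graph: the path 0-1-2-3 (every block is an edge, hence a block graph).
adj = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(hc_value(adj, [3, 1, 4, 2]))  # prints 4: a valid hamiltonian coloring with hc(c) = 4
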
Scalable Bayesian Non-Negative Tensor Factorization for Massive Count Data
We present a Bayesian non-negative tensor factorization model for
count-valued tensor data, and develop scalable inference algorithms (both batch
and online) for dealing with massive tensors. Our generative model can handle
overdispersed counts as well as infer the rank of the decomposition. Moreover,
leveraging a reparameterization of the Poisson distribution as a multinomial
facilitates conjugacy in the model and enables simple and efficient Gibbs
sampling and variational Bayes (VB) inference updates, with a computational
cost that depends only on the number of nonzeros in the tensor. The model also
lends the factors a natural interpretability: each factor
corresponds to a "topic". We develop a set of online inference algorithms that
allow further scaling up the model to massive tensors, for which batch
inference methods may be infeasible. We apply our framework on diverse
real-world applications, such as \emph{multiway} topic modeling on a scientific
publications database, analyzing a political science data set, and analyzing a
massive household transactions data set.

Comment: ECML PKDD 201
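
A minimal sketch of the Poisson-multinomial reparameterization described above, assuming gamma-distributed factor matrices and a 3-way count tensor (the variable names and toy data are illustrative, not taken from the paper): each nonzero Poisson count is split across the R components by a multinomial draw whose probabilities are proportional to the per-component rates, so the per-sweep cost depends only on the nonzeros.

# Sketch of one latent-count allocation sweep for a rank-R non-negative CP model.
import numpy as np

rng = np.random.default_rng(0)

def allocate_counts(nonzeros, U, V, W):
    """nonzeros: list of ((i, j, k), y) pairs from a 3-way count tensor.
    U, V, W: non-negative factor matrices with R columns each.
    Returns per-component count totals (sufficient statistics) for each factor row."""
    SU, SV, SW = np.zeros_like(U), np.zeros_like(V), np.zeros_like(W)
    for (i, j, k), y in nonzeros:
        rates = U[i] * V[j] * W[k]            # length-R per-component rates
        probs = rates / rates.sum()
        counts = rng.multinomial(y, probs)    # split the count y across components
        SU[i] += counts; SV[j] += counts; SW[k] += counts
    return SU, SV, SW

# Toy example: a 4 x 3 x 2 count tensor with three nonzero entries, rank 2.
U = rng.gamma(1.0, 1.0, (4, 2))
V = rng.gamma(1.0, 1.0, (3, 2))
W = rng.gamma(1.0, 1.0, (2, 2))
nonzeros = [((0, 1, 0), 5), ((2, 0, 1), 3), ((3, 2, 1), 7)]
print(allocate_counts(nonzeros, U, V, W)[0])  # accumulated statistics for U

Conditioned on these latent counts, gamma priors on the factors yield conjugate gamma updates, which is the source of the simple Gibbs and VB updates mentioned in the abstract.
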
Preliminary results using a P300 brain-computer interface speller: a possible interaction effect between presentation paradigm and set of stimuli
Fernández-Rodríguez Á., Medina-Juliá M.T., Velasco-Álvarez F., Ron-Angevin R. (2019) Preliminary Results Using a P300 Brain-Computer Interface Speller: A Possible Interaction Effect Between Presentation Paradigm and Set of Stimuli. In: Rojas I., Joya G., Catala A. (eds) Advances in Computational Intelligence. IWANN 2019. Lecture Notes in Computer Science, vol 11506. Springer, Cham.

Several proposals to improve performance when controlling a P300-based BCI speller have been studied using the standard row-column presentation (RCP) paradigm. However, this paradigm may not be suitable for patients who lack gaze control. To address this, the rapid serial visual presentation (RSVP) paradigm, which presents all stimuli in the same position, has been proposed in previous studies. The aim of the present work is therefore to assess whether a set of picture stimuli that improves performance in RCP could also improve performance in an RSVP paradigm. Six participants controlled four conditions in a calibration task: letters in RCP, pictures in RCP, letters in RSVP and pictures in RSVP. The results showed that pictures in RCP obtained the best accuracy and information transfer rate. The improvement provided by pictures was greater in the RCP paradigm than in RSVP. Therefore, improvements achieved under RCP may not transfer directly to RSVP.

Universidad de Málaga. Campus de Excelencia Internacional Andalucía Tech
A domain specific approach to high performance heterogeneous computing
Users of heterogeneous computing systems face two problems: first, understanding the trade-off relationships between the observable characteristics of their applications, such as latency and quality of the result; and second, exploiting knowledge of these characteristics to allocate work to distributed computing platforms efficiently. A domain specific approach addresses both of these problems. By considering a subset of operations or functions, models of the observable characteristics, or domain metrics, may be formulated in advance and populated at run-time for task instances. These metric models can then be used to express the allocation of work as a constrained integer program. These claims are illustrated using the domain of derivatives pricing in computational finance, with the domain metrics of workload latency and pricing accuracy. For a large, varied workload of 128 Black-Scholes and Heston model-based option pricing tasks, running on a diverse array of 16 multicore CPU, GPU and FPGA platforms, predictions made by models of both the makespan and accuracy are generally within 10 percent of the run-time performance. When these models are used as inputs to machine learning and MILP-based workload allocation approaches, latency improvements of up to 24 and 270 times over the heuristic approach are seen.
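
A minimal sketch of the allocation idea, assuming hypothetical metric-model predictions of per-task latency and pricing error on each platform (the numbers, names and the sequential-execution makespan model are illustrative; the paper formulates this as a constrained integer program rather than the exhaustive search used here):

# Sketch: pick the task-to-platform allocation with the smallest makespan,
# subject to a mean-accuracy constraint, using hypothetical metric-model outputs.
from itertools import product

latency = [[4.0, 1.5, 0.9], [6.0, 2.0, 1.1], [3.0, 1.2, 0.8]]          # seconds, latency[task][platform]
error   = [[0.001, 0.004, 0.008], [0.002, 0.005, 0.009], [0.001, 0.003, 0.007]]
MAX_MEAN_ERROR = 0.005

def best_allocation(latency, error, max_mean_error):
    n_tasks, n_plats = len(latency), len(latency[0])
    best = None
    for alloc in product(range(n_plats), repeat=n_tasks):
        # Makespan model: each platform runs its assigned tasks sequentially.
        per_plat = [0.0] * n_plats
        for t, p in enumerate(alloc):
            per_plat[p] += latency[t][p]
        makespan = max(per_plat)
        mean_err = sum(error[t][p] for t, p in enumerate(alloc)) / n_tasks
        if mean_err <= max_mean_error and (best is None or makespan < best[0]):
            best = (makespan, alloc)
    return best

print(best_allocation(latency, error, MAX_MEAN_ERROR))
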
Water quality permitting: from end-of-pipe to operational strategies
This is the final version of the article. Available from IWA Publishing via the DOI in this record.

End-of-pipe permitting is a widely practised approach to control effluent discharges from wastewater treatment plants. However, the effectiveness of the traditional regulation paradigm is being challenged by increasingly complex environmental issues, ever-growing public expectations on water quality, and pressures to reduce operational costs and greenhouse gas emissions. To minimise overall environmental impacts from urban wastewater treatment, an operational strategy-based permitting approach is proposed and a four-step decision framework is established: 1) define performance indicators to represent stakeholders’ interests, 2) optimise operational strategies of urban wastewater systems in accordance with the indicators, 3) screen high performance solutions, and 4) derive permits for the operational strategies of the wastewater treatment plant. Results from a case study show that operational cost, variability of wastewater treatment efficiency and environmental risk can be simultaneously reduced by at least 7%, 70% and 78% respectively, using an optimal integrated operational strategy compared to the baseline scenario. However, trade-offs exist between the objectives, highlighting the need to expand the prevailing wastewater management paradigm beyond its narrow focus on the effluent water quality of wastewater treatment plants. Rather, systems thinking should be embraced through integrated control of all forms of urban wastewater discharges and coordinated regulation of environmental risk and treatment cost effectiveness. It is also demonstrated through the case study that permitting operational strategies could yield more environmentally protective solutions without entailing more cost than the conventional end-of-pipe permitting approach. The proposed four-step permitting framework builds on the latest computational techniques (e.g. integrated modelling, multi-objective optimisation, visual analytics) to efficiently optimise and interactively identify high performance solutions. It could facilitate transparent decision making on water quality management, as stakeholders are involved in the entire process and their interests are explicitly evaluated using quantitative metrics, with trade-offs considered in the decision-making process. We conclude that operational strategy-based permitting shows promise for regulators and water service providers alike.

The authors would like to thank the SANITAS project (EU FP7 Marie Curie Initial Training Network – ITN – 289193) for financial support, and North Wyke Farm and Atkins for their support.
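
A minimal sketch of step 3 of the framework (screening high-performance solutions), assuming each candidate operational strategy has already been scored on the three objectives reported in the abstract; the strategy names and values below are hypothetical, and the screen is a plain non-dominated (Pareto) filter rather than the visual-analytics workflow the paper describes.

# Sketch: keep only Pareto-optimal operational strategies (lower is better on every objective).
def pareto_filter(solutions):
    """solutions: dict name -> (cost, treatment-efficiency variability, environmental risk)."""
    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))
    return {name: obj for name, obj in solutions.items()
            if not any(dominates(other, obj)
                       for o_name, other in solutions.items() if o_name != name)}

candidates = {
    "baseline":   (100.0, 1.00, 1.00),
    "strategy_A": ( 93.0, 0.30, 0.22),   # e.g. -7% cost, -70% variability, -78% risk
    "strategy_B": ( 90.0, 0.45, 0.35),
    "strategy_C": ( 95.0, 0.50, 0.40),   # dominated by strategy_A and strategy_B
}
print(sorted(pareto_filter(candidates)))  # ['strategy_A', 'strategy_B']
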
Classroom Listening Conditions in Indian Primary Schools: A Survey of Four Schools
Generating the Best Stacking Sequence Table for the Design of Blended Composite Structures
In order to improve the ability of a large-scale, lightweight composite structure to carry tensile or compressive loads, stiffeners are added to the structure. The stiffeners divide the structure into several smaller panels. For a composite structure to be manufacturable, plies must be continuous across multiple adjacent panels. To prescribe a manufacturable design, an optimization algorithm can be coupled with a stacking sequence table (SST). As long as the ply stacks are selected from the SST, the design is guaranteed to be manufacturable and to satisfy all strength-related guidelines associated with the design of composite structures. An SST is constructed based only on strength-related guidelines, so a large number of possible SSTs exist. Minimized mass is a typical goal in the design of aircraft structures, and different SSTs result in different values for the minimized mass. It is therefore crucial to perform the optimization with the SST that results in the lowest mass. This paper introduces an approach to generate a unique SST that yields the lowest mass. The proposed method is applied to the optimization of a stiffened composite structure resembling the skin of an aircraft wing box.
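
A minimal sketch of the ply-continuity (blending) property an SST is meant to guarantee, using a hypothetical toy table (this is not the paper's generation method, and a real SST must also satisfy the strength-related design guidelines): each thinner stack should be obtainable from the next thicker one by dropping plies, so that plies stay continuous across adjacent panels.

# Sketch: check that every thinner stack in an SST is a subsequence of the next thicker stack.
def is_subsequence(thin, thick):
    """True if ply sequence `thin` can be obtained from `thick` by deleting plies."""
    it = iter(thick)
    return all(ply in it for ply in thin)

def sst_is_blended(sst):
    """sst: dict ply_count -> tuple of ply angles, e.g. {4: (45, 0, 0, -45), ...}."""
    counts = sorted(sst)
    return all(is_subsequence(sst[a], sst[b]) for a, b in zip(counts, counts[1:]))

sst = {
    4: (45, 0, 0, -45),
    6: (45, 0, 90, 90, 0, -45),
    8: (45, -45, 0, 90, 90, 0, -45, 45),
}
print(sst_is_blended(sst))  # True: each thinner stack drops plies from the next thicker one
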
Four-nucleon contact interactions from holographic QCD
We calculate the low energy constants of four-nucleon interactions in an
effective chiral Lagrangian in holographic QCD. We start with a D4-D8 model to
obtain meson-nucleon interactions and then integrate out massive mesons to
obtain the four-nucleon interactions in 4D. We end up with two low energy
constants at leading order and seven at next-to-leading order,
which is consistent with the effective chiral Lagrangian. The values of the low
energy constants are evaluated with the first five Kaluza-Klein resonances.

Comment: 28 pages
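
For orientation, in Weinberg-type chiral effective field theory the two leading-order four-nucleon low energy constants are conventionally the coefficients C_S and C_T of the contact terms below (standard notation; the paper's own conventions and normalisations may differ):

\mathcal{L}^{(0)}_{4N} = -\tfrac{1}{2}\, C_S\, (N^\dagger N)(N^\dagger N) - \tfrac{1}{2}\, C_T\, (N^\dagger \vec{\sigma} N) \cdot (N^\dagger \vec{\sigma} N)
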
Cutaneous Neuroimmune Interactions in Peripheral Neuropathic Pain States
Bidirectional interplay between the peripheral immune and nervous systems plays a crucial role in maintaining homeostasis and responding to noxious stimuli. This crosstalk is facilitated by a variety of cytokines, inflammatory mediators and neuropeptides. Dysregulation of this delicate physiological balance is implicated in the pathological mechanisms of various skin disorders and peripheral neuropathies. The skin is a highly complex biological structure within which peripheral sensory nerve terminals and immune cells colocalise. Herein, we provide an overview of the sensory innervation of the skin and immune cells resident to the skin. We discuss modulation of cutaneous immune response by sensory neurons and their mediators (e.g., nociceptor-derived neuropeptides), and sensory neuron regulation by cutaneous immune cells (e.g., nociceptor sensitization by immune-derived mediators). In particular, we discuss recent findings concerning neuroimmune communication in skin infections, psoriasis, allergic contact dermatitis and atopic dermatitis. We then summarize evidence of neuroimmune mechanisms in the skin in the context of peripheral neuropathic pain states, including chemotherapy-induced peripheral neuropathy, diabetic polyneuropathy, post-herpetic neuralgia, HIV-induced neuropathy, as well as entrapment and traumatic neuropathies. Finally, we highlight the future promise of emerging therapies associated with skin neuroimmune crosstalk in neuropathic pain