Redistributed manufacturing in healthcare: Creating new value through disruptive innovation
The RiHN White Paper is the first serious attempt to gather expertise and to explore applications in promising areas of healthcare that could benefit from RDM, covering early-stage user needs, challenges and priorities. The UK has an opportunity to lead in this area, and RiHN has identified an extensive number of areas for fruitful R&D, spanning production technology, infrastructure, business and organisations. The paper serves as a foundation for discussing future technological roadmaps and for engaging the wider community, stakeholders and policy makers in addressing the potential impact of RDM. The RiHN White Paper is of particular value to policy makers and funders seeking to specify action and to direct attention where it is needed. It is also useful for the research community, to support their proposals with credible research propositions and to show where collaboration with industry and the public sector will deliver the most benefits. In order to seize the opportunities presented by RDM, RiHN proposes a bold new agenda that incorporates a whole-healthcare-system view of future implementation pathways and wider transformation implications. The priority areas for future R&D can be summarised as follows:
- Automated production platform technologies and supporting manufacturing infrastructures
- Advances in analytics and metrology
- New regulatory frameworks and governance pathways
- New frameworks for business model and organisational transformation
The time to take action is now. Technologies are developing that have the potential to disrupt traditional healthcare pathways and offer therapies tailored to individual needs and physiological characteristics. The challenge is seizing this opportunity and making the UK a world leader in RDM.
CoAIcoder: Examining the Effectiveness of AI-assisted Collaborative Qualitative Analysis
While individual-level AI-assisted analysis has been extensively explored in previous studies, AI-assisted collaborative qualitative analysis (CQA) remains relatively unexplored. After identifying CQA practices and design opportunities through formative interviews, we introduce our collaborative qualitative coding tool, CoAIcoder, and design four different collaboration methods. We then conducted a between-subjects study with 32 pairs of users trained in CQA, spanning three commonly used coding phases under the four methods. Our results suggest that CoAIcoder, which employs AI and a Shared Model, could improve the efficiency of the coding process in CQA by fostering a quicker shared understanding and promoting early-stage discussions. However, this may come at the cost of reduced code diversity. We also underscore a trade-off between the level of independence and the coding outcome when humans collaborate during the early coding stages. Lastly, we identify design implications that could inspire and inform the future design of CQA systems.
A low-cost and do-it-yourself device for pumping monitoring in deep aquifers
Water crises driven by climate change, high population growth, and increasing demands from industry and agriculture call for more efficient and more widely applied water resources management strategies and techniques. Water monitoring helps provide the evidence needed to make sound decisions about managing water resources both now and in the future. In this work, a low-cost, do-it-yourself communication device is proposed to record the water production and energy consumption of electric pumping from deep boreholes/wells, and to predict the impact of ongoing and previous pumping on the evolution of the water level in the aquifer. The proposal incorporates an edge-computing approach for simulating the aquifer response in real time: the results of interest are computed at the sensor, minimizing communication requirements and ensuring almost immediate results. An approximate solution to physically based modeling of the aquifer response is computed thanks to an a priori expression of the water-level time evolution in a reduced basis. The accuracy is sufficient to detect deviations from expected behaviour, and the energy consumption of the device is greatly reduced with respect to that of the full model, which can be computed off-line to calibrate the reduced-model parameters and to perform detailed analyses. The device was tested in a real scenario, in a mountain subbasin of the Ebro river in Spain, obtaining a good trade-off between performance, price, and energy consumption. This research has been partly supported by the EU under grant agreement No. 825184, funded by the Government of Spain under contracts PID2019-106774RB-C21, PID2019-106774RB-C22, and PID2020-113172RB-I00, and by the Government of Catalonia as Consolidated Research Groups 2017-SGR-688 and 2017-SGR-990 and Pre-consolidated Research Group 2017-SGR-1496. The APC was funded by the Open program from Universitat Rovira i Virgili.
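The reduced-basis idea described above can be sketched in a few lines. This is a minimal illustration with synthetic data: the basis shapes, time scales, and coefficients are all assumptions for the sketch, not values from the paper.

```python
import numpy as np

# Hypothetical sketch of a reduced-basis approximation of aquifer drawdown.
# Assume the full (expensive) model was sampled off-line to calibrate a small
# set of basis functions; the device then only fits and evaluates these.

t = np.linspace(0.0, 10.0, 200)          # time since pumping started [days]

# Reduced basis: a few precomputed response shapes (assumed, for illustration)
basis = np.stack([
    1.0 - np.exp(-t / 0.5),              # fast response mode
    1.0 - np.exp(-t / 3.0),              # slow response mode
    t / t.max(),                         # long-term linear trend
])

# "Observed" drawdown, e.g. produced off-line by the full physical model
observed = 2.0 * basis[0] + 0.7 * basis[1] + 0.3 * basis[2]
observed += np.random.default_rng(0).normal(0.0, 0.01, t.size)  # sensor noise

# Calibration: least-squares fit of the reduced coefficients
coeffs, *_ = np.linalg.lstsq(basis.T, observed, rcond=None)

# On-device prediction is now a cheap dot product
predicted = coeffs @ basis
residual = np.abs(observed - predicted).max()
# A large residual would flag a deviation from expected aquifer behaviour
```

The point of the sketch is the cost asymmetry: the full model runs off-line to produce the basis, while the device only solves a tiny least-squares problem and evaluates a dot product.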
NormBank: A Knowledge Bank of Situational Social Norms
We present NormBank, a knowledge bank of 155k situational norms. This
resource is designed to ground flexible normative reasoning for interactive,
assistive, and collaborative AI systems. Unlike prior commonsense resources,
NormBank grounds each inference within a multivalent sociocultural frame, which
includes the setting (e.g., restaurant), the agents' contingent roles (waiter,
customer), their attributes (age, gender), and other physical, social, and
cultural constraints (e.g., the temperature or the country of operation). In
total, NormBank contains 63k unique constraints from a taxonomy that we
introduce and iteratively refine here. Constraints then apply in different
combinations to frame social norms. Under these manipulations, norms are non-monotonic: one can cancel an inference by updating its frame even slightly. Still, we find evidence that neural models can help reliably extend the scope and coverage of NormBank. We further demonstrate the utility of this resource with a series of transfer experiments.
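The non-monotonic behaviour described above can be illustrated with a toy frame-conditioned lookup. All rules, constraint names, and judgements below are invented for the sketch and are not drawn from NormBank.

```python
# Hypothetical illustration of frame-conditioned, non-monotonic norms in the
# spirit of NormBank: the judgement attached to an action can flip when the
# sociocultural frame gains even one constraint. Data invented here.

def judge(action, frame, rules):
    """Return the judgement of the most specific rule matching the frame."""
    best = None
    for constraints, verdict in rules:
        if constraints <= frame and (best is None or len(constraints) > len(best[0])):
            best = (constraints, verdict)
    return best[1] if best else "unknown"

rules = [
    (frozenset({"setting:restaurant"}), "expected"),             # tipping is expected...
    (frozenset({"setting:restaurant", "country:JP"}), "taboo"),  # ...unless in Japan
]

frame = {"setting:restaurant", "role:customer"}
print(judge("tip the waiter", frame, rules))   # expected
frame.add("country:JP")                         # updating the frame cancels it
print(judge("tip the waiter", frame, rules))   # taboo
```

The "most specific rule wins" choice is one simple way to realise the cancellation behaviour; the actual resource encodes constraints far more richly.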
Long Term Evolution-Advanced and Future Machine-to-Machine Communication
Long Term Evolution (LTE) has adopted Orthogonal Frequency Division Multiple Access (OFDMA) and Single Carrier Frequency Division Multiple Access (SC-FDMA) as the downlink and uplink transmission schemes, respectively. Quality of Service (QoS) provisioning is one of the primary objectives of wireless network operators. In LTE-Advanced (LTE-A), several new features such as Carrier Aggregation (CA) and Relay Nodes (RNs) have been introduced by the 3rd Generation Partnership Project (3GPP). These features have been designed to deal with the ever-increasing demand for higher data rates and spectral efficiency. The RN is a low-power, low-cost device designed to extend coverage and enhance spectral efficiency, especially at the cell edge. Wireless networks are facing a new challenge emerging on the horizon: the expected surge of Machine-to-Machine (M2M) traffic in cellular and mobile networks. The costs and sizes of M2M devices with integrated sensors, network interfaces and enhanced power capabilities have decreased significantly in recent years. It is therefore anticipated that M2M devices might outnumber conventional mobile devices in the near future. 3GPP standards like LTE-A have primarily been developed for broadband data services with mobility support. However, M2M applications are mostly based on narrowband traffic, so these standards may not achieve overall spectrum and cost efficiency when serving M2M applications. The main goal of this thesis is to take advantage of the low cost, low power and small size of RNs for integrating M2M traffic into LTE-A networks. A new RN design is presented for aggregating and multiplexing M2M traffic at the RN before transmission over the air interface (Un interface) to the base station, called eNodeB. The data packets of the M2M devices are sent to the RN over the Uu interface.
Packets from different devices are aggregated at the Packet Data Convergence Protocol (PDCP) layer before transmission to the Donor eNodeB (DeNB) as a single large IP packet instead of several small IP packets. Therefore, the amount of overhead data can be significantly reduced. The proposed concept has been implemented in an LTE-A network simulator to illustrate the benefits of M2M traffic aggregation and multiplexing at the RN. The potential gains of RNs, such as coverage enhancement, multiplexing gain, and end-to-end delay performance, are illustrated with the help of simulation results. The results indicate that the proposed concept improves the performance of the LTE-A network with M2M traffic: the adverse impact of M2M traffic on regular LTE-A traffic such as voice and file transfer is minimized, and the cell-edge throughput and QoS performance are enhanced. Moreover, the results are validated with the help of an analytical model.
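The overhead argument can be illustrated with back-of-the-envelope arithmetic. All sizes below (header lengths, payloads, device count, framing field) are assumptions for the sketch, not figures from the thesis.

```python
# Back-of-the-envelope sketch of the multiplexing gain from aggregating many
# small M2M packets into one large packet at the relay node.

IP_UDP_HEADER = 28        # bytes: IPv4 (20) + UDP (8) headers per packet
PAYLOAD = 40              # bytes: a small M2M sensor reading (assumed)
N_DEVICES = 50            # packets aggregated at the relay node

# Without aggregation: every reading carries its own IP/UDP header
plain = N_DEVICES * (IP_UDP_HEADER + PAYLOAD)

# With aggregation: one outer header plus a short per-packet length field
MUX_FIELD = 2             # bytes of framing per multiplexed packet (assumed)
aggregated = IP_UDP_HEADER + N_DEVICES * (MUX_FIELD + PAYLOAD)

saving = 1.0 - aggregated / plain
print(f"overhead saving: {saving:.1%}")
```

With these assumed numbers the aggregate stream is roughly a third smaller, and the relative gain grows as the per-device payload shrinks, which is exactly the narrowband M2M regime the thesis targets.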
Cellular and Wi-Fi technologies evolution: from complementarity to competition
This PhD thesis has the characteristic of spanning a long period of time because, while working on it, I was employed as a research engineer at CTTC with highly demanding development duties. This has delayed the deposit more than I would have liked. On the other hand, it has given me the privilege of witnessing and studying how wireless technologies have evolved over a decade, from 4G to 5G and beyond.
When I started my PhD thesis, IEEE and 3GPP were defining the two main wireless technologies of the time, Wi-Fi and LTE, covering two substantially complementary market targets. Wi-Fi was designed to operate mostly indoors, in unlicensed spectrum, and was intended to be a simple and cheap technology. Its coexistence mechanism was based on the assumption that the spectrum in which it operated was free of charge, and so it was designed around interference avoidance through the famous CSMA/CA protocol. On the other hand, 3GPP was designing technologies for licensed spectrum, a costly kind of spectrum. As a result, LTE was designed to take the best advantage of it while providing the best QoE in mainly outdoor scenarios.
The PhD thesis starts in this context and evolves with these two technologies. In the first chapters, the thesis studies radio resource management solutions for the standalone operation of Wi-Fi in unlicensed and LTE in licensed spectrum. We anticipated the now-fundamental machine learning trend by working on machine-learning-based radio resource management solutions to improve LTE and Wi-Fi operation in their respective spectrum bands. We pay particular attention to small cell deployments aimed at improving spectral efficiency in licensed spectrum, reproducing the short-range scenarios typical of Wi-Fi settings.
IEEE and 3GPP continued evolving the technologies over the years: Wi-Fi has grown into a much more complex and sophisticated technology, incorporating key features of cellular technologies such as HARQ, OFDMA, MU-MIMO, MAC scheduling and spatial reuse. On the other hand, since Release 13, cellular networks have also been designed for unlicensed spectrum. As a result, the last two chapters of this thesis focus on coexistence scenarios, in which LTE needs to be designed to coexist fairly with Wi-Fi, and NR, the radio access for 5G, with Wi-Fi at 5 GHz and WiGig at 60 GHz. Unlike LTE, which was adapted to operate in unlicensed spectrum, NR-U is natively designed with this feature, including the capability to operate in unlicensed spectrum in a completely standalone fashion, a fundamental new milestone for cellular. In this context, our focus of analysis changes: we consider that these two technological families are no longer targeting complementarity but are now competing, and we claim that this will be the trend for the years to come.
To enable research in these multi-RAT scenarios, another fundamental result of this PhD thesis, besides the scientific contributions, is the release of high-fidelity models for LTE and NR and their coexistence with Wi-Fi and WiGig to the ns-3 open-source community. ns-3 is a popular open-source network simulator with the characteristic of being multi-RAT, and so it naturally allows the evaluation of coexistence scenarios between different technologies. These models, for which I led the development, are, by academic citations, the most used open-source simulation models for LTE and NR, and have received funding from industry (Ubiquisys, WFA, SpiderCloud, Interdigital, Facebook) and federal agencies (NIST, LLNL) over the years.
Processing Uncertain RFID Data in Traceability Supply Chains
Radio Frequency Identification (RFID) is widely used to track and trace objects in traceability supply chains. However, the massive volumes of uncertain data produced by RFID readers cannot be used effectively and efficiently in RFID application systems. Following an analysis of the key features of RFID objects, this paper proposes a new framework for effectively and efficiently processing uncertain RFID data and for supporting a variety of queries for tracking and tracing RFID objects. We adjust different smoothing windows according to different rates of uncertain data, employ different strategies to process uncertain readings, and distinguish ghost, missing, and incomplete data according to their apparent positions. We propose a comprehensive data model that is suitable for different application scenarios. In addition, a path coding scheme is proposed to significantly compress massive data by aggregating the path sequence, the position, and the time intervals; the scheme is suitable for cyclic or long paths. Moreover, we further propose a processing algorithm for group and independent objects. Experimental evaluations show that our approach is effective and efficient in terms of compression and traceability queries.
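The path coding idea can be sketched as follows. This is a minimal illustration: the segment format and the reading data are assumptions for the sketch, not the paper's actual scheme.

```python
# Hypothetical sketch of path coding in the spirit described above: collapse a
# stream of (position, timestamp) RFID readings into (position, t_in, t_out)
# segments, so that long dwell times compress to a single entry and cyclic
# paths (revisits) remain representable. Data invented here.

def encode_path(readings):
    """Aggregate consecutive same-position readings into timed segments."""
    path = []
    for pos, t in readings:
        if path and path[-1][0] == pos:
            path[-1][2] = t            # extend the current segment's t_out
        else:
            path.append([pos, t, t])   # open a new segment [pos, t_in, t_out]
    return [tuple(seg) for seg in path]

readings = [("dock", 0), ("dock", 5), ("shelfA", 7), ("shelfA", 30),
            ("dock", 42)]              # a cyclic path revisiting "dock"
print(encode_path(readings))
# [('dock', 0, 5), ('shelfA', 7, 30), ('dock', 42, 42)]
```

Five raw readings become three segments here; with readers reporting many times per second, the compression on real streams is far larger.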
Retrieval Enhancements for Task-Based Web Search
The task-based view of web search implies that retrieval should take the user perspective into account. Going beyond merely retrieving the most relevant result set for the current query, the retrieval system should aim to surface results that are actually useful to the task that motivated the query.
This dissertation explores how retrieval systems can better understand and support their users' tasks from three main angles. First, we study and quantify search engine user behavior during complex writing tasks, and how task success and behavior are associated in such settings. Second, we investigate search engine queries formulated as questions, and explore patterns in a large query log that may help search engines better support this increasingly prevalent interaction pattern. Third, we propose a novel approach to reranking the search result lists produced by web search engines, taking into account retrieval axioms that formally specify the properties of a good ranking.
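The axiomatic reranking idea can be sketched with a toy example. The single axiom below (prefer more query-term overlap, break ties toward shorter documents, loosely in the spirit of length-normalization axioms) is invented for illustration and is not one of the dissertation's axioms.

```python
# Hypothetical sketch of axiom-based reranking: an axiom votes on ordered
# document pairs, and the result list is reordered to satisfy those votes.
import functools

def overlap(query, doc):
    """Count distinct query terms appearing in the document."""
    return len(set(query.split()) & set(doc.split()))

def axiom_prefers(query, doc_a, doc_b):
    """True if doc_a should rank above doc_b under the toy axiom."""
    qa, qb = overlap(query, doc_a), overlap(query, doc_b)
    if qa != qb:
        return qa > qb                  # more query terms matched wins
    return len(doc_a) < len(doc_b)      # tie-break: prefer concise documents

def rerank(query, ranking):
    cmp = lambda a, b: -1 if axiom_prefers(query, a, b) else 1
    return sorted(ranking, key=functools.cmp_to_key(cmp))

docs = ["long document about web search engines and ranking and much more",
        "web search ranking axioms",
        "unrelated cooking recipe"]
print(rerank("web search ranking", docs))
```

A real axiomatic reranker aggregates preferences from many axioms rather than a single comparison rule, but the pairwise-preference structure is the same.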