79 research outputs found
Data Organization For Data Broadcasting In Mobile Computing
The advances in mobile devices and wireless communication techniques have enabled anywhere, anytime data access.
The data cyclotron query processing scheme
Distributed database systems exploit static workload characteristics to steer data fragmentation and data allocation schemes. However, the grand challenge of distributed query processing is to come up with a self-organizing architecture, which exploits all resources to manage the hot data set, minimize query response time, and maximize throughput without global co-ordination.
In this paper, we introduce the Data Cyclotron architecture, which addresses these challenges using turbulent data movement through a storage ring built from distributed main memory, capitalizing on modern remote-DMA facilities. Queries assigned to individual nodes interact with the Data Cyclotron by picking up data fragments continuously flowing around, i.e., the hot set.
Each data fragment carries a level-of-interest (LOI) metric, which represents the cumulative query interest accrued as the fragment passes around the ring multiple times. A fragment whose LOI falls below a given threshold, inversely proportional to the ring load, is pulled off the ring.
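The LOI eviction rule described above can be sketched as follows. This is a minimal illustration of the idea, not the Data Cyclotron's actual API: the names `Fragment`, `pass_ring`, `base_threshold`, and the decay factor are all assumptions made for the example.

```python
# Sketch of LOI-based eviction: fragments accumulate query interest each
# revolution, and cold fragments are pulled off the ring. The eviction
# threshold is modeled as inversely proportional to the ring load.
from dataclasses import dataclass

@dataclass
class Fragment:
    fid: int
    loi: float = 0.0  # cumulative query interest

def pass_ring(fragments, query_interest, base_threshold=1.0, decay=0.5):
    """One revolution: accumulate interest, then drop cold fragments.

    query_interest maps fragment id -> interest expressed this pass.
    """
    load = len(fragments)
    threshold = base_threshold / max(load, 1)  # inverse to ring load
    kept = []
    for f in fragments:
        f.loi = f.loi * decay + query_interest.get(f.fid, 0)
        if f.loi >= threshold:
            kept.append(f)  # stays in the hot set
        # else: the fragment is pulled off the ring
    return kept
```

The decayed accumulation keeps recently popular fragments hot while letting interest age out; the load-dependent threshold makes a crowded ring evict more aggressively.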
Large Language Models Suffer From Their Own Output: An Analysis of the Self-Consuming Training Loop
Large language models (LLMs) have become state of the art in many benchmarks,
and conversational LLM applications such as ChatGPT are now widely used by the
public. These LLMs can be used to generate large amounts of content that is
posted to various platforms on the internet. As LLMs are trained on datasets
usually collected from the internet, this LLM-generated content may be used
to train the next generation of LLMs. A self-consuming training loop therefore
emerges, in which new LLM generations are trained on the output of previous
generations. We empirically study this self-consuming training loop using a
novel dataset that lets us analytically and accurately measure the quality and
diversity of generated outputs. We find that this self-consuming training loop
initially improves both quality and diversity. After a few generations,
however, the output inevitably degenerates in diversity. We find that the rate
of degeneration depends on the proportion of real and generated data.
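The self-consuming loop can be illustrated with a toy simulation. This is a hedged sketch, not the paper's experimental setup: a "model" here is simply an empirical token distribution, each generation is re-estimated on a mix of real data and samples from the previous generation, and diversity is counted as the number of distinct tokens the model can still produce.

```python
# Toy self-consuming training loop: generation k+1 is trained on a mix
# of real data and samples drawn from generation k. Finite sampling
# gradually loses rare tokens, so diversity shrinks over generations.
import random
from collections import Counter

def train(samples):
    """A 'model' is just the empirical token distribution."""
    counts = Counter(samples)
    total = sum(counts.values())
    return {tok: c / total for tok, c in counts.items()}

def generate(model, n, rng):
    toks = list(model)
    weights = [model[t] for t in toks]
    return rng.choices(toks, weights=weights, k=n)

def self_consuming_loop(real_data, generations, real_fraction, n=200, seed=0):
    """Return the diversity (distinct-token count) per generation."""
    rng = random.Random(seed)
    model = train(real_data)
    diversity = [len(model)]
    for _ in range(generations):
        synthetic = generate(model, n, rng)
        k_real = int(real_fraction * n)
        mix = rng.choices(real_data, k=k_real) + synthetic[: n - k_real]
        model = train(mix)
        diversity.append(len(model))
    return diversity
```

Running this with `real_fraction=0.0` (pure self-consumption) shows diversity decaying over generations, while a larger real fraction slows the decay, mirroring the dependence on the real/generated data proportion reported above.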
Soaring spaces : the development of an integrated terrestrial and bathymetric information system for the Maltese Islands
In a rapidly developing world where the introduction of massive online information systems has enabled both the scientist and the general public to interact with remotely-located data from across the globe, the reality of access to data, and eventually to information, is slowly bringing forth the realisation that decades-old barriers to data access still need to be overcome. Data availability suffers from a plethora of scourges that have left entire countries with a dearth of reliable baseline information, particularly small states with limited human capacity to manage the whole data cycle in the physical, social and environmental domains. The main limitation is that few homogeneous structures are in operation; this governance situation has turned data-gathering agencies into a series of independent hoarding kingdoms, where data 'ownership' is seen as a private rather than a corporate or national affair, so that the main users, instead of acting as custodians, transform themselves into the private owners of such data. Other, more technical issues include the facts that there are too many standards to follow; that data is not dynamic (gathered ad hoc as a one-off rather than in real time); that data is not quality assured or controlled; that queries are not organised and recorded; that data is not secured ('illegal' storage on personal storage devices and other digital media); and that versioning is not practised. In addition, even where data is available, requests for access to it have increased drastically since Tim Berners-Lee's (1989) World Wide Web (WWW) proposal changed society as never before. The WWW changed a medium that was at best techno-centric into one that is now essentially socio-technic.
Increasing requirements for bandwidth have resulted in a need to reanalyse Dahrendorff's (1990) access issue in contemporary worlds, both real and virtual, where not all of society has access to information through on-line services.
Modification of the microstructure and mechanical properties of collagen gel scaffolds for vascular tissue regeneration
The need for small-caliber vascular replacements has attracted considerable attention to the development of scaffold-based vascular constructs in bioreactors. Reconstituted collagen gels are ideal substrates for cell-mediated remodeling, but their low strength and low elasticity limit their application as scaffolds for the regeneration of vascular tissue. These shortcomings result from collagen extraction and the consequent loss of structural organization. The objective of this project was therefore to improve the mechanical properties of collagen gels so that they can support the growth and maturation of vascular tissue under cyclic conditioning. Considering how fundamental collagen assembly is to the mechanical behavior of native tissues, the microstructure of reconstituted collagen lattices was modified in three ways: 1) the intermolecular interactions and the aggregation of collagen monomers were tailored by modulating the experimental conditions, including pH, temperature, ionic strength and collagen concentration; 2) inter-fibril crosslinking was carried out to fix neighboring collagen fibrils through their reactive side chains; 3) gels were compacted and fibrils aligned through cell-mediated remodeling. Spectrophotometric analyses and SEM imaging confirmed the effects of the experimental conditions and of cell-mediated remodeling on the microstructure of the gels.
Notably, the presence of smooth muscle cells (SMCs) led to tighter and more highly oriented lattices, especially in the presence of mechanical constraints. Mechanical tests showed that the adopted procedures stiffened the collagen lattices. In particular, cyclic tests established that modulating the experimental conditions combined with crosslinking produced lattices with lower hysteresis and higher elasticity. In conclusion, this study produced, in a short time (24-48 h), collagen gel-based lattices with improved stiffness, strength, and elastic recoil. These results suggest that such lattices are serious candidates for the role of temporary supports during the maturation period under cyclic loading.
Transactional concurrency control for resource constrained applications
PhD Thesis. Transactions have long been used as a mechanism for ensuring the consistency of databases. Databases, and the associated transactional approaches, have always been an active area of research as different application domains and computing architectures have placed ever more elaborate requirements on shared data access. Because transactions typically provide consistency at the expense of timeliness (abort/retry) and resources (duplicated shared data and locking), there have been substantial efforts to limit these two costs while still satisfying application requirements. In environments where clients are geographically distant from a database, the consistency/performance trade-off becomes acute, as any retrieval of data over a network is not only expensive but also relatively slow compared to co-located client/database systems. Furthermore, for battery-powered clients the increased overhead of transactions is also a significant power cost. For all their drawbacks, however, transactions do provide the data consistency that many application types require. In this thesis we explore the solution space of timely transactional systems for remote clients and centralised databases, with a focus on providing a solution that, compared to others' work in this domain: (a) maintains consistency; (b) lowers latency; and (c) improves throughput. To achieve this we revisit a technique first developed to decrease disk access times via local caching of state (for aborted transactions) and apply it to the problems prevalent in real-time databases. We demonstrate that this technique (rerun) allows a significant change in the typical structure of a transaction, one never before considered even in rerun systems. Such a change brings significant performance gains not only in the traditional rerun local-database setting, but also in the distributed setting.
A byproduct of these improvements, one can argue, is a 'greener' solution: less time coupled with improved throughput affords improved battery life for mobile devices.
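The rerun idea of caching state across aborts can be sketched as follows. This is an illustrative assumption-laden sketch, not the thesis's actual protocol: `Store`, `version`, and `run_with_rerun` are names invented for the example, a cheap version check stands in for validation, and commit atomicity is omitted.

```python
# Sketch of rerun with local caching: an aborted/invalidated transaction
# keeps the data it already read, so the retry re-fetches only the stale
# keys instead of repeating every expensive remote read.

class Store:
    """Central store: each key maps to (value, version)."""
    def __init__(self, data):
        self.data = {k: (v, 0) for k, v in data.items()}

    def version(self, key):      # cheap validation check
        return self.data[key][1]

    def read(self, key):         # expensive remote fetch
        return self.data[key]

    def write(self, key, value):
        v, ver = self.data[key]
        self.data[key] = (value, ver + 1)

def run_with_rerun(store, keys, compute):
    """Optimistic transaction: validate cached reads, re-fetch only stale ones."""
    cache = {k: store.read(k) for k in keys}   # first (expensive) run
    fetches = len(keys)
    while True:
        stale = [k for k in keys if store.version(k) != cache[k][1]]
        if not stale:
            result = compute({k: v for k, (v, _) in cache.items()})
            return result, fetches
        for k in stale:                        # rerun: refresh stale keys only
            cache[k] = store.read(k)
            fetches += 1
```

For a remote client the saving is in `fetches`: a rerun after a conflict costs one fetch per stale key rather than one per key, which is also where the battery-life argument above comes from.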
Access to data in a small island state : the case for Malta
In a rapidly developing world where the introduction of massive online information systems has
enabled both the scientist and the general public to interact with remotely-located data from
across the globe, the reality of access to data and eventually to information is slowly bringing
forth the realisation that decades-old barriers to access to data still need to be overcome. Whilst
the massive volumes of data at hand can easily give the impression that
everything one could require is available at the touch of a button, reality
speaks with another voice: the data may be there, but its reliability is
another matter. The fundamentals of research lie in the availability of
reliable data, a requirement whose absence has left disciplines struggling
with issues of repeatability of scientific outcomes. Technology and
legislative measures have caught up with the realities facing researchers.
Continuous Access of Broadcast Data Using Artificial Pointers in Wireless Mobile Computing
In a ubiquitous information environment, massive numbers of users carrying portable computers can retrieve information anywhere and anytime using wireless mobile computing technologies. Wireless data broadcasting, as a way of disseminating information to large numbers of clients, has an inherent advantage in providing all types of users with global access to information. An adaptive access method that tolerates access failures has previously been proposed for error-prone mobile environments. However, the use of version bits has not been exploited for broadcasts whose contents are modified while their size and structure remain the same. The basic idea introduced here is to distinguish updates that do not change the size and structure of the broadcast. To handle these update types, we classify users in the mobile computing environment into existing users and new users. In the proposed continuous-access algorithms, existing users record the result of a previous access and reuse it to reach the desired records with fewer probes in a broadcast updated by a stream of same-size, same-structure bits. In the performance analysis, experimental results show that the proposed modified progression method performs best, as it requires the minimum cost to access the broadcast data.
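The gain for existing users can be illustrated with a toy broadcast model. This is a hedged sketch of the general idea, not the paper's modified progression method: records are modeled as `(key, value, same_structure_bit)` tuples, and the function names and probe accounting are assumptions for the example.

```python
# Toy broadcast access: a new client probes linearly from the start of
# the cycle, while a returning client jumps to its remembered position.
# When the update preserved size and structure (flag is True), a single
# probe suffices.

def new_user_probe(broadcast, key):
    """New client: linear probe from the start of the broadcast cycle."""
    for probes, (k, v, same_structure) in enumerate(broadcast, start=1):
        if k == key:
            return v, probes, probes - 1   # value, probe count, position
    raise KeyError(key)

def returning_user_probe(broadcast, key, last_position):
    """Returning client: reuse the recorded position from a prior access."""
    k, v, same_structure = broadcast[last_position]
    if same_structure and k == key:
        return v, 1, last_position         # one probe, as recorded
    # structure changed: fall back to a full search
    value, probes, pos = new_user_probe(broadcast, key)
    return value, probes + 1, pos          # +1 for the failed direct probe
```

The probe counts make the trade-off concrete: a same-size, same-structure update costs a returning user one probe, while a structural change degrades gracefully to a new-user search plus one wasted probe.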
Engine cylinder pressure reconstruction using crank kinematics and recurrently-trained neural networks
A recurrent non-linear autoregressive with exogenous input (NARX) neural network is proposed, and a suitable fully-recurrent training methodology is adapted and tuned, for reconstructing cylinder pressure in multi-cylinder IC engines using measured crank kinematics. This type of indirect sensing is important for cost-effective closed-loop combustion control and for On-Board Diagnostics. The challenge addressed is to accurately predict cylinder pressure traces within the cycle under generalisation conditions, i.e. using data not previously seen by the network during training. This involves direct construction and calibration of a suitable inverse crank dynamic model, which, owing to singular behaviour at top-dead-centre (TDC), has proved difficult via physical model construction, calibration, and inversion. The NARX architecture is specialised and adapted to cylinder pressure reconstruction, using a fully-recurrent training methodology, which is needed because the alternatives are too slow and unreliable for practical network training on production engines. The fully-recurrent Robust Adaptive Gradient Descent (RAGD) algorithm is tuned initially using synthesised crank kinematics, and then tested on real engine data to assess the reconstruction capability. Real data are obtained from a 1.125-litre, 3-cylinder, in-line, direct-injection spark-ignition (DISI) engine, involving synchronised measurements of crank kinematics and cylinder pressure across a range of steady-state speed and load conditions. The paper shows that a RAGD-trained NARX network using both crank velocity and crank acceleration as input information provides fast and robust training. By using the optimum epoch identified during RAGD training, acceptably accurate cylinder pressures, and especially accurate location of peak pressure, can be reconstructed robustly under generalisation conditions, making this the most practical NARX configuration and recurrent training methodology for use on production engines.
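The NARX structure underlying the abstract (prediction from lagged outputs plus lagged exogenous inputs, then recurrent free-run simulation feeding predictions back) can be sketched in minimal form. This is a generic, hedged illustration using a linear-in-parameters model fitted by least squares, not the paper's neural network or its RAGD training algorithm; `fit_narx` and `free_run` are invented names.

```python
# Minimal NARX-style sketch: y[t] is predicted from lagged outputs
# (autoregressive part) and lagged exogenous inputs (standing in for
# crank velocity/acceleration). free_run() is the recurrent mode:
# predictions are fed back as the lagged-output inputs.
import numpy as np

def narx_features(y, u, ny, nu):
    """Build regressor rows [y[t-1..t-ny], u[t-1..t-nu]] for each t."""
    lag = max(ny, nu)
    rows, targets = [], []
    for t in range(lag, len(y)):
        rows.append(np.concatenate([y[t - ny:t][::-1], u[t - nu:t][::-1]]))
        targets.append(y[t])
    return np.array(rows), np.array(targets)

def fit_narx(y, u, ny=2, nu=2):
    """One-shot least-squares fit of the linear-in-parameters model."""
    X, t = narx_features(y, u, ny, nu)
    w, *_ = np.linalg.lstsq(X, t, rcond=None)
    return w

def free_run(w, y_init, u, ny=2, nu=2):
    """Recurrent simulation: feed predictions back as lagged outputs."""
    y = list(y_init)
    for t in range(len(y), len(u)):
        x = np.concatenate([np.array(y[t - ny:t][::-1]),
                            u[t - nu:t][::-1]])
        y.append(float(x @ w))
    return np.array(y)
```

The free-run mode is what makes training hard in practice (errors feed back into the inputs), which is why the abstract argues for a fully-recurrent training method rather than one-step-ahead fitting; the least-squares fit here works only because the toy system is linear and noise-free.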