Theory-based user modeling for personalized interactive information retrieval
In an effort to improve users’ search experiences during the information-seeking process, providing a personalized information retrieval system has been proposed as one of the effective approaches. Personalizing search systems requires a good understanding of the users. User modeling has proven to be a good method for learning about and representing users, and many user modeling studies have therefore been carried out and several user models developed. The majority of user modeling studies apply an inductive approach; only a small number employ a deductive approach. In this paper, an EISE (Extended Information goal, Search strategy and Evaluation threshold) user model is proposed, which uses a deductive approach based on psychological theories and an existing user model. Interactive search logs of ten users obtained from a real search engine are used to validate the proposed model. The preliminary validation results show that the EISE model can identify different types of users, and the search preferences of those user types can in turn inform the design and development of interactive search systems.
A Bibliometric Analysis of the Top 100 Cited Articles on Hepatic Magnetic Resonance Imaging.
The purpose of this study is to guide readers to the most influential articles published on hepatic magnetic resonance imaging (MRI). We searched Scopus using 10 different search terms for hepatic MRI. The selected studies were thoroughly reviewed by two independent authors, and any disagreement was resolved by mutual consensus. The list of articles and journals was downloaded into an Excel spreadsheet. Only the top 100 cited articles were selected, by mutual consensus among all the authors. These articles were then read in full-text form and categorized into subgroups. Three authors independently reviewed the top 100 selected articles, after which data were extracted and analyzed. Our study showed that the largest share of the top 100 cited articles on hepatic MRI came from Radiology (30 articles), followed by European Radiology (14 articles). The American Journal of Roentgenology, Radiographics, and the Journal of Magnetic Resonance had seven articles each. The United States had the highest number of articles by region. Nineteen other journals contributed only one article each to the list. The contribution of authors was also reviewed; all authors contributing more than two articles to the highly cited list are given in Table 3 in the supplementary material. The largest number of articles was published in 2009 (14 articles), and over a five-year window the maximum contribution was made during 2008-2013 (44 articles). Our analysis gives insight into the citation frequency of the top articles on hepatic MRI, categorizes their subtopics, and describes the timeline of the publications and the contributions from different geographic regions.
AI Driven Optimization of Resource Allocation and Cost Efficiency in Cloud Computing Environments
AI can improve resource allocation and minimise operational costs in cloud computing environments. This study examines machine learning (ML) and reinforcement learning (RL) approaches and recommends two models. Linear regression, random forests, and neural networks were utilised to optimize resource consumption based on workload, cost, and performance.
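As a rough illustration of the supervised part of such a pipeline, the hedged sketch below trains a random forest regressor on synthetic workload, cost, and performance features to predict resource consumption; the feature names and data are assumptions for illustration, not the study's dataset.

```python
# Hypothetical sketch: random-forest prediction of resource usage from
# workload/cost/performance features (data and feature names are illustrative).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 1000
X = rng.uniform(size=(n, 3))            # columns: workload, cost, performance
y = 0.6 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 2] + rng.normal(0, 0.05, n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_train, y_train)

print("R^2 on held-out data:", r2_score(y_test, model.predict(X_test)))
```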
Random Forest was the most accurate model, with an R² score of 0.9870, outperforming the others in prediction. The second part of the study employed RL to find intelligent resource-use strategies. By observing how states, actions, and rewards interact, the DDPG and Q-learning algorithms learnt to allocate resources flexibly. Classic rule-based approaches were inferior to the RL models, which saved money, used resources better, and improved system reliability.
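The abstract does not give the exact RL formulation; purely as an illustration of the tabular Q-learning idea it mentions, the sketch below learns a scaling policy for a toy environment in which the state is a discretized load level and the action adds, keeps, or removes capacity. The states, actions, and reward function are assumptions, not the study's environment.

```python
# Toy Q-learning sketch for resource scaling (invented environment).
import numpy as np

n_load_levels, n_actions = 5, 3          # actions: 0 = scale down, 1 = keep, 2 = scale up
Q = np.zeros((n_load_levels, n_actions))
alpha, gamma, eps = 0.1, 0.9, 0.1
rng = np.random.default_rng(0)

def step(load, action):
    # Reward favours matching provisioned capacity to load.
    reward = -abs(load / (n_load_levels - 1) - action / (n_actions - 1))
    next_load = rng.integers(n_load_levels)   # load fluctuates randomly
    return next_load, reward

load = rng.integers(n_load_levels)
for _ in range(10_000):
    action = rng.integers(n_actions) if rng.random() < eps else int(Q[load].argmax())
    next_load, reward = step(load, action)
    Q[load, action] += alpha * (reward + gamma * Q[next_load].max() - Q[load, action])
    load = next_load

print("Greedy action per load level:", Q.argmax(axis=1))
```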
The proposed model combines adaptive decision-making and predictive analytics, delivering both a theoretical and a practical contribution. The study shows that AI-driven systems can manage cloud resources in real time with various benefits. It also encountered certain limitations, such as small datasets and limited computing capacity, but it opens new possibilities for studying more advanced RL approaches in real-world settings across multiple cloud platforms.
Study of Relationship between Customer Focus and Organizational Performance in the Telecommunication Organizations of Pakistan
Quality culture and organizational performance have been a matter of concentration and interest for researchers and practitioners over the last few decades. The purpose of this study was to explore the relationship between quality culture and organizational performance, with the mediating effect of competitive advantage and the moderating effect of human resources. It has been observed that the telecommunication companies of Pakistan face tough competition, and no empirical research is known to have been conducted in Pakistan within the context of quality culture and organizational performance with a mediating effect of competitive advantage and a moderating effect of human resources. The literature review identified seven characteristics of quality culture: employee involvement, senior management leadership, effect of the CEO, supplier partnership, customer focus, teamwork, and open corporate culture. 500 questionnaires were distributed in telecommunication companies and 250 were returned, of which 207 were valid. To examine the validity and reliability of the data collected, statistical techniques such as Cronbach’s alpha, factor analysis, Pearson correlation, and multiple regression were applied. The results revealed a positive and significant association between the independent variable (quality culture) and the dependent variable (organizational performance). The mediating variable (competitive advantage) also showed a reliable connection, but the moderating variable (human resources) did not produce positive results. Keywords: Quality culture, customer focus, organizational performance, competitive advantage
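As a hedged illustration of one of the reliability checks named above, the snippet below computes Cronbach's alpha for a small, invented set of Likert-scale item responses; the data are placeholders, not the study's questionnaire.

```python
# Illustrative Cronbach's alpha on invented Likert-scale responses
# (rows = respondents, columns = questionnaire items).
import numpy as np

items = np.array([
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 5, 5, 4],
    [2, 3, 3, 2],
    [4, 4, 5, 4],
], dtype=float)

k = items.shape[1]                               # number of items
item_vars = items.var(axis=0, ddof=1).sum()      # sum of item variances
total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
alpha = (k / (k - 1)) * (1 - item_vars / total_var)
print(f"Cronbach's alpha = {alpha:.3f}")
```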
APPLICATION OF DEEP CONVOLUTIONAL NETWORK FOR THE CLASSIFICATION OF AUTO IMMUNE DISEASE
Abstract—The Indirect Immunofluorescence (IIF) detection technique has attracted considerable attention because of its great importance in medical health. It is mainly used for the analysis of autoimmune diseases, which arise when the body’s natural defense system cannot distinguish between its own cells and foreign cells. More than 80 autoimmune diseases affecting different parts of the body exist in humans. IIF can be performed manually as well as with Computer-Aided Diagnosis (CAD). The aim of this research is to propose an advanced methodology for the analysis of autoimmune diseases using a well-known transfer learning model, with data augmentation and data normalization used to address overfitting. First, the freely available MIVIA dataset of HEp-2 cells was selected, which contains 1457 images across six staining-pattern classes: centromere, homogeneous, nucleolar, coarse speckled, fine speckled, and cytoplasmatic. The well-known transfer learning model VGG-16 was then trained on the MIVIA HEp-2 dataset. Data augmentation and data normalization were applied because medical image datasets are not very large and overfitting must be avoided. The performance of the trained model was evaluated using a confusion matrix; VGG-16 achieved 84.375% accuracy, which makes it suitable for the analysis of autoimmune diseases. In addition to accuracy, precision, recall, and F1 score were computed from the confusion matrix to check the performance of the model. The tools and languages used also matter, since they offer a simple way to implement image-processing solutions; Python and Colab were used to read and write the data, because Python provides fast execution and Colab serves as a convenient environment for running it. The results show that transfer learning is an effective technique for the analysis of autoimmune diseases, since it provides high accuracy in less time and reduces errors in image classification.
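To make the transfer-learning setup concrete, here is a minimal Keras sketch, assuming an input size, augmentation choices, and hyperparameters that are not specified in the abstract; it freezes the VGG-16 convolutional base and adds a six-class head with basic augmentation.

```python
# Hedged sketch: VGG-16 transfer learning with augmentation for six
# HEp-2 staining-pattern classes (paths and hyperparameters are assumed).
import tensorflow as tf
from tensorflow.keras import layers, models

base = tf.keras.applications.VGG16(weights="imagenet", include_top=False,
                                   input_shape=(224, 224, 3))
base.trainable = False                      # freeze the pretrained convolutional base

augment = tf.keras.Sequential([
    layers.RandomFlip("horizontal"),
    layers.RandomRotation(0.1),
])

inputs = layers.Input(shape=(224, 224, 3))
x = augment(inputs)
x = tf.keras.applications.vgg16.preprocess_input(x)   # normalization expected by VGG-16
x = base(x, training=False)
x = layers.GlobalAveragePooling2D()(x)
outputs = layers.Dense(6, activation="softmax")(x)     # six staining-pattern classes

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# train_ds / val_ds could come from tf.keras.utils.image_dataset_from_directory(...)
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```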
Study of Relationship between Open Corporate Culture and Organizational Performance in the Telecommunication Organizations of Pakistan
The purpose of this study was to explore the relationship between senior management leadership and organizational performance. It has been observed that the telecommunication companies of Pakistan face tough competition, and no empirical research is known to have been conducted in Pakistan within the context of senior management leadership and organizational performance. 500 questionnaires were administered to managers of telecommunication organizations; 250 were returned, of which 207 were valid. Data were analyzed using statistical techniques such as regression analysis, factor analysis, and Cronbach’s alpha. A positive and significant relationship was found between senior management leadership and organizational performance. The findings suggest that top management should develop strategies that keep employees involved in their particular tasks in order to optimize the performance of the telecommunication organizations of Pakistan. The study will also help senior managers understand the extent to which employee involvement makes an organization more competitive in the market, and it contributes to a deeper and better understanding of the relationship between employee involvement and organizational performance. Keywords: Quality culture, open corporate culture, organizational performance, competitive advantage
Metagenomic Analysis of Spring and Stream Waters in the Chesapeake and Ohio Canal National Historical Park
In the current century, the most critical crises faced by humankind will likely be climate change, shortage of energy supplies, and pollution of the environment. A large variety of contaminants can be released into the environment from households and from agricultural and industrial activities. During the last decades, physical, chemical, and biological technologies have been developed for pollution remediation and for assessing the extent of environmental contamination in water resources. Because of the large diversity of contaminants, systematic and comprehensive analysis of elemental and compound pollutants cannot practically be conducted over an extensive network of water bodies. As a consequence, large-scale surface water monitoring programs frequently rely on biological assessment protocols based on macroinvertebrates, microalgae, or fishes, which integrate the impact of many potential contaminants into single indices that are easy to interpret. However, standard bioassessment protocols are currently based on the morphological identification of representative sets of indicator organisms, which requires extensive stream sampling, laboratory observation, and taxonomic identification. These operations are time- and personnel-consuming and require a great deal of experience. In this project, we developed and validated an innovative water quality bioindicator based on the metagenomic analysis of the total prokaryotic microbial community in the water. Microorganisms are essential components of the aquatic ecosystem, and their diversity, nature, and distribution typically reflect variations in the environmental conditions and water quality parameters. Although conventional, cultivation-based methods for microbial characterization are important for investigating microbial communities, they are time- and resource-consuming. Newer polymerase chain reaction (PCR)-based molecular methods, such as metagenomic pyrosequencing, can quickly provide detailed information on the microbial communities present in any environment. Advanced bioinformatics computing, in connection with the resources of extensive genomic databases, provides the detailed distribution of the microbial species present in the samples, which, in this project, was used as a fingerprint of water quality. The research was conducted using water samples collected from the Chesapeake and Ohio Canal National Historical Park (CHOH) in Maryland. Comprehensive characterization of the aquatic bacterial communities was performed using metagenomic pyrosequencing. In parallel, a suite of relevant water quality parameters was monitored in the samples using standard methods. Using redundancy analysis (RDA), meaningful relationships were established between water characteristics and the metagenomic biomarker, showing its potential as a general water quality indicator. This study provides the basis for the development of an innovative method for the fast and cost-effective assessment of water quality based on the aquatic prokaryotic microbiome. Phylogenetic analyses conducted on the metagenomic data revealed that the dominant prokaryotic phyla detected in the 19 samples are similar to those typically detected in freshwater environments.
Microbial diversity indices showed that all 2012 samples were characterized by low biodiversity, while the 2013 samples showed higher diversity, likely as a result of different meteorological conditions in the two years. Clustering analysis and principal component analysis (PCA) were conducted to investigate the relationships between the relative abundance of the prokaryotic phyla and the water quality parameters. The results showed that samples collected from the same sites in different years cluster well together when compared on the basis of water quality parameters. In contrast, the 2012 samples and the 2013 samples formed separate clusters when compared on the basis of the prokaryotic phyla. These observations suggest a larger temporal variation in the microbial communities than in the physico-chemical parameters of the water. PCA focusing on the prokaryotic communities showed that the Proteobacteria and Bacteroidetes phyla, which include aerobic, heterotrophic, fast-growing bacteria (referred to as copiotrophic or 'r-type' organisms), cluster together. The other phyla, which include mostly anaerobic and/or autotrophic, slow-growing bacteria (referred to as oligotrophic or 'K-type' organisms), form a rather distinct cluster. The dependence of the prokaryotic relative abundance on the water quality parameters for the 19 samples was then interrogated using RDA. As shown by the PCA investigations, the r-type phyla cluster together and correlate with high alkalinity and conductivity, whereas the K-type phyla cluster together and correlate collectively with sulfate and nitrate. As expected, the copiotrophic, fast-growing, r-type phyla also correlate with the stream samples, while the oligotrophic, slow-growing, K-type phyla correlate better with spring, cave, and mine samples. This study provides the basis for the development of an innovative method for the fast and cost-effective assessment of water quality based on the prokaryotic microbiome.
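As a hedged illustration of the kind of ordination described above, the snippet below runs PCA on an invented samples-by-phyla relative-abundance matrix; the data, phylum list, and sample labels are placeholders, not the study's measurements.

```python
# Illustrative PCA on an invented samples-by-phyla relative-abundance matrix.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

phyla = ["Proteobacteria", "Bacteroidetes", "Cyanobacteria", "Acidobacteria"]
abundance = np.array([                 # rows = samples, columns = phyla (fractions)
    [0.55, 0.25, 0.05, 0.15],
    [0.50, 0.30, 0.05, 0.15],
    [0.20, 0.10, 0.30, 0.40],
    [0.25, 0.05, 0.35, 0.35],
])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(abundance))
for sample, (pc1, pc2) in enumerate(scores, start=1):
    print(f"sample {sample}: PC1 = {pc1:+.2f}, PC2 = {pc2:+.2f}")
```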
Characteristics of aggregated traffic in LoRaWAN
Over the past few years, the Internet of Things (IoT) has required a large number of smart devices to communicate and exchange information without direct human assistance. As the number of IoT nodes increases rapidly, massive machine-type communication (mMTC) in 5G makes it possible to integrate massive IoT traffic and applications without affecting traditional services. Low-Power Wide-Area Networks (LPWAN) are emerging commercially and are considered fundamental enablers of IoT, the Industrial Internet of Things (IIoT), and Industry 4.0 because of their license-free frequency bands, long range, low power consumption, and low cost. In recent years, LoRaWAN has emerged as one of the leading LPWAN technologies. The main contribution of this work is to examine the characteristics and model the aggregated traffic of a large and dense LoRa network deployed as a monitoring system inside the Tellus Innovation Arena, University of Oulu, following the concept of an IoT-based digital campus as a wireless-access IoT service of the 5GTN. To understand the traffic behaviour, we analyzed the inter-arrival times of the transmissions for different weeks, days, and hours. The statistics of the data show that the inter-arrival times follow an exponential trend: most transmissions have inter-arrival times of less than 10 seconds, while only a few exceed 20 seconds. We then fitted the inter-arrival times to an exponential distribution, which gave us the mean inter-arrival time of the 5GTN traffic, used subsequently to model the aggregated traffic. Finally, we applied transmission compression from a gateway to the network server, which helps to utilize resources and bandwidth efficiently. The results demonstrate that the proposed aggregation mechanism increases the system goodput.
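As a rough sketch of the distribution-fitting step described above, the snippet below fits an exponential distribution to a synthetic set of inter-arrival times and reports the estimated mean; the data are simulated, not the 5GTN traces.

```python
# Hedged sketch: fit an exponential distribution to (synthetic) packet
# inter-arrival times and recover the mean inter-arrival time.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
inter_arrivals = rng.exponential(scale=6.0, size=5000)   # synthetic, mean ~6 s

# Fix the location at 0 so only the scale (the mean) is estimated.
loc, scale = stats.expon.fit(inter_arrivals, floc=0)
print(f"estimated mean inter-arrival time: {scale:.2f} s")
print(f"sample mean:                       {inter_arrivals.mean():.2f} s")
```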
Machine-type direct-to-satellite communications: modeling and performance analysis
Abstract
Despite the massive progress in 5G and beyond systems, the deployed terrestrial networks do not provide ubiquitous coverage even on land; the situation is even worse in the open sea, which covers 70% of the Earth's surface. This disparity illustrates a significant digital divide. To address this vital need, global wireless coverage stands as one of the most crucial requirements for the upcoming 6G wireless networks, offering connectivity to both humans and IoT machines.
This thesis advocates massive machine-type communication (mMTC) and low Earth orbit (LEO) satellite integration to enable direct connectivity from low-cost, low-power end devices to satellites without relying on a terrestrial network. This emerging form of connectivity, machine-type direct-to-satellite (DtS) communications, promises to bridge the digital divide. However, DtS presents challenges which need to be carefully examined and addressed. Among these is LEO satellite mobility, which introduces a strong Doppler effect and time-varying channel conditions. Similarly, the long link distance, ranging from hundreds to thousands of kilometers, and the wide satellite footprint contribute to high propagation losses and massive interference, respectively. Additionally, terrestrial end devices possess limited energy resources. Therefore, successful DtS operation requires high energy efficiency, allowing the low-power signals to reach satellites orbiting thousands of kilometers away from Earth.
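For a rough sense of the propagation losses mentioned above, the hedged sketch below evaluates the standard free-space path loss formula for an assumed 868 MHz carrier over illustrative LEO slant ranges; the frequency and distances are assumptions, not the thesis's link budget.

```python
# Free-space path loss FSPL(dB) = 20*log10(4*pi*d*f/c) for assumed LEO slant ranges.
import math

C = 299_792_458.0                      # speed of light (m/s)
FREQ_HZ = 868e6                        # assumed sub-GHz carrier

for slant_km in (500, 1000, 2000):     # illustrative slant ranges
    d = slant_km * 1e3
    fspl_db = 20 * math.log10(4 * math.pi * d * FREQ_HZ / C)
    print(f"{slant_km:>5} km -> FSPL ~ {fspl_db:.1f} dB")
```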
Motivated by the low-power and long-range capabilities of LoRaWAN, this thesis selects LoRa- and LR-FHSS-based solutions for the modeling and develops novel Monte Carlo simulation and analytical models for DtS communications performance analysis. The results confirm the feasibility and potential of both LoRa and LR-FHSS for DtS communications and highlight the trade-offs among different parameter configurations. In summary, LR-FHSS shows better performance than LoRa, primarily due to its high sensitivity, robust Doppler resistance, and increased capacity, making it a suitable choice for DtS communications.
LR-FHSS end devices are expected to be battery-powered; it is therefore vital to investigate the energy consumption of LR-FHSS. This work conducts experiments using real-life end devices and measures the LR-FHSS air-time and current consumption. We leverage these empirical measurement results to develop analytical models that yield the energy efficiency and battery lifetime of LR-FHSS end devices. The LR-FHSS air-time model is highly important for accurate collision modeling and scalability analysis.
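The abstract does not detail the analytical model; as a hedged back-of-the-envelope sketch of how air-time and current measurements can feed a battery-lifetime estimate, the snippet below assumes illustrative values for transmit current, sleep current, air-time per uplink, and battery capacity.

```python
# Hypothetical battery-lifetime estimate for an LR-FHSS end device.
# All numeric values below are assumptions for illustration only.
AIRTIME_S = 3.0          # assumed air-time per uplink (s)
I_TX_MA = 90.0           # assumed current while transmitting (mA)
I_SLEEP_MA = 0.002       # assumed sleep current (mA)
PERIOD_S = 3600.0        # one uplink per hour
BATTERY_MAH = 2400.0     # assumed battery capacity (mAh)

# Average current is the duty-cycle-weighted mix of transmit and sleep currents.
avg_ma = (I_TX_MA * AIRTIME_S + I_SLEEP_MA * (PERIOD_S - AIRTIME_S)) / PERIOD_S
lifetime_h = BATTERY_MAH / avg_ma
print(f"average current: {avg_ma:.3f} mA, lifetime: {lifetime_h / 24:.0f} days")
```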
Academic dissertation to be presented with the assent of the Doctoral Programme Committee of Information Technology and Electrical Engineering of the University of Oulu for public defence in the OP auditorium (L10), Linnanmaa, on 27 September 2024, at 12 noon.
- …
