Wi-Fi QoS improvements for industrial automation
Digitalization has caused a considerable increase in the use of industrial automation applications. These applications rely on real-time traffic with strict requirements: connectivity for tens of devices, high reliability, determinism, low latency, and synchronization. The current solutions meeting these requirements are wired technologies; however, wireless technologies are needed for mobility, lower complexity, and quick deployment.
There are many studies on cellular technologies for industrial automation scenarios with strict reliability and latency requirements, but few developments for wireless communications over unlicensed bands. Wireless Fidelity (Wi-Fi) is a commonly used and preferred technology in factory automation, since it is supported by many applications and operates on a license-free band. However, there is still room to improve Wi-Fi system performance for the low-latency, high-reliability communication requirements of industrial automation use cases.
Various limitations in the current Wi-Fi system restrain its deployment for time-critical operations. To meet the strict timing requirements of low delay and jitter in industrial automation applications, Quality of Service (QoS) in Wi-Fi needs to be improved. In this thesis, a new access category in the Medium Access Control (MAC) layer for industrial automation applications is proposed. The performance improvement is analyzed with simulations, and a jitter definition for a Wi-Fi system is studied. Then, a fixed Modulation and Coding Scheme (MCS) link adaptation method and a bounded delay are implemented for time-critical traffic in the simulation cases to observe performance changes.
Finally, it is shown that the new access category with no backoff time can decrease the delay and jitter of time-critical applications. The improvements in Wi-Fi QoS are shown in comparison with the current standard, and additional enhancements, namely the use of a fixed modulation and coding scheme and the implementation of a bounded delay, are also analyzed in this thesis.
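The effect of a no-backoff access category can be illustrated with a minimal sketch. The snippet below is not the thesis's simulator: the AC_VO numbers follow common 802.11 EDCA defaults, AC_TC is a hypothetical time-critical category with AIFSN 1 and a zero contention window, and the backoff model is deliberately simplified (a single draw from [0, CWmin], no retries or collisions).

```python
import random

SLOT_US = 9   # 802.11 OFDM slot time in microseconds
SIFS_US = 16  # short interframe space in microseconds

# EDCA-style parameters: AC_VO is the standard voice category; AC_TC is a
# hypothetical time-critical category with no backoff (cw_min = cw_max = 0).
ACCESS_CATEGORIES = {
    "AC_VO": {"aifsn": 2, "cw_min": 3, "cw_max": 7},
    "AC_TC": {"aifsn": 1, "cw_min": 0, "cw_max": 0},
}

def channel_access_delay_us(ac: str) -> float:
    """Idle-channel access delay = AIFS + random backoff (simplified:
    one draw from [0, cw_min], ignoring retries and collisions)."""
    p = ACCESS_CATEGORIES[ac]
    aifs = SIFS_US + p["aifsn"] * SLOT_US
    backoff = random.randint(0, p["cw_min"]) * SLOT_US
    return aifs + backoff

random.seed(42)
for ac in ACCESS_CATEGORIES:
    samples = [channel_access_delay_us(ac) for _ in range(10_000)]
    mean = sum(samples) / len(samples)
    jitter = max(samples) - min(samples)  # simple peak-to-peak jitter
    print(f"{ac}: mean access delay {mean:.1f} us, peak-to-peak jitter {jitter} us")
```

Even in this toy model, AC_TC's access delay is a constant 25 us with zero jitter, while AC_VO varies between 34 and 61 us, which is the qualitative behavior the thesis reports for the new access category.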
Conceptual definition of a high voltage power supply test facility
NASA Lewis Research Center is presently developing a 60 GHz traveling wave tube for satellite cross-link communications. The operating voltage for this new tube is -20 kV. There is concern about the high voltage insulation system, and NASA is planning a space station high voltage experiment that will demonstrate both the 60 GHz communications and high voltage electronics technology. The experiment interfaces, requirements, conceptual design, technology issues, and safety issues are determined. A block diagram of the high voltage power supply test facility was generated; it includes the high voltage power supply, the 60 GHz traveling wave tube, the communications package, the antenna package, a high voltage diagnostics package, and a command and data processor system. The interfaces with the space station and the attached payload accommodations equipment were determined. A brief description of the different subsystems and a discussion of the technology development needs are presented.
Analysis of Consumer Legal Protection in Online Sale Transactions
Trading transactions via the internet differ from shopping or trading in the real world. In e-commerce, buyers access websites on the internet and look for the desired item; once they have found it, the buyer sends an offer on the seller's page, calls, or sends a short message to the seller. The legal requirements of an online sale and purchase agreement, viewed from Law Number 11 of 2008 concerning Electronic Information and Transactions, fulfill the legal requirements of an agreement under Article 1320 of the Civil Code, namely consent, capacity, a certain object, and a lawful cause. Even where one of these requirements is not fulfilled, namely the capacity of the parties, online sale and purchase agreements made through a joint account (rekber) remain valid and binding on the parties: capacity is a subjective condition, and failure to meet a subjective condition does not render the agreement void, but merely makes it voidable upon request. Legal protection for buyers and sellers who use joint accounts is regulated in the Consumer Protection Law (UUPK): Article 4 on consumer rights covers, among others, the rights to obtain goods that conform to the agreed exchange value, conditions, and guarantees, to obtain information about the goods, and to obtain compensation; Article 5 on consumer obligations covers, among others, following the procedures for using the goods, acting in good faith in purchase transactions, and paying as agreed. Keywords: Consumer Protection Law, Electronic Information and Transactions Law, online sale and purchase
The Effect of Self Efficacy and Learning Independence on English Speaking Ability
This study aims to determine and analyze the effect of self-efficacy and learning independence on students' English speaking skills. The research method used is a survey with correlational analysis, conducted on first-semester students of the informatics engineering study program in the even semester of the 2020/2021 academic year. The sample of 70 students was drawn by simple random sampling, and the instruments used are questionnaires and tests. The data were first checked against the prerequisites for analysis, namely the normality test, regression linearity test, and multicollinearity test. Once these requirements were met, inferential analysis was carried out to test the research hypotheses using correlation analysis and multiple regression techniques. The results show that: (1) there is a positive and significant effect of self-efficacy and learning independence together on students' English speaking skills; (2) there is a positive and significant effect of self-efficacy on students' English speaking skills; and (3) there is a positive and significant effect of learning independence on students' English speaking skills.
High-Accuracy Ranging using Spread-Spectrum Technology
Small satellite formation flying is an important new technology. Of prime importance in formation flying is the need to determine high-accuracy, real-time satellite-to-satellite range and velocity information. Spread-spectrum technology is usually used for this purpose, but there are limitations to this approach. Traditional spread-spectrum ranging involves either one-way ranging (e.g., the GPS constellation) or round-trip ranging, and each has limitations. One-way ranging requires highly accurate synchronized clocks on each satellite; both time synchronization and frequency stability are required, and time synchronization is difficult to achieve between satellites, especially when the ranging accuracy requirements are stringent. Two-way ranging eliminates the need for high-accuracy synchronization between satellites, but with traditional spread-spectrum ranging techniques the achievable chipping rate limits the resolution. This paper shows how other techniques can be used to generate much more accurate information than traditional spread spectrum allows. The paper describes a step-by-step approach to extracting integer code range, sub-chip code phase, and carrier phase information from a coded waveform. At each step, the method is explained qualitatively and the expected system performance is determined quantitatively. The paper addresses in detail the quantitative limits on achievable performance, and from these limits develops the requirements for system frequency accuracy and the tradeoffs between signal-to-noise ratio (SNR) and range updating.
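The first two steps of such a pipeline, integer code phase from a correlation peak, then sub-chip refinement, can be sketched as below. This is not the paper's method: the code length, oversampling factor, and noiseless channel are simplifying assumptions, and the sub-chip step uses plain parabolic interpolation of the correlation peak.

```python
import random

random.seed(0)
SAMPLES_PER_CHIP = 4
N_CHIPS = 63

# Random +/-1 PRN-like code, oversampled to SAMPLES_PER_CHIP samples per chip.
code = [random.choice([-1.0, 1.0]) for _ in range(N_CHIPS)]
tx = [c for c in code for _ in range(SAMPLES_PER_CHIP)]
n = len(tx)

def circ_corr(a, b, lag):
    """Circular correlation of a against b delayed by `lag` samples."""
    return sum(a[i] * b[(i - lag) % n] for i in range(n))

# Received signal: a circularly delayed replica (noiseless for clarity).
true_delay = 30  # samples, i.e. 7.5 chips here
rx = [tx[(i - true_delay) % n] for i in range(n)]

# Step 1: integer code phase from the correlation peak.
corr = [circ_corr(rx, tx, k) for k in range(n)]
peak = max(range(n), key=lambda k: corr[k])

# Step 2: sub-sample (sub-chip) refinement by fitting a parabola through the
# peak and its two neighbours and taking the vertex.
y0, y1, y2 = corr[peak - 1], corr[peak], corr[(peak + 1) % n]
frac = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
est_delay = peak + frac
print(f"estimated delay: {est_delay:.3f} samples (true: {true_delay})")
```

A real system would add noise, Doppler, and the carrier-phase step the paper describes; carrier phase is what ultimately pushes the accuracy well below the chip-rate limit.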
Meeting the requirements to deploy cloud RAN over optical networks
Radio access network (RAN) cost savings are expected in future cloud RAN (C-RAN). In contrast to traditional distributed RAN architectures, in C-RAN remote radio heads (RRHs) from different sites can share baseband processing resources from virtualized baseband unit pools placed in a few central offices (COs). Due to the stringent requirements of the several interfaces needed in C-RAN, optical networks have been proposed to support it. One of the key elements that needs to be considered is the optical transponder. Specifically, sliceable bandwidth-variable transponders (SBVTs) have recently shown many advantages for core optical transport networks. In this paper, we study the connectivity requirements of C-RAN applications and conclude that dynamicity, fine granularity, and elasticity are needed. However, no SBVT implementation supports those requirements, and thus we propose and assess an SBVT architecture based on dynamic optical arbitrary generation/measurement. We consider different long-term-evolution-advanced configurations and study the impact of the centralization level in terms of capital and operating expense. An optimization problem is modeled to decide which COs should be equipped and which equipment, including transponders, needs to be installed. The results show noticeable cost savings from installing the proposed SBVTs compared to installing fixed transponders. Finally, compared to the maximum centralization level, remarkable cost savings are shown when a lower level of centralization is considered.
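The flavor of the CO-equipping decision can be shown with a toy model. This is not the paper's formulation (which is a full optimization model): the CO names, costs, and reachability sets are invented, the transponder cost is collapsed into a per-RRH port cost, and a greedy cost-effectiveness heuristic stands in for an exact solver.

```python
# Toy CO-placement problem: each remote radio head (RRH) must be served by
# one equipped central office (CO); equipping a CO has a fixed cost and each
# served RRH additionally needs one transponder port.
CO_COST = {"CO1": 10.0, "CO2": 8.0, "CO3": 12.0}
PORT_COST = 1.0
REACHABLE = {               # which COs can serve each RRH (invented data)
    "RRH_A": {"CO1", "CO2"},
    "RRH_B": {"CO2"},
    "RRH_C": {"CO1", "CO3"},
    "RRH_D": {"CO3"},
}

def greedy_co_placement(co_cost, port_cost, reachable):
    """Set-cover-style greedy: repeatedly equip the CO with the lowest
    cost per newly covered RRH (assumes the instance is feasible)."""
    uncovered = set(reachable)
    equipped, total = set(), 0.0
    while uncovered:
        def effectiveness(co):
            gain = sum(1 for r in uncovered if co in reachable[r])
            return float("inf") if gain == 0 else (co_cost[co] + port_cost * gain) / gain
        best = min(co_cost, key=effectiveness)
        newly = {r for r in uncovered if best in reachable[r]}
        equipped.add(best)
        total += co_cost[best] + port_cost * len(newly)
        uncovered -= newly
    return equipped, total

cos, cost = greedy_co_placement(CO_COST, PORT_COST, REACHABLE)
print(f"equip {sorted(cos)} at total cost {cost}")
```

An exact version would be an integer linear program with binary equip/assign variables, which is closer to what the paper solves; the greedy variant just makes the cost trade-off between centralization and equipment visible.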
Why and How to Extract Conditional Statements From Natural Language Requirements
Functional requirements often describe system behavior by relating events to each other, e.g. "If the system detects an error (e_1), an error message shall be shown (e_2)". Such conditionals consist of two parts: the antecedent (see e_1) and the consequent (e_2), which convey strong semantic information about the intended behavior of a system. Automatically extracting conditionals from texts enables several analytical disciplines and is already used for information retrieval and question answering. We found that automated conditional extraction can also provide added value to Requirements Engineering (RE) by facilitating the automatic derivation of acceptance tests from requirements. However, the potential of extracting conditionals has not yet been leveraged for RE. We are convinced that there are two principal reasons for this:
1) The extent, form, and complexity of conditional statements in RE artifacts are not well understood. We do not know how conditionals are formulated and logically interpreted by RE practitioners. This hinders the development of suitable approaches for extracting conditionals from RE artifacts.
2) Existing methods fail to extract conditionals from unrestricted Natural Language (NL) in fine-grained form. That is, they do not consider the combinatorics between antecedents and consequents, and they do not allow splitting them into finer-grained text fragments (e.g., variable and condition), rendering the extracted conditionals unsuitable for RE downstream tasks such as test case derivation.
This thesis contributes to both areas. In Part I, we present empirical results on the prevalence and logical interpretation of conditionals in RE artifacts. Our case study corroborates that conditionals are widely used in both traditional and agile requirements such as acceptance criteria. We found that conditionals in requirements mainly occur in explicit, marked form and may include up to three antecedents and two consequents. Hence, the extraction approach needs to understand conjunctions, disjunctions, and negations to fully capture the relation between antecedents and consequents. We also found that conditionals are a source of ambiguity and there is not just one way to interpret them formally. This affects any automated analysis that builds upon formalized requirements (e.g., inconsistency checking) and may also influence guidelines for writing requirements.
Part II presents our tool-supported approach CiRA capable of detecting conditionals in NL requirements and extracting them in fine-grained form. For the detection, CiRA uses syntactically enriched BERT embeddings combined with a softmax classifier and outperforms existing methods (macro-F_1: 82%). Our experiments show that a sigmoid classifier built on RoBERTa embeddings is best suited to extract conditionals in fine-grained form (macro-F_1: 86%). We disclose our code, data sets, and trained models to facilitate replication. CiRA is available at http://www.cira.bth.se/demo/.
In Part III, we highlight how the extraction of conditionals from requirements can help to create acceptance tests automatically. First, we motivate this use case in an empirical study and demonstrate that the lack of adequate acceptance tests is one of the major problems in agile testing. Second, we show how extracted conditionals can be mapped to a Cause-Effect-Graph from which test cases can be derived automatically. We demonstrate the feasibility of our approach in a case study with three industry partners. In our study, out of 578 manually created test cases, 71.8% can be generated automatically. Furthermore, our approach discovered 80 relevant test cases that were missed in manual test case design. At the end of this thesis, the reader will have an understanding of (1) the notion of conditionals in RE artifacts, (2) how to extract them in fine-grained form, and (3) the added value that the extraction of conditionals can provide to RE.
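The antecedent/consequent split at the heart of this pipeline can be illustrated with a rule-based sketch. CiRA itself uses transformer classifiers (BERT/RoBERTa), not regular expressions; the pattern below only handles explicitly marked, single-clause conditionals and ignores the conjunctions, disjunctions, and negations the thesis shows are needed in practice.

```python
import re

# Detect a marked conditional and split it into antecedent and consequent.
# Limitation: the lazy ".+?" splits at the first comma, so antecedents that
# themselves contain commas are cut short.
CONDITIONAL = re.compile(
    r"^(?:If|When|In case)\s+(?P<antecedent>.+?),\s*(?:then\s+)?(?P<consequent>.+)$",
    re.IGNORECASE,
)

def extract_conditional(requirement: str):
    """Return {'antecedent': ..., 'consequent': ...} or None if unmarked."""
    m = CONDITIONAL.match(requirement.strip())
    if not m:
        return None
    return {
        "antecedent": m.group("antecedent").strip(),
        "consequent": m.group("consequent").rstrip(". ").strip(),
    }

req = "If the system detects an error, an error message shall be shown."
print(extract_conditional(req))
```

A derived acceptance test would then set up the antecedent as the test precondition and assert the consequent as the expected outcome, which is essentially the Cause-Effect-Graph mapping described above.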
An Empirical Analysis of Development Processes for Anticipatory Standards
There is an evolution in the process used by standards-development organizations (SDOs), and this is changing the prevailing standards development activity (SDA) for information and communications technology (ICT). The process is progressing from traditional SDA modes, typically involving selection from many existing candidate alternative components, toward the crafting of standards that include a substantial design component (SSDC), or 'anticipatory' standards. SSDCs require increasingly important roles from organizational players as well as SDOs, yet few theoretical frameworks exist for understanding these emerging processes. This project conducted archival analysis of SDO documents for a selected subset of web-services (WS) standards taken from publicly available sources, including minutes of meetings, proposals, drafts, and recommendations. This working paper provides a deeper understanding of SDAs, the roles played by different organizational participants, and compliance with SDO due-process requirements emerging from public policy constraints, recent legislation, and standards accreditation requirements. The research is influenced by a recent theoretical framework that views the new standards-setting processes as a complex interplay among three forces: sense-making, design, and negotiation (DSN). The DSN model provides the framework for measuring SDO progress and thereby understanding future generations of standards development processes. The empirically grounded results are a useful foundation for other SDO modeling efforts.