Conflicting Tendencies in the Development of Scientific and Technical Language Varieties: Metaphorization vs. Standardization
The present paper discusses relations between meaning and context as an interactive process that promotes cognition and communication, both intralingual and interlingual. The article also studies two conflicting tendencies evident in the development of technical language: metaphorization and standardization. Metaphorical meaning extension is characteristic of technical vocabulary in all discourse domains. At the same time, the contemporary development of corpus linguistics facilitates the standardization of terms. Taking into account pragmatic aspects of the text environment, i.e. referential, situational, cultural and social contexts, language users can interpret the meaning of new terms and establish relations and interconnections between terms and concepts within a text, a domain and the entire scientific and technical discourse. In the present article, observations on the nature and application of contemporary technical terminology are made on the basis of extensive empirical research.
Implementing Standardization Education at the National Level
This paper explores how standardization education can be implemented at the national level. Previous studies form the main source for the paper. This research shows that implementing standardization in the national education system requires policy at the national level, long-term investment in support, and cooperation between industry, standardization bodies, academia, other institutions involved in education, and government. The approach should combine bottom-up and top-down elements. The paper's novelty lies in combining previous findings into an underpinned recommendation on how to implement standardization education.
What Is Process Standardization?
Standards and standardization have played an important role in the evolution of information and communication technology. In parts of the literature on standardization, and especially among practitioners, a new theme is emerging: business process standards. While there seems to be a consensus on the desirability of process standards, the concept has not yet been fully developed: compared with data and communication standards, it lacks a clear definition, let alone a systematic understanding of how and why it creates value.
In this paper, we suggest a process standardization construct and its associated value dimensions, and report on a process standardization effort in a large multinational services firm that reveals how these theoretical considerations translate into concrete business value.
The effects of higher education programme characteristics on allocation and performance of the graduates: a European view
This paper provides new insight into the role of higher education programmes in the allocation and performance of graduates during the transition from education to the labour market. Using a unique data set on the labour market situation of graduates in nine European countries, we investigate the significance of five characteristics of higher education programmes: (1) the academic versus discipline-specific character of the competencies generated by the curriculum; (2) the level of standardization of the generated competencies; (3) the extent to which working and learning activities are combined; (4) the level of internationalization of the educational programme; and (5) the extent to which a programme provides exclusive entrance to particular occupations. First, our results reveal the particular importance of the competence orientation of the education programme: graduates are allocated to occupations whose competence orientation is congruent with that of their education. Second, we show that standardization of the education programme with respect to the competencies students acquire plays an important role both in informing the employer and in reducing adjustment costs, thereby allowing for higher remuneration of graduates.
The Global academic research organization network: Data sharing to cure diseases and enable learning health systems.
Introduction: Global data sharing is essential. This is the premise of the Academic Research Organization (ARO) Council, which was initiated in Japan in 2013 and has since been expanding throughout Asia and into Europe and the United States. The volume of data is growing exponentially, providing not only challenges but also the clear opportunity to understand and treat diseases in ways not previously considered. Harnessing the knowledge within the data in a successful way can provide researchers and clinicians with new ideas for therapies while avoiding repeats of failed experiments. This knowledge transfer from research into clinical care is at the heart of a learning health system. Methods: The ARO Council wishes to form a worldwide complementary system for the benefit of all patients and investigators, catalyzing more efficient and innovative medical research processes. Thus, they have organized Global ARO Network Workshops to bring interested parties together, focusing on the aspects necessary to make such a global effort successful. One such workshop was held in Austin, Texas, in November 2017. Representatives from Japan, Taiwan, Singapore, Europe, and the United States reported on their efforts to encourage data sharing and to use research to inform care through learning health systems. Results: This experience report summarizes presentations and discussions at the Global ARO Network Workshop held in November 2017 in Austin, TX, with representatives from Japan, Korea, Singapore, Taiwan, Europe, and the United States. Themes and recommendations to progress their efforts are explored. Standardization and harmonization are at the heart of these discussions to enable data sharing. In addition, the transformation of clinical research processes through disruptive innovation, while ensuring integrity and ethics, will be key to achieving the ARO Council goal to overcome diseases such that people not only live longer but also are healthier and happier as they age.
Conclusions: The achievement of global learning health systems will require further exploration, consensus-building, funding aligned with incentives for data sharing, standardization, harmonization, and actions that support global interests for the benefit of patients.
FGQT Q04 - Standardization Roadmap on Quantum Technologies [written by the CEN-CENELEC Focus Group on Quantum Technologies (FGQT)]
In 2018, the European Commission launched its long-term and large-scale Quantum Technology FET Flagship Program. The European Commission is also very interested in boosting standards for quantum technologies (QT). The Quantum Flagship has its own cooperation and coordination activities to “coordinate national strategies and activities”, and its “Quantum Manifesto” [1] explicitly advises forming “advisory boards” to promote collaboration in standardization. The CEN/CENELEC Focus Group on Quantum Technologies (FGQT) was formed in June 2020 with the goal of supporting the Commission's plans.
Currently, a multitude of standardization activities in QT are ongoing worldwide. While there is overlap in certain areas, other areas of this wide technological field are not being addressed at all. A coordinated approach will be highly beneficial for unleashing the full potential of standardization to speed up progress, not least because the pool of standardization experts available for quantum technologies is still very limited. Furthermore, not all areas are yet “ready for standardization”: while early standardization can boost progress in some fields, it may hinder it in others. Thus, an assessment of the standardization readiness of the different areas is required as well.
The FGQT was established to identify standardization needs and opportunities across the entire field of QT, with the ultimate goal of fostering the establishment of new industries in Europe and, consequently, the development and engineering of unprecedented novel devices and infrastructures for the benefit of European citizens.
The QT standardization roadmap follows a constructive approach: it starts with basic enabling technologies, from which QT components and subsystems are constructed; these in turn are assembled into QT systems, which form composite systems constituting the building blocks for use cases. The roadmap is thus structured to follow closely the categories of the EC Quantum Technology FET Flagship Program: quantum communication; quantum computing and simulation; and quantum metrology, sensing, and enhanced imaging. The basic enabling technologies and subsystems are organized in two pools, supporting their reuse across the different system categories. The separate types of QT unit systems are then the foundations of general QT infrastructures or composite systems. At the level of use cases, the QT standardization roadmap describes basic domains of applicability, so-called “meta use cases”, while the detailed use cases are listed in a separate FGQT document: “FGQT Q05 Use Cases”.
Finally, the QT standardization roadmap presents an outlook and conclusions, including a prioritization of the individual standardization needs identified, presented in the form of sequence diagrams (Gantt charts).
This approach differs slightly from the QT “Pillar design” of the EU Quantum Flagship but, in our opinion, it extends it and is better adapted to standardization purposes, while the former is optimally suited as a research program design.
The FGQT is an open group of Europe-based experts working in QT research areas or enabling technologies, and of developers of components, products, or services related to QT. If you are based in Europe and interested in guidelines and standards that help in setting up a research infrastructure or in structuring and boosting your market relevance, or if you want to improve coordination and exchange with your stakeholders and other experts in the field of QT, please consider joining the CEN/CENELEC FGQT.
NOTE 1 European QT standards development in CEN/CENELEC will take place in the new JTC 22 QT (Joint Technical Committee 22 on Quantum Technologies). The work in JTC 22 QT will be guided by the present roadmap document, and it is expected that the FGQT roadmap-development activity will be absorbed and continued by JTC 22 QT.
The Private Administrative Law of Technical Standardization
The nature and place of technical standards have remained an enigma for EU law and legal thought, despite their ubiquity and growing importance in market-building processes within and beyond Europe. The significance and intractability of this enigma have been heightened by the landmark Fra.bo (2012) and James Elliot (2016) judgments of the ECJ. These judgments have prompted contradictory positions regarding the publicity and justiciability of technical standards among European legal scholarship, and even between the European Commission and the European Parliament. The enigma and these contradictory positions have recently reached the ECJ again through the Stichting Rookpreventie case, currently under review by its Grand Chamber. Drawing upon a reconstructive analysis of these and other relevant legal sources concerning technical standardization in Europe, this paper surmounts these seeming contradictions by advancing a new account of these legal developments. Contrary to the mainstream positions now in tension, the article argues that these judgments have reaffirmed the New Approach and the distinctive place of technical standardization organizations in the European legal order while avoiding dysfunctional modes of judicialization. They have done so by acknowledging the techno-political character of technical standards and by aptly delineating institutional competences between the government and the judiciary throughout technical standardization processes. To guide future legal thinking and reasoning on these processes, the paper recasts these legal developments through the idea of a ‘private administrative law’, as signifying the way that EU law has transformed the nature and place of technical standardization in the internal market and as an eventual means for the global reach of EU law.
Spatial standardization of taxon occurrence data—a call to action
The fossil record is spatiotemporally heterogeneous: taxon occurrence data have patchy spatial distributions, and this patchiness varies through time. Large-scale quantitative paleobiology studies that fail to account for heterogeneous sampling coverage will generate uninformative inferences at best and confidently draw wrong conclusions at worst. Explicitly spatial methods of standardization are necessary for analyses of large-scale fossil datasets, because nonspatial sample standardization, such as diversity rarefaction, is insufficient to reduce the signal of varying spatial coverage through time or between environments and clades. Spatial standardization should control both geographic area and dispersion (spread) of fossil localities. In addition to standardizing the spatial distribution of data, other factors may be standardized, including environmental heterogeneity or the number of publications or field collecting units that report taxon occurrences. Using a case study of published global Paleobiology Database occurrences, we demonstrate strong signals of sampling; without spatial standardization, these sampling signatures could be misattributed to biological processes. We discuss practical issues of implementing spatial standardization via subsampling and present the new R package divvy to improve the accessibility of spatial analysis. The software provides three spatial subsampling approaches, as well as related tools to quantify spatial coverage. After reviewing the theory, practice, and history of equalizing spatial coverage between data comparison groups, we outline priority areas to improve related data collection, analysis, and reporting practices in paleobiology.
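The core idea of the abstract, subsampling occurrences within spatial regions of fixed extent so that geographic area and dispersion are held constant across comparison groups, can be illustrated in a few lines. The sketch below is a minimal, hypothetical Python rendition of circular spatial subsampling in general; it is not the divvy package's actual API, and the function names, the fixed-radius default, and the nearest-site selection rule are all assumptions made for the example.

```python
import math
import random

def haversine(p, q):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*p, *q))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 6371 * 2 * math.asin(math.sqrt(a))

def spatial_subsample_richness(occurrences, radius_km=1500, n_sites=2, n_iter=100, seed=0):
    """Mean taxon richness across circular spatial subsamples.

    occurrences: iterable of (taxon, (lat, lon)) tuples.
    Each iteration picks a random seed locality, keeps the n_sites
    localities nearest to it that fall within radius_km, and counts
    the distinct taxa they contain. Averaging over iterations gives
    a richness estimate standardized for both area and dispersion.
    """
    rng = random.Random(seed)
    # Group taxa by locality so each site is one sampling unit.
    sites = {}
    for taxon, coord in occurrences:
        sites.setdefault(coord, set()).add(taxon)
    coords = list(sites)
    richness = []
    for _ in range(n_iter):
        centre = rng.choice(coords)
        inside = [c for c in coords if haversine(centre, c) <= radius_km]
        if len(inside) < n_sites:
            continue  # region too sparsely sampled; skip this draw
        chosen = sorted(inside, key=lambda c: haversine(centre, c))[:n_sites]
        taxa = set().union(*(sites[c] for c in chosen))
        richness.append(len(taxa))
    return sum(richness) / len(richness) if richness else float("nan")
```

Unlike classical rarefaction, which equalizes only the number of samples drawn, this scheme also fixes the radius (area) and the clustering of the localities entering each estimate, which is the distinction the abstract draws between nonspatial and spatial standardization.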