Bandwagon or Barriers? The Role of Standards in the European and American Marketplace. Working Paper #1, November 1997
Industrial standards - a highly technical and even obscure topic to many scholars and policy-makers - are crucial in shaping market access and market conditions. They can act as non-tariff barriers (NTBs) and may affect relations between governments and businesses. The paper examines the evolution of EU policy toward standards and evaluates recent efforts to foster greater cooperation between the EU and the US in reducing the trade-inhibiting effects of industrial standards.
PKI Interoperability: Still an Issue? A Solution in the X.509 Realm
There are many obstacles that slow the global adoption of public key infrastructure (PKI) technology. The PKI interoperability problem, being poorly understood, is one of the most confusing. In this paper, we clarify the PKI interoperability issue by exploring both the juridical and technical domains. We trace the origin of the PKI interoperability problem to its root causes: legal, organizational, and technical differences between countries, which leave relying parties with no one to rely on. We explain how difficult it is to harmonize these differences. Finally, we propose to handle the interoperability problem from the trust management point of view, by introducing the role of a trust broker that is in charge of helping relying parties make informed decisions about X.509 certificates.
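The trust-broker role described above can be pictured with a small sketch. This is a minimal illustration, not the paper's actual design: the certificate fields, per-jurisdiction policy structure, and decision labels are all assumptions made for the example.

```python
def broker_decision(cert, policies):
    """Hypothetical trust-broker check for a relying party.

    cert: dict with 'issuer', 'jurisdiction', 'key_size', 'expired'
          (illustrative fields, not a real X.509 parse).
    policies: per-jurisdiction requirements the broker maintains,
          standing in for the legal/organizational knowledge that
          individual relying parties lack.
    Returns a recommendation string the relying party can act on.
    """
    policy = policies.get(cert["jurisdiction"])
    if policy is None:
        # The broker has no basis for a judgment here.
        return "unknown jurisdiction: do not rely"
    if cert["expired"] or cert["key_size"] < policy["min_key_size"]:
        return "do not rely"
    if cert["issuer"] not in policy["accredited_issuers"]:
        # Technically valid, but issued outside the accredited set.
        return "rely with caution"
    return "rely"
```

The point of the sketch is that the cross-country differences are concentrated in one place (the broker's policy table) instead of being re-evaluated by every relying party.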
Standardization of guidelines for patient photograph deidentification
IMPORTANCE: This work was performed to advance patient care by protecting patient anonymity.
OBJECTIVES: This study aimed to analyze the current practices used in patient facial photograph deidentification and set forth standardized guidelines for improving patient autonomy that are congruent with medical ethics and the Health Insurance Portability and Accountability Act (HIPAA).
DESIGN: The anonymization guidelines of 13 respected journals were reviewed for adequacy against the facial recognition literature. Simple statistics were used to compare the use of the most common concealment techniques in the 8 medical journals that may publish the most facial photographs.
SETTING: Not applicable.
PARTICIPANTS: Not applicable.
MAIN OUTCOME MEASURES: Facial photo deidentification guidelines of 13 journals were ascertained. Number and percentage of patient photographs lacking adequate anonymization in 8 journals were determined.
RESULTS: Facial image anonymization guidelines varied across journals. When anonymization was attempted, 87% of the images were inadequately concealed. The most common technique used was masking the eyes alone with a black box.
CONCLUSIONS: Most journals evaluated lack specific instructions for properly de-identifying facial photographs. The guidelines introduced here stress that both eyebrows and eyes must be concealed to ensure patient privacy. Examples of proper and inadequate photo anonymization techniques are provided.
RELEVANCE: Improving patient care by ensuring greater patient anonymity.
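The guideline above (conceal both eyes and eyebrows, not the eyes alone) can be illustrated with a minimal sketch. The array shape and the row coordinates are assumptions; in practice the band would come from a face-landmark detector and must span from above the eyebrows to below the eyes.

```python
import numpy as np

def mask_eye_region(image, top, bottom):
    """Black out the horizontal band covering both the eyes AND the
    eyebrows (rows top..bottom) of an RGB image given as an H x W x 3
    NumPy array. Returns a new array; the input is left untouched."""
    out = image.copy()
    out[top:bottom, :, :] = 0  # solid black box over the full band
    return out
```

A single black box over the eyes only, the most common technique found in the study, would correspond to a narrower band that leaves the eyebrows visible, which the facial recognition literature indicates is inadequate.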
Challenges in Developing Applications for Aging Populations
Elderly individuals can greatly benefit from the use of computer applications, which can assist in monitoring health conditions, staying in contact with friends and family, and even learning new things. However, developing accessible applications for elderly users can be a daunting task. Since the advent of the personal computer, the benefits and challenges of developing applications for older adults have been a frequent topic of discussion. In this chapter, the authors discuss the challenges faced by developers who wish to create applications for elderly computer users, including age-related impairments, generational differences in computer use, and the hardware constraints that mobile devices pose for application developers. Although these challenges are significant, each can be overcome once properly identified.
Fast computation of the performance evaluation of biometric systems: application to multibiometric
The performance evaluation of biometric systems is a crucial step when designing and evaluating such systems. The evaluation process uses the Equal Error Rate (EER) metric proposed by the International Organization for Standardization (ISO/IEC). The EER is a powerful metric that allows biometric systems to be compared and evaluated easily. However, computing the EER is, most of the time, very intensive. In this paper, we propose a fast method that computes an approximated value of the EER. We illustrate the benefit of the proposed method on two applications: the computation of non-parametric confidence intervals and the use of genetic algorithms to learn the parameters of fusion functions. Experimental results show the superiority of the proposed EER approximation method in terms of computing time, and its value in speeding up the learning of parameters with genetic algorithms. The proposed method opens new perspectives for the development of secure multibiometric systems by reducing their computation time.
Comment: Future Generation Computer Systems (2012)
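For readers unfamiliar with the metric, a baseline EER computation can be sketched as follows. This is the straightforward threshold-sweep approach whose cost motivates the paper's fast approximation, not the paper's method itself; the score ranges and the number of thresholds are illustrative choices.

```python
import numpy as np

def eer_baseline(genuine, impostor, n_thresholds=1000):
    """Baseline Equal Error Rate from similarity scores.

    genuine:  scores of comparisons between samples of the same person.
    impostor: scores of comparisons between different people.
    The EER is the operating point where the False Rejection Rate (FRR)
    equals the False Acceptance Rate (FAR).
    """
    genuine = np.asarray(genuine, dtype=float)
    impostor = np.asarray(impostor, dtype=float)
    lo = min(genuine.min(), impostor.min())
    hi = max(genuine.max(), impostor.max())
    thresholds = np.linspace(lo, hi, n_thresholds)
    # FRR: genuine scores falling below the acceptance threshold.
    frr = np.array([(genuine < t).mean() for t in thresholds])
    # FAR: impostor scores at or above the acceptance threshold.
    far = np.array([(impostor >= t).mean() for t in thresholds])
    # Take the midpoint at the closest approach of the two curves.
    i = np.argmin(np.abs(far - frr))
    return (far[i] + frr[i]) / 2.0
```

The loop over thresholds (and, in real evaluations, over large score sets and many bootstrap resamples for confidence intervals, or many candidate fusion parameters in a genetic algorithm) is exactly where the computation time the paper targets accumulates.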
Data standardization
With data rapidly becoming the lifeblood of the global economy, the ability to improve its use significantly affects both social and private welfare. Data standardization is key to facilitating and improving the use of data when data portability and interoperability are needed. Absent data standardization, a “Tower of Babel” of different databases may be created, limiting synergetic knowledge production. Based on interviews with data scientists, this Article identifies three main technological obstacles to data portability and interoperability: metadata uncertainties, data transfer obstacles, and missing data. It then explains how data standardization can remove at least some of these obstacles and lead to smoother data flows and better machine learning. The Article then identifies and analyzes additional effects of data standardization. As shown, data standardization has the potential to support a competitive and distributed data collection ecosystem and lead to easier policing in cases where rights are infringed or unjustified harms are created by data-fed algorithms. At the same time, increasing the scale and scope of data analysis can create negative externalities in the form of better profiling, increased harms to privacy, and cybersecurity harms. Standardization also has implications for investment and innovation, especially if lock-in to an inefficient standard occurs. The Article then explores whether market-led standardization initiatives can be relied upon to increase welfare, and what role government-facilitated data standardization should play, if any.
A Machine Learning Based Analytical Framework for Semantic Annotation Requirements
The Semantic Web is an extension of the current web in which information is given well-defined meaning. The aim of the Semantic Web is to improve the quality and intelligence of the current web by changing its contents into a machine-understandable form. Semantic-level information is therefore one of the cornerstones of the Semantic Web. The process of adding semantic metadata to web resources is called Semantic Annotation. There are many obstacles to Semantic Annotation, such as multilinguality, scalability, and issues related to the diversity and inconsistency of content across web pages. Because of the wide range of domains and the dynamic environments in which Semantic Annotation systems must operate, automating the annotation process is one of the significant challenges in this field. To overcome this problem, different machine learning approaches have been utilized: supervised learning, unsupervised learning, and more recent ones such as semi-supervised learning and active learning. In this paper, we present an inclusive layered classification of Semantic Annotation challenges and discuss the most important issues in this field. We also review and analyze machine learning applications for solving Semantic Annotation problems. To this end, the article closely studies and categorizes related research in order to reach a framework that maps machine learning techniques onto Semantic Annotation challenges and requirements.
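Of the learning paradigms listed above, semi-supervised learning is the one that most directly addresses the scarcity of labeled web content. A minimal self-training sketch, one common semi-supervised strategy, is shown below; the nearest-centroid classifier, the feature vectors, and the promotion margin are all toy assumptions, not a method from the paper.

```python
import math

def centroid(vectors):
    """Component-wise mean of a list of equal-length vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def dist(a, b):
    """Euclidean distance between two vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def self_train(labeled, unlabeled, rounds=3, margin=1.0):
    """Toy self-training loop for semantic annotation.

    labeled:   list of (feature_vector, label) pairs, e.g. snippets
               already tagged with a semantic class.
    unlabeled: list of feature vectors awaiting annotation.
    Each round, unlabeled points are classified by nearest class
    centroid; confident ones (distance margin between the best and
    second-best class) are promoted into the labeled set.
    """
    labeled = list(labeled)
    pool = list(unlabeled)
    for _ in range(rounds):
        cents = {lab: centroid([v for v, l in labeled if l == lab])
                 for lab in {l for _, l in labeled}}
        promoted = []
        for v in pool:
            ranked = sorted((dist(v, c), lab) for lab, c in cents.items())
            if len(ranked) == 1 or ranked[1][0] - ranked[0][0] >= margin:
                labeled.append((v, ranked[0][1]))  # confident: promote
                promoted.append(v)
        pool = [v for v in pool if v not in promoted]
        if not promoted:
            break  # nothing confident left; stop early
    return labeled
```

Active learning inverts this loop: instead of promoting the most confident points automatically, it sends the least confident ones to a human annotator.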
Towards robust and reliable multimedia analysis through semantic integration of services
Thanks to ubiquitous Web connectivity and portable multimedia devices, it has never been so easy to produce and distribute new multimedia resources such as videos, photos, and audio. This ever-increasing production leads to an information overload for consumers, which calls for efficient multimedia retrieval techniques. Multimedia resources can be efficiently retrieved using their metadata, but the multimedia analysis methods that can automatically generate this metadata are currently not reliable enough for highly diverse multimedia content. A reliable and automatic method for analyzing general multimedia content is needed. We introduce a domain-agnostic framework that annotates multimedia resources using currently available multimedia analysis methods. By using a three-step reasoning cycle, this framework can assess and improve the quality of multimedia analysis results by consecutively (1) combining analysis results effectively, (2) predicting which results might need improvement, and (3) invoking compatible analysis methods to retrieve new results. By using semantic descriptions for the Web services that wrap the multimedia analysis methods, compatible services can be automatically selected. By applying additional semantic reasoning to these descriptions, the services can be repurposed across different use cases. We evaluated this problem-agnostic framework in the context of video face detection and showed that it is capable of providing the best analysis results regardless of the input video. The proposed methodology can serve as a basis for a generic multimedia annotation platform that returns reliable results for diverse multimedia analysis problems. This allows for better metadata generation and improves the efficiency of multimedia retrieval.
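The three-step reasoning cycle can be sketched in miniature. The structure below is an illustrative reading of the abstract, with strong simplifying assumptions: service compatibility is reduced to a media-type match (standing in for the semantic matchmaking over service descriptions), combination is a best-confidence pick, and the quality threshold is invented.

```python
def reasoning_cycle(resource, services, threshold=0.8):
    """Toy combine / predict / invoke loop over analysis services.

    resource: dict with at least a 'type' key, e.g. {"type": "video"}.
    services: list of (name, fn, media_type); fn(resource) returns a
              dict with 'label' and 'confidence'. Compatibility here is
              a plain media-type match, a stand-in for the semantic
              service selection described in the abstract.
    """
    results = []
    pending = [s for s in services if s[2] == resource["type"]]
    while pending:
        name, fn, _ = pending.pop(0)
        results.append(fn(resource))                      # (3) invoke
        best = max(results, key=lambda r: r["confidence"])  # (1) combine
        if best["confidence"] >= threshold:               # (2) predict
            return best  # good enough; no further services needed
    # Compatible services exhausted: return the best result obtained.
    return max(results, key=lambda r: r["confidence"]) if results else None
```

The cycle stops invoking services as soon as the combined result is predicted to be good enough, which is what lets such a framework pick the best-suited analysis method per input rather than running everything on every resource.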
Regulatory Cooperation, Regional Trade Agreements, and World Trade Law: Conflict or Complementarity?
Today, the growing and aging population and the rise of new global threats to human health place increasing demands on the healthcare system and call for preventive action. To make existing medical treatments more efficient and widely accessible, and to prevent the emergence of new threats such as drug-resistant bacteria, improved diagnostic technologies are needed. Potential solutions to these medical challenges could come from the development of novel lab-on-chip (LoC) devices for point-of-care (PoC) diagnostics. At the same time, the increasing demand for sustainable energy calls for novel approaches to energy conversion and storage (ECS) systems, to which micro- and nanotechnologies could also contribute. The objective of this thesis is to contribute to these developments; it presents the results of interdisciplinary research at the crossing of three disciplines of physics and engineering: electrokinetic transport in fluids, manufacturing of micro- and nanofluidic systems, and surface control and modification. By combining knowledge from each of these disciplines, novel solutions and functionalities were developed at the macro-, micro-, and nanoscale, towards applications in PoC diagnostics and ECS systems. At the macroscale, electrokinetic transport was applied to the development of a novel PoC sampler for the efficient capture of exhaled breath aerosols onto a microfluidic platform. At the microscale, several methods for polymer micromanufacturing and surface modification were developed. Using direct photolithography in off-stoichiometry thiol-ene (OSTE) polymers, a novel manufacturing method for mold-free rapid prototyping of microfluidic devices was developed. An investigation of the photolithography of OSTE polymers revealed that a novel photopatterning mechanism arises from the off-stoichiometric polymer formulation.
Using photografting on OSTE surfaces, a novel surface modification method was developed for photopatterning the surface energy. Finally, a novel method was developed for single-step microstructuring and micropatterning of surface energy, using a molecular self-alignment process that results in the replica spontaneously mimicking the surface energy of the mold. At the nanoscale, several solutions for the study of electrokinetic transport towards selective biofiltration and energy conversion were developed. A novel, comprehensive model was developed for electrostatic gating of electrokinetic transport in nanofluidics. A novel method for manufacturing electrostatically gated nanofluidic membranes was developed, using atomic layer deposition (ALD) in deep anodic aluminum oxide (AAO) nanopores. Finally, a preliminary investigation of the nanopatterning of OSTE polymers was performed for the manufacturing of polymer nanofluidic devices.