8 research outputs found

    Complexity of hyperconcepts

    Get PDF
    In machine learning, maximizing the sample margin can reduce the learning generalization error. Samples on which the target function has a large margin (γ) convey more information since they yield more accurate hypotheses. Let X be a finite domain and S denote the set of all samples S ⊆ X of fixed cardinality m. Let H be a class of hypotheses h on X. A hyperconcept h′ is defined as an indicator function for a set A ⊆ S of all samples on which the corresponding hypothesis h has a margin of at least γ. An estimate on the complexity of the class H′ of hyperconcepts h′ is obtained with explicit dependence on γ, the pseudo-dimension of H, and m.
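    The definition in the abstract can be made concrete with a small sketch. The linear hypothesis class, the two-dimensional domain, and the signed-distance margin used below are illustrative assumptions, not taken from the paper; the paper treats general hypothesis classes with a pseudo-dimension.

```python
# Illustrative sketch of a "hyperconcept": the indicator over samples S ⊆ X
# marking those on which hypothesis h attains margin at least gamma.
# Linear separators and the signed-distance margin are assumptions made
# here for concreteness; the abstract's setting is more general.
import math
from itertools import combinations

def margin(h, sample):
    """Smallest signed distance of the linear separator h = (w, b) to the
    labelled points in `sample` (labels are stored with the points)."""
    w, b = h
    norm = math.hypot(*w)
    return min(
        y * (w[0] * x[0] + w[1] * x[1] + b) / norm
        for (x, y) in sample
    )

def hyperconcept(h, gamma):
    """Return the indicator h' : S -> {0, 1} for hypothesis h at level gamma."""
    return lambda sample: 1 if margin(h, sample) >= gamma else 0

# Finite domain X of labelled points; S = all samples of fixed cardinality m = 2.
X = [((0.0, 2.0), 1), ((0.0, -2.0), -1), ((0.0, 0.5), 1)]
h = ((0.0, 1.0), 0.0)               # separator: sign of the y-coordinate
h_prime = hyperconcept(h, gamma=1.0)

samples = list(combinations(X, 2))
print([h_prime(s) for s in samples])  # → [1, 0, 0]
```

    Only the first sample (the two points at distance 2 from the separator) is in the set A for γ = 1; the other samples contain the point at distance 0.5 and fall outside it.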

    Service specification and service compliance: How to consider the responsibility dimension?

    Get PDF
    Service engineering is a broad research topic that addresses the specification, compliance, and sharing of business and IT services across companies, institutions, and governmental organizations. Despite the many advantages of working with services, guaranteeing service compliance and managing service overlaps among stakeholders remain challenging. The objective of this document is to present a methodological approach for specifying the links between the organizational layer and the informational layer of services. Our research has therefore focused on clarifying the responsibility dimension of the stakeholders involved in those services. The proposed approach is illustrated with an example in the context of sensitive data exchange between stakeholders from the healthcare domain.

    Services Specification and Services Compliance, How to Consider the Responsibility Dimension?

    Get PDF
    Service engineering is a broad research topic that addresses the specification, compliance, and sharing of business and IT services across companies, institutions, and governmental organizations. Despite the many advantages of working with services, guaranteeing service compliance and managing service overlaps among stakeholders remain challenging. The objective of this document is to present a methodological approach for specifying the links between the organizational layer and the informational layer of services. Our research has therefore focused on clarifying the responsibility dimension of the stakeholders involved in those services. The proposed approach is illustrated with an example in the context of sensitive data exchange between stakeholders from the healthcare domain.

    Author index

    Get PDF

    Abstract Complexity of Hyperconcepts

    No full text
    In machine learning, maximizing the sample margin can reduce the learning generalization error. Samples on which the target function has a large margin (γ) convey more information since they yield more accurate hypotheses. Let X be a finite domain and S denote the set of all samples S ⊆ X of fixed cardinality m. Let H be a class of hypotheses h on X. A hyperconcept h′ is defined as an indicator function for a set A ⊆ S of all samples on which the corresponding hypothesis h has a margin of at least γ. An estimate on the complexity of the class H′ of hyperconcepts h′ is obtained with explicit dependence on γ, the pseudo-dimension of H, and m. Key words: Sample margin, Learning complexity, Pseudo-dimension

    Complexity of Hyperconcepts Joel Ratsaby

    No full text
    In machine learning, maximizing the sample margin can reduce the learning generalization error. Samples on which the target function has a large margin (γ) convey more information since they yield more accurate hypotheses. Let X be a finite domain and S denote the set of all samples S ⊆ X of fixed cardinality m. Let H be a class of hypotheses h on X. A hyperconcept h′ is defined as an indicator function for a set A ⊆ S of all samples on which the corresponding hypothesis h has a margin of at least γ. An estimate on the complexity of the class H′ of hyperconcepts h′ is obtained with explicit dependence on γ, the pseudo-dimension of H, and m. Key words: Sample dependent error bounds, Large margin samples, Learning complexity, Pseudo-dimension