
    Intangible trust requirements - how to fill the requirements trust "gap"?

    Previous research has focused on capturing and instantiating "soft" trust requirements relating to HCI usability concerns, or on "hard" tangible security requirements relating primarily to security assurance and security protocols. Little direct attention has been paid to managing intangible trust-related requirements per se. This 'gap' is perhaps most evident in the public B2C (Business to Consumer) e-systems we all use on a daily basis. Some speculative suggestions are made as to how to fill the 'gap': visual card sorting is proposed as a suitable evaluative tool, whilst deontic logic trust norms and extended UML notation are suggested as (methodologically invariant) means by which software development teams can more fully capture, and hence visualize, intangible trust requirements.
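    As a purely illustrative sketch (the predicates below are hypothetical and not taken from the paper), an intangible trust norm for a B2C checkout could be written with the standard deontic operators O (obligation), P (permission) and F (prohibition):

```latex
% Hypothetical deontic trust norms for a B2C e-commerce interaction:
% the vendor must disclose its privacy policy, must not share customer data
% with third parties, and the customer may cancel an order.
\begin{align*}
  & O\,\bigl(\mathrm{DisclosePrivacyPolicy}(\mathit{vendor})\bigr) \\
  \land\; & F\,\bigl(\mathrm{ShareCustomerData}(\mathit{vendor}, \mathit{thirdParty})\bigr) \\
  \land\; & P\,\bigl(\mathrm{CancelOrder}(\mathit{customer})\bigr)
\end{align*}
```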

    A Web Smart Space Framework for Intelligent Search Engines

    A web smart space is an intelligent environment with the additional capability of searching information smartly and efficiently. New advancements such as dynamic web content generation have increased the size of web repositories. Among the many modern software analysis requirements, one is to search information from a given repository, but useful information extraction is difficult because of the multilingual base of the web data collection. Semantic-based information searching has become a sticking point due to inconsistencies and variations in the characteristics of the data. In this research, a web smart space framework is proposed which introduces front-end processing for a search engine to make the information retrieval process more intelligent and accurate. In conventional search architectures, searching is performed only by pattern matching, and consequently a large number of irrelevant results are generated. The proposed framework is designed to address this drawback and return more relevant results. The framework takes text input from the user in the form of a complete question, understands the input and generates its meaning; the search engine then searches on the basis of the information provided.
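    For illustration only, the kind of front-end question processing described above might resemble the following minimal sketch; the stop-word list and synonym table are assumptions, not the paper's actual framework, which a real system would replace with a proper lexical resource.

```python
import re

# Hypothetical synonym table; a real front end would use a lexical resource such as WordNet.
SYNONYMS = {
    "buy": ["purchase", "order"],
    "cheap": ["inexpensive", "low-cost"],
}

STOP_WORDS = {"how", "do", "i", "a", "the", "can", "where", "to"}


def preprocess_question(question: str) -> list[str]:
    """Turn a full natural-language question into an expanded keyword query."""
    tokens = re.findall(r"[a-z]+", question.lower())
    keywords = [t for t in tokens if t not in STOP_WORDS]
    expanded = []
    for word in keywords:
        expanded.append(word)
        expanded.extend(SYNONYMS.get(word, []))  # add synonyms to widen the match
    return expanded


if __name__ == "__main__":
    # The expanded query is what would be handed to the pattern-matching back end.
    print(preprocess_question("How do I buy a cheap laptop?"))
    # ['buy', 'purchase', 'order', 'cheap', 'inexpensive', 'low-cost', 'laptop']
```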

    Exact Requirements Engineering for Developing Business Process Models

    Process modeling is a suitable tool for improving business processes, and successful process modeling strongly depends on correct requirements engineering. In this paper, we propose a combined approach to requirements elicitation for developing business models. To do this, the BORE (Business-Oriented Requirements Engineering) method is used as the basis of our work and is enriched with important features of the BDD (Business-Driven Development) method, in order to make the proposed approach suitable for modeling more complex processes. As the main result, our method yields exact requirements elicitation that fits the customers' needs, and it also lets us avoid rework in process modeling. We conduct a case study on the paper submission and publication system of a journal. The results of this study not only provide a good experience of applying the proposed approach to a real-world web-based system, but also confirm the suitability of the approach for modeling complex systems with many sub-processes and complicated relationships. Comment: (IEEE) 3rd International Conference on Web Research

    Moving from Data-Constrained to Data-Enabled Research: Experiences and Challenges in Collecting, Validating and Analyzing Large-Scale e-Commerce Data

    Widespread e-commerce activity on the Internet has led to new opportunities to collect vast amounts of micro-level market and nonmarket data. In this paper we share our experiences in collecting, validating, storing and analyzing large Internet-based data sets in the areas of online auctions, music file sharing and online retailer pricing. We demonstrate how such data can advance knowledge by facilitating sharper and more extensive tests of existing theories and by offering observational underpinnings for the development of new theories. Just as experimental economics pushed the frontiers of economic thought by enabling the testing of numerous theories of economic behavior in the environment of a controlled laboratory, we believe that observing, often over extended periods of time, real-world agents participating in market and nonmarket activity on the Internet can lead us to develop and test a variety of new theories. Internet data gathering is not controlled experimentation: we cannot randomly assign participants to treatments or determine event orderings. It does, however, offer potentially large data sets with repeated observation of individual choices and actions, and automated data collection holds promise for greatly reduced cost per observation. Our methods rely on technological advances in automated data collection agents. Significant challenges remain in developing appropriate sampling techniques, integrating data from heterogeneous sources in a variety of formats, constructing generalizable processes, and understanding legal constraints. Despite these challenges, the early evidence from those who have harvested and analyzed large amounts of e-commerce data points toward a significant leap in our ability to understand the functioning of electronic commerce. Comment: Published at http://dx.doi.org/10.1214/088342306000000231 in Statistical Science (http://www.imstat.org/sts/) by the Institute of Mathematical Statistics (http://www.imstat.org).
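    A minimal sketch of an automated data collection agent of the kind described, in Python; the target URL, polling interval and stored fields are placeholders, not the authors' actual tooling, and a real agent would parse the page into structured records rather than store raw snapshots.

```python
import csv
import time
import urllib.request

# Placeholder endpoint; a real agent would target a specific auction or retailer page.
TARGET_URL = "https://example.com/"
POLL_INTERVAL_SECONDS = 3600  # one observation per hour


def fetch_page(url: str) -> str:
    """Download one snapshot of the target page."""
    with urllib.request.urlopen(url, timeout=30) as response:
        return response.read().decode("utf-8", errors="replace")


def collect(n_snapshots: int, out_path: str = "observations.csv") -> None:
    """Repeatedly sample the page and append timestamped raw observations."""
    with open(out_path, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(n_snapshots):
            html = fetch_page(TARGET_URL)
            writer.writerow([time.time(), len(html), html[:200]])
            time.sleep(POLL_INTERVAL_SECONDS)  # low cost per observation, but stay polite


if __name__ == "__main__":
    collect(n_snapshots=24)  # roughly one day of hourly snapshots
```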

    Stigmergy in Web 2.0: a model for site dynamics

    Building Web 2.0 sites does not in itself ensure a site's success. We aim to better understand what makes a site successful by drawing insight from biologically inspired design patterns. Web 2.0 sites provide a mechanism for human interaction, enabling powerful intercommunication between massive volumes of users. Early Web 2.0 site providers that were previously dominant are being succeeded by newer sites providing innovative social interaction mechanisms. Understanding which site traits contribute to this success drives research into Web site mechanics, using models to describe the associated social networking behaviour. Some of these models attempt to show how the volume of users provides self-organisation and self-contextualisation of content. One model describing such coordinated environments is stigmergy, a term originally describing coordinated insect behaviour. This paper explores how exploiting stigmergy can provide a valuable mechanism for identifying and analysing online user behaviour, particularly given that user freedom of choice is restricted by the provided web site functionality. This will aid in building better collaborative Web sites and improving collaborative processes.
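    For illustration, a toy stigmergy simulation along the lines sketched above might look like the following; the deposit and decay constants are assumed values, not parameters from the paper. Items that attract early visits accumulate "marks" and draw further visits, so a few items self-organise to dominance.

```python
import random

# Toy stigmergy model: each content item carries a "mark" level that visiting
# users reinforce and that decays over time.
DECAY = 0.95    # fraction of the mark retained each step (assumed value)
DEPOSIT = 1.0   # mark added by each visiting user (assumed value)


def step(marks: list[float], n_users: int) -> list[float]:
    """One step: users pick items in proportion to current marks, then marks decay."""
    total = sum(marks)
    for _ in range(n_users):
        r = random.uniform(0, total)   # roulette-wheel choice weighted by marks
        cumulative = 0.0
        for i, m in enumerate(marks):
            cumulative += m
            if r <= cumulative:
                marks[i] += DEPOSIT    # the visit reinforces the item's mark
                break
    return [m * DECAY for m in marks]


if __name__ == "__main__":
    marks = [1.0] * 10                 # ten initially indistinguishable items
    for _ in range(100):
        marks = step(marks, n_users=50)
    print(sorted(round(m, 1) for m in marks))  # a handful of items end up dominating
```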

    Austrian higher education institutions' idiosyncrasies and technology transfer system

    The aim of this paper is to present the findings of PhD research (Heinzl, 2007) conducted on the Universities of Applied Sciences in Austria. The research establishes an idiosyncrasy model for the Universities of Applied Sciences in Austria, showing the effects of their idiosyncrasies on their ability to conduct technology transfer successfully. The study is centred on qualitative methods, as the major emphasis is placed on theory building, and pursues a stepwise approach to establishing the idiosyncrasy model. In the first step, an initial technology transfer model and a list of idiosyncrasies are established based on a synthesis of findings from secondary research. In the second step, these findings are enhanced by means of empirical research, including problem-centred expert interviews, a focus group and participant observation. In the third step, the idiosyncrasies are matched with the factors conducive to technology transfer; focused interviews were conducted for this purpose. The findings show that the idiosyncrasies of Universities of Applied Sciences have remarkable effects on their technology transfer abilities. This paper presents four of the models that emerge from the PhD research: the Generic Technology Transfer Model (Section 5.1); the Idiosyncrasies Model for the Austrian Universities of Applied Sciences (Section 5.2); the Idiosyncrasies-Technology Transfer Effects Model (Section 5.3); and the Idiosyncrasies-Technology Transfer Cumulated Effects Model (Section 5.3). The primary and secondary research methods employed are literature survey, focus groups, participant observation and interviews. The findings contribute to the conceptual design of a technology transfer system aimed at enhancing higher education institutions' technology transfer performance.

    Markov Decision Processes with Applications in Wireless Sensor Networks: A Survey

    Wireless sensor networks (WSNs) consist of autonomous, resource-limited devices that cooperate to monitor one or more physical phenomena within an area of interest. WSNs operate as stochastic systems because of randomness in the monitored environments. For long service time and low maintenance cost, WSNs require adaptive and robust methods to address data exchange, topology formulation, resource and power optimization, sensing coverage and object detection, and security challenges. In these problems, sensor nodes must make optimized decisions from a set of accessible strategies to achieve design goals. This survey reviews numerous applications of the Markov decision process (MDP) framework, a powerful decision-making tool for developing adaptive algorithms and protocols for WSNs. Furthermore, various solution methods are discussed and compared to serve as a guide for using MDPs in WSNs.
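    As an illustration of the MDP framework applied to a single sensor node (not an example taken from the survey), the following sketch solves a toy sleep/transmit decision problem with standard value iteration; the states, rewards and transition probabilities are all assumed values.

```python
# Toy MDP for one sensor node: states are battery levels, actions are "sleep" or "transmit".
# All probabilities and rewards below are illustrative assumptions.

STATES = ["low", "medium", "high"]   # battery level
ACTIONS = ["sleep", "transmit"]
GAMMA = 0.9                          # discount factor

# P[s][a] = list of (probability, next_state, reward)
P = {
    "high":   {"sleep":    [(1.0, "high", 0.0)],
               "transmit": [(0.8, "medium", 1.0), (0.2, "high", 1.0)]},
    "medium": {"sleep":    [(0.5, "medium", 0.0), (0.5, "high", 0.0)],
               "transmit": [(0.8, "low", 1.0), (0.2, "medium", 1.0)]},
    "low":    {"sleep":    [(1.0, "medium", 0.0)],
               "transmit": [(1.0, "low", -2.0)]},  # transmitting on low battery is penalised
}


def value_iteration(tol: float = 1e-6) -> tuple[dict, dict]:
    """Value iteration: V(s) = max_a sum_s' P(s'|s,a) * (r + gamma * V(s'))."""
    V = {s: 0.0 for s in STATES}
    while True:
        delta = 0.0
        for s in STATES:
            q = {a: sum(p * (r + GAMMA * V[s2]) for p, s2, r in P[s][a]) for a in ACTIONS}
            best = max(q.values())
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            break
    # Greedy policy with respect to the converged value function.
    policy = {s: max(ACTIONS, key=lambda a: sum(p * (r + GAMMA * V[s2]) for p, s2, r in P[s][a]))
              for s in STATES}
    return V, policy


if __name__ == "__main__":
    values, policy = value_iteration()
    print(policy)  # e.g. transmit while the battery lasts, sleep (recharge) when low
```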