
    Improving the Canny Edge Detector Using Automatic Programming: Improving Hysteresis Thresholding

    We have used automatic programming to improve the hysteresis thresholding stage of the popular Canny edge detector, without increasing the computational complexity or adding extra information. The F-measure increased by 1.8% on a test set of natural images, and a paired Student's t-test and a Wilcoxon signed-rank test show that the improvement is statistically significant. This is the first time evolutionary computation and automatic programming have been used to improve hysteresis thresholding. The new program introduces complex recursive patterns that make the algorithm perform better on weak edges and retain more detail. The findings provide further evidence that an automatically designed algorithm can outperform manually created algorithms on low-level image analysis problems, and that automatic programming is well suited to inferring suitable heuristics for such problems.
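For context, the baseline hysteresis stage that the evolved program competes with can be sketched as follows. This is a generic textbook version (the flood-fill strategy, 8-connectivity, and the example thresholds are standard choices, not taken from the paper):

```python
from collections import deque

def hysteresis_threshold(gradient, low, high):
    """Classic hysteresis thresholding: pixels with gradient magnitude
    >= high are strong edges; pixels between low and high are kept only
    if they connect (8-neighbourhood) to a strong edge via other
    above-low pixels."""
    rows, cols = len(gradient), len(gradient[0])
    edges = [[False] * cols for _ in range(rows)]
    # Seed the search with every strong pixel.
    queue = deque((r, c) for r in range(rows) for c in range(cols)
                  if gradient[r][c] >= high)
    for r, c in queue:
        edges[r][c] = True
    # Flood-fill outward through weak-but-connected pixels.
    while queue:
        r, c = queue.popleft()
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols
                        and not edges[nr][nc]
                        and gradient[nr][nc] >= low):
                    edges[nr][nc] = True
                    queue.append((nr, nc))
    return edges

# Toy gradient map: 0.9 is strong; 0.6 and 0.5 survive only because
# they connect to it; 0.1 and 0.2 fall below low and are discarded.
grad = [[0.1, 0.6, 0.9],
        [0.0, 0.5, 0.2],
        [0.0, 0.0, 0.0]]
edges = hysteresis_threshold(grad, low=0.4, high=0.8)
```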

    Towards Secure Data Flow Oriented Multi-Vendor ICT Governance Model

    Today, ICT governance is still regarded as a departmental concern rather than an organization-wide one. History has shown that department-based implementation strategies result in fragmented implementations, leading to ad hoc solutions with no central control and to stagnation of the in-house ICT strategy. This has recently fuelled an opinion trend: many describe the ICT department as redundant, a dying breed that should be replaced by on-demand, specialized external services. Clearly, ever-changing surroundings force organizations to adapt their ICT plans faster than most organizations are currently able to. As a result, ICT departments tend to be reactive rather than proactively taking the lead in the accelerating transformation in which organizations find themselves. At the same time, the monolithic systems of the 1980s and 1990s often dominate an organization, consume too much of the yearly IT budget, and leave healthy system development behind. These systems were designed before data became an all-encompassing organizational resource; they were designed more or less in isolation from their surrounding environment. Such solutions make data sharing costly and far from optimal. Additionally, in striving to adapt to the organization's evolution, the initial architecture has become disrupted and rebuilt in fragments. On top of this, on May 25, 2018, the upgraded EU privacy regulation, the General Data Protection Regulation (GDPR), comes into force. It substantially strengthens the 1995 data protection rules and will profoundly affect EU organizations.
    Among other things, the regulation limits the right to collect and process personal data and gives the data subject full rights to his or her data, independent of where the data has been collected and by whom. Such regulation forces data-collecting and data-processing organizations to have total control over any personal data they collect and process. This includes a detailed understanding of data flows: who did what, when, and under whose authorization, and how data is transported and stored. Data/information flow maps become a mandatory part of the system documentation, encompassing all systems, including outsourced ones such as cloud services. Hence, individual departments can no longer claim to "own" data. Further, since the mid-2000s we have seen global inter-organizational data integration, independent of whether organizations are public or private. If this integration ceases to exist, the organization's survival is threatened. Additionally, if the organization fails to provide transparent documentation in accordance with the GDPR, substantial economic risk is at stake. The discussion about the ICT department's demise is therefore inapt. Any organizational change will require costly and time-consuming ICT development efforts to adapt to today's legislation. Further, since data is nowadays interconnected and transformed at all levels, interacting at multiple intersections across the organization, and becoming a unified basis for all operative decisions, an organization-wide ICT governance model is required.

    THEATRE OF CREATION: INDUSTRY ANALYSTS AS PROPAGATORS OF INFORMATION TECHNOLOGY FRAMEWORKS

    Industry analysts have received increased attention in information systems research over the past few years. Acknowledged as propagators in the diffusion of information technology (IT) innovations, they provide their clients and the neighboring industry community with definitions, tools, and frameworks intended to aid both managerial decision-making and operations. They have been described as 'promissory organizations', building revenue by selling future-oriented promises and expectations. At the same time, they are producers of artefacts with a performative effect, changing the way decisions are made and operations are organized. This paper addresses the research question of how industry analysts propagate IT frameworks. It is answered through an ethnographic study conducted at one of the largest annual gatherings of industry analysts and industry leaders in 2011. The story that unfolds highlights aspects of co-creation taking place in one studied session, introducing a new perspective on the propagation of IT frameworks, as well as the need for further studies of industry analyst gatherings.

    The Swedish Model in Historical Context

    The Breakthrough of Industrialization in Sweden. New Directions in the Debate and Research

    I. The industrial revolution in Sweden has essentially been viewed as a kind of "big bang". If one follows the model proposed by Gerschenkron, Sweden needed a higher growth rate to achieve its "take-off" because it was among the latecomers in the race to industrialization. According to this general model, which was highly influential from the 1960s onwards, it was also entirely logical to consider that an external demand..

    Edge Pixel Classification Using Automatic Programming

    We have considered edge detection as a classification problem; we have applied two popular machine learning techniques to the problem and compared their best results to that of automatic programming. We show that ADATE, our system for automatic programming, is capable of producing solutions that are as good as, or better than, the best solutions generated by the two other machine learning techniques. The results demonstrate the ability of the ADATE system to create powerful heuristics for solving image analysis problems.
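The classification framing can be illustrated with a deliberately simple sketch: each pixel is described by a feature vector (here a single gradient-magnitude feature) and a trained classifier labels it edge or non-edge. The feature choice and the nearest-centroid classifier are illustrative stand-ins, not the techniques evaluated in the paper:

```python
def train_centroids(features, labels):
    """Fit a nearest-centroid classifier: one mean feature vector per class."""
    sums, counts = {}, {}
    for f, y in zip(features, labels):
        s = sums.setdefault(y, [0.0] * len(f))
        for i, v in enumerate(f):
            s[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in s] for y, s in sums.items()}

def classify(centroids, f):
    """Label a pixel's feature vector with the class of the nearest centroid."""
    def dist2(y):
        return sum((a - b) ** 2 for a, b in zip(f, centroids[y]))
    return min(centroids, key=dist2)

# Toy training set: one |gradient| feature per pixel; label 1 = edge pixel.
feats = [(0.9,), (0.8,), (0.1,), (0.05,)]
labels = [1, 1, 0, 0]
model = train_centroids(feats, labels)
# classify(model, (0.7,)) -> 1 (edge), classify(model, (0.1,)) -> 0 (non-edge)
```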

    Game mechanics engine

    Game logic and game rules exist in all computer games, but they are created differently for every game engine. This engine dependency arises from how the internal object model is implemented in the engine, as a place where game logic data is intermingled with the data needed by the low-level subsystems. This thesis proposes a game object model design, based on existing theory, that removes this dependency and establishes a general game logic framework. The thesis then builds on this logic framework and existing engine design theory to create the concept of a genre-independent engine that can provide an alternative to the conventional game engine. This new genre-independent alternative is referred to as a game mechanics engine.
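The separation the thesis argues for can be sketched as a property-centric object model plus a generic rule layer. The class names and API here are hypothetical illustrations of that idea, not the thesis's actual design:

```python
class GameObject:
    """Engine-agnostic game object: logic-facing state lives in a generic
    property dictionary instead of being intermingled with the data the
    low-level subsystems (rendering, physics, ...) need."""
    def __init__(self, obj_id, **properties):
        self.id = obj_id
        self.properties = dict(properties)

class RuleEngine:
    """Generic game-logic layer: rules are (condition, action) pairs over
    object properties, so the same rules can run on any underlying engine."""
    def __init__(self):
        self.rules = []

    def add_rule(self, condition, action):
        self.rules.append((condition, action))

    def update(self, objects):
        for obj in objects:
            for condition, action in self.rules:
                if condition(obj):
                    action(obj)

# Example rule: an object whose health reaches zero is marked dead.
engine = RuleEngine()
engine.add_rule(
    lambda o: o.properties.get("health", 1) <= 0,
    lambda o: o.properties.__setitem__("alive", False),
)
player = GameObject("player", health=0, alive=True)
engine.update([player])
# player.properties["alive"] is now False
```

Because the rule layer only touches `properties`, the same rules could drive objects backed by entirely different rendering or physics subsystems, which is the genre- and engine-independence the abstract describes.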

    Evolutionary Optimization of Artificial Neural Networks and Tree-Based Ensemble Models for Diagnosing Deep Vein Thrombosis

    Machine learning algorithms, particularly artificial neural networks, have shown promise in healthcare for disease classification, including diagnosing conditions like deep vein thrombosis. However, the performance of artificial neural networks in medical diagnosis heavily depends on their architecture and hyperparameter configuration, which presents virtually unlimited variations. This work employs evolutionary algorithms to optimize hyperparameters for three classic feed-forward artificial neural networks of pre-determined depths. The objective is to enhance the diagnostic accuracy of the classic neural networks in classifying deep vein thrombosis using electronic health records sourced from a Norwegian hospital. The work compares the predictive performance of conventional feed-forward artificial neural networks with standard tree-based ensemble methods previously successful in disease prediction on the same dataset. Results indicate that while classic neural networks perform comparably to tree-based methods, they do not surpass them in diagnosing thrombosis on this specific dataset. The efficacy of evolutionary algorithms in tuning hyperparameters is highlighted, emphasizing the importance of choosing the optimization technique to maximize machine learning models' diagnostic accuracy.
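Evolutionary hyperparameter tuning of the kind the abstract describes can be sketched as a minimal loop over real-valued hyperparameters. The genome encoding, truncation selection, Gaussian mutation, and the smooth surrogate fitness below are illustrative assumptions, not the paper's actual setup (where fitness would be a model's validation score):

```python
import random

def evolve(fitness, bounds, pop_size=20, generations=30, seed=0):
    """Minimal evolutionary hyperparameter search.
    bounds maps hyperparameter name -> (low, high); genomes are dicts."""
    rng = random.Random(seed)
    names = sorted(bounds)

    def random_genome():
        return {n: rng.uniform(*bounds[n]) for n in names}

    def mutate(parent):
        # Gaussian mutation, clipped back into the legal range.
        child = {}
        for n in names:
            lo, hi = bounds[n]
            v = parent[n] + rng.gauss(0, 0.1 * (hi - lo))
            child[n] = min(max(v, lo), hi)
        return child

    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        # Truncation selection: keep the fitter half, refill with mutants.
        population.sort(key=fitness, reverse=True)
        parents = population[:pop_size // 2]
        population = parents + [mutate(rng.choice(parents)) for _ in parents]
    return max(population, key=fitness)

# Hypothetical smooth surrogate for validation accuracy, peaking at lr = 0.01.
def surrogate_fitness(genome):
    return -(genome["lr"] - 0.01) ** 2

best = evolve(surrogate_fitness, {"lr": (0.0001, 0.1)})
# best["lr"] ends up close to the optimum 0.01
```

In a real experiment, `surrogate_fitness` would be replaced by training the network with the genome's hyperparameters and returning its validation performance, which makes each fitness evaluation expensive and is exactly why cheap selection/mutation machinery like this is attractive.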