
    Soft set theory based decision support system for mining electronic government dataset

    Electronic government (e-gov) is applied to support performance and create more efficient and effective public services. Grouping data using soft set theory can serve as a decision-making technique for determining the maturity level of e-government use. So far, the uncertainty in the data obtained through questionnaires has not been fully exploited as a reference for government in setting the direction of future e-gov development policy. This study presents the maximum attribute relative (MAR) method, based on soft set theory, to classify attribute options. The results show that facilitating conditions (FC) are the variable with the greatest influence on people's use of e-government, followed by performance expectancy (PE) and system quality (SQ). These results provide useful information for decision makers when making policies for their citizens, and they potentially provide recommendations on how to design and develop e-government systems that improve public services.
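The abstract does not spell out the MAR computation, but the general shape of a soft-set attribute-ranking step can be sketched. Below is a minimal, hypothetical Python illustration in which the soft set is a binary respondents-by-attributes table and each attribute is scored by its support relative to the strongest attribute; the scoring rule and the data are assumptions for illustration, not the paper's exact method.

```python
# Hypothetical sketch of soft-set attribute ranking in the spirit of MAR.
# The soft set is a binary table (respondents x attributes); the scoring
# rule below (support relative to the best-supported attribute) is an
# assumed stand-in for the paper's actual MAR formula.
responses = {
    "r1": {"FC": 1, "PE": 1, "SQ": 1},
    "r2": {"FC": 1, "PE": 1, "SQ": 0},
    "r3": {"FC": 1, "PE": 1, "SQ": 1},
    "r4": {"FC": 1, "PE": 0, "SQ": 0},
}

def attribute_relative(table):
    """Score each attribute by its support relative to the strongest one."""
    support = {}
    for row in table.values():
        for attr, value in row.items():
            support[attr] = support.get(attr, 0) + value
    best = max(support.values())
    return {attr: s / best for attr, s in support.items()}

# Rank attributes; on the paper's dataset the reported order is FC > PE > SQ.
for attr, score in sorted(attribute_relative(responses).items(),
                          key=lambda kv: -kv[1]):
    print(attr, round(score, 2))
```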

    Computational virtual measurement for trees

    National forest inventory (NFI) is a systematic sampling method for collecting forest information, including tree parameters, site conditions, and auxiliary data. Sample plot measurement is the key work in NFI. However, compared with the techniques of 100 years ago, measuring methods and data-processing (modeling) approaches for NFI sample plots have improved only to a minor extent. The limitation has been that newly developed methods introduce additional validation workflows and would increase the workload in NFI, because these methods are usually developed on the basis of species-specific and site-specific strategies. To overcome these obstacles, there is an urgent need to integrate novel measuring instruments, e.g., light detection and ranging (LiDAR), and the corresponding data-processing methods into NFI. Given this situation, this thesis proposes a novel computational virtual measurement (CVM) method for determining tree parameters without the need for validation. Primarily, CVM is a physical simulation method that works as a virtual measuring instrument: it measures raw data, e.g., LiDAR point clouds and tree models, by simulating the physical mechanisms of measuring instruments and natural phenomena. Based on the theory of CVM, this thesis systematically describes how to develop virtual measuring instruments. The first part introduces the CVM theory. CVM is a conceptual and general methodology, as distinct from any specific measurement of tree parameters. The feasibility of CVM was then tested using a conceptual implementation, a virtual ruler. The development of the virtual ruler demonstrated two key differences between CVM and conventional modeling methods. First, the research focus of CVM is to build an appropriate physical scenario rather than to find a mathematical relationship between modeling results and true values. Second, CVM outputs can approach true values, whereas modeling results cannot. Consequently, in a virtual space, tree parameters are determined by a measuring process without mathematical prediction; the result is therefore free of validation and can be regarded as a true value, at least in virtual spaces. With the knowledge gained from developing the virtual ruler, two further implementations were developed: the virtual water displacement (VWD) method and the sunlight analysis method. Both employ the same CVM workflow, in which measurement takes place first in reality and then in virtual space. VWD aims to measure point clouds virtually by simulating the water displacement method used in reality. There are two stages in this method. The first stage applies a simulation of water displacement using massive numbers of virtual water molecules (VWMs); some empirical regressions have to be employed at this stage because of the limits of computer performance. In the second stage, a single VWM (or a few VWMs) is developed to remove those empirical processes from VWD. Finally, VWD can function as a fully automatic method for measuring point clouds. The sunlight analysis method aims to measure tree models virtually by simulating solar illumination during daylight. This method also has two stages: the first develops sunlight analysis for a single tree, and the second analyzes the interference from neighboring trees. The results include standard tree attributes, which can be collected in future NFI.
The successful development of CVM, along with the implementations of the VWD and sunlight analysis methods, confirms the initial assumption of this thesis: that the mathematical processing of data can be converted into virtual measurement. This is a different philosophy, in which the role of data is extended to that of a digital representative of trees. It opens an avenue for data processing using a more natural approach, and CVM is expected to be employed in the near future as a standard measuring instrument in NFI, much like a diameter tape.
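As a rough illustration of the VWD idea of determining a quantity by "filling" the space a point cloud occupies, the sketch below voxelises a synthetic cloud and counts occupied cells as volume. This is an assumed simplification for illustration only; the thesis's actual method simulates virtual water molecules rather than voxel occupancy.

```python
# Simplified, assumed illustration of measuring a point cloud by filling it,
# in the spirit of VWD. Real VWD simulates virtual water molecules; this
# sketch merely voxelises the cloud and counts occupied cells.
import numpy as np

def voxel_volume(points, cell=0.05):
    """Approximate the volume (m^3) occupied by a point cloud (coords in m)."""
    idx = np.floor(points / cell).astype(int)   # map each point to a voxel index
    occupied = {tuple(v) for v in idx}          # unique occupied voxels
    return len(occupied) * cell ** 3            # occupied cells x cell volume

# Toy stem-like cloud: random points in a 0.3 m x 0.3 m x 2 m column.
rng = np.random.default_rng(0)
cloud = rng.uniform([0.0, 0.0, 0.0], [0.3, 0.3, 2.0], size=(5000, 3))
print(f"approximate volume: {voxel_volume(cloud):.3f} m^3")
```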

    Methodology for the Construction of a Virtual Environment for the Simulation of Critical Processes

    There is a growing trend in education and training towards the use of online and distance learning courses. This delivery format provides flexibility and accessibility; it is also viewed as a way to provide education more effectively to a broader community. Online courses are convenient, built under the motto of "anyone, anywhere, anytime": everyone can participate from home or the workplace. Online courses can be developed in a variety of ways, for example using an LMS (Learning Management System), an LCMS (Learning Content Management System), or a Web 2.0 tool (or some mixture of these). These options, however, show limitations in the levels of communication and interaction that can be achieved between students. Most learning systems are asynchronous and do not allow effective real-time interaction, collaboration and cooperation. Whilst they typically have synchronous chats and whiteboards, these capabilities are often sterile and do not stimulate the interactions that enhance learning. A rich interaction does not necessarily involve just verbal exchange, since there is huge learning value to be gained from interacting with the learning content in a more visual and practical way. For instance, imagine the learning benefits of collaborating on a 3D construction jointly and in real time, of watching the impact of soil erosion, or of building and walking inside a heart model or a car engine. All this is possible in a 3D immersive virtual world, where students can engage at a distance, building content in real time, collaboratively and interactively. An array of virtual worlds can be found on the net; we have chosen Second Life® (SL®) to show how teaching and learning can be enhanced through the use of this platform. Second Life® is immersive, enabling users to interact, communicate and collaborate as if in the real world. SL® is a model of the real world: it provides an accurate physics simulation and includes a meteorological and gravitational system, so anything can be modelled and simulated. Each user in the environment is represented by an avatar with all the features of a human being, and avatars can manipulate the environment. Scientific experiments can be held in a very safe and controlled environment and can be directly conducted by the scientist in charge. Scientific fields such as architecture, history, medicine, biology, sociology, programming and language learning, among many others, can all be tested and researched through this virtual world.

    Implementation of computer visualisation in UK planning

    Within the processes of public consultation and development management, planners are required to consider spatial information and to appreciate spatial transformations and future scenarios. In the past, conventional media such as maps, plans, illustrations, sections, and physical models have been used. These traditional visualisations are highly abstract, sometimes difficult for lay people to understand, and inflexible in terms of the range of scenarios that can be considered. Yet due to technical advances and falling costs, the potential for computer-based visualisation has much improved, and it has been increasingly adopted within the planning process. Despite the growth in this field, insufficient consideration has been given to the possible weaknesses of computerised visualisations. Reflecting this lack of research, this study critically evaluates the use and potential of computerised visualisation within this process. The research is divided into two components: case study analysis and reflections of the author following his involvement in the design and use of visualisations in a series of planning applications; and in-depth interviews with experienced practitioners in the field. Based on a critical review of the existing literature, this research explores in particular the issues of credibility, realism and costs of production. The research findings illustrate the importance of the credibility of visualisations, a topic given insufficient consideration within the academic literature. Whereas the realism of visualisations has been the focus of much previous research, the results of the case studies and interviews with practitioners undertaken in this research suggest that a 'photo-realistic' level of detail may not be required as long as the observer considers the visualisations to be a credible reflection of the underlying reality. Although visualisations will always be a simplification of reality and their level of realism is subjective, there is still potential for developing guidelines or protocols for image production based on commonly agreed standards. In the absence of such guidelines there is a danger that scepticism about the credibility of computer visualisations will prevent the approach being used to its full potential. These findings suggest there needs to be a balance between scientific protocols and artistic licence in the production of computer visualisations. In order to be sufficiently credible for use in decision making within the planning process, the production of computer visualisations needs to follow a clear methodology and scientific protocols set out in good-practice guidance published by professional bodies and governmental organisations.

    Seeing the invisible: from imagined to virtual urban landscapes

    Urban ecosystems consist of infrastructure features working together to provide services for inhabitants. Infrastructure functions akin to an ecosystem, with dynamic relationships and interdependencies. However, with age, urban infrastructure can deteriorate and stop functioning. Additional pressures on infrastructure include urbanizing populations and a changing climate that exposes vulnerabilities. To manage the urban infrastructure ecosystem in a modernizing world, urban planners need a coordinated management plan for these co-located and dependent infrastructure features. To implement such a management practice, an improved method for communicating how these infrastructure features interact is needed. This study aims to define urban infrastructure as a system, identify the systemic barriers preventing implementation of a more coordinated management model, and develop a virtual reality (VR) tool to visualize the spatial system dynamics of urban infrastructure. Data were collected from a stakeholder workshop, which highlighted a lack of appreciation for the system dynamics of urban infrastructure. An urban ecology VR model was created to highlight the interconnectedness of infrastructure features. VR proved useful for communicating spatial information to urban stakeholders about the complexities of infrastructure ecology and the interactions between infrastructure features. (https://doi.org/10.1016/j.cities.2019.102559)
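The interdependencies that the VR model visualises can be thought of as a directed graph of features and the features they rely on. The toy sketch below, with invented feature names and edges rather than anything from the study, shows how the downstream impact of one feature's failure could be traced in such a representation.

```python
# Toy model of urban infrastructure interdependencies as a directed graph
# (feature -> features it depends on). Names and edges are invented for
# illustration and are not taken from the study's VR model.
dependencies = {
    "water_main": ["power_grid"],   # pumping stations need electricity
    "storm_drain": ["roadway"],     # co-located beneath the street
    "roadway": ["storm_drain"],     # a blocked drain floods the road
    "power_grid": [],
}

def affected_by(failure, deps):
    """Return every feature that transitively depends on the failed one."""
    impacted, frontier = set(), {failure}
    while frontier:
        frontier = {feature for feature, needs in deps.items()
                    if feature not in impacted
                    and any(n in needs for n in frontier)}
        impacted |= frontier
    return impacted

print(affected_by("power_grid", dependencies))  # {'water_main'}
```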

    Simulating city growth by using the cellular automata algorithm

    The objective of this thesis is to develop and implement a cellular automata (CA) algorithm to simulate the urban growth process. It attempts to satisfy the need to predict the future shape of a city, the way land uses sprawl in its surroundings, and its population. The city of Salonica in Greece is selected as a case study for simulating urban growth. CA-based models are increasingly used to investigate cities and urban systems. Sprawling cities may be considered complex adaptive systems, which warrants the use of a methodology that can accommodate the space-time dynamics of many interacting entities; automata tools are well suited to representing such systems. To illustrate this point, the thesis discusses the development of a model for simulating the sprawl of land uses, such as commercial and residential, and for calculating the population that will reside in the city.
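A minimal sketch of the kind of transition rule on which CA urban-growth models are built is shown below: an empty cell urbanises with some probability once enough of its Moore neighbours are already urban. The threshold, probability and seed pattern are placeholders for illustration, not the calibrated model for Salonica.

```python
# Minimal cellular-automata urban growth sketch. An empty cell urbanises
# with probability p once it has at least `threshold` urban neighbours in
# its 3x3 Moore neighbourhood. All parameters are illustrative placeholders.
import numpy as np

def step(grid, rng, threshold=3, p=0.5):
    # Count urban neighbours by summing the eight shifted copies of the grid.
    n = sum(np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1) if (dy, dx) != (0, 0))
    grow = (grid == 0) & (n >= threshold) & (rng.random(grid.shape) < p)
    return grid | grow.astype(grid.dtype)

rng = np.random.default_rng(0)
grid = np.zeros((50, 50), dtype=np.uint8)
grid[23:27, 23:27] = 1                       # seed urban core
for _ in range(20):                          # simulate 20 growth steps
    grid = step(grid, rng)
print("urban cells after 20 steps:", int(grid.sum()))
```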

    Developing virtual watersheds for evaluating the dynamics of land use change


    Three Dimensional Visualization of Fire Spreading Over Forest Landscapes

    Previous studies in fire visualization have required high-end computer hardware and specialized technical skills. This study demonstrated that fire visualization is possible using Visual Nature Studio (VNS) and standard computer hardware. Elevation and vegetation data were used to create a representation of the New Jersey pine barrens environment and a forest compartment within Hobcaw Barony. Photographic images were edited for use as image object models for forest vegetation. The FARSITE fire behavior model was used to model a fire typical of the area, and its output was used to visualize the fire, with tree models edited to simulate burning, along with flame models. Both static and animated views of the fire's spread and effects were visualized, and two visualization methods, VNS and ArcScene, were compared for advantages and disadvantages. VNS visualizations were more realistic, including effects such as ground textures, lighting, user-made models, and atmospheric effects; however, the program had higher hardware requirements and sometimes rendered images slowly. ArcScene had lower hardware requirements and produced visualizations with real-time movement, but the resulting images lacked many of the effects found in VNS and looked more simplistic.
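FARSITE can export its simulation as a grid of fire arrival times, and animating the spread then amounts to comparing that grid against a simulation clock. The sketch below illustrates the idea on a synthetic arrival-time grid; the values and the radial spread pattern are invented, not output from the study's FARSITE runs.

```python
# Driving a burn animation from a FARSITE-style arrival-time grid: a cell
# counts as burned once the clock passes its arrival time. The grid here is
# synthetic, standing in for an exported FARSITE raster.
import numpy as np

# Synthetic arrival times (minutes): fire spreading outward from one corner.
yy, xx = np.mgrid[0:100, 0:100]
arrival = np.hypot(yy, xx) * 2.0

def burned_mask(arrival_minutes, t):
    """Boolean mask of cells the fire has reached by time t."""
    return arrival_minutes <= t

for t in (30, 60, 120):                    # three frames of a simple animation
    frac = burned_mask(arrival, t).mean()  # mean of a boolean mask = area fraction
    print(f"t={t:3d} min: {frac:5.1%} of the area burned")
```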

    Natural landscape scenic preference: techniques for evaluation and simulation.

    The aesthetic beauty of a landscape is a very subjective issue: every person has their own opinions and their own idea of what beauty is. However, all people share a common evolutionary history and, according to the Biophilia hypothesis, a genetic predisposition to liking certain types of landscape. It is possible that this common inheritance allows us to attempt to model scenic preference for natural landscapes. The ideal type of model for such predictions is the psychophysical preference model, which integrates psychological responses to landscapes with objective measurements of quantitative and qualitative landscape variables. Such models commonly predict two thirds of the variance in the preferences of the general public for natural landscapes. In order to create such a model, three sets of data were required: landscape photographs (surrogates for the actual landscape), landscape preference data, and landscape component variable measurements. The Internet was used to run a questionnaire survey, a novel yet flexible, environmentally friendly and simple method of data gathering, resulting in one hundred and eighty responses. A geographic information system was used to digitise ninety landscape photographs and measure their landforms (based on elevation) in terms of areas and perimeters, their colours, and proxies for their complexity and coherence. Landscape preference models were created by running multiple linear regressions on normalised preference data and the landscape component variables, including mathematical transformations of these variables. The eight models created predicted over sixty percent of the variance in the responses and had moderate to high correlations with a second set of landscape preference data. A common basis of the models was the set of variables comprising complexity, water and mountain landform; in particular, the presence or absence of water and mountains was significant in determining landscape scenic preference. In order to fully establish the utility of these models, they were further tested against changes in weather and season, the addition of cultural structures, different photographers, alternate film types, different focal lengths, and composition. Results showed that weather and season were not significant in determining landscape preference, that cultural structures increased preferences for landscapes, and that photographs taken by different people did not produce consistent results from the predictive models. It was also found that film type was not significant and that changes in focal length altered preferences for landscapes.
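At its core, the psychophysical model described here is a multiple linear regression from measured landscape variables to normalised preference scores. The sketch below fits such a model on synthetic data using variables named in the abstract (complexity and the presence of water and mountains); the data and fitted coefficients are invented for illustration and are not the thesis's models.

```python
# Sketch of a psychophysical preference model: ordinary least squares from
# landscape component variables to normalised preference scores. Data and
# coefficients are synthetic; the predictors follow the abstract.
import numpy as np

rng = np.random.default_rng(1)
n = 90                                   # e.g. ninety landscape photographs
complexity = rng.random(n)               # proxy measure, scaled 0-1
water = rng.integers(0, 2, n)            # presence/absence of water
mountain = rng.integers(0, 2, n)         # presence/absence of mountains

# Synthetic "observed" preferences to fit against.
pref = 0.3 * complexity + 0.25 * water + 0.2 * mountain + rng.normal(0, 0.05, n)

X = np.column_stack([np.ones(n), complexity, water, mountain])
beta, *_ = np.linalg.lstsq(X, pref, rcond=None)       # OLS fit
resid = pref - X @ beta
r2 = 1 - (resid ** 2).sum() / ((pref - pref.mean()) ** 2).sum()
print("coefficients:", np.round(beta, 3), " R^2:", round(r2, 3))
```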

    A Review of Platforms for the Development of Agent Systems

    Agent-based computing is an active field of research with the goal of building autonomous software or hardware entities. This task is often facilitated by the use of dedicated, specialized frameworks. For almost thirty years, many such agent platforms have been developed. Meanwhile, some of them have been abandoned, others continue their development, and new platforms are being released. This paper presents an up-to-date review of the existing agent platforms, as well as a historical perspective on this domain. It aims to serve as a reference point for people interested in developing agent systems. This work details the main characteristics of the included agent platforms, together with links to specific projects where they have been used. It distinguishes between active platforms and those no longer under development or with unclear status. It also classifies the agent platforms as general-purpose ones, free or commercial, and specialized ones, which can be used for particular types of applications.
    Comment: 40 pages, 2 figures, 9 tables, 83 references