
    SPATIAL TRANSFORMATION PATTERN DUE TO COMMERCIAL ACTIVITY IN KAMPONG HOUSE

    Kampung houses are houses located in the kampung areas of a city. As part of urban dynamics, they are often converted to other uses; one such transformation occurs when owners add commercial activities, turning a fully private house into a mixed-use house with more public space, or even into an entirely public commercial building. This study investigates the spatial transformation patterns of kampung houses resulting from the addition of commercial activities. Site observations, interviews and questionnaires were used to study the spatial transformation. The study found that in kampung houses the spatial transformation pattern depends on the type of commercial activity and on the owner's perceptions, and that the transformation proceeds through several steps related to the addition of the commercial activity. Keywords: spatial transformation pattern; commercial activity; owner perception; kampung house; adaptability

    International Conference Management, Business and Economics

    UBT Annual International Conference is the 9th international interdisciplinary peer-reviewed conference, publishing the work of scientists and practitioners in the areas where UBT is active in education, research and development. UBT aims to implement an integrated strategy to establish itself as an internationally competitive, research-intensive university, committed to the transfer of knowledge and the provision of a world-class education to the most talented students from all backgrounds. The main purpose of the conference is to bring together scientists and practitioners from different disciplines, make them aware of recent advances in different research fields, and provide them with a unique forum to share their experiences. It is also a venue that supports new academic staff in doing research and publishing their work to an international standard. The conference consists of sub-conferences in the following fields: Art and Digital Media; Agriculture, Food Science and Technology; Architecture and Spatial Planning; Civil Engineering, Infrastructure and Environment; Computer Science and Communication Engineering; Dental Sciences; Education and Development; Energy Efficiency Engineering; Integrated Design; Information Systems and Security; Journalism, Media and Communication; Law; Language and Culture; Management, Business and Economics; Modern Music, Digital Production and Management; Medicine and Nursing; Mechatronics, System Engineering and Robotics; Pharmaceutical and Natural Sciences; Political Science; Psychology; Sport, Health and Society; and Security Studies. This conference is the major scientific event of UBT. It is organized annually, always in cooperation with partner universities from the region and Europe. We thank all authors, partners, sponsors and the conference organizing team for making this event a truly international scientific event

    Proceedings of MathSport International 2017 Conference

    Proceedings of MathSport International 2017 Conference, held in the Botanical Garden of the University of Padua, June 26-28, 2017. MathSport International organizes biennial conferences dedicated to all topics where mathematics and sport meet. Topics include: performance measures; optimization of sports performance; statistics and probability models; mathematical and physical models in sports; competitive strategies; statistics and probability match-outcome models; optimal tournament design and scheduling; decision support systems; analysis of rules and adjudication; econometrics in sport; analysis of sporting technologies; financial valuation in sport; e-sports (gaming); betting and sports

    Multi-agent System Models for Distributed Services Scheduling

    This thesis investigates the computational and modeling issues involved in developing solutions for distributed service scheduling problems. Compared with traditional manufacturing scheduling, service scheduling poses additional challenges due to the significant customer involvement in service processes. The first challenge is that the service scheduling environment is distributed: scheduling-related information is scattered among individual entities, such as service providers and customers. The second challenge is that the environment is dynamic. Uncertainty in customer demand, cancellations and no-shows makes the scheduling of services a complex dynamic process; scheduling must be robust and prepared to accommodate contingencies caused by customer involvement in service production. The third challenge concerns customers' private information. To compute optimal schedules, the scheduler would ideally know complete customer availability and preference information within the scheduling horizon; however, customers may act strategically to protect their private information. Service scheduling systems should therefore be designed to elicit enough of a customer's private information to make it possible to compute high-quality schedules. The fourth challenge is that scheduling objectives are complicated and may even conflict. The distributed environment allows each agent to have its own scheduling objectives, which can vary from one agent to another. Moreover, since agents are self-interested, they are likely to behave strategically to achieve their own objectives without regard for the global objectives of the system.

    Existing approaches usually address only some of these challenges, in a specific service domain. There is a need for general problem formulations and solutions that address the service scheduling challenges in a comprehensive framework. In this thesis, I propose an integrated framework for the general service scheduling problem. The proposed framework uses an iterative auction as the base mechanism to tackle the challenges of distributed and dynamic environments. It accommodates customers' private information by providing them with appropriate incentives, and it has the potential to accommodate dynamic events. The framework integrates customers' preferences with the allocation of a provider's capacity through multilateral negotiation between the provider and its customers, and it can accommodate both price-based commercial settings and non-commercial service settings. Theoretical and experimental results are developed to verify the effectiveness of the proposed framework. Applications of the framework to the mass customization of services and to appointment scheduling demonstrate its applicability to specific service domains. A web-based prototype is designed and implemented to evaluate the scalability of the approach in a distributed environment
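
    The abstract names an iterative auction as the base mechanism but does not specify its rules. As a rough illustration of how such a mechanism can allocate a provider's capacity among self-interested customers with private valuations, the Python sketch below implements a simple ascending-price auction over time slots. The Customer class, the unit-demand assumption and the fixed price increment are illustrative assumptions, not the thesis's actual design.

        # A minimal ascending-price auction sketch: unassigned customers
        # repeatedly bid on their best-surplus slot until no profitable
        # bid remains. Names and rules are illustrative assumptions.
        from dataclasses import dataclass

        @dataclass
        class Customer:
            name: str
            valuations: dict  # slot -> this customer's private value

        def iterative_auction(slots, customers, increment=1.0):
            prices = {s: 0.0 for s in slots}
            holder = {s: None for s in slots}   # slot -> current winner
            assignment = {}                     # customer name -> slot
            queue = list(customers)             # customers still seeking a slot
            while queue:
                c = queue.pop(0)
                # Bid on the slot with the largest surplus (value - price).
                best = max(slots, key=lambda s: c.valuations.get(s, 0.0) - prices[s])
                if c.valuations.get(best, 0.0) - prices[best] <= 0:
                    continue                    # no profitable slot: drop out
                displaced = holder[best]
                holder[best] = c
                assignment[c.name] = best
                prices[best] += increment       # the price rises on each bid
                if displaced is not None:
                    del assignment[displaced.name]
                    queue.append(displaced)     # displaced customer re-bids
            return assignment, prices

        # Example: two customers competing for two appointment slots.
        slots = ["mon_am", "mon_pm"]
        customers = [Customer("alice", {"mon_am": 5.0, "mon_pm": 2.0}),
                     Customer("bob", {"mon_am": 4.0, "mon_pm": 3.0})]
        assignment, prices = iterative_auction(slots, customers)
        # -> alice wins mon_am; bob, outbid there, settles for mon_pm.

    Because prices only rise and each bid requires positive surplus, the loop terminates, and the rising prices are what elicit customers' private valuations, which is the incentive role the abstract attributes to the auction mechanism.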

    Optimizing Resource Management in Cloud Analytics Services

    The fundamental challenge in the cloud today is how to build and optimize machine learning and data analytics services. Machine learning and data analytics platforms are shifting computing infrastructure from expensive private data centers to easily accessible online services. These services pack user requests into jobs and run them on thousands of machines in parallel in geo-distributed clusters. The scale and complexity of emerging jobs lead to increasing challenges for the clusters at all levels, from power infrastructure to system architecture and the corresponding software framework design. These challenges come in many forms. Today's clusters are built on commodity hardware, and hardware failures are unavoidable. Resource competition, network congestion and mixed generations of hardware make the hardware environment complex and hard to model and predict; such heterogeneity becomes a crucial roadblock to efficient parallelization at both the task level and the job level. Another challenge comes from the increasing complexity of the applications: machine learning services, for example, run jobs made up of multiple tasks with complex dependency structures, which complicates framework design. Scale, especially when services span geo-distributed clusters, is another important hurdle. Challenges also come from the power infrastructure, which is very expensive and accounts for more than 20% of the total cost of building a cluster; power-sharing optimization that maximizes facility utilization and smooths peak-hour usage is a further roadblock for cluster design.

    In this thesis, we focus on solutions to these challenges at the task level and the job level, in the design of geo-distributed data clouds, and in power management for colocation data centers. At the task level, a crucial hurdle to achieving predictable performance is stragglers, i.e., tasks that take significantly longer than expected to run. Speculative execution has been widely adopted to mitigate the impact of stragglers in simple workloads; we apply straggler mitigation to approximation jobs for the first time. We present GRASS, which carefully uses speculation to mitigate the impact of stragglers in approximation jobs. GRASS's design is based on the analysis of a model we develop to capture the optimal speculation levels for approximation jobs. Evaluations with production workloads from Facebook and Microsoft Bing on an EC2 cluster of 200 nodes show that GRASS increases the accuracy of deadline-bound jobs by 47% and speeds up error-bound jobs by 38%.

    Moving from the task level to the job level, task-level speculation mechanisms are designed and operated independently of job scheduling when, in fact, scheduling a speculative copy of a task has a direct impact on the resources available to other jobs. We therefore present Hopper, a job-level speculation-aware scheduler that integrates the tradeoffs associated with speculation into job scheduling decisions, based on a model generalized from the task-level speculation model. We implement both centralized and decentralized prototypes of the Hopper scheduler and show that coordinating scheduling and speculation yields 50% (66%) improvements over state-of-the-art centralized (decentralized) schedulers and speculation strategies.

    As computing resources move from local clusters to geo-distributed cloud services, we expect the same transformation for data storage. We study two crucial pieces of a geo-distributed data cloud system: data acquisition and data placement. Starting from the optimal algorithm for a data cloud made up of a single data center, we propose a near-optimal, polynomial-time algorithm for a geo-distributed data cloud in general. We show, via a case study, that the resulting design, Datum, is near-optimal (within 1.6%) in practical settings.

    Efficient power management is a fundamental challenge for data centers providing reliable services. Power oversubscription in data centers is very common and may occasionally trigger an emergency when the aggregate power demand exceeds the capacity. We study power capping solutions for handling such emergencies in a colocation data center, where the operator supplies power to multiple tenants. We propose a novel market mechanism based on supply function bidding, called COOP, to financially incentivize and coordinate tenants' power reduction so as to minimize the total performance loss while satisfying multiple power capping constraints. We demonstrate that COOP is "win-win", increasing the operator's profit (through oversubscription) and reducing tenants' costs (through financial compensation for their power reduction during emergencies).
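
    The abstract describes COOP as a supply-function-bidding mechanism but does not spell out the bid form or clearing rule. The Python sketch below shows the general flavor of such a market under strong simplifying assumptions: each tenant bids a linear supply function s_i(p) = b_i * p (kW shed at compensation price p), and the operator picks a uniform clearing price that meets the required reduction. The linear bids, uniform price and variable names are illustrative, not the paper's exact specification.

        # Supply-function bidding sketch for emergency power capping.
        # Assumes linear bids s_i(p) = b_i * p; COOP's actual mechanism
        # handles multiple capping constraints and performance-loss costs.
        def clear_power_reduction(bids, required_reduction):
            """bids: tenant -> slope b_i (kW shed per $/kW of compensation).
            Clearing price p* solves sum_i b_i * p* = required_reduction,
            so p* = required_reduction / sum(b_i). Returns p* and each
            tenant's reduction and compensation payment."""
            total_slope = sum(bids.values())
            if total_slope <= 0:
                raise ValueError("no tenant offered any power reduction")
            p_star = required_reduction / total_slope
            schedule = {
                tenant: {"reduction_kw": b * p_star,
                         "payment": b * p_star * p_star}  # price * quantity
                for tenant, b in bids.items()
            }
            return p_star, schedule

        # Example: three tenants; the operator must shed 120 kW in an
        # emergency. p* = 120 / (4 + 6 + 2) = 10 $/kW.
        price, plan = clear_power_reduction({"t1": 4.0, "t2": 6.0, "t3": 2.0},
                                            required_reduction=120.0)

    The uniform price is what makes such a scheme incentive-aligned in spirit: tenants that can shed power cheaply bid steeper supply functions, take on more of the reduction, and collect more compensation, which matches the "win-win" framing in the abstract.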

    Efficiency and productivity analysis of deregulated telecommunications industries: a comparative study of the cases of Canada and Nigeria

    Following telecommunications industry deregulation in the United Kingdom and the introduction of competition in long-distance telecommunications services in the United States in the 1980s, telecommunications industries in other developed and developing countries have been deregulated. Contributing to this deregulation are the influences of globalization, technological advancement, fiscal policy restraint, lending institutions' requirements, the curtailment of regulatory costs and the desire for improved performance. However, the benefits of deregulation remain uncertain. The motivation for this research is to investigate the efficiency and productivity performance of telecommunications industries in deregulated environments. Comparatively analyzing the experiences of Canada and Nigeria, this research addresses two broad questions. First, how did deregulatory policies influence competitiveness in the industries of the two countries? This was addressed by: (i) investigating the forces that drove deregulation, (ii) exploring the similarities and differences in the deregulatory milieu in the two countries, and (iii) evaluating competitiveness in the industry. Second, how did the industries perform in the deregulated environments? The outcomes shed light on efficiency, productivity and the influence of environmental factors on efficiency performance; they also demonstrate the applicability of the structure-conduct-performance model to the understanding of deregulatory outcomes.

    The approach adopted entailed empirical analysis of the two countries in the context of 17 other telecommunications industries from high-income and middle-income countries over a 13-year period (2001–13). The study used non-parametric Data Envelopment Analysis (DEA) and the Malmquist Productivity Index to assess efficiency and productivity changes, and a random-effects (RE) panel Tobit model to evaluate the effect of environmental factors on efficiency performance. Furthermore, responses from industry participants were obtained to complement the DEA findings.

    The DEA results suggest that operating in a deregulated environment improves efficiency and productivity performance, a finding validated by the views of the industry participants involved in the study. The two countries, though inefficient, showed improved technical efficiency. The productivity analysis revealed that both countries experienced productivity growth, but that growth has slowed; the Mann-Whitney test showed that the two countries have comparable productivity change. The Canadian telecommunications industry experienced technological progress and efficiency improvement, but its productivity change was mainly due to efficiency improvement attained through managerial effectiveness. The Nigerian telecommunications industry, on the other hand, experienced technological retardation but efficiency progression; its productivity change was due to efficiency improvements attained through an enhanced operational scale.

    The investigation of the influence of environmental factors on efficiency reveals that the number of years in deregulation has an insignificant negative influence on technical and scale efficiency; as a quadratic term, the effect is positive but remains insignificant. Revenue per subscription positively influences technical and scale efficiencies and is statistically significant, indicating that higher prices may result in better technical efficiency and operational scale. Industry concentration was found to have a positive but statistically insignificant effect on technical and scale efficiencies, and a negative but also statistically insignificant effect on pure technical efficiency, signifying that telecommunications industry concentration is not consequential to performance. The capital expenditure to revenue ratio has no significant influence on technical efficiency but a statistically significant negative influence on scale efficiency, signifying that scale efficiency could be attained by optimizing capital expenditure through full capacity utilization and by avoiding infrastructure duplication. Labour productivity influences technical efficiency but has an insignificant negative effect on scale efficiency, implying that technical efficiency could be enhanced through improvements in labour productivity. Change in real gross domestic product per capita has a negative and insignificant effect on technical and scale efficiencies; as a quadratic term, however, it has a significant positive influence on scale efficiency, suggesting that countries with higher economic growth and wealth display better scale efficiency performance. Inflation has a significant positive influence on technical and scale efficiency performance. The level of development has an insignificant relationship with technical and scale efficiency scores, implying that it is not an essential determinant of performance. The interaction of labour productivity and capital intensity undermines technical efficiency, signifying that efficiency improvement through labour productivity and the increased use of capital is not sufficient to neutralize the efficiency loss from increased capital intensity.
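
    The DEA method the study relies on has a compact linear-programming form. The Python sketch below computes input-oriented CCR technical efficiency scores of the kind the study assembles per country and year (and from which a Malmquist index is built across periods). The toy data, single output and variable names are illustrative assumptions; the study's actual input-output specification is richer.

        # Input-oriented CCR DEA (envelopment form) via linear programming:
        # minimize theta subject to sum_j lambda_j x_j <= theta * x_k and
        # sum_j lambda_j y_j >= y_k, lambda >= 0. theta = 1 means efficient.
        import numpy as np
        from scipy.optimize import linprog

        def ccr_efficiency(X, Y, k):
            """Technical efficiency of decision-making unit k.
            X: (n_dmus, n_inputs) inputs, Y: (n_dmus, n_outputs) outputs."""
            n, m = X.shape
            s = Y.shape[1]
            # Decision variables: [theta, lambda_1 .. lambda_n].
            c = np.r_[1.0, np.zeros(n)]
            # Input rows: sum_j lambda_j x_ij - theta * x_ik <= 0.
            A_in = np.hstack([-X[[k]].T, X.T])           # (m, 1+n)
            b_in = np.zeros(m)
            # Output rows: -sum_j lambda_j y_rj <= -y_rk.
            A_out = np.hstack([np.zeros((s, 1)), -Y.T])  # (s, 1+n)
            b_out = -Y[k]
            res = linprog(c,
                          A_ub=np.vstack([A_in, A_out]),
                          b_ub=np.r_[b_in, b_out],
                          bounds=[(None, None)] + [(0, None)] * n)
            return res.x[0]

        # Toy panel: 4 telecom industries, inputs = [employees, capex],
        # output = [revenue]. All figures are hypothetical.
        X = np.array([[10.0, 5.0], [8.0, 6.0], [12.0, 4.0], [9.0, 9.0]])
        Y = np.array([[100.0], [110.0], [95.0], [90.0]])
        scores = [ccr_efficiency(X, Y, k) for k in range(len(X))]

    Running such scores against two periods' frontiers gives the Malmquist Productivity Index, whose decomposition into efficiency change and technological change is exactly the distinction the abstract draws between the Canadian and Nigerian results.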

    Mixed structural models for decision making under uncertainty using stochastic system simulation and experimental economic methods: application to information security control choice

    This research is concerned with whether, and to what extent, information security managers may be biased in their evaluation of, and decision making over, the quantifiable risks posed by information management systems, where circumstances may be characterized by uncertainty in both the risk inputs (e.g., system threat and vulnerability factors) and the outcomes (the actual efficacy of the selected security controls and the resulting system performance and associated business impacts). Although 'quantified security' and the associated risk management remain problematic from both a theoretical and an empirical perspective (Anderson 2001; Verendel 2009; Appari 2010), professional practitioners in the field of information security continue to advocate the use of quantitative models for risk analysis and management wherever possible, because those models permit a reliable economic determination of optimal operational control decisions (Littlewood, Brocklehurst et al. 1993; Nicol, Sanders et al. 2004; Anderson and Moore 2006; Beautement, Coles et al. 2009; Anderson 2010; Beresnevichiene, Pym et al. 2010; Wolter and Reinecke 2010; Li, Parker et al. 2011).

    The main contribution of this thesis is to bring current quantitative economic methods and experimental choice models to the field of information security risk management, to examine the potential for biased decision making by security practitioners under conditions where information may be relatively objective or subjective, and to demonstrate the potential for informing decision makers about these biases when making control decisions in a security context. No single quantitative security approach appears to have formally incorporated three key features of the security risk management problem addressed in this research: 1) the inherently stochastic nature of the information system inputs and outputs, which contribute directly to decisional uncertainty (Conrad 2005; Wang, Chaudhury et al. 2008; Winkelvos, Rudolph et al. 2011); 2) the endogenous estimation of a decision maker's risk attitude using models which otherwise typically assume risk neutrality or an inherent degree of risk aversion (Danielsson 2002; Harrison, Johnson et al. 2003); and 3) the application of structural modelling which allows for the possible combination and weighting of multiple latent models of choice (Harrison and Rutström 2009). The identification, decomposition and tractability of these decisional factors is of crucial importance to understanding the economic trade-offs inherent in security control choice under conditions of both risk and uncertainty, particularly where established psychological decisional biases such as ambiguity aversion (Ellsberg 1961) or loss aversion (Kahneman and Tversky 1984) may be assumed to be endemic to, if not magnified by, the institutional setting in which these decisions take place. Minimally, risk-averse managers may simply be overspending on controls, overcompensating for anticipated losses that do not actually occur with the frequency or impact they imagine. On the other hand, risk-seeking managers, where they exist (practitioners call them 'cowboys'; they are a familiar player in equally risky financial markets), may simply be gambling against ultimately losing odds, putting the entire firm at risk of potentially catastrophic security losses. Identifying and correcting for these scenarios would seem to be increasingly important for today's universally networked business computing infrastructures.

    From a research design perspective, the field of behavioural economics has made significant recent contributions to the empirical evaluation of psychological theories of decision making under uncertainty (Andersen, Harrison et al. 2007) and provides salient examples of lab experiments that can elicit and isolate a range of latent decision-making behaviours for choice under risk and uncertainty within relatively controlled conditions, versus those obtainable in the field (Harrison and Rutström 2008). My research builds on recent work in the domain of information security control choice by 1) undertaking a series of lab experiments incorporating a stochastic model of a simulated information management system at risk, which supports the generation of observational data derived from a range of security control choice decisions under both risk and uncertainty (Baldwin, Beres et al. 2011); and 2) modeling the resulting decisional biases using structural models of choice under risk and uncertainty (ElGamal and Grether 1995; Harrison and Rutström 2009; Keane 2010). The research contribution consists of the novel integration of a model of stochastic system risk and domain-relevant structural utility modeling, using a mixed-model specification to estimate latent decision-making behaviour. It is anticipated that the research results can be applied to the real-world problem of 'tuning' quantitative information security risk management models to the decisional biases and characteristics of the decision maker (Abdellaoui and Munier 1998).
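
    To make the structural-estimation idea concrete, the Python sketch below recovers a constant relative risk aversion (CRRA) coefficient from binary lottery choices by maximum likelihood, using a logistic (Fechner-noise) choice rule. This is a deliberately stripped-down, single-model sketch: the toy choice data, the fixed noise parameter and the expected-utility-only specification are illustrative assumptions, whereas the thesis estimates mixtures of multiple latent choice models.

        # Single-model structural estimation sketch: fit CRRA risk attitude r
        # to observed binary lottery choices by maximum likelihood.
        import numpy as np
        from scipy.optimize import minimize_scalar

        def crra(x, r):
            """CRRA utility over positive payoffs; r = 1 is log utility."""
            return np.log(x) if np.isclose(r, 1.0) else x ** (1.0 - r) / (1.0 - r)

        def expected_utility(lottery, r):
            """lottery: list of (probability, payoff) pairs."""
            return sum(p * crra(x, r) for p, x in lottery)

        def neg_log_likelihood(r, choices, noise=1.0):
            """choices: list of (lottery_A, lottery_B, chose_A) observations.
            Logistic choice rule: P(A) = 1 / (1 + exp(-(EU_A - EU_B)/noise))."""
            ll = 0.0
            for A, B, chose_A in choices:
                diff = (expected_utility(A, r) - expected_utility(B, r)) / noise
                p_A = 1.0 / (1.0 + np.exp(-diff))
                ll += np.log(p_A if chose_A else 1.0 - p_A)
            return -ll

        # Toy data: a safe lottery A vs. a risky lottery B, each chosen 3 times.
        A = [(1.0, 50.0)]
        B = [(0.5, 100.0), (0.5, 10.0)]
        choices = [(A, B, True)] * 3 + [(A, B, False)] * 3
        fit = minimize_scalar(neg_log_likelihood, bounds=(0.01, 2.0),
                              args=(choices,), method="bounded")
        r_hat = fit.x  # estimated risk-aversion coefficient

    A mixed specification of the kind the thesis describes would evaluate each observation under several latent models (e.g., expected utility and a prospect-theory variant) and estimate mixture weights alongside the preference parameters, rather than fitting one model as above.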