    New models for digital government: the role of service brokers in driving innovation

    Executive summary: Digital Government strategies are being rolled out in many Australian and international jurisdictions, ushering in a fundamentally different approach to the design and delivery of public sector services. Digital Government makes digital services (usually delivered through internet and mobile channels) the default delivery channels for the majority of services, and places them at the centre of innovating, designing and operating government services. Public sector and independent service brokers are increasingly important to delivering and designing these services. Service brokers are organisations or businesses that enable customers to interact with other organisations through easy-to-use and seamless interfaces. In the digital realm, an example of a public sector service broker is one that provides a customer-focussed portal, such as the Federal Department of Human Services’ MyGov website. Independent service brokers from the private or community sectors can also provide greater service choice and innovation in how people interact with governments. Models for independent service brokers include Digital Mailboxes and Personal Safeboxes (eg Australia Post); public transport information service brokers (eg TripView, Tripgo and Google Transit); taxation service brokers (eg Xero and MYOB Online); community service brokers (eg HubCare); and access brokers for government services (eg public libraries, online access centres, etc) to assist those unable to access digital services. It is likely that the ambitious goals for large-scale adoption of digital government will only be achieved if governments encourage the involvement of independent service brokers to complement the role of public sector service brokers. However, there is currently little guidance on best-practice models for agencies seeking to collaborate with independent service brokers, or vice versa. This report addresses this critical knowledge gap by providing a practical guide to the service broker model. It explains the different roles of public sector and independent service brokers and provides case studies of service broker models. This will help to inform digital government strategies and policies to encourage the development of public sector and independent service brokers. It also considers how the emergence of a marketplace of service brokers will raise important issues, such as how customer data is managed and protected, how identity is assured, and how research and analysis of the data generated by these digital services can help inform better public policies and service improvement.

    Physical layer security in cellular networks: a stochastic geometry approach

    This paper studies the information-theoretic secrecy performance in large-scale cellular networks based on a stochastic geometry framework. The locations of both base stations and mobile users are modeled as independent two-dimensional Poisson point processes. We consider two important features of cellular networks, namely, information exchange between base stations and cell association, to characterize their impact on the achievable secrecy rate of an arbitrary downlink transmission with a certain portion of the mobile users acting as potential eavesdroppers. In particular, tractable results are presented under diverse assumptions on the availability of eavesdroppers' location information at the serving base station, capturing the benefit of exchanging location information between base stations. This work was supported by National ICT Australia (NICTA) and the Australian Research Council's Discovery Projects funding scheme (Project Nos. DP110102548 and DP130101760). NICTA is funded by the Australian Government as represented by the Department of Broadband, Communications and the Digital Economy and the Australian Research Council through the ICT Centre of Excellence program.
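    To make the system model concrete, the following is a minimal Monte Carlo sketch (not the paper's analysis): base stations and potential eavesdroppers are drawn as independent homogeneous Poisson point processes, the typical user associates with its nearest base station, and the achievable secrecy rate is taken as the positive gap between the legitimate link's capacity and that of the strongest eavesdropper. Interference is ignored for simplicity, and the densities, path-loss exponent and noise power are assumed values chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters for illustration only (not taken from the paper).
AREA_SIDE = 1000.0   # side of the square observation window (m)
LAMBDA_BS = 1e-5     # base station density (per m^2)
LAMBDA_EVE = 2e-5    # eavesdropper density (per m^2)
ALPHA = 4.0          # path-loss exponent
TX_POWER = 1.0       # transmit power (normalised)
NOISE = 1e-10        # noise power (normalised)


def draw_ppp(density, side):
    """Draw a homogeneous Poisson point process on a side x side window."""
    n = rng.poisson(density * side * side)
    return rng.uniform(0.0, side, size=(n, 2))


def secrecy_rate_sample():
    """One Monte Carlo sample of the downlink secrecy rate for a typical user."""
    user = np.array([AREA_SIDE / 2, AREA_SIDE / 2])   # typical user at the centre
    bs = draw_ppp(LAMBDA_BS, AREA_SIDE)
    eves = draw_ppp(LAMBDA_EVE, AREA_SIDE)
    if len(bs) == 0 or len(eves) == 0:
        return None

    def capacity(dist):
        # Noise-limited Shannon capacity with power-law path loss.
        return np.log2(1.0 + TX_POWER * dist ** (-ALPHA) / NOISE)

    # Cell association: the user connects to its nearest base station.
    d_bs = np.linalg.norm(bs - user, axis=1)
    serving = bs[np.argmin(d_bs)]
    c_user = capacity(d_bs.min())

    # The most capable eavesdropper is the one closest to the serving base station.
    d_eve = np.linalg.norm(eves - serving, axis=1)
    c_eve = capacity(d_eve.min())

    return max(c_user - c_eve, 0.0)   # achievable secrecy rate (bit/s/Hz)


samples = [s for s in (secrecy_rate_sample() for _ in range(2000)) if s is not None]
print(f"mean secrecy rate ~ {np.mean(samples):.2f} bit/s/Hz over {len(samples)} samples")
```

    Averaging over many such draws gives a rough empirical counterpart to the secrecy-rate statistics that the paper derives analytically under different assumptions about eavesdropper location information.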

    Video shot boundary detection: seven years of TRECVid activity

    Shot boundary detection (SBD) is the process of automatically detecting the boundaries between shots in video. It is a problem which has attracted much attention since video became available in digital form as it is an essential pre-processing step to almost all video analysis, indexing, summarisation, search, and other content-based operations. Automatic SBD was one of the tracks of activity within the annual TRECVid benchmarking exercise, each year from 2001 to 2007 inclusive. Over those seven years we have seen 57 different research groups from across the world work to determine the best approaches to SBD while using a common dataset and common scoring metrics. In this paper we present an overview of the TRECVid shot boundary detection task, a high-level overview of the most significant of the approaches taken, and a comparison of performances, focussing on one year (2005) as an example.
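    As a point of reference for readers new to the task, the following is a minimal baseline in the spirit of the simplest approaches benchmarked at TRECVid: compare colour histograms of consecutive frames and declare a hard cut when the difference exceeds a threshold. It is not any particular TRECVid system, it does not handle gradual transitions, and the video path, histogram size and threshold are placeholder assumptions.

```python
import cv2


def detect_cuts(video_path, threshold=0.4, bins=32):
    """Detect hard cuts by thresholding the histogram difference of consecutive frames.

    threshold and bins are illustrative defaults, not tuned values.
    """
    cap = cv2.VideoCapture(video_path)
    prev_hist = None
    cuts = []
    frame_idx = 0

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Per-channel colour histogram, normalised so frame-to-frame differences are comparable.
        hist = cv2.calcHist([frame], [0, 1, 2], None, [bins] * 3,
                            [0, 256, 0, 256, 0, 256])
        hist = cv2.normalize(hist, hist).flatten()

        if prev_hist is not None:
            # Bhattacharyya distance: ~0 for near-identical frames, towards 1 for very different ones.
            dist = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_BHATTACHARYYA)
            if dist > threshold:
                cuts.append(frame_idx)

        prev_hist = hist
        frame_idx += 1

    cap.release()
    return cuts


if __name__ == "__main__":
    # "example.mp4" is a placeholder path.
    print(detect_cuts("example.mp4"))
```

    Bhattacharyya distance is used here only because it is bounded and symmetric; any frame-difference measure could be substituted, and participating TRECVid systems typically combined several such signals with more elaborate classifiers.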

    An Ontological Basis for Design Methods

    This paper presents a view of design methods as process artefacts that can be represented using the function-behaviour-structure (FBS) ontology. This view allows five fundamental approaches to methods to be identified: black-box, procedural, artefact-centric, formal and managerial approaches. They all describe method structure but emphasise different aspects of it. Capturing these differences addresses common terminological confusions relating to methods. The paper provides an overview of the use of the fundamental method approaches for different purposes in designing. In addition, the FBS ontology is used to develop a notion of prescriptiveness of design methods as an aggregate construct defined along four dimensions: certainty, granularity, flexibility and authority. The work presented in this paper provides an ontological basis for describing, understanding and managing design methods throughout their life cycle. Keywords: Design Methods; Function-Behaviour-Structure (FBS) Ontology; Prescriptive Design Knowledge.
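    As a purely illustrative aid (not taken from the paper), the sketch below shows one way a design method could be encoded as an FBS-style record, with the four prescriptiveness dimensions named in the abstract attached as ratings. The class names, rating scale and example method are hypothetical, and the simple averaging used to aggregate prescriptiveness is an assumption rather than the paper's construct.

```python
from dataclasses import dataclass, field
from enum import Enum


class Dimension(Enum):
    """The four prescriptiveness dimensions named in the abstract."""
    CERTAINTY = "certainty"
    GRANULARITY = "granularity"
    FLEXIBILITY = "flexibility"
    AUTHORITY = "authority"


@dataclass
class DesignMethod:
    """A design method represented as a process artefact in FBS terms.

    function:  what the method is for
    behaviour: what the method does when enacted
    structure: the method's components and their relationships (steps, roles, tools)
    prescriptiveness: an assumed rating per dimension, 0.0 (low) to 1.0 (high)
    """
    name: str
    function: str
    behaviour: str
    structure: list[str]
    prescriptiveness: dict[Dimension, float] = field(default_factory=dict)

    def prescriptiveness_score(self) -> float:
        """Aggregate prescriptiveness as a plain average (an assumed aggregation rule)."""
        if not self.prescriptiveness:
            return 0.0
        return sum(self.prescriptiveness.values()) / len(self.prescriptiveness)


# Hypothetical example: brainstorming described as a loosely prescriptive method.
brainstorming = DesignMethod(
    name="Brainstorming",
    function="generate many candidate solution ideas",
    behaviour="produces a large, unfiltered pool of ideas in a short session",
    structure=["state the problem", "defer judgement", "record all ideas", "cluster ideas"],
    prescriptiveness={
        Dimension.CERTAINTY: 0.3,
        Dimension.GRANULARITY: 0.4,
        Dimension.FLEXIBILITY: 0.9,
        Dimension.AUTHORITY: 0.2,
    },
)
print(brainstorming.prescriptiveness_score())
```

    Encoding methods in a structured form like this would make it straightforward to compare methods along each dimension or to query a method library by its aggregate prescriptiveness.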

    The national cloud computing strategy

    Executive summary: On 5 October 2012 the Prime Minister announced that the Australian Government would develop a National Cloud Computing Strategy. This announcement recognised the synergies between the National Broadband Network (NBN) and cloud computing, as well as the important role for government in providing the tools that small businesses, individuals and government agencies need to realise the promise of cloud computing. This strategy has been developed in partnership between government, industry and consumer groups and outlines a vision for cloud computing in Australia: Australians will create and use world-class cloud services to boost innovation and productivity across the digital economy. When organisations adopt cloud services, they are generally more productive, innovate more effectively and operate with greater agility. As a nation, Australia is well placed to take advantage of cloud computing for a range of reasons, including a stable socio-economic system, a strong rule of law, and a highly diverse and skilled Information and Communications Technology (ICT) sector. At the individual level, many organisations across the economy have implemented innovative cloud computing services that have transformed the way they operate. However, as a group, Australian small businesses and not-for-profit organisations lag behind their counterparts in Organisation for Economic Co-operation and Development (OECD) countries in the use of online technology. This places these organisations at a competitive disadvantage, which could be overcome through the use of cloud computing services. One reason for this has been insufficient access to the infrastructure needed to support sophisticated cloud services: the relatively slow download and upload speeds in many parts of Australia have limited the adoption of cloud services. The NBN is changing this and is a key enabler of the digital economy more broadly. There are other reasons that cloud computing has not been adopted more widely in Australia, including a lack of awareness of how to make best use of cloud computing and a lack of confidence among some organisations and individuals in adopting cloud computing services. This strategy identifies three core goals and a set of actions to achieve the government's vision. However, as the cloud services market continues to evolve, users and providers of cloud services must remain responsive to change. Likewise, the government will continue to adapt its strategy in response to market and technological changes.

    Improvements in DCCP congestion control for satellite links

    We propose modifications to the TCP-Friendly Rate Control (TFRC) congestion control mechanism of the Datagram Congestion Control Protocol (DCCP), intended for use with real-time traffic, that are aimed at improving its performance over long-delay (primarily satellite) links. Firstly, we propose an algorithm that optimises the number of feedback messages per round-trip time (RTT) based on the observed link delay, rather than using the current standard of at least one per RTT. We analyse the improvements achievable with the proposed modifications in different phases of congestion control and present results from simulations with a modified ns-2 DCCP model and from live experiments using a modified DCCP Linux kernel implementation. We demonstrate that the changes result in improved slow-start performance and reduced data loss compared with standard DCCP, while the introduced overhead remains acceptable.
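    The abstract does not spell out the optimisation rule, so the following is only an illustrative stand-in for the idea of adapting feedback frequency to link delay: instead of the standard minimum of one TFRC feedback per RTT, the receiver sends more feedback messages within each long RTT so that a satellite path gets earlier rate updates during slow start. The target interval and the cap are assumed constants, not values from the paper.

```python
# Illustrative sketch only: the paper's actual feedback-frequency optimisation
# is not reproduced here; the constants below are assumptions.

STANDARD_FEEDBACKS_PER_RTT = 1      # baseline TFRC behaviour: at least one feedback per RTT
TARGET_FEEDBACK_INTERVAL_S = 0.05   # assumed: aim for a feedback roughly every 50 ms
MAX_FEEDBACKS_PER_RTT = 10          # assumed cap to keep reverse-link overhead bounded


def feedbacks_per_rtt(observed_rtt_s: float) -> int:
    """Choose how many TFRC feedback messages to send per RTT for a given link delay.

    Short terrestrial RTTs keep the standard single feedback; long satellite RTTs
    spread several feedbacks across each RTT so the sender's rate adapts sooner.
    """
    wanted = round(observed_rtt_s / TARGET_FEEDBACK_INTERVAL_S)
    return max(STANDARD_FEEDBACKS_PER_RTT, min(wanted, MAX_FEEDBACKS_PER_RTT))


for rtt in (0.02, 0.1, 0.6):   # LAN, terrestrial WAN, GEO satellite (seconds)
    print(f"RTT {rtt * 1000:5.0f} ms -> {feedbacks_per_rtt(rtt)} feedback(s) per RTT")
```

    The paper reports that the added feedback overhead remains acceptable; the cap above only gestures at that constraint.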

    Totally Corrective Multiclass Boosting with Binary Weak Learners

    In this work, we propose a new optimization framework for multiclass boosting. In the literature, AdaBoost.MO and AdaBoost.ECC are two successful multiclass boosting algorithms that can use binary weak learners. We explicitly derive these two algorithms' Lagrange dual problems based on their regularized loss functions. We show that the Lagrange dual formulations enable us to design totally corrective multiclass algorithms using the primal-dual optimization technique. Experiments on benchmark data sets suggest that our multiclass boosting achieves generalization capability comparable to the state of the art, while converging much faster than stage-wise gradient descent boosting. In other words, the new totally corrective algorithms can maximize the margin more aggressively.
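    To illustrate what "totally corrective" means in this context (a generic sketch, not the authors' dual formulation or their multiclass output codes), the example below boosts decision stumps on a toy binary problem and, after each new weak learner is added, re-optimises all weak-learner weights jointly by minimising a regularised exponential loss with SciPy, rather than freezing earlier weights as stage-wise AdaBoost does.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy binary data (assumed for illustration; the paper uses benchmark multiclass sets).
n, d = 200, 5
X = rng.normal(size=(n, d))
y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=n))

# A fixed pool of binary weak learners: the sign of each single feature.
H = np.sign(X + 1e-12)           # n x d matrix of +/-1 weak-learner outputs


def reg_exp_loss(w, selected):
    """Regularised exponential loss over the currently selected weak learners."""
    margin = y * (H[:, selected] @ w)
    return np.mean(np.exp(-margin)) + 1e-3 * np.sum(w)


selected = []                    # indices of weak learners chosen so far
weights = np.zeros(0)            # their (jointly optimised) weights

for _ in range(d):
    # Pick the unused weak learner with the largest weighted edge.
    margin = y * (H[:, selected] @ weights) if selected else np.zeros(n)
    u = np.exp(-margin)                      # current example weights
    edges = np.abs((u * y) @ H) / u.sum()
    edges[selected] = -np.inf
    selected.append(int(np.argmax(edges)))

    # Totally corrective step: re-optimise ALL selected weights jointly.
    w0 = np.append(weights, 0.0)
    res = minimize(reg_exp_loss, w0, args=(selected,),
                   bounds=[(0.0, None)] * len(selected), method="L-BFGS-B")
    weights = res.x

pred = np.sign(H[:, selected] @ weights)
print("training accuracy:", float(np.mean(pred == y)))
```

    The joint re-optimisation at every round is what lets totally corrective methods push the margin harder per iteration, at the cost of solving a small convex problem each time.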