213,181 research outputs found
A Framework for Integrating Transportation Into Smart Cities
In recent years, economic, environmental, and political forces have quickly given rise to "Smart Cities" -- an array of strategies that can transform transportation in cities. Using a multi-method approach to research and develop a framework for smart cities, this study provides a framework that can be employed to: understand what a smart city is and how to replicate smart city successes; understand the role of pilot projects, metrics, and evaluations in testing, implementing, and replicating strategies; and understand the role of shared micromobility, big data, and other key issues affecting communities.
This research provides recommendations for policy and professional practice as they relate to integrating transportation into smart cities.
Multimedia delivery in the future internet
The term "Networked Media" implies that all kinds of media, including text, images, 3D graphics, audio,
and video, are produced, distributed, shared, managed, and consumed online through various networks,
such as the Internet, fiber, WiFi, WiMAX, GPRS, 3G, and so on, in a convergent manner [1]. This white
paper is the contribution of the Media Delivery Platform (MDP) cluster and aims to cover the
challenges of Networked Media in the transition to the Future Internet.
The Internet has evolved and changed the way we work and live. End users of the Internet have been
confronted with a bewildering range of media, services, and applications, and with technological
innovations concerning media formats, wireless networks, and terminal types and capabilities. There is
little evidence that the pace of this innovation is slowing. Today, over one billion users access the
Internet on a regular basis, more than 100 million users have downloaded at least one (multi)media file,
and over 47 million of them do so regularly, searching in more than 160 Exabytes of content. In the near
future these numbers are expected to rise exponentially. Internet content is expected to grow by a factor
of at least 6, rising to more than 990 Exabytes before 2012, fuelled mainly by the users themselves.
Moreover, it is envisaged that in the near to mid term the Internet will provide the means to share and
distribute (new) multimedia content and services with superior quality and striking flexibility, in a
trusted and personalized way, improving citizens' quality of life, working conditions, edutainment, and safety.
In this evolving environment, new transport protocols, new multimedia encoding schemes, cross-layer
in-network adaptation, machine-to-machine communication (including RFIDs), and rich 3D content, as well as
community networks and the use of peer-to-peer (P2P) overlays, are expected to generate new models of
interaction and cooperation, and to support enhanced perceived quality of experience (PQoE) and
innovative applications "on the move", such as virtual collaboration environments, personalised services
and media, virtual sport groups, online gaming, and edutainment. In this context, interaction with content,
combined with interactive multimedia search capabilities across distributed repositories, opportunistic P2P
networks, and dynamic adaptation to the characteristics of diverse mobile terminals, is expected to
contribute towards such a vision.
Based on work that has taken place in a number of EC co-funded projects in Framework Programme 6 (FP6)
and Framework Programme 7 (FP7), a group of experts and technology visionaries have voluntarily
contributed to this white paper, which aims to describe the status, the state of the art, the challenges,
and the way ahead in the area of content-aware media delivery platforms.
The FCC's Network Neutrality Ruling in the Comcast Case: Towards a Consensus with Europe?
In August 2008, the FCC found that Comcast's restrictions on peer-to-peer upload transmissions were unreasonably discriminatory, arbitrarily targeted a particular application, and deprived consumers of their rights to run Internet applications and use services of their choice. The Comcast ruling represents a significant change in the FCC's direction: given the FCC's past decisions that broadband Internet access services do not fall within the "common carrier" category, it is notable that the agency has now imposed nondiscrimination requirements on these services. This Article shows that the rationales articulated in the FCC's Comcast order, stressing both (i) concerns about protecting competition and (ii) concerns about protecting consumers from disruption of their ability to communicate freely and privately, are rooted in centuries of Anglo-American law defining the obligations of "common carriers." The FCC appears to be moving away from its traditional emphasis on the competition policy concerns, which justify asymmetrical regulation of dominant providers for the sake of enabling competition, and toward an emphasis on the consumer protection issues, which justify symmetrical regulation of all service providers regardless of whether they have market power. These developments in the U.S. echo the discussion now going on in Europe in the context of the package of proposals on a new common regulatory framework for telecommunications, released by the European Commission on Nov. 13, 2007, and now being debated by the European Parliament and Council. On both sides of the Atlantic, a trend is emerging to permit network discrimination only if the discrimination is narrowly tailored to achieve legitimate objectives.
Keywords: network neutrality, discrimination, common carrier, network management, Comcast, European Directives.
Assessing the Benefits of Public Research Within an Economic Framework: The Case of USDA's Agricultural Research Service
Evaluation of publicly funded research can help provide accountability and prioritize programs. In addition, Federal intramural research planning generally involves an institutional assessment of the appropriate Federal role, if any, and whether the research should be left to others, such as universities or the private sector. Many methods of evaluation are available; peer review, used primarily for establishing scientific merit, is the most common. Economic analysis focuses on quantifying ultimate research outcomes, whether measured in goods with market prices or in nonmarket goods such as environmental quality or human health. However, standard economic techniques may not be amenable to evaluating some important public research priorities or to institutional assessments. This report reviews quantitative methods and applies qualitative economic reasoning and stakeholder interviewing methods to the evaluation of economic benefits of Federal intramural research, using three case studies of research conducted by USDA's Agricultural Research Service (ARS). Differences among the case studies highlight the need to select suitable assessment techniques from available methodologies, the limited scope for comparing assessment results across programs, and the inherent difficulty of quantifying benefits in some research areas. When measurement and attribution issues make it difficult to quantify these benefits, the report discusses how qualitative insights based on economic concepts can help research prioritization.
Keywords: Agricultural Research Service, Federal intramural research, publicly funded research, Environmental Economics and Policy, Food Consumption/Nutrition/Food Safety, Livestock Production/Industries, Productivity Analysis.
Network Awareness of P2P Live Streaming Applications
Early P2P-TV systems have already attracted millions of users, and many new commercial solutions are entering this market. Little information is, however, available about how these systems work. In this paper we present large-scale sets of experiments to compare three of the most successful P2P-TV systems, namely PPLive, SopCast, and TVAnts. Our goal is to assess what level of "network awareness" has been embedded in the applications, i.e., what parameters mainly drive the peer selection and data exchange. Using a general framework that can be extended to other systems and metrics, we show that all applications largely base their choices on peer bandwidth, i.e., they prefer high-bandwidth users, which is rather intuitive. Moreover, TVAnts and PPLive also exhibit a preference to exchange data among peers in the same autonomous system a peer belongs to. However, no evidence emerges of a preference for peers in the same subnet or for peers closer to the considered peer. We believe that next-generation P2P live streaming applications definitely need to improve their level of network awareness, so as to better localize traffic in the network and thus increase their network-friendliness as well.
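The same-AS preference the abstract describes can be quantified by attributing a peer's download traffic to the autonomous systems of its neighbours. The sketch below illustrates the idea on hypothetical data; the function name and the flow tuples are illustrative, not taken from the paper's measurement framework.

```python
# Sketch (hypothetical data): estimate a P2P-TV peer's "network awareness"
# as the fraction of its download traffic coming from peers in the same
# autonomous system (AS) as the local peer.

from collections import defaultdict

def traffic_share_by_as(flows, local_as):
    """flows: iterable of (peer_as, bytes_downloaded) pairs.
    Returns (same_as_share, per_as_totals)."""
    totals = defaultdict(int)
    for peer_as, nbytes in flows:
        totals[peer_as] += nbytes
    grand = sum(totals.values())
    same = totals.get(local_as, 0) / grand if grand else 0.0
    return same, dict(totals)

# Toy trace: local peer sits in AS 3269; most bytes come from the same AS,
# which would indicate an AS-level locality preference.
flows = [(3269, 800), (1299, 150), (3269, 400), (6939, 50)]
share, per_as = traffic_share_by_as(flows, local_as=3269)
print(round(share, 2))  # prints 0.86
```

A real study would obtain `peer_as` by mapping observed peer IP addresses to AS numbers (e.g., via BGP routing tables) before aggregating.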
Performance Analysis of Publish/Subscribe Systems
The Desktop Grid offers solutions to overcome several challenges and to meet the increasing needs of
scientific computing. Its technology consists mainly in exploiting geographically dispersed resources to
run complex applications that need large computing power and/or significant storage capacity. However, as
the number of resources increases, the need for scalability, self-organisation, dynamic reconfiguration,
decentralisation, and performance becomes more and more essential. Since such properties are exhibited by
P2P systems, the convergence of grid computing and P2P computing seems natural. In this context, this
paper evaluates the scalability and performance of P2P tools for discovering and registering services.
Three protocols are used for this purpose: Bonjour, Avahi, and FreePastry. We have studied the behaviour
of these protocols with respect to two criteria: the elapsed time to register services and the time needed
to discover new services. Our aim is to analyse these results in order to choose the best protocol for
creating a decentralised middleware for desktop grids.
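The two criteria above, registration latency and discovery latency, can be measured with a simple timing harness. The sketch below uses an in-memory stub registry rather than the actual Bonjour, Avahi, or FreePastry APIs, so the class and its methods are stand-ins for illustration only; a real benchmark would wrap the corresponding protocol calls in the same `timed` helper.

```python
# Minimal sketch (stub registry, not Bonjour/Avahi/FreePastry) of the two
# metrics the paper measures: time to register a service and time to
# discover it from another node.

import time

class StubRegistry:
    """Stand-in for a service-discovery backend; a real measurement would
    invoke the protocol's own register/browse APIs here instead."""
    def __init__(self):
        self._services = {}

    def register(self, name, address):
        self._services[name] = address

    def discover(self, name):
        return self._services.get(name)

def timed(fn, *args):
    """Run fn(*args) and return (result, elapsed_seconds)."""
    t0 = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - t0

registry = StubRegistry()
_, reg_time = timed(registry.register, "grid-node-1", "10.0.0.5:4000")
addr, disc_time = timed(registry.discover, "grid-node-1")
print(addr)
```

Repeating such measurements while the number of registered services grows is what exposes the scalability differences between protocols.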
- …