Coded Caching Schemes for Multiaccess Topologies via Combinatorial Design
This paper studies a multiaccess coded caching (MACC) model in which the
connectivity topology between the users and the caches is described by a class
of combinatorial designs. Our model includes as special cases several MACC
topologies considered in previous works. The considered MACC network comprises
a server containing a library of files, a set of cache nodes, and a set of
cacheless users, where each user can access a subset of the cache nodes. The
server is connected to the users via an error-free shared link, and each user
can directly retrieve the content of its connected cache nodes. Our goal is to
minimise the worst-case transmission load on the shared link in the delivery
phase. The main limitation of the existing MACC works is that only some
specific access topologies are considered, and thus the number of users must
be either linear or exponential in the number of cache nodes. We overcome this
limitation by formulating a new access topology derived from two classical
combinatorial structures, namely designs and group divisible designs. In these
topologies, the number of users scales linearly, polynomially, or even
exponentially with the number of cache nodes. By
leveraging the properties of the considered combinatorial structures, we
propose two classes of coded caching schemes for a flexible number of users,
where the number of users can scale linearly, polynomially or exponentially
with the number of cache nodes. In addition, our schemes unify most existing
schemes for the shared-link network and many schemes for the multiaccess
network, except those for the cyclic wrap-around topology.
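As a minimal illustrative sketch (not the paper's actual construction), one classical multiaccess topology indexes each cacheless user by a distinct subset of cache nodes, so that the number of users grows polynomially with the number of caches. The function name and parameters below are hypothetical.

```python
from itertools import combinations

def multiaccess_topology(num_caches, access_degree):
    """Index each cacheless user by a distinct subset of cache nodes.

    Returns a list mapping each user to the tuple of cache nodes it can
    access. The number of users is C(num_caches, access_degree), which
    grows polynomially in num_caches for a fixed access degree.
    """
    return list(combinations(range(num_caches), access_degree))

topology = multiaccess_topology(6, 2)
print(len(topology))  # 15 users, each connected to 2 of the 6 caches
```

A design-based topology generalises this by choosing the user-to-cache incidence from a combinatorial design rather than from all subsets.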
Hardware accelerated computer graphics algorithms
The advent of shaders in the latest generations of graphics hardware, which has made consumer level graphics hardware partially programmable, makes now an ideal time to investigate new graphical techniques and algorithms, as well as to attempt to improve upon existing ones.
This work looks at areas of current interest within the graphics community such as Texture Filtering, Bump Mapping and Depth of Field simulation. These are all areas which have enjoyed much interest over the history of computer graphics but which provide a great deal of scope for further investigation in the light of recent hardware advances.
A new hardware implementation of a texture filtering technique, aimed at consumer level hardware, is presented. This novel technique utilises Fourier space image filtering to reduce aliasing. Investigation shows that the technique provides reduced levels of aliasing along with comparable levels of detail to currently popular techniques. This adds to the community's knowledge by expanding the range of techniques available, as well as increasing the number of techniques which offer the potential for easy integration with current consumer level graphics hardware along with real-time performance.
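The core idea of Fourier-space filtering for anti-aliasing can be sketched as follows. This is a minimal NumPy illustration of the general principle, not the thesis's hardware implementation; the function name and cutoff parameter are hypothetical.

```python
import numpy as np

def fourier_lowpass(texture, cutoff_radius):
    """Low-pass filter a 2-D texture in Fourier space, suppressing the
    high-frequency content that would otherwise alias when the texture
    is sampled at a lower rate."""
    spectrum = np.fft.fftshift(np.fft.fft2(texture))
    h, w = texture.shape
    yy, xx = np.ogrid[:h, :w]
    # distance of each frequency bin from the spectrum centre (DC term)
    dist = np.hypot(yy - h // 2, xx - w // 2)
    spectrum[dist > cutoff_radius] = 0  # discard high frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum)))

texture = np.ones((8, 8))
filtered = fourier_lowpass(texture, cutoff_radius=2)
```

Removing frequencies above the sampling rate's Nyquist limit before sampling is what reduces aliasing; a hardware version would precompute such band-limited representations rather than run FFTs per fetch.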
Bump mapping is a long-standing and well understood technique. Variations and extensions of it have been popular in real-time 3D computer graphics for many years. A new hardware implementation of a technique termed Super Bump Mapping (SBM) is introduced. Expanding on the work of Cant and Langensiepen [1], the SBM technique adopts the novel approach of using normal maps which supply multiple vectors per texel. This allows the retention of much more detail and overcomes some of the aliasing deficiencies of standard bump mapping caused by the standard single vector approach and the non-linearity of the bump mapping process.
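The non-linearity mentioned above is why a single averaged normal per texel loses detail: shading the average of several normals gives a different result from averaging the shading of each normal. A minimal sketch of that distinction, using a simple Lambertian term (the texel's stored normals here are hypothetical, and this is not the SBM implementation itself):

```python
import numpy as np

light = np.array([0.0, 0.0, 1.0])  # unit light direction

def lambert(normal):
    # diffuse shading term for a single unit normal
    return max(float(np.dot(normal, light)), 0.0)

# hypothetical texel storing two surface normals instead of one
normals = [np.array([0.0, 0.0, 1.0]), np.array([1.0, 0.0, 0.0])]

# multi-vector approach: shade each stored normal, then average the results
multi = sum(lambert(n) for n in normals) / len(normals)

# single-vector approach: average the normals first, then shade once
avg = np.mean(normals, axis=0)
avg = avg / np.linalg.norm(avg)
single = lambert(avg)

print(multi, single)  # 0.5 vs ~0.707: shading is non-linear in the normal
```

Because the two results differ, pre-averaging normals (as in standard mip-mapped bump maps) systematically over- or under-shades, which is the aliasing deficiency the multi-vector-per-texel approach addresses.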
A novel depth of field algorithm is proposed, which is an extension of the author's previous work [2][3][4]. The technique is aimed at consumer level hardware and attempts to raise the bar for realism by providing support for the 'see-through' effect. This effect is a vital factor in the realistic appearance of simulated depth of field but has been overlooked in real-time computer graphics due to the complexity of an accurate calculation. The implementation of this new algorithm on current consumer level hardware is investigated, and it is concluded that while current hardware is not yet capable enough, future iterations will provide the necessary functional and performance increases.
Cyber Law and Espionage Law as Communicating Vessels
Professor Lubin's contribution is Cyber Law and Espionage Law as Communicating Vessels, pp. 203-225.
Existing legal literature would have us assume that espionage operations and “below-the-threshold” cyber operations are doctrinally distinct. Whereas one is subject to the scant, amorphous, and under-developed legal framework of espionage law, the other is subject to an emerging, ever-evolving body of legal rules, known cumulatively as cyber law. This dichotomy, however, is erroneous and misleading. In practice, espionage and cyber law function as communicating vessels, and so are better conceived as two elements of a complex system, Information Warfare (IW). This paper therefore first draws attention to the similarities between the practices – the fact that the actors, technologies, and targets are interchangeable, as are the knee-jerk legal reactions of the international community. In light of the convergence between peacetime Low-Intensity Cyber Operations (LICOs) and peacetime Espionage Operations (EOs) the two should be subjected to a single regulatory framework, one which recognizes the role intelligence plays in our public world order and which adopts a contextual and consequential method of inquiry. The paper proceeds in the following order: Part 2 provides a descriptive account of the unique symbiotic relationship between espionage and cyber law, and further explains the reasons for this dynamic. Part 3 places the discussion surrounding this relationship within the broader discourse on IW, making the claim that the convergence between EOs and LICOs, as described in Part 2, could further be explained by an even larger convergence across all the various elements of the informational environment. Parts 2 and 3 then serve as the backdrop for Part 4, which details the attempt of the drafters of the Tallinn Manual 2.0 to compartmentalize espionage law and cyber law, and the deficits of their approach. 
The paper concludes by proposing an alternative holistic understanding of espionage law, grounded in general principles of law, which is more practically transferable to the cyber realm.
Exploring the Boundaries of Patent Commercialization Models via Litigation
This thesis explores direct patent commercialization via patent assertion, particularly patent infringement litigation, a complex nonmarket activity whose successful undertaking requires knowledge, creativity, and financial resources, as well as a colorable infringement case. Despite these complexities, firms have increasingly employed patents as competitive tools via patent assertions, particularly in the United States. This thesis explores the business models that have been created to facilitate the direct monetization of patents. Since secrecy underpins the patent assertion strategies studied, the thesis is based on rich and enhanced secondary data. In particular, a data chaining technique has been developed to assemble relevant but disparate data into a larger coherent data set that is amenable to combination and pairing with other forms of relevant public data. This research has discovered that one particularly successful business model that employs a leveraging strategy, known as the non-practicing entity (“NPE”), has itself spawned at least two other business models, the highly capitalized “patent mass aggregator” and the “patent privateer.” The patent privateer, newly discovered in this research, is particularly interesting because it provides a way for firms to employ patents to attack competitors by forming specialized NPEs in a manner that essentially expands the boundaries of the firm. This research has also examined plaintiff firm management processes during litigations brought under leveraging and proprietary strategies, the two patent litigation strategies in which firms affirmatively initiate infringement litigations. In particular, this research investigates the commercial contexts that drive patent assertion strategies to explore the effective limits of the patent right in a litigation context. 
The investigation concludes that a variety of robust business models and management processes may be quite successful in extracting value from patents in the US.
Cyber-Activists As Innovators: Online Technologies and the Power Struggle in Iran
This thesis analyses key social and technical capabilities and functions in Iran through the
lens of the National Innovation System (NIS) model, focusing on processes influencing
the on-going online encounter between the regime and local and expatriate pro-democracy
cyber-activists in the aftermath of the country's contested presidential
elections in June 2009. Conceptually, it is located in Science and Technology Studies
(STS), with an emphasis on constructivist theory including Social Shaping of Technology
(SST) as its creative backbone.
In the original Nordic conceptualisation of the NIS model, openness is considered a
given. This prevents the model from adequately explaining the dynamics of innovation in
repressive countries. In Iran, nationwide innovation processes are distorted by high level
security officials' ideology-driven approach to the generation and diffusion of scientific
knowledge and the influence of the Islamic Revolutionary Guards Corps (IRGC) over
Iran's national economy. Bifurcated due to significant political differences, the Iranian
NIS has become dysfunctional in the absence of an integrated, democratic structure,
making the country highly dependent on foreign expertise.
The overreliance of Iran on cross-border technological contributions is reflected in the
state's internet surveillance apparatus. Currently, major European information and
communications technology (ICT) companies aid the core of the censorship infrastructure
employed by the Iranian regime, while a great majority of the anti-filtering software used
by the cyber-activists is developed by North American universities, research centres and
human rights NGOs. This, in turn, highlights a limitation in the EU export policy
regimen, which fails to promote the development of pro-democracy online innovations
and remains relatively weak in terms of its ability to regulate the overseas trade of
telecommunications technologies.
Laying emphasis on the social responsibility of large international telecommunications
companies, and drawing on the combined outcomes of weblog content analysis,
semi-structured expert interviews and document reviews, the results of this project are
expected to help improve Western policies on dual-use ICT exports to repressive
countries. A focused attempt at the dynamisation of relevant legislation by the European
Parliament (EP) can help more effectively foster egalitarian values in emerging
economies through supporting legitimate, bottom-up dissent.
The main body of data used by this research was collected through a longitudinal
observation of 65 Persian activist weblogs evaluated against an inductively crafted
checklist. The preliminary findings of the weblog content analysis were later on
examined in relation to the scripts of direct discussions with 17 active scholars and
practitioners sampled largely by snowballing, as well as to an extensive archive of legal
and journalistic documents.