CREATe 2012-2016: Impact on society, industry and policy through research excellence and knowledge exchange
On the eve of the CREATe Festival May 2016, the Centre published this legacy report (edited by Kerry Patterson & Sukhpreet Singh with contributions from consortium researchers)
Antitrust Overreach: Undoing Cooperative Standardization in the Digital Economy
Information technology markets in general, and wireless communications markets in particular, rely on standardization mechanisms to develop interoperable devices for data processing, storage, and transmission. From 2G through the emergent 5G standard, wireless communications markets have largely achieved standardization through cooperative multi-firm arrangements that likely outperform the historically dominant alternatives of government monopoly, which is subject to informational deficits and regulatory capture, and private monopoly, which suffers from pricing and other distortions inherent to protected market positions. This cooperative process has successfully relied on three key legal elements: reasonably secure patents, quasi-contractual licensing commitments supplemented by reputation effects, and targeted application of antitrust safeguards against collusion risk. Over approximately the past decade, antitrust agencies and courts in the U.S., Europe and Asia have taken actions that threaten this legal infrastructure by limiting patentees' ability to seek injunctive relief, adopting rigid understandings of "fair, reasonable and non-discriminatory" licensing principles, and addressing collusion risk among licensor-innovators while overlooking (and even exacerbating) collusion risk among licensee-implementers. These judicial and regulatory interventions in IP licensing markets shift value from firms and economies that specialize in generating innovations to firms and economies that specialize in integrating innovations into end-user products. These entity-level and country-level redistributive effects are illustrated by lobbying activities in the wireless communications markets and antitrust actions against IP licensors in jurisdictions that have substantial net IP deficits and are principally populated by IP licensees.
Current antitrust policy promotes producers' narrow interests in lower input costs while ignoring the broader public interest in preserving the cooperative standardization structures that have supported innovation and commercialization in the digital economy.
Location privacy: The challenges of mobile service devices
Adding to the current debate, this article focuses on the personal data and privacy challenges posed by private industry's use of smart mobile devices that provide location-based services to users and consumers. Directly relevant to personal data protection are valid concerns about the collection, retention, use and accessibility of this kind of personal data, in relation to which a key issue is whether valid consent is ever obtained from users. While it is indisputable that geo-location technologies serve important functions, their potential use for surveillance and invasion of privacy should not be overlooked. Thus, in this study we address the question of how a legal regime can ensure the proper functionality of geo-location technologies while preventing their misuse. In doing so, we examine whether information gathered from geo-location technologies is a form of personal data, how it is related to privacy and whether current legal protection mechanisms are adequate. We argue that geo-location data are indeed a type of personal data. Not only is this kind of data related to an identified or identifiable person, it can also reveal core biographical personal data. What is needed is the strengthening of the existing law that protects personal data (including location data), and a flexible legal response that can incorporate the ever-evolving and unknown advances in technology.
To boardrooms and sustainability: the changing nature of segmentation
Market segmentation is the process by which customers in markets with some heterogeneity
are grouped into smaller, more homogeneous segments of "similar" customers. A market
segment is a group of individuals, groups or organisations sharing similar characteristics
that cause them to have relatively similar needs and buying behaviour.
Segmentation is not a new concept: for six decades marketers have, in various guises, sought to
break down a market into sub-groups of users, each sharing common needs, buying behaviour
and marketing requirements. However, this approach to target market strategy development
has been rejuvenated in the past few years. Various reasons account for this upsurge in the
usage of segmentation, examination of which forms the focus of this white paper.
Ready access to data enables faster creation of a segmentation and the testing of propositions to
take to market. "Big data" has made the re-thinking of target market segments and value
propositions inevitable, desirable, faster and more flexible. The resulting information has
presented companies with more topical and consumer-generated insights than ever before.
However, many marketers, analytics directors and leadership teams feel overwhelmed by the
sheer quantity and immediacy of such data.
Analytical prowess in consultants and inside client organisations has benefited from a
step-change, using new heuristics and faster computing power, more topical data and stronger
market insights. The approach to segmentation today is much smarter and has stretched well
away from the days of limited data explored only with cluster analysis. The coverage and wealth
of the solutions are unimaginable when compared to the practices of a few years ago. Then,
typically only six to ten segments were forced into segmentation solutions, so that an
organisation could cater for these macro segments operationally as well as understand them
intellectually. Now there is the advent of what is commonly recognised as micro segmentation,
where the complexity of business operations and customer management requires highly
granular thinking. In support of this development, traditional agency/consultancy roles have
transitioned into in-house business teams led by data, campaign and business change planners.
The challenge has shifted from developing a granular segmentation solution that describes all
customers and prospects, into one of enabling an organisation to react to the granularity of the
solution, deploying its resources to permit controlled and consistent one-to-one interaction
within segments. So whilst the cost of delivering and maintaining the solution has reduced with
technology advances, a new set of systems, costs and skills in channel and execution
management is required to deliver on this promise. These new capabilities range from rich
feature creative and content management solutions, tailored copy design and deployment tools,
through to instant messaging middleware solutions that initiate multi-streams of activity in a
variety of analytical engines and operational systems.
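The cluster-analysis approach referred to above can be illustrated with a minimal sketch. This is a hypothetical example, not drawn from the white paper: the choice of k-means, the customer features (annual spend, purchase frequency) and the data are all illustrative assumptions.

```python
# Illustrative sketch only: grouping customers into market segments with a
# minimal k-means, in the spirit of the classic cluster-analysis approach.
# Feature names and data below are hypothetical.
import math
import random

def kmeans(points, k, iters=20, seed=0):
    """Group points into k segments by iterative centroid refinement."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = []
    for _ in range(iters):
        # Assign each customer to the nearest segment centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: math.dist(p, centroids[i]))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its assigned customers.
        for i, members in enumerate(clusters):
            if members:
                centroids[i] = tuple(sum(dim) / len(members)
                                     for dim in zip(*members))
    return centroids, clusters

# Hypothetical customers: (annual spend, purchases per year).
customers = [(100, 2), (120, 3), (110, 2), (900, 12), (950, 11), (880, 13)]
centroids, segments = kmeans(customers, k=2)
```

On this toy data the two segments that emerge correspond to a low-spend, low-frequency group and a high-spend, high-frequency group; micro-segmentation, by contrast, pushes k and the feature set far beyond what such a simple sketch can show.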
Companies have recruited analytics and insight teams, often headed by senior personnel, such as
an Insight Manager or Analytics Director. Indeed, the situations-vacant adverts for such
personnel outweigh posts for brand and marketing managers. Far more companies possess the
in-house expertise necessary to help with segmentation analysis. Some organisations are also
seeking to monetise one of the most regularly under-used latent business assets: data.
Developing the capability and culture to bring data together from all corners of a business, the open market, commercial sources and business partners, is a step-change, often requiring a
Chief Data Officer. This emerging role has also driven the professionalism of data exploration,
using more varied and sophisticated statistical techniques.
CEOs, CFOs and COOs, rather than CMOs, are increasingly the sponsors of segmentation projects
as well as the users of the resulting outputs. CEOs because recession has forced re-engineering
of value propositions and the need to look after core customers; CFOs because segmentation
leads to better and more prudent allocation of resources, especially NPD and marketing, around
the most important sub-sets of a market; COOs because they need to look after key customers
better and improve their satisfaction with service delivery. More and more it is recognised that
with a new segmentation comes organisational realignment and change, so most business
functions now have an interest in a segmentation project, not only the marketers.
Largely as a result of the digital era and the growth of analytics, directors and company
leadership teams are becoming used to receiving more extensive market intelligence and
quickly updated customer insight, so leading to faster responses to market changes, customer
issues, competitor moves and their own performance. This refreshing of insight and a leadership
team's reaction to this intelligence often result in more frequent modification of a
target market strategy and segmentation decisions.
Many projects set up to consider multi-channel strategy and offerings; digital marketing;
customer relationship management; brand strategies; new product and service development;
the re-thinking of value propositions, and so forth, now routinely commence with a
segmentation piece in order to frame the ongoing work. Most organisations have deployed
CRM systems and harnessed associated customer data. CRM first requires clarity in segment
priorities. The insights from a CRM system help inform the segmentation agenda and steer how
organisations engage with their important customers or prospects. The growth of CRM and its ensuing
data have assisted the ongoing deployment of segmentation.
One of the biggest changes for segmentation is the extent to which it is now deployed by
practitioners in the public and not-for-profit sectors, who are harnessing what is termed social
marketing, in order to develop and to execute more shrewdly their targeting, campaigns and
messaging. For marketing per se, the interest shown by non-profit organisations in the
marketing toolkit has been big news in recent years. At the very heart of the concept of social
marketing is the market segmentation process.
The sharp rise in threats to security from global unrest, terrorism and crime has focused
the minds of governments, security chiefs and their advisors. As a result, significant resources,
intellectual capability, computing and data management have been brought to bear on the
problem. At the core of this work is the identification and profiling of threats and the
mitigation of risk. In practice, much of this security and surveillance work harnesses the tools
developed for market segmentation and the profiling of different consumer behaviours.
This white paper presents the findings from interviews with leading exponents of segmentation
and also the insights from a recent study of marketing practitioners relating to their current
imperatives and foci. More extensive views of some of these "leading lights" have been sought
and are included here in order to showcase the latest developments and to help explain both
the ongoing surge of segmentation and the issues underpinning its practice. The principal
trends and developments are thereby presented and discussed in this paper.
Search Engines, Social Media, and the Editorial Analogy
Deconstructing the "editorial analogy," and analogical reasoning more generally, in First Amendment litigation involving powerful tech companies.
Reimagining Merger Analysis to Include Intent
Applications of Section 7 of the Clayton Act have been deficient in identifying and prohibiting anticompetitive mergers, particularly those involving the acquisition of nascent competitors in digital markets. While the language of the Clayton Act is flexible and broad, its implementation has evolved into a narrow, economic-focused analysis that requires (or expects) quantitative evidence to show competitive harm and establish a prima facie case. This approach sets an unusually high bar for plaintiffs when the mergers involve dynamic technology markets in which firms compete more on innovation than on price, primarily because the preferred economic tools are not well equipped to measure and predict innovation harms in the long run. The problems are exacerbated when dominant firms acquire nascent competitors because the potential competitive impact of their acquisition is inherently even more uncertain and therefore the quantifiable metrics even less helpful.
This Article makes a case for reimagining merger analysis to include intent to help satisfy the plaintiff's evidentiary burden and strengthen merger enforcement. Insisting on, or strongly preferring, empirical data to demonstrate effects of a proposed acquisition when that data is unavailable means that merger law will fail in its core mission for at least certain types of mergers. Therefore, the better approach is to be open to the use of other sources of evidence, such as intent, to supplement standard economic evidence. This Article explains why and how intent evidence can be probative in predicting effects, particularly in the case of a dominant digital platform's acquisition of a nascent rival. To illustrate, this Article draws on the collection of emails and statements made by Facebook's executives relating to the company's famous acquisitions of Instagram and WhatsApp.
Though many courts and commentators today are dismissive of the value of intent, integrating it into merger analysis would not require legislative action because the relevant statutory language is broad and no major case has barred its use. The Article concludes by addressing the main objections that critics have raised about the use of intent evidence in antitrust analysis generally.
Amazon and Platform Antitrust
With its decision in Ohio v. American Express, the U.S. Supreme Court for the first time embraced the recently developed, yet increasingly prolific, concept of the two-sided platform. Through advances in technology, platforms, which serve as intermediaries allowing two groups to transact, are increasingly ubiquitous, and many of the biggest tech companies operate in this fashion. Amazon Marketplace, for example, provides a platform for third-party vendors to sell directly to consumers through Amazon's web and mobile interfaces. At the same time that platforms and their scholarship have evolved, a burgeoning antitrust movement has also developed which focuses on the impact of the dominance of these tech companies and the fear that current antitrust laws are ill-equipped to prevent any potential anticompetitive behavior. Many of those who feel this way worried that American Express, which decided whether a plaintiff alleging anticompetitive behavior by a two-sided platform would have to show harm to both sides of the market to make a prima facie case, would give companies like Amazon even more power. This Note argues that while the case could be interpreted in such a way, because Amazon and similarly situated platforms possess a great degree of control over their users (in some cases competing with them directly), it would be unwise to do so.
Privacy as a Public Good
Privacy is commonly studied as a private good: my personal data is mine to protect and control, and yours is yours. This conception of privacy misses an important component of the policy problem. An individual who is careless with data exposes not only extensive information about herself, but about others as well. The negative externalities imposed on nonconsenting outsiders by such carelessness can be productively studied in terms of welfare economics. If all relevant individuals maximize private benefit, and expect all other relevant individuals to do the same, neoclassical economic theory predicts that society will achieve a suboptimal level of privacy. This prediction holds even if all individuals cherish privacy with the same intensity. As the theoretical literature would have it, the struggle for privacy is destined to become a tragedy.
But according to the experimental public-goods literature, there is hope. Like in real life, people in experiments cooperate in groups at rates well above those predicted by neoclassical theory. Groups can be aided in their struggle to produce public goods by institutions, such as communication, framing, or sanction. With these institutions, communities can manage public goods without heavy-handed government intervention. Legal scholarship has not fully engaged this problem in these terms. In this Article, we explain why privacy has aspects of a public good, and we draw lessons from both the theoretical and the empirical literature on public goods to inform the policy discourse on privacy.