High Dimensional Consistent Digital Segments
We consider the problem of digitizing Euclidean line segments from R^d to Z^d. Christ et al. (DCG, 2012) showed how to construct a set of consistent digital segments (CDS) for d=2: a collection of segments connecting any two points in Z^2 that satisfies the natural extension of the Euclidean axioms to Z^d. In this paper we study the construction of CDSs in higher dimensions.
We show that any total order can be used to create a set of consistent digital rays (CDR) in Z^d (a set of rays emanating from a fixed point p that satisfies the extension of the Euclidean axioms). We fully characterize for which total orders the construction holds and study their Hausdorff distance, which in particular positively answers the question posed by Christ et al.
Consistent Digital Curved Rays and Pseudoline Arrangements
Representing a family of geometric objects in the digital world, where each object is represented by a set of pixels, is a basic problem in graphics and computational geometry. One important criterion is consistency: the intersection pattern of the objects should be consistent with the axioms of Euclidean geometry, e.g., the intersection of two lines should be a single connected component. Previously, sets of linear rays and segments have been considered. In this paper, we extend this theory to families of curved rays going through the origin. We further consider some pseudoline arrangements obtained as unions of such families of rays.
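The consistency criterion above is concrete enough to check mechanically. The sketch below (an illustration, not code from either paper) models a digital object as a set of integer pixels and tests whether the intersection of two such objects forms a single connected component; 4-connectivity is assumed here, though the papers' exact connectivity model may differ.

```python
# Minimal sketch of the connectedness criterion, assuming 4-connectivity.
from collections import deque

def is_single_component(pixels):
    """True if the pixel set is empty or forms one 4-connected component."""
    if not pixels:
        return True
    pixels = set(pixels)
    start = next(iter(pixels))
    seen = {start}
    queue = deque([start])
    while queue:
        x, y = queue.popleft()
        for nbr in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nbr in pixels and nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return len(seen) == len(pixels)

def intersection_is_consistent(obj_a, obj_b):
    """Check the axiom that two digital objects meet in one component."""
    return is_single_component(set(obj_a) & set(obj_b))
```

For example, two digital segments that share only the pixel (1, 0) satisfy the criterion, while two pixel sets meeting in two separated pixels would violate it.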
To boardrooms and sustainability: the changing nature of segmentation
Market segmentation is the process by which customers in markets with some heterogeneity
are grouped into smaller homogeneous segments of more ‘similar’ customers. A market
segment is a group of individuals, groups or organisations sharing similar characteristics that
cause them to have relatively similar needs and purchasing behaviour.
Segmentation is not a new concept: for six decades marketers have, in various guises, sought to
break down a market into sub-groups of users, each sharing common needs, buying behaviour
and marketing requirements. However, this approach to target market strategy development
has been rejuvenated in the past few years. Various reasons account for this upsurge in the
usage of segmentation, examination of which forms the focus of this white paper.
Ready access to data enables faster creation of a segmentation and the testing of propositions to
take to market. ‘Big data’ has made the re-thinking of target market segments and value
propositions inevitable, desirable, faster and more flexible. The resulting information has
presented companies with more topical and consumer-generated insights than ever before.
However, many marketers, analytics directors and leadership teams feel overwhelmed by the
sheer quantity and immediacy of such data.
Analytical prowess in consultants and inside client organisations has benefited from a step-change,
using new heuristics and faster computing power, more topical data and stronger
market insights. The approach to segmentation today is much smarter and has stretched well
away from the days of limited data explored only with cluster analysis. The coverage and wealth
of the solutions are unimaginable when compared to the practices of a few years ago. Then,
typically only six to ten segments were forced into segmentation solutions, so that an
organisation could cater for these macro segments operationally as well as understand them
intellectually. Now there is the advent of what is commonly recognised as micro segmentation,
where the complexity of business operations and customer management requires highly
granular thinking. In support of this development, traditional agency/consultancy roles have
transitioned into in-house business teams led by data, campaign and business change planners.
The challenge has shifted from developing a granular segmentation solution that describes all
customers and prospects, into one of enabling an organisation to react to the granularity of the
solution, deploying its resources to permit controlled and consistent one-to-one interaction
within segments. So whilst the cost of delivering and maintaining the solution has reduced with
technology advances, a new set of systems, costs and skills in channel and execution
management is required to deliver on this promise. These new capabilities range from rich
feature creative and content management solutions, tailored copy design and deployment tools,
through to instant messaging middleware solutions that initiate multi-streams of activity in a
variety of analytical engines and operational systems.
Companies have recruited analytics and insight teams, often headed by senior personnel, such as
an Insight Manager or Analytics Director. Indeed, the situations-vacant adverts for such
personnel outweigh posts for brand and marketing managers. Far more companies possess the
in-house expertise necessary to help with segmentation analysis. Some organisations are also
seeking to monetise one of the most regularly under-used latent business assets… data.
Developing the capability and culture to bring data together from all corners of a business, the open market, commercial sources and business partners, is a step-change, often requiring a
Chief Data Officer. This emerging role has also driven the professionalism of data exploration,
using more varied and sophisticated statistical techniques.
CEOs, CFOs and COOs, rather than CMOs, are increasingly the sponsors of segmentation projects
as well as the users of the resulting outputs. CEOs because recession has forced re-engineering of
value propositions and the need to look after core customers; CFOs because segmentation leads
to better and more prudent allocation of resources – especially NPD and marketing – around the
most important sub-sets of a market; COOs because they need to better look after key
customers and improve their satisfaction in service delivery. More and more it is recognised that
with a new segmentation comes organisational realignment and change, so most business
functions now have an interest in a segmentation project, not only the marketers.
Largely as a result of the digital era and the growth of analytics, directors and company
leadership teams are becoming used to receiving more extensive market intelligence and
quickly updated customer insight, so leading to faster responses to market changes, customer
issues, competitor moves and their own performance. This refreshing of insight and a leadership
team’s reaction to this intelligence often result in there being more frequent modification of a
target market strategy and segmentation decisions.
Consequently, many projects set up to consider multi-channel strategy and offerings; digital marketing;
customer relationship management; brand strategies; new product and service development;
the re-thinking of value propositions, and so forth, now routinely commence with a
segmentation piece in order to frame the ongoing work. Most organisations have deployed
CRM systems and harnessed associated customer data. CRM first requires clarity in segment
priorities. The insights from a CRM system help inform the segmentation agenda and steer how
organisations engage with their important customers or prospects. The growth of CRM and its ensuing
data have assisted the ongoing deployment of segmentation.
One of the biggest changes for segmentation is the extent to which it is now deployed by
practitioners in the public and not-for-profit sectors, who are harnessing what is termed social
marketing, in order to develop and to execute more shrewdly their targeting, campaigns and
messaging. For marketing per se, the interest in the marketing toolkit from non-profit
organisations has been big news in recent years. At the very heart of the concept of social
marketing is the market segmentation process.
The extreme rise in the threat to security from global unrest, terrorism and crime has focused
the minds of governments, security chiefs and their advisors. As a result, significant resources,
intellectual capability, computing and data management have been brought to bear on the
problem. The core of this work is the importance of identifying and profiling threats and so
mitigating risk. In practice, much of this security and surveillance work harnesses the tools
developed for market segmentation and the profiling of different consumer behaviours.
This white paper presents the findings from interviews with leading exponents of segmentation
and also the insights from a recent study of marketing practitioners relating to their current
imperatives and foci. More extensive views of some of these ‘leading lights’ have been sought
and are included here in order to showcase the latest developments and to help explain both
the ongoing surge of segmentation and the issues underpinning its practice. The principal
trends and developments are thereby presented and discussed in this paper.
Structural matching by discrete relaxation
This paper describes a Bayesian framework for performing relational graph matching by discrete relaxation. Our basic aim is to draw on this framework to provide a comparative evaluation of a number of contrasting approaches to relational matching. Broadly speaking, there are two main aspects to this study. Firstly, we focus on the issue of how relational inexactness may be quantified. We illustrate that several popular relational distance measures can be recovered as specific limiting cases of the Bayesian consistency measure. The second aspect of our comparison concerns the way in which structural inexactness is controlled. We investigate three different realizations of the matching process which draw on contrasting control models. The main conclusion of our study is that the active process of graph-editing outperforms the alternatives in terms of its ability to effectively control a large population of contaminating clutter.
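To fix ideas, the sketch below shows a generic discrete-relaxation style update for graph matching. It is an illustration only, not the paper's Bayesian formulation or graph-editing scheme: each data node holds a current label (a model node), and nodes are iteratively relabelled to maximise the number of data edges mapped onto model edges.

```python
# Hedged sketch of a discrete-relaxation update for relational matching.
# data_adj / model_adj: dict mapping node -> set of neighbouring nodes.
# init: dict mapping each data node to an initial model-node label.
def discrete_relaxation(data_adj, model_adj, init, iters=10):
    labels = dict(init)
    model_nodes = list(model_adj)
    for _ in range(iters):
        changed = False
        for u in data_adj:
            def support(lbl):
                # Count neighbours of u whose labels are model-adjacent to lbl.
                return sum(1 for w in data_adj[u]
                           if labels.get(w) in model_adj.get(lbl, set()))
            best = max(model_nodes, key=support)
            if support(best) > support(labels[u]):
                labels[u] = best   # relabel only on strict improvement
                changed = True
        if not changed:            # converged to a local consistency maximum
            break
    return labels
```

Starting from a partially wrong assignment on a small graph, the update drives the labelling toward one in which every data edge maps onto a model edge.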
Three-dimensional reconstruction of stenosed coronary artery segments with assessment of the flow impedance
In this paper preliminary results of a study about the diagnostic benefits of 3D visualization and quantitation of stenosed coronary artery segments are presented. As is well known, even biplane angiographic images do not provide enough information for binary reconstruction. Therefore, a priori information about the slice to be reconstructed must be incorporated into the reconstruction algorithm. One approach is to assume a circular cross-section of the coronary artery. Hence, the diameter is estimated from the contours of the vessels in both projections. Another approach is to search for a solution of the reconstruction problem close to the previously reconstructed adjacent slice. In this paper we follow the first method based on contour information. The reconstructed coronary segment is visualized in three dimensions. Based on the obtained geometry of the obstruction the pertinent blood flow impedance is estimated on the basis of fluid dynamic principles. The results of applying the reconstruction algorithms to clinical coronary biplane exposures are presented with an indication of the assessed flow impedance.
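The abstract does not spell out its fluid-dynamic model, but a common first-order estimate consistent with the circular cross-section assumption is Poiseuille resistance, R = 8μL/(πr⁴), summed over the reconstructed slices. The sketch below is a hypothetical illustration along those lines, with per-slice radii as would come from the contour-based reconstruction; the viscosity value is an assumed typical figure for blood.

```python
# Hedged sketch: series Poiseuille resistance of a reconstructed segment.
import math

def poiseuille_resistance(radii_m, slice_len_m, mu=3.5e-3):
    """radii_m: per-slice lumen radii in metres (circular cross-section
    assumption, matching the first reconstruction approach above).
    slice_len_m: axial thickness of each slice in metres.
    mu: blood viscosity in Pa*s (assumed typical value).
    Returns total resistance in Pa*s/m^3; slices add in series."""
    return sum(8.0 * mu * slice_len_m / (math.pi * r ** 4) for r in radii_m)
```

Because of the r⁻⁴ dependence, a single stenosed slice with half the radius contributes sixteen times the resistance of a healthy slice, which is why an accurate local diameter estimate matters so much for impedance assessment.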
Non-white frequency noise in spin torque oscillators and its effect on spectral linewidth
We measure the power spectral density of frequency fluctuations in nanocontact spin torque oscillators over time scales up to 50 ms. We use a mixer to convert oscillator signals ranging from 10 GHz to 40 GHz into a band near 70 MHz before digitizing the time domain waveform. We analyze the waveform using both zero crossing time stamps and a sliding Fourier transform, discuss the different limitations and advantages of these two methods, and combine them to obtain a frequency noise spectrum spanning more than five decades of Fourier frequency f. For devices having a free layer consisting of either a single NiFe layer or a Co/Ni multilayer, we find a frequency noise spectrum that is white at large f and varies as 1/f at small f. The crossover frequency ranges from ≈10^4 Hz to ≈10^6 Hz, and the 1/f component is stronger in the multilayer devices. Through actual and simulated spectrum analyzer measurements, we show that 1/f frequency noise causes both broadening and a change in shape of the oscillator's spectral line as measurement time increases. Our results indicate that the long term stability of spin torque oscillators cannot be accurately predicted from models based on thermal (white) noise sources.
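The zero-crossing method described above can be sketched in a few lines: the timestamps of upward zero crossings of the downconverted waveform give cycle periods, and their reciprocals form an instantaneous-frequency time series whose power spectrum is the frequency-noise spectrum. The example below is a hypothetical illustration on a clean synthetic tone near 70 MHz (the sampling rate is an assumption, not a figure from the paper).

```python
# Hedged sketch of frequency extraction from zero-crossing time stamps.
import numpy as np

def zero_crossing_frequencies(signal, fs):
    """Instantaneous frequency from successive upward zero crossings,
    with linear interpolation of the crossing instant between samples."""
    s = np.asarray(signal, dtype=float)
    idx = np.where((s[:-1] < 0) & (s[1:] >= 0))[0]
    # Interpolated crossing times, in seconds.
    t_cross = (idx - s[idx] / (s[idx + 1] - s[idx])) / fs
    periods = np.diff(t_cross)           # one period per oscillation cycle
    return 1.0 / periods                 # instantaneous-frequency series

fs = 1e9                                  # assumed 1 GS/s digitizer
f0 = 70e6                                 # tone in the 70 MHz band above
t = np.arange(100_000) / fs               # 100 us of synthetic waveform
f_inst = zero_crossing_frequencies(np.sin(2 * np.pi * f0 * t), fs)
```

On real data, the power spectral density of `f_inst` (e.g. via a periodogram) would reveal the white and 1/f regions; the sliding-Fourier-transform method trades the zero-crossing method's fine time resolution for robustness at low signal-to-noise ratio.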