Intangible trust requirements - how to fill the requirements trust "gap"?
Previous research efforts have focused on capturing and instantiating either "soft" trust requirements that relate to HCI usability concerns, or "hard" tangible security requirements that relate primarily to security assurance and security protocols. Little direct attention has been paid to managing intangible trust-related requirements per se. This 'gap' is perhaps most evident in the public B2C (Business to Consumer) E-Systems we all use on a daily basis. Some speculative suggestions are made as to how to fill the 'gap'. Visual card sorting is suggested as a suitable evaluative tool, whilst deontic logic trust norms and UML extended notation are the suggested (methodologically invariant) means by which software development teams can more fully capture, and hence visualize, intangible trust requirements.
Making the Most of Your Samples
We study the problem of setting a price for a potential buyer with a valuation drawn from an unknown distribution. The seller has "data" about the distribution in the form of i.i.d. samples, and the algorithmic challenge is to use these samples to obtain expected revenue as close as possible to what could be achieved with advance knowledge of the distribution.
Our first set of results quantifies the number of samples that are necessary and sufficient to obtain a (1 - epsilon)-approximation. For example, for an unknown distribution that satisfies the monotone hazard rate (MHR) condition, we prove tight upper and lower bounds, as a function of epsilon, on the number of samples required. Remarkably, this is fewer samples than is necessary
to accurately estimate the expected revenue obtained by even a single reserve
price. We also prove essentially tight sample complexity bounds for regular
distributions, bounded-support distributions, and a wide class of irregular
distributions. Our lower bound approach borrows tools from differential privacy
and information theory, and we believe it could find further applications in
auction theory.
Our second set of results considers the single-sample case. For regular
distributions, we prove that no pricing strategy is better than
1/2-approximate, and this is matched by the Bulow-Klemperer theorem.
For MHR distributions, we show how to do better: we give a simple pricing
strategy that guarantees expected revenue strictly greater than 1/2 times the
maximum possible. We also prove that no pricing strategy achieves an
approximation guarantee better than a constant bounded away from 1.
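As a toy illustration (my own sketch, not the paper's algorithm or analysis), the single-sample strategy suggested by the Bulow-Klemperer theorem, namely posting the one observed sample as the price, can be simulated on an MHR distribution such as Exp(1); the function name and parameters here are hypothetical:

```python
import math
import random

def simulate_single_sample_pricing(n_trials=200_000, seed=0):
    """Monte Carlo sketch of single-sample reserve pricing.

    Valuations are drawn from Exp(1), which satisfies the MHR
    condition. The seller sees one sample s from the distribution
    and posts it as the price for an independent buyer with
    valuation v; revenue is s if v >= s, and 0 otherwise.
    Returns the ratio of this revenue to the optimum achievable
    with full knowledge of the distribution.
    """
    rng = random.Random(seed)
    revenue = 0.0
    for _ in range(n_trials):
        s = rng.expovariate(1.0)   # the seller's single sample
        v = rng.expovariate(1.0)   # the buyer's private valuation
        if v >= s:
            revenue += s
    avg_revenue = revenue / n_trials

    # With full knowledge of Exp(1), the optimal posted price is
    # p = 1 (maximizing p * exp(-p)), for expected revenue 1/e.
    opt_revenue = math.exp(-1.0)
    return avg_revenue / opt_revenue
```

On this particular distribution the simulated ratio comfortably exceeds the 1/2 guarantee that holds for all regular distributions.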
Collisions of Jets of Particles from Active Galactic Nuclei with Neutralino Dark Matter
We examine the possibility that energetic Standard Model particles contained
in the jets produced by active galactic nuclei (AGN) may scatter off of the
dark matter halo which is expected to surround the AGN. In particular, if there
are nearby states in the dark sector which can appear resonantly in the
scattering, the cross section can be enhanced and a distinctive edge feature in
the energy spectrum may appear. We examine bounds on supersymmetric models
which may be obtained from the Fermi Gamma-ray Space Telescope observation of
the nearby AGN Centaurus A.
Comment: 20 pages, 9 figures; v2: version published in JCA
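A generic sketch (not the paper's supersymmetric computation) of why a nearby state in the dark sector enhances the scattering rate: a Breit-Wigner factor peaks sharply when the squared centre-of-mass energy hits the resonance mass. The mass and width below are hypothetical:

```python
def breit_wigner(s, m, gamma):
    """Breit-Wigner line shape for a resonance of mass m and width
    gamma: the factor peaks when the squared CM energy s equals m**2."""
    return 1.0 / ((s - m ** 2) ** 2 + (m * gamma) ** 2)

# Hypothetical numbers: a 100 GeV dark-sector state with a 1 GeV width.
m, gamma = 100.0, 1.0
on_res = breit_wigner(m ** 2, m, gamma)             # exactly on resonance
off_res = breit_wigner((1.1 * m) ** 2, m, gamma)    # CM energy 10% above
enhancement = on_res / off_res
```

Even a modest detuning of the collision energy suppresses the factor by orders of magnitude, which is why a resonant configuration can produce a distinctive edge in the spectrum.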
Can intelligent optimisation techniques improve computing job scheduling in a Grid environment? review, problem and proposal
In the existing Grid scheduling literature, the reported methods and strategies are mostly related to high-level schedulers such as global schedulers, external schedulers, data schedulers, and cluster schedulers. Although a number of these have previously considered job scheduling, thus far only relatively simple queue-based policies such as First In First Out (FIFO) have been considered for local job scheduling within Grid contexts. Our initial research shows that it is worth investigating the potential impact on the performance of the Grid when intelligent optimisation techniques are applied to local scheduling policies. The research problem is defined, and a basic research methodology with a detailed roadmap is presented. This paper forms a proposal with the intention of exchanging ideas and seeking potential collaborators.
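One minimal example of the premise (a toy of mine, not a result from the paper): on a single resource, even the simple shortest-job-first policy beats FIFO on mean waiting time, suggesting there is room for smarter local scheduling policies. The job runtimes are hypothetical:

```python
def mean_waiting_time(runtimes):
    """Mean time each job waits before starting when jobs run
    back-to-back on one resource in the given order."""
    wait, elapsed = 0.0, 0.0
    for r in runtimes:
        wait += elapsed
        elapsed += r
    return wait / len(runtimes)

jobs = [40, 3, 12, 7, 25]               # hypothetical runtimes (minutes)
fifo = mean_waiting_time(jobs)          # jobs served in arrival order
sjf = mean_waiting_time(sorted(jobs))   # shortest-job-first ordering
```

Here FIFO yields a mean wait of 40 minutes versus 16.4 for shortest-job-first; intelligent optimisation techniques would search a far richer policy space than this single reordering.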
Reflection on Kurosawa Akira’s Movies
This senior project is a reflection on Kurosawa Akira's movies. I discussed the film techniques he used and his cultural confidence. In addition, I talked about the connection between his movies and my adaptation of Spring Bird.
Private Matchings and Allocations
We consider a private variant of the classical allocation problem: given k
goods and n agents with individual, private valuation functions over bundles of
goods, how can we partition the goods amongst the agents to maximize social
welfare? An important special case is when each agent desires at most one good,
and specifies her (private) value for each good: in this case, the problem is
exactly the maximum-weight matching problem in a bipartite graph.
Private matching and allocation problems have not been considered in the
differential privacy literature, and for good reason: they are plainly
impossible to solve under differential privacy. Informally, the allocation must
match agents to their preferred goods in order to maximize social welfare, but
this preference is exactly what agents wish to hide. Therefore, we consider the
problem under the relaxed constraint of joint differential privacy: for any
agent i, no coalition of agents excluding i should be able to learn about the
valuation function of agent i. In this setting, the full allocation is no
longer published---instead, each agent is told what good to get. We first show
that with a small number of identical copies of each good, it is possible to
efficiently and accurately solve the maximum weight matching problem while
guaranteeing joint differential privacy. We then consider the more general
allocation problem, when bidder valuations satisfy the gross substitutes
condition. Finally, we prove that the allocation problem cannot be solved to
non-trivial accuracy under joint differential privacy without requiring
multiple copies of each type of good.
Comment: Journal version published in SIAM Journal on Computing; an extended abstract appeared in STOC 201
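For reference, the underlying non-private problem, maximum-weight matching of agents to goods, can be sketched by brute force on tiny instances. This is a baseline of mine, not the paper's jointly differentially private mechanism, and the valuations are hypothetical:

```python
from itertools import permutations

def max_weight_matching(values):
    """Exhaustive max-weight bipartite matching: values[i][j] is
    agent i's value for good j, with one good per agent. Returns
    (welfare, assignment) where assignment[i] is agent i's good."""
    n = len(values)
    best = (float("-inf"), None)
    for perm in permutations(range(n)):
        welfare = sum(values[i][perm[i]] for i in range(n))
        best = max(best, (welfare, perm))
    return best

# Hypothetical private valuations for 3 agents over 3 goods.
values = [[8, 1, 2],
          [5, 9, 3],
          [4, 6, 7]]
welfare, assignment = max_weight_matching(values)
```

The difficulty the abstract highlights is visible even here: the welfare-maximizing assignment is determined by, and therefore leaks, exactly the valuations each agent wants kept private, which is what motivates the joint differential privacy relaxation.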
Development and characterization of a laser-induced acoustic desorption source
A laser-induced acoustic desorption source, developed for use at central
facilities, such as free-electron lasers, is presented. It features prolonged
measurement times and a fixed interaction point. A novel sample deposition
method using aerosol spraying provides a uniform sample coverage and hence
stable signal intensity. Utilizing strong-field ionization as a universal
detection scheme, the produced molecular plume is characterized in terms of
number density, spatial extent, fragmentation, temporal distribution,
translational velocity, and translational temperature. The effect of desorption
laser intensity on these plume properties is evaluated. While translational
velocity is invariant for different desorption laser intensities, pointing to a
non-thermal desorption mechanism, the translational temperature increases
significantly and higher fragmentation is observed with increased desorption
laser fluence.
Comment: 8 pages, 7 figures
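One of the characterization steps can be sketched as follows (my own illustration, not the paper's analysis code): a 1D translational temperature extracted from the spread of a measured velocity distribution via T = m * var(v) / k_B. The molecular mass and velocity data below are synthetic:

```python
import random

K_B = 1.380649e-23       # Boltzmann constant, J/K
AMU = 1.66053907e-27     # atomic mass unit, kg

def translational_temperature(velocities_m_s, mass_kg):
    """1D translational temperature from the variance of a velocity
    distribution about its mean (the drift velocity drops out)."""
    n = len(velocities_m_s)
    mean = sum(velocities_m_s) / n
    var = sum((v - mean) ** 2 for v in velocities_m_s) / n
    return mass_kg * var / K_B

# Synthetic plume: 500 amu molecules drifting at 300 m/s with a
# ~50 m/s Gaussian velocity spread.
rng = random.Random(1)
mass = 500 * AMU
vels = [rng.gauss(300.0, 50.0) for _ in range(100_000)]
temp = translational_temperature(vels, mass)
```

Because the drift velocity is subtracted out, this separates the (intensity-invariant) translational velocity from the (intensity-dependent) translational temperature, which is the distinction the abstract draws.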
Six Top Messages of New Physics at the LHC
Six top signatures provide a novel probe of new physics. We discuss
production of six top quarks as the decay products of a pair of top partners in
the setting of a composite Higgs model, and argue that the six top signal may
generically provide one of the first final states to show a discrepancy. We
construct an analysis based on quantities such as and the numbers of jets
which are tagged as boosted tops, s, or containing -tags, and show that
the LHC with 3~ab can discover top partners with masses up to around 2.5
TeV in the six top signature.Comment: 15 pages, 6 figures, and 2 table