Robust Popular Matchings
We study popularity for matchings under preferences. This solution concept
captures matchings that do not lose against any other matching in a majority
vote by the agents. A popular matching is said to be robust if it is popular
among multiple instances. We present a polynomial-time algorithm for deciding
whether there exists a robust popular matching when the instances differ only
in the preferences of a single agent; in contrast, the problem becomes
NP-complete when two instances differ only by a downward shift of one
alternative by four agents. Moreover, we find a complexity dichotomy based on preference
completeness for the case where instances differ by making some options
unavailable.
Comment: Appears in: Proceedings of the 23rd International Conference on
Autonomous Agents and Multiagent Systems (AAMAS 2024)
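The popularity notion above can be illustrated with a small head-to-head vote. The following sketch is illustrative only: the instance, agent names, and dict representation of matchings are assumptions, not taken from the paper.

```python
# Hypothetical sketch of a pairwise popularity vote between two matchings.
# Each agent votes for whichever matching gives it a more-preferred partner;
# being unmatched (None) is treated as worse than any listed partner.

def vote(prefs, m1, m2):
    """Return (votes for m1, votes for m2) in a head-to-head majority vote."""
    def rank(agent, partner):
        if partner is None:
            return len(prefs[agent])  # unmatched is worst
        return prefs[agent].index(partner)

    v1 = v2 = 0
    for agent in prefs:
        r1, r2 = rank(agent, m1.get(agent)), rank(agent, m2.get(agent))
        if r1 < r2:
            v1 += 1
        elif r2 < r1:
            v2 += 1
    return v1, v2

# Toy instance: two agents on each side with strict preference lists.
prefs = {
    "a1": ["b1", "b2"], "a2": ["b1", "b2"],
    "b1": ["a2", "a1"], "b2": ["a1", "a2"],
}
M = {"a1": "b1", "b1": "a1", "a2": "b2", "b2": "a2"}
N = {"a1": "b2", "b2": "a1", "a2": "b1", "b1": "a2"}
print(vote(prefs, M, N))  # → (1, 3): N wins the vote, so M is not popular
```

A matching is popular precisely when it loses no such vote against any other matching; here N defeats M three votes to one.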
Developing a predictive model for estimation of height of a normal child using head length in south Indian children
Background: Estimation of height from head length has been done in many races, but very few studies have used a regression model in South Indian children. Hence an attempt is made to develop a regression-based model for deriving height from head length in a normal child.
Methods: The present study was conducted on 318 apparently normal school children of both sexes in Ranga Reddy District, between 8 and 12 years of age. Regression was used to derive the predictive model for height from head length in both boys and girls. Pearson's correlation was used to find the degree of relationship between the parameters.
Results: A strong positive relationship exists between height and head length, with r = 0.733 (P < 0.0001). The regression model equation is derived as Height = 11.602 × (Head Length) − 66.309. The model explains up to 53.70% of the variability of the response data around its mean, with r-square (coefficient of determination) 0.537.
Conclusions: The present study concluded that head length can be used for estimation of height in medico-legal cases and other issues related to identity. As regression equations are known to be population- and sex-specific, similar equations need to be derived for other ethnic groups.
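As a quick sanity check, the reported regression equation can be applied directly. The head-length value below is an assumed, illustrative input, not a measurement from the study.

```python
# Applying the regression equation reported in the abstract:
#   Height (cm) = 11.602 * Head Length (cm) - 66.309, r^2 = 0.537.
# The input value is illustrative, not study data.

def predicted_height(head_length_cm):
    return 11.602 * head_length_cm - 66.309

h = predicted_height(17.0)  # assumed head length of ~17 cm for a child
print(f"Predicted height: {h:.1f} cm")
```

Since the coefficient of determination is 0.537, such a prediction carries substantial residual uncertainty and is meant for forensic estimation, not precise measurement.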
A Structural and Algorithmic Study of Stable Matching Lattices of Multiple Instances
Recently, [MV18a] identified and initiated work on the new problem of
understanding structural relationships between the lattices of solutions of two
"nearby" instances of stable matching. They also gave an application of their
work to finding a robust stable matching. However, the types of changes they
allowed in going from one instance to the other were very restricted, namely,
any one agent executes an upward shift.
In this paper, we allow any one agent to permute its preference list
arbitrarily. Let M_A and M_B be the sets of stable matchings of the
resulting pair of instances A and B, and let L_A and L_B
be the corresponding lattices of stable matchings. We prove
that the matchings in M_A ∩ M_B form a sublattice of both
L_A and L_B, and that those in M_A \ M_B form a join
semi-sublattice of L_A. These properties enable us to obtain a
polynomial-time algorithm not only for finding a stable matching in
M_A ∩ M_B, but also for obtaining the partial order promised by Birkhoff's
Representation Theorem, thereby enabling us to generate all matchings in this
sublattice.
Our algorithm also helps solve a version of the robust stable matching
problem. We discuss another potential application, namely obtaining new
insights into the incentive compatibility properties of the Gale-Shapley
Deferred Acceptance Algorithm.
Comment: arXiv admin note: substantial text overlap with arXiv:1804.0553
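For readers unfamiliar with the Gale-Shapley Deferred Acceptance Algorithm mentioned above, here is a minimal sketch; the two-agent instance and dict representation are illustrative, not from the paper.

```python
# Minimal sketch of Gale-Shapley Deferred Acceptance (proposer-optimal).
# Assumes complete strict preference lists on both sides.

def deferred_acceptance(proposer_prefs, acceptor_prefs):
    # Precompute each acceptor's ranking of proposers for O(1) comparisons.
    rank = {a: {p: i for i, p in enumerate(plist)}
            for a, plist in acceptor_prefs.items()}
    next_choice = {p: 0 for p in proposer_prefs}  # next list index to propose to
    engaged = {}                                   # acceptor -> proposer
    free = list(proposer_prefs)
    while free:
        p = free.pop()
        a = proposer_prefs[p][next_choice[p]]
        next_choice[p] += 1
        cur = engaged.get(a)
        if cur is None:
            engaged[a] = p                 # acceptor tentatively accepts
        elif rank[a][p] < rank[a][cur]:
            engaged[a] = p                 # acceptor trades up
            free.append(cur)
        else:
            free.append(p)                 # proposal rejected
    return {p: a for a, p in engaged.items()}

men = {"m1": ["w1", "w2"], "m2": ["w1", "w2"]}
women = {"w1": ["m2", "m1"], "w2": ["m1", "m2"]}
print(deferred_acceptance(men, women))  # → {'m2': 'w1', 'm1': 'w2'}
```

The output is the proposer-optimal stable matching; which stable matching is returned depends on which side proposes, which is one source of the incentive-compatibility questions the abstract alludes to.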
The Flow Game: Leximin and Leximax Core Imputations
Recently [Vaz24] gave mechanisms for finding leximin and leximax core
imputations for the assignment game and remarked, "Within the area of algorithm
design, the 'right' technique for solving several types of algorithmic
questions was first discovered in the context of matching and later these
insights were applied to other problems. We expect a similar phenomenon here."
One of the games explicitly mentioned in this context was the flow game of
Kalai and Zemel [KZ82]. In this paper, we give strongly polynomial time
mechanisms for computing the leximin and leximax core imputations for the flow
game, among the set of core imputations that are captured as optimal solutions
to the dual LP. We address two versions: 1. The imputations are leximin and
leximax with respect to the distance labels of edges. 2. The imputations are
leximin and leximax with respect to the product of capacities of edges and
their distance labels.
Comment: 10 pages
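The leximin order underlying these mechanisms can be sketched as follows; the comparison function and toy imputations are illustrative and are not the paper's mechanism, which selects among core imputations via the dual LP.

```python
# Illustrative sketch of the leximin order: imputation x leximin-dominates y
# if, after sorting both payoff vectors in ascending order, the first
# coordinate where they differ is larger in x (the worst-off agent, then the
# second-worst-off, and so on, is better off).

def leximin_better(x, y):
    for a, b in zip(sorted(x), sorted(y)):
        if a != b:
            return a > b
    return False  # identical sorted profiles

# Two hypothetical imputations distributing the same total payoff of 10.
print(leximin_better([3, 3, 4], [2, 4, 4]))  # → True: worst-off agent gets 3 vs 2
```

The leximax order is the mirror image: sort in descending order and prefer the vector whose best-off coordinate is smaller at the first difference.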
DESIGN AND ANALYSIS OF SUSPENSION SYSTEM
A suspension system or shock absorber is a mechanical device designed to reduce or damp shock impulses and dissipate kinetic energy. The shock absorber's duty is to absorb or dissipate energy. In a vehicle, it reduces the effect of traveling over rough ground, resulting in improved ride quality and increased comfort due to a substantially reduced amplitude of disturbances. When a vehicle travels on a level road and the wheels strike a bump, the spring is compressed rapidly. The compressed spring attempts to return to its normal loaded length and, in doing so, rebounds past its normal height, causing the body to be lifted. The weight of the vehicle then pushes the spring down below its normal loaded height. In this project a shock absorber is designed and a 3D model is created using Pro/ENGINEER. The model is also varied by changing the thickness of the spring. Structural analysis and modal analysis are carried out on the shock absorber with two different spring materials, Spring Steel and Beryllium Copper, and the results are compared to determine the better spring material for the shock absorber. Modeling is done in Pro/ENGINEER and analysis is done in ANSYS. Pro/ENGINEER is a standard in 3D product design, featuring industry-leading productivity tools that promote best practices in design. ANSYS is a general-purpose finite element analysis (FEA) software package. Finite element analysis is a numerical method for deconstructing a complex system into very small pieces (of user-designated size) called elements.
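As a rough back-of-the-envelope comparison of the two candidate spring materials before full FEA, one can use the standard helical-spring stiffness formula k = G·d⁴ / (8·D³·n). All dimensions and shear-modulus values below are assumed, illustrative inputs, not results from the project.

```python
# Hedged sketch: comparing candidate spring materials by spring rate using
# the standard helical compression spring formula k = G*d^4 / (8*D^3*n).
# Dimensions and shear moduli are illustrative assumptions only.

def spring_rate(G, d, D, n):
    """Spring rate in N/m.
    G: shear modulus (Pa), d: wire diameter (m),
    D: mean coil diameter (m), n: number of active coils."""
    return G * d**4 / (8 * D**3 * n)

# Approximate textbook shear moduli (Pa); actual project values may differ.
materials = {"Spring Steel": 79.3e9, "Beryllium Copper": 48.3e9}
for name, G in materials.items():
    k = spring_rate(G, d=0.008, D=0.05, n=10)
    print(f"{name}: k = {k / 1000:.1f} N/mm")
```

Because the spring rate scales linearly with G, the stiffer-shear-modulus material yields a proportionally stiffer spring for identical geometry; the project's structural and modal analysis in ANSYS evaluates the stress and vibration consequences of that difference.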
A live-online mindfulness-based intervention for children living with epilepsy and their families: protocol for a randomized controlled trial of Making Mindfulness Matter©.
BACKGROUND: Epilepsy extends far beyond seizures; up to 80% of children with epilepsy (CWE) may have comorbid cognitive or mental health problems, and up to 50% of parents of CWE are at risk for major depression. Past research has also shown that family environment has a greater influence on children's and parents' health-related quality of life (HRQOL) and mental health than epilepsy-related factors. There is a pressing need for low-cost, innovative interventions to improve HRQOL and mental health for CWE and their parents. The aim of this randomized controlled trial (RCT) is to evaluate whether an interactive online mindfulness-based intervention program, Making Mindfulness Matter (M3), can be feasibly implemented and whether it positively affects CWE's and parents' HRQOL and mental health (specifically, stress, behavioral, depressive, and anxiety symptoms).
METHODS: This parallel RCT was planned to recruit 100 child-parent dyads to be randomized 1:1 to the 8-week intervention or waitlist control and followed over 20 weeks. The intervention, M3, will be delivered online and separately to parents and children (ages 4-10 years) in groups of 4-8 by non-clinician staff of a local community epilepsy agency. The intervention incorporates mindful awareness, social-emotional learning skills, and positive psychology. It is modeled after the validated school-based MindUP program and adapted for provision online and to include a parent component.
DISCUSSION: This RCT will determine whether this online mindfulness-based intervention is feasible and effective for CWE and their parents. The proposed intervention may be an ideal vector to significantly improve HRQOL and mental health for CWE and their parents given its low cost and implementation by community epilepsy agencies.
TRIAL REGISTRATION: ClinicalTrials.gov NCT04020484. Registered on July 16, 2019.
Engineering Hydroxylase and Ketoreductase Activity, Selectivity, and Stability for a Scalable Concise Synthesis of Belzutifan
Please click Additional Files below to see the full abstract.
High accuracy, lightweight methods for network measurement services
Network monitoring is indispensable for maintaining and managing networks efficiently. With increasing network traffic in ISP, enterprise, and cloud environments, it is challenging to provide low-overhead monitoring services without sacrificing accuracy. In this dissertation, we present techniques that enable measurement systems and services to achieve (1) high measurement accuracy and (2) low measurement overhead. In the context of active measurements, shared active measurement services have been proposed to provide a common and safe environment in which to conduct measurements. By adapting to user measurement requests, we present solutions to (1) selectively use inference mechanisms and (2) schedule active measurements in a non-interfering manner. These techniques reduce the measurement overhead and improve the accuracy of an active measurement service. In the context of passive flow-based measurement systems, this dissertation introduces Pegasus, a monitoring system that leverages co-located compute and storage devices to support aggregation queries. Using Pegasus, we present IFA (Iterative Feedback Aggregator), a technique to accurately detect global icebergs and network anomalies at low communication cost. Finally, we present ALE (Approximate Latency Estimator), a scalable, low-overhead technique to estimate TCP round-trip times at high data rates for troubleshooting network performance problems.
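The global-iceberg problem that IFA addresses can be sketched with a naive baseline: each monitor reports its local per-key counts, and a key is flagged when its aggregate count crosses a threshold. This sketch is illustrative only; IFA's contribution is achieving the same detection with far less communication via iterative feedback, which is not modeled here.

```python
# Naive global-iceberg detection baseline (illustrative, not IFA itself):
# every monitor ships its full local count table to the aggregator, which
# sums per-key counts and flags keys whose global count meets the threshold.

from collections import Counter

def global_icebergs(local_counts, threshold):
    total = Counter()
    for counts in local_counts:
        total.update(counts)
    return {key for key, count in total.items() if count >= threshold}

# Hypothetical per-monitor flow counts keyed by source address.
monitors = [
    {"10.0.0.1": 40, "10.0.0.2": 5},
    {"10.0.0.1": 35, "10.0.0.3": 60},
    {"10.0.0.2": 10, "10.0.0.3": 20},
]
print(sorted(global_icebergs(monitors, threshold=70)))
# → ['10.0.0.1', '10.0.0.3']: globally heavy keys that no single monitor
#   necessarily sees as heavy on its own
```

Note that 10.0.0.1 is below the threshold at every individual monitor, which is exactly why local thresholding alone misses global icebergs.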
Mitigating interference in a network measurement service
Shared measurement services offer key advantages over conventional ad-hoc techniques for network monitoring. A measurement service may receive measurement requests concurrently from different applications and network administrators. These requests are often served by injecting active measurement traffic between two hosts. Two active measurements are said to interfere when the probe packets of one measurement tool are viewed as network traffic by the other, which may lead to faulty measurement readings. In this paper, we model the measurement interference problem and show how to schedule measurement tasks to reduce interference and hence increase measurement accuracy. We propose twelve computationally tractable algorithms that decrease the total completion time (makespan) of measurement tasks while avoiding interference. Our evaluation shows that the algorithm we refer to as Largest Area First, Busiest Node First - Earliest Interval Schedule (LAFBNF-EIS) achieves a mean makespan about 5% above the theoretical lower bound over our set of measurement workloads.
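The "largest area first" flavor of this scheduling idea can be sketched as a simple greedy: order tasks by area (here assumed to be duration × probe rate), then place each at the earliest start time that avoids time-overlap with any already-placed task sharing a host. The task model, area definition, and instance below are illustrative simplifications, not the paper's LAFBNF-EIS algorithm.

```python
# Simplified greedy sketch of largest-area-first measurement scheduling.
# Two tasks interfere (in this toy model) if their intervals overlap in time
# AND they share at least one measurement host.

def schedule(tasks):
    """tasks: list of (name, duration, rate, {hosts}). Returns name -> start."""
    placed = []    # list of (start, end, hosts) for scheduled tasks
    start_of = {}
    # Largest area first: sort by duration * probe rate, descending.
    for name, dur, rate, hosts in sorted(tasks, key=lambda t: -(t[1] * t[2])):
        t = 0
        while True:
            # Fits at t if every placed task is disjoint in time or in hosts.
            if all(t + dur <= s or t >= e or not (hosts & h)
                   for s, e, h in placed):
                break
            # Otherwise jump to the next finishing time of a conflicting task.
            t = min(e for s, e, h in placed if (hosts & h) and e > t)
        placed.append((t, t + dur, hosts))
        start_of[name] = t
    return start_of

tasks = [
    ("ping", 2, 1, {"A", "B"}),       # (name, duration, rate, hosts)
    ("bw-probe", 4, 5, {"B", "C"}),
    ("trace", 3, 1, {"A", "C"}),
]
print(schedule(tasks))  # → {'bw-probe': 0, 'trace': 4, 'ping': 7}
```

In this toy instance every pair of tasks shares a host, so the greedy serializes them in area order; the paper's evaluation concerns how close such heuristics get to the makespan lower bound on realistic workloads.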