
    Health expenditure comparison of extended-release metoprolol succinate and immediate-release metoprolol tartrate

    Varun Vaidya, Pranav Patel; College of Pharmacy and Pharmaceutical Sciences, University of Toledo, Toledo, OH, USA
    Background: Metoprolol, a selective beta-1 blocker, is available in two salt forms on the market: metoprolol succinate (MS) and metoprolol tartrate (MT). Both formulations are Food and Drug Administration approved for the treatment of hypertension. Several studies have shown similar efficacy between the two salts; however, they differ in their pharmacokinetic properties and are therefore priced differently. The primary objective of this study was to compare the overall health care expenditures of hypertensive patients on MT and MS to see whether the price difference between the two preparations is offset by savings in overall expenditure.
    Methods: Two cohorts of patients using MT and MS were selected from the 2008 Medical Expenditure Panel Survey. Propensity score matching was used to balance the cohorts on parameters such as demographic information, insurance status, and comorbidity score. Patients using MT were matched to patients using MS on the logit of the propensity score, using calipers of width equal to 0.2 of the standard deviation of the logit of the propensity score. Multiple regression analysis was carried out to examine the association between health expenditure and type of metoprolol salt, adjusting for other covariates.
    Results: A total of 742 patients were found to use metoprolol (MT: 388, MS: 354). After propensity score matching, 582 patients remained in the sample for the final analysis (291 in each cohort). The average annual health care expenditure was slightly higher in the MT cohort; however, after adjusting for covariates in a multivariate analysis, the difference was not statistically significant (P = 0.23).
    Conclusion: Both metoprolol products were found to have similar average annual total health care expenditures; however, once-daily MS has a higher out-of-pocket cost.
    Keywords: hypertension, cost, propensity score
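
    A minimal sketch of the matching step described above: 1:1 nearest-neighbor matching of MT users to MS users on the logit of the propensity score, with a caliper of 0.2 standard deviations of that logit. It runs on synthetic data; the covariates, sample sizes, and treatment assignment are illustrative stand-ins, not the MEPS variables used in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 742                                         # same order as the study's sample; data synthetic
X = np.column_stack([
    rng.normal(55, 12, n),                      # age (illustrative covariate)
    rng.integers(0, 2, n),                      # insured yes/no (illustrative)
    rng.poisson(2, n),                          # comorbidity score (illustrative)
])
is_mt = rng.integers(0, 2, n).astype(bool)      # True = MT user, False = MS user (synthetic)

# Propensity score: estimated probability of being an MT user given the covariates.
ps = LogisticRegression(max_iter=1000).fit(X, is_mt).predict_proba(X)[:, 1]
logit = np.log(ps / (1 - ps))
caliper = 0.2 * logit.std()

# Greedy 1:1 matching within the caliper on the logit scale.
unmatched_ms = list(np.flatnonzero(~is_mt))
pairs = []
for i in np.flatnonzero(is_mt):
    if not unmatched_ms:
        break
    j = min(unmatched_ms, key=lambda k: abs(logit[i] - logit[k]))
    if abs(logit[i] - logit[j]) <= caliper:
        pairs.append((i, j))
        unmatched_ms.remove(j)

print(f"matched pairs: {len(pairs)}  (caliper = {caliper:.3f} on the logit scale)")
```

    In practice, covariate balance would be checked on the matched sample before fitting the expenditure regression.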

    Cecal Bascule after Colonoscopy - Case Report and Review of Literature

    Cecal bascule is a rare variant of cecal volvulus. It consists of upward and anterior folding of the ascending colon, forming a flap valve that occludes the bowel lumen and results in proximal cecal dilatation. Herein, we present a case of a patient who developed persistent abdominal pain a few hours after a colonoscopy. A CT scan of the abdomen revealed upward and anterior folding of the cecum. The patient was subsequently taken to the operating room for a right hemicolectomy. This case emphasizes the importance of considering cecal bascule in the differential diagnosis of patients with persistent abdominal pain after colonoscopy, given the ease of diagnosis with imaging studies and the need for emergent surgical correction.

    Simplicity Bias in Transformers and their Ability to Learn Sparse Boolean Functions

    Despite the widespread success of Transformers on NLP tasks, recent works have found that they struggle to model several formal languages when compared to recurrent models. This raises the question of why Transformers perform well in practice and whether they have any properties that enable them to generalize better than recurrent models. In this work, we conduct an extensive empirical study on Boolean functions to demonstrate the following: (i) Random Transformers are relatively more biased towards functions of low sensitivity. (ii) When trained on Boolean functions, both Transformers and LSTMs prioritize learning functions of low sensitivity, with Transformers ultimately converging to functions of lower sensitivity. (iii) On sparse Boolean functions, which have low sensitivity, we find that Transformers generalize near perfectly even in the presence of noisy labels, whereas LSTMs overfit and achieve poor generalization accuracy. Overall, our results provide strong quantifiable evidence of differences in the inductive biases of Transformers and recurrent models, which may help explain Transformers' effective generalization despite their relatively limited expressiveness.
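
    The complexity measure at the heart of the comparison above is the sensitivity of a Boolean function: how many single-bit flips of an input change the output, averaged over inputs. The short sketch below computes it exhaustively for two generic examples (parity and a sparse majority); these functions are illustrative and not the paper's benchmark suite.

```python
from itertools import product

def average_sensitivity(f, n):
    """Mean, over all 2^n inputs x, of the number of coordinates i whose flip changes f(x)."""
    total = 0
    for x in product((0, 1), repeat=n):
        fx = f(x)
        for i in range(n):
            flipped = x[:i] + (1 - x[i],) + x[i + 1:]
            total += int(f(flipped) != fx)
    return total / 2 ** n

n = 8
parity = lambda x: sum(x) % 2                        # every flip changes the output
sparse_majority = lambda x: int(sum(x[:3]) >= 2)     # depends on only 3 of the n bits

print("parity:         ", average_sensitivity(parity, n))           # 8.0 (maximal)
print("sparse majority:", average_sensitivity(sparse_majority, n))  # 1.5 (low)
```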

    Understanding in-context learning in transformers and LLMs by learning to learn discrete functions

    In order to understand the in-context learning phenomenon, recent works have adopted a stylized experimental framework and demonstrated that Transformers can match the performance of gradient-based learning algorithms for various classes of real-valued functions. However, the limitations of Transformers in implementing learning algorithms, and their ability to learn other forms of algorithms, are not well understood. Additionally, the degree to which these capabilities are confined to attention-based models is unclear. Furthermore, it remains to be seen whether the insights derived from these stylized settings can be extrapolated to pretrained Large Language Models (LLMs). In this work, we take a step towards answering these questions by demonstrating the following: (a) On a test-bed with a variety of Boolean function classes, we find that Transformers can nearly match the optimal learning algorithm for ‘simpler’ tasks, while their performance deteriorates on more ‘complex’ tasks. Additionally, we find that certain attention-free models perform (almost) identically to Transformers on a range of tasks. (b) When provided a teaching sequence, i.e., a set of examples that uniquely identifies a function in a class, we show that Transformers learn more sample-efficiently. Interestingly, our results show that Transformers can learn to implement two distinct algorithms to solve a single task, and can adaptively select the more sample-efficient algorithm depending on the sequence of in-context examples. (c) Lastly, we show that extant LLMs, e.g., LLaMA-2 and GPT-4, can compete with nearest-neighbor baselines on prediction tasks that are guaranteed not to be in their training set.
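
    A minimal version of the nearest-neighbor baseline mentioned in (c): given in-context (input, label) demonstrations of an unknown Boolean function, predict a query's label from its Hamming-nearest demonstration. The target function (a small conjunction), the input dimension, and the context size are assumptions made for illustration, not the paper's task suite.

```python
import numpy as np

rng = np.random.default_rng(1)
n_bits, n_context = 16, 32

target = lambda x: int(x[0] & x[3] & x[7])           # hypothetical target: a 3-literal conjunction

X_ctx = rng.integers(0, 2, (n_context, n_bits))      # in-context demonstrations
y_ctx = np.array([target(x) for x in X_ctx])

def nn_predict(x_query):
    """Return the label of the demonstration closest to the query in Hamming distance."""
    dists = (X_ctx != x_query).sum(axis=1)
    return y_ctx[dists.argmin()]

queries = rng.integers(0, 2, (200, n_bits))
accuracy = np.mean([nn_predict(q) == target(q) for q in queries])
print(f"nearest-neighbor accuracy on held-out queries: {accuracy:.2f}")
```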

    Acute esophageal necrosis masquerading as acute coronary syndrome

    Acute esophageal necrosis (AEN), also known as “black esophagus” or “acute necrotizing esophagitis,” is a rare entity characterized by the striking endoscopic finding of circumferential black coloring of the esophagus. AEN is most frequently seen in the distal esophagus and can extend proximally along the entire esophagus. Characteristically, the circumferential black mucosa stops abruptly at the esophagogastric junction (EGJ). AEN tends to present as acute upper gastrointestinal bleeding, though other symptoms, including dysphagia and epigastric pain, have been described. The etiology of AEN is multifactorial, involving a combination of ischemic insult, mucosal barrier defect, and backflow injury from gastric secretions. Described here is a case of AEN in a patient with a history of uncontrolled diabetes who presented with atypical chest pain mimicking acute coronary syndrome, with a subsequently negative cardiovascular workup.

    General anesthesia for cesarean section in the presence of mitral stenosis associated with severe pulmonary hypertension

    Rheumatic heart disease (RHD) is the most common cause of cardiac disease during pregnancy in India. A case of severe pulmonary hypertension (pulmonary arterial pressure of 98 mm Hg) secondary to mitral stenosis associated with RHD in a 28-year-old woman is described. She underwent a high-risk elective cesarean section under general anesthesia at 36 weeks of gestation. The intraoperative and postoperative course was uneventful. The advantages of general anesthesia over neuraxial blockade during cesarean section are discussed in this report. The management of patients at high risk of developing pulmonary edema and decompensated heart failure in the perioperative period is emphasized. A thorough understanding of the pathophysiology and complications of RHD during pregnancy and cesarean section is required to manage the disease successfully.

    Integrating Data Analytics and Decision Support Systems in Public Health Management

    Integrating data analytics and decision support systems (DSS) into public health management is essential for data-driven decision-making and better health outcomes. This abstract discusses why combining these tools matters and how they can affect public health management. Data analytics plays a central role in public health because it extracts useful information for decision-making from large data sets. By examining patterns, trends, and associations in health data, public health managers can identify emerging health problems, use resources efficiently, and monitor how well interventions are working. Complementing data analytics, decision support systems provide tools and models that facilitate the decision-making process. These systems use algorithms and models to analyze data, generate recommendations, and weigh possible outcomes, helping public health managers make informed choices in complex and changing environments. Using data analytics and DSS together in public health management has several benefits. It makes decisions more accurate and reliable by providing real-time data and evidence-based recommendations, and it improves planning and resource allocation by identifying high-risk groups and targeting interventions more effectively. Integrating these tools also helps public health managers respond to public health emergencies, such as disease outbreaks or natural disasters: with data analytics and DSS, public health agencies can quickly assess the situation, direct resources where they are needed most, and track in real time how well measures are working. DOI: https://doi.org/10.52710/seejph.49
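
    As a concrete, deliberately simple illustration of the kind of decision-support rule described above, the sketch below converts surveillance counts into incidence per 100,000 and flags regions that cross an alert threshold for priority resource allocation. The regions, counts, and threshold are invented for the example.

```python
# Toy decision-support rule: flag regions whose weekly incidence exceeds a threshold.
regions = {
    # region name: (new cases this week, population) -- illustrative figures
    "District A": (120, 250_000),
    "District B": (340, 400_000),
    "District C": (45, 90_000),
}
ALERT_PER_100K = 60  # hypothetical alert threshold

for name, (cases, population) in regions.items():
    rate = cases / population * 100_000
    status = "ALERT: prioritize resources" if rate >= ALERT_PER_100K else "monitor"
    print(f"{name}: {rate:.1f} cases per 100,000 -> {status}")
```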

    Seismic Fragility Updating of Highway Bridges using Field Instrumentation Data

    Seismic fragility assessment of deteriorating highway bridges using analytical methods often relies on empirical, semi-empirical, or numerical models to predict the rate and nature of degradation. Consequently, the predicted structural vulnerabilities of bridge components, or of the overall bridge system, during seismic shaking are only as good as the adopted deterioration models. For the sake of simplicity and ease of computational modeling, these deterioration models are often far removed from observed manifestations of time-dependent aging. One such example is the nature of corrosion in reinforced concrete bridge components under chloride attack. While this deterioration mechanism leads to the formation of pits along the length of the rebar, past literature often adopts a simplified uniform area loss model. This study proposes a probabilistic framework that improves deterioration modeling of highway bridges by explicitly modeling pit formation and also provides the opportunity to update the analytical models with field measurement data using Bayesian techniques. The framework and case-study results presented in this study are believed to yield realistic seismic fragilities for highway bridges located in moderate to high seismic zones. This research was funded by the Science and Engineering Research Board, Grant No. ECR/2016/001622. Their support is gratefully acknowledged.
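
    A minimal sketch of the Bayesian-updating step described above: revising a pit-depth model parameter with field inspection data via a simple grid posterior. The exponential pit-depth model, the prior, and the measurements are assumptions made for illustration, not the paper's calibrated deterioration model or instrumentation data.

```python
import numpy as np

# Hypothetical field data: pit depths (mm) measured on an inspected rebar segment.
measured_depths = np.array([0.8, 1.3, 0.6, 1.9, 1.1])

# Assumed model: pit depth ~ Exponential(rate); evaluate the posterior on a grid of rates.
rates = np.linspace(0.1, 5.0, 500)
prior = np.exp(-0.5 * ((rates - 1.0) / 0.5) ** 2)     # loose prior belief centered at rate = 1.0
prior /= prior.sum()

# Exponential log-likelihood: n*log(rate) - rate*sum(depths).
log_like = len(measured_depths) * np.log(rates) - rates * measured_depths.sum()
posterior = prior * np.exp(log_like - log_like.max())
posterior /= posterior.sum()

mean_rate = (rates * posterior).sum()
print(f"posterior mean rate: {mean_rate:.2f} /mm  ->  expected pit depth ~ {1 / mean_rate:.2f} mm")
```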

    Single-shot quantum memory advantage in the simulation of stochastic processes

    Stochastic processes underlie a vast range of natural and social phenomena. Some processes, such as atomic decay, feature intrinsic randomness, whereas other complex processes, e.g. traffic congestion, are effectively probabilistic because we cannot track all relevant variables. To simulate a stochastic system's future behaviour, information about its past must be stored, and thus memory is a key resource. Quantum information processing promises a memory advantage for stochastic simulation that has been validated in recent proof-of-concept experiments. Yet, in all past works, the memory saving would only become accessible in the limit of a large number of parallel simulations, because the memory registers of individual quantum simulators had the same dimensionality as their classical counterparts. Here, we report the first experimental demonstration that a quantum stochastic simulator can encode the relevant information in fewer dimensions than any classical simulator, thereby achieving a quantum memory advantage even for an individual simulator. Our photonic experiment thus establishes the potential of a new, practical resource saving in the simulation of complex systems.
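
    A purely classical illustration of the memory requirement discussed above: to keep producing statistically correct futures for even a simple two-state process, a simulator must carry a memory register between steps. The "perturbed coin" process below is a standard textbook example chosen for illustration, not the process realized in the photonic experiment.

```python
import random

P_FLIP = 0.2  # probability the hidden coin flips at each step (assumed value)

def simulate(steps, seed=0):
    rng = random.Random(seed)
    state = 0                  # the memory register: which face the coin currently shows
    outputs = []
    for _ in range(steps):
        if rng.random() < P_FLIP:
            state = 1 - state  # the stored past is what keeps successive outputs correlated
        outputs.append(state)  # the observed output is the current face
    return outputs

out = simulate(10_000)
p_same = sum(a == b for a, b in zip(out, out[1:])) / (len(out) - 1)
print(f"P(next output == current output) ~ {p_same:.2f}  (expected {1 - P_FLIP:.2f})")
```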