Be My Guest: Normalizing and Compiling Programs using a Host Language
In programming language research, normalization is a process of fundamental importance to the theory of computing and reasoning about programs. In practice, on the other hand, compilation is a process that transforms programs in a language to machine code, and thus makes the programming language a usable one. In this thesis, we investigate means of normalizing and compiling programs in a language using another language as the "host". Leveraging a host to work with programs of a "guest" language enables reuse of the host's features that would otherwise be strenuous to develop. The specific tools of interest are Normalization by Evaluation and Embedded Domain-Specific Languages, both of which rely on a host language for their purposes. These tools are applied to solve problems in three different domains: to show that exponentials (or closures) can be eliminated from a categorical combinatory calculus, to propose a new proof technique based on normalization for showing noninterference, and to enable the programming of resource-constrained IoT devices from Haskell.
A workbench for advanced database implementation and benchmarking
This work focuses on a methodology to bring many of our database artifacts and prototypes onto a common workbench platform, which leads to uniformity and removes overlap across the different subsystems. A versatile command format has been developed that allows commands belonging to different subsystems to be interleaved unambiguously in the same batch. Through a collaborative effort carried out in parallel, the existing GUIs (Graphical User Interfaces) have also been merged into a common but simple GUI, which executes a batch of commands.
The subsystems currently included are: a runner for SQL on a variety of database platforms; a runner for Quilt queries (Quilt is an early version of XQuery and runs on a platform called KWEELT); ElementalDB, an experimental database system used for instruction in a graduate database implementation course; our own XQuery engine, which aims at handling data in the terabyte range, stored in our storage in paginated form using our pagination algorithm; a research prototype for NC94, an important spatiotemporal data set in agriculture; and a research prototype for a temporal database. The organization of the subsystems follows a strict convention for ease of further development and maintenance. XML is used extensively by the various subsystems. An XML-based framework has been developed for benchmarking subsystems, making experiments completely repeatable at the click of a button: from creation of storage and loading of data sets, through execution of commands and collection of performance data in XML-based logs, to reporting using XQuery queries on the XML logs.
Thanks to its very small learning curve, the resulting workbench can be used by students, instructors, developers and researchers alike, and managed easily.
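The log-then-report workflow described above (performance data collected in XML-based logs, reports derived by querying those logs) can be sketched as follows. The log schema here is hypothetical, and while the workbench itself uses XQuery for reporting, this sketch uses Python's standard-library ElementTree for the same kind of query:

```python
# Sketch of XML-log-driven benchmark reporting. The <benchmark>/<run> schema
# and the "elapsedMs" attribute are illustrative assumptions, not the
# workbench's actual log format.
import xml.etree.ElementTree as ET
from statistics import mean

log = """
<benchmark>
  <run command="load"  elapsedMs="120"/>
  <run command="query" elapsedMs="35"/>
  <run command="query" elapsedMs="45"/>
</benchmark>
"""

root = ET.fromstring(log)
# Select only the query runs, mirroring an XQuery path like
# /benchmark/run[@command='query'], then aggregate their timings.
times = [int(r.get("elapsedMs")) for r in root.findall("run[@command='query']")]
avg_query_ms = mean(times)
```

Because the report is just a query over the log, rerunning an experiment and regenerating the report are the same repeatable, one-click pipeline.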
ROLE OF INFORMATION TECHNOLOGY IN EDUCATION SECTOR: SCRUTINIZING ITS MERITS AND DEVELOPMENTS
Technology has permeated virtually every industry and service. This article concentrates on the merits of technology in the education sector and highlights its crucial importance to that community. Many institutions have adopted IT on their campuses and are reaping the benefits, although, from a bird's-eye view, a few remain unconvinced of its value. There is still a need to extend these services in the education sector: online education, for instance, has become a global success, yet many students still do not regard it as a professional model of education, and so far only leading institutions such as Oxford and Harvard are reaping the advantages of these innovations. The author has collected the relevant information and presented it in a simple, readable format, with justifications provided wherever necessary. Implementing appropriate technology remains at the discretion of each institution and its requirements; this report does not campaign for any particular technology, system or software. The research was conducted by evaluating vox-pop responses from institutions, IT-enabled service companies and members of the education community.
Modular Normalization with Types
With the increasing use of software in today’s digital world, software is becoming more and more complex and the cost of developing and maintaining software has skyrocketed. It has become pressing to develop software using effective tools that reduce this cost. Programming language research aims to develop such tools using mathematically rigorous foundations. A recurring and central concept in programming language research is normalization: the process of transforming a complex expression in a language to a canonical form while preserving its meaning. Normalization has compelling benefits in theory and practice, but is extremely difficult to achieve. Several program transformations that are used to optimise programs, prove properties of languages and check program equivalence, for instance, are after all instances of normalization, but they are seldom viewed as such. Viewed through the lens of current methods, normalization lacks the ability to be broken into sub-problems and solved independently, i.e., lacks modularity. To make matters worse, such methods rely excessively on the syntax of the language, making the resulting normalization algorithms brittle and sensitive to changes in the syntax. When the syntax of the language evolves due to modification or extension, as it almost always does in practice, the normalization algorithm may need to be revisited entirely. To circumvent these problems, normalization is currently either abandoned entirely or concrete instances of normalization are achieved using ad hoc means specific to a particular language. Continuing this trend in programming language research poses the risk of building on a weak foundation where languages either lack fundamental properties that follow from normalization or several concrete instances end up repeated in an ad hoc manner that lacks reusability. This thesis advocates for the use of type-directed Normalization by Evaluation (NbE) to develop normalization algorithms.
NbE is a technique that provides an opportunity for a modular implementation of normalization algorithms by allowing us to disentangle the syntax of a language from its semantics. Types further this opportunity by allowing us to dissect a language into isolated fragments, such as functions and products, with an individual specification of syntax and semantics. To illustrate type-directed NbE in context, we develop NbE algorithms and show their applicability for typed programming language calculi in three different domains (modal types, static information-flow control and categorical combinators) and for a family of embedded domain-specific languages in Haskell.
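The eval/reify structure at the heart of NbE can be sketched concretely. The following is a minimal, illustrative Python rendering for the untyped lambda calculus (the theses work in typed settings and in Haskell; all names below are this sketch's own): syntax is interpreted into host-level closures, and a read-back function turns semantic values into normal forms.

```python
# Minimal Normalization-by-Evaluation sketch: the "semantics" reuses the
# host language's closures, exactly the host/guest reuse the abstracts
# describe. Terms use de Bruijn indices; neutral values use de Bruijn levels.
from dataclasses import dataclass

# --- Syntax of the guest language ---
@dataclass(frozen=True)
class Var: idx: int
@dataclass(frozen=True)
class Lam: body: object
@dataclass(frozen=True)
class App: fn: object; arg: object

# --- Semantic values: host closures, or stuck ("neutral") terms ---
@dataclass
class Closure:
    env: tuple
    body: object
@dataclass(frozen=True)
class NVar: level: int
@dataclass(frozen=True)
class NApp: fn: object; arg: object

def evaluate(term, env):
    """Interpret guest syntax into the semantic (host) domain."""
    if isinstance(term, Var): return env[term.idx]
    if isinstance(term, Lam): return Closure(env, term.body)
    if isinstance(term, App):
        return apply_val(evaluate(term.fn, env), evaluate(term.arg, env))
    raise TypeError(term)

def apply_val(fn, arg):
    if isinstance(fn, Closure):
        return evaluate(fn.body, (arg,) + fn.env)
    return NApp(fn, arg)  # application of a stuck value stays neutral

def reify(val, depth):
    """Read a semantic value back into normal-form syntax."""
    if isinstance(val, Closure):
        return Lam(reify(apply_val(val, NVar(depth)), depth + 1))
    if isinstance(val, NVar): return Var(depth - val.level - 1)
    if isinstance(val, NApp):
        return App(reify(val.fn, depth), reify(val.arg, depth))
    raise TypeError(val)

def normalize(term):
    return reify(evaluate(term, ()), 0)

# (\x. x) (\y. y) normalizes to \y. y
identity = Lam(Var(0))
assert normalize(App(identity, identity)) == identity
```

The modularity claim shows up in the code's shape: `evaluate`/`reify` are defined per syntactic fragment, so adding products or modal types would mean adding cases, not revisiting the whole algorithm.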
Method To Estimate Network Availability
A distributed network makes network services available to end users at various nodes or connection points throughout the distributed network’s geographic area. A network administrator monitors the performance, capability, and availability of the distributed network to provide the network services. However, the network administrator may be limited to network traffic or other network-side parameters that may not provide an accurate or conclusive representation of the state of the distributed network. For example, diminished or decreased network traffic could indicate a malfunction in the distributed network, or it could be a natural consequence of a decreased number of end users. Cost, infrastructure requirements, and other limitations prevent installation and operation of a secondary network, which could otherwise be used to conclusively determine the conditions of the area within the distributed network. Instead, machine-learning algorithms may monitor and model some features of the distributed network, which may supplement service availability composite metrics and allow the network administrator to better evaluate the condition of the distributed network without the need for a secondary network.
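One way such a learned model could feed a composite availability metric is sketched below. The feature names, weights, and logistic form are illustrative assumptions (the disclosure does not specify a model); the point is that several network-side signals are combined into a single score rather than trusting any one of them:

```python
# Hedged sketch: combine monitored, network-side features into a single
# service-availability score with a logistic model. Weights would come from
# training; here they are hand-set, illustrative values.
import math

def availability_score(features, weights, bias):
    """Logistic combination of normalized features -> score in (0, 1)."""
    z = bias + sum(weights[k] * features[k] for k in weights)
    return 1.0 / (1.0 + math.exp(-z))

# Normalized (0..1) observations for one monitoring window (illustrative).
observed = {"traffic_vs_baseline": 0.4, "error_rate": 0.05, "latency_p95": 0.3}
weights  = {"traffic_vs_baseline": 2.0, "error_rate": -6.0, "latency_p95": -1.5}

score = availability_score(observed, weights, bias=1.0)
```

A score like this distinguishes the ambiguous case in the text: low traffic with low error rates and low latency yields a high score (fewer users), while low traffic with elevated errors drags the score down (likely malfunction).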
The Humour in My Tumour: Respecting Legal Capacity in Health-Care Decision-Making
Article 12 CRPD guarantees persons with disabilities the right to equal recognition before the law and the right to enjoy legal capacity on an equal basis with others in all aspects of life. It is the right to have one’s decisions legally recognised. Reshma’s decision not to take medication prescribed for schizophrenia was not accepted and respected by the physician. Instead, the physician implied that Reshma would be denied any further medical care for her current symptoms until she complied with a pharmaceutical-based treatment course for her psychiatric condition. Thus, Reshma would have had to take drugs against her will and preferences in order to access standard medical care. In addition, by not respecting Reshma’s legal capacity regarding her decision not to take drugs for her schizophrenia, the physician further discriminated against Reshma with regard to other aspects of her health, including the investigation of a new set of symptoms and signs that ultimately should have led to an earlier diagnosis and treatment of Reshma’s brain tumour, and thus less pain, anguish and fewer complications for her.
CHEMOMETRIC SCREENING AND OPTIMIZATION OF LIQUID CHROMATOGRAPHIC METHOD FOR SIMULTANEOUS DETERMINATION OF SEVEN ANTIHISTAMINES
Objective: Chemometric optimization and validation of an HPLC method for the simultaneous determination of seven antihistamines, viz., loratadine, fexofenadine, desloratadine, levocetirizine, doxylamine, promethazine and cinnarizine, in bulk and in their dosage form.
Methods: Analytes were separated on a Phenomenex cyano column using ACN:MeOH:NH4OAc buffer as the mobile phase, and peaks were detected at 220 nm. Optimization was performed in three steps. Initially, fractional factorial design experiments were employed to eliminate parameters that had an insignificant effect on the responses. The significant variables (%ACN, pH and flow rate) were then incorporated into a central composite design, with the retention factor (k1), the resolution (Rs) of all seven investigated substances and the retention time of the last eluted peak (tR7) studied as response variables. Finally, Derringer's desirability function, a global optimization technique, was utilized to obtain ideal chromatographic conditions for the best possible combination of separation and analysis time.
Results: The results were analyzed using ANOVA to establish an appropriate statistical relationship between the inputs and outputs. The predicted response values corresponding to the highest desirability value (D = 0.815) were selected. The optimized condition of 19.88% v/v ACN, pH 4 and a flow rate of 1 ml/min was obtained through the global optimization procedure. Under the proposed conditions, all seven antihistamines were separated in the same chromatogram with good resolution.
Conclusion: The present study demonstrated the benefit of applying the chemometric approach in selecting optimum conditions for the simultaneous determination of the cited drugs in pharmaceutical formulations.
Keywords: Cinnarizine, Desloratadine, Doxylamine, Fexofenadine, Levocetirizine, Loratadine, Promethazine, HPLC
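The Derringer step in the abstract can be illustrated with a short sketch: each response is mapped to an individual desirability d_i in [0, 1] and the overall D is their geometric mean, which the optimizer then maximizes. The target and limit values below are illustrative assumptions, not the paper's actual settings:

```python
# Derringer's desirability function, sketched for two responses: resolution
# (larger is better) and run time (smaller is better). Limits are invented
# for illustration.
def d_maximize(y, low, high, s=1.0):
    """Larger-is-better desirability (e.g., resolution Rs)."""
    if y <= low:
        return 0.0
    if y >= high:
        return 1.0
    return ((y - low) / (high - low)) ** s

def d_minimize(y, low, high, s=1.0):
    """Smaller-is-better desirability (e.g., retention time of last peak)."""
    if y <= low:
        return 1.0
    if y >= high:
        return 0.0
    return ((high - y) / (high - low)) ** s

def overall_D(ds):
    """Geometric mean of individual desirabilities: zero if any d is zero."""
    prod = 1.0
    for d in ds:
        prod *= d
    return prod ** (1.0 / len(ds))

# One candidate condition: Rs = 2.1 (want >= 1.5), tR7 = 9 min (want <= 12).
ds = [d_maximize(2.1, low=1.5, high=3.0), d_minimize(9.0, low=6.0, high=12.0)]
D = overall_D(ds)
```

Because D is a geometric mean, a single unacceptable response (d_i = 0) forces D = 0, which is what makes it a sensible single objective for trading separation quality against analysis time.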
FAST CHIRAL HPLC PROCEDURE FOR THE SIMULTANEOUS DETERMINATION OF DROPROPIZINE ENANTIOMERS AND ITS NONPOLAR IMPURITY IN RAW MATERIAL AND PHARMACEUTICAL FORMULATION
Objective: Levodropropizine is a novel antitussive drug. Dropropizine occurs as two enantiomers: levodropropizine (2S) [LDP] and dextrodropropizine (impurity A) (2R) [DDP]. An isocratic chiral high-performance liquid chromatographic (normal-phase HPLC) method has been developed and validated for the simultaneous determination of the dropropizine enantiomers along with the non-polar impurity B, 1-phenylpiperazine [1-PP], in raw material and in dosage forms.
Methods: The compounds were separated on a Chiralpak AD-H chiral stationary phase (CSP) column, with a mixture of n-hexane, anhydrous ethanol and diethylamine (DEA) in the ratio 55:45:0.1 v/v as the mobile phase at a flow rate of 1.4 ml/min. UV detection was performed at 254 nm. The method was validated for accuracy, precision, specificity, linearity and sensitivity, and the developed and validated method was successfully used for the quantitative analysis of commercially available tablets.
Results: Total chromatographic analysis time per sample was ~5 min, with 1-PP, levodropropizine and dextrodropropizine eluting at retention times of 2.5 min, 3.05 min and 3.66 min, respectively. Validation studies revealed that the method is specific, rapid, reliable and reproducible for levodropropizine, its enantiomeric impurity A and the nonchiral impurity B. Calibration plots were linear over the concentration range 0.5-5 µg/ml for both levodropropizine and dextrodropropizine.
Conclusion: The high recovery and low relative standard deviation confirm the suitability of the method for the determination of dropropizine compounds in commercial tablets.
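The linearity claim behind calibration plots like those above amounts to a least-squares fit of detector response against standard concentration, judged by the correlation coefficient. A minimal sketch, with synthetic peak-area values standing in for real detector data:

```python
# Least-squares calibration line (peak area vs. concentration) over the
# 0.5-5 ug/ml range reported in the abstract. The area values are synthetic,
# for illustration only.
def linfit(xs, ys):
    """Return (slope, intercept, r_squared) of the least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    syy = sum((y - my) ** 2 for y in ys)
    slope = sxy / sxx
    intercept = my - slope * mx
    r2 = sxy * sxy / (sxx * syy)
    return slope, intercept, r2

conc = [0.5, 1.0, 2.0, 3.0, 4.0, 5.0]          # ug/ml standards
area = [10.4, 20.1, 40.6, 60.2, 80.9, 100.3]   # synthetic detector response
slope, intercept, r2 = linfit(conc, area)
```

A method would typically be accepted as linear when r² stays at or above roughly 0.999 across the working range; the fitted slope and intercept then convert sample peak areas back to concentrations.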