
    Improved OR-Composition of Sigma-Protocols

    In [CDS94] Cramer, Damgård and Schoenmakers (CDS) devise an OR-composition technique for Σ-protocols that allows one to construct highly efficient proofs for compound statements. Since then, this technique has found countless applications as a building block for designing efficient protocols. Unfortunately, the CDS OR-composition technique works only if both statements are fixed before the proof starts. This limitation restricts its usability in those protocols where the theorems to be proved are defined at different stages of the protocol but, in order to save rounds of communication, the proof must start even if not all theorems are available. Many round-optimal protocols ([KO04, DPV04, YZ07, SV12]) crucially need such a property to achieve round-optimality and, due to the inapplicability of CDS's technique, are currently implemented using proof systems that require expensive NP reductions but allow the proof to start even if no statement is defined (a.k.a. LS proofs, from Lapidot-Shamir [LS90]). In this paper we show an improved OR-composition technique for Σ-protocols that requires only one statement to be fixed when the proof starts, while the other statement can be defined in the last round.
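The simulate-one-branch idea behind the CDS OR-composition can be sketched for Schnorr's Σ-protocol. The toy group parameters and function shapes below are illustrative assumptions, not taken from the paper; note that both statements y0 and y1 must be fixed before the challenge is drawn, which is exactly the limitation discussed above.

```python
import secrets

# Toy Schnorr group (illustration only): p = 2q + 1, g generates the order-q subgroup.
p, q, g = 2039, 1019, 4

def or_prove(y0, y1, w, known):
    """(1,2)-proof of knowledge of a discrete log of y0 OR y1 (CDS composition).
    `known` says which witness w we hold; the other branch is simulated."""
    # Simulate the unknown branch: pick its challenge and response first,
    # then derive a first message that makes the verification equation hold.
    e_sim = secrets.randbelow(q)
    z_sim = secrets.randbelow(q)
    y_sim = y1 if known == 0 else y0
    a_sim = (pow(g, z_sim, p) * pow(y_sim, -e_sim, p)) % p
    # Run the real branch honestly.
    r = secrets.randbelow(q)
    a_real = pow(g, r, p)
    a = (a_real, a_sim) if known == 0 else (a_sim, a_real)
    # Verifier's public-coin challenge; note both y0 and y1 were fixed above.
    e = secrets.randbelow(q)
    e_real = (e - e_sim) % q           # split so that e0 + e1 = e (mod q)
    z_real = (r + e_real * w) % q
    if known == 0:
        return a, e, (e_real, z_real), (e_sim, z_sim)
    return a, e, (e_sim, z_sim), (e_real, z_real)

def or_verify(y0, y1, a, e, t0, t1):
    (e0, z0), (e1, z1) = t0, t1
    return ((e0 + e1) % q == e
            and pow(g, z0, p) == (a[0] * pow(y0, e0, p)) % p
            and pow(g, z1, p) == (a[1] * pow(y1, e1, p)) % p)

w = secrets.randbelow(q)
y0 = pow(g, w, p)                       # we know log_g(y0)
y1 = pow(g, secrets.randbelow(q), p)    # we do not know log_g(y1)
a, e, t0, t1 = or_prove(y0, y1, w, known=0)
assert or_verify(y0, y1, a, e, t0, t1)
```

Witness indistinguishability comes from the fact that both branches end up as accepting transcripts with uniformly distributed challenge shares, so the verifier cannot tell which one was simulated.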

    Online/Offline OR Composition of Sigma Protocols

    Proofs of partial knowledge allow a prover to prove knowledge of witnesses for k out of n instances of NP languages. Cramer, Schoenmakers and Damgård [10] provided an efficient construction of a 3-round public-coin witness-indistinguishable (k, n)-proof of partial knowledge for any NP language, by cleverly combining n executions of Σ-protocols for that language. This transform assumes that all n instances are fully specified before the proof starts, and thus directly rules out the possibility of choosing some of the instances after the first round. Very recently, Ciampi et al. [6] provided an improved transform where one of the instances can be specified in the last round. They focus on (1, 2)-proofs of partial knowledge with the additional feature that one instance is defined in the last round, and could be adaptively chosen by the verifier. They left as an open question the existence of an efficient (1, 2)-proof of partial knowledge where no instance is known in the first round. More generally, they left open the question of constructing an efficient (k, n)-proof of partial knowledge where knowledge of all n instances can be postponed. Indeed, this property is achieved only by inefficient constructions requiring NP reductions [19]. In this paper we focus on the question of achieving adaptive-input proofs of partial knowledge. We provide, through a transform, the first efficient construction of a 3-round public-coin witness-indistinguishable (k, n)-proof of partial knowledge where all instances can be decided in the third round. Our construction enjoys adaptive-input witness indistinguishability. Additionally, the proof of knowledge property is preserved even if the adversarial prover selects instances adaptively in the last round, as long as our transform is applied to a proof of knowledge belonging to the widely used class of proofs of knowledge described in [9,21]. Since knowledge of instances and witnesses is not needed before the last round, the first round can be precomputed, and in the online/offline setting our performance is similar to that of [10]. Our new transform relies on the DDH assumption (in contrast to the transforms of [6,10], which are unconditional).

    Improved recovery of antioxidant compounds from refined pumpkin peel extract: a mixture design method approach

    This study employed the mixture design method to determine optimal solvent combinations, aiming to obtain refined extracts from squash peels with enhanced antioxidant properties. We optimized extraction solvents, focusing on recovering the total phenolic compounds (TPC) and increasing antioxidant properties, using a second-order polynomial equation through the response surface methodology (RSM). Six solvents (MeOH, hexane, DCM, EtOAc, BuOH, and water) were assessed for their effects on TPC and antioxidant activity in preliminary experiments. The refined extracts underwent HPLC analysis for phenolic composition determination and were further evaluated for their antibacterial activity and cytotoxicity. The results revealed a rich phenolic content in the refined extract from peels of the Bejaoui landrace, primarily catechin (8.06 mg/g dry extract (DE)), followed by epicatechin and kaempferol (5 mg/g DE). Antibacterial tests against Enterococcus faecalis, Pseudomonas aeruginosa, Salmonella typhimurium, and Staphylococcus aureus showed significant antimicrobial activities, especially for the Karkoubi and Batati landraces, where the growth inhibitions were 99%, 96%, 97%, and 80% and 94%, 89%, 98%, and 96% for the respective bacteria. The peel extracts exhibited negligible cytotoxicity on the RAW264.7 cell line, even at high concentrations. Our findings emphasize the potential antioxidant and antibacterial properties of peel extracts due to diverse phenolic compounds, suggesting the potential use of squash peels in the food and nutraceuticals industries as sources of natural antimicrobial agents. This study was supported by the Tunisian Ministry of Higher Education and Scientific Research and was funded under the scope of the Project PulpIng-H2020-PRIMA 2019, Section 2, Multi-topic 2019; by the Foundation for Science and Technology (FCT, Portugal) through national funds FCT/MCTES (PIDDAC) to CIMO (UIDB/00690/2020 and UIDP/00690/2020) and SusTEC (LA/P/0007/2020), including the contract of L. Barros (CEEC Institutional); and by the General Secretariat for Research and Technology of the Ministry of Development and Investments under the PRIMA Program. PRIMA is an Art. 185 initiative supported and co-funded under Horizon 2020, the European Union's Program for Research and Innovation (Prima2019-08).

    Improved Straight-Line Extraction in the Random Oracle Model With Applications to Signature Aggregation

    The goal of this paper is to improve the efficiency and applicability of straight-line extraction techniques in the random oracle model. Straight-line extraction in the random oracle model refers to the existence of an extractor which, given the random oracle queries made by a prover P*(x) on some theorem x, is able to produce a witness w for x with roughly the same probability that P* produces a verifying proof. This notion applies to both zero-knowledge protocols and verifiable computation, where the goal is compressing a proof. Pass (CRYPTO '03) first showed how to achieve this property for NP using a cut-and-choose technique which incurred a λ²-bit overhead in communication, where λ is a security parameter. Fischlin (CRYPTO '05) presented a more efficient technique based on "proofs of work" that sheds this λ² cost, but only applies to a limited class of Sigma protocols with a "quasi-unique response" property, which, for example, does not necessarily include the standard OR composition for Sigma protocols. With Schnorr/EdDSA signature aggregation as a motivating application, we develop new techniques to improve the computation cost of straight-line extractable proofs. Our improvements to the state of the art range from 70x to 200x for the best compression parameters. This is due to a uniquely suited polynomial evaluation algorithm, and the insight that a proof of work that relies on multicollisions and the birthday paradox is faster to solve than inverting a fixed target. Our collision-based proof of work more generally improves the Prover's random oracle query complexity when applied in the NIZK setting as well. In addition to reducing the query complexity of Fischlin's Prover, for a special class of Sigma protocols we can for the first time closely match a new lower bound we present. Finally, we extend Fischlin's technique so that it applies to a more general class of strongly sound Sigma protocols, which includes the OR composition. We achieve this by carefully randomizing Fischlin's technique: we show that its current deterministic nature prevents its application to certain multi-witness languages.
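The stated insight, that a birthday-style collision search is cheaper than hitting a fixed target, can be checked with a toy random-oracle experiment. This is an illustrative sketch, not the paper's construction: the truncated-SHA-256 oracle, the 16-bit output size, and the 2-collision target are all assumptions made here for the demo.

```python
import hashlib
from itertools import count

def H(data, bits):
    """Truncated SHA-256 as a toy random oracle with a `bits`-bit output."""
    return int.from_bytes(hashlib.sha256(data).digest(), "big") >> (256 - bits)

def queries_fixed_target(bits, salt):
    # Fixed-target work: find a query whose oracle output hits a preset value (0).
    # Expected cost is about 2**bits queries.
    for i in count(1):
        if H(salt + i.to_bytes(8, "big"), bits) == 0:
            return i

def queries_collision(bits, salt):
    # Birthday-style work: find any two queries with the same oracle output.
    # Expected cost is only on the order of sqrt(2**bits) queries.
    seen = set()
    for i in count(1):
        h = H(salt + i.to_bytes(8, "big"), bits)
        if h in seen:
            return i
        seen.add(h)

bits, trials = 16, 5
fixed = sum(queries_fixed_target(bits, bytes([s])) for s in range(trials)) / trials
coll = sum(queries_collision(bits, bytes([s])) for s in range(trials)) / trials
print(f"avg queries, fixed target: {fixed:.0f}; collision: {coll:.0f}")
```

For 16-bit outputs the collision search typically finishes in a few hundred queries while the fixed-target search needs tens of thousands, which mirrors the quadratic gap the abstract exploits to speed up the prover's proof-of-work.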

    Healthcare Management Primer

    This primer was written by students enrolled in HMP 721.01, Management of Health Care Organizations, in the Health Management & Policy Program, College of Health and Human Services, University of New Hampshire. The course was taught by Professor Mark Bonica in Fall 2017.

    Towards Human Computable Passwords

    An interesting challenge for the cryptography community is to design authentication protocols that are so simple that a human can execute them without relying on a fully trusted computer. We propose several candidate authentication protocols for a setting in which the human user can only receive assistance from a semi-trusted computer: a computer that stores information and performs computations correctly but does not provide confidentiality. Our schemes use a semi-trusted computer to store and display public challenges C_i ∈ [n]^k. The human user memorizes a random secret mapping σ: [n] → Z_d and authenticates by computing responses f(σ(C_i)) to a sequence of public challenges, where f: Z_d^k → Z_d is a function that is easy for the human to evaluate. We prove that any statistical adversary needs to sample m = Ω̃(n^{s(f)}) challenge-response pairs to recover σ, for a security parameter s(f) that depends on two key properties of f. To obtain our results, we apply the general hypercontractivity theorem to lower bound the statistical dimension of the distribution over challenge-response pairs induced by f and σ. Our lower bounds apply to arbitrary functions f (not just to functions that are easy for a human to evaluate), and generalize recent results of Feldman et al. As an application, we propose a family of human computable password functions f_{k1,k2} in which the user needs to perform 2k_1 + 2k_2 + 1 primitive operations (e.g., adding two digits or remembering σ(i)), and we show that s(f) = min{k_1 + 1, (k_2 + 1)/2}. For these schemes, we prove that forging passwords is equivalent to recovering the secret mapping.
Thus, our human computable password schemes can maintain strong security guarantees even after an adversary has observed the user login to many different accounts.
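The challenge-response interface described above can be made concrete with a short sketch. The parameters and the particular response function below are placeholders: the paper's f_{k1,k2} is more structured, so this only illustrates the roles of the secret mapping σ, the public challenge C, and f.

```python
import secrets

n, d, k = 26, 10, 5   # illustrative parameters, not the paper's

# Secret mapping sigma: [n] -> Z_d, memorized by the user
# (e.g., a digit assigned to each letter of the alphabet).
sigma = [secrets.randbelow(d) for _ in range(n)]

def f(digits):
    """Illustrative response function: a mod-10 sum a human could do in their
    head. (The paper's f_{k1,k2} is more structured; this shows the interface.)"""
    return sum(digits) % d

def respond(challenge):
    # The semi-trusted computer displays the public challenge C in [n]^k;
    # the user applies sigma mentally and evaluates f on the mapped digits.
    return f([sigma[c] for c in challenge])

challenge = [secrets.randbelow(n) for _ in range(k)]
response = respond(challenge)
assert 0 <= response < d
```

The computer only ever sees the challenge and the single digit f(σ(C)), never σ itself, which is why confidentiality of the device is not required.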

    Critical parameters for efficient sonication and improved chromatin immunoprecipitation of high molecular weight proteins

    Solubilization of cross-linked cells followed by chromatin shearing is essential for successful chromatin immunoprecipitation (ChIP). However, this task, typically accomplished by ultrasound treatment, may often become a pitfall of the process, due to inconsistent results obtained between different experiments under seemingly identical conditions. To address this issue, we systematically studied ultrasound-mediated cell lysis and chromatin shearing, identified critical parameters of the process, and formulated a generic strategy for rational optimization of ultrasound treatment. We also demonstrated that whereas the ultrasound treatment required to shear chromatin to within a range of 100–400 bp typically degrades large proteins, a combination of brief sonication and benzonase digestion allows for the generation of similarly sized chromatin fragments while preserving the integrity of associated proteins. This approach should drastically improve ChIP efficiency for this class of proteins.

    On Adaptive Security of Delayed-Input Sigma Protocols and Fiat-Shamir NIZKs

    We study adaptive security of delayed-input Sigma protocols and non-interactive zero-knowledge (NIZK) proof systems in the common reference string (CRS) model. Our contributions are threefold:
    - We exhibit a generic compiler taking any delayed-input Sigma protocol and returning a delayed-input Sigma protocol satisfying adaptive-input special honest-verifier zero-knowledge (SHVZK). In case the initial Sigma protocol also satisfies adaptive-input special soundness, our compiler preserves this property.
    - We revisit the recent paradigm by Canetti et al. (STOC 2019) for obtaining NIZK proof systems in the CRS model via the Fiat-Shamir transform applied to so-called trapdoor Sigma protocols, in the context of adaptive security. In particular, assuming correlation-intractable hash functions for all sparse relations, we prove that Fiat-Shamir NIZKs satisfy either: (i) adaptive soundness (and non-adaptive zero-knowledge), so long as the challenge is obtained by hashing both the prover's first round and the instance being proven; (ii) adaptive zero-knowledge (and non-adaptive soundness), so long as the challenge is obtained by hashing only the prover's first round, and further assuming that the initial trapdoor Sigma protocol satisfies adaptive-input SHVZK.
    - We exhibit a generic compiler taking any Sigma protocol and returning a trapdoor Sigma protocol. Unfortunately, this transform does not preserve the delayed-input property of the initial Sigma protocol (if any). To complement this result, we also give yet another compiler taking any delayed-input trapdoor Sigma protocol and returning a delayed-input trapdoor Sigma protocol with adaptive-input SHVZK.
    An attractive feature of our first two compilers is that they allow obtaining efficient delayed-input Sigma protocols with adaptive security, and efficient Fiat-Shamir NIZKs with adaptive soundness (and non-adaptive zero-knowledge) in the CRS model. Prior to our work, the latter was only possible using generic NP reductions.
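The two hashing conventions contrasted in the abstract, hashing the instance together with the prover's first round versus hashing the first round alone, can be shown with a toy Fiat-Shamir NIZK built from Schnorr's Σ-protocol. This is a minimal sketch under assumed toy parameters, not the paper's trapdoor Σ-protocol construction; SHA-256 stands in for the correlation-intractable hash.

```python
import hashlib
import secrets

# Toy Schnorr group (illustration only): p = 2q + 1, g generates the order-q subgroup.
p, q, g = 2039, 1019, 4

def fs_challenge(a, inst):
    """Fiat-Shamir challenge. Passing the instance binds it into the hash
    (variant (i): adaptive soundness); passing inst=None hashes only the
    prover's first round (variant (ii): adaptive zero-knowledge)."""
    data = a.to_bytes(4, "big")
    if inst is not None:
        data += inst.to_bytes(4, "big")
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % q

def nizk_prove(y, w, bind_instance=True):
    r = secrets.randbelow(q)
    a = pow(g, r, p)                          # prover's first round
    e = fs_challenge(a, y if bind_instance else None)
    z = (r + e * w) % q
    return a, z

def nizk_verify(y, proof, bind_instance=True):
    a, z = proof
    e = fs_challenge(a, y if bind_instance else None)
    return pow(g, z, p) == (a * pow(y, e, p)) % p

w = secrets.randbelow(q)
y = pow(g, w, p)
assert nizk_verify(y, nizk_prove(y, w))
```

In variant (ii) the challenge does not depend on y, so the first round (and the challenge) can be computed before the instance is known, which is what makes the delayed-input setting work.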

    Pharmacological Potential of Hippophae rhamnoides L. Nano-Emulsion for Management of Polycystic Ovarian Syndrome in Animals’ Model: In Vitro and In Vivo Studies

    The most common female endocrinopathy, polycystic ovarian syndrome (PCOS), generally affects women of childbearing age. Hippophae rhamnoides L. has been traditionally used to improve menstrual cyclicity. Gas chromatography with flame ionization detection analysis showed that it contained various phytoconstituents such as omega-3 fatty acids, phytosterols, palmitic acid, oleic acid, and linoleic acid. An H. rhamnoides L. (HR) nano-emulsion was also formulated. HR and its encapsulated nano-emulsion (HRNE) were evaluated for the treatment of PCOS. Thirty-five healthy female adult albino rats were acquired and divided into seven groups (n = 5). Letrozole (1 mg/kg) was used for 5 weeks to induce the disease. To confirm disease (PCOS) induction, the animals were weighed weekly and their vaginal smears were analyzed daily under a microscope. After PCOS induction, animals were treated with metformin, HR, and HRNE at two different doses (0.5 g/kg and 1 g/kg, p.o.) for 5 weeks. At the end of the treatment, animals were euthanized, and blood was collected for hormonal assessment, lipid profiling, and liver function test assessment. Both ovaries were preserved for histopathology, and the liver for assessment of antioxidant potential. The results revealed that HR and HRNE at both doses improved the hormonal imbalance: follicle-stimulating hormone, estrogen, and progesterone levels increased, while the luteinizing hormone surge and testosterone levels were controlled. Insulin sensitivity also improved. Ovarian histopathology showed that normal ovarian echotexture was restored, with corpora lutea and mature and developing follicles. HR and HRNE also improved the lipid profile and decreased lipid peroxidation (MDA) with improved antioxidant markers (SOD, CAT, and GSH). Results were statistically analyzed by one-way analysis of variance and were considered significant only if p < 0.05. In conclusion, it can be postulated that H. rhamnoides L. proved effective in the management of PCOS, and that its nano-emulsion's effects were statistically more significant, which might be due to better bioavailability.

    Improved Methodologies for Biomass Wet Chemical Analysis

    The purpose of this thesis study was to further the development of lignocellulosic biomass as a potential renewable energy source by investigating new wet chemical compositional analysis techniques to be used to monitor changes in biomass composition resulting from size reduction and separation processes such as grinding and sieving. Numerous disadvantages of the standard wet chemical analysis procedure, as developed by the US Department of Energy and the National Renewable Energy Laboratory (NREL), were identified as targets for possible improvements. The overall objective was the utilization of ionic liquids as a "green" alternative to the aqueous acidic solvents employed in the NREL protocol. These experiments included direct spectral analyses to quantify the lignin constituent, and successive enzymatic hydrolysis for quantification of the cellulose constituent. Results contained herein revealed that solubilization of biomass occurred in ionic liquids, which allowed for rapid spectroscopic determination of its lignin composition. The enzymatic hydrolysis of cellulose occurred in an ionic liquid-rich solvent system, and quantification of the cellulolytic monosaccharide products was achieved using high performance liquid chromatography. Motivated by the disadvantages associated with the NREL biomass compositional analysis procedure, a new analysis procedure utilizing ionic liquids was proposed and developed as an approach aimed towards improving laboratory safety and analysis time. The study was approached by first quantifying the solubility of biomass in ionic liquids. Direct quantification of the lignin content was conducted by two methods: UV-visible spectrophotometric analysis after the addition of a dilution agent, acetonitrile, and Fourier transform infrared spectroscopy. The cellulose component of yellow poplar was then completely hydrolyzed using a cellulolytic enzyme in the ionic liquid-rich reaction media, and the hydrolysate was then analyzed by high performance liquid chromatography for the quantification of glucose monomeric units. Success was achieved in the design of the analysis procedure, and it was employed for the quantification of lignin and cellulose in yellow poplar. There was also a highly predictable conversion of cellulose to glucose and cellobiose by the cellulase in the ionic liquid-rich reaction media. A biomass compositional analysis procedure for the quantification of lignin and cellulose was created and was observed to be consistent with the results from the NREL protocol. The total lignin content as a percent of dry mass in yellow poplar was found to be 25.1% ± 0.8 using the NREL protocol, and 21.5% ± 0.4 and 25.6% ± 0.1 by the UV-visible and Fourier transform infrared spectroscopy approaches, respectively, in the methods described herein. The glucan component was quantified as 43.5% ± 0.5 utilizing the NREL protocol and 43.6% ± 0.3 through analysis of the enzymatic hydrolysate as part of these methodologies.