510 research outputs found

    A Serious Games Development Environment

    Un ambiente per lo sviluppo di Serious Game

    Chile en su literatura (1973-2008)


    Na+–K+–2Cl− cotransporter type 2 trafficking and activity: The role of interacting proteins.

    The central role of the Na+–K+–2Cl− cotransporter type 2 (NKCC2) in vectorial transepithelial salt reabsorption in thick ascending limb cells of Henle's loop in the kidney is evidenced by the effects of loop diuretics, the pharmacological inhibitors of NKCC2, which are amongst the most powerful antihypertensive drugs available to date. Moreover, genetic mutations of the NKCC2-encoding gene that impair apical targeting and function of the transporter give rise to a pathological phenotype known as type I Bartter syndrome, characterised by severe volume depletion, hypokalaemia and metabolic alkalosis with high prenatal mortality. Conversely, excessive NKCC2 activity has been linked with inherited hypertension in humans and in rodent models. Interestingly, in animal models of hypertension, NKCC2 upregulation is achieved by post-translational mechanisms, underscoring the need to analyse the molecular mechanisms involved in the regulation of NKCC2 trafficking and activity to gain insight into the pathogenesis of hypertension.

    Fine-Grained Derandomization: From Problem-Centric to Resource-Centric Complexity

    We show that popular hardness conjectures about problems from the field of fine-grained complexity theory imply structural results for resource-based complexity classes. Namely, we show that if either k-Orthogonal Vectors or k-CLIQUE requires n^{epsilon k} time to count on randomized machines, for some constant epsilon > 1/2 and for all but finitely many input lengths (note that these conjectures are significantly weaker than the usual ones made for these problems), then we have the following derandomizations:
    - BPP can be decided in polynomial time using only n^alpha random bits on average over any efficient input distribution, for any constant alpha > 0;
    - BPP can be decided in polynomial time with no randomness on average over the uniform distribution.
    This answers in the positive an open question of Ball et al. (STOC '17) of whether derandomization can be achieved from conjectures of fine-grained complexity theory. More strongly, these derandomizations improve over all previous ones obtained from worst-case uniform assumptions by succeeding on all but finitely many input lengths; previously, derandomizations from worst-case uniform assumptions were only known to succeed on infinitely many input lengths. It is specifically the structure and moderate hardness of the k-Orthogonal Vectors and k-CLIQUE problems that make removing this restriction possible. Via this uniform derandomization, we connect the problem-centric and resource-centric views of complexity theory by showing that exact hardness assumptions about specific problems like k-CLIQUE imply quantitative and qualitative relationships between randomized and deterministic time. This can be viewed either as a barrier to proving some of the main conjectures of fine-grained complexity theory short of a major breakthrough in unconditional derandomization or, optimistically, as a route to attaining such derandomizations by working on very concrete and weak conjectures about specific problems.
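
    To make the counting problem referenced above concrete, here is a minimal brute-force sketch (an illustrative baseline of my own, not taken from the paper): it counts orthogonal k-tuples among n Boolean vectors of dimension d by exhaustive search, taking roughly n^k * d time. The conjecture used above asserts that even randomized algorithms cannot count much faster than n^{epsilon k} for some constant epsilon > 1/2.

        from itertools import combinations

        def count_k_orthogonal(vectors, k):
            """Count k-tuples of 0/1 vectors that are 'orthogonal', i.e. in every
            coordinate at least one of the k vectors is zero (so the coordinate-wise
            product sums to zero). Brute force: about C(n, k) * d work."""
            d = len(vectors[0])
            count = 0
            for tup in combinations(vectors, k):
                if all(any(v[i] == 0 for v in tup) for i in range(d)):
                    count += 1
            return count

        # Tiny illustrative input (hypothetical data):
        vecs = [(1, 0, 1), (0, 1, 0), (1, 1, 0), (0, 0, 1)]
        print(count_k_orthogonal(vecs, k=2))  # -> 3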

    Memoria e tempo spazializzati nella nuova narrativa del Sud.

    The study begins with a re-reading of the postmodern paradigm shifts that transform the perception of space and time in light of the most recent discoveries in physics and the neurocognitive sciences, and then examines how the new narrative of the South, in its difficult relationship with History, manages to experiment with new modes of artistic representation of the latter through the spatial dimension.

    Agnostic Learning from Tolerant Natural Proofs

    We generalize the "learning algorithms from natural properties" framework of [CIKK16] to get agnostic learning algorithms from natural properties with extra features. We show that if a natural property (in the sense of Razborov and Rudich [RR97]) is useful also against functions that are close to the class of "easy" functions, rather than just against "easy" functions, then it can be used to get an agnostic learning algorithm over the uniform distribution with membership queries.
    * For AC0[q], for any prime q (constant-depth circuits of polynomial size, with AND, OR, NOT, and MODq gates of unbounded fan-in), which happens to have a natural property with the requisite extra feature by [Raz87, Smo87, RR97], we obtain the first agnostic learning algorithm for AC0[q], for every prime q. Our algorithm runs in randomized quasi-polynomial time, uses membership queries, and outputs a circuit for a given Boolean function f that agrees with f on all but at most a polylog(n)*opt fraction of inputs, where opt is the relative distance between f and the closest function h in the class AC0[q].
    * In the ideal case, a natural proof of strongly exponential correlation circuit lower bounds against a circuit class C containing AC0[2] (i.e., circuits of size exp(Omega(n)) cannot compute some n-variate function even with exp(-Omega(n)) advantage over random guessing) would yield a polynomial-time query agnostic learning algorithm for C with approximation error O(opt).
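
    To make the error guarantee above concrete, the small sketch below (my own illustration, not from the paper) spells out the quantity opt: the minimum relative distance between the target f and any function in the hypothesis class. An agnostic learner is then allowed error that degrades gracefully with opt, here by a polylog(n) factor.

        from itertools import product

        def relative_distance(f, g, n):
            """Fraction of the 2^n inputs on which Boolean functions f and g disagree."""
            points = list(product([0, 1], repeat=n))
            return sum(f(x) != g(x) for x in points) / len(points)

        def agnostic_opt(f, hypothesis_class, n):
            """opt = distance from f to the closest function in the class."""
            return min(relative_distance(f, h, n) for h in hypothesis_class)

        # Toy example with a hypothetical two-function class {AND, OR}:
        target = lambda x: x[0] ^ x[1]               # XOR is outside the toy class
        toy_class = [lambda x: x[0] & x[1],          # AND
                     lambda x: x[0] | x[1]]          # OR
        print(agnostic_opt(target, toy_class, n=2))  # -> 0.25 (XOR is closest to OR)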

    Hardness Amplification for Non-Commutative Arithmetic Circuits

    We show that proving mildly super-linear lower bounds on non-commutative arithmetic circuits implies exponential lower bounds on non-commutative circuits. That is, non-commutative circuit complexity is a threshold phenomenon: an apparently weak lower bound actually suffices to show the strongest lower bounds we could desire. This is part of a recent line of inquiry into why arithmetic circuit complexity, despite being a heavily restricted version of Boolean complexity, still cannot prove super-linear lower bounds on general devices. One can view our work as positive news (it suffices to prove weak lower bounds to get strong ones) or negative news (it is as hard to prove weak lower bounds as it is to prove strong ones). We leave it to the reader to determine their own level of optimism.
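
    For readers less familiar with the model, a minimal illustration (my own, not from the paper) of what makes the non-commutative setting restrictive: variables may not be reordered, so x*y and y*x are distinct monomials. Substituting matrices, a natural non-commutative domain, shows the difference.

        import numpy as np

        # In non-commutative arithmetic, x*y and y*x are different polynomials:
        # a circuit computing x*y cannot be rearranged into one computing y*x.
        x = np.array([[0, 1],
                      [0, 0]])
        y = np.array([[0, 0],
                      [1, 0]])

        print(x @ y)  # [[1 0], [0 0]]
        print(y @ x)  # [[0 0], [0 1]]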
    • …