974 research outputs found

    Manage entropy in Architectural Projects: assuming commitments and ordering diversity

    Get PDF
    It was essential to convey to the students the importance of their own responsibility for the project, transcending the usual condition of a mere academic exercise in the school. Their projects become real actions to transform reality from the moment they are the product of responsible, critical thinking. The regular development of the Architectural Projects course, with theoretical classes and collective critiques in the Projects Workshop, was complemented by a series of methodologies to activate individual and collective learning. The project acts critically on the city model, proposes an architectural intervention with transformative intent, and the classroom becomes a social space where individual and collective commitments come together. This dialectic with reality concluded with an exhibition of the projects, which the students presented to the neighbourhood residents at the end of the course, first at the school and finally at their civic centre.

    Group linear algorithm with sparse principal decomposition: a variable selection and clustering method for generalized linear models

    Full text link
    This paper introduces the Group Linear Algorithm with Sparse Principal decomposition, an algorithm for supervised variable selection and clustering. Our approach extends the Sparse Group Lasso regularization to calculate clusters as part of the model fit. Therefore, unlike the Sparse Group Lasso, our method does not require prior specification of clusters of variables. To determine the clusters, we solve a particular case of sparse Singular Value Decomposition, with a regularization term that follows naturally from the Group Lasso penalty. Moreover, this paper proposes a unified implementation that handles, but is not limited to, linear regression, logistic regression, and proportional hazards models with right-censoring. Our methodology is evaluated on both biological and simulated data, and details of the implementation in R and of the hyperparameter search are discussed.
    Laria, JC.; Aguilera-Morillo, MC.; Lillo, RE. (2022). Group linear algorithm with sparse principal decomposition: a variable selection and clustering method for generalized linear models. Statistical Papers 64(1):227-253. https://doi.org/10.1007/s00362-022-01313-z
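    As background, a sketch of the standard Sparse Group Lasso objective that this method extends, written here for the linear-regression case in generic notation rather than the authors' own:

    \min_{\beta}\; \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2 \;+\; (1-\alpha)\,\lambda \sum_{g=1}^{G} \sqrt{p_g}\,\lVert \beta_g \rVert_2 \;+\; \alpha\,\lambda\,\lVert \beta \rVert_1

    where the p_g-dimensional blocks \beta_g are pre-specified groups of coefficients and \alpha \in [0,1] trades group-wise sparsity off against element-wise sparsity. The point of the abstract above is that this grouping need not be supplied in advance: the clusters are estimated during the fit via a sparse Singular Value Decomposition step.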

    Adaptive sparse group LASSO in quantile regression

    Full text link
    This paper studies the introduction of the sparse group LASSO (SGL) to the quantile regression framework. Additionally, a more flexible version, an adaptive SGL, is proposed based on the adaptive idea, that is, the use of adaptive weights in the penalization. Adaptive estimators are usually focused on the study of the oracle property under asymptotic and double-asymptotic frameworks. A key step in the demonstration of this property is to consider adaptive weights based on an initial root-n-consistent estimator. In practice this implies the use of a non-penalized estimator, which limits adaptive solutions to low-dimensional scenarios. In this work, several solutions based on the dimension-reduction techniques PCA and PLS are studied for the calculation of these weights in high-dimensional frameworks. The benefits of this proposal are studied on both synthetic and real datasets.
    We appreciate the work of the referees, which has contributed to substantially improving the scientific contributions of this work. In this research we made use of Uranus, a supercomputer cluster located at University Carlos III of Madrid and funded jointly by EU-FEDER funds and by the Spanish Government via the National Projects No. UNC313-4E-2361, No. ENE2009-12213-C03-03, No. ENE2012-33219 and No. ENE2015-68265-P. This research was partially supported by research grants and Project ECO2015-66593-P from Ministerio de Economia, Industria y Competitividad, Project MTM2017-88708-P from Ministerio de Economia y Competitividad, FEDER funds, and Project IJCI-2017-34038 from Agencia Estatal de Investigacion, Ministerio de Ciencia, Innovacion y Universidades.
    Mendez-Civieta, A.; Aguilera-Morillo, MC.; Lillo, RE. (2021). Adaptive sparse group LASSO in quantile regression. Advances in Data Analysis and Classification 15:547-573. https://doi.org/10.1007/s11634-020-00413-8
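    As a rough sketch of the ingredients described above (generic notation, not the paper's exact formulation): quantile regression at level \tau replaces the squared loss by the check function, and the adaptive sparse group penalty reweights each term using an initial estimate \tilde\beta,

    \rho_\tau(u) = u\,\bigl(\tau - \mathbf{1}\{u < 0\}\bigr), \qquad
    \min_{\beta}\; \frac{1}{n}\sum_{i=1}^{n} \rho_\tau\!\left(y_i - x_i^{\top}\beta\right) \;+\; \alpha\,\lambda \sum_{j} \tilde w_j\,\lvert \beta_j \rvert \;+\; (1-\alpha)\,\lambda \sum_{g} \tilde v_g \sqrt{p_g}\,\lVert \beta_g \rVert_2 ,

    with weights of the usual adaptive form, e.g. \tilde w_j = 1/\lvert \tilde\beta_j \rvert^{\gamma}. The contribution summarized above is how to obtain a usable \tilde\beta in high dimensions, where a root-n-consistent unpenalized fit is unavailable, by building it from PCA or PLS components.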

    A low-voltage low-power front-end for wearable EEG systems

    No full text
    A low-voltage and low-power front-end for miniaturized, wearable EEG systems is presented. The instrumentation amplifier, which removes the electrode drift and conditions the signal for a 10-bit A/D converter, combines a chopping strategy with quasi-FGMOS (QFG) transistors to minimize low-frequency noise whilst enabling operation at a 1 V supply. QFG devices are also key to the A/D converter operating at 1.2 V with 70 dB of SNR and an oversampling ratio of 64. The whole system consumes less than 2 µW at 1.2 V.
    Published version
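    As a back-of-envelope consistency check (assuming an ideal quantizer and plain oversampling with no noise shaping, which the abstract does not specify), the attainable SNR of an N-bit converter oversampled by a factor OSR is roughly

    \mathrm{SNR}_{\max} \approx 6.02\,N + 1.76 + 10\log_{10}(\mathrm{OSR}) = 6.02\cdot 10 + 1.76 + 10\log_{10} 64 \approx 80\ \mathrm{dB},

    so the reported 70 dB sits plausibly below that ideal bound once circuit noise and the 1.2 V supply headroom are taken into account.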

    Comparative effects of several cyclodextrins on the extraction of PAHs from a real contaminated soil

    Get PDF
    Polycyclic aromatic hydrocarbons (PAHs) are persistent organic pollutants (POPs) attracting extensive attention worldwide. Soils from many sites, such as areas of coal storage, coke oven plants, manufactured gas plants and areas of coal tar spillage, present a high level of contamination by PAHs. Due to their low solubility in water, the presence of PAHs in the soil matrix constitutes a long-term source of groundwater contamination, and their toxic, mutagenic and carcinogenic properties make the remediation of PAH-contaminated soil a major environmental concern. In order to enhance the desorption rate of organic pollutants, various extracting agents have been used. Recently, cyclodextrins (CDs) have been proposed as an alternative agent to enhance the water solubility of hydrophobic compounds and thus their availability for biodegradation. The objectives of the present work were to identify the level of PAHs in an aged contaminated soil sample from a former chemical industry plant, and to evaluate the ability of a natural cyclodextrin (β-cyclodextrin, BCD) and three chemically modified cyclodextrins, 2-hydroxypropyl-β-cyclodextrin (HPBCD), partially methylated β-cyclodextrin (PMBCD) and hydroxypropyl-γ-cyclodextrin (HPGCD), to extract the sixteen PAHs considered priority pollutants by the US-EPA. A real contaminated soil from the surroundings of a deserted chemical plant situated in Asturias (north of Spain) was analyzed exhaustively in order to determine its PAH content. Extraction experiments using a Ca(NO3)2 solution or the different cyclodextrin solutions were then carried out on the same soil. The results presented in this study show that, according to Spanish legislation, the analyzed soil had to be considered contaminated. Its total PAH content was about 1068.77±100.81 mg kg-1, with phenanthrene, anthracene and naphthalene being the most abundant compounds (25.3, 24.7 and 17.1 % of the total PAH content of the soil, respectively). The extraction percentages obtained with the CD solutions were always higher than when the aqueous solution alone was used, and the three chemically modified cyclodextrins achieved higher extraction percentages than the natural cyclodextrin (BCD). Among the sixteen selected PAHs, the highest extraction percentages were always obtained for the 3-ring PAHs, which is related to the better match of these compounds' size and shape to the dimensions of the CD cavity.
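    As a quick arithmetic check on the figures just quoted (using only the numbers reported above), those percentages correspond to roughly

    0.253 \times 1068.77 \approx 270\ \mathrm{mg\,kg^{-1}}\ \text{phenanthrene}, \quad 0.247 \times 1068.77 \approx 264\ \mathrm{mg\,kg^{-1}}\ \text{anthracene}, \quad 0.171 \times 1068.77 \approx 183\ \mathrm{mg\,kg^{-1}}\ \text{naphthalene},

    i.e. these three compounds alone account for about two thirds of the total PAH load of the soil.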

    A Framework to Evaluate Software Developer’s Productivity: The VALORTIA Project

    Get PDF
    Currently, companies developing software lack a means of assessing their staff’s productivity before executing software projects, with the aim of improving effectiveness and efficiency. QuEF (Quality Evaluation Framework) is a framework that allows quality management tasks to be defined based on a model. The main purpose of this framework is twofold: to improve an entity’s continuous quality and, given a context, to decide which of a set of the entity’s instances is the most appropriate. Thus, the aim of this paper is to make this framework available for evaluating the productivity of professionals throughout software development and for selecting the most appropriate experts to implement the proposed project. To this end, the Valortia platform, capable of carrying out this task by following the QuEF framework guidelines, is designed. Valortia is a platform to certify users' knowledge in a specific area and to centralize all certification management in its model by providing protocols and methods for suitable management, improving efficiency and effectiveness, reducing cost and ensuring continuous quality.
    Ministerio de Ciencia e Innovación TIN2013-46928-C3-3-

    Gain in Stochastic Resonance: Precise Numerics versus Linear Response Theory beyond the Two-Mode Approximation

    Get PDF
    In the context of the phenomenon of Stochastic Resonance (SR), we study the correlation function, the signal-to-noise ratio (SNR) and the ratio of output over input SNR, i.e. the gain, associated with the nonlinear response of a bistable system driven by time-periodic forces and white Gaussian noise. These quantifiers for SR are evaluated using the techniques of Linear Response Theory (LRT) beyond the usually employed two-mode approximation scheme. We analytically demonstrate within such an extended LRT description that the gain can indeed not exceed unity. We implement an efficient algorithm, based on work by Greenside and Helfand (detailed in the Appendix), to integrate the driven Langevin equation over a wide range of parameter values, and the predictions of LRT are carefully tested against the results of these numerical solutions. We further present an accurate procedure to evaluate the distinct contributions of the coherent and incoherent parts of the correlation function to the SNR and the gain. As a main result we show, for subthreshold driving, that both the correlation function and the SNR can deviate substantially from the predictions of LRT and yet the gain can be either larger or smaller than unity. In particular, we find that the gain can exceed unity in the strongly nonlinear regime, which is characterized by weak noise and very slow multifrequency subthreshold input signals with a small duty cycle. This latter result is in agreement with recent analogue simulation results by Gingl et al. in Refs. [18, 19].
    Comment: 22 pages, 5 eps figures, submitted to PR
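    For orientation, the archetypal model behind such studies is the overdamped bistable Langevin equation driven by a periodic force and white Gaussian noise (a generic sketch; the paper's specific potential and multifrequency drive may differ in detail):

    \dot{x}(t) = x - x^{3} + F(t) + \xi(t), \qquad \langle \xi(t)\,\xi(s) \rangle = 2D\,\delta(t - s),

    with the gain defined as G = \mathrm{SNR}_{\mathrm{out}} / \mathrm{SNR}_{\mathrm{in}}. The claim above is that LRT bounds G by unity, while the full nonlinear dynamics can push it above unity for weak noise and slow subthreshold driving.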

    Tuning the program transformers from LCC to PDL

    Get PDF
    This work proposes an alternative definition of the so-called program transformers used to obtain reduction axioms in the Logic of Communication and Change (LCC). Our proposal uses an elegant matrix treatment of Brzozowski’s equational method instead of Kleene’s translation from finite automata to regular expressions. The two alternatives are shown to be equivalent, with Brzozowski’s method having the advantage of generating smaller expressions for models with average connectivity
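    The equational method mentioned above rests on Arden's rule, stated here in its classical regular-expression form (the matrix treatment over PDL programs used in the paper may differ in detail):

    X = A\,X + B \;\Longrightarrow\; X = A^{*}B \quad \text{provided } \varepsilon \notin L(A),

    so a system of equations X = M\,X + C, with M a matrix of expressions, can be solved by Gaussian-style elimination, one unknown at a time, instead of applying the Kleene translation mentioned above.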

    Unidimensional model of the ad-atom diffusion on a substrate submitted to a standing acoustic wave I. Derivation of the ad-atom motion equation

    Full text link
    The effect of a standing acoustic wave on the diffusion of an ad-atom on a crystalline surface is studied theoretically. We use a one-dimensional space model for the ad-atom+substrate system. The dynamic equation of the ad-atom, a Generalized Langevin equation, is analytically derived from the full Hamiltonian of the ad-atom+substrate system subjected to the acoustic wave. A detailed analysis of each term of this equation, as well as of their properties, is presented. Special attention is devoted to the expression of the effective force induced by the wave on the ad-atom. It has essentially the same spatial and time dependences as its parent standing acoustic wave.
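    For reference, a Generalized Langevin equation of the kind described has the generic form (a sketch only; the memory kernel, effective potential and wave-induced force below stand in for the quantities actually derived in the paper):

    m\,\ddot{q}(t) = -\,\partial_q V(q) \;-\; \int_0^{t} \gamma(t - s)\,\dot{q}(s)\,\mathrm{d}s \;+\; F_{\mathrm{wave}}(q, t) \;+\; \eta(t),

    where the coloured noise \eta obeys a fluctuation-dissipation relation of the form \langle \eta(t)\,\eta(s) \rangle = k_B T\,\gamma(t - s) (up to convention) and F_wave carries the spatial and temporal dependence of the standing acoustic wave, as stated in the abstract.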