The Smooth-Lasso and other ℓ1+ℓ2-penalized methods
We consider a linear regression problem in a high-dimensional setting where the number of covariates p can be much larger than the sample size n. In such a situation, one often assumes sparsity of the regression vector, i.e., that the regression vector contains many zero components. We propose a Lasso-type estimator β^Quad (where 'Quad' stands for quadratic) which is based on two penalty terms. The first is the ℓ1 norm of the regression coefficients, used to exploit the sparsity of the regression as done by the Lasso estimator, whereas the second is a quadratic penalty term introduced to capture additional information on the setting of the problem. We detail two special cases: the Elastic-Net, which deals with sparse problems where correlations between variables may exist; and the Smooth-Lasso, which responds to sparse problems where successive regression coefficients are known to vary slowly (in some situations, this can also be interpreted in terms of correlations between successive variables). From a theoretical point of view, we establish variable selection consistency results and show that β^Quad achieves a Sparsity Inequality, i.e., a bound in terms of the number of non-zero components of the 'true' regression vector. These results hold under a weaker assumption on the Gram matrix than the one used by the Lasso, which in some situations guarantees a significant improvement over the Lasso. Furthermore, a simulation study shows that the S-Lasso performs better than known methods such as the Lasso, the Elastic-Net, and the Fused-Lasso with respect to estimation accuracy. This is especially the case when the regression vector is 'smooth', i.e., when the variations between successive coefficients of the unknown regression parameter are small. The study also reveals that the theoretical calibration of the tuning parameters and the calibration based on 10-fold cross-validation yield two S-Lasso solutions with close performance.
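The criterion described above combines an ℓ1 sparsity term with a quadratic term; a minimal numerical sketch of the Smooth-Lasso special case follows (the variable names and the exact form of the fusion-type smoothness term are illustrative assumptions, not the paper's formulation):

```python
import numpy as np

def smooth_lasso_objective(beta, X, y, lam1, lam2):
    """Least-squares loss plus an l1 penalty (sparsity, as in the Lasso)
    and a quadratic penalty on successive coefficient differences
    (the 'smoothness' term of the Smooth-Lasso special case)."""
    residual = y - X @ beta
    loss = 0.5 * np.sum(residual ** 2)
    l1 = lam1 * np.sum(np.abs(beta))            # sparsity penalty
    smooth = lam2 * np.sum(np.diff(beta) ** 2)  # quadratic smoothness penalty
    return loss + l1 + smooth

# Toy check: a sparse vector with slowly varying coefficients is
# penalized less than an equally sparse but jagged one.
X = np.eye(4)
y = np.zeros(4)
smooth_beta = np.array([0.0, 1.0, 1.0, 0.0])
jagged_beta = np.array([1.0, -1.0, 1.0, -1.0])
```

Replacing the difference-based quadratic term with lam2 * np.sum(beta ** 2) recovers the Elastic-Net special case mentioned above.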
Efficient H.264 Intra-Frame CODEC with Best Prediction Matrix Mode Algorithm
The continuous growth of smart communities and the ever-increasing demand for sending or storing videos have led to the consumption of huge amounts of data. Video compression techniques address this emerging challenge; among them, the H.264 standard can be considered the most notable, and it has proven able to meet demanding requirements. The authors present the Best Prediction Matrix Mode (BPMM) as a novel, efficient intra prediction scheme. The proposed technique was created in a phased manner: it emerged as a proposal and achieved impressive results in performance parameters such as compression ratio, bit rate, and PSNR. In the second stage, we overcame the obstacle of encoding-bit overhead. In this research, we address the final phase of the BPMM codec and present our approach end to end through the realization of a decoding mechanism. To evaluate our scheme, we used VHDL as a platform. The final results prove that we passed the bottleneck of this phase, since the decoded videos have the same PSNR that our encoder reports, while preserving a steady compression ratio despite the overhead. We hope our BPMM algorithm will be adopted as a reference design for H.264 in the ITU.
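PSNR, one of the performance parameters cited above, is a standard fidelity measure between an original and a decoded frame; a generic sketch follows (this is the textbook definition, not the authors' implementation):

```python
import numpy as np

def psnr(original, decoded, peak=255.0):
    """Peak signal-to-noise ratio in dB between two frames.
    Higher PSNR means the decoded frame is closer to the original."""
    diff = original.astype(np.float64) - decoded.astype(np.float64)
    mse = np.mean(diff ** 2)  # mean squared error per pixel
    if mse == 0:
        return float("inf")   # identical frames
    return 10.0 * np.log10(peak ** 2 / mse)
```

The claim that the decoded videos "have the same PSNR that our encoder tells us" amounts to this value, computed at the decoder, matching the encoder's reported value.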
The Effect of Atorvastatin Intensity, Obesity, Gender, and Age upon New-onset Type 2 Diabetes Incidence among Libyan Coronary Heart Disease Patients
Background: Atorvastatin is one of the statin family of lipid-lowering drugs. Statins have protective actions against cardiovascular disease; however, their use has been linked to an increased risk of diabetes.
Aim of the study: To assess the prevalence of diabetes following the use of atorvastatin and to evaluate the effect of statin intensity, BMI, age, and gender on glycemic control and the incidence of new-onset type 2 diabetes (NOD).
Material and methods: 200 CHD patients were divided into two groups: the atorvastatin group (using atorvastatin, 40 or 80 mg/day, for < 2 months) and the control group (non-atorvastatin users). FBG, HbA1c, total cholesterol, triglycerides, ALT, and AST were investigated, and BMI was calculated for all participants.
Results: Overall, the NOD incidence was 17% (17 patients developed NOD out of 100 atorvastatin-consuming patients), whereas there were no NOD cases in the non-atorvastatin group (100 cases). The oldest (≄ 70 years) atorvastatin-using patients showed the highest incidence of NOD (43.8%). The NOD incidence was higher in the male group, with 22 NOD cases (27.5%), than in the female group, which showed 3 NOD cases (15%). Regarding the effect of BMI, the obese group (BMI ≄ 30) showed a higher incidence of NOD (23 cases, 45.1%) than the non-obese group (BMI < 30), which showed 13 NOD cases (26.5%) of the atorvastatin-using patients. Regarding statin intensity, the 80 mg/day atorvastatin subgroup showed 7 NOD cases (30%), which was higher than the 40 mg/day atorvastatin subgroup, which showed 18 NOD cases (23%) of the atorvastatin-using patients.
Conclusion: Atorvastatin treatment was significantly implicated in the development of NOD. NOD incidence increased with higher statin doses. The risk of NOD was also affected by other factors, including obesity, gender, and old age.
Pramipexole protective effect on rotenone-induced neurotoxicity in mice
Introduction: 
Pramipexole is a dopaminergic drug that has been approved for the treatment of Parkinson's disease (PD). Here, however, we investigated a potential neuroprotective capacity for this drug beyond its symptomatic effect.

Materials and Methods: 
A chronic rotenone model was induced in mice with a daily oral dose of 30 mg/kg. Pramipexole was tested in a new approach in which treatment began in the middle of the rotenone course, at an oral dose of 1 mg/kg/day.

Results: 
Analysis of behavioral tests and immunohistochemistry revealed that pramipexole improved the condition of the rotenone-intoxicated mice.

Conclusion: 
These results showed possible beneficial effects of pramipexole against rotenone-induced neurotoxicity.
Cartilage Dysfunction in ALS Patients as Side Effect of Motion Loss: 3D Mechano-Electrochemical Computational Model
Amyotrophic lateral sclerosis (ALS) is a debilitating motor neuron disease characterized by progressive weakness, muscle atrophy, and fasciculation. The resulting loss of motion leads to continuous degeneration and dysfunction of articular soft tissues. Specifically, cartilage is an avascular and nonneural connective tissue that allows smooth motion in diarthrodial joints. Because of the avascular nature of cartilage, cell nutrition and by-product exchange occur intermittently, driven by joint motion. Reduced mobility results in changes in the proteoglycan density, osmotic pressure, and permeability of the tissue. This work aims to demonstrate the abnormal cartilage deformation in progressively immobilized articular cartilage in ALS patients. To this end, a novel 3D mechano-electrochemical model based on the triphasic theory for charged hydrated soft tissues is developed. ALS patient parameters such as tissue porosity, osmotic coefficient, and fixed anion concentration were incorporated. Considering the different mobility reductions at each phase of the disease, the model predicted the degree of tissue degeneration and the reduction of its capacity for deformation. The present model can be a useful tool to predict the evolution of joints in ALS patients and the need to include specific cartilage protectors, drugs, or maintenance physical activities as part of the symptomatic treatment of amyotrophic lateral sclerosis.
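In triphasic models of the kind described above, the osmotic pressure that couples fixed charges to tissue deformation is commonly written, under ideal Donnan assumptions, as (a standard textbook form sketched here for orientation, not the paper's own equation):

```latex
\Delta\pi = R\,T\,\phi \left( \sqrt{c_F^{2} + 4\,\bar{c}^{*2}} \;-\; 2\,\bar{c}^{*} \right)
```

where $c_F$ is the fixed charge density, $\bar{c}^{*}$ the salt concentration of the external bath, $\phi$ the osmotic coefficient, $R$ the gas constant, and $T$ the absolute temperature. A drop in proteoglycan density (as with reduced mobility) lowers $c_F$ and hence the osmotic pressure sustaining the tissue.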
siRNA blocking of mammalian target of rapamycin (mTOR) attenuates pathology in annonacin-induced tauopathy in mice
Tauopathy is a pathological hallmark of many neurodegenerative diseases. It is characterized by abnormal aggregates of pathological phosphotau and somatodendritic redistribution. One suggested strategy for treating tauopathy is to stimulate autophagy, hence, getting rid of these pathological protein aggregates. One key controller of autophagy is mTOR. Since stimulation of mTOR leads to inhibition of autophagy, inhibitors of mTOR will cause stimulation of autophagy process. In this report, tauopathy was induced in mice using annonacin. Blocking of mTOR was achieved through stereotaxic injection of siRNA against mTOR. The behavioral and immunohistochemical evaluation revealed the development of tauopathy model as proven by deterioration of behavioral performance in open field test and significant tau aggregates in annonacin-treated mice. Blocking of mTOR revealed significant clearance of tau aggregates in the injected side; however, tau expression was not affected by mTOR blockage
Influence of Annealing Temperature on Structural, Electrical, and Magnetic Properties of Nd0.7Ca0.3MnO3
In this paper, we investigated the effect of annealing temperature on the electrical and magnetic properties of polycrystalline Nd0.7Ca0.3MnO3 synthesized using the well-known solid-state reaction technique. After the formation of the required perovskite crystal structure phase, an additional annealing treatment was performed. The selected annealing temperatures were 700, 800, and 900 °C for 12 hours. Structural refinement of the X-ray diffraction patterns showed the formation of a single orthorhombic crystal structure phase of the Pbnm space group in Nd0.7Ca0.3MnO3 without any impurity peaks. From magnetoresistance measurements, we found that the NCMO samples have high colossal magnetoresistance (CMR). Moreover, the investigated NCMO samples showed a high power factor. The resistivity data in the insulating region (T > TMI) were analyzed by considering the Mott variable-range hopping model. The phase transition temperature depends on the grain size: the Curie temperature (TC) increases with increasing grain size.
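The Mott variable-range-hopping analysis mentioned above fits the insulating-region resistivity to the 3D form ρ(T) = ρ0 exp[(T0/T)^(1/4)], which is linear in T^(−1/4) after taking logarithms; a sketch of such a fit (the variable names and the synthetic data are illustrative, not the paper's measurements):

```python
import numpy as np

def vrh_resistivity(T, rho0, T0):
    """3D Mott variable-range-hopping model: rho = rho0 * exp((T0/T)**0.25)."""
    return rho0 * np.exp((T0 / T) ** 0.25)

# ln(rho) is linear in T**(-1/4), so rho0 and the characteristic
# temperature T0 follow from a straight-line fit.
T = np.linspace(150.0, 300.0, 50)                # K, insulating region (T > TMI)
rho = vrh_resistivity(T, rho0=1e-3, T0=1e6)      # synthetic data from the model
slope, intercept = np.polyfit(T ** -0.25, np.log(rho), 1)
T0_fit = slope ** 4          # slope = T0**(1/4)
rho0_fit = np.exp(intercept)
```

With real data, deviations from this straight line indicate where the VRH regime breaks down, e.g. near the metal-insulator transition.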
Personalized Quantification of Facial Normality using Artificial Intelligence
Although congenital facial deformities are not rare and surgeons routinely perform operations to improve them, the success of surgical reconstruction operations can currently only be "measured" subjectively by surgeons and specialists. No efficient objective mechanism for comparing the outcomes of plastic reconstruction surgeries or the progress of different surgical techniques exists at present. The aim of this research project is to develop an efficient software application that plastic surgeons can use as an objective measurement tool for the success of an operation. The long-term vision is a user-friendly application that can be downloaded to a regular laptop and used by doctors and patients to assess the progress of surgical reconstruction procedures. The application works by scanning a face before and after an operation and providing the surgeon with a normality score from 0 to 3, where 3 represents normal and 0 represents extreme abnormality. A score is given when the face is scanned before and after surgery; the difference between those scores is what we call the delta. A high delta value indicates a large improvement in the normality of a face post-surgery, and a low delta value indicates a small improvement. The first chapter of the thesis is the introduction, which describes the general aspects of the project. The second chapter presents the methodology employed for building the application, the existing solutions, and the proposed functional model structure. The results chapter presents the process behind collecting and labeling the image database and analyzes the scores produced by the program when fed new images from the database. Finally, the last chapter presents the conclusions. The list of references completes this work.
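The scoring scheme described above reduces to a few lines (a sketch only; the function name is hypothetical, and in the real application the scores would come from the trained model, not be passed in by hand):

```python
def normality_delta(score_before, score_after):
    """Improvement in the 0-3 facial normality score after surgery.
    A positive delta means the face scored closer to normal post-operation."""
    for s in (score_before, score_after):
        if not 0.0 <= s <= 3.0:
            raise ValueError("normality scores must lie in [0, 3]")
    return score_after - score_before

# Example: a face scoring 1.2 pre-op and 2.6 post-op shows a
# large improvement; 2.0 pre-op and 2.0 post-op shows none.
```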
Myomectomy for fibroids during cesarean section: A randomized controlled trial
Background: There is considerable debate about the management of myomas during cesarean section (CS). Recently, several studies have indicated the safety and feasibility of undertaking myomectomy during CS.
Objectives: To evaluate the safety, accessibility, and short-term morbidity of myomectomy for fibroids during cesarean section.
Patients and Methods: This was a randomized controlled trial that included 72 patients with uterine fibroids during pregnancy admitted to the Obstetrics & Gynecology Department, Menoufia University Hospital; they were randomly allocated equally into a cesarean myomectomy group (CM; n = 36) and a CS-only group (n = 36). The operative events and outcomes were recorded and analyzed.
Results: The CM group showed a longer duration of surgery, a longer hospital stay, a higher amount of blood loss, and higher mean pain scores, with a highly statistically significant difference (p < 0.001). No cases in either group required blood transfusion or ICU admission. No statistically significant differences were noted between the groups regarding the fetal outcome measures (p = 0.583 and 0.601).
Conclusion: CM is safe and applicable in selected cases without deleterious maternal complications. Special precautions ought to be taken during the procedure, particularly with the intramural type and with large fibroids.