Solvency Ratio Analysis at PT Bank Sulut
To cover funding shortfalls, a company must obtain funds from its own capital and from borrowed sources. The combination of these two sources is expressed by the solvency (or leverage) ratio, which measures the extent to which the company's assets are financed by debt and hence the company's ability to pay all of its liabilities, both short and long term. This study, entitled "Solvency Ratio Analysis at PT Bank of North Sulawesi", uses a descriptive method to analyze the company's financial statements, in particular its solvency, in order to determine the bank's ability to raise funds, to finance its activities, and the efficiency of its equity relative to its assets. The results show that the company's total assets and total liabilities tended to increase from year to year. The Primary Ratio showed an increasing trend from year to year; the Risk Assets Ratio tended to increase from year to year; the Capital Ratio rose from year to year, except in 2012 owing to declining reserves or losses on earning assets; and the Capital Adequacy Ratio (CAR) likewise trended upward. In conclusion, the Primary Ratio, Risk Assets Ratio, Capital Ratio and CAR all show increasing trends, so PT Bank of North Sulawesi can be regarded as a solvent company whose capital adequacy (CAR) requirement was met from 2010 to 2013.
As a suggestion, the company should further improve its ability to raise and provide funds, so that operational activities remain funded and PT Bank of North Sulawesi's CAR stays at a solvent level
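As a rough illustration of the ratios the study tracks, the sketch below computes the Primary Ratio, Risk Assets Ratio and CAR from common banking-textbook definitions (the study does not state its exact formulas, so treat these as assumptions). The balance-sheet figures are invented, not taken from PT Bank Sulut's statements.

```python
# Solvency ratios from common textbook definitions; all input figures are
# hypothetical, NOT taken from PT Bank Sulut's financial statements.

def primary_ratio(equity, total_assets):
    """Equity capital as a share of total assets."""
    return equity / total_assets

def risk_assets_ratio(equity, total_assets, cash, securities):
    """Equity capital against the assets that carry credit risk."""
    return equity / (total_assets - cash - securities)

def capital_adequacy_ratio(regulatory_capital, risk_weighted_assets):
    """CAR: regulatory capital over risk-weighted assets (Basel-style)."""
    return regulatory_capital / risk_weighted_assets

# Hypothetical figures, in billions of rupiah
equity, assets = 850.0, 8_500.0
print(f"Primary ratio:     {primary_ratio(equity, assets):.2%}")
print(f"Risk assets ratio: {risk_assets_ratio(equity, assets, 500.0, 2_000.0):.2%}")
print(f"CAR:               {capital_adequacy_ratio(1_000.0, 7_200.0):.2%}")
```

An upward trend in these ratios over successive years is what the abstract reports as evidence of solvency.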
Role of the impact parameter in exoplanet transmission spectroscopy
Transmission spectroscopy is a promising tool for the atmospheric
characterization of transiting exoplanets. Because the planetary signal is
faint, discrepancies have been reported regarding individual targets. We
investigate the dependence of the estimated transmission spectrum on deviations
of the orbital parameters of the star-planet system that are due to the
limb-darkening effects of the host star. We describe how the uncertainty on the
orbital parameters translates into an uncertainty on the planetary spectral
slope. We created synthetic transit light curves in seven different wavelength
bands, from the near-ultraviolet to the near-infrared, and fit them with
transit models parameterized by fixed deviating values of the impact parameter
b. Our simulations show a wavelength-dependent offset that is more pronounced
at blue wavelengths, where the limb-darkening effect is stronger. This offset
introduces a slope in the planetary transmission spectrum that becomes steeper
with increasing b. Variations of b by positive or negative values within its
uncertainty interval introduce positive or negative slopes, forming an error
envelope. The slope amplitude from blue optical to near-infrared wavelengths
for a typical uncertainty on b corresponds to one atmospheric pressure scale
height or more. This impact parameter degeneracy is confirmed for different
host types; K stars present markedly steeper slopes, while M stars show
features at the blue wavelengths. We demonstrate that transmission spectra can
be hard to interpret, largely because of the limitations in defining a precise
impact parameter value for a transiting exoplanet, which consequently limits
the characterization of its atmosphere
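The abstract measures the spurious slope in units of one pressure scale height. As a reminder of that yardstick, the sketch below evaluates H = k_B T / (mu m_H g); the planet values are generic hot-Jupiter assumptions, not parameters from the paper.

```python
# Pressure scale height H = k_B * T / (mu * m_H * g), the unit in which the
# abstract expresses the spurious spectral slope. The planet parameters below
# are generic hot-Jupiter assumptions, not values from the paper.
K_B = 1.380649e-23      # Boltzmann constant, J/K
M_H = 1.6735575e-27     # hydrogen atom mass, kg

def scale_height(T_eq, mu_amu, g):
    """Scale height in metres, for equilibrium temperature T_eq (K),
    mean molecular weight mu_amu (amu) and gravity g (m/s^2)."""
    return K_B * T_eq / (mu_amu * M_H * g)

H = scale_height(T_eq=1200.0, mu_amu=2.3, g=21.0)  # hot-Jupiter-like
print(f"H = {H / 1e3:.0f} km")
```

A spurious slope of one scale height is therefore comparable to a genuine atmospheric feature, which is why the degeneracy matters.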
Decoding human mental states by whole-head EEG+fNIRS during category fluency task performance
Objective: Concurrent scalp electroencephalography (EEG) and functional near-infrared spectroscopy (fNIRS), which we refer to as EEG+fNIRS, promises greater accuracy than the individual modalities while remaining nearly as convenient as EEG. We sought to quantify the hybrid system's ability to decode mental states and compare it with unimodal systems.
Approach: We recorded from healthy volunteers taking the category fluency test and applied machine learning techniques to the data.
Main results: EEG+fNIRS's decoding accuracy was greater than that of its subsystems, partly due to the new type of neurovascular features made available by hybrid data.
Significance: Availability of an accurate and practical decoding method has potential implications for medical diagnosis, brain-computer interface design, and neuroergonomics
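A minimal sketch of the kind of early fusion a hybrid EEG+fNIRS system enables: concatenating the two modalities' feature vectors before classification. The data are synthetic Gaussian blobs and the nearest-centroid classifier is a stand-in; this is not the authors' pipeline or feature set.

```python
# Early fusion of two modalities by feature concatenation, on synthetic data.
# A hypothetical nearest-centroid classifier stands in for the study's
# machine-learning methods.
import numpy as np

rng = np.random.default_rng(0)
n = 200
labels = rng.integers(0, 2, n)
# Synthetic "EEG" and "fNIRS" features, each weakly shifted by the label
eeg = rng.normal(0.0, 1.0, (n, 30)) + labels[:, None] * 0.5
fnirs = rng.normal(0.0, 1.0, (n, 20)) + labels[:, None] * 0.5
hybrid = np.hstack([eeg, fnirs])   # simple early fusion

def nearest_centroid_accuracy(X, y):
    """Fit class centroids on the first half, report accuracy on the second."""
    half = len(y) // 2
    Xtr, ytr, Xte, yte = X[:half], y[:half], X[half:], y[half:]
    c0, c1 = Xtr[ytr == 0].mean(0), Xtr[ytr == 1].mean(0)
    pred = (np.linalg.norm(Xte - c1, axis=1)
            < np.linalg.norm(Xte - c0, axis=1)).astype(int)
    return (pred == yte).mean()

for name, X in [("EEG", eeg), ("fNIRS", fnirs), ("EEG+fNIRS", hybrid)]:
    print(f"{name:10s} accuracy: {nearest_centroid_accuracy(X, labels):.2f}")
```

With informative features in both modalities, the concatenated vector carries at least as much class information as either modality alone, which is the intuition behind the hybrid system's accuracy gain.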
High-density optical neuroimaging predicts surgeons' subjective experience and skill levels
Measuring cognitive load is important for surgical education and patient safety. Traditional approaches to measuring surgeons' cognitive load use behavioural metrics to assess performance, and surveys and questionnaires to collect reports of subjective experience. These have disadvantages such as sporadic data, occasionally intrusive methodologies, and subjective or misleading self-reporting; in addition, such subjective metrics cannot distinguish between skill levels. Functional neuroimaging data were collected using a high-density, wireless NIRS device from sixteen surgeons (11 attending surgeons and 5 surgical residents) and 17 students while they performed two laparoscopic tasks (peg transfer and string pass). Participants' subjective mental load was assessed using the NASA-TLX survey, and machine learning approaches were used to predict subjective experience and skill levels. Prefrontal cortex (PFC) activations were greater in students who reported higher-than-median task load on the NASA-TLX survey; in attending surgeons, however, the opposite tendency was observed, with higher activations in the lower- than in the higher-task-load subjects. The response was greater in the left PFC of students, particularly near the dorso- and ventrolateral areas. We quantified the ability of PFC activation to predict differences in skill and task load using machine learning, focussing on the effect of NIRS channel separation distance on the results. Our results showed that skill level and subjective task load could be classified from PFC activation with an accuracy of nearly 90%. This finding shows that the optical signals carry sufficient information to make accurate predictions about surgeons' subjective experiences and skill levels.
The high accuracy of these results is encouraging and suggests that the strategy developed in this study is a promising basis for automated, more accurate and objective evaluation methods
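The higher-than-median task-load grouping used in the study can be sketched as a simple median split of NASA-TLX scores. The helper and the scores below are hypothetical, not the study's data.

```python
# Median-split labelling of subjective workload: subjects reporting
# higher-than-median NASA-TLX scores form the high-load class.
import statistics

def median_split(tlx_scores):
    """Return 1 for higher-than-median task load, else 0 (hypothetical helper)."""
    med = statistics.median(tlx_scores)
    return [1 if s > med else 0 for s in tlx_scores]

scores = [42, 55, 61, 38, 70, 49, 58]   # invented NASA-TLX overall scores
print(median_split(scores))             # -> [0, 0, 1, 0, 1, 0, 1]
```

These binary labels are the kind of target a classifier trained on PFC activation would predict.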
Integromic analysis of genetic variation and gene expression identifies networks for cardiovascular disease phenotypes
BACKGROUND: Cardiovascular disease (CVD) reflects a highly coordinated complex of traits. Although genome-wide association studies have reported numerous single nucleotide polymorphisms (SNPs) to be associated with CVD, the role of most of these variants in disease processes remains unknown. METHODS AND RESULTS: We built a CVD network using 1512 SNPs associated with 21 CVD traits in genome-wide association studies (at P≤5×10⁻⁸) and cross-linked different traits by virtue of their shared SNP associations. We then explored whole-blood gene expression in relation to these SNPs in 5257 participants in the Framingham Heart Study. At a false discovery rate <0.05, we identified 370 cis-expression quantitative trait loci (eQTLs; SNPs associated with altered expression of nearby genes) and 44 trans-eQTLs (SNPs associated with altered expression of remote genes). The eQTL network revealed 13 CVD-related modules. Searching for associations of eQTL genes with CVD risk factors (lipids, blood pressure, fasting blood glucose, and body mass index) in the same individuals, we found examples in which the expression of eQTL genes was significantly associated with these CVD phenotypes. In addition, mediation tests suggested that a subset of SNPs previously associated with CVD phenotypes in genome-wide association studies may exert their function by altering expression of eQTL genes (e.g., LDLR and PCSK7), which in turn may promote interindividual variation in phenotypes. CONCLUSIONS: Using a network approach to analyze CVD traits, we identified complex networks of SNP-phenotype and SNP-transcript connections. Integrating the CVD network with phenotypic data, we identified biological pathways that may provide insights into potential drug targets for treatment or prevention of CVD
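The eQTLs in the study are declared at a false discovery rate below 0.05. The abstract does not name the FDR method, but a common choice for this kind of multiple-testing control is the Benjamini-Hochberg step-up procedure, sketched here as a generic illustration.

```python
# Benjamini-Hochberg step-up procedure for controlling the false discovery
# rate, a common (assumed, not stated) choice for eQTL discovery at FDR < 0.05.
def benjamini_hochberg(pvals, alpha=0.05):
    """Return a boolean list: True where the hypothesis is rejected."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    # Largest rank k with p_(k) <= k * alpha / m
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * alpha / m:
            k_max = rank
    rejected = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            rejected[i] = True
    return rejected

pvals = [1e-8, 0.003, 0.04, 0.2, 0.6]   # illustrative association p-values
print(benjamini_hochberg(pvals))        # -> [True, True, False, False, False]
```

Each SNP-transcript association with a rejected null at FDR < 0.05 would count as a cis- or trans-eQTL depending on its distance from the gene.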
Genetics and epigenetics of liver cancer
Hepatocellular carcinoma (HCC) represents a major form of primary liver cancer in adults. Chronic infection with the hepatitis B (HBV) and C (HCV) viruses and alcohol abuse are the major factors leading to HCC. This deadly cancer affects more than 500,000 people worldwide and is quite resistant to conventional chemo- and radiotherapy. Genetic and epigenetic studies on HCC may help to better understand its mechanisms and provide new tools for early diagnosis and therapy. Recent literature on whole-genome analysis of HCC indicates a high number of mutated genes in addition to well-known genes such as TP53, CTNNB1, AXIN1 and CDKN2A, but at much lower frequencies. Apart from CTNNB1 mutations, most of the other mutations appear to result in loss of function, so HCC-associated mutations cannot be easily targeted for therapy. Epigenetic aberrations, which appear to occur quite frequently, may serve as new targets. Global DNA hypomethylation, promoter methylation, aberrant expression of non-coding RNAs and dysregulated expression of other epigenetic regulatory genes such as EZH2 are the best-known epigenetic abnormalities. Future research in this direction may help to identify novel biomarkers and therapeutic targets for HCC
Probing the atmosphere of HD189733b with the Na I and K I lines
High spectral resolution transmission spectroscopy is a powerful tool for
characterizing exoplanet atmospheres. The technique is especially relevant for
hot Jupiters because of their high-altitude absorption, e.g. from the resonant
sodium (Na I) and potassium (K I) lines. We resolve the atmospheric K I
absorption of HD189733b with the aim of comparing the resolved K I line and
previously obtained high-resolution Na I D-line observations with synthetic
transmission spectra. The line profiles suggest atmospheric processes leading
to a line broadening of the order of 10 km/s for the Na I D lines, but of only
a few km/s for the K I line. The investigation hints that either the
atmosphere of HD189733b lacks a significant amount of K I or the alkali lines
probe different atmospheric regions with different temperatures, which could
explain the differences we see in the resolved absorption lines
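The broadening velocities quoted above follow from the Doppler relation Δv = c·Δλ/λ. The sketch below converts a measured line width to km/s; the widths used are illustrative values chosen to reproduce the quoted orders of magnitude, not the paper's measurements.

```python
# Converting a measured line width to a broadening velocity via the Doppler
# relation dv = c * dlambda / lambda, the quantity the abstract quotes in
# km/s. The widths below are illustrative, not the measured values.
C = 299_792.458  # speed of light, km/s

def width_to_velocity(delta_lambda_nm, line_center_nm):
    """Broadening velocity (km/s) for width delta_lambda at line_center."""
    return C * delta_lambda_nm / line_center_nm

# Na I D2 at 589.0 nm and K I D1 at 769.9 nm (laboratory wavelengths)
print(f"Na D: {width_to_velocity(0.0196, 589.0):.1f} km/s")   # ~10 km/s
print(f"K I : {width_to_velocity(0.0077, 769.9):.1f} km/s")   # ~3 km/s
```

The contrast between the two velocities is what drives the abstract's conclusion that the lines probe different amounts of K I or different atmospheric regions.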
Sense of place in the changing process of house form: Case studies from Ankara, Turkey
This paper investigates the impact of typomorphological changes in residential environments on residents' "sense of place". Seven housing developments representing different types introduced in Ankara, Turkey since the late 19th century are selected as case studies. Their morphological characters at the building, street and neighbourhood scales are examined, and typological transformations among the cases, in terms of degrees of continuity, are identified. The paper proposes a conceptual model consisting of ten indicators to assess residents' sense of place at the building, street and neighbourhood scales across the seven cases. Sense-of-place scores are generated through structured interviews with the residents and analysed in SPSS. The results show that sense of place is negatively affected by typomorphological changes over time, particularly when mutational changes occur, whereas continuity in typomorphological transformation helps to maintain sense of place at a desirable level. Furthermore, physical changes at the street and neighbourhood scales have a larger impact on sense of place than those at the building scale. The research thus suggests that planning and design should be responsive to traditional types in residential development, particularly at the street and neighbourhood scales, to maintain residents' sense of place
ElecSim: Monte-Carlo Open-Source Agent-Based Model to Inform Policy for Long-Term Electricity Planning
Due to the threat of climate change, a transition from a fossil-fuel based
system to a zero-carbon one is required. However, this is not as simple as
instantaneously closing down all fossil fuel generation and replacing it with
renewable sources -- careful decisions need to be taken to ensure rapid but
stable progress. To aid decision makers, we present a new tool, ElecSim, an
open-source agent-based modelling framework used to examine the effect of
policy on long-term investment decisions in electricity generation. ElecSim
allows non-experts to rapidly prototype new ideas.
Different techniques for modelling long-term electricity decisions are
reviewed and used to motivate why agent-based models will become an important
strategic tool for policy, and why an open-source toolkit is required for
long-term electricity planning.
Actual electricity prices are compared with our model, and we demonstrate
that the use of Monte-Carlo simulation in the system improves performance.
Further, using ElecSim we demonstrate the effect of a carbon tax in
encouraging a low-carbon electricity supply, showing how £40 per tonne of CO2
emitted would lead to 70% renewable electricity by 2050.
Comment: e-Energy '19, Proceedings of the Tenth ACM International Conference
on Future Energy Systems
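The carbon-tax mechanism enters a generator's short-run marginal cost as emission intensity times the carbon price. The sketch below shows how a £40/tCO2 price reorders coal and gas; the fuel costs and emission factors are rough illustrative figures, not ElecSim's calibrated inputs.

```python
# A carbon price raises a plant's marginal cost in proportion to its
# emission intensity. Fuel costs and tCO2/MWh figures are illustrative
# round numbers, not ElecSim's calibrated inputs.
def marginal_cost(fuel_cost, emissions_t_per_mwh, carbon_price):
    """Short-run marginal cost in GBP/MWh including the carbon payment."""
    return fuel_cost + emissions_t_per_mwh * carbon_price

plants = {
    "coal": (25.0, 0.90),   # (fuel GBP/MWh, tCO2/MWh), illustrative
    "ccgt": (35.0, 0.35),
    "wind": (0.0, 0.0),
}
for name, (fuel, ef) in plants.items():
    print(f"{name}: {marginal_cost(fuel, ef, 40.0):.1f} GBP/MWh")
```

With these assumed figures, the £40/tCO2 price pushes coal's marginal cost above gas's and leaves zero-carbon plant cheapest, which is the merit-order shift behind the projected growth in renewable generation.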