51 research outputs found

    An Examination of a Fuzzy-Logic-Based SQL Query Method for Candidate Pre-Selection in the Recruitment Process

    Get PDF
    This study describes how fuzzy-logic querying can be applied to the human resources database in which an organization keeps its candidate records, for queries in which classical SQL querying cannot produce suitable inferences. Taking the classical database table structure as a reference, an alternative, SQL-compatible approach to solving complex queries was developed. Another point the study highlights is that, by using fuzzy techniques, classical database structures can be given the human ability, unlike conventional computation, to reason approximately and to operate on information that contains uncertainty.
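The fuzzy querying idea above can be sketched in a few lines: register a fuzzy membership function with the database engine and rank rows by membership degree instead of filtering with a crisp predicate. The table, field names, and membership parameters below are hypothetical illustrations, not the study's actual schema.

```python
import sqlite3

def triangular(x, a, b, c):
    """Triangular fuzzy membership: rises from a to a peak at b, falls to c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Hypothetical candidate table for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE candidates (name TEXT, experience_years REAL)")
conn.executemany("INSERT INTO candidates VALUES (?, ?)",
                 [("A", 1.0), ("B", 5.0), ("C", 12.0)])

# Expose the membership function to SQL, so a crisp WHERE clause becomes
# a ranked fuzzy query over 'moderately experienced' (peak at 5 years).
conn.create_function("mu_experienced", 1,
                     lambda x: triangular(x, 2.0, 5.0, 10.0))

rows = conn.execute(
    "SELECT name, mu_experienced(experience_years) AS score "
    "FROM candidates WHERE mu_experienced(experience_years) > 0 "
    "ORDER BY score DESC"
).fetchall()
```

Candidates outside the support of the membership function drop out, and the rest are ordered by how well they match the linguistic criterion rather than by a hard cutoff.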

    Evaluation of the Electronic Resource Usage of Ankara University Faculty Members

    Get PDF
    Education, research, and application activities require universities to reach scientific information in the most accurate and fastest way possible. Accordingly, members of Ankara University must access scientific information through contemporary information and communication technologies in the electronic environment, primarily via the global internet. Every scientific database should carry certain standard characteristics; these properties matter for accurate and fast access to the data the database contains. The scientific information sources that members of Ankara University need are offered electronically using contemporary information and communication technologies. This study aims to determine whether the databases in the electronic library, assembled according to the information needs of Ankara University's academic staff, are actually used; to that end, an information model and its content were specified and the corresponding applications were carried out.

    Selection effects in the black hole-bulge relation and its evolution

    Full text link
    We present an investigation of the sample selection effects that influence the observed black hole-bulge relation and its evolution with redshift. We provide a common framework in which all kinds of selection effects on the BH-bulge relation can be investigated, but our main emphasis is on the consequences of using broad-line AGN and their host galaxies to search for evolution in the BH-bulge relation. We identified relevant sources of bias that have not been discussed in the literature so far. A particularly important effect is caused by the fact that the active fraction among SMBHs varies considerably with BH mass, in the sense that high-mass BHs are less likely to be active than lower-mass ones. In combination with the intrinsic scatter of the BH-bulge relation, this effect implies a bias towards low BH masses at a given bulge property. This effect adds to the bias caused by working with luminosity- or flux-limited samples, which has already been discussed by others. A quantitative prediction of these biases requires (i) a realistic model of the sample selection function, and (ii) knowledge of the relevant underlying distribution functions. For low-redshift AGN samples we can naturally reproduce the flattening of the relation observed in some studies. When extending our analysis to higher-redshift samples we are clearly hampered by limited empirical constraints on the various relevant distribution functions. Using a best-guess approach for these distributions, we estimate the expected magnitude of sample selection biases for a number of recent observational attempts to study BH-bulge evolution. In no case do we find statistically significant evidence for an evolving BH-bulge relation. We suggest a possible practical approach to circumvent several of the most problematic issues connected with AGN selection; this could become a powerful diagnostic in future investigations (abridged).Comment: 20 pages, 20 figures, accepted for publication in A&A
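The core bias can be illustrated with a toy Monte Carlo: assume Gaussian intrinsic scatter about the BH-bulge relation and an active fraction that declines with BH mass; the AGN-selected subsample then sits systematically below the true relation at a fixed bulge property. All numbers below are illustrative assumptions, not the paper's values.

```python
import random
import statistics

random.seed(42)
scatter = 0.3                      # assumed intrinsic scatter, in dex
deltas_all, deltas_active = [], []
for _ in range(50000):
    d = random.gauss(0.0, scatter)   # log M_BH offset at fixed bulge mass
    deltas_all.append(d)
    # Assumed active fraction falling by 1 dex per dex of BH mass, so
    # overmassive BHs are less likely to enter an AGN-selected sample.
    p_active = min(1.0, 0.1 * 10.0 ** (-d))
    if random.random() < p_active:
        deltas_active.append(d)

# The AGN-selected subsample lies below the true relation (negative bias).
bias = statistics.mean(deltas_active) - statistics.mean(deltas_all)
```

With these toy parameters the selected sample is biased low by roughly 0.2 dex even though the parent population sits exactly on the relation, which is the qualitative effect the abstract describes.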

    Environmental effects on the growth of super massive black holes and AGN feedback

    Full text link
    We investigate how environmental effects from gas stripping alter the growth of a supermassive black hole (SMBH) and the evolution of its host galaxy, by means of 1D hydrodynamical simulations that include both mechanical and radiative AGN feedback effects. By changing the truncation radius of the gas distribution (R_t), beyond which gas stripping is assumed to be effective, we simulate possible environments for satellite and central galaxies in galaxy clusters and groups. The continuous escape of gas outside the truncation radius strongly suppresses star formation, while the growth of the SMBH is less affected by gas stripping because SMBH accretion is primarily governed by the density of the central region. As we allow for increasing environmental effects - the truncation radius decreasing from about 410 to 50 kpc - we find that the final SMBH mass declines from about 10^9 to 8 x 10^8 Msol, but the outflowing mass is roughly constant at about 2 x 10^10 Msol. There are larger changes in the mass of stars formed, which declines from about 2 x 10^10 to 2 x 10^9 Msol, and in the final thermal X-ray gas mass, which declines from about 10^9 to 5 x 10^8 Msol, with increasing environmental stripping. Most dramatic is the decline in the total time the objects would be seen as quasars, which drops from 52 Myr (for R_t = 377 kpc) to 7.9 Myr (for R_t = 51 kpc). The typical case might be interpreted as a red and dead galaxy having episodic cooling flows followed by AGN feedback effects, resulting in temporary transitions of the overall galaxy color from red to green or blue, with (cluster) central galaxies spending a much larger fraction of their time in the elevated state than satellite galaxies.(Abridged)Comment: Accepted for publication in ApJ

    Constraints on the faint end of the quasar luminosity function at z~5 in the COSMOS field

    Get PDF
    We present the result of our low-luminosity quasar survey in the redshift range 4.5 < z < 5.5 in the COSMOS field. Using the COSMOS photometric catalog, we selected 15 quasar candidates with 22 < i' < 24 at z~5, which are ~ 3 mag fainter than the SDSS quasars in the same redshift range. We obtained optical spectra for 14 of the 15 candidates using FOCAS on the Subaru Telescope and did not identify any low-luminosity type-1 quasars at z~5, while a low-luminosity type-2 quasar at z~5.07 was discovered. In order to constrain the faint end of the quasar luminosity function at z~5, we calculated the 1sigma confidence upper limits of the space density of type-1 quasars. As a result, the 1sigma confidence upper limits on the quasar space density are Phi < 1.33*10^{-7} Mpc^{-3} mag^{-1} for -24.52 < M_{1450} < -23.52 and Phi < 2.88*10^{-7} Mpc^{-3} mag^{-1} for -23.52 < M_{1450} < -22.52. The inferred 1sigma confidence upper limits of the space density are then used to provide constraints on the faint-end slope and the break absolute magnitude of the quasar luminosity function at z~5. We find that the quasar space density decreases gradually as a function of redshift at low luminosity (M_{1450} ~ -23), similar to the trend found for quasars with high luminosity (M_{1450} < -26). This result is consistent with the so-called downsizing evolution of quasars seen at lower redshifts.Comment: 8 pages, 9 figures, 1 table, accepted for publication in ApJ
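Turning a non-detection into a space-density upper limit follows standard Poisson statistics: for zero detected objects, the 1-sigma (84.1% confidence) upper limit on the expected count is 1.841 (Gehrels 1986), which is then divided by the effective survey volume per magnitude bin. The volume used below is a hypothetical placeholder, not the paper's value.

```python
# Poisson 1-sigma upper limit on the count for zero detections (Gehrels 1986).
N_UP_1SIGMA = 1.841

def space_density_upper_limit(effective_volume_mpc3, delta_mag=1.0):
    """Upper limit Phi [Mpc^-3 mag^-1], given zero quasars detected
    in a magnitude bin of width delta_mag over the given comoving volume."""
    return N_UP_1SIGMA / (effective_volume_mpc3 * delta_mag)
```

For example, an effective volume of ~1.8 x 10^7 Mpc^3 per 1-mag bin would yield an upper limit of order 10^{-7} Mpc^{-3} mag^{-1}, the same order as the limits quoted in the abstract.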

    The Lick AGN Monitoring Project: Recalibrating Single-Epoch Virial Black Hole Mass Estimates

    Get PDF
    We investigate the calibration and uncertainties of black hole mass estimates based on the single-epoch (SE) method, using homogeneous and high-quality multi-epoch spectra obtained by the Lick Active Galactic Nucleus (AGN) Monitoring Project for 9 local Seyfert 1 galaxies with black hole masses < 10^8 M_sun. By decomposing the spectra into their AGN and stellar components, we study the variability of the single-epoch Hbeta line width (full width at half-maximum intensity, FWHM_Hbeta; or dispersion, sigma_Hbeta) and of the AGN continuum luminosity at 5100A (L_5100). From the distribution of the "virial products" (~ FWHM_Hbeta^2 L_5100^0.5 or sigma_Hbeta^2 L_5100^0.5) measured from SE spectra, we estimate the uncertainty due to the combined variability as ~ 0.05 dex (12%). This is subdominant with respect to the total uncertainty in SE mass estimates, which is dominated by uncertainties in the size-luminosity relation and virial coefficient, and is estimated to be ~ 0.46 dex (a factor of ~ 3). By comparing the Hbeta line profile of the SE, mean, and root-mean-square (rms) spectra, we find that the Hbeta line is broader in the mean (and SE) spectra than in the rms spectra by ~ 0.1 dex (25%) for our sample with FWHM_Hbeta < 3000 km/s. This result is at variance with higher-mass black holes, for which the difference is typically found to be much less than 0.1 dex. To correct for this systematic difference in the Hbeta line profile, we introduce a line-width-dependent virial factor, resulting in a recalibration of SE black hole mass estimators for low-mass AGNs.Comment: Accepted for publication in ApJ. 18 pages, 17 figures
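The single-epoch virial product described above can be written out directly. The zero point used below follows one common Hbeta FWHM calibration (Vestergaard & Peterson 2006) and should be treated as illustrative; the paper's point is precisely that such calibrations need adjustment at low masses.

```python
import math

def virial_product(fwhm_kms, l5100):
    """Virial product ~ (FWHM_Hbeta / 1000 km/s)^2 * (L_5100 / 1e44 erg/s)^0.5."""
    return (fwhm_kms / 1000.0) ** 2 * (l5100 / 1e44) ** 0.5

def log_mbh(fwhm_kms, l5100, zero_point=6.91):
    """log10(M_BH / Msun) from a single-epoch spectrum; the zero point
    absorbs the size-luminosity relation and virial coefficient and is
    an illustrative calibration, not this paper's recalibrated value."""
    return math.log10(virial_product(fwhm_kms, l5100)) + zero_point
```

For FWHM_Hbeta = 2000 km/s and L_5100 = 10^44 erg/s this gives log M_BH ~ 7.5, inside the < 10^8 M_sun regime studied here; the quoted ~ 0.46 dex total uncertainty applies to such estimates as a whole.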

    Black Hole Mass Estimates Based on CIV are Consistent with Those Based on the Balmer Lines

    Full text link
    Using a sample of high-redshift lensed quasars from the CASTLES project with observed-frame ultraviolet or optical and near-infrared spectra, we have searched for possible biases between supermassive black hole (BH) mass estimates based on the CIV, Halpha and Hbeta broad emission lines. Our sample is based upon that of Greene, Peng & Ludwig, expanded with new near-IR spectroscopic observations, consistently analyzed high-S/N optical spectra, and consistent continuum luminosity estimates at 5100A. We find that BH mass estimates based on the FWHM of CIV show a systematic offset with respect to those obtained from the line dispersion, sigma_l, of the same emission line, but not with respect to those obtained from the FWHM of Halpha and Hbeta. The magnitude of the offset depends on the treatment of the HeII and FeII emission blended with CIV, but there is little scatter for any fixed measurement prescription. While we otherwise find no systematic offsets between CIV and Balmer line mass estimates, we do find that the residuals between them are strongly correlated with the ratio of the UV and optical continuum luminosities. Removing this dependency reduces the scatter between the UV- and optical-based BH mass estimates by a factor of approximately 2, from roughly 0.35 to 0.18 dex. The dispersion is smallest when comparing the CIV sigma_l mass estimate, after removing the offset from the FWHM estimates, and either Balmer line mass estimate. The correlation with the continuum slope is likely due to a combination of reddening, host contamination and object-dependent SED shapes. When we add additional heterogeneous measurements from the literature, the results are unchanged.Comment: Accepted for publication in The Astrophysical Journal. 37 text pages + 8 tables + 23 figures. Updated with comments from the referee and an expanded discussion of literature data, including new observations

    The X-ray properties of million solar mass black holes

    Get PDF
    We present new Chandra X-ray observations of seven low-mass black holes (~1e6 Msun) accreting at low Eddington ratios between -2.0 < log L/Ledd < -1.5. We compare the X-ray properties of these seven low-mass active galactic nuclei (AGN) to a total of 73 other low-mass AGN in the literature with published Chandra observations (with Eddington ratios extending from -2.0 < log L/Ledd < -0.1). We do not find any statistical differences between low- and high-Eddington-ratio low-mass AGN in the distributions of their X-ray to ultraviolet luminosity ratios (aox), or in their X-ray spectral shapes. Furthermore, the aox distribution of low-L/Ledd AGN displays an X-ray weak tail that is also observed among high-L/Ledd objects. Our results indicate that between -2 < log L/Ledd < -0.1 there is no systematic change in the structure of the accretion flow for active galaxies hosting 1e6 Msun black holes. We examine the accuracy of current bolometric luminosity estimates for the low-L/Ledd objects with new Chandra observations, and find it plausible that their Eddington ratios are underestimated by up to an order of magnitude. If so, then in analogy with weak emission line quasars, we suggest that accretion from a geometrically thick, radiatively inefficient `slim disk' could explain their diverse properties in aox. Alternatively, if current Eddington ratios are in fact correct (or overestimated), then the X-ray weak tail would imply that there is diversity in disk/corona couplings among individual low-mass objects. Finally, we conclude by noting that the aox distribution for low-mass black holes may have favorable consequences for the epoch of cosmic reionization being driven by AGN.Comment: 14 pages, 6 figures, 6 tables. Accepted for publication in ApJ
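The X-ray-to-UV ratio aox used throughout is a two-point spectral slope between rest-frame 2 keV and 2500 A. A minimal sketch, assuming monochromatic luminosities in the same units; the constant 0.3838 is 1/log10 of the frequency ratio between the two bands:

```python
import math

def alpha_ox(l_2kev, l_2500):
    """Two-point spectral index between rest-frame 2 keV and 2500 A.
    Inputs are monochromatic luminosities (same units); in this sign
    convention X-ray-weak objects have more negative alpha_ox."""
    return 0.3838 * math.log10(l_2kev / l_2500)
```

Equal monochromatic luminosities give aox = 0; a source whose 2 keV luminosity is four orders of magnitude below its 2500 A luminosity has aox ~ -1.54, in the range typical of luminous AGN.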

    Traffic volume prediction with long short-term memory networks enhanced by rules derived from anomalies

    No full text
    Traffic volume forecasting is crucial for building a successful smart transportation system, and the precision and timeliness of the traffic flow data are critical to a forecast's effectiveness. A lack of data has led to the use of shallow architectures in traffic forecasting models, or to models built on fabricated measurement data; such models have not been highly successful at prediction. In the world of big data, the variety and scale of the acquired traffic data have grown in lockstep with the increase in traffic density. Being able to infer future situations from past events is of great importance in any field: history repeats itself, and in many time-series problems it is possible to make predictions about the future from what happened in the past. To make such predictions, the long short-term memory (LSTM) architecture from machine learning was implemented. LSTM is a special type of recurrent neural network whose main difference from standard recurrent neural networks is its success at modeling long-term information. LSTM requires continuous data over a certain period, from which the future state is predicted. In this research, we use LSTM to estimate the number of vehicles in traffic and, by doing so, aim to see traffic congestion before it occurs. To overcome the difficulty LSTM has in predicting vehicle counts in anomalous states, we added a new layer that captures these anomalies in a rule set and uses those rules to correct the predictions. The result is an improved hybrid model, built by adding anomaly-based rules on top of the LSTM model, that achieves good results even in anomalous situations.M.S. - Master of Science
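The anomaly rule layer described above can be sketched as a post-correction step applied to the sequence model's raw output. The specific rules, thresholds, and calendar flags below are hypothetical illustrations, not the thesis's actual rule set.

```python
def apply_anomaly_rules(prediction, hour, is_holiday, recent_avg):
    """Adjust a raw model prediction of vehicle count with simple
    hand-derived anomaly rules (illustrative thresholds)."""
    if is_holiday:
        # Rule 1: holidays historically carry far less commuter traffic,
        # so scale the prediction down (factor assumed for illustration).
        prediction *= 0.6
    if 7 <= hour <= 9 and prediction < recent_avg:
        # Rule 2: during the morning rush the count should not fall
        # below the recent rolling average observed at this sensor.
        prediction = recent_avg
    return prediction
```

In the hybrid model, each rule fires only when its anomaly condition matches, leaving the underlying LSTM prediction untouched in normal traffic states.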

    Impact of business intelligence solutions on the strategies of companies operating in the financial sector

    No full text
    YÖK Tez No: 456320. Today, companies meet their needs for data in different formats from different data sources in line with changing business requirements. Data is stored and managed by different departments and systems. Business intelligence is the most effective solution for seeing the big picture, integrating all of this distributed data in a single repository. It emerged as a natural consequence of earlier systems designed to support the decision-making process. Over time, the visual shortcomings, usability difficulties, and incompatibilities between applications observed in decision support systems became leading factors in the rise of business intelligence technology. The greatest benefit such solutions offer decision makers is an up-to-date, integrated view of business performance. As business intelligence solutions have spread through the business world, significant improvements in the control and scheduling capabilities of decision support systems have also been observed. The improved data quality, control, and scheduling that come with centralized data sources allow decision makers to take fast and accurate decisions in an increasingly competitive environment. In a globalized world, business intelligence has become an important element in making and implementing strategic decisions. In this study, a sample business intelligence design was implemented with an Oracle business intelligence tool, and the results were reported as an evaluation input for companies' decision-making processes.