
    Explicit MBR All-Symbol Locality Codes

    Node failures are inevitable in distributed storage systems (DSS). To enable efficient repair after such failures, two main techniques are known: regenerating codes, which minimize the total repair bandwidth, and codes with locality, which minimize the number of nodes participating in the repair process. This paper focuses on regenerating codes with locality, using pre-coding based on Gabidulin codes, and presents constructions that use minimum-bandwidth-regenerating (MBR) local codes. The constructions achieve maximum resilience (i.e., optimal minimum distance) and maximum capacity (i.e., maximum rate). Finally, the same pre-coding mechanism can be combined with a subclass of fractional-repetition codes to achieve maximum resilience and repair-by-transfer simultaneously.
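    For orientation (this is not taken from the paper's constructions), the sketch below computes the standard MBR trade-off point from the cut-set bound of the regenerating-codes literature; the parameters k, d, and beta are illustrative.

```python
# Illustrative computation of the minimum-bandwidth-regenerating (MBR) point
# for an (n, k, d) regenerating code, following the standard cut-set bound.
# Parameters are examples, not the paper's constructions.

def mbr_point(k: int, d: int, beta: float = 1.0):
    """Return (alpha, file_size) at the MBR point.

    beta  : repair bandwidth downloaded from each of the d helper nodes
    alpha : storage per node; at the MBR point alpha = d * beta
    file size M = sum_{i=0}^{k-1} (d - i) * beta = k*d*beta - beta*k*(k-1)/2
    """
    alpha = d * beta
    file_size = k * d * beta - beta * k * (k - 1) / 2
    return alpha, file_size

if __name__ == "__main__":
    k, d = 3, 4
    alpha, M = mbr_point(k, d)
    print(f"MBR point for k={k}, d={d}: alpha={alpha}, file size M={M}")
    # Total repair bandwidth equals d*beta = alpha: a repair downloads exactly
    # as much as one node stores, the defining property of the MBR point.
```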

    Scope for Credit Risk Diversification

    This paper considers a simple model of credit risk and derives the limit distribution of losses under different assumptions about the structure of systematic risk and the nature of exposure or firm heterogeneity. We derive fat-tailed, correlated loss distributions arising from Gaussian risk factors and explore the potential for risk diversification. Where possible, the results are generalised to non-Gaussian distributions. The theoretical results indicate that if the firm parameters are heterogeneous but drawn from a common distribution, then for sufficiently large portfolios there is no scope for further risk reduction through active portfolio management. However, if the firm parameters come from different distributions, further risk reduction is possible by changing the portfolio weights. In either case, neglecting parameter heterogeneity can lead to underestimation of expected losses. But once expected losses are controlled for, neglecting parameter heterogeneity can lead to overestimation of risk, whether measured by unexpected loss or value-at-risk.
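    For intuition about the limiting behavior described above, here is a minimal Monte Carlo sketch of the textbook one-factor Gaussian (Vasicek-type) loss model; all parameters are illustrative, and the paper's heterogeneous and non-Gaussian settings go beyond this.

```python
# Minimal Monte Carlo sketch of a one-factor Gaussian credit loss model
# (Vasicek-type), for intuition only; parameters are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_firms, n_sims = 1_000, 10_000
rho = 0.2                                 # asset correlation
threshold = -2.0537                       # ~ Phi^{-1}(0.02): 2% unconditional PD

# X_i = sqrt(rho)*Z + sqrt(1-rho)*eps_i ; firm i defaults if X_i < threshold
Z = rng.standard_normal((n_sims, 1))                 # systematic factor
eps = rng.standard_normal((n_sims, n_firms))         # idiosyncratic factors
X = np.sqrt(rho) * Z + np.sqrt(1 - rho) * eps
loss = (X < threshold).mean(axis=1)                  # portfolio loss fraction

print(f"expected loss     : {loss.mean():.4f}")
print(f"99.9% VaR of loss : {np.quantile(loss, 0.999):.4f}")
# Idiosyncratic risk diversifies away as n_firms grows; the systematic factor
# Z does not, which is what produces the fat-tailed limit loss distribution.
```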

    Evaluation of individual and ensemble probabilistic forecasts of COVID-19 mortality in the United States

    Short-term probabilistic forecasts of the trajectory of the COVID-19 pandemic in the United States have served as a visible and important communication channel between the scientific modeling community and both the general public and decision-makers. Forecasting models provide specific, quantitative, and evaluable predictions that inform short-term decisions such as healthcare staffing needs, school closures, and allocation of medical supplies. Starting in April 2020, the US COVID-19 Forecast Hub (https://covid19forecasthub.org/) collected, disseminated, and synthesized tens of millions of specific predictions from more than 90 different academic, industry, and independent research groups. A multimodel ensemble forecast that combined predictions from dozens of groups every week provided the most consistently accurate probabilistic forecasts of incident deaths due to COVID-19 at the state and national level from April 2020 through October 2021. The performance of 27 individual models that submitted complete forecasts of COVID-19 deaths consistently throughout this period showed high variability in forecast skill across time, geospatial units, and forecast horizons. Two-thirds of the models evaluated showed better accuracy than a naïve baseline model. Forecast accuracy degraded as models made predictions further into the future, with probabilistic error at a 20-wk horizon three to five times larger than when predicting at a 1-wk horizon. This project underscores the role that collaboration and active coordination between governmental public-health agencies, academic modeling teams, and industry partners can play in developing modern modeling capabilities to support local, state, and federal response to outbreaks.
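    The Forecast Hub evaluations score probabilistic forecasts with the weighted interval score (WIS; Bracher et al. 2021). A minimal sketch of that score follows; the forecast values in the example are made up for illustration.

```python
# Minimal sketch of the weighted interval score (WIS) used in Forecast Hub
# evaluations (Bracher et al. 2021). Forecast values below are illustrative.

def interval_score(lower: float, upper: float, y: float, alpha: float) -> float:
    """Score one central (1 - alpha) prediction interval; lower is better."""
    penalty = 0.0
    if y < lower:
        penalty = (2 / alpha) * (lower - y)
    elif y > upper:
        penalty = (2 / alpha) * (y - upper)
    return (upper - lower) + penalty

def wis(median: float, intervals: dict, y: float) -> float:
    """intervals maps alpha -> (lower, upper) for each central interval."""
    k = len(intervals)
    total = 0.5 * abs(y - median)   # weight 1/2 on the median's absolute error
    for alpha, (lo, hi) in intervals.items():
        total += (alpha / 2) * interval_score(lo, hi, y, alpha)
    return total / (k + 0.5)

# Example: median 100, central 50% and 90% intervals, observed value y = 130.
print(wis(100.0, {0.5: (80.0, 120.0), 0.1: (60.0, 150.0)}, y=130.0))
```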

    The United States COVID-19 Forecast Hub dataset

    Academic researchers, government agencies, industry groups, and individuals have produced forecasts at an unprecedented scale during the COVID-19 pandemic. To leverage these forecasts, the United States Centers for Disease Control and Prevention (CDC) partnered with an academic research lab at the University of Massachusetts Amherst to create the US COVID-19 Forecast Hub. Launched in April 2020, the Forecast Hub is a dataset with point and probabilistic forecasts of incident cases, incident hospitalizations, incident deaths, and cumulative deaths due to COVID-19 at the county, state, and national levels in the United States. Included forecasts represent a variety of modeling approaches, data sources, and assumptions regarding the spread of COVID-19. The goal of this dataset is to establish a standardized and comparable set of short-term forecasts from modeling teams. These data can be used to develop ensemble models, communicate forecasts to the public, create visualizations, compare models, and inform policies regarding COVID-19 mitigation. These open-source data are available via download from GitHub, through an online API, and through R packages.
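    The dataset is public on GitHub, so one way to access it is to read a submission file directly with pandas. The repository is github.com/reichlab/covid19-forecast-hub; the exact file path below follows its data-processed/<team-model>/ layout but is an assumed example, so substitute a file that actually exists in the repository.

```python
# Sketch of loading one Forecast Hub submission file from GitHub with pandas.
# The path below is an assumed example of the data-processed/<team-model>/
# layout; check the repository for files that actually exist.
import pandas as pd

BASE = "https://raw.githubusercontent.com/reichlab/covid19-forecast-hub/master"
path = f"{BASE}/data-processed/COVIDhub-ensemble/2021-01-04-COVIDhub-ensemble.csv"

df = pd.read_csv(path)
# Submission files are long-format: one row per (location, target, type, quantile).
deaths = df[(df["target"] == "1 wk ahead inc death") & (df["type"] == "quantile")]
print(deaths[["location", "quantile", "value"]].head())
```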

    Innovation strategies for the development of high-tech companies in Turkey

    Innovation efforts aim to attain competitiveness by improving performance criteria, which has created strong ties between innovation and performance in current business thinking. The purpose of this study, designed in line with this understanding, is to examine the effect of innovation strategies on business performance in enterprises using high technology. Innovation strategies comprise six dimensions: proactive, risk-oriented, defensive, future-oriented, offensive, and analytical strategies. Business performance was measured as product, employee-based, customer-based, financial, and process performance, using the innovation strategy and business performance scales developed by Morgan and Strong (1998), Chandy and Tellis (1998), and Eskici (2020). The study surveyed 346 managers of high-technology companies operating in Turkey. The data were analyzed with the JAMOVI and SPSS 26.0 packages, using exploratory factor analysis, confirmatory factor analysis, Pearson correlation, and regression analysis. The results show that innovation strategies affect business performance. In particular, future-oriented strategy improves the product, customer-based, employee-based, financial, and process performance of enterprises, whereas offensive and risk-oriented strategies were not found effective for these performance dimensions. In addition, offensive, analytical, defensive, proactive, and risk-oriented strategies are ineffective for process performance, while analytical, defensive, future-oriented, and proactive strategies are effective for product, customer-based, and financial performance. Overall, innovation strategies had the strongest effects on the customer-based performance (R² = 0.687) and financial performance (R² = 0.701) of companies.
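    The regression step described above is standard; since the survey responses are not public, here is a minimal sketch of it on synthetic data, with all numbers illustrative.

```python
# Sketch of the regression step from the abstract: regressing a performance
# score on six innovation-strategy scores. Data here are synthetic; the
# study's actual survey responses (n = 346) are not reproduced.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 346
strategies = rng.normal(size=(n, 6))          # six strategy scores per manager
weights = np.array([0.5, 0.1, 0.2, 0.4, 0.3, 0.2])   # made-up true effects
financial_perf = strategies @ weights + rng.normal(scale=1.0, size=n)

X = sm.add_constant(strategies)
model = sm.OLS(financial_perf, X).fit()
print(f"R^2 = {model.rsquared:.3f}")          # analogous to the reported R² values
```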

    Pilot contamination attacks in massive MIMO systems

    We consider a single-cell massive multiple-input multiple-output (MIMO) system in which a base station (BS) with a large number of antennas simultaneously transmits to K single-antenna users in the presence of an attacker. Massive MIMO systems often operate in a time division duplexing (TDD) fashion. The BS estimates the channel state information (CSI) at receivers based on their uplink pilot transmissions. Downlink transmission rates are highly dependent on these estimates, as the BS utilizes the CSI to exploit the beamforming gain offered by massive MIMO. However, this CSI estimation phase is vulnerable to malicious attacks. Specifically, an attacker can contaminate the uplink pilot sequences by generating pilot signals identical to those of legitimate users. We formulate a denial of service (DoS) attack in which the attacker aims to minimize the sum-rate of downlink transmissions by contaminating the uplink pilots. We also consider another attack model in which the attacker generates jamming signals in both the CSI estimation and data transmission phases by exploiting in-band full-duplex techniques. We study these attacks under two power allocation strategies for downlink transmissions, in settings where the attacker either knows or does not know the locations of the BS and users. When the attacker does not have perfect location information, stochastic optimization techniques are utilized to assess the impact of the attack. The formulated problems are solved using interior-point, Lagrangian minimization, and game-theoretic methods. We obtain a closed-form solution for a special case of the problem. Our results indicate that even without perfect location information, the proposed pilot contamination attacks degrade the throughput of a massive MIMO system by more than 50% and significantly reduce fairness among users. In addition, we show that increasing the number of pilot symbols does not prevent the proposed attacks if the BS uniformly allocates powers for downlink transmissions.
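    To see why pilot contamination corrupts the channel estimate, consider least-squares estimation over the pilot block: the BS cannot separate a legitimate user from an attacker transmitting the identical pilot. The numpy sketch below illustrates this; all parameters are illustrative and not taken from the paper.

```python
# Sketch of pilot contamination: when an attacker replays the same uplink
# pilot, the least-squares channel estimate at the BS becomes the sum of the
# user's and the attacker's channels. Parameters are illustrative.
import numpy as np

rng = np.random.default_rng(2)
M, tau = 64, 8                                  # BS antennas, pilot length

pilot = np.exp(2j * np.pi * rng.random(tau))    # unit-modulus pilot sequence
h_user = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
h_attk = (rng.standard_normal(M) + 1j * rng.standard_normal(M)) / np.sqrt(2)
noise = 0.05 * (rng.standard_normal((M, tau)) + 1j * rng.standard_normal((M, tau)))

# Received pilot block: user and attacker transmit the identical pilot.
Y = np.outer(h_user, pilot) + np.outer(h_attk, pilot) + noise

h_hat = Y @ pilot.conj() / tau                  # LS estimate per antenna
err_sum = np.linalg.norm(h_hat - (h_user + h_attk))   # small: noise only
err_user = np.linalg.norm(h_hat - h_user)             # large: contaminated
print(f"distance from h_user + h_attacker (noise only): {err_sum:.3f}")
print(f"distance from the true user channel:            {err_user:.3f}")
```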

    Fundamental bound on the persistence and capacity of short-term memory stored as graded persistent activity

    It is widely believed that persistent neural activity underlies short-term memory. Yet, as we show, the degradation of information stored directly in such networks behaves differently from human short-term memory performance. We build a more general framework in which memory is viewed as a problem of passing information through noisy channels whose degradation characteristics resemble those of persistent activity networks. If the brain first encoded the information appropriately before passing it into such networks, the information could be stored substantially more faithfully. Within this framework, we derive a fundamental lower bound on recall precision, which declines with storage duration and the number of stored items. We show that human performance, though inconsistent with models involving direct (uncoded) storage in persistent activity networks, is well fit by the theoretical bound. This finding is consistent with the view that if the brain stores information in patterns of persistent activity, it might use codes that minimize the effects of noise, motivating the search for such codes in the brain.
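    A toy simulation of the abstract's premise follows: a value held as graded persistent activity drifts like a random walk, so recall error grows with storage duration and with the number of items sharing the network's resource. This illustrates the uncoded-storage baseline only, not the paper's derived bound; the noise model and scaling are assumptions.

```python
# Toy illustration of uncoded storage in graded persistent activity: the stored
# value drifts like a random walk, so recall precision falls with storage
# duration and with the number of simultaneously stored items. The noise model
# and resource-splitting assumption are illustrative, not the paper's bound.
import numpy as np

rng = np.random.default_rng(3)
n_trials, sigma = 10_000, 0.05                # drift noise per time step

for n_items in (1, 2, 4):
    for t in (1, 5, 10):
        # Assume noise per item grows when the resource is split n_items ways.
        drift = rng.standard_normal(n_trials) * sigma * np.sqrt(t * n_items)
        rmse = np.sqrt(np.mean(drift ** 2))
        print(f"items={n_items} duration={t:2d} -> recall RMSE {rmse:.3f}")
```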