5 research outputs found
Are Stable Coins Stable? Stable Coins' Exposure to Cryptocurrency and Financial Market Volatility
Stable coins are non-volatile assets that help cryptocurrency investors hedge against market volatility. These coins have market capitalizations worth billions of US dollars and are relied on throughout the world to store funds without risk. However, stable coins have failed in the past, wiping out billions from the market. In this analysis, I examine the exposure of the four largest stable coins to market movements in both cryptocurrency markets and traditional financial markets. I find that one stable coin, DAI, is the only coin with no correlation or Granger-causal relationship with either market. Tether has a statistically significant correlation with the S&P 500, and the S&P 500 Granger-causes price returns in USDC. Finally, there is a statistically significant correlation between cryptocurrency market price returns and the price returns of BUSD. These results suggest that DAI is the most trustworthy stable coin for hedging against market risk. The research identifies weaknesses in each coin that investors should take into account when investing.
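The Granger-causality testing described in this abstract can be sketched as follows. This is a minimal one-lag illustration on synthetic return series, not the study's actual code or data; a real analysis would typically use a library routine such as statsmodels' `grangercausalitytests` on the observed stable coin and index returns.

```python
import numpy as np

def granger_f_stat(x, y, lag=1):
    """F-statistic testing whether lagged x helps predict y beyond y's own lag.
    Restricted model:   y_t ~ const + y_{t-1}
    Unrestricted model: y_t ~ const + y_{t-1} + x_{t-1}
    A large F (relative to the F critical value) indicates x Granger-causes y."""
    n = len(y) - lag
    Y = y[lag:]
    Xr = np.column_stack([np.ones(n), y[:-lag]])          # restricted design
    Xu = np.column_stack([Xr, x[:-lag]])                  # + lagged x
    rss_r = np.sum((Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]) ** 2)
    rss_u = np.sum((Y - Xu @ np.linalg.lstsq(Xu, Y, rcond=None)[0]) ** 2)
    q = 1  # one restriction: the coefficient on x_{t-1}
    return ((rss_r - rss_u) / q) / (rss_u / (n - Xu.shape[1]))

# Synthetic example: y is driven by lagged x, so x Granger-causes y by construction.
rng = np.random.default_rng(0)
x = rng.standard_normal(300)
y = np.zeros(300)
for t in range(1, 300):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_f_stat(x, y))  # large: x -> y
print(granger_f_stat(y, x))  # small: y does not help predict the iid x
```

The same pattern, with the S&P 500 returns as `x` and a stable coin's returns as `y` (and appropriate lag selection), corresponds to the tests reported for USDC.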
Design and Validation of the Bright Internet
Bright Internet research was launched as a core project of the AIS Bright ICT Initiative, which aims to build an ICT-enabled Bright Society. To facilitate research on the Bright Internet, we explicitly define the goals and principles of the Bright Internet and review the evolution of its principles. The three goals of the Bright Internet are: the realization of preventive security, the provision of freedom of anonymous expression for innocent netizens, and protection from the risk of privacy infringement that may be caused by preventive security schemes. We respecify design principles to fulfill these seemingly conflicting goals: origin responsibility, deliverer responsibility, identifiable anonymity, global collaboration, and privacy protection. Research on the Bright Internet is characterized by two perspectives: first, the Bright Internet adopts a preventive security paradigm, in contrast to the current self-centric, defensive protective security paradigm. Second, the target of research is the development and deployment of the Bright Internet on a global scale, which requires the design of technologies and protocols, policies and legislation, and international collaboration and global governance. This research contrasts with behavioral research on individuals and organizations conducted under the protective security paradigm. This paper proposes validation research concerning the principles of the Bright Internet using prevention motivation theory and analogical social norm theory, and demonstrates the need for a holistic and prescriptive design for a global-scale information infrastructure, encompassing the constructs of technologies, policies, and global collaborations. An important design issue concerns the business model, which must be capable of promoting the propagation of the Bright Internet platform through applications such as Bright Cloud Extended Networks and Bright E-mail platforms.
Our research creates opportunities for prescriptive experimental research, and the various design and behavioral studies of the Bright Internet open new horizons toward our common goal of a bright future.
Combi2011 Conference Proceedings
Combi2011 is an international conference where practice and research become one. It is hosted by three Universities of Applied Sciences: HAMK, LAMK and Laurea. The themes for 2011 were: “Learning and Working in a Virtual World”, “Doing Business in a Global World and Enhancing Entrepreneurship”, and “Accelerating Innovations”.
Legal challenges to future information businesses
The thesis studies new legal challenges to future information businesses: it presents an applicable research method, lists central legal challenges, and discusses the implications of those challenges.
I have developed a scenario-based method that produces lists of legal challenges and helps to analyze them.
The method highlights information products and services from a commercial entity's viewpoint; other business aspects receive less attention. Likewise, some specific characteristics of particular companies cannot be considered in a general method like this. Therefore, challenges in legal areas such as tax law or competition law do not appear, although in practice they can be relevant. The method is still able to point out numerous relevant legal challenges.
The study focuses on the future: the time span is about two to ten years from now. The focus is on the business-to-consumer (B2C) market. The emphasis is on strategic product and service development.
I have listed, analyzed, and discussed the future legal challenges that the method has found. I conclude that the most important legal challenges to future information businesses are within the areas of privacy and data protection; intellectual property rights; and contracts. I have also discussed the major distinguishers of businesses implying legal challenges. They help to point out the specific legal challenges related to a certain information product or service.
Legal rules can affect businesses in many ways. At their best, they enable businesses, but too often they also harm useful activities. I identify the business drivers and hurdles that are important from the legal point of view.
Suchmaschinen - Eine industrieökonomische Analyse der Konzentration und ihrer Ursachen
The main topic of this doctoral thesis is the high concentration observed in search engine markets. It investigates whether the structural characteristics of these markets favor a natural concentration (monopoly/oligopoly), or whether the concentration is attributable to abusive conduct by the established search engine providers. As a further explanation, a higher (qualitative) efficiency of the providers is examined on the basis of quality and satisfaction studies. Following the introduction, Chapter 2 provides background information to support a better understanding of the analysis. Chapter 3 describes the structure and operation of a search engine: the service is decomposed into sub-processes (stages of the value chain), and the contribution of individual functions to search engine quality is determined, laying the groundwork for the analysis of the economic characteristics in Chapters 4 and 5. Chapter 4 analyzes the demand-side characteristics; the main points of investigation are the possible switching barriers of the demand groups, the network effects between and within those groups, and the platform properties that build on these network effects. Behavioral studies are used to support this argumentative analysis. Chapter 5 considers the supply side of a search engine, analyzing in particular the cost structure of maintaining a search engine service in order to identify, among other things, possible economies of scale or scope. In Chapter 6, the high concentration of the search engine markets is analyzed empirically. Chapter 7 then addresses whether the concentration can be justified by search engine quality or by the economic characteristics determined earlier; here, barriers to entry and the concentration process (winner takes all) are distinguished. In the final analytical section, the concentration factors are analyzed in chronological sequence, and the search engines are examined within the framework of the theory of contestable markets as possible contestable natural monopolies. The thesis concludes with a summary of the findings and a discussion of regulatory proposals.