Toward Interactive Music Generation: A Position Paper
Music generation using deep learning has received considerable attention in recent years. Researchers have developed various generative models capable of imitating musical conventions, learning from musical corpora, and generating new samples based on what has been learned. Although the samples generated by these models are convincing, they often lack musical structure and creativity. For instance, a vanilla end-to-end approach, which deals with all levels of music representation at once, offers no human-level control or interaction during the learning process, leading to constrained results. Music creation is, in fact, an iterative process in which a musician follows certain principles and reuses or adapts various musical features. Moreover, a musical piece adheres to a musical style, which can be broken down into distinct aspects of timbre style, performance style, and composition style, together with the coherence between them. Here, we study and analyze current advances in music generation using deep learning models according to several criteria. We discuss the shortcomings and limitations of these models with respect to interactivity and adaptability. Finally, we outline potential future research directions involving multi-agent systems and reinforcement learning algorithms to alleviate these shortcomings and limitations.
Application of homogenization theory related to Stokes flow in porous media
We consider applications, illustrations and concrete numerical treatments of some homogenization results on Stokes flow in porous media. In particular, we compute the global permeability tensor corresponding to a unidirectional array of circular fibers for several volume fractions. A three-dimensional problem is also considered.
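As a rough illustration of how the permeability of such a fiber array depends on volume fraction, the sketch below uses Gebart's analytic cell-model formula for the transverse permeability of a unidirectional array of circular fibers. The function name, packing options and example radius are illustrative choices, and the closed-form formula is a stand-in for the homogenization-based (cell-problem) computation described in the paper, not the paper's own numerical procedure.

```python
import math

def gebart_transverse_permeability(fiber_radius, fiber_volume_fraction, packing="quadratic"):
    """Analytic cell-model estimate (Gebart, 1992) of the transverse permeability
    of a unidirectional array of circular fibers. Illustrative stand-in for a
    homogenized permeability; not the paper's numerical homogenization."""
    if packing == "quadratic":
        c1 = 16.0 / (9.0 * math.pi * math.sqrt(2.0))
        vf_max = math.pi / 4.0                      # maximum packing, square array
    elif packing == "hexagonal":
        c1 = 16.0 / (9.0 * math.pi * math.sqrt(6.0))
        vf_max = math.pi / (2.0 * math.sqrt(3.0))   # maximum packing, hexagonal array
    else:
        raise ValueError("packing must be 'quadratic' or 'hexagonal'")
    if not 0.0 < fiber_volume_fraction < vf_max:
        raise ValueError("volume fraction must lie in (0, vf_max)")
    return c1 * (math.sqrt(vf_max / fiber_volume_fraction) - 1.0) ** 2.5 * fiber_radius ** 2

# Example: sweep several volume fractions for fibers of radius 10 micrometres.
for vf in (0.3, 0.4, 0.5, 0.6):
    k = gebart_transverse_permeability(10e-6, vf)
    print(f"Vf = {vf:.1f}:  K_perp ~ {k:.3e} m^2")
```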
Regression analysis using a blending type spline construction
Regression analysis allows us to track the dynamics of change in measured data and to investigate their properties. A sufficiently good model makes it possible to predict the behavior of dependent variables with higher accuracy and to propose a more precise data-generation hypothesis.
By using polynomial approximation for big data sets with complex dependencies, we obtain piecewise smooth functions. One way to obtain a smooth spline representation of an entire data set is to use local curves and to blend them with smooth basis functions. This construction allows the computation of derivatives at any point on the spline. Properties such as the tangent, velocity, acceleration, curvature and torsion can then be computed, which gives us the opportunity to exploit these data in the subsequent analysis.
We can adjust the accuracy of the approximation on different segments of the data set by choosing a suitable knot vector. This article describes a new method for determining the number and location of the knot points, based on changes in the Frenet frame.
We present an implementation using generalized expo-rational B-splines (GERBS) for regression problems in two and three variables, and we evaluate the accuracy of the model by comparing the residuals.
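The following sketch illustrates the local-curve blending idea in one variable: local least-squares polynomials are fitted around each interior knot and the two curves active on a knot interval are blended with a smooth blending function. The smoothstep blend used here is a simple stand-in for the expo-rational blending functions of GERBS, and all function names, knot choices and example data are illustrative assumptions.

```python
import numpy as np

def smoothstep(t):
    """Simple C^1 blending function on [0, 1]; a stand-in for the
    expo-rational blending functions used in GERBS."""
    t = np.clip(t, 0.0, 1.0)
    return t * t * (3.0 - 2.0 * t)

def blended_local_fit(x, y, knots, degree=1):
    """Fit a local least-squares polynomial around each interior knot
    (on the data between its two neighbouring knots) and blend the two
    local curves that are active on each knot interval."""
    local = {}
    for i in range(1, len(knots) - 1):
        mask = (x >= knots[i - 1]) & (x <= knots[i + 1])
        local[i] = np.poly1d(np.polyfit(x[mask], y[mask], degree))

    def evaluate(xs):
        xs = np.atleast_1d(np.asarray(xs, dtype=float))
        out = np.empty_like(xs)
        for j, xv in enumerate(xs):
            # Interval index i with knots[i] <= xv <= knots[i + 1], restricted
            # to intervals where two local curves are defined.
            i = int(np.clip(np.searchsorted(knots, xv) - 1, 1, len(knots) - 3))
            w = smoothstep((xv - knots[i]) / (knots[i + 1] - knots[i]))
            out[j] = (1.0 - w) * local[i](xv) + w * local[i + 1](xv)
        return out

    return evaluate

# Example: noisy samples of sin(x) approximated with a handful of local curves.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 400)
y = np.sin(x) + 0.1 * rng.standard_normal(x.size)
spline = blended_local_fit(x, y, knots=np.linspace(0.0, 10.0, 8))
inner = (x >= 10.0 / 7) & (x <= 10.0 - 10.0 / 7)   # region covered by two local curves
print("max residual:", np.max(np.abs(spline(x[inner]) - y[inner])))
```

Because the blending function and the local polynomials are both smooth, derivatives of the blended curve (and hence Frenet-frame quantities) are available everywhere on the covered region, which is what the knot-placement method in the article exploits.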
Three-dimensional performance surfaces: a tool for analysing and estimation of production system performances
This paper presents a new method for describing, analysing and estimating production system performances. Work-in-process (units), lead time (number of time units each unit spends in the production system) and throughput (number of produced units per time unit) are basic performance measures, also used in this article. It is essential for industry to know the relations between system parameters and system performances, both in existing systems and in system alternatives not yet implemented. Different performances are achieved by adjusting system parameters. Trade-offs between system parameters and the resulting performances are necessary to stay efficient and competitive in today's market. Queuing theory and simulation can help decision makers estimate system performances of existing and not yet implemented systems. When the complexity increases, queuing theory becomes cumbersome, very difficult and eventually impossible to use. A single simulation provides limited information, and multiple simulations are necessary to ensure that the best alternative is chosen. A high number of simulations demands a lot of computer time and resources, so reducing the number of runs is desirable even with cheaper computer equipment. Currently, traditional two-dimensional charts are the only tools for presenting and analysing system performances. This article presents a new surrogate model for easier estimation and presentation of system performances, their internal relations, and their relations to the system parameters. With the new surrogate model, system performances based on simulations are represented as positions in a three-dimensional environment. Parametric curves and surfaces of Bezier type are generated and adapted to these positions. System performances of other system alternatives can then be estimated without explicit simulation, and the number of simulation runs can thereby be moderated. The method is illustrated with a small production line system.
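A minimal sketch of the surrogate idea, assuming the simulated performance triples (work-in-process, lead time, throughput) are treated as 3-D points parameterised by two normalised system parameters: a tensor-product Bezier surface is fitted by least squares and then evaluated at new parameter values instead of running another simulation. The synthetic grid, degrees and performance formulas below are illustrative, not the paper's data or exact fitting procedure.

```python
import numpy as np
from math import comb

def bernstein(n, i, t):
    """Bernstein polynomial B_{i,n}(t)."""
    return comb(n, i) * t**i * (1.0 - t)**(n - i)

def fit_bezier_surface(params_u, params_v, points, deg_u=3, deg_v=3):
    """Least-squares fit of a Bezier surface to scattered 3-D samples.
    params_u/params_v in [0, 1] parameterise two system parameters;
    points holds the measured performance triples (WIP, lead time, throughput)."""
    A = np.array([
        [bernstein(deg_u, i, u) * bernstein(deg_v, j, v)
         for i in range(deg_u + 1) for j in range(deg_v + 1)]
        for u, v in zip(params_u, params_v)
    ])
    ctrl, *_ = np.linalg.lstsq(A, points, rcond=None)
    return ctrl.reshape(deg_u + 1, deg_v + 1, 3)

def eval_bezier_surface(ctrl, u, v):
    """Evaluate the fitted surface at parameter values (u, v) in [0, 1]^2."""
    deg_u, deg_v = ctrl.shape[0] - 1, ctrl.shape[1] - 1
    return sum(bernstein(deg_u, i, u) * bernstein(deg_v, j, v) * ctrl[i, j]
               for i in range(deg_u + 1) for j in range(deg_v + 1))

# Example with synthetic "simulation" results on a 5x5 grid of parameter settings.
u, v = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
u, v = u.ravel(), v.ravel()
wip = 2.0 + 8.0 * u * v
lead = 1.0 + 4.0 * u
thr = 0.5 + 2.0 * v
pts = np.column_stack([wip, lead, thr])
ctrl = fit_bezier_surface(u, v, pts)
print(eval_bezier_surface(ctrl, 0.35, 0.7))   # surrogate estimate, no new simulation
```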
A field study of sensors for winter road assessment
Winter road assessment is a research field that has seen considerable progress over the last 10 years. Various sensors and methods have been tested and analysed, often in a laboratory setting, in order to arrive at robust and valid assessment tools that can be used to warn the driver, road users in general, and maintenance personnel of critical conditions. In this paper we compare field measurements from an RCM411 and a MARWIS sensor with each other and with previously performed laboratory experiments, we reflect on OBD-II as a tool in winter road assessment, and we perform initial field tests with an experimental radar sensor. The results of the RCM411/MARWIS comparison show a significant correlation between our field experiments and the laboratory experiments, OBD-II appears to be suitable as a supplementary tool in the assessments, and the experimental radar tests uncover a need for further investigation.
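To illustrate the kind of comparison involved, the sketch below aligns two irregularly sampled sensor series onto a common time grid and computes their Pearson correlation. The variable names, the synthetic friction signal and the sampling assumptions are illustrative only; this is not the paper's analysis pipeline or the sensors' actual output format.

```python
import numpy as np

def compare_sensor_series(t_a, x_a, t_b, x_b, grid_step=1.0):
    """Align two irregularly sampled series (e.g. road-condition values reported
    by an RCM411 and a MARWIS) onto a common time grid by linear interpolation
    and return their Pearson correlation coefficient."""
    t0, t1 = max(t_a[0], t_b[0]), min(t_a[-1], t_b[-1])
    grid = np.arange(t0, t1, grid_step)
    a = np.interp(grid, t_a, x_a)
    b = np.interp(grid, t_b, x_b)
    return np.corrcoef(a, b)[0, 1]

# Example with synthetic friction-like readings (timestamps in seconds).
rng = np.random.default_rng(1)
t_a = np.sort(rng.uniform(0, 600, 300))
t_b = np.sort(rng.uniform(0, 600, 200))
truth = lambda t: 0.4 + 0.2 * np.sin(t / 60.0)     # assumed underlying road condition
x_a = truth(t_a) + 0.02 * rng.standard_normal(t_a.size)
x_b = truth(t_b) + 0.03 * rng.standard_normal(t_b.size)
print(f"Pearson r = {compare_sensor_series(t_a, x_a, t_b, x_b):.3f}")
```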
Hydrocarbon production optimization in fields with different ownership and commercial interests
A main field and its satellite fields consist of several separate reservoirs with a gas cap and/or an oil rim. A processing facility on the main field receives and processes the oil, gas and water from all the reservoirs. This facility is typically capable of processing only a limited amount of oil, gas and water per unit of time, and in order to satisfy these processing limitations the production needs to be choked. The available capacity is shared among several field owners with different commercial interests. In this paper we focus on how the total oil and gas production from all the fields can be optimized. The satellite field owners negotiate processing capacities on the main field facility, which introduces additional processing capacity constraints (booking constraints) for the owners of the main field. If the total wealth created by all owners represents the economic interests of the community, it is of interest to investigate whether the total wealth may be increased by lifting the booking constraints. If all reservoirs can be produced more optimally when the booking constraints are removed, all owners may benefit from this, provided appropriate commercial arrangements are in place. We compare two production strategies. The first strategy optimizes locally at distinct time intervals: at each interval the production is prioritized so that the maximum amount of oil is produced. In the second strategy a fixed weight is assigned to each reservoir, and the reservoirs with the highest weights receive the highest priority.
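As a minimal sketch of the first strategy (local optimization within a single time interval), the following assumes each reservoir produces gas and water in fixed proportion to its oil rate and casts the choking problem as a linear program. All reservoir figures, facility capacities and the linear production model are illustrative assumptions, not data or constraints from the paper.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative per-reservoir data (made up): maximum oil rate and the gas and
# water produced per unit of oil (gas-oil ratio and water cut).
max_oil = np.array([120.0, 80.0, 60.0, 100.0])    # oil capacity per reservoir
gor = np.array([0.8, 1.5, 2.0, 1.1])              # gas per unit oil
wc = np.array([0.3, 0.6, 0.2, 0.5])               # water per unit oil

# Facility processing limits for oil, gas and water in one time interval.
oil_cap, gas_cap, water_cap = 250.0, 300.0, 110.0

# First strategy, one time interval: choose oil rates q to maximize total oil,
# subject to the facility limits (linprog minimizes, hence the minus sign).
c = -np.ones_like(max_oil)
A_ub = np.vstack([np.ones_like(max_oil),    # sum q_i        <= oil_cap
                  gor,                      # sum gor_i * q_i <= gas_cap
                  wc])                      # sum wc_i * q_i  <= water_cap
b_ub = np.array([oil_cap, gas_cap, water_cap])
bounds = [(0.0, m) for m in max_oil]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
print("oil rates per reservoir:", np.round(res.x, 1))
print("total oil produced:", round(-res.fun, 1))
```

The booking constraints discussed in the paper would enter such a formulation as additional per-owner capacity rows in `A_ub`; the second strategy would instead weight the objective coefficients per reservoir rather than re-optimizing at each interval.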