The difficulty of prime factorization is a consequence of the positional numeral system
The importance of the prime factorization problem is well known
(e.g., many security protocols rely on the presumed impossibility of a fast factorization
of integers on traditional computers). The task is, given a number k,
to find two primes a and b such that k = a · b. Usually, k is written in a positional
numeral system. However, there exists a variety of numeral systems
that can be used to represent numbers. Is it true that prime factorization is
difficult in any numeral system? In this paper, a numeral system with partial
carrying is described. It is shown that this system contains numerals allowing
one to reduce the problem of prime factorization to solving [K/2] − 1
systems of equations, where K is the number of digits in k (the concept of
digit in this system is more complex than the traditional one) and [u] is the
integer part of u. Thus, it is shown that the difficulty of prime factorization is
not in the problem itself but in the fact that the positional numeral system is
used traditionally to represent numbers participating in the prime factorization.
Obviously, this does not mean that P=NP, since it is not known whether
a number given in the traditional positional numeral system can be rewritten
into the new one in polynomial time.
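For context, the standard formulation of the task in ordinary positional notation can be sketched with naive trial division; this is an illustrative baseline only and does not implement the paper's partial-carrying numeral system:

```python
def factor_semiprime(k):
    """Given k = a * b for primes a <= b, find (a, b) by trial division.

    The running time grows exponentially in the number of positional
    digits of k, which is the hardness that security protocols rely on.
    """
    d = 2
    while d * d <= k:
        if k % d == 0:
            return d, k // d
        d += 1
    raise ValueError("k has no nontrivial factorization")

a, b = factor_semiprime(15)  # -> (3, 5)
```

The loop runs up to sqrt(k) candidate divisors, i.e. roughly 10^(K/2) steps for a K-digit k, which is why merely reading k in a different representation could matter.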
The exact (up to infinitesimals) infinite perimeter of the Koch snowflake and its finite area
The Koch snowflake is one of the first fractals that were mathematically
described. It is interesting because it has an infinite perimeter in the limit
but its limit area is finite. In this paper, a recently proposed computational
methodology allowing one to execute numerical computations with infinities
and infinitesimals is applied to study the Koch snowflake at infinity. Numerical
computations with actual infinite and infinitesimal numbers can be
executed on the Infinity Computer, a new supercomputer patented in the
USA and the EU. It is shown in the paper that at infinity the snowflake is not
unique, i.e., different snowflakes can be distinguished for different infinite
numbers of steps executed during the process of their generation. It is then
shown that for any given infinite number n of steps it becomes possible to
calculate the exact infinite number, Nn, of sides of the snowflake, the exact
infinitesimal length, Ln, of each side and the exact infinite perimeter, Pn,
of the Koch snowflake as the result of multiplication of the infinite Nn by
the infinitesimal Ln. It is established that for different infinite n and k the
infinite perimeters Pn and Pk are also different and the difference can be infinite.
It is shown that the finite areas An and Ak of the snowflakes can be
also calculated exactly (up to infinitesimals) for different infinite n and k and
the difference An − Ak turns out to be infinitesimal. Finally, snowflakes
constructed from different initial conditions are also studied, and their
quantitative characteristics at infinity are computed.
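For finite n, the quantities Nn, Ln, Pn, and An that the abstract extends to infinite n have standard closed forms; a minimal sketch for finite n on an ordinary computer (the function name is illustrative):

```python
import math

def koch_counts(n, side=1.0):
    """Koch snowflake after n finite construction steps, starting from
    an equilateral triangle with the given side length.

    Returns (N_n, L_n, P_n, A_n): number of sides, length of each side,
    perimeter, and enclosed area.
    """
    N = 3 * 4 ** n                # each step replaces 1 side with 4
    L = side / 3 ** n             # each step shrinks sides by a factor 3
    P = N * L                     # = 3 * side * (4/3)**n, grows without bound
    A0 = math.sqrt(3) / 4 * side ** 2
    A = A0 * (1 + (3 / 5) * (1 - (4 / 9) ** n))  # converges to (8/5) * A0
    return N, L, P, A
```

The product P = N * L makes the abstract's point concrete: N grows as 4^n while L shrinks only as 3^(-n), so the perimeter diverges even though the area stays bounded by (8/5) times the initial triangle's area.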
Numerical infinities and infinitesimals: Methodology, applications, and repercussions on two Hilbert problems
In this survey, a recent computational methodology paying special attention to the separation
of mathematical objects from numeral systems involved in their representation is described.
It has been introduced with the intention to allow one to work with infinities and infinitesimals
numerically in a unique computational framework in all the situations requiring these notions. The
methodology does not contradict Cantor’s and non-standard analysis views and is based on
Euclid’s Common Notion no. 5, “The whole is greater than the part”, applied to all quantities (finite,
infinite, and infinitesimal) and to all sets and processes (finite and infinite). The methodology uses a
computational device called the Infinity Computer (patented in USA and EU) working numerically
(recall that traditional theories work with infinities and infinitesimals only symbolically) with infinite
and infinitesimal numbers that can be written in a positional numeral system with an infinite radix.
It is argued that numeral systems involved in computations limit our capabilities to compute and lead
to ambiguities in theoretical assertions, as well. The introduced methodology gives the possibility
to use the same numeral system for measuring infinite sets, working with divergent series, probability,
fractals, optimization problems, numerical differentiation, ODEs, etc. (recall that traditionally
different numerals, such as the lemniscate ∞ and Aleph zero ℵ₀, are used in different situations
related to infinity). Numerous numerical examples and theoretical illustrations are given. The
accuracy of the achieved results is continuously compared with that obtained by traditional tools
used to work with infinities and infinitesimals. In particular, it is shown that the new approach
allows one to observe mathematical
objects involved in the Continuum Hypothesis and the Riemann zeta function with higher
accuracy than traditional tools allow. It is stressed that the hardness of both problems is not
related to their nature but is a consequence of the weakness of traditional numeral systems used to
study them. It is shown that the introduced methodology and numeral system change our perception
of the mathematical objects studied in the two problems.
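The positional numeral system with an infinite radix mentioned above records numbers as finite sums of powers of the infinite base (often written ① in this literature) with finite coefficients. A minimal sketch of arithmetic on such records, assuming a dict-of-exponents representation; the class and its methods are illustrative, not the Infinity Computer's actual interface:

```python
class GrossNumber:
    """A number written positionally in an infinite radix: a finite sum
    of terms c * base**p, stored as {exponent p: coefficient c}.

    Exponent p = 0 gives the finite part, p > 0 infinite parts,
    p < 0 infinitesimal parts.
    """

    def __init__(self, terms):
        # drop zero coefficients so records are canonical
        self.terms = {p: c for p, c in terms.items() if c != 0}

    def __add__(self, other):
        out = dict(self.terms)
        for p, c in other.terms.items():
            out[p] = out.get(p, 0) + c
        return GrossNumber(out)

    def __mul__(self, other):
        out = {}
        for p1, c1 in self.terms.items():
            for p2, c2 in other.terms.items():
                out[p1 + p2] = out.get(p1 + p2, 0) + c1 * c2
        return GrossNumber(out)

    def __eq__(self, other):
        return self.terms == other.terms

# the infinite unit and its infinitesimal reciprocal
ONE_INF = GrossNumber({1: 1})    # base**1
ONE_EPS = GrossNumber({-1: 1})   # base**-1

# base * base**-1 = 1: infinite and infinitesimal parts cancel exactly,
# so computations stay numerical rather than symbolic
assert ONE_INF * ONE_EPS == GrossNumber({0: 1})
```

This mirrors how a positional system with a finite radix works, digit position by digit position, which is why the same record format can hold finite, infinite, and infinitesimal parts of one number.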
A Study on the indications to the use of Base Ten Blocks and Green Chips in Mathematics textbooks in Brazil
This paper describes research aimed at problematizing cases in which the use of Base Ten Blocks (BTB) and Green Chips (GC) is recommended for teaching and learning the Arabic numeral system and arithmetic operations. The cases analyzed appear in the five collections of elementary school mathematics textbooks selected by the National Textbook Program (PNLD) that were most purchased by the Ministry of Education (MEC) in 2016 to supply the public education system in Brazil during the 2016-2018 triennium. The identified recommendations were classified into three types, two of which were particularly problematized for presenting some type of limitation or confusion. Only one of the five collections studied did not present problems regarding manipulatives.
Token-based typology and word order entropy: A study based on universal dependencies
The present paper discusses the benefits and challenges of token-based typology, which takes into account the frequencies of words and constructions in language use. This approach makes it possible to introduce new criteria for language classification, which would be difficult or impossible to achieve with the traditional, type-based approach. This point is illustrated by several quantitative studies of word order variation, which can be measured as entropy at different levels of granularity. I argue that this variation can be explained by general functional mechanisms and pressures, which manifest themselves in language use, such as optimization of processing (including avoidance of ambiguity) and grammaticalization of predictable units occurring in chunks. The case studies are based on multilingual corpora, which have been parsed using the Universal Dependencies annotation scheme.
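Word order entropy of the kind used in such studies can be computed from the relative frequencies of competing orders; a minimal sketch, with invented toy counts rather than data from the Universal Dependencies corpora:

```python
import math
from collections import Counter

def order_entropy(orders):
    """Shannon entropy (in bits) of a sample of word-order outcomes,
    e.g. 'AN' vs 'NA' for adjective-noun pairs.

    0 bits means a completely rigid order; 1 bit is the maximum
    variation for a binary choice (both orders equally frequent).
    """
    counts = Counter(orders)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

# invented sample: a language with a strongly preferred adjective-noun order
sample = ["AN"] * 90 + ["NA"] * 10
h = order_entropy(sample)  # about 0.469 bits: fairly rigid
```

Computing this per dependency relation (and per language) is one way the token-based approach yields a graded classification where a type-based approach would force a single categorical word-order label.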