Abstract: This paper takes a philosophical view as its axiom and
reveals the relationship between information theory, natural
intelligence, and artificial intelligence under real-world conditions.
It also derives the relationship between natural intelligence and
nature. According to the communication principle of information
theory, natural intelligence can be divided into a real part and a
virtual part. Based on the information-theoretic principle that
information does not increase, a restriction mechanism for the
creativity of natural intelligence is derived. This restriction
mechanism reveals the limits of natural intelligence and artificial
intelligence. The paper provides a new angle from which to observe
natural intelligence and artificial intelligence.
Abstract: In this short paper, new properties of transition matrices are introduced. Eigenvalues of small-order transition matrices are calculated by a flexible method. Applications of these properties are then studied in the solution of Markov chains via the steady-state vector, and in information theory via channel entropy. The implemented test examples show promise for practical use.
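The connection this abstract draws between eigenvalues, the steady-state vector, and transition matrices can be sketched with a standard computation (using a hypothetical 3-state chain, not the paper's examples): the steady-state vector is the left eigenvector of the transition matrix for eigenvalue 1.

```python
import numpy as np

# Hypothetical 3-state transition matrix (each row sums to 1);
# illustrative only, not taken from the paper.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.2, 0.6, 0.2],
    [0.1, 0.3, 0.6],
])

# The steady-state vector pi satisfies pi P = pi, i.e. it is the left
# eigenvector of P for eigenvalue 1 (which every stochastic matrix has).
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))   # pick the eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                        # normalise to a probability vector

assert np.allclose(pi @ P, pi)            # pi is invariant under P
print(pi)
```

Normalising by the sum also fixes the arbitrary sign that `eig` may return on the eigenvector.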
Abstract: Logarithms reduce products to sums and powers to products;
they play an important role in signal processing, communication, and
information theory. They are primarily used for hardware calculations,
handling multiplications, divisions, powers, and roots efficiently.
There are three commonly used bases for logarithms: base 10 gives the
common logarithm, base e the natural logarithm, and base 2 the binary
logarithm. This paper demonstrates different methods of calculating
log2, shows the complexity of each, and identifies the most accurate
and efficient, while also giving insights into their hardware design.
We present a new method called Floor Shift for fast calculation of
log2, and then combine this algorithm with a Taylor series to improve
the accuracy of the output, which we illustrate with two examples. We
finally compare the algorithms and conclude with our remarks.
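Assuming "Floor Shift" refers to extracting the integer part of log2 from the binary exponent (a hardware shift count), its combination with a Taylor series might be sketched as follows; `math.frexp` stands in for the shift hardware, and the refinement uses the series ln(1+f) = f - f^2/2 + f^3/3 - .... This is an illustrative reconstruction, not the paper's exact algorithm.

```python
import math

def log2_approx(x: float, terms: int = 12) -> float:
    """Approximate log2(x) for x > 0.

    Integer part: the binary exponent of x (the assumed 'floor shift' step).
    Fractional part: Taylor series for ln(1+f) on the mantissa.
    """
    m, k = math.frexp(x)        # x = m * 2**k with m in [0.5, 1)
    m, k = 2.0 * m, k - 1       # renormalise so m is in [1, 2)
    f = m - 1.0                 # f in [0, 1)
    s, term = 0.0, f
    for n in range(1, terms + 1):
        s += term / n           # adds (-1)**(n+1) * f**n / n
        term *= -f
    return k + s / math.log(2)  # log2(x) = k + ln(1+f) / ln(2)
```

Powers of two hit the shift path exactly (f = 0), while other inputs rely on the series, whose convergence slows as the mantissa approaches 2.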
Abstract: We introduce a novel approach, based on techniques from
information theory, to measuring how humans learn, and apply it to the
game of Go. We show that the total amount of information observable in
human strategies, called the strategic information, remains constant
across populations of players of differing skill levels for
well-studied patterns of play. This holds despite the very large
amount of knowledge required to progress from the recreational players
at one end of our spectrum to the very best and most experienced
players in the world at the other, and it contrasts with the idea that
having more knowledge might imply more 'certainty' about which move to
play next. We show this is true for patterns ranging from very local
to medium-sized board regions, across a variety of different moves,
using 80,000 game records. Consequences for theoretical and practical
AI are outlined.
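One concrete reading of "strategic information", under the assumption that it is the Shannon entropy of the empirical next-move distribution observed for a given board pattern (the paper's exact estimator may differ), is:

```python
from collections import Counter
import math

def strategic_information(moves):
    """Shannon entropy (in bits) of the empirical next-move distribution
    recorded for one board pattern across many game records.

    Illustrative reading of the abstract's 'strategic information';
    the paper's precise definition is not reproduced here.
    """
    counts = Counter(moves)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

For example, a pattern answered by two moves in equal proportion carries 1 bit; a pattern with a single forced answer carries 0 bits, so lower entropy would correspond to more "certainty" about the next move.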
Abstract: The objective of the present communication is to
develop new genuine exponentiated mean codeword lengths and to
study in depth the problem of correspondence between well-known
measures of entropy and mean codeword lengths. With the help of
some standard measures of entropy, we illustrate such a
correspondence. In the literature, we often come across
inequalities that are frequently used in information theory.
Keeping this idea in mind, we develop such inequalities via a
coding-theoretic approach.
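As background for the entropy/mean-codeword-length correspondence (the paper's exponentiated lengths are not reproduced here), the classical Shannon lengths l_i = ceil(-log2 p_i) satisfy the Kraft inequality, and the resulting mean length L obeys H(p) <= L < H(p) + 1:

```python
import math

def shannon_code_lengths(p):
    """Shannon code lengths l_i = ceil(-log2 p_i) for a distribution p."""
    return [math.ceil(-math.log2(pi)) for pi in p]

# Hypothetical dyadic source distribution, chosen for illustration.
p = [0.5, 0.25, 0.125, 0.125]
lengths = shannon_code_lengths(p)

H = -sum(pi * math.log2(pi) for pi in p)            # Shannon entropy (bits)
L = sum(pi * li for pi, li in zip(p, lengths))      # mean codeword length
kraft = sum(2.0 ** -li for li in lengths)           # Kraft sum

assert kraft <= 1.0 + 1e-12        # Kraft inequality: a prefix code exists
assert H - 1e-12 <= L < H + 1      # noiseless coding bound: H <= L < H + 1
```

For a dyadic distribution such as this one, L equals H exactly, which is the tight case of the correspondence.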