Abstract: A property is called persistent if, for any many-sorted term rewriting system, the system has the property if and only if the term rewriting system that results from it by omitting its sort information has the property. In this paper, we show that termination is persistent for non-overlapping term rewriting systems, and we give an example as an application of this result. Furthermore, we obtain that completeness is persistent for non-overlapping term rewriting systems.
Abstract: Fault tree analysis is a well-known method for reliability and safety assessment of engineering systems. In the last three decades, a number of methods have been introduced in the literature for the automatic construction of fault trees. The main difference between these methods is the starting model from which the tree is constructed. This paper presents a new methodology for the construction of static and dynamic fault trees from a system Simulink model. The method is introduced and explained in detail, and its correctness and completeness are experimentally validated using an example taken from the literature. Advantages of the method are also discussed.
Abstract: The double difference sequence space I2(M, p) of fuzzy numbers is introduced for both 1 < p < ∞ and 0 < p < 1. Some general properties of this sequence space are studied, and some inclusion relations involving it are obtained.
Abstract: The point quadtree is one of the most common data organizations for spatial data and can be used to increase the efficiency of searching for point features. Since the efficiency of the search depends on the height of the tree, arbitrary insertion of point features may leave the tree unbalanced and lead to longer search times. This paper designs an algorithm that builds a nearly balanced quadtree. A point pattern analysis technique is applied for this purpose, which yields a significant enhancement of performance; the results are also included in the paper for the sake of completeness.
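As a reference point, a plain point quadtree (a generic textbook structure, not the paper's balancing algorithm) can be sketched as follows; the two insertion orders below are hypothetical and only illustrate how insertion order drives tree height:

```python
# Point quadtree: each node splits the plane into four quadrants at its
# own coordinates, so insertion order controls the height of the tree.
class Node:
    def __init__(self, x, y):
        self.x, self.y = x, y
        self.children = [None] * 4  # NE, NW, SW, SE

    def quadrant(self, x, y):
        if x >= self.x:
            return 0 if y >= self.y else 3  # NE or SE
        return 1 if y >= self.y else 2      # NW or SW

def insert(root, x, y):
    if root is None:
        return Node(x, y)
    q = root.quadrant(x, y)
    root.children[q] = insert(root.children[q], x, y)
    return root

def height(node):
    if node is None:
        return 0
    return 1 + max(height(c) for c in node.children)

# Sorted (diagonal) insertion degenerates the tree into a chain ...
skewed = None
for x, y in [(i, i) for i in range(8)]:
    skewed = insert(skewed, x, y)

# ... while median-first insertion keeps it shallow.
balanced = None
for x, y in [(4, 4), (2, 2), (6, 6), (1, 1), (3, 3), (5, 5), (7, 7)]:
    balanced = insert(balanced, x, y)
```

With the skewed order every point lands in the NE child of the previous one, so eight points produce a tree of height eight, whereas the median-first order yields height three.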
Abstract: Dynamic location referencing is an important technology for shielding map differences. The method references objects of the road network using a condensed selection of their real-world geographic properties stored in a digital map database, which overcomes the deficiencies of pre-coded location referencing methods. High attribute-completeness requirements and complicated reference-point selection algorithms are the main problems in recent research. Therefore, a dynamic location referencing algorithm is proposed that combines intersection points compulsorily selected at the extremities with road link points selected according to a link partition principle. An experimental system based on this theory was implemented. Tests using the Beijing digital map database showed satisfactory results and thus verified the feasibility and practicability of the method.
Abstract: Mounds are one of the most valuable sources of information on various aspects of the life, household skills, rituals and beliefs of the ancient peoples of Kazakhstan. Moreover, the objects associated with the cult of burying the dead are the most informative, and often the only, source of knowledge about past eras. The present study is devoted to some results of the excavations carried out on the "Baygetobe" mound of the Shilikti burial ground. The purpose of the work is to examine certain categories of grave goods and to read the "fine text" of Shilikti graves, whose structure is the same for the burials of nobles and for ordinary graves. The preservation of the royal burial mounds and the integrity and completeness of the source are of particular value for study.
Abstract: Model mapping and transformation are important processes in high level system abstractions, and form the cornerstone of model-driven architecture (MDA) techniques. Considerable research in this field has devoted attention to static system abstraction, despite the fact that most systems are dynamic with high frequency changes in behavior. In this paper we provide an overview of work that has been done with regard to behavior model mapping and transformation, based on: (1) the completeness of the platform independent model (PIM); (2) semantics of behavioral models; (3) languages supporting behavior model transformation processes; and (4) an evaluation of model composition to effect the best approach to describing large systems with high complexity.
Abstract: An iterated function system (IFS) is a scheme for describing and manipulating complex fractal attractors using simple mathematical models. More precisely, the most popular "fractal-based" algorithms for both the representation and the compression of computer images involve some implementation of the method of iterated function systems on complete metric spaces. In this paper a new generalized space, called the multi-fuzzy fractal space, is constructed. A distance function is defined on this space, and its completeness is proved. The completeness of this space ensures the existence of a fixed-point theorem for the family of continuous mappings. This theorem is the fundamental result on which IFS methods are based and from which fractals are built. The defined mappings are proved to satisfy some generalizations of the contraction condition.
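The fixed-point mechanism this abstract relies on can be illustrated with a minimal sketch of the Banach fixed-point iteration; the contraction below is an arbitrary map on the real line chosen for illustration, not one of the paper's fuzzy fractal mappings:

```python
# Banach fixed-point iteration: on a complete metric space, a contraction
# has a unique fixed point, reached by iterating the map from any start.
def iterate_to_fixed_point(f, x0, tol=1e-12, max_iter=10_000):
    x = x0
    for _ in range(max_iter):
        nxt = f(x)
        if abs(nxt - x) < tol:  # successive iterates have converged
            return nxt
        x = nxt
    return x

# f(x) = 0.5*x + 1 is a contraction with ratio 1/2; its fixed point is 2,
# and the iteration reaches it from any starting point.
fp = iterate_to_fixed_point(lambda x: 0.5 * x + 1.0, x0=100.0)
```

IFS methods apply the same theorem to a set of contractions acting on a space of compact sets, whose joint fixed point is the fractal attractor.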
Abstract: In this work a dual laser triangulation system is presented for the fast building of 2.5D textured models of objects within a production line. This scanner is designed to produce data suitable for 3D completeness inspection algorithms. For this purpose, two laser projectors are used in order to considerably reduce the problem of occlusions in the camera movement direction. Results of the reconstruction of electronic boards are presented, together with a comparison with a commercial system.
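As a rough illustration of the underlying geometry (a generic single-point triangulation sketch, not the paper's calibrated dual-projector model):

```python
import math

def depth_from_triangulation(baseline, theta_laser, theta_camera):
    """Depth of the point where a laser ray and a camera ray intersect,
    given the baseline between the two devices and each ray's angle
    measured from that baseline (law of sines in the triangle)."""
    gamma = math.pi - theta_laser - theta_camera   # angle at the target point
    r = baseline * math.sin(theta_laser) / math.sin(gamma)
    return r * math.sin(theta_camera)              # perpendicular distance to baseline

# Symmetric 45-degree rays over a baseline of 2 intersect at height 1.
d = depth_from_triangulation(2.0, math.pi / 4, math.pi / 4)
```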
Abstract: In this work we present, to the best of our knowledge for the first time, an efficient digital watermarking scheme for MPEG Audio Layer 3 files that operates directly in the compressed-data domain while manipulating the time and subband/channel domains. In addition, it does not need the original signal to detect the watermark. Our scheme was implemented with special care for the efficient use of the two limited resources of computer systems: time and space. It offers the industrial user watermark embedding and detection in time comparable to the real playing time of the original audio file, which depends on the MPEG compression, while the end user/audience faces no artifacts or delays when hearing the watermarked audio file. Furthermore, it overcomes the vulnerability of algorithms operating in the PCM-data domain to compression/recompression attacks, as it places the watermark in the scale-factor domain rather than in the digitized audio data. The strength of our scheme, which allows it to be used successfully for both authentication and copyright protection, relies on the fact that users establish ownership of the audio file not simply by detecting the bit pattern that comprises the watermark itself, but by showing that the legal owner knows a hard-to-compute property of the watermark.
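As a loose illustration of blind scale-factor watermarking, here is a toy parity-based sketch; it is not the paper's scheme, and the scale-factor values below are made up:

```python
# Toy scale-factor watermark: each watermark bit is embedded as the
# parity of one quantized scale factor, and detection reads the
# parities back without needing the original (unmarked) signal.
def embed(scale_factors, bits):
    out = list(scale_factors)
    for i, bit in enumerate(bits):
        if out[i] % 2 != bit:   # adjust parity to encode the bit
            out[i] += 1
    return out

def detect(scale_factors, n_bits):
    return [scale_factors[i] % 2 for i in range(n_bits)]

sf = [40, 37, 52, 41, 48, 33]        # hypothetical quantized scale factors
marked = embed(sf, [1, 0, 1, 1])
recovered = detect(marked, 4)
```

The point mirrored from the abstract is that detection is blind: `detect` needs only the marked scale factors, not the original ones.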
Abstract: A generalization of the concept of Feistel networks (FN), known as the extended Feistel network (EFN), is examined. An EFN splits the input block into n > 2 sub-blocks. Like a conventional FN, an EFN consists of a series of rounds in which at least one sub-block is subjected to an F-function. The function plays a key role in the diffusion process due to its completeness property. It is also important to note that in an EFN the F-function is the most computationally expensive operation in a round. The aim of this paper is to determine a suitable type of EFN for a scalable cipher. This is done by analyzing the threshold number of rounds needed for different types of EFN to achieve the completeness property, as well as the number of F-functions required in the network. The work focuses on EFN Type-I, Type-II and Type-III only. The analysis finds that EFN Type-II and Type-III diffuse at the same rate, and both are faster than Type-I. Since Type-II uses fewer F-functions than Type-III, Type-II is the most suitable EFN for use in a scalable cipher.
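One round of a Type-II EFN can be sketched as follows; the round function F below is a toy stand-in for a real cryptographic F-function:

```python
def F(x, key):
    # Toy round function (illustrative only): mix the sub-block with the key.
    return ((x * 31 + key) ^ (x >> 3)) & 0xFF

def efn_type2_round(blocks, key):
    """One Type-II round on n sub-blocks: F is applied to every
    even-indexed sub-block, its output is XORed into the following
    sub-block, then the whole state is rotated left by one sub-block."""
    out = list(blocks)
    for i in range(0, len(out) - 1, 2):
        out[i + 1] ^= F(out[i], key)
    return out[1:] + out[:1]  # cyclic left shift

state = [0x12, 0x34, 0x56, 0x78]
for r in range(4):
    state = efn_type2_round(state, key=r + 1)
```

With four sub-blocks, half of them pass through F each round, which is why Type-II needs fewer F-function evaluations per round than Type-III (where every sub-block but one is processed).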
Abstract: In the Top Right Access point Minimum Length Corridor (TRA-MLC) problem [1], a rectangular boundary partitioned into rectilinear polygons is given, and the problem is to find a corridor of least total length that includes the top-right corner of the outer rectangular boundary. A corridor is a tree containing a set of line segments lying along the outer rectangular boundary and/or on the boundaries of the rectilinear polygons, and it must contain at least one point from the boundary of the outer rectangle and of each rectilinear polygon. Gutierrez and Gonzalez [1] proved that the MLC problem, along with some of its restricted versions and variants, is NP-complete. In this paper, we give a shorter proof of the NP-completeness of TRA-MLC by constructing the reduction in the following way.
Abstract: Since primary school trips usually start from home, the attention of many scholars has been focused on the home end for data gathering, and category analysis has often been relied upon when predicting school travel demand. In this paper, the school end was used for data gathering and multivariate regression for future travel demand prediction. A total of 9,859 pupils were surveyed by questionnaire at 21 primary schools. The study was carried out in Skudai Town, Malaysia, which was divided into five zones. Based on the hypothesis that the numbers of primary school trip ends are expected to be the same because school trips are fixed, the choice of trip end should have an inconsequential effect on the outcome. The study compared empirical data for home and school trip-end productions and attractions. The variance between the two data sets was insignificant, although some claims from the home-based family survey were found to be grossly exaggerated. Data from the school trip ends were relied on for travel demand prediction because of their completeness. Accessibility, trip attraction and trip production were then related to school trip rates under daylight and dry weather conditions. The paper concludes that accessibility is an important parameter when predicting demand for future school trip rates.
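The multivariate regression step can be sketched with a minimal least-squares fit via the normal equations; the zone data and the exact linear relation below are hypothetical, not the survey values:

```python
def fit_linear(X, y):
    """Least-squares fit of y ~ b0 + b1*x1 + b2*x2 + ... by solving the
    normal equations (X^T X) b = X^T y with Gaussian elimination."""
    rows = [[1.0] + list(x) for x in X]   # prepend intercept column
    n = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(n)]
    for col in range(n):                  # elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    coef = [0.0] * n                      # back substitution
    for i in reversed(range(n)):
        coef[i] = (b[i] - sum(A[i][j] * coef[j] for j in range(i + 1, n))) / A[i][i]
    return coef  # [intercept, b1, b2, ...]

# Hypothetical zone data: (accessibility index, trip production).
X = [(1.0, 10.0), (2.0, 12.0), (3.0, 9.0), (4.0, 15.0), (5.0, 11.0)]
y = [1.0 + 2.0 * a + 3.0 * p for a, p in X]   # exact linear relation
coef = fit_linear(X, y)
```

Because the synthetic responses follow the linear relation exactly, the fit recovers the coefficients 1, 2 and 3.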
Abstract: This paper presents a signal analysis process for improving energy completeness based on the Hilbert-Huang Transform (HHT). Firstly, the vibration signal of a DC motor, acquired with an accelerometer, is used as the model signal for the analysis. Secondly, the intrinsic mode functions (IMFs) and the Hilbert spectrum of the decomposed signal are obtained by applying the HHT. The signal reconstructed from the IMF components is compared with the original signal, and the process of energy loss is discussed. Finally, the differences between the Wavelet Transform (WT) and the HHT in analyzing the signal are compared. The simulation results reveal that the analysis process based on the HHT is advantageous for the enhancement of energy completeness.
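The energy-completeness metric can be illustrated on a synthetic signal; the "components" below are the known sinusoids of the constructed signal, not a real EMD/HHT decomposition:

```python
import math

# Synthetic two-tone signal standing in for the motor vibration.
N = 1000
t = [i / N for i in range(N)]
comp1 = [math.sin(2 * math.pi * 5 * x) for x in t]
comp2 = [0.5 * math.sin(2 * math.pi * 40 * x) for x in t]
signal = [a + b for a, b in zip(comp1, comp2)]

def energy(s):
    return sum(v * v for v in s)

def completeness(original, components):
    """Fraction of the original signal's energy recovered by the sum of
    the decomposed components (1.0 means no energy was lost)."""
    recon = [sum(vals) for vals in zip(*components)]
    residual = [o - r for o, r in zip(original, recon)]
    return 1.0 - energy(residual) / energy(original)

full = completeness(signal, [comp1, comp2])   # both components: nothing lost
partial = completeness(signal, [comp1])       # dropping comp2 loses its energy
```

Since the two tones are orthogonal over the window, dropping the half-amplitude tone loses exactly its own share of the energy, 125 of 625 units, so the partial completeness is 0.8.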
Abstract: Time series analysis often requires data that represent the evolution of an observed variable in equidistant time steps. In order to collect such data, sampling is applied. While continuous signals may be sampled, analyzed and reconstructed by applying Shannon's sampling theorem, time-discrete signals have to be dealt with differently. In this article we consider the discrete-event simulation (DES) of job-shop systems and study the effects of different sampling rates on data quality regarding the completeness and accuracy of reconstructed inventory evolutions. We discuss deterministic as well as non-deterministic behavior of system variables. Error curves are used to illustrate and discuss the sampling rate's impact and to derive recommendations for its well-founded choice.
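The sampling of a time-discrete (event-driven) variable can be sketched with a zero-order hold; the event times and inventory levels below are made up:

```python
def sample(events, dt, t_end):
    """Sample a discrete-event trace at equidistant steps dt.
    events: sorted list of (time, new_level) change points; the level
    stays constant between events (zero-order hold)."""
    samples, level, idx, t = [], 0, 0, 0.0
    while t <= t_end:
        while idx < len(events) and events[idx][0] <= t:
            level = events[idx][1]   # apply all changes up to time t
            idx += 1
        samples.append(level)
        t += dt
    return samples

# Hypothetical inventory trace with a short excursion to level 5.
events = [(0.0, 3), (1.2, 5), (1.8, 3), (2.7, 2)]
coarse = sample(events, dt=2.0, t_end=6.0)   # misses the excursion entirely
fine = sample(events, dt=0.5, t_end=6.0)     # captures it
```

This is exactly the completeness effect studied in the abstract: the coarse sampling rate never observes the short-lived level 5, so the reconstructed inventory evolution is incomplete.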
Abstract: Numerous concrete structure projects are currently running in Libya as part of US$50 billion in government funding. The quality of the concrete used in 20 different construction projects was assessed, based mainly on the concrete compressive strength achieved. The projects are scattered all over the country and are at various levels of completeness. For most of these projects, the concrete compressive strength was obtained from test results on a 150 mm standard cube mold. Statistical analysis of the collected concrete compressive strengths reveals that the data in general follow a normal distribution pattern. The study covers comparison and assessment of concrete quality aspects such as quality control, strength range, standard deviation, data scatter, and the ratio of minimum strength to design strength. Site quality control for these projects ranged from very good to poor according to the ACI 214 criteria [1]. The strength range (Rg = maximum strength − minimum strength) divided by the average strength runs from 34% to 160%. Data scatter, measured as the range (Rg) divided by the standard deviation (σ), is found to be 1.82 to 11.04, indicating that the range is ±3σ. International construction companies working in Libya follow different assessment criteria for concrete compressive strength in the absence of a national unified procedure. The study reveals that assessments of concrete quality conducted by these construction companies usually meet their adopted (internal) standards, but sometimes fail to meet internationally known standard requirements. The assessment of concrete presented in this paper is based on ACI and British standards and on proposed Libyan concrete strength assessment criteria.
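The quality metrics named above can be sketched as follows; the cube strengths are illustrative values, not project data:

```python
import statistics

# Hypothetical 150 mm cube compressive strengths (MPa) and design strength.
strengths = [28.0, 31.5, 25.0, 34.0, 30.0, 27.5, 33.0, 29.0]
design_strength = 25.0

mean = statistics.mean(strengths)
sd = statistics.stdev(strengths)               # sample standard deviation
rg = max(strengths) - min(strengths)           # strength range Rg
range_ratio = rg / mean                        # Rg / average strength
scatter = rg / sd                              # Rg / standard deviation
min_to_design = min(strengths) / design_strength
```

For these illustrative values the mean is 29.75 MPa, the standard deviation 3.0 MPa, and the scatter Rg/σ equals 3.0, which would fall at the low (well-controlled) end of the 1.82 to 11.04 interval reported above.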
Abstract: Imperfect knowledge cannot always be avoided. Imperfections may take several forms: uncertainty, imprecision and incompleteness. Among the classes of methods for managing imperfect knowledge are fuzzy set-based techniques. The choice of a method to process data is linked to the choice of knowledge representation, which can be numerical, symbolic, logical or semantic, and it depends on the nature of the problem to be solved, for example decision support, which is addressed in our study. Fuzzy logic is used for its ability to manage imprecise knowledge, but it can take advantage of the ability of neural networks to learn coefficients or functions; such an association of methods is typical of so-called soft computing. In this study a new method was used for the management of imprecision in collected knowledge related to an economic analysis of the construction industry in Turkey. Sudden changes in economic factors decrease the competitive strength of construction companies, and a better evaluation of these changes from the viewpoint of the construction industry will positively influence the decisions of companies engaged in construction.