Abstract: Brassinosteroids (BRs) regulate cell elongation,
vascular differentiation, senescence, and stress responses. BRs signal
through the BES1/BZR1 family of transcription factors, which
regulate hundreds of target genes in this pathway. In this
study, a comprehensive genome-wide analysis of the BES1/BZR1
gene family was carried out in Arabidopsis thaliana, Cucumis sativus,
Vitis vinifera, Glycine max, and Brachypodium distachyon.
Sequence characteristics, dot plots, and hydropathy plots
were analyzed for the protein and genome sequences of the five plant
species. The longest protein sequence was Brdic3g, at 374 aa, and the
shortest was Gm7g, at 163 aa. The highest instability index, 79.99,
belonged to protein sequence AT1G19350, and the lowest, 33.22, to
Gm5g. The aliphatic index of these protein sequences ranged from
47.82 to 78.79 in Arabidopsis thaliana, 49.91 to 57.50 in Vitis
vinifera, 55.09 to 82.43 in Glycine max, 54.09 to 54.28 in
Brachypodium distachyon, and 55.36 to 56.83 in Cucumis sativus.
Overall, the data obtained from our investigation contribute to a
better understanding of the complexity of the BES1/BZR1 gene
family and provide the first step towards directing
future experimental designs to perform systematic analysis of the
functions of the BES1/BZR1 gene family.
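The per-sequence statistics reported above (amino acid length, instability index, aliphatic index) can be reproduced with standard tools; a minimal sketch using Biopython follows, with a placeholder sequence rather than one of the study's accessions. The aliphatic index is computed from Ikai's formula, since Biopython's ProtParam module does not expose it directly.

```python
# Sketch: computing the reported protein statistics with Biopython.
# The sequence below is a placeholder, not an accession from this study.
from Bio.SeqUtils.ProtParam import ProteinAnalysis

seq = "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQAPILSRVGDGTQDNLSGAEKAVQ"
pa = ProteinAnalysis(seq)

length = len(seq)                        # amino acid length
instability = pa.instability_index()     # values > 40 suggest instability

# Aliphatic index (Ikai 1980): relative volume occupied by aliphatic
# side chains; Biopython returns fractions, hence the factor of 100.
p = pa.get_amino_acids_percent()
aliphatic = 100.0 * (p["A"] + 2.9 * p["V"] + 3.9 * (p["I"] + p["L"]))

print(f"{length} aa, instability {instability:.2f}, aliphatic {aliphatic:.2f}")
```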
Abstract: The building of a factory can be a strategic investment
owing to its long service life. An evaluation that only focuses, for
example, on payments for the building, the technical equipment of
the factory, and the personnel for the enterprise is, considering the
complexity of the factory as a system, not sufficient for this long-term
view. The success of an investment is also secured, among other things,
by the attainment of nonmonetary goals such as transformability.
Such aspects are not considered in traditional investment calculations
like the net present value method. This paper closes this gap with the
enhanced economic evaluation (EWR) for factory planning. The
procedure and the first results of an application in a project are
presented.
Abstract: Biological sequences from different species are called orthologs if they evolved from a sequence of a common ancestor species and they have the same biological function. Approximations of the Kolmogorov complexity or entropy of biological sequences are well known to be useful in extracting similarity information between such sequences, for example in the interest of ortholog detection. As is well known, the exact Kolmogorov complexity is not algorithmically computable. In practice one approximates it by computable compression methods. However, such compression methods do not provide a good approximation to Kolmogorov complexity for short sequences. Herein a new approach is suggested to overcome the problem that compression approximations may not work well on short sequences. This approach is inspired by new, conditional computations of Kolmogorov entropy. A main contribution of the empirical work described here is to show that the new set of entropy-based machine learning attributes provides better separation between positive (ortholog) and negative (non-ortholog) data than good, previously known alternatives (which do not employ any means of handling short sequences well). Also empirically compared are the new entropy-based attribute set and a number of other, more standard similarity attribute sets commonly used in genomic analysis. The various similarity attributes are evaluated by cross validation, through boosted decision tree induction (C5.0), and by Receiver Operating Characteristic (ROC) analysis. The results point to the conclusion that the new, entropy-based attribute set by itself is not the one giving the best prediction; however, it is the best attribute set for improving the other, standard attribute sets when conjoined with them.
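As a concrete illustration of the compression-based approximation discussed above, the following sketch computes the normalized compression distance (NCD) with zlib. The toy sequences are placeholders, and this shows the generic compression approach rather than the paper's conditional-entropy attributes.

```python
# Sketch: normalized compression distance (NCD), the usual computable
# stand-in for Kolmogorov-complexity-based similarity. Toy data only;
# the paper's conditional-entropy attributes refine this basic idea.
import zlib

def c(data: bytes) -> int:
    """Compressed size as a proxy for Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

s1 = b"ATGGCGTGCAATTGA" * 10
s2 = b"ATGGCATGCAATTGA" * 10   # near-copy of s1
s3 = b"GGGTTTCCCAAA" * 12      # unrelated

print(ncd(s1, s2))  # small: similar sequences
print(ncd(s1, s3))  # larger: dissimilar sequences
```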
Abstract: Our adaptive multimodal system aims at correctly
presenting a mathematical expression to visually impaired users.
Given an interaction context (i.e. combination of user, environment
and system resources) as well as the complexity of the expression
itself and the user's preferences, the suitability scores of different
presentation formats are calculated. Unlike current state-of-the-art
solutions, our approach takes the user's situation into account and
does not impose a solution that is unsuitable to his context and
capacity. In this work, we present our methodology for calculating the
mathematical expression complexity and the results of our
experiment. Finally, this paper discusses the concepts and principles
applied in our system as well as their validation through case
studies. This work is our original contribution to ongoing research
to make informatics more accessible to handicapped users.
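To make the scoring idea concrete, here is a minimal sketch of how suitability scores for presentation formats might be combined from an interaction context. The factor names, weights, and complexity penalty are hypothetical illustrations, not the paper's actual model.

```python
# Sketch: one plausible suitability scoring over presentation formats.
# All factor names and weights below are hypothetical assumptions.
from dataclasses import dataclass

@dataclass
class Context:
    expr_complexity: float    # normalized 0..1, from the expression itself
    audio_available: float    # 0..1, environment/system resources
    braille_available: float  # 0..1
    pref_audio: float         # 0..1, user preferences
    pref_braille: float       # 0..1

def suitability(ctx: Context) -> dict:
    # Each format's score mixes resource availability, user preference,
    # and a penalty for rendering complex expressions as linear audio.
    return {
        "audio": ctx.audio_available * ctx.pref_audio
                 * (1.0 - 0.5 * ctx.expr_complexity),
        "braille": ctx.braille_available * ctx.pref_braille,
    }

scores = suitability(Context(0.8, 1.0, 1.0, 0.6, 0.9))
print(max(scores, key=scores.get))   # most suitable format for this context
```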
Abstract: The H.264/AVC video coding standard contains a number of advanced features. One of the new features introduced in this standard is multiple intra-mode prediction, which exploits directional spatial correlation with adjacent blocks for intra prediction. With this new feature, intra coding in H.264/AVC offers considerably higher coding efficiency than other compression standards, but computational complexity increases significantly when the brute-force rate-distortion optimization (RDO) algorithm is used. In this paper, we propose a new fast intra prediction mode decision method to reduce the complexity of H.264 video coding. For luma intra prediction, the proposed method consists of two steps. In the first step, we perform RDO for four modes of the intra 4x4 block; based on the distribution of the RDO costs of those modes and on the strong correlation between adjacent modes, we select the best mode of the intra 4x4 block. In the second step, based on the fact that the dominating direction of a smaller block is similar to that of a bigger block, the candidate modes of 8x8 blocks and 16x16 macroblocks are determined. For chroma intra prediction, since the variance of the chroma pixel values is much smaller than that of the luma ones, our proposed method uses only the DC mode. Experimental results show that the new fast intra mode decision algorithm increases the speed of intra coding significantly with negligible loss of PSNR.
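The two-step luma decision can be sketched as follows; rd_cost() stands in for the encoder's full rate-distortion evaluation, and the seed modes and mode-adjacency table are illustrative assumptions rather than the paper's exact tables.

```python
# Sketch of the two-step fast intra decision described above.
# H.264 defines nine intra 4x4 prediction modes (0..8).
import random
random.seed(0)

SEED_MODES = [0, 1, 2, 8]                    # step 1: RDO on a subset only
ADJACENT = {0: [5, 7], 1: [6, 8], 2: [0, 1], 8: [1, 3]}  # assumed table

def rd_cost(mode: int) -> float:
    """Placeholder for the full RDO cost of one 4x4 prediction mode."""
    return random.random()

# Step 1: evaluate the seed modes, then refine with directionally
# adjacent modes of the best seed (exploiting mode correlation).
costs = {m: rd_cost(m) for m in SEED_MODES}
best_seed = min(costs, key=costs.get)
for m in ADJACENT[best_seed]:
    costs.setdefault(m, rd_cost(m))
best_4x4 = min(costs, key=costs.get)

# Step 2: the dominating direction of the 4x4 blocks seeds the candidate
# modes of the enclosing 8x8 block and 16x16 macroblock.
candidates_16x16 = [best_4x4]
print(best_4x4, candidates_16x16)
```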
Abstract: This paper presents results related to an
interference reduction technique in a multistage multiuser detector for
an asynchronous DS-CDMA system. To meet the real-time
requirements of asynchronous multiuser detection, a bit-streaming,
cascaded architecture is used. Asynchronous multiuser detection
involves block-based computations and matrix inversions. The paper
covers iterative suboptimal schemes that have been studied to
decrease the computational complexity, eliminate the need for matrix
inversions, reduce the execution time and the memory requirements,
and use a joint estimation and detection process that gives better
performance than the independent parameter estimation method. The
iteration stages are cascaded, and bits are processed in a streaming
fashion. The simulation has been carried out for an asynchronous
DS-CDMA system by varying one parameter, the number of users.
The simulation results show that the system achieves its optimum bit
error rate (BER) at the third stage for 15 users.
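The flavor of such an inversion-free iterative scheme can be sketched with a matched-filter stage followed by parallel interference cancellation (PIC) stages; a synchronous toy model is used for brevity, and the sizes, codes, and noise level are illustrative assumptions.

```python
# Sketch: matched filters plus cascaded parallel interference
# cancellation (PIC) stages, an iterative scheme with no matrix
# inversion. Synchronous toy model; all parameters are assumptions.
import numpy as np

rng = np.random.default_rng(0)
K, N = 4, 31                                       # users, spreading length
S = rng.choice([-1.0, 1.0], (N, K)) / np.sqrt(N)   # signature waveforms
b = rng.choice([-1.0, 1.0], K)                     # transmitted bits
y = S @ b + 0.1 * rng.standard_normal(N)           # received chip vector

b_hat = np.sign(S.T @ y)                           # stage 0: matched filters
for _ in range(3):                                 # cascaded PIC stages
    # Column k of `mai`: interference seen by user k from all other users.
    mai = (S @ b_hat)[:, None] - S * b_hat
    b_hat = np.sign(np.einsum("nk,nk->k", S, y[:, None] - mai))
print("detected:", b_hat, "sent:", b)
```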
Abstract: In this paper we propose a Particle Swarm heuristic
optimized Multi-Antenna (MA) system. Efficient MA system
detection is performed using a robust stochastic evolutionary
computation algorithm based on the movement and intelligence of
swarms. This iterative particle-swarm-optimized (PSO) detector
significantly reduces the computational complexity of the conventional
Maximum Likelihood (ML) detection technique. The simulation
results achieved with the proposed MA-PSO detection algorithm
show near-optimal performance when compared with the ML-MA
receiver. The performance of the proposed detector is convincingly
better for higher-order modulation schemes and large numbers of
antennas, where the conventional ML detector becomes impractical.
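A minimal sketch of the idea: a PSO search minimizing the ML metric ||y - Hs||^2 over BPSK symbol vectors. The swarm size and PSO constants are illustrative assumptions; a practical MA detector would extend this to higher-order constellations.

```python
# Sketch: PSO minimizing the ML metric over BPSK symbol vectors.
# Swarm size, inertia and acceleration constants are toy assumptions.
import numpy as np

rng = np.random.default_rng(1)
Nt = 4                                   # transmit antennas
H = rng.standard_normal((Nt, Nt))        # toy channel matrix
s_true = rng.choice([-1.0, 1.0], Nt)
y = H @ s_true + 0.05 * rng.standard_normal(Nt)

def cost(s):                             # ML metric after slicing to BPSK
    return np.sum((y - H @ np.sign(s)) ** 2)

P = 20                                   # particles
x = rng.uniform(-1, 1, (P, Nt))
v = np.zeros((P, Nt))
pbest = x.copy()
pcost = np.array([cost(p) for p in x])
g = pbest[pcost.argmin()].copy()         # global best position
for _ in range(50):
    r1, r2 = rng.random((P, Nt)), rng.random((P, Nt))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
    x = x + v
    c = np.array([cost(p) for p in x])
    better = c < pcost
    pbest[better], pcost[better] = x[better], c[better]
    g = pbest[pcost.argmin()].copy()
print("detected:", np.sign(g), "sent:", s_true)
```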
Abstract: In this paper a new cost function for blind equalization
is proposed. The proposed cost function, referred to as the modified
maximum normalized cumulant criterion (MMNC), is an extension
of the previously proposed maximum normalized cumulant criterion
(MNC). While the MNC requires a separate phase recovery system
after blind equalization, the MMNC performs joint blind equalization
and phase recovery. To achieve this, the proposed algorithm
maximizes a cost function that considers both amplitude and phase of
the equalizer output. The simulation results show that the proposed
algorithm achieves better channel equalization than the MNC
algorithm and can simultaneously correct the phase error, which the
MNC algorithm is unable to do. The simulation results also show that
the MMNC algorithm has lower complexity than the MNC algorithm.
Moreover, the MMNC algorithm outperforms the MNC algorithm,
particularly when the symbol block size is small.
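For orientation, the following sketch evaluates an MNC-style objective, the normalized fourth-order cumulant (kurtosis) of the equalizer output, for a candidate tap vector. The channel, taps, and QPSK source are toy assumptions, and the MMNC's additional phase-aware term is not reproduced here.

```python
# Sketch: evaluating a normalized-cumulant blind equalization criterion,
# |cum4(y)| / (E|y|^2)^2, for one candidate equalizer. Toy data only;
# the paper's MMNC adds a phase term not reproduced here.
import numpy as np

rng = np.random.default_rng(2)
sym = (rng.choice([-1, 1], 2000) + 1j * rng.choice([-1, 1], 2000)) / np.sqrt(2)
h = np.array([1.0, 0.4 + 0.3j, 0.2])        # toy channel
x = np.convolve(sym, h, mode="same")        # received signal

def mnc_cost(w):
    """Normalized fourth-order cumulant of the equalizer output."""
    y = np.convolve(x, w, mode="same")
    m2 = np.mean(np.abs(y) ** 2)
    cum4 = np.mean(np.abs(y) ** 4) - 2 * m2**2 - np.abs(np.mean(y**2)) ** 2
    return np.abs(cum4) / m2**2

w = np.array([0.0, 1.0, 0.0, 0.0, 0.0])     # center-spike initialization
print(mnc_cost(w))                           # criterion to be maximized
```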
Abstract: Automated rule discovery is, due to its applicability, one of the most fundamental and important methods in KDD, and it has been an active research area in the recent past. Hierarchical representation allows us to easily manage the complexity of knowledge, to view the knowledge at different levels of detail, and to focus our attention on the interesting aspects only. One such efficient and easy-to-understand system is the Hierarchical Production Rules (HPRs) system. An HPR, a standard production rule augmented with generality and specificity information, is of the following form: Decision <D> If <condition> Generality <G> Specificity <S>. HPR systems are capable of handling the taxonomical structures inherent in knowledge about the real world. This paper focuses on the issue of mining quantified rules with crisp hierarchical structure using a Genetic Programming (GP) approach to knowledge discovery. The post-processing scheme presented in this work uses quantified production rules as the initial individuals of GP and discovers hierarchical structure. In the proposed approach, rules are quantified using Dempster-Shafer theory. Suitable genetic operators are proposed for the suggested encoding. Based on the Subsumption Matrix (SM), an appropriate fitness function is suggested. Finally, Quantified Hierarchical Production Rules are generated from the discovered hierarchy using Dempster-Shafer theory. Experimental results are presented to demonstrate the performance of the proposed algorithm.
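A minimal sketch of an HPR as a data structure, following the Decision / If / Generality / Specificity form above; the field names and the example rule are illustrative assumptions.

```python
# Sketch: a Hierarchical Production Rule (HPR) as a data structure.
# Field names and the example rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class HPR:
    decision: str           # concept this rule concludes
    condition: list[str]    # antecedent conjuncts
    generality: list[str]   # more general concepts (parents in hierarchy)
    specificity: list[str]  # more specific concepts (children)
    belief: float = 1.0     # Dempster-Shafer style quantification

rule = HPR(
    decision="vehicle",
    condition=["has_wheels", "carries_people"],
    generality=["machine"],
    specificity=["car", "bus"],
    belief=0.85,
)
print(rule.decision, rule.specificity)
```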
Abstract: In this paper, two versions of an iterative loopless
algorithm for the classical Towers of Hanoi problem with O(1) storage complexity and O(2^n) time complexity are presented. Based
on this algorithm, the number of moves involving each peg, together with their directions, is formulated.
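One well-known loopless iterative scheme with exactly these bounds is sketched below (constant work and storage per move); it illustrates the complexity claims and is not necessarily either of the paper's two versions.

```python
# Sketch: a well-known loopless iterative Towers of Hanoi enumeration,
# O(1) storage and O(2^n) time: move m goes from peg (m & (m-1)) % 3
# to peg ((m | (m-1)) + 1) % 3.
def hanoi(n: int):
    for m in range(1, 1 << n):            # 2^n - 1 moves in total
        src = (m & (m - 1)) % 3
        dst = ((m | (m - 1)) + 1) % 3
        disk = (m & -m).bit_length()      # disk = 1 + trailing zeros of m
        print(f"move {m}: disk {disk} from peg {src} to peg {dst}")

hanoi(3)  # all disks end on peg 2 for odd n, peg 1 for even n
```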
Abstract: The Generalized Center String (GCS) problem
generalizes the Common Approximate Substring and Common
Substring problems. GCS is known to be NP-hard; the difficulty
lies in the explosion of potential candidates. The longest center
string must be found without knowing in advance which sequences
may or may not contain a motif in a particular biological process.
GCS can be solved by frequent pattern mining techniques and is
known to be fixed-parameter tractable with respect to the input
sequence length and symbol set size. Efficient methods known as
Bpriori algorithms can solve GCS with reasonable time/space
complexities; the Bpriori 2 and Bpriori 3-2 algorithms have been
proposed to find center strings of any length and the positions of
all their instances in the input sequences. In this paper, we reduce
the time/space complexity of the Bpriori algorithm with a
Constraint-Based Frequent Pattern mining (CBFP) technique that
integrates the ideas of constraint-based mining and FP-tree mining.
The CBFP mining technique solves the GCS problem not only for
center strings of any length, but also for the positions of all their
mutated copies in the input sequences. It constructs a trie-like
FP-tree to represent the mutated copies of center strings of any
length, along with constraints to restrain the growth of the
consensus tree. The complexities of the CBFP mining technique
and of the Bpriori algorithm are analyzed for both the worst case
and the average case. The algorithm's correctness is demonstrated
by comparison with the Bpriori algorithm on artificial data.
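The kind of trie construction involved can be sketched as follows: overlapping windows of the input sequences are inserted into a trie whose leaves record the positions of every copy, with a support constraint pruning rare branches. The window length and pruning threshold are illustrative assumptions, and handling mutated copies requires more than this exact-match sketch.

```python
# Sketch: a trie over fixed-length windows of input sequences, with a
# support constraint pruning rare branches. Exact-match toy version of
# the trie/FP-tree structure grown under constraints by CBFP.
from collections import defaultdict

def build_trie(sequences, k, min_support=2):
    trie = {}                      # nested dict: symbol -> subtrie
    positions = defaultdict(list)  # k-mer -> list of (sequence, offset)
    for sid, seq in enumerate(sequences):
        for i in range(len(seq) - k + 1):
            positions[seq[i : i + k]].append((sid, i))
    for kmer, occ in positions.items():
        if len(occ) < min_support:     # constraint: prune rare branches
            continue
        node = trie
        for ch in kmer:
            node = node.setdefault(ch, {})
        node["$occ"] = occ             # leaf stores all copy positions
    return trie

t = build_trie(["ACGTACGA", "TACGTACC"], k=4)
print(sorted(t))                       # first-level symbols kept
```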
Abstract: A data warehouse (DW) is a system whose value and role lie in supporting decision-making by querying. Queries to a DW are critical with regard to their complexity and length. They often access millions of tuples and involve joins between relations and aggregations. Materialized views can provide better performance for DW queries. However, these views have a maintenance cost, so materializing all views is not possible. An important challenge of the DW environment is materialized view selection, because we have to realize the trade-off between query performance and view maintenance. Therefore, in this paper we introduce a new approach aimed at solving this challenge, based on Two-Phase Optimization (2PO), which is a combination of Simulated Annealing (SA) and Iterative Improvement (II), with the use of a Multiple View Processing Plan (MVPP). Our experiments show that 2PO outperforms the original algorithms in terms of query processing cost and view maintenance cost.
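The SA half of 2PO can be sketched as an annealing walk over view-selection states (II would supply a good starting state). The cost model and neighborhood move below are toy assumptions.

```python
# Sketch: simulated annealing over materialized-view selection states.
# The benefit/maintenance cost model and the flip move are assumptions.
import math, random

random.seed(0)
N_VIEWS = 8
query_benefit = [random.uniform(1, 10) for _ in range(N_VIEWS)]
maint_cost = [random.uniform(1, 5) for _ in range(N_VIEWS)]

def cost(state):    # total cost = lost query benefit + maintenance cost
    return sum(b for i, b in enumerate(query_benefit) if not state[i]) \
         + sum(c for i, c in enumerate(maint_cost) if state[i])

state = [random.random() < 0.5 for _ in range(N_VIEWS)]
T = 10.0
while T > 0.01:
    nxt = state[:]                       # neighbor: flip one view
    i = random.randrange(N_VIEWS)
    nxt[i] = not nxt[i]
    d = cost(nxt) - cost(state)
    if d < 0 or random.random() < math.exp(-d / T):
        state = nxt                      # accept downhill, or uphill w.p.
    T *= 0.95                            # geometric cooling schedule
print([i for i, s in enumerate(state) if s], cost(state))
```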
Abstract: Process measurement is the task of empirically and objectively assigning numbers to the properties of business processes in such a way as to describe them. Desirable attributes to study and measure include complexity, cost, maintainability, and reliability. In our work we focus on investigating process complexity. We define process complexity as the degree to which a business process is difficult to analyze, understand, or explain. One way to analyze a process's complexity is to use a process control-flow complexity measure. In this paper, an attempt has been made to evaluate the control-flow complexity measure in terms of Weyuker's properties. Weyuker's properties must be satisfied by any complexity measure to qualify as a good and comprehensive one.
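For concreteness, a control-flow complexity computation in the style of Cardoso's CFC measure, the kind of measure evaluated here, can be sketched as a sum of one weight per split construct; the sample process is an illustrative assumption.

```python
# Sketch: a Cardoso-style control-flow complexity (CFC) computation.
# XOR splits contribute their fan-out, OR splits 2^n - 1, AND splits 1.
def cfc(splits):
    """splits: list of (kind, fan_out) pairs, one per split in the model."""
    total = 0
    for kind, n in splits:
        if kind == "xor":
            total += n                # one mental state per outgoing branch
        elif kind == "or":
            total += 2 ** n - 1       # any non-empty subset of branches
        elif kind == "and":
            total += 1                # single state: all branches taken
    return total

process = [("xor", 3), ("and", 2), ("or", 2)]   # assumed sample process
print(cfc(process))                              # 3 + 1 + 3 = 7
```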
Abstract: In this paper we propose a new approach to constructing the Delaunay triangulation and an optimal algorithm for the case of multidimensional spaces (d ≥ 2). Analyzing the current state of the art, one can conclude that the ideas behind the existing efficient algorithms developed for the case d = 2 are not easy to generalize to the multidimensional case without loss of efficiency. To solve this problem, we offer an effective algorithm that satisfies all the given requirements. The theoretical complexity of the problem, however, cannot be improved, as worst-case optimality has been proved for algorithms solving such a problem.
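For readers who want to experiment with the problem setting, SciPy exposes a Qhull-based d-dimensional Delaunay triangulation; this illustrates the task, not the paper's own algorithm, and the random point cloud is a toy assumption.

```python
# Sketch: d-dimensional Delaunay triangulation via SciPy (Qhull).
# Illustrates the problem setting only, not the paper's algorithm.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(3)
points = rng.random((30, 3))          # d = 3 here; works for any d >= 2
tri = Delaunay(points)
print(tri.simplices.shape)            # (n_simplices, d + 1) vertex indices
```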
Abstract: As network-based technologies become
omnipresent, demands to secure networks and systems against threats
increase. One of the effective ways to achieve higher security is
through the use of intrusion detection systems (IDS), software tools
that detect anomalies in a computer or network. In this paper, an IDS
has been developed using an improved machine-learning-based
algorithm, the Locally Linear Neuro-Fuzzy Model (LLNF), for
classification, although this model was originally used for system
identification. A key technical challenge in IDS and LLNF learning is
the curse of high dimensionality. Therefore, a feature selection phase
is proposed that is applicable to any IDS. By investigating the use of
three feature selection algorithms in this model, it is shown that
adding a feature selection phase reduces the computational
complexity of our model. Feature selection algorithms require the use
of a feature goodness measure. The use of both a linear and a
non-linear measure, the linear correlation coefficient and mutual
information respectively, is investigated.
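The two feature-goodness measures named above can be sketched per feature against the class label; the toy data is an assumption, and scikit-learn supplies the mutual information estimator.

```python
# Sketch: scoring each candidate feature against the class label with
# the linear correlation coefficient and with mutual information.
import numpy as np
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(4)
X = rng.standard_normal((500, 5))                    # 5 candidate features
y = (X[:, 0] + 0.5 * X[:, 2] ** 2 > 0).astype(int)   # toy class label

corr = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
mi = mutual_info_classif(X, y, random_state=0)

# The linear measure credits feature 0; MI also credits the non-linear
# contribution of feature 2.
print(np.round(corr, 2), np.round(mi, 2))
```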
Abstract: Duplicated region detection is a technical method to
expose copy-paste forgeries on digital images. Copy-paste is one
of the common types of forgery, cloning a portion of an image
in order to conceal or duplicate a specific object. In this type of
forgery detection, extracting robust block features and the high
time complexity of the matching step are two main open problems.
This paper concentrates on computational time and proposes a local
block matching algorithm based on block clustering to improve time
complexity. The time complexity of the proposed algorithm is
formulated, and the effects of two parameters, block size and number
of clusters, on the efficiency of the algorithm are considered. The
experimental results and mathematical analysis demonstrate that this
algorithm is more cost-effective than lexicographic sorting algorithms
in terms of time complexity when the image is complex.
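The clustering idea can be sketched as bucketing overlapping blocks by a cheap feature so that candidate pairs are compared only within a bucket; the block size, cluster count, and mean-intensity feature are illustrative assumptions.

```python
# Sketch: clustering overlapping image blocks before matching, so that
# candidate pairs are only compared within a cluster. Toy parameters.
import numpy as np

rng = np.random.default_rng(5)
img = rng.integers(0, 256, (64, 64)).astype(float)   # stand-in image
B, N_CLUSTERS = 8, 16

blocks, feats = [], []
for r in range(img.shape[0] - B + 1):
    for c in range(img.shape[1] - B + 1):
        blocks.append((r, c))
        feats.append(img[r:r + B, c:c + B].mean())   # cheap block feature

# Bucket blocks by quantized feature; matching runs inside each bucket,
# which shrinks the number of comparisons versus all-pairs matching.
feats = np.array(feats)
edges = np.quantile(feats, np.linspace(0, 1, N_CLUSTERS + 1)[1:-1])
labels = np.digitize(feats, edges)
sizes = np.bincount(labels, minlength=N_CLUSTERS)
print("largest bucket:", sizes.max(), "of", len(blocks), "blocks")
```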
Abstract: In this paper a new approach to prioritizing urban planning projects in an efficient and reliable way is presented. It is based on environmental pressure indices and multicriteria decision methods. The paper introduces a rigorous method, with acceptable complexity, of rank-ordering urban development proposals according to their environmental pressure. The technique combines the use of Environmental Pressure Indicators, the aggregation of the indicators into an Environmental Pressure Index by means of the Analytic Network Process (ANP) method, and the interpretation of the information obtained from the experts during the decision-making process. The ANP method allows the aggregation of the experts' judgments on each of the indicators into one Environmental Pressure Index. In addition, ANP is based on utility ratio functions, which are the most appropriate for the analysis of uncertain data such as experts' estimations. Finally, unlike other multicriteria techniques, ANP allows the decision problem to be modelled using the relationships among dependent criteria. The method has been applied to the proposal for urban development of La Carlota airport in Caracas (Venezuela). The Venezuelan Government would like to see a recreational project developed on the abandoned area that would mean a significant improvement for the capital. There are three options on the table, currently under evaluation: a health club, a residential area, and a theme park. The participating experts agreed that the method proposed in this paper is useful and an improvement over traditional techniques such as environmental impact studies, life-cycle analysis, etc. They found the results coherent, the process sufficiently rigorous and precise, and the use of resources significantly lower than in other methods.
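The core ANP aggregation step, turning a matrix of pairwise expert judgments into a priority vector (its principal eigenvector), can be sketched with power iteration; the 3x3 judgment matrix is an illustrative assumption, not data from the La Carlota study.

```python
# Sketch: deriving a priority vector from a reciprocal pairwise
# comparison matrix of expert judgments, via power iteration.
import numpy as np

A = np.array([[1.0, 3.0, 5.0],        # assumed judgments: criterion 1 is
              [1/3, 1.0, 2.0],        # 3x as important as 2, 5x as 3, etc.
              [1/5, 1/2, 1.0]])

w = np.ones(A.shape[0])
for _ in range(100):                   # power iteration
    w = A @ w
    w /= w.sum()                       # normalize to a priority vector
print(np.round(w, 3))                  # relative weights of the criteria
```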
Abstract: Urban transformation processes, in their framework
and general significance, have become a fundamental and vital subject
of consideration for both the developed and the developing societies.
It has become important to regulate the architectural systems adopted
by the city, to sustain the present development on one hand, and on
the other hand, to facilitate its future growth.
Thus, the study dealt with the phenomenon of urban
transformation in Mediterranean cities, and in the city of Alexandria
in particular, because of its significant historical and cultural legacy,
its historical architecture, and its contemporary urbanization.
This article investigates cities across the Mediterranean
region through an analysis of the relationship between the inflation
and growth of these cities and the extent of the complexity of their
barriers. We hope to analyze not only the internal transformations,
but also the external relationships (both imperial and post-colonial)
that have shaped Alexandria's growth from the nineteenth century
until today.
Abstract: The implementation of new software and hardware technologies in tritium processing nuclear plants, especially those of an experimental character or based on new technological developments, presents a degree of complexity owing to the issues raised by integrating high-performance instrumentation and equipment into a unitary monitoring system for the nuclear technological process of tritium removal. Maintaining the system's flexibility is a requirement of experimental nuclear plants, for which changes of configuration, process, and parameters are usual. The large amount of data that needs to be processed, stored, and accessed for real-time simulation and optimization demands a virtual technological platform where the data acquisition, control, and analysis systems of the technological process can be integrated with a dedicated technological monitoring system. Thus, the integrated computing and monitoring systems needed to supervise the technological process will be implemented, followed by an optimization system built on new, high-performance methods suited to the technological processes within tritium removal processing nuclear plants. The software applications are developed with the support of program packages dedicated to industrial processes; they include "virtual" acquisition and monitoring sub-modules, as well as a storage sub-module for the process data later required by the optimization and simulation software for the tritium removal technological process. The system plays an important role in environmental protection and sustainable development through new technologies, namely the reduction of, and fight against, industrial accidents in tritium processing nuclear plants. Research into the monitoring and optimization of nuclear processes is also a major driving force for economic and social development.
Abstract: In this paper, we propose a new architecture for the implementation of the N-point Fast Fourier Transform (FFT), based on the radix-2 decimation-in-frequency algorithm. The architecture is based on a pipeline circuit that can process a stream of samples and produce two FFT transform samples every clock cycle. Compared to existing implementations, the proposed architecture achieves double the processing speed with the same circuit complexity.
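The radix-2 decimation-in-frequency recursion that such a pipeline implements in hardware can be sketched in software as follows; each stage splits the sequence into two half-length DFTs producing the even-indexed and odd-indexed outputs, and each butterfly yields two values at once. This is a software-only illustration, not the pipelined circuit itself.

```python
# Sketch: radix-2 decimation-in-frequency FFT. Each stage forms the
# half-length inputs for the even-index and odd-index output DFTs.
import cmath

def fft_dif(x):
    n = len(x)                       # n must be a power of two
    if n == 1:
        return x
    half = n // 2
    a = [x[i] + x[i + half] for i in range(half)]             # even outputs
    b = [(x[i] - x[i + half]) * cmath.exp(-2j * cmath.pi * i / n)
         for i in range(half)]                                # odd outputs
    even, odd = fft_dif(a), fft_dif(b)
    out = [0j] * n
    out[0::2], out[1::2] = even, odd     # interleave back into order
    return out

x = [complex(i) for i in range(8)]
print([round(abs(v), 3) for v in fft_dif(x)])
```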