Abstract: Transcription factors are a group of proteins that help interpret the genetic information encoded in DNA.
Protein-protein interactions play a major role in the execution
of key biological functions of a cell. These interactions are
represented in the form of a graph with nodes and edges.
Studies have shown that some nodes have a high degree of connectivity; such nodes, known as hub nodes, are indispensable parts of the network. In the present paper, a method is proposed to identify hub transcription factor proteins using
sequence information. On a complete data set of transcription
factor proteins available from the APID database, the
proposed method showed an accuracy of 77%, sensitivity of
79% and specificity of 76%.
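The reported figures correspond to the standard binary-classification measures. As a reference, here is a minimal sketch of how accuracy, sensitivity, and specificity are computed from hub/non-hub predictions; the label vectors are illustrative placeholders, not data from the paper.

```python
# Minimal sketch: accuracy, sensitivity and specificity for a binary
# hub / non-hub classifier, computed from true and predicted labels.

def confusion_counts(y_true, y_pred):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, tn, fp, fn

def metrics(y_true, y_pred):
    tp, tn, fp, fn = confusion_counts(y_true, y_pred)
    accuracy    = (tp + tn) / (tp + tn + fp + fn)
    sensitivity = tp / (tp + fn)   # fraction of true hubs recovered
    specificity = tn / (tn + fp)   # fraction of non-hubs recovered
    return accuracy, sensitivity, specificity

y_true = [1, 1, 0, 0, 1, 0, 0, 1]   # 1 = hub, 0 = non-hub (placeholders)
y_pred = [1, 0, 0, 0, 1, 1, 0, 1]
print(metrics(y_true, y_pred))
```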
Abstract: In Orthogonal Frequency Division Multiplexing (OFDM) systems, the Peak-to-Average Power Ratio (PAR) is high. Clipping the signal is a useful and simple method to reduce the PAR. However, it introduces additional noise that degrades the system's performance. We propose an oversampling scheme that processes the received signal to reduce the clipping noise using a Finite Impulse Response (FIR) filter. The filter coefficients are obtained from the correlation function of the received signal and the oversampling information at the receiver. The performance of the proposed technique is evaluated for a frequency-selective channel. Results show that the proposed scheme can mitigate the clipping noise significantly for OFDM systems and that, in order to maintain the system's capacity, the clipping ratio should be larger than 2.5.
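For background on the clipping operation, the following is a minimal sketch of PAR measurement and amplitude clipping for a single OFDM symbol; the subcarrier count, QPSK mapping, and seed are assumptions for illustration, and the proposed FIR-based receiver correction is not reproduced.

```python
import numpy as np

# Minimal sketch of PAR measurement and amplitude clipping for one OFDM
# symbol. Subcarrier count, QPSK mapping, and seed are illustrative.
rng = np.random.default_rng(0)
N = 256                                    # number of subcarriers (assumed)
X = np.exp(1j * np.pi / 2 * rng.integers(0, 4, N))  # QPSK on subcarriers
x = np.fft.ifft(X) * np.sqrt(N)            # time-domain OFDM symbol

def par_db(sig):
    return 10 * np.log10(np.max(np.abs(sig)**2) / np.mean(np.abs(sig)**2))

cr = 2.5                                   # clipping ratio: threshold / RMS
a = cr * np.sqrt(np.mean(np.abs(x)**2))    # clipping amplitude
clipped = x.copy()
over = np.abs(x) > a
clipped[over] = a * x[over] / np.abs(x[over])   # clip magnitude, keep phase

print(f"PAR before: {par_db(x):.2f} dB, after: {par_db(clipped):.2f} dB")
```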
Abstract: This paper aims to (1) analyze the profiles of
transgressors (detected evaders); (2) examine the reason(s) that triggered a
tax audit, causes of tax evasion, audit timeframe and tax penalty
charged; and (3) assess whether tax auditors followed the guidelines as
stated in the 'Tax Audit Framework' when conducting tax audits. In
2011, the Inland Revenue Board Malaysia (IRBM) had audited and
finalized 557 company cases. With official permission, data of all the
557 cases were obtained from the IRBM. Of these, a total of 421 cases
with complete information were analyzed. About 58.1% were small and medium corporations, and 32.8% were from the construction industry. Cases were selected for tax audit based on risk analysis (66.8%), information from third parties (11.1%), and low profitability or a fluctuating profit pattern (7.8%). The three persistent causes of tax evasion by
firms were overclaimed expenses (46.8%), fraudulent reporting of
income (38.5%) and overstating purchases (10.5%). These findings
are consistent with past literature. Results showed that tax auditors
took six to 18 months to close audit cases. More than half of tax
evaders were fined 45% on additional tax raised during audit for the
first offence. The study found that tax auditors followed the guidelines in
the 'Tax Audit Framework' in audit selection, settlement and penalty
imposition.
Abstract: An important problem in today's organizations is the existence of non-integrated information systems, inconsistency, and a lack of suitable correlation between legacy and modern systems. One main solution is to transfer the local databases into a global one. In this regard, the data structures must be extracted from the legacy systems and integrated with the new technology systems. In legacy systems, huge amounts of data are stored in legacy databases. They require particular attention since they need more effort to be normalized, reformatted, and moved to modern database environments. Designing the new integrated (global) database architecture and applying reverse engineering require
data normalization. This paper proposes the use of database reverse
engineering in order to integrate legacy and modern databases in
organizations. The suggested approach consists of methods and
techniques for generating data transformation rules needed for the
data structure normalization.
Abstract: This paper proposes a modeling methodology for the development of data analysis solutions. The author introduces an approach to addressing data warehousing issues at the enterprise level. The methodology covers the requirements elicitation and analysis stage as well as the initial design of the data warehouse. The paper reviews an extended business process model that satisfies the needs of data warehouse development. The author considers the use of business process models necessary, as they reflect both enterprise information systems and business functions, which are important for data analysis. The described approach divides development into three steps with different levels of model elaboration, making it possible to gather requirements and present them to business users in an accessible manner.
Abstract: A high performance computer includes a fast processor and millions of bytes of memory. During data processing, huge amounts of information are shuffled between the memory and the processor. Because of its small size and effective speed, the cache has become a common feature of high performance computers. Enhancing cache performance has proved essential to speeding up cache-based computers. Most enhancement approaches can be classified as either software based or hardware controlled. The performance of the cache is quantified in terms of the hit ratio or miss ratio. In this paper, we optimize cache performance by enhancing the cache hit ratio. The optimum cache performance is obtained by modifying the cache hardware so that mismatched line tags are quickly rejected at the hit-or-miss comparison stage, thus achieving a low hit time for the wanted line in the cache. In the proposed technique, which we call Even-Odd Tabulation (EOT), the cache lines coming from main memory into the cache are classified into two types, even line tags and odd line tags, depending on their Least Significant Bit (LSB). The EOT technique exploits this division to reject mismatched line tags in far less time than the main comparator in the cache would take, giving an optimum hit time for the wanted cache line. The simulated results show the high performance of the EOT technique compared with the familiar mapping technique FAM.
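To make the even/odd selection idea concrete, here is a minimal software sketch of the tag partition behind EOT; the paper implements the idea as a hardware modification to the comparator stage, so this model only illustrates the selection logic.

```python
# Minimal sketch of the even/odd tag partition idea behind EOT: tags are
# grouped by their least significant bit, so a lookup only compares
# against tags of matching parity and rejects the other half at once.

class EOTCache:
    def __init__(self):
        self.sets = {0: {}, 1: {}}          # partition keyed by tag LSB

    def insert(self, tag, line):
        self.sets[tag & 1][tag] = line

    def lookup(self, tag):
        # Only the matching-parity table is searched; all tags of the
        # opposite parity are rejected without individual comparisons.
        return self.sets[tag & 1].get(tag)   # None signals a miss

cache = EOTCache()
cache.insert(0b10110, "line A")             # even tag
cache.insert(0b10011, "line B")             # odd tag
print(cache.lookup(0b10011))                # hit  -> "line B"
print(cache.lookup(0b10010))                # miss -> None
```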
Abstract: This paper presents an interval-based multi-attribute
decision making (MADM) approach in support of the decision
process with imprecise information. The proposed decision
methodology is based on the model of linear additive utility function
but extends the problem formulation with the measure of composite
utility variance. A sample study concerning the evaluation of electric generation expansion strategies is provided, showing how
imprecise data may affect the choice toward the best solution and
how a set of alternatives, acceptable to the decision maker (DM),
may be identified with certain confidence.
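As an illustration of the interval-based formulation, the following sketch evaluates the linear additive utility of alternatives whose attribute values are sampled uniformly from intervals; the alternatives, intervals, and weights are invented, and the paper's exact composite-variance model may differ.

```python
import numpy as np

# Minimal sketch of an interval-based additive-utility evaluation, under
# the assumption that imprecise attribute values are sampled uniformly
# from their intervals. Alternatives, intervals, and weights are
# illustrative placeholders.
rng = np.random.default_rng(1)
weights = np.array([0.5, 0.3, 0.2])          # attribute weights (sum to 1)

# Each alternative: per-attribute (low, high) utility intervals.
alternatives = {
    "A": [(0.6, 0.8), (0.4, 0.9), (0.7, 0.7)],
    "B": [(0.5, 0.9), (0.6, 0.7), (0.3, 0.8)],
}

for name, intervals in alternatives.items():
    lo = np.array([i[0] for i in intervals])
    hi = np.array([i[1] for i in intervals])
    samples = rng.uniform(lo, hi, size=(10_000, len(weights)))
    u = samples @ weights                    # linear additive utility
    print(f"{name}: mean utility {u.mean():.3f}, variance {u.var():.5f}")
```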
Abstract: Microcirculation is essential for the proper supply of
oxygen and nutritive substances to the biological tissue and the
removal of waste products of metabolism. The determination of
blood flow in the capillaries is therefore of great interest to clinicians.
A comparison of the developed non-invasive, non-contact, whole-field laser speckle contrast imaging (LSCI) technique with a commercially available laser Doppler blood flowmeter (LDF) in evaluating blood flow at the fingertip and elbow is presented here. The LSCI technique gives more
quantitative information on the velocity of blood when compared to
the perfusion values obtained using the LDF. Measurement of blood
flow in capillaries can be of great interest to clinicians in the
diagnosis of vascular diseases of the upper extremities.
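For reference, the standard LSCI computation derives a local speckle contrast K = sigma / mean over a small sliding window, with lower K corresponding to higher flow velocity; the sketch below assumes a 7x7 window and a random stand-in frame.

```python
import numpy as np
from scipy.ndimage import uniform_filter

# Minimal sketch of the standard LSCI contrast computation:
# K = sigma / mean over a sliding window (7x7 here, an assumed size).
def speckle_contrast(image, window=7):
    image = image.astype(float)
    mean = uniform_filter(image, size=window)
    mean_sq = uniform_filter(image * image, size=window)
    variance = np.maximum(mean_sq - mean * mean, 0.0)
    return np.sqrt(variance) / np.maximum(mean, 1e-12)

raw = np.random.default_rng(2).random((128, 128))  # stand-in speckle frame
K = speckle_contrast(raw)
print(K.mean())
```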
Abstract: PARADIGMA (PARticipative Approach to DIsease
Global Management) is a pilot project which aims to develop and
demonstrate an Internet based reference framework to share scientific
resources and findings in the treatment of major diseases.
PARADIGMA defines and disseminates a common methodology and
optimised protocols (Clinical Pathways) to support service functions
directed to patients and individuals on matters like prevention, post-hospitalisation support and awareness. PARADIGMA will provide a
platform of information services - user oriented and optimised
against social, cultural and technological constraints - supporting the
Health Care Global System of the Euro-Mediterranean Community
in a continuous improvement process.
Abstract: The objective of this study was to improve our understanding of vulnerability and environmental change: its causes, its intensity and distribution, and the human-environment effect on the ecosystem in the Apodi Valley Region. This paper identifies, assesses, and classifies vulnerability and environmental change in the Apodi Valley region using a combined approach of landscape pattern and ecosystem sensitivity. Models were developed using the
following five thematic layers: geology, geomorphology, soil, vegetation, and land use/cover, by means of a Geographical Information System (GIS) based on hydro-geophysical parameters.
In spite of data problems and shortcomings, using ESRI's ArcGIS 9.3, the vulnerability score, which classifies, weights, and combines 15 separate land cover classes into a single indicator, provides a reliable measure of differences (6 classes) among regions and communities that are exposed to similar ranges of hazards.
Indeed, the ongoing and active development of vulnerability concepts and methods has already produced some tools to help overcome common issues, such as acting in a context of high uncertainty, taking into account the dynamics and spatial scale of a social-ecological system, or gathering viewpoints from different sciences to combine human and impact-based approaches. Based on
this assessment, this paper proposes concrete perspectives and
possibilities to benefit from existing commonalities in the
construction and application of assessment tools.
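A minimal sketch of the weighted-overlay step follows, combining scored thematic layers into a single vulnerability indicator cut into 6 classes; the weights and class breaks are placeholders, not the study's calibrated values.

```python
import numpy as np

# Minimal sketch of a GIS-style weighted overlay: per-pixel scores from
# the five thematic layers are combined with weights into a single
# vulnerability indicator and then cut into 6 classes. Weights and
# class breaks are illustrative placeholders.
rng = np.random.default_rng(3)
shape = (100, 100)
layers = {name: rng.uniform(1, 5, shape)      # rated 1 (low) .. 5 (high)
          for name in ["geology", "geomorphology", "soil",
                       "vegetation", "land_use"]}
weights = {"geology": 0.15, "geomorphology": 0.25, "soil": 0.20,
           "vegetation": 0.15, "land_use": 0.25}

score = sum(weights[n] * layers[n] for n in layers)   # composite indicator
breaks = np.linspace(score.min(), score.max(), 7)[1:-1]
classes = np.digitize(score, breaks) + 1              # classes 1..6
print(np.bincount(classes.ravel())[1:])               # pixels per class
```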
Abstract: We investigate planar quasi-septic non-analytic systems that have a center-focus equilibrium at the origin and whose angular speed is constant. The system can be changed into an analytic system by two transformations; with the help of the computer algebra system MATHEMATICA, the conditions for a uniform isochronous center are obtained.
Abstract: The paper presents a computational tool developed for
the evaluation of the technical and economic advantages of an innovative technology for cleaning and conditioning the outlet product gas of fluidized bed steam/oxygen gasifiers. This technology integrates into a single
unit the steam gasification of biomass and the hot gas cleaning and
conditioning system. Both components of the computational tool,
process flowsheet and economic evaluator, have been developed
under IPSEpro software. The economic model provides information
that can help potential users, especially small and medium size
enterprises acting in the renewable energy field, to decide the
optimal scale of a plant and to better understand both potentiality and
limits of the system when applied to a wide range of conditions.
Abstract: Multimedia information availability has increased
dramatically with the advent of video broadcasting on handheld
devices. But with this availability come problems of maintaining the security of information that is displayed in public. ISMA Encryption
and Authentication (ISMACryp) is one of the chosen technologies for
service protection in DVB-H (Digital Video Broadcasting-
Handheld), the TV system for portable handheld devices. The
ISMACryp scheme encrypts H.264/AVC (Advanced Video Coding) coded data while leaving all structural data as it is. Two modes of ISMACryp are available: CTR (Counter) mode and CBC (Cipher Block Chaining) mode. Both modes are based on the 128-bit AES algorithm. Full AES encryption is complex and requires considerable execution time, which is not suitable for real-time applications like live TV. The proposed system aims to gain a deep
understanding of video data security on multimedia technologies and
to provide security for real time video applications using selective
encryption for H.264/AVC. Five levels of security are proposed in this paper, based on the content of the NAL unit in the Constrained Baseline profile of H.264/AVC. The selective encryption at the different levels encrypts only the intra-prediction modes, residue data, inter-prediction modes, or motion vectors. Experimental results presented in this paper show that the fifth level, full ISMACryp, provides the highest level of security but requires the most encryption time, while the first level provides a lower level of security by encrypting only motion vectors, with lower execution time and no compromise on compression or the quality of the visual content. The encryption scheme adds little cost to the compression process and keeps the file format unchanged, with some direct operations supported. Simulations were carried out in MATLAB.
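For reference, the following sketch shows the AES-128 CTR operation underlying ISMACryp applied to a single toy NAL-unit payload while the header byte is left in the clear; the key, nonce, and NAL unit are placeholders, and real ISMACryp key management and packaging are not shown.

```python
import os
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# Minimal sketch of AES-128 in CTR mode applied to a selected NAL-unit
# payload, leaving the header in the clear. Key, nonce, and the toy NAL
# unit are placeholders.
key = os.urandom(16)                      # 128-bit AES key
nonce = os.urandom(16)                    # per-sample counter block

nal_unit = bytes([0x65]) + b"slice payload bytes..."  # header + payload
header, payload = nal_unit[:1], nal_unit[1:]

encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
protected = header + encryptor.update(payload) + encryptor.finalize()

decryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
restored = protected[:1] + decryptor.update(protected[1:]) + decryptor.finalize()
assert restored == nal_unit
```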
Abstract: An academic research information service is a must for surveying previous studies in the research and development process. OntoFrame is an academic research information service built on a Semantic Web framework, unlike simple keyword-based services such as CiteSeer and Google Scholar. The first purpose of this study is to reveal user behavior in such surveys, the purposes for which academic research information services are used, and user needs. The second is to apply the lessons learned from the results to OntoFrame.
Abstract: The healthcare environment is generally perceived as
being information rich yet knowledge poor. However, there is a lack
of effective analysis tools to discover hidden relationships and trends
in data. In fact, valuable knowledge can be discovered from the application of data mining techniques in healthcare systems. In this study, a methodology is presented for the extraction of significant patterns from coronary heart disease data warehouses for the prediction of heart attack, which unfortunately continues to be a leading cause of mortality worldwide. For this purpose, we propose to enumerate dynamically the optimal subsets of the reduced features of high interest by using the rough set technique combined with dynamic programming. We then propose to validate the classification using a Random Forest (RF) decision tree to identify the risky heart disease cases. This work is based on a large
amount of data collected from several clinical institutions based on
the medical profiles of patients. Moreover, the experts' knowledge in
this field has been taken into consideration in order to define the
disease, its risk factors, and to establish significant knowledge
relationships among the medical factors. A computer-aided system is
developed for this purpose based on a population of 525 adults. The
performance of the proposed model is analyzed and evaluated based on a set of benchmark techniques applied to this classification problem.
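As an illustration of the validation step, here is a minimal Random Forest sketch on a synthetic table of 525 records standing in for the study's reduced clinical features; the data are random placeholders, not patient records.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Minimal sketch of Random Forest classification on a heart-disease-like
# table. The synthetic features stand in for the reduced feature subsets
# obtained via rough sets; the data are random placeholders.
rng = np.random.default_rng(4)
X = rng.normal(size=(525, 6))             # 525 patients, 6 reduced features
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=525) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```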
Abstract: The paper deals with quality labels used in the food products market, especially labels of quality, labels of origin, and labels of organic farming. The aim of the paper is to identify the perception of these labels by consumers in the Czech Republic. The first part refers to the definition and specification of the food quality labels that are relevant in the Czech Republic. The second part includes a discussion of marketing research results. Data were collected using a personal questioning method. Empirical findings on 150 respondents relate to consumer awareness and perception of national and European food quality labels used in the Czech Republic, attitudes toward purchases of labelled products, and interest in information regarding the labels. Statistical methods, specifically Pearson's chi-square test of independence, the coefficient of contingency, and the coefficient of association, are used to determine whether significant differences exist among selected demographic categories of Czech consumers.
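For reference, a minimal sketch of Pearson's chi-square test of independence on an invented awareness-by-age contingency table (the counts are placeholders, not the survey's data):

```python
from scipy.stats import chi2_contingency

# Minimal sketch of Pearson's chi-square test of independence on a
# label-awareness-by-age contingency table. The counts are invented
# placeholders (n = 150 respondents overall in the survey).
table = [[28, 17],    # aware of label:   younger, older
         [47, 58]]    # unaware of label: younger, older

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}, dof = {dof}")
# Small p (e.g. < 0.05) would indicate awareness differs by age group.
```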
Abstract: A review of energy consumption and rates in Iran shows that, unfortunately, the optimization and conservation of energy in the country's active industries lacks a practical and effective method, and in most factories energy consumption and rates are higher than in similar industries in industrialized countries. The increasing demand for electrical energy and the overhead it imposes on the organization force companies to search for suitable approaches to optimizing energy consumption and demand management. The application of value engineering techniques is among these approaches. Value engineering is considered a powerful tool for improving profitability. These tools are used to reduce expenses, increase profits, improve quality, increase market share, perform work in shorter durations, utilize resources more efficiently, and so on.
In this article, we review value engineering and its capability to create effective transformations in industrial organizations in order to reduce energy costs. The results are investigated and described in a case study of the Mazandaran wood and paper industries, the biggest consumer of energy in the north of Iran, to present the effects of the tasks performed in optimizing energy consumption using value engineering techniques.
Abstract: The paper presents an approach for handling uncertain
information in deductive databases using multivalued logics. Uncertainty
means that database facts may be assigned logical values other
than the conventional ones - true and false. The logical values represent
various degrees of truth, which may be combined and propagated
by applying the database rules. A corresponding multivalued database
semantics is defined. We show that it extends successful conventional semantics such as the well-founded semantics, and has polynomial-time data complexity.
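As an illustration only, the sketch below propagates degrees of truth through a single database rule using min for conjunction and max across alternative derivations, a common choice in multivalued logics; the paper's actual semantics may combine values differently.

```python
# Minimal sketch of propagating degrees of truth through a database
# rule, using min for conjunction in the body and max to combine
# alternative derivations. This combination rule is an assumption.

facts = {("supplier", "s1"): 0.9, ("reliable", "s1"): 0.6,
         ("supplier", "s2"): 1.0, ("reliable", "s2"): 0.3}

# Rule: preferred(X) <- supplier(X), reliable(X)
def derive_preferred(facts):
    out = {}
    for (pred, x), v in facts.items():
        if pred == "supplier":
            body = min(v, facts.get(("reliable", x), 0.0))  # conjunction
            out[x] = max(out.get(x, 0.0), body)             # alternatives
    return out

print(derive_preferred(facts))   # {'s1': 0.6, 's2': 0.3}
```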
Abstract: A series of tests on a cold-formed steel (CFS) wall plate system subjected to an uplift force at the mid-span of the wall plate is presented. The aim of the study was to investigate the behaviour and identify the modes of failure of the CFS wall plate system. Two parameters were considered in the study: 1) different dimensions of the U-bracket at the supports, and 2) different sizes of lipped C-channel. The lipped C-channels used were C07508, C07512, and C10012. The dimensions of the leg of the U-bracket were 50x35 mm and 50x60 mm respectively, where 25 mm clearance was provided at the connections for specimens with clearance. Results show that specimens with and without clearance experienced the same mode of failure. Failure began with the yielding of the connectors, followed by distortional buckling of the wall plate. However, when C075 sections were used as the wall plate, the system behaved differently. There was a large deformation in the wall plate, and failure began with distortional buckling of the wall plate followed by bearing of the connecting plates at the supports (U-bracket). The ultimate strength of the system also decreased dramatically when C075 sections were used.
Abstract: In this paper, the implementation of a rule-based
intuitive reasoner is presented. The implementation included two
parts: the rule induction module and the intuitive reasoner. A large
weather database was acquired as the data source. Twelve weather
variables from those data were chosen as the “target variables"
whose values were predicted by the intuitive reasoner. A “complex"
situation was simulated by making only subsets of the data available
to the rule induction module. As a result, the rules induced were
based on incomplete information with variable levels of certainty.
The certainty level was modeled by a metric called "Strength of
Belief", which was assigned to each rule or datum as ancillary
information about the confidence in its accuracy. Two techniques
were employed to induce rules from the data subsets: decision trees and multi-polynomial regression, for the discrete and continuous target variables respectively. The intuitive reasoner was tested
for its ability to use the induced rules to predict the classes of the
discrete target variables and the values of the continuous target
variables. The intuitive reasoner implemented two types of
reasoning: fast and broad, where, by analogy to human thought, the former corresponds to fast decision making and the latter to deeper contemplation. For reference, a weather data analysis approach
which had been applied on similar tasks was adopted to analyze the
complete database and create predictive models for the same 12
target variables. The values predicted by the intuitive reasoner and
the reference approach were compared with actual data. The intuitive
reasoner reached near-100% accuracy for two continuous target
variables. For the discrete target variables, the intuitive reasoner
predicted at least 70% as accurately as the reference reasoner. Since
the intuitive reasoner operated on rules derived from only about 10%
of the total data, it demonstrated potential advantages in dealing with sparse data sets compared with conventional methods.
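As an illustration of how "Strength of Belief" can act as ancillary confidence information, the sketch below combines rule outputs by belief-weighted averaging; this aggregation is an assumption, not the paper's specified method.

```python
# Minimal sketch of combining rule predictions weighted by a
# "Strength of Belief" score. The weighted-average combination shown
# here is an assumption; the paper does not specify this aggregation.

rules = [
    # (predicted value, strength of belief in [0, 1])
    (21.5, 0.9),   # rule induced from a large, consistent data subset
    (19.0, 0.4),   # rule induced from sparse data
    (22.0, 0.7),
]

def combine(rules):
    total = sum(sob for _, sob in rules)
    value = sum(v * sob for v, sob in rules) / total
    belief = total / len(rules)          # average confidence in the result
    return value, belief

value, belief = combine(rules)
print(f"predicted value {value:.2f} with strength of belief {belief:.2f}")
```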