Abstract: Background: Tissue Doppler Echocardiography
(TDE) assesses diastolic function more accurately than routine pulse
Doppler echo. Assessment of the effects of dynamic and static
exercises on the heart by using TDE can provide new information
about the athlete's heart syndrome. Methods: This study was
conducted on 20 elite wrestlers, 14 endurance runners at national
level and 21 non-athletes as the control group. Participants underwent
two-dimensional echocardiography, standard Doppler and TDE.
Results: Wrestlers had the highest left ventricular mass index,
end-diastolic interventricular septum thickness and left ventricular
posterior wall thickness. Runners had the highest left ventricular
end-diastolic volume, LV ejection fraction, stroke volume and
cardiac output. In TDE, the ratio of the early to the late diastolic
velocity of the mitral annulus was greater in the athletic groups than
in the controls, although the difference was not significant.
Conclusion: Despite the cardiac morphological changes in athletes,
TDE shows that cardiac diastolic function is not adversely affected.
Abstract: Rational Emotive Behaviour Therapy (REBT), introduced
by Albert Ellis, was the first cognitive behaviour therapy.
It is a systematic, structured psychotherapy that is effective
in treating various psychological problems. A 25-year-old male
patient experienced intense fear and situational panic attacks about
returning to his faculty and facing his class-mates after a long
absence (2 years). This social anxiety disorder was a major factor
impeding the progress of his studies. He was treated with behavioural
techniques such as relaxation breathing, and cognitive techniques
such as imagery, cognitive restructuring, rationalization and
systematic desensitization. The patient reported improvement in the
anxiety disorder, was able to progress well in his studies, and led
a better quality of life as a student.
Abstract: The self-organizing map (SOM) has several beneficial
features which make it a useful method for data mining. One of the most important
features is the ability to preserve the topology in the projection.
There are several measures that can be used to quantify the goodness
of the map in order to obtain the optimal projection, including the
average quantization error and various topological errors. Many
researchers have studied how topology preservation should be
measured. One option is the topographic error, which considers
the proportion of data vectors for which the first and second
best-matching units (BMUs) are not adjacent on the map. In this work we present a study of the
behaviour of the topographic error in different kinds of maps. We
have found that this error penalizes rectangular maps, and we
have studied the reasons why this happens. Finally, we suggest a new
topological error measure that overcomes this deficiency of the topographic error.
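As an illustration, the topographic error can be computed directly from a codebook and its grid coordinates. The following is a minimal sketch; the function signature, the squared-Euclidean metric and the 8-neighbourhood adjacency test are our assumptions, not the paper's exact definitions:

```python
def topographic_error(data, codebook, grid):
    """Fraction of data vectors whose first and second best-matching
    units (BMUs) are not neighbours on the map grid."""
    errors = 0
    for v in data:
        # rank units by squared Euclidean distance to the data vector
        ranked = sorted(range(len(codebook)),
                        key=lambda i: sum((a - b) ** 2
                                          for a, b in zip(v, codebook[i])))
        (r1, c1), (r2, c2) = grid[ranked[0]], grid[ranked[1]]
        if max(abs(r1 - r2), abs(c1 - c2)) > 1:  # 8-neighbourhood (assumed)
            errors += 1
    return errors / len(data)
```

Here `grid[i]` gives the (row, column) position of unit `i` on the map, so the same routine applies to both square and rectangular lattices.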
Abstract: Protective clothing limits heat transfer and hampers
task performance due to its added weight. Military protective
clothing enables humans to operate in adverse environments. In the
selection and evaluation of military protective clothing, attention
should be given to heat strain, ergonomics and fit, in addition to the
actual protection it offers.
Fifty healthy male subjects participated in the study. The subjects
were dressed in shorts, T-shirts, socks, sneakers and four different
kinds of military protective clothing: CS, CSB, CS with
NBC protection, and CS with NBC protection added.
The ergonomic and physiological strain of each of the four outfits
was investigated while the subjects walked on a treadmill (7 km/hour)
with a 19.7 kg backpack. The tests showed that the highest heart
rate was found wearing the NBC-protection-added outfit; the highest
temperatures were observed wearing NBC protection added, followed by
CS with NBC protection, CSB and CS, respectively; and the highest
score for thermal comfort (implying the worst thermal comfort) was
observed wearing NBC protection added.
Abstract: Implicit block methods based on the backward
differentiation formulae (BDF) with variable step size are derived
for the solution of stiff initial value problems (IVPs). We construct
a variable-step-size block method which stores all the coefficients
of the method, with a simplified strategy for controlling the step size,
with the intention of optimizing performance in terms of
precision and computation time. The strategy keeps the step size
constant, halves it, or increases it to 1.9 times the previous step size.
The decision to change the step size is determined by the local
truncation error (LTE). Numerical results are provided to support the
enhanced performance of the method.
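The step-size strategy described above (keep the step constant, halve it on rejection, or grow it by a factor of 1.9) can be sketched as follows; the `safety` factor and the acceptance margin are illustrative assumptions, since the abstract does not give them:

```python
def next_step_size(h, lte, tol, safety=0.8, growth=1.9):
    """Decide the next step size from the local truncation error (LTE).

    Returns (new_h, accepted): reject and halve when the LTE exceeds
    the tolerance; accept and grow by 1.9x when the LTE is comfortably
    below it; otherwise accept and keep the step size constant.
    """
    if lte > tol:
        return h / 2.0, False            # step rejected: halve and redo
    if lte < safety * tol / growth:      # assumed acceptance margin
        return h * growth, True          # step accepted: increase by 1.9x
    return h, True                       # step accepted: keep h constant
```

In a full integrator this decision would be taken after each block step, with the stored coefficient sets selected for the new step-size ratio.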
Abstract: The purpose of this article is to analyze the
degree of concentration in the banking market in EU member
states as well as to determine the impact of the length of EU
membership on the degree of concentration. To that end,
several analyses were conducted: panel analysis,
calculation of a correlation coefficient, and regression analysis of
the impact of the length of EU membership on the degree of
concentration. Panel analysis was conducted to determine
whether there is a similar trend of concentration in three
groups of countries - countries with a low, moderate and high
level of concentration. The conducted panel analysis showed
that in EU countries with a moderate level of concentration,
the level of concentration decreases. The calculation of
correlation showed that, to some extent, with other influential
factors, the length of EU membership negatively affects the
market concentration of the banking market. Using the
regression analysis for investigation of the influence of the
length of EU membership on the level of concentration in the
banking sector in a particular country, the results reveal
a negative effect of the length of EU membership on
market concentration, although it is not a significantly influential
variable.
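The abstract does not name its concentration measure; as a hedged illustration, two measures commonly used for banking markets are the Herfindahl-Hirschman index and the k-firm concentration ratio:

```python
def hhi(shares):
    """Herfindahl-Hirschman index: sum of squared market shares."""
    total = sum(shares)
    return sum((s / total) ** 2 for s in shares)

def concentration_ratio(shares, k=5):
    """CR_k: combined market share of the k largest banks."""
    total = sum(shares)
    return sum(sorted(shares, reverse=True)[:k]) / total
```

Either measure can be computed per country and year to form the panel used in the analyses above.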
Abstract: In the load frequency control (LFC) problem, disturbances enter an area through its interconnections with other areas, and it is therefore important to suppress these disturbances through the coordination of governor systems. In contrast, tie-line power flow control by a thyristor-controlled phase shifter (TCPS) located between two areas makes it possible to actively stabilize system frequency oscillations through the interconnection, which is also expected to provide a new ancillary service for future power systems. Thus, a control strategy based on controlling the phase angle of the TCPS is proposed in this paper to provide an active control facility for the system frequency. In addition, the robust optimum adjustment of the PID controller's parameters under a bilateral contract scenario, following large step load demands and disturbances, with and without the TCPS, is investigated by Particle Swarm Optimization (PSO), which has a strong ability to find near-optimal results. This newly developed control strategy combines the advantages of PSO and the TCPS and has a simple structure that is easy to implement and tune. To demonstrate the effectiveness of the proposed control strategy, a three-area restructured power system is considered as a test system under different operating conditions and system nonlinearities. Analysis reveals that the TCPS is quite capable of suppressing the frequency and tie-line power oscillations effectively, as compared to the case without the TCPS, for a wide range of plant parameter changes, area load demands and disturbances, even in the presence of system nonlinearities.
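A minimal sketch of the PSO machinery used for the parameter search is given below. The swarm parameters and the quadratic stand-in cost are our assumptions; in the paper the cost would be a time-domain performance index of the LFC response for candidate PID gains (Kp, Ki, Kd):

```python
import random

def pso(cost, dim, n_particles=20, iters=100, bounds=(0.0, 10.0),
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal particle swarm optimizer (a sketch, not the paper's settings)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Stand-in cost: distance to hypothetical "ideal" PID gains (Kp, Ki, Kd).
target = (2.0, 0.5, 1.0)
best, best_cost = pso(lambda g: sum((a - b) ** 2 for a, b in zip(g, target)),
                      dim=3)
```

In the paper's setting, `cost` would simulate the three-area system with the candidate gains and return, e.g., an integral error criterion of the frequency and tie-line power deviations.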
Abstract: Speckle noise affects all coherent imaging systems
including medical ultrasound. In medical images, noise suppression
is a particularly delicate and difficult task. A tradeoff between noise
reduction and the preservation of actual image features has to be made
in a way that enhances the diagnostically relevant image content.
Even though wavelets have been extensively used for denoising
speckle images, we have found that denoising using contourlets gives
much better performance in terms of SNR, PSNR, MSE, variance and
correlation coefficient. The objective of this paper is to determine the
number of levels of Laplacian pyramidal decomposition, the number
of directional decompositions to perform at each pyramidal level, and
the thresholding scheme that together yield optimal despeckling of medical
ultrasound images in particular. In the proposed method, the
log-transformed original ultrasound image is subjected to the contourlet
transform to obtain contourlet coefficients. The transformed
image is then denoised by applying thresholding to the individual
bandpass subbands using a Bayes shrinkage rule. We quantify the
achieved performance improvement.
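The Bayes shrinkage step can be sketched as follows for a single subband; this is the standard BayesShrink rule (threshold T = σn²/σx with a median-based noise estimate), which we assume matches the rule used in the paper:

```python
import math
import statistics

def estimate_sigma(finest_subband):
    """Robust noise estimate from the finest subband: MAD / 0.6745."""
    return statistics.median(abs(c) for c in finest_subband) / 0.6745

def bayes_shrink_threshold(subband, sigma_noise):
    """BayesShrink threshold T = sigma_n^2 / sigma_x for one subband,
    where sigma_x^2 = max(sigma_y^2 - sigma_n^2, 0)."""
    var_y = sum(c * c for c in subband) / len(subband)
    var_x = max(var_y - sigma_noise ** 2, 0.0)
    if var_x == 0.0:
        return max(abs(c) for c in subband)  # noise dominates: kill subband
    return sigma_noise ** 2 / math.sqrt(var_x)

def soft_threshold(c, t):
    """Soft thresholding of a single coefficient."""
    return math.copysign(max(abs(c) - t, 0.0), c)
```

Each bandpass subband of the contourlet decomposition gets its own threshold, and the denoised image is recovered by the inverse transform followed by exponentiation (undoing the log transform).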
Abstract: This paper proposes an active soft-switching circuit for
bridge converters aiming to improve the power conversion efficiency.
The proposed circuit achieves loss-less switching for both main and
auxiliary switches without increasing the main switch current/voltage
rating. A winding coupled to the primary of power transformer
ensures ZCS for the auxiliary switches during their turn-off. A 350 W,
100 kHz phase shifted full bridge (PSFB) converter is built to validate
the analysis and design. Theoretical loss calculations for the proposed
circuit are presented. The proposed circuit is compared with a passive
soft-switched PSFB converter in terms of efficiency and loss of duty cycle.
Abstract: This paper presents a novel approach for optimal
reconfiguration of radial distribution systems. Optimal
reconfiguration involves the selection of the best set of branches to
be opened, one from each loop, such that the resulting radial
distribution system achieves the desired performance. In this paper,
an algorithm based on simple heuristic rules is proposed to identify
an effective switch-status configuration of the distribution system for
minimum loss. The proposed algorithm consists of two
parts: one determines the best switching combination in each loop
with minimum computational effort, and the other is a simple optimal
power-loss calculation, by load flow, of the best switching combination
found in the first part. To demonstrate the validity of the proposed
algorithm, computer simulations are carried out on a 33-bus system.
The results show that the performance of the proposed method is
better than that of the other methods.
Abstract: Most of the existing text mining approaches were
proposed with the transaction database model in mind. Thus, the
mined dataset is structured using just one concept, the “transaction”,
whereas the whole dataset is modeled using the “set” abstract type. In
such cases, the structure of the whole dataset and the relationships
among the transactions themselves are not modeled and
consequently, not considered in the mining process.
We believe that taking the structural properties of
hierarchically structured information (e.g. textual documents)
into account in the mining process can lead to better results. For this
purpose, a hierarchical association rule mining approach for textual
documents is proposed in this paper, and the classical set-oriented
mining approach is reconsidered in favour of a Directed Acyclic Graph
(DAG) oriented approach. Natural language processing techniques are
used to obtain the DAG structure. Based on this graph model, a
hierarchical bottom-up algorithm is proposed. The main idea is that
each node is mined together with its parent node.
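A hedged sketch of the "mine each node with its parent" idea: here the DAG is reduced to a parent map, and the mined associations are simply child-parent term pairs with a minimum count. The data layout and counting rule are illustrative assumptions, not the paper's algorithm:

```python
from collections import Counter

def mine_bottom_up(dag, terms, min_count=2):
    """Sketch of hierarchical bottom-up association mining.

    dag: {node: parent or None}; terms: {node: set of terms}.
    Each node is mined with its parent: we count term pairs that
    co-occur between a node and its parent across the hierarchy,
    keeping pairs that reach min_count.
    """
    pair_counts = Counter()
    for node, parent in dag.items():
        if parent is None:
            continue  # the root has no parent to mine with
        for t_child in terms.get(node, ()):
            for t_parent in terms.get(parent, ()):
                pair_counts[(t_child, t_parent)] += 1
    return {pair: c for pair, c in pair_counts.items() if c >= min_count}
```

In the full approach the DAG would come from NLP preprocessing of the document (sections, paragraphs, sentences), and the per-node mining step would be a proper association rule miner rather than simple pair counting.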
Abstract: The paper examines the performance of bit-interleaved parity (BIP) methods in error rate monitoring, and in declaration and clearing of alarms in those transport networks that employ automatic protection switching (APS). The BIP-based error rate monitoring is attractive for its simplicity and ease of implementation. The BIP-based results are compared with exact results and are found to declare the alarms too late, and to clear the alarms too early. It is concluded that the standards development and systems implementation should take into account the fact of early clearing and late declaration of alarms. The window parameters defining the detection and clearing thresholds should be set so as to build sufficient hysteresis into the system to ensure that BIP-based implementations yield acceptable performance results.
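For concreteness, a BIP-8 code as used in such monitoring is the byte-wise XOR of the monitored block, so that each bit of the code provides even parity over one bit position; a sketch:

```python
def bip8(block: bytes) -> int:
    """BIP-8 over a monitored block: XOR of all bytes, so bit i of the
    code gives even parity over bit position i of every byte."""
    code = 0
    for b in block:
        code ^= b
    return code

def bip_violations(tx_code: int, rx_block: bytes) -> int:
    """Parity violations seen by the monitor: bits that differ between
    the transmitted code and the code recomputed at the receiver."""
    return bin(tx_code ^ bip8(rx_block)).count("1")
```

The monitoring limitation discussed above follows from this construction: each of the 8 parity bits can only signal an odd number of bit errors in its position, so the violation count underestimates the true error count at high error rates, which feeds into the late-declaration and early-clearing behaviour of the alarms.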
Abstract: Cryptography, image watermarking and e-banking are
filled with apparent oxymora and paradoxes. Random sequences are
used as keys to encrypt the information to be used as a watermark,
both when embedding the watermark and when extracting it during
detection. The keys are also heavily used in round-the-clock
(24x7x365) banking operations. Therefore, a deterministic random
sequence is very useful for online applications. In order to obtain the same
random sequence, we need to supply the same seed to the generator.
Many researchers have used Deterministic Random Number
Generators (DRNGs) for cryptographic applications and Pseudo
Noise Random sequences (PNs) for watermarking. Even though
PN sequences have some weaknesses under attack, the research
community has mostly used them in digital watermarking. On the other hand,
DRNGs have not been widely used in online watermarking due to their
computational complexity and lack of robustness. We have therefore
devised a new design for generating a DRNG using a Pi series, to make it
useful for online cryptographic, digital watermarking and banking
applications.
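The paper's exact Pi-series construction is not given in the abstract; as a hedged sketch, a deterministic digit stream can be derived from π via Machin's formula, so that the same request always reproduces the same sequence:

```python
from decimal import Decimal, getcontext

def _arctan_recip(x: int, eps: Decimal) -> Decimal:
    """arctan(1/x) by its Taylor series, truncated when terms drop below eps."""
    x2 = Decimal(x) * Decimal(x)
    power = Decimal(1) / Decimal(x)
    total = power
    k = 1
    while True:
        power /= x2
        term = power / (2 * k + 1)
        if term < eps:
            break
        total += -term if k % 2 else term  # alternating series
        k += 1
    return total

def pi_digit_sequence(n: int) -> str:
    """First n decimal digits of pi after the point, via Machin's formula:
    pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    getcontext().prec = n + 15            # guard digits for rounding
    eps = Decimal(10) ** -(n + 10)
    pi = 16 * _arctan_recip(5, eps) - 4 * _arctan_recip(239, eps)
    return str(pi)[2:2 + n]               # drop the leading "3."
```

A binary keystream could then be taken as, e.g., `[int(d) & 1 for d in pi_digit_sequence(64)]`; any party that knows the agreed length and offset regenerates the identical sequence, which is the determinism the applications above require.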
Abstract: Many companies have switched their processes to a project-oriented approach in recent years. This brings new possibilities and effectiveness not only in the field of external processes connected with product delivery but in internal processes as well. However, the centralized project organization, which is based on the role of the project manager in the team, has proved insufficient in some cases. Agile methods of project organization try to solve this problem by bringing a new view of project organization, roles, processes and competences. Scrum is one of these methods; it builds on the principles of knowledge management to drive the project to effectiveness from all angles. Using this method to organize internal and delivery projects helps the organization create and share knowledge throughout the company. It also supports the formation of unique competences of individuals and project teams and drives innovation in the company.
Abstract: Frequent patterns are patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a dataset. Most of the proposed frequent pattern mining algorithms have been implemented with imperative programming languages such as C, C++ and Java. The imperative paradigm is significantly inefficient when the itemset is large and the frequent patterns are long. We suggest a high-level declarative style of programming using a functional language. Our supposition is that the problem of frequent pattern discovery can be efficiently and concisely implemented via a functional paradigm, since pattern matching is a fundamental feature supported by most functional languages. Our frequent pattern mining implementation in the Haskell language confirms our hypothesis about the conciseness of the program. Performance studies on speed and memory usage support our intuition about the efficiency of functional languages.
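As a point of reference for the imperative/declarative comparison, a compact frequent-itemset miner (classic level-wise Apriori, our choice; the abstract does not name the algorithm) looks like this:

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Classic Apriori: level-wise generation of frequent itemsets.

    Returns {itemset: support} for every itemset whose support (fraction
    of transactions containing it) is at least min_support.
    """
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)

    def support(itemset):
        return sum(1 for t in transactions if itemset <= t) / n

    frequent = {}
    level = [frozenset([i]) for i in sorted({i for t in transactions for i in t})]
    level = [s for s in level if support(s) >= min_support]
    k = 1
    while level:
        for s in level:
            frequent[s] = support(s)
        # join frequent k-itemsets into (k+1)-candidates, then prune:
        # every k-subset of a surviving candidate must itself be frequent
        candidates = {a | b for a in level for b in level if len(a | b) == k + 1}
        level = [c for c in candidates
                 if all(frozenset(sub) in frequent for sub in combinations(c, k))
                 and support(c) >= min_support]
        k += 1
    return frequent
```

The candidate-generation and subset tests are exactly the parts that map naturally onto pattern matching and list comprehension in a functional language, which is the comparison the abstract draws.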
Abstract: Before performing polymerase chain reactions (PCR), a feasible primer set is required. Many primer design methods have been proposed for designing a feasible primer set. However, the majority of these methods require a relatively long time to obtain an optimal solution, since large quantities of template DNA need to be analyzed. Furthermore, the designed primer sets usually do not provide a specific PCR product. In recent years, evolutionary computation has been applied to PCR primer design and has yielded promising results. In this paper, a particle swarm optimization (PSO) algorithm is proposed to solve primer design problems associated with providing a specific product for PCR experiments. A test set of the gene CYP1A1, associated with a heightened lung cancer risk, was analyzed, and accuracy and running time were compared with those of the genetic algorithm (GA) and the memetic algorithm (MA). A comparison of the results indicated that the proposed PSO method for primer design finds optimal or near-optimal primer sets and effective PCR products in a relatively short time.
Abstract: Our medicine-oriented research is based on a medical
data set of real patients. Sharing private patient data with
people other than the clinician or hospital staff is a security
problem, so person-identifying information has to be removed
from the medical data. After a de-identification process, the
medical data, stripped of private information, are available for
any research purpose. In this paper, we introduce a universal
automatic rule-based de-identification application that performs
all of this on heterogeneous medical data. Each patient's private
identification is replaced by a unique identification number, even
in burned-in annotations in pixel data. The same identification
number is used for all of a patient's medical data, so relationships
within the data are preserved. The hospital can then take advantage
of research feedback based on the results.
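A minimal sketch of rule-based de-identification with stable surrogate numbers follows; the two patterns are illustrative placeholders, since a real rule set for heterogeneous medical data (including burned-in pixel annotations) would be far larger:

```python
import re

# Hypothetical rules; a production rule set would cover many more
# identifier formats (dates of birth, addresses, record numbers, ...).
PATTERNS = [
    re.compile(r"\b\d{9,10}\b"),                 # national-ID-like numbers
    re.compile(r"\b[A-Z][a-z]+ [A-Z][a-z]+\b"),  # naive "First Last" names
]

def deidentify(records):
    """Replace each distinct identifier with a stable surrogate number,
    so relationships among a patient's records are preserved."""
    surrogate = {}

    def repl(match):
        key = match.group(0)
        if key not in surrogate:
            surrogate[key] = f"PATIENT-{len(surrogate) + 1:06d}"
        return surrogate[key]

    out = []
    for text in records:
        for pat in PATTERNS:
            text = pat.sub(repl, text)
        out.append(text)
    return out
```

Because the surrogate map is shared across all records, the same real identifier always maps to the same surrogate, which is what keeps cross-record relationships intact after de-identification.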
Abstract: Theory of Constraints has been emerging as an
important tool for optimization of manufacturing/service systems.
Goldratt, in his first book “The Goal”, gave an introduction to the
Theory of Constraints and its applications in a factory scenario. A
large number of production managers around the globe read this book,
but only a few could implement it in their plants because the book did
not explain the steps needed to implement TOC in the factory. To overcome
these limitations, Goldratt wrote a further book to explain TOC,
drum-buffer-rope (DBR) scheduling and the method to implement them.
In this paper, an attempt has been made to summarize the salient
features of TOC and DBR described in the book and the correct approach
to implementing TOC in a factory setting. The simulator available with
the book was actually used by the authors, and Goldratt's claim that
DBR and buffer management ease the work of production managers was
tested and found to be correct.
Abstract: This paper presents the use of the Legendre pseudospectral
method for the optimization of finite-thrust orbital transfers for
spacecraft. In order to obtain an accurate solution, the system's
dynamics equations were normalized through nondimensionalization.
The Legendre pseudospectral method is based on interpolating
functions on Legendre-Gauss-Lobatto (LGL) quadrature nodes. This
is used to transform the optimal control problem into a constrained
parameter optimization problem. The developed novel optimization
algorithm can be used to solve similar optimization problems of
spacecraft finite-thrust orbital transfer. The results of a numerical
simulation verified the validity of the proposed optimization method.
The simulation results reveal that the pseudospectral optimization
method is a promising method for real-time trajectory optimization,
providing good accuracy and fast convergence.
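The LGL quadrature nodes the method interpolates on are the roots of (1 − x²)P′ₙ(x); they can be computed by Newton iteration from a Chebyshev initial guess. A pure-Python sketch of this standard construction (not code from the paper):

```python
import math

def lgl_nodes(n):
    """Legendre-Gauss-Lobatto nodes and weights on [-1, 1] (n + 1 points).

    Nodes are the roots of (1 - x^2) * P_n'(x), found by Newton iteration
    from a Chebyshev-Gauss-Lobatto initial guess; the weights are
    w_i = 2 / (n * (n + 1) * P_n(x_i)^2).
    """
    def legendre_pair(t):
        # (P_{n-1}(t), P_n(t)) via the three-term recurrence
        p_prev, p = 1.0, t
        for k in range(2, n + 1):
            p_prev, p = p, ((2 * k - 1) * t * p - (k - 1) * p_prev) / k
        return p_prev, p

    x = [math.cos(math.pi * j / n) for j in range(n + 1)]
    for _ in range(100):
        max_dx = 0.0
        for i in range(n + 1):
            p_prev, p = legendre_pair(x[i])
            dx = (x[i] * p - p_prev) / ((n + 1) * p)  # Newton step
            x[i] -= dx
            max_dx = max(max_dx, abs(dx))
        if max_dx < 1e-14:
            break
    x.sort()
    w = [2.0 / (n * (n + 1) * legendre_pair(xi)[1] ** 2) for xi in x]
    return x, w
```

Collocating the state and control at these nodes, and replacing the dynamics by derivative constraints of the interpolating polynomial, is what turns the optimal control problem into the constrained parameter optimization problem described above.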
Abstract: We analyze the effectiveness of different pseudo-noise (PN) and orthogonal sequences for encrypting speech signals in terms of perceptual intelligibility. A speech signal can be viewed as a sequence of correlated samples, and each sample as a sequence of bits. The residual intelligibility of the speech signal can be reduced by removing the correlation among the speech samples. PN sequences have random-like properties that help in reducing the correlation among speech samples. The mean square aperiodic auto-correlation (MSAAC) and the mean square aperiodic cross-correlation (MSACC) measures are used to test the randomness of the PN sequences. The results of the investigation show the effectiveness of large Kasami sequences for this purpose among the many PN sequences examined.
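The MSAAC measure can be sketched as follows for a ±1 sequence. The normalization used here (off-peak correlations divided by the zero-shift peak N, then mean-squared) is one common convention, which we assume; MSACC is analogous with two different sequences in the correlation sum:

```python
def aperiodic_autocorr(seq, tau):
    """Aperiodic autocorrelation of a +/-1 sequence at shift tau."""
    return sum(seq[i] * seq[i + tau] for i in range(len(seq) - tau))

def msaac(seq):
    """Mean-square aperiodic autocorrelation: off-peak correlations,
    normalized by the zero-shift peak N, then mean-squared."""
    n = len(seq)
    return sum((aperiodic_autocorr(seq, t) / n) ** 2
               for t in range(1, n)) / (n - 1)
```

A sequence with low MSAAC has small off-peak self-similarity, which is exactly the property that helps destroy inter-sample correlation, and hence residual intelligibility, when the sequence is used to scramble speech samples.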