Abstract: A dual-reciprocity boundary element method is presented
for the numerical solution of a class of axisymmetric elastodynamic
problems. The domain integrals that arise in the integrodifferential
formulation are converted to line integrals by using the
dual-reciprocity method together with suitably constructed interpolating
functions. The second-order time derivatives of the displacement
in the governing partial differential equations are suppressed by
using the Laplace transformation. In the Laplace transform domain, the
problem under consideration is eventually reduced to solving a system
of linear algebraic equations. Once the linear algebraic equations are
solved, the displacement and stress fields in the physical domain can
be recovered by using a numerical technique for inverting Laplace
transforms.
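The final recovery step, numerical inversion of the Laplace transform, can be illustrated with the well-known Gaver-Stehfest algorithm. This is a minimal sketch for a scalar transform; the abstract does not specify which inversion technique the paper actually uses.

```python
from math import factorial, log, exp

def stehfest_coeffs(N):
    # Stehfest weights V_1..V_N (N must be even).
    V = []
    for k in range(1, N + 1):
        s = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            s += (j ** (N // 2) * factorial(2 * j)
                  / (factorial(N // 2 - j) * factorial(j)
                     * factorial(j - 1) * factorial(k - j)
                     * factorial(2 * j - k)))
        V.append((-1) ** (k + N // 2) * s)
    return V

def invert_laplace(F, t, N=12):
    # Gaver-Stehfest inversion: f(t) ~ (ln 2 / t) * sum_k V_k F(k ln 2 / t).
    V = stehfest_coeffs(N)
    a = log(2.0) / t
    return a * sum(V[k - 1] * F(k * a) for k in range(1, N + 1))

# Check against a known pair: F(s) = 1/(s + 1)  <->  f(t) = exp(-t).
approx = invert_laplace(lambda s: 1.0 / (s + 1.0), t=1.0)
print(round(approx, 4))
```

The method works well for smooth transforms like the one above; stiffer displacement or stress histories may need a different inversion scheme.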
Abstract: Maximal Ratio Combining (MRC) is considered the most complex combining technique, as it requires estimation of the channel coefficients, but it yields the lowest bit error rate (BER) of all combining techniques. However, the BER deteriorates as errors are introduced into the channel coefficient estimates. A novel combining technique, termed Generalized Maximal Ratio Combining (GMRC) with a polynomial kernel, yields a BER identical to that of MRC under perfect channel estimation and a lower BER in the presence of channel estimation errors. We show that GMRC outperforms the optimal MRC scheme in general and introduce it to the scientific community as a new "supraoptimal" algorithm. Since diversity combining is especially effective in small femto- and pico-cells, Internet-associated wireless peripheral systems stand to benefit most from GMRC, and many spin-off applications are possible in IP-based 4th-generation networks.
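As a point of reference, classical MRC weights each diversity branch by the conjugate of its estimated channel coefficient before summing. The sketch below illustrates this baseline for BPSK with illustrative values; it is not the GMRC algorithm proposed in the paper.

```python
import random

def mrc_combine(received, channel):
    # Maximal ratio combining: weight each branch by the conjugate
    # of its channel coefficient, then sum all branches.
    return sum(r * h.conjugate() for r, h in zip(received, channel))

def detect_bpsk(z):
    # BPSK hard decision on the real part of the combined statistic.
    return 1 if z.real >= 0 else -1

random.seed(0)
symbol = -1                                  # transmitted BPSK symbol
channel = [complex(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(4)]
noise = [complex(random.gauss(0, 0.1), random.gauss(0, 0.1)) for _ in range(4)]
received = [h * symbol + n for h, n in zip(channel, noise)]

print(detect_bpsk(mrc_combine(received, channel)))   # recovers the symbol
```

Degrading `channel` before combining (to mimic estimation error) is exactly the regime in which the abstract claims GMRC retains an advantage.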
Abstract: Pressures for urban redevelopment are intensifying in
all large cities. A new logic for urban development is required –
green urbanism – that provides a spatial framework for directing
population and investment inwards to brownfield and greyfield
precincts, rather than outwards to greenfields. This represents
both a major opportunity and a major challenge for city planners in
pluralist liberal democracies. However, plans for more compact
forms of urban redevelopment are stalling in the face of community
resistance. A new paradigm and spatial planning platform are required
that will support timely multi-level and multi-actor stakeholder
engagement, resulting in the emergence of consensus plans for
precinct-level urban regeneration capable of more rapid
implementation. Using Melbourne, Australia as a case study, this
paper addresses two of the urban intervention challenges – where and
how – via the application of ENVISION, a 21st-century planning tool
created for this purpose.
Abstract: Decision feedback equalizers (DFEs) usually outperform linear equalizers on channels with intersymbol interference. However, DFE performance depends strongly on the availability of reliable past decisions. Hence, in coded systems, where reliable decisions become available only after the full block is decoded, DFE performance suffers. A symbol-based DFE is one that uses decisions only after the block has been decoded. In this paper we derive the optimal settings of both the feedforward and feedback taps of the symbol-based equalizer. We present a novel symbol-based DFE filter bank and derive its optimal tap settings. We also show that it outperforms the classic DFE in terms of complexity and/or performance.
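The basic DFE structure the abstract builds on, a feedforward filter plus feedback cancellation of past decisions, can be sketched as follows. This is a minimal hard-decision illustration with hand-picked taps, not the optimal symbol-based settings derived in the paper.

```python
def dfe_equalize(received, ff_taps, fb_taps):
    # Decision feedback equalizer: feedforward filter on received
    # samples, feedback filter on previously detected symbols.
    decisions = []
    for n in range(len(received)):
        # Feedforward part: current and past received samples.
        y = sum(ff_taps[k] * received[n - k]
                for k in range(len(ff_taps)) if n - k >= 0)
        # Feedback part: subtract ISI reconstructed from past decisions.
        y -= sum(fb_taps[k] * decisions[n - 1 - k]
                 for k in range(len(fb_taps)) if n - 1 - k >= 0)
        decisions.append(1.0 if y >= 0 else -1.0)
    return decisions

# Channel h = [1, 0.5] introduces one-tap postcursor ISI; with correct
# past decisions, fb_taps = [0.5] cancels it exactly (noise-free case).
symbols = [1, -1, -1, 1, 1, -1]
rx = [symbols[n] + (0.5 * symbols[n - 1] if n > 0 else 0.0)
      for n in range(len(symbols))]
print(dfe_equalize(rx, ff_taps=[1.0], fb_taps=[0.5]))  # recovers symbols
```

The weakness the abstract targets is visible in the loop: each decision feeds the next, so one wrong (pre-decoding) decision propagates, which is what motivates deferring to post-decoding symbol decisions.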
Abstract: In this work, the primary compressive strength
components of human femur trabecular bone are qualitatively
assessed using image processing and wavelet analysis. The Primary
Compressive (PC) component in planar radiographic femur trabecular
images (N=50) is delineated by a semi-automatic image processing
procedure. An auto-threshold binarization algorithm is employed to
recognize the presence of mineralization in the digitized images. The
qualitative parameters such as apparent mineralization and total area
associated with the PC region are derived for normal and abnormal
images. The two-dimensional discrete wavelet transform is utilized
to obtain appropriate features that quantify texture changes in medical
images. The normal and abnormal samples of the human femur are
comprehensively analyzed using the Haar wavelet. Six statistical
parameters, namely mean, median, mode, standard deviation, mean
absolute deviation and median absolute deviation are derived at level
4 decomposition for both approximation and horizontal wavelet
coefficients. The correlation coefficients of the various wavelet-derived
parameters with the normal and abnormal groups, for both the
approximation and horizontal coefficients, are estimated. In almost all
cases the abnormal images show a higher degree of correlation than the
normal ones. Further, the parameters derived from the approximation
coefficients show higher correlation than those derived from the
horizontal coefficients. The mean and median computed at the output of
the level-4 Haar wavelet channel were found to be useful predictors for
delineating the normal and the abnormal groups.
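The level-4 Haar decomposition and the derived statistics can be illustrated with a minimal one-dimensional sketch. The paper applies the two-dimensional transform to radiographic images; the row of values below is only a stand-in signal.

```python
import statistics

def haar_dwt(signal):
    # One level of the 1-D Haar transform: approximation and detail.
    approx = [(signal[2 * i] + signal[2 * i + 1]) / 2 ** 0.5
              for i in range(len(signal) // 2)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / 2 ** 0.5
              for i in range(len(signal) // 2)]
    return approx, detail

def haar_level(signal, level):
    # Iterate the transform down to the requested decomposition level.
    approx = list(signal)
    detail = []
    for _ in range(level):
        approx, detail = haar_dwt(approx)
    return approx, detail

row = [float(x) for x in range(32)]      # stand-in for one image row
a4, d4 = haar_level(row, level=4)        # level-4 approximation/detail
print(round(statistics.mean(a4), 3), round(statistics.median(a4), 3))
```

Statistics such as mean, median, standard deviation and the absolute deviations would then be computed over coefficient arrays like `a4` and `d4`, per image, before correlating them with the normal/abnormal labels.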
Abstract: The clinical usefulness of heart rate variability is
limited by the range of available Holter monitoring software. These
software algorithms require a normal sinus rhythm to accurately
acquire heart rate variability (HRV) measures in the frequency
domain. Premature ventricular contractions (PVCs), more
commonly referred to as ectopic beats and frequent in heart failure,
hinder this analysis and introduce ambiguity. This investigation
demonstrates an algorithm to automatically detect ectopic beats by
analyzing discrete wavelet transform coefficients. Two techniques
for filtering and replacing the ectopic beats from the RR signal are
compared. One technique applies wavelet hard thresholding
techniques and another applies linear interpolation to replace ectopic
cycles. The results demonstrate, through simulation and signals
acquired from a 24-hour ambulatory recorder, that these techniques can
accurately detect PVCs and remove the noise and leakage effects
produced by ectopic cycles, retaining smooth spectra with
minimal error.
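The linear-interpolation replacement of ectopic cycles can be sketched as follows. This is a minimal illustration on a hypothetical RR-interval series with the ectopic indices assumed already detected; the wavelet-based detection step itself is not shown.

```python
def replace_ectopic(rr, ectopic_idx):
    # Replace flagged ectopic RR intervals by linear interpolation
    # between the nearest non-ectopic neighbours.
    ectopic = set(ectopic_idx)
    clean = list(rr)
    for i in sorted(ectopic):
        left = i - 1
        while left in ectopic:
            left -= 1
        right = i + 1
        while right in ectopic:
            right += 1
        if left < 0 or right >= len(rr):
            continue  # cannot interpolate at the record edges
        frac = (i - left) / (right - left)
        clean[i] = rr[left] + frac * (rr[right] - rr[left])
    return clean

# A PVC typically shows as a short coupling interval followed by a
# compensatory pause (indices 2 and 3 in this toy series, in seconds).
rr = [0.80, 0.82, 0.40, 1.20, 0.81, 0.83]
print(replace_ectopic(rr, [2, 3]))
```

Replacing rather than deleting the beats keeps the series evenly indexed, which is what preserves a smooth frequency-domain HRV spectrum.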
Abstract: In-place sorting algorithms play an important role in many fields, such as very large database systems, data warehouses, and data mining, because they maximize the size of data that can be processed in main memory without input/output operations. In this paper, a novel in-place sorting algorithm is presented. The algorithm comprises two phases: the first rearranges the input unsorted array in place, producing segments that are ordered relative to each other but whose elements are yet to be sorted; the second sorts the elements of each segment in place. The first phase requires linear time, while the second sorts each segment of size z in O(z log z) time using O(1) auxiliary storage. In the worst case, for an array of size n, the algorithm performs O(n log z) element comparisons and O(n log z) element moves. Further, no auxiliary arithmetic operations with indices are required. Beyond these theoretical results, the algorithm is of practical interest because of its simplicity, and experimental results show that it outperforms other in-place sorting algorithms. Finally, the analysis of time and space complexity and of the required number of moves is presented, along with the auxiliary storage requirements of the proposed algorithm.
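The two-phase structure can be illustrated with a generic sketch: one linear partitioning pass that leaves segments ordered relative to each other, followed by in-place O(z log z) sorting of each segment. This is only an illustration of the scheme; it is not the paper's algorithm, whose rearrangement phase produces its segments differently.

```python
def heapsort_inplace(a, lo, hi):
    # In-place heapsort of a[lo:hi]: O(z log z) time, O(1) extra space.
    n = hi - lo
    def sift(start, end):
        root = start
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and a[lo + child] < a[lo + child + 1]:
                child += 1
            if a[lo + root] < a[lo + child]:
                a[lo + root], a[lo + child] = a[lo + child], a[lo + root]
                root = child
            else:
                return
    for start in range(n // 2 - 1, -1, -1):   # build a max-heap
        sift(start, n)
    for end in range(n - 1, 0, -1):           # repeatedly extract the max
        a[lo], a[lo + end] = a[lo + end], a[lo]
        sift(0, end)

def two_phase_sort(a):
    # Phase 1: one linear partitioning pass around a pivot produces two
    # segments ordered relative to each other (every element on the
    # left <= pivot <= every element on the right).
    pivot = a[len(a) // 2]
    i, j = 0, len(a) - 1
    while i <= j:
        while a[i] < pivot:
            i += 1
        while a[j] > pivot:
            j -= 1
        if i <= j:
            a[i], a[j] = a[j], a[i]
            i, j = i + 1, j - 1
    # Phase 2: sort each segment in place with O(1) auxiliary storage.
    heapsort_inplace(a, 0, i)
    heapsort_inplace(a, i, len(a))
    return a

print(two_phase_sort([5, 2, 9, 1, 7, 3, 8, 4, 6]))
```

With many pivots instead of one, phase 2 segments become short, which is how a bound like O(n log z) with z much smaller than n can arise.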
Abstract: Data clustering is an important data exploration technique
with many applications in data mining. We present an enhanced
version of the well-known single-link clustering algorithm. We will
refer to this algorithm as DCBOR. The proposed algorithm alleviates
the chain effect by removing the outliers from the given dataset.
So this algorithm provides outlier detection and data clustering
simultaneously. The algorithm does not need to update the distance
matrix, since it depends on merging the k-nearest
objects in one step, and a cluster continues to grow as long as possible
under a specified condition. The algorithm consists of two phases:
in the first phase, it removes the outliers from the input dataset; in
the second phase, it performs the clustering process. The algorithm
discovers clusters of different shapes, sizes and densities, and requires
only one input parameter, which represents a threshold for
outlier points. The value of the input parameter ranges from 0 to
1, and the algorithm supports the user in determining an appropriate
value for it. We have tested the algorithm on different datasets
containing outliers and clusters connected by chains of density points,
and it discovers the correct clusters. The results of
our experiments demonstrate the effectiveness and the efficiency of
DCBOR.
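The outlier-removal phase can be illustrated with a generic k-nearest-neighbour distance score, normalised to [0, 1] and cut at the user-supplied threshold. This is an illustrative stand-in for that phase, not necessarily the exact scoring rule used by DCBOR.

```python
def knn_outliers(points, k, threshold):
    # Flag points whose mean distance to their k nearest neighbours,
    # normalised by the largest such score, exceeds the threshold.
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    scores = []
    for p in points:
        d = sorted(dist(p, q) for q in points if q is not p)
        scores.append(sum(d[:k]) / k)     # mean k-NN distance of p
    top = max(scores)
    return [i for i, s in enumerate(scores) if s / top > threshold]

# Four points in a tight cluster plus one far-away point.
pts = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10)]
print(knn_outliers(pts, k=2, threshold=0.5))
```

Because the score is normalised by its maximum, the single threshold in [0, 1] behaves consistently across datasets of different scales, matching the abstract's single-parameter design.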
Abstract: The Shadoo protein (Sho) was described in 2003 as the newest member of the prion protein superfamily [1]. Sho has structural motifs similar to those of the prion protein (PrP), which is known for its central role in transmissible spongiform encephalopathies. Although a great number of functions have been proposed, the exact physiological function of PrP is not yet known. Investigating the function and localization of Sho may help us understand the function of the prion protein superfamily. Analyzing the subcellular localization of YFP-tagged forms of Sho, we detected the protein in the plasma membrane and in the nucleus of various cell lines. To reveal the localization of the endogenous protein, we generated antibodies against Shadoo and also employed commercially available anti-Shadoo antibodies: i) the EG62 anti-mouse Shadoo antibody generated by Eurogentec Ltd.; ii) the S-12 anti-human Shadoo antibody by Santa Cruz Biotechnology Inc.; iii) the R-12 anti-mouse Shadoo antibody by Santa Cruz Biotechnology Inc.; and iv) the SPRN antibody against human Shadoo by Abgent Inc. We carried out immunocytochemistry on non-transfected HeLa, Zpl 2-1, Zw 3-5, GT1-1, GT1-7 and SH-SY5Y cells, as well as on YFP-Sho, Sho-YFP, and YFP-GPI transfected HeLa cells. The specificity of the antibodies (in an antibody-peptide competition assay) and their co-localization with the YFP signal were assessed.
Abstract: Various methods based on regression ideas have been created to handle data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for it thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. Called the renovated Cook's distance, RD*_i, it has been developed based on Cook's idea. The renovated Cook's distance has advantages (depending on the analyst's demand) over (i) the change in the fitted value for a single case, DFIT*_i, as it measures the influence of case i on all n fitted values Y^*, not just the fitted value for case i as DFIT*_i does; and (ii) the change in the coefficient estimate when the ith case is deleted, DBETA*_i, since DBETA*_i corresponds to the number of variables p, so it is usually easier to look at a single diagnostic measure such as RD*_i, in which information from the p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
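For reference, the classical Cook's distance on which RD*_i is based can be computed as below for simple linear regression on uncensored data. The renovated version for censored observations is the paper's contribution and is not reproduced here.

```python
def cooks_distance(x, y):
    # Classical Cook's distance for simple linear regression:
    # D_i = e_i^2 * h_ii / (p * s^2 * (1 - h_ii)^2), with p = 2.
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / sxx
    a = my - b * mx
    resid = [yi - (a + b * xi) for xi, yi in zip(x, y)]
    p = 2                                     # intercept + slope
    s2 = sum(e * e for e in resid) / (n - p)  # residual variance
    lev = [1 / n + (xi - mx) ** 2 / sxx for xi in x]   # leverages h_ii
    return [e * e * h / (p * s2 * (1 - h) ** 2)
            for e, h in zip(resid, lev)]

x = [1, 2, 3, 4, 5]
y = [1.1, 1.9, 3.2, 3.9, 10.0]     # last point is influential
d = cooks_distance(x, y)
print(d.index(max(d)))             # index of the most influential case
```

As the abstract notes, a single per-case measure like this (or RD*_i) is easier to scan than p separate DBETA values.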
Abstract: There are two common types of operational research techniques: optimisation and metaheuristic methods. The latter may be defined as sequential processes that intelligently perform the exploration and exploitation adopted from natural intelligence to form iterative searches, with the aim of effectively determining near-optimal solutions in a solution space. In this work, a metaheuristic called Ant Colony Optimisation (ACO), inspired by the foraging behaviour of ants, was adapted to find optimal solutions of eight non-linear continuous mathematical models. Within the solution space considered, a specified region of each model, there may be a global optimum or multiple local optima. Moreover, the algorithm has several common parameters, the numbers of ants, moves, and iterations, which act as the algorithm's drivers. A series of computational experiments for initialising the parameters was conducted using the Rigid Simplex (RS) and Modified Simplex (MSM) methods. The experimental results were analysed in terms of the best-so-far solutions, mean, and standard deviation, leading to a recommendation of proper level settings of the ACO parameters for all eight functions. These parameter settings can be applied as a guideline for future uses of ACO, to promote its ease of use in real industrial processes. It was found that the results obtained from MSM were very similar to those gained from RS; however, when the results with noise standard deviations of 1 and 3 are compared, MSM reaches optimal solutions more efficiently than RS in terms of speed of convergence.
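A continuous ACO can be sketched in the style of the solution-archive variant, where a small archive of good solutions plays the role of the pheromone trail. All parameter values below are illustrative defaults, not the settings recommended by the paper.

```python
import random

def aco_minimise(f, lo, hi, n_ants=20, archive_size=5, iters=100, seed=42):
    # Continuous ACO sketch: ants sample near randomly chosen archive
    # members with a spread proportional to the archive's dispersion;
    # the archive (the "pheromone") keeps the best solutions found.
    rng = random.Random(seed)
    archive = sorted((rng.uniform(lo, hi) for _ in range(archive_size)),
                     key=f)
    for _ in range(iters):
        spread = max(archive) - min(archive) + 1e-12
        ants = [min(hi, max(lo, rng.choice(archive) + rng.gauss(0, spread)))
                for _ in range(n_ants)]
        # Elitist update: keep the best of old archive plus new ants.
        archive = sorted(archive + ants, key=f)[:archive_size]
    return archive[0]

# One-dimensional test model: minimum of (x - 2)^2 on [-5, 5] is x = 2.
best = aco_minimise(lambda x: (x - 2.0) ** 2, -5.0, 5.0)
print(round(best, 2))
```

The archive spread shrinks as the search converges, giving the exploration-to-exploitation transition the abstract describes; the numbers of ants and iterations are exactly the driver parameters the paper tunes via RS and MSM.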
Abstract: Background: Blunt aortic trauma (BAT) includes
various morphological changes that occur during deceleration,
acceleration and/or body compression in traffic accidents. The
various forms of BAT, from limited laceration of the intima to
complete transection of the aorta, depend on the force acting on the
vessel wall and on the tolerance of the aorta to injury. The force depends
on the change in velocity, the dynamics of the accident and the
seating position in the car. Tolerance to aortic injury depends on the
anatomy, histological structure and pathomorphological alterations
due to aging or disease of the aortic wall.
An overview of the literature and medical documentation reveals
that different terms are used to describe certain forms of BAT, which
can lead to misinterpretation of findings or diagnoses. We therefore
propose a classification that would enable uniform systematic
screening of all forms of BAT. We have classified BAT into three
morphological types: TYPE I (intramural), TYPE II (transmural) and
TYPE III (multiple) aortic ruptures with appropriate subtypes.
Methods: All car accident casualties examined at the Institute of
Forensic Medicine from 2001 to 2009 were included in this
retrospective study. Autopsy reports were used to determine the
occurrence of each morphological type of BAT in deceased drivers,
front seat passengers and other passengers in cars and to define the
morphology of BAT in relation to the accident dynamics and the age
of the fatalities.
Results: A total of 391 fatalities in car accidents were included in
the study. TYPE I, TYPE II and TYPE III BAT were observed in
10.9%, 55.6% and 33.5%, respectively. The incidence of BAT in
drivers, front seat passengers and other passengers was 36.7%, 43.1% and
28.6%, respectively. In frontal collisions the incidence of BAT was
32.7%, in lateral collisions 54.2%, and in other traffic accidents
29.3%. The average age of fatalities with BAT was 42.8 years and of
those without BAT 39.1 years.
Conclusion: Identification and early recognition of the risk factors
of BAT following a traffic accident is crucial for successful treatment
of patients with BAT. Front seat passengers over 50 years of age who
have been injured in a lateral collision are the most at risk of BAT.
Abstract: Human activities are increasingly based on the use of remote resources and services, and on the interaction between
remotely located parties that may know little about each other. Mobile agents must be prepared to execute on different hosts with
various environmental security conditions. The aim of this paper is to
propose a trust based mechanism to improve the security of mobile
agents and allow their execution in various environments. Thus, an
adaptive trust mechanism is proposed. It is based on the dynamic interaction between the agent and the environment. Information
collected during the interaction enables generation of an environment
key. This key indicates the host's trust degree and permits the mobile agent to adapt its execution. Trust estimation is based on
concrete parameter values; thus, in case of distrust, the source of the problem can be located and an appropriate mobile-agent behavior can
be selected.
Abstract: In this work, we propose a hybrid heuristic in order to
solve the Team Orienteering Problem (TOP). Given a set of points (or
customers), each with an associated score (profit or benefit), and a team
that has a fixed number of members, the problem to solve is to visit a
subset of points in order to maximize the total collected score. Each
member performs a tour starting at the start point, visiting distinct
customers, and terminating at the arrival point. In addition,
each point is visited at most once, and the total time in each tour
cannot be greater than a given value. The proposed heuristic combines
beam search and a local optimization strategy. The algorithm was
tested on several sets of instances and encouraging results were
obtained.
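A simple greedy construction of the kind such heuristics typically start from can be sketched as follows; the distances, scores and budget are illustrative, and the paper's actual method combines beam search with local optimization rather than this myopic rule.

```python
def greedy_top(points, scores, start, end, n_members, t_max):
    # Greedy TOP construction: each member repeatedly visits the
    # unvisited point with the best score-per-added-travel-time ratio
    # that still allows reaching the arrival point within the budget.
    def dist(a, b):
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

    unvisited = set(range(len(points)))
    tours, total = [], 0.0
    for _ in range(n_members):
        pos, t, tour = start, 0.0, []
        while True:
            best, best_ratio = None, 0.0
            for i in unvisited:
                extra = dist(pos, points[i])
                if t + extra + dist(points[i], end) <= t_max:
                    ratio = scores[i] / (extra + 1e-9)
                    if ratio > best_ratio:
                        best, best_ratio = i, ratio
            if best is None:
                break                 # no feasible point remains
            t += dist(pos, points[best])
            pos = points[best]
            tour.append(best)
            unvisited.discard(best)
            total += scores[best]
        tours.append(tour)
    return tours, total

pts = [(1, 0), (2, 0), (2, 1), (5, 5)]
tours, score = greedy_top(pts, scores=[3, 4, 5, 10],
                          start=(0, 0), end=(3, 0),
                          n_members=2, t_max=6.0)
print(score)   # -> 12.0 (the far point (5, 5) is infeasible)
```

Greedy solutions like this are what a beam search generalizes: instead of keeping one partial tour per member, it keeps the best few and extends them in parallel.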
Abstract: In this paper, we propose a supervised method for
color image classification based on a multilevel sigmoidal neural
network (MSNN) model. In this method, images are classified into
five categories, i.e., “Car”, “Building”, “Mountain”, “Farm” and
“Coast”. This classification is performed without any segmentation
processes. To verify the learning capabilities of the proposed method,
we compare our MSNN model with the traditional Sigmoidal Neural
Network (SNN) model. The comparison results show that the
MSNN model performs better than the traditional SNN model in
terms of training run time and classification rate. Both color
moments and a multi-level wavelet decomposition technique are used
to extract features from images. The proposed method has been
tested on a variety of real and synthetic images.
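The color-moment part of the feature vector can be sketched as below: the first three moments (mean, standard deviation, skewness) per RGB channel. The exact feature set and moment definitions used in the paper may differ.

```python
def color_moments(pixels):
    # First three colour moments per channel: a compact 9-value
    # global feature vector for an RGB image.
    features = []
    for c in range(3):                       # R, G, B channels
        vals = [p[c] for p in pixels]
        n = len(vals)
        mean = sum(vals) / n
        std = (sum((v - mean) ** 2 for v in vals) / n) ** 0.5
        m3 = sum((v - mean) ** 3 for v in vals) / n
        skew = abs(m3) ** (1 / 3) * (1 if m3 >= 0 else -1)  # signed cube root
        features.extend([mean, std, skew])
    return features

# Tiny stand-in "image": two reddish and two bluish pixels.
img = [(200, 30, 30), (180, 40, 35), (60, 120, 200), (70, 110, 190)]
print([round(f, 2) for f in color_moments(img)])
```

In the classification pipeline, a vector like this (concatenated with wavelet-decomposition features) would be the input to the MSNN, with no segmentation step required.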
Abstract: The fault detection and diagnosis of complicated
production processes is one of the essential tasks needed to run a process
safely and with good final product quality. Unexpected events occurring in
the process may have a serious impact on it. In this work, a
triangular representation of process measurement data obtained on an
on-line basis is evaluated using a simulated process. The effect of using
linear and nonlinear reduced spaces is also tested. Their diagnosis
performance was demonstrated using multivariate fault data. It is
shown that the nonlinear-technique-based diagnosis method produces
more reliable results and outperforms the linear method. The use of an
appropriate reduced space yielded better diagnosis performance. The
presented diagnosis framework is different from existing ones in that it
attempts to extract the fault pattern in the reduced space, not in the
original process variable space. The use of reduced model space helps
to mitigate the sensitivity of the fault pattern to noise.
Abstract: The aim of this paper is to investigate the process of modernization of the People's Republic of China. The topic is of interest, first, because the Chinese model of development is recognized as successful and among the most dynamic, and these successes are owed to the modernization carried out in the country. Modernization of the economy, as the basic driving force of a country's progress, is a priority direction of development in the Republic of Kazakhstan, so the example of the successful development of modernization processes in China can be very useful in the design of Kazakhstan's national reforms.
Abstract: Wireless Sensor Networks (WSNs) are emerging
because of developments in wireless communication technology and the miniaturization of hardware. A WSN consists of a large number of low-cost, low-power, multifunctional sensor nodes that monitor physical conditions such as temperature, sound, vibration, pressure,
motion, etc. The MAC protocol used in a sensor network must be energy efficient, conserving energy throughout its operation. In this paper, with the focus of assessing the applicability of
MAC protocols used in wireless ad hoc networks to WSNs, simulation
experiments were conducted in the Global Mobile Simulator
(GloMoSim) software. The number of packets sent by regular nodes and received by the sink node under different deployment strategies, the total energy
spent, and the network lifetime were chosen as the metrics for comparison. The simulation results show that the IEEE 802.11 protocol performs better than the CSMA and MACA protocols.
Abstract: Most file systems overwrite modified file data and
metadata in their original locations, while the Log-structured File
System (LFS) dynamically relocates them to other locations. We
design and implement the Evergreen file system that can select
between overwriting or relocation for each block of a file or metadata.
Therefore, the Evergreen file system can achieve superior write
performance by sequentializing write requests (similar to LFS-style
relocation) when space utilization is low and overwriting when
utilization is high. Another challenging issue is identifying the
performance benefits of LFS-style relocation over overwriting on a
newly introduced SSD (Solid State Drive), which has only
flash-memory chips and control circuits, without mechanical parts.
Our experimental results measured on an SSD show that relocation
outperforms overwriting when space utilization is below 80% and vice
versa.
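The per-block policy choice described above can be sketched as a utilization-driven selector. This is a hypothetical sketch of the idea only; the 80% crossover is the figure reported in the experimental results, and the real file system decides per block with far more state.

```python
RELOCATION_UTIL_LIMIT = 0.8   # crossover point observed on the test SSD

def choose_write_policy(space_utilization):
    # Hypothetical Evergreen-style selector: sequentialize writes via
    # LFS-style relocation while free space is plentiful, and fall back
    # to in-place overwriting when the volume fills up.
    if space_utilization < RELOCATION_UTIL_LIMIT:
        return "relocate"     # log-style sequential write
    return "overwrite"        # update the block in its original location

print(choose_write_policy(0.5), choose_write_policy(0.9))
```

The rationale is that relocation's sequential-write benefit is paid for by garbage collection, whose cost grows with space utilization, hence the crossover.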
Abstract: Lossless compression schemes with secure
transmission play a key role in telemedicine applications that help in
accurate diagnosis and research. Traditional cryptographic algorithms
for data security are not fast enough to process vast amounts of data.
Hence, the novel secured lossless compression approach proposed in
this paper is based on the reversible integer wavelet transform, the EZW
algorithm, a new modified run-length coding for character
representation, and selective bit scrambling. The use of the lifting
scheme allows generating truly lossless integer-to-integer wavelet
transforms. Images are compressed/decompressed by well-known
EZW algorithm. The proposed modified run-length coding greatly
improves the compression performance and also increases the
security level. This work employs a scrambling method that is fast,
simple to implement, and provides security. Lossless compression
ratios and distortion performance of this proposed method are found
to be better than other lossless techniques.
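The lifting-based integer-to-integer wavelet transform can be illustrated with the Haar case, whose lifting form is exactly reversible in integer arithmetic, which is what makes truly lossless compression possible on top of it. This is a one-level 1-D sketch; the paper's transform and its EZW/coding stages are not reproduced.

```python
def iwt_haar_forward(x):
    # Integer Haar transform via lifting: truncated averages and exact
    # differences; the truncation is undone exactly by the inverse.
    s = [(x[2 * i] + x[2 * i + 1]) >> 1 for i in range(len(x) // 2)]
    d = [x[2 * i] - x[2 * i + 1] for i in range(len(x) // 2)]
    return s, d

def iwt_haar_inverse(s, d):
    # Exact inverse: recover each pair from its average and difference.
    x = []
    for si, di in zip(s, d):
        a = si + ((di + 1) >> 1)
        x.extend([a, a - di])
    return x

data = [10, 12, 8, 8, 200, 50, 3, 7]      # stand-in pixel row
s, d = iwt_haar_forward(data)
print(iwt_haar_inverse(s, d) == data)     # -> True (exactly reversible)
```

A floating-point wavelet would lose bits to rounding here; the integer lifting steps round deterministically, so the round trip is bit-exact for any integer input, including negative values.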