Abstract: Data clustering is an important data exploration technique
with many applications in data mining. We present an enhanced
version of the well-known single-link clustering algorithm, which
we refer to as DCBOR. The proposed algorithm alleviates
the chain effect by removing the outliers from the given dataset.
So this algorithm provides outlier detection and data clustering
simultaneously. The algorithm does not need to update the distance
matrix, since it merges the k-nearest objects in one step and a
cluster continues to grow as long as possible under a specified
condition. The algorithm therefore consists of two phases: in the
first phase, it removes the outliers from the input dataset; in the
second phase, it performs the clustering process. The algorithm
discovers clusters of different shapes, sizes and densities, and
requires only one input parameter; this parameter is a threshold for
outlier points. The value of the input parameter ranges from 0 to 1,
and the algorithm supports the user in determining an appropriate
value for it. We have tested the algorithm on different datasets
containing outliers and clusters connected by chains of dense points,
and the algorithm discovers the correct clusters. The results of
our experiments demonstrate the effectiveness and the efficiency of
DCBOR.
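The two-phase procedure described in the abstract above can be sketched as follows. This is a minimal illustration and not the authors' implementation: the abstract does not specify the outlier criterion, so here we assume (hypothetically) that a point is an outlier when its mean k-NN distance lies above the p-quantile of all such distances, with p playing the role of the single input parameter in [0, 1].

```python
import numpy as np

def dcbor_sketch(X, p=0.9, k=3):
    """Two-phase sketch: (1) drop points whose mean k-NN distance exceeds
    the p-quantile of all mean k-NN distances; (2) cluster the remaining
    points by merging each point with its k nearest kept neighbours via
    union-find, a single-link-style growth step."""
    # pairwise Euclidean distances
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    knn = np.sort(D, axis=1)[:, :k]           # k smallest distances per point
    score = knn.mean(axis=1)                  # mean k-NN distance
    keep = score <= np.quantile(score, p)     # phase 1: remove outliers
    idx = np.flatnonzero(keep)

    parent = {i: i for i in idx}              # union-find over kept points
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]     # path compression
            a = parent[a]
        return a
    for i in idx:                             # phase 2: merge k nearest kept points
        merged = 0
        for j in np.argsort(D[i]):
            if j in parent:
                parent[find(j)] = find(i)
                merged += 1
                if merged == k:
                    break
    labels = {i: find(i) for i in idx}
    return keep, labels
```

On two well-separated groups plus one far-away point, the far point is removed in phase 1 and the groups emerge as separate clusters in phase 2, without any distance-matrix updates.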
Abstract: CTMA-bentonite and BTEA-bentonite were prepared from Na-bentonite by cation exchange with cetyltrimethylammonium (CTMA) and benzyltriethylammonium (BTEA). The products were characterized by XRD and IR techniques. The d001 spacing values of CTMA-bentonite and BTEA-bentonite are 7.54 Å and 3.50 Å larger than that of Na-bentonite at 100% cation exchange capacity, respectively. The IR spectra showed that the intensities of the OH stretching and bending vibrations of the two organoclays decreased greatly compared to untreated Na-bentonite. Batch experiments were carried out at 303 K, 318 K and 333 K to obtain the sorption isotherms of Crystal Violet onto the two organoclays. The results show that the sorption isotherm data are well described by the Freundlich model. The kinetic data for the two organoclays fit well with the pseudo-second-order kinetic model. The adsorption capacity of CTMA-bentonite was found to be higher than that of BTEA-bentonite. Thermodynamic parameters such as the changes in free energy (ΔG°), enthalpy (ΔH°) and entropy (ΔS°) were also evaluated. The overall adsorption of Crystal Violet onto the two organoclays was a spontaneous, endothermic physisorption process. CTMA-bentonite and BTEA-bentonite could be employed as low-cost alternatives to activated carbon in wastewater treatment for the removal of color from textile dyes.
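As a small illustration of the Freundlich fit mentioned in the abstract above, the linearized form log qe = log KF + (1/n) log Ce can be regressed on equilibrium data. The numbers below are hypothetical, for demonstration only, and are not the paper's measurements.

```python
import numpy as np

# Hypothetical equilibrium data: Ce in mg/L, qe in mg/g (illustrative only).
Ce = np.array([2.0, 5.0, 10.0, 20.0, 40.0])
qe = np.array([30.0, 48.0, 70.0, 102.0, 148.0])

# Freundlich isotherm: qe = KF * Ce**(1/n)
# Linearized:          log qe = log KF + (1/n) * log Ce
slope, intercept = np.polyfit(np.log10(Ce), np.log10(qe), 1)
KF = 10 ** intercept   # Freundlich capacity constant
n = 1.0 / slope        # heterogeneity factor; n > 1 indicates favourable sorption
```

A fit of this kind (judged, e.g., by the correlation coefficient of the log-log regression) is how isotherm data are typically shown to be "well described by the Freundlich model".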
Abstract: The Shadoo protein (Sho) was described in 2003 as the newest member of the prion protein superfamily [1]. Sho has structural motifs similar to those of the prion protein (PrP), which is known for its central role in transmissible spongiform encephalopathies. Although a great number of functions have been proposed, the exact physiological function of PrP is not yet known. Investigation of the function and localization of Sho may help us to understand the function of the prion protein superfamily. Analyzing the subcellular localization of YFP-tagged forms of Sho, we detected the protein in the plasma membrane and in the nucleus of various cell lines. To reveal the localization of the endogenous protein we generated antibodies against Shadoo and also employed commercially available anti-Shadoo antibodies: i) the EG62 anti-mouse Shadoo antibody generated by Eurogentec Ltd.; ii) the S-12 anti-human Shadoo antibody by Santa Cruz Biotechnology Inc.; iii) the R-12 anti-mouse Shadoo antibody by Santa Cruz Biotechnology Inc.; iv) the SPRN antibody against human Shadoo by Abgent Inc. We carried out immunocytochemistry on non-transfected HeLa, Zpl 2-1, Zw 3-5, GT1-1, GT1-7 and SH-SY5Y cells as well as on YFP-Sho, Sho-YFP, and YFP-GPI transfected HeLa cells. The specificity of the antibodies (in an antibody-peptide competition assay) and their co-localization (with the YFP signal) were assessed.
Abstract: Various methods based on regression ideas have been created to resolve the problem of data sets containing censored observations, e.g. the Buckley-James method, Miller's method, the Cox method, and the Koul-Susarla-Van Ryzin estimators. Even though comparison studies show that the Buckley-James method performs better than some other methods, it is still rarely used by researchers, mainly because of the limited diagnostic analysis developed for the Buckley-James method thus far. Therefore, a diagnostic tool for the Buckley-James method is proposed in this paper. It is called the renovated Cook's Distance (RD*i) and has been developed based on Cook's idea. The renovated Cook's Distance (RD*i) has advantages (depending on the analyst's demand) over (i) the change in the fitted value for a single case, DFIT*i, as it measures the influence of case i on all n fitted values Ŷ* (not just the fitted value for case i, as DFIT*i does), and (ii) the change in the estimate of the coefficients when the ith case is deleted, DBETA*i, since DBETA*i corresponds to the number of variables p, so it is usually easier to look at a single diagnostic measure such as RD*i, in which information from the p variables is considered simultaneously. Finally, an example using the Stanford Heart Transplant data is provided to illustrate the proposed diagnostic tool.
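For context on the idea that RD*i renovates, the classical (uncensored, ordinary least squares) Cook's distance can be computed as below. This is a sketch of the standard Cook's statistic only, not of the paper's Buckley-James extension.

```python
import numpy as np

def cooks_distance(X, y):
    """Classical Cook's distance for OLS: D_i measures the influence of
    case i on all n fitted values. The paper's RD*_i extends this idea
    to the Buckley-James estimator for censored data."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    H = X @ np.linalg.inv(X.T @ X) @ X.T      # hat (projection) matrix
    h = np.diag(H)                            # leverages
    s2 = resid @ resid / (n - p)              # residual variance estimate
    return resid**2 * h / (p * s2 * (1 - h) ** 2)
```

For data that are exactly linear except for one grossly contaminated case, that case receives by far the largest Cook's distance, which is the single-number-per-case convenience the abstract argues for.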
Abstract: Nanoemulsions are a class of emulsions with a droplet
size in the range of 50–500 nm that have attracted a great deal of
attention in recent years because of their unique characteristics. The
physicochemical properties of nanoemulsions suggest that they can be
successfully used to recover the residual oil which is trapped in the
fine pore of reservoir rock by capillary forces after primary and
secondary recovery. Oil-in-water nanoemulsion which can be formed
by high-energy emulsification techniques using specific surfactants
can reduce oil-water interfacial tension (IFT) by 3-4 orders of
magnitude. The present work is aimed at the characterization of
oil-in-water nanoemulsions in terms of their phase behavior,
morphological studies, interfacial energy, ability to reduce the interfacial tension, and
understanding the mechanisms of mobilization and displacement of
entrapped oil blobs by lowering the interfacial tension at both the
macroscopic and microscopic levels. In order to investigate the
efficiency of oil-water nanoemulsion in enhanced oil recovery
(EOR), experiments were performed to characterize the emulsion in
terms of their physicochemical properties and size distribution of the
dispersed oil droplet in water phase. Synthetic mineral oil and a series
of surfactants were used to prepare oil-in-water emulsions.
Characterization of the emulsions shows that they follow pseudo-plastic
behaviour and that the drop size of the dispersed oil phase follows a lognormal
distribution. Flooding experiments were also carried out in a
sandpack system to evaluate the effectiveness of the nanoemulsion as
displacing fluid for enhanced oil recovery. Substantial additional
recoveries (more than 25% of original oil in place) over conventional
water flooding were obtained in the present investigation.
Abstract: The objective of this paper is to present explicit analytical formulas for evaluating important characteristics of the Double Moving Average (DMA) control chart for the Poisson distribution. The most popular characteristics of a control chart are the Average Run Length (ARL0), the mean number of observations taken before a system is signaled to be out-of-control when it is actually still in-control, and the Average Delay time (ARL1), the mean delay before a true alarm. An important property required of ARL0 is that it should be sufficiently large when the process is in-control, to reduce the number of false alarms. On the other hand, if the process is actually out-of-control, then ARL1 should be as small as possible. In particular, the explicit analytical formulas for evaluating ARL0 and ARL1 make it possible to obtain a set of optimal parameters, depending on the width of the moving average (w) and the width of the control limit (H), for designing a DMA chart with minimum ARL1.
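While the paper above derives ARL0 and ARL1 analytically, both quantities can also be estimated by Monte Carlo simulation of a DMA chart, as sketched below. The fixed limits λ ± H used here are a simplification for illustration; a real DMA chart's limits are based on the (time-varying) variance of the DMA statistic.

```python
import numpy as np

def dma_run_length(lam_obs, target, w, H, rng, max_t=100_000):
    """Steps until the double moving average of Poisson(lam_obs) counts
    leaves the interval target +/- H (simplified fixed limits)."""
    xs, mas = [], []
    for t in range(1, max_t + 1):
        xs.append(rng.poisson(lam_obs))
        mas.append(np.mean(xs[-w:]))   # moving average of width w
        dma = np.mean(mas[-w:])        # double moving average
        if abs(dma - target) > H:
            return t
    return max_t

rng = np.random.default_rng(1)
# ARL0: process in-control at lambda = 5; ARL1: shifted to lambda = 8.
arl0 = np.mean([dma_run_length(5.0, 5.0, w=5, H=2.0, rng=rng) for _ in range(50)])
arl1 = np.mean([dma_run_length(8.0, 5.0, w=5, H=2.0, rng=rng) for _ in range(50)])
```

As the abstract requires of a well-designed chart, the estimated in-control run length comes out much larger than the delay after a shift.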
Abstract: The study of proteomics has reached unexpected levels of
interest as a direct consequence of its discovered influence on
some complex biological phenomena, such as problematic diseases
like cancer. This paper presents a new technique that allows for an
accurate analysis of the human interactome network. It is basically
a two-step analysis process that involves, at first, the detection of
each protein's absolute importance through the betweenness centrality
computation. Then, the second step determines the functionally-related
communities of proteins. For this purpose, we use a community
detection technique that is based on the edge betweenness
calculation. The new technique was thoroughly tested on real biological
data, and the results reveal some interesting properties of those proteins that are involved in the carcinogenesis process. Apart from its
experimental usefulness, the novel technique is also computationally
effective in terms of execution times. Based on the analysis results, some topological features of cancer mutated proteins are presented
and a possible optimization solution for cancer drugs design is suggested.
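The first step of the pipeline above, betweenness centrality, can be illustrated with a brute-force sketch on a toy network. The nodes here are hypothetical, not actual proteins, and real interactome graphs require an efficient algorithm (e.g. Brandes') rather than this pair-by-pair enumeration.

```python
from collections import deque
from itertools import combinations

def bfs_dist(adj, s):
    """Unweighted shortest-path distances from s."""
    dist, q = {s: 0}, deque([s])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def all_shortest(adj, dist, t, path):
    """All shortest paths to t, following distance-increasing edges."""
    u = path[-1]
    if u == t:
        return [path]
    out = []
    for v in adj[u]:
        if dist.get(v) == dist[u] + 1 and dist[v] <= dist[t]:
            out += all_shortest(adj, dist, t, path + [v])
    return out

def betweenness(adj):
    """Sum over all pairs of the fraction of shortest paths passing
    through each node (brute force; small graphs only)."""
    bc = {v: 0.0 for v in adj}
    for s, t in combinations(adj, 2):
        dist = bfs_dist(adj, s)
        if t not in dist:
            continue
        paths = all_shortest(adj, dist, t, [s])
        for p in paths:
            for v in p[1:-1]:
                bc[v] += 1.0 / len(paths)
    return bc

# Toy "interaction network": two triangles bridged by node X.
adj = {
    "A": ["B", "C"], "B": ["A", "C"], "C": ["A", "B", "X"],
    "X": ["C", "D"],
    "D": ["X", "E", "F"], "E": ["D", "F"], "F": ["D", "E"],
}
bc = betweenness(adj)
```

The bridge node X carries every shortest path between the two triangles and therefore gets the highest centrality, which is exactly the "absolute importance" signal the first analysis step extracts.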
Abstract: There are two common types of operational research techniques: optimisation and metaheuristic methods. The latter may be defined as a sequential process that intelligently performs exploration and exploitation, adopted from natural intelligence, to form several iterative searches. The aim is to effectively determine near-optimal solutions in a solution space. In this work, a type of metaheuristic called Ant Colony Optimisation (ACO), inspired by the foraging behaviour of ants, was adapted to find optimal solutions of eight non-linear continuous mathematical models. Within a solution space in a specified region of each model, sub-solutions may contain the global optimum or multiple local optima. Moreover, the algorithm has several common parameters (the number of ants, moves, and iterations), which act as the algorithm's drivers. A series of computational experiments for initialising the parameters was conducted using the Rigid Simplex (RS) and Modified Simplex (MSM) methods. Experimental results were analysed in terms of the best-so-far solutions, mean and standard deviation. Finally, a recommendation of proper level settings of the ACO parameters is stated for all eight functions. These parameter settings can be applied as a guideline for future uses of ACO, to promote ease of use of ACO in real industrial processes. It was found that the results obtained from MSM were very similar to those gained from RS. However, when results with noise standard deviations of 1 and 3 are compared, MSM reaches optimal solutions more efficiently than RS in terms of speed of convergence.
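A minimal sketch of continuous ACO in the spirit of the abstract above is given below. It is an ACO_R-style simplification (an archive of good solutions guides Gaussian sampling with a shrinking step), not the exact algorithm or parameter levels tuned in the paper; the archive size, bias and shrink schedule are illustrative assumptions.

```python
import random

def aco_continuous(f, bounds, n_ants=20, archive=10, iters=100, seed=0):
    """Simplified continuous ACO sketch: each ant samples near a randomly
    chosen member of the better half of a solution archive, with a
    Gaussian step size that shrinks over the iterations."""
    rng = random.Random(seed)
    sols = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(archive)]
    sols.sort(key=f)
    for it in range(iters):
        # per-dimension step size, shrinking geometrically over time
        sigma = [(hi - lo) * 0.5 ** (1 + 5 * it / iters) for lo, hi in bounds]
        ants = []
        for _ in range(n_ants):
            guide = sols[rng.randrange(len(sols)) // 2]   # bias toward best half
            x = [min(max(rng.gauss(g, s), lo), hi)        # sample, clipped to bounds
                 for g, s, (lo, hi) in zip(guide, sigma, bounds)]
            ants.append(x)
        sols = sorted(sols + ants, key=f)[:archive]       # keep the best archive
    return sols[0], f(sols[0])
```

On a simple test function such as the 2-D sphere, the archive contracts onto the minimum; the parameters playing the role of "ants, moves, and iterations" are exactly the knobs the paper's experiments initialise via RS and MSM.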
Abstract: Many applications of speech communication and speaker
identification suffer from the problem of co-channel speech. This
paper deals with a multi-resolution dyadic wavelet transform method
for usable segments of co-channel speech detection that could be
processed by a speaker identification system. Evaluation of this
method is performed on the TIMIT database with reference to the Target to
Interferer Ratio measure. Co-channel speech is constructed by
mixing all possible gender speakers. Results do not show much
difference between mixtures. For the overall mixtures, 95.76% of
usable speech is correctly detected, with a false alarm rate of 29.65%.
Abstract: The article deals with the classification of alternative water resources in terms of potential risks, which is a prerequisite for incorporating these water resources into emergency plans. The classification is based on the quantification of risks resulting from possible damage, disruption or total destruction of the water resource caused by natural and anthropogenic hazards, assessment of water quality and availability, traffic accessibility of the assessed resource and, finally, its water yield. The aim is to achieve the development of an integrated rescue system which will be capable of supplying the population with drinking water across the whole stricken territory during states of emergency.
Abstract: One of the purposes of robust estimation is to
reduce the influence of outliers in the data on the estimates.
Outliers arise from gross errors or contamination from
distributions with long tails. The trimmed mean is a robust estimate.
This means that it is not sensitive to violation of distributional
assumptions of the data. It is called an adaptive estimate when the
trimming proportion is determined from the data rather than being
fixed a priori.
The main objective of this study is to find out the robustness
properties of the adaptive trimmed means in terms of efficiency, high
breakdown point and influence function. Specifically, it seeks to find
out the magnitude of the trimming proportion of the adaptive
trimmed mean which will yield efficient and robust estimates of the
parameter for data which follow a modified Weibull distribution with
parameter λ = 1/2 , where the trimming proportion is determined by a
ratio of two trimmed means defined as the tail length. Secondly, the
asymptotic properties of the tail length and the trimmed means are
also investigated. Finally, a comparison is made of the efficiency of
the adaptive trimmed means, in terms of the standard deviation, between
trimming proportions determined from the data and trimming proportions fixed a priori.
The asymptotic tail lengths defined as the ratio of two trimmed
means and the asymptotic variances were computed by using the
formulas derived. The values of the standard deviations of the
derived tail lengths, for data of size 40 simulated from a Weibull
distribution, were computed over 100 iterations using a computer
program written in Pascal.
The findings of the study revealed that the tail lengths of the
Weibull distribution increase in magnitude as the trimming
proportions increase; that the measure of the tail length and the
adaptive trimmed mean are asymptotically independent as the number of
observations n approaches infinity; that the tail length is
asymptotically distributed as the ratio of two independent normal
random variables; and that the asymptotic variances decrease as the
trimming proportions increase. The simulation study revealed
empirically that the standard error of the adaptive trimmed mean
using the ratio of tail lengths is relatively smaller for different values
of trimming proportions than its counterpart when the trimming
proportions were fixed a priori.
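The core objects of the study above, a trimmed mean and a trimming proportion chosen adaptively from a tail-length measure, can be sketched as follows. The abstract does not give the paper's exact ratio of trimmed means or its cut-off values, so the ratio and thresholds below are illustrative assumptions only.

```python
import numpy as np

def trimmed_mean(x, g):
    """Mean after removing a proportion g of the observations from each tail."""
    x = np.sort(np.asarray(x, dtype=float))
    k = int(g * len(x))
    return x[k:len(x) - k].mean()

def tail_length(x):
    """Tail-length measure as a ratio of two trimmed means (hypothetical
    choice: lightly trimmed over heavily trimmed; the paper's exact
    definition is not given in the abstract)."""
    return trimmed_mean(x, 0.05) / trimmed_mean(x, 0.25)

def adaptive_trimmed_mean(x):
    """Pick the trimming proportion from the data via the tail length
    (cut-off values are illustrative)."""
    t = tail_length(x)
    g = 0.05 if t < 1.05 else 0.10 if t < 1.20 else 0.25
    return trimmed_mean(x, g), g
```

With a heavier right tail the ratio grows, so the adaptive rule trims more, which is the mechanism by which the estimator protects itself against outliers without a trimming proportion fixed a priori.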
Abstract: To support mobility in ATM networks, a number of
technical challenges need to be resolved. The impact of handoff
schemes in terms of service disruption, handoff latency, cost
implications and excess resources required during handoffs needs to
be addressed. In this paper, a one-phase handoff and route
optimization solution using reserved PVCs between adjacent ATM
switches to reroute connections during inter-switch handoff is
studied. In the second phase, a distributed optimization process is
initiated to optimally reroute handoff connections. The main
objective is to find the optimal operating point at which to perform
optimization, subject to a cost constraint, with the purpose of
reducing the blocking probability of inter-switch handoff calls for
delay-tolerant traffic. We examine the relation between the required
bandwidth resources and the optimization rate. We also calculate and study the
handoff blocking probability due to lack of bandwidth for resources
reserved to facilitate the rapid rerouting.
Abstract: The effect of time-periodic oscillations of the Rayleigh-Bénard system on heat transport in dielectric liquids is investigated by weakly nonlinear analysis. We focus on stationary convection using the slow time scale and arrive at the real Ginzburg-Landau equation. The classical fourth-order Runge-Kutta method is used to solve the Ginzburg-Landau equation, which gives the amplitude of convection; this helps in quantifying the heat transfer in dielectric liquids in terms of the Nusselt number. The effect of the electrical Rayleigh number and the amplitude of modulation on heat transport is studied.
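The numerical step described in the abstract above, fourth-order Runge-Kutta integration of a real Ginzburg-Landau (Landau-type) amplitude equation, can be sketched as follows. The coefficients p and q and the Nusselt-number relation below are schematic; in the paper they depend on the Rayleigh and electrical Rayleigh numbers and the modulation amplitude.

```python
def rk4_step(f, t, A, h):
    """One classical fourth-order Runge-Kutta step for dA/dt = f(t, A)."""
    k1 = f(t, A)
    k2 = f(t + h / 2, A + h * k1 / 2)
    k3 = f(t + h / 2, A + h * k2 / 2)
    k4 = f(t + h, A + h * k3)
    return A + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

# Real Ginzburg-Landau amplitude equation dA/dt = p*A - q*A**3
# with illustrative coefficients; steady state is A = sqrt(p/q).
p, q = 2.0, 1.0
f = lambda t, A: p * A - q * A ** 3
A, h = 0.1, 0.01
for n in range(2000):
    A = rk4_step(f, n * h, A, h)
Nu = 1 + 2 * A ** 2   # schematic amplitude-to-Nusselt-number relation
```

The amplitude saturates at sqrt(p/q), and the saturated amplitude is what feeds the Nusselt-number estimate of heat transport.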
Abstract: The objective of this paper is to support the application of Open Innovation practices in firms and organizations through the assessment and management of Intellectual Capital. Intellectual Capital constituents are analyzed in order to verify their capability of acting as key drivers of Open Innovation processes and, therefore, of creating value. A methodology is defined to establish a procedure which helps to select the most relevant Intellectual Capital value drivers and to provide Communities of Innovation with strategic and managerial guidelines for sustaining the Open Innovation paradigm. An application of the methodology is developed within a specifically addressed project and its results are examined hereafter.
Abstract: Background: Blunt aortic trauma (BAT) includes
various morphological changes that occur during deceleration,
acceleration and/or body compression in traffic accidents. The
various forms of BAT, from limited laceration of the intima to
complete transection of the aorta, depend on the force acting on the
vessel wall and the tolerance of the aorta to injury. The force depends
on the change in velocity, the dynamics of the accident and the
seating position in the car. Tolerance to aortic injury depends on the
anatomy, histological structure and pathomorphological alterations
due to aging or disease of the aortic wall.
An overview of the literature and medical documentation reveals
that different terms are used to describe certain forms of BAT, which
can lead to misinterpretation of findings or diagnoses. We therefore
propose a classification that would enable uniform systematic
screening of all forms of BAT. We have classified BAT into three
morphological types: TYPE I (intramural), TYPE II (transmural) and
TYPE III (multiple) aortic ruptures with appropriate subtypes.
Methods: All car accident casualties examined at the Institute of
Forensic Medicine from 2001 to 2009 were included in this
retrospective study. Autopsy reports were used to determine the
occurrence of each morphological type of BAT in deceased drivers,
front seat passengers and other passengers in cars and to define the
morphology of BAT in relation to the accident dynamics and the age
of the fatalities.
Results: A total of 391 fatalities in car accidents were included in
the study. TYPE I, TYPE II and TYPE III BAT were observed in
10.9%, 55.6% and 33.5% of cases, respectively. The incidence of BAT in
drivers, front seat passengers and other passengers was 36.7%, 43.1% and
28.6%, respectively. In frontal collisions, the incidence of BAT was
32.7%, in lateral collisions 54.2%, and in other traffic accidents
29.3%. The average age of fatalities with BAT was 42.8 years and of
those without BAT 39.1 years.
Conclusion: Identification and early recognition of the risk factors
of BAT following a traffic accident is crucial for successful treatment
of patients with BAT. Front seat passengers over 50 years of age who
have been injured in a lateral collision are the most at risk of BAT.
Abstract: This paper presents a novel combined cycle of air separation and natural gas liquefaction. The idea is that natural gas can be liquefied while gaseous or liquid nitrogen and oxygen are produced in one combined cryogenic system. Cycle simulation and exergy analysis were performed to evaluate the process and thereby reveal the influence of the crucial parameter, the flow rate ratio through the two-stage expanders, β, on the heat transfer temperature difference, its distribution and the consequent exergy loss. Composite curves for the combined hot streams (feed natural gas and recycled nitrogen) and the cold stream showed the degree of optimization available in this process if an appropriate β is designed. The results indicated that increasing β reduces the temperature difference and the exergy loss in the heat exchange process. However, the maximum value of β should be confined in terms of the minimum temperature difference proposed in heat exchanger design standards and the heat exchanger size. The optimal βopt under different operating conditions, corresponding to the required minimum temperature differences, was investigated.
Abstract: Relay-based communication has gained considerable importance in recent years. In this paper we find the end-to-end statistics of a two-hop non-regenerative relay branch, each hop being Nakagami-m faded. Closed-form expressions for the probability density functions of the signal envelope at the output of a selection combiner and a maximal ratio combiner at the destination node are also derived, and the analytical formulations are verified through computer simulation. These density functions are useful in evaluating the system performance in terms of bit error rate and outage probability.
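The kind of computer simulation mentioned in the abstract above can be sketched as follows: Nakagami-m hop powers are drawn as Gamma variates, the two-hop non-regenerative branch is reduced to an equivalent end-to-end SNR via the standard bound γ_eq = γ1 γ2 / (γ1 + γ2 + 1), and selection versus maximal ratio combining are compared by outage probability. All parameter values are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 200_000
m, omega = 2.0, 1.0   # Nakagami-m fading: instantaneous power ~ Gamma(m, omega/m)

def hop_snr(snr_avg):
    """Instantaneous SNR samples of one Nakagami-m faded hop."""
    return snr_avg * rng.gamma(m, omega / m, N)

def branch_snr(snr_avg):
    """End-to-end SNR of a non-regenerative (amplify-and-forward) two-hop
    branch, via the harmonic-mean-type bound g1*g2/(g1+g2+1)."""
    g1, g2 = hop_snr(snr_avg), hop_snr(snr_avg)
    return g1 * g2 / (g1 + g2 + 1)

b1, b2 = branch_snr(10.0), branch_snr(10.0)
sc = np.maximum(b1, b2)     # selection combining at the destination
mrc = b1 + b2               # maximal ratio combining at the destination
out_sc = np.mean(sc < 3.0)  # outage probability at SNR threshold 3
out_mrc = np.mean(mrc < 3.0)
```

Since the MRC output never falls below the SC output, its outage probability is never larger, matching the usual combiner ranking that closed-form density functions make precise.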
Abstract: Many states are now committed to implementing
international human rights standards domestically. In terms of
practical governance, how might effectiveness be measured? A face-value
answer can be found in domestic laws and institutions relating
to human rights. However, this article provides two further tools to
help states assess their status on the spectrum of robust to fragile
human rights governance. The first recognises that each state has its
own 'human rights history', with robust human rights governance as
the ideal end stage; the second develops criteria for assessing
robustness. Although a New Zealand case study is used to illustrate
these tools, the widespread adoption of human rights standards by
many states inevitably means that the issues are relevant to other
countries. This is so even though there will always be varying degrees
of similarity and difference in constitutional backgrounds and in
developed or emerging human rights systems.
Abstract: Human activities are increasingly based on the use of remote resources and services, and on the interaction between
remotely located parties that may know little about each other. Mobile agents must be prepared to execute on different hosts with
various environmental security conditions. The aim of this paper is to
propose a trust based mechanism to improve the security of mobile
agents and allow their execution in various environments. Thus, an
adaptive trust mechanism is proposed, based on the dynamic interaction
between the agent and the environment. Information collected during
the interaction enables the generation of an environment key. This key
indicates the host's trust degree and permits the mobile agent to
adapt its execution. Trust estimation is based on concrete parameter
values; thus, in case of distrust, the source of the problem can be
located and an appropriate mobile agent behavior can be selected.
Abstract: Traffic congestion has become a major problem in
many countries. One of the main causes of traffic congestion is road
merging. Vehicles tend to move slower when they reach the
merging point. In this paper, an enhanced algorithm for traffic
simulation based on the fluid-dynamic algorithm and kinematic wave
theory is proposed. The enhanced algorithm is used to study traffic
congestion at a road merge. This paper also describes the
development of a dynamic traffic simulation tool which is used for
scenario planning and for forecasting the traffic congestion level at
a certain time based on defined parameter values. The tool incorporates the
enhanced algorithm as well as the two original algorithms. Outputs
from the three above-mentioned algorithms are measured in terms of
traffic queue length, travel time and the total number of vehicles
passing through the merging point. This paper also suggests an
efficient way of reducing traffic congestion at a road merge by
analyzing the traffic queue length and travel time.
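A kinematic-wave simulation of a road merge, of the kind the abstract above builds on, can be sketched with the cell-transmission (Godunov) scheme and a triangular fundamental diagram. This is a generic LWR merge sketch with illustrative parameters, not the paper's enhanced algorithm.

```python
import numpy as np

# Triangular fundamental diagram: free speed, wave speed, jam density, capacity.
vf, w, kj, qmax = 1.0, 0.5, 1.0, 0.25
dt, cells, T = 1.0, 30, 200   # dt/dx = 1 satisfies the CFL condition for vf = 1

def demand(k):   # flow a cell can send downstream
    return np.minimum(vf * k, qmax)

def supply(k):   # flow a cell can receive
    return np.minimum(w * (kj - k), qmax)

def step(k, q_in, q_out):
    """Godunov density update of one link given its boundary in/out flows."""
    f = np.minimum(demand(k[:-1]), supply(k[1:]))   # interior interface flows
    f = np.concatenate([[q_in], f, [q_out]])
    return k + dt * (f[:-1] - f[1:])

a = np.full(cells, 0.1)   # branch A densities
b = np.full(cells, 0.1)   # branch B densities
m = np.full(cells, 0.1)   # merged road densities

for _ in range(T):
    # Merge rule: both branch demands share the merged road's supply.
    d_a, d_b = demand(a[-1]), demand(b[-1])
    s_m = supply(m[0])
    scale = min(1.0, s_m / (d_a + d_b)) if d_a + d_b > 0 else 1.0
    qa, qb = d_a * scale, d_b * scale
    a = step(a, 0.15, qa)            # branch inflows together exceed the merge
    b = step(b, 0.15, qb)            # capacity, so queues build upstream
    m = step(m, qa + qb, demand(m[-1]))

queue = int((a > 0.5 * kj).sum() + (b > 0.5 * kj).sum())  # congested cells
```

With combined branch inflow above the merge capacity, a congestion shock propagates upstream and the count of congested cells grows over time, which is the queue-length signal the simulation tool measures at the merging point.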