Abstract: This article gives an iterative definition of an n-variable mean function that repeatedly applies the corresponding two-variable mean. The extension method avoids recursion, an important improvement over certain recursive formulas given earlier by Ando-Li-Mathias and Petz-Temesi. It is furthermore conjectured here that this iterative algorithm coincides with the solution of the Riemannian centroid minimization problem. Simulations are presented to compare the convergence rates of the different algorithms given in the literature, namely the gradient and the Newton method for Riemannian centroid computation.
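The idea of building an n-variable mean from repeated two-variable means can be sketched as follows. The two-variable matrix geometric mean A # B is standard; the cyclic pairwise iteration used here to extend it to n matrices is an illustrative assumption, not necessarily the exact scheme of the article.

```python
import numpy as np

def spd_sqrt(M):
    """Principal square root of a symmetric positive definite matrix."""
    w, V = np.linalg.eigh(M)
    return (V * np.sqrt(w)) @ V.T

def geo_mean(A, B):
    """Two-variable geometric mean A # B = A^{1/2} (A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}."""
    As = spd_sqrt(A)
    Asi = np.linalg.inv(As)
    return As @ spd_sqrt(Asi @ B @ Asi) @ As

def iterative_mean(mats, iters=60):
    """Hypothetical n-variable extension: cyclically replace each matrix
    by its two-variable mean with the next one; the list contracts to a
    common limit (for commuting matrices, the elementwise geometric mean)."""
    mats = [np.asarray(M, dtype=float) for M in mats]
    n = len(mats)
    for _ in range(iters):
        mats = [geo_mean(mats[i], mats[(i + 1) % n]) for i in range(n)]
    return mats[0]
```

For scalars the cyclic iteration preserves the product and averages the logarithms, so it converges to the ordinary geometric mean; whether the matrix version coincides with the Riemannian centroid is exactly the kind of question the abstract's conjecture addresses.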
Abstract: Bones are dynamic and responsive organs: they regulate their strength and mass according to the loads to which they are subjected. Because the Wnt/β-catenin pathway has profound effects on the regulation of bone mass, we hypothesized that mechanical loading of bone cells stimulates Wnt/β-catenin signaling, which results in the generation of new bone mass. Mechanical loading triggers the secretion of the Wnt molecule, which, after binding to transmembrane proteins, causes GSK-3β (glycogen synthase kinase 3 beta) to cease phosphorylating β-catenin. β-catenin then accumulates in the cytoplasm and is transported into the nucleus, where it binds to transcription factors (TCF/LEF) that initiate transcription of genes related to bone formation. To test this hypothesis, we used TOPGAL (Tcf Optimal Promoter β-galactosidase) mice in an experiment in which cyclic loads were applied to the forearm. TOPGAL mice are reporters for cells affected by the Wnt/β-catenin signaling pathway: genetically engineered mice in which transcriptional activation by β-catenin results in the production of the enzyme β-galactosidase. The presence of this enzyme allows us to localize transcriptional activation of β-catenin to individual cells, thereby allowing us to quantify the effects that mechanical loading has on the Wnt/β-catenin pathway and new bone formation. The ulnae of loaded TOPGAL mice were excised, and transverse slices along different parts of the ulnar shaft were assayed for the presence of β-galactosidase. Our results indicate that loading increases β-catenin transcriptional activity in regions where this pathway is already primed (i.e., where basal activity is already higher) in a load-magnitude-dependent manner. Further experiments are needed to determine the temporal and spatial activation of this signaling in relation to bone formation.
Abstract: In this paper, a one-dimensional numerical approach is used to study the effect of applying electrohydrodynamics on the temperature and species mass-fraction profiles along a micro-combustor. The premixed mixture is H2-air with multi-step chemistry (9 species and 19 reactions). In micro-scale combustion, because of the increased area-to-volume ratio, thermal and radical quenching mechanisms are important, and there is significant heat loss from the combustor walls. By inserting a number of electrodes into the micro-combustor and applying a high voltage to them, a corona discharge occurs. The induced ions then move toward neutral molecules and collide with them, setting the molecules in motion and reattaching the flow to the walls. This increases the velocity near the walls, which thins the wall boundary layer. Consequently, applying the electrohydrodynamic mechanism can enhance the temperature profile in the micro-combustor and ultimately prevents flame quenching.
Abstract: Representing objects in a dynamic domain is essential in commonsense reasoning under some circumstances. Classical logics and their nonmonotonic extensions, however, are usually unable to deal with reasoning about dynamic domains, because every constant in the logical language denotes some existing object in a static domain. In this paper, we explore a logical formalization which allows us to represent nonexisting objects in commonsense reasoning. A formal system named N-theory is proposed for this purpose, and its possible application in computer security is briefly discussed.
Abstract: Recurrent event data are a special type of multivariate survival data. Dynamic models and frailty models are two of the approaches that deal with this kind of data. A comparison between these two models is studied using the empirical standard deviation of the standardized martingale residual processes as a way of assessing the fit of the two models, based on the Aalen additive regression model. We found that both approaches take heterogeneity into account and produce residual standard deviations close to each other, both in the simulation study and on the real data set.
Abstract: This paper presents an effective daytime traffic-light recognition method. First, a Potential Traffic Light Detector (PTLD) uses the full color information of the YCbCr channel image and produces a binary image for each of the green and red traffic lights. After the PTLD step, a Shape Filter (SF) is used to remove noise such as traffic signs, street trees, vehicles, and buildings. The noise-removal criteria are based on properties of the blobs in the binary image: length, area, bounding-box area, etc. Finally, after an intermediate association step whose goal is to define relevant candidate regions from the previously detected traffic lights, an Adaptive Multi-class Classifier (AMC) is executed. The classification method uses Haar-like features and the AdaBoost algorithm. The method was implemented on an Intel Core CPU at 2.80 GHz with 4 GB of RAM and tested on urban and rural roads. In these tests we compared our method with standard object-recognition learning processes and showed that it reaches a detection rate of up to 94%, better than the results achieved with cascade classifiers. The computation time of our proposed method is 15 ms.
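The color-thresholding idea behind the PTLD step can be sketched with a BT.601 RGB-to-YCbCr conversion followed by channel thresholds. The conversion coefficients are standard; the threshold values below are illustrative assumptions, not the values used in the paper.

```python
import numpy as np

def rgb_to_ycbcr(img):
    """Full-range BT.601 RGB -> YCbCr conversion (img: H x W x 3, floats 0..255)."""
    R, G, B = img[..., 0], img[..., 1], img[..., 2]
    Y  = 0.299 * R + 0.587 * G + 0.114 * B
    Cb = 128.0 - 0.168736 * R - 0.331264 * G + 0.5 * B
    Cr = 128.0 + 0.5 * R - 0.418688 * G - 0.081312 * B
    return np.stack([Y, Cb, Cr], axis=-1)

def candidate_masks(img, red_cr=170.0, green_cb=110.0, green_cr=110.0):
    """Binary candidate masks: red lights have high Cr; green lights have
    low Cb and low Cr. Thresholds are illustrative assumptions."""
    ycc = rgb_to_ycbcr(img)
    Cb, Cr = ycc[..., 1], ycc[..., 2]
    red_mask = Cr > red_cr
    green_mask = (Cb < green_cb) & (Cr < green_cr)
    return red_mask, green_mask
```

The subsequent SF step would then compute blob properties (length, area, bounding-box area) on these masks to discard non-light regions.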
Abstract: Disaster and emergency management are highly debated topics among experts. Fast communication helps in dealing with emergencies, but problems arise with network connectivity and data exchange. This paper suggests a solution: a new, flexible communication platform that opens possibilities and perspectives for the protection of communication systems for crisis management. The platform is used both for everyday communication and for communication in crisis situations.
Abstract: In this study, we propose a network architecture for
providing secure access to the information resources of an enterprise network from remote locations in a wireless fashion. Our proposed
architecture offers a very promising solution for organizations which
are in need of a secure, flexible and cost-effective remote access
methodology. Security of the proposed architecture is based on
Virtual Private Network technology and a special role based access
control mechanism with location and time constraints. The flexibility
mainly comes from the use of Internet as the communication medium
and cost-effectiveness is due to the possibility of in-house
implementation of the proposed architecture.
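The role-based access control with location and time constraints described above can be sketched as a simple policy check. The role, resource, and constraint names below are hypothetical examples, not taken from the paper.

```python
from dataclasses import dataclass, field

@dataclass
class Permission:
    resource: str
    allowed_locations: set = field(default_factory=set)  # e.g. {"HQ", "home"}
    start_hour: int = 0      # inclusive, 24-hour clock
    end_hour: int = 24       # exclusive

@dataclass
class Role:
    name: str
    permissions: list = field(default_factory=list)

def access_allowed(role, resource, location, hour):
    """Grant access only if some permission of the role covers the
    requested resource AND the request's location AND its time of day."""
    for p in role.permissions:
        if (p.resource == resource
                and location in p.allowed_locations
                and p.start_hour <= hour < p.end_hour):
            return True
    return False
```

In the proposed architecture, such a check would sit behind the VPN gateway, so a remote user first authenticates to the VPN and is then filtered by role, location, and time.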
Abstract: With the advent of emerging personal computing paradigms such as ubiquitous and mobile computing, Web contents are becoming accessible from a wide range of mobile devices. Since these devices do not have the same rendering capabilities, Web contents need to be adapted for transparent access from a variety of client agents. Such content adaptation is applied either to an individual element or to a set of consecutive elements in a Web document and results in better rendering and faster delivery to the client device. Nevertheless, Web content adaptation sets new challenges for semantic markup. This paper presents an advanced component platform, called SMC, enabling the development of mobility applications and services according to a channel model based on the principles of Service-Oriented Architecture (SOA). It then goes on to describe the potential for integration with the Semantic Web through a novel framework of external semantic annotation that prescribes a scheme for representing semantic markup files and a way of associating Web documents with these external annotations. The role of semantic annotation in this framework is to describe the contents of the individual documents themselves, ensuring the preservation of the semantics during the process of adapting content rendering. Semantic Web content adaptation is a way of adding value to Web contents and facilitates the repurposing of Web contents (enhanced browsing, Web service location and access, etc.).
Abstract: This paper presents the convergence analysis
of a prediction based blind equalizer for IIR channels.
Predictor parameters are estimated by using the recursive
least squares algorithm. It is shown that the prediction
error converges almost surely (a.s.) toward a scalar
multiple of the unknown input symbol sequence. It is
also proved that the convergence rate of the parameter estimation error is of the same order as that in the law of the iterated logarithm.
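The recursive least squares estimation at the core of the predictor can be sketched generically. This is a standard RLS linear predictor, not the paper's full blind equalizer; the forgetting factor and regularization values are illustrative assumptions.

```python
import numpy as np

def rls_predictor(y, order=1, lam=0.999, delta=100.0):
    """Recursive least squares fit of a one-step linear predictor
    y[t] ~ w . [y[t-1], ..., y[t-order]].  Returns the final weight
    vector and the sequence of a priori prediction errors."""
    w = np.zeros(order)
    P = delta * np.eye(order)        # inverse-correlation estimate
    errors = []
    for t in range(order, len(y)):
        x = y[t - order:t][::-1]     # most recent sample first
        e = y[t] - w @ x             # a priori prediction error
        k = P @ x / (lam + x @ P @ x)  # gain vector
        w = w + k * e
        P = (P - np.outer(k, x @ P)) / lam
        errors.append(e)
    return w, np.array(errors)
```

For an autoregressive input the weights converge to the generating coefficients, so the prediction error approaches the driving (innovation) sequence, mirroring the abstract's claim that the error converges to a scalar multiple of the input symbols.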
Abstract: The emerging concept of e-government transforms the way in which citizens deal with their governments: citizens can carry out the intended services online, anytime and anywhere. This brings great benefits both to governments (a reduced number of officers) and to citizens (more flexibility and time savings). Building a maturity model to assess e-government portals is therefore desirable, to help in the improvement process of such portals. This paper proposes an e-government maturity model based on measuring the presence of best practices. The main benefit of such a maturity model is that it provides a way to rank an e-government portal based on the best practices used, and also gives a set of recommendations for reaching the next stage of the maturity model.
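The ranking-plus-recommendations idea can be sketched as a staged checklist. The stage definitions and practice names below are hypothetical examples; the paper defines its own best practices.

```python
def maturity_stage(portal_practices, stages):
    """Rank a portal by best-practice presence: a stage is reached only
    if the portal implements every practice required by that stage and
    all earlier ones.  Returns (stage_reached, missing_practices), where
    the missing practices serve as recommendations for the next stage."""
    reached = 0
    missing = []
    for i, required in enumerate(stages, start=1):
        absent = [p for p in required if p not in portal_practices]
        if absent:
            missing = absent
            break
        reached = i
    return reached, missing
```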
Abstract: The aim of our work is to study the phase composition, particle size, and magnetic response of Fe2O3/TiO2 nanocomposites with respect to the final annealing temperature. These nanomaterials are considered as smart catalysts, separable from a liquid/gaseous phase by an applied magnetic field. The starting product was obtained by an ecologically acceptable route based on heterogeneous precipitation of TiO2 onto modified γ-Fe2O3 nanocrystals dispersed in water. The precursor was subsequently annealed in air at temperatures ranging from 200 °C to 900 °C. The samples were investigated by synchrotron X-ray powder diffraction (S-PXRD), magnetic measurements, and Mössbauer spectroscopy. As evidenced by S-PXRD and Mössbauer spectroscopy, increasing the annealing temperature causes an evolution of the phase composition from anatase/maghemite to rutile/hematite; finally, above 700 °C, pseudobrookite (Fe2TiO5) also forms. The apparent particle size of the various Fe2O3/TiO2 phases has been determined from the high-quality S-PXRD data using two different approaches: Rietveld refinement and the Debye method. The magnetic response of the samples is discussed in view of the phase composition and the particle size.
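The paper extracts apparent particle size from the S-PXRD data by Rietveld refinement and the Debye method. As a much simpler illustration of how size follows from diffraction-peak broadening (an alternative estimate, not the paper's method), the classic Scherrer equation can be sketched:

```python
import math

def scherrer_size(beta_rad, two_theta_deg, wavelength_nm=0.15406, K=0.9):
    """Apparent crystallite size D = K * lambda / (beta * cos(theta)),
    where beta is the instrument-corrected peak FWHM in radians and
    theta is the Bragg angle.  Defaults (Cu K-alpha wavelength, shape
    factor K = 0.9) are conventional assumptions for illustration."""
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta_rad * math.cos(theta))
```

A 0.005 rad broadening at 2θ = 30° corresponds to a crystallite size of roughly 29 nm; Rietveld and Debye analyses refine this kind of estimate against the full pattern rather than a single peak.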
Abstract: The effect of beak trimming on the behavior of two strains of Thai native pullets kept in floor pens was studied. Six general activities (standing, crouching, moving, comforting, roosting, and nesting), 6 beak-related activities (preening, feeding, drinking, pecking at inedible objects, feather pecking, and litter pecking), and 4 agonistic activities (head pecking, threatening, avoiding, and fighting) were measured twice a day for 15 consecutive days, starting when the pullets were 19 wk old. It was found that beak-trimmed pullets drank more frequently (P
Abstract: This paper discusses landscape design that could increase energy efficiency in a house. By planting trees in a house compound, the tree shade prevents direct sunlight from heating up the building and cools off the surrounding air. The requirement for air-conditioning could thereby be minimized and the air quality improved. During the lifetime of a tree, the cost savings from these benefits could amount to up to US$200 per tree. The project intends to visually describe a landscape design for a house compound that could enhance energy efficiency and consequently lead to energy savings. The house compound model was developed in three dimensions using AutoCAD 2005; the animation was programmed using the LightWave 3D software, i.e., Modeler and Layout, to display the tree shading on the walls. The visualization was executed on a VRML Pad platform and implemented in a web environment.
Abstract: This paper is a continuation of our daily energy peak load forecasting approach using our modified network, which is part of the recurrent-network family and is called the feed-forward and feed-back multi-context artificial neural network (FFFB-MCANN). The inputs to the network were exogenous variables, such as the previous and current change in the weather components and the previous and current status of the day, and endogenous variables, such as the past change in the loads. An endogenous variable, the current change in the loads, was used as the network output. Experiments show that using both endogenous and exogenous variables as inputs to the FFFB-MCANN, rather than either exogenous or endogenous variables alone, produces better results. Experiments also show that using the change in variables such as the weather components and the change in the past load as inputs to the FFFB-MCANN, rather than the absolute values of the weather components and past load, has a dramatic impact and produces better accuracy.
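The input encoding that the abstract reports works best (first differences of the endogenous load and exogenous weather series, plus day status) can be sketched as a feature builder. The exact feature layout below is an illustrative assumption, not the paper's precise network input.

```python
import numpy as np

def build_training_set(load, temp, day_type):
    """Build (input, target) pairs from *changes* (first differences) of the
    load and weather series rather than their absolute values."""
    d_load = np.diff(load)      # endogenous: change in load
    d_temp = np.diff(temp)      # exogenous: change in a weather component
    X, y = [], []
    for t in range(1, len(d_load)):
        X.append([d_temp[t - 1], d_temp[t],      # previous & current weather change
                  day_type[t], day_type[t + 1],  # previous & current day status
                  d_load[t - 1]])                # past change in load
        y.append(d_load[t])                      # target: current change in load
    return np.array(X), np.array(y)
```

Any regression network (such as the FFFB-MCANN of the paper) can then be trained on X and y; differencing removes the slowly varying level of the series, which is what the abstract credits for the accuracy gain.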
Abstract: Clustering in high-dimensional space is a difficult problem that recurs in many fields of science and engineering, e.g., bioinformatics, image processing, pattern recognition, and data mining. In high-dimensional space, some of the dimensions are likely to be irrelevant, thus hiding the possible clustering. In very high dimensions it is common for all the objects in a dataset to be nearly equidistant from each other, completely masking the clusters. Hence, the performance of a clustering algorithm decreases.
In this paper, we propose an algorithmic framework that combines the reduct concept of rough set theory with the k-means algorithm to remove the irrelevant dimensions in a high-dimensional space and obtain appropriate clusters. Our experiments on test data show that this framework increases the efficiency of the clustering process and the accuracy of the results.
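The two-stage idea (drop irrelevant dimensions, then cluster) can be sketched as follows. As a crude stand-in for the rough-set reduct, the sketch keeps the highest-variance dimensions; the paper's actual attribute selection uses reducts, not variance, so this filter is an assumption for illustration only.

```python
import numpy as np

def drop_irrelevant_dims(X, keep):
    """Keep the `keep` highest-variance columns (a simplifying stand-in
    for the rough-set reduct used in the paper)."""
    idx = np.argsort(X.var(axis=0))[::-1][:keep]
    return X[:, np.sort(idx)]

def kmeans(X, k, iters=50, seed=0):
    """Plain k-means with randomly chosen initial centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels
```

Removing the near-constant noise dimensions before clustering restores the distance contrast between clusters, which is the effect the abstract attributes to the reduct step.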
Abstract: Short Message Service (SMS) has grown in popularity over the years and has become a common means of communication. It is a service provided through the Global System for Mobile Communications (GSM) that allows users to send text messages to others.
SMS is usually used to transport unclassified information, but with the rise of mobile commerce it has become a popular tool for transmitting sensitive information between businesses and their clients. By default, SMS guarantees neither confidentiality nor integrity of the message content.
In mobile communication systems, the security (encryption) offered by the network operator applies only to the wireless link; data delivered through the mobile core network may not be protected. Existing end-to-end security mechanisms are provided at the application level and are typically based on public-key cryptosystems.
The main concern in a public-key setting is the authenticity of the public key; this issue can be resolved by identity-based (ID-based) cryptography, where the public key of a user can be derived from public information that uniquely identifies the user.
This paper presents an encryption mechanism based on the ID-based scheme using elliptic curves to provide end-to-end security for SMS. This mechanism has been implemented over the standard SMS network architecture, and the encryption overhead has been estimated and compared with the RSA scheme. This study indicates that the ID-based mechanism has advantages over the RSA mechanism in key distribution and in scaling the security level for mobile services.
Abstract: Quantum computation using qubits made of two-component Bose-Einstein condensates (BECs) is analyzed. We construct a general framework for quantum algorithms to be executed using the collective states of the BECs. The use of BECs allows for an increase of energy scales via bosonic enhancement, resulting in two-qubit gate operations that can be performed in a time reduced by a factor of N, where N is the number of bosons per qubit. We illustrate the scheme by an application to Deutsch's and Grover's algorithms, and discuss possible experimental implementations. Decoherence effects are analyzed under both general conditions and for the experimental implementation proposed.
Abstract: This research analyzes the regenerative burner and the recuperative burner for different reheating furnaces in the steel industry. The warm-air temperatures of the burners are determined to suit the sizes of the reheating furnaces by considering the air temperature, the fuel cost, and the investment cost. Calculations of the payback period and the net present value are used to compare the burners for the different reheating furnaces. An energy balance is utilized to calculate and compare the energy used in the different sizes of reheating furnaces for each burner. It is found that the warm-air temperature differs as the size of the reheating furnace varies. Based on consideration of the net present value and the payback period, the regenerative burner is suitable for all plants for the same burner lifetime. Finally, a sensitivity analysis of all factors is discussed.
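The two economic criteria used to compare the burners, net present value and payback period, can be sketched directly. The cash-flow figures in the usage check are invented for illustration, not taken from the study.

```python
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the year-0 flow, typically
    the negative investment cost."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

def payback_period(cash_flows):
    """Years until the cumulative (undiscounted) cash flow turns
    non-negative, with linear interpolation inside the payback year."""
    cumulative = 0.0
    for t, cf in enumerate(cash_flows):
        prev = cumulative
        cumulative += cf
        if cumulative >= 0 and t > 0:
            return t - 1 + (-prev / cf if cf else 0.0)
    return None  # investment never recovered
```

A regenerative burner with a higher investment cost but lower fuel cost would show a longer payback period yet possibly a higher NPV over the burner's lifetime, which is why the study considers both criteria.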
Abstract: In this paper, we consider the problem of logic simplification for a special class of logic functions, namely complementary Boolean functions (CBF), targeting low-power implementation using a static CMOS logic style. The functions are uniquely characterized by the presence of terms where, for a canonical binary 2-tuple, D(mj) ∩ D(mk) = { } and therefore |D(mj) ∩ D(mk)| = 0 [19]. Similarly, D(Mj) ∩ D(Mk) = { } and hence |D(Mj) ∩ D(Mk)| = 0. Here, 'mk' and 'Mk' represent a minterm and a maxterm, respectively. We compare the circuits minimized with our proposed method with those corresponding to the factored Reed-Muller (f-RM) form, the factored Pseudo Kronecker Reed-Muller (f-PKRM) form, and the factored Generalized Reed-Muller (f-GRM) form. We have opted for algebraic factorization of the Reed-Muller (RM) form and its different variants, using the factorization rules of [1], as it is simple and requires much less CPU execution time than Boolean factorization operations. This technique has enabled us to greatly reduce the literal count as well as the gate count needed for such RM realizations, which are generally prone to consuming more cells and, consequently, more power. However, this leads to a drawback in terms of the design-for-test attribute associated with the various RM forms. Though we still preserve the definition of those forms, viz. realizing such functionality with only select types of logic gates (AND and XOR gates), the structural integrity of the logic levels is not preserved. This would consequently alter the testability properties of such circuits, i.e., it may increase, decrease, or maintain the number of test input vectors needed for their exhaustive testability, subsequently affecting their generalized test-vector computation.
We do not consider the issue of design-for-testability here but instead focus on the power consumption of the final logic implementation, after realization with a conventional CMOS process technology (0.35-micron TSMC process). The quality of the resulting circuits, evaluated on the basis of an established cost metric, viz. power consumption, demonstrates average savings of 26.79% for the samples considered in this work, besides reductions in the number of gates and input literals of 39.66% and 12.98%, respectively, in comparison with the other factored RM forms.