Abstract: To monitor traffic traversal, sensors can be deployed to perform collaborative target detection. Such a sensor network achieves a certain level of detection performance at the associated costs of deployment and of the routing protocol. This paper addresses these two points, sensor deployment and routing algorithm, in the situation where the absolute number of sensors or the total energy becomes insufficient. The discussion of the best deployment scheme concludes that two kinds of deployments, Normal and Power-law distributions, sustain coverage 6 and 3 times longer, respectively, than a Random distribution. The routing algorithm needed to achieve good performance under each deployment scheme is also discussed; it is concluded that, in place of the traditional algorithm, a new algorithm can extend the coverage duration by 4 times under a Normal distribution, in the circumstance where every deployed sensor operates as a binary detection model.
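As a rough illustration of how the compared deployment distributions behave under a binary detection model, the sketch below estimates the instantaneous coverage fraction of a unit field for a Random (uniform) and a Normal deployment. All numbers (sensor count, sensing radius, standard deviation) are made up for illustration, and the sketch models neither energy drain nor routing, so it does not reproduce the coverage-duration ratios reported above.

```python
import random

random.seed(42)

def covered_fraction(sensors, radius, grid=20):
    """Fraction of grid points within `radius` of at least one sensor
    (binary detection model: a point is either covered or not)."""
    hits = 0
    for i in range(grid):
        for j in range(grid):
            px, py = (i + 0.5) / grid, (j + 0.5) / grid
            if any((px - x) ** 2 + (py - y) ** 2 <= radius ** 2
                   for x, y in sensors):
                hits += 1
    return hits / grid ** 2

def clamp(v):
    return min(1.0, max(0.0, v))

n, r = 60, 0.12  # illustrative sensor count and sensing radius
uniform = [(random.random(), random.random()) for _ in range(n)]
normal = [(clamp(random.gauss(0.5, 0.2)), clamp(random.gauss(0.5, 0.2)))
          for _ in range(n)]
print(round(covered_fraction(uniform, r), 2),
      round(covered_fraction(normal, r), 2))
```

A Normal deployment concentrates sensors near the field centre, so the two deployments trade central redundancy against edge coverage; the paper's result concerns how that redundancy translates into longer coverage lifetime.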
Abstract: Cognitive Science emerged about 40 years ago, in the wake of the challenge posed by Artificial Intelligence, as common territory for several scientific disciplines: computer science, mathematics, psychology, neurology, philosophy, sociology, and linguistics. The new-born science was justified by the complexity of the problems related to human knowledge on the one hand, and on the other by the fact that none of the above-mentioned sciences could explain mental phenomena alone. Based on data supplied by experimental sciences such as psychology and neurology, cognitive science builds models of the operation of the human mind. These models are implemented in computer programs and/or electronic circuits specific to artificial intelligence (cognitive systems), whose competences and performances are compared with human ones, leading to a reinterpretation of the psychology and neurology data and, in turn, to the construction of new models. In these processes, psychology provides the experimental basis, while philosophy and mathematics provide the level of abstraction that is indispensable for the interplay of the sciences mentioned.
The general problematic of the cognitive approach gives rise to two important types of approach: the computational one, starting from the idea that mental phenomena can be reduced to calculus operations on 1s and 0s, and the connectionist one, which considers the products of thinking to be the result of the interaction between all the component (included) systems. In psychology, measurement in the computational register uses classical questionnaires and psychometric tests, generally based on calculus methods. Considering both sides of cognitive science, we notice a gap in the possibilities for measuring psychological products from the connectionist perspective, which requires a unitary understanding of the quality-quantity whole. In such an approach, measurement by calculus proves inefficient. Our research, carried out for more than 20 years, leads to the conclusion that measurement by forms properly fits the laws and principles of connectionism.
Abstract: The ever-growing use of the aspect-oriented development methodology in software engineering requires tool support for both research environments and industry. So far, tool support for many activities in aspect-oriented software development has been proposed to automate and facilitate development. For instance, AJaTS provides a transformation system to support aspect-oriented development and refactoring. In particular, it is well established that the abstract interpretation of programs pursued in static analysis, in any paradigm, is best served by a high-level program representation such as the Control Flow Graph (CFG): such analysis can more easily locate common programming idioms for which helpful transformations are already known, and the association between the input program and the intermediate representation can be maintained more closely. However, although current research defines, to some extent, sound concepts and foundations for control flow analysis of aspect-oriented programs, it does not provide a concrete tool that can construct the CFG of these programs on its own. Furthermore, most of these works focus on other issues in Aspect-Oriented Software Development (AOSD), such as testing or data flow analysis, rather than on the CFG itself. This study is therefore dedicated to building an aspect-oriented control flow graph construction tool called AJcFgraph Builder. The tool can be applied to many software engineering tasks in the context of AOSD, such as software testing, software metrics, and so forth.
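For readers unfamiliar with the representation, a control flow graph is simply nodes (statements) plus directed edges (possible transfers of control). The minimal sketch below, with hypothetical node names rather than AJcFgraph Builder's actual data structures, shows how advice woven before a method call appears as an extra node on the control flow path.

```python
class CFG:
    """Minimal control flow graph: nodes are statement labels, edges are
    possible transfers of control."""
    def __init__(self):
        self.succ = {}

    def add_edge(self, src, dst):
        self.succ.setdefault(src, []).append(dst)
        self.succ.setdefault(dst, [])

    def reachable(self, start):
        """Depth-first search: all nodes reachable from `start`."""
        seen, stack = set(), [start]
        while stack:
            n = stack.pop()
            if n not in seen:
                seen.add(n)
                stack.extend(self.succ.get(n, []))
        return seen

# A method call wrapped by "before" advice, as an aspect weaver might
# introduce it: control flows through the advice before the call.
g = CFG()
g.add_edge("entry", "before_advice")
g.add_edge("before_advice", "call_target")
g.add_edge("call_target", "exit")
print(sorted(g.reachable("entry")))
# → ['before_advice', 'call_target', 'entry', 'exit']
```

Static analyses such as testing-coverage or data-flow computations then operate over exactly this kind of node/edge structure.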
Abstract: In this paper, we develop a spatio-temporal graph as a key component of our knowledge representation scheme. We design an integrated representation scheme to depict not only the present and past but also the future, in parallel with the corresponding spaces, in an effective and intuitive manner. The resulting multi-dimensional, comprehensive knowledge structure accommodates a multi-layered virtual world that develops over time, maximizing the diversity of situations in their historical context. This knowledge representation scheme is to be used as the basis for simulating the situations composing the virtual world and for implementing the knowledge with which virtual agents judge and evaluate those situations. To provide natural contexts for situated learning or simulation games, the virtual stage set by this spatio-temporal graph is to be populated by agents and other objects, interrelated and changing, which are abstracted in the ontology.
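A minimal sketch of what one node of such a spatio-temporal graph might look like, assuming hypothetical field names (a space, a time interval, and successor links along the time axis) rather than the paper's actual scheme:

```python
from dataclasses import dataclass, field

@dataclass
class STNode:
    """One situation in a spatio-temporal graph: a space, a time
    interval [start, end), and edges to subsequent situations."""
    space: str
    start: int
    end: int
    successors: list = field(default_factory=list)

# Past, present, and a projected future situation of the same space,
# linked along the time axis.
past = STNode("market_square", 0, 10)
present = STNode("market_square", 10, 20)
future = STNode("market_square", 20, 30)
past.successors.append(present)
present.successors.append(future)
print([(n.start, n.end) for n in past.successors[0].successors])  # → [(20, 30)]
```

Layering several such chains, one per space, gives the multi-layered structure the abstract describes.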
Abstract: The Japanese integrative approach to social systems
can be observed in supply chain management as well as in the
relationship between public and private sectors. Both the Lean
Production System and the Developmental State Model are
characterized by efforts towards the achievement of mutual goals,
resulting in initiatives for capacity building which emphasize the
system level. In Brazil, although organizations undertake efforts to
build capabilities at the individual and organizational levels, the
system level is being neglected. Fieldwork data confirmed the findings
of other studies in terms of the lack of integration in supply chain
management in the Brazilian automobile industry. Moreover, due to
the absence of an active role of the Brazilian state in its relationship
with the private sector, automakers are not fully exploiting the
opportunities in the domestic and regional markets. To promote a higher level of economic growth and to increase the degree of spill-over of technologies and techniques, a more integrative approach is needed.
Abstract: This paper presents a simple and sensitive kinetic
spectrophotometric method for the determination of ramipril in
commercial dosage forms. The method is based on the reaction of the
drug with 1-chloro-2,4-dinitrobenzene (CDNB) in dimethylsulfoxide
(DMSO) at 100 ± 1 °C. The reaction is followed spectrophotometrically by measuring the rate of change of the absorbance at 420 nm. Fixed-time (ΔA) and equilibrium methods are adopted for constructing the calibration curves. Both calibration curves were found to be linear over the concentration range 20-220 μg/ml. Regression analysis of the calibration data yielded the linear equations ΔA = 6.30 × 10⁻⁴ + 1.54 × 10⁻³ C and A = 3.62 × 10⁻⁴ + 6.35 × 10⁻³ C for the fixed-time (ΔA) and equilibrium methods, respectively. The limits of detection (LOD) for the fixed-time and equilibrium methods are 1.47 and 1.05 μg/ml, respectively. The method has been successfully applied to the determination of ramipril in commercial dosage forms. Statistical comparison of the results shows no significant difference between the proposed methods and Abdellatef's spectrophotometric method.
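As a worked example of how the reported calibration lines are used, the sketch below inverts them to read a concentration from a measured absorbance. The intercepts and slopes are taken from the abstract; the sample absorbance and function names are made up for illustration.

```python
# Calibration lines from the abstract, of the form A = a + b*C
# with C in µg/ml.
FIXED_TIME = (6.30e-4, 1.54e-3)   # fixed-time (ΔA) method
EQUILIBRIUM = (3.62e-4, 6.35e-3)  # equilibrium method

def concentration(absorbance, method):
    """Invert the calibration line: C = (A - a) / b."""
    a, b = method
    return (absorbance - a) / b

# A solution of exactly 100 µg/ml would give ΔA = 6.30e-4 + 1.54e-3*100
# by the fixed-time line; inverting recovers the concentration.
A = FIXED_TIME[0] + FIXED_TIME[1] * 100.0
print(round(concentration(A, FIXED_TIME), 6))  # → 100.0
```

The same inversion applies to the equilibrium line, over the same linear range of 20-220 µg/ml.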
Abstract: This paper deals with a new way of designing external fixators applied in traumatology and orthopaedics. These fixators can be applied in the treatment of open and unstable fractures or for lengthening human or animal bones. The new design is based on the development of the Ilizarov and other techniques (i.e. shape and weight optimization based on composite materials, application of smart materials, nanotechnology, low X-ray absorption, antibacterial protection, patient comfort, reduction in the duration of the surgical treatment, and cost).
Abstract: An interesting method of producing calcium carbonate is based on a gas-liquid reaction between carbon dioxide and aqueous solutions of calcium hydroxide. The design parameters for the gas-liquid phase are the flow regime, the individual mass transfer coefficients, and the gas-liquid specific interfacial area. Most studies of the gas-liquid phase have been devoted to the experimental determination of some of these parameters, and more specifically of the mass transfer coefficient kLa, which depends fundamentally on the superficial gas velocity and on the physical properties of the absorption phase. The principal investigation was directed at the effect of vibration on the mass transfer coefficient kLa in the gas-liquid phase during absorption of CO2 in an aqueous solution of calcium hydroxide. Vibration at a higher frequency increased the mass transfer coefficient kLa, whereas vibration at a lower frequency did not improve it; the mass transfer coefficient kLa also increased with increasing superficial gas velocity.
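The role of kLa can be illustrated with the standard absorption model dC/dt = kLa (C* - C), whose solution C(t) = C* (1 - exp(-kLa t)) implies that ln(1 - C/C*) decays linearly with slope -kLa; kLa can therefore be estimated from concentration-time data by a one-parameter least-squares fit. The model, numbers, and function names below are illustrative assumptions, not the paper's experimental procedure.

```python
import math

def estimate_kla(times, conc, c_star):
    """Estimate kLa (1/s) from dissolved-gas concentration data, assuming
    dC/dt = kLa*(C* - C), i.e. -ln(1 - C/C*) = kLa * t.
    Least-squares slope of a line through the origin."""
    num = sum(t * (-math.log(1.0 - c / c_star)) for t, c in zip(times, conc))
    den = sum(t * t for t in times)
    return num / den

# Synthetic check: data generated with kLa = 0.02 s^-1 should be recovered.
c_star = 1.2e-3  # saturation concentration, mol/L (illustrative value)
times = [10.0, 20.0, 40.0, 80.0, 160.0]
conc = [c_star * (1.0 - math.exp(-0.02 * t)) for t in times]
print(round(estimate_kla(times, conc, c_star), 4))  # → 0.02
```

The paper's observation then amounts to this fitted kLa growing with superficial gas velocity and with vibration frequency above some threshold.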
Abstract: ToolTracker is a client-server application. It is essentially a catalogue of the various network monitoring and management tools that are available online. A database maintained on the server side contains information about the tools, which several clients can access and use simultaneously. The categories of tools considered for the development of this application include packet sniffers, port mappers, port scanners, encryption tools, and vulnerability scanners. The application provides a front end through which the user can invoke any tool from a central repository for purposes such as packet sniffing, port scanning, and network analysis. Apart from the tool itself, its description and associated help files are also stored in the central repository, enabling the user to view the documentation for a tool without having to download and install it. The application updates the central repository with the latest versions of the tools, informs the user when a newer version of the tool currently in use is available, and offers the choice of installing that newer version. ToolTracker thus provides the network administrator with much-needed abstraction and ease-of-use with respect to the tools used to monitor a network efficiently.
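The server-side catalogue lookups described above can be sketched as follows; the record schema, tool names, version strings, and function names are all hypothetical illustrations, not ToolTracker's actual database design.

```python
# Hypothetical catalogue records maintained on the server side.
CATALOGUE = {
    "wireshark": {"category": "packet sniffer", "latest": "4.2"},
    "nmap": {"category": "port scanner", "latest": "7.95"},
}

def tools_in_category(category):
    """Lookup a client would invoke to list tools of one category."""
    return sorted(name for name, rec in CATALOGUE.items()
                  if rec["category"] == category)

def update_available(name, installed_version):
    """Tell the user whether a newer version exists in the repository."""
    return CATALOGUE[name]["latest"] != installed_version

print(tools_in_category("port scanner"))     # → ['nmap']
print(update_available("wireshark", "4.0"))  # → True
```

In the real application these lookups would sit behind the client-server boundary, with the front end rendering the results and the stored help files.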
Abstract: Business process modeling has become an accepted
means for designing and describing business operations. Thereby,
consistency of business process models, i.e., the absence of modeling
faults, is of utmost importance to organizations. This paper presents
a concept and subsequent implementation for detecting faults in
business process models and for computing a measure of their
consistency. It incorporates not only syntactic consistency but also
semantic consistency, i.e., consistency regarding the meaning of
model elements from a business perspective.
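As a toy illustration of what a computed consistency measure could look like (this is not the paper's actual metric), the sketch below flags one simple syntactic fault, a non-end node with no outgoing sequence flow, and reports the fraction of fault-free model elements; semantic checks would plug into the same loop.

```python
def consistency_degree(model):
    """Fraction of model elements free of detected faults. Here the only
    check is syntactic: every node except an end event must have at
    least one outgoing edge."""
    faults = 0
    for node, kind in model["nodes"].items():
        outgoing = [e for e in model["edges"] if e[0] == node]
        if not outgoing and kind != "end":
            faults += 1
    return 1.0 - faults / len(model["nodes"])

model = {
    "nodes": {"start": "event", "check_order": "task",
              "ship": "task", "done": "end"},
    "edges": [("start", "check_order"), ("check_order", "ship")],
    # "ship" has no outgoing edge and is not an end event -> one fault
}
print(consistency_degree(model))  # → 0.75
```

A semantic rule, e.g. that an element's label matches its business meaning, would simply add further fault predicates to the same count.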
Abstract: The decoding of Low-Density Parity-Check (LDPC) codes operates on a redundant structure known as the bipartite graph, meaning that the full set of bit nodes is not absolutely necessary for decoder convergence. In 2008, Soyjaudah and Catherine designed a recovery algorithm for LDPC codes based on this observation and showed that the error-correcting performance of their codes outperformed that of conventional LDPC codes. In this work, the use of the recovery algorithm is explored further to test the performance of LDPC codes as the number of iterations is progressively increased. For experiments conducted with small block lengths of up to 800 bits and numbers of iterations of up to 2000, the results interestingly demonstrate that, contrary to conventional wisdom, the error-correcting performance keeps improving with an increasing number of iterations.
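As background, the iterative decoding loop whose iteration count the experiments sweep can be sketched with a textbook hard-decision bit-flipping decoder on a tiny parity-check matrix (a (7,4) Hamming code standing in for an LDPC code); this is a generic decoder for illustration, not Soyjaudah and Catherine's recovery algorithm.

```python
def bit_flip_decode(H, word, max_iter):
    """Hard-decision bit-flipping decoding on the Tanner (bipartite)
    graph defined by parity-check matrix H. Each iteration flips the
    bit involved in the most unsatisfied checks."""
    word = list(word)
    for it in range(max_iter):
        syndrome = [sum(h * b for h, b in zip(row, word)) % 2 for row in H]
        if not any(syndrome):
            return word, it  # converged: all parity checks satisfied
        # count, for each bit, how many unsatisfied checks it appears in
        counts = [sum(row[j] for row, s in zip(H, syndrome) if s)
                  for j in range(len(word))]
        word[counts.index(max(counts))] ^= 1
    return word, max_iter

# (7,4) Hamming code used as a tiny stand-in for an LDPC code.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
received = [0, 0, 1, 0, 0, 0, 0]  # all-zero codeword with one bit error
decoded, iters = bit_flip_decode(H, received, max_iter=2000)
print(decoded, iters)  # → [0, 0, 0, 0, 0, 0, 0] 1
```

The iteration cap (here 2000, matching the experiments) only matters for words that do not converge early; the paper's finding is about how raising that cap keeps paying off.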
Abstract: The model-driven paradigm places models at the center of the development process. These models are expressed in languages such as UML, the language standardized by the OMG, which has become essential for development. Similarly, the ontology engineering paradigm places ontologies at the center of the development process; in this paradigm, OWL is the principal language for knowledge representation. Building ontologies from scratch is generally a difficult task. Bridges between UML and OWL have appeared in several respects, such as classes and associations. In this paper, we profit from the convergence between UML and OWL to propose an approach, based on meta-modelling and graph grammars and situated within the MDA architecture, for the automatic generation of OWL ontologies from UML class diagrams. The transformation is based on transformation rules; the level of abstraction in these rules is kept close to the application in order to obtain usable ontologies. We illustrate the approach with an example.
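The flavour of such class-and-association transformation rules can be sketched directly in code. The rules below (UML class to owl:Class, typed attribute to owl:DatatypeProperty, association to owl:ObjectProperty) are common illustrative mappings, not the paper's graph-grammar rules, and the namespace and example names are made up.

```python
def uml_to_owl(classes, associations):
    """Emit Turtle for a simple UML-to-OWL mapping:
    class -> owl:Class, attribute -> owl:DatatypeProperty,
    association -> owl:ObjectProperty."""
    lines = ["@prefix : <http://example.org/onto#> .",
             "@prefix owl: <http://www.w3.org/2002/07/owl#> .",
             "@prefix rdfs: <http://www.w3.org/2000/01/rdf-schema#> .",
             "@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .", ""]
    for cls, attrs in classes.items():
        lines.append(f":{cls} a owl:Class .")
        for attr, dtype in attrs.items():
            lines.append(f":{attr} a owl:DatatypeProperty ; "
                         f"rdfs:domain :{cls} ; rdfs:range xsd:{dtype} .")
    for name, (src, dst) in associations.items():
        lines.append(f":{name} a owl:ObjectProperty ; "
                     f"rdfs:domain :{src} ; rdfs:range :{dst} .")
    return "\n".join(lines)

classes = {"Person": {"name": "string"}, "Company": {}}
associations = {"worksFor": ("Person", "Company")}
print(uml_to_owl(classes, associations))
```

In the paper's setting, each such mapping would instead be expressed as a graph-grammar rule applied to the class-diagram graph within the MDA tool chain.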
Abstract: At present, the auto parts industry faces ever higher challenges in its market strategy. As a consequence, manufacturers need to respond better to customers in terms of quality, cost, and delivery time. Moreover, they need good factory management to comply with international standards at maximum capacity and lower cost. This leads companies to order standard parts from abroad, which becomes the major inventory cost. This auto parts research on recycled materials compares auto parts made from recycled materials with international auto parts (CKD). The factors studied were the recycled-material ratios of PU foam, felt, and fabric. The recycled materials were evaluated, in terms of quality and properties, on parameters such as weight, sound absorption, water absorption, tensile strength, elongation, and heat resistance against the CKD parts. The results showed that the recycled materials could be used as replacements for the CKD parts.
Abstract: Ramadan requires individuals to abstain from food and fluid intake between sunrise and sunset; physiological considerations predict that poorer mood, physical performance and mental performance will result. In addition, any difficulties will be worsened because preparations for fasting and recovery from it often mean that nocturnal sleep is decreased in length, and this independently affects mood and performance.
A difficulty of interpretation in many studies is that the observed changes could be due to fasting but also to the decreased length of sleep and altered food and fluid intakes before and after the daytime fasting. These factors were separated in this study, which took place over three separate days and compared the effects of different durations of fasting (4, 8 or 16h) upon a wide variety of measures (including subjective and objective assessments of performance, body composition, dehydration and responses to a short bout of exercise) - but with an unchanged amount of nocturnal sleep, controlled supper the previous evening, controlled intakes at breakfast and daytime naps not being allowed. Many of the negative effects of fasting observed in previous studies were present in this experiment also. These findings indicate that fasting was responsible for many of the changes previously observed, though some effect of sleep loss, particularly if occurring on successive days (as would occur in Ramadan) cannot be excluded.
Abstract: An accurate and proficient artificial neural network (ANN) combined with a genetic algorithm (GA) is developed for predicting nanofluid viscosity. The GA is used to optimize the neural network parameters so as to minimize the error between the predicted viscosity and the experimental one. Experimental viscosities of two nanofluids, Al2O3-H2O and CuO-H2O, from 278.15 to 343.15 K and at volume fractions up to 15%, were taken from the literature. The results of this study reveal that the GA-NN model outperforms conventional neural networks in predicting the viscosity of nanofluids, with mean absolute relative errors of 1.22% and 1.77% for Al2O3-H2O and CuO-H2O, respectively. Furthermore, the results of this work have also been compared with other models. The findings demonstrate that the GA-NN model is an effective method for predicting nanofluid viscosity, with better accuracy and simplicity than the other models.
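The GA-over-network-parameters idea can be sketched end to end on synthetic data. Everything below is an illustrative assumption: a tiny two-hidden-unit network, a plain elitist GA with Gaussian mutation, and a made-up exponential-decay "viscosity versus temperature" curve, not the paper's literature data or its exact GA design.

```python
import math
import random

random.seed(0)

def net(params, x):
    """Tiny 1-input, 2-hidden-tanh, 1-output network;
    params = [w1, b1, w2, b2, v1, v2, c]."""
    w1, b1, w2, b2, v1, v2, c = params
    return v1 * math.tanh(w1 * x + b1) + v2 * math.tanh(w2 * x + b2) + c

def mse(params, data):
    return sum((net(params, x) - y) ** 2 for x, y in data) / len(data)

# Synthetic target curve (illustrative, not the paper's data).
data = [(t / 10.0, math.exp(-t / 10.0)) for t in range(11)]

def ga_optimize(pop_size=40, generations=200, sigma=0.3):
    """Elitist GA: keep the best quarter, refill the population with
    Gaussian mutations of the elite, minimizing the network MSE."""
    pop = [[random.uniform(-1, 1) for _ in range(7)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: mse(p, data))
        elite = pop[:pop_size // 4]
        pop = elite + [[g + random.gauss(0, sigma)
                        for g in random.choice(elite)]
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=lambda p: mse(p, data))

best = ga_optimize()
print(round(mse(best, data), 4))
```

In the paper's setting the same loop would search the real network's weights (and possibly topology) against the experimental viscosity data instead of this toy curve.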
Abstract: The subset selection approach to polynomial regression model building assumes that the chosen fixed full set of predefined basis functions contains a subset sufficient to describe the target relation well. However, in most cases the necessary set of basis functions is not known and needs to be guessed, a potentially non-trivial (and long) trial-and-error process.
In our research we consider a potentially more efficient approach –
Adaptive Basis Function Construction (ABFC). It lets the model
building method itself construct the basis functions necessary for
creating a model of arbitrary complexity with adequate predictive
performance. However, two issues plague, to some extent, the methods of both subset selection and the ABFC, especially when working with relatively small data samples: selection bias and selection instability. We try to correct these
issues by model post-evaluation using Cross-Validation and model
ensembling. To evaluate the proposed method, we empirically
compare it to ABFC methods without ensembling, to a widely used
method of subset selection, as well as to some other well-known
regression modeling methods, using publicly available data sets.
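The combination of Cross-Validation for model post-evaluation and ensembling can be sketched on the simpler subset-selection setting of choosing a polynomial degree. The data, candidate degrees, and top-three averaging rule below are illustrative assumptions, not the ABFC method or the paper's experimental protocol.

```python
import random

random.seed(1)

def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)]

def polyfit(xs, ys, d):
    """Least-squares polynomial fit via the normal equations."""
    A = [[sum(x ** (i + j) for x in xs) for j in range(d + 1)]
         for i in range(d + 1)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(d + 1)]
    return solve(A, b)

def peval(coefs, x):
    return sum(c * x ** i for i, c in enumerate(coefs))

def cv_error(xs, ys, d, k=5):
    """k-fold cross-validation MSE for a polynomial of degree d."""
    err = 0.0
    for f in range(k):
        held = set(range(f, len(xs), k))
        tr = [i for i in range(len(xs)) if i not in held]
        c = polyfit([xs[i] for i in tr], [ys[i] for i in tr], d)
        err += sum((peval(c, xs[i]) - ys[i]) ** 2 for i in held)
    return err / len(xs)

# Noisy samples of a cubic target relation (illustrative data).
xs = [i / 10.0 for i in range(30)]
ys = [x ** 3 - 2 * x + random.gauss(0, 0.1) for x in xs]

# Rank candidate degrees by CV error, then ensemble the top three
# models by averaging their predictions.
ranked = sorted(range(1, 9), key=lambda d: cv_error(xs, ys, d))
models = [polyfit(xs, ys, d) for d in ranked[:3]]

def ensemble(x):
    return sum(peval(c, x) for c in models) / len(models)

print(round(ensemble(1.5), 2))  # true value at x=1.5 is 1.5**3 - 3 = 0.375
```

Averaging the top few CV-ranked models, rather than committing to the single winner, is exactly what dampens the selection bias and instability mentioned above.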
Abstract: This paper continues our daily energy peak load forecasting approach using our modified network, which belongs to the recurrent network family and is called the feed-forward and feedback multi-context artificial neural network (FFFB-MCANN). The inputs to the network were exogenous variables, such as the previous and current change in the weather components and the previous and current status of the day, and endogenous variables, such as the past change in the loads. An endogenous variable, the current change in the loads, was used as the network output. Experiments show that using both endogenous and exogenous variables as inputs to the FFFB-MCANN, rather than either exogenous or endogenous variables alone, produces better results. Experiments also show that using the changes in variables such as the weather components and the past load as inputs to the FFFB-MCANN, rather than their absolute values, has a dramatic impact and produces better accuracy.
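The change-based input construction advocated above can be sketched as a small preprocessing step: first-difference both series, then assemble each training pair from past exogenous and endogenous changes with the current load change as the target. The feature layout and names are illustrative, not the paper's exact input vector.

```python
def make_training_pairs(loads, temps):
    """Build network inputs from *changes* (first differences) of the
    series rather than absolute values. Each pair is
    ([prev temp change, curr temp change, prev load change], curr load change)."""
    d_load = [b - a for a, b in zip(loads, loads[1:])]
    d_temp = [b - a for a, b in zip(temps, temps[1:])]
    pairs = []
    for t in range(1, len(d_load)):
        x = [d_temp[t - 1], d_temp[t], d_load[t - 1]]  # exogenous + endogenous
        y = d_load[t]                                   # current change in load
        pairs.append((x, y))
    return pairs

loads = [100, 104, 103, 108, 110]  # illustrative daily peak loads
temps = [20, 22, 21, 23, 24]       # illustrative weather component
print(make_training_pairs(loads, temps))
# → [([2, -1, 4], -1), ([-1, 2, -1], 5), ([2, 1, 5], 2)]
```

Differencing removes the slowly varying level of each series, which is one plausible reason change-based inputs train better than absolute values.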
Abstract: The Inter-feeder Power Flow Regulator (IFPFR) proposed in this paper consists of several voltage source inverters with a common dc bus; each inverter is connected in series with one of several independent distribution feeders in the power system. This paper is concerned with how to transfer power between the feeders for load-sharing purposes. The power controller of each inverter injects power (for a sending feeder) or absorbs power (for a receiving feeder) by injecting a suitable voltage; this voltage injection is emulated as the voltage drop across a series virtual impedance, whose value is selected to achieve power exchange between the feeders without perturbing the load voltage magnitude of each feeder. A new control scheme for load sharing using the IFPFR is proposed in this paper.
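The virtual-impedance idea can be made concrete with a minimal phasor calculation: the series voltage the inverter injects equals the drop Z_v·I across the chosen virtual impedance, and the complex power it exchanges with the feeder is V_inj·I*. The numeric values below are illustrative, not from the paper.

```python
import cmath

# Illustrative numbers only: feeder current phasor and a chosen
# series virtual impedance.
I = cmath.rect(100.0, cmath.pi / 6)  # 100 A at 30 degrees
Z_virtual = complex(0.05, 0.2)       # ohms

V_inj = Z_virtual * I                # series voltage the inverter injects
S = V_inj * I.conjugate()            # complex power exchanged with the feeder
print(round(S.real, 1), round(S.imag, 1))  # → 500.0 2000.0  (P in W, Q in var)
```

Since S = Z_v·|I|², the real part of the virtual impedance sets the active power sent or absorbed, which is the lever the power controller uses for inter-feeder load sharing.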
Abstract: A series of tellurite glasses of the system 78TeO2-10PbO-10Li2O-(2-x)Nd2O3-xEr2O3, where x = 0.5, 1.0, 1.5 and 2.0, was successfully prepared, and a study of the upconversion luminescence of the Nd3+/Er3+ co-doped tellurite glass was carried out. From Judd-Ofelt analysis, the experimental lifetimes τ_exp of the glass series are found to be higher in the visible region, varying from 65.17 ms to 114.63 ms, whereas in the near-infrared (NIR) region the lifetimes vary from 2.133 ms to 2.270 ms. Meanwhile, the emission cross-section σ is found to vary from 0.004 × 10⁻²⁰ cm² to 1.007 × 10⁻²⁰ cm² with composition. The emission spectra of the glass are found to be contributed by the Nd3+ and Er3+ ions, with nine significant transition peaks observed. The upconversion mechanism of the co-doped tellurite glass is shown in schematic energy diagrams. In this work, it is found that excited-state absorption (ESA) remains dominant in the upconversion excitation process, as the upconversion excitation mechanism of the Nd3+ excited-state levels is accomplished through a stepwise multiphonon process. An efficient excitation energy transfer (ET) is observed between Nd3+ as the donor and Er3+ as the acceptor, and the respective emission spectra are observed as a result.
Abstract: A new chelating resin is prepared by coupling Amberlite XAD-4 with 1-amino-2-naphthol through an azo spacer. The resulting sorbent has been characterized by FT-IR, elemental analysis, and thermogravimetric analysis (TGA), and studied for the preconcentration of Fe(II) using flame atomic absorption spectrometry (FAAS) for metal monitoring. The optimum pH for sorption of the iron ions was 6.5. The resin was evaluated through batch binding of the mentioned metal ion. Quantitative desorption occurs instantaneously with 0.5 M HNO3. The sorption capacity was found to be 4.1 mmol g⁻¹ of resin for Fe(II) in aqueous solution. The chelating resin can be reused for 10 sorption-desorption cycles without any significant change in sorption capacity. A recovery of 97% was obtained for the metal ions with 0.5 M HNO3 as the eluting agent. The method was applied to the determination of metal ions in an industrial wastewater sample.