Abstract: The current paper conceptualizes the technique of
release consistency together with user-defined synchronization. A
programming model built on objects and classes is illustrated and
demonstrated. The essence of the paper is phases, events, and parallel
execution. The technique by which the values of shared variables are
made visible is implemented. The second part of the paper consists of
the implementation of user-defined high-level synchronization
primitives and a system architecture with memory protocols.
Techniques central to deciding the validation and invalidation of a
stale page are proposed.
Abstract: The paper presents an approach for handling uncertain
information in deductive databases using multivalued logics. Uncertainty
means that database facts may be assigned logical values other
than the conventional ones - true and false. The logical values represent
various degrees of truth, which may be combined and propagated
by applying the database rules. A corresponding multivalued database
semantics is defined. We show that it extends successful conventional
semantics such as the well-founded semantics, and has polynomial-time
data complexity.
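A minimal sketch of how degrees of truth might combine and propagate through a database rule; the min-based conjunction, the 1 − v negation, and the facts themselves are illustrative assumptions, not the multivalued semantics the paper actually defines:

```python
# Hypothetical sketch: propagating degrees of truth through a database rule.
# min for AND and 1 - v for NOT are common choices in multivalued logics;
# the paper's actual combination operators may differ.

def rule_truth(body_values, combine=min):
    """Truth degree of a rule head given truth degrees of its body atoms."""
    return combine(body_values)

# Facts carry degrees of truth in [0, 1] rather than just {true, false}.
facts = {"bird(tweety)": 0.9, "injured(tweety)": 0.4}

# Rule: flies(X) <- bird(X) AND NOT injured(X)
flies = rule_truth([facts["bird(tweety)"], 1 - facts["injured(tweety)"]])
print(flies)  # min(0.9, 0.6) = 0.6
```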
Abstract: In this paper, the implementation of a rule-based
intuitive reasoner is presented. The implementation included two
parts: the rule induction module and the intuitive reasoner. A large
weather database was acquired as the data source. Twelve weather
variables from those data were chosen as the “target variables"
whose values were predicted by the intuitive reasoner. A “complex"
situation was simulated by making only subsets of the data available
to the rule induction module. As a result, the rules induced were
based on incomplete information with variable levels of certainty.
The certainty level was modeled by a metric called "Strength of
Belief", which was assigned to each rule or datum as ancillary
information about the confidence in its accuracy. Two techniques
were employed to induce rules from the data subsets: decision tree
and multi-polynomial regression, respectively for the discrete and the
continuous type of target variables. The intuitive reasoner was tested
for its ability to use the induced rules to predict the classes of the
discrete target variables and the values of the continuous target
variables. The intuitive reasoner implemented two types of
reasoning: fast and broad, where, by analogy to human thought, the
former corresponds to fast decision making and the latter to deeper
contemplation. For reference, a weather data analysis approach
which had been applied on similar tasks was adopted to analyze the
complete database and create predictive models for the same 12
target variables. The values predicted by the intuitive reasoner and
the reference approach were compared with actual data. The intuitive
reasoner reached near-100% accuracy for two continuous target
variables. For the discrete target variables, the intuitive reasoner
predicted at least 70% as accurately as the reference reasoner. Since
the intuitive reasoner operated on rules derived from only about 10%
of the total data, it demonstrated the potential advantages in dealing
with sparse data sets as compared with conventional methods.
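The "Strength of Belief" weighting described above can be sketched as a confidence-weighted vote over the outputs of the induced rules; the rules, classes, and weights below are made-up illustrations, not the reasoner's actual combination scheme:

```python
# Hypothetical sketch: combining predictions from induced rules, each carrying
# a "Strength of Belief" (SoB) weight expressing confidence in its accuracy.

def weighted_vote(predictions):
    """predictions: list of (predicted_class, strength_of_belief) pairs."""
    scores = {}
    for cls, sob in predictions:
        scores[cls] = scores.get(cls, 0.0) + sob
    return max(scores, key=scores.get)

# Three rules fire on the same input with different confidence levels.
rule_outputs = [("rain", 0.8), ("no_rain", 0.5), ("rain", 0.3)]
prediction = weighted_vote(rule_outputs)
print(prediction)  # "rain" wins with total weight 1.1 vs 0.5
```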
Abstract: This paper presents the development of an electricity simulation model taking into account electrical network constraints, applied to the Belgian power system. The core of the model is the optimization of an extensive Unit Commitment (UC) problem by means of Mixed Integer Linear Programming (MILP). Electrical constraints are incorporated through the implementation of a DC load flow. The model covers the Belgian power system in a 220–380 kV high voltage network (i.e., 93 power plants and 106 nodes). The model features the use of pumped storage facilities as well as the inclusion of spinning reserves in a single optimization process. Solution times of the model remain within reasonable bounds.
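The DC load flow that supplies the network constraints can be sketched on a toy 3-bus system (illustrative numbers, not the 106-node Belgian network):

```python
# Toy DC load flow on a 3-bus network. DC assumptions: lossless lines, flat
# voltage profile, small angle differences, so line flow P_ij = b_ij * (theta_i - theta_j).

lines = {(0, 1): 10.0, (1, 2): 8.0, (0, 2): 5.0}  # line susceptances, p.u.
p1, p2 = 1.0, -1.5  # net injections at buses 1 and 2; bus 0 is the slack

# Reduced susceptance matrix for the non-slack buses (slack angle fixed at 0).
b11 = lines[(0, 1)] + lines[(1, 2)]
b12 = -lines[(1, 2)]
b22 = lines[(0, 2)] + lines[(1, 2)]

# Solve the 2x2 system B * theta = P with Cramer's rule.
det = b11 * b22 - b12 * b12
theta = [0.0, (p1 * b22 - b12 * p2) / det, (b11 * p2 - b12 * p1) / det]

# Line flows follow from the angle differences and balance the injections.
flows = {(i, j): b * (theta[i] - theta[j]) for (i, j), b in lines.items()}
print(flows)
```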
Abstract: In this paper, the relative performances on spectral
classification of short exon and intron sequences of the human and
eleven model organisms are studied. In the simulations, all
combinations of sixteen one-sequence numerical representations, four
threshold values, and four window lengths are considered. Sequences
of 150-base length are chosen and for each organism, a total of
16,000 sequences are used for training and testing. Results indicate
that an appropriate combination of one-sequence numerical
representation, threshold value, and window length is essential for
arriving at top spectral classification results. For fixed-length
sequences, the precisions on exon and intron classification obtained
for different organisms are not the same because of their genomic
differences. In general, precision increases as sequence length
increases.
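One "one-sequence numerical representation" followed by a spectral measure can be sketched as below; the EIIP-style real mapping, the toy sequence, and the period-3 criterion are illustrative assumptions, since the abstract does not specify which of the sixteen representations is best:

```python
# Hedged sketch: map a DNA sequence to numbers, then measure the relative
# spectral power at period 3, a classic exon indicator. The EIIP mapping is
# one common representation; the paper's sixteen representations may differ.
import cmath

mapping = {"A": 0.1260, "C": 0.1340, "G": 0.0806, "T": 0.1335}  # EIIP values

def power_at(signal, k):
    """Power of the DFT of `signal` at frequency index k."""
    n = len(signal)
    coeff = sum(x * cmath.exp(-2j * cmath.pi * k * i / n) for i, x in enumerate(signal))
    return abs(coeff) ** 2

seq = "ATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGATAG"  # toy sequence, not from the paper
signal = [mapping[b] for b in seq]

n = len(signal)
p3 = power_at(signal, n // 3)                       # power near period 3
total = sum(power_at(signal, k) for k in range(1, n))
ratio = p3 / total
print(ratio)  # relative period-3 power; compare against a threshold
```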
Abstract: This study investigated the sampling and analysis of
water quality in the water reservoirs and water towers installed in
two kinds of residential buildings and in school facilities. Water
quality data were collected for correlation analysis with the
frequency of sanitization of the water reservoirs, by questioning
building managers about the inspection charts recorded for the water
reservoir equipment. A statistical software package (SPSS) was
applied to the data of the two groups (cleaning frequency and water
quality) for regression analysis to determine the optimal frequency
of sanitization. The correlation coefficient (R) in this paper
represented the degree of correlation, with values of R ranging from
+1 to -1. After investigating three categories of drinking water
users, this study found that the frequency of sanitization of the
water reservoirs significantly influenced the quality of the drinking
water. A higher frequency of sanitization (more than four times per
year) implied a higher quality of drinking water. The results
indicated that water reservoirs and water towers should be sanitized
at least twice annually to achieve safe drinking water.
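The correlation coefficient R the study relies on can be computed from scratch as below; the cleaning-frequency and quality figures are made up for illustration, not the study's data:

```python
# Pearson correlation coefficient R (range +1 to -1), computed from scratch.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: cleaning frequency (times/year) vs. a water quality score.
frequency = [1, 2, 3, 4, 5, 6]
quality = [52, 60, 64, 71, 75, 82]
r = pearson_r(frequency, quality)
print(r)  # close to +1: more frequent sanitization, higher quality
```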
Abstract: This paper reports work done to improve the modeling of complex processes when only small experimental data sets are available. Neural networks are used to capture the nonlinear underlying phenomena contained in the data set and to partly eliminate the burden of having to specify completely the structure of the model. Two different types of neural networks were used for the pulping application. Three-layer feedforward neural networks, trained using Preconditioned Conjugate Gradient (PCG) methods, were used in this investigation. Preconditioning is a method to improve convergence by lowering the condition number and increasing the clustering of the eigenvalues. The idea is to solve the modified problem M^-1 Ax = M^-1 b, where M is a positive-definite preconditioner that is closely related to A. We mainly focused on Preconditioned Conjugate Gradient-based training methods which originated from optimization theory, namely Preconditioned Conjugate Gradient with Fletcher-Reeves Update (PCGF), Preconditioned Conjugate Gradient with Polak-Ribiere Update (PCGP) and Preconditioned Conjugate Gradient with Powell-Beale Restarts (PCGB). The behavior of the PCG methods in the simulations proved to be robust against phenomena such as oscillations due to large step sizes.
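The preconditioned system M^-1 Ax = M^-1 b can be illustrated with a minimal PCG solver using a Jacobi (diagonal) preconditioner; this is a generic sketch of the linear-algebra core, not the paper's PCGF/PCGP/PCGB training code:

```python
# Preconditioned conjugate gradient with a Jacobi preconditioner M = diag(A).
# A must be symmetric positive definite for CG to apply.

def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def pcg(A, b, tol=1e-10, max_iter=100):
    n = len(b)
    x = [0.0] * n
    r = b[:]                                   # residual r = b - A x (x = 0)
    m_inv = [1.0 / A[i][i] for i in range(n)]  # Jacobi preconditioner M^-1
    z = [mi * ri for mi, ri in zip(m_inv, r)]
    p = z[:]
    rz = dot(r, z)
    for _ in range(max_iter):
        Ap = matvec(A, p)
        alpha = rz / dot(p, Ap)
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, Ap)]
        if dot(r, r) < tol:
            break
        z = [mi * ri for mi, ri in zip(m_inv, r)]
        rz_new = dot(r, z)
        beta = rz_new / rz  # Fletcher-Reeves-style update of the search direction
        p = [zi + beta * pi for zi, pi in zip(z, p)]
        rz = rz_new
    return x

A = [[4.0, 1.0], [1.0, 3.0]]  # small SPD test matrix
b = [1.0, 2.0]
x = pcg(A, b)
print(x)  # satisfies A x ≈ b
```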
Abstract: In single trial analysis, when using Principal
Component Analysis (PCA) to extract Visual Evoked Potential
(VEP) signals, the selection of principal components (PCs) is an
important issue. We propose a new method here that selects only
the appropriate PCs. We denote the method as selective eigen-rate
(SER). In the method, the VEP is reconstructed based on the rate
of the eigenvalues of the PCs. When this technique is applied to
emulated VEP signals with added background electroencephalogram
(EEG), with a focus on extracting the evoked P3 parameter, it is
found to be feasible. The improvement
in signal to noise ratio (SNR) is superior to two other existing
methods of PC selection: Kaiser (KSR) and Residual Power (RP).
Though another PC selection method, Spectral Power Ratio (SPR),
gives a comparable SNR with high noise factors (i.e., strong
background EEG), SER gives more impressive results in such cases.
Next, we applied the SER
method to real VEP signals to analyse the P3 responses for
matched and non-matched stimuli. The P3 parameters extracted
through our proposed SER method showed a higher P3 response for the
matched stimulus, which conforms to existing neuroscience
knowledge. Single-trial PCA using the KSR and RP methods failed to
indicate any difference for the stimuli.
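The abstract says SER reconstructs the VEP from the rate of the PCs' eigenvalues; the exact criterion is not given, so the threshold rule and eigenvalue spectrum below are assumptions for illustration only:

```python
# Hypothetical sketch of selecting principal components by eigenvalue rate:
# keep the PCs whose share of the total variance exceeds a threshold.

def select_by_eigen_rate(eigenvalues, rate_threshold=0.1):
    """Return indices of PCs whose share of total variance exceeds threshold."""
    total = sum(eigenvalues)
    return [i for i, ev in enumerate(eigenvalues) if ev / total > rate_threshold]

# Made-up eigenvalue spectrum (sorted descending), e.g. from a PCA of epochs.
eigs = [5.2, 2.9, 1.1, 0.4, 0.2, 0.2]
selected = select_by_eigen_rate(eigs)
print(selected)  # indices of PCs carrying more than 10% of the total variance
```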
Abstract: Chronic conditions carry with them strong emotions
and often lead to charged relationships between patients and their
health providers and, by extension, patients and health researchers.
Persons are both autonomous and relational and a purely cognitive
model of autonomy neglects the social and relational basis of chronic
illness. Ensuring genuine informed consent in research requires a
thorough understanding of how participants perceive a study and
their reasons for participation. Surveys may not capture the
complexities of reasoning that underlies study participation.
Contradictory reasons for participation, for instance an initial claim
of altruism as rationale and a subsequent claim of personal benefit
(therapeutic misconception), affect the quality of informed consent.
Individuals apply principles through the filter of personal values and
lived experience. Authentic autonomy, and hence authentic consent
to research, occurs within the context of patients' unique life
narratives and illness experiences.
Abstract: In Korea, the technology for load follow operation of a nuclear power plant has been under development. An automatic controller able to control temperature and axial power distribution was developed, consisting of an identification algorithm and a model predictive controller. The former transforms the nuclear reactor status into model parameters numerically, and the latter uses them to generate manipulated values such as two kinds of control rod positions. With this automatic controller, the performance of the controlled operation was evaluated. As a result, the automatic controller generated the model parameters of the nuclear reactor and kept the nuclear reactor average temperature and axial power distribution at the desired targets during a daily load follow.
Abstract: Based on a non-linear single track model which
describes the dynamics of a vehicle, an optimal path planning
strategy is developed. Real-time optimization is used to generate
reference control values that allow leading the vehicle along a
calculated lane which is optimal for different objectives such as
energy consumption, run time, safety or comfort characteristics. A
strict mathematical formulation of autonomous driving allows taking
decisions in undefined situations such as a lane change or obstacle
avoidance. Based on the position of the vehicle, the lane situation
and the obstacle position, the optimization problem is reformulated
in real time to avoid the obstacle and any car crash.
Abstract: In the globalization process, when the struggle for the minds and values of people is taking place, the impact of virtual space can cause unexpected effects and consequences in the process of young people's adjustment to this world. Its special significance is defined by an unconscious influence on the underlying process of meaning-making, and therefore the values preached in virtual space are much more effective and affect both personal characteristics and the peculiarities of the adjustment process. Related to this, the challenge is to identify the factors influencing the reflection characteristics of virtual subjects and to measure their impact on the personal characteristics of students.
Abstract: In this paper, the details of an experimental method to measure the clamping force at bolted connections due to the application of wrenching torque to tighten the nut are presented. A simplified bolted joint consisting of a holed plate with a single bolt was considered for carrying out the experiments. The method was designed based on Hooke's law, by measuring the compressive axial strain of a steel bush placed between the nut and the plate. In the experimental procedure, the values of the clamping force were calculated for seven different levels of applied torque, and this process was repeated three times for each torque level. Moreover, the effect of lubrication of the threads on the clamping force was studied using the same method. In both conditions (dry and lubricated threads), the relation between the torque and the clamping force has been displayed in graphs.
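The measurement principle, converting the bush strain into clamping force via Hooke's law, can be sketched as follows; the Young's modulus, bush dimensions, and strain reading are assumed values for illustration, not the paper's measurements:

```python
# Sketch of the Hooke's-law conversion: measured compressive axial strain of
# the steel bush -> clamping force. All numeric values below are assumptions.
import math

E_STEEL = 200e9                 # Young's modulus of steel, Pa (typical value)
d_outer, d_inner = 0.020, 0.012  # bush outer/inner diameters, m (assumed)
strain = 250e-6                  # measured compressive axial strain (assumed)

# Cross-sectional area of the annular bush.
area = math.pi / 4 * (d_outer**2 - d_inner**2)

# Hooke's law: stress = E * strain, so clamping force F = E * strain * A.
force = E_STEEL * strain * area
print(force)  # clamping force in newtons
```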
Abstract: An end-member selection method for spectral unmixing that is based on Particle Swarm Optimization (PSO) is developed in this paper. The algorithm uses the K-means clustering algorithm and a method of dynamic selection of end-members subsets to find the appropriate set of end-members for a given set of multispectral images. The proposed algorithm has been successfully applied to test image sets from various platforms such as LANDSAT 5 MSS and NOAA's AVHRR. The experimental results of the proposed algorithm are encouraging. The influence of different values of the algorithm control parameters on performance is studied. Furthermore, the performance of different versions of PSO is also investigated.
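The K-means clustering step the algorithm uses before the PSO-based end-member selection can be sketched in one dimension; real multispectral pixels are vectors, and the intensity values here are made up to keep the example self-contained:

```python
# Minimal 1-D K-means sketch: assign points to nearest center, recompute
# centers as cluster means, repeat.
import random

def kmeans_1d(data, k, iters=20, seed=0):
    rng = random.Random(seed)
    centers = rng.sample(data, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for x in data:
            nearest = min(range(k), key=lambda c: abs(x - centers[c]))
            clusters[nearest].append(x)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

# Two well-separated intensity groups; centers converge near 1 and 10.
pixels = [0.9, 1.0, 1.1, 9.8, 10.0, 10.2]
centers = kmeans_1d(pixels, 2)
print(centers)
```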
Abstract: This paper considers the effect of heat generation
proportional to (T - T∞)^p, where T is the local temperature and T∞
is the ambient temperature, in unsteady free convection flow near the
stagnation point region of a three-dimensional body. The body is
immersed in an ambient fluid under the assumption of a step change
in the surface temperature of the body. The non-linear coupled partial
differential equations governing the free convection flow are solved
numerically using an implicit finite-difference method for different
values of the governing parameters entering these equations. The
results for the flow and heat characteristics when p ≤ 2 show that
the transition from the initial unsteady-state flow to the final
steady-state flow takes place smoothly. The behavior of the flow is
seen to depend strongly on the exponent p.
Abstract: In this paper, the dynamics of a system of two van der Pol oscillators with delayed position and velocity is studied. We provide an approximate solution for this system using the parameter-expansion method. Also, we obtain approximate values for the frequencies of the system. The parameter-expansion method is more efficient than the perturbation method for this system because it is independent of the perturbation parameter assumption.
Abstract: Concrete strength evaluated from compression tests
on cores is affected by several factors causing differences from the
in-situ strength at the location from which the core specimen was
extracted. Among the factors, there is the damage possibly occurring
during the drilling phase that generally leads to underestimate the
actual in-situ strength. In order to quantify this effect, in this study
two wide datasets have been examined, including: (i) about 500 core
specimens extracted from Reinforced Concrete existing structures,
and (ii) about 600 cube specimens taken during the construction of
new structures in the framework of routine acceptance control. The
two experimental datasets have been compared in terms of
compression strength and specific weight values, accounting for the
main factors affecting the concrete properties, that is, type and amount of
cement, aggregates' grading, type and maximum size of aggregates,
water/cement ratio, placing and curing modality, concrete age. The
results show that the magnitude of the strength reduction due to
drilling damage is strongly affected by the actual properties of
concrete, being inversely proportional to its strength. Therefore, the
application of a single value of the correction coefficient, as generally
suggested in the technical literature and in structural codes, appears
inappropriate. A set of values of the drilling damage coefficient is
suggested as a function of the strength obtained from compressive
tests on cores.
Abstract: Wood pyrolysis of Casuarina glauca, Casuarina cunninghamiana, Eucalyptus camaldulensis and Eucalyptus microtheca was carried out at 450°C with a heating rate of 2.5°C/min in a flowing N2 atmosphere. The Eucalyptus genus wood gave higher values of specific gravity, ash, total extractives, lignin, N2-liquid trap distillate (NLTD) and water trap distillate (WSP) than those for the Casuarina genus. The GHC of the NLTD was higher for the Casuarina genus than for the Eucalyptus genus, with the highest value for Casuarina cunninghamiana. Guaiacol, 4-ethyl-2-methoxyphenol and syringol were observed in the NLTD of all four wood species, reflecting their parent hardwood lignin origin. Eucalyptus camaldulensis wood had the highest lignin content (28.89%) and was pyrolyzed to the highest values of phenolics (73.01%), guaiacol (11.2%) and syringol (32.28%) contents in the methylene chloride fraction (MCF) of the NLTD. Accordingly, recoveries of syringol and guaiacol from Eucalyptus camaldulensis may become economically attractive.
Abstract: The aim of this article is to explain how features of attacks can be extracted from packets. It also explains how vectors can be built and then applied to the input of any analysis stage. For the analysis, the work deploys a feedforward back-propagation neural network to act as a misuse intrusion detection system. It uses ten types of attacks as examples for training and testing the neural network. It explains how the packets are analyzed to extract features. The work shows how selecting the right features, building correct vectors, and correctly identifying the training method and the number of nodes in the hidden layer of a neural network affect the accuracy of the system. In addition, the work shows how to obtain values of optimal weights and use them to initialize the Artificial Neural Network.
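The classification step of such a network can be sketched as a forward pass through one hidden layer; the weights and the 4-feature input vector below are made-up values for illustration, not trained intrusion-detection weights:

```python
# Minimal forward pass of a one-hidden-layer feedforward network of the kind
# used for misuse detection. All weights and features are assumed values.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(x, w_hidden, w_out):
    # Hidden layer: weighted sums passed through the sigmoid activation.
    hidden = [sigmoid(sum(wi * xi for wi, xi in zip(row, x))) for row in w_hidden]
    # Output node: a single score in (0, 1).
    return sigmoid(sum(wo * h for wo, h in zip(w_out, hidden)))

# Feature vector extracted from a packet (e.g. normalized counts; assumed).
x = [0.2, 0.8, 0.1, 0.5]
w_hidden = [[0.5, -0.4, 0.3, 0.1], [-0.2, 0.6, 0.1, -0.3]]  # 2 hidden nodes
w_out = [0.7, -0.5]
score = forward(x, w_hidden, w_out)
print(score)  # attack likelihood; threshold at e.g. 0.5 to flag an attack
```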
Abstract: In general, image-based 3D scenes can now be found in many popular vision systems, computer games and virtual reality tours, so it is important to segment the ROI (region of interest) from input scenes as a preprocessing step for geometric structure detection in a 3D scene. In this paper, we propose a method for segmenting the ROI based on tensor voting and a Dirichlet process mixture model. In particular, to estimate geometric structure information for a 3D scene from a single outdoor image, we apply tensor voting and the Dirichlet process mixture model to image segmentation. Tensor voting is used based on the fact that homogeneous regions in an image usually lie close together on a smooth region, and therefore the tokens corresponding to the centers of these regions have high saliency values. The proposed approach is a novel nonparametric Bayesian segmentation method using a Gaussian Dirichlet process mixture model to automatically segment various natural scenes. Finally, our method can label regions of the input image into coarse categories: "ground", "sky" and "vertical" for 3D applications. The experimental results show that our method successfully segments coarse regions in many complex natural scene images for 3D.