Abstract: This study has two aims: first, to compare the level of expertise in data analysis, communication and information technologies among undergraduate psychology students; second, to verify the factor structure of the E-ETICA (Escala de Experticia en Tecnologías de la Información, la Comunicación y el Análisis, or Data Analysis, Communication and Information Technologies Expertise Scale), which had previously shown excellent internal consistency (α = 0.92) as well as a simple factor structure. Three factors (Complex Information and Communications Technologies, Basic Information and Communications Technologies, and E-Searching and Download Abilities) explained 63% of the variance. In the present study, 260 students (119 juniors and 141 seniors) were asked to respond to the E-ETICA (a 16-item, five-point Likert scale ranging from 1: no mastery to 5: complete mastery). The results show that junior and senior students report very similar levels of expertise; however, the E-ETICA presents a different factor structure for juniors, with four factors also explaining 63% of the variance: Information E-Searching, Download and Processing; Data Analysis; Organization; and Communication Technologies.
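A minimal sketch of how such an exploratory factor analysis could be reproduced, using the factor_analyzer package (one common choice); the random responses below are placeholders for the students' actual answers:

    # Exploratory factor analysis sketch for a 16-item Likert scale (illustrative only).
    import numpy as np
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    rng = np.random.default_rng(0)
    items = pd.DataFrame(rng.integers(1, 6, size=(260, 16)),
                         columns=[f"item_{i+1}" for i in range(16)])

    fa = FactorAnalyzer(n_factors=4, rotation="varimax")
    fa.fit(items)

    loadings = pd.DataFrame(fa.loadings_, index=items.columns)
    _, _, cumulative = fa.get_factor_variance()
    print(loadings.round(2))
    print("cumulative variance explained:", cumulative.round(2))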
Abstract: The influence of lactulose and inulin on the rheological properties of fermented milk during storage was studied. Pasteurized milk, the freeze-dried starter culture Bb-12 (Bifidobacterium lactis, Chr. Hansen, Denmark), inulin RAFTILINE®HP (ORAFI, Belgium) and lactulose syrup (Duphalac®, the Netherlands) were used for the experiments. Fermentation was carried out at 37 °C for 16 hours, and the products were stored at 4 °C for 7 days. Measurements were carried out using standard Brookfield methods, and the flow curves were described by the Herschel-Bulkley model.
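For reference, the Herschel-Bulkley model relates shear stress to shear rate as tau = tau0 + K*(shear rate)^n. A small sketch of fitting it with SciPy, using placeholder flow-curve values rather than the study's measurements:

    # Fitting the Herschel-Bulkley model to a flow curve (illustrative data only).
    import numpy as np
    from scipy.optimize import curve_fit

    def herschel_bulkley(gamma_dot, tau0, K, n):
        return tau0 + K * gamma_dot**n

    gamma_dot = np.array([1, 5, 10, 20, 50, 100.0])   # shear rate, 1/s (placeholder)
    tau = np.array([3.1, 4.0, 4.8, 6.0, 8.5, 11.2])   # shear stress, Pa (placeholder)

    params, _ = curve_fit(herschel_bulkley, gamma_dot, tau, p0=[1.0, 1.0, 0.5])
    tau0, K, n = params
    print(f"yield stress tau0={tau0:.2f} Pa, consistency K={K:.2f} Pa*s^n, flow index n={n:.2f}")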
The results of the analysis of variance showed that both the concentration of prebiotics (p=0.04
Abstract: Purpose: To explore the use of the Curvelet transform to extract texture features of pulmonary nodules in CT images, and of a support vector machine to establish a prediction model for small solitary pulmonary nodules, in order to improve the detection and diagnosis rate of early-stage lung cancer. Methods: A total of 2,461 benign or malignant small solitary pulmonary nodules in CT images from 129 patients were collected. Fourteen Curvelet transform texture features were used as parameters to establish the support vector machine prediction model. Results: Compared with other methods, using 252 texture features as parameters to establish the prediction model was more appropriate. The classification consistency, sensitivity and specificity of the model were 81.5%, 93.8% and 38.0%, respectively. Conclusion: Based on texture features extracted by the Curvelet transform, the support vector machine prediction model is sensitive to lung cancer and can improve the diagnosis rate of early-stage lung cancer to some extent.
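A minimal sketch of the classification step, assuming the Curvelet texture features have already been extracted; the feature matrix, labels and scikit-learn settings below are placeholders, not the study's data or tuned model:

    # SVM on precomputed texture features, reporting sensitivity and specificity.
    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.preprocessing import StandardScaler
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC
    from sklearn.metrics import confusion_matrix

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 252))        # placeholder texture features per nodule
    y = rng.integers(0, 2, size=500)       # placeholder labels: 0 = benign, 1 = malignant

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
    model.fit(X_tr, y_tr)

    tn, fp, fn, tp = confusion_matrix(y_te, model.predict(X_te)).ravel()
    print("sensitivity:", tp / (tp + fn), "specificity:", tn / (tn + fp))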
Abstract: This paper presents the results of an investigation of UV measurement at different altitudes and the development of a new portable instrument for measuring UV. The rapid growth of industrial sectors in developing countries, including Malaysia, brings not only income to the nation but also pollution in various forms. Air pollution is one of the significant contributors to global warming through depletion of the ozone layer, which reduces the filtration of UV rays. Long exposure to high levels of UV rays has many devastating health effects on mankind, directly or indirectly, through the destruction of natural resources. This study aimed to show the correlation between UV and altitude, which can indirectly help predict ozone depletion. An instrument was designed to measure and monitor the level of UV. The instrument comprises two main blocks, namely a data logger and a graphical user interface (GUI). Three sensors were used in the data logger to detect changes in temperature, humidity and ultraviolet radiation. The system underwent experimental measurements to capture data under two different conditions: an industrial area and a high-altitude area. The instrument showed consistency in the captured data, and the experimental results showed significantly higher UV readings at high altitudes.
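A small sketch of the kind of UV-versus-altitude check the study describes, a Pearson correlation over logged readings; the arrays are hypothetical placeholders for data captured by the instrument's data logger:

    # Correlation between altitude and logged UV readings (placeholder values).
    import numpy as np
    from scipy.stats import pearsonr

    altitude_m = np.array([5, 5, 300, 300, 1500, 1500, 1800])    # measurement sites
    uv_index = np.array([4.1, 4.3, 5.0, 5.2, 7.8, 8.1, 8.6])     # corresponding UV readings

    r, p = pearsonr(altitude_m, uv_index)
    print(f"Pearson correlation between altitude and UV: r={r:.2f}, p={p:.3g}")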
Abstract: Clustering is the process of identifying homogeneous groups of objects, called clusters, and is one of the interesting topics in data mining; objects within a group or class share similar characteristics. This paper discusses a robust clustering process for image data with two dimension reduction approaches: two-dimensional principal component analysis (2DPCA) and principal component analysis (PCA). A standard approach to the high dimensionality of image data is dimension reduction, which transforms high-dimensional data into a lower-dimensional space with limited loss of information; one of the most common forms of dimensionality reduction is principal component analysis (PCA). 2DPCA is often regarded as a variant of PCA in which the image matrices are treated directly as 2D matrices; they do not need to be transformed into vectors, so the image covariance matrix can be constructed directly from the original image matrices. The classical covariance matrix that is decomposed, however, is very sensitive to outlying observations. The objective of this paper is to compare the performance of the robust minimum vector variance (MVV) estimator in the two-dimensional projection (2DPCA) and in PCA for clustering arbitrary image data when outliers are hidden in the data set. Simulation studies of robustness and an illustration of image clustering are discussed at the end of the paper.
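A minimal sketch contrasting the two projections discussed above, using the classical (non-robust) estimators on synthetic images; the robust MVV estimator is not reproduced here:

    # PCA vs 2DPCA feature extraction on image matrices (synthetic data, illustrative only).
    import numpy as np

    rng = np.random.default_rng(0)
    images = rng.normal(size=(100, 32, 32))            # 100 images of 32 x 32 pixels

    # PCA: flatten each image into a vector and project onto leading eigenvectors.
    X = images.reshape(100, -1)
    Xc = X - X.mean(axis=0)
    cov = Xc.T @ Xc / len(X)                           # 1024 x 1024 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)
    pca_features = Xc @ eigvecs[:, -10:]               # keep 10 components

    # 2DPCA: keep images as matrices; the image covariance matrix is only 32 x 32.
    mean_img = images.mean(axis=0)
    G = sum((A - mean_img).T @ (A - mean_img) for A in images) / len(images)
    vals, vecs = np.linalg.eigh(G)
    twodpca_features = np.array([A @ vecs[:, -4:] for A in images])   # 32 x 4 per image

    print(pca_features.shape, twodpca_features.shape)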
Abstract: Starting from the basic pillars of supportability analysis, this paper examines its characteristics in an LCI (Life Cycle Integration) environment. The research methodology consists of a review of the modern logistics engineering literature, with the objective of collecting and synthesizing knowledge relating to standards of supportability design in an e-logistics environment. The results show that the LCI framework has properties that are fully compatible with the requirement of simultaneous logistics support and product-service bundle design. The proposed approach contributes to a more comprehensive and efficient supportability design process. Contributions are also reflected in greater consistency of the collected data, automated creation of reports suitable for different analyses, and the possibility of customizing them according to customer needs. A further convenience of this approach is its practical use in real time. In a broader sense, LCI allows the integration of enterprises on a worldwide basis, facilitating electronic business.
Abstract: This article combines two techniques, data envelopment analysis (DEA) and factor analysis (FA), for data reduction in decision-making units (DMUs). DEA, a popular linear programming technique, is useful for comparatively rating the operational efficiency of DMUs based on their deterministic (not necessarily stochastic) input-output data. Factor analysis has been proposed as a data reduction and classification technique, and it can be applied within DEA to reduce the input-output data. Numerical results reveal that the new approach shows good consistency in ranking with DEA.
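For reference, the linear program behind DEA can be sketched as the input-oriented CCR multiplier model, solved here with SciPy for a tiny hypothetical set of DMUs (not the article's data):

    # Input-oriented CCR (multiplier form) DEA efficiency score for each DMU.
    import numpy as np
    from scipy.optimize import linprog

    X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs:  DMU x m (placeholder)
    Y = np.array([[5.0],       [4.0],      [6.0]])        # outputs: DMU x s (placeholder)

    def ccr_efficiency(o):
        n_dmu, m = X.shape
        s = Y.shape[1]
        # Decision variables: output weights u (s entries) followed by input weights v (m entries).
        c = np.concatenate([-Y[o], np.zeros(m)])                  # maximize u'y_o
        A_ub = np.hstack([Y, -X])                                  # u'y_j - v'x_j <= 0 for all j
        b_ub = np.zeros(n_dmu)
        A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # normalize v'x_o = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
        return -res.fun

    for o in range(len(X)):
        print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")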
Abstract: The necessity of the ever-increasing use of distributed data in computer networks is obvious to all. One technique applied to distributed data to increase efficiency and reliability is data replication. In this paper, after introducing this technique and its advantages, we examine several dynamic data replication methods. We examine their characteristics under a number of usage scenarios and then propose some suggestions for their improvement.
Abstract: Many multimedia communication applications require a source to transmit messages to multiple destinations subject to a quality-of-service (QoS) delay constraint. To support delay-constrained multicast communications, computer networks need to guarantee an upper bound on the end-to-end delay from the source node to each of the destination nodes. This is known as the multicast delay problem. On the other hand, if the same message fails to arrive at each destination node at the same time, inconsistency and unfairness problems may arise among users. This is related to the multicast delay-variation problem. The problem of finding a minimum-cost multicast tree with delay and delay-variation constraints has been proven to be NP-complete. In this paper, we propose an efficient heuristic algorithm, the Economic Delay and Delay-Variation Bounded Multicast (EDVBM) algorithm, based on a novel heuristic function, to construct an economic delay- and delay-variation-bounded multicast tree. A noteworthy feature of this algorithm is that it has a very high probability of finding the optimal solution in polynomial time with low computational complexity.
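The EDVBM heuristic itself is not reproduced here; the sketch below only illustrates the two constraints such a multicast tree must satisfy, an end-to-end delay bound and a bound on the spread of delays across destinations, on a hypothetical graph (using networkx):

    # Checking delay and delay-variation bounds for a source and destination set.
    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from(
        [("s", "a", 2), ("a", "d1", 3), ("a", "d2", 4), ("s", "d3", 7)], weight="delay"
    )

    source, destinations = "s", ["d1", "d2", "d3"]
    delay_bound, variation_bound = 10, 3

    delays = {d: nx.shortest_path_length(G, source, d, weight="delay") for d in destinations}
    meets_delay = all(v <= delay_bound for v in delays.values())
    meets_variation = max(delays.values()) - min(delays.values()) <= variation_bound
    print(delays, meets_delay, meets_variation)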
Abstract: Markov games can be used effectively to design controllers for nonlinear systems. This paper presents two novel controller design algorithms that incorporate ideas from the game-theory literature to address safety and consistency issues of the 'learned' control strategy. A more widely used approach for controller design is H∞ optimal control, which suffers from high computational demand and may at times be infeasible. We generate an optimal control policy for the agent (controller) via a simple linear program, enabling the controller to learn about the unknown environment; in our formulation, this unknown environment corresponds to the behavior rules of the noise, modeled as the opponent. The proposed approaches aim to achieve 'safe-consistent' and 'safe-universally consistent' controller behavior by hybridizing the 'min-max', 'fictitious play' and 'cautious fictitious play' approaches drawn from game theory. We empirically evaluate the approaches on a simulated inverted pendulum swing-up task and compare their performance against standard Q-learning.
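As a generic reference point for the linear program mentioned above, the min-max (security) strategy of a zero-sum matrix game can be computed as follows; this is the standard construction, not the paper's specific controller formulation, and the payoff matrix is hypothetical:

    # Min-max strategy and value of a zero-sum matrix game via linear programming.
    import numpy as np
    from scipy.optimize import linprog

    A = np.array([[1.0, -1.0],
                  [-2.0, 3.0]])     # payoff to the row player (controller), illustrative

    m, n = A.shape
    # Variables: mixed strategy x (m entries) and game value v; minimize -v.
    c = np.concatenate([np.zeros(m), [-1.0]])
    A_ub = np.hstack([-A.T, np.ones((n, 1))])            # v <= sum_i x_i * A[i, j] for every column j
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.ones(m), [0.0]]).reshape(1, -1)   # probabilities sum to 1
    bounds = [(0, None)] * m + [(None, None)]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0], bounds=bounds)
    x, v = res.x[:m], -res.fun
    print("min-max strategy:", x.round(3), "game value:", round(v, 3))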
Abstract: Nejad and Mashinchi (2011) proposed a revision for ranking fuzzy numbers based on the areas of the left and the right sides of a fuzzy number. However, this method still has some shortcomings, such as a lack of discriminative power in ranking similar fuzzy numbers and no guarantee of consistency between the ranking of fuzzy numbers and the ranking of their images. To overcome these drawbacks, we propose an epsilon-deviation degree method based on the left area and the right area of a fuzzy number and on the concept of the centroid point. The main advantage of the new approach is the development of an innovative index value which can be used to consistently evaluate and rank fuzzy numbers. Numerical examples are presented to illustrate the efficiency and superiority of the proposed method.
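The epsilon-deviation degree index itself is not reproduced here; the sketch below only computes the basic ingredients such area-based rankings build on (left area, right area and centroid abscissa) for triangular fuzzy numbers, with illustrative values:

    # Left/right areas and centroid of a triangular fuzzy number (a, b, c); illustrative only.
    def left_right_areas(a, b, c):
        # Integrating the inverse membership sides over alpha in [0, 1]:
        # the increasing (left) side gives (a + b) / 2, the decreasing (right) side (b + c) / 2.
        return (a + b) / 2.0, (b + c) / 2.0

    def centroid(a, b, c):
        return (a + b + c) / 3.0   # centroid abscissa of a triangular fuzzy number

    for tri in [(1, 3, 5), (2, 3, 4)]:
        sl, sr = left_right_areas(*tri)
        print(tri, "left area:", sl, "right area:", sr, "centroid:", round(centroid(*tri), 3))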
Abstract: The setting agent Ca(OH)2, used for the activation of slag cement, is added in proportions of 0%, 2%, 4%, 6%, 8% and 10% by different methods (substitution and addition by mass of slag cement). The physical properties of the slag cement activated by calcium hydroxide in the anhydrous and hydrated states (fineness, particle size distribution, consistency of the cement pastes and setting times) were studied. Activation of the slag cement (a latent hydraulic material) by the mineral activator accelerates the hydration process and reduces the setting times of the activated cement.
Abstract: This study deals with a multi-criteria optimization problem that has been transformed into a single-objective optimization problem using Response Surface Methodology (RSM), Artificial Neural Network (ANN) and Grey Relational Analysis (GRA) approaches. Grey-RSM and Grey-ANN are hybrid techniques which can be used for solving multi-criteria optimization problems. This research has two main purposes:
1. To determine optimum and robust fiber dyeing process conditions by using RSM and ANN based on GRA;
2. To obtain the most suitable model by comparing models developed with different methodologies.
The design variables for the textile fiber dyeing process are temperature, time, softener, anti-static, material quantity, pH, retarder, and dispergator. The quality characteristics to be evaluated are nominal color consistency of the fiber, maximum fiber strength, and minimum color of the dyeing solution. The GRA-RSM with exact level values, GRA-RSM with interval level values and GRA-ANN models were compared with each other based on the GRA output value and the mean square error (MSE) of the outputs. As a result, the GRA-ANN with interval values model appears suitable for reducing the variation of the dyeing process in terms of the model's GRA output value.
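A generic sketch of the Grey Relational Analysis step that both hybrid models rely on: responses are normalized, grey relational coefficients are computed, and their mean gives the grey relational grade per experiment. The data and response directions below are hypothetical, not the dyeing experiments:

    # Grey Relational Analysis: normalization, grey relational coefficients and grade.
    import numpy as np

    responses = np.array([[0.82, 31.0], [0.90, 28.5], [0.75, 33.2]])   # rows: experiments
    directions = ["larger_better", "smaller_better"]                   # per response column

    norm = np.empty_like(responses)
    for j, d in enumerate(directions):
        col = responses[:, j]
        if d == "larger_better":
            norm[:, j] = (col - col.min()) / (col.max() - col.min())
        else:
            norm[:, j] = (col.max() - col) / (col.max() - col.min())

    delta = 1.0 - norm                      # deviation from the ideal (normalized) sequence
    zeta = 0.5                              # distinguishing coefficient
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    grade = coeff.mean(axis=1)              # grey relational grade per experiment
    print(grade.round(3))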
Abstract: This paper presents Qmulus, a cloud-based GPS model. Qmulus is designed to compute the best possible route that would lead the driver to the specified destination in the shortest time while taking real-time constraints into account. The intelligence incorporated into Qmulus's design makes it capable of generating and assigning priorities to a list of optimal routes through customizable dynamic updates. The goal of this design is to minimize travel and cost overheads, maintain reliability and consistency, and provide scalability and flexibility. The proposed model focuses on narrowing the gap between a client application and a cloud service so as to deliver seamless operation. Qmulus's system model is closely integrated, and its concept has the potential to be extended into several other integrated applications, making it capable of adapting to different media and resources.
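Qmulus's cloud architecture is not reproduced here; the sketch below only shows the generic shortest-travel-time routing step such a model performs, on a hypothetical road graph with travel times in minutes (which a real system would update dynamically):

    # Shortest-travel-time routing on a small hypothetical road graph.
    import networkx as nx

    roads = nx.DiGraph()
    roads.add_weighted_edges_from(
        [("A", "B", 4), ("B", "D", 6), ("A", "C", 2), ("C", "D", 9), ("B", "C", 1)],
        weight="travel_time",
    )

    route = nx.shortest_path(roads, "A", "D", weight="travel_time")
    eta = nx.shortest_path_length(roads, "A", "D", weight="travel_time")
    print("route:", route, "ETA (min):", eta)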
Abstract: This paper seeks to give a general idea of the universe of project portfolio management, from its multidisciplinary nature to the many challenges it raises, passing through the different techniques, models and tools used to solve the multiple known problems. It is intended to contribute to an in-depth clarification of the impacts and relationships involved in managing the project portfolio. It aims to propose a technique for aligning projects with the organisational strategy, in order to select projects that will later be considered in the analysis and selection of the portfolio. We consider the development of a methodology for assessing the project alignment index to be highly relevant in the global market scenario. It can help organisations gain greater awareness of market dynamics, speed up the decision process and increase its consistency, thus enabling strategic alignment and improving organisational performance.
Abstract: Caching has been suggested as a solution for reducing bandwidth utilization and minimizing query latency in mobile environments. Over the years, different caching approaches have been proposed, some relying on the server to periodically broadcast reports about updated data, while others let clients request data whenever needed. More recently, a hybrid cache consistency scheme, the Scalable Asynchronous Cache Consistency Scheme (SACCS), was proposed; it combines the benefits of the two approaches and has proved to be more efficient and scalable. Nevertheless, caching has its limitations, due to limited cache size and bandwidth, which makes the cache replacement strategy an important aspect of improving cache consistency algorithms. In this paper, we propose a new cache replacement strategy, the Least Unified Value (LUV) strategy, to replace the Least Recently Used (LRU) strategy that SACCS was based on. The paper studies the advantages and drawbacks of the newly proposed strategy, comparing it with different categories of cache replacement strategies.
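For concreteness, the LRU baseline that LUV is proposed to replace can be sketched as follows; the LUV scoring function itself is not reproduced here:

    # Minimal LRU cache using collections.OrderedDict (baseline, illustrative only).
    from collections import OrderedDict

    class LRUCache:
        def __init__(self, capacity):
            self.capacity = capacity
            self.items = OrderedDict()

        def get(self, key):
            if key not in self.items:
                return None
            self.items.move_to_end(key)          # mark as most recently used
            return self.items[key]

        def put(self, key, value):
            if key in self.items:
                self.items.move_to_end(key)
            self.items[key] = value
            if len(self.items) > self.capacity:
                self.items.popitem(last=False)   # evict the least recently used entry

    cache = LRUCache(2)
    cache.put("a", 1); cache.put("b", 2); cache.get("a"); cache.put("c", 3)
    print(list(cache.items))                     # 'b' was evicted, not 'a'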
Abstract: Repeated observation of a given area over time yields potential for many forms of change detection analysis. These repeated observations are confounded in terms of radiometric consistency by changes in sensor calibration over time, differences in illumination and observation angles, and variation in atmospheric effects.

This paper demonstrates the applicability of an empirical relative radiometric normalization method to a set of multitemporal cloudy images acquired by the Resourcesat-1 LISS-III sensor. The objective of this study is to detect and remove cloud cover and to normalize the images radiometrically. Cloud detection is achieved using the Average Brightness Threshold (ABT) algorithm. The detected cloud is removed and replaced with data from other images of the same area. After cloud removal, the proposed normalization method is applied to reduce the radiometric influence caused by non-surface factors. This process identifies landscape elements whose reflectance values are nearly constant over time; that is, the subset of non-changing pixels is identified using a frequency-based correlation technique. The quality of the radiometric normalization is statistically assessed by the R² value and the mean square error (MSE) between each pair of analogous bands.
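A small sketch of the relative radiometric normalization step: a linear gain and offset are fitted between the pseudo-invariant (non-changing) pixels of a subject-date band and the reference-date band, and the fit is assessed by R² and MSE. The pixel values are synthetic placeholders:

    # Linear relative radiometric normalization over pseudo-invariant pixels.
    import numpy as np
    from scipy.stats import linregress

    reference = np.array([52, 60, 75, 81, 95, 110, 120], dtype=float)   # reference-date DNs
    subject = np.array([45, 50, 66, 70, 84, 97, 105], dtype=float)      # subject-date DNs

    fit = linregress(subject, reference)
    normalized = fit.slope * subject + fit.intercept

    r_squared = fit.rvalue ** 2
    mse = np.mean((normalized - reference) ** 2)
    print(f"gain={fit.slope:.3f}, offset={fit.intercept:.2f}, R^2={r_squared:.3f}, MSE={mse:.2f}")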
Abstract: This paper presents an extensive verification of a previously published extraction method that consistently accounts for self-heating and the Early effect in order to accurately extract both the base and thermal resistances of bipolar junction transistors. The verification is demonstrated on advanced RF SiGe HBTs, where the extracted thermal resistances are compared with those obtained from another published method that ignores the influence of the Early effect on the internal base-emitter voltage, and the extracted base resistances are compared with those determined from noise measurements. A self-consistency check of our method on the extracted base and thermal resistances, using compact-model simulation results, is also carried out in order to study the level of accuracy of the method.
Abstract: A new numerical scheme based on the H1-Galerkin mixed finite element method is constructed for a class of second-order pseudo-hyperbolic equations. The proposed procedure can be split into three independent differential sub-schemes and does not need to solve a coupled system of equations. Optimal error estimates are derived for both semidiscrete and fully discrete schemes for problems in one space dimension. Moreover, the proposed method does not require the LBB consistency condition. Finally, some numerical results are provided to illustrate the efficacy of our method.
Abstract: This paper gives a general discussion of memory consistency models such as strict consistency, sequential consistency, processor consistency, weak consistency, and so on. Techniques for implementing distributed shared memory systems and synchronization primitives in software distributed shared memory systems are then discussed. The analysis involves performance measurement of the protocol concerned, namely the multiple-writer protocol. Each protocol has its pros and cons, so the problems associated with each protocol are discussed and other related issues are explored.
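A small illustration of what sequential consistency rules out, using the classic two-thread store-buffering example: every interleaving that respects each thread's program order is enumerated, and the outcome r1 = r2 = 0 never appears, whereas weaker models (e.g. processor or weak consistency) may allow it. The snippet is a simulation, not a real concurrent program:

    # Enumerate sequentially consistent interleavings of two two-instruction threads.
    from itertools import permutations

    thread1 = [("write", "x", 1), ("read", "y", "r1")]
    thread2 = [("write", "y", 1), ("read", "x", "r2")]

    def run(order):
        mem, regs = {"x": 0, "y": 0}, {}
        for op, var, arg in order:
            if op == "write":
                mem[var] = arg
            else:
                regs[arg] = mem[var]
        return regs["r1"], regs["r2"]

    # Keep only interleavings that preserve each thread's program order.
    outcomes = set()
    for perm in permutations(thread1 + thread2):
        if perm.index(thread1[0]) < perm.index(thread1[1]) and \
           perm.index(thread2[0]) < perm.index(thread2[1]):
            outcomes.add(run(perm))
    print(sorted(outcomes))        # (0, 0) is absent under sequential consistency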