Abstract: This work explores the factors that influence the reading comprehension process for different types of texts. In a recent study with 2nd, 3rd, and 4th grade children, reading comprehension of narrative texts was observed to be better than comprehension of expository texts. Nevertheless, it seems that not only the type of text but also other textual factors account for comprehension, depending on the cognitive processing demands posed by the text. To explore this assumption, three narrative and three expository texts were prepared with different degrees of complexity. A group of 40 fourth grade Spanish-speaking children took part in the study. The children were asked to read the texts and answer orally three literal and three inferential questions for each text. Quantitative and qualitative analysis of the children's responses showed that they had difficulties with both narrative and expository texts. The difficulty lay in answering questions that involved establishing complex relationships among information units that were present in the text or that had to be activated from the children's previous knowledge to make an inference. Based on the data analysis, it can be concluded that there is some interaction between the type of text and the cognitive processing load of a specific text.
Abstract: The growing outsourcing of logistics services,
driven by the ongoing push in firms for cost reduction and
increased efficiency, means that it is becoming more and
more important for the companies doing the outsourcing to carry out
a proper evaluation.
The multiple definitions and measures of logistics service
performance found in research on the topic create a certain degree of
confusion and do not point the way towards the proper measurement
of this performance. Do a model and a specific set of indicators exist
that can be considered appropriate for measuring the performance of
logistics services outsourcing in industrial environments? Are said
indicators in keeping with the objectives pursued by outsourcing? We
aim to answer these and other research questions in the study we have
initiated in the field within the framework of the international High
Performance Manufacturing (HPM) project of which this paper
forms part.
As the first stage of this research, this paper reviews articles
dealing with the topic published in the last 15 years with the aim of
detecting the models most used to make this measurement and
determining which performance indicators are proposed as part of
said models and which are most used. The first steps are also taken in
determining whether these indicators, financial and operational, cover
the aims that are being pursued when outsourcing logistics services.
The findings show there is a wide variety of both models and
indicators used. This would seem to testify to the need to continue
with our research in order to try to propose a model and a set of
indicators for measuring the performance of logistics services
outsourcing in industrial environments.
Abstract: To investigate the behavior of sheet metals during
forming, tailor welded blanks (TWBs) of various thicknesses made via
CO2 laser welding are considered. These blanks are formed
using two different methods: rubber pad forming and the
conventional punch-and-die method. The main research objective is
to determine the effect of using a rubber die instead of a solid one on
the displacement of the weld line and the press force needed for forming.
Specimens with thicknesses of 0.5, 0.6, 0.8 and 1 mm are subjected to
Erichsen two-dimensional tests and the resulting forces for each case
are compared. This is followed by a theoretical and numerical study
of press force and weld line displacement. It is concluded that using
rubber pad forming (RPF) reduces weld line
displacement and increases the press force.
Abstract: A major challenge in biomaterials research is the
regulation of protein adsorption which is a key factor for controlling
the subsequent cell adhesion at implant surfaces. The aim of the
present study was to control the adsorption of fibronectin (FN) and
the attachment of MG-63 osteoblasts with an electronic
nanostructure. Shallow doping line lattices with a period of 260 nm
were produced for this purpose by implantation of phosphorus in
silicon wafers. Protein coverage was determined after incubating the
substrate with FN by means of an immunostaining procedure and the
measurement of the fluorescence intensity with a TECAN analyzer.
We observed an increased amount of adsorbed FN on the
nanostructure compared to control substrates. MG-63 osteoblasts
were cultivated for 24h on FN-incubated substrates and their
morphology was assessed by SEM. Preferred orientation and
elongation of the cells in the direction of the doping lattice lines were
observed on FN-coated nanostructures.
Abstract: Most fingerprint recognition techniques are based on minutiae matching and have been well studied. However, this technology still suffers from problems associated with the handling of poor quality impressions. One problem besetting fingerprint matching is distortion. Distortion changes both geometric position and orientation, and leads to difficulties in establishing a match among multiple impressions acquired from the same fingertip. Marking all the minutiae accurately, as well as rejecting false minutiae, is another issue still under research. Our work combines several methods to build a minutia extractor and a minutia matcher, the combination of methods being drawn from a wide investigation of the research literature. The work also introduces some novel refinements: segmentation using morphological operations, improved thinning, false minutiae removal methods, minutia marking with special consideration of triple branch counting, minutia unification by decomposing a branch into three terminations, and matching in a unified x-y coordinate system after a two-step transformation.
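The minutia-marking step mentioned above is commonly implemented with the crossing-number method on a thinned, one-pixel-wide ridge skeleton: a ridge pixel whose crossing number is 1 is a termination, and one whose crossing number is 3 is a bifurcation (the triple branch case). The abstract does not give the authors' exact procedure, so the following is only a minimal sketch of that standard technique, assuming a binary 0/1 skeleton stored as a NumPy array:

```python
import numpy as np

def crossing_number(skel, r, c):
    # 8-neighbourhood of (r, c), traversed in circular order
    offs = [(-1,-1), (-1,0), (-1,1), (0,1), (1,1), (1,0), (1,-1), (0,-1)]
    p = [int(skel[r+dr, c+dc]) for dr, dc in offs]
    return sum(abs(p[i] - p[(i+1) % 8]) for i in range(8)) // 2

def mark_minutiae(skel):
    # skel: binary (0/1) one-pixel-wide ridge skeleton
    terminations, bifurcations = [], []
    for r in range(1, skel.shape[0] - 1):
        for c in range(1, skel.shape[1] - 1):
            if skel[r, c]:
                cn = crossing_number(skel, r, c)
                if cn == 1:
                    terminations.append((r, c))
                elif cn == 3:
                    bifurcations.append((r, c))
    return terminations, bifurcations

# Tiny illustrative skeleton: a T-shaped ridge
skel = np.zeros((5, 5), dtype=int)
skel[2, 1:4] = 1      # horizontal bar
skel[3, 2] = 1        # stem, creating a bifurcation at (2, 2)
term, bif = mark_minutiae(skel)
```

A bifurcation found this way is exactly what the unification step above decomposes into three terminations.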
Abstract: In the present work, the performance of particle
swarm optimization and the genetic algorithm is compared on a typical
geometry design problem. The design maximizes the heat transfer
rate from a given fin volume. The analysis presumes a linear
temperature distribution along the fin. The fin profile is generated using
B-spline curves and controlled by changing the control point
coordinates. An inverse method is applied to find the fin
geometry that yields the linear temperature distribution along the fin
corresponding to the optimum design. Population size, iteration
count, and time to convergence are used to measure efficiency.
Results show that particle swarm optimization is the more efficient
method for geometry optimization.
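The abstract does not spell out the PSO variant used, but the core velocity-and-position update that makes the comparison with a genetic algorithm meaningful can be sketched in a few lines. The sphere function below stands in for the much more involved fin heat-transfer objective, and all parameter values (inertia, acceleration coefficients, swarm size) are illustrative, not the paper's:

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=300, lo=-5.0, hi=5.0, seed=0):
    # Minimal particle swarm optimizer (minimization form).
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n_particles, dim))   # positions
    v = np.zeros_like(x)                          # velocities
    pbest = x.copy()                              # personal bests
    pbest_f = np.array([f(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()        # global best
    w, c1, c2 = 0.7, 1.5, 1.5                     # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# Illustrative run on the sphere function (not the fin problem of the paper)
best_x, best_f = pso(lambda p: float(np.sum(p**2)), dim=3)
```

Counting iterations and wall-clock time around such a loop is how the efficiency comparison in the abstract can be instrumented.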
Abstract: This paper presents an evaluation for a wavelet-based
digital watermarking technique used in estimating the quality of
video sequences transmitted over an Additive White Gaussian Noise
(AWGN) channel in terms of a classical objective metric, namely the
Peak Signal-to-Noise Ratio (PSNR), without the need for the original
video. In this method, a watermark is embedded into the Discrete
Wavelet Transform (DWT) domain of the original video frames
using a quantization method. The degradation of the extracted
watermark can be used to estimate the video quality in terms of
PSNR with good accuracy. We calculated PSNR for video frames
contaminated with AWGN and compared the values with those
estimated using the Watermarking-DWT based approach. It is found
that the calculated and estimated quality measures of the video
frames are highly correlated, suggesting that this method can provide
a good quality measure for video frames transmitted over an AWGN
channel without the need for the original video.
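For reference, the PSNR that serves as the ground-truth quality measure here is defined as 10·log10(MAX²/MSE). A minimal sketch of computing it for a frame degraded by AWGN follows; the frame contents and noise level are illustrative:

```python
import numpy as np

def psnr(original, degraded, max_val=255.0):
    # PSNR = 10*log10(MAX^2 / MSE); infinite when the frames are identical
    diff = original.astype(np.float64) - degraded.astype(np.float64)
    mse = np.mean(diff**2)
    return float('inf') if mse == 0 else 10.0 * np.log10(max_val**2 / mse)

rng = np.random.default_rng(1)
frame = rng.integers(0, 256, (64, 64)).astype(np.float64)   # toy "frame"
noisy = frame + rng.normal(0.0, 5.0, frame.shape)           # AWGN, sigma = 5
quality = psnr(frame, noisy)
```

The watermark-based method in the abstract estimates this same quantity without access to `frame`, which is what makes it usable at the receiver.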
Abstract: The decision to recruit manpower in an organization
requires clear identification of the criteria (attributes) that distinguish
successful from unsuccessful performance. The choice of appropriate
attributes or criteria in different levels of hierarchy in an organization
is a multi-criteria decision problem and therefore multi-criteria
decision making (MCDM) techniques can be used for prioritization
of such attributes. Analytic Hierarchy Process (AHP) is one such
technique that is widely used for deciding among the complex criteria
structure in different levels. In real applications, conventional AHP
still cannot reflect the human thinking style, as precise data
concerning human attributes are quite hard to extract. Fuzzy
logic offers a systematic basis for dealing with situations that are
ambiguous or not well defined. This study aims at defining a
methodology to improve the quality of prioritization of an
employee's performance measurement attributes under fuzziness. To
do so, a methodology based on the Extent Fuzzy Analytic Hierarchy
Process is proposed. Within the model, four main attributes, namely
Subject knowledge and achievements, Research aptitude, Personal
qualities and strengths, and Management skills, are defined along with
their sub-attributes. The two approaches, the conventional AHP and
the Extent Fuzzy Analytic Hierarchy Process,
have been compared on the same hierarchy structure and criteria set.
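Chang's extent analysis, on which the Extent Fuzzy AHP is based, derives crisp weights from a matrix of triangular fuzzy pairwise comparisons via fuzzy synthetic extents, degrees of possibility, and normalized minimum degrees. The sketch below follows that standard formulation; the comparison judgments themselves are made-up examples, not the paper's attribute data:

```python
import numpy as np

def extent_fahp(M):
    # Chang's extent analysis. M is an n x n x 3 array of triangular
    # fuzzy comparison numbers (l, m, u).
    n = M.shape[0]
    row = M.sum(axis=1)                 # fuzzy row sums
    tot = row.sum(axis=0)               # grand fuzzy total
    S = np.empty_like(row)              # fuzzy synthetic extents S_i
    S[:, 0] = row[:, 0] / tot[2]
    S[:, 1] = row[:, 1] / tot[1]
    S[:, 2] = row[:, 2] / tot[0]

    def V(a, b):                        # degree of possibility V(a >= b)
        if a[1] >= b[1]:
            return 1.0
        if b[0] >= a[2]:
            return 0.0
        return (b[0] - a[2]) / ((a[1] - a[2]) - (b[1] - b[0]))

    d = np.array([min(V(S[i], S[k]) for k in range(n) if k != i)
                  for i in range(n)])
    return d / d.sum()                  # normalized crisp weights

# Illustrative 3-attribute fuzzy comparison matrix of (l, m, u) triples
M = np.array([
    [(1, 1, 1),       (1, 2, 3),     (2, 3, 4)],
    [(1/3, 1/2, 1),   (1, 1, 1),     (1, 2, 3)],
    [(1/4, 1/3, 1/2), (1/3, 1/2, 1), (1, 1, 1)],
])
weights = extent_fahp(M)
```

Replacing each triple (l, m, u) by a single crisp value reduces this to the conventional AHP case that the study compares against.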
Abstract: The water 2H NMR signal on the surface of the nano-silica material MCM-41 consists of two overlapping resonances. The 2H water spectrum shows a superposition of a Lorentzian line shape and the familiar NMR powder pattern line shape, indicating the existence of two spin components. Chemical exchange occurs between these two groups. Decomposition of the two signals is a crucial starting point for studying the exchange process. In this article we determine these spin component populations, along with other important parameters of the 2H water NMR signal, over a temperature range between 223 K and 343 K.
Abstract: Conventional concentrically-braced frame (CBF)
systems have limited drift capacity before brace buckling and related
damage leads to deterioration in strength and stiffness. Self-centering
concentrically-braced frame (SC-CBF) systems have been developed
to increase drift capacity prior to initiation of damage and minimize
residual drift. SC-CBFs differ from conventional CBFs in that the
SC-CBF columns are designed to uplift from the foundation at a
specified level of lateral loading, initiating a rigid-body rotation
(rocking) of the frame. Vertically-aligned post-tensioning bars resist
uplift and provide a restoring force to return the SC-CBF columns to
the foundation (self-centering the system). This paper presents a
parametric study of different prototype buildings using SC-CBFs.
The bay widths of the SC-CBFs have been varied in these buildings
to study different geometries. Nonlinear numerical analyses of the
different SC-CBFs are presented to illustrate the effect of frame
geometry on the behavior and dynamic response of the SC-CBF
system.
Abstract: This paper suggests a new Affine Projection (AP) algorithm with variable data-reuse factor using the condition number as a decision factor. To reduce computational burden, we adopt a recently reported technique which estimates the condition number of an input data matrix. Several simulations show that the new algorithm has better performance than that of the conventional AP algorithm.
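The core of any AP variant is the projection update sketched below in a system-identification setting. The condition-number test that switches the data-reuse factor is only an illustrative stand-in, since the abstract does not disclose the estimation technique it adopts; all parameter values are assumptions:

```python
import numpy as np

def apa_identify(x, d, L=8, K=4, mu=0.5, delta=1e-4, cond_max=1e4):
    # Affine projection adaptive filter identifying an unknown FIR system.
    # When the K x K input correlation matrix looks ill-conditioned, the
    # data-reuse factor drops to 1 (an NLMS step) -- an illustrative
    # stand-in for the condition-number-based decision in the paper.
    w = np.zeros(L)
    for n in range(L + K - 1, len(x)):
        X = np.array([x[n - k - L + 1:n - k + 1][::-1] for k in range(K)])
        R = X @ X.T + delta * np.eye(K)
        k_use = K if np.linalg.cond(R) < cond_max else 1
        Xk, Rk = X[:k_use], R[:k_use, :k_use]
        e = d[n - np.arange(k_use)] - Xk @ w       # a-priori errors
        w += mu * Xk.T @ np.linalg.solve(Rk, e)    # projection update
    return w

rng = np.random.default_rng(0)
h = np.array([1.0, 0.5, -0.3, 0.2, -0.1, 0.05, 0.02, -0.01])  # unknown system
x = rng.normal(size=2000)
d = np.convolve(x, h)[:len(x)]     # noiseless desired signal
w = apa_identify(x, d, L=len(h))
```

Solving the small K x K system each step is where the computational burden the abstract mentions comes from, which motivates keeping K low when the data allow it.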
Abstract: Segmentation, filtering out of measurement errors and
identification of breakpoints are integral parts of any analysis of
microarray data for the detection of copy number variation (CNV).
Existing algorithms designed for these tasks have had some successes
in the past, but they tend to be O(N2) in either computation time or
memory requirement, or both, and the rapid advance of microarray
resolution has practically rendered such algorithms useless. Here we
propose an algorithm, SAD, that is much faster and far less demanding
of memory -- O(N) in both computation time and memory requirement
-- and offers higher accuracy. The two key ingredients of SAD are the
fundamental assumption in statistics that measurement errors are
normally distributed and the mathematical relation that the product of
two Gaussians is another Gaussian (function). We have produced a
computer program for analyzing CNV based on SAD. In addition to
being fast and small it offers two important features: quantitative
statistics for predictions and, with only two user-decided parameters,
ease of use. Its speed shows little dependence on genomic profile.
Running on an average modern computer, it completes CNV analyses
for a 262 thousand-probe array in ~1 second and a 1.8 million-probe
array in 9 seconds.
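The Gaussian-product identity that SAD relies on has a simple closed form: N(x; mu1, v1) * N(x; mu2, v2) = N(mu1; mu2, v1+v2) * N(x; mu, v), where 1/v = 1/v1 + 1/v2 and mu = v*(mu1/v1 + mu2/v2). A short numerical check of this relation (the means and variances below are arbitrary examples):

```python
import numpy as np

def gauss(x, mu, var):
    # Normal density N(x; mu, var)
    return np.exp(-(x - mu)**2 / (2*var)) / np.sqrt(2*np.pi*var)

mu1, v1 = 1.0, 4.0
mu2, v2 = 3.0, 1.0
x = np.linspace(-10, 10, 2001)

prod = gauss(x, mu1, v1) * gauss(x, mu2, v2)

# Closed form: the product is a scaled Gaussian
v = 1.0 / (1.0/v1 + 1.0/v2)            # combined variance: 0.8
mu = v * (mu1/v1 + mu2/v2)             # combined mean: 2.6
scale = gauss(mu1, mu2, v1 + v2)       # constant N(mu1; mu2, v1+v2)
reconstructed = scale * gauss(x, mu, v)
```

This closure under multiplication is what lets normally distributed measurement errors be combined analytically, with no iterative fitting, and is the source of SAD's O(N) behavior.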
Abstract: I/O workload is a critical and important factor in
analyzing I/O patterns and file system performance. However, tracing I/O
operations on the fly in a distributed parallel file system is non-trivial due
to collection overhead and the large volume of data. In this paper, we
design and implement a parallel file system logging method for high
performance computing using a shared memory-based multi-layer
scheme. It minimizes overhead with reduced logging operation
response time and provides an efficient post-processing scheme through
shared memory. A separate logging server can collect sequential logs
from multiple clients in a cluster through packet communication.
Implementation and evaluation results show the low overhead and high
scalability of this architecture for high performance parallel logging
analysis.
Abstract: An experiment was conducted to examine the effect of the level of performance stabilization on human adaptability to perceptual-motor perturbation in a complex coincident timing task. Three levels of performance stabilization were established operationally: pre-stabilization, stabilization, and super-stabilization groups. Each group practiced the task until it reached its level of stabilization, in a constant sequence of movements and under a constant time constraint, before exposure to perturbation. The results clearly showed that performance stabilization is a pre-condition for adaptation. Moreover, variability before reaching stabilization is harmful to adaptation, whereas persistent variability after stabilization is beneficial. Finally, the behavior of variability is specific to each measure.
Abstract: In the fields of computer science and mathematics,
a sorting algorithm is an algorithm that puts the elements of a list in a
certain order, i.e. ascending or descending. Sorting is perhaps the
most widely studied problem in computer science and is frequently
used as a benchmark of a system's performance. This paper
presents a comparative performance study of four sorting
algorithms on different platforms. For each machine, it is found that
the best-performing algorithm depends upon the number of elements
to be sorted. In addition, as expected, the results show that the relative
performance of the algorithms differed on the various machines. Thus,
algorithm performance depends on data size, and hardware also has
an impact.
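The abstract does not name the four algorithms compared, so the harness below uses four common choices (bubble, insertion, merge, and quick sort) purely as a sketch of how such a cross-size, cross-machine benchmark can be run:

```python
import random
import time

def bubble_sort(a):
    a = a[:]
    for i in range(len(a) - 1, 0, -1):
        swapped = False
        for j in range(i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:       # early exit on an already-sorted pass
            break
    return a

def insertion_sort(a):
    a = a[:]
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def quick_sort(a):
    if len(a) <= 1:
        return a[:]
    pivot = a[len(a) // 2]
    return (quick_sort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quick_sort([x for x in a if x > pivot]))

def benchmark(sizes=(100, 1000)):
    # Time each algorithm on random data of each size
    algos = [bubble_sort, insertion_sort, merge_sort, quick_sort]
    results = {}
    for n in sizes:
        data = [random.randrange(n * 10) for _ in range(n)]
        for algo in algos:
            t0 = time.perf_counter()
            algo(data)
            results[(algo.__name__, n)] = time.perf_counter() - t0
    return results
```

As input size grows, the O(n^2) sorts fall behind the O(n log n) sorts, which is the size dependence the study reports; rerunning the same harness on different machines exposes the hardware effect.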
Abstract: Carbon-based coils at the nanometer scale have
a three-dimensional helical geometry. We synthesized carbon nano-coils
by chemical vapor deposition with iron and tin as
the catalysts. The fabricated coils have external diameters
ranging from a few hundred nm to a few thousand nm. Scanning
electron microscopy (SEM) and tunneling electron microscopy have
provided detailed images of the coils' structure. The
carbon nano-coils can be grown on metal and non-metal substrates,
such as stainless steel and silicon substrates. Besides growth on
flat substrates, they can also be grown on stainless steel wires. After
the synthesis of the coils, their mechanical and electro-mechanical
properties were measured, and the experimental results are reported.
Abstract: Software maintenance is an extremely important activity in the software development life cycle. It involves a great deal of human effort, cost and time. Software maintenance may be further subdivided into different activities such as fault prediction, fault detection, fault prevention, and fault correction. This topic has gained substantial attention due to sophisticated and complex applications, commercial hardware, clustered architectures and artificial intelligence. In this paper we survey the work done in the field of software maintenance. Software fault prediction has been studied in the context of fault-prone modules, self-healing systems, developer information, maintenance models, etc. Many questions, such as the modeling and weighting of the impact of different kinds of faults in various types of software systems, still need to be explored in the field of fault severity.
Abstract: The Minimum Quantity Lubrication (MQL) technique
has attracted significant attention in machining processes as a way to
reduce the environmental load caused by conventional cutting fluids.
Recently, nanofluids have found extensive application in the field
of mechanical engineering because of their superior lubrication and
heat dissipation characteristics. This paper investigates the use of a
nanofluid under MQL mode to improve the grinding characteristics of
Ti-6Al-4V alloy. Taguchi's experimental design technique has been
used in the present investigation and a second order model has been
established to predict grinding forces and surface roughness.
Different concentrations of water-based Al2O3 nanofluids were
applied in the grinding operation through an MQL setup developed in
house, and the results have been compared with those of conventional
coolant and pure water. Experimental results showed that grinding
forces reduced significantly when the nano cutting fluid was used, even
at low concentrations of the nanoparticles, and surface finish was
found to improve with higher concentrations of the nanoparticles.
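A second-order (quadratic) response model of the kind fitted from Taguchi-designed experiments can be obtained by ordinary least squares. The concentrations and forces below are hypothetical placeholders, not the paper's measurements, and the sketch shows the fitting step for a single factor only:

```python
import numpy as np

def fit_second_order(x, y):
    # Fit y ~ b0 + b1*x + b2*x^2 by ordinary least squares
    A = np.column_stack([np.ones_like(x), x, x**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Hypothetical data: grinding force vs. nanofluid concentration (vol. %)
conc = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
force = np.array([42.0, 35.5, 31.0, 28.5, 28.0])
b0, b1, b2 = fit_second_order(conc, force)
predict = lambda c: b0 + b1*c + b2*c**2
```

With several factors (depth of cut, wheel speed, concentration), the same least-squares machinery applies with cross terms added to the design matrix.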
Abstract: The objective of this study was to characterize and
develop anthropomorphic liver phantoms for hepatic tomography
procedures, for quality control and the continuing education of
professionals in nuclear medicine. The anthropomorphic
phantoms were made of plaster and acrylic. We constructed three
phantoms representing livers with cirrhosis. The phantoms
were filled with 99mTc diluted with water to obtain the scintigraphic
images. Anterior and posterior tomography images were analyzed for
the phantom representing the highest degree of cirrhosis. It was
noted that the phantoms allow the acquisition of images similar to a
real liver with cirrhosis. Simulations of hemangiomas may contribute
to the continuing professional education of nuclear medicine staff in
image acquisition, allowing the study of parameters such
as the matrix, energy window and count statistics.
Abstract: In this paper, we investigate vector control of an induction machine, taking into account the discretization problems of the command. The purpose is to show how to include current control and rotor time constant updating in a discrete model. The simulation results obtained are very satisfactory. This was possible thanks to the good choice of the values of the parameters of the regulators used, which demonstrates the soundness of the method used for choosing the parameters of the discrete regulators. The simulation results are presented at the end of this paper.