Abstract: Today, higher education worldwide is subject to ever greater institutional control through policies on the Quality of Education. These include the pervasive evaluation of all academic activities: student and professor performance, educational logistics, and managerial standards for the administration of institutions of higher education, as well as the establishment of imaginaries of excellence and prestige as the foundations on which twenty-first-century universities will focus their present and future goals and interests. At the same time, higher education systems worldwide are facing a profound crisis of sense and meaning and are undergoing enormous mutations in their identity. Based on a qualitative research approach, this paper shows the social configurations that scholars at universities in Mexico build around the discourse of the Quality of Education, and how these policies put the social recognition of these individuals at risk.
Abstract: Despite many success stories of manufacturing safety, many organizations remain reluctant to adopt safety programmes, perceiving them as cost-increasing and time-consuming. A clear contributor may be the use of lagging rather than leading indicator measures. This study therefore proposes a combinatorial model for determining the best safety strategy. Combination theory and cost-benefit analysis were employed to develop a monetary saving/loss function in terms of the value of prevention and the cost of the prevention strategy. Documentation, interviews and a structured questionnaire were used to collect before-and-after safety programme records from a tobacco company for the periods 1993-2001 (pre-safety) and 2002-2008 (safety period) for the model application. Three combinatorial alternatives, A, B and C, were obtained, resulting in 4, 6 and 4 strategies respectively, with PPE and training predominant. A total of 728 accidents were recorded over the 9-year pre-safety period and 163 accidents over the 7-year safety period. Six prevention activities (alternative B) yielded the best results, with savings recorded in all years of operation except 2004. The study provides a leading resource for planning a successful safety programme.
Abstract: In this paper, we represent protein structures using graphs, so that a protein structure database becomes a graph database. Each graph is represented by a spectral vector. We use the Jacobi rotation algorithm to calculate the eigenvalues of the normalized Laplacian of the graph's adjacency matrix. To measure the similarity between two graphs, we calculate the Euclidean distance between their spectral vectors. To cluster the graphs, we use an M-tree with the Euclidean distance to cluster the spectral vectors; the M-tree can also be used for graph searching in the graph database. Our proposed method was tested on a database of 100 graphs representing 100 protein structures downloaded from the Protein Data Bank (PDB), and we compared the results with the SCOP hierarchical structure.
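The comparison pipeline described above – normalized-Laplacian eigenvalues as a spectral vector, compared by Euclidean distance – can be sketched as follows. This is an illustrative sketch, not the authors' code: it uses NumPy's `eigvalsh` in place of the Jacobi rotation algorithm, and zero-padding the shorter spectrum is an assumption for comparing graphs of different sizes.

```python
import numpy as np

def spectral_vector(adj):
    """Ascending eigenvalues of the normalized Laplacian
    L = I - D^{-1/2} A D^{-1/2}, used as the graph's spectral signature."""
    A = np.asarray(adj, dtype=float)
    d = A.sum(axis=1)
    d_inv_sqrt = np.zeros_like(d)
    nz = d > 0
    d_inv_sqrt[nz] = 1.0 / np.sqrt(d[nz])
    L = np.eye(len(A)) - A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.sort(np.linalg.eigvalsh(L))

def spectral_distance(adj1, adj2):
    """Euclidean distance between two spectral vectors; the shorter
    spectrum is zero-padded (an illustrative choice, not from the paper)."""
    v1, v2 = spectral_vector(adj1), spectral_vector(adj2)
    n = max(len(v1), len(v2))
    v1 = np.pad(v1, (0, n - len(v1)))
    v2 = np.pad(v2, (0, n - len(v2)))
    return float(np.linalg.norm(v1 - v2))
```

For a protein structure, `adj` would be the contact-map adjacency matrix derived from the PDB file, and the resulting spectral vectors are what the M-tree indexes.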
Abstract: Planning capacities when regenerating complex investment goods involves particular challenges, in that the planning is subject to a large degree of uncertainty regarding load information. Using information fusion – by applying Bayesian networks – a method is being developed for forecasting the anticipated expenditures (human labor, tool and machinery utilization, time, etc.) for regenerating a good. The generated forecasts then serve as a tool for planning capacities and ensure greater stability in the planning processes.
Abstract: Noise level has critical effects on the diagnostic performance of the signal-averaged electrocardiogram (SAECG), because the true onset and offset of the QRS complex can be masked by residual noise and are sensitive to the noise level. Several studies and commercial machines have used a fixed number of heart beats (typically between 200 and 600 beats) or a predefined noise level (typically between 0.3 and 1.0 μV) in each of the X, Y and Z leads to perform SAECG analysis. However, different criteria or methods used to perform SAECG cause discrepancies in the noise levels among study subjects. According to the recommendations of the 1991 ESC, AHA and ACC Task Force Consensus Document for the use of SAECG, the determination of onset and offset is closely related to the mean and standard deviation of the noise sample. Hence, this study performs SAECG using consistent root-mean-square (RMS) noise levels among study subjects and analyzes the effects of noise level on SAECG. It also evaluates the differences between normal subjects and chronic renal failure (CRF) patients in the time-domain SAECG parameters.
The study subjects comprised 50 normal Taiwanese subjects and 20 CRF patients. During signal-averaged processing, different RMS noise levels were applied to evaluate their effects on three time-domain parameters: (1) filtered total QRS duration (fQRSD), (2) RMS voltage of the last 40 ms of the QRS (RMS40), and (3) duration of the low-amplitude signals below 40 μV (LAS40). The results demonstrated that reducing the RMS noise level can increase fQRSD and LAS40, decrease RMS40, and further increase the differences in fQRSD and RMS40 between normal subjects and CRF patients. The SAECG may also become abnormal due to the reduction of the RMS noise level. In conclusion, it is essential to establish diagnostic criteria for SAECG using consistent RMS noise levels to reduce noise level effects.
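The case for consistent noise levels rests on the standard fact that averaging N beats reduces uncorrelated noise by a factor of √N, so a fixed beat count leaves different residual noise for subjects with different single-beat noise. The helpers below are an illustrative sketch with hypothetical names, not part of any SAECG toolkit:

```python
import math
import numpy as np

def rms_noise(segment_uv):
    """Root-mean-square value of a residual-noise segment, in microvolts."""
    s = np.asarray(segment_uv, dtype=float)
    return float(np.sqrt(np.mean(s ** 2)))

def beats_needed(single_beat_rms_uv, target_rms_uv):
    """Averaging N beats reduces uncorrelated noise by sqrt(N), so reaching
    a target RMS noise level needs N = (single-beat RMS / target)^2 beats."""
    return math.ceil((single_beat_rms_uv / target_rms_uv) ** 2)
```

For example, a subject with 10 μV single-beat noise needs 400 averaged beats to reach 0.5 μV, while a noisier subject needs correspondingly more, which is why fixed beat counts produce inconsistent residual noise across subjects.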
Abstract: The purpose of this study was to determine the influence of physical activity and dietary fat intake on the Body Mass Index (BMI) of lecturers within a higher learning institutional setting. The study adopted a cross-sectional correlational design and included 120 lecturers selected proportionately by simple random sampling from a population of 600 lecturers. Data were collected using questionnaires with sections including a physical activity checklist adopted from the International Physical Activity Questionnaire (IPAQ), a 24-hour food recall, and anthropometric measurements, mainly weight and height. Analysis involved the use of bivariate correlations and linear regression. A significant inverse association was registered between BMI and the duration (in minutes) spent doing moderately intense physical activity per day (r = -0.322, p
Abstract: Stem cells have the ability to differentiate, through mitotic cell division, into a wide range of specialized cell types. Cellular differentiation is the process by which a less specialized cell develops into a more specialized one. This paper studies the fundamental problem of a computational schema for an artificial neural network based on chemical, physical and biological variables of state. Through this type of study, the system could be modeled for the viable propagation of various economically important stem cell differentiations. This paper proposes various differentiation outcomes of an artificial neural network into a variety of potential specialized cells, implemented in MATLAB version 2009. A feed-forward back-propagation network was created with an input vector (five input elements), a single hidden layer, and one output unit in the output layer. The efficiency of the neural network was evaluated by comparing the results achieved in this study with the experimental input data and the chosen target data. The proposed solution for the efficiency of the artificial neural network was assessed by comparative analysis of the mean squared error at zero epochs. Different variables of data were used to test the targeted results.
Abstract: Historic preservation areas are extremely vulnerable to disasters because they are home to many vulnerable people and contain many closely spaced wooden houses. However, the narrow streets in these areas have historic meaning, which means that they cannot be widened and can become blocked easily during large disasters. Here, we describe our efforts to establish a methodology for planning evacuation routes in such historic preservation areas. In particular, this study aims to clarify the effectiveness of measures intended to secure two-way evacuation routes for vulnerable people during large disasters in a historic area preserved under the Cultural Properties Protection Law of Japan.
Abstract: Appropriate ventilation in a classroom is helpful for
enhancing air exchange rate and student concentration. This study
focuses on the effects of fenestration in a four-story school building by
performing numerical simulation of a building when considering
indoor and outdoor environments simultaneously. The wind profile
function embedded in PHOENICS code was set as the inlet boundary
condition in a suburban environment. Sixteen fenestration
combinations were compared in a classroom containing thirty seats.
This study evaluates the mean age of air (AGE) and the airflow pattern of classrooms on different floors. Considering both the wind profile and fenestration effects, the airflow on higher floors is channeled toward the area near the ceiling, resulting in an older mean age of air in the breathing zone. The results of this study serve as a useful guide for enhancing natural ventilation in a typical school building.
Abstract: The response of growth and yield of rainfed-chickpea
to population density should be evaluated based on long-term
experiments to include climate variability. This is achievable only through simulation. In this simulation study, the evaluation was done by
running the CYRUS model for long-term daily weather data of five
locations in Iran. The tested population densities were 7 to 59 (with
interval of 2) stands per square meter. Various functions, including
quadratic, segmented, beta, broken linear, and dent-like functions,
were tested. Considering root mean square of deviations and linear
regression statistics [intercept (a), slope (b), and correlation
coefficient (r)] for predicted versus observed variables, the quadratic
and broken linear functions appeared to be appropriate for describing
the changes in biomass and grain yield, and in harvest index,
respectively. Results indicated that in all locations, grain yield tends to increase as the population becomes more crowded, but subsequently decreases. This was also true for biomass in all five locations. The harvest index appeared to plateau across low population densities, but showed a decreasing trend as density increased further. The turning point (optimum population density) for grain
yield was 30.68 stands per square meter in Isfahan, 30.54 in Shiraz,
31.47 in Kermanshah, 34.85 in Tabriz, and 32.00 in Mashhad. The
optimum population density for biomass ranged from 24.6 (in
Tabriz) to 35.3 stands per square meter (Mashhad). For harvest index
it varied between 35.87 and 40.12 stands per square meter.
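Locating the turning point of a fitted quadratic, as used above for biomass and grain yield, can be sketched as follows with hypothetical density-yield data (not the CYRUS model outputs):

```python
import numpy as np

# Hypothetical densities from 7 to 59 stands per square meter (interval 2)
# and a noiseless quadratic yield response peaking at 31 stands per m^2.
density = np.arange(7, 60, 2, dtype=float)
grain_yield = 1200.0 - 0.5 * (density - 31.0) ** 2

# Fit y = a*d^2 + b*d + c; for a < 0 the turning point (optimum density)
# of the quadratic lies at d* = -b / (2a).
a, b, c = np.polyfit(density, grain_yield, 2)
optimum_density = -b / (2.0 * a)  # recovers 31.0 for this synthetic data
```

The broken linear function used for harvest index would instead report its breakpoint directly as the fitted join of the two segments.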
Abstract: Marketing is essential to the survival of any real estate company in Turkey. Several factors constrain the achievement of marketing and sales strategies in the Turkish real estate industry. This study aims to identify and prioritise the most significant constraints to marketing in the real estate sector and to propose new strategies based on those constraints. The study is based on a survey in which respondents such as credit counsellors, real estate investors, consultants, academicians and marketing representatives in Turkey were asked to rank forty-seven sub-factors according to their levels of impact. The results of the multi-attribute analytical technique indicated that the main sub-components affecting marketing in the real estate sector are interest rates, real estate credit availability, accessibility, company image and consumer real income, respectively. The identified constraints are expected to guide the marketing team in a sales-effective way.
Abstract: Polyurethane foams (PUF) were formed by a chemical
reaction of polyol and isocyanate. The polyol was manufactured by
ring-opening hydrolysis of epoxidized soybean oil in the presence of
phosphoric acid under varying experimental conditions. Other
factors in the foam formulation such as water content and surfactant
were kept constant. The effect of the amount of solvents, phosphoric acid, and their derivatives in the foam formulation on the properties of the polyurethane foams was studied. The properties of the material were measured via a number of parameters: the water content of the prepared polyol, polymer density and cellular structure.
Abstract: Brazilian legislation has established diagnostic reference levels (DRLs) only in terms of the Multiple Scan Average Dose (MSAD) as a quality control parameter for computed tomography (CT) scanners. Compliance with DRLs can be verified
by measuring the Computed Tomography Kerma Index (Ca,100) with
a pencil ionization chamber or by obtaining the kerma distribution in
CT scans with radiochromic films or rod shape lithium fluoride
thermoluminescent dosimeters (TLD-100). TL dosimeters were used to record kerma profiles and to determine MSAD values for a GE Bright Speed CT scanner. Measurements were done with
radiochromic films and TL dosimeters distributed in cylinders
positioned in the center and in four peripheral bores of a standard
polymethylmethacrylate (PMMA) body CT dosimetry phantom.
Irradiations were done using a protocol for adult chest. The
maximum values were found at the midpoint of the longitudinal axis.
The MSAD values obtained with three dosimetric techniques were
compared.
Abstract: In the current economy of increasing global
competition, many organizations are attempting to use knowledge as
one of the means to gain sustainable competitive advantage. Besides
large organizations, the success of SMEs can be linked to how well
they manage their knowledge. Despite the profusion of research
about knowledge management within large organizations, fewer
studies tried to analyze KM in SMEs.
This research proposes a new framework showing the determinant
role of organizational dimensions onto KM approaches. The paper
and its propositions are based on a literature review and analysis.
In this research, personalization versus codification,
individualization versus institutionalization and IT-based versus non
IT-based are highlighted as three distinct dimensions of knowledge
management approaches.
The study contributes to research by providing a more nuanced
classification of KM approaches and provides guidance to managers
about the types of KM approaches that should be adopted based on
the size, geographical dispersion and task nature of SMEs.
To the author's knowledge, this paper is the first of its kind to examine whether there are suitable configurations of KM approaches for SMEs with different dimensions. It provides valuable information which will hopefully help the SME sector accomplish KM.
Abstract: Interventional cardiologists are at greater risk from
radiation exposure as a result of the procedures they undertake than
most other medical specialists. A study was performed to evaluate
operator dose during interventional cardiology procedures and to
establish methods of operator dose reduction with a radiation
protective device. Differences in procedural technique and in the use of protective tools can explain the large differences in the annual equivalent dose received by these professionals. Strategies to prevent and monitor radiation exposure, advanced protective shielding and effective radiation monitoring methods should be applied.
Abstract: An autonomous environmental monitoring system
(Smart Landfill) has been constructed for the quantitative
measurement of the components of landfill gas found at borehole
wells at the perimeter of landfill sites. The main components of landfill gas are the greenhouse gases methane and carbon dioxide, which have been monitored in the range 0-5 % by volume. This monitoring
system has not only been tested in the laboratory but has been
deployed in multiple field trials and the data collected successfully
compared with on-site monitors. This success shows the potential of
this system for application in environments where reliable gas
monitoring is crucial.
Abstract: This article proposes a new methodology to be used by SMEs (small and medium enterprises) to characterize their performance in quality, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and to help prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and subprocesses of quality management, defined based on the well-known Juran trilogy for quality management (quality planning, quality control and quality improvement), and about predetermined result categories, defined based on the quality concept. A set of tools for data collection and analysis is used, such as interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA). The article also presents the conclusions obtained from applying the methodology in two case studies.
Abstract: Much work has been done on predicting the fault proneness of software systems. However, the severity of the faults is more important than the number of faults in the developed system, as major faults matter most to a developer and need immediate attention. In this paper, we try to predict the level of impact of the existing faults in software systems. A neuro-fuzzy-based predictor model is applied to NASA's public-domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them; CFS is therefore used to select the metrics most highly correlated with the level of severity of faults. The results are compared with the prediction results of Logistic Model Trees (LMT), earlier quoted as the best technique in [17]. The results are recorded in terms of accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE). They show that the neuro-fuzzy-based model provides relatively better prediction accuracy than the other models and hence can be used for modeling the level of impact of faults in function-based systems.
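The CFS heuristic mentioned above scores a feature subset by merit = k·r̄_cf / √(k + k(k−1)·r̄_ff), where r̄_cf is the mean feature-class correlation and r̄_ff the mean inter-feature correlation. Below is a minimal sketch using Pearson correlation on a hypothetical feature matrix; Hall's original CFS uses symmetrical uncertainty for discrete attributes, so this is an approximation for illustration only.

```python
import numpy as np

def cfs_merit(X, y, subset):
    """CFS merit of a feature subset: k*r_cf / sqrt(k + k*(k-1)*r_ff),
    with r_cf the mean absolute feature-class correlation and r_ff the
    mean absolute pairwise inter-feature correlation (Pearson here)."""
    k = len(subset)
    r_cf = np.mean([abs(np.corrcoef(X[:, j], y)[0, 1]) for j in subset])
    if k > 1:
        r_ff = np.mean([abs(np.corrcoef(X[:, i], X[:, j])[0, 1])
                        for i in subset for j in subset if i < j])
    else:
        r_ff = 0.0
    return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)
```

Subsets whose features correlate strongly with the severity label but weakly with each other score highest; a greedy search over subsets by this merit is how CFS picks the best metrics.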
Abstract: The breakdown strength characteristics of low-density polyethylene (LDPE) films under DC voltage application and the effect of water absorption have been studied. Our experiment was conducted under two conditions: dry and heavy water absorption. Under DC ramp voltage, the results showed that the breakdown strength under heavy water absorption is lower than under the dry condition. To clarify this effect, the temperature rise of the film was observed using a non-contact thermograph until the occurrence of electrical breakdown, and the conduction current of the sample was also measured in correlation with the thermograph measurement. The observations showed that under heavy water absorption, the hot spot in the samples appeared at a lower voltage, and at the same voltage the temperature of the hot spot and the conduction current were higher than under the dry condition. The measurements showed a good correlation between the existence of a critical field for conduction current and the thermograph observations. Under heavy water absorption, the threshold field occurred earlier than under the dry condition, leading to a higher conduction current, and beyond the threshold field the temperature rise increased significantly with increasing field. The higher temperature rise was caused by the higher conduction current; as a result, the insulation broke down at a lower applied field.
Abstract: Biological data has several characteristics that strongly differentiate it from typical business data. It is much more complex, usually large in size, and continuously changing. Until recently, business data has been the main target for discovering trends, patterns or future expectations. However, with the recent rise of biotechnology, the powerful technology that was used for analyzing business data is now being applied to biological data. With this advanced technology at hand, the main trend in biological research is rapidly shifting from structural DNA analysis to understanding the cellular functions of DNA sequences. DNA chips are now being used to perform experiments, and DNA analysis processes are being used by researchers. Clustering is one of the important processes used for grouping together similar entities. There are many clustering algorithms, such as hierarchical clustering, self-organizing maps, K-means clustering and so on. In this paper, we propose a clustering algorithm that imitates the ecosystem, taking into account the features of biological data. We implemented the system using an ant-colony clustering algorithm. The system decides the number of clusters automatically. It processes the input biological data, runs the ant-colony algorithm, draws the topic map, assigns clusters to the genes and displays the output. We tested the algorithm with test data of 100 to 1000 genes and 24 samples and show promising results for applying this algorithm to clustering DNA chip data.