Abstract: The aim of this paper is to explain what a multi-enterprise tie is, what evidence its analysis provides, and how the cooperation mechanism influences the establishment of such a tie. The study focuses on small, geographically dispersed businesses whose owners are learning to cooperate in an international environment. The empirical evidence obtained so far supports the following conclusions: the tie is not long-lasting, it has an end; opportunism is an opportunity to learn; the multi-enterprise tie is a space for learning about the cooperation mechanism; the local tie allows a businessman to alternate between competition and cooperation strategies; the disappearance of a tie is a learning experience for a businessman, diminishing the possibility of failure in the next tie; the cooperation mechanism tends to eliminate hierarchical relations; the multi-enterprise tie diminishes asymmetries and allows SMEs to negotiate with large companies from a better position; and the multi-enterprise tie has a positive impact on the local system. The empirical evidence was collected through the following instruments: direct observation at a business encounter attended by the businesses in 2003 (202 Mexican agro-industry SMEs), a survey applied in 2004 (129 businesses), a questionnaire applied in 2005 (86 businesses), field visits to the businesses during the period 2006-2008, and a telephone survey applied in 2008 (55 Mexican agro-industry SMEs).
Abstract: Purpose: This paper aims to gain insight into the factors that influence ERM adoption by public listed firms in Malaysia. Findings: Financial leverage and auditor type were found to be significant influences on ERM adoption; in other words, firms with higher financial leverage and a Big Four auditor are more likely to have a form of ERM framework in place. Originality/Value: Since relatively few studies have been conducted in this area, especially in developing economies such as Malaysia, this study broadens the literature by providing novel empirical evidence.
Abstract: To improve the characterization of blood flows, we propose a method that exploits the spectral analysis of Doppler signals. Our calculation involves a reasonable approximation; the error made on the estimated speed reflects the fact that speed depends on the flow conditions as well as on measurement parameters such as the bore and the volume flow rate. Estimating the Doppler signal frequency enables us to determine the maximum Doppler frequency Fd_max as well as the maximum flow speed. The results show that the difference between the estimated frequencies (Fde) and the Doppler frequencies (Fd) is small; this deviation tends to zero for large θ angles and is proportional to the diameter D. The description of the friction velocity and the friction coefficient justifies the error rate obtained.
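The frequency-to-speed conversion described above can be sketched with the classical Doppler equation v = Fd·c / (2·f0·cos θ). The probe frequency, sound speed, beam angle, and sampling rate below are illustrative assumptions, not values from the paper:

```python
import numpy as np

# Hypothetical parameters (not from the paper): emitted frequency f0,
# sound speed c, beam angle theta, true flow speed v_true.
f0, c, theta = 4e6, 1540.0, np.deg2rad(60)    # 4 MHz probe, soft tissue
v_true = 0.5                                  # m/s

fd_true = 2 * f0 * v_true * np.cos(theta) / c # classical Doppler shift
fs = 20e3                                     # sampling rate of the Doppler signal
t = np.arange(0, 0.5, 1 / fs)
signal = np.cos(2 * np.pi * fd_true * t)      # idealized noise-free Doppler signal

# Spectral estimate: take the FFT peak as the maximum Doppler frequency Fd_max
spec = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(t), 1 / fs)
fd_est = freqs[np.argmax(spec)]

v_est = fd_est * c / (2 * f0 * np.cos(theta)) # invert the Doppler equation
```

With a clean signal the spectral peak recovers Fd to within one frequency bin, which is why the residual speed error in practice is dominated by flow conditions and geometry rather than by the estimator itself.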
Abstract: In this study, a framework for the verification of well-known seismic codes is used. To verify the performance of the seismic codes, the damage sustained by RC frames is compared with the target performance. Because of the randomness inherent in seismic design and earthquake load excitation, fragility curves are developed. These curves are used to evaluate the performance level of structures designed to the seismic codes, and they further illustrate the effect of the codes' load combinations and reduction factors on the probability of damage exceedance. Two types of structures are designed to the different seismic codes: very high importance structures with high ductility, and medium importance structures with intermediate ductility. The results reveal that a lower damage ratio usually produces a lower probability of exceedance. In addition, the findings indicate that there are buildings with larger quantities of reinforcement bars that nevertheless have a higher probability of damage exceedance. Life-cycle cost analysis is used for comparison and for the final decision-making process.
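Fragility curves of the kind used above are commonly modeled as a lognormal CDF of the intensity measure. The sketch below assumes that standard form, with a hypothetical median capacity and dispersion rather than values from the study:

```python
import numpy as np
from scipy.stats import norm

def fragility(im, median, beta):
    """Lognormal fragility curve: P(damage state exceeded | intensity measure)."""
    return norm.cdf(np.log(im / median) / beta)

# Hypothetical parameters: median capacity 0.4 g, dispersion 0.5
pga = np.linspace(0.05, 1.2, 100)
p_exceed = fragility(pga, median=0.4, beta=0.5)
```

By construction the curve passes through 0.5 at the median capacity and rises monotonically with the intensity measure, which is what lets lower damage ratios translate into lower exceedance probabilities.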
Abstract: To identify discriminative sequence features between exons and introns, a new paradigm, rescaled-range frameshift analysis (RRFA), is proposed. Through RRFA, two new sequence features, frameshift sensitivity (FS) and accumulative penta-mer complexity (APC), were discovered and further integrated into a new larger-scale feature, persistency in anti-mutation (PAM). Feature-validation experiments were performed on six model organisms to test the discriminative power. All the experimental results strongly support that FS, APC and PAM are distinguishing features between exons and introns. These newly identified sequence features provide new insights into the sequence composition of genes and have great potential to form a new basis for recognizing exon-intron boundaries in gene sequences.
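RRFA itself is the authors' novel method, but it builds on the classical rescaled-range (R/S) statistic, which can be sketched as follows; the numeric encoding of the DNA window is an illustrative assumption, not the encoding used by RRFA:

```python
import numpy as np

def rescaled_range(x):
    """Classical rescaled-range (R/S) statistic of a numeric sequence."""
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())   # cumulative deviations from the mean
    r = y.max() - y.min()         # range of the cumulative series
    s = x.std()                   # standard deviation
    return r / s

# Toy usage on a numerically encoded DNA window (A,C,G,T -> 0..3); the
# encoding is illustrative only.
seq = "ACGTACGGTTACGA"
codes = [{"A": 0, "C": 1, "G": 2, "T": 3}[b] for b in seq]
rs = rescaled_range(codes)
```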
Abstract: The problems associated with the wind predictions of the WAsP model in complex terrain have been the target of several studies over the last decade. In this paper, the influence of the surrounding orography on the accuracy of wind data analysis of a terrain is investigated. For the case study, a site with complex surrounding orography is considered; it is located in Manjil, one of the windiest cities of Iran. For a precise evaluation of the wind regime at the site, one year of wind data measurements from two meteorological masts is used. To validate the results obtained from WAsP, cross-prediction between the two masts is performed. The analysis reveals that the WAsP model can estimate the wind speed behavior accurately. In addition, the results show that this software can be used to predict the wind regime in flat sites with complex surrounding orography.
Abstract: This study examined the influence of pay differentials on employee retention in the State Colleges of Education in the South-South Region of Nigeria. Questionnaires constructed for the study were administered to 275 subjects drawn from members of the wage negotiating teams in the Colleges. Analysis of variance revealed that the observed pay differentials significantly influenced retention, F(5, 269) = 6.223, p < 0.05. However, the multiple classification analysis and post-hoc test indicated that employees in two of the Colleges, with slightly lower and higher pay levels, would probably remain with their employers, while employees in the other Colleges, with the lowest and highest pay levels, suggested quitting. Based on these observations, the influence of pay on employee retention seems inconclusive. Generally, employees in the Colleges studied are dissatisfied with current pay levels. Management should confront these challenges by improving pay packages to encourage employees to remain and be dedicated to duty.
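The reported one-way ANOVA, F(5, 269) = 6.223, p < 0.05, tests whether mean retention scores differ across the pay-level groups. A minimal sketch with hypothetical scores for three colleges (the study used six groups and its own instrument):

```python
from scipy.stats import f_oneway

# Hypothetical retention-intention scores for employees of three colleges;
# the data are illustrative only, not the study's measurements.
college_a = [3.1, 3.4, 2.9, 3.6, 3.2]
college_b = [4.0, 4.2, 3.8, 4.1, 3.9]
college_c = [2.5, 2.8, 2.6, 2.9, 2.4]

f_stat, p_value = f_oneway(college_a, college_b, college_c)
# Reject the null of equal group means at the 5% level when p_value < 0.05
```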
Abstract: A linear stability analysis of wake-shear layers in two-phase shallow flows is performed in this paper. Two-dimensional shallow water equations are used in the analysis. It is assumed that the fluid contains uniformly distributed solid particles, with no dynamic interaction between the carrier fluid and the particles at the initial moment. The stability calculations are performed for different values of the particle loading parameter and of two other parameters that characterize the velocity ratio and the velocity deficit. The results show that the particle loading parameter has a stabilizing effect on the flow, while an increase in the velocity ratio or in the velocity deficit destabilizes the flow.
Abstract: Microarrays have become effective, widely used tools in biological and medical research, addressing a wide range of problems including the classification of disease subtypes and tumors. Many statistical methods are available for analyzing and organizing these complex data into meaningful information, and one of the main goals in analyzing gene expression data is the detection of samples or genes with similar expression patterns. In this paper, we compare the performance of several clustering methods under different data preprocessing strategies, including normalization and noise removal. We also evaluate each of these clustering methods with validation measures on both simulated data and real gene expression data. The results show that the clustering methods commonly used in microarray data analysis are affected by normalization and by the degree of noise in the datasets.
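The effect of preprocessing on clustering can be illustrated with a small sketch: hierarchical clustering is run on a simulated expression matrix before and after row-wise standardization. The data, cluster count, and linkage choice are assumptions for illustration, not the paper's setup:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Simulated expression matrix: 20 "genes", two groups that differ in mean level
genes = np.vstack([rng.normal(0.0, 0.3, (10, 6)),
                   rng.normal(2.0, 0.3, (10, 6))])

def cluster_labels(data, k=2):
    """Average-linkage hierarchical clustering into k clusters."""
    return fcluster(linkage(data, method="average"), k, criterion="maxclust")

raw_labels = cluster_labels(genes)
# Row-wise standardization (a common normalization step for expression data)
z = (genes - genes.mean(axis=1, keepdims=True)) / genes.std(axis=1, keepdims=True)
norm_labels = cluster_labels(z)
```

On the raw data the two mean-level groups are recovered cleanly; after row standardization the level difference is removed, so the partition can change, which is exactly the sensitivity to preprocessing the abstract reports.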
Abstract: Rolling element bearings are widely used in industry, especially where high load capacity is required. Diagnosing their condition is essential for reducing downtime and saving maintenance costs. An intensive analysis of the frequency spectrum of their faults must therefore be carried out in order to determine the main cause of a fault. This paper focuses on a beating phenomenon observed in the waveform (time domain) of a cylindrical rolling element bearing. The beating frequencies were not related to any sources near the machine, nor to any other malfunctions (unbalance, misalignment, etc.). Further investigation of the spike energy and the frequency spectrum indicated a problem with the races of the bearing. Multiple harmonics of the fundamental defect frequencies were observed; two of them, close to each other in magnitude, were the source of the beating phenomenon.
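The defect frequencies mentioned above follow standard bearing kinematics: BPFO = (N/2)·fr·(1 - (d/D)·cos φ) and BPFI = (N/2)·fr·(1 + (d/D)·cos φ). When two harmonics of these frequencies fall close together, their difference is the beat frequency. The geometry below is hypothetical, not that of the bearing in the paper:

```python
import numpy as np

def bearing_defect_freqs(n_balls, shaft_hz, ball_d, pitch_d, contact_deg=0.0):
    """Classical outer-race (BPFO) and inner-race (BPFI) defect frequencies."""
    ratio = ball_d / pitch_d * np.cos(np.deg2rad(contact_deg))
    bpfo = n_balls / 2 * shaft_hz * (1 - ratio)
    bpfi = n_balls / 2 * shaft_hz * (1 + ratio)
    return bpfo, bpfi

# Hypothetical bearing geometry (9 rollers, 25 Hz shaft, 7.5 mm roller,
# 40 mm pitch diameter), not taken from the paper
bpfo, bpfi = bearing_defect_freqs(n_balls=9, shaft_hz=25.0, ball_d=7.5, pitch_d=40.0)

# Two spectral components close in frequency and comparable in magnitude
# produce a beat whose envelope frequency is their difference
f1, f2 = 3 * bpfo, 2 * bpfi
beat = abs(f1 - f2)
```

Here the third BPFO harmonic and the second BPFI harmonic land only a few hertz apart, so their sum signal shows the slow amplitude modulation seen as beating in the time-domain waveform.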
Abstract: As part of its evaluation system for R&D programs, the Korean government has applied feasibility analysis since 2008. Various professionals have put forth great effort to keep up with the high degree of freedom of R&D programs and to contribute to the evolution of the feasibility analysis. We analyze diverse R&D programs from various viewpoints, such as technology, policy, and economics, integrate the separate analyses, and finally arrive at a definite result: whether a program is feasible or not. This paper describes the concept and method of the feasibility analysis as a decision-making tool. The analysis unit and the content of each criterion, which are key elements of a comprehensive decision-making structure, are examined.
Abstract: We propose a method for the discrimination and classification of ovarian tissue as benign, malignant, or normal using independent component analysis and neural networks. The method was tested on a set of proteomic patterns from a database, using radial basis function neural networks. The best performance was obtained with probabilistic neural networks, resulting in a 99% success rate, with 98% specificity and 100% sensitivity.
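The reported specificity and sensitivity follow the usual confusion-matrix definitions; a minimal sketch with illustrative counts (the paper's actual test-set sizes are not given here):

```python
def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts only, chosen to reproduce the reported 100%
# sensitivity and 98% specificity of the probabilistic neural network.
sensitivity, specificity = sens_spec(tp=50, fn=0, tn=49, fp=1)
```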
Abstract: In this paper, we propose an effective relay communication scheme for layered video transmission as a way to make the most of limited resources in a lossy wireless communication network. Relaying brings stable multimedia services to end clients compared to multiple description coding (MDC). In addition, retransmission from the relay device to the end client of only the parity data for one or more video layers, produced by a channel coder, is paramount to robustness in loss situations. Using these methods in resource-constrained environments, such as real-time user created content (UCC) with layered video transmission, can provide high-quality services even in a poor communication environment, and minimal services remain possible. The mathematical analysis shows that the proposed method reduces the probability of GOP loss compared to MDC and to a raptor code without relay: the GOP loss rate is about zero, while MDC and the raptor code without relay have GOP loss rates of 36% and 70%, respectively, at a 10% frame loss rate.
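The order of magnitude of the unprotected GOP loss rate can be checked with a simple model: if frame losses are independent, a GOP is lost whenever any of its frames is lost. The 12-frame GOP size below is an assumption for illustration, not a value from the paper:

```python
def gop_loss_prob(frame_loss_p, frames_per_gop):
    """P(at least one frame of a GOP is lost) under independent frame losses."""
    return 1 - (1 - frame_loss_p) ** frames_per_gop

# With a 12-frame GOP and a 10% frame loss rate, an unprotected stream
# loses roughly 72% of its GOPs under this independence assumption.
p = gop_loss_prob(0.10, 12)
```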
Abstract: The microarray technique allows the simultaneous measurement of the expression levels of thousands of mRNAs. By mining these data one can identify the dynamics of gene expression time series. Using principal component analysis (PCA), we uncover the circadian rhythmic patterns underlying the gene expression profiles of the cyanobacterium Synechocystis. We applied PCA to reduce the dimensionality of the dataset; examination of the components also provides insight into the underlying factors measured in the experiments. Our results suggest that the entire rhythmic content of the data can be reduced to three main components.
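The dimensionality-reduction step can be sketched as PCA via SVD on simulated rhythmic profiles; the 24 h sinusoid, gene count, and noise level are illustrative assumptions, not the Synechocystis data:

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 48, 25)                       # hours, two circadian cycles
base = np.sin(2 * np.pi * t / 24)                # shared 24 h rhythm
# Simulated expression matrix: 50 genes following the rhythm with
# gene-specific amplitudes plus measurement noise
data = np.outer(rng.uniform(0.5, 2.0, 50), base) + rng.normal(0, 0.1, (50, 25))

# PCA via SVD of the mean-centered matrix
centered = data - data.mean(axis=0)
u, s, vt = np.linalg.svd(centered, full_matrices=False)
explained = s**2 / np.sum(s**2)                  # variance ratio per component
```

With a single shared rhythm the leading component dominates the variance; in real data several components are typically needed, which is consistent with the three main components reported above.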
Abstract: Fault tree analysis is a well-known method for the reliability and safety assessment of engineering systems. Over the last three decades, a number of methods for the automatic construction of fault trees have been introduced in the literature; the main difference between them is the starting model from which the tree is constructed. This paper presents a new methodology for the construction of static and dynamic fault trees from a system Simulink model. The method is introduced and explained in detail, and its correctness and completeness are experimentally validated using an example taken from the literature. Advantages of the method are also discussed.
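Independently of how a fault tree is constructed, a static tree is evaluated bottom-up through its AND/OR gates. A minimal evaluator assuming independent basic events (the gate structure and probabilities below are illustrative only):

```python
# Minimal static fault-tree evaluator: gates are AND/OR over basic-event
# probabilities, assuming independent basic events.
def evaluate(node, probs):
    kind = node[0]
    if kind == "basic":
        return probs[node[1]]
    child_p = [evaluate(c, probs) for c in node[1]]
    if kind == "and":                 # all children must fail
        p = 1.0
        for q in child_p:
            p *= q
        return p
    if kind == "or":                  # any child failing fails the gate
        p = 1.0
        for q in child_p:
            p *= (1 - q)
        return 1 - p
    raise ValueError(kind)

# Illustrative top event: (A AND B) OR C
tree = ("or", [("and", [("basic", "A"), ("basic", "B")]), ("basic", "C")])
top = evaluate(tree, {"A": 0.1, "B": 0.2, "C": 0.05})
```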
Abstract: The performance of small and medium enterprises has stagnated over the last two decades, mainly because of the emergence of HIV/AIDS. The disease has had a detrimental effect on the general economy of the country, leading to morbidity and mortality of the Kenyan workforce in its prime age. The present study sought to establish the economic impact of HIV/AIDS on micro-enterprise development in Obunga slum, Kisumu, in terms of production loss and increasing labor-related costs, and to identify possible strategies to address this impact. The study was motivated by the observation that most micro-enterprises in the slum face severe economic and social crisis due to the impact of HIV/AIDS; they become depleted and close down within a short time owing to the death of skilled and experienced workers. The study was carried out between June 2008 and June 2009 in Obunga slum. Data were subjected to computer-aided statistical analysis that included descriptive statistics, chi-squared and ANOVA techniques. Chi-squared analysis of the micro-enterprise owners' opinions on the impact of HIV/AIDS on the depletion of micro-enterprises, compared to other diseases, indicated high levels of negative effects of the disease at significance levels of P
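A chi-squared analysis of opinions like the one described can be sketched with a contingency table; the counts below are purely illustrative, not the study's data:

```python
from scipy.stats import chi2_contingency

# Illustrative contingency table of owner opinions (rows: HIV/AIDS vs other
# diseases; columns: "major cause of closure" vs "not a major cause").
table = [[60, 15],
         [20, 40]]
chi2, p_value, dof, expected = chi2_contingency(table)
# A small p_value indicates opinions differ significantly between causes
```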
Abstract: The purposes of this research were 1) to study consumer-based equity of luxury brands, 2) to study consumers' purchase intention for luxury brands, 3) to study direct factors affecting purchase intention towards luxury brands, and 4) to study indirect factors affecting purchase intention towards luxury brands through brand consciousness and brand equity. The data were analyzed with descriptive statistics and hierarchical stepwise regression analysis. The findings revealed that the eight variables of the framework (need for uniqueness, normative susceptibility, status consumption, brand consciousness, brand awareness, perceived quality, brand association, and brand loyalty) affected purchase intention towards the luxury brands (at the 0.05 significance level). Brand loyalty had the strongest direct effect, while status consumption had the strongest indirect effect on purchase intention towards luxury brands. Brand consciousness and brand equity acted as mediators of purchase intention towards the luxury brands (at the 0.05 significance level).
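Hierarchical regression assesses the incremental variance explained as predictors are entered in steps, comparing R² between nested models. A minimal numpy sketch with simulated data (the variables and effect sizes are assumptions, not the study's estimates):

```python
import numpy as np

def r_squared(x, y):
    """R^2 of an OLS fit of y on x (an intercept column is added)."""
    X = np.column_stack([np.ones(len(y)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

rng = np.random.default_rng(2)
n = 200
brand_loyalty = rng.normal(size=n)
status = rng.normal(size=n)
# Simulated purchase intention driven mostly by brand loyalty
intention = 0.8 * brand_loyalty + 0.2 * status + rng.normal(0, 0.5, n)

r2_step1 = r_squared(brand_loyalty[:, None], intention)
r2_step2 = r_squared(np.column_stack([brand_loyalty, status]), intention)
delta_r2 = r2_step2 - r2_step1   # incremental variance explained in step 2
```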
Abstract: This paper develops a GIS interface for estimating sequences of stream-flows at ungauged stations based on known flows at gauged stations. The integrated GIS interface is composed of three major steps. In the first step, statistical analysis of precipitation characteristics is used to build a multiple linear regression equation for the long-term mean daily flow at ungauged stations; the independent variables in the regression equation are mean daily flow and drainage area. Traditionally, mean flow data are generated using the Thiessen polygon method, but in the interface the user can also select methods such as kriging, IDW (inverse distance weighted) and spline interpolation. In the second step, the flow duration curve (FDC) at an ungauged station is computed from the FDCs at gauged stations, and the mean annual daily flow is computed by a spatial interpolation algorithm. The third step obtains watershed/topographic characteristics, which are among the most important factors governing stream-flows. In summary, the simulated daily flow time series are compared with the observed time series; the results from the integrated GIS interface agree closely with the observations. In addition, the relationship between the topographic/watershed characteristics and the stream-flow time series is highly correlated.
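The flow duration curve used in the second step plots sorted flows against their exceedance probability. A minimal sketch using the Weibull plotting position (the flow values are hypothetical):

```python
import numpy as np

def flow_duration_curve(flows):
    """Sorted flows versus exceedance probability (Weibull plotting position)."""
    q = np.sort(np.asarray(flows, dtype=float))[::-1]   # descending order
    n = len(q)
    exceed = np.arange(1, n + 1) / (n + 1)              # P(flow >= q)
    return exceed, q

daily_flows = [12.0, 3.5, 7.2, 1.1, 25.0, 9.8, 4.4, 2.0]  # hypothetical m^3/s
p, q = flow_duration_curve(daily_flows)
```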
Abstract: Computer-based geostatistical methods can offer effective data analysis possibilities for agricultural areas by using vectorial data and their attribute information. These methods help to detect spatial changes across different locations of large agricultural lands, which can lead to efficient fertilization for optimal yield with reduced environmental pollution. In this study, topsoil (0-20 cm) and subsoil (20-40 cm) samples were taken from a sugar beet field on a 20 x 20 m grid, and plant samples were collected from the same plots. Physical and chemical analyses of these samples were made by routine methods. According to the derived coefficients of variation, the topsoil organic matter (OM) distribution varied more than the subsoil OM distribution; the highest C.V. value, 17.79%, was found for topsoil OM. The data were analyzed comparatively with kriging methods, which are widely used in geostatistics. Several interpolation methods (ordinary, simple and universal kriging) and semivariogram models (spherical, exponential and Gaussian) were tested in order to choose the most suitable ones. The average standard deviations of the values estimated by the simple kriging interpolation method were less than the average standard deviations of the measured values (topsoil OM ± 0.48, N ± 0.37, subsoil OM ± 0.18). The most suitable combination was the simple kriging method with an exponential semivariogram model for topsoil, whereas for subsoil it was the simple kriging method with a spherical semivariogram model. The results also showed that these computer-based geostatistical methods should be tested and calibrated for different experimental conditions and semivariogram models.
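Simple kriging with an exponential semivariogram, the combination found most suitable for topsoil, can be sketched as follows; the grid coordinates, OM values, known mean, sill, and range are illustrative assumptions, not the field data:

```python
import numpy as np

def exp_semivariogram(h, sill, rng_):
    """Exponential semivariogram model with zero nugget."""
    return sill * (1.0 - np.exp(-3.0 * h / rng_))

def simple_kriging(coords, values, target, mean, sill=1.0, rng_=60.0):
    """Simple-kriging estimate at `target`, assuming a known stationary mean."""
    coords = np.asarray(coords, float)
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
    cov = sill - exp_semivariogram(d, sill, rng_)          # sample-to-sample
    d0 = np.linalg.norm(coords - np.asarray(target, float), axis=1)
    cov0 = sill - exp_semivariogram(d0, sill, rng_)        # sample-to-target
    w = np.linalg.solve(cov, cov0)                         # kriging weights
    return mean + w @ (np.asarray(values, float) - mean)

# Hypothetical 20 x 20 m grid samples of topsoil organic matter (%)
coords = [(0, 0), (20, 0), (0, 20), (20, 20)]
om = [2.1, 2.4, 2.0, 2.6]
estimate = simple_kriging(coords, om, target=(10, 10), mean=2.3)
```

Unlike ordinary kriging, simple kriging assumes the stationary mean is known, which is why the weights need not sum to one; the estimate shrinks toward that mean as the target moves away from the samples.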
Abstract: This paper aims to improve the fine lapping process of hard disk drive (HDD) lapping machines by removing material from each slider while keeping the strip height (SH) variation to a minimum. The standard deviation is the key parameter for evaluating the strip height variation, and hence it is minimized. In this paper, a design of experiments (DOE) with factorial analysis by two-way analysis of variance (ANOVA) is adopted to obtain statistical information. The statistical results reveal that the initial strip height patterns affect the final SH variation. Therefore, initial SH classification using a radial basis function neural network is implemented to achieve proportional gain prediction.
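A radial basis function network of the kind used for the SH classification can be sketched with Gaussian basis functions and least-squares output weights; the input statistic, target function, center placement, and kernel width below are illustrative assumptions, not the paper's model:

```python
import numpy as np

def rbf_design(x, centers, width):
    """Gaussian radial-basis design matrix for 1-D inputs."""
    d = np.abs(x[:, None] - centers[None, :])
    return np.exp(-(d / width) ** 2)

rng = np.random.default_rng(3)
# Hypothetical mapping from an initial strip-height pattern statistic to a
# proportional-gain value; sin(.) is a stand-in target, not the true relation.
x = rng.uniform(0, 1, 80)
y = np.sin(2 * np.pi * x)
centers = np.linspace(0, 1, 10)
phi = rbf_design(x, centers, width=0.15)
w, *_ = np.linalg.lstsq(phi, y, rcond=None)    # train output weights

x_test = np.array([0.25])
y_pred = rbf_design(x_test, centers, 0.15) @ w # predict the gain
```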