Abstract: This article proposes a new methodology for SMEs (small and medium-sized enterprises) to characterize their performance in quality, highlighting weaknesses and areas for improvement. The methodology aims to identify the principal causes of quality problems and helps to prioritize improvement initiatives. It is a self-assessment methodology intended to be easy to implement by companies with a low maturity level in quality. The methodology is organized in six steps, which include gathering information about predetermined processes and subprocesses of quality management, defined on the basis of Juran's well-known trilogy for quality management (quality planning, quality control and quality improvement), and predetermined results categories, defined on the basis of the quality concept. A set of tools for data collection and analysis, such as interviews, flowcharts, process analysis diagrams and Failure Mode and Effects Analysis (FMEA), is used. The article also presents the conclusions obtained from the application of the methodology in two case studies.
Abstract: Optical fault monitoring in an FTTH-PON using an ACS is demonstrated. The device achieves real-time fault monitoring for protection of the feeder fiber. In addition, the ACS can distinguish an optical fiber fault from the transmission services to other customers in the FTTH-PON. It is essential to use a wavelength different from the triple-play services' operating wavelengths for failure detection; the ACS uses the 1625 nm operating wavelength for monitoring and failure-detection control. Our solution works on a standard local area network (LAN) using specially designed hardware interfaced with a microcontroller with integrated Ethernet.
Abstract: This paper proposes a power-controlled scheduling scheme for devices using directional antennas in a smart home. In a home network using directional antennas, devices can transmit data concurrently in the same frequency band. Accordingly, the throughput increases compared to that of devices using omni-directional antennas, in proportion to the number of concurrent transmissions. The number of concurrent transmissions depends on the antenna beamwidth, the number of devices operating in the network, the transmission power, interference, and so on. In particular, the less transmission power is used, the more concurrent transmissions occur, owing to the smaller transmission range. In this paper, we consider a sub-optimal scheduling scheme for throughput maximization and power-consumption minimization. In the scheme, each device is equipped with a directional antenna, and various beamwidths, path-loss exponents, and antenna radiation efficiencies are considered. Numerical results show that the proposed scheme outperforms a scheduling scheme using directional antennas without power control.
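The gain from combining directional antennas with power control can be illustrated with a toy SINR calculation. The sketch below is not the paper's scheduling scheme: it assumes an idealized flat-top antenna model (mainlobe gain 2π/beamwidth with a fixed sidelobe level), and the path-loss exponent, noise power, transmit power and link geometry are all made-up values. It simply compares the SINR of one link under omni-directional and directional operation while a second link transmits concurrently.

```python
import numpy as np

ALPHA = 3.0        # path-loss exponent (assumed)
NOISE = 1e-9       # noise power in W (assumed)
BW = np.pi / 6     # 30-degree beamwidth (assumed)
SLL = 0.1          # sidelobe gain (assumed)

def gain(bw, boresight, direction):
    """Flat-top model: gain 2*pi/bw inside the mainlobe, SLL outside.
    bw = 2*pi reduces to an omni-directional antenna with unit gain."""
    off = np.angle(np.exp(1j * (direction - boresight)))
    return 2 * np.pi / bw if abs(off) <= bw / 2 else SLL

def sinr(tx, rx, others, p, bw):
    """SINR at rx; 'others' is a list of (interfering tx, its intended rx)."""
    def rx_power(t, aim):
        gt = gain(bw, np.angle(aim - t), np.angle(rx - t))   # tx gain toward rx
        gr = gain(bw, np.angle(tx - rx), np.angle(t - rx))   # rx gain toward t
        return p * gt * gr * abs(rx - t) ** (-ALPHA)
    signal = rx_power(tx, rx)
    interference = sum(rx_power(t, aim) for t, aim in others)
    return signal / (NOISE + interference)

# two parallel links in a hypothetical room layout (positions as complex numbers)
tx_a, rx_a = 0 + 0j, 1 + 0j
tx_b, rx_b = 0 + 5j, 1 + 5j
p = 1e-6
s_omni = sinr(tx_b, rx_b, [(tx_a, rx_a)], p, 2 * np.pi)  # omni-directional
s_dir = sinr(tx_b, rx_b, [(tx_a, rx_a)], p, BW)          # directional
```

Because narrow beams raise the desired-link gain while shrinking cross-link interference, the directional SINR is orders of magnitude higher at the same power, leaving headroom to reduce transmit power and admit more concurrent links, which is the intuition behind power-controlled scheduling.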
Abstract: This paper, dedicated to describing the effect of the "significant other", presents a new model of the interrelation between self-reflection, the "significant other" phenomenon, and aggression. Tendencies in the direction and type of frustration response are discussed in detail. New results were obtained through the design of an original experiment based on modifications of the "Picture-Frustration Study" test by S. Rosenzweig.
Abstract: The growing world population has fundamental, and often catastrophic, impacts on natural habitats. The indiscriminate consumption of energy, destruction of forests, and extinction of plant and animal species are consequences of this growth. Urban sustainability and sustainable urban development, so widely discussed these days, should be considered a strategy, goal, and policy that goes beyond merely considering environmental issues and protection. The desert climate has created many problems for its residents. The very hot and dry summers of the Iranian desert areas, in the past when there was no access to modern energy sources and mechanical cooling systems, led Iranian architects to design a natural ventilation system into their buildings. The structure, a tower rising above the roof, was used as a spontaneous ventilation system besides its ornamental role in giving the building a beautiful view. This paper sets out the problems of the area and their inconveniences, points out some answers to these problems, and introduces the badgir (wind-catcher) as an alternative solution, given the major role it has long played in dealing with them.
Abstract: A uniform and calm flow field is essential for a settling tank to achieve high performance. In general, recirculation zones commonly occur in sedimentation tanks, and the presence of these regions may have different effects: the non-uniformity of the velocity field, short-circuiting at the surface, and the motion of the jet at the bed of the tank caused by recirculation in the sedimentation layer are all affected by the geometry of the tank. There are ways to decrease the size of these dead zones and thereby increase performance; one of them is to use a suitable baffle configuration. In this study, the effect of a baffle at different positions is investigated by a finite volume method with the VOF (Volume of Fluid) model; the k-ε turbulence model is used in the numerical calculations. The results indicate that the best baffle position is obtained when the volume of the recirculation region is minimized or divided into smaller parts and the flow field in the settling zone tends to be uniform.
Abstract: While financial institutions have faced difficulties
over the years for a multitude of reasons, the major cause of serious
banking problems continues to be directly related to lax credit
standards for borrowers and counterparties, poor portfolio risk
management, or a lack of attention to changes in economic or other
circumstances that can lead to a deterioration in the credit standing of
a bank's counterparties. Credit risk is most simply defined as the
potential that a bank borrower or counterparty will fail to meet its
obligations in accordance with agreed terms. The goal of credit risk
management is to maximize a bank's risk-adjusted rate of return by
maintaining credit risk exposure within acceptable parameters. Banks
need to manage the credit risk inherent in the entire portfolio as well
as the risk in individual credits or transactions. Banks should also
consider the relationships between credit risk and other risks. The
effective management of credit risk is a critical component of a
comprehensive approach to risk management and essential to the
long-term success of any banking organization. In this research we also study the relationship between credit risk indices and borrowers' timely payback in Karafarin Bank.
Abstract: This paper adopts a notion of expectation-perception
gap of systems users as information systems (IS) failure. Problems
leading to the expectation-perception gap are identified and modelled
as five interrelated discrepancies or gaps throughout the process of
information systems development (ISD). It describes an empirical
study on how systems developers and users perceive the size of each
gap and the extent to which each problematic issue contributes to the
gap. The key to achieving success in ISD is to keep the expectation-perception gap closed by closing all five pertaining gaps. The gap model
suggests that most factors in IS failure are related to organizational,
cognitive and social aspects of information systems design.
Organization requirement analysis, being the weakest link of IS
development, is particularly worthy of investigation.
Abstract: Much work has been done on predicting the fault proneness of software systems. However, the severity of faults matters more than the number of faults in the developed system, since major faults matter most to a developer and need immediate attention. In this paper, we try to predict the level of impact of the existing faults in software systems. A neuro-fuzzy based predictor model is applied to NASA's public-domain defect dataset, coded in the C programming language. Correlation-based Feature Selection (CFS) evaluates the worth of a subset of attributes by considering the individual predictive ability of each feature along with the degree of redundancy between them; CFS is therefore used to select the metrics most highly correlated with the level of severity of faults. The results are compared with the prediction results of Logistic Model Trees (LMT), which were earlier reported as the best technique in [17]. The results are recorded in terms of accuracy, Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE), and show that the neuro-fuzzy based model provides relatively better prediction accuracy than the other models; hence, it can be used to model the level of impact of faults in function-based systems.
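The CFS criterion described above can be made concrete. For a feature subset S of size k, CFS scores Merit(S) = k·r̄cf / √(k + k(k−1)·r̄ff), where r̄cf is the mean feature-class correlation and r̄ff the mean feature-feature correlation, so adding a redundant feature hurts the score. The sketch below implements this merit with absolute Pearson correlations and a simple greedy forward search on hypothetical data (not the NASA dataset); all names are illustrative.

```python
import numpy as np

def abs_corr(a, b):
    """Absolute Pearson correlation between two vectors."""
    a, b = a - a.mean(), b - b.mean()
    return abs((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def cfs_merit(X, y, subset):
    """CFS merit: k * r_cf / sqrt(k + k*(k-1)*r_ff)."""
    k = len(subset)
    r_cf = np.mean([abs_corr(X[:, j], y) for j in subset])
    if k > 1:
        pairs = [abs_corr(X[:, i], X[:, j])
                 for m, i in enumerate(subset) for j in subset[m + 1:]]
        r_ff = np.mean(pairs)
    else:
        r_ff = 0.0
    return k * r_cf / np.sqrt(k + k * (k - 1) * r_ff)

def greedy_cfs(X, y):
    """Forward selection: keep adding the feature that most improves the merit."""
    remaining, selected, best = list(range(X.shape[1])), [], 0.0
    while remaining:
        merit, j = max((cfs_merit(X, y, selected + [j]), j) for j in remaining)
        if merit <= best:
            break
        best = merit
        selected.append(j)
        remaining.remove(j)
    return selected

# hypothetical metrics: feature 0 drives y, feature 1 duplicates it, feature 2 is noise
rng = np.random.default_rng(0)
x0 = rng.normal(size=200)
X = np.column_stack([x0, x0 + 0.01 * rng.normal(size=200), rng.normal(size=200)])
y = x0
```

On this toy data the greedy search selects only feature 0: the near-duplicate feature 1 raises r̄ff as much as r̄cf, so it never improves the merit.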
Abstract: Among the chemicals used for ammunition production, TNT (trinitrotoluene) has played a significant role since World Wars I and II. Various types of military weapons utilize TNT in a casting process. However, in TNT casting for warheads it is difficult to control the cooling rate of the liquid TNT, because the casting process lacks equipment to detect the temperature during the procedure. This study uses the temperature detected by an infrared camera to illustrate the cooling rate and the cooling zone during curing, and demonstrates the optimization of TNT conditions to reduce the risk of air gaps occurring in the warhead, which can result in its destruction afterward: casting defects can cause premature initiation of explosive-filled projectiles in response to set-back forces during gun firing. The study can thus help improve the TNT casting process. Operators can control the curing of TNT inside the case by raising the heating rod at the proper time. Consequently, this can save considerable rework time if air gaps occur and increase strength at a lower elastic modulus. It can therefore be clearly concluded that the use of infrared cameras in this process is another method for improving the casting procedure.
Abstract: The objective of this article is to discuss the potential
of economic analysis as a tool for identification and evaluation of
corruption in legislative acts. We propose that corruption be
perceived as a risk variable within the legislative process. Therefore
we find it appropriate to employ risk analysis methods, used in
various fields of economics, for the evaluation of corruption in
legislation. Furthermore we propose the incorporation of these
methods into the so called corruption impact assessment (CIA), the
general framework for the detection of corruption in legislative acts. The application of the risk analysis methods is demonstrated through examples of the implementation of the proposed CIA in the Czech Republic.
Abstract: Monitoring lightning electromagnetic pulses (sferics) and other terrestrial as well as extraterrestrial transient radiation signals is of considerable interest for practical and theoretical purposes in astro- and geophysics as well as meteorology. To manage a continuous flow of data, automation of the detection and classification process is important. Features based on a combination of wavelet and statistical methods proved efficient for the analysis and characterisation of transients and as input to a radial basis function network trained to discriminate transients ranging from pulse-like to wave-like.
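The abstract does not specify which wavelet or which statistics are used; a common recipe, sketched here purely as an assumption, decomposes the signal with an orthonormal wavelet (Haar below) and summarizes each subband with a few statistics (mean, standard deviation, energy), yielding a fixed-length feature vector suitable as input to a radial basis function network.

```python
import numpy as np

def haar_dwt(x):
    """One level of the orthonormal Haar wavelet transform (even-length input)."""
    a = (x[0::2] + x[1::2]) / np.sqrt(2)   # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2)   # detail (high-pass)
    return a, d

def wavelet_stat_features(x, levels=3):
    """Per-subband statistics (mean, std, energy) over a multilevel Haar decomposition."""
    feats, a = [], np.asarray(x, float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats += [d.mean(), d.std(), np.sum(d ** 2)]
    feats += [a.mean(), a.std(), np.sum(a ** 2)]
    return np.array(feats)

# a toy transient signal; real inputs would be recorded sferics waveforms
rng = np.random.default_rng(0)
x = rng.normal(size=16)
feats = wavelet_stat_features(x, levels=3)
```

Because the Haar transform is orthonormal, the subband energies in the feature vector sum exactly to the signal energy, which makes the features easy to sanity-check.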
Abstract: The research focused on the design, development and evaluation of a sustainable web-based network system to be used as an interoperable environment for university process workflow and document management. In this manner most process workflows in universities can be realized entirely electronically, promoting an integrated university. Defining the most commonly used university process workflows enabled the creation of electronic workflows and their execution on standard workflow execution engines. The definition or reengineering of workflows increased work efficiency and helped standardize processes across different faculties. The concept, the process definitions, and the solution applied in a case study are evaluated and the findings reported.
Abstract: The information revealed by derivatives can help to
better characterize digital near-end crosstalk signatures with the
ultimate goal of identifying the specific aggressor signal.
Unfortunately, derivatives tend to be very sensitive to even low
levels of noise. In this work we approximated the derivatives of both
quiet and noisy digital signals using a wavelet-based technique. The
results are presented for Gaussian digital edges, IBIS Model digital
edges, and digital edges in oscilloscope data captured from an actual
printed circuit board. Tradeoffs between accuracy and noise
immunity are presented. The results show that the wavelet technique
can produce first derivative approximations that are accurate to
within 5% or better, even under noisy conditions. The wavelet
technique can be used to calculate the derivative of a digital signal
edge when conventional methods fail.
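The paper's specific wavelet construction is not given here; one standard route (due to Mallat and Zhong) uses a wavelet that is the derivative of a smoothing function, so the wavelet transform at a given scale equals the derivative of the smoothed signal. The sketch below follows that idea in its simplest form, smoothing with a sampled Gaussian and then differentiating; the scale (sigma), noise level and test signal are assumptions, not values from the paper.

```python
import numpy as np

def smoothed_derivative(f, dt, sigma):
    """Derivative of the Gaussian-smoothed signal: equivalent to a wavelet
    transform with a derivative-of-Gaussian wavelet at scale sigma."""
    t = np.arange(-4 * sigma, 4 * sigma + dt, dt)
    g = np.exp(-t ** 2 / (2 * sigma ** 2))
    g /= g.sum()                                   # discrete weights sum to 1
    return np.gradient(np.convolve(f, g, mode="same"), dt)

# noisy test signal sin(t), whose true derivative is cos(t)
dt = 0.01
t = np.arange(0.0, 2 * np.pi, dt)
rng = np.random.default_rng(1)
noisy = np.sin(t) + 0.01 * rng.normal(size=t.size)
est = smoothed_derivative(noisy, dt, sigma=0.1)
raw = np.gradient(noisy, dt)   # unsmoothed finite difference, for contrast
```

Away from the boundaries, the smoothed estimate stays close to cos(t) while the raw finite difference amplifies the noise by roughly 1/dt, which mirrors the accuracy-versus-noise-immunity tradeoff the abstract describes.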
Abstract: To study the effects of PEG and NaCl stress on germination and early seedling stages in two corn cultivars, two separate experiments were laid out at the physiology laboratory, Faculty of Agriculture, Razi University, Kermanshah, Iran, in 2009. The investigation was performed as a factorial experiment under a Completely Randomized Design (CRD) with three replications. The cultivar factor consisted of two varieties (sweet corn SC403 and flint corn SC704), and five levels of stress were applied (0, -2, -4, -6 and -8 bar). The principal aim of the study was to compare the two maize varieties under the stress conditions. Results indicated significant decreases in germination percentage, germination rate, radicle and plumule length, and radicle and plumule dry matter. On the basis of the results, NaCl had a greater effect on germination and the early seedling stage than PEG, and sweet corn was more resistant than flint corn under both stress conditions.
Abstract: The purpose of the experiments described in this article was the comparison of an integrated fixed-film activated sludge (IFAS) system and an activated sludge (AS) system. The IFAS system used cigarette filter rods (waste filters from tobacco factories) as the biofilm carrier. The comparison with activated sludge was performed in two parallel treatment lines, and removal of organic substances, ammonia and TP was investigated over a four-month period. Synthetic wastewater was prepared from ordinary tap water with glucose as the main source of carbon and energy, plus balanced macro- and micronutrients. COD removal percentages of 94.55% and 81.62% were achieved for the IFAS and activated sludge systems, respectively. Ammonia concentration also decreased significantly with increasing HRT in both systems; average ammonia removals of 97.40% and 96.34% were achieved for the IFAS and activated sludge systems, respectively. The total phosphorus (TP) removal efficiency was 60.64% for IFAS, higher than the 56.63% of the AS process.
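The removal percentages quoted above follow the standard efficiency formula η = (C_in − C_out) / C_in × 100. A minimal sketch, using hypothetical influent and effluent concentrations rather than the study's measured values:

```python
def removal_efficiency(c_in, c_out):
    """Percent removal of a constituent: (Cin - Cout) / Cin * 100."""
    return (c_in - c_out) / c_in * 100.0

# hypothetical COD concentrations in mg/L (not the study's data)
eta = removal_efficiency(100.0, 5.45)   # about 94.55 %
```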
Abstract: Biological data has several characteristics that strongly differentiate it from typical business data: it is much more complex, usually large in size, and continuously changing. Until recently, business data was the main target for discovering trends, patterns and future expectations. However, with the recent rise of biotechnology, the powerful technology that was used for analyzing business data is now being applied to biological data. With this advanced technology at hand, the main trend in biological research is rapidly shifting from structural DNA analysis to understanding the cellular functions of DNA sequences. DNA chips are now being used to perform experiments, and DNA analysis processes are widely used by researchers. Clustering is one of the important processes for grouping similar entities together; there are many clustering algorithms, such as hierarchical clustering, self-organizing maps, and K-means clustering. In this paper, we propose a clustering algorithm that imitates an ecosystem, taking into account the features of biological data. We implemented the system using an ant-colony clustering algorithm; the system decides the number of clusters automatically. The system processes the input biological data, runs the ant-colony algorithm, draws a topic map, assigns clusters to the genes, and displays the output. We tested the algorithm with test data of 100 to 1,000 genes and 24 samples, and the promising results suggest that this algorithm is applicable to clustering DNA chip data.
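The abstract does not detail the ant-colony mechanics, but in Lumer-Faieta-style ant clustering, which the description resembles, each ant picks up or drops a data item with probabilities driven by the local average similarity f of the item to its grid neighborhood. A minimal sketch of those two probability functions, with the constants k1, k2 and the similarity scale alpha chosen as assumptions:

```python
import numpy as np

K1, K2 = 0.1, 0.15   # pick/drop constants (assumed values)

def local_density(item, neighbors, alpha=0.5):
    """Average similarity f of an item to the items currently in its neighborhood."""
    if len(neighbors) == 0:
        return 0.0
    dists = np.array([np.linalg.norm(item - n) for n in neighbors])
    return max(float(np.mean(1.0 - dists / alpha)), 0.0)

def p_pick(f):
    """An isolated item (low f) is likely to be picked up."""
    return (K1 / (K1 + f)) ** 2

def p_drop(f):
    """An item among similar neighbors (high f) is likely to be dropped."""
    return (f / (K2 + f)) ** 2
```

Iterating pick/drop moves over a grid makes similar items accumulate into heaps; the number of emergent heaps, rather than a preset parameter, gives the cluster count, matching the system's automatic choice of the number of clusters.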
Abstract: Classifier fusion may yield more accurate classification than any of the base classifiers. Fusion is often based on fixed combination rules such as the product and the average. This paper presents decision templates as a classifier fusion method for the recognition of handwritten English and Farsi numerals (1-9). The process involves extracting a feature vector on well-known image databases; the extracted feature vector is fed to the multiple-classifier fusion. A set of experiments was conducted to compare decision templates (DTs) with several combination rules. Decision templates achieved recognition rates of 97.99% and 97.28% for Farsi and English handwritten digits, respectively.
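Decision templates (in Kuncheva's formulation) store, for each class, the mean "decision profile" over that class's training samples, where a decision profile is the L×c matrix of L classifiers' soft outputs over c classes; a new sample is assigned to the class whose template its profile most resembles. A minimal numpy sketch with toy profiles for two classifiers and two classes (the numbers are illustrative, not the paper's data):

```python
import numpy as np

def decision_templates(profiles, labels, n_classes):
    """DT_j = mean decision profile (L classifiers x c classes) over class-j samples."""
    return np.array([profiles[labels == j].mean(axis=0) for j in range(n_classes)])

def classify(dp, templates):
    """Assign the class whose template is closest (squared-Euclidean similarity)."""
    sims = [1.0 - np.mean((dp - t) ** 2) for t in templates]
    return int(np.argmax(sims))

# toy training profiles: shape (samples, L=2 classifiers, c=2 classes)
profiles = np.array([[[0.90, 0.10], [0.80, 0.20]],
                     [[0.85, 0.15], [0.75, 0.25]],
                     [[0.20, 0.80], [0.10, 0.90]],
                     [[0.25, 0.75], [0.15, 0.85]]])
labels = np.array([0, 0, 1, 1])
templates = decision_templates(profiles, labels, 2)
```

Unlike fixed rules such as the product or the average, the templates are learned from training data, which is what lets them outperform those rules in the experiments above.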
Abstract: This paper presents a new fingerprint coding technique
based on contourlet transform and multistage vector quantization.
Wavelets have shown their ability in representing natural images that
contain smooth areas separated with edges. However, wavelets
cannot efficiently take advantage of the fact that the edges usually
found in fingerprints are smooth curves. This issue is addressed by
directional transforms, known as contourlets, which have the
property of preserving edges. The contourlet transform is a new
extension to the wavelet transform in two dimensions using
nonseparable and directional filter banks. The computation and
storage requirements are the major difficulty in implementing a
vector quantizer. In the full-search algorithm, the computation and
storage complexity is an exponential function of the number of bits
used in quantizing each frame of spectral information. The storage
requirement in multistage vector quantization is less when compared
to full search vector quantization. The coefficients of contourlet
transform are quantized by multistage vector quantization. The
quantized coefficients are encoded by Huffman coding. The results obtained are tabulated and compared with the existing wavelet-based ones.
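Multistage VQ keeps storage low because each stage quantizes only the residual of the previous one: two stages of 16 codewords cover 16 × 16 = 256 combinations while storing just 32 codewords, instead of the 256 a full-search quantizer of equal rate would need. A minimal sketch with small assumed codebooks (the actual codebooks would be trained, e.g. by a k-means/LBG procedure, on contourlet coefficients):

```python
import numpy as np

def vq(x, codebook):
    """Index of the nearest codeword (codebook rows are codewords)."""
    return int(np.argmin(np.sum((codebook - x) ** 2, axis=1)))

def msvq_encode(x, codebooks):
    """Each stage quantizes the residual left over by the previous stage."""
    idxs, residual = [], np.asarray(x, float)
    for cb in codebooks:
        i = vq(residual, cb)
        idxs.append(i)
        residual = residual - cb[i]
    return idxs

def msvq_decode(idxs, codebooks):
    """Reconstruction is the sum of the selected codewords."""
    return sum(cb[i] for i, cb in zip(idxs, codebooks))

# small assumed codebooks for 2-D vectors (illustrative, not trained)
cb1 = np.array([[0.0, 0.0], [4.0, 4.0], [8.0, 0.0], [0.0, 8.0]])
cb2 = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
x = np.array([4.6, 3.8])
idxs = msvq_encode(x, [cb1, cb2])
recon = msvq_decode(idxs, [cb1, cb2])
```

Because the second-stage codebook contains the zero codeword, adding a stage can never increase the reconstruction error relative to the first stage alone.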
Abstract: The driving forces of international markets change continuously; therefore companies need to gain a competitive edge in such markets, and improving the company's products, processes and practices is no longer auxiliary. Lean production is a production management philosophy that consolidates work tasks with minimum waste, resulting in improved productivity. Lean production practices can be mapped onto many production areas, one of which is Manufacturing Equipment and Technology (MET). Many lean production practices can be implemented in MET, namely: specific equipment configurations, total preventive maintenance, visual control, new equipment/technologies, production process reengineering, and a shared vision of perfection. The purpose of this paper is to investigate the implementation level of these six practices in Jordanian industries. To achieve this, a questionnaire survey was designed on a five-point Likert scale and validated through a pilot study and expert review. A sample of 350 Jordanian companies was surveyed, with a response rate of 83%; the respondents were asked to rate the extent of implementation of each practice. A conceptual relationship model is developed, hypotheses are proposed, and the essential statistical analyses are then performed. An assessment tool that enables management to monitor the progress and effectiveness of lean practice implementation is designed and presented. The results show that the average implementation level of lean practices in MET is 77%, that Jordanian companies are successfully implementing the considered lean production practices, and that the presented model has a Cronbach's alpha value of 0.87, which is good evidence of model consistency and validates the results.
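The reported Cronbach's alpha of 0.87 comes from the standard formula α = k/(k−1) · (1 − Σσᵢ²/σ_T²), where k is the number of items, σᵢ² the variance of item i, and σ_T² the variance of respondents' total scores. A minimal sketch on made-up ratings (not the survey data):

```python
import numpy as np

def cronbach_alpha(items):
    """items: respondents x items matrix of Likert scores."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)        # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1.0 - item_vars.sum() / total_var)

# made-up ratings from 4 respondents on 3 perfectly consistent items
X = np.array([[1, 1, 1],
              [2, 2, 2],
              [3, 3, 3],
              [4, 4, 4]])
```

Perfectly consistent items give α = 1; a value around 0.87, as reported, indicates high internal consistency, since 0.7 is the conventionally accepted threshold.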