Abstract: This paper presents a MATLAB-based system named Smart Access Network Testing, Analyzing and Database (SANTAD), designed for in-service transmission surveillance and self-restoration against fiber faults in fiber-to-the-home (FTTH) access networks. The developed program is installed with the optical line terminal (OLT) at the central office (CO) to monitor line status and detect any fiber fault that occurs in the FTTH network downstream from the CO towards residential customer locations. SANTAD interfaces with an optical time domain reflectometer (OTDR) to accumulate every network testing result and display it on a single computer screen for further analysis. The program identifies and presents the parameters of each optical fiber line, such as its status (working or non-working), the attenuation at each point, the failure location, and other details shown on the OTDR screen. The failure status is delivered to field engineers for prompt action, while the failed line is diverted to a protection line to keep traffic flowing continuously. This approach promises to improve the survivability and reliability of FTTH as well as its efficiency and monitoring capabilities.
Abstract: A new approach to analyzing power system failures with artificial neural networks is proposed. The possibility of simulating the phenomena that accompany system faults and restoration is examined, and it is shown that no universal model exists for simulating these phenomena across the whole analyzed range. The main classical methods for finding an optimal network structure and identifying its parameters are described briefly, and an example with calculation results is presented.
Abstract: In this paper, a subtractive-clustering-based fuzzy inference system approach is used for early detection of faults in function-oriented software systems. The approach was tested with real defect datasets from the NASA software projects PC1 and CM1. Both the code-based model and the joined model (a combination of requirement-based and code-based metrics) of the datasets were used for training and testing the proposed approach. The performance of the models is recorded in terms of accuracy, MAE, and RMSE values, and is better for the joined model. The results indicate that clustering and fuzzy logic together provide a simple yet powerful means of modeling early fault detection in function-oriented software systems.
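As an illustration of the subtractive clustering step this abstract relies on, the following is a minimal Python sketch of Chiu's subtractive clustering algorithm; the radii, threshold, and stopping rule are simplified illustrative defaults, not the paper's settings.

```python
import numpy as np

def subtractive_clustering(X, ra=0.5, rb_factor=1.5, eps=0.15):
    # Chiu's subtractive clustering: every data point is a candidate
    # centre; its "potential" measures the density of neighbours
    # within the radius ra. (Parameters here are illustrative.)
    alpha = 4.0 / ra ** 2
    beta = 4.0 / (rb_factor * ra) ** 2
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    P = np.exp(-alpha * d2).sum(axis=1)                  # initial potentials
    p_first = P.max()
    centres = []
    while True:
        i = int(P.argmax())
        if P[i] < eps * p_first:        # simplified stopping rule
            break
        centres.append(X[i])
        # subtract the chosen centre's influence so nearby points
        # are unlikely to be selected again
        P = np.clip(P - P[i] * np.exp(-beta * d2[i]), 0.0, None)
    return np.array(centres)

# two well-separated blobs should yield one centre near each
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.05, (50, 2)),
               rng.normal(1.0, 0.05, (50, 2))])
centres = subtractive_clustering(X)
```

The centres found this way would then seed the rules of a fuzzy inference system, one rule per cluster.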
Abstract: This paper presents a new approach for centralized monitoring of, and self-protection against, fiber faults in a fiber-to-the-home (FTTH) access network using Smart Access Network Testing, Analyzing and Database (SANTAD). SANTAD is installed with the optical line terminal (OLT) at the central office (CO) for in-service transmission surveillance and fiber fault localization within FTTH with a point-to-multipoint (P2MP) configuration, downstream from the CO towards customer residential locations, based on the graphical user interface (GUI) processing capabilities of MATLAB. SANTAD can detect any fiber fault and identify the failure location in the network. It displays the status of every connected optical network unit (ONU) line on one screen, with the capability to measure attenuation and detect failures simultaneously. The analysis results and information are delivered to field engineers for prompt action, while the failed line is diverted to a protection line to keep traffic flowing continuously. This approach promises to improve the survivability and reliability of FTTH as well as its efficiency and monitoring capabilities.
Abstract: The wavelet transform has been used extensively in machine fault diagnosis and prognosis owing to its strength in dealing with non-stationary signals. Existing wavelet-transform-based schemes for fault diagnosis apply wavelet decomposition to the entire vibration frequency band, which not only involves huge computational overhead in extracting the features but also increases the dimensionality of the feature vector. This increase in dimensionality tends to over-fit the training data and can mislead the fault diagnostic model. In this paper a novel technique, the envelope wavelet packet transform (EWPT), is proposed, in which features are extracted from the wavelet packet transform of the filtered envelope signal rather than the overall vibration signal. This not only reduces computational overhead, through fewer wavelet decomposition levels and features, but also improves fault detection accuracy. Analytical expressions are provided for selecting the optimal frequency resolution and decomposition level in EWPT. Experimental results with both actual and simulated machine fault data demonstrate a significant gain in fault detection ability by EWPT at reduced complexity compared to existing techniques.
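The envelope-analysis idea that EWPT builds on can be sketched in a few lines: extract the signal envelope and read the fault frequency from its spectrum. The Python sketch below uses a Hilbert-transform envelope and omits the paper's wavelet packet stage; all signal parameters are illustrative.

```python
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                      # sampling rate (Hz), illustrative
t = np.arange(0, 1.0, 1 / fs)
# a 200 Hz carrier amplitude-modulated at 10 Hz mimics a machine
# resonance excited by a periodic bearing-type fault
x = (1.0 + 0.5 * np.cos(2 * np.pi * 10 * t)) * np.sin(2 * np.pi * 200 * t)

envelope = np.abs(hilbert(x))    # analytic-signal envelope
env = envelope - envelope.mean() # remove DC before taking the spectrum

spec = np.abs(np.fft.rfft(env))
freqs = np.fft.rfftfreq(len(env), 1 / fs)
fault_freq = float(freqs[np.argmax(spec)])  # dominant envelope frequency
```

In EWPT the wavelet packet features would be computed from this envelope (a short, low-frequency signal) rather than from the raw vibration record, which is where the reduction in decomposition levels comes from.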
Abstract: In order to implement flexibility as well as survivable capacity over a passive optical network (PON), a new automatic random fault-recovery mechanism with an array-waveguide-grating-based (AWG-based) optical switch (OSW) is presented. First, a wavelength-division-multiplexing and optical code-division multiple-access (WDM/OCDMA) scheme is configured to meet the requirements of the various geographical locations between the optical network units (ONUs) and the optical line terminal (OLT). The AWG-based optical switch is designed as a central star-mesh topology to reduce duplicated redundant elements such as fibers and transceivers. With a simple monitoring and routing-switch algorithm, random fault-recovery capacity is achieved over the bidirectional (up/downstream) WDM/OCDMA scheme. When a distribution fiber (DF) fails or the bit-error rate (BER) exceeds the 10^-9 requirement, the primary/slave AWG-based OSWs are adjusted and controlled dynamically to restore the affected ONU groups immediately via the other working DFs.
Abstract: This paper presents a computational methodology based on matrix operations for a computer-based solution to the problem of performance analysis of software reliability models (SRMs). A set of seven comparison criteria has been formulated to rank the various non-homogeneous Poisson process software reliability models proposed during the past 30 years for estimating software reliability measures such as the number of remaining faults, the software failure rate, and software reliability itself. Selecting the optimal SRM for a particular case has long been an area of interest for researchers in the field of software reliability. The model selection tools and techniques found in the literature cannot be used with a high level of confidence because they rely on a limited number of selection criteria. A real data set from a middle-sized software project, taken from published papers, is used to demonstrate the matrix method. The outcome of this study is a ranking of SRMs based on the permanent of the criteria matrix formed for each model from the comparison criteria; the model with the highest permanent is ranked number 1, and so on.
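The matrix permanent on which the ranking rests can be computed with Ryser's inclusion-exclusion formula, per(M) = (-1)^n Σ_S (-1)^|S| Π_i Σ_{j∈S} m_ij over non-empty column subsets S. A minimal Python sketch follows; the example matrices are illustrative, not the paper's criteria matrices.

```python
from itertools import combinations

def permanent(M):
    # Ryser's formula: O(2^n * n^2), far better than the n! expansion
    # but still exponential (the permanent is #P-hard in general).
    n = len(M)
    total = 0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in M:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

# per([[1,2],[3,4]]) = 1*4 + 2*3 = 10 (like the determinant but with
# all signs positive)
print(permanent([[1, 2], [3, 4]]))
```

Each model's criteria matrix would be fed through such a routine, and the resulting permanents sorted to produce the ranking.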
Abstract: Transaction management is one of the most crucial requirements for enterprise application development, which often requires concurrent access to distributed data shared among multiple applications/nodes. Transactions guarantee the consistency of data records when multiple users or processes perform concurrent operations. The existing Fault Tolerance Infrastructure for Mobile Agents (FTIMA) provides fault-tolerant behavior in distributed transactions and uses a multi-agent system for distributed transaction processing. In the existing FTIMA architecture, data flowing through the network may contain personal, private, or confidential information, and in banking transactions a minor change to a transaction can cause a great loss to the user. In this paper we modify the FTIMA architecture to ensure that the user's request reaches the destination server securely and without alteration. We use triple DES for encryption/decryption and the MD5 algorithm to verify message integrity.
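The MD5 validity check described can be sketched with Python's standard library alone; the triple DES layer is omitted here because it requires a third-party crypto library, and the message and function names are illustrative, not FTIMA's actual interfaces.

```python
import hashlib

def attach_digest(message):
    # sender side: an MD5 digest travels alongside the message
    return message, hashlib.md5(message).hexdigest()

def verify_digest(message, digest):
    # receiver side: recompute and compare; any change in transit
    # changes the digest (note MD5 is no longer collision-resistant,
    # so a modern design would use an HMAC or SHA-2 here)
    return hashlib.md5(message).hexdigest() == digest

msg, tag = attach_digest(b"transfer 100 to account 42")
ok = verify_digest(msg, tag)                            # untampered
tampered = verify_digest(b"transfer 900 to account 42", tag)
```

In the modified architecture this digest would be computed over the triple-DES-encrypted payload before the agent forwards it.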
Abstract: Medical negligence disputes in Malaysia are mainly resolved through litigation under the tort system. The tort system, being adversarial in nature, has subjected parties to litigation hazards such as delay, excessive costs, and uncertainty of outcome. Dissatisfaction with the tort system's compensation of medically injured victims has created various alternatives to litigation. Among them is the implementation of a no-fault compensation system, which would allow compensation to be given without the need to prove fault on the part of the medical personnel; instead, the community bears the burden of compensating, which ultimately promotes collective responsibility. For Malaysia, introducing a no-fault system would provide a tempting solution and may ultimately achieve justice for medically injured victims. Nevertheless, such a drastic change requires a great deal of consideration to determine the suitability of the system and whether it will eventually cater to the needs of the Malaysian population.
Abstract: Protective relays are components of the protection system in the power system domain that provide the decision-making element for correct protection and fault-clearing operations. Failure of protection devices may reduce the integrity and reliability of power system protection and affect the overall performance of the power system. It is therefore imperative for power utilities to assess the reliability of protective relays to ensure they perform their intended function without failure. This paper discusses the application of reliability analysis using a statistical method called Life Data Analysis at the Transmission Division of Tenaga Nasional Berhad (TNB), a government-linked power utility company in Malaysia, to assess and evaluate the reliability of numerical overcurrent protective relays from two different manufacturers.
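Life Data Analysis of relays typically means fitting a Weibull model to observed times-to-failure and reading reliability off the fitted curve. A hedged Python sketch with synthetic data follows; the shape, scale, and sample values are illustrative, not TNB's relay data.

```python
import numpy as np
from scipy.stats import weibull_min

# synthetic times-to-failure (hours) for one relay population;
# true shape beta = 2 (wear-out behaviour) and scale eta = 100
# are illustrative choices
failures = weibull_min.rvs(2.0, scale=100.0, size=500, random_state=1)

# fit a 2-parameter Weibull (location fixed at 0), the usual
# life-data-analysis model for repair/replace decisions
beta, loc, eta = weibull_min.fit(failures, floc=0)

# reliability at t = 50 h: R(t) = exp(-(t/eta)^beta)
R50 = float(np.exp(-(50.0 / eta) ** beta))
```

A shape parameter above 1 indicates wear-out failures, below 1 infant mortality; comparing fitted curves per manufacturer is what the assessment in the paper amounts to.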
Abstract: Directional overcurrent relays (DOCRs) are commonly used in power system protection, as primary protection in distribution and sub-transmission systems and as secondary protection in transmission systems. Coordination of protective relays is necessary to obtain selective tripping. In this paper, an approach for reducing the computational effort of the nonlinear optimum coordination (OC) of DOCRs is proposed. This is achieved by modifying the objective function and relaxing several constraints according to a four-way classification: non-valid, redundant, pre-obtained, and valid constraints. Based on this classification, the effect of far-end faults on the objective function, the constraints, and consequently the relay operating times was studied. The study was carried out first by taking both near-end and far-end faults into account in the DOCR coordination problem formulation, and then by considering only faults very close to the primary relays (near-end faults). The optimal coordination was achieved by simultaneously optimizing all variables (TDS and Ip) in a nonlinear environment using genetic-algorithm nonlinear programming techniques. Applying the two approaches to 6-bus and 26-bus systems verifies that the treatment of far-end faults in the OC problem formulation does not lose optimality.
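The quantities being coordinated here are inverse-time relay operating times. The following Python sketch evaluates the IEC standard-inverse characteristic and checks one primary/backup coordination-time-interval (CTI) constraint; the settings and fault current are illustrative, and the paper's genetic-algorithm optimization is not reproduced.

```python
def op_time(tds, i_fault, i_pickup):
    # IEC standard-inverse overcurrent characteristic:
    # t = TDS * 0.14 / ((I / Ip)^0.02 - 1)
    return tds * 0.14 / ((i_fault / i_pickup) ** 0.02 - 1.0)

# illustrative settings: primary relay and its backup see the same
# near-end fault current; the backup must operate at least CTI
# seconds later for selective tripping
CTI = 0.3
i_fault = 2000.0
t_primary = op_time(0.1, i_fault, i_pickup=400.0)
t_backup = op_time(0.3, i_fault, i_pickup=500.0)
coordinated = (t_backup - t_primary) >= CTI
```

Each near-end or far-end fault contributes one such constraint, and the optimization minimizes the sum of primary operating times over the TDS and Ip variables subject to them; the paper's classification prunes the constraints that can never bind.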
Abstract: In this paper, we use a nonlinear system identification method to predict and detect process faults in a cement rotary kiln. After selecting proper inputs and outputs, an input-output model is identified for the plant. To capture the various operating points of the kiln, a Locally Linear Neuro-Fuzzy (LLNF) model is used, trained with the LOLIMOT algorithm, an incremental tree-structured algorithm. Using this method, we obtained three distinct models for the normal and faulty situations in the kiln: one for the normal condition, with a 15-minute prediction horizon, and two for the faulty situations, with 7-minute prediction horizons. Finally, we detect these faults in validation data. Data collected from the White Saveh Cement Company are used in this study.
Abstract: Since the majority of faults are found in only a few modules of a system, there is a need to identify the modules that are affected more severely than others so that proper maintenance can be performed on time, especially for critical applications. In this paper, we apply different predictor models to NASA's public-domain defect dataset, coded in the Perl programming language. Different machine learning algorithms from the various learner categories of the WEKA project, together with a Mamdani-based fuzzy inference system and a neuro-fuzzy system, have been evaluated for modeling maintenance severity, i.e., the impact of fault severity. The results are recorded in terms of accuracy, mean absolute error (MAE), and root mean squared error (RMSE). They show that the neuro-fuzzy model provides relatively better prediction accuracy than the other models and hence can be used for predicting the maintenance severity of software.
Abstract: Distance protection of transmission lines that include advanced flexible AC transmission system (FACTS) devices has been a very challenging task. The FACTS devices of interest in this paper are the static synchronous series compensator (SSSC) and the unified power flow controller (UPFC). A new algorithm is proposed to detect and classify faults and to identify the fault position in a transmission line relative to a FACTS device placed at the midpoint of the line. Discrete wavelet transformation and wavelet entropy calculations are used to analyze the during-fault current and voltage signals of the compensated transmission line. The proposed algorithm is simple and accurate in fault detection and classification, and a variety of fault cases and simulation results are presented to show its effectiveness.
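Wavelet entropy, the core feature in this kind of algorithm, is the Shannon entropy of the relative energy carried by each wavelet band: low for signals concentrated in one band, high for broadband (fault-like) signals. A minimal Python sketch follows, using a plain Haar transform rather than the paper's exact wavelet and settings.

```python
import numpy as np

def haar_dwt_levels(x, levels):
    # plain Haar DWT: detail coefficients per level plus the final
    # approximation (a stand-in for the DWT used in the paper)
    bands = []
    a = np.asarray(x, dtype=float)
    for _ in range(levels):
        a = a[: len(a) - len(a) % 2]
        d = (a[0::2] - a[1::2]) / np.sqrt(2)
        a = (a[0::2] + a[1::2]) / np.sqrt(2)
        bands.append(d)
    bands.append(a)
    return bands

def wavelet_entropy(x, levels=4):
    # Shannon entropy of the relative energy per wavelet band
    bands = haar_dwt_levels(x, levels)
    e = np.array([np.sum(b ** 2) for b in bands])
    p = e / e.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(0)
h_noise = wavelet_entropy(rng.standard_normal(1024))   # broadband signal
h_trend = wavelet_entropy(np.linspace(0, 1, 1024))     # smooth ramp
```

A fault transient spreads energy across bands and raises the entropy of the current and voltage signals, which is what the detection and classification logic keys on.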
Abstract: Expert witness testimony (EWT) in battered woman syndrome (BWS) cases is information given by an expert specialized in the field to the jury in order to help the court better understand the case. EWT does not always work in favor of battered women. Two main decision-making models are discussed in the paper: the mathematical model and the explanation model. In the first, jurors calculate "the importance and strength of each piece of evidence", whereas in the second they try to integrate the EWT with the evidence and create a coherent story that would describe the crime. The jury often misunderstands and misjudges battered women for their action (or, in this case, inaction), assuming that these women are masochists who accept being mistreated, since if a man abuses a woman constantly, she could and should divorce him or simply leave at any time. Research in the domain has found that expert witness testimony does have a powerful influence on jurors' decisions, so its quality needs to be explored further. One important factor that needs further study is a bias called the dispositionist worldview (the belief that what happens to people is of their own doing). This kind of attributional bias is a tendency to think that a person's behavior is due to his or her disposition, even when the behavior is clearly attributable to the situation. Hypothesis: if a juror has a dispositionist worldview, then he or she will blame the rape victim for triggering the assault; the juror thereby commits the fundamental attribution error, believing that the victim's disposition, and not the situation she was in, caused the rape. Methods: the subjects were 500 randomly sampled undergraduate students from McGill, Concordia, Université de Montréal, and UQAM. Dispositionist worldview was scored on the Dispositionist Worldview Questionnaire. After reading the rape scenarios, each student was asked to play the role of a juror and answer a questionnaire of 7 questions about the responsibility, causality, and fault of the victim. Results: the results confirm the hypothesis; jurors with a dispositionist worldview blamed the rape victim for triggering the assault, thereby committing the fundamental attribution error in believing that the victim's disposition, and not the constraints or opportunities of the situation, caused the rape.
Abstract: It is estimated that abnormal conditions cost US process industries around $20 billion in annual losses. The hydrotreatment (HDT) of diesel fuel in petroleum refineries is a conversion process with highly profitable economic returns. However, the process is difficult to control because it operates continuously at high hydrogen pressures and is subject to disturbances in feed properties and catalyst performance, so automatic fault detection and diagnosis play an important role in this context. In this work, a hybrid approach based on neural networks together with a post-processing classification algorithm is used to detect faults in a simulated HDT unit. Nine classes (8 faults and normal operation) were correctly classified with the proposed approach within a maximum time of 5 minutes, based on online process measurements.
Abstract: The reliability of distributed systems and computer networks has been modeled with a probabilistic network or graph G. Computing the residual connectedness reliability (RCR) under the node fault model, denoted R(G), is very useful but is an NP-hard problem. Since computing the exact value of R(G) may require time exponential in the network size, it is important to calculate a tight approximation, especially a lower bound, at moderate computational cost. In this paper, we propose an efficient algorithm for a reliability lower bound of distributed systems with unreliable nodes. We also apply the algorithm to several typical classes of networks to evaluate the lower bounds and show its effectiveness.
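For very small graphs, R(G) can be computed exactly by enumerating node states, which makes concrete why scalable lower bounds are needed. The Python sketch below follows the usual definition under the node fault model (the residual graph induced by surviving nodes must be connected, with empty and single-node residuals counted as connected); the three-node path example is illustrative.

```python
from itertools import combinations

def residual_connectedness_reliability(nodes, edges, p):
    # Exact R(G): probability that the subgraph induced by the
    # surviving nodes is connected. Enumeration over all 2^|V| node
    # states is exponential, hence infeasible beyond toy graphs.
    def connected(up):
        if len(up) <= 1:
            return True
        up = set(up)
        start = next(iter(up))
        seen, stack = {start}, [start]
        while stack:            # DFS over the induced subgraph
            v = stack.pop()
            for a, b in edges:
                for u, w in ((a, b), (b, a)):
                    if u == v and w in up and w not in seen:
                        seen.add(w)
                        stack.append(w)
        return seen == up

    r = 0.0
    for k in range(len(nodes) + 1):
        for up in combinations(nodes, k):
            if connected(up):
                prob = 1.0
                for v in nodes:
                    prob *= p if v in up else 1.0 - p
                r += prob
    return r

# path a-b-c with node reliability 0.9: only the state "a and c up,
# b down" leaves a disconnected residual graph,
# so R = 1 - 0.9 * 0.9 * 0.1 = 0.919
r_path = residual_connectedness_reliability("abc", [("a", "b"), ("b", "c")], 0.9)
```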
Abstract: A filter is used to remove undesirable frequency information from a dynamic signal. This paper shows that the Z-notch filtering technique can be applied to remove noise from a machining signal. In machining, the noise components were identified from the sound produced by the operation of the machine's own components, such as the hydraulic system and motor, and by the machine environment. By correlating the noise components with the measured machining signal, the components of interest, those least interfered with by noise, can be extracted, so the filtered signal is more reliable for analysis than the unfiltered signal. Significantly, the I-kaz method, which comprises a three-dimensional graphical representation and the I-kaz coefficient Z∞, could differentiate between the filtered and unfiltered signals: a larger scattering space and a higher value of Z∞ indicate that the signal was heavily corrupted by noise. The method can therefore be used as a proactive tool for evaluating the noise content of a signal, an evaluation that is as important as the elimination itself, especially for machining fault diagnosis. The Z-notch filtering technique reliably extracted the noise components from the measured machining signal with high efficiency: even when the measured signal was exposed to heavy noise, the signal generated by the interaction between the cutting tool and the workpiece could still be acquired. The noise that would otherwise distort the original signal features and degrade the useful sensory information can thus be eliminated.
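The general idea of notching out a known noise line from a machining signal can be illustrated with a generic IIR notch filter; this Python sketch uses SciPy's iirnotch rather than the paper's Z-notch/I-kaz implementation, and all frequencies are illustrative.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt

fs = 1000.0                        # sampling rate (Hz), illustrative
t = np.arange(0, 1.0, 1 / fs)
# a 10 Hz "machining" component plus a 50 Hz noise line, standing in
# for, say, hydraulic-system or mains interference
x = np.sin(2 * np.pi * 10 * t) + 0.8 * np.sin(2 * np.pi * 50 * t)

b, a = iirnotch(50.0, 30.0, fs=fs)  # narrow notch centred at 50 Hz
y = filtfilt(b, a, x)               # zero-phase filtering

def amp(sig, f):
    # amplitude of the f-Hz Fourier component of sig
    spec = np.abs(np.fft.rfft(sig)) / len(sig) * 2
    return spec[int(f * len(sig) / fs)]
```

After filtering, the 50 Hz line is strongly attenuated while the 10 Hz component of interest passes essentially unchanged, which is the behaviour the paper then quantifies with the I-kaz coefficient.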
Abstract: Detecting incipient abnormal events is important for improving the safety and reliability of machine operations and for reducing losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines, so constructing prediction models for faulty conditions is essential in deciding when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. The approach uses genetic-algorithm (GA) based variable selection, and we evaluate the predictive performance of several prediction methods on real data. The results show that a calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme in the calibration models, prediction performance can be improved by excluding non-informative variables from the model building steps.
Abstract: Short-circuit currents play a vital role in the design and operation of equipment and power systems, and they cannot be avoided despite careful planning and design, good maintenance, and thorough operation of the system. This paper briefly discusses the short-circuit analysis conducted at KSO, covering its significance, methods, and results. A sample result of the analysis, based on a single transformer, is detailed, and the results and their significance are discussed and commented upon.
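The single-transformer case the abstract mentions reduces to a standard calculation: with an infinite bus on the primary side, the three-phase bolted-fault current at the secondary is the rated current divided by the per-unit impedance. The Python sketch below uses illustrative ratings, not the KSO system data.

```python
import math

def transformer_fault_current(s_kva, v_ll, z_pu):
    # I_sc = I_rated / Z_pu, with I_rated = S / (sqrt(3) * V_LL),
    # assuming an infinite bus (zero source impedance) upstream;
    # ratings here are illustrative, not the KSO data
    i_rated = s_kva * 1e3 / (math.sqrt(3) * v_ll)
    return i_rated / z_pu

# 1000 kVA, 415 V transformer with 5 % impedance
i_sc = transformer_fault_current(1000, 415, 0.05)   # roughly 27.8 kA
```

Switchgear on that bus would then need a breaking capacity above this value; including the real source impedance in the study lowers the figure somewhat.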