Abstract: Spare parts inventory management is one of the major
areas of inventory research. Analysis of recent literature showed that
an approach integrating spare parts classification, demand
forecasting, and stock control policies is essential; however, adoption
of this integrated approach remains limited. This work presents an integrated
framework for spare parts inventory management and an Excel-based
application developed for the implementation of the proposed
framework. A multi-criteria analysis has been used for spare parts
classification. Forecasting of spare parts' intermittent demand has
been incorporated into the application using three
forecasting models: the normal distribution, exponential
smoothing, and the Croston method. The application is also capable of
running with different inventory control policies. To illustrate the
performance of the proposed framework and the developed
application, the framework is applied to different items at a service
organization. The results achieved are presented and possible areas
for future work are highlighted.
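The Croston method cited above handles intermittent demand by smoothing nonzero demand sizes and inter-demand intervals separately, forecasting their ratio. A minimal sketch of the textbook procedure follows (illustrative only; the paper's Excel implementation and parameter choices are not shown in the abstract):

```python
def croston(demand, alpha=0.1):
    """Croston's method for intermittent demand: exponentially smooth
    nonzero demand sizes and inter-demand intervals separately;
    forecast = smoothed size / smoothed interval (demand per period)."""
    # Initialise with the first nonzero demand observation.
    first = next((i for i, d in enumerate(demand) if d > 0), None)
    if first is None:
        return 0.0                 # no demand observed at all
    size = float(demand[first])    # smoothed demand size
    interval = float(first + 1)    # smoothed inter-demand interval
    periods = 1                    # periods since the last demand
    for d in demand[first + 1:]:
        if d > 0:
            size = alpha * d + (1 - alpha) * size
            interval = alpha * periods + (1 - alpha) * interval
            periods = 1
        else:
            periods += 1
    return size / interval
```

With a smooth (non-intermittent) series the method reduces to simple exponential smoothing of demand per period, since every interval equals one.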
Abstract: The ability of the brain to organize information and generate the functional structures we use to act, think, and communicate is a common and easily observable natural phenomenon. In object-oriented analysis, these structures are represented by objects. Objects have been extensively studied and documented, but the process that creates them is not understood. In this work, a new class of discrete, deterministic, dissipative, host-guest dynamical systems is introduced. The new systems have extraordinary self-organizing properties. They can host information representing other physical systems and generate the same functional structures as the brain does. A simple mathematical model is proposed. The new systems are easy to simulate by computer, and the measurements needed to confirm the assumptions are abundant and readily available. Experimental results presented here confirm these findings. Applications are many, but among the most immediate are object-oriented engineering, image and voice recognition, search engines, and neuroscience.
Abstract: The photocatalytic efficiency of TiO2 for the degradation of toluene in a photoreactor can be enhanced by a nano-TiO2/LDPE composite film. Since the amount of TiO2 affects the photocatalytic efficiency, this work concentrated on embedding a high amount of TiO2 in the polyethylene matrix. The developed photocatalyst was characterized by XRD, UV-Vis spectrophotometry, and SEM. The SEM images revealed highly homogeneous deposition of TiO2 on the polyethylene matrix. The XRD patterns indicated that the TiO2 embedded in the PE matrix was mainly in the anatase form. In addition, the photocatalytic results show that toluene removal efficiencies of 30±5%, 49±4%, 68±5%, 42±6% and 33±5% were obtained at catalyst loadings of 0%, 10%, 15%, 25% and 50% (wt. cat./wt. film), respectively.
Abstract: As test costs in today's semiconductor industry can make up to 50 percent of total production costs, efficient test error detection is becoming more and more important. In this paper, we present a new machine learning approach to test error detection that should provide faster recognition of test system faults as well as improved test error recall. The key idea is to learn a classifier ensemble that detects typical test error patterns in wafer test results immediately after these tests finish. Since test error detection has not yet been discussed in the machine learning community, we define the central problem-relevant terms and provide an analysis of important domain properties. Finally, we present comparative studies reflecting the failure detection performance of three individual classifiers and three ensemble methods based upon them. As base classifiers we chose a decision tree learner, a support vector machine, and a Bayesian network, while the compared ensemble methods were simple and weighted majority vote as well as stacking. For the evaluation, we used cross-validation and a specially designed practical simulation. By implementing our approach in a semiconductor test department for the observation of two products, we demonstrated its practical applicability.
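The simple and weighted majority-vote combiners compared above can be sketched as follows; this is an illustrative implementation only, and the labels and weights shown are hypothetical rather than taken from the paper:

```python
from collections import defaultdict

def weighted_majority_vote(predictions, weights):
    """Combine base-classifier labels by weighted vote; with equal
    weights this reduces to a simple majority vote.
    predictions: one predicted label per base classifier.
    weights: matching per-classifier weights (e.g., validation accuracy)."""
    scores = defaultdict(float)
    for label, weight in zip(predictions, weights):
        scores[label] += weight
    # Return the label with the highest accumulated weight.
    return max(scores, key=scores.get)
```

For example, three base classifiers voting ["fail", "pass", "fail"] with weights [0.5, 0.9, 0.3] yield "pass" under the weighted scheme but "fail" under a simple (equal-weight) majority.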
Abstract: The introduction of sowing technologies into minimum- or no-tillage soil has a number of economic and environmental virtues, such as improving soil properties, decreasing soil erosion and degradation, and saving working time and fuel. However, the main disadvantage of these technologies is that plant residues on the soil surface reduce the quality of crop seed planting, thus requiring plant residues to be removed or cut. This paper presents an analysis of disc coulter parameters and an experimental investigation of cutting spring barley straw containing various amounts of moisture with different disc coulters (smooth and notched).
Abstract: In this study, a Multi-Layer Perceptron (MLP) with the back-propagation learning algorithm is used for the effective diagnosis of Parkinson's disease (PD), a challenging problem for the medical community. Typically characterized by tremor, PD occurs due to the loss of dopamine in the brain's thalamic region, which results in involuntary or oscillatory movement in the body. A feature selection algorithm is applied along with biomedical test values to diagnose the disease. Clinical diagnosis relies mostly on a doctor's expertise and experience, yet cases of wrong diagnosis and treatment are still reported. Patients are asked to take a number of tests for diagnosis, and in many cases not all the tests contribute towards effective diagnosis of the disease. Our work classifies the presence of Parkinson's disease with a reduced number of attributes. Originally, 22 attributes are involved in the classification. We use Information Gain to select the attributes, reducing the number of measurements that need to be taken from patients, and an artificial neural network to classify the diagnosis. The twenty-two attributes are reduced to sixteen. The accuracy is 82.051% on the training data set and 83.333% on the validation data set.
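The Information Gain criterion used above for attribute reduction is the reduction in label entropy achieved by splitting on a feature: IG(S, A) = H(S) - Σ_v |S_v|/|S| · H(S_v). A minimal sketch under this standard definition (illustrative; the paper's dataset and thresholds are not reproduced here):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(S) of a list of class labels, in bits."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """IG(S, A) = H(S) - sum over feature values v of |S_v|/|S| * H(S_v)."""
    n = len(labels)
    remainder = 0.0
    for v in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == v]
        remainder += len(subset) / n * entropy(subset)
    return entropy(labels) - remainder
```

Ranking the 22 attributes by this score and keeping the top 16 is one way to realize the reduction described in the abstract.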
Abstract: The flow field over a three-dimensional pole barn
characterized by a cylindrical roof has been numerically investigated.
Wind pressure and viscous loads acting on the agricultural building
have been analyzed for several incoming wind directions, so as to
evaluate the most critical load condition on the structure. A constant
wind velocity profile, based on the maximum reference wind speed in
the building site (peak gust speed worked out for a 50-year return
period) and on the local roughness coefficient, has been simulated.
In order to also account for the hazard posed by potential air
wedging between the stored hay and the lower part of the ceiling, the
effect of a partial filling of the barn has been investigated.
The distribution of wind-induced loads on the structure has been
determined, allowing a numerical quantification of the effect of wind
direction on the induced stresses acting on a hemicylindrical roof.
Abstract: True integration of multimedia services over wired or
wireless networks increases the productivity and effectiveness of
today's networks. The IP Multimedia Subsystem (IMS) is a Next Generation
Network architecture that provides multimedia services over fixed
or mobile networks. This paper proposes an extended SIP-based QoS
Management architecture for IMS services over underlying IP access
networks. To guarantee the end-to-end QoS for IMS services in
interconnection backbones, SIP-based proxy modules are introduced
to support QoS provisioning and to reduce the handoff disruption
time over IP access networks. In our approach, these SIP modules
implement the combination of Diffserv and MPLS QoS mechanisms
to assure the guaranteed QoS for real-time multimedia services. To
guarantee QoS over access networks, SIP modules make QoS
resource reservations in advance to provide the best QoS to IMS users
over heterogeneous networks. To obtain more reliable multimedia
services, our approach allows the use of the SCTP protocol instead of
UDP as the transport for SIP, owing to its multi-streaming feature. This architecture
enables QoS provisioning for IMS roaming users to differentiate IMS
network from other common IP networks for the transmission of real-time
multimedia services. To validate our approach, simulation
models are developed on a small-scale basis. The results show that our
approach yields comparable performance for efficient delivery of
IMS services over heterogeneous IP access networks.
Abstract: Testing is an activity required in both the development
and maintenance phases of the software development life cycle,
in which integration testing (IT) is an important activity. Integration
testing is based on the specification and functionality of the software
and thus can be called a black-box testing technique. The purpose of
integration testing is to test the integration between software
components. In function or system testing, the concern is with overall
behavior and whether the software meets its functional specifications
or performance characteristics or how well the software and
hardware work together. This explains the importance and necessity
of IT, in which the emphasis is on interactions between modules and
their interfaces. Software errors should be discovered early during
IT to reduce the cost of correction. This paper introduces a new type
of integration error and presents an overview of integration testing
techniques, comparing the techniques and identifying
which technique detects what type of error.
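As an illustration of testing the interaction between components through their interfaces, the following sketch exercises two hypothetical modules together rather than in isolation (all names here are invented for illustration, not taken from the paper):

```python
import unittest

# Two illustrative components: a parser and a pricing function that
# must cooperate through the order dictionary they exchange.
def parse_order(text):
    """Parse 'item, qty' into a structured order."""
    item, qty = text.split(",")
    return {"item": item.strip(), "qty": int(qty)}

def total_price(order, price_list):
    """Price an order produced by parse_order."""
    return price_list[order["item"]] * order["qty"]

class OrderIntegrationTest(unittest.TestCase):
    """Black-box integration test: checks the interface between
    parse_order and total_price, not each unit in isolation."""
    def test_parse_then_price(self):
        order = parse_order("widget, 3")
        self.assertEqual(total_price(order, {"widget": 2.5}), 7.5)
```

An integration error of the kind the paper discusses would surface here if, say, the parser emitted a key the pricing function does not expect, even though each unit passes its own tests.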
Abstract: In this paper, we investigate the appearance of the giant component in random subgraphs G(p) of a given large finite graph family Gn = (Vn, En), in which each edge is present independently with probability p. We show that if the graph Gn satisfies a weak isoperimetric inequality and has bounded degree, then the probability p at which G(p) has a giant component of linear order with some constant probability is bounded away from zero and one. In addition, we prove that the probability of an abnormally large giant component decays exponentially. When a contact graph is modeled as Gn, our result is of special interest in the study of the spread of infectious diseases and the identification of communities in various social networks.
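The bond-percolation model G(p) described above, where each edge of Gn is kept independently with probability p, can be simulated directly by measuring the relative size of the largest connected component. A small illustrative sketch using union-find (this is a simulation aid, not the paper's proof apparatus):

```python
import random

def giant_component_fraction(edges, n, p, seed=0):
    """Keep each edge of an n-vertex graph independently with
    probability p and return |largest component| / n."""
    rng = random.Random(seed)
    parent = list(range(n))

    def find(x):
        # Union-find with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for u, v in edges:
        if rng.random() < p:       # edge survives percolation
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv

    sizes = {}
    for x in range(n):
        r = find(x)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) / n
```

Sweeping p from 0 to 1 on a bounded-degree graph and plotting this fraction exposes the transition window whose location the paper bounds away from zero and one.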
Abstract: Rapid advancement in computing technology is bringing
computers and humans towards seamless integration in the future. The
emergence of the smartphone has driven the computing era towards
ubiquitous and pervasive computing. Recognizing human activity has
garnered a lot of interest and has raised significant research
concerns in identifying contextual information useful for human
activity recognition. Not only is the smartphone unobtrusive to users
in daily life, it has embedded built-in sensors capable of sensing
contextual information about its users, supported by a wide range
of network connections. In this paper, we discuss the
classification algorithms used in smartphone-based human activity
recognition. Existing technologies pertaining to smartphone-based
research in human activity recognition are highlighted and discussed. Our
paper also presents our findings and opinions to formulate
ideas for improving current research trends. Understanding
research trends will enable researchers to have a clearer research
direction and a common vision of the latest smartphone-based human
activity recognition area.
Abstract: Tensile armour wires provide a flexible pipe's
resistance to longitudinal stresses. Flexible pipe manufacturers need
to know the effect of defects such as scratches and cracks with
dimensions less than 0.2 mm, the limit of current non-destructive
detection technology, on the fracture stress and fracture
strain of the wire for quality assurance purposes. Recent research
involving the determination of the fracture strength of cracked wires
employed laboratory testing and classical fracture mechanics
approach using non-standardised fracture mechanics specimens
because standard test specimens could not be manufactured from the
wires owing to their sizes. In this work, the effect of miniature
cracks on the fracture properties of tensile armour wires was
investigated using laboratory and finite element tensile testing
simulations with the phenomenological shear fracture model. The
investigation revealed that the presence of cracks shallower than
0.2 mm has a more detrimental effect on the fracture strain of the wire.
Abstract: A concern that researchers usually face in different
applications of Artificial Neural Networks (ANNs) is the determination
of the size of the effective domain in time series. In this paper, a
trial-and-error method was used on a groundwater depth time series to
determine the size of the effective domain of the series in an observation well in
Union County, New Jersey, U.S. Different domains of 20, 40, 60, 80,
100, and 120 preceding days were examined, and 80 days was
considered the effective length of the domain. Data sets in different
domains were fed to a Feed Forward Back Propagation ANN with
one hidden layer and the groundwater depths were forecasted. Root
Mean Square Error (RMSE) and the correlation factor (R2) of
estimated and observed groundwater depths for all domains were
determined. In general, groundwater depth forecast improved, as
evidenced by lower RMSEs and higher R2s, when the domain length
increased from 20 to 120. However, 80 days was selected as the
effective domain because the improvement was less than 1% beyond
that. Forecasted groundwater depths utilizing measured daily data
(set #1) and data averaged over the effective domain (set #2) were
compared. It was postulated that the more accurate nature of measured
daily data was the reason for the better forecast with lower RMSE
(0.1027 m compared to 0.255 m) in set #1. However, the size of input
data in this set was 80 times the size of input data in set #2; a factor
that may increase the computational effort unpredictably. It was
concluded that 80 daily data points may be successfully utilized to lower the
size of input data sets considerably, while maintaining the effective
information in the data set.
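The two goodness-of-fit measures used above can be computed as follows. This is a minimal sketch in which R² is taken as the coefficient of determination; the abstract's "correlation factor (R2)" may instead denote the squared Pearson correlation, so this interpretation is an assumption:

```python
from math import sqrt

def rmse(observed, predicted):
    """Root Mean Square Error between observed and predicted series."""
    n = len(observed)
    return sqrt(sum((o - p) ** 2 for o, p in zip(observed, predicted)) / n)

def r_squared(observed, predicted):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean_o = sum(observed) / len(observed)
    ss_res = sum((o - p) ** 2 for o, p in zip(observed, predicted))
    ss_tot = sum((o - mean_o) ** 2 for o in observed)
    return 1 - ss_res / ss_tot
```

Computed for each candidate domain length (20 through 120 days), lower RMSE and higher R² identify the better forecast, as in the comparison described above.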
Abstract: RC4 was used as the encryption algorithm in the WEP (Wired Equivalent Privacy) protocol, which is standardized for 802.11 wireless networks. A few attacks followed, indicating certain weaknesses in the design. In this paper, we propose a new variant of the RC4 stream cipher. The new version of the cipher not only appears to be more secure, but its keystream also has a large period, large complexity, and good statistical properties.
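For context, the standard RC4 cipher that the proposed variant modifies consists of a key-scheduling algorithm (KSA) followed by a pseudo-random generation algorithm (PRGA). A minimal sketch of the original cipher is shown below; the paper's variant itself is not specified in the abstract, so only the baseline is illustrated:

```python
def rc4_keystream(key):
    """Standard RC4: KSA then PRGA; yields one keystream byte at a time."""
    # Key-scheduling algorithm (KSA).
    S = list(range(256))
    j = 0
    for i in range(256):
        j = (j + S[i] + key[i % len(key)]) % 256
        S[i], S[j] = S[j], S[i]
    # Pseudo-random generation algorithm (PRGA).
    i = j = 0
    while True:
        i = (i + 1) % 256
        j = (j + S[i]) % 256
        S[i], S[j] = S[j], S[i]
        yield S[(S[i] + S[j]) % 256]

def rc4_crypt(key, data):
    """XOR data with the keystream; encryption and decryption are identical."""
    ks = rc4_keystream(key)
    return bytes(b ^ next(ks) for b in data)
```

Because encryption is a keystream XOR, applying `rc4_crypt` twice with the same key recovers the plaintext.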
Abstract: This paper presents an NDT method based on infrared thermography with CO2 laser excitation at a wavelength of 10.6 μm. The excitation is a controllable heating beam, confirmed by a preliminary test on a wooden plate of 1.2 m x 0.9 m x 1 cm. As a first application, this method is applied to detecting defects in CFRP heated by the 300 W laser for 40 s. Two samples of 40 cm x 40 cm x 4.5 cm are prepared, one with a defect and one without. The laser beam passes through the lens of a deviation device and heats the samples placed at a determined position and area. As a result, the absence of adhesive can be detected. This method prominently demonstrates its applicability as NDT for composite materials. This work gives a good perspective for characterizing the laser beam, which will be very useful for future detection campaigns.
Abstract: Cloud Computing is a new technology that lets us
use the Cloud to meet our computational needs. The Cloud refers to a
scalable network of computers that work together like the Internet. An
important element of Cloud Computing is that we shift the processing,
managing, storing, and implementation of our data from a local machine
into the Cloud, which helps us to improve efficiency. Because it is a new
technology, it has both advantages and disadvantages, which are
scrutinized in this article. Then some vanguards of this technology
are studied. Afterwards, we conclude that Cloud Computing will play
an important role in our future lives.
Abstract: Although backpropagation ANNs generally predict
better than decision trees do for pattern classification problems, they
are often regarded as black boxes, i.e., their predictions cannot be
explained as those of decision trees can. In many applications, it is
desirable to extract knowledge from trained ANNs for the users to
gain a better understanding of how the networks solve the problems.
A new rule extraction algorithm, called rule extraction from artificial
neural networks (REANN), is proposed and implemented to extract
symbolic rules from ANNs. A standard three-layer feedforward ANN
is the basis of the algorithm. A four-phase training algorithm is
proposed for backpropagation learning. Explicitness of the extracted
rules is supported by comparing them to the symbolic rules generated
by other methods. The extracted rules are comparable with those of other
methods in terms of the number of rules, the average number of conditions
per rule, and predictive accuracy. Extensive experimental studies on several
benchmark classification problems, such as breast cancer, iris,
diabetes, and season classification problems, demonstrate the
effectiveness of the proposed approach with good generalization
ability.
Abstract: During the last couple of years, the degree of dependence on IT systems has reached a dimension nobody imagined to be possible 10 years ago. The increased usage of mobile devices (e.g., smartphones), wireless sensor networks and embedded devices (the Internet of Things) are only some examples of the dependency of modern societies on cyber space. At the same time, the complexity of IT applications, e.g., because of the increasing use of cloud computing, is rising continuously. Along with this, the threats to IT security have increased both quantitatively and qualitatively, as recent examples like STUXNET or the supposed cyber attack on the Illinois water system impressively demonstrate. Control systems that were once isolated are nowadays often publicly accessible - a fact that was never intended by the developers. Threats to IT systems do not care about areas of responsibility. Especially with regard to Cyber Warfare, IT threats are no longer limited to company or industry boundaries, administrative jurisdictions or state boundaries. One of the important countermeasures is increased cooperation among the participants, especially in the field of Cyber Defence. Besides political and legal challenges, there are technical ones as well. A better, at least partially automated exchange of information is essential to (i) enable sophisticated situational awareness and to (ii) counter the attacker in a coordinated way. Therefore, this publication performs an evaluation of state-of-the-art Intrusion Detection Message Exchange protocols in order to guarantee a secure information exchange between different entities.
Abstract: As the majority of faults are found in only a few of a
system's modules, there is a need to investigate the modules that are
affected severely as compared to other modules, and proper
maintenance needs to be done in time, especially for critical
applications. Neural networks have already been applied
in software engineering applications to build reliability growth
models and to predict gross change or reusability metrics. Neural
networks are non-linear, sophisticated modeling techniques that are
able to model complex functions. Neural network techniques are
used when the exact nature of the inputs and outputs is not known. A key
feature is that they learn the relationship between input and output
through training. In the present work, various neural network based
techniques are explored and a comparative analysis is performed for
the prediction of the level of maintenance need by predicting the
severity level of faults present in NASA's public domain defect dataset.
The comparison of the different algorithms is made on the basis of Mean
Absolute Error, Root Mean Square Error, and Accuracy. It is
concluded that the Generalized Regression Network is the best
algorithm for classifying the software components into different
levels of severity of fault impact. The algorithm can be used to
develop a model for identifying modules that are
heavily affected by the faults.
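The Generalized Regression Network identified above as the best performer is, in Specht's formulation, a kernel-weighted average of training targets. A minimal one-dimensional sketch follows; the bandwidth `sigma` and the data are hypothetical, and the paper's actual feature space is multi-dimensional:

```python
from math import exp

def grnn_predict(x, train_x, train_y, sigma=1.0):
    """GRNN prediction for a scalar input: a Gaussian-kernel-weighted
    average of training targets, weight = exp(-d^2 / (2 * sigma^2))."""
    weights = [exp(-((x - xi) ** 2) / (2 * sigma ** 2)) for xi in train_x]
    return sum(w * y for w, y in zip(weights, train_y)) / sum(weights)
```

For severity classification as in the abstract, the continuous output can be mapped to the nearest severity level, or one network output per class can be compared; small `sigma` makes the prediction track the nearest training example.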
Abstract: Inter-organizational Workflow (IOW) is commonly
used to support the collaboration between heterogeneous and
distributed business processes of different autonomous organizations
in order to achieve a common goal. E-government is considered an
application field of IOW. The coordination of the different
organizations is the fundamental problem in IOW and remains the
major cause of failure in e-government projects. In this paper, we
introduce a new coordination model for IOW that improves the
collaboration between government administrations and that respects
IOW requirements applied to e-government. For this purpose, we
adopt a multi-agent approach, which deals more easily with the
characteristics of inter-organizational digital government: distribution,
heterogeneity, and autonomy. Our model also integrates different
technologies to deal with semantic and technological
interoperability. Moreover, it preserves the existing systems of
government administrations by offering a distributed coordination
based on interface communication. This is especially applicable in
developing countries, where administrations are not necessarily
equipped with workflow systems. The use of our coordination
techniques allows an easier and lower-cost migration to an
e-government solution. To illustrate the applicability of the proposed
model, we present a case study of an identity card creation in Tunisia.