Abstract: Maintenance is one of the most important activities in
the shipyard industry. However, it is not always supported by
adequate services from the shipyard, where inaccuracy in estimating
the duration of ship maintenance is still common. This makes
accurate estimation of ship maintenance duration crucial. This study
uses a data mining approach, CART (Classification and Regression
Trees), to estimate the duration of ship maintenance limited to
dock works, also known as dry docking. Using the volume of dock
works as an input to estimate the maintenance duration, four
classes of dry-docking duration were obtained, each with a different
linear model and job criteria. These linear models can then be
used to estimate the duration of dry docking based on job criteria.
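The estimation step can be sketched as follows; the data, coefficients, and the single fitted line below are mock values standing in for the linear model attached to one CART duration class, not the paper's actual models:

```python
import numpy as np

# Hypothetical mock data: dock-work volume (x) vs. dry-docking duration
# in days (y); the true job criteria and class boundaries are not shown.
rng = np.random.default_rng(0)
x = rng.uniform(10.0, 100.0, 40)
y = 2.0 * x + 5.0 + rng.normal(0.0, 1.0, 40)

# Each CART leaf in the paper carries its own linear model; one
# least-squares line stands in for such a per-class model here.
slope, intercept = np.polyfit(x, y, 1)

def estimate_duration(volume):
    """Estimated dry-docking duration for a given volume of dock works."""
    return slope * volume + intercept
```

In the paper's scheme, a new job would first be routed to one of the four duration classes by its job criteria, and only then scored with that class's line.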
Abstract: Grid environments aggregate geographically distributed
resources. Grids come in three types: computational, data, and
storage. This paper presents research on data grids. A data grid is
used to provide and secure access to data from many heterogeneous
sources. Users need not be concerned with where the data is located,
provided they can access it. Metadata is used to access data in a data
grid. Currently, application metadata catalogues and the SRB
middleware package are used in data grids for metadata management. In
this paper, simultaneous and rapid updating, streamlining, and
searching are made possible through a classified table for preserving
metadata and the conversion of each table into numerous tables.
Moreover, the most appropriate division is determined with regard to
the specific application. Concurrent handling of some requests and
pipelined execution are achieved as a result of this technique.
Abstract: Face recognition has always been a fascinating research area. It has drawn the attention of many researchers because of its various potential applications, such as security systems, entertainment, and criminal identification. Many supervised and unsupervised learning techniques have been reported so far. Principal Component Analysis (PCA), Self-Organizing Maps (SOM) and Independent Component Analysis (ICA) are three of the many techniques proposed by different researchers for face recognition and are known as unsupervised techniques. This paper proposes the integration of two techniques, SOM and PCA, for dimensionality reduction and feature selection. Simulation results show that, although the individual techniques SOM and PCA give excellent performance on their own, their combination can also be utilized for face recognition. Experimental results also indicate that, for the given face database and the classifier used, SOM performs better than other unsupervised learning techniques. A comparison of the two proposed SOM methodologies, local and global processing, shows the superiority of the latter, but at the cost of more computational time.
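A minimal sketch of the dimensionality-reduction stage, assuming mock data rather than a real face database; for brevity the SOM stage is replaced here by simple class-mean prototypes, which is a simplification, not the paper's method:

```python
import numpy as np

rng = np.random.default_rng(1)
# Two mock "face" classes as 100-dimensional vectors (not real images).
X = np.vstack([rng.normal(0.0, 1.0, (20, 100)),
               rng.normal(3.0, 1.0, (20, 100))])
labels = np.array([0] * 20 + [1] * 20)

# PCA via SVD: keep the top 5 principal components.
mean = X.mean(axis=0)
_, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
Z = (X - mean) @ Vt[:5].T

# Nearest class prototype in the reduced space (stand-in for SOM nodes).
prototypes = np.array([Z[labels == c].mean(axis=0) for c in (0, 1)])

def classify(face):
    z = (face - mean) @ Vt[:5].T
    return int(np.argmin(np.linalg.norm(prototypes - z, axis=1)))
```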
Abstract: The classification of cancers into their corresponding cohorts has been a key area of research in bioinformatics, aiming at better prognosis of the disease. The high dimensionality of gene data makes this a complex task and requires techniques for identifying significant data in order to reduce the dimensionality and extract significant information. In this paper, we propose a novel approach for the classification of oral cancer into metastasis-positive and metastasis-negative patients. We use significance analysis of microarrays (SAM) to identify the significant genes that constitute a gene signature. Three different gene signatures were identified using SAM from three different combinations of training datasets, and their classification accuracy was calculated on the corresponding testing datasets using k-Nearest Neighbour (kNN), Fuzzy C-Means clustering (FCM), Support Vector Machine (SVM) and Backpropagation Neural Network (BPNN) classifiers. A final gene signature of only 9 genes was obtained from the above 3 individual gene signatures. The 9-gene signature's classification capability was compared using the same classifiers on the same testing datasets. The results obtained from experimentation show that the 9-gene signature classified all samples in the testing dataset accurately, while individual genes could not classify all of them accurately.
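The kNN step on the final signature can be sketched as follows; the expression values below are mock data, and the 9 features merely stand in for a hypothetical 9-gene signature:

```python
import numpy as np

def knn_predict(train_X, train_y, sample, k=3):
    """Majority vote among the k nearest training samples (Euclidean)."""
    dists = np.linalg.norm(train_X - sample, axis=1)
    nearest = train_y[np.argsort(dists)[:k]]
    return np.bincount(nearest).argmax()

rng = np.random.default_rng(2)
# Mock expression profiles: metastasis-negative (0) vs. positive (1).
neg = rng.normal(0.0, 1.0, (15, 9))
pos = rng.normal(2.5, 1.0, (15, 9))
train_X = np.vstack([neg, pos])
train_y = np.array([0] * 15 + [1] * 15)
```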
Abstract: Nature conducts its actions in a very private manner.
Classical science has made great efforts to reveal these actions, but
it can experiment only with things that can be seen with the eyes.
Quantum science works very well beyond the scope of classical science.
It is based on postulates such as the qubit, superposition of two
states, entanglement, measurement, and the evolution of states, which
are briefly described in the present paper.
One application of quantum computing, namely the implementation of a
novel quantum evolutionary algorithm (QEA) to automate the
timetabling problem of Dayalbagh Educational Institute (Deemed
University), is also presented in this paper. Making a good timetable
is a scheduling problem: it is an NP-hard, multi-constrained, complex
combinatorial optimization problem whose solution cannot be obtained
in polynomial time. The QEA uses genetic operators on the Q-bit as
well as a quantum-gate updating operator, which is introduced as a
variation operator to converge toward better solutions.
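The Q-bit representation and the quantum rotation gate used as a variation operator can be sketched as follows; the rotation angle and its update rule are generic QEA conventions, not the paper's exact settings:

```python
import numpy as np

# A Q-bit individual: each gene i is a pair (alpha_i, beta_i) with
# alpha^2 + beta^2 = 1; beta^2 is the probability of observing bit 1.
n = 8
alpha = np.full(n, 1.0 / np.sqrt(2.0))
beta = np.full(n, 1.0 / np.sqrt(2.0))

def observe(alpha, beta, rng):
    """Collapse each Q-bit into a classical bit string."""
    return (rng.random(alpha.size) < beta ** 2).astype(int)

def rotate(alpha, beta, theta):
    """Quantum rotation gate: the variation operator that nudges each
    Q-bit toward a better solution (theta normally comes from a lookup
    table comparing the current bit against the best solution)."""
    a = np.cos(theta) * alpha - np.sin(theta) * beta
    b = np.sin(theta) * alpha + np.cos(theta) * beta
    return a, b

a2, b2 = rotate(alpha, beta, 0.05 * np.pi)  # rotate toward bit value 1
```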
Abstract: Knowledge Discovery in Databases (KDD) has
evolved into an important and active area of research because of
theoretical challenges and practical applications associated with the
problem of discovering (or extracting) interesting and previously
unknown knowledge from very large real-world databases. Rough
Set Theory (RST) is a mathematical formalism for representing
uncertainty that can be considered an extension of the classical set
theory. It has been used in many different research areas, including
those related to inductive machine learning and reduction of
knowledge in knowledge-based systems. One important concept
related to RST is that of a rough relation. In this paper we present
the current status of research on applying rough set theory to KDD,
which will be helpful for handling the characteristics of real-world
databases. The main aim is to show how rough sets and rough set
analysis can be effectively used to extract knowledge from large
databases.
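The core rough-set notions underlying such analysis, the lower and upper approximations of a concept under an indiscernibility relation, can be sketched on toy data:

```python
# Toy information system: each object described by one attribute value.
objects = {
    'o1': 'red', 'o2': 'red', 'o3': 'blue', 'o4': 'blue', 'o5': 'green',
}

# Equivalence classes of the indiscernibility relation
# (objects with the same attribute value are indistinguishable).
classes = {}
for obj, value in objects.items():
    classes.setdefault(value, set()).add(obj)

X = {'o1', 'o2', 'o3'}  # the concept to approximate

# Lower approximation: classes certainly contained in X.
lower = set().union(*(c for c in classes.values() if c <= X))
# Upper approximation: classes that possibly overlap X.
upper = set().union(*(c for c in classes.values() if c & X))
# The boundary region is where the concept is "rough".
boundary = upper - lower
```

Here `o3` and `o4` are indiscernible, yet only `o3` belongs to X, so they fall into the boundary region: the concept cannot be defined exactly by the available attribute.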
Abstract: The availability of high-dimensional biological datasets, such as those from gene expression, proteomic, and metabolic experiments, can be leveraged for the diagnosis and prognosis of diseases. Many classification methods in this area have been studied to predict disease states and separate predefined classes, such as patients with a particular disease versus healthy controls. However, most of the existing research focuses on a specific dataset only. There is a lack of generic comparisons between classifiers that might provide a guideline for biologists or bioinformaticians to select the proper algorithm for new datasets. In this study, we compare the performance of popular classifiers, namely Support Vector Machine (SVM), Logistic Regression, k-Nearest Neighbor (k-NN), Naive Bayes, Decision Tree, and Random Forest, on mock datasets. We mimic common biological scenarios by simulating various proportions of real discriminating biomarkers and different effect sizes thereof. The results show that SVM performs quite stably and reaches a higher AUC than the other methods. This may be explained by the ability of SVM to minimize the probability of error. Moreover, Decision Tree, with its good applicability for diagnosis and prognosis, shows good performance in our experimental setup. Logistic Regression and Random Forest, however, strongly depend on the ratio of discriminators and perform better with a higher number of discriminators.
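The mock-data setup can be sketched as follows; the sample sizes, the effect size, and the centroid-based score (a simple stand-in for a trained classifier, not one of the paper's methods) are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p, k = 100, 50, 5   # samples per class, features, true discriminators
effect = 1.0           # effect size of the discriminating biomarkers

# Mock "omics" data: only the first k features separate cases from controls.
controls = rng.normal(0.0, 1.0, (n, p))
cases = rng.normal(0.0, 1.0, (n, p))
cases[:, :k] += effect

# A centroid-difference linear score stands in for a trained classifier.
w = cases.mean(axis=0) - controls.mean(axis=0)
scores_pos = cases @ w
scores_neg = controls @ w

def auc(pos, neg):
    """AUC via the Mann-Whitney statistic: P(score_pos > score_neg)."""
    diff = pos[:, None] - neg[None, :]
    return (diff > 0).mean() + 0.5 * (diff == 0).mean()
```

Varying `k` and `effect` reproduces the scenarios described above: with few discriminators or small effect sizes the AUC of any linear score drops toward 0.5.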
Abstract: Urban road network traffic has become one of the
most studied research topics in the last decades. This is mainly due to
the enlargement of the cities and the growing number of motor
vehicles traveling in this road network. One of the most sensitive
problems is to verify if the network is congestion-free. Another
related problem is the automatic reconfiguration of the network
without building new roads to alleviate congestions. These problems
require an accurate model of the traffic to determine the steady state
of the system. An alternative is to simulate the traffic to see if there
are congestions and when and where they occur. One key issue is to
find an adequate model for road intersections. Once the model is
established, either a large-scale model is built, or the intersection
is represented by its performance measures and simulated for analysis.
In both cases, it is important to find an appropriate queueing model
to represent the road intersection. In this paper, we propose to model
the road intersection as a BCMP queueing network, and we compare this
analytical model against a simulation model for validation.
Abstract: A five-class density histogram with an index named cumulative density was proposed to analyze short-term HRV. 150 subjects participated in the test, falling into three groups of equal size: the healthy young group (Young), the healthy old group (Old), and the group of patients with congestive heart failure (CHF). Results of multiple comparisons showed significant differences in the cumulative density among the three groups, with values of 0.0238 for Young, 0.0406 for Old and 0.0732 for CHF (p
Abstract: This paper presents a new method of analog fault diagnosis based on back-propagation neural networks (BPNNs) using wavelet decomposition and fractal dimension as preprocessors. The proposed method has the capability to detect and identify faulty components in an analog electronic circuit with tolerance by analyzing its impulse response. Using wavelet decomposition to preprocess the impulse response drastically de-noises the inputs to the neural network. The second preprocessing step, fractal dimension, extracts unique features, which are then fed to a neural network as inputs for further classification. A comparison of our work with [1] and [6], which also employ back-propagation (BP) neural networks, reveals that our system requires a much smaller network and performs significantly better in fault diagnosis of analog circuits, thanks to the proposed preprocessing techniques.
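The wavelet preprocessing stage can be illustrated with a one-level Haar decomposition in plain NumPy; the paper does not specify its wavelet family or decomposition depth, so this is only a sketch on a mock impulse response:

```python
import numpy as np

def haar_level1(signal):
    """One level of the Haar wavelet transform: the approximation band
    keeps the smoothed response, the detail band collects the noise."""
    even, odd = signal[0::2], signal[1::2]
    approx = (even + odd) / np.sqrt(2.0)
    detail = (even - odd) / np.sqrt(2.0)
    return approx, detail

# Mock noisy impulse response of an analog circuit (decaying exponential).
t = np.linspace(0.0, 1.0, 64)
rng = np.random.default_rng(4)
response = np.exp(-5.0 * t) + rng.normal(0.0, 0.05, t.size)

approx, detail = haar_level1(response)
# In the paper's pipeline, the de-noised band would then go to the
# fractal-dimension stage before reaching the BPNN.
```

The Haar transform is orthonormal, so the signal energy splits exactly between the approximation and detail bands, which is what makes discarding the detail band a principled de-noising step.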
Abstract: In this paper, we study a class of serially concatenated block codes (SCBC) based on matrix interleavers, to be employed in fixed wireless communication systems. The performance of SCBC-coded systems is investigated under various interleaver dimensions. Numerical results reveal that the matrix interleaver can be a competitive alternative to the conventional block interleaver for frame lengths of 200 bits; hence, SCBC coding based on the matrix interleaver is a promising technique for speech transmission applications in many international standards, such as the pan-European Global System for Mobile communications (GSM), Digital Cellular Systems (DCS) 1800, and Joint Detection Code Division Multiple Access (JD-CDMA) mobile radio systems, where the speech frame contains around 200 bits.
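A matrix interleaver writes a frame into a matrix row by row and reads it out column by column; a minimal sketch for a 200-bit frame (the 10 x 20 dimensions are an assumed example, not the paper's chosen geometry):

```python
import numpy as np

def matrix_interleave(bits, rows, cols):
    """Write the frame row by row, read it out column by column."""
    return np.asarray(bits).reshape(rows, cols).T.ravel()

def matrix_deinterleave(bits, rows, cols):
    """Inverse operation: write column by column, read row by row."""
    return np.asarray(bits).reshape(cols, rows).T.ravel()

frame = np.arange(200)  # a 200-bit speech frame, as in the abstract
interleaved = matrix_interleave(frame, 10, 20)
recovered = matrix_deinterleave(interleaved, 10, 20)
```

Adjacent bits of the frame end up `rows` positions apart after interleaving, which spreads burst errors across the frame before the inner decoder sees them.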
Abstract: Information on weed distribution within the field is necessary to implement spatially variable herbicide application. Since hand labor is costly, an automated weed control system could be feasible. This paper deals with the development of an algorithm for a real-time specific weed recognition system based on the histogram maxima, with thresholding, of an image used for weed classification. The algorithm is specifically developed to classify images into broad and narrow classes for real-time selective herbicide application. The developed system has been tested on weeds in the lab, and the tests have shown the system to be very effective in weed identification. Further, the results show very reliable performance on images of weeds taken under varying field conditions. The analysis of the results shows over 95 percent classification accuracy over 140 sample images (broad and narrow), with 70 samples from each category of weeds.
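The histogram-maxima idea can be sketched on binary vegetation masks; the threshold and the broad/narrow decision rule below are hypothetical stand-ins for the paper's actual criteria:

```python
import numpy as np

def classify_weed(mask, threshold=0.5):
    """Hypothetical broad/narrow decision from a binary vegetation mask:
    measure how many columns stay near the column-histogram maximum.
    (The paper's exact thresholding rule is not reproduced here.)"""
    column_hist = mask.sum(axis=0)           # vegetation pixels per column
    peak = column_hist.max()
    width = (column_hist > threshold * peak).sum()
    # Broad leaves keep many columns near the maximum; narrow,
    # grass-like leaves concentrate their pixels in few columns.
    return 'broad' if width > mask.shape[1] // 4 else 'narrow'

# Mock masks: a wide leaf blob vs. a thin vertical blade.
broad = np.zeros((40, 40), dtype=int)
broad[10:30, 5:35] = 1
narrow = np.zeros((40, 40), dtype=int)
narrow[:, 19:21] = 1
```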
Abstract: In the classical buckling analysis of rectangular plates
subjected to the concurrent action of shear and uniaxial forces, the
Euler shear buckling stress is generally evaluated separately, so that
no influence on the shear buckling coefficient, due to the in-plane
tensile or compressive forces, is taken into account.
In this paper the buckling problem of simply supported rectangular
plates, under the combined action of shear and uniaxial forces, is
discussed from the beginning, in order to obtain new design formulas
for the shear buckling coefficient that take into account the presence
of uniaxial forces.
Furthermore, as the classical expression of the shear buckling
coefficient for simply supported rectangular plates is considered only
a “rough” approximation, since the exact one is defined by a system of
intersecting curves, the convergence and the goodness of the classical
solution are analyzed, too.
Finally, as the evaluation of the Euler shear buckling stress
is a very important topic for a variety of structures (e.g., ship
structures), two numerical applications are carried out, in order to
highlight the role of the uniaxial stresses in the plating scantling
procedures and the goodness of the proposed formulas.
Abstract: This research is a comparative study of complexity, as a multidimensional concept, in the context of streetscape composition in Algeria and Japan. Eighty streetscape visual arrays were collected and then presented to 20 participants with different cultural backgrounds, in order to be categorized and classified according to their degree of complexity. Three analysis methods were used in this research: cluster analysis, the ranking method, and the Hayashi Quantification Method (Method III). The results showed that complexity, disorder, irregularity and disorganization are often conflicting concepts in the urban context. Algerian daytime streetscapes seem to be balanced, ordered and regular, while Japanese daytime streetscapes seem to be unbalanced, regular and vivid. Variety, richness and irregularity, with some aspects of order and organization, seem to characterize Algerian night streetscapes. Japanese night streetscapes seem to be more related to balance, regularity, order and organization, with some aspects of confusion and ambiguity. Complexity mainly characterized Algerian avenues with green infrastructure. Accordingly, for Japanese participants, Japanese traditional night streetscapes were complex, and for foreigners, Algerian and Japanese avenue nightscapes were the most complex visual arrays.
Abstract: The paper deals with an application of quantitative analysis, the Data Envelopment Analysis (DEA) method, to performance evaluation of the European Union Member States in the reference years 2000 and 2011. The main aim of the paper is to measure efficiency changes over the reference years, to analyze the level of productivity in individual countries based on the DEA method, and to classify the EU Member States into homogeneous units (clusters) according to the efficiency results. The theoretical part is devoted to the fundamentals of performance theory and the methodology of DEA. The empirical part is aimed at measuring the degree of productivity and the level of efficiency changes of the evaluated countries using a basic DEA model, the CCR CRS model, and a specialized DEA approach, the Malmquist Index, which measures the change of technical efficiency and the movement of the production possibility frontier. Here, the DEA method becomes a suitable tool for establishing the competitive or uncompetitive position of each country, because not just one factor is evaluated, but a set of different factors that determine the degree of economic development.
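The basic CCR (CRS) efficiency score can be computed with a small linear program; the toy one-input, one-output data below are illustrative, not the EU dataset used in the paper:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, j0):
    """CCR (CRS) efficiency of DMU j0 via the multiplier LP:
    max u.y0  s.t.  v.x0 = 1,  u.yj - v.xj <= 0 for all j,  u, v >= 0."""
    m, s = X.shape[1], Y.shape[1]
    c = np.concatenate([-Y[j0], np.zeros(m)])           # maximize u.y0
    A_ub = np.hstack([Y, -X])                           # u.yj - v.xj <= 0
    b_ub = np.zeros(X.shape[0])
    A_eq = np.concatenate([np.zeros(s), X[j0]])[None]   # v.x0 = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=(0, None))
    return -res.fun

# Toy example: one input and one output per decision-making unit (DMU).
X = np.array([[2.0], [4.0]])   # inputs
Y = np.array([[2.0], [2.0]])   # outputs
```

Here the first DMU produces the same output from half the input, so it defines the frontier (efficiency 1) while the second scores 0.5; the Malmquist Index then compares such scores across the two reference years.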
Abstract: Plasmodium vivax malaria differs from P. falciparum malaria in that a person suffering from a P. vivax infection can suffer relapses of the disease. This is due to the parasite being able to remain dormant in the liver of the patient, from where it can re-infect the patient after a passage of time. During this stage, the patient is classified as being in the dormant class. The model describing the transmission of P. vivax malaria consists of a human population divided into four classes: the susceptible, the infected, the dormant and the recovered. The effect of a time delay on the transmission of this disease is studied. The time delay is the period in which the P. vivax parasite develops inside the mosquito (vector) before the vector becomes infectious (i.e., can pass on the infection). We analyze our model using standard dynamic modeling methods. Two stable equilibrium states, a disease-free state E0 and an endemic state E1, are found to be possible. It is found that the E0 state is stable when a newly defined basic reproduction number G is less than one. If G is greater than one, the endemic state E1 is stable. The conditions for the endemic equilibrium state E1 to be a stable spiral node are established. For realistic values of the parameters in the model, it is found that the solutions in phase space are trajectories spiraling into the endemic state. It is shown that limit-cycle and chaotic behaviors can only be achieved with unrealistic parameter values.
Abstract: We propose a fast and robust hierarchical face detection system which finds and localizes face images with a cascade of classifiers. Three modules contribute to the efficiency of our detector. First, heterogeneous feature descriptors are exploited to enrich the feature types and feature numbers for face representation. Second, a PSO-Adaboost algorithm is proposed to efficiently select discriminative features from a large pool of available features and reinforce them into the final ensemble classifier. Compared with the standard exhaustive Adaboost for feature selection, the new PSO-Adaboost algorithm reduces the training time by up to 20 times. Finally, a three-stage hierarchical classifier framework is developed for rapid background removal. In particular, candidate face regions are detected more quickly by using a large window size in the first stage. Nonlinear SVM classifiers are used instead of decision stump functions in the last stage to remove those remaining complex non-face patterns that cannot be rejected in the previous two stages. Experimental results show that our detector achieves superior performance on the CMU+MIT frontal face dataset.
Abstract: This paper presents a new data-oriented model of images, and a representation of it, ADBT, is introduced. ADBT enables clustering, segmentation, measurement of image similarity, etc., with the desired precision and corresponding speed.
Abstract: Undoubtedly, the chassis is one of the most important
parts of a vehicle. The chassis produced for vehicles today are
made up of four parts, which are joined together by screws; the
transverse parts are called cross members.
This study examines the stresses generated by cyclic laboratory
loads in the front cross member of a Peugeot 405. In this paper the
finite element method is used to simulate the welding process and to
determine the physical response of the spot-welded joints. The
analysis is done with the Abaqus software.
The stresses generated in the cross-member structure are generally
classified into two groups: the stresses that remain as residual
stresses after the welding process, and the mechanical stresses
generated by the cyclic load. Accordingly, the total stress must be
obtained by determining the residual stress and the mechanical stress
separately and then summing them according to the superposition
principle.
In order to improve accuracy, material properties, including
physical, thermal and mechanical properties, were assumed to be
temperature-dependent. Simulation shows that the maximum von Mises
stresses are located at specific points. The model results are then
compared to the experimental results reported by the manufacturer,
and good agreement is observed.
Abstract: Bone remodeling occurs by the balanced action of
bone resorbing osteoclasts (OC) and bone-building osteoblasts.
Increased bone resorption by excessive OC activity contributes
to malignant and non-malignant diseases including osteoporosis.
To study OC differentiation and function, OC formed in
in vitro cultures are currently counted manually, a tedious
procedure which is prone to inter-observer differences. Aiming
for an automated OC-quantification system, classification of
OC and precursor cells was done on fluorescence microscope
images based on the distinct appearance of fluorescent nuclei.
Following ellipse fitting to nuclei, a combination of eight
features enabled clustering of OC and precursor cell nuclei.
After evaluating different machine-learning techniques, LOGREG
achieved 74% correctly classified OC and precursor cell
nuclei, outperforming human experts (best expert: 55%). In
combination with the automated detection of total cell areas,
this system makes it possible to measure various cell parameters
and, most importantly, to quantify proteins involved in
osteoclastogenesis.