Reservoir Operation by Ant Colony Optimization for Continuous Domains (ACOR): Case Study of the Dez Reservoir

A direct search approach based on ant colony optimization for continuous domains (ACOR) is proposed to determine optimal reservoir operation. The model is applied to a single-reservoir system to determine optimal releases over 42 years of monthly time steps. A disadvantage of ant-colony-based methods, and of ACOR in particular, is their large computational run time. In this study, a highly effective procedure for reducing run time is developed. The results are compared with those of a genetic algorithm (GA) based model.
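As a rough illustration of how ACOR searches a continuous decision space, the sketch below applies the standard ACOR sampling scheme (a solution archive with rank-based Gaussian kernels) to a toy single-reservoir release problem. The objective function, storage bounds, demands, and all parameter values are illustrative assumptions and are not taken from the Dez Reservoir study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy objective: squared deficit between releases and demands,
# with a penalty for violating simple storage bounds (illustrative only).
T = 12                                   # one year of monthly decisions
demand = np.linspace(50.0, 120.0, T)     # assumed monthly demands
inflow = np.full(T, 90.0)                # assumed constant inflow
s_min, s_max, s0 = 100.0, 800.0, 400.0   # assumed storage limits / initial storage

def cost(release):
    storage, penalty = s0, 0.0
    for t in range(T):
        storage += inflow[t] - release[t]
        penalty += max(0.0, s_min - storage) ** 2 + max(0.0, storage - s_max) ** 2
    return np.sum((release - demand) ** 2) + 1e3 * penalty

# ACOR parameters (typical values from the literature)
k, m, q, xi, iters = 30, 20, 0.1, 0.85, 300
archive = rng.uniform(0.0, 150.0, size=(k, T))          # candidate release schedules
fitness = np.array([cost(s) for s in archive])

for _ in range(iters):
    order = np.argsort(fitness)
    archive, fitness = archive[order], fitness[order]
    ranks = np.arange(1, k + 1)
    w = np.exp(-(ranks - 1) ** 2 / (2 * (q * k) ** 2)) / (q * k * np.sqrt(2 * np.pi))
    p = w / w.sum()
    new = np.empty((m, T))
    for j in range(m):
        l = rng.choice(k, p=p)                          # pick a guiding solution
        sigma = xi * np.abs(archive - archive[l]).sum(axis=0) / (k - 1)
        new[j] = np.clip(rng.normal(archive[l], sigma), 0.0, 150.0)
    new_fit = np.array([cost(s) for s in new])
    # archive update: keep the k best of the union
    all_s = np.vstack([archive, new])
    all_f = np.concatenate([fitness, new_fit])
    best = np.argsort(all_f)[:k]
    archive, fitness = all_s[best], all_f[best]

print("best cost:", fitness.min())
```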

Applications of Support Vector Machines on Smart Phone Systems for Emotional Speech Recognition

This study proposes an emotional speech recognition system for smart phone applications, combined with 3G mobile communications and social networks, to provide users and their groups with more interaction and care. The system uses support vector machines (SVM) to recognize speech emotions such as happiness, anger, sadness, and normal. It employs a hierarchical classifier to adjust the weights of acoustic features and divides the parameters into energy and frequency categories for training. In this study, 28 commonly used acoustic features, including pitch and volume, were used for training. In addition, a time-frequency parameter obtained by the continuous wavelet transform was used to identify accent and intonation within a sentence during recognition. The Berlin Database of Emotional Speech was used, divided into male and female data sets for training. According to the experimental results, the accuracies of the male and female test sets increased by 4.6% and 5.2%, respectively, after using the time-frequency parameter to classify happy and angry emotions. For the classification of all emotions, the average accuracy over male and female data was 63.5% for the test set and 90.9% for the whole data set.
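The following minimal sketch, using scikit-learn, shows one way a two-level hierarchical SVM of the kind described above could be arranged: a first SVM separates high-arousal from low-arousal utterances, and a second SVM per group distinguishes the emotions within it. The random feature matrix, the 28-dimensional feature layout, and the arousal grouping are assumptions for illustration, not the paper's actual features or hierarchy.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

# Hypothetical data: rows are utterances, columns are acoustic features
# (energy-related and frequency-related); labels are emotion strings.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 28))
y = rng.choice(["happy", "angry", "sad", "normal"], size=200)

# Level 1: separate high-arousal (happy/angry) from low-arousal (sad/normal)
arousal = np.where(np.isin(y, ["happy", "angry"]), "high", "low")
level1 = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X, arousal)

# Level 2: one SVM per arousal group, distinguishing the emotions inside it
level2 = {}
for grp in ("high", "low"):
    mask = arousal == grp
    level2[grp] = make_pipeline(StandardScaler(), SVC(kernel="rbf")).fit(X[mask], y[mask])

def predict(x):
    grp = level1.predict(x.reshape(1, -1))[0]
    return level2[grp].predict(x.reshape(1, -1))[0]

print(predict(X[0]))
```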

A Novel Strategy for Oriented Protein Immobilization

A new strategy for the oriented immobilization of proteins is proposed. The strategy consists of two steps. The first step is to search for a docking site on the protein surface away from the active site. The second step is to find a ligand able to bind the targeted site of the protein. To avoid ligand binding at the protein's active site, the targeted docking site is selected to carry charges opposite to those near the active site. To enhance ligand-protein binding, both hydrophobic and electrostatic interactions need to be involved, so the targeted docking site should also contain hydrophobic amino acids. The ligand is then selected with the help of molecular docking simulations. The enzyme α-amylase derived from Aspergillus oryzae (TAKA) was taken as an example for oriented immobilization. The active site of TAKA is surrounded by negatively charged amino acids. All possible hydrophobic sites on the surface of TAKA were evaluated by free energy estimation through benzene docking. A hydrophobic site on the side opposite TAKA's active site was found to carry a net positive charge. A possible ligand, 3,3',4,4'-biphenyltetracarboxylic acid (BPTA), was found to bind TAKA at the designated docking site. The BPTA molecules were then grafted onto silica gels, and the affinity of TAKA adsorption and the specific activity of the immobilized enzymes were measured. TAKA was found to have a dissociation constant as low as 7.0×10⁻⁶ M toward the ligand BPTA on silica gel. An increase in ionic strength had little effect on the adsorption of TAKA, which indicates the existence of hydrophobic interaction between the ligands and the proteins. The specific activity of the oriented immobilized TAKA was compared with that of TAKA randomly adsorbed on primary-amine-containing silica gel; the orderly immobilized TAKA showed a specific activity twice as high as that of the enzyme randomly adsorbed by ionic interaction.

Prevalence of Epstein-Barr Virus Latent Membrane Protein-1 in Jordanian Patients with Hodgkin's Lymphoma and Non-Hodgkin's Lymphoma

The aim of this study was to estimate the frequency of EBV infection in Hodgkin's lymphoma (HL) and non-Hodgkin's lymphoma (NHL) occurring in Jordanian patients. A total of 55 patients with lymphoma were examined, of whom 30 were diagnosed with HL and 25 with NHL. All four HL subtypes were observed, with the majority of cases exhibiting the mixed cellularity (MC) subtype, followed by nodular sclerosis (NS). High-grade disease was the most common NHL subtype in our sample, followed by low grade. The presence of EBV was detected by immunostaining for expression of latent membrane protein-1 (LMP-1). LMP-1 expression occurred more frequently in patients with HL (60.0%) than in patients with NHL (32.0%). The frequency of LMP-1 expression was also higher in patients with the MC subtype (61.11%) than in those with NS (28.57%). No age or gender difference in the occurrence of EBV infection was observed among patients with HL. By contrast, the prevalence of EBV infection in NHL patients aged below 50 was lower (16.66%) than in NHL patients aged 50 or above (46.15%). In addition, EBV infection was more frequent in females with NHL (38.46%) than in males with NHL (25%). In NHL cases, the frequency of EBV infection in intermediate-grade disease (60.0%) was high compared with that in low-grade (25%) or high-grade (25%) disease. In conclusion, the analysis of LMP-1 expression indicates an important role for this viral oncogene in the pathogenesis of EBV-associated malignant lymphomas. These data also support previous findings that people with EBV infection may develop lymphoma, and that efforts to reduce lymphoma risk should be considered for people with EBV infection.

A Method for 3D Mesh Adaptation in FEA

The use of mechanical simulation (in particular, finite element analysis) requires managing assumptions in order to analyse a real, complex system. In finite element analysis (FEA), two modeling steps require assumptions in order to carry out the computations and obtain results: building the physical model and building the simulation model. The simplification assumptions made on the analysed system in these two steps can generate two kinds of errors: physical modeling errors (mathematical model, domain simplifications, material properties, boundary conditions, and loads) and mesh discretization errors. This paper proposes a mesh adaptation method based on an h-adaptive scheme combined with an error estimator to choose the mesh of the simulation model. The method makes it possible to choose the mesh of the simulation model so as to control both the cost and the quality of the finite element analysis.
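To make the idea of an h-adaptive loop concrete, the sketch below solves a simple 1D Poisson problem with linear finite elements and refines elements whose derivative-jump indicator is large. The model problem, the jump-based indicator, and the marking threshold are illustrative assumptions, not the estimator or scheme proposed in the paper.

```python
import numpy as np

# Minimal 1D h-adaptive sketch for -u'' = f on (0, 1), u(0) = u(1) = 0,
# with linear finite elements and a derivative-jump error indicator.
f = lambda x: 100.0 * np.exp(-100.0 * (x - 0.5) ** 2)   # localized load (assumed)

def solve(nodes):
    """Assemble and solve the linear FE system on the given 1D mesh."""
    n = len(nodes)
    h = np.diff(nodes)
    A = np.zeros((n, n))
    b = np.zeros(n)
    for e in range(n - 1):
        k = 1.0 / h[e]
        A[e:e + 2, e:e + 2] += k * np.array([[1.0, -1.0], [-1.0, 1.0]])
        fe = 0.5 * h[e] * f(0.5 * (nodes[e] + nodes[e + 1]))   # midpoint rule
        b[e:e + 2] += fe
    A[0, :], A[-1, :] = 0.0, 0.0        # Dirichlet boundary conditions
    A[0, 0] = A[-1, -1] = 1.0
    b[0] = b[-1] = 0.0
    return np.linalg.solve(A, b)

nodes = np.linspace(0.0, 1.0, 6)
for it in range(8):
    u = solve(nodes)
    slope = np.diff(u) / np.diff(nodes)
    jump = np.abs(np.diff(slope))                 # derivative jump at interior nodes
    eta = np.zeros(len(nodes) - 1)                # per-element indicator
    eta[:-1] += jump
    eta[1:] += jump
    eta *= np.diff(nodes)
    marked = eta > 0.5 * eta.max()                # bulk-type marking (assumed)
    mids = 0.5 * (nodes[:-1] + nodes[1:])[marked]
    nodes = np.sort(np.concatenate([nodes, mids]))
    print(f"iteration {it}: {len(nodes)} nodes, max indicator {eta.max():.3e}")
```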

Multimodal Reasoning in a Knowledge Engineering Framework for Product Support

Problem solving has traditionally been one of the principal research areas of artificial intelligence. Yet, although artificial intelligence reasoning techniques have been employed in several product support systems, the benefit of integrating product support, knowledge engineering, and problem solving is still unclear. This paper studies the synergy of these areas and proposes a knowledge engineering framework that integrates product support systems and artificial intelligence techniques. The framework comprises four spaces: data, problem, hypothesis, and solution. The data space incorporates the knowledge needed for structured reasoning to take place, the problem space contains representations of problems, and the hypothesis space uses a multimodal reasoning approach to produce appropriate solutions in the form of virtual documents. The solution space serves as the gateway between the system and the user. The proposed framework enables product support systems to be developed in smaller, more manageable steps, while the combination of different reasoning techniques provides a way to overcome the lack of documentation resources.

Forest Growth Simulation: Tropical Rain Forest Stand Table Projection

This study describes tree growth for four species groups of commercial timber in the tropical rain forest of Koh Kong province, Cambodia. A simulation for these four groups was developed at 5-year intervals up to year 60. Data were obtained from twenty permanent sample plots over a period of thirteen years. The aim of this study was to develop a stand table simulation system of tree growth by species group. Five steps were involved in developing the tree growth simulation: aggregating the tree species into meaningful groups using cluster analysis; allocating the trees into diameter classes by species group; observing the diameter movement of each species group; calculating the diameter growth rate, mortality rate, and recruitment rate using mathematical formulae; and combining these parameters into a simulation equation. Results showed dissimilarity in diameter growth among the species groups.
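A minimal sketch of one stand-table projection step is given below: surviving stems in each diameter class either stay or move up one class, and recruits enter the smallest class. The class boundaries, upgrowth, mortality, and recruitment rates are invented for illustration and are not the values estimated from the Cambodian plots.

```python
import numpy as np

# Stand-table projection sketch for one species group, repeated over
# 5-year periods. All numbers below are illustrative assumptions.
classes = ["10-20", "20-30", "30-40", "40-50", "50+"]   # dbh classes (cm)
stems   = np.array([120.0, 80.0, 45.0, 20.0, 8.0])      # stems/ha per class

upgrowth    = np.array([0.15, 0.12, 0.10, 0.08, 0.0])   # share moving up one class
mortality   = np.array([0.04, 0.03, 0.03, 0.02, 0.02])  # share dying per period
recruitment = 10.0                                       # stems/ha entering class 1

def project(n, up, mort, rec):
    survivors = n * (1.0 - mort)
    moving = survivors * up
    staying = survivors - moving
    nxt = staying.copy()
    nxt[1:] += moving[:-1]          # trees that grow into the next class
    nxt[0] += rec                   # new recruits enter the smallest class
    return nxt

for period in range(12):            # 12 periods of 5 years = year 60
    stems = project(stems, upgrowth, mortality, recruitment)

for c, n in zip(classes, stems):
    print(f"{c:>6}: {n:6.1f} stems/ha")
```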

Enhancing Seamless Communication Through a User Co-designed Wearable Device

This work describes the process of developing seamless communication services and applications within a Telecom Italia long-term research project whose central aim is the design of a wearable communication device. In particular, the objective was to design a wrist phone integrated into people's everyday life with full transparency. The methodology used to design the wristwatch was developed through several subsequent steps, also involving the Personas Layering Framework. The data collected in these phases were very useful for designing an improved version of the first two wrist phone concepts, changing aspects related to the four critical points expressed by the users.

Solution of Density Dependent Nonlinear Reaction-Diffusion Equation Using Differential Quadrature Method

In this study, the density-dependent nonlinear reaction-diffusion equation, which arises in insect dispersal models, is solved using the combined application of the differential quadrature method (DQM) and the implicit Euler method. The polynomial-based DQM is used to discretize the spatial derivatives of the problem. The resulting time-dependent nonlinear system of ordinary differential equations (ODEs) is solved using the implicit Euler method. The computations are carried out for a Cauchy problem defined by a one-dimensional density-dependent nonlinear reaction-diffusion equation that has an exact solution. The DQM solution is found to be in very good agreement with the exact solution in terms of maximum absolute error, and it exhibits superior accuracy at large time levels tending to steady state. Furthermore, using an implicit method in the solution procedure leads to stable solutions, so larger time steps can be used.
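The sketch below illustrates the general approach: a polynomial DQM differentiation matrix is built on Chebyshev-Gauss-Lobatto nodes, and a density-dependent equation of the assumed form u_t = (u u_x)_x + u(1 - u) is advanced with an implicit Euler step in which the nonlinear diffusion coefficient is lagged so that each step reduces to a linear solve. The equation, grid, time step, and linearization are illustrative assumptions rather than the exact formulation of the paper.

```python
import numpy as np

N = 21
i = np.arange(N)
x = 0.5 * (1.0 - np.cos(np.pi * i / (N - 1)))      # Chebyshev-Gauss-Lobatto nodes

# Polynomial DQM first-derivative weights (Quan-Chang/Shu formulation)
M = np.array([np.prod(x[k] - np.delete(x, k)) for k in range(N)])
D = np.zeros((N, N))
for a in range(N):
    for b in range(N):
        if a != b:
            D[a, b] = M[a] / ((x[a] - x[b]) * M[b])
    D[a, a] = -np.sum(D[a, np.arange(N) != a])

dt, steps = 1e-3, 200
u = np.exp(-50.0 * (x - 0.3) ** 2)                 # assumed initial density

for _ in range(steps):
    # Lagged-coefficient implicit Euler: solve (I - dt*A(u)) u_new = u + dt*u*(1-u),
    # where A(u) v approximates (u v_x)_x with the old density u.
    A = D @ (np.diag(u) @ D)
    LHS = np.eye(N) - dt * A
    rhs_vec = u + dt * u * (1.0 - u)
    LHS[0, :] = 0.0;  LHS[0, 0] = 1.0;   rhs_vec[0] = 0.0     # Dirichlet BCs
    LHS[-1, :] = 0.0; LHS[-1, -1] = 1.0; rhs_vec[-1] = 0.0
    u = np.linalg.solve(LHS, rhs_vec)

print("max density:", u.max())
```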

Methods for Case Maintenance in Case-Based Reasoning

Case-Based Reasoning (CBR) is a machine learning approach to problem solving and learning that has attracted considerable attention over the last few years. In general, CBR consists of four main phases: retrieve the most similar case or cases, reuse the case to solve the problem, revise or adapt the proposed solution, and retain the learned case by returning it to the case base for future learning. Unfortunately, in many cases, this retain process causes uncontrolled case-base growth, which affects the competence and performance of CBR systems. This paper proposes a competence-based maintenance method for CBR based on a deletion policy. The method has three main steps: Step 1, formulate the problems; Step 2, determine the coverage and reachability sets based on coverage values; Step 3, reduce the case-base size. The results obtained show that the proposed method performs better than the existing methods discussed in the literature.
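As an illustration of Steps 2 and 3, the sketch below computes coverage and reachability sets under a simple distance-threshold notion of "solves", then deletes cases whose coverage is already subsumed by another retained case. The data, threshold, and subset-based deletion rule are assumptions for illustration, not the paper's exact policy.

```python
import numpy as np

# Cases are feature vectors; a case is assumed to "solve" another if their
# distance is below an adaptation threshold tau (illustrative assumption).
rng = np.random.default_rng(2)
cases = rng.uniform(size=(60, 2))
tau = 0.15

def solves(a, b):
    return np.linalg.norm(cases[a] - cases[b]) <= tau

n = len(cases)
coverage = {c: {t for t in range(n) if solves(c, t)} for c in range(n)}
reachability = {c: {s for s in range(n) if solves(s, c)} for c in range(n)}

# Deletion policy: a case is auxiliary (safe to delete) if some other retained
# case already covers everything it covers.
retained = set(range(n))
for c in sorted(range(n), key=lambda c: len(coverage[c])):        # weakest first
    others = retained - {c}
    if any(coverage[c] <= coverage[o] for o in others):
        retained.discard(c)

print(f"case base reduced from {n} to {len(retained)} cases")
```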

Comparison of MFCC and Cepstral Coefficients as a Feature Set for PCG Biometric Systems

Heart sound is an acoustic signal, and many techniques used nowadays for heart-sound-based human recognition borrow from speech recognition. One popular choice for feature extraction from acoustic signals is the Mel Frequency Cepstral Coefficients (MFCC), which map the signal onto a non-linear mel scale that mimics human hearing. However, the mel scale is almost linear in the frequency region of heart sounds and should therefore produce results similar to the standard cepstral coefficients (CC). In this paper, MFCC is investigated to see whether it produces superior results for a PCG-based human identification system compared with CC. Results show that the MFCC system is still superior to CC despite the near-linear filter banks in the lower frequency range, giving up to 95% correct recognition for MFCC and 90% for CC. Further experiments show that the high recognition rate is due to the implementation of the filter banks and not to mel scaling.
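For reference, the sketch below computes both feature sets on one frame of a synthetic heart-sound-like signal: plain cepstral coefficients from the log magnitude spectrum, and MFCC from a triangular mel filter bank followed by a DCT. The synthetic frame, sampling rate, and filter-bank settings are illustrative assumptions, not the paper's experimental setup.

```python
import numpy as np
from scipy.fft import dct

fs = 2000                                         # PCG energy lies mostly below ~600 Hz
t = np.arange(0, 0.2, 1 / fs)
frame = np.sin(2 * np.pi * 60 * t) * np.exp(-20 * t) + 0.01 * np.random.randn(len(t))

n_fft = 512
spec = np.abs(np.fft.rfft(frame * np.hamming(len(frame)), n_fft))

# Plain cepstral coefficients: DCT of the log magnitude spectrum
cc = dct(np.log(spec + 1e-10), norm="ortho")[:13]

# MFCC: triangular mel filter bank, then DCT of log filter-bank energies
def hz_to_mel(f): return 2595.0 * np.log10(1.0 + f / 700.0)
def mel_to_hz(m): return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

n_filt = 20
mel_pts = np.linspace(hz_to_mel(0), hz_to_mel(fs / 2), n_filt + 2)
bins = np.floor((n_fft + 1) * mel_to_hz(mel_pts) / fs).astype(int)
fbank = np.zeros((n_filt, n_fft // 2 + 1))
for m in range(1, n_filt + 1):
    l, c, r = bins[m - 1], bins[m], bins[m + 1]
    fbank[m - 1, l:c] = (np.arange(l, c) - l) / max(c - l, 1)
    fbank[m - 1, c:r] = (r - np.arange(c, r)) / max(r - c, 1)

mfcc = dct(np.log(fbank @ (spec ** 2) + 1e-10), norm="ortho")[:13]
print("CC[:5]  ", np.round(cc[:5], 3))
print("MFCC[:5]", np.round(mfcc[:5], 3))
```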

Towards a Biometric Card in Romania: Person Identification by Face, Fingerprint and Voice Recognition

In this paper, three different approaches to person verification and identification, namely fingerprint, face, and voice recognition, are studied. Face recognition uses parts-based representation methods and a manifold learning approach; the assessment criterion is recognition accuracy. The techniques under investigation are: a) Local Non-negative Matrix Factorization (LNMF); b) Independent Component Analysis (ICA); c) NMF with sparseness constraints (NMFsc); d) Locality Preserving Projections (Laplacianfaces). Fingerprint detection is approached by classical minutiae (small graphical patterns) matching through image segmentation, using a structural approach and a neural network as the decision block. For voice/speaker recognition, mel cepstral and delta-delta mel cepstral analysis are used as the main methods to construct a supervised, speaker-dependent voice recognition system. The final decision (e.g., "accept/reject" for a verification task) is taken by applying a majority voting technique to the three biometrics. The preliminary results, obtained for medium-sized databases of fingerprints, faces, and voice recordings, indicate the feasibility of our study, with an overall recognition precision of about 92% permitting the use of our system for a future complex biometric card.
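Decision-level fusion of the three modalities can be as simple as the following sketch, where each module's accept/reject output is combined by majority vote; the example decisions are purely illustrative.

```python
from collections import Counter

# Each biometric module returns "accept" or "reject" for a verification
# attempt; the final decision is the majority vote of the three outputs.
def majority_vote(decisions):
    return Counter(decisions).most_common(1)[0][0]

face, fingerprint, voice = "accept", "reject", "accept"   # illustrative outputs
print(majority_vote([face, fingerprint, voice]))          # -> accept
```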

Implicit Two-Step Continuous Hybrid Block Methods with Four Off-Step Points for Solving Stiff Ordinary Differential Equations

In this paper, a self-starting two-step continuous block hybrid formula (CBHF) with four off-step points is developed using collocation and interpolation procedures. The CBHF is then used to produce multiple numerical integrators of uniform order, which are assembled into a single block matrix equation. These equations are applied simultaneously to provide the approximate solution of stiff ordinary differential equations. The order of accuracy and the stability of the block method are discussed, and its accuracy is established numerically.

Limitations of the Analytic Hierarchy Process Technique with Respect to Geographically Distributed Stakeholders

The selection of appropriate requirements for product releases can make a big difference in a product's success. Requirements are selected using requirements prioritization techniques, which are based on pre-defined, systematic steps for calculating the relative weights of requirements. Prioritization is complicated by new development settings, as development shifts from traditional co-located development to geographically distributed development. Stakeholders connected to a project may be distributed all over the world, and this geographical distribution makes it hard to prioritize requirements, since each stakeholder has his or her own perception of and expectations for the requirements in a software project. This paper discusses the limitations of the Analytic Hierarchy Process (AHP) with respect to requirements prioritization by geographically distributed stakeholders (GDS). It also provides a solution, in the form of a modified AHP, for prioritizing requirements for GDS. We conduct two experiments and analyze the results in order to discuss the limitations of AHP with respect to GDS; the modified AHP variant is also validated.
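For context, the sketch below shows the core AHP computation: stakeholder pairwise comparison matrices are aggregated (here by an element-wise geometric mean, a common group-AHP choice that is not necessarily the modification proposed in this paper), priorities are taken from the principal eigenvector, and a consistency ratio is checked. The judgment matrices are invented for illustration.

```python
import numpy as np

# Two hypothetical stakeholders compare four requirements pairwise (Saaty scale).
A1 = np.array([[1, 3, 5, 7],
               [1/3, 1, 3, 5],
               [1/5, 1/3, 1, 3],
               [1/7, 1/5, 1/3, 1.0]])
A2 = np.array([[1, 1/3, 3, 5],
               [3, 1, 5, 7],
               [1/3, 1/5, 1, 3],
               [1/5, 1/7, 1/3, 1.0]])

# Aggregate distributed stakeholders' judgments by element-wise geometric mean
group = np.exp((np.log(A1) + np.log(A2)) / 2)

vals, vecs = np.linalg.eig(group)
k = np.argmax(vals.real)
w = np.abs(vecs[:, k].real)
w /= w.sum()                                         # requirement priorities

n = group.shape[0]
ci = (vals.real[k] - n) / (n - 1)                    # consistency index
cr = ci / 0.9                                        # random index RI = 0.9 for n = 4
print("priorities:", np.round(w, 3), "CR =", round(cr, 3))
```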

Challenges to Enable Quick Start of Environmental Monitoring with Wireless Sensor Network Technology

With the advancement of wireless sensor network technology, its practical utilization is becoming an important challenge. This paper gives an overview of our past environmental monitoring project and discusses the process of starting such monitoring by classifying it into four steps. The steps required to start environmental monitoring can be complicated, yet they are not well discussed by researchers in wireless sensor network technology. This paper describes our activities and challenges in each of the four steps to ease the process, and discusses future challenges in enabling a quick start of environmental monitoring.

Project Management and Software Development Processes: Integrating PMBOK and OPEN

Software organizations are constantly looking for better solutions when designing and using well-defined software processes for the development of their products and services. However, while the technical aspects are comparatively easy to arrange, many software development processes lack adequate support for project management issues. When adopting such processes, an organization needs to apply good project management skills along with the technical views provided by those models. This research proposes a new model that integrates the concepts of the PMBOK with those available in the OPEN metamodel, supporting not only process integration but also the steps towards a more comprehensive and automatable model.

The Role of the Dominant Party of the Republic of Kazakhstan and China's Ruling Party in a Country's Modernization: Similarities and Differences

The purpose of this work is to identify the positive and negative aspects of the parties' participation in their countries' modernization, which in turn will help a country determine the steps necessary to improve its socio-economic development. The article considers the role of the dominant party of Kazakhstan and the ruling party of China in each country's modernization. A comparative analysis reveals differences between the People's Democratic Party "Nur Otan" and the Communist Party of China. The modernization policy and the main actions taken by the political parties of both countries to implement modernization are discussed.

3D Dynamic Representation System for the Human Head

Representations of the human head are usually based on the morphological and structural components of a real model. Over time, it has become more and more necessary to create full virtual models that comply rigorously with the specifications of human anatomy. Still, making and using a model that fits the real anatomy perfectly is a difficult task, because it requires large hardware resources and significant processing time. It is therefore necessary to choose the best compromise solution, one that keeps the right balance between perfection of detail and resource consumption, in order to obtain facial animations with real-time rendering. We present here the way in which we built such a 3D system, which we intend to use as a starting point for creating facial animations with real-time rendering, used in medicine to find and identify different types of pathologies.

Exponential Particle Swarm Optimization Approach for Improving Data Clustering

In this paper, we use exponential particle swarm optimization (EPSO) to cluster data. We compare the EPSO clustering algorithm, which uses an exponentially varying inertia weight, with the particle swarm optimization (PSO) clustering algorithm, which uses a linearly varying inertia weight. The comparison is evaluated on five data sets. The experimental results show that the EPSO clustering algorithm increases the chance of finding optimal positions, as it decreases the number of failures. They also show that the EPSO clustering algorithm has a smaller quantization error than the PSO clustering algorithm, i.e., EPSO clustering is more accurate than PSO clustering.
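The sketch below contrasts the two inertia-weight schedules in a small PSO-based clustering run, where each particle encodes a set of centroids and fitness is the quantization error. The synthetic data, schedule constants, and PSO parameters are assumptions for illustration and are not the settings used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic 2D data with three clusters (illustrative only)
X = np.vstack([rng.normal(c, 0.3, size=(50, 2)) for c in ((0, 0), (3, 3), (0, 3))])
k, n_particles, iters = 3, 20, 100
c1 = c2 = 1.5
w0, w1 = 0.9, 0.4                      # initial and final inertia weights

def quantization_error(centroids):
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return d.min(axis=1).mean()

def pso(inertia):
    pos = rng.uniform(X.min(), X.max(), size=(n_particles, k, 2))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([quantization_error(p) for p in pos])
    gbest = pbest[np.argmin(pbest_f)].copy()
    for t in range(iters):
        w = inertia(t)
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        f = np.array([quantization_error(p) for p in pos])
        improved = f < pbest_f
        pbest[improved], pbest_f[improved] = pos[improved], f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()
    return pbest_f.min()

linear = lambda t: w0 - (w0 - w1) * t / iters
exponential = lambda t: w1 + (w0 - w1) * np.exp(-3.0 * t / iters)
print("linear inertia     :", round(pso(linear), 4))
print("exponential inertia:", round(pso(exponential), 4))
```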

Long-Term Simulation of Digestive Sound Signals by Cepstral Technique

In this study, digestive diseases are investigated using sound as the detection medium. After preprocessing, the extracted signal is registered in the cepstrum domain. After the digestive diseases have been classified, the system selects random samples based on their features and generates the non-stationary, long-term signals of interest via an inverse transform in the cepstral domain; these are presented in digital and audio form as the output. The structure is updatable; in other words, when a new signal is received, the corresponding disease classification is updated in the feature domain.
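A rough sketch of cepstral-domain analysis and regeneration is given below: the real cepstrum of a short segment is computed, low-quefrency coefficients are kept as the feature, and a longer signal is synthesized frame by frame from the smoothed spectral envelope with random phase. The toy segment, frame sizes, and number of retained coefficients are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(4)
fs, frame = 4000, 512
t = np.arange(frame) / fs
# Toy "digestive sound" segment: a damped tone plus noise (illustrative only)
segment = np.sin(2 * np.pi * 150 * t) * np.hanning(frame) + 0.05 * rng.standard_normal(frame)

spec = np.fft.rfft(segment, frame)
cepstrum = np.fft.irfft(np.log(np.abs(spec) + 1e-10), frame)   # real cepstrum

n_keep = 30                                    # keep low-quefrency coefficients
lifter = np.zeros(frame)
lifter[:n_keep] = 1.0
lifter[-n_keep + 1:] = 1.0                     # symmetric counterpart
smooth_log_mag = np.fft.rfft(cepstrum * lifter, frame).real    # smoothed log |X|

# Regenerate a long signal by overlap-adding frames with random phase
n_frames, hop = 60, frame // 2
out = np.zeros(n_frames * hop + frame)
for i in range(n_frames):
    phase = rng.uniform(-np.pi, np.pi, size=len(smooth_log_mag))
    frame_spec = np.exp(smooth_log_mag) * np.exp(1j * phase)
    x = np.fft.irfft(frame_spec, frame) * np.hanning(frame)
    out[i * hop: i * hop + frame] += x

print("generated", len(out) / fs, "seconds of signal")
```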