Detection of Ultrasonic Images in the Presence of a Random Number of Scatterers: A Statistical Learning Approach

The support vector machine (SVM) is a statistical learning tool initially developed by Vapnik in 1979 and later extended into the broader framework of structural risk minimization (SRM). SVMs play an increasing role in detection problems across engineering, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, the SVM was applied to the detection of medical ultrasound images in the presence of partially developed speckle noise. Simulations were performed for single-look and multi-look speckle models to give a complete overview of, and insight into, the proposed SVM-based detector. The structure of the SVM was derived and applied to clinical ultrasound images, and its performance was evaluated in terms of the mean square error (MSE) metric. We show that the SVM-detected ultrasound images have a very low MSE and are of good quality. The quality of the processed speckled images improved for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (detection hypotheses) in the original images.
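As an illustration of the kind of detector described above, the following sketch (not the paper's exact formulation) trains a kernel SVM to separate two reflectivity levels from speckled pixel neighborhoods and reports the MSE of the detected image; the synthetic image, the multi-look parameter L, the 3x3 patch size, and the SVM hyperparameters are all assumptions made only for this example.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Synthetic "clean" image with two reflectivity levels (the two hypotheses).
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0

# Multiplicative speckle: L-look gamma model with unit mean (assumed L = 4).
L = 4
speckle = rng.gamma(shape=L, scale=1.0 / L, size=clean.shape)
noisy = (clean + 0.2) * speckle  # offset so the background is not all zeros

def patches(img, k=1):
    """Collect (2k+1)x(2k+1) pixel neighborhoods as feature vectors."""
    p = np.pad(img, k, mode="edge")
    feats = []
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            feats.append(p[i:i + 2 * k + 1, j:j + 2 * k + 1].ravel())
    return np.array(feats)

X = patches(noisy)
y = (clean.ravel() > 0.5).astype(int)

# Train on a random subset of pixels, then detect the whole image.
idx = rng.choice(len(y), size=1000, replace=False)
svm = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X[idx], y[idx])
detected = svm.predict(X).reshape(clean.shape)

mse = np.mean((detected - clean) ** 2)
print(f"MSE of SVM-detected image: {mse:.4f}")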

A Fitted Random Sampling Scheme for Load Distribution in Grid Networks

Grid networks provide the ability to perform higher-throughput computing by taking advantage of many networked computers' resources to solve large-scale computation problems. As the popularity of Grid networks has increased, there is a need to distribute the load efficiently among the resources accessible on the network. In this paper, we present a stochastic network system that provides a distributed load-balancing scheme by generating almost regular networks. This network system is self-organized and depends only on local information for load distribution and resource discovery. The in-degree of each node reflects its free resources, and the job assignment and resource discovery processes required for load balancing are accomplished by fitted random sampling. Simulation results show that the generated network system provides an effective, scalable, and reliable load-balancing scheme for the distributed resources accessible on Grid networks.
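As a rough sketch of the sampling step, and under the assumption that "fitted random sampling" here means drawing a target node with probability proportional to its advertised free resources (its in-degree in the overlay), a job-assignment routine could look as follows; the node count and resource values are invented for illustration.

import random

class Node:
    def __init__(self, name, free_slots):
        self.name = name
        self.free_slots = free_slots  # advertised free resources / in-degree

def assign_job(nodes):
    """Pick a node by weighted (fitted) random sampling over free slots."""
    candidates = [n for n in nodes if n.free_slots > 0]
    if not candidates:
        return None
    weights = [n.free_slots for n in candidates]
    chosen = random.choices(candidates, weights=weights, k=1)[0]
    chosen.free_slots -= 1  # the assigned job consumes one slot
    return chosen

nodes = [Node(f"node{i}", random.randint(0, 8)) for i in range(10)]
for job in range(20):
    target = assign_job(nodes)
    if target:
        print(f"job {job} -> {target.name}")
    else:
        print(f"job {job} queued: no free resources")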

An Approach to Solving a Permutation Problem of Frequency Domain Independent Component Analysis for Blind Source Separation of Speech Signals

Independent component analysis (ICA) in the frequency domain is used for solving the problem of blind source separation (BSS). However, this method has some problems. For example, a general ICA algorithm cannot determine the permutation of the separated signals, which is crucial in frequency-domain ICA. In this paper, we propose an approach to solving this permutation problem. The idea is to combine two conventional approaches effectively, so that their complementary features improve the separation performance. We show simulation results using artificial data.
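One of the conventional approaches that such a combination can draw on is permutation alignment via the correlation of amplitude envelopes across frequency bins. The sketch below illustrates only that alignment step on simulated per-bin ICA outputs; the envelopes, noise level, and bin count are arbitrary choices, not the paper's settings.

import numpy as np

rng = np.random.default_rng(1)
n_bins, n_frames = 64, 200
t = np.arange(n_frames)

# Two clearly different "true" amplitude envelopes shared by all bins.
env = np.vstack([np.abs(np.sin(2 * np.pi * t / 40.0)),
                 (t % 60 < 30).astype(float) + 0.2])

# Simulated per-bin ICA outputs: both envelopes plus noise, with an unknown
# random permutation in each frequency bin.
separated = np.empty((n_bins, 2, n_frames))
perm = rng.integers(0, 2, size=n_bins)
for f in range(n_bins):
    order = [perm[f], 1 - perm[f]]
    separated[f] = env[order] + 0.1 * rng.standard_normal((2, n_frames))

# Alignment: correlate each bin's envelopes with a running reference built
# from the bins aligned so far, and swap the two outputs when that helps.
aligned = separated.copy()
reference = aligned[0].copy()
for f in range(1, n_bins):
    keep = (np.corrcoef(aligned[f, 0], reference[0])[0, 1]
            + np.corrcoef(aligned[f, 1], reference[1])[0, 1])
    swap = (np.corrcoef(aligned[f, 0], reference[1])[0, 1]
            + np.corrcoef(aligned[f, 1], reference[0])[0, 1])
    if swap > keep:
        aligned[f] = separated[f, ::-1]
    reference = (reference * f + aligned[f]) / (f + 1)

# After alignment, output 0 should track the same source in every bin.
consistent = np.mean([np.corrcoef(aligned[f, 0], aligned[0, 0])[0, 1] > 0.5
                      for f in range(n_bins)])
print(f"fraction of consistently aligned bins: {consistent:.2f}")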

Approximate Solution of Nonlinear Fredholm Integral Equations of the First Kind via Converting to Optimization Problems

In this paper we introduce an approach based on optimization methods to find approximate solutions for nonlinear Fredholm integral equations of the first kind. For this purpose, we consider two stages of approximation: we first convert the integral equation to a moment problem, and we then recast the new problem as two classes of optimization problems, namely unconstrained optimization problems and optimal control problems. Finally, numerical examples are presented.
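A minimal sketch of the "convert to an optimization problem" idea, under illustrative assumptions (a smooth kernel K(x, t) = e^{xt}, a trapezoidal quadrature rule, and data synthesized from the known solution u(t) = t), is given below; because first-kind equations are ill-posed, a small smoothing term is added to the objective.

import numpy as np
from scipy.optimize import minimize

n = 21
t = np.linspace(0.0, 1.0, n)
x = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / (n - 1)); w[[0, -1]] = 0.5 / (n - 1)   # trapezoidal weights

K = np.exp(np.outer(x, t))        # illustrative kernel K(x, t) = e^{xt}
u_true = t                        # "unknown" solution used to synthesize the data
f = K @ (w * u_true**2)           # right-hand side consistent with u_true

def objective(u):
    residual = K @ (w * u**2) - f
    # First-kind equations are ill-posed, so a mild smoothing penalty is added;
    # the recovered accuracy depends on this regularization weight.
    return np.sum(residual**2) + 1e-6 * np.sum(np.diff(u)**2)

res = minimize(objective, x0=np.full(n, 0.5), method="L-BFGS-B",
               bounds=[(0.0, 2.0)] * n)
print("objective at the minimizer:", objective(res.x))
print("max deviation from u(t) = t:", np.max(np.abs(res.x - t)))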

A Sub-Pixel Image Registration Technique with Applications to Defect Detection

This paper presents a useful sub-pixel image registration method using line segments and a sub-pixel edge detector. In this approach, straight line segments are first extracted from gray images at the pixel level before applying the sub-pixel edge detector. Next, all sub-pixel line edges are mapped onto the orientation-distance parameter space to solve for line correspondence between images. Finally, the registration parameters with sub-pixel accuracy are solved analytically via two linear least-squares problems. The present approach can be applied to various fields where fast registration with sub-pixel accuracy is required. To illustrate, it is applied to the inspection of printed circuits on a flat panel. A numerical example shows that the approach is effective and accurate when target images contain a sufficient number of line segments, which is the case in many industrial problems.
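The two least-squares problems can be pictured as follows: a hedged sketch that estimates the rotation from the orientation differences of matched lines and the translation from their distance differences, using synthetic line correspondences in (orientation, distance) form rather than lines extracted from real images.

import numpy as np

rng = np.random.default_rng(2)

# True rigid transform applied to the "template" lines.
alpha_true, t_true = np.deg2rad(3.0), np.array([1.7, -0.4])

theta = rng.uniform(0, np.pi, 12)          # line normal orientations
rho = rng.uniform(-50, 50, 12)             # signed distances to the origin

# Matched lines in the target image, with small sub-pixel noise.
theta_img = theta + alpha_true + 0.001 * rng.standard_normal(12)
rho_img = (rho + t_true[0] * np.cos(theta_img)
               + t_true[1] * np.sin(theta_img)
               + 0.05 * rng.standard_normal(12))

# Least-squares problem 1: rotation from the orientation differences.
alpha_est = np.mean(theta_img - theta)

# Least-squares problem 2: translation from the distance differences.
A = np.column_stack([np.cos(theta_img), np.sin(theta_img)])
b = rho_img - rho
t_est, *_ = np.linalg.lstsq(A, b, rcond=None)

print("rotation error (deg):", np.rad2deg(abs(alpha_est - alpha_true)))
print("translation estimate:", t_est)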

Possible Futures for Doctoral Research Training in Design

In this paper, we argue that Design research is fundamental to countries' national productivity and competitiveness agendas, while at the same time the vagaries of research training present one of the barriers faced by Design Higher Degree by Research students in engaging with those agendas. We argue that, given industry requirements for research-trained recruits, students have the right to expect that research training will provide the foundations of a successful career on an academic, research, or professional pathway, but that universities have yet to address problems in their provision of research training for Design doctoral students. We suggest that rigorous research on the provision of doctoral programs in Design would serve to inform future activities in Design research in productive ways.

Modeling Language for Machine Learning

Designing an efficient algorithm for each specific problem has long been the usual line of study. An alternative approach, orthogonal to this one, has emerged: the reduction. In general, for a given specific problem, the reduction approach studies how to convert the original problem into subproblems. This paper proposes a formal modeling language to support this reduction approach. We show three examples from the wide area of learning problems. The benefit is fast prototyping of algorithms for a given new problem.

Gap Analysis of Cassava Sector in Cameroon

Recently, cassava has been a driving force of economic progress in many developing countries. To attain this level, prerequisites were put in place enabling the cassava sector to become an industrial and highly competitive crop. Cameroon can achieve the same results. Moreover, it can upgrade the living conditions of both rural and urban dwellers and stimulate the development of the whole economy. Achieving this outcome calls for agricultural policy reforms, and the adoption and implementation of adequate policies must go along with efficient strategies. To choose effective strategies, an in-depth investigation of the sector's problems is highly recommended. This paper uses the gap analysis method to evaluate the cassava sector in Cameroon. It studies the present situation (where the sector is now), interrogates the future (where it should be), and finally proposes solutions to fill the gap.

Automated Process Quality Monitoring with Prediction of Fault Condition Using Measurement Data

Detection of incipient abnormal events is important for improving the safety and reliability of machine operations and for reducing losses caused by failures. Improper set-up or alignment of parts often leads to severe problems in many machines. The construction of models for predicting faulty conditions is therefore essential in deciding when to perform machine maintenance. This paper presents a multivariate calibration monitoring approach based on the statistical analysis of machine measurement data. The calibration model is used to predict two faulty conditions from historical reference data. The approach utilizes genetic algorithm (GA) based variable selection, and we evaluate the predictive performance of several prediction methods using real data. The results show that the calibration model based on supervised probabilistic principal component analysis (SPPCA) yielded the best performance in this work. By adopting a proper variable selection scheme, the prediction performance of calibration models can be improved by excluding non-informative variables from the model-building step.
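To make the wrapper idea concrete, the sketch below runs a small genetic algorithm over binary variable masks and scores each mask by the cross-validated error of a calibration model; since SPPCA is not available off the shelf, partial least squares stands in for it, and the synthetic data, population size, and mutation rate are illustrative assumptions only.

import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_samples, n_vars = 120, 30
X = rng.standard_normal((n_samples, n_vars))
y = X[:, :5] @ rng.standard_normal(5) + 0.1 * rng.standard_normal(n_samples)

def fitness(mask):
    """Cross-validated score of the calibration model on the selected variables."""
    if mask.sum() < 2:
        return -np.inf
    model = PLSRegression(n_components=2)     # stand-in for the SPPCA calibration model
    return cross_val_score(model, X[:, mask.astype(bool)], y, cv=5,
                           scoring="neg_mean_squared_error").mean()

pop = rng.integers(0, 2, size=(20, n_vars))    # population of binary variable masks
for gen in range(30):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[::-1][:10]]          # truncation selection
    cut = rng.integers(1, n_vars, size=10)
    children = np.array([np.concatenate([parents[i][:c],
                                         parents[(i + 1) % 10][c:]])
                         for i, c in enumerate(cut)])     # one-point crossover
    flip = rng.random(children.shape) < 0.02              # bit-flip mutation
    children = np.where(flip, 1 - children, children)
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(m) for m in pop])]
print("selected variables:", np.flatnonzero(best))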

Job Scheduling and Worker Assignment Problem to Minimize Makespan Using an Ant Colony Optimization Metaheuristic

This article proposes an Ant Colony Optimization (ACO) metaheuristic to minimize the total makespan when scheduling a set of jobs and assigning workers to uniformly related parallel machines. An algorithm based on ACO has been developed and coded in Matlab® to solve this problem. The paper explains the steps required to apply the ant colony approach to the worker assignment and job scheduling problem in a parallel machine model, and aims to evaluate the strength of ACO compared with other conventional approaches. A data set containing 100 problems (12 jobs, 3 machines, and 10 workers), available on the internet, has been solved with this ACO algorithm. Our ACO-based algorithm showed drastically improved results, especially in terms of the negligible CPU effort required to reach the optimal solution. In our case, the time taken to solve all 100 problems is even less than the average time other conventional approaches, such as the GA algorithm and SPT-A/LMC heuristics, take to solve a single problem in the data set.
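A compact sketch of the ACO idea, reduced to assigning jobs to identical parallel machines (worker assignment and machine speeds are omitted for brevity), is given below; the processing times, colony size, evaporation rate, and pheromone weights are illustrative values, not those of the article.

import numpy as np

rng = np.random.default_rng(4)
n_jobs, n_machines = 12, 3
p = rng.integers(2, 10, size=n_jobs).astype(float)    # processing times

tau = np.ones((n_jobs, n_machines))                   # pheromone trails
best_assign, best_makespan = None, np.inf

for iteration in range(100):
    for ant in range(10):
        load = np.zeros(n_machines)
        assign = np.empty(n_jobs, dtype=int)
        for j in rng.permutation(n_jobs):
            heuristic = 1.0 / (load + p[j])           # prefer lightly loaded machines
            prob = (tau[j] ** 1.0) * (heuristic ** 2.0)
            prob /= prob.sum()
            m = rng.choice(n_machines, p=prob)        # probabilistic machine choice
            assign[j] = m
            load[m] += p[j]
        makespan = load.max()
        if makespan < best_makespan:
            best_assign, best_makespan = assign.copy(), makespan
    tau *= 0.9                                        # pheromone evaporation
    tau[np.arange(n_jobs), best_assign] += 1.0 / best_makespan  # reinforce best tour

print("best makespan:", best_makespan)
print("assignment:", best_assign)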

An Ontology for Knowledge Representation and Applications

Ontology is a term used in artificial intelligence with different meanings. Ontology research plays an important role in computer science and practical applications, especially in distributed knowledge systems. In this paper we present an ontology called the Computational Object Knowledge Base Ontology. It has been used in designing several knowledge-base systems for solving problems, such as a system that supports studying knowledge and solving analytic geometry problems, a program for studying and solving problems in plane geometry, and a knowledge system for linear algebra.

An Experimentally Validated Thermo-Mechanical Finite Element Model for Friction Stir Welding in Carbon Steels

Solidification cracking and hydrogen cracking are among the defects generated in the fusion welding of ultrahigh carbon steels. Friction stir welding (FSW) of such steels, being a solid-state technique, has been demonstrated to alleviate the problems encountered in traditional welding. FSW involves several process parameters that must be carefully defined prior to processing; these include, but are not restricted to, tool feed, tool RPM, tool geometry, and tool tilt angle. These parameters are a key factor in avoiding wormholes and voids behind the tool and in achieving a defect-free weld. More importantly, they directly affect the microstructure of the weld and hence its final mechanical properties. To this end, a 3D thermo-mechanical finite element (FE) model was developed using DEFORM 3D to simulate the FSW of carbon steel. At points of interest in the joint, the history of critical state variables such as temperature, stress, and strain rate is tracked. Typical results include the ability to simulate the different weld zones. Simulation predictions were successfully compared with experimental FSW tests. It is believed that such a numerical model can be used to optimize FSW processing parameters in favor of a desirable, defect-free weld with better mechanical properties.

User Satisfaction Issues in ERP Projects

Over the past few years, companies in developing countries have implemented enterprise resource planning (ERP) systems. Regardless of the various benefits of ERP systems, their adoption and implementation have not been without problems. Many companies have assigned considerable organizational resources to their ERP projects, but have encountered unexpected challenges. Neglecting a number of important factors in ERP projects might lead to failure instead of success. User satisfaction is among the factors that have a major influence on ERP implementation success. This paper therefore investigates the key factors that create ERP user satisfaction and examines whether that satisfaction varies across different user profiles. The study was conducted using a survey questionnaire distributed to ERP users in Iranian organizations. A total of 384 responses were collected and analyzed. The findings indicate that younger ERP users tend to be more satisfied with ERP systems. Furthermore, ERP users with more IT experience, as well as more educated users, are more satisfied with ERP software. However, the study found no satisfaction differences between male and female users.

Thai Prosody Problems with First Year Students

The Thai language is difficult in all four language skills, especially reading. First-year students may differ in reading ability, so a teacher needs to establish each student's reading level in order to help and support them until they can develop and resolve each problem themselves. This research aims to study prosody problems among Thai students, focusing on first-year students in the second semester. A total of 58 students were involved in this study. Four obstacles were found: (1) interpreting what they read and write; (2) incorrect pronunciation of prosody; (3) incorrect rhythm of the poem; and (4) incorrect pronunciation of Thai poems.

Health Hazards Related to Computer Use: Experience of the National Institute for Medical Research in Tanzania

This paper is based on a study conducted in 2006 to assess the impact of computer usage on the health of National Institute for Medical Research (NIMR) staff. NIMR being a research institute, most of its staff spend a substantial part of their working time on computers. There was a notion among NIMR staff that prolonged computer usage might pose health hazards, so a study was conducted to establish the facts and identify possible mitigation measures. A total of 144 NIMR staff aged between 20 and 59 years were involved in the study, of whom 63.2% were male and 36.8% female. All staff cadres were included in the sample. The functions performed by Institute staff using computers include data management, proposal development and report writing, research activities, secretarial duties, accounting and administrative duties, online information retrieval, and online communication through e-mail services. The interviewed staff had been using computers for 1-8 hours a day and for periods ranging from 1 to 20 years. The study indicated ergonomic hazards of various kinds, ranging from backache to eyesight-related problems, for a significant proportion of interviewees (63%). The authors highlight major measures applicable to preventing computer-related problems and urge NIMR management and/or the government of Tanzania to adopt them where practicable.

Evolution of the Flushing Cone during Pressurized Flushing in Reservoir Storage

Sedimentation in reservoirs, and the corresponding loss of storage capacity, is one of the most serious problems in dam engineering. Pressurized flushing, one way to remove sediment from a reservoir, is flushing under pressurized flow conditions at a nearly constant water level. Pressurized flushing has only local effects around the outlet: sediment in the vicinity of the outlet openings is scoured and a funnel-shaped crater is created. In this study, the temporal development of the flushing cone under various hydraulic conditions was studied experimentally. Time variations of parameters such as the maximum length and width of the flushing cone and the scour depth were measured. Results indicated that an increase in flow velocity (and consequently in Froude number) established new hydraulic conditions for the flushing mechanism, so that a sudden growth was observed in the amount of sediment released and in the scour dimensions. In addition, a set of non-dimensional relationships was identified for the temporal variations of the flushing scour dimensions, which can eventually be used to estimate the development of the flushing cone.

Impact of the Existence of One-Way Functions on the Conceptual Difficulties of Quantum Measurements

One-way functions are functions that are easy to compute but hard to invert. Their existence is an open conjecture; it would imply the existence of intractable problems (i.e., NP problems that are not in the complexity class P). If true, the existence of one-way functions would have an impact on the theoretical framework of physics, in particular quantum mechanics. This aspect of one-way functions has never been shown before. In the present work, we put forward the following. We can calculate the microscopic state (say, the particle spin in the z direction) of a macroscopic system (a measuring apparatus registering the particle z-spin) from the system's macroscopic state (the apparatus output); let us call this association the function F. The question is: can we compute the function F in the inverse direction? In other words, can we compute the macroscopic state of the system from its microscopic state (the preimage F⁻¹)? In the paper, we assume that the function F is a one-way function. The assumption implies that at the macroscopic level the Schrödinger equation becomes infeasible to compute. This infeasibility acts as a limit on the validity of the linear Schrödinger equation.
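To make the notion of a one-way function concrete, the sketch below uses the standard candidate of modular exponentiation: computing g^x mod p is fast, while recovering x from the output (the discrete logarithm) is believed to be intractable for large primes; the tiny prime here merely keeps the brute-force inverter runnable.

p, g = 2_147_483_647, 7          # the Mersenne prime 2^31 - 1 and a primitive root

def forward(x):
    """Easy direction: g^x mod p, computed in O(log x) multiplications."""
    return pow(g, x, p)

def invert(y):
    """Hard direction: brute-force discrete log, exponential in the bit length of p."""
    for x in range(p):
        if pow(g, x, p) == y:
            return x
    return None

x = 123_456
y = forward(x)
# forward() is instantaneous; invert() already needs ~1.2e5 trials here and
# would be utterly infeasible for cryptographic-size primes.
print(y, invert(y) == x)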

Simplex Method for Solving Linear Programming Problems with Fuzzy Numbers

Fuzzy set theory has been applied in many fields, such as operations research, control theory, and the management sciences. In particular, one application of this theory to decision-making problems is linear programming with fuzzy numbers. In this study, we present a new method for solving fuzzy number linear programming problems by using a linear ranking function. In fact, our method is similar to the simplex method previously used for solving linear programming problems in a crisp environment.
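A hedged sketch of the ranking-function idea is given below: triangular fuzzy costs are mapped to crisp values by a common linear ranking function, R(l, m, u) = (l + 2m + u)/4, and the resulting crisp LP is handed to an off-the-shelf simplex-type solver; the particular LP and ranking function are illustrative choices, not necessarily those of the paper.

import numpy as np
from scipy.optimize import linprog

def rank(tri):
    """Linear ranking function for a triangular fuzzy number (l, m, u)."""
    l, m, u = tri
    return (l + 2 * m + u) / 4.0

# maximize  c1~ x1 + c2~ x2   with triangular fuzzy costs c1~, c2~
# subject to x1 + 2 x2 <= 14,  3 x1 - x2 >= 0,  x1 - x2 <= 2,  x >= 0
fuzzy_costs = [(2.0, 3.0, 4.0), (4.0, 5.0, 6.0)]
c = -np.array([rank(t) for t in fuzzy_costs])         # linprog minimizes

A_ub = np.array([[1.0, 2.0],
                 [-3.0, 1.0],                         # 3 x1 - x2 >= 0 rewritten as <=
                 [1.0, -1.0]])
b_ub = np.array([14.0, 0.0, 2.0])

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0, None), (0, None)], method="highs")
print("optimal x:", res.x, "ranked objective:", -res.fun)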

Seven-Step Adams-Type Block Method with Continuous Coefficients for Periodic Ordinary Differential Equations

We consider the development of an eighth-order Adams-type method whose A-stability property is discussed by expressing it as a one-step method in higher dimension. This makes it suitable for solving a variety of initial-value problems. The main method and additional methods are obtained from the same continuous scheme, derived via interpolation and collocation procedures. The methods are then applied in block form as simultaneous numerical integrators over non-overlapping intervals. Numerical results obtained using the proposed block form reveal that it is highly competitive with existing methods in the literature.
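The seven-step block method itself is not reproduced here; as an illustration of the Adams family it belongs to, the sketch below applies a classical four-step Adams-Bashforth/Adams-Moulton predictor-corrector, started with Runge-Kutta values, to the periodic test problem u'' + u = 0.

import numpy as np

def f(t, y):                      # y = (u, u'),  so that u'' + u = 0
    return np.array([y[1], -y[0]])

def rk4_step(t, y, h):
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

h, n = 0.05, 400
t = np.arange(n + 1) * h
y = np.zeros((n + 1, 2)); y[0] = [1.0, 0.0]          # u(0) = 1, u'(0) = 0

for i in range(3):                                   # RK4 starting values
    y[i + 1] = rk4_step(t[i], y[i], h)

for i in range(3, n):
    F = [f(t[i - k], y[i - k]) for k in range(4)]    # f_i, f_{i-1}, f_{i-2}, f_{i-3}
    # Adams-Bashforth (4-step) predictor, Adams-Moulton (3-step) corrector
    yp = y[i] + h / 24 * (55 * F[0] - 59 * F[1] + 37 * F[2] - 9 * F[3])
    y[i + 1] = y[i] + h / 24 * (9 * f(t[i + 1], yp) + 19 * F[0] - 5 * F[1] + F[2])

print("max error vs cos(t):", np.max(np.abs(y[:, 0] - np.cos(t))))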

Combining ILP with Semi-supervised Learning for Web Page Categorization

This paper presents a semi-supervised learning algorithm called Iterative Cross-Training (ICT) to solve Web page classification problems. We apply inductive logic programming (ILP) as the strong learner in ICT. The objective of this research is to evaluate the potential of the strong learner to boost the performance of the weak learner in ICT. We compare the results with supervised Naive Bayes, a well-known algorithm for text classification. The performance of our learning algorithm is also compared with other semi-supervised learning algorithms, namely co-training and EM. The experimental results show that the ICT algorithm outperforms those algorithms and that the performance of the weak learner can be enhanced by the ILP system.
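ILP cannot be reproduced in a few lines, so the sketch below shows only the general iterative "label the pool and retrain" loop on a tiny synthetic corpus, with Naive Bayes as the weak learner and a linear SVM standing in for the strong learner; it is an illustration of the training loop, not of ICT as published.

import numpy as np
from scipy.sparse import vstack
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.svm import LinearSVC

# Tiny synthetic "web page" corpus with two categories (0 = sports, 1 = tech).
sports = ["football match score team", "tennis player wins tournament",
          "league goal striker coach", "basketball court dunk playoff"]
tech = ["python code compiler bug", "network protocol packet router",
        "database query index server", "machine learning model training"]

labeled_text = sports[:2] + tech[:2]
unlabeled_text = sports[2:] + tech[2:]
y_base = np.array([0, 0, 1, 1])
y_unlabeled_true = np.array([0, 0, 1, 1])    # held back, used only to report accuracy

vec = CountVectorizer()
X_all = vec.fit_transform(labeled_text + unlabeled_text)
X_base, X_u = X_all[:4], X_all[4:]

weak, strong = MultinomialNB(), LinearSVC()
X_train, y_train = X_base, y_base
for it in range(3):
    strong.fit(X_train, y_train)             # "strong" learner trains...
    pseudo = strong.predict(X_u)             # ...and labels the unlabeled pool
    weak.fit(vstack([X_base, X_u]),          # weak learner uses those labels
             np.concatenate([y_base, pseudo]))
    pseudo = weak.predict(X_u)               # and labels the pool in return
    X_train = vstack([X_base, X_u])
    y_train = np.concatenate([y_base, pseudo])

print("pool accuracy of the weak learner:",
      float(np.mean(weak.predict(X_u) == y_unlabeled_true)))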