A Hybrid Feature Selection Method Using Resampling, Chi-Squared and Consistency Evaluation Techniques

In this paper, a combined feature selection method is proposed that takes advantage of sample-domain filtering, resampling and feature-subset evaluation to reduce the dimensionality of large datasets and select reliable features. The method exploits both the feature space and the sample domain to improve the feature selection process, and it uses a combination of the Chi-squared and Consistency attribute evaluation methods to seek reliable features. It consists of two phases: the first phase filters and resamples the sample domain, and the second phase adopts a hybrid procedure to find the optimal feature space by applying the Chi-squared and Consistency subset evaluation methods together with a genetic search. Experiments on datasets of various sizes from the UCI Repository of Machine Learning databases show that the performance of five classifiers (Naïve Bayes, Logistic, Multilayer Perceptron, Best-First Decision Tree and JRip) improves simultaneously and that the classification error for these classifiers decreases considerably. The experiments also show that this method outperforms other feature selection methods.
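As a rough illustration of the Chi-squared ranking step (only one ingredient of the proposed hybrid method), the sketch below selects the top-scoring features of a UCI-style dataset with scikit-learn and checks the effect on a Naïve Bayes classifier; the dataset, the value of k and the classifier choice are illustrative assumptions, not the paper's setup, and the resampling, Consistency-subset evaluation and genetic-search steps are not shown.

```python
# Minimal sketch (not the authors' code): Chi-squared feature ranking on a
# UCI-style dataset with scikit-learn, followed by a quick check of its effect
# on a Naive Bayes classifier.
from sklearn.datasets import load_digits
from sklearn.feature_selection import SelectKBest, chi2
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = load_digits(return_X_y=True)          # non-negative features, as chi2 requires

# Rank features by the chi-squared statistic and keep the top 20 (k is illustrative).
selector = SelectKBest(score_func=chi2, k=20)
X_reduced = selector.fit_transform(X, y)

# Compare classifier accuracy before and after feature selection.
print(cross_val_score(GaussianNB(), X, y, cv=5).mean())
print(cross_val_score(GaussianNB(), X_reduced, y, cv=5).mean())
```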

Image Magnification Using Adaptive Interpolation by Pixel-Level Data-Dependent Geometrical Shapes

The technology of computer graphics and digital cameras has become prevalent, and high-resolution displays and printers are widely available, so high-resolution images are needed to produce high-quality displayed images and prints. However, since high-resolution images are not always provided, the original images often need to be magnified. A common difficulty in previous magnification techniques is preserving details (i.e. edges) while at the same time smoothing the data so as not to introduce spurious artefacts; a definitive solution is still an open issue. In this paper, an image magnification method using adaptive interpolation by pixel-level data-dependent geometrical shapes is proposed that takes into account both the edges (sharp luminance variations) and the smoothness of the image. It calculates a threshold, classifies the interpolation region into geometrical shapes, and then assigns suitable values to the undefined pixels inside the interpolation region while preserving sharp luminance variations and smoothness at the same time. The results of the proposed technique have been compared qualitatively and quantitatively with five other techniques. The qualitative results show that the proposed method clearly outperforms nearest-neighbour (NN), bilinear (BL) and bicubic (BC) interpolation, while the quantitative results are competitive and consistent with NN, BL, BC and the others.
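For orientation only, the following sketch shows a much simplified form of threshold-driven adaptive interpolation: each pixel's 3x3 neighbourhood is classified as an edge or a smooth region by a luminance-variation threshold, and the 2x upscaled values are filled in accordingly. The paper's pixel-level geometrical-shape classification is considerably richer; the function name and threshold value are assumptions.

```python
# Illustrative sketch only: threshold-driven adaptive 2x upscaling that replicates
# pixels near sharp luminance variations and averages neighbours in smooth regions.
import numpy as np

def adaptive_upscale_2x(img, threshold=16.0):
    h, w = img.shape
    out = np.empty((2 * h, 2 * w), dtype=float)
    padded = np.pad(img.astype(float), 1, mode="edge")
    for y in range(h):
        for x in range(w):
            block = padded[y:y + 3, x:x + 3]          # 3x3 neighbourhood of (y, x)
            variation = block.max() - block.min()     # local luminance variation
            a = padded[y + 1, x + 1]                  # current pixel
            b = padded[y + 1, x + 2]                  # right neighbour
            c = padded[y + 2, x + 1]                  # lower neighbour
            d = padded[y + 2, x + 2]                  # diagonal neighbour
            if variation > threshold:
                vals = (a, a, a, a)                   # edge: replicate to stay crisp
            else:
                vals = (a, (a + b) / 2, (a + c) / 2, (a + b + c + d) / 4)  # smooth
            out[2 * y, 2 * x], out[2 * y, 2 * x + 1] = vals[0], vals[1]
            out[2 * y + 1, 2 * x], out[2 * y + 1, 2 * x + 1] = vals[2], vals[3]
    return out

print(adaptive_upscale_2x(np.arange(16).reshape(4, 4)).shape)  # (8, 8)
```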

Automatic Voice Classification System Based on Traditional Korean Medicine

This paper introduces an automatic voice classification system for the diagnosis of individual constitution based on Sasang Constitutional Medicine (SCM) in Traditional Korean Medicine (TKM). To develop this algorithm, we used the voices of 309 female speakers and extracted a total of 134 speech features from voice data consisting of five sustained vowels and one sentence. The classification system, based on a rule-based algorithm derived from a non-parametric statistical method, produces three types of decision: reserved, positive and negative. In conclusion, 71.5% of the voice data were diagnosed by this system, of which 47.7% were correct positive decisions and 69.7% were correct negative decisions.

A Combined Fuzzy Decision Making Approach to Supply Chain Risk Assessment

Many firms have implemented initiatives such as outsourced manufacturing that can make a supply chain (SC) more vulnerable to various types of disruption, so managing risk has become a critical component of SC management. Although different SC vulnerability management methodologies have been proposed for managing SC risk, most offer only point-based solutions that deal with a limited set of risks. This research aims to reinforce SC risk management by proposing an integrated approach. SC risks are identified and a risk index classification structure is created. We then develop an SC risk assessment approach based on the analytic network process (ANP) and the VIKOR method under a fuzzy environment, where vagueness and subjectivity are handled with linguistic terms parameterized by triangular fuzzy numbers. Using fuzzy ANP (FANP), the risk weights are calculated and then fed into fuzzy VIKOR (FVIKOR) to rank the SC members and identify the riskiest partner.
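As a small illustration of one building block of such a procedure (not the authors' implementation), the sketch below shows how linguistic judgements parameterized by triangular fuzzy numbers can be weighted, aggregated and defuzzified; the linguistic scale and weights are assumed values.

```python
# Minimal sketch: triangular fuzzy numbers (l, m, u) encoding linguistic risk
# judgements, aggregated with assumed weights and reduced to a crisp value by
# centroid defuzzification.
def tfn_add(a, b):
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def tfn_scale(a, k):
    return (a[0] * k, a[1] * k, a[2] * k)

def defuzzify(a):
    # Centroid of a triangular membership function: (l + m + u) / 3.
    return sum(a) / 3.0

# Example linguistic scale for a risk judgement on [0, 1] (illustrative values).
LOW, MEDIUM, HIGH = (0.0, 0.0, 0.25), (0.25, 0.5, 0.75), (0.75, 1.0, 1.0)

# A partner's risk judged 60/40 between "medium" and "high".
combined = tfn_add(tfn_scale(MEDIUM, 0.6), tfn_scale(HIGH, 0.4))
print(combined, defuzzify(combined))
```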

Applying Tabu Search Algorithm in Public Transport: A Case Study for University Students in Mauritius

In this paper, the Tabu search algorithm is used to solve a transportation problem that consists of determining the shortest routes, with appropriate vehicle capacities, to facilitate the travel of students attending the University of Mauritius. The aim of this work is to minimize the total cost of the distance travelled by the vehicles in serving all the customers. An initial solution is obtained by the TOUR algorithm, which constructs a giant tour containing all the customers and partitions it optimally so as to produce a set of feasible routes. The Tabu search algorithm then makes use of a search procedure, a swapping procedure, and intensification and diversification mechanisms to find the best set of feasible routes.
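To make the Tabu search ingredients concrete, here is a minimal, generic sketch (not the authors' TOUR/Tabu code): a single route with the depot fixed at both ends, pairwise swap moves, a fixed tabu tenure and a simple aspiration criterion. The random distance matrix, tenure and iteration count are illustrative assumptions, and vehicle capacities and the intensification/diversification mechanisms are omitted.

```python
# Generic Tabu search sketch over one route with swap moves.
import itertools
import random

def route_cost(route, dist):
    return sum(dist[route[i]][route[i + 1]] for i in range(len(route) - 1))

def tabu_search(route, dist, iterations=200, tenure=10):
    best = current = route[:]
    tabu = []
    for _ in range(iterations):
        candidates = []
        for i, j in itertools.combinations(range(1, len(route) - 1), 2):
            neighbour = current[:]
            neighbour[i], neighbour[j] = neighbour[j], neighbour[i]
            candidates.append(((i, j), neighbour, route_cost(neighbour, dist)))
        candidates.sort(key=lambda c: c[2])
        for move, neighbour, cost in candidates:
            # Accept the best non-tabu move, or a tabu move that beats the best (aspiration).
            if move not in tabu or cost < route_cost(best, dist):
                current = neighbour
                tabu.append(move)
                if len(tabu) > tenure:
                    tabu.pop(0)
                break
        if route_cost(current, dist) < route_cost(best, dist):
            best = current[:]
    return best

random.seed(0)
n = 8
dist = [[abs(i - j) + random.random() for j in range(n)] for i in range(n)]
initial = [0] + list(range(1, n)) + [0]
improved = tabu_search(initial, dist)
print(route_cost(initial, dist), route_cost(improved, dist))
```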

2D Bar Codes Reading: Solutions for Camera Phones

Two-dimensional (2D) bar codes were designed to carry significantly more data, with higher information density and robustness, than their 1D counterparts. Thanks to the now ubiquitous combination of cameras and mobile phones, using the camera phone for 2D bar code reading naturally brings great commercial value. This paper addresses the problem of designing 2D bar codes specifically for mobile phones and introduces a low-level encoding method for matrix codes. We also propose an efficient scheme for decoding 2D bar codes, with effort focused on overcoming the difficulties introduced by the low image quality that is common in bar code images taken with a phone camera.

Frames about Nanotechnology Agenda in Turkish Media, 2005-2009

As the new industrial revolution, advances in nanotechnology have been followed with interest throughout the world, including in Turkey. The media play an important role in conveying these advances to the public, raising public awareness and shaping attitudes towards nanotechnology. As well as representing how a subject is treated, media frames determine how the public thinks about that subject. In the literature, studies focusing on different countries have identified definite frames related to nanoscience and nanotechnology, such as process, regulation, conflict and risk. How, then, is nanotechnology news treated, with which frames and in which news categories, in Turkey as a developing country? This study examines, via the framing analysis developed in agenda-setting research, the different variables about nanotechnology in the Turkish media that affect public attitudes, such as category, frame, story tone and source. The analysis uses data from 2005 to 2009 obtained from the five national newspapers with the widest circulation in Turkey. The study thus considers the overall direction of media coverage of nanotechnology in Turkey, the frames through which nanotechnological advances were brought to the agenda and reported as news, and the sectoral, legal, economic and social pictures these frames reflect to the public.

Post-Incubation Eggshell of the Leatherback Turtle (Dermochelys coriacea) in the Andaman Sea, Thailand: A Microanalysis of Ultrastructure and Elemental Composition

There are few studies on the eggshell of the leatherback turtle, an endangered species in Thailand. This study focused on the ultrastructure and elemental composition of leatherback turtle eggshells collected from the Andaman Sea shore, Thailand, during the nesting season, using scanning electron microscopy (SEM). Three eggshell layers were recognized: the outer cuticle or calcareous layer, the middle multistrata layer, and the inner fibrous layer. The outer calcareous layer was thick and porous, consisting of loose nodular units of various crystal shapes and sizes; the loose attachment between these units resulted in numerous spaces and openings. The middle layer was compact and thick, with several multistrata, and contained numerous openings connecting to both the outer cuticle layer and the inner fibrous layer. The inner fibrous layer was compact and thin, and composed of numerous reticular fibers. An energy-dispersive X-ray microanalysis detector revealed the characteristic X-ray energy spectra emitted by the elements in each layer. The percentages of the elements were found in the following order: carbon (C) > oxygen (O) > calcium (Ca) > sulfur (S) > potassium (K) > aluminum (Al) > iodine (I) > silicon (Si) > chlorine (Cl) > sodium (Na) > fluorine (F) > phosphorus (P) > magnesium (Mg). Each layer consisted of a high percentage of CaCO3 (approximately 98%), implying that it is essential for turtle embryonic development. A significant difference was found in the percentages of Ca and Mo among the three layers. Moreover, transition metal, metal and toxic non-metal contaminations were found in the leatherback turtle eggshell samples, namely palladium (Pd), molybdenum (Mo), copper (Cu), aluminum (Al), lead (Pb) and bromine (Br); the contaminating elements were seen in the outer layers except for Mo. All elements were readily observed and mapped using the Smiling program, and X-ray images mapping the locations of all elements are shown. Calcium was present in the eggshell in high amounts and was widely distributed in clusters in the outer cuticle layer, forming the CaCO3 structure. Moreover, an accumulation of Na and Cl forming NaCl was observed, widely distributed across the three eggshell layers. The results of this study should be valuable for assessing emergence success in this endangered species.

Effect of Tempering Temperature and Time on the Corrosion Behaviour of 304 and 316 Austenitic Stainless Steels in Oxalic Acid

The effect of different tempering temperatures and heat treatment times on the corrosion resistance of austenitic stainless steels (ASS) in oxalic acid was studied in this work using conventional weight loss and electrochemical measurements. Typical 304 and 316 stainless steel samples were tempered at 150°C, 250°C and 350°C after being austenitized at 1050°C for 10 minutes. These samples were then immersed in 1.0 M oxalic acid and their weight losses were measured every five days for 30 days. The results show that the corrosion of both types of ASS samples increased with increasing tempering temperature and time, which was attributed to the precipitation of chromium carbides at the grain boundaries of these metals. The electrochemical results also confirm that 304 ASS is more susceptible to corrosion than 316 ASS in this medium, which is attributed to the molybdenum in the composition of the latter. The metallographic images of these samples showed a non-uniform distribution of precipitated chromium carbides at the grain boundaries, as well as unevenly distributed carbides and retained austenite phases, which cause galvanic effects in the medium.

A Study of Distinctive Models for Pre-hospital EMS in Thailand: Knowledge Capture

In Thailand, the practice of pre-hospital Emergency Medical Service (EMS) shows different growth rates and levels of effectiveness from area to area, visible in the diverse quality and quantity of services. To shorten the learning curve and speed up practice in other areas, storytelling and lessons learnt from effective practices are valued as meaningful knowledge. This paper set out to ascertain the factors, lessons learnt and best practices that contribute to the success of a pre-hospital EMS system. These were formulated as a model intended to speed up practice in other areas. To develop the model, the Malcolm Baldrige National Quality Award (MBNQA), which is widely recognized as a framework for organizational quality assessment and improvement, was chosen as the discussion framework. Notably, this study was based on knowledge capture; it was not intended to complete the full loop of knowledge activities, but rather to highlight knowledge capture as the initiation of knowledge management.

Prevalence of Psychological Resistance to Voluntary Counselling and Testing of HIV/AIDS among Students of Tertiary Institutions in Kano State, Nigeria

The persistent discomfort with Voluntary Counselling and Testing (VCT) exhibited by students in some tertiary institutions in Kano State, Nigeria is capable of causing psychological resistance and of jeopardizing the purpose of HIV intervention. This study investigated the prevalence of psychological resistance to VCT for HIV/AIDS among students of tertiary institutions in the state. Two null hypotheses were postulated and tested. A cross-sectional survey design was employed, in which a sample of 1,512 was selected from a student population of 104,841 using a stratified random sampling technique. A self-developed 20-item scale with a reliability coefficient of 0.83 was used for data collection. Data analysed via chi-square and t-test reveal a prevalence of 38%, with males (Mean = 0.34; SD = 0.475) constituting 60% and females (Mean = 0.45; SD = 0.498) 40%. The calculated chi-square and t-test statistics were not significant at the 0.05 level, so the null hypotheses were upheld. It is recommended that reinforcement and social support be provided for students who take up HIV/AIDS counselling.
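For readers unfamiliar with the two tests, the sketch below shows how a prevalence difference by sex could be examined with a chi-square test on a 2x2 table and an independent-samples t-test using SciPy; the counts and scores are reconstructed placeholders broadly consistent with the reported percentages, not the study's raw data.

```python
# Illustrative sketch only: chi-square test on a 2x2 table and an
# independent-samples t-test with SciPy, using placeholder data.
import numpy as np
from scipy import stats

# Hypothetical 2x2 table: rows = male/female, columns = resistant / not resistant.
table = np.array([[345, 560],
                  [230, 377]])
chi2_stat, p_chi, dof, _ = stats.chi2_contingency(table)

# Hypothetical scale scores for the two groups.
rng = np.random.default_rng(0)
male_scores = rng.normal(0.34, 0.475, 905)
female_scores = rng.normal(0.45, 0.498, 607)
t_stat, p_t = stats.ttest_ind(male_scores, female_scores)

print(f"chi2={chi2_stat:.2f}, p={p_chi:.3f}; t={t_stat:.2f}, p={p_t:.3f}")
```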

Effect of Curing Profile on the Elimination of Void/Black Dot Formation in Underfill Epoxy for Hi-CTE Flip Chip Packaging

Void formation in underfill is considered a failure in the flip chip manufacturing process. Voids can be caused by several factors, such as poor soldering and flux residue during the die attach process, void entrapment due to moisture contamination, the dispense pattern, and the set-up of the curing process. This paper presents a comparison of single-step and two-step curing profiles with respect to void and black dot formation in the underfill of a Hi-CTE flip chip ceramic ball grid array (FC-CBGA) package. Statistical analysis was conducted to determine how factors such as wafer lot, sawing technique, underfill fillet height and curing profile recipe affected the formation of voids and black dots. C-mode scanning acoustic microscopy (C-SAM) was used to count the voids and black dots. It was shown that the two-step curing profile eliminated voids and black dots in the underfill after the curing process.

Using Radio Frequency Identification Technology in Supply Chain Management

Radio frequency identification (RFID) is a technology for the automatic identification of items, particularly in the supply chain, and it is becoming increasingly important for industrial applications. Unlike barcode technology, which detects optical signals reflected from barcode labels, RFID uses radio waves to transmit information from an RFID tag affixed to a physical object. In contrast to today's most common uses of this technology in warehouse inventory and the supply chain, the focus of this paper is an overview of the structure of RFID systems; it also presents an RFID-based solution for brand authentication, traceability and tracking, implemented through a production management system and extended for use by traders.

Personalization and the Universal Communications Identifier Concept

As communications systems and technology become more advanced and complex, it will be increasingly important to focus on users' individual needs. Personalization and effective user profile management will be necessary to ensure the uptake and success of new services and devices, and it is therefore important to focus on users' requirements in this area and define solutions that meet them. The work on personalization and user profiles emerged from earlier ETSI work on the Universal Communications Identifier (UCI), which is a unique identifier of the user rather than a range of identifiers for the user's many communication devices or services (e.g. fixed phone numbers at home and work, mobile phone numbers, fax numbers and email addresses). This paper describes work on personalization, including standardized information and preferences and an architectural framework describing how personalization can be integrated into Next Generation Networks, together with the UCI concept.

Learning Theories within the Coaching Process

These days, magazines carry many advertisements presenting coaching as a pragmatic specialty that helps people make changes in their lives. Yet specialty coaches are not necessarily therapists, consultants or psychologists, so they may not know psychological theories. The International Coach Federation identifies "facilitating learning and results" as one of its four core coach competencies; without an understanding of learning theories, coaching practice hangs in a theoretical abyss. The aim of this article is therefore to investigate learning theories within the coaching process. To this end, I reviewed several cognitive and behavioural learning theories and analysed their contribution to the coaching process as presented in the papers and books of mentor coaches and ICF-certified coaches. The results demonstrate that the coaching profession is strongly grounded in learning theories and will be strengthened by the validation of theories and by evidence-based research as the field moves forward. More research is therefore needed in order to apply effective theoretical frameworks.

Towards a Systematic, Cost-Effective Approach for ERP Selection

Existing experience indicates that one of the most prominent reasons some ERP implementations fail is the selection of an unsuitable ERP package. Among the important factors leading to inappropriate ERP selections, one is neglecting the preliminary activities that should be carried out before the evaluation of ERP packages. Another is that organizations often employ selection processes so prolonged and costly that the process is never finalized, or the evaluation team performs many key final activities incompletely or inaccurately due to exhaustion, lack of interest or out-of-date data. In this paper, a systematic approach for choosing an ERP package is introduced that recommends activities to be performed before and after the main selection phase. The proposed approach also incorporates ideas that accelerate the selection process while reducing the probability of an erroneous final selection.

A High Quality Speech Coder at 600 bps

This paper presents a vocoder that produces high-quality synthetic speech at 600 bps. To reduce the bit rate, the algorithm is based on a sinusoidally excited linear prediction model that extracts few coding parameters, and three consecutive frames are grouped into a superframe and jointly vector quantized to obtain high coding efficiency. The inter-frame redundancy is exploited with distinct quantization schemes for the different unvoiced/voiced frame combinations in the superframe. Experimental results show that the quality of the proposed coder is better than that of 2.4 kbps LPC10e and approximately the same as that of 2.4 kbps MELP, while maintaining high robustness.
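As a generic illustration of joint vector quantization over a superframe (not the paper's actual parameter set or codebook design), the sketch below pools three consecutive frames of parameters and quantizes them against a k-means-trained codebook; the feature dimension, number of frames and codebook size are assumptions.

```python
# Generic sketch: joint vector quantization of parameters pooled over a
# 3-frame superframe, using a k-means-trained codebook (scikit-learn).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
frames = rng.normal(size=(3000, 10))      # stand-in for per-frame coding parameters
superframes = frames.reshape(-1, 30)      # group 3 consecutive frames jointly

codebook = KMeans(n_clusters=64, n_init=4, random_state=0).fit(superframes)
indices = codebook.predict(superframes)   # 64 entries -> 6 bits per superframe
reconstructed = codebook.cluster_centers_[indices]
print(np.mean((superframes - reconstructed) ** 2))  # quantization distortion
```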

A Comparison of Some Spline-Based Methods for the One-Dimensional Heat Equation

In this paper, collocation-based cubic B-spline and extended cubic uniform B-spline methods are considered for solving the one-dimensional heat equation with a nonlocal initial condition. A finite difference scheme and a θ-weighted scheme are used for the time and space discretization, respectively. The stability of the method is analyzed by the Von Neumann method, and the accuracy of the methods is illustrated with an example. The numerical results are obtained and compared with the analytical solutions.
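One standard way such ingredients fit together for the one-dimensional heat equation is sketched below; the exact division of roles between the finite-difference and θ-weighted parts in the paper may differ, so this is only an illustrative reconstruction with assumed notation.

```latex
% Illustrative reconstruction (assumed notation, not taken from the paper):
% a theta-weighted combination of the spatial term at two time levels for
% u_t = alpha * u_xx, with the solution expanded in (extended) cubic B-splines
% and collocated at the knots; theta = 1/2 gives the Crank--Nicolson case whose
% stability is typically examined by the Von Neumann method.
\[
  \frac{U^{\,n+1} - U^{\,n}}{\Delta t}
    = \alpha \left[\, \theta\, U^{\,n+1}_{xx} + (1 - \theta)\, U^{\,n}_{xx} \,\right],
  \qquad
  U^{\,n}(x) = \sum_{i} c_i^{\,n}\, B_i(x).
\]
```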

Vibration-Based Identification of Impact Force Using a Genetic Algorithm

This paper presents the identification of an impact force acting on a simply supported beam. Force identification is an inverse problem in which the measured response of the structure is used to determine the applied force. The identification problem is formulated as an optimization problem, and a genetic algorithm is used to solve it. The objective function is based on the difference between the analytical and measured responses, and the decision variables are the location and magnitude of the applied force. Simulation results show the effectiveness of the approach and its robustness with respect to measurement noise and sensor location.
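A toy version of this formulation is sketched below (not the authors' simulation): a single-mode simply supported beam model produces a noisy "measured" response, and a very small genetic algorithm searches over the force location and magnitude to minimize the misfit. The beam model, GA operators and all settings are simplifying assumptions.

```python
# Toy sketch of the inverse problem posed as an optimization over (location, magnitude).
import numpy as np

def beam_response(location, magnitude, x_sensor=0.7, L=1.0,
                  t=np.linspace(0, 0.1, 200)):
    # First-mode response of a simply supported beam to an impulsive load (toy model).
    shape = np.sin(np.pi * location / L) * np.sin(np.pi * x_sensor / L)
    return magnitude * shape * np.sin(2 * np.pi * 50 * t) * np.exp(-20 * t)

measured = beam_response(0.3, 5.0) + np.random.default_rng(0).normal(0, 0.05, 200)

def objective(params):
    location, magnitude = params
    return np.sum((beam_response(location, magnitude) - measured) ** 2)

# A very small GA: truncation selection, midpoint crossover, Gaussian mutation.
rng = np.random.default_rng(1)
pop = np.column_stack([rng.uniform(0.05, 0.95, 60), rng.uniform(0.1, 10.0, 60)])
for _ in range(100):
    fitness = np.array([objective(p) for p in pop])
    parents = pop[np.argsort(fitness)[:30]]
    children = (parents[rng.integers(0, 30, 60)] + parents[rng.integers(0, 30, 60)]) / 2
    children += rng.normal(0, [0.02, 0.2], size=children.shape)
    pop = children

best = min(pop, key=objective)
# Recovered (location, magnitude); with one sensor and a single mode, the location
# is identified only up to the symmetric point 1 - location in this toy model.
print(best, objective(best))
```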

eLearning Tools Evaluation Based on Quality Concept Distance Computing: A Case Study

Despite the extensive use of eLearning systems, there is no consensus on a standard framework for evaluating the quality of such systems. Hence, there is only a minimal set of tools that can support this judgement and give information about the value of the course content. This paper presents two kinds of quality-set evaluation indicators for eLearning courses, based on the computation of three well-known metrics: the Euclidean, Hamming and Levenshtein distances. The "distance" calculus is applied to standard evaluation templates (i.e. the European Commission Programme procedures vs. the AFNOR Z 76-001 standard), determining a reference point for evaluating e-learning course quality against the optimal concept(s). The case study, based on the results of projects developed within the framework of the European "Leonardo da Vinci" Programme with Romanian contractors, tries to demonstrate the benefits of such a method.
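For concreteness, the three metrics named above are sketched below and applied to toy quality-indicator data; the indicator encodings and the reference template are assumptions, not the actual evaluation templates.

```python
# Minimal sketch: Euclidean, Hamming and Levenshtein distances on toy data.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def hamming(a, b):
    # Number of positions at which the two sequences differ.
    return sum(x != y for x, y in zip(a, b))

def levenshtein(a, b):
    # Classic dynamic-programming edit distance.
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1, curr[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = curr
    return prev[-1]

course = [3, 4, 2, 5]      # scores of a course against four quality concepts
reference = [5, 5, 4, 5]   # an "optimal" reference template
print(euclidean(course, reference), hamming(course, reference),
      levenshtein("ABDC", "ABCD"))
```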