Java Based Automatic Curriculum Generator for Children with Trisomy 21

An Early Intervention Program (EIP) is required to improve the overall development of children with Trisomy 21 (Down syndrome). To help trainers and parents implement an EIP, a support system has been developed. The support system is able to screen data automatically, store and analyze data, generate an individual EIP (curriculum) with optimal training duration, and generate training automatically. The system consists of hardware and software, where the software has been implemented in the Java language on Linux Fedora. The software has been tested to ensure its functionality and reliability, and the prototype has also been tested in Down syndrome centers. Test results show that the system is reliable for generating an individual curriculum, including a training program to improve the motor, cognitive, and combined abilities of Down syndrome children under 6 years of age.

Weak Measurement Theory for Discrete Scales

With the increasing spread of computers and the internet among culturally, linguistically and geographically diverse communities, issues of internationalization and localization are becoming increasingly important. For some of these issues, such as different scales for length and temperature, there is a well-developed measurement theory. For others, such as date formats, no such theory is possible. This paper fills a gap by developing a measurement theory for a previously overlooked class of scales: discrete and interval-valued scales such as spanner and shoe sizes. The paper gives a theoretical foundation for a class of data representation problems.

Greek Compounds: A Challenging Case for the Parsing Techniques of PC-KIMMO v.2

In this paper we describe the recognition of Greek compound words using the PC-KIMMO software. We try to show certain limitations of the system with respect to the principles of compound formation in Greek. Moreover, we discuss the computational processing of phenomena such as stress and syllabification, which are indispensable for the analysis of such constructions, and we try to propose linguistically acceptable solutions within the particular system.

MMU Simulation in a Hardware Simulator Based on State Transition Models

An embedded hardware simulator is a valuable computer-aided tool for embedded application development. This paper focuses on the ARM926EJ-S MMU, builds state transition models, and formally verifies critical properties of those models. The state transition models include an instruction-loading model, a data-reading model, and a data-writing model. The properties of the models are described in the CTL specification language and verified in VIS. The results obtained in VIS demonstrate that the critical properties of the MMU are satisfied in the state transition models. The verified models can then be used to implement the MMU component in our simulator. Finally, experimental results show that the MMU can successfully accomplish memory access requests from the CPU.
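
As an illustration of the kind of CTL property involved, a liveness requirement on address translation might be written as follows (the signal names here are hypothetical, not taken from the paper's models):

    AG (tlb_miss -> AF walk_done)

read as: on every execution path, whenever a TLB miss occurs, the corresponding page-table walk eventually completes. VIS checks such properties exhaustively against the state transition models.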

A New Weighted LDA Method in Comparison to Some Versions of LDA

Linear Discriminant Analysis (LDA) is a linear solution for the classification of two classes. In this paper, we propose a variant of LDA for multi-class problems which redefines the between-class and within-class scatter matrices by incorporating a weight function into each of them. The aim is to separate the classes as much as possible; when one class is already well separated from the others, that class should have little influence on the classification. We therefore alleviate the influence of well-separated classes by adding a weight to both the between-class and the within-class scatter matrix. To obtain a simple and effective weight function, ordinary two-class LDA is applied to every pair of classes to find the Fisher discrimination value, which is passed as input to two weight functions used to redefine the between-class and within-class scatter matrices. Experimental results show that the new LDA method improves the classification rate on the glass, iris, and wine datasets in comparison to several other versions of LDA.
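
As a sketch of the general form such a weighting takes (following the common pairwise weighted-LDA formulation; the paper's exact weight functions may differ), the weighted between-class scatter matrix can be written as

    S_B = \sum_{i=1}^{c-1} \sum_{j=i+1}^{c} w(\Delta_{ij}) \, p_i p_j (\mu_i - \mu_j)(\mu_i - \mu_j)^T

where p_i and \mu_i are the prior probability and mean of class i, \Delta_{ij} is the pairwise Fisher discrimination value obtained from ordinary two-class LDA, and w(\cdot) is a decreasing function, so that class pairs that are already well separated (large \Delta_{ij}) contribute little to S_B; the within-class scatter matrix S_W is reweighted analogously.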

Design of a Mould System for Horizontal Continuous Casting of Bilayer Aluminium Strips

The present article deals with a composite casting process that allows the production of bilayer AlSn6-Al strips based on the technique of horizontal continuous casting. In the first part, experimental investigations on the production of a single-layer AlSn6 strip are described. Afterwards, essential results of basic compound casting trials using simple test specimens are presented to define the thermal conditions required for a metallurgical bond between the alloy AlSn6 and pure aluminium. Subsequently, numerical analyses are described: a finite element model was used to examine a continuous composite casting process. As a result of the simulations, the main parameters influencing the thermal conditions within the composite casting region were identified. Finally, basic guidance is given for the design of an appropriate composite mould system.

Adsorption of Lead from Synthetic Solution using Luffa Charcoal

This work studied the batch biosorption of Pb(II) ions from aqueous solution by Luffa charcoal. The effects of operating parameters such as contact time, initial solution pH, and initial Pb(II) concentration on the sorption of Pb(II) were investigated. The results showed that the adsorption of Pb(II) ions was initially rapid and that equilibrium was reached within 10 h. The adsorption kinetics of Pb(II) ions onto Luffa charcoal were best described by the pseudo-second-order model. A pH of 5.0 was favorable for the adsorption and removal of Pb(II) ions. The Freundlich isotherm model fitted the adsorption of Pb(II) ions better than the Langmuir and Temkin isotherms. The maximum monolayer adsorption capacity obtained from the Langmuir isotherm model was 51.02 mg/g. This study demonstrates that Luffa charcoal could be used for the removal of Pb(II) ions in water treatment.
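
For reference, the standard forms of the models compared here are (textbook equations, not reproduced from the paper):

    pseudo-second-order kinetics:  t/q_t = 1/(k_2 q_e^2) + t/q_e
    Langmuir isotherm:             q_e = q_m K_L C_e / (1 + K_L C_e)
    Freundlich isotherm:           q_e = K_F C_e^{1/n}

where q_t and q_e (mg/g) are the amounts adsorbed at time t and at equilibrium, C_e (mg/L) is the equilibrium Pb(II) concentration, q_m (mg/g) is the monolayer capacity, and k_2, K_L, K_F and n are model constants fitted to the data.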

Insurance Fraud Management as an Integrated Part of Business Intelligence Framework

Fraud in the insurance industry is one of the major sources of operational risk for insurance companies and constitutes a significant portion of their losses. Every sensible company on the market aims to improve its processes for uncovering fraud and invests resources to reduce it. This article addresses the fraud management area as an extension of an existing Business Intelligence solution. We describe the framework of such a solution and share with readers the benefits that adopting this approach brings to insurance companies in their fight against insurance fraud.

Metoprolol Tartrate-Ethylcellulose Tabletted Microparticles: Development of a Validated In-vitro In-vivo Correlation

This study describes the methodology for the development of a validated in-vitro in-vivo correlation (IVIVC) for metoprolol tartrate modified-release dosage forms with distinctive release rate characteristics. Modified-release dosage forms were formulated by microencapsulation of metoprolol tartrate in different amounts of ethylcellulose by a non-solvent addition technique. In-vitro and in-vivo studies were then conducted to develop and validate a level A IVIVC for metoprolol tartrate. The values of the regression coefficient (R2-values) for the IVIVC of the T2 and T3 formulations were not significantly (p
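
For context, a level A IVIVC is a point-to-point relationship between the in-vitro fraction dissolved and the in-vivo fraction absorbed, typically assessed with a linear model of the form (a textbook formulation, not taken from this study):

    F_abs(t) = a * F_diss(t) + b

where F_abs is usually estimated from plasma data by a deconvolution approach such as the Wagner-Nelson method, and the R2 of the fit quantifies the strength of the correlation.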

Modeling of Statistically Multiplexed Non-Uniform Activity VBR Video

This paper reports on the feasibility of the ARMA model for describing a bursty video source transmitting over an AAL5 ATM link (VBR traffic). The traffic represents the activity of the action movie "Lethal Weapon 3" transmitted over the ATM network using the Fore Systems AVA-200 ATM video codec with a peak rate of 100 Mbps and a frame rate of 25 frames per second. The model parameters were estimated for a single video source and for independently multiplexed video sources. It was found that the ARMA(2, 4) model is well suited to the real data in terms of average-rate traffic profile, probability density function, autocorrelation function, burstiness measure, and the pole-zero distribution of the filter model.
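
For reference, an ARMA(2, 4) process has the standard form (a textbook definition, not a result of the paper):

    X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \varepsilon_t + \theta_1 \varepsilon_{t-1} + \theta_2 \varepsilon_{t-2} + \theta_3 \varepsilon_{t-3} + \theta_4 \varepsilon_{t-4}

where X_t is the (mean-removed) bit rate of frame t, \varepsilon_t is white noise, and the autoregressive coefficients \phi_i and moving-average coefficients \theta_j are estimated from the measured trace.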

Automatic Reusability Appraisal of Software Components using a Neuro-fuzzy Approach

Automatic reusability appraisal can help in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. However, the issue of how to identify reusable components in existing systems has remained relatively unexplored. In this paper, we present a two-tier approach that studies the structural attributes of a component as well as its usability or relevancy to a particular domain. Latent semantic analysis is used for the feature vector representation of the various software domains. It exploits the fact that feature vectors can be seen as documents containing terms (the identifiers present in the components), so text modeling methods that capture co-occurrence information in low-dimensional spaces can be applied. Further, we devised a neuro-fuzzy hybrid inference system that takes structural metric values as input and calculates the reusability of a software component. A decision tree algorithm is used to derive the initial set of fuzzy rules for the neuro-fuzzy system. The results obtained are convincing enough to propose the system for the economical identification and retrieval of reusable software components.
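
For context, latent semantic analysis factorizes the term-by-document matrix A (here, identifiers by components) with a truncated singular value decomposition (standard LSA, not the paper's notation):

    A \approx A_k = U_k \Sigma_k V_k^T

where only the k largest singular values are kept, so components are projected into a k-dimensional latent space in which identifiers that co-occur across components map to nearby vectors, and a component can be matched to its closest domain, for example by cosine similarity.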

Adaptation of State/Transition-Based Methods for Embedded System Testing

In this paper, test generation methods and appropriate fault models for the testing and analysis of embedded systems described as (extended) finite state machines ((E)FSMs) are presented. Compared to simple FSMs, EFSMs specify not only the control flow but also the data flow. Thus, we define a two-level fault model to cover both aspects. The goal of this paper is to reuse well-known FSM-based test generation methods for the automation of embedded system testing. These methods have been widely used in the testing and validation of protocols and communicating systems. In particular, (E)FSM-based specification and testing is advantageous because, beyond their popularity in the design of hardware and software systems, (E)FSMs underpin the formal semantics of already standardised formal description techniques (FDTs).
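
As a reminder of the underlying formalism (a standard textbook definition, not necessarily the paper's exact notation), an EFSM can be given as a tuple

    M = (S, s_0, I, O, V, T)

where S is a finite set of states with initial state s_0, I and O are the input and output alphabets, V is a finite set of context variables capturing the data flow, and T is a set of transitions s --i[g]/o,a--> s' whose guard g over V must hold for the transition to fire and whose action a updates the variables; a two-level fault model of the kind described here can then target faults both in the FSM skeleton (S, T) and in the guards and actions over V.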

University of Jordan CASE Tool (UJ-CASE-TOOL) for Database Reverse Engineering

Although database reverse engineering problems and their solution processes are maturing, the academic community still faces the complex problem of knowledge transfer, in both university and industrial contexts. This paper presents a new CASE tool developed at the University of Jordan, namely UJ-CASE-TOOL, which provides efficient support for this transfer. It is a small and self-contained application exhibiting representative problems and appropriate solutions that can be understood in a limited time. The paper presents an algorithm describing the developed academic CASE tool, which has been used for several years both as an illustration of the principles of database reverse engineering and as an exercise aimed at academic and industrial students.

A Family of Zero-Stable Block Integrators for the Solution of Ordinary Differential Equations

In this paper, a linear multistep technique using a power series as the basis function is used to develop block methods suitable for generating direct solutions of special second-order ordinary differential equations with associated initial or boundary conditions. The continuous hybrid formulations enable us to differentiate and evaluate at some grid and off-grid points to obtain four different discrete schemes, each of order (5,5,5,5)^T, which are used in block form for the parallel or sequential solution of the problems. This approach avoids the computational burden and the computer time wasted in the usual reduction of a second-order problem into a system of first-order equations. Furthermore, a stability analysis is carried out, and the efficiency of the block methods is tested on linear and non-linear ordinary differential equations; the results obtained compare favorably with the exact solutions.
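
As a sketch of the standard construction (textbook collocation notation; the paper's parameters may differ), the approximate solution is sought as a power series

    y(x) = \sum_{j=0}^{m} a_j x^j

which is interpolated at selected grid points and whose second derivative is collocated at the grid and off-grid points, y''(x_k) = f(x_k, y_k); solving the resulting linear system for the coefficients a_j yields the continuous hybrid formulation, from which the discrete block schemes are recovered by evaluation.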

Dual Pyramid of Agents for Image Segmentation

Mammographic screening is an effective method for the early detection of breast cancer, and one of the most important signs of early breast cancer is the presence of microcalcifications. For the detection of microcalcifications in a mammographic image, we propose a multi-agent system based on a dual irregular pyramid. An initial segmentation is obtained by an incremental approach; the result represents level zero of the pyramid. The edge information obtained by applying the Canny filter is taken into account to refine the segmentation. The edge agents and region agents cooperate level by level within the pyramid, exploiting its various characteristics to ensure the convergence of the segmentation process.

Effective Design Parameters on the End Effect in Single-Sided Linear Induction Motors

Linear induction motors are used in various industries, but they exhibit some specific phenomena that cause problems. The most important phenomenon is the so-called end effect. The end effect decreases efficiency, power factor, and output force, and unbalances the phase currents. This phenomenon is more pronounced in medium- and high-speed machines. In this paper, a factor, EEF, is obtained from an accurate equivalent circuit model to quantify the intensity of the end effect. In this way, all design parameters affecting the end effect are described. The accuracy of this equivalent circuit model is evaluated by two-dimensional finite-element analysis using ANSYS. The results confirm the accuracy of the equivalent circuit model.
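
For context, a widely used equivalent-circuit treatment of the end effect is Duncan's dimensionless factor (a common formulation from the literature; the EEF defined in this paper need not coincide with it):

    Q = D R_r / ((L_m + L_r) v),    f(Q) = (1 - e^{-Q}) / Q

where D is the primary length, v the speed, R_r and L_r the secondary resistance and leakage inductance, and L_m the magnetizing inductance; the magnetizing inductance is then derated to L_m (1 - f(Q)), and since f(Q) grows with speed, the model reproduces the loss of force and power factor at medium and high speeds.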

Frequency-Energy Characteristics of Local Earthquakes using Discrete Wavelet Transform (DWT)

The wavelet transform is one of the most important methods used in signal processing. In this study, we introduce the frequency-energy characteristics of local earthquakes using the discrete wavelet transform. The frequency-energy characteristics were analyzed depending on the difference between the P- and S-wave arrival times and on the noise within the records. We found that local earthquakes have similar characteristics. If the frequency-energy characteristics can be determined accurately, they give a hint for calculating the P- and S-wave arrival times, and the wavelet transform provides a successful approximation for this. In this study, approximately 100 earthquakes with 500 records were analyzed.
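
For reference, with a J-level DWT a record is decomposed into detail coefficients d_{j,k}, each level j spanning a dyadic frequency band, and the energy per band can be computed as (standard DWT energy analysis, not the paper's exact notation):

    E_j = \sum_k d_{j,k}^2,    p_j = E_j / \sum_{j'} E_{j'}

so the frequency-energy characteristic of a record is the distribution p_j over the bands, and abrupt changes of energy in the high-frequency bands are the kind of signature that hints at the P- and S-wave arrival times.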

Computational Modeling in Strategic Marketing

Well-developed strategic marketing planning is an essential prerequisite for establishing the right and unique competitive advantage. A typical market, however, is a heterogeneous and decentralized structure with natural involvement of individual or group subjectivity and irrationality. These features cannot be fully expressed with one-shot rigorous formal models based on, e.g., mathematics, statistics, or empirical formulas. We present an innovative solution, extending the domain of agent-based computational economics towards the concept of hybrid modeling in service provider and consumer markets such as telecommunications. The behavior of the market is described by two classes of agents, consumer agents and service provider agents, whose internal dynamics are fundamentally different. Customers are rather free multi-state structures, quickly adjusting behavior and preferences in accordance with time and a changing environment. Producers, on the contrary, are traditionally structured companies with comparable internal processes and specific managerial policies. Their business momentum is higher and their immediate reaction possibilities are limited. This limitation underlines the importance of proper strategic planning as the main process advising managers in time whether to continue with more or less the same business or to consider the need for future structural changes that would ensure the retention of existing customers or the acquisition of new ones.

A Multi-Agent Framework for Data Mining

A generic and extendible Multi-Agent Data Mining (MADM) framework, MADMF (the Multi-Agent Data Mining Framework), is described. The central feature of the framework is that it avoids the use of agreed meta-language formats by supporting a framework of wrappers. The advantage offered is that the framework is easily extendible, so that further data agents and mining agents can simply be added to it. A demonstration MADMF implementation is currently available. The paper includes details of the MADMF architecture and the wrapper principle incorporated into it. A full description and evaluation of the framework's operation is provided by considering two MADM scenarios.
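
As an illustration of the wrapper principle (a hypothetical sketch with illustrative names and a made-up common record format; the paper does not publish this code), each data agent can hide its native storage behind a single interface, so mining agents need no agreed meta-language:

    // Minimal sketch of the wrapper idea (names are illustrative only).
    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Arrays;
    import java.util.List;
    import java.util.stream.Collectors;
    import java.util.stream.Stream;

    /** The common face every data agent presents to the mining agents. */
    interface DataWrapper {
        /** Returns the wrapped data translated into the framework's common record form. */
        List<double[]> asRecords();
    }

    /** One concrete wrapper: a CSV-backed data source. */
    class CsvDataWrapper implements DataWrapper {
        private final Path file;

        CsvDataWrapper(Path file) { this.file = file; }

        @Override
        public List<double[]> asRecords() {
            // Parse each CSV line into a numeric record of the common form.
            try (Stream<String> lines = Files.lines(file)) {
                return lines
                    .map(l -> Arrays.stream(l.split(","))
                                    .mapToDouble(Double::parseDouble)
                                    .toArray())
                    .collect(Collectors.toList());
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }
    }

Under this scheme, supporting a new data source means writing one more wrapper class rather than changing the agents, which is what makes such a framework easily extendible.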

Kinematic Analysis of Roll Motion for a Strut/SLA Suspension System

The roll center is one of the key parameters in suspension design. Several driving characteristics are affected significantly by the migration of the roll center during the suspension's motion. The strut/SLA (strut/short-long-arm) suspension, which is widely used in production cars, combines the space-saving characteristics of a MacPherson strut suspension with some of the preferred handling characteristics of an SLA suspension. In this study, a front strut/SLA suspension is modeled in the ADAMS/Car software. A kinematic roll analysis is then employed to investigate how the rolling characteristics change under wheel travel and steering input. The related parameters, including the roll center height, roll camber gain, toe change, scrub radius, and wheel track width change, are analyzed and discussed. It is found that the strut/SLA suspension clearly has a higher roll center than either the strut or the SLA suspension. The variations in roll center height during roll analysis differ greatly when wheel travel displacement and steering angle are added. The results for the roll camber gain, scrub radius, and wheel track width change are considered satisfactory. However, the toe change is too large and needs fine-tuning through a sensitivity analysis.