Simulation Model for Predicting Dengue Fever Outbreak

Dengue fever is prevalent in Malaysia, with numerous cases, including fatalities, recorded over the years. Public education on the prevention of the disease has been carried out through various means, alongside legal enforcement to eradicate the breeding grounds of Aedes mosquitoes, the dengue vector. Hence, other means need to be explored, such as predicting the seasonal peak period of the dengue outbreak and identifying the climate factors contributing to the increase in the number of mosquitoes. A simulation model can be employed for this purpose. In this study, we created a system dynamics simulation to predict the spread of a dengue outbreak in Hulu Langat, Selangor, Malaysia. The prototype was developed using STELLA 9.1.2 software. The main data inputs are rainfall, temperature and dengue cases. Analysis of the resulting graphs showed that dengue cases can be predicted accurately using the two main variables, rainfall and temperature. However, the model will be further tested over a longer time period to ensure its accuracy, reliability and efficiency as a prediction tool for dengue outbreaks.
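The abstract does not give the model equations; as a rough illustration of a system-dynamics, climate-driven host-vector model of the kind described, the Python sketch below uses assumed, uncalibrated coefficients and is not the STELLA prototype itself.

```python
# Minimal system-dynamics style dengue sketch (illustrative only; the actual
# prototype was built in STELLA 9.1.2 and its coefficients are not published here).
import numpy as np

def simulate(rainfall, temperature, days, dt=1.0):
    """Euler integration of a toy host-vector SIR model whose mosquito
    recruitment rate is driven by daily rainfall (mm) and temperature (C)."""
    S_h, I_h, R_h = 0.99, 0.01, 0.0          # human compartments (fractions)
    S_m, I_m = 1.0, 0.0                      # mosquito compartments (relative size)
    beta_hm, beta_mh, gamma, death = 0.30, 0.30, 0.14, 0.04
    cases = []
    for t in range(days):
        # assumed climate effect: more rain and warmer weather -> more mosquitoes
        birth = 0.05 * (1 + 0.002 * rainfall[t]) * (1 + 0.01 * (temperature[t] - 26))
        dS_m = birth * (S_m + I_m) - beta_mh * S_m * I_h - death * S_m
        dI_m = beta_mh * S_m * I_h - death * I_m
        dS_h = -beta_hm * S_h * I_m
        dI_h = beta_hm * S_h * I_m - gamma * I_h
        dR_h = gamma * I_h
        S_m += dt * dS_m; I_m += dt * dI_m
        S_h += dt * dS_h; I_h += dt * dI_h; R_h += dt * dR_h
        cases.append(I_h)
    return np.array(cases)

rain = np.full(120, 8.0)                     # mm/day, illustrative constant input
temp = np.full(120, 28.0)                    # degrees Celsius, illustrative
print(simulate(rain, temp, days=120)[-5:])   # predicted infectious fraction, last 5 days
```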

Gaming for the Energy Neutral Development: A Case Study of Strijp-S

This paper deals with stakeholders' decisions within energy neutral urban redevelopment processes. The decisions of these stakeholders during the process will make or break energy neutral ambitions. An extensive-form game theory model gave insight into the behavioral differences of stakeholders regarding energy neutral ambitions and the effects of changing legislation. The results show that new legislation regarding spatial planning slightly influences the behavior of stakeholders. Active behavior by the municipality will still result in the best outcome. Nevertheless, the municipality becomes more powerful when acting passively and can make use of planning tools to provide governance towards energy neutral urban redevelopment. Moreover, organizational support, recognizing the necessity for energy neutrality, keeping focus and collaboration among stakeholders are crucial elements for achieving the objective of an energy neutral urban (re)development.
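As an illustration of the kind of analysis an extensive-form model supports, the Python sketch below applies backward induction to a toy two-stage game; the players, actions and payoffs are placeholders, not the actual Strijp-S stakeholder model.

```python
# Backward induction on a tiny extensive-form game (illustrative payoffs only).
from dataclasses import dataclass, field

@dataclass
class Node:
    player: str = ""                                  # mover at this node ("" = terminal)
    actions: dict = field(default_factory=dict)       # action label -> child Node
    payoffs: dict = field(default_factory=dict)       # terminal node: player -> payoff

def solve(node):
    """Return the subgame-perfect payoff vector and the action chosen here."""
    if not node.actions:                              # terminal node
        return node.payoffs, None
    best_action, best_payoffs = None, None
    for action, child in node.actions.items():
        payoffs, _ = solve(child)
        if best_payoffs is None or payoffs[node.player] > best_payoffs[node.player]:
            best_action, best_payoffs = action, payoffs
    return best_payoffs, best_action

# municipality moves first (active/passive), then the developer responds
game = Node("municipality", {
    "active":  Node("developer", {
        "energy neutral": Node(payoffs={"municipality": 3, "developer": 2}),
        "conventional":   Node(payoffs={"municipality": 1, "developer": 1}),
    }),
    "passive": Node("developer", {
        "energy neutral": Node(payoffs={"municipality": 2, "developer": 1}),
        "conventional":   Node(payoffs={"municipality": 0, "developer": 2}),
    }),
})
payoffs, first_move = solve(game)
print(first_move, payoffs)                            # equilibrium first move and payoffs
```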

A Novel Multiple Valued Logic OHRNS Modulo rn Adder Circuit

The Residue Number System (RNS) is a modular representation and has proved to be an instrumental tool in many digital signal processing (DSP) applications that require high-speed computations. RNS is an integer, non-weighted number system; it can support parallel, carry-free, high-speed and low-power arithmetic. A very interesting correspondence exists between the concepts of Multiple Valued Logic (MVL) and residue number arithmetic. If the number of levels used to represent MVL signals is chosen to be consistent with the moduli that create the finite rings in the RNS, MVL becomes a very natural representation for the RNS. There are two concerns in applying this number system: reaching the highest possible speed and the largest dynamic range. These goals conflict, since enlarging the dynamic range reduces the speed at the same time. To achieve the highest performance, a method named the One-Hot Residue Number System (OHRNS) is considered, in which the propagation delay equals only one transistor delay. The problem with this method is the huge increase in the number of transistors, which grows on the order of m²; in real applications this is practically impossible. In this paper, combining Multiple Valued Logic and the One-Hot Residue Number System, we present a new method to resolve both of these problems: a novel design of an OHRNS-based adder circuit. This circuit is usable with Multiple Valued Logic moduli and, in comparison to other RNS designs, considerably reduces the number of transistors and the power consumption.
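For readers unfamiliar with RNS arithmetic, the Python sketch below illustrates the carry-free, channel-parallel addition the hardware exploits; the moduli set is an arbitrary example, and the sketch does not model the one-hot or multiple-valued-logic circuitry itself.

```python
# Residue Number System arithmetic sketch: residues are added channel by channel,
# with no carry propagating between channels (the property OHRNS exploits in hardware).
from math import prod

MODULI = (7, 9, 11, 13)                       # pairwise-coprime example set, range M = 9009

def to_rns(x, moduli=MODULI):
    return tuple(x % m for m in moduli)

def rns_add(a, b, moduli=MODULI):
    # each channel is an independent modulo-m adder
    return tuple((ai + bi) % m for ai, bi, m in zip(a, b, moduli))

def from_rns(r, moduli=MODULI):
    # Chinese Remainder Theorem reconstruction back to a weighted integer
    M = prod(moduli)
    return sum(ri * (M // m) * pow(M // m, -1, m) for ri, m in zip(r, moduli)) % M

assert from_rns(rns_add(to_rns(1234), to_rns(4321))) == (1234 + 4321) % prod(MODULI)
```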

Integrating Technology into Mathematics Education: A Case Study of Primary Mathematics Student Teachers

The purpose of the study is to determine primary mathematics student teachers' views on the use of instructional technology tools in the learning process and to reveal how sample presentations of different mathematical concepts affect their views. This is a qualitative study involving twelve mathematics student teachers from a public university. The data were gathered from two semi-structured interviews. The first was conducted at the beginning of the study. After that, the presentations prepared by the researchers were shown to the participants. These presentations contained animations, Geometer's Sketchpad activities, video clips, spreadsheets, and PowerPoint presentations. The second interview was conducted after these presentations. The interview data were transcribed and, through content analysis, read and reread to explore the major themes. Findings revealed that the views of the student teachers changed during this process and that they came to believe that instructional technology tools should be used in their classrooms.

Ensembling Classifiers – An Application to Image Data Classification from a Cherenkov Telescope Experiment

Ensemble learning algorithms such as AdaBoost and Bagging have been actively researched and have shown improved classification results for several benchmark data sets, mainly with decision trees as their base classifiers. In this paper we experiment with applying these meta-learning techniques to classifiers such as random forests, neural networks and support vector machines. The data sets are from MAGIC, a Cherenkov telescope experiment. The task is to separate gamma signals from overwhelmingly more frequent hadron and muon signals, a rare-class classification problem. We compare the individual classifiers with their ensemble counterparts and discuss the results. WEKA, a machine learning workbench, was used to run the experiments.
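A minimal sketch of this kind of experiment, using scikit-learn rather than WEKA, with a synthetic imbalanced data set standing in for the MAGIC gamma/hadron signals:

```python
# Bagging and boosting alongside a non-tree base classifier (illustrative only;
# the paper uses WEKA and the real MAGIC telescope data).
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier, RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_classification

# stand-in for the rare-class gamma/hadron problem: roughly 10% minority class
X, y = make_classification(n_samples=2000, n_features=10, weights=[0.9, 0.1], random_state=0)

models = {
    "random forest": RandomForestClassifier(n_estimators=100, random_state=0),
    "bagged SVM":    BaggingClassifier(SVC(kernel="rbf"), n_estimators=10, random_state=0),
    "boosted trees": AdaBoostClassifier(n_estimators=100, random_state=0),
}
for name, clf in models.items():
    auc = cross_val_score(clf, X, y, cv=5, scoring="roc_auc")
    print(f"{name:13s} AUC = {auc.mean():.3f} +/- {auc.std():.3f}")
```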

Software Process Improvement: An Organizational Change that Needs to be Managed and Motivated

As reported in the literature, about 70% of improvement initiatives fail, and a significant number do not even get started. This paper analyses the problem of failing Software Process Improvement (SPI) initiatives and proposes good practices, supported by motivational tools, that can help minimize failures. It elaborates on the hypothesis that human factors are poorly addressed by deployers, especially because implementation guides usually emphasize only technical factors. This research was conducted with SPI deployers and analyses 32 SPI initiatives. The results indicate that, although human factors are not commonly highlighted in guidelines, successful initiatives usually address human factors implicitly. This research shows that practices based on human factors indeed play a crucial role in successful implementations of SPI, proposes change management as a theoretical framework to introduce those practices in the SPI context, and suggests motivational tools, based on SPI deployers' experience, to support it.

An Application for Web Mining Systems with Service-Oriented Architecture

Although the World Wide Web is considered the largest source of information in existence today, due to its inherently dynamic characteristics the task of finding useful and qualified information can become a very frustrating experience. This study presents research on Web information mining systems and proposes an implementation of these systems by means of components that can be built using Web services technology. This implies that they can encompass the features offered by a service-oriented architecture (SOA) and that specific components may be used by other tools, independently of platform or programming language. Hence, the main objective of this work is to provide an architecture for Web mining systems, divided into stages, where each stage is a component that incorporates the characteristics of SOA. The separation of these stages was designed based upon the existing literature. Interesting results were obtained and are shown here.

Virtual Assembly in a Semi-Immersive Environment

Virtual Assembly (VA) is one of the key technologies in the advanced manufacturing field and a promising application of virtual reality in design and manufacturing. It has drawn much interest from industry and research institutes over the last two decades. This paper describes a process for integrating an interactive virtual-reality-based assembly simulation of a digital mockup with the CAD/CAM infrastructure. The necessary hardware and software preconditions for the process are explained so that it can easily be adopted by non-VR experts. The article outlines how assembly simulation can improve CAD/CAM procedures and structures, how CAD model preparations have to be carried out, and which virtual environment requirements have to be fulfilled. The issue of data transfer is also explained, along with other challenges and requirements such as anti-aliasing and collision detection. Finally, a VA simulation has been carried out for a ball valve assembly and a car door assembly with the help of the Vizard virtual reality toolkit in a semi-immersive environment, and its performance has been analyzed on different workstations to evaluate the importance of the graphics processing unit (GPU) in the field of VA.

Computer Software Applicable in Rehabilitation, Cardiology and Molecular Biology

We have developed a computer program consisting of 6 subtests assessing children's hand dexterity, applicable in rehabilitation medicine. We carried out a normative study on a representative sample of 285 children aged from 7 to 15 (mean age 11.3) and proposed clinical standards for three age groups (7-9, 9-11, 12-15 years). We have shown statistically significant differences among the corresponding mean values of task completion time. We have also found a strong correlation between task completion time and the age of the subjects, and we performed test-retest reliability checks in a sample of 84 children, giving high Pearson coefficients for the dominant and non-dominant hand in the ranges 0.74-0.97 and 0.62-0.93, respectively. A new MATLAB-based programming tool for the analysis of cardiologic RR intervals and blood pressure descriptors has also been worked out. For each set of data, ten different parameters are extracted: 2 in the time domain, 4 in the frequency domain and 4 in Poincaré plot analysis. In addition, twelve different parameters of baroreflex sensitivity are calculated. All these data sets can be visualized in the time domain together with their power spectra and Poincaré plots. If available, the respiratory oscillation curves can also be plotted for comparison. Another application processes biological data obtained from BLAST analysis.
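The cardiology tool itself is MATLAB-based; as an illustration of two of the descriptor families it covers, the Python sketch below computes simple time-domain statistics and the Poincaré-plot indices SD1/SD2 from an RR-interval series (the formulas are the standard ones, not necessarily the exact set implemented in the tool).

```python
# Time-domain and Poincare-plot HRV descriptors from RR intervals (in ms).
import numpy as np

def hrv_descriptors(rr):
    rr = np.asarray(rr, dtype=float)
    diff = np.diff(rr)
    sdnn  = rr.std(ddof=1)                           # overall variability
    rmssd = np.sqrt(np.mean(diff ** 2))              # beat-to-beat variability
    # Poincare plot: scatter of RR(n+1) vs RR(n); SD1/SD2 measure the spread
    # across and along the line of identity
    sd1 = np.sqrt(np.var(diff, ddof=1) / 2.0)
    sd2 = np.sqrt(2.0 * np.var(rr, ddof=1) - np.var(diff, ddof=1) / 2.0)
    return {"SDNN": sdnn, "RMSSD": rmssd, "SD1": sd1, "SD2": sd2}

print(hrv_descriptors([812, 790, 805, 830, 818, 799, 811, 825]))
```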

Fractal Analysis of 16S rRNA Gene Sequences in Archaea Thermophiles

A nucleotide sequence can be expressed as a numerical sequence when each nucleotide is assigned its proton number. The resulting gene numerical sequence can be investigated for its fractal dimension in terms of evolution and chemical properties for comparative studies. We have investigated such nucleotide fluctuation in the 16S rRNA gene of archaeal thermophiles. The studied archaeal thermophiles were Archaeoglobus fulgidus, Methanothermobacter thermautotrophicus, Methanocaldococcus jannaschii, Pyrococcus horikoshii, and Thermoplasma acidophilum. These five archaea-euryarchaeota thermophiles have fractal dimension values ranging from 1.93 to 1.97. Computer simulation shows that random sequences would average about 2, with a standard deviation of about 0.015. The fractal dimension was found to correlate negatively with the thermophile's optimal growth temperature, with an R2 value of 0.90 (N = 5). The inclusion of two archaea-crenarchaeota thermophiles reduces the R2 value to 0.66 (N = 7). Further inclusion of two bacterial thermophiles reduces the R2 value to 0.50 (N = 9). The fractal dimension correlates positively with the sequence GC content, with an R2 value of 0.89 for the five archaea-euryarchaeota thermophiles (and 0.74 for the entire set of N = 9), although computer simulation shows little correlation. The highest (positive) correlation was found between the fractal dimension and di-nucleotide Shannon entropy. However, Shannon entropy and sequence GC content were observed to correlate with optimal growth temperature, with R2 values of 0.8 (negative) and 0.88 (positive), respectively, for the entire set of 9 thermophiles; thus the correlation lacks species specificity. Together with another correlation study relating bacterial radiation dosage to the fractal dimension of the RecA repair gene sequence, it is postulated that fractal dimension analysis is a sensitive tool for studying the relationship between genotype and phenotype among closely related sequences.
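As an illustration of the numerical mapping described, the Python sketch below converts a nucleotide sequence into its proton-number series (A = 70, T = 66, G = 78, C = 58) and estimates a fractal dimension with the Higuchi estimator; the choice of estimator is an assumption for illustration, not necessarily the method used in the paper.

```python
# Proton-number series of a DNA sequence and a Higuchi fractal dimension estimate.
import numpy as np

PROTONS = {"A": 70, "T": 66, "G": 78, "C": 58}       # proton numbers of the four bases

def proton_series(seq):
    return np.array([PROTONS[b] for b in seq.upper() if b in PROTONS], dtype=float)

def higuchi_fd(x, kmax=10):
    n = len(x)
    curve_lengths = []
    for k in range(1, kmax + 1):
        lengths = []
        for m in range(k):                            # k down-sampled sub-series
            idx = np.arange(m, n, k)
            L = np.abs(np.diff(x[idx])).sum() * (n - 1) / ((len(idx) - 1) * k * k)
            lengths.append(L)
        curve_lengths.append(np.mean(lengths))
    # L(k) ~ k^(-D): the slope of log L versus log(1/k) estimates the dimension D
    slope, _ = np.polyfit(np.log(1.0 / np.arange(1, kmax + 1)), np.log(curve_lengths), 1)
    return slope

seq = "ATGCGTACGTTAGC" * 50                           # stand-in sequence for illustration
print(higuchi_fd(proton_series(seq)))
```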

Exploring Inter-Relationships between Events to Identify Strategic Technological Competencies: A Combined Approach

The inherent complexity of today's business environments is forcing organizations to be attentive to dynamics on several fronts. Therefore, the management of technological innovation is continually faced with uncertainty about the future. These issues call for a systemic perspective, able to analyze the consequences of interactions between different factors. The field of technology foresight has proposed methods and tools to deal with this broader perspective. In an attempt to provide a method to analyze the complex interactions between events in several areas, starting from the identification of the most strategic competencies, this paper presents a methodology based on the Delphi method and Quality Function Deployment. The methodology is applied, as a case study, to a sheet metal processing equipment manufacturer.

In-flight Meals, Passengers' Level of Satisfaction and Re-flying Intention

Service quality has become a centerpiece for airline companies in vying with one another and keeping their image in the minds of passengers. Many airlines have pursued service quality through service personalization, both on the ground and on board, especially from the viewpoint of retaining satisfied passengers and attracting new ones. In addition, in-flight meal/food service is another important aspect of airline operations. In-flight meal/food services are now seen as part of the marketing strategies for attracting business and leisure travelers. This study reports the outcomes of an investigation of in-flight meal/food attributes in relation to passengers' level of satisfaction and re-flying intention. Taste, freshness, appearance of the in-flight meals/food served and menu choices are important to airline passengers, especially on long-haul flights. Food not only contributes to the prediction of airline passengers' levels of satisfaction but, alongside other factors, also slightly influences passengers' re-flying intention. Airline companies therefore should not ignore this element but take the opportunity to create more attractive and acceptable in-flight meals/food, alongside other measures, as marketing tools to attract passengers to re-fly with them.

WPRiMA Tool: Managing Risks in Web Projects

Risk management is an essential part of project management and plays a significant role in project success. Many failures associated with Web projects are the consequences of poor awareness of the risks involved and a lack of process models that can serve as a guideline for the development of Web-based applications. To circumvent this problem, contemporary process models have been devised for the development of conventional software. This paper introduces WPRiMA (Web Project Risk Management Assessment), a tool used to implement RIAP, the risk identification architecture pattern model, which focuses on data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level for a given Web project, to generate recommendations that facilitate risk avoidance in a project, and to improve the prospects of early risk management.

Software Reengineering Tool for Traffic Accident Data

In today's fast-paced world, where everyone is short of time and often acts haphazardly, a similar scene is common on the roads. To mitigate the fatal consequences of speeding traffic on busy lanes, software to analyse and keep account of traffic and the resulting congestion is used in developed countries. This software has been implemented and used with the help of a support tool called the Critical Analysis Reporting Environment. Two versions of this tool exist. The current research paper examines the issues and problems encountered when using these two versions in practice. Further, a hybrid architecture is proposed that retains the quality and performance of both and is better in terms of component coupling, maintainability and many other features.

Towards an Automatic Translation of Colored Petri Nets to Maude Language

Colored Petri Nets (CPN) are a well-known kind of high-level Petri net. With its sound and complete semantics, rewriting logic is one of the most powerful logics for the description and verification of non-deterministic concurrent systems. Recently, CPN semantics have been defined in terms of rewriting logic, allowing models to be built and analyzed by formal reasoning. In this paper, we propose an automatic translation of CPN to the rewriting logic language Maude, supported by a tool for graphically editing and simulating CPN. The tool allows the user to draw a CPN graphically and automatically translates the graphical representation of the drawn CPN into a Maude specification. The Maude language is then used to perform the simulation of the resulting specification. It is the first rewriting-logic-based environment for this category of Petri nets.

Estimating Correlation Dimension on Japanese Candlesticks, with Application to FOREX Time Series

Recognizing behavioral patterns of financial markets is essential for traders. The Japanese candlestick chart is a common tool for visualizing and analyzing such patterns in an economic time series. Since the world was introduced to Japanese candlestick charting, traders have seen how combining this tool with intelligent technical approaches creates a powerful formula for savvy investors. This paper proposes a generalization of the Grassberger-Procaccia box-counting method, based on computing the correlation dimension of Japanese candlesticks instead of the commonly used 'close' points. The method was applied to several foreign exchange rates against the IRR (Iranian Rial). The results satisfactorily show a lower chaotic dimension for the Japanese candlestick series than the regular Grassberger-Procaccia method applied merely to the close points of the same candles, which suggests that there is valuable information inside the candlesticks.
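For reference, the baseline that the paper generalizes, a Grassberger-Procaccia correlation dimension estimated on close prices alone, can be sketched in Python as follows; the series is synthetic and the embedding parameters are illustrative.

```python
# Grassberger-Procaccia correlation dimension of a scalar series via time-delay
# embedding (baseline on 'close' points only; the paper's candlestick variant is not shown).
import numpy as np

def correlation_dimension(x, emb_dim=4, lag=1):
    x = np.asarray(x, dtype=float)
    n = len(x) - (emb_dim - 1) * lag
    emb = np.column_stack([x[i * lag: i * lag + n] for i in range(emb_dim)])
    d = np.sqrt(((emb[:, None, :] - emb[None, :, :]) ** 2).sum(-1))
    d = d[np.triu_indices(n, k=1)]                     # pairwise distances
    radii = np.logspace(np.log10(d[d > 0].min()), np.log10(d.max()), 12)[2:-2]
    c = np.array([(d <= r).mean() for r in radii])     # correlation integral C(r)
    slope, _ = np.polyfit(np.log(radii), np.log(c), 1) # D2 ~ d log C / d log r
    return slope

prices = 42000 + np.cumsum(np.random.randn(600) * 50)  # stand-in for a FOREX close series
print(correlation_dimension(prices))
```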

A Nodal Transmission Pricing Model based on Newly Developed Expressions of Real and Reactive Power Marginal Prices in Competitive Electricity Markets

In competitive electricity markets all over the world, the adoption of a suitable transmission pricing model is a problem, as the transmission segment still operates as a monopoly. Transmission pricing is an important tool to promote investment in various transmission services in order to provide economic, secure and reliable electricity to bulk and retail customers. Nodal pricing based on SRMC (Short Run Marginal Cost) has been found extremely useful by researchers for sending correct economic signals. The marginal prices must be determined as part of the solution to an optimization problem, namely the maximization of social welfare. The need to maximize social welfare subject to a number of system operational constraints is a major challenge from both the computational and the societal points of view. The purpose of this paper is to present a nodal transmission pricing model based on SRMC by developing new mathematical expressions for real and reactive power marginal prices using a GA-Fuzzy based optimal power flow framework. The impacts of selecting different social welfare functions on power marginal prices are analyzed and verified against results reported in the literature. Network revenues for two different power systems are determined using the expressions for real and reactive power marginal prices derived in this paper.
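The paper's new expressions are not reproduced in the abstract; as background only, a standard formulation (assumed here, not the authors' derivation) takes the real and reactive power nodal prices to be the Lagrange multipliers of the bus power-balance constraints in the social-welfare-maximizing optimal power flow:

```latex
% Background sketch: social welfare maximization and the multipliers that
% define nodal real and reactive power marginal prices.
\begin{aligned}
\max_{P_G,\,Q_G}\quad & SW \;=\; \sum_{d} B_d(P_{D_d}) \;-\; \sum_{g} C_g(P_{G_g}) \\
\text{s.t.}\quad & P_{G_i} - P_{D_i} - P_i(V,\theta) = 0 \qquad (\lambda_{p,i}) \\
                 & Q_{G_i} - Q_{D_i} - Q_i(V,\theta) = 0 \qquad (\lambda_{q,i})
\end{aligned}
```

where B_d and C_g are demand benefit and generation cost functions, P_i and Q_i are the injected real and reactive powers at bus i, and the nodal real and reactive power marginal prices at the optimum are the multipliers λ_{p,i} and λ_{q,i}.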

SVM-Based Detection of SAR Images in Partially Developed Speckle Noise

The Support Vector Machine (SVM) is a statistical learning tool that was initially developed by Vapnik in 1979 and later evolved into the more complex framework of structural risk minimization (SRM). SVM is playing an increasing role in detection problems across various engineering fields, notably in statistical signal processing, pattern recognition, image analysis, and communication systems. In this paper, SVM was applied to the detection of SAR (synthetic aperture radar) images in the presence of partially developed speckle noise. The simulation covered single-look and multi-look speckle models to give a complete overview of, and insight into, the newly proposed SVM-based detector. The structure of the SVM detector was derived and applied to real SAR images, and its performance in terms of the mean square error (MSE) metric was calculated. We show that the SVM-detected SAR images have a very low MSE and are of good quality. The quality of the processed speckled images improved for the multi-look model. Furthermore, the contrast of the SVM-detected images was higher than that of the original non-noisy images, indicating that the SVM approach increased the distance between the pixel reflectivity levels (the detection hypotheses) in the original images.
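The detector structure is not given in the abstract; one plausible toy formulation, sketched below in Python with scikit-learn, trains support vector regression to map a speckled pixel neighbourhood back to the underlying reflectivity and scores the result by MSE.

```python
# Toy SVM-based estimation of a scene from multiplicative (multi-look) speckle,
# scored by MSE; illustrative only, not the detector derived in the paper.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(0)
clean = np.kron(rng.uniform(0.2, 1.0, (8, 8)), np.ones((8, 8)))        # piecewise-constant scene
speckled = clean * rng.gamma(shape=4, scale=1 / 4, size=clean.shape)   # 4-look speckle model

def patches(img, k=3):
    pad = np.pad(img, k // 2, mode="reflect")
    return np.array([pad[i:i + k, j:j + k].ravel()
                     for i in range(img.shape[0]) for j in range(img.shape[1])])

X, y = patches(speckled), clean.ravel()
restored = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, y).predict(X).reshape(clean.shape)

print("MSE speckled:  ", np.mean((speckled - clean) ** 2))
print("MSE SVM output:", np.mean((restored - clean) ** 2))
```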

An Iterative Algorithm for Inverse Kinematics of 5-DOF Manipulator with Offset Wrist

This paper presents an iterative algorithm for finding an inverse kinematic solution of a 5-DOF robot, designed to minimize the number of iterations. Since a 5-DOF robot cannot realize an arbitrary full tool orientation, only the tool z-direction is satisfied, while the tool rotation is determined by the kinematic constraints. This work therefore describes how to specify the tool direction while leaving the tool rotation free. The simulation results show that the algorithm works effectively: using the proposed iterative algorithm, the inverse kinematics error converged rapidly to zero within 5 iterations. The algorithm was also applied to a real welding robot and verified through various practical tasks.
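The paper's algorithm is specific to a 5-DOF arm with an offset wrist and a tool z-direction constraint; as a generic illustration of iterative, Jacobian-based inverse kinematics, the Python sketch below runs damped least-squares updates on a simple planar 3-link arm.

```python
# Iterative (damped least-squares) inverse kinematics for a planar 3-link arm.
import numpy as np

L = np.array([0.4, 0.3, 0.2])                          # link lengths in metres (assumed)

def fk(q):
    a = np.cumsum(q)                                   # absolute link angles
    return np.array([np.sum(L * np.cos(a)), np.sum(L * np.sin(a))])

def jacobian(q):
    a = np.cumsum(q)
    J = np.zeros((2, 3))
    for i in range(3):
        J[0, i] = -np.sum(L[i:] * np.sin(a[i:]))
        J[1, i] =  np.sum(L[i:] * np.cos(a[i:]))
    return J

def ik(target, q0, tol=1e-6, max_iter=50, damping=1e-3):
    q = np.array(q0, dtype=float)
    for it in range(max_iter):
        err = target - fk(q)
        if np.linalg.norm(err) < tol:
            break
        J = jacobian(q)
        # damped least-squares step: dq = J^T (J J^T + lambda I)^-1 err
        q += J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), err)
    return q, it

q, iters = ik(target=np.array([0.5, 0.4]), q0=[0.1, 0.2, 0.3])
print("iterations:", iters, " end effector:", fk(q))
```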

Forecasting Tala-AUD and Tala-USD Exchange Rates with ANN

The focus of this paper is to construct daily time series exchange rate forecast models for the Samoan Tala/USD and Tala/AUD over the years 2008 to 2012 using neural networks. The performance of the models was measured using various error functions such as root mean square error (RMSE), mean absolute error (MAE), and mean absolute percentage error (MAPE). Our empirical findings suggest that the AR(1) model is an effective tool for forecasting the Tala/USD and Tala/AUD rates.
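A minimal Python sketch of a one-step-ahead AR(1) forecast and the three error measures named above, using a synthetic series in place of the Tala exchange-rate data:

```python
# One-step-ahead AR(1) forecast with RMSE, MAE and MAPE (synthetic stand-in data).
import numpy as np

rng = np.random.default_rng(1)
rate = 2.3 + np.cumsum(rng.normal(0, 0.005, 500))        # stand-in daily exchange-rate series

train, test = rate[:400], rate[400:]
x, y = train[:-1], train[1:]
phi = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
c = y.mean() - phi * x.mean()                            # OLS fit of y_t = c + phi * y_{t-1}

pred = c + phi * np.concatenate(([train[-1]], test[:-1]))  # one-step-ahead forecasts
err = test - pred
print("RMSE =", np.sqrt(np.mean(err ** 2)))
print("MAE  =", np.mean(np.abs(err)))
print("MAPE =", np.mean(np.abs(err / test)) * 100, "%")
```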