Investment Prediction Using Simulation

A business case is a proposal for an investment initiative to satisfy business and functional requirements. The business case provides the foundation for tactical decision making and technology risk management, and it helps clarify how the organization can best use its resources by providing justification for the investment. This paper describes how simulation was used to estimate the business case benefits and return on investment for the procurement of 8 production machines. With investment costs of about 4.7 million dollars and annual operating costs of about 1.3 million dollars, we needed to determine whether the machines would provide enough cost savings and cost avoidance. We constructed a model of the existing factory environment consisting of 8 machines and then conducted average-day simulations with light and heavy volumes to support the planning decisions that had to be documented and substantiated in the business case.
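
A minimal sketch of the kind of average-day capacity simulation the abstract describes, comparing light and heavy volumes. All figures (shift hours, cycle times, daily job counts) are hypothetical placeholders, not values from the study.

```python
# Hypothetical average-day utilization simulation for an 8-machine pool.
import random

N_MACHINES = 8
HOURS_PER_DAY = 16.0          # assumed two-shift day
CYCLE_TIME_HOURS = 0.5        # assumed mean machine time per job

def simulate_day(mean_jobs, trials=10_000):
    """Estimate average utilization of the machine pool for one demand level."""
    utilizations = []
    for _ in range(trials):
        jobs = random.gauss(mean_jobs, mean_jobs * 0.1)   # noisy daily demand
        busy_hours = jobs * CYCLE_TIME_HOURS
        capacity = N_MACHINES * HOURS_PER_DAY
        utilizations.append(min(busy_hours / capacity, 1.0))
    return sum(utilizations) / trials

print(f"light volume: {simulate_day(150):.1%} utilization")
print(f"heavy volume: {simulate_day(230):.1%} utilization")
```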

Effect of Gravity Modulation on Weakly Non-Linear Stability of Stationary Convection in a Dielectric Liquid

The effect of time-periodic oscillations of the Rayleigh-Bénard system on heat transport in dielectric liquids is investigated by weakly nonlinear analysis. We focus on stationary convection using the slow time scale and arrive at the real Ginzburg-Landau equation. The classical fourth-order Runge-Kutta method is used to solve the Ginzburg-Landau equation, which gives the amplitude of convection; this helps quantify the heat transfer in dielectric liquids in terms of the Nusselt number. The effects of the electrical Rayleigh number and the amplitude of modulation on heat transport are studied.
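
A minimal sketch of solving a real Ginzburg-Landau amplitude equation of the form dA/dt = Q1*A - Q3*A^3 with the classical fourth-order Runge-Kutta method, as the abstract describes. The coefficients and initial amplitude are hypothetical; in the paper they depend on the electrical Rayleigh number and the modulation amplitude.

```python
def rhs(t, A, Q1=1.0, Q3=1.0):
    # Real Ginzburg-Landau amplitude equation (illustrative coefficients).
    return Q1 * A - Q3 * A**3

def rk4_step(f, t, A, h):
    # Classical fourth-order Runge-Kutta step.
    k1 = f(t, A)
    k2 = f(t + h / 2, A + h * k1 / 2)
    k3 = f(t + h / 2, A + h * k2 / 2)
    k4 = f(t + h, A + h * k3)
    return A + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

A, t, h = 0.1, 0.0, 0.01
for _ in range(2000):
    A = rk4_step(rhs, t, A, h)
    t += h

# The Nusselt number is computed from the amplitude; Nu = 1 + 2*A**2 is one
# common closure, used here purely as an illustration.
print(f"amplitude A = {A:.4f}, illustrative Nu = {1 + 2 * A**2:.4f}")
```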

Assessing and Managing Intellectual Capital to Support Open Innovation Paradigm

The objective of this paper is to support the application of Open Innovation practices in firms and organizations through the assessment and management of Intellectual Capital. Intellectual Capital constituents are analyzed to verify their capability of acting as key drivers of Open Innovation processes and, therefore, of creating value. A methodology is defined to establish a procedure that helps select the most relevant Intellectual Capital value drivers and provides Communities of Innovation with strategic and managerial guidelines for sustaining the Open Innovation paradigm. An application of the methodology is developed within a dedicated project, and its results are examined here.

Visualization of Searching and Sorting Algorithms

This paper presents the execution sequences of algorithms in an interactive manner using multimedia tools. This helps convey the fundamentals of algorithms, such as searching and sorting methods, in a simple manner. Visualization attracts more attention than theoretical study and offers an easier learning process. We propose methods for tracing the runtime sequence of each algorithm interactively, aiming to overcome the drawbacks of existing character-based systems. The system illustrates every step clearly using text and animation. Comparisons of time complexity have been carried out, and the results show that our approach provides a better understanding of the algorithms.
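
A minimal sketch of how a runtime sequence of a sorting algorithm can be recorded as animation frames; the multimedia front end itself is outside this sketch, and bubble sort stands in for whichever algorithms the system covers.

```python
def bubble_sort_trace(data):
    """Yield (snapshot, compared_indices) after every comparison."""
    a = list(data)
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
            yield list(a), (j, j + 1)   # one animation frame per comparison

for frame, (i, j) in bubble_sort_trace([5, 1, 4, 2]):
    print(f"compare positions {i},{j}: {frame}")
```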

Opportunistic Routing with Secure Coded Wireless Multicast Using MAS Approach

Many Wireless Sensor Network (WSN) applications require secure multicast services for broadcasting delay-sensitive data, such as video files and live telecasts, at fixed time slots. This work provides a novel method to deal with end-to-end delay and packet drop rate. Opportunistic routing chooses the link with the maximum packet delivery probability. Null key generation helps authenticate packets to the receiver. A Markov decision process based adaptive scheduling algorithm determines the time slot for packet transmission. Both theoretical analysis and simulation results show that the proposed protocol ensures better performance in terms of packet delivery ratio, average end-to-end delay, and normalized routing overhead.
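
A minimal sketch of the opportunistic forwarding rule named above: among candidate next hops, pick the link with the highest packet delivery probability. The neighbor statistics are hypothetical placeholders; a real protocol would estimate them from observed traffic.

```python
def pick_relay(candidates):
    """candidates: {node_id: (packets_delivered, packets_sent)}"""
    def delivery_prob(stats):
        delivered, sent = stats
        return delivered / sent if sent else 0.0
    return max(candidates, key=lambda node: delivery_prob(candidates[node]))

neighbors = {"B": (90, 100), "C": (70, 80), "D": (50, 90)}
print("forward via", pick_relay(neighbors))   # -> B (0.90 > 0.875 > 0.556)
```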

Diagnosis of Multivariate Process via Nonlinear Kernel Method Combined with Qualitative Representation of Fault Patterns

The fault detection and diagnosis of complicated production processes is one of the essential tasks needed to run a process safely while maintaining good final product quality. Unexpected events occurring in the process may have a serious impact on it. In this work, a triangular representation of process measurement data obtained on-line is evaluated using a simulation process. The effect of using linear and nonlinear reduced spaces is also tested, and their diagnosis performance is demonstrated using multivariate fault data. The results show that the nonlinear diagnosis method produces more reliable results and outperforms the linear method, and that the use of an appropriate reduced space yields better diagnosis performance. The presented diagnosis framework differs from existing ones in that it attempts to extract the fault pattern in the reduced space, not in the original process variable space. The use of a reduced model space helps mitigate the sensitivity of the fault pattern to noise.
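
A minimal sketch of a triangular (qualitative) representation of a process trend: each interval is labeled by the signs of the first and second differences, which is the usual basis of triangular episode encoding. The paper's exact primitive set and the reduced-space projection may differ.

```python
import numpy as np

def qualitative_trend(x):
    d1 = np.sign(np.diff(x))            # slope sign per interval
    d2 = np.sign(np.diff(x, n=2))       # curvature sign per interior point
    labels = []
    for s, c in zip(d1[1:], d2):
        labels.append({(1, 1): "A", (1, -1): "B",     # increasing concave/convex
                       (-1, 1): "C", (-1, -1): "D"}   # decreasing variants
                      .get((int(s), int(c)), "L"))    # "L": flat/linear
    return "".join(labels)

signal = np.array([0.0, 0.2, 0.6, 1.1, 1.4, 1.5, 1.3, 0.9])
print(qualitative_trend(signal))        # -> "AABBDD" for this signal
```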

Unscented Grid Filtering and Smoothing for Nonlinear Time Series Analysis

This paper develops an unscented grid-based filter and smoother for accurate nonlinear modeling and analysis of time series. The filter uses unscented deterministic sampling during both the time and measurement updating phases to approximate the distributions of the latent state variable directly. A complementary grid smoother is also derived to enable computation of the likelihood. This helps us formulate an expectation-maximisation algorithm for maximum likelihood estimation of the state noise and the observation noise. Empirical investigations show that the proposed unscented grid filter/smoother compares favourably to similar filters on nonlinear estimation tasks.
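
A minimal sketch of the unscented deterministic sampling the filter relies on: propagate 2n+1 sigma points through a nonlinearity and recover the transformed mean and variance. This shows a scalar state with standard unscented-transform weights; the grid-based updating of the paper is not reproduced here.

```python
import numpy as np

def unscented_transform(mean, var, f, alpha=0.1, beta=2.0, kappa=0.0):
    n = 1                                   # scalar state for illustration
    lam = alpha**2 * (n + kappa) - n
    spread = np.sqrt((n + lam) * var)
    sigma = np.array([mean, mean + spread, mean - spread])   # 2n+1 points
    wm = np.array([lam / (n + lam), 0.5 / (n + lam), 0.5 / (n + lam)])
    wc = wm.copy()
    wc[0] += 1 - alpha**2 + beta            # covariance weight correction
    y = f(sigma)                            # push points through nonlinearity
    y_mean = wm @ y
    y_var = wc @ (y - y_mean) ** 2
    return y_mean, y_var

m, v = unscented_transform(1.0, 0.25, np.sin)
print(f"transformed mean {m:.4f}, variance {v:.4f}")
```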

Generational Pipelined Genetic Algorithm (PLGA) Using Stochastic Selection

In this paper, a pipelined version of the genetic algorithm, called PLGA, and a corresponding hardware platform are described. The basic operations of a conventional GA (CGA) are pipelined using an appropriate selection scheme. The selection operator used here is stochastic in nature and is called SA-selection. This helps maintain the basic generational nature of the proposed pipelined GA (PLGA). A number of benchmark problems are used to compare the performance of conventional roulette-wheel selection and SA-selection. These include unimodal and multimodal functions with dimensionality varying from very small to very large. The SA-selection scheme gives performance comparable to the classical roulette-wheel selection scheme on all instances in terms of solution quality and rate of convergence. The speedups obtained by PLGA on the different benchmarks are significant. It is shown that a complete hardware pipeline can be developed using the proposed scheme if parallel evaluation of the fitness expression is possible. In this connection, a low-cost but very fast hardware evaluation unit is described. Simulation results show that in a pipelined hardware environment, PLGA is much faster than CGA. In terms of efficiency, PLGA is also found to outperform a parallel GA (PGA).
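
A minimal sketch of the classical roulette-wheel selection used as the baseline in the comparison above. The paper's SA-selection replaces this with a simulated-annealing-style stochastic acceptance rule, which is not reproduced here.

```python
import random

def roulette_wheel(population, fitness):
    """Select one individual with probability proportional to its fitness."""
    total = sum(fitness)
    pick = random.uniform(0.0, total)
    running = 0.0
    for individual, f in zip(population, fitness):
        running += f
        if running >= pick:
            return individual
    return population[-1]   # numerical safety net

pop = ["x1", "x2", "x3"]
fit = [1.0, 3.0, 6.0]      # x3 is chosen ~60% of the time
print([roulette_wheel(pop, fit) for _ in range(10)])
```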

Analysis of Medical Data using Data Mining and Formal Concept Analysis

This paper focuses on analyzing medical diagnostic data using classification rules from data mining and context reduction from formal concept analysis. This helps find redundancies among the various medical examination tests used in the diagnosis of a disease. Classification rules have been derived from positive and negative association rules using the concept lattice structure of Formal Concept Analysis. The context reduction technique of Formal Concept Analysis, together with the classification rules, has been used to find redundancies among the various medical examination tests. It also determines whether expensive medical tests can be replaced by cheaper ones.
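
A minimal sketch of context reduction in formal concept analysis: an attribute (here, a medical test) is reducible when its extent equals the intersection of the extents of other attributes, i.e. it adds no new diagnostic information. The toy context below is hypothetical.

```python
context = {                       # attribute -> set of patients showing it
    "testA": {1, 2, 3},
    "testB": {2, 3, 4},
    "testC": {2, 3},              # extent of testA ∩ testB, hence redundant
}

def reducible_attributes(ctx):
    redundant = []
    for attr, extent in ctx.items():
        # Other attributes whose extents contain this one.
        others = [e for a, e in ctx.items() if a != attr and e >= extent]
        if others and set.intersection(*others) == extent:
            redundant.append(attr)
    return redundant

print(reducible_attributes(context))   # -> ['testC']
```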

Alignment of Emission Gamma-Ray Sources with NaI(Tl) Scintillation Detectors by Two Laser Beams Prior to Operation Using the Alternating Minimization Technique

Accurate timing alignment and stability are important to maximize the true counts and minimize the random counts in positron emission tomography. The signals output from the detectors must therefore be centered with the two isotopes before operation and fed into four pulse-processing units, each of which can accept up to eight inputs. The dual-source computed tomography system consists of two units on the left for the 15 detector signals of the Cs-137 isotope and two units on the right for the 15 detector signals of the Co-60 isotope. The gamma spectrum consists of either a single photopeak or multiple photopeaks. This allows energy-discrimination hardware in the data acquisition system to acquire photon counts at a specific energy, even if detectors with poor energy resolution are used. It also helps avoid counting Compton-scatter events, especially if a single discrete gamma photopeak is emitted by the source, as in the case of Cs-137. In this study, the polyenergetic version of the alternating minimization algorithm is applied to the dual-energy gamma computed tomography problem.
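
A minimal sketch of the energy-discrimination step described above: keep only photon events whose measured energy falls inside a window around the photopeak, rejecting Compton-scatter counts. The detector blur, scatter continuum, and window width are illustrative; Cs-137 emits its photopeak at 662 keV.

```python
import numpy as np

rng = np.random.default_rng(0)
photopeak = np.full(5000, 662.0) + rng.normal(0, 20, 5000)   # detector blur
compton = rng.uniform(50, 480, 3000)                          # scatter continuum
events_keV = np.concatenate([photopeak, compton])

lo, hi = 662 * 0.9, 662 * 1.1        # assumed ±10% discrimination window
accepted = events_keV[(events_keV >= lo) & (events_keV <= hi)]
print(f"{accepted.size} of {events_keV.size} events inside the window")
```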

Efficient Secured Lossless Coding of Medical Images Using Modified Run-Length Coding for Character Representation

Lossless compression schemes with secure transmission play a key role in telemedicine applications, helping in accurate diagnosis and research. Traditional cryptographic algorithms for data security are not fast enough to process vast amounts of data. Hence, the novel secured lossless compression approach proposed in this paper is based on a reversible integer wavelet transform, the EZW algorithm, a new modified run-length coding for character representation, and selective bit scrambling. The use of the lifting scheme allows generating truly lossless integer-to-integer wavelet transforms. Images are compressed and decompressed by the well-known EZW algorithm. The proposed modified run-length coding greatly improves the compression performance and also increases the security level. The scrambling method employed is fast, simple to implement, and provides security. The lossless compression ratios and distortion performance of the proposed method are found to be better than those of other lossless techniques.
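
A minimal sketch of plain run-length encoding on a character stream, the baseline that the paper's modified run-length coding builds on; the modifications and the bit-scrambling step are not reproduced here.

```python
from itertools import groupby

def rle_encode(s):
    # Collapse each run of identical characters into a (char, count) pair.
    return [(ch, len(list(run))) for ch, run in groupby(s)]

def rle_decode(pairs):
    return "".join(ch * count for ch, count in pairs)

encoded = rle_encode("AAAABBBCCD")
print(encoded)                               # [('A', 4), ('B', 3), ('C', 2), ('D', 1)]
assert rle_decode(encoded) == "AAAABBBCCD"   # lossless round trip
```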

Conceptual Design of an Airfoil with Temperature-Responsive Polymer

The accelerated growth of the aircraft industry demands effective schemes, programs, innovative designs of advanced systems, and facilities to meet the growing need for trouble-free air transportation. In this paper, a contemporary conceptual design of a cambered airfoil is proposed to provide an augmented effective lift force relative to the airplane and to eliminate drawbacks and limitations of the airfoil in a commercial airplane by using a kind of smart material. This unsymmetrical airfoil structure exploits the amplified air momentum around the airfoil and the increased camber length to provide improved aircraft performance and to help enhance the reliability of aircraft components. Moreover, the proposed design helps reduce airplane weight and total drag.

Mobile Robot Navigation Using Local Model Networks

Developing techniques for mobile robot navigation constitutes one of the major trends in current research on mobile robotics. This paper develops a local model network (LMN) for mobile robot navigation. The LMN represents the mobile robot by a set of locally valid submodels that are Multi-Layer Perceptrons (MLPs), trained with the Back Propagation (BP) algorithm. The scheme uses fuzzy C-means (FCM) clustering to divide the input space into subregions, and a submodel (MLP) is then identified to represent each region. The submodels are combined in a unified structure. In the run-time phase, Radial Basis Functions (RBFs) are employed as windows for the activated submodels. This structure overcomes the problem of changing operating regions of mobile robots. Real data are used in all experiments. Results for mobile robot navigation using the proposed LMN reflect the soundness of the scheme.
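
A minimal sketch of the run-time combination step of a local model network: each submodel's output is weighted by a radial basis function centered on its region, then normalized. The centers, width, and linear submodels below are stand-ins for the FCM-derived regions and trained MLPs.

```python
import numpy as np

centers = np.array([0.0, 5.0])          # assumed FCM cluster centers
width = 2.0                             # assumed RBF window width

def submodel_0(x):                      # placeholder for a trained MLP
    return 2.0 * x + 1.0

def submodel_1(x):                      # placeholder for a trained MLP
    return -x + 12.0

def lmn_output(x):
    phi = np.exp(-((x - centers) ** 2) / (2 * width**2))   # RBF validities
    weights = phi / phi.sum()                              # normalize windows
    outputs = np.array([submodel_0(x), submodel_1(x)])
    return weights @ outputs                               # blended prediction

for x in (0.0, 2.5, 5.0):
    print(f"x={x:.1f} -> y={lmn_output(x):.3f}")
```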

Loop-Back Connected Component Labeling Algorithm and Its Implementation in Face Detection

In this study, a loop-back algorithm for connected component labeling to detect objects in a digital image is presented. The loop-back connected component labeling algorithm helps the system distinguish detected objects according to their labels. Unlike the whole-window scanning technique, this technique reduces the search time for locating an object by focusing on suspected objects based on certain predefined features. In this study, the approach was also implemented in a face detection system. Face detection has become an interesting research area, since many devices and systems require face detection for various purposes. The input can be still images or videos; the subprocesses of the system therefore have to be simple, efficient, and accurate to give good results.
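
A minimal sketch of connected component labeling on a binary image using a breadth-first flood fill with 4-connectivity. The paper's loop-back variant differs in its scanning strategy; this only illustrates the labeling idea itself.

```python
from collections import deque

def label_components(img):
    rows, cols = len(img), len(img[0])
    labels = [[0] * cols for _ in range(rows)]
    current = 0
    for r in range(rows):
        for c in range(cols):
            if img[r][c] and not labels[r][c]:      # unlabeled foreground pixel
                current += 1
                queue = deque([(r, c)])
                labels[r][c] = current
                while queue:                        # flood-fill this component
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels

image = [[1, 1, 0, 0],
         [0, 1, 0, 1],
         [0, 0, 0, 1]]
for row in label_components(image):
    print(row)          # two components: label 1 (top-left), label 2 (right)
```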

Discovering Complex Regularities: from Tree to Semi-Lattice Classifications

Data mining uses a variety of techniques, each of which is useful for some particular task, and it is important to have a deep understanding of each technique to perform sophisticated analysis. In this article we describe a tool built to simulate a variation of the Kohonen network to perform unsupervised clustering and to support the entire data mining process up to results visualization. A graphical representation helps the user find a strategy to optimize the classification by adding, moving, or deleting a neuron in order to change the number of classes. The tool can automatically suggest a strategy to optimize the number of classes, and it also supports both tree classifications and semi-lattice organizations of the classes, giving users the possibility of passing from one class to the ones with which it shares some aspects. Examples of using tree and semi-lattice classifications are given to illustrate their advantages and problems. The tool is applied to classify macroeconomic data reporting the imports and exports of the most developed countries. It is possible to classify the countries based on their economic behaviour and to use the tool to characterize the commercial behaviour of a country in a selected class from the analysis of the positive and negative features that contribute to class formation. Possible interrelationships between the classes and their meaning are also discussed.
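
A minimal sketch of the Kohonen-style update underlying such a clustering tool: find the best-matching neuron and pull it toward the input. Neighborhood cooperation, neuron addition/removal, and the tool's tree/semi-lattice organization are omitted; the data here are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)
neurons = rng.normal(size=(4, 2))        # 4 class prototypes in 2-D

def train_step(x, lr=0.1):
    # Winner-take-all competitive update (simplified Kohonen rule).
    winner = np.argmin(np.linalg.norm(neurons - x, axis=1))
    neurons[winner] += lr * (x - neurons[winner])
    return winner

data = rng.normal(loc=[3.0, 3.0], size=(50, 2))
for x in data:
    train_step(x)
print("prototypes after training:\n", neurons.round(2))
```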

Delay Preserving Substructures in Wireless Networks Using Edge Difference between a Graph and its Square Graph

In practice, wireless networks have the property that signal strength attenuates with distance from the base station, so it can be better if nodes two hops away are considered for better quality of service. In this paper, we propose a procedure to identify delay-preserving substructures for a given wireless ad-hoc network using a new graph operation G^2 - E(G) = G* (the edge difference between the square graph of a given graph and the original graph). This operation helps analyze induced substructures that preserve delay in communication among their nodes. The operation G* on a given graph induces a graph in which the 1-hop neighbors of any node are at 2-hop distance in the original network. We also identify some delay-preserving substructures in G*: (i) a set of nodes that are mutually at 2-hop distance in G forms a clique in G*; (ii) a set of nodes that forms an odd cycle C(2k+1) in G forms an odd cycle in G*, while a set of nodes that forms an even cycle C(2k) in G forms two disjoint companion cycles (of the same parity, odd/even) of length k in G*; (iii) every path of length 2k+1 or 2k in G induces two disjoint paths of length k in G*; and (iv) a set of nodes in G* that induces a maximal connected subgraph with radius 1 identifies a substructure with radius 2 and diameter at most 4 in G. These delay-preserving substructures behave as good clusters in the original network.
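
A minimal sketch of the graph operation G* = G^2 - E(G) described above: two nodes are adjacent in G* exactly when they are at distance 2 in G. Plain adjacency sets are used; the even cycle C4 (k = 2) in the example splits into the two predicted disjoint companion pieces, which for k = 2 degenerate to single edges.

```python
def square_minus_g(adj):
    """adj: {node: set(neighbors)}; returns the adjacency of G* on the same nodes."""
    star = {v: set() for v in adj}
    for v in adj:
        for u in adj[v]:
            for w in adj[u]:
                if w != v and w not in adj[v]:   # w is at distance exactly 2 from v
                    star[v].add(w)
    return star

c4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}   # even cycle C4
print(square_minus_g(c4))   # {0: {2}, 1: {3}, 2: {0}, 3: {1}}: two disjoint pairs
```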

Classification and Analysis of Risks in Software Engineering

Despite the various methods that exist for software risk management, software projects have a high failure rate. As project complexity and size increase, managing software development becomes more difficult, and the need for thorough analysis and risk assessment becomes vital. In this paper, a classification of software risks is specified, and the relations between these risks are presented using a risk tree structure. Analysis and assessment of these risks are performed using probabilistic calculations. This analysis supports qualitative and quantitative assessment of the risk of failure and can help the software risk management process. The classification and risk tree structure can be applied in software tools.
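
A minimal sketch of a probabilistic calculation over a risk tree: leaf risks carry estimated probabilities, and internal nodes combine their children with OR (any child occurs) or AND (all children occur) semantics, assuming independence. The tree and its numbers are hypothetical, not the paper's classification.

```python
def risk_probability(node):
    if "p" in node:                                   # leaf risk
        return node["p"]
    child_ps = [risk_probability(c) for c in node["children"]]
    if node["gate"] == "AND":
        prob = 1.0
        for p in child_ps:
            prob *= p                                 # all children must occur
        return prob
    # OR gate: P(at least one) = 1 - prod(1 - p_i)
    prob = 1.0
    for p in child_ps:
        prob *= (1.0 - p)
    return 1.0 - prob

tree = {"gate": "OR", "children": [
    {"p": 0.10},                                      # e.g. a requirements risk
    {"gate": "AND", "children": [{"p": 0.30}, {"p": 0.20}]},  # joint schedule risk
]}
print(f"project failure risk: {risk_probability(tree):.3f}")   # -> 0.154
```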

Application of Neural Network in User Authentication for Smart Home System

Security has been an important issue and concern in smart home systems. Since smart home networks consist of a wide range of wired and wireless devices, there is a possibility of illegal access to restricted data or devices. Password-based authentication is widely used to identify authorized users, because this method is cheap, easy, and quite accurate. In this paper, a neural network is trained to store the passwords instead of using a verification table. This method is useful for solving the security problems that occur in some authentication systems. The conventional way of training the network using Backpropagation (BPN) requires a long training time, so a faster training algorithm, Resilient Backpropagation (RPROP), is embedded in the MLP neural network to accelerate the training process. For the data, 200 sets of user IDs and passwords were created and encoded into binary as input. Simulations were carried out to evaluate the performance for different numbers of hidden neurons and combinations of transfer functions, with Mean Square Error (MSE), training time, and number of epochs used to determine network performance. From the results obtained, using Tansig and Purelin in the hidden and output layers with 250 hidden neurons gave the best performance. As a result, a password-based user authentication system for smart homes using a neural network was developed successfully.
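
A minimal sketch of the idea of storing credentials in network weights instead of a verification table: an MLP is trained to output 1 for valid (user ID, password) bit patterns and 0 for corrupted ones. This uses scikit-learn's MLPClassifier with its default solver, not the RPROP training or the Tansig/Purelin layers reported in the paper; the 32-bit encoding is a toy stand-in.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
valid = rng.integers(0, 2, size=(200, 32))        # 200 encoded credential pairs
tampered = valid.copy()
tampered[np.arange(200), rng.integers(0, 32, size=200)] ^= 1  # flip one bit per row

X = np.vstack([valid, tampered])
y = np.r_[np.ones(200), np.zeros(200)]            # 1 = authorized, 0 = rejected

net = MLPClassifier(hidden_layer_sizes=(250,), max_iter=2000, random_state=0)
net.fit(X, y)
print("training accuracy:", net.score(X, y))      # how well credentials are stored
```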

Improved Wavelet Neural Networks for Early Cancer Diagnosis Using Clustering Algorithms

Wavelet neural networks (WNNs) have emerged as a vital alternative to the vastly studied multilayer perceptrons (MLPs) since their first implementation. In this paper, we applied various clustering algorithms, namely K-means (KM), Fuzzy C-means (FCM), symmetry-based K-means (SBKM), symmetry-based Fuzzy C-means (SBFCM), and modified point symmetry-based K-means (MPKM), to choose the translation parameters of a WNN. These modified WNNs were then applied to heterogeneous cancer classification using benchmark microarray data and compared against a conventional WNN with random initialization. Experimental results showed that the WNN classifier with the MPKM algorithm is more precise than the conventional WNN as well as the WNNs with the other clustering algorithms.
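
A minimal sketch of the clustering-based initialization described above: cluster centers of the training inputs serve as the translation parameters of the wavelet nodes, instead of random initialization. Only the plain K-means variant is shown; the symmetry-based variants and the synthetic stand-in data are outside this sketch.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 5)),        # synthetic stand-in for
               rng.normal(4, 1, (100, 5))])       # microarray feature vectors

n_wavelons = 6                                    # assumed number of wavelet nodes
km = KMeans(n_clusters=n_wavelons, n_init=10, random_state=0).fit(X)
translations = km.cluster_centers_                # one translation per wavelon
print("translation parameters:\n", translations.round(2))
```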

The Effect of Binahong on Hematoma

To elevate performance in competitive sports, an athlete must train continuously to achieve maximum performance, but must also pay attention to recovery therapy, that is, recovery from fatigue as well as injury. Correct recovery therapy assists the process of recovery and helps training achieve better performance. Binahong (Anredera cordifolia) has been proven empirically by locals to assist speedy recovery from injury. In clinical research with laboratory animals receiving blunt trauma injury, the animals macroscopically showed signs of: 1) redness, 2) heat, 3) swelling, and 4) lack of activity. There were also microscopic indications of: 1) infiltration of inflammatory cells (migration of cells to the trauma area), 2) cell necrosis, 3) congestion (as a result of dead red blood cells), and 4) oedema. After administration of Binahong for 3 days, there was a significant 5% drop in cell inflammation and a 2% increase in fibroblast count. Conclusion: Binahong does assist in reducing cell inflammation and increases fibroblast counts. Suggestion: to help athletes recover from blunt-force injuries, further study of the effect of Binahong roots on inflammatory cells and the healing of injured cells is needed.