Implementation of Parallel Interface for Microprocessor Trainer

In this paper, a parallel interface for a microprocessor trainer is implemented. A programmable parallel-port device such as the 8255A IC is initialized for simple input or output and for handshake input or output by selecting the appropriate operating modes. The hardware connections and programs presented can be used to interface the microprocessor trainer with a personal computer through the 8255A. Assembly programs edited in the PC's editor can then be downloaded to the trainer.
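
As a minimal sketch of the mode-selection step, the helper below builds the standard 8255A mode-definition control byte from the group modes and port directions described in the datasheet; the function name and keyword flags are illustrative, not from the paper.

```python
# Sketch: assembling the 8255A mode-definition control word.
# Bit layout follows the standard 8255A datasheet (bit 7 = mode-set flag);
# helper name and flag names are illustrative.

def control_word(mode_a=0, a_in=False, c_upper_in=False,
                 mode_b=0, b_in=False, c_lower_in=False):
    """Return the 8255A control byte for the given group modes/directions."""
    cw = 0x80                        # bit 7 = 1 selects mode definition
    cw |= (mode_a & 0b11) << 5       # group A mode (0, 1, or 2)
    cw |= int(a_in) << 4             # port A: 1 = input, 0 = output
    cw |= int(c_upper_in) << 3       # port C upper half direction
    cw |= (mode_b & 0b1) << 2        # group B mode (0 or 1)
    cw |= int(b_in) << 1             # port B direction
    cw |= int(c_lower_in)            # port C lower half direction
    return cw

print(hex(control_word()))                     # 0x80: all ports output, mode 0
print(hex(control_word(a_in=True)))            # 0x90: port A input, rest output
print(hex(control_word(mode_a=1, a_in=True)))  # 0xb0: port A handshake input
```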

A Novel Computer Vision Method for Evaluating Deformations of Fiber Cross Sections in False Twist Textured Yarns

Over the past five decades, textured polyester yarns produced by the false twist method have been among the most important and most mass-produced man-made fibers. Many cross-section parameters affect the physical and mechanical properties of textured yarns: surface area, perimeter, equivalent diameter, large diameter, small diameter, convexity, stiffness, eccentricity, and hydraulic diameter. These parameters were evaluated by digital image processing techniques. To find trends between production criteria and the evaluated cross-section parameters, three criteria of the production line, temperature, drafting ratio, and D/Y ratio, were adjusted and different types of yarns were produced. Finally, the relations between the production criteria and the cross-section parameters were examined. The results show that the presented technique can recognize and measure the parameters of the fiber cross section with acceptable accuracy. The optimum adjustment conditions were also estimated from the results of the image analysis.
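
The measurements named above map closely onto standard region properties; the sketch below shows how they could be computed with scikit-image, assuming the cross sections have already been segmented into a binary image. The file name and Otsu threshold are illustrative, not the paper's pipeline.

```python
# Sketch: measuring fiber cross-section parameters from a segmented image.
import numpy as np
from skimage import io, measure, filters

img = io.imread("cross_section.png", as_gray=True)   # hypothetical input image
binary = img > filters.threshold_otsu(img)           # segment the fibers
labels = measure.label(binary)

for region in measure.regionprops(labels):
    area = region.area
    perimeter = region.perimeter
    equiv_d = np.sqrt(4.0 * area / np.pi)            # equivalent diameter
    hydraulic_d = 4.0 * area / perimeter             # hydraulic diameter
    print(f"area={area}, perimeter={perimeter:.1f}, "
          f"equiv_d={equiv_d:.1f}, hydraulic_d={hydraulic_d:.1f}, "
          f"large_d={region.major_axis_length:.1f}, "
          f"small_d={region.minor_axis_length:.1f}, "
          f"eccentricity={region.eccentricity:.3f}, "
          f"convexity={region.solidity:.3f}")
```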

Assessing Pre-Service Teachers' Computer Phobia Levels in terms of Gender and Experience: A Turkish Sample

This study aims to determine the level of pre-service teachers' computer phobia and tests whether computer phobia varies statistically significantly with gender and computer experience. The study was performed on 430 pre-service teachers at the Education Faculty in Rize, Turkey. Data were collected through the Computer Phobia Scale, consisting of the "Personal Knowledge Questionnaire", the "Computer Anxiety Rating Scale", and the "Computer Thought Survey", and were analyzed with statistical procedures such as the t test and correlation analysis. According to the results, the computer phobia of pre-service teachers does not vary statistically with gender, although male pre-service teachers have higher computer anxiety scores and lower computer thought scores. A strong negative relation was observed between computer experience and computer anxiety, and pre-service teachers who use a computer regularly reported lower computer anxiety. The results are discussed in terms of the number and hours of computer classes in the Education Faculty curriculum and the computer availability of student teachers.
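
For concreteness, the two named analyses could be run with SciPy as below: an independent-samples t test across gender and a Pearson correlation between experience and anxiety. The arrays are placeholder values, not the study's data.

```python
# Sketch of the reported analyses: t test and correlation with SciPy.
import numpy as np
from scipy import stats

anxiety_male = np.array([62, 71, 55, 68, 74])      # placeholder scores
anxiety_female = np.array([58, 66, 61, 70, 59])
experience_years = np.array([1, 3, 5, 2, 7, 4])    # placeholder experience
anxiety = np.array([80, 65, 50, 72, 41, 60])

t, p = stats.ttest_ind(anxiety_male, anxiety_female)
r, p_r = stats.pearsonr(experience_years, anxiety)
print(f"t = {t:.2f} (p = {p:.3f}); r = {r:.2f} (p = {p_r:.3f})")
```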

Comparison of FAHP and TOPSIS for Evacuation Capability Assessment of High-rise Buildings

Many computer-based methods have been developed to assess the evacuation capability (EC) of high-rise buildings. Because such software tools are time-consuming and unsuitable for on-scene applications, we adopted two methods, the fuzzy analytic hierarchy process (FAHP) and the technique for order preference by similarity to an ideal solution (TOPSIS), for EC assessment of a high-rise building in Jinan. The EC scores obtained with the two methods and the evacuation times computed with Pathfinder 2009 for floors 47-60 of the building were compared with each other. The results show that FAHP performs better than TOPSIS for EC assessment of high-rise buildings, especially in dealing with the effect of occupant type and distance to exit on EC, in tackling complex problems with a multi-level structure of criteria, and in requiring less computation. However, both FAHP and TOPSIS failed to handle appropriately the situation where the exit width changes while occupants are few.
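
The standard TOPSIS steps the comparison relies on are compact enough to sketch: vector-normalize the decision matrix, weight it, and rank alternatives by closeness to the ideal solution. The matrix, weights, and benefit/cost labels below are made-up illustrations, not the paper's criteria.

```python
# Sketch of TOPSIS ranking for EC scores.
import numpy as np

X = np.array([[0.7, 120.0, 3.0],         # rows: floors; cols: criteria
              [0.5, 150.0, 2.0],
              [0.9, 100.0, 4.0]])
w = np.array([0.5, 0.3, 0.2])            # criteria weights
benefit = np.array([True, False, True])  # False = cost criterion

V = w * X / np.linalg.norm(X, axis=0)    # normalized, weighted matrix
ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)  # higher = better EC
print(np.argsort(-closeness))             # ranking of alternatives
```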

A Study on Applying 3D Reconstruction to 3D Last Morphing

The last is regarded as the critical foundation of shoe design and development, and this study proposes a computer-aided methodology for designing various last forms. Reverse engineering is applied to scan the last form; then, minimizing the energy for revision of surface continuity, the last surface is reconstructed from the feature curves of the scanned last. Once the surface reconstruction is complete, the weighted arithmetic mean method is applied to shape morphing of the last control mesh, so that 3D last forms of different sizes are generated from the original form while its functional features are preserved. Finally, the results of this study are applied in a 3D last reconstruction system, and the practicability of the proposed methodology is verified through case studies.
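
The morphing step itself reduces to a weighted arithmetic mean over corresponding control-mesh vertices; the sketch below assumes two reference lasts with identical mesh topology and uses random arrays as stand-ins for the scanned meshes.

```python
# Sketch: weighted-arithmetic-mean morphing between two last control meshes.
import numpy as np

mesh_small = np.random.rand(500, 3)   # control mesh of the smaller last
mesh_large = np.random.rand(500, 3)   # larger last, same vertex ordering

def morph(mesh_a, mesh_b, w):
    """Weighted arithmetic mean of corresponding vertices, 0 <= w <= 1."""
    return (1.0 - w) * mesh_a + w * mesh_b

mesh_mid = morph(mesh_small, mesh_large, 0.5)   # an intermediate size
```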

View-Point Insensitive Human Pose Recognition using Neural Network and CUDA

Although much research has been done on human pose recognition, the camera view-point remains a critical problem for recognition systems. In this paper, a view-point insensitive human pose recognition method is proposed, aimed at both view-point insensitivity and real-time processing. The recognition system consists of a feature extraction module, a neural network, and real-time feed-forward calculation. First, a histogram-based method, well suited to representing the shape of a human pose, is used to extract a feature from the silhouette image, and Principal Component Analysis (PCA) is used to reduce the dimension of the feature vector. Second, real-time processing is achieved with the Compute Unified Device Architecture (CUDA), which speeds up the feed-forward calculation of the neural network. We demonstrate the effectiveness of our approach with experiments in a real environment.
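
A plain-NumPy sketch of the pipeline follows: a radial histogram feature from a silhouette, a PCA projection, and one feed-forward layer. The bin count, PCA basis, and network weights are placeholders (the paper runs the feed-forward step on CUDA; this sketch only shows the data flow).

```python
# Sketch: histogram feature -> PCA -> feed-forward layer.
import numpy as np

def radial_histogram(silhouette, bins=32):
    ys, xs = np.nonzero(silhouette)              # foreground pixels
    cy, cx = ys.mean(), xs.mean()
    r = np.hypot(ys - cy, xs - cx)               # distance from centroid
    hist, _ = np.histogram(r, bins=bins, density=True)
    return hist

silhouette = np.zeros((120, 80))
silhouette[30:90, 30:50] = 1                     # dummy silhouette mask
x = radial_histogram(silhouette)

components = np.random.rand(32, 8)               # placeholder PCA basis
z = x @ components                               # reduced feature vector

W, b = np.random.rand(8, 10), np.zeros(10)       # placeholder layer weights
scores = np.tanh(z @ W + b)
print(scores.argmax())                           # predicted pose class
```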

A Computational Stochastic Modeling Formalism for Biological Networks

Stochastic models of biological networks are well established in systems biology, where the computational treatment of such models is often focused on solving the so-called chemical master equation via stochastic simulation algorithms. In contrast, the development of storage-efficient model representations that are directly suitable for computer implementation has received significantly less attention. Instead, a model is usually described in terms of a stochastic process or a "higher-level paradigm" with a graphical representation, such as a stochastic Petri net. A serious problem then arises from the exponential growth of the model's state space, which is in fact a main reason for the popularity of stochastic simulation, since simulation suffers less from state space explosion than non-simulative numerical solution techniques. In this paper we present transition class models for the representation of biological network models, a compact mathematical formalism that circumvents state space explosion. Transition class models can also serve as an interface between different higher-level modeling paradigms, stochastic processes, and the implementation coded in a programming language. Moreover, the compact model representation makes non-simulative solution techniques applicable while preserving the possible use of stochastic simulation. Illustrative examples of transition class representations are given for an enzyme-catalyzed substrate conversion and a part of the bacteriophage λ lysis/lysogeny pathway.
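
The flavor of such a representation can be sketched as follows for the enzyme example: each transition class bundles a propensity function with a state-change vector, and Gillespie's SSA runs directly on that list without enumerating the state space. The rate constants and initial counts are illustrative, not taken from the paper.

```python
# Sketch: transition classes for E + S <-> ES -> E + P, simulated with SSA.
import numpy as np

# state vector = [E, S, ES, P]; each class = (propensity, state update)
classes = [
    (lambda s: 1e-3 * s[0] * s[1], np.array([-1, -1, +1, 0])),  # binding
    (lambda s: 1e-1 * s[2],        np.array([+1, +1, -1, 0])),  # unbinding
    (lambda s: 1e-1 * s[2],        np.array([+1, 0, -1, +1])),  # conversion
]

state, t, rng = np.array([100, 500, 0, 0]), 0.0, np.random.default_rng(0)
while t < 100.0:
    a = np.array([rate(state) for rate, _ in classes])
    a0 = a.sum()
    if a0 == 0:
        break                                    # no transition can fire
    t += rng.exponential(1.0 / a0)               # time to next event
    k = rng.choice(len(classes), p=a / a0)       # which class fires
    state = state + classes[k][1]
print(t, state)
```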

Load Modeling for Power Flow and Transient Stability Computer Studies at BAKHTAR Network

A method has been developed for preparing load models for power flow and transient stability studies. The load modeling (LOADMOD) computer software transforms data on load class mix, composition, and characteristics into the form required by commonly used power flow and transient stability simulation programs. Typical default data have been developed for load composition and characteristics. This paper describes the LOADMOD software, the dynamic and static load modeling techniques it uses, and the results of initial testing on the BAKHTAR power system.
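
As a hedged illustration of the static side of such modeling, the snippet below evaluates the widely used ZIP representation (constant-impedance, constant-current, constant-power fractions); the coefficients are placeholders, and LOADMOD's actual default data are not reproduced here.

```python
# Sketch: a ZIP static load model.
def zip_load(p0, v, v0=1.0, z=0.4, i=0.3, p=0.3):
    """Active power drawn at voltage v for a ZIP load (z + i + p = 1)."""
    ratio = v / v0
    return p0 * (z * ratio**2 + i * ratio + p)

print(zip_load(100.0, 0.95))   # power drawn at 5% undervoltage
```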

Simulating a Single-Server Queue using the Q-Simulator

This paper introduces a technique for simulating a single-server exponential queuing system. The technique, called the Q-Simulator, is a computer program that can simulate the effect of traffic intensity on all system average quantities given the arrival and/or service rates. The Q-Simulator has three phases: the formula-based method, uncontrolled simulation, and controlled simulation. It generates graphs (crystal solutions) for all results of the simulation or calculation and can be used to estimate desirable average quantities such as waiting times and queue lengths.
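
The formula-based phase for an M/M/1 queue can be sketched directly from the standard closed-form results; the function below computes the usual average quantities from the arrival rate lam and service rate mu (with lam < mu), without reproducing the Q-Simulator's internals.

```python
# Sketch: closed-form M/M/1 average quantities.
def mm1_averages(lam, mu):
    rho = lam / mu                    # traffic intensity
    L = rho / (1 - rho)               # mean number in system
    Lq = rho**2 / (1 - rho)           # mean queue length
    W = 1 / (mu - lam)                # mean time in system
    Wq = rho / (mu - lam)             # mean waiting time in queue
    return dict(rho=rho, L=L, Lq=Lq, W=W, Wq=Wq)

print(mm1_averages(lam=4.0, mu=5.0))
```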

Toward Delegated Democracy: Vote by Yourself, or Trust Your Network

Recent developments in Information and Communication Technology (ICT) enable new ways of "democratic" decision-making, such as page-ranking systems, which estimate the importance of a web page from the indirect trust in that page shared by a diverse group of unorganized individuals. These kinds of "democracy" have not yet been embraced in real politics. On the other hand, a large amount of data about personal relations, including trust, norms of reciprocity, and networks of civic engagement, has been accumulated in computer-readable form by systems such as social networking services. We can use these relations as a new type of social capital to construct a new democratic decision-making system based on a delegation network. In this paper, we propose an effective decision-making support system based on empowering the vote of someone you trust. For this purpose, we propose two new techniques: the first estimates the entire vote distribution from a small number of votes, and the second estimates active voters' choices to promote voting using the delegation network. We show through agent-based simulations that these techniques can increase the voting ratio and the credibility of the whole decision.
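The core mechanics of vote delegation can be sketched simply: a non-voter's weight flows along a trust edge to a delegate, and chains resolve transitively. The toy graph and ballots below are illustrative, and cycle handling is deliberately simplified; this is not the paper's estimation algorithm itself.

```python
# Sketch: tallying votes over a delegation network.
def tally(votes, delegates):
    """votes: voter -> choice for direct voters; delegates: voter -> trustee."""
    result = {}
    for voter in set(votes) | set(delegates):
        seen, v = set(), voter
        while v not in votes and v in delegates and v not in seen:
            seen.add(v)               # follow the delegation chain
            v = delegates[v]
        choice = votes.get(v)         # None if the chain never reaches a voter
        if choice is not None:
            result[choice] = result.get(choice, 0) + 1
    return result

votes = {"ann": "A", "bob": "B"}
delegates = {"carol": "ann", "dave": "carol"}   # dave -> carol -> ann
print(tally(votes, delegates))                   # {'A': 3, 'B': 1}
```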

Simulating a Discrete-Time Model Reference Adaptive Control System with Large Initial Error

This article is based on the Discrete Parameter Tracking (DPT) technique, first introduced by A. A. Azab [8], which is applicable to lower-order reference models: the order of the reference model is (n-1), where n is the number of adjustable parameters in the physical plant. The technique utilizes a modified gradient method [9] in which knowledge of the exact order of the non-adaptive system is not required, thus eliminating the identification problem. The applicability of DPT has been examined through the solution of several problems. This article presents the solution of a third-order system with three adjustable parameters, controlled according to a second-order reference model. The adjustable parameters have a large initial error, which represents a demanding test condition. Computer simulations of the solution and analysis are provided to demonstrate the simplicity and feasibility of the technique.
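
To give a feel for gradient-based parameter adjustment of the kind DPT builds on, the sketch below runs a generic discrete, MIT-rule-style update that drives an adjustable gain to shrink the model-following error. The first-order plant, reference model, and adaptation gain are all illustrative; this is not the paper's DPT algorithm itself.

```python
# Sketch: discrete gradient (MIT-rule-style) parameter adjustment.
import numpy as np

a_plant, a_model, gamma = 0.8, 0.5, 0.05
theta, y, ym = 0.0, 0.0, 0.0                   # large initial parameter error
r = np.sign(np.sin(0.05 * np.arange(2000)))    # square-wave reference input

for k in range(1999):
    y = a_plant * y + theta * r[k]             # adjustable plant
    ym = a_model * ym + (1 - a_model) * r[k]   # reference model output
    e = y - ym                                 # model-following error
    theta -= gamma * e * r[k]                  # discrete gradient step

print(theta)    # theta adapts to reduce the model-following error
```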

Image Magnification Using Adaptive Interpolation by Pixel Level Data-Dependent Geometrical Shapes

With the world now in the 21st century, the technology of computer graphics and digital cameras is prevalent, and high-resolution displays and printers are widely available; high-resolution images are therefore needed to produce high-quality displayed images and high-quality prints. However, since high-resolution images are not usually available, the original images must be magnified. One common difficulty of previous magnification techniques is preserving details, i.e. edges, while at the same time smoothing the data so as not to introduce spurious artefacts; a definitive solution is still an open issue. In this paper, image magnification using adaptive interpolation by pixel-level data-dependent geometrical shapes is proposed, which takes into account information about edges (sharp luminance variations) and the smoothness of the image. It calculates a threshold, classifies the interpolation region in the form of geometrical shapes, and then assigns suitable values to the undefined pixels inside the interpolation region while preserving the sharp luminance variations and smoothness at the same time. The results of the proposed technique have been compared qualitatively and quantitatively with five other techniques. The qualitative results show that the proposed method clearly outperforms nearest-neighbour (NN), bilinear (BL), and bicubic (BC) interpolation, while the quantitative results are competitive and consistent with NN, BL, BC, and the others.
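
The threshold-then-classify idea can be sketched on a single 2x2 neighborhood: if the local luminance variation is below the threshold, treat the region as smooth and average; otherwise interpolate only along the dominant side. The threshold value and the edge rule are illustrative stand-ins, not the paper's actual shape classification.

```python
# Sketch: threshold-based smooth/edge classification for one unknown pixel.
import numpy as np

def classify_and_fill(block, threshold=20.0):
    """block: 2x2 neighborhood of known pixels around an undefined pixel."""
    variation = block.max() - block.min()      # sharp luminance change?
    if variation < threshold:
        return block.mean()                    # smooth region: plain average
    # edge region: average only the pixels on the dominant (brighter) side
    side = block[block >= block.mean()]
    return side.mean()

print(classify_and_fill(np.array([[100., 102.], [101., 99.]])))  # smooth case
print(classify_and_fill(np.array([[20., 200.], [25., 210.]])))   # edge case
```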

A Robust Frequency Offset Estimation Scheme for OFDM System with Cyclic Delay Diversity

Cyclic delay diversity (CDD) is a simple technique to intentionally increase the frequency selectivity of channels for orthogonal frequency division multiplexing (OFDM). This paper proposes a residual carrier frequency offset (RFO) estimation scheme for an OFDM-based broadcasting system using CDD. To improve the RFO estimation, the paper addresses how to choose the amount of cyclic delay and the pilot pattern used to estimate the RFO. Computer simulations show that the proposed estimator benefits from a properly chosen delay parameter and performs robustly.
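
As a hedged sketch of pilot-based RFO estimation in the classical Moose style, the common phase rotation of the pilots between two consecutive OFDM symbols reveals the residual offset. The FFT size, cyclic prefix length, and pilot values below are illustrative; the paper's specific delay/pilot design is not reproduced.

```python
# Sketch: Moose-style residual CFO estimate from pilots in adjacent symbols.
import numpy as np

N, Ncp = 64, 16                          # FFT size and cyclic prefix length
pilots = np.array([4, 20, 36, 52])       # pilot subcarrier indices
rfo_true = 0.02                          # RFO in subcarrier spacings
rng = np.random.default_rng(1)

Y1 = np.exp(1j * rng.uniform(0, 2 * np.pi, pilots.size))   # symbol l pilots
rot = np.exp(1j * 2 * np.pi * rfo_true * (N + Ncp) / N)    # per-symbol rotation
Y2 = Y1 * rot + 0.01 * (rng.standard_normal(pilots.size)
                        + 1j * rng.standard_normal(pilots.size))

rfo_hat = np.angle(np.sum(Y2 * np.conj(Y1))) * N / (2 * np.pi * (N + Ncp))
print(rfo_hat)                           # close to 0.02
```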

Computer-Aided Analysis of Flow in a Rotating Single Disk

In this study, a two-dimensional axisymmetric, steady-state, incompressible laminar flow in a rotating single disk is numerically investigated. The finite volume method is used to solve the momentum equations. The numerical model and results are validated by comparison with previously reported experimental data for velocities, angles, and moment coefficients. It is demonstrated that increasing the axial distance increases the axial velocity but decreases the tangential and total velocities, while the maximum non-dimensional radial velocity occurs near the disk wall. It is also found that the moment coefficient decreases as the rotational Reynolds number increases.

Synchronization Between Two Chaotic Systems: Numerical and Circuit Simulation

In this paper, a generalized synchronization scheme for chaotic systems, called function synchronization, is studied. Based on the Lyapunov method and the active control method, we design a synchronization controller such that the error dynamics between the master and slave chaotic systems are asymptotically stable. To verify the theory, computer and circuit simulations for a specific chaotic system are conducted.
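
The active-control idea can be sketched on a standard example: the controller cancels the slave's nonlinear terms, injects the master's, and adds linear error feedback, so the error obeys ė = −k·e and decays. The Lorenz system and gains below are illustrative; the paper's specific chaotic system is not reproduced here.

```python
# Sketch: master-slave synchronization of Lorenz systems by active control.
import numpy as np

sigma, rho, beta, k, dt = 10.0, 28.0, 8.0 / 3.0, 5.0, 1e-3

def lorenz(s):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

m = np.array([1.0, 1.0, 1.0])          # master state
s = np.array([-8.0, 7.0, 25.0])        # slave state, different initial values
for _ in range(200_000):
    e = s - m
    u = lorenz(m) - lorenz(s) - k * e  # active control: cancel + stabilize
    m = m + dt * lorenz(m)             # forward-Euler integration
    s = s + dt * (lorenz(s) + u)
print(np.abs(s - m).max())             # synchronization error decays to ~0
```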

Robust Nonlinear Control of a Two-Link Robot Manipulator and Computing Maximum Load

A new robust nonlinear control scheme for a manipulator is proposed in this paper, robust against modeling errors and unknown disturbances. It is based on the principle of variable structure control, using the sliding mode control (SMC) method. Variable structure control is a robust method that appears well suited to robotic manipulators because it requires only bounds on the robot arm parameters, although there is no single systematic procedure that is guaranteed to produce a suitable control law. To reduce chattering of the control signal, we replace the sgn function in the control law with a continuous approximation such as the hyperbolic tangent function. We can also compute the maximum load with regard to the torque applied at the joints. The effectiveness of the proposed approach is demonstrated analytically and through computer simulations for the cases of variable load and varying robot arm parameters.
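
The smoothing step can be sketched for a single joint: the switching term −K·sgn(s) on the sliding surface s = ė + λe is replaced by −K·tanh(s/φ) inside a boundary layer of width φ. The gains and surface slope below are illustrative, not the paper's tuned values.

```python
# Sketch: tanh-smoothed sliding-mode switching term for one joint.
import numpy as np

lam, K, phi = 5.0, 20.0, 0.05          # surface slope, gain, boundary layer

def smc_torque(q, qd, q_ref, qd_ref):
    e, ed = q - q_ref, qd - qd_ref
    s = ed + lam * e                   # sliding surface s = de + lam * e
    return -K * np.tanh(s / phi)       # continuous approximation of -K*sgn(s)

print(smc_torque(q=0.30, qd=0.0, q_ref=0.25, qd_ref=0.0))
```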

LabVIEW with Fuzzy Logic Controller Simulation Panel for Condition Monitoring of Oil and Dry Type Transformer

Condition monitoring of electrical power equipment has attracted considerable attention for many years. The aim of this paper is to use LabVIEW with a Fuzzy Logic controller to build a simulation system that diagnoses transformer faults and monitors transformer condition. The front panel of the system was designed in LabVIEW to let the computer act as a custom-designed instrument. The dissolved gas-in-oil analysis (DGA) method was used for oil type transformer diagnosis, while terminal voltage and current analysis was used for dry type transformers. Fuzzy Logic serves as the expert system that assesses all information keyed in at the front panel to diagnose and predict the condition of the transformer, and the outcome of the Fuzzy Logic interpretation is displayed on the LabVIEW front panel to show the user the condition of the transformer at any time.
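
As a small sketch of the fuzzy front end such a DGA expert system might use, the snippet below fuzzifies one common gas ratio with triangular memberships and evaluates one illustrative rule. The breakpoints and the rule are placeholders; the paper's actual rule base and the LabVIEW wiring are not reproduced.

```python
# Sketch: fuzzifying a DGA gas ratio with triangular memberships.
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

c2h2_c2h4 = 0.8                         # acetylene/ethylene ratio from DGA
low = tri(c2h2_c2h4, -0.5, 0.0, 0.5)
mid = tri(c2h2_c2h4, 0.1, 1.0, 3.0)
high = tri(c2h2_c2h4, 1.0, 3.0, 10.0)
print(f"low={low:.2f}, mid={mid:.2f}, high={high:.2f}")

# Illustrative rule: a mid-range C2H2/C2H4 ratio suggests discharge activity
print(f"discharge-fault degree: {mid:.2f}")
```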

Evaluating Complexity – Ethical Challenges in Computational Design Processes

Complexity as a theoretical background has made it easier to understand and explain the features and dynamic behavior of various complex systems. As this common theoretical background has confirmed, borrowing terminology for design from the natural sciences has helped to control and understand urban complexity. Phenomena like self-organization, evolution, and adaptation are apt descriptions of the formerly inaccessible characteristics of the complex environment in unpredictable bottom-up systems, and increased computing capacity has been a key element in capturing the chaotic nature of these systems. A paradigm shift in urban planning and architectural design has forced us to give up the illusion of total control over the urban environment and, consequently, to seek novel methods for steering its development. New methods using dynamic modeling have offered a real option for a more thorough understanding of complexity and urban processes. At best, the new approaches may renew design processes so that we get a better grip on the complex world via more flexible processes, support urban environmental diversity, and respond to needs beyond basic welfare by liberating ourselves from standardized minimalism. A complex system and its features are as such beyond human ethics: self-organization and evolution are neither good nor bad, and their mechanisms are by nature devoid of reason. They are common features of urban dynamics and natural processes alike; they cannot be prevented, yet their dynamics can be studied and supported. The paradigm of complexity and the new design approaches have been criticized for a lack of humanity and morality, but the ethical implications of scientific or computational design processes have not been much discussed. It is important to distinguish the ethics of the theory and tools from the ethics of computer-aided processes based on ethical decisions. Urban planning and architecture cannot be based on the survival of the fittest; however, the natural dynamics of the system cannot be impeded on grounds of being "non-human". In this paper, the ethical challenges of using dynamic models are contemplated in light of a few examples of new architecture, dynamic urban models, and the literature. It is suggested that ethical challenges in computational design processes could be reframed under the concepts of responsibility and transparency.

Agent-based Simulation for Blood Glucose Control in Diabetic Patients

This paper employs a new approach to regulating the blood glucose level of a type I diabetic patient under intensive insulin treatment. The closed-loop control scheme incorporates expert knowledge about treatment through reinforcement learning theory so as to maintain the normoglycemic average of 80 mg/dl and a normal free plasma insulin concentration from a severe initial state. The insulin delivery rate is obtained off-line with a Q-learning algorithm, without requiring an explicit model of the environment dynamics; implementing the delivery rate therefore requires only simple function evaluation and minimal online computation. Controller performance is assessed in terms of its ability to reject the effect of meal disturbances and to overcome the variability in glucose-insulin dynamics from patient to patient. Computer simulations are used to evaluate the effectiveness of the proposed technique and to show its superiority over other existing algorithms in controlling hyperglycemia.
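
A minimal sketch of the tabular Q-learning update used to learn such a policy offline follows; the discretized states, actions, and the toy environment are placeholders, not the paper's glucose-insulin model.

```python
# Sketch: tabular Q-learning for a discretized dosing policy.
import numpy as np

n_states, n_actions = 20, 5            # glucose bins / insulin dose levels
alpha, gamma_, eps = 0.1, 0.95, 0.1
Q = np.zeros((n_states, n_actions))
rng = np.random.default_rng(0)

def step(s, a):
    """Placeholder environment: state drifts, dose pushes it down."""
    s2 = int(np.clip(s + rng.integers(-2, 3) - (a - 2), 0, n_states - 1))
    r = -abs(s2 - 10)                  # reward peaks at the normoglycemic bin
    return s2, r

s = int(rng.integers(n_states))
for _ in range(50_000):
    a = int(rng.integers(n_actions)) if rng.random() < eps else int(Q[s].argmax())
    s2, r = step(s, a)
    Q[s, a] += alpha * (r + gamma_ * Q[s2].max() - Q[s, a])  # Q-learning update
    s = s2
print(Q.argmax(axis=1))                # greedy dose per glucose bin
```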

Speaker Identification using Neural Networks

The speech signal conveys information about the identity of the speaker, and the area of speaker identification is concerned with extracting the identity of the person speaking an utterance. As speech interaction with computers becomes more pervasive in activities such as telephone use, financial transactions, and information retrieval from speech databases, the utility of automatically identifying a speaker based solely on vocal characteristics grows. This paper focuses on text-dependent speaker identification, which deals with detecting a particular speaker from a known population. The system prompts the user to provide a speech utterance, identifies the user by comparing the codebook of the utterance with those stored in the database, and lists the speakers most likely to have produced that utterance. The speech signal is recorded for N speakers, and features are then extracted by means of LPC coefficients, the AMDF, and the DFT. The neural network is trained with these features as input parameters, and the features are stored in templates for later comparison. The features of the speaker to be identified are extracted and compared with the stored templates using the backpropagation algorithm: the extracted features of the unknown speaker form the network input, the network adjusts its weights, and the best match identifies the speaker. The number of epochs required to reach the target determines the network's performance.
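
One of the named features, the average magnitude difference function (AMDF), is simple enough to sketch directly: D(k) = mean(|x(n) − x(n+k)|) over a frame, with minima at multiples of the pitch period. The frame below is synthetic; LPC and DFT features would be stacked with this to form the network input.

```python
# Sketch: AMDF over a speech frame (minima mark the pitch period).
import numpy as np

def amdf(frame, max_lag):
    n = len(frame)
    return np.array([np.mean(np.abs(frame[:n - k] - frame[k:]))
                     for k in range(1, max_lag + 1)])

t = np.arange(400)
frame = np.sin(2 * np.pi * t / 50)      # synthetic voiced frame, period 50
d = amdf(frame, 100)
print(1 + int(d.argmin()))              # lag of the minimum ~ pitch period (50)
```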