A Proposal for Federation Technology for Authenticated Information between Terminals

Recently, services such as television and the Internet have come to be accessible through a variety of terminals. Greater convenience would be possible if a user could receive a service on a cellular phone while out and then continue receiving the same service on a large-screen digital television after returning home. At present, however, the user must go through the same authentication procedure again when switching to the television. In this study, we have developed an authentication method that enables users to switch terminals in environments where a service is delivered from a server to a terminal. Specifically, the method simplifies server-side authentication when switching from one terminal to another by reusing previously authenticated information.
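
To illustrate the general idea of reusing previously authenticated information, the sketch below shows one common way such a hand-off can be built: the server issues a short-lived, signed token bound to an already-authenticated session, and the second terminal presents it to resume the session without repeating full authentication. This is only an illustrative sketch with hypothetical names, not the authors' proposed protocol.

```python
# Illustrative hand-off sketch (hypothetical names, not the paper's protocol):
# the server signs a short-lived token bound to an authenticated session, and
# the new terminal presents it to resume that session without re-authenticating.
import hashlib
import hmac
import json
import secrets
import time

SERVER_KEY = secrets.token_bytes(32)          # kept on the server only

def issue_handoff_token(session_id, lifetime_s=60):
    claims = {"sid": session_id, "exp": time.time() + lifetime_s,
              "nonce": secrets.token_hex(8)}
    body = json.dumps(claims, sort_keys=True).encode()
    tag = hmac.new(SERVER_KEY, body, hashlib.sha256).hexdigest()
    return body.decode(), tag                  # given to terminal A, handed to terminal B

def resume_session(body, tag):
    expected = hmac.new(SERVER_KEY, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        return None                            # forged/altered token -> full authentication needed
    claims = json.loads(body)
    if claims["exp"] < time.time():
        return None                            # expired token -> full authentication needed
    return claims["sid"]                       # session resumed on the new terminal

body, tag = issue_handoff_token("session-42")
print("resumed session:", resume_session(body, tag))
```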

High Performance VLSI Architecture of 2D Discrete Wavelet Transform with Scalable Lattice Structure

In this paper, we propose a fully utilized, block-based 2D DWT (discrete wavelet transform) architecture, which consists of four 1D DWT filters with a two-channel QMF lattice structure. The proposed architecture requires about 2MN-3N registers to store the intermediate results for higher-level decomposition, where M and N stand for the filter length and the row width of the image, respectively. Furthermore, the proposed 2D DWT processes the horizontal and vertical directions simultaneously without an idle period, so that it computes the DWT of an N×N image in a period of N^2(1-2^(-2J))/3, where J is the number of decomposition levels. Compared with existing approaches, the proposed architecture achieves 100% hardware utilization and high throughput rates. To mitigate the long critical-path delay caused by the cascaded lattices, a four-stage pipeline can be applied while retaining 100% hardware utilization. The proposed architecture can be applied to real-time video signal processing.
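
To make the stated period concrete, here is a worked example with illustrative values that are not taken from the paper (N = 512, J = 3, and a 9-tap filter, M = 9):

```latex
% Worked example with illustrative values (not from the paper):
\begin{align}
  T &= \frac{N^{2}\bigl(1 - 2^{-2J}\bigr)}{3}
     = \frac{512^{2}\,\bigl(1 - 2^{-6}\bigr)}{3}
     = \frac{262144 \cdot 63/64}{3} = 86\,016 \ \text{clock cycles},\\
  R &\approx 2MN - 3N = 2\cdot 9\cdot 512 - 3\cdot 512 = 7\,680 \ \text{registers}.
\end{align}
```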

Analysis of the Effect of Lightning Surge Conditions on Surge Arresters in an Electrical Power System Using the ATP/EMTP Program

A lightning surge produces traveling waves and a temporary voltage rise in a transmission line system. Because lightning is one of the most damaging events for transmission lines and substation equipment, the resulting temporary overvoltage must be studied and analyzed when designing and siting surge arresters. This analysis describes the shape of the lightning wave in a 115 kV transmission line in Thailand, using the ATP/EMTP program to model the transmission line and the lightning surge. Owing to the limitations of the program, the transmission-line geometry and surge parameters must be calculated by hand, following the manual, to obtain the closest parameter values. Furthermore, to assess the effect on the surge protection device when lightning strikes, the surge arrester model must be correct and must comply with the Metropolitan Electricity Authority standard. The calculated results were also compared with field data. The analysis shows that the temporary overvoltage on the struck line rises to 326.59 kV when no surge arrester is installed in the system, whereas it is limited to 182.83 kV when a surge arrester is installed, and the duration of the traveling wave is also reduced. The surge arrester should be installed as close to the transformer as possible. It is therefore necessary to determine the correct installation distance and arrester rating in order to suppress the temporary overvoltage effectively.
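
As a simple check, the relative reduction provided by the arrester follows directly from the two reported peak values:

```latex
% Relative reduction implied by the reported peak overvoltages:
\[
  \frac{326.59\ \mathrm{kV} - 182.83\ \mathrm{kV}}{326.59\ \mathrm{kV}} \approx 0.44,
\]
% i.e. installing the surge arrester limits the temporary overvoltage on the
% struck line by roughly 44%.
```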

Infrastructure Means for Adaptive Camouflage

The paper deals with the perspectives and possibilities of "smart solutions" for critical infrastructure protection, i.e. the use of common computer-aided technologies to provide new, better protection of selected infrastructure objects. The paper focuses on a co-product of the Czech Defence Research Project ADAPTIV, which is carried out by the University of Defence, Faculty of Economics and Management, at the Department of Civil Protection. The project creates a system and technology for adaptive cybernetic camouflage of armed forces objects, armaments, vehicles and troops, and of mobilization infrastructure. This adaptive camouflage system and technology will be useful for protecting army tactical activities and also for generating decoys. The fourth chapter of the paper concerns the possibility of using the introduced technology for the protection of selected civil (economically important) critical infrastructure objects. The aim of this section is to introduce the scientific capabilities and potential of the University of Defence research results and solutions for practice.

Quantitative Analysis of PCA, ICA, LDA and SVM in Face Recognition

Face recognition is a technique to automatically identify or verify individuals. It receives great attention in identification, authentication, security and many other applications. Diverse methods have been proposed for this purpose, and many comparative studies have been performed, yet researchers have not reached a unified conclusion. In this paper, we report an extensive quantitative accuracy analysis of four of the most widely used face recognition algorithms: Principal Component Analysis (PCA), Independent Component Analysis (ICA), Linear Discriminant Analysis (LDA) and Support Vector Machine (SVM), using the AT&T, Sheffield and Bangladeshi face databases under diverse conditions such as illumination, alignment and pose variations.
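
A minimal sketch of how such a comparison can be set up is shown below, assuming scikit-learn and its Olivetti faces loader (the AT&T database); the pipelines, dimensionalities and split are illustrative choices, not the authors' experimental protocol.

```python
# Minimal comparison sketch on the AT&T (Olivetti) faces, assuming scikit-learn;
# illustrative pipelines only, not the paper's exact experimental setup.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA, FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

faces = fetch_olivetti_faces()
X_tr, X_te, y_tr, y_te = train_test_split(
    faces.data, faces.target, test_size=0.3, stratify=faces.target, random_state=0)

candidates = {
    "PCA + 1-NN": make_pipeline(PCA(n_components=60), KNeighborsClassifier(1)),
    "ICA + 1-NN": make_pipeline(FastICA(n_components=60, max_iter=1000, random_state=0),
                                KNeighborsClassifier(1)),
    "LDA":        make_pipeline(PCA(n_components=60), LinearDiscriminantAnalysis()),
    "SVM":        make_pipeline(PCA(n_components=60), SVC(kernel="rbf", C=10, gamma="scale")),
}
for name, clf in candidates.items():
    clf.fit(X_tr, y_tr)
    print(f"{name:10s} accuracy = {accuracy_score(y_te, clf.predict(X_te)):.3f}")
```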

Creative Technology as an Open-Ended Learning Tool: A Case Study of a Design School in Malaysia

Does open-ended creative technology have a positive impact on learning design? Although many researchers have examined the impact of technology on design education, very little conclusive research has been done on the impact of open-ended use of software on learning design. This paper investigates a group of students' experience with a relatively wide range of software applications within the context of a design project. A typography design project was used to create a learning environment aimed at instilling design skills in the learners and increasing their creative problem-solving and critical-thinking skills. The methods used in this study were a questionnaire survey and personal observation, focusing on individual and group responses during completion of the task.

Optimizing the Usage of ICTs and Outsourcing Strategies in Business Models and Customer Satisfaction

Nowadays, developing countries, in order to progress in science and technology and to narrow the technological gap with developed countries, are increasing their capacities and transferring technology from developed countries. To remain competitive, industry continually searches for new methods to evolve its products. The business model is one of the latest buzzwords in the Internet and electronic business world. To be successful, organizations must look into the needs and wants of their customers. This research attempts to identify the specific features of a company with a strong competitive advantage by analyzing the causes of customer satisfaction. Owing to the rapid development of knowledge and information technology, business environments have become much more complicated. Information technology can help a firm that aims to gain a competitive advantage. This study explores the role and effect of information and communication technology (ICT) on business models and customer satisfaction in firms, as well as the relationships between ICTs and outsourcing strategy.

Multi-Objective Optimization for Performance-based Seismic Retrofit using Connection Upgrade

Unanticipated brittle fractures of steel moment-resisting frame (SMRF) connections occurred in the 1994 Northridge earthquake. Since then, research has been conducted on the vulnerability of connections in existing SMRFs and on the rehabilitation of those buildings. This paper proposes a performance-based optimal seismic retrofit technique using connection upgrades. For the optimal design, a multi-objective genetic algorithm (NSGA-II) is used. One objective function minimizes the initial cost, and the other minimizes the lifetime seismic damage cost. The proposed optimization satisfies performance objectives specified in FEMA 356. Nonlinear static analysis is performed for the evaluation of structural seismic performance. A numerical example of the SAC benchmark SMRF is provided using the proposed performance-based optimal seismic retrofit technique.
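
The two-objective setup can be sketched as follows, assuming the DEAP library and purely placeholder cost functions; in the paper, the objectives would be the initial retrofit cost and the lifetime seismic damage cost obtained from nonlinear static analysis, not the toy functions used here.

```python
# Minimal NSGA-II sketch (DEAP) for a two-objective retrofit problem with
# placeholder cost functions; not the paper's structural model.
import random
from deap import base, creator, tools

creator.create("FitnessMin2", base.Fitness, weights=(-1.0, -1.0))
creator.create("Individual", list, fitness=creator.FitnessMin2)

N_CONN, LOW, UP = 20, 0.0, 1.0                       # hypothetical upgrade level per connection

def evaluate(ind):
    initial_cost = sum(ind)                          # placeholder: cost grows with upgrades
    damage_cost = sum((1.0 - x) ** 2 for x in ind)   # placeholder: damage falls with upgrades
    return initial_cost, damage_cost

toolbox = base.Toolbox()
toolbox.register("attr", random.uniform, LOW, UP)
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attr, N_CONN)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", evaluate)
toolbox.register("mate", tools.cxSimulatedBinaryBounded, eta=20.0, low=LOW, up=UP)
toolbox.register("mutate", tools.mutPolynomialBounded, eta=20.0, low=LOW, up=UP, indpb=1.0 / N_CONN)
toolbox.register("select", tools.selNSGA2)

pop = toolbox.population(n=60)
for ind in pop:
    ind.fitness.values = toolbox.evaluate(ind)
pop = toolbox.select(pop, len(pop))                  # assigns crowding distance

for gen in range(100):
    offspring = [toolbox.clone(ind) for ind in tools.selTournamentDCD(pop, len(pop))]
    for a, b in zip(offspring[::2], offspring[1::2]):
        toolbox.mate(a, b)
        toolbox.mutate(a); toolbox.mutate(b)
        del a.fitness.values, b.fitness.values
    for ind in offspring:
        if not ind.fitness.valid:
            ind.fitness.values = toolbox.evaluate(ind)
    pop = toolbox.select(pop + offspring, len(pop))  # NSGA-II environmental selection

front = tools.sortNondominated(pop, len(pop), first_front_only=True)[0]
print("Pareto front size:", len(front))
```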

An Experiment on Personal Archiving and Retrieving Image System (PARIS)

PARIS (Personal Archiving and Retrieving Image System) is an experimental personal photograph library that includes more than 80,000 consumer photographs accumulated over approximately five years, metadata based on our proposed MPEG-7 annotation architecture, Dozen Dimensional Digital Content (DDDC), and a relational database structure. The DDDC architecture is specially designed to facilitate the management, browsing and retrieval of personal digital photograph collections. In the annotation process, we also utilize a proposed Spatial and Temporal Ontology (STO) designed according to the general characteristics of personal photograph collections. This paper explains the PARIS system.
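
For readers unfamiliar with the general idea, the sketch below shows a minimal relational layout for photo metadata with spatial and temporal fields, assuming SQLite; it is purely illustrative and does not reproduce the DDDC or STO schemas defined in the paper.

```python
# Illustrative sketch only: a minimal relational layout for personal photo
# metadata with spatial/temporal fields (SQLite); not the paper's DDDC/STO schema.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE photo (
    photo_id   INTEGER PRIMARY KEY,
    file_path  TEXT NOT NULL,
    taken_at   TEXT,          -- temporal annotation (ISO 8601)
    place      TEXT,          -- spatial annotation (ontology term)
    event      TEXT           -- e.g. 'birthday', 'trip'
);
CREATE TABLE annotation (
    photo_id   INTEGER REFERENCES photo(photo_id),
    dimension  TEXT,          -- one annotation dimension (who/where/when/...)
    value      TEXT
);
""")
conn.execute("INSERT INTO photo (file_path, taken_at, place, event) VALUES (?, ?, ?, ?)",
             ("2004/06/img_0001.jpg", "2004-06-12T10:30:00", "Taipei", "trip"))
for (path,) in conn.execute("SELECT file_path FROM photo WHERE event = 'trip'"):
    print(path)
conn.close()
```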

Investigating the Performance of Minimax Search and Aggregate Mahalanobis Distance Function in Evolving an Ayo/Awale Player

In this paper we describe a hybrid technique combining minimax search and aggregate Mahalanobis distance function synthesis to evolve an Awale game player. The hybrid technique helps to suggest a move in a short amount of time without consulting an endgame database. However, the effectiveness of the technique is heavily dependent on the training dataset of Awale strategies used. The evolved player was tested against an Awale shareware program, and the results are appealing.
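
The sketch below shows the general shape of such a hybrid: a depth-limited minimax search whose leaf evaluation is the Mahalanobis distance of the board's feature vector to a training set of favourable positions. The game API (`game.legal_moves`, `game.play`, `game.features`, `game.is_terminal`) is a hypothetical placeholder, and the training data are assumed given; this is not the authors' exact evaluation function.

```python
# Sketch: minimax with a Mahalanobis-distance leaf evaluation (hypothetical game
# API and training data); replaces an endgame database lookup at the leaves.
import numpy as np

def fit_mahalanobis(good_positions):
    """good_positions: (n_samples, n_features) array of favourable board features."""
    mu = good_positions.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(good_positions, rowvar=False))
    return mu, cov_inv

def evaluate(board_features, mu, cov_inv):
    d = board_features - mu
    return -float(np.sqrt(d @ cov_inv @ d))   # closer to favourable positions -> higher score

def minimax(state, depth, maximizing, game, mu, cov_inv):
    if depth == 0 or game.is_terminal(state):
        return evaluate(game.features(state), mu, cov_inv)
    values = (minimax(game.play(state, m), depth - 1, not maximizing, game, mu, cov_inv)
              for m in game.legal_moves(state))
    return max(values) if maximizing else min(values)

def best_move(state, game, mu, cov_inv, depth=4):
    return max(game.legal_moves(state),
               key=lambda m: minimax(game.play(state, m), depth - 1, False, game, mu, cov_inv))
```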

Dimensioning of a Subsynchronous Cascade for Speed Regulation of Two-Motor 6 kV Conveyor Drives

One way to achieve optimum loading of over-dimensioned conveyors is to decrease their speed (capacity), taking production capabilities and demands into account. For conveyors driven by three-phase slip-ring induction motors, a technically reasonable solution for regulating the speed of the driving motors is a constant-torque subsynchronous cascade with a static semiconductor converter and a transformer that returns the slip energy to the power network. The paper describes a mathematical model for the parameter calculation of a two-motor 6 kV subsynchronous cascade. It is also demonstrated that applying this cascade yields several benefits, foremost savings in electrical energy, along with improvements in other energy indexes, which ultimately results in a cost reduction for the complete electric motor drive.
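
For background, the standard slip-power-recovery relations below (textbook material, not the paper's specific two-motor model) show why returning the slip energy to the network yields the reported energy savings:

```latex
% Standard slip-power-recovery relations (f = supply frequency, p = pole pairs,
% s = slip, P_ag = air-gap power); background only, not the paper's model.
\begin{align}
  n_s &= \frac{60 f}{p}, \qquad n = (1 - s)\, n_s,\\
  P_{\mathrm{mech}} &= (1 - s)\, P_{\mathrm{ag}}, \qquad
  P_{\mathrm{rotor}} = s\, P_{\mathrm{ag}},
\end{align}
% so in a subsynchronous cascade the slip power s*P_ag is not dissipated in
% rotor resistors but returned (minus converter and transformer losses) to the
% supply network.
```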

The Evaluation and Application of FMEA in Sepahan Oil Co

Failure modes and effects analysis (FMEA) is an effective technique for preventing potential problems and identifying the actions needed to remove the causes of errors. The oil producing companies play a critical role in the oil industry of Iran as a developing country, and Sepahan Oil Co. makes a considerable contribution to it. The aim of this research is to show how FMEA can be applied to improve the quality of products at Sepahan Oil Co. For this purpose, the company's four-liter production line was selected for investigation. The findings imply that the application of FMEA has reduced scrap from 50,000 ppm to 5,000 ppm and has resulted in a 0.92 percent decrease in oil waste.
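
As a reminder of how FMEA prioritizes corrective actions in general practice, the short sketch below computes and ranks Risk Priority Numbers; the failure modes and scores are invented for illustration and are not data from the Sepahan Oil study.

```python
# Standard FMEA risk prioritization (general practice, illustrative data only):
# each failure mode is scored 1-10 for severity (S), occurrence (O) and
# detection (D); the Risk Priority Number RPN = S * O * D ranks what to fix first.
failure_modes = [
    # (description, S, O, D) -- hypothetical entries for a filling line
    ("cap not sealed",         7, 4, 3),
    ("under-filled container", 5, 6, 2),
    ("label misprint",         3, 5, 4),
]
ranked = sorted(failure_modes, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)
for desc, s, o, d in ranked:
    print(f"{desc:22s} RPN = {s * o * d}")
```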

Efficient and Extensible Data Processing Framework in Ubiquitous Sensor Networks

This paper presents the design and prototype implementation of an intelligent data processing framework for ubiquitous sensor networks. Much focus is put on how to handle the sensor data stream, as well as on the interoperability between low-level sensor data and application clients. Our framework first provides a systematic middleware that mediates the interaction between the application layer and low-level sensors, analyzing large volumes of sensor data by filtering and integrating them to create value-added context information. Then, an agent-based architecture is proposed for real-time data distribution, efficiently forwarding a specific event to the appropriate application registered in the directory service via an open interface. The prototype implementation demonstrates that our framework can host sophisticated applications on a ubiquitous sensor network and can autonomously evolve into new middleware, taking advantage of promising technologies such as software agents, XML, cloud computing, and the like.
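
The register-and-forward idea can be sketched as below, with hypothetical names: applications register an event type with a directory service, and the middleware filters raw readings into value-added events and forwards each event only to the registered applications. This is a conceptual sketch, not the framework's actual interfaces.

```python
# Conceptual sketch (hypothetical names) of the directory-based event forwarding
# described in the abstract; not the framework's real API.
from collections import defaultdict

class DirectoryService:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def register(self, event_type, callback):
        self._subscribers[event_type].append(callback)     # open interface for applications

    def dispatch(self, event_type, payload):
        for callback in self._subscribers[event_type]:
            callback(payload)                               # forward only to registered apps

def filter_reading(reading):
    """Turn a raw sensor reading into a value-added event, or None to drop it."""
    if reading["kind"] == "temperature" and reading["value"] > 60.0:
        return "over_temperature", {"node": reading["node"], "celsius": reading["value"]}
    return None

directory = DirectoryService()
directory.register("over_temperature", lambda e: print("alarm app notified:", e))

for raw in [{"kind": "temperature", "node": 7, "value": 21.5},
            {"kind": "temperature", "node": 3, "value": 72.0}]:
    event = filter_reading(raw)
    if event:
        directory.dispatch(*event)
```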

Object Detection Based on Weighted Center-Surround Difference

Intelligent traffic surveillance is an important issue in the field of traffic data analysis, and it requires technology that can detect moving objects in real time under varying background and natural-light conditions. In this paper, we propose a weighted center-surround difference method for object detection in outdoor environments. The proposed system detects objects using a saliency map obtained by analyzing the weight of each layer of a Gaussian pyramid. In order to validate the effectiveness of our system, we implemented the proposed method on a digital signal processor, the TMS320DM6437. Experimental results show that blur and noise around objects are effectively eliminated and the object detection accuracy is improved.
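
A rough sketch of a weighted center-surround difference saliency map computed from a Gaussian pyramid is shown below, assuming OpenCV and NumPy; the per-level weights and the input path are placeholders, not the paper's analyzed weights or DSP implementation.

```python
# Rough sketch of weighted center-surround difference saliency from a Gaussian
# pyramid (OpenCV/NumPy); placeholder weights, not the paper's weighting scheme.
import cv2
import numpy as np

def weighted_center_surround_saliency(gray, levels=5, weights=None):
    weights = weights or [1.0 / (i + 1) for i in range(levels - 1)]   # placeholder weights
    pyramid = [gray.astype(np.float32)]
    for _ in range(levels - 1):
        pyramid.append(cv2.pyrDown(pyramid[-1]))

    h, w = gray.shape
    saliency = np.zeros((h, w), np.float32)
    for i in range(levels - 1):
        center = cv2.resize(pyramid[i], (w, h), interpolation=cv2.INTER_LINEAR)
        surround = cv2.resize(pyramid[i + 1], (w, h), interpolation=cv2.INTER_LINEAR)
        saliency += weights[i] * cv2.absdiff(center, surround)
    return cv2.normalize(saliency, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

frame = cv2.imread("frame.jpg", cv2.IMREAD_GRAYSCALE)          # placeholder outdoor frame
mask = cv2.threshold(weighted_center_surround_saliency(frame), 0, 255,
                     cv2.THRESH_BINARY + cv2.THRESH_OTSU)[1]   # candidate object regions
```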

Multi-Functional Insect Cuticles: Informative Designs for Man-Made Surfaces

Biomimicry has many potential benefits, as technologies found in nature are often superior to their man-made counterparts. As technological device components approach the micro and nanoscale, surface properties such as surface adhesion and friction may need to be taken into account. Lowering surface adhesion by manipulating chemistry alone might no longer be sufficient for such components, and thus physical manipulation may be required. Adhesion reduction is only one of the many surface functions displayed by the micro/nano-structured cuticles of insects. Here, we present a mini review of our understanding of insect cuticle structures and the relationship between the structure dimensions and the corresponding functional mechanisms. It may be possible to introduce additional properties to material surfaces (indeed multi-functional properties) based on the design of natural surfaces.

Analytical Model Based Evaluation of Human Machine Interfaces Using Cognitive Modeling

Cognitive models allow predicting some aspects of the utility and usability of human machine interfaces (HMI) and simulating the interaction with these interfaces. The prediction is based on a task analysis, which investigates what a user is required to do, in terms of actions and cognitive processes, to achieve a task. Task analysis facilitates the understanding of the system's functionalities. Cognitive models belong to the analytical approaches, which do not involve users during the development process of the interface. This article presents a study on the evaluation of human machine interaction with a contextual assistant's interface using the ACT-R and GOMS cognitive models. The present work shows how these techniques may be applied in HMI evaluation, design and research by emphasizing first the task analysis and second the execution time of the task. In order to validate and support our results, an experimental study of user performance was conducted at the DOMUS laboratory during interaction with the contextual assistant's interface. The results of our models show that the GOMS and ACT-R models give good and excellent predictions, respectively, of user performance at the task level as well as at the object level. Therefore, the simulated results are very close to the results obtained in the experimental study.
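
To illustrate the kind of execution-time prediction involved, the sketch below applies a keystroke-level (KLM-style) GOMS estimate using the textbook operator times of Card, Moran and Newell; the operator sequence is a hypothetical task, not one of the DOMUS tasks or measurements.

```python
# Illustrative KLM-style GOMS estimate using textbook operator times
# (Card, Moran & Newell); generic values, not the DOMUS study's measurements.
KLM_TIMES = {          # seconds per operator
    "K": 0.20,         # keystroke
    "P": 1.10,         # point with mouse
    "B": 0.10,         # mouse button press/release
    "H": 0.40,         # home hands between keyboard and mouse
    "M": 1.35,         # mental preparation
}

def klm_estimate(operator_sequence):
    """Sum operator times for a task, e.g. 'M P B M K K K'."""
    return sum(KLM_TIMES[op] for op in operator_sequence.split())

# Hypothetical task on the assistant's interface: think, point to a field,
# click, think, then type a 3-character entry.
print(f"predicted execution time: {klm_estimate('M P B M K K K'):.2f} s")   # 4.50 s
```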

Multi Switched Split Vector Quantization of Narrowband Speech Signals

Vector quantization is a powerful tool for speech coding applications. This paper deals with LPC coding of speech signals using a new technique called Multi Switched Split Vector Quantization (MSSVQ), which is a hybrid of multi-stage, switched, and split vector quantization techniques. The spectral distortion performance, computational complexity, and memory requirements of MSSVQ are compared with those of split vector quantization (SVQ), multi-stage vector quantization (MSVQ) and switched split vector quantization (SSVQ). The results show that MSSVQ has better spectral distortion performance, lower computational complexity and lower memory requirements than all of the above-mentioned product-code vector quantization techniques. Computational complexity is measured in floating point operations (flops), and memory requirements are measured in floats.
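
For readers unfamiliar with the building blocks, the sketch below shows plain split vector quantization, one of the components combined in MSSVQ: the vector is split into sub-vectors, each quantized by nearest-neighbor search in its own codebook. The codebooks and the 3+3+4 split are illustrative placeholders; a real coder would train the codebooks (e.g. with LBG) on LSF vectors.

```python
# Minimal sketch of split vector quantization (one building block of MSSVQ),
# with random placeholder codebooks; a real coder would train them on LSF data.
import numpy as np

rng = np.random.default_rng(0)
DIM, SPLITS, BITS_PER_PART = 10, (3, 3, 4), 6                       # 10-dim vector split 3+3+4

codebooks = [rng.random((2 ** BITS_PER_PART, d)) for d in SPLITS]   # placeholder codebooks

def split_vq(vec):
    out, start = [], 0
    for cb, d in zip(codebooks, SPLITS):
        part = vec[start:start + d]
        idx = np.argmin(np.sum((cb - part) ** 2, axis=1))           # nearest codevector
        out.append(cb[idx])
        start += d
    return np.concatenate(out)

lsf = np.sort(rng.random(DIM))            # toy LSF-like vector (monotone, in (0, 1))
print("quantization error:", np.linalg.norm(lsf - split_vq(lsf)))
```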

Coping with the Rapidity of Information Technology Changes – A Comparison Review on Current Practices

Information technology managers nowadays face tremendous pressure to plan, implement, and adopt new technology solutions owing to the rapidity of technological change. Given the lack of studies on this topic, the aim of this paper is to provide a comparative review of the tools currently used to respond to technological change. The study is based on an extensive review of published works, the majority of them ranging from 2000 to the first part of 2011. The works were gathered from journals, books, and other information sources available on the Web. The findings show that each tool has a different focus and that none of them provides a holistic framework covering the technical, people, process, and business-environment aspects. Hence, this result provides useful information about the tools currently available to IT managers for managing changes in technology. Further, the result reveals a research gap in the area: industry is short of such a framework.

Development of NOx Emission Model for a Tangentially Fired Acid Incinerator

This paper aims to develop a NOx emission model for an acid gas incinerator using Nelder-Mead least squares support vector regression (LS-SVR). The Malaysian DOE is actively enforcing the Clean Air Regulation, which mandates the installation of analytical instrumentation known as a Continuous Emission Monitoring System (CEMS) to report emission levels online to the DOE. As a hardware-based analyzer, a CEMS is expensive, maintenance intensive and often unreliable. Therefore, a software-based predictive technique is often preferred and considered a feasible alternative to the CEMS for regulatory compliance. The LS-SVR model is built from the emissions of an acid gas incinerator operating in an LNG complex. Simulated annealing (SA) is first used to determine the initial hyperparameters, which are then further optimized, based on the performance of the model, using the Nelder-Mead simplex algorithm. The LS-SVR model is shown to outperform a benchmark model based on backpropagation neural networks (BPNN) on both training and testing data.
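
A hedged sketch of the hyperparameter-tuning step is given below. It uses an RBF kernel-ridge model as a stand-in for LS-SVR (to which it is closely related), synthetic data in place of the incinerator measurements, and a fixed starting point in place of the SA result; it only illustrates Nelder-Mead tuning of two hyperparameters via cross-validation.

```python
# Sketch: Nelder-Mead tuning of an RBF kernel-ridge model (stand-in for LS-SVR)
# on synthetic data; not the paper's plant data, SA stage, or exact model.
import numpy as np
from scipy.optimize import minimize
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((300, 5))                                               # placeholder process inputs
y = np.sin(6 * X[:, 0]) + 0.5 * X[:, 1] + 0.05 * rng.standard_normal(300)  # placeholder NOx target

def neg_cv_r2(log_params):
    alpha, gamma = np.exp(log_params)                 # optimize in log-space to keep params > 0
    model = KernelRidge(kernel="rbf", alpha=alpha, gamma=gamma)
    return -cross_val_score(model, X, y, cv=5, scoring="r2").mean()

x0 = np.log([1.0, 1.0])                               # e.g. a point delivered by simulated annealing
result = minimize(neg_cv_r2, x0, method="Nelder-Mead")
alpha_opt, gamma_opt = np.exp(result.x)
print(f"alpha = {alpha_opt:.4g}, gamma = {gamma_opt:.4g}, CV R^2 = {-result.fun:.3f}")
```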

A New Algorithm for Cluster Initialization

Clustering is a very well known technique in data mining. One of the most widely used clustering techniques is the k-means algorithm. Solutions obtained from this technique depend on the initialization of the cluster centers. In this article we propose a new algorithm to initialize the clusters. The proposed algorithm is based on finding a set of medians extracted from the dimension with maximum variance. The algorithm has been applied to different data sets and good results were obtained.
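
One way to read the described initialization is sketched below, with synthetic data: take the dimension of maximum variance, split the data sorted along it into k equal-frequency groups, and use each group's median point as an initial center for k-means. This is my reading of the abstract for illustration, not the authors' exact algorithm.

```python
# Sketch of a median-based k-means initialization along the maximum-variance
# dimension (an illustrative reading of the abstract, not the exact algorithm).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs

def max_variance_median_init(X, k):
    dim = np.argmax(X.var(axis=0))                 # dimension with maximum variance
    order = np.argsort(X[:, dim])
    groups = np.array_split(order, k)              # k equal-frequency groups along that dimension
    centers = []
    for g in groups:
        median_value = np.median(X[g, dim])
        idx = g[np.argmin(np.abs(X[g, dim] - median_value))]   # data point nearest the median
        centers.append(X[idx])
    return np.vstack(centers)

X, _ = make_blobs(n_samples=500, centers=4, random_state=0)
init_centers = max_variance_median_init(X, k=4)
km = KMeans(n_clusters=4, init=init_centers, n_init=1).fit(X)
print("inertia with median-based initialization:", round(km.inertia_, 2))
```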