Prototype of an Interactive Toy from Lego Robotics Kits for Children with Autism

This paper develops a concept of human/robot interaction, more precisely for the development of autistic children, who have greater difficulty with interaction. It offers an efficient, if simple, solution that has been little studied for this audience. The concept is based on code applied through the Lego NXT kit and interpreted by the robot, thereby creating this interaction in a constructive way for children with autism.

Different Views and Evaluations of IT Artifacts

The introduction of a multitude of new and interactive e-commerce information technology (IT) artifacts has impacted adoption research. Rather than solely functioning as productivity tools, new IT artifacts assume the roles of interaction mediators and social actors. This paper describes the varying roles assumed by IT artifacts and proposes and distinguishes four distinct foci for evaluating them. It further proposes a theoretical model that maps the different views of IT artifacts to four distinct types of evaluations.

Construction of Space-Filling Designs for Computer Experiments with Three Input Variables

Among the space-filling designs found in the literature, Latin hypercube designs (LHDs) have been applied in many computer experiments. An LHD can be generated randomly, but a randomly chosen LHD may have bad properties and thus perform poorly in estimation and prediction. There is a connection between Latin squares and orthogonal arrays (OAs). A Latin square of order s is an arrangement of s symbols in s rows and s columns such that every symbol occurs once in each row and once in each column; such a square exists for every positive integer s. In this paper, a computer program was written to construct orthogonal array-based Latin hypercube designs (OA-LHDs). OAs were constructed from Latin squares of order s, and the constructed OAs were then used to build the desired Latin hypercube designs for three input variables for use in computer experiments. The LHDs constructed have better space-filling properties than randomly generated ones and can be used in computer experiments that involve only three input factors. The MATLAB 2012a package (www.mathworks.com/) was used to develop the program that constructs the designs.
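
As a rough Python sketch of this pipeline (the paper's MATLAB program is not reproduced here; the cyclic square and Tang-style level replacement are standard choices assumed for illustration):

```python
import numpy as np

def latin_square(s: int) -> np.ndarray:
    """Cyclic Latin square of order s: L[i, j] = (i + j) mod s."""
    i, j = np.indices((s, s))
    return (i + j) % s

def oa_from_latin_square(s: int) -> np.ndarray:
    """OA(s^2, 3, s, 2): one row (i, j, L[i, j]) per cell of the square."""
    i, j = np.indices((s, s))
    return np.column_stack([i.ravel(), j.ravel(), latin_square(s).ravel()])

def oa_based_lhd(s: int, seed=None) -> np.ndarray:
    """Tang-style OA-based LHD(s^2, 3): within each column, the s positions
    holding symbol k receive a random permutation of {k*s, ..., k*s + s - 1},
    so every column contains each of the s^2 levels exactly once."""
    rng = np.random.default_rng(seed)
    oa = oa_from_latin_square(s)
    lhd = np.empty_like(oa)
    for col in range(3):
        for k in range(s):
            idx = np.where(oa[:, col] == k)[0]
            lhd[idx, col] = k * s + rng.permutation(s)
    return lhd

print(oa_based_lhd(3, seed=0))  # a 9-run, 3-factor LHD with levels 0..8
```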

Analysis of Combined Use of NN and MFCC for Speech Recognition

The performance of a speech recognition system is illustrated and analyzed in this paper. The approach recognizes the English words for the digits 0-9, spoken by 2 different speakers and captured in a noise-free environment. For feature extraction, Mel-frequency cepstral coefficients (MFCCs) are used, which give a set of feature vectors from the recorded speech samples. A neural network model is used to enhance recognition performance: a feed-forward neural network trained with the back-propagation algorithm. Other speech recognition techniques, such as HMM and DTW, also exist. All experiments are carried out in MATLAB.
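
A minimal sketch of this pipeline in Python, assuming hypothetical recording files named digit_<label>_<speaker>_<take>.wav and using librosa's MFCCs with scikit-learn's feed-forward MLP in place of the paper's MATLAB implementation:

```python
import librosa
import numpy as np
from sklearn.neural_network import MLPClassifier

def mfcc_features(path: str, n_mfcc=13) -> np.ndarray:
    """Load a recording and average its MFCC frames into one feature vector."""
    y, sr = librosa.load(path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

# Placeholder file list; real data would cover digits 0-9 for both speakers.
paths, labels = ["digit_0_s1_1.wav", "digit_1_s1_1.wav"], [0, 1]
X = np.array([mfcc_features(p) for p in paths])

# Feed-forward network; sklearn's MLP trains with back-propagation-style
# gradient descent, standing in for the paper's MATLAB network.
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000)
clf.fit(X, labels)
print(clf.predict(X))
```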

Simulation Based VLSI Implementation of Fast Efficient Lossless Image Compression System Using Adjusted Binary Code & Golomb-Rice Code

A simulation-based VLSI implementation of the FELICS (Fast Efficient Lossless Image Compression System) algorithm is proposed to provide lossless image compression, implemented in simulation-oriented VLSI (Very Large Scale Integration). The goals are to analyze lossless compression performance, reduce image size without losing quality, and implement the result as a VLSI-based FELICS design. The FELICS algorithm uses a simplified adjusted binary code for image compression; the compressed image is converted to pixels and then implemented in the VLSI domain. These choices achieve high processing speed while minimizing area and power. The simplified adjusted binary code reduces the number of arithmetic operations and achieves high processing speed. Color-difference preprocessing is also proposed to improve coding efficiency with simple arithmetic operations. The VLSI-based FELICS algorithm provides an effective hardware architecture with a regular, pipelined data flow and four-stage parallelism. With two-level parallelism, consecutive pixels can be classified into even and odd samples, with a dedicated hardware engine for each. This method can be further enhanced by multilevel parallelism.
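
As background for the coding stage (a sketch of the standard Golomb-Rice code named in the title, not the paper's hardware design):

```python
def golomb_rice_encode(n: int, k: int) -> str:
    """Golomb-Rice code with parameter k >= 1: the quotient n >> k is sent
    in unary (q ones then a terminating zero), the remainder in k bits."""
    q, r = n >> k, n & ((1 << k) - 1)
    return "1" * q + "0" + format(r, "b").zfill(k)

# Small residuals get short codewords; large ones degrade gracefully.
for n in (0, 1, 5, 20):
    print(n, golomb_rice_encode(n, k=2))
```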

The Load Balancing Algorithm for the Star Interconnection Network

The star network is one of the most promising interconnection networks for future high-speed parallel computers and is expected to be among the next-generation networks. The star network is both edge- and vertex-symmetric, has been shown to have many attractive topological properties, and possesses a hierarchical structure. Although much research has been done on this promising network in the literature, it still lacks algorithms for the load balancing problem. In this paper we address this issue by investigating and proposing an efficient load balancing algorithm for the star network, called the Star Clustered Dimension Exchange Method (SCDEM). The proposed algorithm is based on the Clustered Dimension Exchange Method (CDEM). SCDEM is shown to be efficient in redistributing the load as evenly as possible among all nodes of the different factor networks.
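
SCDEM's details are not given in the abstract, but as background, the dimension exchange idea underlying CDEM pairs nodes along one dimension at a time and splits their combined load evenly. A toy sketch with made-up loads:

```python
def dimension_exchange_round(pairs, load):
    """One dimension-exchange round: each (u, v) pair matched along the
    current dimension splits its combined load as evenly as integers allow."""
    for u, v in pairs:
        total = load[u] + load[v]
        load[u], load[v] = total // 2, total - total // 2

# Four nodes with made-up loads, paired along a single dimension.
load = {0: 10, 1: 2, 2: 7, 3: 5}
dimension_exchange_round([(0, 1), (2, 3)], load)
print(load)  # {0: 6, 1: 6, 2: 6, 3: 6}
```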

A Robust Image Steganography Method Using PMM in Bit Plane Domain

Steganography is the art and science of hiding information in an appropriate cover carrier such as image, text, audio, or video media. In this work the authors propose a new image-based steganographic method for hiding information within the complex bit planes of the image. After the cover image is sliced into bit planes, it is analyzed to extract the most complex planes in decreasing order of bit-plane complexity. A complexity function then determines the complex, noisy blocks of the chosen bit plane, and finally the pixel mapping method (PMM) is used to embed secret bits into those regions of the bit plane. This novel approach of using PMM in the bit-plane domain adaptively embeds data in the most complex regions of the image, providing high embedding capacity, better imperceptibility, and resistance to steganalysis attacks.
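
The abstract does not define its complexity function; a common choice in bit-plane steganography (e.g., BPCS) counts black/white transitions between adjacent bits, normalized by the maximum possible. A small sketch of bit-plane slicing and that assumed measure:

```python
import numpy as np

def bit_planes(img: np.ndarray):
    """Slice an 8-bit grayscale image into 8 binary bit planes (LSB first)."""
    return [(img >> k) & 1 for k in range(8)]

def block_complexity(block: np.ndarray) -> float:
    """BPCS-style complexity: fraction of horizontally and vertically
    adjacent bit pairs that differ, out of the maximum possible."""
    h = np.count_nonzero(block[:, 1:] != block[:, :-1])
    v = np.count_nonzero(block[1:, :] != block[:-1, :])
    rows, cols = block.shape
    return (h + v) / (rows * (cols - 1) + cols * (rows - 1))

# Toy usage on one 8x8 block of a random stand-in cover image.
rng = np.random.default_rng(0)
cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
planes = bit_planes(cover)
print(block_complexity(planes[0][:8, :8]))
```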

An Approach for the Integration of Existing Wireless Networks

The demand for high-quality services has fueled extensive research and development in wireless communications and networking. As a result, different wireless technologies such as wireless LAN, CDMA, GSM, UMTS, MANET, Bluetooth, and satellite networks have emerged in the last two decades. Future networks capable of carrying multimedia traffic need IP convergence, portability, seamless roaming, and scalability among the existing networking technologies, without changing the core of the existing communications networks. To fulfill these goals, present networking systems are required to work in cooperation to ensure technological independence, seamless roaming, high security and authentication, and guaranteed Quality of Service (QoS). In this paper, a conceptual framework for a cooperative network (CN) is proposed for integrating heterogeneous existing networks to meet the requirements of next-generation wireless networks.

An Overview of Energy Efficient Routing Protocols for Acoustic Sensor Networks

Underwater acoustic networking is one of the most rapidly growing areas of research, with various applications in monitoring and collecting data for environmental studies. Communication among dynamic nodes and the high error probability of the acoustic medium force higher energy consumption in Underwater Sensor Networks (UWSNs) than in traditional sensor networks. Developing energy-efficient routing protocols is a fundamental and pressing challenge, because all the sensor nodes are battery-powered and the batteries cannot be easily replaced in UWSNs. This paper surveys recent routing techniques that mainly focus on energy efficiency.

Quad Tree Decomposition Based Analysis of Compressed Image Data Communication for Lossy and Lossless Compression Using WSN

A Quad Tree Decomposition (QTD) based performance analysis of lossy and lossless compressed image data communication through a wireless sensor network is presented. Images have considerably higher storage requirements than text. While multimedia content is being transmitted, packets may be dropped due to noise and interference; at the receiver end, packets carrying valuable information may be damaged or lost because of noise, interference, and congestion. Various retransmission schemes have been proposed to prevent valuable information from being dropped. The proposed scheme uses QTD, an image segmentation method that divides the image into homogeneous areas. The scheme involves analysis of parameters such as compression ratio, peak signal-to-noise ratio, mean square error, and bits per pixel of the compressed image, along with analysis of the difficulties encountered during data packet communication in wireless sensor networks. Accordingly, this paper uses QTD to improve the compression ratio as well as visual quality; the algorithm is implemented with the MATLAB 7.1 and NS2 simulator software tools.
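
As an illustration of the segmentation step (a sketch; the paper's homogeneity criterion is not stated, so an intensity-range threshold is assumed):

```python
import numpy as np

def quadtree(img: np.ndarray, x=0, y=0, size=None, thresh=10, min_size=2):
    """Recursively split img into homogeneous square blocks.

    A block is kept whole when its intensity range (max - min) is at most
    `thresh` or it has reached `min_size`; otherwise it is split into four
    quadrants. Returns a list of (x, y, size) leaf blocks.
    """
    if size is None:
        size = img.shape[0]  # assumes a square, power-of-two-sized image
    block = img[y:y + size, x:x + size]
    if size <= min_size or int(block.max()) - int(block.min()) <= thresh:
        return [(x, y, size)]
    half = size // 2
    leaves = []
    for dy in (0, half):
        for dx in (0, half):
            leaves += quadtree(img, x + dx, y + dy, half, thresh, min_size)
    return leaves

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(16, 16), dtype=np.uint8)
print(len(quadtree(img)))  # number of homogeneous leaf blocks
```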

Categories of Botnet: A Survey

Botnets are one of the most serious and widespread cyber threats. Today botnets facilitate many cybercrimes, especially financial fraud and the theft of sensitive data. Botnets are available for lease on the market and are used by cybercriminals to launch massive attacks such as DDoS, click fraud, and phishing. Several large institutions, hospitals, banks, government organizations, and social networks such as Twitter and Facebook have become targets of botmasters. Recently, noteworthy research has been carried out to detect bots, C&C channels, botnets, and botmasters. Using many sophisticated technologies, botmasters have made the botnet a titan of the cyber world, and they have posed innumerable challenges to researchers working on botnet detection. In this paper we present a survey of different types of botnet C&C channels and provide a comparison of various botnet categories. Finally, we hope that our survey will create awareness for forthcoming botnet research endeavors.

A Comparative Analysis of Different Web Content Mining Tools

Nowadays, the Web has become one of the most pervasive platforms for information exchange and retrieval, gathering from websites the suitable, well-fitting information one requires. Data mining is the process of extracting knowledge from the data available on the Internet. Web mining is one branch of data mining techniques and relates to various research communities such as information retrieval, database management systems, and artificial intelligence. In this paper we discuss the concepts of Web mining, focusing mainly on one of its categories, namely Web content mining, and its various tasks. Mining tools are essential for scanning the many images, text, and HTML documents, and their results are then used by various search engines. We conclude by presenting a comparative table of these tools based on some pertinent criteria.

Real Time Remote Monitoring and Fault Detection in Wind Turbine

Wind power has boomed in new-energy development, owing to the proliferation of wind parks and their role in supplying the national electric grid with low-cost, clean resources. Hence, there is an increased need to establish proactive maintenance for wind turbine machines based on remote control and monitoring. A real-time wireless connection is necessary in offshore or otherwise inaccessible locations, where the wired approach has many flaws. The objective of this strategy is to prolong wind turbine lifetime and increase productivity. The hardware of a remote control and monitoring system for wind turbine parks is designed. It takes advantage of a GPRS or WiMAX wireless module to collect data measurements from different wind machine sensors through IP-based multi-hop communication. Computer simulations with the Proteus ISIS and OPNET software tools have been conducted to evaluate the performance of the studied system. The findings show that the designed device is suitable for application in a wind park.
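
As a purely illustrative sketch of the data-collection idea (hypothetical field names and server address; the paper's actual firmware and protocol stack are not described in the abstract):

```python
import json
import socket
import time

SERVER = ("192.0.2.10", 5000)  # placeholder monitoring-center address

def send_reading(turbine_id: str, wind_speed: float, power_kw: float) -> None:
    """Package one set of sensor measurements as JSON and push it to the
    monitoring server over TCP/IP (the physical link being GPRS or WiMAX)."""
    reading = {"id": turbine_id, "wind_speed": wind_speed,
               "power_kw": power_kw, "ts": time.time()}
    with socket.create_connection(SERVER, timeout=5) as sock:
        sock.sendall((json.dumps(reading) + "\n").encode())

# send_reading("WT-01", 11.4, 850.0)  # would contact a real server
```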

Probabilistic Graphical Model for the Web

The World Wide Web is a network with a complex topology, whose main properties are a power-law degree distribution, a low clustering coefficient, and a small average distance. Modeling the web as a graph allows information to be located quickly and consequently aids the construction of search engines. Here, we present a model based on existing probabilistic graphs with all the aforesaid characteristics. This work studies the web in order to understand its structure, which enables us to model it more easily and to propose a possible algorithm for its exploration.
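
As one concrete illustration (not the paper's own model, which is not detailed in the abstract), the Barabasi-Albert preferential-attachment graph is a well-known probabilistic model with a power-law degree distribution, and its clustering and average distance can be checked directly:

```python
import networkx as nx

# Barabasi-Albert preferential attachment: each new node links to m
# existing nodes with probability proportional to their degree.
g = nx.barabasi_albert_graph(n=1000, m=3, seed=42)

print("average clustering:", nx.average_clustering(g))
print("average distance:  ", nx.average_shortest_path_length(g))
```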

A Design of the Infrastructure and Computer Network for Distance Education, Online Learning via New Media, E-Learning and Blended Learning

This research focuses on studying, analyzing, and designing a model of the infrastructure and computer networks for distance education, online learning via new media, e-learning, and blended learning. The information collected from the study and analysis process was evaluated with the index of item-objective congruence (IOC) by 9 specialists in order to design the model. The model was then evaluated, using the mean and standard deviation, by the same sample of 9 specialists, with a mean value of 3.85. The results showed that the infrastructure and computer networks are designed to be appropriate to a great extent.
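
For reference, the IOC for an item is conventionally computed as the average of expert ratings, where each rating is -1, 0, or +1 (this standard formula is assumed rather than stated in the abstract); items scoring at or above 0.5 are conventionally retained:

```latex
\mathrm{IOC} = \frac{1}{N}\sum_{i=1}^{N} R_i, \qquad R_i \in \{-1,\, 0,\, +1\}
```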

Comparative Analysis of Diverse Collection of Big Data Analytics Tools

In recent years, many efforts and studies have been carried out to develop proficient tools for performing various tasks on big data. Big data has received a lot of publicity lately, and for good reason. Because these collections of datasets are large and complex, they are difficult to process with traditional data processing applications, which makes producing dedicated big data tools all the more necessary. The main aim of big data analytics is to apply advanced analytic techniques to very large, diverse datasets ranging in size from terabytes to zettabytes, and in type from structured to unstructured and from batch to streaming. Big data approaches are useful for datasets whose size or type is beyond the capability of traditional relational databases to capture, manage, and process with low latency. The resulting challenges have led to the emergence of powerful big data tools. In this survey, a varied collection of big data tools is presented and compared with respect to their salient features.

Image Retrieval Using Fused Features

The system is designed to retrieve images related to a query image. Extracting color, texture, and shape features from an image plays a vital role in content-based image retrieval (CBIR). Initially, the RGB image is converted into the HSV color space due to its perceptual uniformity. From the HSV image, color features are extracted using a block color histogram, texture features using the Haar transform, and shape features using the fuzzy c-means algorithm. Then, the characteristics of global and local color histograms, of texture features obtained through the co-occurrence matrix and the Haar wavelet transform, and of shape features are compared and analyzed for CBIR. Finally, the best method for each feature is fused during similarity measurement to improve image retrieval effectiveness and accuracy.
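
As a sketch of the color pathway only (assumed grid size and bin counts; not the paper's exact pipeline), the HSV block histogram could be computed with OpenCV as follows:

```python
import cv2
import numpy as np

def block_color_histograms(bgr: np.ndarray, blocks=4, bins=(8, 4, 4)):
    """Convert a BGR image to HSV and compute a normalized HSV histogram
    for each cell of a blocks x blocks grid; concatenate as the feature."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    h, w = hsv.shape[:2]
    feats = []
    for by in range(blocks):
        for bx in range(blocks):
            cell = hsv[by * h // blocks:(by + 1) * h // blocks,
                       bx * w // blocks:(bx + 1) * w // blocks]
            hist = cv2.calcHist([cell], [0, 1, 2], None, list(bins),
                                [0, 180, 0, 256, 0, 256])
            feats.append(cv2.normalize(hist, hist).flatten())
    return np.concatenate(feats)

# feats = block_color_histograms(cv2.imread("query.jpg"))  # hypothetical file
```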

Optimal Feature Extraction Dimension in Finger Vein Recognition Using Kernel Principal Component Analysis

In this paper the issue of dimensionality reduction in finger vein recognition systems is investigated using kernel Principal Component Analysis (KPCA). One aspect of KPCA is finding the most appropriate kernel function for finger vein recognition, as there are several kernel functions that can be used within PCA-based algorithms. In this paper, however, another side of PCA-based algorithms, particularly KPCA, is investigated: the dimension of the feature vector, which is important especially in real-world applications of such algorithms. A fixed feature vector dimension has to be set to reduce the dimension of the input and output data and extract features from them; a classifier is then applied to classify the data and make the final decision. We analyze KPCA with polynomial, Gaussian, and Laplacian kernels in detail and investigate the optimal feature extraction dimension in finger vein recognition using KPCA.
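
A minimal sketch of such a dimension sweep with scikit-learn's KernelPCA, using the bundled digits dataset as a stand-in for finger vein features and a k-NN classifier as the assumed final decision stage (the paper also considers a Laplacian kernel):

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import KernelPCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline

X, y = load_digits(return_X_y=True)  # placeholder for finger vein features

# Sweep the feature dimension for two of the kernels considered and keep
# the dimension with the best cross-validated accuracy.
for kernel in ("rbf", "poly"):
    for n in (10, 20, 50, 100):
        model = make_pipeline(KernelPCA(n_components=n, kernel=kernel),
                              KNeighborsClassifier(n_neighbors=3))
        acc = cross_val_score(model, X, y, cv=3).mean()
        print(f"{kernel:5s} dim={n:3d} accuracy={acc:.3f}")
```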

Wavelet Based Residual Method of Detecting GSM Signal Strength Fading

In this paper, GSM signal strength was measured in order to detect the type of signal fading using a one-dimensional multilevel wavelet residual method, and neural network clustering was used to determine the average GSM signal strength received in the study area. The wavelet residual method indicated that the GSM signal experienced slow fading and attenuation, with an MSE of 3.875 dB. The neural network clustering revealed that signal strengths of mostly -75 dB, -85 dB, and -95 dB were received, meaning that the signal received in the study area is weak.
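
One plausible reading of the residual method (an assumption; the abstract does not give the algorithm) is to keep the multilevel wavelet approximation as the slow-fading trend and treat the remainder as the residual. A sketch with PyWavelets on a synthetic RSSI trace:

```python
import numpy as np
import pywt

def wavelet_residual(signal: np.ndarray, wavelet="db4", level=3):
    """Separate a slow-fading trend from fast fluctuations: reconstruct the
    level-`level` approximation and treat the remainder as the residual."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Zero the detail coefficients so waverec returns only the smooth trend.
    smooth = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
    trend = pywt.waverec(smooth, wavelet)[: len(signal)]
    return trend, signal - trend

# Synthetic RSSI trace in dB: slow drift plus noise, a stand-in for field data.
t = np.linspace(0, 1, 512)
rssi = -85 + 5 * np.sin(2 * np.pi * t) + np.random.normal(0, 2, t.size)
trend, residual = wavelet_residual(rssi)
print("residual MSE:", np.mean(residual ** 2))
```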

Inadequate Requirements Engineering Process as a Key Factor in Poor Software Development in Developing Nations: A Case Study

Developing reliable and sustainable software products is today a big challenge for up-and-coming software developers in Nigeria. The ability to develop the comprehensive problem statement needed to execute a proper requirements engineering process is missing. Describing the ‘what’ of a system in one document, written in a natural language, is a major step in the overall process of software engineering. Requirements engineering is a process used to discover, analyze, and validate system requirements, and it is needed to reduce software errors at the early stages of development. The importance of each step in requirements engineering is explained in the context of using a detailed problem statement from the client/customer to get an overview of an existing system along with expectations of the new system. This paper identifies inadequate requirements engineering practice as the major cause of poor software development in developing nations, using a case study of final-year computer science students of a tertiary institution in Nigeria.