Intention Recognition using a Graph Representation

Human-friendly interaction is a key function of a human-centered system. Over the years, developing convenient interaction through intention recognition has received much attention. Intention recognition processes multimodal inputs including speech, face images, and body gestures. In this paper, we propose a novel approach to intention recognition using a graph representation called the Intention Graph. The concept of a valid intention is proposed as the target of intention recognition. Our approach has two phases: a goal recognition phase and an intention recognition phase. In the goal recognition phase, we generate an action graph based on the observed actions, and then the candidate goals and their plans are recognized. In the intention recognition phase, the intention is recognized using the relevant goals and the user profile. We show that the algorithm has polynomial time complexity. The Intention Graph is applied to a simple briefcase domain to test our model.
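
As a minimal illustration of the goal recognition phase, the sketch below ranks candidate goals by how well the observed actions match plans in a small plan library; the plan library, the scoring rule, and all names are illustrative assumptions for a briefcase-style domain, not the paper's actual algorithm or Intention Graph construction.

```python
# Illustrative sketch only: rank candidate goals against observed actions.
PLAN_LIBRARY = {
    "pack_briefcase": ["open_briefcase", "put_documents", "close_briefcase"],
    "empty_briefcase": ["open_briefcase", "remove_documents", "close_briefcase"],
}

def recognize_goals(observed_actions):
    """Score each goal by the fraction of its plan matched, in order, by the observations."""
    scores = {}
    for goal, plan in PLAN_LIBRARY.items():
        matched = 0
        for action in observed_actions:
            if matched < len(plan) and action == plan[matched]:
                matched += 1
        scores[goal] = matched / len(plan)
    # Candidate goals ranked by how much of their plan the observations explain.
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

print(recognize_goals(["open_briefcase", "put_documents"]))
```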

Bullies and Their Mothers: Who Influences Whom?

Even though most researchers would agree that in symbiotic relationships, like the one between parent and child, influences become reciprocal over time, empirical evidence supporting this claim is limited. The aim of the current study was to develop and test a model describing the reciprocal influence between characteristics of the parent-child relationship, such as closeness and conflict, and the child's bullying and victimization experiences at school. The study used data from the longitudinal Study of Early Child Care, conducted by the National Institute of Child Health and Human Development. The participants were dyads of early adolescents (5th and 6th graders during the two data collection waves) and their mothers (N=1364). Supporting our hypothesis, the findings suggested a reciprocal association between bullying and positive parenting, although this association was only significant for boys. Victimization and positive parenting were not significantly interrelated.

Theoretical Analysis of a Crossed-Electrode 2D Array for 3D Imaging

Planar systems of electrodes arranged on both sides of a dielectric piezoelectric layer are applied in numerous transducers. They are capable of electronic beam-steering of the generated wave in both azimuth and elevation. The wave-beam control is achieved by addressable driving of the two-dimensional transducer through proper voltage supply of the electrodes on the opposite surfaces of the layer. In this paper, a semi-analytical method for the analysis of the considered transducer is proposed, which is a generalization of the well-known BIS-expansion method. That method was earlier exploited with great success in the theory of interdigital transducers of surface acoustic waves, the theory of elastic wave scattering by cracks, and certain advanced electrostatic problems. The corresponding nontrivial electrostatic problem is formulated and solved numerically.

An Optical Flow Based Segmentation Method for Object Extraction

This paper describes a segmentation algorithm based on the cooperation of an optical flow estimation method with edge detection and region-growing procedures. The proposed method has been developed as a pre-processing stage for methodologies and tools for video/image indexing and retrieval by content. The addressed problem consists of extracting whole objects from the background in order to produce images of single, complete objects from videos or photos. The extracted images are used to compute the object visual features necessary for both the indexing and retrieval processes. The first task of the algorithm exploits cues from motion analysis to detect moving areas. Objects and background are then refined using edge detection and region-growing procedures, respectively. These tasks are performed iteratively until objects and background are completely resolved. The developed method has been applied to a variety of indoor and outdoor scenes in which objects of different types and shapes appear against variously textured backgrounds.
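
The sketch below illustrates only the motion-analysis stage, using OpenCV's dense optical flow to mark moving areas; the threshold, parameters, and function names are assumptions for demonstration and do not reproduce the authors' edge detection and region-growing refinement loop.

```python
import cv2
import numpy as np

def moving_area_mask(prev_gray, curr_gray, mag_thresh=1.0):
    """Binary mask of pixels whose dense optical-flow magnitude exceeds mag_thresh."""
    flow = cv2.calcOpticalFlowFarneback(prev_gray, curr_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    magnitude = np.linalg.norm(flow, axis=2)
    mask = (magnitude > mag_thresh).astype(np.uint8) * 255
    # Morphological closing gives a compact moving-area estimate; edge detection
    # and region growing would then refine objects and background iteratively.
    return cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
```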

A Strategy Based View of Supply Chain Competitiveness

In this era of competitiveness, there is a growing need for supply chains to become competitive enough to handle pressures such as varying customer expectations, the demand for low-cost, high-quality products delivered in minimum time and, most importantly, cut-throat competition on a worldwide scale. In recent years, supply chain competitiveness has therefore been accepted as one of the most important philosophies in the supply chain literature. Various researchers and practitioners have tried to identify and implement strategies that can bring competitiveness to supply chains, i.e., supply chain competitiveness. The purpose of this paper is to suggest selected strategies for supply chain competitiveness in the Indian manufacturing sector using an integrated approach of literature review and exploratory interviews with eminent professionals from the supply chain area in various industries, academia, and research. The aim of the paper is to highlight the important area of competitiveness in the supply chain and to offer recommendations to the industry and to managers in the manufacturing sector.

The Optimal Placement of Capacitors in Order to Reduce Losses and Improve the Voltage Profile of the Distribution Network with GA, SA

Most of the losses in a power system occur in the distribution sector, which has therefore always been a focus of attention. One of the important factors that increases losses in the distribution system is the presence of reactive power flows. The most common way to compensate reactive power in the system is to use shunt (parallel) capacitors. In addition to reducing losses, the advantages of capacitor placement include the release of network capacity at peak load and the improvement of the voltage profile. The point to be considered in capacitor placement is the optimal location and sizing of the capacitors so as to maximize these advantages. In this paper, a new technique is offered for the placement and sizing of fixed capacitors in the radial distribution network on the basis of a Genetic Algorithm (GA). The existing optimal methods for capacitor placement mostly reduce the losses and improve the voltage profile simultaneously, but the capacitor cost and load changes have not been considered as influential on the objective function. In this article, a holistic approach is taken to this problem, which includes the relevant parameters of the distribution network: the cost, the phase voltage, and load changes. A vast search over all possible solutions is therefore required, and we use the Genetic Algorithm (GA) as a powerful method for this optimal search.
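
The following is a minimal, hedged sketch of a GA for capacitor placement and sizing on a radial feeder: the chromosome encodes a capacitor size per bus, while the fitness function is only a placeholder trade-off between an assumed loss saving and capacitor cost, not the paper's load-flow-based objective.

```python
import random

N_BUSES = 10
KVAR_STEPS = [0, 150, 300, 450, 600]      # candidate capacitor sizes per bus (assumed)

def fitness(chromosome):
    # Placeholder objective: assumed, saturating loss saving minus capacitor cost.
    installed = sum(chromosome)
    assumed_loss_saving = 0.8 * min(installed, 1200)
    capacitor_cost = 0.3 * installed
    return assumed_loss_saving - capacitor_cost

def evolve(pop_size=30, generations=100, p_mut=0.1):
    pop = [[random.choice(KVAR_STEPS) for _ in range(N_BUSES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]                 # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_BUSES)
            child = a[:cut] + b[cut:]                  # one-point crossover
            if random.random() < p_mut:                # mutation: resize one bus
                child[random.randrange(N_BUSES)] = random.choice(KVAR_STEPS)
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())
```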

Cooperative Data Caching in WSN

Wireless sensor networks (WSNs) have gained tremendous attention in recent years due to their numerous applications. Because of the limited energy resources, energy-efficient operation of sensor nodes is a key issue in wireless sensor networks. Cooperative caching, which enables sharing of data among various nodes, reduces the number of communications over the wireless channels and thus extends the overall lifetime of a wireless sensor network. In this paper, we propose a cooperative caching scheme called ZCS (Zone Cooperation at Sensors) for wireless sensor networks. In the ZCS scheme, the one-hop neighbors of a sensor node form a cooperative cache zone and share the cached data with each other. Simulation experiments show that the ZCS caching scheme achieves significant improvements in byte hit ratio and average query latency in comparison with other caching strategies.
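
A hedged sketch of the zone-based lookup idea is given below: a node answers a query from its own cache, then from the caches of its one-hop zone members, and only then fetches from the data source. The class layout, eviction policy, and method names are illustrative assumptions, not the ZCS protocol details.

```python
class SensorNode:
    def __init__(self, node_id, cache_size=16):
        self.node_id = node_id
        self.cache = {}               # data_id -> value
        self.cache_size = cache_size
        self.zone = []                # one-hop neighbors forming the cache zone

    def cache_put(self, data_id, value):
        if len(self.cache) >= self.cache_size:
            self.cache.pop(next(iter(self.cache)))    # naive FIFO eviction (assumption)
        self.cache[data_id] = value

    def query(self, data_id, fetch_from_source):
        if data_id in self.cache:                     # local cache hit
            return self.cache[data_id]
        for neighbor in self.zone:                    # zone cache hit (one-hop)
            if data_id in neighbor.cache:
                return neighbor.cache[data_id]
        value = fetch_from_source(data_id)            # miss: fetch over the network
        self.cache_put(data_id, value)
        return value
```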

Watermark Bit Rate in Diverse Signal Domains

A study of the obtainable watermark data rate for information hiding algorithms is presented in this paper. As the perceptual entropy of wideband monophonic audio signals is in the range of four to five bits per sample, a significant amount of additional information can be inserted into the signal without causing any perceptual distortion. Experimental results showed that transform-domain watermark embedding considerably outperforms watermark embedding in the time domain, and that signal decompositions with a high transform coding gain, such as the wavelet transform, are the most suitable for high-data-rate information hiding. Keywords: Digital watermarking, information hiding, audio watermarking, watermark data rate.
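
As an illustration of transform-domain embedding, the sketch below hides bits in the finest wavelet detail band by quantization index modulation; it assumes the PyWavelets package and a transform-friendly (e.g. power-of-two) signal length, and the wavelet, step size, and band choice are assumptions rather than the embedding scheme evaluated in the paper.

```python
import numpy as np
import pywt

def embed_bits(signal, bits, step=0.05, wavelet="db4", level=3):
    """Embed 0/1 bits in the finest detail band via quantization index modulation."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    detail = coeffs[-1]                      # finest-scale detail coefficients
    for i, bit in enumerate(bits[: len(detail)]):
        q = np.round(detail[i] / step)
        if int(q) % 2 != bit:                # force quantizer-index parity to carry the bit
            q += 1
        detail[i] = q * step
    coeffs[-1] = detail
    return pywt.waverec(coeffs, wavelet)

def extract_bits(watermarked, n_bits, step=0.05, wavelet="db4", level=3):
    """Recover the bits from the parity of the re-quantized detail coefficients."""
    detail = pywt.wavedec(watermarked, wavelet, level=level)[-1]
    return [int(np.round(d / step)) % 2 for d in detail[:n_bits]]
```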

A Novel 14 nm Extended Body FinFET for Reduced Corner Effect, Self-Heating Effect, and Increased Drain Current

In this paper, we propose a novel FinFET with an extended body under the poly gate, called the EB-FinFET, and demonstrate its characteristics using three-dimensional (3-D) numerical simulation. We have analyzed it and compared it with the conventional FinFET. The dependence of drain-induced barrier lowering (DIBL) and subthreshold swing (S.S.) on the extended body height has also been investigated. According to the 3-D numerical simulation, the proposed structure has a firm structure, an acceptable short-channel effect (SCE), a reduced series resistance, an increased on-state drain current (I_on), and a large normalized I_DS. Furthermore, the structure can also mitigate the corner effect and reduce the self-heating effect thanks to the extended body. Our results show that the EB-FinFET is excellent for nanoscale devices.

3G WCDMA Mobile Network DoS Attack and Detection Technology

Recently, 3G mobile networks have experienced a data traffic explosion due to the large increase in the number of smartphone users. Unlike a traditional wired infrastructure, 3G mobile networks have limited wireless resources and signaling procedures for complex wireless resource management. Moreover, mobile network security technologies for handling various kinds of abnormal and malicious traffic are not yet mature. Consequently, malicious or potentially malicious traffic originating from malware-infected smart devices can cause serious problems for 3G mobile networks, analogous to DoS and scanning attacks in wired networks. This paper describes the DoS security threat in the 3G mobile network and proposes a detection technology.
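
One generic way to flag such traffic, shown below purely as an illustration and not as the paper's proposed technology, is a per-device sliding-window rate check on signaling requests.

```python
from collections import defaultdict, deque

WINDOW_SEC = 10.0
MAX_REQUESTS = 50          # assumed per-device threshold within the window

_history = defaultdict(deque)

def observe_request(device_id, timestamp):
    """Record a signaling request and return True if the device looks abusive."""
    q = _history[device_id]
    q.append(timestamp)
    while q and timestamp - q[0] > WINDOW_SEC:   # drop requests outside the window
        q.popleft()
    return len(q) > MAX_REQUESTS
```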

Extracting Human Body Based on Background Estimation in Modified HLS Color Space

The ability to recognize humans and their activities by computer vision is a very important task, with many potential applications. The study of human motion analysis is related to several research areas of computer vision, such as motion capture and the detection, tracking, and segmentation of people. In this paper, we describe a segmentation method for extracting the human body contour in a modified HLS color space. To estimate the background, the modified HLS color space is proposed, and the background features are estimated using the HLS color components. A large human dataset, collected from DV cameras, is pre-processed. The human body and its contour are successfully extracted from the image sequences.
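
The sketch below shows one possible form of HLS-based background estimation with OpenCV: a running average of HLS frames serves as the background model, and pixels that deviate strongly are labeled foreground. The running-average model and thresholds are assumptions and do not reproduce the paper's modified HLS formulation.

```python
import cv2
import numpy as np

class HLSBackground:
    def __init__(self, alpha=0.02, thresh=30):
        self.alpha = alpha          # background update rate (assumed)
        self.thresh = thresh        # per-pixel distance threshold (assumed)
        self.model = None           # float32 HLS background estimate

    def apply(self, frame_bgr):
        hls = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HLS).astype(np.float32)
        if self.model is None:
            self.model = hls.copy()
        diff = np.abs(hls - self.model).max(axis=2)           # largest channel deviation
        mask = (diff > self.thresh).astype(np.uint8) * 255     # foreground (human) mask
        self.model = (1 - self.alpha) * self.model + self.alpha * hls
        return mask
```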

A Traffic Simulation Package Based on Travel Demand

In this paper, we propose a new traffic simulation package, TDMSim, which supports both macroscopic and microscopic simulation of free-flowing and regulated traffic systems. Both simulators are based on travel demands, which specify the numbers of vehicles departing from origins to arrive at different destinations. The microscopic simulator implements a car-following model given the pre-defined routes of the vehicles, but also supports the rerouting of vehicles. We also propose a macroscopic simulator that is integrated with the microscopic simulator to allow the simulation to be scaled to larger networks without sacrificing the precision achievable with the microscopic simulator. The macroscopic simulator also enables the reuse of previous simulation results when simulating traffic on the same networks at a later time. Validations have been conducted to show the correctness of both simulators.
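
Because the abstract does not specify the car-following model, the sketch below uses the widely known Intelligent Driver Model purely as an illustration of a microscopic update step; all parameter values are assumptions.

```python
import math

def idm_acceleration(v, gap, dv, v0=15.0, T=1.5, a=1.0, b=2.0, s0=2.0):
    """IDM acceleration: v is own speed, gap the (positive) distance to the leader,
    dv the speed difference v - v_leader. Parameters are illustrative."""
    s_star = s0 + max(0.0, v * T + v * dv / (2.0 * math.sqrt(a * b)))
    return a * (1.0 - (v / v0) ** 4 - (s_star / gap) ** 2)

def step(vehicles, dt=0.5):
    """vehicles: list of dicts with 'x' and 'v', ordered from leader to follower."""
    for i in range(1, len(vehicles)):
        lead, ego = vehicles[i - 1], vehicles[i]
        acc = idm_acceleration(ego["v"], lead["x"] - ego["x"], ego["v"] - lead["v"])
        ego["v"] = max(0.0, ego["v"] + acc * dt)
    for veh in vehicles:
        veh["x"] += veh["v"] * dt
```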

Evaluation of Sensitometric Properties of Radiographic Films at Different Processing Solutions

The aim of this study was to compare the sensitometric properties of commonly used radiographic films processed with chemical solutions in hospitals of different workloads. The effect of different processing conditions on the densities induced on radiographic films was investigated. Two widely available double-emulsion films, Fuji and Kodak, were exposed with an 11-step wedge and processed with Champion and CPAC processing solutions. The films were obtained from both high-workload and low-workload centers. Our findings show that the speed and contrast of the Kodak film-screen system are higher than those of the Fuji film-screen system at both workloads (high and low) and for both processing solutions. There were, however, significant differences in film contrast for both workloads when the CPAC solution was used (p=0.000 and 0.028). The results showed that the base-plus-fog density of the Kodak film was lower than that of the Fuji film. In general, the Champion processing solution yielded higher speed and contrast for the investigated films under different conditions, and there was a significant difference at the 95% confidence level between the two processing solutions (p=0.01). The low base-plus-fog density of Kodak films provides better visibility and accuracy, and their higher contrast allows lower exposure factors to be used to obtain better quality in the resulting radiographs. In this study, we also found an economic advantage in using the Champion solution with Kodak film, along with a lower patient dose. Thus, in a radiologic facility, any change in the film processor, processing cycle, or chemistry should be carefully investigated before radiographs of patients are acquired.

Wind Tunnel Investigation of the Turbulent Flow around the Panorama Giustinelli Building for VAWT Application

A boundary-layer wind tunnel facility has been adopted to conduct experimental measurements of the flow field around a model of the Panorama Giustinelli Building, Trieste (Italy). Information on the main flow structures has been obtained by means of flow visualization techniques and compared to numerical predictions of the vortical structures shed on top of the roof, in order to investigate the optimal positioning of a vertical-axis wind energy conversion system; good agreement between experimental measurements and numerical predictions was registered.

A Hybrid Radial-Based Neuro-GA Multiobjective Design of Laminated Composite Plates under Moisture and Thermal Actions

In this paper, the optimum weight and cost of a laminated composite plate are sought while it withstands the heaviest load prior to complete failure. Various failure criteria are defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), an optimization technique that directly uses real variables, was employed. However, since optimization via GAs is a lengthy process and most of the time is consumed by the analysis, a Radial Basis Function Neural Network (RBFNN) was employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure is repeated until a predicted optimum solution is achieved.
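
The hybrid idea can be sketched compactly: fit a Gaussian RBF surrogate to a few expensive analyses, then let a simple real-coded GA search the surrogate. The placeholder objective below stands in for the CLT/Tsai-Hill analysis, and every parameter is an assumption made for demonstration.

```python
import numpy as np

def expensive_analysis(x):                     # placeholder for the laminate analysis
    return np.sum((x - 0.3) ** 2)

def fit_rbf(X, y, sigma=0.5):
    """Interpolating Gaussian RBF network fitted by solving a linear system."""
    K = np.exp(-np.linalg.norm(X[:, None] - X[None, :], axis=2) ** 2 / (2 * sigma ** 2))
    w = np.linalg.solve(K + 1e-8 * np.eye(len(X)), y)
    return lambda q: np.exp(-np.linalg.norm(X - q, axis=1) ** 2 / (2 * sigma ** 2)) @ w

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(40, 3))            # sampled (normalized) design variables
y = np.array([expensive_analysis(x) for x in X])
surrogate = fit_rbf(X, y)

# Tiny real-coded GA minimizing the cheap surrogate instead of the true analysis.
pop = rng.uniform(0, 1, size=(30, 3))
for _ in range(100):
    scores = np.array([surrogate(ind) for ind in pop])
    parents = pop[np.argsort(scores)[:15]]                    # keep the best half
    children = (parents[rng.integers(0, 15, 15)] + parents[rng.integers(0, 15, 15)]) / 2
    children += rng.normal(0, 0.05, children.shape)           # Gaussian mutation
    pop = np.vstack([parents, np.clip(children, 0, 1)])
print(pop[np.argmin([surrogate(ind) for ind in pop])])
```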

GeoSEMA: A Modelling Platform, Emerging “GeoSpatial-based Evolutionary and Mobile Agents”

Spatial and mobile computing continue to evolve. This paper describes a smart modeling platform called “GeoSEMA”. The approach aims to model multidimensional GeoSpatial Evolutionary and Mobile Agents. Beyond 3D and location-based issues, other dimensions may characterize spatial agents, e.g., discrete or continuous time and agent behaviors. GeoSEMA is conceived as a dedicated design pattern motivating temporal geographic-based applications; it is a firm foundation for multipurpose and multidimensional spatial-based applications. It deals with multipurpose smart objects (buildings, shapes, missiles, etc.) by simulating geospatial agents. Formally, GeoSEMA refers to geospatial, spatio-evolutive, and mobile space constituents; a conceptual geospatial space model is given in this paper. In addition to modeling and categorizing geospatial agents, the model incorporates the concept of inter-agent event-based protocols. Finally, a rapid software-architecture prototype of the GeoSEMA platform is also given; it will be implemented and validated in the next phase of our work.
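
A speculative sketch of the kind of agent data model the platform suggests is given below: an agent with a spatial position, a behavior that evolves it over (discrete or continuous) time steps, and an event-based protocol hook. All names are illustrative assumptions, not the GeoSEMA API.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List, Tuple

@dataclass
class GeoAgent:
    agent_id: str
    position: Tuple[float, float, float]                    # x, y, z
    behavior: Callable[["GeoAgent", float], None]           # evolves the agent over dt
    inbox: List[Dict] = field(default_factory=list)         # inter-agent events

    def step(self, dt: float) -> None:
        """Advance the agent's state by dt using its behavior."""
        self.behavior(self, dt)

    def on_event(self, event: Dict) -> None:
        """Event-based protocol hook: queue an event sent by another agent."""
        self.inbox.append(event)

def drift_east(agent: GeoAgent, dt: float) -> None:          # example behavior
    x, y, z = agent.position
    agent.position = (x + 1.0 * dt, y, z)
```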

On Pattern-Based Programming towards the Discovery of Frequent Patterns

The problem of frequent pattern discovery is defined as the process of searching for patterns, such as sets of features or items, that appear in data frequently. Finding such frequent patterns has become an important data mining task because it reveals associations, correlations, and many other interesting relationships hidden in a database. Most of the proposed frequent pattern mining algorithms have been implemented in imperative programming languages. Such a paradigm is inefficient when the set of patterns is large and the frequent patterns are long. We suggest applying a high-level declarative style of programming to the problem of frequent pattern discovery. We consider two languages: Haskell and Prolog. Our intuition is that the problem of finding frequent patterns should be efficiently and concisely implementable in a declarative paradigm, since pattern matching is a fundamental feature supported by most functional languages and by Prolog. Our frequent pattern mining implementations in Haskell and Prolog confirm our hypothesis about the conciseness of the programs. Comparative studies of the lines of code, speed, and memory usage of declarative versus imperative programming are reported in the paper.
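
For readers unfamiliar with the task, the sketch below computes frequent itemsets with level-wise (Apriori-style) candidate generation; it is written in compact Python purely as an illustration of the problem, whereas the paper's implementations are in Haskell and Prolog.

```python
from itertools import combinations

def frequent_patterns(transactions, min_support):
    """Return all itemsets appearing in at least min_support transactions."""
    items = sorted({i for t in transactions for i in t})
    frequent, k = {}, 1
    candidates = [frozenset([i]) for i in items]
    while candidates:
        counts = {c: sum(c <= t for t in transactions) for c in candidates}
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        k += 1
        # Generate k-item candidates whose every (k-1)-subset was frequent.
        candidates = list({a | b for a, b in combinations(level, 2)
                           if len(a | b) == k
                           and all(frozenset(s) in level
                                   for s in combinations(a | b, k - 1))})
    return frequent

data = [frozenset("abc"), frozenset("abd"), frozenset("ab"), frozenset("cd")]
print(frequent_patterns(data, min_support=2))
```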

Protocol and Method for Preventing Attacks from the Web

Nowadays, computer worms, viruses, and Trojan horses have become widespread; they are collectively called malware. A decade ago, such malware merely damaged computers by deleting or rewriting important files; however, recent malware seems designed to earn money. Some malware collects personal information so that malicious people can obtain secrets such as online banking passwords, evidence of a scandal, or contact addresses related to the target. Moreover, the relation between money and malware has become more complex: many kinds of malware spawn bots to obtain springboards. Meanwhile, for ordinary Internet users, countermeasures against malware have come up against a wall. Pattern matching wastes too many computer resources, since matching tools must deal with a large number of patterns derived from variants, and virus-making tools can automatically generate such variants. Moreover, metamorphic and polymorphic malware are no longer exceptional. Recently, malware-checking sites have appeared that inspect content in place of the users' PCs; however, a new type of malicious site has appeared that evades inspection by these checking sites. In this paper, existing protocols and methods related to the web are reconsidered in terms of protection against current attacks, and a new protocol and method are proposed for the security of the web.

Design and Fabrication of a Miniature Railway Vehicle

We present the design, fabrication, and characterization of a small (12 mm × 12 mm × 8 mm) movable railway vehicle for carrying sensors. The miniature railway vehicle (MRV) is mainly composed of a vibrational structure and three legs. A railway was designed and fabricated to power and guide the MRV; it also transmits the sensed data from the MRV to the signal processing unit. The MRV moves along the railway on its legs by means of high-frequency vibration. A model was derived to describe the motion. In addition, FEM simulations were performed to design the legs. The MRV and the railway were then fabricated by precision machining. Finally, an infrared sensor was carried and tested. The results show that the unloaded MRV moved along the railway with a maximum speed of 12.2 mm/s. Moreover, the test signal was successfully sensed through the MRV.

Generator of Hypotheses: An Approach to Data Mining Based on Monotone Systems Theory

The generator of hypotheses is a new method for data mining. It makes it possible to classify the source data automatically and produces a particular enumeration of patterns. A pattern is an expression (in a certain language) describing facts in a subset of facts. The goal is to describe the source data via patterns and/or IF...THEN rules. The evaluation criteria used are deterministic (not probabilistic). The search results are trees, a form that is easy to comprehend and interpret. The generator of hypotheses uses a very effective algorithm based on the theory of monotone systems (MS), named MONSA (MONotone System Algorithm).