Adopting Procedural Animation Technology to Generate Locomotion of Quadruped Characters in Dynamic Environments

We propose a procedural-animation-based approach that rapidly synthesizes adaptive locomotion for quadruped characters, enabling them to walk or run in any direction over uneven terrain within a dynamic environment. We devise practical motion models of quadruped animals for adapting to varied terrain in real time. While synthesizing locomotion, we select the appropriate motion model by predicting footsteps from the current state of the dynamic environment, adjust the key-frames of the motion model according to the terrain's attributes, calculate collision-free leg trajectories, and interpolate the key-frames along those trajectories. Finally, we apply dynamic time warping to each part of the motion to seamlessly concatenate all required transition motions into the complete locomotion. Our approach reduces the time cost of producing locomotion and allows virtual characters to adapt to dynamic environments whenever users change them.
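The concatenation step above rests on dynamic time warping. The abstract gives no code, so the following is an illustrative sketch of the classic DTW distance between two 1-D motion signals (e.g. joint-angle curves, a simplifying assumption; real motion data would be multi-dimensional):

```python
def dtw_distance(a, b):
    """Classic DP formulation: cost[i][j] is the best alignment cost
    of the prefix a[:i] against the prefix b[:j]."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # match step
    return cost[n][m]
```

Because DTW may repeat samples, two curves that differ only in timing (not shape) align at zero cost, which is exactly what makes it suitable for blending motions of different durations.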

Replacement of Power Transformers Based on Diagnostic Results and Load Forecasting

This paper describes the interconnection between technical and economic decision making regarding power transformer replacement. The reasons for such a decision can differ: poor technical condition, a change in the substation (electrical network) regime, a budget deficit of the transformer owner, or an increase in electricity tariffs. The paper aims to establish recommended practice, to give general advice and guidance on the economic aspects, and to describe the testing and diagnostics of power transformers needed to establish their condition, identify problems, and provide potential remedies.

Effective Image and Video Error Concealment using RST-Invariant Partial Patch Matching Model and Exemplar-based Inpainting

An effective visual error concealment method is presented that employs a robust rotation, scale, and translation (RST) invariant partial patch matching model (RSTI-PPMM) and exemplar-based inpainting. While the proposed robust, inherently feature-enhanced texture synthesis approach ensures perceptually plausible error concealment results, its outlier pruning property guarantees significant quality improvements, both quantitatively and qualitatively. No intermediate user interaction is required for pre-segmented media, and the presented method follows a bootstrapping approach for automatic visual loss recovery and image and video error concealment.
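The core of any partial patch matching model is comparing a damaged patch to candidate source patches using only the pixels that are still known. As a hedged sketch of that idea (deliberately without the RST invariance of the paper's RSTI-PPMM; patches are flattened to 1-D lists and all names are illustrative):

```python
def partial_ssd(damaged, mask, candidate):
    """Mean squared difference over known pixels only.
    `mask[i]` is truthy where pixel i of `damaged` survived the loss.
    Assumes at least one valid pixel."""
    valid = [(d - c) ** 2 for d, m, c in zip(damaged, mask, candidate) if m]
    return sum(valid) / len(valid)

def best_match(damaged, mask, candidates):
    """Index of the candidate patch that best explains the known pixels."""
    return min(range(len(candidates)),
               key=lambda k: partial_ssd(damaged, mask, candidates[k]))
```

In exemplar-based inpainting, the missing pixels of the damaged patch would then be filled from the winning candidate.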

Design of a Multi-disease Diagnosis Processor using the Hypernetwork Technique

In this paper, we propose a disease diagnosis hardware architecture based on the hypernetwork technique. It can be used to diagnose three different diseases (SPECT heart disease, leukemia, and prostate cancer). In general, disparate diseases each require a dedicated diagnosis hardware model. Exploiting the similarities among the three disease diagnosis processors, we design a single diagnosis processor that can diagnose all three diseases. Our proposed architecture, which combines three processors into one, reduces hardware size without decreasing accuracy.

Spread Spectrum Code Estimation by Particle Swarm Algorithm

In the context of spectrum surveillance, a new method to recover the code of a spread spectrum signal is presented for the case where the receiver has no knowledge of the transmitter's spreading sequence. In our previous paper, we used a genetic algorithm (GA) to recover the spreading code. Although genetic algorithms are well known for their robustness in solving complex optimization problems, increasing the length of the code often leads to unacceptably slow convergence. To solve this problem, we introduce particle swarm optimization (PSO) into code estimation for spread spectrum communication systems. In the search process for code estimation, the PSO algorithm has the merits of rapid convergence to the global optimum without being trapped in local suboptima, and good robustness to noise. In this paper we describe how to implement PSO as a component of a search algorithm for code estimation. Swarm intelligence offers a number of advantages due to its use of mobile agents, among them scalability, fault tolerance, adaptation, speed, modularity, autonomy, and parallelism. These properties make swarm intelligence very attractive for spread spectrum code estimation, and suitable for a variety of other kinds of channels as well. Our results compare swarm-based algorithms with genetic algorithms, and also show the performance of the PSO algorithm in the code estimation process.
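The abstract does not give an implementation, so the following is a minimal, generic PSO sketch minimizing a toy fitness function. In the paper's setting the fitness would instead score a candidate spreading code against the received signal; the parameter values and names here are illustrative assumptions, not the authors' configuration:

```python
import random

def pso(f, dim, bounds, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard global-best PSO minimizing f over a box."""
    rng = random.Random(seed)
    lo, hi = bounds
    xs = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]              # each particle's best position
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # swarm's best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull + social pull
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] = min(hi, max(lo, xs[i][d] + vs[i][d]))
            fi = f(xs[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = xs[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = xs[i][:], fi
    return gbest, gbest_f
```

For code estimation, `f` would typically be the negative correlation between the candidate code and the despread signal, so that maximizing correlation becomes a minimization problem.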

The Role of Cognitive Decision Effort in Electronic Commerce Recommendation System

The purpose of this paper is to explore the role of cognitive decision effort in recommendation systems, combining the indicators "information quality" and "service quality" from the IS success model to examine users' perception of recommendation system performance. A total of 411 Internet users answered a questionnaire assessing their intention to use and satisfaction with the recommendation system of an Internet bookstore. The quantitative results indicate the following. First, the information quality of the recommendation system has an obvious influence on the consumer's shopping decision-making process and on the attitude towards using the system. Second, in the consumer's shopping decision-making process, the recommendation system has no significant effect on lowering consumers' cognitive decision effort. Third, it is necessary for an e-commerce platform to provide recommendations and information, but the quality of that information must match user needs; otherwise the platform will be displaced by competitors offering homogeneous services.

A Generic e-Tutor for Graphical Problems

For a variety of safety and economic reasons, engineering undergraduates in Australia have experienced diminishing access to the real hardware that is typically the embodiment of their theoretical studies. This trend will delay the development of practical competence, decrease the ability to model and design, and suppress motivation. The author has attempted to address this concern by creating a software tool that contains both photographic images of real machinery, and sets of graphical modeling 'tools'. Academics from a range of disciplines can use the software to set tutorial tasks, and incorporate feedback comments for a range of student responses. An evaluation of the software demonstrated that students who had solved modeling problems with the aid of the electronic tutor performed significantly better in formal examinations with similar problems. The 2-D graphical diagnostic routines in the Tutor have the potential to be used in a wider range of problem-solving tasks.

Why Traditional Technology Acceptance Models Won't Work for Future Information Technologies

This paper illustrates why existing technology acceptance models are of only limited use for predicting and explaining the adoption of future information and communication technologies. It starts with a general overview of technology adoption processes and presents several theories of the acceptance and adoption of traditional information technologies. This is followed by an overview of recent developments in the area of information and communication technologies. Based on the arguments elaborated in these sections, it is shown why the factors used to predict the adoption of existing systems will not be sufficient for explaining the adoption of future information and communication technologies.

Micro-aerobic, Anaerobic and Two-stage Conditions for Ethanol Production by Enterobacter aerogenes from Biodiesel-derived Crude Glycerol

The microbial production of ethanol from biodiesel-derived crude glycerol by Enterobacter aerogenes TISTR1468, under micro-aerobic and anaerobic conditions, was investigated. The experimental results showed that micro-aerobic conditions were more favorable for cellular growth (4.0 g/L DCW) and ethanol production (20.7 g/L, with an ethanol yield of 0.47 g/g glycerol) than anaerobic conditions (1.2 g/L DCW, 6.3 g/L ethanol and 0.72 g/g glycerol, respectively). Crude glycerol (100 g/L) was consumed completely at a rate of 1.80 g/L/h. Two-stage fermentation (a combination of micro-aerobic and anaerobic conditions) exhibited higher ethanol production (24.5 g/L) than one-stage fermentation (either micro-aerobic or anaerobic). The two-stage configuration also exhibited a slightly higher crude glycerol consumption rate (1.81 g/L/h) as well as a higher ethanol yield (0.56 g/g) than the one-stage configuration. Therefore, the two-stage process was selected for ethanol production by E. aerogenes TISTR1468 in scale-up studies.

A New Predictor of Coding Regions in Genomic Sequences using a Combination of Different Approaches

Identifying protein coding regions in DNA sequences is a basic step in locating genes. Several approaches based on signal processing tools have been applied to this problem in an attempt to achieve more accurate predictions. This paper presents a new predictor that improves on the efficacy of three techniques that use the Fourier transform to predict coding regions, and that can be computed using an algorithm that reduces the computational load. Some ideas on combining the predictor with other methods are discussed. ROC curves, computed over 25 DNA sequences from three different organisms, are used to demonstrate the efficacy of the proposed predictor.
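The Fourier-based techniques the paper builds on typically exploit the well-known period-3 spectral signature of coding DNA: the DFT of each base's binary indicator sequence shows a peak at frequency N/3 inside exons. A hedged sketch of that baseline measure (not the paper's improved predictor) follows:

```python
import cmath

def period3_power(seq, base):
    """Power of the DFT of `base`'s binary indicator sequence at the
    period-3 frequency bin k = N/3. Assumes len(seq) divisible by 3."""
    x = [1.0 if c == base else 0.0 for c in seq]
    N = len(x)
    k = N // 3
    X = sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
    return abs(X) ** 2

def coding_measure(seq):
    """Sum the period-3 power over all four bases; high values
    suggest a coding region."""
    return sum(period3_power(seq, b) for b in "ACGT")
```

A perfectly 3-periodic window such as a repeated codon scores high, while a homopolymer run scores near zero, which is the contrast a sliding-window exon predictor exploits.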

Expelling Policy Based Buffer Control during Congestion in Differentiated Service Routers

In this paper a special kind of buffer management policy is studied in which packets are preempted even when sufficient space is available in the buffer for incoming packets. This is done to relieve congestion for future incoming packets and to improve QoS for certain types of packets. Such studies have been done in the past for ATM-type scenarios. We extend them to heterogeneous traffic, in which data rates and packet sizes are highly variable. A typical example of this scenario is buffer management in a Differentiated Services router. Two aspects are of interest. The first is packet size: whether all packets have the same size or different sizes. The second is the value, or space priority, of the packets: whether all packets have the same space priority or different packets have different space priorities. We present two types of policies to achieve QoS goals for packets with different priorities: the push-out scheme and the expelling scheme. In this work we consider packets of variable length with two space priorities, and the main goal is to minimize the total weighted packet loss. Simulation and analytical studies show that expelling policies can outperform push-out policies when it comes to offering differentiated QoS for packets of two different priorities, and that expelling policies also help improve the admissible load. Some other comparisons of push-out and expelling policies are also presented using simulations.
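For intuition, here is an illustrative push-out queue for variable-length packets with two space priorities (the simpler of the two policies; an expelling policy would additionally evict before the buffer fills). The packet encoding and class names are assumptions, not the paper's model:

```python
class PushOutBuffer:
    """FIFO byte buffer; priority 0 is the more valuable class
    and may push out buffered priority-1 packets when full."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.used = 0
        self.queue = []  # list of (size, priority), head first

    def admit(self, size, priority):
        # A high-priority arrival evicts low-priority packets
        # (newest first) until it fits or no victims remain.
        while priority == 0 and self.used + size > self.capacity:
            victims = [i for i, (_, p) in enumerate(self.queue) if p == 1]
            if not victims:
                break
            s, _ = self.queue.pop(victims[-1])
            self.used -= s
        if self.used + size > self.capacity:
            return False  # arrival dropped
        self.queue.append((size, priority))
        self.used += size
        return True
```

Minimizing total weighted loss then amounts to choosing eviction victims so that the weight of dropped traffic, summed over both classes, stays as small as possible.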

Project Portfolio Management Phases: A Technique for Strategy Alignment

This paper seeks to give a general idea of the universe of project portfolio management, from its multidisciplinary nature to the many challenges it raises, passing through the different techniques, models and tools used to solve its many known problems. It is intended to contribute to an in-depth clarification of the impacts and relationships involved in managing a portfolio of projects. It proposes a technique for aligning projects with the organisational strategy, in order to select the projects that will later be considered in the analysis and selection of the portfolio. We consider the development of a methodology for assessing the project alignment index to be very relevant in the global market scenario. It can help organisations gain a greater awareness of market dynamics, speed up the decision process and increase its consistency, thus enabling strategic alignment and the improvement of organisational performance.

A New Traffic Pattern Matching for DDoS Traceback Using Independent Component Analysis

Recently, Denial of Service (DoS) attacks and Distributed DoS (DDoS) attacks, a stronger form of DoS attack launched from multiple hosts, have become security threats on the Internet. Identifying the attack source and blocking the attack traffic are important countermeasures against these attacks. In general, it is difficult to identify the attack source because information about it is falsified, so a method of identifying the attack source by tracing the route of the attack traffic is necessary. A traceback method has been proposed that uses traffic patterns, i.e. changes in the number of packets over time, as the criterion for attack traceback. This method traces the attack by matching the shapes of the input traffic patterns against the shape of the output traffic pattern observed at a network branch point such as a router. The traffic pattern is the shape of the traffic and cannot be falsified. However, the traceback methods proposed to date cannot achieve sufficient tracing accuracy, because they directly use traffic patterns that are influenced by non-attack traffic. In this paper, a new traffic pattern matching method using Independent Component Analysis (ICA) is proposed.
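As background, the pattern matching step that existing traceback methods perform directly (and that this paper improves by first separating attack from non-attack components with ICA) can be sketched as normalized correlation between packet-count time series. The ICA stage itself needs matrix algebra and is omitted; names here are illustrative:

```python
import math

def normalized_correlation(x, y):
    """Pearson correlation between two packet-count time series.
    Assumes neither series is constant."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def best_upstream_match(output_pattern, input_patterns):
    """Pick the input link whose traffic pattern best matches the
    attack pattern observed on the router's output link."""
    return max(range(len(input_patterns)),
               key=lambda i: normalized_correlation(output_pattern,
                                                    input_patterns[i]))
```

Because non-attack traffic superimposed on an input link distorts this correlation, separating the mixed series into independent components before matching is precisely what motivates the ICA-based method.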

Object-Oriented Simulation of Simulating Anticipatory Systems

The present paper addresses problems in the simulation of anticipatory systems, namely those that use simulation models to aid anticipation. A certain analogy between the use of simulation and imagining will be applied to make the explanation more comprehensible. The paper is completed by notes on open problems and on some existing applications. The core difficulty is that simulating such anticipatory systems amounts to simulating simulating systems, i.e. building computer models that handle two or more modeled time axes, which must be mapped to the real time flow in a non-descending manner. Languages oriented to objects, processes and blocks can be used to surmount these problems.

Comprehensive Characteristics of the Municipal Solid Waste Generated in the Faculty of Engineering, UKM

The main aims of this research are to study the solid waste generation of the Faculty of Engineering and Built Environment at UKM and, at the same time, to determine the composition and some characteristics of the waste, namely moisture content, density, pH and C/N ratio. For this purpose, multiple campaigns were conducted from 24 February to 2 March 2009 to collect the waste produced in all hostels, faculties, offices and so on, and to measure and investigate its physical and chemical characteristics in order to highlight the necessary management policies. The research locations are the Faculty of Engineering and the canteen nearby. From the results obtained, the most suitable solid waste management solution will be proposed to UKM. The average solid waste generation rate at UKM is 203.38 kg/day. The solid waste generated is composed of glass, plastic, metal, aluminum, organic and inorganic waste, and other waste. From the laboratory results, the average moisture content, density, pH and C/N ratio of the solid waste generated are 49.74%, 165.1 kg/m3, 5.3, and 7:1, respectively. Since food waste (organic waste) was the most dominant component, at around 62% of the total waste generated, the most suitable solid waste management solution is composting.

Role-based Access Control Model in Home Network Environments

Today's home no longer has a single computer connected to the Internet, but rather a network of many devices within the home, and that network may itself be connected to the Internet. In such an environment, the potential for attacks is greatly increased. General-purpose security technologies cannot be applied directly because of the variety of wired and wireless networks, middleware and protocols used in the digital home environment, and because of the restricted system resources of home information appliances. To offer secure home services, home network environments need access control over the various home devices and the information that users want to access. Home network access control for user authorization is therefore a very important issue. In this paper we propose an access control model using RBAC for home network environments, to provide home users with secure home services.
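The core RBAC idea, permissions attached to roles and roles attached to users, can be sketched in a few lines. The role and device names below are hypothetical examples for a home network, not the paper's model:

```python
class RBAC:
    """Minimal role-based access control: user -> roles -> permissions."""

    def __init__(self):
        self.user_roles = {}   # user -> set of roles
        self.role_perms = {}   # role -> set of (device, action)

    def assign_role(self, user, role):
        self.user_roles.setdefault(user, set()).add(role)

    def grant(self, role, device, action):
        self.role_perms.setdefault(role, set()).add((device, action))

    def check(self, user, device, action):
        """Allowed iff any of the user's roles carries the permission."""
        return any((device, action) in self.role_perms.get(r, set())
                   for r in self.user_roles.get(user, ()))
```

The indirection through roles is what makes the model practical on resource-restricted appliances: adding a family member or a device only touches one mapping, not every access rule.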

Is the Expansion of High-Tech Leaders Possible Within the New EU Members? A Case Study of Ammono S.A. and the High-Tech Financing System in Poland

Innovations, especially technological ones, are considered key drivers of sustainable economic growth and competitiveness in the globalised world. As such, they should also play an important role in the process of economic convergence inside the EU. Unfortunately, the problem of insufficient innovation performance concerns around half of the EU countries. Poland shows that the lack of a consistent high-tech financing system constitutes a serious obstacle to the development of innovative firms. In this article we evaluate these questions with reference to the example of Ammono S.A., a Polish company established to develop and commercialise an original technology for the production of bulk GaN crystals. We focus on its efforts to accumulate the financial resources necessary at different stages of its development. The purpose of this article is to suggest possible ways to improve the national innovation system that would make it more capable of generating high-tech leaders.

MIBiClus: Mutual Information based Biclustering Algorithm

Most biclustering/projected clustering algorithms are based either on Euclidean distance or on the correlation coefficient, which capture only linear relationships. However, in many applications, such as gene expression data and word-document data, nonlinear relationships may exist between the objects. Mutual information between two variables provides a more general criterion for investigating dependencies among variables. In this paper, we improve upon our previous algorithm that uses mutual information for biclustering, both in computation time and in the types of clusters identified. The algorithm is able to find biclusters with mixed relationships and is faster than the previous one. To the best of our knowledge, no other existing biclustering algorithm has used mutual information as a similarity measure. We present experimental results on synthetic data as well as on yeast expression data. The biclusters found on the yeast data were biologically and statistically significant according to the GO Tool Box and FuncAssociate.
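The similarity measure underlying the algorithm is plain mutual information between discrete variables. As a sketch (assuming expression values have already been discretized into bins, which is an assumption the abstract does not spell out):

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two paired discrete vectors,
    estimated from empirical frequencies."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum((c / n) * math.log2((c / n) / ((px[a] / n) * (py[b] / n)))
               for (a, b), c in pxy.items())
```

Unlike the correlation coefficient, this quantity is high for any deterministic dependence, linear or not, which is exactly why it can group objects related in mixed, nonlinear ways.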

Predicting Individual Investors' Intention to Invest: An Experimental Analysis of Attitude as a Mediator

The survival of publicly listed companies largely depends on their stocks being liquidly traded. This goal can be achieved when new investors are attracted to invest in companies' stocks. Among the different groups of investors, individual investors are generally less able to objectively evaluate companies' risks and returns, and tend to be emotionally biased in their investing decisions. Their decisions may therefore be formed by perceived risks and returns, and influenced by companies' images. This study finds that perceived risk, perceived returns and trust directly affect individual investors' trading decisions, while attitude towards the brand partially mediates these relationships. This finding suggests that, in courting individual investors, companies still need to perform financially, while building a good image can result in their stocks being accepted more quickly than those of well-performing companies with less visible images.