Performance Evaluation of MUSIC and Minimum-Norm Eigenvector Algorithms in Resolving Noisy Multiexponential Signals

Eigenvector methods are gaining increasing acceptance in the area of spectrum estimation. This paper tests and evaluates the performance of two of the most popular subspace techniques in determining the parameters of multiexponential signals with real decay constants buried in noise. In particular, the MUSIC (Multiple Signal Classification) and minimum-norm techniques are examined. It is shown that these methods perform almost equally well on multiexponential signals, with MUSIC displaying better-defined peaks.
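As a rough illustration of how such a noise-subspace search proceeds on damped real exponentials, the following minimal NumPy sketch builds a sample correlation matrix from a noisy two-component record and scans candidate decay constants with a MUSIC-style pseudospectrum; the signal, model order and search grid are illustrative assumptions, not the paper's data.

import numpy as np

# Synthetic two-component multiexponential signal buried in noise (assumed example).
rng = np.random.default_rng(0)
n = np.arange(200)
x = 1.0 * np.exp(-0.02 * n) + 0.6 * np.exp(-0.07 * n) + 0.01 * rng.standard_normal(n.size)

m, p = 8, 2                                  # correlation order and signal order (assumptions)
X = np.array([x[i:i + m] for i in range(x.size - m)])
R = X.T @ X / X.shape[0]                     # sample correlation matrix

w, V = np.linalg.eigh(R)                     # eigenvalues returned in ascending order
En = V[:, :m - p]                            # noise-subspace eigenvectors

# MUSIC-style pseudospectrum over candidate real decay constants d,
# with steering vectors a(d) = [1, e^{-d}, e^{-2d}, ...].
decays = np.linspace(0.005, 0.12, 400)
pseudo = np.array([1.0 / np.linalg.norm(En.T @ np.exp(-d * np.arange(m))) ** 2
                   for d in decays])
print("dominant pseudospectrum peak near d =", decays[np.argmax(pseudo)])

The minimum-norm variant replaces the full noise-subspace projection with a single minimum-norm vector constrained to lie in the noise subspace; the scan over d is otherwise the same.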

Development of an Improved Three-Dimensional Unstructured Tetrahedral Mesh Generator

Meshing is the process of discretizing a problem domain into many subdomains before numerical calculations can be performed. Among the many mesh types, the tetrahedral mesh is one of the most popular because of its flexibility in fitting almost any domain shape. In both 2D and 3D domains, triangular and tetrahedral meshes can be generated using Delaunay triangulation. Mesh quality is an important factor in any Computational Fluid Dynamics (CFD) simulation, as the results are highly affected by it, and many efforts have been made to improve it. This paper describes a mesh generation routine developed to produce high-quality tetrahedral cells in arbitrarily complex geometries. A few CFD test cases are used to test the mesh generator, and the resulting meshes are compared with those generated by a commercial software package. The results show that no slivers exist in the generated meshes and that the overall quality is acceptable, since the percentage of bad tetrahedra is relatively small. Boundary recovery was also completed successfully, with all missing faces rebuilt.
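The routine itself is not reproduced in the abstract; as a rough sketch of the underlying idea (Delaunay tetrahedralization followed by a simple quality screen for slivers), the following Python fragment applies scipy.spatial.Delaunay to an assumed random point cloud and flags tetrahedra whose volume is small relative to their longest edge. The point cloud and threshold are illustrative assumptions only.

import numpy as np
from scipy.spatial import Delaunay

# Assumed example: random points in a unit cube stand in for a real domain.
rng = np.random.default_rng(1)
pts = rng.random((300, 3))
tets = Delaunay(pts).simplices            # (n_tet, 4) vertex indices

def quality(p0, p1, p2, p3):
    """Crude sliver indicator: tetrahedron volume normalized by the cube of its longest edge."""
    vol = abs(np.linalg.det(np.column_stack((p1 - p0, p2 - p0, p3 - p0)))) / 6.0
    edges = [p1 - p0, p2 - p0, p3 - p0, p2 - p1, p3 - p1, p3 - p2]
    lmax = max(np.linalg.norm(e) for e in edges)
    return vol / lmax**3

q = np.array([quality(*pts[t]) for t in tets])
print(f"{(q < 1e-3).sum()} of {len(tets)} tetrahedra flagged as near-sliver")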

Numerical Simulation of Conjugated Heat Transfer Characteristics of Laminar Air Flows in Parallel-Plate Dimpled Channels

This paper presents a numerical study of the surface heat transfer characteristics of laminar air flows in parallel-plate dimpled channels. The two-dimensional numerical model is built with the commercial code FLUENT, and results are obtained for channels with symmetrically opposing hemi-cylindrical cavities on both walls for Reynolds numbers ranging from 1000 to 2500. The influence on heat transfer enhancement of the relative depth of the dimples (the ratio of cavity depth to cavity curvature diameter), the number of dimples, and the thermophysical properties of the channel walls is studied. The results give evidence of an optimum value of the relative dimple depth at which the largest wall heat flux and average Nusselt number are achieved. In addition, the conjugate simulation results indicate that the overall influence of the ratio of wall thermal conductivity to that of the fluid on the heat transfer rate is not significant and can be neglected.
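For reference, the average Nusselt number reported in such channel studies is conventionally defined as follows (standard definitions, stated here as an assumption since the abstract does not give them):

  \overline{Nu} = \frac{\bar{h}\, D_h}{k_f}, \qquad \bar{h} = \frac{q''_w}{T_w - T_b}, \qquad D_h = 2H \ \text{for a parallel-plate channel of spacing } H,

where q''_w is the wall heat flux, T_w and T_b are the wall and bulk fluid temperatures, and k_f is the fluid thermal conductivity.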

Distributed Estimation Using an Improved Incremental Distributed LMS Algorithm

In this paper we consider the problem of distributed adaptive estimation in wireless sensor networks under two different observation noise conditions. In the first case, we assume that some sensors in the network have high observation noise variance (noisy sensors). In the second case, the observation noise variance is assumed to differ from sensor to sensor, which is closer to a real scenario. In both cases, an initial estimate of each sensor's observation noise is obtained. For the first case, we show that when such sensors are present in the network, the performance of conventional distributed adaptive estimation algorithms, such as the incremental distributed least mean square (IDLMS) algorithm, decreases drastically, and that detecting and ignoring these sensors leads to better estimation performance. We then propose a simple algorithm to detect these noisy sensors and modify the IDLMS algorithm to deal with them. For the second case, we propose a new algorithm in which the step-size parameter of each sensor is adjusted according to its observation noise variance. As the simulation results show, the proposed methods outperform the IDLMS algorithm under the same conditions.
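As a minimal sketch of the second idea (scaling each node's step size by its estimated observation-noise variance), the following NumPy fragment runs incremental LMS cycles around a ring of sensors; the network size, data model and scaling rule are illustrative assumptions, not the algorithm exactly as proposed.

import numpy as np

rng = np.random.default_rng(2)
N, M, iters = 10, 4, 500                 # sensors, filter length, cycles (assumptions)
w_true = rng.standard_normal(M)

# Assumed heterogeneous observation-noise variances, one per sensor.
sigma2 = rng.uniform(0.01, 1.0, N)
mu0 = 0.05
mu = mu0 / (1.0 + sigma2 / sigma2.min())  # smaller steps for noisier sensors (assumed rule)

w = np.zeros(M)                           # estimate passed around the ring
for _ in range(iters):
    for k in range(N):                    # one incremental cycle through the sensors
        u = rng.standard_normal(M)                        # regressor at sensor k
        d = u @ w_true + np.sqrt(sigma2[k]) * rng.standard_normal()
        w = w + mu[k] * u * (d - u @ w)                   # local LMS update

print("estimation error:", np.linalg.norm(w - w_true))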

Evaluating Telepresence Experience and Game Players' Intention to Purchase Product Advertised in Advergame

Changes in consumers' modern lifestyles call for advertising strategies to change as well. This research investigates how telepresence and product experience embedded in a computer game affect users' intention to purchase. Game content developers are urged to consider placing product messages as part of a game design strategy that can influence game players' intention to purchase. An experiment was carried out on two hundred and fifty undergraduate students who volunteered to participate in Internet game-playing activities. Factor and correlation analyses were performed on items designed to measure telepresence, attitudes toward telepresence, and game players' intention to purchase the product advertised in the game that respondents experienced. The results indicated that telepresence consists of interactive experience and product experience. The study also found that product experience is positively related to game players' intention to purchase. The significance of product experience implies the usefulness of interactive advertising within game play for attracting players' intention to purchase the advertised product placed in the creative game design.

Wormhole Attack Detection in Wireless Sensor Networks

The nature of wireless ad hoc and sensor networks makes them very attractive to attackers. One of the most popular and serious attacks in wireless ad hoc networks is the wormhole attack, and most protocols proposed to defend against it rely on positioning devices, synchronized clocks, or directional antennas. This paper analyzes the nature of the wormhole attack and existing defense mechanisms, and then proposes a wormhole detection mechanism based on round-trip time (RTT) and neighbor numbers. The proposed mechanism considers the RTT between two successive nodes and those nodes' neighbor numbers, which are compared with the corresponding values for other pairs of successive nodes. The identification of wormhole attacks rests on two observations. The first is that the transmission time between two nodes affected by a wormhole attack is considerably higher than that between two normal neighboring nodes. The second is that, by introducing new links into the network, the adversary increases the number of neighbors of the nodes within its radius. The system does not require any specific hardware, has good performance with little overhead, and does not consume extra energy. The proposed system is designed within the ad hoc on-demand distance vector (AODV) routing protocol, and its analysis and simulations are performed in the network simulator ns-2.
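A compressed sketch of the two checks described above might look as follows; the threshold rules (comparing each link's RTT and the endpoints' neighbor counts against network-wide medians) are paraphrased assumptions, not the exact AODV implementation evaluated in ns-2.

from statistics import median

def flag_wormhole_links(links, rtt_factor=3.0, nbr_factor=2.0):
    """links: list of dicts with 'rtt' (round-trip time between two successive
    nodes) and 'neighbors' (the larger neighbor count of the two endpoints).
    Returns indices of links that fail both checks."""
    rtt_limit = rtt_factor * median(l["rtt"] for l in links)
    nbr_limit = nbr_factor * median(l["neighbors"] for l in links)

    suspicious = []
    for i, l in enumerate(links):
        # Check 1: RTT between the two successive nodes is abnormally high.
        # Check 2: the endpoints report far more neighbors than typical nodes.
        if l["rtt"] > rtt_limit and l["neighbors"] > nbr_limit:
            suspicious.append(i)
    return suspicious

# Toy usage with assumed measurements: the last link looks like a wormhole.
links = [{"rtt": 0.002, "neighbors": 5}, {"rtt": 0.003, "neighbors": 6},
         {"rtt": 0.002, "neighbors": 4}, {"rtt": 0.020, "neighbors": 15}]
print(flag_wormhole_links(links))   # -> [3]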

Effectiveness and Equity: New Challenges for Social Recognition in Higher Education

Today, higher education worldwide is subordinated to greater institutional control through Quality of Education policies. These include the over-evaluation of all academic activities: students' and professors' performance, educational logistics, and managerial standards for the administration of institutions of higher education, as well as the establishment of imaginaries of excellence and prestige as the foundations on which universities of the twenty-first century will focus their present and future goals and interests. At the same time, higher education systems worldwide are facing a profound crisis of sense and meaning and are undergoing enormous mutations in their identity. Based on a qualitative research approach, this paper shows the social configurations that scholars at universities in Mexico build around the discourse of the Quality of Education, and how these policies put the social recognition of these individuals at risk.

Strategies for Connectivity Configuration to Access e-Learning Resources: Case of Rural Secondary Schools in Tanzania

To address various development challenges, Tanzania is striving to achieve the fourth attribute of its National Development Vision, i.e. to have a well-educated and learned society by the year 2025. One of the most cost-effective methods that can reach a large part of society in a short time is to integrate ICT in education through e-learning initiatives. However, e-learning initiatives are challenged by limited or absent connectivity in the majority of secondary schools, especially those in rural and remote areas. This paper explores the possibility for rural secondary schools to access online e-Learning resources from a centralized e-Learning Management System (e-LMS). The scope is limited to schools that have computers, irrespective of internet connectivity, resulting in two categories of schools: those with internet access and those without. Different connectivity configurations are proposed according to the ICT infrastructure status of the respective schools. However, the majority of rural secondary schools in Tanzania have neither computers nor an internet connection, a challenge that must be addressed if these disadvantaged schools are to benefit from e-Learning initiatives.

Using Spectral Vectors and M-Tree for Graph Clustering and Searching in Graph Databases of Protein Structures

In this paper, we represent protein structures as graphs, so that a protein structure database becomes a graph database. Each graph is represented by a spectral vector: we use the Jacobi rotation algorithm to calculate the eigenvalues of the normalized Laplacian of the graph's adjacency matrix. To measure the similarity between two graphs, we calculate the Euclidean distance between their spectral vectors. To cluster the graphs, we use an M-tree with the Euclidean distance over the spectral vectors; the M-tree can also be used for graph searching in the graph database. Our proposed method was tested on a graph database of 100 graphs representing 100 protein structures downloaded from the Protein Data Bank (PDB), and the results were compared with the SCOP hierarchical structure.
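A minimal NumPy sketch of the graph-to-vector step and the distance computation follows; it uses NumPy's symmetric eigensolver in place of an explicit Jacobi-rotation implementation, and the fixed vector length is an assumption made here for illustration.

import numpy as np

def spectral_vector(A, k=10):
    """Eigenvalues of the normalized Laplacian of adjacency matrix A,
    sorted in descending order and padded/truncated to a fixed length k."""
    deg = A.sum(axis=1)
    with np.errstate(divide="ignore"):
        d_inv_sqrt = np.where(deg > 0, 1.0 / np.sqrt(deg), 0.0)
    L = np.eye(len(A)) - (d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :])
    vals = np.sort(np.linalg.eigvalsh(L))[::-1]   # symmetric eigensolver
    out = np.zeros(k)
    out[: min(k, len(vals))] = vals[: min(k, len(vals))]
    return out

def graph_distance(A1, A2, k=10):
    """Euclidean distance between two graphs' spectral vectors."""
    return np.linalg.norm(spectral_vector(A1, k) - spectral_vector(A2, k))

# Toy usage: a path graph on 4 vertices vs. a cycle on 4 vertices (assumed example).
path = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], dtype=float)
cycle = np.array([[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]], dtype=float)
print(graph_distance(path, cycle))

In the paper the resulting vectors are indexed by an M-tree; the sketch stops at the pairwise distance that such an index would use.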

Analysis of Noise Level Effects on Signal-Averaged Electrocardiograms

Noise level has critical effects on the diagnostic performance of the signal-averaged electrocardiogram (SAECG), because the true starting and end points of the QRS complex can be masked by residual noise and are sensitive to the noise level. Several studies and commercial machines use a fixed number of heart beats (typically between 200 and 600) or a predefined noise level (typically between 0.3 and 1.0 μV) in each of the X, Y and Z leads to perform SAECG analysis. However, the different criteria or methods used to perform SAECG cause discrepancies in noise levels among study subjects. According to the recommendations of the 1991 ESC, AHA and ACC Task Force Consensus Document on the use of SAECG, the determination of QRS onset and offset is closely related to the mean and standard deviation of the noise sample. Hence, this study performs SAECG using consistent root-mean-square (RMS) noise levels among study subjects and analyzes the effects of noise level on SAECG. It also evaluates the differences between normal subjects and chronic renal failure (CRF) patients in the time-domain SAECG parameters. The study subjects comprised 50 normal Taiwanese subjects and 20 CRF patients. During signal-averaged processing, different RMS noise levels were applied to evaluate their effects on three time-domain parameters: (1) filtered total QRS duration (fQRSD), (2) RMS voltage of the last 40 ms of the QRS (RMS40), and (3) duration of the low-amplitude signals below 40 μV (LAS40). The results demonstrated that reducing the RMS noise level increases fQRSD and LAS40, decreases RMS40, and further increases the differences in fQRSD and RMS40 between normal subjects and CRF patients. The SAECG may also become abnormal as the RMS noise level is reduced. In conclusion, it is essential to establish diagnostic criteria for SAECG using consistent RMS noise levels in order to reduce noise level effects.
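A brief illustration of the two quantities the study controls (beat averaging and the RMS noise level estimated over a presumed signal-free window) is given below; the window placement and beat counts are assumptions for the sketch, not the clinical protocol.

import numpy as np

def signal_average(beats):
    """beats: (n_beats, n_samples) array of aligned QRS-triggered sweeps."""
    return np.asarray(beats).mean(axis=0)

def rms_noise(averaged_beat, noise_window):
    """RMS level over a window assumed to contain no cardiac signal."""
    seg = averaged_beat[noise_window]
    return np.sqrt(np.mean(seg ** 2))

# Toy usage: averaging more assumed beats lowers the residual RMS noise.
rng = np.random.default_rng(3)
template = np.r_[np.sin(np.linspace(0, np.pi, 80)), np.zeros(40)]  # quiet tail as "noise window"
for n_beats in (50, 200, 600):
    beats = template + 5e-3 * rng.standard_normal((n_beats, template.size))
    avg = signal_average(beats)
    print(n_beats, "beats ->", rms_noise(avg, slice(80, 120)))

The printed values fall roughly as one over the square root of the number of beats, which is why fixing the RMS noise level rather than the beat count standardizes the comparison across subjects.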

Component Based Framework for Authoring and Multimedia Training in Mathematics

New programming technologies allow the creation of components that can be assembled automatically or manually to provide a new experience in understanding and mastering knowledge, or in acquiring skills for a specific knowledge area. The project proposes an interactive framework that permits the creation, combination and utilization of components specific to mathematical training in high schools. The framework's main objectives are:
• authoring lessons by the teacher or the students; all they need are basic skills with Equation Editor (or something similar, or LaTeX), the rest being drag & drop operations, inserting data into a grid, or navigating through menus
• allowing audio presentations of mathematical texts and solving hints (more easily understood by the students)
• offering graphical representations of a mathematical function edited in Equation Editor
• storing learning objects in a database
• storing predefined lessons (efficient for expressions and commands, the rest being calculations; this allows high compression)
• viewing and/or modifying predefined lessons, according to the curricula
The whole framework is centered on a minicompiler for mathematical expressions, which stores code that is later used for different purposes (tables, graphics, and optimisations). Regarding programming technologies, a Visual C# .NET implementation is proposed. New and innovative digital learning objects for mathematics will be developed, capable of interpreting, contextualizing and reacting depending on the architecture in which they are assembled.

An Iterative Method for the Least-squares Symmetric Solution of AXB+CYD=F and its Application

Based on the classical LSQR algorithm for solving the (unconstrained) least-squares problem, an iterative method is proposed for the least-squares symmetric solution of AXB + CYD = E with minimum norm. As an application of this algorithm, an iterative method for the least-squares bisymmetric solution of AXB = E with minimum norm is also obtained. Numerical results are reported that show the efficiency of the proposed methods.
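In standard notation (our reading of the abstract, not the authors' own statement), the target problem can be written as

  \min_{X = X^{T},\; Y = Y^{T}} \left\| A X B + C Y D - E \right\|_{F},

where, among all symmetric minimizers, the pair (X, Y) with the smallest \|X\|_{F}^{2} + \|Y\|_{F}^{2} is selected; the bisymmetric variant additionally requires the solution to be symmetric about both the main diagonal and the anti-diagonal.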

Interface Terminologies: A Case Study on the International Classification of Primary Care

The International Classification of Primary Care (ICPC), which belongs to the WHO Family of International Classifications (WHO-FIC), has a low granularity, which is convenient for describing general medical practice. However, its lack of specificity means it must be used together with an interface terminology. An international survey has been performed, using a questionnaire sent by email to experts from 25 countries, in order to describe the terminologies interfacing with ICPC. Eleven interface terminologies have been identified, developed in Argentina, Australia, Belgium (2), Canada, Denmark, France, Germany, Norway, South Africa, and the Netherlands. Overall, these systems have been poorly assessed so far.

H∞ Approach to Functional Projective Synchronization for Chaotic Systems with Disturbances

This paper presents a method for the functional projective H∞ synchronization problem of chaotic systems with external disturbance. Based on Lyapunov theory and a linear matrix inequality (LMI) formulation, a novel feedback controller is established that not only guarantees stable synchronization of the drive and response systems but also reduces the effect of the external disturbance to within an H∞ norm constraint.
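In the usual formulation (the notation below is an assumption, since the abstract gives no equations), functional projective synchronization requires the error between the response state and a scaling function times the drive state to vanish, while the H∞ constraint bounds the disturbance-to-error gain:

  e(t) = y(t) - \alpha(t)\, x(t), \qquad \int_0^{\infty} e^{T}(t)\, e(t)\, dt \;\le\; \gamma^{2} \int_0^{\infty} w^{T}(t)\, w(t)\, dt,

where x and y are the drive and response states, \alpha(t) is the scaling function, w(t) the external disturbance, and \gamma the prescribed H∞ attenuation level (under zero initial error).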

Web-GIS based Outdoor Education Program for Junior High Schools

This study, focusing on the importance of encouraging outdoor activities for children, proposes and implements a Web-GIS based outdoor education program for junior high schools, which is then evaluated by its users; the Web-GIS is used chiefly because it provides a good tool for information provision and sharing. The conclusions of this study can be summarized in the following two points. (1) A five-step outdoor education program based on Web-GIS was proposed for a "second school" at junior high schools, and was then implemented and evaluated by teachers as users. (2) Based on the results of the teachers' evaluation, it was clear that operating the Web-GIS based outdoor education program by themselves alone is difficult owing to their lack of knowledge of Web-GIS, and that support staff who can effectively utilize Web-GIS are essential.

Generating Concept Trees from Dynamic Self-organizing Map

The self-organizing map (SOM) provides both clustering and visualization capabilities in data mining. Dynamic self-organizing maps such as the Growing Self-Organizing Map (GSOM) have been developed to overcome the fixed-structure problem of SOM and enable better representation of the discovered patterns. However, when mining large datasets or historical data, a hierarchical structure is also useful for viewing cluster formation at different levels of abstraction. In this paper, we present a technique to generate concept trees from the GSOM. The formation of trees from different GSOM spread factor values is also investigated and the quality of the trees analyzed. The results show that concept trees can be generated from the GSOM, thus eliminating the need to re-cluster the data from scratch to obtain a hierarchical view of the data under study.
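For context, in the standard GSOM formulation the spread factor SF ∈ (0, 1) controls map growth through the growth threshold

  GT = -D \ln(SF),

where D is the dimensionality of the data (this relation is recalled from the GSOM literature rather than stated in the abstract); smaller spread factors raise the threshold and yield coarser maps, which is why trees built at different SF values correspond to different levels of abstraction.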

Development of a 3D Mathematical Model for a Doxorubicin Controlled Release System using Pluronic Gel for Breast Cancer Treatment

Breast cancer is the second most frequent cancer in women, after cervical cancer. Surgery is the most common treatment for breast cancer, followed by chemotherapy as a treatment of choice; although effective, chemotherapy causes serious side effects. Controlled-release drug delivery is an alternative method to improve the efficacy and safety of the treatment. It can keep the drug concentration within tumor tissue between the minimum effective concentration (MEC) and the minimum toxic concentration (MTC), reducing damage to normal tissue and side effects. Because in vivo experiments on such a system can be time-consuming and labor-intensive, a mathematical model is desirable for studying the effects of important parameters before the experiments are performed. Here, we describe a 3D mathematical model that predicts the release of doxorubicin from Pluronic gel for treating human breast cancer. This model can, ultimately, be used to design the in vivo experiments effectively.
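The abstract does not state the governing equations; a 3D controlled-release model of this kind is commonly built on Fick's second law for the drug concentration C(x, t) in the gel and surrounding tissue, given here only as an indicative assumption:

  \frac{\partial C}{\partial t} = \nabla \cdot \left( D \,\nabla C \right),

where D is the (possibly region-dependent) diffusion coefficient of doxorubicin, supplemented where appropriate by elimination or binding terms in the tissue region.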

Are PEG Molecules a Universal Protein Repellent?

Poly(ethylene glycol) (PEG) molecules attached to surfaces have shown high potential as protein repellents owing to their flexibility and high water solubility. A quartz crystal microbalance recording frequency and dissipation changes (QCM-D) was used to study the adsorption, from aqueous solutions, of lysozyme and α-lactalbumin (the latter with and without calcium) onto modified stainless steel surfaces. Surfaces were coated with poly(ethylene imine) (PEI) and silicate before PEG molecules were grafted on. Protein adsorption was also performed on the bare stainless steel surface as a control. All adsorptions were conducted at 23°C and pH 7.2. The results showed that the presence of PEG molecules significantly reduced the adsorption of lysozyme and α-lactalbumin (with calcium) onto the stainless steel surface. By contrast, and unexpectedly, PEG molecules enhanced the adsorption of α-lactalbumin (without calcium). It is suggested that, for this latter observation, PEG–α-lactalbumin hydrophobic interaction plays a dominant role, leading to protein aggregation at the surface. The findings also lead to the general conclusion that PEG molecules are not a universal protein repellent. PEG-on-PEI surfaces were better at inhibiting the adsorption of lysozyme and α-lactalbumin (with calcium) than PEG-on-silicate surfaces.

Development of Mechanical Properties of Self-Compacting Concrete Containing Rice Husk Ash

Self-compacting concrete (SCC), a new kind of high-performance concrete (HPC), was first developed in Japan in 1986. The development of SCC has made the casting of densely reinforced sections and mass concrete convenient and has minimized noise. Fresh SCC flows into formwork and around obstructions under its own weight, filling it completely and self-compacting (without any need for vibration), without segregation or blocking. The elimination of the need for compaction leads to better-quality concrete and a substantial improvement of working conditions. SCC mixes generally have a much higher content of fine fillers, including cement, and produce concrete with excessively high compressive strength, which restricts their field of application to special concrete only. Using SCC mixes in general concrete construction practice requires low-cost materials to make inexpensive concrete. Rice husk ash (RHA) has been used as a highly reactive pozzolanic material to improve the microstructure of the interfacial transition zone (ITZ) between the cement paste and the aggregate in self-compacting concrete. Mechanical experiments on RHA-blended Portland cement concretes revealed that, in addition to the pozzolanic reactivity of RHA (chemical aspect), the particle grading (physical aspect) of the cement and RHA mixtures also exerted a significant influence on the blending efficiency. The scope of this research was to determine the usefulness of RHA in the development of economical self-compacting concrete, the cost of materials being decreased by reducing the cement content and using a waste material such as rice husk ash instead. This paper presents a study on the development of the mechanical properties, up to 180 days, of self-compacting and ordinary concretes with rice husk ash (RHA) from a rice paddy milling industry in Rasht (Iran). Two replacement percentages of cement by RHA, 10% and 20%, and two water/cementitious material ratios (0.40 and 0.35) were used for both self-compacting and normal concrete specimens. The results are compared with those of self-compacting concrete without RHA in terms of compressive strength, flexural strength and modulus of elasticity. It is concluded that RHA has a positive effect on the mechanical properties at ages beyond 60 days. Based on the results, the self-compacting concrete specimens show higher values than the normal concrete specimens in all tests except modulus of elasticity, and specimens with 20% replacement of cement by RHA show the best performance.

Architectural Stratification and Woody Species Diversity of a Subtropical Forest Grown in a Limestone Habitat in Okinawa Island, Japan

The forest stand consisted of four layers. The species composition of the third and bottom layers was very similar, whereas the top layer and the lower three layers were almost mutually exclusive. The values of Shannon's index H' and Pielou's index J' tended to increase from the bottom layer upward, except for the H' value of the top layer. The values of H' and J' were 4.21 bits and 0.73, respectively, for the total stand. The high woody species diversity of the forest depended on large trees in the upper layers, a trend different from that of a subtropical evergreen broadleaf forest grown in a silicate habitat in the northern part of Okinawa Island. The spatial distributions of trees in the third and bottom layers overlapped, whereas the top layer was independent of, or slightly exclusive with, the lower three layers. Mean tree weight in each layer decreased from the top toward the bottom layer, whereas the corresponding tree density increased from the top downward. This relationship is analogous to the self-thinning process in plant populations.
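For reference, the two diversity measures cited are conventionally computed as follows (standard definitions, restated here rather than taken from the paper):

  H' = -\sum_{i=1}^{S} p_i \log_2 p_i , \qquad J' = \frac{H'}{\log_2 S},

where p_i is the relative abundance of species i (for example, relative density or basal area within a layer) and S is the number of species; H' is expressed in bits when base-2 logarithms are used, consistent with the 4.21-bit value reported above.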