Virtual Reality Classrooms: Strategies for Creating a Social Presence

Delivering course material via a virtual environment benefits today's students because it offers the interactivity, real-time interaction, and social presence that students of all ages have come to expect in our gaming-rich community. It is essential that the Net Generation, also known as Generation Why, be exposed to learning communities that use interactivity to form social and educational connections. As student and professor become interconnected through collaboration and interaction in a virtual learning space, relationships develop and students begin to take on individual identities. With this in mind, the research project was developed to investigate the effect of virtual environments on student satisfaction and on the effectiveness of course delivery. Furthermore, the project was designed both to conduct interactive (real-time) classes in the Virtual Reality (VR) environment and to create archived VR sessions that students can use to retain and review course content.

Fuzzy-Type Clustering for Microarray Data

The main goal of microarray experiments is to quantify the expression of every object on a slide as precisely as possible, with a further goal of clustering the objects. Recently, many studies have discussed clustering issues involving similar patterns of gene expression. This paper presents an application of fuzzy-type methods for clustering DNA microarray data that can be applied to typical comparisons. Clustering and analyses were performed on both microarray and simulated data. The results show that fuzzy-possibilistic c-means clustering substantially improves on the findings obtained by other methods.
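
As a rough illustration of the clustering family involved, the following is a minimal sketch of standard fuzzy c-means on simulated expression profiles; the fuzzy-possibilistic variant the abstract refers to adds a possibility (typicality) term that this sketch omits, and all names and data here are illustrative.

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Standard fuzzy c-means: alternate between updating cluster centers
    and soft membership degrees until the memberships stop changing."""
    rng = np.random.default_rng(seed)
    U = rng.random((X.shape[0], c))
    U /= U.sum(axis=1, keepdims=True)            # memberships sum to 1 per sample
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U_new = 1.0 / (d ** (2.0 / (m - 1.0)))
        U_new /= U_new.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return centers, U

# Toy usage on two simulated groups of expression profiles
X = np.vstack([np.random.randn(20, 5) + 3, np.random.randn(20, 5) - 3])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)                        # hard labels from soft memberships
```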

A “Greedy” Czech Manufacturing Case

The article describes a case study of one of the Czech Republic's medium-sized manufacturing enterprises (ME) where, due to the European financial crisis, production lines had to be redesigned and optimized in order to minimize the total cost of producing goods. We consider the optimization problem of minimizing the total cost of the workload, according to the costs of the possible locations of the workplaces, applying a greedy algorithm and a partial analogy to the Set Packing Problem. The displacement of working tables in a company should follow a one-to-one monotone increasing function for the total cost of producing the goods to be at its minimum. We use a heuristic greedy-algorithm approach to solve this linear optimization problem, regardless of any greediness that may appear, and we apply it in a Czech ME.
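
A minimal sketch of the greedy idea, assuming a hypothetical cost matrix over (table, location) pairs; the paper's actual model and data are not reproduced here.

```python
import numpy as np

def greedy_placement(cost):
    """Greedily assign working tables to workplace locations: repeatedly
    take the cheapest remaining (table, location) pair. A heuristic, so it
    is not guaranteed optimal, mirroring the abstract's caveat."""
    order = np.dstack(np.unravel_index(np.argsort(cost, axis=None), cost.shape))[0]
    used_t, used_l, total, plan = set(), set(), 0.0, {}
    for t, l in order:
        if t not in used_t and l not in used_l:
            plan[int(t)] = int(l)
            total += cost[t, l]
            used_t.add(t)
            used_l.add(l)
    return plan, total

# Hypothetical cost matrix: cost[i, j] = cost of placing table i at location j
cost = np.array([[4.0, 2.0, 8.0], [3.0, 7.0, 5.0], [6.0, 1.0, 9.0]])
plan, total = greedy_placement(cost)
```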

New Wavelet-Based Superresolution Algorithm for Speckle Reduction in SAR Images

This paper describes a novel projection algorithm, the Projection Onto Span Algorithm (POSA), for wavelet-based superresolution and for removing speckle of unknown variance (in the wavelet domain) from Synthetic Aperture Radar (SAR) images. Although POSA works well as a new superresolution algorithm for image enhancement, image metrology and biometric identification, here it is used as a despeckling tool, this being the first time a superresolution algorithm has been used to despeckle SAR images. Specifically, the speckled SAR image is decomposed into wavelet subbands, POSA is applied to the high subbands, and a SAR image is reconstructed from the modified detail coefficients. Experimental results demonstrate that the new method compares favorably to several other despeckling methods on test SAR images.
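
Since POSA itself is specific to this paper, the sketch below only reproduces the surrounding pipeline the abstract names — decompose into wavelet subbands, modify the detail subbands, reconstruct — with simple soft thresholding standing in for the POSA step; wavelet choice, level, threshold and the toy speckle model are all illustrative.

```python
import numpy as np
import pywt

def wavelet_despeckle(img, wavelet="db4", level=2, thresh=0.1):
    """Decompose the speckled image, shrink the high (detail) subbands,
    and reconstruct. Soft thresholding is a stand-in for POSA here."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    approx, details = coeffs[0], coeffs[1:]
    new_details = [
        tuple(pywt.threshold(d, thresh * np.abs(d).max(), mode="soft") for d in lvl)
        for lvl in details
    ]
    return pywt.waverec2([approx] + new_details, wavelet)

# Toy usage with multiplicative (gamma) speckle on a synthetic image
rng = np.random.default_rng(0)
clean = np.zeros((64, 64))
clean[16:48, 16:48] = 1.0
speckled = clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape)
restored = wavelet_despeckle(speckled)
```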

Instance-Based Ontology Matching Using Different Kinds of Formalism

Ontology matching is a task needed in various applications, for example for comparison or merging purposes. In the literature, many algorithms solving the matching problem can be found, but most of them do not consider instances at all. Mappings are determined by calculating the string similarity of labels, by recognizing linguistic word relations (synonyms, subsumptions, etc.) or by analyzing the (graph) structure. Given that instances are often modeled within the ontology and that the set of instances describes the meaning of the concepts better than their meta-information, instances should definitely be incorporated into the matching process. In this paper, several novel instance-based matching algorithms are presented which enhance the quality of matching results obtained with common concept-based methods. Different kinds of formalisms are used to classify concepts on account of their instances and finally to compare the concepts directly.

Keywords: Instances, Ontology Matching, Semantic Web
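
One of the simplest instance-based matchers, shown here as a hedged sketch: score two concepts by the Jaccard overlap of their instance sets. The concept names and instances below are hypothetical; the paper's formalisms are more elaborate.

```python
def instance_overlap(c1_instances, c2_instances):
    """Jaccard similarity of two concepts' instance sets:
    |intersection| / |union|, in [0, 1]."""
    a, b = set(c1_instances), set(c2_instances)
    return len(a & b) / len(a | b) if a | b else 0.0

# Concepts from two ontologies sharing some instances
onto1 = {"Car": {"vw_golf", "ford_focus", "bmw_3"}}
onto2 = {"Automobile": {"vw_golf", "bmw_3", "tesla_s"}}
score = instance_overlap(onto1["Car"], onto2["Automobile"])  # 0.5
```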

Changes in the Research of Crisis

Owing to the interdisciplinary nature of crises, the position of researchers in this field is rather difficult. Very often, traditional research methods cannot be applied there. The article is aimed at the changes in crisis research. It describes the substance of the individual changes and emphasizes the shift in research approaches to the crisis.

Combining Bagging and Boosting

Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data; however, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, in this work we built an ensemble that combines bagging and boosting ensembles, each with 10 sub-classifiers, using a voting methodology. We compared it with simple bagging and boosting ensembles of 25 sub-classifiers, as well as with other well-known combining methods, on standard benchmark datasets, and the proposed technique was the most accurate.
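
A minimal scikit-learn sketch (1.2+ API) of this construction, assuming soft voting over one bagging and one boosting ensemble of 10 sub-classifiers each; the decision-tree base learner, its depth and the dataset are illustrative choices, not the paper's exact setup.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier, VotingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# One bagging and one boosting ensemble, 10 sub-classifiers each,
# combined by (soft) voting as the abstract describes.
base = DecisionTreeClassifier(max_depth=3)
bag = BaggingClassifier(estimator=base, n_estimators=10, random_state=0)
boost = AdaBoostClassifier(estimator=base, n_estimators=10, random_state=0)
combo = VotingClassifier(estimators=[("bag", bag), ("boost", boost)], voting="soft")

X, y = load_breast_cancer(return_X_y=True)
print(cross_val_score(combo, X, y, cv=5).mean())   # accuracy of the voted ensemble
```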

The Islamic Element of Al-‘Adl in Critical Thinking: The Perception of Muslim Engineering Undergraduates in Malaysia

The element of justice, or al-‘adl, in the context of Islamic critical thinking deals with the notion of justice in a thinking process that critically rationalizes the truth in a fair and objective manner, with no irrelevant interference that could jeopardize sound judgment. This Islamic axiological element is vital in technological decision making, as it addresses the issues of religious values and ethics that are primarily set to fulfill the purpose of human life on earth. The main objective of this study was to examine and analyze the perception of Muslim engineering students in Malaysian higher education institutions towards the concept of al-‘adl as an essential element of Islamic critical thinking. The study employed a mixed-methods approach comprising data collected through a questionnaire survey and interview responses. A total of 557 Muslim engineering undergraduates from six Malaysian universities participated in the study. The study generally indicated that Muslim engineering undergraduates in these institutions have a rather good comprehension of and consciousness for al-‘adl, with only slight awareness of the importance of objective thinking. Nonetheless, a few items on the concept implied a comparatively low perception of rational justice in Islam as the means to grasp the ultimate truth.

Acoustic Finite Element Analysis of a Slit Model with Consideration of Air Viscosity

In very narrow pathways, the speed of sound propagation and the phase of sound waves change due to air viscosity. We have developed a new finite element method (FEM) that includes the effects of air viscosity for modeling narrow sound pathways. The method is developed as an extension of the existing FEM for porous sound-absorbing materials. Numerical results for several three-dimensional slit models computed with the proposed FEM are validated against existing calculation methods.

Kernel Parameter Selection for Support Vector Domain Description

Support Vector Domain Description (SVDD) is one of the best-known one-class support vector learning methods, in which the strategy is to use balls defined in the feature space to distinguish a set of normal data from all other possible abnormal objects. As with all kernel-based learning algorithms, its performance depends heavily on the proper choice of the kernel parameter. This paper proposes a new approach to selecting the kernel parameter based on maximizing the distance between the gravity centers of the normal and abnormal classes while minimizing the variance within each class. The performance of the proposed algorithm is evaluated on several benchmarks. The experimental results demonstrate the feasibility and effectiveness of the presented method.
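
A sketch of such a selection criterion for an RBF kernel, assuming the feature-space center distance and within-class variances are computed from kernel values alone (the kernel trick); this is a plausible reading of the abstract's criterion, not the authors' exact code, and all data and the candidate grid are illustrative.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel

def center_distance_criterion(X_norm, X_abn, gamma):
    """Score an RBF width: squared feature-space distance between the two
    class gravity centers, minus the within-class variances. For the RBF
    kernel, k(x, x) = 1, so everything follows from kernel means."""
    Kaa = rbf_kernel(X_norm, X_norm, gamma=gamma)
    Kbb = rbf_kernel(X_abn, X_abn, gamma=gamma)
    Kab = rbf_kernel(X_norm, X_abn, gamma=gamma)
    dist2 = Kaa.mean() - 2 * Kab.mean() + Kbb.mean()   # ||mu_a - mu_b||^2
    var_a = 1.0 - Kaa.mean()                           # within-class variance
    var_b = 1.0 - Kbb.mean()
    return dist2 - (var_a + var_b)

rng = np.random.default_rng(0)
X_norm = rng.normal(0, 1, (50, 2))
X_abn = rng.normal(4, 1, (30, 2))
best_gamma = max([0.01, 0.1, 1.0, 10.0],
                 key=lambda g: center_distance_criterion(X_norm, X_abn, g))
```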

Determination of Required Ion Exchange Solution for Stabilizing Clayey Soils with Various PI

Soil stabilization has been widely used to improve soil strength and durability and to prevent erosion and dust generation. Generally, additional materials are used to reduce the problems clayey soils cause in engineering work and to stabilize these soils. The most common materials are lime, fly ash and cement. Although these materials improve soil properties, their use is limited in some cases by cost and by the need for special equipment. One of the best methods for stabilizing clayey soils is neutralizing the clay particles, for which ion exchange materials can be used. An ion exchange solution such as CBR Plus can be used for soil stabilization. One of the most important issues in using CBR Plus is determining the amount of solution required for soils with different properties. In this study, a laboratory experiment is conducted to evaluate the ion exchange capacity of three soils with various plasticity indices (PI) and to determine the amount of CBR Plus solution required for soil stabilization.

Heuristic Set-Covering-Based Postprocessing for Improving the Quine-McCluskey Method

Finding minimal forms of logical functions has important applications in the design of logic circuits. This task is solved by many different methods, but, frequently, they are not suitable for computer implementation. We briefly summarize the well-known Quine-McCluskey method, which gives a unique computing procedure and thus can be simply implemented but, even for simple examples, does not guarantee an optimal solution. Since the Petrick extension of the Quine-McCluskey method does not give a generally usable method for finding an optimum for logical functions with a high number of values, we focus on interpreting the result of the Quine-McCluskey method and show that it represents a set covering problem, which, unfortunately, is an NP-hard combinatorial problem. Therefore, it must be solved by heuristic or approximation methods. We propose an approach based on genetic algorithms and show suitable parameter settings.
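
To make the set covering interpretation concrete, here is a greedy baseline over a hypothetical prime implicant chart; it illustrates the covering step the paper attacks with genetic algorithms, not the paper's GA itself.

```python
def greedy_cover(minterms, implicants):
    """Greedy heuristic for the covering step after Quine-McCluskey:
    repeatedly pick the prime implicant covering the most uncovered minterms."""
    uncovered = set(minterms)
    chosen = []
    while uncovered:
        best = max(implicants, key=lambda name: len(implicants[name] & uncovered))
        if not implicants[best] & uncovered:
            break                           # remaining minterms cannot be covered
        chosen.append(best)
        uncovered -= implicants[best]
    return chosen

# Hypothetical chart: prime implicant -> set of minterms it covers
chart = {"A": {0, 1, 4}, "B": {1, 3}, "C": {3, 4, 5}, "D": {5}}
print(greedy_cover({0, 1, 3, 4, 5}, chart))   # ['A', 'C']
```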

Probabilistic Method of Wind Generation Placement for Congestion Management

Wind farms (WFs) with high levels of penetration are being established in power systems worldwide more rapidly than other renewable resources. The Independent System Operator (ISO), as a policy maker, should propose appropriate places for WF installation in order to maximize the benefits for investors. There is also a possibility of congestion relief from newly installed WFs, which the ISO should take into account when proposing locations. In this context, an efficient WF placement method is proposed in order to reduce the burden on congested lines. Since wind speed is a random variable and load forecasts also contain uncertainties, probabilistic approaches are used for this type of study. An AC probabilistic optimal power flow (P-OPF) is formulated and solved using Monte Carlo Simulation (MCS). In order to reduce computation time, point estimate methods (PEM) are introduced as an efficient alternative to the time-demanding MCS. Subsequently, the optimal WF placement is determined using generation shift distribution factors (GSDF) together with a new parameter, the wind availability factor (WAF). In order to obtain more realistic results, N-1 contingency analysis is employed to find the optimal WF size by means of line outage distribution factors (LODF). The IEEE 30-bus test system is used to demonstrate and compare the accuracy of the proposed methodology.
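
The sketch below illustrates only the probabilistic ingredient of such studies: Monte Carlo sampling of a Weibull-distributed wind speed through a turbine power curve, the kind of sampling that PEM is meant to replace. Cut-in, rated and cut-out speeds and the Weibull parameters are typical textbook values, not the paper's data.

```python
import numpy as np

def wf_power(v, v_ci=3.0, v_r=12.0, v_co=25.0, p_r=1.0):
    """Piecewise per-unit wind-turbine power curve: zero below cut-in and
    above cut-out, linear between cut-in and rated, flat at rated power."""
    v = np.asarray(v)
    p = np.where((v >= v_ci) & (v < v_r), p_r * (v - v_ci) / (v_r - v_ci), 0.0)
    return np.where((v >= v_r) & (v < v_co), p_r, p)

# Monte Carlo simulation of WF output from Weibull-distributed wind speed
rng = np.random.default_rng(0)
v = rng.weibull(2.0, size=100_000) * 8.0    # shape k = 2, scale c = 8 m/s
p = wf_power(v)
print(p.mean(), p.std())                    # expected output and its spread
```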

A Cooperative Weighted Discriminator Energy Detector Technique in Fading Environment

The need in cognitive radio systems for a simple, fast, and independent technique to sense spectrum occupancy has led to the energy detection approach. The energy detector is known for its dependency on noise variation in the system, which is one of its major drawbacks. In this paper, we aim to improve its performance by utilizing weighted collaborative spectrum sensing, similar to the collaborative spectrum sensing methods introduced previously in the literature. These weighting methods improve collaborative spectrum sensing compared to the unweighted case. Two methods are proposed in this paper: the first depends on the channel status between each sensor and the primary user, while the second depends on the value of the energy measured at each sensor.
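
A minimal sketch of the weighted fusion step, assuming the second method (energy-proportional weights); the threshold and sensor readings are illustrative placeholders.

```python
import numpy as np

def weighted_decision(energies, weights, threshold):
    """Cooperative decision: fuse per-sensor energy statistics with
    normalized weights and compare against a global threshold."""
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    return float(np.dot(w, energies)) > threshold

# Energies measured by 4 sensors; weight each sensor by its own measured
# energy, so sensors seeing a stronger signal influence the decision more.
energies = np.array([1.2, 0.9, 3.4, 2.8])
occupied = weighted_decision(energies, weights=energies, threshold=2.0)
```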

A Comparison between Heuristic and Meta-Heuristic Methods for Solving the Multiple Traveling Salesman Problem

The multiple traveling salesman problem (mTSP) can be used to model many practical problems. The mTSP is more complicated than the traveling salesman problem (TSP) because it requires determining which cities to assign to each salesman, as well as the optimal ordering of the cities within each salesman's tour. Previous studies proposed that Genetic Algorithms (GA), Integer Programming (IP) and several neural network (NN) approaches could be used to solve the mTSP. This paper compares results for the mTSP solved with a Genetic Algorithm (GA) and with the Nearest Neighbor Algorithm (NNA). The cities are first clustered into groups using the k-means clustering technique, with the number of groups depending on the number of salesmen. Each group is then solved as an independent TSP with NNA and with GA. It is found that the combination of k-means clustering and NNA is superior to GA in terms of performance (evaluated by the fitness function) and computing time.
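
A minimal sketch of the k-means + NNA decomposition described above; city coordinates and the number of salesmen are illustrative, and the GA comparison arm is omitted.

```python
import numpy as np
from sklearn.cluster import KMeans

def nearest_neighbor_tour(pts):
    """Nearest Neighbor Algorithm: start at the first city and always move
    to the closest unvisited one. Returns the visiting order of pts."""
    unvisited = list(range(1, len(pts)))
    tour = [0]
    while unvisited:
        last = pts[tour[-1]]
        nxt = min(unvisited, key=lambda i: np.linalg.norm(pts[i] - last))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

# mTSP via the abstract's decomposition: k-means splits the cities among
# the salesmen, then each cluster is solved as an independent TSP by NNA.
rng = np.random.default_rng(0)
cities = rng.random((30, 2))
n_salesmen = 3
labels = KMeans(n_clusters=n_salesmen, n_init=10, random_state=0).fit_predict(cities)
tours = [nearest_neighbor_tour(cities[labels == k]) for k in range(n_salesmen)]
# (tour indices refer to each cluster's own point array)
```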

Production Planning and Measuring Method for Non-Patterned Production Systems Using a Stock Cutting Model

Simple methods to plan and measure a non-patterned production system are developed from the basic definition of working efficiency. Processing time is taken as the variable and used to write the production efficiency equation. This equation is then used extensively to develop a planning method for the production of interest, using the one-dimensional stock cutting problem. Application of the developed method shows that production efficiency and production plans can be determined effectively.
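
The abstract does not give its efficiency equation, so the sketch below only illustrates the one-dimensional stock cutting formulation it mentions, using the generic first-fit decreasing heuristic with made-up processing times.

```python
def first_fit_decreasing(jobs, capacity):
    """First-fit decreasing heuristic for one-dimensional stock cutting:
    sort items largest-first, place each into the first bin it fits."""
    bins = []
    for job in sorted(jobs, reverse=True):
        for b in bins:
            if sum(b) + job <= capacity:
                b.append(job)
                break
        else:
            bins.append([job])
    return bins

# Processing times packed into work periods of length 10
print(first_fit_decreasing([7, 5, 4, 3, 2, 2], capacity=10))
# -> [[7, 3], [5, 4], [2, 2]]
```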

Elliptical Feature Extraction Using Eigenvalues of Covariance Matrices, Hough Transform and Raster Scan Algorithms

In this paper, we introduce a new method for elliptical object identification. The proposed method adopts a hybrid scheme consisting of eigenvalues of covariance matrices, the circular Hough transform and Bresenham's raster scan algorithm. In this approach, we use the fact that the large and small eigenvalues of the covariance matrix are associated with the major and minor axial lengths of the ellipse. The center location of the ellipse is identified using the circular Hough transform (CHT), with a sparse matrix technique used to perform the CHT; since sparse matrices squeeze out zero elements and contain only a small number of nonzero elements, they save matrix storage space and computational time. A neighborhood suppression scheme is used to find the valid Hough peaks. The accurate positions of circumference pixels are identified using the raster scan algorithm, which exploits the geometrical symmetry property. This method does not require evaluating tangents or the curvature of edge contours, which are generally very sensitive to noisy working conditions. The proposed method has the advantages of small storage, high speed and accuracy in identifying the features. The new method has been tested on both synthetic and real images, and several experiments have been conducted on various images with considerable background noise to reveal its efficacy and robustness. Experimental results on the accuracy of the proposed method, with comparisons to the Hough transform and its variants and to other tangent-based methods, are reported.
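
The first stage of the hybrid scheme can be illustrated as follows: for points spread uniformly along an ellipse, each semi-axis length is approximately the square root of twice the corresponding covariance eigenvalue. This sketch uses a synthetic ellipse; the paper's CHT and raster scan stages are not reproduced.

```python
import numpy as np

# Synthetic ellipse: semi-axes a, b rotated by theta
t = np.linspace(0, 2 * np.pi, 400, endpoint=False)
a, b, theta = 5.0, 2.0, np.deg2rad(30)
x = a * np.cos(t) * np.cos(theta) - b * np.sin(t) * np.sin(theta)
y = a * np.cos(t) * np.sin(theta) + b * np.sin(t) * np.cos(theta)
pts = np.column_stack([x, y])

# Large/small eigenvalues of the covariance matrix give the axis lengths:
# Var(a*cos t) = a^2 / 2, so each semi-axis is sqrt(2 * eigenvalue).
cov = np.cov(pts, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(cov))[::-1]
semi_major, semi_minor = np.sqrt(2 * eigvals)   # ~5.0 and ~2.0
```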

Seismic Behavior and Capacity/Demand Analyses of a Simply-Supported Multi-Span Precast Bridge

This paper presents the results of an analytical study on the seismic response of a multi-span simply-supported precast bridge in Washington State. The bridge was built in the early 1960s along Interstate 5 and was widened first in 1979 and again in 2001. The primary objective of this research project is to determine the seismic vulnerability of the bridge in order to develop the required retrofit measures. The seismic vulnerability of the bridge is evaluated using two seismic evaluation methods presented in the FHWA Seismic Retrofitting Manual for Highway Bridges, Method C and Method D2. The results of the seismic analyses demonstrate that Method C and Method D2 vary markedly in the information they provide to the bridge designer regarding the vulnerability of the bridge columns.

A Multiresolution Approach for Noisy Texture Classification Based on the Co-occurrence Matrix and First-Order Statistics

The wavelet transform provides several important characteristics that can be used in texture analysis and classification. In this work, an efficient texture classification method that combines concepts from wavelets and co-occurrence matrices is presented. A Euclidean distance classifier is used to evaluate the various classification methods, since a comparative study is essential to determine the ideal method. On this basis, we developed a novel feature set for texture classification and demonstrate its effectiveness.
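
A sketch combining the two ingredients the abstract names: first-order statistics of the wavelet subbands plus co-occurrence (GLCM) descriptors, followed by a Euclidean distance rule. The specific features and parameters are illustrative, not the paper's exact feature set.

```python
import numpy as np
import pywt
from skimage.feature import graycomatrix, graycoprops

def texture_features(img, wavelet="db1"):
    """Feature vector: mean/std of each wavelet subband (first-order
    statistics) plus GLCM descriptors of the approximation subband."""
    approx, (h, v, d) = pywt.dwt2(img, wavelet)
    feats = []
    for band in (approx, h, v, d):
        feats += [band.mean(), band.std()]
    # Quantize the approximation to 8 bits for the co-occurrence matrix
    q = np.uint8(255 * (approx - approx.min()) / (np.ptp(approx) + 1e-12))
    glcm = graycomatrix(q, distances=[1], angles=[0], levels=256, symmetric=True)
    for prop in ("contrast", "homogeneity", "energy"):
        feats.append(graycoprops(glcm, prop)[0, 0])
    return np.array(feats)

# Euclidean distance classifier: assign to the nearest class-mean vector
dist = lambda f, g: np.linalg.norm(f - g)
```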

Multiproject Scheduling in Construction Industry

In this paper, the supply policy and procurement of shared resources in certain kinds of concurrent construction projects are investigated. This is oriented to the problems of holding construction companies that are involved in different projects concurrently and must supply limited resources to several projects while preventing delays in any of them. Limits on transportation vehicles and storage facilities for potential construction materials, as well as the available resources (such as cash or manpower), are examples of factors that considerably affect the management of all the projects. The research includes the investigation of several real multi-storey buildings during their execution periods and a survey of the history of their activities. It is shown that the common resource demand variation curve of the projects may be expanded or displaced to achieve an optimum distribution scheme. This may cause some delay in some projects, but it has minimal influence on the overall execution period of all the projects, while its influence on the procurement cost of the projects is considerable. These observations, based on the investigation of multi-storey buildings built in Iran, are presented in this paper.