Virtual Reality Models used on the Visualization of Construction Activities in Civil Engineering Education

Three-dimensional geometric models have been used to present architectural and engineering works, showing their final configuration. When the clarification of a detail or the constitution of a construction step is needed, however, these models are not appropriate, as they do not allow the observation of the construction progress of a building. Models that can dynamically present changes in the building geometry are a good support for project development. Techniques of geometric modeling and virtual reality were used to obtain models that visually simulate the construction activity. The applications illustrate the construction of a cavity wall and of a bridge. These models allow the visualization of the physical progression of the work following a planned construction sequence, the observation of details of the shape of every component of the works, and support the study of the type and method of operation of the equipment applied in the construction. These models present distinct advantages as educational aids in first-degree courses in Civil Engineering. The use of Virtual Reality techniques in the development of educational applications brings new perspectives to the teaching of subjects related to the field of civil construction.

Determination of Non-Uniform Sinusoidal Microstrip Leaky-Wave Antenna Radiating Performance in the Millimeter Band

We consider a non-uniform microstrip leaky-wave antenna implemented on a dielectric waveguide with a sinusoidal profile of periodic metallic grating. A non-uniform distribution of the attenuation constant α along the propagation axis optimizes the radiating characteristics and performance of such antennas. The method developed here is based on an integral method in which the admittance-operator formalism is combined with a BKW (WKB) approximation. First, the effect of the modeling on the modal analysis of complex waves is studied in detail. Then, the BKW model is used for the dispersion analysis of the antenna of interest. Following antenna theory, a forced continuity of the leaky-wave magnitude is imposed at the discontinuities of the non-uniform structure. To test the validity of our dispersion analysis, computed radiation patterns are presented and compared in the millimeter band.
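
For background, the relations below summarize how the propagation constant obtained from such a dispersion analysis maps to the radiation behaviour of a periodic leaky-wave antenna. They are standard textbook expressions, not formulas reproduced from this work; p denotes the grating period, k_0 the free-space wavenumber and z the position along the propagation axis.

```latex
% Standard periodic leaky-wave antenna relations (textbook background,
% not taken from the paper itself).
\begin{align}
  \beta_n &= \beta_0 + \frac{2\pi n}{p}, \qquad n = 0,\pm 1,\pm 2,\dots \\
  \sin\theta_m &\approx \frac{\beta_{-1}(z)}{k_0}
    \quad \text{(beam direction of the radiating $n=-1$ space harmonic)} \\
  |A(z)| &\propto \exp\!\left(-\int_0^{z}\alpha(\zeta)\,\mathrm{d}\zeta\right)
    \quad \text{(aperture illumination fixed by the $\alpha$ profile)}
\end{align}
```

This is why a non-uniform α distribution can be used to shape the aperture illumination and hence the radiation pattern.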

An Implementation of Stipple Operations

Stipples are desired for pattern fills and transparency effects. However, some graphics standards, including OpenGL ES 1.1 and 2.0, omit this feature. We present details of how to provide line stipples and polygon stipples by combining texture mapping and alpha blending functions. We start from the OpenGL-specified stipple-related API functions. The mathematical transformations needed to obtain the correct texture coordinates are explained. The overall algorithm is then presented, followed by its implementation results. We implemented both line and polygon stipples, and verified the results with conformance test routines.
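
As a rough illustration of the texture-based approach, the sketch below expands a 16-bit glLineStipple-style pattern into a one-dimensional alpha texture and computes the repeating texture coordinate along a line; fragments with zero alpha are then removed by alpha blending or discard. It is a minimal Python/NumPy sketch of the idea, not the authors' implementation, and the function names are ours.

```python
# A minimal sketch (not the authors' code) of the texture-based stipple idea:
# a 16-bit glLineStipple-style pattern is expanded into a one-dimensional
# alpha texture; fragments whose alpha is 0 are later discarded by alpha
# blending/testing, which reproduces the gaps of the stipple pattern.
import numpy as np

def stipple_pattern_to_alpha_texture(pattern: int, factor: int) -> np.ndarray:
    """Expand a 16-bit stipple pattern into a 1-D alpha texture.

    Bit 0 of `pattern` is used first (as in glLineStipple), and every bit
    is repeated `factor` times, so the texture has 16 * factor texels.
    """
    texels = []
    for bit in range(16):                      # walk the pattern LSB first
        on = (pattern >> bit) & 1
        texels.extend([255 if on else 0] * factor)
    return np.asarray(texels, dtype=np.uint8)  # upload as an alpha texture

def line_stipple_texcoord(pixel_distance: float, factor: int) -> float:
    """Texture coordinate along the line, assuming GL_REPEAT wrapping:
    one full pattern period corresponds to 16 * factor pixels."""
    return pixel_distance / (16.0 * factor)

if __name__ == "__main__":
    tex = stipple_pattern_to_alpha_texture(0x00FF, factor=2)   # dashed line
    print(tex.size, tex[:8], line_stipple_texcoord(48.0, 2))
```

For polygon stipples the same idea applies, with the 32×32 window-aligned pattern expanded into a two-dimensional alpha texture addressed by window coordinates.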

Towards an Enhanced Stochastic Simulation Model for Risk Analysis in Highway Construction

Over the years, there has been a growing trend towards quality-based specifications in highway construction. In many Quality Control/Quality Assurance (QC/QA) specifications, the contractor is primarily responsible for quality control of the process, whereas the highway agency is responsible for acceptance testing of the product. A cooperative investigation was conducted in Illinois over several years to develop a prototype End-Result Specification (ERS) for asphalt pavement construction. The final characteristics of the product are stipulated in the ERS, and the contractor is given considerable freedom in achieving those characteristics. The risk for the contractor or agency depends on how the acceptance limits and processes are specified. Stochastic simulation models are very useful in estimating and analyzing payment risk in ERS systems, and they form an integral part of Illinois' prototype ERS system. This paper describes the development of an innovative methodology to estimate the variability components in in-situ density, air voids and asphalt content data from ERS projects. The information gained from this is crucial for simulating these ERS projects to estimate and analyze the payment risks associated with asphalt pavement construction. However, these methods require at least two parties to conduct tests on all the split samples obtained according to the sampling scheme prescribed in the ERS currently implemented in Illinois.
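
To make the role of simulation concrete, the sketch below runs a simple Monte Carlo experiment for a percent-within-limits (PWL) acceptance scheme: lot-to-lot and testing variability are sampled, a pay factor is computed for each simulated lot, and the distribution of pay factors quantifies the payment risk. The acceptance limits, variance components and linear pay equation are illustrative assumptions, not the Illinois ERS provisions.

```python
# A hedged Monte Carlo sketch of how payment risk can be simulated in an
# ERS-type acceptance scheme.  The limits, pay equation and the split of
# variability into process and testing components are illustrative only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)

def simulate_pay_factor(target=4.0, lsl=3.0, usl=5.0,
                        sigma_process=0.45, sigma_test=0.15,
                        n_samples=4, n_lots=10_000):
    """Estimate the distribution of a PWL-based pay factor (e.g. for air voids)."""
    true_lot = rng.normal(target, sigma_process, size=n_lots)            # lot means
    tests = rng.normal(true_lot[:, None], sigma_test,
                       size=(n_lots, n_samples))                          # acceptance tests
    m, s = tests.mean(axis=1), tests.std(axis=1, ddof=1)
    pwl = 100.0 * (norm.cdf((usl - m) / s) - norm.cdf((lsl - m) / s))     # percent within limits
    return 55.0 + 0.5 * pwl                                               # illustrative pay equation

pf = simulate_pay_factor()
print(f"mean pay factor {pf.mean():.1f}%, P(pay < 100%) = {(pf < 100).mean():.2f}")
```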

DNA Computing for an Absolute 1-Center Problem: An Evolutionary Approach

Deoxyribonucleic Acid (DNA) computing has emerged as an interdisciplinary field that draws together chemistry, molecular biology, computer science and mathematics. In this paper, the possibility of DNA-based computing to solve an absolute 1-center problem by molecular manipulations is presented. To our knowledge, this is the first attempt to solve such a problem by a DNA-based computing approach. Since part of the procedure involves shortest-path computation, research on DNA computing for the shortest-path Travelling Salesman Problem (TSP) is reviewed. These approaches are studied, and the most appropriate one is adapted in designing the computation procedures. The DNA-based computation is designed so that every path is encoded by oligonucleotides and the path's length is directly proportional to the length of the oligonucleotides. Using these properties, gel electrophoresis is performed in order to separate the respective DNA molecules according to their length. One expectation arising from this paper is that it is possible to verify instances of the absolute 1-center problem using DNA computing through laboratory experiments.
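
For comparison with the molecular procedure, the absolute 1-center problem itself can be stated and solved conventionally on a small graph: the center may lie anywhere along an edge, and the objective is to minimise the maximum shortest-path distance to all vertices. The sketch below is an ordinary (silicon) Python solution included only to illustrate the problem; the graph data and helper names are ours.

```python
# A conventional sketch of the absolute 1-center problem the paper encodes
# with oligonucleotides: the center may lie anywhere on an edge, and we
# minimise the maximum shortest-path distance to the vertices.
from itertools import product

def all_pairs_shortest_paths(vertices, edges):
    """Floyd-Warshall over an undirected weighted graph given as {(u, v): length}."""
    INF = float("inf")
    d = {(a, b): (0 if a == b else INF) for a in vertices for b in vertices}
    for (u, v), w in edges.items():
        d[u, v] = d[v, u] = min(d[u, v], w)
    for k, i, j in product(vertices, repeat=3):     # k varies slowest, as required
        if d[i, k] + d[k, j] < d[i, j]:
            d[i, j] = d[i, k] + d[k, j]
    return d

def absolute_1_center(vertices, edges):
    d = all_pairs_shortest_paths(vertices, edges)
    best = (float("inf"), None)
    for (u, v), length in edges.items():
        # candidate positions: edge endpoints plus intersections of the
        # increasing branch d(u,w1)+x with the decreasing branch d(v,w2)+L-x
        xs = {0.0, length}
        for w1, w2 in product(vertices, repeat=2):
            x = (d[v, w2] + length - d[u, w1]) / 2.0
            if 0.0 <= x <= length:
                xs.add(x)
        for x in xs:
            ecc = max(min(d[u, w] + x, d[v, w] + length - x) for w in vertices)
            if ecc < best[0]:
                best = (ecc, ((u, v), x))
    return best   # (radius, (edge, offset from u))

if __name__ == "__main__":
    V = ["a", "b", "c", "d"]
    E = {("a", "b"): 4.0, ("b", "c"): 3.0, ("c", "d"): 2.0, ("a", "d"): 5.0}
    print(absolute_1_center(V, E))
```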

A Hybrid Fuzzy AGC in a Competitive Electricity Environment

This paper presents a new Hybrid Fuzzy (HF) PID-type controller based on Genetic Algorithms (GAs) for the solution of the Automatic Generation Control (AGC) problem in a deregulated electricity environment. For a fuzzy rule based control system to perform well, the fuzzy sets must be carefully designed. A major problem plaguing the effective use of this method is the difficulty of accurately constructing the membership functions, because this is a computationally expensive combinatorial optimization problem. On the other hand, GAs emulate biological evolutionary theory to solve complex optimization problems by using directed random searches to derive a set of optimal solutions. For this reason, the membership functions are tuned automatically using a modified GA based on the hill-climbing method. The motivation for using the modified GA is to reduce the fuzzy system design effort and to take large parametric uncertainties into account. The global optimum is guaranteed by the proposed method, and the speed of the algorithm's convergence is greatly improved. This newly developed control strategy combines the advantages of GAs and fuzzy system control techniques and leads to a flexible controller with a simple structure that is easy to implement. The proposed GA-based HF (GAHF) controller is tested on a three-area deregulated power system under different operating conditions and contract variations. The results of the proposed GAHF controller are compared with those of a Multi Stage Fuzzy (MSF) controller, a robust mixed H2/H∞ controller and classical PID controllers through several performance indices to illustrate its robust performance for a wide range of system parameters and load changes.
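
The following sketch illustrates, in generic form, the tuning loop described above: membership-function parameters are encoded as a real-valued chromosome and optimised by a GA whose mutation step is a hill-climbing local search. The cost function is only a placeholder for the AGC performance index used in the paper, and the operator choices are ours.

```python
# A minimal, generic sketch of GA tuning of membership-function parameters
# with a hill-climbing mutation.  The cost function is a stand-in for the
# AGC performance index, not the paper's model.
import numpy as np

rng = np.random.default_rng(1)

def cost(params: np.ndarray) -> float:
    """Placeholder performance index (e.g. an ITAE criterion in the paper)."""
    centres = np.sort(params)                       # membership-function centres
    return float(np.sum((centres - np.linspace(-1, 1, centres.size)) ** 2))

def hill_climb(x, step=0.05, tries=10):
    """Local search used as the 'modified' mutation operator."""
    best, best_c = x, cost(x)
    for _ in range(tries):
        cand = best + rng.normal(0.0, step, size=x.size)
        c = cost(cand)
        if c < best_c:
            best, best_c = cand, c
    return best

def ga_tune(n_params=5, pop_size=30, generations=60):
    pop = rng.uniform(-1.5, 1.5, size=(pop_size, n_params))
    for _ in range(generations):
        fitness = np.array([cost(ind) for ind in pop])
        order = np.argsort(fitness)
        parents = pop[order[: pop_size // 2]]                  # truncation selection
        a = parents[rng.integers(len(parents), size=pop_size)]
        b = parents[rng.integers(len(parents), size=pop_size)]
        w = rng.random((pop_size, 1))
        children = w * a + (1 - w) * b                          # arithmetic crossover
        pop = np.array([hill_climb(c) for c in children])       # hill-climbing mutation
    best = min(pop, key=cost)
    return np.sort(best), cost(best)

print(ga_tune())
```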

Analysis of Physicochemical Properties on Prediction of R5, X4 and R5X4 HIV-1 Coreceptor Usage

Bioinformatics methods for predicting T cell coreceptor usage from the HIV-1 membrane protein sequence are investigated. In this study, we aim to propose an effective prediction method for the three-class classification problem of CXCR4 (X4), CCR5 (R5) and CCR5/CXCR4 (R5X4) usage. We investigate the coreceptor prediction problem as follows: 1) we propose a feature set of informative physicochemical properties which, used in conjunction with an SVM, achieves a high prediction test accuracy of 81.48%, compared with an existing method with an accuracy of 70.00%; 2) we establish a large up-to-date data set, increasing the size from 159 to 1225 sequences, to verify the proposed prediction method, where the mean test accuracy is 88.59%; and 3) we analyze the set of 14 informative physicochemical properties to further understand the characteristics of HIV-1 coreceptors.
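
A hedged sketch of this kind of pipeline is given below: each sequence is mapped to averaged physicochemical property values and an SVM is trained on the resulting feature vectors. The two properties shown (hydropathy and net charge) and the toy sequences are illustrative only; the paper's method selects 14 informative properties and uses real coreceptor-labelled data.

```python
# Illustrative physicochemical-property + SVM pipeline (toy data, not the
# paper's feature set or sequences).
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

HYDROPATHY = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "K": -3.9, "H": -3.2,
              "I": 4.5, "L": 3.8, "G": -0.4, "S": -0.8, "T": -0.7}
CHARGE = {"R": 1.0, "K": 1.0, "D": -1.0, "E": -1.0}

def features(seq: str) -> list:
    """Average property values over the sequence (a simplified encoding)."""
    n = len(seq)
    return [sum(HYDROPATHY.get(a, 0.0) for a in seq) / n,
            sum(CHARGE.get(a, 0.0) for a in seq) / n]

# toy sequences with R5 / X4 / R5X4 labels (placeholders, not real data)
seqs = ["TRKSIRIG", "TRKGIRLG", "TKKSIHIG", "ARKSIHIG",
        "TRKRIRIG", "TRRKIRKG", "NRKSITKG", "GRKAIRIG"]
labels = ["R5", "R5", "R5", "R5X4", "X4", "X4", "R5X4", "R5"]

X = [features(s) for s in seqs]
clf = SVC(kernel="rbf", C=10.0, gamma="scale")
print(cross_val_score(clf, X, labels, cv=2).mean())
```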

Elections Management Information Communication System Voter Ballot

The work presented here deals with a new scope of application of information and communication technologies for the improvement of the election process in a biased environment. We introduce a new concept for the construction of an information-communication system for an election participant. It consists of four main components: software, physical infrastructure, structured information and trained staff. The structured information is the basis of the whole system; it is the collection of all possible events (irregularities among them) at the polling stations, which are structured in special templates and forms and integrated in mobile devices. The software represents a package of analytic modules operating on a dynamic database. The application of modern communication technologies facilitates the immediate exchange of information and of relevant documents between the polling stations and the server of the participant. No less important is the training of the staff for the proper functioning of the system; an e-training system with various modules should be applied in this respect. The presented methodology is primarily focused on election processes in countries of emerging democracies. It can be regarded as a tool for the monitoring of the election process by political organizations and as one of the instruments to foster the spread of democracy in these countries.

Use of Fuzzy Edge Image in Block Truncation Coding for Image Compression

An image compression method has been developed using a fuzzy edge image together with the basic Block Truncation Coding (BTC) algorithm. The fuzzy edge image was validated against classical edge detectors, using the results of the well-known Canny edge detector as a reference, prior to its use in the proposed method. The bit plane generated by the conventional BTC method is replaced with a fuzzy bit plane generated by the logical OR operation between the fuzzy edge image and the corresponding conventional BTC bit plane. The input image is encoded with the block mean, the block standard deviation and the fuzzy bit plane. The proposed method was tested with 8 bits/pixel test images of size 512×512 and found to be superior, with better Peak Signal to Noise Ratio (PSNR), compared to the conventional BTC and adaptive bit-plane selection BTC (ABTC) methods. The raggedness, jagged appearance and ringing artifacts at sharp edges are greatly reduced in images reconstructed by the proposed method with the fuzzy bit plane.
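
The sketch below shows the block-level mechanics: the conventional BTC bit plane (pixels at or above the block mean) is OR-ed with the binarised edge image, and each block is stored as its mean, standard deviation and bit plane, with the two reconstruction levels chosen to preserve the first two moments. The edge image is treated as a given input here; producing it with the fuzzy edge detector is outside this sketch.

```python
# Compact sketch of edge-augmented BTC; the fuzzy edge image is assumed given.
import numpy as np

def btc_block_encode(block: np.ndarray, edge_block: np.ndarray):
    mean, std = block.mean(), block.std()
    plane = (block >= mean) | (edge_block > 0)     # edge-augmented (fuzzy) bit plane
    return mean, std, plane

def btc_block_decode(mean: float, std: float, plane: np.ndarray) -> np.ndarray:
    m, q = plane.size, int(plane.sum())
    if q in (0, m):                                # uniform block: just the mean
        return np.full(plane.shape, mean)
    low = mean - std * np.sqrt(q / (m - q))        # moment-preserving BTC levels
    high = mean + std * np.sqrt((m - q) / q)
    return np.where(plane, high, low)

def btc_image(img: np.ndarray, edges: np.ndarray, bs: int = 4) -> np.ndarray:
    out = np.empty_like(img, dtype=float)
    for r in range(0, img.shape[0], bs):
        for c in range(0, img.shape[1], bs):
            blk, eblk = img[r:r+bs, c:c+bs], edges[r:r+bs, c:c+bs]
            out[r:r+bs, c:c+bs] = btc_block_decode(*btc_block_encode(blk, eblk))
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    image = rng.integers(0, 256, (8, 8)).astype(float)
    edge_map = np.zeros_like(image)                # stand-in for the fuzzy edge image
    print(np.abs(btc_image(image, edge_map) - image).mean())
```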

Hutchinson-Barnsley Operator in Intuitionistic Fuzzy Metric Spaces

The main purpose of this paper is to prove the intuitionistic fuzzy contraction properties of the Hutchinson-Barnsley operator on the intuitionistic fuzzy hyperspace with respect to the Hausdorff intuitionistic fuzzy metrics. We also discuss the relationships between the Hausdorff intuitionistic fuzzy metrics on the intuitionistic fuzzy hyperspaces. Our theorems generalize and extend some recent results related to the Hutchinson-Barnsley operator, from metric spaces to intuitionistic fuzzy metric spaces.
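
For readers unfamiliar with the classical setting, the metric-space definition that the paper generalizes is recalled below; this is the standard formulation, stated for reference rather than quoted from the paper. Given a complete metric space (X, d) and an iterated function system of contractions f_1, ..., f_N, the operator acts on the hyperspace H(X) of nonempty compact subsets:

```latex
% Classical metric-space definition of the Hutchinson-Barnsley operator,
% recalled for reference (the paper works in intuitionistic fuzzy metric spaces).
F(B) \;=\; \bigcup_{n=1}^{N} f_n(B), \qquad B \in \mathcal{H}(X).
```

In the classical case F is a contraction on (H(X), h) with respect to the Hausdorff metric h, so by the Banach principle it has a unique fixed point, the attractor of the system; the paper establishes the analogous contraction property when h is replaced by a Hausdorff intuitionistic fuzzy metric.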

A Novel Nano-Scaled SRAM Cell

To help overcome the limits on the density of conventional SRAMs and the leakage current of SRAM cells in nanoscaled CMOS technology, we have developed a four-transistor SRAM cell. The newly developed CMOS four-transistor SRAM cell uses one word-line and one bit-line during read/write operations. The cell retains its data through leakage current and positive feedback, without a refresh cycle. The new cell is 19% smaller than a conventional six-transistor cell using the same design rules, and its leakage current is 60% smaller than that of a conventional six-transistor SRAM cell. Simulation results in 65 nm CMOS technology show that the new cell operates correctly during read/write operations and in idle mode.

Distributed Load Flow Analysis using Graph Theory

In today's scenario, to meet the enhanced demand imposed by domestic, commercial and industrial consumers, the various operational and control activities of a Radial Distribution Network (RDN) require focused attention. Across RDN research sub-domains such as network reconfiguration, reactive power compensation and economic load scheduling, network performance parameters are usually estimated by an iterative process commonly known as a load (power) flow algorithm. In this paper, a simple mechanism is presented to implement the load flow analysis (LFA) algorithm. The reported algorithm utilizes graph theory principles and is tested on a 69-bus RDN.
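
The backward/forward sweep below is a common way to realize such a graph-based radial load flow and is included only as an illustrative sketch; the bus-numbering convention, the toy 4-bus feeder and the constant-power load model are our assumptions, not the paper's 69-bus test case.

```python
# Backward/forward sweep on a radial feeder (illustrative sketch).
import numpy as np

def radial_load_flow(parent, z, s_load, v_root=1.0 + 0j, tol=1e-8, max_iter=50):
    """Buses are assumed numbered so that parent[i] < i, with bus 0 the slack.

    z[i] is the impedance of the branch feeding bus i, s_load[i] the complex
    constant-power load at bus i (all per unit).
    """
    n = len(parent)
    v = np.full(n, v_root, dtype=complex)
    for _ in range(max_iter):
        i_branch = np.zeros(n, dtype=complex)
        for i in range(n - 1, 0, -1):                 # backward sweep: branch currents
            i_branch[i] += np.conj(s_load[i] / v[i])  # load current injection at bus i
            i_branch[parent[i]] += i_branch[i]        # accumulate towards the root
        v_new = v.copy()
        for i in range(1, n):                         # forward sweep: bus voltages
            v_new[i] = v_new[parent[i]] - z[i] * i_branch[i]
        if np.max(np.abs(v_new - v)) < tol:
            return v_new
        v = v_new
    return v

# 4-bus toy feeder: 0 -> 1, 1 -> 2, 1 -> 3 (illustrative data, not the 69-bus case)
parent = [0, 0, 1, 1]
z = [0, 0.02 + 0.04j, 0.03 + 0.05j, 0.025 + 0.045j]
s_load = [0, 0.10 + 0.05j, 0.08 + 0.04j, 0.06 + 0.03j]
print(np.abs(radial_load_flow(parent, z, s_load)))
```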

Development of Motor and Controller for VVA Module of Gasoline Vehicle

Due to environmental concerns, regulations on automobile fuel economy have recently been strengthened. The market demand for efficient vehicles is growing, and automakers have been putting considerable effort into improving engine fuel efficiency. To improve fuel efficiency, it is necessary to reduce losses or to improve the combustion efficiency of the engine. VVA (Variable Valve Actuation) technology enhances the engine's intake air flow and reduces pumping and mechanical friction losses. VVA technology also allows the appropriate valve lift to be implemented for both low-speed and high-speed engine operation, improving the performance of the engine over the entire operating range. This paper presents a design procedure for a DC motor and drive for a VVA system and shows the validity of the design through experimental results with a prototype.

Multi-matrix Real-coded Genetic Algorithm for Minimising Total Costs in Logistics Chain Network

The importance of supply chain and logistics management has been widely recognised. Effective management of the supply chain can reduce costs and lead times and improve responsiveness to changing customer demands. This paper proposes a multi-matrix real-coded Genetic Algorithm (MRGA) based optimisation tool that minimises the total costs associated with supply chain logistics. Owing to the finite capacity constraints of all parties within the chain, a Genetic Algorithm (GA) often produces infeasible chromosomes during the initialisation and evolution processes. In the proposed algorithm, a chromosome initialisation procedure and crossover and mutation operations that always guarantee feasible solutions are embedded. The proposed algorithm was tested using three sizes of benchmarking dataset of logistic chain networks, which are typical of those faced by most global manufacturing companies. A half-fractional factorial design was carried out to investigate the influence of alternative crossover and mutation operators by varying the GA parameters. The analysis of the experimental results suggests that the quality of the solutions obtained is sensitive to the way in which the genetic parameters and operators are set.
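
One reason feasibility-preserving real-coded operators are possible for this kind of chromosome is that the capacity and demand constraints are linear, so the feasible region is convex and any arithmetic (blend) crossover of two feasible allocation matrices is again feasible. The sketch below illustrates this with a simple greedy random initialisation; it is our illustration, not necessarily the operators used in the paper.

```python
# Hedged illustration of feasibility-preserving real-coded operators for an
# allocation-matrix chromosome under linear capacity/demand constraints.
import numpy as np

rng = np.random.default_rng(2)

def random_feasible_allocation(capacity, demand):
    """Allocate each customer's demand across suppliers without exceeding capacity."""
    x = np.zeros((len(capacity), len(demand)))
    remaining = np.array(capacity, dtype=float)
    for j, d in enumerate(demand):
        while d > 1e-12:
            i = rng.choice(np.flatnonzero(remaining > 1e-12))  # supplier with spare capacity
            q = min(d, remaining[i]) * rng.uniform(0.5, 1.0)
            x[i, j] += q
            remaining[i] -= q
            d -= q
    return x

def arithmetic_crossover(x1, x2):
    """Blend crossover: stays feasible because the feasible region is convex."""
    w = rng.random()
    return w * x1 + (1 - w) * x2

capacity, demand = [60.0, 50.0, 40.0], [30.0, 45.0, 25.0, 20.0]
p1 = random_feasible_allocation(capacity, demand)
p2 = random_feasible_allocation(capacity, demand)
child = arithmetic_crossover(p1, p2)
print(child.sum(axis=0), child.sum(axis=1) <= np.array(capacity) + 1e-9)
```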

Quality Fed-Batch Bioprocess Control: A Case Study

Bioprocesses are known to be difficult to control because their dynamic behavior is highly nonlinear and time-varying, in particular when they operate in fed-batch mode. The research objective of this study was to develop an appropriate control method for a complex bioprocess and to implement it on a laboratory plant. Hence, an intelligent control structure has been designed in order to produce biomass and to maximize the specific growth rate.

Knowledge Based Wear Particle Analysis

The paper describes a knowledge based system for the analysis of microscopic wear particles. Wear particles contained in lubricating oil carry important information concerning machine condition, in particular the state of wear. Experts (tribologists) in the field extract this information to monitor the operation of the machine and ensure safety, efficiency, quality, productivity, and economy of operation. This procedure is not always objective and it can also be expensive. The aim is to classify these particles according to their morphological attributes of size, shape, edge detail, thickness ratio, color, and texture, and, using this classification, to predict wear failure modes in engines and other machinery. The attribute knowledge links human expertise to the devised Knowledge Based Wear Particle Analysis System (KBWPAS). The system provides an automated and systematic approach to wear particle identification which is linked directly to the wear processes and modes that occur in machinery. This brings consistency to wear judgment and prediction, which leads to standardization and less dependence on tribologists.

Dynamically Monitoring Production Methods for Identifying Structural Changes Relevant to Logistics

Due to the growing dynamics and complexity of the market environment, production enterprises in particular are faced with new logistic challenges. Moreover, it is in this dynamic environment that the Logistic Operating Curve Theory reaches its limits as a method for describing the correlations between the logistic objectives. In order to convert this theory into a method for dynamically monitoring production, this paper introduces methods for reliably and quickly identifying structural changes relevant to logistics.

Intuitionistic Fuzzy Multisets and Their Application in Medical Diagnosis

In this paper a new concept named the Intuitionistic Fuzzy Multiset is introduced. The basic operations on Intuitionistic Fuzzy Multisets, such as union, intersection, addition and multiplication, are discussed. An application of Intuitionistic Fuzzy Multisets to a medical diagnosis problem using a distance function is discussed in detail.
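
A hedged sketch of how such a distance-based diagnosis can look is given below: each symptom carries multiset sequences of membership and non-membership degrees, a normalised Hamming-type distance compares the patient with each diagnosis prototype, and the diagnosis at minimum distance is chosen. The particular distance formula and the toy data are illustrative, not taken from the paper.

```python
# Illustrative distance-based diagnosis with intuitionistic fuzzy multisets.
import numpy as np

def ifms_distance(a, b):
    """Normalised Hamming-type distance between two intuitionistic fuzzy multisets.

    `a` and `b` are arrays of shape (n_symptoms, multiplicity, 2) holding
    (membership, non-membership) pairs; hesitation is 1 - mu - nu.
    """
    mu_a, nu_a = a[..., 0], a[..., 1]
    mu_b, nu_b = b[..., 0], b[..., 1]
    pi_a, pi_b = 1 - mu_a - nu_a, 1 - mu_b - nu_b
    diff = np.abs(mu_a - mu_b) + np.abs(nu_a - nu_b) + np.abs(pi_a - pi_b)
    return diff.sum() / (2 * a.shape[0] * a.shape[1])

# toy example: 2 symptoms, each observed twice (multiplicity 2)
patient = np.array([[[0.8, 0.1], [0.6, 0.2]],
                    [[0.4, 0.4], [0.5, 0.3]]])
diagnoses = {"viral fever": np.array([[[0.7, 0.2], [0.7, 0.2]],
                                      [[0.3, 0.5], [0.3, 0.5]]]),
             "malaria":     np.array([[[0.2, 0.6], [0.2, 0.6]],
                                      [[0.7, 0.1], [0.7, 0.1]]])}
scores = {name: ifms_distance(patient, proto) for name, proto in diagnoses.items()}
print(min(scores, key=scores.get), scores)
```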

Analysis of Key Factors for Formation of Strategic Alliances in Liner Shipping Company: Service Quality Perspective on Asia/Europe Route after Global Economic Crisis

Strategic alliances generally mean cooperation or collaboration between firms pursuing a synergy, where each member hopes that the benefits from the alliance will be greater than those from individual efforts. Past research provides sufficient theories and considerations for alliance formation in the liner shipping market. This research reviews important academic journals from the past decade regarding the most important reasons to form alliances, and explains the motives for alliances and the details of shipping cooperation in the literature review. The paper also empirically investigates the key service quality requirements improved through alliances by using quality function deployment (QFD). Moreover, the research examines well-known shipping reports, shipping consultant websites and recent shipping publications to obtain the viewpoints of executives of several leading carriers among the top 20, in order to assess current strategic alliances on the Asia/Europe route. These comments provide meaningful managerial reasons to consider alliance formation and indicate whether there is any gap between the theories and industrial practice. The analysis of the empirical investigation and of top management's perspective on the current market situation yields meaningful managerial suggestions for evaluating how these theories apply to current strategic alliances.

The Research and Application of M/M/1/N Queuing Model with Variable Input Rates, Variable Service Rates and Impatient Customers

How a business should set its service speeds to make the largest profit is a problem worthy of study, and it is discussed in this paper using queuing theory. An M/M/1/N queuing model with variable input rates, variable service rates and impatient customers is established, and the following results are derived: the stationary distribution of the model; the relationship between the stationary distribution and the probability that there are n customers left in the system when a customer leaves (not counting the departing customer); the busy period of the system; the average operating cycle; the loss probability for customers who do not enter the system on arrival; the mean number of customers who leave the system because of impatience; the loss probability for customers who do not join the queue due to the limited capacity of the system; and many other indicators. This paper also shows that the following claim is not correct: the more customers the business serves, the more profit it will get. Finally, the paper points out the appropriate service speeds the business should maintain in order to make the largest profit.
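
The birth-death machinery behind such a model can be sketched numerically as follows: state-dependent arrival rates model the variable input, state-dependent service rates model the variable service, and a per-customer reneging rate models impatience; the stationary distribution then follows from the balance equations. The particular rate functions below are placeholders, not the paper's model.

```python
# Numerical sketch of a finite-capacity birth-death queue with variable rates
# and impatient (reneging) customers; rate functions are illustrative only.
import numpy as np

def stationary_distribution(N, lam, mu, theta):
    """p[n] for a birth-death queue with capacity N.

    lam(n): arrival rate when n customers are present (n = 0..N-1)
    mu(n):  service rate when n customers are present (n = 1..N)
    theta:  reneging (impatience) rate per waiting customer
    """
    p = np.ones(N + 1)
    for n in range(1, N + 1):
        death = mu(n) + theta * max(n - 1, 0)     # service + impatience departures
        p[n] = p[n - 1] * lam(n - 1) / death
    return p / p.sum()

N, theta = 10, 0.5
lam = lambda n: 4.0 if n < 5 else 2.5             # arrivals slow down when crowded
mu = lambda n: 3.0 + 0.2 * n                      # server speeds up under load
p = stationary_distribution(N, lam, mu, theta)

blocking = p[N]                                   # loss due to finite capacity
mean_waiting = sum(max(n - 1, 0) * p[n] for n in range(N + 1))
renege_rate = theta * mean_waiting                # mean rate of impatient departures
print(f"P(block)={blocking:.4f}, E[waiting]={mean_waiting:.3f}, renege rate={renege_rate:.3f}")
```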