A Computer Aided Model for Supporting Design Education

Educating effective architect designers is an important goal of architectural education. But what contributes to students' performance, and to critical and creative thinking, in architectural design education? Besides teaching architecture students how to understand logical arguments, eliminate inadequate solutions and focus on the correct ones, it is also crucial to teach students how to explore ideas and alternative solutions and to seek more than one right answer. This paper focuses on enhancing architectural design education and may provide implications for improving the teaching of design.

A Software of Intrusion Detection Mechanism for Virtual Platforms

Security is an important issue for popular virtual platforms such as virtualization clusters and cloud platforms. Virtualization is a powerful technology for cloud computing services, and virtual machine tools, called hypervisors, bring many benefits: they can quickly deploy all kinds of virtual operating systems on a single platform, control all virtual system resources effectively, reduce the cost of platform deployment, and offer customization, high elasticity and high reliability. However, some important security problems need to be addressed and resolved on virtual platforms, including viruses, malicious programs, illegal operations and intrusion behavior. In this paper, we present Intrusion Detection Mechanism (IDM) software that can not only automatically analyze all system operations against the accounting journal database, but also monitor the system state of virtual platforms.

Non-Destructive Characterisation of Cement Mortar during Carbonation

The objective of this work was to examine the changes in non-destructive properties caused by carbonation of CEM II mortar. Samples of CEM II mortar were prepared and subjected to accelerated carbonation at 20°C, 65% relative humidity and 20% CO2 concentration. We examined the evolution of gas permeability, thermal conductivity, thermal diffusivity, the volume of the solid phase measured by helium pycnometry, and the longitudinal and transverse ultrasonic velocities. The principal contribution of this work is that, apart from gas permeability, the changes in these non-destructive properties had not previously been studied during the carbonation of cementitious materials. These properties are important in predicting and measuring the durability of reinforced concrete in a CO2 environment. The carbonation depth and the porosity accessible to water are also reported in order to explain comprehensively the changes in the non-destructive parameters.

Computations of Bezier Geodesic-like Curves on Spheres

Computing geodesics on a surface is an important problem in many fields. In practice, however, traditional discrete algorithms or numerical approaches can only find a list of discrete points. In 2010 the first author proposed a new, elegant and accurate method, the geodesic-like method, for approximating geodesics on a regular surface. This paper uses this method to compute Bezier geodesic-like curves on spheres.
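
As background for the sphere case, the standard characterization of what such curves approximate may help; the equations below are the textbook description of geodesics on the unit sphere, not the authors' geodesic-like formulation.

```latex
% Standard facts about geodesics on the unit sphere S^2 (background only).
% A unit-speed curve \gamma on S^2 is a geodesic iff its acceleration is
% normal to the surface, which on the sphere reduces to
\ddot{\gamma}(t) + \gamma(t) = 0, \qquad \lVert \gamma(t) \rVert = 1,
% whose solutions are great-circle arcs
\gamma(t) = \cos(t)\, p + \sin(t)\, q, \qquad p \cdot q = 0,\ \lVert p \rVert = \lVert q \rVert = 1.
```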

Molecular Analysis of Somaclonal Variation in Tissue Culture Derived Bananas Using MSAP and SSR Markers

The project was undertaken to determine the effects of modified tissue culture protocols, e.g. age of culture and hormone levels (2,4-D), in generating somaclonal variation, and to investigate the utility of molecular markers (SSR and MSAP) in sorting off-types/somaclones. Results show that somaclonal variation is indeed due to prolonged subculture and high 2,4-D concentration. The resulting variation was observed to be due to a high level of methylation events, specifically cytosine methylation at either the internal or external cytosine, and was identified by methylation-sensitive amplification polymorphism (MSAP). Simple sequence repeats (SSR), on the other hand, were able to associate a marker with a trait of interest. These results therefore show that molecular markers can be an important tool in sorting out variation/mutants at an early stage.

Crash Severity Modeling in Urban Highways Using Backward Regression Method

Identifying and classifying intersections according to severity is very important for the implementation of safety-related countermeasures, and effective models are needed to compare and assess severity. Highway safety organizations have placed intersection safety among their priorities. In spite of significant advances in highway safety, large numbers of crashes with high severities still occur on highways. Investigating the factors that influence crashes enables engineers to carry out calculations aimed at reducing crash severity. Previous studies lacked a model capable of simultaneously illustrating the influence of human factors, road, vehicle, weather conditions and traffic features, including traffic volume and flow speed, on crash severity. Thus, this paper aims to develop models that illustrate the simultaneous influence of these variables on crash severity in urban highways. The models presented in this study were developed as binary logit models, calibrated in SPSS, with the backward regression method used to identify the significant variables. From the obtained results it can be concluded that the main factors increasing crash severity in urban highways are driver age, movement in reverse gear, technical defects of the vehicle, collisions with motorcycles and bicycles, collisions with bridges, frontal impacts, frontal-lateral collisions and multi-vehicle crashes.
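
The abstract names binary logit models with backward elimination in SPSS; a minimal Python sketch of the same idea (using statsmodels instead of SPSS, with purely hypothetical column names) might look like this:

```python
# Hypothetical sketch: binary logit with backward elimination by p-value,
# mirroring SPSS "backward" selection. Column names are illustrative only.
import pandas as pd
import statsmodels.api as sm

def backward_logit(df, response, predictors, alpha=0.05):
    """Drop the least significant predictor until all p-values < alpha."""
    kept = list(predictors)
    while kept:
        X = sm.add_constant(df[kept])
        model = sm.Logit(df[response], X).fit(disp=0)
        pvals = model.pvalues.drop("const")
        worst = pvals.idxmax()
        if pvals[worst] < alpha:      # every remaining variable is significant
            return model, kept
        kept.remove(worst)            # eliminate the weakest variable and refit
    return None, []

# df = pd.read_csv("crashes.csv")   # severity: 1 = severe, 0 = non-severe
# model, kept = backward_logit(df, "severity",
#                              ["driver_age", "reverse_gear", "vehicle_defect",
#                               "frontal_impact", "multi_vehicle"])
# print(model.summary())
```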

A Hybrid Approach to Fault Detection and Diagnosis in a Diesel Fuel Hydrotreatment Process

It is estimated that abnormal conditions cost US process industries around $20 billion in annual losses. The hydrotreatment (HDT) of diesel fuel in petroleum refineries is a conversion process that yields highly profitable economic returns. However, it is a difficult process to control because it is operated continuously, at high hydrogen pressures, and is subject to disturbances in feed properties and catalyst performance. Automatic fault detection and diagnosis therefore plays an important role in this context. In this work, a hybrid approach based on neural networks together with a post-processing classification algorithm is used to detect faults in a simulated HDT unit. Nine classes (8 faults and normal operation) were correctly classified using the proposed approach within a maximum time of 5 minutes, based on on-line process measurements.
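
The abstract does not specify the network architecture or the post-processing rule; the sketch below only illustrates the general pattern (a neural classifier over process measurements followed by a simple majority-vote post-processing step), with all names and parameters hypothetical rather than the authors' design.

```python
# Illustrative only: neural-network fault classifier plus a simple
# post-processing step (majority vote over a sliding window of predictions).
import numpy as np
from collections import Counter
from sklearn.neural_network import MLPClassifier

def train_detector(X_train, y_train):
    """X_train: process measurements; y_train: 0 = normal, 1..8 = fault classes."""
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(X_train, y_train)
    return clf

def diagnose_stream(clf, samples, window=10):
    """Smooth raw per-sample predictions with a majority vote over recent samples."""
    raw = clf.predict(samples)
    smoothed = []
    for i in range(len(raw)):
        recent = raw[max(0, i - window + 1): i + 1]
        smoothed.append(Counter(recent).most_common(1)[0][0])
    return np.array(smoothed)
```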

An Effective Traffic Control for both Real-time Bursts and Reliable Bursts in OBS Networks

Optical burst switching (OBS) is considered one of the preferable network technologies for the next-generation Internet. The Internet carries two traffic classes, i.e. real-time bursts and reliable bursts, and it is an important subject for OBS to achieve their cooperative operation. In this paper, we propose a new, effective traffic control method named the Separate TB+LB (Token Bucket + Leaky Bucket) method. The proposed method introduces a new token bucket scheme for real-time bursts called RBO-TB (Real-time Bursts Oriented Token Bucket) and applies the leaky bucket method to reliable bursts to obtain better performance. The paper verifies the effectiveness of the Separate TB+LB method through performance evaluation.
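
The Separate TB+LB scheme itself is not detailed in the abstract; the following sketch only shows the two classical building blocks it combines, a token bucket and a leaky bucket, with hypothetical parameters.

```python
# Classical token-bucket and leaky-bucket policers, shown only to illustrate
# the building blocks combined by the Separate TB+LB method.
class TokenBucket:
    def __init__(self, rate, capacity):
        self.rate, self.capacity = rate, capacity
        self.tokens, self.last = capacity, 0.0

    def allow(self, size, now):
        # refill tokens at the configured rate since the last arrival
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= size:          # enough tokens: the burst conforms
            self.tokens -= size
            return True
        return False                     # non-conforming burst

class LeakyBucket:
    def __init__(self, leak_rate, capacity):
        self.leak_rate, self.capacity = leak_rate, capacity
        self.level, self.last = 0.0, 0.0

    def allow(self, size, now):
        # drain the bucket at the leak rate since the last arrival
        self.level = max(0.0, self.level - (now - self.last) * self.leak_rate)
        self.last = now
        if self.level + size <= self.capacity:   # queue the burst
            self.level += size
            return True
        return False                             # bucket overflow: drop
```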

An Efficient Algorithm for Reliability Lower Bound of Distributed Systems

The reliability of distributed systems and computer networks has been modeled by a probabilistic network or a graph G. Computing the residual connectedness reliability (RCR), denoted by R(G), under the node fault model is very useful but is an NP-hard problem. Since computing the exact value of R(G) may require time exponential in the network size, it is important to calculate a tight approximate value, especially a lower bound, within moderate computation time. In this paper, we propose an efficient algorithm for the reliability lower bound of distributed systems with unreliable nodes. We also apply our algorithm to several typical classes of networks to evaluate the lower bounds and show its effectiveness.
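
To make the quantity being bounded concrete, a definition-level brute-force computation of R(G) can be written as below; it is exponential in the number of nodes, which is exactly why an efficient lower bound is sought, and it is not the paper's algorithm.

```python
# Brute-force residual connectedness reliability R(G): probability that the
# surviving nodes (each alive independently with probability p) induce a
# connected, non-empty subgraph. Feasible only for very small graphs.
from itertools import combinations

def connected(alive, edges):
    alive = set(alive)
    if not alive:
        return False
    seen, stack = set(), [next(iter(alive))]
    while stack:
        u = stack.pop()
        if u in seen:
            continue
        seen.add(u)
        stack.extend(v for a, b in edges if u in (a, b)
                     for v in (a, b) if v in alive and v not in seen)
    return seen == alive

def rcr(nodes, edges, p):
    total = 0.0
    for k in range(1, len(nodes) + 1):
        for alive in combinations(nodes, k):
            if connected(alive, edges):
                total += p**k * (1 - p)**(len(nodes) - k)
    return total

# Example: a 4-cycle with node survival probability 0.9
# rcr([1, 2, 3, 4], [(1, 2), (2, 3), (3, 4), (4, 1)], p=0.9)
```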

Eye Location Based on Structure Feature for Driver Fatigue Monitoring

Eye location is one of the most important problems to solve in a driver fatigue monitoring system. This paper presents an efficient method to achieve fast and accurate eye location in grey-level images obtained under real-world driving conditions. The structure of the eye region is used as a robust cue to find possible eye pairs. Candidate eye pairs at different scales are selected by finding regions that roughly match a binary eye-pair template. All eye-pair candidates are then verified using support vector machines to obtain the real one. Finally, the eyes are precisely located using binary vertical projection and an eye classifier on the eye-pair images. The proposed method is robust to illumination changes, moderate rotations, the wearing of glasses and different eye states. Experimental results demonstrate its effectiveness.
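
As a small illustration of one stage of the pipeline, binary vertical projection over an eye-pair patch (a standard operation, shown here in isolation and without the SVM verification or eye-classifier stages) can be sketched as follows; the threshold is hypothetical.

```python
# Sketch of binary vertical projection inside an eye-pair patch: dark pixels
# are binarised and counted per column; peaks in the profile indicate
# candidate eye columns. Not the paper's full localisation procedure.
import numpy as np

def vertical_projection(patch, threshold=128):
    """Return the column-wise count of dark pixels in a grey-level patch."""
    binary = (patch < threshold).astype(np.uint8)   # dark pixels ~ pupils/eyes
    return binary.sum(axis=0)
```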

ANP-based Intra and Inter-industry Analysis for Measuring Spillover Effect of ICT Industries

Interaction among information and communication technology (ICT) industries has recently become a ubiquitous phenomenon through fixed-mobile integration. To monitor the impact of this interaction, previous research has mainly focused on measuring the spillover effect among ICT industries using various methods. Among others, inter-industry analysis is one of the useful methods for examining the spillover effect between industries. However, the more complex ICT industries become, the more important the impact within an industry is, and inter-industry analysis is limited in mirroring intra-relationships within an industry. Thus, this study applies the analytic network process (ANP) to measure the spillover effect, capturing all of the intra- and inter-relationships. Using ANP-based intra- and inter-industry analysis, the spillover effect is effectively measured, mirroring the complex structure of ICT industries. The main ICT industry and its linkages are also explored to show the current structure of ICT industries. The proposed approach is expected to allow policy makers to understand the interactions of ICT industries and their impact.
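
In generic ANP terms, intra- and inter-dependencies are collected in a column-stochastic weighted supermatrix whose limit powers give overall priorities; the sketch below shows that standard computation with an arbitrary illustrative matrix, not the study's ICT-industry data.

```python
# Generic ANP limit-supermatrix computation: raise the column-stochastic
# weighted supermatrix to successive powers until it converges. The 3x3
# matrix is purely illustrative.
import numpy as np

def limit_supermatrix(W, tol=1e-9, max_iter=10_000):
    W = np.asarray(W, dtype=float)
    assert np.allclose(W.sum(axis=0), 1.0), "columns must sum to 1"
    M = W.copy()
    for _ in range(max_iter):
        nxt = M @ W
        if np.max(np.abs(nxt - M)) < tol:
            return nxt
        M = nxt
    return M

W = np.array([[0.2, 0.5, 0.3],
              [0.5, 0.2, 0.4],
              [0.3, 0.3, 0.3]])          # hypothetical dependence weights among 3 sectors
priorities = limit_supermatrix(W)[:, 0]   # any column of the limit matrix
```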

Achieving Performance in an Organization through Marketing Innovation

Innovation is becoming more and more important in modern society. There is a great deal of research on different kinds of innovation, but marketing innovation has not been studied as frequently. Marketing innovation is defined as a new way in which companies can market themselves to potential or existing customers. The study identifies some key elements of marketing innovation that are worth paying attention to when implementing marketing innovation projects, such as: attention to neglected markets, suitable market segmentation, reliable market information, public relations, increased customer value, combination of market factors, exploration of different marketing channels, and the use of technology. Besides the key elements of marketing innovation, we also present some risks that may occur, such as cost, market uncertainty, information leakage, imitation and overdependence on experience. By proposing a set of indicators to measure marketing innovation, the article offers solutions for marketing innovation implementation so that any organization can achieve optimal results.

Approach to Design of Composition of Current Concrete with Respect to Strength and Static Elasticity Modulus

The paper reflects the current state of awareness of the static elasticity modulus of concrete. This parameter is undoubtedly very important for the design of concrete structures, yet it is very often neglected and rarely determined before the concrete technology itself is designed. The paper describes the assessment and comparison of four mix designs with an almost constant dosage of individual components; the only difference is the area of origin of the small-size aggregate fraction 0/4. The development of compressive strength and static elasticity modulus at the ages of 7, 28 and 180 days was observed. As the experiment showed, the design of the individual components and their quality are the basic factors influencing the elasticity modulus of current concrete.

Contributions to Design of Systems Actuated by Shape Memory Active Elements

Even though it has been recognized that Shape Memory Alloys (SMA) have significant potential for actuator applications, the number of applications of SMA-based actuators to date is still quite small. This is due to the need for a deep understanding of the thermo-mechanical behavior of SMA, which creates an important need for a mathematical model able to describe all the thermo-mechanical properties of SMA with a relatively simple final set of constitutive equations. SMAs offer attractive capabilities such as reversible strains of several percent, generation of high recovery stresses and high power-to-weight ratios. The paper provides an overview of the shape memory functions and presents the designed and developed temperature control system used for a gripper actuated by two pairs of differential SMA active springs. An experimental setup was established, using electrical energy to heat the actuator's springs. To hold the temperature of the SMA springs at a certain level for a long time, a control system was developed in order to avoid overheating of the active elements.

Estimation of Natural Frequency of the Bearing System under Periodic Force Based on the Principle of Hydrodynamic Mass of Fluid

Estimating the natural frequency of structures is very important; it is not usually simple to calculate and is sometimes complicated. Lack of knowledge about it has caused severe damage and hazardous effects. In this paper, using two different finite element (FEM) models based on the hydrodynamic mass of fluids, the natural frequency of a particular bearing (Fig. 1) under an electric field (or a periodic force) is calculated for different stiffnesses and different geometries. Finally, the results of the two models and the analytical solution are compared.
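
The hydrodynamic-mass idea behind such an estimate can be stated compactly: the surrounding fluid contributes an added mass to the vibrating structure, lowering its natural frequency. The relation below is the generic single-degree-of-freedom form, not the paper's specific bearing model.

```latex
% Generic added-mass correction for a single-degree-of-freedom structure;
% k is the stiffness, m_s the structural mass, m_h the hydrodynamic (added) mass.
f_{\mathrm{dry}} = \frac{1}{2\pi}\sqrt{\frac{k}{m_s}},
\qquad
f_{\mathrm{wet}} = \frac{1}{2\pi}\sqrt{\frac{k}{m_s + m_h}} < f_{\mathrm{dry}} .
```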

Effect of Different Fertilization Methods on Soil Biological Indexes

Fertilization plays an important role in crop growth and soil improvement. This study was conducted to determine the best fertilization system for wheat production. Experiments were arranged in a complete block design with three replications over two years. The treatments consisted of six fertilization methods: (N1) farmyard manure; (N2) compost; (N3) chemical fertilizers; (N4) farmyard manure + compost; (N5) farmyard manure + compost + chemical fertilizers; and (N6) control. The addition of compost or farmyard manure significantly increased the soil microbial biomass carbon in comparison to the chemical fertilizer. The dehydrogenase, phosphatase and urease activities in the N3 treatment were significantly lower than in the farmyard manure and compost treatments.

“The Social Destination”: How Social Media Influences the Organisational Structure and Leadership of DMOs

The paper deals with the most important changes that social media has brought about in business and its impact on organisations and leadership in recent years. It seeks to synthesize existing research, theories and concepts in order to understand "social destinations" and to provide a bridge from past research to future success. Becoming a "social destination" is a strategic and tactical leadership and management issue, and the paper presents the importance of destination leadership in choosing the way towards a social destination, along with some organisational models. It also presents some social media tools that can be used in transforming a destination into a social one. Adapting organisations to the twenty-first century means adopting social media as a way of life and a way of business.

Interest of Pseudo-Noise Code Sequences of Different Lengths for Reducing Interference between Users of a CDMA Network

The third generation (3G) of cellular systems adopted spread spectrum as the solution for data transmission in the physical layer. In contrast to IS-95 or CDMAOne (spread-spectrum systems of the preceding generation), the new standard, called the Universal Mobile Telecommunications System (UMTS), uses long codes in the downlink. The system is designed for voice communication and data transmission. The downlink is particularly important because of the asymmetric demand for data, i.e., more downloading towards the mobiles than uploading towards the base station. Moreover, UMTS uses orthogonal spreading with a variable spreading factor in the downlink (OVSF, Orthogonal Variable Spreading Factor). This characteristic makes it possible to increase the data rate of one or more users by reducing their spreading factor without changing the spreading factor of the other users. In the current UMTS standard, two techniques to increase downlink performance have been proposed: transmit antenna diversity and space-time codes. These two techniques combat only fading. The receiver proposed for the mobile station is the RAKE, but one can imagine a more sophisticated receiver, able to reduce the interference between users and the impact of colored noise and narrowband interference. In this context, where the users have synchronized long codes with variable spreading factors and the mobile is unaware of the other active codes/users, the use of pseudo-noise code sequences of different lengths is presented as one of the most appropriate solutions.
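
As background for the variable-spreading-factor discussion above, OVSF codes are built by the standard recursive construction: each code of spreading factor SF spawns two mutually orthogonal children of factor 2·SF by repetition and sign inversion. The sketch below shows only that textbook construction, not the proposed multi-length pseudo-noise scheme.

```python
# Standard recursive OVSF code-tree construction: each code c of length SF
# yields the two orthogonal children [c, c] and [c, -c] of length 2*SF.
def ovsf_codes(sf):
    codes = [[1]]
    length = 1
    while length < sf:
        nxt = []
        for c in codes:
            nxt.append(c + c)                    # repeat the parent code
            nxt.append(c + [-x for x in c])      # repeat with sign inversion
        codes, length = nxt, length * 2
    return codes

# ovsf_codes(4) -> [[1,1,1,1], [1,1,-1,-1], [1,-1,1,-1], [1,-1,-1,1]]
```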

A Text Mining Technique Using Association Rules Extraction

This paper describes a text mining technique for automatically extracting association rules from collections of textual documents. The technique, called Extracting Association Rules from Text (EART), relies on keyword features to discover association rules among the keywords labeling the documents. In this work, the EART system ignores the order in which the words occur, focusing instead on the words and their statistical distributions in the documents. The main contributions of the technique are that it integrates XML technology with an Information Retrieval scheme (TF-IDF) for keyword/feature selection, which automatically selects the most discriminative keywords for use in association rule generation, and uses a data mining technique for association rule discovery. It consists of three phases: a text preprocessing phase (transformation, filtration, stemming and indexing of the documents), an Association Rule Mining (ARM) phase (applying our algorithm for Generating Association Rules based on a Weighting scheme, GARW) and a visualization phase (visualization of results). Experiments were applied to web-page news documents related to the outbreak of bird flu disease. The extracted association rules contain important features and describe the informative news included in the document collection. The performance of the EART system was compared with a system that uses the Apriori algorithm in terms of execution time and the quality of the extracted association rules.
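
A compact, hypothetical sketch of the pipeline outlined above (TF-IDF keyword selection followed by simple keyword-to-keyword rules from co-occurrence) is given below; it is not the EART/GARW algorithm itself, and the thresholds are illustrative.

```python
# Hypothetical sketch: TF-IDF keyword selection, then simple association
# rules keyword -> keyword based on document co-occurrence.
from itertools import permutations
from sklearn.feature_extraction.text import TfidfVectorizer

def keyword_rules(docs, top_k=20, min_support=0.1, min_confidence=0.6):
    vec = TfidfVectorizer(stop_words="english")
    tfidf = vec.fit_transform(docs)
    scores = tfidf.sum(axis=0).A1                 # aggregate TF-IDF per term
    vocab = vec.get_feature_names_out()
    keywords = {vocab[i] for i in scores.argsort()[::-1][:top_k]}

    doc_sets = [{w for w in doc.lower().split() if w in keywords} for doc in docs]
    n = len(docs)

    def support(items):
        return sum(items <= s for s in doc_sets) / n

    rules = []
    for a, b in permutations(keywords, 2):        # candidate rules a -> b
        s_ab, s_a = support({a, b}), support({a})
        if s_a and s_ab >= min_support and s_ab / s_a >= min_confidence:
            rules.append((a, b, s_ab, s_ab / s_a))
    return rules
```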

A New Ridge Orientation based Method of Computation for Feature Extraction from Fingerprint Images

An important step in studying the statistics of fingerprint minutiae features is to reliably extract them from fingerprint images. A new, reliable computation method for minutiae feature extraction from fingerprint images is presented. A fingerprint image is treated as a textured image, and an orientation flow field of the ridges is computed. To accurately locate ridges, a new ridge orientation based computation method is proposed. After ridge segmentation, a new computation method is proposed for smoothing the ridges. The ridge skeleton image is obtained and then smoothed using morphological operators to detect the features. A post-processing stage eliminates a large number of false features from the detected set of minutiae. The detected features are observed to be reliable and accurate.
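
For context, the conventional block-wise, gradient-based estimate of a ridge orientation flow field is sketched below; it is the standard least-squares formulation found in the fingerprint literature, not necessarily the new computation proposed in the paper.

```python
# Standard block-wise gradient method for a ridge orientation field: in each
# block the dominant orientation follows from averaged gradient products.
import numpy as np

def orientation_field(img, block=16):
    img = img.astype(float)
    gy, gx = np.gradient(img)                    # gradients along rows (y) and columns (x)
    h, w = img.shape
    theta = np.zeros((h // block, w // block))
    for i in range(theta.shape[0]):
        for j in range(theta.shape[1]):
            sy = slice(i * block, (i + 1) * block)
            sx = slice(j * block, (j + 1) * block)
            gxx = np.sum(gx[sy, sx] ** 2)
            gyy = np.sum(gy[sy, sx] ** 2)
            gxy = np.sum(gx[sy, sx] * gy[sy, sx])
            # ridge orientation is perpendicular to the mean gradient direction
            theta[i, j] = 0.5 * np.arctan2(2 * gxy, gxx - gyy) + np.pi / 2
    return theta
```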