Design of a Tube Vent to Enhance the Role of Roof Solar Collector

The objective of this paper was to design a ventilation system that enhances the performance of a roof solar collector (RSC) in reducing heat accumulation inside a house. The RSC has a surface area of 1.8 m2 and is made of CPAC Monier roof tiles on the upper part and gypsum board on the lower part. The gap between the CPAC Monier tiles and the gypsum board was fixed at 14 cm. The ventilation system of the modified roof solar collector (modified RSC) consists of 9 tubes of 0.15 m diameter installed in the lower part of the RSC. Experimental results showed that the modified RSC reduced both the room temperature and the attic temperature. The average room temperature reduction in the house using the modified RSC is about 2 °C, and the percentage of room temperature reduction varied between 0 and 10%. Therefore, the modified RSC is an interesting option in the sense that it promotes the use of solar energy and conserves energy.

Neural Network Control of a Biped Robot Model with Composite Adaptation Law

This paper presents a novel neural network controller with composite adaptation law to improve the trajectory tracking of biped robots compared with classical controllers. The biped model has 5 links and 6 degrees of freedom and is actuated by Pleated Pneumatic Artificial Muscles, which have a very high power-to-weight ratio and a large stroke compared to similar actuators. The proposed controller employs a stable neural network to approximate unknown nonlinear functions in the robot dynamics, thereby overcoming some limitations of conventional controllers such as PD or adaptive controllers and guaranteeing good performance. The NN controller significantly improves tracking accuracy by retaining the basic PD/PID loop but adding an inner adaptive loop that allows the controller to learn unknown parameters such as the friction coefficient. Simulation results, together with graphical simulation in virtual reality, show that the NN controller's tracking performance is considerably better than that of the PD controller.
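
The idea of retaining a basic PD loop while an inner adaptive loop learns an unknown friction coefficient can be sketched as follows. This is a minimal illustration on a 1-DOF unit-inertia link, not the paper's 5-link controller; all gains, the adaptation rate, and the plant model are assumptions.

```python
import numpy as np

def simulate(adaptive, T=5.0, dt=1e-3, kp=100.0, kd=20.0, gamma=50.0, b_true=2.0):
    """Track q_ref(t) = sin(t) on a unit-inertia link with unknown viscous friction."""
    q = qd = b_hat = 0.0
    errs = []
    for k in range(int(T / dt)):
        t = k * dt
        e = np.sin(t) - q          # position tracking error
        ed = np.cos(t) - qd        # velocity tracking error
        u = kp * e + kd * ed       # basic PD loop, retained
        if adaptive:
            u += b_hat * qd                      # compensate the learned friction
            b_hat += gamma * (e + ed) * qd * dt  # gradient adaptation law
        qd += (u - b_true * qd) * dt             # plant: qdd = u - b*qd
        q += qd * dt
        errs.append(abs(e))
    return float(np.mean(errs))

pd_err = simulate(adaptive=False)
nn_err = simulate(adaptive=True)
print(nn_err < pd_err)  # the adaptive inner loop reduces the mean tracking error
```

The adaptation law here follows the standard Lyapunov-style gradient construction with the filtered error (e + ed) as the driving signal; a full NN controller would replace the single coefficient b_hat with a basis-function approximation of the unknown dynamics.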

SOA Embedded in BPM: A High Level View of Object Oriented Paradigm

The design and development of information systems have undergone a variety of phases and stages, driven by brisk changes in user requirements and business needs. To meet these requirements and needs, a flexible and agile business solution was required to keep up with the latest business trends and styles. Another obstacle to the agility of information systems has been the practice of treating the same disease of two patients differently: business processes and information services. Since the emergence of information technology, business processes and information systems have become counterparts, yet these two business halves have been treated under totally different standards. There is a need to streamline the boundaries of these two pillars, which equally share an information system's burdens and liabilities. In the last decade, object orientation has evolved into one of the major solutions for modern business needs; SOA is now the solution for shifting business onto an electronic platform, while BPM is another modern business solution that helps regularize the optimization of business processes. This paper discusses how object orientation can be applied to incorporate or embed SOA in BPM for improved information systems.

Application of Process Approach to Evaluate the Information Security Risk and its Implementation in an Iranian Private Bank

Every organization is continually subject to new damages and threats, which can result from its operations or the pursuit of its goals. Methods of securing the workspace and its tools have changed widely with the increasing application and development of information technology (IT). From this viewpoint, information security management systems evolved to build on, rather than reiterate, previously experienced methods. In general, a correct response in information security management systems requires correct decision making, which in turn requires the comprehensive effort of managers and everyone involved in each plan or decision. Obviously, not all aspects of a task or decision are defined under all decision-making conditions; therefore, the possible or certain risks should be considered when making decisions. This is the subject of risk management, and it can influence the decisions made. Investigation of different approaches in the field of risk management demonstrates their progress from quantitative to qualitative methods with a process approach.

A Formal Implementation of Database Security

This paper investigates the implementation of security mechanisms in object-oriented database systems. Formal methods play an essential role in computer security due to their powerful expressiveness and concise syntax and semantics. In this paper, both specification and implementation issues in a database security environment are considered, and database security is achieved through the development of an efficient implementation of the specification without compromising its originality and expressiveness.

Compromise Ratio Method for Decision Making under Fuzzy Environment using Fuzzy Distance Measure

The aim of this paper is to adopt a compromise ratio (CR) methodology for fuzzy multi-attribute single-expert decision-making problems. The rating of each alternative is described by linguistic terms, which can be expressed as triangular fuzzy numbers. The compromise ratio method for fuzzy multi-attribute single-expert decision making is considered here by taking a ranking index based on the concept that the chosen alternative should be as close as possible to the ideal solution and, simultaneously, as far away as possible from the negative-ideal solution. From a logical point of view, the distance between two triangular fuzzy numbers is itself a fuzzy number, not a crisp value. Therefore, a fuzzy distance measure, which is itself a fuzzy number, is used here to calculate the difference between two triangular fuzzy numbers. With the help of this fuzzy distance measure, it is shown that the compromise ratio is a fuzzy number, which eases the decision maker's task. The computation principle and the procedure of the compromise ratio method are described in detail. A comparative analysis of the previously proposed compromise ratio method [1] and the newly adopted method is illustrated with two numerical examples.
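
The overall computation can be sketched in a few lines. The particular fuzzy distance below (modal difference widened by the combined spreads) and the centroid defuzzification are illustrative assumptions, not necessarily the measures used in the paper; the ratings are invented.

```python
# Triangular fuzzy numbers (TFNs) represented as (left, modal, right).

def fuzzy_distance(a, b):
    """An illustrative fuzzy-valued distance between two TFNs (itself a TFN)."""
    m = abs(a[1] - b[1])                          # distance between modal values
    spread = ((a[2] - a[0]) + (b[2] - b[0])) / 2  # combined fuzziness
    return (max(0.0, m - spread), m, m + spread)

def centroid(t):
    return sum(t) / 3.0  # simple defuzzification of a TFN

def add(t, s):
    return tuple(x + y for x, y in zip(t, s))

# Two alternatives rated on three attributes (linguistic terms -> TFNs)
ideal = [(8, 9, 10)] * 3      # positive-ideal rating per attribute
neg_ideal = [(0, 1, 2)] * 3   # negative-ideal rating per attribute
alts = {
    "A1": [(6, 7, 8), (7, 8, 9), (5, 6, 7)],
    "A2": [(3, 4, 5), (4, 5, 6), (6, 7, 8)],
}

crs = {}
for name, ratings in alts.items():
    d_pos = d_neg = (0.0, 0.0, 0.0)
    for r, p, n in zip(ratings, ideal, neg_ideal):
        d_pos = add(d_pos, fuzzy_distance(r, p))  # fuzzy closeness to the ideal
        d_neg = add(d_neg, fuzzy_distance(r, n))  # fuzzy remoteness from the negative-ideal
    # defuzzified compromise ratio: smaller means closer to the ideal
    crs[name] = centroid(d_pos) / (centroid(d_pos) + centroid(d_neg))
    print(name, round(crs[name], 3))
```

With these ratings, A1 obtains the smaller ratio and is ranked first; keeping d_pos and d_neg as fuzzy numbers until the final step is what distinguishes this scheme from a crisp-distance TOPSIS-style computation.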

An Automation of Check Focusing on CRUD for Requirements Analysis Model in UML

A key to successful high-quality software development is defining a valid and feasible requirements specification. We have proposed a method of model-driven requirements analysis using the Unified Modeling Language (UML). The main feature of our method is to automatically generate a Web user interface mock-up from a UML requirements analysis model so that we can confirm the validity of the input/output data for each page, and of the page transitions, by directly operating the mock-up. This paper proposes a support method to check the validity of a data life cycle by using the model checking tool UPPAAL, focusing on CRUD (Create, Read, Update and Delete). Exhaustive checking improves the quality of the requirements analysis models that are validated by customers through the automatically generated mock-up. The effectiveness of our method is discussed through a case study of requirements modeling for two small projects: a library management system and a supportive textbook sales system at a university.
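
The life-cycle property being model-checked can be stated simply: along any page-transition sequence, an entity must be Created before it is Read, Updated or Deleted, and never touched after Delete. A hypothetical sketch of that rule (the paper encodes it as UPPAAL automata; entity names and traces below are invented):

```python
def check_lifecycle(trace):
    """trace: list of (entity, op) pairs with op in 'CRUD'; returns violations."""
    state = {}    # entity -> 'alive' or 'deleted'
    errors = []
    for entity, op in trace:
        if op == 'C':
            if state.get(entity) == 'alive':
                errors.append(f"{entity}: created twice")
            state[entity] = 'alive'
        elif state.get(entity) != 'alive':
            errors.append(f"{entity}: '{op}' before create or after delete")
        if op == 'D' and state.get(entity) == 'alive':
            state[entity] = 'deleted'
    return errors

ok = [('book', 'C'), ('book', 'R'), ('book', 'U'), ('book', 'D')]
bad = [('book', 'R'), ('book', 'C'), ('book', 'D'), ('book', 'U')]
print(check_lifecycle(ok))   # []
print(check_lifecycle(bad))  # two violations: R before C, U after D
```

A model checker performs this check exhaustively over all reachable transition sequences rather than over single traces, which is what makes it stronger than testing the mock-up by hand.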

Simulated Annealing Algorithm for Data Aggregation Trees in Wireless Sensor Networks and Comparison with Genetic Algorithm

In ad hoc networks, the main issue in protocol design is quality of service, whereas in wireless sensor networks the main constraint is the limited energy of the sensors. In fact, protocols that minimize the power consumption of sensors receive the most attention in wireless sensor networks. One approach to reducing energy consumption in wireless sensor networks is to reduce the number of packets transmitted in the network. Data aggregation, which combines related data and prevents the transmission of redundant packets, can be effective in reducing the number of transmitted packets. Because processing information consumes less power than transmitting it, data aggregation is of great importance and is used in many protocols [5]. One data aggregation technique is to use a data aggregation tree: related packets are combined in intermediate nodes to form a single packet, so fewer packets are transmitted in the network, less energy is consumed, and the longevity of the network improves. However, finding an optimal data aggregation tree for collecting data in a network with one sink is an NP-hard problem, so heuristic optimization methods are used; one of these is Simulated Annealing. In this article, we propose a new method for building the data collection tree in wireless sensor networks using a Simulated Annealing algorithm, and we evaluate its efficiency against a Genetic Algorithm.
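
A minimal sketch of the approach, with the details assumed rather than taken from the paper: nodes in a unit square, a candidate move that re-parents a random node, and total squared link distance as a stand-in for transmission energy.

```python
import math, random

random.seed(0)
nodes = [(random.random(), random.random()) for _ in range(20)]
sink = 0  # node 0 acts as the sink

def cost(parent):
    """Total squared link length, a common proxy for transmission energy."""
    return sum((nodes[i][0] - nodes[parent[i]][0]) ** 2 +
               (nodes[i][1] - nodes[parent[i]][1]) ** 2
               for i in range(1, len(nodes)))

def is_tree(parent):
    """Every node must reach the sink without hitting a cycle."""
    for i in range(1, len(nodes)):
        seen, j = set(), i
        while j != sink:
            if j in seen:
                return False
            seen.add(j)
            j = parent[j]
    return True

parent = [sink] * len(nodes)   # initial solution: star topology rooted at the sink
temp = 1.0
for step in range(5000):
    i = random.randrange(1, len(nodes))
    p_new = random.randrange(len(nodes))
    if p_new == i:
        continue
    cand = parent[:]
    cand[i] = p_new                 # neighbourhood move: re-parent node i
    if not is_tree(cand):
        continue
    delta = cost(cand) - cost(parent)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        parent = cand               # accept downhill moves, and uphill with prob e^(-d/T)
    temp *= 0.999                   # geometric cooling schedule

star_cost = cost([sink] * len(nodes))
print(cost(parent) < star_cost)     # annealed tree is cheaper than single-hop star
```

Because the cost is quadratic in link length, the annealed multi-hop tree beats the single-hop star by a wide margin, which is exactly why aggregation trees save energy in this model.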

Economic Effects and Energy Use Efficiency of Incorporating Alfalfa and Fertilizer into Grass-Based Pasture Systems

A ten-year grazing study was conducted at the Agriculture and Agri-Food Canada Brandon Research Centre in Manitoba to study the effect of alfalfa inclusion and fertilizer (N, P, K, and S) addition on economics and efficiency of non-renewable energy use in meadow brome grass-based pasture systems for beef production. Fertilizing grass-only or alfalfa-grass pastures to full soil test recommendations improved pasture productivity, but did not improve profitability compared to unfertilized pastures. Fertilizing grass-only pastures resulted in the highest net loss of any pasture management strategy in this study. Adding alfalfa at the time of seeding, with no added fertilizer, was economically the best pasture improvement strategy in this study. Because of moisture limitations, adding commercial fertilizer to full soil test recommendations is probably not economically justifiable in most years, especially with the rising cost of fertilizer. Improving grass-only pastures by adding fertilizer and/or alfalfa required additional non-renewable energy inputs; however, the additional energy required for unfertilized alfalfa-grass pastures was minimal compared to the fertilized pastures. Of the four pasture management strategies, adding alfalfa to grass pastures without adding fertilizer had the highest efficiency of energy use. Based on energy use and economic performance, the unfertilized alfalfa-grass pasture was the most efficient and sustainable pasture system.

Image Compression with Back-Propagation Neural Network using Cumulative Distribution Function

Image compression using artificial neural networks is a topic where research is being carried out in various directions towards achieving a generalized and economical network. Feedforward networks using the back-propagation algorithm, adopting the method of steepest descent for error minimization, are popular, widely adopted, and directly applicable to image compression. Various research works are directed towards achieving quick convergence of the network without loss of quality in the restored image. In general, the images used for compression are of different types, such as dark images and high-intensity images. When these images are compressed using a back-propagation network, the network takes a long time to converge, because the image may contain many distinct gray levels with only narrow differences from their neighborhood pixels. If the gray levels of the pixels and their neighbors are mapped so that the difference in gray levels between neighbors is minimized, both the compression ratio and the convergence of the network can be improved. To achieve this, a cumulative distribution function is estimated for the image and used to map the image pixels. When the mapped image pixels are used, the back-propagation neural network yields a high compression ratio and converges quickly.
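
The pre-processing step can be sketched as follows: estimate the image's empirical cumulative distribution function and remap gray levels through it (the classical histogram-equalization transform). The synthetic "dark" test image is an assumption; the back-propagation network itself is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 64, size=(32, 32)).astype(np.uint8)  # a "dark" image: levels 0-63

hist = np.bincount(img.ravel(), minlength=256)  # gray-level histogram
cdf = hist.cumsum() / img.size                  # empirical CDF in [0, 1]
mapped = (cdf[img] * 255).astype(np.uint8)      # CDF-based gray-level mapping

print(img.max(), mapped.max())  # the mapped image spans the full 0-255 range
```

After the mapping, the gray levels are spread uniformly over the full range, so neighboring pixels differ by smoother, more predictable amounts, which is the property claimed to speed up convergence of the compression network.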

A Local Statistics Based Region Growing Segmentation Method for Ultrasound Medical Images

This paper presents a region-based segmentation method for ultrasound images using local statistics. In this approach, the homogeneous regions depend on the image granularity features, where structures of interest with dimensions comparable to the speckle size are to be extracted. The method uses a look-up table comprising the local statistics of every pixel, which consist of the homogeneity and similarity bounds according to the kernel size. The shape and size of the growing regions depend on these look-up table entries. The algorithm is implemented using a connected seeded region growing procedure in which each pixel is taken as a seed point. Region merging after region growing also suppresses high-frequency artifacts. The updated merged regions produce the output in the form of a segmented image. This algorithm produces results that are less sensitive to pixel location and also allows accurate segmentation of homogeneous regions.
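
The core growing step can be sketched on a toy image: a region expands from a seed over 4-connected neighbours whose intensity stays within a homogeneity bound of the running region mean. The look-up table of per-pixel statistics and the merging stage are omitted, and the bound is a fixed assumed constant here.

```python
import numpy as np

def region_grow(img, seed, tol=10.0):
    """Connected seeded region growing with a fixed homogeneity bound `tol`."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    stack, total, count = [seed], 0.0, 0
    mask[seed] = True
    while stack:
        y, x = stack.pop()
        total += img[y, x]
        count += 1
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                # accept a neighbour if it stays within tol of the region mean
                if abs(img[ny, nx] - total / count) <= tol:
                    mask[ny, nx] = True
                    stack.append((ny, nx))
    return mask

img = np.full((8, 8), 100.0)
img[:, 4:] = 200.0              # two homogeneous halves
mask = region_grow(img, (0, 0))
print(mask[:, :4].all(), mask[:, 4:].any())  # grows over the left half only
```

In the paper's setting the bound would come from the per-pixel look-up table rather than a global constant, so the region shape adapts to the local speckle statistics.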

A Training Model for Successful Implementation of Enterprise Resource Planning

It is well recognized that one feature of a successful company is its ability to align its business goals with its information and communication technology platform. Enterprise Resource Planning (ERP) systems contribute to better performance by integrating various business functions and supporting information flows. However, the complexity of these technological systems is known to prevent business users from exploiting ERP systems efficiently. This paper investigates the role of training in improving the usage of ERP systems. To this end, we designed a survey instrument administered to employees of a Norwegian multinational provider of technology solutions. Based on the analysis of the collected data, we delineate a training model that could be of high relevance for both researchers and practitioners as a step towards a better understanding of ERP system implementation.

Thermal and Mechanical Properties of Modified CaCO3 /PP Nanocomposites

Polymer composites filled with inorganic nanoparticles have extended their multiple functionalities to various applications, including mechanical reinforcement, gas barriers, dimensional stability, heat distortion temperature, flame retardancy, and thermal conductivity. Sodium stearate-modified calcium carbonate (CaCO3) nanoparticles were prepared using a surface modification method. The results showed that sodium stearate attached to the surface of the CaCO3 nanoparticles via chemical bonds. The effect of the modified CaCO3 nanoparticles on the thermal properties of polypropylene (PP) was studied by means of differential scanning calorimetry (DSC) and thermogravimetric analysis (TGA). It was found that CaCO3 significantly affected the crystallization temperature and degree of crystallinity of PP. The effect of the modified CaCO3 content on the mechanical properties of PP/CaCO3 nanocomposites was also studied. The results showed that the modified CaCO3 can effectively improve the mechanical properties of PP: compared with neat PP, the impact strength of the PP/CaCO3 nanocomposites increased by about 65% and the hardness by about 5%.

Experimental Evaluation of Drilling Damage on the Strength of Cores Extracted from RC Buildings

Concrete strength evaluated from compression tests on cores is affected by several factors that cause differences from the in-situ strength at the location from which the core specimen was extracted. Among these factors is the damage that may occur during the drilling phase, which generally leads to underestimating the actual in-situ strength. In order to quantify this effect, two wide datasets have been examined in this study, including: (i) about 500 core specimens extracted from existing reinforced concrete structures, and (ii) about 600 cube specimens taken during the construction of new structures in the framework of routine acceptance control. The two experimental datasets have been compared in terms of compression strength and specific weight, accounting for the main factors affecting concrete properties, that is, type and amount of cement, aggregate grading, type and maximum size of aggregates, water/cement ratio, placing and curing modality, and concrete age. The results show that the magnitude of the strength reduction due to drilling damage is strongly affected by the actual properties of the concrete, being inversely proportional to its strength. Therefore, the application of a single value of the correction coefficient, as generally suggested in the technical literature and in structural codes, appears inappropriate. A set of values of the drilling damage coefficient is suggested as a function of the strength obtained from compression tests on cores.

Awareness of Reading Strategies among EFL Learners at Bangkok University

This questionnaire-based study aimed to measure and compare the awareness of English reading strategies among EFL learners at Bangkok University (BU), classified by gender, field of study, and English learning experience. Proportional stratified random sampling was employed to formulate a sample of 380 BU students. The data were statistically analyzed in terms of the mean and standard deviation. A t-test was used to find differences in awareness of reading strategies between two groups (male vs. female; science vs. social science students). In addition, one-way analysis of variance (ANOVA) was used to compare reading strategy awareness among BU students with different lengths of English learning experience. The results indicated that the overall awareness of reading strategies of EFL learners at BU was at a high level (mean = 3.60) and that there was no statistically significant difference between males and females, or among students with different lengths of English learning experience, at the 0.05 significance level. However, significant differences among students from different fields of study were found at the same level of significance.

GPU-Based Volume Rendering for Medical Imagery

We present a method for fast volume rendering using graphics hardware (GPU). Based on the Shear-Warp algorithm, our GPU-based method provides real-time frame rates and outperforms the CPU-based implementation; to our knowledge, it is the first GPU implementation of this algorithm. When the number of slices is not sufficient, we add in-between slices computed by interpolation, which improves the quality of the rendered images. We have also implemented the ray marching algorithm on the GPU. The results generated by the three algorithms (CPU-based and GPU-based Shear-Warp, GPU-based ray marching) for two test models show that the ray marching algorithm outperforms the Shear-Warp methods in terms of both speed-up and image quality.
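
The ray marching half of the comparison can be sketched on the CPU: for each pixel, sample the volume at regular steps along a view ray and composite the samples front to back. The test volume, opacity transfer function, and orthographic camera below are invented for illustration; on the GPU the per-pixel loop runs in a fragment or compute shader.

```python
import numpy as np

# A 32^3 volume containing a bright sphere at the centre
n = 32
z, y, x = np.mgrid[:n, :n, :n]
vol = (((x - 16) ** 2 + (y - 16) ** 2 + (z - 16) ** 2) < 8 ** 2).astype(float)

def render(vol, steps=64):
    n = vol.shape[0]
    img = np.zeros((n, n))
    for j in range(n):                # orthographic rays along the z axis
        for i in range(n):
            color, alpha = 0.0, 0.0
            for s in range(steps):
                k = int(s * (n - 1) / (steps - 1))   # nearest sample on the ray
                sample = vol[k, j, i]
                a = 0.1 * sample                     # assumed opacity transfer function
                color += (1 - alpha) * a * sample    # front-to-back compositing
                alpha += (1 - alpha) * a
                if alpha > 0.99:                     # early ray termination
                    break
            img[j, i] = color
    return img

img = render(vol)
print(img[16, 16] > img[0, 0])  # the sphere renders brighter than the empty corner
```

The early-ray-termination test is one reason ray marching maps well to GPUs: saturated rays stop sampling, and all rays run independently in parallel.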

Measuring Relative Efficiency of Korean Construction Company using DEA/Window

The sub-prime mortgage crisis that began in the US is regarded as the most severe economic crisis since the Great Depression of the early 20th century. In particular, hidden problems in the efficient operation of businesses were disclosed all at once, and many financial institutions went bankrupt or filed for court receivership. The collapse of the physical market led to the bankruptcy of manufacturing and construction businesses. This study analyzes the dynamic efficiency of construction businesses over the five years around the global financial crisis. By discovering the trend and stability of the efficiency of a construction business, this study aims to improve management efficiency in the ever-changing construction market. Variables were selected by analyzing corporate information on the top 20 construction businesses in Korea, which were analyzed for static efficiency in 2008 and dynamic efficiency between 2006 and 2010. Unlike other studies, this study deduces the efficiency trend and stability of a construction business over five years by using the DEA/Window model. Using the analysis results, efficient and inefficient companies can be identified. In addition, relative efficiency among DMUs was measured by comparing the relationship between the input and output variables of the construction businesses. This study can serve as a reference for improving the management efficiency of companies with low efficiency, based on the efficiency analysis of construction businesses.
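
The windowing idea behind DEA/Window analysis can be sketched in a heavily simplified form: with a single input and a single output, CCR-style efficiency reduces to the output/input ratio normalized by the best ratio in the reference set, and a window of w consecutive years treats each company-year as a separate DMU. The figures and the single-input simplification below are illustrative assumptions; the study itself solves full multi-input DEA linear programs.

```python
data = {  # company -> list of (input, output) per year, e.g. 2006-2010
    "A": [(10, 8), (11, 9), (12, 9), (12, 10), (13, 11)],
    "B": [(10, 6), (10, 7), (11, 6), (11, 8), (12, 9)],
}

def window_efficiency(data, w=3):
    """Mean normalized efficiency of each company in each w-year window."""
    years = len(next(iter(data.values())))
    result = {c: [] for c in data}
    for start in range(years - w + 1):
        # every company-year inside the window is a DMU in the reference set
        ratios = [o / i for series in data.values()
                  for i, o in series[start:start + w]]
        best = max(ratios)
        for c, series in data.items():
            effs = [(o / i) / best for i, o in series[start:start + w]]
            result[c].append(sum(effs) / w)   # mean efficiency within the window
    return result

eff = window_efficiency(data)
for c, vals in eff.items():
    print(c, [round(v, 2) for v in vals])    # efficiency trend across windows
```

Reading each company's list across windows gives the trend, and its variation within a row gives the stability, which is exactly the information a single-year (static) DEA run cannot provide.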

Bi-Criteria Latency Optimization of Intra-and Inter-Autonomous System Traffic Engineering

Traffic Engineering (TE) is the process of controlling how traffic flows through a network in order to facilitate efficient and reliable network operations while simultaneously optimizing network resource utilization and traffic performance. TE improves the management of data traffic within a network and provides better utilization of network resources. Many research works consider intra-AS and inter-AS Traffic Engineering separately, but in reality each influences the other, so the network performance of both inter- and intra-Autonomous System (AS) traffic is not properly optimized. To achieve a better joint optimization of both intra- and inter-AS TE, we propose a joint optimization technique that considers intra-AS features during inter-AS TE and vice versa. This work considers an important criterion, namely latency, both within an AS and between ASes, and proposes a bi-criteria latency optimization model. Overall network performance can thereby be improved, in terms of latency, through this joint optimization technique.
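
One simple way to treat the two latency criteria jointly is to scalarize them: each link carries an (intra-AS, inter-AS) latency pair, and a weighted sum turns the bi-criteria problem into a single-cost shortest-path problem that Dijkstra's algorithm solves. The topology and latencies below are invented, and this scalarization is an illustrative assumption rather than the paper's model.

```python
import heapq

graph = {  # node -> list of (neighbour, intra_AS_latency, inter_AS_latency)
    "a": [("b", 1, 5), ("c", 4, 1)],
    "b": [("d", 1, 5)],
    "c": [("d", 4, 1)],
    "d": [],
}

def best_path(src, dst, alpha):
    """Dijkstra on the scalarized cost alpha*intra + (1-alpha)*inter."""
    pq = [(0.0, src, [src])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nb, intra, inter in graph[node]:
            w = alpha * intra + (1 - alpha) * inter
            heapq.heappush(pq, (cost + w, nb, path + [nb]))
    return float("inf"), []

print(best_path("a", "d", alpha=1.0))  # intra-only criterion favours a-b-d
print(best_path("a", "d", alpha=0.0))  # inter-only criterion favours a-c-d
```

Sweeping alpha between 0 and 1 traces out the trade-off between the two criteria; intermediate weights select compromise paths, which is the essence of a joint intra/inter optimization.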

Research on Hybrid Neural Network in Intrusion Detection System

This paper presents an intrusion detection system based on a hybrid neural network model combining RBF and Elman networks. It is used for both anomaly detection and misuse detection. The model has a memory function and can effectively detect discrete and temporally related aggressive behavior: the RBF network acts as a real-time pattern classifier, while the Elman network provides memory of former events. The intrusion detection system based on this hybrid model was evaluated on the DARPA data set, using ROC curves to display the test results intuitively. The experiments show that this hybrid-model intrusion detection system can effectively improve the detection rate and reduce the rates of false alarms and missed detections.
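
The RBF half of the hybrid can be sketched as a Gaussian radial-basis layer over fixed centres with a least-squares-trained linear output layer, classifying connection records as normal or attack. The two-dimensional synthetic data, the centres, and the width parameter are invented stand-ins for DARPA features; the Elman memory component is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
normal = rng.normal([0.0, 0.0], 0.3, size=(50, 2))   # synthetic "normal" traffic
attack = rng.normal([2.0, 2.0], 0.3, size=(50, 2))   # synthetic "attack" traffic
X = np.vstack([normal, attack])
y = np.array([0] * 50 + [1] * 50, dtype=float)

centres = np.array([[0.0, 0.0], [2.0, 2.0]])         # assumed RBF centres

def rbf_features(X, gamma=1.0):
    """Gaussian basis activations for each sample against each centre."""
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

Phi = rbf_features(X)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)          # train the linear output layer
pred = (rbf_features(X) @ w > 0.5).astype(float)
print((pred == y).mean())  # training accuracy on this well-separated toy data
```

In the full system, a recurrent Elman layer would feed context from previous events into the classifier, allowing temporally related steps of an attack to be recognized even when each step looks benign in isolation.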