Analytical Solution for Free Vibration of Rectangular Kirchhoff Plate from Wave Approach

In this paper, an analytical approach to the free vibration analysis of rectangular Kirchhoff plates with all four edges simply supported is presented. The method is based on the wave approach. From the wave standpoint, vibrations propagate, reflect, and transmit within a structure. First, the propagation and reflection matrices for a plate with simply supported boundary conditions are derived. These matrices are then combined to provide a concise and systematic approach to the free vibration analysis of a simply supported rectangular Kirchhoff plate. Subsequently, the eigenvalue problem for free vibration of the plate is formulated and the equation for the plate natural frequencies is constructed. Finally, the effectiveness of the approach is demonstrated by comparison of the results with the existing classical solution.
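
As a quick illustration of the classical benchmark the wave-based results are compared against, the sketch below (a minimal Python example with assumed material and geometry values) evaluates the well-known natural-frequency formula for an all-edges simply supported Kirchhoff plate.

```python
# Minimal sketch: classical natural frequencies of a simply supported
# rectangular Kirchhoff plate, used here only as the comparison benchmark
# mentioned in the abstract.  Material and geometry values are illustrative.
import numpy as np

E, nu, rho = 210e9, 0.3, 7850.0        # steel-like properties (assumed)
h, a, b = 0.005, 1.0, 0.8              # thickness and plate dimensions [m]
D = E * h**3 / (12.0 * (1.0 - nu**2))  # flexural rigidity

def omega_mn(m, n):
    """Classical angular natural frequency for mode (m, n)."""
    return np.pi**2 * ((m / a)**2 + (n / b)**2) * np.sqrt(D / (rho * h))

for m in range(1, 4):
    for n in range(1, 4):
        print(f"mode ({m},{n}): f = {omega_mn(m, n) / (2 * np.pi):8.2f} Hz")
```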

Assessment of Cadmium Level in Water from Watershed of the Kowsar Dam

The Kowsar dam supplies water for various uses such as drinking, industry, agriculture, and aquaculture farms, and is located next to the city of Dehdasht in Kohgiluye and Boyerahmad province in southern Iran. There are a number of towns and villages in the Kowsar dam watershed, of which Dehdasht and Choram are the most important and most populated. This study was undertaken to assess the status of water quality in the urban areas of the Kowsar dam watershed. A total of 28 water samples were collected from 6 surface water stations and 1 groundwater station in the watershed of the Kowsar dam. All samples were analyzed for Cd concentration using standard procedures, and the results were compared with national and international standards. Although the maximum cadmium value (1.131 μg/L) was observed at station 2 in winter 2009, all the analyzed samples were within the maximum admissible limits set by the United States Environmental Protection Agency, the EU, the WHO, and the New Zealand, Australian, Iranian, and Indian standards. In general, the results of the present study show that the mean Cd values of stations 4, 1, and 2, at 0.5135, 0.4733, and 0.4573 μg/L respectively, are higher than those of the other stations. Although the Cd levels of all samples and stations were within normal values, they indicate a pollution potential and hazard arising from human activity and wastewater from the towns in the area, which could affect human health in the future. This research therefore recommends that the government and other responsible authorities take suitable remedial measures in the Kowsar dam watershed.

Fuzzy Control of the Air Conditioning System at Different Operating Pressures

The present work demonstrates the design and simulation of fuzzy control of an air conditioning system at different pressures. A first-order Sugeno fuzzy inference system is utilized to model the system and create the controller. In addition, the heat transfer rate and the mass flow rate of water injected into or withdrawn from the air conditioning system are estimated by the fuzzy IF-THEN rules. The approach starts by generating the input/output data. Then, the subtractive clustering algorithm, along with least-squares estimation (LSE), generates the fuzzy rules that describe the relationship between the input/output data. The fuzzy rules are tuned by an Adaptive Neuro-Fuzzy Inference System (ANFIS). The results show that as the pressure increases, the water flow rate and heat transfer rate decrease within the lower range of inlet dry-bulb temperatures. On the other hand, as the pressure increases, the water flow rate and heat transfer rate increase within the higher range of inlet dry-bulb temperatures. The inflection in the pressure effect trend occurs at lower temperatures as the inlet air humidity increases.
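
The sketch below illustrates the evaluation step of a first-order Sugeno fuzzy model of the kind described above; the rule centres, widths, and linear consequent coefficients are invented placeholders standing in for the values that subtractive clustering, LSE, and ANFIS tuning would produce.

```python
# Minimal sketch of a first-order Sugeno inference step.  The rule parameters
# below are placeholders, not values from the paper.
import numpy as np

centers = np.array([[20.0, 0.008], [30.0, 0.012], [40.0, 0.016]])  # (T_db, w) per rule
sigmas  = np.array([[ 5.0, 0.003], [ 5.0, 0.003], [ 5.0, 0.003]])
coeffs  = np.array([[0.10, 50.0, 1.0],      # consequent: y = p*T + q*w + r
                    [0.08, 40.0, 2.0],
                    [0.05, 30.0, 3.0]])

def sugeno_predict(x):
    """x = [inlet dry-bulb temperature, humidity ratio] -> crisp output."""
    w = np.exp(-0.5 * np.sum(((x - centers) / sigmas)**2, axis=1))  # firing strengths
    y = coeffs[:, 0] * x[0] + coeffs[:, 1] * x[1] + coeffs[:, 2]    # rule outputs
    return np.dot(w, y) / np.sum(w)                                 # weighted average

print(sugeno_predict(np.array([25.0, 0.010])))
```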

A Monte Carlo Method to Data Stream Analysis

Data stream analysis is the process of computing various summaries and derived values from large amounts of data that are continuously generated at a rapid rate. The nature of a stream does not allow revisiting each data element. Furthermore, data processing must be fast enough to produce timely analysis results. These requirements impose constraints on the design of the algorithms, which must balance correctness against timely responses. Several techniques have been proposed over the past few years to address these challenges. They can be categorized as either data-oriented or task-oriented. The data-oriented approach analyzes a subset of data or a smaller transformed representation, whereas the task-oriented approach solves the problem directly via approximation techniques. We propose a hybrid approach to tackle the data stream analysis problem, in which the data stream is both statistically transformed into a smaller representation and its characteristics are computationally approximated. We adopt a Monte Carlo method in the approximation step. The data reduction is performed horizontally and vertically through our EMR sampling method. The proposed method is analyzed through a series of experiments, in which we apply our algorithm to clustering and classification tasks to evaluate the utility of our approach.
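
Since the abstract does not detail the EMR sampling method, the sketch below uses standard reservoir sampling as a generic stand-in for the one-pass reduction step, followed by a Monte Carlo estimate computed on the retained sample.

```python
# Hedged sketch: one-pass reservoir sampling of a stream followed by a Monte
# Carlo estimate computed on the retained sample.  This is a generic stand-in
# for the paper's EMR sampling step, whose details are not given here.
import random

def reservoir_sample(stream, k):
    """Keep a uniform random sample of size k from a one-pass stream."""
    sample = []
    for i, x in enumerate(stream):
        if i < k:
            sample.append(x)
        else:
            j = random.randint(0, i)
            if j < k:
                sample[j] = x
    return sample

stream = (random.gauss(10.0, 2.0) for _ in range(1_000_000))  # simulated stream
sample = reservoir_sample(stream, 1000)
estimate = sum(sample) / len(sample)     # Monte Carlo estimate of the stream mean
print(f"estimated mean from 1000-element sample: {estimate:.3f}")
```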

XPM Response of Multiple Quantum Well chirped DFB-SOA All Optical Flip-Flop Switching

In this paper, based on the coupled-mode and carrier rate equations, a dynamic model of an MQW chirped DFB-SOA all-optical flip-flop is derived and analyzed numerically. We analyze the effects of QW and MQW strain and of cross-phase modulation (XPM) on the dynamic response and on the rise and fall times of the DFB-SOA all-optical flip-flop. We show that a strained MQW active region, under optimized conditions, in a DFB-SOA with a chirped grating can significantly improve the switch-on speed limitation of the device, although the fall time is increased. The rise time obtained for such an all-optical flip-flop under the optimized conditions is tr = 255 ps.

Ezilla Cloud Service with Cassandra Database for Sensor Observation System

The main mission of Ezilla is to provide a friendly interface for accessing virtual machines and quickly deploying a high-performance computing environment. Ezilla has been developed by the Pervasive Computing Team at the National Center for High-performance Computing (NCHC). Ezilla integrates Cloud middleware, virtualization technology, and a Web-based Operating System (WebOS) to form a virtual computer in a distributed computing environment. To scale to larger datasets and improve speed, we propose a sensor observation system that handles a huge amount of data in the Cassandra database. The sensor observation system is built on Ezilla and stores sensor raw data in a distributed database. We adopt the Ezilla Cloud service to create virtual machines and log in to them to deploy the sensor observation system. Integrating the sensor observation system with Ezilla makes it possible to deploy the experimental environment quickly and to access a huge amount of data through a distributed database whose replication mechanism protects data security.
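
A minimal sketch of the storage path, assuming the DataStax Python driver and illustrative keyspace, table, and column names (not taken from the paper), might look like this inside an Ezilla-provisioned virtual machine:

```python
# Hedged sketch of writing sensor readings into Cassandra; all identifiers and
# the node address are assumptions for illustration.
from datetime import datetime
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])           # address of a Cassandra node (assumed)
session = cluster.connect()
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS sensors
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 2}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS sensors.raw_data (
        sensor_id text, ts timestamp, value double,
        PRIMARY KEY (sensor_id, ts)
    )
""")
insert = session.prepare(
    "INSERT INTO sensors.raw_data (sensor_id, ts, value) VALUES (?, ?, ?)")
session.execute(insert, ("station-01", datetime.utcnow(), 23.7))
cluster.shutdown()
```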

Forming of Institutional Mechanism of Region's Innovative Development

Regional innovative competitiveness is an integrating characteristic of a region's innovative sphere. It depends on a large variety of parameters connected with the activities of all kinds of economic entities. But the management of these parameters should not be irregular; to avoid this, an institutional system should be formed. This system should carry out strategic management of the factors that have the greatest influence on the region's innovative development. This article is devoted to different aspects of organizing the institutional mechanism of regional development, which is based on management of regional innovative competitiveness parameters. The analysis is based on innovation-active Russian regions, which were compared according to their level of innovative competitiveness. The most important parameters of successful innovative development of a region were then revealed with the help of correlation-regression analysis. The results of the research could be used in the study of a region's innovation policy.

Faults Forecasting System

This paper presents a Faults Forecasting System (FFS) that utilizes statistical forecasting techniques to analyze process variable data in order to forecast fault occurrences. FFS proposes a new idea for dealing with faults: current fault detection techniques analyze the present status of the system variables to check whether the system is currently in a fault state, whereas FFS uses forecasting techniques to predict the timing of future faults before they happen. The proposed model applies a subset modeling strategy and a Bayesian approach in order to reduce the dimensionality of the process variables and improve fault forecasting accuracy. A practical experiment was designed and implemented at Okayama University, Japan, and the comparison shows that the proposed model achieves high forecasting accuracy and forecasts faults ahead of time.
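
As an illustration of the forecasting idea (not the paper's subset/Bayesian model), the sketch below fits a simple linear trend to recent samples of a process variable and predicts when it will cross a fault threshold.

```python
# Hedged sketch: forecast the time at which a drifting process variable will
# cross a fault limit.  The linear trend and threshold are illustrative
# stand-ins for the paper's statistical model.
import numpy as np

t = np.arange(50.0)                          # sample times
x = 0.04 * t + np.random.normal(0, 0.1, 50)  # simulated drifting process variable
threshold = 3.0                              # fault limit (assumed)

slope, intercept = np.polyfit(t, x, 1)       # simple trend estimate
if slope > 0:
    t_fault = (threshold - intercept) / slope
    print(f"fault forecast at t ~= {t_fault:.1f} samples")
else:
    print("no upward trend: no fault forecast")
```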

Scatter Analysis of Fatigue Life and Pore Size Data of Die-Cast AM60B Magnesium Alloy

The scatter behavior of fatigue life in die-cast AM60B alloy was investigated. For comparison, the scatter behavior in rolled AM60B alloy and die-cast A365-T5 aluminum alloy was also studied. The scatter behavior of pore size was investigated as well, in order to discuss the dominant factors governing fatigue life scatter in die-cast materials. The three-parameter Weibull function was suitable for describing the scatter behavior of both fatigue life and pore size. The scatter of fatigue life in die-cast AM60B alloy was almost comparable to that in die-cast A365-T5 alloy, while it was significantly larger than that in the rolled AM60B alloy. The scatter behavior of the pore sizes observed at the fracture nucleation sites on the fracture surfaces was comparable to that observed on the specimen cross-sections and also to that of fatigue life. Therefore, the dominant factor behind the large scatter of fatigue life in die-cast alloys appears to be the large scatter of pore size. This conclusion was confirmed by a fracture mechanics fatigue life prediction in which the pore observed at the fatigue crack nucleation site was treated as a pre-existing crack.
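
A minimal example of the three-parameter Weibull fit used in this kind of scatter analysis, with synthetic placeholder data in place of the measured fatigue lives, is sketched below using SciPy.

```python
# Minimal sketch of fitting a three-parameter Weibull distribution to
# fatigue-life (or pore-size) data.  The data here are synthetic placeholders,
# not the measured values from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
lives = 2.0e4 + 1.5e5 * rng.weibull(1.8, size=40)  # synthetic fatigue lives [cycles]

shape, loc, scale = stats.weibull_min.fit(lives)   # shape, location, scale parameters
print(f"shape = {shape:.2f}, location = {loc:.3g}, scale = {scale:.3g}")

# probability that life falls below a given value, from the fitted distribution
print(stats.weibull_min.cdf(5.0e4, shape, loc=loc, scale=scale))
```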

Pushover Analysis of Short Structures

In this paper, two buildings are first modeled and then analyzed using the nonlinear static (pushover) analysis method under two different conditions in the nonlinear SAP2000 software. In the first condition, the interaction of the soil adjacent to the basement walls is ignored, while in the second case this interaction is modeled using the Gap elements of nonlinear SAP2000. Finally, by comparing the results of the two models, the effects of soil-structure interaction on the period, target point displacement, internal forces, deformed shapes, and base shears are studied. According to the results, this interaction always increases the base shear of the buildings, decreases the period of the structure and the target point displacement, and often decreases the internal forces and displacements.
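
The Gap element behaviour relied on here can be read as a compression-only spring that engages after an initial opening closes; the short sketch below illustrates that force-displacement law with illustrative stiffness values (a reading of the modelling approach, not an extract from the SAP2000 models).

```python
# Hedged sketch of a compression-only "gap" law representing soil contact
# behind a basement wall.  Stiffness and opening values are illustrative.
def gap_force(d, k=5.0e7, opening=0.0):
    """Force for a compression-only gap element.
    d       : relative closing displacement wall->soil [m] (positive = closing)
    k       : contact stiffness [N/m]
    opening : initial gap that must close before contact [m]
    """
    return k * (d - opening) if d > opening else 0.0

for d in (0.0, 0.002, 0.01):
    print(d, gap_force(d, opening=0.001))
```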

Energy Consumption in Forward Osmosis Desalination Compared to other Desalination Techniques

The draw solute separation process in forward osmosis desalination was simulated in the Aspen Plus chemical process modeling software to estimate the energy consumption and compare it with that of other desalination processes, mainly the reverse osmosis process, which is currently the most prevalent. The electrolyte chemistry for the system was retrieved using the electrolyte NRTL (Elec-NRTL) property method in the Aspen Plus database. The electrical equivalent of the energy required by the forward osmosis desalination technique was estimated and compared with those of the prevalent desalination techniques.

Closed Form Optimal Solution of a Tuned Liquid Column Damper Responding to Earthquake

In this paper, the vibration behavior of a structure equipped with a tuned liquid column damper (TLCD) under a harmonic type of earthquake loading is studied. Owing to the inherent nonlinear liquid damping, a great deal of computational effort is required to search for the optimum parameters of the TLCD numerically. Therefore, by linearizing the equation of motion of the single-degree-of-freedom structure equipped with the TLCD, closed-form solutions of the TLCD-structure system are derived. To verify the reliability of the analytical method, the results are compared with those of other researchers and show good agreement. Further, the effects of optimal design parameters such as the length ratio and mass ratio on the performance of the TLCD in controlling the responses of the structure are investigated using the harmonic type of earthquake excitation. Finally, the Citicorp Center, which has a very flexible structure, is used as an example to illustrate the design procedure for the TLCD under earthquake excitation.
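
As an illustration of the linearized model, the sketch below evaluates the steady-state harmonic response of a generic two-degree-of-freedom structure-TLCD system; the parameter values and the equivalent linear damping coefficient are assumed placeholders rather than the paper's optimal design values.

```python
# Hedged sketch of the frequency response of a linearized structure-TLCD
# system under a unit harmonic force on the structure, using a common
# two-degree-of-freedom formulation.  All values are illustrative.
import numpy as np

ms, ks, cs = 1.0e5, 4.0e6, 1.0e4      # structure mass, stiffness, damping (assumed)
mu, alpha = 0.02, 0.7                 # mass ratio and horizontal length ratio (assumed)
mf = mu * ms                          # liquid mass
L  = 2.0                              # total liquid column length [m]
kf = 2.0 * mf * 9.81 / L              # liquid restoring "stiffness"
cf = 0.05 * 2.0 * np.sqrt(kf * mf)    # equivalent linearized liquid damping (assumed)

M = np.array([[ms + mf, alpha * mf], [alpha * mf, mf]])
C = np.diag([cs, cf])
K = np.diag([ks, kf])
F = np.array([1.0, 0.0])              # unit harmonic force on the structure

for w in np.linspace(1.0, 12.0, 6):   # forcing frequencies [rad/s]
    H = np.linalg.solve(K - w**2 * M + 1j * w * C, F)
    print(f"w = {w:5.2f} rad/s, |x_structure| = {abs(H[0]):.3e} m")
```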

A Multiresolution Approach for Noised Texture Classification based on the Co-occurrence Matrix and First Order Statistics

The wavelet transform provides several important characteristics which can be used in texture analysis and classification. In this work, an efficient texture classification method, which combines concepts from wavelets and co-occurrence matrices, is presented. A Euclidean distance classifier is used to evaluate the various classification methods. A comparative study is essential to determine the best-performing method. On this basis, we developed a novel feature set for texture classification and demonstrate its effectiveness.
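
One plausible reading of the combined feature set, sketched below with assumed subband statistics and GLCM properties (the exact configuration is not specified in the abstract), uses PyWavelets and scikit-image with a Euclidean nearest-neighbour decision.

```python
# Hedged sketch of combined wavelet / co-occurrence texture features and a
# Euclidean distance classifier.  Feature choices are illustrative, not the
# authors' exact configuration.
import numpy as np
import pywt
from skimage.feature import graycomatrix, graycoprops

def texture_features(img_u8):
    """img_u8: 2-D uint8 texture patch -> 1-D feature vector."""
    feats = []
    # first-order statistics of wavelet detail subbands
    coeffs = pywt.wavedec2(img_u8.astype(float), "db2", level=2)
    for level in coeffs[1:]:
        for band in level:
            feats += [band.mean(), band.std()]
    # co-occurrence features on the quantized image
    glcm = graycomatrix(img_u8 // 4, distances=[1], angles=[0, np.pi / 2],
                        levels=64, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        feats += list(graycoprops(glcm, prop).ravel())
    return np.array(feats)

def classify(query, references):
    """Nearest class by Euclidean distance; references maps name -> feature vector."""
    return min(references, key=lambda name: np.linalg.norm(references[name] - query))
```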

Fabrication and Characterization of Poly-Si Vertical Nanowire Thin Film Transistor

In this paper, we present a vertical nanowire thin film transistor with a gate-all-around architecture, fabricated using CMOS-compatible processes. A novel method of fabricating polysilicon vertical nanowires with diameters as small as 30 nm using wet etching is presented. Both n-type and p-type vertical polysilicon nanowire transistors exhibit superior electrical characteristics compared to planar devices. On a polycrystalline nanowire of 30 nm diameter, a high Ion/Ioff ratio of 10^6, a low drain-induced barrier lowering (DIBL) of 50 mV/V, and a low subthreshold slope of SS ≈ 100 mV/dec are demonstrated for a device with a channel length of 100 nm.
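
For readers unfamiliar with the quoted figures of merit, the sketch below shows how SS and DIBL could be extracted from transfer curves; the current arrays are synthetic placeholders, not measured device data.

```python
# Hedged sketch: subthreshold slope from the Vg-log10(Id) slope and DIBL from
# the threshold-voltage shift between low and high drain bias, on synthetic
# (placeholder) transfer curves.
import numpy as np

vg = np.linspace(0.0, 1.0, 101)                     # gate voltage sweep [V]
id_low  = 1e-12 * 10**(vg / 0.100)                  # synthetic Id at Vd = 0.05 V
id_high = 1e-12 * 10**((vg + 0.05 * 0.95) / 0.100)  # synthetic Id at Vd = 1.0 V

def subthreshold_slope(vg, id_a):
    """Minimum dVg/dlog10(Id) in mV/dec over the sweep."""
    slope = np.gradient(vg, np.log10(id_a))
    return 1e3 * slope.min()

def vth_const_current(vg, id_a, i_crit=1e-8):
    """Threshold voltage at a constant-current criterion (assumed 10 nA)."""
    return np.interp(i_crit, id_a, vg)

ss = subthreshold_slope(vg, id_low)
dibl = 1e3 * (vth_const_current(vg, id_low) - vth_const_current(vg, id_high)) / (1.0 - 0.05)
print(f"SS ~ {ss:.0f} mV/dec, DIBL ~ {dibl:.0f} mV/V")
```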

Design, Development and Implementation of a Temperature Sensor using ZigBee Concepts

This paper deals with the design, development, and implementation of a temperature sensor using ZigBee. The main aim of the work undertaken in this paper is to sense the temperature and display the result on an LCD using ZigBee technology. ZigBee operates in the industrial, scientific and medical (ISM) radio bands: 868 MHz in Europe, 915 MHz in the USA, and 2.4 GHz in most jurisdictions worldwide. The technology is intended to be simpler and cheaper than other WPANs such as Bluetooth. The most capable ZigBee node type is said to require only about 10% of the software of a typical Bluetooth or wireless Internet node, while the simplest nodes require about 2%. However, actual code sizes are much higher, closer to 50% of the Bluetooth code size, and ZigBee chip vendors have announced 128-kilobyte devices. In the system designed and developed in this work, the sensed temperature signal is amplified and fed to a microcontroller, which is connected to a ZigBee module that transmits the data; at the other end, a ZigBee receiver reads the data and displays it on the LCD. The software developed is accurate and operates at high speed, demonstrating the effectiveness of the scheme employed.

REDD: Reliable Energy-Efficient Data Dissemination in Wireless Sensor Networks with Multiple Mobile Sinks

In wireless sensor networks (WSNs), the use of mobile sinks has been attracting increasing attention in recent times. Mobile sinks are a more effective means of balancing load, reducing the hotspot problem, and extending network lifetime. The sensor nodes in a WSN have limited power supply, computational capability, and storage, so reliability becomes a high priority for continuous data delivery in these networks. In this paper, we propose a Reliable Energy-efficient Data Dissemination (REDD) scheme for WSNs with multiple mobile sinks. In this strategy, the sink first determines the location of the source and then communicates directly with the source using geographic forwarding. Every forwarding node (FN) creates a local zone comprising some sensor nodes that can act as representatives of the FN when it fails. Analytical and simulation studies reveal significant improvement in energy conservation and reliable data delivery in comparison to existing schemes.
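
The forwarding step can be pictured as greedy geographic forwarding toward the sink's known location; the sketch below is a simplified stand-in with illustrative coordinates, and the local-zone failover is only indicated in a comment.

```python
# Hedged sketch of greedy geographic forwarding: each node forwards the packet
# to the live neighbour closest to the mobile sink.  Coordinates are
# illustrative; the "representative zone" failover is not fully modelled.
import math

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def next_hop(current, neighbours, sink):
    """Pick the live neighbour closest to the sink, if it makes progress."""
    candidates = [n for n, alive in neighbours if alive]  # a failed FN would be replaced
                                                          # by a node from its local zone
    best = min(candidates, key=lambda n: dist(n, sink), default=None)
    return best if best is not None and dist(best, sink) < dist(current, sink) else None

sink = (90.0, 10.0)
hop = next_hop((10.0, 10.0), [((30.0, 12.0), True), ((25.0, 40.0), True)], sink)
print("forward to", hop)
```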

CAD/CAM Algorithms for 3D Woven Multilayer Textile Structures

This paper proposes new algorithms for the computer-aided design and manufacture (CAD/CAM) of 3D woven multi-layer textile structures. Existing commercial CAD/CAM systems are often restricted to the design and manufacture of 2D weaves. Those CAD/CAM systems that do support the design and manufacture of 3D multi-layer weaves are often limited to manual editing of design paper grids on the computer display and weave retrieval from stored archives. This complex design activity is time-consuming, tedious, and error-prone, and requires the considerable experience and skill of a technical weaver. Recent research reported in the literature has addressed some of the shortcomings of commercial 3D multi-layer weave CAD/CAM systems. However, earlier research results have shown the need for further work on weave specification, weave generation, yarn path editing, and layer binding. Analysis of 3D multi-layer weaves in this research has led to the design and development of efficient and robust algorithms for the CAD/CAM of 3D woven multi-layer textile structures. The resulting algorithmically generated weave designs can be used as a basis for lifting plans that can be loaded onto looms equipped with electronic shedding mechanisms for the CAM of 3D woven multi-layer textile structures.
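
As a small illustration of algorithmic weave generation (far simpler than the multi-layer structures addressed in the paper), the sketch below builds the binary interlacement grid of a regular weave, the kind of matrix from which a lifting plan can be derived.

```python
# Hedged sketch of algorithmic weave generation: the interlacement grid of a
# simple regular weave as a binary matrix (1 = warp over weft).  Multi-layer
# binding rules from the paper are not reproduced here.
import numpy as np

def regular_weave(n_warp, n_weft, step=1, floats=1):
    """Binary interlacement matrix of a regular weave (plain weave by default)."""
    grid = np.zeros((n_weft, n_warp), dtype=int)
    for i in range(n_weft):
        for j in range(n_warp):
            grid[i, j] = 1 if (j - i * step) % (floats + 1) == 0 else 0
    return grid

plain = regular_weave(8, 8)                      # plain weave: alternating interlacement
twill = regular_weave(8, 8, step=1, floats=2)    # warp over 1, under 2 (twill-like diagonal)
print(plain)
print(twill)
```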

Multiproject Scheduling in Construction Industry

In this paper, the supply policy and procurement of shared resources in certain kinds of concurrent construction projects are investigated. This is oriented toward the problems of holding construction companies that are involved in several projects concurrently and must supply limited resources to those projects while preventing delays to any of them. Limits on transportation vehicles and storage facilities for potential construction materials, as well as the available resources (such as cash or manpower), are examples of factors that considerably affect the overall management of the projects. The research includes an investigation of several real multi-storey buildings during their execution periods and a survey of the history of their activities. It is shown that the common resource demand variation curve of the projects may be expanded or displaced to achieve an optimum distribution scheme. This may cause some delay to some projects, but it has minimal influence on the overall execution period of all projects, while its influence on the procurement cost of the projects is considerable. These observations, drawn from multi-storey buildings constructed in Iran, are presented in this paper.

Neuro-Fuzzy Network Based On Extended Kalman Filtering for Financial Time Series

A neural network's performance can be measured by its efficiency and accuracy. The major disadvantages of the neural network approach are that the generalization capability of neural networks is often significantly low, and it may take a very long time to tune the weights in the network to generate an accurate model for highly complex and nonlinear systems. This paper presents a novel neuro-fuzzy architecture based on the extended Kalman filter. To test the performance and applicability of the proposed neuro-fuzzy model, a simulation study of a complex nonlinear dynamic system is carried out. The proposed method can be applied to on-line incremental adaptive learning for the prediction of financial time series. A benchmark case study is used to demonstrate that the proposed model is a superior neuro-fuzzy modeling technique.
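
The parameter-update mechanics of such an EKF-based learning scheme can be sketched as below; the state is the parameter vector and the "model" is a trivial linear stand-in so the filter steps stay visible (this is not the paper's neuro-fuzzy architecture itself).

```python
# Hedged sketch of an extended-Kalman-filter parameter update: the model
# parameters are treated as the state and corrected from each prediction error.
import numpy as np

theta = np.zeros(2)                 # parameters to learn (state estimate)
P = np.eye(2) * 100.0               # parameter covariance
Q = np.eye(2) * 1e-4                # process noise (random-walk parameters)
R = 0.01                            # measurement noise variance

def model(theta, x):
    return theta[0] * x + theta[1]  # stand-in model; a neuro-fuzzy net would go here

def jacobian(theta, x):
    return np.array([x, 1.0])       # d(model)/d(theta)

for x, y in [(0.5, 1.9), (1.0, 2.8), (1.5, 3.9), (2.0, 5.1)]:   # toy series samples
    H = jacobian(theta, x)
    P = P + Q
    S = H @ P @ H + R                         # innovation variance (scalar)
    K = P @ H / S                             # Kalman gain
    theta = theta + K * (y - model(theta, x))
    P = P - np.outer(K, H @ P)
print("learned parameters:", theta)
```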

A Real-Time Rendering based on Efficient Updating of Static Objects Buffer

Real-time 3D applications have to guarantee interactive rendering speed, yet the number of polygons that can be rendered is restricted by the performance of the graphics hardware and graphics algorithms. Generally, rendering performance increases drastically when handling only the dynamic 3D models, which are far fewer than the static ones. Since the shapes and colors of the static objects do not change while the viewing direction is fixed, their rendered image can be reused. We render huge numbers of polygons, which cannot be handled by conventional rendering techniques in real time, by using a static-object image and merging it with the rendering result of the dynamic objects. Performance necessarily drops whenever the static-object image must be updated, which includes removing a static object that starts to move and re-rendering the other static objects overlapped by the moving one. Based on the visibility of the object beginning to move, we can skip this updating process. As a result, we enhance rendering performance and reduce the differences in rendering speed between frames. The proposed method renders a total of 200,000,000 polygons, of which 500,000 are dynamic and the rest are static, at about 100 frames per second.
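
The buffer-update decision can be sketched as the following control flow, with rendering stubbed out by simple placeholders; object names and the visibility query are assumptions for illustration.

```python
# Hedged sketch of the frame-loop decision described above: the static scene is
# rendered once into a cached buffer and reused, and the cache is rebuilt only
# when a static object that is actually visible starts to move.  Real draw
# calls are replaced by string placeholders so only the control flow remains.
visible = {"terrain", "car"}                            # visibility query result (assumed)

def render_objects(objects):
    return f"image({', '.join(sorted(objects & visible))})"  # stand-in for a draw call

def compose(static_image, dynamic_image):
    return static_image + " + " + dynamic_image              # stand-in for buffer merging

static_objs, dynamic_objs = {"terrain", "buildings"}, {"car"}
static_buffer = render_objects(static_objs)             # cached once while the view is fixed

def on_object_starts_moving(obj):
    global static_buffer
    static_objs.discard(obj)
    dynamic_objs.add(obj)
    if obj in visible:                                   # invisible object: skip the update
        static_buffer = render_objects(static_objs)

def frame():
    return compose(static_buffer, render_objects(dynamic_objs))

print(frame())
on_object_starts_moving("buildings")    # not visible: cached static buffer is kept
print(frame())
```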