An Improved Tie Force Method for Progressive Collapse Resistance of Precast Concrete Cross Wall Structures

Progressive collapse of buildings typically occurs when abnormal loading conditions cause local damage that leads to a chain reaction of failures and, ultimately, catastrophic collapse. The tie force (TF) method is one of the main design approaches for progressive collapse resistance. As the TF method is a simplified method, further investigation of its reliability is necessary. This study aims to develop an improved TF method for designing cross wall structures against progressive collapse. To this end, the pullout behavior of strands in grout was first analyzed; then, by considering the tie force-slip relationship in the friction stage together with the catenary action mechanism, a comprehensive analytical method was developed. The reliability of this approach is verified against the experimental results of concrete block pullout tests and full-scale floor-to-floor joint tests undertaken by the Portland Cement Association (PCA). Discrepancies in the tie force between the analytical results and codified specifications reveal the deficiency of the TF method; hence, an improved model based on the analytical results is proposed to address this concern.
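For context, the codified basic tie force that such analytical results are typically checked against takes a simple empirical form; the following is a sketch of the familiar UK-style provision (the specific code edition used in the paper is not stated in the abstract), with $n_0$ the number of storeys:

$$ F_t = \min\left(20 + 4\,n_0,\; 60\right)\ \text{kN} $$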

Carbon Nanotubes Synthesized Using Sugar Cane as a Precursor

This article deals with carbon nanotubes (CNTs) synthesized using sugar cane as a novel precursor together with Anodic Aluminum Oxide (AAO). The objective was to produce CNTs for use as catalyst supports for Proton Exchange Membranes. The influence of temperature, inert gas flow rate, and precursor concentration is presented. The prepared CNTs were characterized using TEM, XRD, and Raman spectroscopy, and their surface area was determined by BET. The results show that it is possible to form CNTs from sugar cane by pyrolysis and that they are of the multi-walled type. The MWCNTs are short and closed at both ends, with a very small surface area of S_BET = 3.691 m²/g.

Extreme Rainfall Frequency Analysis for Meteorological Sub-Division 4 of India Using L-Moments

Extreme rainfall frequency analysis for Meteorological Sub-Division 4 of India was carried out using the L-moments approach. Serial correlation and Mann-Kendall tests were conducted to check the serial independence and stationarity of the observations. The discordancy measure was computed for the sites to detect discordant sites. Regional homogeneity was tested by comparison with 500 generated homogeneous regions based on the four-parameter Kappa distribution. The best-fit distribution was selected based on the Z-DIST statistic and the L-moment ratio diagram from the five extreme value distributions GPD, GLO, GEV, P3, and LP3. The LN3 distribution was selected, and a regional rainfall frequency relationship was established using the index-rainfall procedure. A regional mean rainfall relationship was developed using multiple linear regression with the latitude and longitude of the sites as variables.
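As a sketch of the core computation behind this approach, the sample L-moments and L-moment ratios of a site's series can be obtained from probability-weighted moments (Hosking's unbiased estimators); the rainfall series below is a synthetic placeholder, not the study's data:

```python
# Sample L-moments via probability-weighted moments (Hosking's estimators).
import numpy as np

def sample_lmoments(x):
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    j = np.arange(1, n + 1)
    b0 = x.mean()
    b1 = np.sum((j - 1) / (n - 1) * x) / n
    b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
    b3 = np.sum((j - 1) * (j - 2) * (j - 3) /
                ((n - 1) * (n - 2) * (n - 3)) * x) / n
    l1 = b0
    l2 = 2 * b1 - b0
    l3 = 6 * b2 - 6 * b1 + b0
    l4 = 20 * b3 - 30 * b2 + 12 * b1 - b0
    # Returns mean, L-CV (tau), L-skewness (tau3), L-kurtosis (tau4).
    return l1, l2 / l1, l3 / l2, l4 / l2

rng = np.random.default_rng(8)
annual_max = rng.gumbel(100, 25, 50)   # 50 years of synthetic annual maxima
l1, t, t3, t4 = sample_lmoments(annual_max)
print(f"mean={l1:.1f}  L-CV={t:.3f}  L-skew={t3:.3f}  L-kurt={t4:.3f}")
```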

Optimal Trailing Edge Flap Positions of Helicopter Rotor for Various Thrust Coefficients to Solidity (Ct/σ) Ratios

This study aims to determine the change in the optimal locations of dual trailing-edge flaps for various thrust coefficient to solidity (Ct/σ) ratios of a helicopter rotor to achieve minimum hub vibration levels with a low penalty in terms of required trailing-edge flap control power. Polynomial response functions are used to approximate the hub vibration and flap power objective functions. Single-objective and multi-objective optimizations are carried out with the objective of minimizing hub vibration and flap power. The optimization results show that, for minimum hub vibration, the inboard flap location moves farther from the baseline value at low Ct/σ ratios and toward the blade root at high Ct/σ ratios.
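The workflow of approximating the objectives with polynomial response functions and then optimizing on the surrogate can be sketched as follows; the quadratic test function standing in for the aeroelastic hub vibration analysis, the variable bounds, and the sample size are all illustrative assumptions:

```python
# Response-surface surrogate plus optimizer (a sketch, not the paper's code).
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from scipy.optimize import minimize

rng = np.random.default_rng(7)

def hub_vibration(x):
    # Hypothetical stand-in for the expensive aeroelastic evaluation;
    # x = (inboard flap location, outboard flap location), fractions of R.
    return (x[0] - 0.55) ** 2 + 2 * (x[1] - 0.85) ** 2 + 0.1 * x[0] * x[1]

# Sample design points and fit a quadratic polynomial response surface.
X = rng.uniform([0.3, 0.6], [0.7, 0.95], (40, 2))
y = np.apply_along_axis(hub_vibration, 1, X)
poly = PolynomialFeatures(degree=2)
surrogate = LinearRegression().fit(poly.fit_transform(X), y)

# Optimize on the cheap surrogate instead of the true objective.
res = minimize(lambda x: surrogate.predict(poly.transform(x.reshape(1, -1)))[0],
               x0=[0.5, 0.8], bounds=[(0.3, 0.7), (0.6, 0.95)])
print("optimal flap locations (fraction of R):", res.x.round(3))
```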

An Approach for Optimization of Functions and Reducing the Cost of the Product by Using Virtual Models

A newly developed approach to Functional Cost Analysis (FCA), based on virtual prototyping (VP) models in a CAD/CAE environment and applicable and necessary in developing new products, is presented. It is an instrument for improving the value of a product while maintaining its cost and/or reducing the cost of the product without reducing its value. Five broad classes of VP methods are identified. Efficient use of prototypes in FCA is a vital activity that can make the difference between successful and unsuccessful entry of new products into the competitive world market. Successful realization of this approach is illustrated for a specific example using a press-joint power tool.
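For reference, FCA and value analysis commonly quantify this trade-off with the classical value ratio (not spelled out in the abstract), where $F$ is a measure of function (worth or performance) and $C$ is cost; raising $F$ at constant $C$, or holding $F$ while lowering $C$, both raise value:

$$ V = \frac{F}{C} $$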

Effect of Transverse Reinforcement on the Behavior of Tension Lap Splice in High-Strength Reinforced Concrete Beams

The results of an experimental program conducted on seventeen simply supported concrete beams to study the effect of transverse reinforcement on the behavior of lap splices of steel reinforcement in tension zones of high-strength concrete beams are presented. The parameters of the experimental program were the concrete compressive strength, the lap splice length, the amount of transverse reinforcement provided within the splice region, and the shape of the transverse reinforcement around the spliced bars. The experimental results showed that the displacement ductility increased and the failure mode changed from splitting bond failure to flexural failure when the amount of transverse reinforcement in the splice region increased and the compressive strength increased up to 100 MPa. The presence of transverse reinforcement around the spliced bars had a pronounced effect on increasing the ultimate load, the ultimate deflection, and the displacement ductility. The maximum steel stresses in the spliced bars predicted by the ACI 318-05 building code were compared with the experimental results. The comparison showed that the effect of transverse reinforcement around spliced bars has to be incorporated into the design equations for lap splice length in high-strength concrete beams.
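For orientation, the ACI 318-05 comparison mentioned above rests on the code's development length expression, of which the Class B splice length is a multiple ($1.3\,l_d$); the following is a sketch in U.S. customary units (see the code for the full notation and limits):

$$ \frac{l_d}{d_b} = \frac{3}{40}\,\frac{f_y}{\sqrt{f'_c}}\;\frac{\psi_t\,\psi_e\,\psi_s\,\lambda}{\left(\dfrac{c_b + K_{tr}}{d_b}\right)}, \qquad \frac{c_b + K_{tr}}{d_b} \le 2.5, \qquad K_{tr} = \frac{A_{tr}\,f_{yt}}{1500\,s\,n} $$

The transverse reinforcement index $K_{tr}$ is where the confining steel around the splice enters the code equation, which is why the abstract argues its effect must be reflected in the design of high-strength concrete beams.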

CFD Study of the Fluid Viscosity Variation and Effect on the Flow in a Stirred Tank

Stirred tanks are widely used across industrial sectors. The need for further studies of the mixing operation and its various aspects arises from the diversity of agitation tools and implemented geometries, in addition to the specific characteristics of each application. Viscous fluids are often encountered in industry and represent the majority of treated cases, as in the polymer, food-processing, pharmaceutical, and cosmetics sectors. In this paper, we therefore present a three-dimensional numerical study, performed with the Fluent software, of the effect of varying the fluid viscosity in a tank stirred by a Rushton turbine. The viscosity variation was achieved by adding carboxymethylcellulose (CMC) to the fluid (water) in the vessel. We first studied the flow generated in the tank by the Rushton turbine, and then the effect of the fluid viscosity variation on the hydrodynamic quantities defining the flow. For this, three viscosities (0.9% CMC, 1.1% CMC, and 1.7% CMC) were considered.
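For orientation, the flow regime in such simulations is set by the impeller Reynolds number (not quoted in the abstract); with $\rho$ the fluid density, $N$ the impeller rotational speed, $D$ the impeller diameter, and $\mu$ the dynamic viscosity, increasing the CMC concentration raises $\mu$ and thus lowers $Re$:

$$ Re = \frac{\rho\,N\,D^2}{\mu} $$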

Genetic Algorithm with Fuzzy Genotype Values and Its Application to Neuroevolution

The author proposes an extension of the genetic algorithm (GA) for solving fuzzy-valued optimization problems. In the proposed GA, the values in the genotypes are not real numbers but fuzzy numbers. The evolutionary processes of the GA are extended so that it can handle genotype instances with fuzzy numbers. The proposed method is applied to evolving neural networks with fuzzy weights and biases. Experimental results showed that fuzzy neural networks evolved by the fuzzy GA could model hidden target fuzzy functions well, despite the fact that no training data were explicitly provided.
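A minimal sketch of the idea follows, assuming triangular fuzzy numbers as genes, blend crossover, and Gaussian mutation (the paper's exact operators and objective may differ):

```python
# GA whose genotype values are triangular fuzzy numbers (l, m, r).
import numpy as np

rng = np.random.default_rng(0)
N_GENES, POP, GENS = 4, 30, 100

def random_genome():
    # Each gene is a triangular fuzzy number (left, mode, right).
    m = rng.uniform(-1, 1, N_GENES)
    spread = rng.uniform(0.05, 0.5, (2, N_GENES))
    return np.stack([m - spread[0], m, m + spread[1]], axis=1)  # (N_GENES, 3)

def defuzzify(genome):
    return genome.mean(axis=1)            # centroid of a triangular number

def fitness(genome):
    # Toy objective: crisp (defuzzified) values should match a target vector.
    target = np.array([0.5, -0.2, 0.8, 0.1])
    return -np.sum((defuzzify(genome) - target) ** 2)

def crossover(a, b):
    w = rng.random((N_GENES, 1))
    return w * a + (1 - w) * b            # componentwise blend keeps l <= m <= r

def mutate(g, sigma=0.05):
    g = g + rng.normal(0, sigma, g.shape)
    return np.sort(g, axis=1)             # restore l <= m <= r after the jitter

pop = [random_genome() for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)   # elitist selection
    elite = pop[: POP // 2]
    children = []
    while len(children) < POP - len(elite):
        i, j = rng.choice(len(elite), size=2, replace=False)
        children.append(mutate(crossover(elite[i], elite[j])))
    pop = elite + children
pop.sort(key=fitness, reverse=True)
print("best crisp gene values:", defuzzify(pop[0]).round(3))
```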

Performance Analysis of Wavelet Based Multiuser MIMO OFDM

Wavelet analysis has some strong advantages over Fourier analysis, as it allows a time-frequency domain analysis with optimal resolution and flexibility. As a result, wavelets have been successfully applied in almost all fields of communication systems, including OFDM, which is a strong candidate for the next generation of wireless technology. In this paper, the performance of wavelet-based Multiuser Multiple Input Multiple Output Orthogonal Frequency Division Multiplexing (MU-MIMO OFDM) systems is analyzed in terms of BER. It is shown that the wavelet-based systems outperform classical FFT-based systems. The analysis also reveals an interesting result: with Regularized Channel Inversion (RCI) beamforming, the wavelet-based OFDM system has a constant error performance for any number of users and outperforms the alternatives in all scenarios considered in a multiuser environment. Extensive computer simulations show that a PAPR reduction of up to 6.8 dB can be obtained with M = 64.
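The modulation principle can be sketched with a discrete wavelet transform pair, where the inverse transform synthesizes the transmit signal and the forward transform recovers the symbols; the Haar wavelet, periodization mode, BPSK mapping, and toy AWGN channel below are illustrative assumptions, not the paper's system:

```python
# Wavelet-OFDM in miniature: symbols ride on DWT subband coefficients.
import numpy as np
import pywt

rng = np.random.default_rng(1)
N = 32                                    # symbols per subband
# BPSK symbols placed on the approximation and detail subbands.
cA = rng.choice([-1.0, 1.0], N)
cD = rng.choice([-1.0, 1.0], N)

tx = pywt.idwt(cA, cD, 'haar', mode='periodization')  # synthesis = modulator

noise = 0.05 * rng.standard_normal(tx.shape)          # toy AWGN channel
rx = tx + noise

cA_hat, cD_hat = pywt.dwt(rx, 'haar', mode='periodization')  # analysis = demodulator
ok = np.all(np.sign(cA_hat) == cA) and np.all(np.sign(cD_hat) == cD)
print("recovered all symbols:", ok)
```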

Dual-Network Memory Model for Temporal Sequences

In neural networks, when new patterns are learned by a network, they radically interfere with previously stored patterns. This drawback is called catastrophic forgetting. We have previously proposed a biologically inspired dual-network memory model which can greatly reduce this forgetting for static patterns. In this model, information is first stored in the hippocampal network and thereafter transferred to the neocortical network using pseudopatterns. Because temporal sequence learning is more important than static pattern learning in the real world, in this study we improve our conventional dual-network memory model so that it can deal with temporal sequences without catastrophic forgetting. Computer simulation results show the effectiveness of the proposed dual-network memory model.
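The transfer mechanism can be sketched as follows for static patterns (the paper's contribution extends this to temporal sequences); the network sizes, toy tasks, and probe counts are illustrative assumptions:

```python
# Pseudopattern-based consolidation between two networks (a sketch).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
D = 8

def make_inputs(n, first_bit):
    X = rng.choice([0.0, 1.0], (n, D))
    X[:, 0] = first_bit        # keep the two tasks on disjoint input regions
    return X

# 1) The neocortical network already knows an "old" mapping.
X_old = make_inputs(200, 0.0)
neocortex = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
neocortex.fit(X_old, np.roll(X_old, 1, axis=1))

# 2) The hippocampal network rapidly learns a "new" mapping.
X_new = make_inputs(200, 1.0)
hippocampus = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
hippocampus.fit(X_new, np.roll(X_new, 3, axis=1))

# 3) Consolidation: random probes elicit pseudopatterns from the neocortex
#    itself (rehearsing old knowledge) and from the hippocampus (carrying
#    the new knowledge); retraining on both counters catastrophic forgetting.
probes_old = make_inputs(400, 0.0)
probes_new = make_inputs(400, 1.0)
pseudo_in = np.vstack([probes_old, probes_new])
pseudo_out = np.vstack([neocortex.predict(probes_old),
                        hippocampus.predict(probes_new)])
neocortex.fit(pseudo_in, pseudo_out)
```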

Leakage Reduction ONOFIC Approach for Deep Submicron VLSI Circuit Design

Minimization of power dissipation and chip area together with higher circuit performance are the key requirements in the deep submicron regime. Leakage current increases sharply in the deep submicron regime and directly affects the power dissipation of logic circuits. In the deep submicron region, power dissipation as well as high performance is a crucial concern, given the increasing importance of portable systems. A number of leakage reduction techniques have been employed to reduce the leakage current in the deep submicron region, but each involves trade-offs in controlling the leakage current. The ONOFIC approach gives an excellent compromise between power dissipation and propagation delay for designing efficient CMOS logic circuits. In this article, the ONOFIC approach is compared with the LECTOR technique, and the results show that the ONOFIC approach significantly reduces the power dissipation and enhances the speed of the logic circuits. The lower power-delay product is the major outcome of this approach and makes it an effective leakage reduction technique.
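For reference, the figure of merit cited at the end is the standard power-delay product (its definition is not given in the abstract), with $P_{\mathrm{avg}}$ the average power dissipation and $t_{pd}$ the average propagation delay:

$$ \mathrm{PDP} = P_{\mathrm{avg}} \cdot t_{pd}, \qquad t_{pd} = \tfrac{1}{2}\left(t_{pHL} + t_{pLH}\right) $$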

Consumer Online Shopping Behavior: The Effect of Internet Marketing Environment, Product Characteristics, Familiarity and Confidence, and Promotional Offer

Online shopping enables consumers to search for information and purchase products or services through direct interaction with an online store. This study aims to examine the effect of the Internet marketing environment, product characteristics, familiarity and confidence, and promotional offers on consumer online shopping behavior. 200 questionnaires were distributed to the respondents, students and staff at a public university in the Federal Territory of Labuan, Malaysia, selected by simple random sampling. Multiple regression analysis was used as a statistical measure to determine the strength of the relationship between one dependent variable and a series of independent variables. Results revealed that familiarity and confidence most strongly influenced consumer online shopping behavior, followed by promotional offers. A clear understanding of consumer online shopping behavior can help marketing managers predict the online shopping rate and evaluate the future growth of online commerce.
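A minimal sketch of the kind of multiple regression reported above is given below, with synthetic data; the predictor names, effect sizes, and response scales are illustrative assumptions, not the study's data:

```python
# Multiple regression of shopping behavior on four predictors (a sketch).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 200                                   # sample size from the abstract
X = rng.normal(3.5, 0.7, size=(n, 4))     # 4 predictors on a Likert-like scale
beta = np.array([0.10, 0.15, 0.45, 0.30]) # hypothetical effect sizes
y = 0.5 + X @ beta + rng.normal(0, 0.5, n)

model = sm.OLS(y, sm.add_constant(X)).fit()
print(model.summary(xname=['const', 'marketing_env', 'product_char',
                           'familiarity_conf', 'promo_offer']))
```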

Exploring Additional Intention Predictors within Dietary Behavior among Type 2 Diabetics

Objective: This study explored the possibility of integrating health belief concepts as additional predictors of intention to adopt a recommended diet-category within the Theory of Planned Behavior (TPB). Methods: The study adopted a sequential exploratory mixed-methods approach. Qualitative data were generated on attitude, subjective norm, perceived behavioral control, and perceptions of predetermined diet-categories, including perceived susceptibility, perceived benefits, perceived severity, and cues to action. Synthesis of the qualitative data was done using a constant comparative approach during phase 1. A survey tool developed from the qualitative results was used to collect information on the same concepts from 237 eligible Type 2 diabetics. Data analysis included the use of Structural Equation Modeling in AMOS (Analysis of Moment Structures) to explore the possibility of including perceived susceptibility, perceived benefits, perceived severity, and cues to action as additional intention predictors in a single nested model. Results: Two models, one nested on the traditional TPB model {χ² = 223.3, df = 77, p = .02, χ²/df = 2.9; TLI = .93; CFI = .91; RMSEA (90% CI) = .090 (.039, .146)} and the newly proposed Planned Behavior Health Belief (PBHB) model {χ² = 743.47, df = 301, p = .019; TLI = .90; CFI = .91; RMSEA (90% CI) = .079 (.031, .14)}, passed the goodness-of-fit tests based on the common fit indicators used. Conclusion: The newly developed PBHB model ranked higher than the traditional TPB model with reference to the chi-square ratios (PBHB: χ²/df = 2.47, p = .019 against TPB: χ²/df = 2.9, p = .02). The integrated model can be used to motivate Type 2 diabetics towards healthy eating.
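For orientation, the two fit criteria leaned on in the conclusion are the chi-square ratio $\chi^2/df$ (commonly judged acceptable below about 3) and the RMSEA, whose usual point estimate for a sample of size $N$ is given below; the threshold and formula are standard SEM conventions, not taken from the paper:

$$ \mathrm{RMSEA} = \sqrt{\frac{\max\left(\chi^2 - df,\; 0\right)}{df\,(N-1)}} $$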

Study of the Effects of Ceramic Nano-Pigments on Cement Mortar Corrosion Caused by Chloride Ions

Superfine pigments, consisting of natural and artificial pigments made of mineral soil with special characteristics, are used in cementitious materials for various purposes. These pigments can decrease the amount of cement needed without loss of performance and strength, and can also change the monotonous and turbid colours of concrete into various attractive and light colours. In this study, the mechanical strength and the resistance against chloride and halogen attack of cement mortars containing ceramic nano-pigments in a chloride-affected environment are studied. This research suggests utilising ceramic nano-pigments between 50 and 1000 nm in size, obtaining full-depth coloured concrete, preventing chloride penetration into the concrete up to a certain depth, and monitoring corrosion of the steel rebar with a Potentiostat (EG&G) apparatus.

The Automated Selective Acquisition System

To support the design process and launch products on time, the reverse engineering (RE) process has been introduced for quickly generating a 3D CAD model from its physical object. The accuracy of the 3D CAD model depends upon the data acquisition technique selected: contact or non-contact. In order to reduce the time used for acquiring surfaces and to eliminate noise, an automated selective acquisition system has been developed and is presented in this research as an alternative non-contact acquisition technique in which the data is selectively and locally scanned, contour by contour, without performing a data reduction process. The results are organized contour points which are directly used to generate the 3D virtual model. A comparison between the proposed technique and another non-contact scanning technique is presented and discussed.

Time Series Regression with Meta-Clusters

This paper presents a preliminary attempt to apply classification of time series using meta-clusters in order to improve the quality of regression models. Here, clustering was performed to obtain subgroups of normally distributed time series data from wastewater treatment plant inflow data composed of several groups differing in mean value. Two simple algorithms, K-means and EM, were chosen as clustering methods, and the Rand index was used to measure their similarity. After this simple meta-clustering, a regression model was built for each subgroup; the final model was the sum of the subgroup models. The quality of the obtained model was compared with that of a regression model built from the same explanatory variables but without clustering of the data. The results were compared using the coefficient of determination (R²), the mean absolute percentage error (MAPE) as a measure of prediction accuracy, and comparison on a linear chart. Preliminary results allow us to foresee the potential of the presented technique.
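A minimal sketch of the cluster-then-regress idea follows, with a synthetic stand-in for the inflow data; the cluster count, features, and the use of the adjusted Rand index (a variant of the Rand index) are assumptions:

```python
# Cluster the series into subgroups, fit one regression per subgroup,
# and compare against a single un-clustered regression via MAPE.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.mixture import GaussianMixture
from sklearn.linear_model import LinearRegression
from sklearn.metrics import adjusted_rand_score, mean_absolute_percentage_error

rng = np.random.default_rng(4)
n = 300
regime = rng.integers(0, 2, n)            # two regimes differing in mean
x = rng.normal(0, 1, (n, 2))
y = np.where(regime == 0, 50, 120) + x @ np.array([5.0, -3.0]) + rng.normal(0, 2, n)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(y.reshape(-1, 1))
em = GaussianMixture(n_components=2, random_state=0).fit(y.reshape(-1, 1))
labels_km = km.labels_
labels_em = em.predict(y.reshape(-1, 1))
print("Rand-type agreement of the two clusterings:",
      round(adjusted_rand_score(labels_km, labels_em), 3))

# One regression per subgroup; the final prediction is their union.
y_hat = np.empty(n)
for c in np.unique(labels_km):
    idx = labels_km == c
    y_hat[idx] = LinearRegression().fit(x[idx], y[idx]).predict(x[idx])

baseline = LinearRegression().fit(x, y).predict(x)
print("MAPE with clustering   :", round(mean_absolute_percentage_error(y, y_hat), 4))
print("MAPE without clustering:", round(mean_absolute_percentage_error(y, baseline), 4))
```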

A Novel Application of the Network Equivalencing Method in the Time Domain to the Precise Calculation of Dead Time in Power Transmission Lines

Various studies have shown that about 90% of single line-to-ground faults occurring on high-voltage transmission lines are transient in nature. This type of fault is cleared by a temporary outage (by single-phase auto-reclosure). The interval between the opening and reclosing of the faulted phase's circuit breakers is called the "dead time" and typically lasts several hundred milliseconds. To adjust traditional single-phase auto-reclosures, which usually are not intelligent, it is necessary to calculate the dead time precisely in the off-line condition. If the dead time used in the adjustment of the single-phase auto-reclosure is less than the real dead time, the reclosing of the circuit breakers seriously threatens the power system. In this paper, therefore, a novel approach for the precise calculation of dead time in power transmission lines, based on network equivalencing in the time domain, is presented. This approach has much higher precision than the traditional method based on the Thevenin equivalent circuit. To compare the proposed approach with the traditional method, a comprehensive simulation is performed with EMTP-ATP on an extensive power network.

The Visual Inspection of Surgical Tasks Using Machine Vision: Applications to Robotic Surgery

In this paper, the feasibility of using machine vision to assess task completion in a surgical intervention is investigated, with the aim of incorporating vision-based inspection in robotic surgery systems. The visually rich operative field presents a good environment for the development of automated visual inspection techniques in these systems, enabling a more comprehensive approach to performing a surgical task. As a proof of concept, machine vision techniques were used to distinguish the two possible outcomes, satisfactory or unsatisfactory, of three primary surgical tasks involved in creating a burr hole in the skull, namely incision, retraction, and drilling. Encouraging results were obtained for the three tasks under consideration, as demonstrated by experiments on cadaveric pig heads. These findings suggest the potential of machine vision to validate successful task completion in robotic surgery systems. Finally, the potential of using machine vision in the operating theatre, and the challenges that must be addressed, are identified and discussed.

Adaptive WiFi Fingerprinting for Location Approximation

WiFi has become an essential and widely used technology, popular for its convenience with mobile devices and relied upon by Internet users worldwide. Many of today's location-based services use Wireless Fidelity (WiFi) signal fingerprinting; a common example that is gaining popularity is Foursquare. In this work, the WiFi signal is used to estimate the user's or client's location. As with GPS, the fingerprinting method needs a floor plan to increase the accuracy of location estimation. Even so, the inconsistency of the WiFi signal makes the estimates differ at different time intervals, so an adaptive method is needed to obtain the most accurate signal at all times. WiFi signals are heavily distorted by external factors such as physical objects, radio frequency interference, electrical interference, and environmental factors, to name a few. To counter these factors, this work reduces the signal noise and estimates location using the nearest neighbour method based on past signal activity, increasing accuracy to more than 80%. A repository, which acts as the server supporting the client-side application's decisions, further increases the accuracy by using Artificial Neural Network (ANN) pattern matching. Numerous previous works have adopted methods of collecting signal strengths in a repository over the years, but most were static. In this work, we highlight proposed solutions for adaptively matching the received signal to the data in the repository, allowing location estimation to be done more accurately. Adaptive updating allows the latest location fingerprint to be stored in the repository; redundant location fingerprints are removed so that only the updated version of each fingerprint is kept. How the user's location is estimated is detailed in the proposed solution section. After a study of previous works, the Artificial Neural Network was found to be the most feasible method for updating the repository and making it adaptive, performing the pattern matching of the WiFi signal against the existing data in the repository; a minimal nearest-neighbour sketch follows.
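The sketch below shows the nearest-neighbour fingerprinting step only (the ANN-based repository update described above is omitted); the access point count, floor-plan grid, and log-distance RSSI model are illustrative assumptions:

```python
# Nearest-neighbour WiFi fingerprinting: RSSI vector -> (x, y) position.
import numpy as np
from sklearn.neighbors import KNeighborsRegressor

rng = np.random.default_rng(5)
N_APS, N_REF = 6, 100

# Offline phase: reference fingerprints (RSSI per access point) at known (x, y).
positions = rng.uniform(0, 50, (N_REF, 2))          # metres on the floor plan
ap_xy = rng.uniform(0, 50, (N_APS, 2))
dists = np.linalg.norm(positions[:, None, :] - ap_xy[None, :, :], axis=2)
rssi = -40 - 20 * np.log10(dists + 1) + rng.normal(0, 2, dists.shape)

knn = KNeighborsRegressor(n_neighbors=3, weights='distance').fit(rssi, positions)

# Online phase: a client's measured fingerprint is matched to the repository.
true_xy = np.array([[20.0, 30.0]])
d = np.linalg.norm(true_xy[:, None, :] - ap_xy[None, :, :], axis=2)
measured = -40 - 20 * np.log10(d + 1) + rng.normal(0, 2, d.shape)
print("estimated position:", knn.predict(measured).round(1))
```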

Improved Ant Colony Optimization for Solving Reliability Redundancy Allocation Problems

This paper presents an improved ant colony optimization (IACO) algorithm for solving the reliability redundancy allocation problem (RAP) in order to maximize system reliability. To improve the performance of the ACO algorithm, two additional techniques, i.e. a neighborhood search and a re-initialization process, are introduced. To show its efficiency and effectiveness, the proposed IACO is applied to solve three RAPs. Additionally, the results of the proposed IACO are compared with those of conventional heuristic approaches, i.e. the genetic algorithm (GA), particle swarm optimization (PSO), and ant colony optimization (ACO). The experimental results show that the proposed IACO approach is capable of obtaining higher-quality solutions in faster computational time.
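A minimal sketch of ACO on a series-parallel RAP with the two add-ons named above (neighborhood search and re-initialization) follows; the component data, pheromone parameters, and budget constraint are illustrative assumptions, not the paper's benchmark problems:

```python
# ACO for a series-parallel redundancy allocation problem (a sketch).
import numpy as np

rng = np.random.default_rng(6)
r = np.array([0.80, 0.85, 0.90, 0.75])      # component reliabilities
cost = np.array([3.0, 4.0, 5.0, 2.0])       # component costs
BUDGET, MAX_RED, ANTS, ITERS = 40.0, 5, 20, 200
S = len(r)

def reliability(n):
    return np.prod(1 - (1 - r) ** n)        # series system of parallel subsystems

def feasible(n):
    return (cost * n).sum() <= BUDGET

tau = np.ones((S, MAX_RED))                 # pheromone over redundancy levels
best_n, best_R, stagnant = None, -1.0, 0
for it in range(ITERS):
    improved = False
    p = tau / tau.sum(axis=1, keepdims=True)
    for _ in range(ANTS):
        n = np.array([rng.choice(MAX_RED, p=p[s]) + 1 for s in range(S)])
        if not feasible(n):
            continue
        # Neighborhood search: try +/-1 redundancy on each subsystem.
        for s in range(S):
            for d in (-1, 1):
                m = n.copy()
                m[s] = np.clip(m[s] + d, 1, MAX_RED)
                if feasible(m) and reliability(m) > reliability(n):
                    n = m
        R = reliability(n)
        if R > best_R:
            best_n, best_R, improved = n.copy(), R, True
    tau *= 0.9                              # evaporation
    if best_n is not None:
        tau[np.arange(S), best_n - 1] += best_R   # deposit on the best solution
    stagnant = 0 if improved else stagnant + 1
    if stagnant >= 30:                      # re-initialization on stagnation
        tau = np.ones((S, MAX_RED))
        stagnant = 0

print("best allocation:", best_n, " system reliability:", round(best_R, 4))
```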