Action Recognition in Video Sequences using a Mealy Machine

In this paper, the use of sequential machines for recognizing the actions taken by objects detected by a general tracking algorithm is proposed. The system can deal with the uncertainty inherent in medium-level vision data; for this purpose, the input data are fuzzified. Moreover, this transformation allows the data to be managed independently of the selected tracking application and enables characteristics of the analyzed scenario to be added. The representation of actions by means of an automaton and the generation of the input symbols for the finite automaton, which depend on the object and action being compared, are described. The output of the comparison process between an object and an action is a numerical value representing the membership of the object to the action; this value is computed according to how similar the object and the action are. The work concludes with the application of the proposed technique to identifying the behavior of vehicles in road traffic scenes.
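The comparison process can be illustrated with a minimal sketch of a fuzzy Mealy-style recognizer. All names, the membership functions, and the tiny "overtaking" automaton below are invented for illustration, not the paper's actual system: a crisp tracker measurement is fuzzified into input symbols, and the action membership is the fuzzy AND (minimum) of the memberships of the symbols that fired each transition.

```python
# Minimal sketch of a fuzzy Mealy-style action recognizer (illustrative
# names and membership functions, not the paper's actual implementation).

def fuzzify_speed(speed_kmh):
    """Map a crisp tracker measurement to memberships of fuzzy input symbols."""
    slow = max(0.0, min(1.0, (40.0 - speed_kmh) / 40.0))
    fast = max(0.0, min(1.0, (speed_kmh - 20.0) / 60.0))
    return {"slow": slow, "fast": fast}

# Transition table of a tiny automaton describing the action "overtaking":
# (state, symbol) -> next state
TRANSITIONS = {("start", "fast"): "accelerating",
               ("accelerating", "fast"): "overtaking"}

def action_membership(speed_sequence):
    """Run the automaton; the result is the fuzzy AND (minimum) of the
    memberships of the symbols that fired each transition."""
    state, membership = "start", 1.0
    for speed in speed_sequence:
        symbols = fuzzify_speed(speed)
        # admissible symbols from the current state, with their memberships
        candidates = [(m, s) for s, m in symbols.items()
                      if (state, s) in TRANSITIONS and m > 0.0]
        if not candidates:
            return 0.0          # no transition fires: object != action
        m, s = max(candidates)  # follow the most strongly supported symbol
        state, membership = TRANSITIONS[(state, s)], min(membership, m)
    return membership if state == "overtaking" else 0.0
```

A fast, sustained speed yields full membership to "overtaking", while borderline speeds yield a graded value, which is exactly the numerical similarity the abstract describes.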

Using Reclaimed Water for Irrigating the Green Space of Naein City

Since the water resources of the desert city of Naein are very limited, an approach that saves water resources while meeting the green space's water needs is to use the city's treated sewage wastewater. Proper treatment of Naein's sewage up to the standards required for green space use may solve some of the problems of the city's green space development. The present paper closely examines the available statistics and information associated with the city's sewage system and determines the complementary stages required of the city's sewage treatment facilities. The population, per capita water use, and required discharge for various green space areas, including different plants, are calculated. Moreover, in order to facilitate the application of water resources, a crude (non-potable) water distribution network separate from the drinking water distribution network is designed, and a plan for mixing municipal well water with sewage wastewater in proposed mixing tanks is suggested. Following the green space irrigation reform and the complementary plan, the city's per capita green space will increase from the current 13.2 square meters to 32 square meters.
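The kind of demand calculation the paper performs can be sketched in a few lines. The population and the plant water requirement below are illustrative placeholders, not Naein's actual figures; only the 13.2 and 32 m² per-capita targets come from the abstract.

```python
# Back-of-the-envelope green-space demand sketch (hypothetical inputs).

def greenspace_area(population, per_capita_m2):
    """Total green space (m2) implied by a per-capita target."""
    return population * per_capita_m2

def irrigation_demand_m3(area_m2, req_litres_per_m2_day):
    """Daily irrigation water demand in cubic metres."""
    return area_m2 * req_litres_per_m2_day / 1000.0

# Raising the per-capita target from 13.2 to 32 m2 (the paper's figures)
# scales the area, and hence the water demand, by the same factor:
area_now = greenspace_area(30000, 13.2)        # hypothetical population
area_plan = greenspace_area(30000, 32.0)
demand_plan = irrigation_demand_m3(area_plan, 4.0)  # assumed 4 L/m2/day
```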

Analytical Model for Brine Discharges from a Sea Outfall with Multiport Diffusers

Multiport diffusers are effective engineering devices installed at modern marine outfalls for the steady discharge of effluent streams from coastal plants such as municipal sewage treatment, thermal power generation, and seawater desalination facilities. A mathematical model using a two-dimensional advection-diffusion equation, based on a flat seabed and incorporating the effect of a coastal tidal current, is developed to calculate the compounded concentration following discharges of desalination brine from a sea outfall with multiport diffusers. The analytical solutions are computed graphically to illustrate the merging of multiple brine plumes in shallow coastal waters, and a further approximation of the maximum shoreline concentration is made to formulate the dilution of a multiport diffuser discharge.
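The superposition underlying the merging of plumes can be sketched with the classical slender-plume solution of the steady 2D advection-diffusion equation. This is a simplification of the paper's model: the parameter values are illustrative and the tidal modulation is omitted.

```python
import math

# Slender-plume approximation for a steady point discharge in a uniform
# alongshore current U; multiport concentration is a linear superposition.

def plume_concentration(x, y, q=0.1, U=0.2, D=0.5, H=5.0):
    """Excess concentration at (x, y) downstream of one port.

    q: source strength (kg/s), U: current speed (m/s),
    D: lateral diffusivity (m^2/s), H: water depth (m).
    """
    if x <= 0.0:
        return 0.0                      # no upstream influence
    sigma2 = 2.0 * D * x / U            # lateral plume variance
    return (q / (H * U)) * math.exp(-y * y / (2.0 * sigma2)) \
        / math.sqrt(2.0 * math.pi * sigma2)

def multiport_concentration(x, y, n_ports=5, spacing=10.0, **kw):
    """Superpose the plumes from n_ports ports spaced `spacing` metres
    apart along the diffuser line (centred on y = 0)."""
    offsets = [(i - (n_ports - 1) / 2.0) * spacing for i in range(n_ports)]
    return sum(plume_concentration(x, y - y0, **kw) for y0 in offsets)
```

Far downstream the individual Gaussian plumes widen and merge, so the centreline value of the multiport field exceeds that of a single plume while still decaying with distance, which is the behaviour the graphical solutions illustrate.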

Study of Peptide Fragment of Alpha-Fetoprotein as a Radionuclide Vehicle

Alpha-fetoprotein and its fragments may be important vehicles for the targeted delivery of radionuclides to tumors. We investigated the effect of labeling conditions on a biologically active synthetic peptide based on an alpha-fetoprotein fragment (F-afp) labeled with technetium-99m. The influence of the nature of the buffer solution, pH, reductant concentration, peptide concentration, and reaction temperature on the labeling yield was examined. The following optimal labeling conditions for F-afp were found: pH 8.5 (phosphate and bicarbonate buffers) and pH 1.7 to 7.0 (citrate buffer). The reaction proceeds with sufficient yield at room temperature within 30 min when the concentrations of SnCl2 and F-afp are below 10 µg/ml and 25 µg/ml, respectively. Accumulation of the test preparation in human breast cancer tumor cells was investigated. Based on these results, it can be expected that in vivo studies of F-afp in experimental tumor lesions will show concentrations sufficient for imaging these lesions by SPECT.

Information Retrieval in Domain Specific Search Engine with Machine Learning Approaches

As the web continues to grow exponentially, crawling the entire web on a regular basis becomes less and less feasible, so domain-specific search engines, which index information on a specific domain, have been proposed. As more information becomes available on the World Wide Web, it becomes more difficult to provide effective search tools for information access. Today, people access web information through two main kinds of search interfaces: browsers (clicking and following hyperlinks) and query engines (queries in the form of a set of keywords describing the topic of interest) [2]. Web search tools need better support for expressing one's information need and for returning high-quality search results. There appears to be a need for systems that reason under uncertainty and are flexible enough to recover from the contradictions, inconsistencies, and irregularities that such reasoning involves. In a multi-view problem, the features of the domain can be partitioned into disjoint subsets (views), each of which is sufficient to learn the target concept. Semi-supervised, multi-view algorithms, which reduce the amount of labeled data required for learning, rely on the assumptions that the views are compatible and uncorrelated. This paper describes the use of a semi-supervised, multi-view machine learning approach with active learning for domain-specific search engines. A domain-specific search engine is "an information access system that allows access to all the information on the web that is relevant to a particular domain". The proposed work shows that, with the help of this approach, relevant data can be extracted with a minimum of queries fired by the user. It requires a small amount of labeled data and a pool of unlabelled data to which the learning algorithm is applied to extract the required data.
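The co-training idea behind semi-supervised multi-view learning, plus a pool-based active-learning query, can be sketched minimally. The two "views" here are just two feature columns, and the nearest-mean learner, the data, and all names are illustrative stand-ins, not the paper's system.

```python
# Two-view co-training sketch with a pool-based active-learning query.
# Learner, data, and names are illustrative only.

def nearest_mean_fit(xs, ys):
    """Class means of a 1-D view from the labeled examples."""
    m0 = sum(x for x, y in zip(xs, ys) if y == 0) / ys.count(0)
    m1 = sum(x for x, y in zip(xs, ys) if y == 1) / ys.count(1)
    return m0, m1

def predict_with_margin(model, x):
    m0, m1 = model
    d0, d1 = abs(x - m0), abs(x - m1)
    return (0 if d0 < d1 else 1), abs(d0 - d1)  # larger margin = more sure

def active_query(model, pool, view):
    """Active learning: ask the user about the least certain example."""
    return min(pool, key=lambda x: predict_with_margin(model, x[view])[1])

def co_train(labeled, unlabeled, rounds=2):
    """labeled: [((xa, xb), y)]; unlabeled: [(xa, xb)].
    Each round, each view confidently labels one pooled example for the
    other view, reducing the amount of labeled data needed."""
    labeled, pool = list(labeled), list(unlabeled)
    for _ in range(rounds):
        for view in (0, 1):
            if not pool:
                return labeled
            xs = [x[view] for x, _ in labeled]
            ys = [y for _, y in labeled]
            model = nearest_mean_fit(xs, ys)
            scored = [(predict_with_margin(model, x[view]), x) for x in pool]
            (label, _), best = max(scored, key=lambda t: t[0][1])
            pool.remove(best)
            labeled.append((best, label))
    return labeled

# Two labeled seeds are enough to label the whole (separable) pool:
seeds = [((0.0, 0.0), 0), ((10.0, 10.0), 1)]
pool = [(1.0, 1.0), (9.0, 9.0), (2.0, 2.0)]
result = dict(co_train(seeds, pool))
```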

Adsorption of Crystal Violet onto BTEA- and CTMA-bentonite from Aqueous Solutions

CTMA-bentonite and BTEA-bentonite were prepared from Na-bentonite by cation exchange with cetyltrimethylammonium (CTMA) and benzyltriethylammonium (BTEA) cations. The products were characterized by XRD and IR techniques. At 100% cation exchange capacity, the d001 spacing values of CTMA-bentonite and BTEA-bentonite are 7.54 Å and 3.50 Å larger, respectively, than that of Na-bentonite. The IR spectra showed that the intensities of the OH stretching and bending vibrations of the two organoclays decreased greatly compared to untreated Na-bentonite. Batch experiments were carried out at 303 K, 318 K, and 333 K to obtain the sorption isotherms of Crystal Violet onto the two organoclays. The results show that the sorption isotherm data are well described by the Freundlich model. The kinetic data for the two organoclays fit well with the pseudo-second-order kinetic model. The adsorption capacity of CTMA-bentonite was found to be higher than that of BTEA-bentonite. Thermodynamic parameters such as the changes in free energy (ΔG°), enthalpy (ΔH°), and entropy (ΔS°) were also evaluated. The overall adsorption of Crystal Violet onto the two organoclays was a spontaneous, endothermic physisorption process. CTMA-bentonite and BTEA-bentonite could be employed as low-cost alternatives to activated carbon in wastewater treatment for the removal of color originating from textile dyes.
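The linearized Freundlich fit (ln qe = ln KF + (1/n) ln Ce) and the standard relation ΔG° = −RT ln K can be sketched as follows. The isotherm points are synthetic, generated from known constants purely for illustration, not the paper's measurements.

```python
import math

def fit_freundlich(ce, qe):
    """Least-squares fit of ln qe = ln Kf + (1/n) ln Ce; returns (Kf, n)."""
    xs, ys = [math.log(c) for c in ce], [math.log(q) for q in qe]
    m = len(xs)
    mx, my = sum(xs) / m, sum(ys) / m
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return math.exp(my - slope * mx), 1.0 / slope

R = 8.314  # gas constant, J/(mol K)

def gibbs_free_energy(k_eq, temperature):
    """dG = -R T ln(K): a negative dG indicates spontaneous adsorption."""
    return -R * temperature * math.log(k_eq)

# Synthetic isotherm generated from Kf = 2.0, n = 2.5 (qe = 2 * Ce**0.4):
ce = [1.0, 2.0, 5.0, 10.0, 20.0]
qe = [2.0 * c ** 0.4 for c in ce]
kf, n_f = fit_freundlich(ce, qe)
```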

A Questionnaire-Based Survey: Therapists' Responses towards the Upper Limb Disorder Learning Tool

Previous studies have shown that there are arguments regarding the reliability and validity of the Ashworth and Modified Ashworth Scales for evaluating patients diagnosed with upper limb disorders, since these evaluations depend on the raters' experience. This motivated us to develop an upper limb disorder part-task trainer that can simulate consistent upper limb disorder signs, such as spasticity and rigidity, based on the Modified Ashworth Scale, in order to reduce the variability occurring between raters and within raters themselves. By providing consistent signs, novice therapists would be able to increase their training frequency and their exposure to the various levels of signs. A total of 22 physiotherapists and occupational therapists participated in the study. The majority of the therapists agreed that, with current therapy education, they still face problems with inter-rater and intra-rater variability (strongly agree: 54%, n = 12/22; agree: 27%, n = 6/22) in evaluating patients' conditions. The therapists strongly agreed (72%; n = 16/22) that therapy trainees need to increase their frequency of training, and therefore believe that our initiative to develop an upper limb disorder training tool will help improve clinical education (strongly agree and agree: 63%; n = 14/22).

Complexity Analysis of Some Known Graph Coloring Instances

Graph coloring is an important problem in computer science, and many algorithms are known for obtaining reasonably good solutions in polynomial time. One method of comparing different algorithms is to test them on a set of standard graphs for which the optimal solution is already known. This investigation analyzes a set of 50 well-known graph coloring instances according to a set of complexity measures. These instances come from a variety of sources, some representing actual applications of graph coloring (register allocation) and others (Mycielski and Leighton graphs) theoretically designed to be difficult to solve. The graphs ranged in size from a low of 11 vertices to a high of 864 vertices. The method used to solve the coloring problem was based on the square of the adjacency (i.e., correlation) matrix. The results show that the most difficult graphs to solve were the Leighton and queen graphs. Complexity measures such as density, mobility, deviation from uniform color class size, and number of block-diagonal zeros are calculated for each graph. The results showed that the most difficult problems have low mobility (in the range of 0.2-0.5) and relatively little deviation from uniform color class size.
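The benchmark setting can be made concrete with the standard greedy (largest-degree-first) colouring commonly run on such instances. This is a baseline sketch for orientation, not the paper's adjacency-matrix-squaring algorithm.

```python
# Greedy graph colouring, largest degree first (a standard baseline for
# the benchmark instances discussed above; not the paper's method).

def greedy_coloring(adj):
    """adj: dict vertex -> set of neighbours. Returns vertex -> colour,
    giving each vertex (largest degree first) the smallest free colour."""
    colors = {}
    for v in sorted(adj, key=lambda u: -len(adj[u])):
        used = {colors[u] for u in adj[v] if u in colors}
        c = 0
        while c in used:
            c += 1
        colors[v] = c
    return colors

# A triangle needs three colours; a three-vertex path needs only two.
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
path = {0: {1}, 1: {0, 2}, 2: {1}}
```

On the benchmark instances the abstract describes, the gap between such a greedy bound and the known optimum is one way of quantifying how hard an instance is.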

A Microstrip Antenna Design and Performance Analysis for RFID High Bit Rate Applications

Lately, interest has grown greatly in the use of RFID in an unprecedented range of applications, as shown by the adoption of RFID capabilities by major software companies such as Microsoft, IBM, and Oracle in their flagship products. For example, Microsoft SharePoint 2010 workflow is now fully compatible with the RFID platform, and Microsoft BizTalk Server is capable of data acquisition from RFID sensors. This will lead to applications that require high bit rates, long range, and multimedia content. Higher frequencies of operation have been designated for RFID tags, among them 2.45 and 5.8 GHz. Higher frequencies mean longer range and higher bit rates, but the drawback is greater cost. In this paper we present a single-layer, low-profile patch antenna that operates at 5.8 GHz with a purely resistive input impedance of 50 Ω and close to directive radiation. We also propose a modification to the design in order to improve the operating bandwidth from 8.7 to 13.8.
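A starting point for such a design is given by the standard transmission-line-model equations for a rectangular patch. The substrate permittivity and height used below are assumed values for illustration, since the abstract does not state them.

```python
import math

C = 3.0e8  # speed of light in vacuum (m/s)

def patch_dimensions(f0, er, h):
    """Standard transmission-line-model design equations for a rectangular
    microstrip patch: width W, effective permittivity, fringing-field
    length extension dL, and physical length L."""
    w = C / (2.0 * f0) * math.sqrt(2.0 / (er + 1.0))
    e_eff = (er + 1.0) / 2.0 + (er - 1.0) / 2.0 / math.sqrt(1.0 + 12.0 * h / w)
    dl = 0.412 * h * (e_eff + 0.3) * (w / h + 0.264) \
        / ((e_eff - 0.258) * (w / h + 0.8))
    l = C / (2.0 * f0 * math.sqrt(e_eff)) - 2.0 * dl
    return w, l

# 5.8 GHz patch on an assumed FR4-like substrate (er = 4.4, h = 1.6 mm);
# the paper's actual substrate parameters are not given in the abstract.
w, l = patch_dimensions(5.8e9, er=4.4, h=1.6e-3)
```

These equations give dimensions on the order of 16 mm x 12 mm, the centimetre-scale patch one expects at 5.8 GHz on a high-permittivity substrate.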

The Effect of Different Nozzle Configurations on Airflow Behaviour and Yarn Quality

The nozzle is the main component of various spinning systems, such as air-jet and Murata air vortex systems. Recently, many researchers have worked on the use of nozzles in other spinning systems, such as conventional ring and compact spinning, where the primary purpose is to improve yarn quality. In the present study, yarns were produced with two different nozzle types, and the changes in yarn properties were determined. In order to explain the effect of the nozzle, the airflow structure inside the nozzle was modelled and the airflow variables were determined. For the numerical simulation, the ANSYS 12.1 package and its Fluid Flow (CFX) analysis module were used. In contrast to the literature, the Shear Stress Transport (SST) turbulence model was preferred. In addition, the air pressure at the nozzle inlet was measured by an electronic mass flow meter, and these values were used in the simulation of the airflow. Finally, the yarn was modelled, and the region through which the yarn passes was included in the numerical analysis.

Finite Element Analysis of Thin Steel Plate Shear Walls

Steel plate shear walls (SPSWs) are known to be an effective means of resisting lateral forces in buildings. By using unstiffened walls and allowing them to buckle, their energy absorption capacity increases significantly due to the post-buckling capacity. The post-buckling tension field action of SPSWs can provide substantial strength, stiffness, and ductility. This paper presents a finite element analysis of low yield point (LYP) steel shear walls. In this shear wall system, LYP steel plate is used for the steel panel and conventional structural steel for the boundary frames. A series of nonlinear cyclic analyses was carried out to obtain the stiffness, strength, deformation capacity, and energy dissipation capacity of the LYP steel shear wall. The effect of the width-to-thickness ratio of the steel plate on the buckling behavior and energy dissipation capacities was studied. Good energy dissipation and deformation capacities were obtained for all models.

Electronic Government in the GCC Countries

The study investigated the practices of organisations in Gulf Cooperation Council (GCC) countries with regard to G2C e-government maturity. It reveals that G2C e-government initiatives in the surveyed countries in particular, and arguably around the world in general, are progressing slowly because of the lack of a trusted and secure medium for authenticating the identities of online users. The authors conclude that national ID schemes will play a major role in helping governments reap the benefits of e-government if the three advanced technologies of smart cards, biometrics, and public key infrastructure (PKI) are utilised to provide a reliable and trusted authentication medium for e-government services.

A Fast Replica Placement Methodology for Large-scale Distributed Computing Systems

Fine-grained data replication over the Internet allows the duplication of frequently accessed data objects, as opposed to entire sites, at certain locations so as to improve the performance of large-scale content distribution systems. In a distributed system, agents representing their sites try to maximize their own benefit, since they are driven by different goals, such as minimizing their communication costs, latency, etc. In this paper, we use game-theoretical techniques, and in particular auctions, to identify a bidding mechanism that encapsulates the selfishness of the agents while keeping a controlling hand over them. In essence, the proposed game-theory-based mechanism studies what happens when independent agents act selfishly and how to control them to maximize overall performance. A bidding mechanism asks how one can design systems so that agents' selfish behavior results in the desired system-wide goals. Experimental results reveal that this mechanism provides excellent solution quality while maintaining fast execution time. Comparisons are made against some well-known techniques, such as greedy, branch and bound, game-theoretical auctions, and genetic algorithms.
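The bidding idea can be sketched with a single-object auction. The bid formula and the second-price (Vickrey) rule below are an illustrative simplification under which truthful bidding is each selfish agent's best strategy; the paper's full mechanism is richer than this.

```python
# Sketch of auction-based replica placement (illustrative names and bid
# formula; not the paper's exact mechanism).

def site_bid(access_freq, latency_saving, storage_cost):
    """A site's (selfish) valuation of hosting a replica of the object:
    expected benefit from local accesses minus the cost of storing it."""
    return access_freq * latency_saving - storage_cost

def vickrey_allocate(bids):
    """bids: dict site -> bid. The replica goes to the highest bidder,
    who pays the second-highest bid (Vickrey rule), so overstating or
    understating one's valuation cannot help."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

bids = {"siteA": site_bid(100, 0.8, 20.0),   # values the replica at 60.0
        "siteB": site_bid(50, 1.0, 5.0),     # 45.0
        "siteC": site_bid(10, 0.5, 2.0)}     # 3.0
winner, price = vickrey_allocate(bids)
```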

Low Air Velocity Measurement Characteristics: Variation Due to Flow Regime

The paper presents the relations between air velocity values reproduced by a laser Doppler anemometer (LDA) and an ultrasonic anemometer (UA) and those calculated from flow rate measurements using a gas meter whose calibration uncertainty is ±(0.15–0.30)%. The investigation was performed in a channel installed in an aerodynamic facility used as part of the national standard of air velocity. The relations defined in this research confirm the LDA and UA to be the most advantageous means of air velocity reproduction. The results affirm that the ultrasonic anemometer is a reliable and favourable instrument for the measurement of mean velocity, or for the control of velocity stability, in the velocity range of 0.05 m/s to 10 (15) m/s when compared against the LDA. The main aim of this research is to investigate low-velocity regularities, starting from 0.05 m/s, covering the turbulent, laminar, and transitional air flow regions. Theoretical and experimental results and a brief analysis of them are given in the paper. Maximum and mean velocity relations for transitional air flow, which has a unique distribution, are presented. Transitional flow, whose characteristics are distinct from those of both laminar and turbulent flow, had not previously been analysed experimentally.
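The flow-regime boundaries relevant at such low velocities follow from the Reynolds number. The sketch below uses the conventional internal-flow thresholds and a nominal channel size as assumptions; the facility's actual channel geometry and critical values may differ.

```python
# Flow-regime classification by Reynolds number (conventional thresholds;
# the channel diameter and the exact critical values are assumptions).

def reynolds(velocity, hydraulic_diameter, nu=1.5e-5):
    """Re = v * D / nu; nu ~ 1.5e-5 m^2/s is the kinematic viscosity of
    air at room temperature."""
    return velocity * hydraulic_diameter / nu

def regime(re):
    """Conventional internal-flow thresholds (facility-dependent)."""
    if re < 2300.0:
        return "laminar"
    if re < 4000.0:
        return "transitional"
    return "turbulent"
```

For a 0.1 m channel, 0.05 m/s sits deep in the laminar range while 10 m/s is fully turbulent, so a velocity standard spanning 0.05-10 m/s necessarily crosses all three regimes, which is why the transitional region matters here.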

An Approach for Transient Response Calculation of Large Non-proportionally Damped Structures Using Component Mode Synthesis

A minimal-complexity version of component mode synthesis is presented that requires simplified computer programming but still provides adequate accuracy for modeling the lower eigenproperties of large structures and their transient responses. The novelty is that the structure is separated into components along a plane or surface that exhibits rigid-like behavior; thus only the normal modes of each component are needed, without computing any constraint, attachment, or residual-attachment modes. The approach requires only such input information as a few (lower) natural frequencies and the corresponding undamped normal modes of each component. A novel technique for formulating the equations of motion is presented, in which a double transformation to generalized coordinates is employed, and the formulation of the non-proportional damping matrix in generalized coordinates is derived.
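The transformation to generalized coordinates, and why non-proportional damping needs special treatment, can be shown on a two-DOF example with closed-form mode shapes. This toy system (unit masses, a simple stiffness matrix, an illustrative damping matrix) is not the paper's structure; it only demonstrates that the modal transformation diagonalizes the stiffness matrix but leaves a non-proportional damping matrix coupled.

```python
import math

# Two-DOF illustration: M = I, K = [[2,-1],[-1,2]] has mass-normalised
# modes phi1 = [1,1]/sqrt(2) (omega^2 = 1) and phi2 = [1,-1]/sqrt(2)
# (omega^2 = 3). A non-proportional damping matrix stays non-diagonal
# after the transformation to generalized coordinates.

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def transpose(a):
    return [list(row) for row in zip(*a)]

s = 1.0 / math.sqrt(2.0)
phi = [[s, s], [s, -s]]               # mass-normalised mode shapes
K = [[2.0, -1.0], [-1.0, 2.0]]        # stiffness matrix
C = [[0.4, 0.0], [0.0, 0.1]]          # non-proportional damping (C != aM+bK)

K_gen = matmul(transpose(phi), matmul(K, phi))   # diagonal: diag(1, 3)
C_gen = matmul(transpose(phi), matmul(C, phi))   # NOT diagonal: modes couple
```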

3D Numerical Simulation of Scouring around Bridge Piers (Case Study: Bridge 524 Crosses the Tanana River)

Due to the three-dimensional flow pattern interacting with the bed material, the process of local scour around bridge piers is complex. Modeling the 3D flow field and scour hole evolution around a bridge pier is more feasible nowadays because computational cost and computation time have decreased significantly. In order to evaluate the local flow and scouring around a bridge pier, a fully three-dimensional numerical model, the SSIIM program, was used. The model solves the 3D Navier-Stokes equations and a bed load conservation equation. It was applied to simulate the local flow and scouring around a bridge pier in a large natural river with four piers. A computation covering one day of flood conditions was carried out to predict the maximum local scour depth. The results show that the SSIIM program can be used efficiently for simulating scour in natural rivers. They also showed that, among the various turbulence models, the k-ω model gives the most reasonable results.

Thermo-Sensitive Hydrogel: Control of Hydrophilic-Hydrophobic Transition

The study investigated the hydrophilic-to-hydrophobic transition of a polyacrylamide hydrogel modified by the inclusion of N-isopropylacrylamide (NIPAM). The modification was done by mimicking micellar polymerization, which resulted in a better arrangement of the NIPAM chains in the polyacrylamide network; the degree of NIPAM arrangement is described by the NH number. The hydrophilic-to-hydrophobic transition was measured through the partition coefficient, K, of Orange II and Methylene Blue between the hydrogel and water. These dyes were chosen as models for solutes with different degrees of hydrophobicity. The study showed that hydrogels with higher NH values gave better solubility of both dyes. Moreover, temperatures above the lower critical solution temperature (LCST) of poly(N-isopropylacrylamide) (PNIPAM) caused the collapse of the NIPAM chains, producing a more hydrophobic environment that increases the solubility of Methylene Blue and decreases the solubility of Orange II in the hydrogels containing NIPAM.

Surfactant Stabilized Nanoemulsion: Characterization and Application in Enhanced Oil Recovery

Nanoemulsions are a class of emulsions with droplet sizes in the range of 50-500 nm and have attracted a great deal of attention in recent years because of their unique characteristics. The physicochemical properties of nanoemulsions suggest that they can be successfully used to recover the residual oil that is trapped in the fine pores of reservoir rock by capillary forces after primary and secondary recovery. Oil-in-water nanoemulsions, which can be formed by high-energy emulsification techniques using specific surfactants, can reduce the oil-water interfacial tension (IFT) by 3-4 orders of magnitude. The present work is aimed at characterizing oil-in-water nanoemulsions in terms of their phase behavior, morphology, interfacial energy, and ability to reduce the interfacial tension, and at understanding the mechanisms of mobilization and displacement of entrapped oil blobs by lowering the interfacial tension, both at the macroscopic and microscopic levels. In order to investigate the efficiency of oil-in-water nanoemulsions in enhanced oil recovery (EOR), experiments were performed to characterize the emulsions in terms of their physicochemical properties and the size distribution of the dispersed oil droplets in the water phase. Synthetic mineral oil and a series of surfactants were used to prepare the oil-in-water emulsions. Characterization shows that the emulsions follow pseudo-plastic behaviour and that the drop size of the dispersed oil phase follows a log-normal distribution. Flooding experiments were also carried out in a sandpack system to evaluate the effectiveness of the nanoemulsion as a displacing fluid for enhanced oil recovery. Substantial additional recoveries (more than 25% of the original oil in place) over conventional water flooding were obtained in the present investigation.
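The log-normal droplet-size claim can be made concrete with the usual moment estimates on the log-diameters. The diameters below are synthetic values within the stated 50-500 nm range, for illustration only, not the paper's measured distribution.

```python
import math

# Moment estimates for a log-normal droplet-size distribution
# (synthetic diameters in nm; illustrative only).

def lognormal_fit(diameters):
    """Estimate (mu, sigma) of the underlying normal on log-diameters."""
    logs = [math.log(d) for d in diameters]
    n = len(logs)
    mu = sum(logs) / n
    var = sum((x - mu) ** 2 for x in logs) / n
    return mu, math.sqrt(var)

def median_diameter(mu, sigma):
    """The median of a log-normal distribution is exp(mu)."""
    return math.exp(mu)

diameters = [80.0, 100.0, 125.0, 160.0, 200.0]   # synthetic, 50-500 nm range
mu, sigma = lognormal_fit(diameters)
```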

The Semantic Web: A New Approach for the Future World Wide Web

The purpose of Semantic Web research is to transform the Web from a linked document repository into a distributed knowledge base and application platform, thus allowing the vast range of available information and services to be exploited more efficiently. As a first step in this transformation, languages such as OWL have been developed. Although fully realizing the Semantic Web still seems some way off, OWL has already been very successful and has rapidly become a de facto standard for ontology development in fields as diverse as geography, geology, astronomy, agriculture, defence, and the life sciences. The aim of this paper is to classify the key concepts of the Semantic Web and to introduce a new practical approach that uses these concepts to outperform the World Wide Web.

Processing Web-Cam Images by a Neuro-Fuzzy Approach for Vehicular Traffic Monitoring

Traffic management in an urban area is highly facilitated by knowledge of the traffic conditions in every street or highway involved in the vehicular mobility system. The aim of the paper is to propose a neuro-fuzzy approach able to compute the main parameters of a traffic system, i.e., car density, velocity, and flow, using the images collected by web-cams located at the crossroads of the traffic network. The performance of this approach encourages its application when the traffic system is far from saturation. A fuzzy model is also outlined to evaluate when it is suitable to use more accurate, even if more time-consuming, algorithms for measuring traffic conditions near saturation.
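The three quantities estimated from the web-cam images are linked by the fundamental relation of traffic flow, q = k * v. A Greenshields-type linear speed-density model (an assumption used here to make the relation concrete, not the paper's fuzzy model) shows how density and velocity combine into flow and why flow collapses near saturation:

```python
# Fundamental traffic-flow relation q = k * v with a Greenshields-type
# linear speed-density law (illustrative parameters, not the paper's model).

def flow(density, free_speed=120.0, jam_density=150.0):
    """Greenshields model: v = vf * (1 - k/kj), q = k * v.
    density in veh/km, speeds in km/h, flow in veh/h."""
    v = free_speed * (1.0 - density / jam_density)
    return density * v

# Flow vanishes on an empty road and at jam density (saturation),
# and peaks at half the jam density:
q_peak = flow(75.0)
```

The collapse of q as density approaches the jam value is precisely the near-saturation regime in which the abstract recommends switching to more accurate, slower measurement algorithms.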