Aggressive Interactions in Hospital Emergency Units

The international literature emphasizes concern about the phenomenon of aggression in hospitals. This paper focuses on the aggressive interactions occurring within an emergency triage area and involving three groups of protagonists: the professionals, the patients and their carers. Data were collected with an observation grid that integrated the variables identified in the literature. The observations took place around the clock for three weeks, at the rate of one week per month. In this research, 331 aggressive interactions were recorded and analyzed with the SPSS software. This research is one of the very few continuous observation surveys in the literature. It shows the various human factors at play in the emergence of aggressive interactions. The data may be used both for primary prevention, through the analysis of interaction modes, and for secondary prevention, by integrating the relevant results into situational prevention.

Smith Predictor Design by CDM for Temperature Control System

Smith predictor control is theoretically a good solution to the problem of controlling time-delay systems. However, it is seldom used in practice because it is almost impossible to obtain a precise mathematical model of the real system, and the scheme is very sensitive to model uncertainty and variable time delays. This paper is concerned with a design method for a Smith predictor for a temperature control system based on the Coefficient Diagram Method (CDM). The simulation results show that the control system with the Smith predictor designed by CDM is stable and robust while giving the desired time-domain performance.
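
To make the control structure concrete, the following is a minimal discrete-time sketch of a Smith predictor wrapped around a first-order-plus-dead-time temperature process. All numerical values (plant gain, time constant, dead time, PI gains) are illustrative assumptions, a perfect internal model is assumed, and nothing here reproduces the CDM tuning of the paper.

```python
import numpy as np

# Sketch: Smith predictor around a first-order-plus-dead-time (FOPDT) plant.
dt = 0.1                      # sampling period [s] (assumed)
K, tau, L = 2.0, 10.0, 5.0    # plant gain, time constant, dead time (assumed)
d = int(L / dt)               # dead time in samples

Kp, Ki = 0.8, 0.08            # PI gains tuned on the delay-free model (assumed)

def fopdt_step(y, u, K, tau, dt):
    """One Euler step of the first-order lag dy/dt = (-y + K*u)/tau."""
    return y + dt * (-y + K * u) / tau

r = 1.0                               # unit set-point
y = 0.0                               # real plant output
ym = 0.0                              # internal model output (no delay)
u_hist = [0.0] * (d + 1)              # buffer implementing the input dead time
ym_hist = [0.0] * (d + 1)             # buffer delaying the model output
integ = 0.0
log = []

for k in range(int(200 / dt)):
    # Smith predictor: feed back the measured output corrected by the
    # difference between the undelayed and delayed model outputs.
    y_pred = y + ym - ym_hist[0]
    e = r - y_pred
    integ += Ki * e * dt
    u = Kp * e + integ

    ym = fopdt_step(ym, u, K, tau, dt)         # delay-free internal model
    y = fopdt_step(y, u_hist[0], K, tau, dt)   # real plant sees delayed input

    u_hist = u_hist[1:] + [u]
    ym_hist = ym_hist[1:] + [ym]
    log.append(y)

print("final output ≈", round(log[-1], 3))     # settles near the set-point
```

The key line is the prediction y_pred = y + ym - ym_hist[0], which presents the controller with a delay-free estimate of the output so that the PI gains can be tuned on the undelayed model.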

The Potential Use of Nanofilters to Supply Potable Water in Persian Gulf and Oman Sea Watershed Basin

In a world worried about water resources, with the shadow of drought and famine looming all around, the quality of water is as important as its quantity. The source of all concerns is the constant reduction of the per capita quality water available for different uses. With an average annual precipitation of 250 mm, compared to the world average of 800 mm, Iran is considered a water-scarce country, and the disparity in rainfall distribution, the limitations of renewable resources and the concentration of population on the margins of deserts and water-scarce areas have intensified the problem. The shortage of per capita renewable freshwater and its poor quality in large areas of the country, which have saline, brackish or hard water resources, together with the profusion of natural and artificial pollutants, have caused the deterioration of water quality. Among the methods for treating and using these waters is the application of membrane technologies, which have come into focus in recent years due to their great advantages. Nanofiltration is quite efficient in eliminating multivalent ions, and owing to the possibility of production at different capacities, its applicability as a point-of-use treatment process, and its lower energy demand compared to reverse osmosis, it can revolutionize the water and wastewater sector in the years to come. This article studies the capacities of the different water resources in the Persian Gulf and Oman Sea watershed basins and assesses the possibility of using nanofiltration to treat the brackish and non-conventional waters in these basins.

Runoff Quality and Pollution Loading from a Residential Catchment in Miri, Sarawak

Urban non-point source (NPS) pollution from a residential catchment in Miri, Sarawak was investigated for two storm events in 2011. Runoff from the two storm events was sampled and tested for water quality parameters including TSS, BOD5, COD, NH3-N, NO3-N, NO2-N, P and Pb. Concentrations of the water quality parameters were found to vary significantly between storms, and the pollutants of concern were found to be NO3-N, TSS, COD and Pb. Results were compared with the Interim National Water Quality Standards for Malaysia (INWQS), and the stormwater runoff from the study area can be classified as polluted, exceeding class III water quality, especially in terms of TSS, COD and NH3-N, with maximum EMCs of 158, 135 and 2.17 mg/L, respectively.
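
For reference, the event mean concentrations (EMCs) quoted above are conventionally computed as flow-weighted averages of the sampled concentrations over a storm. A minimal sketch with invented numbers, not the Miri monitoring data:

```python
# EMC = sum(C_i * V_i) / sum(V_i): a flow-weighted mean of grab samples.
# The sample values below are illustrative only.

def event_mean_concentration(concentrations_mg_L, flow_volumes_m3):
    total_load = sum(c * v for c, v in zip(concentrations_mg_L, flow_volumes_m3))
    return total_load / sum(flow_volumes_m3)

tss = [210, 180, 120, 60]   # mg/L sampled through the hydrograph (assumed)
vol = [15, 40, 55, 30]      # m3 of runoff between successive samples (assumed)
print(round(event_mean_concentration(tss, vol), 1), "mg/L TSS (EMC)")
```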

Optimization of Petroleum Refinery Configuration Design with Logic Propositions

This work concerns the topological optimization problem of determining the optimal petroleum refinery configuration. We are interested in further investigating and hopefully advancing the existing optimization approaches and strategies that apply logic propositions to conceptual process synthesis problems. In particular, we seek to contribute to this increasingly exciting area of chemical process modeling by addressing the following potentially important issues: (a) how the formulation of design specifications in a mixed logical and integer optimization model can be employed in a synthesis problem to enrich the problem representation by incorporating past design experience, engineering knowledge, and heuristics; and (b) how structural specifications on the interconnectivity relationships by space (states) and by function (tasks) in a superstructure should be properly formulated within a mixed-integer linear programming (MILP) model. The proposed modeling technique is illustrated with a case study involving the alternative processing routes of naphtha, in which a significant improvement in solution quality is obtained.
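
As a small illustration of issue (a), logic propositions about which units or routes may coexist translate directly into linear constraints on 0-1 selection variables. The sketch below, written with the PuLP modeling library, uses invented units, profits and rules rather than the naphtha case study of the paper:

```python
from pulp import LpProblem, LpMaximize, LpVariable, LpBinary, lpSum, value

# Hypothetical units and profits, purely for illustration.
units = ["reformer", "isomerizer", "hydrotreater"]
profit = {"reformer": 120, "isomerizer": 80, "hydrotreater": -30}

prob = LpProblem("refinery_topology", LpMaximize)
y = {u: LpVariable(f"y_{u}", cat=LpBinary) for u in units}

prob += lpSum(profit[u] * y[u] for u in units)       # objective: total profit

# "reformer OR isomerizer must be chosen":   y_r + y_i >= 1
prob += y["reformer"] + y["isomerizer"] >= 1
# "reformer IMPLIES hydrotreater" (A => B):  y_A <= y_B
prob += y["reformer"] <= y["hydrotreater"]
# "NOT (reformer AND isomerizer)":           y_r + y_i <= 1
prob += y["reformer"] + y["isomerizer"] <= 1

prob.solve()
print({u: int(value(y[u])) for u in units}, "profit:", value(prob.objective))
```

Note that the implication A => B becomes y_A <= y_B without any big-M constant, which is part of what keeps such logic-derived MILP formulations tight.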

Multi-board Run-time Reconfigurable Implementation of Intrinsic Evolvable Hardware

A multi-board run-time reconfigurable (MRTR) system for evolvable hardware (EHW) is introduced with the aim of implementing the bidirectional incremental evolution (BIE) method in hardware. The main features of this digital intrinsic EHW solution are the multi-board approach, variable chromosome length management, and partial configuration of the reconfigurable circuit. These three features give the solution high scalability. The design has been written in VHDL with care taken to avoid platform dependence, in order to keep flexibility as high as possible. This solution helps tackle the problem of evolving complex tasks on digital reconfigurable hardware.

Appraisal of Energy Efficiency of Urban Development Plans: The Fidelity Concept on Izmir-Balcova Case

Design and land use are closely linked to the energy efficiency levels of an urban area. Current city planning practice does not involve an effective land use-energy evaluation in its 'blueprint' urban plans. The study proposes an appraisal method, which can be embedded in GIS programs, that uses five planning criteria to express how far a planner may depart from the planning principles (criteria) in return for the greatest energy output obtainable. The case of Balcova, a district in the Izmir metropolitan area, is used to evaluate the proposed master plan and the use of geothermal energy (heating only) in the district. If the land use design were revised for maximum energy efficiency (a gain of about 30% was obtained), mainly by increasing the density around the geothermal wells and proposing more mixed-use zones, the result would be a 17% distortion (infidelity to the main planning principles) relative to the original plan. The proposed method can be an effective simulation tool for planners, with calculations performed by readily available GIS tools, for evaluating the efficiency levels of different plan proposals and showing how much energy saving causes how much deviation from the other planning ideals. Lower energy use may be possible with different land use proposals under various policy trials.

How the Conversations in Social Media Concern in Sales in the Automobile Industry in Spain

The automobile industry has great importance in the Spanish economy (8.7% of the active Spanish population is employed in this sector). This sector has been one of the main sectors affected by the current economic crisis; consequently, advertising budgets have been severely limited (46.9% less in the reference period). This need for reduction has brought about a substantial change in advertising strategy (since 2007, advertising investment in the Internet has increased by 251.6%) aimed at increasing profitability. The growing use of social media by consumers therefore makes online consumer conversations an attractive additional format for automobile firms to promote products at a lower cost. This research analyzes the relation between activity in social media and design in the car industry, looking for relations between design strategies based on social media and sales, and for a channel of information that lets companies know consumer preferences. For this ongoing research, a longitudinal collection of panel data was used. Managerial and research implications of the findings are discussed.

Geospatial Network Analysis Using Particle Swarm Optimization

The shortest path (SP) problem concerns finding the path from a specified origin to a specified destination in a given network that minimizes the total cost associated with the path. This problem has widespread applications. Important applications of the SP problem include vehicle routing in transportation systems, particularly in in-vehicle Route Guidance Systems (RGS), and the traffic assignment problem in transportation planning. Evolutionary methods such as Genetic Algorithms (GA), Ant Colony Optimization and Particle Swarm Optimization (PSO) are well known for solving complex optimization problems and have been applied to overcome the shortcomings of existing shortest path analysis methods. It has been reported by various researchers that PSO performs better than other evolutionary optimization algorithms in terms of success rate and solution quality. Furthermore, Geographic Information Systems (GIS) have emerged as key information systems for geospatial data analysis and visualization. This research paper focuses on the application of PSO to solving the shortest path problem between multiple points of interest (POI), based on spatial data of Allahabad City and traffic speed data collected using GPS. Geovisualization of the analysis results is carried out in GIS.
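
A common way to apply PSO to the shortest path problem is a priority-based encoding: each particle assigns a priority to every node, and a path is decoded greedily from origin to destination. The sketch below uses a toy graph and standard PSO parameters as stand-ins; it does not use the Allahabad road network or the GPS speed data of the study.

```python
import random

GRAPH = {                      # adjacency list: node -> {neighbor: cost} (toy data)
    "A": {"B": 4, "C": 2},
    "B": {"C": 5, "D": 10},
    "C": {"D": 3, "E": 8},
    "D": {"E": 2},
    "E": {},
}
NODES = list(GRAPH)
SOURCE, TARGET = "A", "E"
PENALTY = 1e6                  # fitness penalty for particles that dead-end

def decode(priorities):
    """Greedily follow the unvisited neighbor with the highest priority."""
    path, node, visited = [SOURCE], SOURCE, {SOURCE}
    while node != TARGET:
        candidates = [n for n in GRAPH[node] if n not in visited]
        if not candidates:
            return path, PENALTY
        node = max(candidates, key=lambda n: priorities[NODES.index(n)])
        visited.add(node)
        path.append(node)
    return path, sum(GRAPH[a][b] for a, b in zip(path, path[1:]))

def pso(n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    dim = len(NODES)
    xs = [[random.random() for _ in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pbest_cost = [decode(x)[1] for x in xs]
    g = pbest[min(range(n_particles), key=lambda i: pbest_cost[i])][:]
    g_cost = min(pbest_cost)
    for _ in range(iters):
        for i in range(n_particles):
            for j in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][j] = (w * vs[i][j]
                            + c1 * r1 * (pbest[i][j] - xs[i][j])
                            + c2 * r2 * (g[j] - xs[i][j]))
                xs[i][j] += vs[i][j]
            _, cost = decode(xs[i])
            if cost < pbest_cost[i]:
                pbest[i], pbest_cost[i] = xs[i][:], cost
                if cost < g_cost:
                    g, g_cost = xs[i][:], cost
    return decode(g)

path, cost = pso()
print("best path:", " -> ".join(path), "cost:", cost)
```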

Analysis of Statistical Data on Social Resources Dimension of Occupational Status Attainment: A Rational Choice Approach

The aim of the present study is to analyze empirical research on the social resources dimension of the occupational status attainment process and relate it to the rational choice approach. The analysis suggests that the existing data on the strength-of-ties aspect of social resources are insufficient and do not allow any implications concerning the rational actor's behavior. However, the results concerning the work relations aspect are more encouraging.

The Number of Rational Points on Conics C_{p,k}: x^2 − ky^2 = 1 over Finite Fields F_p

Let p be a prime number, F_p be the finite field with p elements, and let k ∈ F_p^*. In this paper, we consider the number of rational points on the conic C_{p,k}: x^2 − ky^2 = 1 over F_p. We prove that the order of C_{p,k} over F_p is p − 1 if k is a quadratic residue mod p, and p + 1 if k is not a quadratic residue mod p. We then derive some results concerning the sums Σ_{(x,y)∈C_{p,k}(F_p)} x and Σ_{(x,y)∈C_{p,k}(F_p)} y, the sums of the x- and y-coordinates of all points (x, y) on C_{p,k}, respectively.
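
The stated point counts are easy to check by brute force for small odd primes, using Euler's criterion to test quadratic residuosity; the snippet below is an illustrative verification, not the paper's proof.

```python
# Brute-force check of |C_{p,k}| for small odd primes p and all k in F_p^*.

def is_qr(k, p):
    """Euler's criterion: k is a quadratic residue mod p iff k^((p-1)/2) = 1."""
    return pow(k, (p - 1) // 2, p) == 1

def count_points(p, k):
    return sum(1 for x in range(p) for y in range(p)
               if (x * x - k * y * y) % p == 1)

for p in (5, 7, 11, 13):
    for k in range(1, p):
        expected = p - 1 if is_qr(k, p) else p + 1
        assert count_points(p, k) == expected
print("counts match p-1 / p+1 for all tested (p, k)")
```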

Numerical Study of Cyclic Behavior of Shallow Foundations on Sand Reinforced with Geogrid and Grid-Anchor

When the foundations of structures are subjected to cyclic loading with amplitudes less than their permissible load, there is often concern about the amount of uniform and non-uniform settlement of such structures. Storage tank foundations undergoing numerous filling and discharging cycles and railway ballast courses under repeated transportation loads are examples of such conditions. This paper deals with the effects of using a new generation of reinforcement, the grid-anchor, for reducing the permanent settlement of these foundations under different proportions of the ultimate load. Other factors, such as the type and number of reinforcements as well as the number of loading cycles, are studied numerically. Numerical models were built using the Plaxis3D Tunnel finite element code. The results show that, by using grid-anchors and increasing the number of their layers in the same proportion as the applied cyclic load, the permanent settlement decreases by up to 42% relative to the unreinforced condition, depending on the number of reinforcement layers, the percentage of applied load and the number of loading cycles, and the number of loading cycles needed to reach a constant value of dimensionless settlement decreases by up to 20% relative to the unreinforced condition.

Ensemble Learning with Decision Tree for Remote Sensing Classification

In recent years, a number of works proposing the combination of multiple classifiers to produce a single classification have been reported in the remote sensing literature. The resulting classifier, referred to as an ensemble classifier, is generally found to be more accurate than any of the individual classifiers making up the ensemble. As accuracy is the primary concern, much of the research in the field of land cover classification is focused on improving classification accuracy. This study compares the performance of four ensemble approaches (boosting, bagging, DECORATE and random subspace) with a univariate decision tree as the base classifier. Two training data sets, one without any noise and the other with 20 percent noise, were used to judge the performance of the different ensemble approaches. Results with the noise-free data set suggest an improvement of about 4% in classification accuracy with all ensemble approaches compared to the results provided by the univariate decision tree classifier. The highest classification accuracy, 87.43%, was achieved by the boosted decision tree. A comparison of results with the noisy data set suggests that the bagging, DECORATE and random subspace approaches work well with these data, whereas the performance of the boosted decision tree degrades, achieving a classification accuracy of 79.7%, which is even lower than the 80.02% achieved by the unboosted decision tree classifier.
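
A comparable experiment can be sketched with scikit-learn, which provides bagging and boosting directly and allows the random subspace method to be expressed as bagging over feature subsets (DECORATE has no off-the-shelf implementation there and is omitted). Synthetic data stand in for the remote sensing training sets, so the accuracies will not match those reported above.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier, AdaBoostClassifier
from sklearn.metrics import accuracy_score

# Synthetic multi-class data as a stand-in for land cover training samples.
X, y = make_classification(n_samples=2000, n_features=20, n_informative=10,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "single tree": DecisionTreeClassifier(random_state=0),
    "bagging": BaggingClassifier(DecisionTreeClassifier(random_state=0),
                                 n_estimators=50, random_state=0),
    "boosting": AdaBoostClassifier(DecisionTreeClassifier(max_depth=3),
                                   n_estimators=50, random_state=0),
    # Random subspace: bagging over feature subsets rather than samples.
    "random subspace": BaggingClassifier(DecisionTreeClassifier(random_state=0),
                                         n_estimators=50, bootstrap=False,
                                         max_features=0.5, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, round(accuracy_score(y_te, model.predict(X_te)), 3))
```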

Quality-Driven Business Process Refactoring

Appropriate description of business processes through standard notations has become one of the most important assets for organizations. Organizations must therefore deal with quality faults in business process models, such as a lack of understandability and modifiability. These quality faults may be exacerbated if business process models are obtained by reverse engineering, e.g., mined from existing information systems that support those business processes. Hence, business process refactoring is often used, which changes the internal structure of business processes while preserving their external behavior. This paper aims to select the most appropriate set of refactoring operators through a quality assessment concerning understandability and modifiability. These quality features are assessed through well-proven measures proposed in the literature. Additionally, a set of measure thresholds is heuristically established for applying the most promising refactoring operators, i.e., those that achieve the highest quality improvement according to the selected measures in each case.

An Efficient and Generic Hybrid Framework for High Dimensional Data Clustering

Clustering in high dimensional space is a difficult problem which recurs in many fields of science and engineering, e.g., bioinformatics, image processing, pattern recognition and data mining. In high dimensional space some of the dimensions are likely to be irrelevant, thus hiding the possible clustering. In very high dimensions it is common for all the objects in a dataset to be nearly equidistant from each other, completely masking the clusters. Hence, the performance of clustering algorithms decreases. In this paper, we propose an algorithmic framework which combines the reduct concept of rough set theory with the k-means algorithm to remove the irrelevant dimensions in a high dimensional space and obtain appropriate clusters. Our experiments on test data show that this framework increases the efficiency of the clustering process and the accuracy of the results.
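
The overall pipeline can be sketched as: discretize the data, keep only the attributes needed to preserve the indiscernibility relation (here a simple greedy approximation of a rough-set reduct), and then run k-means on the reduced space. The data, the discretization and the greedy heuristic below are illustrative assumptions; the paper's exact reduct computation may differ.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# 3 informative dimensions holding two clusters + 7 irrelevant (constant) ones.
informative = np.vstack([rng.normal(0, 1, (100, 3)),
                         rng.normal(5, 1, (100, 3))])
X = np.hstack([informative, np.zeros((200, 7))])

def n_distinct(rows):
    return len({tuple(r) for r in rows})

def greedy_reduct(disc):
    """Add attributes until they discern the same objects as the full set."""
    target = n_distinct(disc)
    chosen = []
    while n_distinct(disc[:, chosen]) < target:
        rest = [j for j in range(disc.shape[1]) if j not in chosen]
        chosen.append(max(rest, key=lambda j: n_distinct(disc[:, chosen + [j]])))
    return chosen

disc = np.floor(X).astype(int)          # crude discretization by flooring
reduct = greedy_reduct(disc)
print("attributes kept:", reduct)       # constant columns are never selected

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X[:, reduct])
print("cluster sizes:", np.bincount(labels))
```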

Secure Protocol for Short Message Service

Short Message Service (SMS) has grown in popularity over the years and has become a common way of communication; it is a service provided through the Global System for Mobile Communications (GSM) that allows users to send text messages to others. SMS is usually used to transport unclassified information, but with the rise of mobile commerce it has become a popular tool for transmitting sensitive information between a business and its clients. By default, SMS does not guarantee the confidentiality and integrity of the message content. In mobile communication systems, the security (encryption) offered by the network operator applies only to the wireless link; data delivered through the mobile core network may not be protected. Existing end-to-end security mechanisms are provided at the application level and are typically based on public-key cryptosystems. The main concern in a public-key setting is the authenticity of the public key; this issue can be resolved by identity-based (ID-based) cryptography, where the public key of a user can be derived from public information that uniquely identifies the user. This paper presents an encryption mechanism based on an ID-based scheme using elliptic curves to provide end-to-end security for SMS. This mechanism has been implemented over the standard SMS network architecture, and the encryption overhead has been estimated and compared with an RSA scheme. This study indicates that the ID-based mechanism has advantages over the RSA mechanism in key distribution and in the scalability of increasing the security level for mobile services.

Investigation of Inter Feeder Power Flow Regulator: Load Sharing Mode

The Inter Feeder Power Flow Regulator (IFPFR) proposed in this paper consists of several voltage source inverters with a common dc bus; each inverter is connected in series with one of several independent distribution feeders in the power system. This paper is concerned with how to transfer power between the feeders for load sharing purposes. The power controller of each inverter injects power (for the sending feeder) or absorbs power (for the receiving feeder) by injecting a suitable voltage; this voltage injection is emulated as a voltage drop across a series virtual impedance, whose value is selected to achieve power exchange between the feeders without perturbing the load voltage magnitude of each feeder. In this paper, a new control scheme for load sharing using the IFPFR is proposed.

Adaptive Impedance Control for Unknown Time-Varying Environment Position and Stiffness

This study is concerned with a new adaptive impedance control strategy to compensate for unknown time-varying environment stiffness and position. The uncertainties are expressed by Function Approximation Technique (FAT), which allows the update laws to be derived easily using Lyapunov stability theory. Computer simulation results are presented to validate the effectiveness of the proposed strategy.

The Intuitionistic Fuzzy Ordered Weighted Averaging-Weighted Average Operator and its Application in Financial Decision Making

We present a new intuitionistic fuzzy aggregation operator called the intuitionistic fuzzy ordered weighted averaging-weighted average (IFOWAWA) operator. The main advantage of the IFOWAWA operator is that it unifies the OWA operator with the WA operator in the same formulation, considering the degree of importance that each concept has in the aggregation. Moreover, it is able to deal with an uncertain environment that can be assessed with intuitionistic fuzzy numbers. We study some of its main properties and show that it includes many particular cases, such as the intuitionistic fuzzy weighted average (IFWA) and the intuitionistic fuzzy OWA (IFOWA) operators. Finally, we study the applicability of the new approach to a financial decision-making problem concerning the selection of financial strategies.
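
One common way to formulate such an operator, following the OWAWA idea of merging positional (OWA) and importance (WA) weights through a parameter beta, is sketched below for intuitionistic fuzzy pairs (mu, nu) ordered by their score mu - nu. The ordering criterion, the value of beta and the example data are assumptions, and the paper's exact formulation may differ in these details.

```python
def ifowawa(values, owa_weights, wa_weights, beta=0.5):
    """values: list of intuitionistic fuzzy pairs (mu, nu).
    owa_weights / wa_weights: weight vectors summing to 1.
    beta: assumed degree to which the OWA part dominates the WA part."""
    # Order arguments (and their importance weights) by decreasing score.
    order = sorted(range(len(values)),
                   key=lambda i: values[i][0] - values[i][1], reverse=True)
    ordered = [values[i] for i in order]
    p = [wa_weights[i] for i in order]
    # Unified weights: convex combination of positional and importance weights.
    v = [beta * owa_weights[j] + (1 - beta) * p[j] for j in range(len(values))]
    mu_prod, nu_prod = 1.0, 1.0
    for (m, n), vj in zip(ordered, v):
        mu_prod *= (1 - m) ** vj        # IFWA-style product rule for membership
        nu_prod *= n ** vj              # and for non-membership
    return 1 - mu_prod, nu_prod

# Example: three financial strategies assessed as intuitionistic fuzzy pairs.
strategies = [(0.7, 0.2), (0.5, 0.4), (0.6, 0.3)]
print(ifowawa(strategies, owa_weights=[0.5, 0.3, 0.2],
              wa_weights=[0.3, 0.3, 0.4], beta=0.4))
```

Setting beta = 1 recovers the IFOWA operator and beta = 0 the IFWA operator, consistent with the particular cases mentioned above.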

Person Identification using Gait by Combined Features of Width and Shape of the Binary Silhouette

Current image-based individual human recognition methods, such as fingerprint, face, or iris biometric modalities, generally require a cooperative subject, views from certain aspects, and physical contact or close proximity. These methods cannot reliably recognize non-cooperating individuals at a distance in the real world under changing environmental conditions. Gait, which concerns recognizing individuals by the way they walk, is a relatively new biometric without these disadvantages. The inherent gait characteristic of an individual makes it irreplaceable and useful in visual surveillance. In this paper, an efficient gait recognition system for human identification is proposed, based on two features: the width vector of the binary silhouette and MPEG-7 region-based shape descriptors. In the proposed method, foreground objects, i.e., humans and other moving objects, are extracted by estimating the background with a Gaussian Mixture Model (GMM), and a median filtering operation is subsequently performed to remove noise from the background-subtracted image. A moving target classification algorithm, based on shape and boundary information, is used to separate human beings (i.e., pedestrians) from other foreground objects (e.g., vehicles). Subsequently, the width vector of the outer contour of the binary silhouette and the MPEG-7 Angular Radial Transform coefficients are taken as the feature vector. Next, Principal Component Analysis (PCA) is applied to the feature vector to reduce its dimensionality. The extracted feature vectors are used to train a Hidden Markov Model (HMM) for the identification of individuals. The proposed system is evaluated on several gait sequences, and the experimental results show the efficacy of the proposed algorithm.
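
A rough sketch of the described pipeline is given below using OpenCV, scikit-learn and hmmlearn: GMM background subtraction, median filtering, width-vector extraction from the binary silhouette, PCA, and an HMM per person. The video file name and all parameter values are hypothetical, the moving-target classification step is omitted, and the MPEG-7 Angular Radial Transform descriptor is not included because no off-the-shelf implementation is assumed.

```python
import cv2
import numpy as np
from sklearn.decomposition import PCA
from hmmlearn import hmm

def width_vector(mask, n_rows=64):
    """Horizontal width of the silhouette in each of n_rows bands."""
    mask = cv2.resize(mask, (64, n_rows), interpolation=cv2.INTER_NEAREST)
    return (mask > 0).sum(axis=1).astype(float)

cap = cv2.VideoCapture("walk.avi")            # hypothetical gait sequence
bg = cv2.createBackgroundSubtractorMOG2()     # GMM background model
features = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg = bg.apply(frame)                      # foreground mask
    fg = cv2.medianBlur(fg, 5)                # remove salt-and-pepper noise
    _, silhouette = cv2.threshold(fg, 127, 255, cv2.THRESH_BINARY)
    if silhouette.sum() > 0:
        features.append(width_vector(silhouette))
cap.release()

X = np.array(features)
X_red = PCA(n_components=8).fit_transform(X)  # dimensionality reduction

# One HMM per person; at test time, the model with the highest likelihood wins.
model = hmm.GaussianHMM(n_components=5, covariance_type="diag", n_iter=50)
model.fit(X_red)
print("log-likelihood of the training sequence:", model.score(X_red))
```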