Measuring the Comprehensibility of a UML-B Model and a B Model

Software maintenance, which involves making enhancements, modifications and corrections to existing software systems, consumes more than half of developer time. Specification comprehensibility plays an important role in software maintenance as it permits the system properties to be understood more easily and quickly. The use of a formal notation such as B increases a specification's precision and consistency. However, the notation is regarded as difficult to comprehend. A semi-formal notation such as the Unified Modelling Language (UML) is perceived as more accessible, but it lacks formality. Combining both notations could perhaps produce a specification that is not only accurate and consistent but also accessible to users. This paper presents an experiment conducted on a model that integrates the use of both UML and B notations, namely UML-B, versus a B model alone. The objective of the experiment was to evaluate the comprehensibility of a UML-B model compared to a traditional B model. The measurement used in the experiment focused on the efficiency in performing the comprehension tasks. The experiment employed a cross-over design and was conducted on forty-one subjects, including undergraduate and master's students. The results show that the notation used in the UML-B model is more comprehensible than that of the B model.

Key Based Text Watermarking of E-Text Documents in an Object Based Environment Using Z-Axis for Watermark Embedding

Hiding data in text documents is inherently complex due to the nature of such documents. A robust text watermarking scheme targeting an object-based environment is presented in this research. At the heart of the proposed solution is the concept of watermarking an object-based text document in which each text string is treated as a separate object with its own set of properties. Taking advantage of the z-ordering of objects, the watermark is applied along the z-axis, causing no fidelity disturbance to the text. The watermark bit sequence generated from the user key is hashed with selected properties of the given document to determine the bit sequence to embed. Bits are embedded along the z-axis, and the document shows no fidelity issues when printed, scanned or photocopied.
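A minimal sketch of the key-driven embedding idea is given below. The property string, hash construction, and z-offset encoding are assumptions for illustration, not the authors' exact scheme.

```python
# Illustrative sketch only: key-driven selection of watermark bits and their
# embedding as z-axis offsets of text-string objects. Property names, hashing
# details and the offset encoding are assumptions, not the paper's scheme.
import hashlib

def watermark_bits(user_key: str, doc_properties: str, length: int):
    """Derive the bit sequence to embed by hashing the user key with
    selected properties of the document (e.g., title, page count)."""
    digest = hashlib.sha256((user_key + doc_properties).encode()).digest()
    bits = [(byte >> i) & 1 for byte in digest for i in range(8)]
    return bits[:length]

def embed(text_objects, user_key, doc_properties):
    """Assign each text-string object a z-offset encoding one watermark bit.
    The offsets do not change the rendered text, so fidelity is preserved."""
    bits = watermark_bits(user_key, doc_properties, len(text_objects))
    for obj, bit in zip(text_objects, bits):
        obj["z"] = obj.get("z", 0) + (1 if bit else 0)  # bit 1 -> raise z-order by one
    return text_objects

doc = [{"text": "Chapter 1"}, {"text": "Hello world"}, {"text": "Footer"}]
print(embed(doc, "secret-key", "title=Report;pages=3"))
```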

Effect of Increasing Road Light Luminance on Night Driving Performance of Older Adults

The main objective of this study was to determine whether a minimal increase in road light level (luminance) could lead to improved driving performance among older adults. Older, middle-aged and younger adults were tested in a driving simulator following vision and cognitive screening. Comparisons were made of simulated night-driving performance under two road light conditions (0.6 and 2.5 cd/m2). At each light level, the effects of self-reported night driving avoidance were examined along with vision/cognitive performance. It was found that increasing the road light level from 0.6 cd/m2 to 2.5 cd/m2 resulted in improved recognition of signage on straight highway segments. The improvement depended on driver-related factors such as vision, cognitive abilities, and confidence. On curved road sections, the results showed that drivers' performance worsened. It is concluded that while increased road lighting may be helpful to older adults, especially for sign recognition, it may also result in increased driving confidence and thus reduced attention in some driving situations.

Removal of Ciprofloxacin and Carbamazepine by Adsorption on Functionalized Mesoporous Silicates

Ciprofloxacin (CIP) and Carbamazepine (CBZ), non-biodegradable pharmaceutical residues, have become emerging pollutants in several aquatic environments. The objective of this research was to study the possibility of recovering these pharmaceutical residues from pharmaceutical wastewater by increasing selective adsorption on synthesized functionalized porous silicates, compared with powdered activated carbon (PAC). Hexagonal mesoporous silicate (HMS) and functionalized HMSs (3-aminopropyltriethoxy, 3-mercaptopropyltrimethoxy and n-octyldimethyl) were synthesized and their physico-chemical characteristics were determined. The obtained adsorption kinetics and isotherms showed that 3-mercaptopropyltrimethoxy functional groups grafted on HMS provided the highest CIP and CBZ adsorption capacities; however, these were still lower than those of PAC. The kinetic results were compatible with a pseudo-second-order model. Hydrophobicity and hydrogen bonding might play a key role in the adsorption. Furthermore, the capacities were affected by varying pH values due to the strength of hydrogen bonding between the target compounds and the adsorbents. Electrostatic interaction might not affect the adsorption capacities.
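For reference, the pseudo-second-order model referred to above is conventionally written in its linearized form as

\[ \frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}, \]

where \(q_t\) and \(q_e\) are the amounts adsorbed at time \(t\) and at equilibrium, respectively, and \(k_2\) is the pseudo-second-order rate constant.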

2D-Modeling with Lego Mindstorms

This work is based on the possibility of using the Lego Mindstorms robotics system to reduce costs. Lego Mindstorms consists of a wide variety of hardware components necessary to simulate, programme and test robotics systems in practice. The development environment supplied with the kit was used to programme the algorithm, which maps the space using the ultrasonic sensor. Matlab was used to render the values after they were measured by the ultrasonic sensor. The algorithm created for this paper draws on theoretical knowledge from the area of signal processing. The data processed by the algorithm are collected by the ultrasonic sensor, which scans the 2D space in front of it. The ultrasonic sensor is placed on the moving arm of the robot, which provides horizontal movement of the sensor; vertical movement of the sensor is provided by the wheel drive. The robot follows a map in order to position the measured data correctly. Based on these findings, Lego Mindstorms can be considered a low-cost and capable kit for real-time modelling.
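A minimal sketch of turning such a scan into 2D points is shown below; the step size, units, and sample values are assumptions for illustration (the original rendering was done in Matlab).

```python
# Illustrative sketch only: converting ultrasonic readings taken at successive
# horizontal arm positions into 2D points (arm position, measured distance).
# Step size, units and the sample distances are assumptions, not the paper's data.
def scan_to_profile(readings_cm, step_cm=2.0):
    """readings_cm[i] is the distance measured at the i-th arm position."""
    return [(i * step_cm, d) for i, d in enumerate(readings_cm)]

# Example: a flat obstacle roughly 60 cm away in the middle of the scanned strip.
print(scan_to_profile([120, 118, 60, 59, 121]))
```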

A Soft Set based Group Decision Making Method with Criteria Weight

Molodtsov's soft set theory was originally proposed as a general mathematical tool for dealing with uncertainty problems. The matrix form has been introduced into soft set theory and some of its properties have been discussed. However, the existing formulation of soft matrices in group decision making assumes equal importance weights for all criteria, which does not reflect the decision makers' true opinion of each criterion. The aim of this paper is to propose a method for solving group decision making problems that incorporates the importance of the criteria by using soft matrices in a more objective manner. The weight of each criterion is calculated using the Analytic Hierarchy Process (AHP) method. An example of a house selection process is given to illustrate the effectiveness of the proposed method.
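A minimal sketch of how AHP weights can be combined with decision makers' soft matrices is given below; the simple weighted-score aggregation and the example data are assumptions, not the paper's exact formulation.

```python
# Illustrative sketch only: AHP criteria weights applied to (binary) soft
# matrices supplied by several decision makers.
import numpy as np

def ahp_weights(pairwise):
    """Approximate AHP priority vector: normalize each column, average the rows."""
    A = np.asarray(pairwise, dtype=float)
    return (A / A.sum(axis=0)).mean(axis=1)

def rank_alternatives(soft_matrices, weights):
    """soft_matrices: one 0/1 matrix per decision maker (rows = alternatives,
    columns = criteria). Returns alternatives ordered by weighted score."""
    scores = sum(np.asarray(M, dtype=float) @ weights for M in soft_matrices)
    return np.argsort(-scores), scores

# Three criteria (e.g., price, location, size) compared pairwise for AHP.
pairwise = [[1, 3, 5],
            [1/3, 1, 2],
            [1/5, 1/2, 1]]
w = ahp_weights(pairwise)
dm1 = [[1, 0, 1], [1, 1, 0], [0, 1, 1]]   # decision maker 1, three houses
dm2 = [[1, 1, 1], [0, 1, 0], [1, 0, 1]]   # decision maker 2
order, scores = rank_alternatives([dm1, dm2], w)
print(w, order, scores)
```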

Robust Design of Power System Stabilizers Using Adaptive Genetic Algorithms

Genetic algorithms (GAs) have been widely used for global optimization problems. GA performance depends highly on the choice of the search space for each parameter to be optimized. Often, this choice is based on problem-specific experience. The search space, being a set of potential solutions, may contain the global optimum and/or other local optima. A bad choice of this search space results in poor solutions. In this paper, our approach consists of extending the search space boundaries during the GA optimization, but only when required. This leads to more diversification of the GA population through new solutions that were not available with fixed search space boundaries. Such dynamic search spaces can therefore improve GA optimization performance. The proposed approach is applied to power system stabilizer optimization for a multimachine power system (16 generators and 68 buses). The obtained results are evaluated and compared with those obtained by ordinary GAs. Eigenvalue analysis and nonlinear system simulation results show the effectiveness of the proposed approach in damping out the electromechanical oscillations and enhancing global system stability.
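A minimal sketch of one way to extend a boundary only when it is needed is given below; the trigger condition (best individual pressing against a bound) and the 20% extension factor are assumptions, not the paper's exact rule.

```python
# Illustrative sketch only: extend a parameter's search-space boundary when the
# best individual lies very close to it, so the GA can explore new solutions.
def maybe_extend_bounds(best, lower, upper, margin=0.02, factor=0.2):
    """best, lower, upper: per-parameter value and bounds (lists of floats).
    If the best solution is within `margin` of a bound, push that bound
    outward by `factor` of the current range."""
    for i, x in enumerate(best):
        span = upper[i] - lower[i]
        if x <= lower[i] + margin * span:
            lower[i] -= factor * span      # extend downwards
        elif x >= upper[i] - margin * span:
            upper[i] += factor * span      # extend upwards
    return lower, upper

lo, hi = maybe_extend_bounds([0.99, 5.0], [0.0, 0.0], [1.0, 10.0])
print(lo, hi)   # the first parameter's upper bound has been extended
```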

DRE - A Quality Metric for Component based Software Products

The overriding goal of software engineering is to provide a high-quality system, application or product. To achieve this goal, software engineers must apply effective methods coupled with modern tools within the context of a mature software process [2]. In addition, it is also necessary to ensure that high quality is realized. Although many quality measures can be collected at the project level, the most important measures are errors and defects. Deriving a quality measure for reusable components has proven to be a challenging task. The results obtained from the study are based on empirical evidence of reuse practices, as it emerged from the analysis of industrial projects. Both large and small companies, working in a variety of business domains and using object-oriented and procedural development approaches, contributed to this study. This paper proposes a quality metric that provides benefit at both the project and process level, namely defect removal efficiency (DRE).
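In its conventional form (the standard definition, not a result of this paper), defect removal efficiency relates the errors found before delivery, E, to the defects found after delivery, D:

\[ \mathrm{DRE} = \frac{E}{E + D}. \]

For example, 90 errors removed during development against 10 defects reported in the field gives DRE = 90/100 = 0.9.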

Industrial Waste Monitoring

Conventional industrial monitoring systems are tedious and inefficient, and at times the integrity of the data is unreliable. The objective of this system is to monitor industrial processes, specifically fluid level: it measures the instantaneous fluid level and responds by text messaging the exact value of the parameter when queried by a privileged user. The development of the embedded program code and the circuit for fluid level measurement is also discussed. Suggestions for future implementations and efficient remote monitoring work are included.

Implementation of a "DIVA" Concept with Specific ELISA Kits When a Subunit H5 Avian Influenza Vaccine Is Used

The main objective of this study was to demonstrate that a differentiation of infected and vaccinated animals (DIVA) strategy using different ELISA tests is possible when a subunit vaccine (Haemagglutinin protein) is used to prevent Avian Influenza. Special emphasis was placed on the differences in the serological response to different components of the AIV (Nucleoprotein, Neuraminidase, Haemagglutinin, Nucleocapsid) between chickens vaccinated with a whole-virus killed vaccine and those vaccinated with a recombinant vaccine. Furthermore, the potential use of this DIVA strategy with ELISA assays detecting Neuraminidase 1 (N1) was analyzed as a strategy for countries where the field virus is H5N1 and the vaccine used is formulated with H5N2. Detection of antibodies to any AIV component in serum was negative for all animals on study days 0-13. At study day 14, the titers of antibodies against Nucleoprotein (NP) and Nucleocapsid (NC) rose in the experimental groups vaccinated with Volvac® AI KV and remained negative throughout the trial in the experimental groups vaccinated with the subunit H5 vaccine; statistically significant differences were observed between these groups (p < 0.05). Seroconversion to either Haemagglutinin or Neuraminidase was evident 21 days post-vaccination in the experimental groups vaccinated with the respective viral fraction. Regarding the main aim of this study and according to the results obtained, using a combination of different ELISA tests as a DIVA strategy is feasible when vaccination is carried out with a subunit H5 vaccine. It is also possible to use the ELISA kit detecting Neuraminidase (either N1 or N2) as a DIVA concept in countries where H5N1 is present and the vaccination programmes use an H5N2 vaccine.

A Logic Based Framework for Planning for Mobile Agents

The objective of the paper is twofold. The first is to develop a formal framework for planning for mobile agents: a logical language based on a temporal logic is proposed that can express a type of task which often arises in network management. The second is to design a planning algorithm for such tasks. The broader aim is to study the importance of finding plans for mobile agents. Although there has been a lot of research on mobile agents, not much work has been done to incorporate planning ideas for such agents. This paper makes an attempt in this direction. A theoretical study of finding plans for mobile agents is undertaken. A planning algorithm (based on the paradigm of mobile computing) is proposed and its space, time, and communication complexities are analyzed. The algorithm is illustrated by working out an example in detail.
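As a purely illustrative example (an assumption about the kind of network-management task meant, not a formula from the paper), an itinerary such as "eventually visit node \(n_1\), then node \(n_2\), and finally report back to the home host" could be expressed in a temporal language as

\[ \Diamond\bigl(\mathrm{at}(n_1) \wedge \Diamond\bigl(\mathrm{at}(n_2) \wedge \Diamond\,\mathrm{at}(\mathit{home})\bigr)\bigr). \]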

Collaborative Web Platform for Rich Media Educational Material Creation

This paper describes a platform that addresses the main research areas for e-learning educational content. Reusability concerns the possibility of using content in different courses, reducing costs and exploiting available data from repositories. In our approach the production of educational material is based on templates to reuse learning objects. In terms of interoperability, the main challenge lies in reaching the audience through different platforms. E-learning solutions must track the evolution of social consumption, where much multimedia content is now accessed through social networks. Our work addresses this by implementing a platform for the generation of multimedia presentations focused on the new social media paradigm. The system produces video courses on top of the web standard SMIL (Synchronized Multimedia Integration Language), ready to be published and shared. Regarding interfaces, it is mandatory to satisfy user needs and ease communication. To this end, the platform deploys virtual teachers that provide natural interfaces, while multimodal features remove barriers for pupils with disabilities.
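A minimal sketch of emitting a SMIL presentation of this kind is shown below; the file names, region layout, durations, and the SMIL 3.0 namespace choice are assumptions for illustration, not the platform's actual output.

```python
# Illustrative sketch only: writing a minimal SMIL document that plays a
# lecture video in parallel with a slide image and an audio narration.
SMIL_TEMPLATE = """<smil xmlns="http://www.w3.org/ns/SMIL" version="3.0">
  <head>
    <layout>
      <root-layout width="1280" height="720"/>
      <region id="video" left="0" top="0" width="640" height="720"/>
      <region id="slides" left="640" top="0" width="640" height="720"/>
    </layout>
  </head>
  <body>
    <par>
      <video src="{video}" region="video"/>
      <img src="{slide}" region="slides" dur="{dur}"/>
      <audio src="{audio}"/>
    </par>
  </body>
</smil>
"""

def write_presentation(path, video, slide, audio, dur="300s"):
    with open(path, "w", encoding="utf-8") as f:
        f.write(SMIL_TEMPLATE.format(video=video, slide=slide, audio=audio, dur=dur))

write_presentation("lesson.smil", "lecture.mp4", "slide1.png", "narration.mp3")
```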

Remediation of Petroleum Hydrocarbon-contaminated Soil Slurry by Fenton Oxidation

The objective of this study was to evaluate the optimal treatment conditions of the Fenton oxidation process for removing contaminants from soil slurry contaminated by petroleum hydrocarbons. This research studied several factors that affect the removal efficiency of petroleum hydrocarbons in soil slurry, including the molar ratio of hydrogen peroxide (H2O2) to ferrous ion (Fe2+), pH, and reaction time. The results demonstrated that the optimum condition was a H2O2:Fe2+ molar ratio of 200:1 and a pH of 4.0; the reaction rate increased rapidly from the starting point to the 7th hour, and the destruction kinetic rate constant (k) was 0.24 h-1. Approximately 96% removal of petroleum hydrocarbons was observed (initial total petroleum hydrocarbon (TPH) concentration = 70 ± 7 g kg-1).
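Since the reported rate constant has units of h^-1, the destruction kinetics can be read as first-order decay of the TPH concentration (an assumed interpretation consistent with the units):

\[ C_t = C_0\, e^{-kt}, \qquad \ln\!\frac{C_0}{C_t} = kt, \qquad k = 0.24\ \mathrm{h^{-1}}. \]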

Disparity Estimation for Objects of Interest

An algorithm for estimating the disparity of objects of interest is proposed. This algorithm uses image shifting and the overlapping area to estimate the disparity value; thereby the depth of the objects of interest can be obtained. The algorithm is able to operate at different levels of accuracy; however, as the accuracy increases, the processing speed decreases. The algorithm is tested with static stereo images and with sequences of stereo images. The experimental results are presented in this paper.
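A minimal sketch of one plausible reading of the shift-and-overlap idea is given below; the SAD score, the shift step (which trades accuracy against speed), and the wrap-around handling are assumptions, not the paper's exact method.

```python
# Illustrative sketch only: estimate the disparity of an object of interest by
# shifting the right image and scoring how well it matches the object's region
# in the left image. Depth then follows from depth = focal_length * baseline / d.
import numpy as np

def estimate_disparity(left, right, mask, max_disp=64, step=1):
    """left, right: grayscale images (2D arrays); mask: boolean region of the
    object of interest in the left image. A smaller `step` gives finer accuracy
    but a slower search. Wrap-around columns from np.roll are ignored here."""
    left = np.asarray(left, dtype=float)
    right = np.asarray(right, dtype=float)
    best_d, best_score = 0, np.inf
    for d in range(0, max_disp, step):
        shifted = np.roll(right, d, axis=1)                 # shift right image by d pixels
        score = np.abs(left[mask] - shifted[mask]).sum()    # SAD over the object region
        if score < best_score:
            best_d, best_score = d, score
    return best_d
```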

Surface Topography Assessment Techniques based on an In-process Monitoring Approach of Tool Wear and Cutting Force Signature

The quality of a machined surface is becoming more and more important in meeting the increasing demands for sophisticated component performance, longevity, and reliability. Usually, any machining operation leaves its own characteristic evidence on the machined surface in the form of finely spaced micro-irregularities (surface roughness) left by the associated indeterministic characteristics of the different elements of the system: tool, machine, workpart and cutting parameters. However, one of the most influential sources affecting surface roughness in machining is the instantaneous state of the tool edge. The main objective of the current work is to relate the in-process immeasurable cutting-edge deformation and surface roughness to more reliable, easy-to-measure force signals using robust non-linear time-dependent regression modeling techniques. Time-dependent modeling is beneficial when modern machining systems, such as adaptive control techniques, are considered, where the state of the machined surface and the health of the cutting edge are monitored, assessed and controlled online using real-time information provided by the variability encountered in the measured force signals. Correlation between wear propagation and roughness variation is developed throughout the different edge lifetimes. The surface roughness is further evaluated in the light of the variation in both the static and the dynamic force signals. Consistent correlation is found between surface roughness variation and tool wear progress within its initial and constant regions. In the first few seconds of cutting, the expected and well-known trend of the effect of the cutting parameters is observed: surface roughness is positively influenced by the level of the feed rate and negatively by the cutting speed. As cutting continues, roughness is affected, to different extents, by the rather localized wear modes on either the tool nose or its flank areas. Moreover, roughness appears to vary as the wear behaviour transfers from one mode to another and, in general, it is shown to improve as wear increases, but with possible corresponding workpart dimensional inaccuracy. The dynamic force signals are found to be reasonably sensitive to both the progressive and the random modes of tool edge deformation. While the frictional force components (feed and radial) are found informative regarding progressive wear modes, the vertical (power) component is found to be a more representative carrier of the system instability resulting from the edge's random deformation.

Object Localization in Medical Images Using Genetic Algorithms

We present a genetic algorithm application to the problem of object registration (i.e., object detection, localization and recognition) in a class of medical images containing various types of blood cells. The genetic algorithm approach taken here is seen to be most appropriate for this type of image, due to the characteristics of the objects. Successful cell registration results on real life microscope images of blood cells show the potential of the proposed approach.

On the Problem of Parameter Identification of a Dynamic Object

In this paper, several formulations of the problem of recovering the parameters of a dynamic object described by a non-autonomous system of ordinary differential equations with multipoint unshared boundary conditions are investigated. Depending on the number of additional conditions, the problem is reduced to a system of algebraic equations or to a quadratic programming problem. For this purpose, the paper offers a new scheme of the boundary-condition transfer method, called the condition shift. The method makes it possible to eliminate the differential links and the multipoint unshared initial-boundary conditions. The advantage of the proposed approach lies in its ability to reduce a parametric identification problem to the essentially simpler problems of solving an algebraic system or a quadratic program.
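One illustrative way to see the reduction (an assumed formulation, not the paper's notation): for a system that is linear in the unknown parameter vector \(\theta\),

\[ \dot{x}(t) = A(t)\,x(t) + \Phi(t)\,\theta + f(t), \qquad \sum_{j=1}^{m} G_j\, x(t_j) = g, \]

the solution depends affinely on \(\theta\), so substituting it into the multipoint conditions yields a linear algebraic system \(M\theta = b\) when the number of conditions matches the number of unknown parameters, and the quadratic programming problem \(\min_{\theta}\lVert M\theta - b\rVert^2\) when additional conditions are available.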

Analysis and Research of Two-Level Scheduling Profile for Open Real-Time System

In an open real-time system environment, the coexistence of different kinds of real-time and non-real-time applications confronts the system scheduling mechanism with new requirements and challenges. One existing two-level scheduling scheme for open real-time systems is reviewed, and it is pointed out that because hard and soft real-time applications are scheduled indiscriminately as the same type of real-time application, the Quality of Service (QoS) cannot be guaranteed. The scheme has two flaws. First, it cannot differentiate the scheduling priorities of hard and soft real-time applications; that is, it neglects the characteristic differences between them, so it does not suit a more complex real-time environment. Second, the worst-case execution time of soft real-time applications cannot be predicted exactly, so it is not worthwhile to spend substantial resources to guarantee that no soft real-time application misses its deadline; doing so may waste resources. To solve this problem, a novel two-level real-time scheduling mechanism (comprising a scheduling profile and a scheduling algorithm) that adds a process for dealing with soft real-time applications is proposed. Finally, we verify the scheduling mechanism both theoretically and experimentally. The results indicate that our scheduling mechanism achieves the following objectives. (1) It reflects the difference in priority when scheduling hard and soft real-time applications. (2) It ensures the schedulability of hard real-time applications, that is, their deadline miss rate is 0. (3) The overall deadline miss rate of soft real-time applications can be kept below 1. (4) Non-real-time applications have no deadlines, but the scheduling algorithm used by server S0 can avoid job "starvation" and increase QoS. As a result, our scheduling mechanism is more compatible with different types of applications and can be applied more widely.
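A minimal illustrative sketch of the general idea of separating hard, soft, and non-real-time work at the top level is given below; the class names, the earliest-deadline-first policy inside each band, and the background handling of non-real-time jobs are assumptions, not the paper's profile or algorithm.

```python
# Illustrative sketch only: a generic dispatcher that treats hard real-time,
# soft real-time and non-real-time jobs as distinct bands.
class Job:
    def __init__(self, name, kind, deadline=None):
        self.name = name          # job identifier
        self.kind = kind          # "hard", "soft", or "none" (non-real-time)
        self.deadline = deadline  # absolute deadline, None for non-real-time

def pick_next(jobs):
    """Hard real-time first, then soft real-time, then non-real-time.
    Within each real-time band, earliest deadline first (an assumed policy)."""
    hard = [j for j in jobs if j.kind == "hard"]
    soft = [j for j in jobs if j.kind == "soft"]
    other = [j for j in jobs if j.kind == "none"]
    if hard:
        return min(hard, key=lambda j: j.deadline)
    if soft:
        return min(soft, key=lambda j: j.deadline)
    return other[0] if other else None   # background server handles non-RT jobs

jobs = [Job("logger", "none"), Job("control-loop", "hard", 5), Job("video", "soft", 8)]
print(pick_next(jobs).name)   # -> control-loop
```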

Impulse Noise Reduction in Brain Magnetic Resonance Imaging Using Fuzzy Filters

Noise contamination in a magnetic resonance (MR) image can occur during acquisition, storage, and transmission, and effective filtering is required to avoid repeating the MR procedure. In this paper, an iterative asymmetrical triangle fuzzy filter with moving average center (ATMAVi filter) is used to reduce different levels of salt-and-pepper noise in a brain MR image. Besides visual inspection of the filtered images, the mean squared error (MSE) is used as an objective measurement. When compared with the median filter, simulation results indicate that the ATMAVi filter is effective, especially for filtering higher-level noise (such as noise density = 0.45), using a smaller window size (such as 3x3) when operated iteratively or a larger window size (such as 5x5) when operated non-iteratively.
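For an M x N image, the MSE used as the objective measurement is, in its standard form,

\[ \mathrm{MSE} = \frac{1}{MN} \sum_{i=1}^{M} \sum_{j=1}^{N} \bigl( x_{ij} - \hat{x}_{ij} \bigr)^2, \]

where \(x\) is the reference (noise-free) image and \(\hat{x}\) is the filtered image.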

A Study on the Leadership Behavior, Safety Culture, and Safety Performance of the Healthcare Industry

Objective: To review recent publications on patient safety culture and investigate the relationship between leadership behavior, safety culture, and safety performance in the healthcare industry. Method: This was a cross-sectional study; 350 questionnaires were mailed to hospital workers, with 195 valid responses obtained (a 55.7% valid response rate). Confirmatory factor analysis (CFA) was carried out to test the factor structure and determine whether the composite reliability was significant, with factor loadings of >0.5, resulting in an acceptable model fit. Results: One-way ANOVA showed that physicians have significantly more negative patient safety culture and safety performance perceptions than non-physicians. Conclusions: The path analysis results show that leadership behavior affects safety culture and safety performance in the healthcare industry. Safety performance was affected and improved with contingency leadership and a positive patient safety organization culture. The study suggests improving safety performance by providing a well-managed system that includes consideration of leadership, hospital worker training courses, and a solid safety reporting system.