A Generalized Framework for Working with Multiagent Systems

This paper discusses the basic concepts and underlying principles of Multi-Agent Systems (MAS), together with the interdisciplinary exploitation of these principles. MAS have been used in a large body of research on systems spanning diverse engineering and scientific domains, which highlights the need for a properly generalized framework. Such a framework has been developed here and generalized with the diverse application areas of MAS in mind. All related aspects have been categorized, and a general definition has been given wherever possible.

Power Optimization Techniques in FPGA Devices: A Combination of System- and Low-Levels

This paper presents preliminary results on system-level power awareness for FPGA implementations in wireless sensor networks. The reconfigurability of field-programmable gate arrays (FPGAs) allows significant flexibility in their application to embedded systems. However, the high power consumption of FPGAs is a significant factor in design considerations. We present several ideas, together with their experimental verification, on how to optimize power consumption at a high level of the design process while maintaining the same energy per operation (low-level methods can be applied in addition). This paper demonstrates that feasible power savings can be estimated even at a high level of the design process. We envisage that our results can also be applied to other embedded systems, not only FPGA-based ones.

Identification of Ductile Damage Parameters for Austenitic Steel

Modeling the inelastic behavior of plastic materials requires measurements providing information on the material response under different multiaxial loading conditions. Different triaxiality conditions and values of the Lode parameter have to be covered for a comprehensive description of the material's plastic behavior. Sample geometries providing the plastic behavior of interest over this range are proposed with the use of FEM analysis. Round samples with three different notches and a smooth surface are used together with butterfly-type samples tested at angles ranging from 0 to 90°. Identification of the ductile damage parameters is carried out on the basis of the experimental data obtained for an austenitic stainless steel. The identified damage parameters are subsequently applied to an FEM simulation of notched CT samples normally used for fracture mechanics testing, and the simulation results are compared with real tests.
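As an illustration of the loading-state quantities mentioned above, the sketch below computes the stress triaxiality and the Lode parameter from a principal stress state. The formulas follow one common convention (von Mises equivalent stress, Lode parameter defined from the intermediate principal stress) and are not taken from the paper itself.

```python
import numpy as np

def triaxiality_and_lode(s1, s2, s3):
    """Stress triaxiality and Lode parameter from principal stresses
    (one common convention; ordering s1 >= s2 >= s3 is assumed)."""
    sm = (s1 + s2 + s3) / 3.0                                          # mean (hydrostatic) stress
    seq = np.sqrt(0.5 * ((s1 - s2)**2 + (s2 - s3)**2 + (s3 - s1)**2))  # von Mises equivalent stress
    eta = sm / seq                                                     # triaxiality
    lode = (2.0 * s2 - s1 - s3) / (s1 - s3)                            # Lode parameter in [-1, 1]
    return eta, lode

# Uniaxial tension: triaxiality = 1/3, Lode parameter = -1
print(triaxiality_and_lode(300.0, 0.0, 0.0))
```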

Animal-Assisted Therapy for Persons with Disabilities Based on Canine Tail Language Interpretation via Gaussian-Trapezoidal Fuzzy Emotional Behavior Model

In order to alleviate the mental and physical problems of persons with disabilities, animal-assisted therapy (AAT) is one possible modality that exploits the benefits of human-animal interaction. Nevertheless, to achieve the purpose of AAT for persons with severe disabilities (e.g. spinal cord injury, stroke, and amyotrophic lateral sclerosis), real-time animal language interpretation is desirable. Since canine emotions are visibly expressed through tail movement, this paper proposes the automatic real-time interpretation of canine tail language for human-canine interaction in the case of persons with severe disabilities. Canine tail language is captured via two 3-axis accelerometers. Tail directions and wagging frequencies are selected as the features of interest. Novel fuzzy rules based on a Gaussian-trapezoidal model, together with a center-of-gravity (COG) defuzzification method, are proposed to map these features onto four canine emotional behaviors, i.e., agitated, happy, scared, and neutral, as well as their blends. The emotional behavior model was first exercised on a simulated dog and has also been evaluated on a real dog, achieving a perfect recognition rate.
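For readers unfamiliar with the building blocks named above, the sketch below shows Gaussian and trapezoidal membership functions combined with center-of-gravity defuzzification. The rule base, membership parameters, and output universe are placeholders invented for illustration; the paper's actual rules and parameter values are not given in the abstract.

```python
import numpy as np

def gaussian_mf(x, c, sigma):
    """Gaussian membership function."""
    return np.exp(-0.5 * ((x - c) / sigma) ** 2)

def trapezoid_mf(x, a, b, c, d):
    """Trapezoidal membership function with shoulders a <= b <= c <= d."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

def cog_defuzzify(y, mu):
    """Center-of-gravity defuzzification over a discretized output universe."""
    return np.sum(y * mu) / np.sum(mu)

# Toy example: a wag frequency (Hz) fires two hypothetical rules whose clipped
# output sets are aggregated and defuzzified into a single "emotion score".
freq = 2.5
w_happy  = gaussian_mf(freq, c=3.0, sigma=1.0)       # placeholder parameters
w_scared = trapezoid_mf(freq, 0.0, 0.5, 1.5, 2.8)    # placeholder parameters

y = np.linspace(0.0, 10.0, 501)                      # output universe
mu = np.maximum(np.minimum(w_happy,  gaussian_mf(y, 7.0, 1.5)),
                np.minimum(w_scared, trapezoid_mf(y, 0.0, 1.0, 3.0, 4.0)))
print(cog_defuzzify(y, mu))
```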

Biodiesel as an Alternative Fuel for Diesel Engines

There is growing interest in biodiesel (fatty acid methyl ester, FAME) because its properties are similar to those of diesel fuels. Diesel engines operated on biodiesel have lower emissions of carbon monoxide, unburned hydrocarbons, particulate matter, and air toxics than when operated on petroleum-based diesel fuel. In this work, the production of FAME from rapeseed (non-edible oil) fatty acid distillate with a high free fatty acid (FFA) content was investigated. The conditions for the esterification of the rapeseed oil were 1.8 % H2SO4 as catalyst, a MeOH/oil molar ratio of 2 : 0.1, and a reaction temperature of 65 °C for a period of 3 h. The methyl ester yield was > 90 % within 1 h. The FFA content was reduced from 93 wt % to less than 2 wt % by the end of the esterification process. The FAME was purified by neutralization with a 1 M aqueous sodium hydroxide solution at a reaction temperature of 62 °C. The final FAME product met the ASTM D 6751 biodiesel quality standard.

Hand Vein Image Enhancement With Radon Like Features Descriptor

Hand vein recognition has attracted increasing attention in biometric identification systems. Hand vein images are generally acquired with low contrast and irregular illumination. Consequently, with good preprocessing of the hand vein image, features can be extracted easily, even with simple binarization. In this paper, an approach is proposed to improve the quality of hand vein images. First, a brief survey of existing enhancement methods is given. Then a Radon-Like Features method is applied to preprocess the hand vein image. Finally, experimental results show that the proposed method is more effective and reliable in improving hand vein images.

Development of a Kinetic Model for the Photodegradation of 4-Chlorophenol using a XeBr Excilamp

Excilamps are new UV sources with great potential for application in wastewater treatment. In the present work, a XeBr excilamp emitting radiation at 283 nm has been used for the photodegradation of 4-chlorophenol over a concentration range from 50 to 500 mg L-1. Total removal of 4-chlorophenol was achieved for all concentrations assayed. The two main photoproduct intermediates formed during the photodegradation process, benzoquinone and hydroquinone, although not completely removed, remain at very low residual concentrations; these concentrations are insignificant compared with the initial 4-chlorophenol concentrations and are non-toxic. In order to simulate the process and support scale-up, a kinetic model has been developed and validated against the experimental data.
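The abstract does not state the form of the kinetic model, so the sketch below simply fits a pseudo-first-order decay, a common starting point for photodegradation kinetics, to hypothetical concentration-time data; both the model form and the numbers are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def first_order(t, c0, k):
    """Pseudo-first-order decay C(t) = C0 * exp(-k t)."""
    return c0 * np.exp(-k * t)

# Hypothetical (illustrative) concentration-time data: mg/L versus minutes
t = np.array([0, 10, 20, 40, 60, 90], dtype=float)
c = np.array([50.0, 36.0, 26.5, 14.0, 7.5, 2.9])

(c0_fit, k_fit), _ = curve_fit(first_order, t, c, p0=[50.0, 0.02])
print(f"C0 = {c0_fit:.1f} mg/L, k = {k_fit:.3f} 1/min")
```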

Virtual Assembly in a Semi-Immersive Environment

Virtual Assembly (VA) is one of the key technologies in advanced manufacturing. It is a promising application of virtual reality in design and manufacturing and has drawn much interest from industry and research institutes over the last two decades. This paper describes a process for integrating an interactive virtual-reality-based assembly simulation of a digital mockup with the CAD/CAM infrastructure. The necessary hardware and software preconditions for the process are explained so that it can easily be adopted by non-VR experts. The article outlines how assembly simulation can improve CAD/CAM procedures and structures, how CAD model preparation has to be carried out, and which virtual environment requirements have to be fulfilled. The issue of data transfer is also explained, along with further challenges and requirements such as anti-aliasing and collision detection. Finally, a VA simulation has been carried out for a ball valve assembly and a car door assembly with the help of the Vizard virtual reality toolkit in a semi-immersive environment, and its performance has been analyzed on different workstations to evaluate the importance of the graphics processing unit (GPU) in the field of VA.
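As a small illustration of the collision-detection requirement mentioned above, the sketch below shows an axis-aligned bounding-box (AABB) overlap test, a common broad-phase building block. It is not the collision routine used by the Vizard toolkit or by the paper, and the box coordinates are invented.

```python
from dataclasses import dataclass

@dataclass
class AABB:
    """Axis-aligned bounding box given by its min/max corners."""
    min_pt: tuple
    max_pt: tuple

def aabb_overlap(a: AABB, b: AABB) -> bool:
    """True if the two boxes intersect on all three axes (broad-phase test)."""
    return all(a.min_pt[i] <= b.max_pt[i] and b.min_pt[i] <= a.max_pt[i]
               for i in range(3))

valve_body = AABB((0, 0, 0), (2, 2, 2))
ball       = AABB((1.5, 1.5, 1.5), (3, 3, 3))
print(aabb_overlap(valve_body, ball))   # True: the two parts would collide here
```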

A Kernel Classifier using Linearised Bregman Iteration

In this paper we introduce a novel kernel classifier based on an iterative shrinkage algorithm developed for compressive sensing. We adopt Bregman iteration with soft and hard shrinkage functions and a generalized hinge loss to solve the l1-norm minimization problem for classification. Our experimental results on face recognition and digit classification, using the SVM as the benchmark, show that our method achieves error rates close to those of the SVM but does not outperform it. We found that the soft shrinkage method gives higher accuracy and, in some situations, more sparsity than the hard shrinkage method.
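To make the underlying algorithm concrete, the sketch below shows a plain linearized Bregman iteration with soft shrinkage for the generic problem min ||x||_1 subject to Ax = b. It omits the kernel construction and the generalized hinge loss used in the paper, and the step size and threshold are illustrative choices.

```python
import numpy as np

def soft_shrink(v, mu):
    """Soft-thresholding (shrinkage) operator."""
    return np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)

def linearized_bregman(A, b, mu=1.0, delta=None, n_iter=500):
    """Linearized Bregman iteration for min ||x||_1 s.t. Ax = b (sketch)."""
    if delta is None:
        delta = 1.0 / np.linalg.norm(A, 2) ** 2   # step size from the spectral norm
    x = np.zeros(A.shape[1])
    v = np.zeros(A.shape[1])
    for _ in range(n_iter):
        v += A.T @ (b - A @ x)                    # dual (Bregman) update
        x = delta * soft_shrink(v, mu)            # primal update via soft shrinkage
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((30, 100))
x_true = np.zeros(100); x_true[[3, 40, 77]] = [1.5, -2.0, 0.7]
b = A @ x_true
print(np.round(linearized_bregman(A, b)[[3, 40, 77]], 2))   # recovered sparse coefficients
```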

Developing Examination Management System: Senior Capstone Project, a Case Study

This paper presents the results of three senior capstone projects at the Department of Computer Engineering, Prince of Songkla University, Thailand. These projects focus on developing an examination management system for the Faculty of Engineering in order to manage both the examination room assignments and the proctor assignments for each room. The current version of the software is a web-based application. The developed software allows examination proctors to select their scheduled times online, while each subject is assigned to an available examination room according to its type and the room capacity. The developed system is evaluated by prospective users with real data, and the testers provide several suggestions for further improvement. Even though the features of the developed software are not groundbreaking, the development process can serve as a case study for a project-based teaching style. Furthermore, the process of developing this software reveals several issues in developing an educational support application.

A Systematic Mapping Study on Software Engineering Education

An inadequate software engineering curriculum is considered to be one of the most common software risks. A number of solutions for improving Software Engineering Education (SEE) have been reported in the literature, but there is a need to present these solutions collectively in one place. We have performed a mapping study to present a broad view of the literature published on improving the current state of SEE. Our aim is to give academics, practitioners, and researchers an international view of the current state of SEE. Our study identified 70 primary studies that met our selection criteria, which we further classified and categorized within a well-defined software engineering educational framework. We found that the most researched category within this framework is Innovative Teaching Methods, whereas the least researched is the Student Learning and Assessment category. Our future work is to conduct a Systematic Literature Review on SEE.

Application of Spreadsheet and Queuing Network Model to Capacity Optimization in Product Development

Modeling a manufacturing system enables one to identify the effects of key design parameters on system performance and, as a result, to make correct decisions. This paper proposes a manufacturing system modeling approach using a spreadsheet model based on queuing network theory, in which a static capacity planning model and a stochastic queuing model are integrated. The model was used to improve the utilization of an existing system in relation to product design. The model incorporates a few key parameters such as utilization, cycle time, throughput, and batch size. The study also showed that the developed model is valid enough for practical use: the maximum relative error is 10%, far below the 32% limit. Therefore, the model developed in this study is a valuable alternative for evaluating a manufacturing system.
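The abstract does not list the queuing equations used in the spreadsheet, so the sketch below computes utilization, work in process, cycle time, and throughput for a single station under basic M/M/1 assumptions with Little's law; the arrival and service rates are illustrative.

```python
def mm1_station(arrival_rate, service_rate):
    """Basic M/M/1 station metrics (a minimal sketch; the paper's model
    integrates queuing relations with a static capacity plan in a spreadsheet)."""
    rho = arrival_rate / service_rate            # utilization
    if rho >= 1.0:
        raise ValueError("unstable station: utilization >= 1")
    wip = rho / (1.0 - rho)                      # average number in system (L)
    cycle_time = wip / arrival_rate              # Little's law: W = L / lambda
    return {"utilization": rho, "wip": wip, "cycle_time": cycle_time,
            "throughput": arrival_rate}

# Example: 4 jobs/hour arriving at a station that can process 5 jobs/hour
print(mm1_station(arrival_rate=4.0, service_rate=5.0))
```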

Human Capital and Capability Approach in European Lifelong Learning Development: A Case Study of Macedonia in the Balkan

The paper discusses European lifelong learning policy in the context of European enlargement to the Balkans. European lifelong learning policy based on the Human Capital approach is examined in the country case of Macedonia. The paper argues that the Human Capital approach, which focuses on the instrumental and economic importance of learning for employability and economic growth, needs to be complemented by the Capability Approach to address the intrinsic and non-economic learning needs of ethnic minorities. The paper identifies two important dimensions, minority languages and civic education, that the Capability Approach may develop to guarantee everyone equal opportunities to benefit from European educational and lifelong learning development and to build an inclusive and socially just democracy in Macedonia.

Flexible Laser Reduced Graphene Oxide/MnO2 Electrode for Supercapacitor Applications

We have produced a high-performance, flexible graphene/manganese dioxide (G/MnO2) electrode coated on a flexible polyethylene terephthalate (PET) substrate. The graphene film is initially synthesized by drop-casting a graphene oxide (GO) solution on the PET substrate, followed by simultaneous reduction and patterning of the dried film using a carbon dioxide (CO2) laser beam with a power of 1.8 W. A potentiostatic anodic deposition method was used to deposit thin films of MnO2 with different loading masses of 10-50 and 100 μg.cm-2 on the pre-prepared graphene film. The electrodes were fully characterized in terms of structure, morphology, and electrochemical performance. A maximum specific capacitance of 973 F.g-1 was obtained when depositing 50 μg.cm-2 of MnO2 on the laser-reduced graphene oxide, rGO (the G/50MnO2 electrode), and over 92% of the initial capacitance was retained after 1000 cycles. The good electrochemical performance and long-term cycling stability make the proposed approach a promising candidate for supercapacitor applications.
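The abstract does not say which equation was used to extract the specific capacitance, so the sketch below applies the common galvanostatic charge-discharge formula C_sp = I*dt / (m*dV). The current, discharge time, and voltage window are invented numbers chosen only so the result lands near the reported value; they are not measurements from the paper.

```python
def specific_capacitance(current_a, discharge_time_s, mass_g, voltage_window_v):
    """Specific capacitance from a galvanostatic discharge curve,
    C_sp = I * dt / (m * dV)  (a common formula; the paper may use other data)."""
    return current_a * discharge_time_s / (mass_g * voltage_window_v)

# Illustrative numbers only (not taken from the paper); gives roughly 973 F/g
print(specific_capacitance(current_a=1e-3, discharge_time_s=48.65,
                           mass_g=50e-6, voltage_window_v=1.0))
```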

Computer Software Applicable in Rehabilitation, Cardiology and Molecular Biology

We have developed a computer program consisting of 6 subtests assessing children's hand dexterity, applicable in rehabilitation medicine. We carried out a normative study on a representative sample of 285 children aged 7 to 15 (mean age 11.3) and proposed clinical standards for three age groups (7-9, 9-11, 12-15 years). We have shown the statistical significance of the differences among the corresponding mean task completion times. We also found a strong correlation between task completion time and the age of the subjects, and we performed test-retest reliability checks on a sample of 84 children, which gave high Pearson coefficients for the dominant and non-dominant hand, in the ranges 0.74-0.97 and 0.62-0.93, respectively. A new MATLAB-based programming tool for the analysis of cardiological RR intervals and blood pressure descriptors has also been developed. For each data set, ten different parameters are extracted: 2 in the time domain, 4 in the frequency domain, and 4 from Poincaré plot analysis. In addition, twelve different baroreflex sensitivity parameters are calculated. All these data sets can be visualized in the time domain together with their power spectra and Poincaré plots. If available, the respiratory oscillation curves can also be plotted for comparison. Another application processes biological data obtained from BLAST analysis.
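The abstract does not name the individual RR-interval parameters, so the sketch below computes a few standard ones (SDNN and RMSSD in the time domain, SD1/SD2 from the Poincaré plot). The paper's tool is MATLAB-based, whereas this illustration is in Python, and the RR values are invented.

```python
import numpy as np

def hrv_descriptors(rr_ms):
    """Common RR-interval descriptors: SDNN, RMSSD (time domain) and
    Poincaré SD1/SD2. The paper's exact parameter set is not specified
    in the abstract; these are standard choices."""
    rr = np.asarray(rr_ms, dtype=float)
    diff = np.diff(rr)
    sdnn = np.std(rr, ddof=1)                            # overall RR variability
    rmssd = np.sqrt(np.mean(diff ** 2))                  # short-term variability
    sd1 = np.sqrt(np.var(diff, ddof=1) / 2.0)            # Poincaré ellipse width
    sd2 = np.sqrt(2.0 * np.var(rr, ddof=1) - sd1 ** 2)   # Poincaré ellipse length
    return {"SDNN": sdnn, "RMSSD": rmssd, "SD1": sd1, "SD2": sd2}

rr = [812, 845, 790, 860, 825, 801, 870, 835]            # illustrative RR intervals, ms
print({k: round(v, 1) for k, v in hrv_descriptors(rr).items()})
```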

Noise Analysis of Single-Ended Input Differential Amplifier using Stochastic Differential Equation

In this paper, we analyze the effect of noise in a single-ended input differential amplifier operating at high frequencies. Both extrinsic and intrinsic noise are analyzed using a time-domain method employing techniques from stochastic calculus. Stochastic differential equations are used to obtain the autocorrelation function of the output noise voltage and other solution statistics such as the mean and variance. The analysis leads to important design implications and suggests changes in the device parameters for improved noise characteristics of the differential amplifier.
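As a numerical counterpart to this kind of analysis, the sketch below simulates a simple first-order stochastic differential equation dV = -(V/tau) dt + sigma dW with the Euler-Maruyama scheme and compares the sample variance with the known stationary value sigma^2 * tau / 2. It is a generic stand-in, not the amplifier equations derived in the paper, and all parameter values are illustrative.

```python
import numpy as np

def euler_maruyama_ou(tau, sigma, dt=1e-9, n_steps=20000, n_paths=2000, seed=0):
    """Euler-Maruyama simulation of dV = -(V/tau) dt + sigma dW, a simplified
    first-order stand-in for an output noise voltage (not the amplifier's
    actual equations, which the paper derives from the circuit)."""
    rng = np.random.default_rng(seed)
    v = np.zeros(n_paths)
    for _ in range(n_steps):
        dw = rng.standard_normal(n_paths) * np.sqrt(dt)   # Wiener increments
        v += -(v / tau) * dt + sigma * dw
    return v

v_end = euler_maruyama_ou(tau=5e-9, sigma=1e3)
# Compare the simulated variance with the stationary OU variance sigma^2 * tau / 2
print(np.var(v_end), (1e3) ** 2 * 5e-9 / 2)
```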

Efficient Large Numbers Karatsuba-Ofman Multiplier Designs for Embedded Systems

Long-number multiplications (n ≥ 128 bits) are a primitive in most cryptosystems. They can be performed more efficiently using the Karatsuba-Ofman technique. This algorithm is easy to parallelize on workstation networks and on distributed memory, and it is known as the practical method of choice. Multiplying long numbers using the Karatsuba-Ofman algorithm is fast but highly recursive. In this paper, we propose different designs for implementing a Karatsuba-Ofman multiplier. A mixture of sequential and combinational system design techniques involving pipelining is applied to our proposed designs, so that multiplying large numbers can be adapted flexibly to time, area, and power criteria. In computation- and area-constrained embedded systems such as smart cards and mobile phones, multiplication of finite field elements can thus be achieved more efficiently. The proposed designs are compared with other existing techniques, and mathematical models (Area(n), Delay(n)) of our proposed designs are elaborated and evaluated on different FPGA devices.
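For reference, the recursion that such hardware designs implement is the classic three-half-size-product Karatsuba-Ofman scheme, sketched below in software form. The split point and the fallback threshold are illustrative choices and do not reflect the paper's hardware decomposition.

```python
def karatsuba(x, y, threshold=64):
    """Recursive Karatsuba-Ofman multiplication of non-negative integers.
    Below `threshold` bits the schoolbook (built-in) product is used."""
    if x.bit_length() <= threshold or y.bit_length() <= threshold:
        return x * y
    m = max(x.bit_length(), y.bit_length()) // 2
    x_hi, x_lo = x >> m, x & ((1 << m) - 1)
    y_hi, y_lo = y >> m, y & ((1 << m) - 1)
    hi  = karatsuba(x_hi, y_hi, threshold)
    lo  = karatsuba(x_lo, y_lo, threshold)
    mid = karatsuba(x_hi + x_lo, y_hi + y_lo, threshold) - hi - lo
    return (hi << (2 * m)) + (mid << m) + lo

a, b = 2**521 - 1, 2**255 - 19
assert karatsuba(a, b) == a * b   # three half-size products instead of four
```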

Burstiness Reduction of a Doubly Stochastic AR-Modeled Uniform Activity VBR Video

Stochastic modeling of network traffic is an area of significant research activity for current and future broadband communication networks. Multimedia traffic is statistically characterized by a bursty, variable bit rate (VBR) profile. In this paper, we develop an improved model for uniform-activity-level video sources in ATM using a doubly stochastic autoregressive model driven by an underlying spatial point process. We then examine a number of burstiness metrics, namely the peak-to-average ratio (PAR), the temporal autocovariance function (ACF), and the traffic measurement histogram, and find that the first of these, the PAR, is the most suitable for capturing the burstiness of single-scene video traffic. In the last phase of this work, we analyse the statistical multiplexing of several constant-scene video sources. As expected, this proves advantageous in reducing the burstiness of the traffic, as long as the sources are statistically independent. We observed that the burstiness diminishes rapidly, with the largest gain occurring when only around 5 sources are multiplexed. The novel model used in this paper for characterizing uniform-activity video was thus found to be accurate.
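To illustrate the burstiness metrics, the sketch below generates a simple AR(1) bit-rate trace, a simplified stand-in for the paper's doubly stochastic AR model driven by a spatial point process, and computes the peak-to-average ratio and an autocovariance sample. It also shows how multiplexing several independent traces lowers the PAR of the aggregate. All parameter values are illustrative.

```python
import numpy as np

def ar1_trace(n, phi=0.9, mean_rate=1.0, sigma=0.3, seed=0):
    """AR(1) bit-rate trace (a simplified stand-in for a doubly stochastic
    AR source model)."""
    rng = np.random.default_rng(seed)
    x = np.empty(n)
    x[0] = mean_rate
    for k in range(1, n):
        x[k] = mean_rate + phi * (x[k - 1] - mean_rate) + sigma * rng.standard_normal()
    return np.maximum(x, 0.0)                    # bit rates cannot be negative

def peak_to_average(x):
    return x.max() / x.mean()

def autocovariance(x, lag):
    xc = x - x.mean()
    return np.mean(xc[:-lag] * xc[lag:]) if lag else np.mean(xc * xc)

# Multiplexing N independent sources lowers the PAR of the aggregate stream
single = ar1_trace(10_000)
aggregate = sum(ar1_trace(10_000, seed=s) for s in range(5))
print(peak_to_average(single), peak_to_average(aggregate), autocovariance(single, 10))
```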

Lightning Protection Systems Design for Substations by Using Masts and Matlab

In this work, an economic criterion is used as the objective function of a computer program, written in Matlab, for designing lightning protection systems for substations using masts. Masts are placed at desired locations, and the program then returns the mast heights whose sum is smallest, i.e., the heights that satisfy the economic criterion. The program helps engineers quickly design a lightning protection system for a substation. The methodology and the limiting conditions of the program, as well as an example of the program output, are described in this paper.
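A minimal sketch of the optimization idea is given below: an exhaustive search over candidate mast heights that returns the combination with the smallest sum while keeping every equipment point protected. The coverage test is a deliberately simplistic placeholder; the paper's Matlab program uses a proper protective-zone model, which is not reproduced here.

```python
from itertools import product

def min_total_height(mast_xy, equipment_pts, candidate_heights, covers):
    """Exhaustive search for the height combination with the smallest sum
    such that every equipment point is protected. `covers(x, y, h, pt)` is a
    placeholder for the actual protective-zone test used in the paper."""
    best = None
    for heights in product(candidate_heights, repeat=len(mast_xy)):
        protected = all(
            any(covers(x, y, h, pt) for (x, y), h in zip(mast_xy, heights))
            for pt in equipment_pts)
        if protected and (best is None or sum(heights) < sum(best)):
            best = heights
    return best

# Toy coverage rule (purely illustrative): a mast of height h protects points
# within a ground radius equal to h.
covers = lambda x, y, h, pt: (pt[0] - x) ** 2 + (pt[1] - y) ** 2 <= h ** 2
print(min_total_height([(0, 0), (30, 0)], [(5, 0), (25, 0)], range(5, 31, 5), covers))
```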

Reliable Capacitated Facility Location Problem Considering Maximal Covering

This paper provides a framework that simultaneously incorporates reliability (as a reflection of disruptions in distribution systems) and partial covering theory (as a response to limited coverage radii and economic preferences) into the traditional capacitated facility location problem. As a result, we develop a bi-objective, discrete-scenario model for expected cost minimization and demand coverage maximization over a three-echelon supply chain network, allowing multiple capacity levels for the provider-side layers and imposing a gradual coverage function for the distribution centers (DCs). In addition to aggregating the objectives and solving the model with the LINGO software, a variant of the LP-metric method called the Min-Max approach is proposed, and different aspects of the corresponding model are explored.
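To show what a Min-Max (LP-metric) aggregation does with two objectives, the sketch below scores candidate solutions by their worst weighted, normalized deviation from the ideal point. The weights, the normalization by the ideal/anti-ideal range, and the framing of coverage as "uncovered demand" to be minimized are illustrative assumptions, not the paper's exact formulation.

```python
def min_max_scalarized(objectives, ideal, anti_ideal, weights):
    """Min-Max (LP-metric with p = infinity) scalarization: the worst weighted,
    normalized deviation from the ideal point (illustrative assumptions)."""
    deviations = [w * (f - f_star) / (f_nadir - f_star)
                  for f, f_star, f_nadir, w
                  in zip(objectives, ideal, anti_ideal, weights)]
    return max(deviations)

# Candidate A vs candidate B for (expected cost, uncovered demand)
ideal, anti_ideal, weights = (1000.0, 0.0), (2000.0, 500.0), (0.5, 0.5)
for name, objs in [("A", (1400.0, 100.0)), ("B", (1200.0, 300.0))]:
    print(name, round(min_max_scalarized(objs, ideal, anti_ideal, weights), 3))
# The candidate with the smaller value is preferred under this aggregation.
```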