Abstract: The purpose of our study was to compare the characteristics of spontaneous versus assisted re-epithelisation. To assess re-epithelisation of injured skin, we designed a burn wound model on Wistar rat skin. Our aim was to create standardised, easily reproducible and quantifiable skin lesions involving the entire epidermis and the superficial dermis. We then applied the above-mentioned therapeutic strategies to compare the regeneration of epidermis and dermis, as well as local and systemic parameter changes, under the different conditions. We enhanced the re-epithelisation process under the moist atmosphere of a polyurethane wound dressing modified with helium non-thermal plasma, and with the aid of direct cold-plasma treatment, respectively. We followed changes in systemic parameters (haematological and biochemical parameters) and local features (oxidative stress markers and skin histology) under the above-mentioned conditions. Re-epithelisation is only one part of the skin regeneration process, which recruits cellular components through epidermal-dermal interaction via signal molecules.
Abstract: This paper presents a method to estimate load profiles from multiple power-flow solutions computed for every minute of a 24-hour day. A method to calculate multiple solutions of the nonlinear power-flow problem is introduced. Power System Simulator for Engineering (PSS®E) and Python were used to solve the load power flow. The resulting power-flow solutions were used to estimate the load profile at each bus using Independent Component Analysis (ICA), without any knowledge of the parameters or the network topology of the system. The proposed algorithm is tested on the IEEE 69-bus test system, which represents the distribution level, and the ICA method is programmed in MATLAB R2012b. Simulation results and estimation errors are discussed in this paper.
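The core ICA step can be illustrated with a minimal sketch. The PSS®E power-flow stage that produces the bus measurements is not reproduced here; the two "true" load profiles, the mixing matrix and the use of scikit-learn's FastICA (rather than the MATLAB implementation the abstract mentions) are all illustrative assumptions.

```python
# Sketch: separating mixed bus measurements into independent load profiles
# with FastICA. The profiles and mixing matrix below are hypothetical stand-ins
# for the power-flow solutions described in the abstract.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 24, 1440)            # one sample per minute over 24 h

# Two assumed "true" load profiles: a residential evening peak and an
# industrial daytime plateau (not data from the paper).
residential = 1.0 + 0.8 * np.exp(-((t - 19) / 2.5) ** 2)
industrial = 1.0 + 0.6 * ((t > 8) & (t < 17))
S = np.column_stack([residential, industrial])

# Measurements at two buses are unknown linear mixtures of the profiles.
A = np.array([[0.7, 0.3], [0.4, 0.6]])
X = S @ A.T + 0.01 * rng.standard_normal((1440, 2))

ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)            # recovered profiles, up to scale/order
```

ICA recovers the profiles only up to permutation and scaling, which is why estimated components must be matched to loads afterwards.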
Abstract: Stochastic models of biological networks are well established in systems biology, where the computational treatment of such models is often focused on the solution of the so-called chemical master equation via stochastic simulation algorithms. In contrast, the development of storage-efficient model representations that are directly suitable for computer implementation has received significantly less attention. Instead, a model is usually described in terms of a stochastic process or a "higher-level paradigm" with a graphical representation, such as a stochastic Petri net. A serious problem then arises from the exponential growth of the model's state space, which is in fact a main reason for the popularity of stochastic simulation, since simulation suffers less from state space explosion than non-simulative numerical solution techniques. In this paper we present transition class models for the representation of biological network models, a compact mathematical formalism that circumvents state space explosion. Transition class models can also serve as an interface between different higher-level modeling paradigms, stochastic processes and the implementation coded in a programming language. Moreover, the compact model representation provides the opportunity to apply non-simulative solution techniques while preserving the possible use of stochastic simulation. Illustrative examples of transition class representations are given for an enzyme-catalyzed substrate conversion and a part of the bacteriophage λ lysis/lysogeny pathway.
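The compactness of a transition class representation can be sketched for the enzyme-catalyzed example: each class is just a propensity function and a state-change vector, so the state space is never enumerated, and the same structure can drive Gillespie's direct method. The rate constants and initial counts below are illustrative assumptions, not values from the paper.

```python
# Sketch: transition classes for E + S <-> ES -> E + P, simulated with
# Gillespie's direct method. Each class = (propensity, state-change vector);
# rate constants and initial populations are illustrative assumptions.
import random

# State x = (E, S, ES, P); one tuple per transition class.
classes = [
    (lambda x: 1.0 * x[0] * x[1], (-1, -1, +1, 0)),   # binding   E+S -> ES
    (lambda x: 0.5 * x[2],        (+1, +1, -1, 0)),   # unbinding ES  -> E+S
    (lambda x: 0.2 * x[2],        (+1, 0, -1, +1)),   # conversion ES -> E+P
]

def ssa(x0, t_end, seed=1):
    rng = random.Random(seed)
    x, t = list(x0), 0.0
    while t < t_end:
        rates = [a(x) for a, _ in classes]
        total = sum(rates)
        if total == 0:                       # absorbing state reached
            break
        t += rng.expovariate(total)          # time to next reaction
        r = rng.uniform(0, total)            # pick a class by its rate
        for rate, (_, change) in zip(rates, classes):
            if r < rate:
                x = [xi + d for xi, d in zip(x, change)]
                break
            r -= rate
    return x

final = ssa((10, 50, 0, 0), t_end=100.0)     # 10 enzymes, 50 substrates
```

Note that conservation relations (total enzyme E + ES, total substrate S + ES + P) hold automatically because they are encoded in the change vectors, not checked against an enumerated state space.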
Abstract: A new nonlinear PID controller and its stability analysis are presented in this paper. A nonlinear function is deduced from the similarities between the control effort and the electric-field effect of a capacitor. The conventional linear PID controller can be modified into a nonlinear one by means of this function. To analyze the stability of the nonlinear PID-controlled system, an idea of energy equivalence is adopted to avoid the conservativeness that usually arises from traditional theorems and criteria. The energy equivalence is naturally related to the concepts of passivity and T-passivity. As a result, an engineering guideline for the parameter design of the nonlinear PID controller is obtained. An inverted pendulum system is tested to verify the nonlinear PID control scheme.
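The general structure can be sketched as a conventional PID loop whose error first passes through a nonlinear shaping function. The capacitor-inspired function itself is not given in the abstract, so the tanh-based saturation below is a purely hypothetical stand-in, and the first-order plant and all gains are illustrative assumptions.

```python
# Sketch of a nonlinear PID loop: a standard discrete PID whose error is
# shaped by a nonlinear function before the P/I/D terms are formed.
# shape() is a hypothetical stand-in for the paper's capacitor-derived
# function; the plant and gains are illustrative assumptions.
import math

def shape(e, k=2.0):
    # Hypothetical smooth saturation: large errors are de-emphasised,
    # small errors pass through almost unchanged (slope 1 at e = 0).
    return math.tanh(k * e) / k

def simulate(setpoint=1.0, kp=4.0, ki=1.5, kd=0.2, dt=0.01, steps=2000):
    y, integ, prev = 0.0, 0.0, 0.0
    for _ in range(steps):
        e = shape(setpoint - y)          # nonlinear shaping of the error
        integ += e * dt
        deriv = (e - prev) / dt
        prev = e
        u = kp * e + ki * integ + kd * deriv
        y += dt * (-y + u)               # assumed first-order plant y' = -y + u
    return y

final = simulate()                        # settles near the setpoint
```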
Abstract: Controlled modification of nanotips to an appropriate sharpness is of paramount importance for developing novel materials and functional devices at nanometer resolution. Herein, we present a reliable and unique strategy of laser-irradiation-enhanced physicochemical etching to manufacture ultra-sharp tungsten tips with reproducible shape and dimensions as well as high yields (~80%). The corresponding morphological evolution of the tungsten tips and the laser-tip interaction mechanisms were systematically investigated and discussed using field-emission scanning electron microscopy (SEM) and a physical-optics statistical method at different fluences under 532 nm laser irradiation. This work paves the way for exploring more accessible metallic-tip applications with tunable apex diameter and aspect ratio and, furthermore, facilitates a potential sharpening-enhancement technique for other materials used in a variety of nanoscale devices.
Abstract: This paper introduces an intelligent system for monitoring vehicle speed using a single camera. The ability to track motion is extremely useful in many automation problems, and the solution to this problem opens up many future applications. One of the most common problems in daily life is the speed detection of vehicles on a highway. In this paper, a novel technique is developed to track multiple moving objects and estimate their speeds from a sequence of video frames. A field test was conducted to capture real-life data, and the processed results are presented. Multiple-object scenarios and noisy data are also considered. Implementing this system in real time is straightforward. The proposal can accurately evaluate the position and orientation of moving objects in real time. The transformation and calibration between the 2D image and the actual road are also addressed.
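The image-to-road calibration step can be sketched as a planar homography: tracked pixel positions are mapped to road-plane metres and then differenced between frames. The homography matrix, frame rate and the pixel track below are illustrative assumptions, not the paper's calibration.

```python
# Sketch: vehicle speed from tracked image positions. A homography H maps
# image pixels to road-plane metres; speed is the mean per-frame road
# displacement times the frame rate. H, fps, and the track are assumed.
import numpy as np

H = np.array([[0.05, 0.0,   -16.0],   # assumed image -> road homography
              [0.0,  0.12,  -10.0],
              [0.0,  0.001,   1.0]])

def to_road(pt):
    x, y = pt
    v = H @ np.array([x, y, 1.0])
    return v[:2] / v[2]               # projective normalisation

def speed_kmh(track, fps=25.0):
    pts = [to_road(p) for p in track]
    dists = [np.linalg.norm(b - a) for a, b in zip(pts, pts[1:])]
    return float(np.mean(dists)) * fps * 3.6   # m/frame -> km/h

# Hypothetical tracked centroids of one vehicle over 10 frames.
track = [(400 + 4 * k, 300 + 2 * k) for k in range(10)]
v = speed_kmh(track)
```

In practice H would be estimated from known road markings (e.g. lane-line spacing), which is what the calibration between the 2D image and the actual road refers to.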
Abstract: This research paper presents a framework for building a malware dataset. Many researchers spend considerable time cleaning a dataset of noise or transforming it into a format that can be used directly for testing. Therefore, this research proposes a framework to help researchers speed up the malware dataset cleaning processes, so that the data can later be used for testing. It is believed that an efficient malware dataset cleaning process can improve the quality of the data and thus help to improve the accuracy and efficiency of the subsequent analysis. Apart from that, an in-depth understanding of malware taxonomy is also important prior to and during the dataset cleaning processes. A new Trojan classification has been proposed to complement this framework. The experiment was conducted in a controlled lab environment using the VxHeavens dataset. The framework is built on the integration of static and dynamic analyses, an incident response method and knowledge discovery in databases (KDD) processes. It can be used as a baseline guideline for malware researchers building malware datasets.
Abstract: Many scientific and engineering problems require the effective solution of large systems of linear equations of the form Ax = b. LU decomposition offers a good choice for solving this problem. Our approach is to find the lower bound on the number of processing elements needed for this purpose. Here the so-called Omega calculus is used as a computational method for solving problems via their corresponding Diophantine relations. From the corresponding algorithm, a system of linear Diophantine equalities is formed using the domain of computation, which is given by the set of lattice points inside a polyhedron. The Mathematica program DiophantineGF.m is then run. This program calculates the generating function from which the number of solutions to the system of Diophantine equalities can be found, which in fact gives the lower bound on the number of processors needed for the corresponding algorithm. A mathematical explanation of the problem is also given. Keywords: generating function, lattice points in polyhedron, lower bound of processor elements, system of Diophantine equations, Ω calculus.
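The counting step can be illustrated on a toy case. The generating-function machinery of DiophantineGF.m is not reproduced here; the single equation x + y + z = n is an assumed stand-in for the LU-decomposition scheduling constraints, chosen because its generating function 1/(1-t)^3 has the known closed-form coefficient C(n+2, 2).

```python
# Sketch: the number of lattice points satisfying a Diophantine system
# (here the toy system x + y + z = n over nonnegative integers) equals the
# coefficient that a generating function such as 1/(1-t)^3 encodes.
from itertools import product
from math import comb

def count_solutions(n):
    # Brute-force enumeration of nonnegative integer solutions.
    return sum(1 for x, y, z in product(range(n + 1), repeat=3)
               if x + y + z == n)

n = 10
brute = count_solutions(n)     # direct lattice-point count
closed = comb(n + 2, 2)        # coefficient of t^n in 1/(1-t)^3
```

For the real scheduling polyhedra the brute-force count is infeasible, which is exactly why the paper extracts the count from the generating function instead.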
Abstract: Many issues affect the modeling and design of real-time databases. One of these issues is maintaining consistency between the actual state of a real-time object in the external environment and its images as reflected by all of its replicas distributed over multiple nodes. The need to improve scalability is another important issue. In this paper, we present a general framework for designing a replicated real-time database for small- to medium-scale systems while maintaining all timing constraints. To extend the idea to modeling a large-scale database, we present a general outline that improves scalability by applying an existing static segmentation algorithm to the whole database with the intent of lowering the degree of replication; it enables segments to have individual degrees of replication in order to avoid excessive resource usage, all of which together contributes to solving the scalability problem for distributed real-time database systems (DRTDBS).
Abstract: Emerging adulthood, between the ages of 18 and 25, is a new developmental stage extending from adolescence to young adulthood. According to Arnett [2004], identity exploration in emerging adulthood occurs in three basic fields: love, work and worldview. When the literature on identity is examined, it is seen that identity has been studied mostly with adolescents, and studies have concentrated on the relationship of identity with many demographic variables while neglecting important variables such as marital status, parental status and SES. Thus, the main aim of this study is to determine whether identity statuses differ by marital status, parental status and SES. A total of 700 emerging adults participated in this study; the mean age was 22.45 years [SD = 3.76]. The sample consisted of 347 females and 353 males, all of whom were college students. Responses to the Extended Version of the Objective Measure of Ego Identity Status [EOM-EIS-2] were used to classify students into one of four identity statuses. SPSS 15.00 was used to analyse the data, employing percentages, frequencies and χ² analysis. Viewed as a whole, the findings show that the most frequently observed identity status in the group is moratorium. Also, identity statuses differ by marital status, parental status and SES. The findings are discussed in the context of emerging adulthood.
Abstract: A method has been developed for preparing load models for power flow and stability studies. The load modeling (LOADMOD) computer software transforms data on load class mix, composition, and characteristics into the form required by commonly used power flow and transient stability simulation programs. Typical default data have been developed for load composition and characteristics. This paper describes the LOADMOD software, the dynamic and static load modeling techniques it uses, and the results of initial testing on the BAKHTAR power system.
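The kind of static characteristic such a tool emits can be sketched with the standard polynomial (ZIP) load model. LOADMOD's internal format and default coefficients are not given in the abstract; the values below are typical illustrative numbers only.

```python
# Sketch: a polynomial (ZIP) static load model of the kind load-modeling
# tools produce for power-flow programs. P(V) = P0 * (a*V^2 + b*V + c),
# mixing constant-impedance (a), constant-current (b) and constant-power
# (c) fractions. The coefficients are illustrative, not LOADMOD defaults.
def zip_load(v_pu, p0=100.0, a=0.4, b=0.3, c=0.3):
    assert abs(a + b + c - 1.0) < 1e-9   # fractions must sum to one
    return p0 * (a * v_pu**2 + b * v_pu + c)

nominal = zip_load(1.0)    # at nominal voltage the model returns P0 (MW)
sagged = zip_load(0.95)    # demand drops under a 5% voltage sag
```

The mix of a, b and c is what "load class mix and composition" determines: a feeder dominated by motors or by lighting yields very different coefficients.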
Abstract: This paper presents the cepstral and trispectral analysis of speech signals produced by men with normal hearing, men with defective hearing (deaf and profoundly deaf) and others affected by tracheotomy; the trispectral analysis is based on parametric (autoregressive, AR) methods using the fourth-order cumulant. These analyses are used to detect and compare the pitch and the formants of the corresponding voiced sounds (the vowels \a\, \i\ and \u\). The first results appear promising: after several experiments, it seems there is no deformation of the spectrum, as one might have supposed at the outset. However, these pathologies do influence two characteristics: defective hearing affects the formants, whereas tracheotomy affects the fundamental frequency (pitch).
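The cepstral pitch-detection step can be sketched on a synthetic vowel-like signal; the trispectral (fourth-order cumulant) analysis is not reproduced. The 200 Hz fundamental, 8 kHz sampling rate and harmonic structure are assumptions for illustration.

```python
# Sketch: cepstral pitch estimation. The real cepstrum of a harmonic-rich
# voiced sound shows a peak at the quefrency 1/f0, from which the pitch is
# read off. The synthetic signal parameters below are assumptions.
import numpy as np

fs, f0 = 8000, 200.0
t = np.arange(2048) / fs
# Harmonic-rich "voiced" signal: sum of the first 8 harmonics of f0.
x = sum(np.sin(2 * np.pi * f0 * (k + 1) * t) / (k + 1) for k in range(8))

# Real cepstrum: inverse FFT of the log magnitude spectrum.
spectrum = np.fft.rfft(x * np.hamming(len(x)))
cepstrum = np.fft.irfft(np.log(np.abs(spectrum) + 1e-12))

# Search for the pitch peak in the 80-400 Hz quefrency range.
lo, hi = int(fs / 400), int(fs / 80)
peak = lo + np.argmax(cepstrum[lo:hi])
pitch = fs / peak                         # estimated fundamental frequency
```

Formants, by contrast, live in the low-quefrency (spectral-envelope) part of the same cepstrum, which is why one analysis serves both measurements.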
Abstract: The full length mitochondrial small subunit ribosomal
(mt-rns) gene has been characterized for Ophiostoma novo-ulmi
subspecies americana. The gene was also characterized for
Ophiostoma ulmi and a group II intron was noted in the mt-rns gene
of O. ulmi. The insertion in the mt-rns gene is at position S952 and it
is a group IIB1 intron that encodes a double motif LAGLIDADG
homing endonuclease from an open reading frame located within a
loop of domain III. Secondary structure models for the mt-rns RNA
of O. novo-ulmi subsp. americana and O. ulmi were generated to
place the intron within the context of the ribosomal RNA. The in vivo
splicing of the O.ul-mS952 group II intron was confirmed with
reverse transcription-PCR. A survey of 182 strains of Dutch elm disease-causing agents showed that the mS952 intron was absent in
what is considered to be the more aggressive species O. novo-ulmi
but present in strains of the less aggressive O. ulmi. This observation
suggests that the O.ul-mS952 intron can be used as a PCR-based
molecular marker to discriminate between O. ulmi and O. novo-ulmi
subsp. americana.
Abstract: In this paper, a vision-based system is used to control an industrial 3P Cartesian robot. The vision system recognizes the target and controls the robot by obtaining images from the environment and processing them. In the first stage, images of the environment are converted to grayscale; objects and noise can then be distinguished and identified by thresholding. The objects are stored in different frames, and the main object is then recognized. This is used to steer the robot toward the target. A vision system can also be an appropriate tool for measuring the errors of a robot, and such an experimental test is conducted here for the 3P robot. Finally, the international standard ANSI/RIA R15.05-2 is used to evaluate the path-related characteristics of the robot. An experimental test is carried out to evaluate the performance of the proposed method.
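The grayscale-and-threshold stage can be reduced to a minimal sketch: binarize the frame, then take the centroid of the bright pixels as the target position fed to the robot controller. The synthetic image and threshold value are illustrative assumptions; a real pipeline would also filter noise blobs by size.

```python
# Sketch: locate a target in a grayscale frame by thresholding and taking
# the centroid of the segmented pixels. Image and threshold are assumed.
import numpy as np

def locate_target(gray, thresh=128):
    mask = gray > thresh                       # binary segmentation
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None                            # no target in view
    return float(xs.mean()), float(ys.mean())  # centroid (x, y) in pixels

# Synthetic 100x100 frame with one bright 10x10 "object".
img = np.zeros((100, 100), dtype=np.uint8)
img[60:70, 40:50] = 200
target = locate_target(img)                    # -> centroid of the object
```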
Abstract: The Risk Assessment Tool (RAT) is an expert system that assesses, monitors, and automatically suggests preliminary treatments based on the project plan. In this paper, we review current project time-management risk assessment tools for SME software development projects; analyze risk assessment parameters, conditions, and scenarios; and finally propose a risk assessment tool (RAT) model to assess, treat, and monitor risks. A prototype implementation is developed to validate the model.
Abstract: We present in this paper an acquisition and treatment system designed for a semi-analog gamma camera, taking as an example the SOPHY DS7 camera of Sopha Medical Vision (SMVi). The system consists of a nuclear medical Image Acquisition, Treatment and Display (IATD) chain ensuring the acquisition and treatment of the signals resulting from the gamma-camera detection head, and the construction of the scintigraphic image in real time. The chain is composed of an analog treatment board and a digital treatment board designed around a DSP [2]. We describe the designed system and the digital treatment algorithms, whose performance and flexibility we have improved. In this paper we present the architecture of a new version of our IATD chain, in which the treatment algorithms are implemented in a reprogrammable FPGA (Field Programmable Gate Array) circuit.
Abstract: This paper introduces a technique for simulating a single-server exponential queuing system. The technique, called the Q-Simulator, is a computer program which can simulate the effect of traffic intensity on all average system quantities, given the arrival and/or service rates. The Q-Simulator has three phases, namely the formula-based method, the uncontrolled simulation, and the controlled simulation. The Q-Simulator generates graphs (crystal solutions) for all results of the simulation or calculation and can be used to estimate desirable average quantities such as waiting times, queue lengths, etc.
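The two core phases can be sketched together: a discrete-event simulation of the single-server exponential (M/M/1) queue, checked against the formula-based result. The arrival and service rates below are illustrative; the Q-Simulator's graphing and controlled-simulation phases are not reproduced.

```python
# Sketch: M/M/1 queue simulation vs. the closed-form average waiting time.
# Rates lam (arrival) and mu (service) are illustrative assumptions.
import random

def simulate_mm1(lam, mu, n_customers=200_000, seed=7):
    rng = random.Random(seed)
    arrival = depart = total_wait = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(lam)       # Poisson arrival stream
        start = max(arrival, depart)          # wait if the server is busy
        total_wait += start - arrival
        depart = start + rng.expovariate(mu)  # exponential service time
    return total_wait / n_customers           # average wait in queue, Wq

lam, mu = 2.0, 3.0
wq_sim = simulate_mm1(lam, mu)
wq_formula = lam / (mu * (mu - lam))          # formula phase: Wq = λ/(μ(μ−λ))
```

Other average quantities follow the same pattern, e.g. the average queue length via Little's law, Lq = λ·Wq.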
Abstract: The dramatic effect of information technology on society is undeniable. In education, it is evident in the use of terms like active learning, blended learning, electronic learning and mobile (ubiquitous) learning. This study explores the perceptions of 54 learners at a higher education institution regarding the use of mobile devices in a third-year module. Using semi-structured interviews, it was found that mobile devices had a positive impact on learner motivation, engagement and enjoyment. They also improved the consistency of learning material and the convenience and flexibility (anywhere, anytime) of learning. User-interface limitations, bandwidth and cognitive overload, however, were of concern. The use of cloud-based resources like YouTube and Google Docs through mobile devices positively influenced learner perceptions, making them prosumers (both consumers and producers) of educational content.
Abstract: Analyses carried out on examples of detected defect echoes showed clearly that these detected forms can be described by a set of characteristic parameters, making it possible to discriminate between a planar defect and a volumetric defect. This work addresses a problem in ultrasonic NDT, namely the identification of defects. The problem, as well as the objective of this work, is divided into three parts: extraction of wavelet parameters from the ultrasonic echo of the detected defect; principal component analysis (PCA) to optimize the attribute vector; and finally the classification algorithm (SVM, Support Vector Machine), which discriminates between a planar defect and a volumetric defect. We conclude with a summary of the completed work and of the robustness of the various algorithms proposed in this study.
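The PCA-then-SVM stages can be sketched on synthetic data. The two Gaussian clusters below stand in for wavelet feature vectors of planar and volumetric defects; real echo features, the wavelet extraction and the paper's tuning are not reproduced.

```python
# Sketch of the classification pipeline: feature vectors -> PCA (attribute-
# vector optimization) -> SVM. The synthetic 10-dimensional "echo features"
# for the two defect classes are illustrative assumptions.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(0)
planar = rng.normal(0.0, 1.0, (100, 10))    # class 0: planar defects
volumetric = rng.normal(2.0, 1.0, (100, 10))  # class 1: volumetric defects
X = np.vstack([planar, volumetric])
y = np.array([0] * 100 + [1] * 100)

X_red = PCA(n_components=3).fit_transform(X)  # reduced attribute vector
clf = SVC(kernel="rbf").fit(X_red, y)
accuracy = clf.score(X_red, y)                # fit quality on training data
```

On real echoes the feature extraction dominates the result; PCA mainly removes correlated wavelet attributes so that the SVM margin is computed in a compact space.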
Abstract: This paper presents a probabilistic horizontal seismic hazard assessment of Naghan, Iran. It provides probabilistic estimates of the Peak Ground Horizontal Acceleration (PGHA) for return periods of 475, 950 and 2475 years. The output of the probabilistic seismic hazard analysis is based on peak ground acceleration (PGA), which is the most common criterion in building design. A catalogue of seismic events including both historical and instrumental events was compiled, covering the period from 840 to 2009. The seismic sources that affect the hazard in Naghan were identified within a radius of 200 km, and the recurrence relationships of these sources were generated with the Kijko and Sellevoll method. Finally, the Peak Ground Horizontal Acceleration (PGHA) was computed with the SEISRISK III software to indicate the earthquake hazard of Naghan at the different hazard levels.