Abstract: Safety instrumented systems (SISs) are becoming
increasingly complex and the proportion of programmable electronic
parts is growing. The IEC 61508 global standard was established to
ensure the functional safety of SISs, but it was expressed in highly
macroscopic terms. This study introduces an evaluation process for
hardware safety integrity levels through failure modes, effects, and
diagnostic analysis (FMEDA). FMEDA is widely used to evaluate
safety levels, and it provides the failure-rate and failure-mode-distribution
information necessary to calculate a diagnostic coverage
factor for a given component. In our evaluation process, the
components of the SIS subsystem are first defined in terms of failure
modes and effects. Then, the failure rate and failure mechanism
distribution are assigned to each component. The safety mode and
detectability of each failure mode are determined for each component.
Finally, the hardware safety integrity level is evaluated based on the
calculated results.
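The hardware safety integrity evaluation described above rests on failure-rate categories produced by FMEDA. As an illustrative sketch (not the paper's exact procedure, and with made-up failure rates), the diagnostic coverage and safe failure fraction defined in IEC 61508 can be computed as:

```python
# Illustrative sketch of the IEC 61508 quantities derived from FMEDA
# failure-rate categories. lambda_sd/su: safe detected/undetected failures,
# lambda_dd/du: dangerous detected/undetected failures (per hour).

def diagnostic_coverage(lambda_dd: float, lambda_du: float) -> float:
    """DC = share of dangerous failures detected by diagnostics."""
    return lambda_dd / (lambda_dd + lambda_du)

def safe_failure_fraction(lambda_sd, lambda_su, lambda_dd, lambda_du):
    """SFF = share of all failures that are safe or dangerous-but-detected."""
    total = lambda_sd + lambda_su + lambda_dd + lambda_du
    return (lambda_sd + lambda_su + lambda_dd) / total

# Hypothetical failure rates for one component:
dc = diagnostic_coverage(8e-7, 2e-7)                  # 0.8
sff = safe_failure_fraction(5e-7, 5e-7, 8e-7, 2e-7)   # 0.9
```

The SFF, together with the hardware fault tolerance of the subsystem, then determines the achievable safety integrity level in the standard's architectural-constraint tables.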
Abstract: With the emergence of new applications such as robot
control in image processing, artificial vision for visual servoing is
a rapidly growing discipline, and human-machine interaction plays a
significant role in controlling the robot. This paper presents a new
algorithm for visual servoing, based on spatio-temporal volumes,
that aims to control robots. In this algorithm, after applying the
necessary pre-processing to the video frames, a spatio-temporal
volume is constructed for each gesture and a feature vector is
extracted. These volumes are then analyzed for matching in two
consecutive stages. For hand gesture recognition and classification
we tested different classifiers, including k-nearest neighbor,
learning vector quantization, and back-propagation neural networks.
We tested the proposed algorithm on the collected data set, and the
results showed a correct gesture recognition rate of 99.58 percent.
We also tested the algorithm on noisy images, where it achieved a
correct recognition rate of 97.92 percent.
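One of the classifiers the abstract evaluates is k-nearest neighbor over the extracted feature vectors. A minimal sketch of that classification step, with toy two-dimensional feature vectors standing in for the real spatio-temporal features, could look like:

```python
# Minimal k-NN sketch (hypothetical data): each gesture's spatio-temporal
# volume is reduced to a feature vector and labeled by majority vote
# among its k nearest training neighbors.
from collections import Counter
import math

def knn_classify(train, query, k=3):
    """train: list of (feature_vector, label); returns the predicted label."""
    nearest = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Toy feature vectors for two hand gestures:
train = [((0.1, 0.9), "wave"), ((0.2, 0.8), "wave"),
         ((0.9, 0.1), "fist"), ((0.8, 0.2), "fist")]
print(knn_classify(train, (0.15, 0.85)))  # -> wave
```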
Abstract: In this paper, five options for recovering Iran's flared
gas have been compared via an MCDM method. To develop the model, the
weighting factor of each indicator is obtained with an AHP method
using the Expert Choice software. Several cases were considered in
this analysis. They are defined so that one criterion is always kept
in first position, while the priorities of the other criteria are
set by ordinal information defining the mutual relations of the
criteria and their respective indicators. The results show that,
among these cases, CHP usage obtains priority when the availability
indicator is highly weighted, pipeline usage obtains priority when
the environmental indicator is highly weighted, and injection
obtains priority when the economic indicator is highly weighted;
injection is also the priority when the weighting factors of all the
criteria are the same.
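The AHP weighting step mentioned above derives criterion weights from a pairwise comparison matrix. As a hedged sketch (the judgments below are hypothetical, not the paper's data), the common geometric-mean approximation of the priority vector is:

```python
# AHP priority-weight sketch (hypothetical pairwise judgments): each
# criterion's weight is the geometric mean of its comparison row,
# normalized so the weights sum to 1.
import math

def ahp_weights(matrix):
    gm = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gm)
    return [g / total for g in gm]

# Pairwise comparisons for three criteria
# (economic vs. environmental vs. availability, Saaty 1-9 scale):
M = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 1 / 2, 1]]
w = ahp_weights(M)  # economic dominates under these judgments
```

Expert Choice computes the exact principal-eigenvector weights; the geometric-mean form above is a standard close approximation.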
Abstract: Statistical process control (SPC) is one of the most powerful tools developed to assist in the effective control of quality; it involves collecting, organizing, and interpreting data during production. This article aims to show how, by using SPC, industries can control and continuously improve product quality through monitoring of production, detecting deviations in the parameters representing the process and thereby reducing the amount of off-specification product and the costs of production. The study also conducted a technological forecast in order to characterize the research being done on SPC. The survey was conducted in the Espacenet and WIPO databases and at the National Institute of Industrial Property (INPI). The United States is among the largest depositors, together with deposits via the PCT, and the classification section presented in greatest abundance was F.
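The monitoring the abstract describes is typically done with control charts. As an illustrative sketch (the data below are made up), an individuals chart flags points outside limits placed three standard deviations from the baseline mean:

```python
# SPC sketch: control limits at mean +/- 3 standard deviations of an
# in-control baseline; later points outside the limits signal a process
# deviation. Data are hypothetical.

def control_limits(samples):
    n = len(samples)
    mean = sum(samples) / n
    sd = (sum((x - mean) ** 2 for x in samples) / (n - 1)) ** 0.5
    return mean - 3 * sd, mean + 3 * sd

def out_of_control(samples, new_points):
    lcl, ucl = control_limits(samples)
    return [x for x in new_points if x < lcl or x > ucl]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
print(out_of_control(baseline, [10.0, 10.1, 11.5]))  # [11.5] is flagged
```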
Abstract: Several works on facial recognition have dealt with methods that identify isolated characteristics of the face or with templates that encompass several regions of it. In this paper a new technique is introduced which approaches the problem holistically, dispensing with the need to identify geometrical characteristics or regions of the face. The characterization of a face is achieved by randomly sampling selected attributes of the pixels of its image. From this information we construct a data set corresponding to the values of low frequencies, gradient, entropy, and several other characteristics of the image pixels, generating a set of "p" variables. The multivariate data set is approximated with different polynomials minimizing the fitting error in the minimax sense (L∞ norm). The use of a genetic algorithm (GA) makes it possible to circumvent the problem of dimensionality inherent in higher-degree polynomial approximations. The GA yields the degree and the values of the coefficients of the polynomials approximating the image of a face. The system is trained by finding, through a resampling process, a family of characteristic polynomials in several variables (pixel characteristics) for each face (say Fi) in the database. A face (say F) is recognized by finding its characteristic polynomials and applying an AdaBoost classifier that compares F's polynomials to each of the Fi's polynomials. The winner is the polynomial family closest to F's, corresponding to the target face in the database.
Abstract: Studying the effect of laser scanning speed on material
efficiency in Ti6Al4V applications is very important because
unspent powder is not reusable, owing to high-temperature oxygen
pick-up and contamination. This work carried out an extensive study
of the effect of scanning speed on material efficiency by varying
the speed between 0.01 and 0.1 m/s. The samples were wire-brushed
and cleaned with acetone after each deposition to remove un-melted
particles from the surface of the deposit. The substrate was weighed
before and after deposition. A formula was developed to calculate
the material efficiency, and the scanning speed was compared with
the powder efficiency obtained. The results are presented and
discussed. The study revealed that an optimum scanning speed exists
for this study at 0.01 m/s, above and below which the powder
efficiency will drop.
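The abstract mentions a material-efficiency formula without stating it. A plausible sketch, under the explicit assumption that efficiency is the mass gained by the substrate divided by the mass of powder delivered during the run (the authors' exact formula may differ), is:

```python
# Hedged sketch of a powder (material) efficiency calculation; the exact
# formula used by the authors is not given in the abstract. Assumed form:
#   efficiency = mass gained by the substrate / mass of powder delivered.

def material_efficiency(mass_before_g, mass_after_g,
                        powder_feed_rate_g_per_min, deposition_time_min):
    deposited = mass_after_g - mass_before_g          # from before/after weighing
    delivered = powder_feed_rate_g_per_min * deposition_time_min
    return deposited / delivered

# Hypothetical run: 4.5 g deposited out of 10 g of powder fed.
eff = material_efficiency(100.0, 104.5, 2.0, 5.0)     # 0.45
```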
Abstract: This work represents the first review paper to explore the relationship between perfectionistic personality and borderline personality organization. The developmental origins, identity diffusion, interpersonal difficulties, and defense mechanisms that are common to both borderline personality and the interpersonal components of perfectionism are explored, and existing research on perfectionism and borderline personality is reviewed. The importance of the link between perfectionism and borderline features is discussed in terms of its contribution to the conceptual understanding of personality pathology as well as to applied clinical practices.
Abstract: Eukaryotic protein-coding genes are interrupted by spliceosomal introns, which are removed from the RNA transcripts before translation into a protein. The exon-intron structures of different eukaryotic species are quite different from each other, and the evolution of such structures raises many questions. We try to address some of these questions using statistical analysis of whole genomes. We go through all the protein-coding genes in a genome and study correlations between the net length of all the exons in a gene, the number of exons, and the average length of an exon. We also take average values of these features for each chromosome and study correlations between those averages on the chromosomal level. Our data show universal features of exon-intron structures common to animals, plants, and protists (specifically, Arabidopsis thaliana, Caenorhabditis elegans, Drosophila melanogaster, Cryptococcus neoformans, Homo sapiens, Mus musculus, Oryza sativa, and Plasmodium falciparum). We have verified a linear correlation between the number of exons in a gene and the length of the protein encoded by the gene: the protein length increases in proportion to the number of exons. On the other hand, the average length of an exon always decreases with the number of exons. Finally, chromosome clustering based on average chromosome properties and on the parameters of a linear regression between the number of exons in a gene and the net length of those exons demonstrates that these average chromosome properties are genome-specific features.
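The chromosome-level analysis above relies on fitting a line between exon count and net exon length per gene. A minimal sketch of that regression step, with toy numbers standing in for real genome annotations, is:

```python
# Ordinary least-squares sketch of the regression the abstract describes:
# net exon length (proportional to protein length) vs. number of exons
# per gene. The data below are toy values, not genome annotations.

def linear_fit(xs, ys):
    """Returns (slope, intercept) of the least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
            / sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

exon_counts = [2, 4, 6, 8, 10]                 # exons per gene (toy)
net_exon_len = [900, 1700, 2500, 3300, 4100]   # summed exon length, bp (toy)
slope, intercept = linear_fit(exon_counts, net_exon_len)
# Here slope = 400 bp per additional exon, intercept = 100 bp.
```

In the paper, the per-chromosome slope and intercept of exactly this kind of fit serve as the clustering features.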
Abstract: Webcam systems now function as the new privileged
vantage points from which to view the city. This transformation of
CCTV technology from surveillance to promotional tool is significant
because its 'scopic regime' presents, back to the public, a new
virtual 'site' that sits alongside its real-time counterpart.
Significantly, this raw 'image' data can, in fact, be co-opted and
processed so as to disrupt its original purpose. This paper will
demonstrate this disruptive capacity through an architectural
project. It will reveal how the adaptation of the webcam image
offers a technical springboard with which to initiate alternative
urban form-making decisions and subvert the disciplinary reliance on
the 'flat' orthographic plan. In so doing, the paper will show how
this 'digital material' exceeds the imagistic function of the image,
shifting it from being a vehicle of signification to a site of
affect.
Abstract: Coronary artery bypass grafts (CABG) are widely
studied with respect to the hemodynamic conditions that play an
important role in the presence of restenosis. However, papers
concerned with the constitutive modeling of CABG are lacking in the
literature. The purpose of this study is to find a constitutive
model for CABG tissue. A sample of a CABG obtained during an autopsy
underwent an inflation–extension test. Displacements were recorded
by CCD cameras and subsequently evaluated by digital image
correlation. Pressure–radius and axial force–elongation data were
used to fit the material model. The tissue was modeled as a
one-layered composite reinforced by two families of helical fibers.
The material is assumed to be locally orthotropic, nonlinear,
incompressible, and hyperelastic. Material parameters are estimated
for two strain energy functions (SEFs). The first is the classical
exponential. The second SEF is logarithmic, which allows an
interpretation in terms of limiting (finite) strain extensibility.
The presented material parameters are estimated by optimization
based on the radial and axial equilibrium equations of a
thick-walled tube. Both material models fit the experimental data
successfully. The exponential model fits the relationship between
axial force and axial strain significantly better than the
logarithmic one.
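A widely used exponential SEF for soft tissue reinforced by two helical fiber families, of the kind described above, is the Holzapfel-Gasser-Ogden form; whether the authors use exactly this form is not stated in the abstract, so it is shown here only as a representative example:

```latex
% Isotropic neo-Hookean matrix plus two fiber families (HGO-type SEF):
\psi = \frac{\mu}{2}\,(I_1 - 3)
     + \frac{k_1}{2 k_2} \sum_{i=4,6}
       \left[ e^{\,k_2 (I_i - 1)^2} - 1 \right]
```

Here $I_1$ is the first invariant of the right Cauchy-Green tensor, $I_4$ and $I_6$ are the squared stretches along the two fiber directions, $\mu$ is the matrix stiffness, and $k_1$, $k_2$ control the exponential fiber stiffening.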
Abstract: Statistics indicate that more than 1000 phishing attacks are launched every month. With 57 million people hit by the fraud so far in America alone, how do we combat phishing? This publication aims to discuss strategies in the war against phishing. The study examines and critiques the approaches adopted at various levels to counter the crescendo of phishing attacks, along with the new techniques being adopted for the same. An analysis of the measures taken by various popular mail servers and browsers is carried out in this study. The work intends to increase the understanding and awareness of internet users across the globe, and it also discusses plausible countermeasures at the users' as well as the developers' end. This conceptual paper will contribute to future research on similar topics.
Abstract: Main Memory Database systems (MMDB) store their
data in main physical memory and provide very high-speed access.
Conventional database systems are optimized for the particular
characteristics of disk storage mechanisms. Memory resident
systems, on the other hand, use different optimizations to structure
and organize data, as well as to make it reliable.
This paper provides a brief overview of MMDBs and of one
memory-resident system named FastDB, and compares the processing
time of this system with that of a typical disk-resident database,
based on the results of implementing a TPC benchmark environment on
both.
Abstract: The purpose of this study was to understand the main
sources of copper (Cu) accumulation in target organs of tilapia
(Oreochromis mossambicus) and to investigate how the organism
mediates the process of Cu accumulation under prolonged exposure.
By measuring both dietary and waterborne Cu accumulation and total
concentrations in tilapia with a biokinetic modeling approach, we
were able to clarify the biokinetic coping mechanisms for long-term
Cu accumulation. This study showed that water and food are both
major sources of Cu for the muscle and liver of tilapia, implying
that controlling the Cu concentration in these two routes will be
correlated with the Cu bioavailability for tilapia. We found that
the exposure duration and the level of waterborne Cu drove the Cu
accumulation in tilapia. The abilities for Cu biouptake and
depuration in the organs of tilapia were actively mediated under
prolonged exposure conditions. Generally, the uptake rate, the
depuration rate, and the net bioaccumulation ability in all selected
organs decreased with increasing levels of waterborne Cu and
extension of the exposure duration. Muscle tissues accounted for
over 50% of the total accumulated Cu and played a key role in
buffering the Cu burden in the initial period of exposure, while the
liver played a more important role in the storage of Cu as exposure
was extended. We concluded that assuming constant biokinetic rates
could lead to incorrect predictions, overestimating the long-term Cu
accumulation in ecotoxicological risk assessments.
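The biokinetic modeling mentioned above is conventionally a first-order balance of uptake from water and food against depuration. As a hedged sketch (standard model form; all parameter values below are hypothetical, not the study's estimates):

```python
# First-order biokinetic accumulation sketch (hypothetical parameters):
#   dC/dt = ku*Cw + AE*IR*Cf - ke*C
# where C is the tissue Cu concentration, ku the waterborne uptake rate,
# Cw the water Cu level, AE the assimilation efficiency, IR the ingestion
# rate, Cf the food Cu level, and ke the depuration rate. Integrated with
# a simple Euler scheme.

def simulate_accumulation(ku, cw, ae, ir, cf, ke, days, dt=0.1, c0=0.0):
    c = c0
    for _ in range(int(days / dt)):
        c += (ku * cw + ae * ir * cf - ke * c) * dt
    return c

# Hypothetical values: ku (L/g/d), cw (ug/L), ae (-), ir (g/g/d),
# cf (ug/g), ke (1/d).
c28 = simulate_accumulation(ku=0.05, cw=10.0, ae=0.3, ir=0.02, cf=50.0,
                            ke=0.02, days=28)
```

The study's point is that treating ku and ke as constants (as above) overpredicts long-term accumulation, since the fish down-regulate both rates under prolonged exposure.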
Abstract: Dust storms are among the most costly and destructive
events in many desert regions. They can cause massive damage to both
natural environments and human lives. This paper presents a
preliminary study of dust storms as a major natural hazard in arid
and semi-arid regions. As a case study, dust storm events that
occurred in the city of Zabol, located in the Sistan region of Iran,
were analyzed to diagnose and predict dust storms. The
identification and prediction of dust storm events could have a
significant impact on damage reduction. Present models for this
purpose are complicated and not appropriate for the many areas with
data-poor environments. The present study explores the Gamma test
for identifying the inputs of an ANN model for dust storm
prediction. The results indicate that more effort must be devoted to
identifying dust storms and discriminating between the various dust
storm types.
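The Gamma test used above for input selection estimates, directly from data, the variance of the noise on an output given a candidate set of inputs. A minimal sketch of the standard procedure (toy one-dimensional data; a real application would compare candidate input subsets):

```python
# Gamma test sketch: delta(k) is the mean squared distance from each point
# to its k-th nearest neighbor in input space, gamma(k) half the mean
# squared difference of the corresponding outputs; the intercept of the
# regression of gamma on delta estimates the output noise variance.
import math

def gamma_test(X, y, kmax=3):
    n = len(X)
    deltas, gammas = [], []
    for k in range(1, kmax + 1):
        d_sum = g_sum = 0.0
        for i in range(n):
            order = sorted(range(n), key=lambda j: math.dist(X[i], X[j]))
            nk = order[k]                      # order[0] is the point itself
            d_sum += math.dist(X[i], X[nk]) ** 2
            g_sum += (y[i] - y[nk]) ** 2 / 2.0
        deltas.append(d_sum / n)
        gammas.append(g_sum / n)
    md, mg = sum(deltas) / kmax, sum(gammas) / kmax
    slope = sum((d - md) * (g - mg) for d, g in zip(deltas, gammas)) \
            / sum((d - md) ** 2 for d in deltas)
    return mg - slope * md                     # the Gamma statistic

# Noise-free smooth toy data: the Gamma statistic should be near zero.
X = [(i / 20.0,) for i in range(21)]
y = [p[0] ** 2 for p in X]
gamma = gamma_test(X, y)
```

A low Gamma statistic for a candidate input set suggests those inputs suffice for a smooth model such as an ANN; inputs are selected by comparing the statistic across subsets.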
Abstract: This study deals with the modelling of gas flow during heliox therapy. A special model has been developed to study the effect of helium on the gas flow in the airways during spontaneous breathing. The lower density of helium compared with air decreases the Reynolds number, which improves the flow during spontaneous breathing. In cases where the flow becomes turbulent while the patient inspires air, the flow is still laminar when the patient inspires heliox. The use of heliox decreases the work of breathing and improves ventilation, and in some cases it allows intubation of the patient to be avoided.
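The Reynolds-number argument above can be sketched with representative gas properties (the airway diameter and velocity below are illustrative, not values from the paper):

```python
# Reynolds-number comparison underlying heliox therapy: Re = rho*v*d/mu.
# In pipe flow, Re below roughly 2300 is typically laminar. Gas
# properties are representative; airway geometry is illustrative.

def reynolds(rho_kg_m3, v_m_s, d_m, mu_pa_s):
    return rho_kg_m3 * v_m_s * d_m / mu_pa_s

v, d = 3.0, 0.015                         # velocity and airway diameter
re_air = reynolds(1.20, v, d, 1.8e-5)     # air at room conditions -> 3000
re_heliox = reynolds(0.48, v, d, 2.0e-5)  # ~80/20 heliox -> 1080
print(re_air > 2300 > re_heliox)          # True: turbulent vs. laminar
```

Helium's viscosity is actually slightly higher than air's; the laminarizing effect comes almost entirely from the roughly fourfold drop in density.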
Abstract: More recent satellite projects/programs make
extensive use of real-time embedded systems. 16-bit processors
that meet the MIL-STD-1750 standard architecture have been used in
on-board systems, and most space applications have been written
in Ada. From a futuristic point of view, 32-bit/64-bit processors
are needed in the area of spacecraft computing, and therefore an
effort is desirable in the study and survey of 64-bit architectures
for space applications. This will also result in significant
technology development in terms of VLSI and software tools for Ada
(as the legacy code is in Ada).
There are several basic requirements for a special processor for
this purpose. They include Radiation Hardened (RadHard) devices,
very low power dissipation, compatibility with existing operational
systems, scalable architectures for higher computational needs,
reliability, higher memory and I/O bandwidth, predictability, a
real-time operating system, and manufacturability of such processors.
Further on, these may include selection of FPGA devices, selection
of EDA tool chains, design flow, partitioning of the design, pin
count, performance evaluation, timing analysis etc.
This project deals with a brief study of the 32- and 64-bit
processors readily available in the market and with designing and
fabricating a 64-bit RISC processor, named RISC MicroProcessor,
with the added functionalities of an extended double-precision
floating point unit and a 32-bit signal processing unit acting as
co-processors. In this paper, we emphasize the ease and importance
of using an open core (the OpenSparc T1 Verilog RTL) and open-source
EDA tools such as Icarus to develop FPGA-based prototypes quickly.
Commercial tools such as Xilinx ISE for synthesis are also used when
appropriate.
Abstract: In the context of computer numerical control (CNC) and computer-aided manufacturing (CAM), the capabilities of programming languages, such as symbolic and intuitive programming, program portability, and geometrical portfolio, have special importance. They save time, help avoid errors during part programming, and permit code reuse. Our updated literature review indicates that the current state of the art presents voids in parametric programming, program portability, and programming flexibility. In response to this situation, this article presents a compiler implementation for EGCL (Extended G-code Language), a new, enriched CNC programming language which allows the use of descriptive variable names, geometrical functions, and flow-control statements (if-then-else, while). Our compiler produces low-level, generic, elementary ISO-compliant G-code, thus allowing flexibility in the choice of the executing CNC machine as well as portability. Our results show that readable variable names and flow-control statements allow simplified and intuitive part programming and permit reuse of the programs. Future work includes allowing the programmer to define their own functions in EGCL, in contrast to the current status of having them as built-in library functions.
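The benefit the abstract claims, that named parameters and flow control expand into repetitive elementary G-code, can be illustrated with a small sketch. The EGCL grammar itself is not given in the abstract, so the generator below is purely hypothetical and only mimics the idea of compiling a high-level loop into ISO-style moves:

```python
# Hypothetical sketch (not the real EGCL compiler): a loop over named
# drilling positions expands into elementary ISO G-code, replacing the
# hand-written repetitive blocks a plain G-code program would need.

def emit_gcode(feed_rate, safe_z, points):
    """points: list of (x, y) drill positions; returns G-code lines."""
    lines = ["G90", "G21"]                   # absolute positioning, mm
    for x, y in points:
        lines.append(f"G0 X{x:.3f} Y{y:.3f} Z{safe_z:.3f}")  # rapid move
        lines.append(f"G1 Z-2.000 F{feed_rate}")             # plunge
        lines.append(f"G0 Z{safe_z:.3f}")                    # retract
    return lines

# Named parameters and a loop replace three near-identical code blocks:
program = emit_gcode(feed_rate=120, safe_z=5.0,
                     points=[(0, 0), (10, 0), (10, 10)])
```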
Abstract: In this paper, by using the continuation theorem of coincidence degree theory, M-matrix theory, and some suitable Lyapunov functions, sufficient conditions are obtained for the existence and global exponential stability of periodic solutions of recurrent neural networks with distributed delays and impulses on time scales. Without assuming the boundedness of the activation functions gj and hj, these results are less restrictive than those given in earlier references.
Abstract: In this article, while attempting to describe the
problem and its importance, transformational leadership is studied
in light of leadership theories. Issues such as the definition of
transformational leadership and its aspects are compared on the
basis of the ideas of various experts, and transformational
leadership is then examined in successful and unsuccessful
companies. Following the methodology, the research method,
hypotheses, population, and statistical sample are described, and
the research findings are analyzed using descriptive and inferential
statistical methods in the framework of analytical tables. Finally,
our conclusion is provided in view of the results of the statistical
tests. The final result shows that transformational leadership is
significantly higher in successful companies than in unsuccessful
ones.
Abstract: Effective evaluation of software development effort is an important aspect of successful project management. Based on a large database of 4106 developed projects, this study statistically examines the factors that influence development effort. The factors found to be significant for effort are project size, the average number of developers that worked on the project, the type of development, the development language, the development platform, and the use of rapid application development. Among these factors, project size is the most critical cost driver. Unsurprisingly, this study found that the use of CASE tools does not necessarily reduce development effort, which adds support to the claim that the effect of tool use is subtle. As many of the current estimation models are rarely or unsuccessfully used, this study proposes a parsimonious parametric model for the prediction of effort which is both simple and more accurate than previous models.
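Parsimonious parametric effort models of the kind the study proposes are commonly power laws in project size, effort = a * size^b, fitted by least squares in log space (the study's exact model form and coefficients are not given in the abstract, so the sketch and data below are hypothetical):

```python
# Power-law effort model sketch: effort = a * size^b, fitted by ordinary
# least squares on log-transformed data. Project data are toy values.
import math

def fit_power_law(sizes, efforts):
    lx = [math.log(s) for s in sizes]
    ly = [math.log(e) for e in efforts]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    b = sum((x - mx) * (y - my) for x, y in zip(lx, ly)) \
        / sum((x - mx) ** 2 for x in lx)
    a = math.exp(my - b * mx)
    return a, b

# Toy data: size in function points, effort in person-hours.
sizes = [100, 200, 400, 800]
efforts = [500, 1100, 2400, 5300]
a, b = fit_power_law(sizes, efforts)
predict = lambda size: a * size ** b   # b > 1 here: diseconomy of scale
```

An exponent b above 1 reflects the diseconomy of scale consistent with project size being the dominant cost driver.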