Abstract: Data Envelopment Analysis (DEA) is one of the most
widely used techniques for evaluating the relative efficiency of a set
of homogeneous decision making units. Traditionally, it assumes that
input and output variables are known in advance, ignoring the critical
issue of data uncertainty. In this paper, we deal with the problem
of efficiency evaluation under uncertain conditions by adopting the
general framework of stochastic programming. We assume that
output parameters are represented by discretely distributed random
variables, and we propose two different models defined according to
risk-neutral and risk-averse perspectives. The models have been validated
on a real case study concerning the evaluation of the
technical efficiency of a sample of individual firms operating in
the Italian leather manufacturing industry. Our findings show the
validity of the proposed approach as an ex-ante evaluation technique,
providing the decision maker with useful insights depending on
his or her degree of risk aversion.
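The abstract does not state the model formulation; as a hedged illustration, the sketch below solves a standard input-oriented CCR DEA program with scipy, treating the risk-neutral case as DEA on scenario-expected outputs and the risk-averse case as DEA on worst-case scenario outputs. All data and the scenario treatment are assumptions made for illustration, not the paper's models.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of unit o (columns of X, Y are DMUs)."""
    m, n = X.shape              # m inputs, n units
    s = Y.shape[0]              # s outputs
    c = np.zeros(n + 1)
    c[0] = 1.0                  # minimise theta; variables = [theta, lambdas]
    A = np.zeros((m + s, n + 1))
    b = np.zeros(m + s)
    A[:m, 0] = -X[:, o]         # sum_j lambda_j x_ij <= theta * x_io
    A[:m, 1:] = X
    A[m:, 1:] = -Y              # sum_j lambda_j y_rj >= y_ro
    b[m:] = -Y[:, o]
    res = linprog(c, A_ub=A, b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.fun

rng = np.random.default_rng(0)
X = rng.uniform(1, 5, (2, 6))          # 2 inputs, 6 illustrative firms
Ys = rng.uniform(1, 5, (3, 1, 6))      # 3 equally weighted output scenarios
Y_neutral = Ys.mean(axis=0)            # risk-neutral: expected outputs
Y_averse = Ys.min(axis=0)              # risk-averse: worst-case outputs
for o in range(X.shape[1]):
    print(o, round(ccr_efficiency(X, Y_neutral, o), 3),
             round(ccr_efficiency(X, Y_averse, o), 3))
```

The worst-case scores never exceed the expected-value scores, which is the kind of insight a risk-averse decision maker would act on.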
Abstract: Despite extensive study of wireless sensor network
security, defending against internal attacks and detecting abnormal
sensor behaviour remain difficult and unsolved tasks. Conventional
cryptographic techniques do not provide robust security or a detection
process to protect the network from internal attackers exhibiting
abnormal behavior. A framework for identifying insider attackers or
abnormally behaving sensors and detecting their location, using false
message detection and Time Difference of Arrival (TDoA), is
presented in this paper. It is shown that the new framework
can efficiently identify insider attackers and detect their location, so that
the attacker can be reprogrammed or removed from the network to
protect it from internal attack.
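The abstract does not detail the localization step; below is a minimal, hedged sketch of TDoA multilateration by nonlinear least squares, showing how a misbehaving node could be located from arrival-time differences. The anchor positions, the radio-speed propagation constant, and the noise-free measurements are all assumptions for illustration.

```python
import numpy as np
from scipy.optimize import least_squares

C = 3e8  # assumed propagation speed (radio, m/s)

def tdoa_locate(anchors, tdoas, x0):
    """Estimate a source position from TDoAs measured against anchor 0."""
    def residuals(x):
        d = np.linalg.norm(anchors - x, axis=1)   # range to each anchor
        return (d[1:] - d[0]) - C * tdoas         # range-difference mismatch
    return least_squares(residuals, x0).x

anchors = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], float)
true_pos = np.array([30.0, 60.0])                 # hypothetical attacker node
d = np.linalg.norm(anchors - true_pos, axis=1)
tdoas = (d[1:] - d[0]) / C                        # ideal noise-free TDoAs
print(tdoa_locate(anchors, tdoas, x0=np.array([50.0, 50.0])))
```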
Abstract: Digital watermarking is one of the techniques for
copyright protection. In this paper, a normalization-based robust
image watermarking scheme that combines singular value
decomposition (SVD) and discrete cosine transform (DCT)
techniques is proposed. In the proposed scheme, the host image is
first normalized to a standard form and divided into non-overlapping
image blocks, and SVD is applied to each block. By concatenating the
first singular values (SVs) of adjacent blocks of the normalized image,
an SV block is obtained. DCT is then carried out on the SV blocks to
produce SVD-DCT blocks. A watermark bit is embedded in the high-frequency
band of an SVD-DCT block by imposing a particular
relationship between two pseudo-randomly selected DCT
coefficients. An adaptive frequency mask is used to adjust the local
watermark embedding strength. Watermark extraction mainly
involves the inverse process; the extraction method is blind
and efficient. Experimental results show that the quality degradation
of the watermarked image caused by the embedded watermark is visually
transparent. The results also show that the proposed scheme is robust
against various image processing operations and geometric attacks.
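The paper's exact coefficient relationship is not given in the abstract; the following hedged sketch shows one common way to realize such a scheme: force an ordering between two chosen DCT coefficients of a (here 1-D) SV block. The coefficient indices and the strength margin are illustrative assumptions, and the adaptive frequency mask is omitted.

```python
import numpy as np
from scipy.fft import dct, idct

def embed_bit(sv_block, bit, i=10, j=12, strength=4.0):
    """Embed one bit by ordering two high-frequency DCT coefficients
    of a 1-D SV block; i, j, and strength are illustrative choices."""
    c = dct(sv_block, norm='ortho')
    lo, hi = sorted((c[i], c[j]))
    mid = (lo + hi) / 2
    if bit:                          # bit 1 -> c[i] above c[j]
        c[i], c[j] = mid + strength / 2, mid - strength / 2
    else:                            # bit 0 -> c[i] below c[j]
        c[i], c[j] = mid - strength / 2, mid + strength / 2
    return idct(c, norm='ortho')

def extract_bit(sv_block, i=10, j=12):
    """Blind extraction: only the coefficient ordering is inspected."""
    c = dct(sv_block, norm='ortho')
    return int(c[i] > c[j])

sv = np.random.default_rng(0).uniform(0, 50, 16)  # mock SV block
print(extract_bit(embed_bit(sv, 1)), extract_bit(embed_bit(sv, 0)))
```

Because extraction only compares the two coefficients, no side information about the host image is needed, which is what makes such a detector blind.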
Abstract: Document image processing has become an
increasingly important technology in the automation of office
documentation tasks. During document scanning, skew is inevitably
introduced into the incoming document image. Since algorithms
for layout analysis and character recognition are generally very
sensitive to page skew, skew detection and correction in
document images are critical steps before layout analysis. In this
paper, a novel skew detection method is presented for binary
document images. The method selects certain
characters of the text, which are subjected to thinning and the Hough
transform to estimate the skew angle accurately. Several experiments
have been conducted on various types of documents, such as
English documents, journals, textbooks, documents in different
languages, documents with different fonts, and documents
with different resolutions, to reveal the robustness of the proposed
method. The experimental results show that the proposed method
is accurate compared with well-known existing methods.
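As a hedged illustration of the Hough-based angle estimate (omitting the paper's character selection and thinning steps), the sketch below gathers near-horizontal line segments with OpenCV's probabilistic Hough transform and takes the median of their angles; all thresholds are assumptions.

```python
import cv2
import numpy as np

def estimate_skew(binary_img, max_skew=15.0):
    """Estimate page skew in degrees from a binary document image."""
    edges = cv2.Canny(binary_img, 50, 150)
    segs = cv2.HoughLinesP(edges, 1, np.pi / 360, threshold=100,
                           minLineLength=80, maxLineGap=10)
    if segs is None:
        return 0.0
    angles = [np.degrees(np.arctan2(y2 - y1, x2 - x1))
              for x1, y1, x2, y2 in segs[:, 0]]
    angles = [a for a in angles if abs(a) <= max_skew]  # keep text-line segments
    return float(np.median(angles)) if angles else 0.0

def deskew(binary_img):
    """Rotate the page back by the estimated skew angle."""
    h, w = binary_img.shape[:2]
    M = cv2.getRotationMatrix2D((w / 2, h / 2), estimate_skew(binary_img), 1.0)
    return cv2.warpAffine(binary_img, M, (w, h))
```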
Abstract: This paper discusses the causal explanation capability
of QRIOM, a tool aimed at supporting the learning of organic chemistry
reactions. The development of the tool is based on the hybrid use of
the Qualitative Reasoning (QR) technique and the Qualitative Process
Theory (QPT) ontology. Our simulation combines symbolic,
qualitative descriptions of relations with quantity analysis to generate
causal graphs. The pedagogy embedded in the simulator is to both
simulate and explain organic reactions. Qualitative reasoning through
a causal chain is presented to explain the overall changes made
to the substrate, from the initial substrate to the production of the final
outputs. Several uses of the QPT modeling constructs in supporting
behavioral and causal explanation at run-time are also
demonstrated. Explaining organic reactions through a causal graph
trace can help improve the reasoning ability of learners, in that their
conceptual understanding of the subject is nurtured.
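The abstract does not give the reasoning machinery; the toy sketch below propagates the sign of a qualitative change through a hand-written influence graph, loosely in the spirit of QPT's influences. The reaction fragment and all edge signs are invented for illustration.

```python
influences = {                      # cause -> [(effect, sign), ...]
    'nucleophile_strength': [('attack_rate', +1)],
    'attack_rate': [('substrate_amount', -1), ('product_amount', +1)],
}

def propagate(graph, start, sign, result=None):
    """Depth-first propagation of a qualitative change (+1 or -1)."""
    if result is None:
        result = {}
    for effect, edge_sign in graph.get(start, []):
        s = sign * edge_sign
        if result.get(effect) != s:     # record and continue along the chain
            result[effect] = s
            propagate(graph, effect, s, result)
    return result

print(propagate(influences, 'nucleophile_strength', +1))
# -> {'attack_rate': 1, 'substrate_amount': -1, 'product_amount': 1}
```

The returned chain is exactly the kind of causal trace a learner could follow from substrate to final outputs.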
Abstract: Various models have been derived by studying a large number of completed software projects from various organizations and applications to explore how project size maps into project effort. However, there is still a need to improve the prediction accuracy of these models. Since a neuro-fuzzy system is able to approximate non-linear functions with greater precision, it is used here as a soft computing approach to generate a model by formulating the input-output relationship based on training. In this paper, the neuro-fuzzy technique is used for software effort estimation modeling on NASA software project data, and the performance of the developed models is compared with the Halstead, Walston-Felix, Bailey-Basili and Doty models reported in the literature.
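For context, the four baselines are simple size-to-effort equations; the sketch below codes the forms commonly quoted in the effort-estimation literature (effort in person-months, size in KLOC) together with the usual MMRE comparison metric. The coefficients are the commonly cited ones, not values taken from this paper, and the project sizes are illustrative.

```python
import numpy as np

# Size-to-effort baselines, in the forms commonly quoted in the literature.
def halstead(kloc):       return 0.7 * kloc ** 1.50
def walston_felix(kloc):  return 5.2 * kloc ** 0.91
def bailey_basili(kloc):  return 5.5 + 0.73 * kloc ** 1.16
def doty(kloc):           return 5.288 * kloc ** 1.047   # variant for KLOC > 9

def mmre(actual, predicted):
    """Mean magnitude of relative error, the usual comparison metric."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return float(np.mean(np.abs(actual - predicted) / actual))

kloc = np.array([10.0, 46.2, 90.2])        # illustrative project sizes
for model in (halstead, walston_felix, bailey_basili, doty):
    print(model.__name__, np.round(model(kloc), 1))
```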
Abstract: The problem of frequent itemset mining is considered in this paper. A new technique is proposed to generate frequent patterns in large databases without time-consuming candidate generation. The technique is based on focusing on transactions instead of concentrating on itemsets: the algorithm takes the intersection of one transaction with the other transactions and computes the maximum set of items shared between transactions, instead of creating itemsets and computing their frequency. Applying the technique to real-life transactions, with some assumptions drawn from real-life data, significant efficiency is achieved in generating association rules from databases.
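A minimal sketch of the intersection idea follows: candidate patterns are intersections of transaction pairs, whose support is then counted by subset tests. This illustrates the transaction-centric principle rather than reproducing the authors' exact algorithm, and the tiny database is invented.

```python
from itertools import combinations

def frequent_patterns(transactions, min_support):
    """Mine patterns as pairwise transaction intersections, then count
    each pattern's support with subset tests (no candidate generation)."""
    transactions = [frozenset(t) for t in transactions]
    candidates = {a & b for a, b in combinations(transactions, 2)}
    candidates.discard(frozenset())
    support = {c: sum(c <= t for t in transactions) for c in candidates}
    return {tuple(sorted(c)): s for c, s in support.items() if s >= min_support}

db = [{'a', 'b', 'c'}, {'a', 'b'}, {'a', 'c', 'd'}, {'b', 'c'}]
print(frequent_patterns(db, min_support=2))
```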
Abstract: Optical flow has been a research topic of interest for many
years. It has, until recently, been largely inapplicable to real-time
applications due to its computationally expensive nature. This paper
presents a new reliable flow technique that is combined with a
motion detection algorithm, operating on stationary-camera image streams,
to allow flow-based analyses of moving entities, such as rigidity, in
real time. Combining the optical flow analysis with the motion
detection technique greatly reduces the expensive computation of
flow vectors compared with standard approaches, rendering the
method applicable to real-time implementation. The paper
also describes the hardware implementation of a proposed pipelined
system to estimate the flow vectors from image sequences in real
time. The design can process 768 x 576 images at a very high frame
rate, reaching 156 fps on a single low-cost FPGA chip, which is
adequate for most real-time vision applications.
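The specific flow and detection algorithms are not named in the abstract; as a hedged software analogue of the gating idea, the sketch below uses OpenCV background subtraction to find moving regions and computes Farneback dense flow only inside their bounding boxes, so static background pixels never enter the flow computation. The file name and all parameters are placeholders.

```python
import cv2

cap = cv2.VideoCapture("stream.avi")         # placeholder input stream
bg = cv2.createBackgroundSubtractorMOG2()
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    mask = bg.apply(frame)                   # motion detection step
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for cnt in contours:                     # flow only on moving regions
        x, y, w, h = cv2.boundingRect(cnt)
        if w * h < 256:                      # skip tiny noise blobs
            continue
        flow = cv2.calcOpticalFlowFarneback(
            prev_gray[y:y + h, x:x + w], gray[y:y + h, x:x + w],
            None, 0.5, 3, 15, 3, 5, 1.2, 0)
        # ... flow vectors for this moving entity (e.g. rigidity analysis)
    prev_gray = gray
```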
Abstract: In this paper a new approach to face recognition is presented that achieves double dimension reduction, making the system computationally efficient with better recognition results. In pattern recognition techniques, the discriminative information of an image increases with resolution up to a certain extent; consequently, face recognition results improve as face image resolution increases and level off once a certain resolution is reached. In the proposed model, an image decimation algorithm is first applied to the face image for dimension reduction down to the resolution level that provides the best recognition results. The Discrete Cosine Transform (DCT) is then applied to the face image, owing to its computational speed and feature extraction potential. A subset of DCT coefficients from low to mid frequencies that represents the face adequately and provides the best recognition results is retained. A trade-off between the decimation factor, the number of retained DCT coefficients, and the recognition rate with minimum computation is obtained. Preprocessing of the image is carried out to increase robustness against variations in pose and illumination level. The new model has been tested on different databases, including the ORL database, the Yale database, and a color database, and has performed much better than other techniques. The significance of the model is twofold: (1) dimension reduction down to an effective and suitable face image resolution, and (2) retention of the appropriate DCT coefficients to achieve the best recognition results under varying image poses, intensity, and illumination levels.
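The sketch below illustrates the two reduction steps under stated assumptions: simple stride-based decimation (the paper's decimation algorithm is not specified in the abstract) followed by a 2-D DCT from which low-to-mid-frequency coefficients are kept in diagonal (zig-zag style) order. The decimation factor and coefficient count are illustrative, not the paper's tuned values.

```python
import numpy as np
from scipy.fft import dctn

def dct_features(face, decimation=2, n_coeffs=64):
    """Decimate, apply 2-D DCT, keep the first n_coeffs low-to-mid
    frequency coefficients in diagonal (zig-zag style) order."""
    face = face[::decimation, ::decimation].astype(float)  # naive decimation
    coeffs = dctn(face, norm='ortho')
    h, w = coeffs.shape
    order = sorted(((i, j) for i in range(h) for j in range(w)),
                   key=lambda ij: (ij[0] + ij[1], ij[0]))   # by anti-diagonal
    feats = np.array([coeffs[i, j] for i, j in order[:n_coeffs]])
    return feats[1:]   # drop the DC term to soften illumination changes

face = np.random.default_rng(0).integers(0, 256, (112, 92))  # ORL-sized mock
print(dct_features(face).shape)   # -> (63,) feature vector
```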
Abstract: Ferroresonance is an electrical phenomenon of
nonlinear character that frequently occurs in power systems due to
transmission line faults and single- or multi-phase switching on the
lines, as well as the use of saturable transformers. In this study, the
ferroresonance phenomena are investigated using a model of the
380 kV West Anatolian Electric Power Network in Turkey. The
ferroresonance event is observed as a result of removing the loads at
the end of the lines. In this sense, two different cases are considered.
First, switching is applied at the 2nd second, and the ferroresonance
effects are observed between the 2nd and 4th seconds in the voltage
variations of phase R. The ferroresonance and non-ferroresonance
parts of the overall data are then compared with each
other using Fourier transform techniques to show the
ferroresonance effects.
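The sketch below mimics that comparison on a synthetic phase voltage: an odd harmonic is injected between the 2nd and 4th seconds, and the FFT spectra of the two portions are contrasted. The sampling rate, harmonic content, and amplitudes are all invented for illustration.

```python
import numpy as np

fs = 10_000                                    # assumed sampling rate (Hz)
t = np.arange(0, 6, 1 / fs)
v = np.sin(2 * np.pi * 50 * t)                 # 50 Hz fundamental
fer = (t >= 2) & (t < 4)                       # window after switching at t = 2 s
v[fer] += 0.6 * np.sin(2 * np.pi * 150 * t[fer])  # synthetic distortion

def spectrum(x):
    """Single-sided amplitude spectrum via the FFT."""
    return np.fft.rfftfreq(len(x), 1 / fs), np.abs(np.fft.rfft(x)) / len(x)

for name, seg in [("normal", v[t < 2]), ("ferroresonant", v[fer])]:
    f, amp = spectrum(seg)
    print(name, "150 Hz amplitude:", round(amp[np.argmin(np.abs(f - 150))], 3))
```

The harmonic component absent from the normal segment stands out clearly in the ferroresonant one, which is the signature such a Fourier comparison is after.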
Abstract: Most of the losses in a power system arise in
the distribution sector, which has therefore always received attention.
Among the important factors contributing to increased losses
in the distribution system is the existence of reactive power flows.
The most common way to compensate reactive power
in the system is to use parallel (shunt) capacitors. In
addition to reducing losses, the advantages of capacitor
placement include the release of network capacity at peak load
and the improvement of the voltage profile. The point
to be considered in capacitor placement is the
optimal location and sizing of the capacitors, so as
to maximize the advantages of capacitor placement.
In this paper, a new technique is offered for the
placement and sizing of fixed capacitors in a radial
distribution network on the basis of a
Genetic Algorithm (GA). Existing optimal methods for
capacitor placement mostly address loss reduction
and voltage profile improvement simultaneously, but the
compensation cost and load changes have not been considered as
influential terms in the objective function. In this article, a holistic
approach is taken for the optimal solution of this
problem, one that includes all the relevant parameters of the distribution
network: cost, phase voltage, and load changes. Consequently,
a vast search over all possible solutions is required, and so in
this article we use the Genetic Algorithm (GA) as a
powerful method for this optimal search.
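Since the paper's encoding and objective are not given in the abstract, the sketch below is a generic GA for discrete capacitor sizing on a toy network: each gene picks a capacitor size for one bus, and the fitness mixes a crude loss proxy with capacitor cost. A real study would evaluate fitness with a load-flow; every number here is an assumption.

```python
import numpy as np

rng = np.random.default_rng(1)
N_BUS = 10
SIZES = np.array([0, 150, 300, 450, 600])      # candidate sizes (kvar)
Q_LOAD = rng.uniform(100, 500, N_BUS)          # bus reactive demand (kvar)
COST_PER_KVAR = 0.05                           # illustrative cost weight

def fitness(chrom):
    """Toy objective: quadratic loss proxy on residual reactive flow
    plus capacitor cost (a load-flow would replace the proxy)."""
    q_net = Q_LOAD - SIZES[chrom]
    return 1e-4 * np.sum(q_net ** 2) + COST_PER_KVAR * SIZES[chrom].sum()

def ga(pop_size=40, gens=200, p_mut=0.1):
    pop = rng.integers(0, len(SIZES), (pop_size, N_BUS))
    for _ in range(gens):
        scores = np.array([fitness(c) for c in pop])
        parents = pop[np.argsort(scores)[:pop_size // 2]]   # truncation selection
        children = []
        while len(children) < pop_size:
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, N_BUS)                    # single-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            mut = rng.random(N_BUS) < p_mut                 # uniform mutation
            child[mut] = rng.integers(0, len(SIZES), mut.sum())
            children.append(child)
        pop = np.array(children)
    best = min(pop, key=fitness)
    return best, fitness(best)

placement, score = ga()
print(dict(enumerate(SIZES[placement])), round(score, 1))
```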
Abstract: Linear stochastic estimation and quadratic stochastic
estimation techniques were applied to estimate the entire velocity
flow-field of an open cavity with a length-to-depth ratio of 2. The
estimations were performed using instantaneous velocity
magnitudes as estimators; these measurements were obtained by
Particle Image Velocimetry. The predicted flow was compared
against the original flow-field in terms of the Reynolds stresses and
the turbulent kinetic energy. Quadratic stochastic estimation proved
superior to linear stochastic estimation in resolving the shear
layer flow. When the velocity fluctuations were scaled up in the
quadratic estimate, both the time-averaged quantities and the
instantaneous cavity flow could be predicted rather accurately.
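Stochastic estimation amounts to least-squares regression of the field on the estimator signals, with the quadratic variant adding pairwise products of estimators. Below is a minimal sketch under that reading, with synthetic snapshot matrices standing in for the PIV data.

```python
import numpy as np

def design_matrix(E, quadratic=False):
    """Estimator matrix; quadratic estimation appends pairwise products."""
    terms = [E]
    if quadratic:
        n = E.shape[1]
        terms += [E[:, [i]] * E[:, i:] for i in range(n)]
    return np.hstack(terms)

def fit_stochastic_estimate(E, U, quadratic=False):
    """Least-squares coefficients mapping estimators E (snapshots x probes)
    to the field U (snapshots x grid points)."""
    A, *_ = np.linalg.lstsq(design_matrix(E, quadratic), U, rcond=None)
    return A

rng = np.random.default_rng(0)
E = rng.normal(size=(500, 4))                 # 4 probe signals, 500 snapshots
U = E @ rng.normal(size=(4, 200)) \
    + 0.1 * (E ** 2) @ rng.normal(size=(4, 200))   # mildly nonlinear field
for quad in (False, True):
    A = fit_stochastic_estimate(E, U, quadratic=quad)
    err = np.linalg.norm(design_matrix(E, quad) @ A - U) / np.linalg.norm(U)
    print("quadratic" if quad else "linear", "relative error:", round(err, 3))
```

On this synthetic field the quadratic estimate recovers the nonlinear part that the linear estimate misses, mirroring the advantage reported for the shear layer.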
Abstract: A boundary layer wind tunnel facility has been
adopted to conduct experimental measurements of the flow field around a model of the Panorama Giustinelli Building in Trieste
(Italy). Information on the main flow structures has been obtained by means of flow visualization techniques and has been compared with
numerical predictions of the vortical structures shed over the top of the roof, in order to investigate the optimal positioning of a vertical-axis
wind energy conversion system; good agreement between experimental measurements and numerical predictions is registered.
Abstract: The plastic forming of sheet plate occupies an
important place in metal forming. The traditional tool design
techniques for sheet forming operations used in industry are experimental
and expensive. Predicting the forming results and
determining the punch force, blank holder forces, and the
thickness distribution of the sheet metal will decrease the production
cost and time of the material to be formed. In this paper, a multi-stage
deep drawing simulation of an industrial part is presented
using the finite element method. The entire sequence of production steps,
including additional operations such as intermediate annealing and springback,
has been simulated in ABAQUS under axisymmetric
conditions. Simulation results such as the sheet thickness
distribution, the punch force, and the residual stresses have been extracted
at each stage, and the sheet thickness distribution was compared with
experimental results. The comparison showed that the
FE model is in close agreement with the
experiment.
Abstract: The business scenario is an important technique that may be used at various stages of enterprise architecture development to derive its characteristics from the high-level requirements of the business. In wireless deployments, business scenarios are used to help identify and understand business needs involving wireless services, and thereby to derive the business requirements that the architecture development has to address, taking into account various wireless challenges. This study assesses the deployment of Wireless Local Area Network (WLAN) and Broadband Wireless Access (BWA) solutions for several business scenarios in the Asia Pacific region. The paper focuses on an overview of the business and technology environments, discussing examples of existing or suggested wireless solutions adopted, or to be adopted, in the Asia Pacific region. The interactions of several players, enabling technologies, and key processes in the wireless environments are studied. The analysis and discussion are divided into two domains, healthcare and education, where the merits of wireless solutions in improving quality of life are highlighted.
Abstract: In this paper, the optimum weight and cost of a laminated composite plate are sought while it undergoes the heaviest load prior to complete failure. Various failure criteria have been defined for such structures in the literature; in this work, the Tsai-Hill theory is used as the failure criterion. The analysis is based on the Classical Lamination Theory (CLT). A new type of Genetic Algorithm (GA), operating directly on real variables, was employed as the optimization technique. However, since optimization via GAs is a long process in which most of the time is consumed by the analysis, a Radial Basis Function Neural Network (RBFNN) was employed to predict the output of the analysis. Thus, the optimization is carried out in a hybrid neuro-GA environment, and the procedure continues until a predicted optimum solution is achieved.
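The sketch below illustrates the surrogate idea under stated assumptions: a handful of expensive analyses train an RBF model, and the optimizer then queries the cheap surrogate. The toy objective stands in for the CLT/Tsai-Hill analysis, and a random candidate population stands in for the GA.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

def expensive_analysis(x):
    """Stand-in for the CLT/Tsai-Hill failure analysis (toy quadratic)."""
    return np.sum((x - 0.3) ** 2, axis=-1)

rng = np.random.default_rng(0)
X_train = rng.uniform(0, 1, (50, 2))            # sampled design variables
y_train = expensive_analysis(X_train)
surrogate = RBFInterpolator(X_train, y_train)   # RBFNN-style surrogate

candidates = rng.uniform(0, 1, (1000, 2))       # stands in for a GA population
best = candidates[np.argmin(surrogate(candidates))]
print("surrogate optimum near:", best,
      "true value:", expensive_analysis(best))
```

The design choice is that only 50 true analyses are paid for, while the optimizer evaluates a thousand candidates against the surrogate.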
Abstract: Sending encrypted messages frequently draws the attention
of third parties, perhaps causing attempts to break and
reveal the original messages. Steganography is introduced to hide
the existence of the communication by concealing a secret message
in an appropriate carrier such as text, image, audio or video. In quantum
steganography, the sender (Alice) embeds her steganographic
information into the cover and sends it to the receiver (Bob) over a
communication channel; Alice and Bob share an algorithm and hide
quantum information in the cover, and an eavesdropper (Eve) without
access to the algorithm cannot discover the existence of the quantum
message. In this paper, a text quantum steganography technique is
proposed, based on the use of the indefinite articles (a) or (an) in
conjunction with non-specific or non-particular nouns in the English
language and a quantum gate truth table. The authors also introduce a
new code representation technique (SSCE - Secret Steganography
Code for Embedding) at both ends in order to achieve a high level of
security. Before the embedding operation, each character of the secret
message is converted to its SSCE value and then embedded into the cover
text. Finally, the stego text is formed and transmitted to the receiver side,
where the reverse operations are carried out
to recover the original information.
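The SSCE table itself is not specified in the abstract; the sketch below illustrates only the shared-code idea, with a seed-derived byte permutation applied before embedding. The article-based carrier and the quantum gate mapping are not reproduced, and the seed-based table generation is an assumption.

```python
import random

def ssce_table(seed):
    """Shared secret permutation of byte values (seed acts as the key)."""
    rng = random.Random(seed)
    values = list(range(256))
    rng.shuffle(values)
    return values

def encode(message, seed):
    """Re-code each secret character before it is hidden in the cover."""
    table = ssce_table(seed)
    return bytes(table[b] for b in message.encode())

def decode(data, seed):
    """Receiver-side reverse operation using the same shared table."""
    inverse = {v: i for i, v in enumerate(ssce_table(seed))}
    return bytes(inverse[b] for b in data).decode()

secret = encode("meet at dawn", seed=42)   # values then hidden in the cover text
print(decode(secret, seed=42))             # -> meet at dawn
```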
Abstract: Practicum placements are a critical factor for student teachers on Education Programs. How can student teachers become professionals? This study investigated the problems, weaknesses, and obstacles of practicum placements and developed guidelines for partnership in practicum placements. In response to this issue, a partnership concept was implemented for developing student teachers into professionals. Data were collected through questionnaires on attitudes toward the problems, weaknesses, and obstacles of practicum placements of student teachers in Rajabhat universities, together with focus group interviews. The research revealed that learning management, classroom management, curriculum, assessment and evaluation, classroom action research, and teacher demeanor are the important factors affecting the professional development of Education Program student teachers. Learning management plans and classroom management concerning instructional design, teaching techniques, instructional media, and student behavior management are other important aspects influencing the professional development of student teachers.
Abstract: Medical imaging modalities such as computed
tomography (CT), magnetic resonance imaging (MRI), ultrasound
(US), and X-ray are used to diagnose disease. These modalities
provide flexible means of reviewing anatomical cross-sections and
physiological state in different parts of the human body. Raw
medical images have a huge file size and need large storage
requirements, so their size should be reduced to make them viable
for telemedicine applications. Image compression is thus a key
factor in reducing the bit rate for transmission or
storage while maintaining an acceptable reproduction quality, but it is
natural to raise the question of how much an image can be compressed
while still preserving sufficient information for a given clinical
application. Many techniques for achieving data compression have
been introduced. In this study, three different MRI modalities, namely
Brain, Spine and Knee, have been compressed and reconstructed
using the wavelet transform. Subjective and objective evaluations have
been conducted to investigate the clinical information quality of the
compressed images. For the objective evaluation, the results show
that the PSNR, which indicates the quality of the reconstructed image,
ranges from 21.95 dB to 30.80 dB, 27.25 dB to 35.75 dB, and
26.93 dB to 34.93 dB for Brain, Spine, and Knee respectively. For
the subjective evaluation test, the results show that a compression
ratio of 40:1 was acceptable for the brain images, whereas for the spine and
knee images 50:1 was acceptable.
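The study's exact wavelet, decomposition depth, and coder are not given in the abstract; the hedged sketch below compresses by wavelet decomposition and hard-thresholding with PyWavelets and reports PSNR. The wavelet choice, level, and keep-ratio are illustrative assumptions, and the mock slice replaces real MRI data.

```python
import numpy as np
import pywt

def wavelet_compress(img, wavelet='db4', level=3, keep=0.05):
    """Zero all but the largest `keep` fraction of wavelet coefficients,
    then reconstruct; a stand-in for a full wavelet coder."""
    coeffs = pywt.wavedec2(img.astype(float), wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    arr[np.abs(arr) < np.quantile(np.abs(arr), 1 - keep)] = 0
    rec = pywt.waverec2(pywt.array_to_coeffs(arr, slices,
                                             output_format='wavedec2'), wavelet)
    return rec[:img.shape[0], :img.shape[1]]

def psnr(orig, rec):
    """Peak signal-to-noise ratio for 8-bit images."""
    mse = np.mean((orig.astype(float) - rec) ** 2)
    return 10 * np.log10(255.0 ** 2 / mse)

mri = np.random.default_rng(0).integers(0, 256, (256, 256))  # mock MRI slice
print("PSNR:", round(psnr(mri, wavelet_compress(mri)), 2), "dB")
```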
Abstract: The purpose of this study was to elucidate the factors affecting the antimicrobial effectiveness of essential oils against food spoilage and pathogenic bacteria. The minimum inhibitory concentrations (MICs) of the essential oils were determined by a turbidimetric technique using a Bioscreen C analyzer. The effects of pH ranging from 7.3 to 5.5, in the absence and presence of essential oils and/or NaCl, on the lag time and mean generation time of the bacteria at 37°C were investigated. The results showed that the combination of low pH and essential oil at 37°C had additive effects against the test micro-organisms. The combination of 1.2% (w/v) NaCl and clove essential oil at 0.0325% (v/v) was effective against E. coli. The use of concentrations below the MIC in combination with low pH and/or NaCl has the potential to be used as an alternative to traditional food preservatives.