Abstract: Control chart pattern recognition is one of the most important tools for identifying the process state in statistical process control. An abnormal process state can be classified by recognizing the unnatural patterns that arise from assignable causes. In this study, a wavelet-based neural network approach is proposed for the recognition of control chart patterns with various characteristics. The proposed control chart pattern recognizer comprises three stages. First, multi-resolution wavelet analysis is used to generate time-shape and time-frequency coefficients that carry detailed information about the patterns. Second, distance-based features are extracted by a bi-directional Kohonen network to obtain reduced and robust information. Third, a back-propagation network classifier is trained on these features. The accuracy of the proposed method is demonstrated through performance evaluation with numerical results.
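The multi-resolution stage described above can be illustrated with a minimal Haar decomposition; this is a generic sketch of wavelet coefficient extraction, not the authors' implementation, and the example pattern is hypothetical.

```python
import math

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform.

    Returns (approximation, detail) coefficient lists; the detail
    coefficients carry the local shape information used as features.
    """
    s = math.sqrt(2.0)
    approx = [(signal[i] + signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / s for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def multiresolution(signal, levels):
    """Repeatedly decompose the approximation to build a coefficient pyramid."""
    coeffs = []
    current = list(signal)
    for _ in range(levels):
        current, detail = haar_dwt(current)
        coeffs.append(detail)
    coeffs.append(current)  # coarsest approximation last
    return coeffs

# Hypothetical upward-trend control chart pattern of length 8
pattern = [1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5]
coeffs = multiresolution(pattern, 2)
```

A trend pattern yields uniformly negative detail coefficients at the first level, which is the kind of time-shape signature a downstream classifier can exploit.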
Abstract: The wind resource in the Italian site of Lendinara
(RO) is analyzed through a systematic anemometric campaign
performed on the top of the bell tower, at an altitude of over 100 m
above the ground. Both the average wind speed and the Weibull
distribution are computed. The resulting average wind velocity is in
accordance with the numerical predictions of the Italian Wind Atlas,
confirming the accuracy of the wind-data extrapolation adopted for
evaluating the wind potential at altitudes higher than those of
commonly placed measurement stations.
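The Weibull fit mentioned above can be sketched with the empirical method-of-moments approximation of Justus; the wind-speed samples below are hypothetical stand-ins for the anemometric record, not the Lendinara data.

```python
import math
import statistics

def weibull_fit(speeds):
    """Estimate Weibull shape k and scale c (m/s) from wind-speed samples
    using Justus's method-of-moments approximation:
        k ~= (sigma / mean)^(-1.086),  c = mean / Gamma(1 + 1/k).
    """
    mean = statistics.mean(speeds)
    sd = statistics.stdev(speeds)
    k = (sd / mean) ** -1.086              # shape parameter
    c = mean / math.gamma(1.0 + 1.0 / k)   # scale parameter
    return k, c

# Hypothetical hourly wind speeds in m/s
speeds = [3.2, 4.1, 5.6, 2.8, 6.3, 4.9, 3.7, 5.1, 4.4, 3.9]
k, c = weibull_fit(speeds)
```

With the scale and shape in hand, the average speed and the full speed distribution needed for wind-potential estimates follow directly.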
Abstract: The mosaicing technique has been employed in a growing number of application fields, from entertainment to scientific ones. In the latter case, the final evaluation is often still left to human experts, who visually assess the quality of the mosaic. Frequently, the lack of objective measurements in microscopic mosaicing prevents the mosaic from being used as a starting image for further analysis. In this work we analyze three different metrics and indexes, in the domains of signal analysis, image analysis and visual quality, to measure the quality of different aspects of the mosaicing procedure, such as registration errors and visual quality. As the case study we consider the mosaicing algorithm we developed. The experiments have been carried out on mosaics with very different features: histological samples, which consist of detailed and contrasted images, and live stem cells, which show very low contrast and low detail levels.
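One standard objective quality metric of the kind discussed above is the peak signal-to-noise ratio over an overlap region; this is a generic sketch, not necessarily one of the three indexes the authors adopt, and the tile data are hypothetical.

```python
import math

def psnr(img_a, img_b, max_val=255.0):
    """Peak signal-to-noise ratio between two images given as flattened
    pixel lists; higher values mean the two tiles agree more closely
    in their overlap, i.e., smaller registration/blending error."""
    mse = sum((a - b) ** 2 for a, b in zip(img_a, img_b)) / len(img_a)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

# Hypothetical overlap regions from two adjacent mosaic tiles
tile_a = [100, 120, 130, 90, 80, 110]
tile_b = [101, 119, 128, 92, 80, 111]
```

Comparing PSNR across overlap regions gives a repeatable, operator-independent score where visual inspection would otherwise be the only check.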
Abstract: Sensorized instruments that accurately measure the interaction forces between biological tissue and the instrument end-effector during surgical procedures offer surgeons a greater sense of immersion during minimally invasive robotic surgery. Although there is ongoing research into force measurement involving surgical graspers, little corresponding effort has been devoted to the measurement of forces between scissor blades and tissue. This paper presents the design and development of a force measurement test apparatus, which will serve as a sensor characterization and evaluation platform. The primary aim of the experiments is to ascertain whether the system can differentiate between tissue samples with differing mechanical properties in a reliable, repeatable manner. Force-angular displacement curves highlight trends in the cutting process as well as the forces generated along the blade during a cutting procedure. Future applications of the test equipment will involve the assessment of new direct force sensing technologies for telerobotic surgery.
Abstract: In this paper, we present an analytical framework for evaluating the uplink performance of multihop cellular networks based on dynamic time division duplex (TDD). New wireless broadband protocols, such as WiMAX, WiBro, and 3G-LTE, apply TDD, and mobile communication protocols under standardization (e.g., IEEE 802.16j) are investigating mobile multihop relay (MMR) as a future technology. A novel MMR TDD scheme is presented, in which the dynamic part of the frame is shared between asymmetric traffic resources and multihop relaying. The mobile communication channel interference model comprises inner interference and co-channel interference (CCI). The performance analysis focuses on the uplink because the effects of dynamic resource allocation cause significant performance degradation, due to CCI, only in the uplink compared to time division multiple access (TDMA) schemes [1-3], whereas the downlink turns out to be the same or better. The analysis is based on the signal-to-interference power ratio (SIR) outage probability of dynamic TDD (D-TDD) and TDMA systems, which are the most widespread multi-user control techniques in mobile communications. This paper presents the uplink SIR outage probability with multihop results and shows that a dynamic TDD scheme applying MMR can provide a performance improvement over single-hop operation if executed properly.
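The SIR outage probability central to the analysis above can be illustrated with a small Monte Carlo experiment under Rayleigh fading; this is a generic interference model, not the paper's analytical framework, and the interferer counts are hypothetical.

```python
import random

def sir_outage(num_interferers, sir_threshold_db, trials=20000, seed=1):
    """Monte Carlo estimate of SIR outage probability.

    Signal and each co-channel interferer have exponentially distributed
    power (Rayleigh fading amplitudes); an outage occurs when the SIR
    falls below the threshold.
    """
    rng = random.Random(seed)
    threshold = 10 ** (sir_threshold_db / 10.0)
    outages = 0
    for _ in range(trials):
        signal = rng.expovariate(1.0)
        interference = sum(rng.expovariate(1.0) for _ in range(num_interferers))
        if signal / interference < threshold:
            outages += 1
    return outages / trials

# More simultaneous co-channel interferers (e.g., denser reuse) raises outage
p1 = sir_outage(1, 0.0)
p3 = sir_outage(3, 0.0)
```

The monotone growth of outage with the number of active interferers is exactly why uplink CCI, rather than the downlink, dominates the comparison between D-TDD and TDMA.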
Abstract: The availability of high-dimensional biological datasets, such as those from gene expression, proteomic, and metabolic experiments, can be leveraged for the diagnosis and prognosis of diseases. Many classification methods in this area have been studied to predict disease states and discriminate between predefined classes, such as patients with a particular disease versus healthy controls. However, most of the existing research focuses on a specific dataset, and there is a lack of generic comparisons between classifiers that might guide biologists or bioinformaticians in selecting the proper algorithm for new datasets. In this study, we compare the performance of popular classifiers, namely Support Vector Machine (SVM), Logistic Regression, k-Nearest Neighbor (k-NN), Naive Bayes, Decision Tree, and Random Forest, on mock datasets. We mimic common biological scenarios by simulating various proportions of real discriminating biomarkers and different effect sizes thereof. The results show that SVM performs quite stably and reaches a higher AUC than the other methods, which may be explained by SVM's ability to minimize the probability of error. Moreover, Decision Tree, with its good applicability for diagnosis and prognosis, shows good performance in our experimental setup. Logistic Regression and Random Forest, however, depend strongly on the ratio of discriminators and perform better with a higher number of discriminators.
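The AUC used for comparison above has a simple pairwise interpretation that can be computed directly; this is a generic stand-in for library routines, and the classifier scores below are hypothetical.

```python
def auc(scores_pos, scores_neg):
    """Empirical AUC: the probability that a randomly chosen positive
    sample is scored above a randomly chosen negative one, with ties
    counting one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical classifier scores for diseased vs. healthy samples
diseased = [0.9, 0.8, 0.7, 0.55]
healthy = [0.6, 0.4, 0.3, 0.2]
```

An AUC of 1.0 means perfect ranking of all diseased samples above all healthy ones; 0.5 is chance level, which makes the metric a natural common yardstick across very different classifiers.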
Abstract: The counting and analysis of blood cells allows the
evaluation and diagnosis of a vast number of diseases. In particular,
the analysis of white blood cells (WBCs) is a topic of great interest to
hematologists. Nowadays the morphological analysis of blood cells is
performed manually by skilled operators. This involves numerous
drawbacks, such as slowness of the analysis and non-standardized
accuracy that depends on operator skill. In the literature there are
only a few examples of automated systems for analyzing white blood
cells, most of which are only partial. This paper presents a complete
and fully automatic method for white blood cell identification from
microscopic images. The proposed method first identifies the white
blood cells, from which the nucleus and cytoplasm are subsequently
extracted. The whole work has been developed in the MATLAB
environment, in particular with the Image Processing Toolbox.
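A typical first step in such a pipeline is a global threshold separating the dark, dense nucleus from the paler surround; the Otsu method below is a generic Python sketch of that step (the paper's own pipeline is in MATLAB), and the toy image is hypothetical.

```python
def otsu_threshold(pixels):
    """Global Otsu threshold on 8-bit gray levels: pick the level that
    maximizes the between-class variance of the two resulting classes."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * hist[i] for i in range(256))
    best_t, best_var, w_b, sum_b = 0, -1.0, 0, 0.0
    for t in range(256):
        w_b += hist[t]                    # background weight
        if w_b == 0:
            continue
        w_f = total - w_b                 # foreground weight
        if w_f == 0:
            break
        sum_b += t * hist[t]
        mu_b = sum_b / w_b
        mu_f = (sum_all - sum_b) / w_f
        var_between = w_b * w_f * (mu_b - mu_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy image: dark nucleus pixels (~40) against bright cytoplasm/background (~200)
pixels = [40, 42, 38, 45, 41] * 10 + [200, 198, 205, 210, 195] * 10
t = otsu_threshold(pixels)
```

The threshold lands in the gap between the two intensity clusters, isolating the nucleus mask from which the cytoplasm region can then be grown.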
Abstract: Biodiesel as an alternative fuel for diesel engines has been under development for some three decades now. While it is gaining wide acceptance in Europe, the USA and parts of Asia, the same cannot be said of Africa. With more than 35 countries in the continent depending on imported crude oil, it is necessary to look for alternative fuels that can be produced from resources available locally within any country. Hence this study presents the performance of a single-cylinder diesel engine using blends of shea butter biodiesel. Shea butter was transformed into biodiesel by a transesterification process. Tests were conducted to compare the biodiesel with baseline diesel fuel in terms of engine performance and exhaust emission characteristics. The results showed that the addition of biodiesel to diesel fuel decreases the brake thermal efficiency (BTE) and increases the brake specific fuel consumption (BSFC). These results are expected, given the lower energy content of biodiesel fuel. On the other hand, while the NOx emissions increased with the biodiesel content of the fuel blends, the emissions of carbon monoxide (CO), unburnt hydrocarbons (UHC) and smoke opacity decreased. The engine performance, which indicates that the biodiesel has properties and characteristics similar to diesel fuel, together with the reductions in exhaust emissions, makes shea butter biodiesel a viable additive or substitute for diesel fuel.
Abstract: The work contains the results of a comprehensive
investigation into the evaluation of the condition of working blades
of gas turbine engines during fatigue tests, applying the acoustic
emission method. It demonstrates the possibility of estimating the
fatigue damage of blades in the course of factory tests. Acoustic
emission criteria for detecting and monitoring the kinetics of fatigue
crack propagation were established. It also shows the high
effectiveness of the method for non-destructive testing of the
condition of solid and cooled working blades of high-temperature
gas turbine engines.
Abstract: In today's turbulent environment, companies face two principal challenges. On the one hand, they must produce ever more cost-effectively to remain competitive. On the other hand, factories need to be transformable in order to manage unpredictable changes in the corporate environment. To meet these challenges, companies apply the philosophy of lean production in the first case and the philosophy of transformability in the second. To a certain extent these two approaches pull in different directions, which can cause conflicts when designing factories. Therefore, the Institute of Production Systems and Logistics (IFA) of the Leibniz University of Hanover has developed a procedure that allows companies to evaluate and design their factories with respect to the requirements of both philosophies.
Abstract: An empirical study of web applications that use
software frameworks is presented here. The analysis is based on two
approaches. In the first, developers using such frameworks are
required, based on their experience, to assign weights to parameters
such as database connection. In the second approach, a performance
testing tool, OpenSTA, is used to compute start time and other such
measures. From such an analysis, it is concluded that open source
software is superior to proprietary software. The motivation behind
this research is to examine ways in which a quantitative assessment
can be made of software in general and frameworks in particular.
Concepts such as metrics and architectural styles are discussed along
with previously published research.
Abstract: This study aims to propose three evaluation methods to
evaluate the Tokyo Cap and Trade Program when emissions trading is
performed virtually among enterprises, focusing on carbon dioxide
(CO2), which is the only emitted greenhouse gas that tends to increase.
The first method clarifies the optimum reduction rate for the highest
cost benefit, the second discusses emissions trading among enterprises
through market trading, and the third verifies long-term emissions
trading during the term of the plan (2010-2019), checking the validity
of emissions trading partly using Geographic Information Systems
(GIS). The findings of this study can be summarized in the following
three points.
1. Since the total cost benefit is greatest at a 44% reduction rate, the
reduction rate can be set higher than that of the Tokyo Cap and
Trade Program to obtain a greater total cost benefit.
2. At a 44% reduction rate, among 320 enterprises, 8 purchasing
enterprises and 245 sales enterprises gain profits from emissions
trading, and 67 enterprises make voluntary reductions without
conducting emissions trading. Therefore, to further promote
emissions trading, it is necessary to increase the sales volume of
emissions trading, and the number of sales enterprises, by
increasing the number of purchasing enterprises.
3. Compared to short-term emissions trading, few enterprises benefit
in each year through the long-term emissions trading of the Tokyo
Cap and Trade Program; only 81 enterprises at most can gain
profits from emissions trading in FY 2019. Therefore, by setting
the reduction rate higher, it is necessary to increase the number of
enterprises that participate in emissions trading and benefit from
the restraint of CO2 emissions.
Abstract: In the classical buckling analysis of rectangular plates
subjected to the concurrent action of shear and uniaxial forces, the
Euler shear buckling stress is generally evaluated separately, so that
no influence on the shear buckling coefficient, due to the in-plane
tensile or compressive forces, is taken into account.
In this paper the buckling problem of simply supported rectangular
plates, under the combined action of shear and uniaxial forces, is
discussed from the beginning, in order to obtain new design formulas
for the shear buckling coefficient that take into account the presence
of uniaxial forces.
Furthermore, as the classical expression of the shear buckling
coefficient for simply supported rectangular plates is only a "rough"
approximation, the exact one being defined by a system of
intersecting curves, the convergence and accuracy of the classical
solution are also analyzed.
Finally, since the evaluation of the Euler shear buckling stress is a
very important topic for a variety of structures (e.g., ship
structures), two numerical applications are carried out, in order to
highlight the role of the uniaxial stresses in the plating scantling
procedures and the accuracy of the proposed formulas.
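The classical (uncoupled) baseline that the paper improves upon can be sketched as follows; this uses the standard approximate coefficient for a simply supported plate in pure shear, with illustrative steel properties and a hypothetical panel, not the paper's numerical applications.

```python
import math

def shear_buckling_stress(a, b, t, E=206e9, nu=0.3):
    """Classical Euler shear buckling stress of a simply supported
    rectangular plate, without uniaxial-force interaction.

    Uses the approximate coefficient k_s = 5.34 + 4/(a/b)^2 for a >= b.
    Dimensions in metres, stresses in Pa; the default elastic modulus
    and Poisson ratio are illustrative steel values.
    """
    alpha = a / b                     # aspect ratio, long side over short side
    k_s = 5.34 + 4.0 / alpha ** 2     # shear buckling coefficient
    sigma_e = (math.pi ** 2 * E) / (12.0 * (1.0 - nu ** 2)) * (t / b) ** 2
    return k_s * sigma_e              # critical shear stress in Pa

# Hypothetical ship-plating-like panel: 2.4 m x 0.8 m x 10 mm
tau_cr = shear_buckling_stress(2.4, 0.8, 0.010)
```

In the combined-load problem treated in the paper, in-plane tension raises and compression lowers this critical stress, which is precisely the coupling the classical formula above ignores.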
Abstract: Many agent-oriented software engineering
methodologies have been proposed for software development;
however, their application is still limited by their lack of maturity.
Evaluating the strengths and weaknesses of these methodologies
plays an important role in improving them and in developing new
stronger methodologies. This paper presents an evaluation framework
for agent-oriented methodologies, which addresses six major areas:
concepts, notation, process, pragmatics, support for software
engineering and marketability. The framework is then used to
evaluate the Gaia methodology, to identify its strengths and
weaknesses, and to demonstrate the framework's ability to improve
agent-oriented methodologies by detecting their weaknesses in
detail.
Abstract: Extensive use of the Internet coupled with the
marvelous growth in e-commerce and m-commerce has created a
huge demand for information security. The Secure Socket Layer
(SSL) protocol is the most widely used security protocol in the
Internet that meets this demand. It provides protection against
eavesdropping, tampering and forgery. The cryptographic algorithms
RC4 and HMAC have been in use for achieving security services
such as confidentiality and authentication in SSL. But recent attacks
against RC4 and HMAC have undermined confidence in these
algorithms. Hence two novel cryptographic
algorithms MAJE4 and MACJER-320 have been proposed as
substitutes for them. The focus of this work is to demonstrate the
performance of these new algorithms and suggest them as dependable
alternatives to satisfy the need for security services in SSL. The
performance evaluation has been carried out through a practical
implementation.
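A practical performance evaluation of a MAC algorithm typically means timing repeated digest computations over a fixed payload. The sketch below benchmarks stdlib HMAC-SHA-1 as a stand-in, since the proposed MAJE4 and MACJER-320 implementations are not publicly available library code; the key, payload, and round count are arbitrary.

```python
import hashlib
import hmac
import time

def throughput_mb_s(key, data, rounds=200):
    """Measure MAC throughput in MB/s by timing repeated HMAC-SHA-1
    computations over the same payload."""
    start = time.perf_counter()
    for _ in range(rounds):
        hmac.new(key, data, hashlib.sha1).digest()
    elapsed = time.perf_counter() - start
    return (len(data) * rounds) / (1024 * 1024) / elapsed

# Hypothetical key and a 64 KiB payload
rate = throughput_mb_s(b"secret-key", b"x" * 65536)
```

Running the same harness over each candidate algorithm gives the like-for-like throughput comparison on which such an evaluation rests.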
Abstract: The paper deals with an application of quantitative analysis, the Data Envelopment Analysis (DEA) method, to the performance evaluation of the European Union Member States in the reference years 2000 and 2011. The main aim of the paper is to measure efficiency changes over the reference years, to analyze the level of productivity in individual countries based on the DEA method, and to classify the EU Member States into homogeneous units (clusters) according to the efficiency results. The theoretical part is devoted to the fundamentals of performance theory and the methodology of DEA. The empirical part is aimed at measuring the degree of productivity and the level of efficiency changes of the evaluated countries with a basic DEA model, the CCR CRS model, and a specialized DEA approach, the Malmquist Index, which measures the change of technical efficiency and the movement of the production possibility frontier. Here, the DEA method becomes a suitable tool for establishing the competitive or uncompetitive position of each country, because not just one factor is evaluated but a set of different factors that determine the degree of economic development.
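The CCR constant-returns-to-scale idea can be shown in its simplest special case, one input and one output, where each unit's efficiency reduces to its output/input ratio relative to the best observed ratio. This is a didactic sketch with hypothetical country indices; the paper's multi-factor model requires solving a linear program per decision-making unit.

```python
def ccr_efficiency(inputs, outputs):
    """CCR (CRS) efficiency scores for the one-input, one-output case:
    each unit's output/input ratio divided by the best ratio observed,
    so the frontier unit scores exactly 1.0."""
    ratios = [o / i for i, o in zip(inputs, outputs)]
    best = max(ratios)
    return [r / best for r in ratios]

# Hypothetical data for three countries: input = labour index, output = GDP index
inputs = [10.0, 8.0, 12.0]
outputs = [20.0, 20.0, 18.0]
scores = ccr_efficiency(inputs, outputs)
```

Units scoring 1.0 sit on the efficiency frontier; scores below 1.0 measure how far a country's productivity falls short of the best practice in the sample.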
Abstract: In this study, spatial-temporal speckle correlation techniques have been applied for the first time to the quality evaluation of three different Indian fruits, namely apple, pear and tomato. The method is based on the analysis of variations of laser light scattered from biological samples. The results showed that the cross-correlation coefficients of biospeckle patterns change according to the freshness of the fruits and their storage conditions. The biospeckle activity was determined by means of the cross-correlation functions of the intensity fluctuations. Significant changes in biospeckle activity were observed during the fruits' shelf lives, and the biospeckle activity was found to decrease with shelf-life storage time. Furthermore, it has been shown that biospeckle activity changes according to the respiration rates of the fruits.
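The cross-correlation coefficient underlying the biospeckle analysis above is the zero-mean normalized correlation between successive intensity frames; this is a generic sketch with hypothetical toy frames, not the authors' acquisition setup.

```python
import math

def correlation(frame_a, frame_b):
    """Zero-mean normalized cross-correlation between two intensity
    frames given as flattened pixel lists. In biospeckle analysis,
    faster decay of this coefficient over time indicates higher
    biological activity (fresher samples)."""
    n = len(frame_a)
    mean_a = sum(frame_a) / n
    mean_b = sum(frame_b) / n
    num = sum((a - mean_a) * (b - mean_b) for a, b in zip(frame_a, frame_b))
    den = math.sqrt(sum((a - mean_a) ** 2 for a in frame_a)
                    * sum((b - mean_b) ** 2 for b in frame_b))
    return num / den

# Hypothetical speckle frames captured a short interval apart
frame0 = [10, 30, 50, 70, 90, 20, 40, 60]
frame1 = [12, 28, 52, 69, 88, 22, 41, 58]  # similar frame: high correlation
r01 = correlation(frame0, frame1)
```

Tracking r over increasing time lags yields the decorrelation curve from which a single activity index per sample can be read off.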
Abstract: One of the determinants of a firm's prosperity is the
customers' perceived service quality and satisfaction. While service
quality is wide in scope and consists of various dimensions, there
may be differences in the relative importance of these dimensions in
affecting customers' overall satisfaction with service quality.
Identifying the relative rank of the different dimensions of service
quality is very important in that it can help managers find out which
service dimensions have a greater effect on customers' overall
satisfaction. Such an insight will consequently lead to more effective
resource allocation, which will finally result in higher levels of
customer satisfaction. This issue, despite its criticality, has not
received enough attention so far. Therefore, using a sample of 240
bank customers in Iran, an artificial neural network is developed to
address this gap in the literature. As customers' evaluation of service
quality is a subjective process, artificial neural networks, as a brain
metaphor, appear to have the potential to model such a complicated
process. Proposing a neural network that is able to predict
customers' overall satisfaction with service quality with a promising
level of accuracy is the first contribution of this study. In addition,
prioritizing the service quality dimensions affecting customers'
overall satisfaction, by means of a sensitivity analysis of the neural
network, is the second important finding of this paper.
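The sensitivity analysis mentioned above is commonly done by perturbing each input of the trained model and ranking inputs by the resulting output change. The sketch below uses a hypothetical linear stand-in for the trained network; the coefficients and dimension values are illustrative, not the study's data.

```python
def sensitivity(predict, baseline, delta=0.01):
    """Perturbation-based sensitivity analysis: nudge each input of a
    trained model by delta and report the magnitude of output change
    per unit input, one value per input dimension."""
    base_out = predict(baseline)
    effects = []
    for i in range(len(baseline)):
        x = list(baseline)
        x[i] += delta
        effects.append(abs(predict(x) - base_out) / delta)
    return effects

# Hypothetical satisfaction model over three service-quality dimensions
# (stand-in for a trained neural network)
def predict(x):
    return 0.5 * x[0] + 0.3 * x[1] + 0.2 * x[2]

effects = sensitivity(predict, [3.0, 3.0, 3.0])
```

Sorting the dimensions by their effect values gives exactly the priority ranking managers need for resource allocation.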
Abstract: Actual load, material characteristics and other
quantities often differ from the design values. This can cause
impaired function, shorter life or failure of a civil engineering
structure, a machine, a vehicle or another appliance. The paper
shows the main causes of these uncertainties and deviations and
presents a systematic approach and efficient tools for their
elimination or for the mitigation of their consequences. Emphasis is
put on the design stage, which is most important for ensuring
reliability. Principles of robust design and
important tools are explained, including FMEA, sensitivity analysis
and probabilistic simulation methods. The lifetime prediction of
long-life objects can be improved by long-term monitoring of the
load response and damage accumulation in operation. The condition
evaluation of engineering structures, such as bridges, is often based
on visual inspection and verbal description. Here, methods based on
fuzzy logic can reduce the subjective influences.
Abstract: In this paper, we propose a chaotic cipher system consisting of Improved Volterra Filters and a mapping created from actual voice data using a Radial Basis Function Network. To achieve a practical system, the scheme is assumed to use a digital communication line, such as the Internet, to maintain parameter matching between the transmitter and receiver sides. Therefore, to withstand outside attacks, it is necessary to complicate the internal state and improve the coefficient sensitivity. In this paper, we validate the robustness of the proposed method from three perspectives: chaotic properties, randomness, and coefficient sensitivity.
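The coefficient sensitivity a chaotic cipher relies on can be illustrated with the logistic map, a standard textbook example rather than the paper's Volterra-filter system: two receivers whose parameter differs by a tiny amount quickly produce divergent trajectories, so keystreams only match under exact parameter agreement.

```python
def logistic_orbit(x0, r, steps):
    """Iterate the logistic map x -> r * x * (1 - x) for a number of
    steps and return the final state. In the chaotic regime (r near 4),
    trajectories are extremely sensitive to the coefficient r."""
    x = x0
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

# Two hypothetical receivers whose parameter differs by only 1e-9
a = logistic_orbit(0.4, 3.99, 60)
b = logistic_orbit(0.4, 3.99 - 1e-9, 60)
```

After a few dozen iterations the two states are unrelated, which is the property that makes parameter matching over the communication line both necessary and an effective barrier to outside attack.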