Abstract: Beam and diffuse radiation data are extracted analytically from previously measured data on a horizontal surface in Zarqa city. Moreover, radiation data on tilted surfaces with different slopes have been derived and analyzed. These data consist of the beam, diffuse, and ground-reflected radiation contributions. Hourly radiation data for the horizontal surface possess the highest values in June; the values decay as the slope increases, with the sharpest decrease occurring for the vertical surface. The beam radiation on a horizontal surface has the highest values compared to the diffuse radiation for all days of June. The total daily radiation on the tilted surface decreases with slope. The beam radiation data also decay with slope, especially for the vertical surface. The diffuse radiation decreases slightly with slope, with a sharp decrease for the vertical surface. The ground-reflected radiation grows with slope, especially for the vertical surface. It is clear that in June the highest harvesting of solar energy occurs for the horizontal surface, and the harvesting decreases as the slope increases.
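The three contributions named in this abstract (beam, diffuse, ground-reflected) are commonly combined with the isotropic sky model; the sketch below assumes that model, and the function name, arguments and default ground reflectance are illustrative, not taken from the paper.

```python
import math

def tilted_total(I_b, I_d, R_b, beta_deg, rho=0.2):
    """Total radiation on a surface of slope beta_deg (isotropic sky model).

    I_b, I_d : hourly beam and diffuse radiation on the horizontal surface
    R_b      : beam tilt factor (ratio of beam on tilted to horizontal)
    rho      : ground reflectance (assumed value)
    """
    beta = math.radians(beta_deg)
    beam = I_b * R_b                                          # beam contribution
    diffuse = I_d * (1 + math.cos(beta)) / 2                  # diffuse contribution
    reflected = (I_b + I_d) * rho * (1 - math.cos(beta)) / 2  # ground-reflected
    return beam + diffuse + reflected
```

At a slope of 0° the reflected term vanishes and the total reduces to the horizontal value; at 90° the diffuse term is halved and the reflected term is largest, matching the trends reported above.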
Abstract: In general, class complexity is measured based on factors such as Lines of Code (LOC), Function Points (FP), Number of Methods (NOM), Number of Attributes (NOA) and so on. Several new techniques, methods and metrics with different factors have been developed by researchers for calculating the complexity of a class in Object Oriented (OO) software. Earlier, Arockiam et al. proposed a new complexity measure, namely Extended Weighted Class Complexity (EWCC), an extension of the Weighted Class Complexity proposed by Mishra et al. EWCC is the sum of the cognitive weights of the attributes and methods of a class and those of its derived classes. In EWCC, the cognitive weight of each attribute is taken to be 1. The main problem with the EWCC metric is that every attribute holds the same value, whereas in general the cognitive load in understanding different types of attributes cannot be the same. We therefore propose a new metric, namely Attribute Weighted Class Complexity (AWCC). In AWCC, cognitive weights are assigned to the attributes based on the effort needed to understand their data types. Case studies and experiments show the proposed metric to be a better measure of the complexity of a class with attributes.
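A minimal sketch of how such an attribute-weighted count could be computed; the weight table here is hypothetical, whereas AWCC derives its weights from the effort needed to understand each data type.

```python
# Hypothetical cognitive weights per attribute data type; AWCC derives
# such weights from the effort needed to understand the type.
ATTRIBUTE_WEIGHTS = {"int": 1, "float": 1, "string": 2, "array": 3, "object": 4}

def attribute_weighted_complexity(attribute_types, method_weights):
    """Sum cognitive weights of a class's attributes and methods."""
    attr_part = sum(ATTRIBUTE_WEIGHTS.get(t, 1) for t in attribute_types)
    return attr_part + sum(method_weights)
```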
Abstract: This paper presents the exergy analysis of a desalination unit using the humidification-dehumidification process. The unit is considered as a thermal system with three main components: the heating unit, which uses a solar collector; the evaporator, or humidifier; and the condenser, or dehumidifier. In these components exergy, a measure of the quality or grade of energy, can be destroyed. According to the second law of thermodynamics, this destroyed part is due to irreversibilities, which must be determined to obtain the exergetic efficiency of the system.
In the current paper a computer program has been developed in Visual Basic to determine the exergy destruction and the exergetic efficiencies of the components of the desalination unit at variable operating conditions such as feed water temperature, outlet air temperature, air-to-feed-water mass ratio and salinity, in addition to cooling water mass flow rate and inlet temperature, as well as the quantity of solar irradiance.
The results obtained indicate that the exergy efficiency of the humidifier increases with increasing mass ratio and decreasing outlet air temperature. On the other hand, the exergy efficiency of the condenser increases with the increase of this ratio and also with the increase of the outlet air temperature.
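A minimal sketch of the component-level second-law balance such a program evaluates; the exact inlet and outlet exergy flow terms depend on each component (collector, humidifier, dehumidifier) and are assumptions here.

```python
def exergy_destruction(ex_in, ex_out):
    """Exergy destroyed in a component: supplied minus recovered exergy flow."""
    return ex_in - ex_out

def exergetic_efficiency(ex_in, ex_out):
    """Second-law efficiency: fraction of supplied exergy that is recovered."""
    return ex_out / ex_in
```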
Abstract: A study was carried out at the Rice Research Institute of Iran (RRII) to investigate the effect of the rollers' differential peripheral speed of a commercial rubber roll husker and of paddy moisture content on the husking index and the percentage of broken rice. The experiment was conducted at six levels of roller differential speed (1.5, 2.2, 2.9, 3.6, 4.3 and 5 m/s) and three levels of paddy moisture content (8-9, 10-11 and 12-13% w.b.). Two common paddy varieties, namely Binam and Khazer, were selected for this study. Results revealed that the effect of roller differential speed and moisture content significantly (P
Abstract: A full six degrees of freedom (6-DOF) flight dynamics model is proposed for the accurate prediction of short- and long-range trajectories of high-spin and fin-stabilized projectiles through atmospheric flight to the final impact point. The projectile is assumed to be rigid (non-flexible) and rotationally symmetric about its spin axis, launched
at low and high pitch angles. The mathematical model is based on the
full equations of motion set up in the no-roll body reference frame and
is integrated numerically from given initial conditions at the firing
site. The projectile's maneuvering motion depends on the most
significant force and moment variations, in addition to wind and
gravity. The computational flight analysis takes into consideration the
Mach number and total angle of attack effects by means of the
variable aerodynamic coefficients. For the purposes of the present
work, linear interpolation has been applied from the tabulated database
of McCoy's book. The developed computational method gives
satisfactory agreement with published data of verified experiments and
computational codes on atmospheric projectile trajectory analysis for
various initial firing flight conditions.
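The Mach-number lookup described above can be sketched as a plain linear interpolation over a tabulated coefficient; the table format is an assumption, and the sample values below are not from McCoy's database.

```python
def interp_coeff(mach, table):
    """Linearly interpolate an aerodynamic coefficient from a table of
    (Mach, coefficient) pairs sorted by Mach number; clamps at the ends."""
    if mach <= table[0][0]:
        return table[0][1]
    if mach >= table[-1][0]:
        return table[-1][1]
    for (m0, c0), (m1, c1) in zip(table, table[1:]):
        if mach <= m1:
            t = (mach - m0) / (m1 - m0)
            return c0 + t * (c1 - c0)
```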
Abstract: An electrocardiogram (ECG) data compression algorithm is needed that reduces the amount of data to be transmitted, stored and analyzed without losing the clinical information content. A
wavelet ECG data codec based on the Set Partitioning In Hierarchical
Trees (SPIHT) compression algorithm is proposed in this paper. The
SPIHT algorithm has achieved notable success in still image coding.
We modified the algorithm for the one-dimensional (1-D) case and
applied it to compression of ECG data.
This compression method achieves a small percent root mean square difference (PRD) and a high compression ratio with low implementation complexity. Experiments on selected
records from the MIT-BIH arrhythmia database revealed that the
proposed codec is significantly more efficient in compression and in
computation than previously proposed ECG compression schemes.
Compression ratios of up to 48:1 for ECG signals lead to acceptable
results for visual inspection.
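One common definition of the PRD figure quoted above (some variants subtract the signal mean in the denominator) can be sketched as:

```python
import math

def prd(original, reconstructed):
    """Percent root-mean-square difference between the original and the
    reconstructed ECG samples; lower values mean better fidelity."""
    num = sum((x - y) ** 2 for x, y in zip(original, reconstructed))
    den = sum(x ** 2 for x in original)
    return 100.0 * math.sqrt(num / den)
```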
Abstract: In this study the effect of incorporation of recycled
glass-fibre reinforced polymer (GFRP) waste materials, obtained by
means of milling processes, on mechanical behaviour of polyester
polymer mortars was assessed. For this purpose, different contents of
recycled GFRP waste powder and fibres, with distinct size gradings,
were incorporated into polyester based mortars as sand aggregates
and filler replacements. Flexural and compressive loading capacities were evaluated and found to be better than those of unmodified polymer mortars.
GFRP modified polyester based mortars also show a less brittle
behaviour, with retention of some loading capacity after peak load.
Obtained results highlight the high potential of recycled GFRP waste
materials as efficient and sustainable reinforcement and admixture for
polymer concrete and mortar composites, constituting an emergent
waste management solution.
Abstract: Increasingly sophisticated technologies are now able to assist surgeons in improving surgical performance through various training programs. Equally important to learning skills is the assessment method, as it determines the learning and technical proficiency of a trainee. A consistent and rigorous assessment system will ensure that trainees acquire the specific level of competency prior to certification. This paper reviews the methods currently in use for the assessment of surgical skill, as well as some modern techniques using computer-based measurements and virtual reality systems for more quantitative measurement.
Abstract: This paper describes a method of product analysis from the recycling point of view. The analysis is based on a set of measures that assess a product from the point of view of the final stages of its lifecycle. It was assumed that such analysis will be performed at the design phase; in order to conduct it, a computer system that aids the designer during the design process has been developed. The structure of the computer tool, based on agent technology, and example results are also included in the paper.
Abstract: An area-integrating method that uses the technique of total integrated light scatter for evaluating the root mean square height of the surface, Sq, is presented in the paper. It is based on measuring the scattered power using a flat photodiode integrator rather than an optical sphere or hemisphere. By this means, one can obtain much less expensive and smaller instruments than traditional ones, which could therefore find application in surface control, particularly in small and medium-size enterprises. A description of the functioning of the measuring unit, as well as the impact of different factors on its properties, is presented first. Next, results of measurements of Sq values performed on optical, silicon and metal samples are shown. It has also been shown that they are in good agreement with the results obtained using the Ulbricht sphere instrument.
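The Sq evaluation rests on the classical total-integrated-scatter relation TIS = (4π Sq cos θ / λ)², valid for smooth surfaces (Sq much smaller than λ); the sketch below inverts that relation and is an illustration, not the paper's calibrated procedure.

```python
import math

def sq_from_tis(tis, wavelength, incidence_deg=0.0):
    """Root-mean-square surface height Sq from the measured TIS ratio
    (diffusely scattered power over total reflected power).
    Returns Sq in the same unit as the wavelength."""
    theta = math.radians(incidence_deg)
    return wavelength * math.sqrt(tis) / (4 * math.pi * math.cos(theta))
```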
Abstract: In this paper we study the rheonomic mechanical systems from the point of view of Lagrange geometry, by means of its canonical semispray. We present an example of the constraint motion of a material point, in the rheonomic case.
Abstract: This paper addresses the problems encountered by conventional distance relays when protecting double-circuit transmission lines. The problems arise principally from the mutual coupling between the two circuits under different fault conditions; this mutual coupling is highly nonlinear in nature. An adaptive protection scheme based on an artificial neural network (ANN) is proposed for such lines. An ANN has the ability to classify the nonlinear relationship between measured signals by identifying different patterns of the associated signals. One of the key points of the present work is that only current signals measured at the local end have been used to detect and classify faults in the double-circuit transmission line with double-end infeed. The adaptive protection scheme is tested under a specific fault type but with varying fault location, fault resistance, fault inception angle and remote-end infeed. Once the neural network is trained adequately, it performs precisely when faced with different system parameters and conditions. The test results clearly show that the fault is detected and classified within a quarter cycle; thus the proposed adaptive protection technique is well suited for double-circuit transmission line fault detection and classification. Results of performance studies show that the proposed neural network-based module can improve the performance of conventional fault selection algorithms.
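As a toy illustration of classifying fault patterns from locally measured current features (the feature vectors, class labels and the nearest-centroid rule here are stand-ins, not the trained ANN of the paper):

```python
def nearest_centroid(sample, centroids):
    """Return the fault class whose centroid is closest (Euclidean) to the
    feature vector extracted from the local-end current signals."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist2(sample, centroids[label]))
```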
Abstract: In this article, a clustering-based technique has been developed and implemented for Short Term Load Forecasting. The formulation uses the Mean Absolute Percentage Error (MAPE) as the objective function, with the data matrix and cluster size as optimization variables. The designed model uses two temperature variables. It is compared with a six-input Radial Basis Function Neural Network (RBFNN) and a Fuzzy Inference Neural Network (FINN) for data from the same system over the same time period. The fuzzy inference system has the network structure and training procedure of a neural network and initially creates a rule base from existing historical load data. It is observed that the proposed clustering-based model gives better forecasting accuracy than the other two methods. Test results also indicate that the RBFNN can forecast future loads with accuracy comparable to that of the proposed method, whereas the training time required in the case of FINN is much less.
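The MAPE objective function referred to above is standard; a minimal sketch:

```python
def mape(actual, forecast):
    """Mean absolute percentage error of a load forecast, in percent."""
    errors = [abs((a - f) / a) for a, f in zip(actual, forecast)]
    return 100.0 * sum(errors) / len(errors)
```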
Abstract: In the present communication, we propose some new generalized measures of fuzzy entropy based upon real parameters, discuss their desirable properties, and present these measures graphically. An important property, namely the monotonicity of the proposed measures, has also been studied.
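For context, the classical (non-parametric) De Luca-Termini fuzzy entropy that such parametric measures generalize can be sketched as follows; the proposed measures themselves are not reproduced here.

```python
import math

def fuzzy_entropy(memberships):
    """De Luca-Termini fuzzy entropy of a fuzzy set given its membership
    grades; zero when every grade is 0 or 1 (a crisp set)."""
    h = 0.0
    for mu in memberships:
        if 0.0 < mu < 1.0:
            h -= mu * math.log(mu) + (1.0 - mu) * math.log(1.0 - mu)
    return h
```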
Abstract: An ECG analysis method was developed using ROC analysis of premature ventricular contraction (PVC) detection algorithms. ECG signals from the MIT-BIH arrhythmia database were analyzed in MATLAB. First of all, the baseline was removed by a median filter to preprocess the ECG signal. R peaks were detected for the ECG analysis method, and a normal VCG was extracted for the VCG analysis method. Four PVC detection algorithms were analyzed by ROC curves, whose parameters are the maximum amplitude of the QRS complex, the width of the QRS complex, the R-R interval and the geometric mean of the VCG. To set the cut-off values of the parameters, ROC curves were estimated from the true-positive rate (sensitivity) and the false-positive rate. The sensitivity and the true-negative rate (specificity) of the ROC curve were calculated, and the ECG was analyzed using the cut-off values estimated from the ROC curve. As a result, the PVC detection algorithm based on the VCG geometric mean has high availability, and PVCs could be detected more accurately using the amplitude and width of the QRS complex.
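One common way to set a cut-off from an ROC curve is to maximize Youden's J = sensitivity - false-positive rate; this is an illustrative choice, not necessarily the criterion used in the paper.

```python
def best_cutoff(roc_points):
    """Pick the cut-off value maximizing Youden's J = sensitivity - FPR.

    roc_points: iterable of (cutoff, sensitivity, false_positive_rate)."""
    return max(roc_points, key=lambda p: p[1] - p[2])[0]
```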
Abstract: The present study was conducted primarily to address two major research gaps: first, to develop an empirical measure of life meaningfulness for substance users and, second, to determine the psychosocial determinants of life meaningfulness among substance users. The study comprises two phases: the first dealt with the development of the Life Meaningfulness Scale, and the second examined the relationship between life meaningfulness and social support, abstinence self-efficacy and depression. Both qualitative and quantitative approaches were used for framing items. A Principal Component Analysis yielded three components, Overall Goal Directedness, Striving for healthy lifestyle and Concern for loved ones, which collectively accounted for 42.06% of the total variance. The scale and its subscales were also found to be highly reliable. Multiple regression analyses in the second phase revealed that social support and abstinence self-efficacy significantly predicted life meaningfulness among 48 recovering inmates of a de-addiction center, while level of depression failed to predict life meaningfulness.
Abstract: This research aims to compare the percentages of correct classification of the Empirical Bayes (EB) method with those of the classical method when data are constructed as near normal, short-tailed and long-tailed symmetric, or short-tailed and long-tailed asymmetric. The study is performed using a conjugate prior for the normal distribution with known mean and unknown variance. The hyper-parameters estimated by the EB method are substituted into the posterior predictive probability and used to predict new observations. Data are generated, consisting of a training set and a test set, with sample sizes of 100, 200 and 500 for binary classification. The results showed that the EB method exhibited improved performance over the classical method in all situations under study.
Abstract: The analysis of the Acoustic Emission (AE) signal generated by metal cutting processes has often been approached statistically. This is due to the stochastic nature of the emission signal, a result of the factors affecting the signal from its generation through transmission and sensing. Different techniques are applied in this manner, each of which is suitable for certain processes. In metal cutting, where the emission generated by the deformation process is rather continuous, an appropriate method for analysing the AE signal based on its root mean square (RMS) is often used and is suitable for conventional signal processing systems. The aim of this paper is to set out a strategy for tool failure detection in turning processes via statistical analysis of the AE generated from the cutting zone. The strategy is based on investigating the distribution moments of the AE signal at predetermined sampling intervals. The skewness and kurtosis of these distributions are the key elements in the detection. A normal (Gaussian) distribution was first suggested but was eliminated as insufficient. The so-called Beta distribution was then considered; it has been used with an assumed β density function and has given promising results with regard to chipping and tool breakage detection.
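The distribution moments driving the detection can be sketched directly from the sampled AE values; the population-moment (biased) estimators used here are one common convention.

```python
def skew_kurtosis(samples):
    """Skewness and (Pearson) kurtosis of a sample from its central moments;
    a Gaussian has skewness 0 and kurtosis 3."""
    n = len(samples)
    mean = sum(samples) / n
    m2 = sum((x - mean) ** 2 for x in samples) / n  # variance
    m3 = sum((x - mean) ** 3 for x in samples) / n  # third central moment
    m4 = sum((x - mean) ** 4 for x in samples) / n  # fourth central moment
    return m3 / m2 ** 1.5, m4 / m2 ** 2
```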
Abstract: Erroneous computer entry problems (e-errors) in hospital labs threaten the patient-health carer relationship, undermining the credibility of the health system. Are e-errors random, made by lab professionals accidentally, or may they be traced to meaningful determinants? Theories on the internal causality of mistakes compel us to seek specific causal ascriptions of hospital lab e-errors instead of accepting some inescapability. Undeniably, 'To Err is Human'. But in view of rapid global changes in health organizations, e-errors are too expensive to lack in-depth consideration. Yet, whether that e-function might supposedly be entrenched in the health carers' job description remains under dispute, at least for Hellenic labs, where e-use falls behind generalized(able) appreciation and application. In this study: i) an empirical basis of a truly high annual cost of e-errors, at about €498,000.00 per rural Hellenic hospital, was established, hence interest in exploring the issue was sufficiently substantiated; ii) a sample of 270 lab-expert nurses, technicians and doctors was assessed on several personality, burnout and e-error measures; and iii) the hypothesis that the Hardiness vs Alienation personality disposition explains resistance vs proclivity to e-errors was tested and verified: Hardiness operates as a source of resilience in the encounter of the high pressures experienced in the hospital lab, whereas its 'opposite', i.e., Alienation, functions as a predictor not only of making e-errors but also of leading to burnout. Implications for apt interventions are discussed.
Abstract: The adsorption of a simulated aqueous solution containing the textile remazol reactive dye Red 3BS by palm shell activated carbon (PSAC) as adsorbent was studied using Response Surface Methodology (RSM). A Box-Behnken design in the three most important operating variables, initial dye concentration, dosage of adsorbent and impeller speed, was employed for experimental design and optimization of the results. The significance of the independent variables and their interactions was tested by means of analysis of variance (ANOVA) with 95% confidence limits. The model indicated that increasing the dosage and speed gives removal of up to 90% with a capacity uptake of more than 7 mg/g. The high regression coefficient between the variables and the response (R-Sq = 93.9%) showed a good fit of the experimental data by the polynomial regression model.
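For reference, a three-factor Box-Behnken design consists of the twelve edge-midpoint runs plus replicated centre runs in coded (-1, 0, +1) units; a sketch (the number of centre points is an assumption, not taken from the paper):

```python
from itertools import combinations, product

def box_behnken_3(center_points=3):
    """Three-factor Box-Behnken design in coded units: each pair of factors
    takes all four (+/-1, +/-1) settings while the third is held at 0,
    followed by replicated centre runs."""
    runs = []
    for i, j in combinations(range(3), 2):
        for a, b in product((-1, 1), repeat=2):
            row = [0, 0, 0]
            row[i], row[j] = a, b
            runs.append(row)
    runs.extend([[0, 0, 0]] * center_points)
    return runs
```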