Abstract: A hybrid photovoltaic/thermal (PV/T) solar system comprises a solar thermal collector mounted on photovoltaic solar cells. The disadvantage of a conventional photovoltaic cell is that its performance decreases as its temperature increases: only part of the solar radiation is converted into electricity, while the rest is dissipated as heat, raising the temperature of the photovoltaic cell above the ambient temperature. The objective of this work is to experimentally study and implement a hybrid prototype in order to evaluate its electrical and thermal performance. In this paper, an experimental study of two new configurations of hybrid collectors is presented, and the results are given and interpreted. The two absorber configurations studied are a new combination of tubes and a galvanized tank, and a tubes-and-sheet design.
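The temperature penalty that motivates the hybrid design is often described by a linear efficiency model. The sketch below uses typical textbook values (reference efficiency 15% at 25 °C, temperature coefficient 0.4%/°C), not measurements from this prototype:

```python
# Linear model for PV efficiency loss with cell temperature.
# eta = eta_ref * (1 - beta * (T_cell - T_ref)); values are
# illustrative textbook numbers, not this prototype's data.
def pv_efficiency(cell_temp, eta_ref=0.15, beta=0.004, t_ref=25.0):
    """eta_ref at t_ref (deg C); beta is the temperature coefficient."""
    return eta_ref * (1.0 - beta * (cell_temp - t_ref))

print(round(pv_efficiency(25.0), 4))   # → 0.15
print(round(pv_efficiency(65.0), 4))   # → 0.126
```

A 40 °C temperature rise thus costs roughly 16% of the electrical output in this model, which is the loss the thermal collector tries to recover as useful heat.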
Abstract: Objective: The objective of this paper is to assess hospitals' emergency preparedness using WHO standards.
Method: This is a cross-sectional study consisting of site visits and a questionnaire survey; 16 health facilities were included. The WHO standard for emergency preparedness of health facilities was used to evaluate and assess the preparedness of the hospitals.
Result: 13 hospitals responded. They scored below average in all measures (>75%), while above-average scores appeared in 7 out of 9 measures, with a range of 8%-25%. Unacceptable below-average scores were noted in two measures only.
Discussion: The biggest challenge facing the hospitals in their emergency intervention is the lack of pre-emergency and emergency preparedness plans, as well as of coordination of the hospitals' response mechanisms.
Conclusion: The studied hospitals are presently far from international disaster preparedness protocols. This necessitates improvements in emergency preparedness, as well as in physician skills for injury management.
Abstract: At present, the dictionary attack is the basic tool for recovering passwords. To avoid dictionary attacks, users purposely choose other character strings as passwords. According to statistics, about 14% of users choose keys on a keyboard (Kkeys, for short) as passwords. This paper develops a framework system for attacking passwords chosen from Kkeys and analyzes its efficiency. Within this system, we build keyboard rules using the adjacent and parallel relationships among Kkeys and then use these Kkey rules to generate password databases by a depth-first search method. According to the experimental results, we find that the key space of the databases derived from these Kkey rules can be far smaller than that of password databases generated by brute-force attack, thus effectively narrowing down the scope of the attack search. Taking one general Kkey rule, the combinations over all printable characters (94 types) with the Kkey adjacent and parallel relationships, as an example, the derived key space is about 2^40 times smaller than that of a brute-force attack. In addition, we demonstrate the method's practicality and value by successfully cracking access passwords to UNIX and PC systems using the password databases created.
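The rule-based generation step can be sketched as follows. This is a minimal, hypothetical adjacency map covering part of a QWERTY keyboard (the paper's full rule set also covers parallel-row relationships and all 94 printable characters), with a depth-first enumeration of candidate passwords:

```python
# Sketch of the Kkey idea: enumerate candidate passwords whose
# consecutive characters are physically adjacent keys, via DFS.
# The adjacency map is a hypothetical, partial QWERTY neighbourhood.
ADJACENT = {
    "q": "wa", "w": "qeas", "e": "wrds", "r": "etf", "t": "ryg",
    "a": "qwsz", "s": "awedxz", "d": "serfcx", "f": "drtgvc",
    "g": "ftyhbv", "z": "asx", "x": "zsdc", "c": "xdfv",
    "v": "cfgb", "b": "vghn",
}

def kkey_candidates(max_len):
    """Depth-first enumeration of adjacent-key strings up to max_len."""
    results = []
    def dfs(prefix):
        if len(prefix) >= 2:          # only multi-key sequences are useful
            results.append(prefix)
        if len(prefix) == max_len:
            return
        for nxt in ADJACENT.get(prefix[-1], ""):
            dfs(prefix + nxt)
    for start in ADJACENT:
        dfs(start)
    return results

cands = kkey_candidates(4)
print(len(cands), "asdf" in cands)    # far fewer than 26**4 candidates
```

Even this partial rule set illustrates the key-space reduction: the candidate list is orders of magnitude smaller than exhaustive enumeration over the same length.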
Abstract: Chinese idioms are a type of traditional Chinese idiomatic expression with specific meanings and a stereotyped structure; they are widely used in classical Chinese and are still common in vernacular written and spoken Chinese today. Currently, Chinese idioms are retrieved from glossaries by key character or key word through morphology or pronunciation indexes, which cannot meet the need for semantic search. OCIRS is proposed to find the desired idiom when users know only its meaning, without any key character or key word. The user's request, given as a sentence or phrase, is first analyzed grammatically by word segmentation, key word extraction and semantic similarity computation, and can thus be mapped onto the idiom domain ontology, which is constructed to provide ample semantic relations and to facilitate description-logics-based reasoning for idiom retrieval. The experimental evaluation shows that OCIRS realizes the function of searching idioms via semantics, obtaining preliminary results as requested by the users.
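The matching step at the heart of such a system can be sketched very roughly. Here a plain token-overlap (Jaccard) similarity stands in for the paper's ontology-based semantic similarity, and the idiom glosses are hypothetical, already-segmented keyword lists:

```python
# Toy stand-in for semantic idiom retrieval: rank idioms by keyword
# overlap between the user's segmented query and each idiom's gloss.
# Glosses and similarity measure are illustrative simplifications.
def similarity(query_words, gloss_words):
    """Jaccard overlap between query keywords and an idiom's gloss."""
    q, g = set(query_words), set(gloss_words)
    return len(q & g) / len(q | g) if q | g else 0.0

idiom_glosses = {                    # hypothetical segmented glosses
    "画蛇添足": ["add", "unnecessary", "detail", "ruin"],
    "守株待兔": ["wait", "passively", "luck", "no", "effort"],
}

def search(query_words):
    """Return the idiom whose gloss best matches the query keywords."""
    return max(idiom_glosses,
               key=lambda idiom: similarity(query_words,
                                            idiom_glosses[idiom]))

print(search(["wait", "luck"]))      # → 守株待兔
```

The real system replaces this surface overlap with ontology relations and description-logics reasoning, which is what lets it match a query to an idiom sharing no surface words at all.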
Abstract: Severe acute respiratory syndrome (SARS) is a respiratory disease in humans caused by the SARS coronavirus. The treatment of coronavirus-associated SARS has been evolving, and so far there is no consensus on an optimal regimen. The mainstream therapeutic interventions for SARS involve broad-spectrum antibiotics and supportive care, as well as antiviral agents and immunomodulatory therapy. Protein–ligand interactions play a significant role in structure-based drug design. In the present work we have taken the receptor angiotensin-converting enzyme 2 (ACE-2) and identified the drugs that are commonly used against SARS: Lopinavir, Ritonavir, Ribavirin and Oseltamivir. The receptor ACE-2 was docked with these drugs, and the energy values obtained are as follows: Lopinavir (-292.3), Ritonavir (-325.6), Oseltamivir (-229.1), Ribavirin (-208.8). Based on the lowest energy values, we chose the best two of the four conventional drugs and tried to improve the binding efficiency and steric compatibility of these two, namely Ritonavir and Lopinavir. Several modifications were made to the probable functional groups (phenylic and ketonic groups in the case of Ritonavir, and carboxylic groups in the case of Lopinavir) interacting with the receptor molecule. Analogs were prepared with the Marvin Sketch software and docked using the HEX docking software. Lopinavir analog 8 and Ritonavir analog 11 showed significant energy values and are probable lead molecules. This implies that some of the modified drugs are better than the original drugs. Further work can be carried out to improve the steric compatibility of the drugs, building on the work done above, for a more energy-efficient binding of the drugs to the receptor.
Abstract: A new reverse-phase high-performance liquid chromatography (RP-HPLC) method with a fluorescence detector (FLD) was developed and optimized for norfloxacin determination in human plasma. Mobile phase specifications, the extraction method and the excitation and emission wavelengths were varied for optimization. The HPLC system contained a reverse-phase C18 (5 μm, 4.6 mm×150 mm) column with the FLD operated at 330 nm excitation and 440 nm emission. The optimized mobile phase consisted of 14% acetonitrile in buffer solution; the aqueous phase, prepared by mixing 2 g of citric acid, 2 g of sodium acetate and 1 mL of triethylamine in 1 L of Milli-Q water, was run at a flow rate of 1.2 mL/min. The standard curve was linear over the range tested (0.156–20 μg/mL) and the coefficient of determination was 0.9978. Aceclofenac sodium was used as the internal standard. A detection limit of 0.078 μg/mL was achieved. The run time was set at 10 minutes; the retention time of norfloxacin was 0.99 min, which shows the rapidity of this method of analysis. The present assay showed good accuracy, precision and sensitivity for norfloxacin determination in human plasma with a new internal standard, and it can be applied to the pharmacokinetic evaluation of norfloxacin tablets after oral administration in humans.
Abstract: The development of wireless communication technologies has changed our way of living at the global level. After the international success of mobile telephony standards, location- and time-independent voice connection has become the default in daily telecommunications. Today, highly advanced multimedia messaging plays a key role in value-added service handling. Along with evolving data services, a need for more complex applications can be seen, including the mobile use of broadcast technologies. Here the performance of a system design for terrestrial multimedia content is examined with emphasis on mobile reception. This review paper covers the role of the physical layer and the effects of the terrestrial channel on terrestrial multimedia transmission using OFDM, with DVB-H as the benchmark standard.
Abstract: This paper presents a particle swarm optimization algorithm with particle reduction for global optimization problems. Particle swarm optimization is an algorithm inspired by collective motion, such as that of flocks of birds or schools of fish, and a multi-point search algorithm that finds a best solution using multiple particles. Particle swarm optimization is so flexible that it can be adapted to a wide range of optimization problems. When an objective function has many intricately placed local minima, the particles may fall into a local minimum. To avoid local minima, a large number of particles are initially prepared and their positions are updated by particle swarm optimization. The particles are then sequentially reduced, based on their evaluation values, until a predetermined number remains, and particle swarm optimization continues until the termination condition is met. In order to show the effectiveness of the proposed algorithm, we examine the minima obtained on test functions in comparison with existing algorithms. Furthermore, the influence of the initial number of particles on the best value found by our algorithm is discussed.
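The scheme described above can be sketched as follows. The reduction rule here (periodically dropping the particle with the worst personal best until a target count remains) and all parameter values are assumptions for illustration, not the paper's exact settings:

```python
import random

# Sketch of PSO with particle reduction: start with many particles,
# periodically remove the worst until n_final remain (assumed scheme).
def sphere(x):                        # simple test function, minimum at 0
    return sum(xi * xi for xi in x)

def pso_reduce(f, dim=2, n_init=40, n_final=10, iters=200,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    rnd = random.Random(seed)
    pos = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(n_init)]
    vel = [[0.0] * dim for _ in range(n_init)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for t in range(iters):
        for i in range(len(pos)):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rnd.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rnd.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
        # Reduction: every 5 iterations remove the particle with the
        # worst personal best, until n_final particles remain.
        if t % 5 == 4 and len(pos) > n_final:
            worst = max(range(len(pos)), key=lambda i: f(pbest[i]))
            del pos[worst], vel[worst], pbest[worst]
    return gbest, f(gbest)

gbest, val = pso_reduce(sphere)
print(val)                            # close to 0 on the sphere function
```

The many initial particles cover the search space broadly; the survivors concentrate the later iterations on the most promising basins, which is the mechanism the paper uses to escape local minima cheaply.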
Abstract: The image segmentation method described in this paper has been developed as a pre-processing stage to be used in methodologies and tools for video/image indexing and retrieval by content. The method solves the problem of extracting whole objects from the background and produces images of single complete objects from videos or photos. The extracted images are used for calculating the object visual features necessary for both the indexing and retrieval processes.
The segmentation algorithm is based on the cooperation of an optical flow evaluation method, edge detection and region growing procedures. The optical flow estimator belongs to the class of differential methods. It can detect motions ranging from a fraction of a pixel to a few pixels per frame, achieves good results in the presence of noise without the need for a filtering pre-processing stage, and includes a specialised model for moving object detection.
The first task of the presented method exploits cues from motion analysis to detect moving areas. Objects and background are then refined using edge detection and seeded region growing procedures, respectively. All the tasks are performed iteratively until objects and background are completely resolved.
The method has been applied to a variety of indoor and outdoor scenes in which objects of different types and shapes appear on variously textured backgrounds.
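Of the cooperating procedures, seeded region growing is the simplest to sketch. The version below is a generic 4-connected flood from a seed pixel with an intensity tolerance, a stand-in for the refinement step rather than the paper's exact formulation:

```python
from collections import deque

# Generic seeded region growing: collect 4-connected pixels whose
# intensity stays within `tol` of the seed pixel's intensity.
def region_grow(image, seed, tol):
    """Return the set of (row, col) pixels grown from `seed`."""
    h, w = len(image), len(image[0])
    sr, sc = seed
    base = image[sr][sc]
    region, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in region
                    and abs(image[nr][nc] - base) <= tol):
                region.add((nr, nc))
                queue.append((nr, nc))
    return region

img = [[10, 11, 90],
       [10, 12, 95],
       [88, 91, 93]]
print(len(region_grow(img, (0, 0), tol=5)))   # → 4 (the dark block)
```

In the full method the seeds come from the motion analysis, and edge detection constrains where growth may stop, so the grown regions align with true object boundaries.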
Abstract: This article demonstrates the development of a controlled release system for an NSAID drug, diclofenac sodium, employing different ratios of ethyl cellulose. Diclofenac sodium and ethyl cellulose in different proportions were processed by microencapsulation based on a phase separation technique to formulate microcapsules. The prepared microcapsules were then compressed into tablets to obtain controlled release oral formulations. For in-vitro evaluation, a dissolution test of each preparation was conducted in 900 mL of phosphate buffer solution of pH 7.2, maintained at 37 ± 0.5 °C and stirred at 50 rpm. At predetermined time intervals (0, 0.5, 1.0, 1.5, 2, 3, 4, 6, 8, 10, 12, 16, 20 and 24 hrs), the drug concentration in the collected samples was determined by UV spectrophotometry at 276 nm. The physical characteristics of the diclofenac sodium microcapsules were within the accepted range: they were off-white, free flowing and spherical in shape. The release profile of diclofenac sodium from the microcapsules was found to depend directly on the proportion of ethyl cellulose and the coat thickness. The in-vitro release pattern showed that with drug:polymer ratios of 1:1 and 1:2, the percentage of drug released in the first hour was 16.91% and 11.52%, respectively, compared to only 6.87% within this time for the 1:3 ratio. The release mechanism followed the Higuchi model. The tablet formulation (F2) of the present study was found comparable in release profile to the marketed brand Phlogin-SR, and the microcapsules showed extended release beyond 24 h. Further, a good correlation was found between drug release and the proportion of ethyl cellulose in the microcapsules. Microencapsulation based on coacervation was found to be a good technique for controlling the release of diclofenac sodium in controlled release formulations.
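The Higuchi model mentioned above states that the cumulative amount released grows with the square root of time, Q = k_H·√t. A fit of that model can be sketched with a least-squares slope through the origin; the data points below are illustrative, not the measurements reported in this study:

```python
import math

# Fit the Higuchi model Q = k_H * sqrt(t) by least squares through the
# origin, and report the coefficient of determination. The release data
# here are hypothetical illustration values.
def fit_higuchi(times, release):
    """Return (k_H, R^2) for Q vs sqrt(t)."""
    xs = [math.sqrt(t) for t in times]
    k = sum(x * q for x, q in zip(xs, release)) / sum(x * x for x in xs)
    ss_res = sum((q - k * x) ** 2 for x, q in zip(xs, release))
    mean_q = sum(release) / len(release)
    ss_tot = sum((q - mean_q) ** 2 for q in release)
    return k, 1 - ss_res / ss_tot

times = [1, 2, 4, 8, 12, 24]                   # h (hypothetical)
release = [6.9, 9.8, 13.9, 19.5, 24.0, 33.8]   # % released (hypothetical)
k, r2 = fit_higuchi(times, release)
print(round(k, 2), round(r2, 3))
```

An R² near 1 on such a fit is what supports the claim that diffusion through the ethyl cellulose coat, rather than erosion, governs the release.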
Abstract: Program slicing is the task of finding all statements in a program that directly or indirectly influence the value of a variable occurrence. The set of statements that can affect the value of a variable at some point in a program is called a backward program slice. In several software engineering applications, such as program debugging and measuring program cohesion and parallelism, several slices are computed at different program points. The existing algorithms for computing program slices are designed to compute a slice at a single program point: the program, or the model that represents it, is traversed completely or partially once, and to compute more than one slice the same algorithm is applied at every point of interest, so the same program, or program representation, is traversed several times.
In this paper, an algorithm is introduced that computes all forward static slices of a computer program by traversing its program representation graph once. The introduced algorithm is therefore useful for software engineering applications that require computing program slices at many different points of a program. The program representation used in this paper is the Program Dependence Graph (PDG).
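The single-traversal idea can be sketched on a dependence graph assumed to be acyclic: visiting nodes once with memoization, each node's forward slice is itself plus the union of its successors' slices, so every slice is available after one pass. This is a generic sketch of the idea, not the paper's algorithm (real PDGs can contain cycles, which would need an extra strongly-connected-component step):

```python
# All forward slices of an acyclic dependence graph in one memoized
# traversal: slice(n) = {n} ∪ slices of n's dependence successors.
def all_forward_slices(pdg):
    """pdg: {node: [nodes that depend on it]} -> {node: forward slice}."""
    slices = {}

    def visit(n):
        if n not in slices:          # memoization: each node visited once
            s = {n}
            for succ in pdg.get(n, []):
                s |= visit(succ)
            slices[n] = s
        return slices[n]

    for n in pdg:
        visit(n)
    return slices

# Tiny example PDG: statement 1 influences 2 and 3; 2 influences 4.
pdg = {1: [2, 3], 2: [4], 3: [], 4: []}
print(all_forward_slices(pdg)[1])    # → {1, 2, 3, 4}
```

Because each node and edge is processed exactly once, the cost of obtaining slices at every program point is that of a single graph traversal, which is the paper's central efficiency claim.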
Abstract: This paper deals with the modeling and parameter identification of nonlinear systems described by a Hammerstein model having piecewise-nonlinear characteristics, such as a dead-zone nonlinearity. The simultaneous use of an easy decomposition technique and triangular basis functions leads to a particular form of Hammerstein model. Approximating the static nonlinear block by triangular basis functions leads to a linear regressor model, so that least squares techniques can be used for the parameter estimation. The Singular Value Decomposition (SVD) technique is applied to separate the coupled parameters. The proposed approach has been tested on academic simulation examples.
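The key point, that triangular basis functions make the static nonlinearity linear in its parameters, can be sketched on a dead-zone example. The grid, widths and solver below are illustrative choices, not the paper's identification setup:

```python
# Approximate a dead-zone nonlinearity with triangular (hat) basis
# functions; the weights enter linearly, so least squares applies.
# Normal equations are solved by plain Gaussian elimination.
def hat(x, center, width):
    """Triangular basis function centred at `center`."""
    return max(0.0, 1.0 - abs(x - center) / width)

def dead_zone(u, a=0.5):
    """Static nonlinearity with a dead zone of half-width a."""
    return 0.0 if abs(u) <= a else (u - a if u > 0 else u + a)

centers = [-2 + 0.5 * i for i in range(9)]       # knots on [-2, 2]
samples = [-2 + 0.1 * i for i in range(41)]      # input grid
n = len(centers)
A = [[sum(hat(u, ci, 0.5) * hat(u, cj, 0.5) for u in samples)
      for cj in centers] for ci in centers]
b = [sum(hat(u, ci, 0.5) * dead_zone(u) for u in samples) for ci in centers]
for i in range(n):                                # forward elimination
    p = max(range(i, n), key=lambda r: abs(A[r][i]))
    A[i], A[p] = A[p], A[i]
    b[i], b[p] = b[p], b[i]
    for r in range(i + 1, n):
        factor = A[r][i] / A[i][i]
        for c in range(i, n):
            A[r][c] -= factor * A[i][c]
        b[r] -= factor * b[i]
w = [0.0] * n                                     # back substitution
for i in range(n - 1, -1, -1):
    w[i] = (b[i] - sum(A[i][c] * w[c] for c in range(i + 1, n))) / A[i][i]

approx = sum(wi * hat(1.5, c, 0.5) for wi, c in zip(w, centers))
print(round(approx, 4))              # close to dead_zone(1.5) = 1.0
```

Since the dead-zone breakpoints fall on knots, the piecewise-linear basis reproduces the nonlinearity exactly; in the Hammerstein setting these weights appear multiplied by the linear block's parameters, which is what the SVD step then separates.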
Abstract: In any trust model, the two information sources that a peer relies on to predict the trustworthiness of another peer are direct experience and reputation. These two vital components evolve over time. Trust evolution is an important issue, where the objective is to observe a sequence of past values of a trust parameter and determine future estimates. Unfortunately, trust evolution algorithms have received little attention, and the algorithms proposed in the literature do not comply with the conditions and the nature of trust. This paper contributes to this important problem in the following ways: (a) it presents an algorithm that manages and models trust evolution in a P2P environment, (b) it devises new mechanisms for effectively maintaining trust values based on the conditions that influence trust evolution, and (c) it introduces a new methodology for incorporating trust-nurture incentives into the trust evolution algorithm. Simulation experiments are carried out to evaluate the trust evolution algorithm.
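A common shape for such an update, blending direct experience with reputation and weighting recent evidence more heavily, can be sketched as follows. The mixing weights and update form are assumptions for illustration, not the paper's algorithm:

```python
# Illustrative trust-evolution update: an exponentially weighted
# combination of direct experience and reputation. alpha and beta
# are assumed parameters, not values from the paper.
def update_trust(old_trust, direct, reputation, alpha=0.3, beta=0.7):
    """beta mixes direct experience vs reputation in [0, 1];
    alpha is the learning rate favouring recent evidence."""
    evidence = beta * direct + (1 - beta) * reputation
    return (1 - alpha) * old_trust + alpha * evidence

t = 0.5                               # neutral prior trust
for direct, rep in [(1.0, 0.8), (1.0, 0.9), (0.0, 0.2)]:
    t = update_trust(t, direct, rep)
print(round(t, 3))                    # → 0.531
```

Note how the final bad interaction pulls the value down but does not erase the earlier good history, which matches the gradual-evolution condition the paper imposes on trust.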
Abstract: This paper presents a comparative study of data coding methods aimed at assessing the benefit of concealing natural data that constitute a trade secret. The influence of the number of replicates (rep), the treatment effects (τ) and the standard deviation (σ) on the efficiency of each transformation method is investigated. The experimental data are generated via computer simulations under specified process conditions with a completely randomized design (CRD). Three data transformation methods are considered: the Box-Cox, arcsine and logit methods. The differences in the F statistic between coded data and natural data (Fc-Fn) and the hypothesis testing results were determined. The experimental results indicate that the Box-Cox results differ significantly from the natural data for smaller numbers of replicates, and the method seems improper when a negative lambda is assigned. On the other hand, the arcsine and logit transformations are more robust and clearly provide more precise numerical results. In addition, alternative ways to select the lambda in the power transformation are offered to achieve more appropriate outcomes.
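The three codings compared above have standard textbook forms, sketched below for a proportion-valued response (the choice of lambda for Box-Cox is left to the user, as in the paper):

```python
import math

# Standard forms of the three transformations compared in the study,
# applied here to a proportion p = 0.25 for illustration.
def box_cox(y, lam):
    """Box-Cox power transform; the log transform is the lam = 0 limit."""
    return math.log(y) if lam == 0 else (y ** lam - 1) / lam

def arcsine(p):
    """Arcsine (angular) transform for proportions in [0, 1]."""
    return math.asin(math.sqrt(p))

def logit(p):
    """Logit transform for proportions in (0, 1)."""
    return math.log(p / (1 - p))

p = 0.25
print(round(box_cox(p, 0.5), 4),     # → -1.0
      round(arcsine(p), 4),          # → 0.5236
      round(logit(p), 4))            # → -1.0986
```

Because arcsine and logit are fixed monotone maps while Box-Cox depends on an estimated lambda, the robustness difference the paper reports is plausible: a poorly chosen (especially negative) lambda distorts the F statistic, whereas the fixed transforms cannot.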
Abstract: In this study, we developed an algorithm for detecting seam cracks in a steel plate. Seam cracks are generated in the edge region of a steel plate. We used a Gabor filter and an adaptive double threshold method to detect them. To reduce the number of pseudo defects, features based on the shape of seam cracks were used. To evaluate the performance of the proposed algorithm, we tested 989 images with seam cracks and 9470 defect-free images. The experimental results show that the proposed algorithm is suitable for detecting seam cracks, although it should be improved to increase the true positive rate.
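A double (hysteresis) threshold on a filter response can be sketched as follows. The adaptation rule here, deriving both thresholds from the response mean and standard deviation, is an assumed stand-in for the paper's adaptive scheme:

```python
from collections import deque

# Double-threshold sketch: pixels above the high threshold seed defects;
# pixels above the low threshold survive only if 4-connected to a seed.
# Thresholds adapt to the response statistics (assumed rule).
def double_threshold(resp, k_hi=2.0, k_lo=1.0):
    vals = [v for row in resp for v in row]
    mean = sum(vals) / len(vals)
    std = (sum((v - mean) ** 2 for v in vals) / len(vals)) ** 0.5
    hi, lo = mean + k_hi * std, mean + k_lo * std
    h, w = len(resp), len(resp[0])
    strong = [(r, c) for r in range(h) for c in range(w)
              if resp[r][c] >= hi]
    keep, queue = set(strong), deque(strong)
    while queue:                       # grow from strong pixels
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < h and 0 <= nc < w and (nr, nc) not in keep
                    and resp[nr][nc] >= lo):
                keep.add((nr, nc))
                queue.append((nr, nc))
    return keep

resp = [[0, 0, 0, 0],
        [0, 5, 9, 0],
        [0, 0, 0, 0]]
mask = double_threshold(resp)
print(sorted(mask))                   # → [(1, 1), (1, 2)]
```

The hysteresis keeps faint crack continuations attached to strong responses while discarding isolated faint pixels, which is exactly the pseudo-defect suppression the shape features then refine further.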
Abstract: An overview of the important aspects of managing and controlling industrial effluent discharges to public sewers, namely sampling, characterization, quantification and legislative controls, is presented. The findings have been validated by means of a case study covering three industrial sectors: the tanning, textile finishing and food processing industries. Industrial effluent discharges were found to be best monitored by systematic and automatic sampling and quantified using water meter readings corrected for evaporative and consumptive losses. Based on the treatment processes employed in the publicly owned treatment works and the chemical oxygen demand and biochemical oxygen demand levels obtained, the effluent from all three industrial sectors studied was found to lie in the toxic zone. Thus, physico-chemical treatment of these effluents is required to bring them into the biodegradable zone. KL values (quoted to base e) were greater than 0.50 day⁻¹, compared to 0.39 day⁻¹ for typical municipal wastewater.
Abstract: In this paper we develop a sequential life test approach applied to a modified low-alloy high-strength steel part used in highway overpasses in Brazil. We consider two possible underlying sampling distributions: the Normal and the Inverse Weibull models. The minimum life is taken to be zero. We use the two underlying models to analyze a fatigue life test situation, comparing the results obtained from both. Since a major chemical component of this low-alloy high-strength steel part has been changed, there is little information available about the possible values of the parameters of the corresponding Normal and Inverse Weibull underlying sampling distributions. To estimate the shape and scale parameters of these two sampling models we use a maximum likelihood approach for censored failure data. We also develop a truncation mechanism for the Inverse Weibull and Normal models, and we provide rules for truncating a sequential life testing situation while making one of the two possible decisions at the moment of truncation, that is, accepting or rejecting the null hypothesis H0. An example illustrates the proposed truncated sequential life testing approach for the Inverse Weibull and Normal models.
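The censored-data likelihood for the Inverse Weibull case can be sketched directly. Taking F(t) = exp(-(θ/t)^β), failures contribute the density and right-censored units the survival probability; a crude grid search stands in for a proper numerical optimizer, and the data are hypothetical:

```python
import math

# Censored log-likelihood for the Inverse Weibull model
# F(t) = exp(-(theta/t)**beta), maximized by a coarse grid search.
# Failure and censoring times are illustrative, not the paper's data.
def loglik(beta, theta, failures, censored):
    ll = 0.0
    for t in failures:                 # density term ln f(t)
        ll += (math.log(beta) + beta * math.log(theta)
               - (beta + 1) * math.log(t) - (theta / t) ** beta)
    for t in censored:                 # survival term ln(1 - F(t))
        ll += math.log(1 - math.exp(-(theta / t) ** beta))
    return ll

failures = [2.1, 2.9, 3.4, 4.2, 5.0, 6.3]   # observed failure times
censored = [7.0, 7.0]                       # still running at t = 7
best = max(((b / 10, th / 10)
            for b in range(5, 60) for th in range(5, 100)),
           key=lambda p: loglik(p[0], p[1], failures, censored))
print(best)                                 # (shape, scale) estimates
```

In the sequential test itself, likelihoods of this form are evaluated under H0 and H1 after each observation and their ratio is compared to the acceptance and rejection bounds.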
Abstract: The aim of this paper is to explore the prospects of a new approach to mobile phone banking in Libya. The study evaluates customer knowledge of commercial mobile banking in Libya. To examine the relationship between age, occupation and the intention to use mobile banking for commercial purposes, a survey was conducted to gather information from one hundred Libyan bank clients. The results indicate that Libyan customers have accepted the new technology and are ready to use it. No significant joint relationship between age and occupation was found in the intention to use mobile banking in Libya. On the other hand, customers' knowledge about mobile banking has a stronger relationship with that intention. This study has implications for demographic research and the consumer behaviour discipline. It also has profitable implications for banks and managers in Libya, as it will assist in a better understanding of Libyan consumers and their activities when they develop their market strategies and new services.
Abstract: In this paper, we first introduce the new concept of completely semiprime fuzzy ideals of an ordered semigroup S, which extends completely semiprime ideals of an ordered semigroup S, and investigate some of its related properties. In particular, we characterize an ordered semigroup that is a semilattice of simple ordered semigroups in terms of completely semiprime fuzzy ideals. Furthermore, we introduce the notion of semiprime fuzzy ideals of an ordered semigroup S and establish the relations between completely semiprime fuzzy ideals and semiprime fuzzy ideals of S. Finally, we give a characterization of prime fuzzy ideals of an ordered semigroup S and show that a nonconstant fuzzy ideal f of an ordered semigroup S is prime if and only if f is two-valued and max{f(a), f(b)} = inf f((aSb]), ∀a, b ∈ S.
Abstract: This paper presents a new technique for compensating the effect of parameter variations in the direct field-oriented control of an induction motor. The proposed method uses adaptive tuning of the synchronous speed to obtain robustness for the field-oriented control. We show that this adaptive tuning makes the direct field-oriented control robust to changes in rotor resistance, load torque and rotational speed. The effectiveness of the proposed control scheme is verified by numerical simulations, whose results show good performance compared to the usual direct field-oriented control.
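The quantity the tuning acts on can be sketched from the standard field-oriented control relations: the synchronous speed is the rotor speed plus a slip term that depends on the rotor time constant Tr = Lr/Rr, so any drift in rotor resistance biases the slip estimate and misorients the field. The machine numbers below are illustrative, not the paper's parameters:

```python
# Standard slip relation in field-oriented control: an error in rotor
# resistance Rr shifts the slip estimate and hence the synchronous
# speed used for orientation. All numerical values are illustrative.
def synchronous_speed(omega_r, i_qs, i_ds, Lr, Rr):
    """omega_s = omega_r + i_qs / (Tr * i_ds), with Tr = Lr / Rr."""
    Tr = Lr / Rr                       # rotor time constant
    omega_slip = i_qs / (Tr * i_ds)    # slip speed estimate
    return omega_r + omega_slip

nominal = synchronous_speed(100.0, 4.0, 2.0, 0.2, 1.0)
heated = synchronous_speed(100.0, 4.0, 2.0, 0.2, 1.5)   # Rr up 50 %
print(nominal, heated)    # the slip term grows as Rr rises
```

The 50% rise in rotor resistance from heating shifts the required synchronous speed noticeably; it is this shift that the paper's adaptive tuning tracks online to keep the field correctly oriented.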