Abstract: Nature conducts its actions in a very private manner. Classical science has made great efforts to reveal these actions, but it can experiment only with what can be observed directly. Quantum science works well beyond the scope of classical science. It is based on postulates such as the qubit, the superposition of states, entanglement, measurement, and the evolution of states, which are briefly described in the present paper. One application of quantum computing, the implementation of a novel quantum evolutionary algorithm (QEA) to automate the timetabling problem of Dayalbagh Educational Institute (Deemed University), is also presented in this paper. Constructing a good timetable is a scheduling problem: it is an NP-hard, multi-constrained, complex, combinatorial optimization problem whose solution cannot be obtained in polynomial time. The QEA applies genetic operators to Q-bits, together with a quantum-gate update operator that is introduced as a variation operator to converge toward better solutions.
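The Q-bit encoding and rotation-gate update that QEAs of this kind typically rely on can be sketched as follows. This is an illustrative sketch of the general scheme, not the paper's implementation; the rotation-angle schedule, the observation step, and all names are assumptions.

```python
import math
import random

def observe(qbits):
    """Collapse each Q-bit (alpha, beta) to a classical bit.

    A Q-bit holds amplitudes with alpha^2 + beta^2 = 1; the
    probability of observing '1' is beta^2.
    """
    return [1 if random.random() < beta ** 2 else 0
            for _, beta in qbits]

def rotate(qbits, best, observed, delta=0.05 * math.pi):
    """Rotation-gate update: nudge each Q-bit toward the bit value
    of the best solution found so far (a common QEA variation
    operator; the fixed angle here is illustrative)."""
    updated = []
    for (alpha, beta), b, x in zip(qbits, best, observed):
        theta = delta if b != x else 0.0      # rotate only on disagreement
        if b == 0:
            theta = -theta                    # rotate toward |0>
        a = alpha * math.cos(theta) - beta * math.sin(theta)
        bb = alpha * math.sin(theta) + beta * math.cos(theta)
        updated.append((a, bb))
    return updated

# Start in uniform superposition: alpha = beta = 1/sqrt(2).
n = 8
qbits = [(1 / math.sqrt(2), 1 / math.sqrt(2))] * n
best = [1] * n                                # hypothetical best timetable encoding
qbits = rotate(qbits, best, observe(qbits))
```

Because the rotation is unitary, every updated pair still satisfies alpha^2 + beta^2 = 1, and repeated updates bias future observations toward the best solution found so far.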
Abstract: Droughts are complex, natural hazards that, to a
varying degree, affect some parts of the world every year. The range
of drought impacts is related to drought occurring in different stages
of the hydrological cycle, and different types of droughts, such as meteorological, agricultural, hydrological, and socioeconomic, are usually distinguished. Streamflow drought was analyzed by
the truncation-level method (at the 70% level) on daily discharges
measured in 54 hydrometric stations in southwestern Iran. Frequency
analysis was carried out for annual maximum series (AMS) of
drought deficit volume and duration series. Some factors including
physiographic, climatic, geologic, and vegetation cover were studied
as influential factors in the regional analysis. According to the results
of factor analysis, the six most effective factors were identified as area,
rainfall from December to February, the percent of area with
Normalized Difference Vegetation Index (NDVI)
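The truncation-level method described above can be sketched as follows, taking the "70% level" to mean the discharge exceeded 70% of the time (a common reading, assumed here; the function names and the toy flow series are illustrative):

```python
def q70_threshold(discharges):
    """Truncation level Q70: the daily discharge exceeded 70% of
    the time (i.e., the 30th percentile of the flow record)."""
    s = sorted(discharges)
    idx = int(0.30 * (len(s) - 1))
    return s[idx]

def drought_events(discharges, threshold):
    """Extract (duration, deficit volume) for each run of days on
    which flow stays below the truncation level."""
    events, duration, deficit = [], 0, 0.0
    for q in discharges:
        if q < threshold:
            duration += 1
            deficit += threshold - q      # daily deficit below the level
        elif duration:
            events.append((duration, deficit))
            duration, deficit = 0, 0.0
    if duration:
        events.append((duration, deficit))
    return events

flows = [12, 11, 9, 3, 2, 4, 10, 15, 2, 1, 8, 13]   # toy daily discharges
t = q70_threshold(flows)
events = drought_events(flows, t)
# An annual maximum series (AMS) would then take, per year, the
# largest deficit volume and the longest duration among these events.
```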
Abstract: Rapid Prototyping (RP) is a technology that produces models and prototype parts from 3D CAD model data, CT/MRI scan data, and model data created by 3D object digitizing systems. There are several RP processes, such as Stereolithography (SLA), Solid Ground Curing (SGC), Selective Laser Sintering (SLS), Fused Deposition Modeling (FDM), and 3D Printing (3DP); among them, the SLS and FDM processes are used to fabricate patterns of custom cranial implants. RP technology is useful in engineering and biomedical applications. In engineering it is helpful for product design, tooling, and manufacture. RP biomedical applications include the design and development of medical devices, instruments, prosthetics, and implants; it is also helpful in planning complex surgical operations. The traditional approach limits the full appreciation of various bony structure movements, so with custom implants produced this way it is difficult to measure the anatomy of the parts and to analyze changes in facial appearance accurately. Cranioplasty is the surgical correction of a defect in cranial bone by implanting a metal or plastic replacement to restore the missing part. This paper presents a comparative study of the dimensional error of CAD and SLS RP models for the reconstruction of a cranial defect, comparing the virtual CAD model with the physical RP model.
Abstract: The design of a fixed-parameter robust STATCOM controller for a multi-machine power system through an H-infinity based loop-shaping procedure is presented. The trial-and-error part of the graphical loop-shaping procedure has been eliminated by embedding a particle swarm optimization (PSO) technique in the design loop. Robust controllers were designed considering the detailed dynamics of the multi-machine system, and the results were compared with reduced-order models. The robust strategy employing loop-shaping and PSO algorithms was observed to provide a very good damping profile over a wide range of operation and for various disturbance conditions.
Abstract: The article deals with the technical support of intracranial single-unit activity measurement. The parameters of the whole measuring set were tested in order to ensure optimal conditions for extracellular single-unit recording. Metal microelectrodes for single-unit measurement were tested during animal experiments. From the signals recorded during these experiments, requirements for the measuring set parameters were defined. The impedance parameters of the metal microelectrodes were measured, and the frequency-gain and intrinsic noise properties of the preamplifier and amplifier were verified. The measurement and description of extracellular single-unit activity could help in the prognosis of brain tissue damage recovery.
Abstract: The availability of high-dimensional biological datasets, such as those from gene expression, proteomic, and metabolic experiments, can be leveraged for the diagnosis and prognosis of diseases. Many classification methods in this area have been studied to predict disease states and discriminate between predefined classes, such as patients with a specific disease versus healthy controls. However, most of the existing research focuses on a single dataset; there is a lack of generic comparisons between classifiers that might guide biologists or bioinformaticians in selecting the proper algorithm for new datasets. In this study, we compare the performance of popular classifiers, namely Support Vector Machine (SVM), Logistic Regression, k-Nearest Neighbor (k-NN), Naive Bayes, Decision Tree, and Random Forest, on mock datasets. We mimic common biological scenarios by simulating various proportions of real discriminating biomarkers and different effect sizes thereof. The results show that SVM performs quite stably and reaches a higher AUC than the other methods, which may be explained by the ability of SVM to minimize the probability of error. Moreover, Decision Tree, with its good applicability for diagnosis and prognosis, shows good performance in our experimental setup. Logistic Regression and Random Forest, however, depend strongly on the ratio of discriminators and perform better with a higher number of discriminators.
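The simulation setup the abstract describes, varying the proportion of discriminating biomarkers and their effect size, could be sketched as below. This is a minimal stand-alone sketch under assumed conventions (balanced classes, Gaussian noise, a fixed mean shift); the actual classifier comparison would then run each method on such data.

```python
import random

def mock_dataset(n_samples, n_features, frac_discriminating, effect_size, seed=42):
    """Generate a two-class mock dataset: a chosen fraction of
    features carries a mean shift of `effect_size` in class 1;
    the remaining features are pure noise (illustrative setup,
    not the paper's exact simulation)."""
    rng = random.Random(seed)
    n_disc = int(frac_discriminating * n_features)
    X, y = [], []
    for i in range(n_samples):
        label = i % 2                        # balanced classes
        row = [rng.gauss(0.0, 1.0) for _ in range(n_features)]
        if label == 1:
            for j in range(n_disc):          # shift the discriminating features
                row[j] += effect_size
        X.append(row)
        y.append(label)
    return X, y

X, y = mock_dataset(n_samples=100, n_features=20,
                    frac_discriminating=0.25, effect_size=1.0)
```

Sweeping `frac_discriminating` and `effect_size` over a grid and scoring each classifier by AUC would reproduce the kind of comparison the abstract reports.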
Abstract: Aligned and random nanofibrous scaffolds of PVA/PCL/nHA were fabricated by the electrospinning method, and the composite nanofibrous scaffolds were subjected to detailed analysis. Morphological investigations revealed that the prepared nanofibers have a uniform morphology, with average fiber diameters of 135.5 and 290 nm for the aligned and random scaffolds, respectively. The obtained scaffolds have a porous structure, with porosities of 88 and 76% for the random and aligned nanofibers, respectively. Furthermore, FTIR analysis demonstrated strong intermolecular interactions between the PVA/PCL/nHA molecules. Mechanical characterization showed that aligning the nanofibers could significantly improve the rigidity of the resultant biocomposite nanofibrous scaffolds.
Abstract: The Asiatic Houbara (Chlamydotis macqueenii) is a flagship and vulnerable species. In-situ conservation of this threatened species demands knowledge of its habitat selection. The aim of this study was to determine the habitat variables influencing the birds' wintering and breeding site selection in semi-arid central Iran. Habitat features of the detected nest and pellet sites were compared with paired and random plots by quantifying a number of habitat variables. In wintering habitat use at the micro scale, houbara selected sites where vegetation cover was significantly lower compared to control sites (p < 0.001). Areas with a low number of larger plant species (p = 0.03) that were not too close to a vegetation patch (p
Abstract: In seismic surveys, information on the velocities of the compression wave (Vp) and the shear wave (Vs) is very useful, especially during seismic interpretation. Previous studies showed that the Vp and Vs values determined by these two methods differ from each other but offer good approximations. In this study, the Vp and Vs of consolidated granite rock were studied using the ultrasonic testing method and the seismic refraction method. In ultrasonic testing, two different rock conditions were used: dry and wet. The differences between the Vp and Vs values obtained by ultrasonic testing and by seismic refraction were investigated. The effect of water content in the granite rock on the Vp and Vs values during ultrasonic testing was also measured. Within this work, the tolerance of the differences between the seismic wave velocities obtained from ultrasonic testing and those obtained from seismic refraction was also measured and investigated.
Abstract: System-level design based on high-level abstractions
is becoming increasingly important in hardware and embedded
system design. This paper analyzes meta-design techniques oriented toward developing meta-programs and meta-models for well-understood
domains. Meta-design techniques include meta-programming and
meta-modeling. At the programming level of the design process, meta-design
means developing generic components that are usable in a
wider context of application than original domain components. At the
modeling level, meta-design means developing design patterns that
describe general solutions to the common recurring design problems,
and meta-models that describe the relationship between different
types of design models and abstractions. The paper describes and
evaluates the implementation of meta-design in hardware design
domain using object-oriented and meta-programming techniques.
The presented ideas are illustrated with a case study.
Abstract: Today, cancer remains one of the major diseases that lead to death. The main obstacle in chemotherapy, a principal cancer treatment, is toxicity to normal cells due to Multidrug Resistance (MDR) after the use of anticancer drugs. The proposed solution to this problem is the use of cinchona alkaloids as MDR efflux inhibitors, delivered together with anticancer drugs encapsulated in polymeric nanoparticles. The particles were prepared by the hydration method. The nanoparticles were characterized by particle size, zeta potential, entrapment efficiency, and in vitro drug release. The combination nanoparticles ranged from 29 to 45 nm in size with a neutral surface charge. Entrapment efficiency was above 87% for quinine, quinidine, or cinchonidine in combination with etoposide. The release tests showed that the cinchona alkaloids were released faster than etoposide. Collectively, cinchona alkaloids can be packaged along with etoposide in nanomicelles for better cancer therapy.
Abstract: Two short sediment cores collected from mangrove
areas of Manori and Thane creeks along Mumbai coast were analysed
for sediment composition and metals (Fe, Mn, Cu, Pb, Co, Ni, Zn, Cr
and V). Statistical analysis of the Pearson correlation matrix showed a significant relationship between metal concentration and finer grain size in Manori creek, while poor correlation was
observed in Thane creek. Based on the enrichment factor, the present
metal to background metal ratios clearly reflected maximum
enrichment of Cu and Pb in Manori creek and Mn in Thane creek.
The calculated geoaccumulation index indicates that the study area is unpolluted with respect to Fe, Mn, Co, Ni, Zn, and Cr in both cores, while moderately polluted with Cu and Pb in Manori creek.
Based on contamination degree, both the core sediments were found
to be considerably contaminated with metals.
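The two indices used above have standard definitions: the Mueller geoaccumulation index Igeo = log2(Cn / (1.5 Bn)) and the Fe-normalized enrichment factor EF = (M/Fe)_sample / (M/Fe)_background. A minimal sketch, with purely illustrative concentrations (not the paper's measurements):

```python
import math

def igeo(c_sample, c_background):
    """Mueller geoaccumulation index: Igeo = log2(Cn / (1.5 * Bn)).
    By the commonly used classes, Igeo <= 0 is 'unpolluted' and
    1 < Igeo <= 2 is 'moderately polluted'."""
    return math.log2(c_sample / (1.5 * c_background))

def enrichment_factor(metal, fe, metal_bg, fe_bg):
    """Fe-normalized enrichment factor:
    EF = (M/Fe)_sample / (M/Fe)_background."""
    return (metal / fe) / (metal_bg / fe_bg)

# Illustrative numbers only:
igeo_val = igeo(45.0, 45.0)   # Cn == Bn gives log2(1/1.5) < 0: unpolluted class
ef_val = enrichment_factor(60.0, 30000.0, 20.0, 40000.0)
```

Note the factor 1.5 in Igeo, which compensates for natural fluctuations of the background value; a sample at exactly the background level therefore still classes as unpolluted.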
Abstract: In modern distributed software systems, communication among the composing parts is a critical issue, but extending conventional programming languages with general-purpose communication constructs seems difficult to realize. As a consequence, there is a growing gap between the abstraction level required by distributed applications and the concepts provided by the platforms that enable communication. This work discusses how the Model Driven Software Development approach can be considered a mature technology for automatically generating the schematic, communication-related part of applications, while at the same time providing high-level specialized languages useful in all phases of software production. To achieve this goal, a stack of languages (meta-metamodels) has been introduced in order to describe, at different levels of abstraction, the collaborative behavior of generic entities in terms of communication actions related to a taxonomy of messages. Finally, the generation of communication platforms is viewed as a form of specification of language semantics that provides executable models of applications together with model-checking support and effective runtime environments.
Abstract: This paper presents a new approach for busbar protection with stable operation during current transformer saturation, using a fuzzy-neuro technique and symmetrical components theory. The technique uses the symmetrical components of the current signals to learn the hidden relationships in the input patterns. Simulation studies are performed, and the influence of changing system parameters such as fault inception and source impedance is studied. Details of the design procedure and the results of performance studies with the proposed relay are given in the paper. An analysis of the performance of the proposed technique during CT saturation conditions is presented. The performance of the technique was investigated for a variety of operating conditions and for several busbar configurations. Data generated by EMTDC simulations of model power systems were used in the investigations. The results indicate that the proposed technique is stable during CT saturation conditions.
Abstract: Electrocardiogram (ECG) is considered to be the
backbone of cardiology. ECG is composed of P, QRS & T waves and
information related to cardiac diseases can be extracted from the
intervals and amplitudes of these waves. The first step in extracting
ECG features starts from the accurate detection of R peaks in the
QRS complex. We have developed a robust R wave detector using
wavelets. The wavelets used for detection are from the Daubechies and Symlet families. The method requires no preprocessing and therefore needs only the raw ECG recordings for detection. The data have been taken from the MIT-BIH arrhythmia database, and the signals from Lead II have been analyzed. MATLAB 7.0 has been used to develop the algorithm. The ECG signal under
test has been decomposed to the required level using the selected
wavelet and the selection of detail coefficient d4 has been done based
on energy, frequency and cross-correlation analysis of decomposition
structure of ECG signal. The robustness of the method is apparent
from the obtained results.
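The core idea, large wavelet detail coefficients marking the steep slopes of the QRS complex, can be illustrated with a one-level Haar detail transform and a simple threshold. This is a deliberately simplified stand-in for the Daubechies/Symlet decomposition and energy-based coefficient selection the abstract describes; all names and the threshold rule are assumptions.

```python
import math

def haar_detail(signal):
    """One-level Haar detail coefficients:
    d[k] = (x[2k] - x[2k+1]) / sqrt(2). Sharp QRS slopes
    produce large |d| values (Haar used here as a simple
    stand-in for the Daubechies/Symlet wavelets)."""
    return [(signal[2 * k] - signal[2 * k + 1]) / math.sqrt(2)
            for k in range(len(signal) // 2)]

def r_candidates(signal, factor=3.0):
    """Mark sample positions whose detail coefficient exceeds
    `factor` times the mean absolute detail magnitude."""
    d = haar_detail(signal)
    thresh = factor * sum(abs(v) for v in d) / len(d)
    return [2 * k for k, v in enumerate(d) if abs(v) > thresh]

ecg = [0.0] * 10 + [0.2, 1.6] + [0.0] * 12   # flat baseline, one R-like upstroke
peaks = r_candidates(ecg)                     # candidate R positions
```

A real detector would decompose to a deeper level (the abstract selects detail coefficient d4) and post-process candidates with a refractory period so each QRS yields a single R position.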
Abstract: An empirical study of web applications that use
software frameworks is presented here. The analysis is based on two
approaches. In the first, developers using such frameworks are
required, based on their experience, to assign weights to parameters
such as database connections. In the second approach, a performance
testing tool, OpenSTA, is used to compute start time and other such
measures. From such an analysis, it is concluded that open source
software is superior to proprietary software. The motivation behind
this research is to examine ways in which a quantitative assessment
can be made of software in general and frameworks in particular.
Concepts such as metrics and architectural styles are discussed along
with previously published research.
Abstract: The Muslim faith requires individuals to fast between
the hours of sunrise and sunset during the month of Ramadan. Our
recent work has concentrated on some of the changes that take place
during the daytime when fasting. A questionnaire was developed to
assess subjective estimates of physical, mental and social activities,
and fatigue. Four days were studied: in the weeks before and after
Ramadan (control days) and during the first and last weeks of
Ramadan (experimental days). On each of these four days, this
questionnaire was given several times during the daytime and once
after the fast had been broken and just before individuals retired at
night.
During Ramadan, daytime mental, physical and social activities
all decreased below control values but then increased to above-control values in the evening. The desires to perform physical and
mental activities showed very similar patterns. That is, individuals
tried to conserve energy during the daytime in preparation for the
evenings when they ate and drank, often with friends. During
Ramadan also, individuals were more fatigued in the daytime and
napped more often than on control days. This extra fatigue probably
reflected decreased sleep, individuals often having risen earlier
(before sunrise, to prepare for fasting) and retired later (to enable
recovery from the fast).
Some physiological measures and objective measures of
performance (including the response to a bout of exercise) have also
been investigated. Urine osmolality fell during the daytime on
control days as subjects drank, but rose in Ramadan to reach values
at sunset indicative of dehydration. Exercise performance was also
compromised, particularly late in the afternoon when the fast had
lasted several hours. Self-chosen exercise work-rates fell and a set
amount of exercise felt more arduous. There were also changes in
heart rate and lactate accumulation in the blood, indicative of greater
cardiovascular and metabolic stress caused by the exercise in
subjects who had been fasting. Daytime fasting in Ramadan produces
widespread effects which probably reflect combined effects of sleep
loss and restrictions to intakes of water and food.
Abstract: Subsurface erosion in river banks, despite occurring in various parts of the world, has rarely received attention from researchers. In this paper, the subsurface erosion of vertical banks has been investigated quantitatively. Vertical banks were simulated experimentally using a sandy erodible layer overlaid by a clayey one, under a uniformly distributed constant overhead pressure. The experimental results indicate that the erosion rate of the sandy layer decreases as the overburden increases; likewise, substituting 20% of the coarse (3.5 mm) sand bed material with fine material (1.4 mm) may reduce the erosion rate by one-third. This signifies the importance of bed material composition in the subsurface erosion of sandy layers in river banks.
Abstract: Some meta-schedulers query the information systems of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts, due to changes in scheduling priorities. The MSR scheme is based on Multiple Simultaneous Requests and can take advantage of opportunities resulting from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and than the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year. Statistics are provided for this usage and demonstrate its capacity to reliably achieve a substantial reduction in execution time under production conditions.
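The contrast between single-site submission based on possibly stale information and the multiple-simultaneous-requests idea can be sketched as follows. The cluster names and wait times are hypothetical, and this captures only the core idea (run wherever a replica starts first, cancel the rest), not SWARM's or MSR's actual protocols.

```python
def msr_start_time(queue_waits):
    """Multiple Simultaneous Requests: a copy of the job is queued
    on every cluster and runs wherever it starts first; the other
    copies are cancelled (sketch of the scheme's core idea)."""
    site = min(queue_waits, key=queue_waits.get)
    return site, queue_waits[site]

def single_submit_start(queue_waits, estimated):
    """Single submission: the meta-scheduler picks the site that its
    (possibly stale) information system reports as least busy; the
    job then experiences that site's actual wait."""
    site = min(estimated, key=estimated.get)
    return site, queue_waits[site]

actual = {"clusterA": 40, "clusterB": 5, "clusterC": 25}   # hypothetical true waits
stale = {"clusterA": 2, "clusterB": 30, "clusterC": 25}    # outdated estimates
msr = msr_start_time(actual)                  # starts on clusterB after 5
single = single_submit_start(actual, stale)   # sent to clusterA, waits 40
```

The gap between the two start times is exactly the opportunity that stale scheduling information creates and that replicated requests recover.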
Abstract: Many agent-oriented software engineering methodologies have been proposed for software development; however, their application is still limited due to their lack of maturity.
Evaluating the strengths and weaknesses of these methodologies
plays an important role in improving them and in developing new
stronger methodologies. This paper presents an evaluation framework
for agent-oriented methodologies, which addresses six major areas:
concepts, notation, process, pragmatics, support for software
engineering, and marketability. The framework is then used to evaluate the Gaia methodology, to identify its strengths and weaknesses, and to demonstrate the framework's ability to improve agent-oriented methodologies by detecting their weaknesses in detail.