Abstract: The running logs of a process hold valuable information about its executed activity behavior and the logical structure of its generated activities. These informative logs can be extracted, analyzed, and utilized to improve the efficiency of the process's execution. One technique used to accomplish such process improvement is process mining, and mining similar processes is one of its tasks. Rather than mining similar processes directly with a single comparison coefficient or a complicated fitness function, this paper presents a simplified heuristic process-mining algorithm with two similarity comparisons that relatively conform the activity logic sequences (traces) of the mined processes to those of a normalized (regularized) one. The relative process conformance identifies which of the mined processes match the required activity sequences and relationships, so that the mined processes can then be applied, as necessary and sufficient, to process improvement. The first similarity is defined by the relationships in terms of the number of similar activity sequences existing in different processes; the second expresses the degree of similar (identical) activity sequences among the conforming processes. Since these two similarities are defined with respect to typical behavior (activity sequences) occurring in an entire process, the common problems that often appear in other process-conformance techniques, such as the inappropriateness of an absolute comparison and the inability to elicit intrinsic information, are avoided by the relative process comparison presented in this paper. A numerical example illustrates the potential of the proposed algorithm.
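As a rough illustration of the two similarity comparisons, the Python sketch below encodes traces as activity sequences and uses two assumed forms: a count of directly-follows relations shared by two processes, and the fraction of a process's traces that exactly match a trace of the reference (normalized) process. The function names and the choice of directly-follows pairs are illustrative assumptions, not the paper's exact definitions.

```python
def follows_pairs(trace):
    """Directly-follows relations (a, b) extracted from one trace."""
    return {(trace[i], trace[i + 1]) for i in range(len(trace) - 1)}

def shared_sequence_count(proc_a, proc_b):
    """Similarity 1 (assumed form): number of directly-follows
    relations common to both processes' logs."""
    pairs_a = set().union(*(follows_pairs(t) for t in proc_a))
    pairs_b = set().union(*(follows_pairs(t) for t in proc_b))
    return len(pairs_a & pairs_b)

def identical_trace_degree(proc, reference):
    """Similarity 2 (assumed form): fraction of a process's traces
    that exactly match some trace of the reference process."""
    ref = {tuple(t) for t in reference}
    return sum(tuple(t) in ref for t in proc) / len(proc)

# Toy logs: each process is a list of traces (activity sequences).
reference = [["a", "b", "c"], ["a", "c", "b"]]
candidate = [["a", "b", "c"], ["a", "b", "d"]]
sim1 = shared_sequence_count(candidate, reference)
sim2 = identical_trace_degree(candidate, reference)
```

A mined process scoring high on both measures would relatively conform to the reference and be a candidate for the improvement step.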
Abstract: Thousands of masters athletes participate quadrennially in the World Masters Games (WMG), yet this cohort of athletes remains proportionately under-investigated. Given the growing global obesity pandemic and the benefits of physical activity across the lifespan, the BMI trends of this unique population were of particular interest. The nexus between health, physical activity, and aging is complex and has attracted much interest in recent times, owing to the realization that a multifaceted approach is necessary to counteract the obesity pandemic. By investigating age-based trends within a population adhering to competitive sport at older ages, further insight might be gleaned into one of the many factors influencing this relationship. BMI was derived from data gathered on a total of 6,071 masters athletes (51.9% male, 48.1% female) aged 25 to 91 years (mean = 51.5, s = ±9.7) competing at the Sydney World Masters Games (2009). Using linear and loess regression, it was demonstrated that the usual tendency for the prevalence of higher BMI to increase with age was reversed in this sample. This reversal was repeated in both the male-only and female-only subsets, indicating the possibility of an improved BMI profile with increasing age for the sample as a whole and for these individual sub-groups. This evidence of improved classification in one index of health (reduced BMI) for masters athletes, compared to the general population, implies either that this index of health improves with aging due to adherence to sport, or that a reduced BMI is advantageous and contributes to this cohort adhering (or being attracted) to masters sport at older ages.
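The reported age trend can be sketched with an ordinary least-squares fit of BMI against age. The numbers below are fabricated purely to illustrate the direction of the fit (a negative slope, as the study reports for its sample); they are not the study's measurements.

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Illustrative (fabricated) data: weight falling with age at fixed height,
# so BMI decreases with age as in the study's sample.
ages = [30, 40, 50, 60, 70]
bmis = [bmi(w, 1.75) for w in (82, 80, 78, 77, 75)]
slope, intercept = linear_fit(ages, bmis)
```

A negative fitted slope corresponds to the reversal of the usual age trend described above.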
Abstract: It is expected that the ubiquitous era will come soon. A ubiquitous environment has features such as peer-to-peer and nomadic environments, which can be represented by peer-to-peer (P2P) systems and mobile ad-hoc networks (MANETs). The features of P2P systems and MANETs are similar, making it appealing to implement P2P systems in MANET environments. It has been shown, however, that P2P systems designed for wired networks do not perform satisfactorily in mobile ad-hoc environments. This paper therefore proposes a method to improve P2P performance using cross-layer design and the goodness of a node as a peer. The proposed method uses a routing metric as well as a P2P metric to choose favorable peers to connect to, and it utilizes a proactive approach for distributing peer information. According to the simulation results, the proposed method provides a higher query success rate, shorter query response time, and lower energy consumption by constructing an efficient overlay network.
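A minimal sketch of the peer-selection idea: score each candidate by blending a routing-layer component with a P2P-layer component and connect to the best-scoring nodes. The specific metrics (hop count, link stability, shared files) and equal weights below are illustrative assumptions; the paper's actual cross-layer metrics may differ.

```python
def peer_score(hop_count, link_stability, shared_files,
               w_route=0.5, w_p2p=0.5):
    """Assumed combined 'goodness' of a node as a peer: a routing
    component (fewer hops, more stable links are better) blended with
    a P2P component (more shared content is better)."""
    routing = link_stability / (1 + hop_count)
    p2p = shared_files / (1 + shared_files)
    return w_route * routing + w_p2p * p2p

def choose_peers(candidates, k=2):
    """Pick the k highest-scoring candidate peers.
    Each candidate is (node_id, hop_count, link_stability, shared_files)."""
    return sorted(candidates, key=lambda c: peer_score(*c[1:]),
                  reverse=True)[:k]

candidates = [("n1", 1, 0.9, 40),   # close, stable, rich in content
              ("n2", 4, 0.9, 40),   # same content, but far away
              ("n3", 2, 0.2, 5)]    # unstable link, little content
best = choose_peers(candidates)
```

The routing component is what a wired-network P2P overlay would ignore; including it is the cross-layer part of the design.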
Abstract: The wavelet transform is a very powerful tool for image compression. One of its advantages is the provision of both spatial and frequency localization of image energy. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for coding the sign; it is generally assumed that there is no compression gain to be obtained from coding the sign, and only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign information bit of wavelet coefficients may be encoded with an estimated probability of 0.5; the same assumption concerns the refinement information bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of the coefficients are examined to obtain their online probabilities. We use scalar quantization, in which the information on whether a wavelet coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are separately entropy encoded: the sign map and the magnitude map. The refinement information on whether a coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed and simulations are performed on three standard grey-scale images: Lena, Barbara, and Cameraman. Five scales are computed using the biorthogonal 9/7 wavelet transform filter bank. The results are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) and subjective (visual) quality for the three images. It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature and is shown to be very successful in terms of PSNR.
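The separation into a sign map and a magnitude map, and the online sign-probability estimate that replaces the fixed 0.5 assumption, can be sketched as follows. This is illustrative only; the paper's coder works bit-plane by bit-plane and feeds these probabilities to an entropy coder.

```python
def split_sign_magnitude(coeffs):
    """Separate a coefficient array into a sign map (1 for non-negative,
    0 for negative) and a magnitude map, as in separate sign coding (SSC)."""
    signs = [0 if c < 0 else 1 for c in coeffs]
    mags = [abs(c) for c in coeffs]
    return signs, mags

def online_sign_probability(signs):
    """Running estimate of P(sign = +) after each symbol; an adaptive
    entropy coder would use these instead of assuming a fixed 0.5."""
    probs, positives = [], 0
    for i, s in enumerate(signs, start=1):
        positives += s
        probs.append(positives / i)
    return probs

coeffs = [3.1, -0.5, 2.2, -1.7, 0.9]
signs, mags = split_sign_magnitude(coeffs)
probs = online_sign_probability(signs)
```

Whenever the running estimate deviates from 0.5, an arithmetic coder driven by it spends fewer bits on the sign map than the fixed-probability assumption would.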
Abstract: Human immunodeficiency virus (HIV) infection and acquired immunodeficiency syndrome constitute a global pandemic, with cases reported from virtually every country, and remain a common infection in developing countries like India. Microalbuminuria is a manifestation of HIV-associated nephropathy; it may therefore be an early marker of the condition, and screening for its presence may be beneficial. A strikingly high prevalence of microalbuminuria among HIV-infected patients has been described in various studies. Risk factors for clinically significant proteinuria include African-American race, higher HIV ribonucleic acid level, and lower CD4 lymphocyte count. The cardiovascular risk factors of increased systolic blood pressure and increased fasting blood sugar are strongly associated with microalbuminuria in HIV patients. These results suggest that microalbuminuria may be a sign of current endothelial dysfunction and micro-vascular disease, and that there is a substantial risk of future cardiovascular disease events. Possible contributing factors include early kidney disease such as HIV-associated nephropathy, end-organ damage related to the comorbidities of diabetes or hypertension, or more diffuse endothelial cell dysfunction. Nevertheless, after adjustment for non-HIV factors, HIV itself remains a major risk factor: the presence of HIV infection is an independent risk factor for developing microalbuminuria. Cardiovascular risk factors appeared to be stronger predictors of microalbuminuria than markers of HIV severity. Persons with HIV infection and microalbuminuria therefore appear to bear the burden of two separate processes: end-organ damage related to known vascular risk factors, and HIV-specific processes such as direct viral infection of kidney cells. The higher prevalence of microalbuminuria among the HIV-infected could be a harbinger of increased future risks of both kidney and cardiovascular disease, and further study defining its prognostic significance among HIV-infected persons will be essential. Since microalbuminuria appears to predict cardiovascular disease in both diabetic and non-diabetic subjects, it can also be used for early detection of micro-vascular disease in HIV-positive patients and can thus help diagnose the disease at its earliest stage.
Abstract: Corner detection and optical flow are common techniques for feature-based video stabilization. However, these algorithms are computationally expensive and must run at a reasonable rate. This paper presents an algorithm for discarding irrelevant feature points and maintaining the rest for future use, so as to reduce the computational cost. The algorithm starts by initializing a maintained set. The feature points in the maintained set are examined for their accuracy for modeling, and corner detection is performed only when the maintained points are insufficiently accurate for further modeling. Optical flows are then computed from the maintained feature points toward the consecutive frame. After that, a motion model is estimated based on a simplified affine motion model and the least-squares method, with outliers belonging to moving objects present; studentized residuals are used to eliminate such outliers. The model estimation and elimination steps repeat until no more outliers are identified. Finally, the entire algorithm repeats along the video sequence, with the points remaining from the previous iteration used as the maintained set. As a practical application, efficient video stabilization can be achieved by exploiting the computed motion models. Our study shows that the number of times corner detection needs to be performed is greatly reduced, significantly lowering the computational cost. Moreover, optical flow vectors are computed only for the maintained feature points, not for outliers, further reducing the cost. In addition, the reduced set of feature points is sufficient for tracking background objects, as demonstrated in a simple video stabilizer based on the proposed algorithm.
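The estimate-and-eliminate loop can be sketched with a deliberately simplified motion model (pure translation standing in for the paper's simplified affine model) and studentized residuals for outlier rejection. The cutoff of 2.0 is an illustrative assumption.

```python
import statistics

def fit_translation(src, dst):
    """Least-squares translation between matched point sets:
    the mean displacement over all point pairs."""
    dx = statistics.mean(b[0] - a[0] for a, b in zip(src, dst))
    dy = statistics.mean(b[1] - a[1] for a, b in zip(src, dst))
    return dx, dy

def robust_motion(src, dst, cutoff=2.0):
    """Refit the model, dropping point pairs whose studentized residual
    exceeds `cutoff`, until no more outliers are identified."""
    pts = list(zip(src, dst))
    while True:
        dx, dy = fit_translation([p for p, _ in pts], [q for _, q in pts])
        res = [((q[0] - p[0] - dx) ** 2 + (q[1] - p[1] - dy) ** 2) ** 0.5
               for p, q in pts]
        sd = statistics.pstdev(res)
        if sd == 0:  # perfect fit: nothing left to eliminate
            return (dx, dy), [p for p, _ in pts]
        kept = [pq for pq, r in zip(pts, res) if r / sd <= cutoff]
        if len(kept) == len(pts):  # no outlier removed this round
            return (dx, dy), [p for p, _ in pts]
        pts = kept

# Four background points translate by (1, 0); one point on a moving
# object jumps by (10, 0) and should be rejected as an outlier.
src = [(0, 0), (1, 0), (2, 0), (3, 0), (5, 5)]
dst = [(1, 0), (2, 0), (3, 0), (4, 0), (15, 5)]
model, kept = robust_motion(src, dst)
```

The surviving points play the role of the maintained set for the next frame.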
Abstract: Using activity theory, organisational theory and
didactics as theoretical foundations, a comprehensive model of the
organisational dimensions relevant for learning and knowledge
transfer will be developed. In a second step, a Learning Assessment
Guideline will be elaborated. This guideline will be designed to
permit a targeted analysis of organisations to identify the status quo
in those areas crucial to the implementation of learning and
knowledge transfer. In addition, this self-analysis tool will enable
learning managers to select adequate didactic models for e- and
blended learning. As part of the European Integrated Project
"Process-oriented Learning and Information Exchange" (PROLIX),
this model of organisational prerequisites for learning and knowledge
transfer will be empirically tested in four profit and non-profit
organisations in Great Britain, Germany and France (to be finalized
in autumn 2006). The findings concern not only the capability of the
model of organisational dimensions, but also the predominant
perceptions of and obstacles to learning in organisations.
Abstract: Three sulphonic acid-doped polyanilines were synthesized through chemical oxidation at low temperature (0-5 °C), and the potential of these polymers as sensing agents for O2 gas detection, in terms of fluorescence quenching, was studied. Sulphuric acid, dodecylbenzene sulphonic acid (DBSA), and camphor sulphonic acid (CSA) were used as doping agents. All polymers obtained were dark green powders and were characterized by Fourier transform infrared spectroscopy, ultraviolet-visible absorption spectroscopy, thermogravimetric analysis, elemental analysis, differential scanning calorimetry, and gel permeation chromatography. These characterizations showed that the polymers were successfully synthesized, with mass recoveries for sulphuric acid-doped polyaniline (SPAN), DBSA-doped polyaniline (DBSA-doped PANI), and CSA-doped polyaniline (CSA-doped PANI) of 71.40%, 75.00%, and 39.96%, respectively. The doping levels of SPAN, DBSA-doped PANI, and CSA-doped PANI, determined from elemental analysis, were 32.86%, 33.13%, and 53.96%, respectively. Sensing tests were carried out on polymer samples in solution and film form using a fluorescence spectrophotometer. Both the polymer solutions and the polymer films showed a positive response to O2 exposure, and all were fully regenerated using N2 gas within a 1-hour period. A photostability study showed that all polymer solution and film samples were stable towards light when continuously exposed to a xenon lamp for 9 hours. The relative standard deviation (RSD) values for repeatability were 0.23%, 0.64%, and 0.76% for the SPAN, DBSA-doped PANI, and CSA-doped PANI solutions, respectively; the corresponding RSD values for reproducibility were 2.36%, 6.98%, and 1.27%. The SPAN, DBSA-doped PANI, and CSA-doped PANI films showed the same pattern, with repeatability RSD values of 0.52%, 4.05%, and 0.90%, and reproducibility RSD values of 2.91%, 10.05%, and 7.42%, respectively. The effect of flow rate on response time was studied at three rates: 0.25 mL/s, 1.00 mL/s, and 2.00 mL/s. The results showed that the higher the flow rate, the shorter the response time.
Abstract: Many studies have shown that parallelization decreases efficiency [1], [2], and there are many reasons for this decrement. This paper investigates the reasons that appear in the context of parallel data integration. Integration processes generally cannot be allocated to packages of identical size (i.e., tasks of identical complexity), because unknown, heterogeneous input data result in variable task lengths. Process delay is defined by the slowest processing node and has a detrimental effect on the total processing time. Using a real-world example, this study shows that while process delay initially increases with the introduction of more nodes, it ultimately decreases again after a certain point. The example makes use of the cloud computing platform Hadoop and is run inside Amazon's EC2 compute cloud. A stochastic model is set up which can explain this effect.
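The effect can be probed with a toy simulation: distribute variable-length tasks round-robin over n nodes and measure the delay as the gap between the slowest node (which defines the total processing time) and the average node load. The task lengths and the round-robin policy below are illustrative assumptions, not the paper's Hadoop workload.

```python
def process_delay(task_lengths, n_nodes):
    """Assign tasks round-robin to nodes; the delay is the gap between
    the slowest node and the average node load."""
    loads = [0.0] * n_nodes
    for i, t in enumerate(task_lengths):
        loads[i % n_nodes] += t
    return max(loads) - sum(loads) / n_nodes

# Heterogeneous task lengths, as produced by unknown input data.
tasks = [5, 1, 8, 2, 9, 3, 7, 4, 6, 2, 8, 1]
delays = {n: process_delay(tasks, n) for n in (1, 2, 3, 4, 6, 12)}
```

With one node there is no delay at all; adding a few nodes creates load imbalance and hence delay, while at high node counts each node carries so little work that the gap shrinks again, which is the shape of the effect the study reports.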
Abstract: Oxygen and carbon isotope records of multi-species planktonic and benthic foraminifera and of bulk carbonate samples from Central Java, Indonesia, demonstrate that warm sea surface temperatures occurred during the Miocene. Planktonic δ18O values from this study are consistently lighter (-4 to -3 ‰ PDB) than in previous studies, indicating that the Miocene sea surface temperature in this area was warmer than at other tropical/equatorial localities. A surprising decrease in oxygen isotopic composition was recorded at ±14 Ma, where δ18O values reach -4.87 ‰ PDB for Orbulina universa, -5.02 ‰ PDB for Globigerinoides sacculifer, and -4.30 ‰ PDB for Globoquadrina dehiscens; we interpret this event as the Middle Miocene Optimum. We interpret the warming of the sea surface as related to the development of the Western Pacific Warm Pool, with warm water from the Pacific Ocean appearing to flow through the Indonesian seaway throughout the Miocene. Our results also show a sudden increase in the oxygen isotope values of the planktonic, benthic, and bulk carbonate samples from ±12 Ma; this cooling of surface waters correlates with the Late Miocene global cooling climate or, we suggest, with the closing of the Indonesian Gateway.
Abstract: An ecofriendly Citrus paradisi peel extract-mediated synthesis of TiO2 nanoparticles under sonication is reported. UV-vis spectroscopy, transmission electron microscopy, dynamic light scattering, and X-ray analyses are performed to characterize the TiO2 nanoparticles formed. The nanoparticles are almost spherical in shape, with sizes of 60-140 nm, and the XRD peak at 2θ = 25.363° confirms the characteristic facets of the anatase form. The synthesized nanocatalyst is highly active in the decomposition of methyl orange (64 mg/L) in sunlight (~73% in 2.5 h).
Abstract: Microparticle carrier systems made from naturally occurring polymers based on the chitosan/casein system appear to be promising carriers for the sustained release of orally and parenterally administered drugs. In the current study, we followed a microencapsulation technique based on an aqueous coacervation method to prepare chitosan/casein microparticles of compositions 1:1, 1:2, and 1:5 incorporating chloramphenicol. Glutaraldehyde was used as a chemical cross-linking agent. The microparticles were prepared by an aerosol method and studied by optical microscopy, infrared spectroscopy, thermogravimetric analysis, swelling studies, and drug-release studies at various pH values. The percentage swelling of the polymers is found to be in the order pH 4 > pH 10 > pH 7, and an increase in casein composition decreases the swelling percentage. The drug-release studies follow the same order.
Abstract: The well-known NP-complete Traveling Salesman Problem (TSP) is coded in genetic form, and a software system is proposed to determine the optimum route using the genetic algorithm technique. The system starts from a matrix of the calculated Euclidean distances between the cities to be visited and a randomly chosen city order as the initial population. New generations are then created repeatedly until a stopping criterion is met, with the search guided by a solution-evaluation function.
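A compact sketch of the described system: a Euclidean distance matrix, a random initial population of city orders, and repeated creation of new generations via order crossover and swap mutation, with tour length as the solution-evaluation function. Population size, mutation rate, and the elitism scheme are illustrative choices, not the paper's parameters.

```python
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour under the distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def order_crossover(p1, p2):
    """OX crossover: copy a slice of p1, fill the rest in p2's order."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    rest = [c for c in p2 if c not in child]
    for i in range(n):
        if child[i] is None:
            child[i] = rest.pop(0)
    return child

def mutate(tour, rate=0.2):
    """Swap two cities with the given probability."""
    if random.random() < rate:
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]
    return tour

def ga_tsp(dist, pop_size=30, generations=200):
    """Elitist GA: keep the best half, breed the rest from it."""
    n = len(dist)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(t, dist))
        elite = pop[: pop_size // 2]
        pop = elite + [mutate(order_crossover(*random.sample(elite, 2)))
                       for _ in range(pop_size - len(elite))]
    return min(pop, key=lambda t: tour_length(t, dist))

# Four cities on a unit square; the optimal tour is the perimeter (length 4).
random.seed(0)
cities = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[math.dist(a, b) for b in cities] for a in cities]
best = ga_tsp(dist)
```

On this tiny instance the GA recovers the perimeter tour; the same loop scales to larger distance matrices by raising the population size and generation count.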
Abstract: Stone is a constituent part of the geological structure of the territory, presenting itself as an element that has always interconnected humans and the environment in a discourse of meanings and symbols realized differently across cultures and experiences. Because of this, the first settlements and their areas of influence gained importance in the humanization and spatial organization of the territory, not only through the appropriation made by their inhabitants, but mainly because the community, regardless of economic or social condition, used stone in its living space and cultural integration. These factors are decisive in characterizing the landscape of the study area in northwest Portugal, because stone appears not only in the natural landscape but is also a strong element of the humanized landscape; this relationship is the main characteristic of the study area.
Abstract: The paper presents the potential for renewable energy sources (RES) in Romania and the results of the Romanian national research project "Romania's contribution to the European targets regarding the development of renewable energy sources - PROMES". The objective of the project is to develop energy generation from RES in Romania by drawing up scenarios and prognoses harmonized with national and European targets, modeling the effects of RES development (environmental, economic, social, etc.), researching the impact of RES penetration into the main grid, implementing an advanced software tool for recording and communicating RES information, and conducting experimental research based on demonstrative applications. The expected results are briefly presented, as well as the social, economic, and environmental impact.
Abstract: To date, English as a Second Language (ESL) educators involved in teaching language and communication to engineering students face an uphill task in developing graduates' communicative competency. This challenge is accentuated by the apparent lack of English for Specific Purposes (ESP) materials for engineering students in the engineering curriculum. As such, most ESL educators are forced to play multiple roles: they take on tasks such as curriculum designer, material writer, and teacher, with limited knowledge of the disciplinary content. Previous research indicates that prospective professional engineers should possess several sub-sets of competency: technical, linguistic, oral immediacy, meta-cognitive, and rhetorical explanatory competence. Another study revealed that engineering students need to be equipped with technical and linguistic oral-immediacy competence. However, little is known about whether these competency needs are in line with educators' perceptions of communicative competence. This paper examines the mix of communicative-competence subsets best suited to engineering students' technical oral presentations. For the purpose of this study, two groups of educators were interviewed: language and communication lecturers involved in teaching a speaking course, and content experts who assess students' technical oral presentations at the tertiary level. The findings indicate that these two groups differ in their perceptions.
Abstract: Most simple nonlinear thresholding rules for wavelet-based denoising assume that the wavelet coefficients are independent. However, the wavelet coefficients of natural images have significant dependencies. This paper attempts to give a recipe for selecting one of the popular image-denoising algorithms based on VisuShrink, SureShrink, OracleShrink, BayesShrink, and BiShrink, and also compares different bivariate models used for image-denoising applications. The first part of the paper compares different shrinkage functions used for image denoising. The second part compares different bivariate models, and the third part uses the bivariate model with modified marginal variance, which is based on a Laplacian assumption. The paper gives an experimental comparison on six commonly used 512×512 images: Lenna, Barbara, Goldhill, Clown, Boat, and Stonehenge. Noise powers of 25 dB, 26 dB, 27 dB, 28 dB, and 29 dB are added to the six standard images, and the corresponding peak signal-to-noise ratio (PSNR) values are calculated for each noise level.
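For reference, the soft-thresholding shrinkage rule that underlies VisuShrink-style denoising, and the PSNR figure of merit used in the comparison, can be sketched as follows (the threshold value and the toy inputs are illustrative):

```python
import math

def soft_threshold(coeffs, t):
    """Soft thresholding: shrink each wavelet coefficient toward zero
    by t, zeroing those with magnitude below t."""
    return [math.copysign(max(abs(c) - t, 0.0), c) for c in coeffs]

def psnr(original, denoised, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images
    (given here as flat lists of pixel values)."""
    mse = sum((a - b) ** 2 for a, b in zip(original, denoised)) / len(original)
    return float("inf") if mse == 0 else 10 * math.log10(peak ** 2 / mse)
```

In a full denoiser, the threshold t is chosen per rule (e.g. VisuShrink's universal threshold depends on the noise level and image size) and applied to the detail coefficients of each subband before inverse transforming.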
Abstract: Oxidative stress is a common incident in eukaryotic metabolism. The presence of diverse components disturbing the equilibrium of oxygen metabolism increases unspecific oxidative damage in living cells. The body's own ubiquinone (Q10) seems to be a promising drug for countering the heightened appearance of reactive oxygen species (ROS). However, its lipophilic properties require a new drug-formulation strategy to overcome its low bioavailability. Consequently, the manufacture of heterogeneous nanodispersions is a focus for medical applications. A conventional nanodispersion is composed of a drug-containing core and a surface-active agent, also called a surfactant. Long-term accumulation of the surfactive components in tissues may be a consequence of their use in medical therapeutics, and their non-biodegradable properties carry the potential for side-effects. Further improvements to the fabrication process incorporate biodegradable components such as modified γ-polyglutamic acid, which decreases the potential for prospective side-effects.
Abstract: The protection and proper management of archaeological heritage are essential to studying and interpreting it for present and future generations, and protecting archaeological heritage depends on multidisciplinary professional collaboration. This study gathers data from different sources, photogrammetry and a geographic information system (GIS), integrated for the purpose of documenting one of the significant archaeological sites: Ahl-Alkahf, Jordan. 3D modeling deals with the actual image of the features, shapes, and texture, to represent reality as realistically as possible. The 3D coordinates resulting from the photogrammetric adjustment procedures are used to create 3D models of the study area, and adding textures to the model surfaces gives the displayed models a 'real world' appearance. The GIS combines all the data, including boundary maps indicating the locations of archaeological sites, a transportation layer, a digital elevation model, and orthoimages. For a realistic representation of the study area, a 3D GIS model was prepared, in which such spatial data can be efficiently generated, managed, and visualized.
Abstract: Raman spectroscopy is used to characterize the radiation-induced chemical changes in a normoxic polyhydroxyethylacrylate (PHEA) gel dosimeter. Irradiations in the low-dose region are performed, and the polymerization of the PHEA gels is monitored by observing the changes in the Raman shift intensity of the carbon covalent bonds of PHEA, originating from both the monomer and the cross-linker. A variation in peak intensities with absorbed dose was observed: as the dose increases, the peak intensities of the carbon covalent bonds in the polymer gels decrease. This indicates that the amount of absorbed dose affects the polymerization of the polymer gels; as the absorbed dose increases, the degree of polymerization also increases. The results verify that PHEA gel dosimeters are sensitive even in the low-dose region.