Abstract: The scale, complexity and worldwide geographical
spread of the LHC computing and data analysis problems are
unprecedented in scientific research. The complexity of processing
and accessing this data is increased substantially by the size and
global span of the major experiments, combined with the limited
wide area network bandwidth available. We present the latest
generation of the MONARC (MOdels of Networked Analysis at
Regional Centers) simulation framework, as a design and modeling
tool for large scale distributed systems applied to HEP experiments.
We present simulation experiments designed to evaluate the
capabilities of the current real-world distributed infrastructure to
support existing physics analysis processes, and the means by which
the experiments band together to meet the technical challenges
posed by the storage, access and computing requirements of LHC
data analysis within the CMS experiment.
Abstract: The successful implementation of Service-Oriented Architecture (SOA) is not confined to Information Technology systems; it requires changes across the whole enterprise. In order to align IT with the business, the enterprise requires adequate and measurable methods. The adoption of SOA creates new problems with regard to measuring and analyzing performance. In fact, the enterprise should investigate to what extent the development of services will increase the value of the business. Every business needs to measure the alignment of its SOA adoption with the goals of the enterprise. Moreover, precise performance metrics, combined with advanced evaluation methodologies, should be defined as a solution. The aim of this paper is to present a systematic methodology for designing a measurement system at the technical and business levels, so that: (1) measurement metrics are determined precisely; and (2) the results are analysed by mapping the identified metrics to the measurement tools.
Abstract: Current OCR technology does not allow small text
images, such as those found in web images, to be recognized
accurately. Our goal is to investigate new approaches to
recognizing very low resolution text images containing antialiased
character shapes.
This paper presents a preliminary study on the variability of
such characters and the feasibility of discriminating them
using geometrical features. In the first stage we analyze the
distribution of these features. In the second stage we present a
study of their discriminative power for recognizing isolated
characters, using various rendering methods and font
properties. Finally, we present the results of our evaluation
tests, leading to our conclusions and future directions.
Abstract: Five crystal modifications of water-insoluble
artesunate, with improved physicochemical properties, were generated
by recrystallizing it from various solvents. These crystal
forms were characterized to select the most potent and soluble form.
SEM showed changes in the external shape of all the forms, making
them morphologically distinct. DSC thermograms of Form III and
Form V showed broad endothermic peaks at 83.04 °C and 76.96 °C,
respectively, prior to the melting fusion of the drug. Calculated
weight loss in TGA revealed that Form III and Form V are methanol
and acetone solvates, respectively. A few additional peaks appeared
in the XRPD patterns of these two solvate forms. All forms exhibit
exothermic behavior in buffer, and the two solvates display the
greatest ease of molecular release from the lattice. The methanol and
acetone solvates were found to be the most soluble forms and
exhibited higher antimalarial efficacy, showing a higher survival
rate (83.3%) after 30 days.
Abstract: As test costs in today's semiconductor industry can account for up to 50 percent of total production costs, efficient test error detection becomes more and more important. In this paper, we present a new machine learning approach to test error detection that should provide faster recognition of test system faults as well as improved test error recall. The key idea is to learn a classifier ensemble that detects typical test error patterns in wafer test results immediately after these tests finish. Since test error detection has not yet been discussed in the machine learning community, we define the central problem-relevant terms and provide an analysis of important domain properties. Finally, we present comparative studies reflecting the failure detection performance of three individual classifiers and three ensemble methods based upon them. As base classifiers we chose a decision tree learner, a support vector machine and a Bayesian network, while the compared ensemble methods were simple and weighted majority vote as well as stacking. For the evaluation, we used cross validation and a specially designed practical simulation. By implementing our approach in a semiconductor test department for the observation of two products, we proved its practical applicability.
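The weighted majority vote mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation; the classifier predictions and reliability weights below are hypothetical.

```python
# Hedged sketch: weighted majority voting over base classifier outputs,
# one of the ensemble methods the abstract names. The predictions and
# weights are illustrative, not the paper's data.

def weighted_majority_vote(predictions, weights):
    """Combine binary labels (1 = test error, 0 = genuine result)
    from several base classifiers by a weighted vote."""
    score = sum(w * (1 if p == 1 else -1)
                for p, w in zip(predictions, weights))
    return 1 if score > 0 else 0

# Three base learners (e.g. decision tree, SVM, Bayesian network)
# disagree on a wafer-test pattern; weights could reflect each
# classifier's validation accuracy.
preds = [1, 0, 1]          # hypothetical per-classifier labels
weights = [0.6, 0.9, 0.7]  # hypothetical reliability weights
print(weighted_majority_vote(preds, weights))  # -> 1
```

With equal weights this reduces to the simple majority vote also compared in the paper.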
Abstract: The aerodynamic noise radiated from a side view mirror (SVM) in high-speed airflow is calculated by combining unsteady incompressible fluid flow analysis and acoustic analysis. The transient flow past the generic SVM is simulated with two turbulence models, namely DES (Detached Eddy Simulation) and LES (Large Eddy Simulation). Detailed velocity vectors and contour plots of the time-varying velocity and pressure fields are presented along cut planes in the flow field. Mean and transient pressures are also monitored at several points in the flow field and compared to corresponding experimental data published in the literature. The acoustic predictions are made using the Ffowcs Williams-Hawkings acoustic analogy (FW-H) and the boundary element method (BEM).
Abstract: During the last couple of years, the degree of dependence on IT systems has reached a dimension nobody imagined possible 10 years ago. The increased usage of mobile devices (e.g., smart phones), wireless sensor networks and embedded devices (the Internet of Things) are only some examples of the dependency of modern societies on cyber space. At the same time, the complexity of IT applications, e.g., because of the increasing use of cloud computing, is rising continuously. Along with this, the threats to IT security have increased both quantitatively and qualitatively, as recent examples like STUXNET or the supposed cyber attack on the Illinois water system demonstrate impressively. Once-isolated control systems are nowadays often publicly reachable - a fact that was never intended by their developers. Threats to IT systems do not respect areas of responsibility. Especially with regard to Cyber Warfare, IT threats are no longer limited to company or industry boundaries, administrative jurisdictions or state borders. One of the important countermeasures is increased cooperation among the participants, especially in the field of Cyber Defence. Besides political and legal challenges, there are technical ones as well. A better, at least partially automated exchange of information is essential to (i) enable sophisticated situational awareness and (ii) counter the attacker in a coordinated way. Therefore, this publication evaluates state-of-the-art Intrusion Detection Message Exchange protocols with the aim of guaranteeing a secure information exchange between different entities.
Abstract: Pattern recognition and image recognition methods are commonly developed and tested using testbeds, which contain known responses to a query set. Until now, testbeds available for image analysis and content-based image retrieval (CBIR) have been scarce and small-scale. Here we present the one-million-image CEA-List Image Collection (CLIC) testbed that we have produced, and report on our use of this testbed to evaluate image analysis merging techniques. The testbed will soon be made publicly available through the EU MUSCLE Network of Excellence.
Abstract: Online news websites are one of the main and widest-reaching areas of the mass media. Since the nineties, several Jordanian newspapers have been introduced to the World Wide Web to reach varied and large audiences. Examples of these newspapers with an online version are Al-Rai, Ad-Dustor and AlGhad; other purely online news websites include Ammon and Rum. The main aim of this study is to evaluate online newspaper websites using two assessment measures: usability and web content. This aim is achieved through a questionnaire-based evaluation grounded in the definitions of usability and web content in the ISO 9241-11 standard. The results are based on 204 audience responses. The research shows that the usability factor is relatively good for all Jordanian online newspapers, whereas the web content factor is moderate.
Abstract: The policies governing the business of any
organization are well reflected in its business rules. Business
rules are implemented by data validation techniques, coded during
the software development process. Any change in business
policies results in changes to the code written for the data
validation used to enforce those policies. Implementing changes in
business rules without changing the code is the objective of this
paper. The proposed approach enables users to create rule sets at
run time, once the software has been developed. The rule sets newly
defined by end users are associated with the data variables for
which validation is required. The approach allows users to define
business rules using all the comparison and Boolean operators.
Multithreading is used to validate the data entered by the end user
against the applied business rules. The evaluation of the data is
performed by a newly created thread using an enhanced form of the
RPN (Reverse Polish Notation) algorithm.
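The rule evaluation described above can be sketched with a plain RPN evaluator. The paper's enhanced RPN algorithm is not specified in the abstract, so this is a generic illustration; the rule, variable names and values are hypothetical.

```python
# Hedged sketch of RPN-based rule evaluation: a business rule with
# comparison and Boolean operators is evaluated against end-user data.
import operator

OPS = {
    ">": operator.gt, "<": operator.lt, ">=": operator.ge,
    "<=": operator.le, "==": operator.eq, "!=": operator.ne,
    "AND": lambda a, b: a and b, "OR": lambda a, b: a or b,
}

def eval_rpn(tokens, env):
    """Evaluate a rule in Reverse Polish Notation against variable values."""
    stack = []
    for tok in tokens:
        if tok in OPS:
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[tok](a, b))
        elif tok in env:
            stack.append(env[tok])    # variable lookup
        else:
            stack.append(float(tok))  # numeric literal
    return stack.pop()

# Infix rule "age >= 18 AND salary > 3000" in RPN form:
rule = ["age", "18", ">=", "salary", "3000", ">", "AND"]
print(eval_rpn(rule, {"age": 25, "salary": 4500}))  # -> True
```

In the paper's design such an evaluation would run in a newly created thread, so `eval_rpn` could simply be the target of a `threading.Thread`.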
Abstract: As is known, the guard wires of overhead high-voltage
lines are usually grounded through the grounding systems of the
supports and of the terminal stations. They affect the zero sequence
impedance of the line, Z0, which is generally calculated assuming
that the guard wires are at ground potential. In this way the effect
of the earth resistances of the supports and stations is not
considered. In this work a formula is derived for the calculation of
Z0 which takes these resistances into account. A method is also
proposed for calculating the zero sequence impedance of overhead
lines in which, in various sections or spans, the guard wires are
connected to the supports, isolated from them, or absent. A
parametric analysis is given for 220 kV and 400 kV lines, which
shows the extent of the errors made with traditional methods of
calculation.
Abstract: In this paper, we present a novel objective nonreference performance assessment algorithm for image fusion. It takes into account local measurements to estimate how well the important information in the source images is represented by the fused image. The metric is based on the Universal Image Quality Index and uses the similarity between blocks of pixels in the input images and the fused image as the weighting factors for the metrics. Experimental results confirm that the values of the proposed metrics correlate well with the subjective quality of the fused images, giving a significant improvement over standard measures based on mean squared error and mutual information.
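The Universal Image Quality Index that the metric builds on can be computed per block as sketched below. This shows only the standard UIQI between two pixel blocks; the block-wise weighting scheme of the proposed fusion metric is only outlined in the abstract, and the pixel values here are illustrative.

```python
# Hedged sketch: Universal Image Quality Index (UIQI) between two
# equal-length pixel blocks, the building block of the fusion metric.
from statistics import mean

def uiqi(x, y):
    """UIQI = 4*cov*mx*my / ((vx+vy)*(mx^2+my^2)); equals 1 for
    identical blocks, smaller for structurally dissimilar ones."""
    mx, my = mean(x), mean(y)
    n = len(x)
    vx = sum((a - mx) ** 2 for a in x) / (n - 1)   # sample variance of x
    vy = sum((b - my) ** 2 for b in y) / (n - 1)   # sample variance of y
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)
    return 4 * cov * mx * my / ((vx + vy) * (mx ** 2 + my ** 2))

# Identical blocks score 1 (perfect structural similarity):
block = [10, 20, 30, 40]
print(round(uiqi(block, block), 6))  # -> 1.0
```

In the proposed metric, such per-block scores between each source image and the fused image would be combined using block-similarity weights; that combination step is not reproduced here.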
Abstract: The purpose of this study was to examine viewpoints in terms of changing distances and levels and thereby comparatively analyze visual sensitivity to the elements of natural views. The questionnaire survey was conducted separately for experts and non-experts. In summary, it was confirmed that the visual sensitivity to the elements of the same natural views differed significantly depending on the subjects' professionalism and on changes in viewpoint level and distance, while the visual sensitivity to 'openness of visual/view axes' did not differ significantly when only the viewpoint distances were varied. In addition, the visual sensitivity to visual/view axes differed between experts and ordinary people when the viewpoint levels were varied, while the visual sensitivity to 'damaged natural view resources' differed between the two groups when the viewpoint distances were varied.
Abstract: Key performance indicators (KPIs) are used for
post-result evaluation in the construction industry, and they
normally do not have provisions for changes. This paper proposes a
set of dynamic key performance indicators (d-KPIs) which predict
the future performance of the activity being measured and present
the opportunity to change practice accordingly. Critical to the
predictability of a construction project is the ability to achieve
automated data collection. This paper proposes an effective way to
collect the process and engineering management data from an
integrated construction management system. The d-KPI matrix
developed from this study, consisting of various indicators under
seven categories, can be applied to the close monitoring of
aged-care facility development projects. The d-KPI matrix also
enables performance measurement and comparison at both project
and organization levels.
Abstract: Olomouc is a unique and complex landscape with
widespread forestation and land use. This research was conducted to
assess important and complex land use change trajectories in the
Olomouc region. Multi-temporal satellite data from 1991, 2001 and
2013 were used to extract land use/cover types by an object-oriented
classification method. To achieve the objectives, three different
aspects were addressed: (1) calculating the quantity of each
transition; (2) allocating the location-based landscape pattern; and
(3) comparing the land use/cover evaluation procedure. The land
cover change trajectories show that 16.69% agriculture, 54.33%
forest and 21.98% other areas (settlement, pasture and water bodies)
were stable over all three decades. Approximately 30% of the study
area remained the same land cover type from 1991 to 2013.
Broad-scale political and socio-economic factors also affected the
rate and direction of landscape change. Distance from settlements
was the most important predictor of land cover change trajectories.
This showed that most landscape trajectories were caused by
socio-economic activities and mainly led to beneficial change in the
ecological environment.
Abstract: Experiments were carried out at the Latvia
University of Agriculture, Department of Food Technology. The aim
of this work was to assess the effect of thermal treatment in
flexible retort pouch packaging on the quality of potato produce
during storage. Samples were evaluated immediately after retort
thermal treatment and after 1, 2, 3 and 4 months of storage at an
ambient temperature of 18 ± 2 °C, vacuum-packed in
polyamide/polyethylene (PA/PE) and aluminum/polyethylene (Al/PE)
film pouches with barrier properties. The quality of the potato
produce in dry butter and mushroom dressings was characterized
experimentally by measuring pH, hardness, color and microbiological
properties, and by sensory evaluation. The sterilization was
effective in protecting the produce from physical, chemical and
microbial quality degradation. Based on the data obtained, it can be
argued that the selected processing technology and packaging
materials could be applied to ensure safety and security over a
four-month storage period.
Abstract: There exists a strong correlation between efficient project management and competitive advantage for organizations. Therefore, organizations are striving to standardize and assess the rigor of their project management processes and capabilities, i.e. their project management maturity. Researchers and standardization bodies have developed several project management maturity models (PMMMs) to assess the project management maturity of organizations. This study presents a critical evaluation of some of the leading PMMMs against OPM3® in a multitude of ways, to determine which PMMM is the most comprehensive model - the one that can assess most aspects of an organization and also help organizations gain competitive advantage over competitors. After a detailed morphological analysis of the models, it is concluded that OPM3® is the most promising maturity model and can provide a real competitive advantage to organizations due to its unique approach to assessment and improvement strategies.
Abstract: A sequential decision problem, based on the task of identifying the species of trees given acoustic echo data collected from them, is considered with well-known stochastic classifiers, including single and mixture Gaussian models. Echoes are processed with a preprocessing stage based on a model of mammalian cochlear filtering, using a new discrete low-pass filter characteristic. Stopping-time performance of the sequential decision process is evaluated and compared. It is observed that the new low-pass filter processing results in faster sequential decisions.
Abstract: Vernonia divergens Benth., commonly known as the
"Insulin Plant" (Fam: Asteraceae), is locally reputed as a potent
"sugar killer". The leaves of the plant, boiled in water, are
successfully administered locally to a large number of diabetic
patients. The present study evaluates the putative anti-diabetic
ingredients, isolated from in vivo and in vitro grown plantlets of
V. divergens, for their antimicrobial and anticancer activities.
Sterilized explants of nodal segments were cultured on MS
(Murashige and Skoog, 1962) medium in the presence of different
combinations of hormones. Multiple shoots along with a bunch of
roots were regenerated at 1 mg l-1 BAP and 0.5 mg l-1 NAA.
Micro-plantlets were separated and sub-cultured on the double
strength (2X) of the above combination of hormones leading to
increased length of roots and shoots. These plantlets were
successfully transferred to soil and survived well in nature. The
ethanol extracts of plantlets from both in vivo and in vitro sources
were prepared in a Soxhlet extractor and then concentrated to
dryness under reduced pressure in a rotary evaporator. The
concentrated extracts thus obtained showed significant inhibitory
activity against gram-negative bacteria such as Escherichia coli and
Pseudomonas aeruginosa, but no inhibition was found against
gram-positive bacteria. Further, these ethanol extracts were
screened for in vitro
percentage cytotoxicity at different time periods (24 h, 48 h and 72 h)
of different dilutions. The in vivo plant extract inhibited the
growth of EAC mouse cell lines by 65, 66, 78 and 88% at 100, 50, 25
and 12.5 μg mL-1, but only at 72 h of treatment. In the case of the
extract of in vitro origin, inhibition of the EAC cell lines was
found even at 48 h. During spectrophotometric scanning, the extracts
exhibited different maxima (λ): four peaks in the in vitro extract
as against a single peak in the in vivo preparation, suggesting a
possible change in the nature of the ingredients during
micropropagation through tissue culture techniques.
Abstract: With optimized bandwidth and latency discrepancy ratios, Node Gain Scores (NGSs) are determined and used as the basis for shaping a max-heap overlay. The NGSs, determined as the respective bandwidth-latency products, govern the construction of max-heap-form overlays. Each NGS is earned as a synergy of the discrepancy ratio of the bandwidth requested with respect to the estimated available bandwidth, and the latency discrepancy ratio between the node and the source node. The tree leads to enhanced-delivery overlay multicasting, increasing packet delivery that could otherwise be hindered by the packet loss induced in schemes that do not consider the synergy of these parameters when placing nodes on the overlay. The NGS is a function of four main parameters: the estimated available bandwidth, Ba; the individual node's requested bandwidth, Br; the proposed node latency to its prospective parent, Lp; and the suggested best latency as advised by the source node, Lb. The bandwidth discrepancy ratio (BDR) and latency discrepancy ratio (LDR) carry weights of α and (1,000 - α), respectively, with α arbitrarily chosen between 0 and 1,000 to ensure that the NGS values, used as node IDs, maintain a good likelihood of uniqueness and a balance between the BDR and the LDR. A max-heap-form tree is constructed under the assumption that all nodes possess an NGS less than that of the source node. To maintain load balance, the children of each level's siblings are evenly distributed, such that a node cannot accept a second child until all of its siblings able to do so have already acquired the same number of children; this is done logically from left to right in the conceptual overlay tree. Records of the pair-wise approximate available bandwidths, as measured by a pathChirp scheme at individual nodes, are maintained.
Evaluations comparing the scheme to other schemes - Bandwidth Aware multicaSt architecturE (BASE), Tree Building Control Protocol (TBCP), and Host Multicast Tree Protocol (HMTP) - have been conducted. The new scheme generally performs better in terms of the trade-off between packet delivery ratio, link stress, control overhead, and end-to-end delay.
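The Node Gain Score described above can be sketched as a weighted combination of the two discrepancy ratios. The abstract does not give the exact combining function, so the formula below, including the direction of the latency ratio, is an assumption; the parameter values are illustrative.

```python
# Hedged sketch of a Node Gain Score (NGS): a weighted combination of
# the bandwidth discrepancy ratio (BDR) and the latency discrepancy
# ratio (LDR). The exact formula and the LDR orientation are assumptions
# not stated in the abstract.

def node_gain_score(Ba, Br, Lp, Lb, alpha=500):
    """Ba: estimated available bandwidth, Br: requested bandwidth,
    Lp: latency to prospective parent, Lb: best latency from source,
    alpha: weight in [0, 1000] trading BDR against LDR."""
    bdr = Ba / Br   # more spare bandwidth -> higher score
    ldr = Lb / Lp   # assumed orientation: lower Lp -> higher score
    return alpha * bdr + (1000 - alpha) * ldr

# A node with ample bandwidth and low parent latency should earn a
# higher score, and so sit nearer the root of the max-heap overlay:
print(node_gain_score(Ba=8.0, Br=2.0, Lp=10.0, Lb=20.0) >
      node_gain_score(Ba=2.0, Br=2.0, Lp=40.0, Lb=20.0))  # -> True
```

Scores computed this way would then serve as node IDs when sifting nodes into the max-heap-form overlay tree.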