Abstract: Cloud computing has matured considerably over the last few years, and consequently the demand for better cloud services is increasing rapidly. One of the research topics for improving cloud services is desktop computing in virtualized environments. This paper aims at the development of an adaptive virtual desktop service on a cloud computing platform, based on our previous research on virtualization technology. We implement a cloud virtual desktop and application software streaming technology that make it possible to provide Virtual Desktop as a Service (VDaaS). Remote desktop virtualization allows shifting the user's desktop from the traditional PC environment to a cloud-enabled environment, in which the desktop is stored on a remote virtual machine rather than locally. The proposed approach has the potential to provide an efficient, resilient, and elastic environment for online cloud services. Users no longer bear the burden of platform maintenance, which drastically reduces the overall cost of hardware and software licenses. Moreover, this flexible remote desktop service represents the next significant step toward the mobile workplace, letting users access their desktop environments from virtually anywhere.
Abstract: Implicit equations play a crucial role in engineering.
Based on this importance, several techniques have been applied to
solve this particular class of equations. When it comes to practical
applications, in general, iterative procedures are taken into account.
On the other hand, with the improvement of computers, other
numerical methods have been developed to provide a more
straightforward methodology of solution. Analytical exact approaches
seem to have been continuously neglected due to the difficulty
inherent in their application; notwithstanding, they are indispensable
to validate numerical routines. Lagrange's Inversion Theorem is a
simple mathematical tool which has proved to be widely applicable to
engineering problems. In short, it provides the solution to implicit
equations by means of an infinite series. To show the validity of this
method, the three-parameter infiltration equation is, for the first time,
analytically and exactly solved. After manipulating these series,
closed-form solutions are presented as H-functions.
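As a hedged illustration of the series-solution idea (not the paper's infiltration equation), Lagrange's inversion can be sketched numerically for the implicit equation w = t·e^w, whose series coefficients n^(n-1)/n! follow directly from the theorem; the truncated series is checked against a plain bisection root-finder:

```python
import math

def lagrange_series_w(t, n_terms=40):
    # Lagrange inversion applied to the implicit equation w = t*exp(w):
    # w(t) = sum_{n>=1} n**(n-1)/n! * t**n  (converges for |t| < 1/e)
    return sum(n ** (n - 1) / math.factorial(n) * t ** n
               for n in range(1, n_terms + 1))

def solve_bisection(t, lo=0.0, hi=1.0, iters=60):
    # Iterative solution of f(w) = w - t*exp(w) = 0, used here for validation
    f = lambda w: w - t * math.exp(w)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

t = 0.2
print(abs(lagrange_series_w(t) - solve_bisection(t)))  # a very small residual
```

This mirrors the abstract's point: the exact series solution is what validates the iterative routine, not the other way around.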
Abstract: The aim of this paper is to rank the impact of Object-Oriented (OO) metrics in fault prediction modeling using Artificial Neural Networks (ANNs). Past studies on the empirical validation of
object oriented metrics as fault predictors using ANNs have focused
on the predictive quality of neural networks versus standard
statistical techniques. In this empirical study we turn our attention to
the capability of ANNs in ranking the impact of these explanatory
metrics on fault proneness. In the ANN data analysis approach, there is no clear method of ranking the impact of individual metrics. Five ANN-based techniques that rank object-oriented metrics for predicting the fault proneness of classes are studied: i) the overall connection weights method, ii) Garson's method, iii) the partial derivatives method, iv) the input perturbation method, and v) the classical stepwise method. We develop and evaluate different
prediction models based on the ranking of the metrics by the
individual techniques. The models based on overall connection
weights and partial derivatives methods have been found to be most
accurate.
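Two of the listed techniques can be sketched for a single-hidden-layer network; the weight matrices below are hypothetical illustrations, not the study's trained models:

```python
def garson_importance(W_ih, w_ho):
    # Garson's method: per hidden unit, share out |input-hidden weight| scaled
    # by |hidden-output weight|, then normalize to relative importances.
    # W_ih[h][i] = weight from input i to hidden unit h; w_ho[h] = hidden h -> output.
    n_in = len(W_ih[0])
    contrib = [[abs(W_ih[h][i]) * abs(w_ho[h]) for i in range(n_in)]
               for h in range(len(W_ih))]
    per_hidden = [[c / sum(row) for c in row] for row in contrib]
    raw = [sum(per_hidden[h][i] for h in range(len(W_ih))) for i in range(n_in)]
    total = sum(raw)
    return [r / total for r in raw]

def connection_weights_importance(W_ih, w_ho):
    # Overall connection weights (Olden-style) method: signed products
    # summed over hidden units, so direction of influence is preserved.
    return [sum(W_ih[h][i] * w_ho[h] for h in range(len(W_ih)))
            for i in range(len(W_ih[0]))]

W_ih = [[0.8, -0.1, 0.3], [0.5, 0.2, -0.7]]   # hypothetical trained weights
w_ho = [1.2, -0.9]
print(garson_importance(W_ih, w_ho))           # relative importances, sum to 1
print(connection_weights_importance(W_ih, w_ho))
```

The signed scores of the second method are what allow it to distinguish metrics that increase fault proneness from those that decrease it.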
Abstract: In this paper, we propose a conceptual strategy to improve robustness against manufacturing defects, and thus the reliability, of logic CMOS circuits. To enable the use of future CMOS technology nodes, this strategy combines several types of design techniques: DFR (Design for Reliability), hardware-redundancy fault tolerance in the form of TMR (Triple Modular Redundancy) for hard-error tolerance, and DFT (Design for Testability). Results on the largest ISCAS and ITC benchmark circuits show that our approach considerably improves reliability while reducing the key cost factors, namely area overhead and fault probability.
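The TMR technique named above can be illustrated with a bitwise 2-out-of-3 majority voter (a generic software sketch, not the paper's circuit-level implementation): the module is triplicated and the voter masks a hard error in any single replica.

```python
def tmr_vote(a, b, c):
    # Bitwise majority of three replicated module outputs:
    # each output bit is 1 iff at least two replicas agree on 1.
    return (a & b) | (a & c) | (b & c)

good = 0b1011
faulty = 0b0011              # one replica corrupted by a hard error
print(bin(tmr_vote(good, good, faulty)))  # the faulty replica is outvoted
```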
Abstract: One of the ubiquitous routines in medical practice is searching through voluminous piles of clinical documents. In this paper we introduce a distributed system to search and exchange clinical documents. Clinical documents are distributed peer-to-peer. Relevant information is found in multiple iterations of cross-searches between the clinical text and its domain encyclopedia.
Abstract: Computer-aided design relies on parametric software for the design of machine components as well as of any other parts of interest. The complexity of the element under study sometimes poses difficulties for computer design, or may even introduce errors into the final body conception. Reverse engineering techniques are based on the transformation of images of an already conceived body into a matrix of points that can be visualized by the design software. The literature describes several techniques for obtaining the dimensional fields of machine components, such as contact instruments (MMC), calipers, and optical methods such as laser scanning, holography, and moiré methods. The objective of this research work was to analyze the moiré technique as an instrument of reverse engineering, applied to bodies of non-complex geometry such as simple solid figures, creating matrices of points. These matrices were forwarded to parametric software (SolidWorks) to generate the virtual object. The volume obtained by mechanical means, i.e., by caliper, the volume obtained through the moiré method, and the volume generated by the SolidWorks software were compared and found to be in close agreement. This research work suggests the application of phase-shifting moiré methods as an instrument of reverse engineering, serving also to support the design of farm machinery elements.
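The phase-shifting idea recommended above can be sketched with the standard four-step algorithm (a generic illustration on synthetic intensities, not the authors' optical setup): four fringe images shifted by 90° yield the phase, and hence the height field, at each point.

```python
import math

def four_step_phase(I1, I2, I3, I4):
    # For intensities I_k = A + B*cos(phi + k*pi/2), k = 0..3:
    # I4 - I2 = 2B*sin(phi) and I1 - I3 = 2B*cos(phi),
    # so phi = atan2(I4 - I2, I1 - I3) at every pixel.
    return [math.atan2(i4 - i2, i1 - i3)
            for i1, i2, i3, i4 in zip(I1, I2, I3, I4)]

# Synthetic fringe data with a known phase map
phis = [0.3, 1.1, -0.8]
A, B = 5.0, 2.0
frames = [[A + B * math.cos(p + k * math.pi / 2) for p in phis] for k in range(4)]
recovered = four_step_phase(*frames)
print(recovered)  # recovers the phase map [0.3, 1.1, -0.8]
```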
Abstract: On-board Error Detection and Correction (EDAC)
devices aim to secure data transmitted between the central
processing unit (CPU) of a satellite onboard computer and its local
memory. This paper presents a comparison of the performance of
four low complexity EDAC techniques for application in Random
Access Memories (RAMs) on-board small satellites. The
performance of a newly proposed EDAC architecture is measured
and compared with three different EDAC strategies, using the same
FPGA technology. A statistical analysis of single-event upset (SEU)
and multiple-bit upset (MBU) activity in commercial memories
onboard Alsat-1 is given for a period of 8 years.
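A low-complexity EDAC scheme of the kind compared here can be illustrated with a Hamming(7,4) single-error-correcting code (a minimal software sketch; the paper's architectures are FPGA-based and protect wider RAM words):

```python
def hamming74_encode(d1, d2, d3, d4):
    # Each parity bit covers the codeword positions whose index has that bit set
    p1 = d1 ^ d2 ^ d4          # covers positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4          # covers positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4          # covers positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]   # codeword positions 1..7

def hamming74_correct(code):
    c = list(code)
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    syndrome = s1 + 2 * s2 + 4 * s3       # position of the flipped bit, 0 if none
    if syndrome:
        c[syndrome - 1] ^= 1              # correct the single-event upset
    return [c[2], c[4], c[5], c[6]]       # recovered data bits

word = hamming74_encode(1, 0, 1, 1)
word[4] ^= 1                              # inject a single-event upset
print(hamming74_correct(word))            # -> [1, 0, 1, 1]
```

Real SEC-DED variants add one overall parity bit so that MBUs (double errors) are at least detected rather than miscorrected.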
Abstract: The seismic rehabilitation designs of two reinforced
concrete school buildings, representative of a wide stock of similar
edifices designed under earlier editions of the Italian Technical
Standards, are presented in this paper. The mutual retrofit solution
elaborated for the two buildings consists in the incorporation of a
dissipative bracing system including pressurized fluid viscous spring-dampers
as passive protective devices. The mechanical parameters,
layouts and locations selected for the constituting elements of the
system; the architectural renovation projects developed to properly
incorporate the structural interventions and improve the appearance
of the buildings; highlights of the installation works already
completed in one of the two structures; and a synthesis of the
performance assessment analyses carried out in original and
rehabilitated conditions, are illustrated. The results of the analyses
show a remarkable enhancement of the seismic response capacities of
both structures. This allows reaching the high performance objectives
postulated in the retrofit designs with much lower costs and
architectural intrusion as compared to traditional rehabilitation
interventions designed for the same objectives.
Abstract: The flat double-layer grid belongs to the category of space structures formed from two flat layers connected together by diagonal members. Increased stiffness and better seismic resistance relative to other space structures are advantages of flat double-layer space structures. The objective of this study is the assessment and calculation of the behavior factor of flat double-layer space structures. Given that these structures are widely used, while the behavior factor used to design them against seismic forces has not been precisely determined, the necessity of this study is evident. This study is theoretical. We used structures with span lengths of 16 m and 20 m. All connections are pinned. The ANSYS software is used for the non-linear analysis of the structures.
Abstract: This paper describes the NEAR (Navigating Exhibitions, Annotations and Resources) panel, a novel interactive visualization technique designed to help people navigate and interpret groups of resources, exhibitions and annotations by revealing hidden relations such as similarities and references. NEAR is implemented on A•VI•RE, an extended online information repository. A•VI•RE supports a semi-structured collection of exhibitions containing various resources and annotations. Users are encouraged to contribute, share, annotate and interpret resources in the system by building their own exhibitions and annotations. However, it is hard to navigate smoothly and efficiently in A•VI•RE because of its high capacity and complexity. We present a visual panel that implements new navigation and communication approaches that support discovery of implied relations. By quickly scanning and interacting with NEAR, users can see not only implied relations but also potential connections among different data elements. NEAR was tested by several users in the A•VI•RE system and shown to be a supportive navigation tool. In the paper, we further analyze the design, report the evaluation and consider its usage in other applications.
Abstract: Flood management is one of the important fields in
urban storm water management. Floods are influenced by the increase in extreme storm events or by improper planning of the area. This study addresses flood protection in four stages: planning, flood event, response, and evaluation. However, flood protection is most effective when it is considered in the planning/design and evaluation stages, since both stages shape the land development of the area. Structural adjustments are often more reliable than non-structural adjustments in providing flood protection; however, structural adjustments are constrained by numerous factors such as political considerations and cost. Therefore, it is important to balance
both adjustments with the situation. The technical decisions provided
will have to be approved by the higher-ups who have the power to
decide on the final solution. Costs however, are the biggest factor in
determining the final decision. Therefore, this study recommends that flood protection be integrated and enforced in the early stages (planning and design) as part of the storm water management plan. Factors constraining the technical decisions provided should be minimized to avoid a reduction in the expected performance of the proposed adjustments.
Abstract: X-ray technology has been used for non-destructive evaluation in power systems, providing a visual, non-destructive inspection method for electrical equipment. However, considerable noise is present in the images obtained from X-ray digital imaging equipment, which makes automatic defect detection based on these images very difficult. An X-ray image de-noising algorithm based on the wavelet transform is proposed in this paper. An edge detection algorithm is then applied so that the defect can be extracted. The experimental results show that the method used in this paper is very effective for de-noising X-ray images.
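The wavelet de-noising idea can be sketched with a one-level Haar transform and soft thresholding of the detail coefficients (a minimal 1-D illustration; the paper's algorithm operates on 2-D X-ray images with a specific wavelet and threshold rule):

```python
def haar_forward(x):
    # One-level Haar transform: pairwise averages (approximation)
    # and pairwise half-differences (detail)
    avg = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    det = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return avg, det

def haar_inverse(avg, det):
    out = []
    for a, d in zip(avg, det):
        out += [a + d, a - d]
    return out

def soft_threshold(coeffs, t):
    # Shrink detail coefficients toward zero; small noise-like ones vanish
    return [max(abs(c) - t, 0.0) * (1 if c >= 0 else -1) for c in coeffs]

def denoise(signal, t=0.5):
    avg, det = haar_forward(signal)
    return haar_inverse(avg, soft_threshold(det, t))

noisy = [2.0, 2.2, 4.0, 3.9, 1.0, 1.3, 5.0, 4.8]
print(denoise(noisy))   # small pairwise jitter is suppressed, edges survive
```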
Abstract: This research introduces a new usage of Artificial Intelligence (AI) approaches in the Stepping Stone Detection (SSD) field of research. Using the Self-Organizing Map (SOM) approach as the engine, the experiments show that SOM has the capability to detect the number of connection chains involved in a stepping-stone attack. Since counting the number of connection chains is one of the important steps in stepping stone detection and is currently a research focus, this research chose SOM as the AI technique because of these capabilities. The experiments show that SOM can detect the number of involved connection chains in Network-based Stepping Stone Detection (NSSD).
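As a hedged sketch of how a SOM can separate connection-chain measurements into groups, the following minimal one-dimensional SOM (with synthetic scalar data standing in for network features, which the abstract does not specify) pulls the best-matching unit and its immediate neighbours toward each sample:

```python
import random

def train_som(data, n_units=4, iters=300, lr0=0.5):
    # Minimal 1-D SOM on scalar inputs: find the best-matching unit (BMU),
    # then pull it (and its index neighbours, more weakly) toward the sample
    # with a learning rate that decays over time.
    rng = random.Random(1)
    weights = [rng.random() for _ in range(n_units)]
    for t in range(iters):
        lr = lr0 * (1 - t / iters)
        x = rng.choice(data)
        bmu = min(range(n_units), key=lambda j: abs(weights[j] - x))
        for j in range(n_units):
            h = 1.0 if j == bmu else (0.5 if abs(j - bmu) == 1 else 0.0)
            weights[j] += lr * h * (x - weights[j])
    return sorted(weights)

# Two synthetic clusters, standing in for two distinct connection chains
data = [0.1] * 50 + [0.9] * 50
print(train_som(data))   # units split between the two clusters
```

Counting the distinct clusters the units settle on is the kind of signal the abstract exploits to count connection chains.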
Abstract: Waste management is now a global concern due to its
high environmental impact on climate change. Because our daily activities generate huge amounts of waste, managing waste efficiently has become more important than ever. Alternative
Waste Technology (AWT), a new category of waste treatment
technology has been developed for energy recovery in recent years to
address this issue. AWT describes a technology that redirects waste
away from landfill, recovers more useable resources from the waste
flow and reduces the impact on the surroundings. Australia is one of
the largest producers of waste per capita. A number of AWTs are in use in Australia to produce energy from waste. Presently, it is vital
to identify an appropriate AWT to establish a sustainable waste
management system in Australia. Identification of an appropriate
AWT through Multi-criteria analysis (MCA) of four AWTs by using
five key decision making criteria is presented and discussed in this
paper.
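One simple form of MCA, a weighted-sum model over the decision criteria, can be sketched as follows; the alternatives, scores and weights below are hypothetical illustrations, not the paper's four AWTs or five criteria:

```python
def mca_rank(scores, weights):
    # scores[alternative] = list of criterion scores; weights align with criteria.
    # Weighted-sum MCA: total score per alternative, highest first.
    totals = {alt: sum(s * w for s, w in zip(vals, weights))
              for alt, vals in scores.items()}
    return sorted(totals, key=totals.get, reverse=True), totals

scores = {                        # hypothetical 0-10 scores on 3 criteria
    "composting":   [7, 6, 5],
    "gasification": [8, 4, 9],
    "digestion":    [6, 9, 7],
}
weights = [0.5, 0.3, 0.2]         # hypothetical criterion weights (sum to 1)
ranking, totals = mca_rank(scores, weights)
print(ranking)                    # alternatives ordered by weighted score
```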
Abstract: Numerical analysis naturally finds applications in all
fields of engineering and the physical sciences, but in the
21st century, the life sciences and even the arts have adopted
elements of scientific computations. Numerical data analysis has become a key process in research and development across all these fields [6]. In this paper, we analyze the specified numerical patterns using association rule mining techniques with minimum-confidence and minimum-support mining
criteria. The extracted rules and analyzed results are graphically
demonstrated. Association rules are a simple but very useful form of
data mining that describe the probabilistic co-occurrence of certain
events within a database [7]. They were originally designed to
analyze market-basket data, in which the likelihood of items being purchased together within the same transaction is analyzed.
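The support/confidence machinery described above can be sketched directly (a toy market-basket example with hypothetical items, not the paper's numerical-pattern data):

```python
from itertools import combinations

def support(itemset, transactions):
    # Fraction of transactions containing every item in the itemset
    return sum(itemset <= t for t in transactions) / len(transactions)

def mine_rules(transactions, min_sup, min_conf):
    items = sorted(set().union(*transactions))
    rules = []
    for x, y in combinations(items, 2):
        for a, b in ((x, y), (y, x)):          # both directions a -> b
            sup = support({a, b}, transactions)
            if sup < min_sup:
                continue
            conf = sup / support({a}, transactions)   # P(b | a)
            if conf >= min_conf:
                rules.append((a, b, sup, conf))
    return rules

baskets = [{"bread", "milk"}, {"bread", "milk", "eggs"},
           {"bread", "eggs"}, {"milk", "eggs"}]
print(mine_rules(baskets, min_sup=0.5, min_conf=0.6))
```

Raising `min_sup` or `min_conf` prunes the rule set, which is exactly the role the minimum-support and minimum-confidence criteria play above.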
Abstract: In recent years, high dynamic range (HDR) imaging has gained popularity with the advancement of digital photography. In this contribution, we present a subjective evaluation of various tone reproduction and tone mapping techniques by a number of participants. Firstly, standard HDR images were used and the participants were asked to rate them based on a given rating scheme. After that, the participants were asked to rate HDR images generated using linear and nonlinear combination approaches over multiple exposure images. The
experimental results showed that linearly generated HDR images
have better visualization than the nonlinear combined ones. In
addition, Reinhard et al. and the exponential tone mapping operators
have shown better results compared to logarithmic and the Garrett et
al. tone mapping operators.
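The Reinhard et al. global operator mentioned above can be sketched on a list of luminance values (a minimal version of the 2002 photographic operator; real implementations work on full images and include an optional white-point term):

```python
import math

def reinhard_global(lum, key=0.18, eps=1e-6):
    # Scale each luminance by the key value over the log-average luminance,
    # then compress with L/(1+L), mapping [0, inf) into the display range [0, 1)
    n = len(lum)
    log_avg = math.exp(sum(math.log(eps + L) for L in lum) / n)
    scaled = [key * L / log_avg for L in lum]
    return [Ls / (1.0 + Ls) for Ls in scaled]

hdr = [0.01, 0.2, 1.5, 40.0, 900.0]      # synthetic scene luminances
print(reinhard_global(hdr))               # display values, all within [0, 1)
```

The operator is monotone, so the ordering of luminances (and thus local contrast direction) is preserved while the dynamic range is compressed.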
Abstract: The electromagnetic spectrum is a natural resource, and hence the well-organized usage of this limited resource is a necessity for better communication. The present static frequency
allocation schemes cannot accommodate demands of the rapidly
increasing number of higher data rate services. Therefore, dynamic
usage of the spectrum must be distinguished from the static usage to
increase the availability of frequency spectrum. Cognitive radio is not
a single piece of apparatus but it is a technology that can incorporate
components spread across a network. It offers great promise for
improving system efficiency, spectrum utilization, more effective
applications, reduction in interference and reduced complexity of
usage for users. A cognitive radio is aware of its environment, internal state, and location, and autonomously adjusts its operations
to achieve designed objectives. It first senses its spectral environment
over a wide frequency band, and then adapts the parameters to
maximize spectrum efficiency with high performance. This paper focuses on the analysis of the Bit Error Rate (BER) in cognitive radio using the Particle Swarm Optimization (PSO) algorithm. The approach is analyzed and interpreted both theoretically and practically, in terms of its advantages and drawbacks and of how the BER affects the efficiency and performance of the communication system.
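The PSO algorithm used for the BER analysis can be sketched generically (a textbook global-best PSO minimizing a stand-in sphere function, not the paper's BER objective or parameter settings):

```python
import random

def pso(f, dim, n_particles=20, iters=150, lo=-5.0, hi=5.0, seed=0):
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5                 # inertia and acceleration constants
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]                  # each particle's best position
    pbest_f = [f(x) for x in X]
    g = min(range(n_particles), key=pbest_f.__getitem__)
    gbest, gbest_f = pbest[g][:], pbest_f[g]   # swarm-wide best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pbest_f[i]:
                pbest_f[i], pbest[i] = fx, X[i][:]
                if fx < gbest_f:
                    gbest_f, gbest = fx, X[i][:]
    return gbest, gbest_f

best, val = pso(lambda x: sum(xi * xi for xi in x), dim=2)
print(val)   # close to the global minimum 0
```

In the paper's setting, `f` would instead evaluate the BER obtained for a candidate set of transmission parameters.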
Abstract: Thirty-six samples each of aerobic and anoxic activated sludge were collected from two wastewater treatment plants with MBRs in Berlin, Germany. The samples were prepared for the counting and identification of fungal isolates; the isolates were purified by conventional techniques and identified by microscopic examination. Sixty-two species belonging to 28 genera were isolated from the activated sludge samples under aerobic conditions (28 genera and 58 species) and anoxic conditions (26 genera and 52 species). The obtained data show that Aspergillus was found in 94.4% of the samples, followed by Penicillium (61.1%), Fusarium (61.1%), Trichoderma (44.4%) and Geotrichum candidum (41.6%); these were the most prevalent species in all activated sludge samples. The study confirmed that fungi can thrive and sporulate in activated sludge, but they are isolated in different numbers depending on the aeration system. Some fungal species in our study are saprophytic, while others are pathogenic to plants and animals.
Abstract: In this research, Response Surface Methodology (RSM) is used to investigate the effect of four controllable input variables, namely discharge current, pulse duration, pulse off time and applied voltage, on the Surface Roughness (SR) of an Electrical Discharge Machined surface. To study the proposed second-order polynomial model for SR, a Central Composite Design (CCD) is used to estimate the model coefficients of the four input factors, which are believed to influence the SR in the Electrical Discharge Machining (EDM) process. Experiments were conducted on AISI D2 tool steel with a copper electrode. The response is modeled using RSM on the experimental data. The significant coefficients are obtained by performing Analysis of Variance (ANOVA) at the 5% level of significance. It is found that discharge current, pulse duration, pulse off time and a few of their interactions have a significant effect on the SR. The model adequacy is very satisfactory, as the Coefficient of Determination (R2) is found to be 91.7% and the adjusted R2-statistic (R2 adj) 89.6%.
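The second-order polynomial fitting at the heart of RSM can be sketched in one factor (a hypothetical single-variable illustration, not the paper's four-factor CCD model): least squares via the normal equations recovers the model coefficients.

```python
def fit_second_order(xs, ys):
    # Solve the 3x3 normal equations for y = b0 + b1*x + b2*x**2
    n = 3
    A = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    c = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    for k in range(n):                     # Gaussian elimination, partial pivoting
        p = max(range(k, n), key=lambda r: abs(A[r][k]))
        A[k], A[p] = A[p], A[k]
        c[k], c[p] = c[p], c[k]
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for j in range(k, n):
                A[r][j] -= f * A[k][j]
            c[r] -= f * c[k]
    b = [0.0] * n
    for i in range(n - 1, -1, -1):         # back substitution
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, n))) / A[i][i]
    return b

xs = [-2, -1, 0, 1, 2]                     # hypothetical coded factor levels
ys = [2 + 3 * x - 1 * x ** 2 for x in xs]  # exact quadratic responses
print(fit_second_order(xs, ys))            # recovers ~ [2.0, 3.0, -1.0]
```

The full four-factor model adds squared and two-way interaction terms, and ANOVA then tests which of the fitted coefficients are significant.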
Abstract: Prospective readers can quickly determine whether a document is relevant to their information need if the significant phrases (or keyphrases) in this document are provided. Although keyphrases are useful, not many documents have keyphrases assigned to them, and manually assigning keyphrases to existing documents is costly. Therefore, there is a need for automatic keyphrase extraction. This paper introduces a new domain independent keyphrase extraction algorithm. The algorithm approaches the problem of keyphrase extraction as a classification task, and uses a combination of statistical and computational linguistics techniques, a new set of attributes, and a new machine learning method to distinguish keyphrases from non-keyphrases. The experiments indicate that this algorithm performs better than other keyphrase extraction tools and that it significantly outperforms Microsoft Word 2000's AutoSummarize feature. The domain independence of this algorithm has also been confirmed in our experiments.
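A much-simplified version of the candidate-extraction stage can be sketched as follows (a naive frequency-and-position scorer with a hypothetical stop list, not the paper's classifier or attribute set):

```python
import re
from collections import Counter

STOP = {"the", "a", "an", "of", "and", "to", "in", "is", "as",
        "for", "this", "that"}

def candidate_phrases(text, max_words=3):
    # Collect 1- to 3-word candidates that do not start or end with a stop word
    words = re.findall(r"[a-z][a-z-]*", text.lower())
    counts, first_pos = Counter(), {}
    for n in range(1, max_words + 1):
        for i in range(len(words) - n + 1):
            gram = tuple(words[i:i + n])
            if gram[0] in STOP or gram[-1] in STOP:
                continue
            counts[gram] += 1
            first_pos.setdefault(gram, i / max(len(words), 1))
    # Score by frequency, breaking ties in favour of earlier first occurrence
    ranked = sorted(counts, key=lambda g: (-counts[g], first_pos[g]))
    return [" ".join(g) for g in ranked]

doc = ("Keyphrase extraction treats keyphrase selection as a classification "
       "task. Keyphrase extraction is domain independent.")
print(candidate_phrases(doc)[:3])
```

Frequency and first-occurrence position are two classic attributes; the paper's contribution is precisely in replacing such hand-tuned scoring with a learned classifier over a richer attribute set.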