Abstract: In 2011, Debiao et al. pointed out that the S-3PAKE protocol, proposed by Lu and Cao for password-authenticated key exchange in the three-party setting, is vulnerable to an off-line dictionary attack. They then proposed countermeasures to eliminate this security vulnerability of S-3PAKE. Nevertheless, this paper points out that, contrary to their claim, their enhanced S-3PAKE protocol is still vulnerable to undetectable on-line dictionary attacks.
Abstract: Brain-Computer Interface (BCI) research has grown considerably in recent years. Functional Near-Infrared Spectroscopy (fNIRS) is one of the latest technologies that uses light in the near-infrared range to determine brain activity. Because near-infrared technology allows the design of safe, portable, wearable, non-invasive and wireless monitoring systems, fNIRS monitoring of brain hemodynamics can be valuable in helping to understand brain tasks. In this paper, we present results of fNIRS signal analysis indicating that there exist distinct patterns of hemodynamic responses that can distinguish brain tasks, a step toward developing a BCI. We applied two different mathematical tools separately: wavelet analysis for preprocessing, as signal filtering and feature extraction, and neural networks for recognizing brain tasks, as the classification module. We also discuss and compare our proposal with other methods; it performs better, with an average classification accuracy of 99.9%.
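The abstract does not specify the wavelet family or the network architecture, so as an illustration only, the following sketch computes Haar-wavelet features of the kind the classification stage could consume. The function names and the particular feature recipe (per-level detail statistics plus the final approximation mean) are assumptions, not the authors' method:

```python
import numpy as np

def haar_dwt(signal):
    """One level of the Haar discrete wavelet transform: returns the
    approximation (low-pass) and detail (high-pass) coefficients."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2)
    return approx, detail

def wavelet_features(signal, levels=3):
    """Feature vector built from the mean and standard deviation of the
    detail coefficients at each level, plus the final approximation mean."""
    feats = []
    a = np.asarray(signal, dtype=float)
    for _ in range(levels):
        a, d = haar_dwt(a)
        feats.extend([d.mean(), d.std()])
    feats.append(a.mean())
    return np.array(feats)
```

A classifier (e.g. a neural network, as in the paper) would then be trained on such feature vectors rather than on the raw signal.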
Abstract: This paper presents an investigation of how exploiting multiple transmit antennas in OFDM-based wireless LANs can reduce the physical-layer error rate. By comparing wireless LANs that utilize spatial diversity techniques with conventional ones, it shows how PHY and TCP throughput behavior is improved. It then assesses the same issues in a cellular operating context, which is introduced as a novel solution that, besides a multi-cell operation scenario, also benefits from spatio-temporal signaling schemes. The presented simulations shed light on the improved performance of the wide-range, high-quality wireless LAN services provided by the proposed approach.
Abstract: A building's life cycle will never be free from defects and deterioration. They are common problems in buildings, whether newly built or aged. Buildings constructed from wood are particularly affected by deteriorating agents, and serious defects and damage can reduce a building's value. In repair works, it is important to identify the causes and the repair techniques that best suit the condition. This paper reviews the conservation of traditional timber mosques in Malaysia, comprising the concepts, principles and approaches of mosque conservation in general. In conservation practice, wood in historic buildings can be conserved using various restoration and conservation techniques, which can be grouped as full and partial replacement, mechanical reinforcement, consolidation by impregnation and reinforcement, paint removal, and preservation of wood and control of insect invasion, so as to prolong and extend the function of timber in a building. It was found that the common techniques adopted in timber mosque conservation are conventional ones, and that a proper understanding of the repair technique requires the use of preserved wood only, to prevent future premature defects.
Abstract: Traditional wind tunnel models are meticulously machined from metal in a process that can take several months. While very precise, the manufacturing process is too slow to assess a new design's feasibility quickly. Rapid prototyping technology makes the concurrent study of air vehicle concepts via computer simulation and in the wind tunnel possible. This paper describes the effects of layer thickness in models produced by rapid prototyping on the aerodynamic coefficients of wind tunnel test models. Three models were evaluated: the first with a 0.05 mm layer thickness and a horizontal-plane surface roughness of 0.1 μm (Ra), the second with a 0.125 mm layer thickness and 0.22 μm (Ra), and the third with a 0.15 mm layer thickness and 4.6 μm (Ra). These models were fabricated from Somos 18420 by stereolithography (SLA). A wing-body-tail configuration was chosen for the study. Testing covered the range of Mach 0.3 to Mach 0.9 at angles of attack from -2° to +12° at zero sideslip. Coefficients of normal force, axial force, pitching moment, and lift over drag are shown at each of these Mach numbers. Results from this study show that layer thickness does have an effect on the aerodynamic characteristics; in general, the data differ between the three models by less than 5%. Layer thickness has a larger effect as Mach number decreases, and had the greatest effect on the axial force and its derived coefficients.
Abstract: With the aim of improving the nutritional profile and antioxidant capacity of gluten-free cookies, blueberry pomace, a by-product of juice production, was processed into a new food ingredient by drying and grinding, and used in a gluten-free cookie formulation. Since the quality of a baked product is highly influenced by the baking conditions, the objective of this work was to optimize the baking time and the thickness of dough pieces by applying Response Surface Methodology (RSM), in order to obtain the best technological quality of the cookies. The experiments were carried out according to a Central Composite Design (CCD), with dough thickness and baking time as independent variables, while hardness, color parameters (L*, a* and b* values), water activity, diameter and short/long ratio were the response variables. According to the results of the RSM analysis, a baking time of 13.74 min and a dough thickness of 4.08 mm were found to be optimal for a baking temperature of 170°C. As similar optimal parameters were obtained in a previously conducted experiment based on sensory analysis, RSM can be considered a suitable approach to optimize the baking process.
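The RSM workflow the abstract describes (a CCD, a fitted second-order model, and a stationary point) can be sketched as follows. The design points and the synthetic response used below are illustrative assumptions, not the cookie data; the sketch only shows the mechanics of fitting the quadratic surface and solving for its optimum:

```python
import numpy as np

# Central Composite Design for two coded factors
# (e.g. dough thickness x1, baking time x2): factorial, axial, center points.
A = np.sqrt(2)
DESIGN = [(-1, -1), (1, -1), (-1, 1), (1, 1),
          (-A, 0), (A, 0), (0, -A), (0, A), (0, 0)]

def fit_quadratic(points, y):
    """Least-squares fit of the second-order RSM model
    y = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2."""
    X = np.array([[1, x1, x2, x1 ** 2, x2 ** 2, x1 * x2]
                  for x1, x2 in points])
    b, *_ = np.linalg.lstsq(X, np.array(y), rcond=None)
    return b

def stationary_point(b):
    """Solve grad(y) = 0 for the fitted surface's optimum (coded units)."""
    H = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
    return np.linalg.solve(H, [-b[1], -b[2]])
```

The coded optimum would then be converted back to physical units (minutes, millimetres) using the chosen factor ranges.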
Abstract: Proteins or genes that have similar sequences are likely to perform the same function. One of the most widely used techniques for sequence comparison is sequence alignment. Sequence alignment allows mismatches and insertions/deletions, which represent biological mutations. Sequence alignment is usually performed on only two sequences. Multiple sequence alignment is a natural extension of two-sequence alignment; there, the emphasis is on finding an optimal alignment for a group of sequences. Several applicable techniques were examined in this research, from traditional methods such as dynamic programming to widely used stochastic optimization methods such as Genetic Algorithms (GAs) and Simulated Annealing (SA). A framework combining a Genetic Algorithm with Simulated Annealing is presented to solve the multiple sequence alignment problem. The Genetic Algorithm phase tries to find new regions of the solution space, while Simulated Annealing acts as an alignment improver for any near-optimal solution produced by the GA.
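The two-phase GA/SA idea can be sketched on toy data. Everything below is an assumption made for illustration: the tiny sequences, the fixed alignment width, the sum-of-pairs score, and the gap-shuffling mutation operator are simplifications, not the framework's actual encoding or parameters:

```python
import math
import random

SEQS = ["GATTC", "GACTC", "GTTC"]   # toy input sequences (assumed example)
WIDTH = 7                            # fixed alignment width

def random_alignment(rng):
    """Pad each sequence with randomly placed gaps up to WIDTH."""
    rows = []
    for s in SEQS:
        chars = list(s)
        for _ in range(WIDTH - len(s)):
            chars.insert(rng.randrange(len(chars) + 1), "-")
        rows.append("".join(chars))
    return rows

def score(rows):
    """Sum-of-pairs score: +1 for every matching non-gap pair per column."""
    total = 0
    for col in zip(*rows):
        for i in range(len(col)):
            for j in range(i + 1, len(col)):
                if col[i] != "-" and col[i] == col[j]:
                    total += 1
    return total

def mutate(rows, rng):
    """Re-place the gaps of one randomly chosen row."""
    rows = rows[:]
    i = rng.randrange(len(rows))
    chars = [c for c in rows[i] if c != "-"]
    for _ in range(WIDTH - len(chars)):
        chars.insert(rng.randrange(len(chars) + 1), "-")
    rows[i] = "".join(chars)
    return rows

def ga_sa(generations=60, pop_size=20, sa_steps=500, t0=2.0, seed=42):
    rng = random.Random(seed)
    # GA phase: explore new regions of the solution space.
    pop = [random_alignment(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=score, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(rng.choice(survivors), rng)
                           for _ in range(pop_size - len(survivors))]
    # SA phase: refine the near-optimal alignment produced by the GA.
    current = max(pop, key=score)
    best, t = current, t0
    for _ in range(sa_steps):
        cand = mutate(current, rng)
        d = score(cand) - score(current)
        if d >= 0 or rng.random() < math.exp(d / max(t, 1e-9)):
            current = cand
            if score(current) > score(best):
                best = current
        t *= 0.99
    return best
```

In a realistic setting the score would use a substitution matrix and affine gap penalties, and the GA would also include crossover between alignments.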
Abstract: Place is a "where" dimension formed by people's relationship with physical settings, individual and group activities, and meanings. 'Place Attachment', 'Place Identity' and 'Sense of Place' are some of the concepts that can describe the quality of people's relationships with a place. The concept of Sense of Place is used in studying human-place bonding, attachment and place meaning. Sense of Place is usually defined as an overarching impression encompassing the general ways in which people feel about places, sense them, and assign concepts and values to them. Sense of Place is highlighted in this article as one of the prevailing concepts in place-based research. Considering the dimensions of Sense of Place has always been beneficial for investigating public place attachment and pro-environmental attitudes towards these places. The creation or preservation of Sense of Place is important in maintaining the quality of the environment as well as the integrity of human life within it. While many scholars have argued that Sense of Place is a vague concept, this paper summarizes and analyzes the existing seminal literature. Therefore, in this paper, first the concept of Sense of Place and its characteristics are examined, then the scales of Sense of Place are reviewed and the factors that contribute to forming Sense of Place are evaluated, and finally Place Attachment, as an objective dimension for measuring Sense of Place, is described.
Abstract: Globalization, supported by information and communication technologies, changes the rules of competitiveness and increases the significance of information, knowledge and network cooperation. In line with this trend, the need for efficient trust-building tools has emerged. The absence of trust-building mechanisms and strategies has been identified in several studies. Through trust development, participation in e-business networks and usage of network services will increase and provide SMEs with new economic benefits. This work focuses on the development of effective trust-building strategies for electronic business network platforms. Based on the identification of trust-building mechanisms, a questionnaire-based analysis of their significance and of the minimum level of requirements was conducted. In the paper, we confirm the dependency of trust on e-skills, which play a crucial role in establishing a higher level of trust in more sophisticated and complex trust-building ICT solutions.
Abstract: The ECG contains very important clinical information about the cardiac activity of the heart. Often the ECG signal needs to be captured over a long period of time in order to identify abnormalities in certain situations. Such a signal, apart from its large volume, is often characterized by low quality due to noise and other influences. In order to extract features, the ECG signal, with its time-varying characteristics, first needs to be preprocessed with the best parameters. It is also useful to identify the specific parts of the long-lasting signal that exhibit abnormalities and to direct the practitioner to those parts. In this work we present a method based on the wavelet transform, the standard deviation and a variable threshold, which achieves 100% accuracy in identifying the ECG signal peaks and heartbeats, as well as the standard deviation, providing a quick reference to abnormalities.
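The peak-detection idea (high-frequency content compared against a threshold derived from the signal's own standard deviation) can be sketched as below. This is not the authors' pipeline: a crude first-difference stands in for the wavelet detail coefficients, and the constant k and refractory window are assumed values:

```python
import numpy as np

def detect_peaks(signal, k=3.0, refractory=5):
    """Flag samples whose high-frequency content exceeds a data-driven
    threshold of k standard deviations of that content, then collapse
    nearby detections into single peak locations."""
    x = np.asarray(signal, dtype=float)
    detail = np.abs(np.diff(x, prepend=x[0]))   # crude high-pass stage
    threshold = k * detail.std()
    candidates = np.flatnonzero(detail > threshold)
    peaks = []
    for idx in candidates:
        if not peaks or idx - peaks[-1] > refractory:
            peaks.append(int(idx))
    return peaks
```

In the method described, the high-pass stage would be a wavelet decomposition, and the threshold would vary across segments of the long recording rather than being global.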
Abstract: This research contribution advances the idea of a collaborative environment for the execution of student satellite projects against the backdrop of project management principles. The recent past has witnessed a technological shift in the aerospace industry from big satellite projects to small spacecraft, especially for earth observation and communication purposes. This vibrant shift has motivated academia and industry to share their resources and to create a win-win paradigm of mutual success and technological development, along with human resource development in the field of aerospace. Small student satellites are the latest trend in academia: more than 100 CubeSat projects have been executed successfully all over the globe, and many new student satellite projects are in the development phase. Small satellite project management requires the application of specific knowledge, skills, tools and techniques to achieve the defined mission requirements. The authors present a detailed outline for the project management of student satellites and describe the role of industry in collaborating with academia to obtain optimal results in an academic environment.
Abstract: This paper focuses on operational risk measurement techniques and on economic capital estimation methods. A data sample of operational losses provided by an anonymous Central European bank is analyzed using several approaches. The Loss Distribution Approach and the scenario analysis method are considered. Custom plausible loss events defined in a particular scenario are merged with the original data sample, and their impact on capital estimates and on the financial institution is evaluated. Two main questions are assessed: What is the most appropriate statistical method to measure and model the operational loss data distribution? And what is the impact of hypothetical plausible events on the financial institution? The g-and-h distribution was evaluated to be the most suitable one for operational risk modeling. The method based on the combination of historical loss event modeling and scenario analysis provides reasonable capital estimates and allows for measuring the impact of extreme events on banking operations.
Abstract: The aim of the present study was to develop and validate an inexpensive and simple high-performance liquid chromatographic (HPLC) method for the determination of colistin sulfate. Separation of colistin sulfate was achieved on a ZORBAX Eclipse XDB-C18 column using UV detection at λ = 215 nm. The mobile phase was 30 mM sulfate buffer (pH 2.5):acetonitrile (76:24). Excellent linearity (r² = 0.998) was found in the concentration range of 25-400 μg/mL. Intra-day and inter-day precisions of the method (%RSD, n = 3) were less than 7.9%. The developed and validated method was applied to the determination of the content of colistin sulfate in a medicated premix and an animal feed sample. The recovery of colistin from animal feed ranged satisfactorily from 90.92% to 93.77%. The results demonstrate that the HPLC method developed in this work is appropriate for the direct determination of colistin sulfate in commercial medicated premixes and animal feed.
Abstract: Speckled images arise when coherent microwave, optical, or acoustic imaging techniques are used to image an object, surface or scene. Examples of coherent imaging systems include synthetic aperture radar, laser imaging systems, imaging sonar systems, and medical ultrasound systems. Speckle noise is a form of object- or target-induced noise that results when the surface of the object is Rayleigh-rough compared to the wavelength of the illuminating radiation. Detection and estimation in images corrupted by speckle noise are complicated by the nature of the noise and are not as straightforward as detection and estimation in additive noise. In this work, we derive stochastic models for speckle noise, with an emphasis on speckle as it arises in medical ultrasound images. The motivation for this work is the problem of segmentation and tissue classification using ultrasound imaging. Modeling of speckle in this context involves a partially developed speckle model in which an underlying Poisson point process modulates a Gram-Charlier series of Laguerre-weighted exponential functions, resulting in a doubly stochastic filtered Poisson point process. The statistical distribution of partially developed speckle is derived in closed canonical form. It is observed that as the mean number of scatterers in a resolution cell increases, the probability density function approaches an exponential distribution. This is consistent with fully developed speckle noise, as demonstrated by the Central Limit Theorem.
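The limiting behavior claimed above (intensity tending to an exponential distribution as the mean number of scatterers per resolution cell grows) can be checked with a standard random-phasor-sum simulation. This is a generic textbook speckle model, not the paper's doubly stochastic filtered Poisson model; for an exponential intensity the standard deviation equals the mean, so the contrast ratio tends to 1:

```python
import cmath
import math
import random

def speckle_intensity(n_scatterers, n_cells=20000, seed=0):
    """Intensity of the coherent sum of n_scatterers unit-amplitude
    phasors with uniform random phases, per resolution cell."""
    rng = random.Random(seed)
    out = []
    for _ in range(n_cells):
        field = sum(cmath.exp(1j * rng.uniform(0, 2 * math.pi))
                    for _ in range(n_scatterers))
        out.append(abs(field) ** 2)
    return out

def contrast(samples):
    """Speckle contrast: standard deviation / mean of the intensity."""
    m = sum(samples) / len(samples)
    var = sum((x - m) ** 2 for x in samples) / len(samples)
    return math.sqrt(var) / m
```

With few scatterers (partially developed speckle) the contrast deviates noticeably from 1; with many scatterers it converges toward the fully developed, exponential-intensity regime.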
Abstract: Knowledge-based e-mail systems focus on incorporating a knowledge management approach in order to enhance traditional e-mail systems. In this paper, we present a knowledge-based e-mail system called KS-Mail, in which people not only send and receive e-mail conventionally but are also able to create a sense of knowledge flow. We introduce semantic processing of the e-mail contents by automatically assigning categories and providing links to semantically related e-mails. This is done to enrich the knowledge value of each e-mail as well as to ease the organization of the e-mails and their contents. At the application level, we have also built components such as the service manager, evaluation engine and search engine to handle the e-mail processes efficiently by providing the means to share and reuse knowledge. For this purpose, we present the KS-Mail architecture and elaborate on the details of the e-mail server and the application server. We present the ontology mapping technique used to achieve the categorization of e-mail contents, as well as the protocols that we have developed to handle the transactions in the e-mail system. Finally, we discuss further the implementation of the modules presented in the KS-Mail architecture.
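The category-assignment step can be illustrated with a minimal term-overlap sketch. Both the categories and the terms below are invented for illustration; KS-Mail's actual ontology mapping technique is more elaborate than keyword overlap:

```python
# Toy "ontology": each category is described by a set of concept terms.
# These categories and terms are illustrative assumptions, not the
# ontology actually used by KS-Mail.
ONTOLOGY = {
    "meetings": {"agenda", "minutes", "schedule", "room"},
    "projects": {"milestone", "deliverable", "deadline", "budget"},
    "support":  {"error", "crash", "ticket", "reboot"},
}

def categorize(text):
    """Assign the category whose ontology terms overlap most with the
    e-mail body; None when no term matches at all."""
    words = set(text.lower().split())
    scores = {cat: len(words & terms) for cat, terms in ONTOLOGY.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None
```

Linking "semantically related" e-mails could then reuse the same scores, connecting messages that map to overlapping ontology concepts.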
Abstract: This paper presents a comparative study of Ant Colony and Genetic Algorithms for VLSI circuit bi-partitioning. Ant Colony Optimization is an optimization method based on the behaviour of social insects [27], whereas the Genetic Algorithm is an evolutionary optimization technique based on the Darwinian theory of natural evolution and its concept of survival of the fittest [19]. Both methods are stochastic in nature and have been successfully applied to many NP-hard problems. The results obtained show that Genetic Algorithms outperform the Ant Colony Optimization technique when tested on the VLSI circuit bi-partitioning problem.
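A GA for bi-partitioning of the kind compared here can be sketched as follows. The encoding (one bit per node), the fitness (cut size plus a balance penalty), and all parameter values are illustrative assumptions, not the paper's setup:

```python
import random

def cut_size(edges, assign):
    """Number of edges whose endpoints fall in different partitions."""
    return sum(1 for u, v in edges if assign[u] != assign[v])

def ga_bipartition(n_nodes, edges, pop_size=30, generations=80, seed=7):
    """Balanced bi-partitioning GA: chromosome = 0/1 label per node,
    fitness = cut size plus a penalty for partition imbalance."""
    rng = random.Random(seed)

    def fitness(assign):
        imbalance = abs(sum(assign) - n_nodes // 2)
        return cut_size(edges, assign) + 2 * imbalance

    pop = [[rng.randint(0, 1) for _ in range(n_nodes)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_nodes)     # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n_nodes)] ^= 1  # mutation: flip one bit
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)
```

An ACO variant would instead deposit pheromone on node-to-partition assignments and construct solutions probabilistically, which is what the paper compares this against.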
Abstract: The DSTATCOM is one of the devices used for voltage sag mitigation in power systems. In this paper, a new control method for balanced and unbalanced voltage sag mitigation using a DSTATCOM is proposed. The control system has two loops, which regulate the compensator current and the load voltage. Delayed signal cancellation is used for sequence separation. The compensator should protect sensitive loads against different types of voltage sags. The performance of the proposed method is investigated under different types of voltage sags for linear and nonlinear loads. Simulation results show the appropriate operation of the proposed control system.
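The delayed signal cancellation (DSC) step used for sequence separation has a compact form in the stationary (alpha-beta) frame: the positive-sequence component is recovered from the signal now and the signal a quarter of a fundamental period earlier. The sketch below shows only this textbook operation on a complex-phasor representation; the frequency and test amplitudes are assumed values, and the paper's full two-loop controller is not reproduced:

```python
import cmath
import math

F = 50.0       # fundamental frequency in Hz (assumed)
T = 1.0 / F    # fundamental period

def v_alphabeta(t, vp, vn):
    """Unbalanced voltage in the stationary frame as a complex phasor:
    positive-sequence (vp) plus negative-sequence (vn) components."""
    w = 2 * math.pi * F
    return vp * cmath.exp(1j * w * t) + vn * cmath.exp(-1j * w * t)

def dsc_positive(v_now, v_quarter_ago):
    """Delayed signal cancellation: v+ = 0.5 * (v(t) + j * v(t - T/4)).
    The quarter-period delay rotates the two sequences oppositely, so
    the negative sequence cancels exactly in steady state."""
    return 0.5 * (v_now + 1j * v_quarter_ago)
```

During the first quarter period after a sag begins, the delayed sample still belongs to the pre-sag waveform, so DSC-based separation has an inherent transient of up to T/4.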
Abstract: Over the past decade, biclustering has become a popular data mining technique, not only in the field of biological data analysis but also in other applications, such as text mining and market data analysis, with high-dimensional two-way datasets. Biclustering clusters both rows and columns of a dataset simultaneously, as opposed to traditional clustering, which clusters either rows or columns. It retrieves subgroups of objects that are similar in one subgroup of variables and different in the remaining variables. The Firefly Algorithm (FA) is a recently proposed metaheuristic inspired by the collective behavior of fireflies. This paper provides a preliminary assessment of a discrete version of FA (DFA) in coping with the task of mining coherent, large-volume biclusters from web usage data. The experiments were conducted on two web usage datasets from a public dataset repository, and the performance of DFA was compared with that exhibited by another population-based metaheuristic, binary Particle Swarm Optimization (PSO). The results achieved demonstrate the usefulness of DFA in tackling the biclustering problem.
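The coherence that biclustering algorithms typically optimize can be made concrete with the mean squared residue of Cheng and Church, a standard bicluster quality measure (the abstract does not state DFA's exact objective, so this is an illustrative stand-in):

```python
def mean_squared_residue(matrix, rows, cols):
    """Cheng-Church mean squared residue of the bicluster (rows, cols):
    0 means a perfectly additive-coherent submatrix, larger values mean
    less coherence."""
    sub = [[matrix[i][j] for j in cols] for i in rows]
    n, m = len(rows), len(cols)
    row_mean = [sum(r) / m for r in sub]
    col_mean = [sum(sub[i][j] for i in range(n)) / n for j in range(m)]
    all_mean = sum(row_mean) / n
    return sum((sub[i][j] - row_mean[i] - col_mean[j] + all_mean) ** 2
               for i in range(n) for j in range(m)) / (n * m)
```

A metaheuristic such as DFA or binary PSO then searches over row/column subsets for large biclusters with a low residue.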
Abstract: We present a visualization technique for the radial drawing of trees, consisting of two slightly different algorithms. Both make use of node-link diagrams for visual encoding. This visualization creates clear drawings without edge crossings. One of the algorithms is suitable for real-time visualization of large trees, as it requires minimal recalculation of the layout when leaves are inserted into or removed from the tree, while the other algorithm makes better use of the drawing space. The algorithms are very similar and follow almost the same procedure, but with different parameters. Both algorithms assign angular coordinates to all nodes, which are then converted into 2D Cartesian coordinates for visualization. We present both algorithms and discuss how they compare to each other.
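The shared skeleton of such radial algorithms (assign each node an angular wedge, then convert a depth-based radius and the wedge's bisector angle to Cartesian coordinates) can be sketched as below. The wedge allocation by leaf count and the uniform radius step are generic choices for illustration, not the paper's specific parameterizations:

```python
import math

def count_leaves(tree, node):
    """Number of leaves under node; tree maps node -> list of children."""
    kids = tree.get(node, [])
    return 1 if not kids else sum(count_leaves(tree, c) for c in kids)

def radial_layout(tree, root, radius_step=1.0):
    """Give each subtree an angular wedge proportional to its leaf count
    (nested wedges prevent edge crossings), then convert the polar pair
    (depth * radius_step, wedge bisector) to 2D Cartesian coordinates."""
    pos = {}

    def place(node, depth, lo, hi):
        theta = (lo + hi) / 2
        r = depth * radius_step
        pos[node] = (r * math.cos(theta), r * math.sin(theta))
        kids = tree.get(node, [])
        total = sum(count_leaves(tree, c) for c in kids)
        a = lo
        for c in kids:
            span = (hi - lo) * count_leaves(tree, c) / total
            place(c, depth + 1, a, a + span)
            a += span

    place(root, 0, 0.0, 2 * math.pi)
    return pos
```

Because each child's wedge is a sub-interval of its parent's, inserting or removing a leaf only requires re-placing the subtrees whose leaf counts changed, which is what makes the real-time variant cheap to update.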
Abstract: The method described in this paper deals with the problem of T-wave detection in an ECG. Determining the position of a T-wave is complicated by its low amplitude and the ambiguous, changing form of the complex. A wavelet transform approach handles these complications, and therefore a method based on this concept was developed. In this way we developed a detection method that is able to detect T-waves with a sensitivity of 93% and a correct-detection ratio of 93%, even in the presence of a serious amount of baseline drift and noise.