Abstract: In this paper, the melting of a semi-infinite body under a
moving laser beam is studied. Because the Fourier heat transfer
equation is not sufficiently accurate at short times and large
dimensions, a non-Fourier form of the heat transfer equation is used.
Because the beam moves in the x direction, the temperature
distribution and the melt pool shape are not symmetric; as a result,
the problem is a transient three-dimensional one. In addition,
thermophysical properties such as the heat conductivity coefficient,
density and heat capacity are functions of temperature and material
state. The enthalpy technique, used for the solution of phase change
problems, is applied in an explicit finite volume form to the
hyperbolic heat transfer equation. This technique is used to calculate
the transient temperature distribution in the semi-infinite body and
the growth rate of the melt pool. To validate the numerical results,
comparisons were made with experimental data. Finally, the results of
this paper were compared with those of a similar problem solved with
the Fourier theory. The comparison shows the influence of the infinite
speed of heat propagation assumed in the Fourier theory on the
temperature distribution and the melt pool size.
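The abstract does not spell out which non-Fourier model is used; the standard hyperbolic choice, sketched here for constant properties as an assumption, is the Cattaneo-Vernotte form, in which a relaxed flux law replaces Fourier's law:

\[
\mathbf{q} + \tau \frac{\partial \mathbf{q}}{\partial t} = -k\,\nabla T
\quad\Longrightarrow\quad
\tau \frac{\partial^{2} T}{\partial t^{2}} + \frac{\partial T}{\partial t} = \alpha\,\nabla^{2} T,
\qquad \alpha = \frac{k}{\rho c_{p}},
\]

where \(\tau\) is the thermal relaxation time; the equation is hyperbolic and propagates heat at the finite speed \(\sqrt{\alpha/\tau}\), unlike the parabolic Fourier equation. In an enthalpy formulation for the phase change, the time derivatives act on the enthalpy \(H\), i.e. \(\tau\,\partial_{tt}H + \partial_{t}H = \nabla\cdot(k\,\nabla T)\).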
Abstract: Managers, as key employees, have a very important role in maintaining workforce performance, which is critical to
construction companies' future success. If motivated employees start with motivated managers, it seems
plausible that de-motivated employees start with de-motivated managers. This study aims to analyze the importance of motivated managers to
their own success and to the success of construction companies. In this study,
a quantitative method was used, and the study area was Medan, North Sumatera. A questionnaire survey was distributed directly to
construction companies in Medan that are listed in the
Construction Services Development Board. A total of 60 managers responded, and the completed questionnaires were analyzed using
descriptive analysis. The results indicated that the respondents acknowledge the importance of motivation among themselves to the
success of projects and construction companies, implying that it is vital to maintain the motivation and good performance of the workforce.
Abstract: In this study, the hydrogen transport phenomenon was
numerically evaluated using the hydrogen-enhanced localized
plasticity (HELP) mechanism. Two dominant governing equations,
namely, the hydrogen transport model and the elasto-plastic model,
were introduced. In addition, implicit formulations of the governing
equations were implemented in ABAQUS UMAT user-defined
subroutines. The simulation results were compared to published
results to validate the proposed method.
Abstract: Checkpointing is one of the commonly used techniques to provide fault tolerance in distributed systems, so that the system can operate even if one or more components have failed. However, mobile computing systems are constrained by low bandwidth, mobility, lack of stable storage, frequent disconnections and limited battery life. Hence, checkpointing protocols with fewer synchronization messages and fewer checkpoints are preferred in mobile environments. There are two different, though not orthogonal, approaches to checkpointing mobile computing systems: time-based and index-based. Our protocol is a fusion of these two approaches, though not the first of its kind. In the present exposition, an index-based checkpointing protocol is developed that uses time to indirectly coordinate the creation of consistent global checkpoints for mobile computing systems. The proposed algorithm is non-blocking, adaptive, and does not use any control messages. Compared to other contemporary checkpointing algorithms, it is computationally more efficient because it takes fewer checkpoints and does not need to compute dependency relationships. A brief account of important and relevant work in both fields, time-based and index-based, has also been included in the presentation.
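To illustrate how time can indirectly coordinate index-based checkpoints, the minimal sketch below (the class, timer interval and "forced checkpoint" rule are a generic illustration, not the paper's exact protocol) lets each host checkpoint when its local timer expires and piggyback its checkpoint index on every application message; a receiver that sees a higher index checkpoints before delivery, so no control messages are needed:

```python
class MobileHost:
    """Illustrative time/index hybrid checkpointing host (hypothetical sketch)."""

    def __init__(self, name, interval):
        self.name = name
        self.interval = interval     # local checkpoint timer period
        self.index = 0               # sequence number of the latest checkpoint
        self.next_due = interval
        self.log = []                # (index, reason) checkpoint history

    def tick(self, now):
        # Basic checkpoint: taken independently when the local timer expires,
        # so time loosely synchronizes the hosts without any message exchange.
        if now >= self.next_due:
            self.index += 1
            self.log.append((self.index, "timer"))
            self.next_due = now + self.interval

    def send(self):
        # Every application message piggybacks the sender's current index;
        # no dedicated control messages are exchanged.
        return self.index

    def receive(self, piggybacked_index):
        # Forced checkpoint: a message from a newer checkpoint interval
        # triggers a checkpoint before delivery, keeping the global
        # checkpoint set consistent without blocking the application.
        if piggybacked_index > self.index:
            self.index = piggybacked_index
            self.log.append((self.index, "forced"))
```

Note that no dependency relationships are tracked; consistency follows from comparing indices alone, which is the computational saving the abstract claims.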
Abstract: We present a new algorithm for nonlinear dimensionality reduction that consistently uses global information, and that enables understanding the intrinsic geometry of non-convex manifolds. Compared to methods that consider only local information, our method appears to be more robust to noise. Unlike most methods that incorporate global information, the proposed approach automatically handles non-convexity of the data manifold. We demonstrate the performance of our algorithm and compare it to state-of-the-art methods on synthetic as well as real data.
Abstract: Many extensions have been made to the classic multi-layer perceptron (MLP) model. A notable number of them have been designed to hasten the learning process without considering the quality of generalization. This paper proposes a new MLP extension based on exploiting the topology of the network's input layer. Experimental results show the extended model to improve generalization capability in certain cases. The new model requires additional computational resources compared to the classic model; nevertheless, the loss in efficiency is not regarded as significant.
Abstract: Dill (Anethum graveolens L.) is a popular herb used in
many regions, including Baltic countries. Dill is widely used for
flavoring foods and beverages due to its pleasant spicy aroma. The
aim of this work was to determine the best blanching method for
processing of dill prior to microwave vacuum drying based on
sensory properties, color and volatile compounds in dried product.
Two blanching media were used, water and steam, and for some
samples an additional microwave pretreatment was applied. The
volatile aroma compounds, color changes and sensory attributes of the
dried dill were evaluated. Results showed that blanching significantly
influences the quality of dried dill. After evaluation of the volatile
aroma compounds, color and sensory properties of microwave vacuum
dried dill, blanching at 90 °C for 30 s was established as the best
pretreatment method.
Abstract: Security management has evolved beyond the
management of individual security devices into a useful interface for
the manager. It analyzes the overall security condition of the network
and preserves network services against attacks. Secure router
technology applies security functions, such as intrusion detection,
IPsec (IP Security) and access control, to a legacy router for secure
networking. It controls unauthorized router access and detects illegal
network intrusions. This paper deals with the security engine
management of a router based on a security policy, which is the
definition of the security functions deployed against a network
intrusion. The paper explains the security policy and designs the
structure of a security engine management framework.
Abstract: This paper presents an analytical model to estimate
the cost of an optimized design of a reinforced concrete isolated
footing based on structural safety. Flexural and optimization formulas
for square and rectangular footings are derived based on the ACI
building code of design, material cost and optimization. The
optimization constraints consist of upper and lower limits on the
depth and the area of steel. The footing depth and area of reinforcing
steel are minimized to yield the optimal footing dimensions. The
optimized material costs of concrete, reinforcing steel and formwork
for the designed sections are computed. A total cost factor (TCF) and
other cost factors are developed to generalize and simplify the
calculation of footing material cost. Numerical examples are
presented to illustrate the model's capability of estimating the
material cost of the footing for a desired axial load.
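The minimization described above can be pictured as a bounded search over depth and steel area. The sketch below uses made-up unit costs, a fixed square side and a placeholder safety check (`capacity_ok`), none of which come from the paper's ACI-derived formulas:

```python
def footing_cost(depth_m, steel_area_cm2, side_m=2.0,
                 c_conc=100.0, c_steel=1.2, c_form=25.0):
    """Material cost = concrete + reinforcing steel + formwork.

    All unit costs and the 2 m square side are illustrative placeholders."""
    concrete = side_m * side_m * depth_m * c_conc             # volume * cost/m^3
    steel = steel_area_cm2 * 1e-4 * side_m * 7850 * c_steel   # bar weight (kg) * cost/kg
    formwork = 4 * side_m * depth_m * c_form                  # side forms * cost/m^2
    return concrete + steel + formwork

def optimize_footing(depths, areas, capacity_ok):
    """Grid search over bounded depth/steel grids, keeping only safe designs."""
    feasible = [(footing_cost(d, a), d, a)
                for d in depths for a in areas if capacity_ok(d, a)]
    return min(feasible)  # (cost, depth, steel area) of the cheapest safe design
```

In the paper the constraints come from the ACI flexural formulas; here the safety check is deliberately left as a user-supplied predicate to keep the sketch self-contained.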
Abstract: This paper describes a new parallel sorting algorithm,
based on Odd-Even Mergesort, called division and concurrent
mixes. The main idea of the algorithm is that each processor uses a
sequential algorithm to sort a section of the vector, and the
processors then work in pairs to merge two of these sorted sections
into a larger one, also sorted; after several iterations, the vector is
completely sorted. The paper describes the implementation of the new
algorithm in a message passing environment (MPI). It also compares
the experimental results obtained with the sequential quicksort
algorithm and with parallel implementations (also on MPI) of
quicksort and bitonic sort. The comparison was carried out on an
8-process cluster under GNU/Linux running on a single PC processor.
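The two phases described above can be simulated sequentially in a few lines. The chunking and pairing below are illustrative (the real implementation distributes the sections over MPI processes), but the structure is the one the abstract names: local sorts, then pairwise merges until one sorted vector remains:

```python
import heapq

def division_and_concurrent_mixes(data, p):
    """Sequential simulation of the two-phase parallel sort (illustrative)."""
    # Phase 1: split the vector into p sections; each "processor" sorts its
    # own part with a sequential algorithm.
    n = len(data)
    chunk = -(-n // p)  # ceiling division, so all elements are covered
    sections = [sorted(data[i * chunk:(i + 1) * chunk]) for i in range(p)]
    sections = [s for s in sections if s]  # drop empty tail sections
    # Phase 2: "processors" work in pairs, merging two sorted sections into
    # a larger sorted one; each round halves the number of sections.
    while len(sections) > 1:
        sections = [list(heapq.merge(*sections[i:i + 2]))
                    for i in range(0, len(sections), 2)]
    return sections[0]
```

Each merge round corresponds to one step of the parallel "mix" phase, so with p processes the merge tree finishes in about log2(p) rounds.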
Abstract: This paper presents a new and efficient approach for
capacitor placement in radial distribution systems that determines
the optimal locations and sizes of capacitors with the objectives of
improving the voltage profile and reducing power loss. The
solution methodology has two parts: in part one, loss sensitivity
factors are used to select candidate locations for capacitor
placement, and in part two, a new algorithm that employs the Plant
Growth Simulation Algorithm (PGSA) is used to estimate the optimal
size of the capacitors at the optimal buses determined in part one. The
main advantage of the proposed method is that it does not require any
external control parameters. Another advantage is that it handles the
objective function and the constraints separately, avoiding the need
to determine barrier factors. The proposed method is applied to 9-
and 34-bus radial distribution systems. The solutions obtained by the
proposed method are compared with those of other methods, and the
proposed method outperforms them in terms of the quality of the
solution.
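Part one of the methodology can be sketched with the loss sensitivity form commonly used for radial feeders, dPloss/dQ = 2*Q_eff*R/V^2 per branch; the branch data below are invented and the formula is the standard one from the capacitor-placement literature, not necessarily the exact expression used in this paper:

```python
def candidate_buses(branches):
    """Rank receiving-end buses by loss sensitivity dPloss/dQ = 2*Q*R/V^2.

    branches: list of (bus, R_ohm, Q_eff_kvar, V_pu) tuples (illustrative
    data layout). Returns bus numbers, most sensitive first, as the
    candidate locations for capacitor placement."""
    ranked = sorted(branches,
                    key=lambda b: 2 * b[2] * b[1] / (b[3] ** 2),
                    reverse=True)
    return [bus for bus, _, _, _ in ranked]
```

The buses at the top of this ranking are the ones where injected reactive power reduces losses fastest, which is why they are handed to the PGSA sizing stage.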
Abstract: Low power consumption is a major constraint for battery-powered systems such as notebook computers or PDAs. In the past, specialists usually designed both specifically optimized hardware and code to address this concern. That approach worked for quite a long time; in this era, however, there is another significant constraint: time to market. To meet the power constraint while launching products in shorter production cycles, object-oriented programming (OOP) has stepped into this field. Although OOP has considerably more overhead than assembly and procedural languages, the development trend still heads toward this new world, which conflicts with the goal of low power consumption. Most prior power-related software research reported that OOP consumes many resources; however, as industry had to accept it for business reasons, no paper has yet addressed how to choose the best OOP practice within this power-limited boundary. This article is a pioneering attempt to specify and propose an optimized strategy for writing OOP software in energy-constrained environments, based on quantitative real-world results. The language chosen for the study is C# on .NET Framework 2.0, one of the popular OOP development environments. The recommendations obtained from this research provide a roadmap that can help developers write code that balances time to market against battery life.
Abstract: This paper deals with environmental metrics and assessment systems devoted to Small and Medium-Sized Enterprises (SMEs). The authors present a proposed assessment model that is able to discover the current environmental strengths and weaknesses of an SME. The suggested model also has the ambition to become a sustainability decision tool. The model is able to identify the "best environmental decision" in the company and to quantify how this decision contributed to the overall environmental improvement. The authors understand environmental improvements as environmental innovations (product, process and organizational). The suggested model is based on its own concept; however, the authors also utilize already existing environmental assessment tools.
Abstract: In order to encourage the construction of green homes
(GH) in Malaysia, a simple and attainable framework for designing
and building GHs is needed. This can be achieved by aligning GH
principles against Cole's 'Sustainable Building Criteria' (SBC). This
set of considerations was used to categorize the GH features of three
case studies from Malaysia. Although the categorization of building
features is useful at exploring the presence of sustainability
inclinations of each house, the overall impact of the building features
in each of the five SBCs is unknown. Therefore, this paper explored
the possibility of quantifying the impact of building features
categorized in SBC1 – "Buildings will have to adapt to the new
environment and restore damaged ecology while mitigating resource
use" – based on existing GH assessment tools and methods and other
literature. This process as reported in this paper could lead to a new
dimension in green home rating and assessment methods.
Abstract: Restoration of endodontically treated teeth is a
common problem in dentistry, related to the fractures occurring in
such teeth and to the concentration of forces; little information has
been available regarding the effect of variations in basic preparation
guidelines on stress distribution. To date, there is still no agreement
in the literature about which material or technique can optimally
restore endodontically treated teeth. The aim of the present study was
to evaluate the influence of the core height and restoration materials
on a corono-radicular restored upper first premolar. The first step of
the study was to build 3D models in order to analyze the teeth, dowel
and core restorations and overlying full ceramic crowns. The FEM
model was obtained by importing the solid model into the ANSYS
finite element analysis software. An occlusal load of 100 N was
applied, and the stresses occurring in the restorations and tooth
structures were calculated. The numerical simulations provide a
biomechanical explanation for the stress distribution in prosthetically
restored teeth. Within the limitations of the present study, it was
found that the core height has no important influence on the stress
generated in corono-radicular restored premolars. It can be concluded
that the cervical regions of the teeth and restorations were subjected
to the highest stress concentrations.
Abstract: Waiting times and queues are a daily problem for theme parks. Fast lines or priority queues appear as a solution for a specific segment of customers, that is, tourists who are willing to pay to avoid waiting. This paper analyzes the fast line system and explores the factors that affect the decision to purchase a fast line pass. A greater understanding of these factors may help companies to design appropriate products and services. This conceptual paper was based on a literature review in marketing and consumer behavior. Additional research was identified in related disciplines such as leisure studies, psychology, and sociology. A conceptual framework of the factors influencing the decision to purchase a fast line pass is presented.
Abstract: We describe a formal specification and verification of the Rabin public-key scheme in the formal proof system Isabelle/HOL. The idea is to use the two views of cryptographic verification: the computational approach, relying on the vocabulary of probability theory and complexity theory, and the formal approach, based on ideas and techniques from logic and programming languages. The analysis presented uses a given database to prove formal properties of our implemented functions with computer support. The main task in designing a practical formalization of correctness as well as security properties is to cope with the complexity of cryptographic proving. We reduce this complexity by exploring a lightweight formalization that enables both appropriate formal definitions and efficient formal proofs. This yields the first computer-proved implementation of the Rabin public-key scheme in Isabelle/HOL. Consequently, we get reliable proofs with a minimal error rate, augmenting the used database. This provides a formal basis for further computer proof constructions in this area.
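For reference, the textbook Rabin scheme being formalized can be sketched in a few lines. This is the standard construction with primes p, q congruent to 3 mod 4 and no padding or redundancy; it illustrates the scheme itself, not the Isabelle/HOL development:

```python
def rabin_encrypt(m, n):
    """Rabin encryption: squaring modulo the public key n = p*q."""
    return (m * m) % n

def rabin_decrypt(c, p, q):
    """Return the four square roots of c mod n, for p, q = 3 (mod 4)."""
    n = p * q
    # Square roots modulo each prime: exponent (p+1)/4 works when p = 3 mod 4.
    mp = pow(c, (p + 1) // 4, p)
    mq = pow(c, (q + 1) // 4, q)
    # Combine the per-prime roots with the Chinese Remainder Theorem.
    yp = pow(p, -1, q)   # p^-1 mod q (Python 3.8+ modular inverse)
    yq = pow(q, -1, p)   # q^-1 mod p
    r = (yq * q * mp + yp * p * mq) % n
    s = (yq * q * mp - yp * p * mq) % n
    return {r, n - r, s, n - s}
```

Decryption is ambiguous among the four roots; in practice, redundancy is embedded in the plaintext so that the correct root can be recognized, and the hardness of picking roots without p and q is exactly the factoring-based security property a formal proof must capture.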
Abstract: Diagnosis and detection of arterial stiffness is
very important, as it gives an indication of the associated increased
risk of cardiovascular disease. To provide a cheap and easy general
screening technique that helps avoid future cardiovascular
complications due to rising arterial stiffness, an algorithm based on
the photoplethysmogram is proposed. The photoplethysmograph
signals are processed in MATLAB: the signal is filtered, baseline
wander is removed, peaks and valleys are detected, and the signals
are normalized. The area under the catacrotic phase of the
photoplethysmogram pulse curve is calculated using the trapezoidal
rule and is then used, together with other parameters such as age,
height and blood pressure, in a neural network for arterial stiffness
detection. The neural network achieved a sensitivity of 80%, an
accuracy of 85% and a specificity of 90% on the patient data. It is
concluded that a neural network can detect arterial stiffness from
these risk factor parameters.
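The area feature can be illustrated with a plain trapezoidal-rule sketch. Defining the catacrotic phase as running from the systolic peak to the end of the pulse, and feeding in a synthetic pulse, are simplifying assumptions here; the real pipeline filters, de-trends and normalizes the PPG first:

```python
def catacrotic_area(pulse, fs):
    """Trapezoidal-rule area from the systolic peak to the end of the pulse.

    pulse: one PPG pulse as a list of samples; fs: sampling rate in Hz."""
    peak = max(range(len(pulse)), key=pulse.__getitem__)  # systolic peak index
    seg = pulse[peak:]                                    # catacrotic (falling) phase
    dt = 1.0 / fs
    # Trapezoidal rule: average neighbouring samples, weighted by the time step.
    return sum((seg[i] + seg[i + 1]) / 2.0 * dt for i in range(len(seg) - 1))
```

The resulting scalar, alongside age, height and blood pressure, would form one row of the feature vector given to the classifier.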
Abstract: A novel and efficient approach to realizing
fractional-order capacitors is investigated in this paper. In particular,
the proposed approach is well suited to semiconductor
implementation of fractional-order capacitors. The feasibility of the
approach has been verified with preliminary measured results.
Abstract: Dynamic speckle, or biospeckle, is an interference
phenomenon generated by the reflection of coherent light from an
active surface, or even from a particulate or living body surface. This
phenomenon gave scientific support to a method named biospeckle,
which has been employed to study seed viability, biological activity,
tissue senescence, tissue water content, fruit bruising, etc. Since the
method is non-invasive and yields numerical values, it can be
considered for possible automation of several processes, including
selection and sorting. Based on these preliminary considerations, this
work studied the interaction of a laser beam with vegetative samples
by measuring the incident light intensity and the transmitted light
intensity through vegetative slabs of varying thickness. Tests were
carried out on fifteen slices of apple tissue divided into thickness
groups of 4 mm, 5 mm, 18 mm and 22 mm. A 10 mW diode laser
beam of 632 nm wavelength and a Samsung digital camera were
employed to carry out the tests. The outgoing images were analyzed
by comparing the gray gradient of a fixed image column in each
image to obtain a laser penetration scale into the tissue as a function
of slice thickness.