Abstract: This research is designed to help WAP-based mobile phone users analyze traffic logistics by designing and implementing the access processes from the mobile user to the server databases. The design comprises a MySQL 4.1.8-nt database server with three sub-databases: traffic-light times at intersections during periods of the day, road distances within the area blocks divided from the main sample area, and speeds of sample vehicles (motorcycle, personal car and truck) during periods of the day. For the interconnection between server and user, PHP is used to calculate distances and travelling times from the starting point to the destination, while XHTML is applied for receiving, sending and displaying data from PHP to the user's mobile phone. The main sample area of this research is the Huakwang-Ratchada area, Bangkok, Thailand, a usually congested point, together with its 6.25 km2 surrounding area, which is split into 25 blocks of 0.25 km2 each. To simulate the results, the designed server database and all communication models of this research were uploaded to www.utccengineering.com/m4tg and accessed with a mobile phone supporting a WAP 2.0 XHTML/HTML multimode browser to observe values and displayed pictures. According to the simulated results, the user can check route pictures from the requested point to the destination, along with the analyzed travel times of the sample vehicles in various periods of the day.
Abstract: Functional imaging procedures for the non-invasive assessment of tissue microcirculation are in high demand, but they require a mathematical approach describing the trans- and intercapillary passage of tracer particles. Up to now, two seemingly different theoretical concepts have been established for tracer-kinetic modeling of contrast-agent transport in tissues: pharmacokinetic compartment models, which are usually written as coupled differential equations, and the indicator dilution theory, which can be generalized in accordance with the theory of linear time-invariant (LTI) systems by using a convolution approach. Based on mathematical considerations, it can be shown that, also in the case of the open two-compartment model well known from functional imaging, the concentration-time course in tissue is given by a convolution, which allows a separation of the arterial input function from a system function (the impulse response function) summarizing the available information on tissue microcirculation. For this reason, it is possible to integrate the open two-compartment model into the system-theoretic concept of indicator dilution theory (IDT), and thus results known from IDT remain valid for the compartment approach. Given the large number of applications of compartmental analysis, similar solutions of the so-called forward problem, even in a more general context, can already be found in the extensive literature of the seventies and early eighties. Nevertheless, to this day, within the field of biomedical imaging (though not from the mathematical point of view) there seems to be a gap between the two approaches, which the author would like to bridge through an exemplary analysis of the well-known model.
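The convolution relationship described above can be illustrated with a short numerical sketch: the tissue concentration-time course is computed as the convolution of an arterial input function with an impulse response function. The gamma-variate input, the mono-exponential impulse response, and the rate constants `K_trans` and `k_ep` below are hypothetical choices for illustration, not values from the abstract.

```python
# Illustrative sketch (not the paper's data): tissue concentration-time
# course of an open two-compartment model as a convolution of an arterial
# input function (AIF) with an impulse response (residue) function.
import math

def convolve(a, b, dt):
    """Discrete approximation of the convolution (a * b)(t) on a uniform grid."""
    n = len(a)
    return [dt * sum(a[j] * b[i - j] for j in range(i + 1)) for i in range(n)]

dt = 0.1                                  # time step (hypothetical units)
t = [i * dt for i in range(200)]

# Hypothetical gamma-variate AIF and mono-exponential impulse response
aif = [(ti ** 2) * math.exp(-ti / 2.0) for ti in t]
K_trans, k_ep = 0.2, 0.1                  # assumed rate constants
impulse = [K_trans * math.exp(-k_ep * ti) for ti in t]

tissue = convolve(aif, impulse, dt)       # concentration-time course in tissue
```

Because the model enters only through the impulse response, swapping in a different residue function leaves the convolution machinery unchanged, which is exactly the separation the abstract emphasizes.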
Abstract: Web applications have become very complex and crucial, especially when combined with areas such as CRM (Customer Relationship Management) and BPR (Business Process Reengineering), and the scientific community has focused its attention on the design, development, analysis, and testing of Web applications by studying and proposing methodologies and tools. This paper proposes an approach to automatic multi-dimensional concern mining for Web applications, based on concept analysis, impact analysis, and token-based concern identification. This approach lets the user analyse and traverse the Web software relevant to a particular concern (concept, goal, purpose, etc.) via multi-dimensional separation of concerns, in order to document, understand and test Web applications. The technique was developed in the context of the WAAT (Web Applications Analysis and Testing) project. A semi-automatic tool to support it is currently under development.
Abstract: The purpose of this study is to identify and evaluate
the scale of implementation of Just-In-Time (JIT) in the different industrial sectors in the Middle East. This study analyzes the empirical data collected by a questionnaire survey distributed to
companies in three main industrial sectors in the Middle East, which
are: food, chemicals and fabrics. The following main hypothesis is formulated and tested: the requirements of JIT application differ according to the type of industrial sector. Descriptive statistics and Box-plot analysis were used to examine the hypothesis. This study provides reasonable evidence for accepting the main hypothesis. It reveals that there is no standard way to adopt JIT as a production system; rather, each industrial sector should concentrate its investment on the critical requirements, which differ according to the nature and strategy of production followed in that sector.
Abstract: In this paper, we present a methodology for finding
authoritative researchers by analyzing academic Web sites. We show
a case study in which we concentrate on a set of Czech computer science departments' Web sites. We analyze the relations between
them via hyperlinks and find the most important ones using several
common ranking algorithms. We then examine the contents of the
research papers present on these sites and determine the most
authoritative Czech authors.
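The hyperlink-ranking step mentioned above can be sketched with a minimal PageRank power iteration. The four-site link graph and the damping factor below are illustrative assumptions, not data from the study.

```python
# Minimal PageRank sketch over a hypothetical hyperlink graph between sites.
def pagerank(links, d=0.85, iters=100):
    """links: dict mapping each node to the list of nodes it links to."""
    nodes = list(links)
    n = len(nodes)
    rank = {u: 1.0 / n for u in nodes}
    for _ in range(iters):
        new = {u: (1.0 - d) / n for u in nodes}
        for u in nodes:
            out = links[u]
            if out:
                share = d * rank[u] / len(out)
                for v in out:
                    new[v] += share
            else:                          # dangling node: spread rank evenly
                for v in nodes:
                    new[v] += d * rank[u] / n
        rank = new
    return rank

# Hypothetical link graph between four department sites
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
ranks = pagerank(graph)
most_important = max(ranks, key=ranks.get)   # "C" receives the most links
```

Other common ranking algorithms (e.g. HITS) would consume the same adjacency structure, which is why the abstract can compare several of them on one crawl.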
Abstract: Despite the various methods that exist for software risk management, software projects have a high failure rate. As the complexity and size of projects increase, managing software development becomes more difficult, and in these projects the need for more analysis and risk assessment is vital. In this paper, a classification of software risks is specified, and the relations between these risks are presented using a risk tree structure. Analysis and assessment of these risks are carried out using probabilistic calculations. This analysis supports qualitative and quantitative assessment of the risk of failure, and it can aid the software risk management process. The classification and risk tree structure can be applied in software tools.
Abstract: The objective of this paper is to present the
development of the frames of the Chulalongkorn University team for the TSAE Auto Challenge Student Formula and the Student Formula SAE Competition of Japan. Chulalongkorn University's SAE team was established in 2003, has joined many competitions since 2006, and has become the leading team in Thailand. Over these five years, the space frame was the most frequently selected concept and was developed year by year through six frame designs. In this paper, the conceptual design of these frames is discussed, focusing on improvement of mass and torsional stiffness. Torsional stiffness tests were performed on the actual frames used, and the results are compared. The 2010-2011 frame is the first to be designed on the basis of analysis and experiment that considered the required mass and torsional stiffness. From the torsional stiffness results, it can be concluded that the frames were progressively improved, with decreasing mass and increasing torsional stiffness achieved by applying many techniques.
Abstract: This paper focuses on cost and profit analysis of
a single-server Markovian queuing system with two priority classes. In
this paper, functions of total expected cost, revenue and profit of the
system are constructed and optimized with respect to the service rates of the lower- and higher-priority classes. A computing algorithm has been developed, on the basis of a fast-converging numerical method, to solve the system of nonlinear equations formed from the mathematical analysis. A novel performance measure for cost and profit analysis, in view of its economic interpretation for the system with priority classes, is discussed in this paper. On the basis of the computed tables, observations are also drawn to highlight the effect of varying the model parameters involved.
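The kind of cost construction described above can be sketched with the standard nonpreemptive-priority M/M/1 waiting-time formulas. The holding- and service-cost coefficients below are hypothetical, and the paper's actual model and numerical method may differ.

```python
# Hedged sketch: expected waiting times for a single-server nonpreemptive
# priority queue with two classes (class 1 = higher priority), using the
# standard M/G/1 priority formulas specialized to exponential service.
def priority_waits(lam1, lam2, mu1, mu2):
    rho1, rho2 = lam1 / mu1, lam2 / mu2
    assert rho1 + rho2 < 1, "system must be stable"
    r = lam1 / mu1**2 + lam2 / mu2**2          # mean residual service time
    w1 = r / (1 - rho1)                        # class-1 mean queueing delay
    w2 = r / ((1 - rho1) * (1 - rho1 - rho2))  # class-2 mean queueing delay
    return w1, w2

def total_cost(lam1, lam2, mu1, mu2, hold=(2.0, 1.0), serv=0.5):
    """Holding cost per waiting customer plus a cost on the service rates.
    The coefficients are hypothetical, for illustration only."""
    w1, w2 = priority_waits(lam1, lam2, mu1, mu2)
    l1, l2 = lam1 * w1, lam2 * w2              # Little's law: mean queue lengths
    return hold[0] * l1 + hold[1] * l2 + serv * (mu1 + mu2)

w1, w2 = priority_waits(0.3, 0.3, 1.0, 1.0)
cost = total_cost(0.3, 0.3, 1.0, 1.0)
```

A cost function of this shape is smooth in the service rates, which is what makes optimization over (mu1, mu2), as the abstract describes, a nonlinear but tractable problem.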
Abstract: The frequency contents of non-stationary signals vary with time. For proper characterization of such
signals, a smart time-frequency representation is necessary.
Classically, the STFT (short-time Fourier transform) is employed for this purpose. Its limitation is a fixed time-frequency resolution. To overcome this drawback, an enhanced STFT version is devised. It is based on a signal-driven sampling scheme, termed cross-level sampling. It can adapt the sampling frequency and the window function (length plus shape) by following the local variations of the input signal. This adaptation gives the proposed technique its appealing features: adaptive time-frequency resolution and computational efficiency.
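A minimal sketch of the signal-driven sampling idea is given below: a sample is emitted whenever the signal crosses one of a set of reference levels, so fast-varying segments are sampled densely and slow segments sparsely. The level set, scan step, and test chirp are assumptions for illustration, not the authors' exact scheme.

```python
# Illustrative level-crossing sampling: record (t, value) at each crossing
# of a reference level, scanning the signal on a fine uniform grid.
import math

def level_crossing_sample(signal, t0, t1, dt, levels):
    samples = []
    prev = signal(t0)
    t = t0 + dt
    while t <= t1:
        cur = signal(t)
        for lv in levels:
            if (prev - lv) * (cur - lv) < 0:   # sign change => level crossed
                samples.append((t, cur))
                break
        prev, t = cur, t + dt
    return samples

# A chirp: instantaneous frequency 1 + 6t grows with time, so crossings
# (and hence samples) become denser in the second half of the interval.
chirp = lambda t: math.sin(2 * math.pi * (1.0 + 3.0 * t) * t)
pts = level_crossing_sample(chirp, 0.0, 2.0, 1e-3, levels=[-0.5, 0.0, 0.5])
first_half = sum(1 for t, _ in pts if t < 1.0)
second_half = len(pts) - first_half
```

The denser sample stream in fast segments is what lets an STFT built on top of such samples shorten its window where the signal changes quickly, which is the adaptation the abstract claims.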
Abstract: This paper maps the structure of the social network of
the 2011 class of sixty graduate students of the Masters of Science
(Knowledge Management) programme at the Nanyang Technological
University, based on their friending relationships on Facebook. To
ensure anonymity, actual names were not used. Instead, they were
replaced with codes constructed from their gender, nationality, mode
of study, year of enrollment and a unique number. The relationships
between friends within the class, and among the seniors and alumni of the programme, were plotted. UCINet and Pajek were used to plot
the sociogram, to compute the density, inclusivity, and degree,
global, betweenness, and Bonacich centralities, to partition the
students into two groups, namely, active and peripheral, and to
identify the cut-points. Homophily was investigated, and it was
observed for nationality and study mode. The groups the students formed on Facebook were also studied; of fifteen groups, eight were classified as dead, which we defined as those that had been inactive for over two months.
Abstract: We describe a novel method for removing noise of unknown variance from microarrays in the wavelet domain. The method is based on smoothing the coefficients of the highest subbands. Specifically, we decompose the noisy microarray into wavelet subbands, apply smoothing within each highest subband, and reconstruct a microarray from the modified wavelet coefficients. This process is applied a single time, and exclusively to the first level of decomposition; i.e., in most cases, a multiresolution analysis is not necessary. Denoising results compare favorably with most methods currently in use.
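A one-dimensional sketch of the described pipeline (single-level wavelet decomposition, smoothing of the highest subband, reconstruction) is shown below using a Haar transform. The 3-tap moving average used for smoothing is an assumed stand-in for the paper's smoothing operator, and the paper itself works on 2-D microarray images rather than 1-D signals.

```python
# 1-D illustration: single-level Haar decomposition, smooth the detail
# (highest-subband) coefficients, then invert the transform.
def haar_denoise(x):
    assert len(x) % 2 == 0
    s = 2 ** -0.5
    approx = [s * (x[i] + x[i + 1]) for i in range(0, len(x), 2)]
    detail = [s * (x[i] - x[i + 1]) for i in range(0, len(x), 2)]
    # smooth the detail coefficients with a 3-tap moving average (assumed)
    n = len(detail)
    smoothed = [(detail[max(i - 1, 0)] + detail[i] + detail[min(i + 1, n - 1)]) / 3.0
                for i in range(n)]
    # inverse single-level Haar transform
    out = []
    for a, d in zip(approx, smoothed):
        out.extend([s * (a + d), s * (a - d)])
    return out

noisy = [1.0, 1.2, 0.9, 1.1, 5.0, 1.0, 1.05, 0.95]
clean = haar_denoise(noisy)
```

Because only the detail coefficients are modified, the local averages (the approximation subband) are preserved exactly, which is why a single decomposition level suffices for this style of denoising.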
Abstract: The main aim of this study is to identify the most
influential variables that cause defects on the items produced by a
casting company located in Turkey. To this end, one of the items
produced by the company, with a high defective percentage rate, is selected. Two approaches, regression analysis and decision trees, are used to model the relationship between process parameters and defect types. Although the logistic regression models failed, the decision tree model gives meaningful results. Based on these results, it can be
claimed that the decision tree approach is a promising technique for
determining the most important process variables.
Abstract: This paper aims to select the optimal location and
setting parameters of TCSC (Thyristor Controlled Series
Compensator) controller using Particle Swarm Optimization (PSO)
and Genetic Algorithm (GA) to mitigate small signal oscillations in a
multimachine power system. Though Power System Stabilizers (PSSs) are the prime choice for this task, installation of a FACTS device is suggested here in order to achieve appreciable damping of system oscillations. However, the performance of any FACTS device depends highly upon its parameters and its location in the power network. In this paper, PSO- and GA-based techniques are applied separately and their performances compared on this problem. The results of the small-signal stability analysis are presented using eigenvalues as well as time-domain responses in the face of two common power system disturbances: varying load and transmission-line outage. It is revealed that the PSO-based TCSC controller is more effective than the GA-based controller, even under critical loading conditions.
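The PSO component can be sketched generically as below. The power-system objective (e.g. a damping-ratio criterion over the TCSC location and setting) is replaced here by a simple sphere test function, and the inertia and acceleration constants are typical textbook values, not the paper's settings.

```python
# Generic particle swarm optimization sketch (minimization).
import random

def pso(objective, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], bounds[d][0]),
                                bounds[d][1])            # clamp to bounds
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(1)
# Sphere function as a stand-in; two parameters, where e.g. a TCSC location
# index and compensation level would appear in the paper's setting.
best, best_val = pso(lambda p: sum(x * x for x in p), [(-5, 5), (-5, 5)])
```

A GA-based variant would replace the velocity update with selection, crossover and mutation over the same bounded parameter vector, which is what makes the side-by-side comparison in the abstract straightforward.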
Abstract: In a previous study of a new metal gasket, contact width and contact stress were identified as important design parameters for optimizing metal gasket performance. However, the range of contact stress had not been investigated thoroughly. In this study, we conducted a gasket design optimization based on elastic and plastic contact stress analyses, considering the forming effect, using FEM. The gasket model was simulated in two stages: a forming simulation and a tightening simulation. The optimum designs based on elastic and on plastic contact stress were found. The final evaluation was a helium leak test to check the leakage performance of both types of gasket. The helium leak test shows that the gasket based on the plastic contact stress design performs better than the one based on the elastic stress design.
Abstract: A generalization of the concept of Feistel Networks (FN), known as the Extended Feistel Network (EFN), is examined. An EFN splits the input block into n > 2 sub-blocks. Like a conventional FN, an EFN consists of a series of rounds whereby at least one sub-block is subjected to an F-function. The function plays a key role in the diffusion process due to its completeness property. It is also important to note that in an EFN the F-function is the most computationally expensive operation in a round. The aim of this paper is to determine a suitable type of EFN for a scalable cipher. This is done by analyzing the threshold number of rounds required for different types of EFN to achieve the completeness property, as well as the number of F-functions required in the network. The work focuses on EFN-Type I, Type II and Type III only. In the analysis, it is found that EFN-Type II and Type III diffuse at the same rate, and both diffuse faster than EFN-Type I. Since EFN-Type II uses fewer F-functions than EFN-Type III, Type II is the most suitable EFN for use in a scalable cipher.
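One round of a Type-II EFN on n = 4 sub-blocks can be sketched as follows. The keyed mixing function `F` below is a toy stand-in (the analysis above concerns the diffusion structure, not any particular F), and the round keys are arbitrary illustrative constants.

```python
# Toy sketch of a Type-II Extended Feistel Network on four 32-bit sub-blocks.
MASK = 0xFFFFFFFF

def F(x, k):
    """Hypothetical keyed mixing function on a 32-bit word (not from the paper)."""
    x = (x + k) & MASK
    x ^= (x << 7 | x >> 25) & MASK
    return x & MASK

def efn2_round(blocks, round_key):
    """Type-II round: F of each even-indexed sub-block is XORed into its
    neighbor, followed by a cyclic left rotation of the sub-blocks."""
    b = list(blocks)
    for i in range(0, len(b), 2):
        b[i + 1] ^= F(b[i], round_key)
    return b[1:] + b[:1]

def efn2_encrypt(blocks, round_keys):
    for k in round_keys:
        blocks = efn2_round(blocks, k)
    return blocks

state = efn2_encrypt([1, 2, 3, 4], round_keys=[0xA5A5A5A5, 0x3C3C3C3C])
```

With n sub-blocks, each Type-II round applies n/2 F-functions, versus n - 1 for Type III; that cost difference at equal diffusion rate is the basis for the abstract's conclusion in favor of Type II.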
Abstract: Employing a recently introduced unified adaptive filter
theory, we show how the performance of a large number of important
adaptive filter algorithms can be predicted within a general framework
in a nonstationary environment. The approach is based on energy-conservation arguments and does not need to assume a Gaussian or white distribution for the regressors. This general performance analysis can
be used to evaluate the mean square performance of the Least Mean
Square (LMS) algorithm, its normalized version (NLMS), the family
of Affine Projection Algorithms (APA), the Recursive Least Squares
(RLS), the Data-Reusing LMS (DR-LMS), its normalized version
(NDR-LMS), the Block Least Mean Squares (BLMS), the Block
Normalized LMS (BNLMS), the Transform Domain Adaptive Filters
(TDAF) and the Subband Adaptive Filters (SAF) in a nonstationary environment. We also establish the general expressions for the steady-state excess mean square error in this environment for all of these adaptive algorithms. Finally, we demonstrate through simulations that
these results are useful in predicting the adaptive filter performance.
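As a concrete instance of the family analyzed above, a minimal LMS sketch in a system-identification setting is given below; the three-tap plant and the step size `mu` are hypothetical choices for illustration.

```python
# Minimal LMS adaptive filter: identify an unknown FIR plant from its
# input/output data using the update w <- w + mu * e * u.
import random

def lms_identify(x, d, taps, mu):
    w = [0.0] * taps
    for n in range(taps - 1, len(x)):
        u = [x[n - k] for k in range(taps)]        # regressor [x[n], x[n-1], ...]
        y = sum(wi * ui for wi, ui in zip(w, u))   # filter output
        e = d[n] - y                               # a-priori estimation error
        w = [wi + mu * e * ui for wi, ui in zip(w, u)]
    return w

random.seed(0)
plant = [0.5, -0.3, 0.1]                           # hypothetical unknown system
x = [random.gauss(0, 1) for _ in range(5000)]      # white Gaussian input
d = [sum(h * x[n - k] for k, h in enumerate(plant) if n - k >= 0)
     for n in range(len(x))]                       # noiseless plant output
w = lms_identify(x, d, taps=3, mu=0.05)            # w converges toward plant
```

Normalized, data-reusing, block and transform-domain variants listed in the abstract all modify this same error-driven update, which is what a unified energy-conservation analysis exploits.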
Abstract: The aim of this study was to estimate the frequency of
EBV infection in Hodgkin's lymphoma (HL) and non-Hodgkin's
lymphoma (NHL) occurring in Jordanian patients. A total of 55
patients with lymphoma were examined in this study. Of 55 patients,
30 and 25 were diagnosed as HL and NHL, respectively. All four HL subtypes were observed, with the majority of cases exhibiting the mixed cellularity (MC) subtype, followed by nodular sclerosis (NS). The high grade was found to be the commonest subtype of
NHL in our sample, followed by the low grade. The presence of EBV
virus was detected by immunostaining for expression of latent membrane protein-1 (LMP-1). LMP-1 expression occurred more frequently in patients with HL (60.0%) than in patients with NHL (32.0%). The frequency of LMP-1 expression was also higher in patients with the MC subtype (61.11%) than in those with
NS (28.57%). No age or gender difference in the occurrence of EBV infection was observed among patients with HL. By contrast, the
prevalence of EBV infection in NHL patients aged below 50 was
lower (16.66%) than in NHL patients aged 50 or above (46.15%). In
addition, EBV infection was more frequent in females with NHL (38.46%) than in males with NHL (25%). In NHL cases, the frequency of EBV infection in the intermediate grade (60.0%) was high compared with that in the low (25%) or high (25%) grades.
In conclusion, analysis of LMP-1 expression indicates an important
role for this viral oncogene in the pathogenesis of EBV-associated
malignant lymphomas. These data also support the previous findings
that people with EBV infection may develop lymphoma, and that monitoring for lymphoma should be considered for people with EBV infection.
Abstract: The use of mechanical simulation (in particular, finite element analysis) requires the management of assumptions in order to analyse a real, complex system. In finite element analysis (FEA), two modeling steps require assumptions in order to carry out the computations and obtain results: the building of the physical model and the building of the simulation model. The simplification assumptions made on the analysed system in these two steps can generate two kinds of errors: physical modeling errors (mathematical model, domain simplifications, material properties, boundary conditions and loads) and mesh discretization errors. This paper proposes a mesh-adaptive method, based on an h-adaptive scheme combined with an error estimator, for choosing the mesh of the simulation model. This method allows the mesh of the simulation model to be chosen so as to control both the cost and the quality of the finite element analysis.
Abstract: The Chichiawan stream in the Wulin catchment in
Taiwan is the natural habitat of the Formosan landlocked salmon. Human and agricultural activities gradually worsen water quality and negatively impact the fish habitat. To protect and manage the Formosan
landlocked salmon habitat, it is important to understand how various land uses affect the watershed's responses to storms. This study discusses the watershed's responses to the dry days before a storm event and to various land uses in the Wulin catchment. Under the land-use planning in the Wulin catchment, the peak flows during typhoon events show no noticeable difference. However, the nutrient
exports can be highly reduced under the strategies of restraining
agriculture activities. Due to the higher affinity of P for soil than that
of N, the exports of TN from the overall Wulin catchment were much greater than those of Ortho-P. Agriculture is mainly centralized in subbasin A, which is an important source of nutrients in nonpoint-source discharge. Subbasin A supplied about 26% of the TN and 32% of the Ortho-P discharge in 2004, despite covering only 19% of the area of the Wulin catchment. The subbasin analysis showed that the agricultural subbasin A exports more nutrients per unit area than the other, forested subbasins. Additionally, the agricultural subbasin A contributed a higher percentage of the total Ortho-P exports than of the TN exports. The results of the subbasin analysis might imply that the transport of Ortho-P was similar to that of particulate matter, which is mainly influenced by runoff and affected by desorption from soil particles, while the TN (dominated by nitrate-N) was mainly influenced by base-flow.
Abstract: Information and communication service providers
(ICSP) that are significant in size and provide Internet-based services
take administrative, technical, and physical protection measures via
the information security check service (ISCS). These protection
measures are the minimum action necessary to secure the stability and
continuity of the information and communication services (ICS) that
they provide. Thus, information assets are essential to providing ICS,
and deciding the relative importance of target assets for protection is a
critical procedure. The risk analysis model designed to decide the
relative importance of information assets, which is described in this
study, evaluates information assets from many angles, in order to
choose which ones should be given priority when it comes to
protection. Many-sided risk analysis (MSRS) grades the importance of
information assets, based on evaluation of major security check items,
evaluation of the dependency on the information and communication
facility (ICF) and influence on potential incidents, and evaluation of
major items according to their service classification, in order to
identify the ISCS target. MSRS can be an efficient risk analysis model that helps ICSPs identify their core information assets and prioritize information protection measures for them, so that the stability of the ICS can be ensured.