Abstract: The decoding of Low-Density Parity-Check (LDPC) codes operates over a redundant structure known as the bipartite graph, meaning that the full set of bit nodes is not strictly necessary for decoder convergence. In 2008, Soyjaudah and Catherine designed a recovery algorithm for LDPC codes based on this observation and showed that the error-correcting performance of their codes outperformed that of conventional LDPC codes. In this work, the use of the recovery algorithm is explored further by testing the performance of LDPC codes as the number of iterations is progressively increased. For experiments conducted with small blocklengths of up to 800 bits and up to 2000 iterations, the results interestingly demonstrate that, contrary to conventional wisdom, the error-correcting performance keeps improving as the number of iterations increases.
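The recovery algorithm of Soyjaudah and Catherine is not reproduced here, but the role of the iteration count can be illustrated with a generic hard-decision bit-flipping decoder; the toy (7,4) Hamming parity-check matrix below is an illustrative stand-in, not one of the codes used in the paper:

```python
def bit_flip_decode(H, r, max_iters):
    """Generic hard-decision bit-flipping: while the syndrome is nonzero,
    flip the bit participating in the most unsatisfied parity checks."""
    c = list(r)
    n = len(c)
    for _ in range(max_iters):
        syndrome = [sum(H[i][j] * c[j] for j in range(n)) % 2
                    for i in range(len(H))]
        if not any(syndrome):
            break  # all parity checks satisfied
        fails = [sum(syndrome[i] for i in range(len(H)) if H[i][j])
                 for j in range(n)]
        c[fails.index(max(fails))] ^= 1  # flip the worst-offending bit
    return c

# Toy (7,4) Hamming parity-check matrix; flip one bit of the all-zero codeword.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
received = [0, 0, 0, 0, 1, 0, 0]
decoded = bit_flip_decode(H, received, max_iters=10)  # -> all zeros
```

Raising `max_iters` gives the decoder more chances to reach a valid codeword, which is the knob the abstract's experiments vary.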
Abstract: Computerized alarm systems have been applied
increasingly to nuclear power plants. For existing plants, an add-on
computerized alarm system is often installed in the control room.
Alarm avalanches during plant transients are a major problem with
the alarm systems in nuclear power plants. Computerized alarm
systems can process alarms to reduce their number during plant
transients. This paper describes various alarm processing methods,
an alarm cause tracking function, and various alarm presentation
schemes for showing alarm information to the operators effectively.
These were considered during the development of several
computerized alarm systems for Korean nuclear power plants and
were found to be helpful to the operators.
Abstract: In this paper, we propose a Haar wavelet
quasilinearization method to solve the well-known Blasius equation.
The method is based on the uniform Haar wavelet operational matrix
defined over the interval [0, 1]. We also propose a transformation
for converting the problem onto a fixed computational domain. The
Blasius equation arises in various boundary layer problems of
hydrodynamics and in the fluid mechanics of laminar viscous flows.
Quasilinearization is an iterative process, but our proposed
technique gives excellent numerical results with quasilinearization
for solving nonlinear differential equations, without any iteration
in selecting the collocation points of the Haar wavelets. We have
solved the Blasius equation for 1 ≤ α ≤ 2, and the numerical results
are compared with those available in the literature. Finally, we
conclude that the proposed method is a promising tool for solving
the well-known nonlinear Blasius equation.
Abstract: Algae-based fuels are considered a promising source
of clean energy, and because they have many advantages over
traditional biofuels, research and business ventures have been drawn
into developing and producing algal biofuel. However, its production
stages create a cost structure that is not competitive with
traditional fuels; cost therefore becomes the main obstacle to
commercial production. The present research aims at using a cost
structure model, together with a designed MS-Dose program, first to
investigate the amount of the production cost and determine the
parameters with the greatest effect on it, and second to measure the
contribution of algae to pollution abatement by capturing CO2 from
the air. The results generated from the model show that the
production cost of biomass is between $0.137/kg for 100 ha and
$0.132/kg for 500 ha, which is less than the costs reported in other
studies, while a gallon costs between $3.4 and $3.5, about $1 more
than traditional sources of oil; this difference is regarded as the
rate of contribution of algae to capturing CO2 from the air.
Abstract: In this paper, we present a comparative study between two computer vision systems for object recognition and tracking. These algorithms describe two different approaches based on regions, constituted by sets of pixels, which parameterize objects in shot sequences. For image segmentation and object detection, the FCM technique is used; the overlap between cluster distributions is minimized by the use of a suitable color space (other than RGB). The first technique takes into account a priori probabilities governing the computation of the various clusters to track objects. A Parzen kernel method is described that allows identifying the players in each frame; we also show the importance of the search for the standard deviation value of the Gaussian probability density function. Region matching is carried out by an algorithm that operates on the Mahalanobis distance between region descriptors in two subsequent frames and uses singular value decomposition to compute a set of correspondences satisfying both the principle of proximity and the principle of exclusion.
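As a minimal sketch of the region-matching metric described above, the Mahalanobis distance between two 2-D region descriptors can be computed in closed form; the descriptor dimension and the covariance values are illustrative assumptions, not the paper's actual descriptors:

```python
import math

def mahalanobis_2d(x, mu, cov):
    """Mahalanobis distance d = sqrt((x - mu)^T cov^-1 (x - mu)) for 2-D
    descriptors, using the closed-form inverse of a 2x2 covariance."""
    (a, b), (c, d) = cov
    det = a * d - b * c
    inv = [[d / det, -b / det], [-c / det, a / det]]
    dx = [x[0] - mu[0], x[1] - mu[1]]
    q = (dx[0] * (inv[0][0] * dx[0] + inv[0][1] * dx[1])
         + dx[1] * (inv[1][0] * dx[0] + inv[1][1] * dx[1]))
    return math.sqrt(q)

# With an identity covariance the metric reduces to Euclidean distance.
d = mahalanobis_2d((3.0, 4.0), (0.0, 0.0), [[1.0, 0.0], [0.0, 1.0]])  # -> 5.0
```

Unlike the plain Euclidean distance, this metric discounts differences along directions where the descriptor naturally varies a lot, which is why it suits region matching across frames.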
Abstract: Appropriate description of business processes through
standard notations has become one of the most important assets for
organizations. Organizations must therefore deal with quality faults
in business process models such as the lack of understandability and
modifiability. These quality faults may be exacerbated if business
process models are mined by reverse engineering, e.g., from existing
information systems that support those business processes. Hence,
business process refactoring is often used, which changes the internal
structure of business processes whilst preserving their external
behavior. This paper aims to choose the most appropriate set of
refactoring operators through the quality assessment concerning
understandability and modifiability. These quality features are
assessed through well-proven measures proposed in the literature.
Additionally, a set of measure thresholds is heuristically established
for applying the most promising refactoring operators, i.e., those that
achieve the highest quality improvement according to the selected
measures in each case.
Abstract: The performance of schedules released to a shop floor may be greatly affected by unexpected disruptions. Thus, this paper considers the flexible job shop scheduling problem when the processing times of some operations are represented by a uniform distribution with given lower and upper bounds. The objective is to find a predictive schedule that can deal with this uncertainty. The paper compares two genetic approaches to obtaining a predictive schedule. To determine the performance of the predictive schedules obtained by the two approaches, an experimental study is conducted on a number of benchmark problems.
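Since the uncertain processing times are uniform with given lower and upper bounds, the expected performance of a fixed schedule can be estimated by Monte Carlo sampling. A minimal single-machine sketch (the operation sequence and its bounds are illustrative assumptions, not the paper's benchmark instances):

```python
import random

def expected_makespan(op_bounds, n_samples=10_000, seed=0):
    """Estimate the expected makespan of a fixed operation sequence whose
    processing times are uniform in [lo, hi] (one machine, no idle time)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        total += sum(rng.uniform(lo, hi) for lo, hi in op_bounds)
    return total / n_samples

# Three operations with uniform processing-time bounds; the expected
# makespan is the sum of the interval midpoints, 3 + 2 + 6 = 11.
est = expected_makespan([(2.0, 4.0), (1.0, 3.0), (5.0, 7.0)])
```

Such a sampled estimate is the kind of fitness value a genetic approach can use to rank candidate predictive schedules under uncertainty.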
Abstract: The main aim of this research is to develop a methodology to encourage people's awareness, knowledge, and understanding of participation in flood management for cultural heritage, through cooperation and interaction among the government, private, and public sectors via role-play gaming simulation theory. The approach of this research is to develop a role-play gaming simulation from existing documents, games, or role-playing from several sources, and from existing data on the research site. We found that role-play gaming simulation can be implemented to help improve the understanding of the existing problem and of the impact of floods on cultural heritage. The role-play game can be developed into a tool to improve people's knowledge, understanding, and awareness of participation in flood management for cultural heritage; moreover, cooperation among the government, private, and public sectors can be improved through the theory of role-play gaming simulation.
Abstract: Short Message Service (SMS) has grown in
popularity over the years and has become a common way of
communication. It is a service provided through the Global System
for Mobile Communications (GSM) that allows users to send text
messages to others.
SMS is usually used to transport unclassified information, but
with the rise of mobile commerce it has become a popular tool for
transmitting sensitive information between a business and its
clients. By default, SMS does not guarantee confidentiality and
integrity of the message content.
In mobile communication systems, the security (encryption)
offered by the network operator applies only to the wireless link;
data delivered through the mobile core network may not be
protected. Existing end-to-end security mechanisms are provided
at the application level and are typically based on public-key
cryptosystems.
The main concern in a public-key setting is the authenticity of
the public key; this issue can be resolved by identity-based
(ID-based) cryptography, where the public key of a user can be
derived from public information that uniquely identifies the user.
This paper presents an encryption mechanism based on the
ID-based scheme using elliptic curves to provide end-to-end
security for SMS. This mechanism has been implemented over the
standard SMS network architecture, and the encryption overhead has
been estimated and compared with that of an RSA scheme. This study
indicates that the ID-based mechanism has advantages over the RSA
mechanism in key distribution and in scaling the security level for
mobile services.
Abstract: Using mobile Internet access technologies and
e-services, various economic agents can efficiently offer their
products or services to a large number of clients. With the support
of mobile communications networks, clients can access e-services
anywhere and anytime. This is a basis for establishing a convergence
of the technological and financial interests of mobile operators,
software developers, mobile terminal producers, and e-content
providers. In this paper, a client-server system using 3G and EDGE
mobile terminals is presented for access to Stock Exchange
e-services.
Abstract: Quantum computation using qubits made of two-component Bose-Einstein condensates (BECs) is analyzed. We construct a general framework for quantum algorithms to be executed using the collective states of the BECs. The use of BECs allows for an increase of energy scales via bosonic enhancement, resulting in two-qubit gate operations that can be performed in a time reduced by a factor of N, where N is the number of bosons per qubit. We illustrate the scheme by an application to Deutsch's and Grover's algorithms, and discuss possible experimental implementations. Decoherence effects are analyzed under both general conditions and for the proposed experimental implementation.
Abstract: Petroleum refineries discharge, during the refining
process, large amounts of wastewater containing hazardous
constituents that are hard to degrade. The anaerobic treatment
process is well known as an efficient method for degrading
high-strength wastewaters. The Up-flow Anaerobic Sludge Blanket
(UASB) is a common process used for treating various wastewaters.
Two UASB reactors were set up and operated in parallel to evaluate
the treatment efficiency of petroleum refinery wastewater. In this
study, four organic volumetric loading rates were applied (i.e.,
0.58, 0.89, 1.21, and 2.34 kg/m3·d), two loads to each reactor. Each
load was applied for a period of 60 days for the reactor to
acclimatize and reach steady state, and then the second load was
applied. The chemical oxygen demand (COD) removals were
satisfactory, with removal efficiencies of 78, 82, 83, and 81% at
the applied loadings, respectively.
Abstract: In this paper, a stochastic scenario-based model predictive control applied to molten salt storage systems in a concentrated solar tower power plant is presented. The main goal of this study is to build a tool for analyzing current and expected future resources in order to evaluate the weekly power to be advertised on the electricity secondary market. This tool will allow the plant operator to maximize profits while hedging against the impact on the system of stochastic variables such as resource or sunlight shortage.
Solving the problem first requires a mixed logical dynamical model of the plant. The two stochastic variables, respectively the incoming sunlight energy and the electricity demands from the secondary market, are modeled by least-squares regression. Robustness is achieved by drawing a certain number of realizations of the random variables and applying the most restrictive one to the system. This scenario-based control technique provides the plant operator with a confidence interval containing a given percentage of the possible realizations of the stochastic variables, in such a way that robust control is always achieved within its bounds. The results obtained from many trajectory simulations show the existence of a 'reliable' interval, which experimentally confirms the robustness of the algorithm.
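The scenario approach described above can be sketched as follows: draw a number of realizations of the uncertain profile and keep, at each time step, the most restrictive one. The sunlight model and its parameters here are illustrative assumptions, not the paper's regression model:

```python
import random

def robust_bound(sample_realization, n_scenarios=100, rng=None):
    """Draw n_scenarios realizations of an uncertain profile and keep,
    at each time step, the most restrictive (here: lowest) value."""
    rng = rng or random.Random(0)
    scenarios = [sample_realization(rng) for _ in range(n_scenarios)]
    horizon = len(scenarios[0])
    return [min(s[t] for s in scenarios) for t in range(horizon)]

# Hypothetical hourly sunlight profile: nominal value minus a random shortage.
def sunlight(rng, horizon=24, nominal=800.0):
    return [nominal - rng.uniform(0.0, 200.0) for _ in range(horizon)]

bound = robust_bound(sunlight)  # lower envelope over 100 drawn scenarios
```

Planning against this lower envelope is what makes the control robust: any realization inside the sampled set delivers at least the resource the schedule counted on.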
Abstract: In this paper, we consider the problem of logic simplification for a special class of logic functions, namely complementary Boolean functions (CBF), targeting low-power implementation using the static CMOS logic style. The functions are uniquely characterized by the presence of terms where, for a canonical binary 2-tuple, D(mj) ∩ D(mk) = { } and therefore | D(mj) ∩ D(mk) | = 0 [19]. Similarly, D(Mj) ∩ D(Mk) = { } and hence | D(Mj) ∩ D(Mk) | = 0. Here, 'mk' and 'Mk' represent a minterm and a maxterm respectively. We compare the circuits minimized with our proposed method with those corresponding to the factored Reed-Muller (f-RM) form, the factored Pseudo Kronecker Reed-Muller (f-PKRM) form, and the factored Generalized Reed-Muller (f-GRM) form. We have opted for algebraic factorization of the Reed-Muller (RM) form and its different variants, using the factorization rules of [1], as it is simple and requires much less CPU execution time than Boolean factorization operations. This technique has enabled us to greatly reduce the literal count as well as the gate count needed for such RM realizations, which are generally prone to consuming more cells and subsequently more power. However, this leads to a drawback in terms of the design-for-test attribute associated with the various RM forms. Though we still preserve the definition of those forms, viz. realizing such functionality with only select types of logic gates (AND and XOR gates), the structural integrity of the logic levels is not preserved. This would consequently alter the testability properties of such circuits, i.e., it may increase, decrease, or maintain the number of test input vectors needed for their exhaustive testability, subsequently affecting their generalized test vector computation.
We do not consider the issue of design-for-testability here, but instead focus on the power consumption of the final logic implementation, after realization with a conventional CMOS process technology (0.35 micron TSMC process). The quality of the resulting circuits, evaluated on the basis of an established cost metric, viz. power consumption, demonstrates average savings of 26.79% for the samples considered in this work, besides reductions in the number of gates and input literals of 39.66% and 12.98% respectively, in comparison with other factored RM forms.
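The characterizing emptiness condition on the sets D(mj) and D(mk) can be checked directly when each set is encoded as a bitmask; this encoding is an illustrative assumption, not the paper's data structure:

```python
def terms_complementary(d_mj, d_mk):
    """Check that the two variable sets are disjoint, i.e. their
    intersection is empty. Each set is encoded as a bitmask with bit i
    set when variable x_i belongs to the set."""
    return (d_mj & d_mk) == 0

# D(mj) = {x0, x2} and D(mk) = {x1, x3}: disjoint sets.
print(terms_complementary(0b0101, 0b1010))  # -> True
# D(mj) = {x0, x2} and D(mk) = {x2}: x2 is shared.
print(terms_complementary(0b0101, 0b0100))  # -> False
```

A bitmask intersection test like this is a constant-time way to screen term pairs for the CBF property before any minimization is attempted.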
Abstract: This policy participation action research explores the
roles of Thai government units during the 2010 fiscal year in
creating value added for the recycling business in the central part
of Thailand. The research aims a) to study how the government plays
a role in supporting the business, and the problems and obstacles in
supporting it, and b) to design a strategic action plan -- short,
medium, and long term -- to create value added for the recycling
business, particularly in local full-loop companies/organizations
licensed by the Wongpanit Waste Separation Plant as well as those
licensed by the Department of Provincial Administration. A
mixed-method research design, i.e., a combination of quantitative
and qualitative methods, is utilized in the present study in both
the data collection and analysis procedures. Quantitative data were
analyzed by frequency, percentage, mean scores, and standard
deviation, with the aim of noting trends and generalizations.
Qualitative data were collected via semi-structured interviews and
focus group interviews to explore the in-depth views of the
operators. The sample included 1,079 operators in eight provinces in
the central part of Thailand.
Abstract: This paper provides a review of how an automotive manufacturer, ISUZU HICOM Malaysia Co. Ltd., sustained its supply chain management after business process reengineering in 2007. One of the authors is currently undergoing an industrial attachment and has spent almost six months researching the production and operations management system of the company. This study was carried out as part of the tasks of the attachment program. The results show that delivery lateness and outsourcing are the main barriers affecting productivity. From the gap analysis, the authors found that the new business process operation had improved suppliers' delivery performance.
Abstract: In this paper, by measuring the cutting forces, the
effect of tool shape and condition (sharp and worn cutting tools of
both vee and knife-edge profiles) and of the cutting conditions
(depth of cut and cutting speed) on tool deflection and cutting
force in the turning operation is investigated. The workpiece
material was mild steel and the cutting tool was made of high-speed
steel. Cutting forces were measured by a dynamometer (type P.E.I.,
serial No. 154). The dynamometer essentially consisted of a
cantilever structure which held the cutting tool. Deflection of the
cantilever was measured by an L.V.D.T. (Mercer 122) deflection
indicator. No cutting fluid was used during the turning operations.
A modern CNC lathe (Okuma LH35-N) was used for the tests. It was
noted that worn vee-profile tools tended to produce a greater
increase in the vertical force component than in the axial
component, whereas knife-edge tools tended to show a more pronounced
increase in the axial component.
Abstract: An optimal power flow (OPF) based on particle swarm
optimization (PSO) was developed with a more realistic generator
security constraint, using the capability curve instead of only
Pmin/Pmax and Qmin/Qmax. A neural network (NN) was used in designing
the digital capability curve and the security check algorithm. The
algorithm is very simple and flexible, especially for representing
nonlinear generation operating limits near the steady-state
stability limit and in the under-excitation operating area. In an
effort to avoid locally optimal power flow solutions, the particle
swarm optimization was initialized with a sufficiently widespread
population. The objective function used in the optimization process
is the electricity production cost, which is dominated by the fuel
cost. The proposed method was applied to the Java-Bali 500 kV power
system, consisting of 7 generators and 20 buses. The simulation
results show that the combination of generator power outputs
resulting from the proposed method was more economical than the
result obtained using the conventional constraints, but operated at
a more marginal operating point.
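A minimal particle swarm optimization sketch in the spirit of the method above, minimizing a hypothetical quadratic fuel-cost curve with a power-balance penalty (the cost coefficients, demand, and PSO parameters are illustrative assumptions, not the Java-Bali system data):

```python
import random

def pso_minimize(cost, bounds, n_particles=30, iters=200, seed=0):
    """Minimal PSO: each particle keeps a personal best and is pulled
    toward the global best; positions are clamped to the given bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_cost[i])
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d],
                                    bounds[d][0]), bounds[d][1])
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

# Hypothetical quadratic fuel-cost curves for two generators, with a
# penalty enforcing the power balance p0 + p1 = demand.
def fuel_cost(p, demand=300.0):
    c = 0.01 * p[0] ** 2 + 2.0 * p[0] + 0.02 * p[1] ** 2 + 1.5 * p[1]
    return c + 1e3 * abs(p[0] + p[1] - demand)

best, best_cost = pso_minimize(fuel_cost, [(0.0, 250.0), (0.0, 250.0)])
```

A widespread initial population, as the abstract emphasizes, is what this box-sampled initialization provides: particles start scattered over the whole feasible region rather than clustered near one point.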
Abstract: In India, the conflict between the growing human
population and the planet's unchanging supply of fresh water,
together with falling water tables, has drawn attention to the reuse
of gray water as an alternative water resource in rural development.
This paper presents an optimal design of a laboratory-scale gray
water treatment plant, which is a combination of natural and
physical operations such as primary settling with cascaded water
flow, aeration, agitation, and filtration, hence called a hybrid
treatment process. The economical performance of the plant in
treating gray water from bathrooms, basins, and laundries is shown
in terms of the removal efficiency of water pollutants such as COD
(83%), TDS (70%), TSS (83%), total hardness (50%), oil and grease
(97%), anions (46%), and cations (49%). Hence, this technology could
be a good alternative for treating gray water in rural residential
areas.