Abstract: In the field of knowledge and data engineering, the relational database is the predominant repository for storing real-world data, and it has been used around the world for decades. Normalization is the most important process in the analysis and design of relational databases. It aims at creating a set of relational tables with minimum data redundancy that preserve consistency and facilitate correct insertion, deletion, and modification. Despite its importance, very few algorithms have been developed for use in commercial automatic normalization tools, and normalization is still far more often performed manually than automatically. Moreover, for today's large and complex databases, manual normalization is harder still. This paper presents a new, fully automated relational database normalization method. It first produces a directed graph and a spanning tree, and then generates the 2NF, 3NF and BCNF normal forms. The benefit of this new algorithm is that it can cope with large sets of complex functional dependencies.
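As background for what such a tool must automate, the following is a minimal sketch of the standard functional-dependency machinery on which normal-form checks rest; it is textbook material and our own illustration, not the paper's graph/spanning-tree algorithm.

```python
# Attribute-set closure under functional dependencies (FDs), and a BCNF
# violation check built on it. FDs are (lhs, rhs) pairs of attribute sets.

def closure(attrs, fds):
    """Closure of an attribute set under a list of FDs X -> Y."""
    result = set(attrs)
    changed = True
    while changed:
        changed = False
        for lhs, rhs in fds:
            if lhs <= result and not rhs <= result:
                result |= rhs
                changed = True
    return result

def violates_bcnf(relation, fds):
    """Yield nontrivial FDs X -> Y whose left side is not a superkey."""
    for lhs, rhs in fds:
        if not rhs <= lhs and closure(lhs, fds) != set(relation):
            yield lhs, rhs

# Example: R(A, B, C) with A -> B and B -> C; B -> C violates BCNF.
fds = [(frozenset("A"), frozenset("B")), (frozenset("B"), frozenset("C"))]
print(list(violates_bcnf("ABC", fds)))
```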
Abstract: The iterative scheme used to treat buildup factors for stratified shields is investigated here using a layer-splitting technique. A simple formalism for the scheme, based on Kalos' formula, is introduced, and the testing technique is implemented on top of it.
The second layer in a double-layer shield was split into two equivalent layers and the scheme (with the suggested formalism) was applied to the new three-layer shield configuration. The results of this manipulation for water-lead and water-iron shield combinations are presented for 1 MeV photons.
Splitting the second layer was found to introduce some deviation in the overall buildup factor; as expected, this deviation was larger when a low-Z layer was followed by a high-Z layer. Nevertheless, the overall performance of the iterative scheme remained highly consistent and coherent despite the introduced changes. The proposed layer-splitting technique therefore appears capable of testing the iterative scheme across a wide range of formalisms.
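To make the described test concrete, here is a hedged sketch of a layer-splitting harness; the `combine` argument stands in for whatever two-layer rule (e.g. the Kalos-formula formalism) the iterative scheme uses, which is not reproduced here, and all names are our assumptions.

```python
# Hedged sketch of the layer-splitting consistency test: split one layer
# into two equivalent half-thickness layers, re-run the iterative scheme,
# and report the deviation introduced in the overall buildup factor.

def iterate_buildup(layers, single_layer_B, combine):
    """Fold a two-layer combination rule over a stratified shield.

    layers        : list of (material, thickness_in_mfp) tuples
    single_layer_B: B(material, x) for a homogeneous slab
    combine       : B for (preceding stack, next layer) -- the formalism
    """
    material, x = layers[0]
    B = single_layer_B(material, x)
    for material, x in layers[1:]:
        B = combine(B, material, x)
    return B

def split_test(layers, single_layer_B, combine, index=1):
    """Relative deviation in overall B caused by splitting layer `index`."""
    material, x = layers[index]
    split = (layers[:index] + [(material, x / 2), (material, x / 2)]
             + layers[index + 1:])
    B_ref = iterate_buildup(layers, single_layer_B, combine)
    B_split = iterate_buildup(split, single_layer_B, combine)
    return B_ref, B_split, abs(B_split - B_ref) / B_ref
```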
Abstract: The aim of the research was to evaluate the influence of flakes from biologically activated hull-less barley grain and of malt extract on the quality of yoghurt during storage.
The results showed that the concentration of added malt extract and the storage time influenced the changes in pH and lactic acid content in the yoghurt samples. The sensory properties (aroma, taste, consistency and appearance) of yoghurt enriched with flakes from biologically activated hull-less barley grain and malt extract changed significantly (p
Abstract: It is well known, as Fitts' law, that the time for a user to point at a target on a GUI screen can be modeled as a linear function of the “index of difficulty (ID).” In this paper, the authors investigate whether the traditional ID formulation is appropriate independently of device screen size. The results of our experiment reveal that the ID formulation may not consistently capture actual difficulty: users' pointing performance is not consistent across pointing-target variations whose indices of difficulty are identical. The term A/W may not be appropriate, because it causes the observed inconsistency. Based on this finding, the authors then evaluate the applicability of possible models other than Fitts' law. Multiple regression models are found to be able to appropriately represent the effects of target design variations. The authors next attempt to improve the definition of ID in Fitts' model. Our idea is to raise the size or the distance values depending on the screen size. The modified model is found to fit the users' pointing data well, which supports the idea.
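For reference, a minimal sketch of fitting the traditional formulation, MT = a + b * log2(A/W + 1), to pointing data; the amplitudes, widths, and times below are illustrative, and the authors' screen-size-dependent modification is not reproduced.

```python
# Least-squares fit of Fitts' law in its Shannon formulation.
import numpy as np

A = np.array([64, 128, 256, 512])        # movement amplitudes (px), made up
W = np.array([16, 16, 32, 32])           # target widths (px), made up
MT = np.array([0.42, 0.55, 0.58, 0.71])  # movement times (s), made up

ID = np.log2(A / W + 1.0)                # index of difficulty (bits)
b, a = np.polyfit(ID, MT, 1)             # line MT = a + b * ID
print(f"a = {a:.3f} s, b = {b:.3f} s/bit")
```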
Abstract: Inconsistency in manual inspection is real because humans tire after some time. Recent trends show that automatic inspection is more appealing for mass-production inspection. In such a case, a robot manipulator seems the best candidate to run a dynamic visual inspection. The purpose of this work is to estimate the optimum workspace in which a robot manipulator would perform a visual inspection process on a workpiece, with a camera attached to the end effector. The pseudo code for the planned path is derived from the number of tool transit points, the delay time at the transit points, the process cycle time, and the configuration space, i.e. the distance between the tool and the workpiece. It is observed that a rapid start and a swift end are acceptable in a robot program because little applicable work takes place during these moments; during the mid-cycle, however, there are always practical tasks programmed to be executed. For that reason, the robot should be programmed so that rapid changes in actuator displacement are avoided. A dynamic visual inspection system using a robot manipulator seems practical for a workpiece with a complex shape.
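As an illustration of the path-timing bookkeeping described above, here is a minimal sketch assuming hypothetical transit points, dwell times, and tool speed; it is not the paper's pseudo code.

```python
# Cycle time assembled from transit-point travel plus per-point dwell.
import math

def cycle_time(points, dwell_s, speed_mm_s):
    """Total inspection cycle time over a list of (x, y, z) transit points,
    pausing `dwell_s` seconds at each point for image capture."""
    travel = sum(
        math.dist(p, q) / speed_mm_s for p, q in zip(points, points[1:])
    )
    return travel + dwell_s * len(points)

points = [(0, 0, 300), (150, 0, 300), (150, 120, 300), (0, 120, 300)]
print(f"cycle: {cycle_time(points, dwell_s=0.5, speed_mm_s=200):.2f} s")
```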
Abstract: Breast region segmentation is an essential prerequisite in the computerized analysis of mammograms. It aims at separating the breast tissue from the background of the mammogram and includes two independent segmentations. The first separates the background region, which usually contains annotations, labels and frames, from the whole breast region, while the second removes the pectoral muscle portion (present in Medio-Lateral Oblique (MLO) views) from the rest of the breast tissue. In this paper we propose a hybridization of Connected Component Labeling (CCL), fuzzy, and straight-line methods. The proposed methods worked well for separating the pectoral region. After removal of the pectoral muscle from the mammogram, further processing is confined to the breast region alone. To demonstrate the validity of our segmentation algorithm, it is extensively tested on the 322 mammographic images of the Mammographic Image Analysis Society (MIAS) database. The segmentation results were evaluated using Mean Absolute Error (MAE), Hausdorff Distance (HD), Probabilistic Rand Index (PRI), Local Consistency Error (LCE) and Tanimoto Coefficient (TC). The hybridization of the fuzzy and straight-line methods rated more than 96% of the curve segmentations adequate or better. In addition, a comparison with similar approaches from the state of the art is given, showing slightly improved results. Experimental results demonstrate the effectiveness of the proposed approach.
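A minimal sketch of the CCL step under its usual interpretation (keep the largest bright component, i.e. the breast, and discard annotations and labels); the threshold value is an illustrative assumption, not the paper's parameter.

```python
# Connected-component labeling to isolate the breast region.
import numpy as np
from scipy import ndimage

def largest_component(mammogram, threshold=20):
    """Binary mask of the largest connected bright region."""
    binary = mammogram > threshold           # crude background threshold
    labels, n = ndimage.label(binary)        # connected-component labeling
    if n == 0:
        return np.zeros_like(binary)
    sizes = ndimage.sum(binary, labels, range(1, n + 1))
    return labels == (np.argmax(sizes) + 1)  # mask of the biggest blob
```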
Abstract: The paper focuses on experimental testing of the possibilities of mechanical activation of fly ash and on the influence of specific surface and granulometry on the final properties of fresh and hardened concrete. Mechanical grinding produced fly ash of various finenesses, which was classified by specific surface according to Blaine, and its granulometry was determined by means of a laser granulometer. Sets of test specimens were then made from mix designs of identical composition, with 25% of the Portland cement CEM I 42.5 R replaced by fly ash of various specific surfaces and granulometries. A mix design with Portland cement only was used as the reference. The mix designs were tested for consistency of the fresh concrete and for compressive strength after 7, 28, 60 and 90 days.
Abstract: Data replication in data grid systems is one of the important techniques for improving availability, scalability, and fault tolerance. However, it also raises issues such as maintaining replica consistency. Moreover, since grid environments are highly dynamic, some nodes can become more loaded than others and eventually turn into bottlenecks. The main idea of our work is to propose a solution that combines replica consistency maintenance with a dynamic load-balancing strategy to improve access performance in a simulated grid environment.
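As one hedged illustration of how the two concerns can be combined, the sketch below routes a read to the least-loaded node holding an up-to-date replica; node names, loads, and version numbers are invented for the example.

```python
# Load-aware selection among consistent replicas.

replicas = {
    "node-a": {"load": 0.82, "version": 7},
    "node-b": {"load": 0.35, "version": 7},
    "node-c": {"load": 0.10, "version": 6},   # stale copy
}

def pick_replica(replicas, master_version):
    """Least-loaded node whose replica matches the master's version."""
    fresh = {n: r for n, r in replicas.items()
             if r["version"] == master_version}
    return min(fresh, key=lambda n: fresh[n]["load"]) if fresh else None

print(pick_replica(replicas, master_version=7))   # -> node-b
```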
Abstract: Power quality describes the degree of consistency of the electrical energy delivered from the generation source to the point of use. The term refers to a wide variety of electromagnetic phenomena that characterize the voltage and current at a given time and location on the power system. Power quality problems can be defined as problems that result in the failure of customer equipment, impose an economic burden on users, or produce negative impacts on the environment. Voltage stability, power factor, harmonic pollution, reactive power and load unbalance are some of the factors that affect the consistency or quality level. This research proposal sets out to investigate and analyze the causes and effects of power quality problems in homes and industries in Sarawak. The increasing use of electronic equipment in industry and homes has had a big impact on power quality. Many electrical devices are now interconnected to the power network, and it can be observed that if the power quality of the network is good, the loads connected to it run smoothly and efficiently; if it is bad, the loads may fail or be damaged, and their lifetimes reduced. The outcome of this research will enable better and novel solutions to poor power quality for small industries and reduce damage to electrical devices and products in industry.
Abstract: Aerosols are small particles suspended in air that have widely varying spatial and temporal distributions. The aerosol concentration in the total atmospheric column is normally measured as aerosol optical depth (AOD). At long-term monitoring stations, accurate AOD retrieval is often difficult due to the lack of frequent calibration. To overcome this problem, a near-sea-level Langley calibration algorithm is developed using a combination of a clear-sky detection model and a statistical filter. It attempts to produce a dataset consisting only of homogeneous and stable atmospheric conditions for the Langley calibration. In this paper, a radiance-based validation is performed to further investigate the feasibility and consistency of the proposed algorithm at different locations, days, and times. The algorithm is validated using SMARTS-model-based DNI values. The overall results confirm that the proposed calibration algorithm is feasible and consistent for measurements taken at different sites and under different weather conditions.
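For context, a minimal sketch of the classical Langley extrapolation that such a calibration automates: ln(V) = ln(V0) - m * tau is fitted over a clear, stable period, so the zero-airmass intercept gives the calibration constant V0. The airmass and signal values below are illustrative, not the station's data.

```python
# Langley-plot regression for sun-photometer calibration.
import numpy as np

m = np.array([2.0, 2.5, 3.0, 3.5, 4.0, 5.0])        # relative airmass
V = np.array([0.81, 0.74, 0.68, 0.62, 0.57, 0.48])  # instrument signal

slope, ln_V0 = np.polyfit(m, np.log(V), 1)
tau = -slope                                         # total optical depth
print(f"V0 = {np.exp(ln_V0):.3f}, tau = {tau:.3f}")
```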
Abstract: This paper presents a visualized computer-aided CASE tool for non-experts, called Visual Time, for representing and reasoning about incomplete and uncertain temporal information. It is both expressive and versatile, allowing logical conjunctions and disjunctions of both absolute and relative temporal relations, such as “Before”, “Meets”, “Overlaps”, “Starts”, “During”, and “Finishes”. In terms of a visualized framework, Visual Time provides a user-friendly environment for describing scenarios with rich temporal structure in natural language, which can be formatted as structured temporal phrases and modeled in terms of Temporal Relationship Diagrams (TRD). A TRD can be automatically and visually transformed into a corresponding Time Graph, supported by an automatic consistency checker that derives a verdict confirming whether a given scenario is temporally consistent or inconsistent.
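As a hedged illustration, the predicates below encode the named interval relations (a subset of Allen's interval algebra) for crisp intervals; Visual Time's handling of disjunctive, incomplete and uncertain information is richer than these point-wise checks.

```python
# Allen-style relations on half-open numeric intervals (start, end).

def before(a, b):   return a[1] < b[0]            # a ends before b starts
def meets(a, b):    return a[1] == b[0]           # a ends exactly as b starts
def overlaps(a, b): return a[0] < b[0] < a[1] < b[1]
def starts(a, b):   return a[0] == b[0] and a[1] < b[1]
def during(a, b):   return b[0] < a[0] and a[1] < b[1]
def finishes(a, b): return a[1] == b[1] and a[0] > b[0]

breakfast, commute = (7.0, 7.5), (7.5, 8.25)
print(meets(breakfast, commute))                  # -> True
```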
Abstract: The LuGre friction model is an ordinary differential equation that is widely used to describe the friction phenomenon in mechanical systems. The importance of this model comes from the fact that it captures most of the observed friction behavior, including hysteresis. In this paper, we study some aspects of the hysteresis behavior induced by the LuGre friction model.
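For concreteness, a minimal sketch of the LuGre equations in their standard form (Canudas de Wit et al.), integrated with an explicit Euler step; the parameter values are illustrative assumptions, not the paper's.

```python
# LuGre friction model:
#   dz/dt = v - |v| * z / g(v),  g(v) = (Fc + (Fs - Fc)*exp(-(v/vs)**2)) / s0
#   F     = s0 * z + s1 * dz/dt + s2 * v
import numpy as np

s0, s1, s2 = 1e5, 300.0, 0.4          # bristle stiffness/damping, viscous
Fc, Fs, vs = 1.0, 1.5, 0.01           # Coulomb, stiction, Stribeck velocity

def lugre_force(v_signal, dt=1e-4):
    """Integrate the internal bristle state z; return friction force F."""
    z, F = 0.0, []
    for v in v_signal:
        g = (Fc + (Fs - Fc) * np.exp(-(v / vs) ** 2)) / s0
        zdot = v - abs(v) * z / g
        z += zdot * dt                 # explicit Euler step
        F.append(s0 * z + s1 * zdot + s2 * v)
    return np.array(F)

t = np.arange(0, 0.2, 1e-4)
F = lugre_force(0.02 * np.sin(2 * np.pi * 10 * t))  # sinusoidal velocity
```

Plotting F against velocity over one period traces the hysteresis loop the abstract refers to.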
Abstract: Rain attenuation plays a major role in the design of satellite and terrestrial microwave radio links, so a good knowledge of its effects is of great interest to engineers and scientists, since a high level of accuracy is often required of the rainrate distribution, which expresses rainrate from the lowest value to the highest. This study proposes a model that expresses the rainrate parameters alpha (α) and beta (β) as a function of geographical location at 0.01% of the time. The tropical locations used in developing the model were Ilorin, Ile-Ife, Douala, Dar es Salaam, Nairobi, Lusaka, and Brasilia.
The resulting expression clearly confirms the variability of rainfall from place to place. When a consistency test was carried out using the expression to generate rainrates for each location examined, the results obtained were reliable for rain intensities between 5 mm/h and 200 mm/h. The variability of α and β with latitude also shows that different latitudes have different cumulative rain distributions. The model proposed in this study should be a useful tool for radio engineers, since the effect of precipitation is among the factors to consider when designing satellite and terrestrial microwave communication links.
Abstract: The paper presents an on-line recognition machine (RM) for the continuous/isolated, dynamic and static gestures that arise in Flight Deck Officer (FDO) training. RM is based on a generic pattern recognition framework. Gestures are represented as templates using summary statistics. The proposed recognition algorithm exploits the temporal and spatial characteristics of gestures via dynamic programming and a Markovian process. The algorithm predicts the corresponding index of incremental input data in the templates in an on-line mode. Accumulated consistency in the sequence of predictions provides a similarity measure (Score) between the input data and the templates. The algorithm provides an intuitive mechanism for automatic detection of the start/end frames of continuous gestures. In the present paper, we consider isolated gestures. The performance of RM is evaluated using four datasets - artificial (W TTest), hand motion (Yang) and FDO (tracker, vision-based). RM achieves comparable results which are in agreement with other on-line and off-line algorithms such as hidden Markov models (HMM) and dynamic time warping (DTW). The proposed algorithm has the additional advantage of providing timely feedback for training purposes.
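For reference, a minimal sketch of DTW, one of the baselines mentioned above; the RM's own template/Markov scoring is not reproduced here.

```python
# Classic dynamic time warping between two 1-D gesture trajectories.
import numpy as np

def dtw_distance(a, b):
    """DTW distance with absolute-difference local cost."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

template = np.sin(np.linspace(0, np.pi, 30))       # stored gesture template
query = np.sin(np.linspace(0, np.pi, 45)) + 0.05   # slower, noisier input
print(f"DTW score: {dtw_distance(template, query):.3f}")
```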
Abstract: The problem of N interacting cracks in an isotropic elastic solid is decomposed into a subproblem of a homogeneous solid without cracks and N subproblems, each with a single crack subjected to unknown tractions on its two crack faces. The unknown tractions, namely the pseudo tractions on each crack, are expanded into polynomials with unknown coefficients, which have to be determined by the consistency condition, i.e. by the equivalence of the original multiple-crack interaction problem and the superposition of the N+1 subproblems. In this paper, Kachanov's approach of average tractions is extended into a method of moments that imposes the consistency condition approximately; Kachanov's method can thus be viewed as the zero-order method of moments. Numerical results for the stress
intensity factors are presented for interactions of two collinear cracks,
three collinear cracks, two parallel cracks, and three parallel cracks.
As the moment order increases, the accuracy of the method of moments improves.
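A schematic rendering of the consistency condition and its moment-weighted enforcement may help; the operator notation below is ours, not the paper's.

```latex
% Schematic form of the consistency condition (notation ours): the pseudo
% traction p_i on crack i equals the applied traction plus the stress
% transmitted by all other cracks. Expanding p_i in polynomials,
% p_i(x) = \sum_k c_{ik} x^k, the method of moments enforces the condition
% in weighted-average form over each crack line \Gamma_i:
\[
  \int_{\Gamma_i} x^{m}\Big[\, p_i(x) - p_i^{\infty}(x)
    - \sum_{j \ne i} \big(\Lambda_{ij}\, p_j\big)(x) \Big]\, \mathrm{d}x = 0,
  \qquad m = 0, 1, \dots, M,
\]
% with m = 0 alone recovering Kachanov's average-traction approximation.
```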
Abstract: The paper describes a knowledge-based system for the analysis of microscopic wear particles. Wear particles contained in
lubricating oil carry important information concerning machine
condition, in particular the state of wear. Experts (Tribologists) in the
field extract this information to monitor the operation of the machine
and ensure safety, efficiency, quality, productivity, and economy of
operation. This procedure is not always objective and it can also be
expensive. The aim is to classify these particles according to their morphological attributes of size, shape, edge detail, thickness ratio, color, and texture, and, through this classification, to predict wear failure modes in engines and other machinery. The attribute
knowledge links human expertise to the devised Knowledge Based
Wear Particle Analysis System (KBWPAS). The system provides an
automated and systematic approach to wear particle identification
which is linked directly to wear processes and modes that occur in
machinery. This brings consistency to wear judgment and prediction, which leads to standardization and less dependence on Tribologists.
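As a hedged illustration of the attribute-to-class rules such a system encodes, the sketch below uses the abstract's attribute vocabulary with invented classes and thresholds; it is not the KBWPAS rule base.

```python
# Toy rule-based classifier over wear-particle morphology attributes.

def classify_particle(p):
    """Map morphological attributes to a wear-particle class."""
    if p["shape"] == "ribbon" and p["thickness_ratio"] < 0.1:
        return "cutting wear"          # swarf-like ribbons
    if p["shape"] == "platelet" and p["size_um"] > 15 and p["edge"] == "rough":
        return "severe sliding wear"   # large, rough-edged platelets
    if p["shape"] == "spherical":
        return "fatigue (spheres)"     # spheres linked to rolling fatigue
    return "rubbing wear"              # default benign class

particle = {"shape": "platelet", "size_um": 22, "edge": "rough",
            "thickness_ratio": 0.3}
print(classify_particle(particle))     # -> severe sliding wear
```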
Abstract: Flash floods are natural disasters that can cause casualties and destroy infrastructure. The problem is that flash floods, particularly in arid and semi-arid zones, develop in a very short time, so it is important to forecast them ahead of the event, with a lead time of up to 48 hours, to give early warning alerts that avoid or minimize disasters. The flash flood that took place over Wadi Watier, Sinai Peninsula, on October 24th, 2008 has been simulated, investigated and analyzed using a state-of-the-art regional weather model. The Weather Research and Forecasting (WRF) model, which is a reliable short-term forecasting tool for precipitation events, has been applied over the study area. The model results have been calibrated against real data for the same date and time, namely the rainfall measurements recorded at the Sorah gauging station. The WRF model forecast a total rainfall of 11.6 mm while the measured value was 10.8 mm. The calibration shows significant consistency between the WRF model and the real measurements.
Abstract: The system development life cycle (SDLC) is a process used during the development of any system. The SDLC consists of four main phases: analysis, design, implementation and testing. During the analysis phase, a context diagram and data flow diagrams are used to produce the process model of a system. Consistency between the context diagram and the lower-level data flow diagrams is very important for smoothing the development process of a system. However, manually checking consistency from the context diagram to the lower-level data flow diagrams using a checklist is a time-consuming process. At the same time, the limits of human ability to spot errors are among the factors that influence the correctness and balancing of the diagrams. This paper presents a tool that automates the consistency check between Data Flow Diagrams (DFDs) based on the rules of DFDs. The tool serves two purposes: as an editor to draw the diagrams and as a checker to check the correctness of the diagrams drawn. The consistency check from the context diagram to the lower-level data flow diagrams is embedded in the tool to overcome the manual checking problem.
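A minimal sketch of one balancing rule such a checker automates: the data flows entering and leaving the context-level process must match the external flows of its child diagram. The flow names are illustrative assumptions.

```python
# Context-to-level-1 balancing check over sets of flow names.

def balanced(parent_in, parent_out, child_in, child_out):
    """Report flows that break context-to-child balancing."""
    missing = (parent_in - child_in) | (parent_out - child_out)
    extra = (child_in - parent_in) | (child_out - parent_out)
    return missing, extra

parent_in, parent_out = {"order", "payment"}, {"receipt"}
child_in, child_out = {"order", "payment"}, {"receipt", "audit log"}

missing, extra = balanced(parent_in, parent_out, child_in, child_out)
print("missing:", missing, "extra:", extra)   # extra: {'audit log'}
```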
Abstract: Isobaric vapor-liquid equilibrium measurements are reported for the binary mixtures of n-Butylamine and Triethylamine with Cumene at 97.3 kPa. The measurements were performed using a vapor-recirculating (modified Othmer) equilibrium still. The binary mixture of n-Butylamine + Cumene shows positive deviation from ideality, while the Triethylamine + Cumene mixture shows negligible deviation from ideality. Neither system forms an azeotrope. The activity coefficients have been calculated taking vapor-phase nonideality into consideration. The data satisfy the Herington thermodynamic consistency test. The activity coefficients have been satisfactorily correlated by means of the Margules, NRTL, and Black equations. The activity coefficient values obtained from the UNIFAC model are also reported.
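For reference, a hedged sketch of the Herington area test as it is commonly stated: ln(γ1/γ2) is plotted against x1, the positive and negative areas give D, a temperature-span correction gives J, and D − J < 10 is the usual acceptance criterion. The compositions, activity-coefficient ratios, and temperatures below are illustrative, not the measured data.

```python
# Herington area test for isobaric VLE consistency.
import numpy as np

x1 = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
ln_g_ratio = np.array([0.60, 0.28, 0.02, -0.24, -0.55])  # ln(g1/g2), made up
T = np.array([388.0, 385.5, 383.0, 381.0, 379.5])        # bubble points, K

pos = np.trapz(np.clip(ln_g_ratio, 0, None), x1)   # area above the axis
neg = -np.trapz(np.clip(ln_g_ratio, None, 0), x1)  # area below the axis
D = 100 * abs(pos - neg) / (pos + neg)
J = 150 * (T.max() - T.min()) / T.min()
print(f"D = {D:.1f}, J = {J:.1f}, consistent: {D - J < 10}")
```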
Abstract: The Czech Republic is a country whose economy has
undergone a transformation since 1989. Since joining the EU it has
been striving to reduce the differences in its economic standard and
the quality of its institutional environment in comparison with
developed countries. According to an assessment carried out by the
World Bank, the Czech Republic was long classed as a country
whose institutional development was seen as problematic. For many
years one of the things it was rated most poorly on was its bankruptcy
law. The new Insolvency Act, which is a modern law in terms of its
treatment of bankruptcy, was first adopted in the Czech Republic in
2006. This law, together with other regulatory measures, offers debt-ridden Czech economic subjects legal instruments which are well
established and in common practice in developed market economies.
Since then, analyses performed by the World Bank and the London
EBRD have shown that there have been significant steps forward in
the quality of Czech bankruptcy law. The Czech Republic still lacks
an analytical apparatus which can offer a structured characterisation
of the general and specific conditions of Czech company and
household debt which is subject to current changes in the global
economy. This area has so far not been given the attention it
deserves. The lack of research is particularly clear as regards analysis
of household debt and householders' ability to settle their debts in a
reasonable manner using legal and other state means of regulation.
We assume that Czech households have recourse to a modern
insolvency law, yet the effective application of this law is hampered
by the inconsistencies in the formal and informal institutions
involved in resolving debt. This in turn is based on the assumption
that this lack of consistency is more marked in cases of personal
bankruptcy. Our aim is to identify the symptoms which indicate that
for some time the effective application of bankruptcy law in the
Czech Republic will be hindered by factors originating in
householders' relative inability to identify the risks of falling into
debt.