Abstract: Today's business environment requires that companies have access to highly relevant information in a matter of seconds.
Modern Business Intelligence tools rely on data structured mostly in traditional dimensional database schemas, typically represented by
star schemas. Dimensional modeling is widely recognized as the
leading industry standard in data warehousing, although several
drawbacks and pitfalls have been reported. This paper analyzes an
alternative data warehouse modeling technique, anchor modeling, and compares its characteristics with those of the established dimensional modeling technique from a query performance perspective. The analysis reports
the performance of queries executed against database schemas
structured according to the principles of each modeling
technique.
Abstract: Wireless Sensor Networks (WSNs) are used to monitor vast, inaccessible regions through the deployment of a large number of sensor nodes in the sensing area. For the majority of WSN applications, the collected data must be combined with the geographic information of its origin to be useful: information received from remote sensor nodes (SNs) that are several hops away from the base station/sink is meaningless without knowledge of its source. In addition, the location information of SNs can be used to develop new network protocols for WSNs that improve their energy efficiency and lifetime. In this paper, range-free localization protocols for WSNs are proposed. The proposed protocols are based on the weighted centroid localization technique, where the edge weights of SNs are decided by applying fuzzy logic inference to the received signal strength and link quality between the nodes. The fuzzification is carried out using (i) Mamdani, (ii) Sugeno, and (iii) combined Mamdani-Sugeno fuzzy logic inference. Simulation results demonstrate that the proposed protocols provide better node localization accuracy than conventional centroid-based localization protocols, despite the presence of unintentional interference from radio frequency (RF) sources operating in the same frequency band.
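The weighted centroid step described above can be sketched in a few lines. The paper derives the edge weights from Mamdani/Sugeno fuzzy inference over received signal strength and link quality; the sketch below substitutes a simple crisp weight (inverse of the distance estimated from a log-distance path-loss model) as a stand-in, and all anchor positions, RSSI parameters, and the true node position are illustrative assumptions.

```python
import math

# Anchor nodes with known positions; the unknown node measures RSSI (dBm)
# from each. RSSI here follows a log-distance path-loss model purely to
# generate example data (P0 = -40 dBm at 1 m, path-loss exponent n = 2).
def rssi_from_distance(d, p0=-40.0, n=2.0):
    return p0 - 10.0 * n * math.log10(d)

anchors = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0), (4.0, 4.0)]
true_pos = (2.0, 2.0)

measurements = []
for ax, ay in anchors:
    d = math.hypot(ax - true_pos[0], ay - true_pos[1])
    measurements.append(rssi_from_distance(d))

# Crisp stand-in for the fuzzy edge weight: invert the path-loss model to
# an estimated distance and weight each anchor by 1/d_est. The paper
# instead obtains this weight from fuzzy inference over RSSI and link
# quality.
def edge_weight(rssi, p0=-40.0, n=2.0):
    d_est = 10.0 ** ((p0 - rssi) / (10.0 * n))
    return 1.0 / d_est

weights = [edge_weight(r) for r in measurements]
wsum = sum(weights)
est_x = sum(w * ax for w, (ax, ay) in zip(weights, anchors)) / wsum
est_y = sum(w * ay for w, (ax, ay) in zip(weights, anchors)) / wsum
```

With the node equidistant from all four anchors, the weights are equal and the estimate coincides with the plain centroid; unequal RSSI values pull the estimate toward the nearer anchors.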
Abstract: This paper presents an approach to the unequal error
protection of facial features in the coding of personal ID images. We
consider unequal error protection (UEP) strategies for the efficient
progressive transmission of embedded image codes over noisy
channels. The method is based on the progressive embedded zerotree
wavelet (EZW) image compression algorithm and a UEP technique
with a defined region of interest (ROI); in this case, the ROI
corresponds to the facial features within the personal ID image. ROI
techniques are important in applications where different parts of the
image differ in importance: in ROI coding, the chosen ROI is encoded
with higher quality than the background (BG). Unequal error
protection of the image is provided by different coding techniques
and by encoding the LL band separately. In the proposed method, the
image is divided into two parts (ROI and BG) that consist of more
important bytes (MIB) and less important bytes (LIB), respectively.
The proposed unequal error protection of image transmission has
been shown to be well suited to low bit-rate applications, producing
better-quality output for the ROI of the compressed image. The
experimental results verify the effectiveness of the design and
compare UEP of image transmission with an ROI defined over facial
features against equal error protection (EEP) over an additive white
Gaussian noise (AWGN) channel.
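The MIB/LIB idea can be illustrated with a minimal sketch: protect the ROI stream with a stronger channel code than the background and compare bit-error counts after a noisy channel. The repetition code, the binary symmetric channel, and all rates below are illustrative stand-ins, not the paper's actual channel coding or AWGN model.

```python
import random

random.seed(42)

def transmit(bits, p_flip):
    # Model the noisy channel as a binary symmetric channel: each bit
    # flips independently with probability p_flip (a crude stand-in for
    # hard-decision reception over the AWGN channel in the paper).
    return [b ^ (1 if random.random() < p_flip else 0) for b in bits]

def rep3_encode(bits):
    # Rate-1/3 repetition code: send three copies of every bit.
    return [b for b in bits for _ in range(3)]

def rep3_decode(coded):
    # Majority vote over each group of three received copies.
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

n = 2000
p = 0.1
roi_bits = [random.randint(0, 1) for _ in range(n)]  # more important bytes (MIB)
bg_bits = [random.randint(0, 1) for _ in range(n)]   # less important bytes (LIB)

# UEP: the ROI stream gets the repetition code, the background gets none.
roi_rx = rep3_decode(transmit(rep3_encode(roi_bits), p))
bg_rx = transmit(bg_bits, p)

roi_errors = sum(a != b for a, b in zip(roi_bits, roi_rx))
bg_errors = sum(a != b for a, b in zip(bg_bits, bg_rx))
```

The residual bit-error rate of the protected stream is roughly 3p² versus p for the unprotected one, which is the asymmetry UEP exploits: quality degrades in the background before it degrades in the facial-feature region.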
Abstract: This paper examines the interplay of policy options
and cost-effective technology in providing sustainable distance
education. A case study has been conducted among the learners and
teachers. Learning technologies based on CDs, the internet, and
mobile devices are increasingly adopted by distance-education institutes for
their quick delivery and cost effectiveness. Their sustainability is
conditioned by the structure of the learners as well as the teaching
community. The structure of learners in terms of rural and urban
background revealed similar adoption and utilization of mobile
learning; in other words, the technology transcended the rural-urban
dichotomy. The teaching community, however, was divided into two groups on
policy issues. This study revealed both cost-effectiveness and
sustainability impacts on learner groups divided by rural
and urban location.
Abstract: The need for multilingual communication in Japan has
increased due to an increase in the number of foreigners in the
country. When people communicate in their nonnative language,
the differences in language prevent mutual understanding among
the communicating individuals. In the medical field, communication
between the hospital staff and patients is a serious problem. Currently,
medical translators accompany patients to medical care facilities, and
the demand for medical translators is increasing. However, medical
translators cannot necessarily provide support, especially in cases in
which round-the-clock support is required or in case of emergencies.
The medical field thus has high expectations of information technology.
Hence, a system that supports accurate multilingual communication is
required. Despite recent advances in machine translation technology,
it is very difficult to obtain highly accurate translations. We have
developed a support system called M3 for multilingual medical
reception. M3 provides support functions that aid foreign patients in
the following respects: conversation, questionnaires, reception procedures,
and hospital navigation; it also has a Q&A function. Users
can operate M3 using a touch screen and receive text-based support.
In addition, M3 uses accurate translation tools called parallel texts
to facilitate reliable communication through conversations between
the hospital staff and the patients. However, if no parallel
text expresses what a user wants to communicate, the user cannot
communicate at all. In this study, we have developed a circulating support
environment for multilingual medical communication using parallel
texts. The proposed environment can circulate necessary parallel texts
through the following procedure: (1) a user provides feedback about
the necessary parallel texts, following which (2) these parallel texts
are created and evaluated.
Abstract: In this paper, we have presented the effect of varying
time-delays on performance and stability in the single-channel multirate
sampled-data system in hard real-time (RT-Linux) environment.
The sampling task requires a response time that might exceed the
capacity of RT-Linux, so a straightforward implementation is
not feasible because of system latency; the sampling period must
therefore be kept short enough to handle the task. The best sampling
rate chosen for the sampled-data system is the slowest rate
that meets all performance requirements. RT-Linux is consistent with its
specifications, and a real-time resolution of 0.01 s is used
to achieve efficient results. The results of our
laboratory experiment show that the multirate control technique in
a hard real-time operating system (RTOS) can mitigate the stability
problems caused by random access delays and asynchronization.
Abstract: Data Mining aims at discovering knowledge out of
data and presenting it in a form that is easily comprehensible to
humans. One useful application in Egypt is cancer
management, especially the management of Acute Lymphoblastic
Leukemia (ALL), the most common type of cancer in
children.
This paper discusses the process of designing a prototype that can
help in the management of childhood ALL, which has a great
significance in the health care field. Besides, it has a social impact
on decreasing the rate of infection in children in Egypt. It also
provides valuable information about the distribution and
segmentation of ALL in Egypt, which may be linked to possible
risk factors.
Undirected knowledge discovery is used since, in this
research project, there is no target field: the data provided is
mainly subjective, and the goal is to quantify the subjective
variables. The computer is therefore asked to identify
significant patterns in the provided medical data about ALL. This
is achieved by collecting the data necessary for the
system, determining the data mining technique to be used, and
choosing the most suitable implementation tool for the
domain.
The research makes use of the data mining tool Clementine to
apply the decision tree technique. We feed it with data extracted from
real-life cases obtained from specialized cancer institutes. Relevant
medical case details, such as patient medical history and diagnosis,
are analyzed, classified, and clustered in order to improve disease
management.
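The core step a decision-tree learner such as the one in Clementine repeats recursively is the impurity-minimizing split. The sketch below shows that step on a single hypothetical attribute with a binary outcome; the attribute name, values, and labels are invented for illustration and are not the study's clinical data.

```python
def gini(labels):
    # Gini impurity of a binary label multiset.
    n = len(labels)
    if n == 0:
        return 0.0
    p1 = sum(labels) / n
    return 1.0 - p1 * p1 - (1.0 - p1) ** 2

def best_split(values, labels):
    # Scan candidate thresholds (midpoints between consecutive sorted
    # values) and keep the one minimizing the weighted Gini impurity of
    # the two branches.
    pairs = sorted(zip(values, labels))
    xs = [v for v, _ in pairs]
    ys = [l for _, l in pairs]
    best_t, best_score = None, float("inf")
    for i in range(1, len(xs)):
        if xs[i] == xs[i - 1]:
            continue
        t = (xs[i] + xs[i - 1]) / 2.0
        left, right = ys[:i], ys[i:]
        score = (len(left) * gini(left) + len(right) * gini(right)) / len(ys)
        if score < best_score:
            best_t, best_score = t, score
    return best_t, best_score

# Hypothetical single-attribute example (e.g. patient age vs a binary
# outcome class); the real study uses multi-attribute clinical records.
ages = [2, 3, 4, 10, 11, 12]
outcome = [0, 0, 0, 1, 1, 1]
threshold, impurity = best_split(ages, outcome)
```

A full tree builder applies this split recursively to each branch until the leaves are pure or a stopping criterion is met.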
Abstract: Natural frequencies and dynamic response of a spur
gear sector are investigated using a two dimensional finite element
model that offers significant advantages for dynamic gear analyses.
The gear teeth are analyzed for different operating speeds. A primary
feature of this modeling is determination of mesh forces using a
detailed contact analysis for each time step as the gears roll through
the mesh. ANSYS software has been used on the proposed model to
find the natural frequencies by Block Lanczos technique and
displacements and dynamic stresses by transient mode super position
method. The effect of rotational speed of the gear on the dynamic
response of gear tooth has been studied and design limits have been
discussed.
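The natural-frequency extraction rests on the generalized eigenproblem det(K - w²M) = 0, which ANSYS solves at large scale with Block Lanczos. As a minimal illustration of that eigenproblem, the sketch below solves a two-DOF torsional system in closed form; the inertias and stiffnesses are illustrative assumptions, not values from the paper's finite element model.

```python
import math

# Two-DOF torsional sketch: inertia J1 coupled to ground through
# stiffness k1 and to inertia J2 through mesh stiffness k2.
# det(K - lam*J) = 0 with lam = w^2 gives the characteristic polynomial
#   J1*J2*lam^2 - (J1*k2 + J2*(k1 + k2))*lam + k1*k2 = 0.
def natural_frequencies(j1, j2, k1, k2):
    a = j1 * j2
    b = -(j1 * k2 + j2 * (k1 + k2))
    c = k1 * k2
    disc = math.sqrt(b * b - 4.0 * a * c)
    lams = sorted([(-b - disc) / (2.0 * a), (-b + disc) / (2.0 * a)])
    return [math.sqrt(l) for l in lams]  # rad/s

# Illustrative unit parameters.
freqs = natural_frequencies(1.0, 1.0, 1.0, 1.0)
```

For unit inertias and stiffnesses the two natural frequencies come out as (sqrt(5) -/+ 1)/2, a convenient check on the closed-form solution.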
Abstract: There are many real world problems in which
parameters like the arrival time of new jobs, failure of resources, and
completion time of jobs change continuously. This paper tackles the
problem of scheduling jobs with random due dates on multiple
identical machines in a stochastic environment. First, the longest
processing time (LPT) rule is used to assign jobs to the different
machine centers; the particular sequence of jobs to be processed on
each machine is then found using simple stochastic techniques. The
performance parameter under consideration is the maximum
lateness with respect to the stochastic due dates, which are independent
and exponentially distributed. Finally, a representative problem is
solved using the techniques presented in the paper.
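The two-stage procedure above can be sketched directly: LPT assignment to the least-loaded machine, then a per-machine sequence based on the due-date distributions. The job data, the due-date rates, and the use of expected due dates (mean 1/rate of the exponential) as a simple EDD-style stochastic sequencing rule are illustrative assumptions, not the paper's instance.

```python
# Longest-processing-time (LPT) assignment: sort jobs by processing time,
# descending, and give each job to the currently least-loaded machine.
jobs = [  # (name, processing_time, due_date_rate) - all illustrative
    ("J1", 7, 0.05), ("J2", 5, 0.10), ("J3", 4, 0.08),
    ("J4", 3, 0.20), ("J5", 2, 0.15),
]

def lpt_assign(jobs, n_machines):
    machines = [[] for _ in range(n_machines)]
    loads = [0] * n_machines
    for name, p, r in sorted(jobs, key=lambda j: -j[1]):
        m = loads.index(min(loads))
        machines[m].append((name, p, r))
        loads[m] += p
    return machines, loads

machines, loads = lpt_assign(jobs, 2)
makespan = max(loads)

# Sequence each machine by expected due date 1/r (earliest expected due
# date first) and compute lateness against the expected due dates -- a
# simple stochastic analogue of the EDD rule.
max_lateness = float("-inf")
for seq in machines:
    seq.sort(key=lambda j: 1.0 / j[2])
    t = 0
    for name, p, r in seq:
        t += p
        max_lateness = max(max_lateness, t - 1.0 / r)
```

LPT balances the machine loads (here 10 vs 11), and the expected-due-date sequencing then bounds the maximum lateness on each machine.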
Abstract: With the advent of digital cinema and digital
broadcasting, copyright protection of video data has been one of the
most important issues.
We present a novel method of watermarking video image data
based on hardware and discrete wavelet transform techniques, and
name it "traceable watermarking" because the watermarked data is
constructed before the transmission process and traced after it has been
received by an authorized user.
In our method, we embed the watermark to the lowest part of each
image frame in decoded video by using a hardware LSI.
Digital cinema is an important application for traceable
watermarking, since digital cinema systems make use of watermarking
technology during content encoding, encryption, transmission,
decoding, and all intermediate processes. The watermark is embedded
into randomly selected movie frames using hash functions.
The embedded watermark information can be extracted from the
decoded video data without access to the original movie data. Our
experimental results show that the proposed traceable watermarking
method for digital cinema systems outperforms conventional
watermarking techniques in terms of robustness, image quality, speed,
and structural simplicity.
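Hash-based frame selection and blind (no-original) extraction can be sketched as follows. The key, frame contents, LSB embedding, and the SHA-256 selection rule are all illustrative assumptions; the paper embeds in the lowest band of each frame using a hardware LSI, for which plain LSB substitution on pixel values is only a toy stand-in.

```python
import hashlib

def selected_frames(key, n_frames, stride=4):
    # Pick frames deterministically from a secret key: frame i is chosen
    # when SHA-256(key || i) falls in a 1/stride slice of the hash space.
    chosen = []
    for i in range(n_frames):
        h = hashlib.sha256(f"{key}:{i}".encode()).digest()
        if h[0] % stride == 0:
            chosen.append(i)
    return chosen

def embed(frame, bits):
    # Write watermark bits into the least significant bit of the first
    # len(bits) pixel values (toy stand-in for lowest-band embedding).
    out = list(frame)
    for j, b in enumerate(bits):
        out[j] = (out[j] & ~1) | b
    return out

def extract(frame, n_bits):
    # Blind extraction: only the decoded frame is needed, not the original.
    return [frame[j] & 1 for j in range(n_bits)]

key = "demo-key"                                 # hypothetical secret
frames = {i: [127] * 16 for i in range(32)}      # 32 dummy 16-pixel frames
mark = [1, 0, 1, 1, 0, 1, 0, 0]

targets = selected_frames(key, len(frames))
for i in targets:
    frames[i] = embed(frames[i], mark)

recovered = [extract(frames[i], len(mark)) for i in targets]
```

Anyone holding the key can recompute the selected frame indices and read the mark back from the decoded video, which is the "traceable" property described above.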
Abstract: The objective of this research is to study the technical
and economic performance of wind/diesel/battery (W/D/B) system
supplying a remote small gathering of six families using HOMER
software package. The electrical energy is to cater for the basic needs
for which the daily load pattern is estimated. Net Present Cost (NPC)
and Cost of Energy (COE) are used as economic criteria, while the measure of performance is the percentage of power shortage. Technical and
economic parameters are defined to estimate the feasibility of the
system under study, and optimum system configurations are estimated for two sites. Using HOMER, the simulation results showed that W/D/B systems are economical for the assumed community sites,
as the price of generated electricity is about 0.308 $/kWh without
taking external benefits into consideration. W/D/B systems are more
economical than W/B or diesel-only systems, as the COE is 0.86 $/kWh for W/B and 0.357 $/kWh for diesel alone.
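The relation between NPC and COE used in HOMER-style economics can be sketched as annualizing the net present cost with a capital recovery factor and dividing by the energy served. All numeric inputs below (NPC, interest rate, lifetime, annual load) are illustrative assumptions, not the paper's values.

```python
# Cost-of-energy sketch in the spirit of HOMER's economics.
def capital_recovery_factor(i, n_years):
    # CRF(i, n) = i * (1 + i)^n / ((1 + i)^n - 1)
    g = (1.0 + i) ** n_years
    return i * g / (g - 1.0)

npc_usd = 50_000.0          # net present cost (assumed)
interest = 0.06             # real annual interest rate (assumed)
lifetime_years = 25         # project lifetime (assumed)
annual_energy_kwh = 12_000.0  # energy served per year (assumed)

crf = capital_recovery_factor(interest, lifetime_years)
annualized_cost = npc_usd * crf      # $/yr
coe = annualized_cost / annual_energy_kwh   # $/kWh
```

With these inputs the levelized cost of energy comes out around 0.33 $/kWh; the ranking of W/D/B against W/B and diesel-only configurations follows from comparing such COE values across candidate systems.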
Abstract: A novel nanofinishing process using an improved ball-end
magnetorheological (MR) finishing tool was developed for finishing flat as well as 3D surfaces of ferromagnetic and non-ferromagnetic workpieces. In this process, a magnetically controlled
ball end of smart MR polishing fluid is generated at the tip surface of
the tool and used as the finishing medium; it is guided to
follow the surface to be finished by a computer-controlled 3-axis
motion controller. Experiments were performed on a ferromagnetic
workpiece surface in the developed MR finishing setup to study the effect of finishing time on final surface roughness, and the performance
of the present finishing process in terms of final surface roughness was studied. The surface morphology was observed under a scanning
electron microscope and an atomic force microscope. A final surface roughness as low as 19.7 nm was obtained from an initial surface
roughness of 142.9 nm. The newly developed finishing process can find applications in the aerospace,
automotive, die and mold manufacturing, semiconductor, and optics machining industries.
Abstract: Semantic Web Technologies enable machines to
interpret data published in a machine-interpretable form on the web.
At the present time, only human beings are able to understand the
product information published online. The emerging semantic Web
technologies have the potential to deeply influence the further
development of the Internet economy. In this paper we propose a
scenario-based research approach to predict the effects of these new
technologies on electronic markets and on the business models of
traders, intermediaries, and customers. Over 300 million searches are
conducted every day on the Internet by people trying to find what
they need. A majority of these searches are in the domain of
consumer e-commerce, where a web user is looking for something to
buy. This represents a huge cost in terms of person-hours and an
enormous drain on resources. Agent-enabled semantic search will
have a dramatic impact on the precision of these searches and will
reduce, and possibly eliminate, the information asymmetry in which
a better-informed buyer gets the best value. By impacting this key
determinant of market prices, the semantic web will foster the evolution
of different business and economic models. We submit that there is a
need to develop these futuristic models based on our current
understanding of e-commerce models and nascent semantic web
technologies. We believe these business models will encourage
mainstream web developers and businesses to join the "semantic web
revolution."
Abstract: In this paper, we explore the applicability of the Sinc-
Collocation method to a three-dimensional (3D) oceanography model.
The model describes a wind-driven current with depth-dependent
eddy viscosity in the complex-velocity system. In general,
Sinc-based methods excel over traditional numerical methods
due to their exponentially decaying errors, their rapid convergence, and
their ability to handle problems with singularities at the endpoints.
Together with these advantages, the Sinc-Collocation approach that
we utilize exploits first-derivative interpolation, whose integration
is much less sensitive to numerical errors. We present several
model problems to demonstrate the accuracy, stability, and computational
efficiency of the method. The approximate solutions determined by
the Sinc-Collocation technique are compared to exact solutions and
to those obtained by the Sinc-Galerkin approach in earlier studies. Our
findings indicate that the Sinc-Collocation method outperforms the other
Sinc-based methods of past studies.
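The exponential convergence claimed above stems from the cardinal (Whittaker) Sinc expansion that underlies all Sinc methods. The sketch below evaluates that expansion for a rapidly decaying test function; the choice of function, step size h, and truncation level N are illustrative, and this is plain Sinc interpolation rather than the paper's full collocation scheme for the oceanography model.

```python
import math

def sinc(t):
    # Normalized sinc: sin(pi*t) / (pi*t), with sinc(0) = 1.
    if t == 0.0:
        return 1.0
    return math.sin(math.pi * t) / (math.pi * t)

def sinc_interpolate(f, x, h, big_n):
    # Cardinal expansion on the uniform grid k*h, |k| <= N:
    #   f(x) ~= sum_k f(k*h) * sinc((x - k*h)/h)
    # For analytic, rapidly decaying f the error decays exponentially
    # with N, the convergence behaviour Sinc methods exploit.
    return sum(f(k * h) * sinc((x - k * h) / h)
               for k in range(-big_n, big_n + 1))

f = lambda x: math.exp(-x * x)     # illustrative test function
approx = sinc_interpolate(f, 0.3, 0.5, 10)
exact = f(0.3)
error = abs(approx - exact)
```

Even with only 21 grid points the off-grid error is already tiny; Sinc-Collocation builds on the same expansion but collocates a differential operator at the grid points.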
Abstract: In this paper a new approach is proposed for the
adaptation of the simulated annealing search in the field of the
Multi-Objective Optimization (MOO). This new approach is called
Multi-Case Multi-Objective Simulated Annealing (MC-MOSA). It
uses some of the basics of the well-known Multi-Objective Simulated
Annealing algorithm proposed by Ulungu et al., referred to in the
literature as U-MOSA. However, several drawbacks of that algorithm
have been identified and replaced with alternatives, especially in
the acceptance decision criterion. MC-MOSA has shown better
performance than U-MOSA in the numerical experiments, and this
performance is further improved by other subvariants of
MC-MOSA, such as fast-annealing MC-MOSA, re-annealing MC-MOSA,
and two-stage annealing MC-MOSA.
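The generic ingredients of any multi-objective simulated annealing scheme (a dominance test, a temperature-controlled acceptance rule, and an archive of non-dominated solutions) can be sketched as below. The acceptance rule shown (accept on dominance, otherwise Boltzmann on the scalarized worsening) is a simple generic choice, not the paper's MC-MOSA criterion, and the two-objective test problem is an illustrative assumption.

```python
import math
import random

random.seed(7)

# Toy bi-objective problem on a scalar variable; Pareto set is [-1, 1].
f = lambda x: ((x - 1.0) ** 2, (x + 1.0) ** 2)

def dominates(a, b):
    # a dominates b if a is no worse in every objective and better in one.
    return all(u <= v for u, v in zip(a, b)) and any(u < v for u, v in zip(a, b))

def mosa(steps=2000, t0=1.0, cooling=0.995):
    x = random.uniform(-5.0, 5.0)
    t = t0
    archive = []  # non-dominated (x, objectives) pairs found so far
    for _ in range(steps):
        cand = x + random.gauss(0.0, 0.5)
        fx, fc = f(x), f(cand)
        # Accept if the candidate dominates; otherwise accept with a
        # Boltzmann probability on the scalarized (summed) worsening.
        delta = sum(fc) - sum(fx)
        if dominates(fc, fx) or delta <= 0 or random.random() < math.exp(-delta / t):
            x, fx = cand, fc
            # Maintain the archive: add only if not dominated, and
            # evict any members the new point dominates.
            if not any(dominates(ao, fx) for _, ao in archive):
                archive = [(ax, ao) for ax, ao in archive if not dominates(fx, ao)]
                archive.append((x, fx))
        t *= cooling
    return archive

archive = mosa()
```

Variants such as U-MOSA and MC-MOSA differ precisely in how this acceptance decision is made; the archive-maintenance step is what the mutual non-domination of the returned front relies on.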
Abstract: The charge-pump circuit is an important component of a phase-locked loop (PLL): it converts the Up and Down signals from the phase/frequency detector (PFD) into current. A conventional CMOS charge-pump circuit consists of two switched current sources that pump charge into or out of the loop filter according to two logical inputs. A mismatch between the charging and discharging currents causes phase offset and reference spurs in a PLL. We propose a new charge-pump circuit that reduces the current mismatch by using a regulated cascode circuit. The proposed charge-pump circuit is designed and simulated with Spectre in TSMC 0.18-μm 1.8-V CMOS technology.
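The link between current mismatch and static phase offset can be sketched with a first-order balance: in lock the net charge delivered per reference cycle must be zero, so the loop settles with a timing offset that compensates the mismatch. The formula and all numeric values below are an illustrative first-order estimate under stated assumptions, not the paper's simulated design.

```python
import math

# First-order estimate of the static phase offset caused by charge-pump
# current mismatch. Assuming both sources are on for the PFD reset
# (anti-backlash) pulse t_on, zero net charge per cycle requires
#   I_up * (t_on + dt) = I_dn * t_on
# giving a timing offset dt and phase offset 2*pi*dt/T_ref.
def static_phase_offset(i_up, i_dn, t_on, t_ref):
    dt = t_on * (i_dn - i_up) / i_up
    return 2.0 * math.pi * dt / t_ref

i_up = 99.5e-6    # charging current (A), assumed: ~1% mismatch
i_dn = 100.5e-6   # discharging current (A), assumed
t_on = 1.0e-9     # PFD reset pulse width (s), assumed
t_ref = 1.0e-7    # reference period, 10 MHz (s), assumed

phase_offset_rad = static_phase_offset(i_up, i_dn, t_on, t_ref)
```

A regulated cascode shrinks (I_dn - I_up) by keeping both current sources' output nodes at nearly equal voltages, which by this estimate shrinks the static phase offset (and hence the reference spurs) proportionally.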
Abstract: Arvia®, a spin-out company of the University of Manchester, UK, is commercialising a water treatment technology for the removal of low concentrations of organics from water. The technology is based on the adsorption of organics onto graphite-based adsorbents, coupled with their electrochemical regeneration in a simple electrochemical cell. In this paper, the potential of the process to adsorb microorganisms present in water and disinfect them electrochemically is demonstrated. Bench-scale experiments have indicated that adsorption on graphite adsorbents with electrochemical regeneration can be used effectively for water disinfection. The most likely mechanisms of disinfection through this process are direct electrochemical oxidation and electrochemical chlorination.
Abstract: The aim of this paper is to determine the stress levels
at the end of a long slender shaft such as a drilling assembly used in
the oil and gas industry, using a mathematical model in real time. The
torsional deflection experienced by this type of drilling shaft (a
hollow shaft about 4 km long and 20 cm in diameter, with a wall
thickness of 1 cm) can only be determined using a distributed
modeling technique. The main objective of this project is to calculate
the angular velocity and torque at the end of the shaft by the
transmission line modeling (TLM) method and to analyze the behavior
of the system through its transient response. The results are compared
with those of a lumped-parameter model; the importance of the results
becomes evident through this comparison, since the two models have
different transient responses, and because of the length of the shaft
the transient response is particularly important.
Abstract: According to the FDA (the Food and Drug Administration of the United States), vinegar is defined as a sour liquid containing at least 4 grams of acetic acid per 100 cubic centimeters of solution (a 4% solution of acetic acid) that is produced from sugary materials by alcoholic fermentation. Depending on the microbial starters, vinegars can contain more than 50 types of volatile and aromatic substances responsible for their taste and smell. The vinegar industry now accounts for a large share of agricultural, food, and microbial biotechnology. The acetic acid bacteria belong to the family Acetobacteraceae. According to the latest version of Bergey's Manual of Systematic Bacteriology, which categorizes bacteria on the basis of 16S rRNA differences, the most important acetic acid genera are Acetobacter (genus I), Gluconacetobacter (genus VIII), and Gluconobacter (genus IX). The genus Acetobacter, which is primarily used in vinegar manufacturing plants, is a gram-negative, obligately aerobic coccus- or rod-shaped bacterium of size 0.6-0.8 × 1.0-4.0 μm, nonmotile or motile with peritrichous flagella, and biochemically catalase-positive and oxidase-negative. Some strains are overoxidizers that can convert acetic acid to carbon dioxide and water. In this research, one native Acetobacter strain with high acetic acid productivity was isolated from Iranian white-red cherry. We used two specific culture media, Carr medium [yeast extract, 3%; ethanol, 2% (v/v); bromocresol green, 0.002%; agar, 2%; distilled water, 1000 ml] and Frateur medium [yeast extract, 10 g/l; CaCO3, 20 g/l; ethanol, 20 g/l; agar, 20 g/l; distilled water, 1000 ml], as well as an industrial culture medium. In addition to high acetic acid production and a high growth rate, this strain showed good tolerance of ethanol concentration, which was examined using modified Carr media with 5%, 7%, and 9% ethanol.
While industrial strains of acetic acid bacteria grow in the thermal range of 28-30 °C, this strain was adapted to grow at 34-36 °C after a 96-hour incubation period. These characteristics suggest a potentially valuable biotechnological strain for the production of cherry vinegar with a sweet smell and nutritional properties different from those of existing vinegar types. The lack of growth after 24, 48, and 72 hours of incubation at 34-36 °C, and growth after 96 hours, indicate the good and fast thermal flexibility of this strain, a significant characteristic of biotechnological and industrial strains.
Abstract: Recently, much attention has been devoted to
advanced techniques of system modeling. The polynomial neural
network (PNN) is a GMDH-type (Group Method of Data Handling)
algorithm and a useful method for modeling nonlinear
systems, but PNN performance depends strongly on the number of
input variables and on the order of the polynomial, both of which are
determined by trial and error. In this paper, we introduce the genetic
polynomial neural network (GPNN) to improve the performance of PNN.
GPNN determines the number of input variables and the order of all
neurons with a genetic algorithm (GA), which searches over all
possible values for the number of input variables and the order of
the polynomial. GPNN performance is evaluated on two nonlinear
case studies: a quadratic equation and the Dow Jones stock
index time series.
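The GA-driven structure search can be illustrated on a drastically reduced version of the problem: a genome consisting only of the polynomial order, with fitness equal to least-squares training error plus a complexity penalty. The dataset, penalty weight, and GA parameters are illustrative assumptions, and a full GPNN additionally encodes the number of input variables per neuron, which this sketch omits.

```python
import random

random.seed(1)

# Synthetic data from an underlying quadratic system.
xs = [-1.0 + 0.1 * i for i in range(21)]
ys = [2.0 * x * x + 1.0 for x in xs]

def solve(a, b):
    # Gaussian elimination with partial pivoting (for normal equations).
    n = len(b)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(col + 1, n):
            fac = m[r][col] / m[col][col]
            for c in range(col, n + 1):
                m[r][c] -= fac * m[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (m[r][n] - sum(m[r][c] * x[c] for c in range(r + 1, n))) / m[r][r]
    return x

def fit_error(order):
    # Least-squares polynomial fit of the given order via normal equations.
    n = order + 1
    a = [[sum(x ** (i + j) for x in xs) for j in range(n)] for i in range(n)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(n)]
    coef = solve(a, b)
    return sum((sum(c * x ** i for i, c in enumerate(coef)) - y) ** 2
               for x, y in zip(xs, ys))

def fitness(order):
    # Training error plus a complexity penalty on the order.
    return fit_error(order) + 0.1 * order

# Tiny GA over the polynomial order (genome = one integer in 1..6).
pop = [random.randint(1, 6) for _ in range(8)]
for _ in range(15):
    pop.sort(key=fitness)
    survivors = pop[:4]                         # truncation selection
    children = [max(1, min(6, s + random.choice((-1, 0, 1))))
                for s in survivors]             # mutate order by +/-1
    pop = survivors + children
best_order = min(pop, key=fitness)
```

On this data the GA settles on a low-order model that fits the quadratic essentially exactly, mirroring how GPNN's GA trades fit quality against structural complexity instead of fixing the order by trial and error.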