Abstract: In this study, an experimental investigation was carried
out to fix CO2 into the electric arc furnace (EAF) reducing slag from
the stainless steelmaking process under wet grinding. The slag was
ground in a vibrating ball mill with CO2 and pure water. The reaction
behavior was monitored with a constant-pressure method, and the
change of CO2 volume in the experimental system with grinding time
was measured. It was found that CO2 absorption occurred as soon
as grinding started. The CO2 absorption under wet grinding was
significantly larger than that under dry grinding. In general, the
amount of CO2 absorbed increased as the amount of water, the
amount of slag, the diameter of the alumina balls and the initial
pressure of CO2 increased. However, the initial absorption rate was
scarcely influenced by the experimental conditions except for the
initial CO2 pressure. According to this research, the CO2 reacted
with the CaO in the slag to form CaCO3.
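The constant-pressure volume change described above can be converted into an amount of carbonate product via the ideal gas law and the 1:1 stoichiometry of CaO + CO2 -> CaCO3. The sketch below is illustrative only; the function name and all numerical values are assumptions, not data from the study.

```python
# Hypothetical sketch: converting a measured CO2 volume change into the mass
# of CaCO3 formed, assuming ideal-gas behaviour and the 1:1 reaction
# CaO + CO2 -> CaCO3. All numbers below are illustrative, not measured data.

R = 8.314        # J/(mol*K), universal gas constant
M_CACO3 = 100.09 # g/mol, molar mass of CaCO3

def caco3_formed(delta_v_l, pressure_pa, temp_k):
    """Moles of CO2 absorbed (ideal gas law) equal moles of CaCO3 formed."""
    delta_v_m3 = delta_v_l / 1000.0          # litres -> cubic metres
    n_co2 = pressure_pa * delta_v_m3 / (R * temp_k)
    return n_co2 * M_CACO3                   # grams of CaCO3

# Example: 0.5 L of CO2 absorbed at 1 atm (101325 Pa) and 298 K
print(round(caco3_formed(0.5, 101325, 298.0), 3))
```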
Abstract: The proper selection of the AC-side passive filter
interconnecting the voltage source converter to the power supply is
essential to obtain satisfactory performance of an active power filter
system. The use of an LCL-type filter has the advantage of
eliminating the high-frequency switching harmonics in the current
injected into the power supply. This paper focuses on analyzing the
influence of the interface filter parameters on the active filtering
performance, and several design aspects are pointed out. The design
of the AC interface filter starts from transfer functions, by imposing
the required filter performance: significant attenuation of the
switching harmonics in the current without affecting the harmonics to
be compensated. A Matlab/Simulink model of the entire active
filtering system, including a concrete nonlinear load, has been
developed to examine the system performance. It is shown that a
gamma LC filter can meet the attenuation requirement for the current
provided by the converter. Moreover, the existence of an optimal
value of the grid-side inductance which minimizes the total harmonic
distortion factor of the power supply current is pointed out.
Nevertheless, a small converter-side inductance and a damping
resistance in series with the filter capacitance are absolutely needed
in order to keep the ripple and oscillations of the current at the
converter side within acceptable limits. The effect of changes in the
LCL-filter parameters is evaluated. It is concluded that good active
filtering performance can be achieved with small values of the
capacitance and the converter-side inductance.
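As a rough illustration of the transfer-function-based reasoning the abstract mentions, the ratio of grid-side to converter-side current harmonics in an undamped LCL filter can be estimated from its resonance frequency. The component values below are assumed for illustration only and are not taken from the paper.

```python
# Illustrative sketch (component values are assumptions, not from the paper):
# for an undamped LCL filter, harmonics above resonance are attenuated roughly
# as 1 / |1 - (w / w_res)^2|, with w_res = sqrt((L1 + L2) / (L1 * L2 * Cf)).
import math

def lcl_resonance(l1, l2, cf):
    """Resonance angular frequency of an undamped LCL filter (rad/s)."""
    return math.sqrt((l1 + l2) / (l1 * l2 * cf))

def harmonic_attenuation(l1, l2, cf, f_hz):
    """|i_grid / i_converter| at frequency f_hz (undamped approximation)."""
    w = 2 * math.pi * f_hz
    w_res = lcl_resonance(l1, l2, cf)
    return 1.0 / abs(1.0 - (w / w_res) ** 2)

# Assumed example values: L1 = 2 mH (converter side), L2 = 0.5 mH (grid side),
# Cf = 10 uF, switching frequency 10 kHz.
l1, l2, cf = 2e-3, 0.5e-3, 10e-6
f_res = lcl_resonance(l1, l2, cf) / (2 * math.pi)
print(f"resonance ~ {f_res:.0f} Hz")
print(f"attenuation at 10 kHz ~ {harmonic_attenuation(l1, l2, cf, 10e3):.4f}")
```

With these assumed values the switching-frequency harmonics injected into the grid are reduced to a few percent of the converter-side harmonics, which is the kind of attenuation requirement the abstract refers to.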
Abstract: The operational behavior of a six-phase squirrel cage
induction machine with faulted stator terminals is presented in this
paper. The study is carried out using the derived mathematical model
of the machine in the arbitrary reference frame. Tests are conducted
on a 1 kW experimental machine.
Steady-state and dynamic performance are analyzed for the
machine under unloaded and loaded conditions. The results show that
with one of the stator phases experiencing either an open-circuit or a
short-circuit fault, the machine still produces starting torque,
although the running performance is significantly derated.
Abstract: The implementation of electronic government in Malaysia started with the initiation of the Multimedia Super Corridor (MSC) by the Malaysian government. The introduction of ICT in the public sector, especially through e-Government initiatives, opens a new chapter in government administration throughout the world. The aim of this paper is to discuss the implementation of e-government in Malaysia, covering the results of a public-user self-assessment of Malaysia's electronic government applications. E-Services, e-Procurement, Generic Office Environment (GOE), Human Resources Management Information System (HRMIS), Project Monitoring System (PMS), Electronic Labor Exchange (ELX) and e-Syariah (religion) were the seven flagship applications assessed. The study adopted a cross-sectional survey research approach and drew on the information systems literature. The analysis was done for 35 respondents in a pilot test, and there was evidence from the public users' perspective to suggest that the e-government applications were generally successful.
Abstract: This paper introduces the application of the seismic wave method to earthquake prediction and early estimation. The advantages of the seismic wave method over traditional earthquake prediction methods are demonstrated. An example is presented to show the accuracy and efficiency of the seismic wave method in predicting a medium-sized earthquake swarm that occurred in Wencheng, Zhejiang, China. By applying this method, correct predictions were made on the day after the earthquake swarm started and on the day the maximum earthquake occurred, which provided a scientific basis for governmental decision-making.
Abstract: Nowadays, manufacturers are facing great challenges
with regard to the production of green products due to the emerging
issue of hazardous substance management (HSM). In particular,
environmental legislation pressures have led to increased risk,
manufacturing complexity and demand for green components. Green
principles have expanded to many departments within organizations,
including the supply chain. Green supply chain management (GSCM)
has emerged in the last few years. This idea covers every stage in
manufacturing, from the first to the last stage of the life cycle. From
the product lifecycle perspective, the cycle starts at the design of a
product. QFD is a customer-driven product development tool,
considered a structured management approach for efficiently
translating customer needs into design requirements and parts
deployment, as well as manufacturing plans and controls, in order to
achieve higher customer satisfaction. This paper develops an Eco-
QFD to provide a framework for designing an Eco-mobile phone by
integrating life cycle assessment (LCA) into QFD throughout the
entire product development process.
Abstract: The need for efficient information retrieval has
increased more than ever in recent years because of the frequent use
of digital information in our lives. Much work has been done in the
area of textual information, but in multimedia information we cannot
find comparable progress. For text-based information, technologies
such as data mining and data marts are now in use, which grew out
of the basic concept of the database in the 1960s.
In image search, and especially in image identification,
computerized systems are at a very early stage. Even in image search
we cannot see as much progress as in the case of text-based search
techniques. One main reason for this is the widespread roots of
image search, where many areas such as artificial intelligence,
statistics, image processing and pattern recognition play their role.
Human psychology, perception and cultural diversity also have their
share in the design of a good and efficient image recognition and
retrieval system.
A new object-based search technique is presented in this paper,
where objects in the image are identified on the basis of their
geometrical shapes and other features such as color and texture, and
object correlation augments the search process.
To stay focused on object identification, simple images were
selected for this work to reduce the role of segmentation in the
overall process; however, the same technique can also be applied to
other images.
Abstract: Throughout thirty years of local, national and international experience in medicine as a medical student, junior doctor and eventually Consultant and Professor in Anaesthesia, Intensive Care and Pain Management, I have noted significant generalised dissatisfaction among medical students and doctors regarding their medical education and practice. We repeatedly hear complaints from patients about the dysfunctional health care system they are dealing with and, subsequently, the poor medical service that they are receiving. Medical students are bombarded with lectures, tutorials, clinical rounds and various exams. Clinicians are weighed down with a never-ending array of competing duties. Patients are extremely unhappy about the long waiting lists, the loss of their records and the continuous deterioration of the health care service. This problem has been reported in different countries by several authors [1,2,3]. In an attempt to solve this dilemma, it has been suggested that computer technology be implemented in medicine [2,3]. Computers in medicine are a medium of international communication of the revolutionary advances being made in the application of the computer to the fields of bioscience and medicine [4,5]. Awareness of using computers in medicine has recently increased all over the world. At Misr University for Science & Technology (MUST), Egypt, medical students are now given hand-held computers (laptops) with Internet access, making their medical education accessible, convenient and up to date. However, this trial still needs to be validated. To help readers catch up with the ongoing fast development in this interesting field, the author has decided to continue reviewing the literature, exploring the state of the art in computer-based medicine and updating medical professionals, especially the local trainee doctors in Egypt.
In Part I of this review article we give a general background, discussing the potential use of computer technology in the various aspects of the medical field, including education, research, clinical practice and the health care service given to patients. We hope this will help start changing the culture and promote awareness of the importance of implementing information technology (IT) in medicine, a field in which such help is needed. International collaboration is recommended to support the emerging countries in achieving this target.
Abstract: The aim of this article is to extend and develop
econometric and network-structure-based methods that are able to
detect price manipulation on the Tehran Stock Exchange. The
principal goal of the present study is to offer a model for estimating
price manipulation on the Tehran Stock Exchange. To do so, a
sample of 397 companies listed on the Tehran Stock Exchange was
selected by a separation method, and information on their prices and
trading volumes during the years 2001 to 2009 was collected. Then,
by performing a runs test, a skewness test and a duration correlation
test, the selected companies were divided into two sets: manipulated
and non-manipulated companies. In the next stage, by investigating
the cumulative return process and trading volumes of the
manipulated companies, the date on which price manipulation
started was identified. Using a logit model, an artificial neural
network and multiple discriminant analysis, together with
information on company size, information transparency, the P/E
ratio and stock liquidity one year prior to the price manipulation,
models for forecasting price manipulation of the stocks of companies
listed on the Tehran Stock Exchange were designed. Finally, the
forecasting power of the models was examined using the data of a
test set. The forecasting power on the test set was 92.1% for the logit
model, 94.1% for the artificial neural network and 90.2% for the
multiple discriminant analysis model; therefore, all three models
have high power to forecast price manipulation, and there is no
considerable difference among the forecasting powers of these three
models.
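The logit scoring step described above can be sketched as follows. The coefficients and the feature values are illustrative placeholders only, not the fitted values from the study.

```python
# Hypothetical sketch of a logit scoring step: the probability that a stock is
# manipulated, given firm features one year before the event. The coefficients
# below are illustrative placeholders, not the study's fitted values.
import math

def manipulation_probability(size, transparency, pe_ratio, liquidity,
                             coefs=(-1.0, -0.5, -0.8, 0.3, -0.6)):
    """Logit model: p = 1 / (1 + exp(-(b0 + b1*x1 + ... + b4*x4)))."""
    b0, b1, b2, b3, b4 = coefs
    z = b0 + b1 * size + b2 * transparency + b3 * pe_ratio + b4 * liquidity
    return 1.0 / (1.0 + math.exp(-z))

# A firm is flagged as potentially manipulated if p exceeds a cut-off (e.g. 0.5).
p = manipulation_probability(size=0.2, transparency=0.1, pe_ratio=2.0,
                             liquidity=0.3)
print(round(p, 3))
```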
Abstract: Three batches of yogurts were made with soy protein
isolate (SPI) supplemented with 2% (S2), 4% (S4) or 6% (S6) of
skim milk powder (SMP). The fourth batch (control; S0) was
prepared from SPI without SMP supplementation. Lactobacillus
delbrueckii ssp. bulgaricus ATCC 11842 (Lb 11842) and
Streptococcus thermophilus ST 1342 (ST 1342) were used as the
starter culture. Biotransformation of the inactive forms, isoflavone
glycosides (IG) to biologically active forms, isoflavone aglycones
(IA), was determined during 28 d storage. The viability of both
microorganisms was significantly higher (P < 0.05) in S2, S4, and S6
than that in S0. The ratio of lactic acid/acetic acid in S0 was in the
range of 15.53–22.31 compared to 7.24–12.81 in S2, S4 and S6.
The biotransformation of IG to IA in S2, S4 and S6 was also
enhanced by 9.9–13.3% compared to S0.
Abstract: Hong Kong is one of the regions in the world where the Total Fertility Rate (TFR) is very low. In 2001, the TFR dropped to 0.931, meaning that on average a woman gave birth to fewer than one child. However, after the reform of the 'Right of Abode of Hong Kong' in 2001, and after the Chinese Central Government loosened the disembarkation procedure for mainland Chinese (mainlanders) to enter Hong Kong in 2003, mainlander couples started to cross the border to give birth in Hong Kong. This raised Hong Kong's TFR quickly from 0.931 (2001) to 1.094 (2010). Usually, an increasing trend in TFR is a sign of rejuvenation in a low-fertility population, but in the case of Hong Kong the increase in TFR does not serve this role; rather, it generates other population problems. This paper discusses whether mainlanders' births help to solve the low-fertility problem in Hong Kong.
Abstract: In this paper we used data mining techniques to
identify outlier patients who use large amounts of drugs over a long
period of time. Any healthcare or health insurance system should
deal with the quantities of drugs utilized by chronic disease patients.
In the Kingdom of Bahrain, about 20% of the health budget is spent
on medications. The managers of healthcare systems do not have
enough information about how drugs are utilized by chronic disease
patients, whether there is any misuse, or whether there are outlier
patients. In this work, done in cooperation with the information
department of the Bahrain Defence Force Hospital, we selected the
data for cardiac patients in the period from 1/1/2008 to 31/12/2008
as the data for the model in this paper. We used three techniques for
finding the drug utilization of cardiac patients: first we applied a
clustering technique, followed by a measure of clustering validity,
and finally we applied a decision tree as the classification algorithm.
The clustering results divide the 1,603 patients, who received 15,806
prescriptions during this period, into three groups according to drug
utilization; 23 patients (2.59%), who received 1,316 prescriptions
(8.32%), are classified as outliers. The classification algorithm shows
that the average drug utilization, the age, and the gender of the
patient can be considered the main predictive factors in the induced
model.
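The clustering step of the pipeline above can be sketched with a minimal 1-D k-means on per-patient average drug utilization, flagging the high-utilization cluster as outlier candidates. The data and cluster count below are synthetic illustrations, not the study's dataset or its exact algorithm.

```python
# Minimal sketch (synthetic data, assumed structure): 1-D k-means with k = 3
# on average drug utilization per patient, flagging the high-utilization
# cluster as outlier candidates. Illustrates the pipeline, not the study data.

def kmeans_1d(values, centers, iters=50):
    """Simple 1-D k-means; returns final centers and cluster labels."""
    labels = []
    for _ in range(iters):
        # assign each value to its nearest center
        labels = [min(range(len(centers)), key=lambda j: abs(v - centers[j]))
                  for v in values]
        # recompute each center as the mean of its members
        for j in range(len(centers)):
            members = [v for v, l in zip(values, labels) if l == j]
            if members:
                centers[j] = sum(members) / len(members)
    return centers, labels

# Synthetic monthly utilization values: two normal groups and a heavy-use tail.
utilization = [2, 3, 2, 4, 3, 8, 9, 10, 9, 30, 35, 40]
centers, labels = kmeans_1d(utilization, centers=[2.0, 9.0, 30.0])
outlier_cluster = max(range(3), key=lambda j: centers[j])
outliers = [v for v, l in zip(utilization, labels) if l == outlier_cluster]
print(outliers)
```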
Abstract: Information retrieval has the objective of studying
models and building systems that allow a user to find the relevant
documents matching his information need. Information search
remains a difficult problem because of the difficulty of representing
and processing natural-language phenomena such as polysemy.
Intentional structures promise to be a new paradigm for extending
existing document structures and enhancing the different phases of
document processing, such as creation, editing, search and retrieval.
Recognizing the intentions of the authors of texts can reduce the
scale of this problem. In this article, we present an intention
recognition system based on a semi-automatic method for extracting
intentional information from a corpus of texts. This system is also
able to update an ontology of intentions to enrich the knowledge
base containing all the possible intentions of a domain. The approach
uses the construction of a semi-formal ontology, which is considered
a conceptualization of the intentional information contained in a
text. Experiments on scientific publications in the field of computer
science were conducted to validate this approach.
Abstract: Cognitive science appeared about 40 years ago,
subsequent to the challenge of artificial intelligence, as a common
territory for several scientific disciplines: computer science,
mathematics, psychology, neurology, philosophy, sociology and
linguistics. The newborn science was justified, on the one hand, by
the complexity of the problems related to human knowledge and, on
the other, by the fact that none of the above-mentioned sciences
could explain mental phenomena alone. Based on the data supplied
by experimental sciences such as psychology and neurology, models
of the operation of the human mind are built in cognitive science.
These models are implemented in computer programs and/or
electronic circuits (specific to artificial intelligence) – cognitive
systems – whose competences and performances are compared to
human ones, leading to the reinterpretation of psychology and
neurology data and, in turn, to the construction of new models. In
these processes, psychology provides the experimental basis, while
philosophy and mathematics provide the level of abstraction that is
utterly necessary for the interplay of the mentioned sciences.
The general problematics of the cognitive approach comprises
two important types of approach: the computational one, starting
from the idea that mental phenomena can be reduced to calculus
operations on 1s and 0s, and the connectionist one, which considers
the products of thinking to be the result of the interaction between
all the composing (included) systems. In psychology, measurements
in the computational register use classical questionnaires and
psychometric tests, generally based on calculus methods.
Considering both sides of cognitive science, we can notice a gap in
the possibilities for measuring psychological products from the
connectionist perspective, which requires a unitary understanding of
the quality-quantity whole. In such an approach, measurement by
calculus proves to be inefficient. Our research, carried out for more
than 20 years, leads to the conclusion that measurement by forms
properly fits the laws and principles of connectionism.
Abstract: Consumer demand for products with low fat or sugar content and low levels of food additives, as well as cost factors, make exopolysaccharides (EPS) a viable alternative. EPS remain an interesting tool to modulate the sensory properties of yoghurt. This study was designed to evaluate the EPS production potential of commercial yoghurt starter cultures (Yo-Flex starters: Harmony 1.0, TWIST 1.0 and YF-L902, Chr. Hansen, Denmark) and their influence on the apparent viscosity of yoghurt samples. The production of intracellularly synthesized EPS by the different commercial yoghurt starters varied roughly from 144.08 to 440.81 mg/l. The EPS-producing starters showed large variations in concentration and, presumably, composition. TWIST 1.0 produced greater amounts of EPS in MRS medium and in yoghurt samples, but no significant contribution to the development of texture or to the apparent viscosity of the final product was determined. The YF-L902 and Harmony 1.0 starters differed considerably in EPS yields, but not in the apparent viscosities (p > 0.05) of the final yoghurts. No correlation between EPS concentration and the viscosity of the yoghurt samples was established in the study.
Abstract: This paper attempts to establish that multi-state
network classification is essential for the performance enhancement
of transport protocols over satellite-based networks. A model to
classify multi-state network conditions, taking into consideration
both congestion and channel error, is evolved. In order to arrive at
such a model, an analysis of the impact of congestion and channel
error on RTT values has been carried out using ns2; the analysis
results are also reported in the paper. The inference drawn from this
analysis is used to develop a novel statistical RTT-based model for
multi-state network classification.
An Adaptive Multi State Proactive Transport Protocol consisting
of Proactive Slow Start, State-based Error Recovery, Timeout Action
and Proactive Reduction is proposed, which uses the multi-state
network classification model. This paper also confirms, through
detailed simulation and analysis, that prior knowledge of the overall
characteristics of the network helps in enhancing the performance of
the protocol over a satellite channel, which is significantly affected
by channel noise and congestion.
The necessary augmentation of the ns2 simulator was done to
simulate the multi-state network classification logic. This simulation
has been used in a detailed evaluation of the protocol under varied
levels of congestion and channel noise. The performance
enhancement of this protocol with reference to established protocols,
namely TCP SACK and Vegas, is discussed. The results discussed in
this paper clearly reveal that the proposed protocol always
outperforms its peers and shows a significant improvement in very
high error conditions, as envisaged in the design of the protocol.
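The idea of an RTT-based state classifier can be sketched very roughly as follows. The decision rule, feature choice and thresholds are all assumptions for illustration; the paper's statistical model is not specified in the abstract.

```python
# Hypothetical sketch of an RTT-based state classifier (thresholds made up):
# congestion inflates the mean RTT, while channel error mainly causes losses
# without a comparable RTT rise, so RTT samples can help separate the states.
def classify_state(rtt_samples, loss_seen, rtt_threshold=0.6):
    """Return a coarse network state from recent RTT samples and loss events."""
    mean_rtt = sum(rtt_samples) / len(rtt_samples)
    if loss_seen and mean_rtt > rtt_threshold:
        return "congestion"
    if loss_seen:
        return "channel_error"
    return "normal"

print(classify_state([0.8, 0.9, 0.7], loss_seen=True))   # high RTT with loss
print(classify_state([0.3, 0.35, 0.4], loss_seen=True))  # loss at normal RTT
```

A proactive transport protocol could then pick its recovery action (e.g. rate reduction versus plain retransmission) according to the returned state.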
Abstract: A new algorithm called Character-Comparison to Character-Access (CCCA) is developed to test the effect of both 1) converting character comparison and number comparison into character access and 2) the starting point of checking, on the performance of the checking operation in string searching. An experiment is performed using both English text and DNA text with different sizes. The results are compared with five algorithms, namely Naive, BM, Inf_Suf_Pref, Raita and Cycle. With the CCCA algorithm, the results suggest that the average number of total comparisons is improved by up to 35%. Furthermore, the results suggest that the clock time required by the other algorithms is improved by between 22.13% and 42.33% with the new CCCA algorithm.
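The evaluation criterion mentioned above, the number of character comparisons, can be illustrated with the naive baseline instrumented to count comparisons. This is a sketch of the baseline only, not the CCCA algorithm itself, which the abstract does not specify in detail.

```python
# Illustrative baseline (not the CCCA algorithm): naive string search
# instrumented to count character comparisons, the evaluation criterion the
# paper reports improving by up to 35%.
def naive_search(text, pattern):
    """Return (index of first match or -1, number of character comparisons)."""
    comparisons = 0
    for i in range(len(text) - len(pattern) + 1):
        j = 0
        while j < len(pattern):
            comparisons += 1
            if text[i + j] != pattern[j]:
                break
            j += 1
        if j == len(pattern):
            return i, comparisons
    return -1, comparisons

print(naive_search("ACGTACGT", "TAC"))
```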
Abstract: Vision-based intelligent vehicle applications often require large amounts of memory to handle video streaming and image processing, which in turn increases the complexity of hardware and software. This paper presents an FPGA implementation of a vision-based blind spot warning system. Using video frames, the information in the blind spot area is reduced to one-dimensional information. Analysis of the estimated entropy of the image allows the detection of an object in time. This idea has been implemented on the XtremeDSP video starter kit. The blind spot warning system uses only 13% of the logic resources and 95k bits of block memory, and its frame rate is over 30 frames per second (fps).
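The entropy-based detection idea can be illustrated in software with the Shannon entropy of a grey-level histogram: a near-uniform empty-road region has low entropy, while an object entering the region raises it. This is an illustrative sketch with synthetic pixel data, not the paper's FPGA design.

```python
# Illustrative sketch (synthetic data, not the paper's FPGA code): Shannon
# entropy of a grey-level histogram as a one-dimensional summary of an image
# region; a sudden entropy change over frames can indicate an object entering
# the blind spot.
import math

def region_entropy(pixels, levels=256):
    """Shannon entropy (bits) of the grey-level distribution of a region."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    n = len(pixels)
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

empty_road = [120] * 64                            # uniform region: low entropy
with_object = [120] * 32 + [30] * 16 + [200] * 16  # mixed region: higher entropy
print(region_entropy(empty_road), region_entropy(with_object))
```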
Abstract: In order to optimize annual IT spending and to reduce
the complexity of entire system architectures, SOA trials have been
started. It is common knowledge that to design an SOA system we
have to adopt a top-down approach, but in reality silo systems are
being built, so these companies cannot reuse newly designed services
and cannot enjoy SOA's economic benefits. To prevent this situation,
we designed a generic SOA development process referred to as the
architecture of "mass customization".
To define the generic detailed development processes, we did a
case study on an imaginary company. Through the case study, we
could define practical development processes and found that this
could vastly reduce the development costs of updates.
Abstract: The increasing competitiveness in the manufacturing
industry is forcing manufacturers to seek effective processing
schedules. The paper presents an optimization approach to
manufacturing scheduling for dependent parts processed with given
processing sequences and times on multiple machines. By defining
the decision variables as the start and end moments of each part's
processing, it is possible to use straightforward variable restrictions
to satisfy different technological requirements and to formulate
optimization tasks that are easy to understand and solve for multiple
numbers of parts and machines. A case study example is solved for
seven base moldings for CNC metalworking machines, processed on
five different machines with a given processing order among parts
and machines and known processing times. By solving the linear
optimization task, the optimal manufacturing schedule minimizing
the overall processing time is obtained. The manufacturing schedule
defines the moments of delivery of the moldings, thus minimizing
storage costs and ensuring due-time satisfaction for mounting. The
proposed optimization approach is based on a real manufacturing
plant problem. Different processing schedule variants for different
technological restrictions were defined and implemented in the
practice of the Bulgarian company RAIS Ltd. The proposed approach
can be generalized to other job-shop scheduling problems in
different applications.
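The start/end-moment formulation described above can be sketched on a toy instance: with fixed processing sequences, the constraints "each operation starts no earlier than its predecessors end" define the schedule, and the earliest-start solution minimizes the overall processing time. The data below are invented for illustration, not the seven-molding case study.

```python
# Simplified sketch (assumed toy data, not the paper's case study): with fixed
# processing sequences, the constraints start[j] >= end[i] for each
# predecessor i of operation j, with end[j] = start[j] + duration[j], define
# the schedule; earliest starts minimize the makespan.

def schedule(durations, predecessors):
    """Earliest start/end moments for operations under precedence constraints.
    `durations` must list operations in a topologically valid order."""
    start, end = {}, {}
    for op, dur in durations.items():
        start[op] = max((end[p] for p in predecessors.get(op, [])), default=0)
        end[op] = start[op] + dur
    return start, end

# Toy example: two parts, A and B, each with two operations.
durations = {"A1": 3, "A2": 2, "B1": 4, "B2": 1}
predecessors = {
    "A2": ["A1"],        # part A: operation 2 follows operation 1
    "B1": ["A1"],        # the first machine processes A1 before B1
    "B2": ["B1", "A2"],  # B2 follows B1, and its machine frees after A2
}
start, end = schedule(durations, predecessors)
print(end)
print(max(end.values()))  # overall processing time (makespan)
```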