Abstract: This paper presents a novel methodology for Maximum Power Point Tracking (MPPT) of a grid-connected 20 kW Photovoltaic (PV) system using a neuro-fuzzy network. The proposed method predicts the reference PV voltage that guarantees optimal power transfer between the PV generator and the main utility grid. The neuro-fuzzy network is composed of a fuzzy rule-based classifier and three Radial Basis Function Neural Networks (RBFNN). The network inputs (irradiance and temperature) are classified before being fed into the appropriate RBFNN for either training or estimation, while the output is the reference voltage. The main advantage of the proposed methodology, compared to a conventional single neural network-based approach, is its superior generalization ability with respect to the nonlinear and dynamic behavior of a PV generator. In fact, the neuro-fuzzy network is a neural-network-based multi-model machine learning approach that defines a set of local models emulating the complex and nonlinear behavior of a PV generator under a wide range of operating conditions. Simulation results under several rapid irradiance variations show that the proposed MPPT method achieves higher efficiency than a conventional single neural network.
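The classify-then-estimate structure described above can be sketched in code. This is a minimal illustration, not the paper's implementation: the rule thresholds, network sizes, normalisation constants and weights below are all hypothetical stand-ins, and a real system would train each RBFNN on measured (irradiance, temperature, reference-voltage) data for its operating region.

```python
import numpy as np

def rbf_predict(x, centers, widths, weights):
    """Evaluate an RBF network: weighted sum of Gaussian basis functions."""
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2.0 * widths ** 2))
    return float(weights @ phi)

def classify_region(irradiance, temperature):
    """Rule-based classifier selecting one of three local models.
    Thresholds are hypothetical; the paper's fuzzy rules are not given."""
    if irradiance < 300.0:
        return 0      # low-irradiance model
    elif irradiance < 700.0:
        return 1      # medium-irradiance model
    return 2          # high-irradiance model

# Three stand-in "trained" RBFNNs, one per operating region.
rng = np.random.default_rng(0)
models = [dict(centers=rng.uniform(0, 1, (5, 2)),
               widths=np.full(5, 0.3),
               weights=rng.uniform(0, 1, 5)) for _ in range(3)]

def reference_voltage(irradiance, temperature):
    x = np.array([irradiance / 1000.0, temperature / 100.0])  # normalise inputs
    m = models[classify_region(irradiance, temperature)]
    return rbf_predict(x, m["centers"], m["widths"], m["weights"])
```

The design point is that each local RBFNN only has to fit one region of the (irradiance, temperature) plane, which is what gives the multi-model approach its generalization advantage over a single network.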
Abstract: Nowadays, the rapid development of multimedia and the Internet allows for wide distribution of digital media data. It has become much easier to edit, modify and duplicate digital information. Digital documents are also easy to copy and distribute, and therefore face many threats. Security and privacy have become major issues: with the large flood of information and the development of digital formats, appropriate protection is necessary because of the significance, accuracy and sensitivity of the information. Current protection systems can be classified as information hiding, information encryption, or a combination of hiding and encryption to increase security. The strength of information-hiding science lies in the non-existence of standard algorithms for hiding secret messages, and in the randomness of hiding methods, such as combining several media (covers) with different methods to pass a secret message. In addition, there are no formal methods for discovering hidden data, which makes the task of this research difficult. In this paper, a new information-hiding system is presented. The proposed system aims to hide information (a data file) in any executable file (EXE) and to detect the hidden file; we present an implementation of a steganography system which embeds information in an executable file. The system tries to solve the problem of the size of the cover file and to make the result undetectable by anti-virus software. The system includes two main functions: first, hiding information in a Portable Executable (EXE) file, through four processes (specifying the cover file, specifying the information file, encrypting the information, and hiding the information); and second, extracting the hidden information through three processes (specifying the stego file, extracting the information, and decrypting the information). The system achieves its main goals: the size of the cover file is independent of the size of the hidden information, and the resulting file does not conflict with anti-virus software.
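The abstract does not disclose the embedding algorithm, so the sketch below illustrates only one plausible mechanism consistent with the stated goals (cover size independent of payload, file still executable): appending an encrypted payload after the PE image, in the so-called overlay, which the Windows loader ignores. The `MAGIC` marker and the XOR cipher are illustrative stand-ins for the paper's unspecified format and encryption.

```python
import struct

MAGIC = b"STEG"  # hypothetical payload marker; not the paper's actual format

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy symmetric cipher, a stand-in for the paper's unspecified encryption."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def hide(cover: bytes, secret: bytes, key: bytes) -> bytes:
    """Append the encrypted payload after the PE image (the 'overlay').
    The loader ignores overlay bytes, so the cover file still executes."""
    payload = xor_cipher(secret, key)
    return cover + MAGIC + struct.pack("<I", len(payload)) + payload

def extract(stego: bytes, key: bytes) -> bytes:
    """Locate the marker, read the length field, and decrypt the payload."""
    idx = stego.rfind(MAGIC)
    if idx < 0:
        raise ValueError("no hidden payload found")
    (length,) = struct.unpack_from("<I", stego, idx + len(MAGIC))
    start = idx + len(MAGIC) + 4
    return xor_cipher(stego[start:start + length], key)
```

Note that the payload length is unbounded by the cover size under this scheme, which matches the stated goal of making cover size and payload size independent.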
Abstract: This paper presents an overview of material-property testing for selected rapid prototyping technologies. Rapid prototyping technologies used today for the production of models and final parts employ materials whose initial state is solid, liquid or powder. In the solid state, various forms are used, such as pellets, wire or laminates. The basic range of materials includes paper, nylon, wax, resins, metals and ceramics. In Fused Deposition Modeling (FDM) rapid prototyping technology, the basic materials are mainly ABS (Acrylonitrile Butadiene Styrene), polyamide, polycarbonate, polyethylene and polypropylene. For advanced FDM applications, special materials are used, such as silicon nitride, PZT (piezoceramic lead zirconate titanate), aluminium oxide, hydroxyapatite and stainless steel.
Abstract: The world is moving rapidly toward the deployment of information and communication systems. Nowadays, fast-growing computing systems are found everywhere, and one of their main challenges is the increasing number of attacks and security threats against them. Thus, capturing, analyzing and verifying security requirements has become a very important activity in the development process of computing systems, especially when developing systems such as banking, military and e-business systems. For developing any system, a process model is chosen which includes a process, methods and tools. The Rational Unified Process (RUP) is one of the most popular and complete process models used by developers in recent years. This process model should be extended for use in developing secure software systems. In this paper, the Requirement Discipline of RUP is extended to improve RUP for developing secure software systems. The proposed extensions add and integrate a number of Activities, Roles, and Artifacts into RUP in order to capture, document and model the threats and security requirements of a system. These extensions introduce a group of clear and stepwise activities to developers. By following these activities, developers ensure that security requirements are captured and modeled. These models are then used in design, implementation and test activities.
Abstract: Knowledge capabilities are increasingly important for innovative technology enterprises seeking to enhance business performance in terms of product competitiveness, innovation and sales. Recognition of company capability through auditing allows them to further pursue advancement and strategic planning, and hence gain competitive advantages. This paper attempts to develop an Organizations' Knowledge Capabilities Assessment (OKCA) method to assess the knowledge capabilities of technology companies. The OKCA is a questionnaire-based assessment tool developed to uncover the impact of various knowledge capabilities on different aspects of organizational performance. The collected data is then analyzed to identify the crucial elements for different technology companies. Based on the results, innovative technology enterprises are able to recognize directions for further improvement of business performance and future development plans. External environmental factors affecting organizational performance can be found through further analysis of selected reference companies.
Abstract: Calcium is a vital second messenger used in signal transduction. Calcium controls secretion, cell movement, muscular contraction, cell differentiation, ciliary beating and so on. Two theories have been used to simplify the system of reaction-diffusion equations of calcium into a single equation. One is the excess buffer approximation (EBA), which assumes that mobile buffer is present in excess and cannot be saturated. The other is the rapid buffer approximation (RBA), which assumes that calcium binding to buffer is rapid compared to the calcium diffusion rate. In the present work, an attempt has been made to develop a model for calcium diffusion under the excess buffer approximation in neuron cells. This model incorporates the effect of [Na+] influx on [Ca2+] diffusion, variable calcium and sodium sources, the sodium-calcium exchange protein, the sarcolemmal calcium ATPase pump, and sodium and calcium channels. The proposed mathematical model leads to a system of partial differential equations which have been solved numerically using the Forward Time Centered Space (FTCS) approach. The numerical results have been used to study the relationships among different parameters such as buffer concentration, association rate, and calcium permeability.
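The FTCS discretization named above can be illustrated on a simplified one-dimensional version of the problem. Under the excess buffer approximation, the calcium equation reduces to a linear reaction-diffusion equation, ∂C/∂t = D ∂²C/∂x² − k(C − C∞). The parameter values below are hypothetical, and fluxes such as the Na+/Ca2+ exchanger and the ATPase pump are folded into the single decay term for brevity.

```python
import numpy as np

# Hypothetical parameters; the paper's values are not given in the abstract.
D, k, C_inf = 250.0, 10.0, 0.1    # diffusion (um^2/s), buffer rate (1/s), rest (uM)
L, nx, dt, steps = 10.0, 101, 1e-5, 2000
dx = L / (nx - 1)
assert D * dt / dx**2 <= 0.5      # FTCS (explicit) stability condition

C = np.full(nx, C_inf)
C[0] = 1.0                        # boundary source, e.g. an open calcium channel
for _ in range(steps):
    # Centered second difference in space, forward difference in time.
    lap = (C[2:] - 2 * C[1:-1] + C[:-2]) / dx**2
    C[1:-1] += dt * (D * lap - k * (C[1:-1] - C_inf))
    C[-1] = C_inf                 # far-field boundary held at resting level
```

The stability assertion is the standard FTCS constraint D·dt/dx² ≤ 1/2; violating it makes the explicit scheme diverge regardless of the physical parameters.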
Abstract: The introduction of haptic elements into graphical user interfaces is becoming more widespread. Since haptics are being introduced rapidly into computational tools, investigating how these models affect Human-Computer Interaction helps define how to integrate and model new modes of interaction. The interest of this paper is to discuss and investigate the issues surrounding Haptic and Graphical User Interface (GUI) designs as separate systems, as well as to understand how they work in tandem. The development of these systems is explored from a psychological perspective, based on how usability is addressed through learning and affordances, as defined by J.J. Gibson. Haptic design can be a powerful tool, aiding intuitive learning. The problem discussed within the text is how haptic interfaces can be integrated within a GUI without a sense of frivolity. Juxtaposing haptics and graphical user interfaces raises issues of motivation; GUIs tend to rely on a performatory process, while haptic interfaces use affordances to learn tool use. In a deeper view, it is noted that two modes of perception, foveal and ambient, dictate perception. These two modes were once thought to work in tandem; however, it has been discovered that they work independently of each other. Ambient vision interprets orientation in space, providing for posture, locomotion, and motor skills, while variations in foveal sensory information instruct perceptions of object-task performance. It is contended here that object-task performance is a key element in the use of haptic interfaces, because exploratory learning uses affordances in order to use an object without mediating the experience cognitively. It is a direct experience that, through iteration, can lead to skill-sets. It is also indicated that object-task performance will not work as efficiently without the use of exploratory or kinesthetic learning practices. Therefore, object-task performance is not explored as thoroughly in GUIs as it is practiced in haptic interfaces.
Abstract: Sunflower stalks were analysed for chemical composition: pentosan 15.84%, holocellulose 70.69%, alpha-cellulose 45.74%, glucose 27.10% and xylose 7.69%, based on the dry weight of 100 g of raw material. The optimum condition for steam-explosion pretreatment was as follows. Sunflower stalks were cut into small pieces and soaked in 0.02 M H2SO4 overnight. After that, they were steam-exploded at 207 °C and 21 kg/cm2 for 3 minutes to fractionate cellulose, hemicellulose and lignin. The resulting hydrolysate, containing hemicellulose, and the cellulose pulp contained xylose at 2.53% and 7.00%, respectively. The pulp was further subjected to enzymatic saccharification at 50 °C (pH 4.8, citrate buffer) with pulp/buffer 6% (w/w) and Celluclast 1.5L/pulp 2.67% (w/w) to obtain glucose with a maximum yield of 11.97%. Fixed-bed fermentation under optimum conditions using conventional yeast mixtures to produce bioethanol gave a maximum ethanol yield of 0.028 g/100 g sunflower stalk.
Abstract: Stevia rebaudiana Bertoni (a natural sweetener) belongs to the Asteraceae family and can be used as a substitute for artificial sweeteners for diabetic patients. Conventionally, it is cultivated by seeds or stem cuttings, but the seed viability rate is poor. A protocol for callus induction and multiplication was developed to produce a large number of calli in a short period. Surface-sterilized nodal, leaf and root explants were cultured on Murashige and Skoog (MS) medium with different concentrations of plant hormones such as IBA, kinetin, NAA, 2,4-D, and NAA in combination with 2,4-D. 100% callusing was observed from leaf explants cultured on the combination of NAA and 2,4-D after three weeks, while with 2,4-D alone only 10% callusing was observed. Calli obtained from leaf and root explants were shiny green, while those from nodal explants were hard and brown. The present findings deal with the induction of callusing in Stevia to achieve rapid callus multiplication for the study of steviol glycosides in callus culture.
Abstract: Due to the legacy of apartheid segregation South Africa remains a divided society where most voters live in politically homogenous social environments. This paper argues that political discussion within one’s social context plays a primary role in shaping political attitudes and vote choice. Using data from the Comparative National Elections Project 2004 and 2009 South African post-election surveys, the paper explores the extent of social context partisan homogeneity in South Africa and finds that voters are not overly embedded in homogenous social contexts. It then demonstrates the consequences of partisan homogeneity on voting behavior. Homogenous social contexts tend to encourage stronger partisan loyalties and fewer defections in vote choice while voters in more heterogeneous contexts show less consistency in their attitudes and behaviour. Finally, the analysis shows how momentous sociopolitical events at the time of a particular election can change the social context, with important consequences for electoral outcomes.
Abstract: Electronic commerce is growing rapidly, with on-line sales already heading for hundreds of billions of dollars per year. Due to the huge amount of money transferred every day, an increased security level is required. In this work we present the architecture of an intelligent speaker verification system, which is able to accurately verify the registered users of an e-commerce service using only their voices as input. According to the proposed architecture, a transaction-based e-commerce application should be complemented by a biometric server where each customer's unique set of speech models (voiceprint) is stored. The verification procedure asks the user to pronounce a personalized sequence of digits; speech is captured and voice features are extracted at the client side, then sent back to the biometric server. The biometric server uses pattern recognition to decide whether the received features match the stored voiceprint of the customer the user claims to be, and grants verification accordingly. The proposed architecture can provide e-commerce applications with a higher degree of certainty regarding the identity of a customer, and prevent impostors from executing fraudulent transactions.
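As a rough illustration of the server-side decision step, the sketch below accepts or rejects a claimed identity by comparing the feature vector received from the client against the stored voiceprint. The cosine-similarity score and threshold are illustrative stand-ins: the abstract does not specify the pattern-recognition method, and a deployed system would score against statistical speaker models (e.g. GMMs) rather than a single vector.

```python
import numpy as np

def verify(features, voiceprint, threshold=0.85):
    """Accept the claimed identity if the features extracted at the client
    side are close enough to the enrolled voiceprint (cosine score).
    The threshold trades off false accepts against false rejects."""
    score = features @ voiceprint / (
        np.linalg.norm(features) * np.linalg.norm(voiceprint))
    return score >= threshold, float(score)
```

In the described architecture, only the extracted features travel over the network, and this comparison runs on the biometric server holding the enrolled voiceprints.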
Abstract: Changing technology and increased constituent demand for government services drive the need for governmental responsiveness. Government organisations in developing countries will be under increased pressure to change their bureaucratic systems so as to respond rapidly to changing and growing requirements and to rapid technological advancement. This paper aims to present a conceptual framework for explaining the main barriers and drivers of public e-service development. The framework provides a basic context within which the process and practice of e-services can be implemented successfully in public sector organisations. The framework is flexible enough to be adopted by governments at different levels, national or local, in developing countries around the world.
Abstract: This study aimed to develop and initially validate an instrument that measures social competency among tertiary-level faculty members. A review of the extant literature on social competence was conducted, and it led to the writing of the items in the initial instrument, which was evaluated by 11 Subject Matter Experts (SMEs). The SMEs were either educators or psychologists. The results of the SMEs' evaluations served as the basis for the creation of the pre-try-out instrument used in the first trial run. Insights from the first trial-run participants led to the development of the main try-out instrument used in the final test administration. One hundred forty-one participants from five private Higher Education Institutions (HEIs) in the National Capital Region (NCR) and five private HEIs in Central Luzon in the Philippines participated in the final test administration. The reliability of the instrument was evaluated using Cronbach's coefficient alpha and was found to be 0.92. Factor analysis was used to evaluate the validity of the instrument, and six factors were identified. The development of the final instrument was based on the results of the evaluation of the instrument's reliability and validity. For purposes of recognition, the instrument was named the "Social Competency Inventory for Tertiary Level Faculty Members (SCI-TLFM)."
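Cronbach's coefficient alpha, used above to evaluate reliability, has a standard closed form: alpha = k/(k−1) · (1 − Σ item variances / variance of total scores), where k is the number of items. A minimal computation over a respondents-by-items score matrix (the data here are illustrative, not the study's):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's coefficient alpha for a (respondents x items) matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                            # number of items
    item_vars = scores.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)     # variance of total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)
```

Perfectly correlated items yield alpha = 1, and values around 0.9, like the 0.92 reported above, indicate high internal consistency.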
Abstract: Most neural network (NN) models of human category learning use a gradient-based learning method, which assumes that locally optimal changes are made to model parameters on each learning trial. This method tends to underpredict variability in individual-level cognitive processes. In addition, many recent models of human category learning have been criticized for not being able to replicate the rapid changes in categorization accuracy and attention processes observed in empirical studies. In this paper we introduce stochastic learning algorithms for NN models of human category learning and show that use of these algorithms can result in (a) rapid changes in accuracy and attention allocation, and (b) different learning trajectories and more realistic variability at the individual level.
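The abstract does not specify the stochastic algorithms, so the sketch below shows one generic non-gradient alternative on a toy task: random weight perturbations accepted only when trial accuracy does not decrease. Accepted proposals can move the weights by large amounts at once, the kind of abrupt accuracy change that smooth gradient updates tend to average away. The task, architecture and acceptance rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def accuracy(w, X, y):
    """Fraction of stimuli the linear categorizer labels correctly."""
    return np.mean((X @ w > 0) == y)

# Toy category-learning task: linearly separable 4-feature stimuli.
X = rng.normal(size=(200, 4))
y = X @ np.array([1.0, -1.0, 0.5, 0.0]) > 0

# Stochastic (non-gradient) learning: propose a random perturbation of
# the weights each trial and keep it only if accuracy does not drop.
w = np.zeros(4)
for trial in range(500):
    proposal = w + rng.normal(scale=0.5, size=4)
    if accuracy(proposal, X, y) >= accuracy(w, X, y):
        w = proposal
```

Because the proposals are random, two simulated "participants" with different random seeds follow different trajectories, which is the individual-level variability the abstract emphasizes.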
Abstract: The interdependences among stock market indices have long been studied by academics around the world. The current financial crisis opened the door to a wide range of opinions concerning the understanding and measurement of the connections underlying the controversial phenomenon of market integration. Using data on the log-returns of 17 stock market indices, including most of the CEE markets, from 2005 until 2009, our paper studies these dependences using a new methodological tool that takes into account both the volatility clustering effect and the stochastic properties of these linkages through a Dynamic Conditional System of Simultaneous Equations. We find that the crisis is well captured by our model, as it provides evidence for the high volatility – high dependence effect.
Abstract: Gas condensate reservoirs show complicated thermodynamic behavior when their pressure drops below the dew point pressure. Condensate blockage around the producing well causes a significant reduction of the production rate as the bottom-hole pressure drops below the saturation pressure. The main objective of this work was to examine the well test analysis of a naturally fractured lean gas condensate reservoir and to investigate the effect of condensate formed around the well-bore on the behavior of the single-phase pseudo-pressure and its derivative curves. In this work, a naturally fractured lean gas condensate reservoir is simulated with a compositional simulator. Different sensitivity analyses are performed on the Corey parameters, and the simulator output is fed to analytical well-testing software. To capture these phenomena, eighteen compositional models with the capillary number effect are constructed. Matrix relative permeability obeys the Corey relative permeability model, and relative permeability in the fracture is linear. The well-testing behavior of these models is studied and interpreted. Results show that sensitivity analyses on the matrix relative permeability do not have a strong effect on well-testing behavior, even when most of the matrix around the well is occupied with condensate.
Abstract: Throughout thirty years of local, national and international experience in medicine as a medical student, junior doctor and eventually Consultant and Professor in Anaesthesia, Intensive Care and Pain Management, I have noted significant generalised dissatisfaction among medical students and doctors regarding their medical education and practice. We repeatedly hear complaints from patients about the dysfunctional health care system they are dealing with and, subsequently, the poor medical service that they are receiving. Medical students are bombarded with lectures, tutorials, clinical rounds and various exams. Clinicians are weighed down by a never-ending array of competing duties. Patients are extremely unhappy about the long waiting lists, the loss of their records and the continuous deterioration of the health care service. This problem has been reported in different countries by several authors [1,2,3]. In a trial to solve this dilemma, it has been suggested to implement computer technology in medicine [2,3]. Computers in medicine are a medium of international communication of the revolutionary advances being made in applying the computer to the fields of bioscience and medicine [4,5]. Awareness of using computers in medicine has recently increased all over the world. At Misr University for Science & Technology (MUST), Egypt, medical students are now given hand-held computers (laptops) with Internet access, making their medical education accessible, convenient and up to date. However, this trial still needs to be validated. To help readers catch up with the ongoing fast development in this interesting field, the author has decided to continue reviewing the literature, exploring the state of the art in computer-based medicine and updating medical professionals, especially local trainee doctors in Egypt.
In Part I of this review article, we give a general background discussing the potential use of computer technology in the various aspects of the medical field, including education, research, clinical practice and the health care service given to patients. We hope this will help start changing the culture and promote awareness of the importance of implementing information technology (IT) in medicine, a field in which such help is needed. International collaboration is recommended to support emerging countries in achieving this target.
Abstract: The purpose of this study is to identify the underlying causes of late payment from the contractors' perspective in the Malaysian construction industry and to recommend effective solutions to mitigate late-payment problems. The target respondents in this study were Grade G3, G5, G6 and G7 contractors specializing in building works and civil engineering works registered with the Construction Industry Development Board (CIDB) in Malaysia. Results from this study were analyzed with the Statistical Package for the Social Sciences (SPSS 15.0). The study found that respondents ranked the following five variables, out of a total of forty-one, as the most significant causes of late payment: a) cash flow problems due to deficiencies in the client's management capacity (mean = 3.96); b) the client's ineffective utilization of funds (mean = 3.88); c) scarcity of capital to finance the project (mean = 3.81); d) the client's failure to obtain funds from the bank when sales of houses do not hit the targeted amount (mean = 3.72); and e) poor cash flow because of lack of proper process implementation, delay in releasing the retention monies to the contractor, and delay in the evaluation and certification of interim and final payments (mean = 3.66).
Abstract: The aim of this article is to extend and develop econometric and network-structure-based methods that can detect price manipulation on the Tehran Stock Exchange. The principal goal of the present study is to offer a model for identifying price manipulation on the Tehran Stock Exchange. To do so, a sample of 397 companies listed on the Tehran Stock Exchange was selected, and information on their prices and trading volumes from 2001 until 2009 was collected. Through a runs test, a skewness test and a duration correlation test, the selected companies were divided into two sets: manipulated and non-manipulated companies. In the next stage, by investigating the cumulative return process and the trading volumes of the manipulated companies, the starting date of price manipulation was identified. Then, using a logit model, an artificial neural network and multiple discriminant analysis, together with information on company size, information transparency, P/E ratio and stock liquidity one year prior to price manipulation, models for forecasting price manipulation of the stocks of companies listed on the Tehran Stock Exchange were designed. Finally, the forecasting power of the models was studied using the test set. The forecasting accuracy on the test set was 92.1% for the logit model, 94.1% for the artificial neural network and 90.2% for the multiple discriminant analysis model; therefore, all three models have high power to forecast price manipulation, and there is no considerable difference in forecasting power among them.
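The logit model described above can be sketched as follows. The data here are synthetic stand-ins for the real firm-year observations (size, information transparency, P/E ratio, liquidity one year before manipulation), and the coefficients are hypothetical; the point is only the mechanics of fitting and scoring a logistic model for a binary manipulated/non-manipulated label.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in data: four standardized firm features and a 0/1 label
# (1 = manipulated). The "true" coefficients below are purely illustrative.
X = rng.normal(size=(300, 4))
true_w = np.array([-1.2, -0.8, 0.6, -0.5])
y = (1 / (1 + np.exp(-(X @ true_w))) > rng.uniform(size=300)).astype(float)

# Fit the logit model by gradient ascent on the log-likelihood.
w = np.zeros(4)
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w)))          # predicted manipulation probability
    w += 0.1 * X.T @ (y - p) / len(y)       # log-likelihood gradient step

accuracy = np.mean(((1 / (1 + np.exp(-(X @ w)))) > 0.5) == y)
```

In practice the model would be fitted on the pre-manipulation observations of the labeled companies and evaluated on a held-out test set, as the study does.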
Abstract: Recently, content delivery services have grown rapidly over the Internet. For ASPs (Application Service Providers) providing content delivery services, a P2P architecture is beneficial for reducing outgoing traffic from content servers. On the other hand, ISPs are suffering from the increase in P2P traffic. P2P traffic is unnecessarily redundant because the same content, or the same fractions of content, is transferred through an inter-ISP link several times. Subscriber ISPs have to pay a transit fee to upstream ISPs based on the volume of inter-ISP traffic. Several works have addressed P2P traffic reduction to solve such problems; however, these existing works cannot control the traffic volume of a particular link. To meet this ISP operational requirement, we propose a method to keep the traffic volume of a link within a preconfigured upper bound. We evaluated the proposed method by conducting a simulation at a 1,000-user scale and confirmed that the traffic volume could be kept below the upper bound under all evaluated conditions. Moreover, our method could control the traffic volume to 98.95% of link usage against the target value.
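The control objective can be illustrated with a toy peer-selection rule (an assumed mechanism for illustration, not necessarily the paper's algorithm): serve each chunk request from an intra-ISP peer when one holds the chunk, and use the inter-ISP link only while the preconfigured volume budget has not been exhausted.

```python
def select_source(local_peers, remote_peers, chunk, link_budget, chunk_size):
    """Pick a peer to serve `chunk`.
    Returns (peer, remaining_budget); peer is None if the request must
    wait, because serving it remotely would exceed the link's upper bound."""
    for p in local_peers:                    # intra-ISP peers cost nothing
        if chunk in p["chunks"]:
            return p, link_budget
    if link_budget >= chunk_size:            # inter-ISP link: spend budget
        for p in remote_peers:
            if chunk in p["chunks"]:
                return p, link_budget - chunk_size
    return None, link_budget                 # defer rather than exceed the bound
```

Deferred requests would be retried once local peers obtain the chunk or the accounting window resets, which is how the link volume stays below the configured upper bound by construction.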