Abstract: eGovernment is spreading rapidly through public administrations around the world, and beyond public administration into government organizations as a whole. Governments use information technology, above all the Internet, to facilitate the exchange of services between government agencies and citizens, businesses, employees and non-governmental organizations. With
efficient and transparent information exchange, the information
becomes accessible to the society (citizens, business, employees etc.),
and as a result of these processes the society itself becomes the
information society or knowledge society. This paper discusses the significance and role of knowledge management in eGovernment development. It also reviews the role of virtual
communities as a knowledge management mechanism to support
eGovernment in Montenegro. It explores the need for knowledge
management in eGovernment, identifies knowledge management
technologies, and highlights the challenges for developing countries,
such as Montenegro, in implementing eGovernment. The
paper suggests that knowledge management is needed to facilitate
information exchange and transaction processing with citizens, as
well as to enable the creation of a knowledge society.
Abstract: To enhance the contrast in regions where pixels have similar intensities, this paper presents a new histogram equalization scheme. Conventional global equalization schemes over-equalize these regions, producing overly bright or dark pixels, while local equalization schemes produce unexpected discontinuities at block boundaries. The proposed algorithm segments the original histogram into sub-histograms with reference to brightness level and equalizes each sub-histogram within a limited extent of equalization determined by its mean and variance. The final image is the weighted sum of the equalized images obtained from the sub-histogram equalizations. By limiting the maximum and minimum ranges of the equalization operations on individual sub-histograms, the over-equalization effect is eliminated. The resulting image also does not lose feature information in low-density histogram regions, since these regions are equalized separately. The paper also describes how to determine the segmentation points in the histogram. The proposed algorithm has been tested on more than 100 images of various contrasts, and the results are compared with conventional approaches to show its superiority.
Abstract: This research aims to study consumer acceptance of tempeh made from various raw materials (types of bean) and to determine their protein contents for comparison. Tempeh was made from soybean, peanut, white kidney bean and sesame in the ratios soybean:sesame = 1:0.1, soybean:white kidney bean:sesame = 1:1:0.1, soybean:peanut:sesame = 1:1:0.1 and peanut:white kidney bean:sesame = 1:1:0.1. The study found that consumers were most satisfied with the appearance of the soybean, white kidney bean and black sesame tempeh (3.98). The tempeh with the most satisfying texture was the soybean, peanut and black sesame tempeh (4.00); the most satisfying odor, the peanut, white kidney bean and black sesame tempeh (4.04); and the most satisfying flavor, the peanut, white kidney bean and black sesame tempeh (4.2). Among the products, plain soybean tempeh had the highest protein content. Adding sesame seeds decreased the protein content slightly (by 1.86 and 0.6%); using peanut as a raw material decreased it by 15.3%; and using white kidney bean as a raw material decreased it by 22.77-26.11%.
Abstract: Automatic reusability appraisal is helpful in
evaluating the quality of developed or developing reusable software
components and in identification of reusable components from
existing legacy systems, which can save the cost of developing software from scratch. However, the issue of how to identify reusable
components from existing systems has remained relatively
unexplored. In this research work, structural attributes of software
components are explored using software metrics and quality of the
software is inferred by different Neural Network based approaches,
taking the metric values as input. The calculated reusability value enables good-quality code to be identified automatically. The reusability values obtained are found to be close to those of the manual analysis traditionally performed by programmers or repository managers. The
developed system can be used to enhance the productivity and
quality of software development.
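The metrics-to-score mapping can be sketched with a single sigmoid neuron, the building block of the networks the abstract mentions. The weights, bias, and metric set below are entirely hypothetical illustrations; the paper trains real networks on measured metric data.

```python
# Illustrative sketch only: one sigmoid neuron mapping software-metric
# values to a reusability score in [0, 1]. Weights are hypothetical.
import math

def reusability_score(metrics, weights, bias):
    """metrics: e.g. [cyclomatic complexity, coupling, cohesion, LOC]."""
    z = sum(w * m for w, m in zip(weights, metrics)) + bias
    return 1.0 / (1.0 + math.exp(-z))   # sigmoid squashes the sum to [0, 1]

# Hypothetical weighting: low complexity/coupling and high cohesion
# push the reusability score up; size (LOC) pulls it down slightly.
weights = [-0.4, -0.6, 0.9, -0.01]
cohesive_component = [3, 2, 0.9, 120]    # small, cohesive, loosely coupled
tangled_component = [25, 14, 0.2, 800]   # large, tangled, tightly coupled
assert reusability_score(cohesive_component, weights, 1.0) > \
       reusability_score(tangled_component, weights, 1.0)
```

A trained multilayer network replaces the hand-picked weights with learned ones, but the input/output contract (metric vector in, reusability score out) is the same.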
Abstract: Hardware realization of a Neural Network (NN) depends to a large extent on the efficient implementation of a single neuron. FPGA-based reconfigurable computing architectures are suitable for hardware implementation of neural networks, but FPGA realization of ANNs with a large number of neurons is still a challenging task. This paper discusses the issues involved in the FPGA implementation of a multi-input neuron with linear/nonlinear excitation functions. An implementation method with a resource/speed tradeoff is proposed to handle signed decimal numbers. The VHDL code developed is tested using a Xilinx XC V50hq240 chip. To improve the speed of operation, a lookup table (LUT) method is used, and the problems involved in using an LUT for a nonlinear function are discussed. The percentage saving in resources and the improvement in speed with an LUT for a neuron are reported. An attempt is also made to derive a generalized formula for a multi-input neuron that facilitates approximate estimation of the total resource requirement and achievable speed for a given multilayer neural network, helping the designer choose the FPGA capacity for a given application. Using the proposed implementation method, a neural-network-based application, namely a space vector modulator for a vector-controlled drive, is presented.
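The lookup-table trick for the nonlinear excitation function can be sketched in software. The idea is to precompute the sigmoid on a fixed grid so the hardware only indexes a table instead of evaluating exp(); the table size and input range below are illustrative assumptions, and a real FPGA design would store fixed-point entries.

```python
# LUT sketch for a neuron's sigmoid excitation function: precompute
# values on a 256-entry grid over an assumed input range [-8, 8].
import math

TABLE_SIZE = 256
X_MIN, X_MAX = -8.0, 8.0
STEP = (X_MAX - X_MIN) / (TABLE_SIZE - 1)
LUT = [1.0 / (1.0 + math.exp(-(X_MIN + i * STEP))) for i in range(TABLE_SIZE)]

def sigmoid_lut(x):
    """Nearest-entry lookup; inputs outside the table range saturate."""
    i = round((x - X_MIN) / STEP)
    i = max(0, min(TABLE_SIZE - 1, i))
    return LUT[i]

# Worst-case error of the LUT versus the exact sigmoid over [-8, 8]:
err = max(abs(sigmoid_lut(x / 100) - 1 / (1 + math.exp(-x / 100)))
          for x in range(-800, 801))
```

This makes the LUT problem the abstract mentions concrete: accuracy is bounded by the grid step (here the worst-case error is under 0.02), so table size trades resources against precision.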
Abstract: This paper reports an experimental study of the steady-state heat transfer behaviour of a gas flowing through a fixed bed under different operating conditions. Experiments were carried out in a fixed bed packed with methanol synthesis catalyst and percolated by air at appropriate flow rates. Temperature distributions in both the radial and axial directions were investigated under different operating conditions. The effects of operating conditions, including the reactor inlet air temperature, the heating pipe temperature and the air flow rate, on the temperature distribution were investigated. The experimental results showed that a higher inlet air temperature was conducive to a uniform temperature distribution in the fixed bed. A large temperature drop existed in the radial direction, and this drop increased with increasing heating pipe temperature under the experimental conditions; the temperature profile in the vicinity of the heating pipe was strongly affected by the heating pipe temperature. A higher air flow rate can improve heat transfer in the fixed bed. Based on the thermal distribution, heat transfer models of the fixed bed can be established and the characteristics of the temperature distribution in the fixed bed finely described, which has important practical significance.
Abstract: Wireless mesh networks based on IEEE 802.11
technology are a scalable and efficient solution for next generation
wireless networking to provide wide-area wideband internet access to
a significant number of users. These wireless mesh networks may be deployed by different authorities and without any planning, so they potentially overlap partially or completely in the same service area. The aim of this paper is to design a new model that enhances the throughput of unplanned wireless mesh network deployments using Partitioning Hierarchical Clustering (PHC), since unplanned deployment degrades WMN performance. We use a throughput-optimization approach to model the unplanned WMN deployment problem on a PHC-based architecture, and introduce bridge nodes that allow interworking traffic between the overlapping WMNs as a solution to the performance degradation.
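The bridge-node idea can be sketched on a toy topology. Everything here is hypothetical (node names, adjacency, the selection rule): a bridge candidate is simply a node that belongs to both overlapping meshes and has neighbours in each, so it can carry interworking traffic between them.

```python
# Toy sketch: two overlapping mesh networks, each given as a mapping
# node -> set of radio neighbours. Node names are hypothetical.
wmn_a = {"a1": {"a2", "x"}, "a2": {"a1"}, "x": {"a1", "b1"}}
wmn_b = {"b1": {"b2", "x"}, "b2": {"b1"}, "x": {"b1", "a1"}}

def bridge_candidates(net_a, net_b):
    members_a, members_b = set(net_a), set(net_b)
    shared = members_a & members_b
    # A bridge node is in both networks and can reach non-shared
    # members of each, so it can forward traffic between them.
    return {n for n in shared
            if net_a[n] & (members_a - shared)
            and net_b[n] & (members_b - shared)}
```

On this topology only node "x" qualifies; a real deployment would then weigh candidates by load and link quality before assigning bridge duty.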
Abstract: In this paper, a uniform calculus-based approach is provided for synthesizing runtime monitors that check correctness properties specified in a large variety of logics, including future and past time logics, interval logics, state machines and parameterized temporal logics. We present a calculus mechanism to synthesize monitors from the logical specification for the incremental analysis of execution traces during testing and real runs. The monitor detects good and bad prefixes of a particular kind, namely those that are informative for the property under investigation. We elaborate the procedure for deriving monitors from the calculus.
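The notion of informative good and bad prefixes can be illustrated with a toy monitor, not the paper's calculus. For the hypothetical property "no 'error' occurs before 'ack'", a prefix reaching 'error' first is an informative bad prefix, one reaching 'ack' first is an informative good prefix, and any other prefix is inconclusive.

```python
# Toy three-valued runtime monitor over an event trace.

def monitor(trace):
    for event in trace:            # incremental scan of the execution trace
        if event == "error":
            return "bad"           # informative bad prefix found
        if event == "ack":
            return "good"          # informative good prefix found
    return "?"                     # prefix is not informative yet

assert monitor(["req", "work", "ack"]) == "good"
assert monitor(["req", "error"]) == "bad"
assert monitor(["req", "work"]) == "?"
```

A calculus-synthesized monitor generalizes exactly this shape: an automaton driven event by event that commits to a verdict only once the prefix is informative.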
Abstract: Some chromium(III) complexes were synthesized with three amino acids, L-glutamic acid, glycine and L-cysteine, as the ligands, in order to provide a new supplement containing Cr(III) for patients with type 2 diabetes mellitus. The complexes were prepared by refluxing a mixture of chromium(III) chloride in aqueous solution with L-glutamic acid, glycine or L-cysteine after pH adjustment with sodium hydroxide. The complexes were characterized by infrared and UV-Vis spectrophotometry and elemental analysis. The yields of the four products were 87.50% and 56.76% for the Cr-Glu complexes, 46.70% for the Cr-Gly complex and 40.08% for the Cr-Cys complex, respectively. The predicted structures of the complexes are [Cr(glu)2(H2O)2]·xH2O, Cr(gly)3·xH2O and Cr(cys)3·xH2O, respectively.
Abstract: With the exponential rise in the number of multimedia
applications available, the best-effort service provided by the Internet
today is insufficient. Researchers have been working on new
architectures like the Next Generation Network (NGN) which, by
definition, will ensure Quality of Service (QoS) in an all-IP based
network [1]. For this approach to become a reality, reservation of
bandwidth is required per application per user. WiMAX (Worldwide
Interoperability for Microwave Access) is a wireless communication
technology which has predefined levels of QoS which can be
provided to the user [4]. IPv6 has been created as the successor to IPv4 and resolves issues such as the availability of IP addresses and QoS. This paper provides a design to use the power of WiMAX as an
NSP (Network Service Provider) for NGN using IPv6. The use of the
Traffic Class (TC) field and the Flow Label (FL) field of IPv6 has
been explained for making QoS requests and grants [6], [7]. Using
these fields, the processing time is reduced and routing is simplified.
Also, we define the functioning of the ASN gateway and the NGN gateway (NGNG), which are edge-node interfaces in the NGN-WiMAX design. These gateways ensure QoS management through built-in functions and through certain physical resources and networking capabilities.
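The two IPv6 fields the design relies on sit in the first 32-bit word of the IPv6 header, laid out as version (4 bits) | Traffic Class (8 bits) | Flow Label (20 bits) per the IPv6 specification (RFC 2460). A minimal sketch of packing and unpacking that word:

```python
# Pack/unpack the first 32-bit word of an IPv6 header:
# version (4 bits) | Traffic Class (8 bits) | Flow Label (20 bits).

def pack_ipv6_word(traffic_class, flow_label):
    assert 0 <= traffic_class < 256 and 0 <= flow_label < 2 ** 20
    return (6 << 28) | (traffic_class << 20) | flow_label

def unpack_ipv6_word(word):
    return (word >> 28, (word >> 20) & 0xFF, word & 0xFFFFF)

# Hypothetical QoS marking: a TC value and an arbitrary flow label.
word = pack_ipv6_word(traffic_class=0b101110, flow_label=0x12345)
assert unpack_ipv6_word(word) == (6, 0b101110, 0x12345)
```

Because both fields live at fixed bit offsets, an edge gateway can read a flow's QoS request with two shifts and a mask, which is the processing-time advantage the abstract claims.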
Abstract: Recently, web services are often accessed from many types of devices. We have developed the shortest-path planning system called "Bus-Net" in Tottori prefecture as a web application to sustain public transport, and it used the same user interface for all device types. To support them all, the interface could not use JavaScript and similar technologies. We therefore developed a method that uses an individual user interface for each device type to improve convenience. Concretely, we defined formats for the condition input to the path-planning system and for the result output from it, and separated the system into a request-processing part and user-interface parts that depend on the device type. Using this method, we have also developed a special device for Bus-Net named "Intelligent-Bus-Stop".
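The request-processing core of a shortest-path planner like this can be sketched as Dijkstra's algorithm over a stop graph. The stops and travel times below are hypothetical; Bus-Net's real data and cost model are not described in the abstract.

```python
# Dijkstra shortest-path sketch over a bus-stop graph.
import heapq

def shortest_path(graph, start, goal):
    """graph: stop -> list of (neighbour, minutes); returns (cost, path)."""
    queue, seen = [(0, start, [start])], set()
    while queue:
        cost, stop, path = heapq.heappop(queue)
        if stop == goal:
            return cost, path
        if stop in seen:
            continue
        seen.add(stop)
        for nxt, minutes in graph.get(stop, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + minutes, nxt, path + [nxt]))
    return None   # goal unreachable

stops = {
    "Station": [("CityHall", 5), ("University", 12)],
    "CityHall": [("University", 4)],
    "University": [],
}
route = shortest_path(stops, "Station", "University")
```

Separating this engine from the device-specific interfaces, as the abstract describes, means each UI only has to serialize a (start, goal) request and render the returned path.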
Abstract: Image compression can improve the performance of
the digital systems by reducing time and cost in image storage
and transmission without significant reduction of the image quality.
Furthermore, the discrete cosine transform has emerged as the new
state-of-the art standard for image compression. In this paper, a
hybrid image compression technique based on reversible blockade
transform coding is proposed. The technique, implemented over regions of interest (ROIs), is based on selecting coefficients that belong to different transforms, depending on the degree of interest of each region. This method allows: (1) codification with multiple kernels at various degrees of interest, (2) arbitrarily shaped spectra, and (3) flexible adjustment of the compression quality of the image and the background. No modification of a standard JPEG2000 decoder was required. The method was applied to different types of images.
Results show better performance for the selected regions than when image coding methods were employed on the whole image.
We believe that this method is an excellent tool for future image
compression research, mainly on images where image coding can
be of interest, such as the medical imaging modalities and several
multimedia applications. Finally, a VLSI implementation of the proposed method is shown. It is also shown that the Hartley and cosine transform kernels give better performance than any other model.
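The coefficient-selection idea behind cosine-kernel coding can be sketched in one dimension: transform an 8-sample block with an orthonormal DCT-II, keep only the low-frequency coefficients, and reconstruct. The block values and the "keep 4 of 8" rule are illustrative assumptions, not the paper's ROI-dependent selection.

```python
# 1-D orthonormal DCT-II / inverse, plus a crude coefficient-dropping
# "compression" of one 8-sample block.
import math

N = 8

def dct(block):          # orthonormal DCT-II
    out = []
    for k in range(N):
        s = sum(block[n] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
                for n in range(N))
        scale = math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)
        out.append(scale * s)
    return out

def idct(coeffs):        # exact inverse of the orthonormal DCT-II
    out = []
    for n in range(N):
        s = 0.0
        for k in range(N):
            scale = math.sqrt(1 / N) if k == 0 else math.sqrt(2 / N)
            s += scale * coeffs[k] * math.cos(math.pi * (2 * n + 1) * k / (2 * N))
        out.append(s)
    return out

block = [52, 55, 61, 66, 70, 61, 64, 73]   # illustrative pixel row
coeffs = dct(block)
kept = coeffs[:4] + [0.0] * 4              # drop the high-frequency half
approx = idct(kept)
max_err = max(abs(a - b) for a, b in zip(block, approx))
```

Keeping all eight coefficients reconstructs the block exactly; dropping the high-frequency half still reconstructs it closely, which is why transforms with energy-compacting kernels compress well.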
Abstract: Iron in groundwater is one of the problems that render water unsuitable for drinking; concentrations above 0.3 mg/L are common in groundwater. The conventional removal method is precipitation under oxic conditions. In this study, iron removal under anaerobic conditions was examined in batch experiments. The process involved purging groundwater samples with H2S to form iron sulfide. Removal of up to 83% was achieved for a 1 mg/L iron solution. The removal efficiency dropped to 82% and 75% for the higher initial iron concentrations of 3.55 and 5.01 mg/L, respectively. The average residual sulfide concentration in the water after the process was 25 µg/L. The Eh level during the process was -272 mV. The removal process was found to follow a first-order reaction with an average rate constant of 4.52 x 10-3. The half-life for the concentrations to fall from their initial values was 157 minutes.
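The first-order kinetics reported above follow C(t) = C0·exp(-k·t), with half-life ln(2)/k. A worked sketch, assuming the rate constant is per minute (the abstract does not state its units):

```python
# First-order decay sketch for the iron-removal kinetics.
import math

k = 4.52e-3                       # average rate constant (assumed min^-1)
half_life = math.log(2) / k       # ln(2)/k, about 153 min for this k;
                                  # close to the 157 min the study reports
C0 = 1.0                          # initial iron concentration, mg/L
C_at_half_life = C0 * math.exp(-k * half_life)   # should be C0 / 2
```

The small gap between ln(2)/k for the average rate constant and the reported 157 minutes is expected when both figures are averages over separate runs.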
Abstract: Patients with diabetes are susceptible to chronic foot
wounds which may be difficult to manage and slow to heal.
Diagnosis and treatment currently rely on the subjective judgement of
experienced professionals. An objective method of tissue assessment
is required. In this paper, a data fusion approach was taken to wound
tissue classification. The supervised Maximum Likelihood and
unsupervised Multi-Modal Expectation Maximisation algorithms
were used to classify tissues within simulated wound models by
weighting the contributions of both colour and 3D depth information.
It was found that, at low weightings, depth information could show
significant improvements in classification accuracy when compared
to classification by colour alone, particularly when using the
maximum likelihood method. However, larger weightings were
found to have an entirely negative effect on accuracy.
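The weighting idea can be sketched with a much simpler classifier than the paper's Maximum Likelihood and EM algorithms: fuse colour and depth into one feature vector, scale the depth term by a weight w, and assign the nearest class mean. The tissue classes, means, and pixel values below are hypothetical.

```python
# Nearest-class-mean sketch of weighted colour + depth fusion.
import math

def fused_distance(sample, mean, w):
    colour_d = sum((s - m) ** 2 for s, m in zip(sample[:3], mean[:3]))
    depth_d = w * (sample[3] - mean[3]) ** 2     # depth contribution scaled by w
    return math.sqrt(colour_d + depth_d)

def classify(sample, class_means, w):
    return min(class_means, key=lambda c: fused_distance(sample, class_means[c], w))

# Hypothetical tissue models: (R, G, B, depth in mm).
means = {"granulation": (180, 60, 60, 2.0), "slough": (190, 180, 120, 2.5)}
pixel = (185, 70, 65, 2.1)
label = classify(pixel, means, w=0.1)
```

Varying w reproduces the abstract's experiment in miniature: small weights let depth break ties that colour alone cannot, while large weights let depth noise dominate the distance and hurt accuracy.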
Abstract: Worm propagation profiles have significantly changed
since 2003-2004: sudden world outbreaks like Blaster or Slammer
have progressively disappeared and slower but stealthier worms
appeared since, most of them for botnets dissemination. Decreased
worm virulence results in more difficult detection.
In this paper, we describe a stealth worm propagation model
which has been extensively simulated and analysed on a huge virtual
network. The main feature of this model is its ability to infect any Internet-like network in a few seconds, whatever its size, while greatly limiting the overhead of reinfection attempts on already infected hosts. The main simulation results show that the combinatorial
topology of routing may have a huge impact on the worm propagation
and thus some servers play a more essential and significant role than
others. The real-time capability to identify them may be essential to
greatly hinder worm propagation.
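The influence of topology on propagation speed can be illustrated with a toy susceptible-infected spread, not the paper's stealth model. The two five-node topologies below are hypothetical: one routed through a hub (mimicking a central server), one a plain chain.

```python
# Toy deterministic SI spread: each step, every infected node
# infects all of its neighbours.

def spread(adjacency, seed, steps):
    infected = {seed}
    for _ in range(steps):
        newly = {n for i in infected for n in adjacency[i] if n not in infected}
        infected |= newly
    return infected

hub_net = {"hub": {"a", "b", "c", "d"}, "a": {"hub"}, "b": {"hub"},
           "c": {"hub"}, "d": {"hub"}}
chain_net = {"a": {"b"}, "b": {"a", "c"}, "c": {"b", "d"},
             "d": {"c", "e"}, "e": {"d"}}
```

After two steps from node "a", the hub topology is fully infected while the chain has reached only three nodes, which is the intuition behind protecting the few servers whose position in the routing topology most accelerates a worm.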
Abstract: For a given specific problem, finding an efficient algorithm has traditionally been the object of study. However, an alternative approach orthogonal to this one exists, called reduction. In general, for a given specific problem, the reduction approach studies how to convert the original problem into subproblems. This paper proposes a formal modeling language to support this reduction approach in order to build solvers quickly. We show three examples from the wide area of learning problems. The benefit is fast prototyping of algorithms for a given new problem. Note that our formal modeling language is not intended to provide an efficient notation for data-mining applications, but to assist a designer who develops solvers in machine learning.
Abstract: The Taiwan government has promoted the "Plain Landscape Afforestation and Greening Program" since 2002. A key task of the program was the payment for environmental services (PES), entitled the "Plain Landscape Afforestation Policy" (PLAP), which was certificated by the Executive Yuan on August 31, 2001 and enacted on January 1, 2002. According to the policy, it was estimated that the total area of afforestation would be 25,100 hectares by December 31, 2007. By the end of 2007, the policy had been enacted for six years in total and the actual area of afforestation was 8,919.18 hectares. Among them, Taiwan Sugar Corporation (TSC) accounted for 7,960 hectares (with 2,450.83 hectares as public service area), which occupied 86.22% of the total afforestation area; private farmland promoted by local governments accounted for 869.18 hectares, which occupied 9.75% of the total afforestation area. Based on the above, we observe that most of the afforestation area under this policy is executed by TSC, and the achievement ratio of TSC is better than that of others. This implies that the success of the PLAP is closely related to the execution of TSC. The objective of this study is to analyze the relevant policy planning of TSC's participation in the PLAP, suggest complementary measures, and draw up effective adjustment mechanisms, so as to improve the effectiveness of executing the policy. Our main conclusions and suggestions are summarized as follows: 1. The main reason for TSC's participation in the PLAP is their passive cooperation with the central government or company policy. Prior to TSC's participation in the PLAP, their lands were mainly used for growing sugarcane. 2. The main factors in TSC's selection of tree species are the suitability of the land and the species.
The largest proportion of tree species is allocated to economic forests, and lack of technical instruction was the main problem during afforestation. Moreover, how to improve TSC's future development in leisure agriculture and the landscape business becomes a key topic. 3. TSC has developed short- and long-term plans for participating in the PLAP in the future. However, there is no great willingness or incentive to budget for such detailed planning. 4. Most of the people from TSC who were interviewed consider the requirements of the PLAP unreasonable. Among them, an unreasonable requirement on the number of trees accounted for the greatest proportion; furthermore, most interviewees suggested that the government should continue to provide incentives even after 20 years. 5. Since the government shares the same goals as TSC, there should be sufficient cooperation and communication to support technical instruction and the reduction of afforestation costs, which will also help to improve the effectiveness of the policy.
Abstract: In this study, a fuzzy similarity approach for Arabic web page classification is presented. The approach uses a fuzzy term-category relation, manipulating membership degrees for the training data and the degree values for a test web page. Six measures
are used and compared in this study. These measures include:
Einstein, Algebraic, Hamacher, MinMax, Special case fuzzy and
Bounded Difference approaches. These measures are applied and
compared using 50 different Arabic web pages. The Einstein measure gave the best performance among the measures. An analysis of these measures is presented and concluding remarks are drawn in this study.
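Most of the measures named above correspond to standard fuzzy t-norms on membership degrees a, b in [0, 1]. A sketch using the textbook formulas (an assumption about the paper's exact definitions; the Special-case fuzzy measure is paper-specific and omitted):

```python
# Standard fuzzy conjunction (t-norm) formulas on degrees in [0, 1].

def algebraic(a, b):    return a * b
def einstein(a, b):     return (a * b) / (2 - (a + b - a * b))
def hamacher(a, b):     return 0.0 if a == b == 0 else (a * b) / (a + b - a * b)
def min_max(a, b):      return min(a, b)                  # the MinMax measure
def bounded_diff(a, b): return max(0.0, a + b - 1)        # Lukasiewicz t-norm

a, b = 0.7, 0.4
values = [einstein(a, b), algebraic(a, b), hamacher(a, b), min_max(a, b)]
# Known t-norm ordering: Einstein <= algebraic <= Hamacher <= min.
assert values == sorted(values)
```

The ordering shown is a general property of these t-norms, so the choice of measure systematically tightens or loosens the term-category similarity scores being compared.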
Abstract: In this paper, a fast high-resolution range profile (HRRP) synthesis algorithm called orthogonal matching pursuit with sensing dictionary (OMP-SD) is proposed. It formulates traditional HRRP synthesis as a sparse approximation problem over a redundant dictionary. As it exploits the prior that the synthetic range profiles (SRP) of targets are sparse, SRP synthesis can be accomplished even in the presence of data loss. Besides, the computational complexity decreases from O(MNDK) flops for OMP to O(M(N + D)K) flops for OMP-SD by introducing a sensing dictionary (SD). Simulation experiments illustrate its advantages in both additive white Gaussian noise (AWGN) and noiseless situations.
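The greedy sparse-approximation idea underlying OMP can be sketched with a simplified, non-orthogonal matching pursuit on a hypothetical dictionary and signal: repeatedly pick the atom most correlated with the residual and subtract its projection. Full OMP additionally re-solves a least-squares problem over all selected atoms at each step, and OMP-SD replaces the correlation step with a precomputed sensing dictionary; neither refinement is shown here.

```python
# Simplified (non-orthogonal) matching pursuit over unit-norm atoms.
import math

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def matching_pursuit(signal, atoms, k):
    residual, chosen = list(signal), []
    for _ in range(k):
        # Pick the atom most correlated with the current residual...
        best = max(range(len(atoms)), key=lambda j: abs(dot(residual, atoms[j])))
        coeff = dot(residual, atoms[best])
        chosen.append((best, coeff))
        # ...and subtract its contribution.
        residual = [r - coeff * a for r, a in zip(residual, atoms[best])]
    return chosen, residual

atoms = [normalize(v) for v in
         ([1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1], [1, 1, 0, 0])]
signal = [3.0, 0.0, 0.0, 2.0]      # truly 2-sparse in atoms 0 and 3
chosen, residual = matching_pursuit(signal, atoms, 2)
```

With a genuinely sparse signal the greedy loop recovers the two active atoms in two iterations and drives the residual to zero, which is the mechanism that lets the paper's variant reconstruct range profiles from incomplete data.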
Abstract: The objective of this research was to investigate the biodegradation of water hyacinth (Eichhornia crassipes) to produce bioethanol, using dilute-acid pretreatment (1% sulfuric acid), which results in high hemicellulose decomposition, and using the yeast Pachysolen tannophilus as the bioethanol-producing strain. A maximum ethanol yield of 1.14 g/L (yield coefficient 0.24 g g-1; productivity 0.015 g l-1 h-1) was obtained, compared with the predicted value of 32.05 g/L obtained by Central Composite Design (CCD). The maximum ethanol yield coefficient was comparable to those obtained through enzymatic saccharification and fermentation of acid hydrolysate using a fully equipped fermentor. Although the maximum ethanol concentration was low at lab scale, improvement of the lignocellulosic ethanol yield is necessary for large-scale production.