Abstract: Medical imaging modalities such as computed
tomography (CT), magnetic resonance imaging (MRI), ultrasound
(US), and X-ray are used to diagnose disease. These modalities
provide flexible means of reviewing anatomical cross-sections and
physiological state in different parts of the human body. Raw
medical images have very large file sizes and therefore large storage
requirements, so their size must be reduced before they are practical
for telemedicine applications. Image compression is thus a key factor
in reducing the bit rate for transmission or storage while maintaining
acceptable reproduction quality, but it naturally raises the question
of how much an image can be compressed while still preserving
sufficient information for a given clinical application. Many
techniques for achieving data compression have
been introduced. In this study, three different MRI modalities which
are Brain, Spine and Knee have been compressed and reconstructed
using wavelet transform. Subjective and objective evaluations have
been performed to investigate the clinical information quality of the
compressed images. For the objective evaluation, the results show
that the PSNR, which indicates the quality of the reconstructed image,
ranges from 21.95 dB to 30.80 dB, 27.25 dB to 35.75 dB, and
26.93 dB to 34.93 dB for Brain, Spine, and Knee respectively. For
the subjective evaluation test, the results show that a compression
ratio of 40:1 was acceptable for the brain images, whereas 50:1 was
acceptable for the spine and knee images.
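The PSNR values quoted above follow the standard definition, PSNR = 10·log10(MAX²/MSE); a minimal sketch (the flattened pixel lists are toy values, not the paper's MRI data):

```python
import math

def psnr(original, reconstructed, max_val=255.0):
    """Peak signal-to-noise ratio in dB; higher means closer to the original."""
    n = len(original)
    mse = sum((a - b) ** 2 for a, b in zip(original, reconstructed)) / n
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * math.log10(max_val ** 2 / mse)

# Flattened toy "images" (pixel lists); real MRI slices would be 2-D arrays.
orig = [10, 50, 200, 255]
recon = [12, 48, 199, 250]
print(round(psnr(orig, recon), 2))
```

A reconstruction identical to the original yields infinite PSNR; the worst case (all-zero vs. all-255) yields 0 dB.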
Abstract: Cryptography provides the secure manner of
information transmission over the insecure channel. It authenticates
messages based on the key rather than on the user. Lengthy keys
are required to encrypt and decrypt the messages being sent and
received, but such keys can be guessed or cracked. Moreover,
maintaining and sharing lengthy, random keys for enciphering and
deciphering is a critical problem in cryptographic systems. A new
approach is described for generating a crypto key acquired from a
person's iris pattern. In the biometric field, a template created by
a biometric algorithm can only be authenticated by the same person.
Among biometric templates, iris features can efficiently distinguish
individuals and produce fewer false positives in a large population.
The distribution of iris codes shows low intra-class variability, which
helps the cryptosystem confidently decrypt messages given an exact
match of the iris pattern. In the proposed approach, the iris features
are extracted using multi-resolution wavelets, producing a 135-bit
iris code for each subject that is used for encrypting and decrypting
the messages. Autocorrelators are used to recall the original messages
from the partially corrupted data produced by the decryption process.
The approach also aims to resolve repudiation and key-management
problems.
Results were analyzed for both the conventional iris cryptography
system (CIC) and the non-repudiation iris cryptography system
(NRIC), showing that the new approach provides considerably high
authentication in the enciphering and deciphering processes.
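As a rough illustration of keying a cipher with an iris code, the following hypothetical sketch XORs a message with key material derived from a 135-bit code; the XOR stream and the hash-based key expansion are stand-ins of our own, not the paper's actual scheme:

```python
import hashlib
from itertools import cycle

def iris_code_to_key(iris_bits: str) -> bytes:
    """Expand a binary iris-code string into key bytes (hypothetical step:
    the paper uses the 135-bit code directly; hashing is our stand-in)."""
    return hashlib.sha256(iris_bits.encode()).digest()

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR stream: applying it twice with the same key restores the data."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

iris_code = "101100111010" * 11 + "110"  # 135 bits (toy value)
key = iris_code_to_key(iris_code)
ciphertext = xor_cipher(b"confidential message", key)
plaintext = xor_cipher(ciphertext, key)
print(plaintext)
```

Decryption succeeds only when the same iris code (and hence the same key) is reproduced, which is the property the low intra-class variability of iris codes supports.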
Abstract: Based on 276 responses from academic staff in an
evaluation of an online learning environment (OLE), this paper
identifies those elements of the OLE that were most used and valued
by staff, those elements of the OLE that staff most wanted to see
improved, and those factors that most contributed to staff perceptions
that the use of the OLE enhanced their teaching. The most used and
valued elements were core functions, including accessing unit
information, accessing lecture/tutorial/lab notes, and reading online
discussions. The elements identified as most needing attention related
to online assessment: submitting assignments, managing assessment
items, and receiving feedback on assignments. Staff felt that using the
OLE enhanced their teaching when they were satisfied that their
students were able to access and use their learning materials, and
when they were satisfied with the professional development they
received and were confident with their ability to teach with the OLE.
Abstract: The vast amount of information on the World Wide
Web is created and published by many different types of providers.
Unlike books and journals, most of this information is not subject to
editing or peer review by experts. This lack of quality control and the
explosion of web sites make the task of finding quality information
on the web especially critical. Meanwhile, new facilities for
producing web pages, such as blogs, make this issue more significant,
because blogs have simple content management tools that enable
non-experts to build easily updatable web diaries or online journals.
On the other hand, despite a decade of active research in information
quality (IQ), there is as yet no framework for measuring the
information quality of blogs. This paper presents a novel experimental
framework for ranking the quality of information on weblogs. Data
analysis revealed seven IQ dimensions for weblogs. For each
dimension, variables and related coefficients were calculated, so that
the presented framework is able to assess the IQ of weblogs
automatically.
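The automatic assessment step can be pictured as a coefficient-weighted aggregation of per-dimension scores; the dimension names and coefficients below are invented placeholders, not the seven dimensions the paper derives:

```python
def weblog_iq_score(dimension_scores, coefficients):
    """Aggregate per-dimension quality scores into one IQ value
    as a coefficient-weighted sum (weights normalized to sum to 1)."""
    total_weight = sum(coefficients.values())
    return sum(dimension_scores[d] * coefficients[d] for d in coefficients) / total_weight

# Hypothetical dimensions and coefficients; the paper derives its
# dimensions empirically, with coefficients from its data analysis.
coeffs = {"accuracy": 0.25, "timeliness": 0.20, "completeness": 0.15,
          "objectivity": 0.15, "relevance": 0.10, "readability": 0.10,
          "authority": 0.05}
scores = {d: 0.8 for d in coeffs}  # each dimension rated on [0, 1]
print(weblog_iq_score(scores, coeffs))
```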
Abstract: Friction-stir welding has received huge interest in the last few years. The many advantages of this promising process have led researchers to present different theoretical and experimental explanations of the process. However, a way to quantitatively and qualitatively control the different parameters of the friction-stir welding process has not yet been established. In this study, a refined energy-based model that estimates the energy generated due to friction and plastic deformation is presented. The effect of plastic deformation at low energy levels is significant, and hence a scale factor is introduced to control its effect. The heat energy and maximum temperature predicted by the model are compared with theoretical and experimental results available in the literature, and good agreement is obtained. The model is applied to the AA6000 and AA7000 series.
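The energy balance described above can be written schematically as (the notation is our own illustration, not the paper's):

```latex
Q_{\text{total}} = Q_{\text{friction}} + \beta \, Q_{\text{plastic}}
```

where $\beta$ is the scale factor introduced to moderate the plastic-deformation contribution at low energy levels.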
Abstract: On July 1, 2007, the Taiwan Stock Exchange (TWSE)
added a new "Financial reference database" to its Market
Observation Post System (MOPS) for investors to consult when
making investment decisions. The database serves as a warning
system based on the publicly disclosed financial information of
listed public-offering companies and originally comprised eight
indicators. In this paper, the indicators provided by this database are
applied to a financial-crisis early-warning model to verify whether
they forecast corporate financial crises with a high accuracy rate,
in line with the positive results reported by domestic and foreign
scholars. A logistic regression model is used as the early-warning
model: the first model excludes the backward-selection conditions,
and the second includes them. Companies that experienced a
financial crisis were taken as research samples, and sample data from
the T-1 and T-2 periods before the crisis occurred were used for the
empirical analysis. The results show that, within this database, the
debt ratio and net worth per share are the best forecasting variables.
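A logistic early-warning model of the kind described scores a company as follows; the coefficients and the two variables chosen here are illustrative placeholders, not the paper's estimates:

```python
import math

def crisis_probability(debt_ratio, net_worth_per_share, b0=-2.0, b1=4.0, b2=-0.3):
    """Logistic early-warning score: P(crisis) = 1 / (1 + e^-(b0 + b1*x1 + b2*x2)).
    Coefficients here are made-up placeholders; the paper estimates them
    from T-1 and T-2 pre-crisis samples."""
    z = b0 + b1 * debt_ratio + b2 * net_worth_per_share
    return 1.0 / (1.0 + math.exp(-z))

# A highly leveraged firm with low net worth per share scores higher risk.
print(crisis_probability(0.9, 1.0) > crisis_probability(0.3, 8.0))  # → True
```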
Abstract: From the viewpoint of geological origin, the formation
of the shallow gas reservoir of Hangzhou Bay, northern Zhejiang
Province, eastern China, and the original occurrence characteristics
of the gassy sand are analyzed. Generally, gassy sand in large-scale
gas reservoirs is in a state of residual moisture content, and the
initial matric suction of the sand ranges from approximately 0 kPa
to 100 kPa. Results of GDS triaxial tests show that the classical
shear strength formulas for unsaturated soil cannot effectively describe the basic
strength characteristics of gassy sand; the relationship between
apparent cohesion and matric suction of gassy sand agrees well with
the power function, which can reasonably be used to describe the
strength of gassy sand. In the stress path of gas release, shear strength
of gassy sand increases, and experimental results show that the formula
proposed in this paper can effectively predict the strength increment.
When saturated strength indexes of the sand are used in engineering
design, moderate reduction should be considered.
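The power-function relationship between apparent cohesion and matric suction can be fitted by linear regression in log-log space; a sketch on synthetic data (the paper fits its own GDS triaxial measurements, not these values):

```python
import math

def fit_power(suctions, cohesions):
    """Fit c = a * psi**b by linear regression in log-log space."""
    xs = [math.log(s) for s in suctions]
    ys = [math.log(c) for c in cohesions]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic data generated from c = 2 * psi**0.5 (kPa), so the fit
# should recover a ≈ 2 and b ≈ 0.5.
suctions = [10.0, 25.0, 50.0, 100.0]
cohesions = [2.0 * s ** 0.5 for s in suctions]
a, b = fit_power(suctions, cohesions)
print(round(a, 3), round(b, 3))
```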
Abstract: LSP routing is among the prominent issues in MPLS
network traffic engineering. The objective of this routing is to
increase the number of accepted requests while guaranteeing the
quality of service (QoS). Requested bandwidth is the most important
QoS criterion considered in the literature, and various heuristic
algorithms have been presented in that regard. Many of these
algorithms keep flows away from network bottlenecks in order to
perform load balancing, which impedes optimal operation of the
network. Here, a new routing algorithm called MIRAD is proposed:
using only limited information about the network topology and the
links' residual bandwidth, and no knowledge of prospective requests,
it provides every request with maximum bandwidth as well as
minimum end-to-end delay via uniform load distribution across the
network. Simulation results of the proposed
algorithm show a better efficiency in comparison with similar
algorithms.
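One building block of bandwidth-guaranteed routing is finding the path with the maximum bottleneck (residual) bandwidth; a sketch of that single ingredient follows, as a Dijkstra variant (illustrative only: MIRAD itself additionally balances load and minimizes end-to-end delay):

```python
import heapq

def widest_path(graph, src, dst):
    """Widest (maximum-bottleneck-bandwidth) path via a Dijkstra variant.
    graph: {node: {neighbor: residual_bandwidth}}."""
    best = {src: float("inf")}
    heap = [(-float("inf"), src, [src])]
    while heap:
        neg_bw, node, path = heapq.heappop(heap)
        if node == dst:
            return -neg_bw, path
        for nbr, bw in graph.get(node, {}).items():
            bottleneck = min(-neg_bw, bw)
            if bottleneck > best.get(nbr, 0):
                best[nbr] = bottleneck
                heapq.heappush(heap, (-bottleneck, nbr, path + [nbr]))
    return 0, []

# Toy topology: A-B-D supports 6 units end-to-end, A-C-D only 4.
g = {"A": {"B": 10, "C": 4}, "B": {"D": 6}, "C": {"D": 9}, "D": {}}
print(widest_path(g, "A", "D"))  # → (6, ['A', 'B', 'D'])
```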
Abstract: For high-speed networks, providing resilience against failures is an essential requirement. A main challenge in designing next-generation optical networks is protecting and restoring high-capacity WDM networks from failures. Quick detection, identification, and restoration make networks more robust and reliable, even though failures cannot be avoided. Hence, it is necessary to develop fast, efficient, and dependable fault localization or detection mechanisms. In this paper we propose a new fault localization algorithm for WDM networks that can identify the location of a failure on a failed lightpath. Our algorithm detects the failed connection and then attempts to reroute the data stream through an alternate path. In addition, we develop an algorithm to analyze the information in the alarms generated by the components of an optical network in the presence of a fault. It uses alarm correlation to reduce the list of suspected components shown to network operators. Our simulation results show that the proposed algorithms achieve lower blocking probability and delay while obtaining higher throughput.
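The alarm-correlation step can be sketched as set intersection over the components of alarmed lightpaths; a simplified single-fault illustration (the component and path names are invented):

```python
def suspected_components(lightpaths, alarmed):
    """Correlate alarms: under a single-fault assumption, a component is
    suspect if it lies on every alarmed lightpath and on no alarm-free
    lightpath -- a simplified view of the alarm-correlation step.
    lightpaths: {path_id: set_of_component_ids}."""
    suspects = set.intersection(*(lightpaths[p] for p in alarmed))
    for p, comps in lightpaths.items():
        if p not in alarmed:
            suspects -= comps
    return suspects

# Nodes n* and links l*; lp1 and lp2 raise alarms, lp3 is healthy,
# so the shared healthy node n2 is exonerated.
paths = {"lp1": {"n1", "l12", "n2", "l23", "n3"},
         "lp2": {"n1", "l12", "n2", "l24", "n4"},
         "lp3": {"n5", "l52", "n2", "l23", "n3"}}
print(suspected_components(paths, {"lp1", "lp2"}))
```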
Abstract: Today, the advantage of biotechnology over other technologies, especially in environmental issues, is undeniable. Kimia Gharb Gostar Industries Company, the largest producer of citric acid in the Middle East, applies biotechnology to this end. Citrogypsum is a by-product of citric acid production and is considered a significant residue of this company. This paper summarizes citric acid production and the conditions of citrogypsum production at the company, along with a definition of citrogypsum and its applications worldwide. Based on this information and an evaluation of present conditions regarding Iran's need for citrogypsum, the best priority was identified, with emphasis on strategy selection and proper planning for self-sufficiency. The Delphi technique was used to elicit expert opinions about criteria for evaluating the usages. The criteria identified by the experts were profitability, production capacity, the degree of investment, marketability, ease of production, and production time. The Analytical Hierarchy Process (AHP) and Expert Choice software were used to compare the alternatives on the criteria derived from the Delphi process.
Abstract: With the rapid growth and development of information and communication technology, the Internet has played a definite and irreplaceable role in people's social lives in Taiwan, as in other countries. In July 2008, on a general social website, an unexpected phenomenon was noticed: more than one hundred users started forming clubs voluntarily and holding face-to-face gatherings for specific purposes. This study examines whether teenagers' social contact on the Internet is embedded in their life context, and tries to reveal the social preferences, values, and needs that merge with and influence teenagers' social activities. The study therefore employs multiple user-experience research methods, including practical observation and qualitative analysis through contextual inquiries and in-depth interviews. Based on the findings, several design implications for software related to social interaction and cultural inheritance are offered. It is concluded that the inherent values of social behaviors may be a key issue in developing computer-mediated communication or interaction designs in the future.
Abstract: Various formal and informal brand alliances are being formed in professional service firms. A professional service corporate brand is heavily dependent on the brands of the professional employees who comprise the firm, and professional employee brands are in turn dependent on the corporate brand. Prior work provides limited scientific evidence of brand alliance effects in the professional service area, i.e., how professional service corporate-employee brand allies are affected by an alliance, what the brand attitude effects are after alliance formation, and how these effects vary with different strengths of an ally. Scientific literature analysis and theoretical modeling are the main methods of the current study. As a result, a theoretical model is constructed for estimating the spillover effects of professional service corporate-employee brand alliances and for comparison among different professional service firm expertise practice models, from the "brains" model to the "procedure" model. The resulting theoretical model lays the basis for future experimental studies.
Abstract: Measuring the complexity of software has been an
insoluble problem in software engineering. Complexity measures can
be used to predict critical information about testability, reliability,
and maintainability of software systems from automatic analysis of
the source code. During the past few years, many complexity
measures have been invented based on the emerging Cognitive
Informatics discipline. These software complexity measures,
including cognitive functional size, are based on the total cognitive
weights of basic control structures such as loops and branches. This
paper shows that the existing calculation method can generate
results that differ in form but are algebraically equivalent.
Furthermore, analysis of the combinatorial meaning of this
calculation method reveals a significant flaw of the measure, which also
explains why it does not satisfy Weyuker's properties. Based on the
findings, improvement directions, such as measures fusion, and
cumulative variable counting scheme are suggested to enhance the
effectiveness of cognitive complexity measures.
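The weight-based calculation analyzed above can be sketched as follows: weights of nested control structures multiply, while weights of sequential ones add. The weight values are the commonly cited ones from the Cognitive Informatics literature and should be treated as illustrative:

```python
# Cognitive weights of basic control structures (commonly cited
# values; treat them as illustrative here).
WEIGHTS = {"sequence": 1, "branch": 2, "loop": 3, "call": 2}

def cognitive_weight(node):
    """Total cognitive weight of a structure tree: weights of nested
    structures multiply, weights of adjacent (sequential) ones add.
    node: (kind, list_of_children)."""
    kind, children = node
    w = WEIGHTS[kind]
    if not children:
        return w
    return w * sum(cognitive_weight(c) for c in children)

# A loop containing a branch, followed by a plain sequence:
# 1 * (3 * 2 + 1) = 7
program = ("sequence", [("loop", [("branch", [])]), ("sequence", [])])
print(cognitive_weight(program))  # → 7
```

The paper's criticism targets exactly this sum/product scheme: structurally different decompositions of the same program can yield algebraically related but distinct totals.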
Abstract: Within the realm of e-government, development has moved towards testing new means for democratic decision-making, such as e-panels, electronic discussion forums, and polls. Although such new developments seem promising, they are not problem-free, and their outcomes are seldom used in the subsequent formal political procedures. Nevertheless, process models offer promising potential when it comes to structuring and supporting the transparency of decision processes, in order to facilitate the integration of the public into decision-making procedures in a reasonable and manageable way. Based on real-life cases of urban planning processes in Sweden, we present an outline for an integrated framework for public decision making that aims to: a) provide tools for citizens to organize discussion and form opinions; b) enable governments, authorities, and institutions to better analyse these opinions; and c) enable governments to account for this information in planning and societal decision making by employing a process model for structured public decision making.
Abstract: The classification of the protein structure is commonly
not performed for the whole protein but for structural domains, i.e.,
compact functional units preserved during evolution. Hence, a first
step to a protein structure classification is the separation of the
protein into its domains. We approach the problem of protein domain
identification by proposing a novel graph theoretical algorithm. We
represent the protein structure as an undirected, unweighted, and
unlabeled graph whose nodes correspond to the secondary structure
elements of the protein. This graph is called the protein graph. The
domains are then identified as partitions of the graph, corresponding
to vertex sets obtained by maximizing an objective function
that mutually maximizes the cycle distributions found in the
partitions of the graph. Our algorithm does not utilize any
information besides the cycle distribution to find the partitions. If
a partition is found, the algorithm is iteratively applied to each of
the resulting subgraphs. As a stopping criterion, we numerically
calculate a significance level that indicates the stability of the
predicted partition against a random rewiring of the protein graph;
hence, our algorithm terminates its iterative application automatically.
We present results for one- and two-domain proteins, compare them
with the domains manually assigned in the SCOP database, and
discuss the differences.
Abstract: Volume rendering is widely used in medical CT image
visualization. Applying 3D image visualization to diagnostic
applications can require accurate volume rendering at high
resolution. Interpolation is important in medical image processing
applications such as image compression and volume resampling.
However, it can distort the original image data through edge
blurring or blocking effects when image enhancement procedures
are applied. In this paper, we propose an adaptive tension control
method that exploits gradient information to achieve high-resolution
medical image enhancement in volume visualization, so that restored
images remain as close as possible to the original images. The
experimental results show that the proposed method improves
image quality through the adaptive tension control.
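Tension-controlled interpolation can be illustrated with a 1-D cardinal spline, where raising the tension flattens the tangents and suppresses overshoot near edges; this generic sketch stands in for the paper's gradient-driven control and is not its actual method:

```python
def cardinal_interp(p0, p1, p2, p3, u, tension):
    """Hermite/cardinal interpolation between p1 and p2 at u in [0, 1].
    tension in [0, 1]: 0 gives a Catmull-Rom spline (smooth), 1 flattens
    the tangents to zero, curbing overshoot near sharp edges -- a generic
    stand-in for gradient-driven tension control."""
    m1 = (1 - tension) * (p2 - p0) / 2.0
    m2 = (1 - tension) * (p3 - p1) / 2.0
    u2, u3 = u * u, u * u * u
    return ((2 * u3 - 3 * u2 + 1) * p1 + (u3 - 2 * u2 + u) * m1
            + (-2 * u3 + 3 * u2) * p2 + (u3 - u2) * m2)

# Near an edge (large gradient) raise the tension to curb overshoot.
samples = (0.0, 0.0, 100.0, 100.0)       # a sharp intensity step
smooth = cardinal_interp(*samples, 0.25, tension=0.0)
sharp = cardinal_interp(*samples, 0.25, tension=1.0)
print(smooth, sharp)  # → 20.3125 15.625
```

An adaptive scheme would pick the tension per sample from the local gradient magnitude rather than using one fixed value.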
Abstract: Weblogs are a social resource from which the various types of information written by bloggers can be discovered and tracked. In this paper, we propose a weblog mining technique for identifying influenza trends from posts in which bloggers share their views on the disease. To identify the trends, a web crawler performs a search and generates a list of visited links based on a set of influenza keywords. This information is used to implement an analytics report system for monitoring and analyzing the pattern and trends of influenza (H1N1). Statistical and graphical analysis reports are generated, and both types of report satisfactorily reflect the awareness of Malaysians of the influenza outbreak as expressed through blogs.
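The keyword-based trend extraction can be sketched as counting keyword hits per time bucket; the keyword list and posts below are toy values, not the paper's data:

```python
import re
from collections import Counter
from datetime import date

INFLUENZA_KEYWORDS = {"influenza", "h1n1", "flu", "fever", "outbreak"}  # toy list

def keyword_hits(post_text):
    """Count influenza-related keyword occurrences in one blog post."""
    words = re.findall(r"[a-z0-9]+", post_text.lower())
    return sum(1 for w in words if w in INFLUENZA_KEYWORDS)

def weekly_trend(posts):
    """posts: iterable of (date, text); returns hits per ISO week --
    the kind of series an analytics report would chart."""
    trend = Counter()
    for day, text in posts:
        trend[day.isocalendar()[1]] += keyword_hits(text)
    return dict(trend)

posts = [(date(2009, 6, 1), "H1N1 outbreak reported, flu cases rising"),
         (date(2009, 6, 2), "school closed due to influenza"),
         (date(2009, 6, 9), "weather is fine today")]
print(weekly_trend(posts))
```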
Abstract: One of the basic concepts in marketing is the concept
of meeting customers' needs. Since customer satisfaction is essential
for the lasting survival and development of a business, screening and
observing customer satisfaction and recognizing its underlying
factors must be one of the key activities of every business.
The purpose of this study is to recognize the drivers that affect
customer satisfaction in a business-to-business setting in order to
improve marketing activities. We conducted a survey in which 93
business customers of a diesel generator manufacturer in Iran
participated and shared their views on, and satisfaction with, the
supplier's product-related services. We first developed the measures
for the drivers of satisfaction through investigative research (by means
of feedback from executives and customers of the sponsoring firm).
Based on these measures, we then created a mail survey and asked the
respondents to give their opinions about the sponsoring firm, a
supplier of diesel generators and similar products. The survey also
asked the participants to state their functional areas and their
company characteristics.
In conclusion, we found that there are three drivers of customer
satisfaction: reliability, information about the product, and
commercial features. Buyers and users from different functional areas
attribute different degrees of importance to the last two drivers. For
instance, people from the buying and management areas believe that
commercial features are more important than information about the
product, whereas people in the engineering, maintenance, and
production areas believe the opposite. Marketing experts should
consider customers' attitudes regarding product information and
commercial features to improve market share.
Abstract: Recommender Systems act as personalized decision
guides, aiding users in decisions on matters related to personal taste.
Most previous research on Recommender Systems has focused on the
statistical accuracy of the algorithms driving the systems, with no
emphasis on the trustworthiness of the users. A Recommender
System depends on information provided by different users to gather
its knowledge, and we believe that if a large group of users provides
wrong information, it will not be possible for the system to arrive
at an accurate conclusion. The system described in this paper
introduces the concept of testing the knowledge of a user in order to
filter out such "bad users". The paper focuses on the mechanism used
to provide robust and effective recommendations.
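The knowledge-testing idea can be sketched as scoring users against probe items with known reference ratings and keeping only those within an error budget; the threshold, data, and aggregation below are invented, not the paper's mechanism:

```python
def trustworthy_users(ratings, probe_items, truth, max_error=1.0):
    """Filter out "bad users": keep only users whose ratings on probe
    items (with known reference ratings) stay within max_error on
    average -- a simplified knowledge test.
    ratings: {user: {item: rating}}."""
    kept = {}
    for user, r in ratings.items():
        errors = [abs(r[i] - truth[i]) for i in probe_items if i in r]
        if errors and sum(errors) / len(errors) <= max_error:
            kept[user] = r
    return kept

def predict(kept, item):
    """Recommend from trusted users only: mean rating for the item."""
    vals = [r[item] for r in kept.values() if item in r]
    return sum(vals) / len(vals)

ratings = {"alice": {"m1": 5, "m2": 4, "m3": 5},
           "bob": {"m1": 5, "m2": 5, "m3": 4},
           "spam": {"m1": 1, "m2": 1, "m3": 1}}
truth = {"m1": 5, "m2": 4}  # reference ratings for the probe items
kept = trustworthy_users(ratings, ["m1", "m2"], truth)
print(sorted(kept), predict(kept, "m3"))  # → ['alice', 'bob'] 4.5
```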
Abstract: We present a non-standard Euclidean vehicle
routing problem that adds a level of clustering, and we revisit the use
of self-organizing maps as a tool which naturally handles such
problems. We show how they can be used as a main operator
within an evolutionary algorithm to address two conflicting
minimization objectives, route length and customer-to-bus-stop
distance, and to deal with capacity constraints. We apply the
approach to a real-life case of combined clustering and vehicle
routing for the transportation of the 780 employees of an
enterprise. Based on a geographic information system, we
discuss the influence of road infrastructure on the generated
solutions.
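The self-organizing-map operator can be sketched as a ring of neurons elastically fitted to customer locations; this minimal version omits the capacity and bus-stop objectives, and all parameter values are illustrative:

```python
import math, random

def som_ring(cities, n_neurons, iters=2000, seed=1):
    """Self-organizing map on a ring of neurons: repeatedly sample a city,
    move the closest neuron (and, with decaying strength, its ring
    neighbours) toward it. The settled ring traces a short tour -- the
    kind of SOM operator embedded in an evolutionary algorithm
    (capacity and bus-stop objectives omitted in this sketch)."""
    rng = random.Random(seed)
    neurons = [[rng.random(), rng.random()] for _ in range(n_neurons)]
    for t in range(iters):
        cx, cy = cities[rng.randrange(len(cities))]
        # winner = neuron nearest to the sampled city
        win = min(range(n_neurons),
                  key=lambda i: (neurons[i][0] - cx) ** 2 + (neurons[i][1] - cy) ** 2)
        lr = 0.8 * (1 - t / iters)                        # decaying learning rate
        radius = max(1.0, n_neurons / 10 * (1 - t / iters))
        for i, (x, y) in enumerate(neurons):
            d = min(abs(i - win), n_neurons - abs(i - win))  # ring distance
            h = math.exp(-(d * d) / (2 * radius * radius))   # neighbourhood
            neurons[i] = [x + lr * h * (cx - x), y + lr * h * (cy - y)]
    return neurons

# Four customer locations at the corners of the unit square.
cities = [(0.1, 0.1), (0.9, 0.1), (0.9, 0.9), (0.1, 0.9)]
ring = som_ring(cities, n_neurons=16)
```

After training, reading the cities off in the order of their nearest neurons along the ring yields the tour.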