Abstract: This paper outlines the development of an experimental technique for quantifying supersonic jet flows, in an attempt to avoid the seeding-particle problems frequently associated with particle-image velocimetry (PIV) techniques at high Mach numbers.
Based on optical flow algorithms, the idea behind the technique involves using high-speed cameras to capture Schlieren images of the supersonic jet shear layers, before they are subjected to an adapted optical flow algorithm based on the Horn-Schunck method to determine the associated flow fields. The proposed method is capable of offering full-field unsteady flow information with potentially higher accuracy and resolution than existing point-measurement or PIV techniques. Preliminary study via numerical simulations of a
circular de Laval jet nozzle successfully reveals flow and shock
structures typically associated with supersonic jet flows, which serve
as useful data for subsequent validation of the optical flow based
experimental results. For the experimental technique, a Z-type Schlieren setup is proposed, with the supersonic jet operated in cold mode at a stagnation pressure of 4 bar and an exit Mach number of 1.5. High-speed single-frame or double-frame cameras are used to capture successive Schlieren images. As applications of optical flow techniques to supersonic flows remain rare, the current focus revolves around methodology validation through synthetic images. The results of the validation tests offer valuable insight into how the optical flow algorithm can be further refined for robustness and accuracy. Despite these challenges, this supersonic flow measurement technique may offer a simpler way to identify and quantify the fine spatial structures within the shock shear layer.
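As a rough illustration of the algorithm family named above, a minimal Horn-Schunck iteration over two grayscale frames might look like the following. This is a generic textbook sketch with assumed parameter values, not the authors' adapted implementation:

```python
import numpy as np

def horn_schunck(img1, img2, alpha=1.0, n_iter=100):
    """Minimal Horn-Schunck optical flow between two grayscale frames.

    alpha is the smoothness weight; n_iter the number of Jacobi updates.
    Returns the horizontal (u) and vertical (v) flow components.
    """
    img1 = img1.astype(np.float64)
    img2 = img2.astype(np.float64)
    # Spatial and temporal brightness derivatives (simple finite differences).
    Ix = np.gradient(img1, axis=1)
    Iy = np.gradient(img1, axis=0)
    It = img2 - img1
    u = np.zeros_like(img1)
    v = np.zeros_like(img1)
    # Local 4-neighbour average used by the Horn-Schunck update.
    def mean4(f):
        return (np.roll(f, 1, 0) + np.roll(f, -1, 0) +
                np.roll(f, 1, 1) + np.roll(f, -1, 1)) / 4.0
    for _ in range(n_iter):
        u_bar, v_bar = mean4(u), mean4(v)
        num = Ix * u_bar + Iy * v_bar + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = u_bar - Ix * num / den
        v = v_bar - Iy * num / den
    return u, v
```

In practice the adapted algorithm in the paper would add refinements (e.g. multi-scale processing) suited to the steep density gradients of Schlieren images.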
Abstract: The present study aims to explore the effect of
computerization on marketing performance in Snowa Company. In
other words, this study addresses the question of whether there is any relationship between the utilization of computerization in marketing activities and marketing performance.
The statistical population included 60 marketing managers of Snowa
Company. In order to test the research hypotheses, Pearson
correlation coefficient was employed. The reliability was equal to
96.8%. In this study, computerization was the independent variable
and marketing performance was the dependent variable with
characteristics of market share, improving the competitive position,
and sales volume. The results of testing the hypotheses revealed that
there is a significant relationship between utilization of
computerization and market share, sales volume and improving the
competitive position.
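For reference, the Pearson correlation coefficient used to test the hypotheses above can be computed as follows. The variable names are illustrative, not the study's actual data:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient between two samples."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    # Covariance numerator and the two standard-deviation terms.
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

A value near +1 or -1 indicates a strong linear relationship; near 0, none.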
Abstract: E-government has been adopted and used by many governments/countries around the world, including Ghana, to provide citizens and businesses with more accurate, real-time, and high-quality services and information. The objective of this paper is to present an overview of the Government of Ghana’s (GoG) adoption and implementation of e-government and its usage by the Ministries, Departments and Agencies (MDAs) as well as other public sector institutions to deliver efficient public services to the general public, i.e., citizens and businesses. The Government's implementation of e-government focused on facilitating effective delivery of government services to the public and, ultimately, on providing efficient government-wide electronic means of sharing information and knowledge through a network infrastructure developed to connect all major towns and cities, Ministries, Departments and Agencies, and other public sector organizations in Ghana. One aim of the Government of Ghana's use of ICT in public administration is to improve productivity in government administration and services by facilitating the exchange of information to enable better interaction and coordination of work among MDAs, citizens and private businesses. The study was prepared using secondary sources of data from government policy documents, national and international published reports, journal articles, and web sources. This study indicates that, through the e-government initiative, citizens and businesses can currently access and pay for services such as renewal of driving licenses, business registration, payment of taxes, acquisition of marriage and birth certificates, and application for passports through the GoG electronic service (eservice) and electronic payment (epay) portals. Further, this study shows that there is enormous commitment from the GoG to adopt and implement e-government as a tool not only to transform the business of government but also to bring efficiency to the public services delivered by the MDAs.
To ascertain this, a further study needs to be carried out to determine whether the use of e-government has brought about the anticipated improvements and efficiency in the service delivery of MDAs and other state institutions in Ghana.
Abstract: In order to retrieve images efficiently from a large
database, a unique method integrating color and texture features
using genetic programming has been proposed. An opponent color histogram, which provides shadow, shade, and light intensity invariance, is employed in the proposed framework for extracting color features. For texture feature extraction, the fast discrete curvelet transform, which captures more orientation information at different scales, is incorporated to represent curve-like edges. A key current issue in image retrieval is reducing the semantic gap between the user's preference and low-level features. To address this concern, a genetic algorithm combined with relevance feedback is embedded to reduce the semantic gap and retrieve images matching the user's preference. Extensive and comparative experiments have been conducted
to evaluate proposed framework for content based image retrieval on
two databases, i.e., COIL-100 and Corel-1000. Experimental results
clearly show that the proposed system surpassed other existing
systems in terms of precision and recall. The proposed work achieves its highest performance with an average precision of 88.2% on COIL-100 and 76.3% on Corel, and an average recall of 69.9% on COIL and 76.3% on Corel. Thus, the experimental results confirm that the proposed content-based image retrieval system architecture attains a better solution for image retrieval.
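The opponent color representation mentioned above is commonly defined by the channels O1 = (R-G)/sqrt(2), O2 = (R+G-2B)/sqrt(6) and O3 = (R+G+B)/sqrt(3); the exact variant used in the paper is an assumption here. A minimal sketch of such a histogram feature:

```python
import numpy as np

def opponent_histogram(rgb, bins=16):
    """Concatenated, normalized histograms of the three opponent color
    channels of an H x W x 3 uint8 image (a common CBIR color feature)."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    o1 = (r - g) / np.sqrt(2)            # red-green opponency
    o2 = (r + g - 2 * b) / np.sqrt(6)    # yellow-blue opponency
    o3 = (r + g + b) / np.sqrt(3)        # intensity
    h1, _ = np.histogram(o1, bins=bins)
    h2, _ = np.histogram(o2, bins=bins)
    h3, _ = np.histogram(o3, bins=bins)
    feat = np.concatenate([h1, h2, h3]).astype(float)
    return feat / feat.sum()             # L1-normalize the feature vector
```

The O1 and O2 channels cancel additive intensity changes, which is the source of the invariance properties cited in the abstract.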
Abstract: The introduction of a multitude of new and interactive
e-commerce information technology (IT) artifacts has impacted
adoption research. Rather than solely functioning as productivity
tools, new IT artifacts assume the roles of interaction mediators and
social actors. This paper describes the varying roles assumed by IT
artifacts, and proposes and distinguishes between four distinct foci of
how the artifacts are evaluated. It further proposes a theoretical
model that maps the different views of IT artifacts to four distinct
types of evaluations.
Abstract: Scripts are one of the basic text resources to understand
broadcasting contents. Topic modeling is a method for obtaining a summary of broadcasting contents from their scripts. Generally,
scripts represent contents descriptively with directions and speeches,
and provide scene segments that can be seen as semantic units.
Therefore, a script can be topic modeled by treating a scene segment
as a document. Because scene segments consist mainly of speeches, however, relatively few word co-occurrences are observed within them. This inevitably degrades the quality of topics learned by statistical methods. To tackle this problem, we
propose a method to improve topic quality with additional word
co-occurrence information obtained using scene similarities. The
main idea of improving topic quality is that the information that
two or more texts are topically related can be useful to learn high
quality of topics. In addition, more accurate topical representations lead to more accurate information about whether two texts are related. In this paper, we regard two scene segments as related if their topical similarity is high enough. We also consider that words co-occur if they appear together in topically related scene segments. By iteratively inferring topics and determining semantically neighboring scene segments, we obtain a topic space that represents broadcasting contents well. In the experiments, we showed that the proposed method generates higher-quality topics from Korean drama scripts than the baselines.
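The scene-similarity step described above can be sketched as follows, assuming scene segments are represented by topic-proportion vectors and compared with cosine similarity; the threshold value is a hypothetical choice:

```python
import math

def cosine(p, q):
    """Cosine similarity between two topic-proportion vectors."""
    dot = sum(a * b for a, b in zip(p, q))
    norm_p = math.sqrt(sum(a * a for a in p))
    norm_q = math.sqrt(sum(b * b for b in q))
    return dot / (norm_p * norm_q) if norm_p and norm_q else 0.0

def related_pairs(scene_topics, threshold=0.8):
    """Pairs of scene segments whose topic distributions are similar enough
    to serve as extra word co-occurrence context for each other."""
    pairs = []
    for i in range(len(scene_topics)):
        for j in range(i + 1, len(scene_topics)):
            if cosine(scene_topics[i], scene_topics[j]) >= threshold:
                pairs.append((i, j))
    return pairs
```

In the iterative scheme, topics would be re-inferred after augmenting word co-occurrences from each related pair, and the pairs recomputed from the new topics.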
Abstract: Multiple Sclerosis (MS) is a disease which affects the central nervous system and causes balance problems. In clinical practice, this disorder is usually evaluated using static posturography. Linear and nonlinear measures, extracted from the posturographic data (i.e., center of pressure, COP) recorded during a balance test, have been used to analyze the postural control of MS patients. In this study, the trend (TREND) and the sample entropy (SampEn), two nonlinear parameters, were chosen to investigate their relationships with the
expanded disability status scale (EDSS) score. 40 volunteers with
different EDSS scores participated in our experiments with eyes open
(EO) and closed (EC). TREND and 2 types of SampEn (SampEn1
and SampEn2) were calculated for each combined COP position signal. The results showed that TREND had a weak negative
correlation to EDSS while SampEn2 had a strong positive correlation
to EDSS. Compared to TREND and SampEn1, SampEn2 showed a more significant correlation to EDSS and an ability to discriminate the MS patients in the EC case. In addition, the outcome of the study
suggests that the multi-dimensional nonlinear analysis could provide
some information about the impact of disability progression in MS on
dynamics of the COP data.
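A minimal sample entropy computation of the kind referred to above might look like the following. This is a simplified textbook sketch; the study's SampEn1/SampEn2 variants and their parameter choices are not specified here:

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) of a 1-D series; r is a fraction of the standard deviation.

    Counts pairs of length-m templates matching within tolerance (B), then
    pairs of length-(m+1) templates (A); SampEn = -ln(A / B).
    Self-matches are excluded. Low values indicate a regular signal.
    """
    n = len(x)
    mean = sum(x) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in x) / n)
    tol = r * sd

    def count_matches(mm):
        templates = [x[i:i + mm] for i in range(n - mm + 1)]
        c = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                # Chebyshev distance between the two templates.
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= tol:
                    c += 1
        return c

    b = count_matches(m)
    a = count_matches(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")
```

A perfectly periodic COP-like signal yields a SampEn near zero, while an irregular one yields a larger value, which is what makes the measure useful for tracking disability-related changes in postural dynamics.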
Abstract: In this research, we propose to conduct diagnostic and predictive analyses of the key factors and consequences of urban population relocation. To achieve this goal, urban simulation models
extract the urban development trends as land use change patterns from
a variety of data sources. The results are treated as part of urban big
data with other information such as population change and economic
conditions. Multiple data mining methods are deployed on this data to
analyze nonlinear relationships between parameters. The results determine the driving forces of population relocation with respect to urban sprawl and urban sustainability and their related parameters.
This work sets the stage for developing a comprehensive urban
simulation model for catering to specific questions by targeted users. It
contributes towards achieving sustainability as a whole.
Abstract: Objective: Acute coronary syndrome is a clinical condition encompassing ST-segment elevation myocardial infarction, non-ST-segment elevation myocardial infarction, and unstable angina, and is characterized by ruptured coronary plaque, stress and myocardial injury. Angina pectoris is a pressure-like pain in the chest that is induced by exertion or stress and is relieved within minutes after cessation of effort or use of sublingual nitroglycerin. The present
research was undertaken to study the drug utilization pattern of
antiplatelet drugs for the ischemic heart disease in a tertiary care
hospital. Method: The present study is a retrospective drug utilization study with a study period of 6 months. The data were collected from the discharge case sheets of the general medicine department of the Rajiv Gandhi Institute of Medical Sciences, Kadapa. The tentative sample size fixed was 250 patients. Out of 250 cases, 19 cases were excluded because of unrelated data. Results: A total of 250
prescriptions were collected for the study; according to the inclusion criteria, 233 prescriptions were diagnosed with ischemic heart disease and 17 prescriptions were excluded due to unrelated information. Out of 233 prescriptions, 128 patients (54.9%) were male and 105 (45%) were female. According to the gender distribution, the prevalence of ischemic heart disease was 90 (70.31%) in males and 39 (37.1%) in females. Likewise, the prevalence of ischemic heart disease along with cerebrovascular disease was 39 (29.6%) in males and 66 (62.6%) in females. Conclusion: We found that 94.8% drug utilization of antiplatelet drugs was achieved at the Rajiv Gandhi Institute of Medical Sciences, Kadapa, from 2011-2012.
Abstract: Advances in spatial and spectral resolution of satellite
images have led to tremendous growth in large image databases. The
data we acquire through satellites, radars, and sensors consists of
important geographical information that can be used for remote
sensing applications such as region planning, disaster management.
Spatial data classification and object recognition are important tasks
for many applications. However, classifying and identifying objects manually from images is a difficult task. Object recognition is often considered a classification problem, and this task can be performed using machine-learning techniques. Among the many machine-learning algorithms, the classification here is done using supervised classifiers such as Support Vector Machines (SVM), since the area of interest is known. We propose a classification method which considers neighboring pixels in a region for feature extraction and evaluates classifications precisely according to neighboring classes for semantic interpretation of the region of interest (ROI). A
dataset has been created for training and testing purposes; we generated the attributes by considering pixel intensity values and mean values of reflectance. We demonstrate the benefits of using knowledge discovery and data-mining techniques, which can be applied to image data for accurate information extraction and classification from high-spatial-resolution remote sensing imagery.
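The neighboring-pixel feature idea can be illustrated with a small sketch. The 3x3 window is a hypothetical choice; the study's actual attributes are pixel intensities and mean reflectance values:

```python
import numpy as np

def neighborhood_features(band, size=3):
    """For each pixel of a single band, stack its own intensity with the
    mean of its size x size neighborhood (reflect-padded at the borders).
    The resulting (H*W, 2) matrix can feed a supervised classifier
    such as an SVM."""
    pad = size // 2
    padded = np.pad(band.astype(float), pad, mode="reflect")
    h, w = band.shape
    means = np.empty((h, w))
    for i in range(h):
        for j in range(w):
            # Mean of the window centred on pixel (i, j).
            means[i, j] = padded[i:i + size, j:j + size].mean()
    return np.stack([band.astype(float), means], axis=-1).reshape(-1, 2)
```

Adding further bands or window statistics (variance, texture measures) extends the feature vector in the same pattern.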
Abstract: Digital images are widely used in computer
applications. Storing or transmitting uncompressed images requires considerable storage capacity and transmission bandwidth.
Image compression is a means to perform transmission or storage of
visual data in the most economical way. This paper explains about
how images can be encoded to be transmitted in a multiplexing
time-frequency domain channel. Multiplexing involves packing
signals together whose representations are compact in the working
domain. In order to optimize transmission resources, each 4 × 4 pixel block of the image is transformed, by a suitable polynomial approximation, into a minimal number of coefficients. Keeping fewer than 4 × 4 coefficients per block spares a significant amount of transmitted information, but some information is lost. Different approximations for image transformation have been evaluated: polynomial representation (Vandermonde matrix), least squares with gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev polynomials, and singular value decomposition (SVD). Results have
been compared in terms of nominal compression rate (NCR),
compression ratio (CR) and peak signal-to-noise ratio (PSNR)
in order to minimize the error function defined as the difference
between the original pixel gray levels and the approximated
polynomial output. Polynomial coefficients have been later encoded
and handled for generating chirps in a target rate of about two
chirps per 4 × 4 pixel block and then submitted to a transmission
multiplexing operation in the time-frequency domain.
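As an illustration of the block approximation step, a least-squares polynomial fit of a 4 × 4 block could look like the following. This is a generic sketch; the paper's exact basis functions and coefficient counts are not reproduced here:

```python
import numpy as np

def fit_block(block, degree=2):
    """Least-squares bivariate polynomial approximation of a 4x4 pixel block.

    Fits z ~ sum_{p+q <= degree} c_pq * x^p * y^q. For degree=2 this uses
    6 coefficients instead of 16 pixel values, i.e. a lossy compression of
    the block. Returns the coefficients and the reconstructed block.
    """
    ys, xs = np.mgrid[0:4, 0:4]
    x, y, z = xs.ravel(), ys.ravel(), block.ravel().astype(float)
    # Monomial exponents with total degree <= degree.
    terms = [(p, q) for p in range(degree + 1) for q in range(degree + 1 - p)]
    A = np.column_stack([x ** p * y ** q for p, q in terms])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    recon = (A @ coef).reshape(4, 4)
    return coef, recon
```

The residual between `block` and `recon` corresponds to the error function minimized in the paper, and the PSNR can be computed from it directly.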
Abstract: Journal bearings used in IC engines are prone to premature
failures and are likely to fail earlier than the rated life due to
highly impulsive and unstable operating conditions and frequent
starts/stops. Vibration signature extraction and wear debris analysis
techniques are prevalent in industry for condition monitoring of
rotary machinery. However, both techniques involve a great deal of
technical expertise, time, and cost. Limited literature is available on
the application of these techniques for fault detection in reciprocating
machinery, due to the complex nature of impact forces that
confounds the extraction of fault signals for vibration-based analysis
and wear prediction. In the present study, a simulation model was developed to investigate the bearing wear behaviour resulting from different operating conditions, to complement the vibration analysis. In the current simulation, the dynamics of the engine were established first, based on which the hydrodynamic journal bearing forces were evaluated by
numerical solution of the Reynolds equation. In addition, the essential outputs of interest in this study, critical for determining wear rates, are the tangential velocity and the oil film thickness between the journals and the bearing sleeve, which, if not maintained appropriately, have a detrimental effect on bearing performance. Archard’s wear prediction model was used in the simulation to
calculate the wear rate of bearings with specific location information
as all determinative parameters were obtained with reference to crank
rotation. Oil film thickness obtained from the model was used as a
criterion to determine if the lubrication is sufficient to prevent contact
between the journal and bearing thus causing accelerated wear. A
limiting value of 1 μm was used as the minimum oil film thickness
needed to prevent contact. The increased wear rate with growing severity of operating conditions is analogous and comparable to the rise in amplitude of the squared envelope of the reference vibration signals. Thus, on the one hand, the developed model demonstrated its capability to explain wear behaviour, and on the other hand it also helps to establish a correlation between wear-based and vibration-based analyses. Therefore, the model provides a cost-effective and quick approach to predicting impending wear in IC engine bearings under various operating conditions.
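Archard's model named above relates worn volume to load, sliding distance and material hardness; a minimal sketch, using the 1 μm film-thickness criterion from the text and otherwise illustrative values, is:

```python
def archard_wear_volume(load_n, sliding_dist_m, hardness_pa, k=1e-7):
    """Archard wear model: V = k * F * s / H.

    k is the dimensionless wear coefficient (illustrative default),
    F the normal load (N), s the sliding distance (m), and H the
    hardness of the softer surface (Pa). Returns worn volume in m^3.
    """
    return k * load_n * sliding_dist_m / hardness_pa

def wear_over_cycle(loads, velocities, dt, hardness_pa,
                    k=1e-7, h_min=1e-6, film=None):
    """Accumulate Archard wear over one crank cycle.

    Wear is counted only at crank steps where the oil film (if given)
    is thinner than the 1 micron contact threshold h_min, mirroring the
    lubrication criterion described in the abstract.
    """
    total = 0.0
    for i, (f, v) in enumerate(zip(loads, velocities)):
        if film is None or film[i] < h_min:
            total += archard_wear_volume(f, abs(v) * dt, hardness_pa, k)
    return total
```

In the full simulation, `loads`, `velocities` and `film` would come from the engine dynamics and the Reynolds-equation solution at each crank angle, giving the location-resolved wear rates mentioned above.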
Abstract: The cochlear implant (CI), whose implantation has become a routine procedure over the last few decades, is an electronic device that provides a sense of sound for patients who are severely or profoundly deaf.
The optimal success of this implantation depends on the electrode
technology and deep insertion techniques. However, this manual
insertion procedure may cause mechanical trauma which can lead to
severe destruction of the delicate intracochlear structure.
Accordingly, future improvement of the cochlear electrode implant
insertion needs reduction of the excessive force application during
the cochlear implantation which causes tissue damage and trauma.
This study examined the tool-tissue interaction of a large-prototype-scale digit embedded with a distributive tactile sensor, based upon a cochlear electrode, and a large-prototype-scale cochlea phantom simulating the human cochlea, which could lead to small-scale digit requirements. The digit, with distributive tactile sensors embedded in a silicon substrate, was inserted into the cochlea phantom to measure
any digit/phantom interaction and position of the digit in order to
minimize tissue damage and trauma during the cochlear electrode insertion. The digit provided tactile information on the digit-phantom insertion interaction, such as contact status, tip penetration, obstacles, relative shape and location, contact orientation and multiple contacts. The tests demonstrated that even devices of such a
relatively simple design and low cost have the potential to improve
cochlear implant surgery and other lumen mapping applications by
providing tactile sensory feedback information and thus controlling
the insertion through sensing and control of the tip of the implant
during the insertion. With this approach, the surgeon could minimize tissue damage and potential damage to the delicate structures within the cochlea caused by the current manual electrode insertion in cochlear implantation. This approach can also be applied to other minimally invasive surgery applications, as well as diagnosis and path navigation procedures.
Abstract: The occurrences of precipitation, commonly referred to as rain, in the forms of "convective" and "stratiform" have been identified worldwide. In this study, the radar return echoes, known as reflectivity values, acquired from radar scans have been exploited in the process of classifying the type of rain endured. The investigation uses radar data from the Malaysian Meteorology Department (MMD). It is possible to discriminate the types of rain experienced in the tropical region by observing the vertical characteristics of the rain structure. Heavy rain in the tropical region
profoundly affects radiowave signals, causing transmission
interference and signal fading. The required wireless system fade margin depends on the type of rain. Information relating to the two mentioned types of rain is critical for system engineers and researchers in their endeavour to improve the reliability of communication links. This paper highlights the quantification of percentage occurrences over a one-year period in 2009.
Abstract: Information technology has been gaining more and more space, whether in industry, commerce or even personal use, but its misuse brings harm to the environment and human health as a result. Contributing to the sustainability of the planet means compensating the environment, in whole or in part, for what is withdrawn from it. Green computing likewise proposes practices for using IT in an environmentally correct way, in aid of strategic management and communication. This work focuses on showing how a mobile application can help businesses reduce costs and the environmental impacts caused by their processes, through a case study of a public company in Brazil.
Abstract: In this work, a framework to model the Supply Chain
(SC) Collaborative Planning (CP) process is proposed. The main
contributions of this framework concern (1) the presentation of the decision view, the most important one due to the characteristics of the process, jointly with the physical, organisational and information views, and (2) the simultaneous consideration of the spatial and
temporal integration among the different supply chain decision
centres. This framework provides the basis for a realistic and
integrated perspective of the supply chain collaborative planning
process and also the analytical modeling of each of its decisional
activities.
Abstract: This article discusses the migration of an RDB to XML documents (schema and data) based on metadata and semantic enrichment, which takes the flattened relational database and enriches it with the object concept. The integration and exploitation of the object concept in XML uses a syntax that allows verifying the conformity of the XML document during its creation. The information extracted from the RDB is therefore analyzed and filtered in order to be adjusted to the structure of the XML files and the associated object model. The XML documents, populated through SQL queries, are built dynamically. A prototype was implemented to perform automatic migration, demonstrating the effectiveness of this approach.
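The dynamic construction of XML from SQL query results can be sketched as follows. This is a generic illustration using SQLite and ElementTree, not the authors' prototype or its object-model enrichment:

```python
import sqlite3
import xml.etree.ElementTree as ET

def table_to_xml(conn, table):
    """Build an XML document dynamically from a relational table:
    one child element per row, one sub-element per column.
    (The table name is assumed trusted; it is interpolated directly.)"""
    cur = conn.execute(f"SELECT * FROM {table}")
    cols = [d[0] for d in cur.description]
    root = ET.Element(table)
    for row in cur:
        rec = ET.SubElement(root, "row")
        for name, value in zip(cols, row):
            ET.SubElement(rec, name).text = "" if value is None else str(value)
    return ET.tostring(root, encoding="unicode")
```

A full migration would additionally emit a schema (XSD) derived from the relational metadata so that conformity can be checked at creation time, as described above.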
Abstract: Social networking sites such as Twitter and Facebook attract over 500 million users across the world. For those users, social life, and even practical life, has become interrelated with these sites, and their interaction with social networking has affected their lives forever. Accordingly, social networking sites have become among the main channels responsible for the vast dissemination of different kinds of information during real-time events. This popularity of social networking has led to various problems, including the possibility of exposing users to incorrect information through fake accounts, which results in the spread of malicious content during live events. This situation can result in huge damage in the real world to society in general, including citizens, business entities, and others. In this paper, we present a classification method for detecting the
fake accounts on Twitter. The study determines the minimized set of
the main factors that influence the detection of the fake accounts on
Twitter, and then the determined factors are applied using different
classification techniques. A comparison of the results of these
techniques has been performed and the most accurate algorithm is
selected according to the accuracy of the results. The study has been compared with different recent studies in the same area, and this comparison has confirmed the accuracy of the proposed approach. We claim that this study can be continuously applied to the Twitter social network to automatically detect fake accounts; moreover, it can be applied to different social network sites, such as Facebook, with minor changes according to the nature of the social network, which are discussed in this paper.
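The model-selection step described above, training several classifiers and keeping the most accurate, can be sketched as follows. The feature set and the two classifiers here are illustrative stand-ins, not the study's actual factors or algorithms:

```python
import math

class Majority:
    """Baseline: always predicts the most frequent training label."""
    def fit(self, X, y):
        self.label = max(set(y), key=y.count)
        return self
    def predict(self, X):
        return [self.label] * len(X)

class OneNN:
    """1-nearest-neighbour over numeric account features
    (e.g. followers count, friends count: hypothetical features)."""
    def fit(self, X, y):
        self.X, self.y = X, y
        return self
    def predict(self, X):
        return [self.y[min(range(len(self.X)),
                           key=lambda i: math.dist(x, self.X[i]))]
                for x in X]

def select_best(classifiers, X_tr, y_tr, X_te, y_te):
    """Train each candidate and keep the one with the highest test accuracy."""
    def acc(pred):
        return sum(a == b for a, b in zip(pred, y_te)) / len(y_te)
    scored = {name: acc(c.fit(X_tr, y_tr).predict(X_te))
              for name, c in classifiers.items()}
    best = max(scored, key=scored.get)
    return best, scored[best]
```

In the study the candidates would be the classification techniques compared on the minimized factor set, with accuracy measured on held-out labelled accounts.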
Abstract: Bezier curves have useful properties for the path generation problem; for instance, they can generate the reference trajectory for vehicles to satisfy path constraints. The algorithms join cubic Bezier curve segments smoothly to generate the path. One useful property of Bezier curves is curvature. In mathematics, curvature is the amount by which a geometric object deviates from being flat, or straight in the case of a line. An example of extrinsic curvature is a circle, whose curvature at any point is equal to the reciprocal of its radius. The smaller the radius, the higher the curvature, and thus the more sharply the vehicle needs to turn. In this
study, we use Bezier curves to fit a highway-like curve. We use a different approach to find the best approximation for the curve so that it resembles the highway-like curve. We compute the curvature by analytical differentiation of the Bezier curve. We then compute the maximum driving speed using the curvature information obtained. Our work rests on some assumptions: first, the Bezier curve estimates the real shape of the curve, which can be verified visually. Even though the Bezier fitting process does not interpolate exactly on the curve of interest, we believe that the speed estimates are acceptable. We verified our result against a manual calculation of the curvature from the map.
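The analytic curvature computation and the resulting speed limit described above can be sketched as follows; the lateral-acceleration limit `a_lat` is an assumed illustrative parameter:

```python
import math

def cubic_bezier_derivs(p, t):
    """First and second derivatives of a cubic Bezier at parameter t.
    p is a list of four (x, y) control points."""
    d1 = [3 * (1 - t) ** 2 * (p[1][k] - p[0][k])
          + 6 * (1 - t) * t * (p[2][k] - p[1][k])
          + 3 * t ** 2 * (p[3][k] - p[2][k]) for k in range(2)]
    d2 = [6 * (1 - t) * (p[2][k] - 2 * p[1][k] + p[0][k])
          + 6 * t * (p[3][k] - 2 * p[2][k] + p[1][k]) for k in range(2)]
    return d1, d2

def curvature(p, t):
    """Signed-magnitude curvature kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^1.5."""
    (dx, dy), (ddx, ddy) = cubic_bezier_derivs(p, t)
    return abs(dx * ddy - dy * ddx) / (dx * dx + dy * dy) ** 1.5

def max_speed(p, a_lat=3.0, samples=100):
    """Largest speed keeping lateral acceleration v^2 * kappa <= a_lat (m/s^2),
    evaluated over a sampling of the curve parameter."""
    k_max = max(curvature(p, i / samples) for i in range(samples + 1))
    return math.inf if k_max == 0 else math.sqrt(a_lat / k_max)
```

For a segment approximating a circular arc of radius R, the computed curvature is close to 1/R, recovering the reciprocal-radius relation stated in the abstract.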
Abstract: Recently, job recommender systems have gained much attention in industry since they address the problem of information overload on recruiting websites. We therefore propose an Extended Personalized Job System that is capable of providing appropriate jobs for job seekers and recommending suitable information for them using data mining techniques and dynamic user profiles. On the other hand, companies can also interact with the system to publish and update job information. The system supports various platforms, including a web application and an Android mobile application. In this paper, user profiles, implicit user actions, user feedback, and clustering techniques from the WEKA libraries were applied and implemented. In addition, open source tools such as the Yii Web Application Framework, the Bootstrap front-end framework and Android mobile technology were also used.