Abstract: The paper reviews the literature on the critical analysis of university ranking methodologies. Furthermore, the initiatives supported by the European Commission (U-Map, U-Multirank) and the CHE Ranking are described. Special attention is paid to tendencies in the development of ranking systems. According to the author, ranking organizations should abandon the classic form of ranking, namely a hierarchical ordering of universities from "the best" to "the worst". In the empirical part of this paper, using a method of cluster analysis called k-means clustering, the author presents classifications of the top universities from Shanghai Jiao Tong University's (SJTU) Academic Ranking of World Universities (ARWU).
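The clustering step named above can be sketched as follows. This is a minimal, self-contained k-means implementation applied to hypothetical ranking-indicator scores (e.g. publications and awards on a 0-100 scale); the data and feature names are illustrative assumptions, not the author's actual ARWU dataset or code.

```python
import random

def kmeans(points, k, iters=100, seed=0):
    """Plain k-means: assign each point to its nearest centroid,
    then recompute each centroid as the mean of its cluster."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])))
            clusters[j].append(p)
        new = [tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else centroids[i]
               for i, cl in enumerate(clusters)]
        if new == centroids:  # converged
            break
        centroids = new
    return centroids, clusters

# Hypothetical (indicator_1, indicator_2) scores for four universities:
scores = [(100.0, 90.0), (98.0, 95.0), (20.0, 15.0), (22.0, 18.0)]
centroids, clusters = kmeans(scores, k=2)
```

On such well-separated data the two high-scoring universities fall into one cluster and the two low-scoring ones into the other, which is the kind of grouping (rather than a strict ordering) the abstract advocates.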
Abstract: The importance of maintenance assessment and maintenance optimization at the design stage is analyzed, and the problems of conventional maintenance design methods are discussed. MDMVM (Maintenance Design Method based on Virtual Maintenance) is introduced, its process is established, and its architecture is presented. The key techniques of MDMVM are analyzed, including maintenance design based on KBE (Knowledge Based Engineering) and virtual maintenance based on physical attributes. Physically based modeling, visual control of object movement, simulation of operation forces and a maintenance sequence planning method, all grounded in physical properties, are described in detail. A maintenance design system based on virtual maintenance is established on the foundation of this maintenance design method.
Abstract: In this paper, we use an M/G/C/C state dependent
queuing model within a complex network topology to determine the
different performance measures for pedestrian traffic flow. The
occupants in this network topology need to go through some source
corridors, from which they can choose their suitable exiting
corridors. The performance measures were calculated using arrival
rates that maximize the throughputs of source corridors. In order to
increase the throughput of the network, the result indicates that the
flow direction of pedestrian through the corridors has to be restricted
and the arrival rates to the source corridor need to be controlled.
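The performance measures above can be illustrated with a small sketch of an M/G/C/C state-dependent queue in the form studied by Yuhaski and Smith, where the stationary probability of n occupants is proportional to (λE(S))ⁿ / (n! ∏f(i)) and f(i) is the relative walking speed with i occupants. The linear congestion model f(n) = (C − n + 1)/C and all numbers below are illustrative assumptions, not the paper's corridor parameters.

```python
from math import factorial

def mgcc_probs(lam, service_time, C, f):
    """Stationary distribution of an M/G/C/C state-dependent queue:
    P(n) proportional to (lam*service_time)**n / (n! * prod_{i=1..n} f(i)),
    where f(i) is the relative speed with i occupants (f(1) = 1)."""
    weights = [1.0]
    prod = 1.0
    for n in range(1, C + 1):
        prod *= f(n)
        weights.append((lam * service_time) ** n / (factorial(n) * prod))
    Z = sum(weights)
    return [w / Z for w in weights]

def throughput(lam, service_time, C, f):
    """Accepted pedestrians per unit time: arrivals not blocked at capacity C."""
    p = mgcc_probs(lam, service_time, C, f)
    return lam * (1.0 - p[C])

# Illustrative corridor: capacity 30, free-flow traversal time 5 s, 2 arrivals/s.
C = 30
f_linear = lambda n: (C - n + 1) / C  # assumed linear congestion model
theta = throughput(2.0, 5.0, C, f_linear)
```

The throughput is always strictly below the offered arrival rate because the blocking probability P(C) is positive, which is why controlling the arrival rate to the source corridors matters.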
Abstract: Olomouc is a unique and complex landscape with widespread forestation and varied land use. This research was conducted to assess important and complex land use change trajectories in the Olomouc region. Multi-temporal satellite data from 1991, 2001 and 2013 were used to extract land use/cover types by an object-oriented classification method. To achieve the objectives, three different aspects were examined: (1) the quantity of each transition; (2) the location-based landscape pattern; and (3) a comparison of land use/cover evaluation procedures. Land cover change trajectories show that 16.69% of agriculture, 54.33% of forest and 21.98% of other areas (settlement, pasture and water body) were stable across all three decades. Approximately 30% of the study area remained the same land cover type from 1991 to 2013. Broad-scale political and socio-economic factors also affected the rate and direction of landscape changes. Distance from settlements was the most important predictor of land cover change trajectories. This showed that most landscape trajectories were caused by socio-economic activities and mainly led to beneficial changes in the ecological environment.
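The first analysis step, quantifying each transition between two classified maps, amounts to a per-pixel cross-tabulation. The sketch below assumes a hypothetical class coding (0 = agriculture, 1 = forest, 2 = other) and tiny toy maps; it is not the study's actual classification output.

```python
import numpy as np

def transition_matrix(map_a, map_b, n_classes):
    """Cross-tabulate per-pixel class transitions between two classified maps.
    Entry [i, j] counts pixels that were class i in map_a and class j in map_b."""
    a = np.asarray(map_a).ravel()
    b = np.asarray(map_b).ravel()
    idx = a * n_classes + b
    return np.bincount(idx, minlength=n_classes * n_classes).reshape(n_classes, n_classes)

# Toy 2x3 maps with assumed coding 0=agriculture, 1=forest, 2=other:
map_1991 = np.array([[0, 0, 1], [1, 2, 2]])
map_2013 = np.array([[0, 1, 1], [1, 2, 0]])
T = transition_matrix(map_1991, map_2013, n_classes=3)
stable_fraction = np.trace(T) / T.sum()  # share of pixels with unchanged class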
Abstract: Encoded information based on the synchronization of coupled chaotic Nd:YAG lasers in a master-slave configuration is numerically studied. Encoding, transmission, and decoding of information in optical chaotic communication over a single channel are presented. We analyze the robustness of the encrypted audio transmission in the presence of channel noise. To illustrate this robustness, we present two case studies: synchronization and transmission over a single channel without and with noise in the channel.
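The master-slave synchronization principle can be demonstrated with a simple numerical sketch. As an illustrative stand-in for the Nd:YAG laser rate equations (which the abstract does not reproduce), the code below uses the Lorenz system with Pecora-Carroll complete-replacement coupling: the slave receives the master's x variable, and the error in the remaining variables contracts to zero, which is the property chaotic communication relies on.

```python
def simulate(steps=50000, dt=0.002):
    """Euler-integrate a master Lorenz system and a slave copy driven by the
    master's x variable (Pecora-Carroll complete replacement). Returns the
    synchronization error before and after the run."""
    s, r, b = 10.0, 28.0, 8.0 / 3.0
    x, y, z = 1.0, 1.0, 1.0          # master state
    ys, zs = 6.0, -4.0               # slave state, deliberately mismatched
    err0 = abs(y - ys) + abs(z - zs)
    for _ in range(steps):
        dx = s * (y - x)
        dy = x * (r - z) - y
        dz = x * y - b * z
        dys = x * (r - zs) - ys      # slave uses the master's x signal
        dzs = x * ys - b * zs
        x += dt * dx; y += dt * dy; z += dt * dz
        ys += dt * dys; zs += dt * dzs
    return err0, abs(y - ys) + abs(z - zs)
```

For this coupling the error dynamics are strictly contracting (d(e_y²+e_z²)/dt = −2e_y² − 2b·e_z² ≤ 0), so the slave locks onto the master's chaotic trajectory; a message added to the transmitted signal can then be recovered by subtracting the slave's regenerated chaos.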
Abstract: This article is an extension and a practical application of Wheeler's NEBIC (Net Enabled Business Innovation Cycle) theory. NEBIC theory is a new approach in IS research and can be used for dynamic environments related to new technology. Firms can follow market changes rapidly with the support of IT resources. Flexible firms adapt their market strategies and respond more quickly to customers' changing behaviors. When every leading firm in an industry has access to the same IT resources, the way these IT resources are managed will determine the competitive advantages or disadvantages of a firm. From the Dynamic Capabilities Perspective and from Wheeler's newly introduced NEBIC theory, we know that IT resources alone cannot deliver customer value, but a good configuration of those resources can guarantee customer value by choosing the right emerging technology and grasping the right economic opportunities through business innovation and growth. We found evidence in the literature that SOA (Service Oriented Architecture) is a promising emerging technology which can deliver the desired economic opportunity through modularity, flexibility and loose coupling. SOA can also help firms connect in networks, which can open a new window of opportunity to collaborate in innovation and in the right kind of outsourcing. Many articles and research reports indicate that the failure rate in outsourcing is very high, but research also indicates that successful outsourcing projects add tangible and intangible benefits to the service consumer. Business executives and policy makers in the West should not be afraid of outsourcing; rather, they should choose the right strategy through the use of emerging technology to significantly reduce the failure rate in outsourcing.
Abstract: This paper proposes a new form of cloud computing that allows individual computer users to share applications in distributed communities, called community-based personal cloud computing (CPCC). The paper also presents a prototype design and implementation of CPCC. Users of CPCC are able to share their computing applications with other users of the community. Any member of the community is able to execute remote applications shared by other members. The remote applications behave in the same way as their local counterparts, allowing the user to enter input and receive output as well as providing access to the user's local data. CPCC provides a peer-to-peer (P2P) environment where each peer provides applications which can be used by the other peers connected to CPCC.
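The sharing model can be sketched with a toy peer registry. This is only a minimal illustration of the CPCC idea, with hypothetical class and method names; the paper's actual prototype would marshal input, output and local-data access over a real P2P transport rather than an in-process dictionary.

```python
class Peer:
    """Toy CPCC peer: shares named applications (callables) with a community."""

    def __init__(self, name, community):
        self.name = name
        self.apps = {}
        self.community = community   # shared registry standing in for the P2P overlay
        community[name] = self

    def share(self, app_name, func):
        """Publish an application so other community members can run it."""
        self.apps[app_name] = func

    def run_remote(self, peer_name, app_name, *args):
        """Execute an application shared by another peer. In this sketch the
        remote call is a plain function call; a real CPCC implementation would
        serialize arguments and results across the network."""
        return self.community[peer_name].apps[app_name](*args)

community = {}
alice = Peer("alice", community)
bob = Peer("bob", community)
alice.share("upper", lambda s: s.upper())
result = bob.run_remote("alice", "upper", "hello")
```

The point of the sketch is the symmetry the abstract describes: every peer is both a provider and a consumer of applications.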
Abstract: There exists a strong correlation between efficient project management and competitive advantage for organizations. Therefore, organizations are striving to standardize and assess the rigor of their project management processes and capabilities, i.e. their project management maturity. Researchers and standardization organizations have developed several project management maturity models (PMMMs) to assess the project management maturity of organizations. This study presents a critical evaluation of some of the leading PMMMs against OPM3® in a multitude of ways, to determine which PMMM is the most comprehensive model, i.e. which could assess most aspects of an organization and also help organizations gain competitive advantage over competitors. After a detailed morphological analysis of the models, it is concluded that OPM3® is the most promising maturity model and can provide a real competitive advantage to organizations due to its unique approach to assessment and improvement strategies.
Abstract: In this paper, a new system for the recognition of Persian printed numeral characters, with emphasis on the representation and recognition stages, is introduced. For the first time in Persian optical character recognition, geometrical central moments as the character image descriptor and a fuzzy min-max neural network as the classifier have been used for Persian numeral character recognition. A set of experiments on binary images of regular, translated, rotated and scaled Persian numeral characters was carried out and a variety of results is presented. The best result was 99.16% correct recognition, demonstrating that geometrical central moments and fuzzy min-max neural networks are adequate for Persian printed numeral character recognition.
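The descriptor named above, the geometric central moment μ_pq = Σ (x − x̄)^p (y − ȳ)^q over the ink pixels, can be sketched as follows; the toy binary image is illustrative, not one of the paper's Persian numeral samples.

```python
import numpy as np

def central_moment(img, p, q):
    """Geometric central moment mu_pq of a binary image (1 = ink pixel).
    Centering on the centroid makes the moment translation-invariant."""
    ys, xs = np.nonzero(img)
    xbar, ybar = xs.mean(), ys.mean()
    return (((xs - xbar) ** p) * ((ys - ybar) ** q)).sum()

# Toy 8x8 "character": a 2x3 block of ink pixels.
img = np.zeros((8, 8), dtype=int)
img[2:4, 3:6] = 1
mu_00 = central_moment(img, 0, 0)  # equals the ink-pixel count
mu_20 = central_moment(img, 2, 0)  # horizontal spread about the centroid
```

Because the moments are taken about the centroid, translating the character leaves them unchanged, which is why they suit translated inputs; further normalization (e.g. dividing by powers of μ_00) is what gives scale invariance.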
Abstract: Vernonia divergens Benth., commonly known as the "Insulin Plant" (Fam: Asteraceae), is a potent sugar killer. Locally, the leaves of the plant, boiled in water, are successfully administered to a large number of diabetic patients. The present study evaluates the putative anti-diabetic ingredients, isolated from in vivo and in vitro grown plantlets of V. divergens, for their antimicrobial and anticancer activities. Sterilized explants of nodal segments were cultured on MS (Murashige and Skoog, 1962) medium in the presence of different combinations of hormones. Multiple shoots along with bunches of roots were regenerated at 1 mg l-1 BAP and 0.5 mg l-1 NAA. Micro-plantlets were separated and sub-cultured on double strength (2X) of the above combination of hormones, leading to increased length of roots and shoots. These plantlets were
successfully transferred to soil and survived well in nature. Ethanol extracts of plantlets from both in vivo and in vitro sources were prepared in a Soxhlet extractor and then concentrated to dryness under reduced pressure in a rotary evaporator. The concentrated extracts thus obtained showed significant inhibitory activity against gram-negative bacteria such as Escherichia coli and Pseudomonas aeruginosa, but no inhibition was found against gram-positive bacteria. Further, these ethanol extracts were screened for in vitro percentage cytotoxicity at different time periods (24 h, 48 h and 72 h) and at different dilutions. The in vivo plant extract inhibited the growth of EAC mouse cell lines in the range of 65, 66, 78, and 88% at 100, 50, 25 and 12.5 μg mL-1, but only at 72 h of treatment. In the case of the extract of in vitro origin, inhibition of the EAC cell lines was found even at 48 h. During spectrophotometric scanning, the extracts exhibited different absorption maxima (λmax): four peaks in the in vitro extract as against a single peak in the in vivo preparation, suggesting a possible change in the nature of the ingredients during micropropagation through tissue culture techniques.
Abstract: Energy dissipation in drops has been investigated using physical models. After determining the parameters affecting the phenomenon, three drops of different heights were constructed from Plexiglas and installed in two existing flumes in the hydraulic laboratory. Several runs of the physical models were undertaken to measure the parameters required for determining the energy dissipation. Results showed that the energy dissipation in drops depends on the drop height and discharge. Predicted relative energy dissipations varied from 10.0% to 94.3%. This work also indicated that the energy loss at a drop is mainly due to the mixing of the jet with the pool behind the jet, which causes air bubble entrainment in the flow. A statistical model developed to predict the energy dissipation in vertical drops indicates a nonlinear correlation between the effective parameters. Furthermore, an artificial neural network (ANN) approach was used in this paper to develop an explicit procedure for calculating the energy loss at drops using NeuroSolutions. The trained network was able to predict the response with R2 = 0.977 and RMSE = 0.0085. The performance of the ANN was found to be effective compared to regression equations in predicting the energy loss.
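The quantity being predicted, relative energy dissipation at a vertical drop, can be sketched from textbook hydraulics. The code below is an illustrative calculation, not the paper's statistical or ANN model: it takes the upstream energy head as drop height plus 1.5 times the critical depth, and estimates the depth at the drop base from Rand's (1955) empirical relation y1/h = 0.54·D^0.425 with drop number D = q²/(g·h³).

```python
G = 9.81  # gravitational acceleration, m/s^2

def relative_energy_loss(q, h):
    """Relative energy dissipation (E1 - E2)/E1 at a vertical drop.
    q: unit discharge (m^2/s), h: drop height (m)."""
    yc = (q ** 2 / G) ** (1.0 / 3.0)        # critical depth upstream
    E1 = h + 1.5 * yc                       # upstream energy head above the base
    D = q ** 2 / (G * h ** 3)               # drop number
    y1 = 0.54 * D ** 0.425 * h              # depth at the base (Rand 1955)
    E2 = y1 + q ** 2 / (2 * G * y1 ** 2)    # downstream specific energy
    return (E1 - E2) / E1

# Illustrative case: q = 0.1 m^2/s over a 0.5 m drop.
r = relative_energy_loss(0.1, 0.5)
```

Consistent with the experimental finding, this estimate grows with drop height for a fixed discharge, and for plausible laboratory values it falls inside the reported 10.0%-94.3% band.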
Abstract: The paper provides a numerical investigation of the
entropy generation analysis due to natural convection in an inclined
square porous cavity. The coupled equations of mass, momentum,
energy and species conservation are solved using the Control Volume
Finite-Element Method. The effects of medium permeability and inclination angle on entropy generation are analysed. It was found that, depending on the values of the Darcy number and the porous thermal Rayleigh number, the entropy generation can be mainly due to heat transfer or to fluid friction irreversibility, and that entropy generation reaches extremum values for specific inclination angles.
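The two irreversibility sources named above can be sketched with the standard Darcy-model expressions for local volumetric entropy generation, S_th = (k/T0²)|∇T|² for heat transfer and S_fl = (μ/(K·T0))(u² + v²) for fluid friction; the Bejan number Be = S_th/(S_th + S_fl) then indicates which mechanism dominates. The numbers below are illustrative assumptions, not values from the paper's simulations.

```python
def local_entropy_generation(k, T0, dT_dx, dT_dy, mu, K, u, v):
    """Heat-transfer and Darcy fluid-friction contributions to local
    volumetric entropy generation in a saturated porous medium.
    k: thermal conductivity, T0: reference temperature, K: permeability,
    mu: dynamic viscosity, (u, v): Darcy velocity components."""
    s_thermal = k / T0 ** 2 * (dT_dx ** 2 + dT_dy ** 2)
    s_friction = mu / (K * T0) * (u ** 2 + v ** 2)
    return s_thermal, s_friction

def bejan(s_thermal, s_friction):
    """Bejan number: Be > 0.5 means heat-transfer irreversibility dominates."""
    return s_thermal / (s_thermal + s_friction)

# Illustrative point values: modest temperature gradient, low permeability.
s_th, s_fl = local_entropy_generation(1.0, 300.0, 10.0, 0.0, 1e-3, 1e-7, 0.01, 0.0)
Be = bejan(s_th, s_fl)
```

Lowering the permeability K (i.e. the Darcy number) inflates the friction term, which is how the dominant irreversibility can switch between the two regimes the abstract describes.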
Abstract: In this paper we propose a new knowledge model using Dempster-Shafer evidence theory for image segmentation and fusion. The proposed method is composed essentially of two steps. First, mass distributions in Dempster-Shafer theory are obtained from the membership degrees of each pixel over the three image components (R, G and B). Each membership degree is determined by applying Fuzzy C-Means (FCM) clustering to the gray levels of the three images. Second, the fusion process consists of defining three frames of discernment associated with the three images to be fused, and then combining them to form a new frame of discernment. The strategy used to define mass distributions in the combined framework is discussed in detail. The proposed fusion method is illustrated in the context of image segmentation. Experimental investigations and comparative studies with previous methods are carried out, showing the robustness and superiority of the proposed method in terms of image segmentation.
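The combination step at the heart of the fusion process is Dempster's rule. The sketch below implements the rule for two mass functions over one frame of discernment; the two-class frame and the mass values (standing in for FCM-derived memberships of a single pixel) are illustrative assumptions, not the paper's actual mass-assignment strategy.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination. m1, m2 map focal elements (frozensets
    over the same frame of discernment) to masses summing to 1."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    # Normalize by 1 - K, redistributing the conflicting mass.
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

# Hypothetical per-pixel evidence from two channels over frame {road, grass}:
theta = frozenset({"road", "grass"})
m_r = {frozenset({"road"}): 0.6, theta: 0.4}
m_g = {frozenset({"grass"}): 0.5, theta: 0.5}
fused = dempster_combine(m_r, m_g)
```

After combination the singleton with the largest fused mass gives the pixel's class, which is how evidence from the R, G and B frames is turned into a segmentation decision.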
Abstract: In this work, I present a review on Sparse Distributed
Memory for Small Cues (SDMSCue), a variant of Sparse Distributed
Memory (SDM) that is capable of handling small cues. I then conduct
and show some cognitive experiments on SDMSCue to test its
cognitive soundness compared to SDM. Small cues refer to input cues that are presented to memory for reading associations but have many missing parts or fields. The original SDM fails to handle such cues; SDMSCue overcomes this pitfall. The main idea in SDMSCue is the repeated projection of the semantic space onto smaller subspaces that are selected based on the input cue length and pattern. This process allows Read/Write operations using an input cue that is missing a large portion. SDMSCue is augmented with the use of genetic algorithms for memory allocation and initialization. I claim that SDM functionality is a subset of SDMSCue functionality.
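For readers unfamiliar with the baseline, a minimal Kanerva-style SDM (the system SDMSCue extends) can be sketched as below: hard locations within a Hamming radius of the cue are activated, writes increment/decrement their bit counters, and reads threshold the counter sums. All sizes are illustrative assumptions; SDMSCue's contribution, the repeated projection onto subspaces chosen by cue length, is not reproduced here.

```python
import numpy as np

def make_sdm(n_locations=200, dim=64, radius=28, seed=1):
    """Random hard-location addresses plus zeroed bit counters."""
    rng = np.random.default_rng(seed)
    addresses = rng.integers(0, 2, size=(n_locations, dim))
    counters = np.zeros((n_locations, dim), dtype=int)
    return addresses, counters, radius

def activate(addresses, cue, radius):
    """Boolean mask of hard locations within Hamming radius of the cue."""
    return np.count_nonzero(addresses != cue, axis=1) <= radius

def write(addresses, counters, radius, addr, data):
    """Add the data pattern (+1 for a 1-bit, -1 for a 0-bit) to all
    counters of the activated locations."""
    act = activate(addresses, addr, radius)
    counters[act] += np.where(data == 1, 1, -1)

def read(addresses, counters, radius, cue):
    """Sum counters over activated locations and threshold at zero."""
    act = activate(addresses, cue, radius)
    return (counters[act].sum(axis=0) >= 0).astype(int)
```

A full-length cue activates enough overlapping locations to recover the stored pattern; when most of the cue's fields are missing, the activation set degrades, which is precisely the failure mode SDMSCue's subspace projection is designed to avoid.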
Abstract: In today's fast-paced world, where everyone is short of time and works haphazardly, a similar scene is common on the roads in traffic. To avert the fatal consequences of speeding traffic on busy lanes, software to analyse and keep account of traffic and the subsequent congestion is used in developed countries. This software has been implemented and used with the help of a support tool called the Critical Analysis Reporting Environment. There have been two versions of this tool. The current paper examines the issues and problems encountered while using these two versions in practice. Furthermore, a hybrid architecture is proposed that retains the quality and performance of both and is better in terms of coupling of components, maintainability and many other features.
Abstract: Electronics products that achieve high levels of integrated communications, computing, entertainment and multimedia features in small, stylish and robust new form factors are winning in the marketplace. Because of the high costs an industry may incur, and because high yield is directly proportional to high profit, IC (Integrated Circuit) manufacturers struggle to maximize yield; yet today's customers demand miniaturization, low cost, high performance and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed in order to predict the assembly process. To evaluate the quality of upcoming circuits, yield models are used which not only predict manufacturing costs but also provide vital information that eases the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors such as boards, placement, components, the materials from which the components are made, and processes must be taken into consideration. Effective placement yield depends heavily on machine accuracy and on the vision system, which needs the ability to recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package, and using solder ball locations, also called footprints. The only assumption that a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that rely on repeated random sampling to compute their results. This method is utilized to recreate, in simulation, the placement and assembly processes within a production line.
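The Monte Carlo idea can be sketched in a few lines: draw random placement offsets from the machine's accuracy distribution and count the fraction of components that land within the pad alignment tolerance. The Gaussian offset model and the millimetre values below are illustrative assumptions, not the paper's production-line parameters.

```python
import random

def placement_yield(n_trials, sigma_xy, tolerance, seed=42):
    """Monte Carlo estimate of placement yield: a component counts as 'good'
    when its random x and y placement offsets (Gaussian, std sigma_xy) both
    stay within the pad alignment tolerance."""
    rng = random.Random(seed)
    good = 0
    for _ in range(n_trials):
        dx = rng.gauss(0.0, sigma_xy)
        dy = rng.gauss(0.0, sigma_xy)
        if abs(dx) <= tolerance and abs(dy) <= tolerance:
            good += 1
    return good / n_trials

# Illustrative machine accuracy 0.02 mm (1 sigma), tolerance 0.05 mm.
y = placement_yield(100_000, sigma_xy=0.02, tolerance=0.05)
```

Tightening the tolerance or degrading machine accuracy lowers the estimated yield, which is the trade-off a yield model makes visible before a process correction is attempted.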
Abstract: A number of important developments have led to an increasing attractiveness of very high speed electrical machines (either motors or generators). Specifically, the increasing switching speed of power electronics, high-energy magnets, high-strength retaining materials, better high speed bearings and improvements in design analysis are the primary drivers of the move to higher speeds. The design challenges arise in the mechanical design, both in terms of strength and resonant modes, and in the electromagnetic design, particularly in respect of iron losses and AC losses in the various conducting parts, including the rotor. This paper describes detailed design work on a 50,000 rpm, 50 kW permanent magnet (PM) synchronous machine. It describes work on electromagnetic and rotor eddy current losses using a variety of methods, including 2D finite element analysis.
Abstract: Liposomal magnetofection is a simple, highly efficient
technology for cell transfection, demonstrating better outcome than a
number of other common gene delivery methods. However, the distribution of aggregate complexes over the cell surface is non-uniform due to the gradient of the permanent magnetic field. The aim of this study was to estimate the efficiency of liposomal magnetofection for the prostate carcinoma PC3 cell line using a newly designed device, "DynaFECTOR", which performs magnetofection in a dynamic gradient magnetic field. Liposomal magnetofection in a dynamic gradient magnetic field demonstrated the highest transfection efficiency for PC3 cells: it increased by 21% in comparison with liposomal magnetofection and by 42% in comparison with lipofection alone. The optimal incubation time under the dynamic magnetic field for the PC3 cell line was 5 minutes and the optimal rotation frequency of the magnets was 5 rpm. The new approach also revealed a lower cytotoxic effect on cells than liposomal magnetofection.
Abstract: E-learning is not restricted to the use of new technologies for online content; it also induces the adoption of new approaches to improve the quality of education. This quality depends on the ability of these approaches (technical and pedagogical) to provide an adaptive learning environment. Thus, the environment should include features that convey intentions and meet the educational needs of learners by providing a customized learning path towards acquiring the competency concerned. In our proposal, we believe that an individualized learning path requires knowledge of the learner. Therefore, it must pass through a personalization of diagnosis to identify precisely the competency gaps to fill, and to reduce the cognitive load. To personalize the diagnosis and pertinently measure the competency gap, we suggest implementing formative assessment in the e-learning environment, and we propose the introduction of a pre-regulation process in the area of formative assessment, involving its individualization and implementation in e-learning.
Abstract: We present the results of a case study aiming to assess the reflection of the tourism community on the Web and its usability, and to propose new ways to communicate visually. The wealth of information contained in the Web and the facilities it offers for communicating personal points of view make the social web a new space for exploration. The social web allows the sharing of information between communities with similar interests. However, the tourism community remains unexplored, as is the case for the information contained in travel stories. Across the Web, we find multiple sites allowing users to communicate their experiences and personal points of view about particular places in the world. This cultural heritage is found in numerous documents, usually only sparsely supplemented with photos, so they are difficult to explore due to the lack of visual information. This paper explores the possibility of analyzing travel stories in order to display them visually on maps and to generate new knowledge such as patterns of travel routes. In this way, travel narratives published in electronic formats can be very important, especially to the tourism community, because of the great amount of knowledge that can be extracted. Our approach is based on the use of a Geoparsing Web Service to extract geographic coordinates from travel narratives in order to draw the geo-positions and link the documents onto a map image.
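The route-extraction idea can be sketched offline with a tiny hand-made gazetteer standing in for the Geoparsing Web Service; the place names, coordinates and function names below are illustrative assumptions, not the service's actual interface.

```python
import re

# Hypothetical mini-gazetteer; a real system would query a Geoparsing Web
# Service to resolve place names to coordinates.
GAZETTEER = {
    "paris": (48.8566, 2.3522),
    "lyon": (45.7640, 4.8357),
    "marseille": (43.2965, 5.3698),
}

def extract_route(story):
    """Return the geo-positions of recognized place names in order of mention,
    collapsing immediate repetitions of the same place."""
    route = []
    for word in re.findall(r"[A-Za-z']+", story):
        coords = GAZETTEER.get(word.lower())
        if coords and (not route or route[-1] != coords):
            route.append(coords)
    return route

story = "We left Paris by train, stopped in Lyon, and reached Marseille at night."
route = extract_route(story)
```

The ordered coordinate list is what gets drawn on the map image, and aggregating such routes across many narratives is how travel-route patterns would emerge.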