Abstract: This part of the study describes unsteady isothermal melt flow in a container of cuboid shape, driven by a rotating magnetic field. Input data (instantaneous velocities, grid coordinates, and Lorentz forces) were obtained from the in-house CFD code NS-FEM3D, which uses the DDES method of computation. The flow was described by contours of the Lorentz forces and of the resulting velocity field. Taylor magnetic numbers of 1·10^6, 5·10^6, and 1·10^7 were used; the flow was in the 3D turbulent regime.
Abstract: This paper investigates a method for the state estimation of nonlinear systems described by a class of differential-algebraic equation (DAE) models using the extended Kalman filter (EKF). The method involves a transformation from a DAE to an ordinary differential equation (ODE). A relevant dynamic power system model using decoupling techniques is proposed. The estimation technique consists of a state estimator based on the EKF together with a local stability analysis. High performance is illustrated through a simulation study on the IEEE 13-bus test system.
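The EKF recursion underlying such an estimator can be sketched in its simplest scalar form. The function and variable names below are illustrative, not taken from the paper, and a real DAE-derived power system model would be multivariate; this is only a minimal sketch of the predict/update cycle.

```python
def ekf_step(x, P, z, f, F, h, H, Q, R):
    # One predict/update cycle of a scalar extended Kalman filter.
    # f/h are the nonlinear state-transition and measurement functions;
    # F/H are their derivatives evaluated at the current estimate.
    x_pred = f(x)                        # state prediction
    P_pred = F(x) * P * F(x) + Q         # covariance prediction
    y = z - h(x_pred)                    # innovation
    S = H(x_pred) * P_pred * H(x_pred) + R
    K = P_pred * H(x_pred) / S           # Kalman gain
    return x_pred + K * y, (1 - K * H(x_pred)) * P_pred

# Illustrative use: estimate x from the measurement z = x^2 (true x = 2)
x, P = 1.5, 1.0
for _ in range(20):
    x, P = ekf_step(x, P, z=4.0,
                    f=lambda s: s, F=lambda s: 1.0,       # static state
                    h=lambda s: s * s, H=lambda s: 2 * s,
                    Q=1e-4, R=0.1)
```

With a noise-free measurement the estimate settles at the measurement's preimage, which is the usual sanity check for a hand-rolled EKF.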
Abstract: Breast region segmentation is an essential prerequisite in the computerized analysis of mammograms. It aims at separating the breast tissue from the background of the mammogram and includes two independent segmentations. The first separates the background region, which usually contains annotations, labels, and frames, from the whole breast region, while the second removes the pectoral muscle portion (present in Medio-Lateral Oblique (MLO) views) from the rest of the breast tissue. In this paper we propose a hybridization of Connected Component Labeling (CCL), fuzzy, and straight-line methods. Our proposed methods worked well for separating the pectoral region. After removal of the pectoral muscle from the mammogram, further processing is confined to the breast region alone. To demonstrate the validity of our segmentation algorithm, it was extensively tested using 322 mammographic images from the Mammographic Image Analysis Society (MIAS) database. The segmentation results were evaluated using Mean Absolute Error (MAE), Hausdorff Distance (HD), Probabilistic Rand Index (PRI), Local Consistency Error (LCE), and Tanimoto Coefficient (TC). The hybridization of the fuzzy and straight-line methods yielded more than 96% of the curve segmentations rated adequate or better. In addition, a comparison with similar approaches from the state of the art is given, obtaining slightly improved results. Experimental results demonstrate the effectiveness of the proposed approach.
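The Connected Component Labeling step can be illustrated with a generic 4-connected BFS sketch; this is a textbook formulation, not the authors' implementation, which operates on full mammograms rather than the tiny binary image used here.

```python
from collections import deque

def label_components(img):
    # 4-connected component labeling on a binary image (list of lists of 0/1)
    h, w = len(img), len(img[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] and not labels[y][x]:
                current += 1                    # start a new component
                q = deque([(y, x)])
                labels[y][x] = current
                while q:                        # flood-fill its pixels
                    cy, cx = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = cy + dy, cx + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and img[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = current
                            q.append((ny, nx))
    return labels, current

labels, count = label_components([[1, 1, 0], [0, 0, 0], [0, 1, 1]])
```

In a mammogram pipeline, the largest labeled component would be kept as the breast region while small components (annotations, labels, frames) are discarded.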
Abstract: The Smith arithmetic determinant is investigated in this paper. By using two different methods, we derive the explicit formula for the Smith arithmetic determinant.
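The formula in question is Smith's classical identity det[gcd(i, j)]_{n×n} = φ(1)φ(2)···φ(n), where φ is Euler's totient. A short numerical check, using fraction-free (Bareiss) elimination so the determinant stays in exact integer arithmetic:

```python
from math import gcd

def phi(n):
    # Euler's totient via trial division
    result, m, p = n, n, 2
    while p * p <= m:
        if m % p == 0:
            while m % p == 0:
                m //= p
            result -= result // p
        p += 1
    if m > 1:
        result -= result // m
    return result

def det_bareiss(a):
    # Fraction-free Gaussian elimination; exact for integer matrices
    a = [row[:] for row in a]
    n = len(a)
    sign, prev = 1, 1
    for k in range(n - 1):
        if a[k][k] == 0:                 # pivot if necessary
            for r in range(k + 1, n):
                if a[r][k] != 0:
                    a[k], a[r] = a[r], a[k]
                    sign = -sign
                    break
            else:
                return 0
        for i in range(k + 1, n):
            for j in range(k + 1, n):
                a[i][j] = (a[i][j] * a[k][k] - a[i][k] * a[k][j]) // prev
        prev = a[k][k]
    return sign * a[-1][-1]

n = 6
lhs = det_bareiss([[gcd(i, j) for j in range(1, n + 1)] for i in range(1, n + 1)])
rhs = 1
for k in range(1, n + 1):
    rhs *= phi(k)
```

For n = 6 both sides equal φ(1)φ(2)φ(3)φ(4)φ(5)φ(6) = 1·1·2·2·4·2 = 32.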
Abstract: We model the process of a data center as a multi-objective problem of mapping independent tasks onto a set of data center machines that simultaneously minimizes the energy consumption and response time (makespan), subject to the constraints of deadlines and architectural requirements. A simple technique based on multi-objective goal programming is proposed that guarantees a Pareto optimal solution with excellent convergence. The proposed technique is also compared with other traditional approaches. The simulation results show that the proposed technique achieves superior performance compared to the min-min heuristic, and competitive performance relative to the optimal solution implemented in UNDO for small-scale problems.
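For reference, the min-min baseline used in the comparison can be sketched as follows. This is the generic heuristic over an expected-time-to-compute matrix; the variable names and the tiny instance are ours, not the paper's.

```python
def min_min(etc):
    # etc[t][m] = expected time to compute task t on machine m
    n_tasks, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines           # machine ready times
    unassigned = set(range(n_tasks))
    schedule = {}
    while unassigned:
        # pick the task/machine pair with the globally minimum completion time
        t, m, ct = min(((t, m, ready[m] + etc[t][m])
                        for t in unassigned for m in range(n_machines)),
                       key=lambda x: x[2])
        schedule[t] = m
        ready[m] = ct
        unassigned.remove(t)
    return schedule, max(ready)          # assignment and makespan

schedule, makespan = min_min([[3, 5], [4, 2], [6, 6]])
```

Min-min greedily commits the task that can finish earliest, which tends to produce short makespans but, as the abstract notes, can be dominated by multi-objective methods that also account for energy.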
Abstract: Networking solutions, particularly wireless local area networks, have revolutionized technological advancement. Wireless Local Area Networks (WLANs) have gained great popularity as they provide location-independent network access between computing devices. There are a number of access methods used in wireless networks, among which DCF and PCF are the fundamental ones. This paper emphasizes the impact of the DCF and PCF access mechanisms on the performance of the IEEE 802.11a, 802.11b, and 802.11g standards. Performance is evaluated among these three standards using the above-mentioned access mechanisms on the basis of various parameters such as throughput, delay, and load. Analysis revealed superior throughput performance with low delays for the 802.11g standard as compared to the 802.11a/b standards using both the DCF and PCF access methods.
Abstract: Cloud computing technology is very useful in present-day life; it uses the internet and central remote servers to provide and maintain data as well as applications. Such applications can in turn be used by end users via cloud communications without any installation. Moreover, end users' data files can be accessed and manipulated from any other computer using internet services. Despite the flexibility of data and application access and usage that cloud computing environments provide, many questions still arise about how to achieve a trusted environment that protects data and applications in the cloud from hackers and intruders. This paper surveys the "key generation and management" mechanisms and encryption/decryption algorithms used in cloud computing environments, and proposes a new security architecture for cloud computing environments that addresses the various security gaps as much as possible. A new cryptographic environment that applies quantum mechanics in order to achieve more trusted cloud communications with less computation is also given.
Abstract: In today's world, the success of most systems depends on the use of new technologies and information technology (IT), which aim to increase efficiency and user satisfaction. One of the most important systems that uses information technology to deliver services is the education system. However, delivering educational services in the form of E-learning systems requires high-quality hardware and software, which demands substantial investment. Because the vast majority of educational establishments cannot invest in this area, the best option for them is to reduce costs and provide E-learning services by using cloud computing. However, owing to the novelty of cloud technology, it raises challenges and concerns, the most notable of which are security issues. Security concerns about cloud-based E-learning products are critical, and security measures are essential to protect users' valuable data from security vulnerabilities in these products. Thus, these products succeed only if they meet customers' security requirements and can overcome security threats. This paper explores cloud computing and its positive impact on E-learning, with the main focus on identifying security issues related to cloud-based E-learning efforts, improving security, and providing solutions to management challenges.
Abstract: Cloud computing is a style of computing formed from the aggregation and development of technologies such as grid computing, distributed computing, parallel computing, and service-oriented architecture. Its aim is to provide computing, communication, and storage resources in a safe, service-based environment, as fast as possible, delivered virtually via the Internet. Since the services provided in e-government are available via the Internet, cloud computing can be used in the implementation of e-government architecture to provide better service at the lowest economic cost. In this paper, methods of using cloud computing in e-government are studied; we attempt to identify the challenges and benefits of using the cloud in e-government and offer proposals to overcome its shortcomings and to encourage partnership between governments and citizens in using this economical new technology.
Abstract: This paper considers the achievement of productive-level parallel programming skills, based on data from graduation studies at the Polytechnic University of Japan. The data show that most students can achieve parallel programming skills during the graduation study (about 600 to 700 hours) only if the programming environment is limited to GPGPUs. However, the data also show that achieving productive-level parallel programming skills during the graduation study alone is a very demanding task for a student. In addition, they show that parallel programming environments for GPGPUs, such as CUDA and OpenCL, may be more suitable for parallel computing education than other environments such as MPI on a cluster system and the Cell B.E. These results should be useful not only for software development but also for hardware product development using computer technologies.
Abstract: Decisions for investment and the buying and selling of properties depend upon the market value of the property. Issues arise in arriving at the actual value of the property as well as in computing the rate of return from the estate. Addressing valuation-related issues through an understanding of the behavior of real property rates provides the means to evaluate the quality of past decisions and to make valid future decisions. Pune, an important city in India, has witnessed a high rate of growth in the past few years. Increased demand for housing and investment in properties has led to an increase in real estate rates. An attempt has been made to study the change and behavior of real estate rates and the factors influencing them in Pune city.
Abstract: This paper identifies the existing practical skills gap between school-based learning (SBL) and laboratory-based learning (LBL) in the Computing Department within the Faculty of Science at Omar Al-Mukhtar University in Libya. A survey was conducted, and the first author elicited the responses of two groups of stakeholders, namely academic teachers and students.
The primary goal is to review the main strands of available evidence and argue that there is a gap between laboratory and school-based learning in terms of opportunities for experimentation and the application of skills. In addition, the nature of experimental work within the laboratory at Omar Al-Mukhtar University needs to be reconsidered. Another goal of our study was to identify the reasons for students' poor performance in the laboratory and to determine how this poor performance can be eliminated by modifying teaching methods. Bloom's taxonomy of learning outcomes was applied in order to classify questions and problems into categories, and the survey was formulated with reference to third-year Computing Department students. Furthermore, to discover students' opinions on all these issues, an exercise was conducted. The survey provided questions related to what the students had learnt and how well they had learnt it. We were also interested in feedback on how to improve the course, and the final question provided an opportunity for such feedback.
Abstract: Real-time non-invasive Brain Computer Interfaces play a significant progressive role in restoring or maintaining quality of life for medically challenged people. This manuscript provides a comprehensive review of emerging research in the field of cognitive/affective computing in the context of human neural responses. The perspectives of different emotion assessment modalities, such as facial expressions, speech, text, gestures, and human physiological responses, are also discussed. Focus is placed on exploring the ability of EEG (electroencephalogram) signals to portray thoughts, feelings, and unspoken words. An automated workflow-based protocol for designing an EEG-based real-time Brain Computer Interface system for the analysis and classification of human emotions elicited by external audio/visual stimuli is proposed. The front-end hardware includes a cost-effective and portable Emotiv EEG Neuroheadset unit, a personal computer, and a set of external stimulators. Primary signal analysis and processing of the EEG acquired in real time shall be performed using the MATLAB-based advanced brain-mapping toolboxes EEGLAB/BCILAB. This shall be followed by the development of a MATLAB-based self-defined algorithm to capture and characterize temporal and spectral variations in EEG under emotional stimulation. The extracted hybrid feature set shall be used to classify emotional states using artificial intelligence tools such as Artificial Neural Networks. The final system would result in an inexpensive, portable, and more intuitive real-time Brain Computer Interface for controlling prosthetic devices by translating different brain states into operative control signals.
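One ingredient of such spectral feature extraction can be sketched as follows. This naive DFT band-power routine only illustrates the kind of feature the proposed MATLAB algorithm would compute; the band limits, sampling rate, and function names are our assumptions, not part of the authors' system.

```python
import cmath
import math

def band_power(signal, fs, lo, hi):
    # Power of an EEG epoch in the [lo, hi] Hz band via a naive DFT
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n                   # frequency of DFT bin k
        if lo <= f <= hi:
            X = sum(signal[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))
            power += abs(X) ** 2 / n
    return power

fs = 128
sig = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]  # 10 Hz tone
alpha = band_power(sig, fs, 8, 13)   # alpha band captures the 10 Hz tone
beta = band_power(sig, fs, 14, 30)   # beta band is essentially empty
```

Band powers like these (alpha, beta, theta, gamma) are a common hybrid-feature component fed to classifiers such as artificial neural networks.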
Abstract: This work is concerned with the choice of a preferred, representative mechanical and mathematical model of the elastic-creeping deformation of a transversally isotropic array with a doubly periodic system of tilted slots; with the formulation of a finite-element calculation scheme; with the inspection of the states of two diagonal cavities of arbitrary profile and deep inception; and with establishing the nature of the stress and displacement field distributions in the computational processes.
Abstract: In order to integrate knowledge in heterogeneous case-based reasoning (CBR) systems, ontology-based CBR systems have become a hot topic. To solve the problems facing ontology-based CBR systems (for example, their architecture is nonstandard, the reuse of knowledge in legacy CBR systems is deficient, and ontology construction is difficult), we propose a novel approach for semi-automatically constructing an ontology-based CBR system whose architecture is based on a two-layer ontology. Domain knowledge implied in legacy case bases can be mapped automatically from relational database schemas and knowledge items to the relevant OWL local ontology by a mapping algorithm with low time complexity. By concept clustering based on formal concept analysis, and by computing a concept equivalence measure and a concept inclusion measure, suggestions for enriching or amending the concept hierarchy of OWL local ontologies are made automatically, which can aid designers in achieving the semi-automatic construction of an OWL domain ontology. The approach is validated by an application example.
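The concept comparison step can be illustrated with set-based measures over concept attribute sets. The Jaccard-style definitions below are a common choice in formal concept analysis, offered here as an assumption since the abstract does not spell out its exact measures.

```python
def inclusion(a, b):
    # degree to which concept a's attribute set is subsumed by concept b's
    return len(a & b) / len(a) if a else 0.0

def equivalence(a, b):
    # symmetric overlap of two concepts' attribute sets (Jaccard index)
    return len(a & b) / len(a | b) if (a or b) else 1.0

# Hypothetical concepts from a legacy case base, as attribute sets
patient = {"name", "age", "diagnosis"}
case = {"name", "age", "diagnosis", "outcome"}
```

High inclusion with lower equivalence, as here, would suggest placing one concept under the other in the ontology's concept hierarchy.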
Abstract: There are many classical algorithms for finding routing in FPGAs, but using DNA computing the routing problem can be solved efficiently and quickly. The run-time complexity of DNA algorithms is much lower than that of the classical algorithms used for solving FPGA routing. Research in DNA computing is still at an early stage. The high information density of DNA molecules and the massive parallelism involved in DNA reactions make DNA computing a powerful tool. Many research accomplishments have shown that any procedure that can be programmed on a silicon computer can be realized as a DNA computing procedure. In this paper we propose a two-tier approach to the FPGA routing solution. First, the geometric FPGA detailed routing task is solved by transforming it into a Boolean satisfiability equation with the property that any assignment of input variables that satisfies the equation specifies a valid routing; the absence of a satisfying assignment implies that the layout is un-routable. In the second step, a DNA search algorithm is applied to this Boolean equation to solve for routing alternatives, utilizing the properties of DNA computation. The simulated results are satisfactory and indicate the applicability of DNA computing to solving the FPGA routing problem.
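The first tier can be illustrated with a toy satisfiability check. The two-net, two-track instance below is a hypothetical example of the encoding; a real detailed-routing formula would have one variable per net/track option and exclusivity clauses for every shared routing resource.

```python
from itertools import product

def find_routing(clauses, n_vars):
    # Brute-force SAT: any satisfying assignment specifies a valid routing;
    # None means the formula is unsatisfiable, i.e. the layout is un-routable.
    for assignment in product([False, True], repeat=n_vars):
        if all(clause(assignment) for clause in clauses):
            return assignment
    return None

# Two nets share one channel with two tracks: True = track A, False = track B.
# The single clause forbids the nets from occupying the same track.
clauses = [lambda a: a[0] != a[1]]
routing = find_routing(clauses, 2)
```

The DNA search of the second tier plays the same role as this exhaustive loop, but evaluates the assignments massively in parallel in chemistry rather than sequentially in silicon.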
Abstract: Certain sciences such as physics, chemistry or biology,
have a strong computational aspect and use computing infrastructures
to advance their scientific goals. Often, high performance and/or high
throughput computing infrastructures such as clusters and computational
Grids are applied to satisfy computational needs. In addition,
these sciences are sometimes characterised by scientific collaborations
requiring resource sharing which is typically provided by Grid
approaches. In this article, I discuss Grid computing approaches in
High Energy Physics as well as in bioinformatics and highlight some
of my experience in both scientific domains.
Abstract: We explore entanglement in composite quantum systems
and how its peculiar properties are exploited in quantum
information and communication protocols by means of Diagrams
of States, a novel method to graphically represent and analyze how
quantum information is elaborated during computations performed
by quantum circuits.
We present quantum diagrams of states for Bell states generation,
measurements and projections, for dense coding and quantum teleportation,
for probabilistic quantum machines designed to perform
approximate quantum cloning and universal NOT and, finally, for
quantum privacy amplification based on entanglement purification.
Diagrams of states prove to be a useful approach to analyze quantum
computations, by offering an intuitive graphic representation of the
processing of quantum information. They also help in conceiving
novel quantum computations, from describing the desired information
processing to deriving the final implementation by quantum gate
arrays.
Abstract: Devices in a pervasive computing system (PCS) are characterized by their context-awareness, which permits them to proactively provide adapted services to users and applications. To do so, context must be well understood and modeled in an appropriate form that enhances its sharing between devices and provides a high level of abstraction. The most interesting methods for modeling context are those based on ontology; however, the majority of the proposed methods fail to propose a generic ontology for context, which limits their usability and keeps them specific to a particular domain. The adaptation task must be done automatically and without explicit intervention by the user. Devices of a PCS must acquire some intelligence that permits them to sense the current context and trigger the appropriate service, or provide a service in a more suitable form. In this paper we propose a generic service ontology for context modeling and a context-aware service adaptation based on a service-oriented definition of context.
Abstract: The robustness of color-based signatures in the presence of a selection of representative distortions is investigated. Considered are five signatures that have been developed and evaluated within a new modular framework. Two signatures presented in this work are directly derived from histograms gathered from video frames. The other three signatures are based on temporal information by computing difference histograms between adjacent frames. In order to obtain objective and reproducible results, the evaluations are conducted based on several randomly assembled test sets. These test sets are extracted from a video repository that contains a wide range of broadcast content including documentaries, sports, news, movies, etc. Overall, the experimental results show the adequacy of color-histogram-based signatures for video fingerprinting applications and indicate which type of signature should be preferred in the presence of certain distortions.
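The two kinds of signatures described here can be sketched generically as follows. The bin count and the exact per-bin difference are our assumptions for illustration; the paper's signatures involve further normalization and framework-specific details.

```python
def histogram(frame, bins=8, vmax=256):
    # Color/intensity histogram of one frame (flat list of values in [0, vmax))
    h = [0] * bins
    for v in frame:
        h[v * bins // vmax] += 1
    return h

def difference_signature(frames, bins=8):
    # Temporal signature: bin-wise histogram differences between adjacent frames
    hs = [histogram(f, bins) for f in frames]
    return [[b - a for a, b in zip(h0, h1)] for h0, h1 in zip(hs, hs[1:])]

frames = [[0, 0, 255, 255], [0, 255, 255, 255]]  # two tiny 4-pixel frames
sig = difference_signature(frames)
```

Per-frame histograms correspond to the first two signatures in the study, while the adjacent-frame differences correspond to the three temporal ones; matching such signatures against a repository is the core of video fingerprinting.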