Abstract: This article outlines the conceptualization and
implementation of an intelligent system capable of extracting
knowledge from databases. The use of hybridized features of both
Rough and Fuzzy Set theory lends the developed system flexibility
in dealing with discrete as well as continuous datasets. A raw data set
provided to the system is initially transformed into a machine-readable
format, followed by pruning of the data set. The refined data set is
then processed through various Rough Set operators, which enable the
discovery of parameter relationships and interdependencies. The
discovered knowledge is automatically transformed into a rule base
expressed in Fuzzy terms. Two exemplary cancer repository datasets
(for Breast and Lung Cancer) have been used to test and implement
the proposed framework.
Abstract: This paper begins with a formal definition of human rights and freedoms; the basic document on the subject is undoubtedly the French Declaration of the Rights of Man and of the Citizen from 1789. The paper then parses the legal sources relevant to workers' rights in the legal system of the Republic of Croatia: international treaties and the Labour Act, which is the master bill regarding workers' rights. The authors also deal with the Constitutional Court of the Republic of Croatia and its position in the Croatian judicial system, as well as with the specifics of the Constitutional Complaint. The crucial part of the paper is based on research conducted with the aim of determining the implementation, by means of the Constitutional Complaint, of the rights and liberties guaranteed by Articles 54 and 55 of the Constitution of the Republic of Croatia.
Abstract: This paper introduces a process for the module-level integration of computer-based systems. It is based on the Six Sigma Process Improvement Model, where the goal of the process is to improve the overall quality of the system under development. We also present a conceptual framework that shows how this process can be implemented as an integration solution. Finally, we provide a partial implementation of key components in the conceptual framework.
Abstract: In inpatient care, the present situation is
characterized by an intense influx of medical technology into the
clinical daily routine and an ever stronger integration of special
techniques into the clinical workflow. Medical technology is by now
an integral part of health care conforming to generally accepted
standards. Its purchase and operation represent an important
economic position, and both are the subject of everyday
optimisation attempts. For this purpose a huge number of tools
now exist, yet their comprehensive implementation tends to add to
the complexity of the problem. In this paper the advantages of
an integrative information workflow for life-cycle management in
the field of medical technology are shown.
Abstract: This paper describes a simple implementation of the
homotopy (also called continuation) algorithm for determining the resistance a resistor must have in order to dissipate energy at a specified rate in an electric circuit. The homotopy algorithm can be considered a development of classical methods in numerical computing such as the Newton-Raphson and fixed-point
methods. In homotopy methods, an embedding
parameter is used to control the convergence. The method proposed in this work utilizes a special homotopy called the Newton homotopy. A numerical example solved in MATLAB is given to show the effectiveness of the proposed method.
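The Newton homotopy named above can be sketched in a few lines. The following is a generic illustration, not the paper's MATLAB code; the circuit (a 12 V source with internal resistance r, and a load R to be chosen so that the load dissipates P watts) is a hypothetical example.

```python
def newton_homotopy(f, df, x0, steps=20, newton_iters=5):
    """Track the root of H(x, lam) = f(x) - (1 - lam) * f(x0) as the
    embedding parameter lam moves from 0 to 1; at lam = 1, H == f."""
    fx0 = f(x0)
    x = x0
    for k in range(1, steps + 1):
        lam = k / steps
        for _ in range(newton_iters):
            h = f(x) - (1 - lam) * fx0   # H(x, lam)
            x -= h / df(x)               # Newton correction on H
    return x

# Hypothetical circuit: source V with internal resistance r; find the
# load resistance R at which the load dissipates P watts,
# i.e. solve f(R) = V^2 * R / (R + r)^2 - P = 0.
V, r, P = 12.0, 2.0, 10.0
f  = lambda R: V**2 * R / (R + r)**2 - P
df = lambda R: V**2 * (r - R) / (R + r)**3
R = newton_homotopy(f, df, x0=1.0)   # converges to the root R = 0.4 ohm
```

Because the residual is removed gradually (a fraction of f(x0) per step), each Newton correction starts close to the current root, which is what makes continuation more robust than plain Newton-Raphson from a poor initial guess.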
Abstract: Web usage mining has become a popular research
area, as a huge amount of data is available online. These data can be
used for several purposes, such as web personalization, web structure
enhancement, web navigation prediction etc. However, the raw log
files are not directly usable; they have to be preprocessed in order to
transform them into a suitable format for different data mining tasks.
One of the key issues in the preprocessing phase is to identify web
users. Identifying users based on web log files is not a
straightforward problem, thus various methods have been developed.
There are several difficulties that have to be overcome, such as client
side caching, changing and shared IP addresses and so on. This paper
presents three different methods for identifying web users. Two of
them are the most commonly used methods in web log mining
systems, whereas the third on is our novel approach that uses a
complex cookie-based method to identify web users. Furthermore we
also take steps towards identifying the individuals behind the
impersonal web users. To demonstrate the efficiency of the new
method we developed an implementation called Web Activity
Tracking (WAT) system that aims at a more precise distinction of
web users based on log data. We present some statistical analysis
created by the WAT on real data about the behavior of the Hungarian
web users and a comprehensive analysis and comparison of the three
methods.
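The contrast between the classic heuristics and a cookie-based approach can be sketched as below. This is a minimal illustration of the general idea, not the WAT system; the record fields are assumed for the example.

```python
def identify_users(log_records):
    """Group web-log records into users. Records carrying a persistent
    cookie ID are keyed on it; cookie-less records fall back to the
    classic (IP address, user agent) heuristic."""
    users = {}
    for rec in log_records:
        key = rec["cookie_id"] or (rec["ip"], rec["agent"])
        users.setdefault(key, []).append(rec["url"])
    return users

# Hypothetical log: one visitor keeps cookie "c1" across two IPs
# (e.g. a changing address), another cookie-less visitor is
# identified by the IP + user-agent pair.
logs = [
    {"ip": "1.1.1.1", "agent": "A", "url": "/a", "cookie_id": "c1"},
    {"ip": "2.2.2.2", "agent": "A", "url": "/b", "cookie_id": "c1"},
    {"ip": "3.3.3.3", "agent": "B", "url": "/c", "cookie_id": None},
    {"ip": "3.3.3.3", "agent": "B", "url": "/d", "cookie_id": None},
]
users = identify_users(logs)   # two users despite three distinct IPs
```

The cookie key survives the IP change that would split the first visitor into two users under the IP-based heuristic, which is exactly the failure mode (shared and changing addresses) the abstract mentions.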
Abstract: Parsing is important in Linguistics and Natural
Language Processing to understand the syntax and semantics of a
natural language grammar. Parsing natural language text is
challenging because of problems like ambiguity and inefficiency.
Also, the interpretation of natural language text depends on context-based
techniques. A probabilistic component is essential to resolve
ambiguity in both syntax and semantics thereby increasing accuracy
and efficiency of the parser. The Tamil language has some inherent
features which make parsing more challenging. To obtain solutions,
a lexicalized and statistical approach is to be applied in the parsing
with the aid of a language model. Statistical models mainly focus on
the semantics of the language and are suitable for large-vocabulary
tasks, whereas structural methods focus on syntax and model
small-vocabulary tasks. A statistical trigram language model
for Tamil with a medium vocabulary of 5000 words has
been built. Though statistical parsing gives better performance
through trigram probabilities and a large vocabulary, it has some
disadvantages: a focus on semantics rather than syntax, and a lack of
support for free word order and long-term relationships. To
overcome these disadvantages, a structural component is
incorporated into the statistical language model, leading to the
implementation of a hybrid language model. This paper attempts
to build a phrase-structured hybrid language model which resolves the
above-mentioned disadvantages. In developing the hybrid
language model, a new part-of-speech tag set for Tamil has
been developed with more than 500 tags and wider
coverage. A phrase-structured treebank has been developed with 326
Tamil sentences covering more than 5000 words. The hybrid
language model has been trained on the phrase-structured treebank
using the immediate-head parsing technique. A lexicalized and statistical
parser that employs this hybrid language model and immediate-head
parsing gives better results than pure grammar- and
trigram-based models.
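The statistical component described above rests on trigram probability estimates. A minimal sketch of the generic trigram estimate (with simple add-alpha smoothing, assumed here for illustration; the paper does not specify its smoothing scheme) is:

```python
from collections import Counter

def train_trigram(sentences):
    """Count trigrams and their bigram contexts over sentences padded
    with start/end markers."""
    tri, bi = Counter(), Counter()
    for s in sentences:
        toks = ["<s>", "<s>"] + s + ["</s>"]
        for i in range(2, len(toks)):
            tri[(toks[i - 2], toks[i - 1], toks[i])] += 1
            bi[(toks[i - 2], toks[i - 1])] += 1
    return tri, bi

def p_trigram(tri, bi, w1, w2, w3, vocab_size, alpha=1.0):
    """Add-alpha smoothed estimate of P(w3 | w1, w2)."""
    return (tri[(w1, w2, w3)] + alpha) / (bi[(w1, w2)] + alpha * vocab_size)

# Toy corpus: the context ("a", "b") is seen twice, once followed by "c".
tri, bi = train_trigram([["a", "b", "c"], ["a", "b", "d"]])
p = p_trigram(tri, bi, "a", "b", "c", vocab_size=5)   # (1+1)/(2+5)
```

A hybrid model, as the abstract describes, would combine such n-gram probabilities with phrase-structure rules learned from the treebank rather than use them alone.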
Abstract: The study explored the question "Who am I?" as a (re)construction of cultural identity by delving into globalization, communication, and social change in Malta during the historical moment when Malta became a European Union Member State. Three objectives guided this qualitative study. Firstly, the study reviewed European Union (EU) policies that regulate broadcasting and their implementation in Member States while meeting the challenges of globalization and new media technology. Secondly, the research investigated the changes in the media landscape via organizational structures, programs, and television (TV) content. Finally, the study explored the impact of these transformations on the way the Maltese live as they (re)construct their cultural identity. Despite the choices available to the Maltese audience, old local traditions and new foreign customs coexist as informants continue to (re)construct their cultural identity and define who they are.
Abstract: One of the most important aspects expected from ERP systems is the integration of the various operations of the administrative, financial, commercial, human resources, and production departments of the consumer organization. It is also often necessary to integrate the new ERP system with the organization's legacy systems when implementing the ERP package. Without relying on an appropriate software architecture to realize the required integration, ERP implementation processes become error-prone and time-consuming; in some cases, the ERP implementation may even encounter serious risks. In this paper, we propose a new architecture that is based on the agent-oriented vision and supplies the integration expected from ERP systems using several independent but cooperating agents. Besides integration, which is the main issue of this paper, the presented architecture also addresses some aspects of intelligence and learning capabilities existing in ERP systems.
Abstract: ISO 9000 is the most popular and widely adopted meta-standard for quality and operational improvements. However, only limited empirical research has been conducted to examine the impact of ISO 9000 on operational performance based on objective and longitudinal data. To reveal any causal relationship between the adoption of ISO 9000 and operational performance, we examined the timing and magnitude of the change in time-based performance as a result of ISO 9000 adoption. We analyzed the changes in operating cycle, inventory days, and accounts receivable days prior to and after the implementation of ISO 9000 in 695 publicly listed manufacturing firms. We found that ISO 9000 certified firms shortened their operating cycle time by 5.28 days one year after the implementation of ISO 9000. In the long run (3 years after certification), certified firms showed continuous improvement in time-based efficiency and experienced an operating cycle time 11 days shorter than that of non-certified firms. There was an average 6.5% improvement in operating cycle time for ISO 9000 certified firms. Both inventory days and accounts receivable days showed similar significant improvements after the implementation of ISO 9000.
Abstract: Grid environments aggregate geographically
distributed resources. Grids come in three types:
computational, data, and storage. This paper presents research on
data grids. A data grid is used to cover and secure accessibility to
data from many heterogeneous sources. Users need not worry
about where the data are located, provided they can get
access to them. Metadata are used to get access to data in a data
grid. At present, application metadata catalogues and the SRB middleware
package are used in data grids for the management of metadata. In this
paper, the possibility of updating, streamlining, and searching
simultaneously and rapidly is provided through a classified table
preserving metadata and the conversion of each table into numerous tables.
Meanwhile, with regard to the specific application, the most
appropriate division is set and determined. Concurrent execution
of some requests and pipelined execution become
possible as a result of this technique.
Abstract: This paper studies the possibility of successfully
implementing the hollow-roller concept in order to minimize the inertial
mass of large bearings, with major results in reduced
material consumption, increased power efficiency (in the wind power
station area), increased durability and service life of large
bearing systems, noise reduction in operation, resistance to
vibrations, an important reduction of losses by abrasion, and a
reduction of the working temperature. For this purpose an original
solution was developed through which the mass, inertial forces,
and moments of large bearings are reduced by using hollow rollers. The
research was carried out using the finite element analysis method
in SolidWorks - Nastran software. The possibility of rapidly
changing the manufacturing system of solid and hollow cylindrical
rollers is also studied.
Abstract: We report on the development of a model to
understand why the range of experience with respect to HIV
infection is so diverse, especially with respect to the latency period.
To investigate this, an agent-based approach is used to extract high-level
behaviour which cannot be described analytically from the set
of interaction rules at the cellular level. A network of independent
matrices mimics the chain of lymph nodes. Dealing with massively
multi-agent systems requires major computational effort. However,
parallelisation methods are a natural consequence and advantage of
the multi-agent approach and, using the MPI library, are here
implemented, tested and optimized. Our current focus is on the
various implementations of the data transfer across the network.
Three communications strategies are proposed and tested, showing
that the most efficient approach is communication based on the
natural lymph-network connectivity.
Abstract: Nature conducts its actions in a very private manner.
Classical science has made great efforts to reveal these actions, but
it can experiment only with things that can be seen
with the eyes. Beyond the scope of classical science, quantum science
works very well. It is based on postulates such as the qubit,
superposition of two states, entanglement, measurement, and
evolution of states, which are briefly described in the present paper.
One application of quantum computing, the
implementation of a novel quantum evolutionary algorithm (QEA) to
automate the timetabling problem of the Dayalbagh Educational Institute
(Deemed University), is also presented in this paper. Making a good
timetable is a scheduling problem. It is NP-hard, multi-constrained,
complex and a combinatorial optimization problem. The solution of
this problem cannot be obtained in polynomial time. The QEA uses
genetic operators on the Q-bits as well as a quantum-gate update
operator, which is introduced as a variation operator, to converge
toward better solutions.
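The Q-bit representation and the rotation-gate variation operator mentioned above can be sketched generically as follows. This is a textbook-style illustration of the QEA machinery, not the timetabling implementation; the rotation angle and update policy are simplified assumptions.

```python
import math, random

def observe(qbits):
    """Collapse each Q-bit (alpha, beta) into a classical bit:
    1 with probability beta**2, else 0."""
    return [1 if random.random() < b * b else 0 for _, b in qbits]

def rotate(qbits, solution, best, dtheta=0.05 * math.pi):
    """Quantum-gate update: rotate each Q-bit so the probability of
    observing the corresponding bit of the best solution grows."""
    out = []
    for (a, b), s, t in zip(qbits, solution, best):
        if s == t:
            theta = 0.0                         # already agrees
        else:
            theta = dtheta if t == 1 else -dtheta
        na = a * math.cos(theta) - b * math.sin(theta)
        nb = a * math.sin(theta) + b * math.cos(theta)
        out.append((na, nb))
    return out

# One individual: n Q-bits in uniform superposition (alpha = beta),
# rotated once toward a hypothetical all-ones best solution.
n = 8
qbits = [(1 / math.sqrt(2), 1 / math.sqrt(2))] * n
qbits = rotate(qbits, [0] * n, [1] * n)
```

Each rotation preserves the normalization alpha^2 + beta^2 = 1 while biasing future observations toward the best solution found so far, which is the "variation operator" role the abstract describes.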
Abstract: This paper proposes a low-cost reconfigurable
architecture for the AES algorithm. The proposed architecture separates
SubBytes and MixColumns into two parallel data paths, and supports
different bit-width operations for the two paths. As a result, different numbers of S-boxes can be supported in this architecture. The
throughput and power consumption can be adjusted by changing the
number of S-boxes running in the design. Using the TSMC 0.18 μm CMOS standard cell library, a very low-cost implementation of 7K
gates is obtained at a frequency of 182 MHz. The maximum throughput is 360 Mbps when using 4 S-boxes simultaneously, and the
minimum throughput is 114 Mbps when using only 1 S-box.
Abstract: System-level design based on high-level abstractions
is becoming increasingly important in hardware and embedded
system design. This paper analyzes meta-design techniques oriented
at developing meta-programs and meta-models for well-understood
domains. Meta-design techniques include meta-programming and
meta-modeling. At the programming level of the design process, meta-design
means developing generic components that are usable in a
wider context of application than original domain components. At the
modeling level, meta-design means developing design patterns that
describe general solutions to the common recurring design problems,
and meta-models that describe the relationship between different
types of design models and abstractions. The paper describes and
evaluates the implementation of meta-design in hardware design
domain using object-oriented and meta-programming techniques.
The presented ideas are illustrated with a case study.
Abstract: This paper presents a new approach for busbar protection with stable operation during current transformer saturation, using neuro-fuzzy and symmetrical-components theory. This technique uses symmetrical components of the current signals to learn the hidden relationships existing in the input patterns. Simulation studies are performed and the influence of changing system parameters, such as fault inception and source impedance, is studied. Details of the design procedure and the results of performance studies with the proposed relay are given in the paper. An analysis of the performance of the proposed technique during CT saturation conditions is presented. The performance of the technique was investigated for a variety of operating conditions and for several busbar configurations. Data generated by EMTDC simulations of model power systems were used in the investigations. The results indicate that the proposed technique is stable during CT saturation conditions.
Abstract: This paper proposes a novel model for short-term load
forecasting (STLF) in the electricity market. The prior electricity
demand data are treated as time series. The model is composed of
several neural networks whose data are processed using a wavelet
technique, and it is created in the form of a simulation program
written in MATLAB.
They are decomposed into several wavelet coefficient series using
the wavelet transform technique known as Non-decimated Wavelet
Transform (NWT). The reason for using this technique is the belief
in the possibility of extracting hidden patterns from the time series
data. The wavelet coefficient series are used to train the neural
networks (NNs) and used as the inputs to the NNs for electricity load
prediction. The Scaled Conjugate Gradient (SCG) algorithm is used as
the learning algorithm for the NNs. To get the final forecast data, the
outputs from the NNs are recombined using the same wavelet
technique. The model was evaluated with the electricity load data of
Electronic Engineering Department in Mandalay Technological
University in Myanmar. The simulation results showed that the
model was capable of producing a reasonable forecasting accuracy in
STLF.
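The decompose-then-recombine idea above can be illustrated with the simplest wavelet, the Haar transform. The paper uses the non-decimated wavelet transform; the decimated one-level Haar step below is only an assumed minimal stand-in showing how a load series splits into coefficient series and is exactly recombined.

```python
def haar_step(x):
    """One level of the (decimated) Haar transform: pairwise averages
    (approximation series) and pairwise differences (detail series)."""
    a = [(x[i] + x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]
    d = [(x[i] - x[i + 1]) / 2 for i in range(0, len(x) - 1, 2)]
    return a, d

def haar_inverse(a, d):
    """Recombine the coefficient series into the original signal."""
    x = []
    for ai, di in zip(a, d):
        x += [ai + di, ai - di]
    return x

load = [4.0, 2.0, 6.0, 8.0]            # toy load series
approx, detail = haar_step(load)       # smooth trend vs. fluctuations
restored = haar_inverse(approx, detail)
```

In the forecasting model, each coefficient series would feed its own neural network, and the per-series forecasts would be recombined with the inverse transform, as the abstract describes.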
Abstract: Some meta-schedulers query the information system of individual supercomputers in order to submit jobs to the least busy supercomputer on a computational Grid. However, this information can become outdated by the time a job starts due to changes in scheduling priorities. The MSR scheme is based on Multiple Simultaneous Requests and can take advantage of opportunities resulting from these priority changes. This paper presents the SWARM meta-scheduler, which can speed up the execution of large sets of tasks by minimizing job queuing time through the submission of multiple requests. Performance tests have shown that this new meta-scheduler is faster than an implementation of the MSR scheme and the gLite meta-scheduler. SWARM has been used through the GridQTL project beta-testing portal during the past year. Statistics are provided for this usage and demonstrate its capacity to reliably achieve a substantial reduction of the execution time in production conditions.
Abstract: Groups where the discrete logarithm problem (DLP) is believed to be intractable have proved to be invaluable building blocks for cryptographic applications. They are at the heart of numerous protocols such as key agreements, public-key cryptosystems, digital signatures, identification schemes, publicly verifiable secret sharings, hash functions and bit commitments. The search for new groups with intractable DLP is therefore of great importance. The goal of this article is to study elliptic curves over the ring Fq[ε], with Fq a finite field of order q and with the relation ε^n = 0, n ≥ 3. The motivation for this work came from the observation of several practical discrete logarithm-based cryptosystems, such as ElGamal and the Elliptic Curve Cryptosystems. First, we describe these curves defined over a ring. Then, we study their algorithmic properties by proposing effective implementations for representing the elements and the group law. In another article we study their cryptographic properties, an attack on the elliptic discrete logarithm problem, and a new cryptosystem over these curves.
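The base ring Fq[ε] with ε^n = 0 admits a very direct representation that is worth sketching. The following is an assumed illustration of arithmetic in such a truncated polynomial ring, not the article's implementation of the curve group law.

```python
def ring_mul(a, b, q, n):
    """Multiply two elements of F_q[eps]/(eps^n), each given as a
    coefficient list [a0, a1, ..., a_{n-1}] representing
    a0 + a1*eps + ... ; products of degree >= n vanish since eps^n = 0."""
    c = [0] * n
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            if i + j < n:                       # eps^(i+j) survives
                c[i + j] = (c[i + j] + ai * bj) % q
    return c

# (1 + 2*eps) * (3 + eps) over F_7 with eps^3 = 0:
# 3 + 7*eps + 2*eps^2  ==  3 + 0*eps + 2*eps^2 (mod 7)
prod = ring_mul([1, 2, 0], [3, 1, 0], q=7, n=3)
```

Because ε is nilpotent the ring has zero divisors, which is what makes the group of points of a curve over it structurally different from the usual finite-field case.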