Abstract: The problem addressed herein is the efficient management of the intense Grid/Cluster computation involved when the preconditioned Bi-CGSTAB Krylov method is employed for the iterative solution of the large and sparse linear system arising from the discretization of the Modified Helmholtz-Dirichlet problem by the Hermite Collocation method. Taking advantage of the Collocation matrix's red-black ordered structure, we organize the whole computation efficiently and map it onto a pipeline architecture with master-slave communication. The implementation, using MPI programming tools, is realized on a SUN V240 cluster interconnected through 100 Mbps and 1 Gbps Ethernet networks, and its performance is presented through the speedup measurements included.
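Since the abstract only names the solver, a minimal sketch may help: ILU-preconditioned Bi-CGSTAB applied to a sparse system with SciPy. The matrix below is a stand-in (a 5-point finite-difference Modified Helmholtz operator), not the paper's Hermite Collocation matrix, and all sizes and tolerances are illustrative assumptions.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Stand-in system: 5-point finite-difference discretization of the Modified
# Helmholtz operator -Laplacian(u) + k^2 u on a unit square with Dirichlet
# boundary conditions (the paper's actual matrix comes from Hermite
# Collocation and has a different, red-black ordered structure).
n, k2 = 40, 4.0
I = sp.identity(n)
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
A = ((sp.kron(I, T) + sp.kron(T, I)) * (n + 1) ** 2
     + k2 * sp.identity(n * n)).tocsc()
b = np.ones(n * n)

# ILU preconditioner wrapped as a LinearOperator for Bi-CGSTAB.
ilu = spla.spilu(A, drop_tol=1e-4)
M = spla.LinearOperator(A.shape, ilu.solve)

x, info = spla.bicgstab(A, b, M=M)   # info == 0 signals convergence
print(info, np.linalg.norm(A @ x - b) / np.linalg.norm(b))
```

The paper's contribution is the parallel, pipelined organization of exactly this kind of iteration; the serial sketch above only fixes the ideas of the solver and the preconditioner.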
Abstract: Today's technology is heavily dependent on web applications, which users are adopting at a very rapid pace. They have made our work efficient, and include webmail, online retail, online gaming, wikis, train and flight departure and arrival boards, and many more; the list is very long. They are developed in languages such as PHP, Python, C# and ASP.NET, using scripts such as HTML and JavaScript. Attackers develop tools and techniques to exploit web applications and legitimate websites. This has led to the rise of web application security, which can be broadly classified into Declarative Security and Program Security. The most common attacks on web applications are SQL Injection and XSS, which give access to unauthorized users who can totally damage or destroy the system. This paper presents a detailed literature description and analysis of web application security, examples of attacks, and steps to mitigate the vulnerabilities.
Abstract: Computers are increasingly being used as educational
tools in elementary/primary schools worldwide. A specific
application of such computer use is that of multimedia games, where the aim is to combine pedagogy and entertainment. This study reports on a case study in which an educational multimedia game was developed for use by elementary school children. The stages of the application's design, implementation and evaluation are presented. Strengths of the game are identified and discussed, and its weaknesses are identified, allowing for suggestions for future redesigns.
The results show that the use of games can engage children
in the learning process for longer periods of time with the added
benefit of the entertainment factor.
Abstract: Assessment of IEP (Individual Education Plan) is an
important stage in the area of special education. This paper addresses this problem by introducing computer software that processes the data gathered from the application of IEPs. The software is intended to be used by special education institutions in Turkey and allows assessment of school and family training. The software has a user-friendly interface, and its design includes graphical developer tools.
Abstract: CIM is the standard formalism for modeling management
information developed by the Distributed Management Task
Force (DMTF) in the context of its WBEM proposal, designed to
provide a conceptual view of the managed environment. In this
paper, we propose the inclusion of formal knowledge representation
techniques, based on Description Logics (DLs) and the Web Ontology
Language (OWL), in CIM-based conceptual modeling, and then we
examine the benefits of such a decision. The proposal is specified
as a CIM metamodel level mapping to a highly expressive subset
of DLs capable of capturing all the semantics of the models. The
paper shows how the proposed mapping provides CIM diagrams with
precise semantics and can be used for automatic reasoning about the
management information models, as a design aid, by means of new-generation CASE tools, thanks to the use of state-of-the-art automatic
reasoning systems that support the proposed logic and use algorithms
that are sound and complete with respect to the semantics. Such a
CASE tool framework has been developed by the authors and its
architecture is also introduced. The proposed formalization is not
only useful at design time, but also at run time through the use of
rational autonomous agents, in response to a need recently recognized
by the DMTF.
Abstract: When the electromagnetic environment is analyzed using deterministic mathematical models, it is impossible to handle a large number of interacting network stations with a priori unknown parameters, a situation characteristic, for example, of mobile wireless communication networks. One of the tasks of the tools used in the design, planning and optimization of mobile wireless networks is to carry out simulation of the electromagnetic
environment based on mathematical modelling methods, including
computer experiment, and to estimate its effect on radio
communication devices. This paper proposes the development of a
statistical model of electromagnetic environment of a mobile
wireless communication network by describing the parameters and
factors affecting it including the propagation channel and their
statistical models.
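As a concrete illustration of the kind of statistical channel model the abstract refers to, a minimal sketch of log-distance path loss with log-normal shadowing is given below; all parameter values are assumptions for the example, not taken from the paper.

```python
import numpy as np

# Illustrative statistical propagation-channel model: log-distance path
# loss with log-normal shadowing.  The numeric values are assumptions
# chosen for the example, not parameters from the paper.
PL0 = 40.0    # path loss (dB) at the reference distance d0 = 1 m
n_exp = 3.5   # path-loss exponent, e.g. for a dense urban mobile channel
sigma = 8.0   # shadowing standard deviation (dB)

rng = np.random.default_rng(0)

def path_loss_db(d, samples=1):
    """Draw random path-loss samples (dB) at distance d metres."""
    shadowing = rng.normal(0.0, sigma, size=samples)
    return PL0 + 10.0 * n_exp * np.log10(d) + shadowing

# A Monte Carlo "computer experiment" at 200 m: the sample mean approaches
# the deterministic term PL0 + 10*n_exp*log10(d) ~ 120.5 dB.
losses = path_loss_db(200.0, samples=10_000)
print(round(losses.mean(), 1))
```

Repeating such draws over many stations is the essence of the statistical (rather than deterministic) treatment the abstract advocates.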
Abstract: This paper discusses the use of explorative data
mining tools that allow the educator to explore new relationships
between reported learning experiences and actual activities,
even if there are multiple dimensions with a large number
of measured items. The underlying technology is based on
the so-called Compendium Platform for Reproducible Computing
(http://www.freestatistics.org), which was built on top of the computational
R Framework (http://www.wessa.net).
Abstract: Latvia is fourth in the world in terms of broadband internet speed. The total number of internet users in Latvia exceeds 70% of its population. The number of active mailboxes of the local internet e-mail service Inbox.lv accounts for 68% of the population and 97.6% of the total number of internet users. The Latvian portal Draugiem.lv is a social media phenomenon: 58.4% of the population and 83.5% of internet users use it. A majority of Latvian companies have profiles on social networks, the most popular being Twitter.com. These and other figures prove that consumers and companies are actively using the Internet.
However, after analyzing in a number of studies how enterprises employ the e-environment, namely e-environment tools, the authors arrived at conclusions that are not as flattering as the aforementioned statistics. There is an obvious contradiction between the statistical data and the actual studies. As a result, the authors have posed a question: Why are entrepreneurs resistant to e-tools? To answer this question, the authors have turned to the Technology Acceptance Model (TAM). The authors analyzed each phase and identified several factors affecting the use of the e-environment, reaching the main conclusion that entrepreneurs do not have a sufficient level of e-literacy (digital literacy).
The authors employ well-established quantitative and qualitative research methods: grouping, analysis, statistical methods, factor analysis in the SPSS 20 environment, etc.
The theoretical and methodological background of the research is formed by scientific research and publications, including those from the mass media and professional literature, statistical information from legal institutions, as well as information collected by the authors during the survey.
Abstract: Today's business environment requires that companies have access to highly relevant information in a matter of seconds.
Modern Business Intelligence tools rely on data structured mostly in traditional dimensional database schemas, typically represented by
star schemas. Dimensional modeling is already recognized as a
leading industry standard in the field of data warehousing although
several drawbacks and pitfalls have been reported. This paper focuses on the analysis of another data warehouse modeling technique, anchor modeling, and compares its characteristics with the standardized dimensional modeling technique from a query performance perspective. The analysis reports the performance of queries executed on database schemas structured according to the principles of each modeling technique.
Abstract: The need for multilingual communication in Japan has
increased due to an increase in the number of foreigners in the
country. When people communicate in their nonnative language,
the differences in language prevent mutual understanding among
the communicating individuals. In the medical field, communication
between the hospital staff and patients is a serious problem. Currently,
medical translators accompany patients to medical care facilities, and
the demand for medical translators is increasing. However, medical
translators cannot necessarily provide support, especially in cases in
which round-the-clock support is required or in case of emergencies.
The medical field has high expectations of information technology.
Hence, a system that supports accurate multilingual communication is
required. Despite recent advances in machine translation technology,
it is very difficult to obtain highly accurate translations. We have
developed a support system called M3 for multilingual medical
reception. M3 provides support functions that aid foreign patients in
the following respects: conversation, questionnaires, reception procedures,
and hospital navigation; it also has a Q&A function. Users
can operate M3 using a touch screen and receive text-based support.
In addition, M3 uses accurate translation tools called parallel texts
to facilitate reliable communication through conversations between
the hospital staff and the patients. However, if there is no parallel
text that expresses what users want to communicate, the users cannot
communicate. In this study, we have developed a circulating support
environment for multilingual medical communication using parallel
texts. The proposed environment can circulate necessary parallel texts
through the following procedure: (1) a user provides feedback about
the necessary parallel texts, following which (2) these parallel texts
are created and evaluated.
Abstract: This paper deals with the testing of ceramic cutting tools under interrupted machining. Tests are carried out on a fixture, an interrupted-cut simulator, which has four mouldings on its circumference so that the cutting edge is subjected to shocks during each revolution. The tool-wear criterion is destruction of the cutting tool or 6000 shocks. The tested cutting tool materials are Sandvik Coromant grades 6190, 620, 650 and 670. The machined material was steel 15 128 (13MoCrV6). Cutting speed (408 m.min-1 and 580 m.min-1) and cutting feed (0.15 mm, 0.2 mm, 0.25 mm and 0.3 mm) were the variable parameters, while the depth of cut was held constant.
Abstract: The network traffic data provided for intrusion detection design are typically large, contain much ineffective information, and offer only limited and ambiguous information about users' activities. We study these problems and propose a two-phase approach in our intrusion detection design. In the first phase, we develop a
correlation-based feature selection algorithm to remove the worthless
information from the original high dimensional database. Next, we
design an intrusion detection method to solve the problems of
uncertainty caused by limited and ambiguous information. In the
experiments, we choose six UCI databases and DARPA KDD99
intrusion detection data set as our evaluation tools. Empirical studies
indicate that our feature selection algorithm is capable of reducing the
size of data set. Our intrusion detection method achieves a better
performance than those of participating intrusion detectors.
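The abstract does not give the algorithm's details; as a minimal illustration of the general idea behind correlation-based feature selection, the sketch below keeps only features correlated with the class label (full CFS-style methods also penalize redundancy among the selected features). The data and threshold are invented for the example.

```python
import numpy as np

def select_features(X, y, threshold=0.1):
    """Keep the indices of features whose absolute Pearson correlation
    with the label exceeds `threshold`.  This is a simplified stand-in
    for the paper's correlation-based feature selection."""
    selected = []
    for j in range(X.shape[1]):
        r = np.corrcoef(X[:, j], y)[0, 1]
        if abs(r) > threshold:
            selected.append(j)
    return selected

# Toy data: feature 0 tracks the label, feature 1 is pure noise.
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=500).astype(float)
X = np.column_stack([y + 0.3 * rng.normal(size=500),
                     rng.normal(size=500)])
print(select_features(X, y, threshold=0.3))
```

Dropping the noise feature before training is exactly the dimensionality reduction the first phase of the proposed design performs on the original high-dimensional database.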
Abstract: To fight against the economic crisis, French
Government, like many others in Europe, has decided to give a boost
to high-speed line projects. This paper explores the implementation
and decision-making process in TGV projects, their evolutions,
especially since the Mediterranean TGV-line. This project was
probably the most controversial, but paradoxically represents today a
huge success for all the actors involved.
What lessons can we learn from this experience? How can we evaluate the impact of this project on TGV-line planning? How can we characterize this implementation and decision-making process with regard to the sustainability challenges?
The construction of the Mediterranean TGV-line was the occasion for several innovations: introducing more dialog into the decision-making process, taking the environment into account, and introducing new project management and technological innovations. That is why this project appears today as an example in terms of integration of sustainable development.
In this paper we examine the different kinds of innovations
developed in this project, by using concepts from sociology of
innovation to understand how these solutions emerged in a
controversial situation. Then we analyze the lessons drawn from this decision-making process (both immediately and a posteriori) and the way in which procedures evolved: the creation of new tools and devices (public consultation, project management...).
Finally we try to highlight the impact of this evolution on TGV
projects governance. In particular, new methods of implementation
and financing involve a reconfiguration of the system of actors. The
aim of this paper is to define the impact of this reconfiguration on
negotiations between stakeholders.
Abstract: This paper presents the development of analysis tools
for the Home Agriculture project. The tools are required for monitoring the condition of a greenhouse and involve two components: measurement hardware and a data analysis engine. The measurement hardware measures environmental parameters such as temperature, humidity, air quality and dust, while the analysis tool is used to analyse and interpret the integrated data against the condition of the weather, quality of health, irradiance, quality of the soil, and so on. The current version of the tools is complete for an off-line data recording technique: the data are saved on an MMC card and transferred via ZigBee to the Environment Data Manager (EDM) for analysis. The EDM converts the raw data and plots three combined graphs. The tools have been applied in monitoring three months of irradiance, temperature and humidity measurements in the greenhouse.
Abstract: The objective of this research is to study plant layout
of iron manufacturing based on the systematic layout planning
pattern theory (SLP) for increased productivity. In this case study, the amounts of equipment and tools used in iron production are studied. A detailed study of the plant layout, including the operation process chart, the flow of material and the activity relationship chart, has been carried out. A new plant layout has been designed and compared with the present one. The SLP method showed that the new plant layout significantly decreases the distance of material flow from the billet cutting process to storage in the warehouse.
Abstract: Timing driven physical design, synthesis, and
optimization tools need efficient closed-form delay models for
estimating the delay associated with each net in an integrated circuit
(IC) design. The total number of nets in a modern IC design has
increased dramatically, exceeding millions. Therefore, efficient modeling of interconnections is needed for high-speed ICs. This paper presents closed-form expressions for RC and RLC interconnection trees in current-mode signaling, which can be implemented in VLSI design tools. These analytical model expressions can be used for accurate calculation of delay after the design clock tree has been laid out and the design is fully routed.
Evaluation of these analytical models is several orders of magnitude
faster than simulation using SPICE.
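The paper's closed-form current-mode expressions are not reproduced in the abstract; as a generic illustration of a closed-form RC-tree delay estimator of the same flavor, here is a minimal Elmore-delay computation (a classic voltage-mode model, used purely as an example). The tree encoding and values are invented.

```python
# Minimal Elmore-delay estimator for an RC tree: the delay to a target
# node is the sum over all nodes k of R(shared path) * C_k, where the
# shared path is the part of the root-to-k path common with the
# root-to-target path.  This classic closed-form model is only an
# illustration; the paper derives its own current-mode expressions.

def elmore_delay(tree, target):
    """tree: {node: (parent, R_of_edge_to_parent, C_node)};
    the root is the node whose parent is None."""
    def path(node):  # set of nodes on the root-to-`node` path
        p = []
        while node is not None:
            p.append(node)
            node = tree[node][0]
        return set(p)

    target_path = path(target)
    delay = 0.0
    for k, (_, _, Ck) in tree.items():
        shared = path(k) & target_path
        # sum the edge resistances of the shared path (root has no edge)
        Rshared = sum(tree[n][1] for n in shared if tree[n][0] is not None)
        delay += Rshared * Ck
    return delay

# Two-segment chain: R1=100 ohm, C1=1 pF; R2=200 ohm, C2=2 pF.
tree = {"root": (None, 0.0, 0.0),
        "n1": ("root", 100.0, 1e-12),
        "n2": ("n1", 200.0, 2e-12)}
print(elmore_delay(tree, "n2"))  # 100*1e-12 + (100+200)*2e-12 = 7e-10 s
```

Evaluating such a sum is O(nodes^2) at worst, which is why closed-form estimators of this kind are orders of magnitude faster than SPICE simulation, as the abstract notes.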
Abstract: SQL injection on web applications is a very popular
kind of attack. There are mechanisms such as intrusion detection
systems in order to detect this attack. These strategies often rely on
techniques implemented at high layers of the application but do not
consider the low level of system calls. The problem of only
considering the high level perspective is that an attacker can
circumvent the detection tools using certain techniques such as URL
encoding. One technique currently used for detecting low-level
attacks on privileged processes is the tracing of system calls. System
calls act as a single gate to the Operating System (OS) kernel; they
allow catching the critical data at an appropriate level of detail. Our
basic assumption is that any type of application, be it a system
service, utility program or Web application, “speaks” the language of
system calls when having a conversation with the OS kernel. At this
level we can see the actual attack while it is happening. We conduct
an experiment in order to demonstrate the suitability of system call
analysis for detecting SQL injection. We are able to detect the attack.
Therefore we conclude that system calls are not only powerful in
detecting low-level attacks but that they also enable us to detect high-level attacks such as SQL injection.
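As a toy illustration of the idea of inspecting data at the system-call boundary rather than the HTTP layer, the sketch below scans strace-style `write` lines for an injection pattern; the trace lines, regexes and patterns are invented for the example and are far simpler than a real detector.

```python
import re

# Toy detector over system-call trace lines (e.g. as captured with
# `strace -f -e trace=write,sendto`).  At this level, URL encoding has
# already been undone by the application, so the payload is visible.
INJECTION = re.compile(r"('|%27)\s*(or|union|--)", re.IGNORECASE)

def suspicious(trace_line):
    """Flag a syscall trace line whose string payload looks like SQL
    injection.  The payload is the quoted second argument of the call."""
    m = re.search(r'\w+\(\d+, "(.*)"', trace_line)
    return bool(m and INJECTION.search(m.group(1)))

trace = [
    'write(5, "SELECT * FROM users WHERE id = 7", 32) = 32',
    "write(5, \"SELECT * FROM users WHERE name = '' OR '1'='1'\", 46) = 46",
]
print([suspicious(line) for line in trace])  # [False, True]
```

The second line is caught even though, at the HTTP layer, the same payload could have arrived obfuscated, which is the advantage the abstract claims for system-call-level analysis.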
Abstract: Data mining (DM) is the process of finding and extracting frequent patterns that can describe the data, or predict unknown or future values. These goals are achieved by using various learning algorithms. Each algorithm may produce a mining result completely different from the others, and some algorithms may find millions of patterns. It is thus a difficult job for data analysts to select appropriate models and interpret the discovered knowledge. In this paper, we describe a framework of an intelligent and complete data mining system called SUT-Miner. Our system comprises a full complement of major DM algorithms, pre-DM and post-DM functionalities. It is the post-DM packages that ease DM deployment for business intelligence applications.
Abstract: This paper presents methodologies for developing an
intelligent CAD system assisting in analysis and design of
reconfigurable special machines. It describes a procedure for determining the feasibility of utilizing these machines for a given part and presents a model for developing an intelligent CAD system. The system analyzes the geometrical and topological information of the given part to determine whether the part can be produced by reconfigurable special machines from a technical point of view. The feasibility of the process from an economic point of view is also analyzed. Then the system determines the proper positioning of the part, considering the details of the machining features and operations needed.
This involves determination of operation types, cutting tools and the
number of working stations needed. Upon completion of this stage
the overall layout of the machine and machining equipment required
are determined.
Abstract: As the current business environment demands constant adaptation from companies, planning and strategic management should be an ongoing and natural process in all kinds of organizations. The use of strategic performance management and monitoring tools such as the Balanced Scorecard (BSC) has become popular, even among Small and Medium-sized Enterprises. This paper aims to investigate whether the BSC is being used in monitoring the performance of small businesses, particularly small fuel retail companies, which compete in co-branding; and if not, it aims to
identify its strategic orientation in order to recommend a possible
strategy map for those managers that are willing to adopt this model
as an alternative to traditional ones for organizational performance
evaluation, which often focus only on evaluation of the
organizational financial performance.