Abstract: This paper investigates the implementation of security mechanisms in object-oriented database systems. Formal methods play an essential role in computer security owing to their expressive power and concise syntax and semantics. Both the specification and the implementation of database security are considered here; security is achieved through the development of an efficient implementation of the specification without compromising its originality and expressiveness.
Abstract: The aim of this study was to compare the solubility of selected volatile organic compounds (VOCs) in water and silicone oil using the simple static headspace method. The experimental design allowed equilibrium to be reached within 30–60 minutes. Infinite-dilution activity coefficients and Henry's law constants for various organics representing esters, ketones, alkanes, aromatics, cycloalkanes, and amines were measured at 303 K. The measurements were reproducible, with a relative standard deviation of 1.3×10⁻³ and a coefficient of variation of 1.3. The activity coefficients determined statically using shaker flasks were reasonably comparable to those obtained using the gas-liquid chromatographic technique and those predicted using group contribution methods, mainly UNIFAC. Silicone oil, chemically known as polydimethylsiloxane, was found to be a better absorbent for VOCs than water, which quickly becomes saturated. For example, the infinite-dilution, mole-fraction-based activity coefficients of hexane are 0.503 in silicone oil and 277 000 in water; silicone oil thus gives a superiority factor of 550 696. Henry's law constants and activity coefficients at infinite dilution play a significant role in the design of scrubbers for the abatement of volatile organic compounds from contaminated air streams. This paper presents the phase equilibrium of volatile organic compounds in very dilute aqueous and polymeric solutions, indicating the movement and fate of chemicals in air and solvent. The successful comparison of the results obtained here with those obtained using other methods by the same authors and in the literature indicates that the results are reliable.
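As a quick check of the figures in the abstract, the reported superiority factor is simply the ratio of the two infinite-dilution activity coefficients; a minimal sketch using only the values stated above:

```python
# Infinite-dilution, mole-fraction-based activity coefficients of
# hexane at 303 K, as reported in the abstract (dimensionless).
gamma_silicone_oil = 0.503
gamma_water = 277_000.0

# The lower the activity coefficient, the more soluble the solute, so
# the "superiority factor" of the oil over water is the ratio.
superiority = gamma_water / gamma_silicone_oil
print(round(superiority))  # 550696, matching the reported factor
```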
Abstract: To illustrate the diversity of methods used to extract relevant visual data (where the concept of relevance can be defined differently for different applications), the paper discusses three groups of such methods. They have been selected from a range of alternatives to highlight how hardware and software tools can be used in complementary ways to achieve various functionalities for different specifications of “relevant data”. First, the principles of gated imaging are presented (where relevance is determined by range). The second methodology is intended for intelligent intrusion detection, while the last one is used for content-based image matching and retrieval. All methods have been developed within projects supervised by the author.
Abstract: Text categorization (the assignment of texts in natural language into predefined categories) is an important and extensively studied problem in Machine Learning. Currently, popular techniques developed to deal with this task include many preprocessing and learning algorithms, many of which in turn require tuning nontrivial internal parameters. Although partial studies are available, many authors fail to report values of the parameters they use in their experiments, or reasons why these values were used instead of others. The goal of this work then is to create a more thorough comparison of preprocessing parameters and their mutual influence, and report interesting observations and results.
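To make the kind of preprocessing parameters in question concrete, here is a toy sketch (not the paper's actual setup) of two common ones, a stopword list and a minimum document-frequency cut-off; the parameter values and names are arbitrary illustrations:

```python
from collections import Counter

def build_vocab(docs, min_df=2, stopwords=frozenset({"the", "a", "of"})):
    """Keep only terms that survive stopword removal and occur in at
    least min_df documents; both parameters are illustrative."""
    df = Counter()
    for doc in docs:
        # Count each term at most once per document (document frequency).
        df.update(set(doc.lower().split()) - stopwords)
    return {term for term, n in df.items() if n >= min_df}

docs = ["The cat sat", "the cat ran", "a dog ran"]
print(sorted(build_vocab(docs)))  # ['cat', 'ran']
```

Tuning `min_df` (or the stopword list) changes the feature space every downstream learner sees, which is exactly the kind of mutual influence the comparison targets.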
Abstract: The world economic crises and budget constraints
have caused authorities, especially those in developing countries, to
rationalize water quality monitoring activities. Rationalization
consists of reducing the number of monitoring sites, the number of
samples, and/or the number of water quality variables measured. The
reduction in water quality variables is usually based on correlation. If
two variables exhibit high correlation, it is an indication that some of
the information produced may be redundant. Consequently, one
variable can be discontinued, and the other continues to be measured.
Later, the ordinary least squares (OLS) regression technique is employed to reconstitute information about the discontinued variable by using the continuously measured one as an explanatory variable. In
this paper, two record extension techniques are employed to
reconstitute information about discontinued water quality variables,
the OLS and the Line of Organic Correlation (LOC). An empirical
experiment is conducted using water quality records from the Nile
Delta water quality monitoring network in Egypt. The record
extension techniques are compared for their ability to predict
different statistical parameters of the discontinued variables. Results
show that the OLS is better at estimating individual water quality
records. However, results indicate an underestimation of the variance
in the extended records. The LOC technique is superior in preserving
characteristics of the entire distribution and avoids underestimation
of the variance. It is concluded from this study that the OLS can be
used for the substitution of missing values, while LOC is preferable
for inferring statements about the probability distribution.
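The contrast between the two techniques can be sketched numerically. The sketch below uses hypothetical synthetic records (not the Nile Delta data): the OLS slope is r·s_y/s_x, so the reconstituted record's variance shrinks by r², while the LOC slope is sign(r)·s_y/s_x, which preserves the variance:

```python
import math
import random
import statistics

random.seed(1)
# Hypothetical synthetic records: x is the continuously measured water
# quality variable, y the discontinued one (purely illustrative data).
x = [random.gauss(10, 2) for _ in range(200)]
y = [0.8 * xi + random.gauss(0, 1.5) for xi in x]

mx, my = statistics.mean(x), statistics.mean(y)
sx, sy = statistics.stdev(x), statistics.stdev(y)
r = sum((a - mx) * (b - my) for a, b in zip(x, y)) / ((len(x) - 1) * sx * sy)

# OLS: slope shrinks toward zero by the factor r, so the variance of
# the reconstituted record is underestimated by r**2.
b_ols = r * sy / sx
# LOC: slope uses only the ratio of standard deviations (with the sign
# of r), so the extended record preserves the original variance.
b_loc = math.copysign(sy / sx, r)

y_ols = [my + b_ols * (xi - mx) for xi in x]
y_loc = [my + b_loc * (xi - mx) for xi in x]

print(statistics.variance(y_ols) / statistics.variance(y))  # ≈ r**2 (< 1)
print(statistics.variance(y_loc) / statistics.variance(y))  # ≈ 1.0
```

This is the mechanism behind the conclusion: OLS minimizes individual prediction error, while LOC reproduces the spread of the whole distribution.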
Abstract: Building intelligent traffic guide systems has been an
interesting subject recently. A good system should be able to observe
all important visual information to be able to analyze the context of
the scene. To do so, signs in general, and traffic signs in particular,
are usually taken into account, as they contain rich information for these systems. Therefore, many researchers have put effort into the field of sign recognition. Sign localization, or sign detection, is the most important step in the sign recognition process. This step filters out non-informative areas in the scene and locates candidates for later steps. In this paper, we apply a new approach to detecting sign
locations using a new color invariant model. Experiments are carried
out with different datasets introduced in other works where authors
claimed the difficulty in detecting signs under unfavorable imaging
conditions. Our method is simple, fast and most importantly it gives
a high detection rate in locating signs.
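The abstract does not give the color invariant model itself, but the general idea of illumination-robust color detection can be sketched with normalized rgb chromaticity (a generic stand-in, not the authors' model; the threshold is an arbitrary illustrative value):

```python
def red_sign_mask(pixels, r_min=0.5):
    """Flag pixels that look 'sign red' using the chromaticity
    r = R / (R + G + B), which cancels overall brightness changes,
    so the same surface is detected in sun or in shadow."""
    mask = []
    for R, G, B in pixels:
        s = R + G + B
        mask.append(s > 0 and R / s >= r_min)
    return mask

# A bright red, the same red in shadow, and a grey pixel:
print(red_sign_mask([(200, 30, 30), (80, 12, 12), (100, 100, 100)]))
# [True, True, False]
```

The shadowed red passes because its chromaticity (80/104 ≈ 0.77) equals that of the bright red, which is the invariance that plain RGB thresholding lacks under unfavorable imaging conditions.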
Abstract: Electrical discharge machining (EDM) is used especially for manufacturing parts with complex 3-D geometry and hard materials that are extremely difficult to machine by conventional processes. In this paper, the authors review the research work carried out in the development of die-sinking EDM over the past decades for the improvement of machining characteristics such as material removal rate, surface roughness, and tool wear ratio. In this review, the various techniques reported by EDM researchers for improving the machining characteristics are categorized as process parameter optimization, the multi-spark technique, powder-mixed EDM, servo control systems, and pulse discrimination. Finally, a flexible machine controller is suggested for die-sinking EDM to enhance the machining characteristics and to achieve high-level automation, so that die-sinking EDM can be integrated into a Computer Integrated Manufacturing environment, as required by agile manufacturing systems.
Abstract: In this paper, an image adaptive, invisible digital
watermarking algorithm with Orthogonal Polynomials based
Transformation (OPT) is proposed, for copyright protection of digital
images. The proposed algorithm utilizes a visual model to determine
the watermarking strength necessary to invisibly embed the
watermark in the mid frequency AC coefficients of the cover image,
chosen with a secret key. The visual model is designed to generate a Just Noticeable Distortion (JND) mask by analyzing low-level
image characteristics such as textures, edges and luminance of the
cover image in the orthogonal polynomials based transformation
domain. Since the secret key is required for both embedding and
extraction of watermark, it is not possible for an unauthorized user to
extract the embedded watermark. The proposed scheme is robust to
common image processing distortions like filtering, JPEG
compression, and additive noise. Experimental results show that the quality of OPT-domain watermarked images is better than that of their DCT counterparts.
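The key-dependent, JND-scaled embedding described above can be sketched generically as additive embedding in key-selected coefficients (the function names, the additive rule, and the non-blind extraction are illustrative assumptions; the paper's OPT-domain details are not reproduced here):

```python
import random

def embed(coeffs, bits, jnd, key, alpha=1.0):
    """The secret key seeds a PRNG that selects which coefficients
    carry the watermark; the JND mask scales the strength so each
    change stays near the visibility threshold (illustrative rule)."""
    rng = random.Random(key)  # same key => same embedding positions
    positions = rng.sample(range(len(coeffs)), len(bits))
    out = list(coeffs)
    for pos, bit in zip(positions, bits):
        out[pos] += alpha * jnd[pos] * (1 if bit else -1)
    return out

def extract(marked, coeffs, bits_len, key):
    """Non-blind extraction: without the key the positions are unknown,
    so an unauthorized user cannot recover the watermark."""
    rng = random.Random(key)
    positions = rng.sample(range(len(coeffs)), bits_len)
    return [1 if marked[p] - coeffs[p] > 0 else 0 for p in positions]

coeffs = [5.0, -3.0, 2.5, 0.5, -1.0, 4.0]   # toy mid-frequency coefficients
jnd = [0.4, 0.3, 0.5, 0.2, 0.3, 0.6]        # toy JND mask values
marked = embed(coeffs, [1, 0, 1], jnd, key=1234)
print(extract(marked, coeffs, 3, key=1234))  # [1, 0, 1]
```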
Abstract: The focus of this research is the Soviet period and the mission of the Russian Orthodox Church in Kazakhstan in the nineteenth century. National customs and traditions were closely connected with religious practices, outlooks, and attitudes, and this connection has shaped how Kazakh historians assess the Christianization of the local population. Some of them are inclined to regard the small number of baptized Kazakhs as evidence that the Russian Orthodox Church did not achieve its mission. However, our calculations indicate that the number of Kazakhs who became Orthodox Christians is much greater than other historians think. These converts can be divided into three groups: some remained Christian until their deaths, others held two faiths, and a third group hid their true religion, having returned to their former belief. Defining the exact number of baptized Kazakhs therefore represents a challenge: the available data do not create a clear picture of the level of Christianization, consistent and accurate records were not kept, and the figures appearing in the reports of clergy and civil authorities are not always authentic. The purpose of this article is to illuminate and analyze the missionary activity of the Russian Orthodox Church in Kazakhstan.
Abstract: Flow-shop scheduling problem (FSP) deals with the
scheduling of a set of jobs that visit a set of machines in the same
order. The FSP is NP-hard, meaning that no efficient algorithm is known for solving the problem to optimality. To meet the
requirements on time and to minimize the make-span performance of
large permutation flow-shop scheduling problems in which there are
sequence dependent setup times on each machine, this paper
develops a hybrid genetic algorithm (HGA). The proposed HGA applies a modified approach to generating the initial population of chromosomes and uses an improved heuristic, the iterated swap procedure, to improve the initial solutions. The author also uses three genetic operators to produce good new offspring. The results are
compared to some recently developed heuristics and computational
experimental results show that the proposed HGA performs very
competitively with respect to accuracy and efficiency of solution.
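The objective being minimized can be made concrete with a small permutation flow-shop makespan routine, plus a generic pairwise-swap improvement pass. This is a stand-in for the paper's iterated swap procedure, whose exact details are not given in the abstract, and the sequence-dependent setup times are omitted from the sketch:

```python
import itertools

def makespan(seq, proc):
    """Completion time of the last job on the last machine for a
    permutation flow-shop; proc[j][k] is job j's time on machine k.
    (Sequence-dependent setup times are omitted in this sketch.)"""
    m = len(proc[0])
    done = [0.0] * m  # completion time of the previous job on each machine
    for j in seq:
        for k in range(m):
            # A job starts on machine k when both the machine is free
            # and the job has finished on machine k-1.
            start = max(done[k], done[k - 1] if k else 0.0)
            done[k] = start + proc[j][k]
    return done[-1]

def swap_improve(seq, proc):
    """One pass of best-improvement pairwise swaps over the sequence."""
    best_seq, best = list(seq), makespan(seq, proc)
    for i, j in itertools.combinations(range(len(seq)), 2):
        cand = list(seq)
        cand[i], cand[j] = cand[j], cand[i]
        c = makespan(cand, proc)
        if c < best:
            best_seq, best = cand, c
    return best_seq, best

proc = [[3, 2], [1, 4]]            # toy instance: 2 jobs x 2 machines
print(makespan([0, 1], proc))      # 9.0
print(swap_improve([0, 1], proc))  # ([1, 0], 7.0)
```

In an HGA, a pass like `swap_improve` would polish the chromosomes produced by the genetic operators before they re-enter the population.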
Abstract: The seismic vulnerability of an urban area is of great concern to local authorities, especially those facing earthquakes. It is therefore important to have an efficient tool to assess the vulnerability of existing buildings. The use of the VIP (Vulnerability Index Program) and GIS (Geographic Information System) enables us to identify the most vulnerable districts of an urban area.
The vulnerability index method is used to assess the vulnerability of the town center of Blida (Algeria), a historical town that has grown enormously during the last decades. In this method, three levels of vulnerability are defined. GIS has been used to build a database in order to perform different thematic analyses. These analyses show the seismic vulnerability of Blida.
Abstract: Dengue fever has become a major concern for health
authorities all over the world particularly in the tropical countries.
These countries, in particular are experiencing the most worrying
outbreak of dengue fever (DF) and dengue haemorrhagic fever
(DHF). The DF and DHF epidemics, thus, have become the main
causes of hospital admissions and deaths in Malaysia. This paper,
therefore, attempts to examine the environmental factors that may
influence the recent dengue outbreak. The aim of this study is twofold: first, to establish a statistical model describing the relationship between the number of dengue cases and a range of explanatory variables, and second, to identify the lag operators for the explanatory variables that affect dengue incidence the most.
The explanatory variables involved include the level of cloud cover,
percentage of relative humidity, amount of rainfall, maximum
temperature, minimum temperature and wind speed. The Poisson and
Negative Binomial regression analyses were used in this study. The
results of the analyses on the 915 observations (daily data taken from
July 2006 to Dec 2008), reveal that the climatic factors of daily temperature and wind speed significantly influence the incidence of dengue fever 2 and 3 weeks after their occurrence. The effect of humidity, on the other hand, appears to be
significant only after 2 weeks.
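The two modelling ingredients, a lag operator on the covariates and a count regression with a log link, can be sketched in miniature. Everything below is a hypothetical illustration: the coefficients (0.5, 0.3), the 14-day lag, and the hand-rolled gradient ascent are assumptions, and the study itself would use full Poisson / negative binomial GLM routines:

```python
import math
import random

def lag(series, k):
    """Lag operator: drop the last k values so that the covariate
    observed k days earlier lines up with today's count."""
    return series[:-k] if k else list(series)

def fit_poisson(x, y, steps=500, lr=0.1):
    """Single-covariate Poisson regression with log link,
    E[y] = exp(b0 + b1*x), fitted by gradient ascent on the
    log-likelihood (a sketch only)."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(steps):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            resid = yi - math.exp(b0 + b1 * xi)  # observed minus fitted mean
            g0 += resid
            g1 += resid * xi
        b0 += lr * g0 / n
        b1 += lr * g1 / n
    return b0, b1

def rpois(mu, rng):
    """Knuth's Poisson sampler (the stdlib random module has none)."""
    L, k, p = math.exp(-mu), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(0)
temp = [rng.gauss(0.0, 1.0) for _ in range(314)]  # toy centred temperature
x = lag(temp, 14)                                  # hypothetical 2-week lag
counts = [rpois(math.exp(0.5 + 0.3 * xi), rng) for xi in x]
b0, b1 = fit_poisson(x, counts)  # should recover roughly (0.5, 0.3)
```

In the study, candidate lags would be compared by refitting the model for each lag and keeping the one with the best fit.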
Abstract: In the current study we present a system capable of delivering a proxy-based differentiated service. It helps the carrier service node sell a prepaid service to clients and limit the
use to a particular mobile device or devices for a certain time. The
system includes software and hardware architecture for a mobile
device with moderate computational power, and a secure protocol for
communication between it and its carrier service node. On the
carrier service node a proxy runs on a centralized server to be
capable of implementing cryptographic algorithms, while the mobile
device contains a simple embedded processor capable of executing
simple algorithms. One prerequisite is needed for the system to run
efficiently: the presence of a Global Trusted Verification Authority (GTVA), which is equivalent to a certificate authority in IP networks.
This system appears to be of great interest for many commercial
transactions, business to business electronic and mobile commerce,
and military applications.
Abstract: Several wireless network security standards have been proposed and widely implemented in both business and home environments in order to protect the network from unauthorized access. However, the implementation of such standards is usually carried out by network administrators without knowledge of the standards' weaknesses and strengths. The intention of this paper is to evaluate and analyze the impact on the network's security of implementing the wireless network security standards WEP, WPA, and WLAN 802.1X.
Abstract: The wavelet transform is a very powerful tool for image compression. One of its advantages is the provision of both spatial and frequency localization of image energy. However, wavelet transform coefficients are defined by both a magnitude and a sign. While algorithms exist for efficiently coding the magnitude of the transform coefficients, they are not efficient for coding the sign, and it is generally assumed that there is no compression gain to be obtained from coding the sign. Only recently have some authors begun to investigate the sign of wavelet coefficients in image coding. Some authors have assumed that the sign information bit of wavelet coefficients may be encoded with an estimated probability of 0.5; the same assumption concerns the refinement information bit. In this paper, we propose a new method for Separate Sign Coding (SSC) of wavelet image coefficients. The sign and the magnitude of the coefficients are examined to obtain their online probabilities. We use scalar quantization, in which whether the coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also examined. We show that the sign information and the refinement information may be encoded with a probability of approximately 0.5 only after about five bit planes. Two maps are separately entropy encoded: the sign map and the magnitude map. The refinement information indicating whether the coefficient belongs to the lower or the upper sub-interval of the uncertainty interval is also entropy encoded. An algorithm is developed and simulations are performed on three standard greyscale images: Lena, Barbara, and Cameraman. Five scales are computed using the biorthogonal 9/7 wavelet filter bank. The obtained results are compared to the JPEG2000 standard in terms of peak signal-to-noise ratio (PSNR) for the three images and in terms of subjective (visual) quality.
It is shown that the proposed method outperforms JPEG2000. The proposed method is also compared to other codecs in the literature and is shown to be very successful in terms of PSNR.
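Since the comparison with JPEG2000 is made in terms of PSNR, the metric itself is simple to state; a minimal sketch for 8-bit greyscale images (peak value 255):

```python
import math

def psnr(original, distorted, peak=255.0):
    """Peak signal-to-noise ratio in dB between two images given as
    equal-length flat lists of pixel values."""
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    return float("inf") if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)

# Every pixel off by one grey level => MSE = 1 => PSNR = 10*log10(255**2)
print(round(psnr([10, 20, 30, 40], [11, 21, 31, 41]), 2))  # 48.13
```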
Abstract: The aim of this study is to present innovative techniques for describing the affectivity of individuals diagnosed with antisocial personality disorder (ASPD). The author presents information about the hate schemas of persons with ASPD and their understanding of the role of hate. Data from 60 prisoners with ASPD, 40 prisoners without ASPD, and 60 men without antisocial tendencies were analyzed. The participants were asked to describe hate inspired by a photograph; the narrative discourse was analyzed and the three groups were compared. The results show differences between the inmates with ASPD, those without ASPD, and the controls. The antisocial individuals describe hate as an ambivalent feeling with low emotional intensity; the actors in their stories are presented more as positive figures than as partners, and they use different mechanisms to keep themselves from understanding the meaning of the emotional situation. The schemas' characteristics were expressed in the narratives of those high in psychopathy.
Abstract: The demand for sustainable development challenges both managers and consumers to rethink habitual practices and
activities. While consumers are challenged to develop sustainable
consumption patterns, companies are asked to establish managerial
systems and structures considering economic, ecological, and social issues. As this is particularly true for housing associations, this paper
aims first, at providing an understanding of sustainability strategy in
residential trade and industry (RTI) by identifying relevant facets of
this construct and second, at conceptually analyzing the impact of
sustainability strategy in RTI on operational efficiency and
performance of municipal housing companies. The author develops a
model of sustainability strategy in RTI and its effects and further,
sheds light on priorities for future research.
Abstract: In a complex project environment, project teams face
multi-dimensional communication problems that can ultimately lead
to project breakdown. Team performance varies between groups working face-to-face (FTF) and groups working remotely in a computer-mediated communication (CMC) environment. A brief review of the Input-Process-Output model suggested by James E. Driskell, Paul H. Radtke, and Eduardo Salas in “Virtual Teams: Effects of Technological Mediation on Team Performance” (2003) has been carried out to develop the basis of this research. This model theoretically analyzes the effects of technological mediation on team processes such as cohesiveness, status and authority relations, counternormative behavior, and communication. An empirical study
described in this paper has been undertaken to test the
“cohesiveness" of diverse project teams in a multi-national
organization. This study uses both quantitative and qualitative
techniques for data gathering and analysis. These techniques include
interviews, questionnaires for data collection and graphical data
representation for analyzing the collected data. Computer-mediated
technology may impact team performance because of difference in
cohesiveness among teams and this difference may be moderated by
factors, such as, the type of communication environment, the type of
task and the temporal context of the team. Based on the reviewed
model, sets of hypotheses are devised and tested. This research reports on a study that compared team cohesiveness among virtual
teams using CMC and non-CMC communication mediums. The
findings suggest that CMC can help virtual teams increase team
cohesiveness among their members, making CMC an effective
medium for increasing productivity and team performance.
Abstract: This paper examines the influence of communication
form on employee uncertainty during mergers and acquisitions
(M&As). Specifically, the author uses narrative theory to analyze
how narrative organizational communication affects the three
components of uncertainty – decreased predictive, explanatory, and
descriptive ability. It is hypothesized that employees whose
organizations use narrative M&A communication will have greater
predictive, explanatory, and descriptive abilities than employees of
organizations using non-narrative M&A communication. This paper
contributes to the stream of research examining uncertainty during
mergers and acquisitions and argues that narratives are an effective
means of managing uncertainty in the mergers and acquisitions
context.
Abstract: This paper presents the results of the authors in designing, experimenting with, assessing, and transferring an innovative approach to energy education in secondary schools, aimed at enhancing the quality of learning in terms of didactic curricula and pedagogic methods. The training is delivered online to youngsters via e-Books and portals specially designed for this purpose, or through learning by doing via interactive games. An online educational methodology is available to teachers.