Abstract: Architecture education has its roots in apprenticeship
models, and its nature has changed little over a long period; the
main source of change has been its evaluation process and system. It
is undeniable that art and architecture education is based on
transmitting knowledge from instructor to student. In contrast to
other disciplines, this transmission occurs through iteration and
practice: studio masters guide the design process and the development
of skills through supervision and critique, and evaluation concludes
with marks awarded for students' achievements. The importance of
evaluation and assessment is therefore obvious, and it is not
unreasonable to say that anyone who wishes to understand an
architecture education system must first study its assessment
procedures. The evolution of these procedures in Western countries is
well documented. In Malaysia, however, it appears to have been
neglected, and there is a severe lack of research and documentation
in this area. As a developing, multicultural country encompassing
different races and cultures, Malaysia is a suitable setting for
scrutinizing evaluation systems and the degree to which currently
implemented models are accepted, so that evaluation and assessment
procedures can be kept abreast of the needs of different generations,
cultures and even genders. This paper attempts to answer the
questions of how evaluation and assessment are performed and how
students perceive the evaluation system in the Malaysian context. The
main contribution of this work is to the international debate on
evaluation models.
Abstract: Management is required to understand all information security risks within an organization and to decide which risks should be treated, to what level, and at what cost. Such decision-making is usually difficult, because measures for risk treatment must be selected together with suitable application levels, and some measures have mutually conflicting objectives, which makes the selection harder still. This paper therefore provides a model that supports the selection of measures by applying multi-objective analysis to find an optimal solution. A list of measures is also provided to make the selection easier and more effective, without omitting any measures.
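The paper's model itself is not reproduced here, but as a minimal sketch of the kind of multi-objective selection it describes, a weighted-sum formulation over a small set of candidate measures might look as follows. The measure names, costs, risk reductions, and weights are all illustrative assumptions, not data from the paper:

```python
from itertools import combinations

# Hypothetical candidate measures: (name, cost, risk reduction).
# The names and numbers are illustrative only.
MEASURES = [
    ("firewall",   3.0, 5.0),
    ("training",   2.0, 4.0),
    ("encryption", 4.0, 6.0),
    ("audit",      1.0, 2.0),
]

def select_measures(budget, w_risk=1.0, w_cost=0.3):
    """Enumerate all subsets of measures within budget and return the
    one that maximizes the weighted-sum objective
    w_risk * total_reduction - w_cost * total_cost."""
    best, best_score = (), float("-inf")
    for r in range(len(MEASURES) + 1):
        for subset in combinations(MEASURES, r):
            cost = sum(m[1] for m in subset)
            if cost > budget:
                continue
            reduction = sum(m[2] for m in subset)
            score = w_risk * reduction - w_cost * cost
            if score > best_score:
                best, best_score = subset, score
    return [m[0] for m in best], best_score
```

Brute-force enumeration is only viable for small lists of measures; a real model of this kind would use a proper multi-objective solver, but the trade-off between conflicting objectives is the same.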
Abstract: The concept of housing affordability is a contested
issue, but a pressing and widespread problem for many countries.
Simple ratio measures based on housing expenditure and income are
habitually used to define and assess housing affordability. However,
conceptualising and measuring affordability in this manner focuses
only on financial attributes and fails to deal with wider issues such as
housing quality, location and access to services and facilities.
The research is based on the notion that the housing affordability
problem encompasses more than the financial costs of housing and a
household's ability to meet those costs, and that it must address
larger issues such as social and environmental sustainability and
the welfare of households. Therefore, the need arises for a broader,
more encompassing set of attributes by which housing affordability can be
assessed. This paper presents a system of criteria by which the
affordability of different housing locations could be assessed in a
comprehensive and sustainable manner. Moreover, the paper explores
the way in which such criteria could be measured.
Abstract: In this paper we compare the accuracy of data mining
methods for classifying students in order to predict students' class
grades. Such predictions are useful for identifying weak students and
for helping management take remedial measures at an early stage, so
that graduates finish with at least a second class upper degree. We
first examine the accuracy of single classifiers on our data set and
choose the best one, and then ensemble it with a weak classifier to
produce a simple voting method. The results we present show that
combining different classifiers outperforms single classifiers for
predicting student performance.
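The simple voting step described above can be sketched as follows. This is a minimal majority-vote combiner over predicted class labels with illustrative data, not the paper's exact ensemble or classifiers:

```python
from collections import Counter

def majority_vote(predictions):
    """Combine per-classifier label predictions by simple majority vote.
    `predictions` is a list of lists: one prediction list per classifier,
    all over the same samples. Ties go to the label seen first."""
    n_samples = len(predictions[0])
    combined = []
    for i in range(n_samples):
        votes = [p[i] for p in predictions]
        combined.append(Counter(votes).most_common(1)[0][0])
    return combined
```

With a strong classifier and two weaker ones voting over the same students, the combined prediction follows the majority for each sample, which is how a voting ensemble can outvote an individual classifier's mistakes.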
Abstract: Landfill gas, particularly methane, is one of the
greenhouse gases that contribute to global warming. This paper
presents the findings of a study on methane gas production from a
simulated landfill reactor under saturated conditions. A reactor was
constructed to represent a landfill cell of 2.5 m thickness on sandy
soil. The reactor was 0.2 m in diameter and 4 m in height. A one-meter
sand and pebble layer was packed at the bottom of the reactor,
followed by 2.5 m of solid waste and a 0.4 m sand layer as the cover
soil. Degradation of waste in the solid waste layer was at the
acidification stage, as indicated by the leachate quality, with COD as
high as 55,511 mg/L and pH as low as 5.1. However, a methanogenic
environment was established in the bottom sand layer after one year of
operation, as indicated by a pH of 7.2 and methane gas generation.
Leachate degradation took place as the leachate moved through the
sand layer at an infiltration rate of 0.7 cm/day. This resulted in
landfill gas production of 77 mL/day/kg containing 55 to 65% methane.
The application of the sand layer contributed to gas production from
the landfill through in-situ degradation of leachate in the sand at
the bottom of the landfill.
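From the figures reported above (77 mL/day/kg of gas at 55 to 65% methane), the methane share of the measured production follows by simple arithmetic; a small helper (the function name is ours) makes the range explicit:

```python
def methane_range(gas_ml_per_day_kg, ch4_fraction_low, ch4_fraction_high):
    """Methane share of total landfill gas production, in mL/day per kg
    of waste, given the measured methane volume fractions."""
    return (gas_ml_per_day_kg * ch4_fraction_low,
            gas_ml_per_day_kg * ch4_fraction_high)
```

For the reported values this gives roughly 42 to 50 mL of methane per day per kilogram of waste.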
Abstract: Two-interconnected fluidized bed systems are widely used in various processes such as Fischer-Tropsch synthesis, hot gas desulfurization, CO2 capture-regeneration with dry sorbent, chemical-looping combustion, sorption-enhanced steam methane reforming, and chemical-looping hydrogen generation. However, most two-interconnected fluidized bed systems require a riser and/or pneumatic transport line for solid conveying, and loopseals or seal-pots for gas sealing, recirculation of solids to the riser, and maintenance of the pressure balance. The riser (transport bed) is operated in the high-velocity fluidization regime, so the residence times of gas and solid in the riser are very short. If the reaction rate of the catalyst or sorbent is slow, the riser cannot ensure sufficient contact time between gas and solid, and two bubbling beds, one for each reaction, must be used to ensure sufficient contact time. In that case, an additional riser must be installed for solid circulation. Consequently, conventional two-interconnected fluidized bed systems are very complex, large, and difficult to operate. To solve these problems, a novel two-interconnected fluidized bed system has been developed. This system has two bubbling beds, solid injection nozzles, solid conveying lines, and downcomers. In this study, the effects of operating variables on the solid circulation rate and on gas leakage between the two beds have been investigated in a cold-mode two-interconnected fluidized bed system. Moreover, long-term operation with continuous solid circulation for up to 60 hours has been performed to check the feasibility of stable operation.
Abstract: Shadows add a great amount of realism to a scene, and
many algorithms exist to generate them. Recently, shadow volumes
(SVs) have achieved a valuable position in the gaming industry.
Accordingly, we concentrate on simple but valuable initial steps
toward further optimization of SV generation, namely model
simplification and silhouette edge detection and tracking. Shadow
volume generation usually spends considerable time computing the
boundary silhouettes of an object, and if the object is complex, edge
generation becomes much harder and slower. The challenge gets stiffer
when real-time shadow generation and rendering are demanded. We
investigated a real-time silhouette edge detection method that takes
advantage of spatial and temporal coherence, and exploited the
level-of-detail (LOD) technique to reduce the silhouette edges of the
model, using a simplified version of the model for shadow generation
and thereby speeding up the running time. These steps greatly reduce
the execution time of shadow volume generation in real time and can
easily be combined with any of the recently proposed SV techniques.
Our main focus is to exploit LOD and silhouette edge detection,
adapting them to further enhance shadow volume generation for
real-time rendering.
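The classic silhouette test that such detection builds on can be sketched as follows: an edge lies on the silhouette when one of its two adjacent faces faces the light and the other faces away. This is a minimal, unoptimized version; the coherence-based tracking and LOD reduction discussed above are not reproduced:

```python
def dot(a, b):
    """Dot product of two 3-component vectors."""
    return sum(x * y for x, y in zip(a, b))

def silhouette_edges(edges, face_normals, light_dir):
    """Return the edges shared by one light-facing and one back-facing
    triangle. `edges` maps an edge id to the pair of adjacent face
    indices; `face_normals` lists one outward normal per face. An edge
    is on the silhouette when the sign of (normal . light_dir) differs
    across its two faces."""
    result = []
    for edge, (f0, f1) in edges.items():
        facing0 = dot(face_normals[f0], light_dir) > 0.0
        facing1 = dot(face_normals[f1], light_dir) > 0.0
        if facing0 != facing1:
            result.append(edge)
    return result
```

A brute-force pass like this visits every edge each frame; the spatial and temporal coherence mentioned above exploits the fact that the silhouette changes only incrementally between frames.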
Abstract: In this article we explore how computer assisted exercises may allow for bridging the traditional gap between theory and practice in professional education. To educate officers able to master the complexity of the battlefield, the Norwegian Military Academy needs to develop a learning environment that creates viable connections between the educational environment and the field of practice. In response to this challenge, we explore the conditions necessary to make computer assisted training systems (CATS) a useful tool for creating structural similarities between an educational context and the field of military practice. Although CATS may facilitate work procedures close to real-life situations, this case demonstrates that professional competence must also build on viable learning theories and environments. This paper explores the conditions that allow simulators to be used to facilitate professional competence from within an educational setting. We develop a generic didactic model that ascribes learning to participation in iterative cycles of action and reflection. The development of this model is motivated by the need to develop an interdisciplinary professional education rooted in the pattern of military practice.
Abstract: Interoperability is the ability of information systems to operate in conjunction with each other, encompassing communication protocols, hardware, software, application, and data compatibility layers. There has been considerable work in industry on the development of component interoperability models, such as CORBA, (D)COM and JavaBeans. These models are intended to reduce the complexity of software development and to facilitate the reuse of off-the-shelf components. Their focus is syntactic interface specification, component packaging, inter-component communications, and bindings to a runtime environment. What these models lack is a consideration of architectural concerns – specifying systems of communicating components, explicitly representing loci of component interaction, and exploiting architectural styles that provide well-understood global design solutions. The development of complex business applications is now focused on assembling components available on a local area network or on the net. These components must be localized and identified in terms of available services and communication protocols before any request. The first part of the article introduces the base concepts of components and middleware, the following sections describe the different up-to-date models of communication and interaction, and the last section shows how the different models can communicate among themselves.
Abstract: In recent years image watermarking has become an
important research area in data security, confidentiality and image
integrity. Many watermarking techniques have been proposed for
medical images. However, medical images, unlike most images, require
extreme care when additional data are embedded within them, because
the additional information must not affect image quality and
readability. Medical records, electronic or not, are also bound by
medical secrecy and must therefore remain confidential. To fulfill
these requirements, this paper presents a lossless watermarking
scheme for DICOM images. The proposed fragile scheme combines two
reversible techniques based on difference expansion: one hides the
patient's data and one protects the region of interest (ROI) with
tamper detection and recovery capability. The patient's data are
embedded into the ROI, while recovery data are embedded into the
region of non-interest (RONI). The experimental
results show that the original image can be exactly extracted from the
watermarked one when no tampering has occurred. In case of a tampered
ROI, the tampered area can be localized and recovered with a
high-quality version of the original area.
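The core reversible step underlying difference expansion (due to Tian) can be sketched as follows. One bit is embedded into a pixel pair by doubling their difference; because the pair's average is preserved, the original pixels are exactly recoverable. Overflow handling and the ROI/RONI partitioning of the proposed scheme are omitted here:

```python
def de_embed(x, y, bit):
    """Difference expansion: embed one bit into a pixel pair (x, y).
    The difference h = x - y is expanded to h' = 2h + bit; the integer
    average l = floor((x + y) / 2) is preserved, so the step is exactly
    invertible."""
    l = (x + y) // 2
    h = 2 * (x - y) + bit
    return l + (h + 1) // 2, l - h // 2

def de_extract(xw, yw):
    """Recover the embedded bit and the original pixel pair."""
    l = (xw + yw) // 2
    h = xw - yw
    bit = h & 1
    h //= 2          # undo the expansion (floor division handles negatives)
    return (l + (h + 1) // 2, l - h // 2), bit
```

Because the expanded difference can overflow the pixel range, practical schemes (including reversible medical-image watermarking) must track which pairs are expandable, which is part of what makes lossless recovery of the exact original image possible.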
Abstract: All text processing systems allow their users to search
for a pattern string in a given text. String matching is fundamental
to database and text processing applications: every text editor must
contain a mechanism to search the current document for arbitrary
strings, and spelling checkers scan an input text for words in a
dictionary and reject any strings that do not match. We store
information in databases so that we can later retrieve it, and this
retrieval can be done using various string matching algorithms. This
paper describes a new string matching algorithm for various
applications, designed with the help of the Rabin-Karp matcher to
improve the string matching process.
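The paper's new algorithm is not reproduced here, but the Rabin-Karp matcher it builds on works as follows: compare a rolling hash of each text window against the pattern's hash, and verify character-by-character only on a hash hit. A standard textbook sketch:

```python
def rabin_karp(text, pattern, base=256, mod=1_000_003):
    """Rabin-Karp matching: return the start indices of all occurrences
    of `pattern` in `text`, using a rolling polynomial hash."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    high = pow(base, m - 1, mod)          # weight of the leading character
    p_hash = t_hash = 0
    for i in range(m):
        p_hash = (p_hash * base + ord(pattern[i])) % mod
        t_hash = (t_hash * base + ord(text[i])) % mod
    matches = []
    for i in range(n - m + 1):
        # On a hash hit, confirm with a direct comparison (hashes can collide).
        if p_hash == t_hash and text[i:i + m] == pattern:
            matches.append(i)
        if i < n - m:                     # roll the window one character right
            t_hash = ((t_hash - ord(text[i]) * high) * base
                      + ord(text[i + m])) % mod
    return matches
```

The rolling update makes each window shift O(1), giving expected O(n + m) time, which is the property that makes Rabin-Karp a useful base for derived matchers.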
Abstract: Scalability poses a severe threat to the existing DRAM
technology. The capacitors used for storing and sensing charge in
DRAM are generally not scaled beyond 42 nm, because the capacitors
must be sufficiently large for reliable sensing and charge storage.
This leaves DRAM memory scaling in jeopardy, as charge sensing and
storage become extremely difficult. In this paper we provide an
overview of the potential and the possibilities of using Phase Change
Memory (PCM) as an alternative to the existing DRAM technology. The
main challenges we encounter in using PCM are its limited endurance,
high access latencies, and higher dynamic energy consumption than
that of conventional DRAM. We then provide an overview of various
methods that can be employed to overcome these drawbacks; hybrid
memories involving both PCM and DRAM can be used to achieve good
tradeoffs between access latency and storage density. We conclude by
presenting the results of these methods, which make PCM a potential
replacement for the current DRAM technology.
Abstract: As a tool for human spatial cognition and thinking, the map has long played an important role; maps are perhaps as fundamental to society as language and the written word. Economic and social development requires an extensive and in-depth understanding of our living environment, from the global scale down to urban housing. This has brought unprecedented opportunities and challenges for traditional cartography. This paper first proposes the concept of the scaleless map and its basic characteristics, based on an analysis of existing multi-scale representation techniques. Some strategies are then presented for automated map compilation. Taking into account the demands of automated map compilation, the paper details four technical features that the software, the WJ workstation, must have: generalization operators, symbol primitives, dynamic annotation, and mapping process templates. This paper provides a more systematic new idea and solution for improving the intelligence and automation of scaleless cartography.
Abstract: In this paper, a study on the modes of collapse of
compress-expand members is presented. A compress-expand member is a
compact arrangement of multiple combined cylinders proposed as an
energy absorber. Previous studies on the compress-expand member have
clarified its energy absorption efficiency, proposed an approximate
equation to describe its deformation characteristics, and highlighted
the improvements it brings. However, for the member to be practical,
the actual range of geometrical dimensions over which it maintains
its applicability must be investigated. In this study, using virtual
materials that comply with the bilinear hardening law, finite element
method (FEM) analyses of the collapse modes of the compress-expand
member have been conducted. Deformation maps that plot the member's
collapse modes with respect to its geometric and material parameters
are then presented in order to determine the dimensional range of
each collapse mode.
Abstract: In this work, a dynamic model of a new quadrotor aerial
vehicle equipped with a tilt-wing mechanism is presented. The vehicle
is capable of vertical take-off/landing (VTOL) like a helicopter and
of horizontal flight like an airplane. The dynamic model of the
vehicle is derived for both vertical and horizontal flight modes
using the Newton-Euler formulation. An LQR controller for the
vertical flight mode has also been developed, and its performance has
been tested in several simulations.
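The vehicle's actual model and controller are not reproduced here. As a hedged illustration of how such an LQR gain is computed, the sketch below iterates the discrete-time Riccati equation to a fixed point for a generic double-integrator axis (a common stand-in for one translational degree of freedom of a hovering vehicle); the A, B, Q, R values are illustrative assumptions:

```python
def mat_mul(A, B):
    """(p x q) @ (q x r) matrix product on nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_add(A, B):
    return [[A[i][j] + B[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

def transpose(A):
    return [list(row) for row in zip(*A)]

def dlqr(A, B, Q, R, iters=500):
    """Discrete-time LQR for a single-input system: value-iterate the
    Riccati difference equation P = Q + A'P(A - BK), then return the
    gain row K such that u = -K x."""
    n = len(A)
    P = [row[:] for row in Q]
    K = [[0.0] * n]
    for _ in range(iters):
        BtP = mat_mul(transpose(B), P)                 # 1 x n
        s = R[0][0] + mat_mul(BtP, B)[0][0]            # scalar R + B'PB
        K = [[v / s for v in mat_mul(BtP, A)[0]]]      # (R + B'PB)^-1 B'PA
        A_cl = mat_add(A, [[-B[i][0] * K[0][j] for j in range(n)]
                           for i in range(n)])          # A - B K
        P = mat_add(Q, mat_mul(mat_mul(transpose(A), P), A_cl))
    return K

def simulate(A, B, K, x0, steps):
    """Closed-loop rollout with u = -K x."""
    x = x0[:]
    for _ in range(steps):
        u = -sum(K[0][j] * x[j] for j in range(len(x)))
        x = [sum(A[i][j] * x[j] for j in range(len(x))) + B[i][0] * u
             for i in range(len(x))]
    return x
```

For a double integrator discretized at dt = 0.1 s (A = [[1, 0.1], [0, 1]], B = [[0.005], [0.1]]) with Q = I and R = [[1]], the resulting gain drives the state to the origin, which mirrors the stabilizing role the LQR plays in the vertical flight mode.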
Abstract: It is not easy to imagine how an existing city can be
converted to the principles of sustainability; the need for
innovation, however, requires a pioneering phase that must address
the main problems of rehabilitating the operating models of the city.
Today there is a growing awareness that the identification and
implementation of policies and measures to promote the adaptation,
resilience and reversibility of the city require the contribution of
our discipline. This breakthrough is present in some recent
international experiences with Climate Plans, in which the envisaged
measures are closely interwoven with those of urban planning. These
experiences provide some answers to key questions, such as: how
strategies to combat climate change can be integrated into the
instruments of local government; what new and specific analyses must
be introduced into urban planning in order to understand the issues
of urban sustainability; and how the project relates to different
spatial scales.
Abstract: Efforts to secure supervisory control and data acquisition
(SCADA) systems must be supported under the guidance of
sound security policies and mechanisms to enforce them. Critical
elements of the policy must be systematically translated into a format
that can be used by policy enforcement components. Ideally, the
goal is to ensure that the enforced policy is a close reflection of
the specified policy. However, security controls commonly used to
enforce policies in the IT environment were not designed to satisfy
the specific needs of the SCADA environment. This paper presents
a language, based on the well-known XACML framework, for the
expression of authorization policies for SCADA systems.
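The paper's XACML-based language is not reproduced here. As a rough illustration of the evaluation model behind XACML-style policies (rules with attribute-matched targets, Permit/Deny effects, and a first-applicable combining behavior), consider the following sketch; the attribute names and rules are illustrative, not drawn from the paper:

```python
def evaluate(policy, request):
    """First-applicable rule evaluation over attribute-based rules,
    loosely mirroring the XACML target/effect structure. A rule
    applies when every attribute in its target matches the request."""
    for rule in policy:
        target = rule["target"]
        if all(request.get(attr) == val for attr, val in target.items()):
            return rule["effect"]
    return "NotApplicable"

# Hypothetical SCADA-flavored policy: operators may read, writes are denied.
POLICY = [
    {"target": {"role": "operator", "action": "read"}, "effect": "Permit"},
    {"target": {"action": "write"}, "effect": "Deny"},
]
```

Mapping a specified policy into an enforceable structure like this is exactly the translation step the abstract highlights: the enforced policy should remain a close reflection of the specified one.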
Abstract: The performance of a type of fuzzy sliding mode control is investigated for the nonlinear missile-target interception problem, with the aim of obtaining a robust interception process. A variable boundary layer based on fuzzy logic is proposed to reduce chattering around the switching surface and is then applied to the derived interception model. The performance of sliding mode control with constant and fuzzy boundary layers is compared at the end of the study, and the results are evaluated.
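The boundary-layer idea behind this kind of design can be sketched as follows: the discontinuous sign function of classical sliding mode control is replaced by a saturation inside a layer of width phi, and the fuzzy logic adjusts phi online. The paper's actual fuzzy rule base is not reproduced, so a simple proportional stand-in with illustrative parameters is used in its place:

```python
def sat(s, phi):
    """Saturation replaces sign() inside a boundary layer of width phi,
    which is the standard device for reducing chattering around the
    switching surface s = 0."""
    return max(-1.0, min(1.0, s / phi))

def boundary_width(s_rate, phi_min=0.05, phi_max=0.5, gain=0.1):
    """Crude stand-in for the fuzzy inference: widen the layer when the
    sliding variable changes fast (higher chattering risk) and shrink
    it when the motion is slow. Real fuzzy rules would replace this."""
    phi = phi_min + gain * abs(s_rate)
    return min(phi, phi_max)
```

A wider layer trades tracking precision for smoother control effort, which is why adapting phi (here crudely, in the paper via fuzzy logic) can outperform a constant boundary layer.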
Abstract: Working with an autistic child in elementary school is a
difficult task that must be fully thought out, and teachers should be
aware of the many challenges they face, especially the behavioral
problems of autistic children. Hence there arises a need to develop
contemporary artificial intelligence (AI) techniques to help diagnose
autism.
In this research, we propose an expert system architecture that
combines cognitive maps (CM) with case-based reasoning (CBR) in order
to reduce the time and cost of the traditional diagnosis process for
the early detection of autistic children. The teacher enters the
child's information, which is analyzed by the CM module; the
reasoning processor then translates the output into a case to be
solved by the CBR module. We will implement a prototype of the model
as a proof of concept using Java and MySQL.
This provides a new hybrid approach intended to achieve new synergies
and improve problem-solving capabilities in AI. We predict that it
will reduce time, cost, and the number of human errors, and make
expertise available to more people who want to serve autistic
children and their families.
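The retrieve step at the heart of CBR can be sketched as a weighted nearest-neighbour search over stored cases. The feature names, weights, and solutions below are invented for illustration only (the actual prototype is described as a Java/MySQL system, and the real CM-derived features are not reproduced):

```python
def retrieve(case_base, query, weights):
    """k=1 nearest-neighbour retrieval, the core of CBR's retrieve
    phase: score each stored case by weighted similarity over shared
    attributes and return the best-matching case."""
    def similarity(case):
        return sum(w for attr, w in weights.items()
                   if case["features"].get(attr) == query.get(attr))
    return max(case_base, key=similarity)

# Hypothetical case base with invented attributes and outcomes.
CASES = [
    {"features": {"eye_contact": "low", "speech": "delayed"},
     "solution": "refer_for_assessment"},
    {"features": {"eye_contact": "normal", "speech": "typical"},
     "solution": "monitor"},
]
```

The retrieved case's solution is then reused and adapted to the current child, which is how CBR shortens the traditional diagnosis process.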
Abstract: Human activities are increasingly based on the use of
remote resources and services, and on interaction between remotely
located parties that may know little about each other. Mobile agents
must be prepared to execute on different hosts with various
environmental security conditions. The aim of this paper is to
propose a trust-based mechanism to improve the security of mobile
agents and allow their execution in various environments. An adaptive
trust mechanism is therefore proposed, based on the dynamic
interaction between the agent and the environment. Information
collected during the interaction enables the generation of an
environment key. This key indicates the host's trust degree and
permits the mobile agent to adapt its execution. Trust estimation is
based on concrete parameter values; thus, in case of distrust, the
source of the problem can be located and an appropriate mobile agent
behavior can be selected.
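The adaptation described above can be sketched as a mapping from concrete environment parameters to a trust degree, and from that degree to an execution mode. The parameter names, weights, and thresholds below are illustrative assumptions, not the paper's environment-key construction:

```python
def trust_degree(params, weights):
    """Weighted average of normalized environment indicators in [0, 1].
    The indicator names are illustrative, not the paper's actual key."""
    total = sum(weights.values())
    return sum(weights[k] * params[k] for k in weights) / total

def choose_behavior(trust, high=0.7, low=0.4):
    """Map the trust degree to an execution mode for the mobile agent
    (thresholds are illustrative)."""
    if trust >= high:
        return "full_execution"
    if trust >= low:
        return "restricted_execution"
    return "abort_and_report"
```

Because the trust degree is built from concrete per-parameter values, a low score can be traced back to the indicator that caused it, which mirrors how the mechanism locates the source of distrust before selecting the agent's behavior.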