Abstract: The paper is concerned with developing stochastic delay mechanisms for efficient multicast protocols and for smooth mobile handover processes that are capable of preserving a given Quality of Service (QoS). In both applications the participating entities (receiver nodes or subscribers) sample a stochastic timer and generate load only after a random delay. In this way, the load on the networking resources is evenly distributed, which helps maintain QoS communication. The optimal timer distributions have been sought in different p.d.f. families (e.g. exponential, power law and radial basis function) and the optimal parameters have been found in a recursive manner. Detailed simulations have demonstrated the improvement in performance in both the multicast and mobile handover applications.
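As an illustration of the timer mechanism described above, the following minimal Python sketch (all names, the exponential rate, and the window size are illustrative assumptions, not taken from the paper) has each receiver sample an independent exponential delay before responding, so that the aggregate load is spread over time:

```python
import random

def sample_feedback_delays(n_receivers, rate=1.0, seed=42):
    """Each receiver samples an independent exponential delay
    before sending feedback, spreading load over time."""
    rng = random.Random(seed)
    return [rng.expovariate(rate) for _ in range(n_receivers)]

delays = sample_feedback_delays(1000, rate=2.0)

# peak load = number of responses falling in the busiest 0.1 s window
bins = {}
for d in delays:
    bins[int(d / 0.1)] = bins.get(int(d / 0.1), 0) + 1
peak = max(bins.values())
```

Without the random delay, all 1000 receivers would respond in the same instant; with it, the busiest window carries only a fraction of the total responses.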
Abstract: The original idea for a feature film may come from a writer, a director or a producer. The director is the person responsible for the creative aspects, both interpretive and technical, of a motion picture production. The director may be shown discussing the project with his or her co-writers, members of the production staff, and the producer, and may be shown selecting locales or constructing sets. All these activities provide, of course, ways of externalizing the director's ideas about the film. A director sometimes pushes both the film image and the techniques of narration to new artistic limits, but the director's main responsibility is to lead the spectator to an original view through his or her philosophical approach. The director tries to find an artistic angle in every scene, to turn the screenplay into an effective story, and to set the film on a spiritual and philosophical base.
Abstract: The objective of this paper is to propose an adaptive multi-threshold technique for image segmentation, specifically for object detection. Due to the different types of license plates in use, the requirements of an automatic license plate recognition (LPR) system differ for each country. The proposed technique is applied to a Malaysian LPR application. It is based on a Multi-Layer Perceptron trained by back-propagation. The proposed adaptive threshold is introduced to find the optimum threshold values. The technique relies on the peak value in the graph of the number of objects versus a specific range of threshold values. The proposed approach improves overall performance compared with current optimal-threshold techniques. Further improvement of this method is in progress to accommodate real-time system specifications.
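The peak-based threshold selection described above can be sketched as follows (a simplified illustration, not the paper's MLP-based method): sweep candidate thresholds, count the connected objects at each, and keep the threshold where the object count peaks.

```python
from collections import deque

def count_objects(img, thr):
    """Count 4-connected foreground components whose pixels exceed thr."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for i in range(h):
        for j in range(w):
            if img[i][j] > thr and not seen[i][j]:
                count += 1                    # new component found
                q = deque([(i, j)])
                seen[i][j] = True
                while q:                      # flood-fill the component
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < h and 0 <= nx < w
                                and not seen[ny][nx] and img[ny][nx] > thr):
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return count

def pick_threshold(img, candidates):
    """Choose the threshold giving the peak object count."""
    return max(candidates, key=lambda t: count_objects(img, t))

# toy 'plate' image: two bright character strokes on a dim background
img = [
    [10,  10, 10, 10,  10, 10],
    [10, 200, 10, 10, 200, 10],
    [10, 200, 10, 10, 200, 10],
    [10,  10, 10, 10,  10, 10],
]
best = pick_threshold(img, range(0, 255, 50))
```

At a threshold of 0 the whole image merges into one object, and at 200 or above everything vanishes; the object count peaks (at two "characters") for the intermediate thresholds.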
Abstract: In this paper, we have combined several spatial derivatives with the optimised time derivative proposed by Tam and Webb in order to approximate the linear advection equation, which is given by ∂u/∂t + ∂f/∂x = 0. The spatial derivatives are as follows: a standard 7-point 6th-order central difference scheme (ST7), a standard 9-point 8th-order central difference scheme (ST9), and optimised schemes designed by Tam and Webb, Lockard et al., Zingg et al., Zhuang and Chen, and Bogey and Bailly. These seven different spatial derivatives have thus been coupled with the optimised time derivative to obtain seven different finite-difference schemes to approximate the linear advection equation. We have analysed the variation of the modified wavenumber and the group velocity, both with respect to the exact wavenumber, for each spatial derivative. The problems considered are the 1-D propagation of a Boxcar function, the propagation of an initial disturbance consisting of a sine and a Gaussian function, and the propagation of a Gaussian profile. It is known that the choice of the cfl number affects the quality of results in terms of dissipation and dispersion characteristics. Based on the numerical experiments solved and the numerical methods used to approximate the linear advection equation, it is observed in this work that the quality of the results depends on the choice of the cfl number, even for optimised numerical methods. The errors in the numerical results have been quantified into dispersion and dissipation using a technique devised by Takacs. Also, the quantity Exponential Error for Low Dispersion and Low Dissipation (eeldld) has been computed from the numerical results. Moreover, based on this work, it has been found that eeldld can be used as a measure of the total error; in particular, the total error is a minimum when eeldld is a minimum.
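The ST7 scheme named above is the standard 6th-order central difference; on a periodic grid it can be sketched as follows (grid size and test function are illustrative, and the time integration is omitted):

```python
import math

def ddx_st7(u, h):
    """Standard 7-point 6th-order central difference for du/dx
    on a periodic grid (the ST7 scheme): coefficients 3/4, -3/20, 1/60."""
    n = len(u)
    c1, c2, c3 = 3 / 4, 3 / 20, 1 / 60
    return [(c1 * (u[(i + 1) % n] - u[(i - 1) % n])
             - c2 * (u[(i + 2) % n] - u[(i - 2) % n])
             + c3 * (u[(i + 3) % n] - u[(i - 3) % n])) / h
            for i in range(n)]

# verify on u = sin(x), whose exact derivative is cos(x)
n = 64
h = 2 * math.pi / n
x = [i * h for i in range(n)]
u = [math.sin(xi) for xi in x]
du = ddx_st7(u, h)
err = max(abs(d - math.cos(xi)) for d, xi in zip(du, x))
```

For a smooth profile the pointwise error is of order h^6, which is why ST7 resolves far more of the wavenumber range than a low-order scheme on the same stencil width.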
Abstract: The present study is concerned with the effect of boundary-layer excitation on the cooling process in gas-turbine blades. The cooling process is investigated numerically. Observations show that cooling the first row of moving or stationary blades increases their lifetime. Results show that the minimum temperature in the cooling line with an excited boundary layer is lower than without excitation. Placing a block in the cooling line of the turbine blade changes the flow pattern and the stability of the boundary layer, which increases the heat-transfer coefficient. Results show that at the location of the block, the temperature of the turbine blade is significantly decreased. The k-ε turbulence model is used.
Abstract: Industrial radiography is a well-known technique for the identification and evaluation of discontinuities, or defects, such as cracks, porosity and foreign inclusions found in welded joints. Although this technique has been well developed, improving both the inspection process and the operating time, it suffers from several drawbacks. The poor quality of radiographic images is due to the physical nature of radiography, as well as the small size of the defects and their poor orientation relative to the size and thickness of the evaluated parts. Digital image processing techniques allow the interpretation of the image to be automated, avoiding the need for human operators and making the inspection system more reliable, reproducible and faster. This paper describes our attempt to develop and implement digital image processing algorithms for the purpose of automatic defect detection in radiographic images. Because of the complex nature of the considered images, and in order for the detected defect region to represent the real defect as accurately as possible, the global and local preprocessing and segmentation methods must be chosen appropriately.
Abstract: Information is power. Geographical information science is an emerging field that advances the development of knowledge to further the understanding of the relationship of "place" with other disciplines such as crime. The researchers used crime data for the years 2004 to 2007 from the Baguio City Police Office to determine the incidence and actual locations of crime hotspots. A combined qualitative and quantitative research methodology was employed through extensive fieldwork and observation, geographic visualization with Geographic Information Systems (GIS) and Global Positioning Systems (GPS), and data mining. The paper discusses emerging geographic visualization and data mining tools and methodologies that can be used to generate baseline data for environmental initiatives such as urban renewal and rejuvenation. The study demonstrated that crime hotspots can be computed and were observed to occur in select places in the Central Business District (CBD) of Baguio City. It was observed that some characteristics of the hotspot places' physical design and milieu may play an important role in creating opportunities for crime. A list of these environmental attributes was generated. This derived information may be used to guide the design or redesign of the urban environment of the City so as to reduce crime and at the same time improve it physically.
Abstract: As networking has become popular, Web-based learning has become a trend in tool design. Moreover, five-axis machining has recently been widely used in industry; however, it has potential axial-table collision problems. This paper therefore proposes an efficient web-based collision detection tool for five-axis machining. Because collision detection consumes heavy resources that few devices can support, this research uses a systematic, web-based approach to detect collisions. The methodologies include kinematics analyses of five-axis motions, the separating-axis method for collision detection, and computer simulation for verification. The machine structure is modeled in STL format in CAD software. The input to the detection system is the g-code part program, which describes the tool motions that produce the part surface. This research produced a simulation program in the C programming language and demonstrated a five-axis machining example with collision detection on a web site. The system simulates the five-axis CNC motion of the tool trajectory, detects any collisions according to the input g-codes, and also supports a high-performance web service benefiting from C. The results show that our method improves computational efficiency by a factor of 4.5 compared with the conventional detection method.
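The separating-axis method mentioned above can be sketched in 2-D as follows; this is a simplified Python illustration (the paper's implementation is in C and operates on 3-D STL models, so the polygon data here is invented for demonstration):

```python
def project(poly, axis):
    """Project a polygon's vertices onto an axis; return (min, max)."""
    dots = [p[0] * axis[0] + p[1] * axis[1] for p in poly]
    return min(dots), max(dots)

def edge_normals(poly):
    """Yield a normal of each polygon edge (candidate separating axes)."""
    n = len(poly)
    for i in range(n):
        ex = poly[(i + 1) % n][0] - poly[i][0]
        ey = poly[(i + 1) % n][1] - poly[i][1]
        yield (-ey, ex)   # perpendicular to the edge; sign is irrelevant

def convex_collide(a, b):
    """Separating-axis test: two convex polygons overlap iff no edge
    normal of either polygon separates their projections."""
    for axis in list(edge_normals(a)) + list(edge_normals(b)):
        amin, amax = project(a, axis)
        bmin, bmax = project(b, axis)
        if amax < bmin or bmax < amin:
            return False  # found a separating axis: no collision
    return True

sq1 = [(0, 0), (2, 0), (2, 2), (0, 2)]
sq2 = [(1, 1), (3, 1), (3, 3), (1, 3)]   # overlaps sq1
sq3 = [(5, 5), (6, 5), (6, 6), (5, 6)]   # disjoint from sq1
```

The test is exact for convex shapes; concave machine components are typically decomposed into convex pieces first.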
Abstract: This paper investigates a possible optimization of some linear algebra problems that can be solved by parallel processing using special arrays called systolic arrays. We use some special types of transformations for the design of these arrays and show their characteristics. The main focus is on the advantages of these arrays in the parallel computation of matrix products, with a special approach to the design of a systolic array for matrix multiplication. Multiplication of large matrices requires a lot of computational time, and its complexity is O(n^3). Many algorithms (both sequential and parallel) have been developed with the purpose of minimizing the calculation time, and systolic arrays are well suited for this purpose. In this paper we show that using an appropriate transformation leads to more efficient arrays for computations of this type.
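The data flow of a systolic array for matrix multiplication can be sketched as a cycle-by-cycle simulation, where processing element PE(i, j) consumes the operand pair (a[i][k], b[k][j]) at cycle t = i + j + k (an illustrative Python model of the classic design, not the specific transformed arrays of the paper):

```python
def systolic_matmul(A, B):
    """Cycle-by-cycle sketch of an n-by-n systolic array computing C = A*B:
    at cycle t, PE(i, j) receives a[i][k] from the left and b[k][j] from
    above, where k = t - i - j, and accumulates their product."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for t in range(3 * n - 2):            # total pipeline latency in cycles
        for i in range(n):
            for j in range(n):
                k = t - i - j             # operand pair arriving this cycle
                if 0 <= k < n:
                    C[i][j] += A[i][k] * B[k][j]
    return C

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C = systolic_matmul(A, B)
```

Although the simulation still performs n^3 multiplications, on real hardware all PEs on the same anti-diagonal work concurrently, so the wall-clock latency is O(n) cycles rather than O(n^3).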
Abstract: In this paper, biannual time-series data on unemployment rates (from the Labour Force Survey) are expanded to quarterly rates and linked to quarterly unemployment rates (from the Quarterly Labour Force Survey). The resulting linked series and the consumer price index (CPI) series are examined using Johansen's cointegration approach and vector error correction modeling. The study finds that both series are integrated of order one and are cointegrated. A statistically significant cointegrating relationship is found to exist between the time series of unemployment rates and the CPI. Given this significant relationship, the study models it using Vector Error Correction Models (VECM), one with a restriction on the deterministic term and the other with no restriction.
A formal statistical confirmation of the existence of a unique linear and lagged relationship between inflation and unemployment for the period between September 2000 and June 2011 is presented. For the given period, the CPI was found to be an unbiased predictor of the unemployment rate. This relationship can be explored further for the development of appropriate forecasting models incorporating other study variables.
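As a rough illustration of error-correction modeling on a cointegrated pair, the following NumPy sketch fits a two-step error-correction regression to synthetic data (this is a simplified Engle-Granger-style illustration, not the paper's Johansen/VECM estimation, and the series and coefficients are invented):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 400
# synthetic cointegrated pair: x is a random walk, y tracks 0.5*x plus noise
x = np.cumsum(rng.normal(size=T))
y = 0.5 * x + rng.normal(scale=0.3, size=T)

# step 1: estimate the long-run (cointegrating) relation y = beta * x
beta = np.linalg.lstsq(x[:, None], y, rcond=None)[0][0]
ect = y - beta * x                       # error-correction term (disequilibrium)

# step 2: regress the change in y on the lagged disequilibrium
dy = np.diff(y)
Z = np.column_stack([np.ones(T - 1), ect[:-1]])
alpha = np.linalg.lstsq(Z, dy, rcond=None)[0][1]   # adjustment speed
```

A negative adjustment coefficient alpha is the hallmark of cointegration: deviations from the long-run relation are pulled back toward equilibrium in the next period.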
Abstract: In this study we present a formative assessment tool we developed for students' assignments. The tool enables lecturers to define assignments for a course and to assign each problem in each assignment a list of criteria and weights by which the students' work is evaluated. During assessment, the lecturers enter the scores for each criterion together with justifications. When the scores of the current assignment have all been entered, the tool automatically generates reports for both students and lecturers. The students receive a report by email that includes a detailed description of their assessed work, their relative score, and their progress across the criteria along the course timeline. This information is presented via charts generated automatically by the tool from the entered scores. The lecturers receive a report that includes summative (e.g., averages, standard deviations) and detailed (e.g., histogram) data for the current assignment. This information enables the lecturers to follow the class achievements and adjust the learning process accordingly. The tool was examined on two pilot groups of college students studying courses in (1) Object-Oriented Programming and (2) Plane Geometry. Results reveal that most of the students were satisfied with the assessment process and the reports produced by the tool. The lecturers who used the tool were also satisfied with the reports and their contribution to the learning process.
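A minimal sketch of the kind of weighted, criteria-based aggregation such a tool might perform (the criterion names, weights, and scores are invented for illustration; the tool's actual grading formula is not given in the abstract):

```python
def weighted_score(scores, weights):
    """Combine per-criterion scores (0-100) using normalized weights,
    yielding a single grade for one problem of an assignment."""
    total_w = sum(weights.values())
    return sum(scores[c] * w for c, w in weights.items()) / total_w

# hypothetical criteria a lecturer might define for one problem
weights = {"correctness": 0.5, "design": 0.3, "style": 0.2}
scores = {"correctness": 90, "design": 80, "style": 70}
grade = weighted_score(scores, weights)
```

Normalizing by the weight sum keeps the formula valid even when the lecturer's weights do not add up to exactly 1.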
Abstract: Phishing, or the stealing of sensitive information on the web, has dealt a major blow to Internet security in recent times. Most of the existing anti-phishing solutions fail to handle the fuzziness involved in phish detection, leading to a large number of false positives. This fuzziness is attributed to the use of the highly flexible and, at the same time, highly ambiguous HTML language. We introduce a new perspective against phishing that tries to systematically prove whether a given page is phished, using the corresponding original page as the basis of comparison. It analyzes the layout of the pages under consideration to determine the percentage distortion between them, indicative of any form of malicious alteration. The system design represents an intelligent system, employing dynamic assessment, which accurately identifies brand-new phishing attacks and proves effective in reducing the number of false positives. This framework could potentially be used as a knowledge base for educating Internet users against phishing.
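One simple way to quantify percentage distortion between an original page and a suspect page is to compare their HTML tag sequences with a similarity ratio. This is a hedged stdlib sketch of the general idea; the paper's actual layout-comparison algorithm is not specified in the abstract, and the sample pages are invented:

```python
import difflib
import re

def tag_sequence(html):
    """Extract the sequence of opening-tag names as a coarse layout signature."""
    return re.findall(r"<\s*([a-zA-Z][a-zA-Z0-9]*)", html)

def layout_distortion(original_html, suspect_html):
    """Percentage distortion between two pages' tag sequences:
    0 means identical layout, 100 means nothing in common."""
    ratio = difflib.SequenceMatcher(
        None, tag_sequence(original_html), tag_sequence(suspect_html)).ratio()
    return 100.0 * (1.0 - ratio)

orig = "<html><body><form><input><input><button></form></body></html>"
phish = "<html><body><form><input><input><input><button></form></body></html>"
d = layout_distortion(orig, phish)   # small but non-zero: one injected field
```

A page cloned verbatim scores 0, while an attacker's injected form fields or scripts push the distortion above whatever detection threshold is chosen.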
Abstract: Recently there has been growing interest in the field of bio-mimetic robots that resemble the behaviors of insects or aquatic animals, among many others. One of the various bio-mimetic robot applications is exploring pipelines, spotting any troubled areas or malfunctions and reporting the data. Moreover, the robot is able to prepare for and react to any abnormal routes in the pipeline. Special types of mobile robots are necessary for pipeline monitoring tasks. In order to move effectively along a pipeline, the robot's movement resembles that of insects or crawling animals. When situated in massive pipelines with complex routes, the robot places fixed sensors in several important spots in order to complete its monitoring. This monitoring task aims to prevent a major system failure by preemptively recognizing any minor or partial malfunctions. Areas not covered by fixed sensors usually cannot be observed and examined in real time and thus depend on periodic offline monitoring. This paper proposes a monitoring system that is able to monitor the entire area of a pipeline, with and without fixed sensors, by using the bio-mimetic robot.
Abstract: The emergence of smartphones brings to life the concept of converged devices with the availability of web amenities. This trend also challenges mobile device manufacturers and service providers in many aspects, such as security on mobile phones, complex and lengthy design flows, and higher development cost. Among these aspects, security on mobile phones is receiving more and more attention. Microkernel-based virtualization technology will play a critical role in addressing these challenges and meeting mobile market needs and preferences, since virtualization provides the isolation essential for security and allows multiple operating systems to run on one processor, accelerating development and cutting development cost. However, the benefits of virtualization do not come for free. As an additional software layer, it adds some inevitable virtualization overhead to the system, which may decrease system performance. In this paper we evaluate and analyze the virtualization performance cost of L4 microkernel-based virtualization on a competitive mobile phone by comparing L4Linux, a para-virtualized Linux running on top of the L4 microkernel, with native Linux performance using lmbench and a set of typical mobile phone applications.
Abstract: Complex engineering design problems consist of numerous factors of varying criticality. Considering fundamental features of the design and minor details alike results in an extensive waste of time and effort. Design parameters should instead be introduced gradually, based on their significance in the problem context. This motivates representing design parameters at multiple levels of an abstraction hierarchy. However, developing abstraction hierarchies is an area that is not well understood. Our research proposes a novel hierarchical abstraction methodology for planning effective engineering designs and processes. It provides a theoretically sound foundation for representing, abstracting and stratifying engineering design parameters and tasks according to causality and criticality. The methodology creates abstraction hierarchies in a recursive, bottom-up manner that guarantees no backtracking across any of the abstraction levels. It consists of three main phases: representation, abstraction, and layering into multiple hierarchical levels. The effectiveness of the developed methodology is demonstrated on a design problem.
Abstract: Concerns about low levels of children's physical activity and motor skill development prompted the Ministry of Education to trial a physical activity pilot project (PAPP) in 16 New Zealand primary schools. The project comprised professional development and training in physical education for lead teachers and introduced four physical activity coordinators to liaise with the pilot schools and increase physical activity opportunities in them. A survey of generalist teachers (128 at baseline, 155 post-intervention) from these schools looked at timetabled physical activity sessions and issues related to teaching physical education. The authors calculated means and standard deviations of the data relating to timetabled PE sessions and used a one-way analysis of variance to determine significant differences. Results indicated that the time devoted to physical-activity-related subjects significantly increased over the course of the intervention. Teachers reported improved confidence and competence, which resulted in quality physical education being delivered more often.
Abstract: Loess soils are unsaturated soils that lose much of their volume upon saturation, undergoing sudden settlement, fracture and structural cracking as humidity increases. Given the importance of civil projects such as dams, canals and structures founded on this type of soil, and the problems they entail, further research and study of loess soils is required. This research studies shear strength parameters using grading tests, Atterberg limits, compaction, direct shear and consolidation tests, and then examines the effect of cement and lime additives on the stability of loess soils. In the related tests, lime and cement are separately added to the soil at different mixing percentages, the stabilized samples are cured for different times, and the effect of the aforesaid additives on the shear strength parameters of the soil is studied. Results show that over curing time the additives greatly decrease the collapse potential, and that with an increasing percentage of cement and lime the maximum dry density decreases while the optimum moisture content increases. In addition, the liquid limit and plasticity index decrease while the plastic limit increases. It is to be noted that the results of the direct shear tests reveal an increase in the shear strength of the soil due to increases in the cohesion parameter and the soil friction angle.
Abstract: Time-varying network-induced delays in networked control systems (NCS) are known to degrade a control system's quality of performance (QoP) and cause stability problems. In the literature, a control method that models communication delays as a probability distribution has proved to be a better approach. This paper focuses on modeling network-induced delays as probability distributions.
CAN and MIL-STD-1553B are extensively used to carry periodic control and monitoring data in networked control systems, yet in the literature only methods to estimate the worst-case delays for these networks are available. In this paper, probabilistic network delay models for CAN and MIL-STD-1553B networks are given.
A systematic method to estimate the model parameter values from network parameters is given, and a method to predict the network delay in the next cycle from the present network delay is presented. The effect of active network redundancy, and of redundancy at the node level, on network delay and system response time is also analyzed.
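To illustrate how the parameters of a probabilistic delay model might be estimated from observed network delays, here is a sketch that fits a shifted-exponential model by the method of moments; the distribution family, the parameter values and all names are illustrative assumptions, not the paper's actual CAN or MIL-STD-1553B models:

```python
import random

def fit_shifted_exponential(delays):
    """Fit d = d_min + Exp(rate) to observed delays: d_min is the smallest
    observed delay (e.g. the uncontended transmission time), and the rate
    is the reciprocal of the mean excess over d_min."""
    d_min = min(delays)
    mean_excess = sum(d - d_min for d in delays) / len(delays)
    return d_min, 1.0 / mean_excess

# synthetic delay samples: 2.0 ms floor plus exponential queueing jitter
rng = random.Random(1)
true_min, true_rate = 2.0, 5.0            # ms and 1/ms (assumed values)
sample = [true_min + rng.expovariate(true_rate) for _ in range(5000)]
d_min, rate = fit_shifted_exponential(sample)
```

Once fitted, such a model lets a controller reason about delay quantiles for the next cycle instead of designing only for the worst case.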
Abstract: A lateral trench-gate power metal-oxide-semiconductor field-effect transistor on 4H-SiC is proposed. The device consists of two separate trenches in which two gates are placed on either side of the P-body region, resulting in two parallel channels. Enhanced current conduction and the reduced-surface-field effect in the structure provide a substantial improvement in device performance. Using two-dimensional simulations, the performance of the proposed device is evaluated and compared with that of the conventional device for the same cell pitch. It is demonstrated that the proposed structure provides two times higher output current, an 11% decrease in threshold voltage, a 70% improvement in transconductance, a 70% reduction in specific ON-resistance, a 52% increase in breakdown voltage, and a nearly eight-fold improvement in figure-of-merit over the conventional device.
Abstract: The world's largest Pre-stressed Concrete Cylinder Pipe (PCCP) water supply project suffered a series of pipe failures between 1999 and 2001. This led the Man-Made River Authority (MMRA), the authority in charge of the implementation and operation of the project, to set up a rehabilitation plan for the conveyance system while maintaining the uninterrupted flow of water to consumers. At the same time, MMRA recognized the need for a long-term management tool that would facilitate repair and maintenance decisions and enable taking appropriate preventive measures through continuous monitoring and estimation of the remaining life of each pipe. This management tool is known as the Pipe Risk Management System (PRMS) and is now in operation at MMRA. Both the rehabilitation plan and the PRMS require the availability of complete and accurate pipe construction and manufacturing data.
This paper describes a systematic approach to the collection, analysis, evaluation and correction of the construction and manufacturing data files of the phase I pipes, which are the platform for the PRMS database and any other related decision support system.