Abstract: The rapid development of information technology, the
expansion of communications and the internet, city managers' need
for new ideas to run the city, and the demand for greater citizen
participation all encourage the completion of the electronic
city as soon as possible. The foundations of this electronic city are in
information technology. People's participation in metropolitan
management is a crucial topic. Information technology does not
impede this matter; it can improve public participation and
the interactions between the citizens and the city managers.
Citizens can offer their ideas, opinions and votes on topical
matters through internet-based digital mass media and computer
networks, and receive appropriate replies and services. They
can participate in urban projects by becoming aware of the city's
views. The most significant challenges are as follows: information
and communication management, changing citizens' views, and
legal and administrative documents.
Obstacles to the electronic city have been identified in this research.
The required data were gathered through questionnaires to identify the
barriers, from a statistical population comprising specialists and
practitioners of the Ministry of Information Technology and
Communication and the municipal information technology
organization.
The conclusions demonstrate that the prioritized barriers to
electronic city implementation in Iran are as follows:
support problems (non-financial ones); behavioral, cultural
and educational problems; security, legal and licensing problems;
hardware, terminological and infrastructural constraints; and software
and financial problems.
Abstract: Scouring around a bridge pier is a complex
phenomenon. More laboratory experiments are required to
understand the scour mechanism. This paper focused on time
development of local scour around piers and piles in semi integral
bridges. Laboratory data collected at the Hydraulics Laboratory,
University of Malaya, were analyzed for this purpose. Tests were
performed with two different uniform sediment sizes and five ranges
of flow velocities. Fine and coarse sediments were tested in the
flume. Results showed that scour depths at both the pier and the piles
increased with time up to certain levels, after which they became
almost constant. Scour depths were found to increase as discharge
increased, and the coarser sediment produced less scouring
at the piers and combined piles.
Abstract: This paper proposes an innovative methodology for
Acceptance Sampling by Variables, a particular category of
Statistical Quality Control dealing with the assurance of product
quality. Our contribution lies in the exploitation of machine learning
techniques to address the complexity and remedy the drawbacks of
existing approaches. More specifically, the proposed methodology
exploits Artificial Neural Networks (ANNs) to aid decision making
about the acceptance or rejection of an inspected sample. For any
type of inspection, the ANNs are trained with data from the corresponding
tables of a standard's sampling plan schemes. Once trained, the ANNs
can give closed-form solutions for any acceptance quality level and
sample size, thus automating the reading of the sampling plan
tables without being restricted to the tabulated values of the
specific standard chosen each time. The proposed
methodology provides enough flexibility to quality control engineers
during the inspection of their samples, allowing the consideration of
specific needs, while it also reduces the time and the cost required for
these inspections. Its applicability and advantages are demonstrated
through two numerical examples.
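The decision the trained ANNs reproduce is the standard accept/reject rule of variables sampling. As a minimal sketch of that underlying rule (not of the ANN itself), assuming a single-sided upper specification limit and an acceptability constant k read from a plan table for the chosen acceptance quality level and sample size (the numbers below are hypothetical):

```python
import statistics

def accept_lot(measurements, usl, k):
    """Variables-sampling decision ('s' method, upper spec only):
    accept the lot when (USL - mean) / s >= k, where k is the
    acceptability constant tabulated for the chosen AQL and
    sample size."""
    xbar = statistics.mean(measurements)
    s = statistics.stdev(measurements)  # sample standard deviation
    return (usl - xbar) / s >= k
```

An ANN trained on the plan tables would, in effect, supply k (or the decision directly) as a smooth function of acceptance quality level and sample size, instead of a discrete table lookup.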
Abstract: As a company's competitiveness depends more and more on its relationships with stakeholders, the topic of company-stakeholder fit is becoming increasingly important. This fit affects the extent to which a stakeholder perceives the company's CSR commitment, values and behaviors and, therefore, the stakeholder's identification with the company and his/her loyalty to it. Consequently, it is important to measure the alignment, or the gap, between stakeholders' CSR demands, values, preferences and perceptions on the one hand, and the company's disclosed CSR commitment, values and policies on the other. In this paper, an innovative CSR fit positioning matrix is proposed to assess company-stakeholder fit with respect to corporate responsibility. This matrix is based on the measurement of the company's disclosed CSR commitment and the stakeholders' perceived and required commitment. The matrix is part of a broader methodology based on Global Reporting Initiative (GRI) indicators, content analysis and stakeholder questionnaires. This methodology provides appropriate indications for helping companies achieve CSR company-stakeholder fit by leveraging both CSR commitment and communication. Moreover, it can be used by top management to compare different companies and stakeholders, and to plan specific CSR strategies, policies and activities.
Abstract: Higher productivity and lower cost in the ship
manufacturing process are required to maintain the international
competitiveness of modern manufacturing industries. In shipbuilding,
however, the Engineering To Order (ETO) production method makes
the production process very difficult to manage: designs change
frequently, and production planning must be adjusted as
circumstances change, so fixed production planning is impractical.
A scheduler must therefore first make rough plans, then
revise the plans based on the work progress and modifications.
Data sharing in a shipbuilding block assembly shop is thus very
important. In this paper, we propose a scheduling method
applicable to the shipbuilding industry and a decision-making
support system based on a web-based visualization system.
Abstract: Recently, as the scale of construction projects has
increased, more ground excavation for foundations is carried out than ever before. Consequently, damage to underground ducts (gas, water/sewage or oil pipelines, communication cables or power cable ducts) or to superannuated pipelines frequently causes serious accidents
resulting in damage to life and property. (In Korea, the total length of city water pipelines was approximately 2,000 km as of the end of 2009.) In addition, extensive damage from fractures and from water
and gas leakage, due to superannuation or to damage inflicted on
underground ducts during construction, has been reported. Therefore, a system is required to precisely detect defects and deterioration in underground
pipelines and the locations of such defects, for timely and accurate
maintenance or replacement of the ducts. In this study, a system was
developed which can locate underground structures (gas and water
pipelines, power cable ducts, etc.) in 3D-coordinates and monitor the
degree and position of defects using an Inertial Measurement Unit
(IMU) sensing technique. The system can prevent damage to underground ducts and superannuated pipelines during construction,
and provide reliable data for maintenance. The utility, in civil applications, of the IMU sensing technique commonly used in aircraft and ships was
thereby verified.
Abstract: Decisions are made regularly during a project and in
daily life. Some decisions are critical and have a direct impact on
the success of the project or the person. Formal evaluation is thus
required, especially for crucial decisions, to arrive at the optimal
solution among the alternatives that address an issue. According to
microeconomic theory, all of a person's decisions can be modeled as
indifference curves. The proposed approach supports formal analysis
and decision making by constructing an indifference curve model from
previous experts' decision criteria. The knowledge embedded in the
system can be reused, or can help naïve users select an alternative
solution for a similar problem. Moreover, the method is flexible
enough to cope with an unlimited number of factors influencing the
decision-making. In preliminary experiments, the selected
alternatives accurately matched the experts' decisions.
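As a toy illustration of the idea, assuming a Cobb-Douglas utility function (the paper instead constructs the curves from experts' decision criteria; the functional form and weights here are hypothetical):

```python
def utility(x, y, alpha=0.5):
    """Cobb-Douglas utility U = x^alpha * y^(1 - alpha); alternatives
    with equal utility lie on the same indifference curve."""
    return (x ** alpha) * (y ** (1 - alpha))

def prefer(a, b, alpha=0.5):
    """Pick the alternative lying on the higher indifference curve."""
    return a if utility(*a, alpha) >= utility(*b, alpha) else b
```

Ranking alternatives then reduces to comparing which indifference curve each one sits on, however many factors enter the utility function.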
Abstract: The sensitivity of orifice plate metering to disturbed
flow (either asymmetric or swirling) is a subject of great concern to
flow meter users and manufacturers. The distortions caused by pipe
fittings and pipe installations upstream of the orifice plate are major
sources of this type of non-standard flows. These distortions can alter
the accuracy of metering to an unacceptable degree. In this work, a
multi-scale object known as metal foam has been used to generate a
predetermined turbulent flow upstream of the orifice plate. The
experimental results showed that the combination of an orifice plate
and metal foam flow conditioner is broadly insensitive to upstream
disturbances. This metal foam demonstrated a good performance in
terms of removing swirl and producing a repeatable flow profile
within a short distance downstream of the device. The results of using
a combination of a metal foam flow conditioner and an orifice plate
under non-standard flow conditions, including swirling and asymmetric
flow, show that this package can preserve the accuracy of metering at
the level required by the standards.
Abstract: Testable software has two inherent properties – observability and controllability. Observability facilitates observation of the internal behavior of software to the required degree of detail. Controllability allows creation of difficult-to-achieve states prior to the execution of various tests. In this paper, we describe COTT, a Controllability and Observability Testing Tool, to create testable object-oriented software. COTT provides a framework that helps the user instrument object-oriented software to build in the required controllability and observability. During testing, the tool facilitates the creation of difficult-to-achieve states required for testing difficult-to-test conditions, and the observation of internal details of execution at the unit, integration and system levels. The execution observations are logged in a test log file, which is used for post analysis and to generate test coverage reports.
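COTT targets object-oriented software through its own instrumentation framework; as a language-neutral sketch of the two properties themselves, here is a hypothetical Python analogy in which a decorator provides observability (execution logging) and a state-setting hook provides controllability — all names are illustrative, not COTT's API:

```python
import functools

test_log = []  # stands in for COTT's test log file

def observable(func):
    """Observability: log entry/exit of instrumented methods."""
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        test_log.append(f"enter {func.__name__}")
        result = func(*args, **kwargs)
        test_log.append(f"exit {func.__name__} -> {result!r}")
        return result
    return wrapper

class Account:
    def __init__(self):
        self.balance = 0

    @observable
    def deposit(self, amount):
        self.balance += amount
        return self.balance

    def force_state(self, balance):
        """Controllability: jump straight to a hard-to-reach state."""
        self.balance = balance
```

A test can now force a difficult-to-achieve state before exercising a method, and afterwards inspect the log for post analysis.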
Abstract: Biomimicry has many potential benefits as many
technologies found in nature are superior to their man-made
counterparts. As technological device components approach the micro
and nanoscale, surface properties such as surface adhesion and friction
may need to be taken into account. Lowering surface adhesion by
manipulating chemistry alone might no longer be sufficient for such
components and thus physical manipulation may be required.
Adhesion reduction is only one of the many surface functions
displayed by micro/nano-structured cuticles of insects. Here, we
present a mini review of our understanding of insect cuticle structures
and the relationship between the structure dimensions and the
corresponding functional mechanisms. It may be possible to introduce
additional properties to material surfaces (indeed multi-functional
properties) based on the design of natural surfaces.
Abstract: Cognitive models allow predicting some aspects of utility
and usability of human machine interfaces (HMI), and simulating
the interaction with these interfaces. The action of predicting is based
on a task analysis, which investigates what a user is required to do
in terms of actions and cognitive processes to achieve a task. Task
analysis facilitates the understanding of the system's functionalities.
Cognitive models belong to the analytical approaches, which do not
involve users during the development process of the interface.
This article presents a study of the evaluation of human
machine interaction with a contextual assistant's interface using the
ACT-R and GOMS cognitive models. The present work shows how these
techniques may be applied in HMI evaluation, design and
research, emphasizing first the task analysis and second the
execution time of the task. In order to validate and support our
results, an experimental study of user performance was conducted at
the DOMUS laboratory during interaction with the contextual
assistant's interface. Our results show that the GOMS
and ACT-R models give good and excellent predictions, respectively,
of users' performance at the task level as well as the object level:
the simulated results are very close to those obtained
in the experimental study.
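The execution-time side of such predictions can be sketched with the Keystroke-Level Model simplification of GOMS, using the published average operator times (these are the textbook values from Card, Moran and Newell, not parameters fitted to the DOMUS experiment):

```python
# Standard Keystroke-Level Model operator times in seconds
# (published averages, not values from this study).
KLM_TIMES = {
    "K": 0.28,  # keystroke (average typist)
    "P": 1.10,  # point at a target with the mouse
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predict_time(operators):
    """Sum operator times for a KLM-style task description,
    e.g. "MPK" = think, point, click."""
    return sum(KLM_TIMES[op] for op in operators)
```

A full GOMS or ACT-R analysis decomposes the task far more finely, but the prediction principle — summing the durations of elementary operators — is the same.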
Abstract: In today's new technology era, clusters have become a
necessity for modern computing and data applications, since many
applications take a long time (even days or months) for computation.
Although parallelization speeds up computation, the time required
by many applications can still be considerable. The reliability of
the cluster therefore becomes a very important issue, and the
implementation of a fault tolerance mechanism becomes essential. The
difficulty of designing a fault tolerant cluster system increases
with the variety of possible failures. The key requirement is that
an algorithm which handles a simple failure in the system must also
tolerate more severe failures. In this paper, we implement the
watchdog timer concept in a parallel environment to take care of
failures. Implementing this simple algorithm in our project allows
us to handle different types of failures; consequently, we found
that the reliability of the cluster improves.
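A minimal sketch of the watchdog-timer idea, assuming a heartbeat-based design (the class name, timeout and handler below are illustrative; the paper's cluster implementation is not reproduced here):

```python
import threading
import time

class Watchdog:
    """Restart-on-silence watchdog: if no heartbeat arrives within
    `timeout` seconds, the failure handler is invoked once."""
    def __init__(self, timeout, on_failure):
        self.timeout = timeout
        self.on_failure = on_failure
        self._timer = None

    def heartbeat(self):
        """Called periodically by the monitored node; rearms the timer."""
        if self._timer:
            self._timer.cancel()
        self._timer = threading.Timer(self.timeout, self.on_failure)
        self._timer.daemon = True
        self._timer.start()

    def stop(self):
        if self._timer:
            self._timer.cancel()
```

In a cluster, one such watchdog per node lets a master detect a silent node and trigger recovery (restart, migration) regardless of how the node failed.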
Abstract: Recently, electric vehicles have become popular as an
alternative to conventional fossil fuel vehicles. A conventional
Internal Combustion Engine (ICE) vehicle uses fossil fuel, which
contributes a major part of overall carbon emissions in the
environment. Carbon and other greenhouse gas emissions are
responsible for global warming and the resulting climate change. It
has become vital to evaluate vehicle performance based on emissions.
In this paper an effort has been made to depict the picture of
vehicle emissions, taking the scenario of Australia into account,
and to compare fossil fuel based vehicles with electric vehicles in phases. The
study also evaluates advancement in electric vehicle technology,
required infrastructure for sustainability and future scope of
developments. This paper also includes the evaluation of electric
vehicle concept for pollution control and sustainable transport
systems in future. This study can be a benchmark for development of
electric vehicle as low carbon emission alternative for the cities of
tomorrow.
Abstract: While compressing text files is useful, compressing
still image files is almost a necessity. A typical image takes up much
more storage than a typical text message and without compression
images would be extremely clumsy to store and distribute. The
amount of information required to store pictures on modern
computers is quite large in relation to the amount of bandwidth
commonly available to transmit them over the Internet and
applications. Image compression addresses the problem of reducing
the amount of data required to represent a digital image. Performance
of any image compression method can be evaluated by measuring the
root-mean-square-error & peak signal to noise ratio. The method of
image compression that will be analyzed in this paper is based on the
lossy JPEG image compression technique, the most popular
compression technique for color images. JPEG compression is able to
greatly reduce file size with minimal image degradation by throwing
away the least "important" information. In standard JPEG, both chroma
components are downsampled simultaneously, but in this paper we
compare the results when the compression is done by
downsampling a single chroma component. We
demonstrate that a higher compression ratio is achieved when the
chrominance blue is downsampled than when the
chrominance red is downsampled, but that the peak signal to noise
ratio is higher when the chrominance red is downsampled than when
the chrominance blue is downsampled. In
particular, we use the image hats.jpg as a demonstration of JPEG
compression using a low pass filter, and show that the image is
compressed with barely any visual differences under both methods.
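The two evaluation metrics named above can be sketched as follows for flattened 8-bit pixel sequences (a minimal sketch; real implementations operate on 2-D image arrays):

```python
import math

def rmse(original, compressed):
    """Root-mean-square error between two equal-length pixel sequences."""
    n = len(original)
    return math.sqrt(sum((o - c) ** 2 for o, c in zip(original, compressed)) / n)

def psnr(original, compressed, max_val=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit images:
    PSNR = 20 * log10(MAX / RMSE)."""
    e = rmse(original, compressed)
    if e == 0:
        return float("inf")  # identical images
    return 20 * math.log10(max_val / e)
```

Comparing the two downsampling choices then reduces to computing file size (compression ratio) alongside PSNR against the original image.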
Abstract: The utilization of cheese whey as a fermentation
substrate to produce bio-ethanol is an effort to supply bio-ethanol
demand as a renewable energy. Like other process systems, modeling
is also required for fermentation process design, optimization and
plant operation. This research aims to study the fermentation process
of cheese whey by applying mathematics and fundamental concept in
chemical engineering, and to investigate the characteristic of the
cheese whey fermentation process. Steady state simulation results for
inlet substrate concentrations of 50, 100 and 150 g/l, over a range
of hydraulic retention times, showed that the maximum ethanol
productivities were 0.1091, 0.3163 and 0.5639 g/l.h
respectively. Those values were achieved at a hydraulic retention
time of 20 hours, the minimum value used in this modeling,
showing that operating the reactor at low hydraulic retention time
is favorable. A model of bio-ethanol production from cheese whey
will enhance the understanding of what really happens in the
fermentation process.
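The qualitative trend reported above — productivity rising as hydraulic retention time falls, until washout — can be reproduced with a generic steady-state chemostat sketch assuming Monod kinetics; all kinetic and yield parameters below are illustrative placeholders, not the values fitted in this study:

```python
def steady_state_productivity(S_in, hrt, mu_max=0.3, Ks=5.0, Yps=0.45):
    """Steady-state chemostat with Monod kinetics (illustrative
    parameters). Dilution rate D = 1/HRT; at steady state mu = D,
    so residual substrate S = Ks*D/(mu_max - D); ethanol from the
    product yield Yps. Returns productivity in g/(l.h)."""
    D = 1.0 / hrt
    if D >= mu_max:           # washout: cells cannot grow fast enough
        return 0.0
    S = Ks * D / (mu_max - D)
    S = min(S, S_in)          # residual substrate cannot exceed the feed
    P = Yps * (S_in - S)      # ethanol concentration, g/l
    return P * D
```

With these placeholder parameters the sketch shows the same shape as the reported results: productivity increases as HRT decreases, until the dilution rate exceeds the maximum growth rate and the culture washes out.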
Abstract: In comparison to the original SVM, which involves a
quadratic programming task, LS-SVM simplifies the required
computation, but unfortunately the sparseness of the standard SVM is
lost. Another problem is that LS-SVM is only optimal if the training
samples are corrupted by Gaussian noise. In Least Squares SVM
(LS-SVM), the nonlinear solution is obtained by first mapping the
input vector to a high dimensional kernel space in a nonlinear
fashion, where the solution is calculated from a linear equation set.
In this paper a geometric view of the kernel space is introduced,
which enables us to develop a new formulation that achieves a sparse
and robust estimate.
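For reference, the linear equation set that replaces the QP in standard LS-SVM training can be sketched as follows (RBF kernel assumed; this is the classical formulation whose lost sparseness — every alpha nonzero — the abstract refers to, not the paper's new sparse robust estimate):

```python
import numpy as np

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual linear system
        [ 0        1^T       ] [b    ]   [0]
        [ 1   K + I / gamma  ] [alpha] = [y]
    with an RBF kernel K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    n = len(y)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-d2 / (2 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]          # bias b, coefficients alpha

def lssvm_predict(X, Xtrain, b, alpha, sigma=1.0):
    d2 = ((X[:, None, :] - Xtrain[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2)) @ alpha + b
```

Because the solution comes from a dense linear system rather than a QP with inequality constraints, essentially all training points receive nonzero alpha — exactly the non-sparseness the proposed formulation aims to remedy.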
Abstract: Software effort estimation is the process of predicting
the most realistic use of effort required to develop or maintain
software based on incomplete, uncertain and/or noisy input. Effort
estimates may be used as input to project plans, iteration plans,
budgets. Various models, such as the Halstead, Walston-Felix,
Bailey-Basili, Doty and GA based models, have already been used
to estimate software effort for projects. In this study, statistical
models, a Fuzzy-GA hybrid and Neuro-Fuzzy (NF) inference systems are
used to estimate the software effort for projects. The
performance of the developed models was tested on NASA
software project datasets, and the results were compared with the
Halstead, Walston-Felix, Bailey-Basili, Doty and Genetic Algorithm
based models reported in the literature. The NF model shows the best
results, with the lowest MMRE and RMSE values, compared with the
Fuzzy-GA based hybrid inference system and the other existing models
used for effort prediction.
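The two comparison metrics used above are standard in effort-estimation studies and can be sketched directly (a minimal sketch over paired actual/predicted effort values):

```python
import math

def mmre(actual, predicted):
    """Mean Magnitude of Relative Error:
    mean of |actual - predicted| / actual over all projects."""
    return sum(abs(a - p) / a for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    """Root-mean-square error over all projects."""
    return math.sqrt(sum((a - p) ** 2
                         for a, p in zip(actual, predicted)) / len(actual))
```

Lower values of both metrics indicate a better estimator, which is the sense in which the NF model outperforms the others here.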
Abstract: In this work, ionic liquids (ILs) for CO2 capture in a typical absorption/stripper process are considered. The use of ionic liquids is considered to be cost-effective because it requires less energy for solvent recovery compared to other conventional processes. A mathematical model is developed for the process based on the Peng-Robinson (PR) equation of state (EoS), which is validated with experimental data for various solutions involving CO2. The model is utilized to study the sorbent and energy demand for three types of ILs at specific CO2 capture rates. The energy demand is manifested by the vapor-liquid equilibrium temperature necessary to remove the captured CO2 from the used solvent in the regeneration step. It is found that a higher recovery temperature is required for solvents with a higher solubility coefficient. For all ILs, the temperature requirement is lower than that of the typical monoethanolamine (MEA) solvent. The effect of the CO2 loading in the sorbent stream on the process performance is also examined.
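The Peng-Robinson EoS at the core of such a model can be sketched as follows; the critical constants used in the example are the standard values for CO2, while the process-specific IL parameters from the paper are not reproduced:

```python
import math
import numpy as np

R = 8.314  # universal gas constant, J/(mol.K)

def pr_compressibility(T, P, Tc, Pc, omega):
    """Largest real root (vapor/supercritical) of the Peng-Robinson
    cubic  Z^3 - (1-B)Z^2 + (A - 3B^2 - 2B)Z - (AB - B^2 - B^3) = 0,
    with a = 0.45724 R^2 Tc^2/Pc, b = 0.07780 R Tc/Pc and the
    standard alpha(T) correlation."""
    a = 0.45724 * R ** 2 * Tc ** 2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega ** 2
    alpha = (1 + kappa * (1 - math.sqrt(T / Tc))) ** 2
    A = a * alpha * P / (R * T) ** 2
    B = b * P / (R * T)
    roots = np.roots([1.0, -(1 - B), A - 3 * B ** 2 - 2 * B,
                      -(A * B - B ** 2 - B ** 3)])
    return max(z.real for z in roots if abs(z.imag) < 1e-9)
```

At low pressure the compressibility factor approaches the ideal-gas value of 1, while near the critical region it drops well below 1, which is the non-ideality a PR-based capture model accounts for.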
Abstract: Principal Component Analysis (PCA) has many
different important applications especially in pattern detection
such as face detection / recognition. Therefore, for real time
applications, the response time is required to be as small as
possible. In this paper, new implementation of PCA for fast
face detection is presented. Such new implementation is
designed based on cross correlation in the frequency domain
between the input image and eigenvectors (weights).
Simulation results show that the proposed implementation of
PCA is faster than the conventional one.
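The core of the frequency-domain trick is the cross-correlation theorem: correlating in the spatial domain becomes an element-wise product of spectra. A minimal sketch, with a single template standing in for one eigenvector and circular correlation assumed:

```python
import numpy as np

def freq_domain_correlation(image, template):
    """Cross-correlate `template` against `image` via the frequency
    domain:  corr = IFFT( FFT(image) * conj(FFT(template)) ).
    The correlation is circular; the peak marks the best match."""
    F = np.fft.fft2(image)
    H = np.fft.fft2(template, s=image.shape)  # zero-pad to image size
    return np.real(np.fft.ifft2(F * np.conj(H)))
```

In an eigenface-style detector, one such correlation per eigenvector (weight image) replaces a sliding-window projection, which is what makes the frequency-domain implementation faster than the conventional one.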
Abstract: The purpose of this study was to investigate the current status of support services for students with special education needs (SEN) at colleges and universities in Taiwan. Seventy-two colleges and universities received a questionnaire on their resource room operation processes, and four resource room staff members, each from a different area, were interviewed using semi-structured interview forms. The main findings were: (1) most colleges and universities did offer sufficient administrative resources; (2) more effort on prevention for SEN students and on establishing disability awareness should be made for all campus faculty; (3) more comprehensive services were required to help students make a better transition into post-school life; (4) most schools provided basic administrative resources, but the quality of the resource room programs needed to be enhanced; and (5) most resource room staff lacked professional knowledge in counseling SEN students, which needs to be strengthened in the future.