Abstract: Intellectual capital is one of the most valuable and
important parts of the intangible assets of enterprises, especially
knowledge-based enterprises. Given the widening gap between
the market value and the book value of companies, intellectual
capital is one of the components that may account for this gap. This
paper uses the value added efficiency of three components,
capital employed, human capital and structural capital, to measure the
intellectual capital efficiency of Iranian industry groups listed on
the Tehran Stock Exchange (TSE), using an eight-year data set
covering 2005 to 2012. To analyze the effect of intellectual
capital on the market-to-book value ratio of the companies, the data
set was divided into 10 industries, Banking, Pharmaceutical, Metals
& Mineral Nonmetallic, Food, Computer, Building, Investments,
Chemical, Cement and Automotive, and the panel data method was
applied to estimate a pooled OLS model. The results showed that the value
added of capital employed has a significant positive relation with
increasing market value in the Banking, Metals & Mineral
Nonmetallic, Food, Computer, Chemical and Cement industries, and
that the value added efficiency of structural capital has a significant
positive relation with increasing market value in the Banking,
Pharmaceutical and Computer industry groups. The value
added itself showed a negative relation in the Banking and
Pharmaceutical industry groups and a positive relation in the
Computer and Automotive industry groups. Among the studied
industries, the computer industry exhibits the widest gap between the
market value and book value, reflecting its intellectual capital.
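The three value added efficiency components named above are conventionally computed within the VAIC framework; a minimal sketch under the standard VAIC definitions (the paper may use a variant), with hypothetical figures:

```python
def vaic_components(value_added, capital_employed, human_capital):
    """Value added efficiency ratios in the standard VAIC framework.

    value_added      : operating profit + employee costs (VA)
    capital_employed : book value of net assets (CE)
    human_capital    : total employee expenditure (HC)
    """
    structural_capital = value_added - human_capital  # SC = VA - HC
    vaca = value_added / capital_employed             # capital employed efficiency
    vahu = value_added / human_capital                # human capital efficiency
    stva = structural_capital / value_added           # structural capital efficiency
    return vaca, vahu, stva

# hypothetical firm-year observation; the study regresses the
# market-to-book ratio on ratios like these, per industry group
vaca, vahu, stva = vaic_components(value_added=120.0,
                                   capital_employed=400.0,
                                   human_capital=80.0)
```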
Abstract: Accounting for 40% of total world energy consumption,
building systems are developing into technically complex, large
energy consumers suited to sophisticated power
management approaches that can substantially increase energy efficiency
and even make buildings active energy market participants. A centralized
building heating and cooling control system managed by
economically-optimal model predictive control shows promising
results, with an estimated 30% increase in energy efficiency. The research
focuses on implementing such a method in a case study
performed on two floors of our faculty building, with corresponding
wireless sensor data acquisition, remote heating/cooling units and a
central climate controller. Building walls are mathematically modeled
with their corresponding material types, surface shapes and sizes. The models
are then exploited to predict thermal characteristics and changes in
different building zones. Exterior influences such as environmental
conditions and weather forecasts, occupant behavior and comfort
demands are all taken into account in deriving price-optimal climate
control. Finally, a DC microgrid with photovoltaics, a wind turbine, a
supercapacitor, batteries and fuel cell stacks is added to make the
building capable of active participation in a price-varying
energy market. The computational burden of applying model predictive
control to such a complex system is relaxed through a hierarchical
decomposition of the microgrid and climate control: the
former is designed as the higher hierarchical level with pre-calculated
price-optimal power flow control, and the latter as the lower-level
control responsible for ensuring thermal comfort and exploiting
the optimal supply conditions enabled by microgrid energy flow
management. Such an approach is expected to enable the inclusion
of more complex building subsystems in order to
further increase the energy efficiency.
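The wall and zone models described above can be illustrated with a first-order resistance-capacitance (RC) sketch, the simplest form of the thermal models used for prediction in building MPC; the R, C and boundary values below are hypothetical, not the case-study building's identified parameters:

```python
def simulate_zone(T0, T_out, heat_w, R=0.05, C=2.0e6, dt=60.0, steps=60):
    """Discrete-time first-order RC model of one building zone:

        C * dT/dt = (T_out - T) / R + q_heat

    R [K/W] lumps wall conduction for given materials and surface sizes,
    C [J/K] lumps the zone's thermal mass (both hypothetical here).
    Returns the zone temperature after steps * dt seconds.
    """
    T = T0
    for _ in range(steps):
        T += dt / C * ((T_out - T) / R + heat_w)
    return T

# one hour of free cooling versus one hour with 5 kW of heating
T_free = simulate_zone(T0=21.0, T_out=0.0, heat_w=0.0)
T_heated = simulate_zone(T0=21.0, T_out=0.0, heat_w=5000.0)
```

An MPC controller would embed such a model as a prediction constraint and optimize the heat input sequence against a price signal.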
Abstract: Recently, traffic monitoring has attracted the attention
of computer vision researchers. Many algorithms have been
developed to detect and track moving vehicles. In fact, vehicle
tracking in daytime and in nighttime cannot be approached with the
same techniques, owing to the extremely different illumination conditions.
Consequently, traffic-monitoring systems need a
component to differentiate between daytime and nighttime scenes. In
this paper, an HSV-based day/night detector is proposed for traffic
monitoring scenes. The detector employs the hue-histogram and the
value-histogram on the top half of the image frame. Experimental
results show that the extraction of the brightness features along with
the color features within the top region of the image is effective for
classifying traffic scenes. In addition, the detector achieves high
precision and recall rates and is feasible for real-time
applications.
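The idea of the detector can be sketched as follows. This toy version uses only the brightness (the HSV value channel, i.e. the per-pixel maximum of R, G, B) over the top half of the frame, where sky and ambient light dominate; the fixed threshold is a hypothetical stand-in for the paper's decision rule, which also uses the hue histogram:

```python
import numpy as np

def classify_day_night(rgb_frame, v_threshold=100.0):
    """Toy day/night classifier: average the HSV 'value' channel over the
    TOP HALF of the frame and threshold it. rgb_frame is an (H, W, 3)
    uint8 array; v_threshold is a hypothetical illustration value."""
    top = rgb_frame[: rgb_frame.shape[0] // 2]
    v = top.max(axis=2).astype(np.float64)  # per-pixel HSV value channel
    return "day" if v.mean() > v_threshold else "night"

bright = np.full((4, 4, 3), 200, dtype=np.uint8)
dark = np.full((4, 4, 3), 20, dtype=np.uint8)
```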
Abstract: In this research, students’ scientific attitude, computer anxiety, educational use of the Internet, academic achievement, and problematic use of the Internet are analyzed with respect to different variables (gender, parents’ educational level and daily access to the Internet). The research group comprises 361 students from two middle schools located in the center of Konya. The general survey method is adopted in the research. In accordance with the purpose of the study, percentages, means, standard deviations, independent-samples t-tests and ANOVA are employed. A total of four scales, comprising 13 sub-dimensions in all, are implemented. The scores from these scales and their subscales are examined in terms of the various variables, and some significant relations are found.
Abstract: Motion response of floating structures is of great
concern in marine engineering. Nonlinearity is an inherent property
of any floating body subjected to irregular waves. Floating
structures are continuously subjected to environmental loading from
waves, current, wind, etc. This can result in undesirable vessel
motions that may compromise operability. For a floating body to
remain in position, it must be able to induce a restoring force
when displaced; mooring is provided to supply this restoring force.
This paper discusses the hydrodynamic performance and motion
characteristics of an 8-point spread mooring system applied to a
pipe-laying barge operating in the West African sea. The barge is
modelled using the computer-aided design (CAD) software
RHINOCEROS. Irregular waves are generated using a suitable wave
spectrum, and both frequency domain and time domain analyses are performed.
Numerical simulations based on potential theory are carried out to
find the responses and hydrodynamic performance of the barge in
both the free-floating and the moored condition. Initially, a potential
flow frequency domain analysis is performed to obtain the Response
Amplitude Operators (RAOs), which characterize the structural
motion in the free-floating state; RAOs for different wave headings are
analyzed. In the following step, a time domain analysis is carried out
to obtain the responses of the structure in the moored condition. In
this study, only wave-induced motions are considered; wind and
current loads are excluded and shall be included in further
studies. A 2000-second simulation is used. The
results present the wave-induced motion responses and mooring line
tensions, and identify the critical mooring lines.
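The irregular-wave generation step can be illustrated with a Pierson-Moskowitz-type spectrum, one common choice of "suitable wave spectrum"; the sea-state parameters Hs and Tp below are hypothetical, not the study's West African design values:

```python
import numpy as np

def irregular_wave(t, Hs=2.5, Tp=9.0, n=200, seed=0):
    """Wave elevation at time t, synthesized from a Pierson-Moskowitz-type
    spectrum  S(w) = (5/16) Hs^2 wp^4 / w^5 * exp(-1.25 (wp/w)^4)
    by superposing n harmonic components with random phases."""
    rng = np.random.default_rng(seed)       # fixed seed: one realization
    wp = 2 * np.pi / Tp                     # peak angular frequency
    w = np.linspace(0.2, 2.5, n)            # frequency grid [rad/s]
    dw = w[1] - w[0]
    S = 5 / 16 * Hs**2 * wp**4 / w**5 * np.exp(-1.25 * (wp / w) ** 4)
    amp = np.sqrt(2 * S * dw)               # component amplitudes
    phase = rng.uniform(0, 2 * np.pi, n)    # random phases
    return float(np.sum(amp * np.cos(w * t + phase)))

# a 2000 s elevation record, as in the time domain simulation length used
eta = np.array([irregular_wave(t) for t in np.arange(0.0, 2000.0, 1.0)])
```

The standard deviation of the record approaches Hs/4 for this spectrum, a quick sanity check on the synthesis.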
Abstract: The present study aims to explore the effect of
computerization on marketing performance in Snowa Company. In
other words, this study seeks to answer whether there is a
relationship between the use of computerization in marketing
activities and marketing performance.
The statistical population comprised 60 marketing managers of Snowa
Company. The Pearson correlation coefficient was employed to test
the research hypotheses; the reliability coefficient was
96.8%. In this study, computerization was the independent variable
and marketing performance was the dependent variable with
characteristics of market share, improving the competitive position,
and sales volume. The results of testing the hypotheses revealed that
there is a significant relationship between utilization of
computerization and market share, sales volume and improving the
competitive position.
Abstract: Ambient Computing or Ambient Intelligence (AmI) is an
emerging area of computer science that aims to create intelligently
connected environments and the Internet of Things. In this paper, we
propose a communication middleware architecture for AmI. The
architecture addresses problems of communication,
networking, and application abstraction, although there are other
aspects (e.g. HCI and security) within the general AmI framework.
Within this middleware architecture, application developers can
address HCI and security issues through the platform's
extensibility features.
Abstract: Digital images are widely used in computer
applications. Storing or transmitting uncompressed images
requires considerable storage capacity and transmission bandwidth;
image compression is a means to transmit or store
visual data in the most economical way. This paper explains
how images can be encoded for transmission over a multiplexed
time-frequency domain channel. Multiplexing involves packing
together signals whose representations are compact in the working
domain. To optimize transmission resources, each 4 × 4
pixel block of the image is transformed, by a suitable polynomial
approximation, into a minimal number of coefficients. Using fewer than
4 × 4 coefficients per block spares a significant amount of
transmitted information, but some information is lost. Several
approximations for the image transformation have been evaluated:
direct polynomial representation (Vandermonde matrix), least squares with
gradient descent, 1-D Chebyshev polynomials, 2-D Chebyshev
polynomials and singular value decomposition (SVD). Results have
been compared in terms of nominal compression rate (NCR),
compression ratio (CR) and peak signal-to-noise ratio (PSNR)
in order to minimize the error function defined as the difference
between the original pixel gray levels and the approximated
polynomial output. The polynomial coefficients are then encoded
and used to generate chirps, at a target rate of about two
chirps per 4 × 4 pixel block, before undergoing a
multiplexing operation in the time-frequency domain.
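The least-squares family of block approximations can be illustrated as follows; this degree-1 sketch keeps 4 of the 16 coefficients, so the basis and coefficient count are illustrative choices, not the paper's selected settings:

```python
import numpy as np

def fit_block(block, ):
    """Least-squares polynomial approximation of one 4x4 pixel block.
    The design matrix uses the bilinear basis {1, x, y, x*y}: 4
    coefficients stand in for 16 pixels (illustrative compression)."""
    y, x = np.mgrid[0:4, 0:4]
    A = np.stack([np.ones(16), x.ravel(), y.ravel(), (x * y).ravel()],
                 axis=1)                                  # (16, 4)
    coeffs, *_ = np.linalg.lstsq(A, block.ravel().astype(float),
                                 rcond=None)
    approx = (A @ coeffs).reshape(4, 4)
    mse = np.mean((block - approx) ** 2)
    psnr = 10 * np.log10(255.0**2 / mse) if mse > 0 else float("inf")
    return coeffs, approx, psnr

# a planar ramp block lies in the basis span, so it reconstructs
# (numerically) exactly and the PSNR is very high
yy, xx = np.mgrid[0:4, 0:4]
ramp = (10 + 3 * xx + 5 * yy).astype(float)
coeffs, approx, psnr = fit_block(ramp)
```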
Abstract: Computer education refers to the knowledge
and ability to use computers and related technology efficiently, with a
range of skills covering levels from basic to advanced use. Computers
continue to make an ever-increasing impact on all aspects of human
endeavour, including education. Given the numerous benefits of computer
education, what are students' perceptions of it?
This study investigated the perception of senior secondary school
students of computer education in the Federal Capital Territory (FCT),
Abuja, Nigeria. A sample of 7500 senior secondary school students,
drawn from one hundred (100) private and fifty (50)
public schools within the FCT, was selected using a simple
random sampling technique. A questionnaire [PSSSCEQ] was
developed and validated through expert judgement, and a reliability coefficient
of 0.84 was obtained; it was used to gather the relevant data on
computer education. The null hypotheses were tested using t-test and
ANOVA statistical analyses at the 0.05 level of significance.
Findings confirmed that the students in the FCT
had a positive perception of computer education, and some factors
affecting students' perception of computer education were identified.
Based on these findings, several recommendations were made:
competent teachers should be employed in all secondary schools,
to help students acquire relevant knowledge of computer education;
technological support should be provided to all secondary schools,
to help students solve specific problems in
computer education; and financial support should be provided to
procure computer facilities that will enhance the teaching and
learning of computer education.
Abstract: Background modeling and subtraction in video
analysis is widely used as an effective method for moving
object detection in many computer vision applications. Recently, a
large number of approaches have been developed to tackle different
types of challenges in this field; however, dynamic backgrounds
and illumination variations are the most frequently occurring problems
in practical situations. This paper presents a two-layer
model based on the codebook algorithm combined with the local binary
pattern (LBP) texture measure, targeted at handling dynamic
background and illumination variation problems. More specifically,
the first layer is built from a block-based codebook combined with an
LBP histogram and the mean value of each RGB color channel. Because
LBP features are invariant to monotonic
gray-scale changes, this layer produces block-wise detection results
with considerable tolerance of illumination variations. A pixel-based
codebook is then employed to refine the output of the
first layer and further eliminate false positives. As a result, the
proposed approach greatly improves accuracy under
dynamic background and illumination changes.
Experimental results on several popular background subtraction
datasets demonstrate very competitive performance compared to
previous models.
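The LBP texture measure used by the first layer can be sketched for a single 3×3 neighborhood (a basic 8-neighbor LBP; the paper's exact variant may differ). The invariance to monotonic gray-scale changes claimed above follows because only the ordering of neighbor values against the center matters:

```python
import numpy as np

def lbp_code(patch3x3):
    """8-bit local binary pattern of the center pixel of a 3x3 patch:
    each neighbor is thresholded against the center and the resulting
    bits are packed clockwise starting from the top-left."""
    c = patch3x3[1, 1]
    idx = [(0, 0), (0, 1), (0, 2), (1, 2),
           (2, 2), (2, 1), (2, 0), (1, 0)]      # clockwise neighbors
    bits = [1 if patch3x3[i, j] >= c else 0 for i, j in idx]
    return sum(b << k for k, b in enumerate(bits))

patch = np.array([[1, 5, 3],
                  [7, 4, 2],
                  [6, 8, 0]])
```

A histogram of these codes over a block is what the first layer stores alongside the per-channel RGB means.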
Abstract: This study investigates how site-specific traffic
data differ from the Mechanistic-Empirical Pavement Design
software default values. Two Weigh-in-Motion (WIM) stations were
installed on Interstate-40 (I-40) and Interstate-25 (I-25) to develop
site-specific data. A computer program named WIM Data Analysis
Software (WIMDAS) was developed using Microsoft C-Sharp (.Net)
for quality checking and processing of raw WIM data. A complete
year of data, from November 2013 to October 2014, was analyzed using
the developed program. The vehicle
class distribution, directional distribution, lane distribution, monthly
adjustment factors, hourly distribution, axle load spectra, average
number of axles per vehicle, axle spacing, lateral wander distribution,
and wheelbase distribution were then calculated, and a comparative
study was made between the measured data and the AASHTOWare default
values. It was found that the measured general traffic inputs for I-40
and I-25 differ significantly from the default values.
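One of the computed inputs, the monthly adjustment factor, has a simple standard definition that can be illustrated directly (the volumes below are hypothetical):

```python
def monthly_adjustment_factors(monthly_volumes):
    """Monthly adjustment factor (MAF) for a vehicle class, as used in
    mechanistic-empirical traffic inputs:

        MAF_i = 12 * monthly_volume_i / annual_volume

    A perfectly uniform traffic stream yields MAF = 1.0 every month,
    and the twelve factors always sum to 12."""
    total = sum(monthly_volumes)
    return [12 * v / total for v in monthly_volumes]

uniform = monthly_adjustment_factors([100] * 12)
skewed = monthly_adjustment_factors([200] + [100] * 11)
```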
Abstract: The use of Computer Aided Design (CAD)
technologies has become pervasive in the Architecture, Engineering
and Construction (AEC) industry. This has led to its inclusion as an
important part of the training module in the curriculum for
Architecture Schools in Nigeria. This paper examines the ethical
questions that arise in the implementation of the Computer Aided Design
(CAD) content of the curriculum for architectural education. Using
existing literature, it begins this scrutiny with the propriety of
including CAD in the education of the architect and the
obligations of the different stakeholders in the implementation
process. It also examines the questions raised by the negative use of
computing technologies, as well as the perceived negative influence of
CAD use on design creativity. Survey methodology was
employed to gather data from the Department of Architecture,
Chukwuemeka Odumegwu Ojukwu University, Uli, which serves as a
case study of how the issues raised are being addressed.
The paper draws conclusions on what makes for successful ethical
implementation.
Abstract: Password authentication is one of the widely used
methods to authenticate legitimate users of computers and to
defend against attackers. There are many different ways to
authenticate the users of a system, and many password cracking
methods have been developed as well. This paper examines how password
cracking can best be performed on a CPU-GPGPU based system. The
main objective of this work is to show how quickly a password can
be cracked, given some knowledge of computer security and
password cracking, when sufficient security is not incorporated into the
system.
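The core hash-and-compare loop can be sketched in a CPU-only form with Python's standard hashlib; a GPGPU cracker runs the same logic, but evaluates millions of candidates per second in parallel on the GPU:

```python
import hashlib

def dictionary_attack(target_sha256_hex, candidates):
    """Minimal dictionary attack: hash each candidate password and
    compare against a stolen SHA-256 digest. Returns the matching
    plaintext, or None if no candidate matches."""
    for pw in candidates:
        if hashlib.sha256(pw.encode()).hexdigest() == target_sha256_hex:
            return pw
    return None

# hypothetical stolen digest of a weak password
stolen = hashlib.sha256(b"letmein").hexdigest()
found = dictionary_attack(stolen, ["123456", "password", "letmein"])
```

Salting and slow, memory-hard hash functions are the standard defenses that make this loop expensive per guess.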
Abstract: The education and practical training of crisis management
members are topical issues nowadays. The paper deals with the
perspectives and possibilities of "smart solutions" for the education of
crisis management staff. Currently, a large number of
simulation tools claim to be suitable for the practical
training of crisis management staff. The first part of the paper introduces the
technology of these simulation tools, whose aim is to create a
realistic environment for the practical training of
crisis staff units. The second part of the paper concerns the possibilities of applying
simulation technology to the education process; its aim is to
introduce the practical capabilities and potential of
simulation programs for the practical training of crisis management staff.
Abstract: Liver segmentation from medical images poses more
challenges than analogous segmentations of other organs. This
contribution introduces a liver segmentation method from a series of
computed tomography images. Overall, we present a novel method for
segmenting the liver by coupling density matching with shape priors.
Density matching denotes a tracking method that operates by
maximizing the Bhattacharyya similarity measure between the
photometric distribution of an estimated image region and a model
photometric distribution. Density matching controls the direction of
the evolution process and slows down the evolving contour in regions
with weak edges, while the shape prior improves the robustness of density
matching and discourages the evolving contour from crossing the liver's
boundaries in such regions. The model is
implemented using a modified distance regularized level set (DRLS)
model. The experimental results show that the method achieves
satisfactory results, and comparison with the original DRLS model makes it
evident that the proposed model is more effective in addressing
the over-segmentation problem. Finally, we gauge the performance of
our model using the metrics of accuracy, sensitivity, and
specificity.
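The Bhattacharyya similarity measure driving the density matching has a simple discrete form over normalized histograms, sketched here (a standard definition; the paper applies it to photometric distributions of image regions):

```python
import numpy as np

def bhattacharyya(p, q):
    """Bhattacharyya coefficient between two histograms:

        BC(p, q) = sum_i sqrt(p_i * q_i)

    after normalizing both to sum to 1. Equals 1 for identical
    distributions and 0 for distributions with disjoint support."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(np.sqrt(p * q)))
```

The contour evolution maximizes this coefficient between the histogram inside the current contour and the model histogram.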
Abstract: The aim of the study is to describe and analyze the design
of mobile teaching for students' collaborative learning in distance
higher education, with a focus on mobile technologies such as online
webinars (web-based seminars or conferencing) accessed from laptops,
smart phones, or tablets. These multimedia tools can provide face-to-face
interactions, recorded flipped-classroom videos and parallel chat
communications. The data collection consists of interviews with 22
students and observations of online face-to-face webinars, as well as
two surveys. Theoretically, the study joins the research traditions of
Computer Supported Collaborative Learning (CSCL) and
Computer Self-Efficacy (CSE), the latter concerned with individuals' media and
information literacy. An important conclusion from the study is
that mobile interactions increased student-centered
learning: as the students came to appreciate the working methods,
they became more engaged and motivated. The use of mobile technology
among students also contributes to increased flexibility of
space and place, as well as to media and information literacy.
Abstract: Experience is what makes a man perfect. Though we
learn many different things in life through practice, we still
need to go the extra mile to gain experience, which is profitable
only when it is integrated with regular practice. A clear underlying
idea is that every teacher is a learner. This paper focuses on the integrated
practices carried out among the students of Jizan University, which
enhance learning through experience. Integrated practices such as
student-directed activities, a balanced curriculum, phonologically based
activities and the use of consistent language enlarge the vision and
mission of students to earn experience through learning. Students
who receive explicit instruction and guidance can practice skills
and strategies through student-directed activities such as peer tutoring
and cooperative learning. The second effective practice is the use of
consistent language: consistent language provides students with a model
for talking about new concepts, which also enables them to
communicate without hindrance. Phonological awareness is an
important early reading skill for all students; students who have
phonemic awareness in their home language can often transfer that
knowledge to a second language. A balanced curriculum, in turn,
requires instruction in all the elements of reading, and reading
instruction is most effective when both basic and higher-order skills are
included on a daily basis. Computer-based reading and listening exercises
further empower students to understand language better, and
English language learners can benefit from sound reading instruction
even before they are fully proficient in English, as long as the
instruction is comprehensible. Thus, if students are to be well
equipped learners, they should ground themselves in various
integrated practices through multifarious experiences, with
teachers as moderators and trainers. This type of learning prepares
students for a constantly changing society and helps them
meet the competitive world around them with better employability,
fulfilling the vision and mission of the institution.
Abstract: Non-linear dynamic time-history analysis is
considered the most advanced and comprehensive analytical
method for evaluating the seismic response and performance of
multi-degree-of-freedom building structures under the influence of
earthquake ground motions. However, effective and accurate
application of the method requires the implementation of advanced
hysteretic constitutive models of the various structural components
including masonry infill panels. Sophisticated computational research
tools that incorporate realistic hysteresis models for non-linear
dynamic time-history analysis are not popular among professional
engineers, as they are not only difficult to access but also complex and
time-consuming to use. In addition, commercial computer programs
for structural analysis and design that are acceptable to practicing
engineers do not generally integrate advanced hysteretic models
which can accurately simulate the hysteresis behavior of structural
elements with a realistic representation of strength degradation,
stiffness deterioration, energy dissipation and ‘pinching’ under cyclic
load reversals in the inelastic range of behavior. In this scenario,
push-over or non-linear static analysis has gained
significant popularity: it offers a practical and efficient alternative
to non-linear dynamic time-history analysis for rationally evaluating
seismic demands, while avoiding that method's complexities
and difficulties. The present paper is
based on the analytical investigation of the effect of distribution of
masonry infill panels over the elevation of planar masonry infilled
reinforced concrete [R/C] frames on the seismic demands using the
capacity spectrum procedures implementing nonlinear static analysis
[pushover analysis] in conjunction with the response spectrum
concept. An important objective of the present study is to numerically
evaluate the adequacy of the capacity spectrum method using
pushover analysis for performance based design of masonry infilled
R/C frames for near-field earthquake ground motions.
Abstract: Brain-Computer Interfaces (BCIs) measure brain
signal activity, intentionally or unintentionally induced by users,
and provide a communication channel that does not depend on the
brain's normal output pathway of peripheral nerves and muscles.
Feature Selection (FS) is a global optimization problem in machine
learning that reduces the number of features and removes irrelevant
and noisy data while preserving acceptable recognition accuracy; it is
a vital step affecting the performance of a pattern recognition system.
This study presents a new Binary Particle Swarm Optimization (BPSO)
based feature selection algorithm. Multi-layer Perceptron Neural Network
(MLPNN) classifiers trained with the backpropagation and
Levenberg-Marquardt algorithms classify the selected features.
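A generic binary PSO feature selector can be sketched as follows; this is the textbook BPSO with a sigmoid velocity squashing, not necessarily the paper's exact variant, and the toy fitness function is a hypothetical stand-in for classifier accuracy:

```python
import numpy as np

def bpso_select(fitness, n_features, n_particles=20, iters=50, seed=0):
    """Minimal binary PSO: each particle is a 0/1 mask over features,
    velocities are updated from personal/global bests and squashed by a
    sigmoid into per-bit probabilities of selecting a feature."""
    rng = np.random.default_rng(seed)
    x = rng.integers(0, 2, (n_particles, n_features))
    v = np.zeros((n_particles, n_features))
    pbest, pfit = x.copy(), np.array([fitness(m) for m in x])
    gbest = pbest[pfit.argmax()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(v.shape), rng.random(v.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = (rng.random(v.shape) < 1 / (1 + np.exp(-v))).astype(int)
        fit = np.array([fitness(m) for m in x])
        improved = fit > pfit
        pbest[improved], pfit[improved] = x[improved], fit[improved]
        gbest = pbest[pfit.argmax()].copy()
    return gbest

# toy fitness rewarding masks close to a known useful feature subset
target = np.array([1, 0, 1, 1, 0, 0, 1, 0])
best = bpso_select(lambda m: -np.abs(m - target).sum(), 8)
```

In the study's setting, the fitness would be the MLPNN classification accuracy on the features selected by the mask.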
Abstract: In this study, a general approach to the reliability-based
limit analysis of laterally loaded piles is presented. In
engineering practice, uncertainties play a very important role. The
aim of this study is to evaluate the lateral load capacity of free-head
and fixed-head long piles when plastic limit analysis is applied. In
addition to the plastic limit analysis used to control the plastic behaviour
of the structure, an uncertain bound on the complementary strain energy
of the residual forces is also applied; this bound has a significant effect
on the load parameter. The solution to the reliability-based problems is
obtained by a computer program governed by the reliability
index calculation.
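The reliability index mentioned above can be illustrated in its simplest first-order (Cornell) form for independent normal variables; the paper's program may use a more general formulation, and the numbers below are hypothetical:

```python
from math import sqrt

def cornell_beta(mu_R, sigma_R, mu_S, sigma_S):
    """First-order (Cornell) reliability index for independent normal
    resistance R and load effect S, with limit state g = R - S:

        beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2)

    Larger beta means a smaller probability of failure P(g < 0)."""
    return (mu_R - mu_S) / sqrt(sigma_R**2 + sigma_S**2)

# hypothetical pile capacity (R) and lateral load effect (S) statistics
beta = cornell_beta(mu_R=500.0, sigma_R=50.0, mu_S=300.0, sigma_S=40.0)
```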