Abstract: The number of intrusions and attacks against critical
infrastructures and other information networks is increasing
rapidly. While there is no identified evidence that terrorist
organizations are currently planning a coordinated attack against
the vulnerabilities of computer systems and networks connected to
critical infrastructure, the origins of the indiscriminate cyber
attacks that infect computers on the network remain largely
unknown. The growing trend toward the use of more automated and
menacing attack tools has also overwhelmed some of the current
methodologies used for tracking cyber attacks. There is an ample
possibility that such cyber attacks could be transformed into
cyberterrorism driven by illegal purposes. Cyberterrorism is a
matter of vital importance to national welfare. Therefore, each
country and organization has to take proper measures to meet the
situation and consider effective legislation on cyberterrorism.
Abstract: To achieve accurate and precise results from finite
element analysis (FEA) of bones, it is important to represent the
load/boundary conditions as closely as possible to those of the
human body, including the bone properties, the type and force of
the muscles, the contact force of the joints, and the location of
the muscle attachment. In this study, the differences in Von Mises
stress and total deformation were compared between Case 1, which
uses the actual anatomical form of the muscle attachment to the
femur, and Case 2, which uses a simplified representation of the
attachment location, with the same muscle force applied in both
cases. An inverse-dynamics musculoskeletal model was simulated
using data from an actual walking experiment to improve the
accuracy of the muscle forces used as input values for the FEA.
The FEA using the muscle forces calculated through the simulation
showed that the maximum Von Mises stress and the maximum total
deformation in Case 2 were underestimated by 8.42% and 6.29%,
respectively, compared to Case 1. The torsion energy and bending
moment at each location of the femur arise from the stress
components. Because of the geometrical/morphological feature of
the femur as a long bone, when the stress distribution is wide, as
in Case 1, a greater Von Mises stress and total deformation are
expected from the sum of the stress components. More accurate
results can be achieved only when the muscle forces, the
attachment locations, and the attachment forms in the FEA of the
bones are the same as those in the actual anatomical condition
under the various moving conditions of the human body.
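For reference, the Von Mises stress compared between the two cases is the standard scalar combination of the principal stresses (the standard definition, not a formula taken from this study):

```latex
\sigma_{v} = \sqrt{\tfrac{1}{2}\left[(\sigma_1-\sigma_2)^2
           + (\sigma_2-\sigma_3)^2 + (\sigma_3-\sigma_1)^2\right]}
```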
Abstract: The rapid development of manufacturing and information systems has caused significant changes in manufacturing environments in recent decades. Mass production has given way to flexible manufacturing systems, an important characteristic of which is customized or "on demand" production. In this scenario, a seamless, gap-free information flow becomes a key factor for the success of enterprises. In this paper we present a framework to support the mapping of features into machining workingsteps compliant with the ISO 14649 standard (known as STEP-NC). The system determines how the features can be made with the available manufacturing resources. Examples of the mapping method are presented for features such as a pocket with a general surface.
Abstract: In this era of competitiveness, there is a growing need for supply chains to become competitive enough to handle pressures such as varying customer expectations, demands for low-cost, high-quality products delivered in minimum time and, most importantly, cut-throat competition on a worldwide scale. In recent years, supply chain competitiveness has therefore been accepted as one of the most important philosophies in the supply chain literature. Various researchers and practitioners have tried to identify and implement strategies which can bring competitiveness to supply chains, i.e. supply chain competitiveness. The purpose of this paper is to suggest selected strategies for supply chain competitiveness in the Indian manufacturing sector using an integrated approach of literature review and exploratory interviews with eminent professionals from the supply chain area in various industries, academia and research. The aim of the paper is to highlight the important area of competitiveness in the supply chain and to offer recommendations to the industry and to managers in the manufacturing sector.
Abstract: Most of the losses in a power system arise in the
distribution sector, which has therefore always received
attention. Among the important factors that increase losses in
the distribution system is the existence of reactive power flows.
The most common way to compensate the reactive power in the
system is to use shunt (parallel) capacitors. In addition to
reducing the losses, the advantages of capacitor placement
include the release of network capacity at peak load and the
improvement of the voltage profile. The point which should be
considered in capacitor placement is the optimal location and
sizing of the capacitors in order to maximize the advantages of
capacitor placement.
In this paper, a new technique is offered for the placement and
sizing of fixed capacitors in a radial distribution network on
the basis of a Genetic Algorithm (GA). The existing optimal
methods for capacitor placement mostly aim to reduce the losses
and improve the voltage profile simultaneously, but the
compensation cost and load changes have not been considered as
influential terms in the objective function. In this article, a
holistic approach is taken for the optimal solution of this
problem which includes all the parameters of the distribution
network: the capacitor cost, the bus voltages, and load changes.
Such a formulation requires an extensive search over all possible
solutions, so in this article we use the Genetic Algorithm (GA)
as a powerful method for this optimal search.
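As an illustration of this kind of search, the following is a minimal GA sketch. The bus count, reactive loads, candidate capacitor sizes, and the quadratic loss proxy are all illustrative assumptions; a real implementation would evaluate fitness with a load-flow calculation over the radial network.

```python
import random

# Hypothetical toy system: a real implementation would run a load-flow
# over the radial feeder instead of this simplified loss proxy.
N_BUSES = 5
REACTIVE_LOAD = [1.2, 0.8, 1.5, 0.6, 1.0]  # per-unit reactive demand per bus
CAP_SIZES = [0.0, 0.3, 0.6, 0.9]           # candidate capacitor sizes
CAP_COST = 0.05                            # assumed cost per unit of capacitance

def fitness(chromosome):
    # Residual reactive flow squared stands in for I^2*R losses.
    loss = sum((q - c) ** 2 for q, c in zip(REACTIVE_LOAD, chromosome))
    cost = CAP_COST * sum(chromosome)
    return loss + cost  # lower is better

def ga(pop_size=30, generations=100, mutation_rate=0.1, seed=1):
    rng = random.Random(seed)
    pop = [[rng.choice(CAP_SIZES) for _ in range(N_BUSES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[:pop_size // 2]       # elitist selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, N_BUSES)   # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < mutation_rate:  # random-reset mutation
                child[rng.randrange(N_BUSES)] = rng.choice(CAP_SIZES)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

best = ga()
print(best, fitness(best))
```

The chromosome is simply the list of capacitor sizes per bus, so objective terms such as capacitor cost or voltage penalties can be added to `fitness` without changing the GA loop.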
Abstract: The ability to recognize humans and their activities by computer vision is a very important task, with many potential applications. The study of human motion analysis is related to several research areas of computer vision, such as motion capture and the detection, tracking and segmentation of people. In this paper, we describe a segmentation method for extracting the human body contour in a modified HLS color space. To estimate the background, the modified HLS color space is proposed, and the background features are estimated using the HLS color components. A large human dataset, collected from DV cameras, is pre-processed. The human body and its contour are successfully extracted from the image sequences.
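A minimal sketch of a per-pixel foreground decision in HLS space; the thresholds and the per-pixel background model are illustrative assumptions, not the paper's modified HLS space or learned background features.

```python
import colorsys

def rgb_to_hls(pixel):
    # Convert an 8-bit RGB triple to (hue, lightness, saturation) in [0, 1].
    r, g, b = (c / 255.0 for c in pixel)
    return colorsys.rgb_to_hls(r, g, b)

def is_foreground(pixel, background_pixel, l_thresh=0.15, s_thresh=0.15):
    _, l1, s1 = rgb_to_hls(pixel)
    _, l2, s2 = rgb_to_hls(background_pixel)
    # A pixel differing enough from the background model in lightness or
    # saturation is classified as part of the human silhouette.
    return abs(l1 - l2) > l_thresh or abs(s1 - s2) > s_thresh

print(is_foreground((200, 50, 50), (60, 60, 60)))  # differs strongly -> True
```

Applying this test to every pixel yields a binary silhouette mask, from which the body contour can then be traced.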
Abstract: The I/O workload is a critical and important factor for
analyzing I/O patterns and maximizing file system performance.
However, measuring the I/O workload of a running distributed
parallel file system is non-trivial due to collection overhead
and the large volume of data. In this paper, we measured and
analyzed file system activities on two large-scale cluster
systems with TFlops-level high-performance computation resources.
By comparing the file system activities of 2009 with those of
2006, we analyzed the change in I/O workloads driven by the
development of system performance and high-speed network
technology.
Abstract: The plastic forming of sheet plate plays an important
role in metal forming. The traditional tool design techniques for
sheet forming operations used in industry are experimental and
expensive. Predicting the forming results and determining the
punch force, blank holder forces, and thickness distribution of
the sheet metal will decrease the production cost and time of the
material to be formed. In this paper, a multi-stage deep drawing
simulation of an industrial part is presented using the finite
element method. The entire production sequence, including
additional operations such as intermediate annealing and
springback, was simulated with ABAQUS software under axisymmetric
conditions. Simulation results such as sheet thickness
distribution, punch force, and residual stresses were extracted
at each stage, and the sheet thickness distribution was compared
with experimental results. The comparison showed that the FE
model is in close agreement with the experiments.
Abstract: The business scenario is an important technique that may be used at various stages of enterprise architecture development to derive its characteristics from the high-level requirements of the business. In terms of wireless deployments, business scenarios are used to help identify and understand business needs involving wireless services, and thereby to derive the business requirements that the architecture development has to address, taking into account various wireless challenges. This study assesses the deployment of Wireless Local Area Network (WLAN) and Broadband Wireless Access (BWA) solutions for several business scenarios in the Asia Pacific region. This paper focuses on an overview of the business and technology environments, and discusses examples of existing (or suggested) wireless solutions adopted (or to be adopted) in the Asia Pacific region. The interactions of several players, enabling technologies, and key processes in wireless environments are studied. The analysis and discussion associated with this study are divided into two domains, healthcare and education, where the merits of wireless solutions in improving quality of life are highlighted.
Abstract: The problem of frequent pattern discovery is defined as
the process of searching for patterns, such as sets of features
or items, that appear frequently in data. Finding such frequent
patterns has become an important data mining task because it
reveals associations, correlations, and many other interesting
relationships hidden in a database. Most of the proposed frequent
pattern mining algorithms have been implemented in imperative
programming languages. Such a paradigm is inefficient when the
set of patterns is large and the frequent patterns are long. We
suggest applying a high-level declarative style of programming to
the problem of frequent pattern discovery. We consider two
languages: Haskell and Prolog. Our intuitive idea is that the
problem of finding frequent patterns should be efficiently and
concisely implementable in a declarative paradigm, since pattern
matching is a fundamental feature supported by most functional
languages and by Prolog. Our frequent pattern mining
implementations in Haskell and Prolog confirm our hypothesis
about the conciseness of the programs. Comparative performance
studies of declarative versus imperative programming in terms of
lines of code, speed, and memory usage are reported in the paper.
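To make the task concrete, here is a naive frequent-itemset sketch over a toy transaction database. This is illustrative Python only, not the paper's Haskell or Prolog code; the candidate-generation and pruning strategy is a simple assumption.

```python
from itertools import combinations

def frequent_patterns(transactions, min_support):
    # Enumerate candidate itemsets by size; a pattern is frequent when it
    # occurs in at least min_support transactions.
    items = sorted({i for t in transactions for i in t})
    result = {}
    for size in range(1, len(items) + 1):
        found_any = False
        for candidate in combinations(items, size):
            support = sum(1 for t in transactions if set(candidate) <= t)
            if support >= min_support:
                result[candidate] = support
                found_any = True
        if not found_any:
            # Anti-monotonicity: if no pattern of this size is frequent,
            # no larger pattern can be, so stop.
            break
    return result

db = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"b", "c"}]
print(frequent_patterns(db, min_support=2))
```

This exhaustive enumeration is exponential in the worst case, which is exactly why the declarative formulations discussed in the abstract are attractive for expressing the same search concisely.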
Abstract: Nowadays, computer worms, viruses and Trojan horses
have become widespread; collectively they are called malware. A
decade ago, such malware merely damaged computers by deleting or
rewriting important files. Recent malware, however, seems to be
designed to earn money. Some malware collects personal
information so that malicious people can find secrets such as
passwords for online banking, evidence of a scandal, or contact
addresses related to a target. Moreover, the relationship between
money and malware has become more complex: many kinds of malware
spawn bots to obtain springboards for further attacks. Meanwhile,
for ordinary Internet users, countermeasures against malware have
hit a wall. Pattern matching wastes too many computer resources,
since matching tools have to deal with a large number of patterns
derived from subspecies; virus-making tools can automatically
generate subspecies of malware, and metamorphic and polymorphic
malware are no longer special. Recently, malware-checking sites
have appeared that check content in place of users' PCs. However,
a new type of malicious site has also appeared that evades such
checking. In this paper, existing protocols and methods related
to the web are reconsidered in terms of protection from current
attacks, and a new protocol and method are proposed for the
security of the web.
Abstract: A mathematical framework for the study of fuzzy approximate reasoning is presented in this paper. Two important defuzzification methods (area defuzzification and height defuzzification) are described, besides the center of gravity method, which is the best-known defuzzification method. The continuity of these defuzzification methods and their application to fuzzy feedback control are discussed.
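A minimal sketch of the center-of-gravity method over a discretized membership function; the sample points and membership values below are illustrative.

```python
def centroid_defuzzify(xs, mus):
    # Discrete center of gravity: weighted average of the sample points,
    # weighted by the membership values mu(x).
    num = sum(x * m for x, m in zip(xs, mus))
    den = sum(mus)
    return num / den

xs  = [0, 1, 2, 3, 4]
mus = [0.0, 0.5, 1.0, 0.5, 0.0]     # triangular membership centered at 2
print(centroid_defuzzify(xs, mus))  # symmetric shape -> 2.0
```

Because the output is a ratio of sums that vary continuously with the membership values, small changes in the fuzzy output produce small changes in the crisp value, which is the continuity property the paper examines for feedback control.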
Abstract: In this paper we describe the design and implementation of a parallel algorithm for data assimilation with the ensemble Kalman filter (EnKF) for the oil reservoir history matching problem. The use of a large number of observations from time-lapse seismic leads to a large turnaround time for the analysis step, in addition to the time-consuming simulations of the realizations. For efficient parallelization it is important to consider parallel computation at the analysis step. Our experiments show that parallelizing the analysis step in addition to the forecast step gives good scalability, exploiting the same set of resources with some additional effort.
Abstract: Quantitative investigation of the contribution of different factors toward measuring the reusability of software components could be helpful in evaluating the quality of developed or developing reusable software components and in identifying reusable components in existing legacy systems, which can save the cost of developing software from scratch. However, the relative significance of the contributing factors has remained relatively unexplored. In this paper, we use Taguchi's approach to analyze the significance of different structural attributes, or factors, in deciding the reusability level of a particular component. The results obtained show that complexity is the most important factor in deciding the reusability of function-oriented software. In the case of object-oriented software, coupling and complexity collectively play a significant role in high reusability.
Abstract: Practicum placements are a critical factor for student teachers in Education Programs. How can student teachers become professionals? This study investigated the problems, weaknesses and obstacles of practicum placements and developed guidelines for partnership in practicum placements. In response to this issue, a partnership concept was implemented for developing student teachers into professionals. Data were collected through questionnaires on attitudes toward the problems, weaknesses, and obstacles of practicum placements of student teachers in Rajabhat universities, together with focus group interviews. The research revealed that learning management, classroom management, curriculum, assessment and evaluation, classroom action research, and teacher demeanor are the important factors affecting the professional development of Education Program student teachers. Learning management planning and classroom management concerning instructional design, teaching techniques, instructional media, and student behavior management are other important aspects influencing the professional development of student teachers.
Abstract: Managers, as key employees, have a very important role
in maintaining workforce performance, which is critical to the
future success of construction companies. If motivated employees
start with motivated managers, it seems plausible that
de-motivated employees start with de-motivated managers. This
study aims to analyze the importance of motivated managers to
their own success and to the success of construction companies.
In this study, a quantitative method was used and the study area
was Medan, North Sumatera. A questionnaire survey was distributed
directly to construction companies in Medan which are listed in
the Construction Services Development Board. A total of 60
managers responded, and the completed questionnaires were
analyzed using descriptive analysis. The results indicated that
the respondents acknowledge the importance of motivation among
themselves to the success of projects and construction companies,
implying that it is vital to maintain the motivation and good
performance of the workforce.
Abstract: Data mining has been integrated into application systems to enhance the quality of the decision-making process. This study focuses on the integration of data mining technology and Knowledge Management Systems (KMS), due to the ability of data mining technology to create useful knowledge from large volumes of data, while KMS vitally supports the creation and use of knowledge. The integration of data mining technology and KMS is popularly used in business for enhancing and sustaining organizational performance. However, there is a lack of studies applying data mining technology and KMS in the education sector, particularly to students' academic performance, even though this could reflect the performance of institutions of higher learning (IHLs). Realizing its importance, this study seeks to integrate data mining technology and KMS to promote effective management of knowledge within IHLs. Several concepts from the literature are adapted to propose a new integrative data mining technology and KMS framework for an IHL.
Abstract: Simulation is a very powerful method for
high-performance and high-quality design in distributed systems,
and perhaps the only one available, considering the
heterogeneity, complexity and cost of distributed systems. In
Grid environments, for example, it is hard and even impossible to
perform scheduler performance evaluation in a repeatable and
controllable manner, as resources and users are distributed
across multiple organizations with their own policies. In
addition, Grid test-beds are limited, and creating an
adequately-sized test-bed is expensive and time consuming.
Scalability, reliability and fault-tolerance become important
requirements for distributed systems in order to support
distributed computation; a distributed system with such
characteristics is called dependable. Large environments, like
the Cloud, offer unique advantages, such as low cost and
dependability, and satisfy QoS for all users. Resource management
in large environments requires performant scheduling algorithms
guided by QoS constraints. This paper presents the performance
evaluation of scheduling heuristics guided by different
optimization criteria. The algorithms for distributed scheduling
are analyzed in order to satisfy users' constraints while at the
same time considering the independent capabilities of resources.
This analysis acts as a profiling step for algorithm calibration.
The performance evaluation is based on simulation; the simulator
is MONARC, a powerful tool for the simulation of large-scale
distributed systems. The novelty of this paper consists in
synthetic analysis results that offer guidelines for scheduler
service configuration and support empirically-based decisions.
The results could be used in decisions regarding optimizations to
existing Grid DAG scheduling and for selecting the proper
algorithm for DAG scheduling in various actual situations.
Abstract: Consumer behaviour analysis represents an important
field of study in marketing. In particular, strategy development
for marketing and communications will be more focused and
effective when marketers have an understanding of the
motivations, behaviour and psychology of consumers. While
materialism has been found to be one of the important elements in
consumer behaviour, compulsive consumption represents another
aspect that has recently attracted more attention, because of the
growing prevalence of dysfunctional buying that has raised
concern in consumer societies. Existing studies and analyses of
the origins and motivations of compulsive buying have mainly
focused on either individual factors or groups of related
factors, and hence a need for a holistic view exists. This paper
provides a comprehensive perspective on compulsive consumption
and establishes relevant propositions, keeping the family life
cycle stages as a reference for the incidence of chronic consumer
states and their influence on compulsive consumption.
Abstract: Distributed generation (DG) has received increasing
attention during recent years. The impact of DG on various
aspects of distribution system operation, such as reliability and
energy loss, depends highly on the DG location in the
distribution feeder. Optimal DG placement is an important subject
which has not been fully discussed yet.
This paper presents an optimization method to determine optimal
DG placement, based on a cost/worth analysis approach. The method
considers technical and economic factors such as energy loss,
load point reliability indices and DG costs, and, in particular,
the portability of DG. The proposed method is applied to a test
system, and the impacts of different parameters such as load
growth rate and load forecast uncertainty (LFU) on the optimum DG
location are studied.