Abstract: The objective of this paper is to study the complexity of the relationship between field and laboratory measurements of electrical resistivity, in order to improve the effectiveness of data interpretation for geophysical ground resistivity surveys. A geological outcrop in Penang, Malaysia, with an obvious layering contact was chosen as the study site. Two-dimensional geoelectrical resistivity imaging was used to map the resistivity distribution of the subsurface, while several subsurface samples were obtained for laboratory analysis. The resistivity of the samples in their original condition was measured in the laboratory using a time-domain low-voltage technique for the granite core samples and a soil resistivity measuring set for the soil samples. The experimental results from both schemes are studied, analyzed, calibrated and verified, including their basis and correlation, degree of tolerance and characteristics of the substances. Consequently, the significant difference between the two schemes is explained comprehensively within this paper.
Abstract: This paper presents a simplified version of Data Envelopment Analysis (DEA) - a conventional approach to evaluating the performance and ranking of competitive objects characterized by two groups of factors acting in opposite directions: inputs and outputs. DEA with a Perfect Object (DEA PO) augments the group of actual objects with a virtual Perfect Object - the one having the greatest outputs and smallest inputs. This allows for obtaining an explicit analytical solution and is a step toward absolute efficiency. This paper develops the approach further and introduces a DEA model with Partially Perfect Objects. DEA PPO consecutively eliminates the smallest relative inputs or greatest relative outputs, and applies DEA PO to the reduced collections of indicators. The partial efficiency scores are then combined into a weighted efficiency score. The computational scheme remains as simple as that of DEA PO, but DEA PPO has the advantage of taking all of the inputs and outputs of each actual object into account. Firm evaluation is considered as an example.
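The Perfect Object idea can be illustrated with a toy score: normalize each object's outputs by the Perfect Object's outputs and its inputs by the Perfect Object's inputs, then take their ratio. Note this equal-weight ratio form is only an illustrative sketch, not the paper's exact analytical solution.

```python
import numpy as np

def dea_po_scores(inputs, outputs):
    """Toy efficiency scores of actual objects against a virtual Perfect Object.

    inputs:  (n_objects, n_inputs) array  -- smaller is better
    outputs: (n_objects, n_outputs) array -- larger is better
    """
    x_perfect = inputs.min(axis=0)    # Perfect Object: smallest inputs
    y_perfect = outputs.max(axis=0)   # Perfect Object: greatest outputs
    rel_out = (outputs / y_perfect).mean(axis=1)  # equal-weight relative outputs
    rel_in = (inputs / x_perfect).mean(axis=1)    # equal-weight relative inputs
    return rel_out / rel_in

# Example: three firms, two inputs, one output
x = np.array([[2.0, 4.0], [3.0, 2.0], [4.0, 5.0]])
y = np.array([[10.0], [8.0], [12.0]])
print(dea_po_scores(x, y))
```

By construction, appending the Perfect Object itself to the data yields a score of exactly 1, so all actual objects are measured against the same absolute benchmark.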
Abstract: While OCD is one of the most commonly occurring psychiatric conditions experienced by older adults, there is a paucity of research into the treatment of older adults with OCD. This case study represents the first published investigation of a cognitive treatment for geriatric OCD. It describes the successful treatment of an 86-year-old man with a 63-year history of OCD using Danger Ideation Reduction Therapy (DIRT). The client received 14 individual, 50-minute treatment sessions of DIRT over 13 weeks. Clinician-rated Y-BOCS scores decreased by 84%, from 25 (severe) at pre-treatment to 4 (subclinical) at the 6-month post-treatment follow-up interview, demonstrating the efficacy of DIRT for this client. DIRT may have particular advantages over ERP and pharmacological approaches; however, further research is required in older adults with OCD.
Abstract: Multi-dimensional principal component analysis (PCA) is the extension of PCA, which is widely used as a dimensionality reduction technique in multivariate data analysis, to multi-dimensional data. To calculate the PCA, the singular value decomposition (SVD) is commonly employed because of its numerical stability. Multi-dimensional PCA can be calculated using the higher-order SVD (HOSVD) proposed by De Lathauwer et al., analogously to the ordinary PCA. In this paper, we apply multi-dimensional PCA to multi-dimensional medical data including the functional independence measure (FIM) score, and describe the results of an experimental analysis.
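The HOSVD underlying this approach computes one orthonormal factor matrix per mode from the SVD of each mode unfolding, plus a core tensor. A minimal NumPy sketch (the paper's own implementation details are not given):

```python
import numpy as np

def unfold(T, mode):
    """Mode-k unfolding: move mode k to the front and flatten the rest."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd(T):
    """Higher-order SVD: one orthonormal factor matrix per mode, plus a core tensor."""
    Us = [np.linalg.svd(unfold(T, k), full_matrices=False)[0] for k in range(T.ndim)]
    core = T
    for k, U in enumerate(Us):
        # Multiply mode k of the core by U^T
        core = np.moveaxis(np.tensordot(U.T, np.moveaxis(core, k, 0), axes=1), 0, k)
    return core, Us
```

Multi-dimensional PCA then keeps only the leading columns of each factor matrix, reducing the dimensionality along every mode at once, just as ordinary PCA keeps the leading singular vectors of a data matrix.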
Abstract: Coronary artery bypass grafts (CABG) are widely studied with respect to the hemodynamic conditions that play an important role in the development of restenosis. However, papers concerning the constitutive modeling of CABG are lacking in the literature. The purpose of this study is to find a constitutive model for CABG tissue. A sample of CABG obtained during an autopsy underwent an inflation-extension test. Displacements were recorded by CCD cameras and subsequently evaluated by digital image correlation. Pressure-radius and axial force-elongation data were used to fit the material model. The tissue was modeled as a one-layered composite reinforced by two families of helical fibers. The material is assumed to be locally orthotropic, nonlinear, incompressible and hyperelastic. Material parameters are estimated for two strain energy functions (SEF). The first is the classical exponential; the second is a logarithmic SEF, which allows interpretation in terms of limiting (finite) strain extensibility. The presented material parameters are estimated by optimization based on the radial and axial equilibrium equations of a thick-walled tube. Both material models fit the experimental data successfully, but the exponential model fits the relationship between axial force and axial strain significantly better than the logarithmic one.
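The abstract does not state the SEFs explicitly. For orientation, the classical exponential SEF for a matrix reinforced by two fiber families is usually written in the Holzapfel-type form below; this is an assumption here, not necessarily the exact expression used in the paper:

```latex
W = \frac{c}{2}\left(I_1 - 3\right)
  + \frac{k_1}{2k_2}\sum_{i=4,6}\left[\exp\!\left(k_2\left(I_i - 1\right)^2\right) - 1\right]
```

Here $I_1$ is the first invariant of the right Cauchy–Green tensor, $I_4$ and $I_6$ are the squared stretches along the two fiber directions, and $c$, $k_1$, $k_2$ are material parameters; incompressibility is enforced separately.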
Abstract: More recent satellite projects and programs make extensive use of real-time embedded systems. 16-bit processors that meet the Mil-Std-1750 standard architecture have been used in on-board systems, and most space applications have been written in Ada. From a futuristic point of view, 32-bit and 64-bit processors are needed in the area of spacecraft computing, and therefore an effort is desirable in the study and survey of 64-bit architectures for space applications. This will also result in significant technology development in terms of VLSI and software tools for Ada (as the legacy code is in Ada).
There are several basic requirements for a special processor for this purpose. They include radiation-hardened (RadHard) devices, very low power dissipation, compatibility with existing operational systems, scalable architectures for higher computational needs, reliability, higher memory and I/O bandwidth, predictability, a real-time operating system and manufacturability of such processors. Further considerations include the selection of FPGA devices, the selection of EDA tool chains, design flow, partitioning of the design, pin count, performance evaluation, timing analysis, etc.
This project deals with a brief study of 32- and 64-bit processors readily available in the market and the design and fabrication of a 64-bit RISC processor, named RISC MicroProcessor, with the added functionalities of an extended double-precision floating point unit and a 32-bit signal processing unit acting as co-processors. In this paper, we emphasize the ease and importance of using an open core (OpenSparc T1 Verilog RTL) and open-source EDA tools such as Icarus to develop FPGA-based prototypes quickly. Commercial tools such as Xilinx ISE for synthesis are also used when appropriate.
Abstract: In this paper, multilayered coreless printed circuit board (PCB) step-down power transformers for DC-DC converter applications have been designed, manufactured and evaluated. A set of two different circular spiral step-down transformers was fabricated in a four-layered PCB. These transformers have been modelled with the assistance of a high-frequency equivalent circuit and characterized with both sinusoidal and square wave excitation. This paper provides comparative results for the two transformers in terms of their resistances; self-, leakage and mutual inductances; coupling coefficient; and energy efficiencies. The operating regions for optimal performance of these transformers in power transfer applications are determined. The transformers were tested at output power levels of about 30 W within the input voltage range of 12-50 Vrms. The energy efficiency of these step-down transformers is observed to be in the range of 90%-97% in the MHz frequency region.
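Coupling coefficients and mutual inductances for such transformers are commonly derived from open- and short-circuit inductance measurements. A minimal sketch of that standard calculation follows; the paper's exact measurement procedure is not stated, so this is an assumption:

```python
import math

def coupling_coefficient(L_open, L_short):
    """k from the primary inductance measured with the secondary open (L_open)
    and shorted (L_short), using L_short = L_open * (1 - k**2)."""
    return math.sqrt(1.0 - L_short / L_open)

def mutual_inductance(k, L1, L2):
    """Mutual inductance from the coupling coefficient: M = k * sqrt(L1 * L2)."""
    return k * math.sqrt(L1 * L2)

# Example: a 10 uH primary that drops to 1.9 uH with the secondary shorted
k = coupling_coefficient(10e-6, 1.9e-6)   # ~0.9
M = mutual_inductance(k, 10e-6, 10e-6)    # ~9 uH
```

A coupling coefficient near 1 corresponds to a small leakage inductance, which is the regime where the reported 90%-97% efficiencies are plausible for coreless designs.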
Abstract: Employees' task performance has been recognized as a core contributor to overall organizational effectiveness; hence, identifying the determinants of task performance is one of the most important research issues. This study tests the influence of perceived organizational support, abusive supervision, and exchange ideology on employees' task performance. We examined our hypotheses using self-reported data collected from 413 Korean employees in different organizations. All of our hypotheses were supported by the results. Implications and directions for future research are discussed.
Abstract: Osteoarthritis (OA) is the most prevalent and common debilitating form of arthritis and can be defined as a degenerative condition affecting the synovial joint. Patients suffering from osteoarthritis often complain of a dull aching pain on movement. Physical agents such as heat or cold therapy can fight the painful process when correctly indicated and used. Aim: This study was carried out to compare the effects of cold, warm and contrast therapy on controlling knee osteoarthritis-associated problems. Setting: The study was carried out in the orthopedic outpatient clinics of Menoufia University and Teaching Hospitals, Egypt. Sample: A convenience sample of 60 adult patients with unilateral knee osteoarthritis. Tools: Three tools were utilized to collect the data. Tool I: An interviewing questionnaire comprising three parts covering sociodemographic data, medical data and adverse effects of the treatment protocol. Tool II: The Knee Injury and Osteoarthritis Outcome Score (KOOS), which consists of five main parts. Tool III: A 0-10 numeric pain rating scale. Results: The total knee symptoms score decreased from moderate symptoms pre-intervention to mild symptoms after the warm and contrast methods of therapy, but contrast therapy had a more significant effect in reducing knee symptoms and pain than the other methods. Conclusions: All three methods of therapy resulted in improvement in knee symptoms and pain, but the most appropriate treatment protocol for relieving symptoms and pain was contrast therapy.
Abstract: Optimization plays an important role in most real-world applications, supporting decision makers in taking the right decisions regarding the strategic directions and operations of the systems they manage. Solutions to traffic management and traffic congestion problems are among the major concerns of decision-making authorities in cities around the world. This review paper gives a full description of the traffic problem as part of the transportation planning process and presents a framework for urban transportation system analysis whose core is a transportation network equilibrium model that is based on optimization techniques and that can also be used for evaluating an alternative solution, or a combination of alternative solutions, to traffic congestion. Different transportation network equilibrium models are reviewed, from the sequential approach to the multiclass model combining trip generation, trip distribution, modal split, trip assignment and departure time. A GIS-based intelligent decision support system framework for urban transportation system analysis is suggested for implementation, in which the selection of optimized alternative solutions, single or packaged, is based on an intelligent agent rather than a human being, leading to reductions in time and cost and eliminating the difficulty a human being faces in finding the best solution to the traffic congestion problem.
Abstract: Students in higher education are presented with new terms and concepts in nearly every lecture they attend. Many of them prefer Web-based self-tests for evaluating their understanding of concepts, since they can use those tests independently of tutors' working hours and thus avoid the necessity of being in a particular place at a particular time. There are a large number of multiple-choice tests in almost every subject designed to contribute to higher-level learning or to discover misconceptions. Every single test provides immediate feedback to a student about the outcome of that test, and in some cases a supporting system displays an overall score when a test is taken several times by a student. What we still find missing is how to secure the delivery of personalized feedback to a user while taking the user's progress into consideration. The present work is motivated by the wish to throw some light on that question.
Abstract: Automatic keyphrase extraction is useful in efficiently
locating specific documents in online databases. While several
techniques have been introduced over the years, improvement on
accuracy rate is minimal. This research examines attribute scores for
author-supplied keyphrases to better understand how the scores affect
the accuracy rate of automatic keyphrase extraction. Five attributes
are chosen for examination: Term Frequency, First Occurrence, Last
Occurrence, Phrase Position in Sentences, and Term Cohesion
Degree. The results show that First Occurrence is the most reliable
attribute. Term Frequency, Last Occurrence and Term Cohesion
Degree display a wide range of variation but are still usable with
suggested tweaks. Only Phrase Position in Sentences shows a totally
unpredictable pattern. The results imply that the commonly used ranking approach, which directly extracts the top-ranked potential phrases from the candidate keyphrase list as the keyphrases, may not be reliable.
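As an illustration of how such attribute scores can be computed, here is a minimal sketch of three of the five attributes for a candidate phrase. The paper's exact definitions and normalizations may differ; these are common, simplified formulations:

```python
def attribute_scores(document, phrase):
    """Simplified scores for three of the five attributes of a candidate phrase.

    Word positions are normalized by document length, so First/Last Occurrence
    lie in [0, 1).
    """
    words = document.lower().split()
    p = phrase.lower().split()
    hits = [i for i in range(len(words) - len(p) + 1) if words[i:i + len(p)] == p]
    if not hits:
        return None
    n = len(words)
    return {
        "term_frequency": len(hits),       # how often the phrase occurs
        "first_occurrence": hits[0] / n,   # fraction of text before the first hit
        "last_occurrence": hits[-1] / n,   # fraction of text before the last hit
    }

scores = attribute_scores(
    "machine learning is fun and machine learning is useful",
    "machine learning",
)
```

Under this formulation, a reliable attribute such as First Occurrence gives author-supplied keyphrases consistently low values (they tend to appear early), which is the kind of pattern the study examines.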
Abstract: This study aims to determine the level of pre-service teachers' computer phobia and tests whether computer phobia varies statistically significantly according to gender and computer experience. The study was performed on 430 pre-service teachers at the Education Faculty in Rize, Turkey. Data were collected through the Computer Phobia Scale, consisting of the "Personal Knowledge Questionnaire", the "Computer Anxiety Rating Scale", and the "Computer Thought Survey", and were analyzed with statistical procedures such as the t test and correlation analysis. According to the results of the statistical analyses, pre-service teachers' computer phobia does not vary statistically by gender, although male pre-service teachers have higher computer anxiety scores and lower computer thought scores. A negative and strong relationship was also observed between computer experience and computer anxiety, and pre-service teachers who use computers regularly reported lower computer anxiety. The results are discussed in terms of the number of computer classes in the Education Faculty curriculum, the hours of computer class, and the computer availability of student teachers.
Abstract: Many computer-based methods have been developed to assess the evacuation capability (EC) of high-rise buildings. Because such software is time-consuming and not suitable for on-scene applications, we adopted two methods, the fuzzy analytic hierarchy process (FAHP) and the technique for order preference by similarity to an ideal solution (TOPSIS), for the EC assessment of a high-rise building in Jinan. The EC scores obtained with the two methods and the evacuation times acquired with Pathfinder 2009 for floors 47-60 of the building were compared with each other. The results show that FAHP performs better than TOPSIS for the EC assessment of high-rise buildings, especially in dealing with the effect of occupant type and distance to exit on EC, tackling complex problems with a multi-level criteria structure, and requiring less computation. However, both FAHP and TOPSIS failed to appropriately handle the situation where the exit width changes while occupants are few.
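For reference, the standard TOPSIS closeness computation can be sketched as follows. Normalization and weighting choices vary between implementations, so this particular form is an assumption rather than the paper's exact procedure:

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Closeness coefficients for alternatives (rows) over criteria (columns).

    benefit[j] is True when a larger value of criterion j is better.
    """
    M = matrix / np.linalg.norm(matrix, axis=0)   # vector-normalize each criterion
    V = M * weights                               # weighted normalized matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))  # ideal solution
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))   # anti-ideal solution
    d_pos = np.linalg.norm(V - ideal, axis=1)     # distance to the ideal
    d_neg = np.linalg.norm(V - anti, axis=1)      # distance to the anti-ideal
    return d_neg / (d_pos + d_neg)                # higher = closer to the ideal

# Two symmetric alternatives score identically
scores = topsis(np.array([[1.0, 2.0], [2.0, 1.0]]),
                np.array([0.5, 0.5]),
                np.array([True, True]))
```

TOPSIS ranks alternatives on a single flat criteria matrix, which is one reason a hierarchical method such as FAHP copes better with the multi-level criteria structures mentioned above.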
Abstract: Control of complex systems is one of the important fields in complex systems research; it not only relies on the essence of complex systems, denoted by the core concept of emergence, but also embodies the elementary concepts of control theory. Aiming at a clear and self-contained description of emergence, this paper introduces a formal way to completely describe the formation and dynamics of emergence in complex systems. Consequently, the paper presents the Emergence-Oriented Control methodology, which contains three kinds of basic control schemes: direct control, system re-structuring and system calibration. As a universal ontology, Emergence-Oriented Control provides a powerful tool for identifying and resolving control problems in specific systems.
Abstract: In this paper, a new approach for quality assessment
tasks in lossy compressed digital video is proposed. The research
activity is based on the visual fixation data recorded by an eye
tracker. The method involved both a new paradigm for subjective
quality evaluation and the subsequent statistical analysis to match
subjective scores provided by the observer to the data obtained from
the eye tracker experiments. The study brings improvements to the
state of the art, as it solves some problems highlighted in the literature.
The experiments prove that data obtained from an eye tracker can be
used to classify videos according to the level of impairment due to
compression. The paper presents the methodology, the experimental
results and their interpretation. Conclusions suggest that the eye
tracker can be useful in quality assessment, if data are collected and
analyzed in a proper way.
Abstract: Attachment theory focuses on the bond that develops between child and caretaker and the consequences that this bond has on the child's future relationships. Adolescents attempt to define their identity by engaging in various risky behaviors. The first aim of the study was to examine whether risk-taking behavior differs according to attachment style; the second was to examine risk-taking behavior differences according to gender; the third was to examine the attachment X gender interaction effect on risk-taking behavior; and the final aim was to investigate attachment style differences according to gender. Data were collected from 218 participants (114 female and 104 male), all university students. The results of this study showed that risk-taking behavior differed by attachment style and that males had higher risk-taking scores than females. A significant attachment X gender interaction effect on risk-taking behavior was also found, and the results showed that attachment styles differed according to gender. Keywords: attachment style, risk taking.
Abstract: We have developed an energy-based approach for identifying the binding sites and the residues important for binding in protein-protein complexes. We found that residues and residue pairs with charged and aromatic side chains are important for binding; these residues tend to form cation-π, electrostatic and aromatic interactions. Our observations have been verified against the experimental binding specificity of protein-protein complexes and show good agreement with experiment. The analysis of surrounding hydrophobicity reveals that binding residues are less hydrophobic than non-binding sites, which suggests that the hydrophobic core is important for folding and stability, whereas the surface-seeking residues play a critical role in binding. Further, the propensity of residues in the binding sites of receptors and ligands, the number of medium- and long-range contacts, and the influence of neighboring residues are discussed.
Abstract: These days we are faced with many advertisements in magazines claiming that coaching is a pragmatic specialty that helps people make changes in their lives. Up to now, specialty coaches have not necessarily been therapists, consultants or psychologists, and thus they may not know psychological theories. The International Coach Federation identifies "facilitating learning and results" as one of its four core coach competencies, yet without an understanding of learning theories, coaching practice hangs in a theoretical abyss. The aim of this article is therefore to investigate learning theories within the coaching process. I reviewed several cognitive and behavioral learning theories and analyzed their contribution to the coaching process as it has been introduced in the papers and books of mentor coaches and ICF-certified coaches. The results demonstrate that the coaching profession is strongly grounded in learning theories and will be strengthened by the validation of theories and by evidence-based research as we move forward; thus, more research is needed in order to apply effective theoretical frameworks.
Abstract: Aspect Oriented Programming promises many advantages at the programming level by incorporating crosscutting concerns into separate units, called aspects. Join points are a distinguishing feature of Aspect Oriented Programming, as they define the points where core requirements and crosscutting concerns are (inter)connected. Currently, there is a problem with the composition of multiple aspects at the same join point, which introduces issues such as the ordering and controlling of these superimposed aspects. Dynamic strategies are required to handle these issues as early as possible. The state chart is an effective modeling tool for capturing dynamic behavior at the high-level design stage. This paper provides a methodology for formulating strategies for multiple-aspect composition at a high level, which helps to better implement these strategies at the coding level. It also highlights the need for designing shared join points at a high level by providing solutions to these issues using state chart diagrams in UML 2.0. A high-level design representation of shared join points also helps to implement the designed strategy in a systematic way.