Abstract: Although students' interest in pursuing Computer Science and related degrees is lower than in the previous decade, fundamentals of computing, specifically introductory-level programming courses, are listed as either core or elective courses for a number of non-computer-science majors. Universities accommodate these non-computer-science students either by creating separate sections of a class for them or by offering mixed-body classrooms, in which computer science and non-computer-science students take the courses together. In this work, we demonstrate how we handle the introductory-level programming course at Sam Houston State University and report our observations on students' success during the coursework. Moreover, we provide suggestions and methodologies, based on students' majors and skills, to overcome the deficiencies of mixed-body classes.
Abstract: This paper presents a computationally efficient method for modeling robot manipulators with flexible links and joints. The approach combines the Discrete Time Transfer Matrix Method with the Finite Segment Method, in which the flexible links are discretized into a number of rigid segments connected by torsion springs, and the flexibility of the joints is likewise modeled by torsion springs. The proposed method avoids assembling the global dynamics equations and has the advantage of modeling non-uniform manipulators. Experiments and simulations of a single-link flexible manipulator are conducted to verify the proposed methodology. Simulations of a three-link robot arm with link and joint flexibility are also performed.
Abstract: The objective of this study is to analyze the evolution of selected social and economic indicators of Mercosur's economies from 1980 to 2012, based on the statistics of the Latin American Integration Association (LAIA). Specifically, we observe whether, after these economies' accession to Mercosur (the first accessions occurred in 1994), the indicators showed better performance, in order to determine whether economic integration contributed to improved trade, macroeconomic performance, and the level of social and economic development of member countries. To this end, the methodologies used are a literature review and descriptive statistics. The theoretical framework that guides the work comprises the theories of integration: Classical Liberal, Marxist, and structural-proactive. The results reveal that most social and economic indicators showed better performance in those economies that joined Mercosur after 1994. This work is the result of an investigation already completed.
Abstract: The design process in architecture education is based upon the learning-by-doing method, which leads students to understand how to design by practicing rather than studying. First-year design studios, as the starting educational stage, provide integrated knowledge and skills of design for newly joined architecture students. Within the basic design studio environment, students are guided to transfer their abstract thoughts into concrete visual decisions under the supervision of design educators for the first time. Therefore, introductory design studios have a predominant impact on students' operational thinking and designing. Architectural design thinking is quite different from students' prior educational backgrounds and learning habits. This educational challenge at basic design studios creates a pressing need to study the reality of design education in the foundation year and to define appropriate educational methods, with suitable project types, with the intention of enhancing architecture education quality. Material for this study has been gathered through long-term direct observation in a first-year, second-semester design studio at the Faculty of Architecture at EMU (known as FARC 102) during the fall and spring academic semesters of 2014-15. Other methodologies used in this research are the distribution of a questionnaire among the case study students, interviews with third- and fourth-year design studio students who passed through the same methods of education in the past two years, and interviews with instructors. The results of this study reveal a risk of a mismatch between the implemented teaching method, project type, and scale at this particular level and students' learning styles. Although the existence of such a risk due to the variety in students' profiles could be expected to some extent, our recommendations can help educators reach maximum compatibility.
Abstract: Per capita energy usage in any country increases exponentially with its development. As a result, countries' dependence on fossil fuels for energy generation is also increasing tremendously, creating economic and environmental concerns. Since tropical countries receive a considerable amount of solar radiation throughout the year, the use of solar energy with different energy storage and conversion methodologies is a viable solution to minimize the ever-increasing demand for depleting fossil fuels. The salinity-gradient solar pond is one such solar energy application. This paper reports the characteristics and performance of a thermally insulated, experimental salinity-gradient solar pond built at the premises of the University of Kelaniya, Sri Lanka. Particular stress is given to the evolution of the three-layer structure that exists in the stable state of a salinity-gradient solar pond over a long period of time, under different environmental conditions. The operational procedures required to maintain long-term thermal stability are also reported in this article.
Abstract: Global Software Development (GSD) is becoming a common norm in the software industry, despite the fact that the global distribution of teams presents special issues for their effective communication and coordination. Trends are now changing, and project management for distributed teams is no longer in limbo. GSD can be effectively established using agile methods, and project managers can use different agile techniques/tools to solve the problems associated with distributed teams. Agile methodologies like Scrum and XP have been successfully used with distributed teams. We employed an exploratory research method to analyze recent studies on the challenges of GSD and their proposed solutions. In our study, we looked in depth at six commonly faced challenges: communication and coordination, temporal differences, cultural differences, knowledge sharing/group awareness, speed, and communication tools. We established that none of these challenges can be neglected for distributed teams of any kind; they are interlinked and, as an aggregated whole, can cause the failure of projects. In this paper, we focus on creating a scalable framework for detecting and overcoming these commonly faced challenges. In the proposed solution, our objective is to suggest agile techniques/tools relevant to a particular problem faced by organizations managing distributed teams. We focus mainly on Scrum and XP techniques/tools because they are widely accepted and used in industry. Our solution identifies the problem and suggests an appropriate technique/tool based on a globally shared knowledgebase. A cause-and-effect relationship can be established using a fishbone diagram based on the inputs provided for issues commonly faced by organizations; based on the identified cause, our framework suggests a suitable tool.
Hence, the proposed scalable, extensible, self-learning, intelligent framework will help organizations implement and assess GSD to get the most out of it. The globally shared knowledgebase will help new organizations easily adopt the best practices set forth by practicing organizations.
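At its simplest, the challenge-to-tool lookup such a knowledgebase supports could look like the sketch below. The challenge names follow the six challenges listed in the abstract, but the suggested techniques/tools are illustrative examples of our own, not the paper's actual knowledgebase entries:

```python
# Illustrative mapping from commonly faced GSD challenges to agile
# techniques/tools. The entries are plausible examples only, not the
# paper's actual globally shared knowledgebase.
KNOWLEDGEBASE = {
    "communication and coordination": "daily scrum via video conferencing",
    "temporal differences": "overlapping core hours and asynchronous stand-ups",
    "cultural differences": "team exchange visits and cultural liaisons",
    "knowledge sharing": "shared product backlog and team wiki",
    "speed": "short sprints with continuous integration",
    "communication tools": "shared scrum board and instant messaging",
}

def suggest_tool(challenge):
    """Return the agile technique/tool suggested for an identified cause."""
    return KNOWLEDGEBASE.get(
        challenge.lower(),
        "no suggestion; escalate to add a new knowledgebase entry",
    )
```

In the framework described above, such a table would grow over time as practicing organizations contribute entries, which is what makes the knowledgebase "self-learning" in aggregate.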
Abstract: Policy makers are increasingly looking to make evidence-based decisions. Evidence-based decisions have historically relied on the rigorous methodologies of empirical studies by research institutes, as well as on less reliable immediate surveys/polls, often with limited sample sizes. As we move into the era of Big Data analytics, policy makers are looking to different methodologies to deliver reliable empirics in real time. The question is no longer why people did this for the last ten years, but why people are doing this now and, if it is undesirable, how we can have an immediate impact to promote change. Big Data analytics rely heavily on government data that has been released into the public domain. The open data movement promises greater productivity and more efficient delivery of services; however, Australian government agencies remain reluctant to release their data to the general public. This paper considers the barriers to releasing government data as open data, and how these barriers might be overcome.
Abstract: The aim of this research paper is to conceptualize, discuss, analyze, and propose alternate design methodologies for a futuristic black box for flight safety. The proposal also includes global networking concepts for real-time surveillance and monitoring of flight performance parameters, including GPS parameters. It is expected that this proposal will serve as a failsafe real-time diagnostic tool for accident investigation and for locating debris in real time. In this paper, an attempt is made to improve the existing methods of flight data recording and to improve upon design considerations for a futuristic FDR, to overcome the trauma of not being able to locate the black box. Since modern communications and information technologies with large bandwidth are available, coupled with faster computer processing techniques, the attempt made in this paper to develop a failsafe recording technique is feasible. Furthermore, data fusion and data warehousing technologies are available for exploitation.
Abstract: Information security plays a major role in uplifting the standard of secured communications via global media. In this paper, we suggest a technique of encryption followed by insertion before transmission. Here, we implement two different concepts to carry out the above-specified tasks. We use the two-point crossover technique of the genetic algorithm to facilitate the encryption process. For each uniquely identified row of pixels, different mathematical methodologies are applied under several condition checks, in order to determine all the parent pixels on which we perform the crossover operation. This is done by selecting two crossover points within the pixels, thereby producing the newly encrypted child pixels, and hence the encrypted cover image. In the next stage, first- and second-order derivative operators are evaluated to increase security and robustness. The last stage reapplies the crossover procedure to form the final stego-image. The complexity of the system as a whole is huge, thereby deterring third-party interference. Moreover, the embedding capacity is very high, so a larger amount of secret image information can be hidden. The imperceptible appearance of the obtained stego-image clearly demonstrates the proficiency of this approach.
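The two-point crossover operator at the heart of the encryption stage can be sketched as follows. The parent-selection and crossover-point-selection rules described in the abstract are replaced here by explicit arguments, so this is an illustration of the operator only, not the paper's full scheme:

```python
import numpy as np

def two_point_crossover(parent_a, parent_b, p1, p2):
    """Swap the gene segment between crossover points p1 and p2 (p1 < p2).

    Applied here to two rows of pixel values, as in the encryption stage;
    everything outside [p1, p2) is inherited from the original parent."""
    child_a, child_b = parent_a.copy(), parent_b.copy()
    child_a[p1:p2] = parent_b[p1:p2]
    child_b[p1:p2] = parent_a[p1:p2]
    return child_a, child_b
```

Note that applying the same crossover again with the same points recovers the parents, which is what makes the operation invertible for decryption.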
Abstract: In this paper, we present a pedestrian detection descriptor called Fused Structure and Texture (FST) features, based on combining local phase information with texture features. Since the phase of a signal conveys more structural information than the magnitude, the phase congruency concept is used to capture the structural features. On the other hand, the Center-Symmetric Local Binary Pattern (CSLBP) approach is used to capture the texture information of the image. The dimensionless nature of phase congruency and the robustness of the CSLBP operator on flat image regions, as well as under blur and illumination changes, make the proposed descriptor more robust and less sensitive to light variations. The proposed descriptor is formed by extracting the phase congruency and CSLBP values of each pixel of the image with respect to its neighborhood. The histogram of the oriented phase and the histogram of the CSLBP values for local regions in the image are computed and concatenated to construct the FST descriptor. Several experiments were conducted on the INRIA and the low-resolution DaimlerChrysler datasets to evaluate the detection performance of the pedestrian detection system based on the FST descriptor. A linear Support Vector Machine (SVM) is used to train the pedestrian classifier. These experiments showed that the proposed FST descriptor has better detection performance than a set of state-of-the-art feature extraction methodologies.
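The CSLBP operator mentioned above compares the four center-symmetric neighbor pairs of a 3x3 window, yielding a 4-bit code per pixel. A minimal vectorized sketch follows; the threshold value T=0.01 is a common default for intensities in [0, 1], an assumption of ours rather than the paper's setting:

```python
import numpy as np

def cslbp(img, T=0.01):
    """Center-Symmetric LBP: a 4-bit code per interior pixel.

    Each bit compares one center-symmetric pair of neighbors in the
    3x3 window; the threshold T keeps the code stable on flat regions."""
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    # (row, col) offsets of the four center-symmetric neighbor pairs
    pairs = [((0, 0), (2, 2)), ((0, 1), (2, 1)),
             ((0, 2), (2, 0)), ((1, 2), (1, 0))]
    code = np.zeros((h - 2, w - 2), dtype=np.uint8)
    for bit, ((r1, c1), (r2, c2)) in enumerate(pairs):
        diff = img[r1:r1 + h - 2, c1:c1 + w - 2] - img[r2:r2 + h - 2, c2:c2 + w - 2]
        code |= (diff > T).astype(np.uint8) << bit
    return code
```

The FST descriptor then histograms these codes over local cells and concatenates them with the oriented-phase histograms.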
Abstract: This paper compares the findings of two studies conducted to determine the effectiveness of simulation-based learning, hands-on activities, and feedback mechanisms on student learning by answering the following questions: (1) Does the use of simulation improve students' learning outcomes? (2) How do students perceive the instructional design features embedded in the simulation program, such as exploration and scaffolding support, in learning new concepts? (3) What is the effect of feedback mechanisms on students' learning in the use of simulation-based labs? The paper also discusses findings which reveal that simulation by itself is not very effective in promoting student learning; simulation becomes effective when it is followed by hands-on activity and feedback mechanisms. Furthermore, the paper presents recommendations for improving student learning through the use of simulation-based, hands-on, and feedback-based teaching methodologies.
Abstract: Feedback is a vital element for improving student learning in simulation-based training, as it guides and refines learning through scaffolding. A number of studies in the literature have shown that students' learning is enhanced when feedback is provided with personalized tutoring that offers specific guidance and adapts feedback to the learner in a one-to-one environment. Thus, emulating these adaptive aspects of human tutoring in simulation provides an effective methodology for training individuals. This paper presents the results of a study that investigated the effectiveness of automating different types of feedback techniques, such as Knowledge-of-Correct-Response (KCR) and Answer-Until-Correct (AUC), in software simulation for learning basic information technology concepts. For the purpose of comparison, techniques like simulation with zero feedback (NFB) and traditional hands-on (HON) learning environments are also examined. The paper presents a summary of findings based on quantitative analyses, which reveal that the simulation-based instructional strategies are at least as effective as hands-on teaching methodologies for learning IT concepts. The paper also compares the results of the study with earlier studies and recommends strategies for using feedback mechanisms to improve students' learning when designing simulation-based IT training.
Abstract: The increase in electric power demand, in the face of environmental issues, has intensified the participation of renewable energy sources, such as photovoltaics, in the energy matrices of various countries. Due to their operational characteristics, these sources can generate time-varying harmonic and inter-harmonic distortions. For this reason, the application of measurement methods based on traditional Fourier analysis, as proposed by IEC 61000-4-7, can provide inaccurate results. Considering these aspects, this work presents the results of a comparative evaluation between a methodology arising from the combination of the Prony method with the Kalman filter and another method based on the IEC 61000-4-30 and IEC 61000-4-7 standards. The study employed synthetic signals and data acquired through measurements in a 50 kWp photovoltaic installation.
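As a sketch of the Prony stage of the combined method (the Kalman-filter tracking stage and the IEC grouping are omitted), the classical least-squares Prony fit estimates the frequencies, dampings, and complex amplitudes of a sum of damped sinusoids. The implementation below is a textbook version under that assumption, not the authors' code:

```python
import numpy as np

def prony(x, p, dt):
    """Least-squares Prony fit of p damped complex exponentials.

    Returns frequencies (Hz), damping factors (1/s), and complex amplitudes."""
    N = len(x)
    # 1) Linear prediction: x[n] = -(a_1 x[n-1] + ... + a_p x[n-p])
    A = np.column_stack([x[p - k - 1:N - k - 1] for k in range(p)])
    b = x[p:N]
    a, *_ = np.linalg.lstsq(A, -b, rcond=None)
    # 2) Roots of the characteristic polynomial give the signal modes z_k
    z = np.roots(np.concatenate(([1.0], a)))
    # 3) Vandermonde least squares recovers the complex amplitudes
    V = np.vander(z, N, increasing=True).T
    h, *_ = np.linalg.lstsq(V, x.astype(complex), rcond=None)
    freqs = np.angle(z) / (2 * np.pi * dt)
    damps = np.log(np.abs(z)) / dt
    return freqs, damps, h
```

For time-varying distortions such as those of photovoltaic installations, the paper's methodology combines a model of this kind with a Kalman filter to track the parameters over time.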
Abstract: The main purpose of this study is to assess the sediment quality and potential ecological risk in marine sediments in Gymea Bay, located in south Sydney, Australia. A total of 32 surface sediment samples were collected from the bay. Current track trajectories and velocities have also been measured in the bay. The resultant trace elements were compared with the adverse biological effect values of the Effect Range Low (ERL) and Effect Range Median (ERM) classifications. The results indicate that the average values of chromium, arsenic, copper, zinc, and lead in surface sediments all reveal low pollution levels and are below ERL and ERM values. The highest concentrations of trace elements were found close to discharge points and in the inner bay, and were linked with high percentages of clay minerals, pyrite, and organic matter, which can play a significant role in trapping and accumulating these elements. The lowest concentrations of trace elements were found on the shoreline of the bay, which contained high percentages of sand fractions. It is postulated that the fine particles and trace elements are disturbed by currents and tides, then transported and deposited in deeper areas. The current track velocities recorded in Gymea Bay were capable of transporting fine particles and trace element pollution within the bay. As a result, hydrodynamic measurements were able to provide useful information and to help explain the distribution of sedimentary particles and geochemical properties. This may lead to knowledge transfer to other bay systems, including those in remote areas. These activities can be conducted at low cost and are therefore also transferable to developing countries. The advent of portable instruments for measuring trace elements in the field has also contributed to the development of these lower-cost, easily applied methodologies for use in remote locations and low-cost economies.
Abstract: As computing technology advances, smartphone applications can assist student learning in a pervasive way. For example, using the PA Common Trees, Pests, and Pathogens mobile apps as a reference tool in the field allows middle school students to learn about trees and associated pests/pathogens without bringing a textbook. While working on the development of three heterogeneous mobile apps, we ran into numerous challenges. Both the traditional waterfall model and the more modern agile methodologies failed in practice. The waterfall model emphasizes planning the duration of each phase; when the duration of each phase is not consistent with the availability of developers, the waterfall model cannot be employed. When applying agile methodologies, we could not maintain the high frequency of the iterative development review process known as 'sprints'. In this paper, we discuss these challenges and our solutions. We propose a hybrid model, the Relay Race Methodology (RRM), to reflect the concepts of racing and relaying during the process of software development in practice. Based on the development project, we observe that the relay-race transition between any two phases arises naturally. Thus, we claim that the RRM can provide a de facto rather than a de jure basis for the core concept of a software development model. In this paper, the background of the project is introduced first. Then, the challenges are pointed out, followed by our solutions. Finally, the lessons learned and future work are presented.
Abstract: The critical concern of satellite operations is to ensure the health and safety of satellites. The worst case from this perspective is probably the loss of a mission, but the more common interruption of satellite functionality can also result in compromised mission objectives. All data acquired from the spacecraft are known as Telemetry (TM), which contains a wealth of information related to the health of all its subsystems. Each single item of information is contained in a telemetry parameter, which represents a time-variant property (i.e., a status or a measurement) to be checked. As a consequence, TM monitoring systems are continuously being improved to reduce the time required to respond to changes in a satellite's state of health. A fast grasp of the current state of the satellite is thus very important for responding to occurring failures. Statistical multivariate latent techniques are among the vital learning tools used to tackle the above problem coherently. Information extraction from such rich data sources using advanced statistical methodologies is a challenging task due to the massive volume of data. To solve this problem, in this paper we present an unsupervised learning algorithm based on the Principal Component Analysis (PCA) technique. The algorithm is applied to an actual remote sensing spacecraft. Data from the Attitude Determination and Control System (ADCS) were acquired under two operating conditions: normal and faulty states. The models were built and tested under these conditions, and the results show that the algorithm could successfully differentiate between the operating conditions. Furthermore, the algorithm provides competent predictive information and adds insight and physical interpretation to the ADCS operation.
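A minimal sketch of PCA-based monitoring in the spirit of the approach above: a model is fitted on normal-operation telemetry, and new samples are flagged when their squared prediction error (SPE, or Q statistic) exceeds a control limit. The empirical 99th-percentile limit and the function names are our simplifications, not the paper's exact formulation:

```python
import numpy as np

def fit_pca_monitor(X_normal, n_components):
    """Fit a PCA monitoring model on normal-operation telemetry.

    X_normal: samples x parameters. Returns the scaling, the retained
    loadings, and an empirical SPE control limit."""
    mu = X_normal.mean(axis=0)
    sigma = X_normal.std(axis=0) + 1e-12   # guard against constant channels
    Z = (X_normal - mu) / sigma
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_components].T                # retained principal loadings
    spe = ((Z - Z @ P @ P.T) ** 2).sum(axis=1)
    limit = np.percentile(spe, 99)         # simple empirical 99% limit
    return mu, sigma, P, limit

def spe_statistic(X, mu, sigma, P):
    """Squared prediction error of new samples under the fitted model."""
    Z = (X - mu) / sigma
    return ((Z - Z @ P @ P.T) ** 2).sum(axis=1)
```

Samples whose SPE exceeds the limit are flagged as potentially faulty; in practice, the T² statistic on the retained components is typically monitored alongside the SPE.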
Abstract: This paper illustrates the background of various concepts, approaches, and terminologies used to describe the basic framework of an Islamic hotel room design. It reviews the theoretical views on establishing a suitable and optimal environment for Muslim as well as non-Muslim guests in hotel rooms in accordance with shariah. It involves several research methodologies that require the researcher to study the characteristics needed to create more efficient rooms in terms of social interaction, economic growth, and other tolerable elements. This paper intends to reveal the elements that are vital and may help hotels achieve more conclusive research on space planning for hotel rooms, focusing on shariah and Muslim guests. Malaysia is an Islamic country and receives millions of tourists for business and recreational purposes. Therefore, having a righteous environment that best suits this target user is important for generating the economy, as well as for giving the community a better understanding of the benefits of applying these qualities in a conventional resort design.
Abstract: The teaching of computer programming to beginners has generally been considered a difficult and challenging task. Several methodologies and research tools have been developed; however, the difficulty of teaching still remains. Our work integrates the state of the art in teaching programming with game software and further provides metrics for the evaluation of student performance in the collaborative activity of playing games. This paper presents a multi-agent system architecture to be incorporated into educational collaborative game software for teaching programming, which monitors, evaluates, and encourages collaboration by the participants. A literature review has been conducted on the concepts of collaborative learning, multi-agent systems, collaborative games, and techniques for teaching programming using these concepts simultaneously.
Abstract: This study applies and analyzes two tourism management tools that can support public managers' decision-making: the Barometer of Tourism Sustainability (BTS) and the Ecological Footprint (EF). The results show that the BTS provides an integrated view of the tourism system, raising awareness of the need to plan appropriate actions so that the system can achieve the proposed positive rating (potentially sustainable). The ecological tourism footprint methodology, in turn, is an important tool for measuring the potential impacts generated by tourism on the tourism context.
Abstract: Today, a large number of political transcripts are available on the Web to be mined and used for statistical analysis and product recommendations. As online political resources are used for various purposes, automatically determining the political orientation of these transcripts becomes crucial. The methodologies used by machine learning algorithms for automatic classification are based on different features, classified under categories such as linguistic and personality features. Considering the ideological differences between liberals and conservatives, this paper studies the effect of personality traits on political orientation classification. The experiments in this study were based on the correlation between LIWC features and the Big Five personality traits. Several experiments were conducted on the Convote U.S. Congressional Speech dataset with seven benchmark classification algorithms. The different methodologies were applied to several LIWC feature sets consisting of 8 to 64 features correlated with the five personality traits. The experiments showed the Neuroticism trait to be the most differentiating personality trait for classifying political orientation. At the same time, it was observed that the personality-trait-based classification methodology gives results that are better than, or comparable to, related work.
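As an illustration of classifying on a trait-correlated feature subset, the sketch below applies a simple nearest-centroid rule to a chosen subset of feature columns. It stands in for the seven benchmark algorithms of the study, and the column indices are illustrative, not the actual LIWC-to-trait mapping:

```python
import numpy as np

def centroid_classifier(X_train, y_train, X_test, feature_idx):
    """Classify using only the features correlated with one personality trait.

    feature_idx selects the feature columns assumed to load on that trait
    (the indices used here are illustrative, not a real LIWC mapping)."""
    Xtr = X_train[:, feature_idx]
    Xte = X_test[:, feature_idx]
    classes = np.unique(y_train)
    # One centroid per class over the selected feature subset
    centroids = np.array([Xtr[y_train == c].mean(axis=0) for c in classes])
    # Assign each test sample to the nearest centroid
    d = ((Xte[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return classes[d.argmin(axis=1)]
```

Running the same rule once per trait-correlated subset and comparing accuracies mirrors, in miniature, how the study identifies the most differentiating trait.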