Abstract: Through the exploration of the lived experiences, beliefs and values of instructional leaders, teachers and students in Finland, Germany and Canada, we investigated the factors which contribute to developmentally responsive, intellectually engaging middle-level learning environments for early adolescents. Student-centred leadership dimensions, effective instructional practices and student agency were examined through the lens of current policy and research on middle-level learning environments emerging from the Canadian province of Manitoba. Consideration of these three research perspectives in the context of early adolescent learning, placed against an international backdrop, provided a previously undocumented perspective on leading, teaching and learning in the middle years. Aligning with a social constructivist, qualitative research paradigm, the study incorporated collective case study methodology, along with constructivist grounded theory methods of data analysis. Data were collected through semi-structured individual and focus group interviews and document review, as well as direct and participant observation. Three case study narratives were developed to share the rich stories of study participants, who had been selected using maximum variation and intensity sampling techniques. Interview transcript data were coded using processes from constructivist grounded theory. A cross-case analysis yielded a conceptual framework highlighting key factors that were found to be significant in the establishment of developmentally responsive, intellectually engaging middle-level learning environments. Seven core categories emerged from the cross-case analysis as common to all three countries. 
Within the visual conceptual framework (which depicts the interconnected nature of leading, teaching and learning in middle-level learning environments), these seven core categories were grouped into Essential Factors (student agency, voice and choice), Contextual Factors (instructional practices; school culture; engaging families and the community), Synergistic Factors (instructional leadership) and Cornerstone Factors (education as a fundamental cultural value; preservice, in-service and ongoing teacher development). In addition, sub-factors emerged from recurring codes in the data and identified specific characteristics and actions found in developmentally responsive, intellectually engaging middle-level learning environments. Although this study focused on 12 schools in Finland, Germany and Canada, it informs the practice of educators working with early adolescent learners in middle-level learning environments internationally. The authentic voices of early adolescent learners are the most important resource educators have to gauge whether they are creating effective learning environments for their students. Ongoing professional dialogue and learning are essential to ensure teachers are supported in their work and develop the pedagogical practices needed to meet the needs of early adolescent learners. It is critical to balance consistency, coherence and dependability in the school environment with the flexibility needed to support the unique learning needs of early adolescents. Educators must intentionally create a school culture that unites teachers, students and their families in support of a common purpose, as well as nurture positive relationships between the school and its community. A large, urban school district in Canada has implemented a school cohort-based model to begin to bring developmentally responsive, intellectually engaging middle-level learning environments to scale.
Abstract: In this paper, we present a binary cat swarm optimization algorithm for solving the set covering problem. The set covering problem is a well-known NP-hard problem with many practical applications, including scheduling, production planning and location problems. Binary cat swarm optimization is a recent swarm metaheuristic technique inspired by the behavior of domestic cats, which show the ability to hunt and are curious about moving objects. The cats operate in two modes of behavior: seeking mode and tracing mode. We illustrate this approach with 65 instances of the problem from the OR-Library. Moreover, we solve this problem with 40 new binarization techniques and select the technique with the best results. Finally, we compare the results obtained in previous studies with those of the new binarization technique, which uses roulette wheel as the transfer function and V3 as the discretization technique.
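A minimal sketch of the transfer-function-plus-discretization step used in such binarization schemes (the V3 formula below is the standard V-shaped function from the binarization literature; the bit-flip rule, function names and values are illustrative assumptions, not the paper's exact pairing):

```python
import math
import random

def v3_transfer(velocity):
    """V-shaped transfer function V3: maps a real velocity to a flip probability in [0, 1)."""
    return abs(velocity / math.sqrt(1.0 + velocity ** 2))

def discretize(bits, velocities, rng=random):
    """Flip each bit with the probability given by the transfer function."""
    return [1 - b if rng.random() < v3_transfer(v) else b
            for b, v in zip(bits, velocities)]

random.seed(0)
cat = [0, 1, 0, 1, 1]                 # one cat's binary position
vel = [0.2, -1.5, 3.0, 0.0, -0.4]     # its per-dimension velocities
new_cat = discretize(cat, vel)
print(new_cat)
```

Larger velocity magnitudes yield higher flip probabilities, so dimensions the search pushes hard on are the ones most likely to change.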
Abstract: In this era of online communication, which transacts data in 0s and 1s, confidentiality is a prized commodity. Ensuring safe transmission of encrypted data and their uncorrupted recovery is a matter of prime concern. Among the several techniques for secure sharing of images, this paper proposes a k out of n region incrementing image sharing scheme for color images. The highlight of this scheme is the use of simple Boolean and arithmetic operations for generating shares and the Lagrange interpolation polynomial for authenticating shares. Additionally, this scheme addresses problems faced by existing algorithms, such as color reversal and pixel expansion. This paper regenerates the original secret image, whereas existing systems regenerate only a half-toned secret image.
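The k-out-of-n recovery principle behind such schemes can be illustrated with a standard Lagrange-interpolation secret-sharing sketch over a small prime field (the field size, pixel value and function names are illustrative assumptions; the paper's actual scheme combines Boolean and arithmetic operations and uses interpolation for authentication):

```python
import random

PRIME = 257  # smallest prime above 255, so any 8-bit pixel value fits the field

def make_shares(secret, k, n, rng=random):
    """Split one pixel value into n shares; any k of them recover it."""
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(k - 1)]
    return [(x, sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def recover(shares):
    """Lagrange interpolation at x = 0 recovers the secret pixel."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse (Fermat's little theorem)
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret

random.seed(1)
shares = make_shares(123, k=3, n=5)
print(recover(shares[:3]))  # any 3 of the 5 shares suffice: prints 123
```

Fewer than k shares reveal nothing about the pixel, which is what makes the threshold property useful for region-incrementing disclosure.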
Abstract: The rapid generation of a high volume and broad variety of data from the application of new technologies poses challenges for the generation of business intelligence. Most organizations and business owners need to extract data from multiple sources and apply analytical methods for the purposes of developing their business. Therefore, recent decentralized data management environments rely on a distributed computing paradigm. While data are stored in highly distributed systems, the implementation of distributed data-mining techniques is a challenge. The aim of these techniques is to gather knowledge from every domain and all the datasets stemming from distributed resources. As agent technologies offer significant contributions to managing the complexity of distributed systems, we consider them for next-generation data-mining processes. To demonstrate agent-based business intelligence operations, we use agent-oriented modeling techniques to develop a new artifact for mining massive datasets.
Abstract: The laws of Newtonian mechanics allow ab-initio
molecular dynamics to model and simulate particle trajectories in
material science by defining a differentiable potential function. This
paper discusses some considerations for the coding of ab-initio
programs for simulation on a standalone computer and illustrates
the approach with C language code in the context of embedded
metallic atoms in the face-centred cubic structure. The algorithms use
velocity-time integration to determine particle parameter evolution
for up to several thousands of particles in a thermodynamical
ensemble. Such functions are reusable and can be placed in a
redistributable header library file. While there are both commercial
and free packages available, their heuristic nature prevents dissection.
In addition, developing one's own codes has the obvious advantage of
teaching techniques applicable to new problems.
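The velocity-time integration described can be sketched with the velocity Verlet scheme (shown here in Python for brevity rather than the C of the paper; the unit-mass harmonic-oscillator force is an illustrative stand-in for an embedded-atom potential):

```python
def velocity_verlet(x, v, accel, dt, steps):
    """Velocity-Verlet integration of one particle's trajectory."""
    a = accel(x)
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt   # position update
        a_new = accel(x)                  # force at the new position
        v += 0.5 * (a + a_new) * dt       # velocity update uses both forces
        a = a_new
    return x, v

# unit-mass harmonic oscillator a(x) = -x; total energy x^2 + v^2 is conserved
x, v = velocity_verlet(1.0, 0.0, lambda x: -x, dt=0.01, steps=1000)
print(x * x + v * v)  # stays close to the initial value 1.0
```

The same update loop generalizes to many particles by evaluating the potential gradient per particle, which is why such functions can be factored into a reusable header library.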
Abstract: The development of change prediction models can help software practitioners in planning testing and inspection resources at early phases of software development. However, a major challenge faced during the training process of any classification model is the imbalanced nature of software quality data. A dataset with very few samples in the minority outcome category leads to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change-prone whereas a majority of classes may be non-change-prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics when dealing with imbalanced data. In order to empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android data set and evaluates the performance of six different machine learning techniques. The results of the study indicate an extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
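One of the simplest resampling methods for such imbalanced data is random oversampling of the minority class; a self-contained sketch (the tiny dataset, labels and function name are illustrative assumptions, not the study's Android data):

```python
import random
from collections import Counter

def random_oversample(X, y, rng=random):
    """Duplicate randomly chosen minority-class samples until all classes balance."""
    counts = Counter(y)
    majority = max(counts.values())
    X_out, y_out = list(X), list(y)
    for label, n in counts.items():
        idx = [i for i, yi in enumerate(y) if yi == label]
        for _ in range(majority - n):
            i = rng.choice(idx)
            X_out.append(X[i])
            y_out.append(label)
    return X_out, y_out

random.seed(0)
X = [[0.1], [0.2], [0.3], [0.4], [0.9]]
y = [0, 0, 0, 0, 1]          # class 1 is the rare "change-prone" outcome
Xb, yb = random_oversample(X, y)
print(Counter(yb))           # both classes now have 4 samples
```

A classifier trained on the balanced copy no longer minimizes its loss by ignoring the change-prone class, which is the failure mode the abstract describes.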
Abstract: In this paper, the degradation of a photopolymeric material (PhPM), used as a printing plate in the flexographic reproduction technique, caused by accelerated aging is observed. Since the basic process for the production of printing plates from the PhPM is a radical cross-linking process caused by exposure to UV wavelengths, the assumption was that improper storage or irregular handling of the PhPM plate can change the surface and structural characteristics of the plates. Results have shown that the aging process causes degradation of the structure and changes in the surface of the PhPM printing plate.
Abstract: Large scale computing infrastructures have been widely
developed with the core objective of providing a suitable platform
for high-performance and high-throughput computing. These systems
are designed to support resource-intensive and complex applications,
which can be found in many scientific and industrial areas. Currently,
large scale data-intensive applications are hindered by the high
latencies that result from access to vastly distributed data.
Recent works have suggested that improving data locality is key to
move towards exascale infrastructures efficiently, as solutions to this
problem aim to reduce the bandwidth consumed in data transfers, and
the overheads that arise from them. There are several techniques that
attempt to move computations closer to the data. In this survey we
analyse the different mechanisms that have been proposed to provide
data locality for large scale high-performance and high-throughput
systems. This survey intends to assist the scientific computing
community in understanding the various technical aspects and
strategies that have been reported in recent literature regarding data
locality. As a result, we present an overview of locality-oriented
techniques, grouped into four main categories: application development, task
scheduling, in-memory computing and storage platforms. Finally, the
authors include a discussion on future research lines and synergies
among the former techniques.
Abstract: In this paper, we present the results of a study of TiN thin films deposited by Physical Vapour Deposition (PVD) and Ion Beam Assisted Deposition (IBAD). In the present investigation, subsequent ion implantation with N5+ ions was applied to enhance the mechanical properties of the surface. The thin film deposition process influences a number of film characteristics, such as crystallographic orientation, morphology, topography and densification. A variety of analytical techniques were used for characterization, such as the scratch test, the calo test, Scanning Electron Microscopy (SEM), Atomic Force Microscopy (AFM), X-ray Diffraction (XRD) and Energy Dispersive X-ray analysis (EDAX).
Abstract: This paper discusses the importance of having a good initial characterization of soil samples when thermal desorption has to be applied to polluted soils for the removal of contaminants. Particular attention has to be devoted to the desorption kinetics of the samples in order to identify the gases evolved during heating and the contaminant degradation pathways. In this study, two samples coming from different points of the same contaminated site were considered. The samples differ markedly from each other. Moreover, the presence of a high initial quantity of heavy hydrocarbons strongly affected the performance of thermal desorption, resulting in the formation of dangerous intermediates. Analytical techniques such as TGA (Thermogravimetric Analysis), DSC (Differential Scanning Calorimetry) and GC-MS (Gas Chromatography–Mass Spectrometry) provided good support for giving correct indications for field application.
Abstract: In this paper, a design of an H.263-based wireless video transceiver is presented for a wireless camera system. It uses a standard WiFi transceiver with a coverage area of up to 100 m. Furthermore, the standard H.263 video encoding technique is used for video compression, since the wireless video transmitter is unable to transmit high-capacity raw data in real time; the implemented system is capable of streaming NTSC 720x480 video at less than 1 Mbps.
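The need for compression can be checked with rough arithmetic (the frame rate and 4:2:0 chroma subsampling are assumptions; the abstract states only the resolution and the sub-1 Mbps budget):

```python
# raw bitrate of NTSC 720x480 video, assuming 30 fps and 4:2:0 (12 bits/pixel)
width, height, bits_per_pixel, fps = 720, 480, 12, 30
raw_bps = width * height * bits_per_pixel * fps
link_bps = 1_000_000  # the ~1 Mbps wireless budget from the abstract
print(raw_bps / 1e6)        # about 124 Mbps of raw video
print(raw_bps // link_bps)  # compression factor the codec must deliver
```

Under these assumptions the encoder must shrink the stream by a factor of roughly 124, well within what H.263 was designed for.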
Abstract: Recently, Automatic Speech Recognition (ASR) systems have been used to assist children in language acquisition, as they can detect the human speech signal. Despite the benefits offered by ASR systems, there is a lack of ASR systems for Malay-speaking children. One contributing factor is the lack of a continuous speech database for the target users. Though cross-lingual adaptation is a common solution for developing ASR systems for under-resourced languages, it is not viable for children, as there are very limited speech databases to serve as a source model. In this research, we propose a two-stage adaptation for the development of an ASR system for Malay-speaking children using a very limited database. The two-stage adaptation comprises cross-lingual adaptation (first stage) and cross-age adaptation (second stage). In the first stage, a well-known speech database that is phonetically rich and balanced is adapted to a medium-sized Malay adult database using supervised MLLR. The second-stage adaptation uses the acoustic model generated in the first stage, with a small database of the target users as the target. We measured the performance of the proposed technique using word error rate and compared it with a conventional benchmark adaptation. The two-stage adaptation proposed in this research achieves better recognition accuracy than the benchmark adaptation in recognizing children's speech.
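The word error rate metric used for the evaluation is the standard word-level edit distance normalized by reference length; a minimal sketch (the Malay example sentence is an illustrative assumption):

```python
def word_error_rate(reference, hypothesis):
    """WER = (substitutions + deletions + insertions) / reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # classic dynamic-programming edit distance over words
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution or match
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("saya suka makan nasi", "saya suka minum nasi"))  # 0.25
```

One substituted word out of four reference words yields a WER of 0.25; lower values mean better recognition accuracy.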
Abstract: The material behavior of graphene, a single layer of
carbon lattice, is extremely sensitive to its dielectric environment. We
demonstrate improvement in electronic performance of graphene
nanowire interconnects with full encapsulation by lattice-matching,
chemically inert, 2D layered insulator hexagonal boron nitride (h-
BN). A novel layer-based transfer technique is developed to construct
the h-BN/MLG/h-BN heterostructures. The encapsulated graphene
wires are characterized and compared with wires on SiO2 or h-BN
substrates without a passivating h-BN layer. Significant improvements
in maximum current-carrying density, breakdown threshold, and
power density in encapsulated graphene wires are observed. These
critical improvements are achieved without compromising the carrier
transport characteristics in graphene. Furthermore, the encapsulated
graphene wires exhibit electrical behavior that is less sensitive to
ambient conditions than that of the non-passivated ones. Overall, the
h-BN/graphene/h-BN heterostructure presents a robust material platform
towards the implementation of high-speed carbon-based interconnects.
Abstract: This paper presents a SAC-OCDMA code with a zero cross-correlation property, the New Zero Cross Correlation code (NZCC), which minimizes Multiple Access Interference (MAI) and is found to be more scalable than other existing SAC-OCDMA codes. The NZCC code is constructed using an address segment and a data segment. In this work, the proposed NZCC code is implemented in an optical system using the OptiSystem software for the spectral amplitude coded optical code-division multiple-access (SAC-OCDMA) scheme. The main contribution of the proposed NZCC code is its zero cross-correlation, which reduces both MAI and PIIN noise. The proposed NZCC code offers minimum cross-correlation, flexibility in selecting the code parameters and support for a large number of users, combined with a high data rate and longer fiber length. Simulation results reveal that the optical code-division multiple-access system based on the proposed NZCC code accommodates the maximum number of simultaneous users with higher data rate transmission, lower Bit Error Rates (BER) and longer travelling distance without signal quality degradation, as compared to the existing SAC-OCDMA codes.
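The zero cross-correlation property means that any two users' code sequences share no overlapping '1' chip positions; a toy check (the code set shown is illustrative, not the NZCC construction):

```python
def cross_correlation(a, b):
    """In-phase cross-correlation of two binary code sequences."""
    return sum(x * y for x, y in zip(a, b))

# toy code set: distinct users occupy disjoint chip positions,
# so every pairwise cross-correlation is zero
codes = [
    [1, 1, 0, 0, 0, 0],
    [0, 0, 1, 1, 0, 0],
    [0, 0, 0, 0, 1, 1],
]
pairs = [(i, j) for i in range(len(codes)) for j in range(i + 1, len(codes))]
print(all(cross_correlation(codes[i], codes[j]) == 0 for i, j in pairs))  # True
```

Because no spectral chip is shared between users, one user's signal contributes nothing to another's correlator output, which is exactly how MAI and PIIN are suppressed.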
Abstract: Parallel hybrid storage systems consist of a hierarchy of different storage devices that vary in terms of data reading speed. As we ascend the hierarchy, data reading becomes faster. Thus, migrating the application's important data that will be accessed in the near future to the uppermost level will reduce the application's I/O waiting time and hence its execution elapsed time. In this research, we implement a trace-driven, two-level parallel hybrid storage system prototype that consists of HDDs and SSDs. The prototype uses data mining techniques to classify the application's data in order to determine its near-future data accesses, in parallel with serving its on-demand requests. The important data (i.e. the data that the application will access in the near future) are continuously migrated to the uppermost level of the hierarchy. Our simulation results show that our data migration approach, integrated with data mining techniques, reduces the application execution elapsed time by at least 22% across a variety of traces.
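A minimal sketch of classifying "important" data for migration (a simple access-frequency heuristic stands in for the paper's data-mining classifier; the block IDs and SSD capacity are illustrative):

```python
from collections import Counter

def pick_hot_blocks(trace, ssd_capacity):
    """Classify the most frequently accessed blocks as migration candidates."""
    freq = Counter(trace)
    return [block for block, _ in freq.most_common(ssd_capacity)]

# toy I/O trace of block IDs read by the application
trace = [7, 3, 7, 9, 7, 3, 1, 7, 3, 2]
print(pick_hot_blocks(trace, ssd_capacity=2))  # blocks to move to the SSD tier
```

In the prototype described, such classification would run continuously alongside on-demand requests, promoting the predicted hot blocks to the faster tier before they are next read.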
Abstract: The study of sanitary landfill siting in Bang Nok-khwaek
municipality consists of two procedures: first, to survey and create
a spatial database of physical, environmental, economic and social
factors following Geographic Information System (GIS) methods;
second, to analyze suitable areas for locating the sanitary landfill in
Bang Nok-khwaek municipality using overlay techniques to calculate
the weighted linear total in the ArcGIS program.
The study found that there are 2.49 sq km of suitable area for the
sanitary landfill in Bang Nok-khwaek municipality, which is 66.76%
of the whole area. The most suitable area is 0.02 sq km (0.54%), the
highly suitable area is 0.3 sq km (8.04%), the moderately suitable
area is 1.62 sq km (43.43%) and the low-suitability area is 0.55 sq km
(14.75%). These results will be used as a guideline for selecting the
sanitary landfill area in accordance with sanitation standards for the
Subdistrict Administrative Organization and Subdistrict Municipality
in Samut Songkhram province.
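The weighted linear total underlying such an overlay analysis can be sketched as a cell-by-cell weighted sum of factor scores (the rasters, scores and weights below are illustrative assumptions, not the study's data):

```python
def weighted_overlay(layers, weights):
    """Weighted linear combination of factor scores, cell by cell."""
    return [sum(w * layer[i] for w, layer in zip(weights, layers))
            for i in range(len(layers[0]))]

# three 4-cell factor rasters scored 1 (poor) to 4 (good); weights sum to 1
physical      = [4, 3, 2, 1]
environmental = [3, 3, 2, 2]
economic      = [2, 4, 1, 3]
suitability = weighted_overlay([physical, environmental, economic],
                               [0.5, 0.3, 0.2])
print(suitability)  # higher score = more suitable cell
```

Classifying the resulting scores into ranges (highest, high, moderate, low) yields suitability zones of the kind the study reports.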
Abstract: Biological processes based on oxidation of sulfur
compounds by chemolithotrophic microorganisms are emerging as an
efficient and eco-friendly technique for removal of sulfur from the
coal. In the present article, a study was carried out to investigate the
potential of the biodesulfurization process in removing sulfur from a
lignite coal sample collected from a Mongolian coal mine. The batch
biodesulfurization experiments were conducted in 2.5 L borosilicate
baffle type reactors at 35 ºC using Acidithiobacillus ferrooxidans.
The effect of pulp density on the efficiency of biodesulfurization was
investigated at different solids concentrations (1-10%) of coal. The
results of the present study suggest that the rate of desulfurization
was retarded at higher coal pulp densities. The optimum pulp density
was found to be 5%, at which about 48% of the total sulfur was
removed from the coal.
Abstract: The topology optimization technique utilizes element
densities as design variables. The optimal distribution contours of the
material densities between voids (0) and solids (1) in the design
domain represent the resulting topology: regions with high element
density values become occupied by solids, while regions with
near-zero density values remain void. The void regions of topology
optimization results therefore provide design information for deciding
appropriate positions of web-openings in a structure. In contrast to
the basic objective of the topology optimization technique, which is
to obtain the optimal topology of structures, the present study
proposes a new idea: topology optimization results can also be
utilized to decide the proper positions of web-openings. Numerical
examples of linear elastostatic structures demonstrate the efficiency
of the methodological design process using topology optimization to
determine the proper positions of web-openings.
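The idea of reading web-opening positions off the optimized density field can be sketched as a simple threshold on element densities (the density values and the threshold are illustrative assumptions, not the study's optimization output):

```python
def void_regions(densities, threshold=0.3):
    """Flag low-density elements as void: candidate web-opening locations."""
    return [i for i, rho in enumerate(densities) if rho < threshold]

# toy optimized element densities along a beam web (0 = void, 1 = solid)
rho = [0.95, 0.9, 0.1, 0.05, 0.85, 0.2, 0.9]
print(void_regions(rho))  # elements 2, 3 and 5 are candidates for openings
```

Elements the optimizer drives toward zero density carry little load, so cutting openings there removes material the structure was not using anyway.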
Abstract: Object manipulation techniques in robotics can be
categorized into two major groups: manipulation with grasp and
manipulation without grasp. The aim of this paper is to develop an
object manipulation method which, in addition to being grasp-less,
performs the manipulation task in a passive approach. In this method,
the linear and angular positions of the object are changed and its
manipulation path is controlled. The manipulation path is a helical
track with constant radius and incline. The method presented in this
paper proposes a system which has neither an actuator nor an active
controller, so the system requires a passive mechanical intelligence to
convey the object from the source state along the specified path to the
goal state. This intelligence is created by exploiting the geometry of
the system components. A general set-up for the components of the
system is considered to satisfy the required conditions; then, after
kinematical analysis, the detailed dimensions and geometry of the
mechanism are obtained. The kinematical results are verified by
simulation in ADAMS.
Abstract: The present work is devoted to thermographic studies of curved composite panels (unidirectional GFRP) with subsurface defects. Various artificial defects, created by inserting PTFE strips between individual layers of a laminate during the manufacturing stage, are studied. The analysis is conducted both with the finite element method and with experiments. To simulate transient heat transfer in a 3D model with embedded defects of various sizes, the ANSYS package is used. Pulsed Thermography combined with an optical excitation source provides good results for flat surfaces. However, composite structures are mostly used in complex components, e.g., pipes, corners and stiffeners. A local decrease of mechanical properties in these regions can have a significant influence on the strength of the entire structure. The application of active thermography procedures to defect detection and evaluation in this type of element seems more appropriate than other NDT techniques. Nevertheless, there are various uncertainties connected with the correct interpretation of acquired data. In this paper, important factors concerning Infrared Thermography measurements of curved surfaces in the form of cylindrical panels are considered. In addition, temperature effects on the surface resulting from complex geometry and from embedded and real defects are also presented.