Abstract: This research paper presents guidelines for integrating
social media into higher education courses. The research
methodology used a survey approach. The research instrument was a
questionnaire about guidelines for integrating social media into
higher education courses. Thirty-one lecturers completed the
questionnaire. The data were analyzed using frequencies and
percentages. The research results reflect the lecturers’ opinions on
integrating social media into higher education courses, as follows: 1)
Lecturers deem that the most suitable learning theory is Collaborative
Learning. 2) Lecturers consider that the most important learning and
innovation skill in the 21st century is communication and
collaboration. 3) Lecturers think that the most suitable
evaluation technique is authentic assessment. 4) Lecturers consider
that the most appropriate blended learning proportion is
70% in the classroom setting and 30% online.
Abstract: The standard Gibbs energy of formation ΔGfor(298.15) of
lanthanide–iron double oxides with the garnet-type crystal structure
R3Fe5O12 – RIG (where R is a rare-earth ion) from their constituent
oxides is evaluated. The calculation is based on the standard entropies
S298.15 and standard enthalpies of formation ΔH298.15 of the compounds
involved in garnet synthesis. The Gibbs energy
of formation is presented as a function of temperature, ΔGfor(T), for the
range 300–1600 K. The necessary starting thermodynamic data were
obtained from calorimetric studies of heat capacity–temperature
functions and by using a semi-empirical method for the calculation of
ΔH298.15 of formation. The thermodynamic functions at standard
temperature – enthalpy, entropy and Gibbs energy – are
recommended as reference data for technological evaluations.
Across the structural series of rare earth–iron garnets, the correlation
between the thermodynamic properties and the characteristics of the
lanthanide ions is elucidated.
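The temperature dependence referred to above is commonly obtained with the first-order approximation in which ΔH and ΔS are taken as temperature-independent over the stated range (a standard simplification; the abstract does not specify the exact treatment used):

```latex
\Delta G_{\mathrm{for}}(T) \;\approx\; \Delta H^{\circ}_{298.15} \;-\; T\,\Delta S^{\circ}_{298.15},
\qquad 300~\mathrm{K} \le T \le 1600~\mathrm{K},
```

where, for the synthesis reaction \( \tfrac{3}{2}\,\mathrm{R_2O_3} + \tfrac{5}{2}\,\mathrm{Fe_2O_3} \rightarrow \mathrm{R_3Fe_5O_{12}} \), the entropy of formation from the oxides is \( \Delta S^{\circ}_{298.15} = S^{\circ}_{298.15}(\mathrm{R_3Fe_5O_{12}}) - \tfrac{3}{2}\,S^{\circ}_{298.15}(\mathrm{R_2O_3}) - \tfrac{5}{2}\,S^{\circ}_{298.15}(\mathrm{Fe_2O_3}) \).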
Abstract: There is currently a gap in the technology covering the
rapid establishment of control after a reconfiguration in a
Reconfigurable Manufacturing System. This gap involves the
detection of the factory floor state and the communication link
between the factory floor and the high-level software. In this paper, a
thin, hardware-supported Middleware Management System (MMS) is
proposed and its design and implementation are discussed. The
research found that a cost-effective localization technique can be
combined with intelligent software to speed up the ramp-up of a
reconfigured system. The MMS makes the process more intelligent,
more efficient and less time-consuming, thus supporting the
industrial implementation of the RMS paradigm.
Abstract: Discursive practices enacted by educators in
kindergarten create a blueprint for how the educational trajectories of
students with disabilities are constructed. This two-year ethnographic
case study critically examines educators’ relationships with students
considered to present challenging behaviors in one kindergarten
classroom located in a predominantly White middle class school
district in the Northeast of the United States. Focusing on the
language and practices used by one special education teacher and
three teaching assistants, this paper analyzes how teacher responses
to students’ behaviors construct and position students over one year
of kindergarten education. Using critical discourse analysis, it shows
that educators understand students’ behaviors as deficits requiring
consequences. This study highlights how educators’ responses reflect
students' individual characteristics, including family background,
socioeconomic status and ability status. This paper offers an in-depth
analysis of two students’ stories, which evidence that the language used by
educators amplifies the social positioning of students within the
classroom and creates a foundation for who they are constructed to
be. Through exploring routine language and practices, this paper
demonstrates that educators outlined a blueprint of kindergartners,
which positioned students as learners in ways that became the grounds
for either a limited or a promising educational pathway for them.
Abstract: In this article, we deal with a variant of the classical
course timetabling problem that has a practical application in many
areas of education. In particular, in this paper we are interested in
high school remedial courses. The purpose of such courses is to
provide under-prepared students with the skills necessary to succeed
in their studies. A student might be under-prepared in
an entire course, or only in a part of it. The limited availability
of funds, as well as the limited time and teachers at the schools'
disposal, often requires schools to choose which courses and/or which
teaching units to activate. Thus, schools need to model the training
offer and the related timetabling, with the goal of ensuring the
highest possible teaching quality while meeting the above-mentioned
financial, time and resource constraints. Moreover, there are some
prerequisites between the teaching units that must be satisfied. We
first present a Mixed-Integer Programming (MIP) model to solve
this problem to optimality. However, the presence of many peculiar
constraints inevitably increases the complexity of
the mathematical model. Thus, solving it through a general-purpose
solver is feasible only for small instances, while solving
real-life-sized instances of the model requires specific techniques
or heuristic approaches. For this purpose, we also propose a heuristic
approach, in which we make use of a fast constructive procedure
to obtain a feasible solution. To assess our exact and heuristic
approaches we perform extensive computational experiments on both
real-life instances (obtained from a high school in Lecce, Italy) and
randomly generated instances. Our tests show that the MIP model is
never solved to optimality, with an average optimality gap of 57%.
On the other hand, the heuristic algorithm is much faster (in about
50% of the considered instances it converges in approximately half of
the time limit) and in many cases improves on the objective function
value obtained by the MIP model, with improvements ranging between
18% and 66%.
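A fast constructive procedure of the general kind mentioned above can be sketched as follows; the unit names, hour budget and greedy rule are illustrative assumptions, not the authors' actual algorithm:

```python
from graphlib import TopologicalSorter

def greedy_activation(units, prereqs, hour_budget):
    """Greedily activate teaching units in a prerequisite-respecting
    (topological) order until the hour budget is exhausted.
    units:   unit name -> required hours
    prereqs: unit name -> set of prerequisite units (every unit listed)
    Returns the activated units and the total hours used."""
    activated, used = [], 0
    for unit in TopologicalSorter(prereqs).static_order():
        hours = units[unit]
        # Activate a unit only if it fits in the remaining budget and
        # all of its prerequisites were themselves activated.
        if used + hours <= hour_budget and prereqs[unit] <= set(activated):
            activated.append(unit)
            used += hours
    return activated, used
```

A MIP would instead choose among activations to maximize teaching quality subject to the same budget and prerequisite constraints; the greedy pass only guarantees feasibility, which is why the paper pairs it with the exact model.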
Abstract: New environmental regulations and the increasing
market preference for companies that respect the ecosystem have
encouraged industry to look for new treatments for its effluents.
The sugar industry, one of the largest emitters of environmental
pollutants, follows this tendency. Membrane technology is
convenient for the separation of suspended solids, colloids and
high-molecular-weight materials present in wastewater from the
sugar industry. The idea is to microfilter the wastewater, so that the
permeate passes through the membrane and becomes available for
recycling and reuse in the sugar manufacturing process. For
microfiltration of this effluent, a tubular ceramic membrane with a
pore size of 200 nm was used at transmembrane pressures in the range
of 1–3 bar and flow rates in the range of 50–150 L/h. A Kenics static
mixer was used to enhance the permeate flux. Turbidity and suspended
solids were removed, and the permeate flux was continuously monitored
during the microfiltration process. The flux achieved after 90 minutes
of microfiltration was in the range of 50–70 L/m²h. The obtained
turbidity decrease was in the range of 50–99%, and the total amount of
suspended solids was removed.
Abstract: Indonesia has experienced annual forest fires that have
rapidly destroyed and degraded its forests. Fires in the peat swamp
forests of Riau Province have made these problems worse, as this
ecosystem is the most prone to fires (and these fires are also the most
difficult to extinguish). Despite various efforts to curb deforestation
and forest degradation, severe forest fires still
occur. To find an effective solution, the basic causes of the
problems must be identified. It is therefore critical to have an
in-depth understanding of the underlying causal factors that have
contributed to deforestation and forest degradation as a whole, in
order to attain reductions in their rates. An assessment of the drivers of deforestation and forest
degradation was carried out, in order to design and implement
measures that could slow these destructive processes. Research was
conducted in Giam Siak Kecil–Bukit Batu Biosphere Reserve
(GSKBB BR), in the Riau Province of Sumatera, Indonesia. A
biosphere reserve was selected as the study site because such reserves
aim to reconcile conservation with sustainable development. A
biosphere reserve should promote a range of local human activities,
together with development values that are in line spatially and
economically with the area conservation values, through use of a
zoning system. Moreover, GSKBB BR is an area with vast peatlands,
and is experiencing forest fires annually. Various factors were
analysed to assess the drivers of deforestation and forest degradation
in GSKBB BR; data were collected from focus group discussions
with stakeholders, key informant interviews with key stakeholders,
field observation and a literature review. Landsat satellite imagery was used to map forest-cover changes
for various periods. Analysis of Landsat images taken during the
period 2010–2014 revealed that, within the non-protected area of the core
zone, there was a trend towards decreasing peat swamp forest area,
increasing land clearance, and increasing areas of community oil-palm
and rubber plantations. Fire was used for land clearing, and most
of the forest fires occurred in the most populous area (the transition
area). The study found a relationship between the deforested/
degraded areas, and certain distance variables, i.e. distance from
roads, villages and the borders between the core area and the buffer
zone. The further the distance from the core area of the reserve, the
higher the degree of deforestation and forest degradation. Research findings suggested that agricultural expansion may be
the direct cause of deforestation and forest degradation in the reserve,
whereas socio-economic factors were the underlying drivers of forest-cover
changes; these factors comprise a combination of socio-cultural,
infrastructural, technological, institutional (policy and governance), demographic (population pressure) and economic
(market demand) considerations. These findings indicated that local
factors/problems were the critical causes of deforestation and
degradation in GSKBB BR. This research therefore concluded that
reductions in deforestation and forest degradation in GSKBB BR
could be achieved through ‘local actor’-tailored approaches such as
community empowerment.
Abstract: This report presents an alternative technique for
applying contrast agent in vivo, i.e. before sampling. With this
new method, electron micrographs of tissue sections show
acceptable contrast compared to other methods and present no
precipitation artifacts on the sections. Another advantage is that only
a small amount of contrast agent is needed to obtain a good result,
which matters given that most contrast agents are expensive and
extremely toxic.
Abstract: Localization of nodes is one of the key issues in
Wireless Sensor Networks (WSNs) and has gained wide attention in
recent years. Existing localization techniques can be broadly
categorized into two types: range-based and range-free. Compared
with range-based schemes, range-free schemes are more cost-effective,
because no additional ranging devices are needed. As a
result, we focus our research on range-free schemes. In this paper
we study three types of range-free localization algorithms and compare
the localization error and energy consumption of each. The Centroid
algorithm requires that a normal node have at least three neighboring
anchors, while the DV-hop algorithm does not have this requirement.
The third studied algorithm, the amorphous algorithm, is similar to
DV-hop; the idea is to calculate the hop distance between two
nodes instead of the linear distance between them. The simulation
results show that the localization accuracy of the amorphous
algorithm is higher than that of the other algorithms, while its energy
consumption does not increase significantly.
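The Centroid scheme described above admits a very small sketch; the function name and coordinate format are illustrative:

```python
def centroid_localize(anchor_positions):
    """Range-free Centroid localization: a normal node estimates its own
    position as the centroid of the (x, y) positions of the anchors it
    can hear. At least three neighboring anchors are required, as noted
    in the text."""
    if len(anchor_positions) < 3:
        raise ValueError("Centroid scheme needs at least 3 neighbor anchors")
    n = len(anchor_positions)
    x = sum(p[0] for p in anchor_positions) / n
    y = sum(p[1] for p in anchor_positions) / n
    return x, y
```

DV-hop and the amorphous algorithm avoid the three-anchor requirement by propagating hop counts from anchors and converting them to distance estimates instead of averaging anchor positions directly.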
Abstract: Efficient use of energy has grown in importance in
recent years, owing to increasing energy demand and the depletion of
natural energy sources. Most of the losses between the point where
electricity is produced and the point of consumption occur in the
energy distribution system. In this study, the losses arising in
distribution transformers and distribution power cables, which account
for most of the losses in the distribution system, are analyzed.
Transformer losses in a real distribution system are analyzed with the
CYME Power Engineering software. These losses are reported for
different voltage levels and different loading conditions.
Abstract: The development of transport systems has negative
impacts on the environment although it has beneficial effects on
society. Car-centered policy has caused many problems, such as the
spectacular growth of fuel consumption (and hence a very large increase
in urban pollution), traffic congestion in certain places and at certain
times, and an increase in the number of accidents. The exhaust emissions
from cars and weather conditions are the main factors that determine
the level of pollution in the urban atmosphere. These conditions give
rise to heat transfer and radiation between the
air and the soil surface of any town. In urban areas, these exchanges
produce heat-island effects, which correspond to an
excess air temperature in the city relative to its
surrounding space. To this end, we perform a numerical simulation
of the plume generated by car exhaust gases and show that these
gases form a screening effect above the city which causes a
heat island in the presence of wind flow. This study allows us: 1. To
understand the different mechanisms of interaction between these
phenomena. 2. To consider appropriate technical solutions to mitigate
the effects of the heat island.
Abstract: In this study, a multi-objective optimization of end
milling of Al 6061 alloy is presented to provide better
surface quality and a higher Material Removal Rate (MRR). The input
parameters considered for the analysis are spindle speed, depth of cut
and feed. The experiments were planned as per Taguchi's design of
experiments, with an L27 orthogonal array. Grey Relational Analysis
(GRA) has been used to transform the multiple quality responses
into a single response, and the weights of each performance
characteristic are determined by employing Principal Component
Analysis (PCA), so that their relative importance can be properly and
objectively described. The results reveal that the Taguchi-based G-PCA
approach can effectively identify the optimal combination of cutting parameters.
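The GRA step described above rests on the grey relational coefficient; a minimal sketch follows, using the customary distinguishing coefficient ζ = 0.5 and a weighted grade standing in for the paper's PCA-derived weighting (the function names and example weights are illustrative):

```python
def grey_relational_coefficients(ref, seq, zeta=0.5):
    """Grey relational coefficients between a normalized reference
    sequence (the ideal response, usually all ones) and a normalized
    comparison sequence; zeta is the distinguishing coefficient."""
    deltas = [abs(r - s) for r, s in zip(ref, seq)]
    d_min, d_max = min(deltas), max(deltas)
    return [(d_min + zeta * d_max) / (d + zeta * d_max) for d in deltas]

def grey_relational_grade(coeffs, weights):
    """Single response value: weighted average of the coefficients.
    In the paper the weights come from PCA; here they are inputs."""
    return sum(w * c for w, c in zip(weights, coeffs))
```

Experiments are then ranked by their grey relational grade, turning the multi-response problem into a single-response Taguchi analysis.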
Abstract: As silicon oxide in MOSFET
technology has scaled down to a few nanometers, gate Direct Tunneling
(DT) in floating-gate (FGMOSFET) devices has become a major concern for
analog designers. FGMOSFETs have been used in many low-voltage
and low-power applications; however, there is no accurate model that
accounts for DT gate leakage at the nanoscale. This paper studies and
analyzes different simulation models for FGMOSFETs using TSMC
90-nm technology. The simulation results for an FGMOSFET cascade
current mirror show the impact of DT on circuit performance in
terms of current and voltage without the need for fabrication. This
work shows the significance of using an accurate model for
FGMOSFETs in nanoscale technologies.
Abstract: File sharing in networks is generally achieved using
Peer-to-Peer (P2P) applications. Structured P2P approaches are
widely used in ad hoc networks due to their distributed and scalable
nature. Efficient mechanisms are required to handle the huge
amount of data distributed to all peers. The intrinsic characteristics of
P2P systems make content distribution easier when compared to
client–server architectures. All the nodes in a P2P network act as both
client and server; thus, distributing data takes less time than with
the client–server method. Chord is a resource-routing protocol in
which nodes and data items are organized into a one-dimensional
ring. The structured lookup algorithm of Chord is
advantageous for distributed P2P networking applications. However,
while the structured approach improves lookup performance in
high-bandwidth wired networks, it can contribute unnecessary overhead
in overlay networks, leading to degradation of network performance.
In this paper, the performance of the existing Chord protocol on a
Wireless Mesh Network (WMN) with static and dynamic nodes
is investigated.
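The ring structure described above can be sketched minimally; Chord proper uses SHA-1 consistent hashing with O(log N) finger-table lookups, while this toy version uses a small identifier space and a linear scan:

```python
import hashlib

RING_BITS = 8  # toy identifier space of 2**8 slots

def chord_id(key, m=RING_BITS):
    """Hash a node address or data key onto the m-bit identifier ring."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % (2 ** m)

def successor(node_ids, ident):
    """The node responsible for `ident` is its successor: the first node
    met moving clockwise around the ring, wrapping past zero."""
    ring = sorted(node_ids)
    for n in ring:
        if n >= ident:
            return n
    return ring[0]  # wrapped around the top of the ring
```

Storing a file means hashing its key with `chord_id` and placing it on `successor(...)`; the overhead the abstract mentions comes from maintaining this overlay (successor pointers and finger tables) on top of the wireless mesh.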
Abstract: Life cycle assessment is a technique for assessing the
environmental aspects and potential impacts associated with a
product, process, or service by compiling an inventory of relevant
energy and material inputs and environmental releases; evaluating the
potential environmental impacts associated with the identified inputs
and releases; and interpreting the results to support more informed
decisions. In this paper, a life cycle assessment of
aluminum and beech wood, two materials commonly used in Egypt
for window frames, is presented, highlighting their benefits and
weaknesses. Window frames of the two materials have been assessed
on the basis of their production, energy consumption and
environmental impacts. It has been found that the climate-change
impacts of the aluminum and beech wood windows, for a
reference window (1.2 m × 1.2 m), are 81.7 mPt and −52.5 mPt,
respectively. Among the most important results are the following: fossil
fuel consumption, potential contributions to the greenhouse effect and
quantities of solid waste tend to be lower for wood products
than for aluminum products; incineration of wood products can
cause higher acidification and eutrophication impacts than
aluminum, although thermal energy can be recovered.
Abstract: The purpose of this study is the discrimination of 28
postmenopausal women with osteoporotic femoral fractures from an
age-matched control group of 28 women using texture analysis based on
fractals. Two pre-processing approaches are applied to the radiographic
images; these techniques are compared to justify the choice of
pre-processing method. Furthermore, the values of the fractal
dimension are compared to those of the fractal signature in terms of
the classification of the two populations. In a second analysis, the
BMD measured at the proximal femur was compared to the fractal
analysis; the latter, a non-invasive technique, allowed
better discrimination. The results confirm that fractal texture
analysis of calcaneus radiographs is able to discriminate osteoporotic
patients with femoral fracture from controls. This discrimination was
more efficient than that obtained by BMD alone, and it persisted when
comparing subgroups with overlapping BMD values.
Abstract: This paper applied factor conditions from Porter’s
Diamond Model (1990) to understand the various challenges facing
the AMISA. Factor conditions highlighted in Porter’s model are
grouped into two categories, namely basic and advanced factors. Two
AMISA associations representing over 10 000 employees were
interviewed. The largest Clothing, Textiles and Leather (CTL)
apparel retail group was also interviewed, along with a government
department implementing the industrialization policy. The paper
points out that AMISA have the basic factor conditions
necessary for competitive advantage in the apparel industries.
However, advanced factor creation has proven to be a challenge for
AMISA, Higher Education Institutions (HEIs) and government. Poor
infrastructural maintenance has contributed to high manufacturing
costs and poor quick response technologies. The use of Porter’s
Factor Conditions as a tool to analyze the sector’s competitive
advantage challenges and opportunities has increased knowledge
regarding factors that limit the AMISA’s competitiveness. It is
therefore argued that other studies on Porter’s Diamond model
factors like Demand conditions, Firm strategy, structure and rivalry
and Related and supporting industries can be used to analyze the
situation of the AMISA for the purposes of improving competitive
advantage.
Abstract: The use of titanium fluoride and iron fluoride
(TiF3/FeF3) catalysts in combination with polytetrafluoroethylene
(PTFE) in plain zinc dialkyldithiophosphate (ZDDP) oil is important
for the study of engine tribocomponents and is increasingly a strategy
to improve tribofilm formation and provide low friction and
excellent wear protection in reduced-phosphorus plain ZDDP oil. The
influence of surface roughness and the concentration of
TiF3/FeF3/PTFE were investigated using bearing steel samples
dipped in lubricant solution at 100°C for two different heating time
durations. This paper addresses the effects on water drop contact
angle for different surface finishes after treating the samples with
different lubricant combinations. The calculated water drop contact
angles were analyzed using Design of Experiments (DOE) software,
and it was determined that a 0.05 μm Ra surface roughness would
provide an excellent TiF3/FeF3/PTFE coating for antiwear resistance,
as reflected in the Scanning Electron Microscopy (SEM) images and
the tribological testing under extreme-pressure conditions. Both
friction and wear performance depend greatly on the PTFE and
catalysts in plain ZDDP oil with 0.05% phosphorus and on the
surface finish of the bearing steel. The friction- and wear-reducing
effects observed in the tribological tests indicated a better
micro-lubrication effect for the 0.05 μm Ra surface roughness treated
at 100°C for 24 hours than for the 0.1 μm Ra surface
roughness with the same treatment.
Abstract: Background: Muscle Energy Techniques (MET) have
been widely used by manual therapists over the past years, but limited
research has validated their use, and there is limited evidence to
substantiate the theories used to explain their effects. Objective: To
investigate the effect of Muscle Energy Technique (MET) on anterior
pelvic tilt in patients with lumbar spondylosis. Design: Randomized
controlled trial. Subjects: Thirty patients with anterior pelvic tilt from
both sexes were involved, aged between 35 to 50 years old and they
were divided into MET and control groups with 15 patients in each.
Methods: All patients received 3 sessions/week for 4 weeks; the
study group received MET, ultrasound (US) and infrared (IR), and the
control group received US and IR only. The pelvic angle was measured
with a palpation meter, pain severity with the visual analogue scale,
and functional disability with the Oswestry disability index. Results:
Both groups showed significant improvement in all measured variables.
The MET group was significantly better than the control group in
pelvic angle, pain severity, and functional disability, with p-values of
0.001, 0.0001 and 0.0001, respectively. Conclusion and implication: The
study group achieved greater improvement in all measured variables
than the control group, which implies that application of MET in
combination with US and IR was more effective in improving the
pelvic tilt angle, pain severity and functional disability than
using electrotherapy only.
Abstract: Particle size distribution, the most important
characteristic of aerosols, is obtained through electrical
characterization techniques. The dynamics of charged nanoparticles
under the influence of an electric field in an Electrical Mobility
Spectrometer (EMS) reveal the size distribution of these particles.
The accuracy of this measurement is influenced by the flow conditions,
geometry, electric field and particle charging process, and therefore by
the transfer function (transfer matrix) of the instrument. In this work,
a wire-cylinder corona charger was designed and the combined
field-diffusion charging process of injected polydisperse aerosol
particles was numerically simulated as a prerequisite for the study of a
multichannel EMS. The result, a cloud of particles with a non-uniform
charge distribution, was introduced into the EMS. The flow pattern and
electric field in the EMS were simulated using Computational Fluid
Dynamics (CFD) to obtain particle trajectories in the device and
therefore to calculate the reported signal by each electrometer.
According to the output signals (resulting from the bombardment of
particles and the transfer of their charges as currents), we proposed a
modification to the size of the detecting rings (which are connected to
electrometers) in order to evaluate particle size distributions more
accurately. Based on the capability of the system to transfer
information about the size distribution of the injected particles,
we proposed a benchmark for assessing the optimality of the
design. This method applies the concept of Von Neumann entropy
and borrows the definition of entropy from information theory
(Shannon entropy) to measure optimality. Entropy, in the Shannon
sense, is the "average amount of information contained in
an event, sample or character extracted from a data stream".
Evaluating the responses (signals) obtained with various
configurations of detecting rings, the configuration that gave the best
predictions of the size distributions of the injected particles was the
modified configuration. It was also the one with the
maximum entropy. A reasonable consistency was also
observed between the accuracy of the predictions and the entropy
content of each configuration. In this method, the entropy is extracted
from the transfer matrix of the instrument for each configuration.
Ultimately, various clouds of particles were introduced to the
simulations and predicted size distributions were compared to the
exact size distributions.
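The Shannon entropy quoted above has the standard closed form; as a minimal illustration (in bits, for a discrete probability vector such as a normalized row of the transfer matrix):

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: the average amount of information per
    event of a discrete distribution. Zero-probability outcomes are
    skipped, since the limit of p*log2(p) as p -> 0 is 0."""
    return -sum(p * log2(p) for p in probs if p > 0)
```

A uniform distribution maximizes this quantity, which matches the abstract's use of maximum entropy as the criterion for the detecting-ring configuration that transfers the most information about the size distribution.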