Abstract: The number of healthcare mobile applications deployed through the various mobile app stores is increasing daily. Alongside this growth, the number of hacking attacks has also increased, particularly against medical mobile applications. Security vulnerabilities in medical mobile apps can arise from errors in code, incorrect logic, and poor design, among other factors, and are commonly exploited by malicious attackers to steal or modify users' information. The aim of this research is to analyze the vulnerabilities detected in mobile medical apps according to the risk factor standards defined by OWASP in 2014.
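The OWASP risk-rating methodology referenced above scores each likelihood and impact factor on a 0-9 scale, averages them, and classifies each average as LOW (below 3), MEDIUM (3 to below 6), or HIGH (6 and above); overall severity is then read from a likelihood-by-impact matrix. A minimal sketch of that classification step, with the factor sets simplified for illustration (this is not the paper's exact procedure):

```python
def level(score):
    """Map a 0-9 average onto the OWASP likelihood/impact levels."""
    if score < 3:
        return "LOW"
    if score < 6:
        return "MEDIUM"
    return "HIGH"

def rate_risk(likelihood_factors, impact_factors):
    """Average the 0-9 factor scores and classify each dimension.
    Overall severity is then read off the likelihood x impact matrix."""
    likelihood = sum(likelihood_factors) / len(likelihood_factors)
    impact = sum(impact_factors) / len(impact_factors)
    return level(likelihood), level(impact)
```

For instance, a vulnerability with high likelihood factors but low technical and business impact classifies as (HIGH, LOW), which the OWASP matrix maps to a medium overall severity.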
Abstract: Carefully scheduling the operation of pumps can result in significant energy savings. Schedules can be defined either implicitly, in terms of other elements of the network such as tank levels, or explicitly, by specifying the times during which each pump is on or off. In this study, two new explicit representations based on time-controlled triggers were analyzed, in which the maximum number of pump switches is established beforehand and the schedule may contain fewer switches than this maximum. The optimal operation of pumping stations was determined using a Jumping Particle Swarm Optimization (JPSO) algorithm to achieve the minimum energy cost. The model integrates the JPSO optimizer with the EPANET hydraulic network solver. The optimal pump operation schedule of the VanZyl water distribution system was determined using the proposed model and compared with those obtained from Genetic and Ant Colony algorithms. The results indicate that the proposed JPSO-based model is a versatile management tool for the operation of real-world water distribution systems.
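The time-controlled trigger representation can be sketched as follows: each candidate solution carries, per pump, a short sorted list of switch times (at most the preset maximum, possibly fewer), and the pump toggles its state at each trigger. A minimal Python sketch with purely illustrative tariff and power values; the paper's actual encoding, the JPSO operators, and the EPANET coupling are not reproduced here:

```python
def pump_status(switch_times, t, initially_on=False):
    """Pump state at time t: every trigger at or before t toggles the state."""
    toggles = sum(1 for s in switch_times if s <= t)
    return initially_on ^ (toggles % 2 == 1)

def energy_cost(switch_times, tariff, power_kw):
    """Cost of running one pump over a day against an hourly tariff
    (EUR/kWh); hydraulic feasibility would be checked by the solver."""
    return sum(power_kw * tariff[h]
               for h in range(len(tariff))
               if pump_status(switch_times, h))
```

A schedule with triggers at hours 0 and 4 keeps the pump on for four hours; fewer triggers than the allowed maximum simply leave the remaining state unchanged, which is the flexibility the representation is designed to allow.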
Abstract: The system of ordinary nonlinear differential equations describing the sliding velocity during impact with friction for a three-dimensional rigid multibody system is developed. No analytical solution has previously been obtained for this highly nonlinear system; hence, a power series solution is proposed. Since the validity of this solution is limited to its convergence zone, a suitable time step is chosen, and at the end of it a new series solution is constructed. For a case study, the trajectory of the sliding velocity built with the proposed method using 6 time steps coincides with a Runge-Kutta solution using 38 time steps.
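The restarted-series idea can be illustrated on a scalar analogue. Assuming a simple friction-like test equation v' = -v² (an illustrative stand-in, not the paper's multibody system), the Taylor coefficients follow from a Cauchy-product recurrence, the truncated series is evaluated at the end of the step, and a new series is started from that value:

```python
def series_step(a0, h, order=8):
    """One restarted power-series step for v' = -v**2.
    Recurrence: (k+1) * a[k+1] = -sum_i a[i] * a[k-i] (Cauchy product),
    then the truncated series is evaluated at t = h."""
    a = [a0]
    for k in range(order):
        conv = sum(a[i] * a[k - i] for i in range(k + 1))
        a.append(-conv / (k + 1))
    return sum(c * h**k for k, c in enumerate(a))

def integrate(v0, t_end, steps, order=8):
    """Chain series steps, restarting the expansion at each step end."""
    v, h = v0, t_end / steps
    for _ in range(steps):
        v = series_step(v, h, order)
    return v
```

With v(0) = 1 the exact solution is 1/(1 + t), and six order-8 series steps reproduce v(1) = 0.5 to within about 1e-6, showing how a handful of series steps can match a much finer grid of a one-step method.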
Abstract: Noise is one of the most challenging factors in medical imaging. Image denoising refers to the recovery of a digital medical image that has been corrupted by noise such as Additive White Gaussian Noise (AWGN). Digital medical images and video can be affected by several types of noise, including impulse noise, Poisson noise, and AWGN. Computed tomography (CT) images are subject to low quality due to noise. The quality of CT images depends directly on the radiation absorbed by the patient, the absorbed dose to patients (ADP): increasing the absorbed radiation enhances CT image quality. Consequently, noise reduction techniques that enhance image quality without exposing patients to excess radiation are a challenging problem in CT image processing. In this work, noise reduction in CT images was performed using two directional two-dimensional (2D) transforms, the Curvelet and the Contourlet, and the Discrete Wavelet Transform (DWT) thresholding methods BayesShrink and AdaptShrink, which were compared with each other. We also propose a new threshold in the wavelet domain that achieves not only noise reduction but also edge retention; the proposed method retains the significant modified coefficients, resulting in good visual quality. The results were evaluated using two criteria: peak signal-to-noise ratio (PSNR) and structural similarity (SSIM).
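The BayesShrink baseline compared against here is standard: the noise level is estimated from the finest diagonal wavelet subband via the median absolute deviation, the subband threshold is T = sigma_n^2 / sigma_x, and coefficients are soft-thresholded. A minimal pure-Python sketch of that baseline; the paper's proposed edge-retaining threshold is not reproduced:

```python
import statistics

def noise_sigma(diag_coeffs):
    """Robust MAD noise estimate from the finest diagonal subband."""
    return statistics.median(abs(c) for c in diag_coeffs) / 0.6745

def bayes_shrink_threshold(coeffs, sigma_n):
    """BayesShrink: T = sigma_n^2 / sigma_x, where sigma_x is the signal
    std. dev. after subtracting the noise variance from the subband."""
    var_y = sum(c * c for c in coeffs) / len(coeffs)
    sigma_x = max(var_y - sigma_n**2, 0.0) ** 0.5
    if sigma_x == 0.0:
        return max(abs(c) for c in coeffs)  # pure noise: kill everything
    return sigma_n**2 / sigma_x

def soft_threshold(w, t):
    """Shrink a coefficient toward zero by t (soft thresholding)."""
    return (1 if w > 0 else -1) * max(abs(w) - t, 0.0)
```

Soft thresholding sets small coefficients, assumed to be noise, to zero and shrinks the rest, which is why threshold selection controls the trade-off between noise removal and edge blurring that the proposed method targets.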
Abstract: One of the fundamental characteristics of Information and Communication Technology (ICT) has been the continuous release of ever-changing models of ICTs, with impacts on the academic, social, and psychological benefits of their introduction in schools. However, there is growing concern about the negative impact on students when ICTs are introduced early in schools for teaching and learning. This study aims to design a model of the child development factors affecting the early introduction of ICTs in schools, in an attempt to improve the understanding of child development and the introduction of ICTs in schools. The proposed model is based on a sound theoretical framework. It was designed following a literature review of child development theories and child development factors. The child development theoretical framework that best fitted all the child development factors was then chosen as the basis for the proposed model. The study found that Jean Piaget's cognitive development theory is the most adequate theoretical framework for modeling child development factors for ICT introduction in schools.
Abstract: This research gives introductory ideas for the cultural adaptation of B2C E-Service design in Germany. Given the intense competition in E-Service development, many companies have realized the importance of understanding the emotional and cultural characteristics of their customers. Ignoring customers' needs and requirements throughout E-Service design can lead to faults, mistakes, and gaps. The notion of E-Service usability has thus broadened: it no longer covers only the development of high-quality E-Services, but extends to customer satisfaction and to making the service feel local to its users.
Abstract: A number of Distributed Generations (DGs) are installed in a microgrid, which may produce diverse paths and directions of power flow or fault current. The overcurrent protection scheme of the traditional radial distribution system therefore no longer meets the needs of microgrid protection. Integrating Intelligent Electronic Devices (IEDs) and a Supervisory Control and Data Acquisition (SCADA) system with the IEC 61850 communication protocol, this paper proposes a Microgrid Protection Management System (MPMS) to protect the power system from faults. In the proposed method, the MPMS performs logic programming of each IED to coordinate their tripping sequence. The GOOSE message defined in IEC 61850 is used as the transmission medium among the IEDs. Moreover, to cope with the difference in microgrid fault current between grid-connected mode and islanded mode, the proposed MPMS applies the group-setting feature of the IEDs to keep the protection settings adaptive and robust. Once the microgrid topology varies, the MPMS recalculates the fault current and updates the group settings of the IEDs; if a fault then occurs, the IEDs isolate it at once. Finally, Matlab/Simulink and the Elipse Power Studio software are used to simulate and demonstrate the feasibility of the proposed method.
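The group-setting mechanism can be sketched as follows: when the topology changes, the management system recomputes the expected fault current and switches each IED to the matching setting group. The group names, pickup currents, and delays below are illustrative assumptions, not values from the paper:

```python
# Illustrative setting groups: islanded microgrids see much lower fault
# currents, so the pickup threshold must drop for relays to still trip.
SETTING_GROUPS = {
    "grid_connected": {"pickup_a": 800.0, "delay_s": 0.2},
    "islanded": {"pickup_a": 150.0, "delay_s": 0.1},
}

class IED:
    """Minimal overcurrent IED with switchable setting groups."""

    def __init__(self):
        self.group = "grid_connected"

    def apply_group(self, mode):
        # In the real scheme this update would arrive from the MPMS.
        self.group = mode

    def trips(self, current_a):
        return current_a >= SETTING_GROUPS[self.group]["pickup_a"]
```

With the grid-connected settings a 300 A fault would be ignored; after switching to the islanded group the same current correctly causes a trip, which is the adaptivity the group-setting feature provides.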
Abstract: This paper addresses minimizing the makespan of the distributed permutation flowshop scheduling problem. In this problem, there are several parallel identical factories, or flowshops, each with a series of identical machines. Each job must be allocated to one of the factories, and all operations of a job must be performed in the allocated factory. The problem has recently gained attention, and owing to its NP-hard nature, metaheuristic algorithms have been proposed to tackle it. The majority of the proposed algorithms require large computational time, which is their main drawback. In this study, a general variable neighborhood search (GVNS) algorithm is proposed, into which several time-saving schemes have been incorporated. The GVNS also uses a sophisticated method to change the shaking (perturbation) procedure depending on the progress of the incumbent solution, preventing stagnation of the search. The performance of the proposed algorithm is compared to state-of-the-art algorithms on standard benchmark instances.
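The objective being minimized can be made concrete: each factory's makespan follows the classic permutation-flowshop completion-time recurrence, and the distributed objective is the maximum over factories. A small sketch of just the objective (the GVNS itself and its time-saving schemes are not reproduced):

```python
def makespan(seq, p):
    """Permutation-flowshop makespan via the standard recurrence
    C[i][j] = max(C[i-1][j], C[i][j-1]) + p[job][machine]."""
    finish = [0.0] * len(p[0])  # completion time of the last job on each machine
    for job in seq:
        prev = 0.0
        for j in range(len(finish)):
            prev = max(prev, finish[j]) + p[job][j]
            finish[j] = prev
    return finish[-1]

def distributed_makespan(assignment, p):
    """Distributed objective: the latest-finishing factory determines it."""
    return max(makespan(seq, p) for seq in assignment if seq)
```

With processing times p = [[3, 2], [1, 4]], sequencing job 1 before job 0 in a single factory yields makespan 7 instead of 9, while splitting the jobs across two factories yields 5; both the job-to-factory assignment and the per-factory sequence matter, which is what makes the problem hard.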
Abstract: This study aims to increase understanding of the
transition of business models in servitization. The significance of
service in all business has increased dramatically during the past
decades. Service-dominant logic (SDL) describes this change in the
economy and questions the goods-dominant logic on which business
has primarily been based in the past. A business model canvas is one of the most cited and used tools for defining and developing business models. The starting point of this paper lies in the notion that the traditional business model canvas is inherently goods-oriented and best suited to product-based business. However, the basic differences
between goods and services necessitate changes in business model
representations when proceeding in servitization. Therefore, new
knowledge is needed on how the conception of business model and
the business model canvas as its representation should be altered in
servitized firms in order to better serve business developers and inter-firm co-creation. That is to say, compared to products, services are
intangible and they are co-produced between the supplier and the
customer. Value is always co-created in interaction between a
supplier and a customer, and customer experience primarily depends
on how well the interaction succeeds between the actors. The role of
service experience is even stronger in service business compared to
product business, as services are co-produced with the customer. This paper provides business model developers with a service
business model canvas, which takes into account the intangible,
interactive, and relational nature of service. The study employs a
design science approach that contributes to theory development via
design artifacts. This study utilizes qualitative data gathered in
workshops with ten companies from various industries. In particular,
key differences between goods-dominant logic (GDL) and SDL-based business models are identified when an industrial firm
proceeds in servitization. As a result of the study, an updated version of the business
model canvas is provided based on service-dominant logic. The
service business model canvas ensures a stronger customer focus and
includes aspects salient for services, such as interaction between
companies, service co-production, and customer experience. It can be
used for the analysis and development of a current service business
model of a company or for designing a new business model. It
facilitates customer-focused new service design and service
development. It aids in the identification of development needs, and
facilitates the creation of a common view of the business model.
Therefore, the service business model canvas can be regarded as a
boundary object, which facilitates the creation of a common
understanding of the business model between several actors involved.
The study contributes to the business model and service business
development disciplines by providing a managerial tool for
practitioners in service development. It also provides research insight
into how servitization challenges companies’ business models.
Abstract: This paper discusses how we optimized the physical verification flow in our IC Design Department, which handles various rule decks from multiple foundries. Our ultimate goal is to achieve a faster time to tape-out and avoid schedule delays. Physical verification runtimes and memory usage have drastically increased with the growing number of design rules, the design complexity, and the size of the chips to be verified. To manage design violations, we use a number of solutions to reduce the number of violations that physical verification engineers need to check. The most important functions in physical verification are DRC (design rule check), LVS (layout vs. schematic), and XRC (extraction). Since we tape out designs to multiple foundries, we need a flow that improves the overall turnaround time and the ease of use of the physical verification process. The demand for fast turnaround is even more critical since physical design is the last stage before sending the layout to the foundries.
Abstract: Current tools for the real-time management of sewer systems rely on two types of software: weather-forecasting software and hydraulic-simulation software. The first is an important source of imprecision and uncertainty, while the second imposes long decision time steps because of its computation time. As a result, the results obtained generally differ from those expected. The main idea of this project is to change the basic paradigm by approaching the problem from the "automatic control" side rather than the "hydrology" side. The objective is to make it possible to run a large number of simulations in a very short time (a few seconds), replacing weather forecasts by directly using real-time measured rainfall data. The aim is to reach a system in which decisions are made from reliable data and in which errors are corrected continuously. A first model of control laws was built and tested with rainfalls of different return periods; the gains in discharged volume vary from 19 to 100%. A new algorithm was then developed to optimize the calculation time and thus overcome the combinatorial problem encountered in the first approach. Finally, this new algorithm was tested with a 16-year rainfall series; the gains obtained are 40% of the total volume discharged to the natural environment and 65% in the number of discharge events.
Abstract: The present study aimed to develop a discharge-measuring device for irrigation and laboratory channels. Experiments were conducted on sharp-edged constricted flow meters having four types of width constriction, namely 2:1, 1.5:1, 1:1, and 90° in the direction of flow. The devices were made of MS sheet and installed separately in a rectangular flume, and all four were tested under free and submerged flow conditions. Eight different discharges, varying from 2 l/s to 30 l/s, were passed through each device, and in total around 500 observations of upstream and downstream depths were taken. For each discharge, the free, submerged, and critical submergence conditions were noted and plotted. Once the upstream and downstream depths of flow over any of the devices are known, the discharge can easily be calculated with the help of the curves developed for free and submerged flow conditions. The device with the 2:1 contraction is the most efficient one, as it allows the maximum critical submergence.
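In software, reading a discharge off such a calibration curve amounts to interpolating between the plotted points. A hedged sketch with made-up calibration points (not the study's measured data, which are given only as curves):

```python
from bisect import bisect_right

def discharge(depth, curve):
    """Linearly interpolate discharge from a sorted (depth, discharge)
    calibration curve, standing in for reading the developed charts."""
    depths = [d for d, _ in curve]
    i = bisect_right(depths, depth)
    if i == 0 or i == len(curve):
        raise ValueError("depth outside the calibrated range")
    (d0, q0), (d1, q1) = curve[i - 1], curve[i]
    return q0 + (q1 - q0) * (depth - d0) / (d1 - d0)
```

Separate curves would be used for free and submerged flow, selected by first checking the measured submergence ratio against the critical value.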
Abstract: Although there has been growing interest in hybrid free-space optical/radio frequency (FSO/RF) communication systems, the current literature is limited to results obtained in moderate or cold environments. In this paper, using a soft-switching approach, we investigate the effect of weather inhomogeneities on the strength of turbulence, and hence on the channel refractive index, under Qatar's harsh environment, and their influence on hybrid FSO/RF availability. In this approach, the FSO link, the RF link, both simultaneously, or neither may be active. Based on the soft-switching approach and a finite-state Markov chain (FSMC) process, we model the channel fading of the two links and derive a mathematical expression for the outage probability of the hybrid system. We then evaluate the behavior of the hybrid FSO/RF system under hazy and harsh weather. The results show that FSO/RF soft switching renders the system outage probability lower than that of each link individually. A soft-switching algorithm is being implemented on FPGAs using Raptor codes, interfaced to the two terminals of a 1 Gbps/100 Mbps FSO/RF hybrid system, the first implemented in the region. The experimental results are compared with the above simulation results.
Abstract: In this paper, we study the optical nonlinearities of silver sulfide (Ag2S) nanostructures dispersed in dimethyl sulfoxide (DMSO) under exposure to 532 nm, 15 nanosecond (ns) pulsed laser irradiation. Ultraviolet-visible (UV-Vis) absorption spectrometry, X-ray diffraction (XRD), and transmission electron microscopy (TEM) are used to characterize the obtained nanocrystal samples. The band-gap energy of the colloid is determined by analyzing the UV-Vis absorption spectra of the Ag2S nanoparticles (NPs) using the band theory of semiconductors. The Z-scan technique is used to characterize the nonlinear optical properties of the Ag2S NPs. A large enhancement of the two-photon absorption effect is observed with increasing concentration of the Ag2S nanoparticles in open-aperture Z-scan measurements in the ns laser regime. The values of the nonlinear absorption coefficients are determined based on the local nonlinear responses, including two-photon absorption. The observed aperture dependence of the Ag2S NP limiting performance indicates that nonlinear scattering plays an important role in the limiting action of the sample. The concentration dependence of the optical limiting is also investigated. Our results demonstrate that the optical limiting threshold decreases with increasing concentration of the silver sulfide NPs in DMSO.
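Open-aperture Z-scan traces for two-photon absorption are conventionally fitted with the normalized transmittance series T(z) = sum_m (-q0)^m / (m+1)^{3/2}, where q0 = beta * I0 * L_eff / (1 + z^2/z0^2). A sketch of that standard model; the parameter values below are placeholders, not the measured coefficients:

```python
def open_aperture_T(z, z0, beta, I0, L_eff, terms=20):
    """Normalized open-aperture Z-scan transmittance for two-photon
    absorption (valid for |q0| < 1): the trace dips symmetrically
    around the focus at z = 0, where the on-axis intensity peaks."""
    q0 = beta * I0 * L_eff / (1.0 + (z / z0) ** 2)
    return sum((-q0) ** m / (m + 1) ** 1.5 for m in range(terms))
```

Fitting this dip for beta at several concentrations is how the reported enhancement of the nonlinear absorption coefficient with NP concentration would be quantified; deviations of the measured trace from this pure-2PA shape are one signature of the nonlinear scattering contribution the abstract mentions.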
Abstract: In many countries, governments have been promoting the involvement of private-sector entities in long-term agreements for the development and delivery of large infrastructure projects, with a focus on overcoming the public-funding limitations of the traditional approach. The involvement of the private sector through public-private partnerships (PPP) brings in new capital investment and value for money, but also additional risks to handle. Research studies worldwide have shown that an objective, systematic, reliable, and user-oriented risk assessment process, together with an optimal risk-allocation mechanism among the different stakeholders, is crucial to successful completion. In this framework, this paper, which is the first stage of a research study, aims to identify the main risks to the delivery of PPP projects. A review of cross-country research projects and case studies was performed to map the key risks affecting PPP infrastructure delivery. The resulting mapping matrix summarizes the frequency of the factors, clustered in eleven categories: construction, design, economic, legal, market, natural, operation, political, project finance, project selection, and relationship. The results highlight the most critical risk factors and will hopefully assist project managers in directing managerial attention during the subsequent stages of risk allocation.
Abstract: This study investigated the impact of awareness of inflectional and derivational morphemic analysis on the vocabulary learning strategies of ESL secondary school students. The quasi-experimental study was conducted with 106 low-proficiency secondary school students in two experimental groups (inflectional and derivational) and one control group. The students' vocabulary acquisition was assessed through two measures, a Morphemic Analysis Test and a Vocabulary-Morphemic Test, administered as pretest and posttest before and after an intervention programme. Results of ANCOVA revealed that both experimental groups achieved significant gains on the Morphemic Analysis Test and the Vocabulary-Morphemic Test; however, the inflectional group obtained a somewhat higher score than the derivational group. The results thus indicate that low-proficiency ESL secondary school students performed better on inflectional morphemic awareness than on derivational morphemic awareness, and that awareness of inflectional morphology contributed more to vocabulary acquisition. Importantly, learning inflectional morphology can help low-proficiency ESL secondary school students develop both morphemic awareness and vocabulary gains. Theoretically, these findings show that not all morphemes are equally useful to students for their language development. Practically, they indicate that morphological instruction should be included in remediation and instructional efforts with struggling learners across all grade levels, allowing them to focus on meaning within the word before attempting the text at large for better comprehension. Methodologically, by conducting individualized intervention and assessment, this study provides fresh empirical evidence to support the existing literature on morphemic analysis awareness and vocabulary learning strategies.
A major pedagogical implication of the study is thus that the morphemic analysis awareness strategy is a definite boon for ESL secondary school students learning English vocabulary.
Abstract: The present study aimed to determine the effectiveness of metaphor therapy on depression among female students. The sample included 60 female students with depression symptoms, selected by simple sampling and randomly divided into two equal groups (experimental and control). The Beck Depression Inventory was used to measure the variables. This was an experimental study with a pretest/posttest design and a control group. Eight metaphor therapy sessions were held for the experimental group, after which a posttest was administered to both groups. Data were analyzed using multivariate analysis of covariance (MANCOVA). The results showed that metaphor therapy decreased depression in the experimental group compared to the control group.
Abstract: A multilayer passive shield composed of low-activity lead (Pb), copper (Cu), tin (Sn), and iron (Fe) was designed and manufactured for a coaxial HPGe detector placed in a surface laboratory, in order to reduce the background radiation and the radiation dose to personnel. The performance of the shield was evaluated, and efficiency curves of the detector were plotted using various standard sources at different distances. Monte Carlo simulations and a set of TLD chips were used for dose estimation at two distances, 20 and 40 cm. The results show that the shield reduced the background spectrum and the personnel dose by more than 95%.
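As a first-order check on such a shield, narrow-beam photon attenuation multiplies across the layers as I/I0 = exp(-sum(mu_i * x_i)). A minimal sketch with placeholder attenuation coefficients (the actual design was evaluated with Monte Carlo simulations and TLD measurements, which also capture buildup effects this formula ignores):

```python
import math

def transmitted_fraction(layers):
    """Narrow-beam transmission through a multilayer shield.
    layers: (mu, x) pairs with the linear attenuation coefficient mu
    in 1/cm and the layer thickness x in cm, at a fixed photon energy."""
    return math.exp(-sum(mu * x for mu, x in layers))
```

A stack whose combined mu*x exceeds 3 already transmits under 5% of the incident narrow-beam flux, consistent in order of magnitude with the reported >95% reduction.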
Abstract: This study analyzed the effect of area variables and economic variables on the length of each period of a project, in order to analyze the effect of the agreement rate on project implementation in housing renewal projects. In conclusion, as the results show, a low agreement rate may not translate into project promotion, and a high agreement rate may not translate into project delay. The policy expectation is that the lower the agreement rate, the more projects would be promoted, but that is not the actual effect. From the viewpoint of policy consistency, changing the agreement rate frequently, depending on public decisions, is not reasonable. The policy of using the agreement rate as a necessary condition for project implementation should be reconsidered.
Abstract: Anammox is a novel and promising technology that has changed the traditional concept of biological nitrogen removal. The process allows the direct oxidation of ammoniacal nitrogen under anaerobic conditions, with nitrite as the electron acceptor and without the addition of external carbon sources. The present study investigated the feasibility of an Anammox Hybrid Reactor (AHR), which combines the dual advantages of suspended and attached growth media, for the biodegradation of ammoniacal nitrogen in wastewater. The experimental unit consisted of four 5 L AHRs inoculated with a mixed seed culture containing anoxic and activated sludge (1:1). The process was established by feeding the reactors with synthetic wastewater containing NH4-N and NO2-N in the ratio 1:1 at an HRT (hydraulic retention time) of 1 day. The reactors were gradually acclimated to higher ammonium concentrations until they attained pseudo-steady-state removal at a total nitrogen concentration of 1200 mg/l. During this period, the performance of the AHR was monitored at twelve different HRTs varying from 0.25 to 3.0 d, with the nitrogen loading rate (NLR) increasing from 0.4 to 4.8 kg N/m3·d. The AHR demonstrated significantly higher nitrogen removal (95.1%) at the optimal HRT of 1 day. The filter media in the AHR contributed an additional 27.2% ammonium removal, along with a 72% reduction in the sludge washout rate. This may be attributed to the filter media acting as a mechanical sieve, reducing the sludge washout rate many-fold and enhancing the biomass retention capacity of the reactor by 25%, the key parameter for the successful operation of high-rate bioreactors. The effluent nitrate concentration, one of the bottlenecks of the anammox process, was also minimised significantly (42.3-52.3 mg/l). The process kinetics were evaluated using first-order and Grau second-order models, and the first-order substrate removal rate constant was found to be 13.0 d-1.
Model validation revealed that the Grau second-order model was more precise and predicted the effluent nitrogen concentration with the least error (1.84±10%). A new mass-balance-based mathematical model was developed to predict the N2 gas produced in the AHR. The mass balance model derived from total nitrogen showed a significantly higher correlation (R2 = 0.986) and predicted the N2 gas with the least error of precision (0.12±8.49%). SEM study of the biomass indicated the presence of a heterogeneous population of cocci and rod-shaped bacteria with average diameters varying from 1.2 to 1.5 µm. Owing to its enhanced nitrogen removal efficiency, its meagre production of effluent nitrate, and its ability to retain high biomass, the AHR proved to be a highly competitive reactor configuration for treating nitrogen-laden wastewater.
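The two kinetic models mentioned take simple closed forms for a completely mixed reactor: first-order removal gives Se = S0 / (1 + k*HRT), and the Grau second-order multicomponent model gives (S0 - Se)/S0 = HRT / (a + b*HRT). A sketch using the reported k = 13.0 d-1; the Grau constants a and b below are hypothetical, since their fitted values are not stated in the abstract:

```python
def first_order_effluent(S0, k, hrt):
    """First-order removal in a completely mixed reactor:
    (S0 - Se)/HRT = k * Se  =>  Se = S0 / (1 + k*HRT)."""
    return S0 / (1.0 + k * hrt)

def grau_effluent(S0, a, b, hrt):
    """Grau second-order model: (S0 - Se)/S0 = HRT / (a + b*HRT),
    so the removal efficiency saturates at 1/b for long HRTs."""
    return S0 * (1.0 - hrt / (a + b * hrt))
```

At S0 = 1200 mg/l and the optimal HRT of 1 d, the first-order form with k = 13.0 d-1 predicts roughly 93% removal, of the same order as the 95.1% reported; model validation in the study is the comparison of such predictions against the measured effluent nitrogen.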