Abstract: Particle size distribution, one of the most important
characteristics of aerosols, is obtained through electrical
characterization techniques. The dynamics of charged nanoparticles
under the influence of an electric field in an Electrical Mobility
Spectrometer (EMS) reveal the size distribution of these particles.
The accuracy of this measurement is influenced by the flow conditions,
geometry, electric field, and particle charging process, and therefore by
the transfer function (transfer matrix) of the instrument. In this work,
a wire-cylinder corona charger was designed and the combined field-diffusion
charging process of injected polydisperse aerosol particles
was numerically simulated as a prerequisite for the study of a
multichannel EMS. The result, a cloud of particles with a non-uniform
charge distribution, was introduced to the EMS. The flow pattern and
electric field in the EMS were simulated using Computational Fluid
Dynamics (CFD) to obtain particle trajectories in the device and
thus to calculate the signal reported by each electrometer.
According to the output signals (resulting from the bombardment of
particles, which transfer their charges as currents), we proposed a
modification to the size of detecting rings (which are connected to
electrometers) in order to evaluate particle size distributions more
accurately. Based on the capability of the system to transfer
information content about the size distribution of the injected particles,
we proposed a benchmark for assessing the optimality of the
design. This method applies the concept of Von Neumann entropy
and borrows the definition of entropy from information theory
(Shannon entropy) to measure optimality. Entropy, in Shannon's sense,
is the "average amount of information contained in
an event, sample or character extracted from a data stream".
Evaluating the responses (signals) obtained with various
configurations of detecting rings, the configuration that gave
the best predictions of the size distributions of the injected particles
was the modified one. It was also the configuration with the
maximum amount of entropy. A reasonable consistency was also
observed between the accuracy of the predictions and the entropy
content of each configuration. In this method, entropy is extracted
from the transfer matrix of the instrument for each configuration.
Ultimately, various clouds of particles were introduced to the
simulations and predicted size distributions were compared to the
exact size distributions.
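The entropy benchmark described above can be sketched in a few lines. The code below is an illustrative sketch, not the authors' implementation: it treats each row of a small, invented transfer matrix as the distribution of one particle-size class over the electrometer channels and reports the average Shannon entropy, so that configurations can be compared.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy (bits) of a discrete probability distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                      # 0*log(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))

def transfer_matrix_entropy(T):
    """Average entropy of the normalized rows of a transfer matrix.

    Each row is treated as the distribution of one input size class
    over the electrometer channels (a simplifying assumption for
    illustration only).
    """
    T = np.asarray(T, dtype=float)
    rows = T / T.sum(axis=1, keepdims=True)   # normalize rows to distributions
    return float(np.mean([shannon_entropy(r) for r in rows]))

# Two hypothetical 3-channel configurations: one concentrates each size
# class in a single channel, the other spreads it uniformly.
T_peaked  = np.eye(3)
T_uniform = np.ones((3, 3))

print(transfer_matrix_entropy(T_peaked))   # 0.0 bits
print(transfer_matrix_entropy(T_uniform))  # log2(3) ≈ 1.585 bits
```

The matrices here are toy examples; in the paper the transfer matrix comes from the CFD particle-trajectory simulations for each ring configuration.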
Abstract: This paper presents a thermal annealing de-wetting
technique for the preparation of porous metal membranes for Thin
Film Encapsulation (TFE) applications. Thermal annealing de-wetting
experimental results reveal that pore formation in a porous metal
membrane depends upon: (1) the substrate on which the metal is
deposited, (2) the melting point of the metal used for the porous
cap-layer membrane, (3) the thickness of the metal used for the cap layer,
and (4) the temperature used for formation of the porous metal membrane. In order
to demonstrate this technique, silver (Ag) was used as the metal for
preparation of the porous metal membrane on amorphous silicon (a-Si)
and silicon oxide. The annealing of silver thin films of various
thicknesses was performed at different temperatures. Pores in the porous
silver film were analyzed using Scanning Electron Microscope
(SEM). In order to check the usefulness of porous metal film for TFE
application, the porous silver film prepared on amorphous silicon (a-
Si) and silicon oxide was released using XeF2 and VHF, respectively.
Finally, guidelines and structures are suggested for using this porous
membrane in robust TFE applications.
Abstract: The purpose of this project is to propose a quick and
environmentally friendly alternative for measuring the quality of oils
used in the food industry. There is evidence that the repeated and
indiscriminate use of oils in food processing causes physicochemical
changes, with the formation of potentially toxic compounds that can
affect the health of consumers and cause organoleptic changes. In
order to assess the quality of oils, non-destructive optical techniques
such as Interferometry offer a rapid alternative to the use of reagents,
using only the interaction of light on the oil. Through this project, we
used interferograms of samples of oil placed under different heating
conditions to establish the changes in their quality. These
interferograms were obtained by means of a Mach-Zehnder
Interferometer using a beam of light from a HeNe laser of 10mW at
632.8nm. Each interferogram was captured and analyzed, and its
full width at half-maximum (FWHM) was measured using the AMCap and
ImageJ software. The FWHM measurements were organized into three
groups. It was observed that the average obtained from each of the
FWHMs of group A shows a behavior that is almost linear, therefore
it is probable that the exposure time is not relevant when the oil is
kept under constant temperature. Group B exhibits a slightly
exponential behavior as the temperature rises between 373 K and 393
K. Results of Student's t-test show, at the 95% confidence level
(α = 0.05), the existence of variation in the molecular composition of both samples.
Furthermore, we found a correlation between the Iodine Indexes
(Physicochemical Analysis) and the Interferograms (Optical
Analysis) of group C. Based on these results, this project highlights
the importance of the quality of the oils used in the food industry and
shows how Interferometry can be a useful tool for this purpose.
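A FWHM measurement like the one performed on the interferograms can be sketched numerically. The function below is an illustrative sketch, not the ImageJ procedure: it assumes a single-peaked intensity profile whose baseline stays below half maximum away from the array edges, and interpolates the two half-maximum crossings.

```python
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a single-peaked profile y(x).

    Assumes the profile exceeds half maximum only away from the
    array edges, so both crossings can be interpolated.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i, j = above[0], above[-1]
    # linear interpolation at the left and right half-maximum crossings
    left  = np.interp(half, [y[i - 1], y[i]], [x[i - 1], x[i]])
    right = np.interp(half, [y[j + 1], y[j]], [x[j + 1], x[j]])
    return right - left

# Gaussian test profile: FWHM should be 2*sqrt(2*ln 2)*sigma ≈ 2.3548*sigma
x = np.linspace(-5, 5, 1001)
sigma = 1.0
profile = np.exp(-x**2 / (2 * sigma**2))
print(fwhm(x, profile))   # ≈ 2.3548
```

A real fringe profile would first be extracted from the captured interferogram image, for example by averaging intensity along a line perpendicular to the fringes.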
Abstract: In this paper, we provide a literature survey of the
artificial stock market (ASM) problem. The paper begins by exploring the
complexity of the stock market and the need for ASMs. An ASM
aims to investigate the link between individual behaviors (micro
level) and financial market dynamics (macro level). The variety of
patterns at the macro level is a function of the ASM complexity. The
financial market system is a complex system where the relationship
between the micro and macro level cannot be captured analytically.
Computational approaches, such as simulation, are expected to
comprehend this connection. Agent-based simulation is a simulation
technique commonly used to build ASMs. The paper proceeds by
discussing the components of the ASM. We consider the roles
of behavioral finance (BF) alongside the traditionally risk-averse
assumption in the construction of agents' attributes. Also, the
influence of social networks on the development of agent interactions is
addressed. Network topologies such as small-world, distance-based,
and scale-free networks may be utilized to outline economic
collaborations. In addition, the primary methods for developing
agents' learning and adaptive abilities are summarized.
These include approaches such as Genetic Algorithms, Genetic
Programming, Artificial Neural Networks, and Reinforcement Learning.
In addition, the most common statistical properties (the stylized facts)
of stock that are used for calibration and validation of ASM are
discussed. We have also reviewed the major related previous
studies and categorized the approaches utilized in them. Finally,
research directions and potential research questions
are discussed. The research directions of ASM may focus on the macro
level by analyzing the market dynamic or on the micro level by
investigating the wealth distributions of the agents.
Abstract: Voting algorithms are extensively used to make
decisions in fault tolerant systems where each redundant module
gives inconsistent outputs. Popular voting algorithms include
majority voting, weighted voting, and inexact majority voters. Each
of these techniques suffers from scenarios where agreements do not
exist for the given voter inputs. This has been successfully overcome
in the literature using fuzzy theory. Our previous work concentrated on a
neuro-fuzzy algorithm in which training via the neural component
substantially improved the prediction result of the voting system;
however, the weight training of the neural network was sub-optimal.
This study proposes to optimize the weights of the neural network
using the Artificial Bee Colony algorithm. Experimental results show that the
proposed system improves the decision making of the voting
algorithms.
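The no-agreement scenario that motivates fuzzy and neuro-fuzzy voters can be illustrated with a plain inexact weighted voter. The sketch below is an invented minimal example, not the proposed neuro-fuzzy/ABC system: outputs within a tolerance are treated as agreeing, and the voter fails (returns None) when no agreeing group holds a strict weight majority.

```python
def inexact_weighted_vote(values, weights, tol=0.01):
    """Inexact weighted majority voter for redundant module outputs.

    Values within `tol` of each other are treated as agreeing; the
    voter returns the weighted mean of the heaviest agreeing group,
    or None when no group holds a strict weight majority.
    """
    n = len(values)
    best_group, best_weight = None, 0.0
    for i in range(n):
        group = [j for j in range(n) if abs(values[j] - values[i]) <= tol]
        w = sum(weights[j] for j in group)
        if w > best_weight:
            best_group, best_weight = group, w
    if best_weight <= sum(weights) / 2:
        return None   # no agreement: the case fuzzy/neuro-fuzzy voters address
    num = sum(weights[j] * values[j] for j in best_group)
    return num / best_weight

print(inexact_weighted_vote([1.0, 1.005, 5.0], [1.0, 1.0, 1.0]))  # 1.0025
print(inexact_weighted_vote([1.0, 2.0, 3.0], [1.0, 1.0, 1.0]))    # None
```

In the proposed system the fixed weights of such a voter are replaced by neural network weights tuned with the Artificial Bee Colony algorithm.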
Abstract: The quantitative study of cell mechanics is of
paramount interest, since it regulates the behaviour of the living cells
in response to the myriad of extracellular and intracellular
mechanical stimuli. The novel experimental techniques together with
robust computational approaches have given rise to new theories and
models, which describe cell mechanics as a combination of
biomechanical and biochemical processes. This review paper
encapsulates the existing continuum-based computational approaches
that have been developed for interpreting the mechanical responses of
living cells under different loading and boundary conditions. The
salient features and drawbacks of each model are discussed from both
structural and biological points of view. This discussion can
contribute to the development of even more precise and realistic
computational models of cell mechanics based on continuum
approaches or on their combination with microstructural approaches,
which in turn may provide a better understanding of
mechanotransduction in living cells.
Abstract: The purpose of the paper is to address the strategic
risk issues surrounding Hindi film distribution in Mumbai for a film
distributor, who acts as an entrepreneur when launching a product
(movie) in the market (film territory). The paper undertakes a
fundamental review of films and risk in the Hindi film industry and
applies Grounded Theory technique to understand the complex
phenomena of risk taking behavior of the film distributors (both
independent and studios) in Mumbai. Rich in-depth interviews with
distributors are coded to develop core categories through constant
comparison leading to conceptualization of the phenomena of
interest. This paper is a first-of-its-kind attempt to understand the risk
behavior of a distributor, which is akin to entrepreneurial risk
behavior under conditions of uncertainty.
Abstract: Superabsorbent polymers have received much attention and
are used in many fields because of their characteristics, which are
superior to those of traditional absorbents such as sponge and cotton.
It is therefore important, but challenging, to prepare highly and fast-swelling
superabsorbents. A reliable, efficient and low-cost technique for
removing heavy metal ions from wastewater is the adsorption using
bio-adsorbents obtained from biological materials, such as
polysaccharides-based hydrogels superabsorbents. In this study, novel multi-functional superabsorbent composites
type semi-interpenetrating polymer networks (Semi-IPNs) were
prepared via graft polymerization of acrylamide onto chitosan
backbone in presence of gelatin, CTS-g-PAAm/Ge, using potassium
persulfate and N,N’-methylene bisacrylamide as initiator and
crosslinker, respectively. These hydrogels were also partially
hydrolyzed to achieve superabsorbents with ampholytic properties
and maximal swelling capacity. The formation of the grafted
network was evidenced by Fourier Transform Infrared Spectroscopy
(ATR-FTIR) and Thermogravimetric Analysis (TGA). The porous
structures were observed by Scanning Electron Microscope (SEM).
From the TGA, it was concluded that the incorporation of Ge
in the CTS-g-PAAm network only marginally affected its thermal
stability. The effect of gelatin content on the swelling capacities of
these superabsorbent composites was examined in various media
(distilled water, saline and pH-solutions). The water absorbency was
enhanced by adding Ge in the network, where the optimum value was
reached at 2 wt.% of Ge. Their hydrolysis not only greatly
enhanced their absorption capacity but also improved the swelling
kinetics. These materials also showed re-swelling ability. We
believe that these super-absorbing materials would be very effective
for the adsorption of harmful metal ions from wastewater.
Abstract: Lead time is a critical measure of a supply chain's
performance. It impacts both customer satisfaction and the
total cost of inventory. This paper presents the results of a study on the
analysis of the customer order lead-time for a multinational company.
In the study, the lead time was divided into three stages:
order entry, order fulfillment, and order delivery. A sample of
2,425 order lines was extracted from the
company's records for this study. The sample data entails
information regarding customer orders from the time of order entry
until order delivery. Data regarding the lead time of each stage for
different orders were also provided. Summary statistics on the lead time
data reveal that about 30% of the orders were delivered later than the
scheduled due date. The result of the multiple linear regression
analysis technique revealed that component type, logistics parameter,
order size and the customer type have significant impacts on lead
time. Data analysis on the stages of lead time indicates that stage 2
consumed over 50% of the lead time. A Pareto analysis was performed to
study the reasons for the customer order delay in each stage.
Recommendations were given to resolve these problems.
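The multiple linear regression step can be sketched on synthetic data. Everything below is invented for illustration (the factor values, coefficients, and noise level are not from the company's records); only the idea of regressing lead time on factors such as order size and customer type comes from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the order records: lead time (days) driven by
# order size and a binary customer-type flag, plus noise.
n = 200
order_size    = rng.uniform(1, 100, n)
customer_type = rng.integers(0, 2, n)
lead_time = 5.0 + 0.08 * order_size + 3.0 * customer_type + rng.normal(0, 1, n)

# Design matrix with an intercept column; ordinary least squares fit.
X = np.column_stack([np.ones(n), order_size.astype(float), customer_type.astype(float)])
beta, *_ = np.linalg.lstsq(X, lead_time, rcond=None)
print(beta)   # ≈ [5.0, 0.08, 3.0]: intercept, size effect, customer-type effect
```

In practice one would also inspect p-values (e.g. via statsmodels) to judge which factors have significant impacts, as the study does for component type, logistics parameter, order size, and customer type.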
Abstract: Accurate forecasting of fresh produce demand is one of
the challenges faced by Small and Medium Enterprise (SME)
wholesalers. This paper is an attempt to understand the causes of the
high level of variability (due to factors such as weather, holidays,
etc.) in the demand seen by SME wholesalers; understanding the
significance of unidentified factors may improve forecasting accuracy. This
paper presents the current literature on the factors used to predict
demand and the existing forecasting techniques of short shelf life
products. It then investigates a variety of possible internal and
external factors, some of which have not been used by other researchers in the
demand prediction process. The results presented in this paper are
further analysed using a number of techniques to minimize noise in
the data. For the analysis, past sales data (January 2009 to May 2014)
from a UK-based SME wholesaler are used, and the results presented
are limited to the product 'Milk', focused on cafés in Derby.
Correlation analysis is performed to check the dependency of each
variability factor on the actual demand. PCA is then performed to
understand the significance of the factors identified using correlation.
The PCA results suggest that the cloud cover, weather summary and
temperature are the most significant factors that can be used in
forecasting the demand. The correlation of these three factors
increases at the monthly level and becomes more stable compared
with the weekly and daily demand.
Abstract: High Voltage Direct Current (HVDC) power
transmission is employed to move large amounts of electric power.
There are several possibilities to enhance the transient stability in a
power system. One adequate option is to exploit the high
controllability of HVDC, if an HVDC link is available in the system. This
paper presents a control technique for HVDC to enhance the transient
stability. The strategy controls the power flow through the HVDC link
to help keep the system transiently stable during disturbances. Loss of
synchronism is prevented by quickly producing sufficient
decelerating energy to counteract the accelerating energy gained during the disturbance.
In this study, the power flow in the HVDC link is modulated with the
addition of an auxiliary signal to the current reference of the rectifier
firing angle controller. This modulation control signal is derived from
speed deviation signal of the generator using a PD controller; a PD
controller is suitable because of its fast response. The effectiveness
of the proposed controller is demonstrated on a single-machine
infinite-bus (SMIB) test system.
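The derivation of the auxiliary signal from the speed deviation can be sketched with a discrete PD law. The function and gain values below are placeholders for illustration, not the tuned controller of the paper.

```python
def pd_modulation(speed_dev, dt, kp=1.0, kd=0.1):
    """Auxiliary current-reference signal from the generator speed
    deviation, using a discrete PD law:
        u[k] = Kp*e[k] + Kd*(e[k] - e[k-1])/dt
    The gains kp and kd here are illustrative, not tuned values."""
    u, prev = [], speed_dev[0]      # avoid a derivative kick on the first sample
    for e in speed_dev:
        u.append(kp * e + kd * (e - prev) / dt)
        prev = e
    return u

# A step in speed deviation: the derivative term adds a fast initial
# boost that decays to the proportional term alone.
signal = pd_modulation([0.0, 0.1, 0.1, 0.1], dt=0.01, kp=2.0, kd=0.05)
print(signal)   # ≈ [0.0, 0.7, 0.2, 0.2]
```

This signal would be added to the current reference of the rectifier firing-angle controller to modulate the DC power flow during the disturbance.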
Abstract: Optic disk segmentation plays a key role in the mass
screening of individuals with diabetic retinopathy and glaucoma
ailments. An efficient hardware-based algorithm for optic disk
localization and segmentation would aid in developing an automated
retinal image analysis system for real-time applications. Herein,
a pixel-intensity-based fractal analysis algorithm, implemented on a
TMS320C6416DSK DSP board, for automatic localization and segmentation
of the optic disk is reported. The experiment has been performed on color and
fluorescent angiography retinal fundus images. Initially, the images
were pre-processed to reduce the noise and enhance the quality. The
retinal vascular tree of the image was then extracted using the Canny
edge detection technique. Finally, a pixel-intensity-based fractal
analysis is performed to segment the optic disk by tracing the origin
of the vascular tree. The proposed method is examined on three
publicly available data sets of the retinal image and also with the data
set obtained from an eye clinic. The average accuracy achieved is
96.2%. To the best of our knowledge, this is the first work reporting
the use of the TMS320C6416DSK DSP board and a pixel-intensity-based
fractal analysis algorithm for automatic localization and
segmentation of the optic disk. This will pave the way for developing
devices for detection of retinal diseases in the future.
Abstract: Despite the advances made in various new
technologies, application of these technologies for agriculture still
remains a formidable task, as it involves the integration of diverse
domains for monitoring the different processes involved in agricultural
management. Ambient intelligence is one of the most powerful
technologies for increasing the yield of agricultural crops, for
mitigating the impact of water scarcity and climate change, and for
supporting methods of managing pests, weeds and diseases.
This paper proposes a GPS-assisted, machine-to-machine solution
that combines information collected by multiple sensors for the
automated management of paddy crops. To maintain the economic
viability of paddy cultivation, the various techniques used in
agriculture are discussed, and a novel system based on ambient
intelligence techniques is proposed in this paper. The ambient-intelligence-based
agricultural system offers considerable scope for improving paddy crop management.
Abstract: Background subtraction and temporal difference are
often used for moving object detection in video. Both approaches are
computationally simple and easy to deploy in real-time image
processing. However, while background subtraction is highly
sensitive to dynamic background and illumination changes, the
temporal difference approach is poor at extracting relevant pixels of
the moving object and at detecting stopped or slowly moving
objects in the scene. In this paper, we propose a simple moving object
detection scheme based on adaptive background subtraction and
temporal difference exploiting dynamic background updates. The
proposed technique consists of histogram equalization, a linear
combination of background and temporal difference, followed by the
novel frame-based and pixel-based background updating techniques.
Finally, morphological operations are applied to the output images.
Experimental results show that the proposed algorithm overcomes the
drawbacks of both the background subtraction and temporal difference
methods and provides better performance than either method alone.
Abstract: In this paper a novel color image compression
technique for efficient storage and delivery of data is proposed. The
proposed compression technique starts with an RGB-to-YCbCr color
transformation. Secondly, the Canny edge detection method is
used to classify the blocks into edge and non-edge blocks. Each
color component (Y, Cb, and Cr) is compressed by a discrete cosine
transform (DCT), quantized, and then coded using adaptive arithmetic
coding. Our technique targets the compression ratio, bits per pixel,
and peak signal-to-noise ratio, and produces better results than JPEG
and more recently published schemes (such as CBDCT-CABS and MHC).
The experimental results illustrate that the proposed technique is
efficient and feasible in terms of compression ratio, bits per pixel,
and peak signal-to-noise ratio.
Abstract: The rapid growth of multimedia technology demands
the secure and efficient access to information. This rapid growth has
eroded confidence owing to unauthorized duplication; hence, the protection
of multimedia content is becoming more important. Watermarking
solves the issue of unlawful copying of digital data. In this paper,
a blind video watermarking technique is proposed. The luminance
layer of each selected frame is interlaced into two shares of even and
odd rows; it is then deinterlaced, and the coefficients of the two
shares are equalized. The color watermark is split into blocks, and the
pieces of each block are concealed in one of the shares under the wavelet
transform. The two shares are then stacked into a single image by
interlacing their even and odd rows. Finally, the chrominance
bands are concatenated with the watermarked luminance band. The
safeguard level of the secret information is high, and the watermark is
imperceptible. Results show that the quality of the video is not
degraded, and the method also yields better PSNR values.
Abstract: Neural activity in the human brain starts from the
early stages of prenatal development. This activity or signals
generated by the brain are electrical in nature and represent not only
the brain function but also the status of the whole body. At the
present moment, three methods can record functional and
physiological changes within the brain with high temporal resolution
of neuronal interactions at the network level: the
electroencephalogram (EEG), the magnetoencephalogram (MEG),
and functional magnetic resonance imaging (fMRI); each of these has
advantages and shortcomings. EEG recording with a large number of
electrodes is now feasible in clinical practice. Multichannel EEG
recorded from the scalp surface provides very valuable but indirect
information about the source distribution. However, deep electrode
measurements yield more reliable information about the source
locations. Intracranial recordings and scalp EEG are used with
source imaging techniques to determine the locations and strengths of
the epileptic activity. As a source localization method, Low
Resolution Electro-Magnetic Tomography (LORETA) is solved for
the realistic geometry based on both forward methods, the Boundary
Element Method (BEM) and the Finite Difference Method (FDM). In
this paper, we review the findings of EEG-LORETA studies of epilepsy.
Abstract: Since 1920, industry has almost completely replaced
riveting with welding for the production of permanent joints in
structures and in the manufacture of other products. Arc welding is
the welding process most widely used in industry. It is accomplished
by the heat of an electric arc, which
melts the base metal while the molten metal droplets are transferred
through the arc to the welding pool, protected from the atmosphere
by a gas curtain. The GMAW (gas metal arc welding) process is
influenced by variables such as current, polarity, welding speed,
the electrode (its extension, position, and moving direction), the type
of joint, and the welder's ability, among others. Knowledge and control
of these variables are essential for obtaining welds of satisfactory
quality, since the variables are interconnected: changes in one
of them may require changes in one or more of the others to produce the
desired results. The optimum values are affected by the type of base
metal, the electrode composition, the welding position and the quality
requirements. Thus, this paper proposes a new methodology that adds
vibration as a variable, through a mechanism developed for GMAW
welding, in order to improve the mechanical and metallurgical
properties; the method does not depend on the ability of the welder and
enables repeatability of the welds. For confirmation, metallographic
analysis and mechanical tests were performed.
Abstract: In this paper, an improved-performance scheme for
joint transmission (JT) is proposed for the downlink (DL) of
coordinated multi-point (CoMP) transmission under constrained
transmission power. In this scheme, a serving transmission point (TP)
requests JT from an inter-TP, which selects a precoding technique
according to the channel state information (CSI) reported by the user
equipment (UE). The simulation results show that the bit error rate
(BER) and throughput performances of the proposed scheme provide high
spectral efficiency and reliable data delivery at the cell edge.
Abstract: Cemented carbides, owing to their excellent
mechanical properties, have been of immense interest in the field of
hard materials for the past few decades. A number of processing
techniques have been developed to obtain high quality carbide tools,
with a wide range of grain size depending on the application and
requirements. Microwave sintering is a heating process that has been
used to prepare a wide range of materials, including
ceramics. A deep understanding of microwave sintering and its
contribution towards control of grain growth and on deformation of
the resulting carbide materials requires further studies and attention.
In addition, the effect of binder materials and their behavior during
microwave sintering is another area that requires clear understanding.
This review aims to focus on microwave sintering, providing
information of how the process works and what type of materials it is
best suited for. In addition, a closer look at some microwave sintered
Tungsten Carbide-Cobalt samples will be taken and discussed,
highlighting some of the key issues and challenges faced in this
research area.