Abstract: The purpose of this study was to evaluate the efficacy of a low-cost filter against per- and polyfluoroalkyl substances (PFAS). PFAS are a widely used class of man-made chemicals found in a variety of household and industrial products, with deleterious effects on humans. The filter consists of a combination of low-cost materials that could be locally procured. Water testing was performed for four PFAS contaminants; in every case the initial concentration was 15 ppt and the final concentration was 3.9 ppt. The Agency for Toxic Substances and Disease Registry (ATSDR) guideline values are 7 ppt for perfluorooctane sulfonic acid (PFOS), 10.5 ppt for perfluorononanoic acid (PFNA), 11 ppt for perfluorooctanoic acid (PFOA), and 70 ppt for perfluorohexane sulfonic acid (PFHxS). The results thus indicate a 74% reduction in PFAS concentration in the filtered samples. Regression analysis showed a validity of 0.9 for the sample data. Initial tests suggest that the efficiency of the proposed filter could be far greater if tested at a larger scale. Further testing is highly recommended to validate the data for this innovative solution to a ubiquitous problem.
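The reported 74% figure is consistent with the concentrations above; a minimal check (the helper name is ours, not from the study):

```python
def percent_reduction(initial_ppt: float, final_ppt: float) -> float:
    """Percentage reduction between an initial and a final concentration."""
    return (initial_ppt - final_ppt) / initial_ppt * 100.0

# Each contaminant fell from 15 ppt to 3.9 ppt in the reported tests.
reduction = percent_reduction(15.0, 3.9)
print(round(reduction))  # 74
```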
Abstract: Both the daily and the long-term management of a fleet of heavy-duty vehicles and construction machinery is an extremely complicated and hard-to-solve problem. This is mainly due to the diversity of the fleet's vehicles and machinery, which concerns not only the vehicle types but also their age and efficiency, as well as the fleet volume, often of the order of hundreds or even thousands of vehicles and machines. In the present paper we present “InteligentLogger”, a holistic heavy-duty fleet management system covering a wide range of diverse fleet vehicles. It is based on hardware and software specifically designed for automated monitoring of vehicle health status and operational cost, enabling smart maintenance. InteligentLogger is characterized by high adaptability that permits it to be tailored to practically any heavy-duty vehicle or machine (of different technologies, modern or legacy, and of dissimilar uses). Contrary to conventional logistic systems, which are characterized by raised operational costs and frequent errors, InteligentLogger provides a cost-effective and reliable integrated solution for the e-management and e-maintenance of the fleet members.
The InteligentLogger system offers the following unique features that guarantee successful management of heavy-duty vehicle/machinery fleets: (a) Recording and storage of operating data of motorized construction machinery, in a reliable way and in real time, using specifically designed Internet of Things (IoT) sensor nodes that communicate through the available network infrastructures, e.g., 3G/LTE; (b) Use on any machine, regardless of its age, in a universal way; (c) Flexibility and complete customization in terms of data collection and integration with 3rd-party systems, as well as of processing and drawing conclusions; (d) Validation, error reporting and correction, and updating of the system's database; (e) Artificial intelligence (AI) software for processing information in real time, identifying out-of-normal behavior and generating alerts; (f) A MicroStrategy-based enterprise BI for modeling information and producing reports, dashboards, and alerts focusing on optimal vehicle/machinery usage, as well as on maintenance and scrapping policies; (g) A modular structure that allows low implementation cost for the basic, fully functional version, yet offers scalability without requiring a complete system upgrade.
Abstract: In this paper, a method for reconfiguring bandwidth in a circular dual-mode resonator is presented. The method concerns the optimized geometry of a structure that may be used to host the tuning elements, which are typically RF (Radio Frequency) switches. The tuning elements themselves, and their performance during tuning, are not the focus of this paper. The designed resonator is able to reconfigure its fractional bandwidth by adjusting the inter-coupling level between the degenerate modes, while at the same time improving its response by adjusting the external-coupling level and keeping the center frequency fixed. The inter-coupling level is adjusted by changing the dimensions of the perturbation element, while the external-coupling level is adjusted by changing one of the feeder dimensions. The design was arrived at via optimization. Simulation and measurement results for the designed and implemented filters are in good agreement and show clear improvements in return-loss values and in the stability of the center frequency.
Abstract: Quantum communication is an evolving technology that connects multiple quantum-enabled devices to the internet for secret communication or sensitive information exchange. In the future, the number of these compact quantum-enabled devices will increase immensely, making them an integral part of present communication systems. Therefore, the safety and security of such devices is also a major concern. To ensure that customers' sensitive information cannot be eavesdropped on or deciphered, we need strong authentication and encryption mechanisms. In this paper, we propose a mutual authentication scheme between these smart quantum devices and a server, based on the secure exchange of information through a quantum channel, which offers a better solution to symmetric-key-exchange issues. An important part of this work is to propose a secure mutual authentication protocol over the quantum channel. We show that our approach offers a robust authentication protocol; furthermore, our solution is lightweight, scalable, and cost-effective, with optimized computational processing overheads.
Abstract: Spectrum underutilization has made cognitive radio a promising technology for both current and future telecommunications. This is due to its ability to exploit the unused spectrum in bands dedicated to other wireless communication systems and thus increase their occupancy. The essential function that allows a cognitive radio device to perceive the occupancy of the spectrum is spectrum sensing. In this paper, the performance of modern adaptations of the four most widely used spectrum sensing techniques, namely energy detection (ED), cyclostationary feature detection (CSFD), matched filtering (MF) and eigenvalue-based detection (EBD), is compared. The implementation has been accomplished through the PlutoSDR hardware platform and the GNU Radio software package in very low Signal-to-Noise Ratio (SNR) conditions. The optimal detection performance of the examined methods in a realistic implementation-oriented model is found for the common relevant parameters (number of observed samples, sensing time and required probability of false alarm).
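The energy-detection test at the heart of such comparisons can be sketched as follows (the threshold and signal values are illustrative, not taken from the paper; in practice the threshold is set from the target false-alarm probability):

```python
import numpy as np

def energy_detect(samples: np.ndarray, threshold: float) -> bool:
    """Declare the channel occupied when the average sample energy
    exceeds the threshold (the classic energy-detection test)."""
    test_statistic = np.mean(np.abs(samples) ** 2)
    return test_statistic > threshold

rng = np.random.default_rng(0)
n = 4096
noise = (rng.standard_normal(n) + 1j * rng.standard_normal(n)) / np.sqrt(2)  # unit-power noise
signal = 0.5 * np.exp(2j * np.pi * 0.1 * np.arange(n))                       # weak tone, SNR = -6 dB
threshold = 1.1  # illustrative; normally derived from the false-alarm target

print(energy_detect(noise, threshold))           # noise only: idle
print(energy_detect(noise + signal, threshold))  # signal present: occupied
```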
Abstract: Cardiopulmonary signal monitoring without contact electrodes or any type of in-body sensor has several applications, such as sleep monitoring and continuous monitoring of vital signs in bedridden patients. Such a system also has applications in the vehicular environment, monitoring the driver in order to avoid accidents in case of cardiac failure. The bio-radar system proposed in this paper can measure vital signs accurately by using the Doppler effect principle, which relates the received signal properties to the change in distance between the radar antennas and the person's chest wall. Since the aim of the bio-radar is to monitor subjects in real time and over long periods, it is impossible to guarantee the patient's immobilization; hence their random motion will interfere with the acquired signals. In this paper, a mathematical model of the bio-radar is presented, as well as its simulation in MATLAB. The algorithm used for breathing-rate extraction is explained, and a method for DC offset removal based on a motion detection system is proposed. Furthermore, experimental tests were conducted to demonstrate that the unavoidable random motion can be used to estimate the DC offsets accurately and thus remove them successfully.
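The Doppler relation described above can be illustrated with a small simulation (the wavelength, chest displacement and rates below are assumed values, not the paper's setup): the baseband phase is proportional to chest-wall displacement, and the breathing rate appears as the dominant spectral component of that phase.

```python
import numpy as np

fs = 50.0                       # sampling rate, Hz (assumed)
t = np.arange(0, 60, 1 / fs)    # one minute of data
lam = 0.05                      # 5 cm wavelength (~5.8 GHz carrier, assumed)
breath_hz = 0.25                # 15 breaths per minute
d = 0.004 * np.sin(2 * np.pi * breath_hz * t)   # 4 mm chest-wall movement

# Ideal I/Q baseband with DC offsets already removed: phi(t) = 4*pi*d(t)/lam
baseband = np.exp(1j * 4 * np.pi * d / lam)
phase = np.unwrap(np.angle(baseband))

# Breathing rate = frequency of the strongest spectral component of the phase
spectrum = np.abs(np.fft.rfft(phase - phase.mean()))
freqs = np.fft.rfftfreq(len(phase), 1 / fs)
estimated_hz = freqs[np.argmax(spectrum)]
print(estimated_hz * 60)  # breaths per minute
```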
Abstract: The analysis of geographic inequality relies heavily on location-enabled statistical data and quantitative measures to present the spatial patterns of the selected phenomena and analyze their differences. To protect the privacy of individual instances and link them to administrative units, point-based datasets are spatially aggregated into area-based statistical datasets, where only the overall status at the selected levels of spatial units is used for decision making. The partition of the spatial units thus has a dominant influence on the analyzed results, a phenomenon well known as the Modifiable Areal Unit Problem (MAUP). A new spatial reference framework, the Taiwan Geographical Statistical Classification (TGSC), was recently introduced in Taiwan based on spatial partition principles that homogenize the numbers of population and households. Compared to the outcomes of the traditional township units, TGSC provides additional levels of spatial units with finer granularity for presenting spatial phenomena and enables domain experts to select the appropriate dissemination level for publishing statistical data. This paper compares the results of using TGSC and the township unit, respectively, on mortality data and examines the spatial characteristics of their outcomes. For the mortality data of Taitung County between January 1st, 2008 and December 31st, 2010, the all-cause age-standardized death rate (ASDR) ranges from 571 to 1757 per 100,000 persons, whereas the 2nd dissemination area (TGSC) shows greater variation, ranging from 0 to 2222 per 100,000. The finer granularity of the TGSC spatial units clearly provides better outcomes for identifying and evaluating geographic inequality and can be further analyzed with statistical measures from other perspectives (e.g., population, area, environment).
The management and analysis of the statistical data referring to the TGSC in this research is strongly supported by the use of Geographic Information System (GIS) technology. An integrated workflow is developed that consists of processing death certificates, geocoding street addresses, quality assurance of the geocoded results, automatic calculation of statistical measures, standardized encoding of the measures, and geo-visualization of the statistical outcomes. This paper also introduces a set of auxiliary measures from a geographic-distribution perspective to further examine the hidden spatial characteristics of the mortality data and justify the analyzed results. With a common statistical-area framework like TGSC, the preliminary results demonstrate promising potential for developing a web-based statistical service that can effectively access domain statistical data and present the analyzed outcomes in meaningful ways, avoiding erroneous decision making.
Abstract: Security is the most important issue in any communication in today's world. Steganography is the process of hiding important data inside other data, such as text, audio, video, or images. The interest in this topic lies in providing availability, confidentiality, integrity, and authenticity of data. Steganographic techniques embed hidden content in unremarkable cover media so as not to arouse the suspicion of eavesdroppers, third parties, or hackers. Many methods of compression, encryption, decryption, and embedding are used in digital image steganography. Compression introduces noise into the image; to withstand this noise, the LSB insertion technique is used. The performance of the proposed embedding system with respect to the security of the secret message and its robustness is discussed. We also demonstrate the maximum steganographic capacity and the visual distortion.
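The LSB insertion technique mentioned above can be sketched in a few lines (assuming 8-bit grayscale pixel values; each secret bit replaces the least significant bit of one pixel, changing its value by at most 1):

```python
def embed_lsb(pixels, bits):
    """Hide a list of bits in the LSBs of the pixel values."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)] + list(pixels[len(bits):])

def extract_lsb(pixels, n_bits):
    """Recover the first n_bits hidden bits."""
    return [p & 1 for p in pixels[:n_bits]]

cover = [120, 121, 122, 123, 124, 125, 126, 127]
secret = [1, 0, 1, 1]
stego = embed_lsb(cover, secret)
print(extract_lsb(stego, 4))  # the hidden bits come back unchanged
```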
Abstract: In this paper, we consider a non-identically independently distributed (non-i.i.d.) Hoyt fading single-input multiple-output (SIMO) channel, where the transmitter sends confidential information to the legitimate receiver in the presence of an eavesdropper. We formulate the probability of non-zero secrecy mutual information, the secure outage probability, and the average secrecy mutual information (SMI) for the SIMO wireless communication system. The calculation is carried out using the small limit argument approximation (SLAA) of the zeroth-order modified Bessel function of the first kind. In our proposed model, the eavesdropper observes the transmissions through another Hoyt fading channel. First, we derive the analytical expression for the probability of non-zero secrecy mutual information. Then, we find the secure outage probability to investigate the outage behavior of the proposed model. Finally, we find the average secrecy mutual information. We assume that the channel state information (CSI) is known to the legitimate receiver.
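The secrecy quantities described can be illustrated numerically. Below is a Monte Carlo sketch under assumed parameters (the Hoyt-envelope construction from two unequal-variance Gaussians and all SNR values are our assumptions; this is not the paper's closed-form SLAA derivation):

```python
import numpy as np

rng = np.random.default_rng(1)

def hoyt_power(q: float, mean_power: float, size: int) -> np.ndarray:
    """Squared Hoyt (Nakagami-q) envelope: in-phase and quadrature
    Gaussians with std-dev ratio q, scaled to the requested mean power."""
    sx2 = mean_power / (1 + q**2)
    sy2 = mean_power * q**2 / (1 + q**2)
    x = rng.normal(0, np.sqrt(sx2), size)
    y = rng.normal(0, np.sqrt(sy2), size)
    return x**2 + y**2

n = 200_000
snr_main = 10 * hoyt_power(q=0.5, mean_power=1.0, size=n)  # legitimate link, 10 dB mean SNR
snr_eve = 2 * hoyt_power(q=0.5, mean_power=1.0, size=n)    # eavesdropper, ~3 dB mean SNR

secrecy_capacity = np.log2(1 + snr_main) - np.log2(1 + snr_eve)
p_nonzero = np.mean(secrecy_capacity > 0)
p_outage = np.mean(secrecy_capacity < 0.5)  # outage at target rate 0.5 bit/s/Hz
print(p_nonzero, p_outage)
```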
Abstract: A digital reference service is a traditional library reference service provided electronically. In most cases users do not get full satisfaction from using a digital reference service, for a variety of reasons. This paper discusses the formal specification of web services applications for digital reference services (WSDRS). WSDRS is an informal model that claims to reduce the problems of digital reference services in libraries. It uses web services technology to provide an efficient digital way of satisfying users' needs in the reference section of libraries. The informal model is expressed in natural language, which can be inconsistent and ambiguous and may cause difficulties for the developers of the system. In order to solve this problem we decided to convert the informal specifications into formal specifications, which is expected to reduce the overall development time and cost. We use the Z language to develop the formal model and verify it with the Z/EVES theorem prover tool.
Abstract: The web services applications for digital reference service (WSDRS) model of LIS is an informal model that claims to reduce the problems of digital reference services in libraries. It uses web services technology to provide an efficient way of satisfying users' needs in the reference section of libraries. The formal WSDRS model consists of the Z specifications of all the informal specifications of the model. This paper discusses the formal validation of the Z specifications of the WSDRS model. The authors formally verify, and thus validate, the properties of the model using the Z/EVES theorem prover.
Abstract: This paper presents a novel algorithm for secure, reliable and flexible transmission of big data in two-hop wireless networks using a cooperative jamming scheme. Two-hop wireless networks consist of source, relay and destination nodes. Big data has to be transmitted from source to relay and from relay to destination with security deployed at the physical layer. The cooperative jamming scheme makes the transmission of big data more secure by protecting it from eavesdroppers and malicious nodes of unknown location. The novel algorithm, which ensures secure and energy-balanced transmission of big data, includes selecting the data-transmission region, segmenting the selected region, determining the probability ratio for each node type (capture, non-capture and eavesdropper) in every segment, and evaluating the probability using a binary-based evaluation. If the transmission is secure, the two-hop transmission of big data proceeds; otherwise, the attackers are thwarted by the cooperative jamming scheme and the data is then transmitted in two hops.
Abstract: Purpose: The study aimed to assess the depressant or
antidepressant effects of several Nonsteroidal Anti-Inflammatory
Drugs (NSAIDs) in mice: the selective cyclooxygenase-2 (COX-2)
inhibitor meloxicam, and the non-selective COX-1 and COX-2
inhibitors lornoxicam, sodium metamizole, and ketorolac. The
current literature data regarding such effects of these agents are
scarce.
Materials and methods: The study was carried out on NMRI mice weighing 20-35 g, kept in a standard laboratory environment. The study was approved by the Ethics Committee of the University of Medicine and Pharmacy „Carol Davila”, Bucharest. The study agents were injected intraperitoneally, 10 mL/kg body weight (bw), 1 hour before the assessment of locomotor activity by cage testing (n=10 mice/group) and 2 hours before the forced swimming tests (n=15). The study agents were dissolved in normal saline (meloxicam, sodium metamizole), ethanol 11.8% v/v in normal saline (ketorolac), or water (lornoxicam), respectively. Negative and positive control agents were also given (amitriptyline in the forced swimming test). The cage floor used in the locomotor activity assessment was divided into 20 equal 10 cm squares. The forced swimming test involved partial immersion of the mice in cylinders (15/9 cm height/diameter) filled with water (10 cm depth at 28 °C), where they were left for 6 minutes. The cage endpoint used in the locomotor activity assessment was the number of squares treaded. Four endpoints were used in the forced swimming test (immobility latency for the entire 6 minutes, and immobility, swimming, and climbing scores for the final 4 minutes of the swimming session), recorded by an observer who was blinded to the experimental design. The statistical analysis used the Levene test for variance homogeneity, ANOVA, and post-hoc analysis with the Tukey or Tamhane tests, as appropriate.
Results: No statistically significant increase or decrease in the number of squares treaded was seen in the locomotor activity assessment of any mouse group. In the forced swimming test, amitriptyline showed an antidepressant effect in each experiment at the 10 mg/kg bw dosage. Sodium metamizole was depressant at 100 mg/kg bw (increased immobility score, p=0.049, Tamhane test), but not at lower dosages (25 and 50 mg/kg bw). Ketorolac showed an antidepressant effect at the intermediate dosage of 5 mg/kg bw (increased swimming score, p=0.012, Tamhane test), but not at the dosages of 2.5 and 10 mg/kg bw.
Meloxicam and lornoxicam did not alter the forced swimming endpoints at any dosage level.
Discussion: 1) Certain NSAIDs caused changes in the forced swimming patterns without interfering with locomotion. 2) Sodium metamizole showed a depressant effect, whereas ketorolac proved antidepressant. Conclusion: NSAID-induced mood changes are not class effects of these agents and apparently are independent of the type of cyclooxygenase inhibited (COX-1 or COX-2).
Disclosure: This paper was co-financed from the European Social
Fund, through the Sectorial Operational Programme Human Resources Development 2007-2013, project number POSDRU /159
/1.5 /S /138907 "Excellence in scientific interdisciplinary research,
doctoral and postdoctoral, in the economic, social and medical fields
-EXCELIS", coordinator The Bucharest University of Economic
Studies.
Abstract: The emerging Cognitive Radio is a combination of two technologies: radio dynamics and software technology. It involves wireless systems with efficient coding and design, made artificially intelligent so that they can take decisions according to the surrounding environment and adapt themselves accordingly, so as to deliver the best QoS. This is a breakthrough away from fixed hardware and fixed utilization of the spectrum. This software-defined approach is centered on a user-defined, application-driven model, and various software methods are used for the optimization of wireless communication. This paper focuses on a spectrum allocation technique using a genetic algorithm (GA) to evolve radios represented by chromosomes. The chromosome's genes represent the adjustable parameters of a given radio; by using the GA, evolving over generations, an optimized set of parameters is obtained according to the requirements of the user and the availability of the spectrum. In our prototype the genome consists of 6 different parameters, and the best set of parameters is evolved according to the application's needs and the availability of spectrum holes, thus maintaining the best QoS for the user while simultaneously respecting licensed users' rights. Matlab is used as the analysis tool to evaluate the performance of the prototype.
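The chromosome-based optimization described can be sketched as follows (the fitness function, target values and GA settings below are invented for illustration; the paper's six gene meanings and Matlab implementation are not reproduced here). Each chromosome holds 6 adjustable radio parameters, and selection, crossover and mutation push the population toward a target configuration:

```python
import random

random.seed(42)

N_GENES = 6                                # six adjustable radio parameters
TARGET = [0.8, 0.2, 0.5, 0.9, 0.1, 0.6]   # hypothetical ideal parameter set

def fitness(chrom):
    """Higher is better: negative squared distance to the target."""
    return -sum((g - t) ** 2 for g, t in zip(chrom, TARGET))

def evolve(pop_size=40, generations=60, mutation=0.1):
    pop = [[random.random() for _ in range(N_GENES)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]             # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_GENES)     # single-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation:         # random-reset mutation
                child[random.randrange(N_GENES)] = random.random()
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print([round(g, 2) for g in best])
```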
Abstract: Cognitive Radio is an emerging technology that enables effective usage of the spectrum. Energy-detector-based sensing is the most widely used spectrum sensing strategy. Moreover, it is highly generic, as receivers do not need any knowledge of the primary users' signals, channel data, or even the type of modulation. This paper puts forth the implementation of energy detection sensing for an AM (Amplitude Modulated) signal at 710 kHz, an FM (Frequency Modulated) signal at 103.45 MHz (local station frequency), a Wi-Fi signal at 2.4 GHz and WiMAX signals at 6 GHz. The OFDM/OFDMA-based WiMAX physical layer with convolutional channel coding is implemented using the USRP N210 (Universal Software Radio Peripheral) and GNU Radio based Software Defined Radio (SDR). Test results demonstrate the increase of BER (Bit Error Rate) with channel noise, and BER performance is analyzed for different Eb/N0 (energy per bit to noise power spectral density ratio) values.
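As a point of comparison for Eb/N0 sweeps like the one described, the theoretical BER of uncoded BPSK in AWGN is a standard reference curve (textbook material, not the measured USRP results):

```python
import math

def bpsk_ber(ebn0_db: float) -> float:
    """Theoretical BER of uncoded BPSK in AWGN: 0.5*erfc(sqrt(Eb/N0))."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

# BER falls steeply as Eb/N0 grows, mirroring the measured trend with noise.
for db in (0, 2, 4, 6, 8, 10):
    print(db, bpsk_ber(db))
```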
Abstract: This paper addresses the reduction of the peak-to-average power ratio (PAPR) for OFDM in the Mobile-WiMAX physical layer (PHY) standard. In the process, the best achievable PAPR of 0 dB is found for the OFDM spectrum using a phase modulation technique which avoids nonlinear distortion. The performance of the WiMAX PHY standard is evaluated with a software defined radio (SDR) prototype in which GNU Radio and the USRP N210 are employed as the software and hardware platforms, respectively. BER performance is also shown for different coding and modulation schemes. To characterize wireless propagation in specific environments, a sliding correlator wireless channel sounding system is designed using the SDR testbed.
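The PAPR metric, and why a constant-envelope (phase-only) signal attains the 0 dB floor, can be sketched as follows (the phase-only construction here is a generic illustration of the principle, not the paper's exact phase modulation scheme):

```python
import numpy as np

rng = np.random.default_rng(7)

def papr_db(time_signal: np.ndarray) -> float:
    """PAPR: peak instantaneous power over mean power, in dB."""
    power = np.abs(time_signal) ** 2
    return 10 * np.log10(power.max() / power.mean())

n_carriers = 256
qpsk = (rng.choice([-1, 1], n_carriers) + 1j * rng.choice([-1, 1], n_carriers)) / np.sqrt(2)
ofdm_symbol = np.fft.ifft(qpsk)                             # random-data OFDM: high PAPR
constant_envelope = np.exp(1j * np.cumsum(np.angle(qpsk)))  # phase-only: |s(t)| = 1

print(papr_db(ofdm_symbol))        # typically many dB above 0
print(papr_db(constant_envelope))  # exactly 0 dB
```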
Abstract: Protein kinases participate in a myriad of cellular
processes of major biomedical interest. The in vivo substrate
specificity of these enzymes is a process determined by several
factors, and despite several years of research on the topic, is still
far from being totally understood. In the present work, we have
quantified the contributions to the kinase substrate specificity of
i) the phosphorylation sites and their surrounding residues in the
sequence and of ii) the association of kinases to adaptor or scaffold
proteins. We have used position-specific scoring matrices (PSSMs),
to represent the stretches of sequences phosphorylated by 93 families
of kinases. We have found negative correlations between the number
of sequences from which a PSSM is generated and the statistical
significance and the performance of that PSSM. Using a subset
of 22 statistically significant PSSMs, we have identified specificity
determinant residues (SDRs) for 86% of the corresponding kinase
families. Our results suggest that different SDRs can function as
positive or negative elements of substrate recognition by the different
families of kinases. Additionally, we have found that human proteins
with known function as adaptors or scaffolds (kAS) tend to interact
with a significantly large fraction of the substrates of the kinases to
which they associate. Based on this characteristic we have identified
a set of 279 potential adaptors/scaffolds (pAS) for human kinases,
which is enriched in Pfam domains and functional terms tightly
related to the proposed function. Moreover, our results show that for 74.6% of the kinase–pAS associations found, the pAS colocalize with the substrates of the kinases they are associated with. Finally, we have found evidence suggesting that the association of kinases with adaptors and scaffolds may contribute significantly to diminishing the in vivo substrate cross-specificity of protein kinases. In general, our
results indicate the relevance of several SDRs for both the positive
and negative selection of phosphorylation sites by kinase families and
also suggest that the association of kinases to pAS proteins may be
an important factor for the localization of the enzymes with their set
of substrates.
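The PSSM scoring idea underlying the analysis can be sketched with a toy matrix (the values, residues and window length below are invented for illustration; the paper's PSSMs cover 93 kinase families and real phosphosite windows):

```python
# Toy PSSM over a 3-residue window; values play the role of
# log-odds scores, log2(observed / background frequency).
pssm = [
    {"S": 2.0, "T": 1.5, "A": -1.0},   # position -1
    {"S": 0.5, "P": 2.5, "A": -0.5},   # position 0
    {"R": 1.8, "K": 1.2, "A": -1.2},   # position +1
]
DEFAULT = -2.0  # score for residues absent from the toy matrix

def score_site(window: str) -> float:
    """Sum per-position log-odds scores for a candidate phosphosite window."""
    return sum(col.get(res, DEFAULT) for col, res in zip(pssm, window))

print(score_site("SPR"))  # strong match to the toy motif
print(score_site("AAA"))  # poor match
```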
Abstract: HF communication is an attractive field for many researchers since it can reach long-distance areas at low cost. This long-distance communication is achieved by exploiting the ionosphere as a transmission medium for the HF radio wave. However, due to the dynamic nature of the ionosphere, the channel characteristics of HF communication have to be investigated in order to obtain better performance. Many techniques to characterize the HF channel are available in the literature. However, none of them describes the HF channel characteristics in low-latitude regions, especially equatorial areas. Since the ionosphere around the equatorial region exhibits the equatorial spread F (ESF) phenomenon, characterizing the wideband HF channel in the low-latitude region becomes an important investigation. On the other hand, the advent of software-defined radio has attracted the interest of many researchers. Accordingly, in this paper an SDR-based channel measurement system is proposed for characterizing the HF channel in the low-latitude region.
Abstract: Breast region segmentation is an essential prerequisite in the computerized analysis of mammograms. It aims at separating the breast tissue from the background of the mammogram and includes two independent segmentations. The first separates the background region, which usually contains annotations, labels and frames, from the whole breast region, while the second removes the pectoral muscle portion (present in Medio-Lateral Oblique (MLO) views) from the rest of the breast tissue. In this paper we propose a hybridization of Connected Component Labeling (CCL), fuzzy, and straight-line methods. Our proposed methods worked well for separating the pectoral region. After removal of the pectoral muscle from the mammogram, further processing is confined to the breast region alone. To demonstrate the validity of our segmentation algorithm, it was extensively tested using the 322 mammographic images of the Mammographic Image Analysis Society (MIAS) database. The segmentation results were evaluated using the Mean Absolute Error (MAE), Hausdorff Distance (HD), Probabilistic Rand Index (PRI), Local Consistency Error (LCE) and Tanimoto Coefficient (TC). The hybridization of the fuzzy and straight-line methods yielded adequate or better curve segmentations in more than 96% of cases. In addition, a comparison with similar approaches from the state of the art is given, obtaining slightly improved results. Experimental results demonstrate the effectiveness of the proposed approach.
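The connected-component-labeling step of the proposed hybrid can be sketched as follows (a generic 4-connectivity labeler on a binary mask; this is the textbook CCL operation, not the authors' full pipeline):

```python
from collections import deque

def label_components(mask):
    """Label 4-connected regions of a binary mask (nested lists of 0/1)."""
    h, w = len(mask), len(mask[0])
    labels = [[0] * w for _ in range(h)]
    current = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy][sx] and not labels[sy][sx]:
                current += 1                       # start a new component
                queue = deque([(sy, sx)])
                labels[sy][sx] = current
                while queue:                       # flood-fill by BFS
                    y, x = queue.popleft()
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not labels[ny][nx]:
                            labels[ny][nx] = current
                            queue.append((ny, nx))
    return labels, current

mask = [
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [0, 0, 0, 1],
]
labels, n = label_components(mask)
print(n)  # 2 separate foreground regions (e.g. breast tissue vs. a label artifact)
```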
Abstract: A greenhouse is a building that provides controlled climate conditions to plants to protect them from harsh external conditions. Greenhouse technology gives the farmer the freedom to grow any crop type at any time during the year. The quality and productivity of plants inside a greenhouse are highly dependent on the quality of management, and a good management scheme is defined by the quality of the information collected from the greenhouse environment. Therefore, continuous monitoring of environmental variables such as temperature, humidity, and soil moisture helps the grower to better understand how each factor affects growth and how to manage maximal crop productivity. In this paper, we designed and implemented a climate monitoring and irrigation control system based on Wireless Sensor Network (WSN) technology. The designed system is characterized by ease of use, easy installation by any greenhouse user, multiple sensing nodes, multi-PAN ID support, low cost, water irrigation control and low operational complexity. The system consists of two node types (sensing and control) in a star topology on one PAN ID. Moreover, the greenhouse manager can modify system parameters (such as sensing node addresses and the upper and lower irrigation control limits) by updating the corresponding data in SDRAM memory. In addition, the designed system uses a 2×16-character LCD to display the microclimate parameter values of each plant row inside the greenhouse.
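The upper and lower irrigation control limits described suggest a simple hysteresis rule; a sketch with invented threshold values (not the system's actual settings):

```python
LOWER_LIMIT = 30.0   # % soil moisture: open the valve below this (assumed value)
UPPER_LIMIT = 60.0   # % soil moisture: close the valve above this (assumed value)

def valve_command(moisture: float, valve_open: bool) -> bool:
    """Return the new valve state for the latest sensor reading
    (hysteresis between the lower and upper control limits)."""
    if moisture < LOWER_LIMIT:
        return True          # too dry: start irrigation
    if moisture > UPPER_LIMIT:
        return False         # wet enough: stop irrigation
    return valve_open        # in between: keep the current state

state = False
for reading in (45.0, 28.0, 40.0, 65.0):
    state = valve_command(reading, state)
    print(reading, state)
```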