Abstract: Traditionally, the dimensioning of storage tanks is conducted with a deterministic approach based on partial safety coefficients. These coefficients are applied to account for the uncertainties related to hazards, to the properties of the materials used, and to the applied loads. However, the use of these safety factors in the design process does not guarantee an optimal and reliable solution and can sometimes lead to a lack of robustness of the structure. Reliability theory, based on a probabilistic formulation of structural safety, offers a well-suited alternative. It allows the construction of a model in which uncertain data are represented by random variables, and therefore gives a better appreciation of safety margins through confidence indicators. The work presented in this paper consists of a mechano-reliability analysis of a concrete storage tank resting on the ground. The classical Monte Carlo simulation method is used to evaluate the failure probability of the concrete tank by treating the seismic acceleration as a random variable.
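The Monte Carlo failure-probability estimate described above can be sketched in a few lines. The limit-state function and the acceleration distribution below are hypothetical placeholders for illustration only, not the paper's actual tank model.

```python
import random

def estimate_failure_probability(limit_state, sample_accel, n=200_000, seed=0):
    """Crude Monte Carlo: P_f is the fraction of samples with g(a) <= 0."""
    rng = random.Random(seed)
    failures = sum(1 for _ in range(n) if limit_state(sample_accel(rng)) <= 0)
    return failures / n

# Hypothetical limit state: capacity of 0.5 g minus the random seismic demand.
pf = estimate_failure_probability(
    limit_state=lambda a: 0.5 - a,                 # g(a) <= 0 means failure
    sample_accel=lambda rng: rng.gauss(0.3, 0.1),  # assumed N(0.3, 0.1) PGA
)
```

With these assumed numbers the true failure probability is about 0.023 (a two-sigma exceedance), and the estimator's accuracy grows with the sample count n.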
Abstract: Nowadays, robust and secure watermarking algorithms and their optimization are the need of the hour. A watermarking algorithm is presented to achieve copyright protection for the owner based on visual cryptography, the histogram shape property, and entropy. Both the host image and the watermark are preprocessed: the host image with a Butterworth filter, and the watermark with visual cryptography. Applying visual cryptography to the watermark generates two shares. One share is used for embedding the watermark, and the other is used for resolving any dispute with the aid of a trusted authority. The use of the histogram shape makes the process more robust against geometric and signal processing attacks. The combination of visual cryptography, the Butterworth filter, the histogram, and entropy makes the algorithm more robust and imperceptible while ensuring the copyright protection of the owner.
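The (2,2) visual-cryptography step, which splits the watermark into two shares, can be sketched as follows. This is a toy 1-D version on a short bit string; the paper operates on 2-D watermark images.

```python
import random

def make_shares(bits, seed=0):
    """(2,2) visual cryptography on a binary watermark (list of 0/1 bits).
    Each secret bit expands to a pair of subpixels per share; stacking
    (OR-ing) the shares recovers black pixels fully, white ones half."""
    rng = random.Random(seed)
    share1, share2 = [], []
    for b in bits:
        p = rng.choice([(0, 1), (1, 0)])               # random subpixel pattern
        share1.append(p)
        share2.append(p if b == 0 else (p[1], p[0]))   # flip pattern for black
    return share1, share2

def stack(s1, s2):
    """Physically overlaying transparencies corresponds to a pixel-wise OR."""
    return [tuple(a | b for a, b in zip(p1, p2)) for p1, p2 in zip(s1, s2)]

s1, s2 = make_shares([0, 1, 1, 0])
recovered = stack(s1, s2)   # black bits -> (1, 1); white bits -> one dark subpixel
```

Each share on its own is a uniformly random pattern, which is why one share can be embedded while the other is held back for dispute resolution by the trusted authority.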
Abstract: The purpose of this article is to find a method
of comparing designs for ordinal regression models using
quantile dispersion graphs in the presence of linear predictor
misspecification. The true relationship between the response variable
and the corresponding control variables is usually unknown.
The experimenter assumes a certain form of the linear predictor of the
ordinal regression model, but the assumed form of the linear predictor
may not always be correct. Thus, the maximum likelihood estimates
(MLE) of the unknown parameters of the model may be biased due to
misspecification of the linear predictor. In this article, the uncertainty
in the linear predictor is represented by an unknown function. An
algorithm is provided to estimate the unknown function at the
design points where observations are available. The unknown function
is estimated at all points in the design region using multivariate
parametric kriging. The comparison of the designs is based on
a scalar-valued function of the mean squared error of prediction
(MSEP) matrix, which incorporates both variance and bias of the
prediction caused by the misspecification in the linear predictor. The
designs are compared using the quantile dispersion graphs approach.
The graphs also visually depict the robustness of the designs to
changes in the parameter values. Numerical examples are presented
to illustrate the proposed methodology.
Abstract: Connected vehicles are equipped with wireless sensors
that aid in Vehicle to Vehicle (V2V) and Vehicle to Infrastructure
(V2I) communication. In the near future, these vehicles will
improve road safety and transport efficiency and reduce traffic
congestion. One of the challenges for connected vehicles is how
to ensure that information sent across the network is secure. If
security of the network is not guaranteed, several attacks can occur,
thereby compromising the robustness, reliability, and efficiency of
the network. This paper discusses existing security mechanisms and
unique properties of connected vehicles. The methodology employed
in this work is exploratory. The paper reviews existing security
solutions for connected vehicles. More concretely, it discusses
various cryptographic mechanisms available, and suggests areas
of improvement. The study proposes a combination of symmetric
key encryption and public key cryptography to improve security.
The study further proposes message aggregation as a technique to
overcome message redundancy. This paper offers a comprehensive
overview of connected vehicles technology, its applications, its
security mechanisms, open challenges, and potential areas of future
research.
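The message-aggregation idea, collapsing redundant broadcasts of the same event, can be sketched as below. The message fields (`event`, `cell`, `ts`) are hypothetical and chosen only for illustration; the abstract does not fix a message format.

```python
def aggregate_messages(messages):
    """Naive message aggregation: collapse redundant safety broadcasts.
    Messages reporting the same event in the same location cell are merged,
    keeping the most recent timestamp and counting independent witnesses."""
    merged = {}
    for msg in messages:
        key = (msg["event"], msg["cell"])
        if key not in merged:
            merged[key] = {**msg, "witnesses": 1}
        else:
            agg = merged[key]
            agg["witnesses"] += 1               # another vehicle saw the same event
            agg["ts"] = max(agg["ts"], msg["ts"])
    return list(merged.values())

reports = [
    {"event": "hazard", "cell": (12, 7), "ts": 100},
    {"event": "hazard", "cell": (12, 7), "ts": 103},   # redundant re-broadcast
    {"event": "congestion", "cell": (13, 7), "ts": 101},
]
summary = aggregate_messages(reports)
```

The witness count is a useful side effect: a forwarding node can prioritize events corroborated by several vehicles while suppressing duplicate transmissions.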
Abstract: 3D model-based vehicle matching provides a new way
for vehicle recognition, localization and tracking. Its key is to
construct an evaluation function, also called fitness function, to
measure the degree of vehicle matching. Existing fitness functions
often perform poorly when clutter and occlusion are present in traffic
scenarios. In this paper, we present a practical and efficient fitness
function. Unlike existing evaluation functions, the proposed
fitness function studies the vehicle matching problem from
both local and global perspectives, exploiting pixel gradient
information as well as silhouette information. In view of the
discrepancy between the 3D vehicle model and the real vehicle, a
weighting strategy is introduced to treat the fitting of the model's
wireframes differently. Additionally, a normalization operation for the model's
projection is performed to improve the accuracy of the matching.
Experimental results on real traffic videos reveal that the proposed
fitness function is efficient and robust to the cluttered background
and partial occlusion.
Abstract: Enterprise growth is generally considered a key driver of competitiveness, employment, economic development, and social inclusion. As such, it is perceived by scholars and decision makers to be a highly desirable outcome of entrepreneurship. The extensive academic debate has resulted in a multitude of theoretical frameworks focused on explaining growth stages, determinants, and future prospects. It has been widely accepted that enterprise growth is most likely nonlinear, temporal, and related to a variety of factors reflecting individual, firm, organizational, industry, or environmental determinants of growth. However, the factors that affect growth are not easily captured, the instruments to measure those factors are often arbitrary, and the causality between variables and growth is elusive, indicating that growth is not easily modeled. Furthermore, in line with the heterogeneous nature of the growth phenomenon, there is a vast number of measurement constructs assessing growth which are used interchangeably. Differences among the various growth measures, at the conceptual as well as the operationalization level, can hinder theory development, which emphasizes the need for more empirically robust studies. In line with these highlights, the main purpose of this paper is threefold: firstly, to compare the structure and performance of three growth prediction models based on the main growth measures (revenue, employment, and asset growth); secondly, to explore the prospects of financial indicators, set as exact, visible, standardized, and accessible variables, to serve as determinants of enterprise growth; and finally, to contribute to the understanding of how different growth measures shape research results and recommendations for growth. The models include a range of financial indicators as lagged determinants of the enterprises' performance during 2008-2013, extracted from the national register of the financial statements of SMEs in Croatia.
The design and testing stage of the modeling used logistic regression procedures. The findings confirm that growth prediction models based on different measures of growth have different sets of predictors. Moreover, the relationship between a particular predictor and a growth measure is inconsistent: the same predictor that is positively related to one growth measure may exert a negative effect on a different growth measure. Overall, financial indicators alone can serve as a good proxy of growth and yield adequate predictive power for the models. The paper sheds light on both the methodology and the conceptual framework of enterprise growth by using a range of variables which serve as proxies for the multitude of internal and external determinants but, unlike them, are accessible, available, exact, and free of perceptual nuances in building up the model. The selection of the growth measure appears to have a significant impact on the implications and recommendations related to growth. Furthermore, the paper points out potential pitfalls of measuring and predicting growth. Overall, the results and implications of the study are relevant for advancing academic debates on growth-related methodology and can contribute to evidence-based decisions of policy makers.
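The logistic-regression modeling step can be sketched minimally as below. The two lagged financial indicators (liquidity ratio, leverage) and the toy data are hypothetical stand-ins for the paper's predictor set, not its actual variables.

```python
import math

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain stochastic-gradient logistic regression:
    P(growth) = sigmoid(w . x + b), trained on binary growth outcomes."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, xi)) + b)))
            err = p - yi                       # gradient of the log-loss
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

# Hypothetical lagged indicators per firm: [liquidity ratio, leverage]
X = [[1.8, 0.3], [2.1, 0.2], [0.9, 0.8], [0.7, 0.9]]
y = [1, 1, 0, 0]   # 1 = growth observed in the following period
w, b = fit_logistic(X, y)

def predict(x):
    return 1 / (1 + math.exp(-(sum(wj * xj for wj, xj in zip(w, x)) + b)))
```

Fitting the same procedure three times, once per outcome definition (revenue, employment, assets growth), is what lets the paper compare predictor sets across growth measures.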
Abstract: The next generation of mobile communication systems, i.e., the fourth generation (4G), was developed to provide the required quality of service and data rates. This project focuses on a multiple access technique proposed for 4G communication systems, namely IDMA (Interleave Division Multiple Access). The basic principle of IDMA is that the interleaver is different for each user, whereas CDMA employs different signature sequences. IDMA inherits many advantages of CDMA, such as robustness against fading, easy cell planning, and dynamic channel sharing; in addition, IDMA increases spectral efficiency and reduces receiver complexity. In this work, the performance of IDMA is analyzed using a QC-LDPC coding scheme and compared with LDPC coding; finally, the BER is calculated and plotted in MATLAB.
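IDMA's distinguishing mechanism, a user-specific chip interleaver in place of CDMA's user-specific signature sequence, can be sketched as follows (a simplified illustration; the chip length and seed are arbitrary):

```python
import random

def make_interleaver(n, user_seed):
    """A user-specific pseudo-random permutation of chip positions.
    In IDMA this permutation, not a signature sequence, separates users."""
    rng = random.Random(user_seed)
    perm = list(range(n))
    rng.shuffle(perm)
    return perm

def interleave(bits, perm):
    out = [0] * len(bits)
    for i, p in enumerate(perm):
        out[p] = bits[i]
    return out

def deinterleave(bits, perm):
    """The receiver, knowing the user's seed, inverts the permutation."""
    return [bits[p] for p in perm]

perm = make_interleaver(8, user_seed=42)
tx = interleave([1, 0, 1, 1, 0, 0, 1, 0], perm)
rx = deinterleave(tx, perm)
```

Because each user's permutation decorrelates its chip stream from the others', the iterative receiver can separate users even though they share the same spreading code.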
Abstract: Blood cell analysis plays a significant role in the diagnosis of human health. As an alternative to the traditional technique conducted by laboratory technicians, this paper presents an automatic white blood cell (leukocyte) detection system using Image Stitching and Color Overlapping Windows. The advantage of this method is a white blood cell detection technique that is robust to imperfect blood cell shapes across various image qualities. The input for this application is images from a microscope-slide translation video. The preprocessing stage is performed by stitching the input images: first, the overlapping parts of the images are determined, then the stitching and blending of two input images are performed. Next, Color Overlapping Windows detection is applied for white blood cells, which consists of color filtering, window candidate checking, window marking, window overlap detection, and window cropping. Experimental results show that this method achieves an average detection accuracy of 82.12% on the leukocyte images.
Abstract: Fiber-Wireless (FiWi) networks are a promising candidate for future broadband access networks. These networks combine an optical network as the back end, where different passive optical network (PON) technologies are realized, and a wireless network as the front end, where different wireless technologies are adopted, e.g., LTE, WiMAX, Wi-Fi, and Wireless Mesh Networks (WMNs). The convergence of optical and wireless technologies requires designing architectures with robust, efficient, and effective bandwidth allocation schemes. Different bandwidth allocation algorithms have been proposed for FiWi networks aiming to enhance the different segments of these networks, including the wireless and optical subnetworks. In this survey, we focus on differentiating between the bandwidth allocation algorithms according to the segment of the FiWi network they enhance. We classify these techniques into wireless, optical, and hybrid bandwidth allocation techniques.
Abstract: In this paper, numerous robust fitting procedures are considered for estimating spatial variograms. In spatial statistics, the conventional variogram fitting procedure (non-linear weighted least squares) suffers from the outlier problem that has plagued this method since its inception. Even a 3-parameter model, like the variogram, can be adversely affected by a single outlier. This paper uses Hogg-type adaptive procedures to select an optimal score function for a rank-based estimator for these non-linear models. Numerical examples and simulation studies demonstrate the robustness, utility, efficiency, and validity of these estimates.
Abstract: The development of change prediction models can help software practitioners plan testing and inspection resources in the early phases of software development. However, a major challenge faced during the training process of any classification model is the imbalanced nature of software quality data. Data with very few instances of the minority outcome category lead to an inefficient learning process, and a classification model developed from imbalanced data generally does not predict these minority categories correctly. Thus, for a given dataset, a minority of classes may be change prone whereas the majority may be non-change prone. This study explores various alternatives for adeptly handling imbalanced software quality data using different sampling methods and effective MetaCost learners. The study also analyzes and justifies the use of different performance metrics when dealing with imbalanced data. In order to empirically validate the different alternatives, the study uses change data from three application packages of an open-source Android data set and evaluates the performance of six different machine learning techniques. The results of the study indicate extensive improvement in the performance of the classification models when using resampling methods and robust performance measures.
Abstract: We present a simple and robust approach for developing secure software: a four-phase methodology that consists of developing the non-secure software in phase one and, in each of the next three phases, addressing one of the secure development types (i.e., self-protected software, secure code transformation, and the secure shield). Our methodology first requires determining and understanding the level of security needed for the software. The methodology proposes the use of several teams to accomplish this task: one software engineering development team, a compiler team, a specification and requirements testing team, and, for each of the secure software development types, three secure software development teams, three code breaker teams, and three intrusion analysis teams. These teams interact with each other and make decisions to produce secure software code protected against the required level of intruder.
Abstract: This paper discusses a corner detection algorithm
for camera calibration. Calibration is a necessary step in many
computer vision and image processing applications. Robust
corner detection for an image of a checkerboard is required
to determine intrinsic and extrinsic parameters. In this paper,
an algorithm for fully automatic and robust X-corner detection
is presented. Checkerboard corner points are automatically
found in each image without user interaction or any prior
information regarding the number of rows or columns. The
approach represents each X-corner with a quadratic fitting
function. Using the fact that the X-corners are saddle points,
the coefficients in the fitting function are used to identify each
corner location. The automation of this process greatly simplifies
calibration. Our method is robust against noise and different
camera orientations. Experimental analysis shows the accuracy
of our method using actual images acquired at different camera
locations and orientations.
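The saddle-point test behind the X-corner detector can be sketched as below. For simplicity this sketch recovers the quadratic coefficients with central finite differences on a synthetic patch rather than the paper's least-squares fit over the whole neighborhood, but the decision rule is the same: a saddle requires 4ac - b^2 < 0, and the corner sits where the gradient of the fitted quadratic vanishes.

```python
def saddle_check_and_locate(patch):
    """For an intensity patch that is locally quadratic,
    z = a*x^2 + b*x*y + c*y^2 + d*x + e*y + f, recover the coefficients
    at the centre, test for a saddle, and solve grad(z) = 0."""
    r = len(patch) // 2                                  # centre index
    z = lambda i, j: patch[r + j][r + i]
    a = (z(1, 0) - 2 * z(0, 0) + z(-1, 0)) / 2           # z_xx / 2
    c = (z(0, 1) - 2 * z(0, 0) + z(0, -1)) / 2           # z_yy / 2
    b = (z(1, 1) - z(1, -1) - z(-1, 1) + z(-1, -1)) / 4  # z_xy
    d = (z(1, 0) - z(-1, 0)) / 2                         # z_x at centre
    e = (z(0, 1) - z(0, -1)) / 2                         # z_y at centre
    disc = 4 * a * c - b * b
    if disc >= 0:
        return None                                      # not a saddle: reject
    # Solve 2a*x + b*y + d = 0 and b*x + 2c*y + e = 0 for the corner.
    x0 = (b * e - 2 * c * d) / disc
    y0 = (b * d - 2 * a * e) / disc
    return x0, y0

# Synthetic saddle z = (x - 0.25)(y + 0.25): saddle point at (0.25, -0.25).
patch = [[(i - 0.25) * (j + 0.25) for i in range(-2, 3)] for j in range(-2, 3)]
corner = saddle_check_and_locate(patch)
```

Solving for the zero of the fitted gradient is what gives sub-pixel corner coordinates, which is the property the calibration step relies on.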
Abstract: In this paper, we propose a method that estimates a user's
position based on a database built from a single camera. Previous
positioning systems calculate distance from the arrival time of signals,
as in GPS (Global Positioning System) and RF (Radio Frequency). However,
these methods suffer from large error ranges caused by signal
interference. One solution is to estimate position with a camera sensor,
but a single camera makes it difficult to obtain relative position data,
and a stereo camera struggles to provide real-time position data because
of the large volume of image data. In this research, we first build an
image database, using a single camera, of the space in which the
positioning service is to be provided. Next, we judge similarity through
image matching between the database images and the image transmitted by
the user. Finally, we determine the user's position from the position of
the most similar database image. To verify the proposed method, we
conducted experiments in real environments, both indoor and outdoor. The
proposed method covers a wide positioning range and can determine not
only the user's position but also the user's direction.
Abstract: The material behavior of graphene, a single layer of
carbon lattice, is extremely sensitive to its dielectric environment. We
demonstrate improvement in electronic performance of graphene
nanowire interconnects with full encapsulation by lattice-matching,
chemically inert, 2D layered insulator hexagonal boron nitride (h-
BN). A novel layer-based transfer technique is developed to construct
the h-BN/MLG/h-BN heterostructures. The encapsulated graphene
wires are characterized and compared with those on SiO2 or h-BN
substrates without a passivating h-BN layer. Significant improvements
in maximum current-carrying density, breakdown threshold, and
power density in encapsulated graphene wires are observed. These
critical improvements are achieved without compromising the carrier
transport characteristics in graphene. Furthermore, graphene wires
exhibit electrical behavior that is less sensitive to ambient conditions, as
compared with the non-passivated ones. Overall, h-BN/graphene/h-
BN heterostructure presents a robust material platform towards the
implementation of high-speed carbon-based interconnects.
Abstract: As the greenhouse effect has been recognized as a serious environmental problem worldwide, interest in carbon dioxide (CO2) emissions, which constitute the major part of greenhouse gas (GHG) emissions, has increased recently. Since the construction industry accounts for a relatively large portion of total worldwide CO2 emissions, extensive studies on reducing CO2 emissions in the construction and operation of buildings have been carried out since the 2000s. In parallel, performance-based design (PBD) methodology based on nonlinear analysis has been actively developed after the 1994 Northridge Earthquake to assess and assure the seismic performance of buildings more exactly, because structural engineers recognized that a prescriptive code-based design approach cannot address inelastic earthquake responses directly or assure building performance exactly. Although CO2 emissions and the PBD approach are both rising issues in the construction industry and structural engineering, few if any studies have considered the two issues simultaneously. Thus, the objective of this study is to minimize the CO2 emissions and cost of a building designed by the PBD approach in the structural design stage, considering the structural materials. A four-story, four-span reinforced concrete building was optimally designed to minimize the CO2 emissions and cost of the building and to satisfy a specific seismic performance objective (collapse prevention under the maximum considered earthquake) while satisfying prescriptive code regulations, using the non-dominated sorting genetic algorithm-II (NSGA-II). The optimized design results showed that minimized CO2 emissions and cost were achieved while satisfying the specific seismic performance. Therefore, the methodology proposed in this paper can be used to reduce both the CO2 emissions and the cost of buildings designed by the PBD approach.
Abstract: Coffee is a widely consumed beverage with many components such as caffeine, flavonoids, phenolic compounds, and minerals. Coffee consumption continues to increase due to its physiological effects, pleasant taste, and aroma. Robusta and Arabica are the two basic types of coffee beans; the bean used for Turkish coffee is Arabica. Coffee contains many elements that have various effects on human health, such as sodium (Na), boron (B), magnesium (Mg), and iron (Fe). In this study, the amounts of Mg, Na, Fe, and B in Turkish coffee are determined, and the effect of sugar addition is investigated to inform conscious consumption. The contents of the coffees were determined using inductively coupled plasma optical emission spectrometry (ICP-OES). The experiments found the Mg, Na, Fe, and B contents of Turkish coffee after sugar addition to be 19.83, 1.04, 0.02, and 0.21 ppm, respectively, while without sugar these concentrations were 21.46, 0.81, 0.008, and 0.16 ppm. In addition, the element contents were calculated for 1, 3, and 5 cups of coffee in order to investigate the health effects.
Abstract: This paper presents a subband adaptive filter (SAF)
for system identification where the impulse response is sparse
and disturbed by impulsive noise. Benefiting from the use
of l1-norm optimization and an l0-norm penalty on the weight vector
in the cost function, the proposed l0-norm sign SAF (l0-SSAF)
achieves both robustness against impulsive noise and much better
convergence behavior than the classical adaptive filters. Simulation
results in the system identification scenario confirm that the proposed
l0-norm SSAF is not only more robust but also faster and more
accurate than its counterparts in the sparse system identification in
the presence of impulsive noise.
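The two ingredients the l0-SSAF combines can be illustrated in a plain (fullband) sign-error LMS sketch: the sign of the error bounds each update under impulsive noise, and an approximate l0-norm penalty (zero attraction) pulls near-zero taps toward zero for sparse impulse responses. This is a simplified stand-in, not the paper's subband algorithm; the step sizes and penalty constants are arbitrary.

```python
import math
import random

def zero_attracting_sign_lms(x, d, taps=8, mu=0.02, rho=1e-4, beta=5.0):
    """Sign-error LMS with an approximate l0-norm (zero-attraction) penalty."""
    w = [0.0] * taps
    for n in range(taps - 1, len(x)):
        xn = x[n - taps + 1:n + 1][::-1]          # [x[n], ..., x[n-taps+1]]
        e = d[n] - sum(wi * xi for wi, xi in zip(w, xn))
        s = (e > 0) - (e < 0)                     # sign of the error
        w = [wi + mu * s * xi
             - rho * beta * ((wi > 0) - (wi < 0)) * math.exp(-beta * abs(wi))
             for wi, xi in zip(w, xn)]
    return w

random.seed(1)
h = [0.0, 0.0, 1.0, 0.0, 0.0, -0.5, 0.0, 0.0]     # sparse true impulse response
x = [random.gauss(0, 1) for _ in range(20000)]
d = [sum(h[k] * x[n - k] for k in range(len(h))) if n >= len(h) - 1 else 0.0
     for n in range(len(x))]
w = zero_attracting_sign_lms(x, d)
```

Because the update magnitude is mu*|x| regardless of how large the error spike is, a single impulsive-noise sample cannot blow up the weights, which is the robustness property the abstract refers to.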
Abstract: We propose two affine projection algorithms (APA)
with variable regularization parameter. The proposed algorithms
dynamically update the regularization parameter that is fixed in the
conventional regularized APA (R-APA) using a gradient descent
based approach. By introducing the normalized gradient, the proposed
algorithms yield an efficient and robust update scheme for
the regularization parameter. Through experiments we demonstrate
that the proposed algorithms outperform conventional R-APA in
terms of the convergence rate and the misadjustment error.
Abstract: We present a normalized LMS (NLMS) algorithm
with robust regularization. Unlike conventional NLMS with the
fixed regularization parameter, the proposed approach dynamically
updates the regularization parameter. By exploiting a gradient
descent direction, we derive a computationally efficient and robust
update scheme for the regularization parameter. In simulations, we
demonstrate that the proposed algorithm outperforms conventional NLMS
algorithms in terms of convergence rate and misadjustment error.
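For reference, the conventional fixed-regularization NLMS baseline that the proposed scheme extends can be sketched as below; the dynamic gradient-descent update of eps itself is the paper's contribution and is not reproduced here. The filter length and step size are arbitrary illustration values.

```python
import random

def nlms(x, d, taps=4, mu=0.5, eps=1e-2):
    """Conventional NLMS with a fixed regularization parameter eps."""
    w = [0.0] * taps
    for n in range(taps - 1, len(x)):
        xn = x[n - taps + 1:n + 1][::-1]           # [x[n], ..., x[n-taps+1]]
        e = d[n] - sum(wi * xi for wi, xi in zip(w, xn))
        norm = sum(xi * xi for xi in xn) + eps     # eps guards against small ||x||^2
        w = [wi + mu * e * xi / norm for wi, xi in zip(w, xn)]
    return w

random.seed(0)
h = [0.5, -0.2, 0.1, 0.3]                          # unknown system to identify
x = [random.gauss(0, 1) for _ in range(5000)]
d = [sum(h[k] * x[n - k] for k in range(len(h))) if n >= len(h) - 1 else 0.0
     for n in range(len(x))]
w = nlms(x, d)
```

A fixed eps trades convergence rate against stability for weak inputs; adapting it at run time, as the abstract proposes, removes the need to tune this trade-off by hand.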