Seismic Performance Assessment of Pre-70 RC Frame Buildings with FEMA P-58

Past earthquakes have shown that seismic events can cause large economic losses in buildings. FEMA P-58 provides engineers with a practical tool for the performance-based seismic assessment of buildings. In this study, FEMA P-58 is applied to two typical Italian pre-1970 reinforced concrete frame buildings, characterized by plain rebars as steel reinforcement and by masonry infills and partitions. Because FEMA P-58 lacks suitable tools for these buildings, specific fragility curves and loss functions are first developed. Next, building performance is evaluated following a time-based assessment approach. Finally, expected annual losses for the selected buildings are derived and compared with past applications to old RC frame buildings representative of the US building stock.
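
As a rough, self-contained illustration of the time-based loss metric discussed above, the sketch below integrates a loss function over a hazard curve to obtain an expected annual loss. The hazard curve and loss-ratio function are hypothetical placeholders, not FEMA P-58 data or results for the studied buildings.

```python
# Minimal sketch of a time-based expected annual loss (EAL) computation.
# The hazard curve and loss function below are hypothetical placeholders,
# not values from FEMA P-58 or from the buildings studied in the paper.
import numpy as np

# Intensity measure grid (e.g., spectral acceleration in g).
im = np.linspace(0.05, 2.0, 400)

# Hypothetical hazard curve: mean annual frequency of exceedance of im.
def mafe(im):
    return 1e-4 * im ** (-2.5)

# Hypothetical mean loss ratio (repair cost / replacement cost) given im.
def loss_ratio(im):
    return 1.0 / (1.0 + np.exp(-6.0 * (im - 0.8)))

# EAL = integral of E[L | im] * |d lambda / d im| d im  (units: 1/yr).
dlambda_dim = np.gradient(mafe(im), im)
eal = np.trapz(loss_ratio(im) * np.abs(dlambda_dim), im)
print(f"Expected annual loss ratio: {eal:.2e} of the replacement cost per year")
```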

3D Mesh Coarsening via Uniform Clustering

In this paper, we present a fast and efficient mesh coarsening algorithm for 3D triangular meshes. This approach can be applied to very complex 3D meshes of arbitrary topology and with millions of vertices. The algorithm is based on clustering the elements of the input mesh: the faces are divided into a given number of clusters by approximating the Centroidal Voronoi Tessellation of the input mesh. Once a clustering is achieved, it provides an efficient way to construct uniform tessellations and therefore leads to a good coarsening of polygonal meshes. With the proliferation of 3D scanners, this coarsening algorithm is particularly useful for reverse engineering applications of 3D models, which in many cases are dense, non-uniform, irregular, and of arbitrary topology. Examples demonstrating the effectiveness of the new algorithm are also included in the paper.
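
The clustering step can be illustrated with a simplified stand-in: an area-weighted k-means on triangle centroids, which produces a centroidal-Voronoi-like partition of the faces. This is only a sketch of the idea; the paper's actual CVT approximation and the construction of the coarse mesh from cluster adjacency are not reproduced here.

```python
# Simplified stand-in for the clustering step: area-weighted k-means on
# triangle centroids approximates a centroidal Voronoi-like partition of the
# faces. The paper's full algorithm (and the coarse-mesh reconstruction from
# cluster adjacency) is not reproduced here.
import numpy as np
from sklearn.cluster import KMeans

def cluster_faces(vertices, faces, n_clusters):
    """vertices: (V, 3) float array; faces: (F, 3) int array of vertex indices."""
    tri = vertices[faces]                      # (F, 3, 3)
    centroids = tri.mean(axis=1)               # (F, 3) face centroids
    # Triangle areas, used as weights so large faces pull cluster centers.
    areas = 0.5 * np.linalg.norm(
        np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0]), axis=1)
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    labels = km.fit_predict(centroids, sample_weight=areas)
    return labels, km.cluster_centers_         # one representative point per cluster

# Toy usage: a unit square split into two triangles, clustered into 2 groups.
V = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], dtype=float)
F = np.array([[0, 1, 2], [0, 2, 3]])
labels, centers = cluster_faces(V, F, n_clusters=2)
print(labels, centers)
```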

Selection of Designs in Ordinal Regression Models under Linear Predictor Misspecification

The purpose of this article is to find a method of comparing designs for ordinal regression models using quantile dispersion graphs in the presence of linear predictor misspecification. The true relationship between the response variable and the corresponding control variables is usually unknown. The experimenter assumes a certain form of the linear predictor of the ordinal regression model, and this assumed form may not always be correct. Thus, the maximum likelihood estimates (MLE) of the unknown model parameters may be biased due to misspecification of the linear predictor. In this article, the uncertainty in the linear predictor is represented by an unknown function. An algorithm is provided to estimate the unknown function at the design points where observations are available, and the function is then estimated at all points in the design region using multivariate parametric kriging. The comparison of designs is based on a scalar-valued function of the mean squared error of prediction (MSEP) matrix, which incorporates both the variance and the bias of prediction caused by the misspecification of the linear predictor. The designs are compared using the quantile dispersion graphs approach. The graphs also visually depict the robustness of the designs to changes in the parameter values. Numerical examples are presented to illustrate the proposed methodology.
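
The variance-plus-squared-bias idea behind the MSEP criterion can be sketched in a much simpler setting. The code below compares two designs for an ordinary linear model (not an ordinal regression) with a hypothetical misspecification term, and summarizes the MSEP over the design region by its quantiles; the paper's kriging step and full quantile dispersion graphs are omitted.

```python
# Simplified illustration of comparing two designs via the mean squared error
# of prediction (MSEP) under linear-predictor misspecification. For
# transparency this uses an ordinary linear model rather than an ordinal
# regression, and the misspecification function f(x) is a hypothetical choice.
import numpy as np

def msep_profile(design, f, sigma2=1.0, grid=np.linspace(-1, 1, 201)):
    X = np.column_stack([np.ones_like(design), design])      # assumed model: b0 + b1*x
    XtX_inv = np.linalg.inv(X.T @ X)
    H = XtX_inv @ X.T
    fx = f(design)                                            # misspecification at design points
    msep = []
    for x0 in grid:
        x0v = np.array([1.0, x0])
        var = sigma2 * x0v @ XtX_inv @ x0v                    # prediction variance
        bias = x0v @ H @ fx - f(x0)                           # bias from the missing term
        msep.append(var + bias ** 2)
    return np.array(msep)

f = lambda x: 0.5 * x ** 2                                    # hypothetical unmodelled curvature
d1 = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])                    # equally spaced design
d2 = np.array([-1.0, -1.0, 0.0, 1.0, 1.0])                    # end-loaded design
for name, d in [("equally spaced", d1), ("end-loaded", d2)]:
    q = np.quantile(msep_profile(d, f), [0.25, 0.5, 0.75, 0.95])
    print(name, np.round(q, 3))                               # dispersion of MSEP over the region
```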

Disaggregating and Forecasting the Total Energy Consumption of a Building: A Case Study of a High Cooling Demand Facility

Energy disaggregation has attracted the attention of many energy companies, since energy efficiency can be improved when the breakdown of energy consumption is known. Companies have been investing in technologies to develop software and/or hardware solutions that can provide this type of information to the consumer. However, not all consumers can afford these technologies. Therefore, in this paper, we present a methodology for breaking down the aggregate consumption and identifying the high-demand end-use profiles. These energy profiles are then used to build a forecast model for optimal control purposes. A facility with a high cooling load is used as an illustrative case study to demonstrate the results of the proposed methodology. We apply a high-level energy disaggregation through a pattern recognition approach in order to extract the consumption profile of its rooftop packaged units (RTUs), and we present a forecast model for the energy consumption.
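
A minimal sketch of the two steps described above is given below, on synthetic data: clustering daily load profiles to isolate a high-demand (cooling-dominated) pattern, followed by a very simple forecast on lagged values. The data, cluster count, and regressor are illustrative assumptions only, not the case-study methodology itself.

```python
# (1) pattern recognition on daily load profiles to isolate a high-demand
# (RTU-like) pattern, and (2) a simple forecast model on lagged values.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
hours = np.arange(24)
base = 20 + 5 * np.sin(2 * np.pi * (hours - 6) / 24)          # baseline load (kW)
cooling = 40 * np.exp(-0.5 * ((hours - 15) / 3.0) ** 2)       # afternoon cooling peak (kW)
days = []
for d in range(60):
    hot = d % 2 == 0                                          # alternate hot/mild days
    days.append(base + (cooling if hot else 0) + rng.normal(0, 2, 24))
days = np.array(days)                                         # (60, 24) aggregate daily profiles

# Step 1: cluster daily profiles; the cluster with the larger peak is the
# high-demand (cooling-dominated) pattern.
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(days)
high_cluster = int(days[labels == 0].max() < days[labels == 1].max())
print("high-demand days:", np.sum(labels == high_cluster))

# Step 2: naive forecast of the next hour from the previous 3 hours.
series = days.ravel()
X = np.column_stack([series[i:len(series) - 3 + i] for i in range(3)])
y = series[3:]
model = Ridge(alpha=1.0).fit(X[:-100], y[:-100])
rmse = np.sqrt(np.mean((model.predict(X[-100:]) - y[-100:]) ** 2))
print("holdout RMSE (kW):", round(rmse, 2))
```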

Evaluation of Ensemble Classifiers for Intrusion Detection

One of the major developments in machine learning in the past decade is the ensemble method, which builds a highly accurate classifier by combining many moderately accurate component classifiers. In this work, new ensemble classification methods are proposed: a homogeneous ensemble classifier using bagging and a heterogeneous ensemble classifier using arcing, and their performance is analyzed in terms of accuracy. The classifier ensembles are designed using Radial Basis Function (RBF) networks and Support Vector Machines (SVM) as base classifiers. The feasibility and benefits of the proposed approaches are demonstrated on standard intrusion detection datasets. The proposed approach consists of three main phases: preprocessing, classification, and combining. A wide range of comparative experiments is conducted on standard intrusion detection datasets, and the performance of the proposed homogeneous and heterogeneous ensemble classifiers is compared with that of other standard ensemble methods: error-correcting output codes (ECOC) and dagging for the homogeneous case, and majority voting and stacking for the heterogeneous case. The proposed ensemble methods provide a significant improvement in accuracy over the individual classifiers; the proposed bagged RBF and SVM perform significantly better than ECOC and dagging, and the proposed hybrid RBF-SVM performs significantly better than voting and stacking. Heterogeneous models also exhibit better results than homogeneous models on the standard intrusion detection datasets.
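
The two proposed ensemble structures can be sketched with scikit-learn as follows. Synthetic data stands in for the intrusion detection datasets, the "RBF network" is approximated by an RBF feature map followed by logistic regression, and the paper's arcing and stacking variants are not reproduced; this is a hedged illustration of bagging and heterogeneous voting only.

```python
# Hedged sketch: a bagged RBF-kernel SVM (homogeneous ensemble) and an
# RBF/SVM voting hybrid (heterogeneous ensemble), on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier, VotingClassifier
from sklearn.kernel_approximation import RBFSampler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
rbf_net = make_pipeline(StandardScaler(), RBFSampler(gamma=0.5, random_state=0),
                        LogisticRegression(max_iter=1000))   # stand-in for an RBF network

bagged_svm = BaggingClassifier(estimator=svm, n_estimators=10,
                               random_state=0)               # "base_estimator" in sklearn < 1.2
hybrid = VotingClassifier([("rbf", rbf_net), ("svm", svm)], voting="hard")

for name, clf in [("SVM alone", svm), ("bagged SVM", bagged_svm), ("RBF-SVM vote", hybrid)]:
    clf.fit(Xtr, ytr)
    print(name, round(accuracy_score(yte, clf.predict(Xte)), 3))
```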

Response of a Bridge Crane during an Earthquake

During an earthquake, a bridge crane may be subjected to multiple impacts between the crane wheels and the rails. In order to model such phenomena, a time-history dynamic analysis with a multi-scale approach is performed. The high-frequency aspect of the impacts between wheels and rails is taken into account by an explicit Lagrangian event-capturing algorithm based on a velocity-impulse formulation to resolve contacts and impacts. An implicit time-integration scheme is used for the rest of the structure. The numerical coupling between the implicit and explicit schemes is achieved with a heterogeneous asynchronous time integrator.
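
A toy one-degree-of-freedom sketch of the two ingredients named above is given below: an implicit Newmark scheme for the smooth structural motion and a velocity-impulse (restitution) rule when the mass reaches a rigid stop. The paper's multi-scale implicit/explicit coupling through a heterogeneous asynchronous time integrator is not reproduced, and all parameters are hypothetical.

```python
# Toy 1-DOF illustration: implicit Newmark integration plus a velocity-impulse
# treatment of impact against a rigid stop. All parameters are hypothetical.
import numpy as np

m, k = 1.0, 400.0                 # mass (kg), stiffness (N/m)
gap, e = 0.01, 0.6                # clearance to the stop (m), restitution coefficient
beta, gamma, h = 0.25, 0.5, 1e-3  # Newmark (average acceleration), time step (s)

def ground_accel(t):              # hypothetical base excitation (m/s^2)
    return 5.0 * np.sin(2 * np.pi * 2.0 * t)

x, v = 0.0, 0.0
a = (-m * ground_accel(0.0) - k * x) / m
impacts = 0
for n in range(5000):
    t_next = (n + 1) * h
    # Implicit Newmark step (undamped): solve for the new acceleration, then update.
    x_pred = x + h * v + h * h * (0.5 - beta) * a
    v_pred = v + h * (1 - gamma) * a
    a = (-m * ground_accel(t_next) - k * x_pred) / (m + beta * h * h * k)
    x = x_pred + beta * h * h * a
    v = v_pred + gamma * h * a
    # Velocity-impulse contact: if the stop is reached while closing, reflect
    # the velocity with restitution and keep the mass at the stop.
    if x > gap and v > 0:
        v = -e * v
        x = gap
        impacts += 1
print("number of wheel/rail-type impacts:", impacts)
```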

Developing New Media Credibility Scale: A Multidimensional Perspective

The main purposes of this study are to develop a scale that reflects emerging theoretical understandings of new media credibility, drawing on the evolution of credibility studies in Western research; to identify the determinants of media credibility and its components by comparing traditional and new media credibility scales; and to build a cumulative scale for testing new media credibility. The approach builds on Western conceptualizations of media credibility, which focus on four principal components: source (journalist), message (article), medium (newspaper, radio, TV, web, etc.), and organization (owner of the medium), and adds the user and the cultural context as key components for assessing new media credibility in particular. This study's value lies in its contribution to the conceptualization and development of new media credibility through the creation of a theoretical measurement tool. Future studies should apply this scale to test new media credibility, which represents a promising new approach in the efforts to define and measure the credibility of all media types.

The Effect of Corporate Social Responsibility in the National Commercial Bank in Saudi Arabia

The aim of the paper is to investigate the effect of corporate social responsibility (CSR) on the National Commercial Bank (NCB) in Saudi Arabia. To this end, a case study was made of the CSR activities of this bank from the perspective of its branch managers. The NCB was chosen as it was one of the first Saudi banks to engage in CSR and currently has a wide range of CSR initiatives. A qualitative research method was used. Open-ended questionnaires were administered to eighty branch managers of the NCB, with fifty-five usable questionnaires returned, and twenty managers were interviewed as part of the primary research. Data from both the questionnaires and the interviews were analysed using qualitative content analysis. Six themes emerged from the questionnaire findings and were used to develop the interview questions: awareness of employees about CSR in the NCB; CSR activities as a type of investment; government and media support; increased employee loyalty in the NCB; prestige and profit to the NCB; and the view of CSR in Islam. This paper makes a theoretical contribution in that it investigates and increases understanding of the effect of CSR on the NCB in Saudi Arabia. In addition, it makes a practical contribution by offering recommendations which can support the development of CSR in the NCB. A limitation of the paper is that it is a case study of only one bank. It is therefore recommended that future research be conducted with other banks in Saudi Arabia, or indeed with a range of other types of firm within the Saudi Arabian financial services sector. In this way, the same issues could be explored with greater potential generalisability of the findings on CSR within the Saudi Arabian financial services industry. Finally, this paper takes a qualitative approach, and it is suggested that future research be carried out using mixed methods, which could provide a greater depth of analysis.

Medical Image Edge Detection Based on Neuro-Fuzzy Approach

Edge detection is one of the most important tasks in image processing. Medical image edge detection plays an important role in the segmentation and object recognition of human organs, and refers to the process of identifying and locating sharp discontinuities in medical images. In this paper, a neuro-fuzzy approach is introduced to detect edges in noisy medical images. The approach uses a desired number of neuro-fuzzy subdetectors together with a postprocessor for detecting the edges of medical images. The internal parameters of the approach are optimized by training on patterns derived from artificial images. The performance of the approach is evaluated on different medical images and compared with popular edge detection algorithms. The experimental results show that this approach performs better than competing edge detection algorithms on noisy medical images.
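
The train-on-artificial-images idea can be illustrated with a simplified stand-in: instead of neuro-fuzzy subdetectors, a small MLP is trained on 3x3 patches of a synthetic image with known edges and then applied to a noisier version. This is only a sketch of the training strategy, not the paper's actual neuro-fuzzy architecture or its postprocessor.

```python
# Simplified stand-in: a small MLP trained on patches of an artificial image
# with known edges, then applied to a noisy version of the image.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

def make_image(size=64):
    img = np.zeros((size, size))
    img[16:48, 16:48] = 1.0                    # a bright square: edges are known
    edges = np.zeros_like(img, dtype=bool)
    edges[16:48, [16, 47]] = True
    edges[[16, 47], 16:48] = True
    return img, edges

def patches(img):
    size = img.shape[0]
    feats, coords = [], []
    for i in range(1, size - 1):
        for j in range(1, size - 1):
            feats.append(img[i - 1:i + 2, j - 1:j + 2].ravel())
            coords.append((i, j))
    return np.array(feats), coords

clean, edges = make_image()
X, coords = patches(clean + rng.normal(0, 0.05, clean.shape))   # slightly noisy training image
y = np.array([edges[i, j] for i, j in coords])
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=800, random_state=0).fit(X, y)

noisy, true_edges = make_image()
noisy = noisy + rng.normal(0, 0.2, noisy.shape)                 # heavier noise at test time
Xt, coords_t = patches(noisy)
acc = np.mean(clf.predict(Xt) == np.array([true_edges[i, j] for i, j in coords_t]))
print("per-pixel edge classification accuracy on the noisy image:", round(acc, 3))
```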

Expected Present Value of Losses in the Computation of Optimum Seismic Design Parameters

An approach to compute optimum seismic design parameters is presented. It is based on the minimization of the expected present value of the total cost, which includes the initial cost of the structures as well as the cost due to earthquakes. Different types of seismicity models are considered, including one for characteristic earthquakes. Uncertainties are included in some variables to observe their influence on the optimum values. Optimum seismic design coefficients are computed for three structural types representing high-, medium- and low-rise buildings, located both near to and far from the seismic sources. Ordinary and important structures are considered in the analysis. The results show that both the seismicity models and the uncertainties in the variables have an important influence on the optimum values.
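
The trade-off being optimized can be sketched numerically: pick the design coefficient that minimizes initial cost plus the expected present value of future earthquake losses. All cost and seismicity parameters below are hypothetical, chosen only to make the trade-off visible; they are not the paper's seismicity models or structural types.

```python
# Minimal numerical sketch: total cost = initial cost + expected present value
# of earthquake losses, minimized over the design coefficient c.
import numpy as np

c = np.linspace(0.05, 0.5, 200)              # candidate design coefficients

C0 = 1.0                                      # normalized initial cost of a reference design
init_cost = C0 * (1.0 + 1.5 * c)              # stronger design -> higher initial cost

nu = 0.02                                     # mean annual rate of damaging events (1/yr)
expected_loss = 2.0 * C0 * np.exp(-8.0 * c)   # mean loss per event, decreasing with c
r = 0.05                                      # annual discount rate

# For Poisson occurrences over an unbounded horizon, the expected present
# value of losses is (rate * mean loss per event) / discount rate.
pv_losses = nu * expected_loss / r
total = init_cost + pv_losses

c_opt = c[np.argmin(total)]
print(f"optimum design coefficient ~ {c_opt:.2f}, minimum total cost {total.min():.3f}")
```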

Large-Scale Production of High-Performance Fiber-Metal-Laminates by Prepreg-Press-Technology

Lightweight construction has become increasingly important over the last decades in several applications, e.g. in the automotive and aircraft sectors. This is the result of economic and ecological constraints on the one hand and increasing safety and comfort requirements on the other. In the field of lightweight design, different approaches are used, depending on the specific requirements of the technical system. The use of continuous carbon fiber reinforced plastics (CFRP) offers the largest weight-saving potential, in some cases more than 50% compared to conventional metal constructions. However, industrial applications are still very limited because of the cost-intensive manufacture of the fibers and the production technologies. Other disadvantages of pure CFRP structures concern quality control and damage resistance. One approach to meeting these challenges is hybrid materials, in which CFRP and sheet metal are combined at the material level, making new, innovative process routes feasible. Hybrid lightweight design results in lower costs due to optimized material utilization and the possibility of integrating the structures into existing production processes of automobile manufacturers. Recent and current research has pointed out the advantages of two-layered hybrid materials, e.g. the possibility of realizing structures with tailored mechanical properties or of dividing the curing cycle of the epoxy resin into two steps. Current research at the Chair for Automotive Lightweight Design (LiA) at Paderborn University focuses on production processes for fiber-metal laminates. The aim of this work is the development and qualification of a large-scale production process for high-performance fiber-metal laminates (FML) for industrial applications in the automotive or aircraft sector. To this end, prepreg-press technology is used, in which pre-impregnated carbon fibers and sheet metal are formed and cured in a closed, heated mold. The investigations focus on, among other things, the realization of short process chains and cycle times, the reduction of time-consuming manual process steps, and the reduction of material costs. This paper first gives an overview of the main steps of the production process; afterwards, experimental results are discussed, concentrating on the influence of different process parameters on the mechanical properties and laminate quality and on the identification of process limits. Finally, the advantages of this technology compared to conventional FML production processes and other lightweight design approaches are presented.

Machine Learning Approach for Identifying Dementia from MRI Images

This paper presents a framework for classifying Magnetic Resonance Imaging (MRI) images for dementia. Dementia, an age-related cognitive decline, is indicated by the degeneration of cortical and sub-cortical structures. Characterizing these morphological changes helps in understanding disease development and contributes to early prediction and prevention of the disease. Modelling that captures the brain's structural variability and that remains valid for disease classification and interpretation is very challenging. Features are extracted using Gabor filters at 0, 30, 60 and 90 degree orientations and the Gray Level Co-occurrence Matrix (GLCM). The features are normalized and fused, and Independent Component Analysis (ICA) is used for feature selection. A Support Vector Machine (SVM) classifier with different kernels is evaluated for its ability to classify dementia. The framework is evaluated using MRI images from the OASIS dataset. Results show that the proposed feature-fusion classifier achieves higher classification accuracy.
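
A hedged sketch of this feature pipeline is shown below on synthetic textures rather than OASIS MRI slices: Gabor responses at 0/30/60/90 degrees and GLCM statistics are extracted, min-max normalized, fused by concatenation, reduced with ICA, and classified with an RBF-kernel SVM. Parameter choices (Gabor frequency, GLCM distances, number of components) are illustrative only.

```python
# Feature extraction (Gabor + GLCM), normalization, fusion, ICA, and SVM
# classification on synthetic textures standing in for MRI slices.
import numpy as np
from skimage.filters import gabor
from skimage.feature import graycomatrix, graycoprops   # greycomatrix in older scikit-image
from sklearn.decomposition import FastICA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MinMaxScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def features(img):
    gab = []
    for theta in np.deg2rad([0, 30, 60, 90]):
        real, imag = gabor(img, frequency=0.2, theta=theta)
        mag = np.hypot(real, imag)
        gab += [mag.mean(), mag.std()]
    q = np.clip((img * 63).astype(np.uint8), 0, 63)           # quantize to 64 grey levels
    glcm = graycomatrix(q, distances=[1], angles=[0, np.pi / 2], levels=64,
                        symmetric=True, normed=True)
    tex = [graycoprops(glcm, p).mean() for p in
           ("contrast", "homogeneity", "energy", "correlation")]
    return np.array(gab + tex)                                 # fused feature vector

# Two synthetic texture classes standing in for the two diagnostic groups.
imgs, labels = [], []
for i in range(40):
    smooth = np.clip(rng.normal(0.5, 0.05, (64, 64)), 0, 1)
    stripy = np.clip(0.5 + 0.3 * np.sin(np.linspace(0, 12 * np.pi, 64))[None, :]
                     + rng.normal(0, 0.05, (64, 64)), 0, 1)
    imgs += [smooth, stripy]
    labels += [0, 1]
X = np.array([features(im) for im in imgs])
y = np.array(labels)

clf = make_pipeline(MinMaxScaler(), FastICA(n_components=5, random_state=0),
                    SVC(kernel="rbf"))
print("5-fold CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```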

Towards Developing a Self-Explanatory Scheduling System Based on a Hybrid Approach

In this study, we present a conceptual framework for developing a scheduling system that can generate self-explanatory and easy-to-understand schedules. To this end, a user interface is conceived to help planners record factors that are considered crucial in scheduling, as well as the internal and external sources relating to such factors. A hybrid approach combining machine learning and constraint programming is developed to generate schedules together with the corresponding factors and to display them on the user interface. The effects of the proposed system on scheduling are discussed, and it is expected that scheduling efficiency and system understandability will be improved compared with previous scheduling systems.
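
The constraint-programming half of such a hybrid can be sketched as follows, using Google OR-Tools CP-SAT as an assumed choice of solver: each constraint is recorded together with a human-readable "factor" so the resulting schedule can be explained alongside the solution. The machine learning component, the real user interface, and the paper's actual task data are not shown.

```python
# Minimal CP-SAT scheduling sketch with recorded explanatory factors.
from ortools.sat.python import cp_model

tasks = {"prep": 3, "build": 5, "test": 2}      # task -> duration (hypothetical)
factors = []                                     # (constraint description, source) pairs
horizon = sum(tasks.values())

model = cp_model.CpModel()
start, end, interval = {}, {}, {}
for name, dur in tasks.items():
    start[name] = model.NewIntVar(0, horizon, f"start_{name}")
    end[name] = model.NewIntVar(0, horizon, f"end_{name}")
    interval[name] = model.NewIntervalVar(start[name], dur, end[name], f"iv_{name}")

model.AddNoOverlap(list(interval.values()))
factors.append(("tasks share a single machine", "resource calendar"))

model.Add(start["build"] >= end["prep"])
factors.append(("'build' must follow 'prep'", "planner input via UI"))

makespan = model.NewIntVar(0, horizon, "makespan")
model.AddMaxEquality(makespan, list(end.values()))
model.Minimize(makespan)

solver = cp_model.CpSolver()
if solver.Solve(model) in (cp_model.OPTIMAL, cp_model.FEASIBLE):
    for name in tasks:
        print(f"{name}: start {solver.Value(start[name])}, end {solver.Value(end[name])}")
    for reason, source in factors:               # the explanation shown with the schedule
        print("because:", reason, "| source:", source)
```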

Discrete Element Modeling on Bearing Capacity Problems

In this paper, the classical bearing capacity problem is reconsidered using discrete element analysis. In the discrete element approach, the bearing capacity problem is followed from the elastic stage through the plastic stage to the rupture stage (large displacement). The bearing capacity failure mechanism of a strip footing on soil is investigated, and the influence of the micro-parameters on the bearing capacity of the soil is also examined. It is found that the discrete element method (DEM) gives very good visualized results, which basically coincide with those derived by the classical methods.
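
For reference, the classical comparison mentioned above can be written as a short sketch: the Terzaghi-type bearing capacity of a strip footing with the Reissner/Prandtl factors and the Vesić expression for the unit-weight term. The soil parameters are hypothetical, and the DEM model itself is not reproduced here.

```python
# Classical strip-footing bearing capacity:
# q_ult = c*Nc + q*Nq + 0.5*gamma*B*Ngamma
import numpy as np

def bearing_capacity(c, phi_deg, gamma, B, q=0.0):
    phi = np.radians(phi_deg)
    Nq = np.exp(np.pi * np.tan(phi)) * np.tan(np.pi / 4 + phi / 2) ** 2   # Reissner
    Nc = (Nq - 1) / np.tan(phi) if phi_deg > 0 else 5.14                  # Prandtl (phi = 0 limit)
    Ngamma = 2 * (Nq + 1) * np.tan(phi)                                   # Vesic
    return c * Nc + q * Nq + 0.5 * gamma * B * Ngamma

# Strip footing, width 1 m, on a c-phi soil (illustrative values).
print(f"q_ult ~ {bearing_capacity(c=10e3, phi_deg=30, gamma=18e3, B=1.0):.0f} Pa")
```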

Energy-Aware Data Aggregation in Wireless Sensor Networks Using a Connected Dominating Set

Wireless Sensor Networks (WSNs) have many advantages. Their deployment is easier and faster than that of wired sensor networks or other wireless networks, as they do not need a fixed infrastructure. To aggregate data, the network is organized by partitioning the nodes into many small groups called clusters. WSN clustering supports the performance of the sensor nodes: their energy consumption is reduced by eliminating redundant energy use and by balancing energy use across the network. The aim of such clustering protocols is to prolong network lifetime. Low Energy Adaptive Clustering Hierarchy (LEACH) is a popular protocol in WSNs; it is a clustering protocol in which random rotation of the local cluster heads is used to distribute the energy load among all sensor nodes in the network. This paper proposes Connected Dominating Set (CDS)-based cluster formation. CDS-based aggregation is a promising approach for reducing routing overhead, since messages are transmitted only within the virtual backbone formed by the CDS, and data aggregation lowers the ratio of responding hosts to the hosts in the virtual backbone. The CDS approach tries to increase network lifetime by considering parameters such as sensor lifetime and remaining and consumed energy, in order to achieve nearly optimal data aggregation within the network. Experimental results show that CDS outperforms LEACH in terms of the number of clusters formed, average packet loss rate, average end-to-end delay, network lifetime, and remaining energy.
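
One simple way to form such a virtual backbone is sketched below: the internal (non-leaf) nodes of any spanning tree of a connected graph form a connected dominating set. A random geometric graph stands in for a WSN topology; the energy model, the LEACH comparison, and the routing itself are not reproduced, so this only illustrates the backbone/cluster-formation idea.

```python
# Connected dominating set (backbone) from the internal nodes of a BFS tree.
import networkx as nx

G = nx.random_geometric_graph(60, radius=0.3, seed=1)        # sensor field stand-in
G = G.subgraph(max(nx.connected_components(G), key=len)).copy()

tree = nx.bfs_tree(G, source=next(iter(G.nodes)))            # directed spanning tree
cds = {u for u in tree.nodes if tree.out_degree(u) > 0}       # internal nodes = backbone

assert nx.is_dominating_set(G, cds)
assert nx.is_connected(G.subgraph(cds))
print(f"{len(cds)} backbone nodes dominate all {G.number_of_nodes()} sensors")

# Cluster formation: every non-backbone sensor reports to a neighbouring
# backbone node, which aggregates its data before forwarding it.
heads = {v: next(iter(set(G[v]) & cds)) for v in G if v not in cds}
print("example assignments:", dict(list(heads.items())[:5]))
```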

The Impact of Regulatory Changes on the Development of Mobile Medical Apps

Mobile applications are used to perform a wide variety of tasks in day-to-day life, ranging from checking email to controlling home heating. Application developers have recognized the potential to transform a smart device, i.e. a mobile phone or a tablet, into a medical device by using a mobile medical application. When initially conceived, these mobile medical applications performed basic functions, e.g. BMI calculation or access to reference material; however, their increasing complexity now offers clinicians and patients a wide range of functionality. As this complexity and functionality increase, so too does the potential risk associated with using such an application. Examples include applications that provide the ability to inflate and deflate blood pressure cuffs, as well as applications that use patient-specific parameters to calculate dosages or create a dosage plan for radiation therapy. If an unapproved mobile medical application is marketed by a medical device organization, the organization faces significant penalties, such as an FDA warning letter to cease the prohibited activity, fines, and the possibility of a criminal conviction. Regulatory bodies have finalized guidance intended to help mobile application developers establish whether their applications are subject to regulatory scrutiny. However, regulatory controls appear to contradict the approaches taken by mobile application developers, who generally work with short development cycles and very little documentation; as such, these regulations have the potential to stifle further improvement. The research presented in this paper details how, by adopting development techniques such as agile software development, mobile medical application developers can meet regulatory requirements whilst still fostering innovation.

Numerical Applications of Tikhonov Regularization for the Fourier Multiplier Operators

Tikhonov regularization and reproducing kernels are among the most popular approaches to solving ill-posed problems in computational mathematics and its applications. Fourier multiplier operators are an essential tool for extending well-known linear transforms of Euclidean Fourier analysis, such as the Weierstrass transform, the Poisson integral, the Hilbert transform, the Riesz transforms, the Bochner-Riesz mean operators, the partial Fourier integral, the Riesz potential, and the Bessel potential. Using the theory of reproducing kernels, we construct simple and efficient representations for a class of Fourier multiplier operators T_m on the Paley-Wiener space H_h. In addition, we give an error estimate for the approximation and obtain convergence results as the parameters and the independent variables approach zero. Furthermore, using numerical quadrature rules to compute single and multiple integrals, we give numerical examples and write explicitly the extremal function and the corresponding Fourier multiplier operators.
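
A numerical sketch of Tikhonov regularization for a Fourier multiplier operator is given below, using the discrete Fourier transform as a stand-in for the continuous setting of the paper: for data g = T_m f + noise, the minimizer of ||T_m f - g||^2 + lambda*||f||^2 has Fourier coefficients conj(m)*g_hat / (|m|^2 + lambda). The Gaussian (heat-type) multiplier and all parameters are illustrative choices, not the paper's extremal-function construction.

```python
# Tikhonov-regularized inversion of a Fourier multiplier operator (DFT version).
import numpy as np

n = 256
x = np.linspace(-np.pi, np.pi, n, endpoint=False)
xi = np.fft.fftfreq(n, d=(x[1] - x[0])) * 2 * np.pi        # frequency grid

f = np.exp(-x ** 2) * np.cos(3 * x)                        # "unknown" signal
m = np.exp(-0.05 * xi ** 2)                                # Gaussian (Weierstrass-type) multiplier

g = np.fft.ifft(m * np.fft.fft(f)).real                    # g = T_m f
g += 1e-3 * np.random.default_rng(0).standard_normal(n)    # measurement noise

for lam in (1e-1, 1e-3, 1e-6):
    f_hat = np.conj(m) * np.fft.fft(g) / (np.abs(m) ** 2 + lam)
    f_rec = np.fft.ifft(f_hat).real
    err = np.linalg.norm(f_rec - f) / np.linalg.norm(f)
    print(f"lambda = {lam:.0e}: relative reconstruction error = {err:.3f}")
```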

Linking Business Process Models and System Models Based on Business Process Modelling

Organizations today need to invest in software in order to run their businesses, and, to meet the organizations' objectives, the software should be in line with the business processes. This research presents an approach for linking process models and system models. In particular, the new approach aims to synthesize a sequence diagram from a role activity diagram (RAD) model. The approach consists of four steps: creating the business process model using RAD, identifying the computerized activities, identifying the entities of the sequence diagram, and identifying the messages of the sequence diagram. The new approach has been validated using the student registration process at the University of Petra as a case study. Further research is required to validate the approach in other domains.
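
The four steps can be illustrated with a toy in-memory RAD model and PlantUML text as the sequence-diagram output. The role names and registration activities below are hypothetical examples, not taken from the University of Petra case study.

```python
# Step 1: business process model as RAD-style interactions (from_role, activity, to_role).
rad = [
    ("Student", "submit registration form", "Registration System"),
    ("Registration System", "check prerequisites", "Registration System"),
    ("Registration System", "confirm enrolment", "Student"),
    ("Registrar", "approve exception manually", "Student"),   # a purely manual activity
]

# Step 2: identify computerized activities (those in which a system participates).
computerized = [step for step in rad if "System" in step[0] or "System" in step[2]]

# Step 3: identify the entities (lifelines) of the sequence diagram.
entities = []
for src, _, dst in computerized:
    for role in (src, dst):
        if role not in entities:
            entities.append(role)

# Step 4: identify the messages and emit them as a PlantUML sequence diagram.
lines = ["@startuml"] + [f'participant "{e}"' for e in entities]
for src, activity, dst in computerized:
    lines.append(f'"{src}" -> "{dst}": {activity}')
lines.append("@enduml")
print("\n".join(lines))
```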

Encouraging the Development of Scientific Literacy in Early Childhood Institutions: Croatian Experience

There is a widespread belief in everyday discourse that the science subjects (physics, chemistry and biology) are, along with mathematics, the most difficult school subjects in the education of an individual. This assumption is usually justified by the following observations: grade point averages in these subjects are low, the number of pupils who fail them is high in comparison to other subjects, and the number of pupils interested in continuing their studies in science-oriented fields is lower than in non-science-oriented fields. From that perspective, the project "Could it be different? How do children explore it?" becomes extremely interesting, because it focuses on young children and on the introduction of new methods, with the aim of arousing interest in the development of scientific literacy in 10 kindergartens by applying action research methodology with an ethnographic approach. We define scientific literacy as a process of encouraging and nurturing the research and explorative spirit in children, as well as their natural potential and abilities, which themselves represent an object of scientific research: learning about exploration by conducting exploration. Upon project completion, evaluation questionnaires were created for the parents of the children who had participated in the project as well as for those whose children had not been involved. The purpose of the first questionnaire was to examine the level of satisfaction with the project implementation and its outcomes among the parents whose children had been involved in the project (N=142), while the aim of the second questionnaire was to find out how interested the parents of the children not involved in this activity (N=154) were in this topic.

Analytical Study of Applying the Account Aggregation Approach in E-Banking Services

Advanced information technology is becoming an important factor in the development of the financial services industry, especially the banking industry. It has introduced new ways of delivering banking to the customer, such as Internet banking. Banks began to look at electronic banking (e-banking) as a means to replace some of their traditional branch functions, using the Internet as a new distribution channel. Some consumers have more than one account, often across several banks, and access these accounts using e-banking services. To see their current net worth position, customers have to log in to each of their accounts, retrieve the details, and consolidate them. This not only takes considerable time but is also a repetitive activity performed at regular intervals. To address this, the account aggregation concept is proposed as a solution. E-banking account aggregation, one of the e-banking service types, emerged as a way to build a stronger relationship with customers. An account aggregation service generally refers to a service that allows customers to manage bank accounts maintained at different institutions through a common Internet banking platform, with a strong concern for security and privacy. This paper presents an overview of the e-banking account aggregation approach as a new service in the e-banking field.
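
The consolidation step described above can be sketched in a purely illustrative way: balances that a customer would otherwise look up bank by bank are gathered into a single net worth view. The bank names, account records, and fetch function are hypothetical placeholders, not a real aggregation API or any bank's interface.

```python
# Illustrative consolidation of balances from multiple (hypothetical) banks.
from dataclasses import dataclass

@dataclass
class Account:
    bank: str
    name: str
    balance: float          # positive = asset, negative = liability

def fetch_accounts(bank: str) -> list[Account]:
    # Stand-in for logging in to each institution and retrieving the details.
    sample = {
        "Bank A": [Account("Bank A", "current", 4200.0), Account("Bank A", "loan", -15000.0)],
        "Bank B": [Account("Bank B", "savings", 23000.0)],
    }
    return sample.get(bank, [])

def net_worth(banks: list[str]) -> float:
    accounts = [acc for b in banks for acc in fetch_accounts(b)]
    for acc in accounts:
        print(f"{acc.bank:7s} {acc.name:8s} {acc.balance:>10.2f}")
    return sum(acc.balance for acc in accounts)

print("net worth:", net_worth(["Bank A", "Bank B"]))
```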