Response of the Residential Building Structure to Technical Seismicity Load due to Mining Activities

In territories where high-intensity earthquakes are frequent, considerable attention is paid to solving seismic problems. The paper describes two variants of a computational model of the structure, based on the finite element method, with different simulations of the subsoil (rigid or elastic). The ANSYS finite element program system was used for the simulation and calculations. Seismic responses of the residential building structure were calculated for a load characterized by an accelerogram, for comparison with the response spectra method.

A Hybrid Particle Swarm Optimization Solution to Ramping Rate Constrained Dynamic Economic Dispatch

This paper presents the application of an enhanced Particle Swarm Optimization (EPSO) combined with Gaussian Mutation (GM) for solving the Dynamic Economic Dispatch (DED) problem considering the operating constraints of generators. The EPSO consists of the standard PSO and a modified heuristic search approach: the ability of the traditional PSO is enhanced by applying the modified heuristic search to prevent solutions from violating the constraints. In addition, Gaussian Mutation increases the diversity of the global search and prevents the search from being trapped in suboptimal points. To illustrate its efficiency and effectiveness, the developed EPSO-GM approach is tested on 3-unit and 10-unit 24-hour systems considering the valve-point effect. From the experimental results, it can be concluded that the proposed EPSO-GM provides accurate solutions, high efficiency, and robust computation compared with the other algorithms under consideration.

Extrapolation of Clinical Data from an Oral Glucose Tolerance Test Using a Support Vector Machine

To extract the important physiological factors related to diabetes from an oral glucose tolerance test (OGTT) by mathematical modeling, highly informative yet convenient protocols are required. Current models require a large number of samples and an extended period of testing, which is not practical for daily use. The purpose of this study is to make model assessments possible even from a reduced number of samples taken over a relatively short period. For this purpose, test values were extrapolated using a support vector machine. A good correlation was found between reference and extrapolated values in the 741 OGTTs evaluated. This result indicates that a reduction in the number of clinical tests is possible through a computational approach.

Array Signal Processing: DOA Estimation for Missing Sensors

Array signal processing involves signal enumeration and source localization. It is centered on the ability to fuse temporal and spatial information, captured by sampling the signals emitted from a number of sources at the sensors of an array, in order to carry out a specific estimation task: estimating source characteristics (mainly localization of the sources) and/or array characteristics (mainly array geometry). Array signal processing uses sensors organized in patterns, or arrays, to detect signals and to determine information about them. Beamforming is a general signal processing technique used to control the directionality of the reception or transmission of a signal; using beamforming, the majority of the received signal energy can be steered toward a chosen direction. Multiple signal classification (MUSIC) is a highly popular eigenstructure-based method for high-resolution estimation of the direction of arrival (DOA). This paper examines the effect of missing sensors on DOA estimation. The accuracy of MUSIC-based DOA estimation is degraded significantly both by missing sensors among the receiving array elements and by unequal channel gain and phase errors of the receiver.
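As a concrete illustration of the effect studied here, the following NumPy sketch implements the standard MUSIC pseudo-spectrum; the scenario (a 10-element half-wavelength ULA, one source at 20 degrees, one dead sensor whose output is zeroed) is an assumed toy setup, not the paper's experiment:

```python
import numpy as np

def music_doa(X, n_sources, d=0.5, grid=np.linspace(-90, 90, 361)):
    """MUSIC pseudo-spectrum from array snapshots X (sensors x snapshots)."""
    M = X.shape[0]
    R = X @ X.conj().T / X.shape[1]           # sample covariance matrix
    w, V = np.linalg.eigh(R)                  # eigenvalues ascending
    En = V[:, :M - n_sources]                 # noise-subspace eigenvectors
    spectrum = []
    for theta in grid:
        a = np.exp(-2j * np.pi * d * np.arange(M) * np.sin(np.radians(theta)))
        spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return grid, np.asarray(spectrum)

# Simulate one narrowband source at 20 degrees on a 10-element ULA
rng = np.random.default_rng(0)
M, N, true_doa = 10, 500, 20.0
a = np.exp(-2j * np.pi * 0.5 * np.arange(M) * np.sin(np.radians(true_doa)))
s = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
X = np.outer(a, s) + noise
X[3, :] = 0                                   # "missing" sensor: its channel is lost
grid, P = music_doa(X, n_sources=1)
est = grid[np.argmax(P)]
```

With sensor 3 dead, the peak at the true DOA still dominates in this clean single-source case, but its height collapses because the steering vector no longer lies fully outside the noise subspace; with several missing elements or added gain/phase errors the degradation described in the abstract becomes severe.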

Deep iCrawl: An Intelligent Vision-Based Deep Web Crawler

The explosive growth of the World Wide Web has posed a challenging problem in extracting relevant data. Traditional web crawlers focus only on the surface web, while the deep web keeps expanding behind the scenes. Deep web pages are created dynamically as a result of queries posed to specific web databases. The structure of these pages makes it impossible for traditional web crawlers to access deep web contents. This paper, Deep iCrawl, presents a novel vision-based approach for extracting data from the deep web. Deep iCrawl splits the process into two phases: the first includes query analysis and query translation, and the second covers vision-based extraction of data from the dynamically created deep web pages. There are several established approaches for the extraction of deep web pages, but the proposed method aims at overcoming their inherent limitations. This paper also aims at comparing the data items and presenting them in the required order.

Fatigue Failure of Structural Steel – Analysis Using Fracture Mechanics

Fatigue is the major threat in the service of steel structures subjected to fluctuating loads. With the additional effects of corrosion and the presence of weld joints, fatigue failure may become more critical in structural steel. One apt example of such a structure is a sailing ship, which experiences a constant stress due to floating and a pulsating bending load due to the waves. This paper describes an attempt to verify the theory of fatigue in a fracture mechanics approach through experimentation to determine the constants of the crack growth curve. For this, a specimen with a known defect is prepared from shipbuilding steel and subjected to a pulsating bending load. The fatigue crack and its nature are observed in this experiment. The fracture mechanics approach to fatigue is applied in a simple practical experiment, and the constants of the crack growth equation are investigated.
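The "constants of the crack growth equation" referred to above are usually the C and m of the Paris law, da/dN = C (ΔK)^m, obtained from a log-log fit of measured growth rates against stress intensity range. A minimal sketch of that fit, using synthetic data in place of the experimental measurements:

```python
import numpy as np

# Hypothetical (ΔK, da/dN) pairs; in the experiment these come from the bending tests.
dK   = np.array([12.0, 16.0, 22.0, 30.0, 40.0])   # stress intensity range, MPa*sqrt(m)
dadN = 1e-11 * dK ** 3.0                          # synthetic rates obeying the Paris law

# Paris law is linear in log-log space: log(da/dN) = log(C) + m*log(ΔK)
m_fit, logC = np.polyfit(np.log(dK), np.log(dadN), 1)
C_fit = np.exp(logC)
```

Because the synthetic data follow the law exactly, the fit recovers m = 3 and C = 1e-11; real crack-growth data scatter around the fitted line, and the same regression gives the experimental constants.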

Electronic Transactions: Jurisdictional Issues in the European Union

One of the main consequences of the ubiquitous use of the Internet as a means to conduct business has been the progressive internationalization of the contracts created to support such transactions. As electronic commerce becomes international commerce, the reality is that commercial disputes will occur, creating such questions as: "In which country do I bring proceedings?" and "Which law is to be applied to solve disputes?" The decentralized and global structure of the Internet and its decentralized operation have given e-commerce a transnational element that affects two questions essential to any transaction: the applicable law and the jurisdiction in the event of dispute. The allocation of applicable law and jurisdiction among states in respect of international transactions has traditionally been based on the use of connecting factors, generally of a territorial nature (the place where real estate is located, customary residence, principal establishment, place of shipping goods). The characteristics of the Internet as a new space sometimes make it difficult to apply these rules, and may make them inoperative or lead to results that are surprising or totally foreign to the contracting parties and the other elements and circumstances of the case.

Flood Hazard Mapping in Dikrong Basin of Arunachal Pradesh (India)

Flood zoning studies have become more efficient in recent years because of the availability of advanced computational facilities and the use of Geographic Information Systems (GIS). In the present study, flood-inundated areas were mapped using GIS for the Dikrong river basin of Arunachal Pradesh, India, corresponding to different return periods (2, 5, 25, 50, and 100 years). Further, the developed inundation maps corresponding to the 25-, 50-, and 100-year return period floods were compared with the corresponding maps developed by conventional methods, as reported in the Brahmaputra Board Master Plan for the Dikrong basin. It was found that the average deviation of the modelled flood inundation areas from the reported inundation areas is below 5% (4.52%); the modelled areas therefore match the reported ones satisfactorily. Hence, GIS techniques proved successful in extracting the flood inundation extent in a time- and cost-effective manner for the remotely located hilly basin of Dikrong, where conducting conventional surveys is very difficult.

A New Method for Detection of Artificial Objects and Materials from Long Distance Environmental Images

The article presents a new method for the detection of artificial objects and materials in images of environmental (non-urban) terrain. Our approach uses the hue and saturation (or Cb and Cr) components of the image as the input to a segmentation module based on the mean shift method. The clusters obtained as the output of this stage are processed by a decision-making module in order to find the regions of the image with a significant possibility of representing a human. Although the method will detect various non-natural objects, it is primarily intended and optimized for the detection of humans, i.e., for search and rescue purposes in non-urban terrain where, in normal circumstances, non-natural objects shouldn't be present. Real-world images are used for the evaluation of the method.
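The segmentation stage named above is mean shift clustering on the chrominance components. A minimal pure-NumPy sketch with a flat kernel and toy (Cb, Cr) samples, purely illustrative of the idea rather than the article's implementation:

```python
import numpy as np

def mean_shift(points, bandwidth=1.0, iters=50):
    """Shift each point to the mean of its neighbours until it settles on a mode."""
    shifted = points.copy()
    for _ in range(iters):
        for i, p in enumerate(shifted):
            dist = np.linalg.norm(points - p, axis=1)   # flat (uniform) kernel
            shifted[i] = points[dist < bandwidth].mean(axis=0)
    return shifted

# Two colour clusters in a toy (Cb, Cr) plane: "vegetation" vs. "artificial material"
rng = np.random.default_rng(1)
a = rng.normal([0.0, 0.0], 0.1, (30, 2))
b = rng.normal([3.0, 3.0], 0.1, (30, 2))
modes = mean_shift(np.vstack([a, b]), bandwidth=1.0)
```

Each pixel's chrominance vector converges to the mode of its local density, so the two colour populations collapse onto two cluster centres that the decision-making module can then classify.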

Software Technology Behind Computer Accounting

The main problems of data-centric and open source projects are the large number of developers and changes to the core framework. The Model-View-Controller (MVC) design pattern has significantly improved the development and adjustment of complex projects. Entity Framework, as the model layer in the MVC architecture, has simplified communication with the database. How often are these new technologies used, and do they have the potential for designing a more efficient Enterprise Resource Planning (ERP) system that is better suited to accountants?

Effect of the Machine Frame Structures on the Frequency Responses of Spindle Tool

Chatter vibration has been a troublesome problem for machine tools aiming at high-precision and high-speed machining. Essentially, the machining performance is determined by the dynamic characteristics of the machine tool structure and the dynamics of the cutting process. Therefore, the dynamic vibration behavior of the spindle tool system greatly determines the performance of the machine tool. The purpose of this study is to investigate the influence of the machine frame structure on the dynamic frequency of the spindle tool unit through a finite element modeling approach. To this end, a realistic finite element model of the vertical milling system was created by incorporating the spindle-bearing model into the spindle head stock of the machine frame. Using this model, the dynamic characteristics of milling machines with different structural designs of the spindle head stock and an identical spindle tool unit were demonstrated. The results of the finite element modeling reveal that the spindle tool unit behaves more compliantly when the excitation frequency approaches the natural mode of the spindle tool, while the spindle tool shows a higher dynamic stiffness at lower frequencies that may be initiated by the structural mode of the milling head. Under this condition, it is concluded that the structural configuration of the spindle head stock associated with the vertical column of the milling machine plays an important role in determining the machining dynamics of the spindle unit.

Robust Probabilistic Online Change Detection Algorithm Based on the Continuous Wavelet Transform

In this article we present a change-point detection algorithm based on the continuous wavelet transform. At the beginning of the article we describe a transformation of the signal that has to be made for the purpose of change detection. Then a case study related to iron ore sinter production, which can be solved using the proposed technique, is discussed. After that we describe a probabilistic algorithm that can be used to find changes in the transformed signal. It is shown that the algorithm works well in the presence of noise and abnormal random bursts.
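The underlying principle is that an abrupt change in a signal produces large wavelet coefficients at the change location. A minimal sketch of that principle with a Haar-type wavelet and a synthetic step signal (the article's actual transform and probabilistic decision rule are not reproduced here):

```python
import numpy as np

def haar_response(signal, scale):
    """Haar-wavelet-style response: difference of the means of two adjacent windows."""
    kernel = np.concatenate([-np.ones(scale), np.ones(scale)]) / scale
    return np.convolve(signal, kernel[::-1], mode="same")

rng = np.random.default_rng(0)
# Noisy signal with a mean shift (the "change") at sample 200
x = np.concatenate([rng.normal(0.0, 0.3, 200), rng.normal(2.0, 0.3, 200)])
coef = haar_response(x, scale=20)
change_point = int(np.argmax(np.abs(coef)))   # largest coefficient flags the change
```

The coefficient magnitude peaks where the two windows straddle the jump, so thresholding or scoring these coefficients probabilistically localizes the change even under moderate noise.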

Cloud Computing Databases: Latest Trends and Architectural Concepts

Economic factors are leading to the rise of infrastructures that provide software and computing facilities as a service, known as cloud services or cloud computing. Cloud services can provide efficiencies for application providers, both by limiting up-front capital expenses and by reducing the cost of ownership over time. Such services are made available in a data center, using shared commodity hardware for computation and storage. There is a varied set of cloud services available today, including application services (salesforce.com), storage services (Amazon S3), compute services (Google App Engine, Amazon EC2) and data services (Amazon SimpleDB, Microsoft SQL Server Data Services, Google's Datastore). These services represent a variety of reformations of data management architectures, and more are on the horizon.

Association of Selected Biochemical Markers and Body Mass Index in Women with Endocrine Disorders

Obesity is a frequent attendant phenomenon in patients with endocrine disease, and there is a close correlation between BMI and endocrine diseases. In this work we focused on the determination of the hormone concentrations PTH and TSH, CHOL, and the mineral element Ca in blood serum. The examined group was formed by 100 respondents (women) aged 36–83 years, who were divided into two groups: a control group (CG) and a group with diagnosed endocrine disease (DED). The concentrations of PTH, TSH, Ca and CHOL were measured by means of the Cobas e411 (Japan) and Cobas Integra 400 (Switzerland) analyzers. Body weight and height were measured for each individual, and BMI was calculated from these data. On the basis of Student's t-test, we found a statistically significant difference in the biochemical parameters PTH and Ca (p
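For reference, the two computations used throughout such a study, BMI from weight and height, and the two-sample Student's t statistic, can be written as follows (illustrative helper functions, not the authors' code or data):

```python
import numpy as np

def bmi(weight_kg, height_m):
    """Body mass index: weight in kilograms divided by height in metres squared."""
    return weight_kg / height_m ** 2

def t_statistic(a, b):
    """Two-sample Student's t statistic with pooled variance (equal variances assumed)."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1)) / (na + nb - 2)
    return (np.mean(a) - np.mean(b)) / np.sqrt(sp2 * (1 / na + 1 / nb))
```

The resulting t value is compared against the t distribution with na + nb - 2 degrees of freedom to obtain the p-value that decides significance between the CG and DED groups.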

Parametric Optimization of Hospital Design

This paper presents a parametric performance-based design model for optimizing hospital design. The design model operates with geometric input parameters defining the functional requirements of the hospital, and with input parameters in terms of performance objectives defining the design requirements and preferences of the hospital with respect to performances. The design model takes its point of departure in the hospital functionalities as a set of defined parameters and rules describing the design requirements and preferences.

Applications of Artificial Neural Network to Building Statistical Models for Qualifying and Indexing Radiation Treatment Plans

The main goal of this paper is to quantify the quality of different techniques for radiation treatment plans. A back-propagation artificial neural network (ANN) combined with biomedicine theory was used to model thirteen dosimetric parameters and to calculate two dosimetric indices. The correlations between the dosimetric indices and quality of life were extracted as features and used in the ANN model to make decisions in the clinic. The simulation results show that a trained multilayer back-propagation neural network model can help a doctor accept or reject a plan efficiently. In addition, the models are flexible: whenever a new treatment technique enters the market, the feature variables simply need to be imported and the model re-trained for it to be ready for use.
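The decision model described above is a standard multilayer back-propagation network. A small self-contained NumPy sketch on synthetic stand-in data (13 random "dosimetric parameters" per plan, with an artificial accept/reject rule; none of this is the authors' clinical data or architecture):

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy stand-in: 13 parameters per plan; label "accept" when their mean is positive
X = rng.standard_normal((200, 13))
y = (X.mean(axis=1) > 0).astype(float).reshape(-1, 1)

W1 = rng.standard_normal((13, 8)) * 0.5; b1 = np.zeros(8)   # hidden layer, 8 units
W2 = rng.standard_normal((8, 1)) * 0.5;  b2 = np.zeros(1)   # sigmoid output unit
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 0.5

for _ in range(3000):                       # plain full-batch back-propagation
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)
    d2 = (p - y) / len(X)                   # cross-entropy gradient w.r.t. output logits
    d1 = (d2 @ W2.T) * (1 - h ** 2)         # back-propagated through tanh
    W2 -= lr * h.T @ d2; b2 -= lr * d2.sum(axis=0)
    W1 -= lr * X.T @ d1; b1 -= lr * d1.sum(axis=0)

accuracy = ((p > 0.5) == (y > 0.5)).mean()  # training accuracy on the toy task
```

After training, a new plan's feature vector is pushed through the same forward pass and the sigmoid output above/below 0.5 plays the role of the accept/reject decision.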

Wind Farm Modeling for Steady State and Dynamic Analysis

This paper focuses on PSS/E modeling of wind farms of the Doubly-fed Induction Generator (DFIG) type and their impact on issues of power system operation. Since Wind Turbine Generators (WTG) don't have the same characteristics as synchronous generators, the appropriate modeling of wind farms is essential for transmission system operators to analyze the best options for transmission grid reinforcements, as well as to evaluate the impact of wind power on the reliability and security of supply. With the high expected penetration of wind power into the power system, a simultaneous loss of wind farm generation would put power system security and reliability at risk. Therefore, the main wind grid code requirements concern the fault ride-through capability and the frequency operation range of wind turbines. In the case of grid faults, wind turbines have to supply a defined reactive power depending on the instantaneous voltage, and to return quickly to normal operation.

CFD Simulation of Solid-Liquid Stirred Tank with Rushton Turbine and Propeller Impeller

Stirred tanks have applications in many chemical processes where mixing is important for the overall performance of the system. In the present work, 5% of the tank volume is filled with solid particles with a diameter of 700 μm, and a Rushton turbine and a propeller impeller are used for stirring. An Eulerian-Eulerian multi-fluid model was employed; the rotation of the impeller was modeled with the moving reference frame (MRF) technique, and the standard k-ε model was selected for turbulence. The flow field, radial velocity, and axial distribution of solids were investigated and compared for both impellers. Comparison of the simulation results shows that, at different rotating speeds, the final quality of the solid-liquid slurry is better for the propeller impeller than for the Rushton turbine.

Synthesis of Copper Sulfide Nanoparticles by Pulsed Plasma in Liquid Method

Copper sulfide (CuS) nanoparticles were successfully synthesized by the pulsed plasma in liquid method, using two copper rod electrodes submerged in molten sulfur. Low electrical energy and no high temperatures were required for the synthesis. The obtained CuS nanoparticles were then analyzed by means of X-ray diffraction, low- and high-resolution transmission electron microscopy, electron diffraction, X-ray photoelectron and Raman spectroscopies, and field emission scanning electron microscopy. XRD analysis revealed peaks for CuS with a hexagonal phase composition. TEM and HRTEM studies showed that the sizes of the CuS nanoparticles ranged between 10 and 60 nm, with an average size of about 20 nm. The copper sulfide nanoparticles have a short nanorod-like structure. Raman spectroscopy revealed a CuS peak at 474.2 cm-1.

Graphical Programming of Programmable Logic Controllers: Case Study for a Punching Machine

The Programmable Logic Controller (PLC) plays a vital role in automation and process control. Grafcet is used for representing the control logic, while traditional programming languages are used for describing the pure algorithms. Grafcet divides the process to be automated into elementary sequences that can be easily implemented. Each sequence represents a step with associated actions, programmed using textual or graphical languages as appropriate. The programming task is simplified by using a set of subroutines that are reused in several steps. The paper presents an example implementation for a punching machine for sheets and plates. The use of graphical languages for programming a complex sequential process is a necessary solution. The Grafcet state can be used for debugging and malfunction determination. The use of the method, combined with knowledge acquisition for the process application, reduces the downtime of the machine and improves productivity.
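The Grafcet decomposition described above, steps advanced by transition conditions, behaves like a simple state machine. The step names below sketch a hypothetical punching cycle for illustration and are not taken from the paper:

```python
# Minimal sketch of a Grafcet-style step sequence (hypothetical punching cycle).
steps = ["idle", "clamp", "punch", "release"]

def next_step(current, transitions):
    """Advance to the following step only when its transition condition is true."""
    i = steps.index(current)
    nxt = steps[(i + 1) % len(steps)]
    return nxt if transitions.get(nxt, False) else current

state = "idle"
state = next_step(state, {"clamp": True})    # part detected -> clamp the sheet
state = next_step(state, {"punch": True})    # sheet clamped -> punch
```

Each active step fires its associated actions (outputs to the machine), and the current step doubles as the diagnostic state used for debugging and malfunction localization.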