Laplace Transformation on Ordered Linear Space of Generalized Functions

Aim. We have introduced the notion of order on multinormed spaces and countable union spaces and their duals, with the topology of bounded convergence assigned to the dual spaces. The aim of this paper is to develop the theory of the ordered topological linear spaces L′a,b and L′(w, z), the duals of the ordered multinormed spaces La,b and the ordered countable union spaces L(w, z), with the topology of bounded convergence assigned to these duals. We apply the Laplace transformation to the ordered linear space of Laplace transformable generalized functions, aiming ultimately at finding solutions to nonhomogeneous nth-order linear differential equations with constant coefficients in terms of generalized functions and at comparing solutions arising from different initial conditions. Method. The above aim is achieved by:
• defining the spaces La,b and L(w, z);
• assigning an order relation on these spaces by identifying a positive cone on them and studying the properties of the cone;
• defining an order relation on the dual spaces L′a,b and L′(w, z) of La,b and L(w, z), and assigning a topology to these duals under which the order dual and the topological dual coincide;
• defining the adjoint of a continuous map on these spaces and studying its behaviour when the topology of bounded convergence is assigned to the dual spaces;
• applying the two-sided Laplace transformation to the ordered linear space of generalized functions W and studying those properties of the transformation that are used in solving differential equations.
Result. These techniques are applied to solve nonhomogeneous nth-order linear differential equations with constant coefficients in terms of generalized functions and to compare different solutions of such equations.
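To make the classical calculation that the paper generalizes concrete, here is a minimal worked instance of solving a nonhomogeneous constant-coefficient equation by Laplace transformation (our own illustration of an ordinary right-sided problem, not the paper's generalized-function setting):

```latex
% Solve y'' + 3y' + 2y = e^{-3t}H(t), with y(0^-) = y'(0^-) = 0.
% The transform turns the differential equation into an algebraic one:
(s^2 + 3s + 2)\,Y(s) = \frac{1}{s+3}
\;\Longrightarrow\;
Y(s) = \frac{1}{(s+1)(s+2)(s+3)}
     = \frac{1/2}{s+1} - \frac{1}{s+2} + \frac{1/2}{s+3},
% and term-by-term inversion gives
y(t) = \Bigl(\tfrac{1}{2}e^{-t} - e^{-2t} + \tfrac{1}{2}e^{-3t}\Bigr)H(t).
```

Different initial conditions only change the polynomial added to the right-hand side of the algebraic equation, which is what makes the transform convenient for comparing solutions.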

Signed Approach for Mining Web Content Outliers

The emergence of the Internet has sparked a revolution in information storage and retrieval. As most of the data on the web is unstructured and contains a mix of text, video, audio, etc., there is a need to mine information to cater to the specific needs of users without losing important hidden information. Developing user-friendly, automated tools that provide relevant information quickly has thus become a major challenge in web mining research. Most existing web mining algorithms have concentrated on finding frequent patterns while neglecting the less frequent ones, which are likely to contain outlying data such as noise and irrelevant or redundant data. This paper focuses on the Signed approach with full word matching over an organized domain dictionary for mining web content outliers. The Signed approach yields both the relevant web documents and the outlying ones. As the dictionary is organized by the number of characters in a word, searching and retrieving documents takes less time and less space.
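A minimal sketch of the length-indexed domain dictionary and full-word matching described above (the data structures and the relevance threshold are our own illustration, not the paper's implementation):

```python
from collections import defaultdict

def build_length_index(domain_words):
    # Bucket the domain dictionary by word length so that a lookup only
    # scans candidates of exactly the right length.
    index = defaultdict(set)
    for word in domain_words:
        index[len(word)].add(word.lower())
    return index

def in_domain(word, index):
    # Full word matching: the word must appear verbatim in its bucket.
    return word.lower() in index[len(word)]

def relevance_score(document_words, index):
    # Fraction of document words found in the domain dictionary; documents
    # scoring below a chosen threshold become web content outlier candidates.
    if not document_words:
        return 0.0
    return sum(1 for w in document_words if in_domain(w, index)) / len(document_words)
```

Bucketing by length is what keeps both the search time and the memory touched per probe small, as the abstract notes.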

Thermomechanical Studies in Glass/Epoxy Composite Specimen during Tensile Loading

This paper presents the results of thermomechanical characterization of glass/epoxy composite specimens using the infrared thermography technique. The specimens used for the study were fabricated in-house with three different lay-up sequences and tested on a servo-hydraulic machine under uniaxial loading. An infrared camera was used for on-line monitoring of surface temperature changes of the composite specimens during tensile deformation. Experimental results showed that the thermomechanical characteristics of each type of specimen were distinct. Temperature was found to decrease linearly with increasing tensile stress in the elastic region due to the thermoelastic effect. The yield point could be observed by monitoring the change in the temperature profile during tensile testing, and this value could be correlated with the results obtained from the stress-strain response. The extent of prior plastic deformation in the post-yield region influenced the slopes of the temperature response during tensile loading. Partial unloading and reloading of specimens post-yield changed the slopes in the elastic and plastic regions of the composite specimens.
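The linear temperature drop with tensile stress in the elastic region is the classical thermoelastic (Kelvin) effect; its standard adiabatic form, given here as background rather than as the paper's derivation, is:

```latex
\Delta T \;=\; -\,K_m\,T\,\Delta\sigma,
\qquad
K_m \;=\; \frac{\alpha}{\rho\,C_p},
```

where α is the coefficient of linear thermal expansion, ρ the density, C_p the specific heat at constant pressure, T the absolute temperature, and Δσ the change in the sum of principal stresses. The negative sign accounts for the cooling observed under elastic tension, while the subsequent temperature rise signals plastic dissipation at yield.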

A Design Framework for Event Recommendation in Novice Low-Literacy Communities

The proliferation of user-generated content (UGC) creates huge opportunities to explore event patterns. However, existing event recommendation systems primarily target advanced information technology users; little work has been done to address novice and low-literacy users. The next billion users providing and consuming UGC are likely to include communities from developing countries who are ready to use affordable technologies for subsistence goals. We therefore propose a design framework for providing event recommendations that addresses the needs of such users. Grounded in information integration theory (IIT), our framework advocates that effective event recommendation is supported by systems capable of (1) reliable information gathering through structured user input, (2) accurate sense-making through spatial-temporal analytics, and (3) intuitive information dissemination through interactive visualization techniques. A mobile pest management application is developed as an instantiation of the design framework. Our preliminary study suggests a set of design principles for novice and low-literacy users.

Comparison of Full Graph Methods of Switched Circuits Solution

Since graph methods of circuit analysis exist alongside algebraic methods, it is in principle possible to analyse a whole switched circuit with two-phase switching exclusively by graph methods. This article deals with two full-graph methods for solving switched circuits: transformation graphs and two-graphs. It covers both switched-capacitor and switched-current circuits. All methods are presented in equally detailed steps so that they can be compared.

Advanced Travel Information System in Heterogeneous Networks

In order to achieve better road utilization and traffic efficiency, there is an urgent need for a travel information delivery mechanism that assists drivers in making better decisions in emerging intelligent transportation system applications. In this paper, we propose a relayed multicast scheme over heterogeneous networks for this purpose. In the proposed system, travel information consisting of summarized traffic conditions, important events, real-time traffic videos, and local information service contents is formed into layers and multicast through an integration of WiMAX infrastructure and Vehicular Ad hoc Networks (VANET). With the support of adaptive modulation and coding in WiMAX, the radio resources can be optimally allocated during multicast so as to dynamically adjust the number of data layers received by each user. In addition to the WiMAX multicast, a knowledge propagation and information relay scheme over VANET is designed. The experimental results validate the feasibility and effectiveness of the proposed scheme.
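A toy sketch of the layered-multicast idea with adaptive modulation and coding (the numbers, the per-layer rate rule, and the subscription rule are illustrative assumptions, not the paper's algorithm):

```python
def plan_layers(layer_bits, user_rates, frame_symbols):
    """Send video layer k at the MCS decodable by the weakest user still
    subscribed to layer k; users with poorer channels therefore receive
    fewer enhancement layers.
    layer_bits:    bits per frame for each layer, base layer first.
    user_rates:    per-user decodable bits per symbol.
    frame_symbols: radio resource budget per frame.
    Returns (layer, mcs_rate, symbols) tuples until the frame fills up."""
    # Toy subscription rule: the worst remaining user drops out after each
    # layer, modelling channel-quality tiers.
    users = sorted(user_rates)
    used, plan = 0.0, []
    for k, bits in enumerate(layer_bits):
        if not users:
            break
        rate = users[0]              # weakest subscriber fixes the MCS
        symbols = bits / rate        # symbols needed to carry this layer
        if used + symbols > frame_symbols:
            break
        plan.append((k, rate, symbols))
        used += symbols
        users.pop(0)
    return plan

# Example: three layers, four users, 1000-symbol frame budget.
print(plan_layers([300, 200, 200], [0.5, 1.0, 2.0, 4.5], 1000))
```

The trade-off this captures is the one the abstract describes: lowering the MCS of a layer widens its reach but consumes more radio resources, so the allocation determines how many layers each user class receives.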

SimplexIS: Evaluating the Impact of e-Gov Simplification Measures in the Information System Architecture

Nowadays the population increasingly makes use of Information Technology (IT). In recent years the Portuguese government has accordingly increased its focus on using IT to improve people's lives and has developed a set of measures to modernize the Public Administration and thus reduce the gap between the Public Administration and citizens; to this end it launched the Simplex Programme. However, the Simplex eGov measures implemented over the years present a serious challenge: how to forecast their impact on the existing Information Systems Architecture (ISA). This research therefore addresses the problem of automating the evaluation of the actual impact of implementing eGov simplification and modernization measures on the Information Systems Architecture. To carry out the evaluation we propose a framework supported by several key concepts: quality factors, ISA modeling, a multicriteria approach, polarity profiles, and quality metrics.

Automatic 2D/2D Registration Using Multiresolution Pyramid-Based Mutual Information in Image-Guided Radiation Therapy

Medical image registration is a key technology in image-guided radiation therapy (IGRT) systems. Building on previous work on our IGRT prototype with a biorthogonal x-ray imaging system, this paper describes a method for 2D/2D rigid-body registration using multiresolution pyramid-based mutual information. Three key steps are involved: first, four 2D images are obtained, two x-ray projection images and two digitally reconstructed radiographs (DRRs), as the input for the registration; second, each pair of corresponding x-ray and DRR images is matched using multiresolution pyramid-based mutual information within the ITK registration framework; third, the final couch offset is obtained through a coordinate transformation, by calculating the translations acquired from the two image pairs. A simulation example of a parotid gland tumor case and a clinical example of an anthropomorphic head phantom were employed in the verification tests, and the influence of different CT slice thicknesses was also tested. The simulation results showed positioning errors of 0.068 ± 0.070, 0.072 ± 0.098, and 0.154 ± 0.176 mm along the lateral, longitudinal, and vertical axes, respectively. The clinical test indicated that the positioning errors of the planned isocenter were 0.066, 0.07, and 2.06 mm on average with a CT slice thickness of 2.5 mm. It can be concluded that our method, with its verified accuracy and robustness, can be effectively used in IGRT systems for patient setup.
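For reference, the mutual information between an x-ray image and its DRR can be estimated from their joint intensity histogram; a minimal NumPy sketch of this common formulation (the paper itself relies on the ITK registration framework) is:

```python
import numpy as np

def mutual_information(img_a, img_b, bins=64):
    """Mutual information of two equally sized, registered 2D images,
    estimated from their joint intensity histogram."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = hist / hist.sum()                   # joint distribution
    p_a = p_ab.sum(axis=1, keepdims=True)      # marginal of image A
    p_b = p_ab.sum(axis=0, keepdims=True)      # marginal of image B
    nz = p_ab > 0                              # avoid log(0)
    return float((p_ab[nz] * np.log(p_ab[nz] / (p_a @ p_b)[nz])).sum())
```

In a multiresolution pyramid scheme this measure is maximized first on coarsely downsampled images, and each level's result initializes the optimization at the next, finer level.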

Addressing Data Security in the Cloud

The development of information and communication technology, the increased use of the internet, and the effects of the recession in recent years have led to the increased use of cloud computing based solutions, also called on-demand solutions. These solutions offer organizations a large number of benefits, but also challenges and risks, mainly stemming from data being stored in different geographic locations on the internet. As far as the specific risks of the cloud environment are concerned, data security is still considered the top barrier to adopting cloud computing. The present study offers an approach to ensuring the security of cloud data, oriented towards the whole data life cycle. The final part of the study focuses on the assessment of data security in the cloud, which represents the basis for determining potential losses and the premise for subsequent improvements and continuous learning.

Effect of the Addition of Dune Sand Powder on the Development of Compressive Strength and Hydration of Cement Pastes

In this paper, the effect of adding dune sand powder (DSP) on the development of compressive strength and hydration of cement pastes was investigated as a function of the water/binder ratio, varying on the one hand the percentage of DSP and on the other its fineness. In order to better understand the pozzolanic effect of dune sand powder in cement pastes, we followed the hydration of 50% pure lime + 50% DSP mixtures by X-ray diffraction. The pastes of these mixtures exhibit a hydraulic setting due to the formation of a semi-crystallized C-S-H phase (calcium silicate hydrate). This study is a simplified approach to that of 80% ordinary Portland cement + 20% DSP mixtures, in which the main reaction is the fixing of the lime released by cement hydration in the presence of DSP, forming second-generation semi-crystallized calcium silicate hydrate. The results proved that up to 20% DSP with a fineness of 4000 cm²/g could be used as Portland cement replacement without adversely affecting the compressive strength. After 28 days, the compressive strength at 5, 10, and 15% DSP is superior to that of plain Portland cement, with an optimum effect for a percentage of the order of 5% to 10%, irrespective of the w/b ratio and the fineness of DSP.

An Investigation of the Effect of Various Noises on Human Sensibility Using EEG Signals

Noise causes significant sensibility changes in humans. This study investigated the effect of five different noises on the electroencephalogram (EEG) and on subjective evaluation. Six human subjects were exposed to classical piano, ocean waves, an army alarm, an ambulance siren, and mosquito noise, and EEG data were collected during the experimental sessions. Alpha band activity under the mosquito noise was smaller than under the classical piano, decreasing by 43.4 ± 8.2%. Beta band activity, on the other hand, was greater under the mosquito noise than under the classical piano, increasing by 60.1 ± 10.7%. The findings of this study may aid the product design process in human sensibility engineering and provide useful information for designing human-oriented products that avoid stress.
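A common way to quantify alpha and beta band activity from an EEG channel is the average spectral power in the corresponding frequency band; a minimal sketch using Welch's method (the paper does not specify its exact estimator) is:

```python
import numpy as np
from scipy.signal import welch

def band_power(eeg, fs, band):
    """Spectral power of one EEG channel within a frequency band,
    estimated from Welch's power spectral density."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))
    mask = (freqs >= band[0]) & (freqs < band[1])
    return np.trapz(psd[mask], freqs[mask])

# Usage, with conventional band edges (sampling rate is an assumption):
# alpha = band_power(signal, 256, (8, 13)); beta = band_power(signal, 256, (13, 30))
```

The percentage changes reported above would then be computed by comparing such band powers across the noise conditions.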

Variational Iteration Method for the Solution of Boundary Value Problems

In this work, we present a reliable framework for solving boundary value problems of particular significance in solid mechanics; such problems arise as mathematical models of beam deformation. The algorithm rests mainly on a relatively new technique, the Variational Iteration Method. Some examples are given to confirm the efficiency and accuracy of the method.
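For reference, the method constructs successive approximations through the standard VIM correction functional (general form; the operators and the Lagrange multiplier λ depend on the particular beam problem):

```latex
u_{n+1}(x) \;=\; u_n(x) \;+\; \int_0^x \lambda(s)\,\bigl[L\,u_n(s) + N\,\tilde{u}_n(s) - g(s)\bigr]\,ds,
```

where L and N are the linear and nonlinear operators of the equation, g is the source term, and ũ_n denotes a restricted variation (δũ_n = 0). The multiplier λ is identified optimally via variational theory, after which the iteration converges rapidly from a suitable initial guess u_0 satisfying the boundary conditions.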

Modeling Peer-to-Peer Networks with Interest-Based Clusters

In the world of Peer-to-Peer (P2P) networking, different protocols have been developed to make resource sharing and information retrieval more efficient. The SemPeer protocol is a new layer on Gnutella that transforms the connections of the nodes based on semantic information to make information retrieval more efficient. However, this transformation causes high clustering in the network, which decreases the number of nodes reached and therefore the probability of finding a document. In this paper we describe a mathematical model for the Gnutella and SemPeer protocols that captures clustering-related issues, followed by a proposal to modify the SemPeer protocol to achieve moderate clustering. The modification is a form of link management for the individual nodes that makes the SemPeer protocol more efficient, because the probability of a successful query in the P2P network is appreciably increased. For the validation of the models, we ran a series of simulations that supported our results.

Simulating the Dynamics of Distribution of Hazardous Substances Emitted by Motor Engines in a Residential Quarter

This article is dedicated to the development of mathematical models for determining the dynamics of the concentration of hazardous substances in the urban turbulent atmosphere. Developing the models entailed taking into account the time-space variability of the meteorological fields and such properties of the turbulent atmosphere as vorticity, nonlinearity, dissipativity, and diffusivity. Knowledge of the turbulent airflow velocity is not assumed when developing the model. A simplified model, however, assumes that the ratio of turbulent to molecular diffusion is a piecewise-constant function of vertical distance from the earth's surface. An important assumption of vertical stratification of urban air due to atmospheric accumulation of hazardous substances emitted by motor vehicles is thereby introduced into the mathematical model. Through a non-degenerate transformation, the suggested simplified nonlinear mathematical model for determining the sought exhaust concentration at an a priori unknown turbulent flow velocity is reduced to a model that is subsequently solved analytically.
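A generic balance equation of the class described, consistent with the stated assumptions (vertically varying, piecewise-constant eddy diffusivity K(z) and a traffic emission source Q; the paper's exact formulation is not reproduced here), reads:

```latex
\frac{\partial c}{\partial t} + \mathbf{u}\cdot\nabla c
\;=\;
\frac{\partial}{\partial z}\!\left(K(z)\,\frac{\partial c}{\partial z}\right)
+ D\,\Delta c + Q(\mathbf{x},t),
```

where c is the pollutant concentration, u the a priori unknown turbulent velocity field, and D the molecular diffusivity; the piecewise-constant K(z)/D ratio is what encodes the vertical stratification assumption.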

Laser-Excited Nuclear γ-Source of High Spectral Brightness

This paper considers various channels of gamma-quantum generation via the interaction of an ultra-short high-power laser pulse with different targets. We analyse the possibilities of creating a pulsed gamma-radiation source using laser triggering of certain nuclear reactions and of isomer targets. It is shown that a sub-MeV monochromatic short pulse of gamma-radiation with pulse energy at the sub-mJ level can be obtained from an isomer target irradiated by an intense laser pulse. For the nuclear reaction channel in light-atom materials, it is shown that a sub-PW laser pulse gives rise to the formation of about a million gamma-photons of multi-MeV energy.

Specification of a Model of Honeypot Attack Based On Raised Data

Network security remains a priority for almost all companies. Existing security systems have shown their limits; thus a new type of security system was born: honeypots. Honeypots are defined as programs or decoy servers intended to attract attackers in order to study their behaviour. It is in this context that the leurre.com project, gathering about twenty platforms, was born. This article aims to specify a model of honeypot attacks. Our model describes, for a given platform, the evolution of attacks by hour; afterwards, we identify the most attacked services by studying the attacks on the various ports. It should be noted that this article was elaborated within the framework of the research projects on honeypots at LABTIC (Laboratory of Information Technologies and Communication).
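In the spirit of the model described, the raised data from one platform can be profiled by hour of day and by targeted port; a minimal sketch (the record layout is our own illustration, not the leurre.com schema):

```python
from collections import Counter

def attack_profiles(records):
    """Profile a platform's attacks by hour of day and by targeted port.
    `records` is an iterable of (hour, dst_port) pairs."""
    records = list(records)
    by_hour = Counter(hour for hour, _ in records)          # hourly evolution
    top_ports = Counter(port for _, port in records).most_common(10)
    return by_hour, top_ports                                # most attacked services
```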

Face Image Coding Using Face Prototyping

In this paper we present a novel approach to face image coding. The proposed method makes use of features of video encoders such as motion prediction. First the encoder selects an appropriate prototype from the database and warps it according to the features of the face being encoded. The warped prototype is placed as the first frame (an I frame), and the face being encoded as the second frame (a P frame). Information about feature positions, color change, the selected prototype, and the data flow of the P frame is sent to the decoder. The precondition is that both encoder and decoder hold the same database of prototypes. We ran an experiment with an H.264 video encoder and compared the results with those achieved by JPEG and JPEG2000. The results show that our approach achieves three times lower bitrate and twice the PSNR in comparison with JPEG. Compared with JPEG2000 the bitrate was very similar, but the subjective quality achieved by the proposed method is better.

Automated Stereophotogrammetry Data Cleansing

The stereophotogrammetry modality is gaining more widespread use in the clinical setting. Registration and visualization of this data, in conjunction with conventional 3D volumetric image modalities, provides virtual human data with textured soft tissue as well as internal anatomical and structural information. In this investigation, computed tomography (CT) and stereophotogrammetry data are acquired from 4 anatomical phantoms and registered using the trimmed iterative closest point (TrICP) algorithm. This paper fully addresses the issue of imaging artifacts around the stereophotogrammetry surface edge, using the registered CT data as a reference. Several iterative algorithms are implemented to automatically identify and remove stereophotogrammetry surface edge outliers, improving the overall visualization of the combined stereophotogrammetry and CT data. This paper shows that outliers at the surface edge of stereophotogrammetry data can be successfully removed automatically.
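A simplified stand-in for the kind of iterative cleansing step described above, removing stereophotogrammetry vertices that lie abnormally far from the registered CT-derived reference surface (the robust threshold rule is our own assumption, not the paper's algorithm):

```python
import numpy as np
from scipy.spatial import cKDTree

def trim_edge_outliers(points, reference, k=3.0, max_iters=10):
    """Iteratively drop points whose distance to the reference surface
    exceeds a robust cut (median + k * MAD); the cut tightens as gross
    edge outliers disappear from the distance statistics.
    points:    (N, 3) stereophotogrammetry vertices.
    reference: (M, 3) registered CT surface points."""
    tree = cKDTree(reference)
    for _ in range(max_iters):
        dist, _ = tree.query(points)                 # nearest-surface distances
        median = np.median(dist)
        mad = np.median(np.abs(dist - median))       # robust spread estimate
        keep = dist <= median + k * mad
        if keep.all():
            break
        points = points[keep]
    return points
```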

Effects of the Wavy Surface on Free Convection-Radiation along an Inclined Plate

A numerical analysis is used to investigate the effects of a wavy surface and thermal radiation on natural convection heat transfer boundary layer flow over an inclined wavy plate. A simple coordinate transformation is employed to transform the complex wavy surface into a flat plate. The boundary layer equations and the boundary conditions are discretized by a finite difference scheme and solved numerically using the Gauss-Seidel algorithm with a relaxation coefficient. The effects of the wavy geometry, the inclination angle of the wavy plate, and thermal radiation on the velocity profiles, temperature profiles, and the local Nusselt number are presented and discussed in detail.
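For reference, Gauss-Seidel iteration with a relaxation coefficient (successive over-relaxation) solves the algebraic systems produced by such finite difference discretizations; a generic dense-matrix sketch (the grid setup and coefficients are problem-specific and not taken from the paper) is:

```python
import numpy as np

def sor_solve(A, b, omega=1.2, tol=1e-8, max_iters=10_000):
    """Gauss-Seidel with relaxation coefficient omega (SOR) for A x = b.
    omega = 1 recovers plain Gauss-Seidel; 1 < omega < 2 over-relaxes."""
    x = np.zeros_like(b, dtype=float)
    for _ in range(max_iters):
        x_old = x.copy()
        for i in range(len(b)):
            # Use already-updated entries x[:i] and old entries x_old[i+1:].
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
            x[i] = (1 - omega) * x_old[i] + omega * (b[i] - sigma) / A[i, i]
        if np.linalg.norm(x - x_old, np.inf) < tol:
            break
    return x
```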

The Content Based Objective Metrics for Video Quality Evaluation

In this paper we compare four content-based objective metrics with the results of subjective tests on 80 video sequences. We also include two established objective metrics, VQM and SSIM, in the comparison to serve as references, since their pros and cons have already been published. Each video sequence was preprocessed by the region recognition algorithm, and then the particular objective video quality metrics were calculated, i.e., mutual information, angular distance, moment of angle, and normalized cross-correlation. The Pearson coefficient was calculated to express each metric's relationship to the accuracy of the model, and the Spearman rank-order correlation coefficient to represent its relationship to monotonicity. The results show that the model with mutual information as the objective metric provides the best results and is suitable for evaluating the quality of video sequences.
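The two agreement statistics used above are standard; a minimal sketch of computing them for one metric against the subjective scores (array contents are placeholders):

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def metric_agreement(objective_scores, subjective_scores):
    """Pearson correlation measures prediction accuracy (linearity);
    Spearman rank-order correlation measures monotonicity."""
    objective_scores = np.asarray(objective_scores)
    subjective_scores = np.asarray(subjective_scores)
    pearson, _ = pearsonr(objective_scores, subjective_scores)
    spearman, _ = spearmanr(objective_scores, subjective_scores)
    return pearson, spearman
```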