Abstract: This paper presents a MATLAB-based system named Smart Access Network Testing, Analyzing and Database (SANTAD), designed for in-service transmission surveillance and self-restoration against fiber faults in fiber-to-the-home (FTTH) access networks. The developed program is installed with the optical line terminal (OLT) at the central office (CO) to monitor line status and detect any fiber fault that occurs in the FTTH network downstream from the CO toward residential customer locations. SANTAD is interfaced with an optical time domain reflectometer (OTDR) to accumulate every network testing result and display it on a single computer screen for further analysis. The program identifies and presents the parameters of each optical fiber line, such as line status (working or non-working), the loss magnitude at each point, the failure location, and other details shown on the OTDR screen. Failure status is delivered to field engineers for prompt action, while the failed line is diverted to a protection line to keep traffic flowing. This approach promises to improve the survivability and reliability of FTTH networks as well as to increase their efficiency and monitoring capabilities.
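As an illustration of the trace analysis described above (SANTAD itself is written in MATLAB; the thresholds and data below are invented), a minimal Python sketch flags a fiber fault as an abrupt power drop in an OTDR backscatter trace and converts the sample index into a distance from the CO:

    import numpy as np

    # Hypothetical OTDR trace: backscattered power (dB) sampled along the fiber.
    # A healthy fiber shows a gentle linear slope; a fault shows an abrupt drop.
    SAMPLE_SPACING_M = 1.0    # distance between OTDR samples (assumed)
    FAULT_DROP_DB = 3.0       # loss step treated as a fault (assumed)

    def find_faults(trace_db, spacing_m=SAMPLE_SPACING_M, drop_db=FAULT_DROP_DB):
        """Return (distance_m, loss_db) for every abrupt loss event in the trace."""
        steps = np.diff(trace_db)              # per-sample power change
        idx = np.where(steps < -drop_db)[0]    # indices with a sharp drop
        return [(i * spacing_m, -steps[i]) for i in idx]

    # Synthetic 2 km trace: 0.2 dB/km fiber attenuation plus a 6 dB break at 1.2 km.
    distance = np.arange(0, 2000.0)
    trace = -0.0002 * distance
    trace[1200:] -= 6.0

    for where, loss in find_faults(trace):
        print(f"fault: {loss:.1f} dB drop at {where:.0f} m from the CO")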
Abstract: Currently, the WWW is the first place scholars turn to when seeking information. However, analyzing and interpreting this volume of information can overload researchers pursuing their work. Trend detection in scientific publication retrieval systems helps scholars find relevant, new, and popular research areas by visualizing the trend of an input topic. However, there is little research on trend detection in scientific corpora, and the models proposed so far do not appear to be suitable; previous work lacks an appropriate representation scheme for research topics. This paper describes a method that combines Semantic Web technologies and ontologies to support advanced search functions such as trend detection in the context of a scholarly Semantic Web system (SSWeb).
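The abstract does not specify the trend measure; one minimal sketch, assuming yearly publication counts per topic are available (the topics and counts below are invented), ranks topics by the least-squares slope of their counts over time:

    import numpy as np

    # Hypothetical yearly publication counts per research topic.
    counts = {
        "semantic web": [12, 18, 25, 31, 40],
        "expert systems": [50, 44, 41, 35, 30],
    }
    years = np.arange(2008, 2013)

    def trend_slope(y):
        """Least-squares slope of counts over years: positive = rising topic."""
        return np.polyfit(years, y, 1)[0]

    for topic, y in sorted(counts.items(), key=lambda kv: -trend_slope(kv[1])):
        print(f"{topic:15s} slope = {trend_slope(y):+.1f} papers/year")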
Abstract: In this paper, we study the synthesis of a vertical dipole antenna over imperfect ground. The synthesis method implemented for this type of antenna makes it possible to approach the desired radiation pattern. The approach used is based on neural networks. Our main contribution in this paper is the extension of a synthesis model of this vertical dipole antenna over imperfect ground.
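The paper gives no details of the network architecture, so the sketch below is only a generic illustration of neural-network antenna synthesis, not the authors' model: a small scikit-learn MLP is trained to invert a toy pattern model, recovering the dipole height above a perfectly conducting ground from samples of the elevation pattern (the imperfect-ground correction is beyond this sketch; all dimensions and ranges are assumptions):

    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(0)
    theta = np.linspace(0.05, np.pi / 2, 32)   # elevation sample angles
    k = 2 * np.pi                              # wavenumber with wavelength = 1

    def pattern(h):
        """Toy |E(theta)| of a vertical dipole at height h over perfect ground."""
        return np.abs(np.sin(theta) * np.cos(k * h * np.cos(theta)))

    # Training data: patterns for random heights in [0.1, 1.0] wavelengths.
    heights = rng.uniform(0.1, 1.0, 2000)
    X = np.array([pattern(h) for h in heights])

    net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
    net.fit(X, heights)                        # learn pattern -> height

    h_true = 0.37
    print(f"true h = {h_true}, synthesized h = {net.predict([pattern(h_true)])[0]:.3f}")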
Abstract: This paper focuses on creating a component model of an information system under uncertainty. The paper identifies problems in the current approach to component modeling and proposes a fuzzy tool that works with vague customer requirements and proposes the components of the resulting component model. The proposed tool is verified on a specific information system, and the results are shown in the paper. After suitable sub-components of the resulting component model are found, the component model is visualized by the tool.
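The paper's rule base is not reproduced here; a minimal sketch of the underlying idea, with invented membership functions, component names, and capacities, scores candidate components against a vague requirement via a triangular fuzzy membership function:

    # Minimal fuzzy-matching sketch (hypothetical components and requirements).
    def tri(x, a, b, c):
        """Triangular membership function peaking at b on support [a, c]."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    # Vague requirement "roughly 100 concurrent users" as a fuzzy number.
    requirement = lambda users: tri(users, 50, 100, 200)

    # Candidate components with their rated capacities (assumed values).
    components = {"LightServer": 60, "MidServer": 110, "HeavyServer": 400}

    for name, capacity in components.items():
        print(f"{name:12s} suitability = {requirement(capacity):.2f}")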
Abstract: Basel III (or the Third Basel Accord) is a global regulatory standard on bank capital adequacy, stress testing, and market liquidity risk, agreed upon by the members of the Basel Committee on Banking Supervision in 2010-2011 and scheduled to be introduced from 2013 until 2018. Basel III is a comprehensive set of reform measures. These measures aim to (1) improve the banking sector's ability to absorb shocks arising from financial and economic stress, whatever the source, (2) improve risk management and governance, and (3) strengthen banks' transparency and disclosures. The reforms target (1) bank-level, or micro-prudential, regulation, which will help raise the resilience of individual banking institutions in periods of stress, and (2) macro-prudential regulation of system-wide risks that can build up across the banking sector, as well as the pro-cyclical amplification of these risks over time. These two approaches to supervision are complementary, as greater resilience at the individual bank level reduces the risk of system-wide shocks. Regarding the macroeconomic impact of Basel III, the OECD estimates that the medium-term impact of Basel III implementation on GDP growth is in the range of -0.05 to -0.15 percent per year. Economic output is mainly affected by an increase in bank lending spreads, as banks pass a rise in bank funding costs, due to higher capital requirements, on to their customers. The estimated effects on GDP growth assume no active response from monetary policy; the impact of Basel III on economic output could be offset by a reduction (or delayed increase) in monetary policy rates of about 30 to 80 basis points. The aim of this paper is to create a framework based on the recent regulations in order to prevent financial crises; the experience of overcoming the global financial crisis should thus help avert financial crises that may occur in future periods. The first part of the paper examines the effects of the global crisis on the banking system and the concept of financial regulation. The second part analyzes financial regulations, Basel III in particular. The last section explores the possible macroeconomic consequences of Basel III.
Abstract: Tacit knowledge has been one of the most discussed and contradictory concepts in the field of knowledge management since the mid-1990s. The concept is used relatively vaguely to refer to any type of information that is difficult to articulate, which has led to discussions about the original meaning of the concept (adopted from Polanyi's philosophy) and the nature of tacit knowing. It is proposed that the subject should be approached from the perspective of cognitive science in order to connect tacit knowledge to empirically studied cognitive phenomena. Some of the most important examples of tacit knowing presented by Polanyi are analyzed in order to trace the cognitive mechanisms of tacit knowing and to promote a better understanding of the nature of tacit knowledge. The cognitive approach to Polanyi's theory reveals that the tacit/explicit typology of knowledge often presented in the knowledge management literature is not only artificial but the very opposite of Polanyi's thinking.
Abstract: A comparison of two approaches for simulating the dynamic behaviour of a permanent magnet linear actuator is presented. These are a fully coupled model, where the electromagnetic field, electric circuit, and mechanical motion problems are solved simultaneously, and a decoupled model, where a set of static magnetic field analyses is first carried out and the electric circuit and mechanical motion equations are then solved employing bi-cubic spline approximations of the field analysis results. The results show that the proposed decoupled model is of satisfactory accuracy and gives more flexibility when the actuator response must be estimated for different external conditions, e.g. external circuit parameters or mechanical loads.
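As a schematic illustration of the decoupled approach (not the authors' model; the force table and constants below are invented, and a 1-D cubic spline over position stands in for the paper's bi-cubic splines), the static force computed by the field analyses can be spline-fitted and fed into the equation of motion:

    import numpy as np
    from scipy.interpolate import CubicSpline
    from scipy.integrate import solve_ivp

    # Static magnetostatic results: force (N) vs mover position (m), assumed values.
    pos = np.linspace(0.0, 0.02, 11)
    force = 40.0 * np.sin(np.pi * pos / 0.02)   # stand-in for FEM force data
    F = CubicSpline(pos, force)                 # spline fit of the field results

    M, C = 0.5, 5.0                             # mover mass (kg), damping (Ns/m)

    def motion(t, y):
        """y = [x, v]; Newton's law with spline-interpolated magnetic force."""
        x, v = y
        x = np.clip(x, pos[0], pos[-1])         # stay inside the tabulated range
        return [v, (F(x) - C * v) / M]

    sol = solve_ivp(motion, (0.0, 0.1), [0.001, 0.0], max_step=1e-3)
    print(f"position after 0.1 s: {sol.y[0, -1] * 1000:.2f} mm")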
Abstract: Multi-energy systems can enhance system reliability and power quality. This paper presents an integrated approach to the design and operation of distributed energy resource (DER) systems based on energy hub modeling. A multi-objective optimization model is developed, taking an integrated view of the electricity and natural gas networks to analyze the optimal design and operating condition of DER systems under two conflicting objectives: minimization of total cost and minimization of environmental impact, which is assessed in terms of CO2 emissions. The mathematical model considers the energy demands of the site, local climate data, and the utility tariff structure, as well as the technical and financial characteristics of the candidate DER technologies. To meet the energy demands, photovoltaic and co-generation systems, a boiler, and the central power grid are considered. As an illustrative example, a hotel in Iran is used to demonstrate potential applications of the proposed method. The results show that increasing the satisfaction degree of the environmental objective leads to increased total cost.
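The paper's full model is not reproduced here; the sketch below (all prices, emission factors, capacities, and the demand figure are invented) shows the core multi-objective mechanism: a weighted sum of cost and CO2 emissions traded off in a single-hour electricity dispatch, where raising the environmental weight raises the total cost:

    from scipy.optimize import linprog

    # One-hour dispatch among grid, CHP and PV (all numbers are hypothetical).
    cost = [0.12, 0.08, 0.02]  # $/kWh: clean-but-dear grid, cheap-but-dirty CHP, PV
    co2 = [0.40, 0.60, 0.00]   # kg CO2/kWh
    cap = [(0, 100), (0, 50), (0, 20)]
    demand = 80.0              # kWh to supply this hour

    for w in (0.0, 0.5, 1.0):  # weight on the environmental objective
        obj = [(1 - w) * c + w * e for c, e in zip(cost, co2)]
        res = linprog(obj, A_eq=[[1, 1, 1]], b_eq=[demand], bounds=cap)
        total_cost = sum(c * x for c, x in zip(cost, res.x))
        total_co2 = sum(e * x for e, x in zip(co2, res.x))
        print(f"w={w:.1f}: grid/chp/pv = {res.x.round(0)} "
              f"cost=${total_cost:.2f} co2={total_co2:.1f} kg")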
Abstract: A series of microarray experiments produces observations of differential expression for thousands of genes across multiple conditions. Principal component analysis (PCA) has been widely used in multivariate data analysis to reduce the dimensionality of the data in order to simplify subsequent analysis and allow the data to be summarized in a parsimonious manner. PCA, which can be implemented via a singular value decomposition (SVD), is useful for the analysis of microarray data. As an application of PCA using SVD, we use the DNA microarray data for the small round blue cell tumors (SRBCT) of childhood of Khan et al. (2001). To decide the number of components that account for a sufficient amount of information, we draw a scree plot. The biplot, a graphical display associated with PCA, reveals important features, exhibiting the relationships among variables and between variables and observations.
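A minimal sketch of PCA via SVD on a matrix with the SRBCT dimensions (random stand-in data, not the Khan et al. set), showing the scree-plot quantity, the proportion of variance explained per component:

    import numpy as np

    rng = np.random.default_rng(0)
    # Stand-in expression matrix: 63 samples x 2308 genes (SRBCT dimensions).
    X = rng.normal(size=(63, 2308))

    Xc = X - X.mean(axis=0)                 # center each gene (column)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    explained = s**2 / np.sum(s**2)         # variance share per component
    scores = U * s                          # principal component scores
    print("scree values (first 5):", np.round(explained[:5], 3))
    print("PC score matrix shape:", scores.shape)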
Abstract: We present a novel scheme to evaluate sinusoidal functions with low complexity and high precision using cubic spline interpolation. To this end, two different approaches are proposed to find the interpolating polynomial of sin(x) within the range [-π, π]. The first deals with only a single data point, while the other deals with two, to keep the realization cost as low as possible. An approximation-error optimization technique for cubic spline interpolation is introduced next and is shown to increase the interpolator's accuracy without increasing the complexity of the associated hardware. Architectures for the proposed approaches are also developed; they exhibit implementation flexibility with low power requirements.
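A quick numerical check of the basic idea (using SciPy's generic cubic spline rather than the paper's hardware-oriented formulation, with an arbitrarily chosen knot count):

    import numpy as np
    from scipy.interpolate import CubicSpline

    # Cubic-spline approximation of sin(x) on [-pi, pi] with 8 segments (assumed).
    knots = np.linspace(-np.pi, np.pi, 9)
    spline = CubicSpline(knots, np.sin(knots))

    x = np.linspace(-np.pi, np.pi, 10001)
    err = np.abs(spline(x) - np.sin(x))
    print(f"max |error| = {err.max():.2e}")   # max absolute error on a dense grid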
Abstract: Embedded systems need to respect stringent real-time constraints. Various hardware components included in such systems, such as cache memories, exhibit variability and therefore affect execution time. Indeed, a cache memory access from an embedded microprocessor may result in a cache hit, where the data is available, or a cache miss, where the data must be fetched from an external memory with an additional delay. It is therefore highly desirable to predict future memory accesses during execution in order to prefetch data appropriately without incurring delays. In this paper, we evaluate the potential of several artificial neural networks for the prediction of instruction memory addresses. Neural networks have the potential to tackle the nonlinear behavior observed in memory accesses during program execution, and their numerous demonstrated hardware implementations favor this choice over traditional forecasting techniques for inclusion in embedded systems. However, embedded applications execute millions of instructions, and therefore millions of addresses must be predicted. This very challenging problem of neural-network-based prediction of large time series is approached in this paper by evaluating various neural network architectures based on the recurrent neural network paradigm, with pre-processing based on the Self-Organizing Map (SOM) classification technique.
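The sketch below implements only the pre-processing stage, a tiny one-dimensional SOM in NumPy that quantizes a synthetic stream of address deltas into a handful of prototype classes; the map size, learning schedule, and delta values are arbitrary choices, and the recurrent network itself is omitted:

    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic stream of instruction-address deltas (bytes); real traces would
    # come from a simulator. A mix of sequential (+4) and branchy strides.
    deltas = rng.choice([4.0, 4.0, 4.0, -32.0, 128.0], size=5000)

    # 1-D SOM: each unit holds a prototype delta value.
    units = rng.uniform(-64, 160, 8)

    for t, x in enumerate(deltas):
        lr = 0.5 * (1 - t / len(deltas))           # decaying learning rate
        bmu = np.argmin(np.abs(units - x))         # best-matching unit
        for j in range(len(units)):                # Gaussian neighborhood update
            h = np.exp(-((j - bmu) ** 2) / 2.0)
            units[j] += lr * h * (x - units[j])

    print("prototype deltas:", np.round(np.sort(units), 1))
    # The RNN would then be trained on the sequence of BMU indices, not raw addresses.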
Abstract: The Flexible Job Shop Problem (FJSP) is an extension of the classical Job Shop Problem (JSP). The FJSP extends the routing flexibility of the JSP, i.e., the assignment of a machine to each operation, which makes it more difficult than the JSP. In this study, a Cooperative Coevolutionary Genetic Algorithm (CCGA) is presented to solve the FJSP. Makespan (the time needed to complete all jobs) is used as the performance measure for the CCGA. To test the performance and efficiency of our CCGA, benchmark problems are solved. Computational results show that the proposed CCGA is comparable with other approaches.
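A full FJSP decoder is too long to reproduce; the sketch below illustrates only the cooperative-coevolutionary loop on a toy separable objective, with two subpopulations (standing in for the machine-assignment and operation-sequencing subproblems) each evaluated against the other's best representative:

    import numpy as np

    rng = np.random.default_rng(0)
    DIM, POP, GENS = 5, 30, 100

    def fitness(a, b):
        """Toy joint objective (stand-in for makespan): lower is better."""
        return np.sum(a**2) + np.sum((b - 1.0) ** 2)

    # Two subpopulations, one per subproblem (cf. machines vs. sequences).
    pops = [rng.normal(size=(POP, DIM)), rng.normal(size=(POP, DIM))]
    best = [p[0].copy() for p in pops]       # representatives for cooperation

    for _ in range(GENS):
        for s in (0, 1):
            other = best[1 - s]              # partner's representative
            f = np.array([fitness(ind, other) if s == 0 else fitness(other, ind)
                          for ind in pops[s]])
            parents = pops[s][np.argsort(f)[:POP // 2]]    # truncation selection
            children = parents + rng.normal(scale=0.1, size=parents.shape)
            pops[s] = np.vstack([parents, children])       # mutation-only GA
            best[s] = pops[s][0].copy()

    print(f"best joint makespan-proxy: {fitness(best[0], best[1]):.4f}")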
Abstract: This paper presents an application of level sets to the segmentation of abdominal and thoracic aortic aneurysms in CTA datasets. An important challenge in reliably detecting aortic aneurysms is the need to overcome problems associated with intensity inhomogeneities. Level sets belong to an important class of methods that utilize partial differential equations (PDEs) and have been extensively applied in image segmentation. A kernel function in the level set formulation aids the suppression of noise in the extracted regions of interest and then guides the motion of the evolving contour for the detection of weak boundaries. The speed of curve evolution has been significantly improved, with a resulting decrease in segmentation time compared with previous implementations of level sets, and the method is shown to be more effective than other approaches in coping with intensity inhomogeneities. We have applied the Courant-Friedrichs-Lewy (CFL) condition as the stability criterion for our algorithm.
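The CFL criterion itself reduces to a one-line bound on the time step; a sketch with an invented speed field:

    import numpy as np

    # CFL stability bound for an explicitly evolved level set:
    # the front may not cross more than one grid cell per time step.
    dx = 1.0                                   # pixel spacing (assumed)
    speed = np.random.default_rng(0).uniform(0.0, 3.0, (128, 128))  # |F| field

    dt_max = dx / speed.max()                  # dt <= dx / max|F|
    print(f"largest stable time step: {dt_max:.3f}")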
Abstract: Content-based music retrieval generally involves analyzing, searching, and retrieving music based on low- or high-level features of a song, which are normally used to represent artists, songs, or music genres. Identifying them normally involves feature extraction and classification tasks. In theory, the more features analyzed, the better the classification accuracy that can be achieved, but at the cost of longer execution time. A technique for selecting significant features is therefore important, as it reduces the dimensionality of the features used in classification and contributes to accuracy. An Artificial Immune System (AIS) approach will be investigated and applied to the classification task. A bio-inspired audio content-based retrieval framework (B-ACRF) is proposed at the end of this paper; it embraces issues that need further consideration in music retrieval performance.
Abstract: Latvia ranks fourth in the world in broadband internet speed. The total number of internet users in Latvia exceeds 70% of its population. The number of active mailboxes of the local internet e-mail service Inbox.lv accounts for 68% of the population and 97.6% of the total number of internet users. The Latvian portal Draugiem.lv is a social media phenomenon: 58.4% of the population and 83.5% of internet users use it. A majority of Latvian company profiles are available on social networks, the most popular being Twitter.com. These and other figures show that consumers and companies are actively using the Internet.
However, after the authors analyzed in a number of studies how enterprises employ the e-environment, namely e-environment tools, they arrived at conclusions that are not as flattering as the aforementioned statistics. There is an obvious contradiction between the statistical data and the actual studies. As a result, the authors have posed a question: why are entrepreneurs resistant to e-tools? To answer this question, the authors have turned to the Technology Acceptance Model (TAM). The authors analyzed each phase and identified several factors affecting the use of the e-environment, reaching the main conclusion that entrepreneurs do not have a sufficient level of e-literacy (digital literacy).
The authors employ well-established quantitative and qualitative methods of research: grouping, analysis, statistical methods, factor analysis in the SPSS 20 environment, etc.
The theoretical and methodological background of the research is formed by scientific research and publications, material from the mass media and professional literature, statistical information from legal institutions, and information collected by the authors during the survey.
Abstract: Partitioning is a critical area of VLSI CAD. In order to build complex digital logic circuits, it is often essential to subdivide a multi-million-transistor design into manageable pieces. This paper looks at various partitioning techniques in VLSI CAD targeted at various applications. We propose an evolutionary time-series model and a statistical glitch-prediction system using a neural network with global feature selection by means of clustering, for partitioning a circuit. For the evolutionary time-series model, we make use of genetic, memetic, and neuro-memetic techniques. Our work focuses on the clustering methods K-means and EM. A comparative study of all techniques is provided for the problem of circuit partitioning in VLSI design. The performance of all approaches is compared using benchmark data provided by the MCNC standard-cell placement benchmark netlists. Analysis of the experimental results shows that the neuro-memetic model outperforms the other models in recognizing sub-circuits with a minimum number of interconnections between them.
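The evolutionary and neural models are not reproduced here; as a small illustration of the clustering component, the sketch below runs plain K-means (NumPy only, on synthetic cell placement coordinates) to split a placed circuit into k partitions:

    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic placement: 300 cells around three hypothetical sub-circuits.
    cells = np.vstack([rng.normal(c, 0.5, (100, 2))
                       for c in ([0, 0], [5, 0], [2, 4])])

    k = 3
    centers = cells[rng.choice(len(cells), k, replace=False)]
    for _ in range(50):                                   # Lloyd's iterations
        d = np.linalg.norm(cells[:, None] - centers, axis=2)
        label = d.argmin(axis=1)                          # nearest-center partition
        centers = np.array([cells[label == j].mean(axis=0)
                            if np.any(label == j) else centers[j]
                            for j in range(k)])

    for j in range(k):
        print(f"partition {j}: {np.sum(label == j)} cells, "
              f"center {centers[j].round(2)}")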
Abstract: One of the difficulties of vibration-based damage identification methods is the non-uniqueness of the damage identification results: different damage locations and severities may produce identical response signals, a problem that is even more severe when detecting multiple damage sites. This paper proposes a new damage detection strategy that avoids this non-uniqueness. The strategy first determines the approximate damage area based on a statistical pattern recognition method using the dynamic strain signal measured by distributed fiber Bragg gratings, and then accurately evaluates the damage information based on a Bayesian model updating method using experimental modal data. A stochastic simulation method is then used to compute the high-dimensional integral in the Bayesian problem. Finally, an experiment on a plate structure, simulating one part of a mechanical structure, is used to verify the effectiveness of this approach.
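The paper's two-stage method is not reproduced here; the sketch below illustrates only the engine of the second stage, stochastic simulation (Metropolis-Hastings) of the posterior of a single stiffness-reduction parameter given one noisy modal frequency, with all numbers invented:

    import numpy as np

    rng = np.random.default_rng(0)
    f0, sigma = 10.0, 0.05            # undamaged frequency (Hz), noise std
    d_true = 0.20                     # true stiffness reduction (hidden)
    f_obs = f0 * np.sqrt(1 - d_true) + rng.normal(0, sigma)

    def log_post(d):
        """Uniform prior on [0, 0.9]; Gaussian likelihood on the measured mode."""
        if not 0.0 <= d <= 0.9:
            return -np.inf
        return -0.5 * ((f_obs - f0 * np.sqrt(1 - d)) / sigma) ** 2

    # Metropolis-Hastings random walk over the damage parameter.
    d, samples = 0.5, []
    for _ in range(20000):
        prop = d + rng.normal(0, 0.05)
        if np.log(rng.uniform()) < log_post(prop) - log_post(d):
            d = prop
        samples.append(d)

    post = np.array(samples[5000:])   # discard burn-in
    print(f"damage estimate: {post.mean():.3f} +/- {post.std():.3f}")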
Abstract: The paper deals with the analysis of the triggering conditions and evolution processes of piping phenomena, in relation to both mechanical and hydraulic aspects. In particular, the aim of the study is to predict slope instabilities triggered by piping, analysing the conditions necessary for a flow failure to occur. Indeed, the mechanical effect involved in the redistribution of loads around the pipe is coupled to the drainage process arising from the higher permeability of the pipe. If, after pipe formation, drainage is prevented by pipe clogging, the increase in pore-water pressure can lead to failure or even liquefaction, with a subsequent flow slide. To simulate the piping evolution and to verify the relevant stability conditions, an iterative coupled modelling approach has been developed. As an example, the proposed tool has been applied to the Stava Valley disaster (July 1985), demonstrating that piping might have been one of the triggering phenomena of the tailings dams' collapse.
Abstract: Although face detection is not a recent topic in the field of image processing, it is still an open area for research. The greatest step in this field is the work reported by Viola, and its recent analogue is that of Huang et al. Both use similar features and a similar training process. The former detects only upright faces, but the latter can detect multi-view faces in still grayscale images using new features called 'sparse features'. Finding these features with the previously proposed methods is very time-consuming and inefficient. Here, we propose a new approach for finding sparse features using a genetic algorithm. This method requires less computation and finds more effective features during the learning process for face detection, which leads to greater accuracy.
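The sparse features themselves are too involved to reproduce; the sketch below shows a generic genetic-algorithm skeleton for feature search of the kind described, evolving binary feature masks against an invented separability score:

    import numpy as np

    rng = np.random.default_rng(0)
    N_FEATURES, POP, GENS = 40, 60, 80

    # Hypothetical per-feature usefulness; a real fitness would be detector accuracy.
    gain = rng.uniform(0, 1, N_FEATURES)

    def fitness(mask):
        """Reward discriminative features, penalize mask size (sparsity pressure)."""
        return gain[mask == 1].sum() - 0.4 * mask.sum()

    pop = rng.integers(0, 2, (POP, N_FEATURES))
    for _ in range(GENS):
        f = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(f)[-POP // 2:]]          # keep the fitter half
        cut = rng.integers(1, N_FEATURES, POP // 2)       # one-point crossover
        kids = np.array([np.concatenate([parents[i][:c],
                                         parents[(i + 1) % len(parents)][c:]])
                         for i, c in enumerate(cut)])
        flip = rng.random(kids.shape) < 0.01              # bit-flip mutation
        kids[flip] ^= 1
        pop = np.vstack([parents, kids])

    best = pop[np.argmax([fitness(ind) for ind in pop])]
    print(f"selected {best.sum()} of {N_FEATURES} features")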
Abstract: Visualizing sound and noise often helps us determine appropriate control of the source through localization. Near-field acoustic holography (NAH) is a powerful tool for this ill-posed problem. However, in practice, due to the small finite aperture size, discrete-Fourier-transform (FFT) based NAH cannot predict the active region of interest (AROI) over the edges of the plane. A few approaches have been proposed in theory for solving the finite aperture problem, but most of these methods are not well suited to practical implementation, especially near the edges of the source. In this paper, a zip-stuffing extrapolation approach with a 2D Kaiser window is suggested. It operates in the complex wavenumber space to localize the predicted sources. We numerically construct a test environment with touch-impact databases to test the localization of the sound source. It is observed that the zip-stuffing aperture extrapolation and the 2D window with evanescent components provide greater accuracy, especially for small apertures and their derivatives.
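The zip-stuffing step is specific to the paper; the sketch below shows the two generic ingredients it builds on, a 2-D Kaiser taper and zero-padding of a measured hologram before the FFT (the aperture size, padding factor, and beta are arbitrary):

    import numpy as np

    # Hypothetical 32x32 pressure hologram measured over a small aperture.
    rng = np.random.default_rng(0)
    p = rng.normal(size=(32, 32)) + 1j * rng.normal(size=(32, 32))

    beta = 6.0
    w = np.kaiser(32, beta)
    p_tapered = p * np.outer(w, w)            # 2-D Kaiser taper, reduces edge leakage

    P = np.fft.fft2(p_tapered, s=(128, 128))  # zero-pad to 128x128 before the FFT
    kx = np.fft.fftfreq(128)                  # normalized wavenumber axes
    print("angular spectrum grid:", P.shape, "k-resolution:", kx[1])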