Abstract: With technology evolving every day and global competition increasing, industries are under constant pressure to be the best. They need to provide good quality products at competitive prices, when and how the customer wants them. To achieve this level of service, products and their respective supply chain processes need to be flexible and evolvable; otherwise changes will be extremely expensive, slow and accompanied by many combinatorial effects. Those combinatorial effects impact the whole organizational structure from a management, financial, documentation and logistics perspective, and especially from the perspective of the information system, the Enterprise Resource Planning (ERP) system. By applying the normalized systems concept/theory to segments of the supply chain, we believe these effects can be minimized, especially at the time of launching an organization-wide software project. The purpose of this paper is to point out that if an organization wants to develop software from scratch or implement an existing ERP package for its business needs, and if its business processes are normalized and modular, then this will most probably yield a normalized and modular software system that can be easily modified when the business evolves. Another important goal of this paper is to increase awareness regarding the design of business processes in a software implementation project: if the blueprints created are normalized, then the software developers and configurators will map those modular blueprints into modular software. This paper only prepares the ground for further studies; the above concept will be supported by going through the steps of developing, configuring and/or implementing a software system for an organization using two methods: the Software Development Lifecycle (SDLC) method and the Accelerated SAP (ASAP) implementation method. Both methods start with the customer requirements, proceed to blueprinting of the business processes, and finally map those processes into a software system. Since those requirements and processes are the starting point of the implementation, normalizing them will result in normalized software.
Abstract: Organizations operate in a very competitive and dynamic environment which is constantly changing. To achieve a high level of service, the products and processes of these organizations need to be flexible and evolvable. If supply chains are not modular and well designed, changes can bring combinatorial effects to most areas of a company, from its management, financial, documentation and logistics functions to its information structure. Applying the normalized systems concept to segments of the supply chain may help in reducing those ripple effects, but it may also increase lead times. Lead times are important and can become a decisive element in gaining customers. Industries are always under pressure to provide good quality products, at competitive prices, when and how the customer wants them; most of the time, customers want their orders now, if not yesterday. The above concept is demonstrated by examining lead times in a manufacturing example before and after applying the normalized systems concept to that segment of the chain. We then show that although we can minimize the combinatorial effects when changes occur, the lead times will increase.
Abstract: One of the major shortcomings of widely used scientometric indicators is that different disciplines cannot be compared with each other. The issue of cross-disciplinary normalization has long been discussed, but even the classification of publications into scientific domains poses problems. Structural properties of citation networks offer new possibilities; however, the large size and constant growth of these networks call for caution. Here we present a new tool that relies on the structural properties of citation networks in order to perform cross-field normalization of scientometric indicators of individual publications. Due to the large size of the networks, a systematic procedure for identifying scientific domains based on a local community detection algorithm is proposed. The algorithm is tested on different benchmark and real-world networks. Then, by the use of this algorithm, the mechanism of the scientometric indicator normalization process is shown for a few indicators such as the citation number, the P-index and a local version of the PageRank indicator. The fat-tailed distribution of the article indicators enables us to successfully perform the normalization process.
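
As a minimal sketch of the idea of cross-field normalization, the snippet below divides each publication's citation count by the mean count of its detected domain; it assumes the domain labels have already been produced (e.g., by a local community detection step), and the function and variable names are illustrative, not taken from the paper's tool.

    # Cross-field normalization: citations relative to the community mean.
    from collections import defaultdict

    def normalize_by_community(citations, community):
        """citations: {paper_id: citation count}; community: {paper_id: label}.
        Returns each paper's citations divided by its community's mean count."""
        totals, counts = defaultdict(float), defaultdict(int)
        for pid, c in citations.items():
            totals[community[pid]] += c
            counts[community[pid]] += 1
        means = {lab: totals[lab] / counts[lab] for lab in totals}
        return {pid: (c / means[community[pid]] if means[community[pid]] > 0 else 0.0)
                for pid, c in citations.items()}

    scores = normalize_by_community(
        {"a": 20, "b": 60, "c": 4, "d": 0},
        {"a": "physics", "b": "physics", "c": "math", "d": "math"})
    print(scores)   # {'a': 0.5, 'b': 1.5, 'c': 2.0, 'd': 0.0}

A paper with 20 citations in a field averaging 40 thus scores 0.5, while 4 citations in a field averaging 2 scores 2.0, making the two publications comparable.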
Abstract: Modelling realized volatility computed from high-frequency returns is popular, as realized volatility is an unbiased and efficient estimator of return volatility. A computationally simple model fits the logarithms of the realized volatilities with a fractionally integrated long-memory Gaussian process. The Gaussianity assumption simplifies parameter estimation via the Whittle approximation. Nonetheless, this assumption may not be met in finite samples, and there may be a need to normalize the financial series. Based on the empirical indices S&P500 and DAX, this paper examines the performance of the linear volatility model pre-treated with normalization compared to its existing counterpart. The empirical results show that with normalization as a pre-treatment step, the forecast performance outperforms the existing model in terms of both statistical and economic evaluations.
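
The abstract does not spell out the normalization used, so the sketch below substitutes a standard rank-based inverse normal (Gaussianizing) transform as an illustrative pre-treatment; the gamma-distributed stand-in series is synthetic, not the S&P500 or DAX data.

    # Rank-based inverse normal transform as a normalization pre-treatment.
    import numpy as np
    from scipy import stats

    def inverse_normal_transform(x):
        """Map a series to approximate Gaussianity via its ranks."""
        ranks = stats.rankdata(np.asarray(x, dtype=float))   # ranks 1..n
        u = (ranks - 0.5) / len(x)                           # uniform scores in (0, 1)
        return stats.norm.ppf(u)                             # Gaussian quantiles

    log_rv = np.log(np.random.gamma(2.0, 1.0, size=1000))    # skewed stand-in series
    z = inverse_normal_transform(log_rv)
    print(stats.skew(log_rv), stats.skew(z))                 # skew shrinks toward 0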
Abstract: The paper describes the availability analysis of the milling system of a rice milling plant using a probabilistic approach. The subsystems under study are special purpose machines. The availability analysis is carried out to determine the effect of the failure and repair rates of each subsystem on the overall performance (i.e. steady state availability) of the system concerned. Further, on the basis of the effect of repair rates on system availability, maintenance repair priorities have been suggested. The problem is formulated using a Markov Birth-Death process, taking exponential distributions for the failure and repair rates. The first order differential equations associated with the transition diagram are developed using the mnemonic rule. These equations are solved using the normalizing condition and a recursive method to derive the steady state availability expression of the system. The findings of the paper are presented and discussed with the plant personnel in order to adopt a suitable maintenance policy to increase the productivity of the rice milling plant.
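
To make the role of the normalizing condition concrete, the sketch below solves the steady-state balance equations of a two-state Markov availability model with the constraint that probabilities sum to one; the failure and repair rates are illustrative, not the plant's actual data.

    # Steady state of a Markov availability model via the normalizing condition.
    import numpy as np

    def steady_state(Q):
        """Q: generator matrix (rows sum to 0). Solve pi @ Q = 0, sum(pi) = 1."""
        n = Q.shape[0]
        A = np.vstack([Q.T, np.ones(n)])           # append normalizing condition
        b = np.zeros(n + 1); b[-1] = 1.0
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        return pi

    lam, mu = 0.02, 0.5                            # failure and repair rates (1/h)
    Q = np.array([[-lam,  lam],                    # state 0: working
                  [  mu,  -mu]])                   # state 1: under repair
    pi = steady_state(Q)
    print("steady-state availability:", pi[0])     # mu / (lam + mu) ~= 0.9615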
Abstract: Building code-related literature provides recommendations on normalizing approaches to the calculation of the dynamic properties of structures. Most building codes distinguish among types of structural systems, construction materials, and configurations through a numerical coefficient in the expression for the fundamental period. The period is then used with normalized response spectra to compute the base shear. The typical parameter used in simplified code formulas for the fundamental period is the overall building height raised to a power determined from analytical and experimental results. However, reinforced concrete buildings, which constitute the majority of built space in less developed countries, pose additional challenges compared to those built with a homogeneous material such as steel, or with concrete under stricter quality control. In the present paper, the particularities of reinforced concrete buildings are explored and related to current methods of equivalent static analysis. A comparative study is presented between the Uniform Building Code, commonly used for buildings within and outside the USA, and data from the Middle East used to model 151 reinforced concrete buildings of varying number of bays, number of floors, overall building height, and individual story height. The fundamental period was calculated using eigenvalue matrix computation. The results were also used in a separate regression analysis where the computed period serves as the dependent variable, while five building properties serve as independent variables. The statistical analysis sheds light on important parameters that simplified code formulas need to account for, including individual story height, overall building height, floor plan, number of bays, and concrete properties. Such inclusions are important for reinforced concrete buildings of special condition due to the level of concrete damage, aging, or materials quality control during construction.
Overall, the results of the present analysis show that simplified code formulas for fundamental period and base shear may be applied, but they require revisions to account for multiple parameters. This conclusion is confirmed by the analytical model, where fundamental periods were computed using numerical techniques and eigenvalue solutions. The recommendation is particularly relevant to code upgrades in less developed countries, where it is customary to adopt, and mildly adapt, international codes.
We also note the necessity of further research using empirical data from buildings in Lebanon that were subjected to severe damage due to impulse loading or accelerated aging. However, this study was excluded from the present paper and left for future research, as it has its own peculiarities and requires a different type of analysis.
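
For reference, the simplified height-based formula the abstract alludes to takes the form T = Ct * H^(3/4) (UBC-97 Method A); the sketch below evaluates it with Ct = 0.0731, the UBC coefficient for reinforced concrete moment frames with H in meters. The sample heights are illustrative, not drawn from the 151 modeled buildings.

    # Simplified code formula for the fundamental period (UBC-97 Method A).
    def fundamental_period(height_m, ct=0.0731):
        """Approximate fundamental period (s) from overall building height (m)."""
        return ct * height_m ** 0.75

    for h in (12.0, 24.0, 48.0):                   # roughly 4-, 8-, 16-story heights
        print(f"H = {h:5.1f} m  ->  T ~ {fundamental_period(h):.2f} s")

The paper's regression adds variables such as individual story height and number of bays, which this one-parameter formula by construction ignores.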
Abstract: Multimodal biometric systems integrate the data presented by multiple biometric sources, hence offering better performance than systems based on a single biometric modality. Although the coupling of biometric systems can be done at different levels, fusion at the score level is the most common, since it has proven more effective than the other fusion levels. However, the scores from different modalities are generally heterogeneous, so a normalization step is needed to transform these scores into a common domain before combining them. In this paper, we study the performance of several normalization techniques with various fusion methods in the context of the fusion of three unimodal systems based on the face, the palmprint and the fingerprint. We also propose a new adaptive normalization method that takes into account the distributions of the client and impostor scores. Experiments conducted on a database of 100 people show that the performance of a multimodal system depends on the choice of the normalization method and the fusion technique; the proposed normalization method gave the best results.
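
As a minimal sketch of score normalization followed by fusion, the snippet below applies two standard techniques from this literature, min-max and z-score, and combines the modalities with the simple sum rule; the raw scores are illustrative, and the paper's adaptive client/impostor-aware method is not reproduced here.

    # Score normalization (min-max, z-score) and sum-rule fusion.
    import numpy as np

    def min_max(scores):
        s = np.asarray(scores, dtype=float)
        return (s - s.min()) / (s.max() - s.min())  # map into [0, 1]

    def z_score(scores):
        s = np.asarray(scores, dtype=float)
        return (s - s.mean()) / s.std()             # zero mean, unit variance

    face   = np.array([0.62, 0.90, 0.35])           # heterogeneous raw scores
    palm   = np.array([120., 310., 80.])
    finger = np.array([0.012, 0.034, 0.005])

    fused = (min_max(face) + min_max(palm) + min_max(finger)) / 3.0
    print(fused)                                    # scores in a common domain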
Abstract: This paper describes an automated, implementable system for the detection and recognition of impulsive signals. The system uses a Digital Signal Processing device for the detection and identification process, analysing the signals in real time in order to produce a particular response if needed. Detection is achieved by normalizing the inputs and comparing the read signals to a dynamic threshold, thus avoiding detections triggered by loud or fluctuating ambient noise. Identification is done through neural network algorithms. During setup, our system can receive signals to "learn" certain patterns. Through this "learning" the system can recognize signals faster and gains flexibility towards new patterns similar to those already known. Sound is captured through a simple jack input, which could be replaced by an enhanced recording surface such as a wide-area recorder. Furthermore, a communication module can be added to the apparatus to send alerts to another interface if needed.
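
The detection step can be sketched as follows: each frame's level is compared against a running estimate of the ambient noise floor, so steady loud environments do not trigger detections. The samples are assumed already normalized to [-1, 1], and the threshold factor and smoothing constant are illustrative, not the device's actual settings.

    # Impulse detection against a dynamic, noise-tracking threshold.
    import numpy as np

    def detect_impulses(frames, k=4.0, alpha=0.05):
        """frames: iterable of 1-D sample arrays normalized to [-1, 1].
        Yields True for frames whose RMS exceeds k times the noise floor."""
        noise = None
        for frame in frames:
            level = np.sqrt(np.mean(np.asarray(frame, dtype=float) ** 2))
            if noise is None:
                noise = level                       # seed the noise floor
            impulsive = level > k * noise           # dynamic threshold test
            yield impulsive
            if not impulsive:                       # adapt only to ambient noise
                noise = (1 - alpha) * noise + alpha * level

    frames = (np.random.randn(512) * (0.01 if i != 40 else 0.5) for i in range(80))
    print([i for i, hit in enumerate(detect_impulses(frames)) if hit])  # [40]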
Abstract: The main emphasis of metallurgists has been to process materials so as to obtain balanced mechanical properties for a given application. One of the processing routes to alter the properties is heat treatment. Nearly 90% of structural applications involve medium carbon and alloyed steels, which are hence regarded as structural steels. The major requirement for conventional steel is to improve workability, toughness, hardness and grain refinement. In this view, it is proposed to study the mechanical and tribological properties of unalloyed structural (AISI 1140) steel under different thermal (heat) treatments, namely annealing, normalizing, tempering and hardening, compared with the as-received (cold worked) specimen. All heat treatments are carried out under atmospheric conditions. The hardening treatment improves the hardness of the material; a marginal decrease in hardness with improved ductility is observed after tempering. Annealing and normalizing improve the ductility of the specimen, with the normalized specimen showing the highest ductility. The hardened specimen shows the highest wear resistance in the initial period of sliding wear, whereas above 25 km of sliding distance the as-received steel outperforms the hardened specimen. Both mild and severe wear regions are observed. Microstructural analysis shows a pearlitic structure in the normalized specimen, a lath martensitic structure in the hardened specimen, and a pearlitic-ferritic structure in the annealed specimen.
Abstract: In the literature of information theory, there is a need to compare different measures of fuzzy entropy, and this consequently gives rise to the need for normalized measures of fuzzy entropy. In this paper, we discuss this need and develop some normalized measures of fuzzy entropy. It is also desirable to maximize entropy and to minimize directed divergence or distance. Keeping this idea in mind, we explain the method of optimizing different measures of fuzzy entropy.
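
As one standard example of such a normalization (the paper's own measures are not reproduced here), the De Luca-Termini fuzzy entropy of a fuzzy set A with membership values mu_A(x_i) can be divided by its maximum value, attained when every membership value equals 1/2:

    \[
    H(A) = -\sum_{i=1}^{n} \bigl[\mu_A(x_i)\ln\mu_A(x_i)
           + \bigl(1-\mu_A(x_i)\bigr)\ln\bigl(1-\mu_A(x_i)\bigr)\bigr],
    \qquad
    H_{\mathrm{norm}}(A) = \frac{H(A)}{n\ln 2} \in [0,1],
    \]

so that entropies of fuzzy sets defined on universes of different sizes n become directly comparable.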
Abstract: This paper discusses the performance modeling and availability analysis of the yarn dyeing system of a textile industry. The textile industry is a complex and repairable engineering system; its yarn dyeing system consists of five subsystems arranged in series configuration. For performance modeling and availability analysis, a performance evaluating model has been developed through mathematical formulation based on the Markov Birth-Death process. The differential equations have been developed on the basis of a probabilistic approach using a transition diagram. These equations have then been solved using the normalizing condition in order to obtain the steady state availability, a performance measure of the system concerned. The system performance has been further analyzed with the help of decision matrices, which provide the availability levels for different combinations of failure and repair rates of the various subsystems. The findings of this paper are therefore considered useful for availability analysis and for determining the best possible maintenance strategies to enhance system performance in the future.
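
The decision-matrix idea can be sketched as follows: for a series system, vary one subsystem's failure and repair rates over a grid while holding the others fixed, and tabulate the resulting availability. The sketch assumes independent two-state (working/failed) subsystems, and all rates are illustrative, not the textile plant's data.

    # Decision matrix of availabilities over failure/repair rate combinations.
    import numpy as np

    def series_availability(rates):
        """rates: list of (failure_rate, repair_rate) per subsystem in series."""
        a = 1.0
        for lam, mu in rates:
            a *= mu / (lam + mu)                   # per-subsystem availability
        return a

    others = [(0.01, 0.4)] * 4                      # four subsystems held fixed
    lams = [0.005, 0.01, 0.02, 0.04]                # candidate failure rates
    mus  = [0.2, 0.4, 0.8, 1.6]                     # candidate repair rates
    for lam in lams:
        row = [series_availability(others + [(lam, mu)]) for mu in mus]
        print(lam, ["%.4f" % v for v in row])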
Abstract: This paper applies a fuzzy clustering algorithm to classify real estate companies in China according to some general financial indexes, such as income per share, share accumulation fund, net profit margin, weighted net assets yield and shareholders' equity. By constructing and normalizing the initial partition matrix, obtaining the fuzzy similarity matrix with the Minkowski metric and computing its transitive closure, the dynamic fuzzy clustering analysis for real estate companies shows clearly that the clustering results change gradually as the threshold decreases, and that they bear a similar relationship to the prices of those companies in the stock market. In this way, the approach is of great value in comparing the financial condition of real estate companies in order to grasp good investment opportunities.
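
The pipeline can be sketched in three steps: normalize the data matrix column-wise, build a fuzzy similarity matrix from Minkowski distances between rows, and take its max-min transitive closure before thresholding. The five-company, five-index data below are illustrative, not the Chinese market data.

    # Fuzzy clustering: normalization, Minkowski similarity, transitive closure.
    import numpy as np

    def fuzzy_similarity(X, p=2):
        """Normalize columns of X to [0, 1], then map row-wise Minkowski
        distances into a fuzzy similarity matrix in [0, 1]."""
        X = (X - X.min(0)) / (X.max(0) - X.min(0) + 1e-12)
        d = (np.abs(X[:, None, :] - X[None, :, :]) ** p).sum(-1) ** (1.0 / p)
        return 1.0 - d / (d.max() + 1e-12)

    def transitive_closure(R):
        """Iterate max-min self-composition until R is transitive."""
        while True:
            S = np.max(np.minimum(R[:, :, None], R[None, :, :]), axis=1)
            if np.allclose(S, R):
                return S
            R = S

    X = np.array([[0.80, 1.2, 0.15, 0.10, 2.1],     # five companies, five indexes
                  [0.70, 1.1, 0.14, 0.11, 2.0],
                  [0.20, 0.4, 0.05, 0.03, 0.8],
                  [0.30, 0.5, 0.06, 0.04, 0.9],
                  [0.50, 0.9, 0.10, 0.07, 1.5]])
    T = transitive_closure(fuzzy_similarity(X))
    print((T >= 0.8).astype(int))                   # clusters at threshold 0.8

Lowering the 0.8 cut merges clusters gradually, which is the dynamic behaviour the abstract describes.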
Abstract: In this paper, a novel technique for the contrast enhancement of low-contrast satellite images is proposed, based on singular value decomposition (SVD) and the discrete cosine transform (DCT). The singular value matrix represents the intensity information of the given image, and any change in the singular values changes the intensity of the input image. The proposed technique converts the image into the SVD-DCT domain and, after normalizing the singular value matrix, reconstructs the enhanced image using the inverse DCT. The visual and quantitative results show the increased efficiency and flexibility of the proposed SVD-DCT method over existing methods such as the Linear Contrast Stretching technique, the GHE technique, the DWT-SVD technique, the DWT technique, the Decorrelation Stretching technique, and Gamma Correction based methods.
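
A common variant of this family of methods scales the singular values of the DCT coefficient matrix by a factor derived from a histogram-equalized copy of the input; the paper's exact normalization of the singular value matrix may differ, so the sketch below is only an illustration of the SVD-DCT pipeline on a synthetic image.

    # SVD-DCT contrast enhancement: scale singular values in the DCT domain.
    import numpy as np
    from scipy.fft import dctn, idctn

    def equalize(img):
        """Simple histogram equalization via the empirical CDF."""
        ranks = img.ravel().argsort().argsort()
        return (ranks / (img.size - 1.0)).reshape(img.shape)

    def enhance(img):
        """img: 2-D float array in [0, 1]."""
        D, D_eq = dctn(img, norm="ortho"), dctn(equalize(img), norm="ortho")
        U, s, Vt = np.linalg.svd(D, full_matrices=False)
        xi = np.linalg.svd(D_eq, compute_uv=False).max() / s.max()
        enhanced = U @ np.diag(xi * s) @ Vt         # normalized singular values
        return np.clip(idctn(enhanced, norm="ortho"), 0.0, 1.0)

    rng = np.random.default_rng(0)
    dull = rng.random((64, 64)) * 0.2 + 0.1         # low-contrast synthetic image
    print(dull.std(), enhance(dull).std())          # intensity spread widens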
Abstract: A feed-forward, back-propagation Artificial Neural Network (ANN) model has been used to forecast the occurrence of wastewater overflows in a combined sewerage reticulation system. This approach was tested to evaluate its applicability as an alternative to the common practice of developing a complete conceptual, mathematical hydrological-hydraulic model of the sewerage system to enable such forecasts. The ANN approach obviates the need for an a priori understanding and representation of the underlying hydrological-hydraulic phenomena in mathematical terms, instead learning the characteristics of a sewer overflow from historical data.
The performance of the standard feed-forward, back-propagation of error algorithm was enhanced by a modified data normalizing technique that enabled the ANN model to extrapolate beyond the range seen in the training data. The algorithm and the data normalizing method are presented along with the ANN model output results, which indicate good accuracy in the forecasted sewer overflow rates. However, it was revealed that accurate forecasting of the overflow rates is heavily dependent on the availability of real-time flow monitoring at the overflow structure to provide antecedent flow rate data. The ability of the ANN to forecast the overflow rates without the antecedent flow rates (as is the case with traditional conceptual reticulation models) was found to be quite poor.
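
The abstract does not detail the modified normalization, but a common trick with the same intent is sketched below: training values are mapped into [0.1, 0.9] rather than [0, 1], leaving headroom so unseen values above the training maximum still fall within the network's representable range. The class name and sample flows are illustrative.

    # Normalization with headroom for extrapolation beyond the training range.
    import numpy as np

    class HeadroomScaler:
        def __init__(self, lo=0.1, hi=0.9):
            self.lo, self.hi = lo, hi

        def fit(self, x):
            self.xmin, self.xmax = float(np.min(x)), float(np.max(x))
            return self

        def transform(self, x):
            frac = (np.asarray(x, float) - self.xmin) / (self.xmax - self.xmin)
            return self.lo + frac * (self.hi - self.lo)

    scaler = HeadroomScaler().fit([2.0, 5.0, 8.0])  # training overflow rates
    print(scaler.transform([8.0, 8.75]))            # [0.9, 1.0]: a value 12.5%
                                                    # above the training max still
                                                    # maps inside the output range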
Abstract: This research proposes a methodology for patent-citation-based technology input-output analysis by applying patent information to the input-output analysis originally developed for the dependencies among different industries. For this analysis, a technology relationship matrix and its components, as well as input and technology inducement coefficients, are constructed using patent information. The technology inducement coefficient is calculated by normalizing the degree of citation from certain IPCs (International Patent Classification codes) to different or the same IPCs. Finally, a Dependency Structure Matrix (DSM) is constructed based on the technology inducement coefficient to suggest a useful application of this methodology.
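
The normalization step can be sketched as follows: an IPC-to-IPC citation count matrix is divided column-wise by total citations (analogous to input coefficients in industrial input-output analysis) and then thresholded into a binary DSM. The counts and cutoff below are illustrative, not the paper's data.

    # From citation counts to inducement coefficients and a binary DSM.
    import numpy as np

    cites = np.array([[ 0, 12,  3],                 # rows: cited IPC
                      [ 8,  0,  9],                 # cols: citing IPC
                      [ 2,  6,  0]], dtype=float)

    coeff = cites / cites.sum(axis=0, keepdims=True)  # normalize per citing IPC
    dsm = (coeff >= 0.3).astype(int)                  # keep strong dependencies
    print(coeff.round(2))
    print(dsm)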
Abstract: A number of mass spectrometry applications are already available as web-based and Windows-based systems that calculate isotope patterns and display the mass spectrum for a specific molecular formula, besides providing the necessary information. These applications were evaluated and compared with our new alternative application, called Theoretical Isotope Generator (TIG), in terms of functionality and features, to show that the new application performs better. TIG provides extra features beyond the others, complete with functionality such as drawing, normalizing and zooming the generated graph, presented together with the molecular information in a number of formats, and it provides the details of the calculation and the molecules. Thus, chemists, students, lecturers and researchers anywhere can use TIG to obtain information on molecules and their relative intensity.
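
The "normalizing" step for an isotope pattern conventionally rescales peak intensities so the most abundant peak reads 100%, as in the sketch below; the pattern (chlorine's two isotopes, roughly 3:1) is illustrative and the code is not TIG's own.

    # Normalize isotope pattern intensities to relative abundance (max = 100%).
    def normalize_pattern(peaks):
        """peaks: list of (mass, intensity). Returns (mass, relative %)."""
        top = max(intensity for _, intensity in peaks)
        return [(m, 100.0 * i / top) for m, i in peaks]

    print(normalize_pattern([(34.9689, 0.7576), (36.9659, 0.2424)]))
    # [(34.9689, 100.0), (36.9659, ~32.0)]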