Abstract: The halophilic proteinase showed maximal activity at 50°C
and pH 9–10 in 20% NaCl and was highly stabilized by NaCl. It was
able to hydrolyse natural actomyosin (NAM), collagen and anchovy
protein. In NAM hydrolysis, the myosin heavy chain was completely
digested by the halophilic proteinase, as evidenced by the lowest
remaining band intensity, whereas actin was only partially
hydrolysed. The SR5-3 proteinase was also capable of effectively
hydrolyzing the two major components of collagen, the β- and
α-components. The degrees of hydrolysis (DH) of the halophilic
proteinase and of commercial proteinases (Novozyme, Neutrase,
chymotrypsin and Flavourzyme) on anchovy protein were compared, and
the halophilic proteinase showed a greater DH towards anchovy
protein than the commercial proteinases. The DH of the halophilic
proteinase increased sharply as the amount of enzyme was raised from
0.035 U to 0.105 U. These results suggest that the production of
higher-quality fish sauce may be accelerated by adding the
halophilic proteinase from this bacterium.
Abstract: As reported in the literature, about 70% of improvement initiatives fail, and a significant number never even get started. This paper analyses the problem of failing Software Process Improvement (SPI) initiatives and proposes good practices, supported by motivational tools, that can help minimize failures. It elaborates on the hypothesis that human factors are poorly addressed by deployers, especially because implementation guides usually emphasize only technical factors. This research was conducted with SPI deployers and analyses 32 SPI initiatives. The results indicate that although human factors are not commonly highlighted in guidelines, successful initiatives usually address them implicitly. This research shows that practices based on human factors indeed play a crucial role in successful SPI implementations, proposes change management as a theoretical framework for introducing those practices in the SPI context, and suggests some motivational tools, based on SPI deployers' experience, to support it.
Abstract: This paper presents the applicability of artificial
neural networks to 24-hour-ahead solar power generation forecasting
for a 20 kW photovoltaic system; the developed forecast is suitable
for reliable microgrid energy management. In total, four neural
networks were proposed, namely: a multi-layer perceptron, a radial
basis function network, a recurrent network, and a neural network
ensemble consisting of bagged networks. The forecasting reliability
of the proposed neural networks was evaluated in terms of
forecasting error, based on statistical and graphical methods. The
experimental results showed that all the proposed networks achieved
an acceptable forecasting accuracy. In comparison, the neural
network ensemble gave the highest forecasting precision among the
networks considered. In fact, each network of the ensemble over-fits
to some extent, and this leads to a diversity which enhances the
noise tolerance and the forecasting generalization performance
compared with the conventional networks.
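The bagging idea behind such an ensemble can be sketched as follows: each member is trained on a bootstrap resample of the history and the member forecasts are averaged, which smooths out the individual over-fitting. The sketch below is a minimal illustration using least-squares linear models in place of the neural networks; the data and names are synthetic, not from the paper.

```python
import numpy as np

def fit_linear(X, y):
    # Least-squares fit with a bias column (stand-in for training one network).
    Xb = np.hstack([X, np.ones((len(X), 1))])
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def predict_linear(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return Xb @ w

def bagged_forecast(X, y, X_new, n_models=10, seed=0):
    """Train each member on a bootstrap resample; average the forecasts."""
    rng = np.random.default_rng(seed)
    preds = []
    for _ in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))   # bootstrap sample
        w = fit_linear(X[idx], y[idx])
        preds.append(predict_linear(w, X_new))
    return np.mean(preds, axis=0)                    # ensemble average
```

With `X` holding past input features and `y` the corresponding power output, `bagged_forecast(X, y, X_tomorrow)` would return the averaged day-ahead forecast.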
Abstract: In over-deployed sensor networks, one approach to
conserve energy is to keep only a small subset of sensors active at
any instant. For coverage problems, the monitoring area is
discretized into a set of points that require sensing, called demand
points, and the coverage area of a node is taken to be a circle of
radius R, where R is the sensing range. If the distance between a
demand point and a sensor node is less than R, the node is able to
cover that point. We consider a wireless sensor network consisting
of a set of randomly deployed sensors. A point in the monitored area
is covered if it is within the sensing range of a sensor. In some
applications, when the network is sufficiently dense, area coverage
can be approximated by guaranteeing point coverage. In this case,
the locations of the wireless devices themselves can be used to
represent the whole area, and the working sensors are then required
to cover all the sensor nodes. We also introduce a hybrid algorithm
and discuss challenges related to coverage in sensor networks.
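The coverage criterion described above (a demand point is covered when its distance to some active sensor is below the sensing range R) can be sketched as:

```python
import math

def covers(sensor, point, R):
    """A sensor covers a demand point if their distance is less than R."""
    return math.dist(sensor, point) < R

def all_points_covered(active_sensors, demand_points, R):
    """Point coverage: every demand point is within range R of some active sensor."""
    return all(any(covers(s, p, R) for s in active_sensors)
               for p in demand_points)
```

A sleep-scheduling algorithm would use such a check to decide whether a candidate subset of active sensors may safely replace the full deployment.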
Abstract: The building sector is the largest energy consumer and
CO2 emitter in the European Union (EU), and therefore the active
reduction of energy consumption and the elimination of energy
wastage are among its main goals. Healthy housing and energy
efficiency are affected by many factors, which poses challenges for
the monitoring, control and study of indoor air quality (IAQ) and
energy consumption, especially in old buildings. These challenges
include, for example, measurement and equipment costs. Additionally,
the measurement results are difficult to interpret, and their use in
ventilation control is also limited when the energy efficiency of
housing is taken into account at the same time. The main goal of
this study is to develop a cost-effective building monitoring and
control system, especially for old buildings. A key requirement of
the development process is that the system be wireless; otherwise
the installation costs become too high. As the main result, this
paper describes the concept of a wireless building monitoring and
control system. The first prototype of the system has been installed
in 10 residential buildings and 10 school buildings located in the
City of Kuopio, Finland.
Abstract: This work describes the development of an optical fiber
(OF) sensor for the detection and quantification of single-walled
carbon nanotubes in aqueous solutions. The developed OF sensor has a
compact design, requires less expensive materials and equipment, and
needs only a low sample volume (0.2 mL). The methodology was also
validated by comparing its analytical performance with that of a
standard methodology based on ultraviolet-visible spectroscopy. The
developed OF sensor follows the general SDS calibration proposed for
OF sensors, which is a more suitable calibration fitting than
classical calibrations.
Abstract: The dorsal hand vein pattern is an emerging biometric which has lately been attracting the attention of researchers. Research is being carried out on existing techniques in the hope of improving them or finding more efficient ones. In this work, Principal Component Analysis (PCA), a successful method originally applied to the face biometric, is modified using Cholesky decomposition and the Lanczos algorithm to extract dorsal hand vein features. This modified technique decreases the number of computations and hence decreases the processing time. The eigenveins were successfully computed and projected onto the vein space. The system was tested on a database of 200 images, using a threshold value of 0.9 to obtain the False Acceptance Rate (FAR) and False Rejection Rate (FRR). This modified algorithm is desirable when developing biometric security systems, since it significantly decreases the matching time.
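As a rough illustration of the PCA step (standard eigendecomposition, not the paper's Cholesky/Lanczos variant), flattened vein images can be projected onto the top-k eigenveins; the distance metric and the threshold below are placeholders, not the paper's exact matching rule:

```python
import numpy as np

def pca_features(images, k):
    """Project flattened vein images onto the top-k principal components
    ("eigenveins"). Plain eigendecomposition PCA; the paper's variant
    replaces this step with Cholesky decomposition plus Lanczos."""
    X = images - images.mean(axis=0)            # center the data
    C = np.cov(X, rowvar=False)                 # covariance matrix
    vals, vecs = np.linalg.eigh(C)              # eigenpairs (ascending)
    top = vecs[:, np.argsort(vals)[::-1][:k]]   # top-k eigenvectors
    return X @ top                              # coordinates in the vein space

def match(query_feat, gallery_feats, threshold=0.9):
    """Accept if the nearest gallery template is closer than the threshold
    (hypothetical Euclidean criterion, illustrating FAR/FRR thresholding)."""
    d = np.linalg.norm(gallery_feats - query_feat, axis=1)
    return d.min() < threshold
```

Varying `threshold` trades FAR against FRR: a looser threshold admits more impostors, a tighter one rejects more genuine users.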
Abstract: There is growing interest in biodiesel (fatty acid
methyl ester, or FAME) because of the similarity of its properties
to those of diesel fuels. Diesel engines operated on biodiesel have
lower emissions of carbon monoxide, unburned hydrocarbons,
particulate matter, and air toxics than when operated on
petroleum-based diesel fuel. In this work, the production of FAME
from rapeseed (non-edible oil) fatty acid distillate with a high
free fatty acid (FFA) content was investigated. The conditions for
the esterification of rapeseed oil were 1.8% H2SO4 as catalyst, a
MeOH/oil molar ratio of 2 : 0.1 and a reaction temperature of 65 °C
for a period of 3 h. The yield of methyl ester was > 90% within 1 h.
The FFA content was reduced from 93 wt% to less than 2 wt% by the
end of the esterification process. The FAME was purified by
neutralization with a 1 M aqueous sodium hydroxide solution at a
reaction temperature of 62 °C. The final FAME product met the
biodiesel quality standard ASTM D 6751.
Abstract: We study the typical domain size and configuration
character of a randomly perturbed system exhibiting continuous
symmetry breaking. As a model system we use rod-like objects within
a cubic lattice interacting via a Lebwohl–Lasher-type interaction,
and we describe their local direction with a headless unit director
field. Examples of such systems include nematic liquid crystals
(LCs) and nanotubes. We further introduce impurities of
concentration p, which impose random anisotropy field-type disorder
on the directors. We study the domain-type pattern of the molecules
as a function of p, the anchoring strength w between a neighboring
director and an impurity, the temperature, and the history of the
samples. In the simulations we quenched the directors from either a
random or a homogeneous initial configuration. Our results show that
the history of the system strongly influences: i) the average domain
coherence length; and ii) the range of ordering in the system. In
the random case the obtained order is always short-range (SR). On
the contrary, in the homogeneous case SR order is obtained only for
strong enough anchoring and large enough concentration p; in the
other cases the ordering is either quasi-long-range (QLR) or
long-range (LR). We further studied memory effects for the random
initial configuration. With an increasing external ordering field B,
either QLR or LR is realized.
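A minimal Metropolis sweep for a pure Lebwohl–Lasher lattice of headless directors (with the impurity, anchoring and external-field terms omitted; the trial step size and parameter names are illustrative, not the paper's) might look like:

```python
import numpy as np

def p2(c):
    return 1.5 * c**2 - 0.5   # second Legendre polynomial; even in c (headless)

def site_energy(n, lattice, i, j, k):
    """Lebwohl-Lasher energy of director n at (i,j,k): -sum P2(n . n_nbr)."""
    L = lattice.shape[0]
    e = 0.0
    for d in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
        m = lattice[(i+d[0]) % L, (j+d[1]) % L, (k+d[2]) % L]
        e -= p2(n @ m)
    return e

def metropolis_sweep(lattice, T, rng):
    """One Monte Carlo sweep: trial rotations accepted with prob exp(-dE/T)."""
    L = lattice.shape[0]
    for _ in range(L**3):
        i, j, k = rng.integers(0, L, size=3)
        n_old = lattice[i, j, k]
        n_new = n_old + 0.3 * rng.normal(size=3)   # small random tilt
        n_new /= np.linalg.norm(n_new)
        dE = (site_energy(n_new, lattice, i, j, k)
              - site_energy(n_old, lattice, i, j, k))
        if dE <= 0 or rng.random() < np.exp(-dE / T):
            lattice[i, j, k] = n_new
    return lattice
```

The two quench histories in the abstract correspond to initializing the lattice with random unit vectors or with a single common director before sweeping.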
Abstract: The policies governing the business of any organization
are well reflected in its business rules. Business rules are
implemented through data validation techniques coded during the
software development process. Any change in business policies
therefore results in a change to the code written for the data
validation used to enforce them. Implementing changes in business
rules without changing the code is the objective of this paper. The
proposed approach enables users to create rule sets at run time,
once the software has been developed. The rule sets newly defined by
end users are associated with the data variables for which
validation is required. The proposed approach allows users to define
business rules using all the comparison operators and Boolean
operators. Multithreading is used to validate the data entered by
the end user against the applied business rules. The evaluation of
the data is performed by a newly created thread using an enhanced
form of the RPN (Reverse Polish Notation) algorithm.
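The core of an RPN-based rule evaluation (a plain evaluator, without the paper's enhancements) can be sketched as:

```python
import operator

OPS = {
    ">": operator.gt, "<": operator.lt, ">=": operator.ge,
    "<=": operator.le, "==": operator.eq, "!=": operator.ne,
    "AND": lambda a, b: a and b, "OR": lambda a, b: a or b,
}

def eval_rpn(tokens, variables):
    """Evaluate a rule in Reverse Polish Notation against entered data.
    Operands are variable names or numeric literals; operators pop two."""
    stack = []
    for tok in tokens:
        if tok in OPS:
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[tok](a, b))
        elif tok in variables:
            stack.append(variables[tok])
        else:
            stack.append(float(tok))
    return stack.pop()

# The rule "age >= 18 AND salary > 3000" written in RPN:
rule = ["age", "18", ">=", "salary", "3000", ">", "AND"]
```

In the proposed approach such an evaluation would run in a newly created thread, e.g. by wrapping the call in `threading.Thread(target=eval_rpn, args=(rule, data))`.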
Abstract: Modern times call on organizations to take an active
role in the social arena through Corporate Social Responsibility
(CSR). The objective of this research was to test the hypothesis
that there is a positive relation between social performance and
economic performance, and whether there is a positive correlation
between social performance and financial-economic performance. To
test these hypotheses, a measure of social performance based on the
Green Book of the Commission of the European Community was applied
to a group of nineteen top Portuguese companies listed on the PSI 20
index over a period of five years, from 2005 to 2009. A cluster
analysis was applied to group the companies by their social
performance and to compare and correlate their economic performance.
The results indicate that the companies with better social
performance are not the ones with better economic performance, and
suggest that the middle path may provide a good CSR-economic
performance relation as a basis for sustainable development.
Abstract: We have developed a computer program consisting of 6
subtests assessing children's hand dexterity, applicable in
rehabilitation medicine. We carried out a normative study on a
representative sample of 285 children aged 7 to 15 (mean age 11.3)
and proposed clinical standards for three age groups (7-9, 9-11,
12-15 years). We showed the statistical significance of the
differences among the corresponding mean task completion times, and
found a strong correlation between task completion time and the age
of the subjects. We also performed test-retest reliability checks on
a sample of 84 children, which gave high Pearson coefficients for
the dominant and non-dominant hand, in the ranges 0.74–0.97 and
0.62–0.93, respectively.
A new MATLAB-based programming tool for the analysis of cardiologic
RR intervals and blood pressure descriptors has also been developed.
For each data set, ten different parameters are extracted: 2 in the
time domain, 4 in the frequency domain and 4 from Poincaré plot
analysis. In addition, twelve different baroreflex sensitivity
parameters are calculated. All these data sets can be visualized in
the time domain together with their power spectra and Poincaré
plots. If available, the respiratory oscillation curves can also be
plotted for comparison. Another application processes biological
data obtained from BLAST analysis.
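Assuming the Poincaré plot descriptors include the common SD1/SD2 pair (the abstract does not name them), they can be computed from an RR-interval series as:

```python
import numpy as np

def poincare_sd(rr):
    """SD1/SD2 Poincaré plot descriptors of an RR-interval series (ms).
    SD1: short-term variability (spread across the identity line);
    SD2: long-term variability (spread along it)."""
    rr = np.asarray(rr, dtype=float)
    d = np.diff(rr)                              # successive differences
    sd1 = np.sqrt(np.var(d, ddof=1) / 2.0)
    sd2 = np.sqrt(2.0 * np.var(rr, ddof=1) - sd1**2)
    return sd1, sd2
```

Plotting each RR(n+1) against RR(n) gives the Poincaré plot itself; SD1 and SD2 are the semi-axes of the fitted ellipse.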
Abstract: Customer satisfaction is as important in the textile
sector as it is in other sectors. In particular, considering that
gaining new customers costs four times as much as keeping existing
customers from leaving, it is clear that customer satisfaction plays
a great role for firms. In this study, the independent variables
affecting customer satisfaction are chosen as brand image, perceived
service quality and perceived product quality. Using these
independent variables, it is investigated whether any differences
exist in the perception of customer satisfaction among Turkish
textile consumers with respect to gender. SPSS was used for the data
analysis in this research.
Abstract: In this study, some physical and mechanical properties
of jujube fruits were measured and compared at a constant moisture
content of 15.5% w.b. The results showed that the mean length, width
and thickness of the jujube fruits were 18.88, 16.79 and 15.9 mm,
respectively. The mean projected areas of the jujube perpendicular
to length, width, and thickness were 147.01, 224.08 and 274.60 mm2,
respectively. The mean mass and volume were 1.51 g and 2672.80 mm3,
respectively. The arithmetic mean diameter, geometric mean diameter
and equivalent diameter varied from 14.53 to 20 mm, 14.5 to 19.94
mm, and 14.52 to 19.97 mm, respectively. The sphericity, aspect
ratio and surface area of the jujube fruits were 0.91, 0.89 and
926.28 mm2, respectively. The whole fruit density, bulk density and
porosity of the jujube fruits were measured and found to be
1.52 g/cm3, 0.3 g/cm3 and 79.3%, respectively. The angle of repose
of the jujube fruit was 14.66° (±0.58°). The static coefficient of
friction on galvanized iron steel was higher than that on plywood
and lower than that on a glass surface. The values of rupture force,
deformation, hardness and energy absorbed were found to be in the
ranges 11.13-19.91 N, 2.53-4.82 mm, 3.06-5.81 N mm and
20.13-39.08 N/mm, respectively.
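The reported shape descriptors follow the standard formulas based on the three axial dimensions; as a quick consistency check with the mean values above:

```python
import math

L, W, T = 18.88, 16.79, 15.9    # mean length, width, thickness (mm)

Dg = (L * W * T) ** (1 / 3)     # geometric mean diameter
Da = (L + W + T) / 3            # arithmetic mean diameter
phi = Dg / L                    # sphericity
S = math.pi * Dg ** 2           # surface area of the equivalent sphere (mm^2)
```

This reproduces the reported sphericity of 0.91 and a surface area close to the reported 926.28 mm2, and the arithmetic mean diameter falls inside the reported 14.53-20 mm range.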
Abstract: Since the 1990s the American furniture industry has been in a transition period: manufacturers, one of its most important actors, have entered the retail industry. This shift has had deep consequences not only for the American furniture industry as a whole, but also for other international furniture industries, especially the Chinese one. The present work aims to analyze this actor based on the distinctions provided by Global Commodity Chain theory, stressing its characteristics, structure, mode of operation and importance for both the U.S. and the Chinese furniture industries.
Abstract: Risk management is an essential part of project management and plays a significant role in project success. Many failures associated with Web projects are the consequences of poor awareness of the risks involved and of the lack of process models that can serve as guidelines for the development of Web-based applications, since contemporary process models have been devised for the development of conventional software. To address this problem, this paper introduces WPRiMA (Web Project Risk Management Assessment), a tool used to implement RIAP, the risk identification architecture pattern model, which focuses on data from the proprietor's and vendor's perspectives. The paper also illustrates how the WPRiMA tool works and how it can be used to calculate the risk level for a given Web project, to generate recommendations that facilitate risk avoidance in a project, and to improve the prospects of early risk management.
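The abstract does not give WPRiMA's scoring formula; a generic probability-times-impact aggregation, with entirely hypothetical thresholds and names, would look like:

```python
def project_risk_level(risks):
    """Aggregate a risk level from (probability, impact) pairs, both in [0,1].
    Hypothetical scoring; WPRiMA's actual formula is not given in the abstract."""
    scores = [p * i for p, i in risks]    # classic exposure = probability x impact
    return sum(scores) / len(scores)      # average exposure over identified risks

def recommend(level, low=0.2, high=0.5):
    """Map the level to a coarse recommendation (thresholds hypothetical)."""
    if level < low:
        return "monitor"
    if level < high:
        return "mitigate"
    return "avoid/replan"
```

Such a mapping from numeric level to recommendation illustrates how a tool can turn identified risks into risk-avoidance guidance.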
Abstract: Network warfare is an emerging concept that focuses on the network- and computer-based forms through which information is attacked and defended. Various computer and network security concepts thus play a role in network warfare. Due to the intricacy of the various interacting components, a model for better understanding the complexity of a network warfare environment would be beneficial. Non-quantitative modeling is a useful method for characterizing the field, due to the rich ideas that can be generated through the use of secular associations, chronological origins, linked concepts, categorizations and context specifications. This paper proposes the use of non-quantitative methods, through a morphological analysis, to better explore and define the influential conditions in a network warfare environment.
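A morphological analysis enumerates all combinations of the chosen parameters and prunes those that fail a cross-consistency assessment. A toy sketch with an invented field (not the paper's actual parameters or exclusions) is:

```python
from itertools import product

# Hypothetical morphological field: parameters of a network warfare
# scenario and their possible values (illustrative only).
field = {
    "actor":  ["state", "criminal group", "insider"],
    "target": ["infrastructure", "data", "services"],
    "vector": ["malware", "denial of service", "social engineering"],
}

# Pairwise exclusions from a cross-consistency assessment (hypothetical).
inconsistent = {("insider", "denial of service")}

def consistent_configurations(field, inconsistent):
    """Enumerate the full morphological field, keeping only configurations
    in which every pair of chosen values is allowed."""
    names = list(field)
    for combo in product(*field.values()):
        pairs = {(a, b) for a in combo for b in combo if a != b}
        if not (pairs & inconsistent):
            yield dict(zip(names, combo))
```

The surviving configurations form the solution space that the analysis then explores qualitatively.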
Abstract: This study describes a capillary-based device integrated
with heating and cooling modules for the polymerase chain reaction
(PCR). The device consists of a polytetrafluoroethylene (PTFE)
reaction capillary and aluminum blocks, and is equipped with two
cartridge heaters, a thermoelectric (TE) cooler, a fan, and
thermocouples for temperature control. The cartridge heaters are
placed in the heating blocks and maintained at two different
temperatures to achieve the denaturation and extension steps.
Thermocouples inserted into the capillary are used to obtain the
transient temperature profiles of the reaction sample during the
thermal cycles. A 483-bp DNA template was amplified successfully
both in the designed system and in a traditional thermal cycler.
This work should be of interest to those working on high-temperature
reactions, genomics or cell analysis.
Abstract: This paper analyses the unsteady, two-dimensional
stagnation point flow of an incompressible viscous fluid over a flat
sheet when the flow is started impulsively from rest and, at the
same time, the sheet is suddenly stretched in its own plane with a
velocity proportional to the distance from the stagnation point. The
partial differential equations governing the laminar boundary layer
forced convection flow are non-dimensionalised using semi-similar
transformations and then solved numerically using an implicit
finite-difference scheme known as the Keller-box method. Results
pertaining to the flow and heat transfer characteristics are
computed for all dimensionless times, uniformly valid in the whole
spatial region, without any numerical difficulties. Analytical
solutions are also obtained for both small and large times,
representing the initial unsteady and the final steady-state flow
and heat transfer, respectively. Numerical results indicate that the
velocity ratio parameter has a significant effect on the skin
friction and heat transfer rate at the surface. Furthermore, it is
shown that there is a smooth transition from the initial unsteady
flow (small-time solution) to the final steady state (large-time
solution).
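The Keller-box scheme itself is beyond the scope of the abstract, but the key property of implicit finite-difference marching (unconditional stability, which is what allows the solution to be advanced through all times without numerical difficulty) can be illustrated on the 1D diffusion equation with a tridiagonal (Thomas) solve:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system: a sub-, b main, c super-diagonal."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def implicit_step(u, dt, dx, nu=1.0):
    """One backward-Euler step of u_t = nu * u_xx with fixed end values.
    Stable for any dt, unlike an explicit scheme."""
    r = nu * dt / dx**2
    n = len(u)
    a = np.full(n, -r); b = np.full(n, 1 + 2 * r); c = np.full(n, -r)
    a[0] = c[0] = a[-1] = c[-1] = 0.0
    b[0] = b[-1] = 1.0            # Dirichlet boundary rows
    return thomas(a, b, c, u.copy())
```

Starting from an impulsive jump at one boundary, repeated implicit steps march the profile smoothly toward its steady state even for time steps far beyond the explicit stability limit.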
Abstract: Electronics products that achieve high levels of integrated communications, computing, entertainment and multimedia features in small, stylish and robust new form factors are winning in the marketplace. Because of the high costs an industry may incur, and because high yield is directly proportional to high profit, IC (Integrated Circuit) manufacturers struggle to maximize yield; at the same time, today's customers demand miniaturization, low cost, high performance and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed to predict the assembly process. To evaluate the quality of upcoming circuits, yield models are used, which not only predict manufacturing costs but also provide vital information that eases the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors must be taken into consideration: boards, placement, components, the materials from which the components are made, and processes. Effective placement yield depends heavily on machine accuracy and on the vision system, which needs the ability to recognize the features on the board and component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package, and using solder ball locations, also called footprints. The only assumption a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that depends on repeated random sampling to compute results. The method is used here to simulate the placement and assembly processes within a production line.
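The Monte Carlo placement simulation can be sketched as follows: sample random placement offsets from the machine-accuracy distribution and count the boards on which every component lands within tolerance. All numerical values below are illustrative, not from the paper:

```python
import numpy as np

def placement_yield(n_boards=20_000, n_parts=50, sigma=0.03, tol=0.08, seed=0):
    """Monte Carlo estimate of assembly yield: each component's (x, y)
    placement offset is drawn from a Gaussian machine-accuracy
    distribution, and a board passes only if every component lands
    within the pad tolerance (all parameter values hypothetical)."""
    rng = np.random.default_rng(seed)
    offsets = rng.normal(0.0, sigma, size=(n_boards, n_parts, 2))
    miss = np.hypot(offsets[..., 0], offsets[..., 1]) > tol  # radial error
    return 1.0 - miss.any(axis=1).mean()                     # fraction of good boards
```

Repeating the estimate while varying `sigma` (machine accuracy) or `tol` (pad tolerance) shows how sharply board-level yield degrades as per-component placement error grows.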