Quality Evaluation of Ready to Eat Potatoes’ Produce in Flexible Packaging

Experiments were carried out at the Department of Food Technology, Latvia University of Agriculture. The aim of this work was to assess the effect of thermal treatment in flexible retort pouch packaging on the quality of potato produce during storage. Samples were evaluated immediately after retort thermal treatment and after 1, 2, 3, and 4 months of storage at an ambient temperature of +18±2 ºC, vacuum-packaged in polyamide/polyethylene (PA/PE) and aluminum/polyethylene (Al/PE) film pouches with barrier properties. The quality of the potato produce in dry butter and mushroom dressings was characterized experimentally by measuring pH, hardness, color, and microbiological properties, and by sensory evaluation. The sterilization was effective in protecting the produce from physical, chemical, and microbial quality degradation. The obtained data indicate that the selected processing technology and packaging materials can ensure product safety and quality over a four-month storage period.

Toward Community-Based Personal Cloud Computing

This paper proposes a new model of cloud computing for individual computer users to share applications within distributed communities, called community-based personal cloud computing (CPCC). The paper also presents a prototype design and implementation of CPCC. Users of CPCC are able to share their computing applications with other users of the community, and any member of the community is able to execute remote applications shared by other members. The remote applications behave in the same way as their local counterparts, allowing the user to enter input and receive output as well as providing access to the user's local data. CPCC provides a peer-to-peer (P2P) environment in which each peer provides applications that can be used by the other peers connected to CPCC.
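
The abstract above gives only the concept; as a rough, hypothetical illustration of the application-sharing idea (not the authors' actual prototype), the following Python sketch uses the standard xmlrpc module so that one peer exposes a shared application which another community member can invoke as if it were local.

```python
# Minimal, illustrative sketch of CPCC-style application sharing between two
# peers (not the paper's prototype). The hosting peer exposes an application
# as a remote procedure; any connected community member can invoke it.
import threading
import time
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer


def word_count(text):
    """A stand-in 'application' shared by the hosting peer."""
    return len(text.split())


def run_hosting_peer(port=8000):
    server = SimpleXMLRPCServer(("localhost", port), logRequests=False)
    server.register_function(word_count, "word_count")  # advertise the application
    server.serve_forever()


# Start the hosting peer in the background, then act as a consuming peer.
threading.Thread(target=run_hosting_peer, daemon=True).start()
time.sleep(0.5)  # give the hosting peer time to start listening

peer = ServerProxy("http://localhost:8000")               # another community member
print(peer.word_count("community based personal cloud"))  # executed remotely -> 4
```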

Systems with Queueing and their Simulation

In queueing theory, it is commonly assumed that customer arrivals follow a Poisson process and that service times are exponentially distributed. Under these assumptions, the behaviour of the queueing system can be described by means of Markov chains and the characteristics of the system can be derived analytically. In the paper, these theoretical approaches are presented for several types of systems, and it is also shown how to compute the characteristics when these assumptions are not satisfied.
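
As a minimal illustration of this kind of comparison (not taken from the paper), the sketch below simulates a single-server M/M/1 queue in Python via the Lindley recursion and checks the simulated mean waiting time in queue against the analytical value Wq = ρ/(μ - λ).

```python
# Minimal sketch (not from the paper): simulate an M/M/1 queue via the Lindley
# recursion and compare the simulated mean waiting time in queue with the
# analytical value Wq = rho / (mu - lambda).
import random

lam, mu, n_customers = 0.8, 1.0, 200_000
random.seed(1)

wait, total_wait = 0.0, 0.0
for _ in range(n_customers):
    interarrival = random.expovariate(lam)   # Poisson arrivals
    service = random.expovariate(mu)         # exponential service times
    # waiting time of the next arriving customer (Lindley recursion)
    wait = max(0.0, wait + service - interarrival)
    total_wait += wait

rho = lam / mu
print("simulated  Wq:", total_wait / n_customers)
print("analytical Wq:", rho / (mu - lam))
```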

A Novel Method for the Characterization of Synchronization and Coupling in Multichannel EEG and ECoG

In this paper we introduce a novel method for the characterization of synchronization and coupling effects in multivariate time series that can be used for the analysis of EEG or ECoG signals recorded during epileptic seizures. The method makes it possible to visualize the spatio-temporal evolution of the synchronization and coupling effects that are characteristic of epileptic seizures. Like other methods proposed for this purpose, our method is based on regression analysis. However, a more general definition of the regression, together with an effective channel selection procedure, allows the method to be used even for time series that are highly correlated, as is commonly the case in EEG/ECoG recordings with large numbers of electrodes. The method was tested experimentally on ECoG recordings of epileptic seizures from patients with temporal lobe epilepsy. A comparison with an independent visual inspection by clinical experts showed excellent agreement with the patterns obtained by the proposed method.
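
The regression-based coupling idea can be sketched roughly as follows; this is a simplified illustration (ordinary pairwise least squares and explained variance), not the more general regression or the channel-selection procedure described in the paper.

```python
# Simplified sketch of a regression-based coupling index between channels
# (illustrative only; the paper uses a more general regression definition and
# an explicit channel-selection step).
import numpy as np


def coupling_matrix(eeg: np.ndarray) -> np.ndarray:
    """eeg: array of shape (n_channels, n_samples). Returns the R^2 of
    predicting channel j from channel i for every ordered pair (i, j)."""
    n_ch = eeg.shape[0]
    r2 = np.zeros((n_ch, n_ch))
    for i in range(n_ch):
        for j in range(n_ch):
            if i == j:
                continue
            x, y = eeg[i], eeg[j]
            slope, intercept = np.polyfit(x, y, 1)        # ordinary least squares
            residual = y - (slope * x + intercept)
            r2[i, j] = 1.0 - residual.var() / y.var()     # explained variance
    return r2


# Toy data: channel 1 is a noisy copy of channel 0, channel 2 is independent.
rng = np.random.default_rng(0)
ch0 = rng.standard_normal(1000)
data = np.vstack([ch0,
                  0.9 * ch0 + 0.1 * rng.standard_normal(1000),
                  rng.standard_normal(1000)])
print(np.round(coupling_matrix(data), 2))
```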

Project Management Maturity Models and Organizational Project Management Maturity Model (OPM3®): A Critical Morphological Evaluation

There exists a strong correlation between efficient project management and competitive advantage for organizations. Therefore, organizations are striving to standardize and assess the rigor of their project management processes and capabilities, i.e., their project management maturity. Researchers and standardization bodies have developed several project management maturity models (PMMMs) to assess the project management maturity of organizations. This study presents a critical evaluation of some of the leading PMMMs against OPM3® in a multitude of ways in order to identify which PMMM is the most comprehensive model, that is, the one that can assess the most aspects of an organization and also help it gain competitive advantage over its competitors. After a detailed morphological analysis of the models, it is concluded that OPM3® is the most promising maturity model and can really provide a competitive advantage to organizations, owing to its unique approach to assessment and improvement strategies.

Evaluation of Algorithms for Sequential Decision in Biosonar Target Classification

A sequential decision problem, based on the task of identifying the species of trees given acoustic echo data collected from them, is considered with well-known stochastic classifiers, including single and mixture Gaussian models. Echoes are processed with a preprocessing stage based on a model of mammalian cochlear filtering, using a new discrete low-pass filter characteristic. Stopping-time performance of the sequential decision process is evaluated and compared. It is observed that the new low-pass filter processing results in faster sequential decisions.
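
A generic sketch of such a sequential decision (single-Gaussian class models and an accumulated log-likelihood ratio with SPRT-style stopping thresholds; the cochlear preprocessing and mixture models of the paper are omitted) might look like this in Python.

```python
# Generic sketch of a sequential decision with single-Gaussian class models
# (illustrative; the paper additionally uses mixture models and cochlear
# preprocessing of the echoes). Echo feature vectors arrive one at a time and
# the accumulated log-likelihood ratio is compared against stopping thresholds.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(0)
dim = 4
class_a = multivariate_normal(mean=np.zeros(dim), cov=np.eye(dim))
class_b = multivariate_normal(mean=0.7 * np.ones(dim), cov=np.eye(dim))

upper, lower = np.log(99), -np.log(99)   # SPRT-style thresholds (~1% error rates)
llr = 0.0
for t in range(1, 101):
    echo = class_a.rvs(random_state=rng)              # true class is A
    llr += class_a.logpdf(echo) - class_b.logpdf(echo)
    if llr >= upper:
        print(f"decided class A after {t} echoes")
        break
    if llr <= lower:
        print(f"decided class B after {t} echoes")
        break
else:
    print("no decision within 100 echoes")
```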

PTFE Capillary-Based DNA Amplification within an Oscillatory Thermal Cycling Device

This study describes a capillary-based device integrated with heating and cooling modules for the polymerase chain reaction (PCR). The device consists of a polytetrafluoroethylene (PTFE) reaction capillary and aluminum blocks, and is equipped with two cartridge heaters, a thermoelectric (TE) cooler, a fan, and thermocouples for temperature control. The cartridge heaters are placed in the heating blocks and maintained at two different temperatures to achieve the denaturation and extension steps. Thermocouples inserted into the capillary are used to obtain the transient temperature profiles of the reaction sample during thermal cycling. A 483-bp DNA template is amplified successfully both in the designed system and in a conventional thermal cycler. This work should be of interest to those working on high-temperature reactions, genomics, or cell analysis.

Screening and Evaluation of in vivo and in vitro Generated Insulin Plant (Vernonia divergens) for Antimicrobial and Anticancer Activities

Vernonia divergens Benth., commonly known as the “Insulin Plant” (Fam: Asteraceae), is a potent sugar killer. Locally, the leaves of the plant boiled in water are successfully administered to a large number of diabetic patients. The present study evaluates the putative anti-diabetic ingredients, isolated from in vivo and in vitro grown plantlets of V. divergens, for their antimicrobial and anticancer activities. Sterilized explants of nodal segments were cultured on MS (Murashige and Skoog, 1962) medium in the presence of different combinations of hormones. Multiple shoots along with bunches of roots were regenerated at 1 mg l-1 BAP and 0.5 mg l-1 NAA. Micro-plantlets were separated and sub-cultured on double-strength (2X) medium of the above hormone combination, leading to increased root and shoot length. These plantlets were successfully transferred to soil and survived well in nature. Ethanol extracts of plantlets from both in vivo and in vitro sources were prepared in a Soxhlet extractor and then concentrated to dryness under reduced pressure in a rotary evaporator. The concentrated extracts thus obtained showed significant inhibitory activity against Gram-negative bacteria such as Escherichia coli and Pseudomonas aeruginosa, but no inhibition was found against Gram-positive bacteria. Further, these ethanol extracts were screened for in vitro percentage cytotoxicity at different time periods (24 h, 48 h and 72 h) and at different dilutions. The in vivo plant extract inhibited the growth of EAC mouse cell lines in the range of 65, 66, 78, and 88% at 100, 50, 25, and 12.5 μg mL-1, respectively, but only at 72 h of treatment. For the extract of in vitro origin, inhibition of the EAC cell lines was found even at 48 h. During spectrophotometric scanning, the extracts exhibited different absorption maxima (λ): four peaks in the in vitro extract as against a single peak in the in vivo preparation, suggesting a possible change in the nature of the ingredients during micropropagation through tissue culture techniques.

A Parallel Algorithm for 2-D Cylindrical Geometry Transport Equation with Interface Corrections

In order to make the conventional implicit algorithm applicable on large-scale parallel computers, an interface prediction and correction scheme for the discontinuous finite element method is presented to solve the time-dependent neutron transport equation in 2-D cylindrical geometry. Domain decomposition is adopted over the computational domain. The numerical experiments show that our parallel algorithm with explicit prediction and implicit correction offers good precision, parallelism, and simplicity. In particular, it can reach perfect speedup even on hundreds of processors for large-scale problems.
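
The interface prediction/correction idea can be illustrated on a much simpler model problem; the sketch below uses a 1-D advection equation split into two subdomains (an illustrative analogy, not the 2-D cylindrical transport solver of the paper): the interface value is first predicted explicitly, each subdomain is then advanced implicitly and independently, and the downstream subdomain is finally corrected with the implicit interface value.

```python
# Toy 1-D illustration of the interface prediction/correction coupling on a
# model advection equation with two subdomains and implicit upwind in each.
# This only sketches the coupling strategy, not the 2-D transport solver.
import numpy as np

a, dx, dt = 1.0, 0.01, 0.02           # advection speed, grid spacing, time step
c = a * dt / dx                        # Courant number
n = 200                                # total cells
x = np.arange(n) * dx
u = np.exp(-((x - 0.5) ** 2) / 0.005)  # initial pulse centred at x = 0.5
mid = n // 2                           # interface between the two subdomains


def implicit_upwind(u_old, inflow):
    """Solve (1 + c) u_i^{n+1} - c u_{i-1}^{n+1} = u_i^n by a forward sweep."""
    u_new = np.empty_like(u_old)
    left = inflow
    for i in range(u_old.size):
        u_new[i] = (u_old[i] + c * left) / (1.0 + c)
        left = u_new[i]
    return u_new


for step in range(50):
    # 1) explicit prediction of the interface value seen by the right subdomain
    interface_pred = u[mid - 1] - c * (u[mid - 1] - u[mid - 2])
    # 2) both subdomains advance implicitly (independently, i.e. in parallel)
    left_new = implicit_upwind(u[:mid], inflow=0.0)
    right_new = implicit_upwind(u[mid:], inflow=interface_pred)
    # 3) correction: redo the right subdomain with the implicit interface value
    right_new = implicit_upwind(u[mid:], inflow=left_new[-1])
    u = np.concatenate([left_new, right_new])

print("pulse peak now near x =", round(np.argmax(u) * dx, 2))  # ~1.5 after t = 1
```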

Development of a Fiber based Interferometric Sensor for Non-contact Displacement Measurement

In this paper, a fiber-based Fabry-Perot interferometer is proposed and demonstrated for non-contact displacement measurement. A micro-prism attached to a mechanical vibrator serves as the target reflector. The interference signal is generated by the superposition of the sensing beam and the reference beam within the sensing arm of the fiber sensor. This signal is then converted into a displacement value, with a resolution of λ/8, by a purpose-developed program written in visual Cµ. A standard function generator is used to drive the vibrator. With the excitation frequency fixed at 100 Hz and the excitation amplitude varied from 0.1 to 3 V, the displacements measured by the fiber sensor range from 1.55 μm to 30.225 μm. A reference displacement sensor with a sensitivity of ~0.4 μm is also employed to compare the displacement errors between the two sensors. We found that, over the entire displacement range, the maximum and average measurement errors are 0.977% and 0.44%, respectively.
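
The λ/8 resolution is consistent with counting quarter-fringe events of the double-pass interference signal; the following sketch is only a guess at such a counting scheme (the abstract does not detail the actual program) and uses synthetic data.

```python
# Rough sketch of fringe counting at lambda/8 resolution (an assumed counting
# scheme, not the paper's program). In a double-pass interferometer the fringe
# phase is 4*pi*d/lambda, so each zero crossing or extremum of the fringe
# signal marks a displacement increment of lambda/8.
import numpy as np

wavelength = 1.55e-6                 # assumed source wavelength, m
true_displacement = 5.0e-6           # simulated target motion, m

t = np.linspace(0.0, 1.0, 20000)
d = true_displacement * t                         # linear ramp of the target
fringes = np.cos(4 * np.pi * d / wavelength)      # ideal interference signal

# count zero crossings and extrema (sign changes of the signal and of its slope)
zero_crossings = np.count_nonzero(np.diff(np.sign(fringes)) != 0)
extrema = np.count_nonzero(np.diff(np.sign(np.diff(fringes))) != 0)

estimate = (zero_crossings + extrema) * wavelength / 8
print(f"estimated displacement: {estimate * 1e6:.2f} um "
      f"(true {true_displacement * 1e6:.2f} um)")
```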

Cooperative Movements in Malaysia: The Issue of Governance

Cooperative organizations in Malaysia are going through a phase of rapid growth. They are seen by the government as another crucial vehicle to drive and boost the country's economic development and growth. Hence, the issue of cooperative governance is of great importance. Unlike the literature on corporate governance for public listed companies, the literature on governance for social enterprises, in particular cooperative organizations, is still at an early stage in Malaysia and very scant in number. This paper looks into current practices as well as issues and challenges related to cooperative governance. The need for a better solution, in the form of a best-practice cooperative governance framework, appears imperative for deterring cases of mismanagement and fraud.

Energy Loss at Drops using Neuro Solutions

Energy dissipation at drops has been investigated using physical models. After determining the effective parameters of the phenomenon, three drops with different heights were constructed from Plexiglas and installed in two existing flumes in the hydraulic laboratory. Several runs of the physical models were undertaken to measure the parameters required for determining the energy dissipation. Results showed that the energy dissipation at drops depends on the drop height and the discharge. Predicted relative energy dissipations varied from 10.0% to 94.3%. This work also indicated that the energy loss at a drop is mainly due to the mixing of the jet with the pool behind it, which causes air bubble entrainment in the flow. A statistical model developed to predict the energy dissipation at vertical drops indicates a nonlinear correlation between the effective parameters. Furthermore, an artificial neural network (ANN) approach was used in this paper to develop an explicit procedure for calculating the energy loss at drops using NeuroSolutions. The trained network was able to predict the response with an R2 of 0.977 and an RMSE of 0.0085. The performance of the ANN was found to be effective compared to regression equations in predicting the energy loss.
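
The ANN step can be reproduced in outline with any neural-network toolkit; the sketch below substitutes scikit-learn's MLPRegressor for NeuroSolutions and uses synthetic placeholder data (not the flume measurements) just to show the workflow of mapping drop height and discharge to relative energy dissipation.

```python
# Outline of the ANN step using scikit-learn's MLPRegressor as a stand-in for
# NeuroSolutions; the data are synthetic placeholders, not the measurements
# reported in the paper.
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
drop_height = rng.uniform(0.1, 0.6, 300)      # m (placeholder range)
discharge = rng.uniform(5.0, 50.0, 300)       # l/s (placeholder range)
# placeholder nonlinear relation standing in for the measured dissipation
rel_dissipation = (0.9 * drop_height / (drop_height + 0.002 * discharge)
                   + 0.02 * rng.standard_normal(300))

X = np.column_stack([drop_height, discharge])
X_tr, X_te, y_tr, y_te = train_test_split(X, rel_dissipation, random_state=0)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000,
                                   random_state=0))
model.fit(X_tr, y_tr)
pred = model.predict(X_te)
print("R2  :", r2_score(y_te, pred))
print("RMSE:", mean_squared_error(y_te, pred) ** 0.5)
```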

Hybrid Color-Texture Space for Image Classification

This work presents an approach for the construction of a hybrid color-texture space using mutual information. Feature extraction is performed with Laws filters, with an SVM (Support Vector Machine) as the classifier. The classification is applied to the VisTex database and to a SPOT HRV (XS) image representing two forest areas in the region of Rabat, Morocco. The classification results obtained in the hybrid space are compared with those obtained in the RGB color space.
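
A rough sketch of the Laws-filter / SVM pipeline is given below; it uses synthetic texture patches instead of the VisTex or SPOT data and omits the mutual-information construction of the hybrid color-texture space.

```python
# Rough sketch of Laws texture features fed to an SVM (synthetic patches stand
# in for the VisTex / SPOT data; the mutual-information feature selection that
# builds the hybrid color-texture space is not reproduced here).
import numpy as np
from scipy.ndimage import convolve, uniform_filter
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Classic 1-D Laws vectors; 2-D masks are their outer products.
L5 = np.array([1, 4, 6, 4, 1], float)
E5 = np.array([-1, -2, 0, 2, 1], float)
S5 = np.array([-1, 0, 2, 0, -1], float)
masks = [np.outer(a, b) for a in (L5, E5, S5) for b in (L5, E5, S5)]


def laws_features(patch: np.ndarray) -> np.ndarray:
    """Mean texture energy of each Laws mask response over the patch."""
    patch = patch - patch.mean()
    return np.array([uniform_filter(np.abs(convolve(patch, m)), size=15).mean()
                     for m in masks])


# Two synthetic texture classes: smooth noise vs. high-frequency stripes + noise.
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):
    for _ in range(40):
        patch = rng.standard_normal((64, 64))
        if label == 1:
            patch += 2.0 * np.sin(np.arange(64) * 0.8)[None, :]
        X.append(laws_features(patch))
        y.append(label)

X, y = np.array(X), np.array(y)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y,
                                          random_state=0)
clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```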

Unsteady Laminar Boundary Layer Forced Flow in the Region of the Stagnation Point on a Stretching Flat Sheet

This paper analyses the unsteady, two-dimensional stagnation point flow of an incompressible viscous fluid over a flat sheet when the flow is started impulsively from rest and, at the same time, the sheet is suddenly stretched in its own plane with a velocity proportional to the distance from the stagnation point. The partial differential equations governing the laminar boundary layer forced convection flow are non-dimensionalised using semi-similar transformations and then solved numerically using an implicit finite-difference scheme known as the Keller-box method. Results pertaining to the flow and heat transfer characteristics are computed for all dimensionless times, uniformly valid in the whole spatial region, without any numerical difficulties. Analytical solutions are also obtained for small and large times, representing the initial unsteady and the final steady state flow and heat transfer, respectively. Numerical results indicate that the velocity ratio parameter has a significant effect on the skin friction and the heat transfer rate at the surface. Furthermore, it is shown that there is a smooth transition from the initial unsteady state flow (small-time solution) to the final steady state (large-time solution).

Modeling Spatial Distributions of Point and Nonpoint Source Pollution Loadings in the Great Lakes Watersheds

A physically based, spatially distributed water quality model is being developed to simulate spatial and temporal distributions of material transport in the Great Lakes Watersheds of the U.S. Multiple databases of meteorology, land use, topography, hydrography, soils, agricultural statistics, and water quality were used to estimate nonpoint source loading potential in the study watersheds. Animal manure production was computed from tabulations of animals by zip code area for the census years 1987, 1992, 1997, and 2002. Relative chemical loadings for agricultural land use were calculated from fertilizer and pesticide estimates by crop for the same periods. Comparison of these estimates with the monitored total phosphorus load indicates that both point and nonpoint sources are major contributors to the total nutrient loads in the study watersheds, with nonpoint sources being the largest contributor, particularly in the rural watersheds. These estimates are used as input to the distributed water quality model for simulating pollutant transport through surface and subsurface processes to Great Lakes waters. Visualization and GIS interfaces are being developed to display the spatial and temporal distribution of the pollutant transport in support of water management programs.

Enhanced-Delivery Overlay Multicasting Scheme by Optimizing Bandwidth and Latency Discrepancy Ratios

With optimized bandwidth and latency discrepancy ratios, Node Gain Scores (NGSs) are determined and used as the basis for shaping a max-heap overlay. The NGSs, determined as the respective bandwidth-latency products, govern the construction of the max-heap-form overlay. Each NGS is earned as a synergy of the discrepancy ratio of the bandwidth requested with respect to the estimated available bandwidth, and the latency discrepancy ratio between the node and the source node. The tree leads to enhanced-delivery overlay multicasting, increasing packet delivery that could otherwise be hindered by the packet loss induced in schemes that do not consider the synergy of these parameters when placing nodes on the overlay. The NGS is a function of four main parameters: the estimated available bandwidth, Ba; the individual node's requested bandwidth, Br; the proposed node latency to its prospective parent, Lp; and the suggested best latency as advised by the source node, Lb. The bandwidth discrepancy ratio (BDR) and the latency discrepancy ratio (LDR) carry weights of α and (1,000 - α), respectively, with α chosen arbitrarily between 0 and 1,000 to ensure that the NGS values, used as node IDs, maintain a good possibility of uniqueness and a balance between the BDR and the LDR as the more critical factor. A max-heap-form tree is constructed under the assumption that all nodes possess an NGS smaller than that of the source node. To maintain load balance, children of each level's siblings are evenly distributed such that a node cannot accept a second child until all of its siblings able to do so have already acquired the same number of children; this is done logically from left to right in the conceptual overlay tree. Records of the pairwise approximate available bandwidths, as measured by the pathChirp scheme at individual nodes, are maintained. Evaluations have been conducted against other schemes: Bandwidth Aware multicaSt architecturE (BASE), Tree Building Control Protocol (TBCP), and Host Multicast Tree Protocol (HMTP). The new scheme generally performs better in terms of the trade-off between packet delivery ratio, link stress, control overhead, and end-to-end delay.
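
One plausible reading of the scoring described above (the abstract names the ingredients but not the exact formula, so the expression below is an illustrative assumption) is BDR = Ba/Br, LDR = Lb/Lp and NGS = α·BDR + (1,000 - α)·LDR, with the nodes then arranged in max-heap order below the source:

```python
# Illustrative guess at the NGS computation and max-heap ordering; the exact
# combination of BDR and LDR is an assumption, not the paper's stated formula.
import heapq

ALPHA = 600  # arbitrary weight in [0, 1000], as in the scheme's description


def ngs(ba, br, lp, lb, alpha=ALPHA):
    bdr = ba / br          # bandwidth discrepancy ratio
    ldr = lb / lp          # latency discrepancy ratio
    return alpha * bdr + (1000 - alpha) * ldr


# (Ba, Br, Lp, Lb) for a handful of joining nodes
nodes = {"n1": (4.0, 2.0, 30.0, 20.0),
         "n2": (1.5, 1.0, 80.0, 20.0),
         "n3": (6.0, 1.0, 25.0, 20.0)}

# Python's heapq is a min-heap, so scores are negated to obtain max-heap order.
heap = [(-ngs(*params), name) for name, params in nodes.items()]
heapq.heapify(heap)
while heap:
    score, name = heapq.heappop(heap)
    print(name, "NGS =", -score)
```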

Software Development Processes Maturity versus Software Processes and Products Measurement

The unsatisfactory effectiveness of software systems development and enhancement projects is one of the main reasons why software engineering attempts to draw on experience from other engineering disciplines. In spite of the specificity of software products and processes, the belief has emerged that software development could be more effective if these objects were subject to measurement, as is the case in other engineering disciplines for which measurement is an inherent feature. Thus, objective and reliable approaches to the measurement of software processes and products have been sought in software engineering for several decades already. This is evidenced, among other things, by the current version of the CMMI for Development model. This paper analyzes the approach to software process and product measurement proposed in the latest version of this model, indicating the growing acceptance of this issue in software engineering.

Dempster-Shafer Evidence Theory for Image Segmentation: Application in Cells Images

In this paper we propose a new knowledge model using Dempster-Shafer evidence theory for image segmentation and fusion. The proposed method consists essentially of two steps. First, the mass distributions of Dempster-Shafer theory are obtained from the membership degrees of each pixel over the three image components (R, G and B). Each membership degree is determined by applying Fuzzy C-Means (FCM) clustering to the gray levels of the three images. Second, the fusion process consists of defining three frames of discernment associated with the three images to be fused, and then combining them to form a new frame of discernment. The strategy used to define the mass distributions in the combined framework is discussed in detail. The proposed fusion method is illustrated in the context of image segmentation. Experimental investigations and comparative studies with previous methods are carried out, showing the robustness and superiority of the proposed method in terms of image segmentation.
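
The fusion step can be sketched as follows; the per-channel mass functions are written down directly here as placeholders (in the paper they are derived from the FCM membership degrees of the R, G and B components), and the combination uses Dempster's rule.

```python
# Minimal sketch of the fusion step: per-channel mass functions over two
# classes {c1, c2} (placeholder values here; in the paper they come from FCM
# membership degrees on the R, G and B channels) are combined with Dempster's
# rule. THETA denotes the whole frame of discernment (ignorance).
from itertools import product


def dempster_combine(m1, m2):
    """Combine two mass functions whose focal sets are given as frozensets."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {s: w / (1.0 - conflict) for s, w in combined.items()}


C1, C2 = frozenset({"c1"}), frozenset({"c2"})
THETA = C1 | C2

# Placeholder masses for one pixel, one mass function per color channel.
m_r = {C1: 0.6, C2: 0.3, THETA: 0.1}
m_g = {C1: 0.5, C2: 0.2, THETA: 0.3}
m_b = {C1: 0.7, C2: 0.2, THETA: 0.1}

fused = dempster_combine(dempster_combine(m_r, m_g), m_b)
print({tuple(sorted(s)): round(w, 3) for s, w in fused.items()})
```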

Review and Experiments on SDMSCue

In this work, I present a review of Sparse Distributed Memory for Small Cues (SDMSCue), a variant of Sparse Distributed Memory (SDM) that is capable of handling small cues. I then conduct and report cognitive experiments on SDMSCue to test its cognitive soundness compared to SDM. Small cues are input cues that are presented to the memory for reading associations but have many missing parts or fields. The original SDM fails to handle such cues; SDMSCue overcomes this pitfall. The main idea in SDMSCue is the repeated projection of the semantic space onto smaller subspaces that are selected based on the input cue length and pattern. This process allows read/write operations using an input cue that is missing a large portion. SDMSCue is augmented with the use of genetic algorithms for memory allocation and initialization. I claim that SDM functionality is a subset of SDMSCue functionality.
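
A simplified reconstruction of the subspace-projection idea (an illustrative sketch based on the description above, omitting the genetic-algorithm allocation and other details of the author's implementation) is given below: a small cue specifies only some bit positions, and hard locations are matched after projecting their addresses onto exactly those positions.

```python
# Illustrative sketch of the subspace-projection idea in SDMSCue (a simplified
# reconstruction, not the author's implementation). Hard locations are matched
# against a small cue after projecting their addresses onto the cue's known
# bit positions; reads and writes then use the usual SDM counters.
import numpy as np

rng = np.random.default_rng(0)
word_len, n_locations, radius_frac = 64, 500, 0.45

hard_addresses = rng.integers(0, 2, size=(n_locations, word_len))
counters = np.zeros((n_locations, word_len), dtype=int)


def active_locations(cue_bits, known_pos):
    """Select hard locations close to the cue in the projected subspace."""
    projected = hard_addresses[:, known_pos]
    distances = (projected != cue_bits).sum(axis=1)
    return distances <= radius_frac * known_pos.size


def write(cue_bits, known_pos, word):
    mask = active_locations(cue_bits, known_pos)
    counters[mask] += np.where(word == 1, 1, -1)


def read(cue_bits, known_pos):
    mask = active_locations(cue_bits, known_pos)
    return (counters[mask].sum(axis=0) > 0).astype(int)


# Store a word using its full address, then recall it from a small cue that
# specifies only the first 16 bit positions.
word = rng.integers(0, 2, word_len)
address = rng.integers(0, 2, word_len)
write(address, np.arange(word_len), word)

small_cue_positions = np.arange(16)
recalled = read(address[small_cue_positions], small_cue_positions)
print("bits recovered:", int((recalled == word).sum()), "of", word_len)
```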

Using the Monte Carlo Simulation to Predict the Assembly Yield

Electronic products that achieve high levels of integrated communications, computing, entertainment, and multimedia features in small, stylish, and robust new form factors are winning in the marketplace. Because of the high costs the industry may incur, and because high yield translates directly into high profit, IC (integrated circuit) manufacturers strive to maximize yield; at the same time, today's customers demand miniaturization, low cost, high performance, and excellent reliability, making yield maximization a never-ending search for an enhanced assembly process. With factors such as minimum tolerances and tighter parameter variations, a systematic approach is needed in order to predict the assembly process. To evaluate the quality of upcoming circuits, yield models are used that not only predict manufacturing costs but also provide vital information to ease the process of correction when yields fall below expectations. For an IC manufacturer to obtain higher assembly yields, all factors, such as boards, placement, components, the materials the components are made of, and processes, must be taken into consideration. Effective placement yield depends heavily on machine accuracy and on the vision system, which must be able to recognize the features on the board and the component in order to place the device accurately on the pads and bumps of the PCB. There are currently two methods for accurate positioning: using the edge of the package, and using solder ball locations, also called footprints. The only assumption that a yield model makes is that all boards and devices are completely functional. This paper focuses on the Monte Carlo method, a class of computational algorithms that relies on repeated random sampling to compute results. This method is used here to simulate the placement and assembly processes within a production line.
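
As a toy illustration of the Monte Carlo approach to placement yield (with made-up accuracy and tolerance figures, not data from an actual line), one can repeatedly sample random placement offsets and count the fraction of simulated assemblies in which every component lands within specification.

```python
# Toy Monte Carlo placement-yield estimate with made-up numbers: placement
# offsets in x and y are sampled from normal distributions representing
# machine accuracy, and an assembly passes if every component lands within
# the allowed radial offset from its pad centre.
import numpy as np

rng = np.random.default_rng(42)

n_boards = 100_000          # simulated assemblies
n_components = 50           # components per board
sigma_placement = 15e-6     # assumed placement accuracy (m, 1-sigma)
max_offset = 60e-6          # assumed maximum acceptable radial offset (m)

dx = rng.normal(0.0, sigma_placement, size=(n_boards, n_components))
dy = rng.normal(0.0, sigma_placement, size=(n_boards, n_components))
radial = np.hypot(dx, dy)

board_ok = (radial <= max_offset).all(axis=1)
print(f"estimated assembly yield: {board_ok.mean():.3%}")
```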