Abstract: Rainfall records from a rain gauge station, covering the
hourly rainfall intensity and total rainfall depth of five heavy
storms from 2001 to 2010, are examined. The rational formula is used
to investigate the flood peak duration that a flood detention pond
can sustain under different rainfall conditions. A stable flood
detention model is also proposed, based on system dynamics control
theory, to characterize the behavior of the flood detention pond.
When the one-hour rainfall exceeds the 100-year return period,
beyond the pond's 20-year design standard, the flood peak duration
of the detention pond is at most 1.7 hours, even when a pumping
system with a maximum drainage capacity of about 15.0 m3/s is
installed. If the peak inflow exceeds this maximum drainage
capacity, the flood peak duration is at most about 1.9 hours. The
flood detention pond is thus the key factor in stable drainage
control and flood prevention. The critical factors of flood disaster
are not only the total rainfall, but also the return period of heavy
storms over different rainfall durations and the detention capacity
of the flood detention system.
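The rational-formula check described above can be sketched in a few lines; the runoff coefficient, rainfall intensity and catchment area below are illustrative assumptions, not values from the study:

```python
def rational_peak_flow(c, intensity_mm_per_hr, area_ha):
    """Rational formula Q = 0.00278 * C * i * A.

    C: dimensionless runoff coefficient, i: rainfall intensity (mm/h),
    A: catchment area (ha); returns peak discharge in m^3/s.
    """
    return 0.00278 * c * intensity_mm_per_hr * area_ha

# Hypothetical urban catchment: C = 0.8, i = 100 mm/h, A = 50 ha.
q_peak = rational_peak_flow(0.8, 100.0, 50.0)

# Compare the inflow peak against an assumed pumping capacity of
# 15 m^3/s, as in the abstract: any inflow above the drainage
# capacity must be stored by the detention pond.
PUMP_CAPACITY = 15.0  # m^3/s
excess = max(0.0, q_peak - PUMP_CAPACITY)
```

Here the computed peak (about 11.1 m^3/s) stays below the assumed pumping capacity, so no excess storage is needed; a heavier design storm raises `q_peak` past 15 m^3/s and the pond absorbs the difference.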
Abstract: A property's selling price is described as the result of
sequential bargaining between a buyer and a seller in an environment
of asymmetric information. Hedonic housing prices are estimated
based upon 17,333 records of New Zealand residential properties
sold during the years 2006 and 2007.
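A hedonic price model of this kind regresses (log) sale price on property attributes. A minimal one-attribute sketch on synthetic data (the actual 17,333-record New Zealand dataset is not reproduced; the attribute and its 0.004 "true" coefficient are invented for illustration):

```python
import random

random.seed(0)
# Synthetic stand-in for hedonic data: log price driven by one
# attribute (floor area in m^2) plus noise.
n = 500
area = [random.uniform(80, 250) for _ in range(n)]
log_price = [11.0 + 0.004 * a + random.gauss(0, 0.05) for a in area]

# Ordinary least squares fit of the hedonic coefficient.
mean_a = sum(area) / n
mean_p = sum(log_price) / n
slope = (sum((a - mean_a) * (p - mean_p) for a, p in zip(area, log_price))
         / sum((a - mean_a) ** 2 for a in area))
intercept = mean_p - slope * mean_a
```

With enough observations the fitted `slope` recovers the implicit price of the attribute, which is the quantity a hedonic study interprets.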
Abstract: Transferring patient information between medical care
sites is necessary to deliver better patient care and to reduce
medical costs. The development of electronic medical records is
therefore an important worldwide trend. The Continuity of Care
Document (CCD) is a product of collaboration between the CDA and
CCR standards. In this study, we develop a system to generate
entry-level medical records based on the CCD template module.
Abstract: This paper presents a procedure for estimating vector
autoregressive (VAR) models using the Sequential Discounting VAR
(SDVAR) algorithm for online model learning, in order to detect
fraudulent acts in telecommunications call detail records (CDR).
The volatility of the VAR model is observed, allowing for
non-linearity, outliers and change points, based on the work of [1].
This paper extends that procedure from univariate to multivariate
time series. A simulation and a case study on detecting
telecommunications fraud from CDR illustrate the use of the
algorithm in the bivariate setting.
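A minimal sketch of online bivariate VAR(1) learning with a discounting factor, in the spirit of the SDVAR approach described above (the 0.97 discount, the ridge term and the squared-error score are illustrative assumptions, not the paper's exact algorithm):

```python
import random

def sdvar_scores(series, discount=0.97):
    """series: list of bivariate observations (e.g. per-period call
    counts and durations from CDR data). Returns one anomaly score
    (squared one-step prediction error) per transition."""
    sxx = [[1e-3, 0.0], [0.0, 1e-3]]    # discounted Gram matrix (ridged)
    sxy = [[0.0, 0.0], [0.0, 0.0]]      # discounted cross moments
    scores = []
    for t in range(1, len(series)):
        x, y = series[t - 1], series[t]
        det = sxx[0][0] * sxx[1][1] - sxx[0][1] * sxx[1][0]
        inv = [[sxx[1][1] / det, -sxx[0][1] / det],
               [-sxx[1][0] / det, sxx[0][0] / det]]
        # Coefficient estimate A = Sxy * inv(Sxx); predict y from x.
        a = [[sum(sxy[i][k] * inv[k][j] for k in range(2))
              for j in range(2)] for i in range(2)]
        pred = [a[i][0] * x[0] + a[i][1] * x[1] for i in range(2)]
        scores.append((y[0] - pred[0]) ** 2 + (y[1] - pred[1]) ** 2)
        for i in range(2):              # discounted statistics update
            for j in range(2):
                sxx[i][j] = discount * sxx[i][j] + x[i] * x[j]
                sxy[i][j] = discount * sxy[i][j] + y[i] * x[j]
    return scores

# Stable synthetic VAR(1) series with one injected fraud-like burst.
random.seed(1)
series = [(0.0, 0.0)]
for _ in range(199):
    px, py = series[-1]
    series.append((0.5 * px + random.gauss(0, 0.1),
                   0.3 * px + 0.4 * py + random.gauss(0, 0.1)))
series[150] = (5.0, 5.0)                # injected anomaly
scores = sdvar_scores(series)
```

The discounting factor makes older observations fade, so the model tracks gradual change while still flagging abrupt bursts with a large prediction-error score.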
Abstract: With optimized bandwidth and latency discrepancy ratios, Node Gain Scores (NGSs) are determined and used as the basis for shaping a max-heap overlay. Each NGS, computed as a bandwidth-latency product, governs the construction of the max-heap-form overlay: it combines the discrepancy ratio of the requested bandwidth with respect to the estimated available bandwidth, and the latency discrepancy ratio between a node and the source node. The resulting tree leads to enhanced-delivery overlay multicasting, increasing packet delivery that could otherwise be hindered by the packet loss induced in schemes that do not consider the joint effect of these parameters when placing nodes on the overlay. The NGS is a function of four main parameters: the estimated available bandwidth, Ba; the individual node's requested bandwidth, Br; the proposed node latency to its prospective parent, Lp; and the suggested best latency advised by the source node, Lb. The bandwidth discrepancy ratio (BDR) and latency discrepancy ratio (LDR) carry weights of α and (1,000 − α), respectively, with α arbitrarily chosen between 0 and 1,000 to ensure that the NGS values, used as node IDs, remain likely to be unique while balancing whichever of the BDR and LDR is the more critical factor. A max-heap-form tree is constructed under the assumption that all nodes have an NGS lower than the source node's. To maintain load balance, the children of each level's siblings are evenly distributed: a node cannot accept a second child until all of its siblings that are able to do so have acquired the same number of children, proceeding logically from left to right in the conceptual overlay tree. Records of the pair-wise approximate available bandwidths, as measured by a pathChirp scheme at individual nodes, are maintained.
Evaluations comparing the scheme with Bandwidth Aware multicaSt architecturE (BASE), Tree Building Control Protocol (TBCP), and Host Multicast Tree Protocol (HMTP) have been conducted. The new scheme generally performs better in the trade-off among packet delivery ratio, link stress, control overhead, and end-to-end delay.
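A compact sketch of the NGS computation and the balanced, max-heap-style placement described above (the exact BDR/LDR ratio forms, the fan-out cap of 2 and α = 600 are illustrative assumptions, not values fixed by the scheme):

```python
def node_gain_score(ba, br, lp, lb, alpha=600):
    """NGS as a weighted combination of the bandwidth discrepancy
    ratio (BDR) and the latency discrepancy ratio (LDR), weighted
    alpha and (1000 - alpha) as in the abstract."""
    bdr = ba / br           # estimated available vs. requested bandwidth
    ldr = lb / lp           # advised best latency vs. proposed latency
    return alpha * bdr + (1000 - alpha) * ldr

def build_overlay(source, ngs, fanout=2):
    """Place nodes (id -> NGS) below the source in max-heap form:
    higher-NGS nodes sit nearer the source, and each level's siblings
    receive children evenly from left to right."""
    order = sorted(ngs, key=ngs.get, reverse=True)
    children = {source: []}
    level, i = [source], 0
    while i < len(order):
        next_level = []
        for _ in range(fanout):          # one child per sibling per pass
            for parent in level:
                if i == len(order):
                    break
                child = order[i]; i += 1
                children.setdefault(child, [])
                children[parent].append(child)
                next_level.append(child)
        level = next_level
    return children

tree = build_overlay("src", {"a": 900, "b": 800, "c": 700,
                             "d": 600, "e": 500})
```

Because nodes are placed in descending NGS order, every child's score is below its parent's (the heap property), and the round-robin pass keeps sibling child counts within one of each other.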
Abstract: Frequent machine breakdowns, low plant availability and increased overtime are a great threat to a manufacturing plant, as they increase its operating costs. The main aim of this study was to improve Overall Equipment Effectiveness (OEE) at a manufacturing company through the implementation of innovative maintenance strategies. A case study approach was used: the paper focuses on improving maintenance in a manufacturing set-up using an innovative mix of maintenance regimes to improve overall equipment effectiveness. Interviews, reviews of documentation and historical records, and direct and participatory observation were used as data collection methods. Production is normally measured by the total kilowatts of motors produced per day; the target at 91% availability is 75 kilowatts a day. Reduced demand and a lack of raw materials, particularly imported items, are adversely affecting the manufacturing operations: the company had to reset its target from the usual 250 kilowatts per day to a mere 75 kilowatts per day owing to the lower availability of machines caused by breakdowns as well as material shortages. Price reductions and uncertainties, together with general machine breakdowns, further lowered production. Several recommendations were made. For instance, employee empowerment would give staff the responsibility and authority to improve operations and eliminate the six big losses. If the maintenance department is to fulfil its proper function in a progressive, innovative industrial society, its personnel must be continuously trained to meet current needs as well as future requirements. To make the maintenance planning system effective, it is essential to keep track of all corrective maintenance jobs and preventive maintenance inspections; for large processing plants these cannot be handled manually. It was therefore recommended that the company implement a Computerised Maintenance Management System (CMMS).
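OEE itself is the product of the availability, performance and quality rates. A minimal sketch of the arithmetic; only the 91% availability and the 75-of-250 kW output appear in the abstract, while treating that output ratio as the performance rate and the 0.98 quality figure are assumptions for illustration:

```python
def oee(availability, performance, quality):
    """Standard OEE definition: product of the availability,
    performance and quality rates (each expressed as a fraction)."""
    return availability * performance * quality

availability = 0.91        # from the abstract
performance = 75 / 250     # actual vs. design daily output (assumed proxy)
quality = 0.98             # assumed first-pass yield
score = oee(availability, performance, quality)
```

Even with high availability and quality, the collapsed output ratio dominates the product, which is why the six big losses targeted by the maintenance regime matter.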
Abstract: As is well known, buoyancy and drag forces govern a bubble's rise velocity in a liquid column. These forces are strongly dependent on the fluid properties and gravity, as well as on the bubble's equivalent diameter. This study reports a set of bubble rise velocity experiments in a liquid column filled with water or glycerol, from which several records of terminal velocity were obtained. The results show that a bubble's terminal rise velocity depends strongly on dynamic viscosity. The data set spans terminal velocities of 8.0–32.9 cm/s over a Reynolds number range of 1.3–7490. The bubbles' motion was recorded with a video camera. The main goal is to present an original data set and results discussed in the light of two-phase flow theory. The prediction of the terminal velocity of a single bubble in liquid, and the range of applicability of such predictions, are also discussed. In conclusion, this study presents general expressions for determining the terminal velocity of isolated gas bubbles over a range of Reynolds numbers when the fluid properties are known.
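For the low end of the reported Reynolds-number range, the buoyancy-drag balance reduces to a Stokes-type expression. A sketch; the glycerol properties below are textbook-style assumptions, not this study's measured values, and the formula holds only for Re << 1:

```python
def stokes_terminal_velocity(d, mu, rho_l, rho_g=1.2, g=9.81):
    """Stokes-regime terminal rise velocity of a small spherical
    bubble, v = g*d^2*(rho_l - rho_g)/(18*mu), valid only for
    Re << 1 (the low end of the abstract's Re = 1.3-7490 range).
    d: equivalent diameter (m), mu: dynamic viscosity (Pa.s),
    rho_l/rho_g: liquid/gas densities (kg/m^3)."""
    return g * d ** 2 * (rho_l - rho_g) / (18.0 * mu)

# 1 mm bubble in glycerol (assumed mu ~ 1.0 Pa.s, rho ~ 1260 kg/m^3).
v = stokes_terminal_velocity(1e-3, 1.0, 1260.0)
re = 1260.0 * v * 1e-3 / 1.0    # Reynolds number consistency check
```

The quadratic dependence on diameter and inverse dependence on viscosity make the strong viscosity effect reported in the abstract immediate; at higher Re a drag-coefficient correlation replaces the Stokes drag.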
Abstract: Despite many success stories of manufacturing safety, many organizations are still reluctant to invest in it, perceiving it as cost-increasing and time-consuming. A clear contributor may be the use of lagging indicators rather than leading-indicator measures. This study therefore proposes a combinatorial model for determining the best safety strategy. Combinatorial theory and cost-benefit analysis were employed to develop a monetary saving/loss function in terms of the value of prevention and the cost of the prevention strategy. Documentation, interviews and a structured questionnaire were used to collect before-and-after safety programme records from a tobacco company for the periods 1993-2001 (pre-safety) and 2002-2008 (safety period) for the model application. Three combinatorial alternatives, A, B and C, were obtained, yielding 4, 6 and 4 strategies respectively, with PPE and training being predominant. A total of 728 accidents were recorded over the 9-year pre-safety period, against 163 accidents over the 7-year safety period. Six prevention activities (alternative B) yielded the best results, and all years of the safety programme followed this pattern except 2004. The study provides a leading resource for planning a successful safety programme.
Abstract: The aim of this study was to remove the two principal
noises that disturb the surface electromyography (EMG) signal of
the diaphragm: the electrocardiogram (ECG) artefact and the
power-line interference artefact. The proposed algorithm is based
on a Widrow adaptive Least Mean Squares (LMS) structure. Such
structures require a reference signal that is correlated with the
noise contaminating the signal. The noise references are extracted
as follows: for the power-line interference, a noise reference is
constructed mathematically from two cosine functions, at 50 Hz
(the fundamental) and 150 Hz (the first harmonic); for the ECG
artefact, the estimate is obtained with a matching pursuit
technique combined with an LMS structure. Both removal procedures
are achieved without the use of supplementary electrodes. These
filtering techniques are validated on real records of surface
diaphragm electromyography signals, and the performance of the
proposed methods is compared with previously published results.
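The power-line branch of this approach can be illustrated with a small LMS canceller whose reference inputs are sinusoids at 50 Hz and its 150 Hz harmonic; the sampling rate, step size and test signal below are invented for the demo, not taken from the study:

```python
import math

def lms_cancel(signal, fs=1000.0, freqs=(50.0, 150.0), mu=0.01):
    """Widrow-style LMS noise canceller: the references are
    cosine/sine pairs at the power-line fundamental and first
    harmonic, and the error output is the cleaned signal."""
    w = [0.0] * (2 * len(freqs))   # in-phase + quadrature weight per tone
    clean = []
    for n, x in enumerate(signal):
        t = n / fs
        ref = []
        for f in freqs:
            ref.append(math.cos(2 * math.pi * f * t))
            ref.append(math.sin(2 * math.pi * f * t))
        y = sum(wi * ri for wi, ri in zip(w, ref))  # interference estimate
        e = x - y                                   # cleaned sample
        w = [wi + 2 * mu * e * ri for wi, ri in zip(w, ref)]
        clean.append(e)
    return clean

# Demo: a slow 5 Hz "EMG-like" component buried under 50 Hz hum.
fs = 1000.0
sig = [math.sin(2 * math.pi * 5 * n / fs)
       + 2.0 * math.cos(2 * math.pi * 50 * n / fs) for n in range(4000)]
clean = lms_cancel(sig, fs)
```

After the weights converge, the canceller notches only a narrow band around each reference tone, so the uncorrelated EMG-like component passes through while the hum is subtracted.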
Abstract: This paper proposes the use of Bayesian belief
networks (BBN) as a higher level of health risk assessment for
the dumping site of a lead battery smelter factory. On the basis
of epidemiological studies, actual hospital attendance records and
expert experience, the BBN is capable of capturing the
probabilistic relationships between the hazardous substances and
their adverse health effects, and accordingly of inferring the
morbidity of those adverse health effects. The provision of
morbidity rates for the related diseases is more informative and
can alleviate the drawbacks of conventional methods.
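The inference pattern can be illustrated with a toy two-node network, exposure to a hazardous substance and one adverse health effect; the probability values are invented placeholders, not figures from the study:

```python
# Toy two-node BBN: Exposure (lead level high/low) -> Anaemia.
# All CPT values below are hypothetical placeholders.
p_exposure = {"high": 0.3, "low": 0.7}
p_anaemia_given = {"high": 0.25, "low": 0.05}  # P(anaemia | exposure)

# Predictive inference: marginal morbidity, summing out exposure.
morbidity = sum(p_anaemia_given[e] * p_exposure[e] for e in p_exposure)

# Diagnostic (Bayes) inference: P(exposure = high | anaemia observed).
posterior_high = p_anaemia_given["high"] * p_exposure["high"] / morbidity
```

This is exactly the quantity the abstract highlights: the network turns conditional-probability knowledge (from epidemiology, hospital records and experts) into a morbidity rate, and it can also be queried in the diagnostic direction.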
Abstract: We address the problem of creating a seismic alert
system based on artificial neural networks, trained with the
well-known back-propagation and genetic algorithms, in order to
issue an alarm to the population of a specific city about an
imminent earthquake of magnitude greater than 4.5 on the Richter
scale, thereby helping to avoid disasters and human losses.
Instead of using the propagating wave itself, we employ the
magnitude of the earthquake to establish a correlation between the
magnitudes recorded in a controlled area and in the city where the
alarm is to be issued. To measure the accuracy of the proposed
method, we use a database provided by CIRES, which contains the
records of 2,500 earthquakes originating in the State of Guerrero
and Mexico City. In particular, we applied the proposed method to
generate warnings in Mexico City using the magnitudes recorded in
the State of Guerrero.
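As a toy illustration of the learning setup, a single logistic unit trained by gradient descent on synthetic magnitudes can stand in for the paper's back-propagation/GA-trained networks (the CIRES data are not reproduced; the threshold behaviour is the only feature modelled):

```python
import math, random

def sigmoid(z):
    # Numerically safe logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

# Synthetic training set: magnitude recorded in the controlled area,
# labelled 1 when an alarm should be issued (magnitude > 4.5).
random.seed(0)
data = [(m, 1.0 if m > 4.5 else 0.0)
        for m in (random.uniform(3.0, 6.5) for _ in range(400))]

w, b, lr = 0.0, 0.0, 0.1
for _ in range(300):                  # plain gradient-descent epochs
    for m, target in data:
        p = sigmoid(w * m + b)
        grad = p - target             # dL/dz for cross-entropy loss
        w -= lr * grad * m
        b -= lr * grad

def alarm(magnitude):
    """True when the trained unit predicts an alarm-worthy quake."""
    return sigmoid(w * magnitude + b) > 0.5
```

The trained decision boundary settles near the 4.5 threshold; a full multi-layer network (or GA-evolved weights, as in the paper) follows the same predict-and-correct loop with more parameters.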
Abstract: The purpose of this study was to examine postpartum breastfeeding mothers and determine the impact that their psychosocial and spiritual dimensions play in promoting full-term (6-month duration) breastfeeding of their infants. Purposive and snowball sampling methods were used to identify and recruit the study's participants. A total of 23 postpartum mothers, who were breastfeeding within 6 weeks after giving birth, participated in this study. In-depth interviews combined with observations, participant focus groups, and ethnographic records were used for data collection. The data were then analyzed using content analysis and typology. The results of this study illustrated that postpartum mothers experienced fear and worry that they would lack support from their spouse, family and peers, and that their infant would not get enough milk. It was found that the main barrier mothers faced in breastfeeding to full term was the difficulty of continuing to breastfeed after returning to work. 81.82% of the primiparous mothers and 91.67% of the non-primiparous mothers were able to breastfeed for the desired full term of 6 months. Factors found to be related to breastfeeding for six months included 1) belief and faith in breastfeeding, 2) support from spouse and family members, and 3) counseling from public health nurses and friends. The sample also provided evidence that religious principles such as tolerance, effort, love and compassion toward their infant, together with positive thinking, were used in solving their physical, mental and spiritual problems.
Abstract: This article applies the monthly final energy yield and
failure data of 202 PV systems installed in Taiwan to analyze PV
operational performance and system availability. The data were
collected by the Industrial Technology Research Institute through
manual records. Bad-data detection and failure-data estimation
approaches are proposed to guarantee the quality of the received
information. The performance ratio and system availability are
then calculated and compared with those of other countries. The
results indicate that the average performance ratio of Taiwan's PV
systems is 0.74 and the availability is 95.7%, figures similar to
those of Germany, Switzerland, Italy and Japan.
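The performance-ratio computation follows the standard final-yield over reference-yield definition. A sketch; the monthly figures below are invented to show the arithmetic (chosen so the result matches the reported 0.74 average):

```python
def performance_ratio(energy_kwh, p_rated_kwp, irradiation_kwh_m2,
                      g_stc=1.0):
    """Performance ratio: final yield (kWh per kWp of rated power)
    divided by reference yield (in-plane irradiation over the
    1 kW/m^2 standard test condition)."""
    final_yield = energy_kwh / p_rated_kwp
    reference_yield = irradiation_kwh_m2 / g_stc
    return final_yield / reference_yield

def availability(uptime_hours, total_hours):
    """Fraction of the period during which the system operated."""
    return uptime_hours / total_hours

# Illustrative monthly record for one hypothetical 5 kWp system:
# 555 kWh generated under 150 kWh/m^2 of in-plane irradiation.
pr = performance_ratio(energy_kwh=555.0, p_rated_kwp=5.0,
                       irradiation_kwh_m2=150.0)
```

A ratio of 0.74 means the fleet delivers 74% of the energy an ideal, loss-free system would produce under the same irradiation, which is what makes the cross-country comparison meaningful.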
Abstract: Primary and secondary data from the Bauchi abattoir were used to determine the relative contributions of different livestock species to meat supply in the Bauchi Metropolis. Daily livestock slaughter figures for five months (June–October 2011) indicated that, on average, more goats (64.0) were slaughtered per day than either sheep (47.3) or cattle (41.3) (P
Abstract: This paper introduces a tool being developed for the expression of the information security policy controls that govern electronic healthcare records. With reference to published findings, the paper introduces the theory behind the use of knowledge management for automatic and consistent security policy assertion, using a formalism called the Secutype; the development of the tool and its functionality are discussed; some examples of Secutypes generated by the tool are provided; and the proposed integration with existing medical record systems is described. The paper concludes with a section on further work and a critique of the work achieved to date.
Abstract: The world economic crises and budget constraints
have caused authorities, especially those in developing countries, to
rationalize water quality monitoring activities. Rationalization
consists of reducing the number of monitoring sites, the number of
samples, and/or the number of water quality variables measured. The
reduction in water quality variables is usually based on correlation. If
two variables exhibit high correlation, it is an indication that some of
the information produced may be redundant. Consequently, one
variable can be discontinued, and the other continues to be measured.
Later, the ordinary least squares (OLS) regression technique is
employed to reconstitute information about the discontinued variable
by using the continuously measured one as an explanatory variable. In
this paper, two record extension techniques are employed to
reconstitute information about discontinued water quality variables,
the OLS and the Line of Organic Correlation (LOC). An empirical
experiment is conducted using water quality records from the Nile
Delta water quality monitoring network in Egypt. The record
extension techniques are compared for their ability to predict
different statistical parameters of the discontinued variables. Results
show that the OLS is better at estimating individual water quality
records. However, results indicate an underestimation of the variance
in the extended records. The LOC technique is superior in preserving
characteristics of the entire distribution and avoids underestimation
of the variance. It is concluded from this study that the OLS can be
used for the substitution of missing values, while LOC is preferable
for inferring statements about the probability distribution.
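The two estimators differ only in their slope. A sketch on synthetic data showing the variance property discussed above (the data are simulated, not the Nile Delta records):

```python
import math, random

def _mean_sd(v):
    m = sum(v) / len(v)
    return m, math.sqrt(sum((a - m) ** 2 for a in v) / len(v))

def fit(x, y, method="ols"):
    """OLS slope is r*sy/sx; the Line of Organic Correlation (LOC)
    uses sign(r)*sy/sx, which preserves the variance of the
    extended record instead of shrinking it by r."""
    mx, sx = _mean_sd(x)
    my, sy = _mean_sd(y)
    r = sum((a - mx) * (b - my) for a, b in zip(x, y)) / (len(x) * sx * sy)
    slope = r * sy / sx if method == "ols" else math.copysign(sy / sx, r)
    return slope, my - slope * mx

# Simulated pair of correlated water-quality series (x continued,
# y discontinued and to be reconstituted from x).
random.seed(0)
x = [random.gauss(10.0, 2.0) for _ in range(300)]
y = [1.5 * xi + random.gauss(0.0, 1.5) for xi in x]

s_ols, i_ols = fit(x, y, "ols")
s_loc, i_loc = fit(x, y, "loc")
var_y = _mean_sd(y)[1] ** 2
var_ols = _mean_sd([s_ols * xi + i_ols for xi in x])[1] ** 2  # shrunk
var_loc = _mean_sd([s_loc * xi + i_loc for xi in x])[1] ** 2  # preserved
```

Since |r| < 1, the OLS-extended record has variance r² times that of the original series, while the LOC-extended record reproduces it, mirroring the abstract's conclusion about which method suits individual-value substitution versus distributional inference.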
Abstract: In this paper, a novel multi-join algorithm to join
multiple relations is introduced. The algorithm is based on a
hash-based join algorithm for two relations that produces a double
index. This is done by scanning the two relations once; but instead
of moving the records into buckets, a double index is built,
eliminating the collisions that can occur in a complete hash
algorithm. The double index is divided into join buckets of similar
categories from the two relations, and the algorithm then joins
buckets with matching keys to produce joined buckets. This
ultimately yields a complete join index of the two relations without
actually joining them. The time complexity required to build the
join index of two categories is O(m log m), where m is the size of
each category, giving a total time complexity of O(n log m) over all
buckets. The join index is used to materialize the joined relation
if required; otherwise, it is used together with the join indices of
other relations to build a lattice for multi-join operations with
minimal I/O requirements. The lattice of join indices can be fitted
into main memory, reducing the time complexity of the multi-join
algorithm.
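The core idea of indexing row-id pairs instead of materialising the join can be sketched as follows; this is an illustrative reconstruction, not the paper's exact data structures (the per-key sort stands in for the O(m log m) per-category cost):

```python
from collections import defaultdict

def build_join_index(r, s, key_r=0, key_s=0):
    """Scan each relation once, grouping row positions by join key
    (one 'bucket' per key category), then pair buckets with equal
    keys into a join index of row-id pairs -- without materialising
    the joined relation."""
    idx_r, idx_s = defaultdict(list), defaultdict(list)
    for i, rec in enumerate(r):
        idx_r[rec[key_r]].append(i)
    for j, rec in enumerate(s):
        idx_s[rec[key_s]].append(j)
    return {k: [(i, j) for i in sorted(idx_r[k]) for j in sorted(idx_s[k])]
            for k in idx_r if k in idx_s}

def materialise(r, s, join_index):
    """Materialise the joined relation from the index only if needed."""
    return [r[i] + s[j] for pairs in join_index.values() for i, j in pairs]

r = [(1, "a"), (2, "b"), (1, "c")]
s = [(1, "x"), (3, "y")]
jx = build_join_index(r, s)
```

Because the index stores only integer row-id pairs per key, several such indices over shared relations can be kept in memory and intersected for multi-join planning, deferring any I/O-heavy materialisation to the end.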
Abstract: With the explosive growth of information sources available on the World Wide Web, it has become increasingly difficult to identify relevant pieces of information, since web pages are often cluttered with irrelevant content such as advertisements, navigation panels and copyright notices surrounding the main content of the page. Hence, tools for the mining of data regions, data records and data items need to be developed in order to provide value-added services. Currently available automatic techniques for mining data regions from web pages are still unsatisfactory because of their poor performance and tag dependence. In this paper, a novel method to extract data items from web pages automatically is proposed. It comprises two steps: (1) identification and extraction of data regions based on visual-clue information, and (2) identification of data records and extraction of data items from a data region. For step 1, a novel and more effective method is proposed that finds the data regions formed by all types of tags using visual clues. For step 2, a more effective method, namely Extraction of Data Items from web Pages (EDIP), is adopted to mine data items; EDIP is a list-based approach in which the list is a linear data structure. The proposed technique is able to mine non-contiguous data records and can correctly identify data regions irrespective of the type of tag in which they are bound. Our experimental results show that the proposed technique performs better than the existing techniques.
Abstract: The sanitary sewerage connection rate has become an
important indicator of advanced cities. Following the construction of
sanitary sewerages, the maintenance and management systems are
required for keeping pipelines and facilities functioning well. These
maintenance tasks often require sewer workers to enter the manholes
and the pipelines, which are confined spaces lacking natural
ventilation and full of hazardous substances. Sewer workers are thus
easily exposed to a risk of adverse health effects. This paper
proposes the use of Bayesian belief networks (BBN) as a higher level
of noncarcinogenic health risk assessment of sewer workers. On the
basis of the epidemiological studies, the actual hospital attendance
records and expert experiences, the BBN is capable of capturing the
probabilistic relationships between the hazardous substances in sewers
and their adverse health effects, and accordingly inferring the
morbidity and mortality of the adverse health effects. The provision of
the morbidity and mortality rates of the related diseases is more
informative and can alleviate the drawbacks of conventional methods.
Abstract: Oxygen and carbon isotope records of multi-species planktonic and benthic foraminifera and of bulk carbonate samples from Central Java, Indonesia, demonstrate that warm sea surface temperatures occurred during the Miocene. Planktonic δ18O values from this study are consistently lighter (−4 to −3 ‰ PDB) than those of previous studies, indicating that the Miocene sea surface temperature in this area was warmer than in other tropical/equatorial localities. A marked decrease in oxygen isotopic composition was recorded at about 14 Ma, where the lightest δ18O values are −4.87 ‰ PDB for Orbulina universa, −5.02 ‰ PDB for Globigerinoides sacculifer and −4.30 ‰ PDB for Globoquadrina dehiscens; we interpret this event as the Middle Miocene Climatic Optimum. We interpret the warming of the sea surface as related to the development of the Western Pacific Warm Pool, with warm water flowing from the Pacific Ocean through the Indonesian seaway appearing to persist during the Miocene. Our results also show a sudden increase in the oxygen isotope values of the planktonic, benthic and bulk carbonate samples from about 12 Ma; we relate this cooling of the surface water to the Late Miocene global cooling, or predict that it was due to the closing of the Indonesian Gateway.
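The temperature reading of such δ18O values can be illustrated with a classic quadratic paleotemperature calibration (a Shackleton-type form of the Epstein equation); the seawater δ18O value below is an assumed ice-free-ocean figure, not one determined by this study:

```python
def d18o_temperature(d18o_calcite, d18o_water=-1.2):
    """Paleotemperature (deg C) from foraminiferal calcite d18O using
    the classic quadratic calibration T = 16.9 - 4.38*D + 0.10*D^2,
    with D = d18o_calcite - d18o_water (both in per mil). The
    seawater value of -1.2 per mil is an assumed ice-free-ocean
    figure, not from this study."""
    d = d18o_calcite - d18o_water
    return 16.9 - 4.38 * d + 0.10 * d * d

# Lightest Orbulina universa value reported in the abstract.
t_warm = d18o_temperature(-4.87)
```

Lighter (more negative) calcite values map to warmer water, which is why the −4 to −3 ‰ PDB planktonic values, and the −4.87 ‰ excursion near 14 Ma, read as warm-pool conditions; the absolute temperature depends strongly on the assumed seawater composition.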