Abstract: This paper explores the development and optimization of a method and apparatus for retrieving an extended high dynamic range from a digital negative image. Architectural photography can benefit from high dynamic range imaging (HDRI) techniques, which preserve and present sufficient luminance in the shadow and highlight clipping areas of an image. The conventional HDRI technique, which requires multiple exposure images as the source for HDRI rendering, may not be time-efficient during the acquisition process and the post-processing stage, given the numerous imaging variables and technical limitations involved in the multiple exposure process. This paper explores an experimental method and apparatus that aim to expand the dynamic range of a digital negative image in an HDRI environment. The method and apparatus are based on a single RAW image acquisition for use in HDRI post-processing. This optimization avoids or minimizes the conventional HDRI photographic errors caused by changing physical conditions during the photographing process and by the misalignment of multiple exposed image sequences. The study observes the characteristics and capabilities of the RAW image format, used as a digital negative, for the retrieval of an extended high dynamic range in an HDRI environment.
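The single-RAW idea above, deriving several virtual exposures from one linear capture and merging the un-clipped samples back into a radiance estimate, can be sketched as follows. The merge rule, stop values, and function names are illustrative assumptions, not the paper's method.

```python
# Sketch: derive virtual exposure brackets from one linear RAW value,
# then merge them into an HDR radiance estimate (illustrative only).

def virtual_exposure(linear, ev):
    """Simulate an exposure shifted by ev stops, clipped at white."""
    return min(1.0, linear * (2.0 ** ev))

def merge_hdr(linear, evs):
    """Average the un-clipped virtual exposures back to scene radiance."""
    samples = []
    for ev in evs:
        v = virtual_exposure(linear, ev)
        if 0.0 < v < 1.0:                 # discard clipped samples
            samples.append(v / (2.0 ** ev))
    return sum(samples) / len(samples) if samples else linear

radiance = merge_hdr(0.3, [-2, 0, 2])
```

Because all brackets come from the same capture, the usual ghosting and misalignment of a real exposure sequence cannot occur, which is the time-efficiency argument the abstract makes.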
Abstract: Fine alignment of the mechanisms of main ship power plants
and of shaft lines provides long-term, failure-free performance of the
propulsion system, while fast, high-quality installation of
mechanisms and shaft lines decreases overall labor intensity.
Checking the allowed stresses of a shaft line and setting its alignment
require calculations that consider various stages of the life
cycle. In 2012, JSC SSTC developed the special software complex
"Shaftline" for alignment calculations, with its own I/O
interface and display of a 3D model of the shaft line. Aligning a shaft line
according to bearing loads is a rather labor-intensive procedure. In order to
decrease its duration, JSC SSTC developed an automated alignment
system for the mechanisms of ship power plants. The system's operating
principle is based on automatic simulation of the design loads on the bearings.
Initial data for shaft line alignment can be exported to the automated
alignment system from the PC "Shaftline".
Abstract: This paper addresses the fundamental requirements for
starting an online business. It covers the process of ideation,
conceptualization, formulation, and implementation of new venture
ideas on the Web. Using Facebook as an illustrative example, we learn
how to turn an idea into a successful electronic business and to execute
a business plan with IT skills, management expertise, a good
entrepreneurial attitude, and an understanding of Internet culture. The
personality traits and characteristics of a successful e-commerce
entrepreneur are discussed with reference to Facebook's founder,
Mark Zuckerberg. Facebook is a social and e-commerce success. It
provides a trusted environment in which participants can conduct
business with a social experience. People are able to discuss products
before, during, and after the sale within the Facebook environment. The
paper also highlights the challenges and opportunities for e-commerce
entrepreneurial startups going public and entering the Chinese market.
Abstract: Program slicing is the task of finding all statements in
a program that directly or indirectly influence the value of a variable
occurrence. The set of statements that can affect the value of a
variable at some point in a program is called a program backward
slice. In several software engineering applications, such as program
debugging and measuring program cohesion and parallelism, several
slices are computed at different program points. Existing
algorithms for computing program slices are designed to compute a
slice at a single program point. In these algorithms, the program, or the
model that represents the program, is traversed completely or
partially once. To compute more than one slice, the same algorithm
is applied at every point of interest in the program; thus, the same
program, or program representation, is traversed several times.
In this paper, an algorithm is introduced to compute all forward
static slices of a computer program by traversing the program
representation graph once. Therefore, the introduced algorithm is
useful for software engineering applications that require computing
program slices at different points of a program. The program
representation graph used in this paper is called Program Dependence
Graph (PDG).
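The conventional one-slice-per-point computation described above amounts to reverse reachability over dependence edges. The sketch below shows that baseline; the graph encoding and node names are illustrative, not the paper's representation, and the paper's single-pass all-slices algorithm is not reproduced here.

```python
from collections import deque

# A toy PDG as adjacency lists: an edge points from a statement to the
# statements it data- or control-depends on. The backward slice at a
# point is everything reachable from it along these edges.
pdg = {
    "s4": ["s2", "s3"],   # s4 uses values defined at s2 and s3
    "s3": ["s1"],
    "s2": ["s1"],
    "s1": [],
}

def backward_slice(pdg, point):
    """Compute one backward slice by BFS over dependence edges."""
    seen, work = {point}, deque([point])
    while work:
        n = work.popleft()
        for dep in pdg.get(n, []):
            if dep not in seen:
                seen.add(dep)
                work.append(dep)
    return seen
```

Calling this once per point of interest re-traverses the graph each time, which is exactly the repeated cost the paper's single-traversal algorithm is meant to avoid.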
Abstract: In this paper we study the capability of a static
compensator for reactive power (STATCOM) to stabilize the voltage under motor
loading in power networks. After discussing the necessity of FACTS devices for compensation in power networks, we explain the structure and main function of the STATCOM and a method to control it using the STATCOM transformer current with simultaneous prediction. We then study the topology and control system used to stabilize the
voltage during the start of an induction motor. Simulation results obtained with MATLAB support the control idea and
system presented in this paper.
Abstract: One of the significant factors for improving the
accuracy of Land Surface Temperature (LST) retrieval is the correct
understanding of the directional anisotropy for thermal radiance. In
this paper, the multiple scattering effect between heterogeneous
non-isothermal surfaces is described rigorously according to the
concept of configuration factor, based on which a directional thermal
radiance model is built, and the directional radiant character for urban
canopy is analyzed. The model is applied to a simple urban canopy
with row structure to simulate the change of Directional Brightness
Temperature (DBT). The results show that the DBT is increased
by the multiple scattering effects, whereas the range of variation
of the DBT is smoothed. The temperature difference, spatial
distribution, and emissivity of the components can all lead to changes
in the DBT. A "hot spot" phenomenon occurs when the proportion of
the high-temperature component in the field of view reaches its
maximum; conversely, a "cool spot" phenomenon occurs when the
low-temperature proportion reaches its maximum. The "spot" effect
disappears only when the proportion of every component remains
constant. The model
built in this paper can be used for the study of directional effect on
emissivity, the LST retrieval over urban areas and the adjacency effect
of thermal remote sensing pixels.
Abstract: In this study, an inland metropolitan area, Gwangju, in Korea was selected to assess the amplification potential of earthquake motion and to provide information for regional seismic countermeasures. A geographic information system-based expert system was implemented for reliably predicting the spatial geotechnical layers in the entire region of interest by building a geo-knowledge database. The database consists of existing boring data gathered from prior geotechnical projects and surface geo-knowledge data acquired from site visits. For practical application of the geo-knowledge database to estimating the earthquake hazard potential related to site amplification effects in the study area, seismic zoning maps of geotechnical parameters, such as bedrock depth and site period, were created within the GIS framework. In addition, seismic zonation of site classification was performed to determine the site amplification coefficients for seismic design at any site in the study area. Keywords: Earthquake hazard, geo-knowledge, geographic information system, seismic zonation, site period.
Abstract: Problem-based learning (PBL) is a student-centered
approach and has been adopted by a number of higher
education institutions in many parts of the world as a method of
delivery. This paper presents a creative thinking approach for
implementing problem-based learning in Mechanics of Structure
within a Malaysian polytechnic environment. In the learning
process, students learn how to analyze the given problems together,
share classroom knowledge, and put it into practice. Further,
through this course's emphasis on problem-based learning, students
acquire creative thinking skills and professional skills as they tackle
complex, interdisciplinary, real-situation problems. Once
creative ideas are generated, additional techniques are useful for
nurturing those ideas so that they grow into productive concepts or solutions.
The combination of creative skills and technical abilities will enable
students to "hit the ground running" and be productive in
industry when they graduate.
Abstract: Three-dimensional geometric models have been used
to present architectural and engineering works, showing their final
configuration. When the clarification of a detail or the constitution of
a construction step is needed, these models are not appropriate. They
do not allow the observation of the construction progress of a
building. Models that can dynamically present changes in the
building geometry are a good support for the elaboration of projects.
Techniques of geometric modeling and virtual reality were used to
obtain models that could visually simulate the construction activity.
The applications explain the construction work of a cavity wall and a
bridge. These models allow the visualization of the physical
progression of the work following a planned construction sequence,
the observation of details of the form of every component of the
works and support the study of the type and method of operation of
the equipment applied in the construction. These models present
distinct advantages as educational aids in first-degree courses in Civil
Engineering. The use of Virtual Reality techniques in the
development of educational applications brings new perspectives to
the teaching of subjects related to the field of civil construction.
Abstract: In any trust model, the two information sources that a peer relies on to predict the trustworthiness of another peer are direct experience and reputation. These two vital components evolve over time. Trust evolution is an important issue, where the objective is to observe a sequence of past values of a trust parameter and determine the future estimates. Unfortunately, trust evolution algorithms have received little attention, and the algorithms proposed in the literature do not comply with the conditions and the nature of trust. This paper contributes to this important problem in the following ways: (a) it presents an algorithm that manages and models trust evolution in a P2P environment, (b) it devises new mechanisms for effectively maintaining trust values based on the conditions that influence trust evolution, and (c) it introduces a new methodology for incorporating trust-nurture incentives into the trust evolution algorithm. Simulation experiments are carried out to evaluate our trust evolution algorithm.
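One common way to make past experience "evolve" over time is to age observations exponentially, so that recent interactions dominate the estimate. The sketch below illustrates that generic idea only; it is not the paper's algorithm, and the half-life parameter and neutral prior are assumptions.

```python
import math

def evolved_trust(observations, half_life=10.0, now=None):
    """Time-weighted trust estimate: recent interactions count more.

    observations: list of (timestamp, score in [0, 1]).
    Illustrative aging rule, not the paper's trust evolution algorithm.
    """
    if not observations:
        return 0.5                       # neutral prior for strangers
    if now is None:
        now = max(t for t, _ in observations)
    lam = math.log(2) / half_life        # decay rate from the half-life
    num = den = 0.0
    for t, score in observations:
        w = math.exp(-lam * (now - t))   # older samples weigh less
        num += w * score
        den += w
    return num / den
```

With a 10-unit half-life, an observation 10 time units old carries half the weight of a fresh one, so a peer's recent behavior can outvote a longer but stale history.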
Abstract: An overview of the important aspects of managing
and controlling industrial effluent discharges to public sewers namely
sampling, characterization, quantification and legislative controls has
been presented. The findings have been validated by means of a case
study covering three industrial sectors namely, tanning, textile
finishing and food processing industries. Industrial effluent
discharges were found to be best monitored by systematic and
automatic sampling and quantified using water meter readings
corrected for evaporative and consumptive losses. Based on the
treatment processes employed in the publicly owned treatment works
and the chemical oxygen demand and biochemical oxygen demand
levels obtained, the effluents from all three industrial sectors
studied were found to lie in the toxic zone. Thus, physico-chemical
treatment of these effluents is required to bring them into the
biodegradable zone. KL values (quoted to base e) were greater than
0.50 day^-1, compared to 0.39 day^-1 for typical municipal
wastewater.
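The KL comparison above follows standard first-order BOD kinetics, BOD_t = L0 (1 - e^(-KL t)), where L0 is the ultimate BOD and KL the base-e rate constant. A small sketch with the quoted rate constants (the unit ultimate BOD is an illustrative normalization):

```python
import math

def bod_exerted(L0, kL, t_days):
    """Oxygen demand exerted after t days, first-order kinetics.

    L0: ultimate BOD; kL: base-e rate constant in day^-1.
    """
    return L0 * (1.0 - math.exp(-kL * t_days))

# 5-day BOD fraction for the industrial effluents (kL > 0.50 day^-1)
# versus typical municipal wastewater (kL = 0.39 day^-1):
industrial = bod_exerted(1.0, 0.50, 5)   # ~0.92 of ultimate BOD
municipal = bod_exerted(1.0, 0.39, 5)    # ~0.86 of ultimate BOD
```

The higher KL means the industrial effluents exert their demand faster, consistent with the toxic-zone classification drawn from the COD/BOD levels.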
Abstract: In the highly competitive and rapidly changing global
marketplace, independent organizations and enterprises often come
together and form a temporary alignment of virtual enterprise in a
supply chain to better provide products or service. As firms adopt the
systems approach implicit in supply chain management, they must
manage the quality from both internal process control and external
control of supplier quality and customer requirements. How to
incorporate quality management of upstream and downstream supply
chain partners into their own quality management system has recently
received a great deal of attention from both academia and practice.
This paper investigates the collaborative features and entity
relationships in a supply chain, and presents an ontology of
collaborative supply chain from an approach of aligning
service-oriented framework with service-dominant logic. This
perspective facilitates the segregation of material flow management
from manufacturing capability management, which provides a
foundation for the coordination and integration of the business process
to measure, analyze, and continually improve the quality of products,
services, and processes. Further, this approach characterizes the different
interests of supply chain partners, providing an innovative way to
analyze the collaborative features of a supply chain. Furthermore, this
ontology is the foundation for developing a quality management system
that internalizes quality management in upstream and
downstream supply chain partners and manages quality in the supply
chain systematically.
Abstract: In this paper we will develop a sequential life test approach applied to a modified low alloy-high strength steel part used in highway overpasses in Brazil. We will consider two possible underlying sampling distributions: the Normal and the Inverse Weibull models. The minimum life will be considered equal to zero. We will use the two underlying models to analyze a fatigue life test situation, comparing the results obtained from both. Since a major chemical component of this low alloy-high strength steel part has been changed, there is little information available about the possible values that the parameters of the corresponding Normal and Inverse Weibull underlying sampling distributions could have. To estimate the shape and the scale parameters of these two sampling models we will use a maximum likelihood approach for censored failure data. We will also develop a truncation mechanism for the Inverse Weibull and Normal models. We will provide rules to truncate a sequential life testing situation making one of the two possible decisions at the moment of truncation; that is, accept or reject the null hypothesis H0. An example will develop the proposed truncated sequential life testing approach for the Inverse Weibull and Normal models.
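The accept/reject structure of a truncated sequential test can be illustrated with Wald's classical sequential probability ratio test. The boundary formulas below are standard, but the nearest-boundary tie-breaking rule at truncation is an assumption for illustration, not the paper's procedure.

```python
import math

def sprt(samples, logpdf0, logpdf1, alpha=0.05, beta=0.10, max_n=30):
    """Wald's SPRT with truncation (illustrative, not the paper's rules).

    Accumulates the log-likelihood ratio of H1 vs H0; if neither
    boundary is crossed by max_n observations, the decision goes to
    the nearer boundary (an assumed truncation rule).
    """
    upper = math.log((1 - beta) / alpha)   # crossing -> reject H0
    lower = math.log(beta / (1 - alpha))   # crossing -> accept H0
    llr = 0.0
    for i, x in enumerate(samples, 1):
        llr += logpdf1(x) - logpdf0(x)
        if llr >= upper:
            return "reject H0"
        if llr <= lower:
            return "accept H0"
        if i >= max_n:
            break
    return "accept H0" if abs(llr - lower) < abs(llr - upper) else "reject H0"
```

For two unit-variance normal means (0 under H0, 1 under H1) the per-sample increment simplifies to x - 0.5, so consistently large observations drive the statistic toward rejection within a few samples.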
Abstract: In this paper, we propose the use of convolutional codes
for file dispersal. The proposed method is comparable in complexity
to the Information Dispersal Algorithm (IDA) proposed by M. Rabin
and, for particular choices of (non-binary) convolutional codes, is
almost as efficient as that algorithm in terms of controlling expansion in the
total storage. Further, our proposed dispersal method allows string
search.
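Rabin's IDA, the baseline the abstract compares against, can be sketched as polynomial evaluation and interpolation over a prime field: a chunk of m symbols yields n shares, any m of which rebuild it. GF(257), the chunk size, and the evaluation points are illustrative choices; the paper's convolutional-code construction is not shown here.

```python
# Sketch of Rabin-style information dispersal over GF(257),
# the smallest prime above the byte range. Illustrative only.
P = 257

def disperse(chunk, n):
    """Evaluate the chunk-polynomial at x = 1..n to make n shares."""
    return [sum(c * pow(x, i, P) for i, c in enumerate(chunk)) % P
            for x in range(1, n + 1)]

def rebuild(points, m):
    """Recover the chunk from m (x, share) pairs by solving the
    Vandermonde system A c = y (mod P) with Gauss-Jordan elimination."""
    A = [[pow(x, i, P) for i in range(m)] + [y % P] for x, y in points[:m]]
    for col in range(m):
        piv = next(r for r in range(col, m) if A[r][col])
        A[col], A[piv] = A[piv], A[col]
        inv = pow(A[col][col], P - 2, P)         # Fermat inverse
        A[col] = [v * inv % P for v in A[col]]
        for r in range(m):
            if r != col and A[r][col]:
                f = A[r][col]
                A[r] = [(v - f * w) % P for v, w in zip(A[r], A[col])]
    return [row[m] for row in A]

chunk = [7, 42, 100]          # three data symbols
shares = disperse(chunk, 5)   # five shares; any three rebuild the chunk
```

Since any m of the n shares suffice, up to n - m shares may be lost; total storage expands by the factor n/m, which is the expansion both IDA and the proposed convolutional-code scheme aim to control.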
Abstract: This paper presents a framework for
organizational knowledge management, which seeks to deploy a
standardized structure for the integrated management of knowledge
through a common language based on domains, processes and global
indicators, inspired by the COBIT 5 framework (ISACA, 2012), and
which supports the integration of three technologies: enterprise
information architecture (EIA), business process modeling (BPM)
and service-oriented architecture (SOA). The Gomak Framework is a
management platform that seeks to integrate the information
technology infrastructure, the application structure, the information
infrastructure, and the business logic and business model to support a
sound strategy of organizational knowledge management, a
process-based approach, and concurrent engineering. Concurrent
engineering (CE) is a systematic approach to integrated product
development that responds to customer expectations, involving all
perspectives in parallel, from the beginning of the product life cycle
(European Space Agency, 2000).
Abstract: Automated production lines with so-called 'hard structures' are widely used in manufacturing. Designers segment these lines into sections by placing buffers between series of machine tools to increase productivity. In real production conditions the capacity of a buffer system is limited, so a real production line can compensate for only part of the productivity losses of an automated line. The productivity of such production lines cannot be readily determined. This paper presents a mathematical approach to determining the structure of section-based automated production lines by the criterion of maximum productivity.
Abstract: The drastic increase in the usage of SMS technology
has led service providers to seek a solution that enables users of
mobile devices to access services through SMSs. This has resulted in
the proposal of solutions towards SMS-based service invocation in
service-oriented environments. However, the dynamic nature of
service-oriented environments, coupled with sudden load peaks
generated by service requests, poses performance challenges to
infrastructures supporting SMS-based service invocation. To
address this problem we adopt load balancing techniques. A load
balancing model with adaptive load balancing and load monitoring
mechanisms as its key constructs is proposed. The load balancing
model then led to the realization of the Least Loaded Load Balancing
Framework (LLLBF). Evaluation of LLLBF, benchmarked against the round
robin (RR) scheme in the queuing approach, showed that LLLBF
outperformed RR in terms of response time and throughput.
However, LLLBF achieved the better result at the cost of higher
processing power.
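The least-loaded policy at the heart of LLLBF can be sketched in a few lines next to the round robin baseline. Server counts and load values here are illustrative; the framework's actual load metrics and monitoring mechanisms are not modeled.

```python
import itertools

def least_loaded(loads):
    """Index of the server with the smallest current load."""
    return min(range(len(loads)), key=loads.__getitem__)

def dispatch(loads, n_requests):
    """Assign each incoming request to the currently least-loaded
    server, updating the load counts as requests arrive."""
    loads = list(loads)
    assigned = []
    for _ in range(n_requests):
        i = least_loaded(loads)
        loads[i] += 1
        assigned.append(i)
    return assigned

def round_robin(n):
    """RR baseline: cycle through servers regardless of load."""
    return itertools.cycle(range(n))
```

Unlike RR, least-loaded dispatch steers bursts away from already-busy servers, but it pays for that with per-request load inspection, mirroring the processing-power trade-off the evaluation reports.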
Abstract: In this work we present a solution for DAGC (Digital
Automatic Gain Control) in WLAN receivers compatible with the IEEE
802.11a/g standards. Those standards define communication in the
5/2.4 GHz bands using the Orthogonal Frequency Division Multiplexing
(OFDM) modulation scheme. The WLAN transceiver that we have used
enables gain control over a Low Noise Amplifier (LNA) and a
Variable Gain Amplifier (VGA). The control over those signals is
performed in our digital baseband processor by a dedicated hardware
block, the DAGC. The DAGC automatically controls the VGA and LNA
in order to achieve a better signal-to-noise ratio, decrease the FER
(Frame Error Rate), and hold the average power of the baseband signal
close to the desired set point. The DAGC function in the baseband
processor is performed in a few steps: measuring the power levels of
baseband samples of an RF signal, accumulating the differences between
the measured power level and the actual gain setting, adjusting a gain
factor of the accumulation, and applying the adjusted gain factor to
the baseband values. Based on measurements of the dependence of the
RSSI signal on input power, we have concluded that this digital AGC
can be implemented by applying a simple linearization of the RSSI.
This solution is simple yet effective, and reduces the complexity and
power consumption of the DAGC. The DAGC was implemented and tested
both in FPGA and in ASIC as a part of our WLAN baseband processor.
Finally, we have integrated this circuit into a compact WLAN PCMCIA
board based on MAC and baseband ASIC chips designed by us.
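The measure, accumulate, adjust, apply sequence described above can be sketched as a first-order gain control loop. The loop gain and set point below are illustrative values, not the hardware block's actual parameters, and the dB-domain arithmetic is a simplification of the fixed-point implementation.

```python
import math

def dagc_step(samples, gain_db, setpoint_db, loop_gain=0.25):
    """One DAGC iteration: measure the average power of the baseband
    samples, compare it with the set point, and accumulate a fraction
    of the error into the gain (illustrative first-order loop)."""
    power = sum(s * s for s in samples) / len(samples)
    power_db = 10.0 * math.log10(power) if power > 0 else -100.0
    error_db = setpoint_db - power_db       # positive -> signal too weak
    return gain_db + loop_gain * error_db   # adjusted gain factor

def apply_gain(samples, gain_db):
    """Apply the adjusted gain factor to the baseband values."""
    g = 10.0 ** (gain_db / 20.0)            # dB -> linear amplitude
    return [g * s for s in samples]
```

Iterating the step drives the measured power toward the set point; a small loop gain trades convergence speed for stability, which is the usual AGC design choice.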
Abstract: This study aims to investigate empirically the value
relevance of accounting information to domestic investors on the
Tehran Stock Exchange from 1999 to 2006. The impacts of two
factors, positive vs. negative earnings and
firm size, are considered as well. The authors used earnings per
share and the annual change in earnings per share as the income
statement indices, and the book value of equity per share as the balance
sheet index. Return and price models estimated through regression analysis are
deployed in order to test the research hypothesis. The results show
that accounting information is value-relevant to domestic investors
on the Tehran Stock Exchange according to both studied models.
However, income statement information has more value relevance
than balance sheet information. Furthermore, positive vs. negative
earnings and firm size seem to have a significant impact on the value
relevance of accounting information.
Abstract: In recent years, several security architectures have been
proposed for sensor networks [2][4]. One of these, TinySec, by Chris
Karlof, Naveen Sastry, and David Wagner, proposed a link-layer security
architecture that considers the constraints of sensor networks (i.e.,
energy, bandwidth, computation capability, etc.). TinySec employs the
CBC mode of encryption and CBC-MAC for authentication, based on
the Skipjack block cipher. Currently, TinySec is incorporated in
TinyOS for sensor network security.
This paper introduces TinyHash, based on general hash algorithms.
TinyHash is a module intended to replace the authentication and
integrity parts of TinySec; that is, it applies a hash algorithm within
the TinySec architecture. For compatibility with TinySec, the
components of TinyHash are constructed with a structure similar to
that of TinySec. TinyHash implements an HMAC component for
authentication and a Digest component for the integrity of messages.
Additionally, we define some interfaces for services associated with
the hash algorithm.
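The role of an HMAC component like TinyHash's can be illustrated with Python's standard hmac module. SHA-1 is used here only for brevity; TinyHash's actual hash choice, key handling, and nesC interfaces are not reproduced, and the key and message values are illustrative.

```python
import hashlib
import hmac

# Authenticate a sensor message with a shared key, then verify it.
key = b"shared-sensor-key"            # illustrative pre-shared key
msg = b"temp=21.4;node=07"            # illustrative radio payload

tag = hmac.new(key, msg, hashlib.sha1).digest()   # 20-byte MAC

def verify(key, msg, tag):
    """Recompute the MAC and compare in constant time."""
    expected = hmac.new(key, msg, hashlib.sha1).digest()
    return hmac.compare_digest(expected, tag)
```

A receiver holding the same key accepts only messages whose tag verifies, which is the authentication and integrity service the HMAC and Digest components provide at the link layer.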