Abstract: Collaborative research has become more prevalent and important across disciplines because it stimulates innovation and interaction among scholars. Because existing studies have largely disregarded the institutional conditions that trigger collaborative research, this work analyzes changing trends in collaborative work patterns among Korean social scientists. The focus is the performance of social scientists who received research grants through the government’s Social Science Korea (SSK) program. Using quantitative statistical methods, collaborative research patterns in 2,354 papers published under the umbrella of the SSK program in peer-reviewed scholarly journals from 2013 to 2016 were examined to identify changing trends and triggering factors in collaborative research. A notable finding is that the share of collaborative research is overwhelmingly higher than that of individual research. In particular, the level of collaborative research surpassed 70%, increasing much more quickly than in other areas of social science research. The most common composition of collaborative research was two or three researchers publishing jointly as coauthors, and this proportion has also increased steadily. Finally, a strong association between international journals and co-authorship patterns was found for the papers published by SSK program researchers from 2013 to 2016. The SSK program can be seen as the driving force behind collaboration among social scientists: its emphasis on competition through a merit-based financial support system, along with a rigorous evaluation process, appears to have encouraged researchers to cooperate with those who share similar research interests.
Abstract: This paper integrates the Octagon and Square Search
pattern (OCTSS) motion estimation algorithm into the H.264/AVC
(Advanced Video Coding) video codec in Adaptive Group of Pictures
(AGOP) mode. The AGOP structure is computed based on scene changes
in the video sequence, and the OCTSS block-based motion estimation
method is implemented in the inter-prediction process of H.264/AVC.
Together, these methods reduce bit rate and computational
complexity while maintaining the quality of the video sequence.
Experiments were conducted on different types of video
sequences. The results show that the bit rate,
computation time, and PSNR gain achieved by the proposed method
are better than those of the existing H.264/AVC with fixed GOP and AGOP.
With a marginal quality gain of 0.28 dB and an average bit-rate gain
of 132.87 kbps, the proposed method reduces the average computation
time by 27.31 minutes compared with the existing state-of-the-art
H.264/AVC video codec.
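The abstract's AGOP idea — starting a new GOP at each scene change rather than at a fixed interval — can be sketched as follows. This is a minimal illustration, not the paper's actual method: the mean-absolute-difference detector, the `threshold` value, and the `max_gop` cap are all assumptions made for the example.

```python
import numpy as np

def detect_scene_changes(frames, threshold=30.0):
    """Flag frame indices whose mean absolute difference from the
    previous frame exceeds `threshold` (a hypothetical tuning value)."""
    cuts = []
    for i in range(1, len(frames)):
        mad = np.mean(np.abs(frames[i].astype(np.int16) -
                             frames[i - 1].astype(np.int16)))
        if mad > threshold:
            cuts.append(i)
    return cuts

def adaptive_gop_boundaries(num_frames, cuts, max_gop=30):
    """Start a new GOP (an I-frame) at every detected scene cut, and
    also every `max_gop` frames so a GOP never grows unbounded."""
    boundaries = [0]
    for i in range(1, num_frames):
        if i in cuts or i - boundaries[-1] >= max_gop:
            boundaries.append(i)
    return boundaries
```

For example, a sequence of ten dark frames followed by ten bright frames yields a single cut at frame 10, so the sketch produces two GOPs instead of one fixed-length group spanning the cut.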
Abstract: In this paper, a fast motion estimation method for
H.264/AVC named Triplet Search Motion Estimation (TS-ME) is
proposed. Like traditional fast motion estimation
methods and their improved variants, which restrict the search
to a few selected candidate points to reduce computational
complexity, the proposed algorithm separates the motion search process
into several steps, but with some new features. First, the algorithm
searches for the true motion area using the proposed triplet patterns
instead of a few selected search points, to avoid falling into a local
minimum. Then, within the localized motion area, a novel three-step
motion search is performed. The proposed search patterns are categorized
into three rings based on their distance from the search center, and
these rings are adaptively selected by referencing the surrounding
motion vectors so that the motion search can terminate early. In
addition, computation reduction for sub-pixel motion search is
discussed, based on the appearance probability of the sub-pixel
motion vector. Simulation results show that motion estimation speed
improves by a factor of up to 38 with the proposed algorithm compared
to the H.264/AVC reference software, with negligible picture
quality loss.
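The ring idea described above — grouping candidate search points by distance from the search center and skipping the outer rings when neighboring motion vectors predict small motion — can be illustrated with a toy block-matching search. This is a sketch under stated assumptions, not TS-ME itself: the `RINGS` offsets, the `early_stop` SAD threshold, and the `predicted_small_motion` flag are hypothetical stand-ins for the paper's triplet patterns and neighbor-based ring selection.

```python
import numpy as np

# Hypothetical candidate offsets grouped into three "rings" by
# distance from the search center (not the paper's actual patterns).
RINGS = [
    [(0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)],            # inner
    [(2, 0), (-2, 0), (0, 2), (0, -2), (2, 2), (-2, -2)],  # middle
    [(4, 0), (-4, 0), (0, 4), (0, -4), (4, 4), (-4, -4)],  # outer
]

def sad(block, ref, y, x):
    """Sum of absolute differences between `block` and the
    same-sized candidate region of `ref` at (y, x)."""
    h, w = block.shape
    cand = ref[y:y + h, x:x + w]
    return int(np.abs(block.astype(np.int32) - cand.astype(np.int32)).sum())

def ring_search(block, ref, cy, cx,
                predicted_small_motion=True, early_stop=100):
    """Evaluate rings inner-to-outer; when neighboring motion vectors
    predict small motion, stop as soon as the best SAD falls below
    `early_stop` (a hypothetical threshold) without visiting outer rings."""
    h, w = block.shape
    best_mv, best_cost = (0, 0), float("inf")
    for ring in RINGS:
        for dy, dx in ring:
            y, x = cy + dy, cx + dx
            if 0 <= y <= ref.shape[0] - h and 0 <= x <= ref.shape[1] - w:
                cost = sad(block, ref, y, x)
                if cost < best_cost:
                    best_cost, best_mv = cost, (dy, dx)
        if predicted_small_motion and best_cost < early_stop:
            break  # early termination: skip the remaining rings
    return best_mv, best_cost
```

The early break is the computational win: for mostly static content, only the inner ring is ever evaluated, which is the same intuition behind the up-to-38x speedup the abstract reports for the full algorithm.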