
Current Issue Contents

2010, Issue 5    Publication date: 2010-05-01
Scalable high performance de-duplication backup via hash join
Tian-ming Yang, Dan Feng, Zhong-ying Niu, Ya-ping Wan
Front. Inform. Technol. Electron. Eng., 2010, 11(5): 315-327.   https://doi.org/10.1631/jzus.C0910445
Apart from high space efficiency, other demanding requirements for enterprise de-duplication backup are high performance, high scalability, and availability in large-scale distributed environments. The main challenge is reducing the significant disk input/output (I/O) overhead that results from constantly accessing the disk to identify duplicate chunks. Existing inline de-duplication approaches mainly rely on duplicate locality to avoid the disk bottleneck, and thus suffer degradation under workloads with poor duplicate locality. This paper presents Chunkfarm, a post-processing de-duplication backup system designed to improve the capacity, throughput, and scalability of de-duplication. Chunkfarm performs de-duplication backup using the hash join algorithm, which turns the notoriously random and small disk I/Os of fingerprint lookups and updates into large sequential disk I/Os, hence achieving high write throughput unaffected by workload locality. More importantly, by decentralizing fingerprint lookup and update, Chunkfarm supports a cluster of servers performing de-duplication backup in parallel; it is hence conducive to distributed implementation and applicable to large-scale distributed storage systems.
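Editorial note: the following is a minimal, hypothetical sketch (not Chunkfarm's actual code) of the batch fingerprint de-duplication idea described above, in which a hash join replaces per-chunk random index lookups with one sequential pass over an on-disk fingerprint index; the plain-text index format and all identifiers are illustrative assumptions.

import hashlib

def hash_join_dedup(new_chunks, index_file):
    """new_chunks: iterable of raw chunk bytes; index_file: path to a text
    file holding one known fingerprint per line (read sequentially)."""
    # Build phase: in-memory hash table over the incoming batch of fingerprints.
    batch = {}
    for chunk in new_chunks:
        fp = hashlib.sha1(chunk).hexdigest()
        batch.setdefault(fp, []).append(chunk)
    # Probe phase: one large sequential scan of the stored index, no random I/O.
    duplicates = set()
    with open(index_file) as f:
        for line in f:
            if line.strip() in batch:
                duplicates.add(line.strip())
    unique = {fp: chunks for fp, chunks in batch.items() if fp not in duplicates}
    return unique, duplicates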
Minimal role mining method for Web service composition
Chao Huang, Jian-ling Sun, Xin-yu Wang, Yuan-jie Si
Front. Inform. Technol. Electron. Eng., 2010, 11(5): 328-339.   https://doi.org/10.1631/jzus.C0910186
Web service composition is a low-cost and efficient way to leverage existing resources and implementations. In current Web service composition implementations, the issue of how to define the role for a new composite Web service has received little attention. Adjusting the access control policy for a new composite Web service imposes substantial administration overhead on the security administrator. Furthermore, the distributed nature of Web service based applications makes traditional role mining methods obsolete. In this paper, we analyze the minimal role mining problem for Web service composition, and prove that this problem is NP-complete. We propose a sub-optimal greedy algorithm based on an analysis of the necessary role mappings for interoperation across multiple domains. Simulations show the effectiveness of our algorithm and its significant performance advantages over existing methods. We also demonstrate the practical application of our method in a real agent-based Web service system. The results show that our method can find the minimal role mapping efficiently.
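Editorial note: the abstract does not reproduce the authors' greedy criterion based on necessary role mappings across domains; the sketch below is a generic greedy cover heuristic in the same spirit, with hypothetical data structures, offered only to illustrate what a sub-optimal greedy selection of roles can look like.

def greedy_role_selection(required_perms, candidate_roles):
    """required_perms: set of permissions a composite service needs;
    candidate_roles: dict mapping role name -> set of permissions it grants."""
    uncovered = set(required_perms)
    selected = []
    while uncovered:
        # Pick the role covering the most still-uncovered permissions.
        best = max(candidate_roles, key=lambda r: len(candidate_roles[r] & uncovered))
        gain = candidate_roles[best] & uncovered
        if not gain:  # remaining permissions cannot be covered by any role
            raise ValueError("uncoverable permissions: %s" % uncovered)
        selected.append(best)
        uncovered -= gain
    return selected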
Online detection of bursty events and their evolution in news streams
Wei Chen, Chun Chen, Li-jun Zhang, Can Wang, Jia-jun Bu
Front. Inform. Technol. Electron. Eng., 2010, 11(5): 340-355.   https://doi.org/10.1631/jzus.C0910245
Online monitoring of temporally-sequenced news streams for interesting patterns and trends has gained popularity in the last decade. In this paper, we study a particular news stream monitoring task: timely detection of bursty events that have happened recently and discovery of their evolutionary patterns along the timeline. Here, a news stream is represented as feature streams of tens of thousands of features (i.e., keywords; each news story consists of a set of keywords). A bursty event is therefore composed of a group of bursty features, which show bursty rises in frequency as the related event emerges. In this paper, we give a formal definition of the above problem and present a solution with the following steps: (1) applying an online multi-resolution burst detection method to identify bursty features with different bursty durations within a recent time period; (2) clustering bursty features to form bursty events and associating each event with a power value which reflects its bursty level; (3) applying an information retrieval method based on cosine similarity to discover the event’s evolution (i.e., highly related bursty events in history) along the timeline. We extensively evaluate the proposed methods on the Reuters Corpus Volume 1. Experimental results show that our methods can detect bursty events in a timely way and effectively discover their evolution. The power values used in our model not only measure an event’s bursty level or relative importance well at a certain time point, but also show the relative strengths of events along the same evolution.
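Editorial note: as an illustration of step (3) above, the sketch below links a detected bursty event to related historical events by cosine similarity over bursty-feature weight vectors; the event representation (feature -> weight dictionaries) and the similarity threshold are assumptions, not the authors' exact formulation.

import math

def cosine(u, v):
    """u, v: dicts mapping bursty feature -> weight (e.g., burst power)."""
    dot = sum(w * v.get(f, 0.0) for f, w in u.items())
    nu = math.sqrt(sum(w * w for w in u.values()))
    nv = math.sqrt(sum(w * w for w in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def find_evolution(current_event, historical_events, threshold=0.3):
    """Return (similarity, index) pairs of historical events related to the current one."""
    scored = [(cosine(current_event, h), i) for i, h in enumerate(historical_events)]
    return sorted([(s, i) for s, i in scored if s >= threshold], reverse=True)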
Triangular domain extension of linear Bernstein-like trigonometric polynomial basis
Wan-qiang Shen, Guo-zhao Wang
Front. Inform. Technol. Electron. Eng., 2010, 11(5): 356-364.   https://doi.org/10.1631/jzus.C0910347
In computer aided geometric design (CAGD), the Bernstein-Bézier system for polynomial space, including over the triangular domain, is an important tool for modeling free form shapes. Bernstein-like bases for other spaces (trigonometric polynomial, hyperbolic polynomial, or blended space) have also been studied. However, none of them has been extended to the triangular domain. In this paper, we extend the linear trigonometric polynomial basis to the triangular domain and obtain a new Bernstein-like basis, which is linearly independent and satisfies positivity, partition of unity, symmetry, and boundary representation. We prove some properties of the corresponding surfaces, including differentiation, subdivision, convex hull, and so forth. Some applications are shown.
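Editorial note: the paper's explicit trigonometric basis functions are not given in the abstract; the block below only restates, in generic barycentric form, the properties listed above (positivity, partition of unity, and the resulting convex hull property of the surface).

% Generic Bernstein-like basis properties over a triangular domain,
% written in barycentric coordinates (u, v, w); B_i denotes a basis function
% and P_i a control point.
\[
  u, v, w \ge 0,\quad u + v + w = 1, \qquad
  B_i(u,v,w) \ge 0, \qquad \sum_i B_i(u,v,w) = 1 .
\]
\[
  \mathbf{T}(u,v,w) = \sum_i \mathbf{P}_i\, B_i(u,v,w)
  \;\in\; \operatorname{conv}\{\mathbf{P}_i\}
  \quad\text{(convex hull property).}
\]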
High quality multi-focus polychromatic composite image fusion algorithm based on filtering in frequency domain and synthesis in space domain
Lei Zhang, Peng Liu, Yu-ling Liu, Fei-hong Yu
Front. Inform. Technol. Electron. Eng., 2010, 11(5): 365-374.   https://doi.org/10.1631/jzus.C0910344
A novel multi-focus polychromatic image fusion algorithm based on filtering in the frequency domain using the fast Fourier transform (FFT) and synthesis in the space domain (FFDSSD) is presented in this paper. First, the original multi-focus images are transformed into their frequency data by FFT for easy and accurate clarity determination. Then a Gaussian low-pass filter is used to filter out the high-frequency information corresponding to the image saliencies. After an inverse FFT, the filtered images are obtained. The deviation between the filtered images and the original ones, representing the clarity of the image, is used to select the pixels from the multi-focus images to reconstruct a completely focused image. These operations in the space domain preserve the original information as much as possible and are relatively insensitive to misregistration compared with transform-domain methods. Polychromatic noise is explicitly considered and avoided, while the information in the different chromatic channels is preserved. A natural-looking fused microscopic image suitable for human visual evaluation is obtained in a dedicated experiment. The experimental results indicate that the proposed algorithm performs well in terms of objective quality metrics and runtime efficiency.
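Editorial note: the NumPy sketch below follows the pipeline described above (FFT, Gaussian low-pass filtering, inverse FFT, deviation as a clarity measure, per-pixel selection) for two single-channel images; the filter width sigma and function names are illustrative assumptions, not the authors' implementation.

import numpy as np

def gaussian_lowpass(shape, sigma=20.0):
    """Centered Gaussian low-pass transfer function, shifted so DC sits at (0, 0)."""
    h, w = shape
    y, x = np.ogrid[-h // 2:h - h // 2, -w // 2:w - w // 2]
    return np.fft.ifftshift(np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2)))

def ffdssd_fuse(img_a, img_b, sigma=20.0):
    """img_a, img_b: 2D float arrays (one chromatic channel of each source image)."""
    H = gaussian_lowpass(img_a.shape, sigma)
    blur_a = np.real(np.fft.ifft2(np.fft.fft2(img_a) * H))
    blur_b = np.real(np.fft.ifft2(np.fft.fft2(img_b) * H))
    # Deviation from the low-pass image approximates local clarity (focus).
    clarity_a = np.abs(img_a - blur_a)
    clarity_b = np.abs(img_b - blur_b)
    # Synthesis in the space domain: keep, pixel by pixel, the sharper source.
    return np.where(clarity_a >= clarity_b, img_a, img_b)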
Real-time motion deblurring algorithm with robust noise suppression
Hua-jun Feng, Yong-pan Wang, Zhi-hai Xu, Qi Li, Hua Lei, Ju-feng Zhao
Front. Inform. Technol. Electron. Eng., 2010, 11(5): 375-380.   https://doi.org/10.1631/jzus.C0910201
Obtaining good results in image restoration is challenging because of unavoidable noise, even if the blurring information is already known. To suppress the deterioration caused by noise during the image deblurring process, we propose a new deblurring method with a known kernel. First, the noise in the measurement process is assumed to follow a Gaussian distribution, fitting the natural noise distribution. Second, the first- and second-order derivatives are assumed to follow independent Gaussian distributions to control the non-uniform noise. Experimental results show that our method is clearly superior to the Wiener filter, the regularized filter, and the Richardson-Lucy (RL) algorithm. Moreover, owing to processing in the frequency domain, it runs faster than the other algorithms, in particular about six times faster than the RL algorithm.
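Editorial note: the sketch below shows frequency-domain deconvolution with a known kernel and Gaussian priors on the first- and second-order derivatives, which is the general kind of formulation described above; the regularization weights lam1 and lam2 and the derivative filters are illustrative assumptions rather than the authors' exact model.

import numpy as np

def deblur(blurred, kernel, lam1=1e-2, lam2=1e-3):
    """blurred: 2D float array; kernel: blur kernel (known), any smaller 2D array."""
    h, w = blurred.shape
    K = np.fft.fft2(kernel, s=(h, w))
    # First-order (gradient) and second-order (Laplacian) derivative filters.
    dx = np.fft.fft2(np.array([[1.0, -1.0]]), s=(h, w))
    dy = np.fft.fft2(np.array([[1.0], [-1.0]]), s=(h, w))
    lap = np.fft.fft2(np.array([[0, 1, 0], [1, -4, 1], [0, 1, 0]], float), s=(h, w))
    Y = np.fft.fft2(blurred)
    denom = (np.abs(K) ** 2
             + lam1 * (np.abs(dx) ** 2 + np.abs(dy) ** 2)
             + lam2 * np.abs(lap) ** 2)
    X = np.conj(K) * Y / denom      # closed-form solution in the frequency domain
    return np.real(np.fft.ifft2(X))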
New loop pairing criterion based on interaction and integrity considerations
Ling-jian Ye, Zhi-huan Song
Front. Inform. Technol. Electron. Eng., 2010, 11(5): 381-393.   https://doi.org/10.1631/jzus.C0910217
Loop pairing is one of the major concerns when designing decentralized control systems for multivariable processes. Most existing pairing tools, such as the relative gain array (RGA) method, have shortcomings both in measuring interaction and in addressing integrity issues. To evaluate the overall interaction among loops, we propose a statistics-based criterion obtained by enumerating all possible combinations of loop statuses. Furthermore, we quantify the traditional concept of integrity to represent the degree of integrity of a decentralized control system. We therefore propose that a pairing decision should be made by taking both factors into consideration. Two examples are provided to illustrate the effectiveness of the proposed criterion.
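Editorial note: the authors' statistics-based criterion is not detailed in the abstract; for context, the sketch below computes only the conventional relative gain array (RGA) that the abstract names as the existing pairing tool, from an assumed steady-state gain matrix.

import numpy as np

def rga(G):
    """Relative gain array: Lambda = G .* (G^{-1})^T (element-wise product)."""
    G = np.asarray(G, dtype=float)
    return G * np.linalg.inv(G).T

# Example 2x2 process; pairings are usually chosen on RGA elements close to 1.
print(rga([[2.0, 1.5], [1.5, 2.0]]))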
Model predictive control with an on-line identification model of a supply chain unit
Jian Niu, Zu-hua Xu, Jun Zhao, Zhi-jiang Shao, Ji-xin Qian
Front. Inform. Technol. Electron. Eng., 2010, 11(5): 394-400.   https://doi.org/10.1631/jzus.C0910270
A model predictive controller was designed in this study for a single supply chain unit. Demand was described using an autoregressive integrated moving average (ARIMA) model, identified on-line to forecast future demand. Feedback was used to modify the demand prediction, and profit was chosen as the control objective. To imitate reality, the purchase price was assumed to take a piecewise linear form, whereby the control objective became a nonlinear problem. A genetic algorithm was therefore introduced to solve the problem. Constraints were imposed on the predicted inventory to limit inventory fluctuation; that is, the bullwhip effect was made controllable. The model predictive control (MPC) method was compared with the order-up-to-level (OUL) method in simulations. The results revealed that the MPC method yields more profit and keeps the bullwhip effect under control.
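Editorial note: the sketch below illustrates the kind of objective described above, i.e., a piecewise linear purchase price and a profit function evaluated over a prediction horizon for a candidate order sequence (the quantity a genetic algorithm would search over); all breakpoints, prices, and cost coefficients are assumed values, not those of the paper.

def purchase_cost(q, breaks=(0, 100, 500), prices=(10.0, 9.0, 8.0)):
    """Piecewise linear cost: a cheaper marginal price applies above each breakpoint."""
    cost, prev = 0.0, 0
    for b, p in zip(list(breaks[1:]) + [float("inf")], prices):
        cost += p * max(0.0, min(q, b) - prev)
        prev = b
        if q <= b:
            break
    return cost

def horizon_profit(orders, demand_forecast, inventory, sell_price=12.0, hold_cost=0.1):
    """Profit of a candidate order sequence against the forecast demand."""
    profit = 0.0
    for q, d in zip(orders, demand_forecast):
        inventory += q
        sold = min(inventory, d)
        inventory -= sold
        profit += sell_price * sold - purchase_cost(q) - hold_cost * inventory
    return profit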
Robust time reversal processing for active detection of a small bottom target in a shallow water waveguide
Xiang Pan, Jian-long Li, Wen Xu, Xian-yi Gong
Front. Inform. Technol. Electron. Eng., 2010, 11(5): 401-406.   https://doi.org/10.1631/jzus.C0910212
With its spatial-temporal focusing of acoustic energy, time reversal processing (TRP) shows potential for active target detection in shallow water. To turn this potential into reality, TRP based on a model source (MS) instead of a physical probe source (PS) is investigated. For uncertain ocean environments, the robustness of TRP is discussed for narrowband and broadband signals, respectively. The channel transfer function matrix is first constructed in the acoustic perturbation space. Then a steering vector for time reversal transmission is obtained by singular value decomposition (SVD) of the matrix. To verify the robust TRP, tank experiments on time reversal transmission focusing and its application to active target detection were carried out. The experimental results show that the robust TRP can effectively detect and locate a small bottom target.
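Editorial note: the sketch below mirrors the steering-vector construction described above, stacking channel transfer functions computed over an acoustic perturbation space and taking the leading left singular vector as a robust time reversal weight; the random matrix stands in for a real propagation model and is purely a placeholder.

import numpy as np

def robust_steering_vector(transfer_matrix):
    """transfer_matrix: (n_array_elements x n_perturbed_environments) complex
    matrix whose columns are channel transfer functions at one frequency."""
    U, s, Vh = np.linalg.svd(transfer_matrix, full_matrices=False)
    return U[:, 0]  # dominant left singular vector

# Toy usage with placeholder data (a real study would use modeled acoustic fields).
rng = np.random.default_rng(0)
H = rng.standard_normal((16, 30)) + 1j * rng.standard_normal((16, 30))
w = robust_steering_vector(H)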
9 articles
