Search results for: vq scintigraphy
Number of results: 10797
doi:10.1016/j.jvcir.2010.02.008 — VQ reversible data embedding technology allows an original VQ coding to be completely restored after the extraction of embedded data. In this paper, we propose a new reversible sche...
Vector quantization (VQ) using exhaustive nearest neighbor (NN) search is the speed bottleneck in classic bag of visual words (BOV) models. Approximate NN (ANN) search methods still spend considerable time on VQ, since they check multiple regions of the search space to reduce VQ errors. In this paper, we propose ExVQ, an exclusive NN search method to speed up BOV models. Given a visual descriptor, a po...
This paper examines two vector quantization algorithms that can combine the tasks of compression and classification: Bayes risk weighted vector quantization (BRVQ) proposed by Oehler et al., and Optimized Learning Vector Quantization 1 (OLVQ1) proposed by Kohonen et al. BRVQ uses a parameter to control the tradeoff between compression and classification. BRVQ performance is studied for a range o...
Vector Quantization (VQ) has its origins in signal processing, where it is used for compact, accurate representation of input signals. However, since VQ induces a partitioning of the input space, it can also be used for statistical pattern recognition. In this paper we present a novel gradient descent VQ classification algorithm (GVQ) which minimizes the Bayes Risk, and compare its performance to...
WRKY transcription factors are encoded by a large gene superfamily with a broad range of roles in plants. Recently, several groups have reported that proteins containing a short VQ (FxxxVQxLTG) motif interact with WRKY proteins. We have recently discovered that two VQ proteins from Arabidopsis (Arabidopsis thaliana), SIGMA FACTOR-INTERACTING PROTEIN1 and SIGMA FACTOR-INTERACTING PROTEIN2, act a...
Since 2000, many digital watermarking schemes for vector quantization (VQ)-compressed images have been proposed. Their main idea is to carry watermark information in VQ codeword indices. The advantage of this kind of watermarking scheme is its robustness to VQ compression with the same codebook. This Letter presents a more effective image watermarking method based on classified VQ. First, the ...
Vector Quantization (VQ) is a widely implemented method for low-bit-rate signal coding. A common assumption in the design of VQ systems is that the digital information is transmitted through a perfect channel. Under this assumption, the assignment of channel symbols to the VQ Reconstruction Vectors (RV) is of no importance. However, over physical channels, the effect of channel errors on the V...
In this paper, two different schemes for predictive vector quantization (VQ) of subband decomposed images are investigated. The aim is to reduce the quantization error by incorporating memory into the VQ scheme. The first scheme is a form of finite-state VQ (FSVQ) which we will call subband FSVQ (SB-FSVQ), and the second is a form of predictive VQ (PVQ) applied to image subbands. We will refer to ...
Vector Quantization (VQ) has been explored in the past as a means of reducing likelihood computation in speech recognizers which use hidden Markov models (HMMs) containing Gaussian output densities. Although this approach has proved successful, there is an extent beyond which further reduction in likelihood computation substantially degrades recognition accuracy. Since the components of the VQ ...
Vector quantization (VQ) has received great attention in the field of multimedia data compression over the last few decades because it has a simple decoding structure and can provide a high compression ratio. In general, algorithms for VQ codebook generation focus on solving two kinds of problem: (i) determining the quantization regions and the codewords that minimize the distortion error, and (ii) to ...
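A common thread in the results above is the basic VQ encode/decode cycle: map each input vector to the index of its nearest codeword, then reconstruct by codebook lookup. As a minimal sketch (not taken from any of the listed papers; the function names and the toy 4-codeword codebook are illustrative assumptions), using exhaustive nearest-neighbor search with NumPy:

```python
import numpy as np

def vq_encode(vectors, codebook):
    """Map each input vector to the index of its nearest codeword
    (exhaustive nearest-neighbor search, the classic VQ encoder)."""
    # Squared Euclidean distance from every vector to every codeword,
    # via broadcasting: (n, 1, d) - (1, k, d) -> (n, k, d).
    dists = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    return dists.argmin(axis=1)

def vq_decode(indices, codebook):
    """Reconstruct each vector as its assigned codeword (table lookup)."""
    return codebook[indices]

# Toy example: quantize 2-D data with a hand-picked 4-codeword codebook.
rng = np.random.default_rng(0)
data = rng.normal(size=(8, 2))
codebook = np.array([[0.0, 0.0], [1.0, 1.0], [-1.0, -1.0], [1.0, -1.0]])

indices = vq_encode(data, codebook)          # one index per input vector
reconstruction = vq_decode(indices, codebook)
distortion = ((data - reconstruction) ** 2).mean()
```

Only the indices need to be transmitted or stored, which is where the compression comes from; the distortion measures the reconstruction error that codebook-design algorithms (such as those in the last result) try to minimize.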