
Universal Data Compression (049043) – Critical Summary

Select a paper for a critical summary from the following list and let me know your choice as soon
as possible (no paper can be chosen by more than one student).
Submit your critical summary by the last day of the semester (July 2, 2013), and then schedule
an appointment with me for the oral discussion of the paper.

General requirements:

The critical summary should be no more than five pages long (including the bibliography; font size
≥ 11pt, margins ≥ 1cm). First, it should include a concise description of the main results in the
paper, along with explanations. One is not expected to copy proofs or analytical developments,
but is expected to explain the essence of the techniques used therein. Secondly, and this is the
critical part, the summary should include, as much as possible, personal observations of the
student: intuitive insights, relationships (if any) with the material of the lectures, comments on the
degree of innovation with respect to previous work, technical rigor and correctness, suggestions for
improvement, simplification, or generalization of the analysis, and so on.
For the oral discussion, the student should be knowledgeable not only with regard to the paper
itself and its detailed technical aspects, but also with regard to closely related work, such as the
references cited in the paper. Thus, short papers should not necessarily be considered more
attractive than long ones.
Examples of good critical summaries (although from another course – “Coded Communication”)
can be found on the Moodle site.

List of Papers

(1) P. Flajolet and W. Szpankowski, “Analytic variations on redundancy rates of renewal pro-
cesses,” IEEE Trans. on Inform. Theory, Vol. 48, no. 11, pp. 2911–2921, November 2002.

(2) H. Prodinger and W. Szpankowski, “Optimal versus randomized search of fixed length binary
words,” IEEE Trans. on Inform. Theory, Vol. 48, no. 9, pp. 2614–2621, September 2002.

(3) B. Ya. Ryabko, “Twice-universal coding,” Problems of Information Transmission, pp. 173-
177, July-Sept., 1984.

(4) B. S. Clarke and A. R. Barron, “Jeffreys’ prior is asymptotically least favorable under entropy
risk,” Journal of Statistical Planning and Inference, Vol. 41, pp. 37-60, 1994.

(5) P. C. Shields, “Universal almost sure data compression using Markov types,” preprint (can
be copied from me).

(6) L. Finesso, C.-C. Liu, and P. Narayan, “The optimal error exponent for Markov order esti-
mation,” IEEE Trans. on Inform. Theory, Vol. 42, No. 5, pp. 1488-1497, September 1996.

(7) M. J. Weinberger and G. Seroussi, “Sequential prediction and ranking in universal context
modeling and data compression,” IEEE Trans. Inform. Theory, vol. 43, no. 5, pp. 1697-1706,
September 1997.

(8) J. Muramatsu, “On the performance of recency–rank and block–sorting universal lossless
data compression algorithms,” IEEE Trans. Inform. Theory, vol. 48, no. 9, pp. 2621-2625,
September 2002.

(9) T. Weissman, E. Ordentlich, G. Seroussi, S. Verdú, and M. J. Weinberger, “Universal discrete
denoising: known channel,” IEEE Trans. on Inform. Theory, Vol. 51, no. 1, pp. 5–28, January
2005.

(10) F. M. J. Willems, Y. M. Shtarkov, and T. J. Tjalkens, “The context-tree weighting method:
basic properties,” IEEE Trans. on Inform. Theory, Vol. 41, No. 3, pp. 653–664, May 1995.

(11) M. J. Weinberger and M. Feder, “Predictive stochastic complexity and model estimation for
finite-state processes,” Journal of Statistical Planning and Inference, Vol. 39, pp. 353-372,
1994.

(12) W. Yang and W. Liu, “The asymptotic equipartition property for Mth–order nonhomo-
geneous Markov information sources,” IEEE Trans. Inform. Theory, Vol. 50, no. 12, pp.
3326–3330, December 2004.

(13) I. Tabus and J. Rissanen, “Asymptotics of greedy algorithms for variable–to–fixed length
coding of Markov sources,” IEEE Trans. Inform. Theory, vol. 48, no. 7, pp. 2022–2035, July
2002.

(14) M. Drmota and W. Szpankowski, “Precise minimax redundancy and regret,” IEEE Trans.
Inform. Theory, vol. 50, no. 11, pp. 2686–2707, November 2004.

(15) M. Effros, K. Visweswariah, S. R. Kulkarni, and S. Verdú, “Universal lossless source coding
with the Burrows Wheeler transform,” IEEE Trans. Inform. Theory, vol. 48, no. 5, pp. 1061–
1081, May 2002.

(16) E.-h. Yang, A. Kaltchenko, and J. C. Kieffer, “Universal lossless data compression with side
information by using a conditional MPM grammar transform,” IEEE Trans. Inform. Theory,
vol. 47, no. 6, pp. 2130–2150, September 2001.

(17) J. A. Storer and T. G. Szymanski, “Data compression via textual substitution,” Journal of
the ACM, vol. 29, pp. 928–951, 1982.

(18) W. J. Teahan and J. G. Cleary, “The entropy of English using PPM-based models,” Proc.
IEEE Data Compression Conference (DCC ’96), pp. 53–62, Snowbird, Utah, March 1996.

(19) J. G. Cleary and W. J. Teahan, “Unbounded length contexts for PPM,” The Computer
Journal, vol. 40, pp. 67–75, 1997.

(20) M. J. Weinberger, G. Seroussi, and G. Sapiro, “LOCO-I: A low complexity, context-based,
lossless image compression algorithm,” Proc. Data Compression Conference, pp. 140–149, 1996.

(21) F. Liang and A. Barron, “Exact minimax strategies for predictive density estimation, data
compression, and model selection,” IEEE Trans. Inform. Theory, vol. 50, no. 11, pp. 2708–
2726, November 2004.

(22) G. I. Shamir and D. J. Costello, Jr., “Universal lossless coding for sources with repeating
statistics,” IEEE Trans. Inform. Theory, vol. 50, no. 8, pp. 1620–1635, August 2004.

(23) R. Sundaresan, “Guessing under source uncertainty,” IEEE Trans. Inform. Theory, vol. 53,
no. 1, pp. 269–287, January 2007.

(24) L. A. Lastras–Montaño, “On certain pathwise properties of the sliding–window Lempel–Ziv
algorithm,” IEEE Trans. Inform. Theory, vol. 52, no. 12, pp. 5267–5283, December 2006.

(25) G. I. Shamir, “Universal lossless compression with unknown alphabets – the average case,”
IEEE Trans. Inform. Theory, vol. 52, no. 11, pp. 4915–4944, November 2006.

(26) G. M. Gemelos and T. Weissman, “On the entropy rate of pattern processes,” IEEE Trans.
Inform. Theory, vol. 52, no. 9, pp. 3994–4007, September 2006.

(27) H. Cai, S. R. Kulkarni, and S. Verdú, “An algorithm for universal lossless compression with
side information,” IEEE Trans. Inform. Theory, vol. 52, no. 9, pp. 4008–4016, September
2006.

(28) E.-h. Yang and D.-k. He, “Universal data compression with side information at the decoder
by using traditional universal lossless compression algorithms,” Proc. ISIT 2007, pp. 431–435,
Nice, France, June 2007.

(29) H. S. Cronie and S. B. Korada, “Lossless source coding with polar codes,” Proc. ISIT 2010,
pp. 904–908, Austin, TX, U.S.A., June 2010.

(30) C. Chang and A. Sahai, “Upper bound on error exponents with delay for lossless source coding
with side–information,” Proc. ISIT 2006, pp. 326–330, Seattle, Washington, July 2006.

(31) S. C. Draper, “Universal incremental Slepian–Wolf coding,” Proc. Annual Allerton Conference
on Communication, Control, and Computing, Monticello, IL, October 2004.

(32) D. Baron, M. A. Khojastepour, and R. G. Baraniuk, “Redundancy rates of Slepian–Wolf
coding,” Proc. Annual Allerton Conference on Communication, Control, and Computing,
Monticello, IL, October 2004.

(33) D. Baron and A. C. Singer, “On the cost of worst case coding length constraints,” IEEE Trans.
Inform. Theory, vol. 47, no. 7, pp. 3088–3090, November 2001.

(34) W. Szpankowski, “Asymptotic average redundancy of Huffman (and other) block codes,”
IEEE Trans. Inform. Theory, vol. 46, no. 7, pp. 2434–2443, November 2000.

(35) D. Sheinwald, A. Lempel, and J. Ziv, “Two–dimensional encoding by finite–state encoders,”
IEEE Trans. on Communications, vol. 38, no. 3, pp. 341–347, March 1990.

(36) L. Györfi, I. Páli, and E. van der Meulen, “There is no universal source code for an infinite
alphabet,” IEEE Trans. Inform. Theory, vol. IT–40, no. 1, pp. 267–271, January 1994.

(37) P. C. Shields, “Universal redundancy rates do not exist,” IEEE Trans. Inform. Theory,
vol. IT–39, no. 2, pp. 520–524, March 1993.

(38) Y. M. Shtarkov, T. J. Tjalkens, and F. M. J. Willems, “Multialphabet weighting universal
coding of context tree sources,” Problems of Information Transmission (IPPI), vol. 33, no. 1,
pp. 17–28, 1997.

(39)-(46) Papers by the following authors, published in the July 2004 issue of the IEEE Trans. Inform.
Theory (special issue on problems on sequences): Jacquet and Szpankowski; Kieffer and Yang;
Savari; Martín, Seroussi and Weinberger; Orlitsky, Santhanam, and Zhang; Nobel; Meron and
Feder; Cai, Kulkarni and Verdú.
