TY - BOOK
AU - MacKay, David J. C.
TI - Information Theory, Inference and Learning Algorithms
SN - 9780521642989
U1 - 003.54
PY - 2003///
CY - UK
PB - Cambridge University Press
KW - Computer Engineering
N1 - Contents: Introduction to information theory; Probability, entropy and inference; More about inference. Part 1, Data Compression: The source coding theorem; Symbol codes; Stream codes; Codes for integers. Part 2, Noisy-Channel Coding: Dependent random variables; Communication over a noisy channel; The noisy-channel coding theorem; Error-correcting codes and real channels. Part 3, Further Topics in Information Theory: Hash codes; Binary codes; Very good linear codes exist; Further exercises on information theory; Message passing; Constrained noiseless channels; Crosswords and codebreaking; Why have sex? Information acquisition and evolution. Part 4, Probabilities and Inference: An example inference task: clustering; Exact inference by complete enumeration; Maximum likelihood and clustering; Useful probability distributions; Exact marginalization; Exact marginalization in trellises; Exact marginalization in graphs; Laplace's method; Model comparison and Occam's razor; Monte Carlo methods; Efficient Monte Carlo methods; Ising models; Exact Monte Carlo sampling; Variational methods; Independent component analysis; Random inference topics; Decision theory; Bayesian inference and sampling theory. Part 5, Neural Networks: Introduction to neural networks; The single neuron as a classifier; Capacity of a single neuron; Learning as inference; Hopfield networks; Boltzmann machines; Supervised learning in multilayer networks; Gaussian processes; Deconvolution. Part 6, Sparse Graph Codes: Low-density parity-check codes; Convolutional codes and turbo codes; Repeat-accumulate codes; Digital fountain codes. Part 7, Appendices: A. Some physics; Some mathematics.
ER - 