EE 565 Information Theory and Its Application to (Big) Data Sciences
Units: 4
Terms Offered: FaSp
Entropy and mutual information. Variable- and fixed-length, lossless and lossy compression. Universal compression. Text and multimedia compression. Channel capacity. Error-correcting codes. Erasure and Gaussian channels.
Prerequisite: EE 503
Duplicates Credit in former EE 565a
Instruction Mode: Lecture, Discussion
Grading Option: Letter