Information Amount & Entropy

Laboratory work: Analysis of source information characteristics.

📚 Formulas & Definitions

Information amount is an a posteriori characteristic (computed after an outcome is observed), while entropy is an a priori measure of the source's uncertainty.

  • Self-information: I(xi) = -log2 p(xi).
  • Unconditional entropy: H(X) = -Σi p(xi) log2 p(xi).
  • Conditional entropy: H(X/Y) = Σj p(yj) H(X/yj), where H(X/yj) = -Σi p(xi/yj) log2 p(xi/yj).

🔢 Information in Ensemble

// Waiting for input...
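Until real input arrives, the self-information formula above can be sketched as follows. This is a minimal example assuming a hypothetical four-symbol ensemble with made-up probabilities; the `probs` values are illustrative, not lab data.

```python
import math

# Hypothetical ensemble: symbol probabilities (must sum to 1)
probs = {"x1": 0.5, "x2": 0.25, "x3": 0.125, "x4": 0.125}

def self_information(p: float) -> float:
    """I(xi) = -log2 p(xi), in bits."""
    return -math.log2(p)

for symbol, p in probs.items():
    print(f"I({symbol}) = {self_information(p):.3f} bits")
```

Rarer symbols carry more information: a symbol with probability 0.125 yields 3 bits, while one with probability 0.5 yields only 1 bit.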

📊 Unconditional & Max Entropy

// Waiting for input...
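A sketch of the unconditional-entropy computation, reusing the same hypothetical four-symbol probabilities. The maximum entropy Hmax = log2 N is reached when all N symbols are equiprobable, which also gives the source redundancy.

```python
import math

def entropy(probs):
    """H(X) = -sum p(xi) * log2 p(xi), skipping zero-probability symbols."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical probabilities (illustrative, not lab data)
probs = [0.5, 0.25, 0.125, 0.125]

h = entropy(probs)                 # 1.75 bits/symbol
h_max = math.log2(len(probs))      # 2.0 bits/symbol (equiprobable case)
redundancy = 1 - h / h_max         # 0.125

print(f"H(X)  = {h:.3f} bits")
print(f"Hmax  = {h_max:.3f} bits")
print(f"R     = {redundancy:.3f}")
```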

🧩 Conditional Source Entropy

// Waiting for input...
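The conditional-entropy formula H(X/Y) = Σj p(yj) H(X/yj) can be sketched from a joint distribution p(xi, yj). The matrix below is a hypothetical example; conditional probabilities p(xi/yj) are obtained by dividing each column by its marginal p(yj).

```python
import math

def conditional_entropy(joint):
    """H(X/Y) = sum_j p(yj) * H(X/yj),
    where H(X/yj) = -sum_i p(xi/yj) * log2 p(xi/yj).
    `joint` is p(xi, yj): rows index x, columns index y."""
    h = 0.0
    for j in range(len(joint[0])):
        p_y = sum(row[j] for row in joint)   # marginal p(yj)
        if p_y == 0:
            continue
        for row in joint:
            p_cond = row[j] / p_y            # p(xi/yj)
            if p_cond > 0:
                h += p_y * (-p_cond * math.log2(p_cond))
    return h

# Hypothetical joint distribution (illustrative, not lab data)
joint = [
    [0.25, 0.10],
    [0.15, 0.50],
]
print(f"H(X/Y) = {conditional_entropy(joint):.3f} bits")
```

Two sanity checks: if X and Y are independent and X is uniform over two symbols, H(X/Y) = 1 bit; if Y determines X exactly, H(X/Y) = 0.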