channel-capacity

npx machina-cli add skill parcadei/Continuous-Claude-v3/channel-capacity --openclaw

Channel Capacity
When to Use
Use this skill when working on channel-capacity problems in information theory.
Decision Tree
Mutual Information
- I(X;Y) = H(X) + H(Y) - H(X,Y)
- I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
- Symmetric: I(X;Y) = I(Y;X)
- In scipy: scipy.stats.entropy(p_x, base=2) + scipy.stats.entropy(p_y, base=2) - H(X,Y), where p_x and p_y are the marginal distributions and H(X,Y) is the entropy of the flattened joint
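The entropy identities above can be checked numerically. A minimal sketch using scipy, with an illustrative 2x2 joint distribution (the values are made up for the example):

```python
import numpy as np
from scipy.stats import entropy

# Illustrative joint distribution P(X, Y) for a toy binary channel.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])

p_x = p_xy.sum(axis=1)   # marginal P(X)
p_y = p_xy.sum(axis=0)   # marginal P(Y)

# I(X;Y) = H(X) + H(Y) - H(X,Y), all in bits (base 2)
h_x = entropy(p_x, base=2)
h_y = entropy(p_y, base=2)
h_xy = entropy(p_xy.ravel(), base=2)
mi = h_x + h_y - h_xy
print(f"I(X;Y) = {mi:.4f} bits")
```

By symmetry of the identity, computing I(Y;X) the same way gives the identical value.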
Channel Model
- Input X, output Y, channel P(Y|X)
- Channel matrix: rows = inputs, columns = outputs
- Element (i,j) = P(Y=j | X=i)
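A small sketch of the channel-matrix convention, using a BSC with an assumed crossover probability of 0.1 (the input distribution is likewise illustrative):

```python
import numpy as np

# BSC channel matrix: rows = inputs (0, 1), columns = outputs (0, 1).
# Element (i, j) = P(Y=j | X=i); p is the crossover probability.
p = 0.1
W = np.array([[1 - p, p],
              [p, 1 - p]])

# Each row must be a valid conditional distribution.
assert np.allclose(W.sum(axis=1), 1.0)

# Output distribution induced by an input distribution p_x: p_y = p_x @ W
p_x = np.array([0.7, 0.3])
p_y = p_x @ W
print(p_y)  # [0.66, 0.34]
```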
Channel Capacity
- C = max_{p(x)} I(X;Y)
- Maximize over input distribution
- Achieved by capacity-achieving distribution
Common Channels
- Binary Symmetric (BSC): C = 1 - H(p), where p is the crossover probability
- Binary Erasure (BEC): C = 1 - epsilon, where epsilon is the erasure probability
- AWGN: C = 0.5 * log2(1 + SNR)
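These closed forms can be wrapped as small helpers for quick sanity checks; a sketch (the function names are illustrative, not from any library):

```python
import numpy as np

def h2(p):
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

def bsc_capacity(p):
    """Binary Symmetric Channel: C = 1 - H(p)."""
    return 1.0 - h2(p)

def bec_capacity(eps):
    """Binary Erasure Channel: C = 1 - epsilon."""
    return 1.0 - eps

def awgn_capacity(snr):
    """Real AWGN channel, SNR as a linear power ratio: C = 0.5 log2(1 + SNR)."""
    return 0.5 * np.log2(1.0 + snr)

print(bsc_capacity(0.11))   # close to 0.5: p ~ 0.11 halves the BSC capacity
print(bec_capacity(0.3))    # 0.7
print(awgn_capacity(1.0))   # 0.5
```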
Blahut-Arimoto Algorithm
- Iterative algorithm to compute the capacity of a discrete memoryless channel
- Alternates between updating the input distribution p(x) and the posterior q(x|y); the channel P(y|x) itself stays fixed
- Converges to the capacity and a capacity-achieving p(x)
z3_solve.py prove "capacity_upper_bound"
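A minimal Blahut-Arimoto sketch, assuming the standard multiplicative update p(x) ∝ p(x) · 2^D(W(·|x) || q), where q is the current output distribution. It is cross-checked here against the BSC closed form 1 - H(p):

```python
import numpy as np

def blahut_arimoto(W, tol=1e-9, max_iter=10_000):
    """Capacity in bits of a DMC with channel matrix W[i, j] = P(Y=j | X=i)."""
    n = W.shape[0]
    p = np.full(n, 1.0 / n)  # start from the uniform input distribution

    def divergences(p):
        # D(W(.|x) || q) for each input x, with q the induced output distribution
        q = p @ W
        with np.errstate(divide="ignore", invalid="ignore"):
            terms = np.where(W > 0, W * np.log2(W / q), 0.0)
        return terms.sum(axis=1)

    for _ in range(max_iter):
        d = divergences(p)
        p_new = p * np.power(2.0, d)
        p_new /= p_new.sum()
        if np.max(np.abs(p_new - p)) < tol:
            p = p_new
            break
        p = p_new

    # At the fixed point, I(X;Y) = sum_x p(x) D(W(.|x) || q) equals the capacity.
    return float(p @ divergences(p)), p

# BSC with crossover 0.1: capacity should match 1 - H(0.1) ~= 0.531 bits
W = np.array([[0.9, 0.1],
              [0.1, 0.9]])
C, p_star = blahut_arimoto(W)
print(C, p_star)  # ~0.531, uniform input
```

For the BSC the optimizer recovers the known uniform capacity-achieving input; for asymmetric channels the returned p_star is generally non-uniform.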
Tool Commands
Scipy_Mutual_Info
uv run python -c "from scipy.stats import entropy; p = [0.5, 0.5]; q = [0.6, 0.4]; H_X = entropy(p, base=2); H_Y = entropy(q, base=2); print('H(X)=', H_X, 'H(Y)=', H_Y)"
Sympy_Bsc_Capacity
uv run python -m runtime.harness scripts/sympy_compute.py simplify "1 + p*log(p, 2) + (1-p)*log(1-p, 2)"
Z3_Capacity_Bound
uv run python -m runtime.harness scripts/z3_solve.py prove "I(X;Y) <= H(X)"
Key Techniques
From indexed textbooks:
- [Elements of Information Theory] Cover, Thomas M. & Thomas, Joy A., Elements of Information Theory, 2nd ed., Wiley-Interscience, New York, NY, 2012, ISBN 9780470303153. Using a randomly generated code, Shannon showed that one can send information at any rate below the capacity C of the channel with an arbitrarily low probability of error. The idea of a randomly generated code is very unusual.
Cognitive Tools Reference
See .claude/skills/math-mode/SKILL.md for full tool documentation.
Source
https://github.com/parcadei/Continuous-Claude-v3/blob/main/.claude/skills/math/information-theory/channel-capacity/SKILL.md (View on GitHub)

Overview
This skill helps you solve channel capacity problems in information theory by modeling the channel, computing mutual information, and maximizing it. It covers the core concepts of I(X;Y), the channel model P(Y|X), and practical methods like Blahut-Arimoto, with reference to common channels such as BSC, BEC, and AWGN.
How This Skill Works
Model X and Y with P(Y|X) and form the channel matrix (rows inputs, columns outputs). Compute I(X;Y) using entropy relations such as I(X;Y) = H(X) + H(Y) - H(X,Y) or I(X;Y) = H(X) - H(X|Y). Maximize I(X;Y) over p(x) to obtain the capacity C. Use iterative methods like the Blahut-Arimoto algorithm to converge to the capacity and the capacity-achieving input distribution.
When to Use It
- When deriving the capacity of a given channel model P(Y|X).
- When you need to compute mutual information I(X;Y) from distributions.
- When working with common channels like BSC, BEC, or AWGN and applying their capacity formulas.
- When you want to verify capacity with the Blahut-Arimoto algorithm.
- When you need to bound capacity using upper bounds or solver-assisted verification.
Quick Start
- Step 1: Define the channel model P(Y|X) and the input alphabet X.
- Step 2: Compute I(X;Y) using entropy relations or empirical distributions.
- Step 3: Run Blahut-Arimoto to maximize I(X;Y) and obtain the capacity and the optimal p(x).
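The three steps can be run end to end for a BSC, where the uniform input is known to be capacity-achieving (the crossover probability 0.2 is illustrative):

```python
import numpy as np
from scipy.stats import entropy

# Step 1: channel model P(Y|X) for a BSC with crossover p = 0.2
p = 0.2
W = np.array([[1 - p, p],
              [p, 1 - p]])

# Step 2: compute I(X;Y) = H(Y) - H(Y|X) at the uniform input
p_x = np.array([0.5, 0.5])
p_y = p_x @ W
h_y = entropy(p_y, base=2)
h_y_given_x = p_x @ np.array([entropy(row, base=2) for row in W])
mi = h_y - h_y_given_x

# Step 3: for the BSC, the uniform input already achieves capacity,
# so I(X;Y) should equal the closed form 1 - H(p).
print(mi, 1 - entropy([p, 1 - p], base=2))  # both ~0.278
```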
Best Practices
- Write out the channel model P(Y|X) and assemble a channel matrix with inputs as rows and outputs as columns.
- Compute I(X;Y) using I(X;Y) = H(X) + H(Y) - H(X,Y) or I(X;Y) = H(X) - H(X|Y).
- Maximize I(X;Y) over the input distribution p(x) to obtain the channel capacity.
- Apply Blahut-Arimoto to iteratively update the input distribution p(x) (with the channel P(y|x) held fixed) until convergence.
- Cross-check results with known formulas for BSC, BEC, and AWGN and use bounds as sanity checks.
Example Use Cases
- BSC capacity: C = 1 - H(p) where p is the crossover probability.
- BEC capacity: C = 1 - epsilon where epsilon is the erasure probability.
- AWGN capacity: C = 0.5 * log2(1 + SNR) bits per channel use for the real-valued Gaussian channel.
- Compute H(X) and H(Y) using entropy functions to verify I(X;Y).
- Use a solver script to bound capacity, e.g., prove capacity_upper_bound with a tool.