## Summary
Two minor issues found during translation review of the zh-cn repo (via Copilot code review on QuantEcon/lecture-python-programming.zh-cn#19):
### 1. PRNG key reuse in `lectures/autodiff.md`

In the gradient descent / simulated data section, the same key is used for both `jax.random.uniform` and `jax.random.split`:
```python
x = jax.random.uniform(key, (n,))
α, β, σ = 0.5, 1.0, 0.1
key, subkey = jax.random.split(key)
ϵ = jax.random.normal(subkey, (n,))
```
This violates JAX's guidance that keys should be treated as single-use: `key` is consumed by `uniform` and then reused as input to `split`, which can lead to correlated draws.
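For context, a minimal standalone sketch (not from the lecture) of why key reuse matters: JAX's PRNG functions are pure, so passing the same key twice reproduces the same draws rather than producing independent ones.

```python
import jax

key = jax.random.PRNGKey(0)

# Reusing a key: both calls are pure functions of the same key,
# so the "two" samples are identical, not independent.
a = jax.random.normal(key, (3,))
b = jax.random.normal(key, (3,))
print(bool((a == b).all()))  # True

# Splitting first yields distinct keys and hence distinct streams.
k1, k2 = jax.random.split(key)
c = jax.random.normal(k1, (3,))
d = jax.random.normal(k2, (3,))
```
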
Suggested fix:
```python
key, x_key, eps_key = jax.random.split(key, 3)
x = jax.random.uniform(x_key, (n,))
α, β, σ = 0.5, 1.0, 0.1
ϵ = jax.random.normal(eps_key, (n,))
```
### 2. Unused `random` import in `lectures/numpy_vs_numba_vs_jax.md`

`random` is imported but never used in the lecture. It can be safely removed.
Found by GitHub Copilot code review during automated translation sync.