Implement toNAF(n), which converts an integer n (negative n allowed) to its Non-Adjacent Form (NAF): a signed-binary representation with digits drawn from {-1, 0, 1} in which no two adjacent digits are nonzero, returned least-significant-digit first. Implement fromNAF(digits), which converts a NAF digit sequence back to the integer. For example, 7 is 111 in binary (three nonzero digits), while its NAF is 8 - 1, written (1, 0, 0, -1) most-significant-first (two nonzero digits).

Requirements:
1. Prove correctness: fromNAF(toNAF(n)) = n for every integer n, and the output of toNAF satisfies the non-adjacency property.
2. Show why NAF minimizes the number of nonzero digits among all signed-binary representations of the same value.
3. Analyze time and space complexity.
4. Handle edge cases: n = 0 and very large magnitudes.

As an extension, show how to compute n * k efficiently using the NAF of k, replacing each run of additions with a single subtraction to reduce the total number of additions.
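A minimal sketch of one possible solution in TypeScript, using bigint so very large magnitudes work unchanged. The names toNAF and fromNAF come from the statement; the choice of number[] for the digit sequence and the normalization of bigint's sign-preserving % operator are assumptions of this sketch, not requirements. The key step: when v is odd, choose the digit d in {-1, +1} with v ≡ d (mod 4), which makes (v - d) divisible by 4 and so forces the next digit to be 0, guaranteeing non-adjacency.

```ts
// Sketch: NAF conversion over bigint (assumed digit type: number[], LSB first).

export function toNAF(n: bigint): number[] {
  const digits: number[] = [];
  let v = n;
  while (v !== 0n) {
    if (v % 2n !== 0n) {
      // v is odd: pick d in {-1, +1} with v ≡ d (mod 4), so that
      // (v - d) / 2 is even and the next digit is forced to be 0.
      let m = v % 4n;
      if (m < 0n) m += 4n;        // bigint % keeps the sign of v; normalize
      const d = m === 1n ? 1 : -1;
      digits.push(d);
      v -= BigInt(d);
    } else {
      digits.push(0);
    }
    v /= 2n;                      // exact division: v is even at this point
  }
  return digits;                  // n = 0 yields the empty sequence
}

export function fromNAF(digits: number[]): bigint {
  // Horner evaluation from the most significant digit down.
  let v = 0n;
  for (let i = digits.length - 1; i >= 0; i--) {
    v = 2n * v + BigInt(digits[i]);
  }
  return v;
}
```

For example, toNAF(7n) yields [-1, 0, 0, 1] (that is, 8 - 1), and fromNAF(toNAF(-3n)) === -3n. Each iteration roughly halves |v|, so the loop runs O(b) times for a b-bit input and emits at most b + 1 digits, giving O(b) time in bigint operations and O(b) space.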
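For the extension, a hedged sketch of signed double-and-add driven by the NAF of k; the helper name mulViaNAF is hypothetical. Since on average about one third of NAF digits are nonzero (versus one half for plain binary), this needs fewer add/subtract steps per doubling; the same structure is why NAF appears in elliptic-curve scalar multiplication, where doubling and negation are cheap.

```ts
// Sketch: compute n * k via the NAF of k, assuming the toNAF above.
// One doubling per digit; an add or subtract only on nonzero digits.
export function mulViaNAF(n: bigint, k: bigint): bigint {
  const digits = toNAF(k);
  let acc = 0n;
  // Process digits most-significant-first: acc <- 2*acc + d*n.
  for (let i = digits.length - 1; i >= 0; i--) {
    acc *= 2n;
    if (digits[i] === 1) acc += n;        // digit +1: add n
    else if (digits[i] === -1) acc -= n;  // digit -1: subtract n
  }
  return acc;
}
```

Tracing k = 7 (NAF digits -1, 0, 0, 1 LSB-first): acc goes 0 → n → 2n → 4n → 8n - n = 7n, i.e. two add/subtract operations instead of the three that binary 111 would require. Negative k works unchanged because toNAF already handles it.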