This question evaluates understanding of binary representations and signed-digit encodings (specifically Non-Adjacent Form), algorithmic reasoning about correctness and optimality, and the ability to analyze time/space complexity and edge-case behavior, within the Coding & Algorithms domain. It combines conceptual understanding with practical implementation.
Implement toNAF(n), which converts an integer n (negative n allowed) to its Non-Adjacent Form: a signed-binary representation with digits drawn from {-1, 0, 1} in which no two adjacent digits are both nonzero, returned least-significant-digit first. Implement fromNAF(digits) to convert a NAF digit sequence back to the integer. Prove that the conversion is correct, show why NAF minimizes the number of nonzero digits among all signed-binary representations, analyze time and space complexity, and handle edge cases (n = 0, very large magnitudes). As an extension, show how to compute n * k efficiently using the NAF of k to reduce the number of additions.
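One common way to meet this specification is the repeated mod-4 reduction: whenever n is odd, emit the digit d = ±1 that makes n - d divisible by 4, which forces the next digit to be 0 and guarantees non-adjacency. The sketch below (Python; the helper name `mul_via_naf` for the extension is chosen here, not given in the prompt) is one possible solution under those assumptions, not the only valid answer:

```python
def toNAF(n):
    """Convert integer n (may be negative) to Non-Adjacent Form,
    returned as a list of digits in {-1, 0, 1}, least-significant-first.
    n = 0 yields the empty list."""
    digits = []
    while n != 0:
        if n % 2:                # n odd: choose d = +/-1 so that n - d = 0 (mod 4),
            d = 2 - (n % 4)      # which forces the next emitted digit to be 0
            digits.append(d)
            n -= d
        else:
            digits.append(0)
        n //= 2                  # exact halving: n is even at this point
    return digits

def fromNAF(digits):
    """Inverse of toNAF: evaluate the sum of d_i * 2**i."""
    return sum(d * (1 << i) for i, d in enumerate(digits))

def mul_via_naf(n, k):
    """Extension sketch: compute n * k by adding or subtracting a shifted
    copy of n only at the nonzero NAF digits of k. NAF has at most one
    nonzero digit in any two adjacent positions, so this needs fewer
    add/subtract steps on average than plain binary multiplication."""
    acc = 0
    for i, d in enumerate(toNAF(k)):
        if d == 1:
            acc += n << i
        elif d == -1:
            acc -= n << i
    return acc
```

Note that this sketch leans on Python's floor-division semantics: for negative n, `n % 4` is still in {0, 1, 2, 3}, so `d = 2 - (n % 4)` handles negative inputs without a separate branch, and arbitrary-precision integers cover the very-large-magnitude edge case. Each loop iteration strictly reduces |n| by at least half, so the running time and output length are O(log |n|).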