In this sense, Turing completeness is a *really bad* property of a programming language, because it's such a gigantic over-approximation of the thing you actually want, which is the total computable functions. What languages like Agda and Idris do is much better: "by default" (i.e. using the obvious type signature Nat -> Nat) they supply as large an under-approximation of the total computable functions as possible, one that contains everything you need it to contain except in 1-in-a-million cases, and for those remaining cases they let you fall back to the standard over-approximation by partial computable functions via a different type signature (like Nat -> Delay Nat or whatever).
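
To make the two kinds of signature concrete, here's a minimal sketch in Haskell rather than Agda or Idris, with hand-rolled names (`Delay`, `Now`, `Later`, `search`) that are not any library's actual API. Haskell won't check totality for you, so this only illustrates the shape of the types, not the enforcement:

```haskell
import Numeric.Natural (Natural)

-- The "default" kind of signature: a totality checker would only accept a
-- definition at this type if it can see that it terminates on every input.
double :: Natural -> Natural
double n = 2 * n

-- Capretta-style delay monad: either an answer now, or an answer one step
-- later (possibly postponed forever).
data Delay a = Now a | Later (Delay a)

-- Unbounded search: the least n >= k satisfying p.  If p is never satisfied,
-- this produces Later after Later without end, which is exactly the
-- possibility that the Delay in the codomain records.
search :: (Natural -> Bool) -> Natural -> Delay Natural
search p k
  | p k       = Now k
  | otherwise = Later (search p (k + 1))
```

In Agda or Idris the payoff is that something like `search` simply doesn't typecheck at the plain Nat -> Nat signature; you have to reach for the Delay-style codomain (or whatever partiality mechanism the language provides) even to write it down.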