Is it OK to have stack depth linearly proportional to some input size?
When programming in Java (or any other procedural language, for that matter), I often have to choose between solving something recursively and solving it iteratively. The recursive option is usually the more elegant one, so that's what I tend to go for. With one exception:
Worrying about stack overflows, I tend to avoid recursive solutions if the maximum stack depth is linearly proportional to the size of the input (or worse). I realize, however, that in many other languages (even ones targeting the JVM, such as Scala and Clojure) many algorithms, such as basic list algorithms for instance, are often expressed recursively, where the maximum stack depth is proportional to the length of the list.(1) So, are my worries about stack overflows in linear-stack-depth algorithms justified?
TL;DR: What "stack depth complexity" is considered reasonable? Logarithmic depth, O(log N), as in recursive binary search, is surely OK, but how about O(N), O(N log N), or O(N²)? Where would you typically draw the line?(2)
(1) I realize that such languages sometimes support things like @tailrec, but this question concerns Java, C#, etc.
(2) Note that I'm not concerned about CPU overhead etc.; just the stack depth.
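To make the worry concrete, here is a hedged Java sketch (the class and method names are mine, not from the question): a recursive sum whose stack depth equals its input. It completes for small inputs but overflows the default thread stack for large ones.

```java
public class DeepRecursion {
    // Recursive sum of 1..n: each call adds one stack frame,
    // so the maximum stack depth is O(n).
    static long sumTo(long n) {
        return n == 0 ? 0 : n + sumTo(n - 1);
    }

    // Returns true if sumTo(n) finishes without exhausting the stack.
    static boolean fitsOnStack(long n) {
        try {
            sumTo(n);
            return true;
        } catch (StackOverflowError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(fitsOnStack(1_000));        // fine on any default stack
        System.out.println(fitsOnStack(100_000_000));  // far exceeds a default stack
    }
}
```

With a default thread stack on the order of hundreds of kilobytes to a few megabytes, HotSpot typically manages somewhere in the tens of thousands of frames here; the exact number depends on the JVM and on -Xss, which is precisely why linear depth is hard to reason about.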
You can't answer this without asking yourself what size input you want to be able to support.
Linear space complexity on the stack is absolutely fine if you're processing a deck of playing cards, and probably completely hopeless if you're taking in huge text files. You need to work out what the maximum input size will be for your application, or rather, the largest input for which you don't mind it failing.
We do this all the time. For instance, every time you declare an array in Java, you know that the array can't have more than 2³¹ elements in it. Does that matter? Does it mean the program is broken? Probably not; but sometimes it does, if huge input is something you want to be able to deal with.
So I don't think you can be too prescriptive about this. What would you say if someone asked (in quite general terms) what time complexity was OK? You might say some general principles: usually linear is OK, usually exponential is bad... but the real answer is, it depends what you're doing.
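When you do need to handle inputs past that line, the standard escape hatch in Java (which has no guaranteed tail-call elimination) is to rewrite the recursion as a loop. A minimal sketch, reworking the classic recursive sum of 1..n; the names here are illustrative, not from the answer:

```java
public class IterativeSum {
    // Same computation as a recursive sum of 1..n, but with O(1)
    // stack depth: a loop variable replaces the chain of stack frames.
    static long sumTo(long n) {
        long total = 0;
        for (long i = 1; i <= n; i++) {
            total += i;
        }
        return total;
    }

    public static void main(String[] args) {
        // An input that would overflow the stack recursively.
        System.out.println(sumTo(100_000_000)); // prints 5000000050000000
    }
}
```

For non-tail recursions (tree traversals, for example), the same transformation works with an explicit java.util.ArrayDeque standing in for the call stack, moving the depth from the limited thread stack to the much larger heap.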
Stack depth complexity is one thing to consider; another is the input size itself.
If the problem domain excludes large inputs (and will continue to do so for the foreseeable future of the project), then we can go for a recursive solution, provided, of course, that the recursion does not exceed the capacity of the stack for the largest possible input.
If the input size is not bounded, O(1) or O(log N) stack depth is acceptable, IMHO. In theory, O(log N) could still fail if the practical upper limit on the input size were so astronomically large that even logarithmic depth exceeded the stack capacity, but I think such a case would be quite rare.
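To see why an astronomical input would be required: logarithmic depth grows so slowly that even a range near the size of Long.MAX_VALUE needs only a few dozen frames. A hedged sketch, assuming a binary-search-style recursion that always descends into one half (the helper is mine):

```java
public class LogDepth {
    // Counts the stack depth of a recursion that repeatedly halves
    // the range [lo, hi), as binary search does: depth is O(log N).
    static int depth(long lo, long hi) {
        if (hi - lo <= 1) return 1;   // base case: range of size <= 1
        long mid = lo + (hi - lo) / 2;
        return 1 + depth(lo, mid);    // recurse into the lower half
    }

    public static void main(String[] args) {
        // A range of roughly 2^62 elements needs only about 62 frames.
        System.out.println(depth(0, Long.MAX_VALUE / 2));
    }
}
```

Since a default JVM thread stack comfortably holds thousands of frames, O(log N) depth only becomes a problem at input sizes that are physically unrealizable.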