Theoretically yes, but practically no. Formally, Big O notation is defined in terms of behaviour as n goes to infinity, and over any finite range of inputs every algorithm is trivially O(1), so "O(n) for small n" is vacuous in the strict mathematical sense. In practice, though, there's no such thing as infinity: saying that an algorithm's runtime is in O(n) is taken to imply certain things about its behaviour for finite n, and that's kind of the point of using the notation at all. Moreover, it's perfectly possible for an algorithm to be defined piecewise on the input size, in which case saying it runs in O(n)-like time for small n is a statement with real practical significance.
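To make the "defined piecewise on the input size" point concrete, here is a minimal sketch (all names and the threshold value are illustrative, not from any particular library): a hybrid sort that uses insertion sort below a size cutoff, because its simple inner loop is fast on small inputs despite its O(n²) worst case, and falls back to merge sort's O(n log n) behaviour above it. Real-world hybrids like introsort and Timsort use exactly this kind of piecewise definition.

```python
SMALL_N = 16  # cutoff; real hybrid sorts pick this empirically

def insertion_sort(a):
    # O(n^2) worst case, but very fast for small n in practice.
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def hybrid_sort(a):
    if len(a) <= SMALL_N:
        return insertion_sort(a)   # the "small n" branch of the piecewise definition
    # Otherwise: merge sort, O(n log n) overall.
    mid = len(a) // 2
    left, right = hybrid_sort(a[:mid]), hybrid_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]
```

A claim like "this algorithm behaves like insertion sort for n ≤ 16" is exactly the kind of finite-n statement the question is about, and it clearly carries meaning even though asymptotic notation, read literally, says nothing about it.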