When trying to answer this question, it's easy to start talking about programming languages and computer science, but ultimately I found answers involving those concepts unsatisfactory.
If we think about what is needed to become a good programmer, or even just to start learning programming, computer science and knowledge of a programming language aren't good answers at all.
In most beginner programming classes, those two topics aren't explicitly touched until a few lectures in, and even then, you can easily notice that the real struggle for a beginner is not knowing how to write a for-loop or how to declare a variable; it's how to code.
To code doesn't merely mean "to write computer programs"; it hints at a deeper understanding of programming: when we code, we are really translating a problem into a program capable of solving it. We are coding that problem with a programming language.
That is the actual thing I do every time I code. I take a problem, analyze it, and decompose it into smaller and smaller subproblems until I get to something a programming language can directly handle (e.g. a conditional statement, an algebraic expression, an assignment).
If a beginner is provided with a description of a program (like a flow chart or a pseudocode snippet), she can easily translate it into "real" code. However, provide just the statement of the problem ("sort a vector", "find the GCD of two numbers") and the task becomes a lot harder.
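Take "find the GCD of two numbers" as a concrete sketch of this translation. One common decomposition (Euclid's algorithm, chosen here as an illustration, not the only possible one) reduces the problem to exactly the primitives mentioned above, a loop condition, an assignment, and an algebraic expression:

```python
def gcd(a, b):
    # Decomposition of "find the GCD":
    # repeatedly replace (a, b) with (b, a mod b) until b is zero.
    while b != 0:          # a conditional statement (the loop test)
        a, b = b, a % b    # an assignment using an algebraic expression
    return a

print(gcd(48, 18))  # → 6
```

The hard part for a beginner is not any single line here; it is discovering that the problem statement decomposes into these three primitives in the first place.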
I remember struggling to write very simple programs when I learned to code, and I see people learning from scratch going through the same hurdles.
Coding is by no means an easy task; it's one which requires years to be mastered to a high enough level, and even then, you cannot realistically expect to become able to code every problem. Simple problems will become increasingly easier, although even experienced programmers can struggle with problems whose coded form is far from trivial (e.g. object recognition, non-linear optimization, etc.).
With time, our decomposing skills get better and better, as does our knowledge of how to decompose (algorithms, theoretical computer science), of different programming languages, and so on.
Then we start to bump into new issues: finding a program that just solves the problem is not enough anymore; we need it to be future-proof. We start dealing with things like technical debt, maintainability, testing, and so on. I would argue this is a definite line of demarcation between two different ways of being a programmer, a meaningful step in your growth as a programmer.
If we embark on big projects (or simply long ones), this is going to happen very soon, simply because we are human, and humans complicate everything: humans make mistakes, hence the need to check the correctness of our programs (TDD, linting, etc.); humans change their minds (versioning, maintainability, and an awful lot more); humans don't understand each other as well as computers do (code formatting rules, programming idioms, documentation); and there are other sneaky issues you need to deal with.
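Checking correctness, for instance, can start as lightweight as a single unit test. Here is a minimal sketch in Python; the `sort_vector` function is a hypothetical stand-in for the "sort a vector" problem mentioned earlier, not code from any real project:

```python
import unittest

def sort_vector(v):
    # Hypothetical function under test: returns a new sorted list.
    return sorted(v)

class TestSortVector(unittest.TestCase):
    def test_basic(self):
        self.assertEqual(sort_vector([3, 1, 2]), [1, 2, 3])

    def test_empty(self):
        self.assertEqual(sort_vector([]), [])

    def test_does_not_mutate_input(self):
        v = [2, 1]
        sort_vector(v)
        self.assertEqual(v, [2, 1])

if __name__ == "__main__":
    unittest.main()
```

The point is not the test framework itself: it's that "does the program work?" stops being a question you answer once by running it, and becomes something you encode so it can be re-asked automatically every time the code changes.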
Unless you immediately start writing enterprise-level code, it's likely you're going to deal with these issues in chunks, and you'll slowly assimilate them into your coding style and, especially, into your problem translation process.
Coding changes meaning and becomes a stronger concept: you are not just coding your problem into a program which can solve it, you are doing it in a good way, that is, one that satisfies one or more of the principles listed above.
You are not writing code that "just works" anymore.