When I studied computer science, the emphasis was on practical problem solving and learning 'to code'. I remember brief mentions of Turing Machines and von Neumann architecture before we got on with the real stuff. A few years later, I think it would have been called a Software Engineering course. At my first job in Cambridge I met Cambridge Computer Lab graduates who had followed a much more theoretical path, and self-taught hackers. The hackers were often the better coders because they'd learned out of love.
I now find myself interested in the difference between calculation and computation, and I don't "have the moves". Computability appears to have been defined in terms of the Turing Machine, which was itself defined in order to explain what computing is. That feels like wanting to define agricultural machinery, so specifying a theoretical lawn mower and then defining machinability in terms of the mower.
I've found some very unsatisfactory explanations of the difference: calculation is simple arithmetic; computation has conditionals (i.e. is Turing complete). Can any of you computing people tell me what, in computer science, computing actually is, please?

@auxbuss posted a link about computing causing state change, but I pointed out that pure, immutable functional code doesn't. I immediately realised that it does, outside the functional system, on the 'platform' it is running on. (That's very like how life reverses entropy within the living system at the cost of greater entropy outside the system. Perhaps functional systems encapsulate the maths/magic. Clojure passes all the filth out to Java.)
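To pin that unsatisfactory distinction to something concrete, here's a minimal sketch in Clojure (my choice only because it's the language I mention above; the names add-vat and gcd are purely illustrative, not anyone's formal definition): one function is straight-line arithmetic with no conditionals, the other needs a conditional and recursion, and both look pure from inside the language while still changing state on the platform underneath.

```clojure
;; Illustrative only; the names are mine, not a formal definition.

;; "Calculation" in the naive sense quoted above:
;; straight-line arithmetic, no conditionals.
(defn add-vat [price]
  (* price 1.2))

;; "Computation" in the Turing-complete sense: a conditional plus
;; recursion. Euclid's algorithm for the greatest common divisor.
(defn gcd [a b]
  (if (zero? b)
    a
    (recur b (mod a b))))

;; Both are pure and immutable inside Clojure, yet evaluating either
;; still changes state outside the functional system: JVM stack
;; frames, heap allocations, CPU registers.
(add-vat 100)  ;=> 120.0
(gcd 12 18)    ;=> 6
```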