Calculation or Computation?

(Andy Wootton) #1

When I studied computer science, the emphasis was on practical problem solving and learning ‘to code’. I remember brief mentions of Turing Machines and von Neumann architecture before we got on with the real stuff. A few years later, I think it would have been called a Software Engineering course. At my first job in Cambridge I met Cambridge Computer Lab graduates who had followed a much more theoretical path, and self-taught hackers. They were often the better coders because they’d learned out of love.

I now find myself interested in the difference between calculation and computation and I don’t “have the moves”. Computability appears to have been defined in terms of the Turing Machine, which was itself defined in order to explain what computing is. That feels like wanting to define agricultural machinery by specifying a theoretical lawn mower, then going on to define ‘machinability’ in terms of the mower.

I’ve found some very unsatisfactory explanations of the difference: calculation is simple and arithmetic; computation has conditionals (i.e. is Turing complete). Can any of you computer science people tell me what computing is, please? @auxbuss posted a link about computing causing state change, but I pointed out that pure, immutable functional code doesn’t. I immediately realised that it does, outside the functional system, on the ‘platform’ it is running on. (That’s very like how life reverses entropy within the living system at the cost of greater entropy outside the system. Perhaps functional systems encapsulate the maths/magic. Clojure passes all the filth out to Java.)

(Andy Wootton) #2

Via @auxbuss in the ‘Coursera Machine Learning course starts Monday, 17 Oct’ thread: ‘Everything is computation’

and from me: ‘What is computation?’ and ‘Your brain is not a computer’ (I think it is.)

I think calculation is a subset of computation, but the boundary may use fuzzy logic, e.g. is knitting a jumper computation? What if it has your initials on the front?

(Marc Cooper) #3

I started a response, then remembered that there’s a long treatise on this: Roger Penrose, The Emperor’s New Mind. It goes way beyond this topic, of course.

I read it when it came out and recognise it as formative in retrospect. I’d get far more out of it now and should probably reread it. So, I just ordered it :slight_smile:

(Andy Wootton) #4

I’ve gone way off the real subject of my question with the references. I’m interested in data processing, not consciousness. No-one seems able to define what consciousness means, even if they’re convinced it exists.

I was just trying to decide what makes a ‘machine’ change from being a ‘calculator’ to a ‘computer’. I currently think it’s about the input data being able to change the function, and data being created as output; otherwise a Jacquard loom is a computer, and I don’t think Babbage would like that. I think the Difference Engine was probably a calculator (to produce tables for human ‘computers’ to use in their work), but the Analytical Engine design was for a general purpose computer. I couldn’t see how to explain the difference.

(Marc Cooper) #5

That’s one of the things that makes you interesting. Please don’t change :slight_smile:

Are you asserting that consciousness is not data processing? The book remains relevant.

(Andy Wootton) #6

Because of the way I think, I’ve never been convinced of the idea that consciousness resides in one place or that a brain and a computer need to work the same way. We know the mind isn’t entirely electrical.

The way I spray ideas around on-line is an experiment in not doing one thing at a time and ‘crossing the streams’. I think I could stop if I wanted to but I think networks of people bouncing off one another can invent things.

(Andy Wootton) #7

I’m back on this question.


“The Analytical Engine marks the transition from mechanised arithmetic to fully-fledged general purpose computation”

Others replace “mechanised arithmetic” with “mechanical calculator”, for the Difference Engine.

“General Purpose Computer” points to a definition which gives:
“The processing element carries out arithmetic and logical operations, and a sequencing and control unit can change the order of operations in response to stored information”, so I think the fact that the input is a set of symbols rather than just numbers is important.

Definitions seem very circular. Turing clearly knew what he thought computing was, because he appears to have invented the theoretical Turing Machine to explain it, then defined computability in terms of the machine. Others did similar things which were shown to be equivalent. The Church–Turing thesis gives a hypothesis about the nature of computable functions:
“It states that a function on the natural numbers is computable by a human being following an algorithm, ignoring resource limitations, if and only if it is computable by a Turing machine.”

Digital physics suggests that the universe is a computer. I don’t buy that the universe can be replaced by a Turing machine - maybe a quantum computer? I’ve strayed out of my region of clue now.

(Steve Jalim) #8

My general Wikipedia-veracity caveat: the crowd is good, but the crowd is not infallible.

(Andy Wootton) #9

I’ve gone back to Turing’s original paper. He says a computer is anything that can calculate the set of computable numbers :slight_smile:
Then there’s a load of maths. There’s a book that explains this paper. I don’t think I’ll read it.

(Andy Wootton) #10

“computable numbers are the real numbers that can be computed to within any desired precision by a finite, terminating algorithm.”
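To make that definition concrete, here’s a minimal sketch (my own illustration, not from Turing’s paper): √2 is a computable number because a finite, terminating algorithm, bisection in this case, can approximate it to within any desired precision.

```python
# Illustrative sketch: sqrt(2) is computable because a finite, terminating
# algorithm can get within any requested precision of it.

def sqrt2(precision: float) -> float:
    """Bisection: narrow [lo, hi] until the interval is smaller than `precision`."""
    lo, hi = 1.0, 2.0              # sqrt(2) lies between 1 and 2
    while hi - lo > precision:
        mid = (lo + hi) / 2
        if mid * mid < 2:
            lo = mid               # the answer is in the upper half
        else:
            hi = mid               # the answer is in the lower half
    return lo                      # terminates: the interval halves each step

print(sqrt2(1e-9))  # ~1.414213562, within a billionth of sqrt(2)
```

The key property is the `precision` parameter: whatever tolerance you ask for, the loop halves the interval a finite number of times and stops.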

For an imperative language to be Turing-complete, it needs:

A form of conditional repetition or conditional jump
A way to move to, read and write some form of storage
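Those two ingredients really are enough. As a sketch (the instruction names and program are invented here for illustration), a toy machine with only read/write registers and a conditional jump can already compute things like multiplication by repeated addition:

```python
# Illustrative sketch: a toy machine with just read/write storage (registers)
# and a conditional jump ("jnz") computing 3 * 4 by repeated addition.

def run(program, registers):
    pc = 0                                    # program counter
    while pc < len(program):
        op, *args = program[pc]
        if op == "set":                       # write storage: reg <- constant
            registers[args[0]] = args[1]
        elif op == "add":                     # read and write: reg <- reg + reg
            registers[args[0]] += registers[args[1]]
        elif op == "dec":                     # reg <- reg - 1
            registers[args[0]] -= 1
        elif op == "jnz":                     # conditional jump: if reg != 0, go to target
            if registers[args[0]] != 0:
                pc = args[1]
                continue
        pc += 1
    return registers

prog = [
    ("set", "acc", 0),
    ("set", "n", 4),
    ("add", "acc", "x"),   # acc += x
    ("dec", "n"),          # n -= 1
    ("jnz", "n", 2),       # loop back while n != 0
]
print(run(prog, {"x": 3})["acc"])  # 12
```

A Jacquard loom, by contrast, reads its cards strictly in sequence: there is no `jnz`, so the order of operations can never depend on stored data, which is arguably exactly the calculator/computer boundary discussed above.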

Truly amazing fact: Babbage’s Analytical Engine, designed in 1837, would have been the first Turing-complete machine (with the infinite-storage requirement relaxed), but the UK government wouldn’t fund it because he’d gone off-plan from the Difference Engine they ordered in 1823. Would that be PRINCE 1?