Coursera Machine Learning course starts Monday, 17 Oct


(Marc Cooper) #47

Leaving this here mainly for @stevejalim, since it relates strongly to this topic: Everything Is Computation

What really hit home for me is in the penultimate paragraph:

Computation changes our idea of knowledge: instead of treating it as justified true belief, knowledge describes a local minimum in capturing regularities between observables.


(Andy Wootton) #48

There seems to be a problem earlier in the article: if state change is a precursor to computation, how do pure functions with immutable state work? Are they pure maths or just magic? This is what happens when MIT moves from Lisp to Python :slight_smile: A very thought-provoking piece (I know I’m Wrong, but the thought process is a fun trip).
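To make the puzzle concrete, here’s a minimal Python sketch (my own, not from the article) of a pure function beside a stateful one:

```python
# A pure function: same inputs always give the same output,
# and nothing observable is mutated.
def add(xs, x):
    return xs + (x,)  # builds a new tuple; the input is untouched

# A stateful equivalent: the computation proceeds by changing state.
def add_in_place(xs, x):
    xs.append(x)  # mutates the caller's list
    return xs

immutable = (1, 2)
result = add(immutable, 3)
print(immutable)  # (1, 2) -- unchanged
print(result)     # (1, 2, 3)

mutable = [1, 2]
add_in_place(mutable, 3)
print(mutable)    # [1, 2, 3] -- changed underneath us
```

Of course, even the “pure” version changes state under the hood (stack frames, the program counter), so perhaps immutability is an abstraction the runtime provides rather than an absence of state change.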

I already had the difference between calculation and computing on my to-do list. You’ve either just made it easier or harder. I need to go and let the cat out.

I’ve had this ‘on file’ for a while, and it’s old: http://www.cs.northwestern.edu/~ian/What%20is%20computation.pdf

I find it interesting that Babbage and Ada seem to have seen the difference between the computational Analytical Engine and the calculational Difference Engine more clearly than we do, and that Turing defined mathematically what computability is, yet we still seem unclear about the definition of computation.


(Andy Wootton) #49

I’ve mentioned you in dispatches @auxbuss (as a trouble-maker)


(Andy Wootton) #50

This just came up on a friend’s feed


(Marc Cooper) #51

That’s a good paper. It cops out a bit, ultimately answering “What is computation?” with “it’s tricky to define”, but it covers some good ground and gives the reader plenty to think about.

Also, state change doesn’t necessarily equate to a change of (a variable’s) state. A system is almost always changing state.


(Marc Cooper) #52

I’m internet famous at last! :dancers: :blush:

My view is that time is an illusion. An emergent phenomenon. I base this on absolutely nothing more than what I know of what we humans know right now. Which is a tiny fraction of what a CERN physicist knows. Space, as we know it, is probably the same.

The idea of quantum time has been around a while. The limit is Planck Time, I believe.

So, your closing idea washes over me since there are no dimensions and there’s certainly no now. I suspect these are simply emergent properties of fields. I also suspect there are a few more fields than we already know about. I hope so.

A collection of fields jiggling about with their fluctuations and vacuum energy interacting with each other under an immutable set of rules. We call it The Universe.


(Andy Wootton) #53

There was a scientist on Radio 4 talking about a new theory that ‘gravity might be wrong’, so there’s no need for Dark Matter to exist to make the equations work :slight_smile:

He said something like, “Physics will carry on, whether we understand it or not.” I feel the same way about Truth. It exists, even if we don’t know what it is, or if it is unknowable. Beliefs, testable hypotheses, theories, ‘Laws’ and scientific facts are us trying to edge closer, mostly in the dark, occasionally falling in bunkers.


(Steve Jalim) #54

I’m very, very nearly done with the course and will write up my views on it when I am.

However, in the meantime, this is awesome: http://kevinhughes.ca/blog/tensor-kart. And even more so given that everything significant and techy in it makes sense and is familiar, thanks to the course.


(Andy Wootton) #55

Is this related to the “discrete tensor networks” I mentioned in the blog post, above? Thanks for the link. It looks relevant to several of my interests. I was hoping to invent a multi-dimensional network thing once I was better at Clojure.


(Steve Jalim) #56

and


(Andy Wootton) #57

I don’t actually understand the question I’m asking yet. I deliberately didn’t find out until I’d written the blog entry. A quick search gave me “neural tensor networks” and weird quantum entanglement shizzle, which I can imagine being a multi-dimensional network. When I failed to learn Lisp in 1981, it was for a project to add fuzzy logic to a semantic network. They’ve taken their time, haven’t they? :smiley: I’m going to blame the PC revolution, as usual, for plunging us back into the dark ages.


(Marc Cooper) #58

TensorFlow looks awesome. If it had some modern language bindings, it’d be great :smile:


(Marc Cooper) #59

MOND (Modified Newtonian Dynamics) has been around for a long time. The problem is making the equations work. Newtonian physics has proved to be extremely accurate for every space mission we’ve deployed to date, so it certainly scales to the Solar System level.
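For anyone wondering what “making the equations work” means here: the core MOND idea is that below a characteristic acceleration scale a₀, the effective acceleration becomes a = √(a_N · a₀), which predicts flat galactic rotation curves while reducing to Newton at Solar System accelerations. A toy sketch (my own, with a made-up galaxy mass, just to show the shapes of the two curves):

```python
import math

G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
A0 = 1.2e-10    # MOND acceleration scale, m s^-2
M = 2e41        # toy value: rough visible mass of a galaxy, kg
KPC = 3.086e19  # metres per kiloparsec

def v_newton(r):
    """Circular velocity from plain Newtonian gravity: v = sqrt(G*M/r)."""
    return math.sqrt(G * M / r)

def v_mond_deep(r):
    """Deep-MOND regime (a << A0): a = sqrt(a_N * A0) = sqrt(G*M*A0)/r,
    so v^2 = sqrt(G*M*A0) and v = (G*M*A0)**0.25 -- independent of r."""
    return (G * M * A0) ** 0.25

for kpc in (10, 20, 40):
    r = kpc * KPC
    print(kpc, "kpc:",
          round(v_newton(r) / 1000), "km/s (Newton) vs",
          round(v_mond_deep(r) / 1000), "km/s (deep MOND)")
# The Newtonian velocity falls off with radius; the MOND curve stays flat,
# which is what Rubin actually observed in galactic rotation curves.
```

This is only the non-relativistic, asymptotic form; the hard part (and the reason for papers like the one linked below) is making a full relativistic theory reproduce it.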

I suspect the reason this is being discussed is because of the recent death of Vera Rubin; a personal heroine of mine. Her book, Bright Galaxies, Dark Matters, is wonderful for anyone remotely interested in dark matter. It’s one of the very few paper books I retain. I just love owning it and reading bits of it from time to time.

Vera’s hope was that it is Newtonian physics that is wrong, and that dark matter, as a sub-nuclear particle, is not the cause of the galactic gravitational anomaly. I’m with Vera on this.

It’s disgraceful that she wasn’t awarded a Nobel for her work, and I’m not the only one to think so: http://www.nytimes.com/2017/01/04/opinion/why-vera-rubin-deserved-a-nobel.html

Edit: So the non-relativistic component of MOND bothered me. I found this: https://arxiv.org/abs/1112.3960


(Marc Cooper) #60

Moving away slightly from ML, but same field. Loved this from 33rd Chaos Communication Congress (33C3) in December. Paging @stevejalim & @Woo. Q&A at ~45 mins:

https://www.youtube.com/watch?v=u7aB2khRKWY

Joscha Bach is @plinz on the twitters.


(Andy Wootton) #61

This is great. I’m only halfway through. It mentions the state-vector of the universe. JSD (Jackson System Development) talks about state-vectors for a single process, and I think each contains a ‘text pointer’. The universe must have a big book, full of them.

I went to a talk by Jackson on how to use JSD with OO. He seems to have had the same ideas.


(Andy Wootton) #62

Deep learning and neural nets, from LinkedIn: http://www.andreykurenkov.com/writing/a-brief-history-of-neural-nets-and-deep-learning/


(Andy Wootton) #63

Pete Ashton (a fellow at BOMlabs) was asking on Twitter if anyone is interested in machine learning in art.


(Marc Cooper) #64

This is an excellent intro. It uses a bit of ML lingo (i.e. everyday words that have particular meanings in ML), but I don’t think it detracts. It’s only 20 mins. Well worth a watch.

As I watched, I pondered whether the technique described could be applied to automatically handle functional tests, particularly regression testing. Interesting times.


(Marc Cooper) #65

Now starting on Neural Networks for Machine Learning.


(Steve Jalim) #66

At ~2h/week, that’s looking a lot easier to fit into things!