A discussion where I’m in over my head.
It started with this article in Wired Magazine, where the author describes what a world of computational thinking looks like when data approaches infinity…
Does the axiom that correlation does not equal causation still hold?
Then my friend (who will not be cited here, since he would not want me to post this anyway) said this about that:
That is interesting. The best case of something like this I’ve heard of is in Cystic Fibrosis research where they just try every gene sequence to see which ones help. Massive numbers of trial and error. No theory, no idea which will help. Just keep trying — one of these has to fit…
So if we allow anyone to write quizzes for students and watch the results across a large user group to determine which questions do the best job of discerning A-students from B-students, and B from C, etc., then who needs instructional design? Maybe we can ask undergrads to make flash reviews of key topics, then track exposure to those reviews and test results to find the best reviews. Of course, that seems to put an awful lot on the articulation of the question. Whose test? I can know which question is most often answered correctly by A students but incorrectly by B students for one particular class, but the concept of “A student” just isn’t reliable enough to generalize, is it?
If we have no model but have all of something (such as all the text on the web), do we have anything useful? Life-size maps and time machines that go forward at regular speed and whatnot. I mean, I already have a tool that allows me to access the entire web in my browser. If we have a fact such as “the word ‘handbasket’ almost always comes after the phrase ‘to hell in a,’” do we know anything? Can we infer that handbaskets are consistently defined as a means of conveyance to hell? Maybe this is really about finding statistical correlations on massive scales. So instead of guessing that handbaskets go with hell, we can actually know that they do. Which is useful, but descriptive. It’s like not having to guess at how far away the moon is but instead being able to measure it within a fraction of an inch. That’s far more precise and very helpful in planning trips there, but in the end it’s a description of what is, and not comprehension, knowing, or meaning.
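(An aside: the handbasket fact is exactly the kind of thing a few lines of code can count without comprehending anything. A toy sketch in Python; the miniature three-sentence corpus is made up for illustration, standing in for “all the text on the web”:)

```python
from collections import Counter

# A made-up toy corpus; imagine the entire web instead.
corpus = (
    "the country is going to hell in a handbasket . "
    "she carried a handbasket to the market . "
    "we are off to hell in a handbasket again ."
)

tokens = corpus.split()
preceding = Counter()

# For every occurrence of "handbasket", record the four words before it.
for i, word in enumerate(tokens):
    if word == "handbasket" and i >= 4:
        preceding[" ".join(tokens[i - 4:i])] += 1

total = sum(preceding.values())
for phrase, count in preceding.most_common():
    print(f"{phrase!r} precedes 'handbasket' {count}/{total} times")
```

The counter dutifully reports that “to hell in a” dominates, but it holds no opinion on whether a handbasket is a means of conveyance to anywhere. Description, not comprehension.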
At this point I feel thoroughly unmoored, as if I’ve missed the point and once again bounced off of a quantitative claim further into my qualitative bubble, so I’ll stop.
But thanks for passing it along.
Then I said this, which I wrote but don’t follow too well…
Yeah, I track that parsing “what happens” is significantly different from “what it means.” The audio clip about education’s end by Anthony Kronman, which is not exactly stellar to my mind, touches on this difference in how we acquire and gauge things by domain. And this petabyte take on discovering patterns in a random but knowable universe strikes me as the inverse of a more stochastic (fabulous word I just learned the other day) way of looking at things, which is to say we take the patterns that we recognize (as a Douglas Adams exercise in silliness) and look for the unlikely, or improbable, as in all the air in the room collecting in one corner: while not impossible, it is highly improbable, right? It is running so many hypothetical tests and pairings that at some point you will get monkeys typing Hamlet.
So if that were not adequately unclear: it seems that it is one thing to take the gwazillion random tests of, say, any pairing and combination of genes within some whopping DNA strand and, with the use of some tricky algorithm, place every piece of a puzzle in every possible combination, and at the end of the day (or nanosecond) you get the exact sensible combination. Confusion into a spiffy jigsaw of Whistler’s Mother. This is taking the arbitrary and guessing at rules until you get lucky, on a 10-to-the-25th scale. Rather than thinking through which piece matches what (Mrs. Whistler’s left eye on the inny piece with Mrs. Whistler’s left eye on the outy piece), you just move all the pieces in a very speedy fashion into all possible positions and get order from disorder. Second law of thermodynamics be darned.
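(The move-every-piece-until-something-fits idea can be sketched in a few lines of Python. A scrambled word stands in for the puzzle, and a made-up fitness check stands in for “the picture looks right”; no reasoning about which piece goes where, just exhaustion:)

```python
from itertools import permutations

# Brute force, no theory: generate every arrangement of the pieces
# and test each one against a "does it fit" check.
pieces = "reodr"

def fits(candidate: str) -> bool:
    # Stand-in for "the jigsaw picture looks right": here, spelling a word.
    return candidate == "order"

# Try all 5! = 120 arrangements and keep the first that passes.
solution = next(
    "".join(p) for p in permutations(pieces) if fits("".join(p))
)
print(solution)
```

Order from disorder, by trying everything. The only intelligence in the loop is the check at the end that recognizes a fit.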
But in the petabyte article I gather he is referencing all data (and that’s a lot of jigsaw pieces) being juxtaposed in all ways, and all the combinations of all the things (or numbers), and that this will supplant the scientific model of observing, stating a hypothesis, and theorizing; instead it will give one a numeric concept of what is. This is the opposite of stochastic, right? It is saying that the monkeys finished Shakespeare because they technically could.
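(For scale, the monkey arithmetic is easy to run. A quick sketch in Python, where the 27-key typewriter and the sample phrase are my own assumptions:)

```python
# How many random attempts before a monkey types even a short phrase?
# Assume a 27-key typewriter (26 letters plus a space bar), uniform
# random typing, and a fresh attempt of the full phrase each time.
alphabet = 27
phrase = "to be or not to be"  # 18 characters

# Expected number of attempts is the reciprocal of the per-attempt odds.
expected_attempts = alphabet ** len(phrase)
print(f"one success in roughly {expected_attempts:.2e} attempts")
```

Which lands in the same 10-to-the-25th neighborhood as the gene-shuffling above, for eighteen characters, never mind all of Hamlet.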
All that to say that while the correlation of handbaskets to damnation can be counted, it cannot really be understood, unless we kid ourselves with the idea of understanding and it really is quantitative, and there you go. The counting is all that understanding is, and everything else is just the Humanities.
So I hope that this response was as cryptic as yours. I do like the words.