Tracy asks:
Stephen Downes, I asked you before in a comment on one of Ken's posts if there is any evidence that could convince you that content knowledge is necessary for thinking. I suppose I was too late and you were no longer reading the thread, so I'll try again here, running the risk, of course, that you might no longer be following this thread either.
Is there any evidence that could convince you that content knowledge might be necessary for reading and critical thinking?
I pulled Tracy's comment up front because I'd also like to hear a straight answer from Stephen to this question, and the best way to attract an answer from him is to write a blog post, which he'll see in his feed reader. Typically, he'll be the first to comment.
Of course, Tracy is trying to thread the needle by expecting a concise, clear, and (fully) supported answer from Stephen that answers the precise question asked. Tracy expects this type of answer because that's how she would answer (I know this based on the numerous comments Tracy has left on many blogs). Though there is a slight chance he will provide such an answer (just to prove me wrong), it is far more likely that he will respond in one of two ways.
He will either provide a concise answer based on at least one re-defined term from the question. In this case, he'll re-define the term "content knowledge" to mean "only facts taught by rote," and he'll provide an answer along the lines of "Facts are almost never necessary for critical reading and thinking skills." Technically, his answer is accurate based on his re-definition. The problem is that he often doesn't state the re-definition, leaving it implicit. It took me two years to determine that he was operating under this particular re-definition of "content knowledge."
Or, he'll provide the classic shaggy dog answer. I often get to the end of one of Stephen's lengthy blog posts only to find that the conclusion doesn't necessarily follow from what preceded it. This doesn't mean that he's necessarily wrong; all it means is that I cannot follow his argument as written. That's a failure of advocacy. (My day job entails the ability to quickly understand and analyze arguments based on technical descriptions written by experts for other experts in domains in which I am not necessarily an expert, so if anyone should be able to make sense of Stephen's arguments, it's me.)
We actually don't need to reinvent the wheel on this issue. Stephen has already provided partial answers.
Let's start with this one.
[Y]es, you need domain knowledge to answer requests for facts within that domain.
That answer came in response to four specific questions whose answers depended on domain knowledge.
But notice the qualification: "requests for facts within that domain." It's superfluous. The questions did request factual answers, but only because that's the easiest and most common way of demonstrating one's knowledge; the knowledge could have been demonstrated without the use of facts. This is readily apparent in the last two questions.
1. Jones sacrificed and knocked in a run. Where are Jones and the runner?
2. When John walked out onto the street, he nictitated rapidly. Where might John have just come from?
Now imagine that the reader has been presented with four pictures or diagrams for each question. One of the pictures/diagrams is a depiction of a correct answer. The other three do not depict the correct answer. The reader is instructed to select the correct picture/diagram.
The reader can now demonstrate his knowledge without knowing any facts. However, in both cases (the fact-based answer and the picture/diagram-based answer) the reader must still be able to successfully discriminate between an example of the higher order concept "sacrificed and knocked in a run" and "nictitate" and non-examples of those domain-specific concepts. The reader doesn't even have to be able to articulate an accurate definition of those domain-specific concepts to perform a successful discrimination. (And, to make sure we're clear, I'm using the term concept to mean something that has a defining feature.)
There was no need for Stephen's qualification. The form of the answer is irrelevant to the need for the reader to possess domain-specific content knowledge when the reader is asked to discriminate between examples and non-examples (or to identify an example) of a particular piece of domain-specific knowledge. That is the underlying salient point of the argument I presented to Stephen.
I made this same point (though less elegantly) in my response to Stephen.
That these answers can be provided in the form of declarative knowledge does not detract from the premise that the [reader] must understand the underlying knowledge to activate the declarative knowledge needed to answer the question.
Stephen responded:
But what happens is not that the person had the fact, but rather, that the person created the fact out of the complex mass of previous observations and experiences, and presented it to you.
I don't disagree, although it is possible that the answerer also knew the definition or fact associated with the concept needed to perform the discrimination/identification which underlies the answer. And as far as that "complex mass of previous observations and experiences" goes, that's domain or content knowledge. The reader, to use Stephen's words, must be able to use that "complex mass of previous observations and experiences" to perform the underlying discrimination/identification needed to answer the questions presented. Of course, you might not have caught that since it was buried in a comment trying to refute my argument. Compounding the confusion is that these accurate statements are tangled up with inaccurate ones, like this one:
I've said on numerous oc[c]asions, simply responding to a demand for some fact does not constitute reasoning. These questions may resemble high school tests - but they do not resemble inference or reasoning.
As I explained above, responding to a request for a fact (such as in a high school test) often does require reasoning. Specifically, it requires simple deductive reasoning, something all preschoolers possess. For example, from his "complex mass of previous observations and experiences" the reader knows a general idea (a concept, rule/proposition, or routine); he uses the general idea to examine an unknown potential example using the information contained within the general idea; "decides" whether the new example fits within the boundaries defined by the "complex mass of previous observations and experiences"; and "treats" the example accordingly by indicating an answer.
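To make that discrimination step concrete, here is a minimal sketch in Python. The concept check, the play encoding, and the answer choices are hypothetical illustrations of the process just described, not anything taken from the exchange itself:

```python
# A minimal sketch of the discrimination process described above: a general
# idea is stored as a boundary test, and answering reduces to applying that
# test to an unknown candidate. All names here are hypothetical.

def fits_sacrifice(play: dict) -> bool:
    """General idea (simplified): in a 'sacrifice,' the batter is put out
    and a runner advances or scores."""
    return play["batter_out"] and play["runner_advanced"]

# An unknown potential example, drawn from the reader's "complex mass of
# previous observations and experiences."
candidate = {"batter_out": True, "runner_advanced": True}

# "Decide" whether the candidate fits within the boundaries of the general
# idea, then "treat" it accordingly by indicating an answer.
answer = "c" if fits_sacrifice(candidate) else "d"
print(answer)  # a bubbled-in (c), produced by a simple deduction
```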
That seems to resemble reasoning to me, even when the end-product of the process results in a bubbled-in (c). But, apparently, not to Stephen, so I provided a clearer example that required both content knowledge and reasoning. Stephen responded by chiding me for repeating myself and then got all cryptic on me.
When you represent 'knowledge' as 'answering questions' you get a trivial, stunted version of knowledge.
He leaves this unexplained. It might be profound. Or simplistic. But, certainly it demands an explanation.
So, like Tracy, I'm interested in getting a straight answer to Tracy's question, or better yet, a more precise question:
When is domain-specific content knowledge (i.e., a "complex mass of previous observations and experiences") not needed for evaluating a proposition made in that domain, once the proposition has been filtered for trustworthiness (i.e., it is derived from evidence, not opinion; the author/speaker is qualified, trustworthy, and unbiased; and it is free of logical fallacies and faulty arguments)?
Then perhaps we can get on to the more interesting questions related to learning and teaching.
(Commenters feel free to further clarify my question for Stephen.)
32 comments:
Allow me to provide one answer to my own question: when you argue by analogy.
Let's say I know very little about dinosaurs and am asked to sort twenty pictures of dinosaurs into groups based on each dinosaur's diet. I do know quite a bit about diet as it relates to other animals. I can use that knowledge to perform the sort. The sort will likely not be as accurate, and there is a good argument that I have used domain knowledge of "diet" which is present in both the domain of "dinosaurs" and that of "all other animals," but there we go.
I just noticed that Stephen said something similar in one of his comments I linked to:
And - what's more - the person will (as Caulfield suggested, appropriately) use whatever tool they can in order to create this fact. This includes knowledge inside and outside the domain, and especially, critical reasoning, language learning, analogies from other domains, related knowledge from other domains, unrelated stuff from a movie he saw yesterday, and the rest.
That's good. But, of course, it immediately suggests the question "how good is the tool that is created?" That question depends on expertise, which moves us incrementally toward the real issue, which I will reveal shortly.
I've only skimmed DeRosa's lengthy diatribe against me, but I would like to focus on the question:
"Is there any evidence that could convince you that content knowledge might be necessary for reading and critical thinking?
Part of the problem here is, this question is a bit too vague. I would like to see it clarified on two points:
- first, do we mean some content knowledge in general, or do we mean some specific content knowledge that would constitute a 'core'? If the former, then move directly to the next question. If the latter, then what I would require minimally would be (a) a statement of what that knowledge is, and (b) a description of how exactly reading and critical thinking fails without it.
- second, what precisely do you mean by 'content knowledge'? I ask this because there are different ways of characterizing 'content knowledge' and I'm less unhappy with some than with others. Specifically, is content knowledge:
(a) propositional in form - and specifically, expressible as a sentence or set of sentences? (In which case, I would need, as an answer to the first question, the specific sentences that a person must 'know', and not just a hand-wave toward them)
(b) experiential, or the result of some experience, or some neural state, or some set of appropriate dispositions to respond, or skill or habit of mind, etc?
DeRosa gets very angry when I refer to his equating of knowledge with 'facts', and then turns around and treats all knowledge as facts, while in the same breath saying it's something else. So, even if you don't require some _specific_ content knowledge, I want to know: what, to you, counts as knowledge - and is any of that _necessary_ (as in, we can't do without it) to the learning of language or critical thinking?
Over to you.
It's not a diatribe, nor am I angry. And, quite frankly, I would prefer it not be about you, but no one else with views similar to yours is stepping up to bat with any arguments, let alone good ones, so I have to rely on you as a spokesman. Anyway, thanks for obliging. Now I can read your comment.
I agree that the question is vague, that's why I tried to clarify it at the end of the post.
Yes, let's stick to content knowledge in general. (We don't need to reach the common core issue.)
Let's start with your "definition" of content knowledge - a "complex mass of previous observations and experiences" relevant to a particular domain. Certainly, the knowledge includes experiential knowledge. And, no, the knowledge does not have to be propositional; however, the learner should be able to communicate possession of the knowledge in whatever form you think is appropriate, so long as it conveys that the learner is capable of discriminating examples of the knowledge from non-examples.
The learner's performance should be better than a novice's, with a level of expertise commensurate with the amount of time the learner has devoted to learning that content knowledge.
Let's start there and see where it leads us.
Even a sentence as simple as "The cat sat in his lap and purred" makes no sense if you have no idea what a cat is. It's not domain specific; it's just that you need to know something about the topic to know what is going on. If for whatever reason you don't know what a cat is, this sentence may as well be about tiny space aliens!
In response to Stephen's questions:
"- first, do we mean some content knowledge in general, or do we mean some specific content knowledge that would constitute a 'core'? "
I mean content knowledge in general.
- second, what precisely do you mean by 'content knowledge'?
By "content knowledge" I mean (a) and part of your (b). The part of your (b) I want is the one you labelled "experiental, or the result of some experience, or some neural state"
By content knowledge I don't mean "or some set of appropriate dispositions to respond, or skill or habit of mind". Of course content knowledge can include knowledge about your own, or someone else's, or some animal's, dispositions to respond, skills or habits of mind. But it is conceivable that we can have dispositions to respond, or skills, or habits of mind, without also knowing that such dispositions or skills or habits exist. (Is a snake self-aware? A mouse? Wasn't Freud implying a lot about us humans' unconscious minds?)
and is any of that _necessary_ (as in, we can't do without it) to the learning of language...
How could we learn language without content knowledge? For example, how could a small child ask their mother for a biscuit without some awareness of what a biscuit is? This is a serious question and I would like to hear your answer.
and is any of that _necessary_ (as in, we can't do without it)... or critical thinking?
I'm not going to try to make an argument on this until I have an idea of what sort of evidence you might find convincing. So, to repeat my question:
"Is there any evidence that could convince you that content knowledge might be necessary for reading and critical thinking?"
And thanks, KenDeRosa, for the compliment, but I don't think anyone has ever asked me what sort of evidence might convince me, so I don't really know how I would answer.
And based on the number of times I have asked various people an equivalent question about "what evidence would convince you", relative to the number of times I have gotten a straight answer, I must distinguish between the meanings of the words "expect" and "hope". I hope to get a straight answer, but I don't expect to get one.
I'm a bit unclear about what Downes means by "experiential" knowledge that isn't expressible through propositions.
Does he mean things like knowing how to drive or knowing how to draw? Is he saying that knowing how to reason is like these things?
It's a fool's errand trying to define Downes' terms for him, so take a look at the Wikipedia entry for a good description.
The more I think about this, the harder it is to get inside the "no content knowledge needed" argument. Every good question I can think of needs some domain specific content knowledge.
I'm a mathematician, and I would say that math problems are somewhat less in need of lots of detailed content knowledge--but it only works in theory. You can, if you know the definitions (content knowledge, but basic), work your way to all sorts of things without a lot of background ... of course, that method works best if you happen to be Gauss or Newton. And, of course, you need to specify unlimited time to work on the problem. If it takes my students who don't know their multiplication facts 15 minutes to solve the same problem their classmates with better content knowledge can solve in 3 minutes, does that count as success? If it takes me 2 years of working and developing ideas to solve something you can solve in 30 minutes, does it qualify as a success?
I suppose analyzing literature, if you can understand the text, appears to not need content knowledge, though if you compare the analysis of someone with content knowledge to the analysis of someone without it, it'll be pretty obvious who is who.
The other thing I do is math education, and if you attack a problem like: write a division-of-fractions word problem, or, what are the best sorts of word problems to ask of kindergarteners, it's a rare thing for people to come up with very good answers without studying the subject. Certainly Liping Ma makes a good argument that teachers who have not been taught how to approach fraction division fail miserably at the task of writing a word problem, even though they certainly know what fractions are and know what division is. Related background knowledge and experience just aren't enough for most people. My experience with college students, and, indeed, myself, leads me to believe that almost all adults do poorly with the kindergarten teaching question too (they assume that the things that make a problem easy or hard for someone using adult strategies to solve a word problem will be the same things that make a problem easy or hard for someone using kindergarten strategies).
So, maybe I just need to go back and read previous posts, but: are there examples of worthwhile problems that someone would want to solve that don't require content knowledge? (Ideally ones where it has been verified that someone without content knowledge can answer them, not just ones that look like anyone could answer them.)
OK, so we've dispensed with the idea that there is some particular 'core' set of knowledge which is required in order to learn reading, critical thinking, etc., which was the bulk of my argument.
I trust you'll take this point over to the "common core" people.
The question of what counts as content knowledge is trickier. Tracy says it is, "(a) and part of your (b)," which is to say, propositional in form and (in part) experiential.
Helpfully, she clarifies, "I don't mean 'or some set of appropriate dispositions to respond, or skill or habit of mind'."
And we get, again usefully, a concrete example: "How could we learn language without content knowledge? For example, how could a small child ask their mother for a biscuit without some awareness of what a biscuit is? This is a serious question and I would like to hear your answer."
Excellent response.
The requirement of "some awareness" of what a biscuit is, if it is 'content knowledge', is propositional and at least partially experiential in nature. It is, in other words, knowledge that can be expressed in a sentence, where the content of that sentence is based in part on experience (presumably, with biscuits).
I say, "can be expressed in a sentence." A strong version of this sort of knowledge is that it actually *is* a sentence in the brain. A weaker version allows that it might be stored in some alternative fashion, but that this fashion includes a one-to-one correspondence with the sentence. So (for example) it make be made up of constituent meanings of words, grammatical principles, etc., that are assembled in order to form the sentence.
I don't think there's anything to object to in this characterization, but I'm laying it out clearly so you can see what it is I believe I'm responding to.
Now, the question becomes, "could a small child ask their mother for a biscuit" without this sort of knowledge?
The quick answer is, of course she could. She could not ask _in language_ without knowing language -- in other words, trivially, a knowledge of language requires a knowledge of language -- but she could make her desire known without language, and therefore, her knowledge of what a biscuit is, whatever that may happen to be, is not propositional in form.
I know that it really sounds like I'm splitting hairs, but I'm not. We have two very clear alternatives here:
1. knowledge that is propositional (i.e., "(a) and partly (b)"), vs.
2. knowledge that is "some set of appropriate dispositions to respond, or skill or habit of mind" - which, if I cash this out, ends up being non-propositional neural structures that are blended in with each other
When you - and Ken, I assume - say that 'content knowledge' is required in order to read or think critically, you mean, "the person must previously possess at least one instance of 1 in order to read or think critically."
(part 2)
Now - for me this seems a bit tricky, because it means you're saying, "the child must previously possess at least one instance of language in order to learn language."
But of course, it isn't that tricky - you just have them memorize it, by rote, so that they have the necessary building blocks (alternatively, you could say, following Chomsky and Fodor and the like, that these building blocks are innate, built in).
I argue, though, that these building blocks are in fact "some set of appropriate dispositions to respond, or skill or habit of mind" - that they perform the equivalent function of these memorized facts, but are not in fact memorized facts, in that they are not propositional in form, and are, in fact, complex neural dispositions to respond. Habits of mind. Webs of connections between neurons that do _not_ have a one-to-one correspondence with propositional expressions.
So I say: a child can ask for a muffin without having a single fact (of type 1) about muffins. And the evidence I offer is:
a. this could be a habitual behaviour, something copied from adult behaviour
b. 'reaching out', which is _interpreted_ as 'asking' could be a habitual or even innate behaviour
c. The child can express a desire for a muffin long before acquiring any linguistic knowledge whatsoever
d. animals, such as dogs, which _cannot_ have propositional knowledge of type 1, can nonetheless ask for a muffin
In other words, what I am saying, is that all the rest of this comes first, and propositional knowledge - knowledge of form 1 - only comes *after* skills like language and critical thinking are obtained.
They learn, in other words, the mechanics of how to come up with and express facts, and only then do they learn the facts themselves.
So, now, we turn to the question, "is there any evidence that could convince you that content knowledge is necessary for thinking."
So what would be required?
I would need to be shown that a person must know some proposition (of form 1), but not some particular proposition (of form 1), prior to learning language.
Let me express this formally.
For some language L, which contains constructions (sentences, functions, theorems, whatever) f, which in turn contain variables x, y, z and may be instantiated with constants a, b, c: the claim is that knowledge of L requires knowledge of f(x), and that knowledge of f(x) requires knowledge of some instantiation f(a).
In other words, I need to know, that there is some constant instantiation in a language f(a) that must be known _prior_ to knowing f(x) and, by extension, L.
So what evidence would I need?
1. an example of f(a)
2. proof that f(x) cannot be understood without prior understanding of f(a)
3. proof that L cannot be understood without prior understanding of f(x)
And - this is important - 'proof' is more than just the assertion that (say) f(x) cannot be understood without prior understanding of f(a).
This is what DeRosa does all the time. He just says it, but he never proves it. We're supposed to take it as self-evident. But it is not self-evident.
What would constitute a proof?
1. A set of statements, f(b) ... f(c), that can be supported through direct empirical evidence - we can see that they are true, they are experimentally shown to be true (where 'experimentally' means normal rigorous experiments, not some prof studying a dozen grad students).
2. An inference rule g (e.g., a deductive inference, an inductive inference, an argument to the best explanation (aka abduction), a mathematical inference, or some other form of inference), such that
3. g(f(b)...f(c)) entails (L -> f(a)) (in other words, 1 and 2 entail some fact f(a) such that some language L cannot be learned without that fact f(a)).
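Collected in one place, the demanded schema looks like this (the shorthand K(.) for "is known/understood" is my addition; Stephen leaves it implicit):

```latex
% Schema for the demanded proof; K(.) abbreviates "is known/understood".
\begin{align*}
\text{To show:}\quad  & \exists a \, \big[\, K(L) \rightarrow K(f(a)) \,\big] \\
\text{Given:}\quad    & f(b), \ldots, f(c) \ \text{(each empirically supported), and an inference rule } g \\
\text{Required:}\quad & g\big(f(b), \ldots, f(c)\big) \ \vdash \ \big( K(L) \rightarrow K(f(a)) \big)
\end{align*}
```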
Or, to express the requirement as a whole less formally:
evidence which would prove that there is some fact without which a knowledge of some language is impossible.
I think that's clear enough.
Looking at DeRosa's original post in more detail, we see that once again he uses the 'question-answering' behaviour in order to prove his point.
"The reader can now demonstrate his knowledge without knowing any facts. However, in both cases (the fact-based answer and the picture/diagram-based answer) the reader must still be able to successfully discriminate between an example of the higher order concept "sacrificed and knocked in a run" and "nictitate" and non-examples of those domain-specific concepts."
Really?
Against this, I bring forward John Searle's well-known 'Chinese room argument', which shows generally that evidence of the question-answering form does not distinguish between knowledge and non-comprehending mechanical response. http://en.wikipedia.org/wiki/Chinese_room
To make a long story short, I can learn to select an appropriate answer from the shape of the word 'nictitate' or 'third base' or whatever, without having any domain knowledge at all. Question-answering does not demonstrate domain knowledge ever, at all, and this is well known by people who are actually professionals in the field.
Perhaps DeRosa can come up with an answer to Searle's Chinese Room example - he should probably review the many hundreds of attempts first. His response, interestingly, if successful, would be of the form g(f(b)...f(c)) entails (L -> f(a)) -- because that's what it would take.
This is why I get frustrated with DeRosa. He presents examples of type P as conclusive evidence for his position, with no apparent knowledge that answers of type P have been well-discussed in the field, and are not conclusive evidence at all.
Gawd. Where is Occam's razor, when it's really needed? Bring back the "Why Do I Bother" post, pleeeeze, Ken.
If anyone thinks that understanding communication conveyed in either spoken or written medium does not require background knowledge, let them operate at their own peril. They'll be mis-instructing kids, but what else is new?
The merits and liabilities of a "common core" of background knowledge might be worth batting around. But the participants here have put that off limits.
Stephen has answered Tracy's question with a very long-winded noooooooooooo... There isn't anything left to say, but that doesn't stop the flow of words.
Let me quickly respond to the Chinese room analogy.
It's not applicable for at least two important reasons.
The "computer" in the Chinese Room problem is receive symbols as inputs. A symbol is not knolwedge (nor is it a fact) because it contains no connection. "Canada" is a symbol. "Canada is a country" is both a fact and knowledge because it connects "Canada" to "country." So, in the Chinese Room, the computer did not learn any content knowledge. It has merely succeeded in emulating a human based on the programming by other humans.
The other reason it is inapplicable is that we are assuming a human brain and mind is receiving content knowledge, doing whatever it is that a human brain and mind do with that knowledge to effect learning and make meaning, and then responding appropriately. The computer in the Chinese room doesn't understand the symbols; our human does, because he has processed the connection. That's the critical difference.
Then Stephen gives us a poor example:
I can learn to select an appropriate answer from the shape of the word 'nictitate' or 'third base' or whatever, without having any domain knowledge at all.
1. Stephen is a human using his brain/mind to do the thinking. The human mind works on both syntax and semantics. The Chinese Room computer only acts upon syntax by definition.
2. Stephen is discriminating the shape of a word, which implies pattern recognition, which implies a connection, which implies domain knowledge (semantics and meaning). Otherwise, a selection couldn't have been made by a human using only his brain/mind.
3. If instead Stephen were in fact a sophisticated computer doing pattern recognition to perform discriminations, then we don't fall within the Chinese Room boundaries, because that computer can only rely upon syntax.
Back to you Stephen.
Dick, you need the patience of Job to tolerate Downes.
We are progressing down a Downesian tangent, but I have faith we'll return to a more productive place.
Question-answering does not demonstrate domain knowledge ever, at all, and this is well known by people who are actually professionals in the field.
This does not follow from the Chinese Room problem when the answerer is a human who has processed the question using his brain/mind. If you are relying upon something else for this proposition, you need to cite and explain it.
> This does not follow from the Chinese Room problem
Actually, it does. You have to read more than just the Wikipedia entry, you have to read the article and the discussion around it.
But I've gotten used to the 'does-does not' style of argumentation here.
Or responses like Dick's, which are the equivalent of "His answer is too long and complicated, it must be wrong."
There's no real desire here to actually understand any of this stuff. Just to advocate some point of view without any real understanding of it.
If your question read "may not" I might agree, but it doesn't.
Oh... there's a longer response. Goodie.
> A symbol is not knowledge (nor is it a fact) because it contains no connection....
What you mean by 'connection' is derived from a source that is in turn describing Mill's methods. As the Wikipedia article shows clearly enough (though you can read it in A System of Logic, as I did), these 'connections' are in fact linguistic constructions. Hence, the Chinese room input consists of both symbols and connections, as you've defined them.
> The computer in the Chinese room doesn't understand the symbols; our human does, because he has processed the connection. That's the critical difference.
No, this is a mis-statement of the Chinese room example.
The point is that the people inside the room are shown symbols in a language they don't speak or understand (that's why it's the Chinese room example, and not just the 'room example'). So the people in the room cannot employ any domain knowledge, or any knowledge at all over and above the shape of the symbols, in order to produce their output.
these 'connections' are in fact linguistic constructions.
No, not necessarily. For example, for many basic sensory concepts and procedures the connection need not be linguistic.
You are confusing "connections" with how those connections might be learned for certain forms of knowledge.
So, as I've defined it, providing a symbol (such as a pictograph) isn't knowledge. It becomes knowledge when I also provide a photograph of what that pictograph represents. For example, I provide a pictograph of the word "red" and also a picture of the color red. That's knowledge, and that's exactly what wasn't provided in the Chinese Room experiment. Only the symbol (the pictograph) was provided to the computer or human.
So the people in the room cannot employ any domain knowledge, or any knowledge at all over and above the shape of the symbols,
Exactly. They were provided with no knowledge.
From the article:
(A1) "Programs are formal (syntactic)."
A program uses syntax to manipulate symbols and pays no attention to the semantics of the symbols. It knows where to put the symbols and how to move them around, but it doesn't know what they stand for or what they mean. For the program, the symbols are just physical objects like any others.
(A2) "Minds have mental contents (semantics)."
Unlike the symbols used by a program, our thoughts have meaning: they represent things and we know what it is they represent.
I found better explanations of the Chinese Room problem.
The problem assumes that the person in the Chinese room only attends to the syntax of the symbolic input (even if the input contains semantic information). In fact, the person has no way to access the semantics of the input since it is in a language the person does not know.
In contrast, the example I gave assumes that the person has attended to the meaning of the input -- because that's what humans do automatically. See the Stroop Effect.
And, just like that we are outside the assumptions of the Chinese Room problem.
Then Stephen states: Question-answering does not demonstrate domain knowledge ever, at all.
In the sense that the answerer may have guessed his way to a correct answer and/or has a defective brain which is incapable of processing semantic input, Stephen has a point -- a very minor point. In the real world, however, we are willing to live with such minor uncertainty. That's why we use a confidence level of 95%, not 100% for experimental results.
Was this really a tangent we needed to explore?
The requirement of "some awareness" of what a biscuit is, if it is 'content knowledge', is propositional and at least partially experiential in nature.
But it isn't. Why should content knowledge of what a biscuit is be propositional? We often use words we can't define precisely. For example, when does a biscuit become a cracker? What's love? At what temperature does a room stop being cold and start being lukewarm?
A weaker version allows that it might be stored in some alternative fashion, but that this fashion includes a one-to-one correspondence with the sentence.
Why should we believe that anything is stored in the brain in a one-to-one correspondence with the sentence? Why can't things be stored in the brain in other ways? You provide no evidence for this assertion.
I don't think there's anything to object to in this characterization
Well, there are at least two things to object to in it. You are assuming facts not in evidence: you are assuming that content knowledge must be propositional, and you are assuming that things must be stored in the brain in a certain way.
Now, the question becomes, "could a small child ask their mother for a biscuit" without this sort of knowledge?
Nope. The question is "could a small child ask their mother for a biscuit without any awareness of biscuits". Not merely without any propositional knowledge, but without any content knowledge of biscuits at all.
The rest of your answer is just irrelevant as it misses the point of my question. Please feel free to rephrase my question as to how a child could ask their mother for a biscuit without knowing what a biscuit is, using "knowing" in a sense broader than that of propositional knowledge.
So, now, we turn to the question, "is there any evidence that could convince you that content knowledge is necessary for thinking."
...
I would need to be shown that a person must know some proposition (of form 1), but not some particular proposition (of form 1), prior to learning language.
Hold on, Stephen. Previously you asked me to clarify my question, and gave me two kinds of knowledge to choose between:
My response was, to quote myself:
'By "content knowledge" I mean (a) and part of your (b). The part of your (b) I want is the one you labelled "experiental, or the result of some experience, or some neural state"'
My question was not confined to propositonal knowledge, but included "experiental, or the result of some experience, or some neural state".
Your answer here ignores the clarification that you yourself asked for, which is easily available higher up in the comment thread.
I will ask my question again:
Is there any evidence that could convince you that content knowledge (not simply propositional knowledge, but also experiential knowledge, or the result of some experience, or some neural state) is necessary for thinking?
The problem with arguing with you is that you apparently do not understand when you express blatant contradictions.
I'm serious. Look at this:
I write, "these 'connections' are in fact linguistic constructions." And you respond, "No, not necessarily."
Then just a few lines later you write, "So, as I've defined it, providing a symbol (such as a pictograph) isn't knowledge. It becomes knowledge when I also provide a photograph of what that pictograph represents."
First of all, I think it's very nice that you've replicated Wittgenstein's 'Picture Theory' as described in the Tractatus. For what's wrong with that theory I'll refer you to Wittgenstein's later writings.
But more significantly, not 10 lines after saying that these connections are not linguistic constructions, you present a story where they consist of a matching of pictures with (GASP!) linguistic constructions.
I think, Ken, that you genuinely do not understand your own theory. Probably because it isn't your theory, it's just something you copied from somewhere.
> The problem assumes that the person in the Chinese room only attends to the syntax of the symbolic input (even if the input contains semantic information). In fact, the person has no way to access the semantics of the input since it is in a language the person does not know.
I love the way you feel you've solved a major problem in philosophy with a one-paragraph example.
The problem with the Chinese Room example isn't with the input. The problem is with the output.
Specifically, the problem is that there is no difference in the output of a person who has symbolic input only and one who has symbolic input plus what you call (in this post, at least; you have half a dozen names for it) "the semantics of the input".
Syntax only -> output P
Syntax + semantics -> output P
When you say "The problem is they don't have the semantics in the input" you assume that, consequently, the output will be different. But the problem is, it isn't. You get the same output in either case.
Ken, I don't have the time or the inclination to provide you with a complete epistemology course here. And I will say it's tiresome to be told "you're wrong" over and over by someone who does not understand the subject matter.
You should rethink your attitude regarding this material. I'm glad you're interested in it. You should take the time to learn it.
Stephen, that "story" was not an example of a non-linguistic connection. It involved a linguistic connection because it involves language as one node.
When I write that connections are not necessarily a linguistic connection, I mean in some instances there is a linguistic connection and inother cases there is a non-linguistic connection.
I'm beginning to think I am really arguing with a computer which is slowly failing its Turing test.
Well, Tracy, this is an interesting tactic - I quote your remarks to you, and then you respond with objections to them.
Like this:
I wrote, quoting you, "The requirement of "some awareness" of what a biscuit is, if it is 'content knowledge', is propositional and at least partially experiential in nature." This is exactly what you said when you said "(a) and partly (b)."
You respond, "But it isn't. Why should content knowledge of what a biscuit be propositional? We often use words we can't define precisely. For example, when does a biscuit become a cracker? What's love? At what temperature does a room stop being cold and start being lukewarm?"
Well, that's right. That's exactly my own objection to the propositional view. The problem with theories that require that knowledge be like a 'language in the brain' is that languages presuppose much sharper definitions than we actually have.
There is a whole range of theories of communication, for example, that suggest that we "agree" on the meanings of terms, and that this is what makes learning possible. But agreement requires some precision of definition, and this is precisely what we do not have.
Again, you continue the sort of line of reasoning I would offer:
"Why should we believe that anything is stored in the brain in a one-to-one correspondence with the sentence? Why can't things be stored in the brain in other ways? You provide no evidence for this assertion."
Quite right. There is no good evidence for this assertion. However, if knowledge (or any part of knowledge, Ken) reduces to propositions (as it does with the picture theory) then a one-to-one correspondence is entailed. Since we reject one-to-one correspondence, we must conclude that knowledge does not reduce to propositions.
No part of knowledge is linguistic in nature.
I don't think there's anything to object to in this characterization
"Well there's at least two things to object in it. You are assuming faacts not in evidence. You are assuming that content knowledge must be propositional, and you are assuming that things must be stored in the brain in a certain way."
No. YOU were. Go back and read your response to my question. I asked very carefully and clearly whether you thought knowledge was propositional. You said it was, and I quote, "By 'content knowledge' I mean (a) and part of your (b). The part of your (b) I want is the one you labelled 'experiential,'" where (a) was, and again I quote, "(a) propositional in form - and specifically, expressible as a sentence or set of sentences?"
Now if you're happy not to have knowledge in propositional form, that's great.
But you know and I know that when people (including Ken) say "people need knowledge to learn language or critical thinking" they mean something like 'facts, expressed in language'. They don't mean the fuzzy non-propositional experiences and habits that we are talking about here - because these experiences and habits of mind just are the basic non-contentful elements of critical thinking and language.
Just a final note.
You have both answered me by offering contradictions.
I offered detailed responses to your questions in good faith and you respond with waffling, prevarications and contradictions.
I have no wish to continue a dialogue under such conditions. You appear to believe 'proving me wrong' to be more important than clear reasoning.
This isn't some game of "advocacy", as you seem to believe. It doesn't matter whether you agree with me. I don't care, and the world doesn't care.
You asked a question, I answered it, you don't like my answer and lie and contradict yourselves in an effort to show I was wrong, well, fine.
Enjoy yourselves. I'm done with this thread.
With regard to the Chinese Room problem, I have not solved it, nor do I have any desire to. I have merely sidestepped it by constraining the discussion to humans who have attended to the semantics of the input.
Also, I understand that the output of the computer might be indistinguishable from that of a human under my behaviorist test. My test also doesn't guard against the testee guessing the right answer (however improbable that may be). But, in the real world, we know the test subject will be a human who has attended to the semantics of the input, and we have ways to account for guessing.
It's analogous to continuing to use Newtonian physics to deal with the great bulk of physics issues and only switching to Einsteinian physics when the conditions warrant.
I wouldn't give my test to determine if a computer is sentient.