"Disputes about words belong rather to grammarians than philosophers, but philosophers ought not to escape censure when they corrupt a language, by using words in a way which the purity of language will not admit" (Thomas Reid, EIP, I, 11).
(I hear Eliot in this: "to purify the language of the tribe.")
"The hatred that men bear to privilege increases in proportion as privileges become fewer and less considerable, so that democratic passions would seem to burn most fiercely just when they have least fuel. I have already given the reason for this phenomenon. When all conditions are unequal, no inequality is so great as to offend the eye, whereas the slightest dissimilarity is odious in the midst of general uniformity; the more complete this uniformity is, the more insupportable the sight of such a difference becomes. Hence it is natural that the love of equality should constantly increase together with equality itself, and that it should grow by what it feeds on." (https://en.wikipedia.org/wiki/Tocquevill)
Is the rising up of those who feel alienated by the social justice movement also fueled by hatred of privilege? Yes, I think it is. The Tocqueville effect is playing out in two different groups of people at the same time. The question is: do we have any power to stop either of them? Or will they cause society to ultimately fall apart?
"To expect sense from two mentalities of such extreme viewpoints would not be logical" --Spock, Let That Be Your Last Battlefield.
"This thing you call language: you depend on it for so very much, but is anyone ever really its master?" --Spock as Kollos the Medusan
Whether a piece of philosophical writing convinces has very little to do with the style of the writing itself, or with the 'goodness' or clarity of the thinking that underlies it. It has to do with whether the audience shares the author's presuppositions.
When a conclusion is correct, there is always more than one valid argument that can be made for it.
Or maybe it's just me....
For a long time I've had the feeling that there's a bias in contemporary philosophy against arguing for 'big' positions--the kinds of positions people hold (or discount) in a fundamental way. The thought is: if you give an argument for one of these positions, the only people you convince will be those already convinced.
What remains to philosophy is--must be--to solve problems only within these fundamental framework views. But that makes it difficult to ever interact with those who reject them, UNLESS your M.O. when interacting with them is always to accept others' presuppositions and try to work things out within their views.
That's all well and good, I suppose. But it seems to be ignoring a herd of elephants in the room. (That, and I'm absolutely terrible at accepting presuppositions I object to.)
1. The Identity of Indiscernibles: If two things have all their properties in common, they are identical.
2. The Indiscernibility of Identicals: If two things are identical, they have all their properties in common.
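Stated in second-order notation (a standard formalization, not part of the original statements), the two principles are:

```latex
% (1) Identity of Indiscernibles: sharing all properties implies identity
\forall x\,\forall y\,\bigl[\,\forall F\,(Fx \leftrightarrow Fy) \;\rightarrow\; x = y\,\bigr]

% (2) Indiscernibility of Identicals (Leibniz's Law): identity implies
%     sharing all properties
\forall x\,\forall y\,\bigl[\, x = y \;\rightarrow\; \forall F\,(Fx \leftrightarrow Fy)\,\bigr]
```

The quantifier over F ranges over properties; nothing in either formula, note, mentions a discerner.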
THESIS: there is no application of these principles that is not implicitly epistemic, and self-refuting.
I used to think that both of these principles were acceptable. Or that perhaps (1) was acceptable, but not (2). But upon further thought, both are problematic if "have all their properties in common" is read as "are indiscernible." Having all properties in common with something is not equivalent to being indiscernible from it. Indiscernibility is an epistemic notion. As an epistemic notion, it is incomplete until the perspective of the discerner has been specified. (It makes a difference whether we're talking about things being indiscernible to God, or to me; and it might make a difference what else I know about the entity in question.) Thus Descartes's use of (1), which turns not on shared properties but on indiscernibility from his perspective, is not legitimate.
For uses that do not turn on indiscernibility from a perspective, (1) seems fine. But it also seems like a principle that will never be used, as no two things have all their properties in common--unless what we really mean to say is that what seem, from some perspective, to be two things, have all their properties in common. But if that's the case, they only seem to have all their properties in common: there are sufficient differences for them to seem to be two things in the first place. There won't be any such "two things" (hence the scare quotes).
Non-epistemically motivated uses of (2) are equally impossible to apply. We're never going to encounter TWO THINGS that have all their properties in common. We might encounter what seem to be two things but are actually one, which have all their properties in common except seeming to be two things. This is an epistemic interpretation of (2), and it refutes itself. In seeming to be two things, the "two things" do not have all their properties in common. They have sufficient difference to seem to be two.
The purely metaphysical version of (2) might also be false in quantum physics: if, in quantum physics, the same thing can be in different places at once, (2) is false for whatever sort of thing has that ability.
To do the work philosophers want to put them to, both (1) and (2) need to include epistemic notions. But they can't. The purely metaphysical versions are not much use.
Despite the apparent democracy of its classes, St. John's is a strikingly non-transparent place. The curriculum is handed down from on high. The educational objectives are completely undetermined. No explanations of why things are important, or what you should be getting from them, are ever given. This sounds great, in theory. But in my case it led to paranoia: am I finding what I should be finding in these things? And to obsessive attempts at pattern-recognition, at discerning what the guiding intentions were behind the arrangement of the Program.
This contributes all the more to students' paranoia about their own performance. The grading system is--because unspoken--also vague and various. It's harder to know when you're doing well when you have no idea what that would mean.
The College thus takes on characteristics of a religion. It inspires simultaneously doubt about one's own worthiness and about whether what the College requires of one to be worthy--whatever it is--is right. Perhaps this is part of what has made me obsessed with trust issues for much of my adult life. This is also why the College reminds me of The Village from The Prisoner, despite being intended to be the paragon and training ground of democracy.
In past years, I've moved on from the things in my life I believed in that required me to trust them in ways I found difficult. I'm much happier, and don't miss them. Perhaps it's a worthwhile experience to have, but on the whole, if something requires trust to the degree that constant self-doubt is inspired, I would avoid it.
David Kaplan uses the following example in support of the idea that indexicals and demonstratives (including some uses of pronouns) refer directly, i.e. unmediated by Fregean senses:
Suppose you have a friend, Paul, who lives in Princeton. You're at a party, and Charles has shown up disguised as Paul. You say
(1) "He [Delta] now lives in Princeton."
Kaplan says, "I assume that in the possible circumstances described earlier, Paul and Charles having disguised themselves as each other, Delta would have demonstrated Charles. Therefore, under the Fregean theory, the proposition I just expressed, Pat [he named the proposition 'Pat'] would have been false under the circumstances of the switch" (Demonstratives, 516). He construes Fregeanism for some reason as tying the actual object presented, rather than the mode in which it is presented, with the proposition expressed. (Sure, senses are modes of presentation of referents, but senses compose thoughts, so I'm a bit confused.)
He claims that Direct Reference gets the right result because demonstratives are rigid designators; they designate the same thing in all possible worlds (keeping certain things constant), rather than varying with, say, which object is present. (I can't claim to understand this. If you hold the object constant, the person you refer to in all possible worlds would still be Charles, would it not?)
But this seems like the right result for the wrong reason. I agree that in the proposition above, the person you refer to is Paul rather than Charles. But why? Because the information you're drawing on in making the judgment is derived from Paul, and not Charles. It's irrelevant whether Paul or Charles is the person sitting next to you: you're saying something about Paul based on information drawn previously from Paul.
Let's tweak the case slightly. Suppose, at the same party, Charles-in-a-Paul-suit starts dancing. You say,
(2) "I never thought I'd see him dance!"
Which person are you referring to?
This seems harder, to me, because the information you're drawing on concerns the actions of the person in front of you. Are these Charles's actions, or Paul's? I don't pretend this is obvious, but I think it's much more tempting in this case to say you're referring to the person who's actually there (Charles), rather than the person he's disguised as. Then again, it might not be surprising if Charles is someone who dances often, and Paul never does. So perhaps here you're referring to Paul also, even if Charles is doing the dancing.
What if Charles disguised himself as Paul to carry out a murder? You catch him in the act, and call the police. When they arrive, you say,
(3) "What he did was horrible!"
Does 'he' refer to Charles, or Paul? Charles. Why? Charles did the thing, even if you thought he was someone else. These are not easy cases.
A very Evansian way to treat these cases would be the following: you are referring to the person from whom the bulk of the information you draw on in your thought/statement/judgment derives. You fail to meet your target in making the judgment insofar as there is a mismatch between the person from whom the bulk of your information derives and the person who's actually there. So in (1) you have tried and failed to refer to Paul by demonstrating Charles, but you have still said something true about Paul. In (2) you're drawing on information about the frequency of Paul's dancing, and so still trying to refer to Paul, albeit also failing. In (3) you have said something true about Charles without knowing it. You have referred to Charles, but as it were by luck.
That still does not explain how a more sophisticated Fregeanism might be useful in such cases. But Charles-under-a-Paul-mode-of-presentation could certainly go a long way toward making sense of (1) and (2).
I am a lady who is big into technical stuff. Philosophy of language, taking electronics apart and putting them back together; you want to talk about how to solve a problem? I'm all over it.
I am also surrounded by dudes all the time.
So why in the name of the seven mad gods who rule the world isn't most of my life like Kaylee at the ball on Persephone (Firefly)?
Oh, because the latter is a Joss Whedon fantasy. Carry on.
Sometimes I think much of what we value about people and places and living situations boils down to action potentials--not in the strict sense in which the term is used in neuropsychology, but in the sense of what is easy to do in different places, with different people, etc., and what is most enjoyable to do that way.
My apartment without a roommate in it has all the same things I enjoy using and playing with. But only without one can I really enjoy them, or use them at all. When one is on one's own, action potentials in a lot of directions are practically infinite.
Quite a few are also curtailed--those that could only blossom in someone else's presence: shared experiences of all kinds.
On the whole, I value shared experiences more. But I don't feel whole unless I have that total freedom frequently. It's the space for self-determination.
As I get older, I increasingly discover people my age or younger doing things it seems I could have done myself. The "I could've ___ that" phenomenon. It's strange for someone who has been culturally isolated for most of her life to find people who speak for or to her at all. It's unnerving. It makes me feel average. A product of my times, even though I'm not; a product of other times, maybe.
Perhaps it's just that increasingly possibilities are becoming realized, so some of those I might've been able to realize are also being realized by other people. And as I continue on the path I've found, I realize possibilities there. Yet culturally, it feels as though I ought to be realizing all those I'm capable of, putting the unique combination of influences that I embody to every use possible. That other people will end up doing it for me is at once comforting and saddening.
I was inspired to revisit this now, 13 years later, because what it says is still true, and still a problem for me, even outside of the cultish environment of the college in question.
Note: The following is an opinion-piece, written for the St. John’s student newspaper, on the place of “quiet students” in the discussion-based atmosphere of the school.
Mr. MacLean has raised some issues that have preoccupied me, on and off, for the past three years. Although I cannot present anything other than a personal case, perhaps it will offer some psychological insight.
Frankly and at the outset: I am a student with a notorious history of pained quietness. I can say only that other quiet students do not personally offend me, or seem to threaten the life of any of my classes. What is of concern, I believe, is not any threat to the community that might come directly from quiet students, but whether students like myself are missing--and are perhaps incapable of attaining--something crucial with regard to their own educations. This is the sense I have from my intense awareness of how my classes function and of how I am, in respect to their functioning that way, deficient. This is not a defense I feel the need to give because I believe myself objectionable; I am not objectionable, but neither am I wholly in accord with the operational standard. I give my case history below.
Granted that St. John’s belongs to students who think out loud, I think, mainly, on paper. The difficulty lies, specifically, in the moment of intuition (intuition in the Cartesian sense, Rules for the Direction of the Mind), which unfortunately for me does not retain its connection to the logical circumstances that produced it. As soon as I “see” something (a math proof, for example), I lose all account of it. A train of words does run through my head, but is generally detached from the real issue at hand, either for my mind or for the discussion. Bringing the words and the images together is a constant undertaking. Perhaps others have this problem, too, and are simply stronger intellectually and so overcome it more quickly. It is true that, with some subjects and situations, I experience this less. But, as it is, a link is missing that would enable me to say what I “see” and “see” what others say with the rapidity called for by most occasions.
There are other problems (emotional), which stem mostly from fear of being exposed in the above handicap: of speaking and producing not the moments of intuition but the words running through my mind, and saying nonsense.
It is true that conversations involve common ground between participants. I think I have grown capable of finding this along with my classes, but often don’t stay there myself. I simply can’t work out any ideas of my own at all, and put words to them, in time with the conversation. I am too busy picturing the situation someone else presents to respond immediately. In this way, I often fall back upon my own trains of thought--which frequently enough are just outside the bounds of the discussion. What interests me—what I would say—cannot be said in the context of the conversation. What I do say is what I believe to be lacking in the presentation on the table. Perhaps this reaction is grounded in a need to feel some integrity within myself in the midst of a disorienting situation. This is a shortcoming on my part. In assuming my personal ideas to be of more importance to me than the exercise of conversation, I commit what may be a fundamental sin against the standpoint of the College.
I’m not sure I foresaw this effect: I had hoped, as a prospective, that the environment alone would work a miraculous transformation in me (there is something rewarding about holding forth in a group of people that I had, in fact, always dreamed of). Perhaps my soul was not prepared. Perhaps it never will be. The fact is, others can and do spend less time thinking about the readings but naturally and, it seems, effortlessly send out the words they do. Something prevents their rising in me (though I could write them with a little more time). I see others participating in this way--pretty much the way I do in orals, paper conferences, and one-on-one situations in general--and cannot help feeling I am missing something, although I am fairly reconciled that I must hold myself to a different standard. Whether or not such a different standard exists coherently in the eyes of the College has caused me some uneasiness.
My tutors have accepted me, hitherto, because I am very alive intellectually and certainly benefit from classes, both in the development of my thought, and in my relationship to conversation. (The second type of benefit I noticed almost immediately: conversation with friends and family at home became, by Christmas freshman year, much easier. I could deal with strange ideas and viewpoints more rapidly.) I don’t need to mention, for all will know it in their own cases, how I have grown intellectually here. My own education, then, is going just fine--unless, of course, what matters to an individual student’s education here lies not in what he learns, what new ideas he takes away, but somehow lies in the act of conversation itself. On the one scale, students of my order are not even ideologically reprehensible; but on the other scale, there are heights of development we (assuming there are others) are incapable of reaching that nevertheless are central to any student’s task at the College.
Although, as stated, I don’t believe we are in danger as a community from quietness, the above question--whether the purpose of education lies in learning things and being involved with the ideas of the program and one’s classmates, or instead lies somehow in the intellectual exercise of conversation—is a serious one. To this higher standard we may, I fear, be accountable: it consists, I imagine, in a still higher readiness to accommodate strange thoughts, place them, and work on them rationally; a more rapid and trained “seeing” of what one hears. These are virtues for me because they are my deficiencies. Whether these virtues are the primary purpose of our work here determines how far students such as myself should be accepted. Any quieter means of reaching for such virtues, should such means exist, would be the criteria of our acceptance.
I have always hated discussion classes. At St. John's, I'd spend the day preparing for a seminar, not get to speak in it, and go home feeling miserable knowing that however much I loved the material and however much work I did, my grades would be terrible. Often I'd cry after classes for which I'd put in a lot of preparation. That didn't change in graduate school. Classes in which I cared about the material and spent the day preparing were especially bad. (It's easier to not care what you say or how the class goes when you don't care about the material.) And especially when the professor yelled. More weeping, followed by a migraine.
Discussion-based learning is only useful to an individual student if she gets to guide the discussion; and the difference between the places I wanted to go and the places the rest of the group did was always huge, so I reasoned that the majority ought to get their preference. (Perhaps I'm an innate utilitarian.) I still think like this. But this isn't to say discussion isn't valuable and something I need. Discussion with a single person at a time is.
Group discussion contexts often stifle my inner dialogue. Being surrounded by peers is not a good situation for opening up one's mind. There's too much going on to follow a line of thought. Thinking is something one does in private, in a pleasant place, when distractions are tuned out and one is relaxed. Feeling safe is of the utmost importance to this. Otherwise all kinds of defenses take over and make thinking difficult. Part of the problem, then, is that group environments are threatening (intrinsically, not just sometimes), and part of the problem is that they're too busy.
When I think about classes I've enjoyed being in, there are only a few. If groups, they were tiny (<5) and did not exhibit significant power structures; we were engaged in a task together, and it was pretty clear what to DO in the actual class. That is, one could prepare things one would like to talk about and actually talk about them. (That's happened twice the entire time I've been in school. And when I think about it now, the preparation I did for them was much like preparing to lead a discussion section these days, even though I wasn't the leader. Knowing how to prepare is important.) Another type was a lecture with a lot of individual feedback: the professor often asked students to write responses to questions and problems. These weren't graded; I don't know whether he ever looked at them. But I loved those classes so much I went to them when I wasn't even enrolled.
In the future, then, as a leader of classes, I might want:
1. to have students write their own responses to questions more often
2. to make it so that students know how to prepare for discussion classes
3. to break large groups into smaller groups, and give people roles within the groups, so they're not distracted by floundering in social chaos.
2. is hard. In my first seminar at St. John's one of the tutors asked me, in particular, to come up with possible seminar questions before class. He never asked my questions, and I never knew if I was supposed to ask them (I assumed I was just being evaluated based on them). It helped neither my performance nor my grade, and I felt awful for being singled out as in need of remedial something or other. So, whatever I do--NOT that.
Many professors do things like that, for instance having students post something either to them or to an online discussion board before class. But I've always hated the publicity of this and found it stifling. (When posted to a professor, they'd often tell me how much my questions stunk.) So perhaps just requiring them to write something and not evaluating it or showing it to others would be better. Another possibility is to have a discussion board but have all contributions be anonymous. If that could be done, it might have interesting results.
I dreamed last night that I was giving definitions (being a big nerd), so here goes:
Pace Wittgenstein, we can give a definition of what it is to be a game. To be a game, a set of actions must contain routine or ritual components which have an indeterminate outcome that depends either on chance, on skill, or both.
There is no need to specify the number of players, the fact that some games are played with balls, others with words, etc.
There may be concepts that can't be pinned down through definitions, but this isn't one of them. (It's certainly true that we don't LEARN most concepts in this way, but that doesn't mean definitions can't be given afterwards. Harder concepts to do this for are basic sensory concepts, like 'red,' and perhaps qualitative concepts generally.)
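For what it's worth, the proposed definition can be put schematically; the predicate names here are my own shorthand, not standard vocabulary:

```latex
% g is a game iff it contains routine/ritual components, has an
% indeterminate outcome, and that outcome depends on chance, skill, or both
\mathrm{Game}(g) \;\leftrightarrow\; \mathrm{Ritual}(g)
  \,\wedge\, \mathrm{IndeterminateOutcome}(g)
  \,\wedge\, \bigl(\mathrm{Chance}(g) \vee \mathrm{Skill}(g)\bigr)
```

Put this way, the claim against Wittgenstein is just that the right-hand side states jointly necessary and sufficient conditions, without any mention of players, balls, or words.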