Weiss' Bescheid (trite and true)
Opinions based on limited information
April 2, 2013
March 16, 2013
I remember when you would make five copies of a manuscript and mail them to a journal editor. Some weeks (or months) later, you would get a thick envelope with the reviews and the decision. There was simplicity in that. Invitations to review other authors' manuscripts also came in the mail. The editor sent them to you because he or she knew you personally or by reputation. Those were the good old days. Now we have submission portals, which shift your relationship -- be it as author or reviewer -- to the publishing house (Elsevier, Sage, Wiley, they are all the same) and away from the editor. Publishing houses are money-making bureaucracies, and hence they are greedy for data. Any data, it seems. Within the last two days I reviewed three manuscripts and submitted one of my own. It was a reminder of how bad things have become. To submit my manuscript, I had to traverse metaphorical hell. I understand that I need to upload a file with the manuscript and a file with a cover letter, but why enter three passages of text in response to questions about issues that are addressed in any half-decent cover letter? Come on! And then there is the entry of lots of personal and professional information and multiple declarations of agreement with this and consent to that, ad nauseam.
Presumably, all this effort is in the interest of efficiency. The effort is real and the time spent is measurable, but where, oh where, is the evidence for greater efficiency? I don't mean to argue that online portals per se are less efficient than paper mail; in principle, I believe they are more efficient. But there is a trap. Now that data collection is virtually effortless and costless for the publishing houses, there is no incentive whatsoever to exercise restraint. More data is always good, right? How can it hurt? They won't feel the hurt because it is borne by someone else.
Two hours later, Jungian synchronicity strikes: The journal to which I submitted the paper using their Dantean portal of hell sent me an email with a survey attached. The survey asks various questions regarding how authors manage their final report. Apparently, investigators have persuaded the journal to collect data on their behalf from submitting (submissive?) authors. I wonder if the results of this effort will be reported in this journal. Will the authors enjoy going through the portal? It's an unpleasant reality folding back onto itself.
Today, a student asked me if I thought that we were all within a simulation (as in a god's thoughts). I asked him if he was referring to something like The Matrix, but he had not seen the movie (too young). Anyway, I said no, but that if we were, we wouldn't know the difference. This forced me to realize that my own negative belief (that we are not part of a simulation) has neither positive nor negative evidence associated with it. That being so, must we retreat to a 'who cares?' attitude? We could take the view of the positivists and declare that the simulation hypothesis has no meaning because it is not testable. I am not satisfied with this strategy. So let me try this: I believe we are not part of a simulation because if we were, we would have to postulate a higher-order reality in which that simulation is being run. Either way, then, we must claim the existence of some reality to avoid infinite regress (it's simulations all the way down). Since there is no evidence that we are in a simulation, we might as well -- with Occam -- believe that we are part of a reality, which, though it may be simulated, is itself not a simulation.
At Logan Airport (Boston), your safety is their "first priority." Prior means first, no?
March 12, 2013
Gmail tells you which of your recent correspondents are online by putting a little green circle next to their names. I don't know what algorithm they're running, because people pop up in that window whom I have not written to or received mail from in a very long time indeed (in, like, forever, as Google engineers of Generation Z might say). Today I crossed a Rubicon of the information age. Sitting in class during a student's presentation (which was excellent, by the way, so I could spare a grain of attention), I noticed a little green circle next to the name of a student who was enrolled but not in class. So I opened a chat box and hesitated for a moment on the bank of the Rubicon. Then I crossed. "Leslie" (not her real name), I wrote, "you are not in class . . ." The response said something about a job interview. I wished her luck and received thanks for it.
I am a foe of the surveillance state in all its forms (cameras, chips, fingerprints, web dredging, what have you), and here I intruded on a student's privacy. I suspect that she was a bit taken aback by a note from the classroom in real time. Will this become normal professorial behavior? I have seen the future, and I don't like it.
Some philosophers and other psychologists worry about the problem of other minds. How do we know that other people have minds if all we ever see is their behavior? From these behaviors we infer mental states, but inferences can never rise to the level of direct experience. In contrast, our own mental states are not only experienced, they are an experience. I submit that to worry about "the problem of other minds" is to put the cart before the horse. The first problem is the problem of own mind. How do we know that the experience of a mental state proves the existence of that mental state -- besides appearing not to need any proof? What convinces us that our mental experience is not the subroutine of an AI program or a god's dream (or nightmare)? To my mind (hähä), science fiction writers deal with this question more creatively than philosophers and other psychologists do. For other distillations, see here.
February 23, 2013
February 19, 2013
But promises are made and sometimes kept, as Hume must have known. How could it be? Hume “venture[s] to conclude, that promises are human conventions, founded on the necessities and interests of society” (p. 333).
Asserting that “Men being naturally selfish,” Hume needs to find room for trust and reciprocity. He thinks “’tis necessary, that one party be contented to remain in uncertainty, and depend on the gratitude of the other for a return of kindness.” This must be hard, for “selfishness is the true mother of ingratitude” (p. 333), and “we cannot depend upon their gratitude” (p. 334). Hume considers the situation of two farmers whose corn ripens at different times. One could assist the other, hoping that the favor will be returned when the time comes, so that both might have a good harvest. But once the first farmer has been helped, he has no incentive to labor on behalf of the other. Both lose their “harvests for want of mutual confidence and security” (p. 334). Hume, in other words, sounds like an early prophet of game theory, arguing by backward induction that no one will extend trust when anticipating that the other party will have no incentive to return the favor.
Hume’s solution – if one can call it that – is to maintain the doctrine of selfishness by arguing that people “give a new direction to those natural passions, and teach us that we can better satisfy our appetites in an oblique and artificial manner, than by
their headlong and impetuous motion. Hence I learn to do a service to another, without bearing him any real kindness; because I foresee, that he will return my service, in expectation of another of the same kind” (p. 334).
People end up making promises because they do not wish to subject themselves “to the penalty of never being trusted again in case of failure” (p. 335).
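Hume's two farmers can be rendered as a tiny sequential game and solved by backward induction. The payoffs below are my own illustrative assumptions, not Hume's: a helped harvest is worth 3, an unhelped one 1, and helping the other farmer costs 1.

```python
# Hume's two-farmer problem as a sequential game, solved by backward induction.
# Payoffs are illustrative assumptions (not Hume's): a helped harvest is worth 3,
# an unhelped harvest is worth 1, and helping the other farmer costs 1.

def farmer_b_choice(a_helped):
    """Final node: B has already been helped (or not) and decides whether
    to return the favor. A purely selfish B compares payoffs and defects."""
    if not a_helped:
        return "defect"              # nothing to reciprocate
    payoff_reciprocate = 3 - 1       # helped harvest minus cost of helping A
    payoff_defect = 3                # keep the helped harvest, pay nothing
    return "reciprocate" if payoff_reciprocate > payoff_defect else "defect"

def farmer_a_choice():
    """First node: A reasons backward from B's choice. Anticipating defection,
    helping costs 1 and returns nothing, so A withholds help."""
    if farmer_b_choice(a_helped=True) == "defect":
        return "withhold help"       # payoff 1 beats 3 - ... no, 1 - 1 + 1? see note
    return "help"

# Both farmers end with the unhelped harvest (1 each) rather than the mutual
# cooperation payoff (3 - 1 = 2 each): harvests lost, as Hume says, "for want
# of mutual confidence and security."
print(farmer_a_choice())  # -> withhold help
```

The backward logic is the point: A's distrust is not a character flaw but the anticipation of B's incentives at the last move.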
February 15, 2013
I've been looking for an antonym of forecast. I like aftcast, but have not been able to convert anyone to that view. Pastcast does not do it for me because it is the antonym of futurecast (a term used by a Providence TV station). Postcast is no good either because it raises the question of what a precast might be. At any rate, if a forecast is a prediction based on prospective research, a postdiction based on retrospective research would have to be an aftcast after all.
February 8, 2013
The word of the day is urolagnia (also known as urophilia). One person pees on another and the latter (perhaps the former too) experiences pleasure. What's wrong with apple juice?
February 4, 2013
Until recently science was the one forum which functioned like an anarchist's dream. Each man capable of doing research had more or less the same opportunity of access to its tools and to a hearing by the community of peers. Now bureaucratization and organization have placed much of science beyond public reach. Indeed, what used to be an international network of scientific information has been splintered into an arena of competing teams.
~ Ivan Illich, Deschooling Society (1971).
What was a historical note at the time was also a prophecy. Science is an industry, a bureaucracy, and a secular religion.
February 2, 2013
They say that Nietzsche was a misogynist. I found this quote “Ah, women. They make the highs higher and the lows more frequent.” here. Now, by itself, this quote is no evidence for misogyny. To be judged so, one would need to show that both of Nietzsche's claims are individually false and that he misjudges women's effect on men's happiness overall. As it stands, his two-pronged claim suggests that Nietzsche had insight into range-frequency theory 100 years before Parducci (1965) formalized it. Not bad, eh? It seems that social relations in general have the potential to heighten positive peak experiences. Children, for example, also have the capacity to do that. Negative experiences can also be accentuated, however [not mentioned by Nietzsche]. A betrayal by a trusted person or the death of a loved one are most painful.
Parducci, A. (1965). Category judgment: A range-frequency model. Psychological Review, 72, 407-418.
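For readers who want the mechanics: Parducci's model predicts that the judgment of a stimulus is a weighted compromise between its range value (its position within the span of the contextual stimuli) and its frequency value (its percentile rank among them). A minimal sketch, with invented stimulus values and the model's free weight w; it assumes distinct values for simplicity.

```python
# A minimal sketch of Parducci's (1965) range-frequency model. The stimulus
# values below are invented for illustration; w is the model's free weight
# (w = 0.5 gives the range and frequency principles equal say).

def range_frequency(values, w=0.5):
    """Predicted judgment of each stimulus: a weighted compromise between its
    range value (position within the contextual span) and its frequency value
    (percentile rank among the contextual stimuli). Assumes distinct values."""
    lo, hi = min(values), max(values)
    order = sorted(values)
    n = len(values)
    judgments = []
    for v in values:
        r = (v - lo) / (hi - lo)       # range value, in [0, 1]
        f = order.index(v) / (n - 1)   # frequency value, in [0, 1]
        judgments.append(w * r + (1 - w) * f)
    return judgments

# In a positively skewed context (many lows, one high), a middling stimulus
# gets a boost from its high rank: its judgment exceeds its bare range value.
print(range_frequency([0, 1, 2, 10]))  # judgment of 2 is ~0.43, not 0.2
```

The skew effect is what connects the model to Nietzsche's quip: the shape of the contextual distribution, not just the stimulus itself, determines how high the highs feel.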
The word of the day is consuetudinal, which refers to a manual of a group's customs (think faculty handbook). And then there's heresiarch, referring to the leader of a heretical sect. Sometimes, a heresiarch writes a new consuetudinal.
January 30, 2013
January 26, 2013
A teacher of mine at the University of Bielefeld once defined psychology as "the science of human experience and behavior." Experience and behavior. Think about that. Experience comes first. Besides behavior, experience is what we want to understand, explain, and predict. But that is a hard thing to do. So hard, indeed, that psychologists perennially flee from even trying. The single-minded focus on behavior was the program of the eponymous movement of behaviorism. The movement did not fail exactly, but it ran its course. Then came neuroscience with the bone-headed idea that only if you see the brain at work do you really understand what's going on. Neuroscience owes a huge debt to psychology. How, for example, would you know that the firing of the amygdala has anything to do with fear if you didn't have other, non-neuro indicators thereof (including, most prominently, the subjects' reports of their experience)? Then comes anything with the term "computational" in it. Computational this-and-that is an exercise in applied mathematics. It models an aspect of reality numerically. Again, the project cannot stand on its own. It needs referents in the world of experience. Otherwise, what are you modeling?
At the end of the line, there is the phenomenon of consciousness, the master problem of psychology and its derivative sciences. Consciousness will resist being fully modeled, comprehended, explained, and packaged. The reason is that consciousness is not a thing or phenomenon like others; it is the medium through which we understand and explain other phenomena. Consciousness cannot turn on itself (not even with the help of reductionist methods like experimentation or mathematical modeling) and gain a full understanding without running into paradoxes of self-reference. But that's OK by me, because it underlines the necessity of consciousness. Meanwhile, we continue to dance around the bush and learn a bit more here and there, and what's wrong with that?
January 24, 2013
January 18, 2013
Lance Armstrong won 7 TdFs thanks to certain "performance-enhancing drugs." Where are the drugs that get us into Science or Nature? We still need to cheat the old-fashioned way: make crap up.
Before panicking, consider how most psychologists seek to contribute to the literature (and make a name for themselves). Few cheat, as Mr. Stapel did (thank Buddha), but most daven to the deity of NHST (Null Hypothesis Significance Testing). Klaus Fiedler, Florian Kutzner, and I (Perspectives on Psychological Science, November 2012) suggested that tightening statistical criteria (lowering p) and other publication criteria is not an ideal response to the replicability crisis. The price paid is the choking off of subtle, innovative, or creative work. We concluded that the main reason for the inflation of significant yet trite or unreplicable findings is a lack of theoretical depth.
I am aware of the pressures and temptations, and I have observed the following many times: novice (and some seasoned) psychological scientists ask how they can demonstrate something interesting, by which they mean 'significant.' If you have p < .05, you have something, the reasoning goes, and surely it can be interpreted by a smart person. Every p < .05 is a score, and the sum of the scores brings happiness, reputation, and promotion. Understandable as it is, this gets science backwards. Science ought to begin with a problem, a puzzle, a perplexity. Theories can be drawn in or invented to propose answers. These theories should yield hypotheses that can be tested, ideally competitively, with significance testing playing a modest supporting role.
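To see why p < .05 "scores" come cheap, consider a simulation (my own illustration, not from the Perspectives paper): run many two-group studies in which there is no true effect at all, and count how often a t-test crosses the conventional threshold anyway.

```python
# Many null studies, no real effects: how often does a two-sample t-test
# come out "significant" anyway? (An illustrative sketch of NHST scorekeeping.)
import random
import statistics

random.seed(1)

def null_study(n=30):
    """Both groups drawn from the SAME normal distribution: no true effect."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    pooled_var = (statistics.variance(a) + statistics.variance(b)) / 2
    t = (statistics.mean(a) - statistics.mean(b)) / (2 * pooled_var / n) ** 0.5
    return abs(t) > 2.0  # approx. two-sided .05 critical value for df = 58

hits = sum(null_study() for _ in range(2000))
print(hits / 2000)  # close to .05: "scores" accrue even when nothing is there
```

Roughly one study in twenty earns its score by construction; a field that rewards each hit, and allows many shots on goal, will fill its journals without ever starting from a problem.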
Contemporary social psychology does not appear to work this way. We see triumphant papers describing how holding a hot cup of coffee is associated with interpersonal warmth (Williams & Bargh, 2008) or how the smell of fish reduces trust (Lee & Schwarz, 2012). What are the problems that are being addressed here?
Lee, S., & Schwarz, N. (2012). Bidirectionality, mediation, and moderation of metaphorical effects: The embodiment of social suspicion and fishy smells. Journal of Personality and Social Psychology, 103, 737-749.
Williams, L. E., & Bargh, J. A. (2008). Experiencing physical warmth promotes interpersonal warmth. Science, 322, 606-607.
Oscar (The Office) argues that Hilary Swank is merely attractive rather than 'hot.' Pointing to psychological research, Oscar notes that Hilary's facial features are symmetrical; she has the kind of face one would get from averaging many faces. Oscar correctly uses the term koinophilia, which refers to organisms' dislike of mutants. A person whose face is different from the averaged face is presumably genetically different. Avoiding mutants and seeking the average is reasonable when distinctions between good (greater fitness) and bad (less fitness) are difficult.
January 14, 2013
The awful German language. Americans regard the German language as impossible to pronounce and grammatically too complex. These views are defensible. Few German words have crept into the American vernacular. Kindergarten, Gesundheit, Sauerkraut, and Bratwurst ("brats" in the Midwest) come to mind. Some are more arcane, like Schadenfreude or Zeitgeist. Then there's a small set of German words you only find in scientific or scholarly contexts, like Gestalt or Lumpenproletariat. My favorite, though, is Festschrift (sometimes rendered as 'festschrift'). This is an edited volume of essays written to honor a senior (perhaps emeritus) professor. The authors are her former graduate students and colleagues. They have only good things to say. Festschrifts are a labor of love. They do not sell.
January 12, 2013
The face of this post is the legendary German cartoon character "Werner." He contributed many a memorable phrase to the vernacular, many rendered in semi-dialectal Plattdeutsch (Low German). He was, however, conversant in Standard German as well. He is credited with the sage remark "Ich muss mich erstmal von meinen Simulationen erholen" ("I first need to recover from my simulations"). I dedicate the discovery of this quote to my Matlab-savvy friends.
The word of the day is humblebrag. The Urban Dictionary defines it as "subtly letting others know how fantastic your life is while undercutting it with a bit of self-effacing humor or 'woe is me' gloss." The Dalai Lama asserts that he is "just a simple monk." Hashtag humblebrag.
January 9, 2013
January 7, 2013
An essay on the school shooting in Newtown, CT (Sandy Hook), can be found here. I tried to take the attacker's perspective and asked what it means for our understanding of morality.
The word of the day is pescatarian. Being allergic to all seafood and shellfish, I could not, as a pescatarian, distinguish myself from a vegetarian.
January 6, 2013
The word of the day is periphrase, which kind of sort of means circumlocution: the use of more words than necessary. I wish I could think of a more periphrastic way of saying this. A pleonasm (safe haven, climate change, hound dog, Department of Cognitive, Linguistic & Psychological Sciences) is a type of periphrase.
January 4, 2013
What's life without the ability to make stupid choices? -- She needs her free will.
~ Gregory House, M.D., on a patient.
In shallow thoughts, I introduced Austrian sociologist Ludwig Gumplowicz as a naturalist, praising him for squarely confronting the myth of free will. Today, I follow up with two sections from Outlines of sociology (1899).
This combination of understanding and misinterpretation is most intriguing. How did it come about? Why should man create an image of god as a separate entity outside of nature, an agent that created nature and is thus itself supernatural?
Although Gumplowicz goes on to say that “as man is himself subject to nature, is constrained by her demands, must satisfy his natural needs, lives according to the measure of the strength and capacity she has given him, and following her commands must close it, so his mind is deeply impressed by her omnipotence and the resulting course of events” (p. 176), he also observes that man struggles against nature, and it is in this struggle that he fancies his will to be free.
In his subsequent chapter on individual efforts and social necessities, Gumplowicz suggests that man perceives natural laws to operate on a grand scale, while independent action is possible on the human scale. Hence, “the beautiful illusion that the individual acts ‘freely’” (p. 190) takes hold. Man perceives his own actions to be free inasmuch as they contravene that which nature otherwise dictates. One of nature’s most basic laws is that all that grows must also decline. “In the realm of nature, all is perishable. Man would preserve everything. This fundamental antithesis lies like a curse on all of man’s ‘free acts,’ which are condemned to be exhausted in fruitless struggle against nature’s necessities” (p. 190).
When man’s will appears to prevail over nature’s will, the triumph is an illusion. “It is false to believe that man could ever at any point be victorious. What is fulfilled is always and exclusively natural necessity, never man’s ‘free will’” (p. 192). As a “trivial illustration,” Gumplowicz offers the example of a man who tries wine stoppers of different sizes until he eventually finds one that fits the bottle. “Eventually one will fit, the one of proper size, and when we find it we cork the bottle with satisfaction, proud of our ‘free action’” (p. 192).
Having convinced himself that he can act freely, man is but a small step away from believing that a being like himself, only larger, can dominate nature on the grand scale. The doctrine of free will and theism thus seem to constitute each other.
Gumplowicz, L. (1899). The outlines of sociology. Philadelphia: American Academy of Political and Social Science.
Did I have the guts to eat the Pfälzer Saumagen (Palatine sow's stomach)? I did. Just for the pun of it.