Thursday, December 25, 2014

Why You Think the Internet Stinks Now

I have been active on the Internet since 1990. In those days, we had Usenet at the university and moderated private bulletin board systems (FidoNet) for amateur fun. This makes me four Internet generations old, if we assume that an Internet generation is about five years rather than twenty-five.

Indulge with me in nostalgia. Before the NSF decided to sell bandwidth to the .com top-level domain, the Internet was a place of low or no graphics, where users were almost exclusively .edu accounts (with a few .net, .mil, and .gov), and most accounts were a first initial and last name (mine being misspelled by the IT people at my U, so I had anonymity even then). HTML sites were usually navigated by Lynx or one of the other text-only Unix-based clients. Since the entire endeavor was text on slow dial-up connections, or fast connections on computers with monochrome displays, it was an empire of words subordinated to either a group's activity (e.g. rec.bicycles) or a group's academic research. University and government workers created massive amounts of free information, and the early HTML systems were ways to link information infinitely. People spoke of beginning with an interesting note in the newspaper, clicking on an odd term, following to a strange fact, seeing a strange name, and spending hours “surfing” from site to site, but all of this was via reading.

When America Online and CompuServe opened their closed systems to the Internet, everything changed. There were floods of new users who reflected average America. While many snobs thought that “killed” the Internet, it was, instead, the underlying decision to commercialize the Internet, combined with hypertext, that turned the Internet into today's creature of clicks and “eyeballs.”

By 1996, and certainly by 2001, most “Internet” users knew only the world wide web, not Usenet, not discussion. The world wide web itself “was” a series of commercially profitable sites, with the last of the .edu-created sites purchased or forgotten (one, created by the Columbia University library to make all of its Columbia UP reference works available for free, was sold, then resold). University server sites providing information gradually became invisible to average users because they went unadvertised, and “the web” was no longer a place where persons “surfed” on an adventure of information. Certainly by the time the "tech bubble" burst, each website was invested in keeping visitors on the site, in the site, and preventing outside linkages. Websites became more pictographic, with an increase in sensationalism, as the same pressures that turned superabundant newspapers in 1900 into the Yellow Press created increasingly narrow and vertical forms of discourse (“vertical” here referring to information that is recursive and closed). What had been peer groups in conversation became interest groups engaged in a tailored retail experience.

“Friendster” and “MySpace” were non-profit simulacra of the older Internet. They succeeded, to the degree they did, by offering like-minded cultures and subcultures. (There have been other simulations since, including Reddit.) However, when their host/software companies offered stock, they were purchased by media corporations that, as early as 1998, had been imagining an Internet/cable vehicle whereby visitors would be captive, ordering television shows, movies, books, radio, and the rest on an a la carte basis by the Internet.

Neither the websites nor the delivery technologies were in place for such visions to be realized. (This vision, and its failure, was paradoxically critical to the collapse of Enron. The Amazon Kindle/Fire is getting very close to its realization today, according to critics. If they can "own the pipeline" and the store and the production, then the consumer choice is finally completely eliminated -- or "business uncertainty is minimized," if you prefer.)

I'm no Electronic Frontier Foundation warrior or GNU freak. I haven't the money to be the first or the skills to be the second. I did join Wikipedia in 2003, though. My frustration with it was that it was not dedicated to a common project of construction as much as it was a "community." My criticisms of capital in technology are not propelled by idealism or ideology. They come solely from an analysis of the deterioration of academic freedom and investigation because of a capitalized web.

One thing I've noticed is that a genuine analysis of technology cannot be found under the title of "technology" writing. For about ten years, I have noticed that writing about computer technology, in particular, falls into one of two camps. Either a new device or program is the Swiss Army Knife of Heaven -- able to turn every student into a buddha and genius -- or the next device or program will free the Fenris wolf and extinguish the sun once and for all. "Tech" writers are either reviewers or advocates. Sometimes, even worse, they're salesmen.

Look, capitalized websites serve capital. I get it. That's just and right. But non-capitalized websites are now all but invisible, especially since Facebook learned from MySpace's failure and AOL's persistence and began to fold a whole universe of outside websites into its "you're still on Facebook" experience. Regular people are beginning, just beginning, to realize one of the more cynical web "memes": If you're not paying for the product, then you are the product.

All of the commercial Internet rides on public infrastructure. Ask AT&T how it feels about cable companies getting access to "their" infrastructure. Well, how should we feel, then, about commercialized Internet services working against the public's interests or the nation's constitution? EU investigators found that a person who creates a Facebook account and immediately deletes it generates twenty thousand pages of data that Facebook does not delete. That, after all, is their data.

Windows 8.1 is roping all its users into Xbox Live accounts and beaming geolocation to Microsoft. Apple does the same with iTunes. This is without even talking about a smartphone. Any person who owns such a thing is foolish, in my opinion (as a phone, it's a phone; but as a computer, is it equal to a laptop? As an mp3 player, is it equal to even a Walkman? Doesn't it offer imitations of a dozen functions, all at inferior performance, with the solitary advantage of fitting in a pocket?). I'm sure you have already read that the new iPhone made news for not including a backdoor into users' encrypted data for the NSA and FBI.

Going onto Google is a losing proposition. It geolocates the browser. After a few searches, Google begins to tailor results to "customize the web experience." It predicts the sorts of results this user wants. It discounts, for example, "old" web pages -- so if you're looking for a news story from 2004, you won't find it, because Google simply doesn't want that to show up, because "normal people" don't search for old information. A few more searches, and the results are "customized more." 

Yahoo's search, by the way, is powered by Bing. Facebook will tailor search results and "help" the user extensively as well.

Doesn't that scare you? Don't you see why that's the end of the world?

The limitations are being made, in the case of Amazon, Facebook, and others, on the basis of likely purchases, not what one wishes. In the case of Google, it's made on the basis of what you have been interested in before. In other words, these merchants are making decisions about the sorts of questions you can ask, and answers you can get, on the basis of likely sales, likely happiness, not answers, not knowledge, not growth.

As an academic, I have to have open searches, because the Internet has consumed the library. Once that library becomes a public library whose card catalogs are assembled by advertisers, whole floors of the stacks go missing. The only answer to it is yet more capital outlays in the form of JSTOR and EBSCO subscription services.

Meanwhile, the Facebooking of conversation has pushed conversation into interest groups, where like meet like, the agreed hear from the confirmed. That is guaranteed to be sterile or frenetic.

Monday, December 22, 2014

Let's change the rules

My only real rule, and it's about me and for me, so I don't have to justify it before any tribunal, is that I won't talk about myself on this blog. I'm fully aware that there is no other subject, if we trace the matter analytically through hoop after hoop, but you can take your hoop and put it in your nose. The simple fact is that I don't like me. I don't find me interesting. More, I loathe "bloggers" who write about the miracle of their daily lives. While these verminous scourges are less common these days, when I began here, they had not yet found their Facebook selfie Reddit Instagram Pinterest nirvana.

I recognize that I have an audience of perhaps two, thinking optimistically. I have made this so.

I want to change the rules a little. I want to talk about the vitreous that blocks me, even if it isn't universal. For example, anyone who reads all of this blog gets the impression that it is written by someone with clinical depression. That makes it nothing special. My last post (look down) commented on the fact that this is just how things are, and talking about it might be profoundly useless. However, today I decided to make a play list for my funeral. I didn't do that because I'm planning on hastening the pale visitor's conquest, but because I've gotten a couple of cancers. That's no big deal, except that I'm paid so little that some months mean hunger (really), and our insurance just pushed the deductible from $3,000 to $5,000. I just, in essence, got a $5,000 pay cut.

The IRS has chased me around and taken this month's paycheck. The reason is that my mother borrowed against a life insurance policy of mine some fourteen years ago. The insurance company made sure that the premiums would never go to repay the loan, and therefore the policy would die. When it did, the company reported that I had gotten $5,000 in "income." I, of course, had gotten not a cent. My mother had. She's dead. I don't blame the taxmen. I blame the insurance industry that wants old life insurance policies to die, because they're too cheap. Nevertheless, I owed $1,000 on the "profit" I had made when the policy died. I barely get $1,000 a month, so I wasn't exactly able to pay them from my excess funds.

There's good news, though. If I owed $10,000 or more, there is a "Fresh Start Initiative" that would take care of me and let me renegotiate! But owing $1,000? Well, hell, boy, everyone can afford to pay that!

I've been working at a job for 10 years where I am making $5,000 less per year than the starting pay for my position.

Still, money is not something I think about until I have to. I hate money. I hate the people who allow money to carry value. It is, after all, the most abstracted and irrational unit in the world. It is unconnected to morality in the extreme. It is unconnected to work nearly as far. The construction worker works much more than the stock trader, but the stock trader makes an obscenity of ejaculating money, while the construction worker makes a wage and destroys his body while listening to Rush Limbaugh and Neil Boors.

My college president thought it would be a good idea if all of us provided our Facebook profiles to him for a new intranet (that would be linked to the school's Facebook page. . . I'll let you figure that one out). We were also supposed to explain our conversion experiences. When did we realize that Jesus was Lord? What book or preacher was most important to us?

"Consider the lilies of the field, how they grow; they toil not, neither do they spin...."

I do not do Facebook. I never will. I have Reasons. I put "A Poor Man's House" by Patty Griffin on my funeral play list. My first song is "On the Nickel," because it says, "If you chew tobacco, and never comb your hair," which sets out two of my conditionals.

I took a long time, but I finally came up with a response to the president's request. He wanted to know, so I decided I should tell him. I wrote two documents. I'll share them here, I guess. Maybe I won't. One is the actual story of how I lost my faith thanks to the evangelical movement and its emphasis on altar calls. I came back to it later and discovered a quiet, certain, faith. I never, in the document, point out the problems with the theology and polity that this college president and his new trustees embrace. I only tell the truth. Then I turned in a second document and explained why Facebook and all of the commercialized web is the destruction of academic inquiry.

This second one is boring as a sand pudding. I will post it here.

The point is that we once had a volunteer Internet, where discussion was organized by joint projects or activities. We exchanged that for interest groups and consumer subsets under Facebook or one of the others -- all designed as being conversation under the power of sale.

I will offer one of my blogs of Ideas next.

I remember the t-shirt from 1991: "The Internet's Full: Go Away!" That was a response to the "AOLamers." I thought the snobs were wrong then, and I was right, but I thought the Internet had been destroyed then, and I was right about that, too. In 1993-4, I thought the invasion of graphics into websites was the problem. I was feeling a symptom. The truth is that the decision to sell bandwidth to .com meant that, at best, the "real" Internet of Usenet days would exist only in an underpopulated, unadvertised, esoteric bubble, but "the web" would become a midway of freakshows, where the marks are the exhibits.

Friday, October 10, 2014

Therapeutics and Cures

I am myself indifferent honest, but yet I could accuse me of such things that it were better my mother had not borne me. I am very proud, revengeful, ambitious, with more offences at my beck than I have thoughts to put them in, imagination to give them shape, or time to act them in. What should such fellows as I do crawling between heaven and earth? We are arrant knaves all; believe none of us. Go thy ways to a nunnery. -- Hamlet III i

Hamlet is a great play for grumbling. It has the best condemnations of life one could hope to find. Hamlet, the character, is a peevish young man, but Shakespeare was a middle-aged man who had license to write a peevish young man. It's the perfect combination.

"You know, I just can't stand myself/ And it takes a whole lot of medicine/ For me to pretend that I'm somebody else," Randy Newman wrote, in "Guilty," and the song (on "Good Old Boys") hit a common vein for muddled up men. From guys in foam trucker hats to bicultural intellectuals with delivery marijuana in Manhattan to stifled men hurling insults and encomiums at ESPN, the sense of self-loathing may be the most common sense, the only commonsense, in the American man.

I suppose there are men who like themselves. I think I've met some who, standing in a flood light and a room of mirrors, puff their chests out and say, "That's what I'm talking about!" However, I'm not talking about the water bugs who can live on a soap bubble. If those men don't make up for their lack of self-loathing in outright hatred from everyone they know, then I, at least, will hold them odious.

We begin in fantasy. We start as princes to be, quarterbacks and inventors of indispensable goods that will benefit humanity. We dally in plans that give us joy because of their fundamental justice: every plan is an affirmation of our potential, our uniqueness, our power. We even put plans into action and make achievements. However, we are not the jet pilots, the commandos, the secret agents, the wizards and rock stars we knew we could be, if reality honored -- if reality only allowed us to enact our plans.

Sartre said that Hell is other people, and the adolescent's fantasies fail to take into account other people, except as objects. Young men's plans fail to take into account unfairness as a founding principle of society. Each individual relationship is as fair as the two persons make it, and there is goodness beyond description to be found, but lurking behind the immediate, always present beyond the personal, is a force like entropy -- a force of profit, of grasping, of protecting power and subjugating the masses, and this force asserts itself like a flood against the leaky boat of the personal and the social.

Thoughtful people talk about The Combine or The System or late capitalism (although we're really in a post-capitalist state, to my way of thinking, as capital has divorced itself entirely from commodity and produces rents without reference to any commodity exchange). The less thoughtful people talk about The Government or immigrants or nepotism or loss of traditional values. All of the young men who had their plans and put their plans into action and found themselves anxious at the end of every month, or deciding which bill to be late on, know that something isn't right. All of them know that they have failed, and they know, accurately I'd say, that they never really had a chance.

Anger turned inward is depression. Yeah, well, depression is also realism.

Some of the most capable, beautiful people I have known have been crippled by depression. It never mattered what they could do. It only mattered how far they failed themselves, and they had failed themselves pretty deeply.

It occurs to me at this date, far too late a date, that we are a strange, crazed people. We are trying to "treat" depression. We are not trying to treat the fact that inflation is occurring but not showing up in the inflation rate (all food products are shrinking while staying the same price -- as if there were only a few manufacturers and they colluded to raise prices by shrinking portions. . . but such a thing could never really happen, could it?), that surplus labor has meant increasing profits and no increase in employment, that tax rates at the top go down while taxes at the bottom go up. . . but I'm only speaking of money, because money is bothering me, personally. We have no frontiers, no new societies for humans to forge their identities anew, so our old accumulations of cultural power have begun to rot and invite violence. We have turned our nation over to the value system of the MBA, and the MBA's value system is anti-humanistic and anti-human (as well as irrational).

There is no cure for depression. There is no point in asking for this or that thing, this or that process of chemistry, to intervene. Depression is not, after all, abnormal. It is legitimate, and it comes from never becoming persons. It is despair, in the Kierkegaardian sense, but it carries with it its own ever-shifting demands. Unlike Kierkegaard's notion of despair, where one must live in the eternal present and engage the self in full awareness of the religious obligation, this is a compass with fixed legs: as the self gets more engaged, its expectation of what it requires to be fully alive moves farther along, and the gap is an acute sadness.

That, I suspect, is a prospect of living.

Sunday, August 17, 2014

Strippers, Cops, and a War with Drugs

I have known more prostitutes than police, more police than strippers. I have had conversations with strippers when they were off the clock, but I'm not an expert on what they do or why they do it. By the way, the number for the first two professions I mentioned is two and one. The difference is that I knew the detective was a detective the entire time I was an acquaintance of his, but I only knew one of the prostitutes had been a prostitute. The other was working, and I didn't know about it.

There are many, many documentaries by and about strippers. The most interesting facet of their profession is the labor exploitation. The psychological exploitation isn't something the women seem to cite very often, but the labor conditions would have Samuel Gompers calling out his guys. Watch any documentary on the job, and you'll come to the conclusion that the women are stolen from, subjected to poor conditions, and encouraged to spend all of the money they make on dangerous surgeries.

As for prostitutes, the subject is endlessly complicated from a labor point of view. Even the issue of trafficking turns out to be complex, as the numbers of women enacting a "Lilya 4-Ever" scenario are probably low. (Even one is too high a number.) However, again, the women face economic exploitation that leaves them constantly having to work more to obtain the big payday they can see just over the horizon.

For a while, Salon was a cutting-edge left-wing political site. Then it became a click-bait site, where every article was written by Tracy Clark-Flory. "Ten Things about 'Divergent' that Related to My Lesbianism" and "Sex Toys and Me" and "Gender Equality and My Girlfriend" and on and on and on -- every piece was, "And how this totes relates to me! I'm amazingly hip! I like sex. With girls!" Now, Salon is trying to be a left wing political site with writers again, although designed as if every person on the Internet were either blind or color blind and viewing things on a 3" screen. (Don't get me wrong, "Miracle Pasta Recipe as a Hipster Lesbian" articles still run every day.)

All along, Salon has had a "sex positive" feminism. (That means, for people who like to reduce long arguments to clickable bylines, that it is a feminism that embraces sexual pleasure as a right and believes that it is good to discuss desire and set out sexual norms that avoid shame.) They ran, back in 2003 or so, a series from a Washington call girl. They also ran a story from a woman who was a prostitute in Cuba for a while. Both women attested to the same thing: the body takes over, there is a bit of pleasure that is simply due to wiring, and intimacy gets lost. "Intimacy," according to biochemists, is oxytocin. According to everyone else, it is a bond and closeness that follows a sexual encounter that is often the best part. Sex workers stop having that, and they report difficulties in achieving it with their mates, if they are married.

Sex work, the apologists say, is not supposed to leave women emotional wrecks. I beg to differ, but my sample size was small, and I do not wish it larger. It is enough, as far as I am concerned, to think that, at best, the women can count on "not awful" and then pay the price of a divorce from emotional connections.

There is a movie available on Netflix that I cannot hiss enough. It's called "Whore's Glory" in English, and it's only accidentally good. The German filmmaker gets credited with sympathy for the women, but I saw none. I saw, instead, a voyeuristic impulse that controlled the film so that the narrative simply had to lead to some on-camera intercourse. While the documentarian was chasing down the most degraded red light districts in the world, he let the prostitutes themselves talk about whatever they talked about, and the extremely young girls in India made a case against their dehumanization that is utterly shattering. For the most part, though, the women talk about how little money they're making, gossip about each other, and talk about Johns. Their attitude toward men is as commercial, affectionless, and dry as it could possibly be.

A stripper in a documentary talked about how she came to see her breasts as an ATM. She would shake them around, and pull money out. Every man she saw on the street, she said, she thought of in terms of whether he would tip well or not. She had stopped dating, because she was convinced that every man she met went to strip clubs and was as bad as the customers who gave her tips, and whom she despised, every night. The prostitutes in "Whore's Glory" spoke of men either in idealistic terms -- the man who would be different, who would protect her, who would give her money -- or in terms of a cully.

Strippers can get tax deductions for their breast augmentation surgery. Silicone is the real drug of choice for the industry. Sex workers, despite what apologists say, have a correlative link with narcotic use.

To sex workers, people look like Johns or unicorns. To strippers, men look like suckers to be played. To cops. . . .

The NYPD detective I knew was a great guy. However, he told me himself that he had had to learn to look at the world a new way. The world inside the force looked like a war against scumbags -- that's you and me -- and victims -- also you and me -- and people trying to keep them from doing their job -- also you and me. When you see bad people and hear lies all day, you expect every stranger to be a liar. When you live and die by the idea that, like the military, you're not "fighting for" an abstraction, but for the guy next to you, the loyalty you build means that of course the witness is lying about the other cop doing something bad.

I find it rather easy to believe that a policeman shot an unarmed Black man to death in the middle of Ferguson, MO. I would believe he did it for the young man's failing to obey. I know that the grounds for police shootings have shifted since 2006: police can now shoot if they believe they are in danger. No longer do their lives actually have to be in danger; they only need to think so. Since Ferguson PD beat another Black man and then charged him with destruction of police property for bleeding on their stuff, I absolutely believe that these cops -- whose county superiors later arrested an alderman for "failing to obey" -- would kill because they weren't being obeyed.

The police have a natural psychological bias. See crummy people all day, and you'll start thinking all people are crummy. See violent people all day, and you'll assume everyone's violent.

The problem is that the police only get to enforce laws, and they have to tolerate annoying citizens. If they can't do their jobs while releasing information, while being protested, while freedom of assembly stands, then they can't do their jobs at all. Police who "must" get MRAPs and Strykers and automatic weapons and LRADs aren't police: they're paramilitaries.

The recruiting for police makes it seem like a W A R on crime. The SWAT gear and military surplus allow all of the police to go out in JSOC-styled garb and point rifles at empty-handed protesters. It allows the police to say that the crowd are "f*cking animals." It seals the assumption (that the public is criminal -- an entire prison population waiting for booking) with the rituals of conquest (not occupation).

The old "war on drugs" gave us the legal abuses that glaze these affronts. The "no-knock warrant," which is now served by SWAT teams, comes from the drug war. The roving wire tap comes from the drug war. The invention of SWAT itself comes from the drug war. However, all the heavy weaponry in local cops' hands comes from the 9/11 freakout. Someone thought that it was a great idea to put military junk in Wayback, Arkansas so that it could deter the Islamic invaders. Marry the "no knock" warrant and the SWAT with that stuff, and you've got Ferguson, almost.

If strippers need breast augmentations ad infinitum and sex workers look for central nervous system depressants, then what of the Valiant Watchmen on the Wall? It's possible to see the behaviors that we've seen across America in the last five years without a widespread drug problem among the police, but, as long as we're making SNAP recipients pee in a cup, making school teachers pee in a cup, making parolees pee in a cup, why not ask the local police to be screened for anabolic steroids and testosterone supplements? It's only fair. We do, after all, want to arrest any law breakers.

Friday, August 08, 2014

Miss Elainey

To clarify what is below, the point is really simple. I refer to McLuhan a lot, but that's because he asks us to ignore the novelty of a piece of technology and to focus instead on what it does. What it does, he says, is inevitably a replacement for something already being done, an extension of one of the human senses or capabilities, the recall of a forgotten technology, and the reversal of its initial extension and replacement. I grok that two of these are hard to buy.

Just focus on the first thing: every piece of technology replaces something already underway. Humans come to a piece of technology with the same brain they've always had. What's more, technologies create their own social norms. Remember CB radio, good buddy? Hashtag memory. The individual technology creates a fetish in both the "neutral" anthropological sense and the more potent Marxist sense. McLuhan's The Mechanical Bride talks about how a new technology presents anesthesia ("New!" "Labor saving miracle" "Lose Weight without Trying!") and seduction by borrowing from art. He had in mind the yearly parade of automobiles and dishwashers. Imagine if he had seen the explosion of articles praising the "revolutionary design" and "aesthetic" of the Apple iFad.

If we can ignore the borrowed clothes of technology and suspend the anesthetic claims, we can ask, "What does this do by other means?" In other words, if you want design, go to 90th and Park Ave. in New York. The labor saving claim is true, but it's a compensation for changing the way we do things, the way we organize labor, and the expectations we have. Allowing it to be more than that is to be seduced.

From Shorpy

With me so far?

Ok, the fetish of a piece of technology grows more and more essential as that technology is social. Therefore, a single user piece of technology such as the hoe will hardly have any fetish to it. There won't be "a right way to hoe," and there most especially won't be a "right way to get ready to hoe." On the other hand, driving a buggy or a car has an enormous fetish: adjust the music, fasten the seat belt, adjust the seat, set rules for "calling shotgun," and then going out to engage in heavily codified driving behavior.

A great deal of money and advertising effort has gone into making YouTube ubiquitous. Many dollar bills are betting that the future consists of every citizen of the planet watching and computing on a telephone. The new Windows 8.1 looks like a Kindergarten cut-out board book with its big icons (perfect for a phone screen) and reduction of text to a series of enigmatic gestures of hostility (e.g. "The Store").

"Video," as understood by persons born after 1995, is a free audio/visual experience found "on the web." It is anti-artistic, in that it is a product intended for consumption and repetition, but not consideration. Video requires response -- an up or down thumb or a forwarding to a friend -- but, if you understand the dichotomy of pornography and art, it is on the pornographic side. (Briefly: pornography is taken, devoured, and used up by its viewer, and it is good to the degree that it is useful in producing an effect. The pornographic is consumed in the viewing and therefore cannot teach lessons or provoke thought, because those are inimical to the pure sensationalism of pornography. The artistic refuses to be understood. It cannot be contained by the viewer and elicits mood rather than provokes it.) Video is flash paper.

If a teacher uses a video presentation or a video presence in a remote class, then the medium's fetish works against the purpose of the class. The medium (video) eliminates a set of uses and imposes a set of interpretive mandates.

All online teaching runs head-on into the fetish and unintended reiterations of the technologies we use. What I call "the big text box" runs into a problem, too. Again, forget the claims of saving labor for the time being and bracket any questions of "ease of use" or "design," because those are all claims astride or beside the critical questions of, "What is this, outside of the classroom" and "What is the fetish already in place?"

The Professor gets a big window. In the left pane is a list of the names with online/off status indicators, and below that is a dialog box for the students to "speak." The professor's right pane splits into one box for slides and another for web pages or documents brought up on the fly. The professor then types:
The Licensing Act of 1736 indirectly led to the success
of the English novel and the creation of Shakespeare as
"the greatest playwright in English." After Walpole's Commons
passed the Licensing Act, London audiences distrusted
any plays that did get to the stage, because such plays
felt like propaganda. Furthermore, playwrights couldn't
get plays passed by the censors. However, they could make
some money by publishing their play ideas as novels. That's
just what Henry Fielding did.
There is a slide up there saying, "*John Gay's Polly *Henry Fielding **Haymarket Theatre **Pasquin *Repertoire theaters with Shakespeare *Puppets!" However, to the shock of the true believer in online classes, student Chad interrupts with "When was Shakespeare born?" Addison takes advantage of a pause while the professor waits for students to catch up to type, "The syllabus didn't say that the first test was going to be part of our final grade. I don't think it's fair."

What's happening is that the fetish of the Big Text Box is the online forum or the web comment thread. It's different from "watching a video," but it has a primitive social structure that repeats itself with depressing regularity. The rules lawyer, the "but you haven't done your job because you haven't convinced me that Jane Austen wasn't a lesbian" writer, the "you have to be nice to me; it's in the rules" special sunbeam, and, of course, the troll (the individual who goes to a place he (or she, I suppose) most hates to try to 'tell them off' or just make 'them' unhappy) -- each is standard issue in comments threads. 

In an online class, students have every reason to avoid "web comment" behavior, but they have every reason to avoid classroom disruption in in-person classes, too. For students feeling frustrated or afraid, or for students who are just plain unhappy, "exposing this BS for what it is" seems worthwhile. When an online class uses the BTB, students know the personae they must adopt.

An Hoff Othat

Illinois Republican Bobby Schilling was formerly in the House of Representatives, and he wants his old job back. For one thing, he needs the money. He's only making $100,000.00 a year, and he made $174,000.00 while in Congress. He can't manage on his current salary:
". . .the folks that are living paycheck-to-paycheck, which is most Americans, including myself, is that, you know, this [an imaginary tax to fund the ACA] is not something that you want to be putting out when you've got a kid that wants to play sports or you want to take a trip for vacation. Instead, you've got to funnel your money over to Obamacare, which is something you might never have to use."
Let us bask in the glow of the five-watt bulb shining before his lenses.
So, health insurance is terrible, because it might mean not taking a trip for vacation -- which is a decision we paycheck-to-paycheck people often grapple with -- just where we want to go for our vacations, whether we should fly or drive, and whether we should try the Virgin Islands this year or stick to Martha's Vineyard. Why, health insurance could even cost as much as. . . as a kid playing soccer. Well! In that case, the choice is easy: little Maradona needs spikes. (It's possible that Bob there could be thinking of a daughter and an actually expensive sport, like gymnastics or tennis, but we've got to remember that he's living paycheck to paycheck, so he has to be thinking of an inexpensive sport.)

I want to point out something in Bob's favor, here. I believe him when he says he's broke. This is because of my lesser known law ("Geogre's Law" is on the Internets, but I've got more than one of 'em): Debt rises to income. Also, all people live on $18,000 a year.

Bob has no money. Bob makes a lot of money. Bob probably has nice stuff, including a nice car payment and a nice mortgage payment to make. Given his party affiliation, he probably has a tuition payment or two to make as well. He no doubt has dues and greens fees that he has to pay. He spends more per mile with his vehicle than I do, for example, because he would have a "nice" car, which means a heavy car, which means fewer miles to the gallon. If he makes more money, he will likely get a private plane. No matter what, until he runs out of desires, the debts will chase his income, leaving him with a set amount with which to buy food, drinks, golfing magazines, pay-per-view sports, and Toblerones in hotels. That figure used to be $15,000, but I'm sure that it is now at least $18,000.

There! Two posts in one. Some pretty pictures, though.

Friday, August 01, 2014

The Valley of the Uncan

This pun is available for adaptation, by the way ("The ICANNy Valley," e.g.).

The time is drawing near when I will be asked to consider technology and instruction again. I have never stopped thinking about it, of course, but once upon a time I was paid to be aware of the issues, even if no one actually wanted to hear what I had to say. (I blame myself. I probably didn't say it very clearly.) Now, though, I'm going to be caught between the cruising speed icebergs of capital and capitalized tech purchases and will need to explain why education, instruction, and the latest purchase aren't always aligned.

I got a new laptop, and for once, I'm current in software. I'm indistinguishable from consumer class, and it only took thirty years to get there. This means that there is a video camera built into the lid of the laptop, and both the NSA and Microsoft can turn it on at any time without my knowledge. Fortunately, there is a high tech defense against this. For a fee, I can relate the specs to you on the wireless network enabled BLACKTAAPE (TM) (Pat pending). By getting the properly designed, wholly chemical free and Y2K compliant BLACKTAAPE, you can place this device in front of the lens of the camera and be certain that no one is seeing anything you don't want to show.

Anyway, even if you don't keep up with college education, you know that the trend is for online classes. This trend is driven by consumer impulses and a flood of G.I. benefits, an exponential growth of for-profit colleges, and, most of all, colleges and universities seeking ways of gaining revenues by reducing labor costs and facilities expenditures. Sometimes, schools try to own the online classes teachers create, claiming that the professors surrendered their intellectual property and copyrights by using college/university computers. Even when that's not the case, the colleges frequently pay less for online classes and demand more.

In general, they are exceptionally unpopular among faculty. (Yes, commenter: you love them. That's great. I don't hate them. I'm talking about the general feeling.) Faculty generally figure out that exploitation is in the offing.

I want to talk about one small subsection of the phenomenon, though, and that's the technology of the online classroom.

Assume a class size of 15. Assume the class will involve laying out background information and process instruction. In fact, let's go ahead and stipulate a mid-point class, like "Survey of British Literature 2: 1750 - 1945." To teach a class like that, there will be
  • Historical background, genre background, biographical background, thematic background for major authors; construction of either a thematic narrative or an historical narrative to unite the material selected into an arc that will allow the students to frame the things they read.
  • Discussion with students of individual works to encourage close reading and strengthen student close reading skills; class investigations of longer works so that the students get to pioneer the exploration and discover when a reading has and lacks support.
  • Explanation of "how to write on literature." A survey class is structured as a first step in a major, so one needs to teach students how to write about literature with an awareness of critical perspectives; students need to know how to read criticism without being adversarial or slavish.
Now, ignore for a moment what technology you must use. What technology would you choose to use for these tasks -- provided that "in seat class time" is not allowed?

No very good teacher is ever one-way about teaching. Giving background comes close to one-way information flow, but the delivery, in person, is two-way. In a physical classroom, lecturers watch students, listen to whispers and groans, make personal asides to punctuate the depth of the information, slow down when students get behind, etc. However, this, and only this, can be replaced with a set of web pages or a video. Students can "watch a YouTube" of the professor and get an 80% experience, perhaps.

The third thing -- "how to write the paper" and "how to take the test" -- seems as if it is just as susceptible to one-way, static replication, but it is not. Even in highly selective schools, where students have relatively uniform backgrounds, a class of fifteen students will have ten different misapprehensions about how to approach writing about literature. This is inevitable, because the task is at the heart of the college major. In other words, students will be uniformly heterogeneous because they're not college majors. A single talk or web page on "how to write a literature paper" that addresses the misapprehensions of students will either be a work of inexplicable genius or unreproducible luck.

It's the second bullet point that's the hell.

Should I be on video, with fifteen small thumbnails of the students, to "meet" with my class? Is that better than a large chat window?

There is an irony here, because video contact is worse at reproducing the classroom than a flat text window. It is worse for my students, and much worse for me, to use video to recreate the in-person classroom than to have no pictorial representation at all in favor of text windows.

Think about what happens when you speak to a conference room. Think about what the people around the table are doing. The non-verbal communication is much greater than the verbal communication, and people will inaudibly negotiate a mood and behavior. This is why one class can be "mean" and another "sweet" -- with the same material presented by the same teacher, the students themselves will negotiate a mood among themselves without even knowing it, and this collective voice will hold until disturbed. This social harmonizing prevents the most egregious behaviors. (It also intimidates some students and prevents their asking for help.)

When fifteen students are fifteen picture-in-pictures, they are fifteen individuals -- fifteen television sets. They don't negotiate with one another, and each is engaged in the social behavior of "watching video."

You may think "video connection is allowing me to connect to my students," but each student has a history of viewing "video" on a laptop or desktop computer. There are conventions for YouTube and the others that overwrite the actual use made of the video link. "Watching video" is a fundamentally solitary behavior that is subject to the egoism of consumerism. "Watching video" comes with a "like" or "dislike" button, has a comments field, and invites "snark" or forwarding to Facebook. These conventions are everywhere except the online class, so the students, at best, experience contradictory signals from the media. More likely, each of the fifteen students conceives of herself as a solo entity and the professor as disembodied, if not a commodity, and the commodity experience (i.e. monetized routine found in advertising and placed in journalism) of "watching video" acts as interference against the perception of the professor as a teacher.

The instructor, for his or her part, will see fifteen separate, distracting behaviors across the screen. There will not be the corporate behavior one gets in person.

On the other hand, if students engage a large text box, the experience pre-dating the activity is "writing a text" or "reading." This is an individual experience as well, but it is what McLuhan called a "cool medium." The cool medium of reading/writing allows or forces analytical thinking. While the video presentation should replicate "conversation," the technology by which it is arriving has already etched out a set of expectations that instead dominates the intended effect.

Thus, it seems, just as it becomes more possible to have a video link with a class, it is less and less useful -- more and more counterproductive, in fact -- to do so, because the ubiquity of YouTube, Vine, Vimeo, and the rest automatically carry methods of interpretation in the very act of appearing by video.

Monday, July 21, 2014

More simple stuff

It appears now that raising the minimum wage increases employment. This is the direct opposite of the "small bakery" micro-analogy that conservatives appeal to and everyday voters imagine.

I offered students "Raising the minimum wage will/will not raise prices or lower employment." I warned my students in advance that, no matter what they thought, no matter what "common sense" told them, if they did what I required them to do and accessed scholarly reviews, they would find virtually no support for the "will" position. I warned them that academic economists saw the issue as complex, with various effects.

Of about 22 papers that chose the topic, two chose "will not." The rest chose "will," and they either used no scholarly evidence (a consensus) -- opting instead for Hurtage Foundation and Kato Foundation papers that wouldn't fool anyone and newspapers -- or used websites. All of the "will" people, though -- every last one of them -- informed me of the Iron Clad Rule of Capitalism.

If you have someone make you pay your employees more all of a sudden, then you have to make that money up. You either have to increase the prices or fire an employee. Suppose you ran a bakery...

That little canard was inescapable. I read it and read it and read it.
The problem is that this mythical bakery didn't exist in Adam Smith's day, and it sure doesn't exist in 2014. The bakery they imagine is a zero-sum operation, and that's typical of a childhood imagination. Actual bakeries, and we'll drop the bakery as soon as possible, pay for their materials, their labor, their advertising, their insurance, their licensure, their utilities, and then charge extra. The cost of the cupcake is not zero-sum. The bakery's rule is costs + profit = price.

Profits, in most small businesses, grow and get shared with the employees, or else they go toward growth. The owner gets richer, sure, but the small business doesn't keep very many workers at minimum wage. As it grows, it rewards its employees.

Who pays minimum wage? 1. Restaurants, 2. "big box" retail. There was a study of a local environment that showed inflation when there was a minimum wage increase, but it was of Chicago, when it mandated waiters and waitresses getting minimum wage. The restaurant business hadn't been set up to handle that, and so it did have a bulge in prices. Otherwise, true minimum wage (non-tipped) tends to show up in fast food.

Mike's Manufacturing doesn't keep a large part of its staff at minimum wage. McDonald's does. Papa John's demands it.

So, aside from some businesses with high turnover and manual labor, who holds the bag? Mal-Wart and McWendy King. The problem with worrying over their labor costs is that the true nature of capitalism is that non-publicly traded companies have the duty, as Henry Ford said, "To make the highest quality possible for the lowest price possible while paying the highest wages possible." Once a corporation is publicly traded, though, its board's duty is to gain profit at all costs, and reducing labor is the easiest way to achieve an illusion of profit. No board can voluntarily increase labor costs (pay) without facing an investor lawsuit or loss of stock value.

$ = Labor + Materials + Rent on capital + Profit

In low unemployment, employers are forced to increase wages. In a recession or depression, employers have no market force compelling an increase in wages. Furthermore, publicly traded corporations are compelled to increase profits. Since the Great Recession began, corporations, most especially including those that pay minimum wage, have increased their profits. McDonald's has turned a huge profit, and Wal-Mart continues to be the most affluent non-oil or drug business one can imagine, but neither may pay employees more without being sued. Neither is compelled to pay more. The balance of the terms in the formula of capitalist price is out of whack. P > L, and it keeps going up.
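The price formula above can be illustrated with a toy calculation. The figures below are invented purely for illustration (the post specifies none of them); the point is only that a wage increase can be absorbed entirely by the profit term without touching the price:

```python
def price(labor: float, materials: float, rent: float, profit: float) -> float:
    """Price as the sum of costs plus profit: $ = L + M + R + P."""
    return labor + materials + rent + profit

# A hypothetical cupcake before a wage increase (all numbers invented):
before = price(labor=0.50, materials=0.75, rent=0.25, profit=1.00)

# Raise labor costs 20% and take the increase entirely out of profit:
wage_bump = 0.50 * 0.20
after = price(labor=0.50 + wage_bump, materials=0.75, rent=0.25,
              profit=1.00 - wage_bump)

# The price is unchanged; only the split between labor and profit moved.
print(before, after)
```

Nothing about the identity forces the wage increase onto the price or onto headcount; that allocation is a choice, which is the argument being made here.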

The companies can't break the cycle.

There is a need for government (the external regulatory authority) to act as a guarantor of the continuing function of capitalism. That is, it has to negate the impulse of boards of directors in order to make some of the vastly increased profit go to labor.

"If someone says you have to pay your people more, where is that going to come from," students would ask. It would come from PROFITS, of course. If you are one of the main industries paying minimum wage as a large component of your labor, you have been growing in profit and passing none of it down.

Monday, July 14, 2014

Advice for the Love Shorn

No. . . I spelled it right.

I have some experience with being the one left behind at the end of a love affair. The love lorn are, you know. . . lorn. "Lorn" is from Middle English, "leosan" (i.e. "lose"). It's the past participle, and you know it most often with its prepositional friend "for," as in "forlorn." Since the love lorn are those who are without love, that's everybody. The love shorn, though, are the ones who were not expecting anything, even though they should have been.

If you are a normal, functioning human being, then when your boy/girl friend (or wife/husband) says, "It's over" -- with or without the "I'm seeing someone else" (I rather think the people who don't announce the breakup until they're in the arms of someone else are cowards; they need that other person to be a shield or an insult that will make the breakup "stick," or they aren't going to break up until they know they can "do better" (even though, of course, they never will do better than YOU)), then you won't take it lying down. No: you'll pace back and forth, throw yourself at the wall and door, and then demand that she or he tell you WHY it's ending.

You poor sap.

Think about what you're asking for. You're asking for a set of reasons for an emotional state. Second, you're not going to listen to anything the other person says, because your question is corrupt. You don't want to know why he is dumping you. You don't want to know why she feels that things aren't working out. Even if your former lover were possessed of Orphic clarity, even if the beloved could say why love has gone, it wouldn't mean anything to you.

What you really want is a list of reasons why you would break up with you, why you would end the relationship. You're asking to be convinced to not love the other person in the same way that she doesn't love you.
She says you're taking her for granted. You say that you're not. She says that you show no affection, and you say that real love means not having to do that.

Parse this, or its inverse (you want all her time), and what you see is that she's saying, "I feel," and you're saying, "I think" or "That feeling isn't justified." The argument only confirms the premise (the real premise: "This relationship is over") because it's taking place. What's worse is that no one can beat another person into loving.

This isn't a very deep insight I'm offering, is it? It's obvious that no one can argue another person, much less threaten another person, into liking him or her. Nor is it possible to accept the kind of damage being dumped brings without protest. All in all, it doesn't help to know the fruitlessness of arguing. What the love shorn needs and wants is to be validated, to be worthy or worthier.

This really basic dynamic works on humans from their early to their late years. In fact, I know of an institution that, having been rejected, has resorted to argument. It's hard to sue the country club into making you a member; if you win, you won't want to go.
Þæs ofereode,

þisses swa mæg.
With respect, of course.

Friday, June 27, 2014

What May Not Be Said

I am living in a world more full of words that cannot be said than of words that may be.

I avoided Facebook, and I'm still not there. In fact, I won't go on any comment system, whether it's Google or Facebook or anything else, that points at, much less lists, my legal name. This is for multiple reasons:
1. I am boring as a subject.
2. I am boring as a suspect.
3. I do not believe academic freedom exists, because professors, instructors, and teachers are just corporate employees in an MBA's conception of capital now, and research only exists when it produces capital in a way susceptible to corporate monopoly.
4. I have seen what the Internet offers in the form of "fans" and "intense" personalities, and that should be enough to earn Mark Zuckerberg a circle 9A in Hell.

I can't even obliquely refer to what has happened in my life.

"Suile, and mare thanne we cunnen saein, we tholeden xix wintre for ure sinnes." -- Peterborough Chronicle, Second Continuation
"... there are so many fools placed in heights of which they are unworthy, that he who cannot restrain his contempt or indignation at the sight will be too often quarrelling with the disposal of things to relish that share which is allotted to himself." -- -- Henry Mackenzie, The Man of Feeling

". . . quotations start to rise Like rehearsed alibis." -- Seamus Heaney, "Away From It All"

I can't tell you what I'm referring to, but it is a feast of malignant intemperance.

Sunday, June 15, 2014

Probably Hyperbole

I was at Mal-Wart on a Sunday morning, after church, and the lines were long. For some reason, we dumb customers simply REFUSE to go to the teller-less checkouts. Long lines'll teach us to fight Mal-Wart's obvious wisdom in going to a labor-free retail environment. (It's true: the management is decreasing cashiers and making cashiers stand by the "check yourself out" lines to "guide" people. Unfortunately, people would prefer to wait than check themselves out.)

Ned Ludd was right, by the way.

Ludd's loss isn't why I'm writing. Cosmopolitan is why I'm writing. Its cover this month is:
If I were good, I wouldn't hotlink this, but I don't think my traffic will inconvenience anyone.
Cosmopolitan is supposed to make people who view its cover think about sex. In this case, I will admit that I thought about nudity, simply because the dress-thing on Katy Perry was so offensive to the eye that I could only think about how much I'd prefer it if she took it off. I think a burqa would be preferable. The copy on the cover explains that "Katy Perry is on fire," and this may be true, but not when she was photographed. When she was photographed, she appeared to be decomposing, as the gangrenous hair dye and the dress with cut-outs looking like a bug's eyes reminded me more of the grave than flames. (A thingamabob that's shorts, but with a long, exposed zipper, and long sleeves? Is there any element of the garment that works with any other?)

No, what made me pause is the magazine's offer to provide "20 OMFG Moves" and "Epic Summer Sex." I suspect the magazine's copy editor was drunk.

Many moves will result in a partner making the sound, "Omfg!" I believe an unexpected elbow to the solar plexus or the chin is quite effective. A sudden belly flop of one partner onto the other can routinely elicit that noise from both participants and "turn up the heat."

It's the "epic summer sex" that had me scratching my head. My fifth edition of the Holman Handbook of Literature tells me that the epic
1. Is marked with elevated diction,
2. Invokes the gods and involves supernatural aid,
3. Deals with matters of national foundations,
4. Covers a large scope of action.

I appreciate the writers at Cosmopolitan Magazine making a contribution to the American epic. After all, the English epic has proven elusive enough. Oh, sure, everyone says that Beowulf is the English epic -- says so! -- but it's about the founding of a nation called the Geats. . . in Europe. John Milton was gonna write an English epic, but he decided that writing an epic-epic -- the story of Man -- was better, so he wrote Paradise Lost/Paradise Regained. Everyone knew that King Arthur was the potential epic subject, and William D'Avenant's Gondibert had tried an epic in the a,b,c,b ballad rhyme in the 17th century. Finally, Alfred, Lord Tennyson did Idylls of the King and ended anyone trying to write an epic in English anymore, because it frankly kind of stank. American efforts have been even worse.

"Hark! we hear of hookups past in Forum and fanzines,
How Fifty Shades of Grey taught his lady much to endure,
She crouching and swooning and swatted and pierced to ecstasy,
That was good erotica. Then came she, he, and all
To America, the gods to bless, greedy for good sex, alluring. . . ."

I can't do any more, I'm afraid, because I didn't buy the issue. I am, however, looking forward to the summer sex that founds new nations and spans vast territories.

Monday, May 26, 2014

The Dumbest Argument of All Time

Wayne LaPierre, executive vice president of the National Rifle Association, announced his current argument about gun ownership a week after the Newtown, CT mass shooting. On December 21, 2012, LaPierre announced that what caused shootings by nice young people was mental illness and video games. The answer, he declared, was easy and obvious: "Only a good guy with a gun can stop a bad guy with a gun."

The Republican Party, and much of the Democratic Party, has faithfully echoed LaPierre's remarks. The GOP has been relentless in saying that video games and R-rated movies are to blame. (Today is the day after a shooting in Santa Barbara, and something called The Daily Republican has an article pointing out that the shooter's father is a movie director and therefore tied to the "culture" of violence. I won't link to it.) Paul Ryan and a few others have even talked about the culture of "urban youth." Poor people outside of "urban" areas are deserving poor, but the people who are "urban" have a bad "culture."

LaPierre has been doing that "kids today and their scary movies and video joystick games" junk for a long time. In fact, it's such a stale act that, when he responded to Newtown in 2012, he did so by blaming "Natural Born Killers" and the video game "Mortal Kombat." A movie from 1994 and a game from 1992 were to blame, he claimed, for a severely mentally ill child killing his mother and then children at a primary school. It was such a lazy and slapdash evasion, blaming "media culture," that it didn't catch on. The people who have followed LaPierre have typically done so by just using "culture" as the grand conflating variable -- the monkey wrench with which they plan to deny any causality to a correlation between guns and crime (e.g. "Sure, when children have access to guns they're more likely to have gun violence, but how do you blame the gun instead of the insanely violent culture?").

What amazes me, though, is the staggering number of people -- mostly men -- who seem competent enough to hold down jobs who will say that "only a good guy with a gun can stop a bad guy with a gun." That is, quite literally, the dumbest argument of all time.

First, if you have been near a shooting, you know, as I do, that you didn't know it when it was happening. You are trying to sleep, and the noise you hear outside will be anything -- your brain will do its best to ignore it, as you're trying to sleep. If you can't ignore it, you'll imagine that it's a trash can lid falling, a piece of metal falling from a rooftop, anything. If you are going down a street, and someone is shooting ahead of you, you will be thinking, "Where do I need to go next? She's pretty. Do I have money to go there? Maybe I'll walk this way," and you simply won't process the sounds as gunshots. Even if you know what gunshots sound like, you won't think "gunfire" until your eyes tell you something is wrong or your ears give you other signals -- quiet, screams, sirens.

Second, if you process quickly, you have to be extremely accurate -- as accurate as you are fast, at least -- to know where the gunshots are coming from, take cover, get your own gun, remove the safety, chamber a round, and then shoot as few times as possible. Every time you miss, your bullet will continue travelling until it is stopped by something solid, like an innocent person's body. If you were in Santa Barbara and strapped, there is no way you would say, "Pistol shots," get your gun out, and aim carefully before the shooter had moved on.

Here, though, is why it's the stupidest argument in history: I'm writing this on Memorial Day.

During war, do all of the good guys come home uninjured and all of the bad guys die?

In war, we have good guys who are well trained, expecting to have to fire, armed with the best weapons, and facing off against "bad guys with guns," and yet -- amazingly -- it appears that bad guys with guns actually win some of the time and good guys with guns kill bystanders sometimes and good guys with guns are killed sometimes. In other words, in the best possible case of a "good guy with a gun" -- an armed soldier anticipating battle -- we do not get greater safety, just greater casualties and greater death.

That is what Wayne LaPierre is selling.

Thursday, May 08, 2014

"9/11 Changed Everything," indeed.

To begin to write for the third time on this subject, I would need to go to the ends of the subject. As with each previous time, the real occasion for writing is obscure, as it should be, if there is going to be any use to it. Way back in 2001 and 2002, the fools who had to have something to say as they closed their thirty minute tours of news would quote governmental press releases and talking points fallen from Dick Cheney's desk and say, "It's no exaggeration to say that 9/11 has changed everything." I didn't believe them. I vaguely remember a floating head on a screen waxing poetic and summoning all the atmospheric Murrow he could (but only managing Brokaw) and saying that we would be telling our children about the event that defined their world. It was Pearl Harbor. . . or not quite that, but it was like it.

The news readers going basso profondo was inconceivable comedy to me. Even looking back, the most I can come up with by way of how I felt is a paraphrase of Lincoln, who said that, if the United States were ever to die, it would be by suicide, not conquest.

Then again Lincoln embarrassingly underestimated the thanatopsic urge. So did I.

Oh, historians of the time in question have said that Cofer Black and the Spook Patrol began giving W. Bush daily briefings with unfiltered intelligence -- that is, with no analysts involved, no hierarchies of threats, no assessments of feasibility or reliability, just every single hateful and violent threat made by the whole world -- and this every morning before the Ovaltine. It scared Bush senseless, and he said "Yes" to every request. This is how the USA PATRIOT ACT went from controversial to routine -- even boring. The same scary people with scary reports kept working on their haunted house routine, because Obama began saying his job was, "Keeping Americans safe" in 2009. The man who campaigned on civil liberties began devouring them at a pace that his predecessor would not have imagined.

Assassinations? Secret detentions? Suspension of habeas corpus? Phone taps on a national scale (i.e. a small nation of people, not every person in this nation)? "Why object, if you have nothing to hide" uttered as a legitimation, when it had last appeared in satire or dystopian cliche? All of this has come to be a portion of the invisible architecture of American reality -- a base condition for living for most citizens. W. H. Auden's "The Unknown Citizen" is unteachable now, both because its commentary has been bunted in front of home plate and, more, because its world of a state with an indexed population of statistics is now so far beneath the N.S.A.'s capabilities and the ostensible goals of "total informational awareness" as to seem boring.

September the eleventh thrust a cup of hemlock into national hands. Our leaders presented it as a chalice, and all drank.

I do not mean, though, that losing "the American way" or civil liberties was the death of America. Instead, I mean that, frighteningly, the vapid people were right: we changed everything that day, soul first.

In 2011, the little people who live inside of the television asked, "What have we learned from 9/11?" We learned nothing, of course. We learned nothing because there was nothing to learn from a billionaire attacking centers of international capitalism. We learned nothing because 9/11 handed its victims suffering rather than pain, and suffering neither teaches nor presents norms. Most of all, though, we learned nothing because to learn we must first recognize our world and ourselves.

September the eleventh was suffering for those of us in New York City. I feel confident that it was for those in the affected wing of the Pentagon and for the families of the lost in Pennsylvania as well. No one can learn from suffering, only from pain, for suffering is less focused. Rarely is suffering for a particular reason, or even for a passion. Even more rarely is it for a passion or reason that carries with it a moral value or heuristic principle. Instead, people may grow while suffering, but they grow by having empathy or understanding (of humanity, of self, of family) increase.

Most of us would rather attend a seminar.

Among our problems with 9/11 is simple recognition. "We" don't do much introspection of the moral sort. For a nation of self-help patrons and "Face the Press in Review This Week" organizers, we love to look at the individual self and the weekly events, but not much at a decade or a community.

Why there is a new government, and why the parties are disenfranchised

The Republican Party has variations on "self" in its definitions. It's the party of "self-reliance," of "individual liberty," of "personal responsibility," and it performs a fan dance with libertarianism. (Well, it used to. Since 2012, it has been more of a peep show, where the primaries are all-access libertarian and the general elections put on a flag G-string.) The Democratic Party is the party of good governance, of community building, of ensuring welfare and commonweal. Most of its definitions feature "community" and "common" in them.

September the eleventh made both parties aliens to the United States. Unlike the United Kingdom and numerous European nations, the U.S. has no "government" distinct from its politicians. We do have a civil service, but it is weak and without an identity of its own. It is certainly not capable of operating contrary to the political system. There could be no American "Yes, Minister" the way there was an American "House of Cards," because our civil servants have less and less job security and are beholden to political appointees who face the spoils system. Thus, most American talk of "the government" has been faulty since nearly the days of Andrew Jackson. It has certainly been faulty since the 1980s. However, since 2001, we have begun to grow a government, complete with self-protection and an ideology separate from the political, EVEN AS the traditional civil service has been put under more political control than ever before.

This government of 9/11 begins and ends with a fiat: It is the job of government to keep the citizens safe from "evildoers" / "those who wish us harm" (the difference is one of dialect, not language). You heard Bush say that his job was keeping Americans safe countless times. You may have even heard Cheney and others lie and claim that Bush was a good president because there were no terrorist attacks on Americans during his presidency. If you have been listening, you have also heard President Obama define his job this way, and probably as often as his predecessor.

The problem is that, well, keeping citizens safe from harm is simply not one of the duties of the presidency. The president of the United States is the head of the executive branch. He is the chief cop and the chief enforcer of laws. In war, she is the coordinator of the armed forces ("commander in chief"). There is no warrior king, priest king ("decider"), or even "CEO president" function to the job. Armed forces keep us safe from foreign powers, and police keep us safe from those on our soil if they have broken a criminal law. Secret services stop agents of enemy and foreign powers. Quick: what part of the government is the Secret Service a portion of?

We all want to be safe, of course. However, we also all want to be asked how we are kept safe, if we are a democracy. A critical difference between democracy and fascism is that we do not believe that a Great Man (or woman) might, with will or strength, achieve what the people, with consent, do every day. A critical difference between democracy and the Soviet system is that we do not believe that the Party or state leadership can, with critical efficiencies or expert policy, achieve what we do in our stumbling consensual manner.

Never mind my idealism, though. The parties are aliens to the government of the United States because this government dedicated to keeping Americans safe has a new question to ask. It is no longer concerned with the individual's happiness or the group's welfare, as both are irrelevant. Instead, it asks, over and over again, "Who are you? What is the identity of the citizen?"

Define for yourself the goal of N.S.A. and other agencies dedicated to defeating foreign agents in an era when "agent" no longer means what it once did. Whereas once an agent was a person not only acting in the interests of a foreign entity but acting at the behest of that foreign entity, an agent now does not need the alien entity's knowledge, much less involvement. This is because the agent is no longer of a foreign power, but a foreign ideology. Furthermore, that ideology is not named. It isn't "the Communist Party": it's "terror" or "wishing us harm."

Even though we in the United States have no official religion or official ideology, we have a shifting net of enemy ideologies that are largely identified solely by the willingness of anyone propounding them to, coincidentally or consequentially, advocate violence against the U.S. military, U.S. citizens, U.S. territory, or U.S. assets. Think back to 1978 for a moment and remember the anti-nuclear protests held throughout Europe. Those protests were against a weapon that made plain the fact that some of Europe would be a battlefield in a coming war between the U.S. and the U.S.S.R., and the people living on that battlefield were less than pleased. Some of them were infiltrated by Soviet agitators. Some of them were violent. Most of them were neither. Were those protests held today, would the N.S.A. label "Belgian" or "Social Democrat" an enemy ideology? Under today's philosophy, which holds all believers in a religion or religious sect to be "the enemy," it might. This is a consequence of 9/11 and of defining the goal of government as "protecting Americans." Once that becomes the goal, violence is the only qualification for enemy status.

Imagine that you are floating in the ocean. Now, so long as you float, you will be rescued. However, a line is tied to you and attached to everyone you know. For reasons unclear to you, some of these people cannot swim and have weights attached to them, while others are just struggling swimmers like yourself. Even the people you are tied to who are swimming are themselves tied to all the people they know, and some of them are sinking. It is fairly likely, depending upon how many people you know, that you will be dragged down.

As far as the security government is concerned, a person is not a person. A person is a set of associations -- a deferred identity calculated by its connective power. Each association is either dangerous or not. If an association is not dangerous, it carries no weight. If it is dangerous, it weights the person. Furthermore, each attachment's weight is determined by its own attachments. Are you an enemy of America? Well, a friend of yours who has a friend in the Peace Corps who made friends with a group of people in Yemen has sunk you. You do not know this, of course, because you do not know your friend's friends. Your friend, in fact, does not know that his friends are today called "terrorist" by someone. They themselves may not even know that they are "terrorists." Even if they once shouted, "Death to America," they could have repented of the view. It would not matter. It does not matter because the government's role is to "keep Americans safe," not to produce an accurate risk assessment.

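The arithmetic of this deferred identity can be sketched as a toy graph search. Everything below is hypothetical illustration drawn from the paragraph's own example -- the names, the three-hop radius, the single "flagged" group are my assumptions, not a description of any actual agency's system:

```python
from collections import deque

# Hypothetical contact graph: who is tied to whom. The names mirror the
# essay's example ("a friend of yours who has a friend in the Peace Corps
# who made friends with a group of people in Yemen").
contacts = {
    "you": ["friend"],
    "friend": ["peace_corps_friend"],
    "peace_corps_friend": ["group_in_yemen"],
    "group_in_yemen": [],
}

# The one association someone, somewhere, has labeled dangerous.
flagged = {"group_in_yemen"}

def flagged_within_hops(person, graph, flagged, max_hops=3):
    """Return True if any flagged node lies within max_hops of person.

    Breadth-first search: a person's status is computed entirely from
    their network of attachments, never from their own conduct.
    """
    seen = {person}
    frontier = deque([(person, 0)])
    while frontier:
        node, dist = frontier.popleft()
        if node in flagged:
            return True
        if dist == max_hops:
            continue  # stop expanding past the hop limit
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                frontier.append((neighbor, dist + 1))
    return False

print(flagged_within_hops("you", contacts, flagged))  # True: sunk at three hops
```

Note that "you" never appears in the flagged set and has done nothing; the verdict falls out of pure connectivity, which is the point of the metaphor above.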
Once we take the one small step from "provide for the common welfare" to "keep Americans safe," safety trumps all political activity and every operation of the state. The citizens cease to have civic value and become either menace or neutrality, and nothing else. The two political parties therefore become entirely beside the point. Individual liberty and community building are meaningless questions to a government dedicated to detecting threats and sifting its own population into two piles.

I could offer up homespun analogies on the philosophy of safety. I could ask you to compare a nation to a household and to think of the effects of parents who seek to keep their children safe at all costs with not a thought to the children's happiness, prosperity, or education. However, those analogies foster reductive thinking, because nations are not families, or businesses, or enough like anything except themselves to be profitably compared to anything but each other. In fact, nations are capable of a phenomenon that is almost without parallel in any other organism: they can grow alienated from themselves. Nations can, under the worst possible circumstances, begin to operate one way while believing another; they can begin to concentrate power in one spot while announcing it in another. The most famous, and therefore most guarded against, condition of national alienation is bureaucracy. When a civilization is ruled not by the demos (democracy), nor by representatives (res publica/republic), nor by a divine person, a select family, or the wealthy, but instead by bureaus, then an unthinking, vegetative mind governs indifferently to all concerns and makes all political exchanges inefficient.

We are not in a bureaucracy. We have something else. Dana Priest's report on "Top Secret America," in which she began to see just how large the expenditure and secrecy of classified work had grown, certainly testifies to a potential bureaucracy of safety, but, setting aside the inefficiencies of duplication and lack of oversight, we have an alienation in which no one votes for safety as a national priority. Neither chamber of Congress, no election, and no presidential order reorganizes society around safety first. All the same, it is there, and it determines the activities of all other facets of the nation.

One way that we can tell that our nation is alienated is the staggering bathos of the safety measures. When, two weeks after September eleventh, soldiers with submachine guns were stationed in Penn Station in New York City, it did not make travelers feel safer. Coming in from Madison Avenue and having one's eye first fall on a soldier with a slung machine gun did not set a commuter's mind at ease. The harlequin pantomime that has replaced airline boarding -- shoes off, hands up, standing in a booth -- does not give safety, either. Crucially, both "left" and "right" react the same way to these measures. The left rejects the loss of civil liberties, and the right fears "the government" and calls for a right to personal firearms to protect itself. These measures, launched in the name of security, represent no one's political idea and no one's civil goal, and they deliver no one's social good, but there they are. The most conservative president in history and the current president alike have presided over measures that seemingly no one has endorsed.

What began as an urge to reverse every expansion of civil liberties of the 1970s with the USA PATRIOT Act has turned into something else. It has turned into something with the power to generate itself, something that moves by a vegetative mind, with a motive (to make America safe) that is paramilitary and unconstitutional. The government that is arising now, what people call "the security state" (a misnomer, as this is not the state; it is beneath the state and beside the state), sets out an end goal that cannot be achieved without the elimination of free will.