Monday, October 23, 2017

The Hunt for Genius Part 3: Cultural Factors

Here's another reposting from my series on the MacArthur Fellowship Program. This time it's a short one in which I take a quick look at cultural factors. We live in a world that's rapidly changing, but the Big Mac Lottery Board pretty much assumes the future's going to be like the past. I've collected these posts into a working paper, The Genius Chronicles: Going Boldly Where None Have Gone Before?, which you may download at this link: 
https://www.academia.edu/7974651/The_Genius_Chronicles_Going_Boldly_Where_None_Have_Gone_Before

* * * * *

It does look like this topic has taken over my brain, at least a part of it, for a while. Things are condensing and compacting. Processing goes on.

My main point remains: The MacArthur Fellows Program is strongly biased by the social network in which it is situated. The simplest thing it can do to blunt that bias and thereby name more interesting and challenging classes of fellows is to stop giving awards to people on staff at elite institutions. Those institutions are too deeply rooted in the past to serve the future well.

Our World is Culturally Complex and Changing

First, while the search for genius, as conducted by the MacArthur Fellows Program for example, is conceptualized as a search for attributes of individuals, even an essence, that’s not how it works out in fact and in our world. If our world were culturally uniform and static, then, yes, the search for genius might properly be conceptualized as a search for essence. But that’s only because every individual would be surrounded by the same culture and thus the possibilities for a ‘fit’ between individual capabilities and accomplishments would be the same for all.

But that’s not the world we live in. In the first place, it is not culturally uniform. Culture varies by class, geographical region, and by ethnicity and religion. In such a world a search for genius will almost inevitably be biased by the interests and understanding of those conducting the search. In such a world the ONLY way to escape bias is to recognize the problem and take explicit steps to correct for it.

Just what those steps would be, I don’t know. Two obvious suggestions: 1) Make sure the selection process includes people from all sectors of society. 2) Conceive each class of awardees as a sampling of the cultural space.

And in the second place, our world is not culturally static. It’s changing rapidly for various reasons. Globalization is driving workforce changes that affect educational requirements and skillset distribution and needs. Computer technology has changed the media world and is itself making many jobs obsolete while creating new classes of jobs. Ideas in every sphere are changing and new expressive forms are emerging in the arts. And globalization is moving large numbers of people from one country to another.

In that kind of situation, which is the situation we face, it is all but impossible to have an unbiased genius hunt. I argued in my first post, on the MacArthur Fellows Program, that their search is in fact very biased and that that bias is evident in the number of fellows who are on staff at elite educational and research institutions. While those institutions are pumping out some of the ideas and technology that’s driving change, they are themselves conservative. How, then, could these conservative institutions be a force for change?

Long Black Veil

20171022-P1130559



Sunday, October 22, 2017

Let's Have an Adventure! – King Solomon's Mines

Bumping this to the top of the queue because, Why Not? 
* * * * * 
Another reprint from The Valve, this one from May of 2009. I've taken the liberty of appending a comment by Adam Roberts, and my reply to that comment.
Taking a bet from his brother that he couldn’t write a book as good as Treasure Island, H. Rider Haggard wrote and published King Solomon’s Mines in 1885. Such a strange book it was that publisher after publisher rejected it. It became a smash hit. 

As for that strangeness, King Solomon's Mines is of a genre that is strange no more, an adventure into a Lost World.

The story follows three Brits as they trek deep into East Africa in search of the legendary diamond mines of King Solomon. The journey is initiated by Sir Henry Curtis, who is accompanied by Capt. John Good, recently retired from the Royal Navy. Sir Henry is looking for his younger brother, George, who had disappeared while searching for the mines. While on a steamer from Cape Town to Durban, Curtis and Good meet Allan Quatermain, a well-known big-game hunter. Upon learning of their mission, Quatermain reveals that he has an old hand-drawn map to the mines that he’d acquired years ago from a Portuguese “at a place called Sitanda’s Kraal, and a miserable place it was, for one could get nothing to eat there, and there was but little game about.” This man, José Silvestre, told a story about an ancestor’s struggle and misfortune in a search for Solomon’s fabulous wealth. That ancestor made the map that Silvestre gave to Quatermain just before he expired. 

That is the map, or rather a copy of it, that Quatermain used to guide Curtis and Good deep into the African interior. But not before a bargain had been struck. Curtis agreed 1) to an up-front fee of £500, 2) to pay all expedition expenses, and 3) that “any ivory or other valuables” they got would be split between Good and Quatermain. He further agreed 4) to provide for Quatermain’s son, Harry, should Quatermain die or be disabled. Having struck the bargain, Quatermain arranged an expedition, and off they went adventuring deep into the heart of the African continent.

The details of those adventures – an elephant hunt, desert treks, mountain climbing, a big battle, underground scrambling – are remarkable enough. But that’s not what I’m writing about. For one thing, adventures lose all their fun and vigor in summary. More particularly, the adventures are a setting for something else.

What is that something else? Social relations. I want to focus on the relations among the central trio of characters and between those characters and the Kukuana, a fictional people who inhabit the region where Solomon’s mines are located.

Grandparents

20171014-_IGP0825

Saturday, October 21, 2017

Acting white in the Trump era

White people acting white have embraced the ethic of the white underclass, which is distinct from the white working class, which has the distinguishing feature of regular gainful employment. The manners of the white underclass are Trump’s — vulgar, aggressive, boastful, selfish, promiscuous, consumerist. The white working class has a very different ethic. Its members are, in the main, churchgoing, financially prudent, and married, and their manners are formal to the point of icy politeness. You’ll recognize the style if you’ve ever been around it: It’s “Yes, sir” and “No, ma’am,” but it is the formality of soldiers and police officers — correct and polite, but not in the least bit deferential. It is a formality adopted not to acknowledge the superiority of social betters but to assert the equality of the speaker — equal to any person or situation, perfectly republican manners. It is the general social respect rooted in genuine self-respect.  
Its opposite is the sneering, leveling, drag-’em-all-down-into-the-mud anti-“elitism” of contemporary right-wing populism. Self-respect says: “I’m an American citizen, and I can walk into any room, talk to any president, prince, or potentate, because I can rise to any occasion.” Populist anti-elitism says the opposite: “I can be rude enough and denigrating enough to drag anybody down to my level.” Trump’s rhetoric — ridiculous and demeaning schoolyard nicknames, boasting about money, etc. — has always been about reducing. Trump doesn’t have the intellectual capacity to duke it out with even the modest wits at the New York Times, hence it’s “the failing New York Times.” Never mind that the New York Times isn’t actually failing and that any number of Trump-related businesses have failed so thoroughly that they’ve gone into bankruptcy; the truth doesn’t matter to the argument any more than it matters whether the fifth-grade bully actually has an actionable claim on some poor kid’s lunch money. It would never even occur to the low-minded to identify with anybody other than the bully. That’s what all that ridiculous stuff about “winning” was all about in the campaign. It is might-makes-right, i.e., the politics of chimpanzee troupes, prison yards, kindergartens, and other primitive environments. That is where the underclass ethic thrives — and how “smart people” came to be a term of abuse.

Halloween kitty

20171021-P1130528

75% of Americans are afraid of government corruption

For the third year in a row, corruption of government officials has topped the list—only this year it jumped 13 percentage points, from 60.6 percent of Americans identifying themselves as afraid of government corruption in 2016, to a whopping 74.5 percent being afraid of the same in 2017.

“Our previous lists had more to do with disasters and crime, and that naturally lent itself to the type of messaging [about crime] we’re doing,” Bader says. “The list this year is fundamentally different in the sense that it’s showing a great fear of some of the things happening in this presidency.”

Fear of North Korea using weapons came in at number nine on the list, with 44.9 percent marking themselves as being afraid. The survey has been asking about nuclear attacks since it first started; this is the first year North Korea was listed specifically. “It’s very difficult to curb people’s fears about North Korea when frankly, North Korea and how it’s being addressed is very scary,” says Bader.

Another first this year was environmental concerns appearing in the top ten list of fears, of which there were four: pollution of oceans, rivers, and lakes; pollution of drinking water; global warming/climate change; and air pollution. And the survey was conducted before Hurricanes Harvey and Maria and the ongoing California wildfire crisis, with questions sent out from June 28 to July 7. The researchers ascribe the increased environmental fears to media coverage of President Trump’s decision to withdraw from the Paris Climate Agreement and cut funding to the Environmental Protection Agency, as well as coverage of the lead in the tap water in Flint, Michigan.
H/t 3QD.

Ebert Defends Literature on the uncharted seas

Bumping this seven-year-old post to the top of the queue. The issues are still unresolved. Roger Ebert died on April 4, 2013.
* * * * * 

I’ve recently become interested in Roger Ebert. As I’ve indicated earlier, he’s long served me as a reference critic, someone I’d consult on movies that interest me. My current interest extends beyond that.

The nature of my current interest is not entirely clear to me. Oh, sure, Ebert is one of the most prominent intellectuals in America these days, and is readily available on the web. As is Stanley Fish. That Stanley Fish is an intellectual is obvious on the face of it. But Roger Ebert, he’s a film critic, no? Yes, and we don’t normally think of film critics as intellectuals. But there are film critics and there are film critics.

And Roger Ebert is more than a film critic. Perhaps he’s always been more than a film critic. But it’s his writing in his blog that interests me, and that’s what’s prompted me to think of him as an intellectual. Yes, I find it just a bit strange. But I’m going with it. He’s not the type of intellectual Stanley Fish is, but an intellectual he is. And, for what it’s worth, he’s more widely known.

And that’s worth something. Just what, I don’t know. But something, and that something is part of my attraction.



You may have heard that Ebert’s been kicking up a fuss about video games. He doesn’t think that they can ever be art. This little tempest in a teapot led him to Tweet and then blog a simple question: “Which of these would you value more? A great video game. Huckleberry Finn, by Mark Twain.” The answer came back 13,823 to 8,088 in favor of video games.

And so Ebert posted that result to his blog, while also admitting that there was nothing remotely scientific about his procedure. It’s just an informal question, with an answer that didn’t please him. And he launches into a defense and justification of literature without, however, saying anything more against video games. For the moment, that’s done and gone.

Ebert tells us that he first read Huckleberry Finn when he was seven – I believe I was a bit older than that when my father read it to me. He quotes Hemingway’s line about all modern American literature descending from Huck Finn. [He illustrates his post with scans from a Huck Finn comic.] And he quotes his favorite passage from the book: “Read it over a couple of times and then read it aloud to someone you like. It's music. Can you imagine a more evocative description of a thunderstorm?”

Here’s the nub of his concern:
I believe reading good books is the best way we can civilize ourselves even in the absence of all other opportunities. If a child can read, has access to books and the freedom to read them, that child need not be "disadvantaged" for long. What concerns me is that reading competence and experience has been falling steadily in America. Most of the adults I meet are not very "well read." My parents were.
And then:
Beyond a certain point, we take our education into our own hands. We discover what excites us intellectually, and seek it out. The world of books allows us to walk in the shoes of people who lived in other times and other places, who belonged to other races and religions. It allows us to become more humane and open-minded. In exposing us to prose of the highest level, it encourages us to think in a way that isn't merely "better" but is more fanciful, creative, poetic and expressive. It makes us less boring, and less bore-able.
This is all quite traditional. It could have been written fifty or sixty years ago. No doubt it was, by other intellectuals, in other words.

It’s as though the intellectual ferment that ripped through English departments in the 70s and 80s, ferment in which Stanley Fish was a major rabble-rouser, it’s as though that had passed Ebert by. Ebert is writing as a traditional humanist in a world where the academic stewards of humanism have all but abandoned the tradition. The promising new ideas of the 70s and 80s have, indeed, changed the field of discourse. But the way forward is no longer apparent. We’re in a swamp, we have no map, no compass, and so we don’t know where we’re going.

Yet Ebert defends reading in traditional terms as though Fish’s swamp didn’t exist. What’s particularly interesting is that it’s literature that Ebert is defending, not film. Literature is very important to him, but it’s film that he’s put at the center of his intellectual life. I don’t know what terms he’d use to defend film, though one could certainly say that it allows us to walk in the shoes of people who lived in other times and other places, who belonged to other races and religions. It allows us to become more humane and open-minded. In exposing us to prose of the highest level, it encourages us to think in a way that isn't merely "better" but is more fanciful, creative, poetic and expressive. It makes us less boring, and less bore-able.

Such words, once again, ignore Fish’s swamp. But they’re apt. Moreover it seems to me that we’re living in a world where film has the kind of importance that novels had in the 19th century. In fact, film may be less important now than it was 30, 50, 80 years ago. And video games? I’m told the video game market is bigger than the film market. Do video games allow us to walk in the shoes of people who lived in other times . . . ? I don’t know, I don’t play them.

So, in consequence of an argument that video games aren’t and will never be art, Roger Ebert, a film critic, mounts a traditional defense of literature. That’s where we are today. That’s our uncharted sea.

Residential Ornaments

20170924-_IGP0416

Friday, October 20, 2017

Borges redux: Computing Babel – Is that what’s going on with these abstract spaces of high dimensionality? [#DH]

If the eye were not sun-like, the sun’s light it would not see. – Johann Wolfgang von Goethe

Thinking about corpus linguistics, machine learning, neural nets, deep learning and such. One of the thoughts that keeps flitting through my mind goes something like this:
How come, for example, we can create a computer system that crunches through two HUGE parallel piles of texts in two languages and thereby produce a system that can then make passable translations from one of those languages to the other WITHOUT, however, UNDERSTANDING any language whatsoever? Surely the fact that THAT – and similar things – is possible tells us something about something, but what?
As far as I know these systems have arisen through trying things out and seeing what works. The theoretical basis for them is thin. Oh, the math may be robust, but that’s not quite the same.
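
To give the flavor of that, here's a toy sketch of the distributional idea that underlies much of this work – my illustration, not anyone's production system, with a made-up corpus and an arbitrary window size. Each word is represented purely by the company it keeps; nothing in the program touches the world:

import math
from collections import defaultdict

# A toy corpus: the HUGE pile of texts, radically shrunk.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
    "a king rules the kingdom",
    "a queen rules the kingdom",
]

WINDOW = 2  # co-occurrence window; an arbitrary choice

# Each word is represented only by counts of its neighbors --
# pure distributional statistics, no grounding in the world.
vectors = defaultdict(lambda: defaultdict(int))
for line in corpus:
    words = line.split()
    for i, w in enumerate(words):
        lo, hi = max(0, i - WINDOW), min(len(words), i + WINDOW + 1)
        for j in range(lo, hi):
            if j != i:
                vectors[w][words[j]] += 1

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(c * v[k] for k, c in u.items() if k in v)
    nu = math.sqrt(sum(c * c for c in u.values()))
    nv = math.sqrt(sum(c * c for c in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

# "cat" comes out closer to "dog" than to "kingdom", recovered from
# word company alone -- the program "understands" nothing.
print(cosine(vectors["cat"], vectors["dog"]))      # ~0.98
print(cosine(vectors["cat"], vectors["kingdom"]))  # ~0.61

Scale the corpus up by ten orders of magnitude and swap raw counts for learned embeddings and you have the rough shape of the thing: usable structure recovered from text statistics alone.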

Understanding involves the relationship between text and world. That’s what we were trying to do back in the 1970s and into the 1980s, create systems that understood natural language texts. We created systems that had an internal logic relating concepts to one another so one could make inferences, and even construct new assertions. That effort collapsed, and it collapsed because THE WORLD. Yes, combinatorial explosion. Brittleness, that’s a bit closer. Lack of common sense knowledge, still closer, that’s knowledge of the world, lots of it, much of it trivial, but necessary. But these symbolic systems were also merely symbolic, they weren’t coupled to sensory and motor systems – and through them to the world itself.

And now we have these systems that utterly lack an internal logic relating concepts one to the other, and yet they succeed, after a fashion, where we failed (back in the day). How is it that crunching over HUGE PILES of texts is a workable proxy for understanding the world? THAT’s the question. Surely there’s some kind of theorem here.

The thing about each of the texts in those huge piles is that they were created by a mind engaged with the world. That is, each text reflects the interaction of a mind with the world. What the machine seems to be doing, as it crunches over all these texts, is recovering a simulacrum of the mind’s contribution to those texts, and that’s sufficient to get something useful done. Or is it a simulacrum of the world’s contribution to those texts? Does it matter? Can we tell?

THAT’s what I’m wondering about.

I think.

Think of the world as Borges’s fabled library of Babel. Most of the texts – and they are just texts, strings of graphic symbols – in that world are gibberish. Imagine, however, that we have combed through this library and have managed to collect a large pile of meaningful texts. Only an infinitesimal set of texts is meaningful, and we’ve managed to find millions of them. So, we crunch through this pile and, voilà! we can now generate more texts, all of which are almost as intelligible and coherent as the originals, the true texts. And yet our machines don’t understand a thing. They just crunch the texts, dumber than those monkeys seeking Shakespeare with their random typing.

THAT, I think, is what’s going on in deep learning and so forth.

If so, doesn’t that tell us something about the world? Something about the world that makes it intelligible? For not all possible worlds are intelligible.

The world Borges imagines in that story, “The Library of Babel”, is not an intelligible world. Why not? 

Remember, we’re using this story as a metaphor, in this case, we’re using it to think about corpus linguistics, machine learning, and the rest. In this usage each volume in the library represents an encounter between someone’s mind and the world. Most such encounters are ephemeral and forgotten. Only some of them yield intelligible texts. Those are the ones that interest us.


The problem with the library as Borges describes it is that there’s no way of finding the ‘useful’ or ‘interesting’ books in it. They all look alike. That world is, for all practical purposes, unintelligible. You’ve got to read each one of them all the way through in order to determine whether or not it contains anything sensible.

Imagine, however, that each stack had a marking on it indicating whether or not there was a useful book somewhere in the stack. (Of course, someone, some agency, would have to do the marking. That would be part of the revised story.) If a stack has a red dot three centimeters in diameter on its upper right corner, that means the stack contains a useful book.

Few of the stacks, of course, would contain such a mark. You’d have to wander far and wide before you found one. But that’s surely better than having to examine each book, page by page, on each shelf in each stack. Now you only have to examine the books in the marked stacks. That’s an improvement, no? NOW the world becomes intelligible. One can live in it.
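
How big an improvement? A back-of-the-envelope sketch, with every number invented purely for illustration, and ignoring that glancing at a stack corner is far cheaper than reading a book:

# Back-of-the-envelope: how much does marking the stacks help?
# Every number here is invented for illustration.
stacks = 1_000_000       # stacks in the library
books_per_stack = 1_000  # books per stack
marked = 50              # stacks bearing a red dot

# Unmarked library: on average you slog through half of ALL the
# books before stumbling on a useful one.
unmarked_cost = stacks * books_per_stack / 2

# Marked library: wander past stacks until you hit a marked one
# (on average about stacks/(marked+1) of them), then examine only
# the books inside it (on average half of them).
wandering = stacks / (marked + 1)
marked_cost = wandering + books_per_stack / 2

print(f"unmarked: examine ~{unmarked_cost:,.0f} books")
print(f"marked:   pass ~{wandering:,.0f} stacks, examine ~{books_per_stack/2:,.0f} books")

Half a billion book examinations versus some twenty thousand glances and five hundred examinations. That's the difference between an unintelligible world and a merely tedious one.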


* * * * *

As for those humanists who worry about some conflict between “close reading” and “distant reading”, get over it. Neither is a kind of reading, as the term is ordinarily understood. Both usages are doing undisclosed mythological/ideological work. Drop the nonsense and try to think about what’s really going on.

It’s hard, I know. But at this point we really have no choice. We’ve extracted all we can from those myths of reading. Now they’re just returning garbage.

Time to come up out of the cave.

Friday Fotos: The Hudson River at Rhinecliff, NY [#AutumnExpress]

20171014-_IGP0774

20171014-_IGP0778

20171014-_IGP0780

20171014-_IGP0777

20171014-_IGP0814

It's a small world (network) after all, and it arises through adaptive rewiring

Nicholas Jarman, Erik Steur, Chris Trengove, Ivan Y. Tyukin & Cees van Leeuwen, Self-organisation of small-world networks by adaptive rewiring in response to graph diffusion, Scientific Reports 7, Article number: 13158 (2017) doi:10.1038/s41598-017-12589-9
Abstract
Complex networks emerging in natural and human-made systems tend to assume small-world structure. Is there a common mechanism underlying their self-organisation? Our computational simulations show that network diffusion (traffic flow or information transfer) steers network evolution towards emergence of complex network structures. The emergence is effectuated through adaptive rewiring: progressive adaptation of structure to use, creating short-cuts where network diffusion is intensive while annihilating underused connections. With adaptive rewiring as the engine of universal small-worldness, overall diffusion rate tunes the systems’ adaptation, biasing local or global connectivity patterns. Whereas the former leads to modularity, the latter provides a preferential attachment regime. As the latter sets in, the resulting small-world structures undergo a critical shift from modular (decentralised) to centralised ones. At the transition point, network structure is hierarchical, balancing modularity and centrality - a characteristic feature found in, for instance, the human brain.

Introduction
Complex network structures emerge in protein and ecological networks, social networks, the mammalian brain, and the World Wide Web. All these self-organising systems tend to assume small–world network (SWN) structure. SWNs may represent an optimum in that they uniquely combine the advantageous properties of clustering and connectedness that characterise, respectively, regular and random networks. Optimality would explain the ubiquity of SWN structure; it does not inform us, however, whether the processes leading to it have anything in common. Here we will consider whether a single mechanism exists that has SWN structure as a universal outcome of self-organisation.

In the classic Watts and Strogatz algorithm, a SWN is obtained by randomly rewiring a certain proportion of edges of an initially regular network. Thereby the network largely maintains the regular clustering, while the rewiring creates shortcuts that enhance the network’s connectedness. As it shows how these properties are reconciled in a very basic manner, the Watts-Strogatz rewiring algorithm has a justifiable claim to universality. However, the rewiring compromises existing order rather than developing over time and maintaining an adaptive process. Therefore the algorithm is not easily fitted to self-organising systems.

In self-organising systems, we propose, network structure adapts to use - the way pedestrians define walkways in parks. Accordingly, we consider the effect of adaptive rewiring: creating shortcuts where network diffusion (traffic flow or information transfer) is intensive while annihilating underused connections. This study generalises previous work on adaptive rewiring. While these studies have shown that SWN robustly emerge through rewiring according to the ongoing dynamics on the network, the claim to universality has been frustrated by the need to explicitly specify the dynamics. Here we take a more general approach and replace explicit dynamics with an abstract representation of network diffusion. Heat kernels capture network-specific interaction between vertices and as such they are, for the purpose of this article, a generic model of network diffusion.

We study how initially random networks evolve into complex structures in response to adaptive rewiring. Rewiring is performed in adaptation to network diffusion, as represented by the heat kernel. We systematically consider different proportions of adaptive and random rewirings. In contrast with the random rewirings in the Watts-Strogatz algorithm, here, they have the function of perturbing possible equilibrium network states, akin to the Boltzmann machine. In this sense, the perturbed system can be regarded as an open system according to the criteria of thermodynamics.

In adaptive networks, changes to the structure generally occur at a slower rate than the network dynamics. Here, the proportion of these two rates is expressed by what we call the diffusion rate (the elapsed forward time in the network diffusion process before changes in the network structure). Low diffusion rates bias adaptive rewiring to local connectivity structures; high diffusion rates to global structures. In the latter case adaptive rewiring approaches a process of preferential attachment.

We will show that with progressive adaptive rewiring, SWNs always emerge from initially random networks for all nonzero diffusion rates and for almost any proportion of adaptive rewirings. Depending on diffusion rate, modular or centralised SWN structures emerge. Moreover, at the critical point of phase transition, there exists a network structure in which the two opposing properties of modularity and centrality are balanced. This characteristic is observed, for instance, in the human brain. We call such a structure hierarchical. In sum, adaptation to network diffusion represents a universal mechanism for the self–organisation of a family of SWNs, including modular, centralised, and hierarchical ones.
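
For reference, the classic Watts-Strogatz construction that this paper generalizes is easy to play with. Here's a minimal sketch using the networkx library – the textbook baseline, not the authors' adaptive-rewiring procedure, and the parameter values are my own arbitrary choices:

import networkx as nx

n, k = 1000, 10  # nodes; each node starts wired to its k nearest ring neighbors

# Sweep the rewiring probability p. At small p the graph keeps the
# ring lattice's high clustering while a few shortcuts collapse the
# average path length -- the small-world regime.
for p in (0.0, 0.01, 0.1, 1.0):
    G = nx.connected_watts_strogatz_graph(n, k, p, seed=42)
    print(f"p={p}: clustering={nx.average_clustering(G):.3f}, "
          f"mean path length={nx.average_shortest_path_length(G):.2f}")

What the paper adds is a mechanism: instead of imposing random shortcuts by hand, adaptive rewiring lets diffusion on the network decide where the shortcuts go.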

Thursday, October 19, 2017

The elusive face of time

20171014-_IGP0857

20171014-_IGP0855

20171014-_IGP0858

The Hunt for Genius, Part 2: Crackpots, athletes, 4 kinds of judgment, training, and cultural context

I continue reposting my series on the MacArthur Fellowship Program. This time I take up the problem of identifying "genius"-class creativity by running through a variety of examples and ending on a brief discussion of the importance of cultural context. How do we bias our selection process toward the future, not the past? I've collected these posts into a working paper, The Genius Chronicles: Going Boldly Where None Have Gone Before?, which you may download at this link: 
https://www.academia.edu/7974651/The_Genius_Chronicles_Going_Boldly_Where_None_Have_Gone_Before
* * * * *

Part 1 is my post on the misguided MacArthur Fellows Program. And I thought that would be the end of it. I was wrong.

Now that I’ve gotten my brain revved up thinking about “genius”, whatever that is, I’ve got to think a bit more. The foundation is making judgments about people, judgments about the originality of their work, their ability to cross traditional disciplinary and institutional boundaries, their need for support, and their potential for future contributions of an extraordinary kind. The program has been consistently criticized for picking too many fellows who don’t meet those criteria.

In that post I argued there is in fact a simple way to improve those judgments relative to those criteria: don’t give fellowships to people with stable jobs at elite institutions. The purpose of this post is to clarify my reasoning on that point.

I begin by pointing out that it’s possible for one person to be both a genius and a crackpot. Then I have a brief note on the Nobel Prize, where the point is that even giving awards for accomplishment is difficult. In the following two sections I step through athletic and musical performance as a way of outlining different kinds of judgments, which I’ve called objective, complex, incommensurable, and predictive. I return to the MacArthur Fellowship Program in the final section where I once again talk about the importance of cultural context.

Two for One: Genius and Crackpot in a Single Package

First, let’s think about, say, Isaac Newton, a prototypical scientific genius. We remember him for his work in physics (optics, mechanics, and gravity) and mathematics. No one cares about his work in theology and alchemy except historians, yet it meant a great deal to Newton himself. In the last century Albert Einstein was quickly recognized as a genius, mostly for his work on relativity and photons. He spent the last part of his career looking for a unified field theory. For a long time that work was considered to be a waste of time. Now that unified theory has made a comeback in physics, I don’t know whether that work has been re-evaluated or not.

Were these guys working on half a brain when they did that misbegotten work? Were they drunk? I mean, what happened to the supernal abilities that allowed them to make profound and permanent contributions to science?

Nothing happened to those abilities. There’s no reason to think that they weren’t firing on all cylinders when they did that work. The work just doesn’t fit very well with other knowledge of the world. Think of ideas as keys. What do we use keys for? To unlock doors. Some of the keys these geniuses crafted unlocked real doors. Other keys don’t unlock real doors. Whether or not a key unlocks a door is not a matter of how well the key is crafted. The most exquisitely crafted square peg is not going to fit into a round hole.

Well, it turns out that some of the locks these guys had in mind when crafting keys weren’t real. They were figments of their imagination. Just because the lock was imagined by a genius doesn’t mean it is real.

And so forth.

The point is that ability is not enough. That ability has to be fitted to context.

The search for genius, however, is always conceptualized as a search for ability. This is most obviously the case when genius is defined in terms of a score on some standardized test, an IQ test. If you score high enough on the test you’re a genius – as defined by the test. Otherwise, no.

Now, I rather doubt that anyone involved in the MacArthur Fellows program cares about scores on IQ tests. Whatever it is they’re looking for, it can’t be identified by an IQ test. If it could, then running the Fellows Program would be trivially easy. If tests did the trick there’d be no need for the program. The geniuses would be identified by the standard testing programs undertaken in schools. They aren’t.

So, how do you find a genius?

Nobel Prizes: Even Post Facto Judgments are Difficult

What about Nobel Prizes? They, of course, are awarded for accomplishment, not for promise. And so the prize is not conceived of as one given for ability, though we all assume that Nobel Laureates must have extraordinary ability in order to do whatever it is that got them the award.

And yet the fact that these prizes are awarded for accomplishments visible to all doesn’t insulate them from criticism. I’m sure if I were to dig around in what’s written about Nobels I’d find lists of people who got them but shouldn’t have (e.g. Obama or Kissinger for the Peace Prize) and other lists of people who should have gotten them but didn’t. Judging the value of accomplishments such as these is not easy.

So, let’s start by thinking about some kind of ability where the tests are straightforward.

Wednesday, October 18, 2017

What WAS I thinking when I snapped this photo? Not the Beatles and not Abbey Road

20171014-P1130443-2

If you are of a certain age and a certain inclination you can’t help but think of the album cover for Abbey Road, released by the Beatles in 1969. That’s certainly what I thought when I looked at the photo on my computer.

But that’s not what I was thinking when I took the photo. I had come out of New York City’s Penn Station at a bit after 5PM on Saturday, October 14, 2017. I was taking photos to document the day – a train ride my sister had gotten for us in celebration of my upcoming major birthday – and snapping shots in front of the station. Traffic was busy, making photography a bit tricky, everything moving, shots materializing and disappearing just as quickly.

It was shoot or die. I saw those people walking across the street and my mind flashed there’s a photo there, but not in so many words. It was just a realization that I had to point and shoot NOW or lose it. So I took the shot.

And moved on, taking other shots.

But I’m sure that intuitive decision had been primed by that album cover, which I’d seen so many times over the years, though not, say, in the last five or six years, perhaps more.

And, you see, when you shoot quickly, sometimes things don’t quite work out. Sometimes that’s OK.

20171014-P1130445

Can you learn anything worthwhile about a text if you treat it, not as a TEXT, but as a string of marks on pages? [#DH]

The Chronicle of Higher Education just published a drive-by take-down of the digital humanities. It was by one Timothy Brennan, who didn’t know what he was talking about, didn’t know that he didn’t know, and more likely than not, didn’t care.
Timothy Brennan, The Digital-Humanities Bust, The Chronicle of Higher Education, October 15, 2017, http://www.chronicle.com/article/The-Digital-Humanities-Bust/241424
Subsequently there was a relatively brief tweet storm in the DH twittersphere in which one Michael Gavin observed that Brennan seemed genuinely confused about, among other things, “lexical patterns”.

“Lexical patterns”, what are they? The purpose of this post is to explicate my response to Gavin.

The Text is not the (physical) text

While literary critics sometimes use “the text” to refer to a physical book, or to alphanumeric markings on the pages in such a book, they generally have something vaguer and more expansive in mind. Here is a passage from a well-known, I won’t say “text”, article by Roland Barthes [1]:
1. The text must not be understood as a computable object. It would be futile to attempt a material separation of works from texts. In particular, we must not permit ourselves to say: the work is classical, the text is avant-garde; there is no question of establishing a trophy in modernity's name and declaring certain literary productions in and out by reason of their chronological situation: there can be “Text” in a very old work, and many products of contemporary literature are not texts at all. The difference is as follows: the work is a fragment of substance, it occupies a portion of the spaces of books (for example, in a library). The Text is a methodological field. The opposition may recall (though not reproduce term for term) a distinction proposed by Lacan: “reality” is shown [se montre], the “real” is proved [se démontre]; in the same way, the work is seen (in bookstores, in card catalogues, on examination syllabuses), the text is demonstrated, is spoken according to certain rules (or against certain rules); the work is held in the hand, the text is held in language: it exists only when caught up in a discourse (or rather it is Text for the very reason that it knows itself to be so); the Text is not the decomposition of the work, it is the work which is the Text's imaginary tail. Or again: the Text is experienced only in an activity, in a production. It follows that the Text cannot stop (for example, at a library shelf); its constitutive moment is traversal (notably, it can traverse the work, several works).  
And that is just the first of seven propositions in that well known text article, which has attained, shall we say, the status of a classic.

I have no intention of offering extended commentary on this passage. I will note, however, that Barthes obviously knows that there’s an important difference between the physical object and what he’s calling the text. Every critic knows that. We are not dumb, but we do have work to do.

Secondly, perhaps the central concept is in that italicized assertion: “the Text is experienced only in an activity, in a production.”

Finally, I note that that first sentence has also been translated as: “The Text must not be thought of as a defined object” [2]. Not being a reader of French, much less a French speaker, I don’t know which translation is truer to the original. It is quite possible that they are equally true and false at the same time. But “computable object” has more resonance in this particular context.

Now, just to flesh things out a bit, let us consider a more recent passage, one that is more didactic. This is from the introduction Rita Copeland and Frances Ferguson prepared for five essays from the 2012 English Institute devoted to the text [3]:
Yet with the conceptual breadth that has come to characterize notions of text and textuality, literary criticism has found itself at a confluence of disciplines, including linguistics, anthropology, history, politics, and law. Thus, for example, notions of cultural text and social text have placed literary study in productive dialogue with fields in the social sciences. Moreover, text has come to stand for different and often contradictory things: linguistic data for philology; the unfolding “real time” of interaction for sociolinguistics; the problems of copy-text and markup in editorial theory; the objectified written work (“verbal icon”) for New Criticism; in some versions of poststructuralism the horizons of language that overcome the closure of the work; in theater studies the other of performance, ambiguously artifact and event. “Text” has been the subject of venerable traditions of scholarship centered on the establishment and critique of scriptural authority as well as the classical heritage. In the modern world it figures anew in the regulation of intellectual property. Has text become, or was it always, an ideal, immaterial object, a conceptual site for the investigation of knowledge, ownership and propriety, or authority? If so, what then is, or ever was, a “material” text? What institutions, linguistic procedures, commentary forms, and interpretive protocols stabilize text as an object of study? [p. 417]
“Linguistic data” and “copy-text” sound like the physical text itself; the rest of them, not so much.

If literary critics were to confine themselves to discussing the physical text, what would we say? Those engaged in book studies and editorial projects would have more to say than most, but even they would find such rigor to be intolerably confining. The physical signs on the page, or the vibrations in the air, exist and come alive in a vast and complicated network of ... well, just exactly what? Relationships among people to be sure, but also relationships between sights and sounds and ideas and movements and feelings and a whole bunch of stuff mediated by the nervous systems of all those people interacting with one another.

It’s that vast network of people and neuro-mental stuff that we’re trying to understand when we explicate literary and cultural Texts. As we lack really good accounts of all that stuff, literary critics have felt that we had little choice but to adopt this more capacious conception, albeit at the expense of definition and precision. Anyhow, the people trying to figure out those systems, aren’t they scientists? And aren’t we, as humanists, skeptical about science?

And then along came the computer.

Tuesday, October 17, 2017

Three from the window of a moving train

20171014-P1130387

20171014-P1130411

20171014-P1130412

Out of the ground with your hands, my summer in coal @3QD

I’ve done a little editing to a recent post and reposted it at 3 Quarks Daily under the title, slightly changed from the original, My summer job working in coal – or, how I learned about class in America: http://www.3quarksdaily.com/3quarksdaily/2017/10/my-summer-job-working-in-coal-or-how-i-learned-about-class-in-america.html

It would be a bit strong to say that coal pervaded my life growing up, but I was aware of it and thought about it, in one way or another, almost, perhaps, likely, daily – steel too. After all, my father was in the business and took frequent trips to visit coal mines and cleaning plants. I remember waiting for him to come home, staying up late at night the day of his return, and getting the little gifts he’d bring me and my sister from whatever exotic place he’d visited. I remember the hard hats he wore when on site.

And I remember talking with him about his work. I remember him telling me about dead plant matter turning into peat, peat into lignite and lignite into coal. Coal was once living matter.

Coal is elemental. It’s a fuel, a dirty fuel. A dirty fuel that gave us the iron and steel industries. Coal fires gave us the Anthropocene.
Ashes to Dust
Life to Coal
Coal to Ashes
Dust to Life

Monday, October 16, 2017

Stairway to Penn Station, NYC

20171014-P1130379

20171014-P1130378

Another (strenuous) take on what went wrong with literary criticism, John Searle and Geoffrey Hartman edition

Yeah, I know. But it’s important to get this right.

Once again I’m going to review that Geoffrey Hartman statement I find so characteristic of the mid-1970s rearward shift in academic literary criticism, the one about ‘rithmatic and distance. But this time I want to put it in the context of a discussion of the ontological and epistemological senses of objective and subjective that John Searle makes in The Construction of Social Reality, Penguin Books, 1995.

Searle: Ontology and Epistemology

After some preliminary discussion, some of which I’ve appended to this post, Searle concludes (p. 7):
Here, then, are the bare bones of our ontology: We live in a world made up entirely of physical particles in fields of force. Some of these are organized into systems. Some of these systems are living systems and some of these living systems have evolved consciousness. With consciousness comes intentionality, the capacity of the organism to represent objects and states of affairs in the world to itself. Now the question is, how can we account for the existence of social facts within that ontology?
How indeed.

Searle then observes (pp. 7-8):
Much of our world view depends on our concept of objectivity and the contrast between the objective and the subjective. Famously, the distinction is a matter of degree, but it is less often remarked that both “objective” and “subjective” have several different senses. For our present discussion two senses are crucial, an epistemic sense of the objective-subjective distinction and an ontological sense. Epistemically speaking, “objective” and “subjective” are primarily predicates of judgments. We often speak of judgments as being “subjective” when we mean that their truth or falsity cannot be settled “objectively,” because the truth or falsity is not a simple matter of fact but depends on certain attitudes, feelings, and points of view of the makers and the hearers of the judgment. An example of such a judgment would be, “Rembrandt is a better artist than Rubens.” In this sense of “subjective,” we contrast such subjective judgments with objective judgments, such as the judgment “Rembrandt lived in Amsterdam during the year 1632.” For such objective judgments, the facts in the world that make them true or false are independent of anybody’s attitudes or feelings about them. In this epistemic sense we can speak not only of objective judgments but of objective facts. Corresponding to objectively true judgments there are objective facts. It should be obvious from these examples that the contrast between epistemic objectivity and epistemic subjectivity is a matter of degree.

In addition to the epistemic sense of the objective-subjective distinction, there is also a related ontological sense. In the ontological sense, “objective” and “subjective” are predicates of entities and types of entities, and they ascribe modes of existence. In the ontological sense, pains are subjective entities, because their mode of existence depends on being felt by subjects. But mountains, for example, in contrast to pains, are ontologically objective because their mode of existence is independent of any perceiver or any mental state.
Word meanings, in this sense, are ontologically subjective, which I’ve previously argued [1]. And so are the meanings of texts, even texts about objective facts. Hence textual meaning can be subject to endless, and often fruitless, discussion, especially when intersubjective agreement on the meanings of crucial terms is lax.

Continuing directly on from the previous passage, (pp. 8-9):
We can see the distinction between the distinctions clearly if we reflect on the fact that we can make epistemically subjective statements about entities that are ontologically objective, and similarly, we can make epistemically objective statements about entities that are ontologically subjective. For example, the statement “Mt. Everest is more beautiful than Mt. Whitney” is about ontologically objective entities, but makes a subjective judgment about them. On the other hand, the statement “I now have a pain in my lower back” reports an epistemically objective fact in the sense that it is made true by the existence of an actual fact that is not dependent on any stance, attitudes, or opinions of observers. However, the phenomenon itself, the actual pain, has a subjective mode of existence.
I argue, though Searle might disagree, that the meanings of the words in that statement – “I now have a pain in my lower back” – are themselves ontologically subjective, despite the fact that the statement itself, in context, is ABOUT an epistemically objective fact (where that fact is about something ontologically subjective, a pain).

It’s confusing, I know. Alas, it’s going to get worse.

Sunday, October 15, 2017

Latour on the second science "war"

Q: How do you look back at the “science wars”?
A: Nothing that happened during the ’90s deserves the name “war.” It was a dispute, caused by social scientists studying how science is done and being critical of this process. Our analyses triggered a reaction of people with an idealistic and unsustainable view of science who thought they were under attack. Some of the critique was indeed ridiculous, and I was associated with that postmodern relativist stuff, I was put into that crowd by others. I certainly was not antiscience, although I must admit it felt good to put scientists down a little. There was some juvenile enthusiasm in my style.
We’re in a totally different situation now. We are indeed at war. This war is run by a mix of big corporations and some scientists who deny climate change. They have a strong interest in the issue and a large influence on the population.
Q: How did you get involved in this second science war?
A: It happened in 2009 at a cocktail party. A famous climate scientist came up to me and said: “Can you help us? We are being attacked unfairly.” Claude Allègre, a French scientist and former minister of education, was running a very efficient ideological campaign against climate science.
It symbolized a turnaround. People who had never really understood what we as science studies scholars were doing suddenly realized they needed us. They were not equipped, intellectually, politically, and philosophically, to resist the attack of colleagues accusing them of being nothing more than a lobby. 
Q: How do you explain the rise of antiscientific thinking and “alternative facts”? 
A: To have common facts, you need a common reality. This needs to be instituted in church, classes, decent journalism, peer review. … It is not about posttruth, it is about the fact that large groups of people are living in a different world with different realities, where the climate is not changing.
The second science war has at least freed us of the idea that science and technology can be separated from policy. I have always argued that they can't be. Science has never been immune to political bias. On issues with huge policy implications, you cannot produce unbiased data. That does not mean you cannot produce good science, but scientists should explicitly state their interests, their values, and what sort of proof will make them change their mind. 
Q: How should scientists wage this new war? 
A: We will have to regain some of the authority of science. That is the complete opposite from where we started doing science studies. Now, scientists have to win back respect. But the solution is the same: You need to present science as science in action. I agree that’s risky, because we make the uncertainties and controversies explicit.
H/t 3QD.

Emergency Exit

20171014-P1130384

20171014-P1130405

20171014-P1130437

MacArthur Fellowships: Search for creativity or the same old cronyism?

I've been criticizing the MacArthur Fellowships for five years now. It's about time I reposted the original articles in the series. This is the first one, which I'd originally published on October 9, 2013, under the title, "MacArthur Fellowships: Let the Geniuses Free". This looonng post examines the history of the program, looks at three recipients in the first year (1981) – "Skip" Gates, Robert Penn Warren, and Stephen Wolfram –  considers criticisms of the program, and examines the class of 2013, where 15 of 24 fellows have tenure at elite institutions – hence the suspicion, which I share with others, that the Fellows program is yet another case of elitist cronyism. I conclude with a simple suggestion: Don't give any awards to people with tenure at those schools. I stick by that suggestion. I've collected my observations into a working paper, The Genius Chronicles: Going Boldly Where None Have Gone Before?, which you may download at this link: 
https://www.academia.edu/7974651/The_Genius_Chronicles_Going_Boldly_Where_None_Have_Gone_Before
* * * * *

I’ve been following the MacArthur Fellowship program from the beginning. Like many, I believe it's too conservative in its pick of fellows. I long ago decided that the foundation could improve matters by adopting a simple rule: don’t award fellowships to anyone who has stable employment at an elite institution.

My reasoning was simple: if they’ve got an elite job, they can eat and they can work. Depending on the job, they may not have as much time for creative work as they’d like to have. But they’ve got more time than they’d have if they had to wait tables, do temp word-processing, or teach five adjunct courses a term spread across three different schools. They can function creatively.

That puts them ahead of those who are so busy scratching for a living that they cannot function creatively at all.

When I set out to write this post, that’s all I had in mind. I’d reiterate the standard complaint about MacArthur’s programmatic constipation, with appropriate links here and there, and then offer up my one simple suggestion. I figured it for a thousand or maybe fifteen hundred words.

But then things started getting interesting, and more complex. So I’ve had to write a much longer post. I’ve not given up on that simple idea, nor have I augmented it. But I have a richer and more interesting rationale for it. That’s what this post is about.

The Genius Grants

I don’t know when I first heard that the newly formed John D. and Catherine T. MacArthur Foundation would “be looking for gifted but impecunious poets, promising young composers, research scientists in midcareer and other ‘exceptionally talented people’”, as The New York Times put it in 1980, but, like many creative people, I thought to myself: At last, a foundation that’s looking for (people like) me. The article went on to say:
Many foundation programs have sought to assist scholars and artists...but most have required that the would-be fellows already have achieved some public recognition. Unlike most others, the new fellowships will permit the recipients to choose entirely new fields of interest, with no requirement that the fellowship lead to the completion of a project, publication, or even a progress report.
Just what I need, thought I to myself, just what I need. It would allow me to blow this pop stand and get some real work done.

As Roderick MacArthur, son of the foundation’s benefactor, John D. MacArthur, would put it in 1981:
“This program,” Mr. MacArthur said, “is probably the best reflection of the rugged individualism exemplified by my father - the risky betting on individual explorers while everybody else is playing it safe on another track.”

“If only a handful produce something of importance - whether it be a work of art or a major breakthrough in the sciences - it will have been worth the risk.”
My name wasn’t on that list or on any subsequent list.

Nor, I tentatively decided in that first year, was the foundation deeply interested in people like me, people whose work did not fit into conventional categories and thus would be ineligible for conventional foundation largesse. Rather, given the foundation’s actual practice, it is clear that the MacArthur Fellows Program has been funding pretty much the same people funded by every other foundation and government agency.

The major distinguishing characteristic of a MacArthur Fellowship is that you don’t have to do anything to justify the funding; nor, for that matter, can you actually apply for support. The support comes to you, unbidden, and once you start cashing the checks, you are under no obligation to complete a stated project or to submit any reports. This is a good thing, as Martha Stewart would say, but this goodness is of little comfort to those who don’t get a MacArthur Fellowship.

None of these observations are new. They’ve been made ever since the foundation began awarding the fellowships. The problem with these observations is that, assuming the foundation really does want to identify and gift those who “boldly go where no man has gone before”, identifying those people is extraordinarily difficult, if not impossible.

My purpose in this post, then, is not to come up with rules and procedures so the MacArthur Foundation can go about that task the right way. I don’t think there is a right way. The task is impossible.

Rather, I want to do two things. First, I argue that the MacArthur Fellows Program functions to provide the foundation world with a cosmetic device whereby it can pat itself vigorously on the back for going boldly where none have gone before while continuing to fund the same suspects. Second, I argue that the best thing the Foundation could do at this point is simply to stop awarding fellowships to people who have secure employment at elite institutions. That’s a simple, but in view of my larger argument, no longer a simple-minded, suggestion.

Saturday, October 14, 2017

Minimalism on the Hudson River, with ducks

IMGP8177rd

IMGP8176rd

20151128-P1110744

This is your brain on stories

Decoding the Neural Representation of Story Meanings across Languages
Morteza Dehghani, Reihane Boghrati, Kingson Man, Joseph Hoover, Sarah Gimbel, Ashish Vaswani, Jason Zevin, Mary Helen Immordino-Yang, Andrew Gordon, Antonio Damasio, Jonas Kaplan

PsyArXiv Preprint, doi: 10.17605/OSF.IO/QRPP3

Abstract 

Drawing from a common lexicon of semantic units, humans fashion narratives whose meaning transcends that of their individual utterances. However, while brain regions that represent lower-level semantic units, such as words and sentences, have been identified, questions remain about the neural representation of narrative comprehension, which involves inferring cumulative meaning. To address these questions, we exposed English, Mandarin and Farsi native speakers to native language translations of the same stories during fMRI scanning. Using a new technique in natural language processing, we calculated the distributed representations of these stories (capturing the meaning of the stories in high-dimensional semantic space), and demonstrate that using these representations we can identify the specific story a participant was reading from the neural data. Notably, this was possible even when the distributed representations were calculated using stories in a different language than the participant was reading. Relying on over 44 billion classifications, our results reveal that identification relied on a collection of brain regions most prominently located in the default mode network. These results demonstrate that neuro-semantic encoding of narratives happens at levels higher than individual semantic units and that this encoding is systematic across both individuals and languages.

Friday, October 13, 2017

Friday Fotos: On the beach

IMGP4736

IMGP4780

IMGP4793

20141123-_IGP0301

IMGP7032

Why the computational form of literary texts is mere form in the Kantian sense

Actually, it’s not me that’s taking a look at Kant. To be sure, I read some Kant years ago, and I do mean years, more like decades. But I don’t remember it and I’m pretty sure it wasn’t the Critique of Judgment, which is the text in play here. It was put in play in a recent article:
Robert Lehman, Formalism, Mere Form, and Judgment, New Literary History, Vol. 48, No. 2, Spring 2017, pp. 245-263.
I’m thus in the precarious position of having to rely on Lehman’s presentation of Kant. But then, isn’t that what intellectual life is like, working in the community of scholars, always depending on the kindness of strangers?

What is formalism, what is literature?
Before looking at Kant via Lehman, however, let me give you the rough and ready on two matters: 1) what I mean by formalism, and 2) how I understand what literature is.

My basic approach to the second is a crude “I know it when I see it.” Of course, I’ve learned from others, starting with my parents. They taught me that, for example, Moby Dick is literature, The Voyage of the Beagle is not, and so forth. Now that I think about it, I’m not at all sure that I’ve ever had to determine, for myself, whether a given text was literature or not. Good vs. mediocre vs. downright bad literature, yes. But literature vs. something else, I don’t think so.

But let’s assume that there may well come a time when I would have to make such a judgment. It might be an easy judgment to make, or it might not. Where the judgment is easy for me, I suspect it will be easy for others. Where it is difficult, there as well. But in that case, we might arrive at different judgments. In consequence we could enter into a discussion about the matter and give our reasons. Perhaps we’d reach agreement, perhaps not. If not, what of it?

Well, one might throw up one’s hands and say, but then, but then, isn’t the distinction between literature and non-literature pointless? No, difficult and fuzzy, yes; pointless, no. There are color patches that are obviously blue and other ones that are obviously green. That doesn’t mean that difficult cases, cases we decide, perhaps, by tossing a coin, force us to abandon any notion that blue and green are different colors. This or that virtuoso theorist may care to gum up the whole works by invoking a difficult case, but so what? That’s posturing, not thinking.

As for formalism, all I mean is that I’m interested in analyzing and describing the formal properties of literary texts. My big beef with existing literary criticism is that, for the most part, that project seems peripheral to the enterprise, despite the fact that form is a central concept of the discipline and formalism a well-recognized critical stance, or family of stances. I find it odd that these formalists, for whom the general fact of form is so very important, show so little interest in specific instances that they cannot be bothered to analyze texts for their formal features.

Kant on phenomenal vs. mere form

But I think Lehman’s article can help us sort this out.

He opens by observing (p. 245):
... the ascendancy of the old formalisms—of the Yale School (minus the antiformalist Harold Bloom), or of the New Critics (expanded to include René Wellek and Austin Warren), or even of Aristotle—tended to coincide with an increased attention to or anxiety around the question of literature as such, the rise of the new formalism has not.
But these new formalists are no more interested in describing formal features than those old formalists were. What differentiates these new formalists from the old, it seems, is that “the new formalism has done nothing to answer the question: what is literature? As far as I can tell, it has not even tried” (p. 246). OK, I’m with them on that.

Here’s what Lehman thinks formalism is, both old and new, (p. 246):
At its most basic, I mean an approach to art objects—literature, film, painting, and so on—grounded in an attention to these objects’ spatiotemporal qualities, their phenomenal qualities, which might allow for the transmission of a content or a meaning but that are not themselves intrinsically meaningful. As a critical practice, then, formalism would prescribe consideration of meter, line, composition, rhythm, movement, shape: all those characteristics that are supposed to make an art object what it is. Now, I hope that this definition is broad enough to be relatively uncontroversial. I intend it to be prior both to Levinson’s distinction between “activist” and “normative” formalism—that is, between approaches that affirm and approaches that deny the compatibility of formalism and historicism—and to the question of what model of form ought to be adopted—static or dynamic, molar or molecular. And it does not depend on any especially rigid division of form from content, a division that certain varieties of formalism pride themselves on their having moved beyond.
Am I a formalist in THAT sense? Let’s be careful here.

Jerry Seinfeld: It's 98% in how you deliver the joke

Here's Jerry Seinfeld talking with George Stephanopoulos (who, you may recall, had been White House Communications Director under Bill Clinton) about comedy. In his new Netflix special Seinfeld returns to the oldest jokes in his repertoire. Starting at about 2:28 he talks about how he had to relearn ALL of the old bits, which he illustrates with one about cotton balls.



Stephanopoulos: It's not just what's on paper?
Seinfeld: No, no. That's...two percent of it. 98 percent is the way you do it.
There's a shot in the special of all the yellow pads on which Seinfeld crafted his jokes; it appears in this little clip.

Thursday, October 12, 2017

What I got from Blade Runner

reality dispersion vector zed 574.009 epsilon 38

super secret government laboratory in a place that nobody can ever know about

Why the New Intellectuals Don't Cut It

Back in the mid-1990s I met one Cuda Brown (not his real name) online. We hit it off and went on to establish one of the first black zones on the internet. We started out as Meanderings, a newsletter that Cuda had been circulating privately among his friends, but in time that morphed into Gravity, which then died because we didn't have the means to sustain the effort. But it was fun for the year or so that it lasted. We had one of the first discussion forums on the web, something that Cuda coded up in a DB program – Informix? – and we did a collaboration with Vibe Magazine on the trial of O. J. Simpson.

Cuda wrote a number of pieces, including this stunner on his early days as a black nationalist, and I wrote some, including this one from March 1995, which I'm reprinting here on New Savanna. It's a bit old, but made a point or two. Black intellectuals now have a lot more to say about music than they did back then, though not, I'm sure, through any influence by this piece. It's just the logical thing to do. As for the psychology I discuss, still a deafening silence. You might want to look at the list I give, which I took from Robert Boynton's piece in The Atlantic Monthly, and see how they've fared over the last two decades. BTW, you might notice that the domain name for those various pages is "newsavanna". Where do you think I got the name for my blog?

* * * * *
Clickety clack . . . clickety clack
Bring that man's baby back.
Clickety clack . . . clickety clack! . .
I want my spirit back.
CLICKETY CLACK!
Bubble music being seen and heard on Saturday night
Blinding the eyes of ones that's supposed to see!
Bubble music, being played and showed, throughout America.
Clickety clack . . . clickety clack . .
Somebody's mind's got off the goddamn track!
Clickety clack . . . clickety clack . . .
Won't somebody bring the Spirit back?
. . . .
Who will it be? Who will it be?
It certainly won't be someone that says that they're free.
– Rahsaan Roland Kirk
The Atlantic Monthly for March [1995] features an important article by Robert S. Boynton about "The New Intellectuals," by which he means a group of thinkers who are both public intellectuals and black intellectuals. They are public in the sense that they often address themselves to a general educated audience rather than speaking exclusively to an audience of academic specialists. They are black in two senses. In the first place they have enough so-called black blood in their veins that they would be classified as black by census-takers. In the second they are variously concerned with what it means to be black and American, or American and black, or, increasingly, just plain American.

Toward the end of the article, Boynton asserts that "If today's black intellectuals have not yet–with the exception of Toni Morrison's extraordinary novels–produced a body of work that will sustain itself through the Darwinian selection process of American culture, there is no reason to believe that they won't. They are relatively young, and a number seem to be just hitting their stride." The purpose of this essay is to suggest that if these 40-something intellectuals (plus or minus a decade) don't soon get some funkadelic glide in their stride, some jivometric pep in their step, there is little chance that they will produce a deep and abiding body of work, though one can always hope that they will produce an intellectual climate in which others may come along and walk where they fear to tread.

As a group, their collective work has two central weaknesses in my view:
  1. However much they may admire black music, they don't make it central to their thought and writing.
  2. Whatever they may know about the psychodynamics of racism, they are unwilling to talk and write about it.
The joint effect of their blindness is that they cannot address themselves to the deepest dynamics of American culture. They weave elegant designs around the edges, but the warp and woof are invisible to them.

Caveat Emptor – “Don't let a fox stand guard over the chickens”

Before I begin, I should make a disclaimer or two. First, the weaknesses I've indicated don't apply uniformly to all in Boynton's anointed group, which is a large and diverse bunch of folks including, in no particular order:
Toni Morrison, Cornel West, Henry Louis Gates, Jr., Orlando Patterson, Shelby Steele, David Levering Lewis, Stanley Crouch, Patricia Williams, William Julius Wilson, bell hooks, Houston Baker, Randall Kennedy, Michael Eric Dyson, Gerald Early, Jerry Watts, Robert Gooding-Williams, Nell Painter, Thomas Sowell, Ellis Cose, Juan Williams, Lani Guinier, Glenn Loury, Michelle Wallace, Manning Marable, Adolph Reed, June Jordan, Walter Williams, and Derrick Bell.
Stanley Crouch and Gerald Early have, for example, written extensively about black music. Houston Baker has written a book about hip hop, though he's more concerned with the lyrics than with rhyme, rhythm and artistic technique. Cornel West has a chapter in Race Matters about sex and race, which is at the heart of racist psychodynamics. For the most part, however, the music is more admired than analyzed and understood and the subject of psychodynamics is left untouched and, therefore, unscathed.

The other qualification is personal and negative. I haven't read all of those folks, so I may well be sticking a foot or two in my mouth. Just so you know, I have read at least something, and generally more, by the following: Toni Morrison, Orlando Patterson, Stanley Crouch, Cornel West, Henry Louis Gates, Jr., bell hooks, Houston Baker, Gerald Early, Thomas Sowell, and Juan Williams. Beyond this, I know something about the work of many of those whom I haven't read. In particular, I know their work doesn't address the issues I've mentioned above.

While critiquing individuals for what they don't do is a doubtful enterprise, and one I will nonetheless undertake, my real criticism is of the group. Boynton has written about and invested hope in them as a group. My criticism is directed at deficiencies in the intellectual program one can expect of this aggregation. To the extent they can control and influence discourse about America, we are in trouble. That trouble is not so deep as that presented by, say, the religious right, but it is a trouble progressive folk would be better off without.

Finally, regardless of what may seem to be a rather nasty critique, I should say that reading these folks has given me much pleasure and more than a little insight. Thus my criticism is in the spirit of the "loyal opposition." They have much to teach us. But, they also have much to learn about themselves and about America. It's about time they cut the cord and get on with it.

Music and The New Intellectuals

Let's begin by looking at just why these folks don't write very much about the music so many of them clearly love and draw on for spiritual strength. The reason is simple. They are intellectuals functioning in a tradition which has been and still is deeply suspicious of music (and any other expressive form, though literature has received partial dispensation since it consists of words artfully arrayed). Hence that tradition doesn't demand that you have any significant understanding of music in order to sport the credentials of an intellectual or that you take such understanding into the public arena. Plato condemned music 2400 years ago and the curse has stuck. The sin of the father has been dogging the sons and daughters ever since.

Thus, if you go into the stacks of any major research library you'll find many more pages about Shakespeare than Beethoven, Balzac than Mozart, Dante than Bach, or Goethe than Brahms. Clearly, music is not held in so high a regard as literature. No doubt this is in part attributable to the fact that writing about music seems more difficult than writing about literature. To go much beyond impressionistic evocation of feelings and styles, you must learn something of music theory so you can discuss technique and structure in musical terms. When it is well done, as in Charles Rosen's superb The Classical Style, the result is as deep and illuminating as any work of literary analysis. But, on the whole, the intellectual community clearly does not believe the end is worth the trouble of actually learning how to think about music.