“They tell us that on the last day the sea will give up its dead; and I suppose that on the same occasion long strings of extraordinary things will come out of my pockets” (G.K. Chesterton, “What I Found in my Pocket”).
This is what I found in my pockets this week: keys, a pen, a wallet (mine), my cell phone, two bark chips, several wrung-out flower petals (of unknown origin), a blue plastic bead, a ladybug hairclip (not mine), a Lego hand, and two broken pieces of white sea glass.
Who thinks about pockets these days? We take these poor, repressed, underpantsed things for granted.
When I stop to think about it, my pockets have always served me well. Without them, I would not be literally going places; without them I would not get home. I am convinced that on innumerable occasions they have saved my brow from the sweat of remembering, just as they have saved my mouth from admitting my poverty. I actually trust in my pockets. (In the case of money, of course, I trust my wallet as well. It is a double-coverage kind of faith.) After all, a pair of pockets is a steadfast second set of hands.
But over the past several years, my pockets have begun to surprise me. In them I have noticed a new dual-importance growing. The first is vocational; the second is technological; both, you might say, are trans-personal. My pockets are connecting me to people.
The first is this: my work has involved my pockets in carrying more than my own personal effects. In teaching preschool- and kindergarten-age children, my pockets have become a veritable treasury of trinkets. Many’s the time I get an importunate hand pressing on my arm, with two round eyes leaning ever closer to me, and a little voice urging, “Hold this for me . . . Hold-this-for-me-please.” I only accept these minuscule burdens if the child doesn’t have any pockets of their own—if they are wearing a dress or sweatpants, or have those shallow, sewn-in flaps that hardly count (so why bother, Gymboree?)—or if the gewgaw in question would be safer with me—especially if it turns out to be Johnny’s mommy’s credit card, or Jane’s dad’s Masonic ring. But most of the time, when I take these tiny things upon myself, it is for the good grown-up reason that they are not appropriate in the activity or at school at all.
But lately—in truth, over the past decade or so—I have also noticed a presence in my pockets even more importunate—and, if it can be believed, even more imperious—than the nimbuses flaring from the children’s treasures. It has been burgeoning to a great concern, beyond an ingrained habit or a necessary evil, to the point of actual bodily care. It is my cell phone.
It rests in my pocket, but it emanates such an aura of utter relevance and necessity that I forget my very personal attachment to my pants. If I were to plunge into a lake, my heart would cry out for my phone, not my clothes or my life. Certainly not my hair. Were I to fall from a bridge, I am certain I would use the duration of the plummet to check if my phone was indeed woefully trapped in my pocket. I cannot completely disavow that the thought would cross my mind of flinging it to the safety of dry land.
(It’s not just me, of course. Whenever someone drops their phone the room gasps and flinches. The world watches in silent hopes of survival. If we happen to misplace our device, we sense the phantom phone inside our pocket or purse, despite what our fingers say. Sometimes it seems to me that the seat of my pocket, that place on my leg where a pocket has always resided, has now grown a mass in the shape of a phone. A sensory tumor.)
But it would be false to talk about my cell phone as a presence in my pocket without discussing what it really represents: people. Or, people of a certain sort. Indeed, I act as if I hold many people in my pocket. Some of them I communicate with directly, over literally long distances, as if they were virtually in the room; most of them I merely watch and scrutinize, or read as literal proofs of human errancy. To my mind they are manifestations of our “current situation,” which is always changing and ever increasingly urgent, sometimes doomsdayish. Whenever I have a free moment—even if it’s only an idle minute better spent staring into some suggestive texture on the wall—I pull these people out and try to “keep up.” Most of the time, when I am working or with real enfleshed people, I feel my cell phone’s clutches on me—a kind of vague and unwitting obsession with a thing that is merely sitting inside my pocket, yet a screen that is always promising to show me sign-people and threaded wonders.
* * *
I am glad to spend my days with smaller pockets. I am lucky that my day’s work is early childhood education. I am blessed to be handling trinkets—and to see them properly handled.
Too often I am asking smaller hands to surrender what’s inside these smaller pockets—granted, usually only for a short time while I survey their contents. Most often I am asking these self-same smaller hands to pocket whatever little knickknacks they’ve been holding up to the air like a pearl of great price.
Much too often I view these things as nuisances alone. Much too often I view them as “things.” “Trinkets,” as I have been calling them this whole time. In the world of transitions and projects, of learning—that preparation for global citizenship, which is a life of keeping up with or, if we’re lucky, of outrunning every other first-world country at present and to come—of seizing every advantage, of capitalizing on every opportunity—that deeper-down race of no goal other than “evading the specter of entropy”; to a teacher in this twenty-first-century work-a-moment world, pockets are a real problem. That is, they are a problem because they’re real. They hold something actually. The child pinches inside this narrow flap and raises up a bit of matter, and all developing eyes turn to gaze on its presence. Because it has been given human importance—being literally exalted—it has therefore the most presence. For anyone with a schedule, these little things become big blocks to evidential productivity—in the teaching world we call it “learning.” To my adult mind, they are as simple and stupidly stubborn as rocks.
Sometimes, they are actual rocks. My five- and six-year-old students are collectors of “crystals”—quartz, if they’re lucky, though they are not discriminating against opacity—and necromancers of clods—“This was once a dragon egg—but it got burned. That’s why it’s falling apart.” And yet, despite this over-specialization, they remain Renaissance-level humanists. They pick up where old Adam left off by being namers of stones. They even give these deposits of earth the gift of their speech—often hushed below whisper, just a notch above breath. They write for their clods the greatest parts ever played in a five-year-old globe. These are the “things” I sometimes find myself contending with: modicums of concreteness endowed with humanity.
And God forbid a child find any fluff. All that tossing, lifting, drifting off joy will swiftly smite my instruction right out of the air.
I once confiscated a scab of wood the size of a dime because “Michael-Angela” was rollicking too loudly during Quiet Time.
Now, I work in a progressive educational environment, meaning lots of things, quite a lot of them commitments/ideals/aspirations that we grown-ups try our best to work toward realizing; and I know that compared to many other schools, even the majority, we give ample time for actualizing those buzz-words “exploration” and “experiential learning.” We are, I think, pretty damn good at being age-appropriate. And we all know by heart the maxim “process over product.” But there is still an invisible adult hand putting pressure on everything we do—we cannot let the moment bleed out into nothingness. We worry about lost time, missed opportunities. Most of all, perhaps, we worry about finding ourselves with little verifiable evidence of “learning,” of having our teacherly hands empty at the end of the day, week, month, year. And in the midst of it all, the child raises up her hand, holding an almost invisible speck: she is doing a new thing—do you not perceive it?
That bit of matter, that morsel of who-knows-what. To a pair of fresh eyes, the object has ostensible weight and shape in the palm—but the moment it turns in the hand it becomes a subject in its own right, and it is as far as the very next second that the fingers could dandle it into resemblance of something alive and new. This is academically a likeness and image of what we grown-ups have termed “novel” or “emergent.” This is actually the immanent extravagance of little humans. No wonder children look: they are watching. No wonder they can’t hear, when they are actually seeing something they wish to see. Our “here” is their “there”; and their “here” is our “nowhere.”
Of course, part of the purpose of education is to show children what they would otherwise not see. A driving, sometimes pressing intention in this enterprise of educating is actually care: without admonition our young people would find themselves in harm’s way. This kind of instruction can be as primal as not touching the fire. For me, this means making sure my students know something as fundamental to their safety as how to walk while holding scissors—they must never, ever spread their “wings” and “fly”—or something as basic to communication as knowing how to write letters—a “K” has only so many legs before it kicks into confusion—or something as essential to community as knowing how to listen when someone is speaking—it’s amazing how much you can hear when your eyes are looking.
But as a twenty-first century global community, we have to teach our young people many other things about being our particular kind of human being. The lessons we have learned as a civilization, our “accumulated knowledge” about the world and ourselves, have progressed (or hastened) far beyond mere literacy—what John Dewey, at the turn of the last century, deemed an “artificial” form of learning. We are, I think, in a really strange position in educational history: in addition to teaching reading that progresses from the efferent (reading for information) to the aesthetic (reading for experience)—forms of reading we have all learned to master by high school or college—we now have to teach our students how to be hyper-efferent—while still learning how to be hyper-efferent ourselves. The information we now find necessary to transmit about our society has become multiplied in amount and, in many instances, abbreviated in form. The habits we require to receive this transmission are increasingly “virtual” ones. Any social interaction of the future will have to deal, at least to some degree, with the fleeting, the flagrant, and the reactionary. The effects of this new means of learning and living are still largely untested. But we are, to a certainty, preparing children to be some sort of digital citizens.
Even as an early childhood educator I know this to be inevitable. I remember the first time I saw a two-year-old child manipulate an iPhone: he handled it with such mechanical rapidity, such exacting haste, that his little fingers seemed to be possessed by the device. I sat watching him in a kind of wondering shock. The technology had taught him how to use it so very well, so very quickly; but had he learned yet how to turn a block into a building? He had learned how to demand with a touch of the flat icons the way to get the stimulation that could seize him, but could he make pieces into a whole? He knew just the right code to receive sight and sound; but could he give to something inert his own movement and language? Bereft of this little rectangle of metal and glass, could he apprehend his own meanings? These are not “little” questions.
Now, the question of appropriateness poses no problem for me. Current research continues to show digital media of any kind to be definitively harmful to early human development. But even as a preschool and kindergarten teacher, I have found my colleagues and myself surprisingly beset by outside, societal pressures of relevance and verifiability—and I believe these pressures to be closely connected to the very technology I find inappropriate for my students.
At least from my experience of witnessing new trends in education, I find that there is a fear of decline and insignificance (practically synonymous terms in the professional discourse) for our national “community” (if I can call it that) that not only stems from but also even resembles the constantly developing technology itself. America and its children must be at the forefront of the technological race—wherever that is, whatever that will look like. (I am often surprised that so few stop to ask just what exactly that edge is supposed to lead to, and whether or not we actually want to be there at all.) In the same way, the classroom must always be updating. We want to ensure that our children are learning whatever will be useful to their future work—or, in our more faint-hearted moments, only that which promises to lead to future work, success, a position of value in the hastening electric haze. Though it is never clearly stated, it is assumed it will have something to do with the “high-priority” left-brained disciplines: I have heard so many parents fret inordinately more over math-consumption than any other aspect of their child’s progress; I remember one parent even asked me whether we “used technology” in our 2/3s class. Most recently, STEM is quickly becoming the educational tetragrammaton.
Worst of all to me is the sometime feeling that, whether we intended it to or not, our teaching (and therefore, to some degree, our students’ learning) has begun to conform to the life of digital literacy: through documentation and photographs, and numerous finished projects, through slideshows and videos and blog entries, we prioritize display of multivalent content over time spent with an individual activity. We urge our young children on—often to the point of inciting frustration or meltdowns—to complete all of their various products before the big event, so that we can show their grown-ups the sprawling web of ways they’ve learned to move on to the next assignment. Now, I admire the richness of the curriculum I have inherited at my school, and I know we spend a concerted amount of time and effort focusing on the kids’ own strengths and interests within our directed parameters, but I still worry that they are learning to move through information without really dwelling in any subject—to navigate without knowing where they are or where they want to be. I am worried they’re learning to search away from their native ability to see.
All this time I have had an inner interlocutor questioning my questions and critiquing my critiques. I know that I have been unfair to my school and my colleagues and the enormous and highly nuanced issues of technology and education by virtue of my brevity. I also know that I am in danger of sounding whimsical or idealistic, as if I were merely romanticizing my students without accounting for all of the practical realities surrounding them. I know that not everything can be imagination and wonder. And yet the fact that whatever a child makes him- or herself believe with his or her hand should be relegated to notions of “imagination” and “wonder”—the presumption that any “useless,” product-less moment could or should be dismissed as romantic—amounts to a huge misunderstanding of childhood, and thus of the people we once were, and thus of the people we still at least latently are. This is not a novel statement in the least. It is one of the major historic claims of progressive education: that play is actually the native work of beginning humans. It was in watching a boy building with blocks that Caroline Pratt, the founder of New York’s historic City and Country School (originally called the Play School in 1913), first had this revelation: “In his play [he] was thinking, learning, setting down his understanding of the way things worked, the relationship of facts to each other . . . educating himself.” Through play the growing human cultivates reality: from the tiniest seed of material springs up new animate life, organic connections, a constantly developing world. If we “grown-ups” actually help to raise the play properly—if we give enough space and draw the right boundaries around it; if we protect it at times from our own tramplings through—we will eventually see the “make-believe” blossom into something that every eye will come to benefit from: a self-fulfilling human being.
This is what I believe, at least—this is my confession: what makes up the major portion of a child’s consciousness I too often reduce to a tiny playhouse corner of my understanding. I can see this in my language: I wrap their rapid self-expansion up with the cozy term “little.” I know that even the textbooks tell me there is nothing little to cognitive development. This novel correspondence with the world is the mind’s own reality-rearing—literally the brain’s constant forming and reshaping in itself of the many likenesses of the world it receives and pursues through the senses. We used to call this “mind” by many other names—Reason, Spirit, Soul—and often felt the need to capitalize it, it was so much like God.
And yet I pocket it away in my mind. I know it is there, and I’m happy to keep it for later—should it someday prove useful.
* * *
Experience and reflection. Being, acting, thinking. Re-thinking, re-acting, and being with the consequences. Existence—which is the shorter way of saying “human existence.” The “self.” We have many words for this “thing” we call living. Currently, we often seem to drop these capacious, necessarily hard-to-define words for the decisiveness of verbal nouns—identities defined totally by the actions they perform, such as “worker” or “learner.” These terms seem to me to come already clothed for a life full of “work” according to dominant notions of what makes a life meaningful, or in some sense valuable. (After all, it is the already determined work—whatever the society and those in positions of power have decided is most crucial and beneficial—that defines just what kind of worker the worker will be. The same goes with learners and the kind of learning they will do.) It is perhaps instructive to note Emerson’s problem with calling a person a “thinker”—even this was not generous enough.
Now, certainly all words have come down to us packed—which is to say all words are conditioned, contingent, to some degree—but if our literature is any indication, we have never been fully satisfied with describing ourselves and just what we can do. “I dwell in Possibility – / A fairer House than Prose – / More numerous of Windows – / Superior – for Doors –”
My worry, when I look at my phone, is that we are finding fewer words to describe ourselves—who we are or who we could be. My worry is that we are becoming more satisfied.
“It is a ridiculous demand which . . . America make[s], that you shall speak so that they can understand you. Neither men nor toad-stools grow so. As if that were important, and there were not enough to understand you without them. As if Nature could support but one order of understandings . . . I fear chiefly lest my expression may not be extra-vagant enough, may not wander far enough beyond the narrow limits of my daily experience.”
We are, in my perception, often rather smug in ourselves. (The word smug, by the way, comes from the Germanic smuk, meaning “to adorn” or “dress” someone or something; it is related, in fact, to the word smock, which one might describe as a wearable human-sized pocket.) We are smug, cynical, self-pocketing. We expect nothing better than the small and we act small-ly on those expectations. We are a society of labelers. We believe implicitly in epithets—our tags for ourselves promise speedy connections. We do not go beyond the confines of our terms, thus proving these terms to be sad equivalents for ourselves. (I fear they are becoming dismally optimal.) Granted, this has always been a struggle and shortcoming of ours, at least since the bullish presence of “the Press” barged into our national life. But nowadays we have our own short-tempered presses in the palms of our hands (needless to say you are reading one now). We are often required to engage in shorter type if we want to get the gist of what is out there. But shorter type means shorter words, shorter sentences, shorter paragraphs—shorter thoughts. And, truth be told, even pages cannot catalogue our thoughtlives; only books, and books altogether—libraries—have come close to being adequate symbols for some factor of humanity.
Books and symbols. For both books and symbols are speedy lines toward many possible likenesses. This is what a child learns when they learn their letters. These symbols are those oldest “ductile anchors” between people over space and time. But even they prove tenuously web-like, indeed artificial, when compared to that reality which shapes and handles them, that house of cognitive and actionable potential we take so much for granted that it becomes a “piece of meat,” a mere clump of matter.
Possibility is superior for doors. What do we dwell in? What do I dwell in, when I look at my phone when it’s not even there, when I think of the brittle screened people I’d turn into sand if only I could, and forget the small scree of little human things I’ve collected until the end of the day, when I hopefully pull them out and remember, and think hard to recollect, what that growing person told me to see in the broken sea glass: “You need to touch the light with it . . . Put the sun inside it. See?”
 They’ve also at times thwarted my fingers’ search for meaning, that most primitive purpose and demanded human destiny of finding something graspable. And yet each pocket is only ever a limited abyss: in the deeps the meaning is there, somewhere at bottom, for my pockets shall not hide my keys from me forever.
 Indeed, whenever I happen to see a character in a movie fall or sink or dive into a body of water, my mind immediately races to—the phone. Or rather, it is there, immediately, in my mind. Faster than a fractured bone. And so potent is the realization of harm that my leg promptly gives off a sensation. Not just when someone falls into water, but also whenever any kind of damage occurs to someone’s phone, real or fictional, I suffer cellular sympathy pains. (I myself feel we may have on our hands a new trope in movies—which may be well-documented and thoroughly discoursed-around-with by now—ranging from the pathetic to the tragic to the horrific, in which a character witnesses the gruesome death of their only begotten phone. To me it can be as awful as a murder, far worse than any time Optimus Prime has gotten turned off.)
 Marilynne Robinson, The Givenness of Things, p. 3.
 Much of this, I think, is due to the over-scienced nature of contemporary progressive education. This is a topic I would love to learn more about—and am currently too little read in to say much meaningfully—but right now it seems to me that we are so concerned with the material verifiability of our success that we veer toward a kind of pedagogical positivism—that is, we define success as anything materially verifiable. Needless to say, I feel—and I know—there are other ways of verifying; we just have to stop and watch and remember.
 As Henry David Thoreau recognized, the original meaning of this word means to go beyond—extra being the beyond, and vagant being the going. (More on this below.)
 John Dewey, Democracy and Education, pp. 8-9.
 For the foundational study of this spectrum of reading, see Louise Rosenblatt, Literature as Exploration.
 Although the AAP has ever-so-slightly loosened their previous (and total) prohibition of screen-time for children over 18 months, they restrict such entertainment to only short, infrequent “high quality” programming (Sesame Street is one example) and stress the utmost priority of face-to-face interactions: http://www.bbc.com/news/technology-37751433; see also: http://www.pbs.org/newshour/rundown/toddlers-screen-time-linked-slower-speech-development-study-finds/.
 From Caroline Pratt’s biography/history of City and Country, I Learn from Children, p. 29. Not so coincidentally, Pratt is commonly credited with having designed and advanced wooden blocks (called “unit blocks”) to become the standard tool for early childhood education that we all take for granted (for more see Play and Playground Encyclopedia’s entry on Pratt).
 One need only read almost any literature (and certainly any philosophy) from Europe and America in the eighteenth, nineteenth, and early twentieth centuries. For thinkers like Kant, Coleridge, Kierkegaard, Emerson, Thoreau, Melville, Dickinson, Parker, and, in his own way, James, this Self was the central location and source of all meaning-making, and this centrality (or at least independence/individuality) was also axiomatic to the general Western zeitgeist (for better and ill).
 One need only have watched the last two State of the Union Addresses, and have just the roughest familiarity with Common Core—nay, one need only have not lived under a rock to have some sense of the growing importance (really urgency) that mathematics and the sciences have gained with our increasingly digital culture. As far as vocations go, they are the “money-makers”—which is to say the value-makers. As far as positions (places in society) go, they are “in-demand”—which is to say they are valued.
 “The American Scholar,” p. 54 in Essays and Lectures (New York: Library of America, 1983).
 Emily Dickinson, #466 in The Poems of Emily Dickinson: Variorum Edition, ed. R.W. Franklin (Cambridge: The Belknap Press of Harvard University Press, 1998).
 Thoreau, Walden (New Haven: Yale University Press, 2006), p. 352.
 Hence the humanities. For classic examples of this view, see any book—but maybe especially Tolstoy’s War and Peace, which Isaac Babel said read like the world had written itself, or John Milton’s defense in Areopagitica of any “good” book as spiritually tantamount to human life.
 Walt Whitman, “A Noiseless Patient Spider.”
 This is what Marilynne Robinson, in Absence of Mind, refers to as the fallacy of “descriptive sufficiency” at work in a common materialist/positivist view of the human brain (p. xvi). Her chapter “The Strange History of Altruism” (pp. 31-75) discusses some of the problems behind this kind of reductive self-regard.