Quis custodiet ipsos custodes? – Why We Should Care about Government Data Mining

If you are anything like me, you will agree that this summer’s PRISM/NSA scandal has marked a new low in the history of 21st-century digital surveillance. But by far the scariest prospect is that the NSA’s attitude towards privacy and data mining might represent a new normal in how large institutions relate to their customers’ data. True to form, for the last half-year the media has been reinforcing just this narrative with its breathless coverage of how “big data” concepts are revolutionizing the future of both security and corporate administration, offering untold benefits to analysts interested in getting an edge. But even with the copious amounts of digital ink spilled on “big data” and data mining, very little consideration is given to explaining the technology involved.

To a certain extent, this is understandable. The NSA scandals along with other corporate data-farming controversies are primarily about the privacy and ownership of social data, not the use of the data once it is collected. When the government (or corporate entity) has already absconded with our information, do we really care what they do with it? Still, absent even a rough overview of the technology, several of the potential dangers with our lax attitudes towards data privacy may not be fully apparent to the average news consumer.

In the broadest sense, data mining is simply the use of computer algorithms to adaptively classify a dataset in order to discover subtle relationships. “Adaptively” is the key word in that sentence. Data mining is not the straightforward testing of relationships over a large dataset (that would be standard statistical analysis). For this reason data mining is often categorized as an extension of artificial intelligence or machine learning. The advantage of data-mining techniques over traditional methods lies in their ability to perceive relationships that would elude human perception. As humans, we are programmed to look for certain types of patterns in datasets. We have certain sensitivities (perceiving faces and voices) and insensitivities (very small or large numbers), and when approaching a dataset these biases can blind us to relationships that contain a good deal of predictive power. Unlike standard statistical methods, most data-mining algorithms simply begin with a metric to track and then proceed to use examples (called a training set) to create categories that explain the metric’s variance. Since no categories are proposed at the outset, this adaptive technique greatly resembles “learning” and can turn up relationships across thousands of distinct inputs that a human would likely never discover.

Before my readers’ eyes glaze over, perhaps I can offer an example showing the difference between the two approaches. Say we would like to understand what factors influence teacher performance in primary schools. Certainly we can imagine a large dataset comprising the dossiers of teachers along with a metric representing the overall effectiveness of those teachers in educating their students. In a standard statistical approach, one might test whether classroom size, income, location, or even a linear combination of all these variables affects the ultimate performance metric. From these standard methods we would expect an answer telling us whether any one of our proposed relationships is statistically valid (e.g. a 0.85 correlation between class size and performance at 95% confidence). However, this approach is limited. There might be very subtle relationships lurking in the many thousands of variables we have on each teacher. Enter data mining. At the end of running a mining algorithm we might arrive at entirely unexpected rules that would never have been discovered by an a priori query. We might discover, for instance, that teachers below the age of 35 have an improved performance metric, but only if they are unmarried and living in an urban area; we might discover that teachers above the age of 40 in suburbs actually improve their performance metric when class size is increased above 20 students. Any number of complicated relationships might be generated from the very subtle interplay of the thousands of variables describing each teacher, some of which no one would ever expect.
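To make this concrete, here is a toy sketch of the adaptive approach (every field name, the hidden rule, and the noise rate are invented for illustration; this is not any agency’s or vendor’s actual system). The miner is handed only a training set and a metric, proposes nothing up front, and greedily assembles whichever combination of conditions best explains the metric:

```python
# Toy "rule mining" by greedy forward selection: no hypothesis is proposed
# at the outset; the algorithm simply keeps adding whichever condition most
# improves accuracy on the training set. All fields are invented.
import random

random.seed(0)

def make_teacher():
    t = {
        "age": random.randint(22, 65),
        "urban": random.random() < 0.5,
        "married": random.random() < 0.5,
        "class_size": random.randint(10, 35),
    }
    # Hidden interaction the analyst never guesses: young, unmarried,
    # urban teachers score higher (plus a little label noise).
    t["performance"] = 1 if (t["age"] < 35 and not t["married"] and t["urban"]) else 0
    if random.random() < 0.05:
        t["performance"] ^= 1
    return t

data = [make_teacher() for _ in range(2000)]

def rule_accuracy(records, conditions):
    """Fraction of records where (all conditions hold) matches performance."""
    return sum(all(c(t) for c in conditions) == (t["performance"] == 1)
               for t in records) / len(records)

# Atomic conditions the miner may combine.
candidates = [
    ("age<35",        lambda t: t["age"] < 35),
    ("urban",         lambda t: t["urban"]),
    ("unmarried",     lambda t: not t["married"]),
    ("class_size>20", lambda t: t["class_size"] > 20),
]

chosen, conds = [], []
best = rule_accuracy(data, conds)
improved = True
while improved:
    improved = False
    for name, cond in candidates:
        if name in chosen:
            continue
        acc = rule_accuracy(data, conds + [cond])
        if acc > best:
            best, pick = acc, (name, cond)
            improved = True
    if improved:
        chosen.append(pick[0])
        conds.append(pick[1])

print("learned rule:", " AND ".join(chosen), f"(accuracy {best:.2f})")
```

Run on this synthetic data, the search recovers the three-way interaction (and correctly ignores class size) without anyone ever proposing it as a hypothesis; that discovery-without-a-question is the point.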

At this point one might ask: so what’s the problem with this technology? Surely a better understanding of reality-based facts can only improve our decisions. If we assume no malice on the part of whoever conducts the investigation, there can be nothing to lose from implementing a new approach to data. Well, like most things, the devil is in the details.

In the previous example, the rules I offered were odd but easy to understand. In a more realistic scenario, rules could involve the interplay of thousands of variables and might not be logically discernible to non-experts. In many, if not most, cases, huge multivariate relationships could be standing in for causal relationships between data not even contained in the original dataset. Of course a domain expert could, with some work, come to understand why these rules hold predictive power, but this is expensive, and given the nature of current data-mining practices, such rigorous methods are almost never pursued. In fact, to some analysts, data-mining rules are seen as a possible replacement for domain knowledge. This can lead to an insane scenario where data-mining applications actually blind analysts to the causal relationships in play. When the data becomes large and complex, it may be tempting to let the so-called “learning” of the data-mining algorithm replace expertise. But despite the nomenclature, the algorithm hasn’t really applied any intelligence; and so a modern data-mining system could proceed with its powers of discernment completely disabled.

All of this sounds very esoteric, so perhaps I can return to our original example of tracking social and personal data. The great and abiding problem with allowing large institutions free rein to mine these personal data sources is that citizens will have only a vague notion of what causal inferences are being used to track them. In all likelihood the institutions have no clear idea either, and because of this, any number of things might be unwittingly inferred about subjects. With a large dataset, a machine-learning algorithm might discover an exact fingerprint that statistically identifies people as gun owners, transgender, Muslims, home-schoolers, or any number of things not even tracked in the initial dataset. Of course, these relationships will be obscure to all but the most rigorous analyst, but in a poorly supervised system these fingerprints (standing in for firsthand data) will be used to make decisions.

Privacy advocates frequently bring up examples where someone’s sexual and dietary habits might be inferred from access to an Amazon wish list or a public Facebook profile. This scenario is often dismissed by cooler-headed individuals as paranoid tripe. Don’t the analysts at Amazon or the NSA have better things to do than sit around compiling personal dossiers on citizens? And how could anyone be spiteful enough to snoop into strangers’ personal lives? The cooler heads are certainly right in assuming a lack of spite on the part of data miners; unfortunately, sloth might accomplish what malice does not. Your employer or the government might be flagged that you are a suspicious or “interesting” individual by their data-analysis system simply because you have a hobby of collecting antique guns. Of course, no one would ever have asked whether you were a gun collector to begin with, but nevertheless they are getting an answer (in so many words) from the algorithm. A data-mining algorithm that reads through thousands of different variables across a dataset of millions of individuals will likely have fingerprints (statistical proxies) for any number of personal facts. As a result, government or corporate dossiers could implicitly contain any number of pieces of information about an individual.
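A small sketch of how such a fingerprint works (the population, the attribute, the feature names, and every probability below are invented for illustration): no column in the dataset ever records the sensitive fact, yet a crude score over three innocuous markers recovers it for the vast majority of individuals.

```python
# Toy fingerprint: the hidden attribute is never stored, but features that
# merely correlate with it act as a statistical proxy. All names and rates
# here are invented for illustration.
import random

random.seed(1)

def make_person():
    collector = random.random() < 0.1      # the hidden fact: never recorded
    def obs(p_if_collector, p_otherwise):
        return random.random() < (p_if_collector if collector else p_otherwise)
    features = {
        "buys_cleaning_solvent": obs(0.9, 0.1),
        "history_magazine":      obs(0.7, 0.2),
        "attends_trade_shows":   obs(0.6, 0.1),
    }
    return collector, features

people = [make_person() for _ in range(5000)]

# Crude fingerprint rule: flag anyone showing at least two of the three
# innocuous markers.
def flagged(features):
    return sum(features.values()) >= 2

accuracy = sum(flagged(f) == c for c, f in people) / len(people)
print(f"hidden attribute recovered for {accuracy:.0%} of individuals")
```

Nothing in the "dossier" mentions collecting; the inference rides entirely on correlations, which is exactly why it would be invisible to anyone auditing the list of tracked fields.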

I suppose at this point one could introduce another, larger philosophical objection. But really, what is the problem? If the relationships are obscured within the system, no individual is the wiser; and if some groups are statistically prone to being more violent, better customers of embarrassing products, or simply not well suited to some social environments, shouldn’t we be using this information to improve society? After all, no person will likely ever discover or know these relationships. But the problem is just that. No one knows! The human transaction of investigation, which should contain a level of discretion, has been replaced by a headless automaton. In the past, Americans decided that, at some level, there was a core concept of privacy: some predictive information would not be used to make judgments about an individual, regardless of the benefits to society. Data mining, in effect, could circumvent this concept, using peripheral data to compose an implicit portrait of people’s more intimate biographical details.

All of this brings me back to two recent news stories: the revelation of extensive and racially biased “stop and frisk” policies in Bloomberg’s New York City, and the increasing prevalence of CCTV surveillance throughout most urban areas in Europe and America. Certainly both the surveillance of citizens and the use of racial profiling are troubling in themselves. However, in both cases, the extent of the evil was mitigated by technological limitations and a firm desire on the part of the public not to transgress common-sense privacy norms. CCTV never could become the Orwellian Panopticon of Big Brother for the simple reason that, for each camera, there had to be an official at the other end watching (an intractable staffing problem). Moreover, despite having tolerated racially profiled “stop and frisk” policies in the past, New Yorkers are now demonstrating that they find those policies unethical even if they do help law enforcement reduce crime rates. There are too many people who value privacy above security, and too few watchers willing to test their limits, for a nightmare scenario ever to be realized.

However, with the advent of data mining, both the technological barriers and our ability to detect ethical problems in the invasion of our privacy are fundamentally undermined. Whereas previously a city-wide CCTV surveillance system could never realistically be staffed, a CCTV database paired with a data-mining algorithm trained on the face, action, and motion patterns of criminals could quite easily provide an effective means of detecting potential crimes in progress. Conversely, police officials forbidden from using race as a means of tracking suspects may turn to subtler methods: data-mining algorithms trained on everything BUT race and race’s more obvious correlates. What is likely to emerge are large, complicated data-fingerprints that proxy for race in everything but name. The New York Mayor’s Office was able to deny the use of race in its “stop and frisk” policy for years by hiding behind an obvious proxy (saying it was targeting “neighborhoods”). Future regimes able to couch their implicit racial strategy in complicated datasets may be virtually undetectable by the public. Of course, those rigorous enough to check (or cynical enough to guess) will know that society is essentially tracking its members by race anyway; and so, at the end of the day, society will be using technology to do an end run around its ethical principles of privacy and fairness.

The ancient Latin conundrum Quis custodiet ipsos custodes? (who watches the watchers themselves?) expressed clearly the problem of limiting the reach of authorities while holding those same authorities accountable. In answer, Western common law proposed the dual values of privacy and transparency: authorities needed to be restrained from using some information (even if readily available), and citizens had the right to know how and where they were being watched. I may sound paranoid, but in a society locked into a system of machine surveillance, neither the principle of privacy nor that of transparency can be truly realized. Extensive data collection and mining may have untold benefits for policing, marketing, and general convenience, but without an enormous effort on the part of society to manage that system, citizens will no longer understand the means and mechanisms through which they are constantly watched and judged. The machine will see it, the machine will flag it, the machine will determine the boundaries for judgment and set the conditions for the interaction between the citizen and the authorities. In a society so ruled, “Who watches the watchers?” not only ceases to have a meaningful answer; it will cease to be a meaningful question.


Return to the Conor Byrne

Tonight was my first chance to visit Seattle’s old-time community. This offered the opportunity to return to Conor Byrne, the first place I played old-time music with a group back in 2007. It’s surprising how little the place has changed. It’s surprising how little the people have changed. It’s a shame that the jam is always on a weeknight and there is no real chance to drink Guinness and stay up late with the regulars. Still, jams rarely need libations, and it’s good to know that the institution still exists.

It’s very hard to find things that you can return to six years later and find exactly the way you left them. Things move fast in the modern world, especially for people like myself who move from city to city. It’s rare to return to a place and feel like you are returning home.

Philosophy, Literature, and the Varieties of Atheist Experience

Richard Dawkins provided an interesting quote in last Thursday’s New York Times that made me think of other recent atheist comments on the relationship of literature and the humanities to general science.

“Why is the Nobel Prize in Literature almost always given to a novelist, never a scientist? Why should we prefer our literature to be about things that didn’t happen? Wouldn’t, say, Steven Pinker be a good candidate for the literature prize?”

Well, certainly there have been some non-fiction writers who have won the Nobel Prize in Literature (most notably Winston Churchill and Solzhenitsyn). It certainly wouldn’t be out of the question for a science writer to win the award too; however, to date, the Nobel committee has favored philosophers and historians for the prize. I always assumed this was a conscious decision on the part of the Nobel committee to mark literature and the humanities as distinct from the sciences. Dawkins believes that this very distinction is mistaken at its inception.


I think Dawkins highlights a common attitude among the new atheists that the separation of science from the humanities is the main source of sloppy or “magical thinking” in academia. I found echoes of this in Steven Pinker’s recent article in the New Republic:

“And as with politics, the advent of data science applied to books, periodicals, correspondence, and musical scores holds the promise for an expansive new “digital humanities.” The possibilities for theory and discovery are limited only by the imagination and include the origin and spread of ideas, networks of intellectual and artistic influence, the persistence of historical memory, the waxing and waning of themes in literature, and patterns of unofficial censorship and taboo.”

Leaving aside Pinker’s Pollyanna-ish perspective on the effectiveness of data mining on soft datasets, he paints a picture of a humanities field on the verge of being folded into general science (if one can simply mine a book for meaning and arrive at a deterministic result, what is the point in reading it?). Perhaps the most brazen claim yet has come from Sam Harris, who has recently claimed to have solved man’s 3000-year-old conundrum and developed a perfect set of ethics that can be scientifically verified as sound (though Sam Harris’s scientific ethics bear a strange resemblance to those favored by East Coast liberals living in the early-21st-century United States).

Despite having a strong interest in both science and the humanities, I have a profound distaste for the ambition to totally merge the two into a superior composite. Not that I think the application of good scientific technique to the study of the humanities isn’t important. However, I find the New Atheists’ description of what they imagine for a scientific humanities to be fundamentally dishonest. Several key caveats are never mentioned: attempts to merge the humanities with the sciences are not new, have yielded very poor results in the past, and, owing to the necessary differences in standards of evidence between fields, lead naturally to the corruption of one of the parties involved.

But I am hardly the first to mention any of these objections to the Dawkins-Pinker-Harris project. For instance, I think many people, not necessarily religious, will balk when the neo-atheists’ ambition to unify ethics under science comes to a head. Not to mention the many atheist literature professors who will object to data mining as a replacement for textual analysis. Depending on how far the current ambition of the New Atheists goes, new attempts to merge science with the humanities may in fact mark the occasion for a major atheist schism. Will a progressive atheist literature-enthusiast interested in social justice feel more in common with Pinker’s scientism than with a liberal Episcopalian? It’s hard to say.

I have been interested in the development of atheism ever since reading Christopher Hitchens’s God Is Not Great. As most atheists will tell you, there is no common core of values inside atheism aside from non-belief in God. However, I find that the neo-atheists’ insistence that atheism itself is a unifying force (as opposed to religions, which divide) stands in contradiction to this first contention that atheism has no ideological content. If atheism is ideologically and ethically empty, it has absolutely no unifying power, and any consilience universally felt among atheists is due to cultural and demographic coincidence.

I think we may be on the verge of seeing what I like to call an “Atheist Babylon”: a schism where atheists previously housed under one edifice divide into communities more antagonistic to each other than to ideologically similar believing groups. This idea is too long to develop within this blog post, but I hope to write more about it in the coming days.

How to Absorb the Bible in 5 minutes with Good Data Visualization

People have been lamenting the general religious illiteracy of Americans for some time now. I am sympathetic. It seems common in debates about God for neither the atheist nor the Christian to have a good idea of what’s actually in any of the 66 books that make up the standard Old and New Testaments. For a while I have been on the lookout for a good crash course, in the hope of finding an introduction to the Bible for the attention-deficit-afflicted members of my generation. I have had some favorites (the History Channel’s sexed-up The Bible mini-series isn’t one of them).

However, recently I came across a video that might represent the ne plus ultra of Biblical summarization. The Bible in 66 Word Clouds uses one of the oldest data-representation schemes to illustrate each of the Bible’s 66 books in a single still frame. I never thought word clouds were good at illustrating datasets; but, oddly enough, each of the 66 frames used in the presentation captures the general mood of the corresponding book. Given that the books of the Bible are each fundamentally about the intersection of relationships and values, one can look at a word cloud and immediately understand the primary relationship and value being described in the narrative.
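For the curious, the data behind a frame like this is presumably nothing more than a word-frequency table: count the words, drop the common filler, keep the biggest counts. A minimal sketch (the sample text is a loose paraphrase and the stop-word list is my own abbreviation, both for illustration only):

```python
# Minimal sketch of the data behind a word cloud: tokenize, drop stop
# words, and keep the most frequent terms. Stop-word list and sample
# text are abbreviated inventions for illustration.
import re
from collections import Counter

STOPWORDS = {"the", "and", "of", "to", "a", "in", "unto", "that",
             "shall", "is", "was", "thy", "thee", "out", "which"}

def top_words(text, n=3):
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS)
    return counts.most_common(n)

sample = ("And the LORD spake unto Moses, saying, Speak unto the children "
          "of Israel: I am the LORD thy God, which brought thee out of the "
          "land of Egypt.")
print(top_words(sample))
```

Scale the font of each word by its count and you have, in essence, one of the 66 frames: the dominant names and themes of a book surface on their own.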

I am convinced that a complete Bible neophyte could watch this video and develop an above-average comprehension of the Bible with little time investment. In my first 5 minutes I came to the following associations just from superficially tracking the large word bubbles and adding common sense.

  • Genesis concerns Abraham and Jacob’s relationship with God. Exodus concerns God and Moses. Leviticus, Deuteronomy, and Numbers concern how the Lord should receive offerings
  • The Book of Ruth is about the relationships among Ruth, Naomi, and Boaz
  • The Books of Samuel concern the reign of David as King of Israel, the next three books concern his descendants
  • The Song of Songs is about love. The Book of Psalms is about battle. The Book of Proverbs is about wisdom.
  • Mark, Matthew, Luke, and John are about the life of Jesus

Which is more or less the takeaway I had after reading these books at length. To think, a general understanding of biblical texts might easily have been achieved without the four years of Catholic school and the independent study. All that was needed was a really good use of data visualization.

What I learned from The Dominican Sisters of Ann Arbor

In August the Dominican Sisters of Ann Arbor released their first album: Mater Euchariste. Having lived in Ann Arbor for several years, I was happy to see NPR give it a detailed review. Most notable in NPR’s interview was the description of the Sisters’ relationship with music:

The 110 women who live inside the red-brick nunnery in the lush green countryside north of Ann Arbor love to sing. They sing these chants and hymns during morning, noon and evening prayers. And when they’re cooking dinner, they’re liable to break into The Sound of Music or Oklahoma.

Later, one of the Sisters is quoted:

“Honestly, I don’t listen to very much music, and most of us don’t listen to very much music that’s recorded. Almost all of the music that we encounter on a day-to-day basis is music we make ourselves,”

Hearing this quote I couldn’t help but think about other conversations with professional musicians who have admitted very sparse music listening habits. Even among friends who have careers as semi-professional musicians, I find an inverse correlation between participation in music and the consumption of musical recordings. At first I found this very hard to explain. More and more, I feel it embodies something more fundamental about how we relate to our music.

It would be easy to attribute avid musicians’ lack of interest in recordings to the candy-chef effect (where professionals involved in creating something generally lose their taste for it). But anyone who knows a serious musician will have a hard time buying this explanation. The same musician who prefers to spend his two-hour plane ride in silence will, nonetheless, take an entire weekend to run a pro-bono workshop. The same musician who doesn’t own an iPod will, nevertheless, spend hours learning an instrument ancillary to his profession. The ambivalence of musicians to recorded music is better explained by the development of an appreciation for music that can no longer be satisfied by being a passive listener. When a person creates music as part of his daily life, he understands his relationship with music as being that of a creator or critic, and less that of a passive listener. A musician has a relationship with music that is entirely active.

But historically, humanity has always had this active relationship with music. We can see this clearly in extreme examples such as the Medieval cloister where each member’s existence is driven by songs sung in prayer. But even in secular communities right up until modern times, people encountered music only in contexts where listening implied participation. It may have been through prayer, dance, singing, or by actively being a discerning audience (as in opera or symphony), but to encounter music prior to the 20th century meant engaging with that music. The link between listening and participation was cut with the advent of musical recordings, and may have been severed with the advent of the mp3. My relationship with my iPod couldn’t be more passive. Usually, I don’t even choose which song will be played, I just hit shuffle. My sole input in the process of listening to mp3s is deciding whether to keep the current song or switch to the next song randomly selected.


Even the experience of actively buying and owning music is no longer a large part of contemporary music culture. For those old enough to remember, going to a used record store and purchasing a CD provided a great deal of fun. Since CDs were relatively expensive, there was a strange charm in shopping for one, knowing that the decision to purchase a certain band’s album would define one’s very finite music collection. There was romance in the imperfection, knowing that each of the albums in one’s collection was a risk: there were disappointments and diamonds in the rough. But all in all, an album collection was fundamentally personal.

Nostalgia for musical ownership is probably the driving force behind the recent fad of collecting vinyl records. It certainly is for me. Regardless of what any of us vinyl junkies say about fidelity and the way vinyl “sounds”, I buy records for two experiences: the process of looking through the racks at record stores and wondering whether a record deserves a place in my collection; and that moment when a dropped needle hits the vinyl surface with an amplified *thud*, confirming I have made the decision to play that record.

Needless to say, there is no similar moment of decision when playing a song on an iPod. The digital music exists in an ocean of mp3s and is played (in most circumstances) randomly. I purchase mp3s individually and almost immediately when I hear a catchy tune, and I rarely concern myself with the band or genre to which they belong. The cost of any individual mp3 is never enough to force me to limit my purchases to songs from some set of artists or styles. There is no longer a sense that I even have a collection of albums, just a number of tunes that I enjoy or have enjoyed listening to.

But we are already well on the way to an even more radically detached listening culture with the introduction of predictive analytics into music processing and selection. Starting with Pandora and later Spotify, iTunes Genius, and Songza, the next generation of music-distribution systems is transitioning from the digital to the predictive. Now, in place of a digital library constructed from individual decisions, systems are being built to feed users a pre-selected playlist constructed with data-mining algorithms and machine-learning interfaces. I am an avid Pandora fan myself, and must admit there is nothing more efficient for background music than putting on a radio station pre-tuned to one’s musical tastes. But, the usefulness of these services aside, the new predictive approach to music removes any direct participation by a listener in his listening experience. The user does not choose anything beyond providing a binary input; the rest of the playlist selection is controlled by the algorithm. The user is treated not as a participant but as a dataset.
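That one-bit relationship can be sketched in a few lines (the catalog, weights, and update rule below are all invented for illustration; real services are vastly more elaborate): the listener supplies only a thumbs up or down, and the “station” does everything else.

```python
# Toy binary-feedback station: the listener's only input is one bit per
# song; the algorithm reweights the shuffle accordingly. Catalog, weights,
# and update rule are invented for illustration.
import random
from collections import Counter

random.seed(2)

weights = {"song_a": 1.0, "song_b": 1.0, "song_c": 1.0}

def next_song():
    songs = list(weights)
    return random.choices(songs, weights=[weights[s] for s in songs])[0]

def feedback(song, thumbs_up):
    # The listener's entire contribution to the playlist: one bit.
    weights[song] *= 2.0 if thumbs_up else 0.25

# Suppose the listener dislikes song_c and likes the rest.
for _ in range(200):
    song = next_song()
    feedback(song, thumbs_up=(song != "song_c"))

plays = Counter(next_song() for _ in range(1000))
print(plays.most_common())
```

After a handful of rounds the disliked song all but vanishes from rotation; the listener never chose a single track, only ratified or vetoed what the algorithm served up.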

Of course using binary inputs from the user as a data set is only the first step. Soon, I am sure, our phones will have an app able to scan our brains and play a song for us that an algorithm has pre-determined we will want. Perhaps, in the final iteration, each note and beat of a song will be created and remixed by an algorithm to perfectly satisfy our brains before we even know we have the desire for music. But while the creation of machine designed songs pre-fit for our brains may be the apex of consumer satisfaction, the art of music will be completely transformed into a solipsistic loop, having as little to do with actual music as masturbation has to do with sex.

But I don’t think mere consumer satisfaction is what humanity wants from music. We don’t want music to be reduced to a product of consumption or an experience perfectly calibrated to our whims. We want to live music; we want to hear music as something outside of ourselves; we want to interact with music as we do with a friend or struggle with music as we do with a lover. Music is not really music unless it shapes us at the same time that we shape it. When we participate in music by composing or singing or playing or even listening and being challenged by it, we are sharing in music at a much deeper level than consumption; we are allowing the music to become part of the fabric of our lives.

Again I think back to the Dominican Sisters of Mary, living day in and day out with the music that shapes and defines their relationship with God, and I am reminded of the name of Israel, “the one who struggles with God”. These women have let music shape them in their daily struggles, and it has become for them a conduit to the divine. The Sisters live their music and, therefore, understand the holy reality of music on a truer and deeper level than any of us moderns with our iPhones and terabyte mp3 collections.

Certainly, we can’t all live the way the Dominican Sisters do; we lack the time, we lack the devotion, we lack the means. Nonetheless, I think it’s important to understand that our modern relationship with music is an impoverishment. When we cannot live the divine, we participate; when we cannot participate, we try to own it; and when we can no longer even attempt ownership, we mine and consume what remains. Through all of these degradations, however, we are losing sight of what was originally sought. My hope is that, in the middle of this boom of music consumption, we feel called to participate in music as much as we feel called to listen to it. We can see hints, in the lives of others more dedicated, of what music can really offer a human life; we just need the courage to pursue it for ourselves.

Distributism and the Lost Cause

I consider myself a distributist. This is odd, because I do not believe that distributism is an effective economic system. I don’t even consider distributism a source of sound political principles. Distributism is a defunct idea. To be emphatic, I might go further and call distributism one of the most naive philosophies ever proposed. Still, I call myself a distributist. Despite its manifest failures, distributism contains a spark of idealism absent in the contemporary world; it is this idealism that deserves to be carried forward and explored in modern times.


Chesterton’s third way

Inspired by Pope Leo XIII’s encyclical Rerum Novarum (on Capital and Labor), G.K. Chesterton and Hilaire Belloc formulated distributism as one of the first “third way” Christian alternatives to socialism. Chesterton and Belloc preached a return to fundamental Christian values in order to move society away from large institutions and back towards the organic, family-oriented communities of the medieval past. On the surface idyllic, distributism was in fact a revolutionary perspective that viewed largeness and efficiency as enemies of the small and beautiful. Many distributists following Chesterton and Belloc believed that only when the veneration of power and imperialism was destroyed could the poor and humble things in life be valued. Much ink was spilled on the value of the humble, and Chesterton (never shying away from grand pronouncements defending the downtrodden) famously finished his distributist magnum opus What’s Wrong with the World by asserting that he would rather disband every bank, parliament, and army in England than admit that the shaving of one poor girl’s head for lice was “necessary”.


The distributist community

But despite the romance of Chesterton’s assertions, the distributist vision was (and still is) economic nonsense. Anyone with even a rudimentary understanding of trade knows the terrible impact economic de-globalization would have on the poor. Anyone with knowledge of manufacturing knows the price de-industrialization would exact from ordinary families. Moreover, Chesterton’s notion of an independent and empowered medieval peasantry is one of the greatest examples of historical hogwash in the modern era. Truly it was a mercy that Chesterton’s creed was found hard and never tried. As a practical system, distributism deserved to die in a sea of well-earned laughter just as much as communism deserved to die in its sea of well-earned blood.

But I persist. Even if there is no value in implementing distributism, there is value in calling oneself a distributist. Identity should be about ideals, and as such one can associate with a political philosophy even while acknowledging it to be impractical. A person’s core ideological position says something about him. Those who call themselves libertarians or socialists are saying something definite about their values, even if they acknowledge that pure libertarianism or pure socialism would be a disaster. Values are ends in themselves and therefore a truly honest way of constructing an identity. It is in this way that calling oneself a distributist is powerful. The name stands for idealism, a futile idealism perhaps, but relevant nonetheless.


Idealism as a guiding light

Chesterton was right in one regard: the world is not suffering from too much idealism but too little. There are too many conversations about practical reality and too few about the values needed to rule over that reality. Amid all the discussion of interventionist wars and international institutions, do people ever ask how the planet should ideally be governed? Amid all the talk of funding entitlements, do people discuss how responsibility and ownership should ideally be divided between the individual and the state? I find that ideology is almost never discussed in politics. And if there is a dearth of ideological discussion in politics, there is an absolute void of ideological consciousness in technology and culture. The history of internet privacy and digital copyright abuse is one long catalog of the American people accepting incursions on their property that would have been rejected in seconds if put in stark ideological terms. Like it or not, through a series of pragmatic steps, a core American ideal has been all but removed from our everyday lives. Admittedly, in realistic terms, privacy and property may be doomed in the digital age; but I would rather hold up the ideal and see it fail than simply slouch into perdition unconsciously. Even if ideals are hard to realize (and especially if they are impossible to realize), we still need them, if for no other reason than to measure our progress or regression.


Humanity’s greatest calling

Fair enough, one might say. But why distributism? Surely there is some higher idealism than a failed socialist scheme written by uneducated novelists? Why not Marx or Nietzsche? If one is going to chase castles in the sky, why not make a grander effort and reach for the stars and the supermen? But I disagree. Distributism does embody the grandest desire of humanity, because it rightly focuses man towards the small and personal. Distributism knows that humanity’s greatest calling (and failing) is to love thy neighbor. The philosophy has no grand vision, only a dream of family and fellowship. It is a utopia that is almost familiar because it is almost reachable. The vision of distributism is therefore not just a lost cause but the ultimate and most fundamental of humanity’s lost causes. I think this is what Frank Capra might have meant in the famous scene from Mr. Smith Goes to Washington: that behind all the impossible ideals that drive us, the most important is to be neighborly.

So I call myself a distributist not only because the ideal it stands for is hopeless, but also because the ideal is truly ideal. In the dying embers of that true ideal one might see more clearly the evils done in the name of practicality. That, if nothing else, is a lost cause worth fighting for.