Leapdragon 2016 - Aron Hsiao Was Here

I am a very competent seer. I am a far less competent doer. I need a mentor. Stat.

I am an insights guy. I am ahead of the curve, and always have been. A man before my time. A person whose ideas are on the cutting edge, etc. Some key examples:

  • In the ’80s, when no one even knew what the Internet was yet, I started a software company called UNIT to build social networking tools for the coming TCP/IP (i.e. internet) world. I created the ONAS (OS Network Access System) project to bring BBS-style social interaction to desktop TCP/IP computer systems over telnet for, essentially, friending and chit-chat. Of course, the WWW didn’t exist yet, but it was clear to me that real-time networked interaction about a million informal little things between average people would be a thing. People didn’t understand what I was trying to do. Why bother? Email and UUCP existed already, as did BBSes, and only a handful of pocket-protector-wearing uber-geeks cared. Why would anyone want what I was building, especially when it was for an obscure networking technology that only universities used and that required very expensive computers and networks? My suggestion that eventually everyone would have TCP/IP in their bedroom on a megabit connection? Ha! Dream on, kid. The average person will never find this stuff to be cool. People are embarrassed to admit that they even know how to use a computer. There could not be anything less “general public” in the world. My project was a labor of love; it never generated any money. It was never finished. The code still lives around here somewhere. It’s written in C. Not even ANSI C. This was the ’80s. It’s in K&R C. I aborted at about 80 percent complete (core libraries and platform working, UI mostly done) because it was clear to me that I was spending a lot of time on something that I had no idea how to turn into income and that I struggled even to explain to people; it was also clear that I had no plan for what would happen in that “???” step between “finish writing software platform” and “profit.” More on this general inability later.
  • In 1992, I adopted Linux almost the moment that it was released and began to evangelize and write books, saying that Unix-like operating systems would be the next big thing in computing and that the multiuser-multiprocessing-networked operating system model would be the future of home computing. There were a lot of people who thought this was nuts—what’s the need for any of this in a personal computer? Remember, the PCs of the time had no preemptive multitasking, no security model, no concept of user identities, and no networking. By the end of the ’90s, I had been completely vindicated. By the early ’00s, every computer being shipped had, at a minimum, a multiuser model, preemptive multitasking, and a security model, as well as a complete and robust TCP/IP stack—even in the home.
  • In the late ’90s I ditched film photography altogether, tossed out my negatives, and began to say to people that digital photography would completely displace it, not just as a matter of initially capturing moments, but also as a matter of archiving them, and that the big problem and space of innovation would not be imaging (which everyone presumed would never catch up, but which I never doubted) but in fact how to curate, store, database, and transmit or preserve these new archives, especially once everyone was a photographer who could produce thousands of images a month. Nobody took this seriously. At the time, the best cameras were just under one megapixel, they were expensive, and storage requirements were minimal, so the idea that it would be a tough job to store the ten tiny digital photos taken by the ten geeks who actually had access to a CCD imager in an electronic device, or that there was some worry about society losing access, long-term, to its own visual archive, seemed off the wall. Well, it wasn’t.

  • In 2001, this very website provided detailed “wish list” specs for a device that I thought could dominate the world. Not just the tech world. The world world. At a time when people thought that touch computing and portable computing were niche concepts at best, and PDAs and computers were understood as entirely separate markets, I called for a device that was both highly personalized and a full-fledged computer, that fit in the palm of your hand, instant- or always-on, with a high-resolution, full color touch display, a built-in full-bore web browser, no requirement to use a pen or pointing device, no physical keyboard and no buttons except for a power button, complete networking rather than “plug in and sync” data, a serious camera, microphone, speaker, and video processing, a large amount of processing power and memory, flash media storage slot, running full-fledged Linux or Unix under the hood, at about six inches by four inches or smaller and very lightweight, with a full day of battery power and mobile networking for Internet data and VOIP. I even began to imagine that I could hack one together with spare parts and other devices that I knew about—but I lacked the funds to do it. All the pieces, however, existed in one form or another in already shipping devices, albeit in rudimentary form. In short, the imaginary device that I described—which seemed ridiculous at the time in light of prevailing industry pundit analyses—was what we now know as a high-end smartphone. I described a phablet. An iPhone 6 or 7 or 8 Plus. Or a Galaxy Note of whatever generation. Take your pick. And I was right, it did come to dominate the world.
  • In 2004, during my first stint at grad school, I earned a lot of smirks for wanting to marry the analysis of (1) contemporary public policy, (2) religion in the culture wars, and (3) ethnonationalist sensibilities. These were three different things, and I was an “unserious” person who hadn’t done my homework for trying to glue them together. I did ultimately write my thesis and earn my master’s degree, but people whispered about me and tsk-tsked. Now? Gosh, Donald Trump got elected, the nexus of the alt-right, evangelicalism, and traditional subcultures in the U.S. has given us Steve Bannon and Roy Moore and they’ve become hot topics of debate and national deliberation, and by god what the pundits are talking about is how these figures are promoting a new ethnonationalist understanding of politics and society with eschatological overtones. Who saw that coming? Well, I’m sure there were a few others here and there, but also—me. Except I couldn’t make hay with it at the time; instead, people snickered.
  • In 2006, to start my second stint at grad school, I wanted to study two related points as pressing matters of public policy and social research, namely, (1) social media as a coming force and organizing principle in society, and (2) the fact that all of this social and interaction data—and the society and interactions themselves, as embodied things—would be sequestered away in proprietary databases, marking a dual crisis for public policy and social research: the transition to new forms of society at the same time that they lost all publicness (of process, of data, etc.) under current legal and regulatory regimes. This time, I wasn’t just laughed at (keep in mind, this was before the release of the first iPhone and before Facebook opened up to the general public, rather than being a closed environment for students at participating schools); rather, I was actively hated by many in the faculty. Who knows how I actually got admitted. But in any case, so far as they were concerned, I wasn’t doing sociology or public policy; I was a computing and technology geek trying, for some unexplained reason, to smuggle these obscuritarian things into the discussions held by The Serious People about how to run society. Not only that, but the computing and technology stuff that I was talking about was imaginary, pie-in-the-sky, flash-in-the-pan bullshit “products” (if we could even deign to call them that) that nobody cared about and nobody would use. Certainly not normal, healthy, commonplace members of society. I almost got tossed out of my Ph.D. program multiple times. Names were called. Crusades against me were launched. I was again described as unserious and not a real social researcher, only this time in much less flattering terms. I had to make presentations to representatives of deans and provosts to try to justify my silliness and why I ought to be allowed to stay in. And of course now—in 2017, post Donald Trump and Russian Meddling and lalala—everyone is talking about how social media and technology have transformed the way in which society, governance, public policy, and public deliberation happen, and in the wake of these events, pundits within the academy and outside of it are finally starting to wake up to the idea that we don’t have any idea what happened, nor the tools to research, understand, or influence these public processes any longer, because all the data is proprietary inside Facebook and its peers.
  • In 2009, I ditched Linux after years of being an evangelist and told everyone that the general-purpose computer operating system was done; Unix had won, but in the process it had also receded into the background. Its “small-u unix” centrality to smartphones was where things were really going, not the “large-U Unix” of fully articulated computing environments like desktops and LANs. There was no reason to work on “maintaining an OS install” any longer or to worry about desktop applications, integrated computing environments and windowing systems, etc. I switched to Mac OS and a minimal set of applications and stopped working on any kind of scripting, development, and so on, in anticipation of Mac OS becoming an appendage of iOS and the mobile ecosystem. Nearly ten years later and here we are—Apple is unifying the app ecosystems of iOS and Mac OS, desktops and laptops are dead in the water in their traditional understandings, and the need to own a “computer” has been entirely obviated for most consumers. Once again, people thought I was insane early on. Now my position is the taken-for-granted one.

I’m tooting my own horn here, yes. And there are more examples, but I won’t belabor the point. I see things. I understand things. I am an astute student of society. I make connections and have insights somewhere between 10 and 20 years ahead of the curve. That’s great.

Here’s what’s not great, and where I don’t toot my own horn. I lack any skill or mechanism for turning my insights into success or benefit for myself or for anyone else. If it is laudable that I can understand things in advance and enjoy some accuracy in predicting social trends, it is less laudable that I lack the talents to actually turn these predictions into any sort of prescription or plan for what is to be done, much less to execute on such a plan in order to serve society or my family.

I have never managed to:

  • Convince anyone of any of my insights before they became self-evident years later
  • Start a business around any of them
  • Bring a product to market
  • Launch a research or activist project to study them or ameliorate the problems that I foresee

In short, I may be insightful, but I am completely, catastrophically ineffective.

This is not a matter of not having tried. I started multiple businesses in the ’80s and ’90s. All failed. During my stints in graduate school, I tried to build alliances and launch projects; I approached people and institutions and tried to secure funding and access. I secured nothing. I got the degrees, sure. I didn’t manage to launch the research centers, the collaborations, the working groups.


In more modest efforts, during my teaching years, I failed even to pitch classes on these topics successfully enough for them to be offered. My proposals were met with silence or even ridicule; I ended up teaching the 101s and the 201s and the 301s and so on. So I couldn’t even “profess,” as such, about what I knew.

In fact, my entire professional life—the things that I have managed to accomplish and to be compensated for—has been mundane. I’ve been a small-time role player, a small-time adjunct professor, a small-time trade nonfiction writer, and a small-time middle manager on small-time teams.

I went to grad school twice because I naively assumed (an assumption that I gradually weaned myself from by midway through my Ph.D.) that if I had fancy graduate degrees, they would somehow give me a platform of authority and access to leverage in pursuing and communicating about these things. Of course, that was not the case. That’s just not how it works. The degrees do not automatically open a space for you of any kind: business, teaching, research, or otherwise.

In short, I often feel as though I understand a great many things, but I simply cannot get things done or make things happen in relation to these understandings. So in the meantime, I support myself by being a boilerplate writer of things that other people ask me to write, and a competent computer operator doing computing tasks that other people need done, as I have always done. My vision is better than theirs, but somehow I work for them and often do work that I don’t quite believe in, because others, with less vision, are managing to drive revenue sufficient to pay someone (e.g., me) while I never have.

Now, rapidly heading into my mid-40s, I am struggling to figure out what changes I need to make to myself—what I need to learn and develop—in order to:

  • Become an entrepreneur and bring products to market
  • Start businesses to address needs and niches
  • Build coalitions and launch projects
  • Do anything of practical value to leverage the insights that I seem to have

Because it is worthless to be “right.” Worthless to me and worthless to everyone else. Value only exists if I’m able to do something about it. To date, I never have been, and time is running short for me to figure out what, precisely, my failing is.

But there is no history of this kind in my family or in my social circle. We are merely “thinking people.” The processes and methods of “acting” in society are foreign to us—so I do not have a base of experience, mentorship, or know-how to draw on. I have only the vague idea that there are obscure and magical “active” skills and practices that I do not possess. But what are they, in day-to-day behavioral terms?

This eludes me.

And I cannot seem to find them documented anywhere.

There is a basic set of social skills (I mean “social” in the broad sense) that I am lacking, and I am of the impression that these are not formally taught but are the product of informal, environmental socialization in the circles in which they obtain. But to find them? To find such an environment?

I don’t know where to start, nor do I have the sense that I will recognize these skills, or such an environment, once spotted; I may well have passed them over multiple times in my life without even realizing it.

This is my current quest: figure out who knows how to effectively act, rather than merely “see” and “say” all the time, then learn from them.

— § —

As an aside, perhaps before year’s end I’ll manage to make a post containing my predictions for the next 10-20 years. I have a few.