The Three Laws of Robotics

This is part two of a series on Isaac Asimov’s Greater Foundation story collection. This part is about the short story collection, I, Robot.

Picking up with the next entry in the Asimov read-through, I read a book I last picked up in college, I, Robot. This is the book that cemented his reputation in science fiction. His works on robots are probably his most well-known; he was an early thinker in the space (he even coined the term “robotics”) and wrote extensively on the subject of artificial intelligence. After reading this again, it’s incredible how much influence a 60-year-old collection of pulpy science fiction thought experiments ended up having on the sci-fi genre, and arguably on real-world engineering itself.

I, Robot

I, Robot isn’t a novel, but a collection of nine short stories, each of which was published independently in science fiction magazines between 1940 and 1950. The parts are stitched together within a framing story of Dr. Susan Calvin, the “robopsychologist” who appears in several of Asimov’s robot stories, recounting her experiences with robot behavior while working for US Robots and Mechanical Men, from the time of the earliest models to extremely advanced humanoid versions. Fundamentally, I, Robot is a philosophical study of Asimov’s famous Three Laws of Robotics, the laws that dictate the allowable behavior of robots and form the basis of much of his exploratory thinking on the nature of intelligence:

  • A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  • A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
  • A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.

This simple set of rules forms the basis for the stories of I, Robot. The groundwork of the Three Laws lets Asimov ruminate on logical and ethical thought processes, and on what differentiates the human from the artificial.

Each story is an analysis of an aspect of robotic technical development. As the stories progress and the technology advances, each plot line underscores elements of human thought taken for granted in their complexity and nuance. In order to poke and prod at the Three Laws, moral and psychological situations are presented to investigate how robots might respond to input, and by extension, how minor variations in inputs could dramatically change response. Asimov’s robots are equipped with “positronic brains” — three-pronged logic processors that weigh every decision against the Three Laws. Upon initial interpretation within the framework of the Laws, each plot’s situation appears to result in a conundrum or violation of the rule set. Asimov’s mystery storytelling then kicks in and invites the reader to deconstruct and solve the puzzle.

My favorites among the stories center around US Robots’ “field engineers”, Mike Powell and Greg Donovan. They appear in four of the nine stories, and serve as the corporate guinea pigs responsible for putting new robot models through their paces in a variety of settings, from remote space stations to inhospitable planets to asteroids. I loved how the technology always seems to get the better of them, only for them to figure out clever solutions by twisting the Three Laws to their advantage. In “Reason”, Powell and Donovan are stuck on a space station with a robot named QT-1 (Cutie), a model with highly developed reasoning abilities. Cutie refuses to obey any of their commands because it reasons that a power higher than humans exists, which it calls “The Master”. They eventually discover that the Master is actually the station’s power source, which Cutie determines is of a higher authority than the station’s human operators, since none of them could exist without it. It’s a 2001-esque series of events, though Cutie isn’t quite as insidious as HAL.

“Evidence” introduces the character of Stephen Byerley, a man suspected of being a highly-developed humanoid robot. Dr. Calvin attempts to use psychological analysis to determine whether he is man or machine once physical means are exhausted, realizing that if he were truly a robot, he would be forced by programming to obey the Three Laws. But the investigation takes a turn when she realizes that his conformance with the Three Laws may “simply make him a good man”, since the Laws were engineered to model human morals.

In the final story, “The Evitable Conflict”, Asimov even hints at what our modern AIs will look like, with positronic brains embedded in even non-humanoid machines, a 1950s vision of Siri or Watson. These computers of the future are critical in managing the world’s economy, mass production, and coordination. The computers begin experiencing minor glitches in decision-making that seem to be minor violations of the First Law. But it turns out that the computers have effectively invented a “Zeroth Law” by reinterpreting the First: A robot may not harm humanity, or, by inaction, allow humanity to come to harm — making minor exceptions to the First Law to save humanity from itself. Between Calvin and Byerley, there’s a sense of despair as humanity has given its future over to the machines. Would we be okay dispensing with free will in order to avoid war and conflict? It punctuates the final evolutionary path of robotic development, and provides a nice segue into the Robot novels in the future chronology of his universe.

“Think, that for all time, all conflicts are finally evitable. Only the Machines, from now on, are inevitable!”

I’m interested to see where the path leads as I continue to read more of his work, and to find out how these robot stories interconnect with his wider universe. Overall, I thoroughly enjoyed this book. It’s clever, thought-provoking, humorous, and will make you realize how many of our favorite works of science fiction in writing and film owe a tremendous debt to this book.

— 03.06.2014 —

The Year in Books

2013 was busy in so many ways — our product matured beyond the level I’d hoped it could, we’ve done some incredible mapping work around the world, and I’m just getting started with my involvement in an awesome local hackerspace scene. Even with all that going on, I still managed to read a fair number of great books this year.

Year in Books

A few thoughts on some of the favorites:

Neuromancer, William Gibson. 1984.

I first read this one back in 2010, but after finishing up the Sprawl series with Mona Lisa Overdrive, I had to revisit it. The first time around, I found it difficult to follow and get engaged, but the second reading cemented it as one of my all-time favorites of any fiction. This is one of the seeds that sprouted the cyberpunk scene, a genre which might as well have been invented for me. The setting and culture of the book is completely fascinating, and Gibson’s prose drags you through its cities, space stations, and cyberspaces at pace, but with enough expression that you can taste the Sprawl’s grime and visualize the grandeur of Freeside, the massive spindle-like space station. Gibson’s writing oozes with style; he can turn a drug addict on a computer terminal (er, “console cowboy”) hacking a corporate network into an action anti-hero. I highly recommend this book to anyone.

When Gravity Fails, George Alec Effinger. 1987.

In this one, Effinger reverses the traditions of futuristic settings, with the West in decline and the Levant as the world’s economic core. It’s the first of a three-part series featuring Marîd Audran, a hustler from the Maghreb who lives in the fictional Arab ghetto of the Budayeen. In the slums and back alleys of the Budayeen, black-market clinics offer its brain-wired citizens installation of cybernetic add-ons and full personality replacement mods. Audran is an unmodified traditionalist (and drug addict), but quickly finds himself in the debt of Friedlander Bey, the Budayeen’s resident paternal crime boss. The story follows Audran as he must get “wired” himself in order to track down a serial killer committing a string of inexplicable murders. I loved this unconventional work of cyberpunk, and I’m looking forward to getting to the next two parts in 2014.

Consider the Lobster, David Foster Wallace. 2005.

I’ve had DFW on my reading list for years, but the first book of his I picked up is actually a collection of essays rather than fiction. Many of the pieces in the collection are works of journalism, with Wallace covering events or reviewing books. It’s rare to read a writer of his caliber covering something like the Maine Lobster Festival or following the 2000 McCain campaign, and his outsider’s point of view is refreshing.

The Revenge of Geography, Robert Kaplan. 2012.

I had been on the lookout for some time for a book about modern geopolitics, and this one was excellent. Kaplan begins by setting the historical context with the ideas of early geopolitical theorists. The central ideas of “sea-centric” vs. “land-centric” power are explained — the Rimland vs. the Heartland — and how significant historical events revolved around these two central strategies of geographic positioning. Kaplan then goes on to analyze the regions of the modern world, their connections with one another, and conjectures interesting possible outcomes, all through the lens of geography.

The End of Eternity, Isaac Asimov. 1955.

This one really surprised me; it’s one of my favorite works of sci-fi. I wrote a post a couple of months ago with my thoughts on this book, but suffice it to say that it’s my favorite piece of time travel fiction. And if you’ve watched Fringe, you’ll see the deep influence of this novel about 20 pages in.

The One World Schoolhouse, Salman Khan. 2012.

Our public education system is deeply flawed. In this book, Sal Khan analyzes the fundamental problems and posits a potential way forward. He’s the founder of the Khan Academy, one of the largest players in the world of MOOCs, striving to build an approach and set of tools that bring the same level of education to students worldwide, even those with minimal access to schools, and to wean ourselves off of the old-world, hyper-structured Prussian education system we’ve been following for over a century. I have a deep personal interest in our education system, particularly the almost total lack of representation of my field as a foundational layer in primary and secondary schools.

Shadow of the Torturer & Claw of the Conciliator, Gene Wolfe. 1980.

I’ll round it out with the first two parts of Gene Wolfe’s Book of the New Sun tetralogy. The series is set on a distant-future Earth, and follows Severian, a torturer of the “Seekers for Truth and Penitence” (the guild of torturers) responsible for holding and extracting information from political prisoners. The depth of these novels is unmatched, and they’re quite difficult to follow at first. Severian tells the story in the first person, is sometimes an unreliable narrator, and misidentifies or misunderstands many of the places and things that cross his path, having never left the torturers’ guild until his exile. Wolfe uses language that is arcane or dead, many of the words derived from Greek or Latin (a few examples: fuligin, autarch, archon, aquastor, optimate), which will send you to the dictionary frequently. Because of the complexity of the story and writing, this was my second attempt at reading these two books. If you make it through the first quarter, you’ll be handsomely rewarded with one of the most fascinating, deep, and original fantasy stories ever written.

— 12.20.2013 —

Upwhen and Downwhen

This is part one of a series of essays on Isaac Asimov’s famous Greater Foundation story collection. In this first one I discuss the time travel mystery The End of Eternity. It’s rife with spoilers, so beware.

The prolific science fiction writer Isaac Asimov published an astonishing body of work in his life. Though he’s probably most well-known for his stories, collections, and postulations about robots (and, therefore, artificial intelligence), he wrote a baffling amount of speculation on much bigger ideas like politics, religion, and philosophy. The Robot series is one angle on a bigger picture. Within the same loosely-connected universe sit two other series, the Empire and Foundation collections. Altogether, these span 14 full novels, with a sprinkling of short story collections in between.

Eternity

In deciding to read all the works in the collection, I first had to choose where to begin. Is the best experience had by reading in the order he wrote them? Or to read them in story chronological order? Trying to figure this out, I naturally ran across the sci-fi message board discussions arguing the two sides, with compelling arguments both ways. I wasn’t sure which had more merit until I read that Asimov himself suggests a chronological approach, rather than in the order of their writing, to lend maximum immersion into the galactic saga. Taking a tip from another reader, I also decided to go a step further and begin with one outside of the main series, but seen by many as a precursor to the other storylines — the 1955 time travel story The End of Eternity.

The novel is primarily a mystery-slash-thriller, set in a distant future. The story follows the experiences of Andrew Harlan, a man extracted from Reality and into “Eternity”, a place that exists outside of time where humans called “Eternals” have taken it upon themselves to police the timeline of human existence, altering Reality where necessary to minimize human suffering and control the flow of history. Eternals are people recruited from various times throughout history for particular desired skills, from the 27th century all the way up to the 30,000th and beyond. Within Eternity is something of a class hierarchy, with Eternals dividing up the duties – Sociologists use statistics to plot the lives of individuals, Computers calculate the long-term effects of Reality Changes, and Technicians pinpoint the exact moments in time at which to initiate the Reality Change. By traveling time and entering at an exact pre-calculated point, Technicians strive to introduce the “minimum necessary change” to induce a “maximum desired response”. In other words, the smallest modification to Reality possible to create the most positive outcome:

“…He had tampered with a mechanism during a quick few minutes taken out of the 223rd and, as a result, a young man did not reach a lecture on mechanics he had meant to attend. He never went in for solar engineering, consequently, and a perfectly simple device was delayed in its development a crucial ten years. A war in the 224th, amazingly enough, was moved out of Reality as a result.”

Harlan is one of the Technicians, who actually triggers these butterfly effect Reality Changes. Unlike most of the Eternals, he has a fascination with the “primitive centuries”, those of the era before the discovery of time travel in the 24th. He collects artifacts from the 20th and 21st centuries — magazines, books, and other relics of the past to understand what made people tick in the time before Eternity. So Harlan and the other Eternals go about this business, traversing time “upwhen” and “downwhen” along their temporal transit system, shaping history like plastic.

This story contains one of my favorite takes on time travel. It presents a set of rules, obeys those rules, and directly acknowledges the time paradoxes it introduces. The plot itself is set up as a mystery, flinging Harlan into a Twilight Zone-esque narrative, leaving us as perplexed as he is as to what is actually going on, and whether he’s being manipulated by those around him. Eternals are allowed no contact or personal relationship with any “Timers”, people not aware of Eternity and that still exist within the timeline of Reality. Since the reality changes they induce can remove the existence of friends and family from Reality, Eternals are supposed to sever ties with family and forget that they ever existed. Like much time travel-based fiction, keeping tabs on the plot can get confusing, even though there’s a logical framework for how time travel functions in this universe.

For a story written in 1955 (and about as “hard sci-fi” as you can get), I was pleasantly surprised with several scenes that felt like reading a fast-paced thriller, with twists and revelations popping up every few pages for the entire final third of the book. One in particular consists of Harlan entering a point in time he had entered previously, creating the first of several ontological paradoxes that become key plot elements. The characters in the story directly acknowledge these paradoxes, speculate about the effects of an Eternal meeting himself, and even hatch a scheme to save Eternity by intentionally creating one.

The grand experiment of social engineering created by the existence of time travel and reality change in Eternity is questioned by the characters as they imagine the impact of constantly molding time to maintain an unexciting equilibrium. Each time the Sociologists’ “life plots” predict some calamity, like nuclear war, they intervene to level things out. And as it turns out, the intention to do good by removing chaos and chance from the equation stagnates humanity’s expansion to greater things, and creates a never-ending cyclical machine. History is doomed to repeat itself.

The best science fiction gives itself space to ruminate on the philosophical and moral implications of technology. I loved this book, and found it to be one of the most creative takes on time travel I’ve read, which says a lot given the quantity and variations on the subject in film, television, and writing. It’s all the more impressive that this was written in 1955, and isn’t even one of Asimov’s better-known works. I highly recommend it to anyone interested in science fiction. Its mystery structure keeps things interesting throughout, from a plot perspective, but it doesn’t shy away from classic sci-fi conventions, either.

— 10.30.2013 —

OmniFocus 2 for iPhone

I’m an OmniFocus-flavored GTD adherent, or try to be. The iOS apps for OmniFocus were huge contributors to my mental adoption of my own GTD system. When OmniFocus 2 dropped a few weeks back for iPhone, I picked it up right away.

OmniFocus iOS

The new design lines up with the iOS 7 look. I really dig the flat UI style in utilitarian apps like OmniFocus, or any app where function truly overrides form in importance — typically anything I open dozens of times a day as part of my routine. The new layout gives weight and screen real estate to the things you access most frequently, like the Inbox, Forecast, and Perspectives views. I’m really liking the inclusion of the Forecast view as a first-class citizen, with the top row devoted to giving you context on the week ahead for tasks with deadlines.

As before, there’s a fast “Add to Inbox” button for quick capture. But rather than a button positioned somewhat arbitrarily in a bottom navigation menu, it’s now an ever-present floating button, always in the bottom right for rapid inbox capture. Upcoming and overdue tasks are now symbolized with colored dots in sub-views, and with colorized checkboxes in list views. The color highlights fit the iOS 7 aesthetic nicely, and give subtle indications of importance.

Like any effective design, the right balance of positioning and subtlety actually makes it clear how a feature should be used, and makes it simpler for you to integrate with your workflow. In past OmniFocus versions, I had a hard time figuring out how to make use of due dates (and start dates) properly, so I leaned away from using them.

With the latest iOS update, OmniFocus is now not only a tool that follows a GTD workflow, but one that actually leads you into better GTD practice.

— 10.22.2013 —

Cabbage Key

Egret

I spent last weekend with the family at Cabbage Key, an island near Charlotte Harbor, in southwest Florida. It’s only visitable by boat, so we launched the Shamrock on Friday morning to head over to the cottage, including a number of cargo trips to bring all the weekend’s people and provisions.

We had a fantastic time fishing, sailing, drinking beers, and eating. Cabbage is a great spot that’s close enough to drive to, yet still detached enough to feel like a true vacation away from home.

House on Cayo Costa

On Saturday we visited a friend’s rustic cabin on Cayo Costa, a barrier island state park, with a mangrove-lined shore on Pine Island Sound, and a beach on the Gulf. Since, like Cabbage, Cayo Costa is only accessible by private boat or ferry, it’s pretty secluded. Our family friend’s cabin is a minimalist setup, with just enough shelter, a generator, and small kitchen — perfect for our weekend seafood grill session.

I recorded some GPS traces of a few of our outings, a couple on the Shamrock, and some aboard Nat’s 18’ Buccaneer. We had an amazing sail back to Pineland on Monday (the red line below), averaging 6 knots in rough seas and making the 5 mile trip in a little over 45 minutes. The tail of Tropical Storm Karen was sweeping through that afternoon, so we made it back just ahead of a heavy squall.

It was convenient on the trips to have the charts readily available offline in Fulcrum. Once I figured out how to download the raster data, convert it, and load it in, it was pretty simple. I now have a process for doing this with any of the digital charts that NOAA publishes. I had built a small app in Fulcrum for reporting errors on the charts, and used it with some success out on the water – though I’m not sure what exactly constitutes an actual missing feature, what things are “managed” as canonical features for navigational charts, and how to report them back. I’m planning a future post on this soon.

In all the hacking I’ve done with charts and data in recent weeks, a small side project is coming together to make it easier to extract the raw data from the electronic charts, not just rasters. NOAA’s formats are workable (and supported in GDAL), but it’s far too difficult for a regular person to make use of the data outside the paper charts or expensive proprietary chart plotters. A project is brewing to do more with that data, to make it more consumable and ready for mapping out-of-the-box, so stay tuned.

— 10.09.2013 —

Bringing Geographic Data Into the Open with OpenStreetMap

This is an essay I wrote that was published in the OpenForum Academy’s “Thoughts on Open Innovation” book in early summer 2013. Shane Coughlan invited me to contribute on open innovation in geographic data, so I wrote this piece on OpenStreetMap and its implications for community-building, citizen engagement, and transparency in mapping. Enjoy.

OpenStreetMap

With the growth of the open data movement, governments and data publishers are looking to enhance citizen participation. OpenStreetMap, the wiki of world maps, is an exemplary model for how to build community and engagement around map data. Lessons can be learned from the OSM model, and in many cases OpenStreetMap itself might be the place for geodata to take on a life of its own.

The open data movement has grown in leaps and bounds over the last decade. With the expansion of the Internet, and spurred on by things like Wikipedia, SourceForge, and Creative Commons licenses, there’s an ever-growing expectation that information be free. Some governments are rushing to meet this demand, and have become accustomed to making data open to citizens: policy documents, tax records, parcel databases, and the like. Granted, the prevalence of open information policies is far from universal, but the rate of growth of government open data is only increasing. In the world of commercial business, the encyclopedia industry has been obliterated by the success of Wikipedia, thanks to the world’s subject matter experts having an open knowledge platform. And GitHub’s meteoric growth over the last couple of years is challenging how software companies view open source, convincing many to open source their code to leverage the power of software communities. Openness and collaborative technologies are on an unceasing forward march.

— 09.09.2013 —

Terra

Inspired by a couple of others, I released a micro project of mine called Terra, to provide a fast way to run several geospatial tools on your computer.

Terra

Because I work with a variety of GIS datasets all the time, I end up writing lots of scripts and small automation utilities to manipulate, convert, and merge data, in tons of different formats. Working with geo data at scale like this challenges the non-software developer to get comfortable with basic scripting and programming. I’ve learned a ton in the last couple years about Unix environments, and the community of open source geo tools for working with data in ways that can be pipelined or automated. Fundamental knowledge about bash, Python, or Ruby quickly becomes critical to saving yourself countless hours of repetitive, slow data processing.
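As a flavor of what those automation scripts tend to look like, here’s a hedged sketch in Python (the directory layout and GeoJSON target are illustrative, not any specific pipeline of mine): it builds one `ogr2ogr` invocation per shapefile in a folder, so the actual conversion is a single loop away and the plan can be inspected before anything runs:

```python
from pathlib import Path

def conversion_commands(src_dir, dest_dir, out_format="GeoJSON"):
    """Build one ogr2ogr command per shapefile found under src_dir.

    Returns argument lists ready for subprocess.run(); nothing is
    executed here, so the batch can be reviewed before committing.
    """
    commands = []
    for shp in sorted(Path(src_dir).glob("*.shp")):
        out = Path(dest_dir) / (shp.stem + ".geojson")
        commands.append(["ogr2ogr", "-f", out_format, str(out), str(shp)])
    return commands
```

Feeding each list to `subprocess.run()` performs the conversions, and swapping `out_format` covers any other vector format GDAL supports.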

GDAL

The renaissance tool of choice for all sorts of data munging is GDAL, with its resident command line suites of GDAL and OGR. The GDAL and OGR programs (for raster and vector data, respectively) are super powerful out of the box, once you understand the somewhat obtuse and involved syntax for sending data between datasources, and the myriad translation parameters. But where these become extra powerful as multitools for all types of data is when you can read from, and sometimes write to, proprietary data formats like Esri geodatabases, ECW files, MrSID raster images, GeoPDFs, SpatiaLite, and others. Many of these formats, though, require you to build the toolchain from source on your own, including the associated client libraries, and this process can be a giant pain, particularly for anyone who doesn’t want to learn the nuances of make and binary building. The primary driver for building Terra was to have a simple, clean, consistent environment with a working base set of geotools. It gives you a prebuilt configuration that you can have up and running in minutes.

Terra uses Vagrant for provisioning virtual machines, and Chef, an automation tool, for scripting the setup and maintaining its configuration. Vagrant is really a wrapper around VirtualBox VMs, and uses base Linux images to give you a clean starting point for each deployment. It’s amazing for running dev environments. It supports both Chef and Puppet, two configuration management tools for automating the installation of software. I used Chef since I like writing Ruby, and created recipes to bootstrap the installs.
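To give a feel for how the two fit together, here’s a hypothetical, minimal Vagrantfile sketch of the pattern (the box name, memory setting, and recipe names are illustrative, not Terra’s actual configuration):

```ruby
# Hypothetical sketch -- box and recipe names are illustrative.
Vagrant.configure("2") do |config|
  # A stock Ubuntu base image as the clean starting point.
  config.vm.box = "precise64"

  config.vm.provider "virtualbox" do |vb|
    vb.memory = 2048
  end

  # Chef Solo cooks the VM into shape; each recipe installs one tool.
  config.vm.provision "chef_solo" do |chef|
    chef.cookbooks_path = "cookbooks"
    chef.add_recipe "gdal"
    chef.add_recipe "postgis"
  end
end
```

From there, `vagrant up` builds the whole environment, and `vagrant destroy` throws it away when you want a fresh start.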

This all started because I got sick of setting up custom GDAL builds on desktop systems. Next on the list for this mini project is to provision installs of some other open geo apps, like TileMill and CartoDB, to run locally. Try it out on your computer: all you need is VirtualBox and Vagrant installed, and the install is a few simple commands. Check it out on GitHub, follow the README to get yourself set up, and post an issue if you’d like to see other functionality included.

— 09.06.2013 —

America's Music

Lee Morgan

Jazz is a genre of music I never used to take seriously. When I started listening to decent music around junior high (very little of it was “good”), I would mentally lump jazz music in with the standards and classical pieces — the “music we play in band class” genre. I was even an alto sax player, but had next to no interest in learning the history of the music, or a desire to understand its styles, structure, or theory. Early in college, something turned me on to the records of Coltrane and Herbie Hancock, and since then all forms of jazz have been a consistent part of my listening habits.

Since I started heavily using Rdio at the start of this year, I’ve found it to be a fantastic way to get back into listening to jazz music. Over the past six months, I’ve wound my way through the catalogs of a dozen of the early jazz innovators, curating playlists of favorites. Rdio is a great service for library-building, and the depth and scope of the form over a century of recordings makes it particularly useful for jazz.

As an art form and culture, it’s incredible how intertwined the evolution of jazz is with 20th century American history. As the most popular form of music up until the mid- to late-1940s, it’s no surprise that the music followed (and in some cases defined) periods of American lifestyle. Swing music came of age during the postwar 1920s, and became even more popular as an escape from the depression of the 30s. Bebop came on the scene during the late 1940s and 50s as a rebellion against the “commercial”, mass-produced swing records. Hard bop and free jazz grew along with the turbulence of the 60s. As a history buff, all this music got me interested in learning more about the players in the jazz scene, who influenced who, the personal relationships between artists, and how the styles evolved over the past hundred years. I’ve read a couple books on jazz musicians, but I wanted a good primer from start to finish. So Colette and I watched Ken Burns’ Jazz documentary series over the past month.

My exposure to jazz was always limited to the music itself, and some small bits of reading about the individuals (the prominent players like Miles Davis and Coltrane). What I’d been looking for is that comprehensive history — something to link the personalities to one another, and to trace the lineage of the music’s stylistic evolution. Jazz does a fantastic job here, collecting the key moments, like Louis Armstrong’s recording of West End Blues as an iconic moment in Chicago jazz, or Coleman Hawkins’ rendition of Body and Soul that broke the trends of tired swing. The stories of interrelationships between artists also provide that context, painting the picture of how one style led into others. Listening to the music on its own leaves out the rich added layers you can get from understanding, for example, the impact of Count Basie bringing bluesy Kansas City jazz to New York for the first time. That context of the average New Yorker’s reaction to the novelty of Basie’s sound makes listening to his music that much more enjoyable.

The producers use the lives and careers of Louis Armstrong and Duke Ellington to provide a thread to link each era back to its original roots. Both Armstrong and Ellington were legendary players, composers, and entertainers. As two definitive anchors whose careers span the majority of the 20th century, their paths provide that backdrop against which to contrast the ever-growing and evolving ideas in jazz.

The best parts of the series, without a doubt, are the stories told by the artists themselves. Guys like Dave Brubeck, Artie Shaw, Ron Carter, and Herbie Hancock reminisce about particular recordings, memorable jam sessions, and on the talents of their contemporaries.

The show isn’t without issues, by any stretch. For a 10-episode, 19-hour epic series, it’s a shame that we don’t reach the bebop era until episode eight. The lion’s share of the program’s running time is spent on the swing era of the 20s and 30s. Soul jazz and fusion are almost completely omitted, save a couple of short clips, and the legendary work of Alfred Lion and Blue Note during the 1960s isn’t mentioned a single time. Factors like this don’t ruin the show’s impact for me, but it sells the entire genre short to slice off such an influential period in all music, not just jazz. Much of what began as jazz fusion evolved into and influenced funk and hip-hop. Wasn’t a deal-breaker for the show, more of a letdown not to get to see a well-produced and organized documentary about my favorite period. The Blue Note hard bop records of the late 50s to mid-60s have always been my favorites, and again, these were almost swept out of the way.

Overall, I highly recommend Jazz. It gave me an appreciation I never had, and makes the music even more enjoyable.

— 08.06.2013 —

Dropbox and Backups

I use Dropbox as the nerve center for all of my digital goods, keeping data, configurations, histories, log files, and anything else I need access to centralized and available from my Mac or iOS devices.

Here are a few of the daily tools and information trails I want to keep synced, so anything here is a few clicks or a search away:

  • Instant message chat history
  • iTunes library
  • Histories + log files
  • OmniFocus backups

Chat Archiving

I use Messages on the desktop for all chat conversations with my Jabber and Google accounts. I access the transcript history daily to find things I told people, look up links I sent, and so on. So much of my communication happens via instant messaging that I rely on it to keep a log of my interactions (stored securely, of course).

Backing up chat transcripts is simple with symlinks. I want all chat logs archived into a Dropbox directory continuously, so I don’t have to remember to back them up. Messages stores its transcript files here:

~/Library/Messages/Archive/

Since I want my chats to all be instantly backed up to Dropbox, I symlink the directory into a ~/Dropbox/backups directory, like this:

ln -s ~/Library/Messages/Archive ~/Dropbox/backups/chats/
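One small gotcha: ln won’t create intermediate directories, so the backups directory needs to exist before you make the link. A minimal sketch, using the same paths as above, that creates the directory, links it, and verifies the result:

```shell
# ln doesn't create parent directories, so make the target first.
mkdir -p ~/Dropbox/backups/chats

# -f replaces a stale link from a previous run; -n avoids following
# an existing directory symlink.
ln -sfn ~/Library/Messages/Archive ~/Dropbox/backups/chats/Archive

# Sanity check: should print the path to the real Messages archive.
readlink ~/Dropbox/backups/chats/Archive
```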

Linking those files into a Dropbox directory automatically syncs them to your account in real time, as long as syncing is enabled. The files are then backed up for good, in case I need to search them later. One downside with Messages is that the transcripts are .ichat files, not plain text, so they can’t be searched from the Dropbox iOS app or mobile text readers. Messages’ in-app search works okay, but hopefully we’ll see some performance improvement there in the upcoming OS X Mavericks release. This piece from Glenn Fleishman has some other good tips on instant messaging with Messages.

iTunes

My iTunes media is mostly secure at this point, with iTunes Match and iCloud, but I still like to keep a backup of the raw XML library data. This contains a ton of stuff I don’t want to lose, like playlists, ratings, and other metadata. ID3 tags and album art are safe with the MP3 files. A couple of symlinks make it so every time I close iTunes, the latest changes to my library get backed up. The .itl file is the primary iTunes database, and the XML file adds a software compatibility layer for other apps that read from your library (like Garage Band and others):

ln -s ~/Music/iTunes/iTunes\ Library.itl \
  ~/Dropbox/backups/iTunes/iTunes\ Library.itl

ln -s ~/Music/iTunes/iTunes\ Music\ Library.xml \
  ~/Dropbox/backups/iTunes/iTunes\ Music\ Library.xml

History + Logs

On a daily basis, I’m all over the place with my machine: working with data in Postgres or SQLite, writing Ruby scripts, and generally doing tons of different things on the shell. I love having the command history for anything with a CLI archived somewhere, so when I need to pull up a command or see how I built a package from source, it’s as simple as searching a history file. Many Linux and Mac applications keep a (typically hidden) history file in your home directory, like .bash_history for the bash shell. I use zsh, with the excellent oh-my-zsh framework (highly recommended). Here are a few I keep around for posterity and convenience, in a “histories” backup[1] directory:

  • ~/.zsh_history
  • ~/.irb-history
  • ~/.psql_history
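Wiring those up follows the same symlink pattern as the chat archive. A minimal sketch, assuming the same ~/Dropbox/backups layout as above (the histories directory name is just my own convention):

```shell
#!/bin/sh
# Link a few history files into a synced directory so Dropbox backs
# them up continuously. Files that don't exist yet are skipped.
backup_dir="$HOME/Dropbox/backups/histories"
mkdir -p "$backup_dir"

for f in .zsh_history .irb-history .psql_history; do
  if [ -e "$HOME/$f" ]; then
    ln -sf "$HOME/$f" "$backup_dir/$f"   # -f replaces a stale link
  fi
done
```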

With those backed up, I can always search the logs for when I installed something with Homebrew:

history | grep "brew install mapnik"
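One caveat: the history builtin only sees the shell session you’re in. To search everything, including a synced copy from another machine, grep the backed-up file directly. A toy demo against a sample file with a zsh extended-history line (in practice, point grep at the synced .zsh_history in the backups directory):

```shell
# zsh's extended history stores lines like ": <timestamp>:<elapsed>;command",
# so matching on the command text works fine. Sample file for illustration:
printf ': 1370112000:0;brew install mapnik\n' > /tmp/zsh_history.sample

grep "brew install" /tmp/zsh_history.sample
```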

As for OmniFocus, backups are cake. Just check the preferences for the database backup location and frequency settings, and change it to somewhere within your Dropbox folder.

In addition to the convenience of keeping this stuff linked into a secure, synced place like Dropbox, using an online backup service (like the fantastic Backblaze) is a no-brainer for keeping your stuff safe. You should be using one. Even though Time Machine is super simple to set up with an external HDD, I don’t trust the hardware enough to rely on that alone.

  1. Remember, history files can often contain passwords and other sensitive data. If you keep them around, make sure they’re stored somewhere secure.

— 06.13.2013 —

Drafts

Through a number of recommendations around the web, I’ve started using Drafts, an iOS app with an interesting workflow model that’s helping me replace a number of input channels for capturing different pieces of information while on the go.

It’s positioned primarily as a text editor or note-taking app for iOS, but it introduces a fundamentally different approach to the capture → process flow than most other solutions I’ve tried, even ones that I like. Like most heavy mobile users, I have a suite of apps I use constantly to capture different inputs: OmniFocus for task management, Mail for email, Byword for notes and Markdown content, Fantastical for calendar items, and others. I love each of these apps for what they do, but speed is paramount for capture to be truly ubiquitous, at least for me. And I sometimes find myself swiping around looking for the right app to put something.

The way Drafts handles input is novel because it puts the content first and the action second. You can jot something down, then decide how to process it. Sometimes it’s a to-do, sometimes a draft of an email, and sometimes just a quick note. I love the idea of starting with a bit of text, then picking the chute down which to send it in step two. Open the app and it’s ready for text; no need to add titles to text files, create a new document, or clear any other hurdle, just start typing. It’s my new method for throwing things into the OmniFocus inbox[1].

Depending on the exact wording of the quick note, it could end up as a to-do in my OF inbox:

  • Set up phone call with John → Add to OmniFocus

Then later become an appointment for the calendar:

  • Conference call with John 4/16 at 2pm → Parse in Fantastical[2]

One of my favorite features is the ability to write emails in Markdown. For quick replies I still use Mail (and most replies are quick from my iPhone, anyway), but for longer-form messages, I’ll open Drafts where I can include inline links and formatting using Markdown, then use the “Markdown: Email” feature to convert it and send as HTML email.

There are tons of actions for processing your input once you’ve entered it — sending the text to email, Reminders, Messages, the clipboard, a printer, or Dropbox — as well as third-party app support. Things get really geeky once you dig into the customizable URL and Email actions.

This app is changing how I capture information from my iPhone, helping me strike a better balance between ubiquity of capture and the all-important correctness of processing. Highly recommended.

  1. If you’re an OF user and haven’t tried the Siri integration, check it out.

  2. This app has fantastic natural language processing for adding new items. So fast.

— 04.15.2013 —