Matt Quirion

Jan 25

The 2nd Tablet

Have you ever heard of the adoption curve?  I first really became aware of it in a 100-level marketing class in college.  It’s something most everyone inherently knows, but marketing researchers really put a name to it and then started diagramming it. It looks something like this:
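Roughly, it’s a bell curve carved into five slices - Innovators, Early Adopters, Early Majority, Late Majority, and Laggards. If a picture isn’t handy, a few lines of Python can stand in for the diagram; the segment shares below are the textbook numbers from Rogers’ diffusion-of-innovations work, and the rest is purely illustrative, not anything rigorous:

    import math

    # Rogers' adopter categories and their textbook share of the market.
    SEGMENTS = [
        ("Innovators", 0.025),
        ("Early Adopters", 0.135),
        ("Early Majority", 0.34),
        ("Late Majority", 0.34),
        ("Laggards", 0.16),
    ]

    def bell(x, mu=0.0, sigma=1.0):
        # The adoption curve is classically drawn as a normal distribution over time.
        return math.exp(-((x - mu) ** 2) / (2 * sigma ** 2))

    # A crude sideways sketch of the bell, from -3 to +3 standard deviations.
    for i in range(25):
        x = -3 + i * 0.25
        print(f"{x:+5.2f} " + "#" * int(40 * bell(x)))

    # And how a market of, say, a million customers splits across the segments.
    for name, share in SEGMENTS:
        print(f"{name:>15}: {share:6.1%} -> {int(1_000_000 * share):>9,} customers")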

The curve represents the typical adoption of a good or service by customers.  The “Innovators” and the “Early Adopters” are the first folks attracted to something new.  You probably know at least a few of these folks.  They’re always up on the latest music, fashion, TV shows, tech, whatever.  These are the ones who pre-ordered the first-generation iPad after telling you it was coming for a year.  They knew ska was big before Swingers was released. They’d had a cronut before lines even started forming. And sometimes (often) they adopt things that never take off, like LaserDiscs or the Apple Newton. Maybe they bought a Segway without owning a city touring service.

Then there are people like me.

I’m almost always part of the “early majority.” And I’m almost always at the very front of that majority.  I didn’t get the 1st iPad.  I did get the 2nd generation iPad a week after it was released though.  I wasn’t among the first 1,000 users of Twitter, but I was using it way before mainstream media was trying to do stories about how you could explain it to your mom.  I was a “foodie” maybe a year before everyone was a “foodie.”  You get the picture.  I’m rarely at the forefront of anything in pop culture, but I can usually see it from where I’m standing, waiting to be convinced that the early adopters aren’t just off on another wild goose-chase on their Segways.

There’s a perceived split between Innovators/Early-Adopters and the Early Majority that many folks call “The Chasm.”  It’s the great distance a product/service/pop-icon must cross over in order to go from a niche player that the few love to the mainstream, money-making success that everyone knows and wants.  I’ve noticed something about myself: If I’m doing it, using it, listening to it, or watching it, whatever “it” is has probably only just crossed that Chasm.  Had I the stomach to play individual stocks, I’d watch myself more closely and play the stocks of the things I just fell in love with. Of course, I’d probably ruin the effect then.

But here’s something I just did that I think might be worth noting, given I so often sit on the side of the Chasm with my feet dangling: I bought a 2nd tablet.  Not as a replacement for the trusty iPad. Not even as an “upgrade.” I got it because I could see a need for another tablet; nothing fancy - just a tablet.

So I bought a Nexus 7. It’s gotten very good reviews and it’s inexpensive. Yeah, it’s Android rather than iOS, but it’s not full of bloatware like other non-iOS devices.  And besides, I’ve still got my iPad. When that finally dies, then we’ll probably get another iPad.  We’re too invested in the iTunes media empire not to do so.  But right now I don’t need or want another iPad.  I just wanted another tablet. So, Nexus 7 it is.

Given my tendency to walk with the vanguard of the early-adopter army, I have to at least wonder if that might mean something about the tablet market in general.  My first purchase of a tablet wasn’t a purchase of a tablet. I was buying an iPad.  I practically got goose-bumps as I handed over my debit card at the Apple Store the day I bought it.  I couldn’t wait to see what possibilities lay before me.  I couldn’t wait to show it off to people I knew who still didn’t have one. I had something others might covet: an iPad! 2! It was embarrassingly thrilling.

But last night I just wanted another tablet.

Oct 25

“The internet makes human desires more easily attainable. In other words, it offers convenience. Convenience on the internet is basically achieved by two things: speed, and cognitive ease. If you study what the really big things on the internet are, you realize they are masters at making things fast and not making people think.

Here’s the formula if you want to build a billion-dollar internet company. Take a human desire, preferably one that has been around for a really long time. Identify that desire and use modern technology to take out steps.” — Ev Williams (via Wired.com)

(via courtenaybird)

Oct 10

Twitter, The Mobile Gap, and an External Locus of Control

The weather is getting colder, so my mind seems to be paying more attention to things other than the outdoors.  Twitter’s IPO is a big topic lately. Much excitement and much anticipation. And many questions.  One question in particular keeps jumping out at me from the internet while I read: How will Twitter improve their mobile advertising performance?  It’s a question that keeps getting asked by reporters, analysts, and pundits.  And it’s the wrong question.  To address the real problem, everyone should be asking: How will Twitter get the rest of the world wide web to solve its mobile performance problem?

The last thing I bought on the internet, I bought because of Twitter ads.  It was a Lego set.  And it’s awesome.  I saw an advertisement for it in my timeline on my smartphone and before I even tapped the ad, I knew I’d buy it. In fact, I didn’t tap that ad at all.

Instead, I switched over to my todo app and made a note: “Buy holiday lego set.”  And then later that same evening, I went onto Amazon.com (not Lego.com) and bought the set.  There’s virtually no way Twitter gets any real credit for that sale.  And that’s a shame, because Twitter should get a lot of credit.  From what I’ve seen, Twitter, as much as any company on the web save for Amazon, has me figured out dead to rights with its choice of ads.  From software development tools and services to dad-tech to, well, Legos, Twitter puts ads into my timeline that are far more interesting to me than practically anything I’ve ever seen on a Facebook page or even in most Google search results.  But I’ve still not bought anything directly through a Twitter ad on my phone.

That’s not Twitter’s fault, really.  Twitter’s various moves over the last few years, from creating rules that would all but eliminate any Twitter clients outside of their control, to creating their “card” technology, are all focused on controlling the Twitter timeline ad experience.  The problem for Twitter - and for so many of the advertisers and marketers who pay to get on those timelines - is what happens after a user taps a Twitter ad.  The problem is here. And here. And here. From poor visual design to poor network optimization choices, the world wide web outside of Twitter is just not built to support Twitter’s mobile ad success. Not yet.  The mobile web - where most of Twitter’s ads go - is just atrocious.  So users like me choose to avoid it when we can, even if that means waiting until later in the night to do our buying on a laptop. On a completely different site from the one that was advertised.

Seemingly every month or so we see news from Twitter’s product group showing off new ways to use Twitter’s internal ecosystem to improve advertising performance from within.  That course, however, runs the risk of falling into the trap of trying to become a platform that users will never leave.  That course has already been tried multiple times, from AOL to Facebook, and it’s never really worked.  People are going to leave the platform.  Advertisers and marketers are going to want full control over the experience.  

So instead of continually looking inward to fix their mobile advertising problem, Twitter should look outward and ask if they can help.  One of the few applications that still works fairly well on my now ancient HTC Thunderbolt device is the Twitter app.  In fact, even the Twitter mobile website works pretty well on my phone.  Twitter clearly has technical know-how that they could share with the rest of the world wide web.  If Twitter wants to fix their mobile ad problem, they should get to sharing immediately.

Feb 11

LinkedIn E-Mails Addressed to Top 5% are Directed at the Other 95%

There’s been a ton of conversation in social media circles about LinkedIn’s new marketing campaign.  LinkedIn has been notifying users that they’re among the top 5% (and even 10%, now) of viewed profiles on the social network.  Some mentions of the campaign bemoan it. Other mentions seem to make attempts at a “humble brag” about it. And still others seem to genuinely think it’s pretty cool.  But what I love about the campaign is that these e-mail notifications aren’t aimed at the receivers of the e-mails at all.  These congratulatory messages are targeted at the LinkedIn users who aren’t getting them.

One of LinkedIn’s most valued assets is the data it possesses about each of its registered users.  As of January 2013, LinkedIn has 200 million registered users.  Most of them have at least indicated their current career situations and probably their work history.  But there’s so much more a registered user can do with their profiles, accounts, and activities. And as LinkedIn constantly likes to imply while it encourages those activities, doing these things could help a user’s career.

So when 10 (or 20) million LinkedIn users start talking about the congratulatory e-mails they’ve received about the interest in their profiles (which, presumably, has some notion of career benefit attached), it leaves the other 95% who never received such an e-mail to wonder: What can I do to enhance my profile on LinkedIn and help my own career?  After all, even after people realize 5% of 200 million is 10 million, the other 95% must ponder what they’re failing to do to be among that massive pool of “better” profiles.

LinkedIn’s answer: Add more data about yourself and get more active on their social network. (Read: Give us more data.)

It’s a brilliant move.  It’s Judo Marketing.  And yet I can’t help but recall this Onion piece on “personal branding” even as I enjoy this gambit.

Aug 11

“This movement has been gaining momentum for more than a decade. Human beings who make investment decisions based on their assessment of the economy and on the prospects for individual companies are retreating. Computers—acting on computer-generated market trend data and even newsfeeds, communicating only with one another—have taken up the slack.” — Raging Bulls: How Wall Street Got Addicted to Light-Speed Trading | Wired Business | Wired.com

(via thisistheverge)

Aug 07

Why Are We Still Consuming News Like It’s 1899? | benhuh!com

shaneguiter:

The limited amount of space on news homepages and their outmoded method of presentation pose big problems for the distribution of news as well as consumption by the public. Even though it’s been more than 15 years since the Internet became a news destination, journalists and editors are still trapped in the print and TV world of message delivery.

The traditional methods of news-writing, such as the reverse pyramid and the various “editions” of news, pose big limitations on how news is reported and consumed. Unfortunately, internet-based changes such as reverse-chronological blogging of news, the inability to archive yesterday’s news, poor commenting quality, live-blogging, and others have made news consumption an even more frustrating experience.

Jul 04

“When the country elected Barack Obama just four years ago, Twitter was a fledgling startup. During the campaign, Obama overtook Kevin Rose as the most followed person on Twitter, passing him at 56,482 followers.

Five years ago, according to Pew, less than half of Americans used email daily; less than a third used a search engine.

YouTube was founded in 2005 and Facebook in 2004 — and it would be a while after that until they became such integral parts of our day-to-day Internet experience.

Today nearly half of Americans own a smartphone. The iPhone is five years old.

” — Technology - Rebecca J. Rosen - 59% of Young People Say the Internet Is Shaping Who They Are - The Atlantic (via infoneer-pulse)

Sometimes you have to stop and look back to appreciate the rush.

(via emergentfutures)

Jul 01

Why Google Wants You To Look Geeky

Last week during Google’s big I/O event, Sergey Brin, co-founder of Google, “interrupted” the keynote of the event’s first day in order to show off his personal pet project: Google Glass.  Glass is a technology integrated into an eyeglass frame to allow “wearable computing.”  Brin showed off Glass by way of an impressive multi-person, multi-modal stunt involving skydivers, stunt bikers, and people abseiling down the side of a building, all demonstrating the sharing capabilities of Google Glass.  The stunt impressed a great many folks, and it was followed up by an opportunity for members of the media to briefly experience wearing Google Glass for themselves.  Quite a few of those folks left the experience impressed, going so far as to exclaim that they’d seen the future of computing. And, conveniently for Google, they’re buying the humanistic marketing pitch for Glass: the project moves technology out of the way of communicating and experiencing life.  But that’s not really why Google is racing toward a dominant position in wearable technology.  The real reason is that if Google can get “technology out of the way,” then Google can marginalize Apple’s primary competitive advantage.

Last month, Apple unveiled the newest version of the iOS operating system, and much was made of the fact that Apple had dropped Google Maps for its own proprietary mapping technology, built in partnership with a smaller online map purveyor.  That step, along with the previous introduction of Apple’s “Siri” technology - a voice-activated digital assistant that can essentially search for answers to your questions on the iPhone and iPad - represents a set of moves designed to negate Google’s influence over iOS.  And given Apple’s dominance of mobile computing, those moves also provide Apple with opportunities for market dominance in the search and mapping service industries. You can imagine how Google must feel about being cut out of the dominant mobile computing platform.

Read any single review of any smartphone or tablet computing device from the last 2 years, and the benchmark against which all other machines are judged is Apple technology - the iPad and the iPhone.  And very rarely do any non-iOS devices make par. Rarer still are the devices that might make a reviewer gush that they’re better than the iOS options without any caveats.  The reason for that is the massive competitive advantage Apple enjoys in the realms of design and human-computer interaction.  With the slick, tightly controlled iOS environment, and the existence of only 2 form factors for iOS devices, no other technology maker can build a device that will be as pain-free and enjoyable (not to mention sexy) to use as the iPad or iPhone.

In the smartphone and tablet computing world, the winner of the game is the competitor who makes the most beautiful, elegant device.  That’s why Google wants to end the beauty pageant.  Sure, the current iterations of Google Glass are “geeky looking.” No beauty contests are going to be won by them. But for centuries, people have been wearing glasses. And over time Google will be able to miniaturize the technology behind the Glass product until it looks like any other pair of fashionable frames.  Who knows, maybe a pair of Google Glass contact lenses isn’t out of the question.  At any rate, eventually nobody will notice the glasses.  Which means nobody will be noticing the device.  Which means nobody will care about the form of the device any longer.  All anyone will care about is the service the device provides. Given its history, that’s a game Google has to look forward to playing.

Google Glass and its successors won’t eliminate tablets and smartphones.  Brin has already conceded that point in his discussions of the project.  But there are still only 24 hours in any day, and any day involves only a finite number of times that a person actually needs or wants to seek information, exposition, or entertainment.  By providing Glass, Google will be offering people a way to get all that without having to pick up a tablet or a phone.  From Google’s point of view, if Glass takes off, Apple can hang on to its market dominance only in the moments when people care about how the tech they’re using looks and feels.

Jun 11

Apple Killed the App Store Star (And the App Era Too)

Right, sorry for the link-baity title.  But the thesis holds true. Long term, anyway.

Today the keynote of Apple’s WWDC was presented, and a slew of new Apple products from hardware to software were unveiled.  Despite the fact that the hotly anticipated Apple Television (not to be confused with the Apple TV box unit) never did make an appearance, anyone who followed the presentation or at least read up on the results would find it hard to argue that Apple didn’t come out with guns blazing.  The pricey new hardware is as beautiful as it is expensive.  The slick new OS X Mountain Lion features are as pretty as the retina display on the new MacBook Pro. And the new integrated apps in the soon-to-arrive iOS 6 look brilliant… and really familiar.  In fact, those new integrated apps are so familiar because they mimic a great deal of the functionality of some of the most popular third-party apps in the iOS ecosystem.  And by taking the ideas that were germinated and then perfected in those third-party apps and integrating them tightly with iOS 6, Apple killed those apps - and the entire ecosystem too.

First, let’s just accept that there is no finish line in this game. There’s no eventual winner. Apple’s the king of the tech and business world in so many ways right now, it’s not worth trying to argue that what’s coming is Apple’s demise. But this matter is bigger than Apple anyway, and Apple will do just fine finding its way in the post-app era. It probably just won’t be too eager to see that era arrive.

Today, by eating the best apps cultivated in its own ecosystem, Apple made a few declarations: 1) Even though Apple creates beautiful objects with beautiful interfaces, they’re pretty unclear about what people actually want to do with pretty little things. 2) Apple feels no qualms about snatching success from its ecosystem developers, even when those developers show extreme loyalty to iOS. And 3) Apple sees no reason not to anoint winners in Software-as-a-Service categories and then integrate them tightly with their own applications as partners, the rest of the ecosystem be damned. iOS is not an open playing field or a level one. It’s just a playing field where the rules are up to the hopefully benevolent dictator, and all the players are at the mercy of that dictator’s market analysis and app store rankings.

For some time now, a common meme among web development professionals has been that the web is the most hostile development environment in the history of computing, thanks to the various takes on “standards” by the array of browsers on the market, the seemingly endless niche scenarios exposed by the seemingly endless tool choices, and the exposure to the population at large with only rudimentary access controls made available by the web platform itself.  But for all its flaws, the web has never been hostile in the way that iOS is today.  With the declarations Apple just made, the iOS platform (and really any other proprietary OS) is hostile to innovation.  And that hostility will kill the ecosystem.

The iOS ecosystem won’t turn into a pumpkin at midnight tonight. It will continue to thrive for some time, and the extrapolation of near-term data on iOS development and usage will make for easy arguments that the app era is only just beginning, but the signs of strain are easy to find.  There’s already a dedicated piece of lingo for the notion of having Apple take your software idea and ship it with iOS out of the box: getting “Sherlocked.” It’s been around for a couple of years now.  And today there was a seemingly endless font of the “s-word” springing from the Twitter accounts of various well-known iOS developers who had, for a time, found a happy little niche market that made for a nice living until Apple came in and ate it.  The cracks are starting to develop. No doubt some of these highly capable iOS developers will just look to create another clever iOS app, but it’s just as possible that folks with such skills might choose to apply themselves to an ecosystem not wholly owned and dictated by the monster that just ate them.

One admirable quality of many of the best iOS app developers is that they treat their work like a craft. They often seek elegant, beautiful, and innovative solutions to common problems.  Many of them are prolific writers on the problems they’re solving and their efforts to do so. And while the monetary payoff is a prime motivation for these people, it’s clear that the creativity and innovation involved in their work is what keeps them going.  That innovation has value to these developers too, and Apple’s been keen to let them go right on innovating until it sees fit to take all that innovation and use it for itself.  And all it cost Apple was 70% of the third-party apps’ purchase prices from the App Store.  That’s a heck of an ROI on market research and development.

And when such people, with their clever ideas, come to an intersection with better tools for developing advanced applications on the web, that’s when those cracks in the app ecosystem are going to give way.  Ideas are malleable on the web in a way they can’t be on a proprietary OS. The ever-popular “pivot” is something that can be executed at relatively small cost on the web versus the OS.  There’s no need to conform to a beautiful but restrictive set of UI guidelines on the web. Nobody worries about a web app meeting some form of “approval” beyond whether or not users find value in it. And on the web, as long as you put in the work, your app will always be discovered by someone.

Oh. Yeah. Discoverability - the gaping hole in the proprietary OS ecosystem’s polished armor.  Funny how everyone waited to hear if Apple was getting into televisions when it hasn’t managed to solve its biggest problem yet - the ability for users to easily, intelligently, and at times serendipitously discover great apps in the ecosystem.  Apple has now reached 6 full iterations of its iOS platform and still has nothing better than a few small improvements over the years to its base App Store.  And Google Play? From the company that provides the de facto standard in web-based search?  Yeah, forget it.

The gold standard for application discoverability was invented years ago by Tim Berners-Lee, before apps were even really a consideration.  That standard is the web, and it will be the benefactor to generations of web-based applications that learn to harness its power. No proprietary OS ecosystem will ever match it, because to do so would mean implementing an internal, proprietary version of the web - something third-party app creators would never agree to adopt.  And so discoverability will not only remain a weakness for app platforms, but a major competitive disadvantage.

Proprietary app ecosystems won’t fade because developers explicitly leave them. The getting for so many app developers is so good - for now. But the ecosystem’s best developers are being chased away. The web app ecosystem will simply begin to thrive as the tools improve, the skills advance, and the opportunity to be discovered remains open. Eventually, developers looking to build something will seek open spaces where their innovations may grow without having to submit to the indentured servitude of the digital age.

Jun 06

Cutting The Cord

With relatively rare exception, I turn on my television and my DirecTV box every night to see “what’s on.” Note that I’m rarely turning it on with “appointment TV” in mind. I’m almost always disappointed. Who could blame me? For about $90 a month, I pay DirecTV for the privilege of taking up about 85% of my DVR space with Sesame Street, Super Why, and Dinosaur Train for the kids (all from PBS). And as an added bonus, the best thing I’ll see on TV that entire week might be something like The Killing - a show that (ask any of its regular viewers) clearly hates its audience.  This sort of behavior is unhealthy, but more importantly to me, it’s bad economics.  Near as I can tell, I’m paying about $90 a month for about 300 channels that almost never interest me, so I’m ending that now.

There’s been a lot of discussion about consumers “cutting the cord” from cable and dish television providers. Yesterday an entire site dedicated to a sort of social-media petition asking HBO for a stand-alone HBO Go cropped up.  And recently, during the All Things Digital 10 conference, quite the kerfuffle was made over various comments by Hollywood super-agent Ari Emanuel that essentially boiled down to “à la carte models won’t work” and “people don’t want to pay for anything.”  And along with that, he made a lot of misinformed comments about magical anti-piracy technology that the likes of Google could and should implement if they were really willing to help protect his industry. (Yep, his biggest angle was protectionism.)

I’ve seen a lot of numbers on the subject lately, and the ones I find more credible support the argument that the “old models” of Hollywood are dying, but I’ll grant that those holding the opposite opinion have their own data too.  Instead, I’d rather just speak to my own logic for deciding to cut the cord from these old models of televised entertainment.  When it comes to money I spend on entertainment, I’m seeking value.  When I’m connected to the old model via a cable or dish subscription, I’m paying money for entertainment that’s ostensibly available 24/7. If I turn the TV on at any hour, there’s something I’m paying for displaying on the screen. Of course, whether or not it’s actually entertaining or informative is an entirely different question - one that’s often answered with a resounding “no.”  So in essence, I’m paying $90 a month to have by-and-large non-entertaining, non-informative noise pushed to me, whether I’m trying to receive it or not.  What’s worse, when I do give up on trying to find something compelling to watch, but choose to watch “something dumb” on television using the old model, I’m paying for it both with my subscription money and my time. And time is my most valuable resource in this equation by far. Cable and dish network television make it entirely too easy to waste that resource.

So instead, I’ll save that $90 a month and use services like Netflix to provide children’s entertainment when I (God help me) want to have the kids watch TV.  Luckily we’ll still have PBS available as well - Sesame Street is actually better today than it was when I was a kid.  And for my own entertainment, I’ll be using things like Apple TV to occasionally purchase things I really want to watch, like the Game of Thrones series.  At a pure, per-unit rate, such viewing might cost me more, but at least I’ll feel like I’m getting some sort of value for both my money and my time, and I’m not subsidizing things like Operation Repo on truTV.  I don’t even really know what either one of those things is, but I know I don’t want to help pay for them.
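For what it’s worth, the back-of-the-envelope math holds up even if my numbers are rough. Here’s the sketch; every price in it is my own guess for 2012-ish services, not a quoted figure:

    # Cord-cutting economics, back of the envelope. Every price below is
    # an assumption of mine, not a quoted figure.
    CABLE_PER_MONTH = 90.00      # roughly what DirecTV costs me today
    NETFLIX_PER_MONTH = 8.00     # a streaming plan, mostly for the kids
    SEASON_PURCHASE = 30.00      # e.g., buying a season of a show outright
    SEASONS_PER_YEAR = 4         # shows I'd actually pay for at that rate

    old_model = 12 * CABLE_PER_MONTH
    a_la_carte = 12 * NETFLIX_PER_MONTH + SEASONS_PER_YEAR * SEASON_PURCHASE

    print(f"Old model:   ${old_model:,.2f}/year")
    print(f"A la carte:  ${a_la_carte:,.2f}/year")
    print(f"Difference:  ${old_model - a_la_carte:,.2f}/year")

Even paying a premium per show, the à la carte column comes out ahead by hundreds of dollars a year - and that’s before counting the time I’m no longer wasting.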

So here’s a warning to all the moving picture content companies seeking to protect their old business models.  It comes with no malice or ill intent: I am not the consumer type you want to dismiss. I am not an early adopter. I’ve never pirated a single movie. Throughout my life, from AOL to broadband, from Napster to Spotify, from wine appreciation to craft brews, I’ve proven to be part of the “early majority.” If I’m making the move, it might not yet be too late to change your models, but it will be before too long.

This evening, while scanning Twitter and not watching TV, I saw this tweet re-tweeted many times over…

HBO faces the same issue that studios do with premium VOD. At what point do you set yourself up for the future while you shiv the past?

— Jason Hirschhorn (@JasonHirschhorn) June 6, 2012

It’s a tricky question, but there’s a deceptively simple answer, provided by the music industry: You shiv the past before the future shivs you.