The boardwalk near Okaka Hut at the high point of the Humpridge Track, photo by me
⛴ Ship
I started this year intending to publish an update here at least once per quarter, and managed to get the first two out on that schedule in April and July. But, then I got busy…
New Essay: Busy
During this year I’ve been slowly turning all of the writing I did in 2021 into these “evergreen” essays on RowanSimpson.com. There are now 65 completed essays.
This last one, published today, was first drafted in April, so it’s taken me eight months to finish. Did I mention … I’ve been busy!
I’m delighted to have these all in a format that’s more easily shared, although I do worry that y’all will soon get sick of seeing the same links re-posted over and over again each time these questions come up.
You can browse the full list, but I realise that just listing them in alphabetical order isn’t especially useful. That’s a work-on for 2023.
In the meantime, and in the spirit of “Top Three”, these are the three that have had the most interest in the last 12 months (based on page views):
🚧 Deconstruct
Atari 2600
The dark ages, circa 1985, from memory…
The state of the art, in our house at least, back then was an Atari 2600, a simple game console, with a slot for game cartridges which were sold separately.
We had a few of the classics – Pac Man, Space Invaders, Missile Command.
This was hours of fun for all of us. We just inserted the game cartridge we wanted to play and it magically appeared on the screen. It had a grand total of four switches – on/off, colour/black+white, game select and game reset. In other words we could pretty much turn it off and on, start and stop games and not much else, so there was no real learning curve and it was pretty bullet proof.
However, I naturally started to wonder: how does it all work on the inside? (as I’ve since discovered for myself, little people can be annoyingly curious, and I was no exception). Who made these games we were playing? And, how? I enjoyed playing the games we had, although to be honest I never was, and still am not, much of a gamer, but it felt like it would be more fun to try and make my own.
Around the same time I was given some old BYTE magazines, which were full of articles about “computers” like the ZX Spectrum and Commodore-64. At the back there were pages and pages of gobbledegook which were apparently the instructions you could type into these machines to make them do different and interesting things. That all sounded intriguing to me, so I started thinking of all of the things I could build and tried to convince my parents to let me buy one. However, they didn’t see the need for another “game machine”. To be fair, the distinction between a console which I could play games on and a more expensive computer which I could type the code for games into and then play was a bit subtle and I struggled to make the case.
Eventually I saved up enough money to take the decision out of their hands. I purchased a second-hand Commodore-16 off a family friend (he was no doubt upgrading to something even more powerful like a Commodore-128 or maybe even an Amiga?) and started to teach myself BASIC.
It was pretty slow going to begin with. My first project was to try and build a system that would emulate the statistics shown on TV during a one-day cricket game, with run rates for each batsman and Manhattan graphs and worms etc. It turned out to be far too ambitious. I eventually got it to work for a full 50 overs, but it would always crash at the change of innings. In hindsight I suspect that I may have needed more than 16K of memory to achieve my vision. But either way I never let it defeat me. There was always a new technique to learn (discovering if statements and while loops was a revolution!) and I enjoyed the challenge of creating something of my own from scratch.
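For anyone who hasn’t watched a one-day game, the core calculation is simple: the run rate is just runs scored divided by overs faced, tallied over by over. Here’s a minimal sketch of that idea in modern Python rather than Commodore BASIC, with made-up over-by-over scores purely for illustration:

```python
# Made-up over-by-over scores for one innings, purely for illustration.
runs_per_over = [4, 7, 2, 10, 6, 1, 8, 5]

total_runs = 0
for over, runs in enumerate(runs_per_over, start=1):
    total_runs += runs
    run_rate = total_runs / over        # runs scored per over, so far
    bar = "#" * runs                    # one column of a "Manhattan" graph
    print(f"Over {over:2}: {runs:2} runs   run rate {run_rate:.2f}   {bar}")
```

Trivial with today’s tools, obviously, but keeping track of that for two innings of 50 overs, plus every batsman’s individual figures, was plenty to exhaust both my beginner’s BASIC and 16K of memory.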
iPhone & iPad
Fast forward a few years, to circa 2010…
The state of the art, in our house at least, then was an iPhone and iPad - both simple mobile devices, running applications which were sold separately.
We had a few of the classics - Flight Control, Angry Birds, Shazam.
Again, it was hours of fun for all of us. We could just tap on the icon of the application we wanted to use and it magically appeared on the screen. They had a grand total of four switches – on/off, volume, mute and a home button. In other words we could turn them off and on, start and stop applications and not much else, so there was no learning curve and they were pretty bullet proof.
However, I naturally started to wonder: how do they work on the inside? Who made these applications we were using? And, how? While the applications we downloaded were pretty addictive, it seemed like it would be more fun to try and make my own.
I found a few websites with articles about developing applications. They were full of square brackets and semi-colons that you could type into a computer to create an application that you could then transfer to run on your device. That all sounded intriguing to me, so I started thinking of all of the things I could build. In the meantime I’d completed a BSc in Computer Science, so I wasn’t quite so naive. And thankfully this time around I didn’t need to convince anybody other than myself that this was a good idea. 🙂
Eventually I saved up enough time to begin experimenting. I installed Xcode and the SDK and started to teach myself Objective-C.
It was pretty slow going to begin with (it had been a few years since I’d had to allocate and deallocate memory, for goodness’ sake!). My first project was to try and build a power meter reader app which ran on top of the Powershop API. It probably took me about 10x longer to get it working than it should have, while I came up to speed with some of the unique problems of designing and developing for a mobile device and a touch interface. But, either way, I didn’t let it defeat me. There was always a new technique to learn (discovering autorelease was a revolution, for example!) and I continue to enjoy the challenge of creating something of my own from scratch.
A couple of years after that we had a top-rating app for sale on the App Store!
The Post-PC Era?
At the D8 conference in 2010 Apple CEO Steve Jobs compared a PC to a truck - i.e. a heavy duty vehicle that has its uses but is not the standard transport mode of choice for most people (watch the video).
With the benefit of hindsight he was right. He was, in some ways, highlighting the beginning of the post-PC era.
Of course, it’s not an either/or situation.
Around the same time as the first iPhone was launched I got fed up with providing tech support for my parents and replaced their PC with an iMac (yes, they eventually realised that computers are not only about games but also useful for sharing photos of grandkids!). This was much more of a controlled computer experience than they were used to - maybe a minivan, to extend Jobs’ analogy? They loved it, and the things they were able to do with technology blossomed. By 2015, when it was time to upgrade again, they made the jump to an iPad. A small car suited them much better than a truck.
The numbers tell the story loud and clear. Demand for cars is orders of magnitude greater than demand for trucks. The iPhone, iPad and other equivalent mobile devices are so simple to use that nearly everybody in the world now carries a computer around wherever they go.
But trucks haven’t gone away. As long as there are still people like me who want to find out how things work and are tempted to create things of our own we’ll always need some trucks to do the heavy lifting.
Distribution, Distribution, Distribution
Ever since the first App Store was launched in 2008 (it was called the iTunes App Store back then) it has attracted controversy - both for the opaque approval process and for the licence fees and commissions that Apple charge app developers.
As Chris Dixon explained (in a since deleted tweet), perhaps Atari has some responsibility for this:
In video game industry, it is widely believed that Atari died because of explosion of crappy games. Hence platforms have been curated since then.
I don’t know if that is correct. But, either way, it is true that the App Store is tightly controlled by Apple (is “curated” the right word?), and that remains a source of frustration for many developers, who are forced to wait for Apple to approve every application and update, and to pay handsomely for the privilege.
The App Store is a monarchy.1
I guess nobody likes to think of themselves as a serf or, worse, to find themselves banished from court for befriending a rival kingdom.
On the other hand, I struggle to get too angry about it. It’s an amazingly popular venue full of people looking for software to install. It does a pretty decent job of separating the wheat from the chaff (which is a bit depressing for those of us whose apps are in the latter category). And it takes care of many aspects of marketing and selling applications that are painful on other platforms.
Chris Dixon again in an article about how the iPhone permanently upended the mobile industry:
The people griping about Apple’s “closed system” are generally people who […] didn’t realize how bad it was before.
If only there were such an accessible and well-trafficked distribution channel for web applications. Many of the startups that I’ve worked on over the years would certainly have benefited from an equivalent channel to reach the customers we were targeting with our software and services.
Those who begrudge paying Apple a 30% success fee probably overlook how much they would spend on sales, marketing, distribution, payment and fulfilment via any alternative channel.
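To make that concrete, here’s a back-of-the-envelope comparison. Every number below is an assumption I’ve plugged in purely for illustration, not a real figure from Apple or anybody else:

```python
# Selling a $10 app: App Store vs. self-distribution (all numbers are assumptions).
price = 10.00
units = 1_000

# Via the App Store: Apple keeps 30%, but handles payment, hosting and delivery.
app_store_net = price * units * (1 - 0.30)

# Self-distributed: keep the full price, but cover the costs yourself.
payment_fees = price * units * 0.03   # assumed card-processing rate
hosting = 100.00                      # assumed hosting / fulfilment cost
marketing = 2_500.00                  # assumed spend to reach a comparable audience
self_dist_net = price * units - payment_fees - hosting - marketing

print(f"App Store net:        ${app_store_net:,.2f}")   # $7,000.00
print(f"Self-distributed net: ${self_dist_net:,.2f}")   # $7,100.00
```

Change the assumptions and the answer changes, of course. The point is just that the 30% doesn’t disappear if you go it alone; it gets spent, one way or another, on reaching and transacting with customers yourself.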
Recently this feels like an open question again, with legal challenges and the emerging threat of regulation. Whatever happens, we will always need a way to get software installed onto devices. Perhaps it’s the App Store and its equivalents on other proprietary platforms. Perhaps it’s a decentralised and more open alternative, without the oversight of a single company? Perhaps it’s just the web?
Who knows?
History Only Ever Repeats
Here is a quote I rediscovered recently, from a 1996 issue of Wired Magazine:
The Web reminds me of the early days of the PC industry. No one really knows anything. There are no experts. All the experts have been wrong. There’s a tremendous open possibility to the whole thing. And it hasn’t been confined, or defined, in too many ways. That’s wonderful. There’s a phrase in buddhism, “beginner’s mind.” It’s wonderful to have a beginner’s mind.
— The Next Insanely Great Thing, Wired 4.02
That’s Steve Jobs again, talking about the coming wave of web applications, as he saw it way back then. I was lucky enough to ride that wave. It’s been a fun ride so far, and has some distance to run yet, I think.
But, half a lifetime later, we can easily tweak that quote to apply to any new technology.
Different hardware will come and go. Different languages for developing software will come and go. Different channels to distribute software will come and go.
Here’s my advice:
Don’t worry about any of them destroying what came before - that’s inevitable in time. Rather than being scared, do your best to understand them and explore them. Try to see the possibilities and keep a beginner’s mind. But balance that with a healthy scepticism - it’s rare that anything is as amazing as it promises to be in the short term. Often the biggest opportunities are revealed when you understand the limits and constraints.
All we need to do is ensure that people generally, and kids especially, remain curious about how things work, and most other things will take care of themselves.
Amen. Or, namaste, if you prefer.
📢 Influence
You might remember a story that was in the media recently about two Kiwis on a trip to some very off-the-beaten-track places who needed rescuing from Iran:
In all of this coverage these two were described as “influencers”.
But, what does that mean?
Describing somebody as an “influencer” is an incomplete description, in my opinion.
We should really explain who they are an influencer for, who specifically they are hired to try and influence, and how they do that.
To pick two simplistic examples: imagine describing somebody instead as “a contractor for a cosmetics company, targeting teenage girls” or “a representative of a beer brand, trying to convince sports fans that elite athletes love to drink as much as they do”.
When we put it that way it all sounds like much less glamorous work!
I was pleased to see some follow-up reporting a few days later that dug into the various sponsors who funded the global adventure these two have been on:
Mixed feelings among backers of Kiwi influencers Richwhite and Thackwray
Some of the sponsors might have mixed views of the publicity and the decision-making of the two stars of the show, but surely headline sponsors North Face and Jeep must have been absolutely delighted to see the carefully staged photos predominantly featuring their brands mindlessly published so prominently.
How much do big companies like this normally have to spend to get large photo ads on all media sites? A lot more than they pay influencers to do it for them, that’s for sure! And these stories and related photos would have reached a much larger audience than those paid adverts ever would.
When we were working on Trade Me, and operating with very skinny (read: non existent) marketing budgets, we quickly realised that editorial is the best advertising. But it’s not always given as easily as it was in this case.
One of the difficult questions for any startup or growing company is:
“How will you overcome your obscurity?”
While many founders worry about others stealing their ideas in the early stages, the much bigger concern is that most people don’t even know about you at all (at best) or simply don’t care about your idea.2
The way brands, both new and old, now engage folks like these two rich kids, or others who have an audience and who are prepared to promote products and services to those people, is an increasingly important part of marketing and so a potential answer to this question.
One of the things I recommend to founders is to try and spend less time talking to other startup people and more time talking to potential customers. But unless your customers already know about you that’s a slow, expensive and difficult hill to climb (which might explain why so many founders ignore the advice!)
Employing somebody who already has that network is an obvious short cut. But how do you choose? And how do you know it’s working?
At Timely we worked with a number of different people who were already respected in the health and beauty sector and so, we assumed, could reach the audience of people we wanted to talk to about our software. It was a bit of a leap of faith each time. Thankfully it worked out well in nearly every case, but it would be better if there was a way to be more confident in advance.
Prior to the recent FIFA Football World Cup, Kiwi startup Stickybeak published some research into the relative reach of three of the biggest names in the men’s game: Ronaldo, Messi and Mbappe. These players are also some of the best-known “influencers” in the world. Ronaldo, for example, has the biggest audience of all on Instagram, with over 520 million people following his account!
The results have a few good nuggets, for example:
Influencers' power also varies with consumers' age. Lionel Messi may be the favourite of many football purists but it looks like he has less pull with the young who more often follow football for fashion and watch clips not games and so are less impressed with his silky skills than they would be by continuous Instagram content.
If you’re interested, Stickybeak now allows you to quickly and cost-effectively repeat that kind of research for your own brands, and so can help you home in on the people who will be able to reach the audience you’re targeting and test their effectiveness in advance.
Why not give it a go?
Now, go back to the start of this section with a bit more information that I haven’t given you up until now: I’m an investor in Stickybeak.
Does that change how you read it?
1. HT Brad Burnham from Union Square Ventures for that perfect description:
Web Services as Government
2. HT Tim O’Reilly, who said it much more succinctly:
“The biggest threat to an author is obscurity, not piracy.”