The unmet need social media addressed was that it made it easy for us to signal virtue, wealth, and, in some cases, perceived superiority… at scale.
There’s a great story about a decision Sony made when they shipped the legendary Walkman.
Against the advice of market research, Sony’s co-founder had asked the engineering team to build a portable music player that would ease the boredom on long flights. The engineers then came back with what could only be termed a product manager’s dream – for nearly the same amount of effort and cost, they were able to add an additional feature – a record button – to this cassette player.
But, to their dismay, Chairman Akio Morita asked them to remove the record button.
By reducing the device to serve a single use case, he eliminated any potential user confusion. In the same way McDonald’s removed cutlery from its restaurants to make it clear how it wanted customers to eat their burgers, Sony released the Walkman with a narrower range of functionality to give it the best chance of changing customer behavior.
And change behavior they did.
(H/T: Alchemy by Rory Sutherland)
John Montagu was a consummate card player who didn’t like meal interruptions while playing his favorite game of cribbage. So, it is said that he asked for veal stuffed between two pieces of bread to make it easy for him to eat while playing.
As John Montagu was also the Earl of Sandwich, others started asking for the “same as Sandwich.” And the rest, as they say, is history.
For when we find ourselves stuck in discussions about building for the “average” user, it is worth reminding ourselves that the sandwich, like many innovations, happened on the edges thanks to a passionate early adopter with a weird request.
(H/T: Alchemy by Rory Sutherland – a fantastic read)
Someone we know in India received a job offer from an acquaintance in the Middle East to work at a Skyspring hotel in New York. He needed to front $600 for the visa and had received instructions via email.
He comes from a modest background (his mother is a cook), and she asked my mother to help check whether this was a real job – $600 is a lot of money, after all. The “pay upfront to get this job” request rang all kinds of alarm bells. But it was hard to dismiss outright since it came from someone they knew.
And, it didn’t help that Google had a very convincing looking card show up on search.
Of course, it all unraveled the moment we spent more than a minute investigating. The hotel had no trace on Tripadvisor or Booking.com, and the phone number didn’t work. The email had a few typos (why do scammers never get that right?), was sketchy on the details of the work visa, and came from a questionable-looking “@consultant.com” address.
All in all, it was more sophisticated than the traditional Nigerian prince scam and it could have fooled someone who wasn’t discerning. I think the Google card was the most convincing piece of the scam and I couldn’t find a way to flag it on my phone (I found it on my laptop and did so).
It did get me thinking about how important it is to design products with scam/bad actor use cases in mind. It isn’t enough to just think of the happy path.
Bloomberg shared the story of a company whose product claims to detect shoplifters by monitoring fidgeting, restlessness, and other suspicious body language. The goal is prevention – if the person is approached, chances are high that the crime never happens.
On the one hand, this is awesome. If we can use technology to stop folks from committing crimes, that is a win.
On the other hand, it does make me wonder where this road will take us.
For example, will the data about the identified shoplifters go to a centralized database? Will that database be shared with other retailers to stop crime together? Will law enforcement make a case that the data should be shared with them? Will we then use the data in the database to move beyond behavioral signals to demographic signals?
It isn’t hard to envision these steps logically following the first. What happens to someone who makes a bad decision to steal a loaf of bread because he’s going through a tough time? Given how quickly he would be identified and caught, how hard would it be for him to pick himself back up after he commits that first crime?
Many questions. No simple answers.
NYT reporter Katie Rosman shared a screenshot of a chart from a teacher who had her students turn up their phone volumes in class and keep a collective record of the notifications they received.
I wonder what this chart would look like if we, as co-workers and family members, did this exercise at important meetings and family meals.
And, perhaps more importantly, what if we made it a point to do it periodically?
(H/T: Greg’s “Cofounder Weekly” newsletter that brings together a collection of interesting/fun tweets on tech/entrepreneurship)
As I think of technology waves in the post PC computing era, I like asking the question – “is this a fad or a foundational technology?”
Foundational technologies reach mass adoption – i.e., they reach every available user of computing – before being eclipsed by the next wave. In many ways, mobile was the breakthrough foundational technology wave. By reaching nearly every adult on the planet, the mobile wave has made personal computing decidedly mainstream. Thanks to the mobile wave, every future foundational wave will have the opportunity to touch every human being on the planet.
Let’s take a moment to reflect on how incredible that is. The world has never been more accessible.
We’ve now moved from the era of mobile to the era of machine learning. Like previous waves, machine learning benefits from the fact that everyone carries a supercomputer in their pocket. And, like previous waves, it will only be a matter of time before machine learning will be ubiquitous.
All this brings us to the more interesting question – what lies ahead? Is it going to be the blockchain? Or, will the next decade be all about wearables or augmented reality?
Except we’re going to skip speculating about the next decade. Instead, we’re going to jump a wave or two ahead and spend time on virtual reality. I am reasonably certain that the next decade isn’t going to be about virtual reality. But I predict we’ll all spend plenty of time in virtual reality in the 2030s and 2040s.
Gregarious Simulation Systems
As I shared here, I read Ernest Cline’s bestseller “Ready Player One” over the holidays in December and have been mulling over the dystopian view of the future portrayed in the book. If you are interested in the future of technology and haven’t read the book yet, I’ll relay what the person who recommended it to me said – “You can thank me later.” :)
The story revolves around a virtual world called “The Oasis,” created by a successful global behemoth, “Gregarious Simulation Systems.” A large portion of the population in the story spends their lives in “The Oasis” via their avatars – it even has its own public schooling system.
I had three reflections on the book –
1. Virtual reality is going to be the ultimate outlet for our desire for escapism. I’d written a post looking ahead at augmented reality two years ago. In it, I wrote about two potential use cases for AR – “fun + escapism and information + insights.” Of these, I think fun + escapism (I’ll refer to this as “escapism” going forward) is the area virtual reality will come to dominate.
The industry for escapism is already huge. Think about the amount of time and money spent on television, social media, gaming, and casual (or non-essential) retail. If we told someone in 1919 that humans would spend 5+ hours every day in front of a television, they’d have thought we were out of our minds. Now, if we found a way to explain the concept of video games, imagine their reaction if we told them we’d also be spending hours watching others play video games (eSports). We love escapism so much that we’re happy to watch others indulge in it.
Our current go-tos for escapism are Facebook and Netflix. But, why would you spend your time in 2D if you had the opportunity to escape in immersive 3D?
Next, while we’re at it, imagine the potential of virtual reality for more productive pursuits. E-learning and meetings for remote workers could both be transformed, with powerful implications.
And, what if we added the ability to travel – through space and time? What if we had the opportunity to spend time looking at the view from Mount Everest’s summit while also visiting medieval Florence?
Virtual reality is coming. And it is going to be big.
2. Corporations will, on average, become fewer, bigger, and more Orwellian. The antagonists in the book are representatives of a corporation called “Innovative Online Industries.” IOI is the largest internet provider and is portrayed as the stereotypical evil corporation.
Stereotype aside, however, it did spur a few reflections. The strength of today’s largest corporations is driven, directly or indirectly, by network effects. These network effects have given rise to a world with power law dynamics at a global scale (Alex Danco’s post on the topic from 2015 is excellent), and these dynamics are not going away any time soon. Unless we see a massive change in global trade leading to every nation closing its doors – unlikely, even if it can’t be ruled out – we are going to continue to see massive global corporations that have access to large amounts of user and employee data.
Taking this idea to its logical conclusion, we’re going to see a few large global winners across industries. And, just as today’s winners use data to experiment in ways that might have been described as Orwellian two decades ago, future global winners will use this data for social engineering in more obvious ways.
This doesn’t have to be all bad. Some of this social engineering may actually work out well for us and help “nudge” us to make better decisions. Some, but not all – we’re human, after all.
3. I(/we?) should read more fiction. :) The experience of immersing myself in Ernest Cline’s 2045 version of the world brought virtual reality to life in a way no article on VR ever did. So, here’s to reading more fiction in 2019.
PS: A big thank you to everyone who responded to my previous note asking for fiction recommendations. At my current rate of reading fiction, I think I’m set for the next few decades thanks to your list. :)