Retirement and cognitive decline

A close friend of my late grandfather passed away recently. The pattern of his decline prior to his death was eerily similar to my grandfather’s.

My grandfather, until the age of 67, was admired for his relative youth. He went for a swim every day at the community pool near our home, ran a cost accounting business with a partner, read a lot, and was active. At 67, however, he decided (with his partner) to shut down their business and “retire.”

Every year for the next ten years, he aged at the rate of three years for every one. His average daily television viewing went up during this period from one hour to ten. His physical decline during this period was the hardest for us to stomach. He went from walking and swimming a lot to barely being able to move. His last years were tough on him.

The fable of the frog in boiling water may not be literally true, but its implications for human behavior certainly are. We were caught unawares by this gradual transformation. And, before we realized something was very wrong, it was too late.

This close friend’s story was similar – his cognitive decline after “retirement” was swift.

The world’s population is ageing. Combine that with advances in medicine and we have a generation that is also going to live longer than any before it. As we all learn to deal with our ageing grandparents, parents, and eventually, ourselves, it is worth remembering that the enemy is cognitive decline. There is a lot of truth to the phrase “it’s all in the mind.” Physical decline follows cognitive decline (while this was our observation, it may be that there’s a feedback loop that accelerates both).

My lesson from this experience was – Don’t allow your loved ones to “retire.” Find ways to keep them mentally engaged and away from excessive television.

Death is a natural part of the life experience – but, severe cognitive and physical decline needn’t be.

The problem with “Be humble”

This may be controversial – I’ve learnt that we cannot ask ourselves (or other folks) to “be humble” or, for that matter, to be grateful or to take things less seriously.

The notion that we can tell ourselves – “Hey, I know you think you are wonderful. But, it isn’t right to let people know you think that way. So, tone it down a bit so it looks more acceptable, will you?” – is flawed and comes across as fake.

All we can do is help ourselves gain perspective and understand reality. And, when we do realize how little we actually know and that most of what is working in our life is a result of accumulated privilege and luck (my theory below), humility, gratitude, and a sense of humor flow easily.

Like many of life’s best things, humility is simply a by-product of a good product (perspective). The same holds true for gratitude and keeping a sense of humor.

(H/T: Kapil Gupta for an excellent articulation of the “be humble” problem/charade)

Unhealthy but comfortable – first and second order consequences

First and second order consequences are regularly in opposition. Unhealthy food, for example, has a good first order consequence (it is tasty) but a bad second order consequence. The effects of exercise or self-reflection, on the other hand, are the opposite.

Most good things suck at first thanks to the inertia involved. This is why it is hard to keep adapting and evolving and why most people and organizations find it easier to be stuck in their old ways.

The big question, a question that we inadvertently ask ourselves when we make decisions, is – will we choose a painful but healthy route to progress or an unhealthy but comfortable delusion?

(H/T: Principles by Ray Dalio)

Gregarious Simulation Systems

As I think of technology waves in the post-PC computing era, I like asking the question – “is this a fad or a foundational technology?”

Foundational technologies reach mass adoption – i.e., reach every available user of computing – before being eclipsed by the next wave. In many ways, mobile was the breakthrough foundational technology wave. By reaching nearly every adult on the planet, the mobile wave has made personal computing decidedly mainstream. Thanks to the mobile wave, every future foundational wave will have the opportunity to touch every human being on the planet.

Let’s take a moment to reflect on how incredible that is. The world has never been more accessible.

We’ve now moved from the era of mobile to the era of machine learning. Like previous waves, machine learning benefits from the fact that everyone carries a supercomputer in their pocket. And, like previous waves, it will only be a matter of time before machine learning will be ubiquitous.

All this brings us to the more interesting question – what lies ahead? Is it going to be the blockchain? Or, will the next decade be all about wearables or augmented reality?

But we’re going to skip speculating about the next decade. Instead, we’ll jump ahead a wave or two and spend time on virtual reality. I am reasonably certain that the next decade isn’t going to be about virtual reality. But I predict we’ll all spend plenty of time in it in the 2030s and 2040s.

Gregarious Simulation Systems
As I shared here, I read Ernest Cline’s bestseller “Ready Player One” over the holidays in December and have been mulling the dystopian view of the future portrayed in the book. If you are interested in the future of technology and haven’t read the book yet, I’ll relay what the person who recommended it to me said – “You can thank me later.” :)

The story revolves around a virtual world called “The Oasis” created by a successful global behemoth, “Gregarious Simulation Systems.” A large portion of the population in the story spends their lives in “The Oasis” via their avatars – it even has its own public schooling system.

I had three reflections from the book –

1. Virtual reality is going to be the ultimate outlet for our desire for escapism. I’d written a post looking ahead at augmented reality two years ago. In it, I wrote about two potential use cases for AR – “fun + escapism and information + insights.” Of these, I think fun + escapism (I’ll refer to this as “escapism” going forward) is going to be the area virtual reality will come to dominate.

The industry for escapism is already huge. Think about the amount of time and money spent on television, social media, gaming, and casual (or non-essential) retail. If we told someone in 1919 that humans would spend 5+ hours every day in front of a television, they’d have thought we were out of our minds. Now, if we found a way to explain the concept of video games, imagine their reaction if we told them we’d also be spending hours watching others play video games (eSports). We love escapism so much that we’re happy to watch others indulge in it.

Our current go-tos for escapism are Facebook and Netflix. But, why would you spend your time in 2D if you had the opportunity to escape in immersive 3D?

Next, while we’re at it, imagine the potential of virtual reality for more productive pursuits. E-learning and meetings for remote workers could both be transformed, with powerful implications.

And, what if we added the ability to travel – through space and time? What if we had the opportunity to take in the view from Mount Everest’s summit and then visit medieval Florence?

Virtual reality is coming. And it is going to be big.

2. Corporations will, on average, become fewer, bigger, and more Orwellian. The antagonists in the book are representatives of a corporation called “Innovative Online Industries.” IOI is the largest internet provider and is portrayed as the stereotypical evil corporation.

Stereotype aside, however, it did spur a few reflections. The strength of today’s largest corporations is driven, directly or indirectly, by network effects. These network effects have given rise to a world with power law dynamics at a global scale (Alex Danco’s post on the topic from 2015 is excellent) and these dynamics are not going away any time soon. Unless we see a massive change in global trade leading to every nation closing its doors – unlikely, even if it can’t be ruled out – we are going to continue to see massive global corporations that have access to large amounts of user and employee data.

Taking this idea to its logical conclusion, we’re going to see a few large global winners across industries. And, just as today’s winners use data to experiment in ways that might have been described as Orwellian two decades ago, future global winners will use this data for social engineering in more obvious ways.

This doesn’t have to be all bad. Some of this social engineering may actually work out well for us and help “nudge” us to make better decisions. Some, but not all – we’re human after all.

3. I(/we?) should read more fiction. :) The experience of immersing myself in Ernest Cline’s 2045 version of the world brought virtual reality to life in a way no article on VR ever did. So, here’s to reading more fiction in 2019.

PS: A big thank you to everyone who responded to my previous note asking for fiction recommendations. At my current rate of reading fiction, I think I’m set for the next few decades thanks to your list. :)

Wanting more

Much of making the most of this life is realizing that there is no end to wanting more and, as a result, consistently learning to say “this is enough.”

Life becomes so much better when we perceive most of what we receive as upside.

The challenge with behavioral interview questions

If we want to understand how fit we are, the best test is to play a game or go for a run. It definitely doesn’t involve asking ourselves to describe our fitness. And, yet, if we walked into interview rooms around the world today, we’d hear behavioral questions that do exactly that.

In an ideal world, we’d have interview processes that focus on delivering real work. But few organizations can do that at meaningful scale. So, cases/live problem solving tends to be a good substitute.

And, I find that behavioral questions can be woven into cases with a bit of extra preparation work. For example, if we want to understand how someone deals with conflict or a challenge, we can add a dash of conflict to the case and see how the interviewee responds.

Perhaps the first step is accepting that interviews aren’t the perfect window into how a candidate will function in a job. Most of us don’t have the sort of spidey sense that translates a behavioral answer into reliable signal.

So, the best we can do as interviewers is design systems that reduce bias by focusing on how candidates would actually approach the job and how much they enjoy the process of doing whatever the job is about. Folks who enjoy problem solving, for example, will have their eyes light up even when problems are thrown at them.

When in doubt, design interview systems that enable us to ignore what people say and watch what they do.