The uncanniness of seeing human beings where human beings are not supposed to be

Somewhere on my computer, I keep a collection of images of humans showing up in non-human contexts. These are contexts that are supposed to be sterile and devoid of humanity. If you’ve just wandered into my site on a search, that may sound like a rather odd hobby; if you’ve been here a while, you already know this is par for the course.

Anyway, it goes like this: I’m a lifelong devotee of secondhand shopping. Naturally, I’m a huge fan of the website thredUP. (Disclosure: that link is a referral link that’s worth $10 to us both if you’re a new customer). It’s a fantastic site for buying used women’s (and children’s) clothes, with lots of great features like saved searches and such. They’ve even featured me as one of the most fashionable women in the US, if you can believe that. They’re a fast-growing, tech-enabled company that went public earlier this year (more disclosure: I definitely bought some of those shares). And their efficient process for photographing the roughly 100K items that get merchandised to their storefront every day generally results in very high-quality photos of their items. In fact, they have a patent on that photography process.

But every now and then, when you’re not expecting to, you’ll see a human sneak into the photo. Just an arm, or a torso. It’s vaguely disconcerting but also kind of warm and humbling, a peek behind the curtain and a reminder that there are people behind so many processes we take for granted. (Note that the topic of “ghost workers” has come up repeatedly on The Tech Humanist Show.)

[Image: a human torso modeling clothes in a thredUP listing photo. Caption: Hello, human!]

Digital Weirdness

It’s also an example of what I like to call Digital Weirdness. And this one below strikes me as extra-weird in that they actually included it in their Instagram ad carousel.

[Image: a human arm holding up a blazer in a thredUP photo. Caption: Advertising the weirdness]

I am, of course, not alone in having this hobby: Andrew Norman Wilson has a 2019 piece at WIRED about his collection of human hands showing up in Google Books scanned images. Enjoy.

A Closer Look at KO Insights 2022 Technology & Cultural Trends

Nothing changes faster than the trends you haven’t been paying attention to. But in the past few years there’s been SO MUCH OTHER STUFF to pay attention to that you’d be forgiven for taking your eye off of the zeitgeist.

That’s fundamentally why I began sharing these trends. Over the years, our savviest clients have asked for insights presentations, wanting to know the patterns I was seeing emerge from my vantage point of working across a broad range of industries for a wide variety of clients.

They’re not just any trends, though. Since I founded KO Insights, our work has been committed to improving human experiences at scale, so we’re always keeping an eye on the horizon for emerging trends that relate to that mission.

In case you’re wondering about our methodology, I’ll sum it up. Our 2022 Technology and Cultural Trend Map — which we published last week, so if you missed that, you might want to go back and download a printable copy you can keep handy as a reference — is shaped in four ways:

  1. Through the topics that have bubbled up repeatedly in direct conversations with business leaders, civic leaders, and industry thought leaders
  2. Through noticing emergent patterns in news and industry chatter
  3. Through direct research into peer-reviewed studies
  4. Through observations and insights of my own, which are, after all, what the name of the company promises. It’s what’s on the label.

What all of the resulting topics have in common is that they’re poised to impact human experience in the next few years, and they offer considerations leaders should be weighing now. They also reflect macro themes, such as the ways the global climate crisis is reverberating through industries and across communities everywhere.

[Image: 2022 KO Insights Cultural & Technology Trend Map (small format)]

This week we’ll offer a brief look at each of the trends included in the map. In weeks to come we’ll unpack them further, but be sure to let us know if you’re particularly interested in one or another of them; we’ll be happy to prioritize the order somewhat based on your feedback.

The 2022 trends are:

  • The Immersive Trends
    • Always-On Economies
    • Virtual Third Spaces & Emerging Subcultures
    • Intuitive Intelligence
    • Augmented Everyday Experiences
    • Biometric Face-Off
  • The Integrative Trends
    • Making Tech Safe for Humans
    • Value Disruption
    • Food Innovation
    • Adaptive Cities
    • The Economy is People
    • Work Rework
    • Truth & Trust in Doubt
  • The Innovative Trends
    • Education Everywhere
    • Covid-to-Climate Momentum Transfer
    • Overcoming Supply Chain Chaos with Sustainability
    • Navigating the Just Transition

You can read a little more about each one below.

The Immersive Trends

Always-On Economies

Virtual work, virtual learning, virtual retail, etc.

The pandemic has thrust many of us deeper into the virtual versions of the activities that occupy our days, whether that’s work, school, socializing, gaming, or the miscellany of errands and tasks we fulfill online, like shopping, banking, and even medical care. Owing to on-demand content and services, distributed user bases, and algorithmic and machine-generated experiences, these spaces have become “always-on” economies, and they are becoming more and more the norm.

This trend introduces several unique experience considerations. For example, as we live more and more in virtual worlds, our “real life” physical environments must function as shared spaces as well — shared with those with whom we live or work or play, either virtually or physically. Another side effect of such immersive technology is a growing need to protect our personal information and to be deliberate about how we trade it for services and products.

Virtual Third Spaces & Emerging Subcultures

Metaverse, Web3

The first spaces people spent time in online were virtual chat rooms & services like AOL’s local chat rooms, which often served subcultures that had very little interest in ever meeting face-to-face. Imagine those spaces, but on a much larger scale, and integrated with our day-to-day lives and functions.

A sub-trend here is the need for a personal avatar in the virtual third spaces to represent you. I’ve written for years about our digital selves and how to think about what they represent, but the subject keeps getting more nuanced and interesting all the time.

How do you represent yourself to others? What are the rules & expectations that surround this? These are some of the experience considerations we’ll be weighing as we examine this trend. We already see many people making their avatars look like them or someone they admire, and at times dressing them in ways that are out of the ordinary, daring, or simply unattainable in the physical world.

Intuitive Intelligence

Machine learning meets human emotional expression, nuance, abstraction

Researchers are trying to detect human sentiment with machine learning, and piece together nuance and abstraction in a variety of interesting ways. These include chatbots, but also machines that play games and learn to beat human players (and do so again and again).

The capacity to do tremendous good with this kind of technology is enormous. Therapy bots are a current example: they can help people who need mental health support but who feel less awkward — at least initially — chatting with a bot than seeking out a human therapist. But of course the potential is there as well for these to be deployed in ways that are creepy, invasive, authoritarian, and otherwise harmful.

The experience considerations here are vast, and will play into everything from personal privacy to public policy. Among the concerns is the fear that machines will eliminate or threaten highly skilled human jobs; another is whether machines can be truly impartial when making decisions on our behalf. We’ll be looking at all of it.

Augmented Everyday Experiences

AR brings integrative entertainment, just-in-time context

I’m on record all over the place saying that augmented reality is the emerging tech I’m most excited about due to its potential to offer just-in-time contextual relevance — which is a form of meaning. Any technology that can be used to offer more meaningful human experiences is one worth exploring.

Of course, to think about the experience design and strategy considerations of augmenting everyday experiences is a bit meta, but we thrive in the land of meta. So we’ll continue to explore the implications of this trend on both a societal and personal level. This technology has the capacity to enhance or amplify experiences without replacing them fully with virtualized equivalents. So the experience design considerations will focus on how to integrate technology in ways that support our lives, not compete with them or require their wholesale reinvention.

Biometric Face-Off

Facial recognition & other biometric tech meets deployment & caution

This is an area I’ve been quoted on extensively in the past few years. There’ve been some developments in the past year, and we’ve written about some of them here at the site. But there are many still-to-come instances of this technology being rolled out in new ways, so we’ll continue to investigate what it means for personal privacy and other implications.

A sub-trend here is how facial recognition will be used in real-time by police & government authorities during protests to catalog & identify people who take part. This is already happening in some places, but it’s very likely only the beginning of what’s to come.


The Integrative Trends

Making Tech Safe for Humans

Emerging tech meets ethics, human protections, etc.

In the age of algorithmic decision-making, these are the questions that will arise more and more frequently: How do we know if a machine or artificially intelligent algorithm is making decisions for human beings that are fair, just, accurate, and unbiased?

What constitutes bias in the data used to train machines — and how can it vary by race, gender, political affiliation, or geography?

Who’s accountable for errors made by machines?

There are many related questions, of course. But the point is that ethics and human protections must be integral features of emerging technologies. If not, humanity could pay an incredibly steep price for new technology before we over-correct to “fix” what we’ve allowed to scale.

Value Disruption

DeFi, NFTs, Bitcoin, mobile payments, cryptocurrency, blockchain

There are big social issues around the speed at which these technologies and platforms are developing. Will they be used for crime and corruption and just plain greed? Sure, but then so will other technologies that don’t rely on decentralization or blockchain’s distributed nature.

We’ve written a bit about this here, but the theme is: money is getting weirder. We’ll continue to explore the innovations and risks, and follow the policy experts on what sorts of regulations might need to be implemented to protect people and use technology in positive, productive ways.

The larger question I’m interested in, though, is the relationship between disruption — or creative destruction — and meaningful human value, and what new ways might emerge to move from disruption to value-driven innovation. Because the most important part of designing new technologies is ensuring that they make life better for humans. That means that at some level they must benefit society and the planet at large, not simply advance for technology’s sake. If they don’t, they are not helping us solve the many problems of the present to get to the future we need.

Food Innovation

Combating the losses of COVID & food insecurity with innovations, planet-centric diets

What does the future of our food look like?

Agricultural innovation will be focused on protecting crops to better feed more people. We know that climate change is impacting crop yields, and we urgently need approaches to overcome it. Will lab-grown meat solve our issues around waste production & energy usage? Seems like the questions are more complex than the answers so far.

But as someone who’s been vegetarian for 27 years and vegan for 24, I’m delighted to see an enormous push to make plant-based proteins more consumer-friendly. I’m especially happy to see the trend toward lab-grown meat that doesn’t require the energy and environmental impact of cattle production. But overall the emphasis is less about plant-based and more about planet-based eating. It just so happens that for now, those two ideas are rather aligned. We’ll be watching this space with interest.

Adaptive Cities

Place-by-place experiments in resilience

The Adaptive City is the global trend of cities undertaking initiatives to better prepare themselves for uncertain futures, whether due to climate change or political volatility. This is as much about major cities developing climate resilience and mitigation strategies as it is about smaller-scale community planning, with a focus on flexibility and affordability in addition to resilience.

Among the tasks cities are taking up are improving infrastructure (laying additional infrastructure underground to make up for above-ground changes) and ensuring the availability of affordable housing. This is a big trend for the decade to come, and given our work with cities, we will be following it closely.

The Economy is People

Local, community, collectives, mutual aid

The idea that “the economy is people” was a theme in A Future So Bright, has been the subject of quite a few of my Twitter rants, and has shown up repeatedly in research-guided work around everything from the future of work to the future of energy.

Work Rework

Great Resignation, hybrid workplaces, evolving ideas of workplace, work, team

In the pre-2019 days, there was already considerable interest in what the future of work would look like. But in light of the pandemic’s impacts, in light of the shift to remote work and hybrid workplaces, in light of the Great Resignation, in light of the ongoing clamor to wrap our minds around the future of work — we’re still in the early days of this evolution.

Truth & Trust in Doubt

Geopolitical upheaval, misinfo/disinfo, etc.

The past few years have seen a deluge of issues in the crisis around trust and truth: “fake news,” suspicions of media bias, the future of democracy in the age of algorithmically-boosted misinformation and propaganda.

We’ve written about this here and we’ll continue to examine these issues.

Personally, the more I think about it, the more I see that we’ll have to work toward truth and trust from the ground up — through education and changing our perspective on what it means to be well-informed. We need tech solutions to untangle tech problems like the amplification of misinformation, but we also need media literacy, citizen literacy, and the habit of engaging one another in civil discourse.


The Innovative Trends

The KO Insights working definition of innovative is “aligned with what is going to matter.”

Education Everywhere

Zoom classrooms, just-in-time learning, resolving inequity

The future of learning and education is evolving in a time when we’re grappling with ways to help people cope with change and remain flexible, so they can participate amid public health emergencies, so they can learn at their own pace, so they can compete. It’s easy to imagine a future of learning and education that is “just in time,” ad hoc, and scalable — so people can learn on their own timelines within the constraints of their lives. But these conditions aren’t available to everyone equally, and so the future of education must also grapple with inequity and access.

Covid-to-Climate Momentum Transfer

Hopeful strategic innovation

Of course we’re paying attention to climate momentum anyway, but the rapid technological advancements and digital transformation that have accompanied the COVID-19 pandemic offer an incredible opportunity to harness that momentum for climate mitigation. Observers across industries have noted the opportunity; now it’s just a matter of leaders making the decisions that will most effectively deliver on that promise.

Overcoming Supply Chain Chaos with Sustainability

Investing in greener fleets & fuel

Corporate social responsibility, supply chain management, ESG, ethical procurement — there is growing awareness of these topics as they relate to greener fleets and fuels, to the current supply chain chaos, and to how that chaos will be addressed in the future.

This also relates to transportation innovation as a whole. Future transport is one of the largest sectors in our series. We’re watching the evolution of electric vehicles including trucks, transportation systems for coastal communities, and more. We’re even keeping our eyes on private spaceflight, although that’s not likely to be a trend we report on here very soon.

Navigating the Just Transition

The challenging move away from fossil fuels

As we undertake the process of moving away from fossil fuels, issues of fairness and justice come up for the communities most affected by changing policies and initiatives, such as Indigenous peoples and residents of low-income urban neighborhoods, who are often subject to the greatest climate impacts.

The future of work also ties into this topic: green jobs are growing, which is good, because we need solutions for job transitions. And while gender equity isn’t tied to climate per se, studies show that the impacts of climate change hit women in developing countries hardest.

This also includes issues related to mobility justice — making sure communities have access to greener infrastructure.

But the scope of this trend actually goes further and includes sub-topics like collaborative movements, social impact startups, feminist economics, inclusive policy-making, radical social justice, upending power dynamics, systemic change, universal basic income, collaboration-driven initiatives, building from the ground up, and so on.


And that’s just barely scratching the surface of all of these trends. We plan to dive deeper into each of these topics in the weeks to come, but in the meantime I hope these summaries have given you food for thought as you shape your future strategy.

Please don’t hesitate to get in touch if I can answer any questions about how you might use these insights, or if you’d like to discuss booking a session to review them with your team.

I’ll leave you with just one last thought. Although it’s impossible to actually predict the future, one thing is easy to anticipate: the world will continue to need your bold and savvy leadership in the future. More than ever.

The Economy Is People

everything is connected

The following is an excerpt from my latest book, A Future So Bright: How Strategic Optimism and Meaningful Innovation Can Restore Our Humanity and Save the World:


I’m not an economist by education—as I’ve previously mentioned, I was a linguist—so it has taken me a few years and a lot of reading to arrive at what seems like an obvious statement in retrospect: 

The economy is people.

This fuzzy concept we call “the economy” is about a system of tools that enable people to provide for themselves, that measure how well people are able to shelter and feed themselves, and how much people are able to invest back into their own well-being.

You could say that this statement is the “everything is connected” of economic ideas. It sounds simple and self-evident, but it’s the layers of its truth that give it revelatory impact.

The COVID-19 crisis has robbed the world of so much, but I do have to thank the crisis for teaching me so clearly that the economy is people—both in terms of our well-being and our productive output. Also planet, as in the state of nature and natural resources. I hate that I spent so much of my life thinking of the “economy” as merely a monetary abstraction.

Measuring a human health crisis in terms of dollars makes no sense. We should be measuring it in terms of human lives impacted, in terms of human potential cut short, in terms of human experiences thwarted. But those are more nebulous figures, and somehow less motivating in a boardroom.

If we are to be proponents of capitalism, as I’d venture many of my readers are, then capitalism must be about solving people’s problems in alignment with a focused business objective.

The economy, then, should really be a measure of how efficiently people’s problems are solved. And we can apply this to discussions of the various subeconomies: the sharing economy, the gig economy, the knowledge economy. At the end of the day, what we’re talking about in every case is people: The economy is built on people.


— taken from A Future So Bright: How Strategic Optimism and Meaningful Innovation Can Restore Our Humanity and Save the World, available in print and audiobook format

The Great Resignation and the human future of work

[Blog header image: a woman holding her head while looking at a laptop in her home]

Much of the speculation about the so-called “Great Resignation” began after the Bureau of Labor Statistics reported that a record 4 million Americans had left their jobs in April 2021. Well, that was the record then. In August the number was 4.3 million. Another record. 4.4 million workers in September. And then yesterday we learned that 4.5 million US workers quit their jobs in November. How long will the trend continue, and what does it mean?

Plenty has already been written and said about the Great Resignation, and much has been said (and plenty by me) about the future of work, about automation and robots and how they will impact human jobs, and countless other related topics. But not enough is being made of what this might mean about the future of human meaning.

We’ve also done a lot of talking (and yes, me again) about digital transformation, and how much that has to do with data models and emerging technology.

But we haven’t talked enough about work as a form of human fulfillment, and we haven’t talked enough about the kind of transformation that happens at the human scale. What if we’re missing the big insights about the future of work and technology by not connecting all these dots?

What really matters about the human future of work and beyond

Most of the media coverage focuses on wages and benefits, and while those are undeniably important factors to understand, it strikes me as very plausible that much of what this trend is about deals with things most media coverage isn’t hitting: Gender inequality in home duties. Burnout. A re-prioritized sense of dignity. And for heaven’s sake, grief — or rather, the shifted perspective that comes from grief over lost loved ones.

For years when I have written and spoken about the future of work, I have said that the most important thing is for humans to have a sense of meaning. “What matters in all of this is that humans have the opportunity for meaningful experiences in the future, whether they derive from work or not.”

But even after separating those concepts, we’re still left with big questions about what a humanity-centered understanding of work looks like: what it means to accomplish, contribute, and achieve apart from income and sustenance.

And we’re still not sure what it means to address the overlapping trends of the Great Resignation (or Great Reset) and the waves of innovation around the “creator economy,” particularly as it relates to Web3, the Metaverse, and emerging ideas and models of value. (I’m part of a group of future-forward experts that is forming around these topics right now. I’ll be sure to share more about that as I am able.)

One of the most frequently-recurring themes in my work is meaning, and I have very often said that I see no reason why humans shouldn’t have meaning in all sorts of different ways, work being just one.

I have also said (and been repeating a lot lately) that the economy is people. And in economic conditions where people are not cared for, they may be forced to ruthlessly prioritize themselves.

If all of this is true, then the Great Resignation could be a sign that not enough people are finding enough of a sense of meaning in work. At least not enough to overcome the lack of meaning they are feeling in other areas of their lives, which makes sense given how much the pandemic has cut most of us off from our social circles, from our extended families, from leisure travel — heck, even from serendipitous encounters at coffee shops.

It also means these workers might become a bigger market than ever for employers who want to persuade them that they can find meaning at their place of work.

Still, if those 4.5 million, and those coming behind them, can’t find meaning in their workplaces now, why would they stay with the same employers — especially as they see machines taking over many of the functions of the jobs they hold today?

Are we ready yet for a meaningful version of the future of work?

The main focus in the Great Resignation shouldn’t be employee dissatisfaction, or talent acquisition costs, or technology taking over human employment opportunities. It’s good to create space for that discourse and to learn from it, but the underlying issue is far more fundamental: as much as we need to commit to making the workplace physically safe for humans, we need to commit to making it fulfilling, too. And that means honoring and respecting the humanness of human employees.

When it comes to digital transformation, the biggest lesson I share with leaders is often: it doesn’t start with tech. Surprise! Just as leaders too often want to begin digital transformation with technology instead of with human-centric values and experiences, too many leaders approach their talent strategy as if it can be driven by cost or satisfaction scores, rather than by infusing a sense of purpose and meaning into the organization at every level.

Figuring out how to build a purposeful organization and a culture of meaning, how to amplify relevance and intentionality in the digital experiences you bring to scale — all of this is part of the human-centric digital transformation effort I have been advocating, talking about, advising executive teams on, and leading workshops in for years. It was always important, but now, between the accelerating pace of digitization and the rising stakes in attracting and retaining talent, it’s more crucial than ever.

Even as intelligent machines, automation, and completely digitized experiences become increasingly pervasive, they won’t replace the nuanced value humans add in creative teams, in design of all kinds, in strategic thinking, and in the simple joy of a serendipitous human-to-human interaction, even if it’s only in a coffee shop.

Goodbye to 2019, hello to our uncertain future

Our emerging tech panel at UN COP25 in Madrid

This time of year is my absolute favorite because for me it’s so much about relaxed reflection and setting intentions for the year — or even the decade! — ahead. And this year, with Christmas and New Year’s Day falling mid-week, all normal work schedules seem disrupted, creating extra space throughout these final weeks and over the weekend between them to reflect and plan.

It’s also a good time to think about the future in general.

One characteristic of the way we tend to think about the future now, though, is that we do so with more uncertainty than ever.

Yet as I wrote in Tech Humanist:

Here’s what I want to offer you: To me, the idea that the future is never fixed or certain is actually encouraging. Truly, it fills me with hope. I think of the future largely as something we continuously alter, shape, or at least influence with what we do today.

That thought also fills me with a sense of duty because it means there are always many possible futures that depend on me and you and everyone else doing our parts in the whole. It means our everyday actions have more power to shape outcomes than we are often comfortable admitting.

from Tech Humanist: How You Can Make Technology Better for Business and Better for Humans

Planning your own future

My friend and one of the organizers of House of Beautiful Business, Tim Leberecht, has written a lovely guide to help us all do just that. His process will help you have a productive and insightful “time between the years,” as Tim calls it, and a brilliantly successful 2020:

>> How to Make the Most of the Time Between the Years
(written by Tim Leberecht for Psychology Today)

Some of the questions I like to ask myself and encourage my clients and audiences to ask are:

  • What kind of future do you personally want to have?
  • What kind of future do you want for everyone on the planet?
  • What are you working on building?
  • What are you trying to achieve at scale?

By the way, all of this reflection and planning pairs well with another piece about getting better at training your brain what to retain and what to let go of. Hint: it comes down to the discipline of spending time thinking about what you most want to be thinking about.

>> Your Brain Has A “Delete” Button–Here’s How To Use It
(by Judah Pollack and Olivia Fox Cabane for Fast Company)

What are some other questions that help you clarify your purpose? What are some other exercises you engage in to help you reflect and plan?

Goodbye to my wild 2019

For me, 2019 was a whirlwind of unprecedented life opportunities, but also a time for increasing clarity and commitment to what I see as my mission.

To recap: In January, just a few months after my book Tech Humanist came out, it was featured on the CES stage. The following week, I had a tweet go viral and a follow-up in WIRED that also went viral, and I appeared on just about every major news outlet from BBC to NPR to Marketplace to talk about facial recognition (and to pivot the conversation to the larger issue of how technology is changing our human experiences). The next week, I spoke at the United Nations about innovation and humanity.

Then in June, a few days after delivering a keynote on Tech Humanism at a conference in Mumbai, India, I guest lectured at the University of Cambridge. Yes, the same one Charles Darwin, Sir Isaac Newton, and Stephen Hawking are all associated with. That University of Cambridge. I know, I couldn’t believe it either.

In the second half of the year I keynoted Etsy‘s Engineering Day in Brooklyn, a Google team offsite in Lake Tahoe, the P2P Transformation Summit in London, DevLearn in Las Vegas, UX Australia in Sydney, the Boston CIO Summit, and presented versions of my Tech Humanist talk at INBOUND, Content Marketing World, the Inc. CEO Summit, Mind the Product in London, House of Beautiful Business in Lisbon, and more.

Our emerging tech panel at UN COP25 in Madrid

Finally in December, after speaking once again at the United Nations headquarters, this time on AI and youth skills, I closed out my work year at the UN COP25 climate change conference in Madrid where I led a panel on the final day about the challenges and opportunities of leveraging emerging technologies to fight climate change.

Oh, and over the course of the year I added representation from Washington Speakers Bureau and Leading Authorities speakers bureau. That’s exciting personally and professionally, and it should also help make bookings easier for many large company clients, which means there may be even more of those audiences in 2020 and beyond.

I’m telling you this to say: I think all of this activity proves there’s hope. I think my year has been wild because a lot of people see the potential for technology to diminish the humanity in the world, and a lot of people want to see to it that that doesn’t happen. If my experience this year indicates anything, I think it’s that people are determined to make the best of our tech-driven future.

So what’s in store for all of us for 2020?

You’ll see many articles with predictions for 2020, and some will be more outlandish than others. I’m including just a few here that will likely affect you and your business more than others:

Expect to see more facial recognition in use everywhere and to hear more debate about it. Governments, law enforcement agencies, and high-traffic facilities like airports see tremendous opportunities and conveniences in deploying this technology, while civil liberties advocates see many privacy risks and challenges. Personally, I’m on Team Ban Facial Recognition Until We Have Better Protections In Place, but I’ll continue to follow all the developments and report on them (as I did in WIRED earlier this year).

Expect to have to grapple with privacy debates inside and outside your organization. The major push for companies to meet GDPR compliance in time for the May 2018 enforcement deadline is only the beginning of such regulatory efforts; the CCPA takes effect on January 1, 2020, and you can bet more regulations will be coming as time goes on. Your best bet for dealing with these is to get ahead of them: enact human-friendly data collection and usage practices, such as not collecting more data than you need or than is relevant to the context of the interaction. (I spoke about this topic extensively at House of Beautiful Business in Lisbon, as well as at many other events throughout the year.)
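To make that “collect only what you need” practice a bit more concrete, here is a minimal sketch of what a data-minimizing intake handler might look like. The newsletter-signup scenario, the field names, and the minimize function are hypothetical illustrations, not a description of any particular company’s implementation:

```python
# Hypothetical example of data minimization at the point of collection:
# keep only the fields this interaction actually needs, and drop the rest.
ALLOWED_FIELDS = {"email", "first_name"}  # all a newsletter signup requires

def minimize(submitted: dict) -> dict:
    """Return only the fields relevant to the context of the interaction."""
    extra = set(submitted) - ALLOWED_FIELDS
    if extra:
        # Anything else the form sent (birthdate, location, etc.) is never stored.
        print(f"Discarding fields we have no reason to keep: {sorted(extra)}")
    return {k: v for k, v in submitted.items() if k in ALLOWED_FIELDS}

signup = {
    "email": "reader@example.com",
    "first_name": "Kate",
    "birthdate": "1990-01-01",   # over-collected by an eager front end
    "location": "Brooklyn",
}
print(minimize(signup))  # stores only email and first_name
```

The point is less the code than the posture: every field you decline to collect is a field you never have to secure, disclose, or apologize for later.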

The push for digital transformation isn’t over yet (no matter how tired of hearing about it you may be). Most companies, organizations, and cities are very much just catching up, still sorting out how, for example, the data from their front-end services can inform back-end operations and vice versa. Meanwhile, upstart data-rich apps and services are still disrupting industry after industry, so we’ll still be talking about that for a while. (This was the focus of many of my keynotes to executive audiences, such as the Boston CIO Summit, and more.)

You may also be tired of hearing about AI, but we’ve only scratched the surface of that conversation. While some folks debate the semantics of whether simple machine learning processes really constitute “artificial intelligence,” the advancements within that space progress daily, with both challenges and opportunities aplenty. (Part of my focus throughout 2019 and into 2020 has been on how machine learning and automated intelligence can help with addressing climate change. Stay tuned for more on that.)

Speaking of which, perhaps the biggest and most urgent trend of all will be facing the scale and scope of climate change, and using whatever technologies and tools we can to mitigate its effects.

Looking into the future for me and for us all

Above all, what is ahead in our future is increasing interconnectedness of our experiences. It’s the perfect time to adopt the mindset that in many respects what I do does affect you just as what you do affects me, and that we’re in this together. We need to accept our futures as wholly connected: connected through data, connected to each other, connected to the planet, connected to our collective destinies.

That connectedness shows in the work I’m lined up to do. To prepare for the bookings I have for 2020 so far, for example, I will be examining more deeply the future of jobs and work, the future of privacy, the future of trust, the future of the climate, and more. All of these topics have a through-line: the future of human experiences will depend heavily on our wise use of technology, collectively and individually.

Speaking of my bookings in 2020, I have talks booked throughout the U.S. — and in Budapest for the first time! If you happen to be able to attend any of these events, be sure to come up and say hi — I’d love to see you. And of course you can always book me to speak at your company or event.

And! I’ve begun to work on my next book. More on that to come, but you can be sure it will follow along these themes.

But for now the big question is:

What will you do with the future for you and for us all?

Here’s hoping you find the quiet reflection you need in these last days of 2019 to set the kinds of intentions that will guide you to achieve what you most want to achieve, for your own good and for the good of humanity.


If this theme resonates with the conversations your company, organization, or city has been having and you’d like to hire me as a keynote speaker at an event in 2020, please do reach out. Here’s to a meaningful year for us all. 

Make Your Own Calling (Transcript of Talk to NSA NYC)

On Thursday, August 22nd, I gave a talk to the National Speakers Association New York City chapter at their annual Summer Social event, to which they invite a lot of prospective members, mainly people who may want to make speaking their business but who aren’t yet there. The theme of the evening was to make it clear that we all have a process by which we get there, and I shared mine. The following is a transcript of that talk.

Photo credit: TE McLaughlin

All my life I’ve envied those people who say in interviews that they always had a singular vision of what they wanted to do with their lives.
I’ve never been one of those people.

I always wanted a calling. I would read interviews with famous people and so many of them said they knew that they had to write or they had to act or they had to play baseball.

I mean, did I have to become a professional keynote speaker, talking to corporate leaders about emerging technologies and digital transformation?
Uh, no. I did not have to.

In fact, throughout my life and especially throughout my career I’ve struggled with pinpointing and defining what I do and what I’m about. Maybe you have, too.

So whether you’re here tonight because you’re building a career as a professional speaker, or you think maybe you’d like to, or whether you just want to be able to do it well enough as a secondary part of your occupation to generate leads for your primary business —
whatever the case, I think now and then it’s helpful to go back and look for clues throughout our lives about what has led us to where we are, and how we can take it further.

Me? I grew up interested in lots of things. Reading was one of my favorite hobbies, as well as writing and making up stories, poems, songs, and plays, and performing them for my family and our friends. And charging maybe a quarter for admission. (Because I was also a budding entrepreneur.)
Also learning to program — which in the ’70s and ’80s meant typing up pages of code I’d torn out of printed magazines.

So somehow I was equal parts bookworm, aspiring writer, stage ham, and computer geek. I was very adaptable, multi-skilled, as it turned out. But I would’ve traded it all in to have had a singular calling.

I wanted my calling to be music — I loved music — I sang at my church, played clarinet first chair in my high school band, and taught myself literally a dozen other instruments. My dream career was to be not just a singer or rock star, but specifically to be a singer-songwriter.

I have this one memory of being very young — maybe 6? — and using the family typewriter to type out lyrics so that I could study them as inspiration for learning to write great songs. You know the earliest one I can remember studying? Bob Seger’s “Against the Wind.”
Yeah, that’s right: a song about an aging man reflecting on the hard decisions in his life.
I was precocious, yes, but I think it also showed that I was also already fascinated with people and their stories, and the human condition.

Anyway, I really wanted music to be my calling, but I loved too many other things. I was too adaptable.

In first grade, I even won first place in two different statewide competitions:
one for a statewide young author’s contest
for a book called Herman the Horse Gets Lost,
and one for a statewide computer programming contest
with a game I’d written/coded called Doggie.
My love of animals was clearly strong even then (and I’ve now been a vegan for 21 years).

And then there were languages. When my grade school class hosted some foreign exchange students from France and we got handouts with French phrases to learn (“bonjour! je m’appelle Kate”), I discovered that I was good at learning them. (It turned out it ran in the family — my dad had been a linguist in the military, and was fluent in Arabic. Side note: he had also been a singer. Also multi-skilled.) I loved language. I taught myself basic Spanish during family trips to the public library. My older sister studied German in high school, and I helped her with her flash cards — and along the way, I picked up the vocabulary and an affection for the language. I’m not kidding, I loved languages.

So while there was no single calling, there were all these recurring themes: writing, performing, computers, music, fascination with people, and language.

That’s not exactly a college major.

So when it came time for college, I couldn’t decide if I was going to major in music, theater, or language. Ultimately I decided that what I wanted to do in music and theater I could do without a degree in those fields, but what I wanted to do with languages
— which was, get this, to become an interpreter at the United Nations
(remember that, because it comes back up later)
— I could only do with a specialized degree.
So I majored in German, minored in Russian and linguistics, and had a concentration in international studies. I went all in on languages.

And then I built my career in technology. But that actually makes a certain kind of sense.

I’ll explain.

I think the thing I always loved about language is that things could mean different things. That a book is also a Buch and a livre and a libro and a книга (kniga)… that you could have different names for the same thing. Which meant, I realized, that a thing exists separately from whatever you call it. Which meant that meaning itself was adaptable.

It turns out that that idea — that meaning isn’t fixed, that we learn and curate our own sense of meaning, that we can create meaningful connections with each other based on what we have in common — that idea became the undercurrent of the work I’ve done throughout my life.

Part of what drives my work in technology is a curiosity about what makes humans human. My contention after some 25 years of working in this field and researching this topic is that the most notable attribute about humanity — and the one most pertinent to a discussion of technology — is that humans crave meaning.

Meaning, after all, takes many forms in our lives: the considerations of relevance, significance, purpose, even our own existence in the cosmos. Meaning is about what matters.

And one of the ways I describe my work is that I am helping humanity prepare for an increasingly tech-driven future, and what’s so fascinating to me about the idea that meaning is what matters is that you can also say that innovation is about what is going to matter.

And all of this is true on both an organizational level and an individual level.
So conveniently, the same tools that I now talk to corporate leaders about in preparing them for digital transformation are tools we need as speakers:
Purpose, relevance, alignment.

We need to define what is most meaningful to us and to our audience to find the alignment between them. We have to be able to tell our own most meaningful stories and talk about our own experiences in a way that people can see how those insights are relevant to them.

And we have to dig deep for our clarity of purpose and know why we’re doing what we’re doing, and as I tell business leaders, we have to know what we are trying to achieve at scale. In other words, what does this look like when it’s very successful? For us that means, not just how much money do you hope to make as a speaker, but what changes do you want to be part of making in people’s lives and in the world?

For me that can be summarized in the phrase Tech Humanist, which is both the term people sometimes call me and the title of my most recent book.
The idea of the book — the idea of my work — is that technology is changing everything, most notably (and what I care the most about) human experience, and business is most responsible for those changes. So there has to be a way to marry the interests of business and humanity through tech, and my work is dedicated to doing just that.

So in practice, what I speak about is digital transformation. But every speaker’s subtext is transformation, of some kind: we’re all trying to help people see their way from one state of mind or being to another state.
In my audience’s case it’s from a state of fear about the future and technology to a state of preparedness for the future and curiosity about how technology can help amplify their company’s purpose.

And in the biggest picture sense, as I mentioned before, I like to say I am helping humanity prepare for an increasingly tech-driven future.
That idea is endlessly inspiring to me, and in my experience, to make a career out of this, you have to draw from what you’re naturally curious and inspired about.

There also has to be alignment with what the market wants. Sometimes that’s not entirely what you want. My moments of strongest market validation this year followed a sarcastic conspiracy tweet about facial recognition, so now I get tagged on a lot of posts people make about technology conspiracy theories and invasive use of facial recognition. Some of that is relevant and interesting to me, but I can’t imagine reshaping my career to become the Tech Conspiracist.

In any case, this is what it all boils down to, all the advice about finding your niche, your positioning, your value proposition… it’s about whatever consumes you in the middle of the night anyway, and what you’ll wake up with fresh ideas about. For me, that is somewhere at the intersection of meaning, technology, and the human condition.

Because eventually I realized that if you aren’t born with a singular calling, you get to spend the rest of your life knitting the threads of your passions together to form one. You get to make your own calling.

In many ways what I do now is the perfect combination of what I dreamed of doing as a kid.

No, I’m not a singer-songwriter, but I do write and I do perform.
I certainly use my skill with language both in a broader sense of understanding the meaning of things
and in a literal sense: I get to try out my foreign languages when I travel internationally.
And these days it takes a pretty good deal of tech savvy to do some of this work, in terms of the digital marketing it takes to build a business. So I’m grateful to have that in my background, too.
So although it sort of bothered me as a kid, I now consider my versatility to be my strongest asset as a writer and speaker:
so many things interest me that I can draw parallels between unexpected ideas for new insights.

Speaking has become my main source of income, and it’s an amazing career because once you decide what your message is,
you can get the message out to people who can take it to heart and make decisions with it.
And I’ve been honored to be asked to speak for big companies with huge impact like Google,
forward-thinking cities like Amsterdam,
and even this year, thanks to my friend Jennifer’s invitation, at the United Nations.
Remember I mentioned that?
When I saw the interpreter booths at the back of the room I got chills.

Getting to speak for cool clients is definitely a perk of this business, and there are plenty of other upsides to this job: travel can be fun, the money can be good, and you can feel like you’re making a difference.
The downsides? The road warrior life can also be exhausting, the time away from friends and loved ones is tough, and most people have no idea what it is you actually do.

As a speaker, you have this weird job no one quite understands
— a lot of people think it’s more glamorous or more sleazy than it really is — 
so it’s nice when you can be around people who understand
that what you’re doing is mining the depths of your experience,
sharing truths about yourself and your observations about the world
so you can help your audience understand how to make a difference, how to transform.

The barrier to entry in this field isn’t very high: you can start speaking anywhere and anytime.
There is no one right way.
You can absolutely use your unique combination of skills and life experiences to carve out a path that suits you so perfectly you might swear it’s your calling.

But the barrier to greatness is a lot higher
and you need great people around you to support you,
to challenge you,
and to encourage you to do better and bigger work.

That’s what’s so great about building our network here and in other communities of speakers, amongst all these other adaptable, versatile, multi-skilled people like you with varied and colorful life experiences who are just as much on a quest to make your life into your calling, get your unique message out, and transform the world.

Preparing for the Next 10+ Years: Data After the #10YearChallenge Data Sharing Discussion

I’ve been fortunate enough to make my living writing, speaking, and advising about the impact of technology on humanity for quite a few years now. Most commonly, though, my audiences tend to be business leaders, and what I write and speak and advise about most often is how they can adopt a digital transformation strategy that helps the company succeed while keeping the human in focus and respecting human data.

So the massive mainstream media reaction to my viral #10YearChallenge tweet and subsequent piece in WIRED was in some ways a switch in perspective: from talking to businesses about human data, to talking to humans about business use of their data. And it gave me the chance to address a far more universal audience than usual — on BBC World News, Marketplace, and NPR Weekend Edition, among many other outlets — in a cultural moment so widely discussed, it was referenced in the top article on the Reddit homepage and mentioned on The Daily Show with Trevor Noah. My goal through it all was to spark greater awareness about how much data we share without realizing it, and how we can manage our data wisely. Just as the goal of my work is to help amplify the meaning in the experiences businesses create for the humans they do business with, my hope in connecting with a mainstream audience was to encourage people to participate in experiences meaningfully and mindfully. People all over the world gave me an overwhelming amount of feedback: some worried, some praising, some critical. I listened as openly as I could to everything I could.

With all that listening, I know that some common questions remain. I see many of the same recurring themes in comments on Twitter and elsewhere. So I’m using this opportunity here, at home on my own company’s site without the time limits and fleeting news cycles of a major news channel, to address a few of them, and I hope they will, in their own small way, be part of the conversation we carry forward.

Let’s get this one out of the way first, since it’s been the biggest misunderstanding throughout this whole deal:

“Facebook says they didn’t have any part in the meme. Didn’t you say they designed the whole #10YearChallenge meme to gather user data to train their facial recognition algorithm?”

It’s funny: I didn’t say Facebook did it, and quite frankly, it wouldn’t matter. I was musing on the fact that the meme was creating a rich data set, and pondering aloud what that data set could theoretically be used for. In any case, it was a thought experiment, not an accusation. In my WIRED article I expanded on the thought experiment and did not accuse Facebook of having engineered it. In fact, more importantly, as I wrote there:

The broader message, removed from the specifics of any one meme or even any one social platform, is that humans are the richest data sources for most of the technology emerging in the world. We should know this, and proceed with due diligence and sophistication.

— excerpt from my article in WIRED

That said, though, I wouldn’t have made any definitive statements from the beginning claiming that Facebook didn’t or wouldn’t have done something like this. I’m sure there are plenty of well-meaning people in the company’s leadership, but between psychological experiments, Cambridge Analytica, and various leaks and breaches, there have been too many missteps, lapses, and outright errors in judgment on Facebook’s part for them to be above suspicion when it comes to violations of data security and trust.

Nonetheless, although it was a very common misconception, I genuinely don’t suspect that the meme began with Facebook — and I don’t believe that matters. What matters is that we use these discussions to deepen our thinking about personal data, privacy, and trust.

“How can people who’ve taken your message to heart and now recognize the importance of this topic learn to manage their data more wisely?”

If you think of your data as money, you may have a better instinct for why you need to manage it well, and take care not to spend it loosely or foolishly. I’m not a fan of the idea of data as currency (partly because I think the human experience is more dimensional than a monetary metaphor conveys), but just this once I think it may be a helpful comparison. And as long as you know you’re safe, not getting lied to or ripped off, this “data is money” comparison may help illustrate why it can be worth it to spend it on experiences that matter to you.

In terms of actionable steps, here are a few practices that may help:

Personally, one easy step I take is to use the On This Day feature on Facebook to go through my posting archive day by day. I may change the permissions on old content, or delete a post completely if it seems like it no longer serves me or anyone else to have it out there.

I also have recurring reminders on my calendar to do reviews and audits of my online presence. I do what I call a weekly glance, a quarterly review, and an annual audit. For the weekly session, you can assign yourself one platform each week, and review your security settings and old content to make sure there isn’t anything out there that you no longer want to share. The quarterly review and annual audit may entail different activities for you, but for me they also involve updating old bios and links in various places, so it becomes a strategic review as well as a security check.
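If a calendar reminder alone isn’t enough of a nudge, here’s a minimal sketch of how that rotating cadence could be scripted. The platform list and the first-week-of-quarter and first-week-of-year rules are illustrative assumptions about one way to schedule it, not a prescription:

```python
# Hypothetical helper for the "weekly glance / quarterly review / annual audit"
# cadence described above: one platform per week, plus periodic deeper passes.
from datetime import date

PLATFORMS = ["Twitter", "Facebook", "LinkedIn", "Instagram"]  # example rotation

def review_tasks(today: date):
    _, week, _ = today.isocalendar()
    platform = PLATFORMS[week % len(PLATFORMS)]
    tasks = [f"Weekly glance: check {platform} security settings and old posts"]
    # First week of each quarter: broader review of bios, links, and permissions.
    if today.month in (1, 4, 7, 10) and today.day <= 7:
        tasks.append("Quarterly review: update bios, links, and permissions everywhere")
    # First week of the year: full audit of the whole online presence.
    if today.month == 1 and today.day <= 7:
        tasks.append("Annual audit: full pass over every account and archive")
    return tasks

if __name__ == "__main__":
    for task in review_tasks(date.today()):
        print(task)
```

Run weekly (by hand, or from a scheduled job), it simply tells you which platform is up for review this week and whether a deeper pass is also due.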

“What about Apple Pay and unlocking your phone with your face, or accessing your bank account with your face? Or paying for your meal with your face? What about other biometric data like fingerprints?”

All of this is relevant, and I’ll unpack some of these issues more in future articles. The short answer, though, is that with some of these uses, such as Apple Pay, you take an educated guess that the company collecting your data will safeguard it, because the company bears some risk if they screw up. But not all data sharing carries proportional risk on both sides, so think critically before using these services.

At least for now, pay for your fried chicken with cash, not your face.

“What about 23andme and other DNA/genetic data issues?”

That’s a whole other article. (I will say I personally haven’t done a commercial DNA test because bad outcomes always seemed possible.) The topic does relate to the rest of this, and it does matter that we 1) stay cautious about using commercial services like this, and 2) hold companies accountable for adhering to the uses we agreed to and not overstepping what we understood to be our contract.

“What about data tracking in smart home systems?”

The standards and precedents are not yet well defined for the use and protections on data collected by smart home devices like smart speakers listening passively for a command. The safest thing to do is hold off on using them, and the second-safest thing is to turn them off when not in use.

While I did address some of the issues and opportunities with smart home automation and devices in Tech Humanist, this is again a topic I’ll dig into more in future articles.

“What about regulations on data? What about regulations on facial recognition, or on AI in general?”

The vast amount of personal data transmitted and collected by business, government, and institutional entities is what powers algorithmic decision making, from ecommerce recommendations to law enforcement. And this vast data and broad algorithmic decision-making is also where machine learning and artificial intelligence takes root. Artificial intelligence, broadly, has the chance to improve human life in many ways. It could help address problems associated with world poverty and hunger; it could improve global transportation logistics in ways that reduce emissions and improve the environment; it could help detect disease and extend healthy human life.

But machines are only as good as the human values encoded into them. And where values aren’t clear or aren’t in alignment with the best and safest outcomes for humanity, regulations can be helpful.

The European Union’s General Data Protection Regulation, or GDPR, which took full effect in May 2018, is for now the most comprehensive set of regulatory guidelines protecting individuals’ data. And American tech companies have to play by these rules: just this week, Google got hit with a 50 million euro fine for violating the provision that requires companies to clearly disclose to consumers what data they collect.

Meanwhile, for many Americans it’s tough to imagine what entity in the United States would be responsible for enforcing any set of regulations pertaining to data and AI.

And just as with climate change, we need efforts at both the macro and the micro scale: the experts tell us that any real reduction in environmental impact requires big movement from the commercial and industrial entities that produce the lion’s share of emissions, but that doesn’t mean you shouldn’t put your soda bottle in the recycling bin instead of the trash. We’re learning more and more how important it is to be mindful of our ecological footprint; we also need to learn to be mindful of our digital footprint.

“Should I turn off facial recognition image tagging in Facebook?”

I would advise doing so, yes.

the Facebook settings screen where you can disable automatic face recognition

“Are you saying I can’t have any fun online?”

Oh, heck no. By all means, I am very pro-fun. Even when it comes to digital interactions.

It’s easier to have fun when you know you’re reasonably safe, though, right? The biggest takeaway from this discussion about the possible side effects of the #10YearChallenge should be to remember that when any meme or game is encouraging you — and large groups of other people — to share specific information about yourself, it’s worth pausing before you participate. It’s relevant to wonder who might be collecting the data, but it’s far more important to think about what the collected data can do.

But share the meaningful parts of your life online with friends and family, and enjoy being able to follow their updates about the meaningful parts of their lives. That has certainly been the most wonderful benefit of social media.

Not only am I pro-fun, I am also very pro-technology. I love tech, and I genuinely think emerging technologies like AI, automation, and the Internet of Things — all largely driven by human data — have the chance to make our lives better. (As I wrote in Tech Humanist, I believe we have the chance to create the best futures for the most people.) But to achieve that, we need to be very mindful about how they can make our lives worse, and put measures in place — in our government, in our businesses, and in our own behavior — to help ensure the best outcomes.

Beyond Customer Experience

Businesses are finally starting to catch on that a disciplined approach to improving the customer experience leads to profit. That’s the starting point, and it’s fantastic.

But what’s the next step? What’s beyond improving the customer experience?

Well, we can think about the customer not merely as a customer, but as a well-rounded human being, who takes on many roles throughout the course of a day: patient, student, user, guest, citizen, not to mention friend, employee, parent, and so on. We can improve the human experience.

How can we improve human experience? How can we think about those many roles we all have in a business context, and why should we?

Those additional roles become dimensions of the person you’re trying to do business with. The more dimensional that person is to you, the more likely you’ll be able to offer them value. When you offer them value, you establish the basis of a meaningful relationship.

We always have to look for the human nuances if we want to build meaning.

The Thing About the Internet of Things is the Humanity in the Data

The thing about the Internet of Things is it isn’t about the things; it’s about the people.

The “things,” for the most part, are designed to create more connected experiences for humans. And the data layer that connects the digital experiences to the physical world through our gestures and actions is our data.

The data that connects the online and offline worlds flows largely through us: through our transactions and purchases, through our speech, through our attention, through everything we do.

In the course of analyzing, optimizing, and targeting, we can’t let ourselves forget about the humanity in the data.

(This, by the way, is part of what I examine in my forthcoming book Pixels and Place: Designing Human Experience Across Physical and Digital Spaces. Available in print and Kindle versions on September 1st, but you can pre-order a Kindle copy now.)

The Rental Car Traffic Violation Scam Hypothesis and Personal Privacy

vintage photo of driver being ticketed by police officer (source: Wikimedia Commons)

I got a ticket in the mail yesterday for running a red light. Well, it wasn’t a ticket, exactly. It was a “notice of an administrative fee” for a red light violation that allegedly happened while I was driving a rented car in my mom’s town the day before her birthday. The ticket itself apparently hasn’t even arrived in the mail yet, but the rental car company has a whole operation to process the administrative fees from traffic violations incurred by their renters, and they’re not wasting any time collecting theirs.

I bring this up here because it’s actually happened to me quite a lot. Nearly every time I rent a car, I end up getting a traffic ticket in the mail a few months later (and as a consultant and speaker who often travels to clients and events, I rent a lot of cars). You may be tempted to joke that I’m a terrible driver, but these traffic violations by mail never showed up when I was driving my own car. The discrepancy has become enough of a pattern that my mind, not usually given to conspiracy theories, started to formulate a hypothesis about how this could be part of a program to make money off of car renters.

Anatomy of a Scam Hypothesis

How could this be happening? Well, the rental car agencies could be selling their driver rental information to the companies that operate the traffic cameras. The traffic cameras could be scanning license plates and matching them against a list from the rental agencies. They could be issuing tickets on violations or close-enough-to-be-violations only when there’s a match.

I also notice that I never get more than one traffic violation per rental. The system could be set to throttle the tickets to one per rental period. Casual renters wouldn’t think much about it. “Oh well,” they’d think, “I got a ticket. I’ll just pay.”
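To underscore how little machinery a scheme like this would actually need, here’s a rough, purely hypothetical sketch in Python. Every name, plate, and date in it is invented for illustration; it isn’t based on any real rental or traffic-camera system, and it exists only to show that the plate matching and the one-ticket throttle would be trivial to implement if the data were shared.

```python
# Purely hypothetical sketch of the scheme described above; all names and data
# structures are invented for illustration, not drawn from any real system.
from dataclasses import dataclass
from datetime import date

@dataclass
class Rental:
    plate: str
    start: date
    end: date
    ticketed: bool = False  # the hypothesized one-ticket-per-rental throttle

def should_issue_ticket(camera_plate: str, event_date: date, rentals: list) -> bool:
    """Issue a ticket only if the scanned plate matches an active rental
    that hasn't already produced a ticket during this rental period."""
    for rental in rentals:
        if (rental.plate == camera_plate
                and rental.start <= event_date <= rental.end
                and not rental.ticketed):
            rental.ticketed = True
            return True
    return False

# Example: the first camera event during a rental triggers a ticket;
# a second event during the same rental period is throttled.
rentals = [Rental(plate="ABC1234", start=date(2016, 5, 1), end=date(2016, 5, 7))]
print(should_issue_ticket("ABC1234", date(2016, 5, 3), rentals))  # True
print(should_issue_ticket("ABC1234", date(2016, 5, 5), rentals))  # False
```

Again, that’s a sketch of a hypothesis, not an accusation; the point is simply that nothing about it would be technically hard.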

But frequent renters, like me, start to notice a pattern. Why is it that I owned a car all my adult life until late last year, drove all over the country, frequently took my car on road trips, and was never once issued a traffic-camera ticket for any of those trips, yet when I drive in some of these same towns in a rental car, I get tickets mailed to me?

“Maybe It’s Just You”

Perhaps you’re skeptical, as we discuss this over drinks, and you offer that maybe instead of this elaborate scam, there’s instead a behavioral science element to all this: suppose we all drive a little more recklessly when we’re in a rented car. That seems a reasonable counter-hypothesis, I’d concede with a tip of my beer mug, but without supporting data or a compelling argument to convince me that there might be truth to this, I maintain that, whether I own the car or borrow it, I drive as I drive. And pass the peanuts.

Anyway, What About Due Process?

Most of all, whether the intent to conspire is there or not, surely it brings up questions of due process. If a police officer had simply pulled me over in each of these places, there’d be far less question of legitimacy. You say I ran a red light? Pull me over right then. The action on which the claim is based will be fresh in my head and I can either challenge the officer (calmly and politely, of course) about the veracity of the claim or accept the ticket. (Or cry, and maybe get off with a warning. Oh, relax; I’m kidding.) But you say I ran a red light three months after the fact? I barely recall being at the intersection in question, let alone what the conditions of the intersection were, or the timing of the light, or the layout of the traffic around me. Even if you were to furnish me with photographic evidence of my rented car with me in it clearly violating a red light, I still don’t have the consideration of context, and I get no due process at all.

The Bigger Issue: Privacy, Personal Rights, and Public Data

I don’t necessarily believe my conspiracy hypothesis about the rental car traffic violation scam; I just think it’s possible, and at the rate I get these tickets, I admit I’m a tad suspicious. But I’m less concerned with that and more conscious of the bigger issue: how vulnerable people are, and increasingly will be, to schemes that take advantage of ever-present tracking data, surveillance, and systems with default authority, such as rental car companies and traffic enforcement bureaus. Even if these entities aren’t trying to be exploitative, the more access they have to integrated data about our movements and behaviors, the greater the potential for them to overstep the authority we think we’ve granted them.

So why do I share this half-baked conspiracy idea anyway? Because the premise is not mere science fiction; it’s certainly not impossible, and it’s important that we regularly remind ourselves how much powerful data about people is available to companies and governments. That power is growing, and to a great degree it’s already out of our hands as citizens, consumers, patients, and the public. So where and when we can, it’s important to think critically about the implications, and it’s important for those of us who work in and around data systems that track human actions to be mindful of what that means.

Meanwhile, to finish on a lighter note, here’s how comedian Joe Lycett handled a mailed-in notice of a parking ticket. Enjoy.