Preparing for the Next 10+ Years: Data After the #10YearChallenge Data Sharing Discussion

I’ve been fortunate enough to make my living writing, speaking, and advising about the impact of technology on humanity for quite a few years now. Most commonly, though, my audiences tend to be business leaders, and what I write and speak and advise about most often is how they can adopt a digital transformation strategy that helps the company succeed while keeping the human in focus and respecting human data.

So the massive mainstream media reaction to my viral #10YearChallenge tweet and subsequent piece in WIRED was in some ways a switch in perspective: from talking to businesses about human data, to talking to humans about business use of their data. And it gave me the chance to address a far more universal audience than usual — on BBC World News, Marketplace, and NPR Weekend Edition, among many other outlets — in a cultural moment so widely discussed, it was referenced in the top article on the Reddit homepage and mentioned on The Daily Show with Trevor Noah. My goal through it all was to spark greater awareness about how much data we share without realizing it, and how we can manage our data wisely. Just as the goal of my work is to help amplify the meaning in the experiences businesses create for the humans they do business with, my hope in connecting with a mainstream audience was to encourage people to participate in experiences meaningfully and mindfully. People all over the world gave me an overwhelming amount of feedback: some worried, some praising, some critical. I listened as openly as I could to everything I could.

With all that listening, I know that some common questions remain. I see many of the same recurring themes in comments on Twitter and elsewhere. So I’m using this opportunity here, at home on my own company’s site without the time limits and fleeting news cycles of a major news channel, to address a few of them, and I hope they will, in their own small way, be part of the conversation we carry forward.

Let’s get this one out of the way first, since it’s been the biggest misunderstanding throughout this whole deal:

“Facebook says they didn’t have any part in the meme. Didn’t you say they designed the whole #10YearChallenge meme to gather user data to train their facial recognition algorithm?”

It’s funny: I didn’t say Facebook did it, and quite frankly, it wouldn’t matter. I was musing on the fact that the meme was creating a rich data set, and pondering aloud what that data set could theoretically be used for. In any case, it was a thought experiment, not an accusation. In my WIRED article I expanded on the thought experiment and did not accuse Facebook of having engineered it. In fact, more importantly, as I wrote there:

The broader message, removed from the specifics of any one meme or even any one social platform, is that humans are the richest data sources for most of the technology emerging in the world. We should know this, and proceed with due diligence and sophistication.

— excerpt from my article in WIRED

That said, though, I wouldn’t have made any definitive statements from the beginning claiming that Facebook didn’t or wouldn’t have done something like this. I’m sure there are plenty of well-meaning people in the company’s leadership, but between psychological experiments, Cambridge Analytica, and various leaks and breaches, there have been too many missteps, lapses, and outright errors in judgment on Facebook’s part for them to be above suspicion when it comes to violations of data security and trust.

Nonetheless, although it was a very common misconception, I genuinely don’t suspect that the meme began with Facebook — and I don’t believe that matters. What matters is that we use these discussions to deepen our thinking about personal data, privacy, and trust.

“How can people who’ve taken your message to heart and now recognize the importance of this topic learn to manage their data more wisely?”

If you think of your data as money, you may have a better instinct for why you need to manage it well, and take care not to spend it loosely or foolishly. I’m not a fan of the idea of data as currency (partly because I think the human experience is more dimensional than a monetary metaphor conveys), but just this once I think it may be a helpful comparison. And as long as you know you’re safe and aren’t being lied to or ripped off, the “data is money” comparison may help illustrate why spending your data on experiences that matter to you can be worth it.

In terms of actionable steps, here are a few practices I find helpful:

Personally, one easy step I take is to use the On This Day feature on Facebook to go through my posting archive day by day. I may change the permissions on old content, or delete a post completely if it seems like it no longer serves me or anyone else to have it out there.

I also have recurring reminders on my calendar to do reviews and audits of my online presence. I do what I call a weekly glance, a quarterly review, and an annual audit. For the weekly session, you can assign yourself one platform each week, and review your security settings and old content to make sure there isn’t anything out there that you no longer want to share. The quarterly review and annual audit may entail different activities for you, but for me they also involve updating old bios and links in various places, so it becomes a strategic review as well as a security check.
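If you like to automate little routines, here is a minimal sketch of that weekly rotation in Python. The platform list and the idea of keying the rotation to the ISO week number are illustrative assumptions on my part, not a prescription; substitute whatever accounts and cadence fit your own audit.

```python
from datetime import date

# Illustrative list only; swap in wherever you actually have accounts.
PLATFORMS = ["Facebook", "Twitter", "LinkedIn", "Instagram", "Google"]

def platform_to_review(today=None):
    """Pick one platform per ISO week so the whole list gets a pass over time."""
    today = today or date.today()
    week = today.isocalendar()[1]  # ISO week number, 1-53
    return PLATFORMS[(week - 1) % len(PLATFORMS)]

if __name__ == "__main__":
    print("This week's privacy review:", platform_to_review())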

“What about Apple Pay and unlocking your phone with your face, or accessing your bank account with your face? Or paying for your meal with your face? What about other biometric data like fingerprints?”

All of this is relevant, and I’ll unpack some of these issues more in future articles. The short answer, though, is that with some of these uses, such as Apple Pay, you take an educated guess that the company collecting your data will safeguard it, because the company bears some risk if they screw up. But not all data sharing carries proportional risk on both sides, so think critically before using these services.

At least for now, pay for your fried chicken with cash, not your face.

“What about 23andme and other DNA/genetic data issues?”

That’s a whole other article. (I will say I personally haven’t done a commercial DNA test because the potential for bad outcomes always seemed too real.) The topic does relate to the rest of this, and it does matter that we 1) stay cautious about using commercial services like this, and 2) hold companies accountable for adhering to the uses we agreed to and not overstepping what we understood to be our contract.

“What about data tracking in smart home systems?”

The standards and precedents for the use and protection of data collected by smart home devices, such as smart speakers listening passively for a command, are not yet well defined. The safest thing to do is hold off on using them, and the second-safest thing is to turn them off when not in use.

While I did address some of the issues and opportunities with smart home automation and devices in Tech Humanist, this is again a topic I’ll dig into more in future articles.

“What about regulations on data? What about regulations on facial recognition, or on AI in general?”

The vast amount of personal data transmitted and collected by business, government, and institutional entities is what powers algorithmic decision-making, from ecommerce recommendations to law enforcement. And this vast data and broad algorithmic decision-making are also where machine learning and artificial intelligence take root. Artificial intelligence, broadly, has the chance to improve human life in many ways. It could help address problems associated with world poverty and hunger; it could improve global transportation logistics in ways that reduce emissions and improve the environment; it could help detect disease and extend healthy human life.

But machines are only as good as the human values encoded into them. And where values aren’t clear or aren’t in alignment with the best and safest outcomes for humanity, regulations can be helpful.

The European Union’s General Data Protection Regulation, or GDPR, which went into full effect in May 2018, is for now the most comprehensive set of regulatory guidelines protecting individuals’ data. And American tech companies have to play by these rules: just this week, Google was hit with a 50 million euro fine for violating the provision that requires companies to provide clear disclosure about the data they collect from consumers.

Meanwhile, for many Americans it’s tough to imagine what entity in the United States would be responsible for enforcing any set of regulations pertaining to data and AI.

Still, just as with climate change, we need efforts at both the macro and micro scale: experts tell us that any real reduction in environmental impact requires big movement from the commercial and industrial entities that produce the lion’s share of emissions, but that doesn’t mean you shouldn’t still put your soda bottle in the recycling bin rather than the trash. We’re learning more and more how important it is to be mindful of our ecological footprint; we also need to learn how to be mindful of our digital footprint.

“Should I turn off facial recognition image tagging in Facebook?”

I would advise doing so, yes.

[Image: the Facebook settings screen where you can disable automatic face recognition]

“Are you saying I can’t have any fun online?”

Oh, heck no. By all means, I am very pro-fun. Even when it comes to digital interactions.

It’s easier to have fun when you know you’re reasonably safe, though, right? The biggest takeaway from this discussion about the possible side effects of the #10YearChallenge should be to remember that when any meme or game is encouraging you — and large groups of other people — to share specific information about yourself, it’s worth pausing before you participate. It’s relevant to wonder who might be collecting the data, but it’s far more important to think about what the collected data can do.

But share the meaningful parts of your life online with friends and family, and enjoy being able to follow their updates about the meaningful parts of their lives. That has certainly been the most wonderful benefit of social media.

Not only am I pro-fun, I am also very pro-technology. I love tech, and I genuinely think emerging technologies like AI, automation, and the Internet of Things — all largely driven by human data — have the chance to make our lives better. (As I wrote in Tech Humanist, I believe we have the chance to create the best futures for the most people.) But to achieve that, we need to be very mindful about how they can make our lives worse, and put measures in place — in our government, in our businesses, and in our own behavior — to help ensure the best outcomes.

2018: Transformations All Over the Place

It’s a good thing the work I do is in insights and transformations, because probably little else would have prepared me for 2018.

On a global scale, this year seemed to be about 1) getting a grip on the scale and immediacy of climate change, 2) raising questions of policy and human decency toward migrants and refugees, 3) comprehending the magnitude of emerging data privacy issues and the impact of technology on our behavior and our lives, and 4) dealing with a recidivistic slide in countries around the world into populist nationalism and fascism. So. Y’know. Just little stuff.

Transformation and Systems

Since my own work is at the intersection of technology and humanity, I was particularly interested in the stories that pertained to that third topic: data privacy and the impact of technology on human behavior, experience, and life in general. But I also know that none of these topics happened in a vacuum. Our willingness to confront climate change—or not—will parallel and perhaps have rippling consequences in how we handle the emergence of artificial intelligence. Acknowledging and dealing with underlying issues will be key in both scenarios. And the discourse around those topics will shape the global political theater, and vice versa. It’s all connected. 

So it’s timely that this is the year my book Tech Humanist was published; it delves into the idea that how we articulate purpose and values inside business will have effects at scale on the human experience. The reviews and testimonials for the book have been incredible and humbling. Here’s one:

“For the past two decades, the Computer History Museum has chronicled the amazing rise of the technology which just in our lifetime has become the most powerful agent of change the world has ever known. While the stories of creativity, invention, innovation and impact are fascinating, what all this means for the future and humanity is what we are poised to take on now as an institution. Nowhere has this become more clear to me and my colleagues here at the museum than in reading Kate O’Neill’s blog about a year ago entitled “The Tech Humanist Manifesto.” The idea that we need to develop and imbed in all future technologies the very best of ourselves and our ethics and ultimately have the goal of those emerging technologies to make us better humans has resonated deeply into our own plans of what we will present, discuss, and debate going forward.
After reading the manifesto, my initial thoughts were ‘Kate should write a book on this.’ Which I am very happy that she has done, and now her humor, excellent insights and heartfelt philosophy can reach the leaders and influencers throughout the world. And the rest of us too.”
— Gary Matsushita, Vice President, Computer History Museum

As the book launched in September, I embarked on a nearly four-week-long speaking and book tour, finishing the trip by being recorded for the “Talks at Google” lecture series, which they describe as bringing “the world’s most influential thinkers, creators, makers, and doers all to one place” — and that place is the Googleplex in Mountain View, California. With that description, I can’t tell you how honored I was to be asked.

If you don’t have time to watch, I’ll give you the premise: the technology-driven future can be filled with human meaning. I genuinely believe that. It’s the undercurrent of my work, and my personal and professional purpose. With automation and artificial intelligence amplifying and accelerating the goals of business, it’s an important message for business leaders and experience designers to hear. 

Human-Centric Digital Transformation in Business

It encourages me that so many companies have signed on to this “Tech Humanist” message of human-centric digital transformation by hiring me to keynote their corporate events (and it was lovely that so many this year were in wonderful locations, such as Palm Beach in April for a Kelly Services event, and Barcelona in May for a Cisco event). 

Delivering the keynote at a Kelly Services event in early 2018

In October, I was delighted to partner with Cognizant and their Center for the Future of Work for a webinar on automation and the future of human jobs and work. We talked about how and when human jobs will be augmented, displaced, and replaced by automation, but also how new jobs will be created, and what those jobs are likely to be. (I am already booked to do a good deal more writing and speaking on that subject in 2019, as well; sign up for the KO Insights email list if you’d like to be notified when new insights are available.) 

In December, analyst firm HfS Research, which specializes in automation and artificial intelligence for enterprise, invited me back to keynote their FORA (Future of Operations in the Robotic Age) event on the hyperconnected economy. Again, they did this specifically to emphasize the human angle in this otherwise technology-heavy discussion of enterprise operations. I find that incredibly encouraging, and I’d love to suggest that you should, too.

The Tech Humanist Movement Grows

My “tech humanist” message and movement is spreading in ways I could never have predicted, too: some of this year’s highlights for me were seeing my work finding its way into university curricula, such as having The Tech Humanist Manifesto licensed for inclusion in a textbook, and seeing my work spread internationally, such as signing the paperwork to have my previous book, Pixels and Place, translated into Korean.

Transformation in Cities

In fact, speaking of the international scale of the message: in early December I was honored to be asked to keynote the Amsterdam Economic Board’s annual meeting, as part of an initiative preparing the city of Amsterdam to be future-ready for its 750th (!!!) birthday in 2025. It was the perfect synthesis of Pixels and Place and Tech Humanist: I shared my thinking about how cities of the future can be fully human-centric while embracing data and emerging technology to empower their citizens, their visitors, and all the humans who live, work, and play in them.

Transformation Happens on a Personal Level, Too

The theme of transformations with systemic consequences carried over on a personal level, too: I celebrated 20 years since quitting smoking and 20 years since going vegan. Apparently 1998 was also a pretty darned transformational year for me.

Oh, and another transformation: suddenly this year I became allergic to mango! I love love love mango, so that was disappointing. Now I have to carry an Epi-Pen with me everywhere; that’s a weird change that could have systemic effects. After all, who knows if someday I may need to offer my Epi-Pen to someone else who’s having an allergic reaction? (It’s a good idea for more of us to carry Epi-Pens.) 

On a heavier note, this was also a tough year for transformative losses: most notably my dear friend Jen lost her husband in late summer, and for me and many of her friends, the following weeks and months were devoted to seeing her through her grief and adjustment to being a widow, something I am unfortunately qualified to help with.

Speaking of which, another systemic effect: with Kate Spade’s and Anthony Bourdain’s deaths by suicide happening just days before the 6th anniversary of my late husband’s death by suicide, I felt their deaths acutely as triggers. Triggers tend to be talked about with a sneer in contemporary culture at the moment, but they’re real and they’re hard; I wrote about them in this essay called “Suicide vs. Love” back in 2014 when Robin Williams died.

It All Connects Together

And the reason why all these deeply personal matters are relevant here, in this year’s business summary along the theme of interrelating systems, of how one planet’s shadow causes change on another planet’s surface, is that this is how we must begin to think about humanity. We do not live in isolation; we do not live in tidy boxes that separate one effect from another. Our lives and our deaths affect one another. Our decisions—professional and personal—shape and change each other’s lives.

The world around us is transforming in big, fast, sometimes frightening ways, and it will continue to transform, radically and quickly. We must adapt with it, and we must take responsibility for our role in making the best of those changes. That means thinking about the connectedness of systems, and about the connectedness of us all.

Happy New Year, and may 2019 bring about happy and meaningful changes for you, for me, and for everyone. 


If this theme resonates with the conversations your company, organization, or city has been having and you’d like to hire me as a keynote speaker at an event in 2019, please do reach out. Here’s to a meaningful year for us all. 

10 Fundamental Insights about the Tech-Driven Future for Humanity*

*and why women, POC, and other underrepresented people in tech should lead it

Today I spoke at the Irish Business Organization of New York’s women’s networking luncheon and addressed them on the tech-driven future for humanity, and why women should be leading it.

[Image: Tech Humanist front cover]

Here are those insights in brief; if you’d like to hear more of this, of course, I elaborate on all of these points within my keynote presentations and my books.

  1. The tech-driven future will be neither dystopia nor utopia. It will be what we make it.
    We tend to tell a story about technology that pits the worst case scenario against the best case scenario — and conveniently leaves our actions and responsibilities out of the equation. But the truth is we are very much responsible for shaping the future of technology.
    Is it possible that tech can even help us be better humans? As I repeatedly asserted in Tech Humanist, with the emergence of automation, artificial intelligence, and other capacity-expanding tech, we will have the opportunity to create the best futures for the most people.
  2. Humans crave meaning.
    We just do. We seek meaning, we’re compelled by meaning; when you offer meaning to us, we can’t resist it. To bridge the gap between what makes tech better for business and better for humans, business needs to create more meaningful human experiences at scale.
    Moreover, the shape meaning takes in business is purpose, and the amazing thing about purpose is that when you can be clear about what you are trying to do at scale, it helps both humans and machines function more effectively. Humans thrive on a sense of meaning, common goals, and a sense of fulfilling something bigger. Machines thrive on succinct instructions. A clearly articulated sense of strategic purpose helps achieve both of these.
  3. Robots aren’t “coming.” They’re here.
    Everyone talks about robots coming like they’re some far-off future, as if millions of homes don’t already have Roomba and Alexa.
  4. What tech does well vs. what humans do well will continuously evolve.
    What does tech do well, for now? Productivity: speed up laborious tasks, improve reliability of variable tasks, automate repetitive tasks, archive, index. Certain types of predictive insights: track data, expose patterns. Security: impose rules and limits, regulate access.
    What doesn’t tech do as well? Tech isn’t so hot at: managing people. Making judgment calls. Fostering relationships. Discerning contextual nuance. (Yet.)
    Also, humans can’t leave meaning up to machines. That’s value humans add to the equation.
  5. Machines are what we encode of ourselves.
    And since that’s true, why not encode our best selves? Our most enlightened selves?
  6. Data-rich experiences tend to be better experiences. Just remember that analytics are people.
    Everyone loves the oft-quoted statistics about data: every 2 days we create as much information as we did from the beginning of time until 2003, and over 90% of all the data in the world was created in the past 2 years.
    And there are huge opportunities to use this data to make amazing, delightful, fulfilling, enriching human experiences possible.
    But what’s important in all of this is remembering that most of this data comes from humans, and represents human identity, preferences, motivations, desires, and so on. Most business data is about people. Analytics, in other words, are people. And while relevance is a form of respect, discretion is, too. So we need to treat human data with respect and protect it excessively, even as we use it to inform the design of more meaningful experiences.
  7. If you don’t align human experiences with meaning, you risk building absurdity at scale.
    There’s a story I tell (and it’s in the book) about a big retailer encoding a behavior change that, at some point, could put a cultural norm in jeopardy. And the upshot is: experience at scale changes culture. Because experience at scale is culture.
  8. “Online” and “offline” are blurrier than you may think.
    This is basically the whole premise of my previous book Pixels and Place, but the short version of this insight is: just about everywhere the physical world and the digital world converge, the connective layer is the data captured through human experience.
    And to create more meaningful human experiences, we need to design more integrated human experiences.
  9. Everything is in flux. Embrace change.
    70-80% of CEOs say the next three years are more critical than the past 50 years. The coming years, for example, are likely to see massive shifts in the scope and types of jobs humans do. Some companies will gain tremendous efficiencies from the use of automation; I propose that companies reinvest some of those gains into humanity in various ways: better customer experiences, job training, basic income experiments, etc. And that, where possible, companies look to repurpose human skills and qualities toward higher-value roles.
  10. Diversity in tech is a strategic asset. Scratch that: it’s an absolute imperative.
    We need women — and diversity of all kinds — in tech, leadership, and entrepreneurship for myriad reasons: because algorithms contain our biases, because it makes the space better for everyone, because we need diverse representations of the problems tech can solve, and on and on.

If these ideas and insights resonate with you, check out my book Tech Humanist: How You Can Make Technology Better for Business and Better for Humans. Or inquire about booking me to speak at your company or organization.

Here’s to a more meaningful future for all of us.

Experience Timeline by Technology Era

To understand what constitutes experience and what has constituted experience throughout different eras of technology, I offer this timeline of what characterized and will characterize experiences throughout the major eras of recent and forthcoming technology. We are somewhere around the social-enabled and “smart” era, with elements of the “intelligent” era beginning to show up and legacy remnants of the previous eras still left behind.

Platform/context (not strict eras, because many overlap):
analog (industrial/pre-industrial?) → digital → web-enabled → social-enabled → “smart”/connected data sources → “intelligent”/AI → fully virtual / ambient virtual

Characterized by:
solid state, tangible → electronic, power-operated → interlinked, global knowledge, global village → social sharing, FOMO, FONS, selfie culture → data tracking, anticipatory based on past behavior, algorithmic → anticipatory based on externalities, secondary behaviors, cognitive cues, emotional indicators

Dominant eras:
??–?? (ongoing) → 19th century–?? (ongoing) → 1990s–?? (ongoing) → 2000s–?? (ongoing) → 2010s–?? (ongoing) → 2010/20s–?? (ongoing)

Automation:
mechanical → electronic → interlinked → social triggers → algorithmic → anticipatory

Dominant interface:
tactile → tactile, impulse?, text → desktop screen, text, images → mobile screen, text, videos → voice → voice, gesture, ambient

Sensory interactions:
buttons, dials, levers, etc. → typing, mouse, visual cues → typing, mouse, visual cues → typing, touch, visual interactions → buttons, keypads, visual displays, voice

Sensory channels across contexts:
Visual: yes (noted for six of the contexts)
Tactile: yes (noted for three of the contexts)
Audio: indicators → indicators → content → content → interactions → interactions, triggers → ambient cues
Kinesthetic: motion-powered → gestures to trigger sensors → gestures to interact
Olfactory: detect gas leaks, detect coffee smell → simulate aromas?
Taste: simulate taste?
What does placemaking look like in each context?

What does business need to do to innovate in each?

What do meaningful human experiences look like in each context?

What is the future of meaningful human experience?

The future of meaningful human experience is multi-sensory, contextual, dimensional, integrated, intelligent, responsive, anticipatory, adaptive, and inclusive.

 

Make It Fun

The “selfie emoji”/bitmoji feature in Google’s new chat app #Allo is well integrated and should drive adoption. The app also features AI in the form of its machine learning capability, encouraging users to interact with a chatbot assistant that learns and adapts. But to do that at scale requires widespread adoption, so they turned to Addictive Product 101: make it fun. :)

(Think it really looks like me or nah?)

Beyond Customer Experience

Businesses are finally starting to catch on that a disciplined approach to improving the customer experience leads to profit. That’s the starting point, and it’s fantastic.

But what’s the next step? What’s beyond improving the customer experience?

Well, we can think about the customer not merely as a customer, but as a well-rounded human being, who takes on many roles throughout the course of a day: patient, student, user, guest, citizen, not to mention friend, employee, parent, and so on. We can improve the human experience.

How can we improve human experience? How can we think about those many roles we all have in a business context, and why should we?

Those additional roles become dimensions of the person you’re trying to do business with. The more dimensional that person is to you, the more likely you’ll be able to offer them value. When you offer them value, you establish the basis of a meaningful relationship.

We always have to look for the human nuances if we want to build meaning.

The Most Interesting Things About Pokemon Go Have Nothing to do With the Game. (CEOs, I’m talking to you.)

Rather, the most interesting things about Pokemon Go have to do with connected experiences, and the sweeping changes these are bringing: new marketing models, opportunities with augmented reality, location-based marketing, and all the assorted issues with data privacy and security. The most interesting things about the Pokemon Go phenomenon have nothing to do with the game itself and everything to do with how different things are starting to be and are going to continue to be.

(These, by the way, are all part of what I examine in my forthcoming book Pixels and Place: Designing Human Experience Across Physical and Digital Spaces. Available in print and Kindle versions on September 1st, but you can pre-order a Kindle copy now.)

Connected Experiences Bring New Marketing Models

Marketing models are poised to be overhauled now that an online interaction can be credibly and consistently traced to offline visits in stores. See McDonald’s deal with Pokemon Go to make all 3,000 of its Japanese stores “gyms” in the game. The full details of their deal haven’t been disclosed, but one opportunity it presents is cost-per-visit modeling.
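As a rough illustration of what cost-per-visit modeling could look like, the core arithmetic is just sponsorship cost divided by incremental visits. The numbers and function below are entirely made up for the sketch, since the terms of the McDonald’s deal weren’t disclosed.

```python
def cost_per_visit(sponsorship_cost, visits_during, baseline_visits):
    """Estimate cost per incremental visit attributable to an in-game sponsorship.

    sponsorship_cost: what the brand paid for the placement (e.g., making stores "gyms")
    visits_during:    store visits measured during the campaign window
    baseline_visits:  visits you'd have expected in the same window without the campaign
    """
    incremental = visits_during - baseline_visits
    if incremental <= 0:
        return None  # no measurable lift; the placement can't be credited with visits
    return sponsorship_cost / incremental

# Illustrative numbers only, not from any real deal.
print(cost_per_visit(sponsorship_cost=100_000, visits_during=250_000, baseline_visits=230_000))
# -> 5.0 (i.e., $5 per incremental visit in this made-up example)
```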

Connected Experiences and Social Interaction

The social experiences are different with augmented reality, when interacting with a digital experience doesn’t automatically mean being oblivious to the world around you (although obviously it still can – see, for example, the guys who fell off a cliff while playing, or the person who drove into a cop car).

But since you can engage with the game through a camera view of what’s ahead of you, it’s actually possible to walk and play and still be at least somewhat connected to your surroundings.

Connected Experiences… and Your Business Strategy?

This is only the beginning of what’s to come.

On social media, people have been laughing at the businesses that are developing Pokemon Go strategies (and, well, it does sound absurd), but honestly, even those starting now are a little late to the biggest opportunity. The gold rush was these past two weeks, when everything was novel and players were entertained by the outreach. Even if the game’s popularity continues to grow, players will likely begin to be put off by overt attempts from late entrants to capitalize on the game. And if your business is still laughing, you’re missing out on time to think about how augmented reality and connected experiences stand to change the status quo.

So I’m not saying to rush out and do something specific to Pokemon Go that has no alignment with your customers’ motivations or your brand. (Although if you have an idea for an experience that aligns and integrates your customers’ experience with the game in an organic, authentic, and/or memorable way, by all means do it, measure it, and publish a case study about it.) This is a call for strategic action about a macro trend, not mindless reaction to a micro trend. Trying to capitalize on the trend without strategy will probably come across to people like an attempt to manipulate the moment.

You need strategic planning (and do please note: I offer strategy workshops) that sets you up for success as the physical and digital worlds increasingly converge. There’s enough transformation taking place that there will be a relevant, meaningful way to make these opportunities align with your brand and your customers. Your job is to try to catch it.

The Thing About the Internet of Things is the Humanity in the Data

The thing about the Internet of Things is it isn’t about the things; it’s about the people.

The “things,” for the most part, are designed to create more connected experiences for humans. And the data layer that connects the digital experiences to the physical world through our gestures and actions is our data.

The transactional data that connects the online and offline worlds flows largely through us: through our transactions and purchases, through our speech, through our attention, through everything we do.

In the course of analyzing, optimizing, and targeting, we can’t let ourselves forget about the humanity in the data.

(This, by the way, is part of what I examine in my forthcoming book Pixels and Place: Designing Human Experience Across Physical and Digital Spaces. Available in print and Kindle versions on September 1st, but you can pre-order a Kindle copy now.)

Is Your Business Based on an Outdated Model of Customer Interaction?

You probably know, as most people do, that Netflix was all about renting unlimited DVDs before pivoting into streaming, but what you may not know is that before launching that DVD subscription program, they started out as a service to rent DVDs a la carte, just like Blockbuster, except online and through the mail. When they hit upon the idea of a DVD subscription model, they discovered that they had been working with a rapidly-aging notion of how customers wanted to interact with the physical world, and their new model simplified it. Of course their even newer model, of streaming video, simplified it even more. What are the wide-open opportunities to rethink the interactions with your customers and in your market?

The key thing to remember is that the convergence of physical and digital happens around the human experience. It’s not a new phenomenon, but the opportunities to adapt and offer more contextually relevant experiences are evolving all the time.

There’s a whole lot more about this in my new book Pixels and Place, coming out September 1st, 2016. You can pre-order the Kindle version here. Check back over the next few weeks, too; I’ll be posting more excerpts and giving away copies.

The Rental Car Traffic Violation Scam Hypothesis and Personal Privacy

[Image: vintage photo of a driver being ticketed by a police officer (source: Wikimedia Commons)]

I got a ticket in the mail yesterday for running a red light. Well, it wasn’t a ticket, exactly. It was a “notice of an administrative fee” for a red light violation that allegedly happened while I was driving a rented car in my mom’s town the day before her birthday. The ticket itself apparently hasn’t even arrived in the mail yet, but the rental car company has a whole operation to process the administrative fees from traffic violations incurred by their renters, and they’re not wasting any time collecting theirs.

I bring this up here because it’s actually happened to me quite a lot. Nearly every time I rent a car, I end up getting a traffic ticket in the mail a few months later (and as a consultant and speaker who often travels to clients and events, I rent a lot of cars). You may be tempted to joke that I’m a terrible driver, but these traffic violations by mail never showed up when I was driving my own car and the discrepancy has become enough of a pattern that my mind, not usually given to conspiracy theories, started to formulate a hypothesis about how this could be part of a program to make money off of car renters.

Anatomy of a Scam Hypothesis

How could this be happening? Well, the rental car agencies could be selling their driver rental information to the companies that operate the traffic cameras. The traffic cameras could be scanning license plates and matching them against a list from the rental agencies. They could be issuing tickets on violations or close-enough-to-be-violations only when there’s a match.

I also notice that I never get more than one traffic violation per rental. The system could be set to throttle the tickets to one per rental period. Casual renters wouldn’t think much about it. “Oh well,” they’d think, “I got a ticket. I’ll just pay.”
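To be clear, this is a hypothesis, not an accusation; my point is only that the mechanics would be trivial to build. Here is a hypothetical sketch in Python of the matching-and-throttling logic described above. Every name, threshold, and data structure in it is invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Rental:
    plate: str
    renter: str
    tickets_issued: int = 0

# Hypothetical feed the camera operator would need rental agencies to supply.
active_rentals = {
    "ABC123": Rental(plate="ABC123", renter="frequent.renter@example.com"),
}

def should_issue_ticket(plate, severity, rental_threshold=0.5):
    """Hypothesized logic: ticket borderline violations only for rental plates,
    and throttle to at most one ticket per rental period."""
    rental = active_rentals.get(plate)
    if rental is None:
        return severity >= 1.0        # non-rental plates: only clear-cut violations
    if rental.tickets_issued >= 1:
        return False                  # one ticket per rental, so casual renters don't notice
    if severity >= rental_threshold:  # "close enough to be a violation" counts when there's a match
        rental.tickets_issued += 1
        return True
    return False
```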

But frequent renters, like me, start to notice a pattern. Why is it I owned a car all my adult life until late last year, drove all over the country, frequently taking my car on road trips, and have never once been issued a traffic-camera-ticket for any of those trips, yet when I drive in some of these same towns in a rental car, I get tickets mailed to me?

“Maybe It’s Just You”

Perhaps you’re skeptical, as we discuss this over drinks, and you offer that maybe instead of this elaborate scam, there’s instead a behavioral science element to all this: suppose we all drive a little more recklessly when we’re in a rented car. That seems a reasonable counter-hypothesis, I’d concede with a tip of my beer mug, but without supporting data or a compelling argument to convince me that there might be truth to this, I maintain that, whether I own the car or borrow it, I drive as I drive. And pass the peanuts.

Anyway, What About Due Process?

Most of all, whether the intent to conspire is there or not, surely it brings up questions of due process. If a police officer had simply pulled me over in each of these places, there’d be far less question of legitimacy. You say I ran a red light? Pull me over right then. The action on which the claim is based will be fresh in my head and I can either challenge the officer (calmly and politely, of course) about the veracity of the claim or accept the ticket. (Or cry, and maybe get off with a warning. Oh, relax; I’m kidding.) But you say I ran a red light three months after the fact? I barely recall being at the intersection in question, let alone what the conditions of the intersection were, or the timing of the light, or the layout of the traffic around me. Even if you were to furnish me with photographic evidence of my rented car with me in it clearly violating a red light, I still don’t have the consideration of context, and I get no due process at all.

The Bigger Issue: Privacy, Personal Rights, and Public Data

I don’t necessarily believe my conspiracy hypothesis about the rental car traffic violation scam; I just think it’s possible, and at the rate I get these tickets, I admit I’m a tad suspicious. But I’m less concerned with that and more conscious of the bigger issue: how vulnerable people are and increasingly will be to schemes that take advantage of ever-present tracking data, surveillance, and systems with default authority, such as rental car companies and traffic enforcement bureaus. Even if these entities aren’t trying to be exploitative, the more access they have to integrated data about our movements and behaviors, the greater the potential will be for them to overstep the authority we think we’ve granted them.

So why do I share this half-baked conspiracy idea anyway? Because the premise is not mere science fiction; it’s certainly not impossible, and it’s important that we remind ourselves regularly of the powerful data about people that can be used by companies and government. That power is growing, and to a great degree, it’s already out of our hands as citizens, consumers, patients, and the public. So where and when we can, it’s important that we think critically about what the implications are, and it’s important for those of us who work in and around data systems that track human actions to be mindful of what that means.

Meanwhile, to finish on a lighter note, here’s how comedian Joe Lycett handled a mailed-in notice of a parking ticket. Enjoy.