The Search (and Price) of Intelligent Algorithms

Search

Sometimes, when I want to know what it’s like not to be me, I’ll jump into incognito mode on Chrome and search for something, anything – just to see what a newborn baby might find on his first search.

If there’s a notable difference, it’s that I’m searching alone. None of the content from my friends (as Google knows them) is present. None of the recommendations take into account what past me has gone searching for. I’m asking a question of the entire web, not the web as Google curates it for me.

Still, Google lets me search. It doesn’t require I feed its data monster with my specific personal information. I am free to wander the Internet as anonymously as possible for anyone with a static IP address.

When I turn to sites like LinkedIn or Facebook, though, doors are closed. While Google will let me get by only paying with the what of my searching, these sites raise the price – they want to know who.

All of this rests on the idea that computer algorithms are strong in their ability to furnish me with answers. The more they know about the questions I’m asking, the better their ability to anticipate and queue up the answers most relevant to me. That’s Me, specifically, not someone like me. Insofar as is possible for a machine, these lines of code are personalizing the answers I’m searching for in my learning.

But these algorithms are doing more than that. They are deciding what I don’t see. They are narrowing the Internet I experience. Because search engines and other sites track what they take to be my habits online, the options I see when I go looking for information are the answers I’m anticipated to need or want. And there’s a trade-off. I often find what I’m looking for, but I hardly ever stumble upon something randomly interesting. Imagine traveling the world and avoiding all the places you hadn’t seen or heard about before.
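To make that narrowing concrete, here is a toy sketch of habit-based filtering. It is not how Google or any real engine works; the profile, the scoring, and the cutoff are invented for illustration. But the shape of the trade-off is the same: whatever doesn’t resemble my tracked habits falls below the cut, and I never see it.

```python
# Toy sketch of habit-based personalization (illustrative only; not any
# real search engine's method). Results that don't resemble the user's
# tracked habits sink in the ranking and fall off the truncated list.

def personalize(results, habit_profile, keep_top=5):
    """Re-rank results by affinity to tracked habits, then truncate."""
    def affinity(result):
        # Score a result by how strongly it matches tracked interests.
        return sum(weight for topic, weight in habit_profile.items()
                   if topic in result.lower())

    ranked = sorted(results, key=affinity, reverse=True)
    return ranked[:keep_top]  # everything below the cut is never shown

# Hypothetical data: a profile built from past searches, plus one result
# the profile knows nothing about.
profile = {"hiking": 1.0, "trail": 0.8, "running": 0.5}
results = [
    "Local hiking trails and maps",
    "Intro to medieval falconry",        # the serendipitous find
    "Best hiking boots of the year",
    "Trail running training plans",
]

# The falconry result scores zero affinity and quietly disappears.
print(personalize(results, profile, keep_top=3))
```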

These are the answers algorithms provide.

What’s more, while these lines of code are narrowing the world and people I experience online, they’re failing to help me ask better questions. When I’m led to ask questions online, it’s because of breadcrumbs left by other people on the chance I might want to make a turn. Think of a Wikipedia entry as an example. A well-written page includes loads of links that a computer might read as randomly selected. Even when a machine can identify parts of speech, it is the human element that decides Prince Adam deserves a link on the entry for Skeletor while leaving Keldor as plain text.

Algorithms suck at curiosity. They don’t anticipate it well, and they rarely engender it in users. Any program that ushers a user through a series of preconceived questions is avoiding actual questioning. To keep the travel metaphor going, these experiences are like riding It’s a Small World rather than actually traveling to each of the countries depicted. And no matter how well such applications anticipate your reaction to a given set of stimuli, whatever is put in front of you next isn’t computer-generated; it was programmed by someone who decided where your unknowing should go next.

While the secret soup that makes search engines and other sites pull up the answers to my questions is imperfectly good, I have to remember that it comes at the cost of my information (anonymous or not) and of experiencing the world the way someone like me is “supposed” to see it. That view is more limited than I know.

For however good these systems are at finding my answers, they are nowhere near as capable of helping me generate questions as a conversation with a friend or reading a thoughtful editorial. While they are able to learn, they are certainly not curious.


This post is part of a daily conversation between Ben Wilkoff and me. Each day Ben and I post a question to each other and then respond to one another. You can follow the questions and respond via Twitter at #LifeWideLearning16.

One thought on “The Search (and Price) of Intelligent Algorithms”

  1. You make a good point. Pick up another person’s phone and open up Facebook. It’s more than just a change in ads, it’s a change in identity. We selectively shape our online persona through the words that we use, the people we friend, and the ideas that we give credence to.

    But it goes deeper than that. Many of the walled gardens on the internet take these traits that we give them and amplify them. They send them back to us in a way that reinforces these crafted identities, which in turn causes us to reinforce our own stereotypes, which in turn…

    You can see windows into these other realities. Sometimes they creep through. You’ll see a post that you’ve made about #BlackLivesMatter get 4 likes and another friend’s get 40 with a meaningful stream of comments tacked on. You’ll see another friend post something about police rights and then become increasingly uncomfortable as you get deeper into the thread’s contributors. These are other moments in people’s lives peeking over your garden’s wall.

    But let’s not blame algorithms for all of this. An algorithm is just a formula for solving a problem. If we don’t like the output of a formula, should we really blame the formula itself?

    Let me exaggerate the point here: a recipe is an algorithm. You can follow Nestle’s recipe for chocolate chip cookies and end up with some rather socially acceptable cookies. You can find someone else’s recipe – something that they’ve put love and time and innovation into – and come out with a far superior dessert. You can also find a recipe that calls for strychnine as an ingredient.

    Recipes aren’t evil, but recipe authors do have shades of moral obligation to their creations and consumers do need to use them with common sense.

    “Algorithms suck at curiosity.” I so disagree! If anything, this is just a reflection of the values that we place on our online activities. We don’t have to follow that. Speaking for developers, we should strive for better.

    For just one quick example, check out this video, starting at the 25-second mark:

    This is from the Smithsonian Cooper-Hewitt Museum of Design. You’ll see a visitor drawing a shape with a digital pen and then see a complex algorithm intuitively match that shape to an object in the collection. Once you have the object, you can explore it. You can search for other objects by color. You can design your own objects. You can save these objects, bring them home, mash them together. You can discover.

    We can design curiosity into our algorithms; we just have to be open to it. The limitations are not with the technology.
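As one way of picturing that last point, here is a minimal sketch of a recommender that reserves room for serendipity. Everything in it is hypothetical (the function, the profile, and the catalog are invented, and it is not the Cooper-Hewitt system described above); it only shows that leaving slots for items far from a user’s profile is an ordinary design decision rather than a technical limitation.

```python
# Hypothetical sketch: a recommender that deliberately reserves slots for
# items the user's profile knows nothing about. Not any real system's code;
# it only illustrates that serendipity can be designed in.

import random

def recommend_with_serendipity(items, habit_profile, total=5, wildcards=2, seed=None):
    """Return mostly habit-matched items, plus a few deliberate surprises."""
    rng = random.Random(seed)

    def affinity(item):
        return sum(weight for topic, weight in habit_profile.items()
                   if topic in item.lower())

    ranked = sorted(items, key=affinity, reverse=True)
    familiar = ranked[:max(total - wildcards, 0)]   # what the profile predicts
    unknown = [item for item in items
               if affinity(item) == 0 and item not in familiar]
    surprises = rng.sample(unknown, min(wildcards, len(unknown)))
    return familiar + surprises                     # room left to stumble

profile = {"hiking": 1.0, "trail": 0.8}
catalog = [
    "Local hiking trails and maps",
    "Best hiking boots of the year",
    "Intro to medieval falconry",
    "A beginner's guide to letterpress printing",
    "Field recordings of city soundscapes",
]

# Two of the four slots go to items the profile would never have surfaced.
print(recommend_with_serendipity(catalog, profile, total=4, wildcards=2, seed=1))
```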
