Episode 63
March 13, 2018

Deep Fakes

What happens when you no longer control your own likeness? Is there an ethical line to be crossed with posthumous product spokesmanship? We skirt the line in this episode and get topical, talking about ethics and artificial intelligence and how online communities have banned "deep fakes," pornographic simulations produced by artificial intelligence. Also, this week we have the return of Future Policy with Danny Sepulveda, with an update on net neutrality and how it will affect global commerce and infrastructure.



Deep Fakes: weaponizing AI

  • We're seeing a really gross intersection of what we talked about on our prediction show: digital personal identity rights meeting body image and body data technology as it advances into consumer products.
  • The Verge and an AP article both discuss the emergence of "deep fakes": people's likenesses applied in a pornographic way using AI. These communities take high-resolution video and still frames of notable actresses as training data and apply their likenesses to nude imagery.
  • There's no real legal consensus on deep fakes and their consequences, so a lot of these online sites have come together and banned them and their communities.
  • This hits right on that scary, Black Mirror-esque world we now live in, where your face can be applied without your consent to literally any context, in some of the seediest and darkest ways, with no way for you to manage it.

Legal Ramifications of Deep Fakes

  • The legal ramifications are unclear because we've never had this sophisticated level of technology.
  • This is something that will come up in law, and we'll probably start to see entire bills at the federal level.
  • There's been no federal regulations yet that address how to handle your body data.

The Historical Blind Eye to Invasive Technology

  • It's troubling that a lot of communities turned a blind eye for years to still images.
  • There was an incredible story in Wired about 10-12 years ago about how Gillian Anderson was at one point the most Photoshopped face on the internet, and many of the photos were suggestive.
  • The suggestion was that it was because of her facial symmetry.
  • Those types of images have been around for decades with no one doing anything about it.
  • We can all agree that it's harmful to somebody in some way when they're applying your likeness in that way.
  • The advent of AI assisted fakery is taking it to the next level and blurring the line of realism.

Incredibly complex technology in the hands of the people

  • We were talking to Greg Steinberg at Something Digital and he asked hypothetically, what if we applied this to products? You could change any scene, from commercials to videos, to represent your product with AI.
  • You could apply a Coke filter to any image, and anything anyone is holding becomes a can of Coke.
  • Amazing movie technology is now available in the palm of your hand.
  • It's now available for consumers and businesses to take advantage of in a pretty easy way.
  • At the 2016 Adobe MAX creativity conference, Adobe announced a Creative Cloud tool: given 20 minutes of spoken-word training data, an AI or ML algorithm could parrot back phrases you typed in another person's voice.
  • A year and a half ago, a tech demo showcased a face-to-face algorithm applied to fake CNN broadcasts: a source actor was overlaid with political figures to show George W. Bush and Vladimir Putin saying things they didn't actually say.
  • Is there even one way that this is a positive contribution to society?

Manipulative technology for sales

  • There are ways it could be used to leverage selling things, and businesses can use this type of tech as a tool.
  • We touched on this in episode 8.
  • People will have control of their body data and can leverage it for various reasons: they can sell it, it will outlast them posthumously, and someone will need to monitor and manage it after they die.
  • For models (and maybe everyone's a model now) they'll be able to give companies access to their body data for specific reasons.
  • The question is: how are we going to enforce this?
  • It's such powerful tech that if we don't have good governance, then it's going to get out of control really quickly.
  • But just because we legislate this new reality, it doesn't mean that it's going to control people's behavior. Just because it's illegal won't make it cease to exist.

The difficult and expensive road to wiping your image from the internet

  • Wiping your image off the internet, especially when it's someone's likeness or personal photo, takes a lot of work: people have to trademark their faces, or send DMCA takedown notices to sites like Reddit to actually enforce their copyright.
  • Weaponized tech and disinformation campaigns are coming, and in no way are they helpful for humankind.

Policy Update with Danny Sepulveda!

  • The political traction net neutrality has is fascinating.
  • What happened immediately, and even before Ajit Pai repealed it, was a fairly widespread uprising of folks supporting net neutrality.
  • Nonetheless, the FCC went forward with repealing it.
  • A number of states have gone forward with their own net neutrality rules.
  • The original net neutrality rules were over 400 pages long and involved fairly complex issues.
  • "I've been working in this for over 20 years, and I've never seen an issue that's gotten so much traffic."
  • Reasons for the traction:
  • People love the idea of the internet as a public space, open and accessible to everyone on a relatively egalitarian basis.
  • People don't appreciate a regulator behaving in the best interest of the regulated as opposed to the interests of the public.
  • Republicans believe that if you own the pipes going into somebody's house, you should have the freedom to contract with content providers for different treatment for better ROI, and that this would encourage additional investment in infrastructure around the country.
  • But there's a tremendous amount of incentive to manipulate that gatekeeper function for non-productive ends to extract tolls and rents.
  • Most Democrats believe net neutrality should be upheld because it works.
  • The way people access the internet now, without intervention from their internet service provider, has worked really well for innovation.

What's next

  • Congress is considering repealing the FCC's decision.
  • It's highly unlikely to work due to Republican control of both the House and the Senate.
  • Right now 49 Senators wish to repeal the FCC's decision. Only 1 or 2 more Senators need to agree, but it's highly unlikely the House would go along.
  • Even if the House voted to go back to net neutrality, the President could still veto the effort.
  • We are unlikely to see a restoration of net neutrality during this administration.
  • There are also lawsuits against it right now.
  • The courts could throw it out, which would return us to the Obama-era rules.
  • Once the courts decide, either way, it will create a political dynamic in which members of Congress will have to come to some decision about whether they wish to write into law some kind of compromise.
  • In all likelihood, Net neutrality won't be restored.
  • But there's been a lot of activism, and we'll see what it means politically in the midterms going forward.

BACK TO THE DUDES

BODY DATAAAA!

  • In a commerce context, body data is really useful. But using it to accomplish things with people's image is just dangerous.
  • We don't see it not being used.
  • It's a tech that exists now, and it'll be used by businesses, and they'll find use cases for it. Now that it exists, we can't go back.
  • Aside from spokesperson and generational licensing groups like the Elvis and Marilyn Monroe estates, all this seems novel for in a commerce context is more and more Reba McEntire and KFC Colonel Sanders mash-ups.
  • We don't need more Jim Gaffigan Colonel Sanders to make us buy fried chicken, but that's where we're heading.
  • Consider, though, the micro-spokesperson: using AI to determine the best person to influence a given set of customers.
  • That influencer will sell their digital body rights to influence a certain set of people based on specific sets of DNA factors.
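The micro-spokesperson idea above could be sketched as a toy matching problem: represent a customer segment and each candidate influencer as affinity vectors and pick the closest match. This is purely illustrative, everything here (the affinity dimensions, the names, the numbers) is invented, and real platforms would use far richer models:

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length affinity vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def centroid(vectors):
    # Component-wise mean of a customer segment's affinity vectors.
    n = len(vectors)
    return [sum(col) / n for col in zip(*vectors)]

def best_influencer(segment, influencers):
    # Pick the influencer whose affinity profile is closest to the
    # segment's centroid.
    c = centroid(segment)
    return max(influencers, key=lambda name: cosine(influencers[name], c))

# Hypothetical affinity dimensions: [fitness, gaming, fashion]
segment = [[0.9, 0.1, 0.4], [0.8, 0.2, 0.5]]
influencers = {
    "gym_guru": [0.95, 0.05, 0.3],
    "esports_star": [0.1, 0.9, 0.2],
}
print(best_influencer(segment, influencers))  # gym_guru
```

The same skeleton works whether the dimensions are interests, demographics, or (more troublingly, per the episode) attraction factors; only the feature vectors change.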

DNA TESTING, FOLKS!

  • 23andMe was spamming the heck out of us during the Winter Olympics.
  • If you watched the Olympics, you probably saw the ads at least 50 times.
  • It's just one example of new DNA testing groups.
  • There's a ton of other really specific stuff going on with DNA testing.
  • It's getting better and better, and you're able to determine more stuff with it.
  • A company is matching DNA to medications: you get your DNA scanned and then get a better understanding of which medications will work better for you based on your results.
  • It's personalized medication for you.

BACK TO INFLUENCERS! (Honey I shrunk the influencers.)

  • VentureBeat covers Influential, a company that just launched a social intelligence platform. They find influencers for brands with the help of IBM Watson.
  • They can find people based on microsegment affinities to predict whether or not they would be influential for a brand, for micro-influencer engagement.
  • Imagine if they took training data from dating apps and used it to create influencers based on attraction factors, people you trust or like more because they look a specific way or have a certain personality. (We so need GDPR in the US.)

Who will influence the influencers?

  • Everything is happening on Instagram.
  • There's a story on L2 called "Can Nike Keep Snapchat Alive?"
  • Nike was the first company to sell directly on Snapchat, and that collaboration signals that Snapchat might be moving into ecommerce.
  • But in the same week, a Kylie Jenner tweet coincided with a $1.3 billion dip in Snap's market valuation.
  • Even when a platform is doing interesting things in retail, with influencers engaging one-to-one with products, it still comes down to a handful of people having the eyeballs to determine the fate of that platform.
  • So there are influencers for the influencers.
  • The success will be in if you can keep the attention of the people who matter.
  • And no amount of AI can keep the attention of capricious people.
  • Ad Age recently covered data showing that micro-influencers are having an outsized effect on people, above and beyond standard celebrity influencers.
  • If you're a brand, you probably don't want a big celebrity, you probably want a series of micro-influencers.
  • Instagram influencers traditionally wouldn't have any corporate sponsorship, but they do because they have millions and millions of eyeballs.
  • It's only because of their engagement in social. It has nothing to do with any accolade or aptitude.
  • 15-20 years ago you'd have to be an athlete or actor to gain it.
  • Now anybody can do it for just about anything for anybody.
  • Or we can fake you with AI.

Back to Body Data!

  • Shoutout to Shapescale.com: a 3D body-scanning tool for fitness tracking and visualization. You stand on the scale, it records your body, and you get a 3D picture of yourself to visualize different things, like how you'd go about changing your body toward what you want.
  • It looks at fat and muscle mass, and you get heat maps of where your body's changing, and you get visual goal tracking.
  • It's marketed as the next generation of scales, beyond the "smart" scales we have now.
  • It's "cool," but we have to wonder: where does it go from here other than being cool?
  • Perhaps you can mine the data and do your own A/B tests on your body?
  • What it does do is let someone attack weight loss or health like a business problem, treating their life like something they can test and iterate on.
  • Despite this wealth of technology and data, we're more depressed than we've ever been as a country.
  • Maybe it's not actually helping us.
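The "A/B test your body" idea above could look something like this toy comparison: log scan readings under two different plans and compare the average week-over-week change. All numbers and names here are invented for illustration; Shapescale does not expose anything like this:

```python
# Hypothetical weekly muscle-mass readings (kg) from a body-scan log.
plan_a = [31.0, 31.2, 31.5, 31.6]   # weeks on training plan A
plan_b = [31.6, 32.1, 32.7, 33.2]   # weeks on training plan B

def weekly_gain(readings):
    # Average week-over-week change across a series of scans.
    deltas = [b - a for a, b in zip(readings, readings[1:])]
    return sum(deltas) / len(deltas)

gain_a = weekly_gain(plan_a)  # ~0.2 kg/week
gain_b = weekly_gain(plan_b)  # ~0.53 kg/week
print("keep plan B" if gain_b > gain_a else "keep plan A")
```

With only a few noisy readings per plan this proves nothing statistically, which is arguably the episode's point: the tooling makes self-experimentation easy, not necessarily meaningful.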

That concludes our awesome meandering and tangential show, and we'd love to hear what you have to say. Go to futurecommerce.fm. Hit us up and lend more to the conversation. Or email us at brian@futurecommerce.fm and phillip@futurecommerce.fm



Brian: [00:00:55] Hello and welcome to Future Commerce, the podcast about cutting edge and next generation commerce. I'm Brian.

Phillip: [00:01:01] And I'm Phillip. Question mark? I might be Phillip. I might also be an artificial intelligence abstraction of what you might believe Phillip to be.

Brian: [00:01:11] Oh, we're talking about body data today aren't we?

Phillip: [00:01:15] Oh, my gosh. Well, we need a new term, but I think that's kind of become our brand term. If we ever opened up a T-shirt store on FutureCommerce.fm, which would never happen. But if we did body data would definitely be...

Brian: [00:01:29] It won't happen?

Phillip: [00:01:29] Well, it might. It might.

Brian: [00:01:31] Shot down that idea, too?

Phillip: [00:01:34] Is someone taking notes? Who's taking notes, show notes? Matt, get on that T-shirt idea. Well, we have... There's so much news. This is a lightning round show. We also have in just a few minutes, we're bringing back future policy with Danny Sepulveda. Really excited for that.

Brian: [00:01:54] Yeah. It's good stuff. Really good stuff ahead.

Phillip: [00:01:56] Holy cow. Some good stuff there. And so buckle up. Lots of stuff happening. Also want to welcome back our sponsor, Vertex, rejoining us for another run of episodes. And so we've got a lot of momentum, but without any further ado, OK. Brian, let me tell you a story.

Brian: [00:02:13] Tell your story, Phillip. I want to hear it.

Phillip: [00:02:18] This is a little bit of like a sensitive topic, but it goes with a whole bunch of different sort of ish body... Like it's this nice overlap of the things we were talking about in our predictions episode. So to just kind of set it up a little bit. The intersection of digital personal identity rights and body data and body image sort of technology and as it's advancing and how it's coming to consumer products. And I feel like there's a lot of news this week, in the last week or two, that hit on a lot of those topics. So that's kind of the open ended sort of conversation today. In that vein. So I saw on I believe it was an Associated Press article not too long ago, about two weeks ago, where a bunch of online communities put out, you know, a large sort of unified statement that they were going to be banning some smaller communities that were developing among them that focused around artificial intelligence, applying people's likenesses in a pornographic way. So this is a sensitive subject, but it's gained the term deep fakes. And so these communities, actually there's a really great write up on The Verge that you could go check out. But these communities effectively were taking video from notable actresses and using still frames from high resolution video for training data to apply to, you know, nude photos and that sort of thing. But there's no real legal consensus on deep fakes and what they are. But a lot of communities came together and said these are not something that we want to support. So this is kind of an impressive thing. So there was a community on Reddit that was the deep fake subreddit was banned by Reddit, also Pornhub, Gfycat, Discord and Imgur all came out and they banned like hundreds of communities that were around these. 
And so I felt like this was hitting right on the topic of that scary sort of Black Mirror-esque world that we now live in where, if a video of you is out on YouTube, your face could be unwittingly applied to literally any context in some of the seediest and darkest ways.

Brian: [00:05:02] Exactly. This is why...

Phillip: [00:05:04] Without your consent, and there's no way for you to manage that.

Brian: [00:05:07] And like you said, there's no... The legal ramifications are unclear because we've never had technology to do it at this level. And so this is where, you know, all of a sudden your digital likeness is PII. You know, you have you should have the ability, and this is something that, you know, you're going to see come up in litigation and other lawsuits and, you know, all kinds of different ways. And probably you'll start to see even like entire bills at the national scale, at the federal scale about how your body data can be used and when and where and how at a much more detailed level than we've seen in the past. I think we've seen some states start to take this on, but I'm not aware of anything on a federal level yet that really will address how to handle your body data and treat it as PII. Then I think maybe even PII is the wrong term. There is probably a new type of data we're dealing with here that needs to be addressed in ways that we've never really addressed digital data. This is data just we've never had before in any meaningful way.

Phillip: [00:06:31] So what I find really sort of troubling is that there were a lot of communities that had that Reddit and others have turned a blind eye to for years that were specifically focused around still images. So there's this incredible story that was, I think, in Wired about 10 years ago or 12 years ago about how Gillian Anderson from X-Files fame. I don't know if you know her.

Brian: [00:07:05] Yeah.

Phillip: [00:07:07] She at one point in time was the most Photoshopped person on the Internet. There were an incredible amount of fake images of her in all kinds of suggestive poses and such, but she was Photoshopped into a lot of these images. And they were suggesting a lot of it had to do with symmetry of the face and how easy it is to apply certain people's faces, especially when they're very symmetrical, to other people's bodies. But those kinds of images and photoshops have been around for decades with nobody actually doing anything about it. And I think we could all agree that that's harmful to somebody in some way when they're applying your likeness. But it's the advent of computer-assisted and AI-assisted fakery here that is taking it to the next level.

Brian: [00:07:59] Right.

Phillip: [00:08:00] And it's blurring the line of realism.

Brian: [00:08:02] And that lets you talk about, you know, not just body data here, but anything data. We were talking with Greg Steinberg at Something Digital. And he is like, "Well, what if people applied this to products?"

Phillip: [00:08:15] Yeah.

Brian: [00:08:19] You could change any scene to represent your product. You know, with AI a video, a commercial, you could apply that Coke filter to life, and everything that anyone's holding that they can drink is now a can of Coke. There's a lot that can happen here. I think the things we've all seen in movies where people are applying unbelievable graphics and doing all kinds of cool, you know, film editing and overlays and CGI, all of that like now that's available to you in the palm of your hand. Maybe I'm simplifying it quite a bit here, but like this is now available for consumers, or at least businesses, to take advantage of in a pretty easy way.

Phillip: [00:09:12] Yeah. I'll give you some other examples that are not in, you know, sort of this, you know, salacious sort of way. But I remember back in 2016, there was the announcement at Adobe Max Creativity conference that they have a tool that they were in Alpha working with as part of their, you know, creative cloud suite, that after 20 minutes of training data of spoken word, you could train an AI or a machine learning algorithm to parrot back phrases that you would type in, in another person's voice. And the example that was given at the time was Key and Peele. So Jordan Peele was hosting the event with Adobe and they demoed the process by editing audio of Keegan-Michael Key, who is Jordan Peele's comedic partner. And they made him say that he was like making out with Jordan instead of his wife. Like they took this phrase that already existed. And they rearranged it with just typing based on training data to make it say that he, you know, kissed Jordan three times. And this doesn't just stop there. If you even look around about a year and a half ago, there was a face to face algorithm that was nothing more than a tech demo. And it doesn't look so great. But this is now almost two year old technology that was applied to fake CNN broadcasts that took a target actor or source actor and overlaid it with, you know, political figures to show that they could make, you know, George W. Bush and Vladimir Putin say things that they didn't actually say. And I feel like in every one of these instances, and I just kind of leave it here, and I'll let you kind of chime in because you're usually the positivity person, I cannot for the life of me, think about one way in which this is a positive contribution to society. I can't. I cannot see where this technology could ever be used for our betterment. Full stop.

Brian: [00:11:42] Full stop. Yeah. It's an interesting point. I think that there are ways that this technology can be used, that could be leveraged to sell things. And, you know, businesses can use this type of technology as a tool. I think that's very clear to me. I think, well, there may be other use cases I should probably save this for like an FC Insiders article where I actually have the time to think about it and actually plan out what I'm going to say about it. {laughter} But, I tend to agree with you. I think, we're going to see, and I've already... We talked about this way back in Future Commerce Episode 8. People are going to be able to have control of their body data, be able to leverage that body data for a variety of different reasons. They can sell it. It will outlast you posthumously now. Your body data is something that you're going to need to have someone sort of monitor and like be in charge of after you die because otherwise people can use it for... This is real.

Phillip: [00:13:17] But isn't that like... Don't we already have... Yeah. Carry on. Sorry.

Brian: [00:13:20] Yeah. And so like models and maybe everyone's a model now.

Phillip: [00:13:28] Lord knows I am.

Brian: [00:13:30] Yeah. You'll be able to give companies the ability to use your body data for specific reasons like a snapshot in time or your body data for your whole life or to be used in one very specific way or in all ways. You'll be able to have control of how your digital image is used, and I think the question is going to be how are we going to enforce this? I have no idea. Enforcement is probably going to be the biggest issue because, like you said, this can be used in so many bad ways. This is such powerful technology that if we don't have a good way of governance, then it's going to get out of control really quickly.

Phillip: [00:14:27] But in a world where we don't need... I think that some of this is novel in that we don't actually for a lot of the celebrities who are affected by this kind of thing. Well, A) we have a wealth of training data because their high resolution video and images are everywhere to train. But I think we're also sort of skeptical because these sort of fakeries have been around for some time. But for some of them, we don't even need fakes to exist when people are hacking iCloud and putting out private images that they take of themselves. And so my thought process around this...

Brian: [00:15:15] Whoa. Hold on. Your implication is basically that privacy doesn't exist anymore at all? Kind of.

Phillip: [00:15:22] Well, not that it doesn't exist anymore, but even if you solve and legislate, I guess this actually creeps up on another issue that we're having right now as a public conversation in the United States around gun control. At some point, the conversation will always come back to just because we legislate it doesn't mean that it's going to control people's behavior. Just because it's illegal is not going to make it cease to exist.

Brian: [00:15:50] Right. That's kind of what I'm talking about. How do we... Because once it happens and it's out there, it's never going away. Like the Internet's forever.

Phillip: [00:16:00] I mean, honestly, there are some things that do get wiped off the Internet that don't exist anymore. I don't know. There are examples of that where people have. But what is required, especially when it's someone's likeness or when it's someone's personal photo or something like that, it kind of comes back to the fappening. You come back to that moment and you have people who are having to trademark their face and people who are having to like register trademarks and send Digital Millennium Copyright Act (DMCA) takedown notices to sites like Reddit for them to like actually just enforce a copyright that that's me, and you are not allowed to have that picture.

Brian: [00:16:47] Yeah.

Phillip: [00:16:48] That just at some point. And I know we've talked about this on the show. At some point where... This technology is a weapon. This technology is a weapon, and certainly the information that the disinformation campaigns and things that I can see coming down the pike, like what's coming in a weaponized technology, other than counter warfare, I don't see how having this technology helps anybody, certainly in a commerce context.

Announcer: [00:17:26] Now it's time for our weekly segment called Future Policy, brought to you by Vertex SMB. As always, we're joined by Deputy Assistant Secretary of State Danny Sepulveda.

Danny Sepulveda: [00:17:36] It's actually fascinating to me the degree of political traction that this issue has gotten. So as you know, last year, Ajit Pai, the chairman of the FCC and the Republican majority of the Federal Communications Commission, chose to repeal network neutrality protections that the Obama administration had put into place. And what happened immediately, even before the repeal, as people knew it was coming, was a fairly widespread uprising of folks who supported the rule and believed that your access to the Internet should not be interfered with by your last mile service provider, whether that be a telephone company or a cable company, whoever it is that provides service to your home. Nonetheless, our colleagues at the FCC chose to move forward. There's now legislation pending to impose state level network neutrality rules. And there have been a series of executive orders in a number of states: Vermont, Hawaii, Montana, New Jersey, New York, where what the governors of those states have said is that if a company provides telecommunications or Internet service to any government agency, those service providers have to comply with network neutrality rules. The original network neutrality rules were 400 pages long, and they involve some fairly complex concepts around network management, two sided markets. And I've been working on this issue for 20 years and I'm really involved in technology, in telecommunications public policy. And I've never seen an issue that has gotten so much traction among the public. My hypothesis is that 1) people really love the concept of the Internet as a public space open and accessible to everyone on a relatively egalitarian basis. The second thing is I think people don't appreciate a regulator behaving in the best interests of the regulated as opposed to the best interests of the public. 
Now, in fairness to my colleagues at the FCC, they would argue that yes, while the Internet service providers support the repeal of network neutrality, and it is in their economic interest to have the rules repealed, that that's not why they did it. They believe that if you own pipes going into somebody's house, you should have the freedom to contract with the information providers that are trying to deliver content into people's homes. Different treatment in order to maximize return on investment, and that that would in turn encourage additional investment in infrastructure around the country. From my perspective, I appreciate that argument and I appreciate that my colleagues on the other side of the aisle believe it. I believe, however, particularly in a market as concentrated as broadband service delivery to your home, there's an excessive amount of incentive to manipulate that gatekeeper function for nonproductive ends. But really just to extract tolls and rents. For that reason, I think that network neutrality should be protected and preserved, and I think that my party believes that. And secondly, I think that we believe on our side of the aisle that it should be protected and preserved because it's worked. The way that people access and use the Internet now without having this intervention between them and their experience by the Internet service provider has worked really well for innovation and creativity and creation at the edge. So where we stand now is Congress is considering repealing the FCC's decision. It's highly unlikely that that will happen because both the House and the Senate are controlled by Republicans. There are now, I believe, forty nine members of the Senate that wish for a repeal. So we only need one or two more senators, I believe, to agree to repeal the rule, in which case the Senate would vote to repeal. 
But it's highly unlikely that the House would agree, or even if by some circumstance the House did agree, the president would likely veto a repeal. So for the purposes of this administration, we are unlikely to see a restoration of network neutrality rules. There are lawsuits pending against the Federal Communications Commission for having, in the minds of the people filing the lawsuits, acted arbitrarily and not in accordance with administrative law. And the courts could throw out the repeal of network neutrality, which would just return us to status quo ante with the Obama rules. And we'll see how that turns out. Once the courts make their decision, it will create a political dynamic in which members of Congress who support or oppose the rules will have to come to some decision about whether or not they wish to write into law some form of compromise. Where we are is network neutrality rules have been repealed. They are unlikely, through the legislative process, given the current majority, to be restored, and the courts are reviewing whether or not they were appropriately repealed. There is an immense amount of state action. There's a lot of activism out there among the grassroots in favor of network neutrality. And we'll see what that means politically in the midterms and going forward.

Brian: [00:24:49] Body data is really useful, but using it to, yeah, like you said, accomplish things with people's image is just dangerous.

Phillip: [00:25:01] Right. But it's not going to stop people from using it.

Brian: [00:25:06] The thing is I don't see it not being used because this is a technology that exists now. This is going to get used. It's going to be used by businesses. People are going to... There will be use cases for this in business. It's just going to be a matter of... Now that it exists, we can't go back. I guess that's what I'm getting at.

Phillip: [00:25:33] If I have to draw sort of parallel here in... Aside from spokesperson and like, you know, generational sort of licensing groups... I think about, you know, the Elvis estate or Marilyn Monroe, who, you know, posthumously are repping everything from Chanel to, you know. I don't know, the Hard Rock Hotel. And aside from an estate controlling a likeness as a brand that extends beyond your death, all I see this being novel for in a commerce context is us having more and more Reba McEntire, Colonel... The Colonel... What's the guy's name? They're like Reba McEntire, KFC Colonel.

Brian: [00:26:21] Oh. Colonel...

Phillip: [00:26:22] Sanders. Frikkin A, holy cow. My brain exploded trying to pull that one out of the archives. But we see more of like, I don't need more Jim Gaffigan Colonel Sanders to make me buy fried chicken. But I feel like that's probably where we're heading with the application of this technology towards digital spokesperson.

Brian: [00:26:40] Yeah, well, and also micro influencers. I think that's the other thing. Right now we have very big names that are doing all of the repping. But what if, using AI, we could determine who the best person to influence another set of people is?

Phillip: [00:27:01] Oh, you just dug something up on this. Did you not? This was a story.

Brian: [00:27:07] It probably was a story. I don't remember what story this is from. But yeah. So essentially we'll use AI to figure out who the best influencer is for a certain set of people. Then that particular person will be approached and given money to have their digital body rights used to influence that certain set of people, and that certain set of people may be in a specific geographic area. But it might also be spread out across the entire world based on specific DNA factors that...

Phillip: [00:27:43] Holy cow. Wow. Yeah. Keep going.

Brian: [00:27:45] I am getting a little Black Mirror right now. But OK, let's move on to the rest of this conversation.

Phillip: [00:27:52] That's not your M.O. generally. So that's where I take it seriously.

Brian: [00:27:56] Let's move on to the rest of this conversation, which is the stuff that's happening with body data right now, because 23andMe has been advertising, absolutely spamming us on the Winter Olympics. If you're watching through the NBC app on whatever streaming device you use, you've probably seen the 23andMe ad like 50 times already. 23andMe is just one example of personal DNA mapping. But there's a ton of other really specific stuff going on with tests, and there are really, really interesting things happening with DNA testing. I wish I had the other article that I just read about testing, but it's getting better and better, and you are able to determine more and more with DNA testing. And in fact, this is what it was. It was a company that's matching DNA to medications. So you have your DNA scanned, and then you'll have a better understanding of which medications would be effective for you. And I think you sent me...

Phillip: [00:29:14] That's in that personalization vein, that sort of personalized-to-you kind of product bent.

Brian: [00:29:19] Yup. So why can't we have like that exact same thing, determine which people will best influence us? And then...

Phillip: [00:29:29] Perfect. So actually, I was trying to find the article while you were talking, because I knew it was a news story that I had picked out. There is an article on VentureBeat. Actually it came out yesterday or the day before. There is a company called Influential, which has just launched a social intelligence platform to find influencers for brands with the help of IBM Watson's artificial intelligence.

Brian: [00:29:59] That'll do it.

Phillip: [00:30:00] LA-based Influential uses three different application programming interfaces for Watson to predict whether Fortune 1000 brands will succeed with a particular influencer-based marketing campaign. And it can find people based on micro-segment affinities...

Brian: [00:30:16] Yep. There you go.

Phillip: [00:30:18] ...to predict whether or not they would be influential for a brand for micro influencer engagement. That is literally what you were just talking about. And it's happening.

Brian: [00:30:27] And it's happening. So there you go. And think about this even further. What if they took, you know, training data from dating apps, saw what kinds of people people were attracted to, in general, based off a variety of factors, and then used that data to help create influencers based off of attraction factors, so that people would trust somebody more, or like them more, because they look a specific way or they have a certain personality or whatever it is? This is something that probably... Well, like you said, it's already happening. It's just gonna get bigger and bigger.

Phillip: [00:31:13] But you have to ask: what platforms? This always, for me, comes back to where this is happening. Right?

Brian: [00:31:21] Yeah.

Phillip: [00:31:21] And it's always on social.

Brian: [00:31:25] Instagram.

Phillip: [00:31:26] Right it's Instagram.

Brian: [00:31:27] Everything is happening on Instagram. {laughter}

Phillip: [00:31:28] Everything is happening on Instagram. This is the insane thing about the kind of data research and data mining that happens on Instagram: your attention in a list of images that might seem innocuous to you... It doesn't even require, at this point, an intent to tap into an image and see it larger. If you zoom in on a window of, you know, suggested images... If you're looking at the Explore tab, the magnifying glass in Instagram, all the tags and all the machine learning data that they've gathered about that image, and the machine vision about what might be in that image, is used to understand more about you and what you like. And this is what's incredible. There's a story out on L2 earlier this week, which is now, I guess, a Gartner company. Has L2 always been a Gartner company? I don't believe so.

Brian: [00:32:23] I think it has been.

Phillip: [00:32:25] I don't know. It shows my ignorance there.

Brian: [00:32:30] It's been at Gartner for a while.

Phillip: [00:32:30] So L2 is always popping up for me when they're talking about these sorts of interesting stories about social influence around retail. And this one happened to pop up for me this week, which is, "Can Nike Keep Snapchat Alive?" So they were the very first...

Brian: [00:32:53] This is pretty good.

Phillip: [00:32:53] Yeah. They were the very first company to sell directly on Snapchat, selling the new Air Jordan 3 Tinker sneaker without ever having to leave the app. And that combination, or that collaboration, is signaling that Snapchat might be moving more into ecommerce. But in the same week, was it Kendall Jenner?

Brian: [00:33:18] Yes.

Phillip: [00:33:19] Yeah. Kendall Jenner, I think, tweeted, "Who the heck even opens Snapchat anymore?" causing a $1 billion dip in their market capitalization. And that is the... Even when we're doing interesting things in retail, even when social influencers have a platform to sell products, and even if we have enough data to target them specifically to people one to one... you know, in this case it's Snapchat, but it could be Instagram... even then, it comes down to a handful of people having the eyeballs that really determines the fate of those platforms.

Brian: [00:33:59] Yeah.

Phillip: [00:33:59] Right? So there are influencers for the influencers. And that's where the success of the platform is going to lie: whether you can keep the attention of the people that matter. And there's no amount of AI that you can apply to keep, you know, the Jenners' or the Kardashians' attention span.

Brian: [00:34:15] Sure. I agree with you in that those influencers are big enough that they can have that kind of effect. But I think... There's a lot of data out there if you look at Ad Age. There was an article on Ad Age, I should say, about this.

Phillip: [00:34:39] Ad Age?

Brian: [00:34:39] Ad Age. Yeah. About how micro influencers are having an insane effect on people, above and beyond sort of your standard influencers like the Kardashians and Jenners. If you're a brand, you probably don't want to go get a big celebrity; you probably want a series of micro influencers because...

Phillip: [00:35:02] Yes, for sure.

Brian: [00:35:04] Yep. Yep.

Phillip: [00:35:07] I mean, and this is what's incredible... I know that we're wandering off the subject a bit. The amount of popularity that's been gained by the long tail of people who are considered influencers on Instagram, people who wouldn't traditionally have had any kind of corporate sponsorship from major retail brands or branded manufacturers in any way... but they do, because they have millions and millions of eyeballs. And it's only because of their engagement on social. It has nothing to do with any other kind of accolade or aptitude, which is a completely different thing in our world. Fifteen or twenty years ago, you would have had to have been a top-level athlete who worked your whole life to gain that kind of influence. And now just about anybody can do it for just about anything. Or we can fake you looking like you know how to do that, for just about anything, for anybody, if we use enough AI.

Brian: [00:36:07] Exactly. So back to the general discussion of body data. Again, you found a pretty cool company that really is in line with what we talked about way back on episode 8.

Phillip: [00:36:21] A few of them, actually.

Brian: [00:36:21] ShapeScale.com.

Phillip: [00:36:25] ShapeScale.com.

Brian: [00:36:26] It's pretty, pretty unbelievable. It's a 3D body scanning tool for fitness tracking and visualization. So you stand on the scale, and it records your body, and then you get a picture of yourself in 3D, and you can visualize different things and actually see how you should go about changing your body to accomplish what you want to do. It can look at localized muscle mass and fat mass. It's pretty crazy. You can get heat maps of where your body's changing based on updated scans and, you know, visual goal tracking... It's the next generation of scales, beyond the "smart scale" that we have now.

Phillip: [00:37:29] And I have to wonder, like aside from it being cool, because it's kind of cool.

Brian: [00:37:34] It is cool.

Phillip: [00:37:35] What do you do? I don't... They say it's $700, although I see a $349 price point for at least one of their models. I would never discount someone's desire to just, you know, use something like this for absolutely no good reason. But I have to wonder, where does it go from here, other than it being cool? If we have more information about our quantified self, does that build confidence in us, or does it have the opposite effect? If I'm working harder and harder and harder, and I'm seeing that I have no change day to day in the heat map of my body and my 3D model...

Brian: [00:38:27] Well no, the beauty of that is, I think, you can see that if what you're doing is not working, you have to try something else. You have that data. You know, I talked about this on the predictions episode. As we gather more and more tools like this, and we can really keep track of every part of our body and our lives, you can actually mine that data and then run tests. You know, like your own A/B tests. "Well, if I do this, this doesn't change at all. So I should stop doing that. It's not helping me. I need to try something else to accomplish that." At least, that's what someone should be using this for. How it actually affects us is a different story. I think that's what you were kind of getting at. Maybe, as we've gathered more data and had more screens and more points of information and more places to share that information with everyone in the world, that's not necessarily... We're more depressed than we've ever been as a country. We're more out of shape and, you know, more obese than we've ever been as a country. I'm talking from a US-centric point of view right now, obviously. Maybe all of this information is actually not helping us. But I think if it allows someone who does want to attack something like body sculpting or weight loss or whatever it is like it's a business problem, and treat their life like something they can test and make changes to and have data on and actually do something about, then maybe this is a pretty cool tool.

Phillip: [00:40:14] Yeah. OK. So that was amazing and sort of meandering and kind of an awesome open conversation. I want to hear what our listeners think. And if you are listening to this today through a smart speaker device or on a podcast app, you have another choice, speaking of choices, of where to listen. We're now on Spotify. Future Commerce can be had on Spotify, so make sure to subscribe to us over there. But we want to hear back from you. So go to FutureCommerce.fm and hit us up. Lend your voice to the conversation. We want to hear more from you. What do you think about deep fakes? What do you think about AI and body data as a weapon? And also, what do you think about the advent of personalization from a product perspective, and using body data to drive personalized and customized products for you? We want to hear about that. So hit us up, and you can always email us at Brian@FutureCommerce.fm and Phillip@FutureCommerce.fm. And we're on social, too: Facebook, Twitter, and LinkedIn. But whew, awesome conversation. Thanks to Danny for contributing on Future Policy. And until next time. What is it we say?

Brian: [00:41:27] Retail tech is moving fast...

Phillip: [00:41:28] But Future Commerce is moving faster. I'll remember it one of these days. Thanks for listening. Bye.

Brian: [00:41:35] Bye.
