UNFAIR AND DECEPTIVE ROBOTS
WOODROW HARTZOG*
ABSTRACT
Robots, like household helpers, personal digital assistants, au-
tomated cars, and personal drones are or will soon be available
to consumers. These robots raise common consumer protection
issues, such as fraud, privacy, data security, and risks to health,
physical safety and finances. Robots also raise new consumer
protection issues, or at least call into question how existing con-
sumer protection regimes might be applied to such emerging
technologies. Yet it is unclear which legal regimes should govern
these robots and what consumer protection rules for robots
should look like.
The thesis of the Article is that the FTC’s grant of authority and
existing jurisprudence make it the preferable regulatory agency
for protecting consumers who buy and interact with robots. The
FTC has proven to be a capable regulator of communications,
organizational procedures, and design, which are the three cru-
cial concepts for safe consumer robots. Additionally, the struc-
ture and history of the FTC shows that the agency is capable of
fostering new technologies as it did with the Internet. The agency
generally defers to industry standards, avoids dramatic regulato-
ry lurches, and cooperates with other agencies. Consumer robot-
ics is an expansive field with great potential. A light but steady
response by the FTC will allow the consumer robotics industry to
thrive while preserving consumer trust and keeping consumers
safe from harm.
© 2015 Woodrow Hartzog.
* Associate Professor, Samford University’s Cumberland School of Law; Affiliate Scholar, Center for Internet and Society at Stanford Law School. The author would like to thank Ryan Calo, Danielle Citron, Kate Darling, Brannon Denning, Evan Selinger, Michael Froomkin, Margot Kaminski, and the participants of the We Robot 2015 conference. The author would also like to thank Megan Fitzpatrick and Lydia Wimberly for their excellent research assistance.
TABLE OF CONTENTS
ABSTRACT ............................................................................................. 785
TABLE OF CONTENTS ............................................................................ 786
INTRODUCTION ...................................................................................... 787
I. CONSUMER ROBOTS RAISE EXISTING AND NEW CONSUMER
PROTECTION ISSUES ................................................................... 789
A. Scambots and Decepticons ................................................... 791
B. Spybots ................................................................................. 796
C. Nudgebots ............................................................................. 800
D. Autobots ............................................................................... 805
E. Cyborgs ................................................................................. 807
II. THE FTC HAS THE ABILITY TO ADDRESS CONSUMER ROBOTICS ... 810
A. Broad Regulatory Authority ................................................. 810
B. Diverse and Effective Toolkit .............................................. 813
1. Disclosures ..................................................................... 814
2. Design and Secondary Liability ..................................... 816
3. Organizational Procedures and Data Protection ............ 819
III. THE FTC SHOULD TAKE THE LEAD ON REGULATING CONSUMER
ROBOTICS ................................................................................... 821
A. Established Body of Law and Authority .............................. 823
B. Accommodation of Nascent Technologies ........................... 824
C. Deference to Industry ........................................................... 825
D. The FTC Can and Should Cooperate with Other Agencies . 827
IV. CONCLUSION .................................................................................. 828
INTRODUCTION
It turns out it is tough to spot your own vulnerabilities. This year a
South Korean woman was sleeping on the floor when her robot vacuum ate
her hair, forcing her to call for emergency help.1
The mobile dating app
Tinder has been infiltrated by bots posing as real people that attempt to so-
cially manipulate users into downloading other apps, disclosing credit card information, and using webcams.2
When remotely controlled anthropo-
morphic robots that appear to be acting autonomously are introduced to
children, young ones become attached to the robot and will disclose secrets
to the robot that they would not tell their parents or teachers.3
Should com-
panies be required to tell people how vulnerable they are? Should the law
require safer robots? Thus far, there is no consensus regarding the regulato-
ry response to consumer robotics. If this uncertainty is not already a prob-
lem, it soon will be.
Robots are now in the hands of consumers. Household helpers, per-
sonal digital assistants, automated cars, personal drones, and countless other
robots are or will soon be available to consumers for a reasonable price.
Yet it remains unclear exactly how vulnerable consumers are to these ro-
bots. It is also unclear which legal regimes should govern these robots and
what consumer protection rules for robots should look like.
Robots for consumers present two kinds of challenges. First, many of
these robots raise common consumer protection issues, such as fraud, pri-
vacy, data security, failure to exercise reasonable care and the exploitation
of the vulnerable. Like computers, robots are capable of collecting, using,
and disclosing information in harmful ways. Robots can also be hacked.
Second, the coming wave of robotics also raises new consumer protection
1. See Matthew Humphries, Fire Department Called After Robot Vacuum ‘Attacks’ Sleeping Owner, GEEK (Feb. 6, 2015), http://www.geek.com/news/fire-department-called-after-robot-vacuum-attacks-sleeping-owner-1615192/; see also Brian Ashcraft, Robot Vacuum Attempts to Chew Owner’s Head Off, KOTAKU (Feb. 6, 2015), http://kotaku.com/robot-vacuum-attempts-to-chew-owners-head-off-1684171465.
2. See Leo Kelion, Tinder Accounts Spammed by Bots Masquerading as Singles, BBC (Apr. 2, 2014), http://www.bbc.com/news/26850761; see also Satnam Narang, Tinder: Spammers Flirt with Popular Mobile Dating App, SYMANTEC (July 1, 2013), http://www.symantec.com/connect/blogs/tinder-spammers-flirt-popular-mobile-dating-app.
3. See, e.g., Jacqueline Kory Westlund & Cynthia Breazeal, Deception, Secrets, Children, and Robots: What’s Acceptable?, 10TH ACM/IEEE CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI) (2015), http://www.openroboethics.org/hri15/wp-content/uploads/2015/02/Mf-Westlund.pdf; C.L. Bethel et al., Secret-Sharing: Interactions between a Child, Robot, and Adult, IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (2011), available at http://www.cindybethel.com/publications/IEEESMC2011-BethelCL.pdf; CYNTHIA BREAZEAL, DESIGNING SOCIABLE ROBOTS (2004); M. Fior et al., Children’s Relationships with Robots: Robot Is Child’s New Friend, 4 J. PHYSICAL AGENTS 9 (2010).
issues, or at least calls into question how existing consumer protection re-
gimes might be applied to such foreign technologies.
The Federal Trade Commission (“FTC”) is responsible for protecting
consumers through its authority under Section 5 of the FTC Act to police
unfair and deceptive trade practices. The FTC’s recent expansion into the
Internet of Things and the mass adoption of robots by consumers are
about to meet head-on. But is the FTC the best agency to protect consum-
ers who purchase and use robots? What should the FTC’s consumer robot-
ics jurisprudence look like?
The goal of this Article is to explore the proper role of the FTC regard-
ing consumer robotics. This Article will identify the consumer vulnerabili-
ties created by robots and analyze existing FTC jurisprudence to determine
what standards, if any, can guide the design and use of robots that are capa-
ble of physically, emotionally, and financially harming consumers.
I argue that the FTC’s grant of authority and existing jurispru-
dence make it the preferable regulatory agency for protecting consumers
who buy and interact with robots. The FTC has proven to be a capable reg-
ulator of communications, organizational procedures, and design, which are
the three crucial concepts for safe consumer robots. The FTC’s existing
framework for protecting consumers from fraud, data breaches, privacy
harms, and exploitation is robust enough to adequately protect consumers
and clear enough to notify commercial entities of their obligations when
designing, selling, and using robots that interact with consumers.
Additionally, the structure and history of the FTC shows that the agen-
cy is capable of fostering new technologies as it did with the Internet. The
agency consistently defers to industry standards, avoids dramatic regulatory
lurches, and shares authority with other agencies. Consumer robotics is an
expansive field with great potential. A light but steady regulatory approach
by the FTC will allow the consumer robotics industry to thrive while pre-
serving consumer trust and keeping consumers safe from harm.
This Article proceeds in three parts. Part I surveys the potential vul-
nerabilities created by consumer robotics. This Part explores different
kinds of problematic consumer robots such as spybots, nudgebots, scam-
bots, automated algorithms, and cyborgs. It maps concerns about data secu-
rity, exploitation, privacy, and deception onto the existing FTC Section 5
jurisprudence, which prohibits unfair and deceptive trade practices.
Part II will explore the FTC’s background and jurisdiction. While the
FTC is limited to regulating robots in commerce, it has great discretion over
who and what to regulate. Robot designers, merchants, and organizations
using robots as part of a service must refrain from unfair and deceptive
trade practices. The FTC has also developed a “means and instrumentalities” theory of secondary liability to reach entities that facilitate consumer
deception or harm.
Part III addresses the question of agency choice. Is the FTC the best
agency to regulate consumer robotics? Should it be the only regulator of
consumer robotics? I conclude that the FTC should take the lead on con-
sumer robotics issues because the agency has an established body of law to
draw from, it can accommodate nascent technologies, it gives deference to
industry standards, it is capable of responding to technological change
quickly but with stability, and it can use Section 5 as a safety net for
issues not specifically addressed elsewhere. I also conclude that the FTC
should not try to regulate consumer robotics in isolation. The agency
should work with other federal and state agencies and possibly even new
agencies like a proposed Federal Robotics Commission to ensure that con-
sumers are protected from unfair and deceptive robots.
I. CONSUMER ROBOTS RAISE EXISTING AND NEW CONSUMER PROTECTION
ISSUES
What is a robot anyway? It is difficult to say. There is no settled defi-
nition for the term “robot,” particularly in law and policy circles.4
Do robots have to be embodied, or can software “bots” be counted as a robot?
Do robots have to be automated, or can telepresence machines that are re-
motely operated be counted as robots?
Neil Richards and William Smart have noted: “In most [common examples of robots], the robots can move about their world and affect it, often by manipulating objects. They behave intelligently when interacting with the world. They are also constructed by humans. These traits are, to us, the hallmarks of a robot.”5 Richards and Smart propose the following working definition: “A robot is a constructed system that displays both physical and mental agency, but is not alive in the biological sense.”6
This definition is a good place to start. However, for purposes of dis-
cussing consumer protection policy, it can be too narrow. Often non-
embodied and non-autonomous technologies will present similar or related
consumer protection issues to those contemplated by Richards and Smart’s
definition. The functional difference between robot and automated tech-
nology can sometimes be hard to articulate. Thus, for the purposes of this
Article, I will also include certain automated software and non-automated
technologies.
4. Neil M. Richards & William D. Smart, How Should the Law Think About Robots?, WE
ROBOT CONFERENCE (2012), http://robots.law.miami.edu/wp-
content/uploads/2012/03/RichardsSmart_HowShouldTheLawThink.pdf.
5. Id. at 5.
6. Id.
In many ways, robots are not exceptional regarding consumer protec-
tion issues. Robots can be used to lie, scam, pressure, and manipulate con-
sumers in ways that are analogous to existing fraudulent practices. Ian Kerr
presciently warned in 2004, “Like Hollywood’s finest directors, who are able to steer their audiences’ attention away from the false assumptions that they have so skillfully engendered, some software programmers are applying principles of cognitive science to develop electronic entities that garner consumer trust. Unfortunately, some e-businesses are exploiting these applications to garner trust where no such trust is warranted.”7 Kerr called this the “californication of commerce,” and his concern about consumer vulnerability to autonomous agents which leverage cognitive science for manipulative purposes is squarely a consumer protection issue.8
However, in many ways, robots are exceptional with respect to con-
sumer protection. In a series of articles, Ryan Calo has described how ro-
bots will challenge, among other regulatory schemes, existing consumer
protection regimes such as privacy and notice because they are capable of
physical harm, have emergent properties, and feel to humans like social
actors.9
Should robots designed with personalities and human or animal-like
faces be subject to different rules than simple boxes with wheels? At what
degree of automation should designers no longer be liable for the decisions
made by their autonomous agents? For example, should the designer of a
software bot whose function is to make random online purchases be liable when the bot buys drugs on the black market?10 Should software terms of use be subjected to more scrutiny when they govern mechanical body parts like implanted hearing aids and electronic body parts?11 Should robot salespeople be subject to different rules than their human counterparts?12
7. Ian Kerr, Bots, Babes, and the Californication of Commerce, 1 U. OTTAWA L. & TECH. J. 285, 288 (2004).
8. Id. at 289.
9. M. Ryan Calo, Robotics and the Lessons of Cyberlaw, 103 CALIF. L. REV. (forthcoming 2015); see also M. Ryan Calo, People Can Be So Fake: A New Dimension to Privacy and Technology Scholarship, 114 PENN. ST. L. REV. 809 (2010); M. Ryan Calo, Open Robotics, 70 MD. L. REV. 571 (2011); M. Ryan Calo, Against Notice Skepticism in Privacy (and Elsewhere), 87 NOTRE DAME L. REV. 1027 (2012); M. Ryan Calo, Robots and Privacy, in ROBOT ETHICS: THE ETHICAL AND SOCIAL IMPLICATIONS OF ROBOTICS 187, 194 (Patrick Lin, Keith Abney & George A. Bekey eds., 2012).
10. See Ryan Calo, A Robot Really Committed a Crime. Now What?, FORBES (Dec. 23, 2014), http://www.forbes.com/sites/ryancalo/2014/12/23/a-robot-really-committed-a-crime-now-what/; Daniel Rivero, Robots are Starting to Break the Law and No One Knows What to do About It, FUSION (Dec. 29, 2014), http://fusion.net/story/35883/robots-are-starting-to-break-the-law-and-nobody-knows-what-to-do-about-it/.
11. See Benjamin Wittes & Jane Chong, Our Cyborg Future: Law and Policy Implications, BROOKINGS (Sept. 2014), http://www.brookings.edu/research/reports2/2014/09/cyborg-future-law-policy-implications; see also Ian Kerr, The Internet of People? Reflections on the Future Regulation of Human-Implantable Radio Frequency Identification, in LESSONS FROM THE IDENTITY TRAIL: ANONYMITY, PRIVACY AND IDENTITY 335 (Ian Kerr, Valerie Steeves & Carole Lucock eds., 2009).
This Part will explore the different consumer protection issues pre-
sented by robotics. Where appropriate it will also describe potentially rele-
vant FTC jurisprudence and established concepts. It will also highlight areas of uncertainty where more law, policy, and theory are needed. Some of these robots are not designed to harm consumers; they are just used that way. For others, consumer harm and deception is the reason for the robot’s
existence. While some robots may be an immediate threat to consumers,
others merely serve as proofs-of-concept or harbingers for future problems.
A. Scambots and Decepticons
Perhaps the most fundamental reason we are vulnerable to robots is
that we trust them. Not only do we entrust them with our most intimate
secrets and give them access to our most personal spaces, but we trust them
with our physical well-being.13 One of the fastest growing segments of robotics is in the field of health care.14
There are many reasons why people trust robots. Often the reason is that we have confidence in the manufacturer or designer because of their reputation. iRobot, a leader in the consumer robotics field and maker of the popular vacuum robot Roomba, enjoys a strong reputation among consumers.15
Consumer trust in robots is also formed by company representations.
In this way, robots are not unique. The FTC has a long history of protect-
ing against deceptive representations by companies.
12. Maggie Hiufu Wong, Bleep Blorp: New Japanese Hotel to be Staffed by Robots, CNN
(Feb. 5, 2015), http://www.cnn.com/2015/02/04/travel/japan-hotel-robots/index.html.
13. Calo, Robots and Privacy, supra note 9, at 187, 194.
14. Laurel Riek, Woodrow Hartzog, Don Howard, AJung Moon & Ryan Calo, The Emerging Policy and Ethics of Human Robot Interaction, 10TH ACM/IEEE CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI) (2015); see also Sooyeon Jeong et al., Deploying Social Robots in Pediatric Hospitals: What Needs to be Considered?, 10TH ACM/IEEE CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI) (2015), http://www.openroboethics.org/hri15/wp-content/uploads/2015/02/Hf-Jeong-et-al.pdf; Heike Felzmann et al., Robot-assisted Care for Elderly with Dementia: Is There a Potential for Genuine End-user Empowerment?, 10TH ACM/IEEE CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI) (2015), http://www.openroboethics.org/hri15/wp-content/uploads/2015/02/Hf-Felzmann.pdf.
15. See, e.g., Liam McCabe, The Best Robot Vacuum, SWEET HOME (Jan. 6, 2015), http://thesweethome.com/reviews/the-best-robot-vacuum-is-the-roomba-650/; Ed Perratore, iRobot Roomba Zooms Past Other Robotic Vacuums: But No Robotic Vacuum Can Replace Your Upright or Canister, CONSUMER REP. (Sept. 3, 2013), http://www.consumerreports.org/cro/news/2013/09/consumer-reports-tests-three-new-robotic-vacuums/index.htm.
The FTC’s most effective and commonly used regulatory tool is its
authority to protect against deceptive trade practices in Section 5 of the
FTC Act. A deceptive trade practice is a “material representation, omission or practice that is likely to mislead the consumer acting reasonably in the circumstances, to the consumer’s detriment.”16
According to the
FTC:
Practices that have been found . . . misleading or deceptive in
specific cases include false oral or written representations, mis-
leading price claims, sales of hazardous or systematically defec-
tive products or services without adequate disclosures, failure to
disclose information regarding pyramid sales, use of bait and
switch techniques, failure to perform promised services, and fail-
ure to meet warranty obligations.17
Rebecca Tushnet and Eric Goldman wrote: “There are two pillars of truth in advertising according to the FTC: First, advertising must be truthful and not misleading. Second, advertisers must have adequate substantiation for all product claims before disseminating their advertising.”18
The FTC
has a long history of regulating deceptive advertising and marketing state-
ments, including being on the forefront of niche and novel areas like blog-
ger disclosure of benefits, subliminal advertising, drug advertising, nega-
tive-option marketing, and product demonstrations.19
There are several
scenarios emerging regarding the design and use of robots where the FTC
might find deception.
Often there is a great difference between people’s conceptions of what
robots are currently able to do and what they are actually able to do. Socie-
ty’s notions of robots’ capabilities are formed less by reality and more by
popular movies, books, and other aspects of pop culture.20
This makes
marketing robots a ripe opportunity for deception because consumers are
primed to believe.
For example, one problematic kind of representation currently made
by robotics companies has to do with “performance videos,” often uploaded
16. Letter from James C. Miller III to Hon. John D. Dingell, app. at 174–76 (1984); see also Letter from FTC Comm’rs to Wendell H. Ford & John C. Danforth, Senators (Dec. 17, 1980), reprinted in In re Int’l Harvester Co., 104 F.T.C. 949 app. at 1070–76 (1984), available at https://www.ftc.gov/public-statements/1980/12/ftc-policy-statement-unfairness.
17. Letter from James C. Miller III to Hon. John D. Dingell, app. at 175 (1984).
18. REBECCA TUSHNET & ERIC GOLDMAN, ADVERTISING AND MARKETING LAW 101 (2d ed. 2014) (citing Advertising Substantiation Policy Statement, 49 Fed. Reg. 30,999 (Aug. 2, 1984)).
19. Division of Advertising Practices, FTC, https://www.ftc.gov/about-ftc/bureaus-offices/bureau-consumer-protection/our-divisions/division-advertising-practices (last visited Mar. 17, 2015).
20. See Richards & Smart, supra note 4.
to a video sharing site or funding website like Kickstarter to tout a robot’s features or effectiveness.21
These videos sometimes speed up the motion of
robots to make them appear faster than they are. In other instances, these
videos simulate features that are planned, but might not yet exist. For ex-
ample, the “Personal Robot” featured in a Kickstarter video by Robotbase simulates advanced speech recognition that is aspirational and does not yet exist.22
The FTC has an established track record regulating deceptive product
demonstrations, which are forms of deceptive advertising.23
For example,
the FTC alleged that carmaker Volvo acted deceptively when making a
commercial where all cars set to be destroyed as part of a monster truck
show were crushed except a Volvo.24 In reality, the Volvo’s frame had been reinforced and the other cars’ roof supports had been debilitated.25 The FTC also alleged that Campbell’s Soup deceptively placed marbles at the bottom of a soup bowl in one of its ads to make the soup appear as though it contained more vegetables than it really had.26
Another area of robotic deployment where deception becomes a prob-
lem involves what is known as a “Wizard-of-Oz” setup.27 According to Laurel Riek, “[Wizard of Oz] refers to a person . . . remotely operating a robot, controlling any of a number of things, such as its movement, navigation, speech, gestures, etc. [Wizard of Oz] may involve any amount of control along the autonomy spectrum, from fully autonomous to fully tele-
21. I thank Ryan Calo for bringing this problem to my attention.
22. See Eamon Kunze, Personal Robot Wants to be Your Ultimate Personal Assistant, WT VOX (Feb. 20, 2015), https://wtvox.com/2015/02/personal-robot-wants-to-be-your-ultimate-personal-assistant/?utm_source=dlvr.it&utm_medium=twitter (“‘The video is not an actual demonstration,’ said CEO Duy Huynh. He told me it’s a production video. ‘It’s what you’ll get by the end of the year.’ That’s when Robotbase expects to start shipping the first of these personal robots to customers. When the robot does finally ship, Huynh admits that it’s ‘not going to have that sexy beautiful voice like in the video.’”).
23. See, e.g., FTC v. Colgate-Palmolive Co., 380 U.S. 374 (1965) (describing sand on Plexiglas used as a substitute for sandpaper in a demonstration of shaving cream); S.C. Johnson & Son, Inc. v. Clorox Co., 241 F.3d 232 (2d Cir. 2001) (finding the rate of leakage from competitor’s resealable bag was exaggerated); see also Nikkal Indus. Ltd. v. Salton, Inc., 735 F. Supp. 1227 (S.D.N.Y. 1990) (holding that an advertisement claiming scoopable ice cream was not deceptive despite a photograph of hard ice cream).
24. Volvo N.A. Corp., 115 F.T.C. 87 (1992); Texas v. Volvo North America Corp., No. 493274 (Tex. D. Ct. Travis Co. Nov. 5, 1990) (depiction of a monster truck riding over cars in which a Volvo is not crushed was prosecuted because the roof supports of the Volvo had been reinforced and the other cars’ roof supports had been weakened).
25. Id.
26. Campbell Soup Co., 77 F.T.C. 664 (1970).
27. See, e.g., Laurel D. Riek, Wizard of Oz Studies in HRI: A Systematic Review and New Reporting Guidelines, 1 J. HUMAN-ROBOT INTERACTION 119, 119 (2012).
operated, as well as mixed initiative interaction.”28
Jacqueline Kory
Westlund and Cynthia Breazeal note that when a Wizard-of-Oz setup is
deployed, “[a]t the most basic level, the human interacting with the remote-
operated robot is deceived into thinking the robot is acting autonomously.”29
Westlund and Breazeal noted some of the problems with the Wizard-of-Oz setup, where people “may disclose sensitive information to the robot that they would not tell a human, not realizing that a human is hearing everything they say. They may feel betrayed when they find out about the deception. Given that social robots are designed to draw us in, often engaging us emotionally and building relationships with us, the robot itself could be deceptive in that it appears to have an emotional response to you but ‘in reality’ does not.”30
When would a company’s Wizard of Oz deployment
become a deceptive trade practice? Given our general tendency to over-
estimate the technological ability and agency of robots as social actors, the
opportunity is ripe for malicious companies to scam users by convincing
them they are dealing with a fully autonomous agent.
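To make the mechanics concrete, consider a minimal sketch of a Wizard-of-Oz setup, written in Python; every name here is hypothetical and invented for illustration, not drawn from any actual product or from the sources cited above. The point is that the consumer-facing interface is identical whether a reply is generated by software or typed by a hidden human operator, which is precisely what makes nondisclosure potentially deceptive:

```python
import random

# Hypothetical sketch of a Wizard-of-Oz setup: the consumer talks to the
# same respond() interface whether the robot is genuinely autonomous or
# secretly teleoperated by a hidden human "wizard."

CANNED_REPLIES = ["Hello!", "Tell me more.", "That sounds wonderful."]

class SocialRobot:
    def __init__(self, wizard=None):
        # wizard: optional callable mapping the user's utterance to a reply
        # typed by a hidden human operator; None means fully automated.
        self.wizard = wizard

    def respond(self, utterance):
        if self.wizard is not None:
            return self.wizard(utterance)      # hidden human in the loop
        return random.choice(CANNED_REPLIES)   # genuinely automated reply

autonomous_bot = SocialRobot()
wizarded_bot = SocialRobot(wizard=lambda utterance: "Tell me more.")

# From the consumer's side the two are indistinguishable:
print(autonomous_bot.respond("I had a rough day."))
print(wizarded_bot.respond("I had a rough day."))
```

Nothing in the interface discloses the wizard’s presence; whether that omission is material to a reasonable consumer is the Section 5 question.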
Riek and Robert Watson have also articulated how malicious actors
might utilize security flaws in telepresence robots to deceive remote participants in a conversation.31 Attackers could modify messages, improperly obtain their contents, or prevent the system from operating at all.32 With respect to video communications and telepresence, Riek and Watson find that telepresence manipulation can also be subtle, stating “modifications to the [communications] channel may not be immediately (or at all) obvious to the end user, as recent improvements in technology allow the realistic modification of both verbal and nonverbal communication signals in real time. This may allow the malicious modification of communication, or even the complete impersonation of a participant.”33
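The “man-in-the-middle” idea referenced in the accompanying footnote can be sketched in a few lines of Python. This is a hedged illustration of the general attack pattern only; the parties, messages, and transformations are invented, and nothing here models any particular telepresence system:

```python
# Hypothetical sketch of a man-in-the-middle relay: an attacker who
# controls the channel between two participants can read and rewrite
# what passes through it before it is delivered.

def honest_channel(message):
    # Baseline: the message arrives exactly as sent.
    return message

def compromised_channel(message):
    # The attacker subtly alters an affective cue and the substance,
    # mirroring the kind of modification Riek and Watson describe.
    tampered = message.replace("smiling", "frowning")
    return tampered.replace("I agree", "I do not agree")

original = "I am smiling and I agree to your offer."
print(honest_channel(original))       # what the sender actually said
print(compromised_channel(original))  # what the recipient is shown
```

A modification like this need not be obvious to either participant, which is what makes the telepresence variant of the attack a consumer protection problem and not merely a security one.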
28. Id.
29. Westlund & Breazeal, supra note 3, at 1.
30. Id. (citing BREAZEAL, supra note 3; M. Coeckelbergh, Are Emotional Robots Deceptive?, 3 IEEE TRANSACTIONS ON AFFECTIVE COMPUTING 388 (2012)); see also David J. Atkinson, Robot Trustworthiness: Guidelines for Simulated Emotion, 10TH ACM/IEEE CONFERENCE ON HUMAN-ROBOT INTERACTION (HRI) (2015), http://www.academia.edu/9889659/Robot_Trustworthiness_Guidelines_for_Simulated_Emotion.
31. Laurel D. Riek & Robert N.M. Watson, The Age of Avatar Realism: When Seeing Shouldn’t Be Believing, IEEE ROBOTICS & AUTOMATION MAG., Dec. 2010, at 37, available at http://papers.laurelriek.org/riek-watson-final.pdf.
32. Id.
33. Id. (“[T]his kind of identity theft may be tricky to maintain because of the contextual information and subtle differences in . . . behavior, in particular, nonverbal behaviors that violate . . . expectations. More subtly, [an attacker] might inject facial expressions and gestures into a conversation . . . [or] . . . may choose to inhibit . . . expressions, such as reducing the intensity of . . . smiles. [Attackers] can also augment or inhibit . . . tone of voice, or indeed even the words [said]. In the field of computer security, attacks such as these are referred to as a man-in-the-middle attack, in which a third party interferes with the expected execution of a protocol.”).
Unlike inanimate objects, robots are capable of making their own rep-
resentations. Some of these robots will inevitably deceive consumers. I will call lying robots “decepticons.” Automated software ‘bots’ on social media like Twitter are increasingly adept at tricking people into thinking they are operated by humans.34 Two of these bots were so convincing that when they “threatened” each other on Twitter, the police responded to their designer’s house, confused about who actually made a death threat.35
The same technology and techniques could also be employed in embodied
robots.
Not all deception is actionable and not all decepticons are lawbreakers.
A modest amount of inaccuracy is allowable, if not encouraged, under gen-
eral principles of marketing and the messiness of human interaction. Many
robots that end up misleading people might simply be engaged in trade
puffery or common data analytics, similar to how a salesperson relies upon
context and cues to tailor a strategy to best close the deal. Fortunately, as I
will cover in Part II, the FTC has an established body of law to articulate
the difference between materially deceptive and non-deceptive representa-
tions.
This jurisprudence will be important as robots become more involved
in commerce. One of the core functions of the FTC is to protect consumers
against scams.36 In addition to the agency’s focus on claims that can affect health and physical well-being, the FTC dedicates much of its resources to fighting those who target financially vulnerable consumers or economically harm consumers.37
It is worth noting that the relatively new Consumer Financial Protec-
tion Bureau (“CFPB”) arguably has even more authority over scammers than the FTC. The CFPB can regulate “abusive” conduct as well as “unfair” conduct.38
An “abusive” practice is one that:
(1) materially interferes with the ability of a consumer to under-
stand a term or condition of a consumer financial product or ser-
34. See, e.g., Nick Bilton, Social Media Bots Offer Phony Friends and Real Profit, N.Y. TIMES (Nov. 19, 2014), http://www.nytimes.com/2014/11/20/fashion/social-media-bots-offer-phony-friends-and-real-profit.html.
35. See Lee Mathews, Police Respond to Twitter Bot Sending Death Threat to Another Twitter Bot, GEEK (Feb. 11, 2015), http://www.geek.com/apps/police-respond-to-twitter-bot-sending-death-threat-to-another-twitter-bot-1615550/.
36. Scam Alerts: What To Know and Do About Scams in the News, FTC, http://www.consumer.ftc.gov/scam-alerts (last accessed Mar. 19, 2015).
37. TUSHNET & GOLDMAN, supra note 18, at 101.
38. Dodd-Frank Wall Street Reform and Consumer Financial Protection Act, 12 U.S.C. § 5531 (2012); see also TUSHNET & GOLDMAN, supra note 18, at 115.
vice; or (2) takes unreasonable advantage of—(A) a lack of un-
derstanding on the part of the consumer of the material risks,
costs, or conditions of the product or service; (B) the inability of
the consumer to protect the interests of the consumer in selecting
or using a consumer financial product or service; or (C) the rea-
sonable reliance by the consumer on a covered person to act in
the interests of the consumer.39
Because of its ability to regulate abusive conduct, the CFPB might be
even more empowered in some contexts than the FTC to regulate those that
would exploit irrational consumer biases such as our tendency to attribute
agency to robots, form emotional bonds with them, and irrationally trust the results of automated decisions.40 As I’ll argue in detail below, it will take
many agencies to effectively address consumer robotics. And scambots are
not the only robots that might pose problems for consumers.
B. Spybots
Robots will eventually assist consumers in both banal and intimate as-
pects of people’s lives. To be effective, robots must sense the world around
them. Robots have been equipped with cameras, motion and audio sensors,
facial and object recognition technologies, and even biological sensors that
measure pulse, pupil dilation, and hair follicle stimulation.41 They have the capacity to store massive quantities of personal data in perfect, easily recalled form. When robots are fully realized, they will be nothing short of a perfected surveillance machine.42
Ryan Calo has argued that robots introduce new points of access to
historically protected spaces.43 Calo noted, “The home robot in particular
39. 12 U.S.C. § 5531.
40. See generally Benedict J. Schweigert, The CFPB’s “Abusiveness” Standard and Con-
sumer Irrationality, SSRN (May 15, 2012),
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2061733.
41. Calo, Robots and Privacy, supra note 9, at 194; Kristen Thomasen, Liar Liar Pants on Fire! Examining the Constitutionality of Enhanced Robo-Interrogation, WE ROBOT CONFERENCE (2012), http://robots.law.miami.edu/wp-content/uploads/2012/01/Thomasen_CONSTITUTIONALITY-OF-ROBOT-INTERROGATION.pdf; Adam Higgenbotham, Deception is Futile When Big Brother’s Lie Detector Turns Its Eyes on You, WIRED (Jan. 17, 2013), http://www.wired.com/2013/01/ff-lie-detector/.
42. Calo, Robots and Privacy, supra note 9, at 187, 194 (“It is not hard to imagine why robots raise privacy concerns . . . . Robots can go places humans cannot go, see things humans cannot see. Robots are, first and foremost, a human instrument. And, after industrial manufacturing, the principle use to which we’ve put that instrument has been surveillance.”); see also Margot Kaminski, Robots in the Home: What Will We Have Agreed To?, IDAHO L. REV. (forthcoming 2015), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2592500.
43. Calo, Robots and Privacy, supra note 9, at 188.
presents a novel opportunity for government, private litigants, and hackers
to access information about the interior of a living space.”44
Spybots are also particularly problematic because people give robots
social meaning.45 Calo stated, “Robots are increasingly human-like and socially interactive in design, making them more engaging and salient to their end-users and the larger community. Many studies demonstrate that people are hardwired to react to heavily anthropomorphic technologies, such as robots, as though a person were actually present, including with respect to the sensation of being observed and evaluated.”46
Calo argued that the social dimension of robots opens up three distinct
dangers: humans will have fewer opportunities for solitude, robots will be
in a unique position to extract information from people, and robots can lev-
erage the advantages of humans (such as fear and praise) in information
gathering without human drawbacks such as imperfect memories, fatigue,
and embarrassment.47
Spybots are already so prevalent that it is impractical to try to describe
all of the different types. Drones have ignited America’s peeping tom anxi-
ety and they are getting smaller by the day.48 One company has marketed “Nixie,” a small robot that looks like a watch with propellers, as “the first wearable camera that can fly.”49 Visions of drone-covered skies and hidden drones peeping into bedrooms easily trigger consumer distaste for surveillance. Some of these drones might be regulated under the same theories that the FTC has used to regulate spyware.50
The FTC has alleged that the sale of spyware, as well as providing the
means and instrumentalities to install spyware and access consumers’ per-
44. Id.
45. Id.
46. Id.
47. Id.
48. See, e.g., Ryan Calo, The Drone as Privacy Catalyst, 64 STAN. L. REV. ONLINE 29 (2011); Gregory S. McNeal, Alleged Drone ‘Peeping Tom’ Photo Reveals Perils of Drone Related Journalism, FORBES (July 14, 2014), http://www.forbes.com/sites/gregorymcneal/2014/07/14/alleged-drone-peeping-tom-photo-reveals-perils-of-drone-related-journalism/; Erica Heartquist, Drone Accused of Peeping into Woman’s Window Was Photographing Aerial Views, USA TODAY (June 24, 2014), http://www.usatoday.com/story/news/nation-now/2014/06/24/seattle-woman-drone-apartment-washington/11339835/.
49. NIXIE, http://flynixie.com/ (last visited Mar. 17, 2015).
50. See, e.g., Aspen Way Enters., Inc., F.T.C. File No. 112 3151, No. C-4392 (F.T.C. Apr. 11, 2013); CyberSpy Software, LLC and Trace R. Spence, F.T.C. File No. 082 3160, No. 08-CV-01872 (F.T.C. Nov. 17, 2008) (alleging that selling spyware and showing customers how to remotely install it on other people’s computers without their knowledge or consent is an unfair and deceptive trade practice); see also Spyware and Malware, FTC, https://www.ftc.gov/news-events/media-resources/identity-theft-and-data-security/spyware-and-malware.
sonal information, are unfair and deceptive trade practices.51 The FTC has also concluded that installing spyware and gathering data without notice was an unfair practice.52 The agency cited to the substantial harm caused to consumers from such invasive surveillance and concerns that “[c]onsumers cannot reasonably avoid these injuries because [the surveillance] is invisible to them.”53 The FTC filed an unfairness complaint against Sony BMG on a similar theory alleging that the company caused spyware to be downloaded without sufficient notice.54
When is robotic surveillance like spy-
ware? When it is not obvious? When it is undetectable?
Boxie, an interactive story-capture robot developed at MIT, demon-
strates how people will give a random robot unquestioned access and per-
sonal information simply because it is adorable and unthreatening.55 Boxie was designed specifically to coax stories out of people and succeeded in that goal.56
Consider how robots like Boxie might be deployed as a robot shopping
assistant. Robotic shopping assistants, which are currently deployed in Ja-
pan, are designed to approach customers and encourage the purchase of a
particular product or service. Calo has noted: “Unlike ordinary store clerks,
however, robots are capable of recording and processing every aspect of the
51. See, e.g., CyberSpy Software, F.T.C. File No. 082 3160.
52. Aspen Way Enters., Inc., F.T.C. File No. 112 3151.
53. Id.
54. Sony BMG Music Entm’t, F.T.C. File No. 062 3019, No. C-4195 (F.T.C. June 28, 2007),
available at
http://www.ftc.gov/sites/default/files/documents/cases/2007/01/070130cmp0623019.pdf.
55. BOXIE: THE INTERACTIVE STORY-CAPTURE CAMERA, http://resenv.media.mit.edu/Boxie/ (last visited Mar. 17, 2015); see also Kasia Cieplak-Mayr von Baldegg, The World’s Cutest Surveillance Robot Videographer, ATLANTIC (Jan. 13, 2012), http://www.theatlantic.com/video/archive/2012/01/the-worlds-cutest-surveillance-robot-videographer/251370/; Paul Marks, Robot Video Journalist Uses Cuteness to Get Vox Pops, NEW SCIENTIST (Dec. 28, 2011), http://www.newscientist.com/article/dn21318-robot-videojournalist-uses-cuteness-to-get-vox-pops.html#.VQb9j47F93E; see also Laura Sydell, SXSW Debuts Robot Petting Zoo For A Personal Peek Into The Future, NPR: ALL TECH CONSIDERED (Mar. 18, 2015), http://www.npr.org/blogs/alltechconsidered/2015/03/18/393614456/sxsw-debuts-robot-petting-zoo-for-a-personal-peek-into-the-future (“BlabDroid actually has some pretty sophisticated wiring inside, but with its cardboard shell with a smile cut into it, he looks like he was made in someone’s garage. [The robot’s creator] says that’s intentional. ‘In a relationship with a robot, where you’re being very vulnerable, the other actor in that situation has to be as vulnerable as you,’ he says. ‘So if the robot is small, tiny, made out of cardboard, you kind of feel like you can open up to him more because he’s very familiar and you feel like you’re in control of that situation.’”).
56. Notably, Boxie told people it was collecting information for MIT. The exact script was:
“Hi! My name’s Boxie, I’m from the Media Lab and I’m making a movie about MIT! This movie
will be featured on the Media Lab website. On the side of my head are buttons so you can give
me instructions. If you would like to be part of the movie, press the green button. If you want me
to go away, press the red button.” Boxie’s successor, BlabDroid, also asks for permission to film
and record with a similar button method.
transaction. Face-recognition technology permits easy re-identification.
Such meticulous, point-blank customer data could be of extraordinary use
in both loss prevention and marketing research.”57 Given this kind of utility, such features on robots of all kinds seem likely. Like the ubiquity of smartphones, we will be surrounded by mechanical watchers.58
While the FTC does not have a long history of regulating surveillance
technologies, over the past twenty years it has begun to develop a theory of
unfair and deceptive surveillance and information gathering. For example,
the FTC has charged a number of companies with deceptive trade practices
for creating a deceptively fake “software registration” page to obtain personal information from technology users.59 Because only some types of surveillance were disclosed to the user, the FTC asserted the companies acted deceptively when they failed to tell users the nature of the questions they were being asked via software.60
57. Calo, Robots and Privacy, supra note 9, at 190.
58. See Bruce Schneier, Cell Phone Spying, SCHNEIER ON SECURITY BLOG (May 9, 2008), https://www.schneier.com/blog/archives/2008/05/cell_phone_spyi_1.html; see also Bruce Schneier, Tracking People from Smartphone Accelerometers, SCHNEIER ON SECURITY BLOG (Apr. 30, 2014), https://www.schneier.com/blog/archives/2014/04/tracking_people_2.html.
59. A number of FTC actions have centered on the creation and use of fake registration spy-
ware software called “Detective Mode.” See, e.g., Complaint at 5, DesignerWare, LLC, F.T.C.
File No. 112 3151, No. C-4390 (F.T.C. Apr. 11, 2013), available at
http://www.ftc.gov/sites/default/files/documents/cases/2013/04/130415designerwarecmpt.pdf
(charging company that created and licensed “Detective Mode”). For examples of companies
charged with using Detective Mode to improperly gather personal information on users, see Com-
plaint at 2, Aspen Way Enters., Inc., F.T.C. File No. 112 3151, No. C-4392 (F.T.C. Apr. 11,
2013), available at
http://www.ftc.gov/sites/default/files/documents/cases/2013/04/130415aspenwaycmpt.pdf; Com-
plaint at 3, B. Stamper Enters., Inc., F.T.C. File No. 112 3151, No. C-4393 (F.T.C. Apr. 11, 2013),
available at
http://www.ftc.gov/sites/default/files/documents/cases/2013/04/130415bstampercmpt.pdf; Com-
plaint at 3, C.A.L.M. Ventures, Inc., F.T.C. File No. 112 3151, No. C-4394 (F.T.C. Apr. 11,
2013), available at
http://www.ftc.gov/sites/default/files/documents/cases/2013/04/130415calmcmpt.pdf; Complaint
at 3, J.A.G. Rents, LLC, F.T.C. File No. 112 3151, No. C-4395 (F.T.C. Apr. 11, 2013), available
at http://www.ftc.gov/sites/default/files/documents/cases/2013/04/130415jagcmpt.pdf; Complaint
at 3, Red Zone Inv. Grp., Inc., F.T.C. File No. 112 3151, No. C-4396 (F.T.C. Apr. 11, 2013),
available at
http://www.ftc.gov/sites/default/files/documents/cases/2013/04/130415redzonecmpt.pdf; Com-
plaint at 2, Watershed Dev. Corp., F.T.C. File No. 112 3151, No. C-4398 (F.T.C. Apr. 11, 2013),
available at
http://www.ftc.gov/sites/default/files/documents/cases/2013/04/130415watershedcmpt.pdf.
60. See, e.g., Complaint at 2, Epic Marketplace, Inc., F.T.C. File No. 112 3182, No. C-4389 (F.T.C. Mar. 13, 2013), available at http://www.ftc.gov/sites/default/files/documents/cases/2013/03/130315epicmarketplacecmpt.pdf (charging company for failing to disclose “history sniffing” practice). For an explanation of a deceptive omission, see Letter from James C. Miller III to Hon. John D. Dingell, supra note 16, app. at 175 n.4 (“A misleading omission occurs when qualifying information necessary to prevent a practice, claim, representation, or reasonable expectation or belief from being misleading is not disclosed. Not all omissions are deceptive, even if providing the information would benefit consumers.”).
In FTC v. Frostwire, LLC, the FTC alleged that a software company
deceived consumers through its user interface when it failed to notify con-
sumers adequately regarding how its file-sharing software operated, includ-
ing the fact that downloaded files were shared publicly by default as well as
the fact that the software “would publicly share files that consumers previously downloaded . . . and stored in ‘unshared’ folders even after consumers deselected the Share Finished Downloads setting in the Options-Sharing dialog box.”61
Should robot designers and users also be obligated to disclose to con-
sumers how their personal information is being collected? Or should users
simply always be aware that when they are interacting with a robot that
their personal information is fair game? Does it matter that robots like
Boxie are specifically designed to extract personal information through so-
cial engineering?
Surveillance is not the only problematic method of collecting personal
information. The FTC also views the act of “pretexting” to be a generally deceptive practice when used to obtain personal information. According to the FTC, pretexting involves making various misleading and false statements to financial institutions and others. Such tactics include “calling financial institutions and pretending to be the account holder, thereby inducing the financial institution to disclose private financial information,” and, upon obtaining this private information, selling it.62
Telepresence manipulation could be a form of pretexting. Would us-
ing a Wizard-of-Oz setup to obtain information be considered similar to
pretexting, given that the fundamental culpability in pretexting lies in the
fact that information was obtained by a person pretending to be someone
(or something) more likely to be entrusted with personal information?
C. Nudgebots
We humans are a persuadable bunch. Over the last half-century,
mounting evidence demonstrates that humans are subject to numerous bias-
61. Complaint for Permanent Injunction and Other Equitable Relief at 19, FTC v. Frostwire,
LLC, No. 1:11-cv-23643 (S.D. Fla. Oct. 12, 2011), available at
http://www.ftc.gov/sites/default/files/documents/cases/2011/10/111011frostwirecmpt.pdf.
62. Complaint for Injunction and Other Equitable Relief, FTC v. Rapp, No. 99-WM-783 (D.
Colo. Apr. 21, 1999), available at
http://www.ftc.gov/sites/default/files/documents/cases/1999/04/ftc.gov-touchtonecomplaint.htm.
es that motivate us to act in predictably irrational ways.63 Humans rely too heavily on available anecdotes and judgments reached by computers.64 We attribute human emotions and agency to machines.65 We care too much what others think about us and we increasingly entrench ourselves in opinions formed based on trivial, anecdotal, and arbitrary evidence.66
Even
worse, we consistently fall prey to these biases. This fact is well known
and regularly exploited.
Our vulnerability to manipulation combined with the technical and so-
cial power of robots could create more problems for consumers. One of the
most interesting questions is the extent to which robots will be allowed to
“nudge” humans. Cass Sunstein, who helped develop the concept of nudging, defines nudges as “liberty-preserving approaches that steer people in particular directions, but that also allow them to go their own way.”67
Nudging can be acceptable, if not inevitable, in many circumstances.
But it is not always clear at what point nudging turns to wrongful manipula-
tion. Ryan Calo has developed a theory of digital market manipulation that
pinpoints three problematic contexts where personal information is lever-
aged to manipulate consumers: the mass production of bias, disclosure
ratcheting, and means-based targeting.68
A theory of wrongful robotic ma-
nipulation of consumers could be useful. Consider the different techniques
my hypothetical robotic shopping assistant might use to encourage sales.
What if this robot was part of a Wizard-of-Oz setup? Should companies be
required to disclose their robots are not fully autonomous?
The FTC has a long history of regulating high-pressure sales tech-
niques and otherwise wrongful sales tactics. For example, the agency has
recently targeted negative-option marketing, in which sellers interpret a
customer’s failure to take an affirmative action, either to reject an offer or
cancel an agreement, as assent to be charged for goods or services.69 Neg-
63. See, e.g., DANIEL KAHNEMAN, THINKING, FAST AND SLOW (2013); DAN ARIELY, PREDICTABLY IRRATIONAL: THE HIDDEN FORCES THAT SHAPE OUR DECISIONS (2d ed. 2009); RICHARD THALER & CASS SUNSTEIN, NUDGE: IMPROVING DECISIONS ABOUT HEALTH, WEALTH, AND HAPPINESS (2d ed. 2009).
64. See KAHNEMAN, supra note 63; Danielle Keats Citron, Technological Due Process, 85 WASH. U. L. REV. 1249 (2007).
65. See Kate Darling, Extending Legal Rights to Social Robots, WE ROBOT CONFERENCE (2012), http://robots.law.miami.edu/wp-content/uploads/2012/03/Darling_Extending-Legal-Rights-to-Social-Robots.pdf.
66. KAHNEMAN, supra note 63; ARIELY, supra note 63; THALER & SUNSTEIN, supra note 63.
67. Cass Sunstein, Nudging: A Very Short Guide, 37 J. CONSUMER POL’Y 583, 583 (2014); see also THALER & SUNSTEIN, supra note 63, at 5.
68. Ryan Calo, Digital Market Manipulation, 82 GEO. WASH. L. REV. 995 (2014).
69. FTC, NEGATIVE OPTIONS: A REPORT BY THE STAFF OF THE FTC’S DIVISION OF ENFORCEMENT 2 (Jan. 2009), https://www.ftc.gov/sites/default/files/documents/reports/negative-options-federal-trade-commission-workshop-analyzing-negative-option-marketing-report-staff/p064202negativeoptionreport.pdf; see also FTC v. Willms, No. 2:11-cv-00828-MJP (W.D. Wash. Mar. 6, 2012) (stipulated final judgment and order), available at https://www.ftc.gov/sites/default/files/documents/cases/2011/09/110913jwillmspiorder.pdf; see also 16 C.F.R. § 425 (2014) (imposing requirements on negative option marketing).
ative option tactics take advantage of people’s noted bias for the status
quo.70 In the past, the FTC has categorized manipulative sales tactics as an unfair trade practice.71 In its statement on unfairness, the FTC articulated a few boundaries for manipulation, stating “certain types of sales techniques may prevent consumers from effectively making their own decisions, and that corrective action may then become necessary.”72 The FTC stated that these actions “are brought not to second-guess the wisdom of particular consumer decisions, but rather to halt some form of seller behavior that unreasonably creates or takes advantage of an obstacle to the free exercise of consumer decisionmaking.”73
The goal of the FTC in this space is to keep companies from hindering
free market decisions. Examples of wrongful tactics include withholding or failing to generate important price or performance information, for example, leaving buyers with insufficient information for informed comparisons. Some [sellers] may engage in overt coercion, as by dismantling a home appliance for “inspection” and refusing to reassemble it until a service contract is signed. And some may exercise undue influence over highly susceptible classes of purchasers, as by promoting fraudulent ‘cures’ to seriously ill cancer patients.74 According to the FTC, “Each of these practices undermines an essential precondition to a free and informed consumer transaction, and, in turn, to a well-functioning market. Each of them is therefore properly banned as an unfair practice under the FTC Act.”75
70. See, e.g., Cass R. Sunstein, Impersonal Default Rules vs. Active Choices vs. Personalized Default Rules: A Triptych 9 (May 19, 2013) (unpublished manuscript), available at http://ssrn.com/abstract_id=2171343 (“In the domain of privacy on the Internet, a great deal depends on the default rule.”).
71. See Holland Furnace Co. v. FTC, 295 F.2d 302 (7th Cir. 1961); cf. Arthur Murray Studio, Inc. v. FTC, 458 F.2d 622 (5th Cir. 1972) (discussing emotional high-pressure sales tactics, using teams of salesmen who refused to let the customer leave the room until a contract was signed); see also Statement of Basis and Purpose, Cooling-Off Period for Door-to-Door Sales, 37 Fed. Reg. 22,934, 22,937–38 (1972).
72. FTC Policy Statement on Unfairness, Letter from FTC Comm’rs to Wendell H. Ford & John C. Danforth, Senators (Dec. 17, 1980), reprinted in Int’l Harvester Co., 104 F.T.C. 949 app. at 1070–76 (1984), available at http://www.ftc.gov/bcp/policystmt/ad-unfair.htm (explaining evolution of, and rationale for, FTC’s consumer unfairness jurisdiction).
73. Id.
74. Id.
75. Id.
Robots, particularly embodied ones, are uniquely situated to mentally
manipulate people. Robots can mimic human socialization, yet they are
without shame, fatigue, or internal inconsistency. Robots are also scalable,
so the decision to design a robot to manipulate humans will impact hun-
dreds, if not thousands or millions of people.
Nudgebots are already at work in society. Tinder, the social dating
mobile app, has recently been flooded with bots posing as actual users at-
tempting to persuade users to download apps.76 The bots will pose as actual users by using typical Tinder conversational language such as “Hey :),” “What’re you doing?,” and “I’m still recovering from last night :) Relaxing with a game on my phone, castle cash. Have you heard of it?”77
If the user replies at all, the bot will send the user a link with a trustworthy-sounding address (www.tinderverified.com/) along with a message telling the user to “play with me a bit and you just might get a phone number.”78 Another sophisticated bot on Tinder tricks users into disclosing credit card numbers as an elaborate scheme to “verify” a webcam service under the guise of an invitation to engage in online foreplay.79
As if dating
was not complicated enough already.
Of all areas where the FTC might regulate robotics, nudgebots seem
the murkiest. While deception might be relatively easy to spot in some in-
stances, other equally harmful tactics, such as exploitation of emotional
attachment to make a sale, might be more difficult to spot and even harder
to articulate a consistent framework for regulating. Catfishing aside, all
people play roles when they are interacting with others.80
But at some point, it seems clear that our tendency to emotionally in-
vest in robots is a vulnerability worth regulatory attention. Kate Darling
has examined one possible approach: the law might protect robots.81
Among other reasons, Darling suggests we might want to protect robots
because of the effect robot harm has on humans. Darling has cataloged the
human tendency to form emotional bonds with robots and over-ascribe
them with agency, intelligence, emotion, and feeling. She noted:
76. Leo Kelion, Tinder Accounts Spammed by Bots Masquerading as Singles, BBC (Apr. 2,
2014), http://www.bbc.com/news/26850761.
77. Id.
78. Id.
79. Satnam Narang, Tinder: Spammers Flirt with Popular Mobile Dating App, SYMANTEC (July 1, 2013), http://www.symantec.com/connect/blogs/tinder-spammers-flirt-popular-mobile-dating-app?SID=skim38395X1020946X4058df191d7e8584f3eb6715dacc5ed7&API1=100&API2=7104284.
80. See, e.g., ERVING GOFFMAN, THE PRESENTATION OF SELF IN EVERYDAY LIFE (1959).
81. Darling, supra note 65.
[W]hen the United States military began testing a robot that defused landmines by stepping on them, the colonel in command called off the exercise. The robot was modeled after a stick insect with six legs. Every time it stepped on a mine, it lost one of its legs and continued on the remaining ones. According to Garreau (2007), "[t]he colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg. This test, he charged, was inhumane." Other autonomous robots employed within military teams evoke fondness and loyalty in their human teammates, who identify with the robots enough to name them, award them battlefield promotions and "purple hearts," introduce them to their families, and become very upset when they "die." While none of these robots are designed to give emotional cues, their autonomous behavior makes them appear lifelike enough to generate an emotional response. In fact, even simple household robots like the Roomba vacuum cleaner prompt people to talk to them and develop feelings of camaraderie and gratitude.82
Ryan Calo similarly noted, "There is an extensive literature to support the claim that people are 'hardwired' to react to anthropomorphic technology such as robots as though a person were actually present. The tendency is so strong that soldiers have reportedly risked their own lives to 'save' a military robot in the field."83
My family owns a Roomba vacuum cleaning robot. We named it "Rocco."84 Let's say I buy a future version of this useful technology from a less scrupulous robotics company than iRobot. Our new version of Rocco is anthropomorphized and outfitted with a cute face, voice, and personality. Assume the new Anthro-Rocco dutifully serves my family for years. It asks us how we're feeling and tells us jokes about how much its job "sucks." Over time our family becomes quite attached to Rocco. One day poor Rocco starts to sputter along as though sick. It looks up at me with its round, cute eyes and says, "Daddy . . . [cough] . . . if you don't buy me a new software upgrade . . . I'll die."
82. Id. at 5–6.
83. Ryan Calo, The Case for a Federal Robotics Commission, BROOKINGS (Sept. 2014), http://www.brookings.edu/research/reports2/2014/09/case-for-federal-robotics-commission (citing P.W. SINGER, WIRED FOR WAR: THE ROBOTICS REVOLUTION AND CONFLICT IN THE TWENTY-FIRST CENTURY 337–43 (2009)).
84. People commonly name their Roombas. See, e.g., Jonathan Coffrey, What Happens when Roomba Meets Me (April 24, 2009), https://www.flickr.com/photos/decaf/3472018290/ (posting a picture of a mustached and "googly"-eyed Roomba with the assertion, "First of all, ask anybody with a Roomba, it has a name (meet Scruffy, the Janitor)").
I hope I’ll be able to resist this super-charged Tamagotchi’s under-
handed sales technique.
85
But will all consumers be able to resist Rocco’s
charm? How might robots like these affect the elderly, for whom robots
have great potential as companions?
86
Or what about children, who have
difficulty parsing complex emotional attachments and understanding how
robots work? Research demonstrates that children can think of a robot as a
social being and a friend.
87
Children tell robots secrets that they do not
trust with adults.
88
Of course, children also tell secrets to stuffed animals,
but mere stuffed animals cannot be programed to extract information or
fake emotional bonds via a Wizard-of-Oz setup.
The FTC is acutely sensitive to manipulation of vulnerable popula-
tions. Entire regimes, such as the Children’s Online Privacy Protection Act,
are designed to protect children, who are generally less aware of risk and
therefore less able to avoid it.
89
The elderly are particularly vulnerable to
fraud and are common scam targets and victims.
90
There is little reason to
think this will change with the mass adoption of consumer robotics. Thus
the FTC should begin to address how its existing jurisprudence on decep-
tion and manipulation will apply to nudgebots.
D. Autobots
Whether algorithms and software, by themselves, are properly classifi-
able as “robots” is debatable.
91
However, algorithms enable automation,
which is a signature trait of robotics. Thus, they are worthy of considera-
tion along with other consumer robotics issues. Frank Pasquale has noted the incredible power of algorithms, stating, "Decisions that used to be based
85. Tamagotchi, WIKIPEDIA, http://en.wikipedia.org/wiki/Tamagotchi (last visited Mar. 18, 2015). Ryan Calo has proposed a similar hypothetical. Ryan Calo, Could Jibo Developer Cynthia Breazeal Be The Steve Wozniak Of Robots?, FORBES (Aug. 17, 2014), http://www.forbes.com/sites/ryancalo/2014/07/17/could-cynthia-breazeal-prove-the-steve-wozniak-of-robots/.
86. A Robotics Companion for the Elderly?, GE IDEA LAB (Aug. 13, 2014), http://www.ideaslaboratory.com/post/94619189589/a-robotic-companion-for-the-elderly. But see Amanda Sharkey & Noel Sharkey, Granny and the Robots: Ethical Issues in Robot Care for the Elderly, 14 ETHICS & INFORMATION TECH. 27 (2012), http://link.springer.com/article/10.1007/s10676-010-9234-6/fulltext.html.
87. Westlund & Breazeal, supra note 3.
88. Id.
89. Children's Online Privacy Protection Rule, FTC, https://www.ftc.gov/enforcement/rules/rulemaking-regulatory-reform-proceedings/childrens-online-privacy-protection-rule (last accessed Mar. 19, 2015).
90. FTC Testifies on Fraud Against Older Americans, FTC (May 16, 2013), https://www.ftc.gov/news-events/press-releases/2013/05/ftc-testifies-fraud-against-older-americans.
91. See, e.g., Richards & Smart, supra note 4.
on human reflection are now made automatically. Software encodes thousands of rules and instructions computed in a fraction of a second."92
Algo-
rithms are the instructions that dictate how a robot will operate. Thus, they
are consequential and present consumer protection issues.
Pasquale notes that algorithms are endemic in reputation, search, and finance, yet they are shrouded in secrecy.93 According to Pasquale, "The values and prerogatives that the encoded rules enact are hidden within black boxes. The most obvious question is: Are these algorithmic applications fair?"94 Pasquale and Danielle Citron have warned of a "scored society," where much of people's lives and reputations are quantified and ranked.95 Solon Barocas and Andrew Selbst have noted the potential for algorithms and big data to have a disparate impact on vulnerable and minority populations.96
David Vladeck has argued that society will need to consider whether existing liability rules "will be up to the task of assigning responsibility for any wrongful acts [fully autonomous robots] commit."97 According to Vladeck, "The first generation of fully autonomous machines--perhaps driverless cars and fully independent drone aircraft--will have the capacity to act completely autonomously. They will not be tools used by humans; they will be machines deployed by humans that will act independently of direct human instruction, based on information the machine itself acquires and analyzes, and will often make highly consequential decisions in circumstances that may not be anticipated by, let alone directly addressed by, the machine's creators."98
Vladeck argued that the key question for autonomous thinking machines is whether it is fair to think of them as agents of some other individual or entity, or whether the legal system will need to decide liability issues on a basis other than agency.99 Vladeck proposed several possible direct, indirect, and shared liability answers to this question, including strict
92. FRANK PASQUALE, BLACK BOX SOCIETY: THE SECRET ALGORITHMS THAT CONTROL
MONEY AND INFORMATION 8 (2015).
93. Id.
94. Id. at 8–9.
95. Danielle Keats Citron & Frank Pasquale, The Scored Society: Due Process for Automated Predictions, 89 WASH. L. REV. 1 (2014).
96. Solon Barocas & Andrew D. Selbst, Big Data's Disparate Impact, 104 CALIF. L. REV. (forthcoming 2016), http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2477899.
97. David C. Vladeck, Machines Without Principals: Liability Rules and Artificial Intelligence, 89 WASH. L. REV. 117, 121 (2014).
98. Id.
99. Id. at 122.
and "common enterprise" liability, or even the possibility of suing the robot itself under a theory of "conferred personhood."100
This Article will not engage the plentiful literature on the consumer
benefits and problems created by algorithms and the automation of ro-
bots.
101
Many issues involving algorithms are related to broader public policies and issues of social justice, which are harder to address solely through Section 5 of the FTC Act. It is enough to note that algorithms and automation now present consumer protection issues. The FTC has already started to take notice of algorithms in related contexts, such as privacy. FTC Chief Technologist Ashkan Soltani has put algorithmic transparency on the agenda for his tenure at the agency, stating, "I hope to expand the agency's ability to measure big data's disparate effects in order to ensure that the algorithms that consumers interact with on a daily basis afford them the same rights online as they're entitled to offline."102
Machine learning issues aside, robots will do as they are told, so we
must be very careful with what we tell them.
103
Many of the issues present-
ed by algorithms will be part of a larger, problematic kind of robot. For
example, a nudgebot designed to exploit a person’s vulnerability is running
a malicious algorithm. Yet algorithms might also be worth consideration
on their own merit, particularly with respect to possible remedies.
As will be discussed below, the FTC has several tools, including disclosures and design requirements, that could ameliorate the harms from secret algorithms. Can algorithms be so complex that meaningful transparency is impossible? Is it enough to modify only algorithms if the rest of a robot's design, such as its external features and physical manipulation capabilities, remains capable of harm? Does it matter if robots can engage in machine learning as a form of artificial intelligence? What is the culpability of humans operating robots if they do not understand the content or effect of a robot's algorithms?
E. Cyborgs
The final consumer robotics concern that the FTC will need to address
is when people actually become robots, at least partially. Becoming a true
cyborg in the classic sci-fi sense may still be some time away. Yet in many
100. Id. at 145–50.
101. See, e.g., NICK CARR, THE GLASS CAGE: AUTOMATION AND US (2014).
102. Ashkan Soltani, Hello World!, FTC (Dec. 2, 2014), https://www.ftc.gov/news-events/blogs/techftc/2014/12/hello-world.
103. With apologies to Kurt Vonnegut, Jr. Cf. KURT VONNEGUT, JR., MOTHER NIGHT vi (Dial Press Trade Paperback Edition 2009) (1961) ("We are what we pretend to be, so we must be careful about what we pretend to be.").
ways the science and reality of man-machine hybrids are closer than you might think.104 Exoskeletons hold the promise of restored mobility. Some have argued mobile phones are so integral to our person that we should already consider ourselves cyborgs.105
But one of the main immediate issues of concern involves the physical implanting of technology into people's bodies. Benjamin Wittes and Jane Chong write, "whether or not a technology can be considered medically superficial in function, once we incorporate it into the body such that it is no longer easily removed, it is integral to the person in fact. A number of bars, strip clubs and casinos have banned the use of Google Glass based on privacy protection concerns, and movie theaters have banned it for reasons related to copyright protection. But such bans could pose problems when the equivalent of Google Glass is physically screwed into an individual's head."106
Implantables are not a fantasy. Neil Harbisson, a cyborg activist, had an "eyeborg" device installed in his head that allows him to "hear" color.107 After robots, the next "Internet of Things" will likely be the "Internet of Things Inside Our Body."108 RFID tags are currently implantable and raise considerable ethical and legal issues, including privacy and autonomy, the limits of implanted software licensing, and health and safety.109 This is to say nothing of the promise of, and the problems associated with, nanotechnology.110
Perhaps of most immediate concern to the FTC is the security of data
on implantable devices.
111
Wittes and Chong write:
As it turns out, the state of the law with respect to pacemakers
and other implanted medical devices provides a particularly vivid
illustration of a cyborg gap. Most pacemakers and defibrillators
are outfitted with wireless capabilities that communicate with
home transmitters that then send the data to the patient’s physi-
cian. Experts have demonstrated the existence of enormous vul-
nerabilities in these software-controlled, Internet-connected med-
ical devices, but the government has failed to adopt or enforce
regulations to protect patients against hacking attempts. To date
104. Wittes & Chong, supra note 11.
105. Id.
106. Id. (footnotes omitted).
107. Id.
108. Kerr, supra note 11.
109. Id.
110. See, e.g., Gregory Mandel, Nanotechnology Governance, 59 ALA. L. REV. 1323 (2008).
111. See, e.g., H@cking Implantable Medical Devices, INFOSEC INSTITUTE (Apr. 28, 2014), http://resources.infosecinstitute.com/hcking-implantable-medical-devices/.
there have been no reports of such hacking—but then again, it
would be extremely difficult to detect this type of foul play. The
threat is sufficiently viable that former Vice President Dick
Cheney’s doctor ordered the disabling of his heart implant’s wire-
less capability, apparently to prevent a hacking attempt, while
Cheney was in office.
112
As will be discussed below, the FTC has taken the lead in data security reg-
ulatory efforts in the United States. The FTC and the Food and Drug Administration ("FDA") will likely take the lead in regulating data security and cybersecurity for implantable devices.113
The FTC has mandated that notice be given by devices capable of physically harming consumers.114 For example, in In re Consumer Direct, the FTC charged that a producer of "exercise" equipment called the "Amazing Gut Buster" failed to adequately warn consumers that "when performed as directed the Gut Buster exercises pose a risk of injury to users from snapping or breakage of the product's spring or other parts."115 This failure to warn consumers was alleged to be an unfair trade practice by the FTC.116
Thus, the FTC has the power to mandate notice and reasonable data
security for implantable robotic-like devices. As will be discussed below,
these are just two of the tools that enable the FTC to regulate the growing
field of consumer robotics.
***
The issues described above are just a small sample of the potential
complications that will arise as consumers embrace robots. In the next Part,
I will describe how the FTC has the ability to competently regulate con-
sumer robotics.
112. Wittes & Chong, supra note 11 (footnotes omitted).
113. See FDA, CYBERSECURITY FOR NETWORKED MEDICAL DEVICES CONTAINING OFF-THE-SHELF (OTS) SOFTWARE (Jan. 14, 2005), available at http://www.fda.gov/downloads/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/ucm077823.pdf; FDA, CONTENT OF PREMARKET SUBMISSIONS FOR MANAGEMENT OF CYBERSECURITY IN MEDICAL DEVICES (Oct. 2, 2014), available at http://www.fda.gov/downloads/MedicalDevices/DeviceRegulationandGuidance/GuidanceDocuments/UCM356190.pdf; Cybersecurity, FDA, http://www.fda.gov/MedicalDevices/ProductsandMedicalProcedures/ConnectedHealth/ucm373213.htm (last visited Nov. 21, 2014).
114. See, e.g., Consumer Direct, Inc., et al., 113 F.T.C. 923 (1990), https://www.ftc.gov/sites/default/files/documents/commission_decision_volumes/volume-113/volume113_923-1015.pdf.
115. Id. at 925–26. The opinion includes a reproduction of the advertisement. Id. at 934.
116. Id. at 926.
II. THE FTC HAS THE ABILITY TO ADDRESS CONSUMER ROBOTICS
There seem to be three crucial concepts for safe manufacture and use
of consumer robots: communications, design, and organizational procedure.
Companies must accurately communicate to consumers the efficacy of ro-
bots as well as any costs and risks of use. Companies should also use rea-
sonable care when designing robots and avoid culpably providing the
means and instrumentalities for wrongful or harmful conduct. Finally, com-
panies that make robots should implement organizational procedures such
as administrative safeguards and training to keep robots and the data they
collect secure and private. The FTC has proven to be a competent regulator
in these areas.
The FTC’s existing framework for protecting consumers from fraud,
data breaches, privacy harms, and exploitation is robust enough to ade-
quately protect consumers and clear enough to notify commercial entities of
their obligations when designing, selling, and using robots that interact with
consumers. Notably, the FTC is enabled by broad regulatory authority and
a diverse set of tools to respond to problems.
A. Broad Regulatory Authority
The FTC has a very interesting history.117 Originally created to combat harmful monopolies, the FTC gained authority under the Wheeler-Lea Amendments to Section 5 of the FTC Act to prevent "unfair or deceptive trade practices" in addition to "unfair methods of competition."118 This is a very broad charge for Congress to delegate to an administrative agency. Any material representation, omission, or practice that is likely to mislead a reasonable consumer is actionable.119 The FTC's unfairness authority is similarly far-reaching.
According to the FTC, "The present understanding of the unfairness standard is the result of an evolutionary process. The statute was deliberately framed in general terms since Congress recognized the impossibility of drafting a complete list of unfair trade practices that would not quickly
117. See, e.g., Our History, FTC, https://www.ftc.gov/about-ftc/our-history (last accessed Mar. 19, 2015); see also CHRIS HOOFNAGLE, FEDERAL TRADE COMMISSION PRIVACY LAW AND POLICY (forthcoming 2016), https://hoofnagle.berkeley.edu/ftcprivacy/; GERALD C. HENDERSON, THE FEDERAL TRADE COMMISSION: A STUDY IN ADMINISTRATIVE LAW AND PROCEDURE (1924); Huston Thompson, Highlights in the Evolution of the Federal Trade Commission, 8 GEO. WASH. L. REV. 257 (1939); Eugene R. Baker & Daniel J. Baum, Section 5 of the Federal Trade Commission Act: A Continuing Process of Redefinition, 7 VILL. L. REV. 517 (1962).
118. Federal Trade Commission Act, Pub. L. No. 75-447, § 3, 52 Stat. 111 (1938).
119. See FTC Statement on Deception, Appended to Cliffdale Associates, Inc., 103 F.T.C.
110, 174 (1984).
become outdated or leave loopholes for easy evasion."120 Notably, the FTC can find a practice unfair even when it is otherwise legally permissible.121
Regarding the meaning of unfairness, the House Conference Report stated: "It is impossible to frame definitions to embrace all unfair practices. There is no limit to human inventiveness in this field. Even if all known unfair practices were specifically defined and prohibited, it would be at once necessary to begin over again. If Congress were to adopt the method of definition, it would undertake an endless task."122
In short, it is the FTC, ultimately subject to judicial review, that
has been tasked with identifying unfair trade practices.
In its statement on unfairness, the FTC cited the Supreme Court's explicit recognition that unfairness should evolve over time rather than be fixed by ex ante prescription.123 The Court stated that the term unfairness "belongs to that class of phrases which do not admit of precise definition, but the meaning and application of which must be arrived at by what this court elsewhere has called 'the gradual process of judicial inclusion and exclusion.'"124
This broad scope is ideal for a regulatory agency in charge of responding to challenges posed by new technologies. Chris Hoofnagle observed, "[With Section 5], Congress chose a broad, vaguely-defined mandate to address consumer protection. The value of this vagueness comes in the FTC's flexibility to address new problems."125 For example, Hoofnagle noted that for the first thirty years of the FTC, the agency was focused on print advertising. With the rise of radio advertising, the agency was able to pivot and investigate false claims on the airwaves without needing Congress to enact a new law.126 The same was true for television, as the FTC again recalibrated its understanding of how technology can be used to deceive or harm consumers.127
The same will be true for robots. As the FTC’s foray into the “In-
ternet of Things" makes clear, the FTC does not need a new authorization
120. FTC Policy Statement on Unfairness, Appended to International Harvester Co., 104
F.T.C. 949, 1070 (1984); see 15 U.S.C. § 45(n) (2012).
121. Spiegel v. FTC, 540 F.2d 287, 292 (1976) (citing FTC v. Sperry & Hutchinson Co., 405 U.S. 233 (1972)) ("[T]he Supreme Court left no doubt that the FTC had the authority to prohibit conduct that, although legally proper, was unfair to the public.").
122. Sperry, 405 U.S. at 240 (quoting from House Conference Report No. 1142, 63d Cong., 2d Sess., 19 (1914)).
123. Id. (citing FTC v. Raladam Co., 283 U.S. 643, 648 (1931)); see also FTC v. R.F. Keppel & Bro., 291 U.S. 304, 310 (1934) ("Neither the language nor the history of the Act suggests that Congress intended to confine the forbidden methods to fixed and unyielding categories.").
124. Raladam, 283 U.S. at 648 (quoting Davidson v. New Orleans, 96 U.S. 97, 104 (1878)).
125. HOOFNAGLE, supra note 117, at 30.
126. Id.
127. Id.
of power to tackle a new technology. It is sufficient if a company uses a
new technology in commerce to harm or mislead consumers.
Additionally, the FTC can regulate consumer harms that fall outside
the scope of traditional torts and other regulatory efforts. Although the
linchpin of unfairness is harm, the FTC has been careful to limit the kinds
of harm necessary to establish a practice as unfair. The harm must be sub-
stantial.
128
The most dominant kind of substantial harm asserted by the FTC has been monetary.129 Relevant to our hypothetical robot's underhanded upsell, the FTC, in its statement on unfairness, listed as an example of monetary harm instances in which sellers coerce consumers into purchasing unwanted goods. The FTC has also stated that "[U]nwarranted health and safety risks may also support a finding of unfairness," citing a case where a company distributed free-sample razor blades in a way easily obtainable by small children.130 Thus, certain nudgebots, algorithms, products for cyborgs, and other poorly designed robots may also be unfair due to health and safety risks.
However, many manipulative tactics by robots might not relate to health, safety, or finances. For years the accepted wisdom was that "[e]motional impact and other more subjective types of harm, on the other hand, will not ordinarily make a practice unfair."131 Yet notions of unfairness harm have been steadily evolving over the past twenty years.132 In a remarkable footnote in the Wyndham opinion, a case challenging the FTC's authority to regulate data security, Judge Salas noted the dispute over whether non-monetary injuries are cognizable under Section 5. She seemed open to recognizing non-monetary harm, stating, "the Court is not convinced that non-monetary harm is, as a matter of law, unsustainable under Section 5 of the FTC Act."133
128. FTC Policy Statement on Unfairness, Appended to International Harvester Co., 104
F.T.C. 949, 1070 (1984) (“First of all, the injury must be substantial. The Commission is not
concerned with trivial or merely speculative harms.”); see 15 U.S.C. § 45(n) (2012).
129. FTC Policy Statement on Unfairness, 104 F.T.C. at 1073.
130. Id. at 1073 n.15 (citing Philip Morris, Inc., 82 F.T.C. 16 (1973)) (“Of course, if matters
involving health and safety are within the primary jurisdiction of some other agency, Commission
action might not be appropriate.”).
131. Id. at 1073 (“Thus, for example, the Commission will not seek to ban an advertisement
merely because it offends the tastes or social beliefs of some viewers, as has been suggested in
some of the comments.”).
132. Daniel J. Solove & Woodrow Hartzog, The FTC and the New Common Law of Privacy, 114 COLUM. L. REV. 583 (2014).
133. FTC v. Wyndham Worldwide Corp., 10 F. Supp. 3d 602, 623 n.15 (D.N.J. 2014). Ulti-
mately, Judge Salas concluded that “the Court need not reach this issue given the substantial anal-
ysis of the substantial harm element above.” Id.
If non-monetary harm were to be recognized, it is possible that the
FTC could include emotional harms related to our dependence on and emo-
tional vulnerability to robots and possibly even transference issues, particu-
larly with respect to small children. Even if these harms are incremental for
one individual, if they are collectively a problem they might still be action-
able. The FTC has clarified that "An injury may be sufficiently substantial, however, if it does a small harm to a large number of people . . . ."134
The FTC’s broad authority would be particularly useful given that
these are still early days for consumer robotics. In supporting his claim that
robots warrant exceptional legal treatment, Ryan Calo observed, "Robots display increasingly emergent behavior, permitting the technology to accomplish both useful and unfortunate tasks in unexpected ways."135
It is
difficult to predict the many different issues that might arise when robots
are adopted by consumers. While many existing laws might cover emer-
gent issues, other problems might fall through the cracks. The breadth of
Section 5 allows it to serve as a safety net to nimbly respond to unanticipat-
ed problems.
There are limits to the FTC’s authority. The agency does not have au-
thority over non-profit organizations and common carriers. It cannot regu-
late consumers who harm other consumers in a non-commercial context.
As mentioned, its authority to regulate data security is being challenged in
court.
136
Notwithstanding these limitations, the FTC has enough authority
to competently address most cognizable consumer harms from robots.
B. Diverse and Effective Toolkit
In addition to having a general grant of authority broad enough to reg-
ulate consumer robotics, the FTC has developed several specific bodies of
jurisprudence that it can rely upon to address established and novel harms
related to consumer robotics. The FTC has an established record of regulating when and how a company must disclose information to avoid deception and protect consumers from harm. It has also recently developed secondary liability and means and instrumentality theories for unfair and deceptive technological design and organizational policies.
134. FTC Policy Statement on Unfairness, 104 F.T.C. at 1070, n.12; see 15 U.S.C. § 45(n)
(2012).
135. Ryan Calo, Robotics and the Lessons of Cyberlaw, 103 CALIF. L. REV. (forthcoming 2015).
136. Woodrow Hartzog & Daniel Solove, The Scope and Potential of FTC Data Protection, 83 GEO. WASH. L. REV. (forthcoming 2015); see also Wyndham, 10 F. Supp. 3d at 607; Order Denying Respondent LabMD's Motion to Dismiss, LabMD, Inc., No. 9357 (Jan. 16, 2014), available at http://www.ftc.gov/sites/default/files/documents/cases/140117labmdorder.pdf.
1. Disclosures
One of the most effective tools the FTC has is the power to regulate
company disclosures in advertisements and other statements made in com-
merce. Because robots are relatively new, consumer expectations are not
established. There are many things a robot might be capable or incapable
of that must be disclosed to consumers to avoid deception. The FTC’s dis-
closure jurisprudence is thus an ideal starting point for its entry into con-
sumer robotics.
The FTC’s mandated notice jurisprudence is robust and established.
Generally disclosures are required whenever they are necessary to prevent a
communication or trade practice from being deceptive.
137
Disclosures must
be clear and conspicuous.
138
The agency has detailed specific rules regard-
ing what constitutes effective notice.
139
For traditional advertising, the four
major factors that constitute adequate notice for the FTC are:
1. Prominence: Is the disclosure big enough for consumers to no-
tice and read?
2. Presentation: Is the wording and format easy for consumers to
understand?
3. Placement: Is the disclosure where consumers will look?
4. Proximity: Is the disclosure close to the claim it qualifies?
140
The FTC also looks to repetition, the use of multiple media for com-
munications, and whether there were distracting factors that might diminish
the effectiveness of a disclosure, particularly online.
141
The FTC has also
developed nuanced theories regarding deception by omission, use of scien-
137. FTC, .COM DISCLOSURES: HOW TO MAKE EFFECTIVE DISCLOSURES IN DIGITAL ADVERTISING i (March 2013), https://www.ftc.gov/sites/default/files/attachments/press-releases/ftc-staff-revises-online-advertising-disclosure-guidelines/130312dotcomdisclosures.pdf.
138. See, e.g., 16 C.F.R. § 14.9 (2014) ("clear and conspicuous" disclosure must be made in the language of the target audience); Donaldson v. Read Magazine, Inc., 333 U.S. 178 (1948); FTC, supra note 137, at i.
139. FTC, supra note 137, at i–iii.
140. Id. at 7; see also Donaldson v. Read Magazine, Inc., 333 U.S. 178 (1948); Decision and Order, BUY.COM, Inc., FTC No. 992 3282, C-3978 (Sept. 8, 2000), available at https://www.ftc.gov/sites/default/files/documents/cases/2000/09/buydotcom.do.pdf; Consent Order, Hewlett-Packard Co., File No. 002-3220 (April 3, 2001), https://www.ftc.gov/sites/default/files/documents/cases/2001/04/hpagr.htm; Consent Order, Microsoft Corp., File No. 002-3331 (April 3, 2001), https://www.ftc.gov/sites/default/files/documents/cases/2001/04/msagr.htm; Consent Order, Häagen-Dazs Co., 119 F.T.C. 762 (1995).
141. FTC, supra note 137, at 19; 16 C.F.R. § 239.2(a) (mandating disclosure “simultaneously
with or immediately following the warranty claim” in the audio portion or “on the screen for at
least five seconds” in the video portion).
tific data and endorsements.
142
The FTC is also not bound by the fine print,
which will keep harmful terms that nobody reads from being dispositive.
143
Disclosures regarding robots present both substantive and procedural
disclosure issues. First, given that people have a tendency to treat robots as
social agents, must additional disclosures be made beyond typical contexts
involving physical safety, endorsements, and product efficacy? Recall An-
thro-Rocco, the friendly family vacuum cleaner. If indeed Rocco is pro-
grammed to upsell me by preying on my emotional bond with it, must the
maker actively disclose the fact that Rocco is designed to form emotional
attachment? Should robot sales assistants disclose the fact that their cute-
ness is a tool for information extraction?
If so, why? Are people’s relationships and our resulting vulnerability
with robots sufficiently unique to justify this sort of exceptionalism? If not,
does this mean that there is no limit to the extent to which companies can
leverage human emotions and agency towards robots behind the curtains?
The second disclosure issue presented by robots concerns how notice
is given. Given that robots themselves are capable of marketing and mak-
ing the FTC’s required disclosures and that people’s communication with
robots can be reciprocal, should the rules regarding the four P’s of disclo-
sure (prominence, presentation, placement, proximity) reflect the fact that
the robot will often be in the best position to make a "just in time" disclosure?
When the consumer good is also the advertising medium, it is not al-
ways clear when routine communication also constitutes an advertisement.
Since a robot can be programmed to sense context and make disclosures
during its use and not just at the purchase point, it is possible the FTC will
have different rules for such experiential, automated products. In fact, the
FTC might eventually issue new guidance for robot disclosures as it did
with disclosures on the Internet.
144
New disclosure rules for robots would be an ideal opportunity to re-
think modern notice requirements. Existing notice and choice regimes have
been asked to do more than they are capable of. But new technologies open
up the opportunity for innovative new forms of notice. Ryan Calo has proposed a policy shift towards "visceral notice," that is, "[leveraging] a con-
142. See TUSHNET & GOLDMAN, supra note 18.
143. See, e.g., Decision and Order, BUY.COM, Inc., supra note 140; Consent Order, Microsoft Corp., supra note 140; TUSHNET & GOLDMAN, supra note 18, at 369 ("Small print, by itself or combined with other features such as color, contrast and placement, is almost always deemed ineffective because consumers are unlikely to wade through a long paragraph of fine print in order to find significant information.").
144. FTC, supra note 137.
sumer’s very experience of a product or service to warn or inform.
145
Might robots provide new opportunities for such kinds of notice? In addi-
tion to warning consumers through their speech, robots can warn consumers
through design signals like a bright red light as well as physical action such
as waving hands or holding up a palm to signal “stop.”
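To make the design point concrete, below is a minimal sketch, in Python, of how such a "just in time" visceral disclosure might be wired into a robot's control software. The platform interface, event names, and signal values are hypothetical illustrations invented for this example, not any actual product's or agency's specification:

    # Hypothetical sketch: pair each sensitive action with a clear,
    # conspicuous, just-in-time notice before the action proceeds.
    from dataclasses import dataclass

    @dataclass
    class Disclosure:
        message: str   # what the robot says aloud
        light: str     # visual signal, e.g., "solid_red"
        gesture: str   # physical signal, e.g., "raise_palm", or "none"

    # Hypothetical mapping from sensitive events to required notices.
    DISCLOSURES = {
        "camera_on": Disclosure("I am now recording video.", "solid_red", "raise_palm"),
        "upsell_offer": Disclosure("The following is a paid offer from my maker.", "blink_yellow", "none"),
    }

    class NoticeGivingRobot:
        """Wraps an assumed robot platform exposing speak(), set_light(),
        gesture(), and do(), so every sensitive action is preceded by notice."""

        def __init__(self, platform):
            self.platform = platform

        def perform(self, event):
            notice = DISCLOSURES.get(event)
            if notice:  # give the visceral notice at the moment it matters
                self.platform.set_light(notice.light)
                self.platform.speak(notice.message)
                if notice.gesture != "none":
                    self.platform.gesture(notice.gesture)
            self.platform.do(event)

The point of the sketch is placement and proximity: the disclosure is generated by the robot itself at the moment of the qualifying conduct, rather than buried at the point of purchase.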
The FTC’s mandated disclosure framework is general enough to be
applied to consumer robots. And the FTC has the ability to refine and ar-
ticulate technology-specific disclosure rules if necessary. This makes its
disclosure jurisprudence the best place to begin addressing consumer robot-
ics.
2. Design and Secondary Liability
One of the FTC’s most promising recent approaches to data protection
is its embrace of design-based solutions, defined broadly as attempts to cre-
ate or modify a technology, architecture, or organizational structure or pro-
cedure ex ante as an attempt to reduce the likelihood of a harm.
Design-based solutions are prospective and implicitly embrace a prob-
abilistic notion of protection. That is, in most circumstances, they make consumer harms less likely, but not impossible. Design-based solutions
are also indirect in that they affect environments and procedures rather than
directly prohibiting certain kinds of conduct. Often, the goal of design is to
raise the transactional costs of a harmful activity so high that most potential
third party bad actors simply won’t succeed or even bother. Other times
design is used to reduce the odds that consumers will harm themselves.
Data security is itself one of the most established design-based protection strategies. By anonymizing information and creating protocols that keep information hard for hackers to find, access, or use, companies can ensure that most attackers will neither attempt nor succeed in accessing well-protected data. As any data security professional will likely attest, no data security is perfect, but it can be good enough to justify confidence that certain data sets will probably remain secure against all but the most sophisticated and motivated attackers.
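As one small illustration of this kind of design-based safeguard, consider pseudonymizing stored identifiers with a keyed hash so that a stolen data set cannot easily be linked back to individuals without a separately guarded key. The following Python sketch shows a single common technique under stated assumptions; it is not a complete security program, and the key shown is a placeholder:

    # Minimal sketch: keyed-hash pseudonymization of user identifiers.
    # A real program would add key management, access controls, and the
    # other administrative and technical safeguards discussed in the text.
    import hashlib
    import hmac

    SECRET_KEY = b"replace-with-a-securely-stored-key"  # illustrative placeholder

    def pseudonymize(user_id: str) -> str:
        """Return a stable pseudonym; without SECRET_KEY, the output cannot
        feasibly be linked back to the original identifier."""
        return hmac.new(SECRET_KEY, user_id.encode("utf-8"), hashlib.sha256).hexdigest()

    # Stored records carry the pseudonym rather than the raw identity.
    record = {"user": pseudonymize("alice@example.com"), "cleaning_log": "..."}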
Design-based protections such as handrails on stairs and fencing on
balconies keep us from slipping and falling. So too can software design
encourage or discourage irresponsible information sharing. Ultimately, eve-
ryone is limited and guided by the affordances of their environment.
The FTC has begun to embrace design as a regulatory focus. Many privacy-related unfairness complaints by the FTC are attempts to discourage
145. M. Ryan Calo, Against Notice Skepticism in Privacy (And Elsewhere), supra note 9, at
1030.
certain types of design. For example, the FTC has alleged the design of websites and software to be unfair. In Sony BMG, digital rights management ("DRM") software was installed on consumers' computers in such a way that consumers were unable to find or remove the software through reasonable effort.146 If consumers attempted to remove the software, it would render their CD-ROM drive inoperable. The FTC deemed the software design to be unfair.147 The FTC recently brought a complaint against the maker of a flashlight mobile app alleging that the company deceived consumers by presenting them with an option to not share their information, even though the information was shared automatically, rendering the option meaningless.148
Related to design are the default settings for data sharing, as these
shape consumer behavior. In Frostwire, the FTC alleged that failure to no-
tify users that many pre-existing files on consumer computers would be
designated for public sharing constituted an unfair design.
149
Users who did
not wish to share a large number of files had to go through the burdensome
process of protecting the files one at a time by unchecking many pre-
checked boxes designating the files for sharing. The FTC noted that deceit-
ful or obstructionist default settings constitute an unfair design feature.
150
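The design lesson generalizes because defaults do much of the work: most consumers never change them. The following Python sketch contrasts the two models; it is an illustration of the principle, with invented names, not a rendering of Frostwire's actual software:

    # Hypothetical sketch: opt-out sharing (everything shared unless
    # unchecked, as alleged in Frostwire) versus privacy-protective
    # opt-in sharing (nothing shared unless affirmatively selected).
    from dataclasses import dataclass, field
    from typing import Set

    @dataclass
    class SharingSettings:
        opt_in: bool                          # True = privacy-protective default
        user_choices: Set[str] = field(default_factory=set)

        def is_shared(self, filename: str) -> bool:
            if self.opt_in:
                # Shared only if the user checked the box for this file.
                return filename in self.user_choices
            # Opt-out model: shared unless the user unchecked this file.
            return filename not in self.user_choices

    burdensome = SharingSettings(opt_in=False)  # pre-checked boxes
    protective = SharingSettings(opt_in=True)   # unchecked boxes
    assert burdensome.is_shared("tax_return.pdf")      # shared though never requested
    assert not protective.is_shared("tax_return.pdf")  # private until opted in

Under the opt-out default, a user with thousands of pre-existing files must uncheck thousands of boxes to avoid sharing; under the opt-in default, inaction shares nothing.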
Since the FTC has more flexibility regarding harm than tort law, the FTC is more capable of addressing small and nuanced changes in design that affect consumers. For example, if a robot's settings or design were deceptive or unfair because of a pre-checked default, the FTC could pressure companies to make a change that might not even be relevant three years from now, given the pace of change in consumer robotics.151
146. The term DRM is generally used to refer to technological measures that allow digital content owners to control how their content is used. See Julie E. Cohen, DRM and Privacy, 18 BERKELEY TECH. L.J. 575, 575 (2003) ("In an effort to control the proliferation of unauthorized copies, and to maximize profit from information goods distributed over the Internet, copyright owners and their technology partners are designing digital rights management (DRM) technologies that will allow more perfect control over access to and use of digital files.").
147. Decision and Order, Sony BMG Music Entm't, FTC File No. 062 3019, No. C-4195, at 6 (June 29, 2007), available at http://www.ftc.gov/sites/default/files/documents/cases/2007/06/0623019do070629.pdf.
148. Press Release, FTC, Android Flashlight App Developer Settles FTC Charges It Deceived
Consumers (Dec. 5, 2013), http://www.ftc.gov/news-events/press-releases/2013/12/android-
flashlight-app-developer-settles-ftc-charges-it-deceived.
149. Complaint for Permanent Injunction and Other Equitable Relief at 13, FTC v. Frostwire,
LLC, No. 11-cv-23643 (S.D. Fla. Oct. 12, 2011), available at
http://www.ftc.gov/os/caselist/1123041/111011frostwirecmpt.
150. Id. at 15–16, 19.
151. Consent Decree and Order, United States v. Path, Inc., No. 13-CV-00448 (N.D. Cal. Feb.
8, 2013), available at
http://www.ftc.gov/sites/default/files/documents/cases/2013/02/130201pathincdo.pdf.
The FTC has also embraced design, such as the features of a user in-
terface, when considering whether disclosures are adequate. In Path, the
FTC asserted that the customizable settings and interactive features of a
mobile app were deceptive.
152
In a non-privacy-related complaint, the FTC alleged that the design of Apple's mobile operating system's user interface resulted in unfair billing of in-app charges.153
The FTC’s theory of
design regulation would also logically apply to robots.
The FTC has also developed a theory of culpability for design choices
that indirectly harm consumers. Its secondary liability approach resembles
theories of contributory infringement and vicarious liability.
154
Facilitating
the wrongful conduct of another also triggers FTC condemnation. For ex-
ample, in DesignerWare, the FTC alleged that: “[b]y furnishing others with
the means to engage in the unfair practices . . . respondents have provided
the means and instrumentalities for the commission of unfair acts and prac-
tices and thus have caused or are likely to cause substantial injury to con-
sumers that cannot be reasonably avoided and is not outweighed by coun-
tervailing benefits to consumers or competition.”
155
In FTC v. Neovi, also known as the "Qchex" dispute, the FTC asserted
a theory of indirect liability against a company that created a check creation
and delivery website but failed, by design, to verify that customers were
rightfully drawing upon accounts they identified.
156
The FTC has also stat-
152. Id. at 8.
153. Complaint, Apple, Inc., FTC No. 112 3108 (March 27, 2014), available at
http://www.ftc.gov/sites/default/files/documents/cases/140115applecmpt.pdf.
154. Jay Dratler, Jr., Common-Sense (Federal) Common Law Adrift in A Statutory Sea, or Why Grokster Was A Unanimous Decision, 22 SANTA CLARA COMPUTER & HIGH TECH. L.J. 413, 434 (2006) ("[S]econdary liability in copyright is federal common law . . . ."); Metro-Goldwyn-Mayer Studios, Inc. v. Grokster, Ltd., 545 U.S. 913, 930 (2005) ("Although '[t]he Copyright Act does not expressly render anyone liable for infringement committed by another,' these doctrines of secondary liability emerged from common law principles and are well established in the law." (quoting Sony Corp. v. Universal City Studios, 464 U.S. 417, 434 (1984)) (citing id. at 486 (Blackmun, J., dissenting)); Kalem Co. v. Harper Brothers, 222 U.S. 55, 62–63 (1911); Gershwin Pub. Corp. v. Columbia Artists Management, 443 F.2d 1159, 1162 (2d Cir. 1971); 3 M. NIMMER & D. NIMMER, COPYRIGHT, § 12.04[A] (2005)); A & M Records, Inc. v. Napster, Inc., 239 F.3d 1004 (9th Cir. 2001).
155. DesignerWare, LLC, FTC File No. 112 3151, No. C-4390 (Apr. 11, 2013), available at http://www.ftc.gov/sites/default/files/documents/cases/2013/04/130415designerwarecmpt.pdf.
156. Complaint, FTC v. Neovi, Inc., No. 306-CV-01952-WQH-JMA (S.D. Cal. Sept. 19, 2006), available at https://www.ftc.gov/sites/default/files/documents/cases/2006/10/060919neovicmplt.pdf.
ed that providing the means and instrumentalities to install spyware and access customers' personal information was an unfair trade practice.157
The FTC has only occasionally pursued a claim of indirect liability
against companies. It is unlikely to pursue an action against a robotics
company under this theory save for extreme circumstances. Yet it is worth
noting that much of the discussion surrounding ethics and robotics has to do
with design choices.
158
Should home care robots be designed to record pri-
vate moments like going to the bathroom? Should robots be programmable
or controllable by anyone, or just owners? What kind of authentication and
verification protocols should robots have? Should robots be designed to be
"closed," in the sense that they have a set, dedicated function and run only proprietary software?159 Or can companies design robots to be "open" without incurring liability, in the sense that they have a non-dedicated use, nondiscriminatory software, and modular design?160
Questions like these reflect the fact that rules for the design of robots
can be just as consequential as rules for their ultimate use. The FTC is one
of the few agencies capable of addressing such design issues.
3. Organizational Procedures and Data Protection
Data security is one of the most crucial components for consumer ro-
botics. If consumers cannot trust robots and companies that make robots
with their personal information, the consumer robotics industry will never
get off the ground. Data security is a process companies must engage in, one involving the identification of assets and risks, data minimization, the implementation of administrative, technical, and physical safeguards, and the development of a data breach response plan.161
But, at base, it is a component nec-
essary to build consumer trust.
The FTC has established a robust data security jurisprudence, filing
over fifty data security complaints in the past fifteen years that obligate
companies collecting and storing personal information to provide reasona-
157. Complaint at 10–11, FTC v. CyberSpy Software, LLC and Trace R. Spence, No. 6:08-
CV-01872 (M.D. Fla. Nov. 5, 2008), available at
https://www.ftc.gov/sites/default/files/documents/cases/2008/11/081105cyberspycmplt.pdf.
158. See generally ROBOT ETHICS: THE ETHICAL AND SOCIAL IMPLICATIONS OF ROBOTICS 187, 194 (Patrick Lin, Keith Abney & George A. Bekey eds., 2012); Riek et al., supra note 14; Calo, Open Robotics, supra note 9; Aimee Van Wynsberghe, A Method for Integrating Ethics Into the Design of Robots, 40 INDUSTRIAL ROBOT 433 (2013); Aimee Van Wynsberghe, Designing Robots for Care: Care Centered Value-Sensitive Design, 19 SCIENCE AND ENGINEERING ETHICS 407 (2013).
159. See Calo, Open Robotics, supra note 9.
160. Id.
161. Commission Statement Marking the FTC’s 50th Data Security Settlement, FTC (January
31, 2014), http://www.ftc.gov/system/files/documents/cases/140131gmrstatement.pdf.
ble data security.162 These obligations are not limited to internet companies, as demonstrated by complaints against traditional retailers and, more relevantly, makers of devices for the "Internet of Things."163
In many ways, the FTC’s TRENDnet case, which was the agency’s
first Internet of Thingscomplaint, can be seen as a bridge between its
Internet-related complaints that have dominated its jurisprudence over the
past fifteen years and the eventual attention that must be given to consumer
robotics. At one level, this case simply involves deceptive promises of se-
curity and unreasonable data security design for internet-connected baby
monitors. These monitors were compromised to the shock and dismay of
sleeping toddlers and adults in the United States.164 Yet the complaint also signaled that new technologies must protect consumers in the same way established technologies do.
Privacy rules can also be conceptualized as a process. The FTC has recently embraced the concept of "privacy by design," broadly described by the agency as a baseline principle encouraging companies to promote consumer privacy throughout their organizations and at every stage of the development of their products and services.165 According to the FTC, "The concept of privacy by design includes limitations on data collection and retention, as well as reasonable security and data accuracy. By considering and addressing privacy at every stage of product and service development, companies can shift the burden away from consumers who would otherwise have to seek out privacy protective practices and technologies."166
The FTC has even required companies to implement privacy by design in its consent orders through a "comprehensive privacy program."167 These
programs require, among other things, the designation of an employee in
charge of the program, risk assessments, design and implementation of pri-
vacy controls, diligence in working with third party contractors, and regular
re-evaluation and adjustment of the program.
168
Processes like these could
162. Complaint, TRENDnet, FTC No. 122 3090, No. C-4426 (Feb. 7, 2014), available at
https://www.ftc.gov/system/files/documents/cases/140207trendnetcmpt.pdf.
163. See, e.g., Complaint, BJ's Wholesale Club, Inc., 140 F.T.C. 465, 468 (2005), available at
https://www.ftc.gov/sites/default/files/documents/cases/2005/09/092305comp0423160.pdf; Com-
plaint, supra note 162.
164. Complaint, supra note 162, at 5.
165. FTC, PROTECTING CONSUMER PRIVACY IN AN ERA OF RAPID CHANGE: RECOMMENDATIONS FOR BUSINESSES AND POLICYMAKERS 2 (2012), available at http://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-report-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf.
166. Id.
167. Consent Order, Snapchat, FTC No. 132 3078 (May 8, 2014), available at
https://www.ftc.gov/system/files/documents/cases/140508snapchatorder.pdf.
168. Id.
also work for companies that design robots, particularly those that collect
personal information.
***
In this Part, I have argued that the FTC can regulate consumer robotics and has some unique tools and theories to do so effectively. In the next Part, I will argue that the FTC should embrace consumer robotics so that the robotics industry can continue to flourish while consumers are protected.
III. THE FTC SHOULD TAKE THE LEAD ON REGULATING CONSUMER
ROBOTICS
Numerous federal, state, and non-governmental bodies will inevitably
have some role in regulating consumer robotics. For example, the Con-
sumer Product Safety Commission ("CPSC"), charged with "protecting the public from unreasonable risks of injury or death associated with the use of the thousands of types of consumer products," is likely to get involved if
robots start physically harming consumers.
169
The Federal Aviation Administration has already begun the process of regulating drones, which are commercially available.170 The FDA regulates implantable devices and telepresence surgical robots.171 The National Highway Traffic Safety Administration ("NHTSA") has released a preliminary report outlining a plan to ensure that automated, self-driving cars are safe.172 The International Organization for Standardization has released numerous standards covering safety requirements for industrial and personal care robots.173 This is to say nothing of the effect that contracts, in-
169. About, CONSUMER PRODUCT SAFETY COMMISSION, http://www.cpsc.gov/en/About-CPSC/ (last accessed Mar. 19, 2015).
170. FAA, OVERVIEW OF SMALL UAS NOTICE OF PROPOSED RULEMAKING (Feb. 15, 2015), http://www.faa.gov/regulations_policies/rulemaking/media/021515_sUAS_Summary.pdf.
171. See supra note 113.
172. Press Release, NHTSA, U.S. Department of Transportation Releases Policy on Automat-
ed Vehicle Development (May 30, 2013),
http://www.nhtsa.gov/About+NHTSA/Press+Releases/U.S.+Department+of+Transportation+Rele
ases+Policy+on+Automated+Vehicle+Development.
173. ISO 13482:2014, ROBOTS AND ROBOTIC DEVICES—SAFETY REQUIREMENTS FOR PERSONAL CARE ROBOTS (2014), available at http://www.iso.org/iso/catalogue_detail.htm?csnumber=53820; ISO 10218-1:2011, ROBOTS AND ROBOTIC DEVICES—SAFETY REQUIREMENTS FOR INDUSTRIAL ROBOTS—PART 1: ROBOTS (2011), http://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=51330; ISO 10218-2:2011, ROBOTS AND ROBOTIC DEVICES—SAFETY REQUIREMENTS FOR INDUSTRIAL ROBOTS—PART 2: ROBOT SYSTEMS AND INTEGRATION (2011), http://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=41571; ISO/DTS 15066, ROBOTS AND ROBOTIC DEVICES—SAFETY REQUIREMENTS FOR INDUSTRIAL ROBOTS—
surance, and products liability laws are likely to have on consumer robot-
ics.
174
It is even possible that regulatory bodies will be formed to address ro-
bots. Ryan Calo has proposed a new Federal Robotics Commission, which
would, at least initially, play an advisory role for other regulatory bodies
and companies.
175
Calo recognized that an agency could help responsibly
integrate robotics technologies into American society.
176
According to Calo, "Robots, like radio or trains, make possible new human experiences and create distinct but related challenges that would benefit from being examined and treated together. They do require special expertise to understand and may require investment and coordination to thrive."177
As of yet, however, no existing body has taken the lead in guiding the development of consumer robotics to protect consumers. Federal agency leadership would be useful to help develop consistent standards, encourage cooperation among regulatory bodies, and ward off burdensome, knee-jerk, and over-reactive regulatory efforts by Congress or other lawmaking bodies. In this Part, I argue that the FTC should take this leadership role in consumer robotics.
The FTC is best positioned to take the lead on consumer robotics is-
sues because it has developed a robust body of law to draw from and has a
track record of fostering nascent technologies like the Internet. This will
allow the agency to enable consumer robotics to flourish while protecting
consumers in a way consistent with established consumer protection goals
and law. The FTC gives deference to industry standards where relevant,
COLLABORATIVE OPERATION (2014),
http://www.iso.org/iso/home/store/catalogue_tc/catalogue_detail.htm?csnumber=62996.
174. See, e.g., David C. Vladeck, Machines Without Principals: Liability Rules and Artificial Intelligence, 89 WASH. L. REV. 117, 121 (2014); Bryant Walker Smith, Proximity-Driven Liability, 102 GEO. L.J. 1777 (2014); Diana Marina Cooper, A Licensing Approach to Regulation of Open Robotics, WE ROBOT CONFERENCE (2013), http://conferences.law.stanford.edu/werobot/wp-content/uploads/sites/29/2013/04/Cooper_Diana.pdf; Patrick Hubbard, Regulation and Liability for Risks of Physical Injury from "Sophisticated Robots," WE ROBOT CONFERENCE (2012), http://robots.law.miami.edu/wp-content/uploads/2012/01/Hubbard_Sophisticated-Robots-Draft-1.pdf.
175. Calo, supra note 83 ("The institution I have in mind would not 'regulate' robotics in the sense of fashioning rules regarding their use, at least not in any initial incarnation. Rather, the agency would advise on issues at all levels—state and federal, domestic and foreign, civil and criminal—that touch upon the unique aspects of robotics and artificial intelligence and the novel human experiences these technologies generate.").
176. Id. (“The alternative, I fear, is that we will continue to address robotics policy questions
piecemeal, perhaps indefinitely, with increasingly poor outcomes and slow accrual of
knowledge.”).
177. Id.
which will keep the law of consumer robotics from being arbitrary and dis-
connected from practice.
The agency is capable of responding to technological change quickly
but with stability, which is a necessity in the rapidly evolving field of robot-
ics. The FTC regularly cooperates with other regulatory bodies and can use
this experience to build consensus and consistency. Finally, the FTC can
use Section 5 as a safety net to address emergent issues in consumer robot-
ics that are currently unforeseeable or have not yet been realized.
A. Established Body of Law and Authority
When a federal agency operates for 100 years, it is bound to establish
a robust body of law. FTC official actions and legal resources can take the
form of advisory opinions, advocacy filings, cases, closing letters, volumes
of commission decisions, notices in the Federal Register, press releases,
public comment initiatives, public events, public statements, reports, and
rules.
178
However, the two most prominent forms of FTC jurisprudence
come from rules and cases, namely complaints.
The FTC has filed thousands of complaints regarding advertising and marketing, credit and finance, and privacy and security in various industries such as alcohol, appliances, automobiles, clothing, finance, franchises, business opportunities and investments, funerals, human resources, jewelry, real estate, and tobacco.179
These cases establish and develop the FTC’s body of law in incremen-
tal steps. In research regarding the FTC’s regulation of privacy, Daniel
Solove and I have argued that the FTC’s complaints functionally operate as
a body of law similar to the common law, even though they are not judicial
opinions with precedential value.180 The FTC remains consistent with these complaints, and practitioners view them as having precedential weight.181
Moreover, aside from a few exceptions involving unfairness, such as data security, the FTC's regulation of marketing, finance, and privacy is well-established.182
It would be relatively uncontroversial to apply the FTC’s
178. Legal Resources, FTC, https://www.ftc.gov/tips-advice/business-center/legal-
resources?type=case&field_consumer_protection_topics_tid=249 (last accessed Mar. 19, 2015).
179. Id.
180. Solove & Hartzog, supra note 132, at 619 ("Although the FTC's privacy cases nearly all consist of complaints and settlements, they are in many respects the functional equivalent of common law. While the analogy to traditional common law has its limits, it is nonetheless a useful frame to understand the FTC's privacy jurisprudence."). But see Justin (Gus) Hurwitz, Data Security and the FTC's UnCommon Law, IOWA L. REV. (forthcoming), available at http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2574257.
181. Solove & Hartzog, supra note 132, at 620.
182. Hartzog & Solove, supra note 136.
framework for disclosure, substantiation, endorsements, or trade puffery to robots in
applicable scenarios. Robots are not so different from other technologies and trade practices
that the FTC’s theories of design, indirect liability, and data protection would be
inapplicable. Thus the FTC is preferable as a leader in this field because it can leverage
the great weight of its jurisprudence to protect consumers against the non-exceptional
problems presented by consumer robotics.
B. Accommodation of Nascent Technologies
In the past, the FTC has approached promising technologies with a light regulatory touch when
they were new and only increased regulatory efforts once those technologies became more
established. The best example of this is the Internet. The incredible potential of the
Internet first became clear to both regulators and the public in the 1990s. Regulators
generally did not want to stifle this technology, which could change the world for the
better, with burdensome laws. This included the FTC, which first embraced a self-regulatory
regime for online privacy and then eventually shifted to a co-regulatory regime.183
A similar approach seems wise for consumer robotics. As discussed above, the non-exceptional
aspects of consumer robotics, that is, those with existing analogs, are safely regulated
under established FTC jurisprudence. Robotics companies should not be allowed to deceive or
harm consumers simply because they are new.
However, the exceptional aspects of consumer robotics should receive a reasonably light
regulatory touch for now. This means a preference for disclosure requirements over conduct
prohibitions and a case-by-case approach instead of rigorous rulemaking. Before launching
into too many enforcement actions, the FTC should continue its tradition of investigation and
guidance by holding public events on consumer robotics and issuing reports and white papers
to guide norms in relevant areas.
183. See FTC, SELF-REGULATION AND PRIVACY ONLINE: A REPORT TO CONGRESS 12–14
(1999) (“[T]he Commission believes that legislation to address online privacy is not appropriate at
this time.”); Robert Pitofsky, Chairman, FTC, Prepared Statement of the Federal Trade
Commission on “Consumer Privacy on the World Wide Web” (July 21, 1998), available at
http://www.ftc.gov/sites/default/files/documents/public_statements/prepared-statement-federal-trade-commission-consumer-privacy-world-wide-web/privac98.pdf
(“[T]he Commission’s goal has been . . . to encourage and facilitate self-regulation as the
preferred approach to protecting consumer privacy online.”); Robert Pitofsky, Chairman, FTC,
Prepared Statement of the Federal Trade Commission on “Self-Regulation and Privacy Online”
(July 13, 1999), available at
http://www.ftc.gov/public-statements/1999/07/prepared-statement-federal-trade-commission-self-regulation-and-privacy
(describing self-regulation as “least intrusive and most efficient means to ensure fair
information practices online”).
The FTC also has limited resources, which means that it places great emphasis on
prioritization. In the privacy context, the FTC files only about ten to fifteen complaints
per year.184 The likelihood of being the subject of an FTC complaint is quite small. The
result is that the FTC generally stays away from the grey areas and largely pursues only the
most egregious cases of wrongdoing.185
Thus, the FTC’s constraints help ensure that the consumer robotics industry has the room it
needs to grow. Most actions by robotics companies will not result in an agency complaint, and
only the most serious misrepresentations and unfair actions will trigger enforcement. This
preservation of grey area for robotics companies will allow the industry to flourish while
consumers calibrate appropriate expectations surrounding the use and efficacy of robots.
C. Deference to Industry
The FTC also has a track record of deferring to industry practices to establish co-regulatory
regimes. The most prominent recent example of this deference is the FTC’s regulation of data
security. The FTC generally requires “reasonable” data security from companies that collect
consumer information.186 In a statement issued in conjunction with the FTC’s fiftieth data
security complaint, the FTC stated, “The touchstone of the Commission’s approach to data
security is reasonableness: a company’s data security measures must be reasonable and
appropriate in light of the sensitivity and volume of consumer information it holds, the size
and complexity of its business, and the cost of available tools to improve security and
reduce vulnerabilities.”187
The FTC has implicitly and explicitly represented that it looks to industry standards to
guide its enforcement, particularly when determining what constitutes reasonable data
protection.188 As previously stated, the FTC generally observes that good data security is a
process involving identification of assets and risk, data minimization, a data breach
response plan, and physical, technical, and administrative safeguards.189 However, more
184. See Solove & Hartzog, supra note 132, at 619.
185. Id.; Hartzog & Solove, supra note 136.
186. Commission Statement, supra note 161.
187. Id.
188. FTC, supra note 165, at 2 (“To the extent that strong privacy codes are developed, the
Commission will view adherence to such codes favorably in connection with its law enforcement
work.”).
189. Commission Statement, supra note 161; see also Julie Brill, Keynote Address Before the
Center for Strategic and International Studies, “Stepping into the Fray: The Role of Independent
Agencies in Cybersecurity” (September 17, 2014),
specifically, a review of the FTC’s complaints reveals that what the agency considers
unreasonable largely overlaps with several established industry standards, such as NIST SP
800-53 and ISO 27001.190
Similar FTC deference to the consumer robotics industry is desirable for several reasons.
First, deference will help keep the law of consumer robotics from being arbitrary and
disconnected from practice. Co-regulatory approaches that use industry standards to form
rules are also politically palatable because they are the result of stakeholder consensus.
By definition, industry standards also dictate what is feasible in industry. Thus deference
can also keep rules regarding consumer robotics from being overly burdensome. Finally,
industry standards are constantly updated, so deference provides flexibility. If rules are
tethered to industry standards, then a new law need not be passed every time standards
change. Laws simply evolve with practice.
Of course, not all potential rules of consumer robotics need be deferential to industry
standards. Often, there will be no standard for certain activities or designs. Other times,
the industry standard will not adequately protect consumers from harm or deception. Thus
deference is no panacea. Yet it remains a useful strategy that the FTC has deployed
effectively and could deploy again with consumer robotics. As previously mentioned, industry
standards have already begun to emerge regarding safety and robots, with more inevitably on
the way.191
https://www.ftc.gov/system/files/documents/public_statements/582841/140917csisspeech.pdf
(“The core of the NIST Framework is about risk assessment and mitigation. In this regard, it is
fully consistent with the FTC’s enforcement framework. One of the pillars of reasonable security
practices that the FTC has established through our settlements in more than 50 data security cases
is that assessing and addressing security risks must be a continuous process.”).
190. See Solove & Hartzog, supra note 132, at 619; Kristina Rozan, How Do Industry Standards
for Data Security Match Up with the FTC’s Implied “Reasonable” Standards – And What Might
This Mean for Liability Avoidance?, IAPP (Nov. 25, 2014),
https://privacyassociation.org/news/a/how-do-industry-standards-for-data-security-match-up-with-the-ftcs-implied-reasonable-standards-and-what-might-this-mean-for-liability-avoidance
(“The NIST SP 800-53 Rev. 4 came closest to covering the 72 reasonable data security
practices expected by the FTC, as inferred from the FTC’s complaints. Sixty-six of the 72
expected reasonable practices were recommended in this NIST report. In many instances, the
‘match’ between the expected practice and recommended standard was nearly perfect.”); NIST,
SECURITY AND PRIVACY CONTROLS FOR FEDERAL INFORMATION SYSTEMS AND ORGANIZATIONS (NIST SP
800-53 Rev. 4); ISO/IEC 27001:2013, INFORMATION TECHNOLOGY – SECURITY TECHNIQUES –
INFORMATION SECURITY MANAGEMENT SYSTEMS – REQUIREMENTS (2013),
http://www.iso.org/iso/catalogue_detail?csnumber=54534; COUNCIL ON CYBERSECURITY, TOP 20
CRITICAL SECURITY CONTROLS, https://www.sans.org/media/critical-security-controls/CSC-5.pdf.
191. Supra note 173.
D. The FTC Can and Should Cooperate with Other Agencies
While I argue that the FTC should take the lead in addressing consumer robotics, the agency
should not seek to go it alone. There will be many regulatory bodies whose efforts with
respect to consumer robotics will be relevant to the FTC. The FTC can and should cooperate
with overlapping agencies.
The scope of Section 5 is so broad that it routinely overlaps with other regulatory
agencies.192 One court has stated, “Because we live in ‘an age of overlapping and concurring
regulatory jurisdiction,’ a court must proceed with the utmost caution before concluding that
one agency may not regulate merely because another may.”193
The FTC has cooperated with other agencies formally through memoranda of understanding. The
agency also cooperates informally through regulator communication or simply by remaining
consistent with other regulatory bodies. For example, the FTC has worked with the FDA for
over forty years regarding certain kinds of advertising for food and drugs.194 The FTC and
HHS often coordinate enforcement actions for violations that implicate both HIPAA and the
FTC Act.195 The FTC has provided comments for the NHTSA regarding privacy in
vehicle-to-vehicle communications.196
192. Hartzog & Solove, supra note 136.
193. FTC v. Ken Roberts Co., 276 F.3d 583, 593 (D.C. Cir. 2001) (quoting Thompson Medical
Co. v. FTC, 791 F.2d 189, 192 (D.C. Cir. 1986)); see also FTC v. Texaco, Inc., 555 F.2d 862,
881 (D.C. Cir. 1976). See generally FTC v. Cement Inst., 333 U.S. 683, 694–95 (1948).
194. See Memorandum of Understanding Between The Federal Trade Commission and The Food and
Drug Administration, MOU 225-71-8003,
http://www.fda.gov/AboutFDA/PartnershipsCollaborations/MemorandaofUnderstandingMOUs/DomesticMOUs/ucm115791.htm;
Thompson Medical, 791 F.2d at 192 (“We find no evidence in the regulatory scheme that
Congress has fashioned for over-the-counter medications that the FTC is indefinitely barred
from all regulatory authority over drug advertising while the FDA conducts its comprehensive
review of drug safety. Nowhere in the case law or in the FTC’s grant of authority is there
even a hint that the FTC’s jurisdiction is so constricted. To the contrary, the cases
recognize that ours is an age of overlapping and concurring regulatory jurisdiction.”); see
also Overlapping Authority of FTC, CCH-DCLR P 2010.85 (C.C.H.), 2009 WL 5076333 (“The
Federal Trade Commission has jurisdiction to prohibit false labeling and misbranding of
food, drugs and cosmetics and other products where the false labeling and misbranding
constitutes unfair competition in the purview of Section 5 of the FTC Act”) (citing Fresh
Grown Preserve Corp. v. FTC, 125 F.2d 917 (2d Cir. 1942)).
195. Order Denying Respondent LabMD’s Motion to Dismiss, LabMD, Inc., No. 9357 (Jan. 16,
2014), available at http://www.ftc.gov/sites/default/files/documents/cases/140117labmdorder.pdf
(citing HHS, Modifications to the HIPAA Privacy, Security, Enforcement, and Breach
Notification Rules, Final Rule, 78 Fed. Reg. 5566, 5579 (Jan. 25, 2013)).
196. Press Release, FTC, FTC Provides Comment to NHTSA on Privacy and Vehicle-To-Vehicle
Communications (Oct. 24, 2014),
https://www.ftc.gov/news-events/press-releases/2014/10/ftc-provides-comment-nhtsa-privacy-vehicle-vehicle-communications.
Agency overlap is not only inevitable, but in this instance, desirable.197 Scholars have
argued that when agencies have overlapping authority, their competition brings them closer to
the intent of Congress when granting authority.198
Robots are or will soon become involved in many diverse areas, such as commerce, aviation,
traffic, lodging, healthcare, pharmaceuticals, games, and socialization. Cooperation will be
key to ensuring consistency, accuracy, and efficiency. Here is where Calo’s proposed Federal
Robotics Commission could prove the most useful. Calo suggested that one of the functions of
the FRC should be to “[a]dvise other federal agencies on matters having to do with robotics,
including the DOT on driverless cars, the SEC on high speed trading, the FDA on robotic
medical devices, the FCC on cognitive radios, the FAA on drones and, eventually, the Federal
Trade Commission on increasingly sophisticated consumer products.”199
Given the FTC’s broad authority and history of cooperating with other agencies, it is a
strong candidate to take the lead on regulating consumer robotics while cooperating with
existing and proposed administrative agencies.
IV. CONCLUSION
In many ways, robots are nothing special. Neil Richards and Bill Smart argued, “Robots are,
and for many years will remain, tools. They are sophisticated tools that use complex
software, to be sure, but no different in essence than a hammer, a power drill, a word
processor, a web browser, or the braking system in your car.”200
Yet robots are unique in utility and social meaning. People rarely name their hammers or have
candid conversations with their power drills. A robot can do things hammers never dreamed of.
In the same way that paintings do not raise the same privacy problems as digital photographs,
robots are distinct enough from existing technologies to warrant exceptional legal
consideration in some contexts.
The FTC can respond to both exceptional and traditional issues presented by robots. A
relatively light regulatory touch for now, focused on
197. Jacob E. Gersen, Overlapping and Underlapping Jurisdiction in Administrative Law, 2006
SUP. CT. REV. 201, 203, 208 (“[S]tatutes that parcel out authority or jurisdiction to
multiple agencies may be the norm, rather than an exception. . . . Because overlapping and
underlapping jurisdictional assignment can produce desirable incentives for administrative
agencies, statutes [that create overlapping and underlapping jurisdictional schemes] are
useful tools for managing principal-agent problems inherent in delegation.”).
198. Id. at 201, 212.
199. Calo, supra note 83.
200. Richards & Smart, supra note 4.
deception, disclosures, data security, and extreme cases of malicious design will allow
consumer robots to flourish while protecting consumers. Consumers want their robots to be
safe and truthful. But they do want them. Thus the FTC, or whatever agency ultimately takes
the lead on consumer robotics, should find analogs where possible, keep an eye out for
genuinely new problems, and otherwise make sure that consumers can continue to buy and use
robots in a safe, sustainable way.