Designing for Privacy in an Increasingly Public World
Product Design Meetup | 21 July 2021 | Robert Stribley
Street art by JR
I’m Robert Stribley
I’m a user experience designer and manager at
Publicis Sapient and I teach user experience design
at the School of Visual Arts in New York.
Introduction
• Corporations and non-profits consider the privacy of their
users’ data, their content, even their browsing behavior for
their clients’ benefit and safety
• But they also do it for their own personal and financial
self-interest
• It’s increasingly important that they consider the privacy and
security issues affecting their customers and clients
• So how do we design these digital experiences — apps,
websites, etc. — to ensure people’s privacy?
Background
Privacy and security are different concepts
Privacy: Your ability to control your personal information and
how it’s used
Security: How your personal information is protected by those
holding on to it
These concepts often overlap, so we’ll refer to both
Our focus: How we can ensure people’s privacy is maintained
as we design experiences for them
Purpose
Why Privacy?
Image by Jack Ferrentino for NPR
“Arguing that you don't
care about the right to
privacy because you
have nothing to hide is
no different than saying
you don't care about
free speech because
you have nothing to
say.”
— Edward Snowden, former CIA
employee, infamous NSA leaker
Why Privacy?
• If we’re not concerned with a particular privacy
issue, remember, we’re not designing for
ourselves
• If we’re designing with empathy, we’ll consider
the needs of people not like ourselves — people
with different backgrounds and experiences
• That means researching privacy issues, but also
engaging with people with diverse backgrounds
and lived experiences
Why Privacy?
Examples:
DayOne, a non-profit, provides services for
young people in abusive dating relationships.
These clients may worry about their partners
tracking their online activity or stalking them in
real life.
Similarly, LGBTQ youth need to feel their privacy
is secure when reaching out for help online.
In this sense, privacy issues are often diversity
issues.
Why Privacy?
Privacy Issues
In April 2021, Facebook, the
largest social media platform
on the planet, was hacked.
533 million users’ phone
numbers and personal data
were leaked online.
Data for half a billion people.
Data Security
Fraud & identity theft on the
rise during the pandemic.
FTC: 1.4 million reports of
identity theft in 2020 — roughly
double the 2019 total.
Leaks of personal data can be
catastrophic to people’s lives.
Fraud & Identity
Theft
Photo by Kyle Glenn
Stores such as Albertsons,
Rite Aid, Macy’s, and Ace
Hardware are using facial
recognition programs to
identify customers.
Some also use apps to track
customers around their stores
to present them with ads
online later.
Facial
Recognition
Amazon required delivery
drivers to sign consent
forms, which allowed the
company to collect their
biometric data and to use AI
cameras to monitor their
location, movement, and
driving patterns.
At least one driver quit over
this form of “AI surveillance.”
Biometric Data
• A donation site for Donald Trump
deployed “dark patterns” to trick
supporters into agreeing to recurring
donations
• Designers rolled out iterations of this
feature with increasingly confusing
language, fine print, bold text, all-
caps, and a pre-selected check box
• They referred to the feature as a
“money bomb”
• Donations grew astronomically — as
did fraud complaints from angry
supporters
• One 78-year-old supporter summed
up his thoughts: “Bandits!”
Dark Patterns
• Demand for personalized content, which benefits from
personal data, seems higher than ever
• People say they want personalized ads, so you’d think
they enjoy sharing their data
• But a 2019 survey by network security company RSA
found only 17% of respondents said it was ethical to
track their online activity to personalize ads
• Earlier, Pew Research found 91% of adults believe
consumers have lost control over how their personal
information is collected and used by companies
Data Sharing
Data Sharing
Apple rolled out a new iPhone
feature called “App Tracking
Transparency,” an anti-tracking
shield, which prevents apps from
shadowing you across the internet.
Now, they have to ask first.
Only ~15% of iOS users worldwide
are allowing apps to track them so
far (The Register, May 2021).
Impact of Regulations
GDPR stands for …
The General Data Protection Regulation
The law was finalized in 2016 and came into
effect in 2018
It regulates how apps and sites can gather,
transfer, and process personal data when
operating within the European Union
It also governs what happens to that data when
it’s transferred outside of the EU
Impact of Regulations
Remember a while back
when you suddenly got a
gazillion emails from
companies telling you they
had updated their privacy
policies?
That was a result of the
GDPR.
Some things GDPR requires (sketched in code below) …
• Ask people to opt in to sharing their data
• Communicate to people in the moment, when
you’re collecting their personal data
• Be transparent about what you’re doing with it
• Allow people to download their data and
delete it — a “right to erasure” or “right to be
forgotten”
Impact of Regulations
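As a rough illustration, here’s one way these requirements can translate into code. This is a minimal TypeScript sketch: the ConsentRecord type, the purposes, and the in-memory store are all hypothetical names for illustration, not any real library’s API or a legal implementation.

// GDPR-style consent handling: opt-in by default, recorded in the moment,
// erasable on request.

type ConsentPurpose = "analytics" | "personalization" | "marketing";

interface ConsentRecord {
  userId: string;
  purpose: ConsentPurpose;
  granted: boolean;   // must start false: opt-in, never pre-checked
  timestamp: string;  // when consent was given, at the point of collection
}

const consentStore = new Map<string, ConsentRecord[]>();

// Record the user's choice at the moment the data is collected.
function recordConsent(userId: string, purpose: ConsentPurpose, granted: boolean): void {
  const records = consentStore.get(userId) ?? [];
  records.push({ userId, purpose, granted, timestamp: new Date().toISOString() });
  consentStore.set(userId, records);
}

// "Right to erasure" / "right to be forgotten": remove everything on request.
function eraseUserData(userId: string): void {
  consentStore.delete(userId);
  // a real system would also purge profiles, analytics events, and backups
}

The key design choices mirror the bullets above: consent always starts ungranted, it’s timestamped at the moment of collection, and erasure is a first-class operation.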
California passed its own version of the GDPR —
the California Consumer Privacy Act.
Gives Californians more control over how their
personal data is used.
Requirements very similar to those in the GDPR.
CCPA differs in that it (currently) allows businesses
to collect your information by default—though they
do have to offer the ability to opt out.
California Consumer Privacy Act 2018
Impact of Regulations
In March 2021, California announced it’s
banning “dark patterns.”
It also introduced a new “Privacy Options” icon
for businesses to show you where to opt out of
data collection.
The icon was designed by Carnegie Mellon’s
CyLab and the University of Michigan’s School
of Information.
Impact of Regulations
New York, Maryland, Massachusetts and Hawaii
are developing their own privacy laws, too.
So, if you’re designing for GDPR and California
privacy laws and more, you may as well design
for all — design for the highest common good.
Impact of Regulations
What’s Our Role?
Our Role
“You were not hired to get approval
or to have your work pinned to the
company fridge.”
“People hire you to be the expert, so
you might as well be the expert.”
—Mike Monteiro, designer, co-founder of Mule Design
in Ruined by Design
More specifically?
We have a responsibility to act as the
advocate for users — but even that’s
too abstract.
The term “user” tends to strip people of
their individual circumstances, their
personality, their history, even their
lives.
We have a responsibility to real human
beings.
We may need to push back where
necessary in terms our clients
understand.
Our Role
Photo by Vince Fleming
We may have to explain to our clients the impacts of ignoring privacy and security concerns.
What are these impacts, specifically?
• Civic responsibility: As user-centered designers, we really should be encouraging our clients to treat
their “end users” as human beings who are members of their community
• Reputation management: We may have to remind our clients that what companies do can
undermine their brands
• Using dark patterns may anger people and cause them to abandon your site in favor of another with
a more transparent experience
• Data breaches and sloppy treatment of data may lead to the loss of their user base — likely affecting
their profits
• Financial consideration: Keep in mind the increasing number of laws and regulations and the
resulting fines for not following them
Even if there’s an up-front cost to designing for privacy and security, the long-term costs can be
devastating
Our Role
In 2019, five employees quit their jobs at
GitHub after learning the company had
renewed a software contract with ICE, the
government agency, which has been
accused repeatedly of human rights
violations — especially related to its
treatment of immigrants.
It might be tough to speak up in such
a situation, but we got into this
business to help people — and what
we do has a real-world impact.
Our Role
In the 1940s, a Frenchman, René Carmille, was working on
the French census.
He and his team have been dubbed the first “ethical
hackers.” They decided to sabotage their own machines so
the punch cards couldn’t register people’s religion properly.
The team was discovered, arrested by the Nazis, and
tortured. Carmille died at Dachau.
But they prevented the Nazis from discovering the identities
of tens of thousands of Jewish people living in France, saving
their lives in the process.
They did so by changing an experience to maintain people’s
privacy.
René Carmille
Best Practices
In Privacy by Design, Dr. Ann Cavoukian
lays out 7 foundational principles for Fair
Information Practices.
She recommends making privacy the
“default setting” in our designs and
says privacy should be “embedded”
into design.
So, what are some practical ways to
ensure we’re doing that?
Best Practices
Self Study:
“Privacy by Design: The 7 Foundational Principles”
by Dr. Ann Cavoukian
Founder of Global Privacy & Security by Design and the former Information and Privacy Commissioner
for the Canadian province of Ontario
Avoid dark patterns
Dark Patterns
1
Dark Patterns
UX designer Harry Brignull coined
the term “dark pattern” in 2010
He defines a dark pattern as a “user
interface that has been carefully
crafted to trick users into doing
things” they didn’t mean to do —
like buying or signing up for
something
Another researcher described dark
patterns as supplanting user value
“in favor of shareholder value”
Brignull identified about a dozen types of
dark patterns.
Bait and Switch – You set out to accomplish
one thing but something else completely
undesirable happens.
Confirmshaming – You try to unsubscribe
from something, for example, and the
feature to opt out uses language to guilt
you out of taking action.
Friend spamming – A site asks to access
your contacts, so you can find your friends,
then it emails all your friends without your
permission.
Dark Patterns
Example of confirmshaming
Dark Patterns
“Dark patterns are the canaries in the
coal mine of unethical design.
A company who’s willing to keep a
customer hostage is willing to do
worse.”
— Mike Monteiro, Ruined by Design
Dark patterns can expose users’ personal
information
When you make a payment on Venmo, it
defaults to public, so you automatically share
your payments with … everyone
The opposite of designing with privacy as a
default
Somebody created Vicemo, which scraped
payments listed with words associated with
drugs, alcohol or sex and posted them online
for all to see
Dark Patterns
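By contrast, here is a minimal sketch of privacy as the default setting for a Venmo-style payment (TypeScript; the Visibility type and function names are hypothetical, for illustration only):

// Privacy by design: a new payment starts private; widening visibility
// requires an explicit user action, never a buried default.

type Visibility = "private" | "friends" | "public";

interface PaymentSettings {
  visibility: Visibility;
}

// The default is the most protective option.
function defaultPaymentSettings(): PaymentSettings {
  return { visibility: "private" };
}

// Sharing more widely happens only when the user explicitly asks for it.
function setVisibility(settings: PaymentSettings, choice: Visibility): PaymentSettings {
  return { ...settings, visibility: choice };
}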
Strava automatically tagged other runners you passed
unless they had changed their default settings.
This feature even had a name: Flyby.
If you clicked on a face, it showed the user’s full name,
picture and a map of their running route — effectively
revealing where they lived.
This happened without you following users and without
them knowing they were sharing their activity.
After receiving criticism, Strava did change the default
setting to private.
It should have always been private.
“Stalkerware” – apps that allow people to be tracked —
intentionally or not
Dark Patterns
Be transparent
about what
personal data is
used
What Data Is Used?
2
It’s important to be very specific —
especially when sharing PII.
Personally identifiable information —
data points such as name, email, phone
number, Social Security number, mother’s
maiden name, which can be used to steal
people’s identities and commit fraud
87% of the U.S. population can be
uniquely identified by just their date of
birth, gender, and ZIP code. (Those items
aren’t even considered PII.)
Imagine how much damage a bad actor
can do with just 3 data points of PII.
What Data Is Used?
Be transparent about
why specific personal
data is collected or
shared
Why Is Data Used?
3
Consider this an opportunity to explain to users the
benefits of sharing their data:
• Does it ensure a better experience in the future?
• Does it personalize ads and offers for them?
Be prepared to explain those benefits in detail.
If you can’t, are you designing the right sort of product?
Why Is Data Used?
Why Is Data Used?
Lemonade includes an itemized,
detailed explanation of what
personal information you’re
sharing, and they also explain
why.
They also promise never to sell
your information to third parties.
“TL;DR: We will never, ever, sell your data to anyone.”
Always use clear,
approachable language
Clear Language
4
Clear Language
The New York Times studied 150 privacy policies
from various tech and media platforms. They
described what they found as an “incomprehensible
disaster.”
They described Airbnb’s privacy policy as
“particularly inscrutable.”
“This information is necessary for the adequate performance of
the contract between you and us and to allow us to comply with
our legal obligations.”
Vague language and jargon allow for a wide range
of interpretation, making it easy for companies to
defend their practices in a lawsuit while making it
harder for us to understand what’s really going on
with their data.
Twitter advises you to read
their privacy policy in full but
highlights key aspects of it up
front — in a dedicated section
— advising you to pay
attention to those particular
things
Clear Language
Guidelines:
• Avoid legalese and jargon: Even your
terms and conditions content doesn’t
have to sound like legal content
• Consider different age groups and levels
of savviness
• Most adult Americans read at about a
basic or intermediate literacy level
• 50% can’t read a book written at an 8th
grade level
• The Content Marketing Institute
recommends writing for about a 14- or
15-year-old (about 8th grade)
• Carefully crafted personas can help
determine if an experience’s reading level
should vary from that range
Clear Language
Photo by John-Mark Smith
Give users options to control their
own data
User Controls
5
User Controls
Google offers a Privacy Checkup with
high-level descriptions of how your personal data is
being used and why.
This links to specific Privacy Controls, which
allow you to adjust how that data is accessed.
They allow you to turn off activity tracking,
location history, your YouTube history, your
Google Photos settings, check which third
parties have access to your account
information, and access other key settings all
in one privacy dashboard.
This seems like a good moment to
recall Dr. Cavoukian’s maxim:
Keep these settings private by
default
User Controls
Ensure these privacy
features and information are
placed contextually and
easy to find
Easy to Find
6
Easy to Find
Such important information
shouldn’t be placed in 8-point
font …
buried in the Terms &
Conditions …
hidden in the footer …
or several levels of navigation
down deep in your app
— and yet, that’s often where
we find it
A feature like California’s
new “Privacy Options” icon
could prove effective at
drawing attention to these
privacy options.
Easy to Find
Contextual and easy to find also means …
Onboarding — Explaining in detail how you use people’s
data when they’re using your app for the very first time.
“Just in time” alerts – Alerting users in the moment—when
they’re about to share data in a new way—even if they have a
history of using your experience.
Easy to Find
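A minimal sketch of a “just in time” alert in the browser (TypeScript). The store-finder scenario and the copy are hypothetical; navigator.geolocation is the real browser API. The point: ask at the moment the user acts, explain why, and respect a refusal with a fallback.

// Request location only when the user taps "Find stores near me,"
// never silently at page load.
function onFindNearbyStoresClick(): void {
  const agreed = window.confirm(
    "To show stores near you, we need your location. " +
      "We use it once for this search and never store it. Share it?"
  );
  if (!agreed) {
    console.log("User declined; offer a ZIP-code search instead");
    return;
  }

  navigator.geolocation.getCurrentPosition(
    (position) => {
      const { latitude, longitude } = position.coords;
      console.log(`Searching for stores near ${latitude}, ${longitude}`);
    },
    () => console.log("Permission denied; fall back to ZIP-code search")
  );
}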
Mozilla displays robust
Privacy information by
default in a dedicated tab
when you download and
open their Firefox
browser for the first time.
Remind users regularly about their privacy
options
And actively encourage them to take
advantage of them
Reminders
7
Reminders
Facebook allows you to set
reminders to do a privacy
checkup every week, month, 6
months or year
Google also has a feature that
will send you a reminder to check
your privacy settings.
Never change users’ privacy settings
without telling them in advance.
They should also have the option to opt
out of such changes.
Never Change Without Notice
8
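A minimal sketch of that rule (TypeScript; all names are hypothetical): a settings change is announced first, and it applies only if the user hasn’t opted out during the notice period.

// Never flip a privacy setting silently: announce, allow opt-out, then apply.

interface PendingSettingChange {
  settingKey: string;
  newValue: string;
  announcedAt: Date;
  optedOut: boolean;  // true if the user opts out during the notice period
}

function notifyUser(message: string): void {
  console.log(`[notice] ${message}`);  // stand-in for an email or in-app notice
}

function announceChange(settingKey: string, newValue: string): PendingSettingChange {
  notifyUser(
    `We plan to change "${settingKey}" to "${newValue}" in 30 days. ` +
      `You can opt out of this change in Settings.`
  );
  return { settingKey, newValue, announcedAt: new Date(), optedOut: false };
}

function applyIfPermitted(change: PendingSettingChange, settings: Map<string, string>): void {
  if (change.optedOut) return;  // the user said no; the old value stays
  settings.set(change.settingKey, change.newValue);
}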
A few years ago, Facebook made users’ “likes” visible overnight,
which consequently may have outed some people in the
LGBTQ community or revealed people’s personal, political, or
religious beliefs.
When I asked an employee how they justified this change, they
responded that the company valued transparency and wanted
people to be transparent about their interests.
The company’s founder, Mark Zuckerberg, had even famously
said privacy was no longer a “social norm.”
Never Change Without Notice
We don’t have the right to make decisions about other
people’s personal data and interests on their behalf.
Assuming everyone’s information can safely be made
public is a belief that comes from a position of privilege.
We should never make decisions like this, which can
profoundly affect people’s privacy, without their explicit
consent.
Never Change Without Notice
In Conclusion
We talk a lot about “empathy” in
design.
If we design with empathy, we
won’t design experiences we
wouldn’t want to use ourselves.
And we won’t design using “dark
patterns” either.
Conclusion
Photo by Josh Calabrese
Privacy is not about secrecy.
It’s all about control.
— Dr. Ann Cavoukian
If we want to ensure people have control over their
own personal information
If we want to ensure the experiences we design are
user-friendly and truly “user-centered”
We’ll keep these best practices in mind
Conclusion
Photo by Zanardi, Unsplash
thank you
Further Study
Further Study
• California Consumer Privacy Act
• GDPR.eu
• “Privacy by Design: The 7 Foundational Principles” –
Dr. Ann Cavoukian
• The Privacy Project – New York Times
• “We Read 150 Privacy Policies. They Were an
Incomprehensible Disaster” – Kevin Litman-Navarro,
New York Times
• “Privacy UX: Common Concerns and Privacy in Web
Forms” – Vitaly Friedman, Smashing Magazine
• “What GDPR Means for UX” – Claire Barrett
• www.darkpatterns.org – Harry Brignull
• “How Dark Patterns Trick You Online” – YouTube
• Ruined by Design – Mike Monteiro
