" With big data comes big responsibilities. "
Kate Crawford
Related Quotes:
" It is a failure of imagination and methodology to claim that it is necessary to experiment on millions of people without their consent in order to produce good data science. "
Kate Crawford
" Data will always bear the marks of its history. That is human history held in those data sets. "
Kate Crawford
" Biases and blind spots exist in big data as much as they do in individual perceptions and experiences. Yet there is a problematic belief that bigger data is always better data and that correlation is as good as causation. "
Kate Crawford
" Data is something we create, but it's also something we imagine. "
Kate Crawford
" Sexism, racism, and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many 'intelligent' systems that shape how we are categorized and advertised to. "
Kate Crawford
" Histories of discrimination can live on in digital platforms, and if they go unquestioned, they become part of the logic of everyday algorithmic systems. "
Kate Crawford
" While many big-data providers do their best to de-identify individuals from human-subject data sets, the risk of re-identification is very real. "
Kate Crawford
" The promoters of big data would like us to believe that behind the lines of code and vast databases lie objective and universal insights into patterns of human behavior, be it consumer spending, criminal or terrorist acts, healthy habits, or employee productivity. But many big-data evangelists avoid taking a hard look at the weaknesses. "
Kate Crawford
" Surveillant anxiety is always a conjoined twin: The anxiety of those surveilled is deeply connected to the anxiety of the surveillers. But the anxiety of the surveillers is generally hard to see; it's hidden in classified documents and delivered in highly coded languages in front of Senate committees. "
Kate Crawford
" We need a sweeping debate about ethics, boundaries, and regulation for location data technologies. "
Kate Crawford
" Big Data is neither color-blind nor gender-blind. We can see how it is used in marketing to segment people. "
Kate Crawford
" Numbers can't speak for themselves, and data sets - no matter their scale - are still objects of human design. "
Kate Crawford
" We urgently need more due process with the algorithmic systems influencing our lives. If you are given a score that jeopardizes your ability to get a job, housing, or education, you should have the right to see that data, know how it was generated, and be able to correct errors and contest the decision. "
Kate Crawford
" We should have equivalent due-process protections for algorithmic decisions as for human decisions. "
Kate Crawford
" Like all technologies before it, artificial intelligence will reflect the values of its creators. So inclusivity matters - from who designs it to who sits on the company boards and which ethical perspectives are included. "
Kate Crawford
" We need to be vigilant about how we design and train these machine-learning systems, or we will see ingrained forms of bias built into the artificial intelligence of the future. "
Kate Crawford
" Books about technology start-ups have a pattern. First, there's the grand vision of the founders, then the heroic journey of producing new worlds from all-night coding and caffeine abuse, and finally, the grand finale: immense wealth and secular sainthood. Let's call it the Jobs Narrative. "
Kate Crawford
" People think 'big data' avoids the problem of discrimination because you are dealing with big data sets, but, in fact, big data is being used for more and more precise forms of discrimination - a form of data redlining. "
Kate Crawford
" Hidden biases in both the collection and analysis stages present considerable risks and are as important to the big-data equation as the numbers themselves. "
Kate Crawford
" Vivametrica isn't the only company vying for control of the fitness data space. There is considerable power in becoming the default standard-setter for health metrics. Any company that becomes the go-to data analysis group for brands like Fitbit and Jawbone stands to make a lot of money. "
Kate Crawford
" Many of us now expect our online activities to be recorded and analyzed, but we assume the physical spaces we inhabit are different. The data broker industry doesn't see it that way. To them, even the act of walking down the street is a legitimate data set to be captured, catalogued, and exploited. "
Kate Crawford
" Self-tracking using a wearable device can be fascinating. "
Kate Crawford
" The fear isn't that big data discriminates. We already know that it does. It's that you don't know if you've been discriminated against. "
Kate Crawford
" We should always be suspicious when machine-learning systems are described as free from bias if it's been trained on human-generated data. Our biases are built into that training data. "
Kate Crawford