Numbers can't speak for themselves, and data sets - no matter their scale - are still objects of human design.
Kate Crawford

We need a sweeping debate about ethics, boundaries, and regulation for location data technologies.
Kate Crawford

With big data comes big responsibilities.
Kate Crawford

Books about technology start-ups have a pattern. First, there's the grand vision of the founders, then the heroic journey of producing new worlds from all-night coding and caffeine abuse, and finally, the grand finale: immense wealth and secular sainthood. Let's call it the Jobs Narrative.
Kate Crawford

Like all technologies before it, artificial intelligence will reflect the values of its creators. So inclusivity matters - from who designs it to who sits on the company boards and which ethical perspectives are included.
Kate Crawford

Biases and blind spots exist in big data as much as they do in individual perceptions and experiences. Yet there is a problematic belief that bigger data is always better data and that correlation is as good as causation.
Kate Crawford

If you're not thinking about the way systemic bias can be propagated through the criminal justice system or predictive policing, then it's very likely that, if you're designing a system based on historical data, you're going to be perpetuating those biases.
Kate Crawford

Data and data sets are not objective; they are creations of human design. We give numbers their voice, draw inferences from them, and define their meaning through our interpretations.
Kate Crawford

We urgently need more due process with the algorithmic systems influencing our lives. If you are given a score that jeopardizes your ability to get a job, housing, or education, you should have the right to see that data, know how it was generated, and be able to correct errors and contest the decision.
Kate Crawford

Data will always bear the marks of its history. That is human history held in those data sets.
Kate Crawford

If you have rooms that are very homogeneous, that have all had the same life experiences and educational backgrounds, and they're all relatively wealthy, their perspective on the world is going to mirror what they already know. That can be dangerous when we're making systems that will affect so many diverse populations.
Kate Crawford

We should have equivalent due-process protections for algorithmic decisions as for human decisions.
Kate Crawford

There is no quick technical fix for a social problem.
Kate Crawford

Big data sets are never complete.
Kate Crawford

The amount of money and industrial energy that has been put into accelerating AI code has meant that there hasn't been as much energy put into thinking about social, economic, ethical frameworks for these systems. We think there's a very urgent need for this to happen faster.
Kate Crawford

Vivametrica isn't the only company vying for control of the fitness data space. There is considerable power in becoming the default standard-setter for health metrics. Any company that becomes the go-to data analysis group for brands like Fitbit and Jawbone stands to make a lot of money.
Kate Crawford

Self-tracking using a wearable device can be fascinating.
Kate Crawford

Rather than assuming Terms of Service are equivalent to informed consent, platforms should offer opt-in settings where users can choose to join experimental panels. If they don't opt in, they aren't forced to participate.
Kate Crawford

It is a failure of imagination and methodology to claim that it is necessary to experiment on millions of people without their consent in order to produce good data science.
Kate Crawford

Surveillant anxiety is always a conjoined twin: The anxiety of those surveilled is deeply connected to the anxiety of the surveillers. But the anxiety of the surveillers is generally hard to see; it's hidden in classified documents and delivered in highly coded languages in front of Senate committees.
Kate Crawford

Big Data is neither color-blind nor gender-blind. We can see how it is used in marketing to segment people.
Kate Crawford