Notes
1
Heyn, «Berlin’s Wonderful Horse.»
2
Pfungst, Clever Hans.
3
«‘Clever Hans’ Again.»
4
Pfungst, Clever Hans.
5
Pfungst.
6
Lapuschkin et al., «Unmasking Clever Hans Predictors.»
7
See the work of philosopher Val Plumwood on the dualisms of intelligence-stupid, emotional-rational, and master-slave. Plumwood, «Politics of Reason.»
8
Turing, «Computing Machinery and Intelligence.»
9
Von Neumann, The Computer and the Brain, 44. This approach was deeply critiqued by Dreyfus, What Computers Can’t Do.
10
See Weizenbaum, «On the Impact of the Computer on Society.» After his death, Minsky was implicated in serious allegations related to convicted pedophile and rapist Jeffrey Epstein. Minsky was one of several scientists who met with Epstein and visited his island retreat where underage girls were forced to have sex with members of Epstein’s coterie. As scholar Meredith Broussard observes, this was part of a broader culture of exclusion that became endemic in AI: «As wonderfully creative as Minsky and his cohort were, they also solidified the culture of tech as a billionaire boys’ club. Math, physics, and the other ‘hard’ sciences have never been hospitable to women and people of color; tech followed this lead.» See Broussard, Artificial Unintelligence, 174.
11
Weizenbaum, Computer Power and Human Reason, 202–3.
12
Greenberger, Management and the Computer of the Future, 315.
13
Dreyfus, Alchemy and Artificial Intelligence.
14
Dreyfus, What Computers Can’t Do.
15
Ullman, Life in Code, 136–37.
16
See, as one of many examples, Poggio et al., «Why and When Can Deep – but Not Shallow – Networks Avoid the Curse of Dimensionality.»
17
Quoted in Gill, Artificial Intelligence for Society, 3.
18
Russell and Norvig, Artificial Intelligence, 30.
19
Daston, «Cloud Physiognomy.»
20
Didi-Huberman, Atlas, 5.
21
Didi-Huberman, 11.
22
Franklin and Swenarchuk, Ursula Franklin Reader, Prelude.
23
For an account of the practices of data colonization, see «Colonized by Data»; and Mbembé, Critique of Black Reason.
24
Fei-Fei Li quoted in Gershgorn, «Data That Transformed AI Research.»
25
Russell and Norvig, Artificial Intelligence, 1.
26
Bledsoe quoted in McCorduck, Machines Who Think, 136.
27
Mattern, Code and Clay, Data and Dirt, xxxiv–xxxv.
28
Ananny and Crawford, «Seeing without Knowing.»
29
Any list will always be an inadequate account of all the people and communities who have inspired and informed this work. I’m particularly grateful to these research communities: FATE (Fairness, Accountability, Transparency and Ethics) and the Social Media Collective at Microsoft Research, the AI Now Institute at NYU, the Foundations of AI working group at the École Normale Supérieure, and the Richard von Weizsäcker Visiting Fellows at the Robert Bosch Academy in Berlin.
30
Saville, «Towards Humble Geographies.»
31
For more on crowdworkers, see Gray and Suri, Ghost Work; and Roberts, Behind the Screen.
32
Canales, Tenth of a Second.
33
Zuboff, Age of Surveillance Capitalism.
34
Cetina, Epistemic Cultures, 3.
35
«Emotion Detection and Recognition (EDR) Market Size.»
36
Nelson, Tu, and Hines, «Introduction,» 5.
37
Danowski and de Castro, Ends of the World.
38
Franklin, Real World of Technology, 5.
39
Brechin, Imperial San Francisco.
40
Brechin, 29.
41
Agricola quoted in Brechin, 25.
42
Quoted in Brechin, 50.
43
Brechin, 69.
44
See, e.g., Davies and Young, Tales from the Dark Side of the City; and «Grey Goldmine.»
45
For more on the street-level changes in San Francisco, see Bloomfield, «History of the California Historical Society’s New Mission Street Neighborhood.»
46
«Street Homelessness.» See also «Counterpoints: An Atlas of Displacement and Resistance.»
47
Gee, «San Francisco or Mumbai?»
48
H. W. Turner published a detailed geological survey of the Silver Peak area in July 1909. In beautiful prose, Turner extolled the geological variety within what he described as «slopes of cream and pink tuffs, and little hillocks of a bright brick red.» Turner, «Contribution to the Geology of the Silver Peak Quadrangle, Nevada,» 228.
49
Lambert, «Breakdown of Raw Materials in Tesla’s Batteries and Possible Bottlenecks.»
50
Bullis, «Lithium-Ion Battery.»
51
«Chinese Lithium Giant Agrees to Three-Year Pact to Supply Tesla.»
52
Wald, «Tesla Is a Battery Business.»
53
Scheyder, «Tesla Expects Global Shortage.»
54
Wade, «Tesla’s Electric Cars Aren’t as Green.»
55
Business Council for Sustainable Energy, «2019 Sustainable Energy in America Factbook»; U.S. Energy Information Administration, «What Is U.S. Electricity Generation by Energy Source?»
56
Whittaker et al., AI Now Report 2018.
57
Parikka, Geology of Media, vii–viii; McLuhan, Understanding Media.
58
Ely, «Life Expectancy of Electronics.»
59
Sandro Mezzadra and Brett Neilson use the term «extractivism» to name the relation between different forms of extractive operations in contemporary capitalism, which we see repeated in the context of the AI industry. Mezzadra and Neilson, «Multiple Frontiers of Extraction.»
60
Nassar et al., «Evaluating the Mineral Commodity Supply Risk of the US Manufacturing Sector.»
61
Mumford, Technics and Civilization, 74.
62
See, e.g., Ayogu and Lewis, «Conflict Minerals.»
63
Burke, «Congo Violence Fuels Fears of Return to 90s Bloodbath.»
64
«Congo’s Bloody Coltan.»
65
«Congo’s Bloody Coltan.»
66
«Transforming Intel’s Supply Chain with Real-Time Analytics.»
67
See, e.g., an open letter from seventy signatories that criticizes the limitations of the so-called conflict-free certification process: «An Open Letter.»
68
«Responsible Minerals Policy and Due Diligence.»
69
In The Elements of Power, David S. Abraham describes the invisible networks of rare metals traders in global electronics supply chains: «The network to get rare metals from the mine to your laptop travels through a murky network of traders, processors, and component manufacturers. Traders are the middlemen who do more than buy and sell rare metals: they help to regulate information and are the hidden link that helps in navigating the network between metals plants and the components in our laptops» (89).
70
«Responsible Minerals Sourcing.»
71
Liu, «Chinese Mining Dump.»
72
«Bayan Obo Deposit.»
73
Maughan, «Dystopian Lake Filled by the World’s Tech Lust.»
74
Hird, «Waste, Landfills, and an Environmental Ethics of Vulnerability,» 105.
75
Abraham, Elements of Power, 175.
76
Abraham, 176.
77
Simpson, «Deadly Tin Inside Your Smartphone.»
78
Hodal, «Death Metal.»
79
Hodal.
80
Tully, «Victorian Ecological Disaster.»
81
Starosielski, Undersea Network, 34.
82
See Couldry and Mejías, Costs of Connection, 46.
83
Couldry and Mejías, 574.
84
For a superb account of the history of undersea cables, see Starosielski, Undersea Network.
85
Dryer, «Designing Certainty,» 45.
86
Dryer, 46.
87
Dryer, 266–68.
88
More people are now drawing attention to this problem – including researchers at AI Now. See Dobbe and Whittaker, «AI and Climate Change.»
89
See, as an example of early scholarship in this area, Ensmenger, «Computation, Materiality, and the Global Environment.»
90
Hu, Prehistory of the Cloud, 146.
91
Jones, «How to Stop Data Centres from Gobbling Up the World’s Electricity.» Some progress has been made toward mitigating these concerns through greater energy efficiency practices, but significant long-term challenges remain. Masanet et al., «Recalibrating Global Data Center Energy-Use Estimates.»
92
Belkhir and Elmeligi, «Assessing ICT Global Emissions Footprint»; Andrae and Edler, «On Global Electricity Usage.»
93
Strubell, Ganesh, and McCallum, «Energy and Policy Considerations for Deep Learning in NLP.»
94
Strubell, Ganesh, and McCallum.
95
Sutton, «Bitter Lesson.»
96
«AI and Compute.»
97
Cook et al., Clicking Clean.
98
Ghaffary, «More Than 1,000 Google Employees Signed a Letter.» See also «Apple Commits to Be 100 Percent Carbon Neutral»; Harrabin, «Google Says Its Carbon Footprint Is Now Zero»; Smith, «Microsoft Will Be Carbon Negative by 2030.»
99
«Powering the Cloud.»
100
«Powering the Cloud.»
101
«Powering the Cloud.»
102
Hogan, «Data Flows and Water Woes.»
103
«Off Now.»
104
Carlisle, «Shutting Off NSA’s Water Gains Support.»
105
Materiality is a complex concept, and there is a lengthy literature that contends with it in such fields as STS, anthropology, and media studies. In one sense, materiality refers to what Leah Lievrouw describes as «the physical character and existence of objects and artifacts that makes them useful and usable for certain purposes under particular conditions.» Lievrouw quoted in Gillespie, Boczkowski, and Foot, Media Technologies, 25. But as Diana Coole and Samantha Frost write, «Materiality is always something more than ‘mere’ matter: an excess, force, vitality, relationality, or difference that renders matter active, self-creative, productive, unproductive.» Coole and Frost, New Materialisms, 9.
106
United Nations Conference on Trade and Development, Review of Maritime Transport, 2017.
107
George, Ninety Percent of Everything, 4.
108
Schlanger, «If Shipping Were a Country.»
109
Vidal, «Health Risks of Shipping Pollution.»
110
«Containers Lost at Sea – 2017 Update.»
111
Adams, «Lost at Sea.»
112
Mumford, Myth of the Machine.
113
Labban, «Deterritorializing Extraction.» For an expansion on this idea, see Arboleda, Planetary Mine.
114
Ananny and Crawford, «Seeing without Knowing.»
115
Wilson, «Amazon and Target Race.»
116
Lingel and Crawford, «Alexa, Tell Me about Your Mother.»
117
Federici, Wages against Housework; Gregg, Counterproductive.
118
In The Utopia of Rules, David Graeber details the sense of loss experienced by white-collar workers who now have to enter data into the decision-making systems that have replaced specialist administrative support staff in most professional workplaces.
119
Smith, Wealth of Nations, 4–5.
120
Marx and Engels, Marx-Engels Reader, 479. Marx expanded on this notion of the worker as an «appendage» in Capital, vol. 1: «In handicrafts and manufacture, the worker makes use of a tool; in the factory, the machine makes use of him. There the movements of the instrument of labor proceed from him, here it is the movements of the machine that he must follow. In manufacture the workers are parts of a living mechanism. In the factory we have a lifeless mechanism which is independent of the workers, who are incorporated into it as its living appendages.» Marx, Das Kapital, 548–49.
121
Luxemburg, «Practical Economies,» 444.
122
Thompson, «Time, Work-Discipline, and Industrial Capitalism.»
123
Thompson, 88–90.
124
Werrett, «Potemkin and the Panopticon,» 6.
125
See, e.g., Cooper, «Portsmouth System of Manufacture.»
126
Foucault, Discipline and Punish; Horne and Maly, Inspection House.
127
Mirzoeff, Right to Look, 58.
128
Mirzoeff, 55.
129
Mirzoeff, 56.
130
Gray and Suri, Ghost Work.
131
Irani, «Hidden Faces of Automation.»
132
Yuan, «How Cheap Labor Drives China’s A.I. Ambitions»; Gray and Suri, «Humans Working behind the AI Curtain.»
133
Berg et al., Digital Labour Platforms.
134
Roberts, Behind the Screen; Gillespie, Custodians of the Internet, 111–40.
135
Silberman et al., «Responsible Research with Crowds.»
136
Silberman et al.
137
Huet, «Humans Hiding behind the Chatbots.»
138
Huet.
139
See Sadowski, «Potemkin AI.»
140
Taylor, «Automation Charade.»
141
Taylor.
142
Gray and Suri, Ghost Work.
143
Standage, Turk, 23.
144
Standage, 23.
145
See, e.g., Aytes, «Return of the Crowds,» 80.
146
Irani, «Difference and Dependence among Digital Workers,» 225.
147
Pontin, «Artificial Intelligence.»
148
Menabrea and Lovelace, «Sketch of the Analytical Engine.»
149
Babbage, On the Economy of Machinery and Manufactures, 39–43.
150
Babbage evidently acquired an interest in quality-control processes while trying (vainly) to establish a reliable supply chain for the components of his calculating engines.
151
Schaffer, «Babbage’s Calculating Engines and the Factory System,» 280.
152
Taylor, People’s Platform, 42.
153
Katz and Krueger, «Rise and Nature of Alternative Work Arrangements.»
154
Rehmann, «Taylorism and Fordism in the Stockyards,» 26.
155
Braverman, Labor and Monopoly Capital, 56, 67; Specht, Red Meat Republic.
156
Taylor, Principles of Scientific Management.
157
Marx, Poverty of Philosophy, 22.
158
Qiu, Gregg, and Crawford, «Circuits of Labour»; Qiu, Goodbye iSlave.
159
Markoff, «Skilled Work, without the Worker.»
160
Guendelsberger, On the Clock, 22.
161
Greenhouse, «McDonald’s Workers File Wage Suits.»
162
Greenhouse.
163
Mayhew and Quinlan, «Fordism in the Fast Food Industry.»
164
Ajunwa, Crawford, and Schultz, «Limitless Worker Surveillance.»
165
Mikel, «WeWork Just Made a Disturbing Acquisition.»
166
Mahdawi, «Domino’s ‘Pizza Checker’ Is Just the Beginning.»
167
Wajcman, «How Silicon Valley Sets Time.»
168
Wajcman, 1277.
169
Gora, Herzog, and Tripathi, «Clock Synchronization.»
170
Eglash, «Broken Metaphor,» 361.
171
Kemeny and Kurtz, «Dartmouth Timesharing,» 223.
172
Eglash, «Broken Metaphor,» 364.
173
Brewer, «Spanner, TrueTime.»
174
Corbett et al., «Spanner,» 14, cited in House, «Synchronizing Uncertainty,» 124.
175
Galison, Einstein’s Clocks, Poincaré’s Maps, 104.
176
Galison, 112.
177
Colligan and Linley, «Media, Technology, and Literature,» 246.
178
Carey, «Technology and Ideology.»
179
Carey, 13.
180
This contrasts with what Foucault called the «microphysics of power,» his term for how institutions and apparatuses create particular logics and forms of validity. Foucault, Discipline and Punish, 26.
181
Spargo, Syndicalism, Industrial Unionism, and Socialism.
182
Personal conversation with the author at an Amazon fulfillment center tour, Robbinsville, N.J., October 8, 2019.
183
Muse, «Organizing Tech.»
184
Abdi Muse, personal conversation with the author, October 2, 2019.
185
Gurley, «60 Amazon Workers Walked Out.»
186
Muse quoted in Organizing Tech.
187
Desai quoted in Organizing Tech.
188
Estreicher and Owens, «Labor Board Wrongly Rejects Employee Access to Company Email.»
189
This observation comes from conversations with various labor organizers, tech workers, and researchers, including Astra Taylor, Dan Greene, Bo Daley, and Meredith Whittaker.
190
Kerr, «Tech Workers Protest in SF.»
191
National Institute of Standards and Technology (NIST), «Special Database 32 – Multiple Encounter Dataset (MEDS).»
192
Russell, Open Standards and the Digital Age.
193
Researchers at NIST (then the National Bureau of Standards, NBS) began working on the first version of the FBI’s Automated Fingerprint Identification System in the late 1960s. See Garris and Wilson, «NIST Biometrics Evaluations and Developments,» 1.
194
Garris and Wilson, 1.
195
Garris and Wilson, 12.
196
Sekula, «Body and the Archive,» 7.
197
Sekula, 18–19.
198
Sekula, 17.
199
See, e.g., Grother et al., «2017 IARPA Face Recognition Prize Challenge (FRPC).»
200
See, e.g., Ever AI, «Ever AI Leads All US Companies.»
201
Founds et al., «NIST Special Database 32.»
202
Curry et al., «NIST Special Database 32 Multiple Encounter Dataset I (MEDS-I),» 8.
203
See, e.g., Jaton, «We Get the Algorithms of Our Ground Truths.»
204
Nilsson, Quest for Artificial Intelligence, 398.
205
«ImageNet Large Scale Visual Recognition Competition (ILSVRC).»
206
In the late 1970s, Ryszard Michalski wrote an algorithm based on symbolic variables and logical rules. This approach was popular in the 1980s and 1990s, but as the rules of decision-making and qualification became more complex, it became less usable. At the same moment, the potential of using large training sets triggered a shift from this conceptual clustering to contemporary machine learning approaches. Michalski, «Pattern Recognition as Rule-Guided Inductive Inference.»
207
Bush, «As We May Think.»
208
Light, «When Computers Were Women»; Hicks, Programmed Inequality.
209
As described in Russell and Norvig, Artificial Intelligence, 546.
210
Li, «Divination Engines,» 143.
211
Li, 144.
212
Brown and Mercer, «Oh, Yes, Everything’s Right on Schedule, Fred.»
213
Lem, «First Sally (A), or Trurl’s Electronic Bard,» 199.
214
Lem, 199.
215
Brown and Mercer, «Oh, Yes, Everything’s Right on Schedule, Fred.»
216
Marcus, Marcinkiewicz, and Santorini, «Building a Large Annotated Corpus of English.»
217
Klimt and Yang, «Enron Corpus.»
218
Wood, Massey, and Brownell, «FERC Order Directing Release of Information,» 12.
219
Heller, «What the Enron Emails Say about Us.»
220
Baker et al., «Research Developments and Directions in Speech Recognition.»
221
I have participated in early work to address this gap. See, e. g., Gebru et al., «Datasheets for Datasets.» Other researchers have also sought to address this problem for AI models; see Mitchell et al., «Model Cards for Model Reporting»; Raji and Buolamwini, «Actionable Auditing.»
222
Phillips, Rauss, and Der, «FERET (Face Recognition Technology) Recognition Algorithm Development and Test Results,» 9.
223
Phillips, Rauss, and Der, 61.
224
Phillips, Rauss, and Der, 12.
225
See Aslam, «Facebook by the Numbers (2019)»; and «Advertising on Twitter.»
226
Fei-Fei Li, as quoted in Gershgorn, «Data That Transformed AI Research.»
227
Deng et al., «ImageNet.»
228
Gershgorn, «Data That Transformed AI Research.»
229
Gershgorn.
230
Markoff, «Seeking a Better Way to Find Web Images.»
231
Hernandez, «CU Colorado Springs Students Secretly Photographed.»
232
Zhang et al., «Multi-Target, Multi-Camera Tracking by Hierarchical Clustering.»
233
Sheridan, «Duke Study Recorded Thousands of Students’ Faces.»
234
Harvey and LaPlace, «Brainwash Dataset.»
235
Locker, «Microsoft, Duke, and Stanford Quietly Delete Databases.»
236
Murgia and Harlow, «Who’s Using Your Face?» When the Financial Times exposed the contents of this dataset, Microsoft removed the set from the internet, and a spokesperson for Microsoft claimed simply that it was removed «because the research challenge is over.» Locker, «Microsoft, Duke, and Stanford Quietly Delete Databases.»
237
Franceschi-Bicchierai, «Redditor Cracks Anonymous Data Trove.»
238
Tockar, «Riding with the Stars.»
239
Crawford and Schultz, «Big Data and Due Process.»
240
Franceschi-Bicchierai, «Redditor Cracks Anonymous Data Trove.»
241
Nilsson, Quest for Artificial Intelligence, 495.
242
And, as Geoff Bowker famously reminds us, «Raw data is both an oxymoron and a bad idea; to the contrary, data should be cooked with care.» Bowker, Memory Practices in the Sciences, 184–85.
243
Fourcade and Healy, «Seeing Like a Market,» 13, emphasis added.
244
Meyer and Jepperson, «‘Actors’ of Modern Society.»
245
Gitelman, «Raw Data» Is an Oxymoron, 3.
246
Many scholars have looked closely at the work these metaphors do. Media studies professors Cornelius Puschmann and Jean Burgess analyzed the common data metaphors and noted two widespread categories: data «as a natural force to be controlled and [data] as a resource to be consumed.» Puschmann and Burgess, «Big Data, Big Questions,» abstract. Researchers Tim Hwang and Karen Levy suggest that describing data as «the new oil» carries connotations of being costly to acquire but also suggests the possibility of «big payoffs for those with the means to extract it.» Hwang and Levy, «‘The Cloud’ and Other Dangerous Metaphors.»
247
Stark and Hoffmann, «Data Is the New What?»
248
Media scholars Nick Couldry and Ulises Mejías call this «data colonialism,» which is steeped in the historical, predatory practices of colonialism but married to (and obscured by) contemporary computing methods. However, as other scholars have shown, this terminology is double-edged because it can occlude the real and ongoing harms of colonialism. Couldry and Mejías, «Data Colonialism»; Couldry and Mejías, Costs of Connection; Segura and Waisbord, «Between Data Capitalism and Data Citizenship.»
249
They refer to this form of capital as «ubercapital.» Fourcade and Healy, «Seeing Like a Market,» 19.
250
Sadowski, «When Data Is Capital,» 8.
251
Sadowski, 9.
252
Here I’m drawing from a history of human subjects review and large-scale data studies coauthored with Jake Metcalf. See Metcalf and Crawford, «Where Are Human Subjects in Big Data Research?»
253
«Federal Policy for the Protection of Human Subjects.»
254
See Metcalf and Crawford, «Where Are Human Subjects in Big Data Research?»
255
Seo et al., «Partially Generative Neural Networks.» Jeffrey Brantingham, one of the authors, is also a co-founder of the controversial predictive policing company PredPol. See Winston and Burrington, «A Pioneer in Predictive Policing.»
256
«CalGang Criminal Intelligence System.»
257
Libby, «Scathing Audit Bolsters Critics’ Fears.»
258
Hutson, «Artificial Intelligence Could Identify Gang Crimes.»
259
Hoffmann, «Data Violence and How Bad Engineering Choices Can Damage Society.»
260
Weizenbaum, Computer Power and Human Reason, 266.
261
Weizenbaum, 275–76.
262
Weizenbaum, 276.
263
For more on the history of extraction of data and insights from marginalized communities, see Costanza-Chock, Design Justice; and D’Ignazio and Klein, Data Feminism.
264
Revell, «Google DeepMind’s NHS Data Deal ‘Failed to Comply.’»
265
«Royal Free-Google DeepMind Trial Failed to Comply.»
266
Fabian, Skull Collectors.
267
Gould, Mismeasure of Man, 83.
268
Kolbert, «There’s No Scientific Basis for Race.»
269
Keel, «Religion, Polygenism and the Early Science of Human Origins.»
270
Thomas, Skull Wars.
271
Thomas, 85.
272
Kendi, «History of Race and Racism in America.»
273
Gould, Mismeasure of Man, 88.
274
Mitchell, «Fault in His Seeds.»
275
Horowitz, «Why Brain Size Doesn’t Correlate with Intelligence.»
276
Mitchell, «Fault in His Seeds.»
277
Gould, Mismeasure of Man, 58.
278
West, «Genealogy of Modern Racism,» 91.
279
Bouche and Rivard, «America’s Hidden History.»
280
Bowker and Star, Sorting Things Out, 319.
281
Bowker and Star, 319.
282
Nedlund, «Apple Card Is Accused of Gender Bias»; Angwin et al., «Machine Bias»; Angwin et al., «Dozens of Companies Are Using Facebook to Exclude.»
283
Dougherty, «Google Photos Mistakenly Labels Black People ‘Gorillas’»; Perez, «Microsoft Silences Its New A.I. Bot Tay»; McMillan, «It’s Not You, It’s It»; Sloane, «Online Ads for High-Paying Jobs Are Targeting Men More Than Women.»
284
See Benjamin, Race after Technology; and Noble, Algorithms of Oppression.
285
Greene, «Science May Have Cured Biased AI»; Natarajan, «Amazon and NSF Collaborate to Accelerate Fairness in AI Research.»
286
Dastin, «Amazon Scraps Secret AI Recruiting Tool.»
287
Dastin.
288
This is part of a larger trend toward automating aspects of hiring. For a detailed account, see Ajunwa and Greene, «Platforms at Work.»
289
There are several superb accounts of the history of inequality and discrimination in computation. These are a few that have informed my thinking on these issues: Hicks, Programmed Inequality; McIlwain, Black Software; Light, «When Computers Were Women»; and Ensmenger, Computer Boys Take Over.
290
Cetina, Epistemic Cultures, 3.
291
Merler et al., «Diversity in Faces.»
292
Buolamwini and Gebru, «Gender Shades»; Raji et al., «Saving Face.»
293
Merler et al., «Diversity in Faces.»
294
«YFCC100M Core Dataset.»
295
Merler et al., «Diversity in Faces,» 1.
296
There are many excellent books on these issues, but in particular, see Roberts, Fatal Invention, 18–41; and Nelson, Social Life of DNA. See also Tishkoff and Kidd, «Implications of Biogeography.»
297
Browne, «Digital Epidermalization,» 135.
298
Benthall and Haynes, «Racial Categories in Machine Learning.»
299
Mitchell, «Need for Biases in Learning Generalizations.»
300
Dietterich and Kong, «Machine Learning Bias, Statistical Bias.»
301
Domingos, «Useful Things to Know about Machine Learning.»
302
Maddox v. State, 32 Ga. 587, 79 Am. Dec. 307; Pierson v. State, 18 Tex. App. 558; Hinkle v. State, 94 Ga. 595, 21 S.E. 601.
303
Tversky and Kahneman, «Judgment under Uncertainty.»
304
Greenwald and Krieger, «Implicit Bias,» 951.
305
Fellbaum, WordNet, xviii. Below I am drawing on research into ImageNet conducted with Trevor Paglen. See Crawford and Paglen, «Excavating AI.»
306
Fellbaum, xix.
307
Nelson and Kucera, Brown Corpus Manual.
308
Borges, «The Analytical Language of John Wilkins.»
309
These are some of the categories that have now been deleted entirely from ImageNet as of October 1, 2020.
310
See Keyes, «Misgendering Machines.»
311
Drescher, «Out of DSM.»
312
See Bayer, Homosexuality and American Psychiatry.
313
Keyes, «Misgendering Machines.»
314
Hacking, «Making Up People,» 23.
315
Bowker and Star, Sorting Things Out, 196.
316
This is drawn from Lakoff, Women, Fire, and Dangerous Things.
317
ImageNet Roulette was one of the outputs of a multiyear research collaboration between the artist Trevor Paglen and me, in which we studied the underlying logic of multiple benchmark training sets in AI. ImageNet Roulette, led by Paglen and produced by Leif Ryge, was an app that allowed people to interact with a neural net trained on the «person» category of ImageNet. People could upload images of themselves – or news images or historical photographs – to see how ImageNet would label them. People could also see how many of the labels are bizarre, racist, misogynist, and otherwise problematic. The app was designed to show people these concerning labels while warning them in advance of the potential results. All uploaded image data were immediately deleted on processing. See Crawford and Paglen, «Excavating AI.»
318
Yang et al., «Towards Fairer Datasets,» paragraph 4.2.
319
Yang et al., paragraph 4.3.
320
Markoff, «Seeking a Better Way to Find Web Images.»
321
Browne, Dark Matters, 114.
322
Scheuerman et al., «How We’ve Taught Algorithms to See Identity.»
323
UTKFace Large Scale Face Dataset, https://susanqq.github.io/UTKFace.
324
Bowker and Star, Sorting Things Out, 197.
325
Bowker and Star, 198.
326
Edwards and Hecht, «History and the Technopolitics of Identity,» 627.
327
Haraway, Modest_Witness@Second_Millennium, 234.
328
Stark, «Facial Recognition Is the Plutonium of AI,» 53.
329
In order of the examples, see Wang and Kosinski, «Deep Neural Networks Are More Accurate than Humans»; Wu and Zhang, «Automated Inference on Criminality Using Face Images»; and Angwin et al., «Machine Bias.»
330
Agüera y Arcas, Mitchell, and Todorov, «Physiognomy’s New Clothes.»
331
Nielsen, Disability History of the United States; Kafer, Feminist, Queer, Crip; Siebers, Disability Theory.
332
Whittaker et al., «Disability, Bias, and AI.»
333
Hacking, «Kinds of People,» 289.
334
Bowker and Star, Sorting Things Out, 31.
335
Bowker and Star, 6.
336
Eco, Infinity of Lists.
337
Douglass, «West India Emancipation.»
338
Particular thanks to Alex Campolo, who was my research assistant and interlocutor for this chapter, and for his research into Ekman and the history of emotions.
339
«Emotion Detection and Recognition»; Schwartz, «Don’t Look Now.»
340
Ohtake, «Psychologist Paul Ekman Delights at Exploratorium.»
341
Ekman, Emotions Revealed, 7.
342
For an overview of researchers who have found flaws in the claim that emotional expressions are universal and can be predicted by AI, see Heaven, «Why Faces Don’t Always Tell the Truth.»
343
Barrett et al., «Emotional Expressions Reconsidered.»
344
Nilsson, «How AI Helps Recruiters.»
345
Sánchez-Monedero and Dencik, «Datafication of the Workplace,» 48; Harwell, «Face-Scanning Algorithm.»
346
Byford, «Apple Buys Emotient.»
347
Molnar, Robbins, and Pierson, «Cutting Edge.»
348
Picard, «Affective Computing Group.»
349
«Affectiva Human Perception AI Analyzes Complex Human States.»
350
Schwartz, «Don’t Look Now.»
351
See, e.g., Nilsson, «How AI Helps Recruiters.»
352
«Face: An AI Service That Analyzes Faces in Images.»
353
«Amazon Rekognition Improves Face Analysis»; «Amazon Rekognition – Video and Image.»
354
Barrett et al., «Emotional Expressions Reconsidered,» 1.
355
Sedgwick, Frank, and Alexander, Shame and Its Sisters, 258.
356
Tomkins, Affect Imagery Consciousness.
357
Tomkins.
358
Leys, Ascent of Affect, 18.
359
Tomkins, Affect Imagery Consciousness, 23.
360
Tomkins, 23.
361
Tomkins, 23.
362
For Ruth Leys, this «radical dissociation between feeling and cognition» is the major reason for its attractiveness to theorists in the humanities, most notably Eve Kosofsky Sedgwick, who wants to revalorize our experiences of error or confusion into new forms of freedom. Leys, Ascent of Affect, 35; Sedgwick, Touching Feeling.
363
Tomkins, Affect Imagery Consciousness, 204.
364
Tomkins, 206; Darwin, Expression of the Emotions; Duchenne (de Boulogne), Mécanisme de la physionomie humaine.
365
Tomkins, 243, quoted in Leys, Ascent of Affect, 32.
366
Tomkins, Affect Imagery Consciousness, 216.
367
Ekman, Nonverbal Messages, 45.
368
Tuschling, «Age of Affective Computing,» 186.
369
Ekman, Nonverbal Messages, 45.
370
Ekman, 46.
371
Ekman, 46.
372
Ekman, 46.
373
Ekman, 46.
374
Ekman, 46.
375
Ekman and Rosenberg, What the Face Reveals, 375.
376
Tomkins and McCarter, «What and Where Are the Primary Affects?»
377
Russell, «Is There Universal Recognition of Emotion from Facial Expression?» 116.
378
Leys, Ascent of Affect, 93.
379
Ekman and Rosenberg, What the Face Reveals, 377.
380
Ekman, Sorenson, and Friesen, «Pan-Cultural Elements in Facial Displays of Emotion,» 86, 87.
381
Ekman and Friesen, «Constants across Cultures in the Face and Emotion,» 128.
382
Aristotle, Categories, 70b8–13, 527.
383
Aristotle, 805a, 27–30, 87.
384
It would be difficult to overstate the influence of this work, which has since fallen into disrepute: by 1810 it went through sixteen German and twenty English editions. Graham, «Lavater’s Physiognomy in England,» 561.
385
Gray, About Face, 342.
386
Courtine and Haroche, Histoire du visage, 132.
387
Ekman, «Duchenne and Facial Expression of Emotion.»
388
Duchenne (de Boulogne), Mécanisme de la physionomie humaine.
389
Clarac, Massion, and Smith, «Duchenne, Charcot and Babinski,» 362–63.
390
Delaporte, Anatomy of the Passions, 33.
391
Delaporte, 48–51.
392
Daston and Galison, Objectivity.
393
Darwin, Expression of the Emotions in Man and Animals, 12, 307.
394
Leys, Ascent of Affect, 85; Russell, «Universal Recognition of Emotion,» 114.
395
Ekman and Friesen, «Nonverbal Leakage and Clues to Deception,» 93.
396
Pontin, «Lie Detection.»
397
Ekman and Friesen, «Nonverbal Leakage and Clues to Deception.» In a footnote, Ekman and Friesen explained: «Our own research and the evidence from the neurophysiology of visual perception strongly suggest that micro-expressions that are as short as one motion-picture frame (1/50 of a second) can be perceived. That these micro-expressions are not usually seen must depend upon their being embedded in other expressions which distract attention, their infrequency, or some learned perceptual habit of ignoring fast facial expressions.»
398
Ekman, Sorenson, and Friesen, «Pan-Cultural Elements in Facial Displays of Emotion,» 87.
399
Ekman, Friesen, and Tomkins, «Facial Affect Scoring Technique,» 40.
400
Ekman, Nonverbal Messages, 97.
401
Ekman, 102.
402
Ekman and Rosenberg, What the Face Reveals.
403
Ekman, Nonverbal Messages, 105.
404
Ekman, 169.
405
Ekman, 106; Aleksander, Artificial Vision for Robots.
406
«Magic from Invention.»
407
Bledsoe, «Model Method in Facial Recognition.»
408
Molnar, Robbins, and Pierson, «Cutting Edge.»
409
Kanade, Computer Recognition of Human Faces.
410
Kanade, 16.
411
Kanade, Cohn, and Tian, «Comprehensive Database for Facial Expression Analysis,» 6.
412
See Kanade, Cohn, and Tian; Lyons et al., «Coding Facial Expressions with Gabor Wavelets»; and Goeleven et al., «Karolinska Directed Emotional Faces.»
413
Lucey et al., «Extended Cohn-Kanade Dataset (CK+).»
414
McDuff et al., «Affectiva-MIT Facial Expression Dataset (AM-FED).»
415
McDuff et al.
416
Ekman and Friesen, Facial Action Coding System (FACS).
417
Foreman, «Conversation with: Paul Ekman»; Taylor, «2009 Time 100»; Paul Ekman Group.
418
Weinberger, «Airport Security,» 413.
419
Halsey, «House Member Questions $900 Million TSA ‘SPOT’ Screening Program.»
420
Ekman, «Life’s Pursuit»; Ekman, Nonverbal Messages, 79–81.
421
Mead, «Review of Darwin and Facial Expression,» 209.
422
Tomkins, Affect Imagery Consciousness, 216.
423
Mead, «Review of Darwin and Facial Expression.» See also Fridlund, «Behavioral Ecology View of Facial Displays.» Ekman later conceded to many of Mead’s points. See Ekman, «Argument for Basic Emotions»; Ekman, Emotions Revealed; and Ekman, «What Scientists Who Study Emotion Agree About.» Ekman also had his defenders. See Cowen et al., «Mapping the Passions»; and Elfenbein and Ambady, «Universality and Cultural Specificity of Emotion Recognition.»
424
Fernández-Dols and Russell, Science of Facial Expression, 4.
425
Gendron and Barrett, Facing the Past, 30.
426
Vincent, «AI ‘Emotion Recognition’ Can’t Be Trusted.» Disability studies scholars have also noted that assumptions about how biology and bodies function can also raise concerns around bias, especially when automated through technology. See Whittaker et al., «Disability, Bias, and AI.»
427
Izard, «Many Meanings/Aspects of Emotion.»
428
Leys, Ascent of Affect, 22.
429
Leys, 92.
430
Leys, 94.
431
Leys, 94.
432
Barrett, «Are Emotions Natural Kinds?» 28.
433
Barrett, 30.
434
See, e. g., Barrett et al., «Emotional Expressions Reconsidered.»
435
Barrett et al., 40.
436
Kappas, «Smile When You Read This,» 39, emphasis added.
437
Kappas, 40.
438
Barrett et al., 46.
439
Barrett et al., 47–48.
440
Barrett et al., 47, emphasis added.
441
Apelbaum, «One Thousand and One Nights.»
442
See, e. g., Hoft, «Facial, Speech and Virtual Polygraph Analysis.»
443
Rhue, «Racial Influence on Automated Perceptions of Emotions.»
444
Barrett et al., «Emotional Expressions Reconsidered,» 48.
445
See, e. g., Connor, «Chinese School Uses Facial Recognition»; and Du and Maki, «AI Cameras That Can Spot Shoplifters.»
446
NOFORN stands for Not Releasable to Foreign Nationals. «Use of the ‘Not Releasable to Foreign Nationals’ (NOFORN) Caveat.»
447
The Five Eyes is a global intelligence alliance comprising Australia, Canada, New Zealand, the United Kingdom, and the United States. «Five Eyes Intelligence Oversight and Review Council.»
448
Galison, «Removing Knowledge,» 229.
449
Risen and Poitras, «N.S.A. Report Outlined Goals for More Power»; Müller-Maguhn et al., «The NSA Breach of Telekom and Other German Firms.»
450
FOXACID is software developed by the Office of Tailored Access Operations, now Computer Network Operations, a cyberwarfare intelligence gathering unit of the NSA.
451
Schneier, «Attacking Tor.» Document available at «NSA Phishing Tactics and Man in the Middle Attacks.»
452
Swinhoe, «What Is Spear Phishing?»
453
«Strategy for Surveillance Powers.»
454
Edwards, Closed World.
455
Edwards.
456
Edwards, 198.
457
Mbembé, Necropolitics, 82.
458
Bratton, Stack, 151.
459
For an excellent account of the history of the internet in the United States, see Abbate, Inventing the Internet.
460
SHARE Foundation, «Serbian Government Is Implementing Unlawful Video Surveillance.»
461
Department of International Cooperation, Ministry of Science and Technology, «Next Generation Artificial Intelligence Development Plan.»
462
Chun, Control and Freedom; Hu, Prehistory of the Cloud, 87–88.
463
Cave and ÓhÉigeartaigh, «AI Race for Strategic Advantage.»
464
Markoff, «Pentagon Turns to Silicon Valley for Edge.»
465
Brown, Department of Defense Annual Report.
466
Martinage, «Toward a New Offset Strategy,» 5–16.
467
Carter, «Remarks on ‘the Path to an Innovative Future for Defense’»; Pellerin, «Deputy Secretary.»
468
The origins of U.S. military offsets can be traced back to December 1952, when the Soviet Union had almost ten times more conventional military divisions than the United States. President Dwight Eisenhower turned to nuclear deterrence as a way to «offset» these odds. The strategy involved not only the threat of the retaliatory power of the U.S. nuclear forces but also accelerating the growth of the U.S. weapons stockpile, as well as developing long-range jet bombers, the hydrogen bomb, and eventually intercontinental ballistic missiles. It also included increased reliance on espionage, sabotage, and covert operations. In the 1970s and 1980s, U.S. military strategy turned to computational advances in analytics and logistics, building on the influence of such military architects as Robert McNamara in search of military supremacy. This Second Offset could be seen in military engagements like Operation Desert Storm during the Gulf War in 1991, where reconnaissance, suppression of enemy defenses, and precision-guided munitions dominated how the United States not only fought the war but thought and spoke about it. Yet as Russia and China began to adopt these capacities and deploy digital networks for warfare, anxiety grew over how to reestablish a new kind of strategic advantage. See McNamara and Blight, Wilson’s Ghost.
469
Pellerin, «Deputy Secretary.»
470
Gellman and Poitras, «U.S., British Intelligence Mining Data.»
471
Deputy Secretary of Defense to Secretaries of the Military Departments et al.
472
Deputy Secretary of Defense to Secretaries of the Military Departments et al.
473
Michel, Eyes in the Sky, 134.
474
Michel, 135.
475
Cameron and Conger, «Google Is Helping the Pentagon Build AI for Drones.»
476
For example, Gebru et al., «Fine-Grained Car Detection for Visual Census Estimation.»
477
Fang, «Leaked Emails Show Google Expected Lucrative Military Drone AI Work.»
478
Bergen, «Pentagon Drone Program Is Using Google AI.»
479
Shane and Wakabayashi, «‘Business of War.’»
480
Smith, «Technology and the US Military.»
481
When the JEDI contract was ultimately awarded to Microsoft, its president, Brad Smith, explained that Microsoft won because it approached the contract «not just as a sales opportunity, but really, a very large-scale engineering project.» Stewart and Carlson, «President of Microsoft Says It Took Its Bid.»
482
Pichai, «AI at Google.»
483
Pichai. Project Maven was subsequently picked up by Anduril Industries, a secretive tech startup founded by Oculus Rift’s Palmer Luckey. Fang, «Defense Tech Startup.»
484
Whittaker et al., AI Now Report 2018.
485
Schmidt quoted in Scharre et al., «Eric Schmidt Keynote Address.»
486
As Suchman notes, «‘Killing people correctly’ under the laws of war requires adherence to the Principle of Distinction and the identification of an imminent threat.» Suchman, «Algorithmic Warfare and the Reinvention of Accuracy,» n. 18.
487
Suchman.
488
Suchman.
489
Hagendorff, «Ethics of AI Ethics.»
490
Brustein and Bergen, «Google Wants to Do Business with the Military.»
491
For more on why municipalities should more carefully assess the risks of algorithmic platforms, see Green, Smart Enough City.
492
Thiel, «Good for Google, Bad for America.»
493
Steinberger, «Does Palantir See Too Much?»
494
Weigel, «Palantir Goes to the Frankfurt School.»
495
Dilanian, «US Special Operations Forces Are Clamoring to Use Software.»
496
«War against Immigrants.»
497
Alden, «Inside Palantir, Silicon Valley’s Most Secretive Company.»
498
Alden, «Inside Palantir, Silicon Valley’s Most Secretive Company.»
499
Waldman, Chapman, and Robertson, «Palantir Knows Everything about You.»
500
Joseph, «Data Company Directly Powers Immigration Raids in Workplace»; Anzilotti, «Emails Show That ICE Uses Palantir Technology to Detain Undocumented Immigrants.»
501
Andrew Ferguson, conversation with author, June 21, 2019.
502
Brayne, «Big Data Surveillance.» Brayne also notes that the migration of law enforcement to intelligence was occurring even before the shift to predictive analytics, given such court decisions as Terry v. Ohio and Whren v. United States, which made it easier for law enforcement to circumvent probable cause and produced a proliferation of pretext stops.
503
Richardson, Schultz, and Crawford, «Dirty Data, Bad Predictions.»
504
Brayne, «Big Data Surveillance,» 997.
505
Brayne, 997.
506
See, e. g., French and Browne, «Surveillance as Social Regulation.»
507
Crawford and Schultz, «AI Systems as State Actors.»
508
Cohen, Between Truth and Power; Calo and Citron, «Automated Administrative State.»
509
«Vigilant Solutions»; Maass and Lipton, «What We Learned.»
510
Newman, «Internal Docs Show How ICE Gets Surveillance Help.»
511
England, «UK Police’s Facial Recognition System.»
512
Scott, Seeing Like a State.
513
Haskins, «How Ring Transmits Fear to American Suburbs.»
514
Haskins, «Amazon’s Home Security Company.»
515
Haskins.
516
Haskins, «Amazon Requires Police to Shill Surveillance Cameras.»
517
Haskins, «Amazon Is Coaching Cops.»
518
Haskins.
519
Haskins.
520
Hu, Prehistory of the Cloud, 115.
521
Hu, 115.
522
Benson, «‘Kill ’Em and Sort It Out Later,’» 17.
523
Hajjar, «Lawfare and Armed Conflicts,» 70.
524
Scahill and Greenwald, «NSA’s Secret Role in the U.S. Assassination Program.»
525
Cole, «‘We Kill People Based on Metadata.’»
526
Priest, «NSA Growth Fueled by Need to Target Terrorists.»
527
Gibson quoted in Ackerman, «41 Men Targeted but 1,147 People Killed.»
528
Tucker, «Refugee or Terrorist?»
529
Tucker.
530
O’Neil, Weapons of Math Destruction, 288–326.
531
Fourcade and Healy, «Seeing Like a Market.»
532
Eubanks, Automating Inequality.
533
Richardson, Schultz, and Southerland, «Litigating Algorithms,» 19.
534
Richardson, Schultz, and Southerland, 23.
535
Agre, Computation and Human Experience, 240.
536
Bratton, Stack, 140.
537
Hu, Prehistory of the Cloud, 89.
538
Nakashima and Warrick, «For NSA Chief, Terrorist Threat Drives Passion.»
539
Document available at Maass, «Summit Fever.»
540
The future of the Snowden archive itself is uncertain. In March 2019, it was announced that the Intercept – the publication that Glenn Greenwald established with Laura Poitras and Jeremy Scahill after they shared the Pulitzer Prize for their reporting on the Snowden materials – was no longer going to fund the Snowden archive. Tani, «Intercept Shuts Down Access to Snowden Trove.»
541
Silver et al., «Mastering the Game of Go without Human Knowledge.»
542
Silver et al., 357.
543
Full talk at the Artificial Intelligence Channel: Demis Hassabis, DeepMind – Learning from First Principles. See also Knight, «Alpha Zero’s ‘Alien’ Chess Shows the Power.»
544
Demis Hassabis, DeepMind – Learning from First Principles.
545
For more on the myths of «magic» in AI, see Elish and boyd, «Situating Methods in the Magic of Big Data and AI.»
546
Meredith Broussard notes that playing games has been dangerously conflated with intelligence. She cites the programmer George V. Neville-Neil, who argues: «We have had nearly 50 years of human/computer competition in the game of chess, but does this mean that any of those computers are intelligent? No, it does not – for two reasons. The first is that chess is not a test of intelligence; it is the test of a particular skill – the skill of playing chess. If I could beat a Grandmaster at chess and yet not be able to hand you the salt at the table when asked, would I be intelligent? The second reason is that thinking chess was a test of intelligence was based on a false cultural premise that brilliant chess players were brilliant minds, more gifted than those around them. Yes, many intelligent people excel at chess, but chess, or any other single skill, does not denote intelligence.» Broussard, Artificial Unintelligence, 206.
547
Galison, «Ontology of the Enemy.»
548
Campolo and Crawford, «Enchanted Determinism.»
549
Bailey, «Dimensions of Rhetoric in Conditions of Uncertainty,» 30.
550
Bostrom, Superintelligence.
551
Bostrom.
552
Strand, «Keyword: Evil,» 64–65.
553
Strand, 65.
554
Hardt and Negri, Assembly, 116, emphasis added.
555
Wakabayashi, «Google’s Shadow Work Force.»
556
Quoted in McNeil, «Two Eyes See More Than Nine,» 23.
557
On the idea of data as capital, see Sadowski, «When Data Is Capital.»
558
Harun Farocki discussed in Paglen, «Operational Images.»
559
For a summary, see Heaven, «Why Faces Don’t Always Tell the Truth.»
560
Nietzsche, Sämtliche Werke, 11:506.
561
Wang and Kosinski, «Deep Neural Networks Are More Accurate Than Humans»; Kleinberg et al., «Human Decisions and Machine Predictions»; Crosman, «Is AI a Threat to Fair Lending?»; Seo et al., «Partially Generative Neural Networks.»
562
Pugliese, «Death by Metadata.»
563
Suchman, «Algorithmic Warfare and the Reinvention of Accuracy.»
564
Simmons, «Rekor Software Adds License Plate Reader Technology.»
565
Lorde, Master’s Tools.
566
Schaake, «What Principles Not to Disrupt.»
567
Jobin, Ienca, and Vayena, «Global Landscape of AI Ethics Guidelines.»
568
Mattern, «Calculative Composition,» 572.
569
For more on why AI ethics frameworks are limited in effectiveness, see Crawford et al., AI Now 2019 Report.
570
Mittelstadt, «Principles Alone Cannot Guarantee Ethical AI.» See also Metcalf, Moss, and boyd, «Owning Ethics.»
571
For recent scholarship that addresses important practical steps to do this without replicating forms of extraction and harm, see Costanza-Chock, Design Justice.
572
Winner, The Whale and the Reactor, 9.
573
Mbembé, Critique of Black Reason, 3.
574
Bangstad et al., «Thoughts on the Planetary.»
575
Haraway, Simians, Cyborgs, and Women, 161.
576
Mohamed, Png, and Isaac, «Decolonial AI,» 405.
577
«Race after Technology, Ruha Benjamin.»
578
Bangstad et al., «Thoughts on the Planetary.»
579
Blue Origin’s Mission.
580
Blue Origin’s Mission.
581
Powell, «Jeff Bezos Foresees a Trillion People.»
582
Bezos, Going to Space to Benefit Earth.
583
Bezos.
584
Foer, «Jeff Bezos’s Master Plan.»
585
Foer.
586
«Why Asteroids.»
587
Welch, «Elon Musk.»
588
Cuthbertson, «Elon Musk Really Wants to ‘Nuke Mars.’»
589
Rein, Tamayo, and Vokrouhlicky, «Random Walk of Cars.»
590
Gates, «Bezos’ Blue Origin Seeks Tax Incentives.»
591
Marx, «Instead of Throwing Money at the Moon»; O’Neill, High Frontier.
592
«Our Mission.»
593
Davis, «Gerard K. O’Neill on Space Colonies.»
594
Meadows et al., Limits to Growth.
595
Scharmen, Space Settlements. In recent years, scholars have suggested that the Club of Rome’s models were overly optimistic, underestimating the rapid rate of extraction and resource consumption worldwide and the climate implications of greenhouse gases and industrial waste heat. See Turner, «Is Global Collapse Imminent?»
596
The case for a no-growth model that involves staying on the planet has been made by many academics in the limits to growth movement. See, e. g., Trainer, Renewable Energy Cannot Sustain a Consumer Society.
597
Scharmen, Space Settlements, 91.
598
One wonders how the Bezos mission would differ had he been inspired instead by the science fiction author Philip K. Dick, who wrote the short story «Autofac.» In it, human survivors of an apocalyptic war are left on Earth with the «autofacs»: autonomous, self-replicating factory machines. The autofacs had been tasked with producing consumer goods in prewar society but could no longer stop, consuming the planet’s resources and threatening the survival of the last people left. The only way to survive was to trick the artificial intelligence machines into fighting each other over a critical element they needed for manufacturing: tungsten. The ruse seems to succeed: wild vines begin to grow throughout the factories, and farmers can return to the land. Only later do the survivors realize that the autofacs had sought more resources deep in Earth’s core and would soon launch thousands of self-replicating «seeds» to mine the rest of the galaxy. Dick, «Autofac.»
599
NASA, «Outer Space Treaty of 1967.»
600
«U.S. Commercial Space Launch Competitiveness Act.»
601
Wilson, «Top Lobbying Victories of 2015.»
602
Shaer, «Asteroid Miner’s Guide to the Galaxy.»
603
As Mark Andrejevic writes, «The promise of technological immortality is inseparable from automation, which offers to supplant human limitations at every turn.» Andrejevic, Automated Media, 1.
604
Reichhardt, «First Photo from Space.»
605
See, e. g., Pulitzer Prize-winning journalist Wayne Biddle’s account of von Braun as a war criminal who participated in the brutal treatment of slave laborers under the Nazi regime. Biddle, Dark Side of the Moon.
606
Grigorieff, «Mittelwerk/Mittelbau/Camp Dora.»
607
Ward, Dr. Space.
608
Keates, «Many Places Amazon CEO Jeff Bezos Calls Home.»
609
Center for Land Use Interpretation, «Figure 2 Ranch, Texas.»