The pandemic has taught us a lesson we need to learn again, says Alondra Nelson: science and technology have everything to do with the problems of society, with inequality, and with social life.
After a year in which science became politicized in the midst of a pandemic and a presidential campaign, in January President-elect Joe Biden appointed Nelson to a newly created science and society post at the White House Office of Science and Technology Policy. Nelson will build a science and society division within OSTP aimed at addressing issues ranging from data and democracy to STEM education. Separately, Biden made his science adviser, Eric Lander, who also serves as director of OSTP, part of his cabinet.
Nelson has spent her career at the intersection of race, technology, and society, writing on topics such as how Afrofuturism can make the world a better place and how the Black Panthers used health care as a form of activism, which led the organization to develop an early interest in genetics. She is the author of several books, including The Social Life of DNA, which chronicles the rise of the consumer genetic testing industry and how a desire to learn about their ancestry led Black and Mormon people to become early adopters of the technology.
Nelson is a professor at the Institute for Advanced Study in Princeton, New Jersey. Before her appointment, she was writing a book on the OSTP and the Obama administration's major science initiatives, which included a series of reports on AI and government policy.
In her first formal remarks in her new role in January, Nelson called science a social phenomenon and said that technologies such as artificial intelligence can reveal or reflect dangerous social architectures that underlie the pursuit of scientific progress. In an interview with WIRED, Nelson said the Black community in particular is overexposed to the harms of science and technology and underserved by its benefits.
In the interview, she talked about the Biden administration's science plans, why the administration does not have a formal position on banning facial recognition, and issues involving emerging technology and society that she thinks should be addressed during the administration's time in office. An edited transcript follows.
WIRED: In January you talked about "the dangerous social architecture that lies beneath the scientific progress we're pursuing," and you mentioned gene editing and artificial intelligence. What prompted you to mention gene editing and AI in your first public remarks in this role?
Alondra Nelson: I think what genetic science and AI have in common is that they are data-centric. There are things we know about data and how data analysis works at scale that are as true for large-scale genomic analysis as they are for machine learning, in some respects, so those are foundational issues. What I think we still need to grapple with as a nation are questions about where the data analyzed with AI tools come from, and questions about who gets to make decisions about which variables to use and what questions the scientific and technical research asks. What I hope is different and distinctive about this OSTP is a sense of honesty about the past. Science and technology have harmed some communities, left communities out, and left people out of doing the work of science and technology.
Working in an administration that identified racial equity and restoring trust in government as key issues from the start means that the work of science and technology policy must be truly honest about the past, and that part of restoring confidence in government, part of restoring confidence in the ability of science and technology to do all kinds of good in the world, is being really open about the history of the flaws and failures of science and technology.
Unfortunately, there are many examples. Next month will be another anniversary of the Associated Press story that revealed the Tuskegee syphilis study nearly 50 years ago, so we're coming back to that anniversary. Then, of course, we get problems in AI research where the data being used are incomplete, and that incompleteness means the inferences drawn are incomplete and inaccurate and, when used in social services and the criminal justice system in particular, have truly disproportionately harmful effects on Black and brown communities.
Lander said at his confirmation hearing that OSTP will address discrimination arising from algorithmic bias. How does that work?