Artificial Intelligence Is Too Important to Leave to Google and Facebook Alone – The New York Times

Americans don’t have to be beholden to the tech Goliaths to get the benefits of artificial intelligence. An alternative possibility is for government to provide the infrastructure needed for a technological future — through a public option for artificial intelligence.

Big tech companies have an extraordinary amount of data about how we behave, largely because they engage in widespread surveillance of much of our behavior. Because A.I. depends on data, these companies have a huge market advantage over start-ups and entrepreneurs — and it’s a gap that will only get wider. The lax regulatory environment hasn’t helped, either; instead, it has allowed the biggest tech companies to acquire their rivals, stifle competition and snatch up the best software engineers and data scientists. Together, these dynamics make it hard for start-ups, governments and nonprofits to develop and use artificial intelligence without relying on big tech companies, effectively ceding influence over this developing field to a private sphere distorted by anti-competitive practices.

The alternative is a public option for A.I. The public option, a term familiar from debates over health care, is a public program that provides universal access to goods and services, with a private opt-out. A public option for A.I. wouldn’t prevent companies like Google from collecting and using data. But it would provide a pathway for start-ups and public-sector organizations to develop capabilities and products that could compete with those of the tech giants. Although the Trump administration released an executive order on A.I. this year, we believe that a broader conceptual framework for the public option for A.I. — coupled with significant financial resources from Congress — is an opportunity for all levels of government to take control of technology for their constituents and engage them deeply in the development of the rules by which it is governed and used.

Our proposal has three components. The first is a public data pool that would make data accessible to registered users. Local, state and federal governments have sizable data resources that would seed this digital commons. Users would be verified, to block foreign governments, hackers and others with ill motives from gaining access, and they would be barred from using the data to engage in racial or other forms of discrimination or in microtargeted advertising.

Some of the data may be very sensitive, and access to those resources would be highly regulated. We can imagine a variety of ways that regulation and technology together could protect privacy and still foster innovation: Data could be anonymized at the source; the commons could have an interface that allowed users to derive insight from the data set, while leaving the underlying information inaccessible; less sensitive data, like weather information, could be made available in a format optimized for training A.I. What’s more, methods for safely sharing A.I. models without disclosing the underlying data are being developed today and could enable users of the data commons to collaborate on public-interest A.I. services. The federal government should also invest in researching new and better ways to protect privacy and prevent misuse.

Second, a public option for artificial intelligence would include a significant increase in research and development spending. Proponents of big tech celebrate private-sector research and are right to do so. But big tech companies, like all companies, have an incentive to fund research that will support their bottom line, and the profit motive doesn’t always mean a focus on the most important problems.

For generations, government R&D spending has been one of the central engines of economic growth and technological progress in America. Yet China is projected to spend far more than the United States on A.I. research over the next decade. A sizable increase in research funding for companies, governments and nonprofits developing public-interest technologies would help expand the types of research taking place and give scientists and engineers the option to do groundbreaking work on a broader range of problems.

Third, much of government’s A.I. work takes place in the military sector and is applied to national security problems. But health care, transportation, energy and other areas could also benefit significantly from A.I. The federal government should expand its A.I. procurement across all of these sectors as an opportunity to improve public services for all Americans. In addition, the government should ensure that its use of algorithms meets the highest ethical standards.

A public option for A.I. can’t solve every problem related to technology and surveillance, and it would require careful thinking about public governance of these programs — including a commitment to privacy and awareness of biases. But it would help address the problem of a small number of companies having virtually all power over this technology. It would facilitate the conditions for a competitive market with many players and many new innovations, all while preserving our democratic values and improving our society.

Ben Gansky (@bengansky) is executive director of Free Machine and a researcher and designer at the Institute for the Future’s Equitable Futures Lab. Michael Martin is policy director for Free Machine and head of communities at SignalFire. Ganesh Sitaraman (@ganeshsitaraman), a professor of law at Vanderbilt Law School, is a co-author of “The Public Option: How to Expand Freedom, Increase Opportunity, and Promote Equality.”

