GLOBAL RESEARCH SYNDICATE

Knight Foundation’s millions aim to make the internet less toxic

By globalresearchsyndicate · November 19, 2019 · Data Collection

Fifty years after that first connection, the internet has become famous for connecting and informing the world—and for growing monopoly fears, privacy violations, and the spread of toxic speech, disinformation, and perhaps impeachment-worthy conspiracy theories. Don’t think of these problems as merely digital, says Sam Gill, a vice president at the John S. and James L. Knight Foundation: They amount to a modern-day public health crisis and demand the kind of effort undertaken in previous eras to counter disease, filth, and social and economic ills.

“At the turn of the 19th and 20th centuries, you not only had a new class of problems as the country densified and urbanized; you had problems that we didn’t quite know how to study and talk about,” he says. “And I think we feel the same way about the current moment—that there’s a new class of problems and they defy a lot of the traditional boundaries of research.”

Hence the Knight Foundation—typically known for its funding of local journalism, communities, and the arts—said last week it was putting more than $3.5 million into projects aimed at advancing an emerging field of study around internet governance, examining ways to limit the fakery, hate, and distrust that are spreading on social networks. The initiative is meant to add to public understanding—and generate basic, often difficult-to-obtain data—that can inform an ongoing barrage of critique of the tech industry, from state lawsuits to federal investigations, from California’s privacy law to candidates’ calls for breaking up the companies.

We’re not just living through a tech backlash but the birth of a new discipline, says Gill. “We’re responding to researchers, scholars, policy makers who aim to define the field. They’re the ones grappling with the issues, whether it’s on the Hill or in an attorney general’s office in a state that’s looking at Facebook or Google, or at a university studying these issues in new ways.”

So far, the projects Knight will fund—selected through an ongoing open call for proposals—include an effort at Yale to map the economic impacts of new regulations on tech companies, a fellowship focused on disinformation at Harvard’s Berkman Klein Center for Internet & Society, and research projects at Utah State University, Stanford, and NYU studying content moderation, discrimination, and the governance of the commercial internet from the perspective of marginalized populations.

Professors Sarah T. Roberts and Safiya U. Noble, who run the UCLA Center for Critical Internet Inquiry, will apply Knight funds to their study of content moderation systems and the impacts that platform policies and algorithms have on vulnerable communities, people who “by design, bear the brunt of digital systems in the form of ‘technological redlining,’ uneven and inequitable applications of technologies on their communities.” For example, Facebook agreed to curtail discriminatory housing, employment, and credit ads earlier this year as part of a settlement with civil rights groups, but research has shown that the platform’s ad targeting systems make discrimination difficult to avoid.

The foundation’s investment is part of a broader $50 million commitment, made after the 2016 elections, to support efforts to understand how technology is affecting democracy and to reinvigorate trust in information. Most of that cash will go toward local news organizations and helping them find new revenue models, but Knight is also seeking to address the kind of challenges that stretch from newsrooms to Silicon Valley boardrooms: “how to produce, to curate, to connect people to trusted information at the speed of the internet,” says Gill.

Knight’s funds, Gill acknowledges, are “just the tip of the iceberg” in terms of the resources required to build a stronger information ecosystem: the kind of cross-sector, bipartisan effort that public health has long demanded. “Think about the kinds of public and private resources that medical research takes,” he says. “We have an NIH, we have a National Science Foundation. That’s the scale we should be thinking around this research too.”

Not all of its efforts have panned out. Last month, the foundation was one of nearly a dozen funders that yanked their support for Social Science One, an ambitious effort to give academic disinformation researchers unprecedented access to Facebook data. This summer, Facebook said it had to limit the amount of raw data it had promised to share with researchers over concerns about user privacy. The Cambridge Analytica scandal that had motivated C.E.O. Mark Zuckerberg to personally agree to the effort also fueled fears that academic researchers could again sell valuable data to the highest bidder. The funders, which included the Charles Koch Foundation, Omidyar Network’s Tech and Society Solutions Lab, and the Alfred P. Sloan Foundation, balked at Facebook’s changes.

“The data that was originally intended to be made available wasn’t, and we think that’s too bad,” says Gill.

A Facebook official in charge of the project told me earlier this month that the company was committed to providing quality data to researchers and was working to develop better techniques, based on differential privacy, that would sufficiently protect user identities while still exposing useful patterns. “We’re producing at a slower pace because we’re trying to move slowly and carefully and do this the right way,” said Chaya Nayak, head of Facebook’s Election Research Commission & Data Sharing efforts.
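Differential privacy, the approach Nayak describes, works by adding calibrated random noise to aggregate statistics so that the presence or absence of any one person’s record cannot be inferred from the published numbers. A minimal illustrative sketch of the idea, using the standard Laplace mechanism for a counting query (this is a generic textbook technique, not Facebook’s actual implementation, and the data below is invented for the example):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def private_count(records, predicate, epsilon: float) -> float:
    # A counting query has sensitivity 1: adding or removing one
    # person's record changes the true count by at most 1, so Laplace
    # noise with scale 1/epsilon yields epsilon-differential privacy.
    true_count = sum(1 for r in records if predicate(r))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical example: report how many users shared a given URL,
# with a privacy budget of epsilon = 0.5 (smaller epsilon = more noise).
shares = [{"user": i, "shared": i % 3 == 0} for i in range(1000)]
noisy = private_count(shares, lambda r: r["shared"], epsilon=0.5)
```

The trade-off Nayak alludes to is visible here: the published count is close to the truth in aggregate, but noisy enough that no single user’s behavior can be recovered, and tightening the privacy guarantee (lowering epsilon) makes the released data proportionally less precise.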

Facebook’s transparency efforts have stood out, but like other Silicon Valley giants, it remains largely a black box to outside researchers. “As someone who’s been a party to some of this, I think there’s obviously a long way to go in terms of the availability of information about what’s happening inside Facebook or any of these companies—and they don’t have a legal obligation to turn it over,” Gill says. “But I think that until we understand better, it’s going to be very difficult as a society publicly to have a collaborative conversation about how these platforms can be a part of a thriving democracy.”

Alongside projects such as Social Science One, new laws could enforce more transparency at Facebook and other tech firms or give the companies more leeway to share data. For instance, “safe harbor” protections could allow digital platform companies to share sensitive data with researchers without legal repercussions. Some of Knight’s funding is aimed at exploring those options, Gill says.

“A paradigm is starting to cohere around what regulation could look like, and I think one of the elements of that is data transparency.” Along with predictability and uniformity, corporations “want to know that the disclosures that they’re making aren’t going to be ultimately legally problematic for them. That’s a good role for a regulator.”

One proposal in Congress, the Honest Ads Act, would establish regulations for digital political advertising, including greater transparency from companies such as Facebook around the legitimacy of content and the authenticity of users. But no new laws governing internet platforms have been passed in the U.S. Congress since the misinformation wars of 2016. One of the bill’s sponsors, Senator Mark Warner, a Democrat from Virginia, said in an email that Facebook and other social media platforms had “started making some efforts to address these challenges.

“However, there’s so much more we need to do to safeguard our democracy,” Senator Warner wrote. “And if platforms refuse to comply, we need to be able to hold them responsible.”

The role of algorithms in enabling online extremism and violence has also forced lawmakers to consider modifying Section 230 of the Communications Decency Act, which shields tech companies from legal liability for user-generated content. The Federal Trade Commission and Department of Justice have ongoing investigations into the business practices of several tech companies, while almost all state attorneys general are investigating Google and Facebook for anticompetitive behavior. Most recently, Facebook’s political-ad policies have infuriated watchdogs and policy makers and exposed the challenges of policing—or not policing—”political” speech.

Paradigm shift and “extraordinary asymmetry”

Part of the challenge with crafting new regulation involves the confusion around what a well-functioning digital marketplace should look like—in effect, what public health looks like in the internet era. It’s now well understood that antitrust law isn’t fit for the era of Bezos and Zuckerberg; it’s mostly been focused on “consumer welfare,” and before that, antitrust policy trained its sights on power and innovation in manufacturing.

“Our doctrines of understanding business aren’t particularly well attuned to control over data as a competitive advantage,” says Gill. “They’re not particularly well tuned to zero-cost services, and what those markets should look like. And they’re not particularly attuned to the kind of horizontal integration that these companies can very effectively effect.” Privacy laws are a prime example of an outmoded paradigm waiting to be updated: “a paradigm in which it was really about individual sovereignty, what could I keep or could someone else, as opposed to the use of that information,” which is how privacy is increasingly defined now.

As the industry amasses and bundles giant amounts of our data, its own data remains largely hidden. “There’s an extraordinary asymmetry between their understanding of how this technology works and everybody else’s understanding,” he says.

The failures of corporate transparency, especially in big tech, echo previous eras. “That was a really big problem toward the end of the 19th century,” Gill says. “The railroad company knew better than everybody else how it was influencing markets. Now we created something to respond to that, the administrative state: The state got more complex to understand how these markets were operating. So I think there’s no question that regulation is going to have to evolve.”

Other regulations would add crucial incentives for Facebook, Google, and others to make their products safer, the way previous lawmaking has made transportation less deadly. “Automakers today are a much more proactive part of the auto safety solution, not necessarily because better, more kind-hearted, safety-conscious people run those companies, but through the negotiation of regulation and self-regulation and consumer action, through the interplay of those forces. The parameters are different today than they were in the 1940s, when we really had a serious auto death problem.

“And you’ve got different structures in these companies,” Gill says of the automakers. “We have safety engineers who work in these companies. They compete in markets where consumers now expect and are acculturated to certain kinds of safety features.”

Knight also took care to sponsor researchers from different sides of the ideological spectrum in order to bolster a diverse, bipartisan conversation. The American Antitrust Institute is a recipient, as is the American Enterprise Institute. The Economic Security Project, a left-of-center nonprofit started by Facebook founder and former New Republic owner Chris Hughes, will get $250,000 to research the impacts of economic concentration by tech companies, while the Lincoln Network, a conservative tech group, will receive the same amount to host its annual conference in San Francisco focused on “innovation policy and governance.”

“A lot of the folks that we’re funding don’t agree about what the answers are,” Gill says. “Our feeling is, we don’t know the answer. What we need is evidence. What we need is thoughtful debate and discussion about various alternatives to inform ultimately what can be quality answers over a period of time.”

Breaking up big tech is one idea; Knight is also examining new business models for internet platforms and media organizations—including nonprofit and publicly funded approaches—to build a healthy information ecosystem. The question comes down to: Is this a public good? “And if it is, do we need to approach different ways of supporting it? Should there be public subsidy for these things? Should there be philanthropic support? Should we just let the market determine the future of these organizations? I think this is a very serious question for us as a society to confront. It may be that the true public good, which is deep, verified, contextualized community information, is not alone going to be supported by the market. I think that’s possible.”

Still, he stresses that strong internet governance—and new rules—needn’t divide people along predictable liberal-conservative lines. “What’s interesting about public health as an analogy is, the fact that it became a problem at a public scale didn’t mean that the solution was exclusively public. We still have a private healthcare system. It just meant that at that moment, there were questions that we had to ask as a society, management questions to which companies and hospitals and insurers and government, doctors, patients, et cetera, were all party.”

The problems wrought by big tech now require similar collaboration and problem-solving, says Gill. “If this really is one of the biggest issues that democracy is facing in our time, then this is just the smallest amount of work that’s going to be necessary to grapple with that.”

Copyright © 2024 Globalresearchsyndicate.com
