Serkan Holat
The US Government, Open Source Software and Analyzing 786 Pages of Responses - Results

A semi-professional content analysis on the submitted responses to the US Government's Request for Information on Open Source Software

Part I: Results


In August, the US Government issued a public call in the form of a Request For Information (RFI) regarding open source software. The objective was to seek input from private and public sectors and start formulating a long-term strategy and action plan for the federal government to strengthen the open source ecosystem.

Here is an excerpt from the official press release:

In addition to its many benefits, the ubiquity of open-source software in commercial products, government systems, and military platforms presents unique security risks. For this reason, the White House established the Open-Source Software Security Initiative (OS3I), an interagency working group with the goal of identifying policy solutions and channeling government resources to foster greater open-source software security across the ecosystem. By working with other interagency partners, OS3I identified several focus areas, including

  • (i) increasing the proliferation of memory safe programming languages;
  • (ii) designing implementation requirements for secure, privacy-preserving security attestations;
  • and (iii) identifying and promoting focused areas for prioritization.

For more detailed information about the RFI, you can refer to the following link:
Request for Information on Open-Source Software Security: Areas of Long-Term Focus and Prioritization

National Public Fund to Finance Open Source Ecosystem

As a dedicated advocate for Agile Public Funds, I took the opportunity to contribute by submitting a 'National Public Fund to Finance Open Source Ecosystem' proposal to the RFI.

Briefly, considering that OSS falls under the public good category and that its global consumption leads to the well-known free-rider problem:

  • Can we establish dedicated funds to overcome coordination issues?
  • And can we design these funds tailored to the fast-paced nature of these new digital public goods?

Below is the Principles section for the Scalable Public Funds:

  • Proactive: The fund should proactively identify and evaluate eligible open source initiatives, reducing bureaucracy and uncertainty.
  • Scalable: The fund should be designed to scale, accounting for the continued growth of the open source ecosystem.
  • Data-Driven: Resource allocation should rely on objective metrics, ensuring unbiased distribution.
  • Transparent: The evaluation criteria, metrics, and weights should be publicly accessible.
  • Continuous: Acknowledging the ongoing contributions of OSS to the economy, the fund should commit to generating constant revenue rather than one-off payments.

To review the proposal, you can visit the following link:
National Public Fund to Finance Open Source Ecosystem

An Inevitable Analysis

The RFI has become an invaluable repository, collecting insights totaling 786 pages from 107 organizations and individuals across the industry, including major technology companies, foundations, research centers, and security firms. The fact that these public responses were limited to roughly ten pages turned out to be a convenient detail.

As a result, my initial interest in identifying reactions related to my proposal inevitably shifted into a semi-professional, purely manual content analysis.

Results

Let's start by sharing the Results document, which should speak for itself:
ONCD-2023-0002 - Content Analysis

Quick items about the document:

  • The "Notes" of each column contains the description and alternative keywords for that column's keyword.
  • If available, the "Notes" of a cell includes the paragraph related to the keyword.
  • ✖ indicates "False positive" or "Not applicable".

Methodology

All responses were converted to PDF and organized under a folder on Google Drive. Utilizing Google Drive's full-text search functionality proved more efficient than relying solely on the Regulations website.
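For anyone who prefers working locally, the same full-text search can be reproduced over the downloaded responses once they are converted to plain text. The sketch below is a minimal, hypothetical example (the folder name, file layout, and keyword are assumptions, not part of the original analysis):

```python
# Minimal local alternative to Google Drive's full-text search, assuming the
# downloaded responses have been converted to plain-text files in a folder.
import re
from pathlib import Path

def search_responses(keyword: str, folder: str = "responses") -> list[str]:
    """Return the names of response files that mention the keyword (case-insensitive)."""
    pattern = re.compile(re.escape(keyword), re.IGNORECASE)
    return sorted(
        path.name
        for path in Path(folder).glob("*.txt")
        if pattern.search(path.read_text(errors="ignore"))
    )
```

For example, `search_responses("public good")` would list every response file that uses the phrase, which is essentially what the Drive search provided.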

Feel free to refer to the Google Drive folder and download the responses:
ONCD-2023-0002 - Content Analysis - Responses

Steps

Here is the list of steps I took to produce the results.

  1. Start with a text search on Google Drive.
  2. Mark the responses in the sheet.
  3. Review the marked responses.
  4. Determine True/False positives and categorize results accordingly.
  5. Take note of Highlights and newly revealed keywords during the review.
  6. Iterate through the steps to improve the coverage.
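The steps above can be sketched in a few lines of code. Everything here is illustrative: the sample texts, the keyword, and the review decision stand in for the manual search-mark-review loop described in the list:

```python
# A sketch of steps 1-4, assuming each response is available as a plain-text string.
# The sample responses and the false-positive call below are hypothetical.

RESPONSES = {
    "org-a.pdf": "OSS is a public good maintained largely by volunteers.",
    "org-b.pdf": "Our volunteer fire department uses proprietary tools.",
}

def mark_responses(keyword: str) -> list[str]:
    """Steps 1-2: text search, then mark the responses containing the keyword."""
    return [name for name, text in RESPONSES.items() if keyword.lower() in text.lower()]

def review(marked: list[str], is_true_positive) -> dict[str, bool]:
    """Steps 3-4: review each marked response and record True/False positives."""
    return {name: is_true_positive(name) for name in marked}

marked = mark_responses("volunteer")
# Manual review decides org-b's mention is unrelated to OSS: a false positive.
results = review(marked, lambda name: name != "org-b.pdf")
```

Steps 5 and 6 are the human part of the loop: noting highlights and new keywords during review, then re-running the search with the expanded keyword list.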

No ChatGPT 🤷‍♂️

While ChatGPT could have been an excellent tool for analysis and categorization, the initial focus was on identifying and reviewing similar responses, and the analytical part evolved unexpectedly. By the time it became clear that ChatGPT could be handy, I had already completed most of the work and didn't want to introduce it in the later stages, to maintain consistency. So, next time, Chad!

Categories & Keywords

Below is the list of the categories, their associated keywords, and how many different responses each keyword appears in. Please check the "Notes" of each column in the document for details and alternative keywords.

Definitions: Which keywords does the response refer to when describing OSS?

  • Volunteer: 22 responses
  • Public good: 14 responses

Solutions: Which solutions does the response recommend?

  • Government funding: 37 responses
  • Procurement: 8 responses
  • Tax credits: 7 responses
  • Government OSPO: 4 responses

Systemic issues: Which systemic issues does the response mention?

  • Underfunded: 12 responses
  • Free rider: 6 responses
  • Bureaucracy: 4 responses
  • Cyber Resilience Act: 11 responses

Organizations: Which organizations does the response mention?

  • OpenSSF: 23 responses
  • OWASP: 12 responses
  • Sovereign Tech Fund: 10 responses
  • Open Technology Fund: 7 responses
  • Open Source Initiative: 6 responses

Quick Takes

I may write follow-up articles to expand on my findings from the analysis, coupled with specifics of why we should establish dedicated public funds to finance open technologies and the challenges around such initiatives.

For now, I will briefly share my initial remarks about the results.

Growing Demand for Public Funding

Compared to previous conversations on financing open technologies, it is promising to see public funding promoted as a feasible solution by numerous organizations. Degrees and conditions vary, but 37 of the 107 responses suggest government funding as a way to strengthen the open source ecosystem, including those from big tech companies like Amazon, Google, and Microsoft.

Following these recommendations, it would not be a surprising outcome if the US Government decided to form a dedicated public fund for open source, a move that would have a significant impact on the ecosystem.

Charity or Business

On the flip side, open source software is still predominantly defined as volunteer work rather than as a business activity that generates substantial economic value. The term "volunteer" or its alternatives appears alongside open source software in 22 responses. Similarly, only a few responses mention open technologies' social and economic benefits.

This outcome might be understandable since the RFI primarily focuses on security. Still, is it a sign that we should work on our elevator pitch for open tech across the board and emphasize its connections with real-world challenges?

Successful Organizations

Under the Organizations category, the Open Source Security Foundation (OpenSSF), established under the Linux Foundation only three years ago, stands out prominently. Undoubtedly, the RFI is perfectly in line with its scope, yet being mentioned in 23 responses, often multiple times, makes it the most referenced organization.

Similarly, despite being just one year old, the Sovereign Tech Fund has been cited as a model in almost all instances where dedicated public funding has been proposed.

A closer look at the ingredients of these organizations' success might be good homework and could give us valuable insights.

Potential Improvements

Once you start digging in a mine like this, you constantly unearth new ideas. These are my notes on potential updates that would extend the analysis.

  1. Demographics of Response Owners:
    • Include a breakdown of response owners' demographics, distinguishing individuals, foundations, universities, and corporations.
  2. Collect Links for Referenced Sources:
    • Gather links mentioned in the responses to identify the most referenced sources.
  3. Expand Categories:
    • Introduce an "Incidents" category covering notable events such as "Log4Shell," "Heartbleed," and "SolarWinds."
    • Introduce the "Digital Infrastructure/Software Supply Chain" keywords under the "Definitions" category.
    • Introduce the "Central Software Inventory" keyword under the "Solutions" category.
  4. Optimize Existing Keywords:
    • Improve the "Bureaucracy" keyword by including additional keywords such as "regulations" and "mandates" for a more extensive analysis.
    • Restructure the "Volunteer" keyword into "Who is Producing and Maintaining OSS," and improve the clarity by identifying Volunteers, Foundations, and Corporations.
    • Refine the "Government Funding" category to address items falling on either the "Strong" or "Weak" sides for better categorization.

Highlights

You can continue with the second article, where I share over forty highlights that got my attention while reviewing the responses. It's a compilation of quotes from 25 organizations, including major technology companies, well-known foundations, universities, and security firms.

Feedback

As usual, your feedback is priceless! Do you have any suggestions or questions? Please don't hesitate to share them under this article or directly in the document.
