OpenAI partners with Common Sense Media to collaborate on AI guidelines

Image Credits: Bryce Durbin / TechCrunch

OpenAI hopes to win the trust of parents — and policymakers — by partnering with organizations that work to minimize tech and media harms to kids, preteens and teens.

Case in point: OpenAI today announced a partnership with Common Sense Media, the nonprofit organization that reviews and ranks the suitability of various media and tech for kids, to collaborate on AI guidelines and education materials for parents, educators and young adults.

As a part of the partnership, OpenAI will work with Common Sense Media to curate “family-friendly” GPTs — chatbot apps powered by OpenAI’s GenAI models — in the GPT Store, OpenAI’s GPT marketplace, based on Common Sense’s rating and evaluation standards, OpenAI CEO Sam Altman says.

“AI offers incredible benefits for families and teens, and our partnership with Common Sense will further strengthen our safety work, ensuring that families and teens can use our tools with confidence,” Altman added in a canned statement.

The launch of the partnership comes after OpenAI said that it would participate in Common Sense’s new framework, launched in September, for ratings and reviews designed to assess the safety, transparency, ethical use and impact of AI products. Common Sense’s framework aims to produce a “nutrition label” for AI-powered apps, according to Common Sense co-founder and CEO James Steyer, shedding light on the contexts in which the apps are used and highlighting areas of potential opportunity and harm against a set of “common sense” tenets.

The OpenAI logo displayed on a smartphone screen in front of a computer screen showing the ChatGPT logo. Image Credits: Didem Mente/Anadolu Agency / Getty Images

In a press release, Steyer alluded to the fact that today’s parents remain generally less knowledgeable about GenAI tools — for example, OpenAI’s viral AI-powered chatbot ChatGPT — than younger generations. An Impact Research poll commissioned by Common Sense Media late last year found that 58% of students aged 12 to 18 have used ChatGPT compared to 30% of parents of school-aged children.

“Together, Common Sense and OpenAI will work to make sure that AI has a positive impact on all teens and families,” Steyer said in an emailed statement. “Our guides and curation will be designed to educate families and educators about safe, responsible use of [OpenAI tools like] ChatGPT, so that we can collectively avoid any unintended consequences of this emerging technology.”

OpenAI is under pressure from regulators to show that its GenAI-powered apps, including ChatGPT, are an overall boon for society, not a detriment to it. Just last summer, the U.S. Federal Trade Commission opened an investigation into OpenAI over whether ChatGPT harmed consumers through its collection of data and publication of false statements about individuals. European data authorities have also expressed concern over OpenAI’s handling of private information.

OpenAI’s tools, like all GenAI tools, tend to confidently make things up and get basic facts wrong. And they’re biased — a reflection of the data that was used to train them.

Kids and teens, aware of the tools’ limitations or no, are increasingly turning to them for help not only with schoolwork but personal issues. According to a poll from the Center for Democracy and Technology, 29% of kids report having used ChatGPT to deal with anxiety or mental health issues, 22% for issues with friends and 16% for family conflicts.

Open source foundations unite on common standards for EU's Cyber Resilience Act

Image Credits: SergeyBitos / Getty Images

Seven open source foundations are coming together to create common specifications and standards for Europe’s Cyber Resilience Act (CRA), a regulation adopted by the European Parliament last month.

The Apache Software Foundation, Blender Foundation, Eclipse Foundation, OpenSSL Software Foundation, PHP Foundation, Python Software Foundation, and Rust Foundation revealed their intentions to pool their collective resources and connect the dots between existing security best practices in open source software development — and ensure that the much-maligned software supply chain is up to the task when the new legislation comes into force in three years.

Componentry

It’s estimated that between 70% and 90% of software today is made up of open source components, many of which are developed for free by programmers in their own time and on their own dime.

The Cyber Resilience Act was first unveiled in draft form nearly two years ago, with a view toward codifying best cybersecurity practices for both hardware and software products sold across the European Union. It’s designed to force manufacturers of any internet-connected product to stay up-to-date with the latest patches and security updates, with penalties in place for shortcomings.

These noncompliance penalties include fines of up to €15 million, or 2.5% of global turnover.

The legislation in its initial guise prompted fierce criticism from numerous third-party bodies, including more than a dozen open source industry bodies that last year wrote an open letter saying that the Act could have a “chilling effect” on software development. The crux of the complaints centered on how “upstream” open source developers might be held liable for security defects in downstream products, thus deterring volunteer project maintainers from working on critical components for fear of legal retribution (this is similar to concerns that abounded around the EU AI Act, which was greenlighted last month).

The wording within the CRA regulation did offer some protections for the open source realm, insofar as developers not concerned with commercializing their work were technically exempt. However, the language was open to interpretation in terms of what exactly fell under the “commercial activity” banner — would sponsorships, grants, and other forms of financial assistance count, for example?

Some changes to the text were eventually made, and the revised legislation substantively addressed the concerns by clarifying the exclusions for open source projects and carving out a specific role for what it calls “open source stewards,” a category that includes not-for-profit foundations.

“In general, we are pleased with the outcome… the process worked, and the open source community was listened to,” Eclipse Foundation executive director Mike Milinkovich told TechCrunch. “One of the most interesting aspects of the final regulation is that it recognizes ‘open source software stewards’ as a form of economic actor which are part of the overall software supply chain. This is the first piece of legislation globally that recognizes the role played by foundations and other forms of community stewards.”

Although the new regulation has already been rubber-stamped, it won’t come into force until 2027, giving all parties time to meet the requirements and iron out some of the finer details of what’s expected of them. And this is what the seven open source foundations are coming together for now.

“There is an enormous amount of work that will need to be done over the next three years in order to implement the CRA,” Milinkovich said. “Keep in mind that the CRA is the first law anywhere in the world regulating the software industry as a whole. The implications of this go far beyond the open source community and will impact startups and small enterprises as well as the global industry players.”

Documentation

The manner in which many open source projects evolve means that they often have patchy documentation (if any at all), which makes it difficult both to support audits and for downstream manufacturers and developers to develop their own CRA processes.

Many of the better-resourced open source initiatives already have decent best-practice standards in place for things like coordinated vulnerability disclosure and peer review, but each entity might use different methodologies and terminologies. Coming together as one should go some way toward treating open source software development as a single “thing” bound by the same standards and processes.

Throw into the mix other proposed regulation, including the Securing Open Source Software Act in the U.S., and it’s clear that the various foundations and “open source stewards” will come under greater scrutiny for their role in the software supply chain.

“While open source communities and foundations generally adhere to and have historically established industry best practices around security, their approaches often lack alignment and comprehensive documentation,” the Eclipse Foundation wrote in a blog post today. “The open source community and the broader software industry now share a common challenge: legislation has introduced an urgent need for cybersecurity process standards.”

The new collaboration, while consisting of seven foundations initially, will be spearheaded in Brussels by the Eclipse Foundation, which is home to hundreds of individual open source projects spanning developer tools, frameworks, specifications, and more. Members of the foundation include Huawei, IBM, Microsoft, Red Hat and Oracle.