The beloved Rodecaster board gets a video production counterpart

Image Credits: Rode

Like much of the podcasting world, we loved the Rodecaster Pro when it launched toward the end of 2018. It’s taken a half-dozen years, but microphone maker Rode is finally extending the user-friendly line into the world of video.

The Rodecaster Video faces much stiffer competition than the original Pro board did. In this post-pandemic, video-first age, plenty of companies have launched similar devices. It also has a high price tag to contend with at $1,199 — nearly double the Pro’s $600 asking price.

Rode is looking to loop in the same novice category it attracted with the first Pro: effectively, those interested in upping their video production game but not yet ready to commit to a full pro rig.

Whereas the Pro focuses entirely on in-person audio podcasts, however, the Video casts a wider net, ranging from solo Twitch streamers to multi-person in-studio video productions. The new model is more compact than the original, as it’s meant to sit in front of a computer rather than in the middle of a podcasting table.

Image Credits: Rode

It features a pair of Neutrik jacks with pre-amps for XLR mics, coupled with four HDMI and two USB-C inputs, allowing users to switch between video sources in real time. Audio processing, meanwhile, includes EQ, along with a compressor, high-pass filter, de-esser, and noise gate.
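Rode hasn’t detailed the DSP under the hood, but the stages it lists are standard ones. As a rough illustration only — assuming a 48 kHz sample rate and placeholder cutoff/threshold values, not Rode’s actual settings — here is what two of those stages, the high-pass filter and noise gate, boil down to in Python:

```python
# A minimal sketch of two common podcast-audio stages. Illustrative only:
# the 80 Hz cutoff and gate threshold are assumptions, not Rode's DSP.
import numpy as np
from scipy.signal import butter, sosfilt

SAMPLE_RATE = 48_000  # assumed mono float samples at 48 kHz


def high_pass(samples: np.ndarray, cutoff_hz: float = 80.0) -> np.ndarray:
    """Remove low-frequency rumble below the cutoff."""
    sos = butter(2, cutoff_hz, btype="highpass", fs=SAMPLE_RATE, output="sos")
    return sosfilt(sos, samples)


def noise_gate(samples: np.ndarray, threshold: float = 0.01) -> np.ndarray:
    """Mute samples whose absolute level falls below the threshold."""
    return np.where(np.abs(samples) < threshold, 0.0, samples)


def process(samples: np.ndarray) -> np.ndarray:
    # Typical chain ordering: clean up (filter, gate) before shaping dynamics.
    return noise_gate(high_pass(samples))
```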

It can automatically key out green and blue screens, switch audio between video sources, and stream to “all major platforms” via Wi-Fi or hardwired Ethernet and USB. It can also record directly to a connected external hard drive.
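The green/blue screen feature is automatic chroma keying. Here’s a toy numpy sketch of the basic idea — classify pixels where green strongly dominates as background — purely to illustrate the technique, not Rode’s implementation:

```python
# A rough sketch of green-screen keying: pixels where green clearly dominates
# red and blue are treated as background. Real keyers (presumably including
# Rode's) handle spill, soft edges and lighting far more carefully.
import numpy as np


def green_screen_mask(rgb: np.ndarray, margin: int = 40) -> np.ndarray:
    """Return a boolean mask that is True for background pixels.

    rgb: H x W x 3 uint8 image; margin: an illustrative dominance threshold.
    """
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    return (g > r + margin) & (g > b + margin)


def composite(fg: np.ndarray, bg: np.ndarray) -> np.ndarray:
    """Replace the keyed-out background of fg with bg (same shape)."""
    mask = green_screen_mask(fg)
    out = fg.copy()
    out[mask] = bg[mask]
    return out
```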

“For decades, our mission has been to empower creators with professional yet accessible audio equipment,” CEO Damien Wilson says in a release. “Groundbreaking innovations like the Rodecaster Pro and Wireless GO have changed the way that people approach creating content across multiple categories, placing studio-grade audio solutions within the reach of today’s creators. With the Rodecaster Video, we are doing the same for video production.”

The Rodecaster Video is up for preorder starting today. It ships October 10.

Twitter co-founder Biz Stone joins board of Mastodon's new US nonprofit

Mastodon displayed on smartphone screens

Image Credits: Mastodon

Biz Stone, a Twitter co-founder, is among those who have joined the board of directors of Mastodon’s new U.S. nonprofit, Mastodon CEO Eugen Rochko announced over the weekend. Mastodon’s service, an open source, decentralized social network and rival to Elon Musk’s X, has gained increased attention following the Twitter acquisition as users sought alternatives to X’s would-be “everything app” that felt more like the old Twitter of days past.

Mastodon only somewhat fits that bill. Though the service resembles Twitter in many ways, it’s underpinned by different infrastructure. As part of the “fediverse” — or the open social web made up of interconnected servers communicating over the ActivityPub protocol — Mastodon benefits users who no longer want to be locked into a centralized social network that can be bought and sold to new billionaire owners, like Musk.
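That interoperability is concrete: any fediverse server exposes the same two lookup steps — WebFinger to resolve a handle, then a fetch of the ActivityPub actor document. A minimal Python sketch, where the account `alice@example.social` is hypothetical:

```python
# A minimal sketch of the fediverse handshake: resolve a handle via WebFinger,
# then fetch its ActivityPub actor document. The handle used here is made up;
# any Mastodon server exposes these standard endpoints.
import requests


def resolve_actor(handle: str) -> dict:
    user, server = handle.lstrip("@").split("@")
    # Step 1: WebFinger maps the human-readable handle to an actor URL.
    wf = requests.get(
        f"https://{server}/.well-known/webfinger",
        params={"resource": f"acct:{user}@{server}"},
        timeout=10,
    )
    wf.raise_for_status()
    actor_url = next(
        link["href"] for link in wf.json()["links"] if link.get("rel") == "self"
    )
    # Step 2: fetch the actor document itself (inbox, outbox, public key, ...).
    actor = requests.get(
        actor_url,
        headers={"Accept": "application/activity+json"},
        timeout=10,
    )
    actor.raise_for_status()
    return actor.json()


# print(resolve_actor("alice@example.social")["inbox"])
```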

Though Mastodon was already established as a nonprofit in Germany in 2021, the creation of a 501(c)(3) nonprofit in the U.S. will allow the company to receive tax-deductible donations and other financial support. The change also comes as Mastodon has inexplicably lost its nonprofit status in Germany.

“…we have received a notice from the same tax office that our non-profit status has been withdrawn,” wrote Rochko on the Mastodon blog. “This came with no advance warning or explanation. Earlier this year we went through a successful tax audit, which in fact resulted in some favourable adjustments as we’ve been paying too much tax. Our tax advisor immediately submitted an appeal to the decision, but so far, we have no new information,” he said.

Mastodon’s day-to-day operations were unaffected by this change, as most of its income comes from the crowdfunding platform Patreon. It also received donations from Jeff Atwood and Mozilla at $100,000 apiece, which allowed the company to hire a third full-time developer this year.

However, nonprofit status allows Mastodon to communicate how it differs from other social media businesses. And while becoming a nonprofit in the U.S. will help Mastodon regain that standing, the organization intends to remain based in the EU.

In addition to Biz Stone, other board members include Esra’a Al Shafei, a human rights advocate and founder of Majal.org; Karien Bezuidenhout, an advocate for openness and experienced board member across sustainable social enterprise; Amir Ghavi, a partner at law firm Fried Frank, where he’s the co-head of the Technology Transactions Practice; and Felix Hlatky, the chief financial officer of Mastodon since 2020, who originally incorporated the project as a nonprofit LLC in Germany and helped it raise additional funds.

Meta's Oversight Board extends its scope to Threads

The Threads app logo is seen in this illustration photo

Image Credits: Jaap Arriens/NurPhoto / Getty Images

Meta’s external advisory group, the Oversight Board, announced today that it is expanding its scope to cover Threads alongside Facebook and Instagram, scrutinizing Meta’s content moderation decisions on the newer platform as well.

This means that if users on Threads are dissatisfied with Meta’s decisions on issues like content or account takedowns, they can appeal to the Oversight Board.

“The Board’s expansion to Threads builds on our foundation of helping Meta solve the most difficult questions around content moderation. Having independent accountability early on for a new app such as Threads is vitally important,” Oversight Board co-chair Helle Thorning-Schmidt said in a statement.

In 2018, Mark Zuckerberg first formally floated the idea of an independent oversight board. In January 2020, Facebook proposed the board’s bylaws, and it announced the first set of members that May. In October 2020, the board said that it would start reviewing cases. In 2021, the oversight body expanded its scope to review Meta’s decisions to keep certain content up.

The Oversight Board has ruled on some important cases over the years. The most notable ruling criticized Facebook for “indefinitely” banning former President Donald Trump: while the board agreed that Trump broke the platform’s rules, it said the guidelines make no provision for an indefinite ban.

Earlier this year, the board called on Meta to reform its “incoherent” rules about fake videos.

Content moderation on Threads

Since Threads launched in July last year, users have questioned its moderation practices multiple times. In October, The Washington Post reported that the platform had been blocking search terms like “Covid” and “vaccines,” along with “gore,” “nude,” “sex” and “porn.” The same month, Instagram head Adam Mosseri said the ban was temporary. However, the company hasn’t lifted it as of today.

Earlier this month, Threads said that it is not running its own fact-checking program, even though users were seeing labels on some posts; the company clarified that this is because it matches existing fact-checks from Meta’s other properties to posts on Threads. Last year, Meta said it intended to introduce a separate fact-checking program for Threads, but it hasn’t finalized which fact-checkers will be part of it.

Mosseri has been adamant about the decision not to recommend political content or “amplify news” on the platform. However, the social network’s newly rolled out trending topics feature could surface political content as long as it doesn’t break company policy, Meta said last week.

Oversight Board co-chair Catalina Botero Marino pointed out that with elections coming up in countries like the U.S. and India this year, advances in AI and conflicts across the world, content moderation has become harder.

“With conflicts raging in Europe, the Middle East and elsewhere, billions of people heading to the polls in global elections and growing abuse towards marginalized groups online, there is no time to lose. Advances in artificial intelligence can make content moderation even more challenging,” Botero Marino said.

“The Board is dedicated to finding concrete solutions that protect freedom of expression while reducing harm, and we look forward to setting standards that will improve the online experience for millions of Threads users.”

The Oversight Board’s process

The Oversight Board hasn’t changed its process for appealing Meta’s decisions on Threads. A user first has to appeal to Meta; if they are unhappy with the platform’s judgment, they can then appeal to the board within 15 days of receiving the verdict.

The board can take up to 90 days to review a decision from the date of appeal. In the past, both the board and Meta have been criticized for the slowness of their responses; for now, those timelines remain the same on Threads.
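Taken together, the two windows look like this — a minimal sketch with hypothetical helper names, where the 15- and 90-day figures are the ones described above:

```python
# A sketch of the two appeal windows described above. The helper names are
# hypothetical; the 15- and 90-day figures come from the article.
from datetime import date, timedelta

APPEAL_WINDOW = timedelta(days=15)  # user -> board, after Meta's verdict
REVIEW_WINDOW = timedelta(days=90)  # board review, from the date of appeal


def can_appeal_to_board(meta_decision: date, today: date) -> bool:
    """A user may escalate to the board within 15 days of Meta's verdict."""
    return today <= meta_decision + APPEAL_WINDOW


def review_deadline(appeal_filed: date) -> date:
    """The board may take up to 90 days from the date of appeal."""
    return appeal_filed + REVIEW_WINDOW


# Example: a verdict issued March 1 can be appealed through March 16,
# and an appeal filed March 10 should be reviewed by June 8.
assert can_appeal_to_board(date(2024, 3, 1), date(2024, 3, 10))
print(review_deadline(date(2024, 3, 10)))  # 2024-06-08
```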

Notably, the board can issue both recommendations and decisions. While recommendations are not binding, Meta is obliged to follow the board’s rulings.

OpenAI announces new board members, reinstates CEO Sam Altman

OpenAI CEO Sam Altman speaks during the OpenAI DevDay

Image Credits: Justin Sullivan / Getty Images

Sam Altman, the CEO of OpenAI, has a seat at the table — or board, rather — once again.

OpenAI today announced that Altman will be rejoining the company’s board of directors several months after losing his seat and being pushed out as OpenAI’s CEO.

Joining him are three new members: former CEO of the Bill and Melinda Gates Foundation Sue Desmond-Hellmann, ex-Sony Entertainment president Nicole Seligman and Instacart CEO Fidji Simo — bringing OpenAI’s board to eight people.

The members of the transitional board — the one formed after Altman’s firing in November — won’t be stepping down with the appointment of Desmond-Hellmann, Seligman and Simo. Former Salesforce co-CEO Bret Taylor (OpenAI’s current board chair), Quora CEO Adam D’Angelo and Larry Summers, the economist and former Harvard president, will remain in their roles on the board, as will Dee Templeton, a Microsoft-appointed board observer.

The appointment of the three new board members — and the reappointment of Altman — comes after OpenAI received criticism for its board’s all-male makeup and for the nomination of Summers, who has a history of making unflattering remarks about women. The Congressional Black Caucus flagged the board’s lack of diversity in a letter sent in January, noting the importance of the Black perspective in building tools to help mitigate AI bias.

OpenAI’s expanded board is certainly diverse — at least in terms of its members’ backgrounds.

Desmond-Hellmann, in addition to heading the Bill and Melinda Gates Foundation for six years, was previously chancellor of the University of California, San Francisco and before that president of product development at Genentech, where she helped develop gene-targeted cancer drugs. Desmond-Hellmann is an oncologist by training, board-certified in both internal medicine and medical oncology.

Seligman, an attorney and corporate director, received national attention for her representation of Lieutenant Colonel Oliver North during the Iran-Contra hearings and of President Bill Clinton during his impeachment trial. Seligman was Sony’s executive vice president and general counsel before rising through the ranks to president of Sony Entertainment and Sony Corporation of America.

As for Fidji Simo, before becoming CEO of Instacart, she was head of the Facebook app at Meta and the VP overseeing Meta’s various video, games and monetization efforts. Simo also co-founded — and is currently president of — The Metrodora Foundation, a health clinic and research institute.

“Sue, Fidji and Nicole have experience in leading global organizations and navigating complex regulatory environments, including backgrounds in technology, nonprofit and board governance,” OpenAI wrote in a blog post. “They will work closely with current board members Adam D’Angelo, Larry Summers and Bret Taylor as well as Sam and OpenAI’s senior management.”

The board’s expansion and Altman’s reinstatement also follow an investigation by the law firm WilmerHale, retained by OpenAI, which concluded that Altman’s ouster was a “consequence of a breakdown in the relationship and loss of trust” between Altman and the prior board — and did not arise from “concerns regarding product safety or security, the pace of development, OpenAI’s finances or its statements to investors, customers, or business partners.”

OpenAI said in a blog post that, during the probe, WilmerHale conducted dozens of interviews with the company’s prior board, current executives, advisers and other witnesses, and reviewed thousands of documents as well as various corporate actions. In the firm’s opinion, the prior board acted within its rights to terminate Altman — but Altman’s conduct didn’t mandate removal.

“We have unanimously concluded that Sam and [OpenAI president Greg Brockman] are the right leaders for OpenAI,” Taylor said in a statement. “We recognize the magnitude of our role in stewarding transformative technologies for the global good.”

Not all at OpenAI would likely agree.

New York Times reporting earlier this week paints a picture of a manipulative Altman — a leader who often told people what they wanted to hear in order to charm them and win support for his decisions, but who undermined their credibility when they challenged him. Both OpenAI CTO Mira Murati and Ilya Sutskever, a former OpenAI board member and the startup’s chief scientist, approached members of OpenAI’s previous board to express concerns about Altman’s behavior prior to his ouster last year, according to The Times.

In addition to today’s board appointments, OpenAI said that it would adopt a new set of corporate governance guidelines, including strengthening its conflict of interest policy, creating a whistleblower hotline “to serve as an anonymous reporting resource for all OpenAI employees and contractors” and establishing additional board committees — including a mission and strategy committee “focused on implementation and advancement of the core mission of OpenAI.”

We’ve asked OpenAI for more information on the reworked conflict of interest policy and mission and strategy committee and will update this post if we hear back.

Meta's Oversight Board overturns takedown decision for Pakistan child abuse documentary

magnifying glass over facebook logo, dim bacteria/petri dish in background

Image Credits: TechCrunch

Meta’s external advisory group, the Oversight Board, has overturned the social media company’s decision to take down a news documentary revealing the identities of child victims of sexual abuse and murder in Pakistan — an exceptional case based on newsworthiness.

The 11-minute documentary, posted by the broadcaster Voice of America (VOA) Urdu on its Facebook page in January 2022, was reported by 67 users between then and July 2023 for its disturbing details of the crimes committed by Javed Iqbal, who murdered and sexually abused about 100 children in Pakistan in the 1990s. It contained images of newspaper clippings that showed the child victims’ faces and names, as well as people in tears.

Initially, Meta did not find the video in violation following automated and human reviews. However, the post — viewed about 21.8 million times and shared approximately 18,000 times before it was pulled — was later removed by Meta’s policy team for violating the Child Sexual Exploitation, Abuse and Nudity policy, after it was escalated internally and flagged separately by the company’s High Risk Early Review Operations system as likely to go viral.

After Meta referred the case to the board, the majority of the Oversight Board found that the content should be allowed on the platform despite its violation of the Child Sexual Exploitation, Abuse and Nudity Community Standard.

“For the majority, the public interest in reporting on these child abuse crimes outweighed the possible harms to the victims and their families,” the Oversight Board said in a blog post Tuesday explaining its extraordinary decision.

It noted that the documentary was produced to raise awareness, not to sensationalize the gruesome details of crimes that took place about 25 years ago — and that none of the victims survived.

“This passage of time is the most important factor because it means possible direct harms to the child victims had diminished. Meanwhile, the public interest in child abuse remains,” the board said.

The board also pointed out that Meta decided to pull the content only after it had been available on the platform for over 18 months, and it questioned whether Meta devotes sufficient resources to reviewing Urdu-language videos.

While most of the board favored overturning the takedown decision, a minority suggested making the content unavailable, as it was possible to discuss the issues raised in the video without revealing the names and faces of victims.

The Oversight Board recommended that Meta create a section within each Community Standard describing what exceptions and allowances apply. When a policy excludes exceptions that apply to other policies (such as news reporting or awareness raising), the company should include its rationale in that new section, the board said.

“While the rarely used newsworthiness allowance — a general exception that can be applied only by Meta’s expert teams — was relevant here, the Board notes that no specific policy exceptions, such as raising awareness or reporting on, are available for the Child Sexual Exploitation, Abuse and Nudity policy. Meta should provide more clarity to users about this,” the board noted. “Additionally, it could be made clearer to people in the public language of this policy what qualifies as identifying alleged victims ‘by name or image.’”

Meta acknowledged that it erred in removing the content, which it had pulled over the substantial harms it could pose to the victims and their families even though the events happened over two decades ago. The company also welcomed the Oversight Board’s decision and said it will reinstate the content within seven days.

The board, which started its work in 2020 after Meta CEO Mark Zuckerberg first proposed it in 2018, has ruled on some important cases over the years, including one criticizing Facebook for “indefinitely” banning former President Donald Trump. In February this year, it called on Meta to reform its “incoherent” rules about altered videos.

Meta's Oversight Board takes its first Threads case

The Threads logo on a smartphone

Image Credits: Bloomberg / Gabby Jones / Getty Images

Meta’s Oversight Board has now extended its scope to include the company’s newest platform, Instagram Threads. Designed as an independent appeals board that hears cases and then makes precedent-setting content moderation decisions, the board to date has decided on cases like Facebook’s ban of Donald Trump, COVID-19 misinformation, the removal of breast cancer photos, and more.

Now the board has begun hearing cases emerging from Threads, Meta’s Twitter/X competitor.

This is an important point of differentiation between Threads and rivals like X, where Elon Musk and other users rely heavily on crowdsourced fact-checks from Community Notes to complement the platform’s otherwise light moderation. It’s also very different from how decentralized solutions, like Mastodon and Bluesky, manage moderation duties on their platforms. Decentralization allows community members to establish their own servers with their own sets of moderation rules and gives them the option to de-federate from other servers whose content runs afoul of their guidelines.

The startup Bluesky is also investing in stackable moderation, meaning community members can create and run their own moderation services, which can be combined with others to create a customized experience for each individual user.
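Conceptually, stacking is just composing verdicts from independent services and letting the strictest one win. Here’s a minimal Python sketch with hypothetical service names and a hypothetical verdict type — not Bluesky’s actual labeler API:

```python
# A sketch of "stackable" moderation in the Bluesky style: each user picks a
# stack of independent label services, and the client combines their verdicts.
# The Verdict type and services here are illustrative, not Bluesky's API.
from enum import Enum
from typing import Callable


class Verdict(Enum):
    ALLOW = 0
    WARN = 1
    HIDE = 2


# A moderation service maps a post's text to a verdict.
ModService = Callable[[str], Verdict]


def spam_filter(post: str) -> Verdict:
    return Verdict.HIDE if "buy followers" in post.lower() else Verdict.ALLOW


def spoiler_labeler(post: str) -> Verdict:
    return Verdict.WARN if "spoiler" in post.lower() else Verdict.ALLOW


def moderate(post: str, stack: list[ModService]) -> Verdict:
    """Combine verdicts by severity: the strictest service in the stack wins."""
    return max((svc(post) for svc in stack),
               key=lambda v: v.value, default=Verdict.ALLOW)


# Each user subscribes to their own stack:
my_stack = [spam_filter, spoiler_labeler]
print(moderate("Huge spoiler ahead!", my_stack))  # Verdict.WARN
```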

Meta’s move to offload difficult decisions to an independent board that could overrule the company and its CEO Mark Zuckerberg was meant to be the solution to the problem of Meta’s centralized authority and control over content moderation. But as these startups have shown, there are other ways to do this that allow the user to be more in control of what they see, without stepping on the rights of others to do the same.

Nevertheless, the Oversight Board on Thursday announced it would hear its first case from Threads.

The case involves a user’s reply to a post containing a screenshot of a news article in which Japanese Prime Minister Fumio Kishida made a statement about his party’s alleged underreporting of fundraising revenues. The reply included a caption criticizing him for tax evasion and contained the phrase “drop dead,” as well as derogatory language for someone who wears glasses. Because of the “drop dead” component and hashtags calling for death, a human reviewer at Meta decided the post violated the company’s Violence and Incitement rule — despite it sounding much like a run-of-the-mill X post these days. After their appeal was denied a second time, the user appealed to the Board.

The Board says it selected this case to examine Meta’s content moderation policies and enforcement practices around political content on Threads. That’s a timely move, considering that it’s an election year and that Meta has declared it would not proactively recommend political content on Instagram or Threads.

The Board’s case will be the first involving Threads, but it won’t be the last. The organization is already preparing to announce another bundle of cases tomorrow focused on criminal allegations based on nationality. These latter cases were referred to the Board by Meta, but the Board will also receive and weigh in on appeals from Threads users, as it did with the case concerning Prime Minister Kishida.

The decisions the Board renders will influence whether Threads upholds users’ ability to express themselves freely or moderates content more closely than Twitter/X does. That will ultimately help shape public opinion of the platforms and influence users to choose one or the other — or perhaps a startup experimenting with new ways to moderate content in a more personalized fashion.