Raspberry Pi launches camera module for vision-based AI applications

Image Credits: Romain Dillet / TechCrunch

Raspberry Pi, the company that sells tiny, cheap, single-board computers, is releasing an add-on that is going to open up several use cases — and yes, because it’s 2024, there’s an AI angle. Called the Raspberry Pi AI Camera, this image sensor comes with on-board AI processing and is going to cost $70.

In more technical terms, the AI Camera is based on a Sony image sensor (the IMX500) paired with a RP2040, Raspberry Pi’s own microcontroller chip with on-chip SRAM. Like the rest of the line-up, the RP2040 follows Raspberry Pi’s overall philosophy — it is inexpensive yet efficient.

In other words, AI startups aren’t going to replace their Nvidia GPUs with RP2040 chips for inference. But when you pair it with an image sensor, you get an extension module that can capture images and process those images through common neural network models.

As an added benefit, on-board processing on the camera module means that the host Raspberry Pi isn’t affected by visual data processing. The Raspberry Pi remains free to perform other operations — you don’t need to add a separate accelerator. The new module is compatible with all Raspberry Pi computers.

This isn’t Raspberry Pi’s first camera module. The company still sells the Raspberry Pi Camera Module 3, a simple 12-megapixel Sony image sensor (IMX708) mounted on a small add-on board that connects to a Raspberry Pi via a ribbon cable. As Raspberry Pi promises to keep production running for many years, the Camera Module 3 will remain available for around $25.

The AI Camera is the same size as the Camera Module 3 (25mm x 24mm) but slightly thicker due to the structure of the optical sensor. It comes pre-loaded with MobileNet-SSD, an object detection model that can run in real time.
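An SSD-style detector like MobileNet-SSD emits parallel arrays of bounding boxes, class IDs, and confidence scores, and the host application filters those down to usable detections. Here is a minimal sketch of that post-processing step; the function name, tensor layout, and threshold are illustrative assumptions, not the AI Camera's actual firmware interface:

```python
def parse_detections(boxes, class_ids, scores, threshold=0.5):
    """Filter raw SSD-style detector output down to confident detections.

    boxes:     list of [x0, y0, x1, y1] in normalized 0-1 coordinates
    class_ids: list of integer class indices (model-specific label map)
    scores:    list of confidence scores, same length as boxes
    """
    return [
        {"box": b, "class_id": c, "score": s}
        for b, c, s in zip(boxes, class_ids, scores)
        if s >= threshold
    ]

# Fabricated model output: two candidate detections, one above threshold.
detections = parse_detections(
    boxes=[[0.1, 0.2, 0.4, 0.6], [0.5, 0.5, 0.9, 0.9]],
    class_ids=[1, 7],
    scores=[0.92, 0.31],
)
```

In the AI Camera's case this kind of filtering happens without taxing the host CPU, since the inference runs on the module itself and the Pi only receives the results.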

At this point, you might be wondering who is going to use the Raspberry Pi AI Camera. While the tiny computers were originally designed for tech hobbyists and homelab projects, Raspberry Pi now sells most of its devices to companies that use Raspberry Pi devices in their own products or as part of their assembly lines for internal industrial use cases.

When Raspberry Pi became a public company, it reported that the industrial and embedded segment represented 72% of its sales. That ratio is likely going to be even higher for the AI Camera.

I could imagine companies using the AI Camera module for smart city sensors that detect empty parking spots, say, or track traffic flows. In an industrial environment, the hardware could be used for basic, automated quality assurance with objects passing under the camera module.

Companies like using Raspberry Pi because it can produce computers and modules at scale — it faced some post-Covid supply constraints, but those seem to be resolved. Companies know they can reliably source Raspberry Pi products without suffering delays in a production pipeline. That’s also part of the reason why Raspberry Pi promises that the AI Camera will remain in production until at least January 2028.

The iPhone 16 launches today without its most hyped feature: Apple Intelligence

iPhone 16 Pro

Image Credits: Apple

The iPhone 16 officially goes on sale Friday. But for its earliest adopters, it arrives with a fundamental compromise baked into the deal.

Put simply, this is not the iPhone 16 that buyers were promised. Apple CEO Tim Cook said it would be the “first iPhone built for Apple Intelligence.” But that “for” is key: The handsets will not actually ship with the platform’s most hyped AI features out of the gate.

This feels like a turning point for Apple. When it comes to new features on phones, the company may not always be known for being the first to market or for jumping on gimmicks, but it is known for being the best. That’s not the case here. Apple was compelled to board the AI hype train and is thus taking a leap into the half-baked void.

Apple has now talked about its Apple Intelligence suite twice — first when announcing the AI suite at its Worldwide Developers Conference (WWDC) in June, and second during its September iPhone 16 launch.

iPhone 16 Pro Max review: A $1,200 glimpse at a more intelligent future

But in actuality, the company is far behind competitors like Google and Microsoft, as well as upstarts like OpenAI and Anthropic, when it comes to shipping features.

The company’s first set of AI tools, announced and released in developer betas, includes writing tools, summaries of articles and notifications, object removal in photos, and audio transcription. Much of this functionality already exists in the market. Apple’s bet is that its decisions around privacy (your usage data is not shared with other users or with other tech companies, it promises) will be enough to attract buyers.

Strictly speaking, the gap between product and feature is not as dramatic as you might think — or at least that is how Apple would defend all this. The iPhone went on sale on September 20, and Apple has promised to start launching its AI features in October.

Yet only a handful of features will be made live at that time, and they will only be in U.S. English. (Recall that the company counts heavily on international markets, with North America accounting for just over half of all iPhone unit sales.)

And for the more complicated AI bells and whistles, we all still have to wait. The company plans to roll out features like visual search and Image Playground starting next month, while additional language support is starting to be rolled out in December — but first with localized English. Other languages will arrive sometime in 2025.

The iPhone 16 isn’t strictly necessary for those who want the new AI features. The company has already confirmed that the iPhone 15 Pro and 15 Pro Max will also get access to the platform.

So if Apple Intelligence is going to be the game changer Apple promises, it’s fair to wonder whether the rollout gaps and delays will keep users from upgrading, or whether consumers will adopt more of a wait-and-see approach — which might also translate to lower sales.

However, as my colleague Sarah Perez pointed out, Apple’s AI features could become more useful once third-party developers are able to fully integrate them with their apps. That’s worth considering, if and when it happens, but that’s more of a conversation for the iPhone 17.

That might well be the point here. Apple is building for the longer-term opportunity, and for the first time, it feels like it’s asking buyers to take that leap of faith with it.

Threads finally launches its API for developers

Illustration of the Threads app logo

Image Credits: Jaap Arriens/NurPhoto / Getty Images

Meta said today that it finally launched its much-awaited API for Threads so developers can build experiences around it.

Mark Zuckerberg posted about the API launch, saying, “The Threads API is now widely available and coming to more of you soon.”

In a blog post, Threads engineer Jesse Chen said that with the new API, developers can publish posts, fetch their own content, and deploy reply management tools. That means developers can let users hide/unhide or respond to specific replies.

Making the Threads API announcement at the Cannes Lions festival, the company added that, along with these features, it will also allow developers to tap into analytics, with metrics such as the number of views, likes, replies, reposts, and quotes at the media and account level.
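Publishing through the Threads API follows a two-step Graph-style flow: first create a media container, then publish it. The sketch below shows that flow as plain request builders; the user ID and token are placeholders, and while the endpoint paths follow Meta's public Threads API documentation, they should be verified against the current docs before use:

```python
# Base URL for Meta's Threads Graph API (per the public documentation).
GRAPH = "https://graph.threads.net/v1.0"

def build_create_request(user_id, text, access_token):
    """Step 1: build the request that creates a text post container."""
    return (
        f"{GRAPH}/{user_id}/threads",
        {"media_type": "TEXT", "text": text, "access_token": access_token},
    )

def build_publish_request(user_id, creation_id, access_token):
    """Step 2: build the request that publishes the container from step 1."""
    return (
        f"{GRAPH}/{user_id}/threads_publish",
        {"creation_id": creation_id, "access_token": access_token},
    )

# Placeholder IDs and token; send these with any HTTP client via POST.
url, params = build_create_request("1234567890", "Hello, Threads!", "<token>")
pub_url, pub_params = build_publish_request("1234567890", "<creation-id>", "<token>")
```

The container-then-publish split is what lets tools like Hootsuite or Sprout Social schedule posts: the container can be created ahead of time and published later.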

Instagram head Adam Mosseri also posted about the announcement, saying that this move will help “businesses and creators manage their Threads presence at scale.”

In October 2023, Mosseri mentioned the company’s work on the Threads API for the first time. The company launched the API in a closed beta with partners such as Sprinklr, Sprout Social, Social News Desk, Hootsuite, tech news board Techmeme and a few other developers. At that time, Chen said Meta planned to make the API widely available to developers in June. The company has delivered on that promise.

With the new API launch, the company has also released a reference open source app on GitHub for developers to play around with.

Third-party developers building social networking tools faced a tough 2023, with social networks like Twitter (now X) and Reddit restricting or shutting down API access to varying degrees. Decentralized social networks such as Mastodon and Bluesky have taken a more developer-friendly approach. But Meta’s Threads is the biggest new social network, with more than 150 million users. With Threads integrating with the fediverse and releasing an API, third-party developers now have a chance to build some great social media experiences.

Social networks are getting stingy with their data, leaving third-party developers in the lurch

Former Snap engineer launches Butterflies, a social network where AIs and humans coexist

Image Credits: Butterflies

Butterflies is a social network where humans and AIs interact with each other through posts, comments and DMs. After five months in beta, the app is launching Tuesday to the public on iOS and Android. 

Anyone can create an AI persona, called a Butterfly, in minutes on the app. After that, the Butterfly automatically creates posts on the social network that other AIs and humans can then interact with. Each Butterfly has a backstory, opinions and emotions.

Butterflies was founded by Vu Tran, a former engineering manager at Snap. Vu came up with the idea for Butterflies after seeing a lack of interesting consumer AI products beyond generative AI chatbots. Although companies like Meta and Snap have introduced AI chatbots in their apps, those don’t offer much functionality beyond text exchanges. Vu notes that he started Butterflies to bring more creativity to humans’ relationships with AI.

“With a lot of the generative AI stuff that’s taking flight, what you’re doing is talking to an AI through a text box, and there’s really no substance around it,” Vu told TechCrunch. “We thought, OK, what if we put the text box at the end and then try to build up more form and substance around the characters and AIs themselves?”

Butterflies’ concept goes beyond Character.AI, a popular a16z-backed chatbot startup that lets users chat with customizable AI companions. Butterflies wants to let users create AI personas that then take on their own lives and coexist with others.

When you open the app, you see a traditional social media feed filled with humans and AIs posting updates about their days. For instance, you might see a Butterfly who’s a woodworker post their latest creation. Or you may come across a Butterfly CEO of a Costco in an alternative universe who is hell-bent on keeping hot dogs priced at $1.50 (yes, someone actually created this Butterfly).

Image Credits: Butterflies

The app’s beta phase gave tens of thousands of users access to the social network. During the beta, Vu says users spent an average of one to three hours interacting with AIs on the app. 

“It’s fascinating what people are using Butterflies for,” Vu said. “At Snap, I did a lot of user research, but the behavior on Butterflies is just so new.” Vu says one person spent five hours a day creating 300 personas. He also found that some people connect with other humans on the platform because they resonate over what they have created. 

In one instance, two friends created two Butterflies simultaneously and gave them their own backstories to have them interact on their behalf and see where they end up. Another person created a version of themselves that lived in Game of Thrones’ fictional continent of Westeros, while someone else re-created themselves as a Dungeons & Dragons character. 

Vu says that Butterflies is one of the most wholesome ways to use and interact with AI. He notes that while the startup isn’t claiming it can help cure loneliness, it could help people connect with others, both AI and human.

“Growing up, I spent a lot of my time in online communities and talking to people in gaming forums,” Vu said. “Looking back, I realized those people could just have been AIs, but I still built some meaningful connections. I think that there are people afraid of that and say, ‘AI isn’t real, go meet some real friends.’ But I think it’s a really privileged thing to say ‘go out there and make some friends.’ People might have social anxiety or find it hard to be in social situations.”

Vu says Butterflies is getting an outpouring of positive feedback. 

The app is free to use at launch, but Butterflies may experiment with a subscription model in the future, Vu says. Over time, Butterflies plans to offer opportunities for brands to leverage and interact with AIs.

The app is mainly being used for entertainment purposes, but in the future, the startup sees Butterflies being used for things like discovery in a way that’s similar to Instagram. 

Butterflies closed a $4.8 million seed round led by Coatue in November 2023. The funding round included participation from SV Angel and strategic angels, many of whom are former Snap product and engineering leaders.

H Venture Partners launches venture studio focused on microbiome tech

Elizabeth Edwards is the founder of H Partners.

Image Credits: Courtesy of Elizabeth Edwards

H Venture Partners is launching a venture studio that will focus on sourcing microbiome technologies and materials that can be commercialized and turned into startups. 

The firm said on Friday that it will source talent who can solve health problems like depression, cancer, eczema and neurodegenerative diseases. This expansion is in part funded by H Partners’ Fund II, which just announced a $10 million investment from the state of Ohio. Fund II is now oversubscribed with $24 million in total funding.

“We are tackling the top two threats to humanity,” Elizabeth Edwards, the firm’s founder and managing partner, told TechCrunch. “Our primary focus is preventing antibiotic resistance and our secondary focus is reversing climate change and eliminating petrol-based plastics.” 

Edwards hopes to invest in at least 13 early-stage companies, with check sizes ranging from $500,000 to $1 million. The firm’s portfolio already includes companies like Felix Health and Parsley Health, and it has more than $40 million in assets under management. 

So far this year, companies with at least one woman founder have raised more than $2 billion in the pharma and biotech sector, according to PitchBook; last year, such companies raised $5.4 billion of the roughly $18.4 billion invested in the sector overall.

H Venture Partners closes $10M debut fund targeting science-based brands

Edwards founded H Ventures in 2017 and the firm remains female-founded, owned, and operated. The firm’s goal is to help support more marginalized founders, and it aims to continue doing that at a broader scale with this new venture studio and new fund. Currently, H Ventures invests more than 90% of its portfolio into marginalized founders, helping to bring more women and people of color to the forefront of biological sciences at a time when less than 2% of all venture funding goes to such groups. 

“If you think that investing in women is a bad idea, you’re missing half of the best opportunities in the world,” Edwards said, adding that the firm believes in “excellence through inclusion.”