Snap CEO Warns Of Further And Faster Deterioration, Citing Economy And Privacy

Snap will miss its own targets for revenue and adjusted earnings in the second quarter of 2022, CEO Evan Spiegel warned Monday in a Securities and Exchange Commission filing, and as a result the company will slow hiring through the end of the year as it looks to manage expenses.

“The macroeconomic environment has deteriorated further and faster than anticipated,” he wrote.

The company said in an 8-K filing, released alongside an investor conference, that it is tracking below the low end of its Q2 guidance, which called for 20% to 25% growth, according to RBC Capital Markets. Some of the deterioration stems from Apple’s privacy controls.

“Given management sounded incrementally more negative on ever fully restoring iOS signal loss, we'd expect investors to view this as a somewhat idiosyncratic issue, consistent with concerns we raised in our Feb. 4th downgrade,” RBC Capital analysts wrote in a research note published late Monday. “Fingerprinting remains a key source of controversy in the space and while SNAP reiterated that it does not leverage fingerprinting, we believe that advertisers [working] directly with [third-party mobile measurement partners] likely remain exposed and to the degree that Apple implements further privacy controls, SNAP advertisers could still face further negative exposure.”

Raymond James analysts expect other companies in the advertising sector to fall “in sympathy with Snap given lowered outlook and commentary around deteriorating macro environment.” While the analysts expect other companies’ stocks to fall, they note that Snap’s estimates were the most aggressive of the group in terms of year-over-year and quarter-over-quarter growth rates.

Meta, which has struggled with its own ad-targeting challenges related to Apple’s App Tracking Transparency privacy framework, earlier in May shared more about its efforts to scale federated learning (FL), a privacy-enhancing technology Meta is using to preserve the privacy of people’s data when training AI models.

FL allows practitioners to train their models without requiring that people’s data ever leave their devices.

It can, for example, train models that recognize voice commands without recording audio clips and transmitting them to the cloud. The challenge, per Meta, is that federated learning can deliver results only as fast as the slowest device, which degrades efficiency when scaling to millions of devices.
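To see why the slowest device gates a round, consider a minimal sketch of synchronous federated averaging. This is a hypothetical illustration, not Meta’s system: the function names, the toy per-client update, and the simulated completion times are all invented for the example. The key point is that the server cannot average until every client reports back, so the round takes as long as the slowest device.

```python
import random

def local_update(weights, client_seed):
    # Hypothetical on-device step: each client nudges the shared
    # weights using only data that never leaves the device.
    rng = random.Random(client_seed)
    return [w + rng.uniform(-0.1, 0.1) for w in weights]

def synchronous_round(weights, client_seeds, client_times):
    # The server waits for ALL clients before averaging, so the
    # round's wall-clock time is the maximum client time.
    updates = [local_update(weights, seed) for seed in client_seeds]
    round_time = max(client_times)
    averaged = [sum(ws) / len(ws) for ws in zip(*updates)]
    return averaged, round_time

weights = [0.0, 0.0]
seeds = [1, 2, 3]
times = [1.0, 1.2, 9.0]  # one straggler dominates the round
new_weights, elapsed = synchronous_round(weights, seeds, times)
print(elapsed)  # 9.0 -- gated by the slowest device
```

With two fast clients and one straggler, the round costs 9.0 units of time even though most devices finished in about 1.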

Developers believe they have found a way around this problem. The company is training a model on 100 million Android devices, and with the ability to scale across that many devices it can make significant leaps in training speed and efficiency.

Early results showed Meta’s asynchronous FL system is five times faster than synchronous federated learning and can achieve the same result with high accuracy, the company said.
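The asynchronous idea can be sketched the same way. Again, this is a toy illustration under assumed mechanics, not Meta’s implementation: instead of waiting for a full cohort, the server mixes in each client’s update the moment it arrives, so fast devices contribute many updates in the time a straggler produces one.

```python
import heapq
import random

def asynchronous_training(weights, client_times, total_time, lr=0.5):
    # Simulated event loop: each client finishes a local round after
    # client_times[i] units, the server applies the update immediately,
    # and the client starts another round. No waiting on stragglers.
    events = [(t, i) for i, t in enumerate(client_times)]
    heapq.heapify(events)
    applied = 0
    while events:
        arrival, client = heapq.heappop(events)
        if arrival > total_time:
            break  # past the simulation horizon
        # Toy update, mixed into the global model on arrival.
        rng = random.Random(client)
        update = [w + rng.uniform(-0.1, 0.1) for w in weights]
        weights = [(1 - lr) * w + lr * u for w, u in zip(weights, update)]
        applied += 1
        # The client immediately begins its next local round.
        heapq.heappush(events, (arrival + client_times[client], client))
    return weights, applied

weights, applied = asynchronous_training([0.0, 0.0], [1.0, 1.2, 9.0], 10.0)
print(applied)  # 19 updates in the time synchronous FL completes ~1 round
```

In 10 time units the two fast clients contribute 10 and 8 updates respectively while the straggler contributes 1, whereas the synchronous version above would have finished only one full round in the same window; this is the kind of gap behind the reported 5x speedup.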
