Is Splunk in the Valley of Despair?

The pains of transitioning from a licensing to a subscription SaaS model

Splunk provides mission-critical data to customers navigating transformations on an epic scale. Yet they missed on both revenue and margin dollars in Q3 FY2021. Is this a reflection of demand, a misaligned offering, poor execution, new competition, or something less obvious? We think Splunk's transformation from licensing to the cloud is thrusting them into a revenue "Valley of Despair".

Splunk reported a revenue increase of only $67M from Q2 to Q3 of FY2021 ($492M in Q2 to $559M in Q3), less than forecasted. We believe the revenue miss is less an execution issue and more a revenue composition challenge.

The Valley of Despair

Companies transitioning from licensing to subscription models often see an initial drop in reported quarterly revenue. Instead of recognizing an entire license's value up front, they recognize smaller monthly or quarterly subscription amounts, and their customer base often shifts toward a larger group of smaller customers attracted to the affordability of subscriptions. SaaS growth is also delayed because existing license customers wait as long as possible to move to the cloud. Sales teams trained to sell large enterprise license contracts struggle to negotiate SaaS agreements, and a company's demand engine is not easily retooled for a different offering or a new customer base. Most companies find the transition harder culturally and operationally than contractually.

The effect: total revenue dips and does not recover until the company retools its sales and marketing model around the cloud.
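To make the mechanics concrete, here is a minimal sketch in Python. Every number in it is invented for illustration (deal counts, contract values, the pace of the mix shift); none of it is Splunk's actual data. It only shows how recognized revenue dips when upfront licenses are replaced by ratable subscriptions of comparable lifetime value.

```python
# Hypothetical numbers for illustration only -- not Splunk's actuals.
QUARTERS = 8
DEALS_PER_QUARTER = 10    # total new deals closed each quarter
LICENSE_VALUE = 1.2       # $M, recognized entirely up front
SAAS_VALUE_PER_Q = 0.1    # $M per quarter, recognized ratably
                          # (3-year SaaS value equals one license)

revenue = []
active_saas = 0           # subscriptions billing each quarter
for q in range(QUARTERS):
    saas_share = min(1.0, q / 4)   # mix shifts fully to SaaS over four quarters
    new_saas = round(DEALS_PER_QUARTER * saas_share)
    new_license = DEALS_PER_QUARTER - new_saas
    active_saas += new_saas
    # License revenue lands all at once; SaaS accrues a little per quarter.
    revenue.append(new_license * LICENSE_VALUE + active_saas * SAAS_VALUE_PER_Q)

for q, r in enumerate(revenue, 1):
    print(f"Q{q}: ${r:.1f}M recognized")
```

In this toy model, recognized revenue falls from $12.0M to $2.5M before the growing subscription base climbs back above it, even though every SaaS deal is worth as much over three years as the license it replaced. That dip is the valley.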

Why is Splunk in the Valley of Despair?

Splunk's revenue is suffering because they are not driving a faster transition to the cloud; their cloud revenue is growing more slowly than licensing, which keeps them in the Valley of Despair. Their Cloud/SaaS revenue was $145M in Q3 FY2021, up from $80M in the same period of FY2020 but up only $19M Q/Q, while their traditional licensing business was up $78M in the same period.

Our View

Cloud models appeal to a broader group of customers. Adoption rates are faster, cost of sales is lower, and LTV/CAC is more reliable than under a licensing model. Any argument that enterprises prefer licensing models is disputed by how many companies are adopting Office 365, SF.com, or Slack.
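For readers who want the arithmetic behind that LTV/CAC claim, here is a minimal sketch using one common simplification (lifetime value as annual revenue times gross margin divided by churn). Every input is a made-up number, not Splunk data:

```python
# Illustrative LTV/CAC arithmetic; all inputs are invented unit economics.
def ltv(annual_revenue: float, gross_margin: float, annual_churn: float) -> float:
    """Lifetime value under a simple constant-churn subscription model."""
    return annual_revenue * gross_margin / annual_churn

# A hypothetical cloud account: smaller contract, measurable renewal behavior.
cloud_ltv = ltv(annual_revenue=50_000, gross_margin=0.62, annual_churn=0.10)
cloud_cac = 40_000   # assumed lighter-touch sale to a broader market
print(f"Cloud LTV/CAC: {cloud_ltv / cloud_cac:.1f}")   # ~7.8
```

The point is not the 7.8 itself but that every input is observable under a subscription model: churn and renewal behavior are measured quarterly, so the ratio is reliable. With one-off license deals, the "lifetime" input is mostly guesswork.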

There are a few reasons why their cloud/SaaS revenue may be growing too slowly to avoid the valley.

1. They are trying to sell their cloud offering to the same customer base. That makes for a slow transition and does not leverage cloud economics to reach larger customer segments. Splunk needs to reach mid-market and SMB customers with their cloud solution.

2. Only 20% of their Cloud/SaaS revenue is outside the US, while 30% of their total revenue is foreign-based. This may be attributed to slower cloud adoption in Europe, or Splunk may not have cloud offerings in all foreign markets. This needs to be resolved, since the company cannot switch to "cloud" successfully in foreign markets without local domains.

3. Their sales teams may not be compensated well enough to transition customers to the cloud offering. Sales teams follow compensation. Sales compensation must be biased toward cloud sales, even to the extent of denying compensation for license renewals.

Our Investment Thesis

Cloud/SaaS is a better model for Splunk. Gross margins for cloud/SaaS grew 6% Y/Y to 62%, while licensing margins fell 5.9% Y/Y to 79.9%. It appears most of the revenue in Q3 FY2021 came from customer engagements of over $1M per year. Cloud is a way to expand their customer base by making their offering affordable for more enterprise and mid-size companies.

Splunk should more aggressively bias their sales compensation model toward their cloud offering, as other companies that lived through this transition, including Adobe and Microsoft, have done.

Splunk has every opportunity to make a successful transition to a cloud model and to expand its customer base, offering, and global footprint. Data companies' offerings will be in great demand from enterprises, mid-market, and SMBs. The key is to align every resource to make the cloud model a success.

It is time for Splunk to get serious about driving customers to their Cloud offering and rapidly climb out of the Valley of Despair.

Is IoT striking out or ready to hit a home run?

IoT is just "getting up to bat," which is why I think this week's Economic Times article quoting a Cisco report that "3/4 of the current IoT proof-of-concept projects (PoCs) were failing" is actually encouraging.

Some will say this article proves IoT is just hype, that connecting billions of things is impossible or meaningless. But flip the article around and it tells us that 1/4 of all early-stage IoT projects are succeeding! For a rookie in the computing industry, that is an amazing batting average.

IoT will see massive deployment because it has four economic drivers.

1. All the technologies are proven: Sensors have been around for decades, as have MCUs and the wireless radios that connect sensors to gateways. The gateways that connect billions of sensors to the cloud are just edge servers. Software to capture, manage, and store the data has been deployed for decades in thin-client networks, and we already have cloud analytics. IoT simply brings together embedded technology that already exists.

2. IoT brings a lot of value at a low capital cost: A sensor costs a few dollars, and the MCU and wireless radio that connect it to a gateway cost just a few more; a complete sensing node can cost as little as $5. ARM-based IoT gateways are available for $500, and IoT cloud applications can be billed by the CPU cycle. A $35 Raspberry Pi has more computing power than most IoT networks require to sense, store, and communicate data. IoT is the most economically efficient type of IT ever conceived.

3. IoT puts heavy compute in lightweight clouds: IoT is about capturing data and turning it into insights. IoT networks sense millions of tiny bits of data, then send short, frequent messages to an application, likely running in the cloud, where billions of bits of data can be organized, analyzed, and computed into industry insights. It is about doing compute where it is efficient to compute (a minimal sketch of this pattern follows the list).

4. IoT solves the hard problems IT has ignored: Capturing data used to be expensive. With sensors and low-power, low-cost wireless connected to an ARM-based gateway talking to the cloud, we will capture data at a cost of fractions of a penny per bit.
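To make driver 3 concrete, here is a minimal sketch of a sensing node, assuming the paho-mqtt Python library; the broker hostname, topic, and reporting interval are hypothetical placeholders:

```python
# A sensing node in the "sense at the edge, compute in the cloud" pattern.
# Assumes the paho-mqtt library; broker and topic names are placeholders.
import json
import random
import time

import paho.mqtt.publish as publish

BROKER = "iot-gateway.example.com"   # hypothetical gateway or cloud broker
TOPIC = "plant7/line2/temperature"

while True:
    # A real node would read its sensor here; we simulate a reading.
    reading = {"ts": time.time(), "celsius": 20 + random.random() * 5}
    # Each message is tiny (tens of bytes); the heavy analytics over
    # millions of such messages run in the cloud application, not here.
    publish.single(TOPIC, json.dumps(reading), hostname=BROKER, qos=1)
    time.sleep(60)   # short, infrequent messages keep power and bandwidth low
```

The asymmetry is the design point: the node does almost no compute, while cloud-side subscribers to the same topic aggregate millions of such messages into insights.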

IoT brings a lot of value at a low capital cost: sensors are inexpensive, use little power, and are easy to deploy. These three factors will drive a great number of PoCs and first-time experiments. Some may come too soon or be poorly thought through, but the number of PoCs is a far more important indicator of future adoption than the early failure rate.
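Using the rough price points above, the capital math works out to just a few dollars per sensing point. The sketch below uses the article's prices plus one assumption of my own, the number of nodes a single gateway serves:

```python
# Back-of-the-envelope capital cost per sensing point.
SENSOR = 2.00              # dollars (assumed split of the ~$5 node)
MCU_AND_RADIO = 3.00
GATEWAY = 500.00           # ARM-based IoT gateway, per the article
NODES_PER_GATEWAY = 200    # assumption: one gateway serves 200 nodes

node = SENSOR + MCU_AND_RADIO
amortized_gateway = GATEWAY / NODES_PER_GATEWAY
print(f"Capital per sensing point: ${node + amortized_gateway:.2f}")   # $7.50
```

At $7.50 per sensing point before cloud costs, experimentation is cheap, which is exactly why so many PoCs get launched.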

We still need to learn what to sense, how much data we need to run meaningful analytics, and how to connect 20 billion things to the internet. We need early-stage IoT PoCs that try to address all of these functions (sense/capture/analyze) at super low power, with no human intervention, 24 hours a day, for years and years. This is hard. It is supposed to be hard. We want PoCs to stretch boundaries so we can find the "edge" of IoT.

Bravo to the failed IoT PoCs. Let's keep falling until we learn how to run, someday connecting billions and billions of things and unlocking IoT success.

The Next Big Thing in Compute!

“The Pundits Are Wrong…
IoT really is The Next Big Thing… the history of compute offers the proof.”

Here is what I tell everyone about IoT.

The big mainframe had a small installed footprint.

Mainframes created a compute evolution by calculating so much, so fast. They advanced space exploration, scientific research, and higher mathematics, and they allowed economists to forecast markets more accurately and businesses to use new data sets to understand customer models. But mainframes had limitations. Only the largest companies had compute challenges big enough to justify the capital and operational costs, and only a handful of government agencies had the budgets to buy mainframes and hire the programming teams. The mainframe's limitation was return on capital. A less expensive, "department-level" compute technology was needed. So the industry created the mini-computer…

Mini-computers addressed the cost of capital but not the cost of programming.

In the 1980s, mini-computers computerized business processes and automated machine control. Industry-specific software application packages were marketed to mid-sized businesses, but even "packaged applications" required expensive customization, and to leverage the power of mini-computers, companies had to train teams on new workflows for their new "department computers." The cost of purchasing, updating, and training on applications limited mini-computer adoption. Each person needed their own computer and affordable, off-the-shelf software. So the industry iterated again, creating the personal computer…

At its peak, 350 million PCs were sold each year.

And although annual sales are now about 240M, there are billions of PCs in use today running easy-to-buy, easy-to-use applications. But PCs need to connect to the internet to deliver real value; even as PC prices drop below $500, a PC without connectivity is not terribly valuable. Wi-Fi networks are great in the office, at home, or at your favorite coffee shop, but Wi-Fi is not free everywhere in the world, and cellular dongles for PCs and tablets come with expensive service fees. Continuous connectivity is the PC's limitation. So the industry iterated again, creating a use-anywhere compute and communication device: the smartphone…

About 5 billion smartphones are active on cellular networks, with more coming.

To reach the 7 billion people in the world, smartphone makers are working to lower device costs and cellular carriers are expanding coverage. Someday, and I hope in my lifetime, every person in the world will have the power of the internet in the palm of their hand. Yet even when we have both affordable smartphones and cheap cellular everywhere, we will not likely have more than 10 billion active devices. There just are not enough people! The smartphone's limitation is the number of people in the world. So, what makes compute and communications unlimited? The Internet of Things…

An IoT network connects things to things, and there are Billions and Billions of THINGS!

An IoT "thing" does not use expensive hardware. It connects to the internet through a Wi-Fi radio costing about $2, its MCU costs about $4, and a full IoT module can be sold for less than $20. An IoT device usually runs an open-source embedded RTOS or Linux, so there is nearly unlimited programming expertise available. An IoT device is not restricted by the number of people on earth; it is limited only by the number of things we want to connect…

How many things will be connected? That is anyone's guess, but current thinking is more than 20 billion things by 2020: more than all the smartphones, PCs, mini-computers, and mainframes combined.

The number of IoT devices that will be deployed is limited only by our enthusiasm to connect things that sense, learn, provide data, and reveal new insights and understanding. The Internet of Things is the next big, audacious, game-changing, and nearly unlimited phase of compute.

How is your company thinking about the next big thing? Are you having a conversation about how a world of connected devices impacts your business and your customers? Would you like to create new business models and develop new revenue streams by leveraging IoT?