Why Contribution Margin is a Strong Predictor of Success for Companies

In the last post I concluded with a brief discussion of Contribution Margin as a key KPI. Recall:

Contribution Margin = Variable Profits – Sales and Marketing Cost

The higher the contribution margin, the more dollars available towards covering G&A. Once contribution margin exceeds G&A, a company reaches operating profits. For simplicity in this post, I’ll use gross margin (GM) as the definition of variable profits even though there may be other costs that vary directly with revenue.

The Drivers of Contribution Margin (CM)

There is a strong correlation between GM percentage and CM. Very high gross margin companies will, in general, get to strong contribution margins, while low gross margin companies will struggle to get there. But the sales and marketing needed to drive growth is just as important. Several underlying factors determine how much needs to be spent on sales and marketing to drive growth:

  1. The profits on a new customer relative to the cost of acquiring her (or him). That is, the CAC (customer acquisition cost) for customers derived from paid advertising compared to the profits on those customers’ first purchase
  2. The portion of new traffic that is “free” from SEO (search engine optimization), PR, existing customers recommending your products, etc.
  3. The portion of revenue that comes from repeat customers

The Relationship Between CAC and First Purchase Profits Has a Dramatic Impact on CM

Suppose Company A spends $60 to acquire a customer and has GM of $90 on that customer's initial purchase. The contribution margin on that customer is already a positive $30 before accounting for organic or repeat customers; in other words, overall contribution margin tends to be extremely positive! Of course, the startups I see in eCommerce are rarely in this situation, but those that are can get to profitability fairly quickly if this relationship holds as they scale.

It would be more typical for companies to find that the initial purchase GM only covers a portion of CAC but that subsequent purchases lead to a positive relationship between the LTV (lifetime value) of the customer and CAC. If I assume the spend to acquire a customer is $60 and the GM is $30, then the CM on the first purchase would be negative (-$30), and it would take a second purchase with the same GM dollars to cover that initial cost. Most startups require several purchases before recovering CAC, which in turn means they require investment dollars to cover the outlay.
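To make the payback arithmetic concrete, here is a minimal Python sketch; the $60 CAC and $30/$90 GM figures come from the examples above, and the helper name is mine.

```python
import math

def purchases_to_recover_cac(cac: float, gm_per_purchase: float) -> int:
    """Number of purchases (at a constant GM per purchase) needed to cover CAC."""
    return math.ceil(cac / gm_per_purchase)

print(purchases_to_recover_cac(60, 90))  # Company A: 1 purchase, CM positive immediately
print(purchases_to_recover_cac(60, 30))  # $30 GM case: 2 purchases to cover the $60 CAC
```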

Free Traffic and Contribution Margin

If a company can generate a high proportion of free/organic traffic, there is a benefit to contribution margin. CAC is defined as the marketing spend divided by the number of new customers derived from that spend. Blended CAC is defined as the marketing spend divided by all customers who purchased in the period. The more organically generated and returning customers, the lower the “blended CAC”. Using the above example, suppose 50% of the new customers for Company A come from organic (free) traffic. Then the “blended CAC” would be 50% of the paid CAC, or $30 instead of $60, and if the GM was only $30 the initial purchase would cover blended CAC.
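Here is a minimal sketch of the blended CAC arithmetic. The worked example above spreads the spend across new customers only (paid plus organic), so that is what this sketch does; the function name and customer counts are mine.

```python
def blended_cac(marketing_spend: float, paid_customers: int, organic_customers: int) -> float:
    """Marketing spend spread across all new customers, paid plus organic."""
    return marketing_spend / (paid_customers + organic_customers)

paid_customers = 1_000
spend = 60 * paid_customers        # $60 paid CAC
organic_customers = 1_000          # 50% of new customers arrive organically
print(blended_cac(spend, paid_customers, organic_customers))  # 30.0 -- half the paid CAC
```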

Of course, in addition to obtaining customers for free from organic traffic, companies, as they build their customer base, have an increasing opportunity to obtain free traffic by getting existing customers to buy again. So, a company should never forget that maintaining a persistent relationship with customers leads to improved Contribution Margin.

Spending to Drive Higher Growth Can Mean Lower Contribution Margin

Unless the GM on a new customer's first purchase exceeds their CAC, there is an inverse relationship between expanding growth and achieving high contribution margin. Think of it this way: suppose that going into a month the likely organic traffic and repeat buyers are somewhat set. Boosting that month's growth means increasing the number of new paid customers, which in turn makes paid customers a higher proportion of all customers and therefore increases blended CAC. As an example, consider the following assumptions for Company B:

  • The GM is $60 on an average order of $100
  • Paid CAC is $150
  • The company will have 1,000 new customers through organic means and 2,000 repeat buyers or $300,000 in revenue with 60% GM ($180,000) from these customers before spending on paid customers
  • G&A besides marketing for the month will be $150,000
  • Last year Company B had $400,000 in revenue in the same month
  • The company is considering the ramifications of targeting 25%, 50% or 100% year-over-year growth

Table 1: The Relationship Between Contribution Margin & Growth

Since the paid CAC is $150 while gross margin is only $60 per new customer, each acquired customer generates negative $90 in contribution margin in the period. As can be seen in Table 1, the company would shrink 25% if there is no acquisition spend but would have $180,000 in contribution margin and positive operating profit. At the other end of the spectrum, driving 100% growth requires spending $750,000 to acquire 5,000 new customers and results in negative $270,000 of contribution margin and an operating loss of $420,000 in the period. Of course, if new customers are expected to make multiple future purchases, then the number of repeat customers would rise in future periods.
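The arithmetic behind Table 1 can be reproduced from the bulleted assumptions above. The sketch below is mine (the scenario function and labels are not from the original table), but it lands on the same figures quoted in the text.

```python
AOV = 100                  # average order value
GM_PCT = 0.60              # gross margin percent
PAID_CAC = 150             # cost to acquire one paid customer
BASE_CUSTOMERS = 3_000     # 1,000 organic new customers + 2,000 repeat buyers
GA = 150_000               # monthly G&A excluding marketing
LAST_YEAR_REVENUE = 400_000

def scenario(target_growth: float) -> dict:
    target_revenue = LAST_YEAR_REVENUE * (1 + target_growth)
    paid_customers = max(0, (target_revenue - BASE_CUSTOMERS * AOV) / AOV)
    marketing = paid_customers * PAID_CAC
    contribution_margin = target_revenue * GM_PCT - marketing
    return {
        "revenue": target_revenue,
        "paid customers": paid_customers,
        "marketing spend": marketing,
        "contribution margin": contribution_margin,
        "operating profit": contribution_margin - GA,
    }

for growth in (-0.25, 0.25, 0.50, 1.00):
    print(f"{growth:+.0%}", scenario(growth))
# -25% (no paid spend): CM $180,000, operating profit +$30,000
# +100%: 5,000 paid customers, $750,000 spend, CM -$270,000, operating loss $420,000
```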

Subscription Models Create More Consistency but are not a Panacea

When a company's customers are monthly subscribers, each month starts with the prior month's base less churn. To put it another way, if churn from the prior month is modest (for example 5%), then that month already has 95% of the prior month's revenue from repeat customers. Additionally, if the company increases the average invoice value from these customers, it might even start the month with return customers accounting for as much revenue as the entire prior month. For B-to-B companies, high revenue retention is the norm; an average customer may keep paying for 10 years or more.

Consumer ecommerce subscriptions typically have much more substantial churn, with an average life of two years being closer to the norm. Additionally, the highest level of churn (which can be as much as 30% or more) occurs in the second month, and the next highest in the third month, before tapering off. What this means is that companies trying to drive high sequential growth will have a higher percentage churn rate than those that target more modest growth. Part of a company's acquisition spend is needed just to stay even. For example, if we assume all new customers come from paid acquisition, the CAC is $200, and 15% of 10,000 customers churn, then the first $300,000 in marketing spend would just serve to replace the churned customers, and additional spend would be needed to drive sequential growth.
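A minimal sketch of the "spend just to stay even" point, using the $200 CAC, 15% churn and 10,000-customer figures from the example; the function name is mine.

```python
def replacement_spend(customer_base: int, monthly_churn: float, cac: float) -> float:
    """Marketing dollars needed in a month just to replace churned subscribers."""
    return customer_base * monthly_churn * cac

print(replacement_spend(10_000, 0.15, 200))  # 300000.0 -- spent before any net growth
```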

Investing in Companies with High Contribution Margin

As a VC, I tend to appreciate strong business models and like to invest after some baseline proof points are in place. In my last post I outlined a number of metrics that are important ways to track a company's health, with the ratio of LTV (lifetime value) to CAC being one of the most important. When a company has a high contribution margin, it has the time to build that ratio by adding more products or establishing subscriptions without burning through a lot of capital. Further, companies that have a high LTV/CAC ratio should have a high contribution margin as they mature, since a high ratio usually means customers buy many times – leading to an expansion in repeat business as part of each month's total revenue.

This thought process also applies to public companies. One of the most extreme examples is Facebook, which I've owned and recommended for five years. Even after the recent pullback, its stock price is about 7x what it was five years ago (it has appreciated at a compound rate of nearly 50% per year since I've been recommending it). That's no surprise, as Facebook's contribution margin runs over 70% and revenue was up 42% year/year in Q2. These are extraordinary numbers for a company of its size.

To give the reader some idea of how this method can be used as one screen for public companies, Table 2 shows gross margin, contribution margin, revenue growth and this year’s stock market performance for seven public companies.

Table 2: Public Company Contribution Margin Analysis

Two of the seven companies shown stand out as having both high Contribution Margin and strong revenue growth: Etsy and Stitch Fix. Each had year/year revenue growth of around 30% in Q2 coupled with 44% and 29% contribution margins, respectively. This likely has been a factor in Stitch Fix stock appreciating 53% and Etsy 135% since the beginning of the year.

Three of the seven have weak models and are struggling to balance revenue growth and contribution margin: Blue Apron, Overstock, and Groupon. Blue Apron and Groupon have both been attempting to reduce their losses by cutting their marketing spend. While this increased their CM by 10% and 20% respectively, it also meant negative revenue growth while still losing money. The losses for Blue Apron were over 16% of revenue; that, coupled with shrinking revenue, feels like a lethal combination. Blue Apron stock is only down a marginal amount year-to-date but is 59% lower than one year ago. Groupon, because of much higher gross margins than Blue Apron (52% vs 35%), still seems to have a chance to turn things around, but it has a lot of work to do. Overstock went in the other direction, increasing marketing spend to drive modest revenue growth of 12%. But this led to a negative CM and substantially increased losses. That strategy did not seem to benefit shareholders, as the stock has declined 53% since the beginning of the year.

eBay is a healthy company from a contribution margin point of view but has sub-10% revenue growth. I can't tell if increasing its marketing spend by a substantial amount (at the cost of lower CM) would be a better balance for it.

For me, Spotify is the one anomaly in the table, as its stock has appreciated 46% since the IPO despite weak contribution margins, which were one reason for my negative view expressed in a prior post. I think the appreciation is driven by three factors: its product is an iconic brand; there is not a lot of float in the stock, creating some scarcity; and contribution margin has been improving, giving bulls on the stock a belief that it can get to profitability eventually. I call it an anomaly because, comparing it to Facebook, it is hard to justify the relative valuations. Facebook grew 42% in Q2, Spotify 26%; Facebook is trading at a P/E of 24, whereas even if we assume Spotify can eventually generate a 6% net profit (it is currently at a 7% loss before finance charges and a 31% loss after finance charges, so this feels optimistic), Spotify would be trading at 112 times this theoretic future earnings.

 

SoundBytes

I found the recent controversy over Elon Musk’s sharing his thoughts on taking Tesla private interesting. On the one hand, people want transparency from companies and Elon certainly provides that! On the other hand, it clearly impacted the stock price for a few days and the SEC abhors anything that can be construed as stock manipulation. Of course, Elon may not have been as careful as he should have been when he sent out his tweet regarding whether financing was lined up…but like most entrepreneurs he was optimistic.

Interesting KPIs (Key Performance Indicators) for a Subscription Company


In working with early stage businesses, I often get the question of what metrics management and the board should use to help understand a company's progress. It is important for every company to establish a set of consistent KPIs that are used to objectively track progress. While these need to be part of each board package, it is even more important for the executive team to use them in managing the company. While this post focuses on SaaS/Subscription companies, the majority of it applies to most other types of businesses.

Areas KPIs Should Cover

  1. P&L Trends
  2. MRR (Monthly Recurring Revenue) and LTR (Lifetime Revenue)
  3. CAC (Cost of Customer Acquisition)
    1. Marketing to create leads
    2. Customers acquired electronically
    3. Customers acquired using sales professionals
  4. Gross Margin and LTV (Life Time Value of a customer)
  5. Marketing Efficiency

Many companies will also need KPIs regarding inventory in addition to the ones above.

While there may be very complex analysis behind some of these numbers, it’s important to try to keep KPIs to 2-5 pages of a board package. Use of the right KPIs will give a solid, objective, consistent top-down view of the company’s progress. The P&L portion of the package is obviously critical, but I have a possibly unique view on how this should be included in the body of a board package.

P&L Trends: Less is More

One mistake many companies make is confusing detail with better analysis. I often see models that have 50-100 line items for expenses and show these by month for 3 or more years out… but show one or no years of history. This wastes a great deal of time on predicting things that are inconsequential and controllable (by month), while eliminating all perspective. Things like seasonality are lost if one is unable to view 3 years of revenue at a time without scrolling from page to page. Of course, for the current year's budget it is appropriate for management to establish monthly expectations in detail, but for any long-term planning, success revolves around revenue, gross margins, marketing/sales spend, and the number of employees. For some companies that are deep technology players there may be significant costs in R&D other than payroll, but this is the exception. By using a simple formula for G&A based on the number of employees, the board can apply a sanity check on whether cost estimates in the long-term model will be on target, assuming revenue is on target. So why spend excessive time on nits? Aggregating costs frees up time for better understanding how and why revenue will ramp, the relationship between revenue types and gross margin, the cost of acquiring a customer, the lifetime value of a customer, and the average spend per employee.

In a similar way, the board is well served by viewing a simple P&L by quarter for 2 prior years plus the current one (with a forecast of remaining quarters). The lines could be:

Table 1: P&L by Quarter

A second version of the P&L should be produced for budget comparison purposes. It should have the same rows but have the columns be current period actual, current period budget, year to date (YTD) actual, year to date budget, current full year forecast, budget for the full year.

Table 2: P&L Actual / Budget Comparison

Tracking MRR and LTR

For any SaaS/Subscription company (I’ll simply refer to this as SaaS going forward) MRR growth is the lifeblood of the company with two caveats: excessive churn makes MRR less valuable and excessive cost in growing MRR also leads to deceptive prosperity. More about that further on. MRR should be viewed on a rolling basis. It can be done by quarter for the board but by month for the management team. Doing it by quarter for the board enables seeing a 3-year trend on one page and gives the board sufficient perspective for oversight. Management needs to track this monthly to better manage the business. A relatively simple set of KPIs for each of 12 quarterly periods would be:

Table 3: MRR and Retention

Calculating Life Time Revenue through Cohort Analysis

The detailed method of calculating LTR does not need to be shown in every board package; it should be included at least once per year for the board but calculated monthly for management.

The LTR calculation uses a grid where the columns are the various quarterly cohorts, that is, all customers who first purchased in that quarter (management might also do this monthly instead of quarterly). This analysis can be applied to non-SaaS companies as well as SaaS entities. The first row would be the number of customers in the cohort. The next row would be the first month's revenue for the cohort, the next the second month's revenue, and so on until reaching 36 months (or whatever number the board prefers for B2B… I prefer 60 months). The next row would be the total for the full period and the final row would be the average lifetime revenue (LTR) per member of the cohort.

Table 4: Customer Lifetime Revenue

A second table would replicate the grid but show average per member of the cohort for each month (row). That table allows comparisons of cohorts to see if the average revenue of a newer cohort is getting better or worse than older ones for month 2 or month 6 or month 36, etc.

Table 5: Average Revenue per Cohort

Cohorts with a full 36 months of data need to be at least 36 months old. This means more recent cohorts will not have a full set of information but can still be used to see what trends have occurred. For example, is the second month's average revenue for a current cohort much less than it was for a cohort one year ago? While newer cohorts do not have full sets of monthly revenue data, they are still very relevant in calculating more recent LTR. This can be done by using the average monthly declines in sequential months and applying them to cohorts with fewer months of data.
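Here is a minimal sketch of the cohort grid and the extrapolation described above. The revenue figures and cohort names are made up purely to illustrate the mechanics; reusing an older cohort's month-over-month declines follows the description in the paragraph above.

```python
# Rows are months since first purchase; columns (dict keys) are quarterly cohorts.
cohorts = {
    "2017-Q1": [50_000, 30_000, 24_000, 21_000],   # four months observed
    "2017-Q4": [60_000, 33_000],                   # only two months so far
}
cohort_sizes = {"2017-Q1": 1_000, "2017-Q4": 1_100}

def ltr_per_customer(cohort: str) -> float:
    """Average lifetime revenue per member of a cohort, over the months observed so far."""
    return sum(cohorts[cohort]) / cohort_sizes[cohort]

def extrapolated_ltr(cohort: str, reference: str, horizon: int) -> float:
    """Extend a young cohort using the sequential declines observed in an older cohort."""
    observed = list(cohorts[cohort])
    ref = cohorts[reference]
    while len(observed) < horizon:
        i = min(len(observed), len(ref) - 1)      # reuse the last ratio if we run out
        observed.append(observed[-1] * ref[i] / ref[i - 1])
    return sum(observed) / cohort_sizes[cohort]

print(ltr_per_customer("2017-Q1"))                        # older cohort, actuals only
print(extrapolated_ltr("2017-Q4", "2017-Q1", horizon=4))  # young cohort projected to month 4
```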

Customer Acquisition Cost (CAC)

Calculating CAC is done in a variety of ways and is quite different for customers acquired electronically versus those obtained by a sales force.  Many companies I’ve seen have a combination of the two.

Marketing used to generate leads should always be considered part of CAC. The marketing cost in a month is first divided by the number of leads to generate a cost per lead (CPL). The next step is to estimate the conversion rate of leads to customers. A simple table would be as follows:

Table 6: Customer Acquisition Costs


For an eCommerce company, the additional cost to convert might be one free month of product or a heavily subsidized price for the first month. If the customer is getting the item before becoming a regular paying customer, then the CAC would be:

CAC = MCTC / the percent that converts from the promotional trial to a paying customer.

CAC when a Sales Force is Involved

For many eCommerce companies and B2B companies that sell electronically, marketing is the primary cost involved in acquiring a paying customer. For those utilizing a sales force, the marketing expense plus the sales expense must be accumulated to determine CAC.

Typically, what this means is steps 1 through 3 above would still be used to determine CPL, but step 1 above might include marketing personnel used to generate leads plus external marketing spend:

  1. CPL (cost per lead) as above
  2. Sales Cost = current month’s cost of the sales force including T&E
  3. New Customers in the month = NC
  4. Conversion Rate to Customer = NC/number of leads= Y%
  5. CAC = CPL/Y% + (Sales Cost)/NC

There are many nuances ignored in this simple method. For example, some leads may take many months to close. Some may go through a pilot before closing. There are therefore more sophisticated methods of calculating CAC, but using this method begins the process of understanding an important indicator of customer acquisition efficiency.
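As a concrete (if simplified) illustration of steps 1 through 5, here is a short sketch; the dollar figures are made up and the function name is mine.

```python
def cac_with_sales_force(marketing_spend: float, leads: int,
                         sales_cost: float, new_customers: int) -> float:
    """CAC = CPL / conversion rate + sales cost per new customer (steps 1-5 above)."""
    cpl = marketing_spend / leads           # step 1: cost per lead
    conversion = new_customers / leads      # step 4: lead-to-customer conversion rate
    return cpl / conversion + sales_cost / new_customers

# Hypothetical month: $50,000 of lead-gen marketing yields 1,000 leads; a $120,000
# sales team (including T&E) closes 100 new customers.
print(cac_with_sales_force(50_000, 1_000, 120_000, 100))  # 1700.0 per new customer
```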

Gross Margin (GM) is a Critical Part of the Equation

While revenue is obviously an important measure of success, not all revenue is the same. Revenue that generates 90% gross margin is a lot more valuable per dollar than revenue that generates 15% gross margin. When measuring a company’s potential for future success it’s important to understand what level of revenue is required to reach profitability. A first step is understanding how gross margin may evolve. When a business scales there are many opportunities to improve margins:

  • Larger volumes may lead to larger discounts from suppliers
  • Larger volumes for products that are software/content may lower the hosting cost as a percent of revenue
  • Shipping to a larger number of customers may allow opening additional distribution centers (DCs) to facilitate serving customers from a DC closer to their location, lowering shipping cost
  • Larger volumes may mean improved efficiency in the warehouse. For example, it may make more automation cost effective

When forecasting gross margin, it is important to be cautious in predicting some of these savings. The board should question radical changes in GM in the forecast. Certain efficiencies should be visible in a quarterly trend, and a marked improvement from that trend needs to be justified. The more significant jump in GM from a second DC can be calculated by comparing the shipping rates for customers who will be serviced from the new DC with the rates for those customers from the existing one.

Calculating LTV (Lifetime Value)

Gross margin by itself may fall short as a measure of the variable profit on a customer. If payment is by credit card, then the credit card cost per customer is part of variable costs. Some companies do not include shipping charges as part of cost of goods, but they should always be part of variable cost. Customer service is typically another cost that rises in proportion to the number of customers. So:

Variable cost = Cost of Goods sold plus any cost that varies directly with sales

Variable Profit = Revenue – Variable Cost

Variable Profit% (VP%) = (Variable Profit)/Revenue

LTV = LTR x VP%

The calculation of VP% should be based on current numbers as they will apply going forward. Determining a company’s marketing efficiency requires comparing LTV to the cost of customer acquisition. As mentioned earlier in the post, if the CAC is too large a proportion of LTV, a company may be showing deceptive (profitless) growth. So, the next set of KPIs address marketing efficiency.
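A minimal sketch of the LTV arithmetic defined above; the revenue and cost figures are made up for illustration.

```python
def ltv(ltr: float, revenue: float, cogs: float, other_variable_costs: float) -> float:
    """LTV = LTR x Variable Profit %, with VP% taken from current numbers."""
    variable_profit_pct = (revenue - cogs - other_variable_costs) / revenue
    return ltr * variable_profit_pct

# Hypothetical P&L: $1,000,000 revenue, $550,000 COGS, $100,000 of other variable costs
# (credit card fees, shipping, customer service), and $500 of lifetime revenue per customer.
print(ltv(ltr=500, revenue=1_000_000, cogs=550_000, other_variable_costs=100_000))  # 175.0
```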

Marketing Efficiency

It does not make sense to invest in an inefficient company, as it will burn through capital at a rapid rate and will find it difficult to become profitable. A key measure of efficiency is the relationship between LTV and CAC, or LTV/CAC. Essentially this is how many dollars of variable profit the company will make for every dollar it spends on marketing and sales. A ratio of 5 or more usually means the company is efficient. The period used for calculating LTR will influence this number. Since churn tends to be much lower for B2B companies, 5 years is often used to calculate their LTR and LTV. But using 5 years means waiting longer to receive the resulting profits and can obscure the cash flow implications of slower recovery of CAC. So, a second metric important to understanding burn is how long it takes to recover CAC:

CAC Recovery Time = number of months until variable profit equals the CAC

The longer the CAC recovery time, the more capital required to finance growth. Of course, existing customers are also contributing to the month’s revenue alongside new customers. So, another interesting KPI is contribution margin which measures the current state of balance between marketing/sales and Variable Profits:

Contribution Margin = Variable Profits – Sales and Marketing Cost

Early on this number will be negative as there aren’t enough older customers to cover the investment in new ones. But eventually the contribution margin in a month needs to turn positive. To reach profitability it needs to exceed all other costs of the business (G&A, R&D, etc.). By reducing a month’s marketing cost, a company can improve contribution margin that month at the expense of sequential growth… which is why this is a balancing act.
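The two balancing metrics discussed here can be sketched in a few lines; the input figures below are made up for illustration.

```python
import math

def cac_recovery_months(cac: float, monthly_variable_profit_per_customer: float) -> int:
    """Months until cumulative variable profit from a customer equals the CAC."""
    return math.ceil(cac / monthly_variable_profit_per_customer)

def contribution_margin(variable_profit: float, sales_and_marketing: float) -> float:
    """Contribution Margin = Variable Profits - Sales and Marketing Cost."""
    return variable_profit - sales_and_marketing

print(cac_recovery_months(200, 25))            # 8 months of variable profit to recover CAC
print(contribution_margin(400_000, 500_000))   # -100000: still investing ahead of scale
```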

I realize this post is long but wanted to include a substantial portion of KPIs in one post. However, I’ll leave more detailed measurement of sales force productivity and deeper analysis of several of the KPIs discussed here for one or more future posts.

Soundbytes

I'll begin by apologizing for a midyear brag, but I always tell others to enjoy success and therefore am about to do that myself. In my top ten predictions for 2018 I included a market prediction and 4 stock predictions. I was feeling pretty good that they were all working well when I started to create this post. However, the stock prices of high growth companies can experience serious shifts in very short periods. Facebook and Tesla both had (what I consider) minor shortfalls against expectations in the 10 days since I started writing and have declined quite a bit in that period. But given the strength of my other two recommendations, Amazon and Stitch Fix, the four still have an average gain of 15% as of July 27. Since I've only felt comfortable predicting the market when it was easy (after 9/11 and after the 2008 mortgage blowup), I was nervous about predicting the S&P would be up this year, as it was a closer call and somewhat controversial given the length of the bull market prior to this year. But it seemed obvious that the new tax law would be very positive for corporate earnings, so I thought the S&P would be up despite the likelihood of rising interest rates. So far, it is ahead 4.4% year to date, driven by stronger earnings. Since I always fear that my record of annual wins can't continue, I wanted to take a midyear victory lap just in case everything collapses in the second half of the year (which I don't expect but always fear). So I continue to hold all 4 stocks and in fact bought a bit more Facebook today.

Highlights From the 2018 Azure CEO Summit

It’s All About the Network

On June 13th, 2018, Azure held our 12th Annual CEO Summit, hosted at the Citrix Templeton Conference Center. Success for our companies is typically predicated on the breadth and depth of their networks in Silicon Valley and beyond. This event is a cornerstone of how we support this, providing a highly curated, facilitated opportunity to expand connections for business development, fund-raising, and strategic partner dialogue. It is also an opportunity for our portfolio companies to develop strong relationships with our investors, networks, and among each other, which provides business partnership opportunities, potential future investors and is a first step towards engaging with future acquirers. An incidental benefit to Azure is that the appeal of the event also leads to expansion of our own network.

Throughout the day, we had participation of nearly 70 corporate entities, venture funds and financial institutions, including Amazon, Google, Apple, P&G, Citrix, Ericsson, Intel, Microsoft, Oracle, Trinet, Arcserv, Citibank, SVB, and UBS, in addition to 28 of Azure’s portfolio companies, and six Canadian startups which were invited as part of Azure’s Canada-Bridge initiative. The Canadian companies were selected from a group of about 100 nominated by Canadian VCs. At the event, the six winners gained access to Azure’s Silicon Valley network not only through participation along with our portfolio CEOs in the approximately 370 one-on-one meetings we arranged but also through networking opportunities throughout the rest of the day and into the evening.

Nearly all the Azure portfolio companies participating gave demo-day style presentations to the full audience, which expanded the reach of their message beyond the more intimate one-on-one meetings.

Visionary Keynote Speakers

Azure was quite fortunate in once again having several visionary keynote speakers who provided inspiration and thought-provoking inputs from their experiences as highly successful entrepreneurs and investors.

The first was David Ko, currently President and COO, Rally Health, and formerly SVP, Yahoo and COO, Zynga (famous for Farmville which peaked at 34.5 million daily active users). David provided his vision for the consumer-focused future for managing health and shared lessons learned from his journeys both in taking Zynga public and in leading Rally Health as it has grown in eight years from a company with low single-digit millions in revenue to more than a billion in revenue. Rally works with more than 200,000 employers to help drive employee engagement in their health. Accessible to more than 35 million people, Rally’s digital platform and solutions help people adopt healthier lifestyles, select health benefits, and choose the best doctor at the right price for their needs. The company’s wellness solution focuses on four key areas to improve health: nutrition, exercise, stress reduction and preventive health. Given the astronomical increase in the portion of U.S. GDP spent on healthcare, David pointed out how critical it is to help individuals improve their “wellness” tactics. He believes this is one of the waves of the future to curb further acceleration of healthcare cost.

Shai Agassi, Former President, Product and Technology Group, SAP, and former CEO, Better Place responded to questions posed by me and the audience during a fireside chat.  Shai first shared his experience of building a business that successfully became integrated into SAP, but the heart of his session revolved around his perspectives on the evolution of the electric car and the future emergence of (safe) automated vehicles. He painted a vivid picture of what the oncoming transition to a new generation of vehicles means for the future, where automated, electric cars will become the norm (in 5-10 years). As a result, he believes people will reduce their use of their own cars and instead, use an “automated Uber-like service” for much of their transportation. In such a world, many people won’t own a car and for those that do, their autos will have much longer useful lives thereby reducing the need to replace cars with the same frequency. If he proves correct, this would clearly have major ramifications for auto manufacturers and the oil industry.

Our final keynote speaker was Ron Suber, President Emeritus, Prosper Marketplace, who is referred to as “The Godfather of Fintech”.  Ron shared with us his perspective that we’re at the beginning stages of the ‘Golden Age of Fintech’ which he believes will be a 20-year cycle. He expects to continue to see a migration to digital, accessible platforms driven by innovation by existing players and new entrants to the market that will disrupt the incumbents. What must be scary to incumbents is that the new entrants in fintech include tech behemoths like Paypal, Google, Amazon, Tencent (owner of WeChat), Facebook and Apple.  While traditional banks may have access to several hundred million customers, these players can leverage their existing reach into relationships with billions of potential customers. For example,  WeChat and Instagram have both recently surpassed one billion users. With digital/mobile purchasing continuing to gain market share, a player like Apple can nearly force its users to include Apple Pay as one of their apps giving Apple some unique competitive advantages. Amazon and WeChat (in China) are in a strong position to leverage their user bases.

All That Plus a Great Dinner

After an action packed daytime agenda, the Summit concluded with a casual cocktail hour and outdoor dinner in Atherton. Most attendees joined, and additional members of the Azure network were invited as well. The dinner enabled significant networking to continue and provided an additional forum for some who were not able to be at the daytime event to meet some of our portfolio executives.

The Bottom Line – It’s About Results

How do we measure the success of the Summit? We consider it successful if several of our companies garner potential investors, strike business development deals, etc.  As I write this, only nine days after the event, we already know of a number of investment follow-ups, more than ten business-development deals being discussed, and multiple debt financing conversations. Investment banks and corporate players have increased awareness of the quality of numerous companies who presented. Needless to say, Azure is pleased with the bottom line.

The Valuation Bible


After many years of successfully picking public and private companies to invest in, I thought I’d share some of the core fundamentals I use to think about how a company should be valued. Let me start by saying numerous companies defy the logic that I will lay out in this post, often for good reasons, sometimes for poor ones. However, eventually most companies will likely approach this method, so it should at least be used as a sanity check against valuations.

When a company is young, it may not have any earnings at all, or it may be at an earnings level (relative to revenue) that is expected to rise. In this post, I’ll start by considering more mature companies that are approaching their long-term model for earnings to establish a framework, before addressing how this framework applies to less mature companies. The post will be followed by another one where I apply the rules to Tesla and discuss how it carries over into private companies.

Growth and Earnings are the Starting Points for Valuing Mature Companies

When a company is public, the most frequently cited metric for valuation is its price to earnings ratio (PE). This may be done based on either a trailing 12 months or a forward 12 months. In classic finance theory a company should be valued based on the present value of future cash flows. What this leads to is our first rule:

Rule 1: Higher Growth Rates should result in a higher PE ratio.

When I was on Wall Street, I studied hundreds of growth companies (this analysis does not apply to cyclical companies) over the prior 10-year period and found that there was a very strong correlation between a given year’s revenue growth rate and the next year’s revenue growth rate. While the growth rate usually declined year over year if it was over 10%, on average this decline was less than 20% of the prior year’s growth rate. What this means is that if we took a group of companies with a revenue growth rate of 40% this year, the average organic growth for the group would likely be about 33%-38% the next year. Of course, things like recessions, major new product releases, tax changes, and more could impact this, but over a lengthy period of time this tended to be a good sanity test. As of January 2, 2018, the average S&P company had a PE ratio of 25 on trailing earnings and was growing revenue at 5% per year. Rule 1 implies that companies growing faster should have higher PEs and those growing slower, lower PEs than the average.
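As a rough sanity check, this growth-persistence observation can be coded in a few lines; treating the "decline of less than 20% of the prior year's growth rate" as an upper bound is my interpretation, not a precise rule.

```python
def next_year_growth_range(current_growth: float, max_decay: float = 0.20) -> tuple:
    """Rough expected range for next year's organic growth, for growth companies above 10%."""
    return (current_growth * (1 - max_decay), current_growth)

low, high = next_year_growth_range(0.40)
print(f"{low:.0%} to {high:.0%}")   # 32% to 40% -- consistent with the ~33%-38% cited above
```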

Graph 1: Growth Rates vs. Price Earnings Ratios


The graph shows the correlation between growth and PE based on the valuations of 21 public companies. Based on Rule 1, those above the line may be relatively under-priced and those below relatively over-priced. I say 'may be' because there are many other factors to consider, and the above is only one of several ways to value companies. Notice that most of the theoretically over-priced companies with growth rates under 5% are traditional companies that have long histories of success and pay a dividend. It may be that it takes several years for the market to adjust to their changed circumstances, or they may be valued based on the return from the dividend. For example, is Coca Cola trading on past glory, on its 3.5% dividend, or is there something about current earnings that is deceptive (revenue growth has been a problem for several years as people switch from soda to healthier drinks)? I am not up to speed enough to know the answer. Those above the line may be buys despite appearing to be highly valued by other measures.

Relatively early in my career (in 1993-1995) I applied this theory to make one of my best calls on Wall Street: “Buy Dell, sell Kellogg”. At the time, Dell was growing revenue over 50% per year while Kellogg was struggling to grow over 4% annually (its compounded growth from 1992 to 1995, which was partly driven by price increases). Yet Dell's PE was about half that of Kellogg and well below the S&P average. So the call, while radical at the time, was an obvious consequence of Rule 1. Fortunately for me, Dell's stock appreciated over 65X from January 1993 to January 2000 (and well over 100X while I had it as a top pick), while Kellogg, despite large appreciation in the overall stock market, saw its stock decline slightly over the same 7-year period (though holders did receive annual dividends).

Rule 2: Predictability of Revenue and Earnings Growth should drive a higher trailing PE

Investors place a great deal of value on predictability of growth and earnings, which is why companies with subscription/SaaS models tend to get higher multiples than those with regular sales models. It is also why companies with large sales backlogs usually get additional value. In both cases, investors can more readily value the companies on forward earnings since they are more predictable.

Rule 3: Market Opportunity should impact the Valuation of Emerging Leaders

When one considers why high growth rates might persist, the size of the market opportunity should be viewed as a major factor. The trick here is to make sure the market being considered is really the appropriate one for that company. In the early 1990s, Dell had a relatively small share of a rapidly growing PC market. Given its competitive advantages, I expected Dell to gain share in this mushrooming market. At the same time, Kellogg had a stable share of a relatively flat cereal market, hardly a formula for growth. In recent times, I have consistently recommended Facebook in this blog for the very same reasons I had recommended Dell: in 2013, Facebook had a modest share of online advertising, a market expected to grow rapidly. Given the advantages Facebook had (and they were apparent, as I saw every Azure ecommerce portfolio company moving a large portion of marketing spend to Facebook), it was relatively easy for me to realize that Facebook would rapidly gain share. During the time I've owned and recommended it, this has worked out well, as the share price is up over 8X.

How the rules can be applied to companies that are pre-profit

As a VC, it is important to evaluate what companies should be valued at well before they are profitable. While this is nearly impossible to do when we first invest (and won’t be covered in this post), it is feasible to get a realistic range when an offer comes in to acquire a portfolio company that has started to mature. Since they are not profitable, how can I apply a PE ratio?

What needs to be done is to try to forecast eventual profitability when the company matures. A first step is to see where current gross margins are and to understand whether they can realistically increase. The word realistic is the key one here. For example, if a young ecommerce company currently has one distribution center on the west coast, like our portfolio company Le Tote, the impact on shipping costs of adding a second eastern distribution center can be modeled based on current customer locations and known shipping rates from each distribution center. Such modeling, in the case of Le Tote, shows that gross margins will increase 5%-7% once the second distribution center is fully functional. On the other hand, a company that builds revenue city by city, like food service providers, may have little opportunity to save on shipping.

  • Calculating Variable Profit Margin

Once the forecast range for “mature” gross margin is estimated, the next step is to identify other costs that will increase in some proportion to revenue. For example, if an ecommerce company acquires most of its new customers through Facebook, Google and other advertising and has high churn, the spend on customer acquisition may continue to increase in direct proportion to revenue. Similarly, if customer service needs to be labor intensive, this can also be a variable cost. So, the next step in the process is to assess where one expects the “variable profit margin” to wind up. While I don't know the company well, this appears to be a significant issue for Blue Apron: marketing and cost of goods add up to about 90% of revenue. I suspect that customer support probably eats up (no pun intended) 5-10% of what is left, putting variable margins very close to zero. If I assume the company can eventually generate a 10% variable profit margin (which gives it credit for strong execution), it would need to reach about $4 billion in annual revenue to break even if other costs (product, technology and G&A) do not increase. That means increasing revenue nearly 5-fold, which at its current YTD growth rate would take 9 years and explains why the stock has a low valuation.

  • Estimating Long Term Net Margin

Once the variable profit margin is determined, the next step would be to estimate what the long-term ratio of all other operating cost might be as a percent of revenue. Using this estimate I can determine a Theoretic Net Earnings Percent. Applying this percent to current (or next years) revenue yields a Theoretic Earnings and a Theoretic PE (TPE):

TPE = Market Cap / Theoretic Earnings

To give you a sense of how I have successfully used this, review my recap of the Top Ten Predictions from 2017, where I correctly predicted that Spotify would not go public last year despite strong top-line growth, as it was hard to see how its business model could support more than roughly a 2% positive operating margin, and even that required renegotiating royalty deals with record labels. Now that Spotify has successfully negotiated a 3% lower royalty rate from several of the labels, it appears that the 16% gross margins of 2016 could rise to 19% or more by the end of 2018. This means that variable margins (after marketing cost) might be 6%. This would narrow its losses, but it still might be several years before the company achieves the 2% operating margins discussed in that post. As a result, Spotify appears headed for a non-traditional IPO, clearly fearing that portfolio managers would be unlikely to value it at its private valuation price, since that would imply a TPE of over 200. Since Spotify is loved by many consumers, individuals might be willing to overpay relative to my valuation analysis.
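A minimal sketch of the TPE calculation; the market cap, revenue and margin inputs below are made up, chosen only to land in the ballpark of the "TPE of over 200" figure discussed for Spotify.

```python
def theoretic_pe(market_cap: float, revenue: float, theoretic_net_margin: float) -> float:
    """TPE = Market Cap / Theoretic Earnings, where theoretic earnings apply an assumed
    long-term net margin to current (or next year's) revenue."""
    return market_cap / (revenue * theoretic_net_margin)

# Hypothetical inputs: a $20B valuation on $5B of revenue at an assumed 2% long-term margin.
print(theoretic_pe(market_cap=20e9, revenue=5e9, theoretic_net_margin=0.02))  # 200.0
```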

Our next post will pick up this theme by walking through why this leads me to believe Tesla continues to have upside, and then discussing how entrepreneurs should view exit opportunities.

 

SoundBytes

I've often written about effective shooting percentage relative to Stephen Curry, and once again he leads the league among players who average 15 points or more per game. What also accounts for the Warriors' success is the effective shooting of Klay Thompson, who is 3rd in the league, and Kevin Durant, who is 6th. Not surprisingly, Lebron is also in the top 10 (4th). The table below shows the top ten among players averaging 15 points or more per game. Of the top ten scorers in the league, 6 are among the top 10 effective shooters, with James Harden only slightly behind at 54.8%. The remaining 3 are Cousins (53.0%), Lillard (52.2%), and Westbrook, who at 47.4% is the only one below the league average of 52.1%.

Table: Top Ten Effective Shooters in the League


*Note: Bolded players denote those in the top 10 in Points per Game
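For readers who want the definition behind the table, effective field goal percentage is the standard NBA adjustment that credits the extra point a made three is worth; the stat line in the sketch below is made up.

```python
def effective_fg_pct(fgm: int, three_pm: int, fga: int) -> float:
    """Effective field goal percentage: (FGM + 0.5 * 3PM) / FGA."""
    return (fgm + 0.5 * three_pm) / fga

print(f"{effective_fg_pct(fgm=8, three_pm=4, fga=16):.1%}")  # 62.5% on a hypothetical 8-of-16 night
```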

Ten Predictions for 2018

In my recap of 2017 predictions I pointed out how boring my stock predictions have been with Tesla and Facebook on my list every year since 2013 and Amazon on for two of the past three years. But what I learned on Wall Street is that sticking with companies that have strong competitive advantages in a potentially mega-sized market can create great performance over time (assuming one is correct)! So here we go again, because as stated in my January 5 post, I am again including Tesla, Facebook and Amazon in my Top ten list for 2018. I believe they each continue to offer strong upside, as explained below. I’m also adding a younger company, with a modest market cap, thus more potential upside coupled with more risk. The company is Stitch Fix, an early leader in providing women with the ability to shop for fashion-forward clothes at home. My belief in the four companies is backed up by my having an equity position in each of them.

I’m expecting the four stocks to outperform the market. So, in a steeply declining market, out-performance might occur with the stock itself being down (but less than the market). Having mentioned the possibility of a down market, I’m predicting the market will rise this year. This is a bit scary for me, as predicting the market as a whole is not my specialty.

We'll start with the stock picks (with the January 2 opening price of each stock shown in parentheses) and then move on to the remainder of my 10 predictions.

1. Tesla stock appreciation will continue to outpace the market (it opened the year at $312/share).

The good news and the bad news on Tesla both revolve around the delays in production of the Model 3. The good part is that we can still look forward to massive increases in the number of cars the company sells once Tesla gets production ramping (I estimate the Model 3 backlog was well in excess of 500,000 units going into 2018, and demand appears to be growing). In 2017, Tesla shipped between 80,000 and 100,000 vehicles, with revenue up 30% in Q3 without help from the Model 3. If the company is successful at ramping capacity (and acquiring needed parts), it expects to reach a production rate of 5,000 cars per week by the end of Q1 and 10,000 by the end of the year. That could mean the number of units produced in Q4 2018 will be more than four times the number sold in Q4 2017 (with revenue about 2.0-2.5x, due to the Model 3 being a lower priced car). Additionally, while it is modest compared to revenue from selling autos, the company appears to be the leader in battery production. It recently announced the largest battery deal ever, a $50 million contract (now completed on time) to supply what is essentially a massive backup battery complex for energy to South Australia. While this type of project is unlikely to be a major portion of revenue in the near term, it can add to Tesla's growth rate and profitability.

2. Facebook stock appreciation will continue to outpace the market (it opened the year at $182/share).

The core Facebook user base growth has slowed considerably, but Facebook has a product portfolio that includes Instagram, WhatsApp and Oculus. This gives Facebook multiple opportunities for revenue growth: improve the revenue per DAU (daily active user) on Facebook itself; increase efforts to monetize Instagram and WhatsApp in more meaningful ways; and build the installed base of Oculus. Facebook advertising rates have been increasing steadily as more mainstream companies shift budget from traditional advertising to Facebook, especially in view of declining TV viewership coupled with increased use of DVRs (allowing viewers to skip ads). Higher advertising rates, combined with modest growth in DAUs, should lead to continued strong revenue growth. And while the Oculus product did not get out of the gate as fast as expected, it began picking up steam in Q3 2017 after Facebook reduced prices. At 210,000 units for the quarter, it may have contributed up to 5% of Q3 revenue. The wild card here is a “killer app” (a software application that becomes a must-have) launching that is only available on the Oculus; if that happens, sales of Oculus could jump substantially in a short time.

3. Amazon stock appreciation will outpace the market (it opened the year at $1188/share).

Amazon, remarkably, increased its revenue growth rate in 2017 as compared to 2016, which is unusual for a company of its size. In 2018, we expect online to continue to pick up share in retail and Amazon to gain more share of online. The acquisition of Whole Foods will add approximately $4B per quarter in revenue, boosting Amazon's year/year revenue growth by an additional 9%-11% per quarter if Whole Foods revenue remains flattish. If Amazon achieves organic growth of 25% in 2018 (in Q3 it was 29%, so that would be a drop), this would put the 3 quarters starting in Q4 2017 at about 35% growth. While we do expect Amazon to boost Whole Foods revenue, that is not required to reach those levels. In Q4 2018, reported revenue growth will return to organic levels. The Amazon story also features two other important growth drivers. First, I expect the Echo, with its industry-leading voice technology, to have another substantial growth year and continue to emerge as a new platform in the home. Second, Amazon appears poised to benefit from continued business migration to the cloud, coupled with increased market share and higher average revenue per cloud customer, driven by modest price increases and the introduction of more services as part of its cloud offering. Additionally, having a large footprint of physical stores will allow Amazon to increase distribution of many products.

4. Stitch Fix stock appreciation will outpace the market (it opened the year at $25/share and is at the same level as I write this post).

Stitch Fix is my riskiest stock forecast. As a new public company, it has yet to establish a track record of performance that one can depend upon. On the other hand, it's the early leader in a massive market that will increasingly move online: at-home shopping for fashion-forward clothes. The number of people who prefer shopping at home to going to a physical store is on the increase, and the type of goods they are willing to buy this way expands every year. Clothing is now a category on the rapid rise online (growing from 11% of overall clothing retail sales in 2011 to 19% in 2016). It is important for women buying this way to feel that the provider understands what they want and makes it easy to obtain clothes they prefer. Stitch Fix uses substantial data analysis to personalize each box it sends a customer. The customer can try the items on, keep (and pay for) those she likes, and return the rest very easily.

5. The stock market will rise in 2018 (the S&P opened the year at 2,696 on January 2).

While I have been accurate in recommending individual stocks over a long period, I rarely believe that I understand what will happen to the overall market. Two prior exceptions were after 9/11 and after the meltdown generated by the 2008 mortgage crisis. I was correct both times, but those seemed like easy calls. So, it is with great trepidation that I'm including this prediction, as it is based on logic and I know the market does not always follow logic! To put it simply, the new tax bill is quite favorable to corporations and therefore should boost after-tax earnings. What larger corporations pay is often a blend of taxes on U.S. earnings and those on earnings in various countries outside the U.S., and there can be numerous other factors as well. Companies like Microsoft have lower blended tax rates because much of R&D and corporate overhead is in the United States while several of their key products are sold out of a subsidiary in a low tax location, thereby lowering the portion of pre-tax earnings here. This and other factors (like tax benefits in fiscal 2017 from previous phone business losses) led to blended tax rates in fiscal 2015, 2016 and 2017 of 34%, 15% and 8%, respectively. Walmart, on the other hand, generated over 75% of its pre-tax earnings in the United States over the past three fiscal years, so its blended rate was over 30% in each of those years.

Table 1: Walmart Blended Tax Rates 2015-2017

The degree to which any specific company's pre-tax earnings mix shifts between the United States and other countries is unpredictable to me, so I'm providing a table showing the impact on after-tax earnings growth for theoretical companies instead. Table 2 shows the impact of lowering the U.S. corporate tax rate from 35% to 21% on four example companies. To provide context, I show two companies growing pre-tax earnings by 10% and two companies by 30%. If blended tax rates didn't change, EPS would grow by the same amount as pre-tax earnings. For Companies 1 and 3, Table 2 shows what the increase in earnings would be if their blended 2017 tax rate was 35% and the 2018 rate shifts to 21%. For Companies 2 and 4, Table 2 shows what the increase in earnings would be if the 2017 rate was 30% (Walmart's blended rate over the past three years) and the 2018 blended rate is 20%.

Table 2: Impact on After-Tax Earnings Growth

As you can see, companies that have the majority of 2018 pre-tax earnings subject to the full U.S. tax rate could experience EPS growth 15%-30% above their pre-tax earnings growth. On the other hand, if a company has a minimal amount of earnings in the U.S. (like the 5% of earnings Microsoft had in fiscal 2017), the benefit will be minimal. Whatever benefits do accrue will also boost cash, leading to potential investments that could help future earnings. If companies that benefit most from this see no decline in their P/E ratio, this would mean a substantial increase in their share price, hence the forecast of an up market. But as I learned on Wall Street, it's important to cite risk. The biggest risks to this forecast are the expected rise in interest rates this year (which is usually negative for the market) and the fact that the market is already at all-time highs.
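Since Table 2 itself is not reproduced here, the sketch below reconstructs the four example companies as described in the text (I've assumed Companies 1 and 2 are the 10% growers); the printed figures are the sketch's output and may differ slightly from the table's rounding.

```python
def after_tax_eps_growth(pretax_growth: float, old_rate: float, new_rate: float) -> float:
    """Year-over-year growth in after-tax earnings when the blended tax rate changes."""
    return (1 + pretax_growth) * (1 - new_rate) / (1 - old_rate) - 1

examples = [   # (name, pre-tax growth, 2017 blended rate, 2018 blended rate)
    ("Company 1", 0.10, 0.35, 0.21),
    ("Company 2", 0.10, 0.30, 0.20),
    ("Company 3", 0.30, 0.35, 0.21),
    ("Company 4", 0.30, 0.30, 0.20),
]
for name, growth, old, new in examples:
    print(name, f"{after_tax_eps_growth(growth, old, new):.1%}")
# Company 1 33.7%, Company 2 25.7%, Company 3 58.0%, Company 4 48.6% --
# i.e., roughly 15-30 points above pre-tax growth, as noted above.
```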

6. Battles between the federal government and states will continue over marijuana use but the cannabis industry will emerge as one to invest in.

The battle over legalization of Marijuana reached a turning point in 2017 as polls showed that over 60% of Americans now favor full legalization (as compared to 12% in 1969). Prior to 2000, only three states (California, Oregon and Maine) had made medical cannabis legal. Now 29 states have made it legal for medical use and six have legalized sale for recreational use. Given the swing in voter sentiment (and a need for additional sources of tax revenue), more states are moving towards legalization for recreational and medical purposes. This has put the “legal” marijuana industry on a torrid growth curve. In Colorado, one of the first states to broadly legalize use, revenue is over $1 billion per year and overall 2017 industry revenue is estimated at nearly $8 billion, up 20% year/year. Given expected legalization by more states and the ability to market product openly once it is legal, New Frontier Data predicts that industry revenue will more than triple by 2025. The industry is making a strong case that medical use has compelling results for a wide variety of illnesses and high margin, medical use is forecast to generate over 50% of the 2025 revenue. Given this backdrop, public cannabis companies have had very strong performance. Despite this, in 2016, VCs only invested about $49 million in the sector. We expect that number to escalate dramatically in 2017 through 2019. While public cannabis stocks are trading at nosebleed valuations, they could have continued strong performance as market share consolidates and more states (and Canada) head towards legalization. One caveat to this is that Federal law still makes marijuana use illegal and the Trump administration is adopting a more aggressive policy towards pursuing producers, even in states that have made use legal. The states that have legalized marijuana use are gearing up to battle the federal government.

7. At least one city will announce a new approach to Urban transport

Traffic congestion in cities continues to worsen. Our post on December 14, 2017 discussed a new approach to urban transportation, utilizing small footprint automated cars (one to two passengers, no trunk, no driver) in a dedicated corridor. This appears much more cost effective than a Rapid Bus Transit solution and far more affordable than new subway lines. As discussed in that post, Uber and other ride services increase traffic and don’t appear to be a solution. The thought that automating these vehicles will relieve pressure is overly optimistic. I expect at least one city to commit to testing the method discussed in the December post before the end of this year – it is unlikely to be a U.S. city. The approach outlined in that post is one of several that is likely to be tried over the coming years as new thinking is clearly needed to prevent the traffic congestion that makes cities less livable.

8. Offline retailers will increase the velocity of moving towards omnichannel.

Retailers will adopt more of a multi-pronged approach to increasing their participation in e-commerce. I expect this to include:

  • An increased pace of acquisition of e-commerce companies, technologies and brands with Walmart leading the way. Walmart and others need to participate more heavily in online as their core offline business continues to lose share to online. In 2017, Walmart made several large acquisitions and has emerged as the leader among large retailers in moving online. This, in turn, has helped its stock performance. After a stellar 12 months in which the stock was up over 40%, it finally exceeded its January 2015 high of $89 per share (it reached $101/share as we are finalizing the post). I expect Walmart and others in physical retail to make acquisitions that are meaningful in 2018 so as to speed up the transformation of their businesses to an omnichannel approach.
  • Collaborating to introduce more online technology into their physical stores (which Amazon is likely to do in Whole Foods stores). This can take the form of in-store screens for ordering online (à la Williams-Sonoma), having online purchases shipped to your local store (already done by Nordstrom), and using technology to let customers create personalized items right in the store, which would subsequently be produced and shipped by a partner.

9. Social commerce will begin to emerge as a new category.

Many e-commerce sites have added elements of social, and many social sites have begun trying to sell various products. But few of these have a fully integrated social approach to e-commerce. The elements of such an approach include (a minimal data-model sketch follows the list):

  • A feed-based user experience
  • Friends’ actions impact your feed
  • Following trend setters to see what they are buying, wearing, and favoring
  • Notifications based on your likes and tastes
  • One click to buy
  • Following particular stores and/or friends
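To make these elements concrete, below is a minimal sketch (in Python) of the kind of data model a fully integrated social commerce feed implies. The names and fields are illustrative assumptions on my part, not any particular company’s design.

```python
# Hypothetical sketch of a social commerce feed data model; all names are
# illustrative assumptions, not any specific company's schema or API.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Product:
    product_id: str
    name: str
    price: float

@dataclass
class FeedEvent:
    """One feed entry: a friend, trendsetter, or store bought/favored/wore an item."""
    actor: str          # a friend, trendsetter, or followed store
    action: str         # e.g. "bought", "favorited", "wore"
    product: Product

@dataclass
class User:
    user_id: str
    following: List[str] = field(default_factory=list)  # friends, trendsetters, stores
    likes: List[str] = field(default_factory=list)       # tastes that drive notifications

def build_feed(user: User, events: List[FeedEvent]) -> List[FeedEvent]:
    """Feed-based UX: surface only the actions of people and stores the user follows."""
    return [e for e in events if e.actor in user.following]
```

Notifications and one-click purchase would hang off the same objects: the user’s likes and the product attached to each feed event.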

I expect to see existing e-commerce players adding more elements of social, existing social players improving their approach to commerce and a rising trend of emerging companies focused on fully integrated social commerce.

10. “The Empire Strikes Back”: automobile manufacturers will begin to take steps to reclaim use of their onboard GPS systems.

It is almost shameful that automobile manufacturers, other than Tesla, have lost substantial usage of their onboard GPS systems, as many drivers use their cell phones or a small standalone device to run Google Maps, Waze (owned by Google) or Garmin instead of the larger screen in their car. In the hundreds of times I’ve taken an Uber or Lyft, I’ve never seen the driver use their car’s system. To modernize their existing systems, manufacturers may need to license software from a third party. Several companies are offering next-generation products that claim to match the route optimization offered by Waze while adding features that go beyond it, such as letting the driver order coffee or other items so they are prepaid and waiting at a convenient stop along the route. In addition to adding value for the user, this creates a lead-generation revenue opportunity. In 2018, I expect one or more auto manufacturers to commit to including a third-party product in one or more of their models.

Soundbytes

Tesla Model 3 sample car generates huge buzz at Stanford Mall in Menlo Park, California. This past weekend my wife and I experienced something we had not seen before – a substantial line of people waiting to check out a car, one of the first Model 3s to be seen live. We were walking through the Stanford Mall, where Tesla has a “Guide Store,” and came upon a line of about 60 people willing to wait a few hours to check out one of the two Model 3s available for viewing in California (the other was in L.A.). An hour later we came back, and the line had grown to 80 people. To be clear, the car was not available for a test drive – it was only there to see, sit in, and learn more about. Given the buzz involved, it seems to me that as other locations receive Model 3 cars to display, the number of people ordering a Model 3 each week might increase faster than Tesla’s capacity to fulfill those orders.

Re-cap of 2017 Top Ten Predictions

I started 2017 by saying:

When I was on Wall Street I became very boring by having the same three strong buy recommendations for many years…  until I downgraded Compaq in 1998 (it was about 30X the original price at that point). The other two, Microsoft and Dell, remained strong recommendations until I left Wall Street in 2000. At the time, they were each well over 100X the price of my original recommendation. I mention this because my favorite stocks for this blog include Facebook and Tesla for the 4th year in a row. They are both over 5X what I paid for them in 2013 ($23 and $45, respectively) and I continue to own both. Will they get to 100X or more? This is not likely, as companies like them have had much higher valuations when going public compared with Microsoft or Dell, but I believe they continue to offer strong upside, as explained below.

Be advised that my top ten for 2018 will continue to include all three picks from 2017. I’m quite pleased that I continue to be fortunate, as the three were up an average of 53% in 2017. Furthermore, each of my top ten forecasts proved pretty accurate, as well!

I’ve listed in bold the 2017 stock picks and trend forecasts below, and give a personal evaluation of how I fared on each. For context, the S&P was up 19% and the Nasdaq 28% in 2017.

  1. Tesla stock appreciation will continue to outpace the market. Tesla, once again, posted very strong performance.  While the Model 3 experienced considerable delays, backorders for it continued to climb as ratings were very strong. As of mid-August, Tesla was adding a net of 1,800 orders per day and I believe it probably closed the year at over a 500,000-unit backlog. So, while the stock tailed off a bit from its high ($385 in September), it was up 45% from January 3, 2017 to January 2, 2018 and ended the year at 7 times the original price I paid in 2013 when I started recommending it. Its competitors are working hard to catch up, but they are still trailing by quite a bit.
  2. Facebook stock appreciation will continue to outpace the market. Facebook stock appreciated 57% year/year and opened on January 2, 2018 at $182 (nearly 8 times my original price paid in 2013 when I started recommending it). This was on the heels of 47% revenue growth (through 3 quarters) and even higher earnings growth.
  3. Amazon stock appreciation will outpace the market. Amazon stock appreciated 57% in 2017 and opened on January 2, 2018 at $1,188 per share. It had been on my recommended list in 2015 when it appreciated 137%. Taking it off in 2016 was based on Amazon’s stock price getting a bit ahead of itself (and revenue did catch up that year growing 25% while the stock was only up about 12%). In 2017, the company increased its growth rate (even before the acquisition of Whole Foods) and appeared to consolidate its ability to dominate online retail.
  4. Both online and offline retailers will increasingly use an omnichannel approach. Traditional retailers started accelerating the pace at which they attempted to blend online and offline in 2017. Walmart led, finally realizing it had to step up its game to compete with Amazon. While its biggest acquisition was Jet.com for over $3 billion, it also acquired Bonobos, Modcloth.com, Moosejaw, Shoebuy.com and Hayneedle.com, creating a portfolio of online brands that could also be sold offline. Target focused on becoming a leader in one-day delivery by acquiring Shipt and Grand Junction, two leaders in home delivery. While I had not predicted anything as large as a Whole Foods acquisition for Amazon, I did forecast that they would increase their footprint of physical locations (see October 2016 Soundbytes). The strategy for online brands to open “Guide” brick-and-mortar stores (e.g. Tesla, Warby Parker, Everlane, etc.) continued at a rapid pace.
  5. A giant piloted robot will be demo’d as the next form of entertainment. As expected, Azure portfolio company, Megabots, delivered on this forecast by staging an international fight with a giant robot from Japan. The fight was not live as the robots are still “temperamental” (meaning they occasionally stop working during combat). However, interest in this new form of entertainment was incredible as the video of the fight garnered over 5 million views (which is in the range of an average prime-time TV show). There is still a large amount of work to be done to convert this to an ongoing form of entertainment, but all the ingredients are there.
  6. Virtual and Augmented reality products will escalate. Sales of VR/AR headsets appear to have well exceeded 10 million units for the year, with higher-end products gaining some share. The types of applications have expanded from gaming to room design (and viewing), travel, inventory management, education, healthcare, entertainment and more. While the actual growth in unit sales fell short of what many expected, it was still substantial. With Apple’s acquisition of Vrvana (an augmented reality headset maker), it seems clear that Apple plans to launch multiple products in the category over the next 2-3 years, and with Facebook’s launch of AR Studio, its social AR development platform, there is clearly a lot of focus and growth ahead.
  7. Magic Leap will disappoint in 2017. Magic Leap, after 5 years of development and $1.5 billion of investment, did not launch a product in 2017. But in late December the company announced that its first product will launch in 2018. Once again, it has made strong claims about what the product will do, and some have said early adopters (at a very hefty price, likely in the $1,500 range) will be like those who bought the first iPod. So, while Magic Leap disappointed in 2017, it is difficult to tell whether this will eventually be a winning company, as it’s hard to separate hype from reality.
  8. Cable companies will see a slide in adoption. According to eMarketer, “cord cutting”, i.e. getting rid of cable, reached record proportions in 2017, well exceeding their prior forecast. Just as worrisome to providers, the average time watching TV dropped as well, implying decreased dependence on traditional consumption. Given the increase now evident in cord cutting, UBS (as I did a year ago) is now forecasting substantial acceleration of the decline in subscribers. While the number of subscribers bounced around a bit between 2011 and 2015, when all was said and done, the aggregate drop in that four-year period was less than 0.02%. UBS now forecasts that between the end of 2016 and the end of 2018 the drop will be 7.3%. The more the industry tries to offset the drop by price increases, the more they will accelerate the pace of cord cutting.
  9. Spotify will either postpone its IPO or have a disappointing one. When we made this forecast, Spotify was expected to go public in Q2 2017. Spotify postponed its IPO into 2018 while working on new contracts with the major music labels to try to improve its business model. It was successful in these negotiations, in that the labels all agreed to new terms. Since the terms were not announced, we’ll need to see financials for Q1 2018 to better understand the magnitude of improvement. In the first half of the year, Spotify reported that gross margins improved from 16% to 22%, but this merely cut its loss level rather than moving the company to profitability. It has stated that it expects to do a non-traditional IPO (a direct listing without using an investment bank) in the first half of 2018. If the valuation approaches its last private round, I would caution investors to stay away, as that valuation, coupled with 22% gross margins (and over 12% of revenue in sales and marketing cost to acquire customers), implies net margin in the mid-single digits at best, assuming they can reduce R&D and G&A as a percent of revenue (a rough sketch of this margin math follows the list). This becomes much more challenging in the face of a $1.6 billion lawsuit filed against Spotify for illegally offering songs without compensating the music publisher. Even if the company manages to successfully fight the lawsuit and improve margins, it would be valued at close to 100 times “potential earnings”, and these earnings may not even materialize.
  10. Amazon’s Echo will gain considerable traction in 2017. Sales of the Echo exploded in 2017, with Amazon announcing that it “sold 10s of millions of Alexa-enabled devices”, exceeding our aggressive forecast of 2-3x the 4.4 million units sold in 2016. The Alexa app was also the top app for both Android and iOS phones. Alexa has clearly carved out a position as a major new platform.
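As a rough illustration of the Spotify margin math referenced in point 9 above, here is a back-of-the-envelope sketch. The gross margin and sales and marketing figures come from the discussion above; the R&D plus G&A assumption is my own.

```python
# Back-of-the-envelope Spotify margin math; the R&D + G&A figure is my own
# optimistic assumption, everything else comes from the discussion above.
revenue = 100.0                       # normalize revenue to 100
gross_profit = 0.22 * revenue         # ~22% gross margin reported for H1 2017
sales_marketing = 0.12 * revenue      # over 12% of revenue spent acquiring customers
rd_and_ga = 0.05 * revenue            # assumes R&D + G&A squeezed to ~5% of revenue

potential_net_margin = gross_profit - sales_marketing - rd_and_ga
print(f"Potential net margin: ~{potential_net_margin:.0f}% of revenue")  # ~5%
# At a valuation near the last private round, that is roughly 100x these
# "potential earnings", earnings that may never materialize.
```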

Stay tuned for my top 10 predictions of 2018!

 

SoundBytes

  • In our December 20, 2017 post, I discussed just how much Steph Curry improves teammate performance and how effective a shooter he is. I also mentioned that Russell Westbrook leading the league in scoring in the prior season might have been detrimental to his team, as his shooting percentage falls well below the league average. Now, in his first game back in the lineup, Curry had an effective shooting percentage that exceeded 100% while scoring 38 points, meaning he scored more than 2 points for every shot taken (the formula and a worked example follow this note). It would be interesting to know whether Curry is the first player ever to score over 35 points with an effective shooting percentage above 100%! Also, as of now, the Warriors are scoring over 15 points more per game this season with Curry in the lineup than they did in the 11 games he was out, which ties directly to the 7.4% improvement in field goal percentage that his teammates achieve when playing with Curry, as discussed in the post.
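For readers unfamiliar with the statistic, effective field goal percentage credits a made three-pointer as 1.5 made field goals, which is how a player can exceed 100%. The sketch below uses a hypothetical shooting line, not Curry’s actual box score, purely to show the arithmetic.

```python
# Effective field goal percentage: eFG% = (FGM + 0.5 * 3PM) / FGA
# The 0.5 bonus for made threes is what allows eFG% to exceed 100%.
def effective_fg_pct(fgm: int, three_pm: int, fga: int) -> float:
    """Return effective field goal percentage (as a percent)."""
    return 100.0 * (fgm + 0.5 * three_pm) / fga

# Hypothetical line (not Curry's actual box score): 10-of-13 from the field
# with 8 made threes, plus 6-of-6 from the free throw line.
points = 2 * (10 - 8) + 3 * 8 + 6                            # 4 + 24 + 6 = 34 points
print(points, effective_fg_pct(fgm=10, three_pm=8, fga=13))  # 34 points, eFG% ~107.7
```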

Using Technology to Revolutionize Urban Transit

[Photo: winter traffic]

Worsening traffic requires new solutions

As our population increases, the traffic congestion in cities continues to worsen. In the Bay Area my commute into the city now takes about 20% longer than it did 10 years ago, and driving outside of typical rush hours is now often a major problem. In New York, the subway system helps quite a bit, but most of Manhattan is gridlocked for much of the day.

The two key ways of relieving cities from traffic snarl are:

  1. Reduce the number of vehicles on city streets
  2. Increase the speed at which vehicles move through city streets

Metro areas have been experimenting with different measures to improve car speed, such as:

  1. Encouraging carpooling and implementing high occupancy vehicle lanes on arteries that lead to urban centers
  2. Converting more streets to one-way with longer periods of green lights
  3. Prohibiting turns onto many streets as turning cars often cause congestion

No matter what a city does, traffic will continue to get worse unless compelling and effective urban transportation systems are created and/or enhanced. With that in mind, this post will review current alternatives and discuss various ways of attacking this problem.

Ride sharing services have increased congestion

Uber and Lyft have not helped relieve congestion; they have probably increased it, as many rideshare vehicles cruise the streets while awaiting their next ride. While the growth of ridesharing services like Uber and Lyft may have reduced the number of people who commute in their own cars, it has merely substituted an Uber driver for a personal driver. Commuters park their cars when they arrive at work, while ridesharing drivers continue to cruise after dropping off a passenger, so the real benefit has been in reducing demand for parking, not in improving traffic congestion.

A simple way to think about this is that the total number of cars on the street at any point in time consists of those carrying someone to a destination plus those cruising while awaiting their next passenger. Uber does not reduce the number of people going to a destination by car (and probably increases it, as some Uber riders would have taken public transportation if not for Uber).

The use of traffic-aware routing apps like Waze doesn’t reduce traffic but spreads it more evenly among alternate routes, providing a modest increase in the speed at which vehicles move through city streets. The idea that automating these vehicles will relieve pressure is unrealistic, as automated vehicles will still be subject to the same movement as those with drivers (who use Waze). Automating ridesharing cars can modestly reduce the number of cruising vehicles, as Uber and Lyft can optimize how many remain in cruise mode, but it will not reduce the number of cars transporting someone to a destination. So, it is clear to me that ridesharing services increase rather than reduce the number of vehicles on city streets, and will continue to do so even when they are driverless.

Metro rail systems effectively reduce traffic but are expensive and can take decades to implement

Realistically, improving traffic flow requires cities to enhance their urban transport systems, thereby reducing the number of vehicles on their streets. There are several historic alternatives, but the only one that can move significant numbers of passengers from point A to point B without impacting other traffic is a rail system. However, construction of a rail system is costly, highly disruptive, and can take decades to go from concept to completion. For example, the New York City Second Avenue Line was tentatively approved in 1919. It is educational to read the history of the delays, but the actual project didn’t begin until 2005, despite many millions of dollars being spent on planning well before that date. The first construction commenced in April 2007. The first phase cost $4.5 billion and included 3 stations and 2 miles of tunnels; it was completed and the line opened in January 2017. By May, daily ridership was approximately 176,000 passengers. A second phase is projected to cost an additional $6 billion, add 1.5 more miles to the line, and be completed 10-12 years from now (assuming no delays). Phases 1 and 2 together, from actual start to hopeful finish, will span over two decades from the 2005 start date… and about a century from when the line was first considered!

Dedicated bus rapid transit is less costly but also less effective

Most urban transportation networks include bus lines through city streets. While buses do reduce the number of vehicles on the roads, they have several challenges that keep them from being the most efficient method of urban transport:

  1. They need to stop at traffic lights, slowing down passenger movement
  2. When they stop to let one passenger on or off, all other passengers are delayed
  3. They are very large and often force other street traffic to slow down

One way of improving bus efficiency is a Dedicated Bus Rapid Transit System (BRT), which creates a dedicated corridor for buses to use. The key to increasing the number of passengers such a system can transport is to remove buses from normal traffic (thus the dedicated lanes) and to reduce or eliminate stops at traffic lights, either by altering light timing to minimize bus stoppage or by creating overpasses and/or underpasses. If traffic lights are altered, the bus doesn’t stop for lights, but cross traffic may stop longer, increasing cross-traffic congestion. Eliminating interference with underpasses and/or overpasses at each intersection can be quite costly given the substantial size of buses. San Francisco has adopted the first (less optimal but less costly) approach along a two-mile corridor of Van Ness Avenue. The cost will still be over $200 million (excluding new buses), and it is expected to increase ridership from about 16,000 passengers per day to as much as 22,000, which I estimate translates to 2,000-3,000 passengers per hour in each direction during peak hours (a rough sketch of this estimate follows below). Given the increased time cross traffic will need to wait, it isn’t clear how much actual benefit will occur.
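The peak-hour figure above is my own rough conversion; the sketch below lays out the assumptions behind it (the peak share and peak window are guesses) and adds a cost-per-mile comparison using the Second Avenue subway figures cited earlier.

```python
# Rough conversion of projected daily Van Ness BRT ridership to a peak-hour,
# per-direction figure. The peak share and peak window are my assumptions.
daily_riders = 22_000
peak_share = 0.6          # assumed share of daily riders traveling in peak hours
peak_hours = 3            # assumed combined AM + PM peak window (hours)
directions = 2

peak_per_hour_per_direction = daily_riders * peak_share / peak_hours / directions
print(f"~{peak_per_hour_per_direction:,.0f} passengers per hour per direction")  # ~2,200

# Cost comparison, using the figures cited in this post (in $ millions per mile):
brt_cost_per_mile = 200 / 2            # $200M over 2 miles  -> ~$100M per mile
subway_cost_per_mile = 4_500 / 2       # $4.5B over 2 miles  -> ~$2,250M per mile
print(brt_cost_per_mile, subway_cost_per_mile)
```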

Will Automated Car Rapid Transit (ACRT) be the most cost-effective solution?

I recently met with a company that expects to create a new alternative, automated car rapid transit (ACRT), using very small automated cars; it would cost a fraction of a BRT while providing more than double the capacity. The basic concept is to create a corridor similar to that of a BRT, utilizing underpasses under some streets and bridges over others, so cross traffic would not be affected by longer traffic light stoppages. Since an underpass (tunnel) sized for a very small car is a fraction of the size of one for a very large bus, so is the cost. The cars would be specially designed driverless vehicles with no trunk and no back seats, holding one or two passengers. The same 3.5 to 4.0-meter-wide lane needed for a BRT would be sufficient for more than two lanes of such cars. Since the cars would be autonomous, speed and spacing could be controlled so that all cars in the corridor move at 30 miles per hour until they exit. With overpasses and underpasses at each cross street, the cars would never stop for lights. Each vehicle would hold one or two passengers going to the same stop, so a car would not slow until it reached that destination; when it did, it would pull off the road without reducing speed until it was on the exit ramp.

The company claims it will have the capacity to transport 10,000 passengers per hour per lane with the same setup as the Van Ness corridor, if underpasses and overpasses were added. Since a capacity of 10,000 passengers per hour in each direction would provide significant excess capacity compared to likely usage, two lanes (3 meters in total width instead of 7-8 meters) is all such a system would require. The reduced width would reduce construction cost while still providing excess capacity. Passengers would arrive at destinations much sooner than by bus, as the cars would travel at 30 miles per hour without stopping even once; this translates to a 2-mile trip taking 4 minutes! Compare that to any experience you have had taking a bus. The speed of movement also makes each vehicle available to many more passengers during a day (a quick sanity check of these figures follows below). While it is still unproven, this technology appears to offer a significant cost/benefit advantage versus the other alternatives.
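As a quick sanity check on these claims, the sketch below confirms the 4-minute trip time and shows how tight the spacing between cars would need to be to reach 10,000 passengers per hour per lane. The average occupancy figure is my assumption, not the company’s.

```python
# Sanity check on the ACRT figures; average occupancy is my assumption.
MPH_TO_MPS = 0.44704

speed_mph = 30
speed_mps = speed_mph * MPH_TO_MPS               # ~13.4 m/s
trip_minutes = 2.0 / speed_mph * 60              # 2-mile trip at a constant 30 mph
print(f"Trip time: {trip_minutes:.0f} minutes")  # 4 minutes, as cited

passengers_per_hour = 10_000
riders_per_car = 1.5                             # assumed average occupancy
cars_per_hour = passengers_per_hour / riders_per_car
headway_s = 3600 / cars_per_hour                 # time between successive cars
spacing_m = speed_mps * headway_s                # distance between car fronts
print(f"Headway: {headway_s:.2f} s, spacing: {spacing_m:.1f} m")
# ~0.54 s and ~7 m of spacing at 30 mph: feasible only with centrally
# coordinated, automated vehicles, which is exactly the premise of ACRT.
```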

Conclusion

The population expansion within urban areas will continue to drive increased traffic unless additional solutions are implemented. If it works as well in practice as it does in theory, an ACRT like the one described above offers one potential way of improving transport efficiency. However, this is only one of many potential approaches to solving the problem of increased congestion. Regardless of the technology used, this is a space where innovation must happen if cities are to remain livable. While investment in underground rail is also a potential way of mitigating the problem, it will remain an extremely costly alternative unless innovation occurs in that domain.