Re-cap of 2017 Top Ten Predictions

I started 2017 by saying:

When I was on Wall Street I became very boring by having the same three strong buy recommendations for many years…  until I downgraded Compaq in 1998 (it was about 30X the original price at that point). The other two, Microsoft and Dell, remained strong recommendations until I left Wall Street in 2000. At the time, they were each well over 100X the price of my original recommendation. I mention this because my favorite stocks for this blog include Facebook and Tesla for the 4th year in a row. They are both over 5X what I paid for them in 2013 ($23 and $45, respectively) and I continue to own both. Will they get to 100X or more? This is not likely, as companies like them have had much higher valuations when going public compared with Microsoft or Dell, but I believe they continue to offer strong upside, as explained below.

Be advised that my top ten for 2018 will again include all three stock picks from 2017. I’m quite pleased that I continue to be fortunate, as the three were up an average of 53% in 2017, and each of my other top ten forecasts proved pretty accurate as well!

I’ve listed in bold the 2017 stock picks and trend forecasts below, and give a personal evaluation of how I fared on each. For context, the S&P was up 19% and the Nasdaq 28% in 2017.

  1. Tesla stock appreciation will continue to outpace the market. Tesla, once again, posted very strong performance.  While the Model 3 experienced considerable delays, backorders for it continued to climb as ratings were very strong. As of mid-August, Tesla was adding a net of 1,800 orders per day and I believe it probably closed the year at over a 500,000-unit backlog. So, while the stock tailed off a bit from its high ($385 in September), it was up 45% from January 3, 2017 to January 2, 2018 and ended the year at 7 times the original price I paid in 2013 when I started recommending it. Its competitors are working hard to catch up, but they are still trailing by quite a bit.
  2. Facebook stock appreciation will continue to outpace the market. Facebook stock appreciated 57% year/year and opened on January 2, 2018 at $182 (nearly 8 times my original price paid in 2013 when I started recommending it). This was on the heels of 47% revenue growth (through 3 quarters) and even higher earnings growth.
  3. Amazon stock appreciation will outpace the market. Amazon stock appreciated 57% in 2017 and opened on January 2, 2018 at $1,188 per share. It had been on my recommended list in 2015 when it appreciated 137%. Taking it off in 2016 was based on Amazon’s stock price getting a bit ahead of itself (and revenue did catch up that year growing 25% while the stock was only up about 12%). In 2017, the company increased its growth rate (even before the acquisition of Whole Foods) and appeared to consolidate its ability to dominate online retail.
  4. Both online and offline retailers will increasingly use an omnichannel approach. Traditional retailers started accelerating the pace at which they attempted to blend online and offline in 2017. Walmart led, finally realizing it had to step up its game to compete with Amazon. While its biggest acquisition was Jet.com for over $3 billion, it also acquired Bonobos, Modcloth.com, Moosejaw, Shoebuy.com and Hayneedle.com, creating a portfolio of online brands that could also be sold offline. Target focused on becoming a leader in one-day delivery by acquiring Shipt and Grand Junction, two leaders in home delivery. While I had not predicted anything as large as a Whole Foods acquisition for Amazon, I did forecast that they would increase their footprint of physical locations (see October 2016 Soundbytes). The strategy for online brands to open “Guide” brick-and-mortar stores (e.g., Tesla, Warby Parker, Everlane) continued at a rapid pace.
  5. A giant piloted robot will be demo’d as the next form of entertainment. As expected, Azure portfolio company, Megabots, delivered on this forecast by staging an international fight with a giant robot from Japan. The fight was not live as the robots are still “temperamental” (meaning they occasionally stop working during combat). However, interest in this new form of entertainment was incredible as the video of the fight garnered over 5 million views (which is in the range of an average prime-time TV show). There is still a large amount of work to be done to convert this to an ongoing form of entertainment, but all the ingredients are there.
  6. Virtual and Augmented reality products will escalate. Sales of VR/AR headsets appear to have well exceeded 10 million units for the year, with some market gains for higher-end products. The types of applications have expanded from gaming to room design (and viewing), travel, inventory management, education, healthcare, entertainment and more. While the actual growth in unit sales fell short of what many expected, it was still substantial. With Apple’s acquisition of Vrvana (an augmented reality headset maker), it seems clear that Apple plans to launch multiple products in the category over the next 2-3 years, and with Facebook’s launch of AR Studio, its social AR development platform, there is clearly a lot of focus and growth ahead.
  7. Magic Leap will disappoint in 2017. Magic Leap, after 5 years of development and $1.5 billion of investment, did not launch a product in 2017. But in late December the company announced that its first product will launch in 2018. Once again, it has made strong claims for what the product will do, and some have said early adopters (at a very hefty price, likely in the $1,500 range) will be like those who bought the first iPod. So, while Magic Leap disappointed in 2017, it is difficult to tell whether this will eventually be a winning company, as it’s hard to separate hype from reality.
  8. Cable companies will see a slide in adoption. According to eMarketer, “cord cutting”, i.e. getting rid of cable, reached record proportions in 2017, well exceeding their prior forecast. Just as worrisome to providers, the average time watching TV dropped as well, implying decreased dependence on traditional consumption. Given the increase now evident in cord cutting, UBS (as I did a year ago) is now forecasting substantial acceleration of the decline in subscribers. While the number of subscribers bounced around a bit between 2011 and 2015, when all was said and done, the aggregate drop in that four-year period was less than 0.02%. UBS now forecasts that between the end of 2016 and the end of 2018 the drop will be 7.3%. The more the industry tries to offset the drop by price increases, the more they will accelerate the pace of cord cutting.
  9. Spotify will either postpone its IPO or have a disappointing one. When we made this forecast, Spotify was expected to go public in Q2 2017. Spotify postponed its IPO into 2018 while working on new contracts with the major music labels to try to improve its business model. It was successful in these negotiations in that the labels all agreed to new terms. Since the terms were not announced, we’ll need to see financials for Q1 2018 to better understand the magnitude of improvement. In the first half of the year, Spotify reported that gross margins improved from 16% to 22%, but this merely cut its loss level rather than move the company to profitability. It has stated that it expects to do a non-traditional IPO (a direct listing without using an investment bank) in the first half of 2018. If the valuation approaches its last private round, I would caution investors to stay away, as that valuation, coupled with 22% gross margins (and over 12% of revenue in sales and marketing cost to acquire customers), implies net margin in the mid-single digits at best (assuming they can reduce R&D and G&A as a percent of revenue). This becomes much more challenging in the face of a $1.6 billion lawsuit filed against it for illegally offering songs without compensating the music publisher. Even if they managed to successfully fight the lawsuit and improve margin, Spotify would be valued at close to 100 times “potential earnings” and these earnings may not even materialize.
  10. Amazon’s Echo will gain considerable traction in 2017. Sales of the Echo exploded in 2017, with Amazon announcing that it sold “tens of millions of Alexa-enabled devices,” exceeding our aggressive forecast of 2-3x the 4.4 million units sold in 2016. The Alexa app was also the top app for both Android and iOS phones. It clearly has carved out a niche as a new major platform.

Stay tuned for my top 10 predictions of 2018!

 

SoundBytes

  • In our December 20, 2017 post, I discussed just how much Steph Curry improves teammate performance and how effective a shooter he is. I also mentioned that Russell Westbrook leading the league in scoring in the prior season might have been detrimental to his team as his shooting percentage falls well below the league average. Now, in his first game returning to the lineup, Curry had an effective shooting percentage that exceeded 100% while scoring 38 points (this means scoring more than 2 points for every shot taken). It would be interesting to know if Curry is the first player ever to score over 35 points with an effective shooting percentage above 100%! Also, as of now, the Warriors are scoring over 15 points more per game this season with Curry in the lineup than they did for the 11 games he was out (which directly ties to the 7.4% improvement in field goal percentage that his teammates achieve when playing with Curry as discussed in the post).

Ending the Year on a High Note…or should I say Basketball Note

Deeper analysis on what constitutes MVP Value


In my blog post dated February 3, 2017, I discussed several statistics that are noteworthy in analyzing how much a basketball player contributes to his team’s success. In it, I compared Stephen Curry and Russell Westbrook using several advanced statistics that are not typically highlighted.

The first statistic: Primary plus Secondary Assists per Minute a player has the ball. Time with the ball equates to assist opportunity, so holding the ball most of the time one’s team is on offense reduces the opportunity for others on the team to have assists. This may lead to fewer assisted baskets for the whole team, but more for the individual player. As of the time of the post, Curry had 1.74 assists (primary plus secondary) per minute he had the ball, while Westbrook only had 1.30 assists per minute. Curry’s efficiency in assists is one of the reasons the Warriors total almost 50% more assists per game than the Thunder, make many more easy baskets, and lead the league in field goal percentage.

The second statistic: Effective Field Goal Percentages (where making a 3-point shot counts the same as making 1 ½ 2-point shots). Again, Curry was vastly superior to Westbrook at 59.1% vs 46.4%. What this means is that Westbrook scores more because he takes many more shots, but these shots are not very efficient for his team, as Westbrook’s shooting percentage continued to be well below the league average of 45.7% (Westbrook’s was 42.5% last season and is 39.6% this season to date).
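For readers who want the formula: effective field goal percentage counts a made three as 1.5 made twos, so points per shot (ignoring free throws) is simply twice the eFG%. Here is a minimal illustration in Python, using a hypothetical stat line rather than actual game data:

```python
def effective_fg_pct(fgm: int, three_pm: int, fga: int) -> float:
    """Effective FG% counts a made three-pointer as 1.5 made two-pointers."""
    return (fgm + 0.5 * three_pm) / fga

# Points per shot (ignoring free throws) is 2 * eFG%, so an eFG% above 100%
# means a player averaged more than 2 points per field goal attempt.
# Hypothetical line: 13 makes, 9 of them threes, on 17 attempts.
efg = effective_fg_pct(fgm=13, three_pm=9, fga=17)
print(f"eFG%: {efg:.1%}, points per shot: {2 * efg:.2f}")  # eFG%: 102.9%, points per shot: 2.06
```

This is also why an effective shooting percentage above 100%, as in the Curry game mentioned in the SoundBytes above, means averaging more than 2 points per shot taken.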

The third statistic: Plus/Minus. Plus/Minus reflects the number of points by which your team outscores opponents while you are on the floor. Curry led the league in this in 2013, 2014, and 2016 and leads year-to-date this season. In 2015 he finished second by a hair to a teammate. Westbrook has had positive results, but last year averaged 3.2 per 36 minutes vs Curry’s 13.8. One challenge to the impressiveness of this statistic for Curry is whether his leading the league in Plus/Minus is due to the quality of players around him. To refute that, it is interesting to note that he led the league in 2013, when Green was a sub, Durant wasn’t on the team and Thompson was not the player he is today.

The background above brings me to today’s post, which outlines another way of looking at a player’s value. The measurement I’m advocating is: how much does he help his teammates improve? My thesis is that if the key player on a team creates a culture of passing the ball and setting up teammates, everyone benefits. Currently, the value of helping teammates is measured only by the number of assists a player records. But if I’m right that the volume of assists is the wrong measure of helping teammates excel (since assists can be the result of holding the ball most of the time), then I should be able to verify this through teammate performance. If most players get easier shots when playing with Westbrook or Curry, this should translate into a better shooting percentage: teammates who played on another team the year before or the year after should show a distinct improvement in shooting percentage while playing alongside him. This won’t apply across the board, as some players simply get better or worse from year to year and other players on the team also affect the data. That said, looking at players who switch teams is relevant, especially if there is a consistent trend.

To measure this for Russell Westbrook, I’ve chosen 5 of the most prominent players who recently switched teams to or from Oklahoma City: Victor Oladipo, Kevin Durant, Carmelo Anthony, Paul George and Enes Kanter. Three left Oklahoma City and two went there from another team. For the two that went there, Paul George and Carmelo Anthony, I’ll compare year-to-date this season (playing with Westbrook) vs their shooting percentage last year (without Westbrook). For Kanter and Oladipo, the percentage last year will be titled “with Westbrook” and this year “without Westbrook”, and for Durant, the seasons in question are the 2015-16 season (with Westbrook) vs the 2016-17 season (without Westbrook).
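To make the comparison concrete, here is a minimal sketch of the calculation being run for each player; the percentages below are placeholders, not the actual figures, which appear in the table that follows:

```python
# Sketch of the with/without-Westbrook comparison described above.
# The shooting percentages here are placeholders; the real figures are in the table below.
shooting_pct = {
    # player: (pct_with_westbrook, pct_without_westbrook)
    "Player A": (0.442, 0.478),
    "Player B": (0.436, 0.465),
}

for player, (with_rw, without_rw) in shooting_pct.items():
    delta = (without_rw - with_rw) * 100
    print(f"{player}: {delta:+.1f} percentage points better without Westbrook")
```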

Shooting Percentage

[Table: teammate shooting percentages with vs. without Westbrook]

Given that the league average is to shoot 45.7%, shooting below that can hurt a team, while shooting above it should help. An average team takes 85.4 shots per game, so a 4.0% swing translates to over 8.0 points a game. To put that in perspective, the three teams with the best records this season are the Rockets, Warriors and Celtics, and they had the first, second and fourth best Plus/Minus for the season at +11.0, +11.0 and +5.9, respectively. The Thunder came in at +0.8. If they scored 8 more points a game (without giving up more), their Plus/Minus would have been on a par with the top three teams, and their record would likely be quite a bit better than 12-14.

Curry and His Teammates Make Others Better

How does Curry compare? Let’s look at the same statistics for Durant, Andrew Bogut, Harrison Barnes, Zaza Pachulia and Ian Clark (the primary player who left the Warriors). For Barnes, Bogut, Pachulia and Durant I’ll compare the 2015 and 2016 seasons and for Clark I’ll use 2016 vs this season-to-date.

[Table: teammate shooting percentages with vs. without Curry]

So, besides being one of the best shooters ever to play the game, Curry also has a dramatic impact on the efficiency of other players on his team. Perhaps it’s because opponents need to double-team him, which allows other players to be less guarded. Perhaps it’s because he bought into Kerr’s “spread the floor, move the ball” philosophy. Whatever the case, his willingness to give up the ball certainly has an impact. And that impact, plus his own shooting efficiency, clearly leads to the Warriors being an impressive scoring machine. As an aside, recent Warrior additions Casspi and Young are also having the best shooting percentages of their careers.

Westbrook is a Great Player Who Could be Even Better

I want to make it clear that I believe Russell Westbrook is a great player. His speed, agility and general athleticism allow him to do things that few other players can match. He can be extremely effective driving to the basket when he does so under control. But he is not a great outside shooter and could help his team more by taking fewer outside shots and playing less one-on-one basketball. Many believed that the addition of George and Anthony would make Oklahoma City a force to be reckoned with, but to date this has not been the case. Despite the theoretical offensive power these three bring to the table, the team is 24th in the league in scoring at 101.8 points per game, 15 points per game behind the league-leading Warriors. This may change over the course of the season, but I believe that each of them playing less one-on-one basketball would help.

Using Technology to Revolutionize Urban Transit


Worsening traffic requires new solutions

As our population increases, the traffic congestion in cities continues to worsen. In the Bay Area my commute into the city now takes about 20% longer than it did 10 years ago, and driving outside of typical rush hours is now often a major problem. In New York, the subway system helps quite a bit, but most of Manhattan is gridlocked for much of the day.

The two key ways of relieving cities from traffic snarl are:

  1. Reduce the number of vehicles on city streets
  2. Increase the speed at which vehicles move through city streets

Metro areas have been experimenting with different measures to improve car speed, such as:

  1. Encouraging carpooling and implementing high occupancy vehicle lanes on arteries that lead to urban centers
  2. Converting more streets to one-way with longer periods of green lights
  3. Prohibiting turns onto many streets as turning cars often cause congestion

No matter what a city does, traffic will continue to get worse unless compelling and effective urban transportation systems are created and/or enhanced. With that in mind, this post will review current alternatives and discuss various ways of attacking this problem.

Ride sharing services have increased congestion

Uber and Lyft have not helped relieve congestion. They have probably even increased it, as so many rideshare vehicles cruise the streets while awaiting their next ride. While the escalation of ridesharing services like Uber and Lyft may have reduced the number of people who commute to work in their own cars, it has merely substituted an Uber driver for the commuter driving himself or herself. Commuters park their cars when arriving at work, while ridesharing drivers continue to cruise after dropping off a passenger, so the real benefit has been in reducing demand for parking, not improving traffic congestion.

A simple way to think about this is that the total number of cars on the street at any point in time consists of those carrying someone to a destination plus those cruising while awaiting their next passenger. Uber does not reduce the number of people going to a destination by car (and probably increases it, as some Uber riders would have taken public transportation if not for Uber).

The use of traffic-aware routing GPS apps like Waze doesn’t reduce traffic but spreads it more evenly among alternate routes, thereby providing a modest increase in the speed at which vehicles move through city streets. The thought that automating these vehicles will relieve pressure is unrealistic, as automated vehicles will still be subject to the same congestion as those with drivers (who use Waze). Automating ridesharing cars can modestly reduce the number of cruising vehicles, as Uber and Lyft can optimize the number that remain in cruise mode. However, this will not reduce the number of cars transporting someone to a destination. So, it is clear to me that ridesharing services increase rather than reduce the number of vehicles on city streets and will continue to do so even when they are driverless.

Metro rail systems effectively reduce traffic but are expensive and can take decades to implement

Realistically, improving traffic flow requires cities to enhance their urban transport systems, thereby reducing the number of vehicles on their streets. There are several historic alternatives, but the only one that can move significant numbers of passengers from point A to point B without impacting other traffic is a rail system. However, construction of a rail system is costly, highly disruptive, and can take decades to go from concept to completion. For example, the New York City Second Avenue Line was tentatively approved in 1919. It is educational to read the history of the reasons for its delays, but the actual project didn’t begin until 2005, despite many millions of dollars being spent on planning well before that date. The first construction commenced in April 2007. The first phase of the construction cost $4.5 billion and included 3 stations and 2 miles of tunnels. This phase was completed, and the line opened in January 2017. By May, daily ridership was approximately 176,000 passengers. A second phase is projected to cost an additional $6 billion, add 1.5 more miles to the line, and be completed 10-12 years from now (assuming no delays). Phases 1 and 2 together, from actual start to hoped-for finish, will span over two decades from the 2005 start date…and about a century from when the line was first considered!

Dedicated bus rapid transit, less costly and less effective

Most urban transportation networks include bus lines through city streets. While buses do reduce the number of vehicles on the roads, they have several challenges that keep them from being the most efficient method of urban transport:

  1. They need to stop at traffic lights, slowing down passenger movement
  2. When they stop to let one passenger on or off, all other passengers are delayed
  3. They are very large and often force other street traffic to slow down

One way of improving bus efficiency is a Dedicated Bus Rapid Transit (BRT) system, which creates a dedicated corridor for buses to use. A key to increasing the number of passengers such a system can transport is to remove the buses from normal traffic (thus the dedicated lanes) and to reduce or eliminate the need to stop for traffic lights, either by altering their timing to automatically accommodate minimal stoppage of the buses or by creating overpasses and/or underpasses. If traffic lights are altered, the bus doesn’t stop for a light, but that can mean cross traffic stops longer, increasing cross-traffic congestion. Eliminating interference using underpasses and/or overpasses at each intersection can be quite costly given the substantial size of buses. San Francisco has adopted the first, less optimal but less costly, approach along a two-mile corridor of Van Ness Avenue. The cost will still be over $200 million (excluding new buses), and it is expected to increase ridership from about 16,000 passengers per day to as much as 22,000 (which I’m estimating translates to 2,000-3,000 passengers per hour in each direction during peak hours). Given the increased time cross traffic will need to wait, it isn’t clear how much actual benefit will occur.

Will Automated Car Rapid Transit (ACRT) be the most cost effective solution?

I recently met with a company that expects to create a new alternative using very small cars for automated car rapid transit (ACRT), which it claims would cost a fraction of a BRT and have more than double the capacity. The basic concept is to create a corridor similar to that of a BRT, utilizing underpasses under some streets and bridges over others, so cross traffic would not be affected by longer traffic light stoppages. Since the size of an underpass (tunnel) needed to accommodate a very small car is a fraction of that needed for a very large bus, so is the cost. The cars would be specially designed driverless automated cars that have no trunk, no back seats and hold one or two passengers. The same 3.5 to 4.0-meter-wide lane needed for a BRT would be sufficient for more than two lanes of such cars. Since the cars would be autonomous, speed and the distance between cars could be controlled so that all cars in the corridor move at 30 miles per hour unless they are exiting. Since there would be overpasses and underpasses at each cross street, the cars would not stop for lights. Each vehicle would hold one or two passengers going to the same stop, so the car would not slow until it reached that destination. When it did, it would pull off the road without reducing speed until it was on the exit ramp.

The company claims that it will have the capacity to transport 10,000 passengers per hour per lane with the same setup as the Van Ness corridor if underpasses and overpasses were added. Since a capacity of 10,000 passengers per hour in each direction would provide significant excess capacity compared to likely usage, 2 lanes (3 meters in total width instead of 7-8 meters) is all that such a system would require. The reduced width would reduce construction cost while still providing excess capacity. Passengers would arrive at destinations much sooner than by bus as the autos would get there at 30 miles per hour without stopping even once. This translates to a 2-mile trip taking 4 minutes! Compare that to any experience you have had taking a bus.  The speed of movement also helps make each vehicle available to many more passengers during a day. While it is still unproven, this technology appears to offer significant cost/benefit vs other alternatives.
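As a rough sanity check on these claims, here is a back-of-the-envelope model. The headway and occupancy figures are my own assumptions for illustration, not numbers from the company:

```python
# Back-of-the-envelope capacity check for the ACRT concept.
# Headway and occupancy are assumed values, not figures from the company.
SPEED_MPH = 30
METERS_PER_MILE = 1609.34

speed_mps = SPEED_MPH * METERS_PER_MILE / 3600        # ~13.4 m/s
headway_m = 7.0    # assumed center-to-center spacing between the small cars
occupancy = 1.5    # assumed average passengers per car (1-2 seats)

cars_per_hour = speed_mps / headway_m * 3600          # ~6,900 cars/hour/lane
passengers_per_hour = cars_per_hour * occupancy       # ~10,300 passengers/hour/lane
trip_minutes = 2 * METERS_PER_MILE / speed_mps / 60   # 2-mile trip at a constant 30 mph

print(f"{passengers_per_hour:,.0f} passengers/hour/lane, {trip_minutes:.0f}-minute 2-mile trip")
```

Under those assumptions, the claimed 10,000 passengers per hour per lane and the 4-minute, 2-mile trip are at least arithmetically consistent.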

Conclusion

The population expansion within urban areas will continue to drive increased traffic unless additional solutions are implemented. If it works as well in practice as it does in theory, an ACRT like the one described above offers one potential way of improving transport efficiency. However, this is only one of many potential approaches to solving the problem of increased congestion. Regardless of the technology used, this is a space where innovation must happen if cities are to remain livable. While investment in underground rail is also a potential way of mitigating the problem, it will remain an extremely costly alternative unless innovation occurs in that domain.

How much do you know about SEO?

Search Engine Optimization: A step by step process recommended by experts

Azure just completed its annual ecommerce marketing day. It was attended by 15 of our portfolio companies, two high level executives at major corporations, a very strong SEO consultant and the Azure team. The purpose of the day is to help the CMOs in the Azure portfolio gain a broader perspective on hot marketing topics and share ideas and best practices. This year’s agenda included the following sessions:

  1. Working with QVC/HSN
  2. Brand building
  3. Using TV, radio and/or podcasts for marketing
  4. Techniques to improve email marketing
  5. Measuring and improving email marketing effectiveness
  6. Storytelling to build your brand and drive marketing success
  7. Working with celebrities, brands, popular YouTube personalities, etc.
  8. Optimizing SEO
  9. Product Listing Ads (PLAs) and Search Engine Marketing (SEM)

One pleasant aspect of the day is that it generated quite a few interesting ideas for blog posts! In other words, I learned a lot regarding the topics covered. This post is on an area many of you may believe you know well, Search Engine Optimization (SEO). I thought I knew it well too… before being exposed to a superstar consultant, Allison Lantz, who provided a cutting-edge presentation on the topic. With her permission, this post borrows freely from her content. Of course, I’ve added my own ideas in places and may have introduced some errors in thinking, and a short post can only touch on a few areas and is not a substitute for true expertise.

SEO is Not Free if You Want to Optimize

I have sometimes labeled SEO a free source of visitors to a site, but Allison correctly points out that if you want to focus on optimization (the O in SEO) with the search engines, then it isn’t free, but rather an ongoing process (and investment) that should be part of company culture. The good news is that SEO will likely generate high-quality traffic that lasts for years and leads to a high ROI against the cost of striving to optimize. All content creators should be trained to write in a manner that optimizes traffic generation by using targeted keywords in their content and ensuring those words appear in the places that are optimal for search. To be clear, it’s also best if the content is relevant, well written and user-friendly. If you were planning to create the content anyway, then the cost of doing this is relatively minor. However, if the content is incremental, created to achieve higher SEO rankings, then the cost will be greater. But I’m getting ahead of myself and need to review the step-by-step process Allison recommends for moving toward optimization.

Keyword Research

The first thing to know when developing an SEO strategy is what you are targeting to optimize. Anyone doing a search enters a word or phrase they are searching for. Each such word or phrase is called a ‘keyword’. If you want to gain more users through SEO, it’s critical to identify thousands, tens of thousands or even hundreds of thousands of keywords that are relevant to your site. For a fashion site, these could be brands, styles, and designers. For an educational site like Education.com (an Azure portfolio company that is quite strong in SEO and ranks on over 600,000 keywords), keywords might be math, English, multiplication, etc. The broader the keywords, the greater the likelihood of higher volume, but along with that comes more competition for search rankings and a higher cost per keyword. The first step in the process is spending time brainstorming what combinations of words are relevant to your site – in other words, if someone searched for that specific combination, would your site be very relevant to them? To give you an idea of why the number gets very high, consider again Education.com. Going beyond searching on “math”, one can divide math into arithmetic, algebra, geometry, calculus, etc. Each of these can then be divided further. For example, arithmetic can include multiplication, addition, division, subtraction, exponentiation, fractions and more. Each of these can be subdivided further, with multiplication covering multiplication games, multiplication lesson plans, multiplication worksheets, multiplication quizzes and more.
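To illustrate how quickly the list grows, here is a toy sketch that crosses a few topics with a few modifiers; the terms are made up for illustration and are not Education.com’s actual keyword set:

```python
from itertools import product

# Toy illustration of why keyword counts explode: a handful of topics crossed
# with a handful of modifiers and grade levels already yields dozens of keywords.
topics = ["multiplication", "addition", "division", "fractions"]
modifiers = ["games", "worksheets", "lesson plans", "quizzes"]
grades = ["", "1st grade", "2nd grade", "3rd grade"]

keywords = {
    " ".join(part for part in (grade, topic, modifier) if part)
    for grade, topic, modifier in product(grades, topics, modifiers)
}
print(len(keywords), "candidate keywords from a handful of seed terms")  # 64 candidate keywords
```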

Ranking Keywords

Once keywords are identified, the next step is deciding which ones to focus on. The idea is to rank keywords based on the likely number of clicks to your site each one could generate and the expected value of the potential users obtained through those clicks. Doing this requires determining, for each keyword:

  • Monthly searches
  • Competition for the keyword
  • Conversion potential
  • Effort (and possible cost) required to achieve a certain ranking

Existing tools report the monthly volume of searches for each keyword (remember to add searches on Bing to those on Google). Estimating the strength of competition requires doing a search using the keyword and learning who the top-ranking sites currently are (given the volume of keywords to analyze, this is very labor intensive). If Amazon is a top site, it may be difficult to surpass, but if the competition includes relatively minor players, they would be easier to outrank.

The next question to answer for each keyword is: “What is the likelihood of converting someone who is searching on the keyword if they do come to my site?” For example, for Education.com, someone searching on ‘sesame street math games’ might not convert well, since the company doesn’t have a license to use Sesame Street characters in its math games. But someone searching on ‘1st grade multiplication worksheets’ would have a high probability of converting, since the company is world-class in that area. The other consideration mentioned above is the effort required to achieve a degree of success. If you already have a lot of content relevant to a keyword, then search-optimizing that content for the keyword might not be very costly. But if you currently don’t have any relevant content, or the keyword is very broad, then a great deal more work might be required.

Example of Keyword Ranking Analysis

Source: Education.com

Comparing Effort Required to Estimated Value of Keywords

Once you have produced the first table, you can make a very educated guess on your possible ranking after about 12 months (the time it may take Google/Bing to recognize your new status for that keyword).

There are known statistics on the likely click-through rate (share of searches on the keyword) you will get if you rank 1st, 2nd, 3rd, etc. Multiplying that by the average search volume for the keyword gives a reasonable estimate of the monthly traffic it would generate to your site. The next step is to estimate the rate at which you will convert that traffic to members (where they register so you get their email) and/or customers (I’ll assume customers for the rest of this post, but the same method would apply to members). Since you already know your existing conversion rate, that can generally serve as your estimate; if you have been buying clicks on that keyword from Google or Bing, you may already have a better estimate of conversion. Multiplying the number of customers obtained by the LTV (lifetime value) of a customer yields the dollar value generated if the keyword reaches the estimated rank. Subtract from this the current value being obtained from the keyword (based on its current ranking) to see the incremental benefit.
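A minimal sketch of that arithmetic is below. The CTR-by-rank figures, conversion rate and LTV are illustrative assumptions of mine, not numbers from Allison’s model:

```python
# Sketch of the keyword-value estimate described above.
# All inputs are illustrative assumptions, not figures from the consultant's model.
CTR_BY_RANK = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}  # assumed click-through rates

def monthly_keyword_value(searches, projected_rank, current_rank, conversion_rate, ltv):
    """Incremental monthly $ value of moving a keyword from its current rank to a projected rank."""
    def value_at(rank):
        clicks = searches * CTR_BY_RANK.get(rank, 0.02)   # assume ~2% CTR below rank 5
        customers = clicks * conversion_rate
        return customers * ltv
    return value_at(projected_rank) - value_at(current_rank)

# Example: 40,000 monthly searches, moving from rank 8 to rank 3,
# 2% conversion to customer, $50 lifetime value.
print(f"${monthly_keyword_value(40_000, 3, 8, 0.02, 50):,.0f} incremental value per month")  # $3,200
```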

Content Optimization

One important step to improve rankings is to use keywords in the titles of articles. While the words to use may seem intuitive, it’s important to test variations to see how each may improve results. Will “free online multiplication games” outperform “free times table games”? The way to test this is to try each for a different 2-week (or month-long) period and see which gives a higher CTR (click-through rate). As discussed earlier, it’s also important to optimize the body copy against keywords. Many of our companies create a guide for writing copy that provides rules that result in better CTR.
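As a small sketch of that test, with made-up impression and click counts (a real test should also check that the difference is larger than normal period-to-period noise):

```python
# Comparing CTR for two title variants over their respective test periods.
# Impression and click counts are made up for illustration.
variants = {
    "free online multiplication games": {"impressions": 12_400, "clicks": 610},
    "free times table games":           {"impressions": 11_900, "clicks": 455},
}

for title, stats in variants.items():
    ctr = stats["clicks"] / stats["impressions"]
    print(f"{title!r}: CTR {ctr:.2%}")
```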

The Importance of Links

Google views links from other sites to yours as an indication of your level of authority. The more important the site linking to you, the more it impacts Google’s view. Having a larger number of sites linking to you can drive up your Domain Authority (a search engine ranking score), which in turn will benefit rankings across all keywords. However, it’s important to be restrained in acquiring links, as links from “black hats” (sites Google regards as somewhat bogus) can actually result in getting penalized. While getting another site to link to you will typically require some motivation on its part, Allison warns that paying cash for links is likely to result in obtaining some of them from black hat sites. Instead, the motivation can be your featuring an article from the other site, selling goods from a partner, and so on.

Other Issues

I won’t review it in depth here, but site architecture is also a relevant factor in optimizing SEO benefits. For a product company with tens of thousands of products, it can be extremely important to have the right titles and structure in how you list products. If you have duplicative content on your site, removing it may help your rankings, even if there was a valid reason for such duplication. Changing the wording of content on a regular basis will also help you maintain rankings.

Summary

SEO requires a well-thought-out strategy and consistent, continued execution to produce results. This is not a short-term fix, as an SEO investment will likely only start to show improvements four to six months after implementation with ongoing management. But as many of our portfolio companies can attest, it’s well worth the effort.

 

 

SoundBytes

  • It’s a new basketball season so I can’t resist a few comments. First, as much as I am a fan of the Warriors, it’s pretty foolish to view them as a lock to win, as winning is very tenuous. For example, in game 5 of the Finals last year, had Durant missed his late-game three-point shot, the Warriors might have faced the threat of a repeat of the 2016 Finals – going back to Cleveland for a potential tying game.
  • Now that Russell Westbrook has two star players to accompany him, we can see if I am correct that he is less valuable than Curry, who has repeatedly shown the ability to elevate all of his teammates. This is why I believe that, despite his two MVPs, Curry is underrated!
  • With Stitch Fix filing for an IPO, we are seeing the first of several next-generation fashion companies emerging. In the filing, I noted the emphasis they place on SEO as a key component of their success. I believe new fashion startups will continue to exert pressure on traditional players. One Azure company moving towards scale in this domain is Le Tote – keep an eye on them!

Will Grocery Shopping Ever be the Same?


Dining and shopping today are very different than in days gone by – the Amazon acquisition of Whole Foods is a result

“I used to drink it,” said Andy Warhol once of Campbell’s soup. “I used to have the same lunch every day, for 20 years, I guess, the same thing over and over again.” In Warhol’s signature medium, silkscreen, the artist reproduced his daily Campbell’s soup can over and over again, changing only the label graphic on each one.

When I was growing up, I didn’t have exactly the same thing over and over like Andy Warhol, but virtually every dinner was at home, at our kitchen table (we had no dining room in the 4-room apartment). Eating out was a rare treat, and my father would have been appalled if my mom had brought in prepared food. My mom, like most women of that era, didn’t officially work, but did do the bookkeeping for my dad’s plumbing business. She would shop for food almost every day at a local grocery and wheel it home in her shopping cart.

When my wife and I were raising our kids, the kitchen remained the most important room in the house. While we tended to eat out many weekend nights, our Sunday through Thursday dinners were consumed at home, sprinkled with occasional meals brought in from outside like pizza, fried chicken, ribs, and Chinese food. Now, given the high proportion of households where both parents work, eating out, fast food and prepared foods account for a large share of how Americans consume dinner. This trend has reached the point where some say the traditional kitchen may disappear as people cease cooking at all.

In this post, I discuss the evolution of our eating habits, and how they will continue to change. Clearly, the changes that have already occurred in shopping for food and eating habits were motivations for Amazon’s acquisition of Whole Foods.

The Range of How We Dine

Dining can be broken into multiple categories, and families usually participate in all of them. First, almost 60% of dinners eaten at home are still prepared there. While the percentage has diminished, it is still the largest of the four categories of dinners. Second, many meals are now purchased from a third party but still consumed at home. Given the rise of delivery services and the greater availability of pre-cooked meals at groceries, this category spans virtually every type of food. Third, many meals are purchased from a fast food chain (about 25% of Americans eat some type of fast food every day [1]), and about 20% of meals [2] are eaten in a car. Finally, a smaller percentage of meals are consumed at a restaurant. (Sources: [1] Schlosser, Eric. “Americans Are Obsessed with Fast Food: The Dark Side of the All-American Meal.” CBSNews. Accessed April 14, 2014. [2] Stanford University. “What’s for Dinner?” Multidisciplinary Teaching and Research at Stanford. Accessed April 14, 2014.)

The shift to consuming food away from home has been a trend for the last 50 years as families began going from one worker to both spouses working. The proportion of spending on food consumed away from home has consistently increased from 1965-2014 – from 30% to 50%.

Source: Calculated by the Economic Research Service, USDA, from various data sets from the U.S. Census Bureau and the Bureau of Labor Statistics.

With both spouses working, the time available to prepare food was dramatically reduced. Yet, shopping in a supermarket remained largely the same except for more availability of prepared meals. Now, changes that have already begun could make eating dinner at home more convenient than eating out with a cost comparable to a fast food chain.

Why Shopping for Food Will Change Dramatically over the Next 30 Years

Eating at home can be divided between:

  1. Cooking from scratch using ingredients from general shopping
  2. Buying prepared foods from a grocery
  3. Cooking from scratch from recipes supplied with the associated ingredients (meal kits)
  4. Ordering meals that have previously been prepared and only need to be heated up
  5. Ordering meals from a restaurant that are picked up or delivered to your home
  6. Ordering “fast food” type meals like pizza, ribs, chicken, etc. for pickup or delivery.

I am starting with the assumption that many people will still want to cook some proportion of their dinners (I may be romanticizing, given how I grew up and how my wife and I raised our family). But as cooking for yourself becomes an even smaller percentage of dinners, shopping for food in the traditional way will prove inefficient. Why buy a package of saffron or thyme or a bag of onions, only to see very little of it consumed before it is no longer usable? And why start cooking a meal, after shopping at a grocery, only to find you are missing an ingredient of the recipe? Why not shop by the meal instead of shopping for many items that may or may not end up being used?

Shopping by the meal is the essential value proposition offered by Blue Apron, Plated, Hello Fresh, Chef’d and others. Each sends you recipes and all the ingredients to prepare a meal. There is little food waste involved (although packaging is another story). If the meal preparation requires one onion, that is what is included; if it requires a pinch of saffron, then only a pinch is sent. When preparing one of these meals you never find yourself missing an ingredient. It takes a lot of the stress and the food waste out of the meal preparation process. But most such plans, in trying to keep the cost per meal under $10, have very limited choices each week (all in a similar lower-cost price range) and require committing to multiple meals per week. Chef’d, one of the exceptions, allows the user to choose individual meals or to purchase a weekly subscription. It also offers over 600 options to choose from, while a service like Blue Apron asks the subscriber to select 3 of 6 choices each week.

Blue Apron meals portioned perfectly for the amount required for the recipes

My second assumption is that the number of meals that are created from scratch in an average household will diminish each year (as it already has for the past 50 years). However, many people will want to have access to “preferred high quality” meals that can be warmed up and eaten, especially in two-worker households. This will be easier and faster (but perhaps less gratifying) than preparing a recipe provided by a food supplier (along with all the ingredients). I am talking about going beyond the pre-cooked items in your average grocery. There are currently sources of such meals arising as delivery services partner with restaurants to provide meals delivered to your doorstep. But this type of service tends to be relatively expensive on a per meal basis.

I expect new services to arise (we’ve already seen a few) that offer less expensive meals prepared by “home chefs” or caterers and ordered through a marketplace (this is category 4 in my list). The marketplace will recruit the chefs, supply them with packaging, take orders, deliver to the end customers, and collect the money. Since the food won’t be from a restaurant, with all the associated overhead, prices can be lower. Providing such a service will be a source of income for people who prefer to work at home. Like drivers for Uber and Lyft, there should be a large pool of available suppliers who want to work in this manner. It will be very important for the marketplaces offering such a service to curate their chefs to ensure that quality and food safety standards are met. The availability of good-quality, moderately priced prepared meals of one’s choice, delivered to the home, may begin shifting more consumption back to the home, or at a minimum slow the shift towards eating dinners away from home.

Where will Amazon be in the Equation?

In the past, I predicted that Amazon would create physical stores, but their recent acquisition of Whole Foods goes far beyond anything I forecast by providing them with an immediate, vast network of physical grocery stores. It does make a lot of sense, as I expect omnichannel marketing to be the future of retail.  My reasoning is simple: on the one hand, online commerce will always be some minority of retail (it currently is hovering around 10% of total retail sales); on the other hand, physical retail will continue to lose share of the total market to online for years to come, and we’ll see little difference between e-commerce and physical commerce players.  To be competitive, major players will have to be both, and deliver a seamless experience to the consumer.

Acquiring Whole Foods can make Amazon the runaway leader in categories 1 and 2, buying ingredients and/or prepared foods to be delivered to your home. Amazon Fresh already supplies many people with products sourced from grocery stores, whether they be general food ingredients or traditional prepared foods supplied by a grocery. Amazon also offers numerous meal kits, and we expect (and are already seeing indications) that it will follow the Whole Foods acquisition by increasing its focus on meal kits as it attempts to dominate this rising category (category 3 in my list).

One could argue that Whole Foods is already a significant player in category 4 (ordering meals that have previously been prepared and only need to be heated up), believing that category 4 is the same as category 2 (buying prepared meals from a grocery). But it is not. What we envision in the future is the ability to have individuals (who will all be referred to as “home chefs” or something like that) create brands and cook foods of every genre, price point, etc. Customers will be able to order a set of meals completely to their taste from a local home chef. The logical combatants to control this market will be players like Uber and Lyft, giants like Amazon and Google, existing meal-kit companies like Blue Apron…and new startups we’ve never heard of.

When and How to Create a Valuable Marketing Event

Azure CEO Summit
Snapshots from Azure’s 11th Annual CEO Summit

A key marketing tool for companies is to hold an event, like a user conference or a topical forum, to build relationships with their customers and partners, drive additional revenue and/or generate business development opportunities. Azure held its 11th annual CEO Summit last week, and since we’re getting great feedback on the success of the conference, I thought it might be helpful to dig deeply into what makes a conference effective. I will use the Azure event as the example but try to abstract rules and lessons to be learned, as other firms and companies have asked for my advice on this topic.

Step 1. Have a clear set of objectives

For the Azure CEO Summit, our primary objectives are to help our portfolio companies connect with:

  1. Corporate and Business Development executives from relevant companies
  2. Potential investors (VCs and Family Offices)
  3. Investment banks so the companies are on the radar and can get invited to their conferences
  4. Debt providers for those that can use debt as part of their capital structure

A secondary objective of the conference is to build Azure’s brand, thereby increasing our deal flow and helping existing and potential investors in Azure understand some of the value we bring to the table.

When I created a Wall Street tech conference in the late 90’s, the objectives were quite different. They still included brand building, but I also wanted our firm to own trading in tech stocks for that week, have our sell-side analysts gain reputation and following, help our bankers expand their influence among public companies, and generate a profit for the firm at the same time. We didn’t charge directly for attending but monetized through attendees’ increased use of our trading desk and more companies using our firm for investment banking.

When Fortune began creating conferences, its primary objective was to monetize its brand in a new way. This meant charging a hefty price for attending. If people were being asked to pay, the program had to be very strong, and Fortune markets it quite effectively.

Conferences that have clear objectives, and focus the activities on those objectives, are the most successful.

Step 2. Determine invitees based on who will help achieve those objectives

For our Summit, most of the invitees follow directly from the objectives listed above. If we want to help our portfolio companies connect with the above-mentioned constituencies, we need to invite both our portfolio CEOs and the right players from corporations, VCs, family offices, investment banks and debt providers. To help our brand, inviting our LPs and potential LPs is important. To ensure the Summit is at the quality level needed to attract the right attendees, we also target getting great speakers. As suggested by my partners and Andrea Drager, Azure VP (and my collaborator on Soundbytes), we invited several non-Azure Canadian startups. In advance of the summit, we asked Canadian VCs to nominate candidates they thought would be interesting for us, and we picked the best 6 to participate. This led to over 70 interesting companies being nominated and added to our deal flow pipeline.

Step 3. Create a program that will attract target attendees to come

This is especially true in the first few years of a conference while you build its reputation. It’s important to realize that your target attendees have many conflicting pulls on their time. You won’t get them to attend just because you want them there! Driving attendance from the right people is a marketing exercise. The first step is understanding what would be attractive to them. In Azure’s case, they might not understand the benefit of meeting our portfolio companies, but they could be very attracted by the right keynotes.

Azure’s 2017 Summit Keynote Speakers: Mark Lavelle, CEO of Magento Commerce & Co-founder of Bill Me Later; Cameron Lester, Managing Director and Co-Head of Global Technology Investment Banking, Jefferies; Nagraj Kashyap, Corporate VP & Global Head, Microsoft Ventures.

Over the years we have had the heads of technology investment banking from Qatalyst, Morgan Stanley, Goldman, JP Morgan and Jefferies as one of our keynote speakers. From the corporate world, we also typically have a CEO, former CEO or chairman of a notable company like Microsoft, Veritas, Citrix, Concur or Audible as a second keynote. Added to these were CEOs of important startups like Stance and Magento and terrific technologists like the head of Microsoft Labs.

Finding the right balance of content, interaction and engagement is challenging, but it should be explicitly tied to meeting the core objectives of the conference.

Step 4. Make sure the program facilitates meeting your objectives

Since Azure’s primary objective is creating connections between our portfolio (and, this year, the 6 Canadian companies) and the various other constituencies we invite, we start the day with speed-dating one-on-ones of 10 minutes each. Each attendee participating in one-on-ones can be scheduled to meet up to 10 entities between 8:00 and 9:40 AM. Following that, we schedule our first keynote.

In addition to participating in the one-on-ones, which start the day, 26 of our portfolio companies had speaking slots at the Summit, intermixed with three compelling keynote speakers. Company slots are scheduled between keynotes to maximize continued participation. This schedule takes us to about 5:00pm. We then invite the participants and additional VCs, lawyers and other important network connections to join us for dinner. The dinner increases everyone’s networking opportunity in a very relaxed environment.

These diverse interaction formats throughout the conference (one-on-ones, presentations, discussions, and networking) each facilitate a different type of connection between attendees, all focused on maximizing the opportunity for our portfolio companies to build strong relationships.

Azure Company Presentations
Azure Portfolio Company CEO Presentations: Chairish, Megabots & Atacama

Step 5. Market the program you create to the target attendees

I get invited to about 30 conferences each year plus another 20-30 events. It’s safe to assume that most of the invitees to the Azure conference get a similar (or greater) number of invitations. What this means is that it’s unlikely that people will attend if you send an invitation but then don’t effectively market the event (especially in the first few years). It is important to make sure every key invitee gets a personal call, email, or other message from an executive walking them through the agenda and highlighting the value to them. For the Azure event, we highlight the great speakers but also the value of meeting selected portfolio companies. Additionally, one of my partners or I connect with every attendee we want to participate in one-on-ones with portfolio companies, to stress why this benefits them and to give them the chance to adjust their one-on-one schedule. This year we managed over 320 such meetings.

When I created the first “Quattrone team” conference on Wall Street, we marketed it as an exclusive event to portfolio managers. While the information exchanged was all public, the portfolio managers still felt they would have an investment edge by being at a smaller event (and we knew the first year’s attendance would be relatively small) where all the important tech companies spoke and did one-on-one meetings. For user conferences, it can help to land a great speaker from one of your customers or from the industry. For example, if General Electric, Google, Microsoft or some similar important entity is a customer, getting them to speak will likely increase attendance. It also may help to have an industry guru as a speaker. If you have the budget, adding an entertainer or other star personality can also add to the attraction, as long as the core agenda is relevant to attendees.

Step 6. Decide on the metrics you will use to measure success

It is important to set targets for what you want to accomplish and then to measure whether you’ve achieved those targets. For Azure, the number of entities that attend (besides our portfolio), the number of one-on-one meetings, and the number of post-conference follow-ups that emanate from the one-on-ones are three of the metrics we measure. One week after the conference, I already know that we had over 320 one-on-ones which, so far, have led to about 50 follow-ups that we are aware of, including three investments in our portfolio companies. We expect to learn of additional follow-up meetings, but this has already exceeded our targets.

Step 7. Make sure the value obtained from the conference exceeds its cost

It is easy to spend money but harder to make sure the benefit of that spend exceeds its cost. On one end of the spectrum, some conferences have profits as one of the objectives. But in many cases, the determination of success is not based on profits, but rather on meeting objectives at a reasonable cost. I’ve already discussed Azure’s objectives but most of you are not VCs. For those of you dealing with customers, your objectives can include:

  1. Signing new customers
  2. Reducing churn of existing customers
  3. Developing a better understanding of how to evolve your product
  4. Strong press pickup / PR opportunity

Spending money on a conference should always be compared to other uses of those marketing dollars. To the degree you can be efficient in managing it, the conference can become a solid way to utilize marketing dollars. Some of the things we do to control the cost of the Azure conference, which may apply to you as well, include:

  1. Partnering with a technology company to host our conference instead of holding it at a hotel. This only works if there is value to your partner. The cost savings is about 60-70%.
  2. Making sure our keynotes are very relevant but come at no cost. You can succeed at this with keynotes from your customers and/or the industry. The cost savings is whatever you might have paid a speaker.
  3. Having the dinner for 150 people at my house. This has two benefits: it is a much better experience for those attending, and it costs about 70% less than holding it at a venue.

Summary

I have focused on using the Azure CEO Summit as the primary example, but the rules laid out apply in general. Not only will they help you create a successful conference, but following them also means you will only hold one if its value to you exceeds its cost.

 

SoundBytes

The Warriors…

Last June I wrote about why Kevin Durant should join the Warriors.

If you look at that post, you’ll see that my logic appears to have been borne out, as my main reason was that Durant was likely to win a championship and that this would be very instrumental in helping his reputation/legacy.

Not mentioned in that post was the fact that he would also increase his enjoyment of playing, because playing with Curry, Thompson, Green and the rest of the Warriors optimizes how the game should be played.

Now it’s up to both Durant and Curry to agree to less-than-maximum salaries so the core of the team can be kept intact for many years. If they do, and win multiple championships, they’ll probably increase their endorsement revenue. But even without that offset, my question is: how much is enough? I believe one can survive nicely on $30-$32 million a year (why not have both agree to identical four-year deals, rather than two-year deals?). Trying for the maximum is an illusion that can be self-defeating. The difference will have zero impact on their lives, but taking less will keep players like Iguodala and Livingston with the Warriors, which could have a very positive impact. I’m hoping they can also keep West, Pachulia and McGee.

It would also be nice if Durant and Curry got Thompson and Green to provide a handshake agreement that they would follow the Durant/Curry lead and sign for the same amount per year when their contracts come up. Or, if Thompson and Green can extend now, to do the extension at pay equal to what Curry and Durant make in the extension years. By having all four at the same salary at the end of the period, the Warriors would be making a powerful statement about how they feel about each other.

Amazon & Whole Foods…

Amazon’s announced acquisition of Whole Foods is very interesting. In a previous post, we predicted that Amazon would open physical stores. Our reasoning was that over 90% of retail revenue still occurs offline and Amazon would want to attack that. I had expected these to be Guide Stores (not carrying inventory but having samples of products). Clearly this acquisition shows that, at least in food, Amazon wants to go even further. I will discuss this in more detail in a future post.

The Business of Theater

Ernest Shackleton

I have become quite interested in analyzing theater, in particular Broadway and Off-Broadway shows, for two reasons:

  1. I’m struck by the fact that revenue for the show Hamilton is shaping up like that of a unicorn tech company.
  2. My son Matthew is producing a show that is now launching at a NYC theater, and having closely observed the 10-year process of getting it to New York, I see many attributes that are consistent with a tech startup.

Incubation

It is fitting that Matthew’s show, Ernest Shackleton Loves Me, was first incubated at Theatreworks, San Francisco, as it is the primary theater of Silicon Valley. Each year the company hosts a “writer’s retreat” to help incubate new shows. Teams go there for a week to work on the shows, all expenses paid. Theatreworks supplies actors, musicians, and support so the creators can see how songs and scenes seem to work (or not) when performed. Show creators exchange ideas much like what happens at a tech incubator. At the culmination of the week, a part of each show is performed before a live audience to get feedback.

Creation of the Beta Version

After attending the writer’s retreat, the creators of Shackleton needed to do two things: find a producer (like a VC, a producer is a backer of the show who recruits others to help finance the project) and add other key players to the team – a book writer, director, actors, etc. Recruiting strong players for each of these positions doesn’t guarantee success but certainly increases the probability. In the case of Shackleton, Matthew came on as lead producer, and he and the team did quite well in getting a Tony-winning book writer, an Obie-winning director and very successful actors on board. Once this team was together, an early (beta) version of the show was created and performed for an audience of potential investors (the pitch). Early investors in a show are like angel investors, as the risk is higher at this point.

Beta Testing

The next step was to run a beta test of the product – called the “out of town tryout”. In general, out of town is anyplace other than New York City. It is used for continuous improvement of the show, much as beta testing is used to iterate a technology product based on user feedback. Theater critics also review shows in each city where they are performed. Ernest Shackleton Loves Me (Shackleton) had three runs outside of NYC: Seattle, New Jersey and Boston. During each, the show was improved based on audience and critic reaction. While it received rave reviews in each location, critics and live audiences can usually still suggest ways a show can be improved, and responding to that feedback helps prepare a show for a New York run.

Completing the Funding

Like a tech startup, a theater production finds it easier to raise money once the product is complete. A great deal of funding is required for the steps mentioned above, but for most shows it is difficult to obtain the bulk of the funding needed to bring a show to New York without actual performances. An average musical that goes to Off-Broadway will require $1.0 – $2.0 million in capitalization, and an average one that goes to Broadway tends to capitalize at between $8 and $17 million. Hamilton cost roughly $12.5 million to produce, while Shackleton will capitalize at the lower end of the Off-Broadway range due to having a small cast and relatively efficient management. For many shows, funding is not completed until the early days of the NYC run, and it is not unusual for a show to announce it will open at a certain theater on a certain date and then be unable to raise the incremental money needed to do so. Like a tech startup, some shows, including Shackleton, may run a crowdfunding campaign to help top off their funding.

You can see what a campaign for a theater production looks like by clicking on this link and perhaps supporting the arts, or by buying tickets on the website (since the producer is my son, I had to include that small ask)!

The Product Launch

Assuming funding is sufficient and a theater has been secured (there is currently a shortage of Broadway theaters), the New York run then begins. This is the true “product launch”. Part of a show’s capitalization may be needed to fund a shortfall in revenue versus weekly cost during the first few weeks of the run, as reviews plus word of mouth are often needed to help drive revenue above weekly break-even. Part of the reason so many Broadway shows employ famous Hollywood stars, or are revivals of shows that had prior success, or are based on a movie, TV show, or other well-known property, is to ensure substantial initial audiences. Some current Broadway examples are Hamilton (bestselling book), Aladdin (movie), Beautiful (the Carole King story), Chicago (revival of a successful show), Groundhog Day (movie), Hello Dolly (revival plus Bette Midler as star) and Sunset Boulevard (revival plus Glenn Close as star).

Crossing Weekly Break Even

Gross weekly burn for shows has a wide range (just like startups), with Broadway musicals having weekly costs from $500,000 to about $800,000 and Off-Broadway musicals in the $50,000 to $200,000 range. In addition, there are royalties of roughly 10% of revenue that go to a variety of players such as the composer, book writer, etc. Hamilton has about $650,000 in weekly costs and roughly a $740,000 breakeven level when royalties are factored in. Shackleton’s weekly costs are about $53,000, at the low end of the range for an Off-Broadway musical and under 10% of Hamilton’s weekly cost.
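
As a rough illustration of that arithmetic (a minimal sketch, assuming royalties are a flat percentage of gross revenue; real royalty pools are more complicated, so treat the numbers as illustrative only), the weekly break-even gross is simply the fixed weekly cost divided by the share of revenue the show keeps after royalties:

```python
# Back-of-envelope weekly break-even for a musical.
# Assumes royalties are a flat percentage of gross revenue; these figures
# are the estimates from the text, not disclosed financials.

def weekly_breakeven(weekly_cost: float, royalty_rate: float) -> float:
    """Gross revenue needed so that revenue minus royalties covers weekly cost."""
    return weekly_cost / (1.0 - royalty_rate)

# Hamilton-like numbers: ~$650K weekly cost, ~10% royalties.
print(f"Broadway example:     ${weekly_breakeven(650_000, 0.10):,.0f} per week")
# Shackleton-like numbers: ~$53K weekly cost (Off-Broadway).
print(f"Off-Broadway example: ${weekly_breakeven(53_000, 0.10):,.0f} per week")
```

With a flat 10% royalty this works out to roughly $720,000 per week for the Broadway example; the ~$740,000 break-even quoted above implies an effective royalty load a bit higher than 10%.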

Is Hamilton the Facebook of Broadway?

Successful Broadway shows have multiple sources of revenue and can return significant multiples to investors.

Chart 1: A ‘Hits’ Business Example Capital Account

Since Shackleton just had its first performance on April 14, it’s too early to predict what the profit (or loss) picture will be for investors. On the other hand, Hamilton already has a track record that can be analyzed. In its first months on Broadway, the show was grossing about $2 million per week, which I estimate drove about $1 million per week in profits. Financial investors, like preferred shareholders of a startup, are entitled to the equivalent of “liquidation preferences”. This meant that investors recouped their money in a very short period, perhaps as little as 13 weeks. Once they recouped 110%, the producer began splitting profits with the financial investors, reducing the financial investors’ share to roughly 42% of profits. In the early days of the Hamilton run, scalpers were reselling tickets at enormous profits. When my wife and I went to see the show in New York (March 2016), we paid $165 per ticket for great orchestra seats that we could have resold for $2,500 per seat! Instead, we went and enjoyed the show. But a scalper who owned those tickets could have made 15 times their money. Subsequently, the company decided to capture a portion of this revenue by adjusting prices for the better seats, and as a result the show now grosses nearly $3 million per week. Since fixed weekly costs probably did not change, I estimate weekly profits are now about $1.8 million. At 42% of this, investors would be accruing roughly $750,000 per week. At this run rate, investors would receive over 3X their investment dollars annually from this revenue source alone if prices held up.
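
Here is a minimal sketch of that return arithmetic, using only the round estimates above (the capitalization, weekly profits and 42% investor share are estimates from the text, not disclosed financials):

```python
# Rough investor-return arithmetic for a hit Broadway show, using the
# estimates above: ~$12.5M capitalization, ~$1M/week early profits,
# ~$1.8M/week current profits, ~42% investor share after recoupment.

capitalization = 12_500_000
early_weekly_profit = 1_000_000
current_weekly_profit = 1_800_000
investor_share = 0.42

weeks_to_recoup = capitalization / early_weekly_profit      # ~13 weeks
investor_weekly = current_weekly_profit * investor_share    # ~$756K per week
annual_multiple = investor_weekly * 52 / capitalization     # ~3.1x per year

print(f"Weeks to recoup capitalization: {weeks_to_recoup:.0f}")
print(f"Investor accrual per week:      ${investor_weekly:,.0f}")
print(f"Annual multiple on investment:  {annual_multiple:.1f}x")
```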

Multiple Companies Amplify Revenue and Profits

Currently Hamilton has a second permanent show in Chicago, a national touring company in San Francisco (until August, when it is supposed to move to LA), and has announced a second touring company that will begin its tour in Seattle in early 2018 before moving to Las Vegas, Cleveland and other stops. I believe it will also have a fifth company in London and a sixth in Asia by late 2018 or early 2019. Surprisingly, the touring companies can, in some cities, generate more weekly revenue than the Broadway company due to larger venues. Table 1 shows an estimate of the revenue per performance in the sold-out San Francisco venue, the Orpheum Theater, which has a capacity of 2,203 versus the Broadway (Richard Rodgers Theatre) capacity of 1,319.

Table 1: Hamilton San Francisco Revenue Estimates

While one would expect Broadway prices to be higher, this has not been the case. I estimate the average ticket price in San Francisco to be $339, whereas the average on Broadway is now $282. The combination of 67% higher seating capacity and 20% higher average ticket prices means the revenue per week in San Francisco is now close to $6 million. Since revenue was lower in the first 4 weeks of the 21-plus-week run, I estimate the total revenue for the run to be about $120 million. Given the explosive revenue, I wouldn’t be surprised if the San Francisco run was extended again. While it has not been disclosed what share of this revenue goes to the production company, the production company is normally compensated with a base guarantee plus a share of the profits (the overage) after the venue covers its labor and marketing costs. Given the enormous profits versus an average show at the San Francisco venue, I assume the production company’s share (guarantee plus overage) is close to 50% of the gross. At 50% of revenue, there would still be almost $3 million per week to go toward paying the production company’s expenses (the guarantee) and the local theater’s labor and marketing costs. If I use a lower $2 million of company share per week as profit to the production company, that annualizes at over $100 million in additional profits, or $42 million more per year for financial investors. The Chicago company is generating lower revenue than San Francisco, as the theater is smaller (1,800 seats) and average ticket prices appear to be closer to $200, which would make revenue roughly $2.8 million per week. When the show ramps to 6 companies (I think by early 2019), it could be generating aggregate revenue of $18-20 million per week or more, should demand hold up. So it would not be surprising if annual ticket revenue exceeded $1 billion at that time.
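
The weekly gross estimate falls out of a simple seats-times-price-times-performances calculation. A minimal sketch, assuming sold-out houses and the standard 8 performances per week (the average ticket prices are the estimates above, not reported figures):

```python
# Back-of-envelope weekly gross for a sold-out run.
# Seat counts come from the text; average ticket prices and the standard
# 8 performances per week are estimates/assumptions.

def weekly_gross(seats: int, avg_ticket: float, performances: int = 8) -> float:
    """Weekly gross assuming every performance sells out."""
    return seats * avg_ticket * performances

sf = weekly_gross(2203, 339)        # Orpheum Theater, San Francisco
broadway = weekly_gross(1319, 282)  # Richard Rodgers Theatre, Broadway

print(f"San Francisco: ${sf:,.0f} per week")   # ~$6.0 million
print(f"Broadway:      ${broadway:,.0f} per week")  # ~$3.0 million
```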

Merchandise adds to the mix

I’m not sure how much income each item of merchandise generates for the production company. Items like the cast album and music downloads could generate over $25 million in revenue, but in general only about 40% of the net income from these comes to the company. On the other hand, T-shirts ($50 each) and the high-end program ($20 each) have extremely large margins, which I think would accrue to the production company. If an average attendee across the 6 (future) or more companies spent $15, this could mean $1.2 million in merchandise sales per week across the 6 companies, or another $60 million per year in revenue. At a 60% gross margin, this would add another $36 million in profits.
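
As a sanity check on those merchandise numbers, here is a minimal sketch; the per-company weekly attendance figure is my own assumption (roughly 8 sold-out performances of about 1,600 seats each), chosen to be consistent with the ~$1.2 million per week above:

```python
# Illustrative merchandise arithmetic across six companies.
# Weekly attendance per company is an assumption (~8 sold-out shows of
# ~1,600 seats); spend per attendee and gross margin come from the text.

companies = 6
weekly_attendance_per_company = 13_000
spend_per_attendee = 15          # dollars
gross_margin = 0.60

weekly_merch = companies * weekly_attendance_per_company * spend_per_attendee
annual_merch = weekly_merch * 52
annual_profit = annual_merch * gross_margin

print(f"Weekly merchandise revenue: ${weekly_merch:,.0f}")   # ~$1.2M
print(f"Annual merchandise revenue: ${annual_merch:,.0f}")   # ~$61M
print(f"Annual merchandise profit:  ${annual_profit:,.0f}")  # ~$36M
```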

I expect Total Revenue for Hamilton to exceed $10 billion

In addition to the sources of revenue outlined above, Hamilton will also have the opportunity for licensing to schools and others to perform the show, a movie, additional touring companies and more. It seems likely to easily surpass the $6 billion that Lion King and Phantom are reported to have grossed to date, or the $4 billion so far for Wicked. In fact, I believe it will eventually gross over $10 billion in total. How this gets divided among the various players is more difficult to fully assess, but investors appear likely to receive over 100X their investment, Lin-Manuel Miranda could net as much as $1 billion (before taxes), and many other participants should become millionaires.

Surprisingly, Hamilton may not generate the Highest Multiple for Theater Investors!

Believe it or not, a very modest musical with 2 actors appears to be the winner as far as return on investment goes. It is The Fantasticks, which, because of its low budget and excellent financial performance sustained over decades, has returned over 250X its invested capital. Obviously, my son, an optimistic entrepreneur, hopes his 2-actor musical, Ernest Shackleton Loves Me, will match this record.