Ending the Year on a High Note…or should I say Basketball Note

Deeper analysis on what constitutes MVP Value


In my blog post dated February 3, 2017, I discussed several statistics that are noteworthy in analyzing how much a basketball player contributes to his team’s success. In it, I compared Stephen Curry and Russell Westbrook using several advanced statistics that are not typically highlighted.

The first statistic: Primary plus Secondary Assists per Minute a player has the ball. Time with the ball equates to assist opportunity, so holding the ball most of the time one’s team is on offense reduces the opportunity for others on the team to have assists. This may lead to fewer assisted baskets for the whole team, but more for the individual player. As of the time of the post, Curry had 1.74 assists (primary plus secondary) per minute he had the ball, while Westbrook only had 1.30 assists per minute. Curry’s efficiency in assists is one of the reasons the Warriors total almost 50% more assists per game than the Thunder, make many more easy baskets, and lead the league in field goal percentage.

The second statistic: Effective Field Goal Percentage (where a made 3-point shot counts the same as 1.5 made 2-point shots). Again, Curry was vastly superior to Westbrook at 59.1% vs 46.4%. What this means is that Westbrook scores more because he takes many more shots, but those shots are not very efficient for his team, as Westbrook’s shooting percentage has remained well below the league average of 45.7% (42.5% last season and 39.6% this season to date).
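For readers who like formulas, effective field goal percentage is simple to compute. A minimal sketch in Python (the stat line below is hypothetical, for illustration only, not Curry’s or Westbrook’s actual numbers):

```python
def effective_fg_pct(fgm, fg3m, fga):
    """Effective FG%: a made three counts as 1.5 made twos.

    fgm  -- total field goals made (twos and threes)
    fg3m -- three-pointers made (a subset of fgm)
    fga  -- total field goal attempts
    """
    return (fgm + 0.5 * fg3m) / fga

# Hypothetical line: 10-of-20 shooting with 4 threes made.
# Raw FG% is 50%, but eFG% credits the extra point on each three.
print(effective_fg_pct(10, 4, 20))  # 0.6
```

The adjustment is why a high-volume three-point shooter can post a modest raw percentage and still be extremely efficient.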

The third statistic: Plus/Minus. Plus/Minus reflects the number of points by which your team outscores opponents while you are on the floor. Curry led the league in 2013, 2014, and 2016, leads year-to-date this season, and in 2015 finished second by a hair to a teammate. Westbrook has had positive results, but last year averaged 3.2 per 36 minutes vs Curry’s 13.8. One challenge to the impressiveness of Curry’s Plus/Minus is whether it is due to the quality of the players around him. In rebuttal, it is interesting to note that he led the league in 2013, when Green was a sub, Durant wasn’t on the team, and Thompson was not the player he is today.

The background above brings me to today’s post, which outlines another way of looking at a player’s value. The measurement I’m advocating is: how much does he help his teammates improve? My thesis is that if the key player on a team creates a culture of passing the ball and setting up teammates, everyone benefits. Currently, the value of helping teammates is measured only by the number of assists a player records. But if I’m right that the volume of assists is the wrong measure of helping teammates excel (since assists can simply be a byproduct of holding the ball most of the time), then I should be able to verify this through teammate performance. If most players get easier shots when playing with Westbrook or Curry, this should translate into better shooting percentages: teammates who played on another team the year before or the year after should show a distinct improvement in shooting percentage while playing with him. This won’t apply across the board, as some players simply get better or worse from year to year, and other players on the team also affect the data. That said, looking at this across players who switch teams is relevant, especially if there is a consistent trend.

To measure this for Russell Westbrook, I’ve chosen 5 of the most prominent players that recently switched teams to or from Oklahoma City: Victor Oladipo, Kevin Durant, Carmelo Anthony, Paul George and Enes Kanter. Three left Oklahoma City and two went there from another team. For the two that went there, Paul George and Carmelo Anthony, I’ll compare year-to-date this season (playing with Westbrook) vs their shooting percentage last year (without Westbrook). For Kanter and Oladipo, the percentage last year will be titled “with Westbrook” and this year “without Westbrook”, and for Durant the seasons in question are 2015-16 (with Westbrook) vs 2016-17 (without Westbrook).

Shooting Percentage

[Table: shooting percentages for the five players, with vs without Westbrook]

Given that the league average is to shoot 45.7%, shooting below that hurts a team, while shooting above it helps. An average team takes 85.4 shots per game, so a 4.0-percentage-point swing translates to roughly 8.0 points a game. To put that in perspective, the three teams with the best records this season are the Rockets, Warriors and Celtics, and they have the first, second and fourth best Plus/Minus for the season at +11.0, +11.0 and +5.9, respectively. The Thunder come in at +0.8. If they scored 8 more points a game (without giving up more), their Plus/Minus would be on a par with the top three teams, and their record likely would be quite a bit better than 12-14.
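The arithmetic behind that swing estimate can be sketched as follows; the points-per-made-shot figure depends on the share of threes a team takes, so I show a range (my assumption) rather than a single number:

```python
shots_per_game = 85.4  # league-average field goal attempts (from the post)
swing = 0.04           # a 4-percentage-point shooting swing

# Each extra make is worth 2 points on a two-pointer; with a typical
# share of threes, an average made shot is worth roughly 2.0-2.35 points.
low = shots_per_game * swing * 2.0
high = shots_per_game * swing * 2.35
print(round(low, 1), round(high, 1))  # roughly 6.8 to 8.0 points per game
```

So the "8 points a game" figure sits at the high end of this range, consistent with a meaningful share of the extra makes being three-pointers.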

Curry and His Teammates Make Others Better

How does Curry compare? Let’s look at the same statistics for Durant, Andrew Bogut, Harrison Barnes, Zaza Pachulia and Ian Clark (the primary player who left the Warriors). For Barnes, Bogut, Pachulia and Durant I’ll compare the 2015 and 2016 seasons and for Clark I’ll use 2016 vs this season-to-date.

[Table: shooting percentages for the five players, with vs without Curry]

So, besides being one of the best shooters ever to play the game, Curry also has a dramatic impact on the efficiency of the other players on his team. Perhaps it’s because opponents need to double-team him, leaving other players less closely guarded. Perhaps it’s because he bought into Kerr’s “spread the floor, move the ball” philosophy. Whatever the case, his willingness to give up the ball certainly has an impact, and that impact, plus his own shooting efficiency, clearly makes the Warriors an impressive scoring machine. As an aside, recent Warriors additions Casspi and Young are also posting the best shooting percentages of their careers.

Westbrook is a Great Player Who Could be Even Better

I want to make it clear that I believe Russell Westbrook is a great player. His speed, agility and general athleticism allow him to do things few other players can match. He can be extremely effective driving to the basket when he does so under control. But he is not a great outside shooter, and he could help his team more by taking fewer outside shots and playing less one-on-one basketball. Many believed the addition of George and Anthony would make Oklahoma City a force to be reckoned with, but to date this has not been the case. Despite the theoretical offensive power these three bring to the table, the team is 24th in the league in scoring at 101.8 points per game, 15 points per game behind the league-leading Warriors. This may change over the course of the season, but I believe each of them playing less one-on-one basketball would help.

Using Technology to Revolutionize Urban Transit


Worsening traffic requires new solutions

As our population increases, the traffic congestion in cities continues to worsen. In the Bay Area my commute into the city now takes about 20% longer than it did 10 years ago, and driving outside of typical rush hours is now often a major problem. In New York, the subway system helps quite a bit, but most of Manhattan is gridlocked for much of the day.

The two key ways of relieving cities from traffic snarl are:

  1. Reduce the number of vehicles on city streets
  2. Increase the speed at which vehicles move through city streets

Metro areas have been experimenting with different measures to improve car speed, such as:

  1. Encouraging carpooling and implementing high occupancy vehicle lanes on arteries that lead to urban centers
  2. Converting more streets to one-way with longer periods of green lights
  3. Prohibiting turns onto many streets as turning cars often cause congestion

No matter what a city does, traffic will continue to get worse unless compelling and effective urban transportation systems are created and/or enhanced. With that in mind, this post will review current alternatives and discuss various ways of attacking this problem.

Ride sharing services have increased congestion

Uber and Lyft have not helped relieve congestion; they have probably increased it, as so many rideshare vehicles cruise the streets awaiting their next ride. While the rise of ridesharing services may have reduced the number of people who commute to work in their own cars, it has merely substituted an Uber driver for a personal one. Commuters used to park their cars on arriving at work, while ridesharing drivers continue to cruise after dropping off a passenger, so the real benefit has been in reducing demand for parking, not improving traffic congestion.

A simple way to think about this is that the total cars on the street at any point in time consists of those with someone going to a destination plus those cruising awaiting picking up a passenger. Uber does not reduce the number of people going to a destination by car (and probably increases it as some Uber riders would have taken public transportation if not for Uber).

The use of optimal traffic-aware routing GPS apps like Waze doesn’t reduce traffic but spreads it more evenly among alternate routes, therefore providing a modest increase in the speed that vehicles move through city streets. The thought that automating these vehicles will relieve pressure is unrealistic, as automated vehicles will still be subject to the same movement as those with drivers (who use Waze). Automating ridesharing cars can modestly reduce the number of cruising vehicles, as Uber and Lyft can optimize the number that remain in cruise mode. However, this will not reduce the number of cars transporting someone to a destination. So, it is clear to me that ridesharing services increase rather than reduce the number of vehicles on city streets and will continue to do so even when they are driverless.

Metro rail systems effectively reduce traffic but are expensive and can take decades to implement

Realistically, improving traffic flow requires cities to enhance their urban transport systems, thereby reducing the number of vehicles on their streets. There are several historic alternatives, but the only one that can move significant numbers of passengers from point A to point B without impacting other traffic is a rail system. However, construction of a rail system is costly, highly disruptive, and can take decades to go from concept to completion. For example, the New York City Second Avenue Line was tentatively approved in 1919. It is educational to read the history of the delays, but the actual project didn’t begin until 2005, despite many millions of dollars being spent on planning well before that date. The first construction commenced in April 2007. The first phase cost $4.5 billion and included 3 stations and 2 miles of tunnels; it was completed and the line opened in January 2017. By May, daily ridership was approximately 176,000 passengers. A second phase is projected to cost an additional $6 billion, add 1.5 more miles to the line, and be completed 10-12 years from now (assuming no delays). Phases 1 and 2 together, from actual start to hoped-for finish, will span over two decades from the 2005 start date…and about a century from when the line was first considered!

Dedicated bus rapid transit, less costly and less effective

Most urban transportation networks include bus lines through city streets. While buses do reduce the number of vehicles on the roads, they have several challenges that keep them from being the most efficient method of urban transport:

  1. They need to stop at traffic lights, slowing down passenger movement
  2. When they stop to let one passenger on or off, all other passengers are delayed
  3. They are very large and often cause other street traffic to be forced to slow down

One way of improving bus efficiency is a dedicated Bus Rapid Transit (BRT) system, which creates a dedicated corridor for buses to use. The keys to increasing the number of passengers such a system can transport are removing buses from normal traffic (thus the dedicated lanes) and reducing or eliminating stops for traffic lights, either by altering light timing to minimize bus stoppage or by creating overpasses and/or underpasses. If traffic lights are altered, the bus doesn’t stop for a light, but cross traffic may stop longer, increasing cross-traffic congestion. Eliminating interference using underpasses and/or overpasses at each intersection can be quite costly given the substantial size of buses. San Francisco has adopted the first (less optimal but less costly) approach along a two-mile corridor of Van Ness Avenue. The cost will still be over $200 million (excluding new buses), and it is expected to increase ridership from about 16,000 passengers per day to as much as 22,000 (which I estimate translates to 2,000-3,000 passengers per hour in each direction during peak hours). Given the increased time cross traffic will need to wait, it isn’t clear how much net benefit will occur.
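As a rough check on that peak-hour estimate (the peak-hour share below is my assumption, not a published figure):

```python
daily_riders = 22_000  # projected Van Ness BRT ridership (from the post)

# Assumption (mine): the busier direction carries roughly 10-14% of all
# daily boardings during a single peak hour.
estimates = [round(daily_riders * share) for share in (0.10, 0.14)]
print(estimates)  # [2200, 3080] -- consistent with the 2,000-3,000 estimate
```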

Will Automated Car Rapid Transit (ACRT) be the most cost effective solution?

I recently met with a company that expects to create a new alternative: very small automated cars in a rapid-transit corridor (ACRT), at a fraction of the cost of a BRT and with more than double the capacity. The basic concept is to create a corridor similar to a BRT’s, using underpasses beneath some streets and bridges over others, so cross traffic is not affected by longer traffic-light stoppages. Since an underpass (tunnel) sized for a very small car is a fraction of the size of one for a very large bus, so is the cost. The cars would be specially designed driverless vehicles with no trunk and no back seats, holding one or two passengers. The same 3.5-to-4.0-meter-wide lane needed for a BRT would be sufficient for more than two lanes of such cars. Since the cars would be autonomous, speed and spacing could be controlled so that all cars in the corridor move at 30 miles per hour until they exit. With overpasses and underpasses at each cross street, the cars would never stop for lights. Each vehicle would hold one or two passengers going to the same stop, so a car would not slow until it reached its destination; when it did, it would pull off the road without reducing speed until it was on the exit ramp.

The company claims it will have the capacity to transport 10,000 passengers per hour per lane with the same setup as the Van Ness corridor if underpasses and overpasses were added. Since a capacity of 10,000 passengers per hour in each direction would provide significant excess capacity compared to likely usage, two lanes (3 meters in total width instead of 7-8 meters) are all such a system would require. The reduced width would cut construction cost while still providing excess capacity. Passengers would also arrive at destinations much sooner than by bus, as the cars would travel at 30 miles per hour without stopping even once: a 2-mile trip would take 4 minutes! Compare that to any experience you have had taking a bus. The speed of movement also makes each vehicle available to many more passengers during a day. While still unproven, this technology appears to offer a significant cost/benefit advantage over the alternatives.
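The trip-time claim checks out directly, and the claimed lane capacity implies a certain average spacing between cars; a quick sketch (the passengers-per-car occupancy is my assumption):

```python
MPH_TO_MPS = 1609.34 / 3600  # miles per hour -> meters per second

def trip_minutes(miles, mph):
    """Nonstop corridor travel time in minutes."""
    return miles / mph * 60

def spacing_m(passengers_per_hour, pax_per_car, mph):
    """Average front-to-front spacing between cars in one lane (meters)."""
    cars_per_second = passengers_per_hour / pax_per_car / 3600
    return mph * MPH_TO_MPS / cars_per_second

print(trip_minutes(2, 30))                    # 4.0 minutes, as claimed
print(round(spacing_m(10_000, 1.5, 30), 1))   # ~7.2 m between cars
```

Roughly seven meters between vehicles at 30 mph is a very tight headway for human drivers, which is why the concept depends on centrally coordinated autonomous control.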

Conclusion

The population expansion within urban areas will continue to drive increased traffic unless additional solutions are implemented. If it works as well in practice as it does in theory, an ACRT like the one described above offers one potential way of improving transport efficiency. However, this is only one of many potential approaches to solving the problem of increased congestion. Regardless of the technology used, this is a space where innovation must happen if cities are to remain livable. While investment in underground rail is also a potential way of mitigating the problem, it will remain an extremely costly alternative unless innovation occurs in that domain.

How much do you know about SEO?

Search Engine Optimization: A step-by-step process recommended by experts

Azure just completed its annual ecommerce marketing day. It was attended by 15 of our portfolio companies, two high-level executives at major corporations, a very strong SEO consultant and the Azure team. The purpose of the day is to help the CMOs in the Azure portfolio gain a broader perspective on hot marketing topics and share ideas and best practices. This year’s agenda included the following sessions:

  1. Working with QVC/HSN
  2. Brand building
  3. Using TV, radio and/or podcasts for marketing
  4. Techniques to improve email marketing
  5. Measuring and improving email marketing effectiveness
  6. Storytelling to build your brand and drive marketing success
  7. Working with celebrities, brands, popular YouTube personalities, etc.
  8. Optimizing SEO
  9. Product Listing Ads (PLAs) and Search Engine Marketing (SEM)

One pleasant aspect of the day is that it generated quite a few interesting ideas for blog posts! In other words, I learned a lot regarding the topics covered. This post is on an area many of you may believe you know well, Search Engine Optimization (SEO). I thought I knew it well too… before being exposed to a superstar consultant, Allison Lantz, who provided a cutting-edge presentation on the topic. With her permission, this post borrows freely from her content. Of course, I’ve added my own ideas in places and may have introduced some errors in thinking, and a short post can only touch on a few areas and is not a substitute for true expertise.

SEO is Not Free if You Want to Optimize

I have sometimes labeled SEO as a free source of visitors to a site, but Allison correctly points out that if you want to focus on optimization (the O in SEO), then it isn’t free, but rather an ongoing process (and investment) that should be part of company culture. The good news is that SEO will likely generate high-quality traffic that lasts for years and produces a high ROI against the cost of striving to optimize. All content creators should be trained to write in a manner that optimizes traffic by using targeted keywords in their content and ensuring those words appear in the places that matter most for search. To be clear, it’s also best if the content is relevant, well written and user-friendly. If you were planning to create the content anyway, then the cost of doing this is relatively minor; if the content is incremental, created to achieve higher SEO rankings, the cost will be greater. But I’m getting ahead of myself and need to review the step-by-step process Allison recommends for moving toward optimization.

Keyword Research

The first thing to know when developing an SEO strategy is what you are trying to optimize. Anyone doing a search enters a word or phrase they are searching for; each such word or phrase is called a ‘keyword’. If you want to gain more users through SEO, it’s critical to identify thousands, tens of thousands or even hundreds of thousands of keywords that are relevant to your site. For a fashion site, these could be brands, styles, and designers. For an educational site like Education.com (an Azure portfolio company that is quite strong in SEO and ranks on over 600,000 keywords), keywords might be math, English, multiplication, etc. The broader the keyword, the greater the likely volume, but along with that comes more competition for search rankings and a higher cost per keyword. The first step in the process is brainstorming which combinations of words are relevant to your site – in other words, if someone searched for that specific combination, would your site be very relevant to them? To see why the number gets very high, consider Education.com again. Going beyond searching on “math”, one can divide math into arithmetic, algebra, geometry, calculus, etc. Each of these can be divided further: arithmetic includes multiplication, addition, division, subtraction, exponentiation, fractions and more. Each of these can be subdivided again, with multiplication covering multiplication games, multiplication lesson plans, multiplication worksheets, multiplication quizzes and more.
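The combinatorial growth described above is easy to see by crossing a few short lists (the lists here are illustrative, not Education.com’s actual keyword taxonomy):

```python
from itertools import product

topics = ["multiplication", "addition", "division", "fractions"]
formats = ["games", "lesson plans", "worksheets", "quizzes"]
grades = ["", "1st grade ", "2nd grade ", "3rd grade "]

# Cross the three lists to enumerate candidate keywords.
keywords = [f"{g}{t} {f}" for g, t, f in product(grades, topics, formats)]
print(len(keywords))  # 64 keywords from three 4-item lists
print(keywords[0])    # 'multiplication games'
```

Three short lists already yield 64 candidates; with dozens of topics, formats, and qualifiers, the count reaches the hundreds of thousands mentioned above.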

Ranking Keywords

Once keywords are identified, the next step is deciding which ones to focus on. This means ranking keywords based on the likely number of clicks to your site each could generate and the expected value of the potential users obtained through those clicks. Doing this requires determining, for each keyword:

  • Monthly searches
  • Competition for the keyword
  • Conversion potential
  • Effort (and possible cost) required to achieve a certain ranking

Existing tools report the monthly volume of searches for each keyword (remember to add searches on Bing to those on Google). Estimating the strength of competition requires doing a search using the keyword and learning who the top-ranking sites are currently (given the volume of keywords to analyze, this is very labor intensive). If Amazon is a top site they may be difficult to surpass but if the competition includes relatively minor players, they would be easier to outrank.

The next question to answer for each keyword is: “What is the likelihood of converting someone who searches on this keyword if they do come to my site?” For example, for Education.com, someone searching on ‘sesame street math games’ might not convert well, since the company doesn’t have a license to use Sesame Street characters in its math games. But someone searching on ‘1st grade multiplication worksheets’ would have a high probability of converting, since the company is world-class in that area. The other consideration mentioned above is the effort required to achieve a given level of success. If you already have a lot of content relevant to a keyword, then search-optimizing that content might not be very costly. But if you currently have no relevant content, or the keyword is very broad, a great deal more work might be required.

Example of Keyword Ranking Analysis

Source: Education.com

Comparing Effort Required to Estimated Value of Keywords

Once you have produced the first table, you can make a very educated guess on your possible ranking after about 12 months (the time it may take Google/Bing to recognize your new status for that keyword).

There are known statistics on the likely click-through rate (share of searches on the keyword) at rank 1, 2, 3, and so on. Multiplying that rate by the average search volume for the keyword gives a reasonable estimate of the monthly traffic a given rank would generate for your site. The next step is to estimate the rate at which you will convert that traffic to members (where they register so you get their email) and/or customers (I’ll assume customers for the rest of this post, but the same method applies to members). Your existing conversion rate is generally a good estimate; if you have been buying clicks on that keyword from Google or Bing, you may have an even better one. Multiplying the number of customers obtained by the LTV (Lifetime Value) of a customer yields the dollar value generated if the keyword reaches the estimated rank. Subtract from this the current value being obtained from the keyword (based on its current ranking) to see the incremental benefit.
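The valuation chain in that paragraph can be sketched in a few lines; the CTR-by-rank table and the sample numbers below are illustrative assumptions for the sketch, not Allison’s figures:

```python
# Rough CTR by organic search rank (assumed for illustration only).
CTR_BY_RANK = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def monthly_keyword_value(searches, rank, conversion, ltv):
    """Expected monthly dollar value of holding `rank` for a keyword."""
    ctr = CTR_BY_RANK.get(rank, 0.0)  # ranks off the table earn ~nothing
    return searches * ctr * conversion * ltv

# Hypothetical keyword: 40,000 searches/month, 2% conversion, $50 LTV.
projected = monthly_keyword_value(40_000, 3, 0.02, 50)  # if we reach rank 3
current = monthly_keyword_value(40_000, 9, 0.02, 50)    # currently unranked
print(round(projected - current))  # 4000 -> incremental monthly value
```

Running this per keyword, then comparing the incremental value against the estimated effort to reach the target rank, produces exactly the kind of prioritized list the ranking table describes.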

Content Optimization

One important step in improving rankings is to use keywords in the titles of articles. While the words to use may seem intuitive, it’s important to test variations to see how each performs. Will “free online multiplication games” outperform “free times table games”? The way to test this is to try each for a different two-week (or month-long) period and see which gives a higher CTR (click-through rate). As discussed earlier, it’s also important to optimize body copy against keywords. Many of our companies create a guide for writing copy with rules that result in better CTR.
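A minimal way to compare two title variants, assuming you can pull clicks and impressions for each test period (and keeping in mind that small samples can mislead):

```python
def ctr(clicks, impressions):
    """Click-through rate: clicks divided by impressions."""
    return clicks / impressions if impressions else 0.0

# Hypothetical two-week test results for the two titles in the text.
a = ctr(480, 12_000)  # "free online multiplication games"
b = ctr(390, 11_500)  # "free times table games"
print(f"A: {a:.2%}  B: {b:.2%}")  # A: 4.00%  B: 3.39%
```

With differences this small, it is worth checking that the gap exceeds normal week-to-week noise before committing to a title.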

The Importance of Links

Google views links from other sites to yours as an indication of your level of authority. The more important the site linking to you, the more it impacts Google’s view. Having a larger number of sites linking to you can drive up your Domain Authority (a search engine ranking score) which in turn will benefit rankings across all keywords. However, it’s important to be restrained in acquiring links as those from “Black Hats” (sites Google regards as somewhat bogus) can actually result in getting penalized. While getting another site to link to you will typically require some motivation for them, Allison warns that paying cash for a link is likely to result in obtaining some of them from black hat sites. Instead, motivation can be your featuring an article from the other site, selling goods from a partner, etc.

Other Issues

I won’t review it here but site architecture is also a relevant factor in optimizing SEO benefits. For a product company with tens of thousands of products, it can be extremely important to have the right titles and structure in how you list products. If you have duplicative content on your site, removing it may help your rankings, even if there was a valid reason to have such duplication. Changing the wording of content on a regular basis will help you maintain rankings.

Summary

SEO requires a well-thought-out strategy and consistent, continued execution to produce results. This is not a short-term fix, as an SEO investment will likely only start to show improvements four to six months after implementation with ongoing management. But as many of our portfolio companies can attest, it’s well worth the effort.

SoundBytes

  • It’s a new basketball season, so I can’t resist a few comments. First, as much as I am a fan of the Warriors, it’s pretty foolish to view them as a lock to win, as winning is very tenuous. For example, had Durant missed his late-game three-point shot in game 5 of the finals last year, the Warriors may have been facing the threat of a repeat of the 2016 finals – going back to Cleveland for a potential tying game.
  • Now that Russell Westbrook has two star players to accompany him we can see if I am correct that he is less valuable than Curry, who has repeatedly shown the ability to elevate all teammates. This is why I believe that, despite his two MVPs, Curry is under-rated!
  • With Stitch Fix filing for an IPO, we are seeing the first of several next-generation fashion companies emerge. In the filing, I noted the emphasis they place on SEO as a key component of their success. I believe new fashion startups will continue to exert pressure on traditional players. One Azure company moving towards scale in this domain is Le Tote – keep an eye on them!

Will Grocery Shopping Ever be the Same?

Dining and shopping today is very different than in days gone by – the Amazon acquisition of Whole Foods is a result

“I used to drink it,” said Andy Warhol once of Campbell’s soup. “I used to have the same lunch every day, for 20 years, I guess, the same thing over and over again.” In Warhol’s signature medium, silkscreen, the artist reproduced his daily Campbell’s soup can over and over again, changing only the label graphic on each one.

When I was growing up, I didn’t have exactly the same thing over and over like Andy Warhol, but virtually every dinner was at home, at our kitchen table (we had no dining room in the 4-room apartment). Eating out was a rare treat, and my father would have abhorred it if my mom had brought in prepared food. My mom, like most women of that era, didn’t officially work, but did the bookkeeping for my dad’s plumbing business. She would shop for food almost every day at a local grocery and wheel it home in her shopping cart.

When my wife and I were raising our kids, the kitchen remained the most important room in the house. While we tended to eat out many weekend nights, our Sunday-through-Thursday dinners were consumed at home, sprinkled with occasional meals brought in from outside like pizza, fried chicken, ribs, and Chinese food. Now, given the high proportion of households where both parents work, eating out, fast food and prepared foods make up a large proportion of how Americans consume dinner. The trend has reached the point where some say the traditional kitchen may disappear as people stop cooking altogether.

In this post, I discuss the evolution of our eating habits, and how they will continue to change. Clearly, the changes that have already occurred in shopping for food and eating habits were motivations for Amazon’s acquisition of Whole Foods.

The Range of How We Dine

Dining can be broken into multiple categories, and families usually participate in all of them. First, almost 60% of dinners eaten at home are still prepared there. While the percentage has diminished, it is still the largest of the four categories of dinners. Second, many meals are now purchased from a third party but still consumed at home; given the rise of delivery services and the greater availability of pre-cooked meals at groceries, this category spans virtually every type of food. Third, many meals are purchased from a fast food chain (about 25% of Americans eat some type of fast food every day [1]), and about 20% of meals are eaten in a car [2]. Finally, a smaller percentage of meals are consumed at a restaurant. (Sources: [1] Schlosser, Eric. “Americans Are Obsessed with Fast Food: The Dark Side of the All-American Meal.” CBSNews. Accessed April 14, 2014. [2] Stanford University. “What’s for Dinner?” Multidisciplinary Teaching and Research at Stanford. Accessed April 14, 2014.)

The shift to consuming food away from home has been a trend for the last 50 years as families began going from one worker to both spouses working. The proportion of spending on food consumed away from home has consistently increased from 1965-2014 – from 30% to 50%.

Source: Calculated by the Economic Research Service, USDA, from various data sets from the U.S. Census Bureau and the Bureau of Labor Statistics.

With both spouses working, the time available to prepare food was dramatically reduced. Yet, shopping in a supermarket remained largely the same except for more availability of prepared meals. Now, changes that have already begun could make eating dinner at home more convenient than eating out with a cost comparable to a fast food chain.

Why Shopping for Food Will Change Dramatically over the Next 30 Years

Eating at home can be divided between:

  1. Cooking from scratch using ingredients from general shopping
  2. Buying prepared foods from a grocery
  3. Cooking from scratch from recipes supplied with the associated ingredients (meal kits)
  4. Ordering meals that have previously been prepared and only need to be heated up
  5. Ordering meals from a restaurant that are picked up or delivered to your home
  6. Ordering “fast food” type meals like pizza, ribs, chicken, etc. for pickup or delivery.

I am starting with the assumption that many people will still want to cook some proportion of their dinners (I may be romanticizing, given how I grew up and how my wife and I raised our family). But as cooking for yourself becomes an even smaller percentage of dinners, shopping for food in the traditional way will prove inefficient. Why buy a package of saffron or thyme or a bag of onions, only to see very little of it consumed before it is no longer usable? And why start cooking a meal, after shopping at a grocery, only to find you are missing an ingredient of the recipe? Why not shop by the meal, instead of shopping for many items that may or may not end up being used?

Shopping by the meal is the essential value proposition offered by Blue Apron, Plated, Hello Fresh, Chef’d and others. Each sends you recipes and all the ingredients to prepare a meal. There is little food waste involved (although packaging is another story). If the meal requires one onion, that is what is included; if it requires a pinch of saffron, then only a pinch is sent. When preparing one of these meals, you never find yourself missing an ingredient. It takes a lot of the stress and the food waste out of meal preparation. But most such plans, in trying to keep the cost per meal under $10, have very limited choices each week (all in a similar lower-cost price range) and require committing to multiple meals per week. Chef’d, one of the exceptions, allows the user to choose individual meals or to purchase a weekly subscription. It also offers over 600 options to choose from, while a service like Blue Apron asks the subscriber to select 3 out of 6 choices each week.

Blue Apron meals portioned perfectly for the amount required for the recipes

My second assumption is that the number of meals created from scratch in an average household will diminish each year (as it already has for the past 50 years). However, many people will want access to “preferred high quality” meals that can be warmed up and eaten, especially in two-worker households. This will be easier and faster (but perhaps less gratifying) than preparing a recipe provided by a food supplier along with all the ingredients. I am talking about going beyond the pre-cooked items in your average grocery. Such meals are already emerging as delivery services partner with restaurants to bring meals to your doorstep. But this type of service tends to be relatively expensive on a per-meal basis.

I expect new services to arise (we’ve already seen a few) that offer less expensive meals prepared by “home chefs” or caterers and ordered through a marketplace (this is category 4 in my list). The marketplace will recruit the chefs, supply them with packaging, take orders, deliver to the end customers, and collect the money. Since the food won’t come from a restaurant, with all the associated overhead, prices can be lower. Providing such a service will be a source of income for people who prefer to work at home. As with drivers for Uber and Lyft, there should be a large pool of available suppliers who want to work in this manner. It will be very important for the marketplaces offering such a service to curate their suppliers to ensure quality and food safety standards. The availability of good-quality, moderately priced prepared meals of one’s choice, delivered to the home, may begin shifting more consumption back to the home or, at a minimum, slow the shift toward eating dinners away from home.

Where will Amazon be in the Equation?

In the past, I predicted that Amazon would create physical stores, but its recent acquisition of Whole Foods goes far beyond anything I forecast by providing an immediate, vast network of physical grocery stores. It makes a lot of sense, as I expect omnichannel marketing to be the future of retail.  My reasoning is simple: on the one hand, online commerce will always be some minority of retail (it currently hovers around 10% of total retail sales); on the other hand, physical retail will continue to lose share to online for years to come, and the line between e-commerce and physical commerce players will blur.  To be competitive, major players will have to be both, and deliver a seamless experience to the consumer.

Acquiring Whole Foods can make Amazon the runaway leader in categories 1 and 2, buying ingredients and/or prepared foods to be delivered to your home.  Amazon Fresh already supplies many people with products sourced from grocery stores, whether general food ingredients or traditional prepared foods supplied by a grocery. Amazon also offers numerous meal kits, and we expect (and are already seeing indications) that it will follow the Whole Foods acquisition by increasing its focus on meal kits as it attempts to dominate this rising category (category 3 in my list).

One could argue that Whole Foods is already a significant player in category 4 (ordering meals that are prepared and only need to be heated up), believing that category 4 is the same as category 2 (buying prepared meals from a grocery). But it is not. What we envision in the future is the ability for individuals (who will all be referred to as “Home Chefs” or something like that) to create brands and cook foods of every genre, price, etc. Customers will be able to order a set of meals completely to their taste from a local home chef. The logical combatants for control of this market will be players like Uber and Lyft, companies like Amazon and Google, existing meal-kit services like Blue Apron…and new startups we’ve never heard of.

When and How to Create a Valuable Marketing Event

Azure CEO Summit
Snapshots from Azure’s 11th Annual CEO Summit

A key marketing tool for companies is holding an event, like a users’ conference or a topical forum, to build relationships with customers and partners, drive additional revenue and/or generate business development opportunities. Azure held its 11th annual CEO Summit last week, and as we’re getting great feedback on the success of the conference, I thought it might be helpful to dig deeply into what makes a conference effective. I will use the Azure event as the example but try to abstract rules and lessons, as I have been asked for advice on this topic by other firms and companies.

Step 1. Have a clear set of objectives

For the Azure CEO Summit, our primary objectives are to help our portfolio companies connect with:

  1. Corporate and Business Development executives from relevant companies
  2. Potential investors (VCs and Family Offices)
  3. Investment banks so the companies are on the radar and can get invited to their conferences
  4. Debt providers for those that can use debt as part of their capital structure

A secondary objective of the conference is to build Azure’s brand, thereby increasing our deal flow and helping existing and potential investors in Azure understand some of the value we bring to the table.

When I created a Wall Street tech conference in the late 90s, the objectives were quite different. They still included brand building, but I also wanted our firm to own trading in tech stocks for that week, have our sell-side analysts gain reputation and following, help our bankers expand their influence among public companies, and generate a profit for the firm at the same time. We didn’t charge directly for attending but monetized through attendees’ increased use of our trading desk and more companies using our firm for investment banking.

When Fortune began creating conferences, its primary objective was to monetize its brand in a new way. This meant charging a hefty price for attending. If people were being asked to pay, the program had to be very strong, and Fortune markets it quite effectively.

Conferences that have clear objectives, and focus the activities on those objectives, are the most successful.

Step 2. Determine invitees based on who will help achieve those objectives

For our Summit, most of the invitees follow directly from the objectives listed above. If we want to help our portfolio companies connect with the above-mentioned constituencies, we need to invite both our portfolio CEOs and the right players from corporations, VCs, family offices, investment banks and debt providers. To help our brand, inviting our LPs and potential LPs is important. To ensure the Summit is at the quality level needed to attract the right attendees, we also target getting great speakers. As suggested by my partners and Andrea Drager, Azure VP (and my collaborator on Soundbytes), we invited several non-Azure Canadian startups. In advance of the summit, we asked Canadian VCs to nominate candidates they thought would be interesting for us, and we picked the best 6 to participate. This led to over 70 interesting companies being nominated and added to our deal flow pipeline.

Step 3. Create a program that will attract target attendees to come

This is especially true in the first few years of a conference while you build its reputation. It’s important to realize that your target attendees have many conflicting pulls on their time. You won’t get them to attend just because you want them there! Driving attendance from the right people is a marketing exercise. The first step is understanding what would be attractive to them. In Azure’s case, they might not understand the benefit of meeting our portfolio companies, but they could be very attracted by the right keynotes.

Azure’s 2017 Summit Keynote Speakers: Mark Lavelle, CEO of Magento Commerce & Co-founder of Bill Me Later. Cameron Lester, Managing Director and Co-Head of Global Technology Investment Banking, Jefferies. Nagraj Kashyap, Corporate VP & Global Head, Microsoft Ventures.

Over the years we have had the heads of technology investment banking from Qatalyst, Morgan Stanley, Goldman, JP Morgan and Jefferies as one of our keynote speakers. From the corporate world, we typically have a CEO, former CEO or chairman of notable companies like Microsoft, Veritas, Citrix, Concur and Audible as a second keynote. Added to these have been CEOs of important startups like Stance and Magento and terrific technologists like the head of Microsoft Labs.

Finding the right balance of content, interaction and engagement is challenging, but it should be explicitly tied to meeting the core objectives of the conference.

Step 4. Make sure the program facilitates meeting your objectives

Since Azure’s primary objective is creating connections between our portfolio (and this year, the 6 Canadian companies) and the various other constituencies we invite, we start the day with speed-dating one-on-ones of 10 minutes each. Each attendee participating in one-on-ones can be scheduled to meet up to 10 entities between 8:00 and 9:40 AM. Following that, we schedule our first keynote.

In addition to participating in the one-on-ones, which start the day, 26 of our portfolio companies had speaking slots at the Summit, intermixed with three compelling keynote speakers. Company slots are scheduled between keynotes to maximize continued participation. This schedule takes us to about 5:00pm. We then invite the participants and additional VCs, lawyers and other important network connections to join us for dinner. The dinner increases everyone’s networking opportunity in a very relaxed environment.

These diverse types of interaction phases throughout the conference (one-on-ones, presentations, discussions, and networking) all facilitate a different type of connection between attendees, focused on maximizing the opportunity for our portfolio companies to build strong connections.

Azure Company Presentations
Azure Portfolio Company CEO Presentations: Chairish, Megabots & Atacama

Step 5. Market the program you create to the target attendees

I get invited to about 30 conferences each year, plus another 20-30 events. It’s safe to assume that most of the invitees to the Azure conference get a similar (or greater) number of invitations. What this means is that people are unlikely to attend if you send an invitation but then don’t effectively market the event (especially in the first few years). It is important to make sure every key invitee gets a personal call, email, or other message from an executive walking them through the agenda and highlighting the value to them. For the Azure event, we highlight the great speakers but also the value of meeting selected portfolio companies. Additionally, one of my partners or I connect with every attendee we want in one-on-ones with portfolio companies, to stress why this benefits them and to give them the chance to alter their one-on-one schedule. This year we managed over 320 such meetings.

When I created the first “Quattrone team” conference on Wall Street, we marketed it as an exclusive event to portfolio managers. While the information exchanged was all public, the portfolio managers still felt they would have an investment edge by being at a smaller event (and we knew the first year’s attendance would be relatively small) where all the important tech companies spoke and did one-on-one meetings. For user conferences, it can help to land a great speaker from one of your customers or from the industry. For example, if General Electric, Google, Microsoft or some similar important entity is a customer, getting them to speak will likely increase attendance. It also may help to have an industry guru as a speaker. If you have the budget, adding an entertainer or other star personality can also add to the attraction, as long as the core agenda is relevant to attendees.

Step 6. Decide on the metrics you will use to measure success

It is important to set targets for what you want to accomplish and then to measure whether you’ve achieved them. For Azure, the number of entities that attend (besides our portfolio), the number of one-on-one meetings, and the number of post-conference follow-ups that emanate from the one-on-ones are three of the metrics we measure. One week after the conference, I already know that we had over 320 one-on-ones, which so far have led to about 50 follow-ups that we are aware of, including three investments in our portfolio. We expect to learn of additional follow-up meetings, but this has already exceeded our targets.

Step 7. Make sure the value obtained from the conference exceeds its cost

It is easy to spend money but harder to make sure the benefit of that spend exceeds its cost. On one end of the spectrum, some conferences have profits as one of the objectives. But in many cases, the determination of success is not based on profits, but rather on meeting objectives at a reasonable cost. I’ve already discussed Azure’s objectives but most of you are not VCs. For those of you dealing with customers, your objectives can include:

  1. Signing new customers
  2. Reducing churn of existing customers
  3. Developing a better understanding of how to evolve your product
  4. Strong press pickup / PR opportunity

Spending money on a conference should always be compared to other uses of those marketing dollars. To the degree you can be efficient in managing it, the conference can become a solid way to utilize marketing dollars. Some of the things we do for the Azure conference to control cost which may apply to you include:

  1. Partnering with a technology company to host our conference instead of holding it at a hotel. This only works if there is value to your partner. Cost savings is about 60-70%.
  2. Making sure our keynotes are very relevant but are at no cost. You can succeed at this with keynotes from your customers and/or the industry. Cost savings is whatever you might have paid someone.
  3. Having the dinner for 150 people at my house. This has two benefits: it is a much better experience for those attending and the cost is about 70% less than having it at a venue.

Summary

I have focused on the Azure CEO Summit as the primary example, but the rules laid out apply in general. Not only will they help you create a successful conference, but following them means holding one only if its value to you exceeds its cost.

 

SoundBytes

The warriors…

Last June I wrote about why Kevin Durant should join the Warriors

If you look at that post, you’ll see that my logic appears to have been borne out, as my main reason was that Durant was likely to win a championship and this would be very instrumental in helping his reputation/legacy.

Not mentioned in that post was the fact that he would also increase his enjoyment of playing, because playing with Curry, Thompson, Green and the rest of the Warriors exemplifies how the game should be played.

Now it’s up to both Durant and Curry to agree to less-than-maximum salaries so the core of the team can be kept intact for many years. If they do, and win multiple championships, they’ll probably increase endorsement revenue. But even without that offset, my question is “How much is enough?” I believe one can survive nicely on $30-$32 million a year (why not both agree to identical deals for 4 years, not two?). Trying for the maximum is an illusion that can be self-defeating. The difference will have zero impact on their lives, but will keep players like Iguodala and Livingston with the Warriors, which could have a very positive impact. I’m hoping they can also keep West, Pachulia and McGee as well.

It would also be nice if Durant and Curry got Thompson and Green to agree, with a handshake, to follow the Durant/Curry lead and sign for the same amount per year when their contracts come up. Or, if Thompson and Green can extend now, to do the extension at pay equal to what Curry and Durant make in the extension years. By having all four at the same salary at the end of the period, the Warriors would be making a powerful statement about how they feel about each other.

Amazon & Whole Foods…

Amazon’s announced acquisition of Whole Foods is very interesting. In a previous post, we predicted that Amazon would open physical stores. Our reasoning was that over 90% of retail revenue still occurs offline and Amazon would want to attack that. I had expected these to be Guide Stores (not carrying inventory but having samples of products). Clearly this acquisition shows that, at least in food, Amazon wants to go even further. I will discuss this in more detail in a future post.

The Business of Theater

Ernest Shackleton

I have become quite interested in analyzing theater, in particular, Broadway and Off-Broadway shows for two reasons:

  1. I’m struck by the fact that revenue for the show Hamilton is shaping up like that of a unicorn tech company
  2. My son Matthew is producing a show that is now launching at a NYC theater, and as I have been able to closely observe the 10-year process of it getting to New York, I see many attributes that are consistent with a startup in tech.

Incubation

It is fitting that Matthew’s show, Ernest Shackleton Loves Me, was first incubated at Theatreworks, San Francisco, as it is the primary theater of Silicon Valley. Each year the company hosts a “writer’s retreat” to help incubate new shows. Teams go there for a week to work on the shows, all expenses paid. Theatreworks supplies actors, musicians, and support so the creators can see how songs and scenes seem to work (or not) when performed. Show creators exchange ideas much like what happens at a tech incubator. At the culmination of the week, a part of each show is performed before a live audience to get feedback.

Creation of the Beta Version

After attending the writer’s retreat, the creators of Shackleton needed to do two things: find a producer (like a VC, a producer is a backer of the show who recruits others to help finance the project) and add other key players to the team – a book writer, director, actors, etc. Recruiting strong players for each of these positions doesn’t guarantee success but certainly increases the probability. In the case of Shackleton, Matthew came on as lead producer, and he and the team did quite well in getting a Tony-winning book writer, an Obie-winning director and very successful actors on board. Once this team was together, an early (beta) version of the show was created and performed for an audience of potential investors (the pitch). Early investors in the show are like angel investors, as risk is higher at this point.

Beta Testing

The next step was to run a beta test of the product, called the “out of town tryout”. In general, out of town is anyplace other than New York City. It is used for continuous improvement of the show, much as beta testing is used to iterate a technology product based on user feedback. Theater critics also review shows in each city where they are performed. Ernest Shackleton Loves Me (Shackleton) had three runs outside of NYC: Seattle, New Jersey and Boston. During each, the show was improved based on audience and critic reaction. While it received rave reviews in each location, critics and live audiences can still suggest ways a show can be improved, and responding to that feedback helps prepare a show for a New York run.

Completing the Funding

Like a tech startup, a show finds it easier to raise money once the product is complete. In theater, a great deal of funding is required for the steps mentioned above, but for most shows it is difficult to obtain the bulk of the funding needed to come to New York without actual performances. An average musical that goes Off-Broadway will require $1.0-$2.0 million in capitalization, and an average one that goes to Broadway tends to capitalize at between $8 and $17 million. Hamilton cost roughly $12.5 million to produce, while Shackleton will capitalize at the lower end of the Off-Broadway range due to having a small cast and relatively efficient management. For many shows, the completion of funding continues through the early days of the NYC run. It is not unusual for a show to announce it will open at a certain theater on a certain date and then be unable to raise the incremental money needed to do so. Like a tech startup, some shows, like Shackleton, may run a crowdfunding campaign to help top off their funding.

You can see what a campaign for a theater production looks like by clicking on this link and perhaps support the arts, or by buying tickets on the website (since the producer is my son, I had to include that small ask)!

The Product Launch

Assuming funding is sufficient and a theater has been secured (there currently is a shortage of Broadway theaters), the New York run then begins. This is the true “product launch”. Part of a show’s capitalization may be needed to fund a shortfall in revenue versus weekly cost during the first few weeks, as reviews plus word of mouth are often needed to drive revenue above weekly break-even. Part of the reason so many Broadway shows employ famous Hollywood stars, or are revivals of shows that had prior success and/or are based on a movie, TV show, or other well-known property, is to ensure substantial initial audiences. Some current Broadway examples are Hamilton (bestselling book), Aladdin (movie), Beautiful (the Carole King story), Chicago (revival of a successful show), Groundhog Day (movie), Hello Dolly (revival plus Bette Midler as star) and Sunset Boulevard (revival plus Glenn Close as star).

Crossing Weekly Break Even

Gross weekly burn for shows has a wide range (just like startups), with Broadway musicals having weekly costs from $500,000 to about $800,000 and Off-Broadway musicals in the $50,000 to $200,000 range. In addition, royalties of roughly 10% of revenue go to a variety of players like the composer, book writer, etc. Hamilton has about $650,000 in weekly cost and roughly a $740,000 break-even level when royalties are factored in. Shackleton’s weekly costs are about $53,000, at the low end of the range for an Off-Broadway musical and under 10% of Hamilton’s weekly cost.
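The break-even arithmetic above can be sketched in a few lines. This is a simplified model that assumes royalties (roughly 10% of gross) are the only revenue-linked cost, which is why it lands a bit below the ~$740,000 figure cited; other variable costs would make up the difference.

```python
# Simplified weekly break-even model: fixed weekly costs must be covered
# by gross revenue net of royalties (~10% of gross).
def weekly_breakeven(fixed_weekly_cost, royalty_rate=0.10):
    """Gross weekly revenue needed to cover fixed costs after royalties."""
    return fixed_weekly_cost / (1 - royalty_rate)

# Hamilton: ~$650,000 in fixed weekly cost
print(round(weekly_breakeven(650_000)))   # ≈ $722,000, near the ~$740k cited
```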

Is Hamilton the Facebook of Broadway?

Successful Broadway shows have multiple sources of revenue and can return significant multiples to investors.

Chart 1: A ‘Hits’ Business Example Capital Account

Since Shackleton just had its first performance on April 14, it’s too early to predict what the profit (or loss) picture will be for investors. On the other hand, Hamilton already has a track record that can be analyzed. In its first months on Broadway, the show was grossing about $2 million per week, which I estimate drove about $1 million per week in profits. Financial investors, like preferred shareholders of a startup, are entitled to the equivalent of “liquidation preferences”. This meant that investors recouped their money in a very short period, perhaps as little as 13 weeks. Once they had recouped 110%, the producer began splitting profits with financial investors, reducing the financial investors’ share to roughly 42% of profits. In the early days of the Hamilton run, scalpers were reselling tickets at enormous profits. When my wife and I went to see the show in New York (March 2016), we paid $165 per ticket for great orchestra seats which we could have resold for $2,500 per seat! Instead, we went and enjoyed the show. But a scalper holding those tickets could have made 15 times their money. Subsequently, the company decided to capture a portion of this revenue by adjusting prices for the better seats, and as a result the show now grosses nearly $3 million per week. Since fixed weekly costs probably did not change, I estimate weekly profits are now about $1.8 million. At 42% of this, investors would be accruing roughly $750,000 per week. At this run rate, investors would receive over 3X their investment dollars annually from this revenue source alone if prices held up.
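A rough sketch of that investor math, using the estimates above (the capitalization, the 110% recoupment threshold and the 42% profit share are the post's figures; the rest is simple arithmetic, and it lands slightly above the "as little as 13 weeks" in the text, which presumably assumes somewhat higher early profits):

```python
# Hamilton investor math, using the estimates in the text.
capitalization = 12_500_000          # ~cost to bring the show to Broadway
early_weekly_profit = 1_000_000      # early-run weekly profit estimate

# Investors receive 100% of profits until they have recouped 110%.
weeks_to_recoup = round(capitalization * 1.10 / early_weekly_profit, 1)
print(weeks_to_recoup)               # 13.8 weeks, i.e. roughly one quarter

current_weekly_profit = 1_800_000    # after the ticket repricing
investor_share = 0.42                # investors' share post-recoupment
annual_multiple = round(current_weekly_profit * investor_share * 52
                        / capitalization, 2)
print(annual_multiple)               # ≈ 3.1X invested capital per year
```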

Multiple Companies Amplify Revenue and Profits

Currently Hamilton has a second permanent show in Chicago, a national touring company in San Francisco (until August, when it’s supposed to move to LA) and has announced a second touring company that will begin its tour in Seattle in early 2018 before moving to Las Vegas, Cleveland and other stops. I believe it will also have a fifth company in London and a sixth in Asia by late 2018 or early 2019. Surprisingly, the touring companies can, in some cities, generate more weekly revenue than the Broadway company due to larger venues. Table 1 shows an estimate of the revenue per performance in the sold-out San Francisco venue, the Orpheum Theater, which has a capacity of 2,203 versus the Broadway (Richard Rodgers Theater) capacity of 1,319.

Table 1: Hamilton San Francisco Revenue Estimates

While one would expect Broadway prices to be higher, this has not been the case. I estimate the average ticket price in San Francisco to be $339, whereas the average on Broadway is now $282. The combination of 67% higher seating capacity and 20% higher average ticket prices means the revenue per week in San Francisco is now close to $6 million. Since it was lower in the first 4 weeks of the 21-plus-week run, I estimate the total revenue for the run at about $120 million. Given the explosive revenue, I wouldn’t be surprised if the run in San Francisco were extended again. While the production company’s share of this revenue has not been disclosed, normally the production company is compensated with a base guarantee plus a share of the profits (overage) after the venue covers its labor and marketing costs. Given these high weekly grosses, I assume the production company’s share is close to 50% of the gross (including both guarantee and overage), given the enormous profits versus an average show at the San Francisco venue. At 50% of revenue, there would still be almost $3 million per week to go toward paying the production company’s expenses (guarantee) and the local theater’s labor and marketing costs. If I use a lower $2 million of company share per week as profits to the production company, that annualizes at over $100 million in additional profits, or $42 million more per year for financial investors. The Chicago company is generating lower revenue than San Francisco, as the theater is smaller (1,800 seats) and average ticket prices appear to be closer to $200, making revenue roughly $2.8 million per week. When the show ramps to 6 companies (I think by early 2019), it could be generating aggregate revenue of $18-20 million per week or more, should demand hold up. So, it would not be surprising if annual ticket revenue exceeded $1 billion per year at that time.
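The San Francisco weekly gross can be sanity-checked from the capacity and price figures above, assuming the standard 8-performance week (an assumption; the post does not state the performance count):

```python
# Weekly gross ≈ capacity x average ticket price x performances per week.
seats = 2203          # Orpheum Theater capacity
avg_price = 339       # estimated average ticket price in San Francisco
performances = 8      # standard weekly schedule (assumption)

weekly_gross = seats * avg_price * performances
print(weekly_gross)   # 5974536, i.e. close to $6 million per week
```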

Merchandise adds to the mix

I’m not sure how much income each item of merchandise generates for the production company. Items like the cast album and music downloads could generate over $25 million in revenue, but in general only 40% of the net income from these comes to the company. On the other hand, T-shirts ($50 each) and the high-end program ($20 each) have extremely high margins, which I think would accrue to the production company. If an average attendee across the 6 (future) production companies spent $15, this could mean $1.2 million in merchandise sales per week across the 6 companies, or another $60 million per year in revenue. At 60% gross margin, this would add another $36 million in profits.

I expect Total Revenue for Hamilton to exceed $10 billion

In addition to the sources of revenue outlined above, Hamilton will also have opportunities for licensing to schools and others to perform the show, a movie, additional touring companies and more. It seems likely to easily surpass the $6 billion that Lion King and Phantom are each reported to have grossed to date, or the $4 billion so far for Wicked. In fact, I believe it will eventually gross over $10 billion in total. How this gets divided between the various players is more difficult to fully assess, but investors appear likely to receive over 100X their investment, Lin-Manuel Miranda could net as much as $1 billion (before taxes) and many other participants should become millionaires.

Surprisingly Hamilton may not generate the Highest Multiple for Theater Investors!

Believe it or not, a very modest musical with 2 actors appears to be the winner as far as return on investment. It is The Fantasticks, which, because of its low budget and excellent financial performance sustained over decades, has now returned over 250X invested capital. Obviously, my son, an optimistic entrepreneur, hopes his 2-actor musical, Ernest Shackleton Loves Me, will match this record.

Lessons Learned from Anti-Consumer Practices/Technologies in Tech and eCommerce

One example of the anti-consumer practices by airline loyalty programs.

As more and more of our life consists of interacting with technology, it is easier and easier to have our time on an iPhone, computer or game device become all consuming. The good news is that it is so easy for each of us to interact with colleagues, friends and relatives; to shop from anywhere; to access transportation on demand; and to find information on just about anything anytime. The bad news is that anyone can interact with us: marketers can more easily bombard us, scammers can find new and better ways to defraud us, and identity thieves can access our financials and more. When friends email us or post something on Facebook, there is an expectation that we will respond.  This leads to one of the less obvious negatives: marketers and friends may not consider whether what they send is relevant to us and can make us inefficient.

In this post, I want to focus on lessons entrepreneurs can learn from products and technologies that many of us use regularly but that have glaring inefficiencies in their design, or those that employ business practices that are anti-consumer. One of the overriding themes is that companies should try to adjust to each consumer’s preferences rather than force customers to do unwanted things. Some of our examples may sound like minor quibbles but customers have such high expectations that even small offenses can result in lost customers.

Lesson 1: Getting email marketing right

Frequency of email 

The question “How often should I be emailing existing and prospective customers?” has an easy answer: “As often as they want you to.” If you email them too frequently, the recipients may be turned off. If you send too few, you may be leaving money on the table. Today’s email marketing is still at a rudimentary stage, but there are many products that will automatically adjust the frequency of emails based on open rates. Every company should use them. Several companies send me too many emails, and I have either opted out of receiving them or open them only on rare occasions. In either case, the marketer has not optimized its sales opportunity.
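As an illustrative sketch (not any specific product's algorithm), an open-rate-driven frequency rule might look like the following; the thresholds and cap are hypothetical:

```python
# Hypothetical per-subscriber frequency adjustment based on recent open rate:
# back off for disengaged recipients, send a bit more (up to a cap) to
# engaged ones, and leave healthy engagement alone.
def adjust_frequency(emails_per_week, open_rate,
                     low=0.10, high=0.40, max_per_week=5):
    if open_rate < low:        # disengaged: reduce before they opt out
        return max(1, emails_per_week - 1)
    if open_rate > high:       # highly engaged: room to send more
        return min(max_per_week, emails_per_week + 1)
    return emails_per_week     # healthy engagement: no change

print(adjust_frequency(3, 0.05))  # → 2 (throttle down)
print(adjust_frequency(3, 0.50))  # → 4 (send more)
```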

Relevance of email

Given the amount of data that companies have on each of us, one would think that emails would be highly personalized around a customer’s preferences and product applicability. Part of product applicability is understanding the purchase frequency of certain products and not sending a marketing email for a product your customer is unlikely to be ready to buy. One Azure portfolio company, Filter Easy, offers a service for providing air filters. Filter Easy gives each customer the replacement interval recommended by the manufacturer of their air conditioner, lets the customer decide the replacement frequency, and only attempts to sell units on that timetable. Because of this attention to detail, Filter Easy has one of the lowest customer churn rates of any B to C company. In contrast, I receive marketing emails from the company I purchase my running shoes from within a week of buying new ones, even though they should know my replacement cycle is about every 6 months unless there is a good sale (when I may buy ahead). I rarely open their emails now, but I would open more, and be a candidate for other products, if they sent fewer emails and thought more about which of their products was most relevant given what I buy and how often. Even the vaunted Amazon has sent me emails to purchase a new Kindle within a week or so of my buying one, when the replacement cycle of a Kindle is about 3 years.

In an ideal world, each customer or potential customer would receive emails uniquely crafted for them. Offers would be ranked by likely value based on the customer profile and the item profile. For example, customers who only buy when items are on sale should be profiled that way and only sent emails when there is a sale. Open Road, another Azure company, has created a daily email of deeply discounted e-books and gets a very high open rate due to the relevance of its emails (but cuts frequency for subscribers whose open rates start declining).

Lesson 2: Learning from Best Practices of Others

I find it surprising when a company launches a new version of a software application without attempting to incorporate the best practices of existing products. Remember Lotus 1-2-3? For years the company refused to create a Windows version of its spreadsheet, developing one for OS/2 instead, despite seeing Excel’s considerable functionality and ease of use spark rapid adoption. By the time Lotus created a Windows version it was too late, and its market share eroded from a dominant position to a minimal level. In more modern times, Apple helped Blackberry survive well past its expected funeral by failing to incorporate many of Blackberry’s strong email features into the iPhone. Even today, after many updates to Mail, Apple is still missing simple features like being able to hit “B” to go to the bottom of the email stack on the iPhone. Instead, one needs to scroll down through hundreds of emails to get to the bottom to process older emails first, which wastes a lot of time. Microsoft Outlook is in some ways even worse, as it has failed to incorporate the lookup technology from Blackberry (and now from Apple) that can always find an email address from a person’s name. When I have not received a recent email from a person in my contact list, and the person’s email address is not simply their name, Outlook requires the exact address. When this happens, I wind up looking up the person’s contact information on my phone!

Best practices extend beyond software products to marketing, packaging, upselling and more. For example, every ecommerce company should study Apple’s packaging to understand how a best-in-class branding company packages its products. Companies have also learned that in many cases they need to match Amazon by providing free shipping.

Lesson 3: The Customer is Usually Right

Make sure customer loyalty programs are positive for customers but affordable for the company

With few exceptions, companies should adopt a philosophy that is very customer-centric; failing to do so has negative consequences. For example, the airline industry has moved toward giving customers little consideration, and as a result many customers no longer have a preferred airline, instead looking for the best price and/or most convenient schedule. Whereas airline mileage programs were once a very attractive way of retaining customers, the value of miles has eroded to such a degree that travelers have lost much of the benefit. This may have been necessary for the airlines, as the liability associated with outstanding points reached billions of dollars. But airlines also began using points as a profit center by selling miles to credit cards at 1.5 cents per mile and then, to make this a profitable sale, moving the average redemption value to what I estimate to be about 1 cent per point. This leads to a concern of mine for consumers: airlines are selling points at kiosks and online for 3 cents per point, in effect charging three times their cash redemption value.

The lesson here is that if you decide to initiate a loyalty points program, make sure the benefits to the customer increase retention, driving additional revenue, but also make sure that the cost of the program does not exceed that additional revenue. (This may not have been the case for airlines when their mileage points were worth 3-4 cents per mile.) It is important to recognize the future cost associated with loyalty points at the time they are given out (based on their exchange value), as this lowers the gross margin of the transaction. We know of a company that failed to understand that the value of the points awarded for a transaction so severely reduced the associated gross margin that it was nearly impossible for the company to be profitable.
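The gross-margin point can be made concrete with a small sketch; every number here is invented for illustration, not taken from any company mentioned above:

```python
# Illustrative only: how expensing loyalty points at their redemption
# value at the time of sale eats into the gross margin of that sale.
def margin_after_points(price, cogs, points_awarded, value_per_point):
    """Gross margin % after recognizing the future cost of points granted."""
    points_liability = points_awarded * value_per_point
    return (price - cogs - points_liability) / price

# A $100 sale at a 40% gross margin, awarding 5 points per dollar spent
# at a 1-cent redemption value: the points cost $5, i.e. 5 margin points.
m = margin_after_points(price=100, cogs=60, points_awarded=500, value_per_point=0.01)
print(round(m, 3))
```

The design point is that the liability accrues when the points are granted, not when they are redeemed; a program generous enough to wipe out most of the margin per transaction can make profitability nearly impossible, as in the example the post describes.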

Make sure that customer service is very customer centric

During the Thanksgiving weekend I was buying a gift online and found that Best Buy had what I was looking for on sale. I filled out all the information to purchase the item, but when I went to the last step in the process, my order didn’t seem to be confirmed. I repeated the process and had the same experience. So, I waited a few days to try again, but by then the sale was no longer valid. My assistant engaged in a chat session with their customer service to try to get them to honor the sale price, and this was refused (we think she was dealing with a bot, but we’re not positive). After multiple chats, she was told that I could try going to one of their physical stores to see if they had it on sale (extremely unlikely). Instead, I went to Amazon, bought a similar product at full price, and decided never to buy from Best Buy’s online store again. I know from experience that Amazon would not behave that way, and Azure tries to make sure none of our portfolio companies would either. Turning down what would still have been a profitable transaction, and in the process losing a customer, is not a formula for success! While there may be some lost revenue in satisfying a reasonable customer request, the long-term consequence of failing to do so usually far outweighs this cost.

 

Soundbytes

My friend, Adam Lashinsky, from Fortune has just reported that an insurance company is now offering lower rates for drivers of Teslas who deploy Autopilot driver-assistance. Recall that Tesla was one of our stock picks for 2017 and this only reinforces our belief that the stock will continue to outperform.

 

 

They got it right: Why Stephen Curry deserves to be a First Team All-Star

Curry vs. Westbrook

Much has been written about the fact that Russell Westbrook was not chosen for the first team on the Western All-Stars, with the implication that he was more deserving than Curry. I believe that Westbrook is one of the greatest athletes ever to play the game and one of the better players currently in the league. Yet I also feel strongly that so much weight is being placed on his triple-doubles that he is being unfairly anointed as the more deserving player. This post takes a deeper dive into the available data and, I believe, shows that Curry has a greater impact on winning games and is deserving of the first-team honor. So, as is my wont to analyze everything, I spent some time dissecting the comparison between the two. It is tricky comparing the greatest shooter ever to play the game to one of the greatest athletes ever to play, but I’ll attempt it, statistic by statistic.

 

Rebounding

Westbrook is probably the best rebounding guard of all time (with Oscar Robertson and Magic Johnson close behind). This season he is averaging 10.4 rebounds per game while Curry is at 4.3. There is no question that Westbrook wins hands down in this comparison with Curry, who is a reasonably good rebounding point guard. In fact, on rebounds per 36 minutes played this season, Westbrook’s numbers are even better than Robertson’s in his best year: Robertson averaged 12.5 rebounds playing over 44 minutes a game, which equates to 10.2 per 36 minutes vs Westbrook’s 10.8 per 36 minutes (Magic never averaged 10 rebounds per game for a season).

 

Assists

You may be surprised when I say that Curry is a better assist producer than Westbrook this season. How can this be when Westbrook averages 10.3 assists per game and Curry only 6.2? Since Oklahoma City plays a very different style of offense than the Warriors, Westbrook has the ball in his hands a much larger percentage of the time. Both usually bring the ball up the court, but once over half court the difference is striking: Curry tends to pass it off a high proportion of the time, while Westbrook holds onto it far longer. Because of the way Curry plays, he leads the league in secondary assists (passes that set up another player to make an assist) at 2.3 per game, while Westbrook is 35th in the league at 1.1 per game. The longer a player holds the ball, the more likely he is to shoot it, commit a turnover or record an assist, and the less likely he is to get a secondary assist. The reason is that if he keeps the ball until the 24-second clock has nearly run out before passing, the person he passes to needs to shoot (even if the shot is a poor one) rather than try to set up someone else with an easier shot. For example, if a player always had the ball for the first 20 seconds of the 24-second clock, he would likely record all of his team’s assists while on the court.

Table 1: Assist Statistic Comparison

Curry vs. Westbrook Assists
*NBA.com statistics average per game through Feb 1st, 2017

Because Westbrook holds the ball about 50% of the time his team is on offense when he is in the game, he gets a large proportion of the team’s assists. But that style of play also means the team winds up with fewer assists in total. In fact, while the Warriors rank #1 in assists as a team by a huge margin at 31.1 per game (Houston is second at 25.6), Oklahoma City is 20th in the league at 21.2 per game. If you agree that the opportunity to get an assist increases with the number of minutes the ball is in a player’s possession, then an interesting statistic is the number of assists per minute that a player possesses the ball (see Table 1). If we compare the two players from that perspective, we see that Curry has 1.27 assists per minute and Westbrook 1.17. Curry also has 0.47 secondary assists per minute while Westbrook has only 0.13. This brings the total primary plus secondary assist comparison to 1.74 per minute of possession for Curry and 1.30 for Westbrook, a fairly substantial difference. It also helps explain why the Warriors average so many more assists per game than Oklahoma City and get many more easy baskets, leading to the highest field goal percentage in the league, 50.1%.
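For the curious, the per-possession-minute rates can be re-derived. The minutes of possession per game used below (about 4.9 for Curry and 8.8 for Westbrook) are back-of-envelope figures implied by the post’s own numbers, not official tracking data, so the results land within rounding of the 1.74 and 1.30 cited:

```python
# Re-deriving Table 1's rates. Possession minutes per game are
# rough figures implied by the post, not official NBA tracking data.
def assist_rate(primary, secondary, possession_minutes):
    """Primary plus secondary assists per minute of ball possession."""
    return (primary + secondary) / possession_minutes

curry = assist_rate(primary=6.2, secondary=2.3, possession_minutes=4.9)
westbrook = assist_rate(primary=10.3, secondary=1.1, possession_minutes=8.8)
print(round(curry, 2), round(westbrook, 2))
```

Normalizing by time of possession is what makes the comparison fair: Westbrook generates more raw assists, but per minute holding the ball, Curry creates more scoring chances for teammates.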

 

Shooting

Russell Westbrook leads the league in scoring, yet his scoring is less valuable to his team than Stephen Curry’s is to the Warriors. This sounds counterintuitive, but it is related to shooting efficiency: Curry is an extremely efficient shooter and Westbrook is an inefficient one. To help understand the significance of this, I’ll use an extreme example. Suppose the worst shooter on a team took every one of the team’s 80 shots in a game and made 30% of them, including two 3-point shots. He would make 24 baskets and lead the league in scoring by a mile at over 50 points per game (assuming he also got a few foul shots). However, his team would only average 50 or so points per game and would likely lose every one of them. If, instead, he took 20 of the 80 shots and players who were 50% shooters had the opportunity to take the other 60, the team’s field goals made would increase from 24 to 36. Westbrook’s case is not as extreme as this example, but nonetheless he has the lowest efficiency of the 7 players on his team who play the most minutes. So, I believe his team would score more points overall if other players had more shooting opportunities. Let’s look at the numbers.
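The thought experiment above reduces to an expected-makes calculation; a quick sketch:

```python
# The extreme example in numbers: a 30% shooter taking all 80 shots
# versus taking 20 of them and leaving 60 to 50% shooters.
def team_field_goals(shots_by_player, pct_by_player):
    """Expected made field goals across a team's shot distribution."""
    return sum(shots * pct for shots, pct in zip(shots_by_player, pct_by_player))

all_shots = team_field_goals([80], [0.30])            # the ball hog takes everything
shared = team_field_goals([20, 60], [0.30, 0.50])     # shots redistributed
print(all_shots, shared)
```

Reallocating shots from a 30% shooter to 50% shooters lifts expected makes from 24 to 36, which is the whole argument in miniature: volume scoring by an inefficient shooter can cost a team points.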

Table 2: Shot Statistic Comparison

shots-table
*NBA.com statistics average per game through Feb 1st, 2017

Westbrook’s shooting percentage of 42.0% is lower than that of the worst shooting team in the league (Memphis, at 43.2%) and, as mentioned, is the lowest of the 7 players on his team who play the most minutes. Curry’s percentage is 5.5 points higher than Westbrook’s. But the difference in their effectiveness is even greater, as Curry makes far more 3-point shots. Effective field goal percentage adjusts for 3-point shots made by counting them as equal to 1½ 2-point shots. Curry’s effective shooting percentage is 59.1% and Westbrook’s is 46.4%, an extraordinary difference. However, Westbrook gets to the foul line more often, and “true shooting percentage” takes that into account by assuming about 2.3 foul shots replace one field goal attempt (2.3 is used rather than 2.0 to account for 3-point plays and being fouled on a 3-point shot). Using true shooting percentage brings Westbrook’s efficiency slightly closer to Curry’s, but it is still nearly 10 points below Curry’s (see Table 2). What this means is very simple: if Curry took as many shots as Westbrook, he would score far more. In fact, at his efficiency level he would average 36.1 points per game versus Westbrook’s 30.7. While it is difficult to prove, I believe that if Westbrook reduced his number of shots, Oklahoma City would score more points, as other players on his team with higher shooting percentages would have the opportunity to shoot more. And he might be able to boost his own efficiency by eliminating some ill-advised shots.
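For reference, the two efficiency metrics discussed here have standard formulas; the 0.44 free-throw weight below is the conventional figure (roughly one field-goal attempt per 2.3 free throws, as described above), and the sample stat line is invented rather than either player’s actual numbers:

```python
# Standard definitions of the two shooting-efficiency metrics.
def effective_fg_pct(fgm, fg3m, fga):
    """Effective FG%: a made 3-pointer counts as 1.5 made field goals."""
    return (fgm + 0.5 * fg3m) / fga

def true_shooting_pct(points, fga, fta):
    """True shooting %: folds free throws in via the conventional 0.44 weight."""
    return points / (2 * (fga + 0.44 * fta))

# Illustrative game line: 10 makes (5 of them threes) on 19 attempts,
# 25 total points, 4 free-throw attempts.
print(round(effective_fg_pct(10, 5, 19), 3))
print(round(true_shooting_pct(25, 19, 4), 3))
```

Both metrics answer the same question at different resolutions: points produced per shooting opportunity, which is why they separate Curry and Westbrook far more sharply than raw field goal percentage does.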

 

Turnovers vs Steals

This comparison determines how many net possessions a player loses for his team by committing more turnovers than he has steals. Stephen Curry averages 2.9 turnovers and 1.7 steals per game, resulting in a net loss of 1.2 possessions per game. Russell Westbrook commits about 5.5 turnovers per game and has an average of 1.6 steals, resulting in a net loss of 3.9 possessions per game, over 3 times the amount for Curry.

 

Plus/Minus

In many ways, this statistic is the most important one, as it measures how much more a player’s team scores than its opponents while that player is on the floor. The number is affected by who else is on the floor, so the quality of a player’s teammates clearly contributes. Nonetheless, the total impact Curry has on a game through his high effective shooting percentage and assists per minute with the ball is certainly reflected in his team’s average point differential when he is on the floor. Curry leads the league in plus/minus for the season, as his team averages 14.5 more points than its opponents per 36 minutes he plays. Westbrook’s total for the season is 41st in the league, and his team averages +3.4 points per 36 minutes.

 

Summing Up

While Russell Westbrook is certainly a worthy all-star, I believe that Stephen Curry deserves having been voted a starter (as does James Harden, but I don’t think Harden’s selection has been questioned). Westbrook stands out as a great rebounding guard, but other aspects of his amazing triple-double run are less remarkable when compared to Curry. Curry is a far more efficient scorer, and any impartial analysis shows that he would average more points than Westbrook if he took the same number of shots. At the same time, Curry makes his teammates better by forcing opponents to space the floor, helping create more open shots for Durant, Thompson and others. He deserves some of the credit for Durant becoming a more efficient scorer this year than at any time in his career. While Westbrook records a far larger number of assists per game than Curry, Curry is a more effective assist creator for the time he has the ball, helping the Warriors flirt with the 32-year-old record for team assists per game while Oklahoma City ranks 20th of the 30 NBA teams, with 10 fewer assists per game than the Warriors.

Top 10 Predictions for 2017

Conceptualization of giant robot fight.

When I was on Wall Street I became very boring by having the same three strong buy recommendations for many years until I downgraded Compaq in 1998 (it was about 30X the original price at that point). The other two, Microsoft and Dell, remained strong recommendations until I left in 2000. At the time, they were each well over 100X the price of my original recommendation. I mention this because my favorite stocks for this blog include Facebook and Tesla for the 4th year in a row. They are both over 5X what I paid for them in 2013 (23 and 45, respectively) and I continue to own both. Will they get to 100X or more? This is not likely, as companies like them have had much higher valuations when going public compared with Microsoft or Dell, but I believe they continue to offer strong upside, as explained below.

In each of my stock picks, I’m expecting the stocks to outperform the market. I don’t have a forecast of how the market will perform, so in a steeply declining market, out-performance might occur with the stock itself being down (but less than the market). Given the recent rise in the market subsequent to the election of Donald Trump, on top of several years of a substantial bull market, this risk is real. While I have had solid success at predicting certain individual stocks’ performance, I do not pride myself in being able to predict the market itself. So, consider yourself forewarned regarding potential market volatility.

This top ten is unusual in having three negative forecasts; last year there were none, and in 2015 only one.

We’ll start with the stock picks (with prices of stocks valid as of writing this post, January 10, all higher than the beginning of the year) and then move on to the remainder of my 10 predictions.

1. Tesla stock appreciation will continue to outpace the market (it is currently at $229/share). Tesla expected to ship 50,000 vehicles in the second half of 2016, and Q3 revenue came in at $2.3 billion. This equates to a 100,000-vehicle and $9.2 billion annualized run rate. The Model 3 has over 400,000 units on back order, and Tesla is ramping capacity to produce 500,000 vehicles in total in 2018. If the company stays on track from a production point of view, this amounts to 5X the vehicle unit sales rate and about 3X the revenue run rate. While the Model 3 is unlikely to have the same gross margins as the current products, tripling revenue should still lead to substantially more than tripling profits. Tesla remains the clear leader in electric vehicles and fully integrated automated features in an automobile. While others are looking toward 2020/2021 to deliver automated cars, Tesla is already delivering most of the functionality required, is likely to install numerous improvements between now and 2020, and should remain the leader. Tesla also continues to have the strongest business model, as it sells directly to the consumer, eliminating dealers. I also believe that the SolarCity acquisition will prove more favorable than anticipated. Given these factors, I expect Tesla stock to have solid outperformance in 2017. The biggest risk is product delay and/or delivering a faulty product, but competitors are trailing by quite a bit, so there is some headroom if this happens.

2. Facebook stock appreciation will continue to outpace the market (it is currently at $123/share). While the core Facebook user base growth has slowed considerably, Facebook has a product portfolio that also includes Instagram, WhatsApp and Oculus. This gives Facebook multiple opportunities for revenue growth: Improve the revenue per DAU (daily active user) on Facebook itself; begin to monetize Instagram and WhatsApp in more meaningful ways; and build the install base of Oculus. We have seen Facebook advertising rates increase steadily as more and more mainstream companies shift budget from traditional advertising to Facebook. This, combined with modest growth in DAUs, should lead to continued strong revenue growth from the Facebook platform itself. The opportunity to increase monetization on its other platforms should become more real during 2017, providing Facebook with additional revenue streams. And while the Oculus did not get out of the gate as fast as expected, it is still viewed as the premier product in VR. We believe the company will need to produce a lower priced version to drive sales into the millions of units annually. The wild card here is the “killer app”; if a product becomes a must have and is only available on the Oculus, sales would jump substantially in a short time.

3. Amazon stock appreciation will outpace the market (it is currently at $795/share). I had Amazon as a recommended stock in 2015 but omitted it in 2016 after the stock appreciated 137% in 2015 while revenue grew less than 20%. That meant my 2015 recommendation worked extremely well. But while I still believed in Amazon fundamentals at the beginning of 2016, I felt the stock might have reached a level that needed to be absorbed for a year or so. In fact, 2016 Amazon fundamentals continued to be quite strong with revenue growth accelerating to 26% (to get to this number, I assumed it would have its usual seasonally strong Q4). At the same time, the stock was only up 10% for the year. While it has already appreciated a bit since year end, it seems to be more fairly valued than a year ago, and I am putting it back on our recommended list as we expect it to continue to gain share in retail, have continued success with its cloud offering (strong growth and increased margin), leverage their best in class AI and voice recognition with Echo (see pick 10), and add more physical outlets that drive increased adoption.

4. Both Online and Offline Retailers will increasingly use an Omnichannel Approach. The line between online and offline retailers will become blurred over the next five years. But despite the continued increase in online’s share of the total, physical stores will account for the majority of sales for many years. This means that many online retailers will decide to open some form of physical outlet. The most common will be “guide stores,” like those from Warby Parker, Bonobos and Tesla, where samples of product are in the store but the order is still placed online for subsequent delivery. We believe Amazon may begin to create several such physical locations over the next year or two. I expect brick-and-mortar retailers to up their game online as they struggle to maintain share. But currently they continue to struggle to optimize their online presence, so much so that Walmart paid what I believe to be an extremely overpriced valuation for Jet to access better technology and skills. Others may follow suit. One retailer that appears to have done a reasonable job online is Williams-Sonoma.

5. A giant piloted robot will be demo’d as the next form of Entertainment. Since the company producing it, MegaBots, is an Azure portfolio company, this is one of my easier predictions, assuming good execution. The robot will be 16 feet high, weigh 20,000 pounds and be able to lift a car in one hand (a link to the prototype was in my last post). It will be able to shoot a paintball at a speed that pierces armor. If all goes well, we will also be able to experience the first combat between two such robots in 2017. Giant robots as a new form of entertainment will emerge as a new category over the next few years.

6. Virtual and Augmented reality products will escalate. If 2016 was the big launch year for VR (with every major platform launching), 2017 will be the year these platforms are more broadly evaluated by millions of consumers. The race to supplement them with a plethora of software applications, follow-on devices, VR-enabled laptops and 360-degree cameras will escalate the number of VR-enabled products on the market. For every high-tech, expensive VR technology platform release, there will be a handful of apps that expand VR’s reach outside of gaming (into viewing homes, room design, travel, education, etc.), allowing anyone with simple VR glasses connected to a smartphone to experience VR in a variety of settings. For AR, we see 2017 as the year when AR’s applicability to retail, healthcare, agriculture and manufacturing will start to be tested, and initial use cases will emerge.

7. Magic Leap will disappoint in 2017. Magic Leap has been one of the “aha” stories in technology for the past few years as it promised to build its technology into a pair of glasses that will create virtual objects and blend them with the real world. At the Fortune Brainstorm conference in 2016, I heard CEO Rony Abovitz speak about the technology. I was struck by the fact that there was no demo shown despite the fact that the company had raised about $1.4 billion starting in early 2014 (with a last post-money at $4.5 billion). The problem for this company is that while it may have been conceptually ahead in 2014, others, like Microsoft, now appear further along and it remains unclear when Magic Leap will actually deliver a marketable product.

8. Cable companies will see a slide in subscribers. Despite many thinking to the contrary, the number of US cable subscribers has barely changed over the past two years, going from 49.9 million in Q2 2014 down to 48.9 million in Q2 2016 (a 2% loss). During the same period, subscribers to streaming video services (Netflix, Hulu and others) increased about 12% to 57.0 million. Given the extremely high price of cable, more people (especially millennials) are shifting to paying for only what they want at considerably less cost, so the rate of erosion of the subscriber base should continue and may even accelerate over the next few years. I expect to see further erosion of traditional TV usage as well, despite the fact that overall media usage per day is rising. The reason for lower TV usage is the shift people are making to consuming media on their smartphones; this shift is much broader than millennials, as every age group is increasing its media consumption through phones.

9. Spotify will either postpone its IPO or have a disappointing one. In theory, the valuation of a company should be calculated based on future earnings flows. The problem in evaluating companies that are losing money is that we can only use proxies for such flows, and we often wind up using them to determine a multiple of revenue that appears appropriate. To do this, I first consider gross margin, cost of customer acquisition and operating cost to determine a “theoretical potential operating profit percentage” that a company can reach when it matures. The higher this is, the higher the multiple, and similarly, the higher the revenue growth rate, the higher the multiple. When I look at Spotify’s numbers for 2015 (2016 financials won’t be released for several months), it strikes me (and many others) that this is a difficult business to make profitable, as gross margins were a thin 16% based on hosting and royalty costs. Sales and marketing (both variable costs that ramp with revenue) took an additional 12.6%, leaving only 3.4% before G&A and R&D (which in 2015 were over 13% of revenue). This combination has meant that scaling revenue has not improved earnings. In fact, the 80% increase in revenue over the prior year still led to a larger operating loss in dollars (about 9.5% of revenue). Unless the record labels agree to lower royalties substantially (which seems unlikely), it appears that even strong growth would not result in positive operating margins. If I give Spotify the benefit of the doubt and assume it somehow gets to a 2% positive operating margin, the company’s value ($8 billion post) would still be over 175X this percent of 2015 revenue. If Spotify grew another 50% in 2016, the same calculation would bring the multiple of theoretical 2016 operating margin to about 120X. I believe it will be tough for them to get an IPO valuation as high as their last post if they go public in Q2 of this year, as has been rumored.

10. Amazon’s Echo will gain considerable traction in 2017. The Echo is Amazon’s voice-enabled device with built-in artificial intelligence and voice recognition. It has a variety of functions, like controlling smart devices, answering questions, telling jokes and playing music through Sonos and other smart devices. An app for the Echo is called a “skill”; there are now over 3,000 of these, and the number is growing at a rapid rate. The consulting firm Activate estimated that about 4.4 million units were sold in the first 12 months. If we assume an average price of about $150, this would amount to over $650 million in revenue for Amazon. The chart below shows the adoption curve for five popular devices launched in the past. Year 1 unit sales for each are set at 1.0, and subsequent years show the multiple of year 1 volume that occurred in that year. As can be seen from the chart, the second year ranged from 2X to over 8X the first year’s volume, and in the third year every one of them was at least 5 times the first year’s volume. Should the Echo ramp in a similar way, its unit sales could increase by 2-3X in 2017, placing device sales at $1.5-2.0 billion. But the device itself is only one part of the equation for Amazon, as the Echo also facilitates ordering products, and while skills are free today, some future skills could entail payments with Amazon taking a cut.

graph-image
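The Echo projection in prediction 10 is simple to reproduce; the 2X-3X ramp multiples and the $150 average price are the post’s own estimates, not reported figures:

```python
# Rough revenue projection under the adoption-curve analogy: year-1
# units (~4.4M, per Activate's estimate) times an assumed year-2 ramp
# multiple, at an assumed $150 average selling price.
def projected_revenue(year1_units, ramp_multiple, avg_price):
    """Projected device revenue for the following year, in dollars."""
    return year1_units * ramp_multiple * avg_price

low = projected_revenue(4.4e6, 2.0, 150)    # conservative 2X ramp
high = projected_revenue(4.4e6, 3.0, 150)   # aggressive 3X ramp
print(low / 1e9, high / 1e9)                # in billions of dollars
```

The two multiples bracket roughly $1.3-2.0 billion in device revenue, in line with the range cited above, and exclude any downstream revenue from commerce or paid skills.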

Re-cap of 2016 Predictions

fridge-image
Samsung FamilyHub Fridge: manage groceries, family scheduling, display photos and play music through a wifi enabled touchscreen

In my post for top 10 predictions for 2016 I noted how lucky I had been for 3 years running as all my picks seemed to work. I pointed out that all winning streaks eventually come to an end. I’m not sure if this constitutes an end to my streak but in my forecasts for 2016 I was wrong with one of the three stock picks (GoPro) and also missed on one of my seven forecasts of industry trends (that the 2016 political spend would reach record levels). My other 2 stock picks and other 6 trend forecasts did prove accurate.

I’ve listed in bold the 2016 stock picks and trend forecasts below and give a personal evaluation of how I fared on each. For context, the S&P was up 7.5% and the Nasdaq 10.0% in 2016.

1. Facebook stock appreciation will continue to outpace the market (it is currently at $97/share). One year later (January 3) Facebook opened at $117.50, a year over year gain of 21.1% from the time of my blog post. While this was short of the 40% gain in 2015, it still easily outpaced the market.

2. Tesla stock appreciation will continue to outpace the market (it is currently at $193/share). One year later, Tesla shares opened at $219.25 (January 3), a 13.5% gain from the time of my blog post. It might have been higher, but the acquisition of SolarCity created headwinds for the stock even as revenue grew well over 100%, gross profit improved, and EBITDA was positive in Q3 (the last reported quarter). Still, it outperformed the market.

3. GoPro stock appreciation should outpace the market in 2016 (shares are currently at $10.86). This pick was a clear miss as the stock declined 17.1% from the time of the blog post to January 3. In my defense, I had it partly right as the stock peaked at $17/share at the time of the drone and new camera announcements. In retrospect, given GoPro’s history of poor execution, I would have been smarter to recommend selling at the time these were announced. Instead, I mistakenly viewed execution as pretty easy and failed to suggest this. Since the company, once again, had an execution misstep, I was proven wrong and the stock subsequently declined.

The remaining predictions were about industry trends rather than stocks.

4. UAV/Drones will continue to increase in popularity. Drones continued to increase in popularity at the end of 2015 and into the first half of 2016. According to Market Watch, drone sales were up over 200% in April of 2016 as compared with April of 2015. Starting in December of 2015, the government began requiring drone operators to register on a federal database and by December 2016 had registered over 600,000 drones and users.

5. Political spend will reach record levels in 2016 and have a positive impact on advertising revenue. This forecast proved incorrect. Donald Trump won the presidency despite raising less money than any major party presidential candidate since 2008. Hillary Clinton raised nearly twice as much as Trump, but still fell short of what President Obama raised in 2012. More than half of President-Elect Trump’s fundraising total consisted of $66 million he personally donated to his campaign and $280 million from donors giving $200 or less. Mrs. Clinton, despite depicting Trump as the candidate of the rich, received a substantial portion of her donations from wealthy individuals. Because the two candidates raised less money, the boost in advertising from political ads fell short of my prediction.

6. Virtual/Augmented Reality will have a big year in 2016. As expected, 2016 was the big launch year for VR and AR. Highly anticipated VR product launches (the Facebook Oculus Rift in March, the HTC Vive in April and the PlayStation VR in October) showed strong consumer interest, with sales of over 1.5M units. Pokémon Go’s 500M+ downloads and the initial release of Microsoft’s HoloLens generated intense interest in AR, creating a flurry of application development across a variety of industries including healthcare, agriculture, manufacturing and retail. Unsurprisingly, this excitement is mirrored in VC investment dollars, with 140% growth in funding over 2015, bringing the total invested this past year to $1.8 billion. This shows a strong trajectory for more development across gaming and commercial applications in AR/VR as we move into 2017.

7. Robotic market will expand to new areas in 2016. From chatbots being introduced by many companies for interacting with customers, to a giant fighting robot (16 foot tall, 20,000 pounds) that can lift and throw a car, to robots for making pizzas, to robots that help educate kids, 2016 was a year of enormous expansion in the robotics market.

8. A new generation of automated functionality will begin to be added to cars. In 2016 autonomous cars moved from concept closer to reality. To date, the technology leaders appear to be Tesla and Google, the former building a fully integrated product, the latter a set of components that can be integrated into many different vehicles. Tesla, which appears to be furthest along in putting a fully autonomous car on the road in volume, added more components (software and sensors) to its autonomous technology, but suffered a setback when a driver ignored Tesla’s requirement to “supervise” the autonomous driving and died in a crash. Autonomous cars took many steps forward in 2016 as additional companies entered the fray. Uber, a company that has much to gain from driverless cars (like eliminating the need for its over 1 million drivers), began an experiment in Pittsburgh to offer driverless cars (supervised by an actual person in the driver’s seat) as part of its service. These cars are being manufactured in a partnership with Volvo using technology created by Carnegie Robotics (whose founder was one of the creators of the Google technology). Uber also acquired Otto, a startup focused on driverless trucks, to gain further technology. In August, Ford announced its intent to bring an autonomous car to market by 2021. Audi just announced a partnership with Nvidia to bring an autonomous car to the road by 2020-21. Toyota, Chrysler and others have also announced intent to create such a vehicle. While I believe that mass usage of driverless cars will come later than 2021, we seem to be close to a breakout of “supervised automated vehicles”.

9. The Internet of Things will expand further into kitchen appliances and will start being adopted by the average consumer. In the past 12 months Samsung, LG, GE and others have launched numerous smart refrigerators. These can now be thought of as devices that can connect to a smart phone through an app. The user can receive alerts like ‘a water filter needs replacing’ or ‘the door was left open’. Some have digital bulletin boards on the fridges, other features can let you know when various items stored in the fridge are running low, and still more features can be deployed to control functionality (change temperature, etc). The adoption of these devices has reached sufficient levels for them to be carried in mainstream stores like Best Buy.

10. Amazon will move to profitability on their book subscription service and improve cloud capex. Amazon did indeed make three major shifts in its book subscription strategy. First, it significantly reduced payouts to publishers for their books that were downloaded; second, it reduced the proportion of third-party published books offered to subscribers; and third, it reduced the amount it pays its own authors. While Amazon does not report these numbers, I believe this combination has reduced the cost to Amazon by over 50% and has made the service profitable. The gross margin before stock-based compensation for Amazon’s cloud service increased year over year in Q3 (the last reported quarter) from 27.1% in 2015 to 31.6% in 2016.

 

While it wasn’t in my Top 10 post for 2016, I did predict that Kevin Durant would sign with the Warriors as he would fit right in and improve his chances of winning championships. He has signed, seems to fit in well, but we’ll have to wait to see if the championships follow.

I’ll be making my 2017 picks within the next week.

Trump’s Carrier deal a positive step for workers

It saves at least 800 jobs at a 14x return to government

Let me start this post by saying I did not vote for Donald Trump and consider myself an independent. But, as my readers know, I can’t help analyzing everything including company business models (both public and private), basketball performance, football, and of course, economics. I have, to date, resisted opining on the election, as it appears to be a polarizing event and therefore a no-win for those who comment. However, I care deeply about the future of our country and the welfare of workers of all levels. Being in Venture Capital allows me to believe (perhaps naively) that I contribute to adding jobs to our country. All this brings me to the recent agreement reached between Trump and Carrier, as it may mark a shift in economic policy.

A key assumption in interpreting the value of the deal is how many jobs were already slated by Carrier to leave the country and which of these were saved. President-Elect Trump has claimed he saved 1,150 jobs. Trump’s opponents say 350 were never slated to leave the country. I’m not going to try to figure out which camp is right. My analysis will only assume 800 manufacturing jobs that were slated to leave the country now will remain in Indiana. This does not seem to be disputed by anyone and was confirmed by a Carrier spokesperson. My observations for this analysis are:

  1. Had those jobs left, 800 fewer people would be employed (the specific individuals affected might differ, but fewer jobs mean less employment).
  2. The average worker at these jobs would make $20 an hour plus overtime (some reports have put this as high as $30 per hour fully loaded cost to Carrier). The average worker at these jobs would make about $45,000 annually, assuming modest overtime.
  3. On average, assuming working spouses in many cases, family income would be an average of $65,000.

Given what we know, here’s why I think Trump’s Carrier deal is a good one for the U.S., and actually results in revenue to the government that far exceeds the tax credits:

Social security taxes are currently 6.2% of each worker’s wages. The employer matches that, resulting in about $5,600 in FICA tax income to the government per worker from social security. Medicare is 1.45% and is also matched, resulting in about $1,300 in Medicare taxes paid to the government.

The federal income tax increment between a $20,000 family income (for spouse) and $65,000 family income is about $4,000 (but depends on a number of factors). Indiana state taxes of 3.3% on adjusted gross income comes out to nearly $1,400.

To make the total relatively conservative, I’ve omitted county taxes, payroll taxes and other payments that various other governmental entities might receive. This should mean the total financial income to various governmental entities from these jobs remaining probably exceeds those calculated in Table 1 below even if some of my rough assumptions are not exact.

Table 1. Governmental Income per Worker

Table

So, the economic question of whether the subsidy Trump agreed to was worth it partly depends on how much additional income was derived by the government versus the tax credits of $700,000 per year granted to Carrier in exchange for keeping the jobs here.

Of course, there is also a multiplier effect of families having higher income available for spending. And if 800 additional people are unemployed, there are numerous costs paid by the government. We’ll leave these out of the analysis, but they are all real benefits to our society of more people being employed. It is important to realize how expensive it is for the government to subsidize unemployed workers as opposed to realizing multiple sources of tax revenues when these workers have good jobs.

If we take the total from Table 1, which we believe underestimates the income to governmental entities, and multiply it by the 800 workers, the annual benefit adds up to about $9.8 million. Since Carrier is getting a $700,000 annual subsidy, the governmental revenue derived is over 14 times the cost. And that is without including a number of other benefits, some of which we mentioned above. As an investor, I’d take a 14 times return every day of the year. Wouldn’t you? Shouldn’t the government?
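The math behind the 14x figure can be tallied directly. A quick sanity check using the rough per-worker assumptions stated above (the federal and state figures are the estimates from the text, not exact tax calculations):

```python
# Rough per-worker government income from a retained Carrier job,
# using the approximate assumptions from the analysis above.
wages = 45_000                       # assumed average annual wages
fica = wages * 0.062 * 2             # Social Security, employee + employer match
medicare = wages * 0.0145 * 2        # Medicare, employee + employer match
federal_increment = 4_000            # estimated incremental federal income tax
state = 1_400                        # Indiana ~3.3% on adjusted gross income

per_worker = fica + medicare + federal_increment + state
total = per_worker * 800             # 800 retained jobs
subsidy = 700_000                    # annual tax credit granted to Carrier

print(f"per worker: ${per_worker:,.0f}")
print(f"total annual benefit: ${total / 1e6:.1f}M")
print(f"return multiple: {total / subsidy:.1f}x")
```

The total comes to roughly $9.8 million per year against a $700,000 subsidy, which is where the 14x return comes from.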

This is not a sweetheart deal for Carrier

I won’t go into all the math, but it indicates that Carrier will spend tens of millions of dollars more by keeping workers in the U.S. rather than moving the jobs to Mexico. Claims that the $700,000 yearly benefit Carrier has been given amounts to a sweetheart deal do not appear to hold up.

Why the Democrats lost the election

Trump campaigned on the promise that he would create policies and heavily negotiate to increase employment in America. While this is a small victory in the scheme of things and certainly falls short of retaining all the jobs Carrier wanted to move, the analysis demonstrates that spending some money in tax breaks to increase employment has a large payback to government. It also means a lot to 800 people who greatly prefer being paid for working rather than receiving unemployment benefits.

Is this approach scalable?

The other question is whether this is scalable as a way of keeping jobs in America. Clearly Trump would not be able to negotiate individually with every company planning on moving jobs out of the U.S. Some infrastructure would need to be created – the question would be at what cost? If this became policy, would it encourage more companies to consider moving jobs as a way of attracting tax benefits? Any approach would need to prevent that. My guess is that getting a few companies known to be moving jobs to reconsider is only an interim step. If Trump is to fulfill his promise, an ongoing solution will be needed. But it is important to properly evaluate any steps from an impartial financial viewpoint as the United States needs to increase employment.

Employment is the right way of measuring the economy’s health

My post of March 2015 discussed the health of the economy and pointed out that looking at the Unemployment Rate as the key indicator was deceptive, as much of the improvement came from people dropping out of the workforce. Instead, I advocated using the “Employment Rate” (the percent of the eligible population employed) as a better indicator. I noted that in 2007, pre-downturn, 63.0% of the population had a job. By 2010 this had declined to 58.5%, a 450 basis point drop due to the recession. Four years later the “Recovery” had driven that number up only to 59.0%, meaning just 1/9 of the drop in those working had been recovered. Since then the employment rate has recovered further but still stands 325 basis points below the pre-recession level. That is why the rust belt switched from voting Democratic to voting for President-Elect Trump.

The real culprit is loss of better quality job opportunities

In an article in the New York Times on December 7, “stagnant wages” since 1980 were blamed for the lack of income growth experienced by the lower half of the economic scale. I believe the real culprit is the loss of better-quality job opportunities. Since 1980, production and non-supervisory hourly wages have increased 214%, but over the same period manufacturing workers as a percent of the workforce have shrunk from 18.9% to 8.1%, with no recovery of these jobs after the 2007-2010 recession. Many of these displaced workers have been forced to take lower-paying jobs in the leisure, health care or other sectors, take part-time jobs, or drop out of the workforce entirely (triggering substantial government spending to help them). This loss of available work in manufacturing is staggering and presents a challenge to our society. It is also the button Donald Trump pushed to get elected. I am hoping he can change the trend, but it is a difficult task for anyone, Republican or Democrat.

A condensed version of this post is featured on Fortune.com

The Importance of Lifelong Relationships

At my son’s convocation at Wharton, the incoming MBA class was asked to write the names of their five best friends on the left side of a page and then the five people with whom they would most want to start a business on the right side of a page. The lesson was that the key to success in business was to develop relationships so that the future version of that piece of paper would have as many overlapping names as possible on both sides of the page.

Earlier this summer, I was invited to speak to Brooklyn College’s 2016 graduating class. I wanted to emphasize the importance of lifelong relationships for personal and business success. For me, Brooklyn College was foundational to so many of my most important relationships. It is where I met my beautiful, brilliant wife Michelle as well as eight couples who all attended my son’s wedding late last year. As we wrapped up our 10th Annual Azure CEO Summit, I was humbled to see so many familiar faces that may have started as business acquaintances but have now become close friends. As I reflect on the importance of these lifelong relationships, I wanted to share my speech to the Brooklyn College’s Class of 2016.

Good Morning, President Gould, distinguished faculty, parents, and especially – the fabulous graduating class of 2016! It’s a great pleasure to be back in Brooklyn to greet you all today, as I now live in what’s known as Silicon Valley, California.

I want to focus on three things:

  1. Make sure your friends from Brooklyn College become friends for life.
  2. College is only the beginning of your education, post-college you must continue to learn or you will be left behind.
  3. Never forget that Brooklyn College helps people move up in society.

My beautiful, brilliant wife, who I met at Brooklyn College, is also here today. We recently celebrated our son’s wedding. One of the highlights was that 8 couples attending had relationships that stemmed from our school days. And make no mistake about it: there is a difference in the depth of a relationship when you know someone from that early in life. So my first piece of advice is: “Make sure you stay in touch with those you really care about from college.”

Brooklyn College is for people who work hard, are smart and typically couldn’t have afforded to go to college were CUNY not available

It helps people move up in society

The close friends I met here all had parents with modest incomes. Yet, we are all very successful financially –but more importantly –in life.

In my case, my father was an immigrant who came through Ellis Island. He had to go to work and couldn’t even attend high school. My mother, the daughter of an immigrant, did have the opportunity to finish high school.

Brooklyn College allowed me to be part of the first generation from my family that could afford college. And it provided as good an education as any school in the country!

I became the CEO of a successful startup and then went to Wall Street where I became the Number 1 Analyst following the PC space, and after 10 years left Wall Street to co-found a Venture Capital firm. 

The trick for you to replicate what my friends from College and I have achieved is to leverage this great education and your superior intelligence beyond college. Senator Schumer mentioned the advantage you have because you know today’s technology. This advantage is ephemeral. Whether you’re going to grad school or straight to a job my second bit of advice is:

Never take anything for granted, the world is changing at an increasingly rapid pace. Within 5 years all that you know regarding technology will likely be obsolete. To keep up you must always continue to learn. That coupled with working hard is the way you will succeed beyond college.

Many of you may have noticed that governmental support for CUNY is diminishing and could impact the school. So, once you do succeed, as I know you will, remember to give back to Brooklyn College so the next generation that wants to move up in society has the same opportunity as you did.

Thank you and congratulations.

Soundbytes

  • Speaking of long term relationships, I am both happy and sad to note that Dan Park, my editor and collaborator for SoundBytes is leaving his full-time position at Azure to take a senior operating role at Uber Canada. I’m happy for him but sad not to have him continue full-time at Azure. Fortunately, he has agreed to remain as an Azure Venture partner and to continue to work with me on this blog.

An Analysis of Kevin Durant’s Free Agency Decision

There is much controversy over whether Kevin Durant should leave OKC and if so, what team he best fits with. In evaluating what makes the most sense for him I’d like to cut through emotional clutter and start with objectives:

  • To be rated among the best ever, a basketball player needs to win championships. That is why LeBron James left Cleveland originally, why Bill Russell (8 championships) usually gets rated above Wilt Chamberlain (2 championships) despite Wilt being clearly a much more complete player, and why you don’t typically see the great Patrick Ewing, Allen Iverson or Elgin Baylor (all 0 championships) ranked that high among the greatest players of the century.
  • When you win championships, people soon forget how stacked your team may or may not have been – LeBron is sometimes referred to as a failure in his first Cleveland stint despite taking the worst team in the league to the NBA finals and few talk about how good Michael Jordan’s supporting cast was in making the playoffs even when he was playing baseball instead of basketball.
  • I believe Durant understands that and his primary objective is to win championships so that he can rank higher among the greats.

How can he best accomplish that?

  • Kevin Durant could stay in OKC because of the emotional concept that it’s “his team” and he should not abandon them. The idea being that helping them win is somehow better than helping someone else win. If he does, his chance of winning a championship would be less than 12.5% (1 in 8) since they would probably need to beat San Antonio, Golden State and Cleveland and it’s hard to rate them as favorites in any of those matchups.
  • If Durant went to Golden State they would likely win the Western Conference again and face an easier playoff path than a Durant-led OKC would. The Warriors are already the favorite to win the title even without Durant, and the odds of their winning would increase significantly should they land him. Golden State is also a perfect fit for him, as it plays a team game that would improve the quality of his shot opportunities. How does a team simultaneously double-team Durant, Curry and Thompson? So not only would this increase his chance of winning, it would also likely increase his shooting percentage and his assists.
  • The other team that he could pick with the best opportunity to win would be Cleveland but there is no cap space there and it’s unlikely that this would be a fit.
  • The third strong opportunity is San Antonio. While this would be a fit, the path to a title would be less likely than with Golden State or Cleveland because several key players are aging. However, adding Durant would create a strong trio that could challenge Golden State and possibly be favored over them, though not the overwhelming favorite that the Warriors would be with Durant. Also, going from one small market to another would not add the media draw that would maximize endorsement income.
  • Although there are rumors of Boston, Los Angeles, New York, Washington, Houston and Miami also courting Durant, none of these teams would solve any of his objectives. None would give him a high probability of winning a championship and would solve even less for the emotional component of the decision.
  • The question that was rattling around all year was “Why would the Warriors want Durant?” The answer is obvious, and even more obvious after their game 7 loss: he will make them better. Adding one of the 5 best players in basketball, who shoots a high percentage, plays defense well and is team-oriented, makes any team better.
  • What about the argument that adding Durant would use up so much cap space that the Warriors would need to shed other key players? I agree that they would not be able to keep Harrison Barnes and Festus Ezeli. But the reality is that Ezeli is not a key player and they should not match the high price he is likely to get in the free market, regardless of whether or not they get Durant. By Durant (and in the future Curry) taking less than a max salary, the Warriors could make sure that they kept Iguodala and Livingston plus all starters (including Andrew Bogut) other than Barnes. The rest of the team could be filled in and I would predict the Warriors could attract others who are willing to take lower salaries in order to be on a championship team. So, I suspect the remainder of the supporting cast will be as good as this year. If Durant is willing to take a salary that enables keeping the 6 key players mentioned, then he will maximize his chance of winning a title. When the cap goes up next year, he and Curry could take higher, but not maximum, salaries so that the team around them could continue to include Iguodala and Livingston.
  • What about the argument that Durant should maximize his compensation? My answer is that he will maximize his compensation by taking a lower salary and going to the Warriors because his endorsement money will increase by far more than any salary he forgoes since he would be playing on the highest profile team in a major market and winning championships. To quantify the opportunity, Michael Jordan made more in 2015 from endorsements (12 years after his last retirement) than he did in all 15 years in NBA earnings. Curry is already proving that and can easily take a lower than max salary when his contract expires in another year as his endorsements will dwarf his salary. And winning more championships will only increase all the key players’ outside revenue dramatically.

 

The Ultimate Marketing Framework

Combining a Top Marketing Specialist’s Framework with Our Thoughts

We just spent several hours speaking with Marc Schwartz, an Integrated Marketing Specialist.  Marc has been in marketing for 25 years in various high level positions with companies like Kraft/Gevalia, Publishers Clearing House, Starwood Hotels, Wyndham, Pfizer and Sanofi. His experience spans both online and offline. In this post we combine his concept of a marketing framework with our thoughts on specific topics within that framework.

Creating a Marketing Framework

Marc points out that it is important for every company to have a marketing framework that has three common threads throughout:

  • Consistently build your brand throughout every step in the process.
  • Measure everything – “If you can’t measure it, don’t do it!”
  • Be customer centric – always think about how the customer will feel about anything you choose to do

Marketing can be broken down into 4 important steps and companies will likely need different people with different skill sets to address each step:

  1. Acquisition
  2. Retention
  3. Upsell /Cross sell
  4. Winning Back Customers

1. Customer Acquisition – The 40/40/20 Rule

Marc’s experience has shown that in acquiring new customers, 40% of success has to do with targeting the right people, 40% with the nature of the offer and 20% with the creative. It is worth noting that while creative and messaging are critical, in direct marketing the right offer delivered to the right audience is the most important factor. Targeting is not about mass marketing but rather about knowing your potential customers and finding the most efficient way to reach them.

Targeting

One important thing I have found is that it may make sense to spend more money per potential customer (referred to as Customer Acquisition Cost, or CAC) if you reach individuals who will spend more on your product (referred to as Lifetime Revenue, or LTR). For example, Facebook charges more to find closer matches to your target demographic, but spending more initially has led several Azure portfolio companies to acquire a stronger customer set, which in turn increases LTR and makes the higher spending worthwhile. The key is to compare the CAC of each acquisition channel to the value of the customer (lifetime profits on the customer, or LTV). When LTV is higher than CAC, the customer is profitable. But we discourage our companies from going after marginally profitable customers, so I would encourage you to think in terms of LTV being at least twice the CAC (I won’t invest in a startup unless I believe the ratio can exceed 3X). Each acquisition channel can become less effective beyond a certain scale, but that scale differs dramatically with the products being offered. To decide where to cap spend, gradually increase your commitment on a successful channel until the incremental spend stops yielding good incremental results. It is also important to avoid being a one-trick pony: using multiple channels helps scale customer acquisition without hitting diminishing returns for a much longer period. Any channel that tests well should be utilized, with total spend apportioned based on the effectiveness (the LTV/CAC ratio) of each channel.
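The channel discipline described above can be sketched as a simple screen. The channel names and numbers below are made up for illustration; the 2x floor is the minimum LTV/CAC ratio suggested in the text:

```python
# Hypothetical channel screen: fund only channels whose LTV/CAC clears
# a minimum ratio, then apportion budget by each channel's effectiveness.
MIN_RATIO = 2.0  # floor from the text: LTV at least twice CAC

channels = {  # illustrative numbers only
    "facebook_lookalike": {"cac": 40, "ltv": 160},
    "paid_search":        {"cac": 55, "ltv": 120},
    "display":            {"cac": 70, "ltv": 100},  # below the floor
}

# Keep only channels that clear the minimum LTV/CAC ratio.
qualified = {name: c["ltv"] / c["cac"]
             for name, c in channels.items()
             if c["ltv"] / c["cac"] >= MIN_RATIO}

# Apportion a fixed budget in proportion to each channel's ratio.
budget = 100_000
total_ratio = sum(qualified.values())
allocation = {name: budget * ratio / total_ratio
              for name, ratio in qualified.items()}

for name, dollars in allocation.items():
    print(f"{name}: LTV/CAC {qualified[name]:.1f}, spend ${dollars:,.0f}")
```

In practice the ratios themselves shift as spend scales, so the allocation should be re-run as incremental results come in.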

The Offer

Each campaign should lead with an offer that is a strong value proposition for the target customer, and the offer must have a very clear call to action. Saying “try my product” would not usually be viewed as a compelling offer. At Gevalia Coffee, the company offered a free coffee maker if you began subscribing. At Publishers Clearing House, the company entered you in a contest where you could win $1,000,000. More recently, Warren Buffett offered $1 billion to anyone who picked every game right in the NCAA tournament (his risk of paying out was low, as the odds of there being a perfect bracket among 100 million unique entries are less than one in 10 billion!). Marc points out that his experience indicates an inverse correlation between the value of what you give away and retention: the more compelling the offer, the lower the retention, as more “cherry pickers” sign up. Starting with a free month of a physical product can therefore backfire. In fact, these days there are bloggers who tell their following, “Go to this site for a free month of some product.” It would be surprising if your company recovered its CAC on this set of potential customers, as followers of such bloggers rarely become paying customers.
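Buffett's exposure can be checked with basic probability. A sketch assuming each entry is an independent guess with pure coin-flip odds on 63 games (real entrants pick with some skill, so these per-entry odds are a deliberately conservative assumption):

```python
# Probability that at least one of n independent entries is a perfect
# NCAA bracket. Per-entry odds here assume pure 50/50 guesses on all
# 63 games; skilled entrants would do somewhat better.
n_entries = 100_000_000
p_perfect = 1 / 2**63        # one perfect bracket in ~9.2 quintillion

# For tiny p, P(at least one winner) is approximately n * p.
p_any_winner = n_entries * p_perfect
print(f"chance of a payout: about 1 in {1 / p_any_winner:.1e}")
```

Even with 100 million entries, the chance of paying out is far below one in 10 billion under these assumptions, which is why the offer costs Buffett almost nothing in expectation.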

Therefore it’s important to find the balance between your brand equity and the value of the premium used.  While a lower valued premium will probably lead to fewer customers being acquired, it is likely to also lead to higher LTV for those customers and stronger brand equity. The key with this, as with everything in this blog post, is a continuous A/B testing philosophy to see what works best.

One note of caution: From early November through late December, the cost of virtually every form of marketing goes up due to increased purchasing that takes place for Christmas gifts. If your product won’t benefit from this, your annual plan should have the lowest spend (if any) during this period.

Creative

Marc ranks creative at 20% of the formula for winning customers, half the importance of proper targeting and the nature of the offer, because great creative can’t overcome a poor offer or going after the wrong customers. However, the creative is where you get to explain who you are (your brand statement), why the offer has value to the target customer, and your call to action. If the call to action is not clear enough, you won’t get the desired action. The creative needs to be A/B/C tested for the best language, fonts, colors, graphics, etc. Every element of the creative has an impact on how well the offer performs. You also need to decide whether different offers and/or creative should be used for different subsets of the target customers. One of the best examples I have seen was a campaign that targeted graduates of various schools and led with something like: “Your Harvard degree is worth even more if you …” The conversion rate of this highly targeted campaign was more than double the norm for this company.

2. Optimizing Customer Retention

On Boarding

I’m sure you have heard the expression: “You only get one chance to make a good first impression.” Your best opportunity comes after the customer has placed their order (although the acquisition process was the first step). For this section I’m going to assume you are sending the customer physical goods. Marc calls this first experience “The Brand Moment”. Think of how Apple packages its products: the company has clearly enhanced its brand through packaging, with every element of the package as perfect as Apple can make it. Opening the box certainly enhances the Apple brand. So you need to balance the expense of better-quality packaging against the degree to which it enhances your brand equity. The box itself should be branded and can carry a message you want to relate to the customer. The nature of the collateral material, how many items to include, what messaging appears on the materials, and the order in which they are placed in the box all need to be researched. Have you included easy-to-find information on how to resolve a problem? Should there be a customer support phone number to call if something is amiss? Is your brand position re-emphasized in the materials? A good impression can lead to higher LTV, recommendations to other customers and more.

Communications

Marc believes the first step in communications should be to welcome the new customer. This would usually be through an email (or snail mail). Marc finds an actual phone call is highly effective but costly.  Obviously, your business model will determine the appropriate action to welcome the new customer. The welcome email (or call) is an opportunity to re-emphasize your brand and its value to the customer. It’s important to communicate regularly with every customer. He actually found that placing a phone call can often improve customer retention even more than giving something for free.  To the degree that it makes sense, customers should be segmented and each segment should get its own drip campaign of emails.  Don’t over-communicate!  This can be even more negative than under-communicating and can cause churn. Of course, if you can offer real value to the customer with greater frequency, then do so – for example, customers who sign up for a “daily deal” product probably expect daily emails. There are some email platforms that automatically adjust frequency based on open rates (very low open rates are a good indicator of over-communication with that customer).
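The open-rate throttling idea those platforms use can be sketched in a few lines. This is an illustrative rule of thumb, not any particular email platform's actual algorithm, and the thresholds are invented for the example:

```python
def adjusted_weekly_emails(base_per_week, open_rate):
    """Throttle email frequency for low-engagement customers.
    Thresholds below are illustrative assumptions, not from the post."""
    if open_rate < 0.05:               # barely opening: cut frequency hard
        return max(1, base_per_week // 4)
    if open_rate < 0.15:               # weak engagement: cut in half
        return max(1, base_per_week // 2)
    return base_per_week               # engaged: keep the full cadence

adjusted_weekly_emails(4, 0.02)  # a 2% opener drops to 1 email per week
```

The point is simply that frequency becomes a per-customer decision driven by measured engagement rather than a single global setting.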

Retention Offers

Various types of premiums can be used for retention. Much like those used in acquisition, there should be careful testing of cost vs expanded LTV. For any offer, tests should be constructed that track how paired groups perform: those who have received the offer vs those who haven’t.  If the LTV of the group receiving the offer doesn’t exceed the LTV of the paired group by more than the cost of the offer, then the offer should not be rolled out. Marc found in the past (in a subscription model) that if an offer is too valuable, the company may see a large churn of customers in the month subsequent to the offer being received.
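The paired-group test reduces to a simple decision rule: roll out the offer only when the measured LTV lift exceeds the offer's cost. A minimal sketch (the function name and dollar figures are hypothetical):

```python
def should_roll_out(ltv_offer_group, ltv_control_group, offer_cost):
    """Roll out a retention offer only if the LTV lift on the offer
    group exceeds what the offer costs to give."""
    lift = ltv_offer_group - ltv_control_group
    return lift > offer_cost

should_roll_out(310.0, 280.0, 12.0)  # $30 lift vs $12 cost: roll it out
should_roll_out(310.0, 280.0, 35.0)  # $30 lift vs $35 cost: don't
```

In practice the subtlety is in constructing truly comparable paired groups and waiting long enough to observe LTV (including any post-offer churn spike like the one Marc describes).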

3. Cross Sell/Upsell

There are multiple ways to think about increasing a customer’s LTV:

  1. Keep them as customers longer
  2. Get them to buy more frequently
  3. Get their average invoice value to be higher, i.e. cross sell/upsell

The strategies I spoke about for retention actually focus on the first two of these. The third is an extremely valuable part of a marketing arsenal and I am surprised at how underutilized this tactic is among many companies. To begin, you need to have things in your product set to upsell or cross sell. The items should be relevant to your brand and to your customers. If you are already shipping a box to your customer as their base order, adding another item or shifting to a more expensive version of the base item typically makes the order not only higher in revenue but can also increase the Gross Margin on the order, as shipping and fulfillment costs are unlikely to increase much, if at all (and these days shipping is usually absorbed by the seller). Every company needs to think about brand-positive ways to make such offers.
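The three levers in the list above multiply together, which is why cross sell/upsell is so powerful: raising average order value (and margin) scales the whole product. A rough sketch with illustrative numbers (the margin and retention figures are assumptions, not from the post):

```python
def ltv(avg_order_value, orders_per_year, years_retained, gross_margin):
    """Gross-margin lifetime value as the product of the three levers:
    order size x purchase frequency x retention, at a given margin."""
    return avg_order_value * orders_per_year * years_retained * gross_margin

# Hypothetical base case: $50 orders, 6x/year, 3-year life, 40% margin
base = ltv(50.0, 6, 3, 0.40)

# Upsell raises the order to $60 and, since shipping/fulfillment barely
# change, margin improves to 45% -- both levers move at once
upsold = ltv(60.0, 6, 3, 0.45)
```

Under these assumed numbers the upsell lifts gross-margin LTV by 35%, even though revenue per order rose only 20%, which is the point the paragraph above makes about margin expanding faster than revenue.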

When I was the primary analyst on Dell, it was the early days of selling online. Dell quickly created a script that included several upsells like “add another x bytes of storage at 75% of the normal price” (and huge GM to Dell), “financing available for your computer”, etc.  Several phone manufacturers started to offer the ability to insure your screen against breakage (usually from a drop). What is interesting is that the insurance sometimes covers things the company would have repaired anyway, but it is now being paid extra to do so.

For clothing companies, saying “this blouse would go very well with the skirt you’re buying” or “for the suit you bought, which of these three ties would you like to buy?” can substantially increase cart size and increase margin.  One company I’ve dealt with has an increasing discount on the entire order based on the total dollars you spend (net of the discount). Since it sells T-shirts, socks, underwear, etc., it’s easy to add to your order to get to the next discount level, and given my personality I always wind up buying enough to qualify for the maximum 20% off.
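A tiered-discount schedule like the one just described can be sketched as follows. The tiers are invented for the example, and for simplicity this applies the discount to the gross subtotal, ignoring the net-of-discount subtlety the actual company uses:

```python
# Illustrative tier schedule (threshold, discount rate), highest first --
# these numbers are assumptions, not the actual company's
TIERS = [(200.0, 0.20), (100.0, 0.15), (50.0, 0.10)]

def order_total(subtotal):
    """Apply the highest discount tier the subtotal qualifies for."""
    for threshold, rate in TIERS:
        if subtotal >= threshold:
            return subtotal * (1 - rate)
    return subtotal  # below every tier: no discount

order_total(210.0)  # qualifies for the top 20% tier
```

The behavioral hook is visible in the structure itself: a shopper at $180 can see that a few more dollars of socks unlocks the next tier, which is exactly what nudged me to the maximum discount.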

4. Winning Back Customers

Cancel/Save Tactics

Customer service is usually the first line of defense for preventing customer cancellations. The key to saving a customer is to listen to his or her issue that is causing the cancellation and to be able to adjust your relationship in order to solve that issue. Most companies match each cancellation reason with a particular offer. A simple example is if a customer thinks the product/service is too expensive, a company may offer a discount.

Creating rebuttals and scripts

Your company should create a list of reasons why a customer might cancel. If a new reason why a customer cancels arises, add it to the list. For each reason they might cancel, you must prepare a rebuttal that addresses that issue. If she is receiving too many offers, you can agree to cut the frequency to what she prefers; if she is unhappy with something you sent her, you can agree to take it back or give a coupon toward the next purchase; etc.  Marc says the key is creating scripts and emails that address every reason for cancellation with a counter that you believe will make the customer happy (without too much cost burden on you). Once you have this in place, anyone who will be dealing with the unhappy customer needs to be trained on how to use the script and on what escalation (to a supervisor) procedure should be used.
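Matching each cancellation reason to a scripted rebuttal is naturally a lookup table, with escalation as the fallback for reasons not yet on the list. A sketch in that spirit (the reasons and wording are hypothetical, not Marc's actual scripts):

```python
# Hypothetical reason -> rebuttal mapping in the spirit of the scripts above
REBUTTALS = {
    "too_expensive":   "Offer a limited-time discount or a cheaper plan.",
    "too_many_emails": "Offer to cut contact frequency to their preference.",
    "product_issue":   "Offer a return, replacement, or coupon toward the next order.",
}

def rebuttal_for(reason):
    """Look up the scripted counter; unknown reasons escalate to a
    supervisor and should then be added to the list."""
    return REBUTTALS.get(reason, "Escalate to a supervisor and log the new reason.")
```

Keeping the scripts in one structure like this also makes the training and escalation procedure concrete: reps follow the table, and anything that falls through it goes up a level and back into the list.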

Winning back Customers

The least expensive customer to acquire is a previous customer. You know quite a bit about them: what they prefer, how profitable they were and more.  Given what you know, churned customers can be segmented into groups, as you are probably willing to spend more to win back the high-value group than a lower-value one. For each group you need to go through the process of original acquisition, but with a lot more specific knowledge. So an offer needs to be determined for each group and creative needs to be developed. The methodology should parallel that of customer acquisition, with the difference being a defined target.

5. Summary

The steps of the framework have been outlined in detail. But there are a few more points to be made.

  1. SEO should be utilized by all companies as it’s the lowest cost of access to target customers.
  2. Work with a very strong agency partner who understands the fundamentals of marketing.
  3. Have a solid set of vendors for things like email, campaign management, etc.
  4. Data is crucial. Make sure you track as much as you can regarding every potential and actual customer.
  5. If you can’t measure it, don’t do it!

SoundBytes

  • There has been much chatter this season about Curry becoming the 8th player with 50/40/90 stats – 50% field goal shooting, 40% 3-point shooting and 90% from the foul line. My partner Paul Ferris noted that if we raise this to 50/45/90 Curry is only the third. And the surprise is that the other two are Steve Nash (not a surprise) and Steve Kerr! Data for this observation was gleaned from BasketballReference.com.

Challenging the Argument for Homogeneous Classrooms

In our November post, “Transforming Education”, we discussed several issues associated with the U.S. education system. Two respondents (both former teachers) to the post had some very interesting comments (I’ve included them below, followed by my observations). The first respondent, Seth Leslie, said:

“I’ve always been a proponent of heterogeneous groupings in classrooms, but I’d be the first to admit that pulling it off in a way that benefits all learners is a huge challenge.  It takes a very skilled teacher, excellent curriculum and the right materials to make this work well.  But when it does work well, it’s awesome, and the relational/socio-emotional learning that occurs alongside of the content learning is super important in an increasingly collaborative and interconnected workplace.  It’s just so hard to do this well!

 One other point – no mention of teacher quality in your article.  This is also a factor that contributes significantly to student outcomes – as much or more than class size and family circumstances.

Technology seems to offer some interesting opportunities to schools and learning, but my experience tells me that too much effort goes into selling goods to schools, and not enough effort goes into ensuring that teachers are well trained and well supported in utilizing the technology effectively.  I’d love to tell you about my personal experience with my son Zach, who is in the second year of the 1-to-1 iPad program at his school.  In short, I’m not a fan (and I love technology!)

 Very interesting read, though, and your points make a lot of sense.  It’s frustrating that we as Americans produce so much to be proud of, yet we can’t seem to solve education.”

Although he is a proponent of heterogeneous grouping, he does acknowledge how hard it is to make it work. I, on the other hand, am against it because I believe the obstacles to it working outweigh the small number of cases where a great teacher might be successful in making it work. He also points out that teacher quality can be an issue.  I believe that this stems from not budgeting enough dollars to education, including teacher salaries. Finally, he has had poor experience with the use of technology in the classroom. I agree that this issue has yet to be solved. Simply putting technology into a classroom without integrating it into the learning experience and providing the training necessary for teachers won’t lead to success.

The second teacher that responded, Tatum Omari, is now the lead for Education.com learning products, an Azure portfolio company. She is also a supporter of heterogeneous grouping. Her comments follow.

“Hetero vs. Homogeneous grouping is definitely a complex topic. It can be incredibly hard to do well. Those that are able to pull it off well are usually teachers who have years of experience under their belt. The problem with implementation involves many factors, including the high rate of teacher turnover, and the fact that they don’t quite have time to build the necessary experience to master an approach to classroom instruction that facilitates heterogeneous grouping. This requires instruction that utilizes whole-group tasks that have low floors and high ceilings. Being able to consistently provide your classroom with tasks that are this rich and promote deep understanding because of their ability to be extended so easily takes quite a bit of skill. That said, to abandon it completely is problematic, as there is much research to support that it is not only a worthy endeavor, but one that will be critical to the U.S. elevating our educational system, and our students, back to a place that is competitive with the achievements of other countries.

The most successful countries, in terms of academic achievement, including Finland, Japan, and Korea, all teach to heterogeneous classrooms and do not practice ability-based grouping. This is because they prize the development of cooperative group achievement over that of the individual. As a result, all of their students experience a far more elevated degree of achievement. There are also some key negative consequences to ability-based grouping which include:

  • Lower expectations from teachers regarding the abilities of students that are placed in groups believed to have lower abilities. Research has shown that randomly distributed students of varying levels scored higher when their teachers believed them to be a group with a higher level of ability. In contrast, another randomized group scored lower when the teacher was led to believe that the students had a lower level of academic ability.
  • Less masterful teaching practices. When teachers are given the ability to use ability-based tracking and teach their students in homogeneous groups, they are less likely to provide all of their students with the type of rich tasks that provide low floors and high ceilings. That means that while the high group may periodically gain access to higher-level tasks, the teacher’s instruction overall is aimed at the middle of the class, and therefore the high students actually miss out on encountering that type of deeper learning throughout the day. In some cases that higher group will only work with the teacher 1-2 times per week, which means they are bored a fair bit during the rest of instruction.
  • There can be borderline-casualty students, assessed just below the entry requirement for the more advanced groups. Students who score one point below what is required to be included in the high group could be excluded from that opportunity for the rest of their educational career.
  • The development of a fixed mindset by both higher- and lower-achieving students. Surprisingly, the adoption of a fixed mindset can be just as detrimental for a high achiever as for a low achiever. If high achievers see themselves as fixed at “smart”, they can develop anxiety, which leads them to ask fewer questions so as never to appear not to understand, or “not smart”. This keeps them from developing a flexible mindset where it is ok to problem-solve out loud and in a group.
  • Missed resources in terms of what students can learn from working and problem-solving together in a group. Oftentimes high-achieving students who are offered instruction in mixed-ability groups score much higher than those instructed in homogeneous groups because their thinking is stretched when working in groups and looking at problems through different perspectives. The act of observing a fellow student’s possible wrong assumption, and then helping them clarify it, can help them grasp the concept on a much deeper level, as they are forced to take abstract mathematical concepts and translate them into oral language, which can be very difficult.

While, like Seth, Tatum makes strong arguments (many drawn from the book by Jo Boaler: “What’s Math Got To Do With It?”) that heterogeneous grouping can be beneficial under the right circumstances, I continue to believe that it does not work well in the US for the reasons she points out at the beginning of her comments: inadequate training, teacher turnover, insufficient resources, etc. However, I believe it is worthwhile to provide readers with these alternate points of view (and a reference that expounds on it) from very thoughtful teachers who themselves I’m convinced could make it work to the benefit of students. It seems to me from an aspirational view, heterogeneous grouping is ideal but not from a practical point of view given current U.S. classroom conditions.

Soundbytes:

  • Recently, a number of former players have stated that the lack of adequate defense is the reason behind Curry’s success. Personally, I think defense is actually stronger today than in the past, but regardless, the best way of judging any player is by comparing him to his peers. At Curry’s current pace he will make over 50% more 3s in a season than anyone besides him ever has. The prior record holder before Curry, Ray Allen, made 41.2% of his 3s in his record-setting year. Stephen Curry is hitting 46.8% of his 3s this year despite taking more shots per game (which for most would lower their shooting percentage). To put this in perspective, at Allen’s percentage made, he would have scored 34 fewer 3s on the same number of shots Curry has taken this season to date. This equates to 102 fewer points. And Allen was widely considered the best 3-point shooter ever prior to Curry! If we compare Curry to the league-average 3-point shooting percentage for the season to date of 35.7%, the difference becomes about 67 extra 3s made on the 3-point shots he has taken through 56 games played, or an extra 201 points vs the league average (which equates to 286 points for the full season). I believe there are few record holders in any era with such a large discrepancy vs peers (today’s NY Times cited Wayne Gretzky and Babe Ruth as similar in producing outsize increases in a major record).
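The arithmetic in the bullet above can be checked directly. The attempt count isn't stated in the post, so the ~605 figure here is inferred from the post's own numbers; treat this as a back-of-envelope reconstruction rather than official statistics:

```python
# Back-of-envelope check of the 3-point comparison above.
# ~605 attempts through 56 games is inferred from the post's numbers.
attempts = 605
curry_pct, allen_pct, league_pct = 0.468, 0.412, 0.357

extra_vs_allen = attempts * (curry_pct - allen_pct)    # ~34 extra threes
extra_vs_league = attempts * (curry_pct - league_pct)  # ~67 extra threes
extra_points_vs_league = 3 * extra_vs_league           # ~201 extra points
```

Each three is worth 3 points, so the make-rate gap times attempts converts straight into the point differentials the bullet quotes.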

Top 10 Predictions for 2016

In my forecast of 2015 trends I wrote:

 “I’ve been very lucky to have a history of correctly predicting trends, especially in identifying stocks that would outperform. I say lucky because even assuming one gets the analysis right, the prediction can still be wrong due to poor management execution and/or unforeseen events. Last year I highlighted 10 trends that would occur in 2014 and I’m pleased that each proved accurate (see 2014 Predictions). Rather than pat myself on the back for past performance, my high-risk, A-type personality makes me go back into the fray for 2015. Last year’s highlighted stocks, Tesla and Facebook, were up 48% and 43%, respectively, from January 3 to December 31, 2014 vs. 15% for the Nasdaq and under 13% for the S&P 500. This year, I’ll identify more than two stocks to watch as I am probably over-confident due to past success. But because I’m not doing the level of work that I did on Wall Street, there is significant risk in assuming I’m correct.”

As I discussed in the last post, I got even luckier in 2015, as my four highlighted stocks had average appreciation of 86% while the broader market was nearly flat. As we saw with the Golden State Warriors on December 12th, all winning streaks have to come to an end, so bearing that in mind, I wanted to start with a more general discussion of 5 stocks and why I chose to highlight three and back off of two others (despite still liking their stories). The two stocks that I recommended last year that I’m not putting on the list again are Netflix and Amazon. The rationale is quite simple: neither is at the same compelling price that it was a year ago. Netflix stock, as of today, is up over 100% year-over-year while its revenue increase is under 25% and profit margins shrank. This means that the price-to-revenue and price-to-earnings multiples of its stock are about twice what they were a year ago. So, while I continue to like the long-term fundamentals, the value that was there a year ago is not there today. Amazon is a similar story. Its stock is currently up over 100% year-over-year but revenue and profit growth for 2015 was likely around 20%. Again, I continue to believe in the long-term story, but at this share price, it will need to grow 20% per year for nearly four more years for the stock value to be what it was a year ago.
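The multiple-compression argument can be sanity-checked with a one-liner: if the price stays flat while revenue compounds at 20%, the time for a doubled price-to-revenue multiple to return to its prior level is the doubling time of 20% growth:

```python
import math

# Stock doubled while revenue grew ~20%/yr. With price flat, how many
# years of 20% growth until the price-to-revenue multiple is back to
# where it was a year ago? Solve 1.2**n = 2.
years = math.log(2) / math.log(1.2)  # roughly 3.8 years
```

The same arithmetic applies to the Netflix case: a multiple that doubled needs the denominator to double to normalize.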

My two other highlighted stocks from last year are Facebook and Tesla. At today’s prices they are each at a lower price-to-revenue multiple than a year ago (that is, their stocks appreciated at a slower pace than revenue growth). But, in both cases, the fundamentals remain strong for another solid growth year (more below on these) and I would expect each to outpace the market. I’ll discuss my final (riskiest) stock pick below.

In each of my stock picks, I’m expecting the stocks to outperform the market. I don’t have a forecast of how the market will perform, so in a steeply declining market, outperformance might occur with the stock itself being down (but less than the market). So consider yourself forewarned on a number of accounts.

We’ll start with the three stock picks and then move on to the remainder of my 10 predictions.

  1. Facebook stock appreciation will continue to outpace the market (it is currently at $97/share). Most of the commerce companies in the Azure portfolio continue to find Facebook the most compelling place to advertise. Now many of the very large brands are moving more budget to Facebook as well. This shift to online and mobile marketing still has a long way to go and we expect Facebook revenue growth to remain very strong. In addition, Facebook has begun to ramp the monetization of other properties, particularly Instagram. If we start to see real momentum in monetization of Instagram, the market will likely react very positively as it exposes another growth engine. Finally, with the Oculus release early this year, we may see evidence that Facebook will become the early leader in the emerging virtual reality space (which was one of the hits at CES this year).
  2. Tesla stock appreciation will continue to outpace the market (it is currently at $193/share). Last year Tesla grew revenues an estimated 30%+, but order growth far exceeded that as the company remains supply constrained. The good news is that revenue growth in 2016 should continue at a very high level (perhaps higher than 30% year-over-year) and the stock’s price-to-revenue multiple is lower than a year ago. The new Model X has a very significant backlog (I’ve seen estimates as high as 25,000-30,000 vehicles). Since this would be incremental to Model S sales, growth could accelerate once capacity ramps. Additionally, both service revenue and sales of used Teslas are increasing. When this is added to distribution expansion, Tesla appears to have 2-3 years of solid revenue growth locked in. I’m not sure when the low-priced vehicle will be announced (it is supposed to be in 2017) but a more modest price point for one of its models could increase demand exponentially.
  3. GoPro stock appreciation should outpace the market in 2016 (shares are currently at $10.86). On the surface this may appear my riskiest prediction, but there are solid reasons for my thoughts here. I believe investors are mistakenly comparing GoPro to a number of tech high fliers that collapsed due to valuations based on “air”. GoPro is far from that. In fact, I believe it is now a “value” play. To begin, unlike many tech high fliers, GoPro is profitable and generates positive cash flow. Its current book value is over $6 per share (of which $3.73 is cash with no debt). It is trading at less than 1x revenue and about 15x 2016 earnings estimates. Despite the announced shortfall expected in Q4 and a number of downward revisions, revenue should still be up about 15% in 2015. While the current version of its camera has failed to meet expectations (and competition is increasing), the brand is still the leader in its space (action video). If new camera offerings advance the technology, this could help GoPro resume growth in the video arena. The brand can also be used to create leverage in new arenas. The three that the company has targeted are: content, drones and virtual reality. Of the three, I would significantly discount their ability to create a large content revenue stream and believe virtual reality products may prove difficult (and even if successful, will take multiple years to be meaningful). However, the company is very well positioned to earn a reasonable share of the UAV/drone market (which was about $1.5B last year and could grow 50-100% in 2016). The primary use of drones today is for photography and video, and the majority of the ones we saw at CES were outfitted with a GoPro camera. Given the GoPro brand and distribution around action video, I believe that, if they are able to launch a credible product by mid-year, the company will be well positioned to experience reasonable growth in H2 2016 and the shares should react well.

The remaining predictions revolve around industry trends rather than stocks:

  4. UAV/Drones will continue to increase in popularity. In 2015, the worldwide drone market reached about $1.5B and there is no sign of slowing growth. When I think about whether trends will continue, I base my analysis on whether there are valuable use cases. In the case of drones there are innumerable ones. We’ll save the detailed explanation for a full post but I’ll list several here:
    1. Photography: this is a major use case for both consumers and professionals, namely being able to get overhead views of terrain either in photos or in video.
    2. Security: as an offshoot of photography, drones offer the potential of having continuous monitoring of terrain from an aerial view. This enables intrusion detection, monitoring, and tracking.
    3. Delivery: although current drones are not yet able to carry significant payloads, they are close to having the ability to follow a flight path, drop off a small package and then return. As innovations in UAV hardware and battery technology continue, delivery will become more of a reality in the future (which companies like Amazon, Google and others are counting on). This will also require some type of monitoring of airspace for drones to prevent crashes.
    4. Consumer: consumers will purchase drones in droves not only for the simple pleasure of flying them but also for various types of competitions including racing, battling and obstacles.
  5. Political spend will reach record levels in 2016 and have a positive impact on advertising revenue. Political advertising is expected to reach a record $11.4 billion in 2016, up 20% from the previous presidential election year. While the bulk of spending is forecast to go to TV, 2016 will be the first election year in which digital ad spending will exceed $1 billion (and, if the candidates are savvy, it may be even higher). Adding 2015 spending, total political advertising in this election cycle could total $16.5 billion or more. About 50% of the total spending typically goes to the national election and the other half to backing candidates and issues in local races. During the 2015-16 election cycle, $8.5 billion is expected to be spent on broadcast TV, with $5.5 billion coming from national races and $3.1 billion spent on state and local contests. Cable TV is forecast to see $1.5 billion in spending, with $738 million coming from the national contest and $729 million from local races. Online and digital spending is forecast to total $1.1 billion, with $665 million going for national races and $424 million spent on local contests.[1]
  6. Virtual/Augmented Reality will have a big year in 2016: With the general release of Oculus expected in 2016, we will see an emergence of companies developing content and use cases in virtual reality. Expect to see the early beginnings of mainstream adoption of virtual reality applications. In addition, augmented reality products were heavily on display at CES and we think they will begin to ramp as an alternative to virtual reality. Virtual reality and augmented reality are similar in that they both immerse the user, but with AR, users continue to be in touch with the real world while interacting with virtual objects. With VR, the user is isolated from the real world. For now, expect VR to remain focused on entertainment and gaming while AR has broader applications in commercial use (i.e., real estate, architecture, training, education) as well as personal use.
  7. Robotic market will expand to new areas in 2016: Outside of science fiction, robots have made only minimal progress to date in generating interesting products that begin to drive commercial acceptance (outside of carpet cleaning, e.g. the Roomba). This year could mark a change in that. First, carpet-cleaning robots will expand to window cleaning, bathtub cleaning and more. Second, robots will be deployed much more generally for commercial applications (as they already are in the Tesla factory). And we will also see much more progress in consumer entertainment applications, highlighted by the emergence of actual giant robots that stage a monumental battle akin to ones previously only created visually in movies.
  8. A new generation of automated functionality will begin to be added to cars. Tesla has led the way here and already has a highly automated car on the market. Others are now attempting to follow and perhaps even surpass Tesla in functionality. In addition to the automation of driving, the computerization of the automobile has led to the ability to improve other capabilities. One demonstration I saw at CES was from a company called Telenav. They gave a proof-of-concept demonstration of a next-gen GPS. Their demonstration (of a product expected to launch in Q2 or Q3) showed a far more functional GPS with features like giving the driver alternate routes when there are traffic problems, regardless of whether route guidance is on, as it determines where the driver is going based on tracking driving habits by day of the week. Their system will also help you buy a cup of coffee en route, incorporate messaging with an iPhone (with the driver’s voice converted to text on the phone and vice versa) for communicating with someone you’re picking up, help you find a garage with available spots, etc., all through the normal interface. Telenav is working as an OEM to various auto manufacturers, and others like Bosch are doing the same. And, of course, several of the car manufacturers are trying to do this themselves (which we believe will lead to inferior systems).
  9. The Internet of Things will further expand into kitchen appliances and will start being adopted by the average consumer. We’re going to see the launches of smart refrigerators, smart washing machines, ovens, etc. Earlier this month, Samsung released its new Family Hub refrigerator, which uses three high-quality cameras inside the fridge to manage groceries, identify foods you have or need, and track product expiration dates to cut down on waste. It also has a screen on its door that can interface with other devices (like an iPhone) to find and display the current day’s schedule for each member of the household, keep a shopping list and more.
  10. Amazon will move to profitability on their book subscription service and also improve cloud capex. Amazon launched its book subscription service with rapid customer acquisition in mind. Publishers were incentivized to include their titles as the company would pay the full price for each book downloaded once a portion of it was read. This meant that Amazon was paying out far more money than it was taking in. We believe Amazon has gone back to publishers with a new offering that has a much more Amazon-favored revenue share, which results in the service moving from highly unprofitable to profitable overnight. The Amazon cloud has reached a level of maturity where we believe the cash needed for capex is now a much smaller portion of revenue, which in turn should improve Amazon cash flow and profitability.


[1] http://www.broadcastingcable.com/news/currency/political-ad-spending-hit-114b-2016/143445

Recap of 2015 Predictions

Our forecasts for 2015 proved mostly on the money (especially for stocks). For context, the S&P 500 was down very slightly for the year (0.81%) and the Nasdaq was up 5.73%. I’ve listed the 2015 stock picks and trend forecasts below and give my evaluation of how I fared on each one.

  1. Facebook will have a strong 2015. At the time we wrote this Facebook shares were at $75. The stock closed the year at roughly $105, a gain of 40% in a down market. Pretty good call!
  2. Tesla should have another good year in 2015. At the time we wrote this, Tesla shares were at $192. They closed the year at roughly $241, a gain of 25%. I’m happy with that call.
  3. Amazon should rebound in 2015. At the time we wrote this, Amazon was trading at $288. It closed the year at $682, a gain of 137%. Great call but still trailed my next one.
  4. Netflix power in the industry should increase in 2015. At the time we wrote this, Netflix was trading at $332 but subsequently split 7 for 1 making the adjusted price just over $44/share. Since it closed the year at $116 the gain was 144% making Netflix the best performer in the S&P for the year!

The average gain for these 4 stock picks was about 86%. The remaining predictions were about trends rather than stocks.

  1. Azure portfolio company Yik Yak, will continue to emerge as the next important social network. I also mentioned that others would copy Yik Yak and that Twitter could be impacted (Twitter stock was down in 2015). Yik Yak has continued to emerge as a powerhouse in the college arena. After attempting to copy Yik Yak, Facebook threw in the towel. In November, Business Insider ranked leading apps with the highest share of millennial users. Yik Yak was at the top of the list with 98% indicating its importance among the next generation.
  2. Curated Commerce will continue to emerge. This trend continued and picked up steam in 2015. Companies mentioned in last year’s post, like Honest Company, Stitchfix and Dollar Shave Club all had strong momentum and have caused traditional competitors like Gillette, Nordstrom and others to react. Additionally, Warby Parker and Bonobos also emerged as threats to older line players.
  3. Wearable activity will slow. I had expected Fitbit and others to be replaced by iPhone apps, and that still has not occurred. On the other hand, the Apple Watch has fallen short of expectations. This is not a surprise to me despite the hype around it. Still, this prediction was more wrong than right.
  4. Robotics will continue to make further inroads with products that provide value. I also highlighted drone emergence in this forecast. We have seen robotics and drones make strong strides in 2015, but regulatory hurdles remain a real issue for both consumer and B2B drone companies.
  5. Part-time employees and replacing people with technology will continue to be a larger part of the work force. This forecast has proven valid and is one reason why employment numbers have not bounced back as strongly as some expected from the 2008/2009 recession.
  6. 3D printers will be increasingly used in smaller batch and custom printing. We have seen this trend continue and even companies like Zazzle have begun to move part of their business into this arena to take advantage of their superior technology and distribution.

I also mentioned in the post that the Cleveland Cavaliers would have a much better second half of the season if LeBron remained healthy. At the time their record was 20 wins and 20 losses. This proved quite accurate as they were 33 and 9 for the rest of the season.

I’ll be making my 2016 predictions in another week or so but it may be hard to match last year!

Next Gen Selling vs Old (or “Traditional”) Methods

In this post I want to compare the buying experiences I've had recently when purchasing from an older-generation company vs a newer one. I think it highlights the fact that ecommerce-based models, when coupled with a multi-channel approach, can create a much better buying experience than traditional brick and mortar sellers. The two companies I want to highlight are Tesla (where my wife recently purchased a car) and Warby Parker (where I recently bought a pair of glasses). I'll compare them to Mercedes and LensCrafters, but it almost doesn't matter which older-generation companies I compare them to, so consider the ones I've chosen (due to recent personal experience) as representative of their industries.

Controlling the buying experience

Warby Parker began opening retail “Guideshops” a few years ago. I recently went into one and was very pleased with the experience. They displayed all the frames they have, and there were only two price categories, $95 and $145, each including both the prescription lenses and the frames. I selected a frame, went over to the desk and received assistance in completing the transaction. The person assisting me took one measurement of my eyes and then suggested I get slightly better lenses for a charge of $30, which I think was only necessary due to my particular prescription. There were no other charges, no salesperson, no other upsells, no waiting while the glasses were being made. Once I paid by credit card, the glasses were put in the queue to be made at their factory and shipped to my home within 10 days (with no shipping charge), and my receipt was sent by email rather than printed. From the time I entered the store until I left was about 10 minutes.

Compare this experience to buying a pair of glasses at LensCrafters. At LensCrafters the price range of frames is all over the map without any apparent reason, except that many carry a designer brand logo (but are unlikely to have been designed by that designer). To me the Warby Parker frames are as good-looking as, or better-looking than, far more expensive ones at LensCrafters. Even if you select a frame at LensCrafters that costs $95-$300 or more, the lenses are not included. A salesperson then sits with you and begins the upselling process. Without going into all the details, suffice it to say that it is very difficult to discern what is really needed, and therefore it is hard to walk out of the store without spending $100-$300 more than the cost of the frame. Further, since the glasses are made at the store, you come back in a few hours to pick them up (of course this is a positive if you want them right away; I usually don't care). I have typically spent well over an hour in the buying process, plus gone for a coffee during the 2 hours or so it took for them to make the lenses.

Tesla has been very adamant about owning and controlling their physical retail outlets rather than having their cars sold by independent dealerships. This gives them multiple advantages: they completely control the buying experience, eliminate competition between dealers, reduce distribution cost, and can decide what the purpose of each location is and how it should look. They have also eliminated keeping cars to sell on the lot; instead they use an ecommerce model where you order a car exactly the way you want it, and it gets produced for you and brought to the Tesla location of your choice for pickup. Essentially, they have designed two types of physical stores: one that has a few demo models to enable test drives, and one that also has a customer service department. This means that the latter is much smaller than a traditional car dealership (as it doesn't need space for new car inventory on the lot), and the former is much smaller than that. The showroom approach occupies such a small footprint that Tesla has been able to locate showrooms in high foot traffic (high cost per foot) locations like malls. At their sites at the Stanford Mall and on Santana Row (two of the most expensive locations per square foot), Tesla kept the cars for test drives in the parking lots (at a fraction of the cost of store square footage). When my wife decided to buy her second Tesla (trading in the older one), we spent about an hour at the store, as there was no negotiation on price; the car could be configured to her exact specification on a screen in the store (or at home) and would be manufactured for her. There were no upsell attempts, no competing dealers to visit, and really no salesperson but rather a facilitator (much like at Warby Parker) who answered questions.

I bought my new car from Mercedes and had a much less pleasant buying experience. It starts with the fact that the sticker price on the car isn't the real price. This means one needs to visit multiple dealers, as well as shop online, to get a handle on what the real price is, as the dealers are difficult to trust. Each dealer now has its own online person (or team), but this is still buying from a dealer. There is also strong encouragement to buy a car from inventory (on the lot), while the idea of configuring a car the way one wants and ordering it is discouraged. The cars on the lot are frequently configured with costly (highly profitable) options that are unnecessary, so even with a discount from list one typically spends more than by ordering only the options one wants and paying closer to list. After multiple days (and many, many hours) spent online and visiting dealerships, I decided to replicate the Tesla concept and order a 2016 model built exactly how I wanted. Because I had spent many hours shopping around, I was still able to get a price an extra $4,000 off list compared to what I had been offered on a 2015 off the lot. The car was the color I wanted, had only the options I wanted, and would have a higher resale value for being a 2016. Since the list price had not increased and the car carried no unneeded options, I actually saved about $10,000 versus taking one off the lot at the smaller discount, even though the lot car bundled in all the additional options I wanted.

Receiving the product

In the Warby Parker example, the glasses were shipped to my home in a very well designed box that enhanced their brand. The box contained an upscale case and a card that said: “For every pair of glasses sold, a pair is distributed to someone in need.” Buying at LensCrafters meant returning to the store for the glasses. The case included was a very cheap looking one (creating an upsell if one wanted a nicer case) and there was no packaging other than the case. However, I did get the glasses the same day and someone sat with me to make sure they fit well on my ears (fit was not an issue for me for the Warby Parker glasses but could be for some people).

On the automobile side, the car pickup at Tesla was a much better experience than the one at Mercedes. At Tesla, my wife and I spent a little over an hour at the pickup. We spent about 20 minutes on paperwork and 45 minutes getting a walk-through on how various options on the car work. There were no attempts to upsell us on anything. At Mercedes the car pickup experience took nearly 4 hours and was very painful, as over 3 hours of it was spent on paperwork and attempts at a variety of upsells. To be fair, we had decided to lease this car, and that occupied a portion of the paperwork time. But the attempted upsells were extreme. The most ludicrous was trying to get us to buy an extended warranty when the included warranty exceeded the length of the lease. I could understand that it might be of value to some, but in our case we told the leasing agent that we were only doing the lease so we wouldn't own the car at the end of it. There were also upsells on various online services and a number of other items. The time this took meant we did not have enough time left to go over all the features of the car. This process was clearly the way each person had been trained and was not a function of the particular people we dealt with. The actual salesperson who sold me the car was extremely nice but was working within a system that is not geared toward the customer experience, as dealers can't count on buyers returning even if they buy the same brand again.

Summary

There is a significant advantage being created by new business models that control the complete distribution chain. Their physical locations have a much smaller footprint than traditional competitors', which allows them to put their shops in high-traffic locations without incurring commensurate cost. They consolidate inventory into a centralized location, which reduces inventory cost, storage and obsolescence. They completely control the buying experience and understand that customer satisfaction leads to higher customer lifetime value.

 

SoundBytes

In my SoundByte post dated April 9, I discussed several of the metrics that caused me to conclude that Stephen Curry should be the 2014-15 season MVP. He subsequently received the award, but it still appeared that many did not fully understand his value. I thought it was well captured in the post by looking at EFG, or effective field goal percentage (where a three-point shot made counts as 1.5 two-point shots made, since it's worth 50% more points), plus/minus and several other statistics not widely publicized. This year, Curry has become even better, and I realized one other statistic might help highlight his value in an even better way: points created above the norm (PAN).

I define PAN as the extra points a player creates versus an average NBA player through more effective shooting. Since EFG is expressed in two-point-shot equivalents, multiplying by 2 converts the difference into points:

PAN = 2 × (the player's average number of shots per game) × (player's EFG − league-norm EFG)

The league's effective shooting percentage as of December 6 is 49.0%. Since Curry's effective shooting percentage is 66.1% as of today, the difference is 17.1%. Curry has been averaging 20.2 shots per game this year, so his PAN = 2 × 20.2 × 17.1% = 6.9. This means Curry's shooting alone (excluding foul shots) adds about 7 points per game to his team versus an average shooter. But, because Curry is unselfish and is often double-teamed, he also contributes heavily to making the team as a whole more effective shooters. This leads to a team PAN of 14.0, which means the Warriors score an extra 14 points a game due to more effective shooting.
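The arithmetic above can be sketched in a few lines of Python. This is just an illustration: the function name `pan` is mine, and the inputs are the post's own figures (league EFG 49.0%, Curry EFG 66.1%, 20.2 shots per game).

```python
# Points Above Norm (PAN), per the formula above.

def pan(shots_per_game: float, player_efg: float, league_efg: float) -> float:
    """Extra points per game from shooting efficiency alone (foul shots excluded).

    The factor of 2 converts the EFG gap (measured in two-point-shot
    equivalents per attempt) into points.
    """
    return 2 * shots_per_game * (player_efg - league_efg)

curry_pan = pan(20.2, 0.661, 0.490)
print(round(curry_pan, 1))  # -> 6.9, i.e. about 7 extra points per game
```

Note that a league-average shooter has a PAN of exactly zero by construction, which is what makes the metric a clean measure of shooting contribution above the norm.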

Interestingly, when you compare this statistic to other league leaders and NBA stars, Curry's contribution becomes even more remarkable. While Curry adds about 7 points per game to his team versus an average shooter, James Harden, Dwyane Wade and Kobe Bryant are all contributing less than the average player. Given Curry's wildly superior efficiency, he is contributing almost twice as much as Kevin Durant.

Efficiency

With Curry’s far superior individual and team contribution to shooting efficiency, it is not surprising that the Warriors are outscoring their opponents by such record breaking margins.

To further emphasize how much Curry's PAN impacts his team, we compared him to Kobe Bryant. The difference in their PANs is 11.8 points per game. How much would it change the Lakers' record if they had these extra 11.8 points per game and all else were equal? It would move the Lakers from the second-worst point differential (only Philadelphia trails them) to 10th in the league and 4th among Western Conference teams. Since point differential correlates closely with team record, that might mean the Lakers would be competing for home court in the playoffs instead of the worst record in the league!

Transforming Education

I was recently interviewed on NBC regarding education, market valuations and the accuracy of my forecasts of market trends made in 2001, among other things (www.pressheretv.com). Since the interview stimulated thoughts on the education market, I thought it was worth capturing a few in a post.

Why the quality of education in the United States trails

According to the World Bank, the United States leads the world in Gross Domestic Product, dwarfing anyone else. The U.S. GDP is 70 percent ahead of number 2 China, and almost 4 times the size of number 3, Japan. Given this wealth of resources, it is somewhat surprising just how low we rank in K-12 education among nations:

  1. 36th in mathematics for 15-year-olds[1]
  2. 24th in reading for 15-year-olds[1]
  3. 28th in science for 15-year-olds[1]
  4. 14th in cognitive skills and educational attainment[2]
  5. 11th in fourth-grade mathematics and 9th in eighth-grade mathematics[3]
  6. 7th in fourth-grade science and 10th in eighth-grade science[3]

Part of the problem is that much of our priority as a country tends to emphasize short-term gratification over long-term issues such as investing in primary education. But the problem goes much deeper.

Classroom sizes have been increasing

Parents Across America, a non-profit organization committed to strengthening US public schools, points to studies indicating that students (especially in grades K-3) who are assigned to smaller classes do better in every way that can be measured. Of 27 countries shown in a 2007 Organization for Economic Cooperation and Development (OECD) survey, the United States ranked 17th in lower-education classroom size, at 24.3 students per class. While other countries are investing in reducing classroom size, the U.S. is going in the other direction. Of 26 countries included in the OECD data from 2000 and 2009, 25 either decreased classroom sizes or kept them about the same. The United States was the only one that increased classroom sizes during that period.

Heterogeneous grouping exacerbates the problem

In years gone by, U.S. classrooms were homogenous, as children were separated according to their skill level. This practice came under heavy criticism because it was viewed as discriminatory.  Throughout the United States, schools shifted to heterogeneous classes not because this technique was proven to be effective but rather in an effort to promote “political correctness.”

One teacher pointed out that “administrators love to boast that their school has heterogeneous grouping…but the administrators aren’t in the classroom, and they don’t see the disappointment on the faces of students when a new experience is presented and not everyone remains on the same page.”

Another teacher stated: “That ideal [of heterogeneous grouping], is an ideal….Truth is, in our experience the low-end kids tend to pull down the high-end kids, rather than the other way around. The class pace slows, and the teacher has to in effect devise two lesson plans for each period, one for the accelerated students and another for those who have low skills.”

In the past decade, many teachers have moved toward creating homogenous groups for reading and math within their heterogeneous classroom. One teacher who has 17 years of experience teaching in New Hampshire said that the second graders in her class showed up on the first day with a bewildering mix of strengths and weaknesses. Some children coasted through math worksheets in a few minutes, she said; others struggled to finish half a page. The swifter students, bored, would make mischief, while the slowest students would become frustrated, give up, and act out.

“My instruction aimed at the middle of my class, and was leaving out approximately two-thirds of my learners,” said this fourth grade teacher at Woodman Park Elementary in Dover, N.H. “I didn’t like those odds.”

So she completely reorganized her classroom. About a decade ago, instead of teaching all her students as one group, she began ability grouping, teaching all groups the same material but tailoring activities and assignments to each group. “I just knew that for me to have any sanity at the end of the day, I could just make these changes,” she said.

Flexible ability grouping, when used appropriately, works. According to a 2010 meta-analysis by Kelly Puzio and Glenn Colby, students who were grouped by ability within a class for reading were able to make up to an additional “half of a year’s growth in reading in one year.” Similarly, a 2013 National Bureau of Economic Research study of students who were grouped by ability found that the performance of both high- and low-performing students significantly improved in math and reading, demonstrating the universal utility of this tool, particularly as our classrooms become more academically diverse.

In summary, I believe that teachers have a more difficult time today than ever before. Their classes are getting bigger, their budgets are smaller, and heterogeneous grouping for the class means that effective teaching requires splitting the class into homogenous groups that each require a different lesson plan. Teaching parts of a class separately leads to less quality time that a teacher can spend with each group.  Without essential one-on-one instruction time, students suffer. If the class isn’t divided into 2 or 3 homogenous groups for lessons the students suffer even more, as they are denied level-appropriate learning.

The current system discriminates against the lower two-thirds of society

What I find surprising is that more people don’t realize that the practice of heterogeneous grouping is actually discriminatory to the lower two-thirds of society. Wealthier families typically live in neighborhoods with better school systems (with students that are more homogenous in skill levels); can readily afford tutors for their children; can provide after school access to learning centers; give their children prep courses for various subjects and for SATs; and if all else fails, send their offspring to a private school. Those in the lower two-thirds, economically speaking, have more limited access to additional help outside the classroom, cannot afford private school, often have parents without college education who are less able to help them, and may not even take an SAT prep course. Each of these put them at a disadvantage versus those that come from the upper economic strata of America.

I myself had hardworking parents who had not gone to college. My father was an immigrant and had to work before completing high school. But my education was accelerated because homogenous grouping was the norm at that time. This included being placed into a class of high performers who all received 3 years of curriculum in two years and therefore skipped a grade. I also was able to take a competitive test that enabled me to be accepted into Stuyvesant, one of the very high-end public high schools in New York City, geared toward helping public school students receive an honors course level education. I firmly believe that the access I had to be paired with students that were high achievers played a very important role in my subsequent success.

Technology provides a range of alternatives

One potential way to bridge the gap is having multiple teachers in a classroom, but I think this would be extremely unlikely to be funded out of constrained government resources. A second possibility is to provide teachers with the training and technical resources that could be used to enhance the student experience. All too often when technology is utilized, it is not integrated into the curriculum and/or is very complicated to use. However, in recent years, there have been advances in K-12 education through the use of technology that is relatively easy to use and to integrate into curriculum. For example, an Azure portfolio company, Education.com, has created online workbooks and games that are aligned to the Common Core and are sold to parents and teachers for a starting price of $49 per year, making a subscription affordable to everyone. Millions of teachers and parents in the K-5 levels are now basic members of the site, which offers weekly emails and limited educational resources that can be consumed free each month. With a Pro subscription, a teacher can print out unlimited workbooks, worksheets, and lesson plans for their class. The company estimates that roughly one billion worksheets are printed each year by parents and teachers for students to use. While teachers are likely to print worksheets that complement their current curriculum, parents can use printable worksheets and workbooks as supplemental material to help their kids in academic areas where their skills need strengthening. The fact that about 8 million parents and teachers come to the site in a peak month also indicates how strong the need is for these types of materials.

Education.com’s Brainzy program is a first step at individualized learning. It uses games to practice strategies for mastering core curriculum for students in kindergarten through 2nd grade. I have also met with a number of other companies who are creating products that can provide students with personalized learning tools. What Education.com and others are doing is still early steps in a process that I believe will lead to individualized education. If the United States keeps insisting on heterogeneous classroom composition but couples this with under-investing in education and requiring teachers to divide their time into separate lesson levels, then computer tools for the individual personalized instruction of each student appears to be the solution that can bridge the gap.

SOUNDBYTES

  • Stephen Curry picked up right where he left off last year, scoring 40 points in the first game of the new season on strong shooting. Through the first 11 games, the Warriors remain undefeated behind Curry's league-leading scoring. After 11 games, Curry is averaging 33.4 points per game, a full 5 points ahead of number two, James Harden, at 28.4 points per game. On Saturday night, in Curry's 427th game, he surpassed the number of threes made by his father, Dell Curry, over his 1,083-game career. Earlier this year, we discussed why Curry deserved to be the clear NBA MVP and analyzed his scoring efficiency, adjusting for his ability to hit threes from seemingly anywhere on the court (he was subsequently voted the MVP).


  • Curry’s shooting has been even more dominant this year. Even based on “standard” statistics, Curry leads the pack not only in scoring average but also in field goal percentage. At 51.7%, among the top 10 scorers he trails only Blake Griffin, who has taken just 3 three-point attempts this season. Looking at “Field Goal Efficiency” (FGE%), a metric introduced in our previous post that counts a 3-point field goal as worth 1.5 times a 2-point field goal, we see Curry’s true dominance this season. “True Shooting Percentage” (TS%) goes further: it assumes that 1 of every 9 foul shots is part of a 3-point (or 4-point) play and therefore treats 2.25 foul shots as one field goal attempt (since most pairs of foul shots replace a field goal attempt). Looking at these metrics we continue to see Curry’s clear dominance: he is nearly 7 percentage points ahead of the second highest of the top ten scorers in FGE% and is also well ahead of anyone else in TS%.
  • In a recent ESPN segment, Brad Daugherty called Curry “un-defendable”. If he continues to shoot the ball at this level, the road to a second consecutive championship and another MVP seems well paved.
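As a concrete sketch, the two metrics can be computed from box-score totals as below. The function names and the sample stat line are mine, and the TS% formula follows the post's convention of treating 2.25 foul shots as one field goal attempt.

```python
def fge(fgm: int, threes_made: int, fga: int) -> float:
    """Field Goal Efficiency: a made three counts as 1.5 made twos."""
    return (fgm + 0.5 * threes_made) / fga

def ts(points: int, fga: int, fta: int) -> float:
    """True Shooting %: fold free throws in at 2.25 FTA per field goal attempt."""
    return points / (2 * (fga + fta / 2.25))

# Hypothetical game line: 10 makes (5 of them threes) on 20 attempts, 4-of-4 FT.
points = 2 * 10 + 5 + 4          # twos + bonus point per made three + free throws
print(round(fge(10, 5, 20), 3))  # -> 0.625
print(round(ts(points, 20, 4), 3))  # -> 0.666
```

The gap between the two numbers shows why TS% is the stricter measure: it charges the player for trips to the line as well as field goal attempts.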
[1] Organization for Economic Cooperation and Development (OECD), Program for International Student Assessment (PISA). 65 educational systems ranked.
[2] Pearson Global Index of Cognitive Skills and Educational Attainment compares the performance of 39 countries and one region (Hong Kong) on two categories of education: Cognitive Skills and Educational Attainment. The Index provides a snapshot of the relative performance of countries based on their education outputs.
[3] International Study Center at Boston College. Fourth graders in 57 countries or education systems took the math and science tests, while 56 countries or education systems administered the tests to eighth graders.

OmniChannel Selling

The latest trend in retail is the concept of “OmniChannel” selling. While many players have been engaged in this arena for some time, there has been acceleration in the practice. Online retailers are now attempting to find ways to add an in-store experience and many brands, larger retailers, and numerous smaller ones have added more of a push towards e-tail. Additionally, direct sales through TV (QVC, Home Shopping, etc.), telemarketing and consumer-to-consumer fill out the spectrum of options.

Selling through multiple and diverse channels is not a new concept, but the integration of in-store and e-tail channels is becoming more sophisticated. With 93% of consumer sales still occurring offline, many e-tailers understand that a physical presence can help escalate sales. Similarly, with that percentage shrinking and with $1.6 trillion in e-commerce sales expected this year, brick and mortar cannot ignore the importance of being online. Earlier this year, Square, in partnership with Bigcommerce, announced a new integration that provides merchants with a simple and seamless way to expand their businesses online. Similarly, Shopify's POS system allows physical retailers to easily sell online.


E-Tailers Move to Physical Retail

This post focuses on the trend of e-tailers moving into physical retail and when and why it can work. E-tailers fall into three categories: those that only sell other companies' brands, those that are creating their own brand, and those that sell other brands as well as their own.

The dominant player in the first category is Amazon. Over time, it has built an overwhelming network of distribution centers geared towards efficiently shipping one or more items to an individual consumer. Now it has begun experimenting with physical locations, the first of which opened on the Purdue campus in February, with additional locations being planned on other college campuses. There are also reports that it will follow this with other types of store openings. Given its widespread distribution centers, the company already has significant capability to inexpensively pick and pack goods for an individual consumer. But, despite limiting most shipments to one zone, there is still a relatively large cost to deliver a single order to an individual household. If it can begin getting non-Amazon Prime customers to come to a convenient location for pickup (Amazon locker or store), shipping cost could be reduced quite a bit. Further, having physical locations will undoubtedly add to the company's sales and its brand. Since it would not need to stock the stores the way a traditional retailer does, it could capture the efficiency associated with centralized inventory locations combined with the brick-and-mortar efficiency of shipping a large number of goods to one location (probably in an Amazon-owned truck). I expect to see a major expansion of Amazon into physical locations over the next 5 years.

Trading High Shipping Cost for Brick and Mortar Cost

In e-commerce companies that we know well, fulfillment (picking and packing) and shipping can be 40% or more of COGS. Moving to one's own physical retail stores adds substantial cost but removes shipping cost. Most e-tailers now offer some version of free shipping, but whether the seller or the customer pays for shipping, it is a major factor. What this means is that such an e-tailer can spend that money on its own stores, or on discounting its products to a third-party brick and mortar reseller, without necessarily incurring any loss in gross margin dollars (of course a larger discount may be required). Even if gross margins are lower when partnering with a third-party brick and mortar retailer, it can still be as profitable as the e-tailer's online sales, since brick and mortar stores already attract many customers, whereas online sales normally require a marketing spend to create greater volume.

The OmniChannel Approach for Branded Product

Amazon is in a unique position because of its size. Although there are other e-tailers of third-party products with sufficient size to open their own physical locations, the bigger opportunity to increase sales resides with e-tailers that have their own branded product. A great example of this is Warby Parker, an emerging brand in eyewear. About 2 ½ years ago it opened its first brick and mortar store in New York City. What it found is that this not only added to its client base through in-store purchases, but also drove additional online sales. Why would this occur? Besides the obvious fact that many people still prefer buying from a physical location, trying on a pair of frames and having them fitted to your needs improves the experience. The inability to do this online may have inhibited some customers from purchasing. But once you have had the opportunity to have eyeglasses fitted to your requirements, it is much easier to buy subsequent pairs online with the knowledge that the fit should be appropriate.  The same issue of good fit applies to shoes and clothes.

Fit is one reason why Bonobos, an online e-tailer of men's clothes, began opening shops. But unlike Warby Parker, the Bonobos shops are “Guideshops” (where clothing can be tried on and then ordered for delivery). By taking this approach, Bonobos keeps inventory centralized and the stores much smaller (only requiring one unit of each SKU) but gains the benefit of addressing people less comfortable with shopping online while also ensuring that the clothes fit. By locating the shops in malls and other high traffic areas, Bonobos gains exposure to a fair amount of foot traffic, making the stores another customer acquisition vehicle. Note that the stores we expect Amazon to open are essentially Guideshops, but on a much larger scale.

Online Brands Partnering with Brick and Mortar Retailers Will Continue to Increase

Bonobos has also partnered with Nordstrom but in its case it’s simply as another brand offered in Nordstrom stores.  In August, Warby Parker announced their first retail partnership with Nordstrom. Once Nordstrom saw the benefits of OmniChannel brands, it acquired Trunk Club (another men’s clothing e-tailer). Subsequent to the acquisition, it began adding space in some of its stores for men to come in, get fitted and talk to a stylist about preferences. The stylist then acts as a personal shopper and picks Trunk Club clothes for the customer to try. This results in a much larger average order than online sales for Trunk Club. In this case the customer takes the clothes with him. Again, once this occurs, buying subsequent items online becomes easier as there is more confidence that the fit will be good. Now Trunk Club is entering the women’s clothing market to compete with the successful online brand, Stitchfix.

Shoes are even more difficult to buy without trying on than eyeglasses or clothes. As a result, Shoes of Prey, which offers women the ability to design their own custom shoes, has also opened Guideshops, but in its case they are located inside established retailers like Nordstrom. This makes sense to me, as I prefer buying my first pair of shoes in a store and “refills” online, and now most brands that once were only available in brick and mortar stores can be purchased online. For the first pair I sometimes try on 8-10 styles/sizes before finding one that satisfies my needs (this is a major problem for Zappos, which appears to have about a 35% return rate). If I try to buy a second pair a few months later from the same store, odds are they won’t have it; instead, it’s seamless to go online for the follow-on pair. With the acquisition of Trunk Club, Nordstrom has taken a strong initiative in blending the online/offline experience.

Notice the difference between Warby Parker and Bonobos versus Trunk Club. Warby Parker and Bonobos, in addition to being carried as brands at third-party retailers, opened their own branded stores, whereas Trunk Club expanded into an existing major retailer (albeit its new parent) as a service to customers. Opening your own stores can involve substantial capital expenditures and large ongoing operating costs. The alternative, getting one’s online branded product carried by a retailer, reduces risk and saves substantial fixed cost. But there’s a trade-off: the brand gives up margin, as the third-party retailer will be buying at a discount. And merely getting into stores does not guarantee added success. In the store, control of the purchase experience moves to the retailer, so it becomes very important that the brand is comfortable with the way the retailer will position its products in terms of shelf space and point-of-purchase marketing through materials and/or sales people in the store. Julep, a successful online brand in the cosmetics space, has partly solved the positioning issue by partnering with QVC as well as several brick and mortar retailers, including Nordstrom. A strong advantage of a QVC partnership is that the “shelf space” allocated to the brand consists of a brand spokesperson going on the TV show to market the brand to a very large audience. Resulting sales occur immediately through QVC, but other channels also benefit.

Advantages to the Retailer of Carrying Online Brands in Their Physical Stores

An online brand should have substantial information regarding customer demand. It knows the geographies in which its products sell best, the demographics of its customers, which of its products will be in greater demand, and so on. It also may have very substantial traffic to its site, to which it can offer the alternative of buying at physical retail. Furthermore, unlike physical retailers, e-commerce retailers have a deeper understanding of customer acquisition metrics and the customer conversion funnel, and can readily A/B test various elements of their sites. Such insights can help a physical store decide which items to carry, the volumes needed in different geographies and more. It can also mean the online brand will drive additional customers to the retailer’s stores. A brand like Le Tote, one of Azure’s portfolio companies, which offers women a subscription that entitles them to rent everyday clothes, has even more data, as an average customer will have worn over 50 of its clothing items over the course of a year. Since the company receives ongoing feedback on most of the items it ships, it has very substantial data on customer preferences regarding third-party brands as well as house brands. The company believes it is likely to form one or more partnerships with brick and mortar retailers to begin selling its “house” brands.

Intelligently Moving to OmniChannel Selling Makes Sense for Many Players

Given the growing synergy between online and offline retail, there is substantial opportunity for heightened growth for startups that can intelligently expand from an e-tail-only model to one that uses both online and brick and mortar distribution. If the e-tailer has its own branded goods, this can be done by partnering with existing stores. In executing this strategy, it is important to ensure that the presentation and product knowledge in such stores are sufficient for the store’s customers to adequately learn about the products. In turn, the e-tailer can provide a deeper understanding of the customer to accelerate growth and improve sales conversion in all channels. The abundance of data from online channels and in-store tracking can provide significant insight to retailers, and the startups that best capitalize on this information can experience a significant escalation in growth.

SoundBytes

  • The recent acquisition of EMC by Dell brought back memories of my thesis from my days on Wall Street as a top analyst covering technology. In 1999 I predicted that successful PC companies would hit roadblocks to growth and profitability if they didn’t move “Beyond the Box”. As we’ve seen, the prediction proved true: Apple thrived by doing so, while others like Compaq, Dell, Gateway and HP ran into difficulty. I’m not as close to it now, but the merger of these two companies seems to create obvious cross-selling opportunities and numerous efficiencies that should benefit the combined entity.

The Argument for Curry as a Unicorn

In our previous post we raised the possibility of Stephen Curry becoming a Unicorn (in venture capital, a company that reaches $1 billion in value). While it was mostly for fun, on reflection we decided that it actually could prove valid. This post walks through why an athlete like Curry (or potentially James Harden, Russell Westbrook or Anthony Davis) could become a Unicorn should he be elevated to the elite status of a LeBron James.


The Precedent for Creating a Corporation Owning an Athlete’s Earnings Exists

In April 2014, Vernon Davis offered stock in his future earnings via a venture with Fantex, Inc. as part of a new financial instrument being sold by Fantex. Davis sold Fantex a 10% share of all future earnings from his brand marketing company; Fantex in turn divided that stake into shares of a tracking stock that can be traded on its own exchange. The offering was 421,100 shares at $10 each, for a total of about $4.2 million, implying a total value for the “Vernon Davis Corporation” of $42 million. Davis’s salary at the time was $4.7 million, with endorsement income of about $1.75 million, for total income of roughly $6.5 million. Given that football careers rarely extend into a player’s mid-thirties, and Davis was already over 30, he likely had no more than 3-4 years left as a player. Putting those facts together, Davis seemed unlikely to earn much more than $42 million going forward, and might earn less, as we would expect his income to drop precipitously once he retired. So buying the stock was probably viewed more as a symbol of support for Davis, and its “market cap” appears to be about equal to his expected future earnings.
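The arithmetic behind the implied valuation is simple enough to sketch. The figures below come from the post itself; the variable names and the 4-year career horizon are illustrative assumptions:

```python
# Back-of-the-envelope math behind the Fantex/Vernon Davis offering.

shares_offered = 421_100
price_per_share = 10.00        # dollars
stake_sold = 0.10              # Fantex bought 10% of future brand earnings

offering_proceeds = shares_offered * price_per_share
implied_total_value = offering_proceeds / stake_sold

print(f"Offering proceeds: ${offering_proceeds:,.0f}")                    # ~$4.2 million
print(f"Implied 'Vernon Davis Corp' value: ${implied_total_value:,.0f}")  # ~$42 million

# Sanity check against playing-career earnings alone (excludes any
# post-retirement income, which the post expects to drop precipitously):
annual_income = 4.7e6 + 1.75e6  # salary + endorsements ≈ $6.5M
years_remaining = 4             # rough upper bound given his age
print(f"Rough remaining playing earnings: ${annual_income * years_remaining:,.0f}")
```

The implied value is just the offering proceeds scaled up by the inverse of the stake sold, which is why a $4.2 million raise for 10% prices the whole "corporation" at $42 million.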

NBA Stars are Among the Highest Earning Athletes

The current highest earner of endorsements in the NBA is LeBron James at about $44 million per year (Kevin Durant is second at $35 million). The highest contract in the league is Kobe Bryant’s at about $23 million per year (down from $30 million previously), and the 10 highest-paid players average over $21 million. Given the new TV contract scheduled to go into effect in the 2016-2017 season, it’s been projected that the salary cap will increase from about $63 million today to $90 million in 2017 and nearly $140 million by 2025 (10 years from now, at age 37, Curry should still be playing). Let’s make the following assumptions:

  1. Curry’s salary will go from a current level of $11 million in 2015 and $12 million in 2016 (4 other Warriors will be paid more that year) to about $30 million in 2017, assuming top salaries remain about 1/3 of their team’s cap, as they are today.
  2. It will be up to $40 million in 2025, or less than 1/3 the projected $140 million cap.
  3. His endorsements will reach midway between the current levels experienced by LeBron and Durant, to about $40 million by 2017 (they are currently at about $5.5 million from Under Armour)
  4. His endorsement income will rise by about 10%/year subsequently, through 2025 to reach $92 million in 2025
  5. He will continue to earn endorsement income (but will retire from playing) subsequent to the 2025 season.
  6. The level post 2025 will average $60 million per year for 10 years and then go to zero.

The last assumption is based on observing the income of retired stars like Michael Jordan (earning $100 million/year 12 years after retirement), David Beckham (about $75 million in the first year after retiring), Arnold Palmer ($42 million/year 40 years after winning his last tournament) and Shaq ($21 million); Magic Johnson is now worth over $500 million. Each is making more now than the total they made while playing and, in several cases, more per year than in their entire playing careers. So assuming Curry’s income will drop by about 1/3 after retirement is consistent with these top earners.

[Chart: Curry’s projected annual income, 2016–2035]

This puts his total income from 2016 through the end of 2035 at over $1.5 billion. All of the above assumptions can prove true if Curry continues to ascend to super-star status, which would be helped if the Warriors win the championship this year. They could even prove low if Curry played longer and/or remained an icon for longer than 10 years after retiring. Thankfully, Curry has remained relatively injury free and our analysis assumes that he remains healthy. Curry is not only one of the most exciting players to watch, but is also becoming the most popular player with fans around the league. Curry now ranks second overall in total uniform sales, behind LeBron James.
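The assumptions above can be turned into a quick model of that total. The sketch below is mine, not the post’s: I interpolate salary linearly and endorsements geometrically between the stated endpoints, so the intermediate-year figures are illustrative, but the 20-year total lands at roughly the $1.5 billion cited:

```python
# Rough model of Curry's projected 2016-2035 income under the post's
# assumptions. Interpolation choices between the stated endpoints are mine.

salary = {2015: 11e6, 2016: 12e6}
for year in range(2017, 2026):
    # linear ramp from $30M (2017) to $40M (2025), per assumptions 1 and 2
    salary[year] = 30e6 + (40e6 - 30e6) * (year - 2017) / 8

endorse = {2016: 6e6}  # assumed; the post cites ~$5.5M from Under Armour today
growth = (92e6 / 40e6) ** (1 / 8)  # rate taking $40M (2017) to $92M (2025)
for year in range(2017, 2026):
    endorse[year] = 40e6 * growth ** (year - 2017)

playing_total = sum(salary[y] + endorse[y] for y in range(2016, 2026))
retired_total = 60e6 * 10          # assumption 6: $60M/year for 10 years, then zero
career_total = playing_total + retired_total

print(f"2016-2025 playing-era income: ${playing_total / 1e9:.2f}B")
print(f"2026-2035 post-retirement:    ${retired_total / 1e9:.2f}B")
print(f"Total 2016-2035:              ${career_total / 1e9:.2f}B")  # ~$1.5B
```

Note that the post-retirement decade contributes $600 million on its own, which is why the retired-star comparables above matter so much to the Unicorn case.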

So while the concept of Stephen Curry as a Unicorn (reaching $1 billion in value) started as a fun one to contemplate with our last post, further analysis reveals that it is actually possible that Fantex or some other entity could create a tracking stock that might reach that type of valuation.

As a VC, I would love to invest in him!

SoundBytes:

  • In the recent game against the Blazers there was further validation of Curry’s MVP bid. Curry delivered eight 3-pointers, hit 17 of 23 shots and went 7-of-7 in his 19-point fourth quarter. His last two threes came from a combined distance of 55 feet, and with them he set a new record for threes in a season, breaking his own mark!
  • To understand just how well Curry shot, his Field Goal Efficiency was 91% (his 8 threes bring his equivalent field goals to 21/23). Not only was this higher than for anyone else who scored 40 points or took at least 20 shots in a game this year, we believe it may be among the highest ever for someone taking 20 shots in a game.
  • As a comparison, the two Portland stars, Aldridge and Lillard, each had strong games, scoring 27 and 20 points, respectively. But to do that they took 46 shots between them (double Curry’s total) and scored only 2 more points in total for the extra 23 shots!
  • The 4th quarter performance by Curry, cited above, translates to a 114% FGE rating, meaning he averaged better than 100% shooting, as he scored 16 points on 7 shots. When foul shots are taken into account, his True Shooting % was 137%, as he scored 19 points on 8 field goal attempts (counting the one on which he was fouled). To draw a comparison, when Russell Westbrook scored 54 points against Portland on April 12 he took 43 shots, 20 more than Curry (23 more if we include shots that led to foul shots).
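The “Field Goal Efficiency” figures above match the standard effective field goal percentage (eFG%), in which a made three counts as 1.5 made twos. A minimal sketch of the arithmetic, assuming that formula (the inference that two of his seven fourth-quarter makes were threes follows from 16 points on 7 field goals):

```python
def effective_fg_pct(fgm, fga, threes_made):
    """Effective FG%: a made three counts as 1.5 made twos."""
    return (fgm + 0.5 * threes_made) / fga

# Curry's full game vs. Portland: 17-of-23 with eight threes
game = effective_fg_pct(17, 23, 8)  # (17 + 4) / 23

# His fourth quarter: 7-of-7 for 16 points implies two of the makes were threes
q4 = effective_fg_pct(7, 7, 2)      # (7 + 1) / 7

print(f"Game eFG%: {game:.1%}")  # 91.3%
print(f"Q4 eFG%:   {q4:.1%}")    # 114.3%
```

Any quarter with enough threes can push eFG% above 100%, since the metric credits equivalent made field goals (points divided by two) against actual attempts.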

Top 10 Predictions for 2015

I’ve been very lucky to have a history of correctly predicting trends, especially in identifying stocks that would outperform. I say lucky because even assuming one gets the analysis right, the prediction can still be wrong due to poor management execution and/or unforeseen events. Last year I highlighted 10 trends that would occur in 2014 and I’m pleased that each proved accurate (see 2014 Predictions). Rather than pat myself on the back for past performance, my high-risk, A-type personality makes me go back into the fray for 2015. Last year’s highlighted stocks, Tesla and Facebook, were up 48% and 43%, respectively, from January 3 to December 31, 2014 vs. 15% for the Nasdaq and under 13% for the S&P 500. This year, I’ll identify more than two stocks to watch as I am probably over-confident due to past success. But because I’m not doing the level of work that I did on Wall Street, there is significant risk in assuming I’m correct.

So consider yourself forewarned.

  1. Facebook will have a strong 2015. I have not sold my Facebook shares (I’m up over 3x since acquiring them in mid-2013). Momentum appears just as solid as it did a year ago, and revenue and earnings multiples have contracted. In 2014, revenue grew over 60% and earnings per share nearly 100% (using analyst estimates for Q4) vs the share price increase of 43%. Beware that high-growth stocks can go through periods of multiple contraction, but Facebook ($75/share) seems well positioned to continue to see revenue surge and EPS increase even faster.
  2. Tesla should have another good year in 2015. I continue to hold my stock and think it will perform well, though I expect numerous wild gyrations along the way ($192/share). Because the SUV launch has been delayed to 2016, revenue growth could taper off from the approximately 75% of 2014 (using analyst Q4 estimates). Investors could fear that lower gas prices will dampen people’s desire for an all-electric car. But do you believe customers paying $90,000 for a Tesla are doing so to save on fuel? I don’t. Tesla sales will be helped by increasing distribution, more locations to charge one’s car (reducing one of the biggest buying inhibitors), greater knowledge of the car, increasing awareness of its relative price attractiveness given the new $136,000 BMW i8 high-end sports hybrid, and continued governmental incentives to buy an electric vehicle.
  3. Amazon should rebound in 2015. Last year the stock was down over 22% for a variety of short-term reasons. Amazon’s 2014 revenue is expected to be about 20% over 2013 revenue. Its competitive advantages in retail, if anything, improved as its local delivery capabilities continued to dominate the competition (we expect Amazon to leverage this further by opening showrooms/ordering centers in several cities in 2015) and its Prime service continued to gain adoption. But 2014 was a year of substantial investment, and this hurt the stock ($288/share). Amazon has made similar investments to drive market share before, and the stock has typically bounced back afterward. While it doesn’t have the growth dynamics of Facebook or Tesla and I don’t own the stock yet, I believe it is worth considering for any portfolio.
  4. Netflix’s power in the industry should increase in 2015. Like HBO before it, Netflix’s superior economics provide the opportunity to create more of its own proprietary content. It also may see more opportunity to launch movies online simultaneously with their theatrical release – the success of The Interview could help drive this trend, and no one is better positioned than Netflix to exploit it. After peaking mid-year at $480/share, the stock closed 2014 slightly down from a year earlier and is now at $332.
  5. Azure portfolio company Yik Yak will continue to emerge as the next important social network. This will cause a number of entrenched competitors to modify their products to try to slow Yik Yak’s growth. The most vulnerable public entity is Twitter, as Yik Yak is the next, more modern version of Twitter. Given Twitter’s large user base, this will not likely affect its stock in 2015, but it is something to monitor.
  6. Curated Commerce will continue to emerge. This trend was one we forecast in last year’s blog and appears to have solid resonance. A number of startups in the category saw valuations rise to $300M – $1B, including Honest Company, Birchbox, Stitchfix, and Dollar Shave Club. There is more to come, as many shoppers want a better shopping experience from e-tailers. To date, most web shopping starts with knowing what item one wants to buy rather than “browsing”. The best brick and mortar retailers create a shopping experience by stocking items that are pleasing to those who visit their stores; most of us know people who prefer shopping in a particular store because of this experience. This trend is emerging on the web and will continue in 2015. At Azure, we continue to believe in this model and invested in Julep, Le Tote, The Bouqs and Filter Easy in 2014.
  7. Wearable activity will slow. With the exception of the iWatch, which is expected to be released in early 2015, the hype around wearable devices will be more muted. Fitness trackers, wearable cameras, smart watches, heart rate monitors, and GPS tracking devices will largely be replaced by phone- or watch-based apps. An early indication of this trend was an October 2014 report claiming that Apple had plans to remove Fitbit products from its physical retail stores.
  8. Robotics will continue to make inroads with products that provide value. Specifically, the commercial use of UAVs and drones will continue to accelerate. The recent FAA issuance of permits to use drones to monitor crops and photograph properties for sale is a first step in a broader application of UAVs. Companies involved in infrastructure and software related to UAVs will continue to attract more interest.
  9. Part-time employees, and the replacement of people with technology, will become a larger part of the workforce. The Affordable Care Act and rising minimum wages will each be a force driving this trend.
  10. 3D printers will be increasingly used in smaller-batch and custom manufacturing.

SoundBytes

  • I switched to an iPhone from a Blackberry. While this may sound prehistoric, I will miss many of the efficiencies of the Blackberry that the iPhone lacks. But I had to change: the iPhone is so much better for online use and graphics, and has apps I felt were mandatory, so it made more sense to switch than to buy the newest Blackberry.
  • Wanted to put a stake in the ground predicting the Cavaliers will have a much better second half of the season assuming LeBron is healthy.

Wal-Mart is making progress in ecommerce but it is less than people think

Many years ago it became obvious to some of us that online retail would continue to grow at a much faster pace than brick and mortar stores. This appeared to be less obvious to traditional retailers until more recently. In 2001, I suggested to some colleagues that Wal-Mart should acquire Amazon to gain an edge in online retail (Amazon stock was about $5 a share at the time). This idea was scoffed at. I bought Amazon stock but clearly didn’t maximize my execution, as I sold it within 18 months for 3 times my money (it’s now $317). I’m guessing there were also some prescient investment bankers who received a similar response after suggesting that Wal-Mart buy Amazon. Who knows what the world would be like today had that occurred, as Amazon could easily have been derailed under Wal-Mart management.

A Different Perspective on LinkedIn, The Dominant Business Social network

A high proportion of people I know use Facebook as their social network and LinkedIn as their business network. LinkedIn has executed well in capturing a massive audience of business users with over 300 million members, especially in North America, which accounts for approximately one third of the network’s members (with Europe quickly catching up). Having done so, it is well positioned to replicate what Facebook has done on the social side – capture business discussions. The question is how it can best do this. LinkedIn’s Influencer Series identifies the most influential voices on LinkedIn and invites them to allow LinkedIn to distribute their articles. The distribution goes to the LinkedIn feeds of people who have opted into seeing posts from each writer. However, the relatively small number of posts and limited distribution doesn’t drive the user value and uptick in page views that would be possible if LinkedIn is to own business discussion the way Facebook owns social discussion. Earlier this year LinkedIn recognized this opportunity and opened its Influencer program to its wider member base with hopes that it would generate more engagement. The move came shortly after the company disclosed that page views declined for the second consecutive quarter.


Will Satya’s manifesto make Microsoft a tech leader again?

The CEO correctly lays out some of the ways the world is changing, but can the software maker really change? 

Microsoft CEO Satya Nadella recently emailed Microsoft employees a speech that I’ll refer to as his “Satya Manifesto.” In it, he points out that the software maker must make fundamental strategic and cultural changes to deliver on his vision of being “the productivity and platform company for the mobile-first and cloud-first world.” He further states: “We will reinvent productivity to empower every person and every organization on the planet to do more and achieve more.”

I was impressed with his willingness to shift Microsoft’s focus to the mainstream of where the world is moving. Yet, I couldn’t help compare his memo to a 1999 speech by Carly Fiorina after assuming the CEO role at Hewlett Packard. In her speech she said, “…we are a single global ecosystem – wired, connected, overlapping …”
