Data processing architecture has swung like a pendulum, from a centralized mainframe era, to decentralized PCs, and now back to the cloud. Modern cloud data centers feature distributed architectures that share computing and storage resources across widely dispersed locations at nearly limitless scale, delivering exceptional performance at extraordinarily low cost. The considerable advantages stem from four major factors: Scale economies in purchasing technology and hiring talent facilitate cost and skill advantages. Scope puts servers much closer to users, improving response time and cutting costs. Innovative data center design can dramatically reduce capital and operating costs. Finally, software innovation led by Google, and followed by the open-source Hadoop project, allows applications to work on processor arrays and data sets of nearly limitless size. Against these advantages, enterprises will weigh transition costs, slowly receding security concerns, and application-specific idiosyncrasies to determine which applications should be shifted to the cloud and how quickly. As the change proceeds, private data center investment will wane and traditional IT will commoditize, with the leading cloud providers, IT consultants and their customers the big winners.
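For readers unfamiliar with the software innovation referenced above, the MapReduce model popularized by Google and carried forward by the open-source Hadoop project boils down to a map/shuffle/reduce data flow. The toy single-process sketch below only mimics that flow on a word count; in real Hadoop each phase is distributed across many machines.

```python
# Toy illustration of the MapReduce data flow (map -> shuffle -> reduce).
# Single-process sketch only; real frameworks distribute each phase.
from collections import defaultdict

def map_phase(docs):
    """Emit (word, 1) pairs - the per-record 'map' step."""
    for doc in docs:
        for word in doc.split():
            yield word, 1

def shuffle(pairs):
    """Group values by key - done by the framework between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Aggregate each key's values - the 'reduce' step."""
    return {word: sum(counts) for word, counts in groups.items()}

counts = reduce_phase(shuffle(map_phase(["cloud data", "data center data"])))
print(counts)   # {'cloud': 1, 'data': 3, 'center': 1}
```

Because the map and reduce steps are independent per key, the framework can scale them across arbitrarily large processor arrays, which is the point made above.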
Archive for July, 2012
- Amazon slightly missed consensus sales and earnings estimates, barely turning a profit, but sales were up 29% despite the global economic situation.
- The story remains intact – low margins because of investment against huge opportunities. E-tail dominance unchallenged, but platform and media plays remain risky.
- Facebook met sales expectations with 32% YoY growth, but posted an unexpected loss and offered no guidance for future quarters beyond a significant rise in operating expenses.
- With usage shifting rapidly to mobile platforms, Facebook is stymied by the control that Apple, Google and Microsoft have over their platforms.
Amazon and Facebook offered very little insight to help investors make confident forecasts of future results. SHOCKER! Amazon shareholders have come to expect this sort of treatment from management, although the track record of growth is fairly astounding given the company’s $50B annual run rate as a retailer in the worst global economy of most investors’ lifetimes. The story is the same as it ever has been: “We are investing pell mell to attack opportunities. Profits will come later.” Every once in a while, Amazon posts a margin surprise, like 1Q12 when it delivered nearly $200M in operating earnings, seemingly to prove that it can, and to tease investors with the times to come: times when Amazon is done building distribution centers, done selling tablets at below cost, and done giving its media products away for free to customers who pay the $79 annual fee to avoid shipping charges. It’s easy to be skeptical, but the opportunities are huge and I think Amazon is uniquely well positioned for much of it. I’m not optimistic for the future of Amazon’s device platforms, but there is no obvious candidate to slow Amazon’s march to the sea in retail.
When (if) the Affordable Care Act (ACA) is put into effect, states will be free to disenroll beneficiaries to the federal minimums (in many cases < 30 FPL), expand to (or beyond) the ACA’s original target of 138 FPL, or anything in between
Despite this range of options, we expect most states to adopt 100 FPL as their upper limit of Medicaid eligibility (for non-dually eligible, non-pregnant adults), for a very simple reason: the closer states bring their eligibility thresholds to 100 FPL, the better off they are; conversely the further (above or below) they move from 100 FPL, the worse off they are
A state with current eligibility < 100 FPL that disenrolls the marginal beneficiary saves $0.43 in budgetary terms, but in economic terms loses the $0.57 federal match (and associated multiplier effects) and incurs costs for uncompensated care. If instead the state enrolls the marginal beneficiary who lies above the current eligibility standard (but < 100 FPL), the state spends $0.10 for a $0.90 (plus multiplier effects) gain. In budgetary terms it’s easier to go backward, but in economic terms it’s far better to go forward – at least until you hit 100 FPL
Likewise, the state with current eligibility > 100 FPL that disenrolls the marginal beneficiary above this limit also saves $0.43 in budgetary terms, and loses the $0.57 federal match. However, if that person – who at >= 100 FPL is eligible for federally subsidized coverage on the exchanges – enrolls on the exchange, the gross federal dollars entering the state for that beneficiary increase by 185%. As long as >= 75% of those affected go to the exchanges (the true breakpoint is probably lower), it is to this more generous state’s economic (and budgetary) advantage to lower its eligibility threshold to 100 FPL
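The marginal-dollar arithmetic in the two cases above can be sketched in a few lines. The match rates (43% state / 57% federal for the existing population, 10% / 90% for the expansion population) and the 185% exchange uplift are taken from the text; everything else is a simplifying assumption. In particular, the breakeven function ignores multiplier effects and uncompensated-care costs, which is why it lands well below the conservative 75% threshold used above.

```python
# Sketch of the marginal-beneficiary arithmetic described above.
# Match rates come from the text; the breakeven model is a simplification.

STATE_SHARE = 0.43      # state share of each Medicaid dollar (existing population)
FED_MATCH = 0.57        # federal match (existing population)
EXP_STATE_SHARE = 0.10  # state share for the ACA expansion population
EXP_FED_MATCH = 0.90    # federal match for the expansion population
EXCHANGE_UPLIFT = 1.85  # assumed 185% rise in gross federal dollars when a
                        # disenrolled beneficiary moves to the exchange

def disenroll_below_100(dollars=1.0):
    """Budgetary saving vs. economic loss from disenrolling below 100 FPL."""
    budget_saving = STATE_SHARE * dollars
    economic_loss = FED_MATCH * dollars  # plus multiplier effects, uncompensated care
    return budget_saving, economic_loss

def enroll_expansion(dollars=1.0):
    """State cost vs. federal inflow from enrolling the marginal expansion beneficiary."""
    return EXP_STATE_SHARE * dollars, EXP_FED_MATCH * dollars

def exchange_breakeven():
    """Fraction of disenrolled (>100 FPL) beneficiaries who must reach the
    exchange for gross federal dollars into the state to hold steady."""
    fed_per_exchange_enrollee = FED_MATCH * (1 + EXCHANGE_UPLIFT)
    return FED_MATCH / fed_per_exchange_enrollee  # ~0.35 under these assumptions
```

Under this stripped-down model the breakeven exchange take-up rate is roughly 35%, comfortably below the conservative 75% figure cited above – consistent with the note that the true breakpoint is probably lower.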
In reality most states would have to expand eligibility (for non-dually eligible, non-pregnant adults) to reach 100 FPL, so we expect a net expansion (23% increase in enrollees and 15% in spending v. current eligibility standard), though this is roughly half the size we would have expected if states had all expanded to 138 FPL as originally called for by ACA
Republican states have nearly two-thirds of the enrollees who would be picked up in a nationwide expansion to 100 FPL, though most of these can’t be relied on until after the general election, and also after any attempts at repeal in the event of a Romney presidency. Thus the risks to our sizing of the expansion are to the downside, particularly in the first few years following 2014
100 FPL as a likely equilibrium points to a net expansion, which eases our concerns regarding the Medicaid HMOs. Of these, AGP stands to see the largest enrollment gains, MGLN and MOH the least
Hospitals lose a fair bit of reform-related upside if eligibility stops at 100 FPL. Mainly because of collections problems (low income persons covered on the exchanges are less able and less likely to honor out-of-pocket cost obligations) and higher rates of un-insurance, hospital revenue from a low-income beneficiary on the exchange is about 25% less than if that person were covered by Medicaid. UHS is most negatively affected, HCA least affected
Because persons above 100 FPL who might otherwise have been Medicaid beneficiaries can buy federally subsidized coverage on the exchanges, total federal subsidy spending is likely to be much greater than originally forecast – so much so that the ACA’s cap on subsidies as a % of GDP should be breached in or just after the exchanges’ first year of operation
This implies the ACA may immediately need additional budget authority; in light of the House’s power over budgetary matters and the high likelihood the House remains in Republican hands even if President Obama is re-elected, the subsidy cap may be a large but under-appreciated weakness of the Act
For more information, please see the published research archive
- 12 month product cycles may be too long given the pace set by the Android ecosystem – iPhone sales now drop off 6 months post release, with the new Samsung Galaxy S III the hot product.
- Cutting price on old models vs. purpose building midrange models may miss opportunity for more frequent product introductions while dragging down margins.
- Apple is not yet exploiting the power of its iOS platform to control web-based services to its user base – e.g. advertising, e-commerce, media streaming, etc. – to drive revenue.
- Without changes to the above, future product-driven boom/bust cyclicality will be even more pronounced. The next big product cycle begins in Apple’s 2Q13.
Apple missed its 3Q12 in a manner eerily reminiscent of its 4Q11 miss 9 months ago. iPhone is the culprit: at 50% of the company’s revenues, any shortfall is particularly painful, and short it was. Apple’s strategy of releasing a single new iPhone model once per year has served it well over the years – R&D is conserved, launch costs are amortized over a long product life, and thirst for the new model builds. However, the Android ecosystem is not playing by Apple’s rules, and Cupertino would be well advised to note that the summer of its discontent is playing out while rival Samsung has launched the phone of the season in the Galaxy S III. Google is pushing out major platform updates on a 9 month schedule, and its smartphone partners led Apple 56% to 23% in global market share as of the first calendar quarter, a gap that is almost certain to widen in 2Q12 based on Apple’s reported numbers.
- Netflix quarterly beat was not enough – increasing programming costs and a reminder that a lot of people will watch the Olympics rather than Netflix for a few weeks slammed the stock.
- The rising fees for the studio libraries that remain the lifeblood of Netflix are a reminder that exclusive content deals come with an ever increasing premium.
- Netflix is following the HBO playbook – use library content to build a subscriber base, then establish original content to hold the beachhead once competition arrives.
- If Netflix can build an enthusiastic audience for its originals, studio leverage and competition become much less important issues, but the question is “if”.
Netflix was on target with revenue and beat earnings expectations yet was slammed in after-hours trading, down nearly 15%. While the company reported domestic subscribers up 7.7% YoY driven by a growing streaming customer base, its increasing content costs are becoming painfully apparent, with subscription cost of revenue up 36.3% YoY and squeezing profits. Netflix has several deals for exclusive content with Paramount and Lionsgate’s EPIX as well as the Weinstein Company, which typically carry a premium to comparable non-exclusive content of 1.5x or more. With Amazon, Comcast and others looking to elbow their way into the online streaming movie market, and with Apple looking hungry on the sidelines, investors are justly worried about whether Netflix can keep the lead in the next stage of the race.
In our initiation of coverage of the transport space – June 15th – we wrote about the improved nature of the rail industry and how, following a period of terrible returns, the industry restructured and has seen improving returns on capital since. This does not apply to GWR, which has seen a slow decline in its return on capital since 1995 and is currently showing returns on that trend rather than above it, in contrast with the Class 1 companies. RA has a flatter trend in its return on capital, but we have less history for RA.
The return on capital trends for the regional rails contrast sharply with the Class 1 group (summarized below), which have a very positive slope from 1999. In addition, both GWR and RA are currently below the Class 1 absolute and trend returns on capital.
Source: Capital IQ, SSR Analysis
Our analysis of M&A returns, published in May, did not cover the transport group, but subsequent work shows that the average acquirer in the transport space has outperformed both the S&P 500 and the sector in the 12 months post deal completion, but has underperformed in the 12 months post deal announcement.
While this acquisition might give GWR the opportunity to improve returns and catch up with the Class 1 group, investors have historically not given the benefit of the doubt until the deal has closed. Moreover, our May report showed that in aggregate (for the Industrials and Basic Materials sectors), investors have been much more skeptical about deals since 2008 than they were prior to 2008 and acquirers have generally underperformed their sectors in the last 4 years. The successful and value creative integration of an acquisition is not an easy task.
One further consideration is that our analysis shows GWR valued like the rest of the rail space – well above historic mid-cycle levels. This has happened without the return on capital improvement that the rest of the sector has seen. In our transport piece we indicated that the rail sector had a high Skepticism Index (SSRSI) – valuations high, but not as high as current returns on capital would support. GWR was an outlier in this analysis, suggesting that current valuation discounts an improvement in returns on capital that we have yet to see.
- Microsoft’s 4Q shows the power of Office, Mister Softy’s biggest asset in establishing Windows 8 and Windows Phone as viable 3rd platform vs. Apple and Android.
- Google is powering through the recession as online ads take share from traditional media, on stabilizing click pricing. On-going big Capex expected, as infrastructure is its most important asset.
- Servers for the cloud are driving Intel growth despite the dying PC platform, but appear vulnerable long term to price pressure from ARM-based alternatives.
- Qualcomm missed due to supply constraints, but revealed the light at the end of the tunnel, with optimism that supply will catch up before year end.
Ah, earnings season, when a young analyst’s fancy turns lightly to thoughts of interminable conference calls and deliberately opaque financial disclosures. Speaking of which, several tech bellwethers reported this week – Google, Microsoft, Intel and Qualcomm – and were generally greeted with relief, if not actual applause. Relative to the published consensus, the results were a mixed bag, but as usual, the story was more important.
As a stock research analyst I covered PPG for 7 years (from 1997 to 2004) and the one word that I would have used then to best describe the company and the management style was “sensible”. Not flamboyant, or aggressive, or conservative, or inconsistent, or fantastic, or terrible. Sensible was the right word then and it looks like it is still very appropriate today. Chlor-alkali has been a bit of a headache for PPG for 10+ years; it is inconsistent with the rest of the portfolio and it has often been a distraction from the core strategy because of its margin volatility and lack of growth. The analyst community has been calling for divestment for years. To PPG’s credit, management has been patient and has resisted the pressure because almost every option open to them was likely to be earnings dilutive and cash flow dilutive – while not core, the business has thrown off a lot of cash.
Patience appears to have paid off, in that we are at a point in time where the investment world feels better about the chlor-alkali business, because of cheap natural gas/electricity and because the improving new home market will likely reduce PVC exports from the US (profitable today) and increase domestic consumption of PVC at greater profit margins. Further, Georgia Gulf is now at a phase of its evolution where this move makes very good business sense.
The deal structure minimizes the risk of cyclical mis-valuation and creates the scale that the new GGC needs to compete, without the potentially crippling costs of a premium-driven acquisition. PPG shareholders can elect to hold the commodity company or sell it. If they liked the old PPG structure they can still own the pieces – but with a larger and more efficient commodity piece.
However, these deals are never without risk and we offer a few thoughts:
- Already we are seeing the popularity of the deal and a divergence in valuation. GGC is up 12% as of 3:00 – PPG is up 6.5%. There is the risk that valuations diverge dramatically prior to the deal close – in either direction. The structure of the deal looks like it can accommodate this without GGC feeling like it is overpaying or PPG feeling like it is being underpaid. However, there are examples of deals where some unanticipated arbitrage opens up and throws valuations off balance, complicating the process.
- Candidly, and with no personal experience of current GGC management, our view is that “sensible” would not have been an appropriate description over the last 10-12 years – on a split adjusted basis this was a $500 stock in 2007. GGC has turned down an offer at $35 per share already this year, which, in the light of better housing numbers in the US and cheaper energy, was probably the right thing to do. However, the focus will now be on whether they can deliver the synergies and integration benefits. As a PPG shareholder, post deal close, you can always sell the stock if you do not like the story.
- A risk to the deal following on from the point above is that GGC has operational or other disappointments before the deal closes. If GGC were to trade at levels that suggested its business was worth less than other chlor-alkali businesses, PPG could reasonably get cold feet.
The fundamental issue with the global commodity chemical industry is that there are far too many competitors – too many to suggest that the industry has much of a chance of consistently earning its cost of capital through the cycle, despite the recent trend – see chart and prior research. Consolidation attempts have resulted in many deals that have hurt shareholders, as often cash has been paid at the top of the cycle or premiums have been paid to get the deal done that have either diluted the acquirer’s earnings or raised debt to crippling levels.
This proposed transaction looks to take most of the risk away from both the effective buyer and the effective seller, while at the same time achieving the scale and consolidation that should benefit the new company – sensible.
- Apple vs. Google vs. Microsoft is the central battle that will drive the next 20 years of TMT. The road to success for others is to ride the waves created by that battle.
- The big three have taken turns making splashy announcements that demonstrate the very different strengths each brings to the table.
- Apple’s defining strength – UI/design – is obvious to investors. Google’s – distributed data processing – and Microsoft’s – enterprise apps – are harder to appreciate, but no less valuable.
- Strengths define somewhat divergent strategies that will lessen the degree of direct confrontation. The opportunities are huge and all three companies should be winners.
I’ve been writing for quite a while about the outsized importance of software platforms, and our belief that Apple, Google and Microsoft are in position to co-opt a disproportionate share of the value being siphoned from the rest of the economy by the Internet. See LINK, LINK and LINK. Essentially, these platforms have become gatekeepers for users, with the power to integrate their own technology to the exclusion of 3rd parties and to guide users to favored 3rd party apps that pay for the privilege. However, the brewing battle is not as straightforward as it may seem. The three companies come to the fray with very different strengths, and, given this, very different strategies. This has been evident in the series of big announcements made by the companies over the past 5 weeks.
On Friday, we published a comprehensive analysis of current consensus expectations for the third quarter for the US Chemical Industry. We believe that consensus is far too optimistic and are suggesting that earnings estimates may come down by as much as 15% for the quarter. We have looked beyond company reports, analyzing several of the macro drivers and, yes, we are taking a risk and sticking our necks out a bit. However, much of what we discuss harks back to the fundamental risk of forecasting and the current dynamics of the sell side equity business.
In short, it is our view that sell side analysts have neither the time nor the incentive to do the quality of work that is needed to get through the noise and get more than a hair away from consensus. The structure of the equity business today discourages analysts from taking risk and prevents them from taking their time to consider what might be really happening.
Two events last week highlight the issue. On July 12th, one of the analysts covering Dow Chemical cut numbers for the year and for the second quarter. The cuts were extreme – more than 4% for annual revenue and almost 10% for Q2 revenues. We do not know the identity of the analyst as it is restricted on Capital IQ (and it does not matter), but the analyst was way above consensus prior to the cut and is now below. The questions are: why did it take so long and what value is this to investors a few days before Dow reports earnings? The fundamentals driving the revisions have been known for more than a month (pricing and volume) and, arguably, were predictable two months ago.
The second data point was a publication by the American Chemistry Council last week talking about how strong polyethylene sales and export volumes were in June relative to May. This was picked up by several analysts and reported to be a good thing. It is probably the opposite, but because no one is allowed the luxury of time to consider the data itself, its implications and how it interacts with other data, the simplest conclusion is drawn.
Looking at our piece on Chemicals specifically, we are focused on the macro environment driving demand and subsequent production disappointments in Q3 2012, and how this in turn will impact earnings and valuation. We have simply looked at the global economic indicators today and compared them with Q3 of 2011. They are all weaker, with the exception of US Housing and US Autos, both of which are coming off historic lows and are direct beneficiaries of very low interest rates. Everything else looks grim and points to some meaningful demand disappointments. The ISM new order numbers tend to be a reasonable leading indicator of industrial sales growth – the lag was 12 to 18 months 20-30 years ago and is closer to 9-12 months today. The lag, together with its shortening over time, is probably explained by inventory. If this relationship continues, the recent ISM number is a cause for concern and supports the idea that expectations at the stock level look too high.
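The lead-lag relationship described here can be estimated mechanically by scanning correlations at different lags. A minimal sketch on synthetic data follows – the series and the 9-month lag are made up for illustration, not actual ISM or sales data:

```python
# Illustrative sketch (synthetic data): estimating how many months a leading
# indicator runs ahead of a target series by scanning lagged correlations.
import random
import statistics

random.seed(0)
months = 240
indicator = [random.gauss(0, 1) for _ in range(months)]   # stand-in "new orders" series
TRUE_LAG = 9                                              # assumed: sales follow 9 months later
sales = [indicator[t - TRUE_LAG] + random.gauss(0, 0.3) for t in range(months)]

def correlation(xs, ys):
    """Pearson correlation of two equal-length series."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def best_lag(leading, target, max_lag=24):
    """Lag (in months) at which the leading series best predicts the target."""
    corrs = {lag: correlation(leading[:-lag], target[lag:])
             for lag in range(1, max_lag + 1)}
    return max(corrs, key=corrs.get)

print(best_lag(indicator, sales))   # recovers the assumed 9-month lag
```

The same scan run on different eras of real data is one way the compression of the lag from 12-18 months toward 9-12 months could be observed.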
Now, moving on to the ACC data – better numbers in June for polyethylene – whoopee! This cannot be heralded as good news without some thought, and when you think about it, how can it be good news? Higher exports are completely understandable – see our report – and they are the basis on which the US is building 5 to 7 new world scale basic chemical facilities. We have cheap natural gas and we are going to exploit that – but export volumes carry a lower margin than domestic volumes, so it is good news and bad news.
But a significant increase in domestic consumption in June simply does not add up – all the macro indicators point to weaker demand except in US Autos and Housing (which have minimal impact on polyethylene). It has to be a move in inventory, exploiting lower pricing in June, and it will lead to lower demand in Q3, supporting our cautious stance. This may not be good news. It may add incrementally to June numbers, but we are clear in our research that we believe the bigger surprise will be Q3 and 2H guidance not Q2 results.
But let’s cut right to the real heart of the matter – consensus estimates are often meaningless, arguably through no fault of the research analysts. The average investment manager and hedge fund has steadily increased the share of commissions that it pays its brokers and research providers for “corporate access”. Fifteen years ago we never spoke about corporate access but today it is 30+% of the reason investors pay their brokers and boutique research providers. Moreover, as trading volumes and commissions have fallen, investment banks have become steadily more dependent on advisory fees and equity capital markets to justify the investment made in and to pay the large overheads of research sales and trading departments. Very few equity platforms can pay their own way anymore.
Consequently, the average analyst has two very compelling reasons to take minimal risk when commenting on individual companies – he or she does not want to upset the company or create any waves that might limit corporate access or damage broader corporate relationships. We have created an environment where analysts are not rewarded for taking risks. Suggesting that guidance from a company is wrong and that they are going to miss numbers can be considered a risk in today’s environment. Sad, but there it is.
Further, because as an analyst you are paid to maximize client revenues, you are compelled to publish all the time (any piece of news flow merits a response – often without time for analysis), make dozens of calls a day, and focus on seeing clients and providing the corporate access. You are not encouraged to step back and review the longer-term bigger picture, think for a while, try some new analysis, or test the widely accepted view. So it doesn’t happen; you are running at full speed on the same treadmill each day. Worse still, the job market stinks and there are too many people doing the same jobs at too many firms, so the backdrop for the average analyst today is that, at a very basic level, you are distracted by fear and can do little more than focus on the quantitative metrics against which you are measured internally. We can forgive the analyst who was late to the estimate change on DOW, as he has likely spent the last 8 weeks marketing like a madman trying to get as many II votes as possible (Institutional Investor – the annual poll closes shortly).
How can investors expect to see original – forward thinking – research in this environment, and how can you have any confidence that the estimates at the aggregators have been reviewed properly or even recently?
On Supply/Demand Forecasting Specifically
In our 25 years of forecasting fundamental supply and demand, it has been our observation that it is almost always the demand side of the equation that trips us up. Supply is generally easier to quantify, and yet we tend to spend much more time researching and analyzing it than we do demand. We look at production capacity additions (it does not matter whether it is plastics, autos, cement trucks, drilling rigs); we look at operating efficiencies and we look at feedstock availability. We draw cost curves; we posit trade flows and regional interplays and so on.
When we get to demand we either extrapolate history or we find a multiple of an economic indicator, like GDP, and then take a consensus forecast for that – we spend 10% of the time on the demand side of the equation and 90% on supply – at least we do in Industrials and Basic Industries. Of course, the companies themselves help in this process, because they give us lots of help on supply – capacity on the ground, expected capital expenditures, etc. For demand, they often use the same methodology as us – take a guess.
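The GDP-multiplier shortcut described above – the method being criticized here, not endorsed – can be sketched as follows, with all numbers purely illustrative:

```python
# Minimal sketch of the demand-forecasting shortcut described above:
# fit a historical multiplier of demand growth to GDP growth, then apply
# it to a consensus GDP forecast. All numbers are illustrative.

def fit_multiplier(demand_growth, gdp_growth):
    """Historical ratio of demand growth to GDP growth (simple average)."""
    ratios = [d / g for d, g in zip(demand_growth, gdp_growth) if g != 0]
    return sum(ratios) / len(ratios)

def forecast_demand(current_demand, gdp_forecast, multiplier):
    """Extrapolate demand using the consensus GDP forecast and the fitted multiplier."""
    return current_demand * (1 + multiplier * gdp_forecast)

# Illustrative history in which demand grew at roughly 1.5x GDP
hist_demand = [0.045, 0.030, 0.060]
hist_gdp = [0.030, 0.020, 0.040]
m = fit_multiplier(hist_demand, hist_gdp)   # ~1.5
next_year = forecast_demand(100.0, 0.02, m)  # ~103.0 on a 2% consensus GDP forecast
```

The weakness is obvious once written down: the whole forecast hinges on a single consensus GDP number, which, per the next paragraph, is itself often biased.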
There is a problem with taking consensus GDP or other economic forecasts. Simply put, many companies making those forecasts have conflicted motivations. The majority have an interest in portraying a positive outlook as this builds customer confidence and drives business. You will not be a very successful investment bank if your core view is “the world is slowing down so put your plans on hold for a while”. Consensus economic forecasts consistently underestimate slowdowns and demand shocks. There are some very good economic forecasters but they often get lost in the averaging.
In our experience companies themselves are just as bad at forecasting demand, underestimating and getting caught short of product on the upside, and overestimating and over-producing on the downside. There is also the issue of the “localized” company view: “We accept that overall demand may fall, but it won’t impact us”.