When is everyday low pricing the right tactic?

When stores like Wal-Mart, Sam's Club, and Costco began their rapid expansion in the 1990s, supermarkets were thrown for a loop. The limited service, thinner assortments, and “everyday low pricing” of items in these “supercenters” — including foodstuffs — created enormous cost savings and increased credibility with consumers. What was a Safeway or a Stop & Shop to do in the face of such brutal competition?

A new paper from Stanford GSB looks at the strategic pricing decisions made by grocery firms during that period in response to the shock to their local market positions by the entry of Wal-Mart. The paper answers the age-old question in the supermarket industry: Is “everyday low pricing” (EDLP) better than promotional (PROMO) pricing that attempts to attract consumers through periodic sales on specific items? Investigators find that while EDLP has lower fixed costs, PROMO results in higher revenues — which is why it is the preferred marketing strategy of many stores.

The research is also the first to provide econometric evidence that repositioning firms’ marketing approaches can be quite costly. Switching from PROMO to EDLP is six times more expensive than migrating the other way around — which explains why supermarkets did not shift en masse to an “everyday low pricing” format as predicted when Wal-Mart entered the game.

That's from this article from Stanford's GSB. Ex-Apple exec Ron Johnson can attest to the cost of switching from PROMO to EDLP pricing; it cost him his job at J. C. Penney.

I'm a Costco regular, but I'll buy groceries at Safeway or other grocery stores sometimes just because of geographic convenience and longer shopping hours. If you have proprietary products, that also allows you to sidestep, to some extent, the EDLP war of attrition.

For commodity products, however, the more retail moves online, the less tenable it is for stores to rely on the sheer convenience of physical proximity to bypass the EDLP game. The paper above looked at the entry of Wal-Mart, but of course the modern-day successor to Wal-Mart as an e-commerce gorilla is Amazon. If you are selling the same commodities as Amazon, it's a brutal game, especially as the eventual customer expectation will likely be same-day delivery AND everyday low prices for most retail goods. In that scenario, the findings of the researchers above would not hold.

Cities are superlinear, companies are not

But unlike animals, cities do not slow down as they get bigger. They speed up with size! The bigger the city, the faster people walk and the faster they innovate. All the productivity-related numbers increase with size (wages, patents, colleges, crimes, AIDS cases) and their ratio is superlinear. It's 1.15/1. With each increase in size, cities get a value-added of 15 percent. Agglomerating people, evidently, increases their efficiency and productivity.

Does that go on forever? Cities create problems as they grow, but they create solutions to those problems even faster, so their growth and potential lifespan is in theory unbounded.

...

Are corporations more like animals or more like cities? They want to be like cities, with ever increasing productivity as they grow and potentially unbounded lifespans. Unfortunately, West et al.'s research on 22,000 companies shows that as they increase in size from 100 to 1,000,000 employees, their net income and assets (and 23 other metrics) per person increase only at a 4/5 ratio. Like animals and cities they do grow more efficient with size, but unlike cities, their innovation cannot keep pace as their systems gradually decay, requiring ever more costly repair until a fluctuation sinks them. Like animals, companies are sublinear and doomed to die.
 

From a Stewart Brand summary of research by Geoffrey West.

From a long conversation with West at Edge:

Let me tell you the interpretation. Again, this is still speculative.

The great thing about cities, the thing that is amazing about cities is that as they grow, so to speak, their dimensionality increases. That is, the space of opportunity, the space of functions, the space of jobs just continually increases. And the data shows that. If you look at job categories, it continually increases. I'll use the word "dimensionality."  It opens up. And in fact, one of the great things about cities is that it supports crazy people. You walk down Fifth Avenue, you see crazy people, and there are always crazy people. Well, that's good. It is tolerant of extraordinary diversity.

This is in complete contrast to companies, with the exception of companies maybe at the beginning (think of the image of the Google boys in the back garage, with ideas of the search engine no doubt promoting all kinds of crazy ideas and having maybe even crazy people around them).

Well, Google is a bit of an exception because it still tolerates some of that. But most companies start out probably with some of that buzz. But the data indicates that at about 50 to a hundred employees, that buzz starts to stop. And a company that was more multidimensional, more evolved, becomes one-dimensional. It closes down.

Indeed, if you go to General Motors or you go to American Airlines or you go to Goldman Sachs, you don't see crazy people. Crazy people are fired. Well, to speak of crazy people is taking the extreme. But maverick people are often fired.

It's not surprising to learn that when manufacturing companies are in a downturn, they decrease research and development, and in fact in some cases do actually get rid of it, thinking "oh, we can get that back, in two years we'll be back on track."

Well, this kind of thinking kills them. This is part of the killing, and this is part of the change from superlinear to sublinear, namely companies allow themselves to be dominated by bureaucracy and administration over creativity and innovation, and unfortunately, it's necessary. You cannot run a company without administration. Someone has got to take care of the taxes and the bills and cleaning the floors and the maintenance of the building and all the rest of that stuff. You need it. And the question is, “can you do it without it dominating the company?” The data suggests that you can't.
 

Lastly, from an article about West and his research in the NYTimes.

The mathematical equations that West and his colleagues devised were inspired by the earlier findings of Max Kleiber. In the early 1930s, when Kleiber was a biologist working in the animal-husbandry department at the University of California, Davis, he noticed that the sprawlingly diverse animal kingdom could be characterized by a simple mathematical relationship, in which the metabolic rate of a creature is equal to its mass taken to the three-fourths power. This ubiquitous principle had some significant implications, because it showed that larger species need less energy per pound of flesh than smaller ones. For instance, while an elephant is 10,000 times the size of a guinea pig, it needs only 1,000 times as much energy. Other scientists soon found more than 70 such related laws, defined by what are known as “sublinear” equations. It doesn’t matter what the animal looks like or where it lives or how it evolved — the math almost always works.

West’s insight was that these strange patterns are caused by our internal infrastructure — the plumbing that makes life possible. By translating these biological designs into mathematics, West and his co-authors were able to explain the existence of Kleiber’s scaling laws. “I can’t tell you how satisfying this was,” West says. “Sometimes, I look out at nature and I think, Everything here is obeying my conjecture. It’s a wonderfully narcissistic feeling.”
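The power-law arithmetic in these excerpts is easy to check for yourself. Here's a minimal sketch in Python using only the exponents quoted above; the doubling comparison at the end is my own illustration of superlinear versus sublinear scaling, not a figure from West's papers.

    # Kleiber's law from the excerpt above: metabolic rate ~ mass^(3/4).
    # Sanity check of the elephant vs. guinea pig numbers.
    mass_ratio = 10_000                  # an elephant is ~10,000x the mass of a guinea pig
    energy_ratio = mass_ratio ** 0.75    # (10^4)^(3/4) = 10^3
    print(energy_ratio)                  # 1000.0 -> "only 1,000 times as much energy"

    # Cities and companies follow the same power-law form, metric ~ size^beta:
    #   beta ~ 1.15 (superlinear): per-capita output grows as the city grows
    #   beta ~ 0.80 (sublinear):   per-capita output shrinks as the company grows
    for beta in (1.15, 0.80):
        total = 2 ** beta           # total output after doubling in size
        per_capita = total / 2      # output per person after doubling
        print(beta, round(total, 2), round(per_capita, 2))
    # 1.15 -> 2.22 total, 1.11 per capita (a gain)
    # 0.80 -> 1.74 total, 0.87 per capita (a loss)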
 

The pace of technology has already shifted some of the old company scaling constraints in the past two decades. When I first joined Amazon, one of the first analyses I performed was a study of the fastest growing companies in history. Perhaps it was Jeff, perhaps it was Joy (our brilliant CFO at the time), but someone had in their mind that we could be the fastest growing company in history as measured by revenue. Back in 1997, no search engine gave good results for the question "what is the fastest growing company in history."

Some clear candidates emerged, like Wal-Mart and Sam's Club, or Costco. I looked at technology giants like IBM and Microsoft. Two things were clear. First, most every company had some low-revenue childhood years while it found its footing before achieving the exponential growth it became famous for. Second, and this was most interesting to us, many companies seemed to suffer some distress right around $1B in revenue.

This was very curious, and a deeper examination revealed that many companies went through growing pains right around that milestone because smaller-company processes, systems, and personnel that had worked fine until that point broke down at that volume of business. This was a classic scaling problem, and at around $1B, or just before it, many companies hit that wall, like the fabled wall at mile 20 of a marathon.

Being as competitive as we were, we quickly turned our gaze inward to see which of our own systems and processes might break down as we approached our first billion in revenue (by early 1998 it was already clear to us that we were going to hit that in 1999).

Among other things, it led us to the year of GOHIO. Reminiscent of how, in David Foster Wallace's Infinite Jest, each year in the future had a corporate sponsor, each year at Amazon we had a theme that tied our key company goals into a memorable saying or rubric. One year it was Get Big Fast Baby because we were trying to achieve scale ahead of our competitors. GOHIO stood for Getting Our House In Order.

In finance, we made projections for all aspects of our business at $1B+ in revenue: orders, customer service contacts, shipments out of our distribution centers, website traffic, everything. In the year of GOHIO, the job of each division was to examine their processes, systems, and people and ensure they could support those volumes. If they couldn't, they had to get them ready to do so within that year.

Just a decade later, the $1B scaling wall seems like a distant memory. Coincidentally, Amazon has helped tear down that barrier with Amazon Web Services (AWS), which makes it much easier for technology companies to scale their costs and infrastructure linearly with customer and revenue growth. Groupon came along and vaulted to $1B in revenue faster than any company in history.

[Yes, I realize Groupon revenue is built off of what consumers pay for a deal and that Groupon only keeps a portion of that, but no company takes home 100% of its revenue. I also realize Groupon has since run into issues, but those are not ones of scaling as much as inherent business model problems.]

Companies like Instagram and WhatsApp can now routinely scale to hundreds of millions of users with hardly a hiccup and with far fewer employees than companies of the past. Unlike biological constraints such as the circulation of blood, oxygen, or nutrients, some business scaling constraints can be pushed out by technology, and they have been.

Now we look to companies like Google, Amazon, and Facebook, companies that seem to want to compete in a multitude of businesses, to study what the new scaling constraints might be. Technology has not removed all of them: government regulation, bureaucracy or other forms of coordination costs, and employee churn or hiring problems remain some of the common scaling constraints that put the brakes on growth.

A theory of jerks

Picture the world through the eyes of the jerk. The line of people in the post office is a mass of unimportant fools; it’s a felt injustice that you must wait while they bumble with their requests. The flight attendant is not a potentially interesting person with her own cares and struggles but instead the most available face of a corporation that stupidly insists you shut your phone. Custodians and secretaries are lazy complainers who rightly get the scut work. The person who disagrees with you at the staff meeting is an idiot to be shot down. Entering a subway is an exercise in nudging past the dumb schmoes.

We need a theory of jerks. We need such a theory because, first, it can help us achieve a calm, clinical understanding when confronting such a creature in the wild. Imagine the nature-documentary voice-over: ‘Here we see the jerk in his natural environment. Notice how he subtly adjusts his dominance display to the Italian restaurant situation…’ And second – well, I don’t want to say what the second reason is quite yet.
 

From Eric Schwitzgebel over at Aeon. He defines a jerk thus:

I submit that the unifying core, the essence of jerkitude in the moral sense, is this: the jerk culpably fails to appreciate the perspectives of others around him, treating them as tools to be manipulated or idiots to be dealt with rather than as moral and epistemic peers. This failure has both an intellectual dimension and an emotional dimension, and it has these two dimensions on both sides of the relationship. The jerk himself is both intellectually and emotionally defective, and what he defectively fails to appreciate is both the intellectual and emotional perspectives of the people around him. He can’t appreciate how he might be wrong and others right about some matter of fact; and what other people want or value doesn’t register as of interest to him, except derivatively upon his own interests. The bumpkin ignorance captured in the earlier use of ‘jerk’ has changed into a type of moral ignorance.
 

At some point in the technology world, it became fashionable to reject the jerk. Perhaps the first meaningful example was the great Netflix culture deck, with its rejection of the “brilliant jerk.” I don't know if any studies led to this movement or whether it was anecdotal, but somewhere along the line, the prevailing strategy shifted from trying to hire and isolate brilliant jerks to just rejecting them outright, perhaps because of a perceived increase in the need for effective collaboration among team members to ship important work.

The music biz

14. Coldplay, Radiohead and Dave Matthews are huge because they snuck in under the wire via the old system, they were the beneficiaries of big TV video play, when that meant something, before the Web obliterated it. Otherwise, they’d be Arcade Fire, which garners great reviews, wins Grammys and most people still have not heard and don’t even care about.

15. There’s tons of money in music, more than ever before, if you’re a superstar, if not, you’re starving.
 

From The Lefsetz Letter on What People Don't Want to Believe.

Difference between peacetime and wartime

Military operations are, arguably, especially mistake-prone, because militaries aren’t like other organizations. A normal bureaucracy has a job, and it does that job all the time. Militaries, on the other hand, tend to spend most of their time not really engaged in their main purpose: fighting wars.

Teles noted James Q. Wilson’s observation about the fundamental difference between a peacetime army and a wartime army. In peacetime, it’s easy to observe inputs but impossible to observe the output -- which is to say, how ready your troops are to go out and kick some enemy butt on the battlefield. When you get into a war, this completely reverses. In the chaos of battle, it’s very difficult to know exactly what your people are doing. On the other hand, it’s relatively easy to observe whether they killed the people they were supposed to kill and took the territory they were supposed to take.

That means that the people who advance in a peacetime army are, unfortunately, not necessarily the same people you want around when the shooting breaks out.
 

Megan McArdle on why armies mess up so often. I liked this quote from Steven Teles, “In wartime, you want people who would rather ask for forgiveness than permission. In the peacetime army, it’s the opposite.”

The same applies to companies. Being in wartime is very different from being in peacetime, from the CEO level on down. Ben Horowitz writes nicely about the difference for CEOs in his great book The Hard Thing About Hard Things.

Modern movie studio economics

A really fantastic three-part analysis by Liam Boluk of modern movie studio economics as it pertains to blockbusters.

Future of Film I: Why Summer 2013 was Destined for Losses

Much has been said about the growing role of ‘tent-pole’ filmmaking, where the superlative performance of a major blockbuster supports the rest of the studio’s portfolio (including failed blockbusters). In practice, however, the strategy doesn’t ‘hold up’. Over the past decade, the Summer Blockbuster season has delivered a net theatrical profit only three times and the major studios have lost nearly $2.6B on $34B in production and marketing spend.

...

The Summer 2013 season was so jam-packed with “blockbusters” that the industry seemed destined for historic losses. The season contained:

  • 18 blockbusters – A historic high and a 41% increase over the ten-year average
  • 15 back-to-back weekends of blockbuster releases – A third more than the ten-year average and 25% more than a decade ago
  • 5 weekends with two blockbuster releases – 317% more than the average, 2.5x the previous record and 5x the number in 2003
     

Future of Film II: Box Office Losses as the Price of Admission

For all its glamour, theatrical entertainment is simply a rotten business to be in.

Though its products are not commodities, many of the industry’s competitive dynamics and characteristics suggest they could be:

  • Past success is not a predictor of future performance. Last year’s box office receipts do not influence current-year performance and year-to-year momentum translates into little beyond high spirits
  • Talent doesn’t ensure success. The most “valuable” stars, brand-name directors and veteran producers routinely produce box-office bombs
  • Hollywood brands are irrelevant. Aside from Pixar (whose brand is arguably in decline), consumers don’t pick films based on whether they were a Universal or Paramount production. Indeed, consumers rarely even know
  • All products are offered at the same market price. Regardless of the film’s production costs or target customers, end consumer pricing is largely identical

...

Why then, do executives continue making films? They have few (if any) levers they can reliably play with, the success of individual films causes massive disruptions in annual performance, and in the long run, performance is unlikely to break even, let alone outpace market returns.

The answer: ancillary revenue. In 2012, box office receipts represented only 52% of revenue for the average film, with the remainder comprised of home video sales, pay-per-view and TV/OTT licensing, syndication fees and merchandising. After appropriating for related costs, as well as backend participation (Robert Downey Jr. took a reported $50M from Avengers) and corporate overhead, the average Internal Rate of Return (IRR) for the majors jumps to roughly 80%.

...

Since silent films first appeared on the silver screen, motion pictures have been primarily a B2C business, with film studios sharing revenue with theater operators. But over the past decade, the majors have transformed into an increasingly diversified B2B partner. Their job is not to bring eyes to their theatrical products, but to enable NBC to drive Sunday advertising revenue, ABC Studios to create a high-margin television series, HBO to collect monthly subscriber fees or Mattel to sell Cars toys. Entertainment, in short, has become both a platform and a service.
 

Future of Film III: The Crash of 'Film as a Platform'

More important, however, is the impending ‘Film as a Platform’ implosion. Looking at 2016’s dense release schedule, theatrical losses per blockbuster are likely to increase considerably. Not only will increased competition drive down average attendance, it could push studios to invest even more into their film properties in the hopes of standing out. This itself isn’t a fatal exposure – studios will simply need to rely more heavily on ancillary revenues. However, the real issue is that further audience fragmentation will make it even harder to achieve the critical mass audience needed to support ancillary revenue streams. Worse still, the growing number of franchise films may end up flooding ancillary channels.

Ancillary markets such as home video, merchandising and children’s television can only absorb so much content. A child, after all, will not want a Christmas comprised of various X-Men, Star Wars and Avatar paraphernalia and parents are unlikely to purchase multiple bedroom sets. Television audiences can support only so many series in a given genre (the Marvel Cinematic Universe will have 7 in 2015 alone). Though themed sets have been a strong sales driver for the Lego Group, optimizing marketing and inventory investments will limit the number of franchises they will support – especially in the holiday season. As a result, the deluge of ‘platform films’ is likely to significantly reduce the ancillary revenues studios rely on for film profitability. To make matters worse, it would take at least two years for studios to emerge from this crunch due to the fact that films are released 1-2 years after investment/production decisions are made.
 

I love my occasional summer blockbuster movie, but I already feel like I have pop movie diabetes. The latest Captain America movie has not one but two extra scenes during the end credits, each previewing a different future Marvel movie. One day soon, after the credits of the latest Marvel movie, the extra scene will just be a house ad for the theme park ride based on the movie, and on the way out of the theater there will be a booth set up to sell toys from the movie. Theaters already make most of their profits on concessions, using blockbuster movies as a loss leader; it's not all that different for the studios themselves.

The automatic corporation?

The intermediate step to a fully automated corporation is one where tasks requiring humans are performed not by employees but are broken into micro-tasks and fulfilled by crowdsourcing (using, for example, services like Mechanical Turk).

Corporations do not scale, and eventually die. That’s because they scale sub-linearly. Their productivity metrics scale by an exponent of ⅘ on the number of employees.

I hypothesize that the management overhead which makes corporations grow sub-linearly is due to the limited information processing capability of individual humans. People at the top do not have local on-the-ground information: how are individual products performing, what are customers’ complaints, etc. And the rank-and-file folks on the ground do not have the relevant high-level information: how does what I’m doing translate to the value that the corporation as a whole seeks to maximize? In fact, the flow of value and information is so complex that employees have pretty much given up on determining that relationship, and know of it only at a macro P&L-center level.

An algorithm will have no such problems with acting on both global as well as fine-grained local information. In fact, I suspect that the more information it gets to act on, the better decisions it will make, making automatic corporations grow super-linearly.
 

More here, all fascinating, on the concept of an automatic corporation.

When the idea of two-pizza teams was first proposed at Amazon, it was an attempt to accomplish two things simultaneously. On the one hand, keeping teams small was a way of giving them autonomy in figuring out what strategy and projects to pursue. On the other hand, since each team had to optimize on a fitness function agreed upon with senior management, it was a model for scaling Jeff Bezos and his senior management team's ability to coordinate activities across the company. If you have a limited number of people you trust to choose the fitness functions, that's still a bottleneck.

The idea of an automatic corporation is to replace the humans in both the fitness-function and project-selection process with software, which can scale where humans cannot.
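To make that concrete, here's a toy sketch of what putting software in the fitness-function loop might look like: candidate projects get scored against one agreed-upon metric and the portfolio is chosen automatically. Every name, metric, and number below is invented purely for illustration; this is not a description of how Amazon actually selected projects.

    from dataclasses import dataclass

    @dataclass
    class Project:
        name: str
        expected_value: float   # hypothetical forecast of long-term value created
        cost: float             # hypothetical resource cost (say, team-quarters)

    def fitness(p: Project) -> float:
        # A stand-in fitness function: value generated per unit of resource spent.
        return p.expected_value / p.cost

    def select_portfolio(candidates: list[Project], budget: float) -> list[Project]:
        # Greedy selection by fitness until the budget runs out. In the automatic
        # corporation, this loop, rather than a management chain, would decide
        # what the two-pizza teams work on next.
        chosen = []
        for p in sorted(candidates, key=fitness, reverse=True):
            if p.cost <= budget:
                chosen.append(p)
                budget -= p.cost
        return chosen

    candidates = [
        Project("faster checkout", expected_value=12.0, cost=3.0),
        Project("new category launch", expected_value=20.0, cost=8.0),
        Project("internal tooling", expected_value=5.0, cost=2.0),
    ]
    print([p.name for p in select_portfolio(candidates, budget=10.0)])
    # -> ['faster checkout', 'internal tooling'] on these made-up numbers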

This may sound far-fetched, but the author Vivek Haldar notes it already exists in some forms today.

A limited version of what I’m describing already exists. High-frequency trading firms are already pure software, mostly beyond human control or comprehension. The flash crash of 2010 demonstrated this. Companies that are centered around logistics, like FedEx or Walmart, can be already thought of as complex software entities where human worker bees carry out the machine’s instructions.

This happens naturally, because over time more and more of the business logic of a company becomes encoded in software. Humans still have some control (or so they think) but mostly what they’re doing is supplying parameters to the computation. A modern corporation is so complex that it does not fit in the brain of a single person (or a small number of persons). Software carries the slack.

Big data and price discrimination

Adam Ozimek speculates that Big Data might bring about more price discrimination. First-degree price discrimination has always been a sort of business holy grail, but historically it was too difficult to get enough information about the shape of the price-demand curve to pull it off.

For some time now, though, that has no longer been true for many companies, and in fact one company did try to capitalize on it: Amazon.com. I know because I was there, and the reason it was a short-lived experiment is a real-world case study of how the internet both enables and then kneecaps this type of price discrimination.

Amazon, until then, had one price for all customers on books, CDs, and DVDs (this was the age before those products had been digitized for retail sale). A test was undertaken to vary the discount on hot DVDs for each customer visiting the website. By varying the discount from 10% up to, say, 40%, and then tracking purchase volume, you could theoretically draw the price-demand curve with beautiful empirical accuracy.
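Mechanically, an experiment like that is just randomized price variation followed by a curve fit. Here's a hedged sketch of the idea with entirely made-up numbers and a simple constant-elasticity fit; it illustrates the concept, not the actual test design.

    import numpy as np

    # Hypothetical experiment: visitors are randomly shown different discounts on
    # the same DVD, and we record the purchase rate at each resulting price.
    list_price = 24.99
    discounts = np.array([0.10, 0.20, 0.30, 0.40])           # 10% to 40% off
    prices = list_price * (1 - discounts)
    purchase_rate = np.array([0.021, 0.028, 0.041, 0.060])   # invented conversion rates

    # Fit constant-elasticity demand: log(q) = a + e * log(p).
    elasticity, intercept = np.polyfit(np.log(prices), np.log(purchase_rate), 1)
    print(f"estimated price elasticity: {elasticity:.2f}")   # negative: demand falls as price rises

    # The fitted curve lets you estimate expected revenue per visitor at any price,
    # which is exactly the information first-degree price discrimination would exploit.
    grid = list_price * (1 - np.linspace(0.05, 0.45, 9))
    demand = np.exp(intercept) * grid ** elasticity
    print(f"revenue-maximizing price on this toy data: ${grid[np.argmax(grid * demand)]:.2f}")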

Just one catch: some customers noticed. At that time, DVDs were immensely popular, selling like hotcakes, and the most dedicated DVD shoppers perused all the online retail sites religiously for the best deals, posting links to hot deals on forums. One customer posted a great deal on a hot DVD to one such forum, and immediately other respondents replied saying they weren't seeing the same discount.

The internet giveth, the internet taketh away. The resulting PR firestorm led to the experiment being cancelled right away. Theoretically, the additional margin you could earn from such price discrimination is attractive. But the idea that different customers would be charged different prices would cause such distrust in Amazon's low-price promise that any margin gains would be more than offset by the volume of customers hesitating to hit the buy button.

Ozimek notes this: "The headwind leaning against this trend is fairness norms." What's key to this is that the internet is the world's most efficient transmitter of information, and while it enables a greater degree of measurement that might enable first degree price discrimination, it also enables consumers to more easily share prices with each other. This greater transparency rewards the single low price strategy.

It's not a coincidence, in my mind, that Apple fought the music labels for a standard $0.99 per track price while Amazon fought the publishers for a standard $9.99 price for Kindle ebooks. Neither Amazon nor Apple was trying to profit on the actual ebook or digital music retail sales (in fact, many were likely sold at break-even or a loss); they were building businesses off the sale of complementary goods. In the case of Amazon, which is always thinking of the very long game, there are plenty of products it does make a healthy profit on when customers come to its site, and getting users to invest heavily in building a Kindle library acted as a mild form of system lock-in. In the case of Apple, it was profiting off of iPod sales.

In the meantime, second- and third-degree price discrimination continue to exist and thrive even with the advent of the internet, so it's not as if the pricing playbook has dried up.

A skeptic might counter: didn't Ron Johnson get fired from J. C. Penney for switching them over to an everyday low price model? Didn't their customers revolt when he took away the sales and coupons and deals you had to hunt down?

Yes, but everyday low pricing isn't a one-size-fits-all pricing panacea (as I wrote about in reference to the Johnson pricing debate at J.C. Penney). For one thing, there is path dependence. Once you go with a regular discount/deal scheme, customers create a mental price anchor that centers on that discount percentage and absolute price. It's hard to lift an anchor.

J. C. Penney was trying to go from a heavily sale-driven pricing scheme to an everyday low pricing model, and that's an uphill, unmarked path. Only the reverse path is paved. It's not clear whether the switch would have worked in the long run; Johnson ran out of runway with his board soon after he made the switch and revenues declined.

Everyday low pricing tends to work best when you're selling commodities, since those are items your customers can purchase in many places online. At Amazon we were far more interested in dominating one crucial bit of mental math: what website do I load up first when I want to buy something? We were obsessed with being the site of first resort in a consumer's mind; it was the core reason we wanted to be the world's most customer-centric company. Anything that might stand in the way of someone making a purchase, whether prices, return policy, shipping fees, or speed of delivery, was an obstacle we assaulted with relentless focus. On each of those dimensions, I don't think you'll find a company that is as customer-friendly as Amazon.com.

Ultimately, customers have a hard time figuring out the intrinsic value of products, so they're constantly using cues to establish a sense of what fair value is. Companies can choose to play the pricing game any number of ways, but I highly doubt Netflix and Amazon will choose to make their stand on first-degree price discrimination. There are many other ways they can win that are more suited to their brands and temperaments.

Still, the peanut gallery loves to speculate that Amazon's long-term plan is to take out all of its competitors and then start jacking up prices. A flurry of speculation that the price hikes had begun spun up in July this year after an article in the NYTimes: As Competition Wanes, Amazon Cuts Back Discounts. After the NYTimes article hit, many jumped on the bandwagon with pieces bearing titles like Monopoly Achieved: An invincible Amazon begins raising prices.

If you read the NYTimes article, however, the author admits, "It is difficult to comprehensively track the movement of prices on Amazon, so the evidence is anecdotal and fragmentary." But the article proceeds onward anyhow, using exactly that anecdotal and fragmentary evidence to support its much more certain headline.

Even back when I was at Amazon years ago, we had some longer-tail items discounted less heavily than bestsellers. However, pricing the long tail of books efficiently is not as easy as it sounds: there are millions of book titles, and most of the bandwidth the team had for managing prices was spent on frontlist titles, where the competitive pressure was greatest. All the titles listed in the NYTimes article sound to me like examples of long tail titles that were discounted too aggressively for a long period because of limited pricing-management bandwidth and are finally being priced at the real market price for such books. Where in the real world can you find scholarly titles at much of a discount?

The irony is that the authors cited in the article complain their titles aren't discounted enough, while publishers ended up in court with Amazon over Amazon discounting Kindle titles too much. This is to say nothing of the bizarre nature of book pricing in general, in which books seem to be assigned retail prices all over the map, with the most tenuous ties to any intuitive intrinsic value. The publishers set the retail price, then Amazon sets its price off of that retail price. If publishers want the discount on their books to look bigger, they could just raise the retail price and voila, the discount would be larger: a book selling for $24 against a $30 list price is a 20% discount, but against a $40 list price that same $24 is a 40% discount.

To take another category of products, DVDs: soon after we first launched the DVD store, long tail titles like Criterion Collection DVDs were reduced from a 30% discount to a 10% to 15% discount. But just now, I checked Amazon, and most of its Criterion DVDs are discounted 25% or more. If I'd taken just that sample set, I could easily write an article saying Amazon had generously decided to discount more heavily as part of its continued drive to return value from its supply chain to customers.

Could the net prices on Amazon be increasing across the board? I suppose it's possible, but I highly doubt that Amazon would pursue such a strategy, and any article that wanted to convince me that Amazon was seeking to boost its gross margins through systematic price hikes would need to cite more than just a few anecdotes from authors of really long tail books. 

It will remain a tempting narrative, however, because most observers think it's the only way for Amazon to turn a profit in the long run.

However, that's not to say big data hasn't benefitted companies like Amazon and Netflix in extraordinary ways. They know far more about each of their customers than any traditional retailer, especially offline ones, because their customers transact with them on an authenticated basis, with credit cards. Based on their customers' purchase and viewing habits, both companies recommend, better than their competitors, products their customers will want.
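As a toy illustration of what that kind of purchase data enables, the classic "customers who bought X also bought Y" recommendation can be approximated with nothing more than co-occurrence counts. The baskets below are invented, and this is not a description of either company's actual systems, which are far more sophisticated and operate at enormous scale.

    from collections import defaultdict
    from itertools import combinations

    # Invented purchase histories; real systems work from authenticated order
    # and viewing logs for hundreds of millions of customers.
    orders = [
        {"kindle", "kindle case", "novel A"},
        {"kindle", "novel A", "novel B"},
        {"novel A", "novel B"},
        {"kindle", "kindle case"},
    ]

    # Count how often each pair of items shows up in the same order.
    co_counts = defaultdict(int)
    for basket in orders:
        for a, b in combinations(sorted(basket), 2):
            co_counts[(a, b)] += 1

    def also_bought(item, top_n=2):
        # "Customers who bought X also bought..." via simple co-occurrence counts.
        scores = defaultdict(int)
        for (a, b), n in co_counts.items():
            if a == item:
                scores[b] += n
            elif b == item:
                scores[a] += n
        return [i for i, _ in sorted(scores.items(), key=lambda kv: -kv[1])[:top_n]]

    print(also_bought("kindle"))   # ['kindle case', 'novel A'] on this toy data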

Offline retailers now all want the same type of data on their customers, so everyone from your local drugstore or grocery store to clothing retailers and furniture stores tries to get you to sign up for an account, often by offering discounts if you carry a free membership card of some sort.