Version 2/13/05
You will note that there are links from many of the words and phrases in this text. By clicking on these links, an additional browser window will open. Many of these links will take you to a Dictionary of Terms, which explains the word or phrase (other links take you to other relevant documents). The purpose of the Dictionary of Terms is to allow many users, with different levels of knowledge, to benefit from the text. If there is a term in the text you feel should be better explained, please email me at alex@udel.edu and I will include it.

Markets, Pricing Models and Digital Economics

Evolution of Markets

A hundred years ago dynamic pricing was common practice, as trade occurred in local markets, face-to-face between buyers and sellers (many-to-many communication). This practice still occurs in some markets in certain regions of the world; an example can be observed in the Gold Souks of Dubai, where the price associated with a product is merely a starting (high) point for a series of haggles between buyer and seller.

As mass marketing and mass production, outgrowths of the industrial revolution, evolved, separation of supply and consumption was inevitable (consumers no longer went to a market to deal directly with the producer/hunter) and it became more difficult to associate a dynamic price with individual products. Thus fixed pricing has become the norm in most markets.

As markets have become more competitive, consumers have gained more power in the relationship between business and customer, as illustrated by the evolution of the marketing concept (from the Production Era to the One-to-One Era). Technology has allowed companies to target narrower niches, and has given customers access to wider choices in terms of with whom they can trade. On top of this, the hypertext nature of the Web has allowed negotiations to take place, via this medium, that could previously occur only in face-to-face markets (many-to-many communication). This has allowed the medium to simulate the marketplace bazaar environment we were used to trading in 100 years ago, but now on a global scale! The natural evolution of this new medium is a shift back to an environment that supports dynamic pricing.

For the same reasons the internet is allowing for a more "bazaar-style" marketplace (hypertext, etc.), customers are now able to develop markets for their own products, selling to other customers. Thus second-hand markets are becoming more robust in many marketplaces. This evolution is a stark improvement on the fragmented second-hand markets of the offline world (second-hand stores and charities, "seconds" stores and second-hand car dealers, for example). Now, with Ebay, customers can retail literally anything (rather than storing it in a basement); with Amazon.com, a second-hand book platform lets customers buy and sell used books; and with Carmax, a market for trading used cars is developing (a few interesting examples).

These evolving markets benefit customers by allowing for the purchase of a cheaper product, and for the freedom to resell an existing product (creating more value from the initial purchase). Manufacturers (publishers, etc.) have voiced concern about the cannibalization impact of these markets on new-product sales. Their argument is that this takes needed revenue away from the original developers of the product, which reduces their ability to develop new products. The counter-argument is that these markets actually expand the overall market for these products by reaching more price-sensitive consumers, thus stimulating interest that may eventually lead to new-product purchases. A similar debate exists with the filesharing-based marketplaces; the difference is that these second-hand markets do not involve copyright infringement. When we purchase a product, we have the right to resell it (in most cases; this is not the case with software, whose use is limited by its license: you do not actually own the software you buy). This right was difficult to exercise in the pre-web era; that is no longer the case.

These emerging online marketplaces also serve a role in disciplining the more fragmented offline markets (whether for new or second-hand products). Since customers can now compare prices with an online equivalent, traditional geographic monopolies can no longer offer prices that are out of line with these markets.

***Discussion Topic: Comment on your experiences with purchasing products second-hand. What types of products have you purchased, and what was your level of satisfaction with the process?

Types of Dynamic Pricing Models Evolving

As noted, dynamic pricing is not new, and the models that are evolving all have developed from traditional pricing models. The following are the more common dynamic pricing models.

Auction Pricing Model
The Auction Pricing Model is similar to a regular auction. An example of its popularity can be seen in the phenomenal growth of Ebay. It is becoming very popular as it replaces classified advertisements as a medium for selling goods. This makes perfect sense, as classified advertisements require the seller to fix a price before knowing what the market will bear. This creates two likely outcomes, and one very unlikely outcome. The seller can leave money on the table by fixing a price too low (many people try to purchase, but the price is already established); this also makes it likely that the buyer who places the most value on the product (and hence would have paid the most for it) does not ultimately purchase it. Alternatively, the seller may fail to sell the product at all by striking a price point that is too high, with no demand at that price. A way around this second issue is to offer the price "or best offer", but this still eliminates negotiation between seller and buyers, and does not allow buyers to respond to each other's offers. The more unlikely scenario is that the seller establishes exactly the right price, maximises revenue from the transaction, and ensures that the buyer who assigned the highest value to the product receives it.
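The auction dynamic described above can be sketched in a few lines of code. This is a toy model of an ascending (English) auction, and the bidder valuations below are hypothetical numbers, not figures from the text:

```python
# Toy sketch of an ascending (English) auction: the price rises until only
# the highest-valuation bidder remains willing to pay it.
def english_auction(valuations, increment=1):
    price = 0
    while sum(v > price for v in valuations) > 1:
        price += increment
    return max(valuations), price

# Hypothetical private valuations for four interested buyers.
winner_value, price = english_auction([40, 55, 72, 65])
print(winner_value, price)  # -> 72 65
```

The buyer with the top valuation (72) wins, paying roughly the second-highest valuation (65): the seller leaves little money on the table, and the product goes to whoever values it most, which is exactly the outcome a fixed classified-ad price cannot guarantee.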

Reverse Pricing Model
Priceline has pioneered the Reverse Pricing Model for the consumer. Consumers can establish a price they are willing to pay for a product, and the company can then determine whether to agree to that price. Thus we are essentially marketing our requirements to the marketplace, and the market then competes to determine whether the requirements can be satisfied.

For each of these models to become more robust, such that we can truly return to a world of dynamic pricing (where it is appropriate), industry standards and intelligent agents need to evolve. This will then reduce the effort required on the part of the purchaser and seller to search the markets for ideal trades. If the markets are fragmented, and require significant search costs on the part of both buyers and sellers, then this marginalizes the impact of dynamic pricing. Currently, dynamic pricing does require a significant cost on the part of the consumer (in terms of effort required), and this hinders its growth.

**Discussion Topic: Comment on your experiences with dynamic pricing processes.

Types of Markets for Dynamic Pricing

Dynamic pricing models have evolved in certain markets to this point. The following markets are the early adopters.

Business-to-Business Trade
The supply chain has been able to take advantage of dynamic pricing in a couple of different areas. Disposing of excess inventory has traditionally been a major loss for companies, who may be willing to accept 5 cents on the dollar for inventory that will not be used. Now they are able to set up an auction and reach a wide audience via industry meta markets. They are also able to purchase other companies' excess inventories, at deep discounts, that may be of more value to them; thus a secondary market can evolve.

A second area that is proving successful is a version of the reverse pricing scenario. Companies can use the web, in the form of an extranet, to communicate to their suppliers the requirements they have for products. These "specs" can then be bid on by a wide range of suppliers that are perhaps more able to get involved in the bidding process than they were before the internet. Previous to the development of extranets, there were significant costs associated with engaging suppliers in a company's supply chain, in terms of efficiency of communication. The entire process of communication was also very cumbersome for the few that were involved (significant paper trails required, etc.). These inefficiencies translated into very inefficient supply chains, where relationships were established and sustained because it was easier to deal with the same organizations than to incur significant switching costs in developing relationships with other organizations. While relationships do carry value, many are built for the wrong reasons and sustained through business inertia.

**Discussion Topic: For those working with business-to-business relationships, how has your company leveraged the internet in terms of its relationships with its supply chain?

Collectibles
Collectibles are another area that has taken advantage of dynamic pricing. A major portion of Ebay trade has been in collectibles. This makes sense considering that their traditional mode of trade is through markets that already have some form of dynamic pricing, and are fragmented. Auction sites, like Ebay, are able to bring a much wider audience together (reducing the fragmented nature of the marketplace), which makes the site more valuable for each participant (network effects) to a much greater extent than a physical auction "event" can. As the market for auction-style trading matures, more traditional industries will start adopting it as a dynamic pricing model. Business-to-business trade, as noted above, has begun to adopt this model. This makes sense, as there are fewer known potential "traders" than in business-to-consumer markets.

**Discussion Topic: For those Ebay users, what type of products have you traded and how was your experience with Ebay?

Cost Structures
Products with cost structures that include significant sunk costs are starting to be marketed using reverse pricing models as a means of selling excess inventory. Priceline has taken advantage of this need for the airline industry, the hotel industry and other products with similar cost structures. Products sold this way add some revenue for the seller that would not have materialized through traditional channels. It is important that this channel is separated from other channels to consumers, to avoid cannibalization of traditional channels and migration of revenues to the discounted channel; thus this channel should only attract price-sensitive customers. More on cost structures appears below.

Dynamic Pricing vs. Fixed Pricing

The question remains whether all products should return to a dynamic pricing scenario. Simply because this model works on the internet does not mean it makes sense for all types of products. It incurs considerable expense for the consumer in terms of time and effort, and costs for marketers in terms of extra effort and uncertainty of revenue. The question is: is this expense returned in the value received from the exchange? As noted, intelligent agents and industry standards can reduce the costs of the exchange for the consumer, and provide a wider market search, but it must be recognized that dynamic pricing is not ideal for all transactions.

**Discussion Topic: Determine which types of transactions will not be best suited to a dynamic pricing scenario, and establish the reasons why fixed pricing is better in the specified scenario.

Cost Structures

Cost structures of digital products are very different from those of traditional manufactured goods. Costs are typically made up of fixed costs and variable costs. Fixed costs (such as rent on a manufacturing site) are fixed over a given number of units produced; therefore, the more units that are sold, the lower the average fixed cost per unit. Variable costs are incurred per unit produced; examples include the materials for the product.

With digital products a large component of the cost structure is fixed, and sunk. Sunk costs are non-recoverable fixed costs, including research and development and human capital. Since these costs are sunk, they should not be considered in future pricing decisions for the product. Digital products typically have small variable costs, and can have zero variable costs, assuming the product (software, for instance) is being marketed directly from the web site, with no distribution or packaging costs. Thus a cost structure that is mostly fixed (and sunk), rather than variable, affects the decisions one can make about marketing the product. Consider how near-zero marginal costs play out in a competitive market:

In competitive markets, where more than one firm/product competes for the same consumer needs, competition will tend to drive the price of the products close to marginal cost, since any revenue over the margin contributes directly to recovering the sunk costs and then to profit. Suppose two software packages are competing for the same market (accounting packages for lawyers, for example), and assume they are the only players in this niche. Their products have identical functionality. Firm A spent $2 million developing its package, and retails it over its web site; it has marginal costs of $1.50 per unit. Firm B spent $4 million and retails its package over its web site; its marginal cost per unit is $1. All fixed costs are considered sunk costs in this case. Each dollar gained from the price (over the marginal cost) of the product contributes to the company, first covering fixed costs and then profit.
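The contribution logic for the two firms can be made concrete with a short calculation. The $50 price point below is an assumption for illustration (the text does not specify a price); the sunk and marginal costs are the figures given above:

```python
# Per-unit contribution and break-even volume for Firms A and B.
PRICE = 50.0  # assumed market price per unit (not given in the text)

firms = {
    "A": {"sunk": 2_000_000, "marginal": 1.50},
    "B": {"sunk": 4_000_000, "marginal": 1.00},
}

for name, f in firms.items():
    contribution = PRICE - f["marginal"]   # margin over marginal cost per unit sold
    breakeven = f["sunk"] / contribution   # units needed to recover the sunk costs
    print(f"Firm {name}: ${contribution:.2f}/unit, break-even at {breakeven:,.0f} units")
```

At this assumed price, Firm A recovers its sunk investment after roughly 41,000 units and Firm B after roughly 82,000; every unit sold beyond that is almost pure profit, which is why volume matters so much when costs are sunk.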

**Discussion Topic: Which company, A or B is likely to be the stronger competitor in the marketplace? What are the long term implications of these costs structures to the competitive nature of the marketplace? What assumptions do you need to make?

**Discussion Topic: Think of your own examples of products that have zero marginal costs?

Product Differentiation

Due to the unique cost structures of digital products (high fixed costs, close to zero marginal costs), there are some interesting possibilities for differentiating products in the marketplace. If one wanted to introduce multiple products into the marketplace, to satisfy different needs of different customers, and one were producing automobiles, then there would be significant fixed and variable costs associated with each style of product introduced. For digital products this is not the case. The research and development is applied to developing the core product; this product is then altered to satisfy different markets. For the most part, the core product will be the most sophisticated product offered to the market (a high-end spreadsheet package for tax professionals, say), and the low-end product uses the same code-base, with limitations ADDED. This is interesting to note, as the low-end product (sold at the lower price) is actually the most expensive of the products to produce (given the additional work required to limit its capabilities). Clearly, this work could be avoided, to increase the margins on the low-end product, but this would allow the high-end market to purchase the low-end product.

A simple example can illustrate how this works. The goal is to maximize the revenue generated by the product, thereby maximizing profits (assuming zero marginal costs):

Software version casual user: Sold at $50, size of market 10,000
Software version student user: Sold at $30, size of market 50,000
Software version professional user: Sold at $150, size of market 50,000

This gives a total revenue of: $500,000 + $1,500,000 + $7,500,000 = $9.5M
Assume zero variable costs and fixed costs of : $1M
Thus gross profit = $9.5M - $1M = $8.5M

Assume that instead, the firm decided to launch the product without differentiation, offering only the high-end product at $150. What are the consequences? Fixed (sunk) costs will be reduced, because there are lower development costs associated with producing only one version; thus, fixed costs are reduced to $900,000. However, due to the high price point, there is only demand from the professional user, generating revenue of $7,500,000 (150 x 50,000). Thus the product generates a profit of $6.6M.
Now assume the firm introduces the product at $30, but again with all the functionality in place. This again saves $100,000 in fixed costs. The entire market will demand the product, and will have access to it at $30. Revenue = 30 x 110,000 = $3.3M. Profit = $3.3M - $900,000 = $2.4M.

Thus both scenarios leave the company with less opportunity to be as successful.
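The arithmetic above can be collected into a short worked sketch (figures are the ones from the text; zero marginal cost is assumed throughout):

```python
# Three-version strategy: price and market size per segment.
versions = {
    "casual":       (50,  10_000),
    "student":      (30,  50_000),
    "professional": (150, 50_000),
}
FIXED_COSTS = 1_000_000  # development cost when building three versions

revenue = sum(price * size for price, size in versions.values())
profit = revenue - FIXED_COSTS
print(revenue, profit)   # -> 9500000 8500000

# Single-version alternatives (fixed costs fall to $900,000 per the text):
high_only = 150 * 50_000 - 900_000   # only professionals buy -> 6600000
low_only  = 30 * 110_000 - 900_000   # whole market buys at $30 -> 2400000
print(high_only, low_only)
```

Versioning earns $8.5M against $6.6M for the high-price-only strategy and $2.4M for the low-price-only strategy, which is the point of the comparison.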

Product Differentiation to Market Products

Given that digital products have small (or zero marginal costs) it is possible to give-away (or significantly discount) a version of the product in order to entice people to purchase a paid version of the product in the future, as they become more sophisticated users of the product. In the above scenario, if the marketer reduced the student version to zero, it would lose $1.5M in revenue, but still realise a profit of $7M. If this strategy increases the number of student users, which in turn increases the likelihood of increased conversion to professional users, then in the long term, this may make sense.

The only revenue the company is losing with this strategy is the opportunity cost of not selling to the 50,000 students ($1.5M). The hope is that the gain from the additional students who adopt the free product (through future lock-in and upgrades) is greater than that $1.5M. Blogger.com has adopted this strategy, as it has a very basic version (blogspot) that is free to use for all web users. The problem marketers can face by doing this is that if the free version is very limited in its functionality (it has to be somewhat limited, or there is no incentive to upgrade to the paid version), this may create a poor perception of the product in consumers' minds.
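The give-away economics can be made concrete with the figures from the versioning example. Framing the question as "how many upgrades to the professional version are needed to break even" is my own simplification of the argument above:

```python
# Revenue forgone by making the student version free, and the number of
# extra professional upgrades needed to recover it.
STUDENT_PRICE, STUDENTS = 30, 50_000
PRO_PRICE = 150

forgone = STUDENT_PRICE * STUDENTS      # revenue given up: $1.5M
upgrades_needed = forgone / PRO_PRICE   # professional sales to break even
print(forgone, int(upgrades_needed))    # -> 1500000 10000
```

If the free version eventually converts at least 10,000 additional users to the $150 professional product, the strategy pays for itself, before even counting lock-in effects.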

**Discussion Topic: Can you think of other examples of product differentiation?

Installed-Base

Installed-base is the term used to describe the marketer's customer base. The value of the installed-base to the marketer is essentially the value that can be placed on the company. Thus as marketers build their installed-base, they will consider the following issues.

Lock-in refers to the strategies/tactics in place to more closely align the customer with a particular product in the marketplace. A marketer locks customers into their product when costs (switching costs) are associated with the customer selecting another product. In its most basic form, lock-in can come from the marketer having data on the customer, such that the marketer can use this data to better present their products to the customer on return visits. Ebay is a great example of establishing lock-in for its customer base. Its customers are essentially both the buyers and the sellers that trade on Ebay. Once you have performed a transaction on Ebay, you have a historical record that other users can use to determine whether they want to trade with you. Once you have built up a good reputation on Ebay, you have created a cost associated with moving to another online auction service: your established reputation, which in turn translates into dollars, becomes a cost for you to switch to another online auction system. Other costs include the need to learn an unfamiliar user interface and the need to offer personal data via a sign-in form.

Switching Costs refer to the costs of a customer (or an entire customer base) switching from one competing product to another. Telephone companies are a good example of companies that estimate the switching costs of consumers, and then develop incentive programs to encourage consumers to switch from one competitor to another. (Surely you have experienced the phone calls encouraging you to switch!) Once the telephone company estimates that switching cost, and calculates the life-time value of a customer, the company will try to grow its installed-base by encouraging customers to switch. As long as the switching cost is less than the life-time value, there is incentive available to encourage a switch. If lock-in can be established within the customer base after the switch, this clearly increases the life-time value of the customer, as it increases the switching costs to another service.

The Life-Time Value of a customer is the value placed on a customer, to the company, for the life-time that customer will remain with the company. This calculation is often used to estimate the overall value of a company. It increases as alternative revenue streams are established from the installed-base. For example, if the telephone company is able to sell additional services to its installed-base, on top of the traditional phone service, then it is able to increase this value. It may also sell data on its installed-base to third-party marketers who are interested in selling additional products, and so on.
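The switch-incentive logic described above can be sketched as a small decision rule. All of the dollar figures below are hypothetical, chosen only to illustrate the comparison:

```python
# How much a marketer can afford to spend luring a customer away from a
# competitor: the incentive must at least cover the customer's switching
# cost, and is only worth offering while that cost is below life-time value.
def incentive_headroom(lifetime_value, switching_cost):
    return max(lifetime_value - switching_cost, 0)

print(incentive_headroom(lifetime_value=1200, switching_cost=200))  # -> 1000
print(incentive_headroom(lifetime_value=150,  switching_cost=300))  # -> 0
```

In the first case there is $1,000 of headroom to fund sign-up bonuses or discounts and still profit over the customer's lifetime; in the second, the switching cost exceeds the life-time value, so a subsidy can never pay off.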

**Discussion Topic: Is the auto industry able to establish high switching costs? Why or why not? How about the airline industry? What do they do to create switching costs?

Network Effects

Digital products that benefit from connectivity may experience network effects. The idea behind network effects is that as the number of users of the network increases, the value of the network to each user also increases. This is also known as increasing returns to scale, or demand-side economies of scale. This is clearly a different phenomenon from the diminishing returns to scale we are used to from the industrial age. Diminishing returns holds that as the number of users increases, the value to each user diminishes. (Imagine if everyone owned a Porsche; the prestige of being a Porsche owner would lose value for each owner.)

The Fax machine is an excellent illustration of network effects (and increasing returns). When the fax machine was first introduced, it was an expensive product, and the product needed additional owners for it to be useful. Thus, the buyer of the first fax machine had no use for it, since there was no one with whom to communicate (no value). As more fax machines were purchased, the value of each fax machine to each user increased. This also drove down the costs of developing the fax machines (economies of scale and learning curve effect) which reduced the price of the fax machines, which increased the demand for fax machines, which increased the value of each machine for each user (and so it goes on). The positive spiral that occurs enabled the fax machine to become an industry standard for transmitting digital documents.

This is also known as Metcalfe's Law: the value of the network to each user is proportional to the number of other users. The total value of the network is proportional to n x (n - 1) = n^2 - n, where n is the number of users in the network.
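Metcalfe's Law is easy to see in a few lines of code. The per-link value of 1.0 below is an arbitrary scaling assumption; only the growth pattern matters:

```python
# Total network value grows as n * (n - 1), i.e. roughly n squared:
# each of the n users can link to the other n - 1 users.
def network_value(n, value_per_link=1.0):
    return n * (n - 1) * value_per_link

for n in (10, 100, 1000):
    print(n, network_value(n))
```

Going from 10 users to 100 (a 10x increase) takes total value from 90 to 9,900 (roughly 100x), and doubling any user base roughly quadruples total value, which is the positive spiral described for the fax machine above.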

Network effects are one of the reasons first-mover advantage can occur. As the first player in a new market, if the marketer can take advantage of network effects and create a positive spiral, they can make it very difficult for other marketers to enter the market. The first-mover will also move down the learning curve very quickly, and this will reduce the average cost of the product, creating margins that later entrants into the market will find difficult to compete with. It is important that the first-mover tries to establish lock-in. Clearly this does not always work effectively; Microsoft has done a wonderful job of never being first to a market! (Netscape, the Mac)

The combination of network effects (the more users, the more valuable the product to each user) and the learning curve effect (the more units developed, and the more experience developing the product, the lower the average cost of the product) leads to a "winner takes all" scenario, where markets work more effectively if one company (standard) controls the entire market. In fact, as competing companies in the market go head to head, once one company reaches its tipping point and experiences increasing returns due to positive spirals, the competing company will experience a negative spiral as it loses customers and its average cost per unit increases. One can also argue that it is economically inefficient to have more than one company compete in a marketplace for digital products, where all costs are fixed and sunk. Each competing firm simply adds to the total investment made in developing and creating the product (thus the sunk cost of product development is similar for each competing firm, and can only be recovered through that firm's return, determined by its market share (volume) and price).

These same effects are not relevant for industrial goods, whose producers typically experience diminishing returns once they reach a certain size. Thus we see many traditional industries operating as oligopolies (the car industry, for example) as they discover their optimal size of operation.

**Discussion Topic: Should digital markets be controlled by one company?

Standards

Digital marketplaces are much more efficient once a standard for the marketplace is established. Standards are able to increase the overall size of the market, as the market itself increases its utility for each consumer. WINTEL (Windows and Intel) is a proprietary standard that has helped consumers communicate with each other. It is comforting to know that the receiver of a document will be able to read the document a sender e-mails. Imagine if there were several competing software vendors, marketing incompatible software, competing in the office suite market. It would create a fragmented market, which would undoubtedly reduce our ability to communicate efficiently. Traditional market thinking suggests several competing players would create a better marketplace; here we see that developing a standard actually creates a more robust market. In the above case the standard is proprietary (controlled by commercial organizations), but this is not always the case.

TCP/IP is a communications standard that allows any computer to talk to any other computer on the internet. Without this communications protocol, we would not have the internet. Thus TCP/IP is a non-proprietary standard that has enabled the worldwide internet. A non-proprietary standard (unlike the Wintel proprietary standard) is open to the public, and not owned by one company.

A couple of examples of standard setting to establish markets a few years ago can help illustrate some interesting points.

The standard railway gauge in the United States was not always so. Many years ago, for goods to pass all the way across the country, they had to travel on one rail track, then be unloaded and reloaded onto another train to continue, due to the different rail gauges in use. (At one point, there were seven different gauges in use!) Clearly this limited trade, and created industries within communities whose work relied on transferring goods from one railway line to another. Once the railway gauges were made consistent, trade immediately increased.

** Discussion Topic: What is the history of the now-adopted standard 4 foot 8 1/2 inch railway track gauge? Where did this gauge originate, and why?

The QWERTY example is an interesting case study of an entrenched standard. The keyboard layout that we are used to was developed many years ago in order to slow down our ability to type. (The QWERTY layout also allowed early salespeople to impress their customers, since they could type the brand name, "typewriter", using the top row of the keyboard only!) Typewriter keys in the early days were prone to getting stuck together if the typist typed quickly. Soon, typewriters overcame this technical glitch, but QWERTY has stayed. In fact, a more effective keyboard design (the Dvorak layout) was introduced, but never had a chance to become a standard. Even though it is easier to teach new users to type using the new layout, companies would not be encouraged to adopt the new design, since their current employees were familiar with the old design, and new users would rather learn a keyboard layout that gave them skills they could transfer from one organization to another. Thus while learning the new design may have been easier, it had much less value to the user. The collective switching cost of moving users from one standard to another was too high.

The development of the internet is another interesting case study in standards. Many companies attempted to develop the proprietary standard for online networks. These include CompuServe, GEnie and Prodigy (no longer living!) and AOL and Microsoft (now gateways to the internet). From the early 1980s to the mid 1990s these companies competed with proprietary standards to develop their online communities. Each developed its effort with a blind eye turned to the internet. This changed in the mid 1990s, when realization set in that the open-standard internet was going to become the network. Even subsequent to this, Microsoft believed that it had the power to force Word to become the standard language of the web (replacing HTML). This clearly did not happen. This is a good example of an open standard becoming more robust, more quickly, than many competing proprietary standards.

When developing a new product in a new marketplace, a significant consideration a company faces relates to standard setting for the market. Should the company try to establish a proprietary standard (as FrontPage did for web authoring tools), or develop an open standard to which competing companies would have access? The second strategy assumes that an open standard will allow the marketplace to grow faster, and that the pioneer will be able to take first-mover advantage.

Creating a standard is a very political process that is better understood with some knowledge of game theory.

**Discussion Topic: Identify a company which developed an open standard, successfully. Identify markets in need of a standard.