
How Businesses Can Use P2P

Almost every description of P2P in the context of business infrastructure starts something like this: “P2P is notorious for…” This comes from many years of people associating P2P with illegal downloading, to the point that the terms are now almost synonymous. Such an association is unfair, however, because no one equates TCP/IP with crime, despite the fact that TCP/IP is the protocol of choice for many cyber-criminals.

Rather than resorting to outdated and inaccurate definitions, let’s start from scratch and consider the following: what is P2P, really? What is it good for? How can we use it to save and earn money?

What’s in a Name?

P2P stands for “peer to peer.” Put simply, it’s one method of establishing communication between parties. Uploading information to Google Docs to share it with colleagues is not P2P, but sending the same information as an email attachment is, despite the fact that mail servers are involved. In this context, P2P doesn’t mean “serverless communication” so much as “communication that is perceived to be serverless.” Like email, instant messaging (IM) is considered a P2P technology because, even though servers are used quite extensively, there is no explicit act of uploading data to an intermediate location. With both email and IM, the servers work behind the scenes, so to speak.

Thus, “P2P” is as much a social term as a technical one. It connotes a grid or cloud of devices that are more or less equal, rather than a constellation of star-like servers with clusters of clients surrounding them. But from the purely technical point of view, there is a distinct difference between true P2P (in which data is not relayed through a server) and perceived P2P (in which data is relayed through a server, but we don’t see it happen).

P2P and the Cloud

As currently implemented, “cloud computing” is really just a new name for old-school client-server computing, except that the servers pretend to be redundant. Clients have little or no actual control over this redundancy and cannot even verify its existence. When a cloud-based service tells you that your data is stored securely, you have no choice but to trust it… or not.

Compare that with a P2P file-sharing network. On a P2P network, every peer can tell which other peers hold complete or partial copies of a given file, what percentage of the file each of them stores locally, and many other details. Doesn’t that seem like a better example of cloud storage? If not, then what is?

Even better, with this set-up you can easily control the level of redundancy: just add another client instance, have it share the same file, and you’ve increased your redundancy by one. You can only reduce redundancy among the peers under your control, however: unless every peer agrees to remove a given file, no one can remove it completely. There have been many ideas about implementing a kind of “delete button” for the web, but the closer we move towards cloud computing, the less likely such a scenario becomes.
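
To make that concrete, here is a minimal sketch in Python of how a peer in a BitTorrent-style swarm can see who holds which pieces of a file and how adding one more full copy raises redundancy. The peers and bitfields are made up for illustration; they don’t come from any real protocol implementation.

    def piece_redundancy(bitfields):
        """Count how many peers hold each piece of a shared file."""
        num_pieces = len(next(iter(bitfields.values())))
        return [sum(bits[i] for bits in bitfields.values())
                for i in range(num_pieces)]

    # Which peers hold which of the file's four pieces (1 = has the piece).
    swarm = {
        "peer-a": [1, 1, 1, 1],   # full copy
        "peer-b": [1, 1, 0, 0],   # partial copy
        "peer-c": [0, 0, 1, 1],   # partial copy
    }

    print(piece_redundancy(swarm))   # [2, 2, 2, 2]

    # Adding one more peer with a full copy raises redundancy by one everywhere.
    swarm["peer-d"] = [1, 1, 1, 1]
    print(piece_redundancy(swarm))   # [3, 3, 3, 3]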

Cloud services are chosen for their convenience (being accessible from everywhere with simple tools) and reliability (with redundant storage in stable data centers). P2P technologies can improve both: convenience, because there is no explicit uploading to or downloading from the cloud; and reliability, because redundancy is directly controllable, which also gives you control over cost.

In most cases, more reliability means higher prices, and not all data deserves the same level of service. With P2P platforms behind cloud services, developers could implement applications that allow multiple storage and processing schemes without much hassle. This is not always good for service providers, because flexible cost control means that customers can scale up and down freely as business and economic conditions demand. But for the industry as a whole, it is definitely a good thing, because it sharpens competition and makes customers more resilient.

What is perhaps more significant about introducing P2P technology into cloud computing is that the P2P cloud would truly be a cloud, not just a 15-year-old client-server technology with a new sticker. If you are inviting us into the cloud, then let us truly be a part of it, instead of leaving us as clients of machinery we can neither see nor control.

P2P as a Social Tool

The traditional Internet (Web 1.0) was built mostly like a television network. Websites were controlled by a small group of content producers (editors, media teams, individual owners, etc.), and millions of users were consumers of that content. The barrier to entry was far lower than it was in the television industry, but it was still mainly a one-way road from the website to the user. The Internet’s infrastructure reflected this: data centers, thick pipes that connected them, and subscriber lines that could download much faster than upload.

Then Web 2.0 came along, and everything started to change. Today, the most popular sites are filled with content created by ordinary users, who care less about owning that content than about sharing it with others. Developers created ecosystems and gave users tools to follow one another and exchange content, and sites became service providers instead of content sources.

What happened to the Internet’s infrastructure, then? Nothing. Imagine YouTube going out of business: its entire video-serving infrastructure would disappear in a day, but the videos themselves would still exist, scattered among the computers of individual users, stuck on cell phones, caught in caches, and so on.

Now take that one step further: instead of YouTube, we have a P2P network, full of videos and convenient tools to watch and upload them. Nobody can close this network or put it out of business. Sufficiently large P2P networks are invincible; if you don’t believe that, ask the MPAA and RIAA.

Web 2.0 is P2P (in a social sense), done with Web 1.0 tools and old infrastructure. To unlock a whole box of new services, we need to upgrade that infrastructure. Just as Gopher was replaced by the WWW, and UUCP was replaced by SMTP, the current star-shaped web infrastructure will be replaced by a mesh-shaped cloud network. Data centers would still exist, but instead of providing bandwidth and servers, they would provide reliability and accessibility. (And a system of measurement would need to be established for both.)

We can call this a “social Internet infrastructure”: an infrastructure that reflects new social behavior, that allows anyone to connect and share content with anyone else while still enjoying sufficient privacy and security. It’s not so much a revolution as an evolution: another step in a process that has been occurring for some time already. It happened to the telephone system: does anyone remember having to call a switchboard operator to be put through to another person? I’ve only read about it in books. It will eventually happen to the Internet, too.

So, Show Us the Money

What is P2P good for, then? To answer that, let’s first look at the potential benefits of an “ideal” P2P implementation, benefits that a business could use for competitive advantage or to implement an entirely new service.

The most talked-about advantage of P2P is its ability to cut bandwidth bills. Imagine how much money companies like Dell, Logitech and Microsoft spend serving downloads of products and updates. Logitech’s generic mouse driver is over 30 MB in size, and countless Logitech mice are in use around the world. A Microsoft Windows service pack can be as big as 300 MB; now think of how many computers run Windows. (Remember that Apple has less than 25% market share, if that helps.) There are also media distribution companies, web accelerators, distributed back-ups; the list goes on. Every one of these companies could dramatically reduce its bandwidth bill by using P2P. Properly implementing P2P content distribution isn’t easy, but when our industry had (almost) more money than it could spend, nobody bothered to try it. Today, money isn’t just an issue; it’s the issue.
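
To put rough numbers on that claim, here is a back-of-the-envelope sketch. Only the 300 MB service-pack size comes from above; the download count, the per-gigabyte price, and the share of traffic offloaded to peers are all invented for illustration.

    # Back-of-the-envelope bandwidth math with made-up numbers.
    PACK_MB = 300                  # size of one service pack (from the text)
    DOWNLOADS = 100_000_000        # hypothetical number of machines updating
    COST_PER_GB = 0.05             # hypothetical delivery price, USD per GB
    P2P_OFFLOAD = 0.80             # assume peers serve 80% of the bytes

    total_gb = PACK_MB * DOWNLOADS / 1024
    baseline = total_gb * COST_PER_GB
    with_p2p = baseline * (1 - P2P_OFFLOAD)

    print(f"client-server: ${baseline:,.0f}")   # roughly $1.46M
    print(f"with P2P:      ${with_p2p:,.0f}")   # roughly $0.29M

The absolute figures are arbitrary, but the shape of the result is the point: whatever fraction of the traffic peers carry comes straight off the bandwidth bill.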

Look at Skype. For it, implementing P2P was not optional but mandatory. Creating a free multimedia service for millions of people without spending tens of millions of dollars on “free” infrastructure wasn’t (and still isn’t) possible. With P2P, Skype was able to provide free phone access without actually subsidizing users (this is not 100% accurate, but accurate enough for this example). All the money it collects from paid users is profit. And yet its creators failed with Joost, not because no one wants free television, but because Joost had a different infrastructure. People were not into sharing TV as much as they were into simply talking to each other. And then came YouTube, which was funded first by venture capital and then by Google.

Another benefit of P2P is that it requires zero configuration. Skype is probably not the best IP phone around, nor was it the first; but you don’t have to be a telecommunications engineer to use it. You download the installer, run it, register yourself as a user, and off you go, from nothing to brilliant conversation in a few minutes.

As with content distribution, implementing a P2P network that requires no configuration isn’t an easy task, but it dramatically reduces the number of users who drop off because they are intimidated by the technology or feel they lack the necessary skills. For many services, this is the difference between 100,000 users and 10 million users, or between going out of business as soon as venture money dries up and being profitable within a year.

Zero configuration has to do with more than just P2P, though. It also implies zero-configuration networking: the ability to connect any device anywhere using any available connection. Unattended sensors, medical IT devices, military computers: none of these should require in-field configuration. The people who use them generally don’t have time to read instruction manuals; they should be able to open the box, insert the batteries, and have a working device within seconds. In those settings, zero configuration is not just the difference between high and low profitability, but between life and death.
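
As a rough illustration of what “open the box and it connects” can mean at the network level, here is a minimal Python sketch of peer discovery over plain UDP broadcast. The port number and message format are arbitrary choices, and a real platform would use something richer (mDNS, SSDP, or a DHT bootstrap) plus authentication; the point is simply that no addresses need to be configured by hand.

    import socket

    PORT = 47474                       # arbitrary discovery port
    ANNOUNCE = b"P2P-HELLO device-42"  # arbitrary announcement payload

    def announce():
        """Broadcast our presence; no addresses or manual config required."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
            s.sendto(ANNOUNCE, ("255.255.255.255", PORT))

    def listen(timeout=5.0):
        """Wait for one announcement and return the sender's address."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
            s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            s.bind(("", PORT))
            s.settimeout(timeout)
            data, addr = s.recvfrom(1024)
            return addr, data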

Even in the case of lower-profile applications, zero-configuration P2P can cut deployment costs tremendously, and well-implemented P2P platforms could reduce those costs to almost nothing. On such a platform, for example, setting up a new message-processing server for a financial system would be as easy as opening the box, putting the server on the rack, and plugging in the Ethernet and power cables; nothing else.

Ubiquitous connectivity also cuts development costs. Message-passing platforms wouldn’t need to account for different types of hosts, relays, connectivity fall-backs, and so on. You would simply confirm that the peer is up and then send it a message. Done. Think of how many networked hosts out there are really just “gateways” between email and text messages, between the Internet and intranets, between X and Y technologies. When any networked device can securely connect with any other, many of these problems simply evaporate. Before TCP/IP took the world by storm, gazillions of networking technologies existed. No one remembers their names, not because they were inadequate, but because TCP/IP was everywhere, and it’s much easier to speak the common language than to teach others a “better” one.
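
The “confirm the peer is up, then send” pattern described above might look something like the sketch below. The Peer class and its methods are invented stand-ins, not any real platform’s API; on an actual P2P platform, the equivalent object would hide NAT traversal, relays and retries.

    from dataclasses import dataclass

    @dataclass
    class Peer:
        """Stand-in for a handle a P2P platform would hand back after discovery."""
        peer_id: str
        online: bool = False

        def is_online(self) -> bool:
            # A real platform would probe reachability, directly or via relays.
            return self.online

        def send(self, payload: bytes) -> None:
            # A real platform would handle routing, relays and retries here.
            print(f"delivered {len(payload)} bytes to {self.peer_id}")

    # The whole sending logic: confirm the peer is up, then send. Done.
    billing = Peer("billing-server", online=True)
    if billing.is_online():
        billing.send(b"new message for processing")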

All of these factors are hopefully pushing cloud networking technologies towards commoditization, which would expand markets, decrease infrastructure costs, and allow companies to deliver better products more cheaply.
