Guest author Alex Salkever is the global product manager of cloud computing/IaaS at Telefónica.

Intel’s just-announced Edison computer—the one the size of an SD card—illustrates the corollary to Moore’s Law. If you can jam more and more circuits on a piece of silicon, you can also make computers that are smaller and smaller. And this heralds a whole new way of thinking about computing, clouds and connectivity.

Intel, to its credit, mainly envisions the new Edison computer as a vehicle for building wearable computing devices. That's the right call: it's the perfect size and shape for a new generation of wearables with powerful computing capabilities onboard. This translates into an ever looser tether between smart devices and hub devices such as the smartphone.

Equally important, however, are the additional possibilities around retrofitting, cloud computing, and connectivity. Let's start with retrofitting. While it's one thing to populate the planet with a wave of new Fitbits and other wearables, the reality is that we have a ton of legacy devices that many people will continue to use for years. Most of those are only marginally connected, perhaps via Wi-Fi, if they're connected at all. Their onboard processors, meanwhile, are aging, underpowered and impossible to upgrade.

Make that almost impossible. What if every Canon camera could accept an SD-shaped microcomputer that could direct the camera's Wi-Fi chip to automagically upload pictures tagged for specific purposes to Amazon S3 for immediate processing and posting to a production website?

This type of capability can be hacked right now, no doubt, by serious users. But the ability to ship an SD-shaped computer that “just works” and does things to a DSLR that users themselves would be hard-pressed to do is an impressive possibility, and one that unlocks many other great ideas.
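To make the camera scenario concrete, here is a minimal sketch of the selection logic such a plug-in computer might run. The tag names, bucket name, and key layout are all hypothetical, and the actual network transfer (which on a real device might go through a library like boto3) is deliberately left out:

```python
# Hypothetical sketch: decide which photos an on-camera microcomputer
# would push to S3, based on tags the user assigned in-camera.
# Tag names, bucket, and key layout are illustrative only.

def plan_uploads(photos, wanted_tag, bucket="example-production-site"):
    """Return (filename, s3_key) pairs for photos carrying wanted_tag.

    `photos` is a list of dicts like {"file": "IMG_0042.JPG", "tags": {"web"}}.
    """
    plan = []
    for photo in photos:
        if wanted_tag in photo.get("tags", set()):
            # Keyed by tag so the site's processing job can find them.
            plan.append((photo["file"], f"{wanted_tag}/{photo['file']}"))
    return plan

photos = [
    {"file": "IMG_0042.JPG", "tags": {"web"}},
    {"file": "IMG_0043.JPG", "tags": {"personal"}},
]
print(plan_uploads(photos, "web"))
```

The point is that the interesting part is policy, not plumbing: a tiny onboard computer lets the camera itself decide what goes where, rather than waiting for a PC or phone to pull the files off.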

Then there is the possibility of cloud computing. My friend Jason Hoffman of Ericsson has talked about building a cloud powered entirely by the spare processing capacity in smartphones. This is a fascinating idea, and one that isn't as far off as people might think. In theory, a company could first tap the spare compute pool of employees' phones on site, or those of people in the building, for example, before bursting into a public compute cloud or using additional server resources.
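The "local pool first, then burst" policy described above is easy to sketch. This is an illustrative toy, not Hoffman's design; the device names and slot counts are made up:

```python
# Illustrative sketch of a burst policy: place work on idle on-site
# phones first, and fall back to a public cloud only when the local
# pool runs out. Device names and capacities are invented.

def schedule(tasks, phone_slots, cloud_label="public-cloud"):
    """Assign each task to a free phone slot, else to the cloud.

    `phone_slots` maps device name -> free task slots.
    Returns a mapping of task -> device (or cloud_label).
    """
    placement = {}
    for task in tasks:
        local = next((d for d, free in phone_slots.items() if free > 0), None)
        if local is not None:
            phone_slots[local] -= 1
            placement[task] = local
        else:
            placement[task] = cloud_label  # burst past the local pool
    return placement

slots = {"alice-phone": 1, "bob-phone": 1}
print(schedule(["t1", "t2", "t3"], slots))
```

A real system would have to handle phones leaving the building mid-task, battery budgets and trust, which is where the hard engineering lives, but the scheduling shape is this simple.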

The flip side of something like this is a connectivity mesh powered by the Internet of Things. This is, to a degree, what Open Garden (an effort to share Internet connections between devices in radical new ways, sidestepping bandwidth and other limitations) is trying to do with its open networking application and SDKs, which let developers easily mesh and connect Web-enabled devices to each other and the broader Internet.

Both Open Garden and the Internet of Things, however, suffer somewhat from questions of scale and connectivity, in part because there just aren't enough gadgets around that are ready and able to connect with each other. When it becomes very, very easy and justifiable to add a tiny computer to lots of devices, then the idea of a device-powered cloud becomes far more viable, just because of the vast increase in potentially addressable compute capacity.

Similarly, the prospect of ubiquitous tiny computers could dethrone the smartphone from its emerging role as a "hub" device for the Internet of Things—and that's a good thing. Essentially, if you can move the network node structure out of the smartphone and into anything that's connected (or that can be connected) to a relatively intelligent and configurable computer, then all of a sudden a true mesh network emerges—one that's most likely much less dependent on human interaction to function, because no one has to take their phone out of their pocket to make the thing go.
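The hub-less property is the essence of a mesh: any gadget can relay for its neighbors, so reachability depends only on the link graph, not on a phone sitting in the middle. A toy illustration, with invented device names:

```python
# Toy illustration of a hub-less device mesh: a thermostat reaches the
# Internet gateway by hopping through a camera, with no smartphone in
# the path. Device names and links are invented.
from collections import deque

def reachable(links, start):
    """Return all nodes reachable from `start` over peer-to-peer links."""
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in links.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(neighbor)
    return seen

links = {
    "thermostat": ["camera"],
    "camera": ["thermostat", "gateway"],
    "gateway": ["camera"],
}
print(sorted(reachable(links, "thermostat")))
```

Add a tiny computer to more devices and the graph gets denser, which is exactly why the mesh gets more viable as the hardware gets cheaper and smaller.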

This could exist as a nearly passive entity, an underlying framework that rides beneath all our networks. Granted, this is Matrix talk, and it's still early days. But the smaller the computer and the greater its power, the bigger the possibilities.