Last month, real-time feed service Superfeedr introduced the option to subscribe to any type of arbitrary content – such as static HTML pages, vCards, JSON and more. This week the company announced the ability to subscribe to just fragments of HTML pages. As an example, Superfeedr explained how to subscribe to only the “current conditions” section of The New York Times weather page.
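Under the hood, this is still a PubSubHubbub-style subscription against Superfeedr's hub. As a rough sketch only: the credentials, callback URL and especially the parameter for naming the fragment (shown here as a CSS selector) are assumptions for illustration, not Superfeedr's documented parameter names.

```python
# Minimal sketch of subscribing to a page fragment via Superfeedr's
# PubSubHubbub-style hub. Credentials, the callback URL and the
# "superfeedr.fragment" parameter are hypothetical placeholders.
import requests

resp = requests.post(
    "https://push.superfeedr.com/",
    auth=("your-login", "your-token"),  # hypothetical account credentials
    data={
        "hub.mode": "subscribe",
        "hub.topic": "http://www.nytimes.com/weather",    # page to watch
        "hub.callback": "http://example.com/notify",      # hypothetical endpoint
        # Hypothetical: limit the subscription to one fragment of the page.
        "superfeedr.fragment": ".current-conditions",
    },
)
print(resp.status_code)  # a 2xx response would mean the subscription was accepted
```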
The basic use case for both subscribing to arbitrary content and subscribing to fragments is obvious: you can monitor when something on the Web changes. For example, if a company you want to work for has a job listings site but no feed for those listings, you could monitor just the part of the page that lists open positions. One commenter on the announcement mentioned using it to monitor price changes. There are some really interesting real-time intelligence-gathering applications that could be built with this.
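The monitoring side is just a callback that Superfeedr notifies when the fragment changes. The sketch below assumes the notification arrives as JSON with an "items" list carrying a "content" field; the real payload format lives in Superfeedr's documentation.

```python
# Sketch of a callback endpoint that receives fragment-change notifications.
# The JSON shape ("items" with a "content" field) is an assumption here.
from pathlib import Path
from flask import Flask, request

app = Flask(__name__)

@app.route("/notify", methods=["POST"])
def notify():
    payload = request.get_json(force=True, silent=True) or {}
    for item in payload.get("items", []):
        content = item.get("content", "")
        # React however you like: diff against the last version, alert on a
        # price drop, email yourself when a new job listing appears...
        Path("fragment.html").write_text(content)
    return "", 200

if __name__ == "__main__":
    app.run(port=8080)
```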
But what are some less obvious uses?
It seems like you could use this to embed automatically updating content from other sources in your own Web projects. For example, if you wanted a weather widget on your dashboard, you could use it to pull the current conditions from The Times whenever they change. The ethics of scraping are debatable, but you get the idea.
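The embedding half could be as simple as serving whatever fragment your callback last saved. This sketch assumes the handler above wrote the latest fragment to a local "fragment.html" file, which is purely a convention invented for this example.

```python
# Sketch of the embedding side: serve the last fragment the callback saved
# so your own dashboard can pull it into a widget.
from pathlib import Path
from flask import Flask

app = Flask(__name__)

@app.route("/widget")
def widget():
    # Load this route in an <iframe>, or fetch it with JavaScript,
    # to show the latest "current conditions" on your own page.
    cache = Path("fragment.html")
    return cache.read_text() if cache.exists() else "<p>No data yet</p>"

if __name__ == "__main__":
    app.run(port=8081)
```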
Interestingly, enterprise software company SimplyBox has been using fragments as a means to “remix” applications. Perhaps it should partner with Superfeedr to improve this feature.
Scraping still seems like a pretty obvious idea. What else could you use this for?