
Is It Time For Computers To Have Their Own .data Domains?


Spartan


[img]http://tctechcrunch2011.files.wordpress.com/2012/01/data.png?w=157[/img]

The web, as we all know, was built for humans: a nice graphical interface to the internet, which itself has been around much longer. But as the web has grown from a nice way to display information into the largest computing infrastructure on the planet, we need to make it friendlier for computers as well. Computers don’t want to look at pretty web pages. They want data.
Of course, there is a whole mish-mash of APIs and other ways computers speak to one another across the internet. But none of it is standardized, and it is a mess. Computer scientist and Wolfram Alpha founder Stephen Wolfram thinks there is a better way for computers to speak to each other. He suggests that it is time for a new .data top-level domain.
The familiar top-level domains are .com, .org, .gov, and so on. But the number of top-level domains is about to be expanded greatly. Some people think that adding more top-level domains is a waste (hi, Esther!). But Wolfram’s suggestion is worth considering. He lays out his thinking in this blog post:[indent]

[color=#ff0000][i]My concept for the .data domain is to use it to create the “data web”—in a sense a parallel construct to the ordinary web, but oriented toward structured data intended for computational use. The notion is that alongside a website like wolfram.com, there’d be wolfram.data. If a human went to wolfram.data, there’d be a structured summary of what data the organization behind it wanted to expose. And if a computational system went there, it’d find just what it needs to ingest the data, and begin computing with it.[/i][/color][/indent]
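
To make the dual-audience idea in that quote concrete, here is a minimal sketch of how one address could serve both readers and programs through ordinary HTTP content negotiation. It is purely hypothetical: no .data domains resolve today, and the wolfram.data URL and the JSON response are assumptions, not anything Wolfram has specified.

[code]
import json
import urllib.request

# Hypothetical sketch: no .data TLD exists yet, so "wolfram.data" will not
# resolve. The URL and the JSON response shape are assumptions.
def fetch(url, accept):
    req = urllib.request.Request(url, headers={"Accept": accept})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")

# A browser-style request could get the human-readable summary page...
summary_page = fetch("http://wolfram.data/", "text/html")

# ...while a computational client asks for structured data it can ingest directly.
datasets = json.loads(fetch("http://wolfram.data/", "application/json"))
print(datasets)
[/code]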

Sure, there are already many ways to extract data from sites today, but “it’s mostly from a complicated patchwork of data files and feeds and database dumps,” he concludes. What he wants instead is a standard way for computers to retrieve structured data from any site.
We humans would keep going to the .com domains in our browsers, while computers would visit the .data domains directly. Right now, popular services have to keep their websites current and also keep their APIs current, which can take almost as much work. But every site’s APIs are slightly different, and they are not all readily exposed or available the way HTML web pages are to human eyes. A web of .data sites would be built for computers. It’s time they had information standards too.
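
As a rough sketch of what that kind of standardization could buy, imagine every site’s .data counterpart exposing the same sort of root manifest listing its datasets. Again, everything here is assumed: example.data does not exist, and the manifest layout is invented for illustration.

[code]
import json
import urllib.request

# Purely hypothetical: example.data does not resolve, and the root-manifest
# layout sketched here is invented for illustration, not a published standard.
def load_json(url):
    with urllib.request.urlopen(url) as resp:
        return json.loads(resp.read().decode("utf-8"))

# If every .data site exposed the same kind of manifest at its root, a program
# could discover what is available with no site-specific knowledge at all.
manifest = load_json("http://example.data/")
for entry in manifest["datasets"]:
    print(entry["name"], "->", entry["url"])

# Pick one dataset and start computing with it.
first_dataset = load_json(manifest["datasets"][0]["url"])
[/code]

The appeal is that a loop like the one above would work unchanged against any .data site, which is exactly what today’s patchwork of per-site APIs cannot offer.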
