On March 15, 1985, Symbolics.com became the first .com domain name ever registered through the appropriate Domain Name System (DNS) process. Fast forward 28 years: Symbolics.com has been purchased by XF.com Investments for an undisclosed sum, and there are more than 252 million registered domain names worldwide across all top-level domains (TLDs).
Verisign, Inc., a provider of domain name registration, Distributed Denial of Service (DDoS) protection and DNS management services, revealed that more than six million domain names were added to the Internet in the fourth quarter of 2012. That represents a 2.5-percent growth rate over the third quarter of 2012 and the eighth straight quarter with greater than 2-percent growth.
New .com and .net registrations totaled eight million during the fourth quarter of 2012, reaching a combined total of approximately 121.1 million domain names.
The largest TLDs, in order by zone size, are .com, .de (Germany), .net, .tk (Tokelau), .uk (United Kingdom), .org, .cn (China), .info, .nl (Netherlands) and .ru (Russian Federation). According to the .tk registry, 97 percent of its active domain name registrations are free domain name registrations.
During the fourth quarter of 2012, Verisign’s average daily DNS query load was 77 billion across all TLDs it operates, with a peak of 123 billion.
How Does Big Data Fit In?
Big data refers to the ability to collect and store highly relevant, mission-critical data and effectively process, analyze and leverage it to make informed business decisions. With more than 252 million registered domain names generating billions of Web pages, the DNS presents its own unique big data challenge, but it also offers distinctive opportunities. By analyzing DNS transactions, companies can gain greater insight into how domain names are being used, including their functionality, connectivity and reach, and which information users leverage the most.
DNS data can become an important tool in securing the network. Being able to analyze network activity and traffic through DNS queries can help network administrators determine where malicious traffic comes from and prevent access to these sources where DDoS attacks and spam originate.
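As a concrete illustration of the idea above, a first pass at this kind of analysis might simply aggregate DNS query logs by source address and flag sources with abnormally high query volumes for closer inspection. The sketch below is a minimal, hypothetical example: the log format, sample addresses, and the `flag_heavy_sources` helper are all illustrative assumptions, not part of any real resolver's tooling.

```python
from collections import Counter

# Hypothetical DNS query log entries as (source_ip, queried_domain) pairs.
# In practice these would come from resolver logs or packet captures.
query_log = [
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("198.51.100.7", "example.org"),
    ("192.0.2.9", "example.net"),
]

def flag_heavy_sources(log, threshold=3):
    """Return source IPs whose query count meets or exceeds the threshold.

    A crude first-pass filter: sources issuing abnormally many queries
    may indicate DDoS participation or spam infrastructure and warrant
    closer inspection or blocking.
    """
    counts = Counter(ip for ip, _domain in log)
    return {ip: n for ip, n in counts.items() if n >= threshold}

suspicious = flag_heavy_sources(query_log)
print(suspicious)  # {'203.0.113.5': 4}
```

Real deployments would of course use rolling time windows, baselines per resolver, and far richer features than raw counts, but the shape of the analysis, reducing billions of queries to a short list of anomalous sources, is the same.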
The report explains that companies today shouldn’t focus only on their capacity to store massive amounts of data, but rather on their ability to turn that data into meaningful and insightful information.
“Twenty years ago success in business, as often as not, was determined by who could gather the best and most relevant data (about competitors, customers, emerging markets, etc.) in the timeliest fashion,” the report says. “Because analyzing that data was comparatively simple, and a relatively homogenous process from one organization to another, competitive differentiation came from who could find the best data first.”
The Internet changed this in three critical ways:
1. It globally democratized access to data, enabling many more players to gather similar relevant data
2. It exponentially increased the amount of relevant data that is generated, collected and stored
3. It spawned tools and technologies that make it easier to analyze large amounts of unstructured data
To read the entire report, click here.
TechZone360 Web Editor