On March 15, 1985, Symbolics.com became the first .com domain name ever registered through the Domain Name System (DNS) process. Fast forward 28 years: Symbolics.com has been purchased by XF.com Investments for an undisclosed sum, and there are more than 252 million registered domain names worldwide across all top-level domains (TLDs).
Verisign, Inc., a provider of domain name registration, Distributed Denial of Service (DDoS) protection and DNS management services, revealed that more than six million domain names were added to the Internet in the fourth quarter of 2012. That represents a 2.5-percent growth rate over the third quarter of 2012 and the eighth straight quarter with greater than 2-percent growth.
New .com and .net registrations totaled eight million during the fourth quarter of 2012, reaching a combined total of approximately 121.1 million domain names.
The largest TLDs, in order by zone size, are .com, .de (Germany), .net, .tk (Tokelau), .uk (United Kingdom), .org, .cn (China), .info, .nl (Netherlands) and .ru (Russian Federation). According to the .tk registry, 97 percent of its active domain name registrations are free registrations.
During the fourth quarter of 2012, Verisign’s average daily DNS query load was 77 billion across all TLDs it operates, with a peak of 123 billion.
How Does Big Data Fit In?
Big data refers to the ability to collect and store highly relevant, mission-critical data and effectively process, analyze and leverage it to make informed business decisions. With more than 252 million registered domain names generating billions of Web pages, the DNS presents its own unique big data challenge, but it also offers distinctive opportunities. By analyzing DNS transactions, companies can gain greater insight into how domain names are being used, including their functionality, connectivity and reach, and which information users leverage the most.
DNS data can become an important tool in securing the network. Being able to analyze network activity and traffic through DNS queries can help network administrators determine where malicious traffic comes from and prevent access to these sources where DDoS attacks and spam originate.
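The kind of analysis described above can be sketched in a few lines. The example below is a minimal, illustrative sketch, not Verisign's method: it assumes a query log represented as (source IP, queried domain) pairs and flags sources whose query volume crosses a threshold, a crude first signal for traffic worth closer inspection. The function name, sample addresses and threshold are all hypothetical.

```python
from collections import Counter

def flag_heavy_hitters(query_log, threshold=3):
    """Count DNS queries per source IP and return sources whose
    volume exceeds the threshold -- a simple heuristic for spotting
    possible DDoS participants or spam infrastructure."""
    counts = Counter(src for src, _qname in query_log)
    return {src: n for src, n in counts.items() if n > threshold}

# Hypothetical sample log of (source IP, queried domain) pairs.
log = [
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("198.51.100.7", "example.net"),
]
print(flag_heavy_hitters(log))  # -> {'203.0.113.5': 4}
```

In practice an administrator would feed in real resolver logs and combine volume with other signals (query types, domain reputation, timing) before blocking anything, but the aggregation step looks much like this.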
The report explains that companies today shouldn’t just be focusing on their capacity to store massive amounts of data, but rather the ability to turn that data into meaningful and insightful information.
“Twenty years ago success in business, as often as not, was determined by who could gather the best and most relevant data (about competitors, customers, emerging markets, etc.) in the timeliest fashion,” the report says. “Because analyzing that data was comparatively simple, and a relatively homogenous process from one organization to another, competitive differentiation came from who could find the best data first.”
The Internet changed this in three critical ways:
1. It globally democratized access to data, enabling many more players to gather similar relevant data
2. It exponentially increased the amount of relevant data that is generated, collected and stored
3. It spawned tools and technologies that make it easier to analyze large amounts of unstructured data
TechZone360 Web Editor