On March 15, 1985, Symbolics.com became the first .com domain name ever registered through the formal Domain Name System (DNS) process. Fast forward 28 years: Symbolics.com has been purchased by XF.com Investments for an undisclosed sum, and there are more than 252 million registered domain names worldwide across all top-level domains (TLDs).
Verisign, Inc., a provider of domain name registration, Distributed Denial of Service (DDoS) protection and DNS management services, revealed that more than six million domain names were added to the Internet in the fourth quarter of 2012, a 2.5-percent growth rate over the third quarter of 2012 and the eighth consecutive quarter with growth above 2 percent.
New .com and .net registrations totaled eight million during the fourth quarter of 2012, reaching a combined total of approximately 121.1 million domain names.
The largest TLDs in order by zone size are .com, .de (Germany), .net, .tk (Tokelau), .uk (United Kingdom), .org, .cn (China), .info, .nl (Netherlands) and .ru (Russian Federation). According to the .tk registry, 97 percent of its active domain name registrations are free domain name registrations.
During the fourth quarter of 2012, Verisign’s average daily DNS query load was 77 billion across all TLDs operated by Verisign, with a peak of 123 billion.
How Does Big Data Fit In?
Big data refers to the ability to collect and store highly relevant, mission-critical data and effectively process, analyze and leverage it to make informed business decisions. With more than 252 million registered domain names generating billions of Web pages, the DNS presents its own unique big data challenge, but it also offers distinctive opportunities. By analyzing DNS transactions, companies can gain greater insight into how domain names are being used, including their functionality, connectivity and reach, and which information users leverage the most.
DNS data can become an important tool in securing the network. Being able to analyze network activity and traffic through DNS queries can help network administrators determine where malicious traffic comes from and prevent access to these sources where DDoS attacks and spam originate.
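One simple way to act on this idea is to aggregate DNS query logs by source address and flag unusually heavy senders. The sketch below is a minimal illustration, not Verisign's method: the log format, the `flag_heavy_hitters` helper and the threshold are all hypothetical, and real resolvers (e.g. BIND's querylog) would supply far richer records.

```python
from collections import Counter

# Hypothetical DNS query log as (source_ip, queried_domain) pairs.
# In practice these would be parsed from resolver log files.
QUERY_LOG = [
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("203.0.113.5", "example.com"),
    ("198.51.100.7", "example.org"),
    ("192.0.2.9", "example.net"),
]

def flag_heavy_hitters(log, threshold=3):
    """Return source IPs whose query volume exceeds `threshold` --
    a crude first-pass signal for abusive or compromised clients."""
    counts = Counter(src for src, _domain in log)
    return {src: n for src, n in counts.items() if n > threshold}

suspects = flag_heavy_hitters(QUERY_LOG)
print(suspects)  # {'203.0.113.5': 4}
```

A production pipeline would add time windows, per-domain breakdowns and comparison against historical baselines before blocking anything, but the core pattern of counting and thresholding DNS traffic by origin is the same.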
The report explains that companies today shouldn’t focus only on their capacity to store massive amounts of data, but rather on their ability to turn that data into meaningful, actionable insight.
“Twenty years ago success in business, as often as not, was determined by who could gather the best and most relevant data (about competitors, customers, emerging markets, etc.) in the timeliest fashion,” the report says. “Because analyzing that data was comparatively simple, and a relatively homogenous process from one organization to another, competitive differentiation came from who could find the best data first.”
The Internet changed this in three critical ways:
1. It globally democratized access to data, enabling many more players to gather similar relevant data
2. It exponentially increased the amount of relevant data that is generated, collected and stored
3. It spawned tools and technologies that make it easier to analyze large amounts of unstructured data
TechZone360 Web Editor