Christian Huitema

Measuring the size and the quality of the Internet

The Internet measurement work conducted at Telcordia led to the development of the "Netsizer," which measures the size of the Internet; the information is updated daily. At the core of Netsizer is a statistical sampling algorithm that I developed in 1997. The basic idea is to pick 32-bit random numbers and to check whether there is a host located at that address in the DNS; we did the check by looking in the "reverse lookup" tree. Once we have found a host, we can look at its domain name and try to gather more information, e.g., its location, the services it runs, etc. Sam Werahandi complemented the initial sampling tool with a rigorous statistical analysis, and also introduced other aspects, such as population counts, obtained by correlating the statistical sampling with "real life" polls from other sources. The Netsizer algorithms are patented by Telcordia.
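The sampling idea can be sketched in a few lines of modern Python. This is an illustrative reconstruction, not the patented Telcordia code: it draws uniform 32-bit numbers, formats them as IPv4 addresses, and queries the "reverse lookup" tree through the standard resolver. The fraction of samples that have a reverse entry, scaled by 2^32, gives a crude estimate of the number of registered hosts.

```python
import random
import socket

def random_ipv4():
    """Pick a uniform 32-bit random number and format it as a dotted quad."""
    n = random.getrandbits(32)
    return ".".join(str((n >> shift) & 0xFF) for shift in (24, 16, 8, 0))

def reverse_lookup(addr, timeout=2.0):
    """Query the DNS "reverse lookup" (in-addr.arpa) tree for this address.
    Returns the registered domain name, or None if there is no entry."""
    socket.setdefaulttimeout(timeout)
    try:
        name, _, _ = socket.gethostbyaddr(addr)
        return name
    except (socket.herror, socket.gaierror, OSError):
        return None

def estimate_hosts(samples):
    """Estimate the number of registered hosts from `samples` random draws.
    NOTE: each draw is a live DNS query, so a real run wants many thousands
    of samples, spread out over time."""
    hits = sum(1 for _ in range(samples) if reverse_lookup(random_ipv4()))
    return (hits / samples) * 2**32
```

A real measurement campaign would of course add retries, record the domain names for later analysis (location, services, etc.), and apply the kind of rigorous confidence-interval analysis mentioned above rather than a single point estimate.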

We also developed other tools, focused on measuring the quality of the network. At one point, we were performing daily samplings of a few hundred web connections, and we could see the evolution of the quality of service in real time. The results of these measurements are reported in an article, "Internet Measurements: the Rising Tide and the DNS Snag," that helped raise concerns about the quality of a key Internet component, the DNS service.
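The kind of probe behind those daily samplings can be sketched as follows; this is an illustrative reconstruction, not the actual Telcordia tool. The point of splitting the measurement into two stages is exactly the "DNS snag": timing the name resolution separately from the TCP connection shows when the DNS, rather than the network path, is the bottleneck.

```python
import socket
import time

def resolve_time(host, port=80):
    """Time the DNS resolution stage of a web connection.
    Returns (socket address, elapsed milliseconds)."""
    t0 = time.monotonic()
    infos = socket.getaddrinfo(host, port, proto=socket.IPPROTO_TCP)
    elapsed_ms = (time.monotonic() - t0) * 1000.0
    return infos[0][4], elapsed_ms

def connect_time(family, sockaddr, timeout=5.0):
    """Time the TCP connection stage, in milliseconds."""
    t0 = time.monotonic()
    with socket.socket(family, socket.SOCK_STREAM) as s:
        s.settimeout(timeout)
        s.connect(sockaddr)
    return (time.monotonic() - t0) * 1000.0
```

A daily run would loop over a few hundred web server names, record both timings for each, and plot the two series over time; a rising DNS component against a flat connection component is the signature of a resolver problem rather than a congested path.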