Website monitoring is an established business. Perhaps the best-known example is Keynote, whose statistics carry a weight akin to Nielsen ratings, minus the subjective element of user preference. These systems measure the performance of websites as perceived by customers. Typically they rely on an array of sensors distributed across the globe (the more dispersed, the better the picture that emerges) that periodically ping a website to check how quickly its pages load, whether any errors are returned, and whether key services such as authentication are experiencing an outage.
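A minimal sketch of the per-sensor check such a service might run from each vantage point. The function names, the 2-second "slow" cutoff, and the verdict labels are all illustrative assumptions, not any vendor's actual API:

```python
import time
import urllib.request
from urllib.error import URLError

# Illustrative cutoff for flagging a slow page load; not an industry standard.
SLOW_THRESHOLD = 2.0  # seconds

def probe(url, timeout=5.0):
    """Fetch a URL once; report HTTP status, elapsed seconds, and any error."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()
            return {"status": resp.status,
                    "elapsed": time.monotonic() - start,
                    "error": None}
    except URLError as exc:
        return {"status": None,
                "elapsed": time.monotonic() - start,
                "error": str(exc)}

def classify(result):
    """Turn a raw probe result into the verdict a monitoring dashboard shows."""
    if result["error"] is not None or result["status"] != 200:
        return "ERROR"
    if result["elapsed"] > SLOW_THRESHOLD:
        return "SLOW"
    return "OK"
```

A real deployment would run `probe` on a schedule from each sensor and aggregate the `classify` verdicts centrally; the sketch shows only a single measurement.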
So far none of them has stepped up and offered to detect traffic shaping. Considering that network neutrality is under attack from all directions, this could be the next application. Suppose an ISP in North America decides to put the brakes on downloads from Netflix while prioritizing streaming video from a competing website. (It is not a stretch to imagine that a kickback, a revenue-sharing arrangement, or even a past grudge against Netflix could motivate this behavior.) An extensive sensor array would reveal the anomaly, provided at least one sensor sits inside that ISP's network. The measurements might show unexpectedly high latency compared to other vantage points in the same region, or odd bandwidth caps taking effect over time.
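The comparison described above can be sketched in a few lines: given throughput measured by several sensors downloading the same content, flag any sensor that falls well below the regional median. The sensor names, the sample readings, and the 50% tolerance are hypothetical, chosen only to illustrate the idea:

```python
from statistics import median

def shaping_suspects(throughput_mbps, tolerance=0.5):
    """Flag sensors whose throughput is far below the regional median.

    throughput_mbps maps sensor id -> Mbps measured for the same download;
    any sensor below (tolerance * median) is returned as a shaping suspect.
    """
    typical = median(throughput_mbps.values())
    return sorted(sensor for sensor, mbps in throughput_mbps.items()
                  if mbps < tolerance * typical)

# Hypothetical readings: one sensor inside the suspect ISP sees a fraction
# of the speed its regional peers measure for the same download.
readings = {"nyc-1": 94.2, "nyc-2": 91.7, "nyc-isp-x": 18.5, "bos-1": 88.9}
print(shaping_suspects(readings))  # → ['nyc-isp-x']
```

A single snapshot like this only raises suspicion; a real detector would repeat the measurement over days and across content providers to rule out ordinary congestion before alleging discrimination.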
That still leaves one question open: whether the web service provider would have any legal recourse once it discovered its traffic was being discriminated against.