USC’s Press Room shared some of the findings from the research:
Over the past 10 months, Google search has dramatically increased the number of sites around the world from which it serves client queries, repurposing existing infrastructure to change the physical way that Google processes web searches, according to a new study from USC.
From October 2012 to late July 2013, the number of locations serving Google’s search infrastructure increased from a little less than 200 to a little more than 1400, and the number of ISPs grew from just over 100 to more than 850, according to the study.
While many might greet this with a “So what?”, it is significant in that it shows the position search still holds within the company even as Google expands in many different directions. It is also indicative of the importance of search to end users: Google wouldn’t make this type of investment if the service weren’t critical to them.
You might wonder if there will ever be a time when search is not the primary driver of Google’s business. It’s hard to imagine unless there is a huge shift in end-user behavior. It would seem that until the population consists only of people who have grown up in the Internet Age, there will be some level of “traditional” use of the web. And at this point there is nothing more traditional about the web than search.
The report highlights the differences that have occurred.
Previously, if you submitted a search request to Google, your request would go directly to a Google data center.
Now, your search request will first go to the regional network, which relays it to the Google data center. While this might seem like it would make the search take longer by adding in another step, the process actually speeds up searches.
Data connections typically need to “warm up” to get to their top speed – the continuous connection between the client network and the Google data center eliminates some of that warming up lag time. In addition, content is split up into tiny packets to be sent over the Internet – and some of the delay that you may experience is due to the occasional loss of some of those packets. By designating the client network as a middleman, lost packets can be spotted and replaced much more quickly.
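The effect of that “warm-up” can be sketched with a bit of back-of-the-envelope arithmetic. The sketch below is purely illustrative: the round-trip times and the number of slow-start round trips are assumptions, not measurements from the study.

```python
# Hypothetical latency sketch: why relaying a request through a nearby
# front-end (e.g. inside the user's ISP) can beat a direct long-haul
# connection. All RTT values and the slow-start count are illustrative
# assumptions, not figures from the USC study.

def direct_latency(rtt_to_datacenter):
    # A fresh connection pays a TCP handshake (1 RTT), the request/response
    # itself (1 RTT), plus extra round trips while the congestion window
    # "warms up" toward full speed.
    handshake = rtt_to_datacenter
    slow_start_rtts = 2  # assumed extra round trips before top speed
    return handshake + (1 + slow_start_rtts) * rtt_to_datacenter

def relayed_latency(rtt_to_frontend, rtt_frontend_to_datacenter):
    # The user only handshakes with the nearby front-end; the front-end
    # reuses a persistent, already-warm connection to the data center,
    # so no slow-start penalty is paid on the long-haul leg.
    handshake = rtt_to_frontend
    return handshake + rtt_to_frontend + rtt_frontend_to_datacenter

if __name__ == "__main__":
    direct = direct_latency(rtt_to_datacenter=100)       # 100 ms user -> data center
    relayed = relayed_latency(rtt_to_frontend=10,        # 10 ms user -> ISP front-end
                              rtt_frontend_to_datacenter=100)
    print(f"direct: {direct} ms, relayed: {relayed} ms")
    # With these assumed numbers the relayed path comes out well ahead,
    # even though it adds a hop.
```

The same logic applies to lost packets: a retransmission between the user and a nearby front-end costs one short round trip rather than one long-haul round trip.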
In the end it’s about the end user, of course.
The strategy seems to have benefits for web users, ISPs and Google, according to the team. Users get a better web browsing experience, ISPs lower their operational costs by keeping more traffic local, and Google is able to deliver its content to web users more quickly.
It doesn’t appear that public copies of the report are available today, but we’ll keep an eye out for them. We feel confident that when they are, they will be delivered faster than ever!