Cyren Security Blog

Fresh and Local is Best for Security Data!

When shopping, consumers increasingly prefer fresh, local produce. The reasons vary, but there is general agreement that items purchased and consumed close to their point of origin are “better” – in taste, health benefits and environmental impact.

Many suppliers of web security tools boast vast repositories with hundreds of millions of ranked URLs. Closer examination usually shows that many of these URLs belong to obscure sites that nobody visits and, worse still, that their rankings are ‘stale’ – weeks or even months old. These vendors usually try to dodge the ‘freshness’ question by saying, “Never mind that, look how many URLs we have!” But do you really want to decide whether a website is safe to visit based on ‘stale’ data?

Fresh and Local

So now we understand that ‘fresh’ is important, but where does ‘local’ come into it? Many web security solutions take a “one database fits all” approach: if you are using a web gateway in Italy, your reference database will be full of information on URLs from all over the globe. That is great if you visit global sites, but it means there will be little or no data about the local sites your users actually browse.

Because of this, when you try to browse to a local site, the request will effectively trigger “dynamic content categorization”. While this sounds like a good thing in theory, in practice it seldom is. This type of categorization means the site is examined and ranked in real time. Again, this sounds great, as it should fulfill our requirement for ‘fresh’ data – but if it’s that good, why don’t all web security tools do it all the time? The answer is simple – real-time categorization (analysis) of URLs does not work well, for two main reasons (a simplified sketch of the lookup-and-fallback flow follows the list below):

  1. Resource consumption. Analyzing URLs requires substantial processing power on the platform doing the analysis. If this is done “in the Cloud”, processing power is likely not an issue, but onboard a security appliance, CPU power, memory and storage are at a premium, and performing this sort of operation can compromise the primary function of the device.
  2. Latency. Latency is the nemesis of a good browsing experience. If it takes “too long” (an entirely subjective measurement) for a web page to come back, we tend to abandon the operation and go elsewhere. So if every request for a local site triggers dynamic categorization, chances are that many of them will be abandoned as the user loses patience with the background process – and the user will blame the security tool for the poor experience.
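
To make the trade-off concrete, here is a minimal sketch of that lookup-and-fallback flow in Python. The names (`lookup_category`, `categorize_in_realtime`), the data model and the one-day freshness window are illustrative assumptions for this post, not any vendor’s actual API.

```python
import time
from dataclasses import dataclass

# Hypothetical freshness window: entries older than this are treated as stale.
FRESHNESS_TTL_SECONDS = 24 * 60 * 60  # one day, purely illustrative

@dataclass
class UrlEntry:
    category: str     # e.g. "news", "phishing", "gambling"
    ranked_at: float  # UNIX timestamp of when the URL was last ranked

# Stand-in for the gateway's reference database (URL -> entry).
reference_db: dict[str, UrlEntry] = {}

def categorize_in_realtime(url: str) -> UrlEntry:
    """Placeholder for dynamic content categorization.

    On a real gateway this would fetch and analyze the page, which costs
    CPU, memory and storage on an appliance and adds user-visible latency.
    """
    entry = UrlEntry(category="uncategorized", ranked_at=time.time())
    reference_db[url] = entry  # cache the fresh result for next time
    return entry

def lookup_category(url: str) -> UrlEntry:
    entry = reference_db.get(url)
    if entry and (time.time() - entry.ranked_at) < FRESHNESS_TTL_SECONDS:
        return entry  # fresh hit: fast path, no extra latency
    # Miss or stale entry: fall back to the slow, resource-hungry path.
    return categorize_in_realtime(url)
```

The more of your users’ traffic that lands on the fast path – which is exactly what a fresh, locally relevant database buys you – the less often the gateway has to pay the resource and latency costs listed above.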

So, when you look for a web security solution, first make sure the data is fresh. Next, check whether the background database is “one size fits all” or whether it can adapt based on local browsing patterns. If it can, you will minimize the impact of latency and avoid creating a poor user experience for your customers.
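
As a final illustration, “adapting based on local browsing patterns” could be as simple as counting which domains your own users actually request and spending the re-ranking budget on those first. The sketch below is hypothetical, in the same spirit as the one above, and not a description of any specific product.

```python
from collections import Counter

# Requests seen by this particular gateway, keyed by domain (illustrative only).
local_request_counts: Counter[str] = Counter()

def record_request(domain: str) -> None:
    """Note that a local user asked for this domain."""
    local_request_counts[domain] += 1

def refresh_priority(known_domains: set[str], budget: int = 1000) -> list[str]:
    """Pick which database entries to re-rank next.

    Locally popular domains come first, so the limited analysis budget keeps
    the sites *your* users visit fresh, rather than globally popular sites
    that nobody behind this gateway ever requests.
    """
    ranked = [d for d, _ in local_request_counts.most_common() if d in known_domains]
    return ranked[:budget]
```

A database maintained this way stays both fresh and local for the traffic that matters to you – which is precisely the combination this post argues for.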
