LOG ANALYSIS TO INCREASE SEO TRAFFIC: A FREE TOOL!
I'll start with the basics: a major potential traffic lever is SEO traffic, that is, traffic coming from Google, and more specifically from the free results known as "organic" or "natural" listings, the results outside AdWords.
Our clients want only one thing: to increase their SEO traffic, and therefore the number of keywords ranking well in Google! I invite you to discover log analysis, which makes it possible to quickly find optimizations to put in place on your site in order to increase your visibility!
Understanding what we are talking about
Log analysis is complex in itself, and obtaining your client's logs is often laborious. Exploiting the files then demands real expertise!
In this article, I propose a free tool that is quick to set up, along with indicators to find growth opportunities.
Understanding what a "log" is
Every web server records 100% of the pages visited by Internet users, and therefore also the pages visited by Google and its robots, known as "Googlebot". This "log" file records everything, down to the details: the images called to display the page, CSS files, JS files.
In short, it is Google's visits that interest us here. Be aware that Google's robots are sometimes specialized: one collects information related to mobile compatibility, another handles images... But overall, we will simplify the subject and assume there is only one!
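To make this concrete, here is a minimal sketch of how you might extract Googlebot's visits from a raw log file. It assumes the common Apache/Nginx "combined" log format; your server's format, and the sample line below, are assumptions for illustration.

```python
import re

# A hypothetical line in Apache/Nginx "combined" log format (your server's
# actual format may differ).
LINE = ('66.249.66.1 - - [10/Oct/2023:13:55:36 +0000] '
        '"GET /category/shoes HTTP/1.1" 200 5316 "-" '
        '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')

# Fields of the combined log format: IP, timestamp, request line, status,
# size, referer, user agent.
PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def parse_line(line):
    """Return a dict of fields, or None if the line does not match."""
    m = PATTERN.match(line)
    return m.groupdict() if m else None

def is_googlebot(hit):
    """Crude check on the user agent string alone; a rigorous check would
    also verify the requesting IP with a reverse DNS lookup, since anyone
    can fake this user agent."""
    return 'Googlebot' in hit['agent']

hit = parse_line(LINE)
print(hit['url'], is_googlebot(hit))  # /category/shoes True
```

Running `parse_line` over every line of an access log and filtering with `is_googlebot` gives you exactly the "passage of Google" the rest of this article works with.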
Log analysis: two different and complementary possibilities
As a reminder, log analysis can be done "at a given point in time": we generally retrieve 15 days of logs, although this depends on your "crawl window", a concept I will not develop here so as not to lose anyone! This analysis then amounts to a complete SEO audit, looking for as many indicators / KPIs as possible that may be hiding a source of traffic growth.
Otherwise, you can use a tool to collect the logs in real time and monitor them, for example by installing a stack on the server such as Elasticsearch and Logstash.
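If a full Elasticsearch/Logstash stack is overkill, a much lighter sketch of the same idea is to tally Googlebot hits per URL as log lines arrive (e.g. piped from `tail -f`). The field extraction below again assumes the "combined" log format, and the sample lines are invented.

```python
import re
from collections import Counter

UA_RE = re.compile(r'"([^"]*)"\s*$')       # last quoted field = user agent
URL_RE = re.compile(r'"\S+ (\S+) [^"]*"')  # request line -> requested URL

def googlebot_hits(lines):
    """Tally Googlebot requests per URL from an iterable of log lines."""
    counts = Counter()
    for line in lines:
        ua = UA_RE.search(line)
        url = URL_RE.search(line)
        if ua and url and 'Googlebot' in ua.group(1):
            counts[url.group(1)] += 1
    return counts

# Two invented log lines: one Googlebot hit, one regular visitor.
sample = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /a HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '5.6.7.8 - - [10/Oct/2023:13:55:40 +0000] "GET /a HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (Windows NT 10.0)"',
]
print(googlebot_hits(sample))  # only the Googlebot hit on /a is counted
```

In production you would feed `googlebot_hits` a stream that follows the live log file; the dedicated stacks mentioned above add storage, dashboards, and alerting on top of this same filtering logic.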
An example of a growth opportunity revealed by log analysis
To start, remember that such an analysis takes time and is aimed at sites with at least 1,000 pages. This number is arbitrary, but let's say that the larger the site, the more the analysis will pay off (provided you use a senior consultant with at least 6 years of SEO experience).
A little-known example: the crawl budget
I often ask candidates in interviews what a robots.txt file is for... and I am often surprised by the answers. A robots.txt file lets you block Googlebot from crawling parts of the site that your SEO consultant deems to have no potential for SEO visits. If you identify pages or categories of pages without SEO potential, for example pages without content, Googlebot should be prevented from crawling them. (Warning: contrary to popular belief, this action does not remove pages from Google's index if the engine has already indexed them; for that, rely on a noindex tag.)
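You can check what a given robots.txt actually blocks with Python's standard-library `urllib.robotparser`. The rules and paths below are assumptions for illustration, blocking a faceted-search section with no SEO potential.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking sections deemed to have no SEO potential
# (the Disallow paths are invented for this example).
ROBOTS_TXT = """\
User-agent: Googlebot
Disallow: /search/
Disallow: /filters/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Blocked: Googlebot should not spend crawl budget here.
print(parser.can_fetch('Googlebot', 'https://example.com/search/red-shoes'))
# Allowed: a real category page with SEO potential.
print(parser.can_fetch('Googlebot', 'https://example.com/category/shoes'))
```

This is also a convenient way to test a new robots.txt against a list of important URLs before deploying it, so you don't accidentally block pages that do bring SEO visits.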
In short, to find a growth opportunity, we must compare the volume of pages that Googlebot crawls to the active pages, that is, the pages that have generated at least one SEO visit.
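That comparison boils down to a set operation. Here is a toy sketch: the crawled URLs would come from your logs, the active pages from your analytics data, and both sets below are invented for illustration.

```python
# Pages Googlebot crawls (from the logs) vs. "active" pages, i.e. pages
# with at least one SEO visit (e.g. from analytics). Invented data.
crawled_by_googlebot = {'/a', '/b', '/c', '/filters/red', '/filters/blue'}
active_pages = {'/a', '/b'}

# Crawl budget spent on pages that never bring an SEO visit.
wasted = crawled_by_googlebot - active_pages
ratio = len(active_pages) / len(crawled_by_googlebot)

print(f"Active/crawled ratio: {ratio:.0%}")  # Active/crawled ratio: 40%
print(f"Crawled but never generating an SEO visit: {sorted(wasted)}")
```

A low ratio is the signal: whole categories of crawled-but-inactive pages (here the `/filters/` URLs) are candidates for blocking in robots.txt or for content improvements.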
There is no shortage of other examples of growth opportunities hidden in the logs:
- Optimize latency and load times
- Better manage pagination, faceted navigation, and URL parameters!
- Find orphan pages (pages excluded from the site's internal linking)
Etc. It's long, but it pays off!
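The orphan-page check in the list above is another simple set difference: URLs Googlebot still requests (seen in the logs) that no longer appear in the site's internal link structure (e.g. the URL list from a crawl of the site). Both sets below are invented for illustration.

```python
# URLs requested by Googlebot (from the logs) vs. URLs reachable through
# the site's internal linking (from a crawler's export). Invented data.
urls_in_logs = {'/a', '/b', '/old-promo', '/c'}
urls_in_site_structure = {'/a', '/b', '/c'}

orphans = urls_in_logs - urls_in_site_structure
print(sorted(orphans))  # ['/old-promo']
```

Each orphan is a decision to make: relink it if it has SEO potential, or redirect/remove it if it is dead weight.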

