In the final talk â" and the keynote slot â" of Brighton SEO 2018, Google spokesman John Mueller is joined by international SEO consultant Aleyda Solis for a live Webmasters hangout. Here are all the highlights.
Is Google able to recognise content from desktop sites in the mobile index?
It's important to understand that Google is only indexing desktop content at the moment, with mobile friendliness as a helpful signal. Mobile-first indexing means Google will only focus on the mobile version. So if anything is only on the desktop version, it won't be indexed at all.
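A quick way to sanity-check this is to compare what a page serves to desktop and mobile user agents. Below is a minimal, purely illustrative sketch (the URL and user agent strings are placeholders, not an official Google check), which only catches server-side differences; JavaScript-rendered content would need a rendering check such as the mobile-friendly test or Fetch and Render in Search Console.

    # Rough content-parity probe: fetch a page as desktop and as mobile,
    # then compare the visible text. Illustrative only.
    import requests
    from bs4 import BeautifulSoup

    URL = "https://example.com/some-page"  # hypothetical URL
    DESKTOP_UA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"
    MOBILE_UA = ("Mozilla/5.0 (Linux; Android 8.0; Pixel 2) "
                 "AppleWebKit/537.36 (KHTML, like Gecko) "
                 "Chrome/65.0 Mobile Safari/537.36")

    def visible_words(user_agent):
        # Fetch the page as the given user agent and return its visible words.
        html = requests.get(URL, headers={"User-Agent": user_agent}).text
        return set(BeautifulSoup(html, "html.parser").get_text().split())

    missing_on_mobile = visible_words(DESKTOP_UA) - visible_words(MOBILE_UA)
    print(f"{len(missing_on_mobile)} words served to desktop but not to mobile")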
And what about hidden content?
In the past, John has mentioned that the rules devaluing hidden content won't apply in the future, but will this content have the same weight? It won't necessarily have exactly the same weight, but Google knows that hidden content is more useful on mobile, so they'll be able to take that into account in the mobile-first index. They want people to be able to hide content if it's helpful for UX, but there's no concrete end result at the moment. They need to see how it turns out with some experimentation. If it's abused, this might change.
Should a company with low mobile traffic prioritise mobile optimisation?
This is a good question, according to John. The big problem from Google's side is that people see low mobile traffic because their mobile site is bad, and then use that low traffic to justify not investing in a mobile site. That is the wrong way to look at it. Sure, some sites don't have much mobile demand, but this could change over time. In some countries the change will be more pronounced than in others.
With so many people involved in the changes in Google's algorithm, does anyone know which ranking factors are the most important?
Yes. There are people at Google who have been there for a long time and have really in-depth knowledge of search and Google's algorithms, especially on the crawling and ranking side of things. It's something that's fostered among the search engineers, because it's important that every small part of search fits in with the bigger picture. For example, the mobile-first changes were made after a lot of analysis to see where the problems could be. They want to understand how everything works together. Google is not a big black box: there are lots of people involved who do know what's going on.
When can we expect better reporting of universal search results?
John doesn't have a timeline for this. They try not to pre-announce too many things, as a lot of announcements go bad. He expects that they will try to provide more information on certain metrics in Search Console, e.g. data on voice queries. He also expects that some things will come out in API form, to make sure that newer features have API access so that other people can build complicated and fancy tools. They want to avoid a focus on the expert audience and make sure that the average webmaster can get good info from it.
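The Search Console data that already has API access can be pulled through the Search Analytics API. Here is a minimal sketch, assuming you have OAuth credentials set up and the google-api-python-client library installed (the site URL and dates are placeholders):

    # Query the Search Analytics API for top queries over a date range.
    from googleapiclient.discovery import build

    creds = ...  # OAuth2 credentials for the property, obtained elsewhere

    service = build("webmasters", "v3", credentials=creds)
    response = service.searchanalytics().query(
        siteUrl="https://example.com/",  # placeholder property URL
        body={
            "startDate": "2018-03-01",
            "endDate": "2018-03-31",
            "dimensions": ["query"],
            "rowLimit": 25,
        },
    ).execute()

    for row in response.get("rows", []):
        print(row["keys"][0], row["clicks"], row["impressions"])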
Is older information going to be present in the new Search Console eventually?
Yes, that's basically the plan. They didn't just want to take everything from the old UI and reformat it, but they do want to bring over everything that they think is genuinely useful and rethink how they can make the information as useful as possible. For example, the old Search Console gives a lot of random lists which need a lot of interpretation, so they want to make that kind of thing a little more actionable. They're tracking what people are doing, so they're aware of which tools are getting the most engagement.
What speed metrics do you use for different sites?
It's hard to find one metric that works for all sites. There's not just one number to focus on: you need to look at the different page speed elements that contribute to a good experience for your audience. Sometimes you can find low-hanging fruit, but other times it's not so simple.
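As a starting point, you can measure individual elements yourself. Below is a minimal sketch (the URL is a placeholder) that times two network-level pieces of page speed, time-to-first-byte and full download time; note that rendering metrics like first paint need browser tooling such as Lighthouse or Chrome DevTools instead.

    # Rough network timing probe for a single URL. Illustrative only.
    import time
    import requests

    url = "https://example.com/"  # placeholder

    start = time.perf_counter()
    response = requests.get(url, stream=True)  # returns once headers arrive
    ttfb = time.perf_counter() - start
    body = response.content                    # force the full download
    total = time.perf_counter() - start

    print(f"TTFB: {ttfb * 1000:.0f} ms | full download: {total * 1000:.0f} ms "
          f"| {len(body) / 1024:.0f} KB transferred")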
A lot of people have jumped into AMP, but can you confirm an update late last year where many websites were dropped from the Top Stories functionality? How do you decide which AMP pages can be included there?
Top Stories is an organic feature. It's not something that can be based on a simple meta tag or category of site. A variety of signals are taken into account to work out which indexed sites can be shown, so just using AMP in the right way is not enough to appear there. These things also change over time, so it's normal to drop in and out of this SERP feature, as you'd see with any other organic feature. At the moment it's counted as one block of results.
With a big website, how can you optimise for crawl budget effectively? Can you confirm the best way to handle low quality content so that more important pages are prioritised?
If you know that you have a lot of content that doesn't need to be indexed or crawled, you can handle it in a few ways. Robots.txt is one way, especially if you have loads of URLs from search pages; it can also work for resource-intensive pages. For other sites, a combination of noindex and nofollow, and all the traditional methods of guiding the crawler, work fairly well. For faceted navigation it does get really complicated, but there's no way to avoid that it's a technical undertaking. The tricky part with faceted navigation and pagination is that there's no one answer for all websites. You have to understand what value different pages provide and work it out with an analytical approach. You'll always see a change with big updates, so it's worth trying things out!
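As a concrete illustration of the options John mentions, a robots.txt rule can keep crawlers out of internal search result URLs (the paths below are hypothetical):

    # robots.txt: keep crawlers out of internal search result URLs
    User-agent: *
    Disallow: /search
    Disallow: /*?filter=

For pages that should stay crawlable but shouldn't be indexed, a robots meta tag such as <meta name="robots" content="noindex"> in the page head is the usual alternative. Bear in mind the two don't combine: a page blocked in robots.txt can't communicate a noindex, because Googlebot never fetches it.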
Is it okay to disavow links if your website isnât penalised?
John likes that the disavow tool allows you to take care of these issues yourself. If you're aware of poor link practices in the past, the disavow tool can be used to clean that up. It's also good as a response to manual actions. When you look at link reports and see something crazy happening, you can use the disavow tool to take care of it. Google doesn't see it as an admission of guilt; they see it as a technical tool.
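For reference, the disavow tool takes a plain text file with one entry per line, either a single URL or a whole domain (the domains below are made up):

    # Lines starting with # are comments
    # Disavow a single URL:
    http://spammy-directory.example/page-with-link.html
    # Disavow every link from a domain:
    domain:paid-links.example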
Brighton SEO Keynote â" Live Google Webmasters Hangout with John Mueller & Aleyda Solis was last modified: April 27th, 2018 by Ben Garry
0 comments: