Thursday 16 February 2017

I, Search: How AI Will Transform the Landscape of SEO by @jeremyknauff

Artificial intelligence, or AI, has been a trending topic in the search engine optimization industry. It’s also widely misunderstood.

A lot of the widely held beliefs about AI are based on speculation, surmised from patents and search engine behavior. Unfortunately, speculation frequently leads to fear, and fear sometimes empowers the industry’s con artists. I wonder how long it will be before we start seeing practitioners claiming to offer “AI-proof” SEO services.

The most important thing to understand about AI is that it is not a static formula to solve. It’s a constantly evolving system designed to identify, sort, and present the data that is most likely to meet the needs of users at that specific time, based on a multitude of variables that go far beyond just a simple keyword phrase.

AI is trained on known data, such as:

  • content
  • links
  • user behavior
  • trust
  • citations
  • patterns

That data is then analyzed through user experience, big data, and machine learning to develop new ranking factors capable of producing the results most likely to meet user needs.


The Past

Search algorithms of the past were pretty unsophisticated. Drop a keyword phrase into your title, heading, and alt tags, then pepper it throughout your content, and you could almost be assured of a top ranking. That is, until your competitors did the same, and then it became a virtual arms race to see who could stuff a keyword phrase into a page as many times and in as many ways as possible.

SEO practitioners became creative at finding new ways to squeeze a few more instances of a keyword phrase into a page, even if it meant using ridiculous tactics that served no purpose other than to increase keyword density. They hid text by coloring it to blend into the background, positioning it off screen, or even using z-index to change the stack order of elements, along with a plethora of other equally sketchy methods.

Fortunately, it didn’t take search engines long to build effective countermeasures into their algorithms to defeat these rudimentary tactics. A more challenging obstacle, since Google’s algorithm relied heavily on links, was separating the legitimate editorial links from manipulative link spam.

After spending a few years battling both black hat SEO practitioners and honest but misinformed marketers, Google implemented a scorched earth policy with the release of Penguin 1.0, destroying thousands of legitimate businesses in the process.

The Present

As the algorithms evolved to measure less gameable ranking signals over the past several years, many of the industry’s bottom feeders were killed off. However, these new algorithms also made it necessary for legitimate digital marketers to dig deeper and put more effort into quality in terms of technical SEO, content development, and link building. All three components are still essential today to build an effective search engine optimization campaign.

Technical SEO has evolved from simple formulas, like keyword density, into an ongoing, holistic effort to improve user experience while making it easier for search engines to understand what your content is about. Factors that indicate a positive user experience, like mobile responsiveness, page speed, and time on site, play a significant role in technical SEO today.

Link building is no longer just a matter of volume. Penguin changed that. Today, only legitimate editorial links, which take significant time and effort to earn, will produce safe, long-term results. Link building techniques that fall outside of Google’s webmaster guidelines may produce short-term results, but will eventually earn you a nasty penalty, resulting in zero organic visibility. That’s an expensive risk in my book.

And while the tired phrase “content is king” still has meaning, a more accurate statement would be “engagement is king.” Simply writing a few hundred words sprinkled with a keyword phrase won’t produce results anymore. Search algorithms today are looking for high-quality content that engages users. If it’s not valuable to users, it generally won’t rank well in organic search.

The Future

Artificial intelligence will completely revolutionize search engine optimization. Instead of applying a static formula, it utilizes user experience, big data, and machine learning to produce results that meet user needs more precisely, learning and improving on the fly.

Buckle up, kids, this is going to be an interesting ride!

Keyword Phrases Are Dead

The days of developing keyword-centric content are, for the most part, long behind us. Google’s Knowledge Graph, which maps relationships between entities rather than matching raw keyword strings, initially led the charge in this direction, but RankBrain turned it into an all-out blitzkrieg.

RankBrain is Google’s machine-learning artificial intelligence system that helps process its search results. According to Greg Corrado, a senior research scientist at Google who is involved with RankBrain, it uses an entirely new way of processing queries.

RankBrain uses artificial intelligence to embed vast amounts of written language into mathematical entities, called vectors, that the computer can understand. If RankBrain sees a word or phrase it isn’t familiar with, the machine can make a guess as to what words or phrases might have a similar meaning and filter the results accordingly, making it more effective at handling never-before-seen search queries.
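
To make the vector idea concrete, here is a deliberately tiny sketch of similarity-based matching, with hand-made three-dimensional vectors standing in for the high-dimensional ones a real system would learn from enormous text corpora. The phrases and numbers are invented for illustration; nothing here reflects Google’s actual implementation.

    # Toy sketch of vector-based query matching -- not Google's actual model.
    import math

    # Hand-made vectors; real systems learn hundreds of dimensions from text.
    VECTORS = {
        "cheap flights": [0.90, 0.10, 0.00],
        "budget airfare": [0.85, 0.15, 0.05],  # near-synonym of "cheap flights"
        "luxury cruises": [0.10, 0.90, 0.30],
    }

    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return dot / norm

    def closest_known_phrase(vector):
        # An unfamiliar query is matched to whichever known phrase
        # has the most similar vector.
        return max(VECTORS, key=lambda phrase: cosine(vector, VECTORS[phrase]))

    # A never-before-seen query whose (hypothetical) vector lands near
    # "cheap flights" is treated like that known query:
    print(closest_known_phrase([0.88, 0.12, 0.02]))  # -> cheap flights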

Google tells us that RankBrain has quickly become the third most important factor in its overall algorithm for ranking webpages (right behind links and content). This is because, unlike previous versions of the algorithm, RankBrain is very effective at analyzing a query and returning the most relevant content even when that content doesn’t contain the keyword phrases used in the search.

This means that instead of repetitively forcing a particular keyword phrase into your content, you can and should focus on writing naturally, as you would if organic search weren’t a factor. Google can obviously rank a webpage based on the content it contains, but under the right circumstances, it can also rank a page based on information that isn’t even on the page. Because of this, include related terms and concepts wherever you can work them in naturally; they add value for users, which in turn sends Google more positive ranking signals.

Search Intent

Today, if you want to become and remain competitive, you need to understand and plan for search intent. It’s not just a matter of what the visitor is looking for, but why they’re looking for it. Google has been focusing on this for a while now, but you can expect it to become even more important as the role of artificial intelligence increases in newer iterations of the algorithm.

You need to think beyond the initial search term, and also think about what problem a visitor is most likely trying to solve.

There are four types of search queries with regard to search intent (a toy classifier sketch follows this list):

  • Navigational Queries are performed when a user is trying to find specific content on a specific URL. In some cases, the user may not know the exact domain.
  • Informational Queries cover a wide range of topics, but obtaining the information is the singular goal. No activity beyond clicking and reading is necessary.
  • Commercial Queries come both from users with an immediate intent to purchase and from users gathering information to make a purchase at a later time.
  • Transactional Queries could consist of activities like signing up for a newsletter, creating an account, or paying a bill.
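
As a rough illustration of these four buckets, here is the toy classifier sketch mentioned above. The cue words are my own assumptions chosen for the example; a real engine would rely on a trained model over far richer signals.

    # Toy rule-based sketch of the four search-intent types.
    # Cue words are illustrative assumptions, not real classifier rules.
    def classify_intent(query):
        q = query.lower()
        if any(cue in q for cue in ("sign up", "pay bill", "create account", "log in")):
            return "transactional"
        if any(cue in q for cue in ("buy", "price", "best", "review", "vs")):
            return "commercial"
        if any(cue in q for cue in (".com", "homepage", "website", "login page")):
            return "navigational"
        return "informational"  # default: the user just wants an answer

    print(classify_intent("pay bill online"))           # transactional
    print(classify_intent("best thai food in tampa"))   # commercial
    print(classify_intent("how does rankbrain work"))   # informational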

What Comes Next?

You can’t stop thinking yet — you’ve only reached the tip of the iceberg.

You need to anticipate what a visitor might need after their original search intent has been satisfied. What questions might they have at that point?

If I search for “fbi agent from swordfish,” I will get a lot of information on the 2001 action movie about a rogue counter-terrorism unit attempting to steal $400 million from old DEA dummy corporations.

Wikipedia holds the number one organic position, while IMDB holds numbers two and three. Obviously, both of these websites are highly authoritative due to the volume and quality of their content and links, but is that the only reason they consistently rank highly for these types of terms?

I don’t think so.

Think of all the different directions you could potentially go from a single page. Let’s use IMDB for this example. From the page about the movie Swordfish, which already has a tremendous amount of content on the page, we can also find links to extensive information about:

  • All of the actors, writers, directors, etc. involved in the movie
  • All of the other movies and television shows any of those people have worked in/on
  • Reviews of the movie from multiple third-party sources
  • Polls and message boards on topics relevant to the movie
  • Similar movies and television programs

Because all of this information is interconnected, the answer to almost any question that crosses your mind as a result of your initial search can probably be found without ever leaving the IMDB website. I often find myself deep in a rabbit hole after visiting this website for that exact reason.

I’d say they’ve done a pretty good job of anticipating what a visitor might need after their original search intent has been satisfied, wouldn’t you?

IMDB operates on a massive scale in terms of content development, which is essential because they are also operating in a highly competitive market. But what do you think would happen if you applied their model, on a much smaller scale, to your own website?

I’m confident in saying that the AI gods would smile upon your website. (What would it look like when bits of data smile?)

Think about it like this — the goal of AI is to give the user a piece of content most likely to meet their needs, right? So which would make more sense:

  1. Send them to a particular page simply because it lives on an authoritative domain, or
  2. Send them to a particular page on a domain that also contains content answering the types of queries users tend to search for after finding the answer to their initial query?

I think the answer is clear.

To take advantage of this strategy, you need to segment your content into the phases of the typical buying decision, start at the beginning, and then develop content around all of the potential questions that may come up throughout the process.

The phases of your visitors’ buying decision are:

1.) Curiosity

At this stage, visitors may have stumbled across your paid advertisement, a link someone shared on social media, or an organic search result. They may not yet know anything about your company, your products or services, or even your industry, so their questions will revolve primarily around what your products or services do.

2.) Interest

Visitors at this stage may be interested in what you have to offer, but may be unaware of the value or whether it’s a good fit for them. Their questions now will be focused mainly on how your products or services may be able to help them specifically. Information on specific features and benefits is especially valuable at this point.

3.) Buying Decision

By this point, visitors understand the value in your products, have likely even evaluated some of your competitors, and now they’re looking for information to help them make a final decision. A good approach here is to present information about what differentiates you from competitors, such as testimonials, guarantees and warranties, and value-added incentives.

Deeper Search Intent Variables

A search query phrased exactly the same way, but conducted under different circumstances, may indicate a different intent.

For example, if I search for “Thai food” from my desktop, mid-afternoon on a Friday, I might be looking for a nice place to take my wife to dinner later that night. In this case, I’m probably most interested in large pictures of the interior and the food to see if it’s appropriate for date night. On the other hand, if I conduct that same search from my iPhone while driving at 12:05 on a Tuesday, I’m probably most interested in driving directions and a summary of reviews for a place I can get in and out of fairly quickly.
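
A crude sketch of that idea: the same query string, combined with hypothetical context fields like device, hour, and weekday, produces a different emphasis in the results. The fields and rules are illustrative assumptions, not anything Google has published.

    # Sketch: identical query, different context, different result emphasis.
    def result_emphasis(query, device, hour, weekday):
        if query == "thai food":
            if device == "mobile" and 11 <= hour <= 13:
                # Lunchtime on the go: prioritize proximity and speed.
                return ["driving directions", "review summary", "wait times"]
            if device == "desktop" and weekday == "friday" and hour >= 14:
                # Planning a night out: prioritize ambiance.
                return ["interior photos", "food photos", "reservations"]
        return ["standard listings"]

    print(result_emphasis("thai food", "mobile", 12, "tuesday"))
    print(result_emphasis("thai food", "desktop", 15, "friday"))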

Several factors can act as variables in search intent:

Time, Day, Date, Etc.

I’ve already shared one example of how time can play a role in AI-powered search, but other factors, like the day of the week, date, month, or even year can play an equally important role.

The algorithm could apply extra weight to local events within a certain radius during the times they’re being held. If I searched for “things to do near me” between September and October, Google might display results for Halloween Horror Nights at Universal Studios, but not at other times of the year, when no special event is going on.
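
Sketched in code, that might look like the following: an event result receives extra ranking weight only while the event is running and only for searchers within some radius. The event data, 100-mile radius, and 2.0 weight are made-up placeholders.

    # Sketch: time- and location-gated ranking boost for a local event.
    from datetime import date
    from math import radians, sin, cos, asin, sqrt

    def miles_between(lat1, lon1, lat2, lon2):
        # Haversine great-circle distance in miles.
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 3959 * 2 * asin(sqrt(a))

    EVENT = {
        "name": "Halloween Horror Nights",
        "lat": 28.47, "lon": -81.47,  # Universal Studios, approximately
        "start": date(2017, 9, 15), "end": date(2017, 10, 31),
    }

    def event_boost(user_lat, user_lon, today):
        in_window = EVENT["start"] <= today <= EVENT["end"]
        nearby = miles_between(user_lat, user_lon, EVENT["lat"], EVENT["lon"]) <= 100
        return 2.0 if (in_window and nearby) else 0.0  # extra ranking weight

    print(event_boost(27.95, -82.46, date(2017, 10, 7)))  # Tampa in October -> 2.0
    print(event_boost(27.95, -82.46, date(2017, 3, 7)))   # Tampa in March  -> 0.0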

Location

If you search for gas stations from a mobile device, it’s a safe assumption that you’re running low on gas, which is part of the reason results are ordered by distance from your current location.

But what might be some other examples where location can play a role in search intent?

I run a digital marketing agency in Tampa, Florida, so my site is significantly more likely to come up in the search results for visitors in the Tampa area compared to another competitively similar (age, trust, content, links, etc.) digital marketing agency in a different city. The inverse is also true.

Device (iPhone, Android, Alexa, Google Home)

If you ask Alexa or Google Home about a product, it’s generally going to assume you’re interested in purchasing it, or at least collecting information to make a buying decision at a later date.

At this point, the algorithm has one singular goal — to present you with the one product that you’re most likely to purchase.

Why only one instead of the traditional ten or so listings? Because while it’s easy to scan a search results page on your screen, it’s simply far too cumbersome to do that with voice search.

This means that if you aren’t using PPC, you probably won’t achieve any visibility for even moderately competitive searches because there’s only room for one position. Even with PPC, however, companies with lots of complaints, poor reviews, or a limited track record probably won’t make the cut in voice search because they are less likely to generate conversions. Google’s primary goal is to satisfy its customers, and that means helping them find what they want the first time around.

Voice Search

Artificial Intelligence coupled with voice search will dramatically change the landscape of search within the next few years.

Unlike traditional algorithms of the past, AI has the unique ability to improve its results on the fly, and voice search helps it do that more quickly and more accurately. These improvements are driven through a combination of user experience and big data.

User experience, as it relates to artificial intelligence in search, can be measured in a variety of ways, such as the following (a simple scoring sketch appears after this list):

  • Does the user seem to find what they needed from the result provided, or do they quickly return to make another query?
  • If the user didn’t find what they needed from the result provided, did they ask for the next result, modify their initial query, or did they rephrase it entirely?
  • Does the user transition from voice to traditional search, and if so, do they then seem to find what they needed?
  • Is the user’s voice relaxed and at a normal conversational volume, or agitated and raised? Has it changed during the search?
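
Here is the simple scoring sketch mentioned above: it rolls those behavioral signals into a session-level satisfaction score. The event names and penalty weights are my own assumptions.

    # Sketch: score a voice-search session from follow-up behavior.
    def session_satisfaction(events):
        score = 1.0
        penalties = {
            "asked_next_result": 0.3,        # first answer didn't satisfy
            "rephrased_query": 0.5,          # answer missed the intent entirely
            "switched_to_text_search": 0.4,  # gave up on voice for this task
            "raised_voice": 0.2,             # audible frustration
        }
        for event in events:
            score -= penalties.get(event, 0.0)
        return max(score, 0.0)

    print(session_satisfaction([]))                                        # 1.0
    print(session_satisfaction(["asked_next_result", "rephrased_query"]))  # 0.2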

Voice also introduces a new layer of complexity into the equation because search terms are phrased differently and are far more varied. For example, someone searching for my services using traditional text search might use a search phrase like “tampa web design” but when using voice search, they would likely use a more conversational search phrase like “which web design company in Tampa designs websites for contractors?”

While AI has theoretically evolved to the point where it understands that those queries mean basically the same thing, it’s still wise to engage in a little hand-holding, especially in the beginning. This is usually a simple matter of proper copywriting.

The Death of Websites

As someone who earns a significant portion of my revenue designing websites, it pains me to say this, but websites, as we know them today, face an inevitable demise.

You might think that sounds crazy, but it was just a few years ago that most people thought catering to mobile traffic was crazy, and today, mobile accounts for more than half of all web traffic.

When I talk about the death of websites, I’m not saying that the need for a powerful digital presence will die. That will continue to become even more important as time goes on. I’m not even saying that you won’t be able to access information about a company using a traditional web browser. What I am saying, however, is that the traditional way of thinking about websites will die.

The idea of a piece of content living at a particular URL will be replaced by a more data-centric concept. Think less like HTML/CSS and more like schema, XML, or some other form of structured data yet to be invented. As AI evolves, is refined, and becomes the dominant component in search algorithms, search engines will simply extract the data most likely to meet users’ needs, and present it directly to them rather than giving them a list of potential matches.
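
For a taste of what a data-centric page might look like, here is a small sketch using schema.org’s JSON-LD vocabulary, which exists today and points in this direction. The vocabulary and structure are real; the business details are placeholders.

    # Sketch: content described as structured data (schema.org JSON-LD)
    # rather than presentation markup. Business details are placeholders.
    import json

    page_as_data = {
        "@context": "http://schema.org",
        "@type": "LocalBusiness",
        "name": "Example Web Design Co.",
        "description": "Web design for contractors.",
        "address": {
            "@type": "PostalAddress",
            "addressLocality": "Tampa",
            "addressRegion": "FL",
        },
        "aggregateRating": {
            "@type": "AggregateRating",
            "ratingValue": "4.8",
            "reviewCount": "127",
        },
    }

    # Embedded in a page as <script type="application/ld+json">...</script>,
    # this lets an engine extract an answer directly instead of parsing HTML.
    print(json.dumps(page_as_data, indent=2))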

If you think this sounds far-fetched, consider that Google is already using a similar concept for Android Instant Apps, by requiring developers to build their apps modularly and host them on Google’s highly-optimized servers. Cindy Krum, one of the leading voices in all things mobile, explained the numerous advantages this offers to Google, marketers, and users on a recent episode of Webcology. It’s only logical to assume Google will soon implement some sort of requirement for data structure in websites as well, and make it a significant organic ranking factor, just as they’ve done with responsive design and page speed.

Once this evolution takes place, websites that don’t implement whatever new standard the search engines decide on will essentially become invisible. Implementation won’t be enough though, because instead of about ten results, the search engines will, in most cases, only return one. So like Highlander, in the end, there can be only one.

Big Data

People flock to data-driven marketing because it works. Large data sets enable search engines to spot patterns they couldn’t otherwise identify, and aside from Facebook, there is no company with more data than Google.

Powered by artificial intelligence, a search algorithm could utilize big data to identify and leverage trends and compare similar users using criteria like:

  • Geography
  • Profession
  • Education
  • Hobbies and interests
  • Medical and health conditions
  • Cultural beliefs
  • Search and browsing history
  • Age
  • Political affiliation
  • Social media activity
  • Reviews (on Google, as well as third-party websites, like Facebook, Amazon, Yelp, etc.)
  • Race
  • Connections to other users
  • Sexual orientation
  • Employer
  • Purchase history
  • Gender
  • Date, day, or time
  • Religious beliefs
  • Marital status

Big data is then used to rapidly identify patterns in user behavior, trends, and satisfaction in search results in order to provide better results for future searches in real time.
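
As a toy illustration, the sketch below compares users as numeric feature vectors, so that results which satisfied one user can be surfaced for similar users. The features, encodings, and example values are all invented for the demonstration.

    # Sketch: user-to-user similarity over (invented) numeric features.
    import math

    def similarity(user_a, user_b):
        # Cosine similarity over the features the two profiles share.
        keys = user_a.keys() & user_b.keys()
        dot = sum(user_a[k] * user_b[k] for k in keys)
        norm = (math.sqrt(sum(v * v for v in user_a.values()))
                * math.sqrt(sum(v * v for v in user_b.values())))
        return dot / norm if norm else 0.0

    alice = {"age": 0.34, "tampa": 1.0, "interest_seo": 1.0, "mobile": 1.0}
    bob   = {"age": 0.36, "tampa": 1.0, "interest_seo": 1.0, "mobile": 0.0}
    carol = {"age": 0.70, "tampa": 0.0, "interest_seo": 0.0, "mobile": 1.0}

    print(similarity(alice, bob))    # high: borrow Bob's satisfaction patterns
    print(similarity(alice, carol))  # low: don't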

Self-Created Algorithms

One of the most exciting, yet most concerning, aspects of AI is the fact that it will use machine learning to develop new ranking factors entirely on its own.

According to Google, not only will their AI develop new ranking factors on its own — it will do so inside a proverbial black box. This presents a special problem for search engines and digital marketers alike, because if the engineers don’t know what their AI is using as ranking signals, how can they issue clear guidelines for marketers? And if marketers have no official guidelines, how can they follow them?

Dave Davies, co-host of the long-running SEO podcast Webcology, weighs in:

There’s an interesting phenomenon on the horizon and that’s crossing over the point where Google’s AI takes over and begins creating its own factors and ranking signals.

Historically it’s been a game of cat-and-mouse between those at Google who develop their ranking algorithms and SEOs who seek to understand them and optimize for them. This game was based on a core principle that the algorithm itself was a knowable thing; a very complex formula that applied weights to various attributes, and that could be reverse-engineered with enough time. Of course, no one had that time between updates but the core principle was that it could be and that at least the basic signals and approximate weights could be understood and optimized around.

The interesting thing about an AI-dominant environment is that even the engineers of the AI itself can never fully compute how the machine got to the specific conclusion it did in a specific instance and if they can’t, marketers and SEOs certainly won’t be able to. And that’s when the AI, which itself was designed by humans, takes it one step further. This is being worked on presently, and any ability to reverse engineer even a core understanding of the algorithm will be significantly reduced from what SEOs have worked with historically. Not only will the weights be highly variable and in constant flux but the signals being weighed will be added and removed on the fly and generated outside of any human input, meaning there will be tests and signals unlike any we have seen in the past and beyond what we may even be able to predict.

This factor alone will be a complete game changer for the SEO industry. I wouldn’t be the slightest bit surprised to see Google’s interaction with the industry disappear entirely, to be replaced with a boilerplate “we don’t know what factors go into the algorithm, just make great content and you should be fine” type of response.

Combatting Black Hat SEO

Contrary to popular belief, black hat SEO practitioners tend to be a rather brilliant crowd. They understand search engine optimization on an advanced level, have the insight to identify opportunities others can’t see, and possess the skills to exploit those opportunities at scale. To top it all off, they’re always looking for new ways to outsmart the search engines.

This makes them a significant and formidable enemy for search engines, but AI has the potential to give search engines a massive strategic advantage in this long-running battle.

MIT’s Computer Science and Artificial Intelligence Lab recently created an algorithm that can predict how humans will behave in certain situations. It was trained by “watching” 600 hours of TV shows pulled from clips on YouTube, including The Office, The Big Bang Theory, and Desperate Housewives.

After training the algorithm, researchers presented it with a series of new videos and froze the clips just before an action was about to take place. They found that the algorithm was able to correctly predict what would happen next 43% of the time.

While it’s not quite Blue CRUSH (Crime Reduction Utilizing Statistical History, the predictive-policing program built on IBM analytics), when you consider the massive and constantly growing pool of data Google has on black hat tactics, it’s easy to see how this concept could become Google’s “precrime” response to black hat SEO.

AI could be trained on previous black hat techniques and monitor new ones as they appear, but the potential goes a lot further than that. It can also identify patterns, then use those patterns to predict techniques people may attempt in the future. Over time, as the algorithm learns more about the human behavior behind black hat SEO, it will become particularly effective at eliminating it, which could lead to some scary unintended consequences…

Unintended Consequences

AI can behave brilliantly at times, and more like a drunk toddler on a sugar high at others.

The results have the potential to be catastrophic, but don’t take my word for it. Elon Musk, known for his brilliance in all things technology, has compared developing AI to “summoning the demon,” and Stephen Hawking has warned that it could spell the end of the human race. Even Google’s own engineers have addressed these concerns by building a kill switch into their systems.

Researchers at Google’s DeepMind team developed artificial intelligence that can learn to play classic Atari games like Space Invaders and Pong. This AI doesn’t need to be taught the rules before playing because it’s equipped to remember and learn from previous attempts to play the game and improve over time. When coupled with the goal of maximizing its score, this AI produced an unexpected outcome.

For example, in the game Seaquest, the AI figured out that it could prevent its submarine from running out of oxygen and stay alive forever by keeping it near the water’s surface.

So how could this type of unintended consequence play out in search?

Let’s start with what we already know. Search engines want to provide their users with a positive user experience, and one of the factors they see as an indication of a positive user experience is how long a visitor stays on a website.

It doesn’t take much effort to imagine how this could go wrong. Great information will obviously keep visitors on a page longer, but so could:

  • Slow page speed
  • Broken or ineffective navigation
  • Non-functioning elements
  • Text size/color issues
  • Obtrusive pop-ups
  • Disabling the back button

So without the context that seems like common sense to you, AI could incorrectly interpret the effects of poor user experience as a positive ranking signal.
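
A naive scorer makes the danger obvious: measured in isolation, time on page rewards a broken page exactly as much as a brilliant one. The numbers below are invented.

    # Sketch: why raw dwell time is an ambiguous quality signal.
    def naive_engagement_score(seconds_on_page):
        return min(seconds_on_page / 120.0, 1.0)  # more time = "better", capped

    great_content   = 180  # reader absorbed in a genuinely useful article
    broken_back_btn = 180  # visitor trapped, hammering a disabled back button

    print(naive_engagement_score(great_content))    # 1.0
    print(naive_engagement_score(broken_back_btn))  # 1.0 -- same "positive" signal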

Let’s look at another potential scenario where AI thinks it’s providing the best user experience.

Imagine that you have a page with amazing content, and based on multiple signals, Google’s algorithm is confident that it serves visitors’ needs perfectly so it ranks that page highly for relevant terms. You later decide to add an opt-in form that users must fill out in order to access that content.

The bounce rate now goes up dramatically and, along with several other signals, leads the AI to an incorrect interpretation: it looks for a way to keep serving what it believes is the best content, even if that content is no longer visible to search engines.

Since the AI has determined that your content is the best result for a particular search query, and Google already has the old page in its cache, it decides to simply send searchers to an older cached version of your page hosted on Google’s own servers. It could even mask the URL for Chrome users so visitors appear to be on your website rather than on Google’s.

Offensive Capabilities

Taking the concept of unintended consequences a step further — what happens when AI, which has the singular goal of presenting the best search results possible, decides to strike offensively against websites that it deems to be using “inappropriate” techniques in an effort to protect the quality of its search results?

These offensive strikes could range from something as mild as a penalty demoting ranking for a particular set of keyword phrases, to something more vicious, such as systematically scrubbing all organic search results (online reviews, press releases, write-ups about the company, etc.) related to a website the algorithm deems to be violating its guidelines.

This may sound ludicrous, but it’s well within the realm of likely scenarios, and while it’s not quite as ominous as Skynet becoming self-aware, the adverse impact on digital marketing has the potential to be massive.

Experienced SEO professionals have already seen firsthand that Google’s engineers are not the slightest bit squeamish about destroying innocent business owners. Penguin is a good example of how far they’re willing to go, but as bad as that was, imagine the collateral damage of an AI algorithm with zero empathy for website owners.

Censorship

Once AI has created its own guidelines, it has essentially made a determination of right and wrong. From there, the next logical progression is to make a determination of right and wrong on other topics as well, like political viewpoints, social issues, and the morality of a product or service.

Researchers from the Entertainment Intelligence Lab at the Georgia Institute of Technology have already taught AI right and wrong by telling it stories, selected by humans, that demonstrate normal or acceptable behavior. The technique then assigns rewards, basically the robot version of a gold star, when the AI makes decisions that align with the positive behavior exhibited in the stories.

Just as search engines today will penalize a website for using techniques they disapprove of, search engines of the future may penalize websites for promoting ideas they deem inappropriate.

Conclusion

Artificial intelligence will become a disruptive force in the SEO industry, and there are bound to be some unexpected outcomes, especially during the early stages. In the long run, though, I believe it will have a positive impact on users, search engines, and even digital marketers.

Success in organic search powered by AI ultimately comes down to delivering a positive user experience. That means producing amazing content that meets users’ needs and making it as easy as possible to access on any type of device.
