DEV Community

iFOCUS - Digital Marketing Partner


SEO Restart 2023 - SEO Restart in the AI Era

On March 30, 2023, the SEO Restart conference was held again in Prague. Just like last year's SEO Restart 2022, it covered the latest topics in the SEO industry for 2023. To benchmark our own SEO services and procedures against the industry, we did not hesitate to attend this information-packed conference.

This year's SEO Restart aimed to take a sober look at AI. Some lectures even cast doubt on the reliability of AI itself and encouraged viewing metrics from well-known tools with a dose of skepticism. Valuable information and insights were found not only in the presented topics but also in the atmosphere of the conference.

Where is SEO heading in the AI era?

The opening keynote, titled "Quo vadis SEO?", was delivered by Jan Tichý. He reflected on the evolution of SEO and the industry as a whole, recalling the old days when SEO consisted solely of keyword analysis and link acquisition without regard for quality: in other words, a "cargo cult" in which practices were imitated even when they were wrong.

As an alternative to those approaches, he argued for SEO that is good from the search engine's perspective, meaning SEO that serves searchers effectively. This involves moving away from bare keyword analysis toward comprehensively and thoroughly covered topics on the website. He also discussed the arrival of RankBrain and the declining importance of individual keywords, the growing significance of topics and landing pages, and the subsequent rise of linkbaiting.

He offered a partial prediction of an approaching decline in organic traffic in the AI era, as language models will provide far more extensive and relevant answers. Thanks to language models, a topic can be expressed in words that have not been used before; however, this still amounts to reproducing information already shared in other sources. AI, at its current stage of development, is not suited to creating something new, unique, and unknown to others. Search engines will not be replaced by answer generators, since monetization must continue; perhaps the next step will be a fee-based recommendation system.

Answer generators will not access information sources any differently than before. Like teachers, they will favor authoritative sources. An answer generator acts as a summary of knowledge; if you ask for something specific, it will direct you to an authoritative source.

A second emerging theme was distinctiveness and originality, exemplified by a generic article titled "How to choose ice skates?". It will be necessary to offer information beyond "common knowledge." The usefulness of such an article is effectively zero, since countless similar articles already exist. In this respect, classic SEO ends: there is no way to influence the answer generator and its source database, because it has no need to cite such a source when it can generate the same content from its own database.

SEO itself does not end, and GPT will not take over the work; the role of the SEO consultant simply changes. In that role, the word "SEO" itself carries little weight. It will be necessary to bring something new.

Being quoted and referenced will be essential, and this is where brand building comes into play: GPT needs to encounter and recognize you beyond "common knowledge." The work of an SEO consultant therefore continues, and it is necessary to bring something new. We return to people and to providing value to them. In his view, keyword analysis will be irrelevant within two years; building links that attract people to the website and establishing the brand will be what matters.

Vojta Fiala - Practical Use of AI in SEO for Beginners and (Intermediate) Advanced

In his presentation, Vojta Fiala mentioned the most notable AI tools that currently exist. According to him, OpenAI, Jasper, Adcreative.ai, Sydney, Bard, and Rytr are worth testing and using. Among the AI tool plugins, notable mentions are "ChatGPT for sheets," "AIPRM" (for Chrome), and upcoming plugins being developed by "OpenAI."

Based on his practical experience, the existing language models are most suitable for tasks such as identifying personas, understanding "jobs to be done" (what is necessary for a customer to accomplish), identifying suitable topics, preparing article structures, creating outlines for FAQs, estimating user intent, preparing meta tags, and writing regular expressions.
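The talk did not share concrete prompts or outputs, but the last task in the list, having a model draft regular expressions, is easy to illustrate. The sketch below shows the kind of regex one might ask a language model to produce for separating brand queries from non-brand queries in a Search Console export; the brand name, its spelling variants, and the sample queries are all invented for the example.

```python
import re

# Hypothetical LLM-drafted regex: match the brand "iFocus" in search
# queries, tolerating a space or hyphen between "i" and "focus".
brand_re = re.compile(r"\bi[\s-]?focus\b", re.IGNORECASE)

queries = ["iFocus recenze", "i focus agentura", "seo audit praha"]
brand_queries = [q for q in queries if brand_re.search(q)]
print(brand_queries)  # ['iFocus recenze', 'i focus agentura']
```

In practice the value of the model lies less in the regex itself than in iterating on it: describing edge cases in plain language and letting the model adjust the pattern.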

Pavel Ungr - Transformation of Content Creation and Optimization Using AI for SEO: New Opportunities

The introduction of the lecture recalled that the first attempt at a form of AI conversation could be observed from Google on May 15, 2013, when the so-called "conversational search experience" was introduced.

Since then, we have progressed to full language models. According to Pavel, their content quality is sufficient, but care must be taken not to cross the line where the content becomes "spammy." His lecture consisted of a list of AI tools and browser extensions already available today; the full list can be found in the linked presentation. To identify whether content was generated by a language model, he unequivocally recommended Originality.ai.

Lukáš Kostka - AI and RStudio: Meta Tag Optimization in "One" Click!

This was one of the few presentations that delved deeply into language models, specifically the use of ChatGPT for generating descriptions (meta tags) for web pages. The presented solution was implemented with AI and RStudio. For working with Czech and Slovak, GPT was recommended, with GPT-4 noted as the better choice.

The essence of working with a model such as GPT lies in the instructions given to it, known as "prompt engineering". Rather than fine-tuning the model, stable responses can be achieved by including one or more examples in the prompt that simply show what the desired output should look like; this is referred to as "one-shot" or "few-shot" prompting.
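The exact prompts from the talk were not published, so the following is only a minimal sketch of what a one-shot prompt for meta-description generation could look like; the function name, wording, and example page are all illustrative, and the call to an actual model API is deliberately omitted.

```python
def build_one_shot_prompt(page_title: str, page_summary: str) -> str:
    """Assemble a one-shot prompt for meta-description generation.

    The single worked example shows the model what the desired output
    should look like ("one-shot"); adding more examples would make
    the prompt "few-shot".
    """
    instructions = (
        "Write a meta description of at most 155 characters "
        "for the page below, following the example.\n\n"
    )
    example = (
        "Page title: Winter Tyres 2023 Test\n"
        "Page summary: Independent test of 15 winter tyres.\n"
        "Meta description: We tested 15 winter tyres so you don't "
        "have to. See which models braked best on snow and ice.\n"
    )
    task = (
        f"Page title: {page_title}\n"
        f"Page summary: {page_summary}\n"
        "Meta description:"
    )
    return instructions + example + "\n" + task

prompt = build_one_shot_prompt(
    "How to Choose Ice Skates",
    "Buying guide covering blade types and sizing.",
)
print(prompt)
```

The prompt ends mid-pattern at "Meta description:", so the model's most likely continuation is exactly the field we want it to fill in.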

The workflow with AI and RStudio covers landing-page analysis, keyword analysis, web content scraping, pulling data from Google Search Console (GSC), analyzing that data, selecting landing pages for optimization, and finally assembling a prompt with a "one-shot" example for the model. Lukáš has published the full process on the company's blog, and the source code can be found in his GitHub repository.

Milan Zeman - SEO 3x Differently: 3 Case Studies with Real Impact on B2B Business

Three case studies were presented, outlining a general optimization workflow. One example was the client website Algotech, where the goal was to catch up with competitors' rankings; this resulted in a 20% increase in organic traffic. A "content gap" analysis followed, leading to a plan of roughly 50 articles per year and the acquisition of around 20 quality backlinks. The work proceeded on a weekly cadence and yielded an 18% increase in click-through rate (CTR).

After a year, the collaboration was extended. The keywords were divided into two groups: one group consisted of keywords closer to conversion, and the other group consisted of keywords further from conversion. For the keywords in the first group, meta descriptions and titles were optimized to improve CTR. Selected subpages were equipped with call-to-action buttons. Following these adjustments, there was a 40% increase in conversions.

When acquiring backlinks, the focus was not on "exact match" keywords, but on incorporating links through text such as "TIP" and direct URL links.

In another B2B case study of TotalService, an SEO audit was conducted, which did not reveal any significant issues. An important step was the development of an SEO strategy, in which the financial impact for the client was modeled. The final recommendation was to consistently adhere to the SEO strategy and view SEO activities through the eyes of an investor.

Zdeněk Nešpor - Titan Websites

The central idea of the lecture was that large websites, referred to as "titan" websites, are governed by their own internal rules and therefore make poor inspiration for ordinary sites. In the case of Heureka.cz, the numbers speak for themselves: 30,000,000 products, 65,000 subdomains, and 3,000 categories.

For websites this extensive, full crawling does not work, and it is not safe to rely solely on output from Google Analytics or Google Search Console. There have been cases where Google Search Console reported 100,000,000 URLs dropped from the index when nothing of the sort had happened and organic traffic did not decrease.

A more natural approach is to sketch the current website structure with pencil and paper and try to understand it. The structure and links can then be laid out in an ordinary Google Sheets document. Crawling the entire website is unnecessary: a "sample" of the content and its structure is sufficient for modeling further adjustments, so a tool like Screaming Frog and a fraction of the crawled data are enough. To understand the structure better, gather access logs to find additional URLs. Keep in mind that with such a content-rich website, the goal is to reduce content and streamline the structure.
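Heureka's actual tooling for this was not shown, but mining access logs for URLs can be sketched in a few lines. The snippet below assumes the common Apache/Nginx log format and uses invented sample lines; sampling logs like this surfaces URLs a partial crawl would never discover.

```python
import re
from collections import Counter

# Request part of a combined/common log line: '"GET /path HTTP/1.1"'
REQUEST_RE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) HTTP/[\d.]+"')

def urls_from_access_log(lines):
    """Count distinct URL paths seen in access-log lines."""
    counts = Counter()
    for line in lines:
        match = REQUEST_RE.search(line)
        if match:
            # Strip query strings so parameter variants collapse
            # into a single path.
            counts[match.group("path").split("?")[0]] += 1
    return counts

# Invented sample lines standing in for real server logs.
log = [
    '1.2.3.4 - - [30/Mar/2023:10:00:00 +0200] "GET /tv/ HTTP/1.1" 200 512',
    '1.2.3.4 - - [30/Mar/2023:10:00:01 +0200] "GET /tv/?sort=price HTTP/1.1" 200 512',
    '5.6.7.8 - - [30/Mar/2023:10:00:02 +0200] "GET /mobily/ HTTP/1.1" 200 128',
]
counts = urls_from_access_log(log)
print(counts)
```

Sorting the resulting counter and diffing it against the sampled crawl quickly shows which heavily requested sections the crawl sample missed.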

Further steps with such a content-rich website involve data processing: XPath and regular expressions for extraction, SQL and Kibana for analysis, and Google Tag Manager for implementing and testing titles and, if needed, deploying structured data.
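The lecture did not show code, so this is only a toy illustration of how XPath and a regular expression combine in extraction: XPath locates elements in the page structure, and a regex then pulls a value out of free text. The markup, element names, and product count are invented; Python's `xml.etree.ElementTree` is used here because it supports a small XPath subset out of the box.

```python
import re
import xml.etree.ElementTree as ET

# Invented, well-formed snippet standing in for a crawled category page.
page = """
<html>
  <head><title>Televize | Heureka.cz</title></head>
  <body>
    <h1>Televize</h1>
    <p>Nalezeno 30 000 000 produktu</p>
  </body>
</html>
"""

root = ET.fromstring(page)

# XPath-style paths locate the structured parts of the page.
title = root.find("head/title").text
h1 = root.find(".//h1").text

# A regular expression then extracts the number from free text.
match = re.search(r"Nalezeno ([\d\s]+) produktu", root.find(".//p").text)
count = int(match.group(1).replace(" ", ""))

print(title, h1, count)
```

Real HTML is rarely well-formed XML, so in practice a lenient parser (e.g. an HTML parsing library) would replace `ElementTree`, but the division of labor between path expressions and regexes stays the same.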

Last but not least, working with such a content-rich website involves meetings, promoting SEO activities in various departments, and education. In general, it is recommended to apply minimalism when assigning tasks.

Lastly, he briefly recommended using subdomains to separate CMS solutions or thematically distinct sections of a website.

Radek Kupr - How to Utilize GA4 for SEO

A unique lecture focused on the latest insights into the new Google Analytics 4. Radek Kupr emphasized that the tool revolves around event tracking and has its own important dimensions and metrics. He highlighted the default reports as well as creating and exploring custom reports, and stressed integration with Google Search Console, Google Ads, Ads Manager, Merchant Center, and, if necessary, BigQuery.

He delved into terms such as "engaged session," "engagement rate," and "engagement time," emphasizing that in Google Analytics 4 it is the user who takes action, and that the "total users" metric defines the overall count of website visitors.

For processing data from highly visited websites, he recommended BigQuery or Keboola, while Looker would suffice for smaller clients.

Zdeněk Dvořák - Weaknesses of Ahrefs: What Are They, and What to Do About Them When Doing Link Building?

Zdeněk Dvořák's lecture raised doubts about the metrics provided by tools. The commonly used Ahrefs has its own pitfalls, and its metrics need to be properly understood to avoid misleading interpretations. Ahrefs maintains three types of indexes: a historical index, a recent index, and a "live" index in which links are re-checked to confirm they still existed within the past few days.

He also pointed out the partnership between Screaming Frog and Ahrefs, which offers the possibility to obtain 300,000 links for free.

The most practical part of the lecture focused on exposing common manipulation techniques that lead to an inflated value of metrics and a misleading interpretation of their importance. Some of the mentioned manipulation techniques included registering in "directories" where websites are automatically listed, links from websites that steal content and images, and the well-known practice of cloaking. These practices not only increase the number of backlinks but also manipulate the domain authority (DA).

To illustrate quick identification of a suspicious website, he gave an example using the "site:nazovwebu.cz" operator: a large number of indexed pages combined with a low number of keyword appearances in Ahrefs' Keywords Explorer indicates manipulation of Ahrefs' metrics. He stressed that "domain authority" is not always reliable, and that it is better to look at the "traffic" section under "trends."

Václav Brynda - Link Building and How Strategies Have Changed for 2023 - Back to the Roots?

The only lecture on link building and its 2023 trends drew on Václav Brynda's personal experience and proven practices. In link building, he focuses on discussions with brand mentions and on blogging, without the use of AI. He has found success with reviews and on-site discussions; ideally, technical articles about a tool or service that link to a client's e-shop, or include it as an example, work well for him. Links from mystery shopping succeed about half the time.

In his experience, about 90% of his links come from existing contacts. He emphasized avoiding a detectable "link pattern," noting that three links from the same article already signal a clear pattern. An ideal link comes from a website with traffic potential, so sites with existing traffic are a good choice; Ahrefs and Similarweb are suitable for this research. When choosing between a link from an article costing several thousand euros and a link from an archive, he tends to choose the latter.

For link diversity, he relies mainly on branded anchors pointing to the homepage. He ensures that the article is perceived as genuinely editorial, and complements anchor-text links with bare URLs or a mix of cited articles and blog sections. He is in favor of link exchanges, but prefers three- or four-way exchanges.

He reminded the audience not to forget about guest posts, purchased articles, offline link building, interviews, and sponsorships.

Richard Klačko - Compared: Do Tools Have the Right Data on Keyword Search Volume?

Richard Klačko's lecture continued the skepticism toward measurement-tool metrics, this time backed by data. Several tools were compared, including Google Search Console, Collabim, and Marketing Miner. Google Search Console displays impressions but not "search volume." In Collabim, search-volume data is clustered, meaning related and similar words share a common search-volume value. In short, one tool underestimates while another overestimates; each had its strengths and weaknesses in data processing, and there was no clear winner.

Martina Zrzavá Libřická - Challenges and Obstacles in the World of SEO People

The final presentation of SEO Restart 2023 was Martina's reflection on the other qualities an experienced SEO consultant should develop. She also discussed well-known problems encountered in client assignments, SEO issues, and collaboration in general, and offered recommendations from a project-management perspective. Finally, she called for a more active, connected community of women SEO consultants who would take part in valuable, content-packed conferences like SEO Restart in the coming years.

Conclusion on AI in SEO

Judging by the trends presented at SEO Restart 2023 around SEO, search-engine algorithms, AI, and language models, it is clear that AI will become an integral part of the search engine optimization process. However useful AI may be for optimizing for existing search engines, the new possibilities must be handled carefully.

The individual presentations make it evident that using AI in SEO is not a guaranteed, definitive recipe for success. It will be important to consider the context and the way newly developed tools process data, including how the underlying language models are configured, because search engines, acting as "responders," will be processing content created this way in the future.

Quality content remains an important factor: it should be unique, original, and beneficial to visitors. The added value of an SEO consultant will not be lost; many ancillary activities still require detailed control and the imprint of human work (eliminating patterns). Website loading speed, quality backlinks, proper technical optimization, website accessibility, and targeted content work all remain crucial.

Ultimately, AI tools, which will continue to evolve, need to be harmonized with traditional SEO practices to achieve the desired organic traffic. In the near future, the credibility (authority) needed for a client's website to be cited by an accessible language model will matter as well.
