
Ryan Jones, marketing manager at SEOTesting in the U.K., has been pushing
out valuable insights on LinkedIn during the past few days.
"Most SEOs stop at reporting on traffic," Jones wrote in a post, "but traffic isn't the business
outcome. Conversions are."
He walked through a custom Google Analytics 4 report, which he says marketers can set up in less than two minutes. It shows which content on a website drives conversions -- not just clicks.
The step-by-step process includes naming the report, adding dimensions such as session source, adding key events, and building the report. All the steps can be found here.
Following his steps, Jones explains, marketers can see which landing pages bring traffic, which pages drive engagement, and which pages deliver conversions.
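The core logic of the report -- grouping sessions and key events (GA4's term for conversions) by landing page and ranking by conversion rate -- can be sketched in plain Python. The sample rows and page paths below are hypothetical, for illustration only; GA4 surfaces the same figures through its reporting interface or Data API.

```python
# Illustrative sketch of the report's logic: group sessions and key events
# (conversions) by landing page and rank pages by conversion rate.
# The rows below are hypothetical sample data, not real GA4 output.

sample_rows = [
    # (landing_page, sessions, key_events)
    ("/blog/seo-testing-guide", 1200, 36),
    ("/pricing", 400, 28),
    ("/blog/ga4-reports", 900, 9),
]

def conversion_report(rows):
    report = []
    for page, sessions, key_events in rows:
        rate = key_events / sessions if sessions else 0.0
        report.append({"page": page, "sessions": sessions,
                       "key_events": key_events, "conv_rate": round(rate, 4)})
    # Pages that deliver conversions float to the top -- not just traffic.
    return sorted(report, key=lambda r: r["conv_rate"], reverse=True)

for row in conversion_report(sample_rows):
    print(row["page"], row["sessions"], row["key_events"], row["conv_rate"])
```

Note how the highest-traffic page is not necessarily the top converter -- which is exactly the gap between "traffic updates" and business-impact reporting that Jones describes.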
A device category column can be added to split performance by desktop, mobile, and tablet, he explains, for those who want more detail.
Jones wrote that the experiment changes "SEO reporting from 'traffic updates' into business impact storytelling," because it shows the specific pages, formats, and topics that generate results.
On Thursday, Jones reported that he had run about 140 search engine optimization (SEO) tests, and shared the results -- what "moved the needle."
The tests were run on the SEOTesting website using ChatGPT's Deep Research to analyze the results. Some patterns were difficult to ignore, he said.
Positive outcomes include content refreshes that often delivered double-digit percentage gains in clicks per day, internal linking improvements that led to consistent test gains, and embedded YouTube videos that produced some of the largest click gains.
Fixing keyword cannibalization by merging pages also seemed to produce strong results.
Among other findings, he said, small title tweaks did not improve much on their own, and broad titles increased impressions but saw click-through rates fall. Adding generic schema markup in isolation typically led to flat results.
"Tests that added meaningful value (fresh content, stronger links, video embeds) drove real traffic growth," he wrote. "Tests that chased
impressions without improving relevance or intent often hurt performance slightly."
Jones also posted the raw numbers:
87 test wins
49 test losses/learnings
4 tests leading to flat click graphs with no meaningful change
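Those figures line up with the "about 140 tests" total, and the split can be sanity-checked with a few lines of arithmetic:

```python
# Quick arithmetic check on Jones's reported test results.
wins, losses, flat = 87, 49, 4

total = wins + losses + flat
win_rate = round(wins / total * 100)

print(total)     # -> 140 tests in all
print(win_rate)  # -> 62 (percent of tests counted as wins)
```

So roughly three in five tests came out as wins, consistent with his summary that value-adding changes drove real growth while the rest were losses or flat.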
Jones felt it was important to note that SEOTesting is a software as a service (SaaS) site. The other option is on-premise
software.
I'm not sure why this makes a difference.
If anyone who reads this column understands the difference, please comment.