41. Effective SEO Testing | Kyle Roof | Part 1


Doug: Hey, what's going on?

Welcome to the ranking revolution podcast.

My name is Doug Cunnington.

And in this episode, we talk to Kyle Roof.

It's actually a two-parter, and this is the first part.

In this episode, we talk about SEO
testing and how to run effective tests.

How you can run tests on your
own and some of the pitfalls

that people run into as well as I
actually provide a nice little idea.

I donate an idea that you could run with, and I'm almost 99 percent sure, I think Kyle was convinced too, that it would be a great idea for, like, a YouTube channel and a few other things.

It's a good idea.

I just don't have time to do it myself.

So this is a great one, especially if
you're interested in SEO testing and

being effective at running such tests.

And there's going to be a second part where we talk about the keyword golden ratio, but that's a separate episode. Without further ado, let's hear from Kyle.

Hey, what's going on and welcome to
the ranking revolution podcast, your

go to source for strategies and ideas
for SEO, organic growth, content

creation, and online business today.

I'm excited because we have my friend Kyle Roof.

Kyle is involved in many SEO and
online businesses from the on page SEO

tool, Page Optimizer Pro, to training
at Internet Marketing Gold, IMG, and

he's at conferences all the time.

In fact, he's a founder, one of the
founders of the SEO Estonia conference,

and he's a speaker and attendee of
many conferences across the world.

He's dedicated nearly a decade, actually more than a decade, of his life to SEO testing.

And he's published the results of over 400 scientific tests he's conducted over on IMG.

So over the years, he's fine tuned
and developed a method to test

whether single variables are ranking
factors in the Google algorithm.

He even has a patent in this area.

We're going to cover two main topics today: one around SEO testing and using those results, and the other about the keyword golden ratio.

So we're going to get into those details.

Kyle, how's it, how's it going today?

Kyle: That was a fantastic intro.

My mother will be very
happy to hear that one.

Doug: I think I just got
it from your about page.

I think you wrote the whole thing.

I'm just kidding.

Thank you.

It works out well when I prepare. But it's been a little while since we caught up.

So what's new with you, man?

Kyle: Oh man.

It feels like, uh, things just get busier and busier, doesn't it?

Like, time flies.

Lots of work going on, doing
lots of shows, doing lots of

testing, doing a lot of SEO.

Yeah, same old, same old, I guess.

Doug: All right.

Well, it looks like you're
having fun doing it.

And I know when I check out
my YouTube feed or my podcast

feed, your name pops up a lot.

So good job getting out there and
spreading the word and all that stuff.

You're a great guest,
which is why you're here.

Let's just jump right into it.

So let's talk about SEO testing and
maybe some of the mistakes that people

do make and some of the things that
you figured out so that you not only

could get that patent, but you could
run these tests with sort of reliable

results, and something that you could take away, so people can take action based on the results of the test.

So very wide open question,
but I'll let you lead the way.

Kyle: You know, running these types of tests is really not a lot of fun.

They take a lot of time to set up. There are a lot of challenges involved in that. They often take time, and I think that's the main reason that people don't do them. The process isn't necessarily enjoyable, but the results are a lot of fun.

As for mistakes that people make, I think there are two.

One is they'll say like, I
want to test this technique.

Well, that technique might have 50 steps.

And really, if you want to do a
scientific test, or as close as

we can get, you really want to
get it down to testing a thing.

You know, that's when you can
feel the best about those results.

And so, Conceptually, a lot of people
have issues or problems getting down

to a single variable or a single
thing that they want to look at.

But that said, the second problem that I see with a lot of people is that they just don't do it.

Like they've got a great idea and it
doesn't matter that it's not the most

scientific of all time, but there's
nothing wrong with trying something.

And reporting how it went, you
know, that you did the technique

and you did it this way.

And under these circumstances,
and these are the results you got

that that information is extremely
beneficial to the SEO community.

And it will be extremely
beneficial to, to your own SEO.

And so I think people, on the one hand, will struggle to kind of understand the concept of a scientific test, but on the other hand, they just end up not doing anything.

That is usually where most people fall in.

And the result is the
same in both regards.

And then really just no tests
happen, and no knowledge is gained.

Doug: And for the sort of variable, the test, how do you, I guess, look at a scenario, and maybe you have a seed of an idea, the thing that you want to test, how do you narrow it down to the variable that you do want to test? And then, we are dealing with the real world.

We're dealing with the Google
algorithm, which may or may not

work logically or how we expect.

So it's a multiple variable
situation plus external factors.

So it gets very complicated.

I mean, just like all scientists may
have to deal with this kind of stuff.

So how do you narrow it down and then
make sure you're not getting interference?

Yeah.

Kyle: The biggest thing to really understand is, you don't have to know the entire algorithm.

All you really need to do is pull
back the curtain just a little bit and

have that little bit of knowledge on
things that you know will work, and

that's going to give you the edge.

So don't give in to the idea of, oh, if I don't know everything about Google, then running a test isn't worth your while.

It certainly is.

The easiest tests to run and to get results on are going to be on-page tests, by far. In the most basic setup, you would build five identical pages.

They would go after the same keyword.

This can be a nonsense keyword, or
it could be like a three word phrase

in English of words that don't
necessarily go together naturally.

So nobody's really optimizing for them.

The reason you do this is, if it's a nonsense keyword, once you get those pages indexed, you control the entire environment. If it's three words that don't go together, you're going to get very limited results, and often no results, because people aren't optimizing for that thing.

So again, once you then introduce
something to that environment, you

know, that's the only thing that's going
on in there that could impact this.

So let's say you want to decide: is an H1 a ranking factor?

Does Google care about this?

And this is actually something that I just did in the last month, where people talk about, with AI and the new SERPs, how Google doesn't look at tags anymore, the tags are out, the AI is reading them, and on-page optimization is essentially dead because of AI.

So then, you know, going back to 2014, 2015, when I started running these tests, the first thing was, I realized that there were different areas on the page that Google was looking at, and they have different relative strengths.

They're not all equal.

And so I look at, you know, just from a basic sense, the 2014 test: is Google reading the H1?

And so what we do in this situation is
five identical pages on the same domain.

They're all optimized
for the same keyword.

We index them, so all five show.

None of those have an H1 on them.

They're all identical in their
optimization, but no H1s.

Then on the last ranking page, the page
that's indexed at the bottom, an H1 is

added containing the target keyword.

On all the other pages, the target
keyword is added one more time in

paragraph tags, so that there's
still the same term frequency for

the use of the target keyword.

We put that on, uh, resend them
through Search Console, and

that one goes right to the top.

The H1 goes right to the top.
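The setup Kyle describes (five identical pages, unique slugs, equal term frequency, one tag changed on the test page) can be sketched in a few lines. This is an illustrative sketch only; the nonsense keyword, the page template, and the slug format are made-up assumptions, not Kyle's actual tooling.

```python
import random
import string

def make_page(keyword: str, keyword_in_h1: bool) -> str:
    """Build one test page. Every page uses the keyword the same number
    of times (title, tagged occurrence, body), so the only variable is
    whether one occurrence sits in an H1 or a paragraph tag."""
    tagged = f"<h1>{keyword}</h1>" if keyword_in_h1 else f"<p>{keyword}</p>"
    return (
        f"<html><head><title>{keyword}</title></head>"
        f"<body>{tagged}<p>Filler text mentioning {keyword}.</p></body></html>"
    )

def make_slug(keyword: str) -> str:
    """Unique URL: the keyword right next to the domain, then a string
    of seven random letters so each page has its own address."""
    letters = "".join(random.choices(string.ascii_lowercase, k=7))
    return keyword.replace(" ", "-") + "-" + letters

# A three-word phrase nobody is optimizing for (made-up example).
keyword = "plinth walrus nimbus"

# Five identical pages; only the last one (the test page) gets the H1.
pages = {make_slug(keyword): make_page(keyword, keyword_in_h1=(i == 4))
         for i in range(5)}
```

From here you would index all five, confirm they show, then watch whether the H1 page moves.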

Which is what was anticipated and
what I was confident would happen.

Because Google is still looking at the pages the way a bot looks at the pages. They still care about the different zones on the page, they still care about H tags, and H tags are still a stronger signal than paragraph tags. It's a very simple test you can run, and you can do it.

And that's something you can also replicate; you can do it yourself. Now, to get validity and a way to feel really good about your results, you want to repeat them, right?

But again, that's time consuming, that's taxing. It takes budget, resources, and just mental strength that you might not have, or the desire to stick to it.

So what you can do, and this is also something I learned along the way, is you can test in the inverse, do it the other way.

So the idea is you'd have
five identical pages.

Once they all index with no H1s, you'd add an H1 to four of them. The top one wouldn't get an H1; you'd add the keyword one more time in paragraph tags instead. And what happens is that one then drops to the bottom.

You've done the test in the inverse,
you've shown it both ways, and

then you know you've got something.

And that allows you to fast track your
testing, so you don't have to continue

to repeat a test to see the same result.

If you can do the test in the inverse,
and it goes the opposite direction,

then you know you've got something.

Doug: You mentioned you do five samples.

I don't know how you describe it.

Kyle: Five, five, five identical pages.

And so, they're going to be exactly
the same except for a couple of

different letters in the URL.

So you'll have a unique URL, but say the target keyword's right next to the .com, and you've got a string of, say, seven random letters after it. But then the rest of the pages are identical.

Doug: Are they all in the
same domain or does it matter?

Kyle: You want to do them on the same domain, because if there's something that has happened to this domain, it's happening to all of those pages.

And then people will be like,
well, what about duplicate content?

And I would say, well, what
about duplicate content?

But if there is a duplicate content
issue, it's happening to all the pages.

You know, and then you can see all
the pages indexed so you know that

there isn't a duplicate content issue.

Doug: Why five?

Why not seven or three?

Kyle: Do as many as you like.

I think it's the smallest number that you feel comfortable with.

Well, three would be the smallest
number, but you want to get a few pages

in there so you can see some movement.

Okay.

Doug: How often do you run and I'm not
sure if my vocabulary is right, but how

often do you run like the positive test
where you're looking for this thing

to happen because you added the H1 tag
versus like a negative test where you're

like, we're going to test this and we
actually don't expect anything to happen.

Do you look for those as well?

Kyle: Well, the negative test is kind of what I was just describing, seeing the results running it the other way and getting it in the inverse.
Okay.

But no, tests are run all the time not knowing what the outcome might be.

We give the best guess because you do
want to write down what your thoughts

are, what you think will happen.

Uh, you want to do that just in case your
own bias has somehow crept in and you

can kind of evaluate maybe what happened.

But no, we run tests all the
time, but we're not sure.

Did one just last week looking at the noscript tag. It's a tag that's kind of giving maybe like an alternate version of the text, and it's kind of keeping it within the CSS.

Is Google going to read that?

And I can tell you that Google does read that, and a page will index for a keyword that only appears there.

So what you do on that is you index
a page for a target keyword whatever

it might be, whatever you like to do.

And then what you do is you add
content into a no script tag and

it contains a different keyword.

So this page is in the index.

Google knows that it's there, and it's running back through.

And if the page indexes for that new keyword, then, you know, Google is reading that noscript area. And we know, and I can tell you for a fact, that

Google is doing that because the page is
showing twice for two different keywords.

And one is visible.

On the page as would normally be.

And the other is not visible.

It only appears within that one tag.

Doug: So what would be an actionable,
how might someone use that information?

Kyle: Well, that came up through a user in POP that was saying, like, hey, I'm using this. And they were doing it for a valid reason. They were just kind of hanging on to some text, and it was being shown, I think, if the user clicked on something, and then there was an application that would then show some text, like, we actually have that over here.

Well, it's like, will Google read that? Will Google count the terms? We wanted to make sure, cause POP wasn't counting it. And I was like, well, will Google recognize it? And if Google does, then that's something that we want to count. And that's how that test came to be. And yeah, lo and behold, Google will count it. So POP will now count it.

And that actually kind of brings up a huge misconception about on-page, where people are like, you know, in my Google Doc it says I've got 500 words of content, but when I put it into POP, it tells me I've got a thousand words.

Well, that's because there's a lot of words that Google's reading that you're not seeing. You know, there's a lot within the code that Google reads, and those are tests that we run, to see if Google's actually picking it up and counting it, and by and large, they are. This will explain to people why, when they do like a template change, their rankings tanked. You know, their code changed. They probably reduced their term counts going from this template to that template, and so they just probably cut their content in half without even realizing it.
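That gap between the words a reader sees and the words a bot counts can be illustrated with stdlib Python. A sketch only: the `WordCounter` class, the sample HTML, and the choice of `<noscript>` as the hidden zone are assumptions for illustration, not how POP actually counts.

```python
from html.parser import HTMLParser

class WordCounter(HTMLParser):
    """Count words the way a bot might: every text node in the markup.
    Optionally skip text inside <noscript>, to mimic what a human sees."""

    def __init__(self, include_hidden: bool = True):
        super().__init__()
        self.include_hidden = include_hidden
        self.words = 0
        self._stack = []          # open-tag stack so we know where we are

    def handle_starttag(self, tag, attrs):
        self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if "noscript" in self._stack and not self.include_hidden:
            return                # human-invisible zone, skipped
        self.words += len(data.split())

def count_words(html: str, include_hidden: bool = True) -> int:
    parser = WordCounter(include_hidden)
    parser.feed(html)
    return parser.words

sample = (
    "<p>visible words here</p>"
    "<noscript>hidden alternate keyword text</noscript>"
)
count_words(sample, include_hidden=False)   # what a reader sees: 3 words
count_words(sample, include_hidden=True)    # what the bot counts: 7 words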

And yeah, sure, Google's going to refactor those pages. You cut out half your content and you're going to drop. That's just how it's going to be.

And so that's what all these tests are great for. Hey, we've got this particular issue, what's going on here?

Is Google actually looking
at this section of the page?

Is it looking at this tag?

Is it looking inside here?

Does, do these tags matter at all?

Like, I can tell you it matters down to H4. Google's not looking at H5; it's going to count the same as a paragraph tag. But H1 through H4 do have different values, in descending order.

And that's all just through running these types of tests, then running them again, and then running them in the inverse when we can. And you start to get results, you start feeling comfortable with, okay, this is what's going on.

And then you can take that
information and you can apply it.

Doug: So the tags, the example that you're mentioning, it seems like that might be one of the most significant ones, because, you know, we've used tags for years. And common knowledge is, you know, use the tags. It provides a hierarchy. Google reads them. It provides all this great information about the content.

And I'm curious, are there other big tests that you've executed, where you got results that really sort of significantly changed things, or provided evidence that things are a certain way?

And you, you could talk about the
last year since SEO has changed a

decent amount in the last year or so,
or if you need to go back further.

That's totally fine too, but
just curious about the biggest

test that you've figured out.

Kyle: Well, similar to the noscript one, the one that really rocked a lot of socks off was ranking a blank page, um, doing it with a tag where you display: none. So the idea is that the entire content is gone, and a human cannot see anything, but Google can.

And this goes to show that Google
is not looking at the page.

Google is not reading the page.

Google cannot see the page.

Google's reading the code and
Google's reading what's in the CSS.

And people will argue that point all day long: that, you know, Google's looking at it this way, Google's not reading the code, they're rendering the page and then they're looking at it that way. And I can tell you for a fact that's not the case, because you can rank a page that is display: none.

Doug: Okay.

And I mean, logically, if you would have asked me before we started chatting, or right before you answered that question, I think I probably would have said, yeah, they're looking at what is visible on the page, not anything that you cannot see. But that is completely wrong. Google is reading the code.

Okay.

Kyle: Yeah, which also tells you, you know, it was a year and a half ago or so, there was the Yandex leak. And, um, Yandex is built as a Google clone. And it seemed pretty clear that Yandex was using what they call BM25. It's like TF-IDF plus word count, is basically what it is.

So TF-IDF is a bag-of-words model. You're just counting frequency. You put all the words into a bag, give it a shake, you pull them out, you just start counting, right? And the more times a word shows up, the more important it is; that's the term frequency part. But there's also inverse document frequency: the fewer documents that contain a term, the more that term counts. Essentially, that's what you're looking at. BM25 then adds in word count as a factor within that.
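The "TF-IDF plus word count" scheme described here is the classic BM25 ranking function. A minimal sketch, with a toy corpus and the textbook k1/b defaults; nothing here comes from the episode itself.

```python
import math
from collections import Counter

def bm25(query: list, doc: list, corpus: list,
         k1: float = 1.2, b: float = 0.75) -> float:
    """Score one tokenized document against a query using term frequency,
    inverse document frequency, and document-length normalization."""
    N = len(corpus)
    avgdl = sum(len(d) for d in corpus) / N          # average doc length
    tf = Counter(doc)
    score = 0.0
    for term in query:
        n = sum(1 for d in corpus if term in d)      # docs containing term
        idf = math.log((N - n + 0.5) / (n + 0.5) + 1)  # rarer term, higher weight
        f = tf[term]
        # The length term is the "word count" part Kyle mentions: long pages
        # need more occurrences to score as highly as short ones.
        score += idf * f * (k1 + 1) / (f + k1 * (1 - b + b * len(doc) / avgdl))
    return score

corpus = [
    "the h1 tag is a strong ranking signal".split(),
    "paragraph tags carry a weaker signal".split(),
    "five identical pages isolate one variable".split(),
]
query = ["ranking", "signal"]
scores = [bm25(query, doc, corpus) for doc in corpus]
```

The document containing both query terms scores highest; one with neither scores zero, which is the cheap, effective behavior being described.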

And now, when you think about it from
what Yandex is trying to do there,

they, I believe they poached Google
engineers, brought them in to do this.

Now, even if Google is not using BM25 itself, what this tells me is they're doing something that is a very close approximation to that.

Even if it's a fancy version of
something, it's basically that.

But then, really, when I started thinking about it more, I was like, this is so cheap to do, and they have to run a trillion pages.

They're going to find the
least expensive way to do it.

And this is the least expensive way.

So there's really little doubt in my mind
that this is how they're reading pages.

And then when you kind of run through the tests that I've run and the things that I've seen, it's like, yeah, they've just taken all the words and put them in the bag. They're shaking it up, and it's cheap and easy and it's effective.

And, you know, the
results are pretty good.

This is why something like POP works: because we're essentially using a bag of words, and this is why we're able to have consistent success with the tool. And why it hasn't been canned years ago is because we're doing something quite similar to how the search engines are doing it, because it's a very cheap way to do something and get pretty decent results. That's what we're able to emulate.

Yeah, so that was a long way to answer, but, uh, you know, it still goes back to: as much as things change, they stay the same. And that's why I'm not very knee-jerky and scared about whatever is coming with the AI stuff, because I think it's going to be used in a very limited fashion.

It's not AI anyway.

It's a language model type deal and
language models are still algorithmic.

And if it's algorithmic, then that's going
to require optimization and there's going

to be a lot that we can do with that.

Doug: So I have a couple other questions
around running tests and I do want to ask

a little bit more about Page Optimizer Pro, POP, which you referred to, and

then we'll transition into the other
area, the keyword golden ratio stuff.

So for this sort of testing, like you mentioned when you opened up, it's pretty difficult because it's tedious. It's not the most exciting thing.

Like we want the results and then we
want to run with the results, but like

setting up the tests, even though, you
know, simple step by step, but it all

takes time to do and everything adds up.

And we all probably have
like 50 tests in our head.

We're like, Oh, I wonder if this works.

I wonder if that works.

So it becomes overwhelming
and people don't take action.

Do you know of other people, other organizations, other software companies that are running tests like this and publishing the results somewhere?

And yeah, and if you're able and
willing to share, I think a lot of

the highest level SEOs like yourself,
like, you're more about, like, sharing

information than, like, withholding it.

And the more we all know,
the more everyone knows,

which is really a good thing.

So, yeah, any others running tests
like this that, you know, of.

Kyle: There are several, excuse me, there are several people within IMG that are running tests, and their tests are published, which is great. What I find kind of fun about that is everybody's kind of got a different thing that they're interested in, you know; they're coming at it from a different angle.
They have different problems.

All my tests come from my own problems,
my own things that I need to figure

out and other people have other issues
or other things they want to know.

And so it really gives a nice variety.

SIA is still around, I believe.

I haven't seen anything from them in a
while, but I think they're still there.

And that's, that's the group
that I was originally with.

I think they're still trucking along.

Okay.

Doug: What were they called again?

I'm sorry.

Kyle: SIA.

The SEO intelligence agency.

Okay.

Yeah.

Pretty confident they're still,
they're still running tests.

But I haven't seen much in a while.

But you know, somebody like Matt Diggity, he's running tests, and all the tests that he runs, he runs them through seotesting.com.

If you're gun-shy on how to run these things, or you want to do something that might be a little more instantly applicable, where you can say, hey, we could do A or B, right? So why don't we do A to 10 pages on our site and B to a different 10 pages, and monitor results?

Something like SEO testing is a great
platform to run those types of tests

where you can then monitor those results
with a high degree of confidence.

It's a, it's a great platform and
I, and I highly recommend that.

So maybe you don't want to run a test like I talked about, really for the academic knowledge of it, you know, because it's not quite as applicable, like, okay, Google's reading the code, now what? There are things you can do with that, but it's more of a conceptual idea as you're building out your SEO framework. If you want to do something that's really, let's get some results and feel confident about making the right decision, use something like seotesting.com, and I think you might find a really good avenue for running tests and getting good results that you can apply immediately.

Doug: Perfect.

So other question around this.

Some of us are a little bit older.

I'm in my mid-forties now, and my high school scientific method knowledge, I forgot some of it. And Kyle, you're laughing.

I mean, we're about the same age.

I think so.

Kyle: We're exactly the same age.

Doug: So, yeah, in theory, when I say a little bit older...

Kyle: Speak for yourself.

Doug: I'm just hitting my prime, you know, I'm just saying.

So basically, that's one of the hangups, right? Someone hears results, they hear about people testing, and then, you know, it turns out maybe whoever's running the test is just not doing what they should. But let's step back and think, okay, if someone is interested in doing some tests, how would you recommend they refresh their knowledge on the scientific method to ensure that they are, you know, setting up a test properly?

They're writing down their
hypothesis ahead of time.

A couple of the key steps, I mean, it's probably a one-pager situation, or you could ask, like, the nearest high school student in your family that just took this class. But yeah, how would you guide someone who maybe doesn't have a scientific background?

Kyle: You can just do a quick Google search for the scientific method, and they'll show you a little wheel. That'll get you there.

You know, what, what helps
me in these things is first

writing down the background.

Like, where did I see this issue?

Or like, maybe you saw something in
the SERPs, or you saw a competitor do

something, or just an idea came to you,
or you were doing client work and a client

asked a question you really didn't know.

Write down a little background so you
have a feel for where you're coming from.

And then write down what
you think will happen.

That's your hypothesis.

I think, I think this will happen.

And then from there, As you set
up these tests, you just want to

remember what you do to one, you
need to do to them all to eliminate

variables as much as you possibly can.

And then once you've done
that, then you're only

changing one thing on one page.

That's your test page.

Everything else is the control.

So whatever you do to one, you have
to do to the others, unless that's

the one thing that you're testing.

So if you click and open a page, for example, you go to look at it, you're going to want to go and click and open all the other ones, you know, so that they are all as much the same as you can possibly make them.

Now, some tests require you to use different domains. That's just kind of how it works, depending on what you're doing.

Let's say you want to see
if domain names matter.

So then obviously you're going
to be running this on different

domains in those situations.

And you just want to run it as many times
as you possibly can and plot the results.

You know, and then you can start to
see if they cluster, you know, and

you're going to see, okay, there's a
noticeable trend, or if it's scattered,

like, okay, there's nothing here.

But otherwise, as long as you're very cognizant of the fact that, I've got these tests, I've got those five pages, they're going to be completely identical, and anything I do to one I'm going to do to them all, except for that one thing that I really want to look at, then you'll be okay.

And if, after you make the change to one page, your test page, you want to run it through Search Console, which is totally fine, make sure you run all the other ones through Search Console too, even though you didn't change anything on them. That just kind of keeps that consistency amongst your pages.

Then I think you'll do pretty well setting up your own little tests.
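The record-keeping steps Kyle outlines (write down the background, write the hypothesis before the test runs, change one variable, log what happens) can be captured in a tiny test log. A sketch with made-up field names and example values, not any real tool:

```python
from dataclasses import dataclass, field

@dataclass
class SeoTestLog:
    """Minimal log for a single-variable SEO test."""
    background: str                 # where the question came from
    hypothesis: str                 # written down BEFORE the test runs
    variable: str                   # the ONE thing changed on the test page
    pages: int = 5                  # one test page plus identical controls
    observations: list = field(default_factory=list)

    def record(self, day: int, ranking_order: list) -> None:
        """Log the indexed order of the page slugs on a given day."""
        self.observations.append({"day": day, "order": ranking_order})

log = SeoTestLog(
    background="Claim that Google no longer reads H tags",
    hypothesis="The page whose keyword sits in an H1 moves to the top",
    variable="H1 vs <p> around one keyword occurrence",
)
log.record(1, ["page-e", "page-a", "page-b", "page-c", "page-d"])
```

Writing the hypothesis into the log before any observations exist is what guards against the bias Kyle mentions.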

Doug: I'll donate an idea to anyone that wants to run with it, cause it would be a pain in the ass to do this. But if someone created a YouTube channel, and/or a Facebook group, an email list, just running tests. You don't have to have all the tests be successful, but just share the results.

And the process you described, Kyle, where
it's like you write down the hypothesis.

I mean, it's a little vlog.

You could take people on a
story, a little journey there.

And, you know, you're learning something
and it's, I mean, it's relatively

inexpensive other than your time.

So...

Kyle: You know, if somebody wants to run a test and they want to publish it with IMG, we'll give them a free membership for the quarter.

Doug: Oh, there you go. A lot of wins, a lot of incentives to get started.

And I'm pretty sure their channel
would blow up and you could speak

at conferences or whatever, because
you could talk about the test.

You could talk about starting the
channel and the whole scenario.

It's a very fruitful situation.

People love hearing about the results.

Kyle: Love it.

It makes great content
for, for presentation.

And, um, here's the thing too. You know, it's like 2015, I spoke at my first conference, which is an excellent conference. It's a smaller group; there's only about 40, 50 people in the room. And they're really high-level people, and I was a late addition. I had kind of figured out this method for testing, and this was my presentation, and then I was going to show some results. And I thought it was one of these where people can talk to the stage too, it's kind of collaborative, and I just thought, I'm going to show the way I'm doing it, and everyone else is going to show the way they're doing it. And five minutes into the talk, I realized nobody's doing this. Like, nobody at all.

So then I went back, I spoke to my business partner about this, and I was like, you know, we've got this agency and we're pretty new. We're trying to find our point of view, trying to find our voice. Like, why are we providing these services?

And I was like, well, you know,
our methods are built on these

tests that I've been running.

And it turns out we're the only ones running these. And so we could brand ourselves as a science-based SEO agency that runs tests.

And then a line that we used at the
beginning was, you know, Hey, those other

agencies you're talking to ask them the
tests that they're running right now.

And we knew the answer was: they're not. That means they're testing and they're learning on your site, is what that means. We're testing and learning over here, so we know what works and what doesn't.

And then we're bringing to you the stuff
that has the best chance for the success.

We're not going to learn on your site. We're not going to run a test on your site and learn that way.

And that got us some
clients we didn't deserve.

Doug: That's a hell of a sales point.

Kyle: Yeah.

I mean, we just, you know, we didn't have the case studies necessarily, we didn't have the portfolio to point to. But, hey, we've got these ideas, and we know they're good, and we know them because of the tests, and we can kind of take that and apply it and put it as part of our pitch, and really kind of give ourselves a voice. Which kind of gets into sales too, which is something that really didn't occur to me till later in my business life: you know, everyone can sell a widget, or everyone can sell a service.

But why are you providing it?

You know, you have to
have a point of view.

Otherwise you're just lost in the crowd
and you can, you can still be successful

and you might get clients, but you
really have to have something that makes

you stand out, like what you're doing.

And so for us, we found that within
the test, we could really make

that a huge part of our identity.

And I've made it a huge part of my professional identity, with the patent and all that.

And that was actually
through Dory from SIA.

It was really, it was
originally a marketing stunt.

She just wanted to say that
this was patent pending.

And then the patent came through, which no one was expecting.

And so that worked out pretty well for me as well.

So.

That's how that came about.

But then again, it all went into a little thing like, hey, I've got this patent.

I run these tests.

These are the results we do.

And this is how we can, we can help
you and your company do better SEO.

Doug: Amazing.

Okay.

As we wrap up this section, Page
Optimizer Pro, can you talk about

what the tool does, what the goal is
and who the best users are for POP?

Kyle: The real basic idea is, uh, we do competitive analysis to find out the amount of times you need to use very important terms, and where you need to put them.

And as we mentioned kind of earlier, that,
uh, Google is still looking at the pages

like they were looking at them before.

There are different areas on the page
that are more important, and Google

is looking at how many times you're
using specific terms in those places.

And we give you the terms to use and a range on how many times you should use them, to get you that competitive edge.

But it's also important to
know we're not doing averages.

And we're also not leveraging
somebody else's API.

This is our own math, our own algorithm.

And, uh, I think what I can
confidently say is that with doing

on page this way, you're giving
yourself the best chance to rank.

You know, nothing in SEO
is a 100 percent guarantee.

It's the idea of doing repeatable
things that bring about the

best chance for success.

And that's exactly what POP does: if you follow the recommendations, I think with your on-page it gives you the best chance for success.

Doug: Very cool.

Well, I'll link up to POP so people can check it out, and the YouTube videos and such.

You guys have a channel out
there, some content occasionally,

so I'll make sure to link up.

Thanks a lot to Kyle.

He is fun to talk to.

I can't wait to just hang
out with him in person.

And then we don't have to talk
about SEO and marketing so much,

but be sure to check out his stuff.

We'll link up to the places where you can find him.

He's over at page optimizer pro,
and you may see him at conferences

somewhere across the world.

He's at many of the big SEO conferences.

And don't forget, there's a part two, where we talk about a topic that I was really excited that Kyle actually suggested, and that's the keyword golden ratio.

So we talk about why the keyword
golden ratio is good and effective.

And we also talk about where it falls short, the improvements that Kyle made for the KGR, how he implements it, and how he's, you know, helped clients and other folks use the keyword golden ratio.

So be sure to check out part two, which
is the other episode in this series.

Thanks a lot to Kyle and we'll
catch you on the next episode.
