General

101 SEO Tips for more Traffic & Better Ranking – Easy for everyone

In this article, I want to share 101 SEO tips that can improve your rankings and your traffic from organic search. Please take the tips with a grain of salt: things can change quickly, and SEO is not an exact science …

Really? SEO is not an exact science?? SEO sounds like a pure blind flight now! WHY AM I NOT PAYING ANYTHING FOR THESE TIPS HERE?

Okay – calm down and take a deep breath.

In SEO, there are certain factors that can positively influence rankings. But not for every lever that can be pulled can we say with certainty how strong its influence on rankings is, or whether it has any influence at all. Spurious correlations are also not exactly uncommon in SEO, which makes testing possible factors particularly difficult. Simply put: with SEO, there is always a portion of uncertainty involved!

In the following, you will benefit from my many years of experience. In various customer projects, my colleague and I have improved rankings and increased the visibility of the respective domains on the basis of well-thought-out SEO strategies. In other words: the following tips are based on measures that were successful for customers and could therefore hardly be more practice-oriented.

Are you ready? Let’s go!

User Signals and Search Intention

  1. What does Google want?
    With every optimization, there is always one elementary question: what does Google want? Only if we deliver what Google wants can we achieve good rankings in the search results (SERPs). Google always wants to provide its users with the best search results so that they keep using the search engine and do not switch to the competition. Therefore, always think about the user who should come to your site and their search intent, and sooner or later the search engine will reward you.

  2. What is the search intent?
    This refers to the search intention behind every search query. What do users want to achieve with their search query? Which website and which content are they looking for? Only when we know this can we provide the optimal results on our website.

  3. Types of search queries
    The different types of search queries must be taken into account during optimization. A distinction is made between information-driven, commerce-driven, transaction-driven and navigational keywords. In the case of an informational search query, the user wants information on a topic (e.g. how high is Mount Everest?). In the transactional domain, the goal is to take an action, such as downloading an e-book about marketing. In commerce-driven searches, there is often a purchase intent behind the query, such as buy Puma shoes or find a good marketing agency. Navigational searches target a specific website, such as customer support phone number Zalando.

  4. Find out the search intent
    To find out the search intent, it usually helps to look at the top ten results for the keyword you want to rank for. There you will find the current results that Google displays to its users, which most likely meet the search intent correctly. If you want to satisfy the search intent, you have to engage with the target group. What problems and questions do they have around the keyword? What are their expectations? Research in forums, social media, customer surveys and much more will help you understand the target group as well as possible.

  5. Can Google measure user signals?
    Google can very likely measure user signals (the behavior of its users) and thus judge whether a website or its content is well received in the SERPs or not. Google has also filed some patents on this (example). A clear user signal is, for example, the CTR (click-through rate): the ratio of impressions to actual clicks on the search result. Rand Fishkin has published an interesting test on this: https://twitter.com/randfish/status/612691391848648704. The topic, however, causes great discussions in the SEO scene: how can Google measure all this? Dwell time and other user-signal metrics vary greatly depending on topics and interests … a never-ending story! So just remember this: provide your users with the right information and make sure you offer a positive user experience – users should enjoy being on your site and find the information they are looking for. An appealing website design that fits the target group is also mandatory (nobody likes ugly and cluttered websites).

  6. Measure User Signals and User Experience
    While time on site and bounce rate can be measured with Google Analytics, and CTR with the Search Console, it is more difficult to measure whether the user experience is really good. Use tools like Hotjar, Overheat or Mouseflow to record individual user sessions. This allows you to see how visitors behave on your website and where problems occur, if any.

Crawling and indexing your website

First, a simplified explanation about how search engines work:

Many search engines like Google comb through the web with their crawlers looking for relevant information (crawling), decide whether that information is included in their search index (indexing), determine for which search queries it is relevant, and decide at which positions it then appears in the search results (determining relevance). In order to have a good chance of high visibility on Google & Co., your website must be optimized for these three steps.

  1. Crawling and indexing
    Your website must allow access by the search engine bots. Although this should actually be a matter of course, this mistake happens more often than you think. Especially during relaunches, people forget to remove the block for the search engines. Therefore, always check the robots.txt to see whether it erroneously locks out the entire domain or individual subpages for the search engines. Browser plugins such as https://addons.mozilla.org/de/firefox/addon/seerobots/ or https://addons.mozilla.org/de/firefox/addon/link-redirect-trace-addon/ show you, for the URL you are currently visiting, whether search engines are blocked.
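    For illustration, this is what such a fatal leftover from a staging environment looks like in robots.txt – the single Disallow rule asks all well-behaved crawlers to stay away from the entire domain:

    ```
    # Leftover staging block: locks ALL crawlers out of the ENTIRE domain
    User-agent: *
    Disallow: /
    ```

    Changing the rule to something like Disallow: /internal/ (a hypothetical directory) would block only that path and leave the rest of the site crawlable.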

  2. Don’t block URLs via Robots.txt
    If you want to keep certain URLs out of the SERPs (search results), work with the tag “noindex,follow” instead. If URLs are locked out via robots.txt, Google may still display the URLs in the search results.

  3. Use Noindex correctly
    Pages that serve no search intention and offer no other benefit to visitors do not need to clutter up the search engine index. Therefore, set them to noindex,follow so that the bots know these pages are not relevant for them. Examples: imprint, privacy policy, logins, or pages that should not be indexed for strategic or legal reasons. Always use noindex,follow and never noindex,nofollow, otherwise your page will lose link power internally (more about this later under the point internal links).
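    In practice, a noindex,follow instruction is a single meta tag in the `<head>` of the page in question – a minimal sketch:

    ```html
    <!-- In the <head> of e.g. the imprint or a login page -->
    <!-- noindex = keep the page out of the index; follow = still pass on internal link power -->
    <meta name="robots" content="noindex,follow">
    ```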

  4. Duplicate Content
    Duplicate content (also abbreviated DC) can also be remedied by setting the noindex,follow tag. Duplicate content means that identical content is accessible at several different URLs.

  5. Disadvantages of Duplicate Content
    DC is not a reason for Google to penalize you (Google penalty). If identical or very similar content is located on different URLs, Google itself decides which of them will be displayed in the search results. However, duplicate content brings some disadvantages:

    • It can lead to duplicate rankings and therefore possibly to worse rankings

    • Google may view DC as a fraud/manipulation attempt

    • Keyword cannibalization through DC: internal competition for the same search term. The URLs are in competition with each other and this can result in worse rankings.

  6. URL structure
    Keep the URL structure of your website as short and clean as possible! Session IDs and other parameters often create a new URL with every page view. If these URLs return a valid status code, Google can index them under certain circumstances, and a lot of duplicate content problems can arise on your website (more about this here).

  7. Click depth I
    In the page architecture, important URLs through which you want to generate organic traffic should be reachable from the homepage with as few clicks as possible. If a URL takes too many clicks to reach, Google may consider it less important and crawl it less often. Especially with WordPress, old posts like to disappear deep into the blog archive. With the plugin https://wordpress.org/plugins/simple-yearly-archive/ you can link all posts on one page and thus reduce the click depth. A prerequisite is that this yearly archive itself is linked, for example in the footer of your website, so that it is reachable with just one click.

  8. Click Depth II
    With Screaming Frog you can analyze the click depth of each URL on your website and see which pages/URLs can be reached with how many clicks from the homepage. As a rule, this should be a maximum of three clicks.

  9. Avoid broken links on your website
    Broken links are links that point to a destination that is no longer accessible (e.g. pages with a 404 error). These are not only annoying for your visitors, but are also a sign that the content of your website is no longer up-to-date. Therefore, avoid broken links on your site (whether they link internally or to external sources). Also, avoid automatically redirecting broken links to the home page, as this could also be considered an error by Google (soft 404 error, read more here). You can also find broken links with the tool Screaming Frog.

  10. Multilingual website
    If you operate a multilingual website, you can and should use the hreflang attribute to show Google which language and country the respective URL is intended for. This ensures that Google recognizes the country targeting correctly and displays the appropriate URLs to its users in the SERPs. In the hreflang guide from Sistrix you can learn everything about this topic (otherwise it would completely blow up this article).
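    As a minimal sketch (the URLs are placeholders), hreflang annotations in the `<head>` can look like this – each language version references itself and all its alternatives:

    ```html
    <!-- German version of the page -->
    <link rel="alternate" hreflang="de" href="https://beispiel.de/seite/">
    <!-- English version of the same page -->
    <link rel="alternate" hreflang="en" href="https://beispiel.de/en/page/">
    <!-- Fallback for users whose language/region matches neither -->
    <link rel="alternate" hreflang="x-default" href="https://beispiel.de/en/page/">
    ```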

  11. Sitemap
    According to John Mueller (Webmaster Trends Analyst at Google), a sitemap.xml is not required for sites with up to 10,000 URLs. But a sitemap.xml file helps the search engine understand your website better, recognize important pages, and be informed about changes faster. Always create a separate sitemap for videos (if available) on your site.
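    For illustration, a sitemap.xml is a simple XML file listing your URLs – a minimal sketch with placeholder values:

    ```xml
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://beispiel.de/</loc>
        <!-- optional: date of the last modification -->
        <lastmod>2019-06-01</lastmod>
      </url>
      <!-- one <url> entry per page -->
    </urlset>
    ```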

Keyword research

  1. What is keyword research?
    Keyword research is the search for the search terms (keywords) for which a website should be optimized, and is therefore an important pillar of any optimization. Many people immediately start using various research tools (I will mention good tools later) and forget an important tool and step that comes first: research always begins with brainstorming. Which terms are used in your niche according to your experience, and what is the wording on your website? In companies, sales and customer support often have great additional ideas for search terms.

  2. Useful research tools include:

    • Google Autocomplete: Google search shows you matching search suggestions below the search bar while entering a search term, which other users are also searching for.

    • Keyword.io, Seorch.de and Kwshitter.com: These tools suggest countless search terms for your keywords.

    • OpenThesaurus: The tool suggests synonyms that match your search terms.

    • answerthepublic.com: This application delivers especially many combinations to your keywords – e.g. in question form.

    • KWFINDER: This is a paid tool with which you can find out the monthly search volume for your keywords.

  3. Search Intent
    The search intent should be determined for each search term – this makes subsequent optimization much easier. Look at the competition: which questions are answered and which solutions are shown.

  4. Short-, mid- and long-tail keywords
    In addition to the search intention, a distinction is made between short-tail, mid-tail and long-tail keywords.

  5. Short-tail keywords
    Short-tail keywords are often generic terms with a very high search volume, but often with an imprecise search intent, and they target a very competitive market in the SERPs.

  6. Mid-tail keywords
    Mid-tail keywords are search queries with a lower search volume than short-tail keywords, but the search intent is easier to determine and the conversion rate is usually better. Compare the two search queries fire extinguisher vs. buy fire extinguisher: with the mid-tail term buy fire extinguisher, someone really wants to purchase one, while with the short-tail term, the focus is more on obtaining information.

  7. Long-tail keywords
    Long-tail keywords have the lowest search volume, but they are usually easier to rank at the top and are very specific, so optimizing for the search intent is easier. Exceptions are long-tail keywords with a commercial search intent – here many competitors may already be optimizing for these keywords (e.g. lucrative long-tail terms for online shops).

  8. Long-tail keywords are super (?)
    Even if long-tail keywords have low search volumes, they can provide a lot of visitors and are much more accurate in terms of search intent. Therefore, do not only focus on short- and mid-tail keywords, but also consider long-tail keywords.

    The designation of these three keyword types comes from the graph of the “Search Demand Curve”. Short-tail keywords sit at the top of the graph (this is where most searches are), mid-tail keywords sit in the middle, and long-tail keywords sit at the long end (long-tail) of the curve.

  9. Keyword clustering
    Avoid creating different subpages for very similar keywords. Keywords that are very similar can and should be optimized on one URL (You can find a detailed article on this with examples here.) This reduces the risk of keyword cannibalization and duplicate content on your website.

  10. Assess competition in the SERPs
    To find out whether you have a chance of getting to the top for the selected keywords, you need to check various criteria of your competitors in the top ten. These can be, among others:

    • Which sites are in the top ten? Many brands (such as Zalando, Otto, etc.) indicate strong competition.

    • If there are videos, forums and rather smaller niche sites in the top ten, the competition is usually less strong.

    • How well are the competitors’ pages maintained? Are there technical errors, excessive loading times and so on, or are they optimized?

    • What is the content like on the respective competitor sites: is there content that answers the search intention in the best way, or are there gaps and open questions that YOU can answer better?

    • How strong is the link profile of the competition? Do they have many strong or only a few backlinks in the link profile (can be found out with tools such as Ahrefs or Openlinkprofiler)?

    • Do the pages have a high acceptance or popularity among your target group? Indications of this can be found, for example, in the social media channels of the respective competition (fan page with many fans / followers and a high interaction such as likes). For blogs, it can also be comments under the respective posts.

Content

  1. There are no SEO texts …
    You write texts for people, without any ifs and buts. Google does not want to be tricked and always wants to provide its users with the best content for the respective search query. Stick to that and don’t try any tricks that make your texts look unnatural just in the hope that Google will rank them better.

  2. Forget keyword density
    Keyword density may have been important many years ago, but not anymore. Google has become smart enough to rank texts properly. Therefore, avoid compulsively including keywords everywhere in order to maintain some ominous keyword density.

  3. WDF*IDF / TF*IDF
    WDF*IDF is also no longer important – just because you include certain terms in the text according to the WDF*IDF calculation, you will not immediately rank better. However, a WDF*IDF analysis shows you very well what the competition writes in their texts. This knowledge can help you better understand the search intent for the respective keyword and write better texts for your readers.

  4. The right text length?
    How long should a text be so that it has a high visibility in Google? Popular question, to which there is only one answer: “It depends”. If the search intention is satisfied with a short text (300-400 words), then 300-400 words will do. If the topic is complex and there are many questions, theories, backgrounds, etc., the text will automatically be longer (> 2,000 words) to accommodate all the important information.

  5. What is good content?
    The question is difficult to answer because good content is often a matter of subjective perception. Whether a text is liked depends on the reader. Once you’ve done your SEO homework and figured out the search intent for your keywords, you can and should write copy based on the expectations and desires of the target audience. Some characteristics of good content:

    • Content that shows (new) insights and/or real solutions to problems for the target group

    • The content is informative and offers real added value for the reader (“You won’t read this information anywhere else” or “This text really helped me”).

    • The content is authentic and true.

    • The text should be very well researched and supported with some sources.

    • Content design: The text must be easy to read and scannable at first glance (many subheadings, highlighting important content, images, videos, enough paragraphs, etc.).

    • The content must be easy for the target audience to understand, error-free and deliver exactly the information promised in the headline. Tools like https://wortliga.de/textanalyse/ help you write the perfect text.

    • The text must be written in such a way that the target group will gladly recommend it to friends, colleagues, etc., share it and save it as a bookmark. If you can do that, you have written a very good text.

  6. Looking for new topics for your website?
    Then tools and platforms such as answerthepublic.com, Google Autocomplete, Google Trends, topic-relevant Facebook groups and Internet forums will help you to find new topics and find out the questions of the target group.

  7. Diversification in the SERPS
    Google does not give its users 10 places in order to deliver 10 × the same search result with the same content. Google wants to show different results that address and answer the topic from different angles. Therefore, think about how you can stand out from the competition, and copy neither your competitors’ texts nor their setup and structure. Your content must be absolutely unique. And if rumor has it that only articles with 2,500 words rank well, and everyone sticks to it, the result is ten interchangeable texts and anything but satisfied users.

  8. Recycle content
    Recycling content means optimizing existing content. Content that already ranks on Google on pages 2-3 is particularly interesting: from there, it is often not far to page 1. With tools like Sistrix or Metrics.tools, but also with the free Search Console, you can see your website’s current rankings on positions 11-20. Sometimes smaller changes, such as expanding and restructuring the text or adjusting the meta title and the headings in the text, can make the difference needed to get into the top ten.

  9. Example of content recycling
    Our article about search engines (optimized for the keyword “search engine list”) had presented just five search engines before the Medic update and still had some good rankings. According to Sistrix, the article had a visibility index of 0.3264 before the update; after the update, the visibility dropped to 0.03 – the article fell out of the top ten for some keywords.

    Collapse of the URL’s visibility (see “B”), recovery from it, and increase in visibility through our content recycling actions

    So it was time to update this article, restore the old visibility and attack the main keyword search engine list again. The search intention of users for this keyword is “Hey, I want a website where I can see as many search engines as possible, preferably as an overview in a list”. Okay, that wasn’t particularly hard to figure out. So we expanded the former five search engines to a total of 35 and presented them all in one list in the article. Further, we expanded the text to introduce each of the 35 search engines, how they work, and a little preview of what’s to come. Yes, that was a hell of a lot of work – but it gave us a visibility index (as of 01/06/2019) of 1.237 (an increase in visibility of 278%).

  10. Additional tip on this example
    We also made it into the answer box (rank 0) on Google for quite some time for the keyword search engine list. We named our five favorites in the introduction, and Google took that and made an answer box out of it. Of course, this “trick” does not always work!

    Seokratie.de has written a great article on this topic.

Link building + build reach

  1. Backlinks are still important for optimization, but …
    … they can quickly destroy a lot instead of improving it. Yes, you can achieve good rankings without backlinks, but it will be very difficult to get into the top ten for moderately to highly competitive keywords without links.

  2. No one voluntarily links to a junk website with no content worth mentioning.
    Therefore, it really should be clear to everyone: you need extremely good content to earn backlinks in a natural way.

  3. Earning backlinks
    To earn backlinks through so-called good content, the content must appeal to the webmasters you want a backlink from. For example, you write an e-book on how dog owners can respond to the danger of their dog being poisoned, and you ask pet clubs and online pet stores whether they would like to share your e-book (it can also be an infographic, video, etc.) with their target audience. This type of link building is usually much more natural and organic than buying backlinks or building spam links with tools.

  4. What is a good backlink?
    There are several evaluation criteria, some of which include:

    • A link that came about naturally (no linkspam or other “hacks” -> the webmaster has linked to you voluntarily)

    • The link comes from within an article and not from the sidebar or footer (sitewide linking).

    • The link comes from a topic relevant/business relevant environment

    • The linking website has a healthy link profile and has not been bred up to be a mere link slinger, but offers real value to readers and publishes new content on a regular basis.

    • Visitors come via the link – the linking website has visitors (e.g. via organic search) who click on the link and visit your website.

  5. Useful metrics for evaluating linking domains to check the strength of a backlink

    • LRT Pagerank (works with the free Link Redirect Trace Addon)

    • Trust Flow and Citation Flow from Majestic.com

    • Visibility (works with Metrics.tools, Sistrix, Searchmetrics etc.)

    • Dompop (domain popularity; also works with Majestic or for free with Openlinkprofiler.org)

  6. Stone Age link building with web catalogs
    Using web catalogs or other directories for link building is of little or no use anymore. Give a wide berth especially to web catalogs that have terms like link or SEO in the domain, because they can be very toxic for a healthy link profile. For local businesses such as craftsmen or dentists, at most industry directories are interesting, since Google can establish a regional connection to the company, which can have a positive effect on local search queries (more on this under the point “Local SEO” below).

  7. Link to other pages
    This is not a direct ranking factor, but the Internet works by individual pages linking to each other, and this is how search engines find new pages. Links are therefore relevant for the entire WWW. Link to other sites that give your readers real added value, and don’t be stingy with such links.

  8. Negative SEO through link spam
    Some “A***geigen” (sorry, but I have to write that) try again and again (across all industries) to shoot down websites through many bad backlinks. With various tools (which I do not want to name here), it is possible to build unnatural backlinks from thousands of websites in a fully automated way. In the past this has cost some websites rankings; however, Google has lately been emphasizing more and more that it is able to detect this technique and weed out the harmful links. In the event of a negative SEO attack, you should still disavow the links via the Google Disavow Tool, just to be safe. Here you can find a guide on how to detect and remove harmful backlinks.

  9. Your competitors’ backlinks
    With tools like Ahrefs, you can analyze the link profile of your competitors and try to rebuild the links for your website. But be careful: rebuilding your competitors’ backlinks is not always a good idea: you never know which links may have been devalued and thus build bad backlinks by mistake.

  10. NoFollow backlinks
    By using the HTML attribute nofollow, you indicate to Google that the link should not pass on any link power. Example: <a href="https://beispiel.de/" rel="nofollow">example</a>. The Google bot will still follow the link, but without considering it for determining rankings. You should only use the nofollow attribute for links to other websites if the link was created by a sponsored post.

  11. Forget the tricks and stick to the rules
    Particularly in link building, there was a lot of trickery in the past to get good rankings. Google is getting better and better at quickly recognizing the tricks (see Penguin Update) and punishes the “tricksters” faster and faster. Therefore, stay away from so-called black-hat SEO techniques such as link spam through automated tools, buying too many backlinks at once, etc.

Internal links

  1. Why page architecture is so important
    The internal links on your website help Google to better understand your website. Therefore, invest a lot of work in a sensible page architecture, so that Google, but also your visitors can find their way around your website. Important URLs that should rank well should be accessible with 1-2 clicks from the home page.

  2. How to keep track of your page architecture
    You can visualize the page architecture of your site with Gephi, for example, or with Screaming Frog from version 10 onwards, to get a better overview. Here you can find a tutorial from searchengineland.com on how to visualize your site architecture with Gephi.

  3. Link to your most important pages
    Link to appropriate subpages in your texts – this way you can additionally strengthen important subpages that can have a positive influence on rankings.

  4. Hard anchor texts
    Internal links need hard anchor texts (link texts). Therefore, never link internally with meaningless terms such as “here”; it is better to use something like “here you will find more on topic XYZ”.

  5. Never use the same anchor texts for different link targets internally
    It is better never to name different internal link targets with the same link text – Google has to be able to understand each page and so you also reduce possible dangers of keyword cannibalization.

  6. No nofollow
    Internal links should not be set to nofollow. This causes your website to lose valuable link power.

  7. More is not always better
    Never link to the same link target multiple times on one URL. Google only counts the first internal link to the URL and ignores the rest, which means valuable link power is lost.

  8. How many internal links per URL / per text should be set?
    “It depends”, as there are no rules or guidelines here. Link as it is natural for your readers. If you absolutely want to have a rule: An internal link every 400-500 words has worked best for us.

Content Marketing & Distribution is highly relevant for SEO

  1. Content marketing and SEO go together wonderfully.
    Search engine optimization simply does not work without good (informative, entertaining) content – and content marketing delivers this good content. Content distribution, which is so important for content marketing, can also be helpful for your SEO. Roughly speaking, distributing content through the distribution channels (paid, earned, social, owned) should reach the appropriate target audience. This distribution also has advantages for SEO:

    • You will be more noticed in your business niche.

    • Other webmasters will become aware of your content and link to it voluntarily because they want to share it with their target audience.

    • The content is also mentioned and linked in the social media channels through the distribution. This way, you receive new visitors via the social channels – although these are a black box for Google, since the search engine has no access to them, a high traffic volume with corresponding positive user signals does not remain hidden from Google.

    • Positive user signals across the site can improve, such as Time on Page (How long were visitors on the site?), Bounce Rate (What was the user bounce rate?), and Scrolling Behavior (How far did users read the content?).

    • Google’s trust in your domain will increase and your domain will be perceived more as a brand in your niche, which can also have positive effects.

    • Through content distribution, the content on your website is in focus and this is exactly what Google sees and evaluates positively.

Optimize snippets

  1. Titles and descriptions
    Titles and descriptions are extremely important for your SEO, as they are displayed in the SERPs and are the first thing search engine users see of your website. Therefore, maintain them especially well and avoid leaving the title and description empty, because then Google compiles its own in the SERPs, and these are usually not optimal.

  2. The optimal title consists of …

    • a main keyword (if possible at the beginning of the title).

    • possibly a second main keyword (only if it fits).

    • a short description of what awaits the user or a call to action (request to click). The title must arouse curiosity and encourage the user to click.

    • the name of the domain at the end (Brand).

  3. The optimal description …

    • … describes what the user can expect on the website.

    • … highlights the advantages.

    • … arouses additional curiosity.

    • … tries to encourage the user to click.

  4. AIDA formula
    Using the tried and tested AIDA formula, you can create the optimal title and description for each URL. Especially numbers and special forms can stand out. However, do not overdo it. Here is an example of what a good title and description can look like.

    This is the example from Zalando: the keyword is at the very beginning of the title, followed by a call to action and the brand – everyone knows immediately what it is about. In the description, expectation is built up, the advantages are mentioned, and additional attention is created by means of special characters.
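    To tie this to the keyword example from the research chapter, here is a hypothetical snippet for buy fire extinguisher (shop name and USPs are invented for illustration):

    ```html
    <!-- Keyword first, then benefit/call to action, then the brand -->
    <title>Buy fire extinguisher – tested models for home &amp; office | Beispiel.de</title>
    <!-- Benefits plus special characters for attention, ending in a call to action -->
    <meta name="description" content="Find the right fire extinguisher ✓ free shipping ✓ 30-day return policy – order yours now!">
    ```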

  5. Uniqueness
    Each title and description should be unique – duplicate content can have negative consequences here.

  6. No Keyword Spam
    Avoid the pointless stringing together of keywords (period – that’s all I really need to say about that).

  7. Test Snippets
    With the Sistrix Serp Tester you can check how your Title + Description will look in the SERPs.

  8. Google decides for itself …
    … whether the meta tags are really adopted or not. As soon as a new URL of your website has been indexed, you should use a search query to check whether Google has adopted your entries (if not, they should be rewritten).

  9. Emojis in the snippet ???
    If special characters and emojis are displayed in the snippet, this can increase attention and thus the CTR. Depending on the target group, however, the somewhat playful/childish emojis can also have a negative effect on the CTR. Sistrix.de maintains a Google table with all special characters that have been spotted in the SERPs.

  10. Meta Keywords
    The meta keyword information is not taken into account by Google and does not need to be filled in.

Structured data

  1. What is structured data?
    Structured data helps search engines to better understand your website. It is not a direct ranking factor for Google, but it can get additional information displayed in the SERPs (rich snippets), and the Googlebot understands your website better. Google explicitly recommends the use of so-called structured data. Examples of rich snippets in the SERPs:

    At webmaster-de.googleblog.com you can find a helpful article for getting started with and implementing structured data. At search.google.com/structured-data/testing-tool you can check your stored structured data for correctness.

  2. Include structured data
    Structured data can be included for your website in a number of ways – for example, via Google Search Console using the Data Highlighter.

Videos are good for your SEO

  1. Why are videos good for your SEO?
    Using videos (whether on YouTube, Vimeo or self-hosted), you stand out from competitors that do not offer videos. Content that needs explanation is especially well suited to video. Since user signals have become so important, videos on your page can, in addition to the content, become very important for good rankings in the future.

  2. Structured data for videos
    Using Schema.org for videos, you can add structured data to your website to show Google that there is a video on the URL. If you want to increase the chances that the video is also displayed in the SERPs as a rich snippet (usually increases the CTR enormously), then in addition to the Schema.org code (can be generated here), there should also be a video sitemap on your website.
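A hedged sketch of such Schema.org markup for a hypothetical video (all URLs, names and dates are placeholders) could look like this:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "VideoObject",
  "name": "How to optimize your title tags",
  "description": "A short tutorial on writing click-worthy titles.",
  "thumbnailUrl": "https://www.example.com/thumb.jpg",
  "uploadDate": "2019-06-01",
  "contentUrl": "https://www.example.com/video.mp4"
}
</script>
```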

Mobile SEO

  1. Mobile-first indexing
    More and more people are using smartphones and tablets to access the internet – that’s why Google uses mobile-first indexing. This means that Google uses the mobile version of your website for indexing and ranking. So it’s hugely important that your website is mobile-ready.

  2. Testing Tool
    With this tool you can check if everything is correct on your website in the mobile view: https://search.google.com/test/mobile-friendly.

  3. Websites that are mobile ready, …

    • … adapt to the size of the display (mobile responsive).

    • … are therefore easy to read on any display size.

    • … can be used without zoom.

    • … have a fast loading time. Annoying pop-ups, too large images, etc. should be avoided, especially in the mobile presentation. The user is on the go and wants to quickly look up something on the smartphone. Then such hurdles really interfere enormously.
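The technical basis for a mobile-ready page is the viewport meta tag plus CSS media queries; a minimal sketch (class names and breakpoint are invented examples):

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  /* Readable base font size without zooming */
  body { font-size: 16px; }
  /* Example breakpoint: hide secondary content on small displays */
  @media (max-width: 600px) {
    .sidebar { display: none; }
  }
</style>
```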

Local SEO

  1. What is LOCAL SEO?
    Local SEO is about the optimization of a website in relation to local search terms. For example, if you want to rank well for hairdresser Düsseldorf or dentist Cologne etc., Google must have a reference to the company website and the corresponding location. Google wants to make its users happy: If a user wants to have a dentist in Berlin, he doesn’t want to have a dentist displayed 35 km away in Potsdam. Google must therefore clearly identify the correct location.

  2. Indications that show Google that the company website is related to a regional location:

    • Google My Business: Your business listing should be complete and completely truthful (no faking addresses or phone numbers).

    • Phone number and address should be from the city you want to be found.

    • Phone number and address should be on the website, e.g. in the footer or directly at the top of the menu.

    • Business directories: Listing in business directories also helps to show a regional connection.

    • Regional links from magazines, blogs, portals, newspapers, etc. additionally help to achieve better rankings for local search terms.
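Address and phone number can additionally be marked up as structured data so that Google can clearly identify the location; a sketch with invented data for a hypothetical dentist in Berlin:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Dentist",
  "name": "Example Dental Practice",
  "telephone": "+49 30 1234567",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "Musterstraße 1",
    "addressLocality": "Berlin",
    "postalCode": "10115",
    "addressCountry": "DE"
  }
}
</script>
```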

  3. Business directories examples
    Especially for local SEO, business directories are well suited for generating a few good links. The following examples of business directories all have high visibility according to Sistrix (as of 01/06/2019):

    • https://www.gelbeseiten.de

    • https://www.dastelefonbuch.de

    • https://www.dasoertliche.de


Optimize loading times

  1. One of the biggest potentials for optimization
    Don’t skimp on hosting and avoid cheap mass hosters: besides poor support, the quality of their hosting in terms of loading times is usually not good. There are various hosters that specialize in certain websites and CMSs and get the maximum in fast loading times (just use Google search to find them).

  2. Image sizes
    Reduce the size of your images and compress them. If the server has to call up > 500 KB images every time your page is loaded, the loading times will suffer greatly.

  3. Mobile presentation and loading times
    Particularly for the mobile presentation the loading time is very important: Use browser caching for faster loading times.
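On an Apache server, browser caching can be enabled in the .htaccess file via the mod_expires module; a sketch (the lifetimes are example values you should adapt to your site):

```apache
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change – cache them for a long time
  ExpiresByType image/jpeg "access plus 1 month"
  ExpiresByType image/png  "access plus 1 month"
  # CSS and JS change more often – shorter lifetime
  ExpiresByType text/css               "access plus 1 week"
  ExpiresByType application/javascript "access plus 1 week"
</IfModule>
```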

  4. Use optimization tools
    Search your website for the files that are driving up your load times. With tools like Webpagetest.org or Pingdom Tools you can see exactly which files take how long to load.

    Webpagetest.org checks load times and illustrates potential problems via a waterfall chart.

  5. Caching Plugins
    There are many caching plugins for WordPress, with which you can optimize the loading times – e.g. WPRocket or Borlabs Cache. Here you have to find the perfect caching plugin for you by testing it on your own site. If you want to optimize even more, you should definitely read this guide.

WordPress SEO Tips

  1. SEO plugins
    There are also many SEO plugins for WordPress. These help you to make your website more search engine friendly: creating a sitemap, maintaining meta tags, noindexing pages without search intent and much more. We use the Yoast plugin and are happy with it. However, there is no perfect SEO plugin. Test which one you get along with best. Alternatives to Yoast are the plugins THE SEO FRAMEWORK and ALL IN ONE SEO.

  2. Tips for Yoast
    If you are happy with Yoast, here are some tips for this powerful plugin. You can let Yoast create the title on all URLs automatically by using so-called variables (placeholders). By default, the variables Title, Page, Separator and Site Title are preset in every installation.

    For example, you can use the variable %%currentyear%% to automatically display the current year in your title (however, be careful here: Google and your visitors don’t want to be fooled, of course). You can find many more useful variables here.

  3. Another tip about the Yoast plugin
    Under Tools -> File Editor you can create a Robots.txt and make entries as well as edit the .htaccess file directly.
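A minimal robots.txt, as you could create it via the Yoast file editor, might look like this (the paths and sitemap URL are examples – adapt them to your site):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.example.com/sitemap_index.xml
```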

  4. Bulk Editor
    Via Tools -> Bulk Editor of the Yoast plugin, you can adjust the titles and descriptions of all posts and pages with the bulk editing function, without having to open each article/page individually – a great time saver.

  5. Excel-style Bulk Editor
    The Bulk Edit Posts and Products in Spreadsheet plugin lets you edit even more via a handy spreadsheet-style bulk editor, without having to go into individual pages/articles. If you need to edit many pages at once, this plugin can save you a lot of time.

  6. Optimize images afterwards
    PB SEO Friendly Images: This WordPress plugin adds missing ALT tags to your images afterwards. If these are filled in sensibly, they have a positive influence on the rankings of your images in Google Image Search.
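A sensibly filled ALT attribute simply describes what is visible in the image; a hypothetical example:

```html
<!-- Descriptive ALT text helps Google Image Search understand the image -->
<img src="running-shoes-red.jpg" alt="Red running shoes, side view">
```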

  7. Integrate Structured Data
    Structured data on WordPress can be integrated with various plugins, such as:

    • Markup (JSON-LD)

    • Schema App Structured Data

    • All In One Schema Rich Snippets


Search Console

  1. What is the Search Console?
    The Search Console is Google’s mouthpiece to all webmasters, so you should always use it. The tool provides you with valuable information about the indexing status, the number of clicks from organic search to your website and much more – and it’s completely free!

  2. In the Search Console, you can have pages indexed more quickly
    Go to URL check, then enter the URL that should be indexed and click on the Request indexing button. Of course, Google itself decides whether a page is included in the search index or not.

  3. URL check
    Using the already mentioned URL check, you will also get reasons from Google itself if a URL cannot be found in the index.

  4. Show error messages
    The Search Console reports when problems occur, such as server errors, crawling errors, or mobile display problems.

More SEO Tools

  1. Screaming Frog
    In addition to the already mentioned Search Console, Screaming Frog is an absolute must for anyone who wants to optimize their website. With this tool, you can crawl up to 500 URLs in the free version and display a variety of information about your website. The program crawls the specified target page and collects relevant data that is important for optimization – e.g. the titles of all URLs, the status code, H1 headlines, click depth, etc. Besides all this information, Screaming Frog can also create reports and sitemaps for your website.
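One typical use of such a crawl is finding duplicate titles, which should be unique per URL. A small Python sketch that reads a hypothetical CSV export with "Address" and "Title 1" columns (the column names are an assumption based on typical crawl exports) and lists titles used by more than one URL:

```python
import csv
from collections import defaultdict

def find_duplicate_titles(csv_path):
    """Group URLs by title and return titles that appear on several URLs."""
    urls_by_title = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # Column names as in a typical crawl export (assumption)
            urls_by_title[row["Title 1"]].append(row["Address"])
    return {title: urls for title, urls in urls_by_title.items() if len(urls) > 1}
```

Every title in the result is a candidate for rewriting, since duplicate titles and descriptions can have negative consequences.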

  2. Monitor your website
    There are tools like Uptime Robot that warn you immediately via email if the entire domain or certain URLs are no longer accessible – e.g. return a 404 status code. That way you can react quickly and avoid the unpleasant surprise of all rankings suddenly disappearing because of a silly error that went unnoticed.

  3. Varvy.com
    The Google Webmaster Guidelines are something like the holy scriptures for webmasters who want to be found better in the long run. The tool https://varvy.com/ tries to detect whether your website follows these guidelines or not.

  4. Keyword-Hero
    When Google switched to SSL encryption, the tracking tool Google Analytics lost the transmission of keyword data, so it was no longer possible to see which search terms visitors used to reach your site. The Keyword-Hero tool restores this data in Google Analytics, which was previously listed under the label not provided.

  5. Log-Hero
    From the same makers of the Keyword Hero tool comes the Log-Hero. It shows the visits of the search engine bots and their paths through your website. Using the log file analysis, you can see how often the crawlers visit your website and which pages they visit and how often, and optimize your website accordingly.
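If you don’t want a full tool, a simple version of such a log file analysis can be scripted yourself. A hedged sketch that counts Googlebot hits per URL in an Apache-style access log (the log format is an assumption, and the user-agent check is simplified – a real verification should also confirm the bot’s IP via reverse DNS):

```python
import re
from collections import Counter

# Simplified pattern for the common/combined log format (assumption about your log layout)
LOG_LINE = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" \d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(log_lines):
    """Count how often Googlebot requested each URL path."""
    hits = Counter()
    for line in log_lines:
        match = LOG_LINE.search(line)
        if match and "Googlebot" in match.group("agent"):
            hits[match.group("path")] += 1
    return hits
```

The resulting counts show which pages the crawler visits often and which it neglects, so you can optimize your internal linking accordingly.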

  6. Browseo
    Find out what your website looks like from the Google crawler’s point of view: with BROWSEO you can view your website through the eyes of the Googlebot and discover errors. The tool is also available as a smartphone app for iOS and Android.

  7. Siteliner
    OnPage and content analysis tool: On Siteliner you can detect duplicate content on your page. It also checks broken links, load times, word count, links and much more.

  8. Hreflang-tag-generator
    For multilingual websites, creating correct hreflang tags can quickly lead to errors. The hreflang tags generator by well-known SEO specialist Aleyda Solis creates the correct hreflang tags so that Google can assign your multilingual website to the respective regions.
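The principle behind such a generator is simple; a minimal Python sketch that builds the hreflang link tags from a mapping of language/region codes to URLs (the URLs are invented):

```python
def hreflang_tags(urls_by_locale, default_url=None):
    """Build <link rel="alternate"> hreflang tags for a multilingual page."""
    tags = [
        f'<link rel="alternate" hreflang="{locale}" href="{url}" />'
        for locale, url in urls_by_locale.items()
    ]
    if default_url:
        # x-default tells Google which version to use for unmatched locales
        tags.append(f'<link rel="alternate" hreflang="x-default" href="{default_url}" />')
    return "\n".join(tags)
```

Each language version must carry the full set of tags, including a reference to itself, which is exactly the kind of detail where manual editing goes wrong.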


Summary: Those were my 101 SEO tips

You did it! I hope I was able to give you as many valuable tips as possible. But with all these tips, please don’t forget the first commandment: Google wants happy users, so that half the world (soon the whole world) keeps using its service and world domination is no longer far away.
