
When a small business website doesn’t show up where customers expect—Google Search, Maps, or “near me” results—the cause is almost always some mix of technical visibility issues, weak local signals, and content that isn’t aligned with what people are searching for. Here’s a practical guide to diagnosing and fixing the most common problems.
In this article, we’ll review some of the most common issues website owners overlook and how to fix them.
How to Get Your Website Found
Your site isn’t indexed (or only a handful of pages are)
Identify it – Google: site:yourdomain.com — how many results appear? Are key pages missing?
In Google Search Console (GSC), open Indexing → Pages to see why pages aren’t indexed.
Fix it – Use URL Inspection on missing pages; address any reported blocks (robots/noindex).
Submit important URLs and an XML sitemap in GSC (see #5).
Your Google Business Profile (GBP) isn’t verified or complete

Identify it – Google your brand name; look for your Business Profile on the right (desktop) or top (mobile). Missing or “Verify” prompts = not verified.
Fix it – Verify the profile and complete all fields (name, address, phone, hours, categories, description, photos). Keep hours and holiday hours current; respond to reviews.
Robots.txt is blocking search engines
Identify it – Visit yourdomain.com/robots.txt. Look for Disallow: / or rules blocking key folders. In GSC, check URL Inspection → Page indexing reasons (e.g., “Blocked by robots.txt”).
Fix it – Remove or narrow restrictive rules; don’t combine Disallow with noindex on the same URL—Google can’t see tags on blocked pages.
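For reference, a healthy robots.txt for a small business site is usually short and permissive. This is only a sketch—the /cart/ and /checkout/ paths are placeholders for areas you genuinely don’t want crawled:

User-agent: *
Disallow: /cart/
Disallow: /checkout/
Sitemap: https://yourdomain.com/sitemap.xml

If you see Disallow: / under User-agent: *, every page on the site is blocked from crawling.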
Accidental noindex tags (or headers)
Identify it – View page source and look for <meta name="robots" content="noindex">. Or check response headers for X‑Robots‑Tag: noindex.
Fix it – Remove unintended noindex tags/headers on pages you want found; then Request indexing in GSC.
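For reference, this is the tag to look for in a page’s <head>; if a page you want found contains it, delete it (or change the content to index, follow), then request indexing:

<meta name="robots" content="noindex">

The header-based equivalent appears as X‑Robots‑Tag: noindex in the HTTP response and is usually set in server or CDN configuration rather than in the page itself.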
No (or broken) XML sitemap

Identify it – Check yourdomain.com/sitemap.xml (or sitemap_index.xml). In GSC → Sitemaps, see status, errors, and last read.
Fix it – Generate an XML sitemap from your CMS or a plugin; keep within 50MB/50,000 URLs per file; submit in GSC and reference from robots.txt.
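A minimal sitemap file, shown here with placeholder URLs and dates, looks like this; most CMSs and SEO plugins generate and update it automatically:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yourdomain.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://yourdomain.com/services/</loc>
  </url>
</urlset>

Reference it from robots.txt with a line such as Sitemap: https://yourdomain.com/sitemap.xml.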
Orphan pages (no internal links)
Identify it – Crawl your site (Screaming Frog/SEO tools) and list pages with 0 inbound internal links. SEJ notes sitemaps help discovery but orphan pages still need internal links.
Fix it – Link orphan pages from navigation, hubs, or relevant content; ensure the canonical (see #11) version is linked.
Not mobile‑friendly / content differs on mobile
Identify it – Run Google’s Mobile‑Friendly/mobile‑first checks; compare mobile and desktop content/metadata for parity.
Fix it – Use responsive design; ensure the same primary content and robots meta across mobile/desktop; avoid lazy‑loading critical content on interaction.
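One quick sanity check: a responsive page needs a viewport meta tag in its <head>; without it, phones render the desktop layout shrunk to fit:

<meta name="viewport" content="width=device-width, initial-scale=1">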
Weak Core Web Vitals (speed, responsiveness, stability)

Identify it – In GSC → Core Web Vitals and PageSpeed Insights: check LCP, INP, CLS. Targets: LCP ≤ 2.5s, INP ≤ 200ms, CLS < 0.1.
Fix it – Optimize images (compression, proper sizes), defer non‑critical JS/CSS, reduce main‑thread work, reserve space to prevent layout shift.
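Two common quick wins, sketched with placeholder file names (hero.webp, analytics.js): preload the largest above-the-fold image so the browser fetches it early, and defer scripts that aren’t needed to render the page:

<link rel="preload" as="image" href="/images/hero.webp">
<script src="/js/analytics.js" defer></script>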
Images cause slow loads or layout shift (CLS)
Identify it – PageSpeed Insights flags large images and unsized media contributing to CLS.
Fix it – Serve appropriately sized, compressed images; include width/height (or CSS aspect‑ratio); use modern formats (WebP/AVIF).
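For example, giving each image explicit dimensions lets the browser reserve space before the file arrives, which prevents layout shift (the file name and alt text below are placeholders; reserve loading="lazy" for images below the fold, never the main hero image):

<img src="/images/storefront.webp" width="800" height="600" alt="Our storefront" loading="lazy">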
Duplicate URLs are splitting ranking signals
Identify it – Find duplicates from parameters (?utm=), http vs https, www vs non‑www, printer pages, or trailing slashes. GSC may show “Alternate page with proper canonical tag” or “Duplicate, Google chose different canonical”.
Fix it – Consolidate with redirects and/or <link rel="canonical"> to a single preferred URL; point internal links and sitemaps to the canonical.
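Assuming the https, non‑www version is your preferred format, a canonical tag in the <head> of each variant looks like this (the URL is a placeholder):

<link rel="canonical" href="https://yourdomain.com/services/">

Server-side 301 redirects to the preferred host are typically configured in your CMS, hosting panel, or CDN; on Apache, one common .htaccess sketch is:

RewriteEngine On
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} ^www\. [NC]
RewriteRule ^ https://yourdomain.com%{REQUEST_URI} [L,R=301]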
URL parameters creating crawl waste & index noise
Identify it – GSC indexing reports show parameterized URLs; ecommerce filters/facets can spawn many near‑duplicate URLs that waste crawl budget and dilute ranking signals.
Fix it – Add canonicals to the clean URL; avoid blocking parameters in robots unless truly necessary; let Google ignore them when canonical signals are clear.
Content isn’t helpful or trustworthy (E‑E‑A‑T)
Identify it – Compare your pages to Google’s people‑first content questions; thin, generic content and lack of author expertise are red flags.
Fix it – Add specific detail, first‑hand experience, sources, author bios, and customer‑centric answers (pricing, process, service areas). E‑E‑A‑T isn’t a single ranking factor, but aligning improves quality signals.
Missing LocalBusiness structured data
Identify it – Check pages with your location info; run them in Google’s Rich Results Test. No LocalBusiness markup = missed rich results eligibility.
Fix it – Add JSON‑LD LocalBusiness (or a subtype like Dentist/Restaurant) including NAP, hours, geo, and images; validate and keep it consistent with on‑page content.
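A minimal JSON‑LD sketch—every detail below is a placeholder, and the subtype (here Plumber) and values should mirror exactly what’s visible on the page:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Plumber",
  "name": "Example Plumbing Co.",
  "url": "https://yourdomain.com/",
  "telephone": "+1-555-555-0123",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Main Street",
    "addressLocality": "Springfield",
    "addressRegion": "IL",
    "postalCode": "62701",
    "addressCountry": "US"
  },
  "geo": {
    "@type": "GeoCoordinates",
    "latitude": 39.7817,
    "longitude": -89.6501
  },
  "openingHours": "Mo-Fr 08:00-17:00",
  "image": "https://yourdomain.com/images/storefront.webp"
}
</script>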
Structured data is invalid or incomplete
Identify it – Run the page in Rich Results Test; check for errors/warnings (missing required fields).
Fix it – Correct required properties; ensure JSON‑LD matches visible content and isn’t blocked by robots/noindex. Monitor structured‑data reports in GSC.
GBP categories are wrong or too generic
Identify it – In GBP, check Primary and Additional categories; mismatched categories hurt relevance.
Fix it – Choose the most specific primary category (e.g., “Pizza restaurant,” not just “Restaurant”) and add relevant secondary categories; review and update as services evolve.
NAP (Name, Address, Phone) inconsistencies across the web
Identify it – Audit major listings: Google, Bing Places, Apple Maps, Yelp, Facebook, Yellow Pages. Inconsistencies confuse search engines and users.
Fix it – Standardize NAP formatting everywhere; correct duplicates/old data; maintain ongoing citation hygiene.
Weak review profile (few reviews, no responses)

Identify it – How many reviews and how recent? Are you responding? Google highlights reviews in local rankings.
Fix it – Invite satisfied customers to leave reviews (no incentives or gating); respond to all reviews with specifics and care; maintain a steady cadence.
Service‑area or map pin inaccuracies
Identify it – In GBP, confirm your address, pin placement, and service areas. Misplaced pins confuse proximity signals.
Fix it – Set precise service‑area boundaries and confirm the pin; add unique location pages with localized content if you serve multiple areas.
Too few high‑quality local citations
Identify it – Check presence on reputable directories (chamber of commerce, niche sites) and data aggregators; weak citation footprint hurts prominence.
Fix it – Build consistent citations on trusted local/niche sites; maintain accuracy over time and monitor for duplicates.
Limited link authority (weak or irrelevant backlinks)
Identify it – Review referring domains; compare to competitors. Google uses links among many signals—quality/relevance matter far more than quantity.
Fix it – Earn links via local PR, partnerships, sponsorships, guides/resources worth referencing; avoid manipulative schemes.
Crawl budget wasted on low‑value/duplicate URLs
Identify it – GSC shows lots of crawled but not indexed pages, parameter variants, or duplicated content clusters.
Fix it – Reduce crawl waste: canonicalize variants, noindex truly low‑value pages (and don’t block with robots if you rely on noindex), simplify architecture, and guide bots to important pages.
Faceted navigation (filters) creating index bloat (ecommerce)
Identify it – Many filter combinations indexed (colour × size × brand) with thin content; “duplicate/canonicalized” reports spike.
Fix it – Keep facets usable for visitors but canonicalized to the base category, avoid linking to infinite combinations, and consider rendering filters without generating unique indexable URLs.
Missing or misused X‑Robots‑Tag for non‑HTML files
Identify it – PDFs/images indexed when they shouldn’t be; or blocked via robots, which can’t enforce noindex.
Fix it – Use X‑Robots‑Tag: noindex in server headers for non‑HTML assets you don’t want indexed; don’t rely on robots.txt for removal.
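For example, to keep all PDFs out of the index on an Apache server (nginx has an equivalent add_header directive), a sketch like this can go in the server config or .htaccess:

<FilesMatch "\.pdf$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>

Remember that the crawler has to be able to fetch the file to see this header, so don’t also disallow it in robots.txt.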
Sitemaps don’t reflect site changes (or mix non‑indexable URLs)
Identify it – GSC reports errors, old/removed URLs, or noindex pages included in the sitemap.
Fix it – Keep dynamic sitemaps updated; include only canonical, indexable URLs; break into multiple sitemaps if needed and submit a sitemap index.
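If you split sitemaps, a sitemap index ties them together; a minimal sketch with placeholder file names:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://yourdomain.com/sitemap-pages.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://yourdomain.com/sitemap-posts.xml</loc>
  </sitemap>
</sitemapindex>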
“Set it and forget it” — no monitoring or iteration
Identify it – No routine reviews of GSC (Indexing, Core Web Vitals, Enhancements), GBP Insights, or analytics; issues linger for months.
Fix it – Establish monthly checks. In GSC: indexing reasons, CWV trends, and Enhancement/Schema reports. In GBP: views, calls, directions, and photo engagement; update hours, posts, and Q&A. Adjust based on what users search for and how they interact.
Your 30‑Day Action Plan
This is a dedicated, stand‑alone plan you can execute over four weeks. It maps directly to the 25 issues in the how‑to guide and sequences them from foundational fixes to growth tasks. Use it as a checklist with concrete deliverables and success metrics.
Week 0 (Day 0–1): Prep & Access
Goals: Ensure you can see what search engines see and make changes quickly.
- Access & tools
- Confirm ownership in Google Search Console (GSC) for all domains.
- Log into/claim your Google Business Profile (GBP).
- Verify access to your CMS, analytics, and hosting/CDN.
- Baseline snapshots
- Record current indexed page count (site:yourdomain.com).
- Export GSC Pages and Core Web Vitals reports.
- Screenshot GBP (listing completeness, reviews, hours).
- Backups & safety
- Enable site backups and version control for theme/code changes.
Deliverables: Tool access confirmed, baseline metrics saved, rollback plan in place.
Week 1 (Days 2–7): Indexing Hygiene (Issues #1–#6)
Focus: Make the site discoverable and indexable; eliminate basic blockers.
- Index audit (#1)
- Run a site: search and the GSC Pages report; list missing/important URLs.
- Robots.txt review (#3)
- Remove broad blocks (e.g., Disallow: /) that hide content you want indexed.
- Noindex cleanup (#4)
- Search for unintended <meta name="robots" content="noindex"> or X‑Robots‑Tag: noindex on key pages; remove.
- Sitemap setup (#5)
- Generate XML sitemap(s), include only canonical, indexable URLs; submit in GSC; reference in robots.txt.
- URL Inspection & re‑crawl (#1)
- Inspect priority pages; Request indexing once blockers are fixed.
- Orphan pages (#6)
- Identify pages with 0 inbound internal links; add links from menus, hubs, and relevant posts.
Deliverables: Clean robots.txt, no unintended noindex, valid sitemap submitted, list of fixed pages, internal links added.
Success metric: Indexed page count up; “Blocked by robots.txt / noindex” reasons drop in GSC.
Week 2 (Days 8–14): Mobile Experience & Speed (Issues #7–#9)
Focus: Meet mobile‑first expectations and Core Web Vitals targets.
- Mobile parity & UX (#7)
- Confirm responsive layout; ensure the same primary content/metadata on mobile and desktop; remove interaction‑gated (lazy) content.
- Core Web Vitals plan (#8)
- From GSC/PSI, list LCP/INP/CLS offenders by template (home, product/service, blog).
- Performance fixes (#8, #9)
- Optimize/resize/compress images (WebP/AVIF); set width/height/aspect‑ratio to prevent CLS.
- Defer non‑critical JS/CSS; reduce main‑thread work; cache and preconnect as needed.
Deliverables: Before/after PSI screenshots, fixed templates/components, image optimization report.
Success metric: ≥80% of key URLs achieve LCP ≤ 2.5s, INP < 200ms, CLS < 0.1.
Week 3 (Days 15–21): Canonicals, Architecture & Crawl Efficiency (Issues #10–#12, #21–#22)
Focus: Consolidate signals, reduce duplication, guide crawlers to the right pages.
- Canonical policy (#10, #11)
- Choose preferred URL formats (https, non‑www, trailing slash policy).
- Implement <link rel="canonical"> on duplicates and 301 redirects where appropriate.
- Internal linking & navigation (#6, #10)
- Ensure menus, breadcrumbs, and contextual links point to canonical URLs.
- Parameters & faceted navigation (#11, #22)
- Canonicalize filter pages to the base category; avoid creating indexable infinite combinations.
- Crawl waste review (#21)
- Reduce low‑value/thin URLs; consider noindex for non‑essential pages (do not block those with robots if you rely on noindex).
Deliverables: Canonical/redirect map, updated internal links, facet/parameter rules doc.
Success metric: GSC “Duplicate/Chosen canonical” issues trend down; crawl stats show more hits to priority pages.
Week 4 (Days 22–30): Local Presence, Entity Clarity, Trust & Ongoing Monitoring
Focus: Strengthen local signals, structured data, reviews, and authority; set up continuous oversight.
Local Presence & Relevance (Issues #2, #15–#19)
- GBP optimization & verification (#2, #15)
- Confirm verification; set precise primary and secondary categories; update hours (including holidays), description, attributes, and add real photos.
- Service area & pins (#18)
- Correct map pin and service‑area boundaries; publish unique, localized location pages.
- NAP consistency & citations (#16, #19)
- Audit and correct NAP across Google, Bing Places, Apple Maps, Yelp, Facebook, Yellow Pages, industry directories; remove duplicates; add missing citations.
- Reviews system (#17)
- Launch review requests post‑purchase/service; reply to every review; set weekly cadence.
Entity Clarity & Rich Results (Issues #13–#14)
- LocalBusiness schema (JSON‑LD) (#13)
- Add/validate on each location page; include name, URL, phone, address, geo, hours, images; ensure parity with on‑page content.
- Structured‑data QA (#14)
- Fix Rich Results Test errors/warnings; monitor GSC enhancements.
Authority & Links (Issue #20)
- Local link outreach (#20)
- Target local media, associations, partners, sponsorships; publish helpful resources to earn natural citations.
Monitoring & Iteration (Issues #24–#25)
- Monthly dashboards (#24–#25)
- Create a recurring checklist to review GSC Indexing, CWV, Enhancements; GBP Insights (views, calls, directions); analytics for organic traffic/leads.
- Keep sitemaps updated and clear of removed/noindex URLs.
Deliverables: Optimized GBP, citation/NAP audit log, reviews playbook, schema validation screenshots, outreach list, monitoring checklist.
Success metrics:
- GBP discovery views, calls, and directions increase; review count and response rate improve.
- Rich results eligibility appears in GSC; organic impressions/clicks rise for local service queries.
Address items in sequence; you’ll typically see indexing improve first, then impressions, and then rankings and customer actions—especially in local results.
When To Hire An Agency
Tackling your website shouldn’t be the focus of running your business, especially if your time is better spent generating revenue or solving the problems you’re best suited for. If you need help with your website, call us and we can help get it found.
Frequently Asked Questions
How long will it take for my fixes to show up in Google?
After you publish changes (e.g., removing noindex, updating robots.txt, or adding a sitemap), Google usually needs several days to recrawl and re‑index. You can speed this up by using URL Inspection to request indexing in Search Console, but reprocessing still takes time.
Do this: Submit your updated XML sitemap and use Request indexing for priority URLs. Track status in GSC → Indexing → Pages.
What’s the difference between crawling and indexing?
Crawling is when Googlebot discovers and fetches your pages. Indexing is when Google stores and decides whether to show them in results. A page can be crawled but not indexed (e.g., thin content, duplicates) or indexed without full crawl access if it’s discovered via links—but that’s rare and limited.
Do this: Ensure pages aren’t blocked by robots.txt, avoid unintended noindex, and provide a clean sitemap.
Should I use noindex or Disallow in robots.txt to hide pages?
Use noindex when you want a page accessible to crawlers but not shown in search. Use Disallow in robots.txt when pages should not be crawled at all (e.g., admin areas). Don’t combine them on the same URL—if a page is disallowed, Google can’t see the noindex.
Do this: Pick one: noindex or Disallow, based on whether you want the content crawled. Validate your setup in Search Console.
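For illustration (the /admin/ path is a placeholder): on a page you want crawled but hidden from results, add the meta tag to that page; for a section that shouldn’t be crawled at all, disallow it in robots.txt.

On the page itself:
<meta name="robots" content="noindex">

In robots.txt:
User-agent: *
Disallow: /admin/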
Do I really need an XML sitemap for a small site?
While Google can discover pages via links, most sites benefit from a sitemap because it accelerates discovery, helps Google understand canonical URLs, and provides indexing feedback in GSC. Keep only canonical, indexable URLs in the file.
Do this: Create sitemap.xml, keep within 50,000 URLs / 50MB, submit in GSC, and reference it in robots.txt.
My site is responsive—why am I still struggling on mobile?
Google uses mobile‑first indexing, and ranking systems reward good Core Web Vitals. A site can be responsive yet still slow, unresponsive (high INP), or unstable (high CLS) on mobile.
Do this: Use PageSpeed Insights and GSC Core Web Vitals. Target LCP ≤ 2.5s, INP < 200ms, CLS < 0.1; ensure mobile parity of content and metadata.



