Open Graph & Metadata Preview — About

Dive deeper into how the metadata fetcher works, the safeguards in place, and best practices for running link audits.

Key points

Metascraper at the core

We rely on metascraper packages to normalize titles, descriptions, imagery, and publisher signals so previews mirror what major platforms expect.

Respectful crawling defaults

The serverless endpoint follows redirects, times out quickly, and refuses pages marked with robots noarchive directives to avoid violating site policies.

Shareable yet sanitized

Only cleaned text fields, canonical URLs, and absolute media links are returned. No raw HTML or scripts pass through to the browser.

The Open Graph & Metadata Preview helps teams avoid nasty surprises when a link is posted to social networks or chat tools. Instead of waiting for a platform to re-scrape your site, you can validate metadata locally and share the results with stakeholders in seconds.

Fetch strategy

Requests flow through a serverless function that follows redirects, applies a 10-second timeout, and strips cookies. It inspects both HTTP headers and on-page robots directives before returning any data. If a page disallows archiving, the tool reports that status instead of exposing the content.

What gets returned

  • Normalized title, description, author, publisher, canonical URL, language, and imagery fields.
  • All Open Graph and Twitter Card meta tags with copy-to-clipboard controls for each value.
  • Every JSON-LD script block parsed and formatted for schema validators.

Why caching exists

To keep the service responsive, responses are cached for 24 hours at the edge. That keeps repeat checks fast while giving authors enough time to fix metadata before the next scrape. You can always trigger a fresh fetch once your updates are live.
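Assuming the edge cache is driven by standard response headers (a common serverless setup; the exact directives here are an assumption, not the tool's published configuration), the 24-hour window might look like:

```python
def cache_headers(ttl_seconds: int = 24 * 60 * 60) -> dict[str, str]:
    """Shared (edge) caches may keep the response for ttl_seconds;
    max-age=0 keeps browsers revalidating, so triggering a fresh
    fetch after an update stays cheap."""
    return {"Cache-Control": f"public, max-age=0, s-maxage={ttl_seconds}"}
```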

Operational considerations

Treat the tool as a staging aid: keep an eye on HTTP response statuses, be mindful of rate limits, and coordinate with site owners when testing third-party URLs. This mindset keeps the service sustainable and respectful of origin servers.

How to use this tool

  1. What the preview excels at

    Providing a fast, policy-aware look at link unfurls so marketing, editorial, and engineering teams can validate messaging before launches.
  2. How to use it responsibly

    Only fetch pages that are public or belong to properties you manage. Avoid sensitive dashboards, paywalled articles, or personal documents.
  3. Limitations to keep in mind

    The tool does not execute client-side JavaScript, emulate logged-in sessions, or bypass geo restrictions. Use browser developer tools for deeper debugging.

Frequently asked questions

Which user agent does the tool use?
It identifies as a generic metadata fetcher to avoid masquerading as consumer apps. That helps site owners trace requests and block the tool if needed.
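As a sketch, request headers for such a fetcher might look like this (the user-agent string and URL are placeholders, not the tool's real identity):

```python
FETCHER_HEADERS = {
    # A descriptive, non-browser UA so site owners can trace the tool
    # in access logs and block it if needed. Placeholder values only.
    "User-Agent": "MetadataFetcher/1.0 (+https://example.com/about)",
    "Accept": "text/html,application/xhtml+xml",
}
```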
What should I do if a site blocks the fetch?
Respect the block. Contact the site owner if you need preview access or test on a staging URL instead. The tool will not bypass authentication or restrictive robots rules.
Can I export the structured data?
Yes. Each JSON-LD block includes a copy button so you can paste the sanitized payload into validators, CMS previews, or change logs.

The previewer is for diagnostics, not for archiving proprietary content. Cached responses expire within 24 hours, but you should still avoid submitting anything private or governed by access agreements.

Open Graph & Metadata Preview — About & FAQ | WebUtility.org