Juhlinus/depictr

This repository has been archived by the owner on Nov 3, 2020. It is now read-only.
📽 Depictr

💰 Is this useful to you?

Consider sponsoring me on GitHub! 🙏

💾 Installation

composer require juhlinus/depictr
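After installation, the middleware has to be attached to the routes you want pre-rendered. A minimal sketch for a Laravel app is shown below; note that the class name `\Depictr\Middleware::class` is an assumption for illustration, so check the package's service provider for the actual class before copying this.

```php
<?php

// app/Http/Kernel.php
// NOTE: \Depictr\Middleware::class is an assumed name; verify it
// against the package source before using.
protected $middlewareGroups = [
    'web' => [
        // ... existing web middleware ...
        \Depictr\Middleware::class, // serve pre-rendered HTML to crawlers
    ],
];
```

With the middleware in the `web` group, ordinary visitors get the normal response while requests from the configured crawlers receive a rendered static page.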

📝 Config

You can publish the config by running the artisan vendor:publish command like so:

php artisan vendor:publish --provider="Depictr\ServiceProvider"

🕷 Crawlers

The following crawlers are defined out of the box:

return [
    'crawlers' => [
        /*
        |--------------------------------------------------------------------------
        | Search engines
        |--------------------------------------------------------------------------
        |
        | These are the regular search engines that crawl your website on a
        | regular basis and are crucial if you want good SEO.
        |
        */
        'googlebot',            // Google
        'duckduckbot',          // DuckDuckGo
        'bingbot',              // Bing
        'yahoo',                // Yahoo
        'yandexbot',            // Yandex

        /*
        |--------------------------------------------------------------------------
        | Social networks
        |--------------------------------------------------------------------------
        |
        | Allowing social networks to crawl your website will help the social
        | networks to create "social-cards" which is what people see when
        | they link to your website on the social network websites.
        |
        */
        'facebookexternalhit',  // Facebook
        'twitterbot',           // Twitter
        'whatsapp',             // WhatsApp
        'linkedinbot',          // LinkedIn
        'slackbot',             // Slack

        /*
        |--------------------------------------------------------------------------
        | Other
        |--------------------------------------------------------------------------
        |
        | For posterity's sake you want to make sure that your website can be
        | crawled by Alexa. This will archive your website so that future
        | generations may gaze upon your craftsmanship.
        |
        */
        'ia_archiver',          // Alexa
    ],
];
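Each entry is a token that is presumably matched against the request's User-Agent header, so adding another crawler is just a matter of appending its token to the published config. For example, to also serve pre-rendered pages to Baidu's crawler (a hypothetical addition, not part of the default list):

```php
<?php

// config/depictr.php (published config)
return [
    'crawlers' => [
        // ... the defaults shown above ...
        'baiduspider',      // Baidu (custom addition)
    ],
];
```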

⛔ Exclusion

Depictr lets you exclude an array of URLs that shouldn't be processed.

This is useful for plain-text files like sitemap.txt, which Panther would otherwise wrap in a stripped-down HTML file. Wildcards are permitted.

By default, the admin route and its sub-routes are excluded in the config file.
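A sketch of what the exclusion list in the published config might look like; the `excluded` key name is an assumption for illustration, so check it against your published config file:

```php
<?php

// config/depictr.php (published config)
// NOTE: the 'excluded' key name is assumed; verify against your file.
return [
    'excluded' => [
        'admin*',        // the default: admin route and all sub-routes
        'sitemap.txt',   // plain-text files should not be wrapped in HTML
    ],
];
```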

🏞 Environments

You can specify which environments Depictr should run in. The default is testing and production.
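In the published config this would look something like the fragment below; the `environments` key name is an assumption based on the section above, so verify it against your published file:

```php
<?php

// config/depictr.php (published config)
// NOTE: the 'environments' key name is assumed; verify against your file.
return [
    'environments' => [
        'testing',
        'production',
    ],
];
```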

About

A middleware for rendering static pages when crawled by search engines
