How to configure the GEO Suite module?

 

 

Module - GEO Suite

 

 

The GEO Suite module optimizes the visibility of your PrestaShop store for AI assistants and search engines. It offers three main features: instant submission of your URLs to search engines (which AI assistants rely on) via the IndexNow protocol; automatic generation of an llms.txt file and of structured markdown content for your products, categories, and CMS pages, making information extraction easier for AIs; and an AI bot traffic analysis tool to evaluate the effectiveness of your Generative Engine Optimization (GEO) strategy.
This guide takes you step by step through the configuration and optimal use of the module.


1. IndexNow Protocol Integration


The IndexNow feature allows you to instantly submit your product, category, and CMS page URLs to compatible search engines such as Bing or Yandex (used by generative AIs like ChatGPT), speeding up their indexing. This ensures that new products or price changes in your catalog appear much faster in shopping results and AI-powered smart search modules.

Mandatory prerequisite step: Before configuring IndexNow in the module, you must register your site on Bing Webmaster Tools and validate your domain ownership. For a step-by-step guide, see our dedicated FAQ.

This step is essential for IndexNow submission to work. Without validation, URLs will not be accepted by Bing. For more technical information about the protocol, see the official IndexNow documentation.
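For context, an IndexNow submission is simply an HTTP POST with a small JSON body. The sketch below shows the kind of request the module sends on your behalf; the domain, key, and URLs are placeholders, and you never need to run this yourself:

```shell
# Illustrative IndexNow submission (the module handles this automatically).
# "host" and "key" are placeholders: use your shop's domain and the key
# generated by the module (also served as a .txt file at your site root).
curl -X POST "https://api.indexnow.org/indexnow" \
  -H "Content-Type: application/json; charset=utf-8" \
  -d '{
    "host": "example.com",
    "key": "aaaa1111bbbb2222cccc3333dddd4444",
    "keyLocation": "https://example.com/aaaa1111bbbb2222cccc3333dddd4444.txt",
    "urlList": [
      "https://example.com/en/3-clothes",
      "https://example.com/en/1-product.html"
    ]
  }'
```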


a) Configuration

 

  1. Your IndexNow API key is generated automatically by the module. You can regenerate it at any time if needed:

     

     

  2. In the IndexNow > Configuration tab, enable IndexNow integration.

     

     

  3. Select the types of content to submit (products, categories, CMS). In the example below, only products and categories are selected for real-time indexing:

     

     

  4. Enable the automatic submission option if you want new URLs to be sent without manual intervention.

     

     

  5. Select search engines to notify via IndexNow:

     

     

  6. Click Save to confirm the configuration.


b) First URL submission (First Submission and Queue Management tabs)


Before IndexNow automation takes over, it is mandatory to perform an initial submission of all your shop's URLs. This step initializes the indexing of your site with compatible search engines.

  1. Go to the First Submission tab to bulk submit all the URLs in your catalog (products, categories, CMS pages). Select a first search engine:

     

     

  2. Click the "Queue all my shop's URLs" button:

     

     

    A pop-up window opens and the queuing process begins.

  3. Close the pop-up once all your URLs have been queued.
  4. Repeat the operation for each remaining search engine.
  5. Then go to the Queue Management tab and click on the "Run the entire queue" button:

     

     

  6. Monitor the progress of the submission, check for any errors, and restart submissions if necessary.

Why is this step mandatory? Without this initial submission, search engines will not receive all your URLs and automation will not work properly. Once this step is completed, new URLs or changes will be submitted automatically by the module, without manual intervention.

Note: After the initial submission, it is generally no longer necessary to use these tabs, except in the following cases:


c) IndexNow Dashboard


The Dashboard tab allows you to view in real time the status of your shop's IndexNow submissions. It centralizes all useful information for tracking and analyzing the indexing of your pages.

The dashboard is particularly useful for:


d) Manual submission


The Manual Submission tab allows you to individually submit one or more specific URLs to IndexNow, outside of the automatic or bulk submission flows.

Main use cases:

Manual submission is complementary to automation: it offers targeted, on-demand control over the indexing of strategic pages.

 

 


2. Automatic generation of the llms.txt and markdown content files


The module automatically generates an llms.txt file, placed at the root of your site. This file follows a standard designed to help artificial intelligence (AI) systems and search engines quickly understand the structure and content of your shop. It acts as a simplified access map, specially formatted for robots, allowing them to easily identify your products, categories, and important pages.

In addition, the module generates markdown files for each product, category, and CMS page. These files provide a clear, structured, and lightweight version of your content, ideal for indexing by AIs and modern search engines.

This automated generation allows you to keep your site always up to date for robots, with no manual effort. See the official llms.txt documentation for more information.
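To give an idea of what is produced, here is a minimal, hypothetical llms.txt following the standard's structure (a title, a short summary, then sections of links); the actual content depends on your catalog and your module settings:

```markdown
# My Shop

> Online store selling clothing and accessories, with worldwide shipping.

## Products

- [Hummingbird printed t-shirt](https://example.com/en/1-t-shirt.md): regular-fit cotton t-shirt
- [Mug](https://example.com/en/2-mug.md): 320 ml ceramic mug

## Categories

- [Clothes](https://example.com/en/3-clothes.md): t-shirts, sweaters and accessories
```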


a) Configuration


The LLMS > Configuration tab allows you to precisely define which content will be included in the llms.txt file and the markdown files generated by the module. You can choose whether or not to include products, categories, CMS pages, enable multilingual support, and add additional information.

  1. In the LLMS > Configuration tab, enable or disable the inclusion of products, categories, and CMS pages as needed.

     

     

  2. Enable the Multilingual support option if you want to generate files for each language of your shop.

     

     

  3. If needed, enter additional information in the dedicated field (e.g. legal notices, shop information).

     

     

  4. Click the Save button to save your settings.
  5. Then go to the File Generation tab.


b) File generation


The File Generation tab allows you to manually launch the generation or regeneration of the llms.txt file and markdown files. A summary of the generation (number of files, duration, etc.) is displayed after each operation.

Note:

  1. Click the File Generation button to start creating the llms.txt index and content files:

     

     

  2. You can follow the steps of the file generation:

     

     

Once the generation is complete, where are the files located?

You can also view the files in the Generated Files tab.

Note: After generating your files, refresh the page if you want to view them in the "Generated Files" tab (see below).
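Since the llms.txt index is placed at the root of your site, you can also check from the command line that it is publicly reachable (replace example.com with your shop's domain):

```shell
# Fetch the response headers for the generated index at the site root.
curl -I https://example.com/llms.txt
```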


c) Viewing generated files


The Generated Files tab allows you to view, filter, and access all files generated by the module (main llms.txt file, product, category, CMS markdown files). You can filter by content type, view the content of a file, and download if needed.

  1. After refreshing the page, click the "View llms.txt" button to view the main file in a new tab:

     

     

  2. Select the type of content file to view, then click "Filter":

     

     

  3. For each file, click "View" to open it in a new tab:

     

     


d) Setting up the CRON task


Why set up a CRON task? Even though real-time generation covers most needs, the CRON task acts as an extra safeguard to ensure your entire catalog is up to date, without manual intervention.

Mandatory prerequisite: To use automatic generation via CRON, you must:

  1. Have access to your hosting's CRON configuration (cPanel, Plesk, SSH access, etc.).
  2. Check that your host allows the execution of external CRON tasks.

If you do not have access to the CRON configuration, contact your host or system administrator.

  1. In the LLMS > CRON Task tab, copy the CRON URL provided by the module.

     

     

  2. Add this URL to your hosting's CRON configuration (using cURL, wget, or PHP CLI as you prefer). You can copy the example CRON commands provided by the module.

     

     

  3. Save the configuration on your server.
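As an illustration of step 2, here is what a crontab entry could look like for a daily run at 03:00. The URL below is a placeholder: copy the exact CRON URL (including its security key) from the LLMS > CRON Task tab.

```shell
# m h dom mon dow  command
# Daily at 03:00 via cURL (placeholder URL: use the one shown by the module):
0 3 * * * curl -s "https://example.com/module/cron?token=SECURITY_KEY" > /dev/null 2>&1

# wget alternative:
# 0 3 * * * wget -q -O /dev/null "https://example.com/module/cron?token=SECURITY_KEY"
```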

Note: If you reset the security key using the dedicated button under the CRON URL, remember to update the CRON URL on your server. Otherwise, previously configured CRON tasks will stop working.

Tips:

 

 


3. AI Bot Traffic Analysis


The module provides analysis tools to track visits from search and AI bots (Googlebot, Bingbot, etc.) to your site, via two modes: simple (front buffer) or advanced (log analysis).

Mandatory prerequisite: To use the advanced mode (log analysis), you must:

  1. Have access to your server log files (usually via your host or FTP/SFTP access).
  2. Check that the folder where your logs are stored is authorized by the module (ask your site administrator or webmaster).

Without access to these files, the advanced mode will not work.


a) Configuration

 

  1. In the Data Analysis tab, choose the analysis mode:
    • Simple Mode (recommended): front buffer mode (user-agent collection via JavaScript)

       

    • or Advanced Mode: log analysis mode (classic server-log analysis via CRON)

       

       

  2. For advanced mode, specify the log file path; it must be within a path authorized by PHP's open_basedir restriction. Example: my_site/logs/access.log and, if needed, the custom log format.

     

     

  3. Click Save to save the configuration.
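If you have shell access, a quick way to confirm that your access log actually contains bot visits is to grep it for known bot user agents. The demo below runs on two sample log lines (the GPTBot user-agent string is illustrative); in practice, point the command at the log file configured above instead:

```shell
# Count lines whose user agent matches a known search/AI bot.
printf '%s\n' \
  '203.0.113.7 - - [12/May/2025:09:14:32 +0200] "GET /en/3-clothes HTTP/1.1" 200 5123 "-" "Mozilla/5.0 (compatible; GPTBot/1.0; +https://openai.com/gptbot)"' \
  '198.51.100.4 - - [12/May/2025:09:15:01 +0200] "GET / HTTP/1.1" 200 1020 "-" "Mozilla/5.0 (Windows NT 10.0)"' \
  | grep -E -i -c 'gptbot|bingbot|googlebot|applebot'
```

With a real file, the equivalent is: `grep -E -i -c 'gptbot|bingbot|googlebot|applebot' my_site/logs/access.log`.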

Tips:


b) Log analysis (for advanced analysis mode)


The Log Analysis tab provides the CRON task URL to configure on your server to automate the analysis of server log files (advanced analysis mode only). This analysis detects the visits of bots (Googlebot, Bingbot, etc.) to your site and provides detailed statistics on their activity.

Once the CRON task is configured, log analysis runs automatically and at regular intervals, without manual intervention, and remains reliable even with large log volumes.


c) AI Bot Traffic analysis (AI Bot Traffic button)


The AI Bot Traffic tab in your PrestaShop back office (accessible from the left menu or by clicking the blue AI Bot Traffic button in the module) allows you to view and analyze the visits of the main indexing robots (Googlebot, Bingbot, Applebot, etc.) and other bots to your shop. This interface helps you to:

Features and usage:

Tip: Click on an item in the chart legend to hide or show the curve for a specific bot.

 
