How to configure the GEO Suite module?
Module - GEO Suite
The GEO Suite module optimizes the visibility of your PrestaShop store for AI assistants and search engines. It submits your URLs instantly to search engines (which AI assistants rely on) via the IndexNow protocol, automatically generates an llms.txt file and structured markdown content for your products, categories, and CMS pages to make information extraction easier for AIs, and provides an AI bot traffic analysis tool to evaluate the effectiveness of your Generative Engine Optimization (GEO) strategy.
This guide takes you step by step through the configuration and optimal use of the module.
1. IndexNow Protocol Integration
The IndexNow feature lets you instantly submit your product, category, and CMS page URLs to compatible search engines such as Bing or Yandex (which generative AIs like ChatGPT rely on), speeding up their indexing. This helps new products or price changes in your catalog appear much faster in shopping results and AI-powered smart search features.
Mandatory prerequisite step: Before configuring IndexNow in the module, you must register your site on Bing Webmaster Tools and validate your domain ownership. For a step-by-step guide, see our dedicated FAQ.
This step is essential for IndexNow submission to work. Without validation, URLs will not be accepted by Bing. For more technical information about the protocol, see the official IndexNow documentation.
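For reference, here is what an IndexNow submission looks like at the protocol level. The module builds and sends this request for you; the sketch below is only an illustration, in Python, and the domain, API key, URL list, and key file location are placeholders (the key file location shown is the protocol's default convention, assumed here).

```python
import requests

# Placeholder values for illustration only — the module generates and stores the real key.
HOST = "your-shop.com"
API_KEY = "your-indexnow-api-key"
URLS = [
    "https://your-shop.com/new-product",
    "https://your-shop.com/category/updated-category",
]

payload = {
    "host": HOST,
    "key": API_KEY,
    # Assumed default location of the text file proving key ownership.
    "keyLocation": f"https://{HOST}/{API_KEY}.txt",
    "urlList": URLS,  # the spec allows up to 10,000 URLs per POST
}

response = requests.post(
    "https://api.indexnow.org/indexnow",
    json=payload,
    headers={"Content-Type": "application/json; charset=utf-8"},
    timeout=10,
)

# 200 or 202 means the URLs were accepted for processing;
# 403 typically means the key file could not be verified.
print(response.status_code)
```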
a) Configuration
- Your IndexNow API key is already generated automatically by the module. You can regenerate it if you want:
- In the IndexNow > Configuration tab, enable IndexNow integration.
- Select the types of content to submit (products, categories, CMS). In the example below, only products and categories are selected for real-time indexing:
- Enable the automatic submission option if you want new URLs to be sent without manual intervention.
- Select search engines to notify via IndexNow:
- Click Save to confirm the configuration.
b) First URL submission (First Submission and Queue Management tabs)
Before IndexNow automation takes over, it is mandatory to perform an initial submission of all your shop's URLs. This step initializes the indexing of your site with compatible search engines.
- Go to the First Submission tab to bulk submit all the URLs in your catalog (products, categories, CMS pages). Select a first search engine:
- Click the "Queue all my shop's URLs" button:
A pop-up window opens and the queuing process begins.
- Close the pop-up once all your URLs have been queued.
- Repeat the operation by selecting the second search engine and so on.
- Then go to the Queue Management tab and click on the "Run the entire queue" button:
- Monitor the progress of the submission, check for any errors, and restart submissions if necessary.
Why is this step mandatory? Without this initial submission, search engines will not receive all your URLs and automation will not work properly. Once this step is completed, new URLs or changes will be submitted automatically by the module, without manual intervention.
Note: After the initial submission, it is generally no longer necessary to use these tabs, except in the following cases:
- You want to force the submission of a large number of URLs after a migration or major site redesign.
- You notice submission errors and want to retry certain submissions.
- You want to check the status of the queue or the history of bulk submissions.
c) IndexNow Dashboard
The Dashboard tab allows you to view in real time the status of your shop's IndexNow submissions. It centralizes all useful information for tracking and analyzing the indexing of your pages.
- Submission tracking: View the detailed history of submissions, the status of each URL (successful, pending, error).
- Error analysis: For each failed submission, see an explicit error message (e.g.: URL format issue, invalid API key, no response from the search engine, etc.) to quickly correct the situation.
The dashboard is particularly useful for:
- Checking that your shop's indexing is progressing correctly.
- Detecting any blockages or slowdowns in URL submission.
d) Manual submission
The Manual Submission tab allows you to individually submit one or more specific URLs to IndexNow, outside of the automatic or bulk submission flows.
Main use cases:
- You have just published an important page (new product, special offer, landing page) and want to speed up its indexing.
- You have fixed an error on a page and want to force its re-indexing without waiting for the next automatic run.
- You want to test IndexNow on a specific URL before launching a global submission.
Manual submission is complementary to automation: it offers targeted, on-demand control over the indexing of strategic pages.
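At the protocol level, IndexNow also accepts a simple GET notification for a single URL, which is the kind of targeted request a manual submission corresponds to. The module sends the actual request for you; the minimal Python sketch below is purely illustrative, with a placeholder page URL and key, and does not claim to reproduce the module's internal implementation.

```python
import requests

# Placeholder values for illustration only.
url_to_submit = "https://your-shop.com/special-offer-landing-page"
api_key = "your-indexnow-api-key"

# Single-URL notification using the GET form of the IndexNow protocol.
response = requests.get(
    "https://api.indexnow.org/indexnow",
    params={"url": url_to_submit, "key": api_key},
    timeout=10,
)
print(response.status_code)  # 200/202 = accepted
```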
2. Automatic generation of the llms.txt and markdown content files
The module automatically generates an llms.txt file, placed at the root of your site. This file follows a standard designed to help artificial intelligence (AI) systems and search engines quickly understand the structure and content of your shop. It acts as a simplified access map, specially formatted for robots, allowing them to easily identify your products, categories, and important pages.
In addition, the module generates markdown files for each product, category, and CMS page. These files provide a clear, structured, and lightweight version of your content, ideal for indexing by AIs and modern search engines.
This automated generation allows you to keep your site always up to date for robots, with no manual effort. See the official llms.txt documentation for more information.
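To give a rough idea of what the generated index looks like, an llms.txt file is a plain markdown document: a title, a short summary, then sections of links. The example below is a simplified, hypothetical illustration (shop name, product names, and descriptions are invented, and the exact output structure is determined by the module); the numeric IDs mirror the path examples given later in this guide.

```markdown
# Your Shop Name

> Online store for outdoor equipment. Ships worldwide.

## Products

- [Trail Running Shoes](https://your-shop.com/llms-content/products/12345.md): lightweight shoes for trail running

## Categories

- [Footwear](https://your-shop.com/llms-content/categories/12.md): all shoes and boots

## Pages

- [Delivery & Returns](https://your-shop.com/llms-content/cms/7.md): shipping times and return policy
```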
a) Configuration
The LLMS > Configuration tab allows you to precisely define which content will be included in the llms.txt file and the markdown files generated by the module. You can choose whether or not to include products, categories, CMS pages, enable multilingual support, and add additional information.
- In the LLMS > Configuration tab, enable or disable the inclusion of products, categories, and CMS pages as needed.
- Enable the Multilingual support option if you want to generate files for each language of your shop.
- If needed, enter additional information in the dedicated field (e.g.: legal notices, shop information).
- Click the Save button to save your settings.
- Then go to the File Generation tab.
b) File generation
The File Generation tab allows you to manually launch the generation or regeneration of the llms.txt file and markdown files. A summary of the generation (number of files, duration, etc.) is displayed after each operation.
Note:
- Manual generation is only required during the very first use of the module, in order to create all the files for your existing catalog.
- Afterwards, generation is performed automatically in real time with each modification, addition, or deletion of a product, category, or CMS page. The corresponding markdown files and the main llms.txt file are then updated instantly.
- The CRON task (see below) allows you to regenerate all files regularly and automatically (for example, every night). This ensures that all files remain consistent, even in the event of massive changes, batch imports, or occasional issues during real-time updates.
- Click the File Generation button to start creating the llms.txt index and content files:
- You can follow the steps of the file generation:
- The llms.txt file is saved at the root of your site (e.g.: /llms.txt). It is publicly accessible at https://your-shop.com/llms.txt.
- The content markdown files are saved in the /llms-content/ folder at the root of the site, organized into subfolders:
  - /llms-content/products/ for products (e.g.: /llms-content/products/12345.md)
  - /llms-content/categories/ for categories (e.g.: /llms-content/categories/12.md)
  - /llms-content/cms/ for CMS pages (e.g.: /llms-content/cms/7.md)
- These files are publicly accessible by URL (e.g.: https://your-shop.com/llms-content/products/12345.md).
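To give a concrete (and purely hypothetical) idea, a product markdown file condenses the page into headings and key facts. The exact fields and layout are determined by the module; the sketch below only illustrates the kind of lightweight, structured content these files contain, with invented product data.

```markdown
# Trail Running Shoes

Lightweight trail running shoes with a reinforced sole.

- Price: 89.90 EUR
- Reference: TRS-12345
- Availability: in stock
- Category: Footwear > Running

[View the product page](https://your-shop.com/trail-running-shoes)
```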
You can also view the files in the Generated Files tab.
Note: After generating your files, refresh the page if you want to view them in the "Generated Files" tab (see below).
c) Viewing generated files
The Generated Files tab allows you to view, filter, and access all files generated by the module (main llms.txt file, product, category, CMS markdown files). You can filter by content type, view the content of a file, and download if needed.
- After refreshing the page, click the "View llms.txt" button to view the main file in a new tab:
- Select the type of content file to view, then click "Filter":
- For each file, click "View" to open it in a new tab:
d) Setting up the CRON task
Why set up a CRON task? Even though real-time generation covers most needs, the CRON task acts as an extra safeguard to ensure your entire catalog is up to date, without manual intervention.
Mandatory prerequisite: To use automatic generation via CRON, you must:
- Have access to your hosting's CRON configuration (cPanel, Plesk, SSH access, etc.).
- Check that your host allows the execution of external CRON tasks.
If you do not have access to the CRON configuration, contact your host or system administrator.
- In the LLMS > CRON Task tab, copy the CRON URL provided by the module.
- Add this URL to your hosting's CRON configuration (using cURL, wget, or PHP CLI, as you prefer). You can copy the example CRON commands provided by the module directly; a generic crontab sketch is also shown after these steps.
- Save the configuration on your server.
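As a rough illustration, on a Linux host with crontab access, a nightly call to the module's CRON URL could look like the line below. The URL is a placeholder: paste the exact URL (including its security key) copied from the CRON Task tab, and adjust the schedule to your needs.

```
# Call the module's CRON URL every night at 3:00 AM (replace with the URL copied from the CRON Task tab)
0 3 * * * curl -s "PASTE_THE_CRON_URL_COPIED_FROM_THE_MODULE_HERE" > /dev/null 2>&1
```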
Note: If you reset the security key using the dedicated button under the CRON URL, remember to update the CRON URL on your server. Otherwise, previously configured CRON tasks will stop working.
Tips:
- Schedule the CRON to run at night to minimize performance impact.
- Regularly check the date of the last generation in the module. It is shown under the CRON examples:
3. AI Bot Traffic Analysis
The module provides analysis tools to track the visits of AI bots (Googlebot, Bingbot, etc.) to your site, via two modes: simple (buffer) or advanced (log analysis).
Mandatory prerequisite: To use the advanced mode (log analysis), you must:
- Have access to your server log files (usually via your host or FTP/SFTP access).
- Check that the folder where the logs are stored is in a path the module is allowed to read (ask your site administrator or webmaster).
Without access to these files, the advanced mode will not work.
a) Configuration
- In the Data Analysis tab, choose the analysis mode:
- Simple Mode (recommended): Front buffer mode (user agent collection via JS)
- or Advanced Mode: Log analysis mode (classic log analysis via CRON)
- For advanced mode, specify the log file path (it must be within a path authorized by the open_basedir restriction). Example: my_site/logs/access.log. If needed, also enter the custom log format.
- Click Save to save the configuration.
Tips:
- Use simple mode if you have no technical knowledge of logs.
- Advanced mode offers more detailed analysis for high-traffic shops but requires more technical knowledge.
b) Log analysis (for advanced analysis mode)
The Log Analysis tab provides the CRON task URL to configure on your server to automate the analysis of server log files (advanced analysis mode only). This analysis detects the visits of bots (Googlebot, Bingbot, etc.) to your site and provides detailed statistics on their activity.
Once the CRON task is configured, log analysis is performed automatically and regularly, without manual intervention. The CRON task ensures that the analysis is carried out regularly and reliably, even with large log volumes.
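To make "log analysis" concrete, the simplified Python sketch below shows the kind of user-agent matching such an analysis performs on an access log. It is not the module's actual implementation: the log path, log format, and bot list are assumptions for illustration (the bot names are those cited in this guide, and the path mirrors the earlier example).

```python
import re
from collections import Counter

# Illustrative bot signatures (matched as substrings of the User-Agent header).
BOT_SIGNATURES = ["Googlebot", "Bingbot", "Applebot"]

# In the common/combined log format, the User-Agent is the last quoted field.
USER_AGENT_RE = re.compile(r'"([^"]*)"\s*$')

def count_bot_hits(log_path: str) -> Counter:
    """Count hits per known bot by scanning User-Agent strings in an access log."""
    hits = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log_file:
        for line in log_file:
            match = USER_AGENT_RE.search(line)
            if not match:
                continue
            user_agent = match.group(1).lower()
            for bot in BOT_SIGNATURES:
                if bot.lower() in user_agent:
                    hits[bot] += 1
                    break
    return hits

if __name__ == "__main__":
    # Placeholder path, mirroring the doc's example "my_site/logs/access.log".
    print(count_bot_hits("my_site/logs/access.log"))
```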
c) AI Bot Traffic analysis (AI Bot Traffic button)
The AI Bot Traffic tab in your PrestaShop back office (accessible from the left menu or by clicking the blue AI Bot Traffic button in the module) allows you to view and analyze the visits of the main indexing robots (Googlebot, Bingbot, Applebot, etc.) and other bots to your shop. This interface helps you to:
- Understand the frequency of bot visits to your site.
- Identify the most active bots.
- Spot the appearance of new bots.
- Track traffic spikes generated by robots.
Features and usage:
- Period selector: choose to display statistics by day, by month, or over a custom range (with date selection).
- Key indicators (KPIs):
- Total bot visits: total number of bot visits over the selected period.
- Most active bot: name of the bot with the most visits, with the associated number of hits.
- New bots: number of bots that visited your site for the first time during the period.
- Traffic peak: day or month with the highest number of bot visits, with the number of hits.
- Interactive chart: view the evolution of traffic from different bots over the chosen period. You can hide/show each bot by clicking on the legend.
- CSV export: export the displayed data in CSV format for further analysis or archiving. The "Export CSV" button takes into account the selected period and filters.
Tip: Click on an item in the chart legend to hide or show the curve for a specific bot.