n8n mysql purge history greater than 10 days
This workflow is designed to automatically clean up execution records in the MySQL database that are older than 30 days, preventing the performance degradation caused by data accumulation. Users can schedule the cleanup to run automatically every day or trigger it manually, keeping the database tidy and operating efficiently. It is suitable for users who need to keep execution history under control, simplifying database management tasks and improving system stability and maintenance efficiency.
Workflow Name
n8n_mysql_purge_history_greater_than_10_days
Key Features and Highlights
This workflow automates the cleanup of execution records in a MySQL database by regularly deleting historical execution data older than 30 days. It effectively prevents database performance degradation caused by data accumulation. The workflow supports both manual triggering and daily scheduled automatic execution, offering flexibility and convenience.
Core Problem Addressed
As the number of n8n workflow executions increases, the execution history records in the database continuously grow, resulting in increased storage burden and reduced query efficiency. This workflow helps users maintain a clean and efficient database by periodically purging expired execution records.
Use Cases
- Users of n8n who store execution records in a MySQL database
- Those needing regular database maintenance to prevent unlimited growth of historical data
- Users seeking to simplify database cleanup tasks through automation
Main Process Steps
- Scheduled Trigger (Cron Node): Automatically starts daily at 7 AM
- Manual Trigger (Manual Trigger Node): Allows users to manually initiate the cleanup process
- Execute MySQL Delete Command (MySQL Node): Deletes historical records with execution timestamps older than 30 days (see the sketch after this list)
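For reference, the deletion performed by the MySQL node can be reproduced outside n8n. The sketch below is a minimal example assuming n8n's default execution table and column names (execution_entity, startedAt); these may differ between n8n versions, so verify them against your own database before running anything like this.

```python
# Minimal cleanup sketch, assuming n8n's default schema (execution_entity / startedAt);
# verify the table and column names against your own n8n database first.
import mysql.connector

conn = mysql.connector.connect(
    host="localhost", user="n8n", password="secret", database="n8n"
)
cursor = conn.cursor()
cursor.execute(
    "DELETE FROM execution_entity "
    "WHERE startedAt < DATE_SUB(NOW(), INTERVAL 30 DAY)"  # retention window described above
)
conn.commit()
print(f"Deleted {cursor.rowcount} old execution records")
cursor.close()
conn.close()
```

When run from n8n on the schedule described above, the Cron node would use an expression equivalent to `0 7 * * *` for the daily 7 AM trigger.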
Involved Systems or Services
- MySQL Database: Stores n8n execution history data
- Built-in n8n Automation Nodes (Cron for scheduled triggering, Manual Trigger for manual execution, MySQL node for database operations)
Target Users and Value Proposition
- n8n platform operators and developers, assisting in maintaining database performance
- Database administrators, simplifying routine cleanup tasks
- Enterprises and teams requiring automated management of historical data to enhance system stability and maintenance efficiency
Import Excel Product Data into PostgreSQL Database
This workflow is designed to automatically import product data from local Excel spreadsheets into a PostgreSQL database. By reading and parsing the Excel files, it performs batch inserts into the "product" table of the database. This automation process significantly enhances data entry efficiency, reduces the complexity and errors associated with manual operations, and is particularly suitable for industries such as e-commerce, retail, and warehouse management, helping users achieve more efficient data management and analysis.
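As an illustration of the underlying import, the sketch below reads a spreadsheet with pandas and batch-inserts rows into a "product" table via psycopg2. The file name and column names are placeholders for illustration only, not values taken from the workflow.

```python
# Hypothetical import sketch: the file name and columns are illustrative only.
import pandas as pd
import psycopg2
from psycopg2.extras import execute_values

df = pd.read_excel("products.xlsx")                     # read and parse the spreadsheet

conn = psycopg2.connect(dbname="shop", user="postgres",
                        password="secret", host="localhost")
with conn, conn.cursor() as cur:                        # commits automatically on success
    execute_values(
        cur,
        "INSERT INTO product (name, sku, price) VALUES %s",
        list(df[["name", "sku", "price"]].itertuples(index=False, name=None)),
    )
conn.close()
```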
Automated Project Budget Missing Alert Workflow
This workflow automatically monitors project budgets through scheduled triggers, querying the MySQL database for all active projects that are of external type, have a status of open, and a budget of zero. It categorizes and compiles statistics based on the company and cost center, and automatically sends customized HTML emails to remind relevant teams to update budget information in a timely manner. This improves data accuracy, reduces management risks, optimizes team collaboration efficiency, and ensures the smooth progress of project management.
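A rough sketch of the kind of query involved is shown below; every table and column name (projects, type, status, budget, company, cost_center) is a hypothetical stand-in for whatever schema the monitored MySQL database actually uses.

```python
# Hypothetical query sketch; schema names are placeholders, not from the workflow.
import mysql.connector

conn = mysql.connector.connect(host="localhost", user="n8n",
                               password="secret", database="erp")
cursor = conn.cursor(dictionary=True)
cursor.execute(
    """
    SELECT company, cost_center, COUNT(*) AS projects_missing_budget
    FROM projects
    WHERE type = 'external' AND status = 'open' AND budget = 0
    GROUP BY company, cost_center
    """
)
for row in cursor.fetchall():
    print(row)   # each group would become one section of the HTML reminder email
cursor.close()
conn.close()
```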
Baserow Markdown to HTML
This workflow automates the conversion of Markdown text stored in a Baserow database into HTML and writes the result back to the database, improving content display efficiency. It supports both single-record and batch operations and can be triggered via Webhook. It solves the problem that Markdown text cannot be rendered directly as HTML, simplifying content management. It is suitable for content editing, product operations, and technical teams, improving data consistency and display quality.
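The conversion step amounts to rendering a Markdown field and writing the result back through Baserow's row-update endpoint. The sketch below assumes the hosted API at api.baserow.io, user-facing field names "markdown" and "html", and placeholder table/row IDs and token.

```python
# Minimal single-record sketch; IDs, token, and field names are placeholders.
import markdown
import requests

table_id, row_id = 123, 1
api = f"https://api.baserow.io/api/database/rows/table/{table_id}/{row_id}/"
headers = {"Authorization": "Token YOUR_BASEROW_TOKEN"}
params = {"user_field_names": "true"}

row = requests.get(api, headers=headers, params=params, timeout=10).json()
html = markdown.markdown(row["markdown"])          # render the Markdown field to HTML
requests.patch(api, headers=headers, params=params,
               json={"html": html}, timeout=10)    # write the HTML back to the row
```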
Postgres Database Table Creation and Data Insertion Demonstration Workflow
This workflow is triggered manually and creates a table named "test" in a Postgres database, then inserts a record containing an ID and a name. The workflow then queries the table and returns its data, simplifying table creation and data insertion and avoiding the tedium of writing SQL by hand. This process is suitable for quickly setting up test environments or demonstrating database operations, improving the automation and repeatability of database management to meet the needs of various application scenarios.
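In SQL terms the demonstration boils down to a create, an insert, and a select; a minimal equivalent using psycopg2 (connection details are placeholders) might look like this:

```python
# Equivalent of the demo: create the "test" table, insert one row, read it back.
import psycopg2

conn = psycopg2.connect(dbname="demo", user="postgres",
                        password="secret", host="localhost")
with conn, conn.cursor() as cur:
    cur.execute("CREATE TABLE IF NOT EXISTS test (id INT PRIMARY KEY, name TEXT)")
    cur.execute("INSERT INTO test (id, name) VALUES (%s, %s)", (1, "example"))
    cur.execute("SELECT id, name FROM test")
    print(cur.fetchall())                          # returns the inserted record
conn.close()
```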
Remote IoT Sensor Monitoring via MQTT and InfluxDB
This workflow receives temperature and humidity data in real time from a remote DHT22 sensor over the MQTT protocol, automatically formats the readings, and writes them to a local InfluxDB time-series database. It subscribes to the sensor topic efficiently, ensures the data matches the database's write format, and automates data collection and storage for IoT sensors. This improves the timeliness and accuracy of the data and facilitates subsequent analysis and monitoring. It is suitable for fields such as IoT development, environmental monitoring, and smart manufacturing.
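A stripped-down version of the same pipeline can be sketched with paho-mqtt and the InfluxDB 2.x client; the broker address, topic, JSON payload shape, bucket, and credentials below are all assumptions rather than details of the workflow itself.

```python
# Hypothetical MQTT-to-InfluxDB bridge; connection details and the payload
# format ({"temperature": ..., "humidity": ...}) are placeholder assumptions.
import json
import paho.mqtt.client as mqtt
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

influx = InfluxDBClient(url="http://localhost:8086", token="INFLUX_TOKEN", org="home")
write_api = influx.write_api(write_options=SYNCHRONOUS)

def on_message(client, userdata, msg):
    data = json.loads(msg.payload)
    point = (Point("dht22")
             .field("temperature", float(data["temperature"]))
             .field("humidity", float(data["humidity"])))
    write_api.write(bucket="sensors", record=point)

client = mqtt.Client()                 # paho-mqtt 1.x style; 2.x needs a CallbackAPIVersion argument
client.on_message = on_message
client.connect("broker.local", 1883)
client.subscribe("sensors/dht22")
client.loop_forever()
```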
DigitalOceanUpload
This workflow automates the file upload process. After submitting a file through an online form, it automatically uploads the file to DigitalOcean's object storage and generates a publicly accessible file link, providing real-time feedback to the user. This process is simple and efficient, eliminating the cumbersome steps of traditional manual uploads and link generation, significantly enhancing file management efficiency. It is suitable for various scenarios that require a quick setup for file upload and sharing functionalities.
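DigitalOcean Spaces is S3-compatible, so the upload step can be sketched with boto3 pointed at a Spaces endpoint; the region, Space name, keys, and object path below are placeholders.

```python
# Hypothetical upload to a DigitalOcean Space; all names and keys are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    region_name="nyc3",
    endpoint_url="https://nyc3.digitaloceanspaces.com",
    aws_access_key_id="SPACES_KEY",
    aws_secret_access_key="SPACES_SECRET",
)
s3.upload_file("report.pdf", "my-space", "uploads/report.pdf",
               ExtraArgs={"ACL": "public-read"})   # make the stored object publicly readable
print("https://my-space.nyc3.digitaloceanspaces.com/uploads/report.pdf")
```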
Store the Data Received from the CocktailDB API in JSON
This workflow automatically retrieves details from CocktailDB's random cocktail API, converts the returned JSON data to binary format, and saves it as a local file named cocktail.json. Triggered manually, it lets users fetch and store the data on demand, eliminating manual steps and ensuring the data is captured accurately. It is suitable for various scenarios, including the beverage industry, developers, and educational training.
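The fetch-and-save step corresponds to a single request against the public random-cocktail endpoint, with the response body written out as cocktail.json:

```python
# Fetch one random cocktail and save the raw JSON response to cocktail.json.
import requests

resp = requests.get("https://www.thecocktaildb.com/api/json/v1/1/random.php", timeout=10)
resp.raise_for_status()

with open("cocktail.json", "wb") as f:   # binary write mirrors the JSON-to-binary step
    f.write(resp.content)
```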
Google Drive File Update Synchronization to AWS S3
This workflow can automatically monitor a specified Google Drive folder and, when files are updated, automatically upload them to an AWS S3 bucket. It supports file deduplication and server-side encryption, ensuring data security and consistency. This effectively meets the automatic synchronization needs across cloud storage, avoiding the hassle of manual operations and enhancing the efficiency of file backup and management. It is suitable for enterprises and teams that require automated file management.
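The S3 side of the sync can be sketched with boto3, requesting server-side encryption on each stored object; the bucket, key, and local file are placeholders standing in for whatever the Google Drive trigger downloaded.

```python
# Hypothetical upload step with server-side encryption; names are placeholders.
import boto3

s3 = boto3.client("s3", region_name="us-east-1")
s3.upload_file(
    "synced-file.xlsx",                             # file fetched from the watched Drive folder
    "backup-bucket",
    "drive-sync/synced-file.xlsx",
    ExtraArgs={"ServerSideEncryption": "AES256"},   # encrypt the object at rest
)
```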