diff --git a/DOCKER_DEPLOYMENT_GUIDE.md b/DOCKER_DEPLOYMENT_GUIDE.md
deleted file mode 100644
index cdeb7e1..0000000
--- a/DOCKER_DEPLOYMENT_GUIDE.md
+++ /dev/null
@@ -1,303 +0,0 @@
-# Quality Application - Docker Deployment Guide
-
-## 📋 Overview
-
-This application is containerized with Docker and docker-compose, providing:
-- **MariaDB 11.3** database with persistent storage
-- **Flask** web application with Gunicorn
-- **Mapped volumes** for easy access to code, data, and backups
-
-## 🗂️ Volume Structure
-
-```
-quality_app/
-├── data/
-│   └── mariadb/          # Database files (MariaDB data directory)
-├── config/
-│   └── instance/         # Application configuration (external_server.conf)
-├── logs/                 # Application and Gunicorn logs
-├── backups/              # Database backup files (shared with DB container)
-└── py_app/               # Application source code (optional mapping)
-```
-
-## 🚀 Quick Start
-
-### 1. Setup Volumes
-
-```bash
-# Create necessary directories
-bash setup-volumes.sh
-```
-
-### 2. Configure Environment
-
-```bash
-# Create .env file from example
-cp .env.example .env
-
-# Edit configuration (IMPORTANT: Change passwords!)
-nano .env
-```
-
-**Critical settings to change:**
-- `MYSQL_ROOT_PASSWORD` - Database root password
-- `DB_PASSWORD` - Application database password
-- `SECRET_KEY` - Flask secret key (generate random string)
-
-**First deployment settings:**
-- `INIT_DB=true` - Initialize database schema
-- `SEED_DB=true` - Seed with default data
-
-**After first deployment:**
-- `INIT_DB=false`
-- `SEED_DB=false`
-
-### 3. Deploy Application
-
-**Option A: Automated deployment**
-```bash
-bash quick-deploy.sh
-```
-
-**Option B: Manual deployment**
-```bash
-# Build images
-docker-compose build
-
-# Start services
-docker-compose up -d
-
-# View logs
-docker-compose logs -f
-```
-
-## 📦 Application Dependencies
-
-### Python Packages (from requirements.txt):
-- Flask - Web framework
-- Flask-SSLify - SSL support
-- Werkzeug - WSGI utilities
-- gunicorn - Production WSGI server
-- pyodbc - ODBC database connectivity
-- mariadb - MariaDB connector
-- reportlab - PDF generation
-- requests - HTTP library
-- pandas - Data manipulation
-- openpyxl - Excel file support
-- APScheduler - Job scheduling for automated backups
-
-### System Dependencies (handled in Dockerfile):
-- Python 3.10
-- MariaDB client libraries
-- curl (for health checks)
-
-## 🐳 Docker Images
-
-### Web Application
-- **Base**: python:3.10-slim
-- **Multi-stage build** for minimal image size
-- **Non-root user** for security
-- **Health checks** enabled
-
-### Database
-- **Image**: mariadb:11.3
-- **Persistent storage** with volume mapping
-- **Performance tuning** via environment variables
-
-## 📊 Resource Limits
-
-### Database Container
-- CPU: 2.0 cores (limit), 0.5 cores (reserved)
-- Memory: 2GB (limit), 512MB (reserved)
-- Buffer pool: 512MB
-
-### Web Container
-- CPU: 2.0 cores (limit), 0.5 cores (reserved)
-- Memory: 2GB (limit), 512MB (reserved)
-- Workers: 5 Gunicorn workers
-
-## 🔧 Common Operations
-
-### View Logs
-```bash
-# Application logs
-docker-compose logs -f web
-
-# Database logs
-docker-compose logs -f db
-
-# All logs
-docker-compose logs -f
-```
-
-### Restart Services
-```bash
-# Restart all
-docker-compose restart
-
-# Restart specific service
-docker-compose restart web
-docker-compose restart db
-```
-
-### Stop Services
-```bash
-# Stop (keeps data)
-docker-compose down
-
-# Stop and remove volumes (WARNING: deletes database!)
-docker-compose down -v
-```
-
-### Update Application Code
-
-**Without rebuilding (development mode):**
-1. Uncomment volume mapping in docker-compose.yml:
-   ```yaml
-   - ${APP_CODE_PATH}:/app:ro
-   ```
-2. Edit code in `./py_app/`
-3. Restart: `docker-compose restart web`
-
-**With rebuilding (production mode):**
-```bash
-docker-compose build --no-cache web
-docker-compose up -d
-```
-
-### Database Access
-
-**MySQL shell inside container:**
-```bash
-docker-compose exec db mysql -u trasabilitate -p
-# Enter password: Initial01! (or your custom password)
-```
-
-**From host machine:**
-```bash
-mysql -h 127.0.0.1 -P 3306 -u trasabilitate -p
-```
-
-**Root access:**
-```bash
-docker-compose exec db mysql -u root -p
-```
-
-## 💾 Backup Operations
-
-### Manual Backup
-```bash
-# Full backup
-docker-compose exec db mysqldump -u trasabilitate -pInitial01! trasabilitate > backups/manual_$(date +%Y%m%d_%H%M%S).sql
-
-# Data-only backup
-docker-compose exec db mysqldump -u trasabilitate -pInitial01! --no-create-info trasabilitate > backups/data_only_$(date +%Y%m%d_%H%M%S).sql
-
-# Structure-only backup
-docker-compose exec db mysqldump -u trasabilitate -pInitial01! --no-data trasabilitate > backups/structure_only_$(date +%Y%m%d_%H%M%S).sql
-```
-
-### Automated Backups
-The application includes a built-in scheduler for automated backups. Configure via the web interface.
-
-### Restore from Backup
-```bash
-# Stop application (keeps database running)
-docker-compose stop web
-
-# Restore database
-docker-compose exec -T db mysql -u trasabilitate -pInitial01! trasabilitate < backups/backup_file.sql
-
-# Start application
-docker-compose start web
-```
-
-## 🔍 Troubleshooting
-
-### Container won't start
-```bash
-# Check logs
-docker-compose logs db
-docker-compose logs web
-
-# Check if ports are available
-ss -tulpn | grep 8781
-ss -tulpn | grep 3306
-```
-
-### Database connection failed
-```bash
-# Check database is healthy
-docker-compose ps
-
-# Test database connection
-docker-compose exec db mysqladmin ping -u root -p
-
-# Check database users
-docker-compose exec db mysql -u root -p -e "SELECT User, Host FROM mysql.user;"
-```
-
-### Permission issues
-```bash
-# Check directory permissions
-ls -la data/mariadb
-ls -la logs
-ls -la backups
-
-# Fix permissions if needed
-chmod -R 755 data logs backups config
-```
-
-### Reset everything (WARNING: deletes all data!)
-```bash
-# Stop and remove containers, volumes
-docker-compose down -v
-
-# Remove volume directories
-rm -rf data/mariadb/* logs/* config/instance/*
-
-# Start fresh
-bash quick-deploy.sh
-```
-
-## 🔒 Security Notes
-
-1. **Change default passwords** in .env file
-2. **Generate new SECRET_KEY** for Flask
-3. Never commit .env file to version control
-4. Use firewall rules to restrict database port (3306) access
-5. Consider using Docker secrets for sensitive data in production
-6. Regular security updates: `docker-compose pull && docker-compose up -d`
-
-## 🌐 Port Mapping
-
-- **8781** - Web application (configurable via APP_PORT in .env)
-- **3306** - MariaDB database (configurable via DB_PORT in .env)
-
-## 📝 Configuration Files
-
-- **docker-compose.yml** - Service orchestration
-- **.env** - Environment variables and configuration
-- **Dockerfile** - Web application image definition
-- **docker-entrypoint.sh** - Container initialization script
-- **init-db.sql** - Database initialization script
-
-## 🎯 Production Checklist
-
-- [ ] Change all default passwords
-- [ ] Generate secure SECRET_KEY
-- [ ] Set FLASK_ENV=production
-- [ ] Configure resource limits appropriately
-- [ ] Set up backup schedule
-- [ ] Configure firewall rules
-- [ ] Set up monitoring and logging
-- [ ] Test backup/restore procedures
-- [ ] Document deployment procedure for your team
-- [ ] Set INIT_DB=false and SEED_DB=false after first deployment
-
-## 📞 Support
-
-For issues or questions, refer to:
-- Documentation in `documentation/` folder
-- Docker logs: `docker-compose logs -f`
-- Application logs: `./logs/` directory
diff --git a/IMPROVEMENTS_APPLIED.md b/IMPROVEMENTS_APPLIED.md
deleted file mode 100644
index c55b8a7..0000000
--- a/IMPROVEMENTS_APPLIED.md
+++ /dev/null
@@ -1,123 +0,0 @@
-# Improvements Applied to Quality App
-
-## Date: November 13, 2025
-
-### Overview
-All improvements from the production environment have been successfully ported to the quality_app project.
-
-## Files Updated/Copied
-
-### 1. Docker Configuration
-- **Dockerfile** - Added `mariadb-client` package for backup functionality
-- **docker-compose.yml** - Updated with proper volume mappings and /data folder support
-- **.env** - Updated all paths to use absolute paths under `/srv/quality_app/`
-
-### 2. Backup & Restore System
-- **database_backup.py** - Fixed backup/restore functions:
-  - Changed `result_success` to `result.returncode == 0`
-  - Added `--skip-ssl` flag for MariaDB connections
-  - Fixed restore function error handling
-- **restore_database.sh** - Fixed SQL file parsing to handle MariaDB dump format
-
-### 3. UI Improvements - Sticky Table Headers
-- **base.css** - Added sticky header CSS for all report tables
-- **scan.html** - Wrapped table in `report-table-container` div
-- **fg_scan.html** - Wrapped table in `report-table-container` div
-
-### 4. Quality Code Display Enhancement
-- **fg_quality.js** - Quality code `0` displays as "OK" in green; CSV exports as "0"
-- **script.js** - Same improvements for quality module reports
-
-## Directory Structure
-
-```
-/srv/quality_app/
-├── py_app/              # Application code (mapped to /app in container)
-├── data/
-│   └── mariadb/         # Database files
-├── config/
-│   └── instance/        # Application configuration
-├── logs/                # Application logs
-├── backups/             # Database backups
-├── docker-compose.yml
-├── Dockerfile
-├── .env
-└── restore_database.sh
-```
-
-## Environment Configuration
-
-### Volume Mappings in .env:
-```
-DB_DATA_PATH=/srv/quality_app/data/mariadb
-APP_CODE_PATH=/srv/quality_app/py_app
-LOGS_PATH=/srv/quality_app/logs
-INSTANCE_PATH=/srv/quality_app/config/instance
-BACKUP_PATH=/srv/quality_app/backups
-```
-
-## Features Implemented
-
-### ✅ Backup System
-- Automatic scheduled backups
-- Manual backup creation
-- Data-only backups
-- Backup retention policies
-- MariaDB client tools installed
-
-### ✅ Restore System
-- Python-based restore function
-- Shell script restore with proper SQL parsing
-- Handles MariaDB dump format correctly
-
-### ✅ UI Enhancements
-- **Sticky Headers**: Table headers remain fixed when scrolling
-- **Quality Code Display**:
-  - Shows "OK" in green for quality code 0
-  - Exports "0" in CSV files
-  - Better user experience
-
-### ✅ Volume Mapping
-- All volumes use absolute paths
-- Support for /data folder mapping
-- Easy to configure backup location on different drives
-
-## Starting the Application
-
-```bash
-cd /srv/quality_app
-docker compose up -d --build
-```
-
-## Testing Backup & Restore
-
-### Create Backup:
-```bash
-cd /srv/quality_app
-docker compose exec web bash -c "cd /app && python3 -c 'from app import create_app; from app.database_backup import DatabaseBackupManager; app = create_app();
-with app.app_context(): bm = DatabaseBackupManager(); result = bm.create_backup(); print(result)'"
-```
-
-### Restore Backup:
-```bash
-cd /srv/quality_app
-./restore_database.sh /srv/quality_app/backups/backup_file.sql
-```
-
-## Notes
-
-- Database initialization is set to `false` (already initialized)
-- All improvements are production-ready
-- Backup path can be changed to external drive if needed
-- Application port: 8781 (default)
-
-## Next Steps
-
-1. Review .env file and update passwords if needed
-2. Test all functionality after deployment
-3. Configure backup schedule if needed
-4. Set up external backup drive if desired
-
----
-**Compatibility**: All changes are backward compatible with existing data.
-**Status**: Ready for deployment
diff --git a/MERGE_COMPATIBILITY.md b/MERGE_COMPATIBILITY.md
deleted file mode 100644
index 1b7790d..0000000
--- a/MERGE_COMPATIBILITY.md
+++ /dev/null
@@ -1,292 +0,0 @@
-# Merge Compatibility Analysis: docker-deploy → master
-
-## 📊 Merge Status: **SAFE TO MERGE** ✅
-
-### Conflict Analysis
-- **No merge conflicts detected** between `master` and `docker-deploy` branches
-- All changes are additive or modify existing code in compatible ways
-- The docker-deploy branch adds 13 files with 1034 insertions and 117 deletions
-
-### Files Changed
-#### New Files (No conflicts):
-1. `DOCKER_DEPLOYMENT_GUIDE.md` - Documentation
-2. `IMPROVEMENTS_APPLIED.md` - Documentation
-3. `quick-deploy.sh` - Deployment script
-4. `restore_database.sh` - Restore script
-5. `setup-volumes.sh` - Setup script
-
-#### Modified Files:
-1. `Dockerfile` - Added mariadb-client package
-2. `docker-compose.yml` - Added /data volume mapping, resource limits
-3. `py_app/app/database_backup.py` - **CRITICAL: Compatibility layer added**
-4. `py_app/app/static/css/base.css` - Added sticky header styles
-5. `py_app/app/static/fg_quality.js` - Quality code display enhancement
-6. `py_app/app/static/script.js` - Quality code display enhancement
-7. `py_app/app/templates/fg_scan.html` - Added report-table-container wrapper
-8. `py_app/app/templates/scan.html` - Added report-table-container wrapper
-
----
-
-## 🔧 Compatibility Layer: database_backup.py
-
-### Problem Identified
-The docker-deploy branch changed backup commands from `mysqldump` to `mariadb-dump` and added `--skip-ssl` flag, which would break the application when running with standard Gunicorn (non-Docker) deployment.
-
-### Solution Implemented
-Added intelligent environment detection and command selection:
-
-#### 1. Dynamic Command Detection
-```python
-def _detect_dump_command(self):
-    """Detect which mysqldump command is available (mariadb-dump or mysqldump)"""
-    try:
-        # Try mariadb-dump first (newer MariaDB versions)
-        result = subprocess.run(['which', 'mariadb-dump'],
-                                capture_output=True, text=True)
-        if result.returncode == 0:
-            return 'mariadb-dump'
-
-        # Fall back to mysqldump
-        result = subprocess.run(['which', 'mysqldump'],
-                                capture_output=True, text=True)
-        if result.returncode == 0:
-            return 'mysqldump'
-
-        # Default to mariadb-dump (will error if not available)
-        return 'mariadb-dump'
-    except Exception as e:
-        print(f"Warning: Could not detect dump command: {e}")
-        return 'mysqldump'  # Default fallback
-```
-
-#### 2. Conditional SSL Arguments
-```python
-def _get_ssl_args(self):
-    """Get SSL arguments based on environment (Docker needs --skip-ssl)"""
-    # Check if running in Docker container
-    if os.path.exists('/.dockerenv') or os.environ.get('DOCKER_CONTAINER'):
-        return ['--skip-ssl']
-    return []
-```
-
-#### 3. Updated Backup Command Building
-```python
-cmd = [
-    self.dump_command,  # Uses detected command (mariadb-dump or mysqldump)
-    f"--host={self.config['host']}",
-    f"--port={self.config['port']}",
-    f"--user={self.config['user']}",
-    f"--password={self.config['password']}",
-]
-
-# Add SSL args if needed (Docker environment)
-cmd.extend(self._get_ssl_args())
-
-# Add backup options
-cmd.extend([
-    '--single-transaction',
-    '--skip-lock-tables',
-    '--force',
-    # ... other options
-])
-```
-
----
-
-## 🎯 Deployment Scenarios
-
-### Scenario 1: Docker Deployment (docker-compose)
-**Environment Detection:**
-- ✅ `/.dockerenv` file exists
-- ✅ `DOCKER_CONTAINER` environment variable set in docker-compose.yml
-
-**Backup Behavior:**
-- Uses `mariadb-dump` (installed in Dockerfile)
-- Adds `--skip-ssl` flag automatically
-- Works correctly ✅
-
-### Scenario 2: Standard Gunicorn Deployment (systemd service)
-**Environment Detection:**
-- ❌ `/.dockerenv` file does NOT exist
-- ❌ `DOCKER_CONTAINER` environment variable NOT set
-
-**Backup Behavior:**
-- Detects available command: `mysqldump` or `mariadb-dump`
-- Does NOT add `--skip-ssl` flag
-- Uses system-installed MySQL/MariaDB client tools
-- Works correctly ✅
-
-### Scenario 3: Mixed Environment (External Database)
-**Both deployment types can connect to:**
-- External MariaDB server
-- Remote database instance
-- Local database with proper SSL configuration
-
-**Backup Behavior:**
-- Automatically adapts to available tools
-- SSL handling based on container detection
-- Works correctly ✅
-
----
-
-## 🧪 Testing Plan
-
-### Pre-Merge Testing
-1. **Docker Environment:**
-   ```bash
-   cd /srv/quality_app
-   git checkout docker-deploy
-   docker-compose up -d
-   # Test backup via web UI
-   # Test scheduled backup
-   # Test restore functionality
-   ```
-
-2. **Gunicorn Environment:**
-   ```bash
-   # Stop Docker if running
-   docker-compose down
-
-   # Start with systemd service (if available)
-   sudo systemctl start trasabilitate
-
-   # Test backup via web UI
-   # Test scheduled backup
-   # Test restore functionality
-   ```
-
-3. **Command Detection Test:**
-   ```bash
-   # Inside Docker container
-   docker-compose exec web python3 -c "
-   from app.database_backup import DatabaseBackupManager
-   manager = DatabaseBackupManager()
-   print(f'Dump command: {manager.dump_command}')
-   print(f'SSL args: {manager._get_ssl_args()}')
-   "
-
-   # On host system (if MySQL client installed)
-   python3 -c "
-   from app.database_backup import DatabaseBackupManager
-   manager = DatabaseBackupManager()
-   print(f'Dump command: {manager.dump_command}')
-   print(f'SSL args: {manager._get_ssl_args()}')
-   "
-   ```
-
-### Post-Merge Testing
-1. Verify both deployment methods still work
-2. Test backup/restore in both environments
-3. Verify scheduled backups function correctly
-4. Check error handling when tools are missing
-
----
-
-## 📋 Merge Checklist
-
-- [x] No merge conflicts detected
-- [x] Compatibility layer implemented in `database_backup.py`
-- [x] Environment detection for Docker vs Gunicorn
-- [x] Dynamic command selection (mariadb-dump vs mysqldump)
-- [x] Conditional SSL flag handling
-- [x] UI improvements (sticky headers) are purely CSS/JS - no conflicts
-- [x] Quality code display changes are frontend-only - no conflicts
-- [x] New documentation files added - no conflicts
-- [x] Docker-specific files don't affect Gunicorn deployment
-
-### Safe to Merge Because:
-1. **Additive Changes**: Most changes are new files or new features
-2. **Backward Compatible**: Code detects environment and adapts
-3. **No Breaking Changes**: Gunicorn deployment still works without Docker
-4. **Independent Features**: UI improvements work in any environment
-5. **Fail-Safe Defaults**: Falls back to mysqldump if mariadb-dump unavailable
-
----
-
-## 🚀 Merge Process
-
-### Recommended Steps:
-```bash
-cd /srv/quality_app
-
-# 1. Ensure working directory is clean
-git status
-
-# 2. Switch to master branch
-git checkout master
-
-# 3. Pull latest changes
-git pull origin master
-
-# 4. Merge docker-deploy (should be clean merge)
-git merge docker-deploy
-
-# 5. Review merge
-git log --oneline -10
-
-# 6. Test in current environment
-# (If using systemd, test the app)
-# (If using Docker, test with docker-compose)
-
-# 7. Push to remote
-git push origin master
-
-# 8. Tag the release (optional)
-git tag -a v2.0-docker -m "Docker deployment support with compatibility layer"
-git push origin v2.0-docker
-```
-
-### Rollback Plan (if needed):
-```bash
-# If issues arise after merge
-git log --oneline -10  # Find commit hash before merge
-git reset --hard <commit-hash>
-git push origin master --force  # Use with caution!
-
-# Or revert the merge commit
-git revert -m 1 <merge-commit-hash>
-git push origin master
-```
-
----
-
-## 🎓 Key Improvements in docker-deploy Branch
-
-### 1. **Bug Fixes**
-- Fixed `result_success` variable error → `result.returncode == 0`
-- Fixed restore SQL parsing with sed preprocessing
-- Fixed missing mariadb-client in Docker container
-
-### 2. **Docker Support**
-- Complete Docker Compose setup
-- Volume mapping for persistent data
-- Health checks and resource limits
-- Environment-based configuration
-
-### 3. **UI Enhancements**
-- Sticky table headers for scrollable reports
-- Quality code 0 displays as "OK" (green)
-- CSV export preserves original "0" value
-
-### 4. **Compatibility**
-- Works in Docker AND traditional Gunicorn deployment
-- Auto-detects available backup tools
-- Environment-aware SSL handling
-- No breaking changes to existing functionality
-
----
-
-## 📞 Support
-
-If issues arise after merge:
-1. Check environment detection: `ls -la /.dockerenv`
-2. Verify backup tools: `which mysqldump mariadb-dump`
-3. Review logs: `docker-compose logs web` or application logs
-4. Test backup manually from command line
-5. Fall back to master branch if critical issues occur
-
----
-
-**Last Updated:** 2025-11-13
-**Branch:** docker-deploy → master
-**Status:** Ready for merge ✅
diff --git a/Open .Orders WIZ New.xlsb b/Open .Orders WIZ New.xlsb
deleted file mode 100644
index 766c994..0000000
Binary files a/Open .Orders WIZ New.xlsb and /dev/null differ
diff --git a/README.md b/README.md
deleted file mode 100644
index 04809c9..0000000
--- a/README.md
+++ /dev/null
@@ -1,74 +0,0 @@
-# Quality Recticel Application
-
-Production traceability and quality management system.
-
-## 📚 Documentation
-
-All development and deployment documentation has been moved to the **[documentation](./documentation/)** folder.
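To make the compatibility layer described in MERGE_COMPATIBILITY.md above concrete, here is a minimal, self-contained sketch of how the detected dump command, the conditional SSL flag, and the backup options combine into one command line. It reuses the `/.dockerenv` / `DOCKER_CONTAINER` detection shown above; the `build_backup_cmd` helper, the `--result-file` output capture, and the placeholder credentials are illustrative assumptions, not code from the repository (the real `DatabaseBackupManager` may redirect stdout instead):

```python
import os
import shutil

def detect_dump_command() -> str:
    """Prefer mariadb-dump, fall back to mysqldump (same order as above)."""
    # shutil.which() is a portable equivalent of the `which` subprocess calls
    # used by _detect_dump_command() in the snippet above.
    for candidate in ('mariadb-dump', 'mysqldump'):
        if shutil.which(candidate):
            return candidate
    return 'mysqldump'  # last-resort default; fails loudly if truly absent

def ssl_args() -> list:
    """Docker containers get --skip-ssl; bare-metal deployments do not."""
    in_docker = os.path.exists('/.dockerenv') or bool(os.environ.get('DOCKER_CONTAINER'))
    return ['--skip-ssl'] if in_docker else []

def build_backup_cmd(config: dict, out_file: str) -> list:
    """Assemble the full dump command; mirrors the cmd-building snippet above."""
    cmd = [
        detect_dump_command(),
        f"--host={config['host']}",
        f"--port={config['port']}",
        f"--user={config['user']}",
        f"--password={config['password']}",
    ]
    cmd += ssl_args()
    cmd += [
        '--single-transaction',  # consistent snapshot for InnoDB tables
        '--skip-lock-tables',
        '--force',
        f"--result-file={out_file}",  # assumption: real code may pipe stdout
        config['database'],
    ]
    return cmd

if __name__ == '__main__':
    # Placeholder credentials for illustration only.
    cfg = {'host': '127.0.0.1', 'port': 3306, 'user': 'trasabilitate',
           'password': 'changeme', 'database': 'trasabilitate'}
    print(' '.join(build_backup_cmd(cfg, '/tmp/example_backup.sql')))
    # To actually run it: subprocess.run(build_backup_cmd(cfg, ...), check=True)
```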
-
-### Quick Links
-
-- **[Documentation Index](./documentation/README.md)** - Complete documentation overview
-- **[Database Setup](./documentation/DATABASE_DOCKER_SETUP.md)** - Database configuration guide
-- **[Docker Guide](./documentation/DOCKER_QUICK_REFERENCE.md)** - Docker commands reference
-- **[Backup System](./documentation/BACKUP_SYSTEM.md)** - Database backup documentation
-
-## 🚀 Quick Start
-
-```bash
-# Start application
-cd /srv/quality_app/py_app
-bash start_production.sh
-
-# Stop application
-bash stop_production.sh
-
-# View logs
-tail -f /srv/quality_app/logs/error.log
-```
-
-## 📦 Docker Deployment
-
-```bash
-# Start with Docker Compose
-docker-compose up -d
-
-# View logs
-docker-compose logs -f web
-
-# Stop services
-docker-compose down
-```
-
-## 🔐 Default Access
-
-- **URL**: http://localhost:8781
-- **Username**: superadmin
-- **Password**: superadmin123
-
-## 📁 Project Structure
-
-```
-quality_app/
-├── documentation/       # All documentation files
-├── py_app/              # Flask application
-├── backups/             # Database backups
-├── logs/                # Application logs
-├── docker-compose.yml   # Docker configuration
-└── Dockerfile           # Container image definition
-```
-
-## 📖 For More Information
-
-See the **[documentation](./documentation/)** folder for comprehensive guides on:
-
-- Setup and deployment
-- Docker configuration
-- Database management
-- Backup and restore procedures
-- Application features
-
----
-
-**Version**: 1.0.0
-**Last Updated**: November 3, 2025
diff --git a/py_app/app.log b/py_app/app.log
deleted file mode 100644
index 8cfc659..0000000
--- a/py_app/app.log
+++ /dev/null
@@ -1,4 +0,0 @@
- * Serving Flask app 'app'
- * Debug mode: on
-Address already in use
-Port 8781 is in use by another program. Either identify and stop that program, or start the server with a different port.
diff --git a/py_app/app/daily_mirror_db_setup.py b/py_app/app/daily_mirror_db_setup.py
index 4c0cb93..7ba4b19 100644
--- a/py_app/app/daily_mirror_db_setup.py
+++ b/py_app/app/daily_mirror_db_setup.py
@@ -184,7 +184,11 @@ class DailyMirrorDatabase:
                 raise Exception("Could not read Excel file. 
Please ensure it has a 'Production orders Data' or 'DataSheet' sheet.") logger.info(f"Loaded production data from {sheet_used}: {len(df)} rows, {len(df.columns)} columns") - logger.info(f"First 5 column names: {list(df.columns)[:5]}") + logger.info(f"All column names: {list(df.columns)}") + + # Log columns that have at least some non-null data + columns_with_data = [col for col in df.columns if df[col].notna().any()] + logger.info(f"Columns with data ({len(columns_with_data)}): {columns_with_data}") cursor = self.connection.cursor() success_count = 0 @@ -235,6 +239,10 @@ class DailyMirrorDatabase: for index, row in df.iterrows(): try: + # Skip rows where production order is empty + if pd.isna(row.get('Comanda Productie')) or str(row.get('Comanda Productie')).strip() == '': + continue + # Create concatenated fields with dash separator opened_for_order = str(row.get('Opened for Order', '')).strip() if pd.notna(row.get('Opened for Order')) else '' linia = str(row.get('Linia', '')).strip() if pd.notna(row.get('Linia')) else '' @@ -269,6 +277,8 @@ class DailyMirrorDatabase: # Prepare data tuple data = ( safe_str(row.get('Comanda Productie')), # production_order + safe_str(row.get('Opened for Order')), # production_order_line + safe_str(row.get('Linia')), # line_number open_for_order_line, # open_for_order_line (concatenated) client_order_line, # client_order_line (concatenated) safe_str(row.get('Cod. Client')), # customer_code diff --git a/py_app/app/order_labels.py b/py_app/app/order_labels.py index 7eaed8b..db30cad 100755 --- a/py_app/app/order_labels.py +++ b/py_app/app/order_labels.py @@ -10,6 +10,7 @@ import os import json import tempfile from datetime import datetime +import pandas as pd def get_db_connection(): """Get database connection using external server configuration""" @@ -73,8 +74,15 @@ def validate_order_row(row_data): data_livrare = row_data.get('data_livrare', '').strip() if data_livrare: try: - # Try to parse common date formats - for date_format in ['%Y-%m-%d', '%d/%m/%Y', '%m/%d/%Y', '%d.%m.%Y']: + # Try to parse common date formats including Excel datetime format + date_formats = [ + '%Y-%m-%d', # 2024-03-12 + '%Y-%m-%d %H:%M:%S', # 2024-03-12 00:00:00 (Excel format) + '%d/%m/%Y', # 12/03/2024 + '%m/%d/%Y', # 03/12/2024 + '%d.%m.%Y' # 12.03.2024 + ] + for date_format in date_formats: try: datetime.strptime(data_livrare, date_format) break @@ -118,8 +126,15 @@ def add_order_to_database(order_data): data_livrare_str = order_data.get('data_livrare', '').strip() if data_livrare_str: try: - # Try to parse common date formats and convert to YYYY-MM-DD - for date_format in ['%Y-%m-%d', '%d/%m/%Y', '%m/%d/%Y', '%d.%m.%Y']: + # Try to parse common date formats including Excel datetime and convert to YYYY-MM-DD + date_formats = [ + '%Y-%m-%d', # 2024-03-12 + '%Y-%m-%d %H:%M:%S', # 2024-03-12 00:00:00 (Excel format) + '%d/%m/%Y', # 12/03/2024 + '%m/%d/%Y', # 03/12/2024 + '%d.%m.%Y' # 12.03.2024 + ] + for date_format in date_formats: try: parsed_date = datetime.strptime(data_livrare_str, date_format) data_livrare_value = parsed_date.strftime('%Y-%m-%d') @@ -167,6 +182,141 @@ def add_order_to_database(order_data): except Exception as e: return False, f"Unexpected error: {str(e)}" +def process_excel_file(file_path): + """ + Process uploaded Excel file (.xlsx) and return parsed data with validation + Returns: (orders_data: list, validation_errors: list, validation_warnings: list) + """ + orders_data = [] + all_errors = [] + all_warnings = [] + + try: + # Read Excel file - try 'Sheet1' 
first (common data sheet), then fallback to first sheet + try: + df = pd.read_excel(file_path, sheet_name='Sheet1', engine='openpyxl') + except: + try: + df = pd.read_excel(file_path, sheet_name=0, engine='openpyxl') + except: + # Last resort - try 'DataSheet' + df = pd.read_excel(file_path, sheet_name='DataSheet', engine='openpyxl') + + # Column mapping for Excel files (case-insensitive) + # Maps Excel column names to database field names + column_mapping = { + # Core order fields + 'comanda productie': 'comanda_productie', + 'comanda_productie': 'comanda_productie', + 'cod articol': 'cod_articol', + 'cod_articol': 'cod_articol', + 'descriere': 'descr_com_prod', + 'descr. com. prod': 'descr_com_prod', + 'descr com prod': 'descr_com_prod', + 'descr_com_prod': 'descr_com_prod', + 'description': 'descr_com_prod', + 'cantitate': 'cantitate', + 'cantitate ceruta': 'cantitate', + 'quantity': 'cantitate', + 'datalivrare': 'data_livrare', + 'data livrare': 'data_livrare', + 'data_livrare': 'data_livrare', + 'delivery date': 'data_livrare', + 'dimensiune': 'dimensiune', + 'dimension': 'dimensiune', + + # Customer and order info + 'customer': 'customer_name', + 'customer name': 'customer_name', + 'customer_name': 'customer_name', + 'comanda client': 'com_achiz_client', + 'com.achiz.client': 'com_achiz_client', + 'com achiz client': 'com_achiz_client', + 'com_achiz_client': 'com_achiz_client', + 'customer article number': 'customer_article_number', + 'customer_article_number': 'customer_article_number', + + # Status and dates + 'status': 'status', + 'end of quilting': 'end_of_quilting', + 'end of sewing': 'end_of_sewing', + 'data deschiderii': 'data_deschiderii', + 'data planific.': 'data_planific', + 'data planific': 'data_planific', + + # Machine and production info + 'masina cusut': 'masina_cusut', + 'masina cusut ': 'masina_cusut', # Note trailing space in Excel + 'tip masina': 'tip_masina', + 'numar masina': 'numar_masina', + 'clasificare': 'clasificare', + 'timp normat total': 'timp_normat_total', + + # Quality control stages (T1, T2, T3) + 't1': 't1', + 'data inregistrare t1': 'data_inregistrare_t1', + 'numele complet t1': 'numele_complet_t1', + 't2': 't2', + 'data inregistrare t2': 'data_inregistrare_t2', + 'numele complet t2': 'numele_complet_t2', + 't3': 't3', + 'data inregistrare t3': 'data_inregistrare_t3', + 'numele complet t3': 'numele_complet_t3', + + # Design and model info + 'model lb2': 'model_lb2', + 'design nr': 'design_nr', + 'needle position': 'needle_position', + + # Line references + 'nr. linie com. 
client': 'nr_linie_com_client', + 'nr linie com client': 'nr_linie_com_client', + 'nr_linie_com_client': 'nr_linie_com_client', + 'line': 'line_number', + 'line_number': 'line_number', + 'open for order': 'open_for_order', + 'open_for_order': 'open_for_order' + } + + # Normalize column names + df.columns = [col.lower().strip() if col else f'col_{i}' for i, col in enumerate(df.columns)] + + # Process each row + for idx, row in df.iterrows(): + # Skip empty rows + if row.isna().all(): + continue + + # Create normalized row data + normalized_row = {} + for col_name in df.columns: + col_key = col_name.lower().strip() + mapped_key = column_mapping.get(col_key, col_key.replace(' ', '_').replace('.', '')) + + # Get value and convert to string, handle NaN + value = row[col_name] + if pd.isna(value): + normalized_row[mapped_key] = '' + else: + normalized_row[mapped_key] = str(value).strip() + + # Validate the row + errors, warnings = validate_order_row(normalized_row) + + if errors: + all_errors.extend([f"Row {idx + 2}: {error}" for error in errors]) + else: + # Only add valid rows + orders_data.append(normalized_row) + + if warnings: + all_warnings.extend([f"Row {idx + 2}: {warning}" for warning in warnings]) + + except Exception as e: + all_errors.append(f"Error processing Excel file: {str(e)}") + + return orders_data, all_errors, all_warnings + def process_csv_file(file_path): """ Process uploaded CSV file and return parsed data with validation @@ -268,7 +418,7 @@ def upload_orders_handler(): if request.method == 'POST': # Handle file upload file = request.files.get('csv_file') - if file and file.filename.endswith(('.csv', '.CSV')): + if file and file.filename.endswith(('.csv', '.CSV', '.xlsx', '.XLSX', '.xls', '.XLS')): try: # Save uploaded file temp_path = os.path.join(temp_dir, file.filename) @@ -278,8 +428,11 @@ def upload_orders_handler(): session['csv_filename'] = file.filename session['orders_csv_filepath'] = temp_path - # Process the CSV file - orders_data, validation_errors, validation_warnings = process_csv_file(temp_path) + # Process the file based on extension + if file.filename.lower().endswith(('.xlsx', '.xls')): + orders_data, validation_errors, validation_warnings = process_excel_file(temp_path) + else: + orders_data, validation_errors, validation_warnings = process_csv_file(temp_path) # Store processed data in session session['orders_csv_data'] = orders_data diff --git a/py_app/app/print_module.py b/py_app/app/print_module.py index 10b3d8f..da602ff 100755 --- a/py_app/app/print_module.py +++ b/py_app/app/print_module.py @@ -7,8 +7,13 @@ def get_db_connection(): settings = {} with open(settings_file, 'r') as f: for line in f: - key, value = line.strip().split('=', 1) - settings[key] = value + line = line.strip() + # Skip empty lines and comments + if not line or line.startswith('#'): + continue + if '=' in line: + key, value = line.split('=', 1) + settings[key] = value return mariadb.connect( user=settings['username'], password=settings['password'], @@ -23,6 +28,10 @@ def get_unprinted_orders_data(limit=100): Returns list of order dictionaries where printed_labels != 1 """ try: + import sys + sys.stderr.write(f"DEBUG print_module: get_unprinted_orders_data called with limit={limit}\n") + sys.stderr.flush() + conn = get_db_connection() cursor = conn.cursor() @@ -30,8 +39,14 @@ def get_unprinted_orders_data(limit=100): cursor.execute("SHOW COLUMNS FROM order_for_labels LIKE 'printed_labels'") column_exists = cursor.fetchone() + sys.stderr.write(f"DEBUG print_module: printed_labels 
column exists={bool(column_exists)}\n") + sys.stderr.flush() + if column_exists: # Use printed_labels column + sys.stderr.write(f"DEBUG print_module: Executing query with printed_labels != 1\n") + sys.stderr.flush() + cursor.execute(""" SELECT id, comanda_productie, cod_articol, descr_com_prod, cantitate, com_achiz_client, nr_linie_com_client, customer_name, @@ -43,6 +58,9 @@ def get_unprinted_orders_data(limit=100): LIMIT %s """, (limit,)) else: + sys.stderr.write(f"DEBUG print_module: Executing fallback query (no printed_labels column)\n") + sys.stderr.flush() + # Fallback: get all orders if no printed_labels column cursor.execute(""" SELECT id, comanda_productie, cod_articol, descr_com_prod, cantitate, @@ -55,7 +73,21 @@ def get_unprinted_orders_data(limit=100): """, (limit,)) orders = [] - for row in cursor.fetchall(): + rows = cursor.fetchall() + sys.stderr.write(f"DEBUG print_module: Query returned {len(rows)} rows\n") + sys.stderr.flush() + + # Also write to file for debugging + try: + with open('/app/print_module_debug.log', 'w') as f: + f.write(f"Query returned {len(rows)} rows\n") + f.write(f"Column exists: {column_exists}\n") + if rows: + f.write(f"First row: {rows[0]}\n") + except: + pass + + for row in rows: if column_exists: orders.append({ 'id': row[0], @@ -100,6 +132,21 @@ def get_unprinted_orders_data(limit=100): return orders except Exception as e: + import sys + import traceback + error_trace = traceback.format_exc() + + sys.stderr.write(f"ERROR in get_unprinted_orders_data: {e}\n{error_trace}\n") + sys.stderr.flush() + + # Write to file + try: + with open('/app/print_module_error.log', 'w') as f: + f.write(f"ERROR: {e}\n") + f.write(f"Traceback:\n{error_trace}\n") + except: + pass + print(f"Error retrieving unprinted orders: {e}") return [] diff --git a/py_app/app/routes.py b/py_app/app/routes.py index f9ec80c..1fce075 100755 --- a/py_app/app/routes.py +++ b/py_app/app/routes.py @@ -1717,45 +1717,203 @@ def etichete(): @requires_labels_module def upload_data(): if request.method == 'POST': + import sys + sys.stdout.flush() + + # Write to file to ensure we can see it + try: + with open('/app/request.log', 'a') as f: + from datetime import datetime + f.write(f"\n{'='*80}\n") + f.write(f"POST REQUEST at {datetime.now()}\n") + f.write(f"Form data: {dict(request.form)}\n") + f.write(f"Files: {list(request.files.keys())}\n") + except: + pass + + sys.stderr.write(f"DEBUG: POST request received for upload_data\n") + sys.stderr.flush() action = request.form.get('action', 'preview') + sys.stderr.write(f"DEBUG: Action = {action}\n") + sys.stderr.flush() if action == 'preview': # Handle file upload and show preview + print(f"DEBUG: Processing preview action") + print(f"DEBUG: Files in request: {list(request.files.keys())}") + if 'file' not in request.files: + print(f"DEBUG: No file in request.files") flash('No file selected', 'error') return redirect(request.url) file = request.files['file'] + print(f"DEBUG: File received: {file.filename}") + if file.filename == '': + print(f"DEBUG: Empty filename") flash('No file selected', 'error') return redirect(request.url) - if file and file.filename.lower().endswith('.csv'): + filename_lower = file.filename.lower() + print(f"DEBUG: Filename lowercase: {filename_lower}") + + # Handle both CSV and Excel files + if file and (filename_lower.endswith('.csv') or filename_lower.endswith('.xlsx') or filename_lower.endswith('.xls')): try: - # Read CSV file - import csv - import io - - # Read the file content - stream = 
io.StringIO(file.stream.read().decode("UTF8"), newline=None) - csv_input = csv.DictReader(stream) - - # Convert to list for preview preview_data = [] headers = [] - for i, row in enumerate(csv_input): - if i == 0: - headers = list(row.keys()) - if i < 10: # Show only first 10 rows for preview - preview_data.append(row) - else: - break + if filename_lower.endswith('.csv'): + # Read CSV file + import csv + import io + + # Read the file content + stream = io.StringIO(file.stream.read().decode("UTF8"), newline=None) + csv_input = csv.DictReader(stream) + + # Define the fields that are stored in the database + database_fields = [ + 'comanda_productie', 'cod_articol', 'descr_com_prod', 'cantitate', + 'data_livrare', 'dimensiune', 'com_achiz_client', 'nr_linie_com_client', + 'customer_name', 'customer_article_number', 'open_for_order', 'line_number' + ] + + # Convert to list for preview + all_rows = [] + for i, row in enumerate(csv_input): + all_rows.append(row) + if i == 0: + # Get all available fields from CSV + all_fields = list(row.keys()) + # Filter to only database fields + headers = [field for field in database_fields if field in all_fields or + any(field.lower() == k.lower().replace(' ', '_').replace('.', '') for k in all_fields)] + if i < 10: # Show only first 10 rows for preview + # Filter row to only show database fields + filtered_row = {k: v for k, v in row.items() if k.lower().replace(' ', '_').replace('.', '') in database_fields} + preview_data.append(filtered_row) + + # If no headers were set, use all available + if not headers and all_rows: + headers = list(all_rows[0].keys()) + preview_data = all_rows[:10] + + # Store the full file content in a temp file instead of session + import uuid + + upload_id = str(uuid.uuid4()) + temp_data_file = f'/tmp/upload_{upload_id}.csv' + + file.stream.seek(0) # Reset file pointer + with open(temp_data_file, 'wb') as f: + f.write(file.stream.read()) + + # Store only the file reference in session + session['upload_id'] = upload_id + session['csv_filename'] = file.filename + session['file_type'] = 'csv' + session.modified = True - # Store the full file content in session for later processing - file.stream.seek(0) # Reset file pointer - session['csv_content'] = file.stream.read().decode("UTF8") - session['csv_filename'] = file.filename + else: # Excel file + print(f"DEBUG: Processing Excel file: {file.filename}") + import sys + sys.stderr.write(f"DEBUG: Processing Excel file: {file.filename}\n") + sys.stderr.flush() + + import os + import tempfile + from app.order_labels import process_excel_file + + # Save uploaded file temporarily + temp_file = tempfile.NamedTemporaryFile(delete=False, suffix=os.path.splitext(file.filename)[1]) + print(f"DEBUG: Created temp file: {temp_file.name}") + sys.stderr.write(f"DEBUG: Created temp file: {temp_file.name}\n") + sys.stderr.flush() + + file.save(temp_file.name) + temp_file.close() + print(f"DEBUG: Saved file to temp location") + sys.stderr.write(f"DEBUG: Saved file to temp location\n") + sys.stderr.flush() + + # Process Excel file + print(f"DEBUG: Calling process_excel_file()") + orders_data, errors, warnings = process_excel_file(temp_file.name) + print(f"DEBUG: Process complete - orders: {len(orders_data)}, errors: {len(errors)}, warnings: {len(warnings)}") + + # Clean up temp file + os.unlink(temp_file.name) + + if errors: + for error in errors[:10]: + flash(error, 'error') + if len(errors) > 10: + flash(f'... 
and {len(errors) - 10} more errors', 'error') + + if warnings: + for warning in warnings[:5]: + flash(warning, 'warning') + + if not orders_data: + import sys + sys.stderr.write(f"ERROR: No valid orders data found. Errors: {len(errors)}\n") + sys.stderr.flush() + + # Write to file + try: + with open('/app/upload_error.log', 'a') as f: + from datetime import datetime + f.write(f"\n{'='*80}\n") + f.write(f"NO VALID DATA at {datetime.now()}\n") + f.write(f"File: {file.filename}\n") + f.write(f"Errors ({len(errors)}):\n") + for err in errors[:20]: + f.write(f" - {err}\n") + f.write(f"Warnings ({len(warnings)}):\n") + for warn in warnings[:20]: + f.write(f" - {warn}\n") + except: + pass + + flash('No valid data found in Excel file', 'error') + return redirect(request.url) + + # Get headers from first row - only show fields that will be stored in database + if orders_data: + # Define the fields that are stored in the database + database_fields = [ + 'comanda_productie', 'cod_articol', 'descr_com_prod', 'cantitate', + 'data_livrare', 'dimensiune', 'com_achiz_client', 'nr_linie_com_client', + 'customer_name', 'customer_article_number', 'open_for_order', 'line_number' + ] + + # Filter headers to only include database fields that exist in data + all_fields = list(orders_data[0].keys()) + headers = [field for field in database_fields if field in all_fields] + + # Filter preview data to only show database fields + preview_data = [] + for order in orders_data[:10]: + filtered_order = {k: v for k, v in order.items() if k in headers} + preview_data.append(filtered_order) + + # Store data in a temporary file instead of session (session is too small for large datasets) + import json + import uuid + + upload_id = str(uuid.uuid4()) + temp_data_file = f'/tmp/upload_{upload_id}.json' + + with open(temp_data_file, 'w') as f: + json.dump(orders_data, f) + + # Store only the file reference in session + session['upload_id'] = upload_id + session['csv_filename'] = file.filename + session['file_type'] = 'excel' + session.modified = True return render_template('upload_orders.html', preview_data=preview_data, @@ -1764,27 +1922,112 @@ def upload_data(): filename=file.filename) except Exception as e: - flash(f'Error reading CSV file: {str(e)}', 'error') + import traceback + import sys + error_trace = traceback.format_exc() + print(f"ERROR processing file: {error_trace}") + sys.stderr.write(f"ERROR processing file: {error_trace}\n") + sys.stderr.flush() + + # Also write to a file + try: + with open('/app/upload_error.log', 'a') as f: + from datetime import datetime + f.write(f"\n{'='*80}\n") + f.write(f"ERROR at {datetime.now()}\n") + f.write(f"File: {file.filename if file else 'unknown'}\n") + f.write(f"Error: {str(e)}\n") + f.write(f"Traceback:\n{error_trace}\n") + except: + pass + + flash(f'Error reading file: {str(e)}', 'error') return redirect(request.url) else: - flash('Please upload a CSV file', 'error') + flash('Please upload a CSV or Excel file (.csv, .xlsx, .xls)', 'error') return redirect(request.url) elif action == 'save': # Save the data to database - if 'csv_content' not in session: + import sys + sys.stderr.write("DEBUG: Save action triggered\n") + sys.stderr.flush() + + # Log to file immediately + try: + with open('/app/save_check.log', 'a') as f: + from datetime import datetime + f.write(f"\n{'='*80}\n") + f.write(f"SAVE ACTION at {datetime.now()}\n") + f.write(f"Session keys: {list(session.keys())}\n") + f.write(f"Session file_type: {session.get('file_type', 'NOT SET')}\n") + f.write(f"Has csv_content: 
{'csv_content' in session}\n") + f.write(f"Has orders_data: {'orders_data' in session}\n") + except Exception as log_err: + sys.stderr.write(f"Error writing log: {log_err}\n") + + file_type = session.get('file_type') + upload_id = session.get('upload_id') + + sys.stderr.write(f"DEBUG: File type = {file_type}, upload_id = {upload_id}\n") + sys.stderr.flush() + + if not file_type or not upload_id: + sys.stderr.write("DEBUG: Missing file_type or upload_id in session\n") + sys.stderr.flush() + + try: + with open('/app/save_check.log', 'a') as f: + f.write("ERROR: Missing upload data in session - redirecting\n") + except: + pass + flash('No data to save. Please upload a file first.', 'error') return redirect(request.url) try: - import csv - import io + print(f"DEBUG: Starting {file_type.upper()} upload processing...") + sys.stderr.write(f"DEBUG: Starting {file_type.upper()} upload processing...\n") + sys.stderr.flush() - print(f"DEBUG: Starting CSV upload processing...") + # Log to file + try: + with open('/app/save_debug.log', 'a') as f: + from datetime import datetime + f.write(f"\n{'='*80}\n") + f.write(f"SAVE START at {datetime.now()}\n") + f.write(f"File type: {file_type}\n") + f.write(f"Session keys: {list(session.keys())}\n") + except: + pass - # Read the CSV content from session - stream = io.StringIO(session['csv_content'], newline=None) - csv_input = csv.DictReader(stream) + # Get orders data from temp file + import json + temp_data_file = f'/tmp/upload_{upload_id}.{"json" if file_type == "excel" else "csv"}' + + if file_type == 'excel': + with open(temp_data_file, 'r') as f: + orders_list = json.load(f) + + # Log + try: + with open('/app/save_debug.log', 'a') as f: + f.write(f"Loaded {len(orders_list)} orders from temp file (Excel)\n") + except: + pass + else: + # Read the CSV content from temp file + import csv + with open(temp_data_file, 'r') as f: + csv_input = csv.DictReader(f) + orders_list = list(csv_input) + + # Log + try: + with open('/app/save_debug.log', 'a') as f: + f.write(f"Loaded {len(orders_list)} orders from temp file (CSV)\n") + except: + pass # Connect to database conn = get_db_connection() @@ -1794,10 +2037,10 @@ def upload_data(): error_count = 0 errors = [] - print(f"DEBUG: Connected to database, processing rows...") + print(f"DEBUG: Connected to database, processing {len(orders_list)} rows...") # Process each row - for index, row in enumerate(csv_input): + for index, row in enumerate(orders_list): try: print(f"DEBUG: Processing row {index + 1}: {row}") @@ -1824,10 +2067,18 @@ def upload_data(): # Convert empty string to None for date field if data_livrare: try: - # Parse date from various formats (9/23/2023, 23/9/2023, 2023-09-23, etc.) 
+ # Parse date from various formats including Excel datetime format from datetime import datetime # Try different date formats - date_formats = ['%m/%d/%Y', '%d/%m/%Y', '%Y-%m-%d', '%m-%d-%Y', '%d-%m-%Y'] + date_formats = [ + '%Y-%m-%d', # 2024-03-12 + '%Y-%m-%d %H:%M:%S', # 2024-03-12 00:00:00 (Excel format) + '%m/%d/%Y', # 03/12/2024 + '%d/%m/%Y', # 12/03/2024 + '%m-%d-%Y', # 03-12-2024 + '%d-%m-%Y', # 12-03-2024 + '%d.%m.%Y' # 12.03.2024 + ] parsed_date = None for fmt in date_formats: try: @@ -1897,9 +2148,39 @@ def upload_data(): print(f"DEBUG: Committed {inserted_count} records to database") - # Clear session data - session.pop('csv_content', None) + # Log the result + import sys + try: + with open('/app/upload_success.log', 'a') as f: + from datetime import datetime + f.write(f"\n{'='*80}\n") + f.write(f"UPLOAD COMPLETED at {datetime.now()}\n") + f.write(f"File type: {file_type}\n") + f.write(f"Total rows processed: {len(orders_list)}\n") + f.write(f"Successfully inserted: {inserted_count}\n") + f.write(f"Errors: {error_count}\n") + if errors: + f.write(f"First 10 errors:\n") + for err in errors[:10]: + f.write(f" - {err}\n") + except: + pass + + sys.stderr.write(f"DEBUG: Upload complete - inserted {inserted_count}, errors {error_count}\n") + sys.stderr.flush() + + # Clear session data and remove temp file + import os + temp_file_path = f'/tmp/upload_{upload_id}.{"json" if file_type == "excel" else "csv"}' + try: + if os.path.exists(temp_file_path): + os.unlink(temp_file_path) + except: + pass + + session.pop('upload_id', None) session.pop('csv_filename', None) + session.pop('file_type', None) # Show results if error_count > 0: @@ -1912,6 +2193,24 @@ def upload_data(): flash(f'Successfully uploaded {inserted_count} orders for labels', 'success') except Exception as e: + import sys + import traceback + error_trace = traceback.format_exc() + + # Log the error + try: + with open('/app/upload_error.log', 'a') as f: + from datetime import datetime + f.write(f"\n{'='*80}\n") + f.write(f"SAVE ERROR at {datetime.now()}\n") + f.write(f"Error: {str(e)}\n") + f.write(f"Traceback:\n{error_trace}\n") + except: + pass + + sys.stderr.write(f"ERROR in save: {error_trace}\n") + sys.stderr.flush() + flash(f'Error processing data: {str(e)}', 'error') return redirect(url_for('main.upload_data')) @@ -3092,15 +3391,46 @@ def get_unprinted_orders(): # return jsonify({'error': 'Access denied. 
Required roles: superadmin, warehouse_manager, etichete'}), 403 try: - print("DEBUG: Calling get_unprinted_orders_data()") + import sys + sys.stderr.write("DEBUG: Calling get_unprinted_orders_data()\n") + sys.stderr.flush() + data = get_unprinted_orders_data() - print(f"DEBUG: Retrieved {len(data)} orders") + + sys.stderr.write(f"DEBUG: Retrieved {len(data)} orders\n") + sys.stderr.flush() + + # Write to file + try: + with open('/app/unprinted_debug.log', 'w') as f: + from datetime import datetime + f.write(f"DEBUG at {datetime.now()}\n") + f.write(f"Retrieved {len(data)} orders\n") + if data: + f.write(f"First order: {data[0]}\n") + except: + pass + return jsonify(data) except Exception as e: - print(f"DEBUG: Error in get_unprinted_orders: {e}") + import sys import traceback - traceback.print_exc() + error_trace = traceback.format_exc() + + sys.stderr.write(f"DEBUG: Error in get_unprinted_orders: {e}\n{error_trace}\n") + sys.stderr.flush() + + # Write to file + try: + with open('/app/unprinted_debug.log', 'w') as f: + from datetime import datetime + f.write(f"ERROR at {datetime.now()}\n") + f.write(f"Error: {e}\n") + f.write(f"Traceback:\n{error_trace}\n") + except: + pass + return jsonify({'error': str(e)}), 500 @bp.route('/generate_labels_pdf/', methods=['POST']) diff --git a/py_app/app/templates/print_lost_labels.html b/py_app/app/templates/print_lost_labels.html index fcd3d2d..dba7b33 100755 --- a/py_app/app/templates/print_lost_labels.html +++ b/py_app/app/templates/print_lost_labels.html @@ -2,6 +2,50 @@ {% block head %} + {% endblock %} {% block content %} @@ -13,7 +57,7 @@ -
+
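One detail in the hunks above is worth making concrete: both routes.py and order_labels.py now walk a list of `strptime` formats to normalize delivery dates, including the `YYYY-MM-DD HH:MM:SS` strings that pandas produces for Excel datetime cells. Note that the two hunks order the ambiguous formats differently (`%m/%d/%Y` before `%d/%m/%Y` in routes.py, the reverse in order_labels.py), so a value like `03/12/2024` can normalize to different dates depending on the code path. Below is a minimal standalone sketch of the shared logic; the helper name `parse_delivery_date` is illustrative, not from the codebase:

```python
from datetime import datetime
from typing import Optional

# Same formats as the hunks above; first match wins, so order matters for
# ambiguous day/month values.
DATE_FORMATS = [
    '%Y-%m-%d',           # 2024-03-12
    '%Y-%m-%d %H:%M:%S',  # 2024-03-12 00:00:00 (pandas/Excel datetime)
    '%d/%m/%Y',           # 12/03/2024
    '%m/%d/%Y',           # 03/12/2024
    '%d.%m.%Y',           # 12.03.2024
]

def parse_delivery_date(raw: str) -> Optional[str]:
    """Return the date normalized to YYYY-MM-DD, or None if no format matches."""
    raw = raw.strip()
    if not raw:
        return None
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw, fmt).strftime('%Y-%m-%d')
        except ValueError:
            continue  # try the next format
    return None

# Quick self-checks of the behavior described above:
assert parse_delivery_date('2024-03-12 00:00:00') == '2024-03-12'
assert parse_delivery_date('12.03.2024') == '2024-03-12'
assert parse_delivery_date('not a date') is None
```

Keeping one shared format list (and one agreed order for the ambiguous `%d/%m/%Y` vs `%m/%d/%Y` pair) in a single helper would avoid the divergence between the two upload paths.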