Fix print_lost_labels compact styling and production data import
- Added compact table styling to the print_lost_labels page (smaller fonts, reduced padding)
- Fixed production data import missing fields (production_order_line, line_number)
- Added better error handling and logging for Excel file imports
- Skip empty rows in production data import
- Log all columns, and columns with data, for debugging
@@ -1,303 +0,0 @@
# Quality Application - Docker Deployment Guide

## 📋 Overview

This application is containerized with Docker and docker-compose, providing:

- **MariaDB 11.3** database with persistent storage
- **Flask** web application with Gunicorn
- **Mapped volumes** for easy access to code, data, and backups

## 🗂️ Volume Structure

```
quality_app/
├── data/
│   └── mariadb/     # Database files (MariaDB data directory)
├── config/
│   └── instance/    # Application configuration (external_server.conf)
├── logs/            # Application and Gunicorn logs
├── backups/         # Database backup files (shared with DB container)
└── py_app/          # Application source code (optional mapping)
```

## 🚀 Quick Start

### 1. Setup Volumes

```bash
# Create necessary directories
bash setup-volumes.sh
```

### 2. Configure Environment

```bash
# Create .env file from example
cp .env.example .env

# Edit configuration (IMPORTANT: change passwords!)
nano .env
```

**Critical settings to change:**

- `MYSQL_ROOT_PASSWORD` - Database root password
- `DB_PASSWORD` - Application database password
- `SECRET_KEY` - Flask secret key (generate a random string; see the command below)
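One quick way to generate such a key (a sketch; any sufficiently long random string works):

```bash
# Print a 64-character hex secret suitable for SECRET_KEY
python3 -c "import secrets; print(secrets.token_hex(32))"
```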
**First deployment settings:**

- `INIT_DB=true` - Initialize database schema
- `SEED_DB=true` - Seed with default data

**After first deployment:**

- `INIT_DB=false`
- `SEED_DB=false`

### 3. Deploy Application

**Option A: Automated deployment**

```bash
bash quick-deploy.sh
```

**Option B: Manual deployment**

```bash
# Build images
docker-compose build

# Start services
docker-compose up -d

# View logs
docker-compose logs -f
```

## 📦 Application Dependencies

### Python Packages (from requirements.txt)

- Flask - Web framework
- Flask-SSLify - SSL support
- Werkzeug - WSGI utilities
- gunicorn - Production WSGI server
- pyodbc - ODBC database connectivity
- mariadb - MariaDB connector
- reportlab - PDF generation
- requests - HTTP library
- pandas - Data manipulation
- openpyxl - Excel file support
- APScheduler - Job scheduling for automated backups (see the sketch below)
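A minimal sketch of how APScheduler can drive scheduled backups (the `run_backup` callable is hypothetical; the actual job wiring lives in the application code):

```python
from apscheduler.schedulers.background import BackgroundScheduler

def run_backup():
    # Placeholder: call the application's backup routine here
    pass

scheduler = BackgroundScheduler()
scheduler.add_job(run_backup, "cron", hour=2, minute=0)  # nightly at 02:00
scheduler.start()
```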
### System Dependencies (handled in Dockerfile)

- Python 3.10
- MariaDB client libraries
- curl (for health checks)

## 🐳 Docker Images

### Web Application

- **Base**: python:3.10-slim
- **Multi-stage build** for minimal image size
- **Non-root user** for security
- **Health checks** enabled (see the sketch below)
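A sketch of what such a health check can look like in the Dockerfile (the endpoint path, port, and timings here are assumptions; adjust to the application's actual setup):

```dockerfile
# Mark the container unhealthy if the app stops answering HTTP
HEALTHCHECK --interval=30s --timeout=5s --retries=3 \
    CMD curl -fsS http://localhost:8781/ || exit 1
```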
### Database

- **Image**: mariadb:11.3
- **Persistent storage** with volume mapping
- **Performance tuning** via environment variables

## 📊 Resource Limits

### Database Container

- CPU: 2.0 cores (limit), 0.5 cores (reserved)
- Memory: 2 GB (limit), 512 MB (reserved)
- Buffer pool: 512 MB

### Web Container

- CPU: 2.0 cores (limit), 0.5 cores (reserved)
- Memory: 2 GB (limit), 512 MB (reserved)
- Workers: 5 Gunicorn workers
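In docker-compose these limits are typically expressed per service; a sketch of the shape (values mirror the tables above, but whether the project nests them exactly like this is an assumption):

```yaml
deploy:
  resources:
    limits:
      cpus: "2.0"
      memory: 2G
    reservations:
      cpus: "0.5"
      memory: 512M
```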
## 🔧 Common Operations

### View Logs

```bash
# Application logs
docker-compose logs -f web

# Database logs
docker-compose logs -f db

# All logs
docker-compose logs -f
```

### Restart Services

```bash
# Restart all
docker-compose restart

# Restart a specific service
docker-compose restart web
docker-compose restart db
```

### Stop Services

```bash
# Stop (keeps data)
docker-compose down

# Stop and remove volumes (WARNING: deletes database!)
docker-compose down -v
```

### Update Application Code

**Without rebuilding (development mode):**

1. Uncomment the volume mapping in docker-compose.yml:
   ```yaml
   - ${APP_CODE_PATH}:/app:ro
   ```
2. Edit code in `./py_app/`
3. Restart: `docker-compose restart web`

**With rebuilding (production mode):**

```bash
docker-compose build --no-cache web
docker-compose up -d
```

### Database Access

**MySQL shell inside the container:**

```bash
docker-compose exec db mysql -u trasabilitate -p
# Enter password: Initial01! (or your custom password)
```

**From the host machine:**

```bash
mysql -h 127.0.0.1 -P 3306 -u trasabilitate -p
```

**Root access:**

```bash
docker-compose exec db mysql -u root -p
```

## 💾 Backup Operations

### Manual Backup

```bash
# Full backup
docker-compose exec db mysqldump -u trasabilitate -pInitial01! trasabilitate > backups/manual_$(date +%Y%m%d_%H%M%S).sql

# Data-only backup
docker-compose exec db mysqldump -u trasabilitate -pInitial01! --no-create-info trasabilitate > backups/data_only_$(date +%Y%m%d_%H%M%S).sql

# Structure-only backup
docker-compose exec db mysqldump -u trasabilitate -pInitial01! --no-data trasabilitate > backups/structure_only_$(date +%Y%m%d_%H%M%S).sql
```

### Automated Backups

The application includes a built-in scheduler for automated backups. Configure it via the web interface.
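If a host-side fallback is ever wanted, plain cron can drive the same dump command (a sketch; the schedule and paths are assumptions, and the built-in scheduler remains the supported path):

```bash
# crontab -e: nightly dump at 02:00 into the shared backups/ volume
# (% must be escaped as \% inside crontab entries)
0 2 * * * cd /srv/quality_app && docker-compose exec -T db mysqldump -u trasabilitate -pInitial01! trasabilitate > backups/auto_$(date +\%Y\%m\%d).sql
```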
### Restore from Backup

```bash
# Stop application (keeps database running)
docker-compose stop web

# Restore database
docker-compose exec -T db mysql -u trasabilitate -pInitial01! trasabilitate < backups/backup_file.sql

# Start application
docker-compose start web
```

## 🔍 Troubleshooting

### Container won't start

```bash
# Check logs
docker-compose logs db
docker-compose logs web

# Check whether the ports are available
ss -tulpn | grep 8781
ss -tulpn | grep 3306
```

### Database connection failed

```bash
# Check that the database is healthy
docker-compose ps

# Test the database connection
docker-compose exec db mysqladmin ping -u root -p

# Check database users
docker-compose exec db mysql -u root -p -e "SELECT User, Host FROM mysql.user;"
```

### Permission issues

```bash
# Check directory permissions
ls -la data/mariadb
ls -la logs
ls -la backups

# Fix permissions if needed
chmod -R 755 data logs backups config
```

### Reset everything (WARNING: deletes all data!)

```bash
# Stop and remove containers and volumes
docker-compose down -v

# Remove volume directories
rm -rf data/mariadb/* logs/* config/instance/*

# Start fresh
bash quick-deploy.sh
```

## 🔒 Security Notes

1. **Change default passwords** in the .env file
2. **Generate a new SECRET_KEY** for Flask
3. Never commit the .env file to version control
4. Use firewall rules to restrict access to the database port (3306)
5. Consider using Docker secrets for sensitive data in production (see the sketch below)
6. Regular security updates: `docker-compose pull && docker-compose up -d`
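A sketch of the Docker secrets shape in docker-compose (the secret file path is an assumption, and the `_FILE` variant of the password variable assumes an image that supports it):

```yaml
services:
  db:
    secrets:
      - db_password
    environment:
      # The official MariaDB image reads *_FILE variants of its password variables
      - MYSQL_ROOT_PASSWORD_FILE=/run/secrets/db_password

secrets:
  db_password:
    file: ./secrets/db_password.txt
```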
## 🌐 Port Mapping

- **8781** - Web application (configurable via APP_PORT in .env)
- **3306** - MariaDB database (configurable via DB_PORT in .env)

## 📁 Configuration Files

- **docker-compose.yml** - Service orchestration
- **.env** - Environment variables and configuration
- **Dockerfile** - Web application image definition
- **docker-entrypoint.sh** - Container initialization script
- **init-db.sql** - Database initialization script

## 🎯 Production Checklist

- [ ] Change all default passwords
- [ ] Generate a secure SECRET_KEY
- [ ] Set FLASK_ENV=production
- [ ] Configure resource limits appropriately
- [ ] Set up a backup schedule
- [ ] Configure firewall rules
- [ ] Set up monitoring and logging
- [ ] Test backup/restore procedures
- [ ] Document the deployment procedure for your team
- [ ] Set INIT_DB=false and SEED_DB=false after the first deployment

## 📞 Support

For issues or questions, refer to:

- Documentation in the `documentation/` folder
- Docker logs: `docker-compose logs -f`
- Application logs: `./logs/` directory
@@ -1,123 +0,0 @@
# Improvements Applied to Quality App

## Date: November 13, 2025

### Overview

All improvements from the production environment have been successfully transposed to the quality_app project.

## Files Updated/Copied

### 1. Docker Configuration

- **Dockerfile** - Added the `mariadb-client` package for backup functionality
- **docker-compose.yml** - Updated with proper volume mappings and /data folder support
- **.env** - Updated all paths to use absolute paths under `/srv/quality_app/`

### 2. Backup & Restore System

- **database_backup.py** - Fixed backup/restore functions:
  - Changed `result_success` to `result.returncode == 0`
  - Added the `--skip-ssl` flag for MariaDB connections
  - Fixed restore function error handling
- **restore_database.sh** - Fixed SQL file parsing to handle the MariaDB dump format

### 3. UI Improvements - Sticky Table Headers

- **base.css** - Added sticky header CSS for all report tables (see the sketch below)
- **scan.html** - Wrapped the table in a `report-table-container` div
- **fg_scan.html** - Wrapped the table in a `report-table-container` div
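A sketch of the sticky-header technique (the `report-table-container` class name comes from the templates above; the height and colors are assumptions):

```css
/* Scroll inside the wrapper so the header can stick to its top */
.report-table-container {
  max-height: 70vh;
  overflow-y: auto;
}

/* Keep header cells pinned while the table body scrolls */
.report-table-container thead th {
  position: sticky;
  top: 0;
  background: #fff;
  z-index: 1;
}
```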
### 4. Quality Code Display Enhancement

- **fg_quality.js** - Quality code `0` displays as "OK" in green; CSV exports it as "0" (see the sketch below)
- **script.js** - Same improvements for quality module reports
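A minimal sketch of the display-vs-export split (the function names are hypothetical; the real wiring lives in fg_quality.js and script.js):

```javascript
// For on-screen rendering: code 0 becomes a green "OK" label
function renderQualityCode(code) {
  if (String(code) === "0") {
    return '<span style="color: green;">OK</span>';
  }
  return String(code);
}

// For CSV export: keep the raw value so downstream tools still see "0"
function exportQualityCode(code) {
  return String(code);
}
```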
## Directory Structure

```
/srv/quality_app/
├── py_app/              # Application code (mapped to /app in container)
├── data/
│   └── mariadb/         # Database files
├── config/
│   └── instance/        # Application configuration
├── logs/                # Application logs
├── backups/             # Database backups
├── docker-compose.yml
├── Dockerfile
├── .env
└── restore_database.sh
```

## Environment Configuration

### Volume Mappings in .env

```
DB_DATA_PATH=/srv/quality_app/data/mariadb
APP_CODE_PATH=/srv/quality_app/py_app
LOGS_PATH=/srv/quality_app/logs
INSTANCE_PATH=/srv/quality_app/config/instance
BACKUP_PATH=/srv/quality_app/backups
```

## Features Implemented

### ✅ Backup System

- Automatic scheduled backups
- Manual backup creation
- Data-only backups
- Backup retention policies (see the sketch below)
- MariaDB client tools installed
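Retention can be as simple as pruning old dumps from the shared volume (a sketch; the 30-day window is an assumption):

```bash
# Delete backup files older than 30 days from the backups volume
find /srv/quality_app/backups -name "*.sql" -mtime +30 -delete
```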
### ✅ Restore System

- Python-based restore function
- Shell script restore with proper SQL parsing
- Handles the MariaDB dump format correctly

### ✅ UI Enhancements

- **Sticky headers**: Table headers remain fixed when scrolling
- **Quality code display**:
  - Shows "OK" in green for quality code 0
  - Exports "0" in CSV files
  - Better user experience

### ✅ Volume Mapping

- All volumes use absolute paths
- Support for /data folder mapping
- Easy to configure the backup location on different drives

## Starting the Application

```bash
cd /srv/quality_app
docker compose up -d --build
```

## Testing Backup & Restore

### Create Backup

```bash
cd /srv/quality_app
docker compose exec web bash -c "cd /app && python3 -c 'from app import create_app; from app.database_backup import DatabaseBackupManager; app = create_app();
with app.app_context(): bm = DatabaseBackupManager(); result = bm.create_backup(); print(result)'"
```

### Restore Backup

```bash
cd /srv/quality_app
./restore_database.sh /srv/quality_app/backups/backup_file.sql
```

## Notes

- Database initialization is set to `false` (already initialized)
- All improvements are production-ready
- The backup path can be changed to an external drive if needed
- Application port: 8781 (default)

## Next Steps

1. Review the .env file and update passwords if needed
2. Test all functionality after deployment
3. Configure a backup schedule if needed
4. Set up an external backup drive if desired

---

**Compatibility**: All changes are backward compatible with existing data.
**Status**: Ready for deployment
@@ -1,292 +0,0 @@
# Merge Compatibility Analysis: docker-deploy → master

## 📊 Merge Status: **SAFE TO MERGE** ✅

### Conflict Analysis

- **No merge conflicts detected** between the `master` and `docker-deploy` branches
- All changes are additive or modify existing code in compatible ways
- The docker-deploy branch adds 13 files with 1034 insertions and 117 deletions

### Files Changed

#### New Files (No conflicts):

1. `DOCKER_DEPLOYMENT_GUIDE.md` - Documentation
2. `IMPROVEMENTS_APPLIED.md` - Documentation
3. `quick-deploy.sh` - Deployment script
4. `restore_database.sh` - Restore script
5. `setup-volumes.sh` - Setup script

#### Modified Files:

1. `Dockerfile` - Added the mariadb-client package
2. `docker-compose.yml` - Added /data volume mapping and resource limits
3. `py_app/app/database_backup.py` - **CRITICAL: compatibility layer added**
4. `py_app/app/static/css/base.css` - Added sticky header styles
5. `py_app/app/static/fg_quality.js` - Quality code display enhancement
6. `py_app/app/static/script.js` - Quality code display enhancement
7. `py_app/app/templates/fg_scan.html` - Added report-table-container wrapper
8. `py_app/app/templates/scan.html` - Added report-table-container wrapper

---

## 🔧 Compatibility Layer: database_backup.py

### Problem Identified

The docker-deploy branch changed backup commands from `mysqldump` to `mariadb-dump` and added the `--skip-ssl` flag, which would break the application when running under a standard Gunicorn (non-Docker) deployment.

### Solution Implemented

Added environment detection and automatic command selection:

#### 1. Dynamic Command Detection

```python
def _detect_dump_command(self):
    """Detect which dump command is available (mariadb-dump or mysqldump)"""
    try:
        # Try mariadb-dump first (newer MariaDB versions)
        result = subprocess.run(['which', 'mariadb-dump'],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return 'mariadb-dump'

        # Fall back to mysqldump
        result = subprocess.run(['which', 'mysqldump'],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return 'mysqldump'

        # Default to mariadb-dump (will error if not available)
        return 'mariadb-dump'
    except Exception as e:
        print(f"Warning: Could not detect dump command: {e}")
        return 'mysqldump'  # Default fallback
```

#### 2. Conditional SSL Arguments

```python
def _get_ssl_args(self):
    """Get SSL arguments based on environment (Docker needs --skip-ssl)"""
    # Check if running in a Docker container
    if os.path.exists('/.dockerenv') or os.environ.get('DOCKER_CONTAINER'):
        return ['--skip-ssl']
    return []
```

#### 3. Updated Backup Command Building

```python
cmd = [
    self.dump_command,  # Uses the detected command (mariadb-dump or mysqldump)
    f"--host={self.config['host']}",
    f"--port={self.config['port']}",
    f"--user={self.config['user']}",
    f"--password={self.config['password']}",
]

# Add SSL args if needed (Docker environment)
cmd.extend(self._get_ssl_args())

# Add backup options
cmd.extend([
    '--single-transaction',
    '--skip-lock-tables',
    '--force',
    # ... other options
])
```

---

## 🎯 Deployment Scenarios

### Scenario 1: Docker Deployment (docker-compose)

**Environment detection:**

- ✅ The `/.dockerenv` file exists
- ✅ The `DOCKER_CONTAINER` environment variable is set in docker-compose.yml (see the sketch below)
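A sketch of how that variable can be set in docker-compose.yml (the value is arbitrary; the detection code above only checks that it is set):

```yaml
services:
  web:
    environment:
      - DOCKER_CONTAINER=1
```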
**Backup behavior:**

- Uses `mariadb-dump` (installed in the Dockerfile)
- Adds the `--skip-ssl` flag automatically
- Works correctly ✅

### Scenario 2: Standard Gunicorn Deployment (systemd service)

**Environment detection:**

- ❌ The `/.dockerenv` file does NOT exist
- ❌ The `DOCKER_CONTAINER` environment variable is NOT set

**Backup behavior:**

- Detects the available command: `mysqldump` or `mariadb-dump`
- Does NOT add the `--skip-ssl` flag
- Uses the system-installed MySQL/MariaDB client tools
- Works correctly ✅

### Scenario 3: Mixed Environment (External Database)

**Both deployment types can connect to:**

- An external MariaDB server
- A remote database instance
- A local database with proper SSL configuration

**Backup behavior:**

- Automatically adapts to the available tools
- SSL handling is based on container detection
- Works correctly ✅

---

## 🧪 Testing Plan

### Pre-Merge Testing

1. **Docker environment:**
   ```bash
   cd /srv/quality_app
   git checkout docker-deploy
   docker-compose up -d
   # Test backup via the web UI
   # Test scheduled backup
   # Test restore functionality
   ```

2. **Gunicorn environment:**
   ```bash
   # Stop Docker if running
   docker-compose down

   # Start with the systemd service (if available)
   sudo systemctl start trasabilitate

   # Test backup via the web UI
   # Test scheduled backup
   # Test restore functionality
   ```

3. **Command detection test:**
   ```bash
   # Inside the Docker container
   docker-compose exec web python3 -c "
   from app.database_backup import DatabaseBackupManager
   manager = DatabaseBackupManager()
   print(f'Dump command: {manager.dump_command}')
   print(f'SSL args: {manager._get_ssl_args()}')
   "

   # On the host system (if the MySQL client is installed)
   python3 -c "
   from app.database_backup import DatabaseBackupManager
   manager = DatabaseBackupManager()
   print(f'Dump command: {manager.dump_command}')
   print(f'SSL args: {manager._get_ssl_args()}')
   "
   ```

### Post-Merge Testing

1. Verify that both deployment methods still work
2. Test backup/restore in both environments
3. Verify that scheduled backups function correctly
4. Check error handling when tools are missing

---

## 📋 Merge Checklist

- [x] No merge conflicts detected
- [x] Compatibility layer implemented in `database_backup.py`
- [x] Environment detection for Docker vs Gunicorn
- [x] Dynamic command selection (mariadb-dump vs mysqldump)
- [x] Conditional SSL flag handling
- [x] UI improvements (sticky headers) are purely CSS/JS - no conflicts
- [x] Quality code display changes are frontend-only - no conflicts
- [x] New documentation files added - no conflicts
- [x] Docker-specific files don't affect the Gunicorn deployment

### Safe to Merge Because:

1. **Additive changes**: Most changes are new files or new features
2. **Backward compatible**: The code detects the environment and adapts
3. **No breaking changes**: The Gunicorn deployment still works without Docker
4. **Independent features**: The UI improvements work in any environment
5. **Fail-safe defaults**: Falls back to mysqldump if mariadb-dump is unavailable

---

## 🚀 Merge Process

### Recommended Steps:

```bash
cd /srv/quality_app

# 1. Ensure the working directory is clean
git status

# 2. Switch to the master branch
git checkout master

# 3. Pull the latest changes
git pull origin master

# 4. Merge docker-deploy (should be a clean merge)
git merge docker-deploy

# 5. Review the merge
git log --oneline -10

# 6. Test in the current environment
# (If using systemd, test the app)
# (If using Docker, test with docker-compose)

# 7. Push to the remote
git push origin master

# 8. Tag the release (optional)
git tag -a v2.0-docker -m "Docker deployment support with compatibility layer"
git push origin v2.0-docker
```

### Rollback Plan (if needed):

```bash
# If issues arise after the merge
git log --oneline -10              # Find the commit hash before the merge
git reset --hard <commit-hash-before-merge>
git push origin master --force     # Use with caution!

# Or revert the merge commit
git revert -m 1 <merge-commit-hash>
git push origin master
```

---

## 🎓 Key Improvements in the docker-deploy Branch

### 1. **Bug Fixes**

- Fixed the `result_success` variable error → `result.returncode == 0`
- Fixed restore SQL parsing with sed preprocessing (see the sketch below)
- Fixed the missing mariadb-client in the Docker container
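A hedged sketch of that kind of preprocessing; one known culprit is the sandbox directive that some newer mariadb-dump versions emit on the first line of a dump, though whether restore_database.sh targets exactly this is an assumption:

```bash
# Strip a MariaDB-only first-line directive that older mysql clients reject,
# then pipe the cleaned dump straight into the client
sed '1{/^\/\*!999999/d}' backup.sql | mysql -u trasabilitate -p trasabilitate
```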
### 2. **Docker Support**

- Complete Docker Compose setup
- Volume mapping for persistent data
- Health checks and resource limits
- Environment-based configuration

### 3. **UI Enhancements**

- Sticky table headers for scrollable reports
- Quality code 0 displays as "OK" (green)
- CSV export preserves the original "0" value

### 4. **Compatibility**

- Works in Docker AND traditional Gunicorn deployments
- Auto-detects the available backup tools
- Environment-aware SSL handling
- No breaking changes to existing functionality

---

## 📞 Support

If issues arise after the merge:

1. Check environment detection: `ls -la /.dockerenv`
2. Verify the backup tools: `which mysqldump mariadb-dump`
3. Review the logs: `docker-compose logs web` or the application logs
4. Test a backup manually from the command line
5. Fall back to the master branch if critical issues occur

---

**Last Updated:** 2025-11-13
**Branch:** docker-deploy → master
**Status:** Ready for merge ✅
Binary file not shown.
README.md (74 lines changed)
@@ -1,74 +0,0 @@
# Quality Recticel Application

Production traceability and quality management system.

## 📚 Documentation

All development and deployment documentation has been moved to the **[documentation](./documentation/)** folder.

### Quick Links

- **[Documentation Index](./documentation/README.md)** - Complete documentation overview
- **[Database Setup](./documentation/DATABASE_DOCKER_SETUP.md)** - Database configuration guide
- **[Docker Guide](./documentation/DOCKER_QUICK_REFERENCE.md)** - Docker commands reference
- **[Backup System](./documentation/BACKUP_SYSTEM.md)** - Database backup documentation

## 🚀 Quick Start

```bash
# Start application
cd /srv/quality_app/py_app
bash start_production.sh

# Stop application
bash stop_production.sh

# View logs
tail -f /srv/quality_app/logs/error.log
```

## 📦 Docker Deployment

```bash
# Start with Docker Compose
docker-compose up -d

# View logs
docker-compose logs -f web

# Stop services
docker-compose down
```

## 🔐 Default Access

- **URL**: http://localhost:8781
- **Username**: superadmin
- **Password**: superadmin123

## 📁 Project Structure

```
quality_app/
├── documentation/       # All documentation files
├── py_app/              # Flask application
├── backups/             # Database backups
├── logs/                # Application logs
├── docker-compose.yml   # Docker configuration
└── Dockerfile           # Container image definition
```

## 📖 For More Information

See the **[documentation](./documentation/)** folder for comprehensive guides on:

- Setup and deployment
- Docker configuration
- Database management
- Backup and restore procedures
- Application features

---

**Version**: 1.0.0
**Last Updated**: November 3, 2025
@@ -1,4 +0,0 @@
 * Serving Flask app 'app'
 * Debug mode: on
Address already in use
Port 8781 is in use by another program. Either identify and stop that program, or start the server with a different port.
@@ -184,7 +184,11 @@ class DailyMirrorDatabase:
            raise Exception("Could not read Excel file. Please ensure it has a 'Production orders Data' or 'DataSheet' sheet.")

        logger.info(f"Loaded production data from {sheet_used}: {len(df)} rows, {len(df.columns)} columns")
-       logger.info(f"First 5 column names: {list(df.columns)[:5]}")
+       logger.info(f"All column names: {list(df.columns)}")

+       # Log columns that have at least some non-null data
+       columns_with_data = [col for col in df.columns if df[col].notna().any()]
+       logger.info(f"Columns with data ({len(columns_with_data)}): {columns_with_data}")
+
        cursor = self.connection.cursor()
        success_count = 0
@@ -235,6 +239,10 @@ class DailyMirrorDatabase:

        for index, row in df.iterrows():
            try:
+               # Skip rows where production order is empty
+               if pd.isna(row.get('Comanda Productie')) or str(row.get('Comanda Productie')).strip() == '':
+                   continue
+
                # Create concatenated fields with dash separator
                opened_for_order = str(row.get('Opened for Order', '')).strip() if pd.notna(row.get('Opened for Order')) else ''
                linia = str(row.get('Linia', '')).strip() if pd.notna(row.get('Linia')) else ''
@@ -269,6 +277,8 @@ class DailyMirrorDatabase:
                # Prepare data tuple
                data = (
                    safe_str(row.get('Comanda Productie')),   # production_order
+                   safe_str(row.get('Opened for Order')),    # production_order_line
+                   safe_str(row.get('Linia')),               # line_number
                    open_for_order_line,                      # open_for_order_line (concatenated)
                    client_order_line,                        # client_order_line (concatenated)
                    safe_str(row.get('Cod. Client')),         # customer_code
@@ -10,6 +10,7 @@ import os
import json
import tempfile
from datetime import datetime
+import pandas as pd

def get_db_connection():
    """Get database connection using external server configuration"""
@@ -73,8 +74,15 @@ def validate_order_row(row_data):
    data_livrare = row_data.get('data_livrare', '').strip()
    if data_livrare:
        try:
-           # Try to parse common date formats
-           for date_format in ['%Y-%m-%d', '%d/%m/%Y', '%m/%d/%Y', '%d.%m.%Y']:
+           # Try to parse common date formats including Excel datetime format
+           date_formats = [
+               '%Y-%m-%d',           # 2024-03-12
+               '%Y-%m-%d %H:%M:%S',  # 2024-03-12 00:00:00 (Excel format)
+               '%d/%m/%Y',           # 12/03/2024
+               '%m/%d/%Y',           # 03/12/2024
+               '%d.%m.%Y'            # 12.03.2024
+           ]
+           for date_format in date_formats:
                try:
                    datetime.strptime(data_livrare, date_format)
                    break
@@ -118,8 +126,15 @@ def add_order_to_database(order_data):
    data_livrare_str = order_data.get('data_livrare', '').strip()
    if data_livrare_str:
        try:
-           # Try to parse common date formats and convert to YYYY-MM-DD
-           for date_format in ['%Y-%m-%d', '%d/%m/%Y', '%m/%d/%Y', '%d.%m.%Y']:
+           # Try to parse common date formats including Excel datetime and convert to YYYY-MM-DD
+           date_formats = [
+               '%Y-%m-%d',           # 2024-03-12
+               '%Y-%m-%d %H:%M:%S',  # 2024-03-12 00:00:00 (Excel format)
+               '%d/%m/%Y',           # 12/03/2024
+               '%m/%d/%Y',           # 03/12/2024
+               '%d.%m.%Y'            # 12.03.2024
+           ]
+           for date_format in date_formats:
                try:
                    parsed_date = datetime.strptime(data_livrare_str, date_format)
                    data_livrare_value = parsed_date.strftime('%Y-%m-%d')
@@ -167,6 +182,141 @@ def add_order_to_database(order_data):
    except Exception as e:
        return False, f"Unexpected error: {str(e)}"

+def process_excel_file(file_path):
+    """
+    Process uploaded Excel file (.xlsx) and return parsed data with validation
+    Returns: (orders_data: list, validation_errors: list, validation_warnings: list)
+    """
+    orders_data = []
+    all_errors = []
+    all_warnings = []
+
+    try:
+        # Read Excel file - try 'Sheet1' first (common data sheet), then fall back to the first sheet
+        try:
+            df = pd.read_excel(file_path, sheet_name='Sheet1', engine='openpyxl')
+        except:
+            try:
+                df = pd.read_excel(file_path, sheet_name=0, engine='openpyxl')
+            except:
+                # Last resort - try 'DataSheet'
+                df = pd.read_excel(file_path, sheet_name='DataSheet', engine='openpyxl')
+
+        # Column mapping for Excel files (case-insensitive)
+        # Maps Excel column names to database field names
+        column_mapping = {
+            # Core order fields
+            'comanda productie': 'comanda_productie',
+            'comanda_productie': 'comanda_productie',
+            'cod articol': 'cod_articol',
+            'cod_articol': 'cod_articol',
+            'descriere': 'descr_com_prod',
+            'descr. com. prod': 'descr_com_prod',
+            'descr com prod': 'descr_com_prod',
+            'descr_com_prod': 'descr_com_prod',
+            'description': 'descr_com_prod',
+            'cantitate': 'cantitate',
+            'cantitate ceruta': 'cantitate',
+            'quantity': 'cantitate',
+            'datalivrare': 'data_livrare',
+            'data livrare': 'data_livrare',
+            'data_livrare': 'data_livrare',
+            'delivery date': 'data_livrare',
+            'dimensiune': 'dimensiune',
+            'dimension': 'dimensiune',
+
+            # Customer and order info
+            'customer': 'customer_name',
+            'customer name': 'customer_name',
+            'customer_name': 'customer_name',
+            'comanda client': 'com_achiz_client',
+            'com.achiz.client': 'com_achiz_client',
+            'com achiz client': 'com_achiz_client',
+            'com_achiz_client': 'com_achiz_client',
+            'customer article number': 'customer_article_number',
+            'customer_article_number': 'customer_article_number',
+
+            # Status and dates
+            'status': 'status',
+            'end of quilting': 'end_of_quilting',
+            'end of sewing': 'end_of_sewing',
+            'data deschiderii': 'data_deschiderii',
+            'data planific.': 'data_planific',
+            'data planific': 'data_planific',
+
+            # Machine and production info
+            'masina cusut': 'masina_cusut',
+            'masina cusut ': 'masina_cusut',  # Note trailing space in Excel
+            'tip masina': 'tip_masina',
+            'numar masina': 'numar_masina',
+            'clasificare': 'clasificare',
+            'timp normat total': 'timp_normat_total',
+
+            # Quality control stages (T1, T2, T3)
+            't1': 't1',
+            'data inregistrare t1': 'data_inregistrare_t1',
+            'numele complet t1': 'numele_complet_t1',
+            't2': 't2',
+            'data inregistrare t2': 'data_inregistrare_t2',
+            'numele complet t2': 'numele_complet_t2',
+            't3': 't3',
+            'data inregistrare t3': 'data_inregistrare_t3',
+            'numele complet t3': 'numele_complet_t3',
+
+            # Design and model info
+            'model lb2': 'model_lb2',
+            'design nr': 'design_nr',
+            'needle position': 'needle_position',
+
+            # Line references
+            'nr. linie com. client': 'nr_linie_com_client',
+            'nr linie com client': 'nr_linie_com_client',
+            'nr_linie_com_client': 'nr_linie_com_client',
+            'line': 'line_number',
+            'line_number': 'line_number',
+            'open for order': 'open_for_order',
+            'open_for_order': 'open_for_order'
+        }
+
+        # Normalize column names
+        df.columns = [col.lower().strip() if col else f'col_{i}' for i, col in enumerate(df.columns)]
+
+        # Process each row
+        for idx, row in df.iterrows():
+            # Skip empty rows
+            if row.isna().all():
+                continue
+
+            # Create normalized row data
+            normalized_row = {}
+            for col_name in df.columns:
+                col_key = col_name.lower().strip()
+                mapped_key = column_mapping.get(col_key, col_key.replace(' ', '_').replace('.', ''))
+
+                # Get value and convert to string, handle NaN
+                value = row[col_name]
+                if pd.isna(value):
+                    normalized_row[mapped_key] = ''
+                else:
+                    normalized_row[mapped_key] = str(value).strip()
+
+            # Validate the row
+            errors, warnings = validate_order_row(normalized_row)
+
+            if errors:
+                all_errors.extend([f"Row {idx + 2}: {error}" for error in errors])
+            else:
+                # Only add valid rows
+                orders_data.append(normalized_row)
+
+            if warnings:
+                all_warnings.extend([f"Row {idx + 2}: {warning}" for warning in warnings])
+
+    except Exception as e:
+        all_errors.append(f"Error processing Excel file: {str(e)}")
+
+    return orders_data, all_errors, all_warnings
+
def process_csv_file(file_path):
    """
    Process uploaded CSV file and return parsed data with validation
@@ -268,7 +418,7 @@ def upload_orders_handler():
    if request.method == 'POST':
        # Handle file upload
        file = request.files.get('csv_file')
-       if file and file.filename.endswith(('.csv', '.CSV')):
+       if file and file.filename.endswith(('.csv', '.CSV', '.xlsx', '.XLSX', '.xls', '.XLS')):
            try:
                # Save uploaded file
                temp_path = os.path.join(temp_dir, file.filename)
@@ -278,8 +428,11 @@ def upload_orders_handler():
                session['csv_filename'] = file.filename
                session['orders_csv_filepath'] = temp_path

-               # Process the CSV file
-               orders_data, validation_errors, validation_warnings = process_csv_file(temp_path)
+               # Process the file based on extension
+               if file.filename.lower().endswith(('.xlsx', '.xls')):
+                   orders_data, validation_errors, validation_warnings = process_excel_file(temp_path)
+               else:
+                   orders_data, validation_errors, validation_warnings = process_csv_file(temp_path)

                # Store processed data in session
                session['orders_csv_data'] = orders_data
@@ -7,8 +7,13 @@ def get_db_connection():
    settings = {}
    with open(settings_file, 'r') as f:
        for line in f:
-           key, value = line.strip().split('=', 1)
-           settings[key] = value
+           line = line.strip()
+           # Skip empty lines and comments
+           if not line or line.startswith('#'):
+               continue
+           if '=' in line:
+               key, value = line.split('=', 1)
+               settings[key] = value
    return mariadb.connect(
        user=settings['username'],
        password=settings['password'],
@@ -23,6 +28,10 @@ def get_unprinted_orders_data(limit=100):
    Returns list of order dictionaries where printed_labels != 1
    """
    try:
+       import sys
+       sys.stderr.write(f"DEBUG print_module: get_unprinted_orders_data called with limit={limit}\n")
+       sys.stderr.flush()
+
        conn = get_db_connection()
        cursor = conn.cursor()
@@ -30,8 +39,14 @@ def get_unprinted_orders_data(limit=100):
        cursor.execute("SHOW COLUMNS FROM order_for_labels LIKE 'printed_labels'")
        column_exists = cursor.fetchone()

+       sys.stderr.write(f"DEBUG print_module: printed_labels column exists={bool(column_exists)}\n")
+       sys.stderr.flush()
+
        if column_exists:
            # Use printed_labels column
+           sys.stderr.write(f"DEBUG print_module: Executing query with printed_labels != 1\n")
+           sys.stderr.flush()
+
            cursor.execute("""
                SELECT id, comanda_productie, cod_articol, descr_com_prod, cantitate,
                       com_achiz_client, nr_linie_com_client, customer_name,
@@ -43,6 +58,9 @@ def get_unprinted_orders_data(limit=100):
                LIMIT %s
            """, (limit,))
        else:
+           sys.stderr.write(f"DEBUG print_module: Executing fallback query (no printed_labels column)\n")
+           sys.stderr.flush()
+
            # Fallback: get all orders if no printed_labels column
            cursor.execute("""
                SELECT id, comanda_productie, cod_articol, descr_com_prod, cantitate,
@@ -55,7 +73,21 @@ def get_unprinted_orders_data(limit=100):
            """, (limit,))

        orders = []
-       for row in cursor.fetchall():
+       rows = cursor.fetchall()
+       sys.stderr.write(f"DEBUG print_module: Query returned {len(rows)} rows\n")
+       sys.stderr.flush()
+
+       # Also write to file for debugging
+       try:
+           with open('/app/print_module_debug.log', 'w') as f:
+               f.write(f"Query returned {len(rows)} rows\n")
+               f.write(f"Column exists: {column_exists}\n")
+               if rows:
+                   f.write(f"First row: {rows[0]}\n")
+       except:
+           pass
+
+       for row in rows:
            if column_exists:
                orders.append({
                    'id': row[0],
@@ -100,6 +132,21 @@ def get_unprinted_orders_data(limit=100):
        return orders

    except Exception as e:
+       import sys
+       import traceback
+       error_trace = traceback.format_exc()
+
+       sys.stderr.write(f"ERROR in get_unprinted_orders_data: {e}\n{error_trace}\n")
+       sys.stderr.flush()
+
+       # Write to file
+       try:
+           with open('/app/print_module_error.log', 'w') as f:
+               f.write(f"ERROR: {e}\n")
+               f.write(f"Traceback:\n{error_trace}\n")
+       except:
+           pass
+
        print(f"Error retrieving unprinted orders: {e}")
        return []
@@ -1717,45 +1717,203 @@ def etichete():
@requires_labels_module
def upload_data():
    if request.method == 'POST':
+       import sys
+       sys.stdout.flush()
+
+       # Write to file to ensure we can see it
+       try:
+           with open('/app/request.log', 'a') as f:
+               from datetime import datetime
+               f.write(f"\n{'='*80}\n")
+               f.write(f"POST REQUEST at {datetime.now()}\n")
+               f.write(f"Form data: {dict(request.form)}\n")
+               f.write(f"Files: {list(request.files.keys())}\n")
+       except:
+           pass
+
+       sys.stderr.write(f"DEBUG: POST request received for upload_data\n")
+       sys.stderr.flush()
        action = request.form.get('action', 'preview')
+       sys.stderr.write(f"DEBUG: Action = {action}\n")
+       sys.stderr.flush()
+
        if action == 'preview':
            # Handle file upload and show preview
+           print(f"DEBUG: Processing preview action")
+           print(f"DEBUG: Files in request: {list(request.files.keys())}")
+
            if 'file' not in request.files:
+               print(f"DEBUG: No file in request.files")
                flash('No file selected', 'error')
                return redirect(request.url)

            file = request.files['file']
+           print(f"DEBUG: File received: {file.filename}")

            if file.filename == '':
+               print(f"DEBUG: Empty filename")
                flash('No file selected', 'error')
                return redirect(request.url)

-           if file and file.filename.lower().endswith('.csv'):
+           filename_lower = file.filename.lower()
+           print(f"DEBUG: Filename lowercase: {filename_lower}")
+
+           # Handle both CSV and Excel files
+           if file and (filename_lower.endswith('.csv') or filename_lower.endswith('.xlsx') or filename_lower.endswith('.xls')):
                try:
-                   # Read CSV file
-                   import csv
-                   import io
-
-                   # Read the file content
-                   stream = io.StringIO(file.stream.read().decode("UTF8"), newline=None)
-                   csv_input = csv.DictReader(stream)
-
-                   # Convert to list for preview
                    preview_data = []
                    headers = []

-                   for i, row in enumerate(csv_input):
-                       if i == 0:
-                           headers = list(row.keys())
-                       if i < 10:  # Show only first 10 rows for preview
-                           preview_data.append(row)
-                       else:
-                           break
-
-                   # Store the full file content in session for later processing
-                   file.stream.seek(0)  # Reset file pointer
-                   session['csv_content'] = file.stream.read().decode("UTF8")
-                   session['csv_filename'] = file.filename
+                   if filename_lower.endswith('.csv'):
+                       # Read CSV file
+                       import csv
+                       import io
+
+                       # Read the file content
+                       stream = io.StringIO(file.stream.read().decode("UTF8"), newline=None)
+                       csv_input = csv.DictReader(stream)
+
+                       # Define the fields that are stored in the database
+                       database_fields = [
+                           'comanda_productie', 'cod_articol', 'descr_com_prod', 'cantitate',
+                           'data_livrare', 'dimensiune', 'com_achiz_client', 'nr_linie_com_client',
+                           'customer_name', 'customer_article_number', 'open_for_order', 'line_number'
+                       ]
+
+                       # Convert to list for preview
+                       all_rows = []
+                       for i, row in enumerate(csv_input):
+                           all_rows.append(row)
+                           if i == 0:
+                               # Get all available fields from CSV
+                               all_fields = list(row.keys())
+                               # Filter to only database fields
+                               headers = [field for field in database_fields if field in all_fields or
+                                          any(field.lower() == k.lower().replace(' ', '_').replace('.', '') for k in all_fields)]
+                           if i < 10:  # Show only first 10 rows for preview
+                               # Filter row to only show database fields
+                               filtered_row = {k: v for k, v in row.items() if k.lower().replace(' ', '_').replace('.', '') in database_fields}
+                               preview_data.append(filtered_row)
+
+                       # If no headers were set, use all available
+                       if not headers and all_rows:
+                           headers = list(all_rows[0].keys())
+                           preview_data = all_rows[:10]
+
+                       # Store the full file content in a temp file instead of session
+                       import uuid
+
+                       upload_id = str(uuid.uuid4())
+                       temp_data_file = f'/tmp/upload_{upload_id}.csv'
+
+                       file.stream.seek(0)  # Reset file pointer
+                       with open(temp_data_file, 'wb') as f:
+                           f.write(file.stream.read())
+
+                       # Store only the file reference in session
+                       session['upload_id'] = upload_id
+                       session['csv_filename'] = file.filename
+                       session['file_type'] = 'csv'
+                       session.modified = True
+
+                   else:  # Excel file
+                       print(f"DEBUG: Processing Excel file: {file.filename}")
+                       import sys
+                       sys.stderr.write(f"DEBUG: Processing Excel file: {file.filename}\n")
+                       sys.stderr.flush()
+
+                       import os
+                       import tempfile
+                       from app.order_labels import process_excel_file
+
+                       # Save uploaded file temporarily
+                       temp_file = tempfile.NamedTemporaryFile(delete=False, suffix=os.path.splitext(file.filename)[1])
+                       print(f"DEBUG: Created temp file: {temp_file.name}")
+                       sys.stderr.write(f"DEBUG: Created temp file: {temp_file.name}\n")
+                       sys.stderr.flush()
+
+                       file.save(temp_file.name)
+                       temp_file.close()
+                       print(f"DEBUG: Saved file to temp location")
+                       sys.stderr.write(f"DEBUG: Saved file to temp location\n")
+                       sys.stderr.flush()
+
+                       # Process Excel file
+                       print(f"DEBUG: Calling process_excel_file()")
+                       orders_data, errors, warnings = process_excel_file(temp_file.name)
+                       print(f"DEBUG: Process complete - orders: {len(orders_data)}, errors: {len(errors)}, warnings: {len(warnings)}")
+
+                       # Clean up temp file
+                       os.unlink(temp_file.name)
+
+                       if errors:
+                           for error in errors[:10]:
+                               flash(error, 'error')
+                           if len(errors) > 10:
+                               flash(f'... and {len(errors) - 10} more errors', 'error')
+
+                       if warnings:
+                           for warning in warnings[:5]:
+                               flash(warning, 'warning')
+
+                       if not orders_data:
+                           import sys
+                           sys.stderr.write(f"ERROR: No valid orders data found. Errors: {len(errors)}\n")
+                           sys.stderr.flush()
+
+                           # Write to file
+                           try:
+                               with open('/app/upload_error.log', 'a') as f:
+                                   from datetime import datetime
+                                   f.write(f"\n{'='*80}\n")
+                                   f.write(f"NO VALID DATA at {datetime.now()}\n")
+                                   f.write(f"File: {file.filename}\n")
+                                   f.write(f"Errors ({len(errors)}):\n")
+                                   for err in errors[:20]:
+                                       f.write(f"  - {err}\n")
+                                   f.write(f"Warnings ({len(warnings)}):\n")
+                                   for warn in warnings[:20]:
+                                       f.write(f"  - {warn}\n")
+                           except:
+                               pass
+
+                           flash('No valid data found in Excel file', 'error')
+                           return redirect(request.url)
+
+                       # Get headers from first row - only show fields that will be stored in database
+                       if orders_data:
+                           # Define the fields that are stored in the database
+                           database_fields = [
+                               'comanda_productie', 'cod_articol', 'descr_com_prod', 'cantitate',
+                               'data_livrare', 'dimensiune', 'com_achiz_client', 'nr_linie_com_client',
+                               'customer_name', 'customer_article_number', 'open_for_order', 'line_number'
+                           ]
+
+                           # Filter headers to only include database fields that exist in data
+                           all_fields = list(orders_data[0].keys())
+                           headers = [field for field in database_fields if field in all_fields]
+
+                           # Filter preview data to only show database fields
+                           preview_data = []
+                           for order in orders_data[:10]:
+                               filtered_order = {k: v for k, v in order.items() if k in headers}
+                               preview_data.append(filtered_order)
+
+                           # Store data in a temporary file instead of session (session is too small for large datasets)
+                           import json
+                           import uuid
+
+                           upload_id = str(uuid.uuid4())
+                           temp_data_file = f'/tmp/upload_{upload_id}.json'
+
+                           with open(temp_data_file, 'w') as f:
+                               json.dump(orders_data, f)
+
+                           # Store only the file reference in session
+                           session['upload_id'] = upload_id
+                           session['csv_filename'] = file.filename
+                           session['file_type'] = 'excel'
+                           session.modified = True

                    return render_template('upload_orders.html',
                                           preview_data=preview_data,
@@ -1764,27 +1922,112 @@ def upload_data():
                                        filename=file.filename)

             except Exception as e:
-                flash(f'Error reading CSV file: {str(e)}', 'error')
+                import traceback
+                import sys
+                error_trace = traceback.format_exc()
+                print(f"ERROR processing file: {error_trace}")
+                sys.stderr.write(f"ERROR processing file: {error_trace}\n")
+                sys.stderr.flush()
+
+                # Also write to a file
+                try:
+                    with open('/app/upload_error.log', 'a') as f:
+                        from datetime import datetime
+                        f.write(f"\n{'='*80}\n")
+                        f.write(f"ERROR at {datetime.now()}\n")
+                        f.write(f"File: {file.filename if file else 'unknown'}\n")
+                        f.write(f"Error: {str(e)}\n")
+                        f.write(f"Traceback:\n{error_trace}\n")
+                except:
+                    pass
+
+                flash(f'Error reading file: {str(e)}', 'error')
                 return redirect(request.url)
         else:
-            flash('Please upload a CSV file', 'error')
+            flash('Please upload a CSV or Excel file (.csv, .xlsx, .xls)', 'error')
             return redirect(request.url)

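The `else:` branch above implies an extension gate on the uploaded filename; the actual check sits outside this hunk, so the following is a hedged sketch with an illustrative `allowed_file` helper:

```python
# Sketch only: the extension check implied by the branch above. ALLOWED_EXTENSIONS
# and allowed_file are illustrative names, not taken from the codebase.
import os

ALLOWED_EXTENSIONS = {'.csv', '.xlsx', '.xls'}

def allowed_file(filename: str) -> bool:
    """True if the filename carries one of the accepted upload extensions."""
    return os.path.splitext(filename.lower())[1] in ALLOWED_EXTENSIONS

assert allowed_file('orders.XLSX') and not allowed_file('orders.pdf')
```
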
     elif action == 'save':
         # Save the data to database
-        if 'csv_content' not in session:
+        import sys
+        sys.stderr.write("DEBUG: Save action triggered\n")
+        sys.stderr.flush()
+
+        # Log to file immediately
+        try:
+            with open('/app/save_check.log', 'a') as f:
+                from datetime import datetime
+                f.write(f"\n{'='*80}\n")
+                f.write(f"SAVE ACTION at {datetime.now()}\n")
+                f.write(f"Session keys: {list(session.keys())}\n")
+                f.write(f"Session file_type: {session.get('file_type', 'NOT SET')}\n")
+                f.write(f"Has csv_content: {'csv_content' in session}\n")
+                f.write(f"Has orders_data: {'orders_data' in session}\n")
+        except Exception as log_err:
+            sys.stderr.write(f"Error writing log: {log_err}\n")
+
+        file_type = session.get('file_type')
+        upload_id = session.get('upload_id')
+
+        sys.stderr.write(f"DEBUG: File type = {file_type}, upload_id = {upload_id}\n")
+        sys.stderr.flush()
+
+        if not file_type or not upload_id:
+            sys.stderr.write("DEBUG: Missing file_type or upload_id in session\n")
+            sys.stderr.flush()
+
+            try:
+                with open('/app/save_check.log', 'a') as f:
+                    f.write("ERROR: Missing upload data in session - redirecting\n")
+            except:
+                pass
+
             flash('No data to save. Please upload a file first.', 'error')
             return redirect(request.url)

         try:
-            import csv
-            import io
-            print(f"DEBUG: Starting CSV upload processing...")
-            # Read the CSV content from session
-            stream = io.StringIO(session['csv_content'], newline=None)
-            csv_input = csv.DictReader(stream)
+            print(f"DEBUG: Starting {file_type.upper()} upload processing...")
+            sys.stderr.write(f"DEBUG: Starting {file_type.upper()} upload processing...\n")
+            sys.stderr.flush()
+
+            # Log to file
+            try:
+                with open('/app/save_debug.log', 'a') as f:
+                    from datetime import datetime
+                    f.write(f"\n{'='*80}\n")
+                    f.write(f"SAVE START at {datetime.now()}\n")
+                    f.write(f"File type: {file_type}\n")
+                    f.write(f"Session keys: {list(session.keys())}\n")
+            except:
+                pass
+
+            # Get orders data from temp file
+            import json
+            temp_data_file = f'/tmp/upload_{upload_id}.{"json" if file_type == "excel" else "csv"}'
+
+            if file_type == 'excel':
+                with open(temp_data_file, 'r') as f:
+                    orders_list = json.load(f)
+
+                # Log
+                try:
+                    with open('/app/save_debug.log', 'a') as f:
+                        f.write(f"Loaded {len(orders_list)} orders from temp file (Excel)\n")
+                except:
+                    pass
+            else:
+                # Read the CSV content from temp file
+                import csv
+                with open(temp_data_file, 'r') as f:
+                    csv_input = csv.DictReader(f)
+                    orders_list = list(csv_input)
+
+                # Log
+                try:
+                    with open('/app/save_debug.log', 'a') as f:
+                        f.write(f"Loaded {len(orders_list)} orders from temp file (CSV)\n")
+                except:
+                    pass

             # Connect to database
             conn = get_db_connection()
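
A note on the CSV branch above: `csv.DictReader` is a lazy iterator over the open file handle, which is why `list(csv_input)` must run inside the `with` block. A self-contained illustration:

```python
# Why list(csv_input) sits inside the `with` block above: csv.DictReader reads
# lazily from the underlying handle, so rows must be materialized before close.
import csv
import io

buf = io.StringIO("a,b\n1,2\n3,4\n")
reader = csv.DictReader(buf)  # nothing is read yet
rows = list(reader)           # consumes the stream while it is still open
assert rows == [{'a': '1', 'b': '2'}, {'a': '3', 'b': '4'}]
```
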
@@ -1794,10 +2037,10 @@ def upload_data():
             error_count = 0
             errors = []

-            print(f"DEBUG: Connected to database, processing rows...")
+            print(f"DEBUG: Connected to database, processing {len(orders_list)} rows...")

             # Process each row
-            for index, row in enumerate(csv_input):
+            for index, row in enumerate(orders_list):
                 try:
                     print(f"DEBUG: Processing row {index + 1}: {row}")

@@ -1824,10 +2067,18 @@ def upload_data():
                     # Convert empty string to None for date field
                     if data_livrare:
                         try:
-                            # Parse date from various formats (9/23/2023, 23/9/2023, 2023-09-23, etc.)
+                            # Parse date from various formats including Excel datetime format
                             from datetime import datetime
                             # Try different date formats
-                            date_formats = ['%m/%d/%Y', '%d/%m/%Y', '%Y-%m-%d', '%m-%d-%Y', '%d-%m-%Y']
+                            date_formats = [
+                                '%Y-%m-%d',           # 2024-03-12
+                                '%Y-%m-%d %H:%M:%S',  # 2024-03-12 00:00:00 (Excel format)
+                                '%m/%d/%Y',           # 03/12/2024
+                                '%d/%m/%Y',           # 12/03/2024
+                                '%m-%d-%Y',           # 03-12-2024
+                                '%d-%m-%Y',           # 12-03-2024
+                                '%d.%m.%Y'            # 12.03.2024
+                            ]
                             parsed_date = None
                             for fmt in date_formats:
                                 try:
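
The format list above is consumed by a first-match `strptime` loop. For reference, a standalone sketch of that parsing strategy; `parse_delivery_date` is an illustrative name, not a function from the codebase:

```python
# Minimal sketch of the multi-format date parsing shown above; the helper
# name is illustrative, not taken from the codebase.
from datetime import datetime

DATE_FORMATS = [
    '%Y-%m-%d', '%Y-%m-%d %H:%M:%S',  # ISO date and Excel's serialized datetime
    '%m/%d/%Y', '%d/%m/%Y',
    '%m-%d-%Y', '%d-%m-%Y', '%d.%m.%Y',
]

def parse_delivery_date(value):
    """Return a date for the first matching format, or None if nothing fits."""
    text = str(value).strip()
    if not text:
        return None
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(text, fmt).date()
        except ValueError:
            continue
    return None

# Both spellings resolve to the same date:
assert parse_delivery_date('2024-03-12 00:00:00') == parse_delivery_date('12.03.2024')
```
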
@@ -1897,9 +2148,39 @@ def upload_data():

             print(f"DEBUG: Committed {inserted_count} records to database")

-            # Clear session data
-            session.pop('csv_content', None)
+            # Log the result
+            import sys
+            try:
+                with open('/app/upload_success.log', 'a') as f:
+                    from datetime import datetime
+                    f.write(f"\n{'='*80}\n")
+                    f.write(f"UPLOAD COMPLETED at {datetime.now()}\n")
+                    f.write(f"File type: {file_type}\n")
+                    f.write(f"Total rows processed: {len(orders_list)}\n")
+                    f.write(f"Successfully inserted: {inserted_count}\n")
+                    f.write(f"Errors: {error_count}\n")
+                    if errors:
+                        f.write(f"First 10 errors:\n")
+                        for err in errors[:10]:
+                            f.write(f" - {err}\n")
+            except:
+                pass
+
+            sys.stderr.write(f"DEBUG: Upload complete - inserted {inserted_count}, errors {error_count}\n")
+            sys.stderr.flush()
+
+            # Clear session data and remove temp file
+            import os
+            temp_file_path = f'/tmp/upload_{upload_id}.{"json" if file_type == "excel" else "csv"}'
+            try:
+                if os.path.exists(temp_file_path):
+                    os.unlink(temp_file_path)
+            except:
+                pass
+
+            session.pop('upload_id', None)
             session.pop('csv_filename', None)
+            session.pop('file_type', None)

             # Show results
             if error_count > 0:
@@ -1912,6 +2193,24 @@ def upload_data():
                 flash(f'Successfully uploaded {inserted_count} orders for labels', 'success')

         except Exception as e:
+            import sys
+            import traceback
+            error_trace = traceback.format_exc()
+
+            # Log the error
+            try:
+                with open('/app/upload_error.log', 'a') as f:
+                    from datetime import datetime
+                    f.write(f"\n{'='*80}\n")
+                    f.write(f"SAVE ERROR at {datetime.now()}\n")
+                    f.write(f"Error: {str(e)}\n")
+                    f.write(f"Traceback:\n{error_trace}\n")
+            except:
+                pass
+
+            sys.stderr.write(f"ERROR in save: {error_trace}\n")
+            sys.stderr.flush()
+
             flash(f'Error processing data: {str(e)}', 'error')

         return redirect(url_for('main.upload_data'))
@@ -3092,15 +3391,46 @@ def get_unprinted_orders():
     # return jsonify({'error': 'Access denied. Required roles: superadmin, warehouse_manager, etichete'}), 403

     try:
-        print("DEBUG: Calling get_unprinted_orders_data()")
+        import sys
+        sys.stderr.write("DEBUG: Calling get_unprinted_orders_data()\n")
+        sys.stderr.flush()
+
         data = get_unprinted_orders_data()
-        print(f"DEBUG: Retrieved {len(data)} orders")
+
+        sys.stderr.write(f"DEBUG: Retrieved {len(data)} orders\n")
+        sys.stderr.flush()
+
+        # Write to file
+        try:
+            with open('/app/unprinted_debug.log', 'w') as f:
+                from datetime import datetime
+                f.write(f"DEBUG at {datetime.now()}\n")
+                f.write(f"Retrieved {len(data)} orders\n")
+                if data:
+                    f.write(f"First order: {data[0]}\n")
+        except:
+            pass
+
         return jsonify(data)

     except Exception as e:
-        print(f"DEBUG: Error in get_unprinted_orders: {e}")
+        import sys
         import traceback
-        traceback.print_exc()
+        error_trace = traceback.format_exc()
+
+        sys.stderr.write(f"DEBUG: Error in get_unprinted_orders: {e}\n{error_trace}\n")
+        sys.stderr.flush()
+
+        # Write to file
+        try:
+            with open('/app/unprinted_debug.log', 'w') as f:
+                from datetime import datetime
+                f.write(f"ERROR at {datetime.now()}\n")
+                f.write(f"Error: {e}\n")
+                f.write(f"Traceback:\n{error_trace}\n")
+        except:
+            pass
+
         return jsonify({'error': str(e)}), 500

 @bp.route('/generate_labels_pdf/<int:order_id>', methods=['POST'])
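
The routes above all hand-roll the same `open()`/`f.write()`/bare-`except` logging. As a hedged aside, the standard library's `logging` module gives equivalent file output with timestamps and traceback capture built in. A sketch only, with an illustrative logger name and a relative log path (the diff itself writes under `/app`):

```python
# Sketch: the ad-hoc open/write/except-pass blocks above, expressed with the
# standard logging module. Logger name and file path are illustrative.
import logging

log = logging.getLogger('quality_app.uploads')
handler = logging.FileHandler('upload_debug.log')  # the diff uses /app/*.log
handler.setFormatter(logging.Formatter('%(asctime)s %(levelname)s %(message)s'))
log.addHandler(handler)
log.setLevel(logging.DEBUG)

log.debug('Retrieved %d orders', 42)

try:
    raise ValueError('demo failure')
except ValueError:
    # .exception() records the active traceback, like traceback.format_exc() above
    log.exception('Error in get_unprinted_orders')
```
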
@@ -2,6 +2,50 @@

 {% block head %}
 <!-- Print Module CSS is now loaded via base.html for all printing pages -->
+<style>
+    /* Compact table styling for print_lost_labels page */
+    .print-lost-labels-compact .scan-table.print-module-table {
+        font-size: 10px;
+    }
+
+    .print-lost-labels-compact .scan-table.print-module-table thead th {
+        font-size: 10px;
+        padding: 6px 8px;
+        line-height: 1.2;
+    }
+
+    .print-lost-labels-compact .scan-table.print-module-table tbody td {
+        font-size: 9px;
+        padding: 4px 6px;
+        line-height: 1.3;
+    }
+
+    /* Keep important data slightly larger and bold */
+    .print-lost-labels-compact .scan-table.print-module-table tbody td:nth-child(2) {
+        font-size: 10px;
+        font-weight: 600;
+    }
+
+    /* Make numbers more compact */
+    .print-lost-labels-compact .scan-table.print-module-table tbody td:nth-child(5),
+    .print-lost-labels-compact .scan-table.print-module-table tbody td:nth-child(9),
+    .print-lost-labels-compact .scan-table.print-module-table tbody td:nth-child(13) {
+        font-size: 9px;
+    }
+
+    /* Reduce row height */
+    .print-lost-labels-compact .scan-table.print-module-table tbody tr {
+        height: auto;
+        min-height: 24px;
+    }
+
+    /* Adjust header title */
+    .print-lost-labels-compact .card.scan-table-card h3 {
+        font-size: 16px;
+        padding: 8px 0;
+        margin-bottom: 8px;
+    }
+</style>
 {% endblock %}

 {% block content %}
@@ -13,7 +57,7 @@
 </div>

 <!-- ROW 1: Search Card (full width) -->
-<div class="scan-container lost-labels">
+<div class="scan-container lost-labels print-lost-labels-compact">
     <div class="card search-card">
         <div style="display: flex; align-items: center; gap: 15px; flex-wrap: wrap;">
             <label for="search-input" style="font-weight: bold; white-space: nowrap;">Search Order (CP...):</label>
@@ -94,21 +94,25 @@ table.view-orders-table.scan-table tbody tr:hover td {
 {% else %}
 <!-- Show file upload -->
 <input type="hidden" name="action" value="preview">
-<label for="file">Choose CSV file:</label>
-<input type="file" name="file" accept=".csv" required><br>
+<label for="file">Choose CSV or Excel file:</label>
+<input type="file" name="file" accept=".csv,.xlsx,.xls" required><br>
 <button type="submit" class="btn">Upload & Preview</button>

-<!-- CSV Format Information -->
+<!-- File Format Information -->
 <div style="margin-top: 20px; padding: 15px; background-color: var(--app-card-bg, #2a3441); border-radius: 5px; border-left: 4px solid var(--app-accent-color, #007bff); color: var(--app-text-color, #ffffff);">
-    <h5 style="margin-top: 0; color: var(--app-accent-color, #007bff);">Expected CSV Format</h5>
-    <p style="margin-bottom: 10px; color: var(--app-text-color, #ffffff);">Your CSV file should contain columns such as:</p>
+    <h5 style="margin-top: 0; color: var(--app-accent-color, #007bff);">Expected File Format</h5>
+    <p style="margin-bottom: 10px; color: var(--app-text-color, #ffffff);">Supported file types: <strong>CSV (.csv)</strong> and <strong>Excel (.xlsx, .xls)</strong></p>
+    <p style="margin-bottom: 10px; color: var(--app-text-color, #ffffff);">Your file should contain columns such as:</p>
     <ul style="margin-bottom: 10px; color: var(--app-text-color, #ffffff);">
-        <li><strong>order_number</strong> - The order/production number</li>
-        <li><strong>quantity</strong> - Number of items</li>
-        <li><strong>warehouse_location</strong> - Storage location</li>
+        <li><strong>Comanda Productie</strong> - Production order number</li>
+        <li><strong>Cod Articol</strong> - Article code</li>
+        <li><strong>Descr. Com. Prod</strong> - Description</li>
+        <li><strong>Cantitate</strong> - Quantity</li>
+        <li><strong>Data Livrare</strong> - Delivery date</li>
+        <li><strong>Customer Name</strong> - Customer name</li>
     </ul>
     <p style="color: var(--app-secondary-text, #b8c5d1); font-size: 14px; margin-bottom: 0;">
-        Column names are case-insensitive and can have variations like "Order Number", "Quantity", "Location", etc.
+        Column names are case-insensitive and can have variations. For Excel files, the first sheet with data will be used.
     </p>
 </div>
 {% endif %}
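
As a hedged illustration of the expected input, a small CSV using the column names listed above (all values are invented sample data):

```csv
Comanda Productie,Cod Articol,Descr. Com. Prod,Cantitate,Data Livrare,Customer Name
CP000123,ART-0456,Sample product description,100,2024-03-12,Example Customer SRL
CP000124,ART-0789,Another product,50,12.03.2024,Example Customer SRL
```
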
@@ -1,185 +0,0 @@
#!/usr/bin/env python3
"""
Excel Dashboard Viewer
Reads and displays the Dashboard sheet from Open .Orders WIZ New.xlsb
"""

import pandas as pd
import sys
import os

def view_dashboard(file_path):
    """View the Dashboard sheet from the Excel file"""

    if not os.path.exists(file_path):
        print(f"❌ File not found: {file_path}")
        return

    try:
        print("=" * 80)
        print(f"📊 Loading Excel file: {os.path.basename(file_path)}")
        print("=" * 80)

        # First, list all sheets in the file
        print("\n📋 Available sheets:")
        xls = pd.ExcelFile(file_path, engine='pyxlsb')
        for i, sheet in enumerate(xls.sheet_names, 1):
            print(f"  {i}. {sheet}")

        # Try to find and read Daily Mirror sheet
        dashboard_sheet = None
        for sheet in xls.sheet_names:
            if 'daily mirror' in sheet.lower() or 'dashboard' in sheet.lower() or 'dash' in sheet.lower():
                dashboard_sheet = sheet
                break

        if not dashboard_sheet:
            print("\n⚠️ No 'Dashboard' sheet found. Available sheets listed above.")
            print("\nPlease select a sheet to view (enter number or name):")
            choice = input("> ").strip()

            if choice.isdigit():
                idx = int(choice) - 1
                if 0 <= idx < len(xls.sheet_names):
                    dashboard_sheet = xls.sheet_names[idx]
            else:
                if choice in xls.sheet_names:
                    dashboard_sheet = choice

            if not dashboard_sheet:
                print("❌ Invalid selection")
                return

        print(f"\n📊 Reading sheet: {dashboard_sheet}")
        print("=" * 80)

        # Read the sheet
        df = pd.read_excel(file_path, sheet_name=dashboard_sheet, engine='pyxlsb')

        # Display basic info
        print(f"\n📏 Sheet dimensions: {df.shape[0]} rows × {df.shape[1]} columns")
        print(f"\n📋 Column names:")
        for i, col in enumerate(df.columns, 1):
            print(f"  {i}. {col}")

        # Display first few rows
        print(f"\n📊 First 10 rows:")
        print("-" * 80)
        pd.set_option('display.max_columns', None)
        pd.set_option('display.width', None)
        pd.set_option('display.max_colwidth', 50)
        print(df.head(10).to_string())

        # Display data types
        print(f"\n📝 Data types:")
        print(df.dtypes)

        # Display summary statistics for numeric columns
        numeric_cols = df.select_dtypes(include=['number']).columns
        if len(numeric_cols) > 0:
            print(f"\n📈 Summary statistics (numeric columns):")
            print(df[numeric_cols].describe())

        # Check for date columns
        date_cols = []
        for col in df.columns:
            if 'date' in col.lower():
                date_cols.append(col)

        if date_cols:
            print(f"\n📅 Date columns found: {', '.join(date_cols)}")
            for col in date_cols:
                try:
                    df[col] = pd.to_datetime(df[col], errors='coerce')
                    print(f"  {col}: {df[col].min()} to {df[col].max()}")
                except:
                    pass

        # Interactive menu
        while True:
            print("\n" + "=" * 80)
            print("📋 Options:")
            print("  1. View more rows")
            print("  2. Filter data")
            print("  3. View specific columns")
            print("  4. Export to CSV")
            print("  5. View another sheet")
            print("  6. Exit")
            print("=" * 80)

            choice = input("Select option (1-6): ").strip()

            if choice == '1':
                n = input("How many rows to display? (default 20): ").strip()
                n = int(n) if n.isdigit() else 20
                print(df.head(n).to_string())

            elif choice == '2':
                print(f"Available columns: {', '.join(df.columns)}")
                col = input("Enter column name to filter: ").strip()
                if col in df.columns:
                    print(f"Unique values in {col}:")
                    print(df[col].value_counts().head(20))
                    val = input(f"Enter value to filter {col}: ").strip()
                    filtered = df[df[col].astype(str).str.contains(val, case=False, na=False)]
                    print(f"\nFiltered results ({len(filtered)} rows):")
                    print(filtered.to_string())
                else:
                    print("❌ Column not found")

            elif choice == '3':
                print(f"Available columns: {', '.join(df.columns)}")
                cols = input("Enter column names (comma-separated): ").strip()
                col_list = [c.strip() for c in cols.split(',')]
                valid_cols = [c for c in col_list if c in df.columns]
                if valid_cols:
                    print(df[valid_cols].head(20).to_string())
                else:
                    print("❌ No valid columns found")

            elif choice == '4':
                output_file = input("Enter output filename (default: dashboard_export.csv): ").strip()
                output_file = output_file if output_file else "dashboard_export.csv"
                df.to_csv(output_file, index=False)
                print(f"✅ Exported to {output_file}")

            elif choice == '5':
                print("\n📋 Available sheets:")
                for i, sheet in enumerate(xls.sheet_names, 1):
                    print(f"  {i}. {sheet}")
                sheet_choice = input("Select sheet (number or name): ").strip()

                if sheet_choice.isdigit():
                    idx = int(sheet_choice) - 1
                    if 0 <= idx < len(xls.sheet_names):
                        dashboard_sheet = xls.sheet_names[idx]
                        df = pd.read_excel(file_path, sheet_name=dashboard_sheet, engine='pyxlsb')
                        print(f"\n📊 Loaded sheet: {dashboard_sheet}")
                        print(f"Dimensions: {df.shape[0]} rows × {df.shape[1]} columns")
                        print(df.head(10).to_string())
                else:
                    if sheet_choice in xls.sheet_names:
                        df = pd.read_excel(file_path, sheet_name=sheet_choice, engine='pyxlsb')
                        print(f"\n📊 Loaded sheet: {sheet_choice}")
                        print(f"Dimensions: {df.shape[0]} rows × {df.shape[1]} columns")
                        print(df.head(10).to_string())

            elif choice == '6':
                print("👋 Goodbye!")
                break

            else:
                print("❌ Invalid option")

    except Exception as e:
        print(f"❌ Error reading file: {e}")
        import traceback
        traceback.print_exc()

if __name__ == "__main__":
    file_path = "/srv/quality_app/Open .Orders WIZ New.xlsb"

    if len(sys.argv) > 1:
        file_path = sys.argv[1]

    view_dashboard(file_path)
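
For context, the deleted viewer depended on pandas plus the `pyxlsb` engine (its `pd.ExcelFile(..., engine='pyxlsb')` calls fail without it) and took the workbook path as an optional argument. Assumed invocation, with the script name (`view_dashboard.py`) inferred rather than shown in this diff:

```bash
# Assumed usage; the .xlsb engine is a separate package from pandas
pip install pandas pyxlsb
python view_dashboard.py "/srv/quality_app/Open .Orders WIZ New.xlsb"
```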