updated files

ske087
2025-11-26 22:00:44 +02:00
parent d070db0052
commit 3e314332a7
8 changed files with 1123 additions and 0 deletions

Binary file not shown.


@@ -0,0 +1,303 @@
# Quality Application - Docker Deployment Guide
## 📋 Overview
This application is containerized with Docker and docker-compose, providing:
- **MariaDB 11.3** database with persistent storage
- **Flask** web application with Gunicorn
- **Mapped volumes** for easy access to code, data, and backups
## 🗂️ Volume Structure
```
quality_app/
├── data/
│   └── mariadb/       # Database files (MariaDB data directory)
├── config/
│   └── instance/      # Application configuration (external_server.conf)
├── logs/              # Application and Gunicorn logs
├── backups/           # Database backup files (shared with DB container)
└── py_app/            # Application source code (optional mapping)
```
## 🚀 Quick Start
### 1. Setup Volumes
```bash
# Create necessary directories
bash setup-volumes.sh
```
### 2. Configure Environment
```bash
# Create .env file from example
cp .env.example .env
# Edit configuration (IMPORTANT: Change passwords!)
nano .env
```
**Critical settings to change:**
- `MYSQL_ROOT_PASSWORD` - Database root password
- `DB_PASSWORD` - Application database password
- `SECRET_KEY` - Flask secret key (generate a random string; see the sketch below)
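
A quick way to generate a suitable value is Python's `secrets` module (a minimal sketch; any sufficiently long random string works):

```python
# Generate a 64-character hex string suitable for Flask's SECRET_KEY
import secrets

print(secrets.token_hex(32))
```

Paste the output into `.env` as the `SECRET_KEY` value.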
**First deployment settings:**
- `INIT_DB=true` - Initialize database schema
- `SEED_DB=true` - Seed with default data
**After first deployment:**
- `INIT_DB=false`
- `SEED_DB=false`
### 3. Deploy Application
**Option A: Automated deployment**
```bash
bash quick-deploy.sh
```
**Option B: Manual deployment**
```bash
# Build images
docker-compose build
# Start services
docker-compose up -d
# View logs
docker-compose logs -f
```
## 📦 Application Dependencies
### Python Packages (from requirements.txt):
- Flask - Web framework
- Flask-SSLify - SSL support
- Werkzeug - WSGI utilities
- gunicorn - Production WSGI server
- pyodbc - ODBC database connectivity
- mariadb - MariaDB connector
- reportlab - PDF generation
- requests - HTTP library
- pandas - Data manipulation
- openpyxl - Excel file support
- APScheduler - Job scheduling for automated backups
### System Dependencies (handled in Dockerfile):
- Python 3.10
- MariaDB client libraries
- curl (for health checks)
## 🐳 Docker Images
### Web Application
- **Base**: python:3.10-slim
- **Multi-stage build** for minimal image size
- **Non-root user** for security
- **Health checks** enabled
### Database
- **Image**: mariadb:11.3
- **Persistent storage** with volume mapping
- **Performance tuning** via environment variables
## 📊 Resource Limits
### Database Container
- CPU: 2.0 cores (limit), 0.5 cores (reserved)
- Memory: 2GB (limit), 512MB (reserved)
- Buffer pool: 512MB
### Web Container
- CPU: 2.0 cores (limit), 0.5 cores (reserved)
- Memory: 2GB (limit), 512MB (reserved)
- Workers: 5 Gunicorn workers
## 🔧 Common Operations
### View Logs
```bash
# Application logs
docker-compose logs -f web
# Database logs
docker-compose logs -f db
# All logs
docker-compose logs -f
```
### Restart Services
```bash
# Restart all
docker-compose restart
# Restart specific service
docker-compose restart web
docker-compose restart db
```
### Stop Services
```bash
# Stop (keeps data)
docker-compose down
# Stop and remove volumes (WARNING: deletes database!)
docker-compose down -v
```
### Update Application Code
**Without rebuilding (development mode):**
1. Uncomment volume mapping in docker-compose.yml:
```yaml
- ${APP_CODE_PATH}:/app:ro
```
2. Edit code in `./py_app/`
3. Restart: `docker-compose restart web`
**With rebuilding (production mode):**
```bash
docker-compose build --no-cache web
docker-compose up -d
```
### Database Access
**MySQL shell inside container:**
```bash
docker-compose exec db mysql -u trasabilitate -p
# Enter password: Initial01! (or your custom password)
```
**From host machine:**
```bash
mysql -h 127.0.0.1 -P 3306 -u trasabilitate -p
```
**Root access:**
```bash
docker-compose exec db mysql -u root -p
```
## 💾 Backup Operations
### Manual Backup
```bash
# Full backup
docker-compose exec db mysqldump -u trasabilitate -pInitial01! trasabilitate > backups/manual_$(date +%Y%m%d_%H%M%S).sql
# Data-only backup
docker-compose exec db mysqldump -u trasabilitate -pInitial01! --no-create-info trasabilitate > backups/data_only_$(date +%Y%m%d_%H%M%S).sql
# Structure-only backup
docker-compose exec db mysqldump -u trasabilitate -pInitial01! --no-data trasabilitate > backups/structure_only_$(date +%Y%m%d_%H%M%S).sql
```
### Automated Backups
The application includes a built-in scheduler for automated backups. Configure via the web interface.
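
For orientation, here is a minimal sketch of how an APScheduler job for nightly backups could be wired. `create_app` and `DatabaseBackupManager.create_backup()` follow the names used elsewhere in this repository, but the scheduling code itself is illustrative, not the app's actual implementation:

```python
# Illustrative wiring of a nightly backup job with APScheduler.
# DatabaseBackupManager / create_backup match names used in this repo;
# the schedule and app wiring below are simplified assumptions. In the
# real application the scheduler runs inside the web process.
from apscheduler.schedulers.background import BackgroundScheduler

from app import create_app
from app.database_backup import DatabaseBackupManager


def run_backup():
    app = create_app()
    with app.app_context():
        result = DatabaseBackupManager().create_backup()
        print(f"Backup result: {result}")


scheduler = BackgroundScheduler()
# Run every day at 02:00 server time
scheduler.add_job(run_backup, "cron", hour=2, minute=0)
scheduler.start()
```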
### Restore from Backup
```bash
# Stop application (keeps database running)
docker-compose stop web
# Restore database
docker-compose exec -T db mysql -u trasabilitate -pInitial01! trasabilitate < backups/backup_file.sql
# Start application
docker-compose start web
```
## 🔍 Troubleshooting
### Container won't start
```bash
# Check logs
docker-compose logs db
docker-compose logs web
# Check if ports are available
ss -tulpn | grep 8781
ss -tulpn | grep 3306
```
### Database connection failed
```bash
# Check database is healthy
docker-compose ps
# Test database connection
docker-compose exec db mysqladmin ping -u root -p
# Check database users
docker-compose exec db mysql -u root -p -e "SELECT User, Host FROM mysql.user;"
```
### Permission issues
```bash
# Check directory permissions
ls -la data/mariadb
ls -la logs
ls -la backups
# Fix permissions if needed
chmod -R 755 data logs backups config
```
### Reset everything (WARNING: deletes all data!)
```bash
# Stop and remove containers, volumes
docker-compose down -v
# Remove volume directories
rm -rf data/mariadb/* logs/* config/instance/*
# Start fresh
bash quick-deploy.sh
```
## 🔒 Security Notes
1. **Change default passwords** in .env file
2. **Generate new SECRET_KEY** for Flask
3. Never commit .env file to version control
4. Use firewall rules to restrict database port (3306) access
5. Consider using Docker secrets for sensitive data in production
6. Regular security updates: `docker-compose pull && docker-compose up -d`
## 🌐 Port Mapping
- **8781** - Web application (configurable via APP_PORT in .env)
- **3306** - MariaDB database (configurable via DB_PORT in .env)
## 📁 Configuration Files
- **docker-compose.yml** - Service orchestration
- **.env** - Environment variables and configuration
- **Dockerfile** - Web application image definition
- **docker-entrypoint.sh** - Container initialization script
- **init-db.sql** - Database initialization script
## 🎯 Production Checklist
- [ ] Change all default passwords
- [ ] Generate secure SECRET_KEY
- [ ] Set FLASK_ENV=production
- [ ] Configure resource limits appropriately
- [ ] Set up backup schedule
- [ ] Configure firewall rules
- [ ] Set up monitoring and logging
- [ ] Test backup/restore procedures
- [ ] Document deployment procedure for your team
- [ ] Set INIT_DB=false and SEED_DB=false after first deployment
## 📞 Support
For issues or questions, refer to:
- Documentation in `documentation/` folder
- Docker logs: `docker-compose logs -f`
- Application logs: `./logs/` directory


@@ -0,0 +1,146 @@
# Excel File Upload Mapping
## File Information
- **File**: `1cc01b8Comenzi Productie (19).xlsx`
- **Sheets**: DataSheet (corrupted), Sheet1 (249 rows × 29 columns)
- **Purpose**: Production orders for label generation
## Excel Columns (29 total)
### Core Order Fields (✅ Stored in Database)
1. **Comanda Productie** → `comanda_productie`
2. **Cod Articol** → `cod_articol`
3. **Descriere** → `descr_com_prod`
4. **Cantitate ceruta** → `cantitate`
5. **Delivery date** → `data_livrare`
6. **Customer** → `customer_name`
7. **Comanda client** → `com_achiz_client`
### Additional Fields (📊 Read but not stored in order_for_labels table)
8. **Status** → `status` 📊
9. **End of Quilting** → `end_of_quilting` 📊
10. **End of sewing** → `end_of_sewing` 📊
11. **T1** → `t1` 📊 (Quality control stage 1)
12. **Data inregistrare T1** → `data_inregistrare_t1` 📊
13. **Numele Complet T1** → `numele_complet_t1` 📊
14. **T2** → `t2` 📊 (Quality control stage 2)
15. **Data inregistrare T2** → `data_inregistrare_t2` 📊
16. **Numele Complet T2** → `numele_complet_t2` 📊
17. **T3** → `t3` 📊 (Quality control stage 3)
18. **Data inregistrare T3** → `data_inregistrare_t3` 📊
19. **Numele Complet T3** → `numele_complet_t3` 📊
20. **Clasificare** → `clasificare` 📊
21. **Masina Cusut** → `masina_cusut` 📊
22. **Tip Masina** → `tip_masina` 📊
23. **Timp normat total** → `timp_normat_total` 📊
24. **Data Deschiderii** → `data_deschiderii` 📊
25. **Model Lb2** → `model_lb2` 📊
26. **Data Planific.** → `data_planific` 📊
27. **Numar masina** → `numar_masina` 📊
28. **Design nr** → `design_nr` 📊
29. **Needle position** → `needle_position` 📊
## Database Schema (order_for_labels)
```sql
CREATE TABLE order_for_labels (
id INT AUTO_INCREMENT PRIMARY KEY,
comanda_productie VARCHAR(25) NOT NULL,
cod_articol VARCHAR(25) NOT NULL,
descr_com_prod VARCHAR(100),
cantitate INT,
data_livrare DATE,
dimensiune VARCHAR(25),
com_achiz_client VARCHAR(25),
nr_linie_com_client INT,
customer_name VARCHAR(50),
customer_article_number VARCHAR(25),
open_for_order VARCHAR(25),
line_number INT,
printed_labels INT DEFAULT 0
);
```
## Validation Rules
### Required Fields
- `comanda_productie` (Production Order #)
- `cod_articol` (Article Code)
- `descr_com_prod` (Description)
- `cantitate` (Quantity)
### Optional Fields
- `data_livrare` (Delivery Date)
- `dimensiune` (Dimension)
- `com_achiz_client` (Customer Order #)
- `nr_linie_com_client` (Customer Order Line)
- `customer_name` (Customer Name)
- `customer_article_number` (Customer Article #)
- `open_for_order` (Open for Order)
- `line_number` (Line Number)
## Processing Logic
1. **Sheet Selection**: Tries Sheet1 → sheet 0 → DataSheet
2. **Column Normalization**: Converts headers to lowercase, strips whitespace
3. **Column Mapping**: Maps Excel columns to database fields
4. **Row Processing**:
   - Skips empty rows
   - Handles NaN values (converts to empty string)
   - Validates required fields
   - Returns validation errors and warnings
5. **Data Storage**: Only valid rows with required fields are stored (see the sketch below)
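
A minimal sketch of steps 2–4, assuming pandas is available; `COLUMN_MAP` mirrors the Excel-to-database mapping documented above, but the helper itself is illustrative rather than the code in `order_labels.py`:

```python
# Illustrative normalization/mapping/validation for the uploaded sheet.
# COLUMN_MAP mirrors the mapping tables above; the function is a sketch.
import pandas as pd

COLUMN_MAP = {
    "comanda productie": "comanda_productie",
    "cod articol": "cod_articol",
    "descriere": "descr_com_prod",
    "cantitate ceruta": "cantitate",
    "delivery date": "data_livrare",
    "customer": "customer_name",
    "comanda client": "com_achiz_client",
}
REQUIRED = ["comanda_productie", "cod_articol", "descr_com_prod", "cantitate"]


def load_orders(path, sheet_name="Sheet1"):
    df = pd.read_excel(path, sheet_name=sheet_name)
    # Step 2: normalize headers (lowercase, strip whitespace)
    df.columns = [str(c).strip().lower() for c in df.columns]
    # Step 3: map Excel columns to database fields, keep only known ones
    df = df.rename(columns=COLUMN_MAP)
    df = df[[c for c in COLUMN_MAP.values() if c in df.columns]]
    # Step 4: drop empty rows, convert NaN to "", validate required fields
    df = df.dropna(how="all").fillna("")
    missing = [c for c in REQUIRED if c not in df.columns]
    if missing:
        raise ValueError(f"Missing required columns: {missing}")
    valid = df[(df[REQUIRED].astype(str) != "").all(axis=1)]
    skipped = len(df) - len(valid)
    return valid, skipped
```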
## Sample Data (Row 1)
```
comanda_productie : CP00267043
cod_articol : PF010147
descr_com_prod : HUSA STARLINE NEXT X7 90X210
cantitate : 1
data_livrare : 2024-03-12
customer_name : 411_01RECT BED
com_achiz_client : 379579-1
status : Inchis
clasificare : HP3D
masina_cusut : SPECIALA
```
## Upload Functionality
**URL**: `/upload_data` (Labels module)
**Supported Formats**:
- CSV (.csv)
- Excel (.xlsx, .xls)
**Process**:
1. User uploads file
2. System validates file type
3. Processes file (CSV or Excel)
4. Shows preview with validation
5. User confirms upload
6. Data inserted into database (see the illustrative script below)
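
As a rough illustration of exercising this endpoint from a script (the `file` form field name and the unauthenticated request are assumptions; check `upload_orders.html` and `routes.py` for the actual details):

```python
# Illustrative client-side upload to the Labels module.
# The "file" field name and missing login handling are assumptions.
import requests

BASE_URL = "http://localhost:8781"

with open("1cc01b8Comenzi Productie (19).xlsx", "rb") as fh:
    response = requests.post(f"{BASE_URL}/upload_data", files={"file": fh})

print(response.status_code)
```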
## Testing
```bash
# Test Excel reading
cd /srv/quality_app
python3 << 'EOF'
import pandas as pd
df = pd.read_excel("1cc01b8Comenzi Productie (19).xlsx", sheet_name='Sheet1')
print(f"✅ Read {len(df)} rows × {len(df.columns)} columns")
print(f"Required fields present: {all(col in df.columns for col in ['Comanda Productie', 'Cod Articol', 'Descriere', 'Cantitate ceruta'])}")
EOF
```
## Implementation Files
- `/srv/quality_app/py_app/app/order_labels.py` - Processing functions
- `/srv/quality_app/py_app/app/routes.py` - Upload route handler
- `/srv/quality_app/py_app/app/templates/upload_orders.html` - Upload UI
---
**Status**: ✅ All 29 columns readable and mapped correctly
**Date**: 2024-11-26


@@ -0,0 +1,123 @@
# Improvements Applied to Quality App
## Date: November 13, 2025
### Overview
All improvements from the production environment have been successfully ported to the quality_app project.
## Files Updated/Copied
### 1. Docker Configuration
- **Dockerfile** - Added `mariadb-client` package for backup functionality
- **docker-compose.yml** - Updated with proper volume mappings and /data folder support
- **.env** - Updated all paths to use absolute paths under `/srv/quality_app/`
### 2. Backup & Restore System
- **database_backup.py** - Fixed backup/restore functions:
- Changed `result_success` to `result.returncode == 0`
- Added `--skip-ssl` flag for MariaDB connections
- Fixed restore function error handling
- **restore_database.sh** - Fixed SQL file parsing to handle MariaDB dump format
### 3. UI Improvements - Sticky Table Headers
- **base.css** - Added sticky header CSS for all report tables
- **scan.html** - Wrapped table in `report-table-container` div
- **fg_scan.html** - Wrapped table in `report-table-container` div
### 4. Quality Code Display Enhancement
- **fg_quality.js** - Quality code `0` displays as "OK" in green; CSV exports as "0"
- **script.js** - Same improvements for quality module reports
## Directory Structure
```
/srv/quality_app/
├── py_app/            # Application code (mapped to /app in container)
├── data/
│   └── mariadb/       # Database files
├── config/
│   └── instance/      # Application configuration
├── logs/              # Application logs
├── backups/           # Database backups
├── docker-compose.yml
├── Dockerfile
├── .env
└── restore_database.sh
```
## Environment Configuration
### Volume Mappings in .env:
```
DB_DATA_PATH=/srv/quality_app/data/mariadb
APP_CODE_PATH=/srv/quality_app/py_app
LOGS_PATH=/srv/quality_app/logs
INSTANCE_PATH=/srv/quality_app/config/instance
BACKUP_PATH=/srv/quality_app/backups
```
## Features Implemented
### ✅ Backup System
- Automatic scheduled backups
- Manual backup creation
- Data-only backups
- Backup retention policies (a pruning sketch follows this list)
- MariaDB client tools installed
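
A retention policy of the kind mentioned above can be as simple as pruning old dump files. A minimal sketch follows; the retention window and backup directory are example values, and the real policy is configured in the application:

```python
# Illustrative backup retention: delete .sql dumps older than N days.
# BACKUP_DIR and RETENTION_DAYS are examples, not the app's settings.
import time
from pathlib import Path

BACKUP_DIR = Path("/srv/quality_app/backups")
RETENTION_DAYS = 30

cutoff = time.time() - RETENTION_DAYS * 86400
for dump in BACKUP_DIR.glob("*.sql"):
    if dump.stat().st_mtime < cutoff:
        dump.unlink()
        print(f"Removed old backup: {dump.name}")
```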
### ✅ Restore System
- Python-based restore function
- Shell script restore with proper SQL parsing
- Handles MariaDB dump format correctly
### ✅ UI Enhancements
- **Sticky Headers**: Table headers remain fixed when scrolling
- **Quality Code Display**:
  - Shows "OK" in green for quality code 0
  - Exports "0" in CSV files
  - Better user experience
### ✅ Volume Mapping
- All volumes use absolute paths
- Support for /data folder mapping
- Easy to configure backup location on different drives
## Starting the Application
```bash
cd /srv/quality_app
docker compose up -d --build
```
## Testing Backup & Restore
### Create Backup:
```bash
cd /srv/quality_app
docker compose exec web bash -c "cd /app && python3 -c 'from app import create_app; from app.database_backup import DatabaseBackupManager; app = create_app();
with app.app_context(): bm = DatabaseBackupManager(); result = bm.create_backup(); print(result)'"
```
### Restore Backup:
```bash
cd /srv/quality_app
./restore_database.sh /srv/quality_app/backups/backup_file.sql
```
## Notes
- Database initialization is set to `false` (already initialized)
- All improvements are production-ready
- Backup path can be changed to external drive if needed
- Application port: 8781 (default)
## Next Steps
1. Review .env file and update passwords if needed
2. Test all functionality after deployment
3. Configure backup schedule if needed
4. Set up external backup drive if desired
---
**Compatibility**: All changes are backward compatible with existing data.
**Status**: Ready for deployment


@@ -0,0 +1,292 @@
# Merge Compatibility Analysis: docker-deploy → master
## 📊 Merge Status: **SAFE TO MERGE** ✅
### Conflict Analysis
- **No merge conflicts detected** between `master` and `docker-deploy` branches
- All changes are additive or modify existing code in compatible ways
- The docker-deploy branch adds 13 files with 1034 insertions and 117 deletions
### Files Changed
#### New Files (No conflicts):
1. `DOCKER_DEPLOYMENT_GUIDE.md` - Documentation
2. `IMPROVEMENTS_APPLIED.md` - Documentation
3. `quick-deploy.sh` - Deployment script
4. `restore_database.sh` - Restore script
5. `setup-volumes.sh` - Setup script
#### Modified Files:
1. `Dockerfile` - Added mariadb-client package
2. `docker-compose.yml` - Added /data volume mapping, resource limits
3. `py_app/app/database_backup.py` - **CRITICAL: Compatibility layer added**
4. `py_app/app/static/css/base.css` - Added sticky header styles
5. `py_app/app/static/fg_quality.js` - Quality code display enhancement
6. `py_app/app/static/script.js` - Quality code display enhancement
7. `py_app/app/templates/fg_scan.html` - Added report-table-container wrapper
8. `py_app/app/templates/scan.html` - Added report-table-container wrapper
---
## 🔧 Compatibility Layer: database_backup.py
### Problem Identified
The docker-deploy branch changed the backup commands from `mysqldump` to `mariadb-dump` and added the `--skip-ssl` flag, which would break backups under a standard Gunicorn (non-Docker) deployment.
### Solution Implemented
Added intelligent environment detection and command selection:
#### 1. Dynamic Command Detection
```python
def _detect_dump_command(self):
    """Detect which dump command is available (mariadb-dump or mysqldump)"""
    try:
        # Try mariadb-dump first (newer MariaDB versions)
        result = subprocess.run(['which', 'mariadb-dump'],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return 'mariadb-dump'

        # Fall back to mysqldump
        result = subprocess.run(['which', 'mysqldump'],
                                capture_output=True, text=True)
        if result.returncode == 0:
            return 'mysqldump'

        # Default to mariadb-dump (will error if not available)
        return 'mariadb-dump'
    except Exception as e:
        print(f"Warning: Could not detect dump command: {e}")
        return 'mysqldump'  # Default fallback
```
#### 2. Conditional SSL Arguments
```python
def _get_ssl_args(self):
    """Get SSL arguments based on environment (Docker needs --skip-ssl)"""
    # Check if running in Docker container
    if os.path.exists('/.dockerenv') or os.environ.get('DOCKER_CONTAINER'):
        return ['--skip-ssl']
    return []
```
#### 3. Updated Backup Command Building
```python
cmd = [
    self.dump_command,  # Uses detected command (mariadb-dump or mysqldump)
    f"--host={self.config['host']}",
    f"--port={self.config['port']}",
    f"--user={self.config['user']}",
    f"--password={self.config['password']}",
]

# Add SSL args if needed (Docker environment)
cmd.extend(self._get_ssl_args())

# Add backup options
cmd.extend([
    '--single-transaction',
    '--skip-lock-tables',
    '--force',
    # ... other options
])
```
---
## 🎯 Deployment Scenarios
### Scenario 1: Docker Deployment (docker-compose)
**Environment Detection:**
- `/.dockerenv` file exists
- `DOCKER_CONTAINER` environment variable set in docker-compose.yml
**Backup Behavior:**
- Uses `mariadb-dump` (installed in Dockerfile)
- Adds `--skip-ssl` flag automatically
- Works correctly ✅
### Scenario 2: Standard Gunicorn Deployment (systemd service)
**Environment Detection:**
- `/.dockerenv` file does NOT exist
- `DOCKER_CONTAINER` environment variable NOT set
**Backup Behavior:**
- Detects available command: `mysqldump` or `mariadb-dump`
- Does NOT add `--skip-ssl` flag
- Uses system-installed MySQL/MariaDB client tools
- Works correctly ✅
### Scenario 3: Mixed Environment (External Database)
**Both deployment types can connect to:**
- External MariaDB server
- Remote database instance
- Local database with proper SSL configuration
**Backup Behavior:**
- Automatically adapts to available tools
- SSL handling based on container detection
- Works correctly ✅
---
## 🧪 Testing Plan
### Pre-Merge Testing
1. **Docker Environment:**
```bash
cd /srv/quality_app
git checkout docker-deploy
docker-compose up -d
# Test backup via web UI
# Test scheduled backup
# Test restore functionality
```
2. **Gunicorn Environment:**
```bash
# Stop Docker if running
docker-compose down
# Start with systemd service (if available)
sudo systemctl start trasabilitate
# Test backup via web UI
# Test scheduled backup
# Test restore functionality
```
3. **Command Detection Test:**
```bash
# Inside Docker container
docker-compose exec web python3 -c "
from app.database_backup import DatabaseBackupManager
manager = DatabaseBackupManager()
print(f'Dump command: {manager.dump_command}')
print(f'SSL args: {manager._get_ssl_args()}')
"
# On host system (if MySQL client installed)
python3 -c "
from app.database_backup import DatabaseBackupManager
manager = DatabaseBackupManager()
print(f'Dump command: {manager.dump_command}')
print(f'SSL args: {manager._get_ssl_args()}')
"
```
### Post-Merge Testing
1. Verify both deployment methods still work
2. Test backup/restore in both environments
3. Verify scheduled backups function correctly
4. Check error handling when tools are missing
---
## 📋 Merge Checklist
- [x] No merge conflicts detected
- [x] Compatibility layer implemented in `database_backup.py`
- [x] Environment detection for Docker vs Gunicorn
- [x] Dynamic command selection (mariadb-dump vs mysqldump)
- [x] Conditional SSL flag handling
- [x] UI improvements (sticky headers) are purely CSS/JS - no conflicts
- [x] Quality code display changes are frontend-only - no conflicts
- [x] New documentation files added - no conflicts
- [x] Docker-specific files don't affect Gunicorn deployment
### Safe to Merge Because:
1. **Additive Changes**: Most changes are new files or new features
2. **Backward Compatible**: Code detects environment and adapts
3. **No Breaking Changes**: Gunicorn deployment still works without Docker
4. **Independent Features**: UI improvements work in any environment
5. **Fail-Safe Defaults**: Falls back to mysqldump if mariadb-dump unavailable
---
## 🚀 Merge Process
### Recommended Steps:
```bash
cd /srv/quality_app
# 1. Ensure working directory is clean
git status
# 2. Switch to master branch
git checkout master
# 3. Pull latest changes
git pull origin master
# 4. Merge docker-deploy (should be clean merge)
git merge docker-deploy
# 5. Review merge
git log --oneline -10
# 6. Test in current environment
# (If using systemd, test the app)
# (If using Docker, test with docker-compose)
# 7. Push to remote
git push origin master
# 8. Tag the release (optional)
git tag -a v2.0-docker -m "Docker deployment support with compatibility layer"
git push origin v2.0-docker
```
### Rollback Plan (if needed):
```bash
# If issues arise after merge
git log --oneline -10 # Find commit hash before merge
git reset --hard <commit-hash-before-merge>
git push origin master --force # Use with caution!
# Or revert the merge commit
git revert -m 1 <merge-commit-hash>
git push origin master
```
---
## 🎓 Key Improvements in docker-deploy Branch
### 1. **Bug Fixes**
- Fixed `result_success` variable error → `result.returncode == 0`
- Fixed restore SQL parsing with sed preprocessing
- Fixed missing mariadb-client in Docker container
### 2. **Docker Support**
- Complete Docker Compose setup
- Volume mapping for persistent data
- Health checks and resource limits
- Environment-based configuration
### 3. **UI Enhancements**
- Sticky table headers for scrollable reports
- Quality code 0 displays as "OK" (green)
- CSV export preserves original "0" value
### 4. **Compatibility**
- Works in Docker AND traditional Gunicorn deployment
- Auto-detects available backup tools
- Environment-aware SSL handling
- No breaking changes to existing functionality
---
## 📞 Support
If issues arise after merge:
1. Check environment detection: `ls -la /.dockerenv`
2. Verify backup tools: `which mysqldump mariadb-dump`
3. Review logs: `docker-compose logs web` or application logs
4. Test backup manually from command line
5. Fall back to master branch if critical issues occur
---
**Last Updated:** 2025-11-13
**Branch:** docker-deploy → master
**Status:** Ready for merge ✅

Binary file not shown.

old code/README.md (new file, 74 lines)

@@ -0,0 +1,74 @@
# Quality Recticel Application
Production traceability and quality management system.
## 📚 Documentation
All development and deployment documentation has been moved to the **[documentation](./documentation/)** folder.
### Quick Links
- **[Documentation Index](./documentation/README.md)** - Complete documentation overview
- **[Database Setup](./documentation/DATABASE_DOCKER_SETUP.md)** - Database configuration guide
- **[Docker Guide](./documentation/DOCKER_QUICK_REFERENCE.md)** - Docker commands reference
- **[Backup System](./documentation/BACKUP_SYSTEM.md)** - Database backup documentation
## 🚀 Quick Start
```bash
# Start application
cd /srv/quality_app/py_app
bash start_production.sh
# Stop application
bash stop_production.sh
# View logs
tail -f /srv/quality_app/logs/error.log
```
## 📦 Docker Deployment
```bash
# Start with Docker Compose
docker-compose up -d
# View logs
docker-compose logs -f web
# Stop services
docker-compose down
```
## 🔐 Default Access
- **URL**: http://localhost:8781
- **Username**: superadmin
- **Password**: superadmin123
## 📁 Project Structure
```
quality_app/
├── documentation/     # All documentation files
├── py_app/            # Flask application
├── backups/           # Database backups
├── logs/              # Application logs
├── docker-compose.yml # Docker configuration
└── Dockerfile         # Container image definition
```
## 📖 For More Information
See the **[documentation](./documentation/)** folder for comprehensive guides on:
- Setup and deployment
- Docker configuration
- Database management
- Backup and restore procedures
- Application features
---
**Version**: 1.0.0
**Last Updated**: November 3, 2025

old code/view_dashboard.py (new executable file, 185 lines)

@@ -0,0 +1,185 @@
#!/usr/bin/env python3
"""
Excel Dashboard Viewer
Reads and displays the Dashboard sheet from Open .Orders WIZ New.xlsb
"""
import pandas as pd
import sys
import os
def view_dashboard(file_path):
    """View the Dashboard sheet from the Excel file"""
    if not os.path.exists(file_path):
        print(f"❌ File not found: {file_path}")
        return
    try:
        print("=" * 80)
        print(f"📊 Loading Excel file: {os.path.basename(file_path)}")
        print("=" * 80)
        # First, list all sheets in the file
        print("\n📋 Available sheets:")
        xls = pd.ExcelFile(file_path, engine='pyxlsb')
        for i, sheet in enumerate(xls.sheet_names, 1):
            print(f" {i}. {sheet}")
        # Try to find and read Daily Mirror sheet
        dashboard_sheet = None
        for sheet in xls.sheet_names:
            if 'daily mirror' in sheet.lower() or 'dashboard' in sheet.lower() or 'dash' in sheet.lower():
                dashboard_sheet = sheet
                break
        if not dashboard_sheet:
            print("\n⚠️ No 'Dashboard' sheet found. Available sheets listed above.")
            print("\nPlease select a sheet to view (enter number or name):")
            choice = input("> ").strip()
            if choice.isdigit():
                idx = int(choice) - 1
                if 0 <= idx < len(xls.sheet_names):
                    dashboard_sheet = xls.sheet_names[idx]
            else:
                if choice in xls.sheet_names:
                    dashboard_sheet = choice
            if not dashboard_sheet:
                print("❌ Invalid selection")
                return
        print(f"\n📊 Reading sheet: {dashboard_sheet}")
        print("=" * 80)
        # Read the sheet
        df = pd.read_excel(file_path, sheet_name=dashboard_sheet, engine='pyxlsb')
        # Display basic info
        print(f"\n📏 Sheet dimensions: {df.shape[0]} rows × {df.shape[1]} columns")
        print(f"\n📋 Column names:")
        for i, col in enumerate(df.columns, 1):
            print(f" {i}. {col}")
        # Display first few rows
        print(f"\n📊 First 10 rows:")
        print("-" * 80)
        pd.set_option('display.max_columns', None)
        pd.set_option('display.width', None)
        pd.set_option('display.max_colwidth', 50)
        print(df.head(10).to_string())
        # Display data types
        print(f"\n📝 Data types:")
        print(df.dtypes)
        # Display summary statistics for numeric columns
        numeric_cols = df.select_dtypes(include=['number']).columns
        if len(numeric_cols) > 0:
            print(f"\n📈 Summary statistics (numeric columns):")
            print(df[numeric_cols].describe())
        # Check for date columns
        date_cols = []
        for col in df.columns:
            if 'date' in col.lower():
                date_cols.append(col)
        if date_cols:
            print(f"\n📅 Date columns found: {', '.join(date_cols)}")
            for col in date_cols:
                try:
                    df[col] = pd.to_datetime(df[col], errors='coerce')
                    print(f" {col}: {df[col].min()} to {df[col].max()}")
                except:
                    pass
        # Interactive menu
        while True:
            print("\n" + "=" * 80)
            print("📋 Options:")
            print(" 1. View more rows")
            print(" 2. Filter data")
            print(" 3. View specific columns")
            print(" 4. Export to CSV")
            print(" 5. View another sheet")
            print(" 6. Exit")
            print("=" * 80)
            choice = input("Select option (1-6): ").strip()
            if choice == '1':
                n = input("How many rows to display? (default 20): ").strip()
                n = int(n) if n.isdigit() else 20
                print(df.head(n).to_string())
            elif choice == '2':
                print(f"Available columns: {', '.join(df.columns)}")
                col = input("Enter column name to filter: ").strip()
                if col in df.columns:
                    print(f"Unique values in {col}:")
                    print(df[col].value_counts().head(20))
                    val = input(f"Enter value to filter {col}: ").strip()
                    filtered = df[df[col].astype(str).str.contains(val, case=False, na=False)]
                    print(f"\nFiltered results ({len(filtered)} rows):")
                    print(filtered.to_string())
                else:
                    print("❌ Column not found")
            elif choice == '3':
                print(f"Available columns: {', '.join(df.columns)}")
                cols = input("Enter column names (comma-separated): ").strip()
                col_list = [c.strip() for c in cols.split(',')]
                valid_cols = [c for c in col_list if c in df.columns]
                if valid_cols:
                    print(df[valid_cols].head(20).to_string())
                else:
                    print("❌ No valid columns found")
            elif choice == '4':
                output_file = input("Enter output filename (default: dashboard_export.csv): ").strip()
                output_file = output_file if output_file else "dashboard_export.csv"
                df.to_csv(output_file, index=False)
                print(f"✅ Exported to {output_file}")
            elif choice == '5':
                print("\n📋 Available sheets:")
                for i, sheet in enumerate(xls.sheet_names, 1):
                    print(f" {i}. {sheet}")
                sheet_choice = input("Select sheet (number or name): ").strip()
                if sheet_choice.isdigit():
                    idx = int(sheet_choice) - 1
                    if 0 <= idx < len(xls.sheet_names):
                        dashboard_sheet = xls.sheet_names[idx]
                        df = pd.read_excel(file_path, sheet_name=dashboard_sheet, engine='pyxlsb')
                        print(f"\n📊 Loaded sheet: {dashboard_sheet}")
                        print(f"Dimensions: {df.shape[0]} rows × {df.shape[1]} columns")
                        print(df.head(10).to_string())
                else:
                    if sheet_choice in xls.sheet_names:
                        df = pd.read_excel(file_path, sheet_name=sheet_choice, engine='pyxlsb')
                        print(f"\n📊 Loaded sheet: {sheet_choice}")
                        print(f"Dimensions: {df.shape[0]} rows × {df.shape[1]} columns")
                        print(df.head(10).to_string())
            elif choice == '6':
                print("👋 Goodbye!")
                break
            else:
                print("❌ Invalid option")
    except Exception as e:
        print(f"❌ Error reading file: {e}")
        import traceback
        traceback.print_exc()

if __name__ == "__main__":
    file_path = "/srv/quality_app/Open .Orders WIZ New.xlsb"
    if len(sys.argv) > 1:
        file_path = sys.argv[1]
    view_dashboard(file_path)