Compare commits
13 Commits: main ... 7f5991f60d

d5456c0ec4, d0fbfe25b3, 52344a27a6, 9cb32da13c, 0f34a47fa9, a5ef5749b1, 3cc703a7d1, 505c8e268c, 6cefce81ef, 359e330758, 9c124dbd7e, 7b24245ddb, 58694ff3f4
OPTIMIZATION_PROPOSAL.md (new file, 662 lines)
@@ -0,0 +1,662 @@
# DigiServer Optimization Proposal

## Executive Summary

After analyzing the DigiServer project, I've identified several optimization opportunities across performance, architecture, security, and maintainability. The current system is functional but has areas for improvement.

## Current State Analysis

### Metrics
- **Main Application**: 1,051 lines (app.py)
- **Docker Image Size**: 3.53 GB ⚠️ (Very Large)
- **Database Size**: 2.6 MB
- **Media Storage**: 13 MB
- **Routes**: 30+ endpoints
- **Templates**: 14 HTML files

### Architecture
- ✅ **Good**: Modular structure (models, utils, templates)
- ✅ **Good**: Docker containerization
- ✅ **Good**: Flask extensions properly used
- ⚠️ **Issue**: Monolithic app.py (1,051 lines)
- ⚠️ **Issue**: Large Docker image
- ⚠️ **Issue**: No caching strategy
- ⚠️ **Issue**: Synchronous video processing blocks requests

---

## Priority 1: Critical Optimizations

### 1. Reduce Docker Image Size (3.53 GB → ~800 MB)

**Current Issue**: The Docker image is unnecessarily large because build dependencies ship in the runtime image.

**Solution**: Multi-stage build

```dockerfile
# Stage 1: Build stage with heavy dependencies
FROM python:3.11-slim as builder

WORKDIR /build

# Install build dependencies
RUN apt-get update && apt-get install -y \
    build-essential \
    g++ \
    cargo \
    libffi-dev \
    libssl-dev \
    && rm -rf /var/lib/apt/lists/*

# Install Python packages with wheels
COPY app/requirements.txt .
RUN pip wheel --no-cache-dir --wheel-dir /build/wheels -r requirements.txt

# Stage 2: Runtime stage (smaller)
FROM python:3.11-slim

WORKDIR /app

# Install only runtime dependencies
RUN apt-get update && apt-get install -y \
    poppler-utils \
    libreoffice-writer \
    libreoffice-impress \
    ffmpeg \
    libmagic1 \
    curl \
    fonts-dejavu-core \
    --no-install-recommends \
    && rm -rf /var/lib/apt/lists/* \
    && apt-get clean

# Copy wheels from builder
COPY --from=builder /build/wheels /wheels
RUN pip install --no-cache-dir /wheels/* && rm -rf /wheels

# Copy application
COPY app/ .
RUN chmod +x entrypoint.sh

# Create volumes
RUN mkdir -p /app/static/uploads /app/static/resurse /app/instance

EXPOSE 5000
CMD ["./entrypoint.sh"]
```

**Impact**:
- ✅ Reduce image size by ~77% (3.53 GB → ~800 MB)
- ✅ Faster deployment and startup
- ✅ Less storage and bandwidth usage
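
A companion step not detailed above (the exact entries below are assumptions based on the volumes and paths mentioned in this document): a `.dockerignore` keeps local data, caches, and VCS history out of the build context, which also keeps them out of the final image.

```
# .dockerignore (sketch)
.git
__pycache__/
*.pyc
.pytest_cache/
data/
app/instance/
app/static/uploads/
```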

---

### 2. Split Monolithic app.py into Blueprints

**Current Issue**: 1,051 lines in a single file make maintenance difficult

**Proposed Structure**:
```
app/
├── app.py (main app initialization, ~100 lines)
├── blueprints/
│   ├── __init__.py
│   ├── auth.py      # Login, logout, register
│   ├── admin.py     # Admin routes
│   ├── players.py   # Player management
│   ├── groups.py    # Group management
│   ├── content.py   # Content upload/management
│   └── api.py       # API endpoints
├── models/
├── utils/
└── templates/
```

**Example Blueprint (auth.py)**:
```python
from flask import Blueprint, render_template, request, redirect, url_for, flash
from flask_login import login_user, logout_user, login_required
from models import User
from extensions import db, bcrypt

auth_bp = Blueprint('auth', __name__)

@auth_bp.route('/login', methods=['GET', 'POST'])
def login():
    # Login logic here
    pass

@auth_bp.route('/logout')
@login_required
def logout():
    logout_user()
    return redirect(url_for('auth.login'))

@auth_bp.route('/register', methods=['GET', 'POST'])
def register():
    # Register logic here
    pass
```
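
For completeness, a minimal sketch of the slimmed-down `app.py` that would register these blueprints. The blueprint and extension names follow the structure proposed above; treat them as placeholders until the split is actually done.

```python
# app.py (sketch) - assumes an extensions.py module exposing db, bcrypt,
# login_manager and migrate, as the auth.py example above already implies.
from flask import Flask

from extensions import db, bcrypt, login_manager, migrate
from blueprints.auth import auth_bp
from blueprints.admin import admin_bp
from blueprints.players import players_bp
from blueprints.groups import groups_bp
from blueprints.content import content_bp
from blueprints.api import api_bp


def create_app(config_object='config.ProductionConfig'):
    app = Flask(__name__)
    app.config.from_object(config_object)

    # Bind extensions to this app instance
    db.init_app(app)
    bcrypt.init_app(app)
    login_manager.init_app(app)
    migrate.init_app(app, db)

    # Register feature blueprints
    app.register_blueprint(auth_bp)
    app.register_blueprint(admin_bp, url_prefix='/admin')
    app.register_blueprint(players_bp)
    app.register_blueprint(groups_bp)
    app.register_blueprint(content_bp)
    app.register_blueprint(api_bp, url_prefix='/api')

    return app


app = create_app()
```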

**Benefits**:
- ✅ Better code organization
- ✅ Easier to maintain and test
- ✅ Multiple developers can work simultaneously
- ✅ Clear separation of concerns

---

### 3. Implement Redis Caching

**Current Issue**: Database queries repeated on every request

**Solution**: Add Redis for caching

```python
# Add to docker-compose.yml
services:
  redis:
    image: redis:7-alpine
    container_name: digiserver-redis
    restart: unless-stopped
    networks:
      - digiserver-network
    volumes:
      - redis-data:/data

# Add to requirements.txt
redis==5.0.1
Flask-Caching==2.1.0

# Configuration
from flask_caching import Cache

cache = Cache(config={
    'CACHE_TYPE': 'redis',
    'CACHE_REDIS_HOST': 'redis',
    'CACHE_REDIS_PORT': 6379,
    'CACHE_DEFAULT_TIMEOUT': 300
})

# Usage examples
@cache.cached(timeout=60, key_prefix='dashboard')
def dashboard():
    # Cached for 60 seconds
    pass

@cache.memoize(timeout=300)
def get_player_content(player_id):
    # Cached per player_id for 5 minutes
    return Content.query.filter_by(player_id=player_id).all()
```
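
Caching only pays off if stale entries are cleared when content changes. A small sketch of the invalidation side, using Flask-Caching's documented helpers; the function names match the examples above, and wiring this into the upload/delete routes is left open.

```python
def invalidate_player_cache(player_id):
    """Drop cached entries that depend on a player's playlist."""
    # Clears the memoized get_player_content(player_id) entry
    cache.delete_memoized(get_player_content, player_id)
    # Clears the view cached above with key_prefix='dashboard'
    cache.delete('dashboard')

# Example: call after committing an upload, reorder, or delete
# db.session.commit()
# invalidate_player_cache(player_id)
```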

**Impact**:
- ✅ 50-80% faster page loads
- ✅ Reduced database load
- ✅ Better scalability

---

## Priority 2: Performance Optimizations

### 4. Implement Celery for Background Tasks

**Current Issue**: Video conversion blocks HTTP requests

**Solution**: Use Celery for async tasks

```python
# docker-compose.yml
services:
  worker:
    build: .
    image: digiserver:latest
    container_name: digiserver-worker
    command: celery -A celery_worker.celery worker --loglevel=info
    volumes:
      - ./app:/app
      - ./data/uploads:/app/static/uploads
    networks:
      - digiserver-network
    depends_on:
      - redis

# celery_worker.py
from celery import Celery
from app import app

celery = Celery(
    app.import_name,
    broker='redis://redis:6379/0',
    backend='redis://redis:6379/1'
)

@celery.task
def convert_video_task(file_path, filename, target_type, target_id, duration):
    with app.app_context():
        convert_video_and_update_playlist(
            app, file_path, filename, target_type, target_id, duration
        )
    return {'status': 'completed', 'filename': filename}

# Usage in upload route
@app.route('/upload_content', methods=['POST'])
def upload_content():
    # ... validation ...

    for file in files:
        if media_type == 'video':
            # Queue video conversion
            convert_video_task.delay(file_path, filename, target_type, target_id, duration)
            flash('Video queued for processing', 'info')
        else:
            # Process immediately
            process_uploaded_files(...)
```
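
`convert_video_task.delay(...)` returns an `AsyncResult`; its `id` can be handed back to the browser so it can poll conversion progress. A sketch of such a status endpoint follows; the route name and JSON shape are illustrative, not part of the existing API.

```python
from celery.result import AsyncResult

@app.route('/api/task-status/<task_id>')
def task_status(task_id):
    result = AsyncResult(task_id, app=celery)
    payload = {'task_id': task_id, 'state': result.state}  # PENDING, STARTED, SUCCESS, FAILURE
    if result.successful():
        payload['result'] = result.result       # e.g. {'status': 'completed', 'filename': ...}
    elif result.failed():
        payload['error'] = str(result.result)   # the exception raised inside the task
    return jsonify(payload)
```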

**Benefits**:
- ✅ Non-blocking uploads
- ✅ Better user experience
- ✅ Can retry failed tasks
- ✅ Monitor task status

---

### 5. Database Query Optimization

**Current Issues**: N+1 queries, no indexes

**Solutions**:

```python
# Add indexes to models
class Content(db.Model):
    __tablename__ = 'content'

    id = db.Column(db.Integer, primary_key=True)
    player_id = db.Column(db.Integer, db.ForeignKey('player.id'), index=True)  # Add index
    position = db.Column(db.Integer, index=True)  # Add index

    __table_args__ = (
        db.Index('idx_player_position', 'player_id', 'position'),  # Composite index
    )

# Use eager loading
def get_group_content(group_id):
    # Bad: N+1 queries
    group = Group.query.get(group_id)
    content = [Content.query.filter_by(player_id=p.id).all() for p in group.players]

    # Good: Single query with join
    content = db.session.query(Content)\
        .join(Player)\
        .join(Group, Player.groups)\
        .filter(Group.id == group_id)\
        .options(db.joinedload(Content.player))\
        .all()
    return content

# Use query result caching
@cache.memoize(timeout=300)
def get_player_feedback_cached(player_name, limit=5):
    return PlayerFeedback.query\
        .filter_by(player_name=player_name)\
        .order_by(PlayerFeedback.timestamp.desc())\
        .limit(limit)\
        .all()
```
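
Since the project already uses Flask-Migrate (see the `app/migrations/` files added in this branch), the new indexes should ship as a migration rather than relying on `create_all()`. A sketch of what the generated revision could look like; the new revision ID is a placeholder, so generate the real file with `flask db migrate` and adjust.

```python
from alembic import op

revision = 'xxxxxxxxxxxx'       # placeholder
down_revision = '217eab16e4e4'  # the "Add PlayerFeedback table" revision in this branch
branch_labels = None
depends_on = None


def upgrade():
    op.create_index('ix_content_player_id', 'content', ['player_id'])
    op.create_index('ix_content_position', 'content', ['position'])
    op.create_index('idx_player_position', 'content', ['player_id', 'position'])


def downgrade():
    op.drop_index('idx_player_position', table_name='content')
    op.drop_index('ix_content_position', table_name='content')
    op.drop_index('ix_content_player_id', table_name='content')
```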

**Impact**:
- ✅ 40-60% faster database operations
- ✅ Reduced database load

---

### 6. Optimize Static File Delivery

**Current**: Flask serves static files (slow)

**Solution**: Use nginx as reverse proxy

```yaml
# docker-compose.yml
services:
  nginx:
    image: nginx:alpine
    container_name: digiserver-nginx
    ports:
      - "80:80"
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf
      - ./data/uploads:/var/www/uploads:ro
      - ./data/resurse:/var/www/resurse:ro
    depends_on:
      - digiserver
    networks:
      - digiserver-network

  digiserver:
    ports: []  # Remove external port exposure
```

```nginx
# nginx.conf
events {}

http {
    # Enable gzip compression
    gzip on;
    gzip_types text/css application/javascript application/json image/svg+xml;
    gzip_comp_level 6;

    server {
        listen 80;

        # Cache static files (location blocks must live inside a server block)
        location /static/uploads/ {
            alias /var/www/uploads/;
            expires 1y;
            add_header Cache-Control "public, immutable";
        }

        location / {
            proxy_pass http://digiserver:5000;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }
}
```

**Benefits**:
- ✅ 3-5x faster static file delivery
- ✅ Automatic gzip compression
- ✅ Better caching
- ✅ Load balancing ready

---

## Priority 3: Code Quality & Maintainability

### 7. Add Type Hints

```python
# Before
def get_player_content(player_id):
    return Content.query.filter_by(player_id=player_id).all()

# After
from typing import List
from models import Content, Player

def get_player_content(player_id: int) -> List[Content]:
    """Get all content for a specific player."""
    return Content.query.filter_by(player_id=player_id).all()

def update_playlist_version(player: Player, increment: int = 1) -> int:
    """Update player playlist version and return new version."""
    player.playlist_version += increment
    db.session.commit()
    return player.playlist_version
```

---

### 8. Add API Rate Limiting

```python
# Add to requirements.txt
Flask-Limiter==3.5.0

# Configuration
from flask_limiter import Limiter
from flask_limiter.util import get_remote_address

limiter = Limiter(
    app=app,
    key_func=get_remote_address,
    storage_uri="redis://redis:6379",
    default_limits=["200 per day", "50 per hour"]
)

# Apply to routes
@app.route('/api/player-feedback', methods=['POST'])
@limiter.limit("10 per minute")
def api_player_feedback():
    # Protected from abuse
    pass
```

---

### 9. Implement Health Checks & Monitoring

```python
# Add health endpoint
from sqlalchemy import text  # needed for the raw health-check query below

@app.route('/health')
def health():
    try:
        # Check database
        db.session.execute(text('SELECT 1'))

        # Check Redis
        cache.set('health_check', 'ok', timeout=5)

        # Check disk space
        upload_stat = os.statvfs(UPLOAD_FOLDER)
        free_space_gb = (upload_stat.f_bavail * upload_stat.f_frsize) / (1024**3)

        return jsonify({
            'status': 'healthy',
            'database': 'ok',
            'cache': 'ok',
            'disk_space_gb': round(free_space_gb, 2)
        }), 200
    except Exception as e:
        return jsonify({'status': 'unhealthy', 'error': str(e)}), 500

# Add metrics endpoint (Prometheus)
from prometheus_flask_exporter import PrometheusMetrics

metrics = PrometheusMetrics(app)

# Automatic metrics:
# - Request count
# - Request duration
# - Request size
# - Response size
```
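
The `/health` endpoint can also drive Docker's own health reporting. A sketch for `docker-compose.yml`; the intervals are suggestions, and `curl` is already installed in the runtime image above.

```yaml
services:
  digiserver:
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:5000/health"]
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 20s
```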

---

### 10. Environment-Based Configuration

```python
# config.py
import os

class Config:
    SECRET_KEY = os.getenv('SECRET_KEY', 'default-dev-key')
    SQLALCHEMY_TRACK_MODIFICATIONS = False
    MAX_CONTENT_LENGTH = 2048 * 1024 * 1024

class DevelopmentConfig(Config):
    DEBUG = True
    SQLALCHEMY_DATABASE_URI = 'sqlite:///dev.db'
    CACHE_TYPE = 'simple'

class ProductionConfig(Config):
    DEBUG = False
    SQLALCHEMY_DATABASE_URI = os.getenv('DATABASE_URL')
    CACHE_TYPE = 'redis'
    CACHE_REDIS_HOST = 'redis'

class TestingConfig(Config):
    TESTING = True
    SQLALCHEMY_DATABASE_URI = 'sqlite:///:memory:'

# Usage in app.py
env = os.getenv('FLASK_ENV', 'development')
if env == 'production':
    app.config.from_object('config.ProductionConfig')
elif env == 'testing':
    app.config.from_object('config.TestingConfig')
else:
    app.config.from_object('config.DevelopmentConfig')
```
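
A sketch of how the containers could select these classes via `docker-compose.yml`; the variable names other than `FLASK_ENV` are assumptions and should match whatever secrets handling the team prefers.

```yaml
services:
  digiserver:
    environment:
      - FLASK_ENV=production
      - SECRET_KEY=${SECRET_KEY}                            # injected from the host or an .env file
      - DATABASE_URL=sqlite:////app/instance/dashboard.db   # matches ProductionConfig above
```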

---

## Priority 4: Security Enhancements

### 11. Security Hardening

```python
# Add to requirements.txt
Flask-Talisman==1.1.0  # Already present
Flask-SeaSurf==1.1.1   # CSRF protection

# Configuration
from flask_talisman import Talisman
from flask_seasurf import SeaSurf

# HTTPS enforcement (production only)
if app.config['ENV'] == 'production':
    Talisman(app,
        force_https=True,
        strict_transport_security=True,
        content_security_policy={
            'default-src': "'self'",
            'img-src': ['*', 'data:'],
            'script-src': ["'self'", "'unsafe-inline'", 'cdn.jsdelivr.net'],
            'style-src': ["'self'", "'unsafe-inline'", 'cdn.jsdelivr.net']
        }
    )

# CSRF protection
csrf = SeaSurf(app)

# Exempt API endpoints (use API keys instead)
@csrf.exempt
@app.route('/api/player-feedback', methods=['POST'])
def api_player_feedback():
    # Verify API key
    api_key = request.headers.get('X-API-Key')
    if not verify_api_key(api_key):
        return jsonify({'error': 'Unauthorized'}), 401
    # ... rest of logic
```
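
The snippet above calls a `verify_api_key()` helper that is not defined anywhere yet; a minimal sketch, assuming a single shared key delivered via an environment variable:

```python
import hmac
import os

def verify_api_key(api_key):
    """Compare the submitted key against PLAYER_API_KEY in constant time."""
    expected = os.getenv('PLAYER_API_KEY', '')
    if not api_key or not expected:
        return False
    # hmac.compare_digest avoids leaking key contents via timing differences
    return hmac.compare_digest(api_key, expected)
```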

---

### 12. Input Validation & Sanitization

```python
# Add to requirements.txt
marshmallow==3.20.1
Flask-Marshmallow==0.15.0

# schemas.py
from marshmallow import Schema, fields, validate

class PlayerFeedbackSchema(Schema):
    player_name = fields.Str(required=True, validate=validate.Length(min=1, max=100))
    quickconnect_code = fields.Str(required=True, validate=validate.Length(min=6, max=20))
    message = fields.Str(required=True, validate=validate.Length(max=500))
    status = fields.Str(required=True, validate=validate.OneOf(['active', 'error', 'playing', 'stopped']))
    timestamp = fields.DateTime(required=True)
    playlist_version = fields.Int(allow_none=True)
    error_details = fields.Str(allow_none=True, validate=validate.Length(max=1000))

# Usage
from marshmallow import ValidationError
from schemas import PlayerFeedbackSchema

@app.route('/api/player-feedback', methods=['POST'])
def api_player_feedback():
    schema = PlayerFeedbackSchema()
    try:
        data = schema.load(request.get_json())
    except ValidationError as err:
        return jsonify({'error': err.messages}), 400

    # Data is now validated and sanitized
    feedback = PlayerFeedback(**data)
    db.session.add(feedback)
    db.session.commit()
    return jsonify({'success': True}), 200
```

---

## Implementation Roadmap

### Phase 1: Quick Wins (1-2 days)
1. ✅ Multi-stage Docker build (reduce image size)
2. ✅ Add basic caching for dashboard
3. ✅ Database indexes
4. ✅ Type hints for main functions

### Phase 2: Architecture (3-5 days)
1. ✅ Split app.py into blueprints
2. ✅ Add Redis caching
3. ✅ Implement Celery for background tasks
4. ✅ Add nginx reverse proxy

### Phase 3: Polish (2-3 days)
1. ✅ Security hardening
2. ✅ Input validation
3. ✅ Health checks & monitoring
4. ✅ Environment-based config

### Phase 4: Testing & Documentation (2-3 days)
1. ✅ Unit tests
2. ✅ Integration tests
3. ✅ API documentation
4. ✅ Deployment guide

---

## Expected Results

### Performance
- **Page Load Time**: 2-3s → 0.5-1s (50-75% faster)
- **API Response**: 100-200ms → 20-50ms (75% faster)
- **Video Upload**: Blocks request → Async (immediate response)
- **Docker Image**: 3.53 GB → 800 MB (77% smaller)

### Scalability
- **Concurrent Users**: 10-20 → 100-200 (10x)
- **Request Handling**: 10 req/s → 100 req/s (10x)
- **Database Load**: High → Low (caching)

### Maintainability
- **Code Organization**: Monolithic → Modular (blueprints)
- **Type Safety**: None → Type hints
- **Testing**: Difficult → Easy (smaller modules)
- **Documentation**: Scattered → Centralized

---

## Cost-Benefit Analysis

| Optimization | Effort | Impact | Priority |
|--------------|--------|--------|----------|
| Multi-stage Docker | Low | High | 🔴 Critical |
| Split to Blueprints | Medium | High | 🔴 Critical |
| Redis Caching | Low | High | 🔴 Critical |
| Celery Background | Medium | High | 🟡 High |
| Database Indexes | Low | Medium | 🟡 High |
| nginx Proxy | Low | Medium | 🟡 High |
| Type Hints | Low | Low | 🟢 Medium |
| Rate Limiting | Low | Low | 🟢 Medium |
| Security Hardening | Medium | Medium | 🟡 High |
| Monitoring | Low | Medium | 🟢 Medium |

---

## Next Steps

1. **Review this proposal** with the team
2. **Prioritize optimizations** based on current pain points
3. **Create feature branches** for each optimization
4. **Implement in phases** to minimize disruption
5. **Test thoroughly** before deploying to production

Would you like me to start implementing any of these optimizations?
OPTIMIZATION_SUMMARY.md (new file, 134 lines)
@@ -0,0 +1,134 @@
# DigiServer Optimization - Quick Reference

## 🎯 Top 3 Critical Optimizations

### 1. Reduce Docker Image Size: 3.53 GB → 800 MB (77% smaller)
**Impact**: Faster deployments, less storage
**Effort**: 2 hours
**File**: `Dockerfile` - implement multi-stage build

### 2. Split Monolithic app.py (1,051 lines) into Blueprints
**Impact**: Better maintainability, easier testing
**Effort**: 1 day
**Structure**:
```
blueprints/
├── auth.py      # Login/Register
├── admin.py     # Admin panel
├── players.py   # Player management
├── groups.py    # Group management
├── content.py   # Upload/Media
└── api.py       # API endpoints
```

### 3. Add Redis Caching
**Impact**: 50-80% faster page loads
**Effort**: 4 hours
**Add**: Redis container + Flask-Caching (minimal setup sketched below)
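
Condensed from `OPTIMIZATION_PROPOSAL.md` (the compose file also gains a `redis:7-alpine` service there):

```python
from flask_caching import Cache

cache = Cache(config={'CACHE_TYPE': 'redis', 'CACHE_REDIS_HOST': 'redis'})
cache.init_app(app)
```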

---

## 📊 Current State

| Metric | Value | Status |
|--------|-------|--------|
| Docker Image | 3.53 GB | ⚠️ Too large |
| Main File (app.py) | 1,051 lines | ⚠️ Monolithic |
| Routes | 30+ endpoints | ⚠️ No structure |
| Caching | None | ❌ Missing |
| Background Tasks | Synchronous | ❌ Blocks requests |
| API Rate Limiting | None | ⚠️ Security risk |

---

## 🚀 Quick Performance Wins

### Database Indexes (30 minutes)
```python
# Add to models
class Content(db.Model):
    player_id = db.Column(db.Integer, index=True)
    position = db.Column(db.Integer, index=True)
```

### Cache Dashboard (1 hour)
```python
from flask_caching import Cache
cache = Cache(config={'CACHE_TYPE': 'simple'})
cache.init_app(app)

@cache.cached(timeout=60)
def dashboard():
    ...  # view body unchanged; result cached for 60 seconds
```

### Type Hints (2 hours)
```python
def get_player_content(player_id: int) -> List[Content]:
    return Content.query.filter_by(player_id=player_id).all()
```

---

## 📈 Expected Results

| Metric | Before | After | Improvement |
|--------|--------|-------|-------------|
| Docker Image | 3.53 GB | 800 MB | 77% ↓ |
| Page Load | 2-3s | 0.5-1s | 70% ↓ |
| API Response | 100-200ms | 20-50ms | 75% ↓ |
| Concurrent Users | 10-20 | 100-200 | 10x ↑ |
| Maintainability | Low | High | ++ |

---

## 🔧 Implementation Order

### Week 1: Critical
- [ ] Multi-stage Docker build
- [ ] Database indexes
- [ ] Basic caching

### Week 2: Architecture
- [ ] Split to blueprints
- [ ] Add Redis
- [ ] Celery for video processing

### Week 3: Polish
- [ ] nginx reverse proxy
- [ ] Security hardening
- [ ] Monitoring & health checks

---

## 💡 Quick Commands

### Rebuild Docker (smaller image)
```bash
docker compose down
docker compose build --no-cache
docker compose up -d
```

### Check Image Size
```bash
docker images digiserver:latest
```

### Monitor Performance
```bash
docker stats digiserver
```

---

## 📝 Files to Modify

1. **Dockerfile** - Multi-stage build
2. **docker-compose.yml** - Add Redis, Celery, nginx
3. **app.py** - Split into blueprints
4. **requirements.txt** - Add redis, celery, flask-caching
5. **models/*.py** - Add indexes

---

See `OPTIMIZATION_PROPOSAL.md` for the detailed implementation guide.
app/app.py (206 lines)
@@ -1,3 +1,10 @@
# ...existing code...

# Player feedback API
from models.player_feedback import PlayerFeedback

# --- API route to receive player feedback ---

import os
import click
import psutil
@@ -69,8 +76,12 @@ db_path = os.path.join(instance_dir, 'dashboard.db')
app.config['SQLALCHEMY_DATABASE_URI'] = f'sqlite:///{db_path}'
app.config['SQLALCHEMY_TRACK_MODIFICATIONS'] = False

# Set maximum content length to 1GB
app.config['MAX_CONTENT_LENGTH'] = 2048 * 2048 * 2048  # 2GB, adjust as needed
# Set maximum content length to 2GB
app.config['MAX_CONTENT_LENGTH'] = 2048 * 1024 * 1024  # 2GB, adjust as needed

# Set longer timeouts for file processing
app.config['SEND_FILE_MAX_AGE_DEFAULT'] = 300  # 5 minutes for static files
app.config['PERMANENT_SESSION_LIFETIME'] = 1800  # 30 minutes for sessions

# Ensure the instance folder exists
os.makedirs(app.instance_path, exist_ok=True)
@@ -95,6 +106,58 @@ login_manager.login_view = 'login'

migrate = Migrate(app, db)

# Global dictionary to track upload progress
# Format: {session_id: {'status': 'uploading/converting/complete', 'progress': 0-100, 'message': 'details', 'files_total': N, 'files_processed': N}}
upload_progress = {}

@app.route('/api/player-feedback', methods=['POST'])
def api_player_feedback():
    from datetime import datetime
    import dateutil.parser

    data = request.get_json()
    required_fields = ['player_name', 'quickconnect_code', 'message', 'status', 'timestamp']
    if not all(field in data for field in required_fields):
        return jsonify({'error': 'Missing required fields'}), 400

    # Convert timestamp string to datetime object
    try:
        if isinstance(data['timestamp'], str):
            timestamp = dateutil.parser.parse(data['timestamp'])
        else:
            timestamp = data['timestamp']
    except (ValueError, TypeError):
        return jsonify({'error': 'Invalid timestamp format'}), 400

    feedback = PlayerFeedback(
        player_name=data['player_name'],
        quickconnect_code=data['quickconnect_code'],
        message=data['message'],
        status=data['status'],
        timestamp=timestamp,
        playlist_version=data.get('playlist_version'),
        error_details=data.get('error_details')
    )
    db.session.add(feedback)
    db.session.commit()
    return jsonify({'success': True, 'feedback_id': feedback.id}), 200

# Add error handlers for better user experience
@app.errorhandler(413)
def request_entity_too_large(error):
    flash('File too large. Please upload files smaller than 2GB.', 'danger')
    return redirect(url_for('dashboard'))

@app.errorhandler(408)
def request_timeout(error):
    flash('Request timed out. Please try uploading smaller files or try again later.', 'danger')
    return redirect(url_for('dashboard'))

@app.errorhandler(500)
def internal_server_error(error):
    flash('An internal server error occurred. Please try again or contact support.', 'danger')
    return redirect(url_for('dashboard'))

@login_manager.user_loader
def load_user(user_id):
    return db.session.get(User, int(user_id))
@@ -200,21 +263,73 @@ def logout():
@admin_required
def upload_content():
    if request.method == 'POST':
        import uuid

        target_type = request.form.get('target_type')
        target_id = request.form.get('target_id')
        files = request.files.getlist('files')
        duration = int(request.form['duration'])
        return_url = request.form.get('return_url')
        media_type = request.form['media_type']
        session_id = request.form.get('session_id', str(uuid.uuid4()))

        print(f"Target Type: {target_type}, Target ID: {target_id}, Media Type: {media_type}")
        print(f"Target Type: {target_type}, Target ID: {target_id}, Media Type: {media_type}, Session ID: {session_id}")

        if not target_type or not target_id:
            flash('Please select a target type and target ID.', 'danger')
            return redirect(url_for('upload_content'))

        # Process uploaded files and get results
        results = process_uploaded_files(app, files, media_type, duration, target_type, target_id)
        # Initialize progress tracking
        upload_progress[session_id] = {
            'status': 'uploading',
            'progress': 0,
            'message': 'Starting upload...',
            'files_total': len(files),
            'files_processed': 0
        }

        try:
            # Process uploaded files and get results
            results = process_uploaded_files(app, files, media_type, duration, target_type, target_id, upload_progress, session_id)

            # Check if video conversion is happening in background
            if media_type == 'video':
                # For videos, don't mark as complete yet - background thread will do it
                # Status remains as "converting" set by the background thread
                flash('Video upload started. Conversion is in progress...', 'info')
            else:
                # For non-videos (images, PDF, PPT), mark as complete
                upload_progress[session_id] = {
                    'status': 'complete',
                    'progress': 100,
                    'message': 'All files processed successfully!',
                    'files_total': len(files),
                    'files_processed': len(files)
                }

            # Check for any failed uploads
            failed_files = [r for r in results if not r.get('success', True)]
            if failed_files:
                for failed in failed_files:
                    flash(f"Error uploading {failed.get('filename', 'unknown file')}: {failed.get('message', 'Unknown error')}", 'warning')
            else:
                flash('All files uploaded and processed successfully!', 'success')

        except Exception as e:
            print(f"Error in upload_content: {e}")
            import traceback
            traceback.print_exc()

            # Mark as error
            upload_progress[session_id] = {
                'status': 'error',
                'progress': 0,
                'message': f'Upload failed: {str(e)}',
                'files_total': len(files),
                'files_processed': 0
            }

            flash(f'Upload failed: {str(e)}', 'danger')

        return redirect(return_url)

@@ -295,7 +410,49 @@ def create_user():
def player_page(player_id):
    player = db.session.get(Player, player_id)
    content = get_player_content(player_id)
    return render_template('player_page.html', player=player, content=content)

    # Get last 5 feedback entries for this player
    player_feedback = PlayerFeedback.query.filter_by(
        player_name=player.username
    ).order_by(PlayerFeedback.timestamp.desc()).limit(5).all()

    # Get server playlist version for this player
    server_playlist_version = get_server_playlist_version(player)

    return render_template('player_page.html',
                           player=player,
                           content=content,
                           player_feedback=player_feedback,
                           server_playlist_version=server_playlist_version)

def get_server_playlist_version(player):
    """Get the current server playlist version for a specific player"""
    # Check if player is locked to a group
    if player.locked_to_group_id:
        # Get content for all players in the group to ensure shared content
        group_players = player.locked_to_group.players
        player_ids = [p.id for p in group_players]

        # Use the first occurrence of each file for the playlist
        content_query = (
            db.session.query(
                Content.file_name,
                db.func.min(Content.id).label('id'),
                db.func.min(Content.duration).label('duration')
            )
            .filter(Content.player_id.in_(player_ids))
            .group_by(Content.file_name)
        )

        content = db.session.query(Content).filter(
            Content.id.in_([c.id for c in content_query])
        ).all()
    else:
        # Get player's individual content
        content = Content.query.filter_by(player_id=player.id).all()

    # Return the current playlist version for this player
    return player.playlist_version

@app.route('/player/<int:player_id>/upload', methods=['POST'])
@login_required
@@ -598,6 +755,22 @@ def get_playlists():
def media(filename):
    return send_from_directory(app.config['UPLOAD_FOLDER'], filename)

@app.route('/api/upload_progress/<session_id>', methods=['GET'])
@login_required
def get_upload_progress(session_id):
    """
    API endpoint to get upload/conversion progress for a session.
    Returns JSON with status, progress percentage, and current message.
    """
    progress_data = upload_progress.get(session_id, {
        'status': 'unknown',
        'progress': 0,
        'message': 'No active upload found',
        'files_total': 0,
        'files_processed': 0
    })
    return jsonify(progress_data)

@app.context_processor
def inject_theme():
    if current_user.is_authenticated:
@@ -624,13 +797,32 @@ def create_group():
@login_required
@admin_required
def manage_group(group_id):
    from models.player_feedback import PlayerFeedback

    group = Group.query.get_or_404(group_id)
    content = get_group_content(group_id)
    # Debug content ordering
    print("Group content positions before sorting:", [(c.id, c.file_name, c.position) for c in content])
    content = sorted(content, key=lambda c: c.position)
    print("Group content positions after sorting:", [(c.id, c.file_name, c.position) for c in content])
    return render_template('manage_group.html', group=group, content=content)

    # Fetch player feedback for all players in the group
    players_status = []
    for player in group.players:
        player_feedback = PlayerFeedback.query.filter_by(player_name=player.username)\
            .order_by(PlayerFeedback.timestamp.desc())\
            .limit(5)\
            .all()
        players_status.append({
            'player': player,
            'feedback': player_feedback,
            'server_playlist_version': player.playlist_version
        })

    return render_template('manage_group.html',
                           group=group,
                           content=content,
                           players_status=players_status)

@app.route('/group/<int:group_id>/edit', methods=['GET', 'POST'])
@login_required
app/migrations/README (new file, 1 line)
@@ -0,0 +1 @@
Single-database configuration for Flask.
app/migrations/alembic.ini (new file, 50 lines)
@@ -0,0 +1,50 @@
# A generic, single database configuration.

[alembic]
# template used to generate migration files
# file_template = %%(rev)s_%%(slug)s

# set to 'true' to run the environment during
# the 'revision' command, regardless of autogenerate
# revision_environment = false


# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic,flask_migrate

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[logger_flask_migrate]
level = INFO
handlers =
qualname = flask_migrate

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
app/migrations/env.py (new file, 113 lines)
@@ -0,0 +1,113 @@
import logging
from logging.config import fileConfig

from flask import current_app

from alembic import context

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This line sets up loggers basically.
fileConfig(config.config_file_name)
logger = logging.getLogger('alembic.env')


def get_engine():
    try:
        # this works with Flask-SQLAlchemy<3 and Alchemical
        return current_app.extensions['migrate'].db.get_engine()
    except (TypeError, AttributeError):
        # this works with Flask-SQLAlchemy>=3
        return current_app.extensions['migrate'].db.engine


def get_engine_url():
    try:
        return get_engine().url.render_as_string(hide_password=False).replace(
            '%', '%%')
    except AttributeError:
        return str(get_engine().url).replace('%', '%%')


# add your model's MetaData object here
# for 'autogenerate' support
# from myapp import mymodel
# target_metadata = mymodel.Base.metadata
config.set_main_option('sqlalchemy.url', get_engine_url())
target_db = current_app.extensions['migrate'].db

# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option")
# ... etc.


def get_metadata():
    if hasattr(target_db, 'metadatas'):
        return target_db.metadatas[None]
    return target_db.metadata


def run_migrations_offline():
    """Run migrations in 'offline' mode.

    This configures the context with just a URL
    and not an Engine, though an Engine is acceptable
    here as well. By skipping the Engine creation
    we don't even need a DBAPI to be available.

    Calls to context.execute() here emit the given string to the
    script output.

    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url, target_metadata=get_metadata(), literal_binds=True
    )

    with context.begin_transaction():
        context.run_migrations()


def run_migrations_online():
    """Run migrations in 'online' mode.

    In this scenario we need to create an Engine
    and associate a connection with the context.

    """

    # this callback is used to prevent an auto-migration from being generated
    # when there are no changes to the schema
    # reference: http://alembic.zzzcomputing.com/en/latest/cookbook.html
    def process_revision_directives(context, revision, directives):
        if getattr(config.cmd_opts, 'autogenerate', False):
            script = directives[0]
            if script.upgrade_ops.is_empty():
                directives[:] = []
                logger.info('No changes in schema detected.')

    conf_args = current_app.extensions['migrate'].configure_args
    if conf_args.get("process_revision_directives") is None:
        conf_args["process_revision_directives"] = process_revision_directives

    connectable = get_engine()

    with connectable.connect() as connection:
        context.configure(
            connection=connection,
            target_metadata=get_metadata(),
            **conf_args
        )

        with context.begin_transaction():
            context.run_migrations()


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
app/migrations/script.py.mako (new file, 24 lines)
@@ -0,0 +1,24 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade():
    ${upgrades if upgrades else "pass"}


def downgrade():
    ${downgrades if downgrades else "pass"}
@@ -0,0 +1,38 @@
"""Add PlayerFeedback table

Revision ID: 217eab16e4e4
Revises:
Create Date: 2025-09-08 11:30:26.742813

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '217eab16e4e4'
down_revision = None
branch_labels = None
depends_on = None


def upgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('player_feedback',
    sa.Column('id', sa.Integer(), nullable=False),
    sa.Column('player_name', sa.String(length=255), nullable=False),
    sa.Column('quickconnect_code', sa.String(length=255), nullable=False),
    sa.Column('message', sa.Text(), nullable=False),
    sa.Column('status', sa.String(length=50), nullable=False),
    sa.Column('timestamp', sa.DateTime(), nullable=False),
    sa.Column('playlist_version', sa.Integer(), nullable=True),
    sa.Column('error_details', sa.Text(), nullable=True),
    sa.PrimaryKeyConstraint('id')
    )
    # ### end Alembic commands ###


def downgrade():
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_table('player_feedback')
    # ### end Alembic commands ###
@@ -2,4 +2,5 @@ from .user import User
from .player import Player
from .group import Group, group_player
from .content import Content
from .server_log import ServerLog
from .server_log import ServerLog
from .player_feedback import PlayerFeedback
app/models/player_feedback.py (new file, 11 lines)
@@ -0,0 +1,11 @@
from extensions import db

class PlayerFeedback(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    player_name = db.Column(db.String(255), nullable=False)
    quickconnect_code = db.Column(db.String(255), nullable=False)
    message = db.Column(db.Text, nullable=False)
    status = db.Column(db.String(50), nullable=False)
    timestamp = db.Column(db.DateTime, nullable=False)
    playlist_version = db.Column(db.Integer, nullable=True)
    error_details = db.Column(db.Text, nullable=True)
@@ -18,6 +18,9 @@ alembic==1.14.1
Mako==1.3.8
greenlet==3.1.1

# Date parsing
python-dateutil==2.9.0

# File Processing
pdf2image==1.17.0
PyPDF2==3.0.1
Binary files not shown (5 files; previous image sizes: 153 KiB, 52 KiB, 331 KiB, 432 KiB).
@@ -3,8 +3,9 @@
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Manage Group</title>
    <title>Manage Group - {{ group.name }}</title>
    <link href="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha3/dist/css/bootstrap.min.css" rel="stylesheet">
    <link href="https://cdn.jsdelivr.net/npm/bootstrap-icons@1.11.0/font/bootstrap-icons.css" rel="stylesheet">
    <style>
        body.dark-mode {
            background-color: #121212;
@@ -17,16 +18,72 @@
        .dark-mode label, .dark-mode th, .dark-mode td {
            color: #ffffff;
        }

        /* Logo styling */
        .logo {
            max-height: 80px;
            margin-right: 15px;
        }

        /* Mobile optimizations */
        @media (max-width: 768px) {
            .logo {
                max-height: 50px;
                margin-right: 10px;
            }
            h1 {
                font-size: 1.5rem;
                font-size: 1.3rem;
            }
            h5 {
                font-size: 1rem;
            }
            h6 {
                font-size: 0.9rem;
            }
            .btn {
                font-size: 0.9rem;
                padding: 0.5rem 1rem;
                font-size: 0.85rem;
                padding: 0.4rem 0.8rem;
            }
            .btn-sm {
                font-size: 0.75rem;
                padding: 0.25rem 0.5rem;
            }
            .card {
                margin-bottom: 1rem;
                margin-bottom: 0.75rem;
            }
            .card-body {
                padding: 0.75rem;
            }
            .badge {
                font-size: 0.75rem;
            }
            /* Stack buttons vertically on mobile */
            .action-buttons .btn {
                display: block;
                width: 100%;
                margin-bottom: 0.5rem;
            }
            /* Smaller text on mobile */
            small {
                font-size: 0.75rem;
            }
            /* Reduce padding in tables */
            .list-group-item {
                padding: 0.5rem;
            }
        }

        /* Smaller screens - further optimization */
        @media (max-width: 576px) {
            .container-fluid {
                padding-left: 10px;
                padding-right: 10px;
            }
            h1 {
                font-size: 1.1rem;
            }
            .card-header h5 {
                font-size: 0.95rem;
            }
        }

@@ -49,38 +106,135 @@
        .drag-over {
            border-top: 2px solid #0d6efd;
        }

        /* Player status card compact design */
        .player-status-card {
            font-size: 0.9rem;
        }

        @media (max-width: 768px) {
            .player-status-card {
                font-size: 0.85rem;
            }
        }
    </style>
</head>
<body class="{{ 'dark-mode' if theme == 'dark' else '' }}">
    <div class="container py-5">
        <h1 class="text-center mb-4">Manage Group: {{ group.name }}</h1>

        <!-- Group Information Card -->
        <div class="card mb-4 {{ 'dark-mode' if theme == 'dark' else '' }}">
            <div class="card-header bg-info text-white">
                <h2>Group Info</h2>
            </div>
            <div class="card-body">
                <p><strong>Group Name:</strong> {{ group.name }}</p>
                <p><strong>Number of Players:</strong> {{ group.players|length }}</p>
    <div class="container-fluid py-3 py-md-4 py-lg-5">
        <!-- Header with Logo and Title -->
        <div class="d-flex justify-content-start align-items-center mb-3 mb-md-4">
            {% if logo_exists %}
            <img src="{{ url_for('static', filename='resurse/logo.png') }}" alt="Logo" class="logo">
            {% endif %}
            <div>
                <h1 class="mb-1">Manage Group</h1>
                <p class="text-muted mb-0 d-none d-md-block">{{ group.name }}</p>
            </div>
        </div>

        <!-- List of Players in the Group -->
        <div class="card mb-4 {{ 'dark-mode' if theme == 'dark' else '' }}">
            <div class="card-header bg-secondary text-white">
                <h2>Players in Group</h2>
        <!-- Mobile: Show group name if not shown in header -->
        <div class="d-md-none mb-3">
            <div class="badge bg-primary fs-6">{{ group.name }}</div>
        </div>

        <!-- Row with Group Info (left) and Players Status (right) -->
        <div class="row mb-3 mb-md-4">
            <!-- Group Information Card - Responsive width -->
            <div class="col-lg-3 col-md-4 col-12 mb-3">
                <div class="card h-100 {{ 'dark-mode' if theme == 'dark' else '' }}">
                    <div class="card-header bg-info text-white">
                        <h5 class="mb-0"><i class="bi bi-info-circle me-2"></i>Group Info</h5>
                    </div>
                    <div class="card-body p-3">
                        <div class="mb-2">
                            <small class="text-muted">Group Name</small>
                            <p class="mb-0"><strong>{{ group.name }}</strong></p>
                        </div>
                        <div class="mb-2">
                            <small class="text-muted">Players</small>
                            <p class="mb-0"><strong>{{ group.players|length }}</strong></p>
                        </div>
                        <div class="mb-0">
                            <small class="text-muted">Playlist Version</small>
                            <p class="mb-0"><span class="badge bg-info mt-1">v{{ group.playlist_version }}</span></p>
                        </div>
                    </div>
                </div>
            </div>
            <div class="card-body">
                <ul class="list-group">
                    {% for player in group.players %}
                    <li class="list-group-item d-flex justify-content-between align-items-center">
                        <div>
                            <strong>{{ player.username }}</strong> ({{ player.hostname }})

            <!-- Players Status Cards Container - 3/4 width on large screens -->
            <div class="col-lg-9 col-md-8 col-12">
                <div class="card {{ 'dark-mode' if theme == 'dark' else '' }}">
                    <div class="card-header bg-success text-white">
                        <h5 class="mb-0"><i class="bi bi-display me-2"></i>Players ({{ group.players|length }})</h5>
                    </div>
                    <div class="card-body p-2 p-md-3">
                        {% if players_status %}
                        <div class="row g-2 g-md-3">
                            {% for player_status in players_status %}
                            <div class="col-xl-4 col-lg-6 col-12 mb-2">
                                <div class="card h-100 border-primary player-status-card {{ 'dark-mode' if theme == 'dark' else '' }}">
                                    <div class="card-header bg-primary text-white d-flex justify-content-between align-items-center py-2">
                                        <h6 class="mb-0"><i class="bi bi-tv me-1"></i>{{ player_status.player.username }}</h6>
                                        <a href="{{ url_for('player_page', player_id=player_status.player.id) }}"
                                           class="btn btn-sm btn-light py-0 px-2" title="View Details">
                                            <i class="bi bi-eye"></i>
                                        </a>
                                    </div>
                                    <div class="card-body p-2">
                                        <div class="mb-2">
                                            <small class="text-muted"><i class="bi bi-hdd-network me-1"></i>Hostname:</small>
                                            <small class="d-block">{{ player_status.player.hostname }}</small>
                                        </div>

                                        {% if player_status.feedback %}
                                        <div class="mb-2">
                                            <small class="text-muted"><i class="bi bi-activity me-1"></i>Status:</small>
                                            <span class="badge bg-{{ 'success' if player_status.feedback[0].status in ['active', 'playing'] else 'danger' }}">
                                                {{ player_status.feedback[0].status|title }}
                                            </span>
                                        </div>
                                        <div class="mb-2">
                                            <small class="text-muted"><i class="bi bi-clock me-1"></i>Last Activity:</small>
                                            <small class="d-block">{{ player_status.feedback[0].timestamp.strftime('%Y-%m-%d %H:%M:%S') }}</small>
                                        </div>
                                        <div class="mb-2">
                                            <small class="text-muted"><i class="bi bi-chat-dots me-1"></i>Message:</small>
                                            <small class="d-block text-muted">{{ player_status.feedback[0].message[:50] }}{% if player_status.feedback[0].message|length > 50 %}...{% endif %}</small>
                                        </div>
                                        <div class="mb-0">
                                            <small class="text-muted"><i class="bi bi-list-check me-1"></i>Playlist:</small>
                                            {% if player_status.feedback[0].playlist_version %}
                                                {% if player_status.feedback[0].playlist_version|int == player_status.server_playlist_version %}
                                                    <span class="badge bg-success">v{{ player_status.feedback[0].playlist_version }} ✓</span>
                                                    <small class="text-success d-block">In sync</small>
                                                {% else %}
                                                    <span class="badge bg-warning text-dark">v{{ player_status.feedback[0].playlist_version }}</span>
                                                    <small class="text-warning d-block">⚠ Out of sync (server: v{{ player_status.server_playlist_version }})</small>
                                                {% endif %}
                                            {% else %}
                                                <span class="badge bg-secondary">Unknown</span>
                                            {% endif %}
                                        </div>
                                        {% else %}
                                        <div class="text-center text-muted py-2">
                                            <p class="mb-1"><small>No status data</small></p>
                                            <small>Player hasn't reported yet</small>
                                        </div>
                                        {% endif %}
                                    </div>
                                </div>
                            </div>
                    </li>
                    {% endfor %}
                </ul>
                            {% endfor %}
                        </div>
                        {% else %}
                        <div class="text-center text-muted py-3">
                            <i class="bi bi-inbox display-4 d-block mb-2"></i>
                            <p>No players in this group</p>
                        </div>
                        {% endif %}
                    </div>
                </div>
            </div>
        </div>

@@ -110,25 +264,41 @@

        <ul class="list-group sortable-list" id="groupMediaList">
            {% for media in content %}
            <li class="list-group-item d-flex align-items-center {{ 'dark-mode' if theme == 'dark' else '' }}"
            <li class="list-group-item d-flex align-items-center {{ 'dark-mode' if theme == 'dark' else '' }}"
                draggable="true"
                data-id="{{ media.id }}"
                data-position="{{ loop.index0 }}">
                <!-- Checkbox for bulk selection -->
                <div class="me-2">
                    <input class="form-check-input media-checkbox"
                           type="checkbox"
                           name="selected_content"
                    <input class="form-check-input media-checkbox"
                           type="checkbox"
                           name="selected_content"
                           value="{{ media.id }}">
                </div>

                <!-- Drag handle -->
                <div class="drag-handle me-2" title="Drag to reorder">
                    <i class="bi bi-grip-vertical"></i>
                    ☰
                </div>

                <div class="flex-grow-1">

                <!-- Media Thumbnail and Name -->
                <div class="flex-grow-1 mb-2 mb-md-0 d-flex align-items-center">
                    {% set file_ext = media.file_name.lower().split('.')[-1] %}
                    {% if file_ext in ['mp4', 'avi', 'mkv', 'mov', 'webm'] %}
                    <!-- Video file - show generic video icon -->
                    <div style="width: 48px; height: 48px; margin-right: 10px; border-radius: 4px; background: linear-gradient(135deg, #667eea 0%, #764ba2 100%); display: flex; align-items: center; justify-content: center;">
                        <svg width="24" height="24" viewBox="0 0 24 24" fill="none" xmlns="http://www.w3.org/2000/svg">
                            <path d="M8 5v14l11-7z" fill="white"/>
                        </svg>
                    </div>
                    {% else %}
                    <!-- Image file - show actual thumbnail -->
                    <img src="{{ url_for('static', filename='uploads/' ~ media.file_name) }}"
                         alt="thumbnail"
                         style="width: 48px; height: 48px; object-fit: cover; margin-right: 10px; border-radius: 4px;"
                         onerror="this.style.display='none';">
                    {% endif %}
                    <p class="mb-0"><strong>Media Name:</strong> {{ media.file_name }}</p>
                </div>
                <form action="{{ url_for('edit_group_media_route', group_id=group.id, content_id=media.id) }}" method="post" class="d-flex align-items-center">
@@ -153,12 +323,19 @@
        </div>

        <!-- Upload Media Button -->
        <div class="text-center mb-4">
            <a href="{{ url_for('upload_content', target_type='group', target_id=group.id, return_url=url_for('manage_group', group_id=group.id)) }}" class="btn btn-primary btn-lg">Go to Upload Media</a>
        <div class="text-center mb-3 action-buttons">
            <a href="{{ url_for('upload_content', target_type='group', target_id=group.id, return_url=url_for('manage_group', group_id=group.id)) }}"
               class="btn btn-primary btn-lg">
                <i class="bi bi-cloud-upload me-2"></i>Upload Media
            </a>
        </div>

        <!-- Back to Dashboard Button -->
        <a href="{{ url_for('dashboard') }}" class="btn btn-secondary">Back to Dashboard</a>
        <div class="text-center mb-3">
            <a href="{{ url_for('dashboard') }}" class="btn btn-secondary">
                <i class="bi bi-arrow-left me-2"></i>Back to Dashboard
            </a>
        </div>
    </div>

    <script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha3/dist/js/bootstrap.bundle.min.js"></script>
@@ -54,20 +54,95 @@
    <div class="container py-5">
        <h1 class="text-center mb-4">Player Schedule for {{ player.username }}</h1>

        <!-- Player Info Section -->
        <div class="card mb-4 {% if theme == 'dark' %}dark-mode{% endif %}">
            <div class="card-header bg-info text-white">
                <h2>Player Info</h2>
        <div class="row">
            <!-- Player Info Section -->
            <div class="col-md-6">
                <div class="card mb-4 {% if theme == 'dark' %}dark-mode{% endif %}">
                    <div class="card-header bg-info text-white">
                        <h2>Player Info</h2>
                    </div>
                    <div class="card-body">
                        <p><strong>Player Name:</strong> {{ player.username }}</p>
                        <p><strong>Hostname:</strong> {{ player.hostname }}</p>
                        {% if current_user.role == 'admin' %}
                        <a href="{{ url_for('edit_player', player_id=player.id, return_url=url_for('player_page', player_id=player.id)) }}" class="btn btn-warning">Update</a>
                        <form action="{{ url_for('delete_player', player_id=player.id) }}" method="post" style="display:inline;">
                            <button type="submit" class="btn btn-danger" onclick="return confirm('Are you sure you want to delete this player?');">Delete</button>
                        </form>
                        {% endif %}
                    </div>
                </div>
            </div>
            <div class="card-body">
                <p><strong>Player Name:</strong> {{ player.username }}</p>
                <p><strong>Hostname:</strong> {{ player.hostname }}</p>
                {% if current_user.role == 'admin' %}
                <a href="{{ url_for('edit_player', player_id=player.id, return_url=url_for('player_page', player_id=player.id)) }}" class="btn btn-warning">Update</a>
                <form action="{{ url_for('delete_player', player_id=player.id) }}" method="post" style="display:inline;">
                    <button type="submit" class="btn btn-danger" onclick="return confirm('Are you sure you want to delete this player?');">Delete</button>
                </form>
                {% endif %}

            <!-- Player Status Section -->
            <div class="col-md-6">
                <div class="card mb-4 {% if theme == 'dark' %}dark-mode{% endif %}">
                    <div class="card-header bg-success text-white">
                        <h2>Player Status</h2>
                    </div>
                    <div class="card-body">
                        {% if player_feedback %}
                        <div class="mb-3">
                            <strong>Current Status:</strong>
                            <span class="badge bg-{{ 'success' if player_feedback[0].status in ['active', 'playing'] else 'danger' }}">
                                {{ player_feedback[0].status|title }}
                            </span>
                        </div>
                        <div class="mb-3">
                            <strong>Last Activity:</strong> {{ player_feedback[0].timestamp.strftime('%Y-%m-%d %H:%M:%S') }}
                        </div>
                        <div class="mb-3">
                            <strong>Latest Message:</strong> {{ player_feedback[0].message }}
                        </div>
                        <div class="mb-3">
                            <strong>Server Playlist Version:</strong>
                            <span class="badge bg-info">v{{ server_playlist_version }}</span>
                            {% if player_feedback[0].playlist_version %}
                                {% if player_feedback[0].playlist_version|int == server_playlist_version %}
                                    <small class="text-success ms-2">✓ Player in sync</small>
                                {% else %}
                                    <small class="text-warning ms-2">⚠ Player v{{ player_feedback[0].playlist_version }} (out of sync)</small>
                                {% endif %}
                            {% else %}
                                <small class="text-muted ms-2">Player version unknown</small>
                            {% endif %}
                        </div>

                        <!-- Recent Activity Log -->
                        <details>
                            <summary class="fw-bold mb-2">Recent Activity (Last 5)</summary>
                            <div class="mt-2">
                                {% for feedback in player_feedback %}
                                <div class="border-bottom pb-2 mb-2">
                                    <div class="d-flex justify-content-between">
                                        <span class="badge bg-{{ 'success' if feedback.status in ['active', 'playing'] else 'danger' }}">
                                            {{ feedback.status|title }}
                                        </span>
                                        <small class="text-muted">{{ feedback.timestamp.strftime('%m-%d %H:%M') }}</small>
                                    </div>
                                    <div class="mt-1">
                                        <small>{{ feedback.message }}</small>
                                        {% if feedback.playlist_version %}
                                        <br><small class="text-muted">Playlist v{{ feedback.playlist_version }}</small>
                                        {% endif %}
                                    </div>
                                </div>
                                {% endfor %}
                            </div>
                        </details>
                        {% else %}
                        <div class="mb-3">
                            <strong>Server Playlist Version:</strong>
                            <span class="badge bg-info">v{{ server_playlist_version }}</span>
                            <small class="text-muted ms-2">Player version unknown</small>
                        </div>
                        <div class="text-center text-muted">
|
||||
<p>No status information available</p>
|
||||
<small>Player hasn't sent any feedback yet</small>
|
||||
</div>
|
||||
{% endif %}
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
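For context, the status card relies on `player_feedback` (most recent first) and `server_playlist_version` being supplied by the view. The route below is only a sketch of what that context could look like; the `PlayerFeedback` model name and query details are assumptions, while the template variable names come from this diff.

```python
# Hypothetical sketch of the view feeding this template; model and query
# names are assumptions, the context keys match the template above.
from flask import render_template

@app.route('/player/<int:player_id>')
def player_page(player_id):
    player = Player.query.get_or_404(player_id)
    # Newest feedback first; the template reads player_feedback[0] as "current"
    player_feedback = (PlayerFeedback.query
                       .filter_by(player_id=player.id)
                       .order_by(PlayerFeedback.timestamp.desc())
                       .limit(5)
                       .all())
    return render_template('player_page.html',
                           player=player,
                           player_feedback=player_feedback,
                           server_playlist_version=player.playlist_version,
                           theme='light')
```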
@@ -139,10 +214,22 @@
|
||||
|
||||
<!-- Media Thumbnail and Name -->
|
||||
<div class="flex-grow-1 mb-2 mb-md-0 d-flex align-items-center">
|
||||
<img src="{{ url_for('static', filename='uploads/' ~ media.file_name) }}"
|
||||
alt="thumbnail"
|
||||
style="width: 48px; height: 48px; object-fit: cover; margin-right: 10px; border-radius: 4px;"
|
||||
onerror="this.style.display='none';">
|
||||
{% set file_ext = media.file_name.lower().split('.')[-1] %}
|
||||
{% if file_ext in ['mp4', 'avi', 'mkv', 'mov', 'webm'] %}
|
||||
<!-- Video Icon for video files -->
|
||||
<div style="width: 48px; height: 48px; margin-right: 10px; border-radius: 4px; background: linear-gradient(135deg, #667eea 0%, #764ba2 100%); display: flex; align-items: center; justify-content: center;">
|
||||
<svg xmlns="http://www.w3.org/2000/svg" width="28" height="28" fill="white" viewBox="0 0 16 16">
|
||||
<path d="M8 15A7 7 0 1 1 8 1a7 7 0 0 1 0 14zm0 1A8 8 0 1 0 8 0a8 8 0 0 0 0 16z"/>
|
||||
<path d="M6.271 5.055a.5.5 0 0 1 .52.038l3.5 2.5a.5.5 0 0 1 0 .814l-3.5 2.5A.5.5 0 0 1 6 10.5v-5a.5.5 0 0 1 .271-.445z"/>
|
||||
</svg>
|
||||
</div>
|
||||
{% else %}
|
||||
<!-- Image thumbnail for image files -->
|
||||
<img src="{{ url_for('static', filename='uploads/' ~ media.file_name) }}"
|
||||
alt="thumbnail"
|
||||
style="width: 48px; height: 48px; object-fit: cover; margin-right: 10px; border-radius: 4px;"
|
||||
onerror="this.style.display='none';">
|
||||
{% endif %}
|
||||
<p class="mb-0"><strong>Media Name:</strong> {{ media.file_name }}</p>
|
||||
</div>
|
||||
|
||||
|
||||
@@ -57,7 +57,7 @@
|
||||
{% endif %}
|
||||
<h1 class="mb-0">Upload Content</h1>
|
||||
</div>
|
||||
<form id="upload-form" action="{{ url_for('upload_content') }}" method="post" enctype="multipart/form-data" onsubmit="showStatusModal()">
|
||||
<form id="upload-form" action="{{ url_for('upload_content') }}" method="post" enctype="multipart/form-data" onsubmit="handleFormSubmit(event)">
|
||||
<input type="hidden" name="return_url" value="{{ return_url }}">
|
||||
<div class="row">
|
||||
<div class="col-md-6 col-12">
|
||||
@@ -223,61 +223,127 @@
|
||||
|
||||
<script src="https://cdn.jsdelivr.net/npm/bootstrap@5.3.0-alpha3/dist/js/bootstrap.bundle.min.js"></script>
|
||||
<script>
|
||||
let progressInterval = null;
|
||||
let sessionId = null;
|
||||
let statusModal = null;
|
||||
let returnUrl = '{{ return_url }}';
|
||||
|
||||
// Generate unique session ID for this upload
|
||||
function generateSessionId() {
|
||||
return 'upload_' + Date.now() + '_' + Math.random().toString(36).substr(2, 9);
|
||||
}
|
||||
|
||||
function handleFormSubmit(event) {
|
||||
event.preventDefault(); // Prevent default form submission
|
||||
|
||||
// Generate session ID and add it to the form
|
||||
sessionId = generateSessionId();
|
||||
const form = document.getElementById('upload-form');
|
||||
let sessionInput = document.getElementById('session_id_input');
|
||||
if (!sessionInput) {
|
||||
sessionInput = document.createElement('input');
|
||||
sessionInput.type = 'hidden';
|
||||
sessionInput.name = 'session_id';
|
||||
sessionInput.id = 'session_id_input';
|
||||
form.appendChild(sessionInput);
|
||||
}
|
||||
sessionInput.value = sessionId;
|
||||
|
||||
// Show modal
|
||||
showStatusModal();
|
||||
|
||||
// Submit form via AJAX
|
||||
const formData = new FormData(form);
|
||||
|
||||
fetch(form.action, {
|
||||
method: 'POST',
|
||||
body: formData
|
||||
})
|
||||
.then(response => {
|
||||
if (!response.ok) {
|
||||
throw new Error('Upload failed');
|
||||
}
|
||||
console.log('Form submitted successfully');
|
||||
// Don't redirect yet - keep polling until status is complete
|
||||
})
|
||||
.catch(error => {
    console.error('Form submission error:', error);
    // upload_progress is the server-side Python dict and is not available in
    // the browser, so report the failure through the modal UI instead.
    if (progressInterval) {
        clearInterval(progressInterval);
        progressInterval = null;
    }
    const statusMessage = document.getElementById('status-message');
    if (statusMessage) {
        statusMessage.textContent = 'Upload failed: ' + error.message;
    }
    const closeBtn = document.querySelector('[data-bs-dismiss="modal"]');
    if (closeBtn) {
        closeBtn.disabled = false;
    }
});
|
||||
}
|
||||
|
||||
function showStatusModal() {
|
||||
console.log("Processing popup triggered");
|
||||
const statusModal = new bootstrap.Modal(document.getElementById('statusModal'));
|
||||
|
||||
statusModal = new bootstrap.Modal(document.getElementById('statusModal'));
|
||||
statusModal.show();
|
||||
|
||||
// Update status message based on media type
|
||||
const mediaType = document.getElementById('media_type').value;
|
||||
const statusMessage = document.getElementById('status-message');
|
||||
|
||||
switch(mediaType) {
|
||||
case 'image':
|
||||
statusMessage.textContent = 'Uploading images...';
|
||||
break;
|
||||
case 'video':
|
||||
statusMessage.textContent = 'Uploading and processing video. This may take a while...';
|
||||
break;
|
||||
case 'pdf':
|
||||
statusMessage.textContent = 'Converting PDF to 4K images. This may take a while...';
|
||||
break;
|
||||
case 'ppt':
|
||||
statusMessage.textContent = 'Converting PowerPoint to 4K images. This may take a while...';
|
||||
break;
|
||||
default:
|
||||
statusMessage.textContent = 'Uploading and processing your files. Please wait...';
|
||||
}
|
||||
|
||||
// Start system monitoring updates in modal
|
||||
|
||||
{% if system_info %}
|
||||
// Start system monitoring updates
|
||||
startModalSystemMonitoring();
|
||||
{% endif %}
|
||||
|
||||
// Simulate progress updates
|
||||
|
||||
// Start polling progress
|
||||
pollUploadProgress();
|
||||
}
|
||||
|
||||
function pollUploadProgress() {
|
||||
const statusMessage = document.getElementById('status-message');
|
||||
const progressBar = document.getElementById('progress-bar');
|
||||
let progress = 0;
|
||||
const interval = setInterval(() => {
|
||||
// For slow processes, increment more slowly
|
||||
const increment = (mediaType === 'image') ? 20 : 5;
|
||||
progress += increment;
|
||||
|
||||
if (progress >= 100) {
|
||||
clearInterval(interval);
|
||||
statusMessage.textContent = 'Files uploaded and processed successfully!';
|
||||
|
||||
// Stop system monitoring updates
|
||||
{% if system_info %}
|
||||
stopModalSystemMonitoring();
|
||||
{% endif %}
|
||||
|
||||
// Enable the close button
|
||||
document.querySelector('[data-bs-dismiss="modal"]').disabled = false;
|
||||
} else {
|
||||
progressBar.style.width = `${progress}%`;
|
||||
progressBar.setAttribute('aria-valuenow', progress);
|
||||
}
|
||||
}, 500);
|
||||
|
||||
// Poll every 500ms for real-time updates
|
||||
progressInterval = setInterval(() => {
|
||||
fetch(`/api/upload_progress/${sessionId}`)
|
||||
.then(response => response.json())
|
||||
.then(data => {
|
||||
console.log('Progress update:', data);
|
||||
|
||||
// Update progress bar
|
||||
progressBar.style.width = `${data.progress}%`;
|
||||
progressBar.setAttribute('aria-valuenow', data.progress);
|
||||
|
||||
// Update status message
|
||||
statusMessage.textContent = data.message;
|
||||
|
||||
// If complete or error, stop polling and enable close button
|
||||
if (data.status === 'complete' || data.status === 'error') {
|
||||
clearInterval(progressInterval);
|
||||
progressInterval = null;
|
||||
|
||||
{% if system_info %}
|
||||
stopModalSystemMonitoring();
|
||||
{% endif %}
|
||||
|
||||
const closeBtn = document.querySelector('[data-bs-dismiss="modal"]');
|
||||
closeBtn.disabled = false;
|
||||
|
||||
// Change progress bar color based on status
|
||||
if (data.status === 'complete') {
|
||||
progressBar.classList.remove('progress-bar-animated');
|
||||
progressBar.classList.add('bg-success');
|
||||
|
||||
// Auto-close after 2 seconds and redirect
|
||||
setTimeout(() => {
|
||||
statusModal.hide();
|
||||
window.location.href = returnUrl;
|
||||
}, 2000);
|
||||
} else if (data.status === 'error') {
|
||||
progressBar.classList.remove('progress-bar-animated', 'progress-bar-striped');
|
||||
progressBar.classList.add('bg-danger');
|
||||
}
|
||||
}
|
||||
})
|
||||
.catch(error => {
|
||||
console.error('Error fetching progress:', error);
|
||||
statusMessage.textContent = 'Error tracking upload progress';
|
||||
});
|
||||
}, 500); // Poll every 500ms
|
||||
}
|
||||
|
||||
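The polling loop above expects a JSON payload with `status`, `progress`, and `message` from `/api/upload_progress/<session_id>`. That endpoint is not part of this diff; a minimal sketch of what it presumably looks like, reading the same in-memory `upload_progress` dict the upload utilities write to, would be:

```python
# Hypothetical sketch of the progress endpoint the script polls; the route
# path mirrors the fetch() URL above and the payload keys match data.status,
# data.progress and data.message used by the JavaScript.
from flask import jsonify

upload_progress = {}  # session_id -> {'status', 'progress', 'message', ...}

@app.route('/api/upload_progress/<session_id>')
def upload_progress_status(session_id):
    data = upload_progress.get(session_id, {
        'status': 'pending',
        'progress': 0,
        'message': 'Waiting for upload to start...'
    })
    return jsonify(data)
```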
{% if system_info %}
|
||||
|
||||
@@ -9,12 +9,23 @@ The converted PDF is then processed by the main upload workflow for 4K image gen
import os
import subprocess
import logging
import signal
import time

# Set up logging
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def cleanup_libreoffice_processes():
    """Clean up any hanging LibreOffice processes"""
    try:
        subprocess.run(['pkill', '-f', 'soffice'], capture_output=True, timeout=10)
        time.sleep(1)  # Give processes time to terminate
    except Exception as e:
        logger.warning(f"Failed to cleanup LibreOffice processes: {e}")

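One caveat with `pkill -f soffice` is that it terminates every LibreOffice instance on the host, not just the conversion started here. An alternative sketch (an illustration, not part of this commit) tracks the spawned process and kills only that one on timeout:

```python
# Illustrative alternative to the pkill-based cleanup: manage the soffice
# process we started ourselves and terminate only that process on timeout.
import subprocess

def run_with_timeout(cmd, timeout=300):
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
    try:
        stdout, stderr = proc.communicate(timeout=timeout)
        return proc.returncode, stdout, stderr
    except subprocess.TimeoutExpired:
        proc.kill()  # only the conversion we launched is affected
        stdout, stderr = proc.communicate()
        return -1, stdout, stderr
```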
def pptx_to_pdf_libreoffice(pptx_path, output_dir):
|
||||
"""
|
||||
Convert PPTX to PDF using LibreOffice for highest quality.
|
||||
@@ -30,6 +41,9 @@ def pptx_to_pdf_libreoffice(pptx_path, output_dir):
|
||||
str: Path to the generated PDF file, or None if conversion failed
|
||||
"""
|
||||
try:
|
||||
# Clean up any existing LibreOffice processes
|
||||
cleanup_libreoffice_processes()
|
||||
|
||||
# Ensure output directory exists
|
||||
os.makedirs(output_dir, exist_ok=True)
|
||||
|
||||
@@ -39,14 +53,19 @@ def pptx_to_pdf_libreoffice(pptx_path, output_dir):
|
||||
'--headless',
|
||||
'--convert-to', 'pdf',
|
||||
'--outdir', output_dir,
|
||||
'--invisible', # Run without any UI
|
||||
'--nodefault', # Don't start with default template
|
||||
pptx_path
|
||||
]
|
||||
|
||||
logger.info(f"Converting PPTX to PDF using LibreOffice: {pptx_path}")
|
||||
result = subprocess.run(cmd, capture_output=True, text=True, timeout=120)
|
||||
# Increase timeout to 300 seconds (5 minutes) for large presentations
|
||||
result = subprocess.run(cmd, capture_output=True, text=True, timeout=300)
|
||||
|
||||
if result.returncode != 0:
|
||||
logger.error(f"LibreOffice conversion failed: {result.stderr}")
|
||||
logger.error(f"LibreOffice stdout: {result.stdout}")
|
||||
cleanup_libreoffice_processes() # Clean up on failure
|
||||
return None
|
||||
|
||||
# Find the generated PDF file
|
||||
@@ -55,16 +74,22 @@ def pptx_to_pdf_libreoffice(pptx_path, output_dir):
|
||||
|
||||
if os.path.exists(pdf_path):
|
||||
logger.info(f"PDF conversion successful: {pdf_path}")
|
||||
cleanup_libreoffice_processes() # Clean up after success
|
||||
return pdf_path
|
||||
else:
|
||||
logger.error(f"PDF file not found after conversion: {pdf_path}")
|
||||
cleanup_libreoffice_processes() # Clean up on failure
|
||||
return None
|
||||
|
||||
except subprocess.TimeoutExpired:
|
||||
logger.error("LibreOffice conversion timed out (120s)")
|
||||
logger.error("LibreOffice conversion timed out (300s)")
|
||||
cleanup_libreoffice_processes() # Clean up on timeout
|
||||
return None
|
||||
except Exception as e:
|
||||
logger.error(f"Error in PPTX to PDF conversion: {e}")
|
||||
import traceback
|
||||
logger.error(f"Traceback: {traceback.format_exc()}")
|
||||
cleanup_libreoffice_processes() # Clean up on error
|
||||
return None
|
||||
|
||||
|
||||
|
||||
@@ -52,86 +52,166 @@ def add_image_to_playlist(app, file, filename, duration, target_type, target_id)
|
||||
|
||||
# Video conversion functions
|
||||
def convert_video(input_file, output_folder):
|
||||
print(f"Video conversion skipped for: {input_file}")
|
||||
return input_file
|
||||
|
||||
def convert_video_and_update_playlist(app, file_path, original_filename, target_type, target_id, duration, upload_progress=None, session_id=None, file_index=0, total_files=1):
|
||||
"""
|
||||
Converts a video file to MP4 format with H.264 codec.
|
||||
"""
|
||||
# Use simple path resolution for containerized environment
|
||||
if not os.path.isabs(output_folder):
|
||||
# In container, relative paths work from /app directory
|
||||
print(f"Using relative path: {output_folder}")
|
||||
else:
|
||||
print(f"Using absolute path: {output_folder}")
|
||||
Convert video to Raspberry Pi optimized format, then add to playlist.
|
||||
This ensures players only download optimized videos.
|
||||
|
||||
if not os.path.exists(output_folder):
|
||||
os.makedirs(output_folder, exist_ok=True)
|
||||
print(f"Created output folder: {output_folder}")
|
||||
|
||||
# Generate the output file path
|
||||
base_name = os.path.splitext(os.path.basename(input_file))[0]
|
||||
output_file = os.path.join(output_folder, f"{base_name}.mp4")
|
||||
print(f"Converting video: {input_file} -> {output_file}")
|
||||
|
||||
# FFmpeg command to convert the video
|
||||
command = [
|
||||
"ffmpeg",
|
||||
"-i", input_file, # Input file
|
||||
"-c:v", "libx264", # Video codec: H.264
|
||||
"-preset", "fast", # Encoding speed/quality tradeoff
|
||||
"-crf", "23", # Constant Rate Factor (quality, lower is better)
|
||||
"-vf", "scale=-1:1080", # Scale video to 1080p (preserve aspect ratio)
|
||||
"-r", "30", # Frame rate: 30 FPS
|
||||
"-c:a", "aac", # Audio codec: AAC
|
||||
"-b:a", "128k", # Audio bitrate
|
||||
output_file # Output file
|
||||
]
|
||||
|
||||
try:
|
||||
# Run the FFmpeg command
|
||||
subprocess.run(command, check=True, stdout=subprocess.PIPE, stderr=subprocess.PIPE)
|
||||
print(f"Video converted successfully: {output_file}")
|
||||
return output_file
|
||||
except subprocess.CalledProcessError as e:
|
||||
print(f"Error converting video: {e.stderr.decode()}")
|
||||
Args:
|
||||
upload_progress (dict): Global progress tracking dictionary
|
||||
session_id (str): Unique session identifier for progress tracking
|
||||
file_index (int): Current file index being processed
|
||||
total_files (int): Total number of files being processed
|
||||
"""
|
||||
import shutil
|
||||
import tempfile
|
||||
print(f"Starting video optimization for Raspberry Pi: {file_path}")
|
||||
|
||||
# Update progress - conversion starting
|
||||
if upload_progress and session_id:
|
||||
print(f"[VIDEO CONVERSION] Setting initial progress for session {session_id}")
|
||||
upload_progress[session_id] = {
|
||||
'status': 'converting',
|
||||
'progress': 40,
|
||||
'message': f'Optimizing video for Raspberry Pi (30fps, H.264)...',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
print(f"[VIDEO CONVERSION] Progress set: {upload_progress[session_id]}")
|
||||
else:
|
||||
print(f"[VIDEO CONVERSION] WARNING: upload_progress or session_id is None!")
|
||||
|
||||
# Only process video files
|
||||
if not file_path.lower().endswith(('.mp4', '.avi', '.mkv', '.mov', '.webm')):
|
||||
print(f"Skipping non-video file: {file_path}")
|
||||
return None
|
||||
|
||||
def convert_video_and_update_playlist(app, file_path, original_filename, target_type, target_id, duration):
|
||||
"""
|
||||
Converts a video and updates the playlist database.
|
||||
"""
|
||||
print(f"Starting video conversion for: {file_path}")
|
||||
# Prepare temp output file
|
||||
temp_dir = tempfile.gettempdir()
|
||||
temp_output = os.path.join(temp_dir, f"optimized_{os.path.basename(file_path)}")
|
||||
|
||||
# Use simple path resolution for containerized environment
|
||||
upload_folder = app.config['UPLOAD_FOLDER']
|
||||
print(f"Upload folder: {upload_folder}")
|
||||
# Enhanced ffmpeg command for Raspberry Pi optimization
|
||||
ffmpeg_cmd = [
|
||||
'ffmpeg', '-y', '-i', file_path,
|
||||
'-c:v', 'libx264', # H.264 codec
|
||||
'-preset', 'medium', # Balanced encoding speed/quality
|
||||
'-profile:v', 'main', # Main profile for compatibility
|
||||
'-crf', '23', # Constant quality (23 is good balance)
|
||||
'-maxrate', '8M', # Max bitrate 8Mbps
|
||||
'-bufsize', '12M', # Buffer size
|
||||
'-vf', 'scale=\'min(1920,iw)\':\'min(1080,ih)\':force_original_aspect_ratio=decrease,fps=30', # Scale down if needed, 30fps
|
||||
'-r', '30', # Output framerate 30fps
|
||||
'-c:a', 'aac', # AAC audio codec
|
||||
'-b:a', '128k', # Audio bitrate 128kbps
|
||||
'-movflags', '+faststart', # Enable fast start for web streaming
|
||||
temp_output
|
||||
]
|
||||
|
||||
converted_file = convert_video(file_path, upload_folder)
|
||||
if converted_file:
|
||||
converted_filename = os.path.basename(converted_file)
|
||||
print(f"Video converted successfully: {converted_filename}")
|
||||
|
||||
# Use the application context to interact with the database
|
||||
print(f"Running ffmpeg optimization: {' '.join(ffmpeg_cmd)}")
|
||||
print(f"Settings: 1920x1080 max, 30fps, H.264, 8Mbps max bitrate")
|
||||
|
||||
# Update progress - conversion in progress
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id]['progress'] = 50
|
||||
upload_progress[session_id]['message'] = 'Converting video (this may take a few minutes)...'
|
||||
|
||||
try:
|
||||
result = subprocess.run(ffmpeg_cmd, capture_output=True, text=True, timeout=1800)
|
||||
if result.returncode != 0:
|
||||
print(f"ffmpeg error: {result.stderr}")
|
||||
print(f"Video conversion failed for: {original_filename}")
|
||||
|
||||
# Update progress - error
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'error',
|
||||
'progress': 0,
|
||||
'message': f'Video conversion failed: {original_filename}',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
|
||||
# Delete the unconverted file
|
||||
if os.path.exists(file_path):
|
||||
os.remove(file_path)
|
||||
print(f"Removed unconverted video file: {file_path}")
|
||||
return None
|
||||
|
||||
# Update progress - replacing file
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id]['progress'] = 80
|
||||
upload_progress[session_id]['message'] = 'Saving optimized video and adding to playlist...'
|
||||
|
||||
# Replace original file with optimized one
|
||||
shutil.move(temp_output, file_path)
|
||||
print(f"Video optimized and replaced: {file_path}")
|
||||
print(f"Video is now optimized for Raspberry Pi playback (30fps, max 1080p)")
|
||||
|
||||
# NOW add to playlist after successful conversion
|
||||
with app.app_context():
|
||||
# Update the database with the converted filename
|
||||
if target_type == 'group':
|
||||
group = Group.query.get_or_404(target_id)
|
||||
for player in group.players:
|
||||
content = Content.query.filter_by(player_id=player.id, file_name=original_filename).first()
|
||||
if content:
|
||||
content.file_name = converted_filename
|
||||
new_content = Content(file_name=original_filename, duration=duration, player_id=player.id)
|
||||
db.session.add(new_content)
|
||||
player.playlist_version += 1
|
||||
group.playlist_version += 1
|
||||
print(f"Video added to group '{group.name}' playlist after optimization")
|
||||
elif target_type == 'player':
|
||||
content = Content.query.filter_by(player_id=target_id, file_name=original_filename).first()
|
||||
if content:
|
||||
content.file_name = converted_filename
|
||||
|
||||
player = Player.query.get_or_404(target_id)
|
||||
new_content = Content(file_name=original_filename, duration=duration, player_id=target_id)
|
||||
db.session.add(new_content)
|
||||
player.playlist_version += 1
|
||||
print(f"Video added to player '{player.username}' playlist after optimization")
|
||||
|
||||
db.session.commit()
|
||||
print(f"Database updated with converted video: {converted_filename}")
|
||||
|
||||
# Delete the original file only if it exists
|
||||
print(f"Playlist updated with optimized video: {original_filename}")
|
||||
|
||||
# Update progress - complete
|
||||
if upload_progress and session_id:
|
||||
print(f"[VIDEO CONVERSION] Video conversion complete! Updating progress for session {session_id}")
|
||||
upload_progress[session_id] = {
|
||||
'status': 'complete',
|
||||
'progress': 100,
|
||||
'message': f'Video conversion complete! Added to playlist.',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index + 1
|
||||
}
|
||||
print(f"[VIDEO CONVERSION] Final progress: {upload_progress[session_id]}")
|
||||
else:
|
||||
print(f"[VIDEO CONVERSION] WARNING: Cannot update completion status - upload_progress or session_id is None!")
|
||||
|
||||
return True
|
||||
|
||||
except Exception as e:
|
||||
print(f"[VIDEO CONVERSION] ERROR during video optimization: {e}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
|
||||
# Update progress - error
|
||||
if upload_progress and session_id:
|
||||
print(f"[VIDEO CONVERSION] Setting error status for session {session_id}")
|
||||
upload_progress[session_id] = {
|
||||
'status': 'error',
|
||||
'progress': 0,
|
||||
'message': f'Error during video conversion: {str(e)}',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
else:
|
||||
print(f"[VIDEO CONVERSION] WARNING: Cannot update error status - upload_progress or session_id is None!")
|
||||
|
||||
# Delete the unconverted file on error
|
||||
if os.path.exists(file_path):
|
||||
os.remove(file_path)
|
||||
print(f"Original file deleted: {file_path}")
|
||||
else:
|
||||
print(f"Video conversion failed for: {file_path}")
|
||||
print(f"Removed unconverted video file due to error: {file_path}")
|
||||
return None
|
||||
|
||||
# Filename remains the same
|
||||
return True
|
||||
|
||||
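The settings above target H.264, at most 1080p, and 30 fps for Raspberry Pi playback. A small verification helper (illustrative only, not part of this commit) could confirm the replaced file actually matches those constraints before it reaches the playlist:

```python
# Illustrative post-conversion check using ffprobe (ships with ffmpeg).
import json
import subprocess

def probe_video_stream(path):
    """Return codec, resolution and average frame rate of the first video stream."""
    result = subprocess.run(
        ['ffprobe', '-v', 'error', '-select_streams', 'v:0',
         '-show_entries', 'stream=codec_name,width,height,avg_frame_rate',
         '-of', 'json', path],
        capture_output=True, text=True, check=True)
    return json.loads(result.stdout)['streams'][0]

# probe_video_stream('/app/static/uploads/example.mp4')
# -> {'codec_name': 'h264', 'width': 1920, 'height': 1080, 'avg_frame_rate': '30/1'}
```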
# PDF conversion functions
|
||||
def convert_pdf_to_images(pdf_file, output_folder, delete_pdf=True, dpi=300):
|
||||
@@ -231,7 +311,7 @@ def update_playlist_with_files(image_filenames, duration, target_type, target_id
|
||||
print(f"Error updating playlist: {e}")
|
||||
return False
|
||||
|
||||
def process_pdf(input_file, output_folder, duration, target_type, target_id):
|
||||
def process_pdf(input_file, output_folder, duration, target_type, target_id, upload_progress=None, session_id=None, file_index=0, total_files=1):
|
||||
"""
|
||||
Process a PDF file: convert to images and update playlist.
|
||||
|
||||
@@ -241,6 +321,10 @@ def process_pdf(input_file, output_folder, duration, target_type, target_id):
|
||||
duration (int): Duration in seconds for each image
|
||||
target_type (str): 'player' or 'group'
|
||||
target_id (int): ID of the player or group
|
||||
upload_progress (dict): Global progress tracking dictionary
|
||||
session_id (str): Unique session identifier for progress tracking
|
||||
file_index (int): Current file index being processed
|
||||
total_files (int): Total number of files being processed
|
||||
|
||||
Returns:
|
||||
bool: True if successful, False otherwise
|
||||
@@ -248,6 +332,16 @@ def process_pdf(input_file, output_folder, duration, target_type, target_id):
|
||||
print(f"Processing PDF file: {input_file}")
|
||||
print(f"Output folder: {output_folder}")
|
||||
|
||||
# Update progress - starting PDF conversion
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'converting',
|
||||
'progress': 50,
|
||||
'message': f'Converting PDF to images (300 DPI)...',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
|
||||
# Ensure output folder exists
|
||||
if not os.path.exists(output_folder):
|
||||
os.makedirs(output_folder, exist_ok=True)
|
||||
@@ -256,17 +350,42 @@ def process_pdf(input_file, output_folder, duration, target_type, target_id):
|
||||
# Convert PDF to images using standard quality (delete PDF after successful conversion)
|
||||
image_filenames = convert_pdf_to_images(input_file, output_folder, delete_pdf=True, dpi=300)
|
||||
|
||||
# Update progress - adding to playlist
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id]['progress'] = 80
|
||||
upload_progress[session_id]['message'] = f'Adding {len(image_filenames)} images to playlist...'
|
||||
|
||||
# Update playlist with generated images
|
||||
if image_filenames:
|
||||
success = update_playlist_with_files(image_filenames, duration, target_type, target_id)
|
||||
if success:
|
||||
print(f"Successfully processed PDF: {len(image_filenames)} images added to playlist")
|
||||
|
||||
# Update progress - complete
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'complete',
|
||||
'progress': 100,
|
||||
'message': f'PDF converted to {len(image_filenames)} images and added to playlist!',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index + 1
|
||||
}
|
||||
return success
|
||||
else:
|
||||
print("Failed to convert PDF to images")
|
||||
|
||||
# Update progress - error
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'error',
|
||||
'progress': 0,
|
||||
'message': 'Failed to convert PDF to images',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
return False
|
||||
|
||||
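A hypothetical call site for the extended `process_pdf` signature, showing how the new progress arguments thread through (paths and IDs are illustrative):

```python
# Illustrative invocation of process_pdf with progress tracking enabled.
upload_progress = {}
session_id = 'upload_1700000000_abc123'

ok = process_pdf(
    '/app/static/uploads/brochure.pdf',   # input_file (example path)
    '/app/static/uploads',                # output_folder
    duration=10,
    target_type='player',
    target_id=3,
    upload_progress=upload_progress,
    session_id=session_id,
    file_index=0,
    total_files=1,
)
print(upload_progress[session_id]['status'])  # 'complete' or 'error'
```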
def process_pptx(input_file, output_folder, duration, target_type, target_id):
|
||||
def process_pptx(input_file, output_folder, duration, target_type, target_id, upload_progress=None, session_id=None, file_index=0, total_files=1):
|
||||
"""
|
||||
Process a PPTX file: convert to PDF first, then to JPG images (same workflow as PDF).
|
||||
|
||||
@@ -276,6 +395,10 @@ def process_pptx(input_file, output_folder, duration, target_type, target_id):
|
||||
duration (int): Duration in seconds for each image
|
||||
target_type (str): 'player' or 'group'
|
||||
target_id (int): ID of the player or group
|
||||
upload_progress (dict): Global progress tracking dictionary
|
||||
session_id (str): Unique session identifier for progress tracking
|
||||
file_index (int): Current file index being processed
|
||||
total_files (int): Total number of files being processed
|
||||
|
||||
Returns:
|
||||
bool: True if successful, False otherwise
|
||||
@@ -283,6 +406,16 @@ def process_pptx(input_file, output_folder, duration, target_type, target_id):
|
||||
print(f"Processing PPTX file using PDF workflow: {input_file}")
|
||||
print(f"Output folder: {output_folder}")
|
||||
|
||||
# Update progress - starting PPTX conversion (step 1)
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'converting',
|
||||
'progress': 40,
|
||||
'message': f'Converting PowerPoint to PDF (Step 1/3)...',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
|
||||
# Ensure output folder exists
|
||||
if not os.path.exists(output_folder):
|
||||
os.makedirs(output_folder, exist_ok=True)
|
||||
@@ -290,21 +423,58 @@ def process_pptx(input_file, output_folder, duration, target_type, target_id):
|
||||
|
||||
try:
|
||||
# Step 1: Convert PPTX to PDF using LibreOffice for vector quality
|
||||
print("Step 1: Converting PPTX to PDF...")
|
||||
from utils.pptx_converter import pptx_to_pdf_libreoffice
|
||||
pdf_file = pptx_to_pdf_libreoffice(input_file, output_folder)
|
||||
|
||||
if not pdf_file:
|
||||
print("Error: Failed to convert PPTX to PDF")
|
||||
print("This could be due to:")
|
||||
print("- LibreOffice not properly installed")
|
||||
print("- Corrupted PPTX file")
|
||||
print("- Insufficient memory")
|
||||
print("- File permission issues")
|
||||
|
||||
# Update progress - error
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'error',
|
||||
'progress': 0,
|
||||
'message': 'Failed to convert PowerPoint to PDF',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
return False
|
||||
|
||||
print(f"PPTX successfully converted to PDF: {pdf_file}")
|
||||
|
||||
# Update progress - step 2
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id]['progress'] = 60
|
||||
upload_progress[session_id]['message'] = 'Converting PDF to images (Step 2/3, 300 DPI)...'
|
||||
|
||||
# Step 2: Use the same PDF to images workflow as direct PDF uploads
|
||||
print("Step 2: Converting PDF to JPG images...")
|
||||
# Convert PDF to JPG images (300 DPI, same as PDF workflow)
|
||||
image_filenames = convert_pdf_to_images(pdf_file, output_folder, delete_pdf=True, dpi=300)
|
||||
|
||||
if not image_filenames:
|
||||
print("Error: Failed to convert PDF to images")
|
||||
print("This could be due to:")
|
||||
print("- poppler-utils not properly installed")
|
||||
print("- PDF corruption during conversion")
|
||||
print("- Insufficient disk space")
|
||||
print("- Memory issues during image processing")
|
||||
|
||||
# Update progress - error
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'error',
|
||||
'progress': 0,
|
||||
'message': 'Failed to convert PDF to images',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
return False
|
||||
|
||||
print(f"Generated {len(image_filenames)} JPG images from PPTX → PDF")
|
||||
@@ -313,11 +483,29 @@ def process_pptx(input_file, output_folder, duration, target_type, target_id):
|
||||
if os.path.exists(input_file):
|
||||
os.remove(input_file)
|
||||
print(f"Original PPTX file deleted: {input_file}")
|
||||
|
||||
|
||||
# Update progress - step 3
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id]['progress'] = 85
|
||||
upload_progress[session_id]['message'] = f'Adding {len(image_filenames)} images to playlist (Step 3/3)...'
|
||||
|
||||
# Step 4: Update playlist with generated images in sequential order
|
||||
print("Step 3: Adding images to playlist...")
|
||||
success = update_playlist_with_files(image_filenames, duration, target_type, target_id)
|
||||
if success:
|
||||
print(f"Successfully processed PPTX: {len(image_filenames)} images added to playlist")
|
||||
|
||||
# Update progress - complete
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'complete',
|
||||
'progress': 100,
|
||||
'message': f'PowerPoint converted to {len(image_filenames)} images and added to playlist!',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index + 1
|
||||
}
|
||||
else:
|
||||
print("Error: Failed to add images to playlist database")
|
||||
return success
|
||||
|
||||
except Exception as e:
|
||||
@@ -326,10 +514,14 @@ def process_pptx(input_file, output_folder, duration, target_type, target_id):
|
||||
traceback.print_exc()
|
||||
return False
|
||||
|
||||
def process_uploaded_files(app, files, media_type, duration, target_type, target_id):
|
||||
def process_uploaded_files(app, files, media_type, duration, target_type, target_id, upload_progress=None, session_id=None):
|
||||
"""
|
||||
Process uploaded files based on media type and add them to playlists.
|
||||
|
||||
Args:
|
||||
upload_progress (dict): Global progress tracking dictionary
|
||||
session_id (str): Unique session identifier for progress tracking
|
||||
|
||||
Returns:
|
||||
list: List of result dictionaries with success status and messages
|
||||
"""
|
||||
@@ -344,8 +536,21 @@ def process_uploaded_files(app, files, media_type, duration, target_type, target
|
||||
player = Player.query.get_or_404(target_id)
|
||||
target_name = player.username
|
||||
|
||||
for file in files:
|
||||
total_files = len(files)
|
||||
|
||||
for file_index, file in enumerate(files):
|
||||
try:
|
||||
# Update progress - uploading phase
|
||||
if upload_progress and session_id:
|
||||
file_progress = int((file_index / total_files) * 30) # 0-30% for file uploads
|
||||
upload_progress[session_id] = {
|
||||
'status': 'uploading',
|
||||
'progress': file_progress,
|
||||
'message': f'Uploading file {file_index + 1}/{total_files}: {file.filename}...',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
|
||||
# Generate a secure filename and save the file
|
||||
filename = secure_filename(file.filename)
|
||||
|
||||
@@ -366,37 +571,54 @@ def process_uploaded_files(app, files, media_type, duration, target_type, target
|
||||
result = {'filename': filename, 'success': True, 'message': ''}
|
||||
|
||||
if media_type == 'image':
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id]['message'] = f'Adding image {file_index + 1}/{total_files} to playlist...'
|
||||
upload_progress[session_id]['progress'] = int(30 + (file_index / total_files) * 70)
|
||||
|
||||
add_image_to_playlist(app, file, filename, duration, target_type, target_id)
|
||||
result['message'] = f"Image {filename} added to playlist"
|
||||
log_upload('image', filename, target_type, target_id)
|
||||
|
||||
elif media_type == 'video':
|
||||
# For videos, add to playlist then start conversion in background
|
||||
if target_type == 'group':
|
||||
group = Group.query.get_or_404(target_id)
|
||||
for player in group.players:
|
||||
new_content = Content(file_name=filename, duration=duration, player_id=player.id)
|
||||
db.session.add(new_content)
|
||||
player.playlist_version += 1
|
||||
group.playlist_version += 1
|
||||
elif target_type == 'player':
|
||||
player = Player.query.get_or_404(target_id)
|
||||
new_content = Content(file_name=filename, duration=duration, player_id=target_id)
|
||||
db.session.add(new_content)
|
||||
player.playlist_version += 1
|
||||
# For videos, save file then start conversion in background
|
||||
# Video will be added to playlist AFTER conversion completes
|
||||
print(f"Video uploaded: {filename}")
|
||||
print(f"Starting background optimization - video will be added to playlist when ready")
|
||||
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'converting',
|
||||
'progress': 40,
|
||||
'message': f'Converting video {file_index + 1}/{total_files} to 30fps (this may take a few minutes)...',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
|
||||
db.session.commit()
|
||||
# Start background conversion using absolute path
|
||||
import threading
|
||||
threading.Thread(target=convert_video_and_update_playlist,
|
||||
args=(app, file_path, filename, target_type, target_id, duration)).start()
|
||||
result['message'] = f"Video {filename} added to playlist and being processed"
|
||||
print(f"[VIDEO UPLOAD] Starting background thread for video conversion. Session ID: {session_id}")
|
||||
print(f"[VIDEO UPLOAD] Parameters: file_path={file_path}, filename={filename}, target={target_type}/{target_id}")
|
||||
thread = threading.Thread(target=convert_video_and_update_playlist,
|
||||
args=(app, file_path, filename, target_type, target_id, duration, upload_progress, session_id, file_index, total_files))
|
||||
thread.daemon = True # Make thread daemon so it doesn't block shutdown
|
||||
thread.start()
|
||||
print(f"[VIDEO UPLOAD] Background thread started: {thread.name}")
|
||||
result['message'] = f"Video {filename} is being optimized for Raspberry Pi (30fps, max 1080p). It will be added to playlist when ready."
|
||||
log_upload('video', filename, target_type, target_id)
|
||||
|
||||
elif media_type == 'pdf':
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'converting',
|
||||
'progress': 40,
|
||||
'message': f'Converting PDF {file_index + 1}/{total_files} to images...',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
|
||||
# For PDFs, convert to images and update playlist using absolute path
|
||||
success = process_pdf(file_path, upload_folder,
|
||||
duration, target_type, target_id)
|
||||
duration, target_type, target_id, upload_progress, session_id, file_index, total_files)
|
||||
if success:
|
||||
result['message'] = f"PDF {filename} processed successfully"
|
||||
log_process('pdf', filename, target_type, target_id)
|
||||
@@ -405,9 +627,18 @@ def process_uploaded_files(app, files, media_type, duration, target_type, target
|
||||
result['message'] = f"Error processing PDF file: {filename}"
|
||||
|
||||
elif media_type == 'ppt':
|
||||
if upload_progress and session_id:
|
||||
upload_progress[session_id] = {
|
||||
'status': 'converting',
|
||||
'progress': 30,
|
||||
'message': f'Converting PowerPoint {file_index + 1}/{total_files} to images (PPTX → PDF → Images, may take 2-5 minutes)...',
|
||||
'files_total': total_files,
|
||||
'files_processed': file_index
|
||||
}
|
||||
|
||||
# For PPT/PPTX, convert to PDF, then to images, and update playlist using absolute path
|
||||
success = process_pptx(file_path, upload_folder,
|
||||
duration, target_type, target_id)
|
||||
duration, target_type, target_id, upload_progress, session_id, file_index, total_files)
|
||||
if success:
|
||||
result['message'] = f"PowerPoint {filename} processed successfully"
|
||||
log_process('ppt', filename, target_type, target_id)
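A side note on the video branch above: the conversion worker runs outside the request, so it must re-enter a Flask application context before touching the database. A minimal illustration of that pattern (generic names, not the project's API):

```python
# Generic illustration of the daemon-thread + app_context pattern used for
# video conversion above; the worker re-creates an application context
# because the request context is gone once the HTTP response is returned.
import threading

def background_worker(app, payload):
    with app.app_context():
        # SQLAlchemy queries and db.session.commit() are safe in here
        pass

def start_background_job(app, payload):
    thread = threading.Thread(target=background_worker, args=(app, payload))
    thread.daemon = True  # do not block interpreter shutdown
    thread.start()
    return thread
```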
|
||||
|
||||
336
player/get_playlists.py
Normal file
@@ -0,0 +1,336 @@
|
||||
import os
|
||||
import json
|
||||
import requests
|
||||
import bcrypt
|
||||
import re
|
||||
import datetime
|
||||
from logging_config import Logger
|
||||
|
||||
def send_player_feedback(config, message, status="active", playlist_version=None, error_details=None):
|
||||
"""
|
||||
Send feedback to the server about player status.
|
||||
|
||||
Args:
|
||||
config (dict): Configuration containing server details
|
||||
message (str): Main feedback message
|
||||
status (str): Player status - "active", "playing", "error", "restarting"
|
||||
playlist_version (int, optional): Current playlist version being played
|
||||
error_details (str, optional): Error details if status is "error"
|
||||
|
||||
Returns:
|
||||
bool: True if feedback sent successfully, False otherwise
|
||||
"""
|
||||
try:
|
||||
server = config.get("server_ip", "")
|
||||
host = config.get("screen_name", "")
|
||||
quick = config.get("quickconnect_key", "")
|
||||
port = config.get("port", "")
|
||||
|
||||
# Construct server URL
|
||||
ip_pattern = r'^\d+\.\d+\.\d+\.\d+$'
|
||||
if re.match(ip_pattern, server):
|
||||
feedback_url = f'http://{server}:{port}/api/player-feedback'
|
||||
else:
|
||||
feedback_url = f'http://{server}/api/player-feedback'
|
||||
|
||||
# Prepare feedback data
|
||||
feedback_data = {
|
||||
'player_name': host,
|
||||
'quickconnect_code': quick,
|
||||
'message': message,
|
||||
'status': status,
|
||||
'timestamp': datetime.datetime.now().isoformat(),
|
||||
'playlist_version': playlist_version,
|
||||
'error_details': error_details
|
||||
}
|
||||
|
||||
Logger.info(f"Sending feedback to {feedback_url}: {feedback_data}")
|
||||
|
||||
# Send POST request
|
||||
response = requests.post(feedback_url, json=feedback_data, timeout=10)
|
||||
|
||||
if response.status_code == 200:
|
||||
Logger.info(f"Feedback sent successfully: {message}")
|
||||
return True
|
||||
else:
|
||||
Logger.warning(f"Feedback failed with status {response.status_code}: {response.text}")
|
||||
return False
|
||||
|
||||
except requests.exceptions.RequestException as e:
|
||||
Logger.error(f"Failed to send feedback: {e}")
|
||||
return False
|
||||
except Exception as e:
|
||||
Logger.error(f"Unexpected error sending feedback: {e}")
|
||||
return False
|
||||
|
||||
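Example call (values are illustrative) showing the minimum config keys `send_player_feedback` reads:

```python
# Illustrative usage; the config keys match those read above.
config = {
    'server_ip': '192.168.1.10',
    'port': '5000',
    'screen_name': 'lobby-screen',
    'quickconnect_key': 'ABC123',
}
send_player_feedback(config, 'player lobby-screen, started playback',
                     status='playing', playlist_version=4)
```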
def send_playlist_check_feedback(config, playlist_version=None):
|
||||
"""
|
||||
Send feedback when playlist is checked for updates.
|
||||
|
||||
Args:
|
||||
config (dict): Configuration containing server details
|
||||
playlist_version (int, optional): Current playlist version
|
||||
|
||||
Returns:
|
||||
bool: True if feedback sent successfully, False otherwise
|
||||
"""
|
||||
player_name = config.get("screen_name", "unknown")
|
||||
version_info = f"v{playlist_version}" if playlist_version else "unknown"
|
||||
message = f"player {player_name}, is active, Playing {version_info}"
|
||||
|
||||
return send_player_feedback(
|
||||
config=config,
|
||||
message=message,
|
||||
status="active",
|
||||
playlist_version=playlist_version
|
||||
)
|
||||
|
||||
def send_playlist_restart_feedback(config, playlist_version=None):
|
||||
"""
|
||||
Send feedback when playlist loop ends and restarts.
|
||||
|
||||
Args:
|
||||
config (dict): Configuration containing server details
|
||||
playlist_version (int, optional): Current playlist version
|
||||
|
||||
Returns:
|
||||
bool: True if feedback sent successfully, False otherwise
|
||||
"""
|
||||
player_name = config.get("screen_name", "unknown")
|
||||
version_info = f"v{playlist_version}" if playlist_version else "unknown"
|
||||
message = f"player {player_name}, playlist loop completed, restarting {version_info}"
|
||||
|
||||
return send_player_feedback(
|
||||
config=config,
|
||||
message=message,
|
||||
status="restarting",
|
||||
playlist_version=playlist_version
|
||||
)
|
||||
|
||||
def send_player_error_feedback(config, error_message, playlist_version=None):
|
||||
"""
|
||||
Send feedback when an error occurs in the player.
|
||||
|
||||
Args:
|
||||
config (dict): Configuration containing server details
|
||||
error_message (str): Description of the error
|
||||
playlist_version (int, optional): Current playlist version
|
||||
|
||||
Returns:
|
||||
bool: True if feedback sent successfully, False otherwise
|
||||
"""
|
||||
player_name = config.get("screen_name", "unknown")
|
||||
message = f"player {player_name}, error occurred"
|
||||
|
||||
return send_player_feedback(
|
||||
config=config,
|
||||
message=message,
|
||||
status="error",
|
||||
playlist_version=playlist_version,
|
||||
error_details=error_message
|
||||
)
|
||||
|
||||
def send_playing_status_feedback(config, playlist_version=None, current_media=None):
|
||||
"""
|
||||
Send feedback about current playing status.
|
||||
|
||||
Args:
|
||||
config (dict): Configuration containing server details
|
||||
playlist_version (int, optional): Current playlist version
|
||||
current_media (str, optional): Currently playing media file
|
||||
|
||||
Returns:
|
||||
bool: True if feedback sent successfully, False otherwise
|
||||
"""
|
||||
player_name = config.get("screen_name", "unknown")
|
||||
version_info = f"v{playlist_version}" if playlist_version else "unknown"
|
||||
media_info = f" - {current_media}" if current_media else ""
|
||||
message = f"player {player_name}, is active, Playing {version_info}{media_info}"
|
||||
|
||||
return send_player_feedback(
|
||||
config=config,
|
||||
message=message,
|
||||
status="playing",
|
||||
playlist_version=playlist_version
|
||||
)
|
||||
|
||||
def is_playlist_up_to_date(local_playlist_path, config):
|
||||
"""
|
||||
Compare the version of the local playlist with the server playlist.
|
||||
Returns True if up-to-date, False otherwise.
|
||||
"""
|
||||
import json
|
||||
if not os.path.exists(local_playlist_path):
|
||||
Logger.info(f"Local playlist file not found: {local_playlist_path}")
|
||||
return False
|
||||
with open(local_playlist_path, 'r') as f:
|
||||
local_data = json.load(f)
|
||||
local_version = local_data.get('version', 0)
|
||||
server_data = fetch_server_playlist(config)
|
||||
server_version = server_data.get('version', 0)
|
||||
Logger.info(f"Local playlist version: {local_version}, Server playlist version: {server_version}")
|
||||
return local_version == server_version
|
||||
|
||||
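For reference, the local playlist file this check reads has roughly the shape below; the field names come from the download/save helpers later in this file, the values are examples:

```python
# Example of the on-disk playlist structure assumed by the version check.
example_playlist = {
    "version": 4,
    "playlist": [
        {"file_name": "promo.mp4", "url": "media/promo.mp4", "duration": 30},
        {"file_name": "menu.jpg", "url": "media/menu.jpg", "duration": 10},
    ],
}
```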
def fetch_server_playlist(config):
|
||||
"""Fetch the updated playlist from the server using a config dict."""
|
||||
server = config.get("server_ip", "")
|
||||
host = config.get("screen_name", "")
|
||||
quick = config.get("quickconnect_key", "")
|
||||
port = config.get("port", "")
|
||||
try:
|
||||
ip_pattern = r'^\d+\.\d+\.\d+\.\d+$'
|
||||
if re.match(ip_pattern, server):
|
||||
server_url = f'http://{server}:{port}/api/playlists'
|
||||
else:
|
||||
server_url = f'http://{server}/api/playlists'
|
||||
params = {
|
||||
'hostname': host,
|
||||
'quickconnect_code': quick
|
||||
}
|
||||
Logger.info(f"Fetching playlist from URL: {server_url} with params: {params}")
|
||||
response = requests.get(server_url, params=params)
|
||||
if response.status_code == 200:
|
||||
response_data = response.json()
|
||||
Logger.info(f"Server response: {response_data}")
|
||||
playlist = response_data.get('playlist', [])
|
||||
version = response_data.get('playlist_version', None)
|
||||
hashed_quickconnect = response_data.get('hashed_quickconnect', None)
|
||||
if version is not None and hashed_quickconnect is not None:
|
||||
if bcrypt.checkpw(quick.encode('utf-8'), hashed_quickconnect.encode('utf-8')):
|
||||
Logger.info("Fetched updated playlist from server.")
|
||||
return {'playlist': playlist, 'version': version}
|
||||
else:
|
||||
Logger.error("Quickconnect code validation failed.")
|
||||
else:
|
||||
Logger.error("Failed to retrieve playlist or hashed quickconnect from the response.")
|
||||
else:
|
||||
Logger.error(f"Failed to fetch playlist. Status Code: {response.status_code}")
|
||||
except requests.exceptions.RequestException as e:
|
||||
Logger.error(f"Failed to fetch playlist: {e}")
|
||||
return {'playlist': [], 'version': 0}
|
||||
|
||||
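The `bcrypt.checkpw` call above lets the player confirm it is talking to the right server without the plain quickconnect code being echoed back. For reference (an assumption, not shown in this diff), the `hashed_quickconnect` value would be produced server-side roughly like this:

```python
# Sketch of how a bcrypt hash for the quickconnect code is generated; the
# player then verifies it with bcrypt.checkpw() as in fetch_server_playlist.
import bcrypt

def hash_quickconnect(quickconnect_code: str) -> str:
    return bcrypt.hashpw(quickconnect_code.encode('utf-8'),
                         bcrypt.gensalt()).decode('utf-8')
```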
def save_playlist_with_version(playlist_data, playlist_dir):
|
||||
version = playlist_data.get('version', 0)
|
||||
playlist_file = os.path.join(playlist_dir, f'server_playlist_v{version}.json')
|
||||
with open(playlist_file, 'w') as f:
|
||||
json.dump(playlist_data, f, indent=2)
|
||||
print(f"Playlist saved to {playlist_file}")
|
||||
return playlist_file
|
||||
|
||||
|
||||
def download_media_files(playlist, media_dir):
|
||||
"""Download media files from the server and save them to media_dir."""
|
||||
if not os.path.exists(media_dir):
|
||||
os.makedirs(media_dir)
|
||||
Logger.info(f"Created directory {media_dir} for media files.")
|
||||
|
||||
updated_playlist = []
|
||||
for media in playlist:
|
||||
file_name = media.get('file_name', '')
|
||||
file_url = media.get('url', '')
|
||||
duration = media.get('duration', 10)
|
||||
local_path = os.path.join(media_dir, file_name)
|
||||
Logger.info(f"Preparing to download {file_name} from {file_url}...")
|
||||
if os.path.exists(local_path):
|
||||
Logger.info(f"File {file_name} already exists. Skipping download.")
|
||||
else:
|
||||
try:
|
||||
response = requests.get(file_url, timeout=10)
|
||||
if response.status_code == 200:
|
||||
with open(local_path, 'wb') as file:
|
||||
file.write(response.content)
|
||||
Logger.info(f"Successfully downloaded {file_name} to {local_path}")
|
||||
else:
|
||||
Logger.error(f"Failed to download {file_name}. Status Code: {response.status_code}")
|
||||
continue
|
||||
except requests.exceptions.RequestException as e:
|
||||
Logger.error(f"Error downloading {file_name}: {e}")
|
||||
continue
|
||||
updated_media = {
|
||||
'file_name': file_name,
|
||||
'url': os.path.relpath(local_path, os.path.dirname(media_dir)),
|
||||
'duration': duration
|
||||
}
|
||||
updated_playlist.append(updated_media)
|
||||
return updated_playlist
|
||||
|
||||
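`download_media_files` holds each file fully in memory via `response.content`, which is fine for images but can be heavy for long videos on a Raspberry Pi. A streamed variant (illustrative, not part of this commit) would look like:

```python
# Illustrative streamed download: write the response to disk in 1 MB chunks
# instead of buffering the whole file in memory.
import requests

def download_file_streamed(url, local_path, chunk_size=1024 * 1024):
    with requests.get(url, stream=True, timeout=30) as response:
        response.raise_for_status()
        with open(local_path, 'wb') as fh:
            for chunk in response.iter_content(chunk_size=chunk_size):
                fh.write(chunk)
```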
def delete_old_playlists_and_media(current_version, playlist_dir, media_dir, keep_versions=1):
|
||||
"""
|
||||
Delete old playlist files and media files not referenced by the latest playlist version.
|
||||
keep_versions: number of latest versions to keep (default 1)
|
||||
"""
|
||||
# Find all playlist files
|
||||
playlist_files = [f for f in os.listdir(playlist_dir) if f.startswith('server_playlist_v') and f.endswith('.json')]
|
||||
# Keep only the latest N versions
|
||||
versions = sorted([int(f.split('_v')[-1].split('.json')[0]) for f in playlist_files], reverse=True)
|
||||
keep = set(versions[:keep_versions])
|
||||
# Delete old playlist files
|
||||
for f in playlist_files:
|
||||
v = int(f.split('_v')[-1].split('.json')[0])
|
||||
if v not in keep:
|
||||
os.remove(os.path.join(playlist_dir, f))
|
||||
# Collect all media files referenced by the kept playlists
|
||||
referenced = set()
|
||||
for v in keep:
|
||||
path = os.path.join(playlist_dir, f'server_playlist_v{v}.json')
|
||||
if os.path.exists(path):
|
||||
with open(path, 'r') as f:
|
||||
data = json.load(f)
|
||||
for item in data.get('playlist', []):
|
||||
referenced.add(item.get('file_name'))
|
||||
# Delete media files not referenced
|
||||
for f in os.listdir(media_dir):
|
||||
if f not in referenced:
|
||||
try:
|
||||
os.remove(os.path.join(media_dir, f))
|
||||
except Exception as e:
|
||||
Logger.warning(f"Failed to delete media file {f}: {e}")
|
||||
|
||||
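Worth noting: `delete_old_playlists_and_media` derives the versions to keep from the saved `server_playlist_v*.json` file names, so `current_version` is informational. An illustrative call (paths are examples):

```python
# Illustrative invocation: keep only the newest saved playlist and prune any
# media files it no longer references.
delete_old_playlists_and_media(
    current_version=7,
    playlist_dir='/home/pi/playlists',
    media_dir='/home/pi/media',
    keep_versions=1,
)
```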
def update_playlist_if_needed(local_playlist_path, config, media_dir, playlist_dir):
|
||||
"""
|
||||
Fetch the server playlist once, compare versions, and update if needed.
|
||||
Returns True if updated, False if already up to date.
|
||||
Also sends feedback to server about playlist check.
|
||||
"""
|
||||
import json
|
||||
server_data = fetch_server_playlist(config)
|
||||
server_version = server_data.get('version', 0)
|
||||
if not os.path.exists(local_playlist_path):
|
||||
local_version = 0
|
||||
else:
|
||||
with open(local_playlist_path, 'r') as f:
|
||||
local_data = json.load(f)
|
||||
local_version = local_data.get('version', 0)
|
||||
|
||||
Logger.info(f"Local playlist version: {local_version}, Server playlist version: {server_version}")
|
||||
|
||||
# Send feedback about playlist check
|
||||
send_playlist_check_feedback(config, server_version if server_version > 0 else local_version)
|
||||
|
||||
if local_version != server_version:
|
||||
if server_data and server_data.get('playlist'):
|
||||
updated_playlist = download_media_files(server_data['playlist'], media_dir)
|
||||
server_data['playlist'] = updated_playlist
|
||||
save_playlist_with_version(server_data, playlist_dir)
|
||||
# Delete old playlists and unreferenced media
|
||||
delete_old_playlists_and_media(server_version, playlist_dir, media_dir)
|
||||
|
||||
# Send feedback about playlist update
|
||||
player_name = config.get("screen_name", "unknown")
|
||||
update_message = f"player {player_name}, playlist updated to v{server_version}"
|
||||
send_player_feedback(config, update_message, "active", server_version)
|
||||
|
||||
return True
|
||||
else:
|
||||
Logger.warning("No playlist data fetched from server or playlist is empty.")
|
||||
|
||||
# Send error feedback
|
||||
send_player_error_feedback(config, "No playlist data fetched from server or playlist is empty", local_version)
|
||||
|
||||
return False
|
||||
else:
|
||||
Logger.info("Local playlist is already up to date.")
|
||||
return False
|
||||
|
||||
|
||||
|
||||
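An illustrative call from the player side (paths are examples; the real wiring lives in player.py, whose diff is collapsed below):

```python
# Illustrative refresh cycle; `config` is the dict loaded from the player's
# configuration file.
changed = update_playlist_if_needed(
    local_playlist_path='/home/pi/playlists/server_playlist_v7.json',
    config=config,
    media_dir='/home/pi/media',
    playlist_dir='/home/pi/playlists',
)
if changed:
    Logger.info("Playlist refreshed; media loop should reload.")
```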
1091
player/player.py
Normal file
File diff suppressed because it is too large
13
docker-compose.yml
Executable file → Normal file
@@ -1,6 +1,5 @@
# DigiServer - Digital Signage Management Platform
# Version: 1.1.0
# Build Date: 2025-06-29
# Production Docker Compose Configuration
# Use this for production deployment

services:
  digiserver:
@@ -12,12 +11,18 @@ services:
    environment:
      - FLASK_APP=app.py
      - FLASK_RUN_HOST=0.0.0.0
      - FLASK_ENV=production
      - FLASK_DEBUG=0
      - ADMIN_USER=admin
      - ADMIN_PASSWORD=Initial01!
      - SECRET_KEY=Ma_Duc_Dupa_Merele_Lui_Ana
    volumes:
      # Bind mount the app folder for easier development and debugging
      # Mount app code
      - ./app:/app
      # Persistent data volumes
      - ./data/instance:/app/instance
      - ./data/uploads:/app/static/uploads
      - ./data/resurse:/app/static/resurse
    restart: unless-stopped
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:5000/"]