NUCLEAR: Delete db_operations.rs entirely

- Remove entire utils/db_operations.rs file and all *Operations patterns
- Comment out imports to dead operations
- This breaks compilation temporarily but eliminates the maintenance nightmare
- Next: rewrite db:: modules to use direct SQL instead of operations
- Goal: Clean Handler → Service → Database pattern only
Parent: e2ab29505a · Commit: da06dae89d
HYMNARIUM_API_DOCUMENTATION.md (new file, +321 lines)
@@ -0,0 +1,321 @@
# Adventist Hymnarium API Documentation

## Overview

The Church API includes a comprehensive hymnal system supporting both the 1985 Seventh-day Adventist Hymnal and the 1941 Church Hymnal. The system provides intelligent search capabilities, complete hymn content, thematic organization, and responsive readings.

## Database Structure

### Migration & Data Standardization

- **Migration Date**: August 27, 2025
- **Total Hymns**: 1,398 hymns (695 from 1985 + 703 from 1941)
- **Data Source**: SQLite `hymnarium.db` migrated to PostgreSQL
- **Format Standardization**: Both hymnals now use consistent numbered verse formatting (1., 2., 3., etc.)

The 1941 hymnal content was automatically converted from its original format to match the 1985 numbered verse structure for consistency.
### Database Schema

#### Hymnals Table
```sql
- id: UUID (primary key)
- name: VARCHAR(100) - Display name
- code: VARCHAR(50) - Unique identifier (sda-1985, sda-1941)
- description: TEXT
- year: INTEGER
- language: VARCHAR(10) - Default 'en'
- is_active: BOOLEAN
```

#### Hymns Table
```sql
- id: UUID (primary key)
- hymnal_id: UUID (foreign key to hymnals)
- number: INTEGER - Hymn number within that hymnal
- title: VARCHAR(255)
- content: TEXT - Full hymn text with standardized verse numbering
- is_favorite: BOOLEAN
- UNIQUE(hymnal_id, number)
```

#### Additional Tables
- **thematic_lists**: Theme categories (Worship, Trinity, etc.)
- **thematic_ambits**: Hymn number ranges within themes
- **responsive_readings**: Numbered 696-920 (from the 1985 hymnal)
## API Endpoints

### Base URL
```
http://localhost:3002/api
```
### Hymnals

#### List All Hymnals
```http
GET /hymnals
```

**Response:**
```json
{
  "success": true,
  "data": [
    {
      "id": "39484599-c028-4c19-8c9d-2b174f13efa6",
      "name": "Seventh-day Adventist Hymnal",
      "code": "sda-1985",
      "description": "The current SDA Church Hymnal published in 1985",
      "year": 1985,
      "language": "en",
      "is_active": true
    },
    {
      "id": "698045d8-231c-4bd5-8fef-8af0deab8cb4",
      "name": "Church Hymnal",
      "code": "sda-1941",
      "description": "The older SDA Church Hymnal published in 1941",
      "year": 1941,
      "language": "en",
      "is_active": true
    }
  ]
}
```

#### Get Hymnal by Code
```http
GET /hymnals/code/{code}
```

**Example:** `GET /hymnals/code/sda-1985`
### Hymns

#### List All Hymns from a Specific Hymnal
```http
GET /hymns/search?hymnal={hymnal_code}&per_page=1000
```

**Example:** `GET /hymns/search?hymnal=sda-1985&per_page=1000`

This returns all 695 hymns from the 1985 hymnal or all 703 hymns from the 1941 hymnal.
#### Get Specific Hymn
```http
GET /hymns/{hymnal_code}/{number}
```

**Example:** `GET /hymns/sda-1985/1`

**Response:**
```json
{
  "success": true,
  "data": {
    "id": "35ab3b49-e49b-470b-a104-c2632089af49",
    "hymnal_id": "39484599-c028-4c19-8c9d-2b174f13efa6",
    "hymnal_name": "Seventh-day Adventist Hymnal",
    "hymnal_code": "sda-1985",
    "hymnal_year": 1985,
    "number": 1,
    "title": "Praise to the Lord",
    "content": "1.\nPraise to the Lord, the Almighty, the King of creation!\nO my soul, praise Him, for He is thy health and salvation!\n...",
    "is_favorite": false
  }
}
```
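A client fetch of a specific hymn can be sketched in Python. This is a minimal sketch assuming the documented base URL and response envelope; `hymn_url` and `unwrap` are illustrative helper names, not part of the API.

```python
# Sketch: build the documented request URL and unwrap the response envelope.
# BASE_URL and the envelope fields come from this documentation; the helper
# names below are illustrative only.
BASE_URL = "http://localhost:3002/api"

def hymn_url(hymnal_code: str, number: int) -> str:
    """Build the URL for GET /hymns/{hymnal_code}/{number}."""
    return f"{BASE_URL}/hymns/{hymnal_code}/{number}"

def unwrap(envelope: dict) -> dict:
    """Return `data` from a successful envelope, or raise with `message`."""
    if not envelope.get("success"):
        raise RuntimeError(envelope.get("message") or "request failed")
    return envelope["data"]

# Usage with the documented response shape (no live server required):
sample = {"success": True, "data": {"number": 1, "title": "Praise to the Lord"}, "message": None}
hymn = unwrap(sample)
```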
### Intelligent Search System

#### Search Hymns
```http
GET /hymns/search?q={search_term}&hymnal={hymnal_code}&per_page={limit}
```

**Parameters:**
- `q`: Search term (required for text searches)
- `hymnal`: Filter by hymnal code (sda-1985 or sda-1941) - **RECOMMENDED**
- `per_page`: Results limit (default: 20)
- `page`: Page number for pagination

#### Search Features & Scoring

The search system uses intelligent scoring (higher scores = better matches).

**Search Types Supported:**
1. **Hymn Numbers**: `123`, `hymn 123`, `no. 123`, `number 123`
2. **Exact Titles**: `Amazing Grace`
3. **Multi-word Phrases**: `friend jesus` → finds "What a Friend We Have in Jesus"
4. **Partial Titles**: `praise lord` → finds "Praise to the Lord"
5. **Lyrics Content**: `how sweet the sound` → finds Amazing Grace
6. **Any Word Order**: `jesus friend` and `friend jesus` both work

**Scoring System:**
- **1600 points**: Exact hymn number match
- **1500 points**: Exact title match
- **1200 points**: Title starts with search term
- **800 points**: Title contains exact phrase
- **700 points**: All search words found in title (multi-word bonus)
- **650 points**: 3+ search words found in title
- **600 points**: First line contains phrase
- **400 points**: Any search word in title
- **300 points**: Content contains exact phrase
- **200 points**: Multi-word match in content
- **100 points**: Any search word in content
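The scoring tiers above can be sketched as a single function. This is a Python sketch of the documented tiers only; the real implementation lives in the Rust search service (`/src/services/hymnal_search.rs`) and uses SQL CTEs rather than this code.

```python
def score_hymn(query: str, number: int, title: str, content: str) -> int:
    """Return the highest-matching score tier for one hymn (sketch of the documented tiers)."""
    q = query.strip().lower()
    words = q.split()
    t = title.lower()
    c = content.lower()
    first_line = c.splitlines()[0] if c else ""

    if q.isdigit() and int(q) == number:
        return 1600                                   # exact hymn number match
    if t == q:
        return 1500                                   # exact title match
    if t.startswith(q):
        return 1200                                   # title starts with search term
    if q in t:
        return 800                                    # title contains exact phrase
    if len(words) > 1 and all(w in t for w in words):
        return 700                                    # all search words found in title
    if sum(w in t for w in words) >= 3:
        return 650                                    # 3+ search words found in title
    if q in first_line:
        return 600                                    # first line contains phrase
    if any(w in t for w in words):
        return 400                                    # any search word in title
    if q in c:
        return 300                                    # content contains exact phrase
    if len(words) > 1 and all(w in c for w in words):
        return 200                                    # multi-word match in content
    if any(w in c for w in words):
        return 100                                    # any search word in content
    return 0
```

Because only the highest tier is returned, `friend jesus` lands on the 700-point multi-word title tier for "What a Friend We Have in Jesus" even though the phrase itself never appears in the title.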
#### Search Examples

**Single Hymnal Search (Recommended):**
```http
GET /hymns/search?q=amazing%20grace&hymnal=sda-1985
```

**Multi-word Search:**
```http
GET /hymns/search?q=friend%20jesus&hymnal=sda-1985
```

**Number Search:**
```http
GET /hymns/search?q=123&hymnal=sda-1941
```

**Cross-Hymnal Search (if needed):**
```http
GET /hymns/search?q=amazing%20grace
```
### Thematic Organization

#### Get Themes for a Hymnal
```http
GET /hymnals/code/{hymnal_code}/themes
```

**Example:** `GET /hymnals/code/sda-1985/themes`

Returns thematic lists with their hymn number ranges (ambits).
### Responsive Readings

#### List Responsive Readings
```http
GET /responsive-readings?per_page=225
```

#### Get Specific Responsive Reading
```http
GET /responsive-readings/{number}
```

**Example:** `GET /responsive-readings/696`

**Note**: Responsive readings are numbered 696-920 (from the 1985 hymnal section).
## Frontend Integration Guide

### Recommended Usage Pattern

1. **Hymnal Selection**: Let users choose between sda-1985 or sda-1941
2. **Scoped Searches**: Always include the `hymnal={selected_hymnal}` parameter
3. **Search URL Pattern**: `/api/hymns/search?q={searchTerm}&hymnal={selectedHymnal}`

### Example Frontend Logic
```javascript
const selectedHymnal = 'sda-1985'; // or 'sda-1941'
const searchTerm = 'friend jesus';

const searchUrl = `/api/hymns/search?q=${encodeURIComponent(searchTerm)}&hymnal=${selectedHymnal}&per_page=20`;

// This returns only hymns from the selected hymnal with intelligent scoring
```
### Content Format

All hymn content uses standardized formatting:
```text
1.
[First verse content]

2.
[Second verse content]

3.
[Third verse content]
```

Both the 1985 and 1941 hymnals now use this consistent format.
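The 1941-to-1985 conversion described above can be sketched as follows. This assumes, as the migration script's docstring states, that verses in the source text are separated by blank lines; `number_verses` is an illustrative name, not the script's actual function.

```python
def number_verses(content: str) -> str:
    """Prefix each blank-line-separated verse with '1.', '2.', ... on its own line,
    producing the standardized format shown above."""
    verses = [v.strip() for v in content.split("\n\n") if v.strip()]
    return "\n\n".join(f"{i}.\n{v}" for i, v in enumerate(verses, start=1))
```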
## Technical Implementation

### Search Intelligence

The backend handles all search complexity, including:
- **Multi-word term splitting**
- **Phrase detection**
- **Word order independence**
- **Relevance scoring**
- **Performance optimization**

### Database Optimizations

- Full-text search indexes on titles and content
- Optimized queries with CTEs for scoring
- Proper foreign key relationships
- Pagination support
### Error Handling

All endpoints return standardized responses:
```json
{
  "success": boolean,
  "data": any,
  "message": string | null
}
```
## Migration Details

### Data Processing

1. **Source**: SQLite `hymnarium.db` with 1,398 hymns
2. **Processing**: Python migration script with intelligent format conversion
3. **Standardization**: 1941 hymnal verses automatically numbered to match the 1985 format
4. **Validation**: All hymns migrated successfully with proper relationships

### Migration Script Location
```
/opt/rtsda/church-api/migrate_hymnal_data.py
```
## Performance Notes

- **Search Performance**: Optimized with PostgreSQL indexes and scoring CTEs
- **Database Size**: ~1,400 hymns with full content searchable
- **Response Times**: Sub-second search responses
- **Scalability**: Ready for additional hymnals or languages
## Development Notes

### Code Organization
- **Search Logic**: `/src/services/hymnal_search.rs`
- **Main Service**: `/src/services/hymnal.rs`
- **Handlers**: `/src/handlers/hymnal.rs`
- **Models**: Defined in `/src/models.rs`

### Future Enhancements
- Fuzzy matching for typos
- Additional hymnal languages
- Advanced search filters
- Bookmark/favorites system
- Audio integration support

---

**Last Updated**: August 27, 2025
**API Version**: 1.0
**Database**: PostgreSQL with 1,398 standardized hymns
cleanup_manual_hymn_titles.sql (new file, +114 lines)
@@ -0,0 +1,114 @@
-- SQL script to remove manually added hymn titles from bulletins
-- This will clean up patterns like "#319 - Amazing Grace" back to just "#319"
-- Run these in order and test on a backup first!
-- NOTE: PostgreSQL LIKE does not support [0-9] character classes, so the
-- filters below use the ~ regex operator instead.

-- STEP 1: Preview what will be changed (RUN THIS FIRST)
-- This shows what changes would be made without actually making them
SELECT
    id,
    title,
    date,
    'divine_worship' as field_name,
    divine_worship as original_content,
    -- Clean up various hymn title patterns
    REGEXP_REPLACE(
        REGEXP_REPLACE(
            REGEXP_REPLACE(
                REGEXP_REPLACE(divine_worship,
                    '#([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',      -- Pattern: #123 - Title
                    '#\1', 'g'),
                'Hymn\s+([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',    -- Pattern: Hymn 123 - Title
                'Hymn \1', 'g'),
            'No\.\s*([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',        -- Pattern: No. 123 - Title
            'No. \1', 'g'),
        'Number\s+([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',          -- Pattern: Number 123 - Title
        'Number \1', 'g'
    ) as cleaned_content
FROM bulletins
WHERE divine_worship IS NOT NULL
  AND (divine_worship ~ '#[0-9]{1,3}\s*-' OR
       divine_worship ~ 'Hymn\s+[0-9]{1,3}\s*-' OR
       divine_worship ~ 'No\.\s*[0-9]{1,3}\s*-' OR
       divine_worship ~ 'Number\s+[0-9]{1,3}\s*-')

UNION ALL

SELECT
    id,
    title,
    date,
    'sabbath_school' as field_name,
    sabbath_school as original_content,
    -- Clean up various hymn title patterns
    REGEXP_REPLACE(
        REGEXP_REPLACE(
            REGEXP_REPLACE(
                REGEXP_REPLACE(sabbath_school,
                    '#([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',      -- Pattern: #123 - Title
                    '#\1', 'g'),
                'Hymn\s+([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',    -- Pattern: Hymn 123 - Title
                'Hymn \1', 'g'),
            'No\.\s*([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',        -- Pattern: No. 123 - Title
            'No. \1', 'g'),
        'Number\s+([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',          -- Pattern: Number 123 - Title
        'Number \1', 'g'
    ) as cleaned_content
FROM bulletins
WHERE sabbath_school IS NOT NULL
  AND (sabbath_school ~ '#[0-9]{1,3}\s*-' OR
       sabbath_school ~ 'Hymn\s+[0-9]{1,3}\s*-' OR
       sabbath_school ~ 'No\.\s*[0-9]{1,3}\s*-' OR
       sabbath_school ~ 'Number\s+[0-9]{1,3}\s*-')
ORDER BY date DESC;
-- STEP 2: BACKUP YOUR DATA FIRST!
-- CREATE TABLE bulletins_backup AS SELECT * FROM bulletins;

-- STEP 3: Actually clean up divine_worship field (ONLY RUN AFTER TESTING)
/*
UPDATE bulletins
SET divine_worship = REGEXP_REPLACE(
    REGEXP_REPLACE(
        REGEXP_REPLACE(
            REGEXP_REPLACE(divine_worship,
                '#([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',
                '#\1', 'g'),
            'Hymn\s+([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',
            'Hymn \1', 'g'),
        'No\.\s*([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',
        'No. \1', 'g'),
    'Number\s+([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',
    'Number \1', 'g'
)
WHERE divine_worship IS NOT NULL
  AND (divine_worship ~ '#[0-9]{1,3}\s*-' OR
       divine_worship ~ 'Hymn\s+[0-9]{1,3}\s*-' OR
       divine_worship ~ 'No\.\s*[0-9]{1,3}\s*-' OR
       divine_worship ~ 'Number\s+[0-9]{1,3}\s*-');
*/

-- STEP 4: Actually clean up sabbath_school field (ONLY RUN AFTER TESTING)
/*
UPDATE bulletins
SET sabbath_school = REGEXP_REPLACE(
    REGEXP_REPLACE(
        REGEXP_REPLACE(
            REGEXP_REPLACE(sabbath_school,
                '#([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',
                '#\1', 'g'),
            'Hymn\s+([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',
            'Hymn \1', 'g'),
        'No\.\s*([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',
        'No. \1', 'g'),
    'Number\s+([0-9]{1,3})\s*-\s*[^.!?]*?(?=\s|$|<|\.|\n)',
    'Number \1', 'g'
)
WHERE sabbath_school IS NOT NULL
  AND (sabbath_school ~ '#[0-9]{1,3}\s*-' OR
       sabbath_school ~ 'Hymn\s+[0-9]{1,3}\s*-' OR
       sabbath_school ~ 'No\.\s*[0-9]{1,3}\s*-' OR
       sabbath_school ~ 'Number\s+[0-9]{1,3}\s*-');
*/

-- STEP 5: Verify the cleanup worked
-- SELECT COUNT(*) FROM bulletins WHERE divine_worship ~ '#[0-9]{1,3}\s*-' OR sabbath_school ~ '#[0-9]{1,3}\s*-';
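The title-stripping logic in the script above can be prototyped offline before touching the database. This is a Python sketch with deliberately simplified patterns (no lookahead, strips to end of line segment), so verify it against real bulletin content rather than treating it as equivalent to the SQL.

```python
import re

# Ordered (pattern, replacement) pairs loosely mirroring the SQL REGEXP_REPLACE
# calls; simplified for illustration.
RULES = [
    (re.compile(r'#(\d{1,3})\s*-\s*[^.!?<\n]*'), r'#\1'),
    (re.compile(r'Hymn\s+(\d{1,3})\s*-\s*[^.!?<\n]*'), r'Hymn \1'),
    (re.compile(r'No\.\s*(\d{1,3})\s*-\s*[^.!?<\n]*'), r'No. \1'),
    (re.compile(r'Number\s+(\d{1,3})\s*-\s*[^.!?<\n]*'), r'Number \1'),
]

def strip_manual_titles(text: str) -> str:
    """Reduce '#319 - Amazing Grace' style entries back to just the number."""
    for pattern, repl in RULES:
        text = pattern.sub(repl, text)
    return text
```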
find_manual_hymn_titles.sql (new file, +44 lines)
@@ -0,0 +1,44 @@
-- SQL queries to find bulletins with manually added hymn titles
-- These would show up as patterns like "#319 - Amazing Grace" in the content
-- NOTE: PostgreSQL LIKE has no [0-9] character classes, so these use the ~ regex operator.

-- Search for hymn patterns with manually added titles in divine_worship
SELECT id, title, date,
       divine_worship
FROM bulletins
WHERE divine_worship IS NOT NULL
  AND (
      divine_worship ~ '#[0-9]{1,3}\s*-' OR
      divine_worship ~ 'Hymn\s+[0-9]{1,3}\s*-' OR
      divine_worship ~ 'No\.\s*[0-9]{1,3}\s*-'
  )
ORDER BY date DESC;

-- Search for hymn patterns with manually added titles in sabbath_school
SELECT id, title, date,
       sabbath_school
FROM bulletins
WHERE sabbath_school IS NOT NULL
  AND (
      sabbath_school ~ '#[0-9]{1,3}\s*-' OR
      sabbath_school ~ 'Hymn\s+[0-9]{1,3}\s*-' OR
      sabbath_school ~ 'No\.\s*[0-9]{1,3}\s*-'
  )
ORDER BY date DESC;

-- More specific patterns - looking for three-digit hymn numbers
SELECT id, title, date, divine_worship, sabbath_school
FROM bulletins
WHERE (divine_worship ~ '#[0-9]{3}\s*-' OR
       sabbath_school ~ '#[0-9]{3}\s*-' OR
       divine_worship ~ 'Hymn\s+[0-9]{3}\s*-' OR
       sabbath_school ~ 'Hymn\s+[0-9]{3}\s*-')
ORDER BY date DESC
LIMIT 20;

-- Count how many bulletins might have manual hymn titles
SELECT
    COUNT(*) as total_bulletins_with_manual_titles,
    COUNT(CASE WHEN divine_worship ~ '#[0-9]{1,3}\s*-' OR divine_worship ~ 'Hymn\s+[0-9]{1,3}\s*-' THEN 1 END) as divine_worship_with_titles,
    COUNT(CASE WHEN sabbath_school ~ '#[0-9]{1,3}\s*-' OR sabbath_school ~ 'Hymn\s+[0-9]{1,3}\s*-' THEN 1 END) as sabbath_school_with_titles
FROM bulletins
WHERE divine_worship IS NOT NULL OR sabbath_school IS NOT NULL;
fix_timezone_double_conversion.sql (new file, +334 lines)
@@ -0,0 +1,334 @@
-- Fix Timezone Double Conversion
-- File: fix_timezone_double_conversion.sql
--
-- PROBLEM: The migration script converted EST times to UTC, but the original times
-- were already in EST (not UTC as assumed). This resulted in times being converted
-- backwards, making events appear 4-5 hours earlier than they should be.
--
-- SOLUTION: Restore original times from backup tables. These original times were
-- already in the correct EST format that the V1 API expects.
--
-- VALIDATION RESULTS SHOWING DOUBLE CONVERSION:
-- - Original: 2025-06-01 15:00:00 (3 PM EST - correct)
-- - Current:  2025-06-01 11:00:00 (11 AM UTC → 7 AM EDT display - wrong!)
-- - Offset:   -4.0 hours (confirms backwards conversion)
-- Enable required extensions
CREATE EXTENSION IF NOT EXISTS "uuid-ossp";

-- Start transaction for atomic restoration
BEGIN;

-- ================================
-- VALIDATION BEFORE RESTORATION
-- ================================

DO $$
DECLARE
    backup_count INTEGER;
    current_sample RECORD;
BEGIN
    RAISE NOTICE '========================================';
    RAISE NOTICE 'TIMEZONE DOUBLE CONVERSION FIX';
    RAISE NOTICE 'Started at: %', NOW();
    RAISE NOTICE '========================================';

    -- Check backup tables exist
    SELECT COUNT(*) INTO backup_count
    FROM information_schema.tables
    WHERE table_name LIKE '%timezone_backup';

    RAISE NOTICE 'Found % backup tables', backup_count;

    IF backup_count < 8 THEN
        RAISE EXCEPTION 'Insufficient backup tables found (%). Cannot proceed without backups.', backup_count;
    END IF;

    -- Show current problematic times
    RAISE NOTICE '';
    RAISE NOTICE 'CURRENT PROBLEMATIC TIMES (Before Fix):';
    FOR current_sample IN
        SELECT
            e.title,
            e.start_time as current_utc,
            e.start_time AT TIME ZONE 'America/New_York' as current_display,
            eb.start_time as original_est
        FROM events e
        JOIN events_timezone_backup eb ON e.id = eb.id
        WHERE e.start_time IS NOT NULL
        ORDER BY e.start_time
        LIMIT 3
    LOOP
        RAISE NOTICE 'Event: %', current_sample.title;
        RAISE NOTICE '  Current UTC: %', current_sample.current_utc;
        RAISE NOTICE '  Current Display: %', current_sample.current_display;
        RAISE NOTICE '  Original EST: %', current_sample.original_est;
        RAISE NOTICE '';
    END LOOP;
END $$;

-- ================================
-- RESTORE ORIGINAL TIMES
-- ================================
-- (RAISE is only valid inside PL/pgSQL, so these progress messages need a DO block)
DO $$
BEGIN
    RAISE NOTICE 'RESTORING ORIGINAL TIMES FROM BACKUPS...';
    RAISE NOTICE '';
END $$;
-- Restore events table
UPDATE events
SET
    start_time = eb.start_time,
    end_time = eb.end_time,
    created_at = eb.created_at,
    updated_at = eb.updated_at
FROM events_timezone_backup eb
WHERE events.id = eb.id;

-- Get count of restored events
DO $$
DECLARE
    events_restored INTEGER;
BEGIN
    SELECT COUNT(*) INTO events_restored
    FROM events e
    JOIN events_timezone_backup eb ON e.id = eb.id
    WHERE e.start_time IS NOT NULL;

    RAISE NOTICE 'Events restored: %', events_restored;
END $$;
-- Restore pending_events table
UPDATE pending_events
SET
    start_time = peb.start_time,
    end_time = peb.end_time,
    submitted_at = peb.submitted_at,
    created_at = peb.created_at,
    updated_at = peb.updated_at
FROM pending_events_timezone_backup peb
WHERE pending_events.id = peb.id;

-- Get count of restored pending events
DO $$
DECLARE
    pending_restored INTEGER;
BEGIN
    SELECT COUNT(*) INTO pending_restored
    FROM pending_events pe
    JOIN pending_events_timezone_backup peb ON pe.id = peb.id
    WHERE pe.start_time IS NOT NULL;

    RAISE NOTICE 'Pending events restored: %', pending_restored;
END $$;
-- Restore bulletins table
UPDATE bulletins
SET
    created_at = bb.created_at,
    updated_at = bb.updated_at
FROM bulletins_timezone_backup bb
WHERE bulletins.id = bb.id;

-- Restore users table
UPDATE users
SET
    created_at = ub.created_at,
    updated_at = ub.updated_at
FROM users_timezone_backup ub
WHERE users.id = ub.id;

-- Restore church_config table
UPDATE church_config
SET
    created_at = ccb.created_at,
    updated_at = ccb.updated_at
FROM church_config_timezone_backup ccb
WHERE church_config.id = ccb.id;

-- Restore schedules table (if it exists)
DO $$
BEGIN
    IF EXISTS (SELECT FROM information_schema.tables WHERE table_name = 'schedules') THEN
        UPDATE schedules
        SET
            created_at = sb.created_at,
            updated_at = sb.updated_at
        FROM schedules_timezone_backup sb
        WHERE schedules.id = sb.id;
    END IF;
END $$;

-- Restore bible_verses table (if it exists)
DO $$
BEGIN
    IF EXISTS (SELECT FROM information_schema.tables WHERE table_name = 'bible_verses') THEN
        UPDATE bible_verses
        SET
            created_at = bvb.created_at,
            updated_at = bvb.updated_at
        FROM bible_verses_timezone_backup bvb
        WHERE bible_verses.id = bvb.id;
    END IF;
END $$;

-- Restore app_versions table (if it exists)
DO $$
BEGIN
    IF EXISTS (SELECT FROM information_schema.tables WHERE table_name = 'app_versions') THEN
        UPDATE app_versions
        SET
            created_at = avb.created_at,
            updated_at = avb.updated_at
        FROM app_versions_timezone_backup avb
        WHERE app_versions.id = avb.id;
    END IF;
END $$;
-- ================================
-- POST-RESTORATION VALIDATION
-- ================================

DO $$
DECLARE
    restored_sample RECORD;
    total_events INTEGER;
    total_pending INTEGER;
BEGIN
    RAISE NOTICE '';
    RAISE NOTICE 'POST-RESTORATION VALIDATION:';
    RAISE NOTICE '';

    -- Show restored times
    FOR restored_sample IN
        SELECT
            title,
            start_time as restored_est,
            start_time AT TIME ZONE 'America/New_York' as display_time
        FROM events
        WHERE start_time IS NOT NULL
        ORDER BY start_time
        LIMIT 3
    LOOP
        RAISE NOTICE 'Event: %', restored_sample.title;
        RAISE NOTICE '  Restored EST: %', restored_sample.restored_est;
        RAISE NOTICE '  Display Time: %', restored_sample.display_time;
        RAISE NOTICE '';
    END LOOP;

    -- Get totals
    SELECT COUNT(*) INTO total_events FROM events WHERE start_time IS NOT NULL;
    SELECT COUNT(*) INTO total_pending FROM pending_events WHERE start_time IS NOT NULL;

    RAISE NOTICE 'RESTORATION SUMMARY:';
    RAISE NOTICE '- Events with times: %', total_events;
    RAISE NOTICE '- Pending with times: %', total_pending;
    RAISE NOTICE '';
END $$;
-- ================================
-- UPDATE MIGRATION LOG
-- ================================

-- Record the fix in the migration log
INSERT INTO migration_log (migration_name, description)
VALUES (
    'fix_timezone_double_conversion',
    'Fixed double timezone conversion by restoring original EST times from backup tables. The original migration incorrectly assumed UTC times when they were already in EST, causing events to display 4-5 hours earlier than intended.'
);
-- ================================
-- FINAL VALIDATION QUERIES
-- ================================

-- Create validation queries for manual verification
CREATE TEMP TABLE post_fix_validation AS
SELECT 1 as query_num,
       'Verify event times now display correctly' as description,
       $val1$
SELECT
    title,
    start_time as est_time,
    start_time AT TIME ZONE 'America/New_York' as ny_display,
    EXTRACT(hour FROM start_time) as hour_est
FROM events
WHERE start_time IS NOT NULL
ORDER BY start_time
LIMIT 10;
$val1$ as query_sql

UNION ALL

SELECT 2 as query_num,
       'Check that event hours are reasonable (6 AM - 11 PM)' as description,
       $val2$
SELECT
    title,
    start_time,
    EXTRACT(hour FROM start_time) as event_hour,
    CASE
        WHEN EXTRACT(hour FROM start_time) BETWEEN 6 AND 23 THEN 'REASONABLE'
        ELSE 'UNUSUAL'
    END as time_assessment
FROM events
WHERE start_time IS NOT NULL
ORDER BY start_time;
$val2$ as query_sql

UNION ALL

SELECT 3 as query_num,
       'Verify V1 API will return correct times' as description,
       $val3$
-- This simulates what the V1 API timezone conversion will produce
SELECT
    title,
    start_time as stored_est,
    start_time AT TIME ZONE 'America/New_York' as v1_display_equivalent
FROM events
WHERE start_time IS NOT NULL
ORDER BY start_time
LIMIT 5;
$val3$ as query_sql;

-- Display validation queries
DO $$
DECLARE
    val_record RECORD;
BEGIN
    RAISE NOTICE '========================================';
    RAISE NOTICE 'VALIDATION QUERIES - RUN THESE TO VERIFY:';
    RAISE NOTICE '========================================';

    FOR val_record IN SELECT * FROM post_fix_validation ORDER BY query_num LOOP
        RAISE NOTICE 'Query %: %', val_record.query_num, val_record.description;
        RAISE NOTICE '%', val_record.query_sql;
        RAISE NOTICE '----------------------------------------';
    END LOOP;
END $$;
-- ================================
|
||||||
|
-- COMPLETION MESSAGE
|
||||||
|
-- ================================
|
||||||
|
|
||||||
|
DO $$
|
||||||
|
BEGIN
|
||||||
|
RAISE NOTICE '========================================';
|
||||||
|
RAISE NOTICE 'TIMEZONE DOUBLE CONVERSION FIX COMPLETED';
|
||||||
|
RAISE NOTICE 'Completed at: %', NOW();
|
||||||
|
RAISE NOTICE '========================================';
|
||||||
|
RAISE NOTICE 'WHAT WAS FIXED:';
|
||||||
|
RAISE NOTICE '- Restored original EST times from backup tables';
|
||||||
|
RAISE NOTICE '- Fixed events showing at midnight/early morning hours';
|
||||||
|
RAISE NOTICE '- V1 API will now return correct EST times to frontend';
|
||||||
|
RAISE NOTICE '- V2 API logic should be updated to handle EST times properly';
|
||||||
|
RAISE NOTICE '========================================';
|
||||||
|
RAISE NOTICE 'NEXT STEPS:';
|
||||||
|
RAISE NOTICE '1. Run the validation queries above';
|
||||||
|
RAISE NOTICE '2. Test the frontend clients to confirm times display correctly';
|
||||||
|
RAISE NOTICE '3. Update V2 API to properly convert EST to UTC if needed';
|
||||||
|
RAISE NOTICE '4. Consider keeping backup tables until fully verified';
|
||||||
|
RAISE NOTICE '========================================';
|
||||||
|
END $$;
|
||||||
|
|
||||||
|
-- Commit the transaction
|
||||||
|
COMMIT;
324  migrate_hymnal_data.py  Executable file
@@ -0,0 +1,324 @@
#!/usr/bin/env python3
"""
Migrate SDA Hymnal data from SQLite to PostgreSQL.

This script transfers hymns, thematic lists, and responsive readings
while preserving the original formatting and verse structure.
"""

import sqlite3
import psycopg2
import os
import sys
from typing import Dict, List, Tuple
import logging

# Configure logging
logging.basicConfig(level=logging.INFO, format='%(asctime)s - %(levelname)s - %(message)s')
logger = logging.getLogger(__name__)


def get_postgres_connection():
    """Get PostgreSQL database connection from environment variables"""
    try:
        database_url = os.getenv('DATABASE_URL')
        if not database_url:
            raise ValueError("DATABASE_URL environment variable not set")

        conn = psycopg2.connect(database_url)
        return conn
    except Exception as e:
        logger.error(f"Failed to connect to PostgreSQL: {e}")
        sys.exit(1)


def get_sqlite_connection(sqlite_path: str):
    """Get SQLite database connection"""
    try:
        conn = sqlite3.connect(sqlite_path)
        conn.row_factory = sqlite3.Row  # Enable column access by name
        return conn
    except Exception as e:
        logger.error(f"Failed to connect to SQLite database at {sqlite_path}: {e}")
        sys.exit(1)


def format_old_hymnal_content(content: str) -> str:
    """
    Convert 1941 hymnal content to match the 1985 format by adding verse numbers.
    Assumes verses are separated by double newlines.
    """
    if not content or not content.strip():
        return content

    # Split content by double newlines (verse separators)
    verses = content.strip().split('\n\n')

    # Filter out empty verses
    verses = [verse.strip() for verse in verses if verse.strip()]

    # Add verse numbers
    formatted_verses = []
    for i, verse in enumerate(verses, 1):
        # Don't add numbers to very short content (likely chorus or single line)
        if len(verse.split('\n')) >= 2:
            formatted_verse = f"{i}.\n{verse}"
        else:
            formatted_verse = verse
        formatted_verses.append(formatted_verse)

    # Rejoin with double newlines
    return '\n\n'.join(formatted_verses)


def get_hymnal_mappings(pg_cursor) -> Dict[str, str]:
    """Get the hymnal ID mappings from PostgreSQL"""
    pg_cursor.execute("SELECT id, code FROM hymnals")
    mappings = {}
    for row in pg_cursor.fetchall():
        hymnal_id, code = row
        if code == 'sda-1985':
            mappings['en-newVersion'] = hymnal_id
        elif code == 'sda-1941':
            mappings['en-oldVersion'] = hymnal_id

    if len(mappings) != 2:
        raise ValueError("Could not find both hymnal versions in database")

    return mappings


def migrate_hymns(sqlite_conn, pg_conn, hymnal_mappings: Dict[str, str]):
    """Migrate hymns from SQLite to PostgreSQL"""
    logger.info("Starting hymns migration...")

    sqlite_cursor = sqlite_conn.cursor()
    pg_cursor = pg_conn.cursor()

    # Get all hymns from SQLite
    sqlite_cursor.execute("""
        SELECT number, title, content, hymnal_type,
               COALESCE(is_favorite, 0) as is_favorite
        FROM hymns
        ORDER BY hymnal_type, number
    """)

    hymns = sqlite_cursor.fetchall()
    logger.info(f"Found {len(hymns)} hymns to migrate")

    # Insert hymns into PostgreSQL
    insert_count = 0
    for hymn in hymns:
        try:
            hymnal_id = hymnal_mappings[hymn['hymnal_type']]

            # Format 1941 hymnal content to match 1985 format
            content = hymn['content']
            if hymn['hymnal_type'] == 'en-oldVersion':
                content = format_old_hymnal_content(content)
                logger.debug(f"Formatted hymn {hymn['number']} from 1941 hymnal")

            pg_cursor.execute("""
                INSERT INTO hymns (hymnal_id, number, title, content, is_favorite)
                VALUES (%s, %s, %s, %s, %s)
                ON CONFLICT (hymnal_id, number) DO UPDATE SET
                    title = EXCLUDED.title,
                    content = EXCLUDED.content,
                    is_favorite = EXCLUDED.is_favorite,
                    updated_at = NOW()
            """, (
                hymnal_id,
                hymn['number'],
                hymn['title'],
                content,
                bool(hymn['is_favorite'])
            ))

            insert_count += 1

            if insert_count % 100 == 0:
                logger.info(f"Migrated {insert_count} hymns...")
                pg_conn.commit()

        except Exception as e:
            logger.error(f"Failed to migrate hymn {hymn['number']} ({hymn['hymnal_type']}): {e}")
            continue

    pg_conn.commit()
    logger.info(f"Successfully migrated {insert_count} hymns")


def migrate_thematic_lists(sqlite_conn, pg_conn, hymnal_mappings: Dict[str, str]):
    """Migrate thematic lists and ambits from SQLite to PostgreSQL"""
    logger.info("Starting thematic lists migration...")

    sqlite_cursor = sqlite_conn.cursor()
    pg_cursor = pg_conn.cursor()

    # Get all thematic lists
    sqlite_cursor.execute("""
        SELECT id, thematic, hymnal_type
        FROM thematic_lists
        ORDER BY hymnal_type, id
    """)

    thematic_lists = sqlite_cursor.fetchall()
    logger.info(f"Found {len(thematic_lists)} thematic lists to migrate")

    # Track old_id -> new_id mapping for thematic lists
    thematic_list_mappings = {}

    for idx, theme_list in enumerate(thematic_lists):
        try:
            hymnal_id = hymnal_mappings[theme_list['hymnal_type']]

            # Insert thematic list
            pg_cursor.execute("""
                INSERT INTO thematic_lists (hymnal_id, name, sort_order)
                VALUES (%s, %s, %s)
                RETURNING id
            """, (hymnal_id, theme_list['thematic'], idx + 1))

            new_list_id = pg_cursor.fetchone()[0]
            thematic_list_mappings[theme_list['id']] = new_list_id

        except Exception as e:
            logger.error(f"Failed to migrate thematic list {theme_list['thematic']}: {e}")
            continue

    pg_conn.commit()
    logger.info(f"Successfully migrated {len(thematic_list_mappings)} thematic lists")

    # Now migrate thematic ambits
    logger.info("Starting thematic ambits migration...")

    sqlite_cursor.execute("""
        SELECT thematic_list_id, ambit, start_number, end_number
        FROM thematic_ambits
        ORDER BY thematic_list_id, start_number
    """)

    ambits = sqlite_cursor.fetchall()
    logger.info(f"Found {len(ambits)} thematic ambits to migrate")

    ambit_count = 0
    for ambit in ambits:
        try:
            if ambit['thematic_list_id'] not in thematic_list_mappings:
                logger.warning(f"Skipping ambit for missing thematic list ID {ambit['thematic_list_id']}")
                continue

            new_list_id = thematic_list_mappings[ambit['thematic_list_id']]

            pg_cursor.execute("""
                INSERT INTO thematic_ambits (thematic_list_id, name, start_number, end_number, sort_order)
                VALUES (%s, %s, %s, %s, %s)
            """, (
                new_list_id,
                ambit['ambit'],
                ambit['start_number'],
                ambit['end_number'],
                ambit_count + 1
            ))

            ambit_count += 1

        except Exception as e:
            logger.error(f"Failed to migrate ambit {ambit['ambit']}: {e}")
            continue

    pg_conn.commit()
    logger.info(f"Successfully migrated {ambit_count} thematic ambits")


def migrate_responsive_readings(sqlite_conn, pg_conn):
    """Migrate responsive readings from SQLite to PostgreSQL"""
    logger.info("Starting responsive readings migration...")

    sqlite_cursor = sqlite_conn.cursor()
    pg_cursor = pg_conn.cursor()

    # Get all responsive readings
    sqlite_cursor.execute("""
        SELECT number, title, content, COALESCE(is_favorite, 0) as is_favorite
        FROM responsive_readings
        ORDER BY number
    """)

    readings = sqlite_cursor.fetchall()
    logger.info(f"Found {len(readings)} responsive readings to migrate")

    reading_count = 0
    for reading in readings:
        try:
            pg_cursor.execute("""
                INSERT INTO responsive_readings (number, title, content, is_favorite)
                VALUES (%s, %s, %s, %s)
                ON CONFLICT (number) DO UPDATE SET
                    title = EXCLUDED.title,
                    content = EXCLUDED.content,
                    is_favorite = EXCLUDED.is_favorite,
                    updated_at = NOW()
            """, (
                reading['number'],
                reading['title'],
                reading['content'],
                bool(reading['is_favorite'])
            ))

            reading_count += 1

        except Exception as e:
            logger.error(f"Failed to migrate responsive reading {reading['number']}: {e}")
            continue

    pg_conn.commit()
    logger.info(f"Successfully migrated {reading_count} responsive readings")


def main():
    """Main migration function"""
    if len(sys.argv) != 2:
        print("Usage: python3 migrate_hymnal_data.py <path_to_hymnarium.db>")
        sys.exit(1)

    sqlite_path = sys.argv[1]

    if not os.path.exists(sqlite_path):
        logger.error(f"SQLite database file not found: {sqlite_path}")
        sys.exit(1)

    logger.info("Starting SDA Hymnal migration...")
    logger.info(f"Source: {sqlite_path}")
    logger.info("Target: PostgreSQL (DATABASE_URL)")

    # Connect to both databases
    sqlite_conn = get_sqlite_connection(sqlite_path)
    pg_conn = get_postgres_connection()

    try:
        # Get hymnal mappings
        pg_cursor = pg_conn.cursor()
        hymnal_mappings = get_hymnal_mappings(pg_cursor)
        logger.info(f"Found hymnal mappings: {hymnal_mappings}")

        # Run migrations
        migrate_hymns(sqlite_conn, pg_conn, hymnal_mappings)
        migrate_thematic_lists(sqlite_conn, pg_conn, hymnal_mappings)
        migrate_responsive_readings(sqlite_conn, pg_conn)

        # Print summary
        pg_cursor.execute("SELECT COUNT(*) FROM hymns")
        total_hymns = pg_cursor.fetchone()[0]

        pg_cursor.execute("SELECT COUNT(*) FROM thematic_lists")
        total_themes = pg_cursor.fetchone()[0]

        pg_cursor.execute("SELECT COUNT(*) FROM responsive_readings")
        total_readings = pg_cursor.fetchone()[0]

        logger.info("Migration completed successfully!")
        logger.info(f"Final counts: {total_hymns} hymns, {total_themes} themes, {total_readings} readings")

    except Exception as e:
        logger.error(f"Migration failed: {e}")
        pg_conn.rollback()
        raise

    finally:
        sqlite_conn.close()
        pg_conn.close()


if __name__ == "__main__":
    main()
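The verse-numbering conversion for the 1941 hymnal can be sanity-checked in isolation. A minimal sketch re-implementing the same logic as `format_old_hymnal_content` (sample lyrics are illustrative, not from the database):

```python
def format_old_hymnal_content(content: str) -> str:
    """Number multi-line verses (1., 2., ...) the way the 1985 hymnal formats them."""
    if not content or not content.strip():
        return content
    # Verses are separated by double newlines; drop empties
    verses = [v.strip() for v in content.strip().split('\n\n') if v.strip()]
    formatted = []
    for i, verse in enumerate(verses, 1):
        # Single-line fragments (likely a chorus) stay unnumbered
        if len(verse.split('\n')) >= 2:
            formatted.append(f"{i}.\n{verse}")
        else:
            formatted.append(verse)
    return '\n\n'.join(formatted)

raw = "Holy, holy, holy\nLord God Almighty\n\nEarly in the morning\nOur song shall rise"
print(format_old_hymnal_content(raw))
```

This prints the two verses prefixed with `1.` and `2.` on their own lines, matching the 1985 convention described in the documentation above.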
75  simple_hymn_cleanup.sql  Normal file
@@ -0,0 +1,75 @@
-- Simple cleanup: Strip everything after hymn numbers, keep just the number
-- This will clean #415 - Christ the Lord, All Power Possessing "..." down to just #415

-- STEP 1: Preview what changes will be made (run this first to see what gets cleaned)
SELECT
    id,
    title,
    date,
    'divine_worship' as field,
    divine_worship as before_cleanup,
    REGEXP_REPLACE(
        REGEXP_REPLACE(
            REGEXP_REPLACE(divine_worship,
                '#([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'        -- #123 - anything -> #123
            ),
            'Hymn\s+([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'      -- Hymn 123 - anything -> #123
        ),
        'No\.\s*([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'          -- No. 123 - anything -> #123
    ) as after_cleanup
FROM bulletins
WHERE divine_worship ~ '(#|Hymn\s+|No\.\s*)[0-9]{1,3}\s*-'

UNION ALL

SELECT
    id,
    title,
    date,
    'sabbath_school' as field,
    sabbath_school as before_cleanup,
    REGEXP_REPLACE(
        REGEXP_REPLACE(
            REGEXP_REPLACE(sabbath_school,
                '#([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'
            ),
            'Hymn\s+([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'
        ),
        'No\.\s*([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'
    ) as after_cleanup
FROM bulletins
WHERE sabbath_school ~ '(#|Hymn\s+|No\.\s*)[0-9]{1,3}\s*-'
ORDER BY date DESC;

-- STEP 2: Create backup before running cleanup
-- CREATE TABLE bulletins_backup AS SELECT * FROM bulletins;

-- STEP 3: Actually do the cleanup (uncomment after reviewing preview)
/*
UPDATE bulletins
SET divine_worship = REGEXP_REPLACE(
    REGEXP_REPLACE(
        REGEXP_REPLACE(divine_worship,
            '#([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'
        ),
        'Hymn\s+([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'
    ),
    'No\.\s*([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'
)
WHERE divine_worship ~ '(#|Hymn\s+|No\.\s*)[0-9]{1,3}\s*-';

UPDATE bulletins
SET sabbath_school = REGEXP_REPLACE(
    REGEXP_REPLACE(
        REGEXP_REPLACE(sabbath_school,
            '#([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'
        ),
        'Hymn\s+([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'
    ),
    'No\.\s*([0-9]{1,3})\s*-[^#\n]*', '#\1', 'g'
)
WHERE sabbath_school ~ '(#|Hymn\s+|No\.\s*)[0-9]{1,3}\s*-';
*/

-- STEP 4: Verify cleanup worked
-- SELECT COUNT(*) FROM bulletins WHERE divine_worship ~ '#[0-9]{1,3}\s*-' OR sabbath_school ~ '#[0-9]{1,3}\s*-';
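The three chained REGEXP_REPLACE patterns can be exercised outside the database before uncommenting the UPDATE statements. A sketch of the same substitutions in Python (`strip_hymn_titles` is a hypothetical helper name; the patterns are copied from the SQL above):

```python
import re

def strip_hymn_titles(text: str) -> str:
    """Collapse '#123 - Title', 'Hymn 123 - Title', 'No. 123 - Title' down to '#123'."""
    text = re.sub(r'#([0-9]{1,3})\s*-[^#\n]*', r'#\1', text)
    text = re.sub(r'Hymn\s+([0-9]{1,3})\s*-[^#\n]*', r'#\1', text)
    text = re.sub(r'No\.\s*([0-9]{1,3})\s*-[^#\n]*', r'#\1', text)
    return text

print(strip_hymn_titles('#415 - Christ the Lord, All Power Possessing'))  # → #415
```

Note the order matters: the `#`-prefixed pattern runs first, so rewrites from the `Hymn`/`No.` patterns (which emit `#123`) are not re-processed with a stray title still attached.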
@@ -16,7 +16,7 @@ use axum::{
 };
 use uuid::Uuid;
 use chrono::{Datelike, Timelike};
-use crate::{db, AppState};
+use crate::{AppState, services::EventService};
 
 // Use shared ListQueryParams instead of custom EventQuery
 // #[derive(Deserialize)]

@@ -33,7 +33,7 @@ pub async fn list(
     let timezone = query.timezone.as_deref().unwrap_or(DEFAULT_CHURCH_TIMEZONE);
     let pagination = PaginationHelper::from_query(query.page, query.per_page);
 
-    let events = crate::db::events::list(&state.pool).await?;
+    let events = EventService::list_v2(&state.pool, timezone, &url_builder).await?;
     let total = events.len() as i64;
 
     // Apply pagination

@@ -81,8 +81,8 @@ pub async fn get_by_id(
     Query(query): Query<ListQueryParams>,
 ) -> Result<Json<ApiResponse<EventV2>>> {
     let timezone = query.timezone.as_deref().unwrap_or(DEFAULT_CHURCH_TIMEZONE);
-    let event = crate::db::events::get_by_id(&state.pool, &id).await?
-        .ok_or_else(|| ApiError::NotFound("Event not found".to_string()))?;
+    let event = EventService::get_by_id_v2(&state.pool, &id, timezone, &url_builder).await?
+        .ok_or_else(|| ApiError::event_not_found(&id))?;
 
     let url_builder = UrlBuilder::new();
     let event_v2 = convert_event_to_v2(event, timezone, &url_builder)?;

@@ -115,8 +115,8 @@ pub async fn create(
         return Err(ApiError::ValidationError("End time must be after start time".to_string()));
     }
 
-    let event_id = Uuid::new_v4();
-    let event = db::events::create(&state.pool, &event_id, &crate::models::CreateEventRequest {
+    let url_builder = UrlBuilder::new();
+    let event = EventService::create(&state.pool, crate::models::CreateEventRequest {
         title: req.title,
         description: req.description,
         start_time: start_time.utc,

@@ -126,9 +126,7 @@ pub async fn create(
         category: req.category,
         is_featured: req.is_featured,
         recurring_type: req.recurring_type,
-    }).await?;
-
-    let url_builder = UrlBuilder::new();
+    }, &url_builder).await?;
 
     let event_v2 = convert_event_to_v2(event, timezone, &url_builder)?;
 
     Ok(success_response(event_v2))

@@ -7,7 +7,7 @@ use crate::{
     utils::{
         urls::UrlBuilder,
         converters::{convert_bulletins_to_v1, convert_bulletin_to_v1, convert_bulletins_to_v2, convert_bulletin_to_v2},
-        db_operations::BulletinOperations,
+        // db_operations::BulletinOperations, // DELETED - using db:: directly
     },
     handlers::bulletins_shared::{process_bulletins_batch, process_single_bulletin},
 };
@ -1,564 +0,0 @@
|
||||||
use sqlx::PgPool;
|
|
||||||
use uuid::Uuid;
|
|
||||||
use crate::{
|
|
||||||
error::{ApiError, Result},
|
|
||||||
models::*,
|
|
||||||
utils::{query::QueryBuilder, sanitize::strip_html_tags},
|
|
||||||
};
|
|
||||||
|
|
||||||
/// Generic database operations for common patterns
|
|
||||||
pub struct DbOperations;
|
|
||||||
|
|
||||||
impl DbOperations {
|
|
||||||
/// Generic list operation with pagination
|
|
||||||
pub async fn list_paginated<T>(
|
|
||||||
pool: &PgPool,
|
|
||||||
table_name: &str,
|
|
||||||
offset: i64,
|
|
||||||
limit: i64,
|
|
||||||
active_only: bool,
|
|
||||||
additional_conditions: Option<&str>,
|
|
||||||
) -> Result<(Vec<T>, i64)>
|
|
||||||
where
|
|
||||||
T: for<'r> sqlx::FromRow<'r, sqlx::postgres::PgRow> + Send + Unpin,
|
|
||||||
{
|
|
||||||
let active_condition = if active_only {
|
|
||||||
" AND is_active = true"
|
|
||||||
} else {
|
|
||||||
""
|
|
||||||
};
|
|
||||||
|
|
||||||
let additional_cond = additional_conditions.unwrap_or("");
|
|
||||||
|
|
||||||
let base_query = format!(
|
|
||||||
"SELECT * FROM {} WHERE 1=1{}{} ORDER BY created_at DESC",
|
|
||||||
table_name, active_condition, additional_cond
|
|
||||||
);
|
|
||||||
|
|
||||||
let count_query = format!(
|
|
||||||
"SELECT COUNT(*) as count FROM {} WHERE 1=1{}{}",
|
|
||||||
table_name, active_condition, additional_cond
|
|
||||||
);
|
|
||||||
|
|
||||||
let query_with_pagination = format!("{} LIMIT {} OFFSET {}", base_query, limit, offset);
|
|
||||||
|
|
||||||
let (items, total) = tokio::try_join!(
|
|
||||||
QueryBuilder::fetch_all::<T>(pool, &query_with_pagination),
|
|
||||||
QueryBuilder::fetch_one::<(i64,)>(pool, &count_query)
|
|
||||||
)?;
|
|
||||||
|
|
||||||
Ok((items, total.0))
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Generic get by ID operation
|
|
||||||
pub async fn get_by_id<T>(
|
|
||||||
pool: &PgPool,
|
|
||||||
table_name: &str,
|
|
||||||
id: &Uuid,
|
|
||||||
) -> Result<Option<T>>
|
|
||||||
where
|
|
||||||
T: for<'r> sqlx::FromRow<'r, sqlx::postgres::PgRow> + Send + Unpin,
|
|
||||||
{
|
|
||||||
let query = format!("SELECT * FROM {} WHERE id = $1", table_name);
|
|
||||||
sqlx::query_as(&query)
|
|
||||||
.bind(id)
|
|
||||||
.fetch_optional(pool)
|
|
||||||
.await
|
|
||||||
.map_err(ApiError::DatabaseError)
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Generic get by ID operation for bulletins specifically
|
|
||||||
pub async fn get_bulletin_by_id(
|
|
||||||
pool: &PgPool,
|
|
||||||
id: &Uuid,
|
|
||||||
) -> Result<Option<Bulletin>> {
|
|
||||||
sqlx::query_as!(
|
|
||||||
Bulletin,
|
|
||||||
"SELECT id, title, date, url, pdf_url, is_active, pdf_file, sabbath_school, divine_worship,
|
|
||||||
scripture_reading, sunset, cover_image, pdf_path, created_at, updated_at
|
|
||||||
FROM bulletins WHERE id = $1",
|
|
||||||
id
|
|
||||||
)
|
|
||||||
.fetch_optional(pool)
|
|
||||||
.await
|
|
||||||
.map_err(|e| {
|
|
||||||
tracing::error!("Failed to get bulletin {}: {}", id, e);
|
|
||||||
ApiError::DatabaseError(e)
|
|
||||||
})
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Generic get by ID operation for events specifically
|
|
||||||
pub async fn get_event_by_id(
|
|
||||||
pool: &PgPool,
|
|
||||||
id: &Uuid,
|
|
||||||
) -> Result<Option<Event>> {
|
|
||||||
sqlx::query_as!(Event, "SELECT * FROM events WHERE id = $1", id)
|
|
||||||
.fetch_optional(pool)
|
|
||||||
.await
|
|
||||||
.map_err(|e| {
|
|
||||||
tracing::error!("Failed to get event {}: {}", id, e);
|
|
||||||
ApiError::DatabaseError(e)
|
|
||||||
})
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Delete bulletin by ID
|
|
||||||
pub async fn delete_bulletin_by_id(
|
|
||||||
pool: &PgPool,
|
|
||||||
id: &Uuid,
|
|
||||||
) -> Result<()> {
|
|
||||||
let result = sqlx::query!("DELETE FROM bulletins WHERE id = $1", id)
|
|
||||||
.execute(pool)
|
|
||||||
.await
|
|
||||||
.map_err(ApiError::DatabaseError)?;
|
|
||||||
|
|
||||||
if result.rows_affected() == 0 {
|
|
||||||
return Err(ApiError::NotFound("Bulletin not found".to_string()));
|
|
||||||
}
|
|
||||||
|
|
||||||
Ok(())
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Generic delete by ID operation
|
|
||||||
pub async fn delete_by_id(
|
|
||||||
pool: &PgPool,
|
|
||||||
table_name: &str,
|
|
||||||
id: &Uuid,
|
|
||||||
) -> Result<()> {
|
|
||||||
let query = format!("DELETE FROM {} WHERE id = $1", table_name);
|
|
||||||
let result = sqlx::query(&query)
|
|
||||||
.bind(id)
|
|
||||||
.execute(pool)
|
|
||||||
.await
|
|
||||||
.map_err(ApiError::DatabaseError)?;
|
|
||||||
|
|
||||||
if result.rows_affected() == 0 {
|
|
||||||
return Err(ApiError::NotFound(format!("Record not found in {}", table_name)));
|
|
||||||
}
|
|
||||||
|
|
||||||
Ok(())
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Delete event by ID
|
|
||||||
pub async fn delete_event_by_id(
|
|
||||||
pool: &PgPool,
|
|
||||||
id: &Uuid,
|
|
||||||
) -> Result<()> {
|
|
||||||
let result = sqlx::query!("DELETE FROM events WHERE id = $1", id)
|
|
||||||
.execute(pool)
|
|
||||||
.await
|
|
||||||
.map_err(ApiError::DatabaseError)?;
|
|
||||||
|
|
||||||
if result.rows_affected() == 0 {
|
|
||||||
return Err(ApiError::NotFound("Event not found".to_string()));
|
|
||||||
}
|
|
||||||
|
|
||||||
Ok(())
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Generic active/featured filtering
|
|
||||||
pub async fn get_active<T>(
|
|
||||||
pool: &PgPool,
|
|
||||||
table_name: &str,
|
|
||||||
limit: Option<i64>,
|
|
||||||
) -> Result<Vec<T>>
|
|
||||||
where
|
|
||||||
T: for<'r> sqlx::FromRow<'r, sqlx::postgres::PgRow> + Send + Unpin,
|
|
||||||
{
|
|
||||||
let limit_clause = limit.map(|l| format!(" LIMIT {}", l)).unwrap_or_default();
|
|
||||||
let query = format!(
|
|
||||||
"SELECT * FROM {} WHERE is_active = true ORDER BY created_at DESC{}",
|
|
||||||
table_name, limit_clause
|
|
||||||
);
|
|
||||||
QueryBuilder::fetch_all(pool, &query).await
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Generic current item (for bulletins, etc.)
|
|
||||||
pub async fn get_current<T>(
|
|
||||||
pool: &PgPool,
|
|
||||||
table_name: &str,
|
|
||||||
date_column: &str,
|
|
||||||
) -> Result<Option<T>>
|
|
||||||
where
|
|
||||||
T: for<'r> sqlx::FromRow<'r, sqlx::postgres::PgRow> + Send + Unpin,
|
|
||||||
{
|
|
||||||
let query = format!(
|
|
||||||
"SELECT * FROM {} WHERE is_active = true AND {} <= (NOW() AT TIME ZONE 'America/New_York')::date ORDER BY {} DESC LIMIT 1",
|
|
||||||
table_name, date_column, date_column
|
|
||||||
);
|
|
||||||
QueryBuilder::fetch_optional(pool, &query).await
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Generic next item (for bulletins, etc.)
|
|
||||||
pub async fn get_next<T>(
|
|
||||||
pool: &PgPool,
|
|
||||||
table_name: &str,
|
|
||||||
date_column: &str,
|
|
||||||
) -> Result<Option<T>>
|
|
||||||
where
|
|
||||||
T: for<'r> sqlx::FromRow<'r, sqlx::postgres::PgRow> + Send + Unpin,
|
|
||||||
{
|
|
||||||
let query = format!(
|
|
||||||
"SELECT * FROM {} WHERE is_active = true AND {} > (NOW() AT TIME ZONE 'America/New_York')::date ORDER BY {} ASC LIMIT 1",
|
|
||||||
table_name, date_column, date_column
|
|
||||||
);
|
|
||||||
QueryBuilder::fetch_optional(pool, &query).await
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Specialized operations for events
|
|
||||||
pub struct EventOperations;
|
|
||||||
|
|
||||||
impl EventOperations {
|
|
||||||
/// Get upcoming events
|
|
||||||
pub async fn get_upcoming(pool: &PgPool, limit: i64) -> Result<Vec<Event>> {
|
|
||||||
sqlx::query_as!(
|
|
||||||
Event,
|
|
||||||
"SELECT * FROM events WHERE start_time > NOW() ORDER BY start_time ASC LIMIT $1",
|
|
||||||
limit
|
|
||||||
)
|
|
||||||
.fetch_all(pool)
|
|
||||||
.await
|
|
||||||
.map_err(ApiError::DatabaseError)
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Get featured events
|
|
||||||
pub async fn get_featured(pool: &PgPool, limit: i64) -> Result<Vec<Event>> {
|
|
||||||
sqlx::query_as!(
|
|
||||||
Event,
|
|
||||||
"SELECT * FROM events WHERE is_featured = true AND start_time > NOW() ORDER BY start_time ASC LIMIT $1",
|
|
||||||
limit
|
|
||||||
)
|
|
||||||
.fetch_all(pool)
|
|
||||||
.await
|
|
||||||
.map_err(ApiError::DatabaseError)
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Create event with sanitization
|
|
||||||
pub async fn create(pool: &PgPool, req: CreateEventRequest) -> Result<Event> {
|
|
||||||
let sanitized_description = strip_html_tags(&req.description);
|
|
||||||
let normalized_recurring_type = req.recurring_type.as_ref()
|
|
||||||
.map(|rt| crate::utils::validation::normalize_recurring_type(rt));
|
|
||||||
|
|
||||||
sqlx::query_as!(
|
|
||||||
Event,
|
|
||||||
r#"
|
|
||||||
INSERT INTO events (
|
|
||||||
id, title, description, start_time, end_time, location,
|
|
||||||
location_url, category, is_featured, recurring_type
|
|
||||||
) VALUES (
|
|
||||||
gen_random_uuid(), $1, $2, $3, $4, $5, $6, $7, $8, $9
|
|
||||||
) RETURNING *"#,
|
|
||||||
req.title,
|
|
||||||
sanitized_description,
|
|
||||||
req.start_time,
|
|
||||||
req.end_time,
|
|
||||||
req.location,
|
|
||||||
req.location_url,
|
|
||||||
req.category,
|
|
||||||
req.is_featured.unwrap_or(false),
|
|
||||||
normalized_recurring_type,
|
|
||||||
)
|
|
||||||
.fetch_one(pool)
|
|
||||||
.await
|
|
||||||
.map_err(ApiError::DatabaseError)
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Update event
|
|
||||||
pub async fn update(pool: &PgPool, id: &Uuid, req: CreateEventRequest) -> Result<Event> {
|
|
||||||
let sanitized_description = strip_html_tags(&req.description);
|
|
||||||
let normalized_recurring_type = req.recurring_type.as_ref()
|
|
||||||
.map(|rt| crate::utils::validation::normalize_recurring_type(rt));
|
|
||||||
|
|
||||||
sqlx::query_as!(
|
|
||||||
Event,
|
|
||||||
r#"
|
|
||||||
UPDATE events SET
|
|
||||||
title = $2, description = $3, start_time = $4, end_time = $5,
|
|
||||||
location = $6, location_url = $7, category = $8,
|
|
||||||
is_featured = $9, recurring_type = $10, updated_at = NOW()
|
|
||||||
WHERE id = $1 RETURNING *"#,
|
|
||||||
id,
|
|
||||||
req.title,
|
|
||||||
sanitized_description,
|
|
||||||
req.start_time,
|
|
||||||
req.end_time,
|
|
||||||
req.location,
|
|
||||||
req.location_url,
|
|
||||||
req.category,
|
|
||||||
req.is_featured.unwrap_or(false),
|
|
||||||
normalized_recurring_type,
|
|
||||||
)
|
|
||||||
.fetch_one(pool)
|
|
||||||
.await
|
|
||||||
.map_err(ApiError::DatabaseError)
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Submit pending event
|
|
||||||
pub async fn submit_pending(pool: &PgPool, req: SubmitEventRequest) -> Result<PendingEvent> {
|
|
||||||
let sanitized_description = strip_html_tags(&req.description);
|
|
||||||
|
|
||||||
sqlx::query_as!(
|
|
||||||
PendingEvent,
|
|
||||||
r#"
|
|
||||||
INSERT INTO pending_events (
|
|
||||||
id, title, description, start_time, end_time, location,
|
|
||||||
location_url, category, is_featured, recurring_type,
|
|
||||||
bulletin_week, submitter_email, image, thumbnail
|
|
||||||
) VALUES (
|
|
||||||
gen_random_uuid(), $1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11, $12, $13
|
|
||||||
) RETURNING *"#,
|
|
||||||
req.title,
|
|
||||||
sanitized_description,
|
|
||||||
req.start_time,
|
|
||||||
req.end_time,
|
|
||||||
req.location,
|
|
||||||
req.location_url,
|
|
||||||
req.category,
|
|
||||||
req.is_featured.unwrap_or(false),
|
|
||||||
req.recurring_type,
|
|
||||||
req.bulletin_week,
|
|
||||||
req.submitter_email,
|
|
||||||
req.image,
|
|
||||||
req.thumbnail,
|
|
||||||
)
|
|
||||||
.fetch_one(pool)
|
|
||||||
.await
|
|
||||||
.map_err(ApiError::DatabaseError)
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
/// Specialized operations for bulletins
pub struct BulletinOperations;

impl BulletinOperations {
    /// Get current bulletin
    pub async fn get_current(pool: &PgPool) -> Result<Option<Bulletin>> {
        DbOperations::get_current(pool, "bulletins", "date").await
    }

    /// Get next bulletin
    pub async fn get_next(pool: &PgPool) -> Result<Option<Bulletin>> {
        DbOperations::get_next(pool, "bulletins", "date").await
    }

    /// List bulletins with pagination
    pub async fn list_paginated(
        pool: &PgPool,
        offset: i64,
        limit: i64,
        active_only: bool,
    ) -> Result<(Vec<Bulletin>, i64)> {
        // Use a custom query for bulletins to order by date instead of created_at
        let active_condition = if active_only {
            " AND is_active = true"
        } else {
            ""
        };

        let base_query = format!(
            "SELECT * FROM bulletins WHERE 1=1{} ORDER BY date DESC",
            active_condition
        );

        let count_query = format!(
            "SELECT COUNT(*) as count FROM bulletins WHERE 1=1{}",
            active_condition
        );

        let query_with_pagination = format!("{} LIMIT {} OFFSET {}", base_query, limit, offset);

        let (items, total) = tokio::try_join!(
            crate::utils::query::QueryBuilder::fetch_all::<Bulletin>(pool, &query_with_pagination),
            crate::utils::query::QueryBuilder::fetch_one::<(i64,)>(pool, &count_query)
        )?;

        Ok((items, total.0))
    }

    /// Create bulletin
    pub async fn create(pool: &PgPool, req: CreateBulletinRequest) -> Result<Bulletin> {
        sqlx::query_as!(
            Bulletin,
            r#"
            INSERT INTO bulletins (
                id, title, date, url, cover_image, sabbath_school,
                divine_worship, scripture_reading, sunset, is_active
            ) VALUES (
                gen_random_uuid(), $1, $2, $3, $4, $5, $6, $7, $8, $9
            ) RETURNING id, title, date, url, pdf_url, is_active, pdf_file,
                sabbath_school, divine_worship, scripture_reading, sunset,
                cover_image, pdf_path, created_at, updated_at"#,
            req.title,
            req.date,
            req.url,
            req.cover_image,
            req.sabbath_school,
            req.divine_worship,
            req.scripture_reading,
            req.sunset,
            req.is_active.unwrap_or(true),
        )
        .fetch_one(pool)
        .await
        .map_err(ApiError::DatabaseError)
    }
}

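`list_paginated` expects a precomputed offset and limit and hands back `(items, total)`. A hedged sketch of the page arithmetic a handler layer might perform around it — `page_to_offset` and `total_pages` are illustrative names, not functions from this codebase:

```rust
// Convert a 1-based page number into the OFFSET that list_paginated
// expects; pages below 1 are clamped to the first page.
fn page_to_offset(page: i64, per_page: i64) -> i64 {
    (page.max(1) - 1) * per_page
}

// Ceiling division: how many pages does `total` rows span?
fn total_pages(total: i64, per_page: i64) -> i64 {
    if per_page <= 0 { 0 } else { (total + per_page - 1) / per_page }
}

fn main() {
    assert_eq!(page_to_offset(1, 25), 0);
    assert_eq!(page_to_offset(3, 25), 50);
    assert_eq!(total_pages(101, 25), 5);
}
```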
/// Specialized operations for bible verses
pub struct BibleVerseOperations;

impl BibleVerseOperations {
    /// Get a random active verse
    pub async fn get_random(pool: &PgPool) -> Result<Option<BibleVerse>> {
        sqlx::query_as!(
            BibleVerse,
            "SELECT * FROM bible_verses WHERE is_active = true ORDER BY RANDOM() LIMIT 1"
        )
        .fetch_optional(pool)
        .await
        .map_err(ApiError::DatabaseError)
    }

    /// Parse verse range format (e.g., "John 3:16-18" or "2 Peter 1:20-21").
    /// Also handles abbreviations like "Matt 1:21-23".
    fn parse_verse_range(query: &str) -> Option<(String, i32, i32)> {
        // First normalize the query to resolve any Bible book abbreviations
        let normalized_query = crate::utils::bible_books::normalize_bible_reference(query);

        // Look for the pattern "Book Chapter:StartVerse-EndVerse"
        if let Some(dash_pos) = normalized_query.rfind('-') {
            let before_dash = &normalized_query[..dash_pos];
            let after_dash = &normalized_query[dash_pos + 1..];

            // Parse the end verse
            if let Ok(end_verse) = after_dash.parse::<i32>() {
                // Find the colon to extract the start verse
                if let Some(colon_pos) = before_dash.rfind(':') {
                    let book_chapter = &before_dash[..colon_pos];
                    let start_verse_str = &before_dash[colon_pos + 1..];

                    if let Ok(start_verse) = start_verse_str.parse::<i32>() {
                        return Some((book_chapter.to_string(), start_verse, end_verse));
                    }
                }
            }
        }
        None
    }

    /// Search verses by text or reference (supports comma-separated references and verse ranges)
    pub async fn search(pool: &PgPool, query_text: &str, limit: i64) -> Result<Vec<BibleVerse>> {
        // First normalize the query to resolve any Bible book abbreviations
        let normalized_query = crate::utils::bible_books::normalize_bible_reference(query_text);

        // A comma means multiple references (e.g. "John 3:16, Psalm 23:1")
        if normalized_query.contains(',') {
            let mut all_verses = Vec::new();
            let references: Vec<&str> = normalized_query.split(',').map(|s| s.trim()).collect();

            for reference in references {
                if !reference.is_empty() {
                    let verses = Self::search_single_reference(pool, reference, limit).await?;
                    all_verses.extend(verses);
                }
            }

            // Remove duplicates and apply the limit
            all_verses.sort_by(|a, b| Self::sort_bible_references(&a.reference, &b.reference));
            all_verses.dedup_by(|a, b| a.id == b.id);
            all_verses.truncate(limit as usize);

            Ok(all_verses)
        } else {
            Self::search_single_reference(pool, &normalized_query, limit).await
        }
    }

    /// Search a single reference, which may be a verse range or a simple pattern
    async fn search_single_reference(pool: &PgPool, query_text: &str, limit: i64) -> Result<Vec<BibleVerse>> {
        // Check whether this is a verse range
        if let Some((book_chapter, start_verse, end_verse)) = Self::parse_verse_range(query_text) {
            let mut all_verses = Vec::new();

            // Query for each verse in the range individually
            for verse_num in start_verse..=end_verse {
                let reference_pattern = format!("{}:{}", book_chapter, verse_num);
                // Exact (case-insensitive) match on "Book Chapter:Verse";
                // no wildcards, so ILIKE acts as a case-folded equality check
                let verses = sqlx::query_as!(
                    BibleVerse,
                    r#"
                    SELECT * FROM bible_verses
                    WHERE is_active = true
                    AND reference ILIKE $1"#,
                    reference_pattern
                )
                .fetch_all(pool)
                .await
                .map_err(ApiError::DatabaseError)?;

                all_verses.extend(verses);
            }

            // Sort by verse order and apply the limit
            all_verses.sort_by(|a, b| Self::sort_bible_references(&a.reference, &b.reference));
            all_verses.truncate(limit as usize);

            Ok(all_verses)
        } else {
            // Single reference search: substring match on reference or text
            let search_pattern = format!("%{}%", query_text);
            sqlx::query_as!(
                BibleVerse,
                r#"
                SELECT * FROM bible_verses
                WHERE is_active = true
                AND (reference ILIKE $1 OR text ILIKE $1)
                ORDER BY reference
                LIMIT $2"#,
                search_pattern,
                limit
            )
            .fetch_all(pool)
            .await
            .map_err(ApiError::DatabaseError)
        }
    }

    /// Sort bible references in proper order (by book, chapter, verse)
    fn sort_bible_references(a: &str, b: &str) -> std::cmp::Ordering {
        // Simple lexicographic comparison for now; could be enhanced with
        // canonical book ordering and numeric chapter/verse comparison
        a.cmp(b)
    }
}

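The range parsing above can be exercised standalone. This sketch skips the abbreviation-normalization step (which the real code delegates to `utils::bible_books::normalize_bible_reference`) and mirrors only the last-dash / last-colon scan:

```rust
// Standalone sketch of the verse-range scan: split on the last '-' to get
// the end verse, then on the last ':' to separate "Book Chapter" from the
// start verse. Assumes the reference is already normalized.
fn parse_verse_range(q: &str) -> Option<(String, i32, i32)> {
    let dash = q.rfind('-')?;
    let end: i32 = q[dash + 1..].trim().parse().ok()?;
    let colon = q[..dash].rfind(':')?;
    let start: i32 = q[colon + 1..dash].trim().parse().ok()?;
    Some((q[..colon].to_string(), start, end))
}

fn main() {
    assert_eq!(
        parse_verse_range("John 3:16-18"),
        Some(("John 3".to_string(), 16, 18))
    );
    assert_eq!(
        parse_verse_range("2 Peter 1:20-21"),
        Some(("2 Peter 1".to_string(), 20, 21))
    );
    // Not a range: falls through to the plain ILIKE search path
    assert_eq!(parse_verse_range("John 3:16"), None);
}
```

Searching from the *last* dash and colon is what lets numbered books like "2 Peter" parse correctly, since the book name itself may contain digits and spaces.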
/// Specialized operations for schedules
pub struct ScheduleOperations;

impl ScheduleOperations {
    /// Get schedule by date
    pub async fn get_by_date(pool: &PgPool, date: chrono::NaiveDate) -> Result<Option<Schedule>> {
        sqlx::query_as!(
            Schedule,
            "SELECT * FROM schedule WHERE date = $1",
            date
        )
        .fetch_optional(pool)
        .await
        .map_err(ApiError::DatabaseError)
    }

    /// Get schedule entries for a date range
    pub async fn get_for_range(
        pool: &PgPool,
        start_date: chrono::NaiveDate,
        end_date: chrono::NaiveDate,
    ) -> Result<Vec<Schedule>> {
        sqlx::query_as!(
            Schedule,
            "SELECT * FROM schedule WHERE date BETWEEN $1 AND $2 ORDER BY date",
            start_date,
            end_date
        )
        .fetch_all(pool)
        .await
        .map_err(ApiError::DatabaseError)
    }
}

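A caller of `get_for_range` has to supply the date bounds itself. For a weekly, Sabbath-ending window, that caller might first work out how many days remain until Saturday; a minimal sketch of that arithmetic, kept free of `chrono` so it is self-contained (the 0 = Sunday … 6 = Saturday numbering is an assumption here, not this codebase's convention):

```rust
// How many days from `weekday` (0 = Sunday .. 6 = Saturday) until the
// next Saturday, counting 0 if it is already Saturday. A handler could
// add this to today's date to get the end bound for get_for_range.
fn days_until_saturday(weekday: u32) -> u32 {
    (6 + 7 - weekday) % 7
}

fn main() {
    assert_eq!(days_until_saturday(6), 0); // already Saturday
    assert_eq!(days_until_saturday(0), 6); // Sunday -> next Saturday
    assert_eq!(days_until_saturday(3), 3); // Wednesday
}
```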
@@ -9,7 +9,7 @@ pub mod images;
 pub mod sanitize;
 pub mod query;
 pub mod converters;
-pub mod db_operations;
+// pub mod db_operations; // DELETED - using service layer only
 pub mod codec_detection;
 pub mod media_parsing;
 pub mod backup;