SatLoc Application Processor Integration with Partner Sync Worker
Overview
Integrated SatLocApplicationProcessor into partner_sync_worker.js, replacing the previous direct use of the binary processor and adding proper application grouping, file management, and retry logic.
Changes Made
1. Import Added
```javascript
const SatLocApplicationProcessor = require('../helpers/satloc_application_processor');
```
2. Helper Functions Added
findMatchingAssignmentsForFile(taskData)
- Finds job assignments that match the aircraft ID for a log file
- Returns assignments with confidence scoring
- Used for building context data for the Application Processor
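The matching logic can be sketched as a pure scoring function. This is a hypothetical illustration: the real helper queries job assignments from the database, so here the assignment list is passed in directly, and the field names (`aircraftId`, `startDate`, `endDate`, `logDate`) and confidence weights are assumptions.

```javascript
// Hypothetical sketch of the assignment-matching logic. The real helper
// loads candidate assignments from the database; here they are passed in
// so the scoring is self-contained. Field names and weights are assumed.
function scoreAssignmentsForFile(assignments, taskData) {
  return assignments
    .map((assignment) => {
      let confidence = 0;
      const matchCriteria = [];
      // An exact aircraft ID match is the strongest signal.
      if (assignment.aircraftId === taskData.aircraftId) {
        confidence += 0.7;
        matchCriteria.push('aircraftId');
      }
      // A log date inside the assignment window adds confidence.
      if (
        assignment.startDate && assignment.endDate &&
        taskData.logDate >= assignment.startDate &&
        taskData.logDate <= assignment.endDate
      ) {
        confidence += 0.3;
        matchCriteria.push('dateWindow');
      }
      return { assignment, confidence, matchCriteria };
    })
    .filter((match) => match.confidence > 0)
    .sort((a, b) => b.confidence - a.confidence);
}
```

The returned matches carry the `confidence` and `matchCriteria` fields that later flow into the context data.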
buildContextDataFromAssignment(assignmentMatch, taskData)
- Builds proper context data for the Application Processor from job assignments
- Includes job ID, user ID, partner metadata, and confidence scoring
- Ensures proper application grouping and tracking
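Since the output shape is documented under "Context Data Structure" below, the builder can be sketched as a pure function over an assignment match and the task data; the field names follow that structure, and anything beyond it is an assumption.

```javascript
// Sketch of the context builder. The returned shape mirrors the Context
// Data Structure documented in this file; input shapes are assumptions.
function buildContextDataFromAssignment(assignmentMatch, taskData) {
  const { assignment } = assignmentMatch;
  return {
    jobId: assignment.job?._id,          // Job ID for application grouping
    userId: assignment.user?._id,        // User ID for application grouping
    uploadedDate: new Date(),            // Upload timestamp for grouping tolerance
    meta: {
      source: 'partner_sync',
      partnerId: taskData.partnerCode,
      aircraftId: taskData.aircraftId,
      logId: taskData.logId,
      logFileName: taskData.logFileName,
      assignmentId: assignment._id,
      confidence: assignmentMatch.confidence,
      matchCriteria: assignmentMatch.matchCriteria
    }
  };
}
```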
checkForExistingApplicationFile(logFileName, contextData)
- Checks if a log file has been processed before
- Enables retry logic by detecting existing ApplicationFile records
- Returns boolean indicating if retry processing is needed
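The retry check reduces to a lookup for an existing ApplicationFile record. In this sketch the model is injected as a parameter so the logic runs without a live database; the real helper uses the project's own ApplicationFile model, and the exact query shape is an assumption.

```javascript
// Sketch of the retry check. The ApplicationFile model is injected so the
// logic is testable without a database; the query fields are assumptions.
async function checkForExistingApplicationFile(logFileName, contextData, ApplicationFile) {
  const existing = await ApplicationFile.findOne({
    fileName: logFileName,
    'meta.partnerId': contextData.meta.partnerId
  });
  // A truthy record means the file was processed before and needs retry handling.
  return existing !== null;
}
```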
3. Processing Logic Updated
processLocalLogFile(taskData)
Before: Used SatLocBinaryProcessor directly
After: Uses SatLocApplicationProcessor with:
- Proper application grouping by job and date
- Automatic retry detection and cleanup
- Enhanced error handling per assignment
- ApplicationFile and ApplicationDetail management
processLogData(logData, taskData)
Before: Used SatLocBinaryProcessor with temporary files
After: Uses SatLocApplicationProcessor with:
- Same application grouping logic as local files
- Temporary file cleanup after processing
- Proper context data from assignments
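Both entry points share the same per-assignment flow: match assignments, build context, detect retries, and delegate to the processor. The sketch below injects the helpers and processor so it stands alone; the real worker calls the helpers directly, and the exact control flow is an assumption based on this document.

```javascript
// Hedged sketch of the shared flow behind processLocalLogFile and
// processLogData. Helpers and processor are injected for testability;
// the real worker references them directly.
async function processWithApplicationProcessor(taskData, filePath, processor, helpers) {
  const matches = await helpers.findMatchingAssignmentsForFile(taskData);
  const results = [];
  for (const match of matches) {
    const contextData = helpers.buildContextDataFromAssignment(match, taskData);
    const isRetry = await helpers.checkForExistingApplicationFile(
      taskData.logFileName,
      contextData
    );
    // Retry path cleans up prior records; normal path processes fresh.
    const result = isRetry
      ? await processor.retryLogFile(filePath, contextData)
      : await processor.processLogFile({ filePath }, contextData);
    results.push(result);
  }
  return results;
}
```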
Integration Benefits
✅ Application Grouping
- Files are now grouped under Applications by job ID and upload date
- Multiple log files for the same job are properly grouped
- Follows Job Worker pattern for consistency
✅ Automatic Retry/Cleanup Logic
- Detects if a file has been processed before
- Automatically calls `processor.retryLogFile()` for existing files
- Cleans up ApplicationDetails and resets ApplicationFile data
- No manual cleanup needed - handled by the processor
✅ Enhanced Data Management
- Creates proper Application, ApplicationFile, and ApplicationDetail records
- Stores spray segments in compressed format
- Calculates accumulated statistics (spray time, area, material)
- Optimized metadata storage in ApplicationFile.meta
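The accumulated statistics can be illustrated as a fold over parsed spray segments. This is an illustrative sketch only: the segment field names (`sprayTime`, `area`, `material`) are assumptions, while the total field names match the ApplicationFile fields listed later in this document.

```javascript
// Illustrative sketch of statistics accumulation. Segment field names are
// assumed; the totals mirror the ApplicationFile fields named in this doc.
function accumulateSprayStats(segments) {
  return segments.reduce(
    (totals, seg) => ({
      totalSprayTime: totals.totalSprayTime + seg.sprayTime,
      totalSprayed: totals.totalSprayed + seg.area,
      totalSprayMat: totals.totalSprayMat + seg.material
    }),
    { totalSprayTime: 0, totalSprayed: 0, totalSprayMat: 0 }
  );
}
```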
✅ Error Handling
- Continues processing other assignments if one fails
- Proper error logging and reporting
- Transaction-safe processing with MongoDB sessions
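The "continue on failure" behavior can be sketched as a loop that records per-assignment errors instead of aborting. The processing callback is injected, and MongoDB session handling is omitted here; both the function name and error shape are assumptions for illustration.

```javascript
// Sketch of per-assignment error handling: one failing assignment is
// recorded and skipped so the remaining assignments still process.
// Session/transaction handling is omitted from this sketch.
async function processAssignments(matches, processFn) {
  const results = [];
  const errors = [];
  for (const match of matches) {
    try {
      results.push(await processFn(match));
    } catch (err) {
      // Log the failure and continue with the next assignment.
      errors.push({ assignmentId: match.assignment._id, error: err.message });
    }
  }
  return { results, errors };
}
```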
✅ Partner Context Integration
- Includes partner-specific metadata (aircraft ID, log ID, confidence)
- Maintains assignment tracking and job matching
- Preserves existing partner sync workflow
How Retry/Cleanup Works
1. Detection
```javascript
const isRetry = await checkForExistingApplicationFile(taskData.logFileName, contextData);
```
2. Automatic Cleanup & Retry
```javascript
if (isRetry) {
  // SatLocApplicationProcessor.retryLogFile() automatically:
  // - Finds the existing ApplicationFile by filename
  // - Deletes all ApplicationDetails for that file
  // - Resets ApplicationFile calculated fields
  // - Reprocesses the file fresh
  processingResult = await processor.retryLogFile(tempFilePath, contextData);
} else {
  processingResult = await processor.processLogFile({ filePath: tempFilePath }, contextData);
}
```
3. What Gets Cleaned Up
- ApplicationDetails: All records linked to the file (`fileId`)
- ApplicationFile fields: `data` (spray segments), `totalSprayTime`, `totalFlightTime`, `totalSprayed`, `totalSprayMat`
Processor Configuration
```javascript
const processor = new SatLocApplicationProcessor({
  batchSize: 1000,                         // Batch size for ApplicationDetail inserts
  enableRetryLogic: true,                  // Enable automatic retry functionality
  groupingTolerance: 24 * 60 * 60 * 1000,  // 24-hour tolerance for grouping
  validateChecksums: true                  // Validate record checksums during parsing
});
```
Context Data Structure
```javascript
const contextData = {
  jobId: assignment.job?._id,     // Job ID for application grouping
  userId: assignment.user?._id,   // User ID for application grouping
  uploadedDate: new Date(),       // Upload timestamp for grouping tolerance
  meta: {
    source: 'partner_sync',                    // Processing source identifier
    partnerId: taskData.partnerCode,           // Partner system (SATLOC)
    aircraftId: taskData.aircraftId,           // Aircraft identifier
    logId: taskData.logId,                     // Partner log ID
    logFileName: taskData.logFileName,         // Original log filename
    assignmentId: assignment._id,              // Job assignment ID
    confidence: assignmentMatch.confidence,    // Matching confidence score
    matchCriteria: assignmentMatch.matchCriteria // How the match was made
  }
};
```
Testing
Run the integration test to verify everything is working:
```shell
node tests/test_partner_sync_integration.js
```
Check partner sync worker syntax:
```shell
node -c workers/partner_sync_worker.js
```
Next Steps
- ✅ Integration Complete - SatLocApplicationProcessor is now wired into partner sync worker
- ⏳ PROCESS_PARTNER_DATA_FILE - Ready for future integration when needed
- ⏳ Production Testing - Test with real partner log files
- ⏳ Performance Monitoring - Monitor application grouping and processing performance
Notes
- PROCESS_PARTNER_DATA_FILE task type was left untouched as requested
- Backward compatibility maintained for existing partner sync workflows
- Error handling improved to handle assignment-level failures gracefully
- Cleanup logic is now automated and comprehensive via the Application Processor