Zenodo Integration API Reference¶
The Heritage Data Processor Zenodo Integration API provides comprehensive endpoints for managing the complete Zenodo publication workflow, from metadata preparation through draft creation, file uploads, and publication, with support for versioning, backup/restore, and pipeline integration.
Base URL¶
Endpoints in this reference are grouped under three path prefixes: /metadata, /project, and /zenodo.
Metadata Mapping¶
Get Metadata Mapping Schema¶
Loads and returns the Zenodo mapping schema definition from a JSON file.
Endpoint: GET /metadata/mapping_schema_details
Response:
{
"fields": {
"title": {
"type": "string",
"required": true,
"description": "Title of the record"
},
"creators": {
"type": "array",
"required": true,
"description": "List of creators"
}
},
"mapping_types": [
"literal",
"column",
"filename",
"constructed",
"complex"
]
}
File Location:
Schema file is located at server_app/data/zenodo_mapping_schema.json.
Status Codes:
- 200 OK: Schema retrieved successfully
- 500 Internal Server Error: Schema file not found or failed to load
Save Metadata Mapping¶
Saves or updates metadata mapping configuration for the active project.
Endpoint: POST /project/metadata/save_mapping
Decorator: @project_required
Request Body:
{
"mappingConfiguration": {
"_mapping_mode": "file",
"_file_path": "/data/metadata.xlsx",
"_file_format": "excel",
"filename": {
"type": "column",
"value": "Filename"
},
"title": {
"type": "column",
"value": "Title"
},
"creators": {
"type": "complex",
"entries": [
{
"name": {"type": "column", "value": "Author"},
"affiliation": {"type": "literal", "value": "University"}
}
]
},
"keywords": {
"type": "column",
"value": "Keywords",
"delimiter": ";"
},
"description": {
"type": "construct_later"
}
}
}
Request Parameters:
- mappingConfiguration (object, required): Complete mapping configuration dictionary
- _mapping_mode (string): Mode of mapping: "file" (spreadsheet-based) or other modes
- _file_path (string): Path to the metadata spreadsheet file
- _file_format (string): Format of the metadata file (e.g., "csv", "excel")
- Field mappings with types: literal, column, filename, filename_stem, constructed, construct_later, complex, ordered_combined_columns
Mapping Types:
Literal: Static value used for all records
Column: Extract value from spreadsheet column
Filename: Use complete filename
Filename Stem: Use filename without extension
Constructed: Template with {filename} and {filename_stem} placeholders
Construct Later: Auto-generate description during metadata preparation
Complex: Array of objects with nested field mappings
Ordered Combined Columns: Concatenate multiple columns with delimiter
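For instance, a constructed field combines a literal template with the documented placeholders; the template text below is illustrative:
{
  "title": {
    "type": "constructed",
    "value": "Digitised scan of {filename_stem}"
  }
}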
Response:
Database Behavior:
The endpoint uses transactional operations:
- Deletes existing metadata values for the old mapping
- Deletes the old mapping configuration
- Inserts the new mapping configuration
Status Codes:
- 200 OK: Mapping saved successfully
- 404 Not Found: Project not found in database
- 500 Internal Server Error: Database error or unexpected server error
Get Project Mappings¶
Retrieves all metadata mapping configurations for the active project.
Endpoint: GET /mappings
Decorator: @project_required
Response:
[
{
"mapping_id": 1,
"project_id": 1,
"mapping_name": "mapping_for_metadata.xlsx",
"file_path": "/data/metadata.xlsx",
"file_format": "excel",
"column_definitions": "{...}",
"last_used_timestamp": "2025-10-21T10:00:00Z"
}
]
Empty Response:
Returns empty array [] if no mappings exist.
Status Codes:
200 OK: Mappings retrieved successfully
File & Record Preparation¶
Add Source Files¶
Adds source files to the project database with hash calculation and MIME type detection.
Endpoint: POST /project/source_files/add
Decorator: @project_required
Request Body:
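An illustrative request body (the paths are placeholders):
{
  "absolute_file_paths": [
    "/data/manuscript_001.xml",
    "/data/metadata.json"
  ]
}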
Request Parameters:
absolute_file_paths (array, required): List of absolute file paths to add
Response:
{
"message": "File addition process completed.",
"added_count": 2,
"skipped_existing_path": 0,
"errors_count": 0,
"errors": []
}
Processing Logic:
For each file path:
- Validates file exists and is a file (not directory)
- Checks for existing database entry by absolute path
- Calculates SHA256 hash using calculate_file_hash()
- Determines MIME type using get_file_mime_type()
- Inserts into the source_files table with status pending
File Metadata:
- project_id: Current project identifier
- absolute_path, relative_path, filename: Path information
- size_bytes: File size from filesystem
- sha256_hash: Computed hash
- mime_type: Detected MIME type
- file_type: Set to "source"
- status: Set to "pending"
- added_timestamp: Current UTC timestamp
Status Codes:
- 200 OK: File addition process completed (check counts for details)
- 400 Bad Request: Missing absolute_file_paths
- 500 Internal Server Error: Database error
Prepare Metadata for File¶
Prepares, validates, and stores Zenodo metadata for a source file, with support for pipeline output integration and user overrides.
Endpoint: POST /project/prepare_metadata_for_file
Decorator: @project_required
Request Body:
{
"source_file_db_id": 42,
"target_is_sandbox": true,
"overrides": {
"title": "Custom Title",
"keywords": ["heritage", "manuscript"]
},
"pipeline_id": 5
}
Request Parameters:
- source_file_db_id (integer, required): Database ID of the source file
- target_is_sandbox (boolean, optional): Whether to prepare for the Zenodo sandbox. Defaults to true
- overrides (object, optional): User-provided metadata field overrides
- pipeline_id (integer, optional): Pipeline ID for applying output mappings
Response:
{
"success": true,
"message": "Metadata prepared and validated successfully.",
"log": [
"Starting metadata preparation for File ID: 42",
"Extracted base metadata from primary mapping source (e.g., spreadsheet).",
"Auto-constructing default description...",
"Applying output mappings from pipeline ID: 5",
"Finalizing pipeline overwrites for fields: ['description', 'keywords']",
"Applying user overrides: ['title']",
"Metadata stored successfully."
]
}
Preparation Workflow:
The endpoint executes a multi-step preparation process:
Step 1 - Extract Base Metadata: Uses _extract_and_prepare_metadata() to load metadata from configured spreadsheet mapping
Step 2 - Auto-Construct Description: If description has construct_later flag, generates: "Zenodo record for the data file: {title}."
Step 3 - Apply Pipeline Mappings: If pipeline_id is provided, calls _apply_output_mappings() to read pipeline output files and apply Zenodo field mappings. Overwrites spreadsheet values with pipeline results
Step 4 - Apply User Overrides: Merges user-provided overrides from request
Step 5 - Sanitize: Removes fields with None or empty string values
Step 6 - Prepare API Payload: Converts to Zenodo API format using prepare_zenodo_metadata()
Step 7 - Validate: Validates against Zenodo schema using validate_zenodo_metadata()
Step 8 - Store: Saves metadata and creates zenodo_records entry with status prepared using store_metadata_for_file()
Error Response:
{
"success": false,
"error": "Metadata validation failed.",
"validation_errors": [
"Title is required",
"At least one creator is required"
],
"log": [...]
}
Status Codes:
- 200 OK: Metadata prepared and validated successfully
- 400 Bad Request: Validation failed
- 500 Internal Server Error: Preparation error
Preview Mapped Values¶
Performs a dry run of metadata preparation to preview the result without saving to the database.
Endpoint: POST /project/preview_mapped_values
Decorator: @project_required
Request Body:
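An illustrative request body (the ID is a placeholder):
{
  "source_file_db_id": 42
}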
Request Parameters:
source_file_db_id (integer, required): Database ID of the source file
Response:
{
"success": true,
"filename": "manuscript_001.xml",
"prepared_metadata": {
"metadata": {
"title": "Medieval Manuscript Collection",
"upload_type": "dataset",
"description": "Zenodo record for the data file: Medieval Manuscript Collection.",
"creators": [
{"name": "Smith, John", "affiliation": "University"}
],
"access_right": "open",
"keywords": ["heritage", "medieval"]
}
}
}
Processing:
Identical to prepare_metadata_for_file but without database write operations. Auto-constructs description if needed.
Status Codes:
- 200 OK: Preview generated successfully
- 500 Internal Server Error: Preview generation failed
Load Metadata File Preview¶
Loads and returns a preview of the first 5 rows from a metadata spreadsheet file.
Endpoint: POST /project/metadata/load_file_preview
Request Body:
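An illustrative request body (path and format values are placeholders):
{
  "filePath": "/data/metadata.xlsx",
  "fileFormat": "excel"
}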
Request Parameters:
- filePath (string, required): Absolute path to the spreadsheet file
- fileFormat (string, optional): File format: "csv" or "excel". Defaults to "csv"
Response:
{
"success": true,
"columns": ["Filename", "Title", "Author", "Date"],
"previewData": [
{
"Filename": "manuscript_001.xml",
"Title": "Medieval Manuscript",
"Author": "Smith, John",
"Date": "2025-10-15"
}
],
"rowCount": 150
}
Response Fields:
- success (boolean): Operation success status
- columns (array): List of column headers
- previewData (array): First 5 rows as an array of dictionaries
- rowCount (integer): Total number of rows in the file
Data Processing:
- Fills NaN values with empty strings
- Converts datetime columns to strings
- Handles infinity values in numeric columns
- Uses UTF-8 encoding with error replacement
Status Codes:
- 200 OK: Preview loaded successfully
- 400 Bad Request: Invalid file path, unsupported format, encoding error, or parsing error
- 500 Internal Server Error: Unexpected error loading file
Zenodo API Operations¶
Create API Draft for Prepared Record¶
Creates a Zenodo draft deposition via API for a record with prepared metadata.
Endpoint: POST /project/create_api_draft_for_prepared_record
Decorator: @project_required
Request Body:
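An illustrative request body (the ID is a placeholder):
{
  "local_record_db_id": 15
}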
Request Parameters:
local_record_db_id (integer, required): Database ID of the prepared local record
Response:
{
"success": true,
"message": "Zenodo draft record created successfully.",
"local_record_db_id": 15,
"zenodo_response": {
"id": 1234567,
"doi": "10.5281/zenodo.1234567",
"conceptrecid": "7891234",
"links": {
"bucket": "https://zenodo.org/api/files/...",
"publish": "https://zenodo.org/api/deposit/depositions/1234567/actions/publish",
"discard": "https://zenodo.org/api/deposit/depositions/1234567/actions/discard"
},
"metadata": {...}
}
}
Processing Logic:
The endpoint performs careful payload preparation:
- Retrieves the stored metadata JSON from the zenodo_records table
- Extracts only the metadata field, discarding any previous API response fields
- Wraps it in a clean payload: {"metadata": ...}
- Calls create_record_cli() from the legacy functions
- Updates the local record with the complete Zenodo API response
- Sets the record status to draft
- Creates an entry in record_files_map linking the source file with status pending
Payload Cleaning:
The endpoint defensively removes state-corrupting fields like id, doi, links, state that may have been stored from previous API calls.
Status Codes:
- 200 OK: Draft created successfully
- 400 Bad Request: Missing local_record_db_id
- 404 Not Found: No prepared record found with specified ID
- 500 Internal Server Error: Failed to create Zenodo record
Create API Draft for CLI¶
CLI-specific endpoint for creating Zenodo drafts with corrected logic that bypasses web UI bugs.
Endpoint: POST /project/cli/create_api_draft
Request Body:
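An illustrative request body (the ID is a placeholder):
{
  "local_record_db_id": 15
}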
Request Parameters:
local_record_db_id (integer, required): Database ID of the local record
Response:
{
"success": true,
"zenodo_response": {
"id": 1234567,
"doi": "10.5281/zenodo.1234567",
"metadata": {...}
}
}
Implementation Differences:
- Uses zenodo_api_service.create_new_deposition() directly instead of legacy CLI functions
- Cleans response-only keys before the API call: id, doi, recid, links, state, submitted, created, modified, owner, record_id, conceptrecid
- Updates the local database with the new deposition ID and full response
Status Codes:
- 200 OK: Draft created successfully
- 400 Bad Request: No project loaded or missing local_record_db_id
- 404 Not Found: Local record not found
- 500 Internal Server Error: Failed to create Zenodo record
Upload File to Deposition¶
Uploads a single file to a Zenodo draft deposition using the bucket API.
Endpoint: POST /project/upload_file_to_deposition
Decorator: @project_required
Request Body:
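An illustrative request body (the IDs are placeholders):
{
  "local_record_db_id": 15,
  "source_file_db_id": 42
}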
Request Parameters:
- local_record_db_id (integer, required): Database ID of the Zenodo record
- source_file_db_id (integer, required): Database ID of the file to upload
Response:
{
"success": true,
"message": "File uploaded successfully.",
"zenodo_response": {
"key": "manuscript_001.xml",
"size": 45120,
"checksum": "md5:...",
"links": {...}
}
}
Upload Process:
- Retrieves record metadata to get the bucket URL from links.bucket
- Opens the file in binary read mode
- Sends a PUT request to {bucket_url}/{filename} with the file data
- Updates the record_files_map table, setting upload_status to uploaded
Status Codes:
- 200 OK: File uploaded successfully
- 201 Created: File uploaded successfully (alternative success code)
- 404 Not Found: Record or file not found in database
- 500 Internal Server Error: Bucket URL missing or upload failed
Upload Files for Deposition¶
Uploads all pending files associated with a Zenodo record.
Endpoint: POST /project/upload_files_for_deposition
Decorator: @project_required
Request Body:
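An illustrative request body (the ID is a placeholder):
{
  "local_record_db_id": 15
}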
Request Parameters:
local_record_db_id (integer, required): Database ID of the Zenodo record
Response:
{
"success": true,
"message": "Upload process finished. 5 succeeded, 0 failed.",
"log": [
"Starting file uploads for record ID: 15",
"Found 5 file(s) to upload.",
"Uploading 'manuscript_001.xml'...",
"Successfully uploaded 'manuscript_001.xml'.",
"Uploading 'metadata.json'...",
"Successfully uploaded 'metadata.json'."
]
}
Processing Logic:
- Retrieves bucket URL from record metadata
- Queries for all files with upload_status = 'pending'
- For each pending file:
  - Opens the file in binary mode
  - Sends a PUT request to the bucket
  - Updates status to uploaded on success
  - Updates status to upload_error on failure
- Commits all status updates
- Returns detailed log messages
Error Tolerance:
Individual file failures do not stop the upload process. All files are attempted and results are reported.
Status Codes:
- 200 OK: Upload process completed (check the success flag for overall status)
- 400 Bad Request: Missing local_record_db_id
- 404 Not Found: Record not found
- 500 Internal Server Error: Bucket URL missing or unexpected error
Publish Record¶
Publishes a Zenodo draft record with automatic HTTP 500 error recovery logic.
Endpoint: POST /project/publish_record
Decorator: @project_required
Request Body:
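An illustrative request body (the ID is a placeholder):
{
  "local_record_db_id": 15
}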
Request Parameters:
local_record_db_id (integer, required): Database ID of the draft record to publish
Response:
{
"success": true,
"message": "Record published successfully.",
"zenodo_response": {
"id": 1234567,
"doi": "10.5281/zenodo.1234567",
"conceptdoi": "10.5281/zenodo.7891234",
"conceptrecid": "7891234",
"metadata": {...},
"state": "done"
},
"log": [
"Attempting to publish record with DB ID: 15",
"Publishing to Production environment.",
"API response: Record published successfully."
]
}
Error Recovery Logic:
The endpoint includes sophisticated HTTP 500 error handling:
Standard Success (202): Updates database with published status and metadata
HTTP 500 Error: Attempts re-verification via /api/records/{id} endpoint:
- If GET returns 200, record is actually published → Updates database
- If GET returns 404, record is not published → Reports failure
- Logs all re-verification attempts with detailed messages
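A minimal sketch of this re-verification check, using the Python requests library; the URL construction and token handling shown here are assumptions, not the module's actual signature:
import requests

def confirm_publication_after_500(api_base: str, record_id: int, token: str) -> bool:
    # After the publish action returns HTTP 500, ask the records API
    # whether the record actually went live.
    resp = requests.get(f"{api_base}/api/records/{record_id}",
                        params={"access_token": token})
    if resp.status_code == 200:
        return True   # record is retrievable: treat as published
    if resp.status_code == 404:
        return False  # record does not exist: publication failed
    resp.raise_for_status()  # any other status is an unexpected error
    return False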
Publication Process:
- Retrieves draft metadata and publish link
- Sends POST request to publish link with rate limiting
- Handles 202 (success), 500 (retry), or other errors
- Updates the zenodo_records table with status published
- Sets zenodo_doi, concept_doi, concept_rec_id
- Stores the full API response in record_metadata_json
- Logs the API call to the api_log table
Status Codes:
- 200 OK: Record published successfully (or re-verification confirmed publication)
- 404 Not Found: Record not found in local database
- 500 Internal Server Error: Publication failed (after re-verification attempt)
Discard Zenodo Draft¶
Discards a draft on Zenodo and restores local metadata from backup.
Endpoint: POST /project/discard_zenodo_draft
Decorator: @project_required
Request Body:
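An illustrative request body (the ID is a placeholder):
{
  "local_record_db_id": 15
}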
Request Parameters:
local_record_db_id (integer, required): Database ID of the draft record
Response:
Processing Steps:
Step 1 - Discard on Zenodo: Retrieves discard link from stored metadata and calls Zenodo API. Continues with local restoration even if API call fails.
Step 2 - Restore Metadata: Calls _restore_record_metadata() to restore from backup:
- Retrieves most recent backup from metadata_backups table
- Parses backed-up JSON to extract denormalized fields (title, version)
- Updates zenodo_records table with backed-up metadata
- Resets status to prepared
- Clears zenodo_record_id, zenodo_doi, last_api_error
- Deletes backup entry after successful restore
Fallback Behavior:
If no backup exists, resets status to prepared without changing metadata.
Status Codes:
- 200 OK: Draft discarded and metadata restored
- 404 Not Found: Record not found
Create Quick Zenodo Record¶
Creates a Zenodo draft record with minimal auto-generated metadata for a source file.
Endpoint: POST /project/create_zenodo_record_for_file
Decorator: @project_required
Request Body:
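An illustrative request body (values are placeholders):
{
  "source_file_db_id": 42,
  "is_sandbox": true
}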
Request Parameters:
- source_file_db_id (integer, required): Database ID of the source file
- is_sandbox (boolean, optional): Whether to use the Zenodo sandbox. Defaults to true
Response:
{
"success": true,
"message": "Zenodo draft record created successfully.",
"local_record_db_id": 25,
"zenodo_response": {...}
}
Auto-Generated Metadata:
{
"title": "Record for manuscript_001.xml",
"upload_type": "other",
"description": "Zenodo record automatically created for source file: manuscript_001.xml",
"creators": [{"name": "Heritage Data Processor User"}]
}
Use Case:
Quick record creation without going through full metadata preparation workflow.
Status Codes:
- 200 OK: Record created successfully
- 400 Bad Request: Missing source_file_db_id
- 404 Not Found: Source file not found
- 500 Internal Server Error: Failed to create Zenodo record
Get Latest Version Files¶
Retrieves the file list from the latest published version of a record by concept ID.
Endpoint: GET /zenodo/records/latest_files/<concept_rec_id>
Decorator: @project_required
URL Parameters:
concept_rec_id (string, required): Zenodo concept record ID
Query Parameters:
is_sandbox (string, optional): Whether to check the sandbox environment. Defaults to "true"
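For example, to list the files of the latest production (non-sandbox) version of concept record 7891234:
GET /zenodo/records/latest_files/7891234?is_sandbox=false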
Response:
[
{
"id": "file-uuid-1",
"key": "manuscript_001.xml",
"size": 45120,
"checksum": "md5:abc123...",
"links": {
"self": "https://zenodo.org/api/files/...",
"download": "https://zenodo.org/api/files/.../manuscript_001.xml"
}
}
]
Query Logic:
Finds the latest published record by ordering by version DESC and selecting the first result.
Status Codes:
- 200 OK: Files retrieved successfully
- 404 Not Found: No published record found for concept ID
- 500 Internal Server Error: Failed to retrieve files
UI Data Retrieval¶
Get Uploadable Files¶
Retrieves all files requiring user actions, categorized by workflow stage.
Endpoint: GET /project/uploadable_files
Decorator: @project_required
Query Parameters:
is_sandbox_for_drafts (string, optional): Whether to include sandbox drafts. Defaults to "true"
Response:
[
{
"source_file_db_id": 42,
"filename": "manuscript_001.xml",
"absolute_path": "/data/manuscript_001.xml",
"file_db_status": "pending",
"required_action": "action_prepare_metadata"
},
{
"source_file_db_id": 43,
"filename": "manuscript_002.xml",
"local_record_db_id": 15,
"record_title": "Medieval Manuscript 002",
"is_sandbox": 1,
"zenodo_record_db_status": "prepared",
"required_action": "action_create_api_draft"
},
{
"source_file_db_id": 44,
"filename": "manuscript_003.xml",
"local_record_db_id": 16,
"zenodo_api_deposition_id": "1234567",
"record_title": "Medieval Manuscript 003",
"zenodo_record_db_status": "draft",
"file_upload_on_zenodo_status": "pending",
"required_action": "action_upload_file"
}
]
Action Categories:
action_prepare_metadata: Files needing metadata preparation
- Source files with status pending, source_added, or metadata_error
- No associated zenodo_records or failed/discarded records only
action_create_api_draft: Records needing API draft creation
- Records with status prepared
- No zenodo_record_id yet
action_upload_file: Files needing upload to existing drafts
- Records with status draft and valid zenodo_record_id
- Files with upload status pending, pending_pipeline_upload, or containing error
Status Codes:
200 OK: Files retrieved successfully500 Internal Server Error: Database query failed
Get Uploads by Tab¶
Retrieves filtered and paginated files for specific workflow tabs with advanced filtering.
Endpoint: GET /project/uploads_by_tab
Decorator: @project_required
Query Parameters:
- tab_id (string, optional): Tab identifier. Options: pending_preparation, pending_operations, drafts, published, versioning. Defaults to "pending_preparation"
- is_sandbox (string, optional): Environment filter. Defaults to "true"
- search (string, optional): Search term for filename or title
- title_pattern (string, optional): Wildcard pattern for title matching (use * as the wildcard)
- date_since (string, optional): Filter records created on or after this date (ISO 8601)
- date_until (string, optional): Filter records created on or before this date (ISO 8601)
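For example, to list sandbox drafts whose titles start with "Medieval" and that were created on or after 1 October 2025 (illustrative values):
GET /project/uploads_by_tab?tab_id=drafts&is_sandbox=true&title_pattern=Medieval*&date_since=2025-10-01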
Response (pending_preparation):
[
{
"source_file_db_id": 42,
"filename": "manuscript_001.xml",
"absolute_path": "/data/manuscript_001.xml",
"file_db_status": "Valid",
"required_action": "action_prepare_metadata",
"total_bundle_files": 3
}
]
Response (drafts):
[
{
"local_record_db_id": 15,
"record_title": "Medieval Manuscript Collection",
"zenodo_record_db_status": "draft",
"is_sandbox": 1,
"zenodo_api_deposition_id": "1234567",
"record_metadata_json": "{...}",
"filename": "manuscript_001.xml",
"total_files_in_record": 5,
"uploaded_files_in_record": 3,
"discard_link": "https://zenodo.org/api/deposit/depositions/1234567/actions/discard"
}
]
Tab Queries:
pending_preparation: Root-level source files without prepared records
pending_operations: Records with status prepared needing draft creation
drafts: Records with status draft and valid Zenodo IDs
published: Records with status published
versioning: Empty (reserved for future feature)
Advanced Filtering:
All filters are applied with AND logic:
- search: Matches record_title OR filename using LIKE
- title_pattern: Wildcard matching (converts * to SQL %)
- date_since, date_until: Date range filtering on created_timestamp
Post-Processing:
For drafts tab, the response includes discard_link extracted from record_metadata_json.
Status Codes:
- 200 OK: Results retrieved successfully
- 400 Bad Request: Invalid tab_id (except versioning, which returns an empty array)
- 500 Internal Server Error: Database query failed
Helper Functions¶
Restore Record Metadata¶
Function: _restore_record_metadata(record_id: int) -> bool
Restores metadata from the most recent backup, including denormalized fields.
Process:
- Queries the metadata_backups table for the most recent backup
- Parses the JSON to extract title and version
- Updates zenodo_records with the backed-up metadata JSON
- Restores the denormalized record_title and version fields
- Resets status to prepared
- Clears API-related fields: zenodo_record_id, zenodo_doi, last_api_error
- Deletes the backup entry
Returns: True on successful restore, False if no backup found or error occurred
Apply Output Mappings¶
Function: _apply_output_mappings(base_metadata: Dict, source_file_id: int, pipeline_id: int) -> Tuple[Dict, set]
Applies Zenodo metadata mappings from pipeline output files.
Process:
- Finds completed pipeline execution for given pipeline and project
- Retrieves output directory path
- Queries pipeline steps with output_mapping configurations
- For each step with a mapping:
  - Resolves the output filename using the {original_stem} placeholder
  - Reads the JSON output file
  - Extracts values using jsonKey paths
  - Maps them to zenodoField names
  - Overwrites values in base_metadata
Returns: Tuple of (updated_metadata, set_of_overwritten_keys)
Extract and Prepare Metadata¶
Function: _extract_and_prepare_metadata(conn: sqlite3.Connection, source_file_db_id: int) -> Tuple[dict, dict, dict]
Extracts metadata from spreadsheet based on mapping configuration.
Process:
- Loads active metadata mapping configuration
- Loads source file information
- Reads spreadsheet (CSV or Excel) based on file format
- Finds matching row using filename column
- For each field in the mapping:
  - literal: Uses the static value
  - column: Extracts from the spreadsheet column
  - filename / filename_stem: Uses the file name
  - constructed: Replaces placeholders
  - complex: Processes an array of objects
  - ordered_combined_columns: Concatenates columns
- Handles the special delimiter for keywords
Returns: Tuple of (extracted_metadata, file_info, mapping_config)
Publish Record¶
Function: publish_record(record_data_from_db: sqlite3.Row, conn: sqlite3.Connection, conn_params: Dict, base_url: str) -> Tuple[Dict, Dict]
Core publishing function with HTTP 500 error recovery.
Enhanced Error Handling:
- Standard 202: Success
- HTTP 500: Re-verifies via GET /api/records/{id}
  - 200 response → Published
  - 404 response → Not published
- Logs all attempts with detailed messages
- Updates database with final determined status
Returns: Tuple of (return_msg, api_response_data)
Create New Version Draft¶
Function: create_new_version_draft(concept_rec_id: str, is_sandbox: bool) -> dict
Creates a new version draft with automatic version incrementing.
Version Increment Logic:
- Semantic versions (e.g., 1.2.3) → Increments the patch: 1.2.4
- Simple versions (e.g., v2) → Increments the number: v3
- Other formats → Appends -new
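A minimal sketch of these increment rules (not the project's actual implementation):
import re

def increment_version(version: str) -> str:
    # Semantic version, e.g. "1.2.3" -> "1.2.4"
    m = re.fullmatch(r"(\d+)\.(\d+)\.(\d+)", version)
    if m:
        major, minor, patch = m.groups()
        return f"{major}.{minor}.{int(patch) + 1}"
    # Simple version, e.g. "v2" -> "v3"
    m = re.fullmatch(r"v(\d+)", version)
    if m:
        return f"v{int(m.group(1)) + 1}"
    # Any other format -> append "-new"
    return f"{version}-new"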
Process:
- Checks for existing draft → Returns it if found
- Finds latest published version by concept ID
- Retrieves the newversion link from metadata
- Calculates the new version number
- POSTs to newversion link to create draft
- PUTs updated metadata with correct version
- Creates new local database record
Returns: Dict with success, new_local_record_id, zenodo_response, and optional message
Database Integration¶
Tables Used¶
- project_info: Project metadata
- source_files: File information and status
- zenodo_records: Zenodo record metadata and state
- record_files_map: File-to-record associations with upload status
- metadata_mapping_files: Mapping configurations
- metadata_values: Extracted metadata values
- metadata_backups: Metadata backups for rollback
- api_log: API interaction logging
Transaction Management¶
Critical operations use explicit transaction control with BEGIN, COMMIT, and ROLLBACK. The execute_db_transaction() helper ensures atomicity.
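A minimal sketch of the explicit BEGIN/COMMIT/ROLLBACK pattern described above; the actual execute_db_transaction() helper may differ in signature and behavior:
import sqlite3

def run_in_transaction(db_path: str, statements: list[tuple[str, tuple]]) -> None:
    # Execute a batch of parameterized statements atomically.
    conn = sqlite3.connect(db_path)
    conn.isolation_level = None  # manage transactions manually
    try:
        conn.execute("BEGIN")
        for sql, params in statements:
            conn.execute(sql, params)
        conn.execute("COMMIT")
    except Exception:
        conn.execute("ROLLBACK")
        raise
    finally:
        conn.close()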
Rate Limiting¶
All Zenodo API calls use the rate_limiter_zenodo service:
rate_limiter_zenodo.wait_for_rate_limit() # Before request
rate_limiter_zenodo.record_request() # After request
Usage Examples¶
Complete Workflow Example¶
Step 1: Add Files
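A minimal sketch using the Python requests library; the base URL http://localhost:5000 and all paths and IDs in these steps are assumptions, not values guaranteed by the server.
import requests

BASE = "http://localhost:5000"  # assumed address of the local Heritage Data Processor server

resp = requests.post(f"{BASE}/project/source_files/add", json={
    "absolute_file_paths": ["/data/manuscript_001.xml", "/data/metadata.json"],
})
print(resp.json()["added_count"])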
Step 2: Prepare Metadata
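Continuing the sketch (the source file ID is illustrative):
import requests

BASE = "http://localhost:5000"  # assumed server address

resp = requests.post(f"{BASE}/project/prepare_metadata_for_file", json={
    "source_file_db_id": 42,      # illustrative database ID of the added file
    "target_is_sandbox": True,
    "overrides": {"title": "Custom Title"},
})
print(resp.json()["message"])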
Step 3: Create Draft
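Continuing the sketch (the local record ID is illustrative):
import requests

BASE = "http://localhost:5000"  # assumed server address

resp = requests.post(f"{BASE}/project/create_api_draft_for_prepared_record", json={
    "local_record_db_id": 15,     # illustrative ID of the prepared record
})
print(resp.json()["zenodo_response"]["id"])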
Step 4: Upload Files
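Continuing the sketch, uploading all pending files for the draft:
import requests

BASE = "http://localhost:5000"  # assumed server address

resp = requests.post(f"{BASE}/project/upload_files_for_deposition", json={
    "local_record_db_id": 15,     # illustrative ID of the draft record
})
print(resp.json()["message"])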
Step 5: Publish
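Finally, publishing the draft (same assumed base URL and illustrative ID):
import requests

BASE = "http://localhost:5000"  # assumed server address

resp = requests.post(f"{BASE}/project/publish_record", json={
    "local_record_db_id": 15,     # illustrative ID of the draft record
})
print(resp.json()["zenodo_response"]["doi"])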