Modern data pipelines often rely on frequent updates from REDCap projects, especially in large-scale surveillance systems where data changes continually across sites. However, repeatedly downloading the entire dataset can be slow and redundant. A better approach is to track only new or modified records using the REDCap audit trail (logging API).
The function below shows a practical way to do this in R by retrieving the timestamp of the most recent modification from the REDCap log.
The function get_last_log_timestamp() provides a compact way to query REDCap’s event log and determine when the last data update occurred.
By recording this timestamp locally, you can design a dashboard or data-processing script that fetches only new data since the last run—saving time and bandwidth while maintaining data integrity.
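One simple way to persist that timestamp between runs is to serialize it to disk with saveRDS(). This is a minimal sketch: the file path and the function names save_last_sync() and load_last_sync() are illustrative, not part of any REDCap package.

```r
# Persist the last-seen REDCap timestamp between runs.
# Path and function names are illustrative choices, not a package API.
save_last_sync <- function(ts, path = "last_sync.rds") {
  saveRDS(ts, path)
}

load_last_sync <- function(path = "last_sync.rds") {
  if (!file.exists(path)) return(NULL)  # first run: no cached timestamp yet
  readRDS(path)
}
```

On the first run load_last_sync() returns NULL, which can signal a full download; afterwards the stored value can serve as the beginTime for the next log query.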
get_last_log_timestamp <- function(url, token, begin_time) {
  ### Validate inputs
  if (missing(url) || missing(token) || missing(begin_time)) {
    stop("url, token, and begin_time are required parameters")
  }
  ### Make API request
  response <- httr::POST(
    url = url,
    body = list(
      token = token,
      content = "log",
      format = "json",
      logtype = "record",
      beginTime = begin_time
    )
  )
  ### Check for API errors
  httr::stop_for_status(response)
  ### Parse response
  log_list <- httr::content(response, as = "parsed")
  ### Handle empty logs
  if (length(log_list) == 0) {
    warning("No log entries found for the specified time period")
    return(NULL)
  }
  ### Find the most recent entry (ISO-style timestamp strings sort chronologically)
  lasttime_update <- max(sapply(log_list, `[[`, "timestamp"), na.rm = TRUE)
  ### Convert to POSIXct for easier handling (REDCap logs use "YYYY-MM-DD HH:MM")
  last_timestamp <- as.POSIXct(lasttime_update, format = "%Y-%m-%d %H:%M", tz = "UTC")
  return(last_timestamp)
}
In large collaborative REDCap projects, such as multi-site enrolment studies, new data can be entered at any moment. Continuously pulling the full dataset after each change is inefficient and increases load on both REDCap and your Shiny app.
By capturing only the timestamp of the latest update, you can:
Cache previously downloaded data locally.
Request only new or modified records since the last update.
Maintain real-time dashboards without unnecessary API calls.
This approach forms the foundation of incremental synchronization pipelines, where each data refresh starts from the last known modification time.
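A minimal sketch of such a pipeline is shown below. needs_refresh() decides whether the cached timestamp is stale, and fetch_records_since() uses the dateRangeBegin parameter of REDCap's record-export API to pull only records created or modified after that point. The function names are assumptions for illustration; only the API parameters themselves come from REDCap.

```r
# Decide whether a new pull is needed, given the cached and latest timestamps.
needs_refresh <- function(cached_ts, latest_ts) {
  if (is.null(latest_ts)) return(FALSE)        # no log entries: nothing new
  is.null(cached_ts) || latest_ts > cached_ts  # first run, or newer data exists
}

# Export only records modified since `since` (sketch; relies on REDCap's
# dateRangeBegin filter for the record-export API).
fetch_records_since <- function(url, token, since) {
  response <- httr::POST(
    url = url,
    body = list(
      token = token,
      content = "record",
      format = "json",
      type = "flat",
      dateRangeBegin = format(since, "%Y-%m-%d %H:%M:%S")
    )
  )
  httr::stop_for_status(response)
  httr::content(response, as = "parsed")
}
```

A refresh cycle then becomes: read the cached timestamp, call get_last_log_timestamp(), and if needs_refresh() returns TRUE, merge the output of fetch_records_since() into the local cache and store the new timestamp.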
In the example below, the tictoc package is used to measure how long the API call takes:
tic()
time <- get_last_log_timestamp(url = url, token = token, begin_time = date_time)
toc()
This helps evaluate how efficiently REDCap responds to the query and confirms the function’s suitability for scheduled or reactive Shiny updates.
When embedded in a Shiny dashboard, the timestamp returned by this function can act as a global reference for all users.
The app can compare the stored timestamp to the current REDCap log and decide whether a new data pull is necessary. This mechanism allows many users to share the same cached data snapshot while ensuring updates are fetched only when something actually changes in REDCap.
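In a Shiny app, this check fits naturally into shiny::reactivePoll(), which compares successive results of a cheap checkFunc and re-runs the expensive valueFunc only when they differ. The sketch below is illustrative: download_full_data() is a hypothetical stand-in for your existing pull function, and timestamp_changed() is a helper for cases where you manage the cached timestamp yourself.

```r
# Has REDCap's latest log timestamp moved since we last looked?
timestamp_changed <- function(old_ts, new_ts) {
  !is.null(new_ts) && (is.null(old_ts) || new_ts > old_ts)
}

# Inside server(): poll the log every 60 s, re-downloading only on change.
# reactivePoll() itself compares successive checkFunc results with identical(),
# so returning the timestamp is enough to trigger valueFunc when it moves.
# `download_full_data` is a hypothetical stand-in for your data pull.
make_redcap_data <- function(session, url, token, begin_time) {
  shiny::reactivePoll(
    intervalMillis = 60000,
    session = session,
    checkFunc = function() get_last_log_timestamp(url, token, begin_time),
    valueFunc = function() download_full_data(url, token)
  )
}
```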
The get_last_log_timestamp() function offers a lightweight and reliable way to monitor REDCap data changes.
By leveraging REDCap’s built-in logging system, it provides an elegant solution for incremental data updates—critical for scaling dashboards and analytics in high-volume projects.
When combined with local caching, it forms a robust foundation for efficient, multi-user Shiny dashboards powered by REDCap.