https://github.com/lsvekis/Chunk-your-data-Google-Apps-Script
How to Chunk Data in Google Apps Script (So You Don’t Hit “Exceeded Maximum Execution Time”)

If you work with large spreadsheets or big Drive folders in Google Apps Script, you’ve probably seen this error:
Exception: Exceeded maximum execution time
Most Apps Script executions have a hard time limit (often around 6 minutes per run for standard accounts). Once you hit that limit, the script stops—no matter how close you were to finishing your task.
The solution is chunking: breaking your work into smaller pieces so each execution stays comfortably under the time limit.
In this post you’ll learn:
- Why Apps Script times out
- Two main chunking strategies
- A re-usable chunking pattern for Sheets
- Full sample code with step-by-step explanations
Why Apps Script Times Out
Common reasons:
- You loop over thousands of rows one by one
- You call methods like `getRange()`, `setValue()`, or `UrlFetchApp.fetch()` inside tight loops
- You try to process an entire file or folder in one execution
Because Apps Script has a maximum runtime, “brute force” approaches will eventually hit that limit.
The fix is to:
- Do work in batches (chunks)
- Minimize calls to external services (`SpreadsheetApp`, `DriveApp`, `UrlFetchApp`, etc.)
- If necessary, spread the work across multiple executions using Properties + Triggers
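As a mental model, chunking is just slicing a big array of work items into fixed-size batches. A minimal, framework-free sketch (the helper name `chunkArray` is mine, not part of Apps Script):

```javascript
/**
 * Split an array into consecutive batches of at most `size` items.
 * chunkArray([1, 2, 3, 4, 5], 2) -> [[1, 2], [3, 4], [5]]
 */
function chunkArray(items, size) {
  const chunks = [];
  for (let i = 0; i < items.length; i += size) {
    chunks.push(items.slice(i, i + size));
  }
  return chunks;
}

// Each batch can then be read, processed, and written back in one call:
// chunkArray(rows, 500).forEach(batch => processBatch(batch));
```

Everything in this post is a variation on that loop, plus bookkeeping so it can survive being split across executions.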
Let’s walk through a practical example.
Example Scenario
You have a sheet with tens of thousands of rows.
You want to:
- Read existing data
- Process each row (e.g., build a status message)
- Write results back into a column
Instead of processing everything in one go, we’ll:
- Process N rows per execution (e.g., 500 rows)
- Store the last processed row in `PropertiesService`
- If we’re close to our own time limit, we stop and schedule a time-based trigger to continue later
High-Level Flow
- The user runs `startProcessing()` (from the menu or editor).
- The script calls `processChunk_()`.
- `processChunk_()`:
  - Reads the last processed row from script properties.
  - Processes the next chunk (e.g., 500 rows).
  - Checks how long it has been running.
  - If done with all rows → cleanup and stop.
  - If time is nearly up → save progress and create a new time-based trigger to continue.
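The resume logic in that flow can be simulated without any Google services; here a plain object stands in for script properties (the names `fakeProps` and `runOnce` are illustrative only):

```javascript
// A plain object stands in for PropertiesService state.
const fakeProps = {};

// Process up to `chunkSize` rows per "execution", remembering where we stopped.
function runOnce(totalRows, chunkSize) {
  const last = Number(fakeProps.LAST_PROCESSED_ROW) || 0;
  const end = Math.min(last + chunkSize, totalRows);
  // ...process rows last+1 .. end here...
  if (end >= totalRows) {
    delete fakeProps.LAST_PROCESSED_ROW; // done: clear state
    return 'done';
  }
  fakeProps.LAST_PROCESSED_ROW = String(end); // save progress for next run
  return 'continue';
}
```

Calling `runOnce(1200, 500)` three times yields `'continue'`, `'continue'`, `'done'` — exactly the chain of executions the trigger pattern builds.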
Sample Code: Chunking Sheet Rows with Resume Support
1. Main Entry Point
function startProcessing() {
// Reset state and kick off the first chunk
const scriptProps = PropertiesService.getScriptProperties();
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
processChunk_();
}
What this does:
- Clears any previous progress (`LAST_PROCESSED_ROW`).
- Starts the first chunk by calling `processChunk_()`.
2. The Chunk Processor
function processChunk_() {
const startTime = Date.now();
const MAX_RUNTIME_MS = 5 * 60 * 1000; // Aim for ~5 mins (leave buffer under hard limit)
const CHUNK_SIZE = 500; // Number of rows to process per run
const ss = SpreadsheetApp.getActiveSpreadsheet();
const sheet = ss.getSheetByName('Data'); // Change to your sheet name
if (!sheet) {
throw new Error('Sheet "Data" not found.');
}
const scriptProps = PropertiesService.getScriptProperties();
let lastProcessedRow = Number(scriptProps.getProperty('LAST_PROCESSED_ROW')) || 1;
// Assume row 1 is headers, start from row 2
const headerRow = 1;
let startRow = Math.max(lastProcessedRow + 1, headerRow + 1);
const lastRow = sheet.getLastRow();
if (startRow > lastRow) {
Logger.log('All rows already processed. Nothing to do.');
cleanupTriggers_();
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
return;
}
// Calculate how many rows to process in this chunk
const endRow = Math.min(startRow + CHUNK_SIZE - 1, lastRow);
const numRows = endRow - startRow + 1;
// Read the data range for this chunk in one go
const dataRange = sheet.getRange(startRow, 1, numRows, sheet.getLastColumn());
const data = dataRange.getValues();
// We'll write results into a specific column (e.g., column 5 = "Status")
const STATUS_COL = 5;
const output = [];
for (let i = 0; i < data.length; i++) {
const row = data[i];
// Example: build a simple status text from first 2 columns
const name = row[0]; // Column A
const value = row[1]; // Column B
const status = buildStatus_(name, value); // Your custom logic
// Prepare row output: keep existing values but ensure STATUS_COL is updated
const rowOutput = row.slice(); // shallow copy
rowOutput[STATUS_COL - 1] = status; // 0-based index
output.push(rowOutput);
// Periodically check execution time
if (i % 50 === 0) {
const elapsed = Date.now() - startTime;
if (elapsed > MAX_RUNTIME_MS) {
// Save progress and schedule continuation
const processedRowNumber = startRow + i;
scriptProps.setProperty('LAST_PROCESSED_ROW', String(processedRowNumber));
Logger.log('Time limit approaching. Stopped at row ' + processedRowNumber);
createContinuationTrigger_();
// Write back what we processed so far
const partialRange = sheet.getRange(startRow, 1, i + 1, sheet.getLastColumn());
partialRange.setValues(output.slice(0, i + 1));
return;
}
}
}
// If we get here, we finished our chunk within time
dataRange.setValues(output);
// Update last processed row
scriptProps.setProperty('LAST_PROCESSED_ROW', String(endRow));
Logger.log('Finished rows ' + startRow + ' to ' + endRow);
// If there are more rows, schedule the next chunk
if (endRow < lastRow) {
createContinuationTrigger_();
} else {
// Done: clean up
Logger.log('All rows processed!');
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
cleanupTriggers_();
}
}
What’s happening here?
- `MAX_RUNTIME_MS` – you choose a safe internal limit (e.g., 5 minutes) so you never bump into the hard Apps Script time limit.
- `CHUNK_SIZE` – how many rows per run. Adjust based on your workload (try 200–1000).
- `LAST_PROCESSED_ROW` – stored in script properties so we can resume later.
- Every 50 rows, we check how long we’ve been running:
  - If too long → save progress, write partial results, create a new trigger, and return.
- If we finish the chunk:
  - We write all processed values back at once (`setValues(output)` – one call).
  - Update `LAST_PROCESSED_ROW`.
  - Decide whether we’re done or need another run.
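The row arithmetic above is the same in every chunked script and is easy to unit-test outside Apps Script. A sketch (the helper name `nextChunk` is illustrative):

```javascript
// Given the last processed row, compute the next chunk's boundaries.
// Rows are 1-based; the header row is skipped.
function nextChunk(lastProcessedRow, chunkSize, headerRow, lastRow) {
  const startRow = Math.max(lastProcessedRow + 1, headerRow + 1);
  if (startRow > lastRow) return null; // nothing left to do
  const endRow = Math.min(startRow + chunkSize - 1, lastRow);
  return { startRow, endRow, numRows: endRow - startRow + 1 };
}
```

With 5,000 data rows and a chunk size of 500, the first call returns rows 2–501 and the tenth returns 4502–5001; an eleventh call returns `null`.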
3. The Business Logic (What Happens Per Row)
function buildStatus_(name, value) {
if (!name && !value) {
return 'Empty row';
}
if (value > 1000) {
return 'High value for ' + name;
}
if (value > 0) {
return 'OK for ' + name;
}
return 'No value for ' + name;
}
This is just a simple example.
Replace this with whatever real processing your script needs to do.
4. Creating a Time-Based Trigger to Continue
function createContinuationTrigger_() {
  // One-shot clock triggers stay listed in the project even after they fire,
  // so delete any leftover processChunk_ triggers before scheduling the next
  // run; otherwise a stale trigger would block new ones and stall the chain.
  cleanupTriggers_();
  ScriptApp.newTrigger('processChunk_')
    .timeBased()
    .after(1 * 60 * 1000) // run again in 1 minute
    .create();
  Logger.log('Continuation trigger created.');
}
Why triggers?
- Each trigger execution is a fresh 6 minutes (approx.).
- You can process massive datasets by chaining multiple runs together.
- The user doesn’t have to manually restart anything.
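Back-of-the-envelope: if one run safely handles one chunk within its budget, the number of chained runs is just a ceiling division (numbers below are illustrative, and `runsNeeded` is my name, not an API):

```javascript
// How many chained executions a dataset needs at a given chunk size.
function runsNeeded(totalRows, rowsPerRun) {
  return Math.ceil(totalRows / rowsPerRun);
}

// 5,000 rows at 500 rows/run -> 10 runs; one extra row adds an 11th run.
```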
5. Cleaning Up Triggers
function cleanupTriggers_() {
const triggers = ScriptApp.getProjectTriggers();
triggers.forEach(trigger => {
if (trigger.getHandlerFunction() === 'processChunk_') {
ScriptApp.deleteTrigger(trigger);
}
});
}
Once all rows are processed, we delete any leftover “continue” triggers so the project stays tidy.
Alternative: Simple Chunking Without Triggers (For Smaller Jobs)
If your data set is big but not huge, you might just need to chunk your data access, not your runtime.
Example: process 5,000 rows in chunks of 500 inside a single execution, while staying under the 6-minute limit.
function processInMemoryChunks() {
const ss = SpreadsheetApp.getActiveSpreadsheet();
const sheet = ss.getSheetByName('Data');
const lastRow = sheet.getLastRow();
const lastCol = sheet.getLastColumn();
const headerRow = 1;
const startRow = headerRow + 1;
const totalRows = lastRow - headerRow;
const CHUNK_SIZE = 500;
for (let offset = 0; offset < totalRows; offset += CHUNK_SIZE) {
const numRows = Math.min(CHUNK_SIZE, totalRows - offset);
const range = sheet.getRange(startRow + offset, 1, numRows, lastCol);
const values = range.getValues();
// Process each row in this chunk
for (let i = 0; i < values.length; i++) {
const row = values[i];
// ... your logic here ...
}
// Write data back in one go
range.setValues(values);
}
}
This approach:
- Uses `getRange().getValues()` and `setValues()` per chunk, not per row.
- Reduces the overhead of handling very large ranges at once.
- Is simpler than triggers, but still helps performance.
However, if this still hits the time limit, switch to the trigger-based approach shown earlier.
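The same loop works on a plain array, which makes the pattern easy to test outside Sheets. A sketch (the name `fillStatuses` is illustrative; the status rule loosely mirrors `buildStatus_`):

```javascript
// Process rows in chunks of `chunkSize`, writing a status into column 5
// (index 4) of each row. Mirrors the Sheets loop above, minus the API calls.
function fillStatuses(rows, chunkSize) {
  for (let offset = 0; offset < rows.length; offset += chunkSize) {
    const chunk = rows.slice(offset, offset + chunkSize);
    chunk.forEach((row, i) => {
      row[4] = row[1] > 1000 ? 'High value for ' + row[0] : 'OK for ' + row[0];
      rows[offset + i] = row; // "write back" the processed chunk
    });
  }
  return rows;
}
```

In the real script, each `chunk` would come from one `getValues()` call and go back with one `setValues()` call.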
Key Takeaways
- Apps Script has a hard execution time limit → you can’t turn it off.
- For large workloads:
- Chunk your data (rows, files, IDs).
- Minimize calls to `SpreadsheetApp`, `DriveApp`, `UrlFetchApp`, etc.
- Use `PropertiesService` to store progress (`LAST_PROCESSED_ROW`).
- Use time-based triggers to continue work in multiple runs.
- The pattern in this post is reusable:
- Change the sheet name
- Adjust `CHUNK_SIZE`
- Swap in your own per-row logic
Below is a full script that includes:
- `setupTestData()` – creates a Data sheet with ~5,000 rows of fake data
- `startProcessing()` + helpers – from the previous example (so it’s self-contained)
- `runFullTest()` – clears state, generates data, and kicks off processing
You can paste this whole thing into a new Apps Script project bound to a Sheet and run `runFullTest()`, or run the functions step by step.
Full Script with Setup + Chunk Processing + Test
/***************************************
* 1. Test Data Setup
***************************************/
function setupTestData() {
const ss = SpreadsheetApp.getActiveSpreadsheet();
let sheet = ss.getSheetByName('Data');
// Create or reset the "Data" sheet
if (sheet) {
sheet.clear();
} else {
sheet = ss.insertSheet('Data');
}
// Header row
const headers = ['Name', 'Value', 'Category', 'Notes', 'Status'];
sheet.getRange(1, 1, 1, headers.length).setValues([headers]);
// How many test rows?
const NUM_ROWS = 5000;
const names = ['Alice', 'Bob', 'Charlie', 'Dana', 'Eve', 'Frank', 'Grace', 'Heidi'];
const categories = ['A', 'B', 'C', 'D'];
const data = [];
for (let i = 0; i < NUM_ROWS; i++) {
const name = names[Math.floor(Math.random() * names.length)];
const value = Math.floor(Math.random() * 5000); // 0–4999
const category = categories[Math.floor(Math.random() * categories.length)];
const notes = 'Row ' + (i + 2) + ' sample data';
const status = ''; // will be filled by our processing
data.push([name, value, category, notes, status]);
}
// Write all rows in one batch
sheet.getRange(2, 1, data.length, data[0].length).setValues(data);
Logger.log('Test data created: ' + NUM_ROWS + ' rows in sheet "Data".');
}
/***************************************
* 2. Main Entry Point (from blog)
***************************************/
function startProcessing() {
const scriptProps = PropertiesService.getScriptProperties();
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
processChunk_();
}
/***************************************
* 3. Chunk Processor
***************************************/
function processChunk_() {
const startTime = Date.now();
const MAX_RUNTIME_MS = 5 * 60 * 1000; // 5 minutes internal limit
const CHUNK_SIZE = 500; // rows per execution
const ss = SpreadsheetApp.getActiveSpreadsheet();
const sheet = ss.getSheetByName('Data');
if (!sheet) {
throw new Error('Sheet "Data" not found.');
}
const scriptProps = PropertiesService.getScriptProperties();
let lastProcessedRow = Number(scriptProps.getProperty('LAST_PROCESSED_ROW')) || 1;
const headerRow = 1;
let startRow = Math.max(lastProcessedRow + 1, headerRow + 1);
const lastRow = sheet.getLastRow();
if (startRow > lastRow) {
Logger.log('All rows already processed. Nothing to do.');
cleanupTriggers_();
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
return;
}
const endRow = Math.min(startRow + CHUNK_SIZE - 1, lastRow);
const numRows = endRow - startRow + 1;
const dataRange = sheet.getRange(startRow, 1, numRows, sheet.getLastColumn());
const data = dataRange.getValues();
const STATUS_COL = 5; // "Status" column in our setup
const output = [];
for (let i = 0; i < data.length; i++) {
const row = data[i];
const name = row[0]; // Column A
const value = row[1]; // Column B
const status = buildStatus_(name, value);
const rowOutput = row.slice();
rowOutput[STATUS_COL - 1] = status;
output.push(rowOutput);
// Check time every 50 rows
if (i % 50 === 0) {
const elapsed = Date.now() - startTime;
if (elapsed > MAX_RUNTIME_MS) {
const processedRowNumber = startRow + i;
scriptProps.setProperty('LAST_PROCESSED_ROW', String(processedRowNumber));
Logger.log('Time limit approaching. Stopped at row ' + processedRowNumber);
// Write what we’ve processed so far
const partialRange = sheet.getRange(
startRow,
1,
i + 1,
sheet.getLastColumn()
);
partialRange.setValues(output.slice(0, i + 1));
createContinuationTrigger_();
return;
}
}
}
// Finished this chunk within time
dataRange.setValues(output);
scriptProps.setProperty('LAST_PROCESSED_ROW', String(endRow));
Logger.log('Finished rows ' + startRow + ' to ' + endRow);
if (endRow < lastRow) {
createContinuationTrigger_();
} else {
Logger.log('All rows processed!');
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
cleanupTriggers_();
}
}
/***************************************
* 4. Row Logic
***************************************/
function buildStatus_(name, value) {
if (!name && (value === '' || value === null || typeof value === 'undefined')) {
return 'Empty row';
}
if (value > 3000) {
return 'Very high value for ' + name;
}
if (value > 1000) {
return 'High value for ' + name;
}
if (value > 0) {
return 'OK for ' + name;
}
return 'No value for ' + name;
}
/***************************************
* 5. Trigger Helpers
***************************************/
function createContinuationTrigger_() {
  // One-shot clock triggers stay listed in the project even after they fire,
  // so delete any leftover processChunk_ triggers before scheduling the next
  // run; otherwise a stale trigger would block new ones and stall the chain.
  cleanupTriggers_();
  ScriptApp.newTrigger('processChunk_')
    .timeBased()
    .after(1 * 60 * 1000) // 1 minute later
    .create();
  Logger.log('Continuation trigger created.');
}
function cleanupTriggers_() {
const triggers = ScriptApp.getProjectTriggers();
triggers.forEach(trigger => {
if (trigger.getHandlerFunction() === 'processChunk_') {
ScriptApp.deleteTrigger(trigger);
}
});
}
/***************************************
* 6. Test Helper – One-Click Test
***************************************/
function runFullTest() {
// Clean previous state
cleanupTriggers_();
PropertiesService.getScriptProperties().deleteProperty('LAST_PROCESSED_ROW');
// Create fresh test data
setupTestData();
// Kick off processing
startProcessing();
Logger.log(
'runFullTest: Created test data and started processing.\n' +
'Check the "Status" column on sheet "Data" and the project triggers.'
);
}
How to Use It
- Create / open a Google Sheet.
- Open Extensions → Apps Script and paste this entire script into `Code.gs`.
- Save the project.
- In the Apps Script editor, select and run:
  - `setupTestData()` – if you just want to populate the sheet
  - `startProcessing()` – to manually start chunked processing
  - or `runFullTest()` – to reset everything, create data, and start processing in one go
- Watch:
  - The Status column on the `Data` sheet will fill in over time.
  - The Triggers tab will show a time-based trigger for `processChunk_` until all rows are done.
If you want a different sheet name, number of rows, or chunk size, tweak `NUM_ROWS`, `CHUNK_SIZE`, and the sheet name at the top.
How to make it a standalone script
Sheet name: Data (created automatically if it doesn’t exist)
You can paste this entire script into a new standalone Apps Script project (script.google.com), then run runFullTest().
/***************************************
* CONFIG
***************************************/
const SPREADSHEET_ID = '1pM'; // placeholder – replace with your full spreadsheet ID
const SHEET_NAME = 'Data';
/***************************************
* Helper: Get Spreadsheet & Sheet
***************************************/
function getSpreadsheet_() {
return SpreadsheetApp.openById(SPREADSHEET_ID);
}
function getDataSheet_() {
const ss = getSpreadsheet_();
let sheet = ss.getSheetByName(SHEET_NAME);
if (!sheet) {
sheet = ss.insertSheet(SHEET_NAME);
}
return sheet;
}
/***************************************
* 1. Test Data Setup
***************************************/
function setupTestData() {
const sheet = getDataSheet_();
// Reset sheet
sheet.clear();
// Header row
const headers = ['Name', 'Value', 'Category', 'Notes', 'Status'];
sheet.getRange(1, 1, 1, headers.length).setValues([headers]);
// How many test rows?
const NUM_ROWS = 5000;
const names = ['Alice', 'Bob', 'Charlie', 'Dana', 'Eve', 'Frank', 'Grace', 'Heidi'];
const categories = ['A', 'B', 'C', 'D'];
const data = [];
for (let i = 0; i < NUM_ROWS; i++) {
const name = names[Math.floor(Math.random() * names.length)];
const value = Math.floor(Math.random() * 5000); // 0–4999
const category = categories[Math.floor(Math.random() * categories.length)];
const notes = 'Row ' + (i + 2) + ' sample data';
const status = ''; // will be filled by our processing
data.push([name, value, category, notes, status]);
}
// Write all rows in one batch
sheet.getRange(2, 1, data.length, data[0].length).setValues(data);
Logger.log('Test data created: ' + NUM_ROWS + ' rows in sheet "' + SHEET_NAME + '".');
}
/***************************************
* 2. Main Entry Point
***************************************/
function startProcessing() {
const scriptProps = PropertiesService.getScriptProperties();
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
processChunk_();
}
/***************************************
* 3. Chunk Processor
***************************************/
function processChunk_() {
const startTime = Date.now();
const MAX_RUNTIME_MS = 5 * 60 * 1000; // 5 minutes internal limit
const CHUNK_SIZE = 500; // rows per execution
const sheet = getDataSheet_();
const scriptProps = PropertiesService.getScriptProperties();
let lastProcessedRow = Number(scriptProps.getProperty('LAST_PROCESSED_ROW')) || 1;
const headerRow = 1;
let startRow = Math.max(lastProcessedRow + 1, headerRow + 1);
const lastRow = sheet.getLastRow();
if (startRow > lastRow) {
Logger.log('All rows already processed. Nothing to do.');
cleanupTriggers_();
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
return;
}
const endRow = Math.min(startRow + CHUNK_SIZE - 1, lastRow);
const numRows = endRow - startRow + 1;
const dataRange = sheet.getRange(startRow, 1, numRows, sheet.getLastColumn());
const data = dataRange.getValues();
const STATUS_COL = 5; // "Status" column in our setup
const output = [];
for (let i = 0; i < data.length; i++) {
const row = data[i];
const name = row[0]; // Column A
const value = row[1]; // Column B
const status = buildStatus_(name, value);
const rowOutput = row.slice();
rowOutput[STATUS_COL - 1] = status;
output.push(rowOutput);
// Check time every 50 rows
if (i % 50 === 0) {
const elapsed = Date.now() - startTime;
if (elapsed > MAX_RUNTIME_MS) {
const processedRowNumber = startRow + i;
scriptProps.setProperty('LAST_PROCESSED_ROW', String(processedRowNumber));
Logger.log('Time limit approaching. Stopped at row ' + processedRowNumber);
// Write what we’ve processed so far
const partialRange = sheet.getRange(
startRow,
1,
i + 1,
sheet.getLastColumn()
);
partialRange.setValues(output.slice(0, i + 1));
createContinuationTrigger_();
return;
}
}
}
// Finished this chunk within time
dataRange.setValues(output);
scriptProps.setProperty('LAST_PROCESSED_ROW', String(endRow));
Logger.log('Finished rows ' + startRow + ' to ' + endRow);
if (endRow < lastRow) {
createContinuationTrigger_();
} else {
Logger.log('All rows processed!');
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
cleanupTriggers_();
}
}
/***************************************
* 4. Row Logic
***************************************/
function buildStatus_(name, value) {
if (!name && (value === '' || value === null || typeof value === 'undefined')) {
return 'Empty row';
}
if (value > 3000) {
return 'Very high value for ' + name;
}
if (value > 1000) {
return 'High value for ' + name;
}
if (value > 0) {
return 'OK for ' + name;
}
return 'No value for ' + name;
}
/***************************************
* 5. Trigger Helpers
***************************************/
function createContinuationTrigger_() {
  // One-shot clock triggers stay listed in the project even after they fire,
  // so delete any leftover processChunk_ triggers before scheduling the next
  // run; otherwise a stale trigger would block new ones and stall the chain.
  cleanupTriggers_();
  ScriptApp.newTrigger('processChunk_')
    .timeBased()
    .after(1 * 60 * 1000) // 1 minute later
    .create();
  Logger.log('Continuation trigger created.');
}
function cleanupTriggers_() {
const triggers = ScriptApp.getProjectTriggers();
triggers.forEach(trigger => {
if (trigger.getHandlerFunction() === 'processChunk_') {
ScriptApp.deleteTrigger(trigger);
}
});
}
/***************************************
* 6. Test Helper – One-Click Test
***************************************/
function runFullTest() {
// Clean previous state
cleanupTriggers_();
PropertiesService.getScriptProperties().deleteProperty('LAST_PROCESSED_ROW');
// Create fresh test data in the target spreadsheet
setupTestData();
// Kick off processing
startProcessing();
Logger.log(
'runFullTest: Created test data and started processing.\n' +
'Check the "Status" column on sheet "' + SHEET_NAME + '" in spreadsheet ' + SPREADSHEET_ID +
' and the project triggers.'
);
}
How to use:
- Go to script.google.com, create a standalone project.
- Paste this entire script into `Code.gs`.
- Authorize and run:
  - `runFullTest()` → sets up data in your spreadsheet and starts chunked processing,
  - or `setupTestData()` then `startProcessing()` manually.
If you want to swap to another sheet ID later, just change `SPREADSHEET_ID`.
Here’s an updated version that works as a standalone script with your Sheet ID.
It replaces `getActiveSpreadsheet()` with `openById()` via the `getDataSheet_()` helper defined above.
/***************************************
* Chunked Processing with Triggers
***************************************/
function startProcessing() {
// Reset state and kick off the first chunk
const scriptProps = PropertiesService.getScriptProperties();
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
processChunk_();
}
function processChunk_() {
const startTime = Date.now();
const MAX_RUNTIME_MS = 5 * 60 * 1000; // Aim for ~5 mins (leave buffer under hard limit)
const CHUNK_SIZE = 500; // Number of rows to process per run
const sheet = getDataSheet_();
const scriptProps = PropertiesService.getScriptProperties();
let lastProcessedRow = Number(scriptProps.getProperty('LAST_PROCESSED_ROW')) || 1;
// Assume row 1 is headers, start from row 2
const headerRow = 1;
let startRow = Math.max(lastProcessedRow + 1, headerRow + 1);
const lastRow = sheet.getLastRow();
if (startRow > lastRow) {
Logger.log('All rows already processed. Nothing to do.');
cleanupTriggers_();
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
return;
}
// Calculate how many rows to process in this chunk
const endRow = Math.min(startRow + CHUNK_SIZE - 1, lastRow);
const numRows = endRow - startRow + 1;
// Read the data range for this chunk in one go
const dataRange = sheet.getRange(startRow, 1, numRows, sheet.getLastColumn());
const data = dataRange.getValues();
// We'll write results into a specific column (e.g., column 5 = "Status")
const STATUS_COL = 5;
const output = [];
for (let i = 0; i < data.length; i++) {
const row = data[i];
// Example: build a simple status text from first 2 columns
const name = row[0]; // Column A
const value = row[1]; // Column B
const status = buildStatus_(name, value); // Your custom logic
// Prepare row output: keep existing values but ensure STATUS_COL is updated
const rowOutput = row.slice(); // shallow copy
rowOutput[STATUS_COL - 1] = status; // 0-based index
output.push(rowOutput);
// Periodically check execution time
if (i % 50 === 0) {
const elapsed = Date.now() - startTime;
if (elapsed > MAX_RUNTIME_MS) {
// Save progress and schedule continuation
const processedRowNumber = startRow + i;
scriptProps.setProperty('LAST_PROCESSED_ROW', String(processedRowNumber));
Logger.log('Time limit approaching. Stopped at row ' + processedRowNumber);
// Write back what we processed so far
const partialRange = sheet.getRange(startRow, 1, i + 1, sheet.getLastColumn());
partialRange.setValues(output.slice(0, i + 1));
createContinuationTrigger_();
return;
}
}
}
// If we get here, we finished our chunk within time
dataRange.setValues(output);
// Update last processed row
scriptProps.setProperty('LAST_PROCESSED_ROW', String(endRow));
Logger.log('Finished rows ' + startRow + ' to ' + endRow);
// If there are more rows, schedule the next chunk
if (endRow < lastRow) {
createContinuationTrigger_();
} else {
// Done: clean up
Logger.log('All rows processed!');
scriptProps.deleteProperty('LAST_PROCESSED_ROW');
cleanupTriggers_();
}
}
/***************************************
* Row Logic
***************************************/
function buildStatus_(name, value) {
if (!name && !value) {
return 'Empty row';
}
if (value > 1000) {
return 'High value for ' + name;
}
if (value > 0) {
return 'OK for ' + name;
}
return 'No value for ' + name;
}
/***************************************
* Trigger Helpers
***************************************/
function createContinuationTrigger_() {
  // One-shot clock triggers stay listed in the project even after they fire,
  // so delete any leftover processChunk_ triggers before scheduling the next
  // run; otherwise a stale trigger would block new ones and stall the chain.
  cleanupTriggers_();
  ScriptApp.newTrigger('processChunk_')
    .timeBased()
    .after(1 * 60 * 1000) // run again in 1 minute
    .create();
  Logger.log('Continuation trigger created.');
}
function cleanupTriggers_() {
const triggers = ScriptApp.getProjectTriggers();
triggers.forEach(trigger => {
if (trigger.getHandlerFunction() === 'processChunk_') {
ScriptApp.deleteTrigger(trigger);
}
});
}
/***************************************
* Optional: In-Memory Chunking (no triggers)
***************************************/
function processInMemoryChunks() {
const sheet = getDataSheet_();
const lastRow = sheet.getLastRow();
const lastCol = sheet.getLastColumn();
const headerRow = 1;
const startRow = headerRow + 1;
const totalRows = lastRow - headerRow;
const CHUNK_SIZE = 500;
for (let offset = 0; offset < totalRows; offset += CHUNK_SIZE) {
const numRows = Math.min(CHUNK_SIZE, totalRows - offset);
const range = sheet.getRange(startRow + offset, 1, numRows, lastCol);
const values = range.getValues();
// Process each row in this chunk
for (let i = 0; i < values.length; i++) {
const row = values[i];
// ... your logic here if you want a different behavior ...
// e.g. row[4] = buildStatus_(row[0], row[1]);
}
// Write data back in one go
range.setValues(values);
}
}
You can drop this straight into your standalone script project (the same one using that Sheet ID), and it’ll use the same Data sheet in that file.
