AppsScriptPulse

Breaking the Grid: 4 Ways to Generate Excel Files from Google Workspace

We have all been there: You build a beautiful, automated dashboard in Google Sheets. It’s dynamic, it’s collaborative, and it’s perfectly integrated with your data sources. Then, the stakeholder request comes in: “Can you email this to me as an Excel file every Monday morning?”

For Google Workspace developers, the “Excel Requirement” is often a stumbling block. While Google Sheets is powerful, the bridge to Excel can be shaky. In this post, I’m going to explore four architectural approaches to solving this problem. If you would like to play along, here are the Google Sheet Invoice Template and the Excel Invoice Template.

1. The Low-Code Agent (AppSheet Automation)

Before you write a single line of code, you should consider if the platform can do the heavy lifting for you. Since AppSheet is now a core part of Google Workspace, it offers a native “No-Code” way to generate Excel files that is robust, scheduled, and template-based.

The Strategy

Instead of writing a script to fetch data and build a file, you connect your Google Sheet to an AppSheet app. You then configure an Automation that triggers on a schedule or a data change.

The Workflow:

  1. Trigger: A new row is added to your “Invoices” sheet (or a weekly schedule fires).
  2. Task: AppSheet runs a “Create a new file” task.
  3. Template: It reads a simplified Excel template stored in Drive, replaces tags like <<[Qty]>> with actual data, and generates the file.
  4. Delivery: The file is saved to Drive or emailed directly to the client.

Why This Wins

  • Zero Maintenance: No scripts to debug, no OAuth tokens to manage, and no breaking changes when APIs update.
  • Native Templates: It supports complex Excel formatting out of the box.

The Limitation

  • Syntax: You must use AppSheet’s specific templating syntax (<< >>) inside your Excel file.
  • Scaling: Generating reports with thousands of line items can hit automation timeout limits.

2. The Native Export (Apps Script)

If you need more granular control or don’t want to spin up an AppSheet app, the most common starting point is attempting to do everything within Apps Script. You create a temporary Google Sheet, fill it with data, and then export it using the Drive API.

The Code

In this scenario, we assume you have a Google Sheet version of your Invoice template. We first write the specific line items to the Sheet, then export it.

function generateInvoiceExport() {
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  const sheet = ss.getSheetByName('InvoiceTemplate');
  
  // 1. Prepare Invoice Data
  const invoiceData = [
    { desc: "Web Development - Phase 1", qty: 40, unit: 100 },
    { desc: "Server Hosting (Annual)", qty: 1, unit: 250 },
    { desc: "Domain Registration", qty: 2, unit: 15 },
    { desc: "Maintenance Retainer", qty: 10, unit: 85 }
  ];
  
  // 2. Map data to the Sheet structure (Cols B, C, D, E, F)
  // B=Desc, E=Qty, F=Unit. (C and D are skipped/empty)
  const rows = invoiceData.map(item => [item.desc, "", "", item.qty, item.unit]);

  // 3. Write data to the Google Sheet
  // Matches Template: Start at Row 19, Column B (2)
  if (rows.length > 0) {
    // Corrected: ensure the number of columns (5) matches your data structure
    sheet.getRange(19, 2, rows.length, 5).setValues(rows);
  }
  
  // 4. Construct the export URL for the populated sheet
  const fileName = `Invoice_${new Date().toISOString().slice(0,10)}.xlsx`;
  const url = `https://docs.google.com/spreadsheets/d/${ss.getId()}/export?format=xlsx&gid=${sheet.getSheetId()}`;
  
  // 5. Fetch and Save
  const params = {
    method: "GET",
    headers: { "Authorization": `Bearer ${ScriptApp.getOAuthToken()}` },
    muteHttpExceptions: true
  };
  
  const response = UrlFetchApp.fetch(url, params);
  
  if (response.getResponseCode() === 200) {
    const blob = response.getBlob().setName(fileName);
    const file = DriveApp.createFile(blob);
    console.log(`Success: Invoice saved to Drive (ID: ${file.getId()})`);
  } else {
    console.log(`Error: ${response.getContentText()}`);
  }
}

The Limitations

  • Speed & Scale: The overhead of creating a temporary sheet and exporting it adds up. Generating 50 invoices in a loop will take a long time.
  • The “Source” Gap: Your source material must be a Google Sheet, meaning you lose Excel-specific features like Power Query or VBA macros from your template.

3. The Browser Round-Trip (ExcelJS)

If you need to maintain stricter fidelity to an existing Excel template (preserving fonts, colors, and layouts) and are interested in speed, you can use the “Round Trip” technique.

This method uses the user’s browser as a processing engine. You fetch the Invoice.xlsx template from Drive, process it client-side with ExcelJS, and send the result back.

View the Code Project

The Strategy

  1. Parallel Fetching (Client): The browser fires concurrent requests to Apps Script to fetch the template (Base64) and invoice data.
  2. Client-Side Processing: ExcelJS loads the template, injects data into specific cells (preserving layout), and recompiles the file.
  3. Upload: The new file is encoded and sent back to Drive.

The Code

This snippet focuses on the core orchestration logic. For the full UI setup and helper functions, see the linked project.

/* --- CLIENT SIDE (Core Logic) --- */
async function startGeneration() {
  
  // 1. Parallel Fetching
  // Fire both requests at once to speed up the wait time
  const [templateResponse, invoiceData] = await Promise.all([
    runGoogleScript('getTemplateBase64', null), 
    runGoogleScript('getInvoiceData')
  ]);

  // 2. Load into ExcelJS
  const workbook = new ExcelJS.Workbook();
  const binaryString = atob(templateResponse.base64);
  const bytes = new Uint8Array(binaryString.length);
  for (let i = 0; i < binaryString.length; i++) { bytes[i] = binaryString.charCodeAt(i); }
  await workbook.xlsx.load(bytes.buffer);

  // 3. Modify the Sheet
  const sheet = workbook.worksheets[0];
  let currentRow = 19; // Starting at Row 19 based on template

  invoiceData.forEach((item, index) => {
    // Ensure we do not write past the visual template limit (e.g. Row 22)
    if (currentRow + index > 22) return;

    const row = sheet.getRow(currentRow + index);
    
    // Inject data into specific cells (B, E, F)
    row.getCell(2).value = item.desc;  
    row.getCell(5).value = item.qty;   
    row.getCell(6).value = item.unit;  
    
    // ExcelJS preserves the existing styles of these cells!
    row.commit();
  });

  // 4. Export & Save
  const outBuffer = await workbook.xlsx.writeBuffer();
  
  // Convert buffer to base64 (simplified for brevity)
  let binary = '';
  const outBytes = new Uint8Array(outBuffer);
  for (let i = 0; i < outBytes.length; i++) {
    binary += String.fromCharCode(outBytes[i]);
  }
  const outBase64 = btoa(binary);

  // Upload back to Drive
  await runGoogleScript('saveProcessedFile', outBase64, `Invoice_${Date.now()}.xlsx`);
}

/* --- SERVER SIDE (Apps Script) --- */
function getTemplateBase64(templateId) {
  const file = DriveApp.getFileById(templateId);
  return {
    base64: Utilities.base64Encode(file.getBlob().getBytes())
  };
}

function saveProcessedFile(base64Content, fileName) {
  const blob = Utilities.newBlob(
    Utilities.base64Decode(base64Content), 
    MimeType.MICROSOFT_EXCEL, 
    fileName
  );
  return { url: DriveApp.createFile(blob).getUrl() };
}

The Trade-off

  • Payload Limits: google.script.run calls can become unstable when the payload exceeds 50MB.
  • Base64 Inflation: Remember that Base64 encoding increases file size by ~33%.
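That ~33% figure falls straight out of the encoding: Base64 maps every 3 bytes of input to 4 output characters. A quick back-of-the-envelope check (plain arithmetic, nothing Apps Script specific):

```javascript
// Base64 emits 4 characters for every 3 input bytes (padding aside),
// so the encoded payload is roughly 4/3 the size of the original.
const originalBytes = 3 * 1024 * 1024;                 // a 3 MiB template
const base64Chars = 4 * Math.ceil(originalBytes / 3);  // 4 chars per 3-byte group
const inflation = base64Chars / originalBytes;

console.log(inflation); // 1.333... → about 33% larger on the wire
```

In practice this means a 40MB workbook is already pushing well past 50MB once encoded, which is how a seemingly modest file drifts into payload-limit territory.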

4. The Enterprise Engine (Cloud Functions + Python)

For robust, high-fidelity reporting, we need to leave the Google Sheets conversion pipeline behind. By using Google Cloud Functions with Python, we can manipulate native Excel files directly using libraries like openpyxl.

The “User Identity” Pattern

Instead of managing complex Service Account permissions, this solution uses the User’s Identity to perform Drive operations.

  1. Apps Script: Generates an OIDC ID Token (for identity) using getIdentityToken() and an OAuth Access Token (for Drive permissions) using getOAuthToken().
  2. Cloud Function: Verifies the ID Token and uses the Access Token to initialise the Drive API client as the user.

Setup Instructions

1. Google Cloud Function (Python)

  • Deploy the Function:
    • Source: cloud-fn.py
    • Runtime: Python 3.x
    • Entry point: generate_excel_report
    • Requirements: functions-framework, google-auth, google-api-python-client, openpyxl
  • Configure Environment:
    • Open cloud-fn.py and update APPS_SCRIPT_CLIENT_ID with your Apps Script’s OAuth Client ID.
      Note: You can find this ID by running the logClientId() helper function in Code.js.

2. Google Apps Script (Code.js)

  • Update Manifest (appsscript.json): Add the https://www.googleapis.com/auth/userinfo.email scope.
  • Configuration: Update YOUR_CLOUD_FUNCTION_URL in callInvoiceEngine() with your deployed function URL.

The Code

Google Apps Script (Code.js)

function callInvoiceEngine() {
    const url = "YOUR_CLOUD_FUNCTION_URL";

    // 1. Get Tokens
    const accessToken = ScriptApp.getOAuthToken();
    const idToken = ScriptApp.getIdentityToken(); // Requires 'openid' scope in manifest

    // --- LOGGING FOR LOCAL TESTING ---
    console.log("--- CREDENTIALS FOR LOCAL TESTING ---");
    console.log("ID Token:", idToken);
    console.log("Access Token:", accessToken);
    console.log("-------------------------------------");

    const payload = {
        templateId: "ID_OF_TEMPLATE_FILE",
        folderId: "ID_OF_DESTINATION_FOLDER",
        // Matches the same dataset used in our other examples
        data: [
            { desc: "Web Development - Phase 1", qty: 40, unit: 100 },
            { desc: "Server Hosting (Annual)", qty: 1, unit: 250},
            { desc: "Domain Registration", qty: 2, unit: 15},
            { desc: "Maintenance Retainer", qty: 10, unit: 85}
        ],
        // The Magic Key: Passing the active user's Access Token for Drive operations
        userToken: accessToken
    };

    const options = {
        method: "post",
        contentType: "application/json",
        headers: {
            "Authorization": `Bearer ${idToken}`
        },
        payload: JSON.stringify(payload)
    };

    UrlFetchApp.fetch(url, options);
}

function logClientId() {
    // Helper to find the Project's OAuth Client ID
    var token = ScriptApp.getOAuthToken();
    var url = 'https://www.googleapis.com/oauth2/v1/tokeninfo?access_token=' + token;
    var response = UrlFetchApp.fetch(url);
    var data = JSON.parse(response.getContentText());
    
    console.log('Client ID:', data.issued_to);
}

Pro Tip: Batching Requests

If you need to generate 50 invoices at once, don’t loop through them one by one. Use UrlFetchApp.fetchAll(requests) to fire off parallel requests to your Cloud Function. Since the heavy lifting happens in the cloud, Apps Script just acts as the orchestrator.
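As a sketch, the batched version just swaps the single fetch for an array of request objects (the URL, tokens, and invoice payloads below are placeholders, mirroring callInvoiceEngine()):

```javascript
// Placeholders standing in for the real values used in callInvoiceEngine()
const url = "YOUR_CLOUD_FUNCTION_URL";
const idToken = "ID_TOKEN_FROM_getIdentityToken";
const userToken = "ACCESS_TOKEN_FROM_getOAuthToken";

// One payload per invoice to generate
const invoicePayloads = [
  { templateId: "ID_OF_TEMPLATE_FILE", folderId: "ID_OF_DESTINATION_FOLDER", data: [], userToken },
  { templateId: "ID_OF_TEMPLATE_FILE", folderId: "ID_OF_DESTINATION_FOLDER", data: [], userToken }
];

// Each entry is a full request spec, including its own url,
// plus the same options you would pass to UrlFetchApp.fetch()
const requests = invoicePayloads.map(payload => ({
  url: url,
  method: "post",
  contentType: "application/json",
  headers: { "Authorization": `Bearer ${idToken}` },
  payload: JSON.stringify(payload)
}));

// In Apps Script, fire them all in parallel:
// const responses = UrlFetchApp.fetchAll(requests);
```

Each element of the array carries its own url alongside the usual fetch options, and fetchAll returns the responses in the same order as the requests.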

Google Apps Script (appsscript.json)

{
    "oauthScopes": [
        "https://www.googleapis.com/auth/drive",
        "https://www.googleapis.com/auth/script.external_request",
        "https://www.googleapis.com/auth/userinfo.email"
    ]
}

Cloud Function (cloud-fn.py)

The full Python source for the Cloud Function is in the cloud-fn.py file referenced in the setup instructions above.

Local Test Script (test_request.py)

Before deploying, you can test the entire flow locally to ensure your tokens and permissions are working correctly.

  1. Start the Function: functions-framework --target=generate_excel_report --source=cloud-fn.py --port=8080
  2. Run the Tester: python3 test_request.py (Paste tokens from Apps Script logs when prompted).
import requests
import json

def run_test():
    print("--- Local Cloud Function Tester ---")
    print("Ensure you are running: functions-framework --target=generate_excel_report --source=cloud-fn.py --port=8080\n")

    # 1. Gather Tokens & IDs
    print("Paste your Apps Script Tokens (use Logger.log in GAS):")
    id_token = input("1. ID Token (ScriptApp.getIdentityToken()): ").strip()
    if not id_token: return
    
    access_token = input("2. Access Token (ScriptApp.getOAuthToken()): ").strip()
    if not access_token: return

    template_id = input("3. Google Sheet/Excel Template ID: ").strip()
    folder_id = input("4. Destination Folder ID: ").strip()

    # 2. Prepare Payload
    payload = {
        "templateId": template_id,
        "folderId": folder_id,
        "data": [
            { "desc": "Local Test Item 1", "qty": 1, "unit": 10 },
            { "desc": "Local Test Item 2", "qty": 5, "unit": 20 }
        ],
        "userToken": access_token
    }

    headers = {
        "Authorization": f"Bearer {id_token}",
        "Content-Type": "application/json"
    }

    # 3. Send Request
    url = "http://localhost:8080"
    print(f"\nSending POST request to {url}...")
    
    try:
        response = requests.post(url, json=payload, headers=headers)
        print(f"\nStatus Code: {response.status_code}")
        print(f"Response Body: {response.text}")
            
    except requests.exceptions.ConnectionError:
        print(f"\n[!] Could not connect to {url}.")
        print("Is the functions-framework running?")

if __name__ == "__main__":
    run_test()

Why This Wins

  • True Excel Support: You are editing the actual binary file, not a conversion.
  • Separation of Concerns: The heavy lifting is done in the cloud, not in the browser or the Apps Script runtime.

Summary: Which Tool for Which Job?

| Feature | AppSheet | Apps Script (Native) | Client-Side (ExcelJS) | Cloud Functions (Python) |
|---|---|---|---|---|
| Complexity | Very Low | Low | High | High |
| Cost | Free | Free | Free | GCP Usage (Low) |
| Best For | Standard Workflows | Simple Reports | Interactive Template Fills | Enterprise Reports |

Final Thoughts

There is no “one size fits all” here. The “Excel Requirement” is a spectrum:

  • For quick internal tools: Stick to Method 1 (AppSheet) and Method 2 (Native). It’s fast to build and free.
  • For polished client reports: Method 3 (ExcelJS) is your best friend. It keeps the heavy lifting in the browser and preserves your branding.
  • For enterprise scale: If you are generating hundreds of invoices a week, Method 4 (Cloud Functions) is the robust choice. It decouples the generation logic from the spreadsheet UI.

Choose the architecture that fits your volume and your “pixel-perfect” tolerance. I’m also keen to hear if you’ve got another ‘Excel Generator’ approach. Happy scripting!

A Framework for Integrating Agentic AI into Google Workspace Add-ons

A fact-check custom function for Google Sheets to be used as a bound Apps Script project powered by a Vertex AI agent and Gemini model.

This sample demonstrates how you can integrate two powerful types of AI resources directly into your Google Sheets spreadsheets.

A new tutorial published by Google’s Pierrick Voulet, while ostensibly about creating a fact-checking tool in Google Sheets, offers something more valuable to developers: a clear framework for integrating powerful, multi-step Vertex AI Agents into Google Workspace solutions.

While the example is a FACT_CHECK custom function, the underlying architecture caught my eye as it provides a blueprint for connecting Google Workspace applications with the sophisticated reasoning capabilities of custom AI agents.

The Core Architectural Pattern

The solution uses Google Apps Script as a bridge between the user-facing application (Google Sheets) and the backend AI services on Google Cloud.

When a user calls the custom function, the script authenticates with a service account and makes a call to a deployed Vertex AI Agent. This agent then performs its multi-step reasoning. The result is then optionally passed to a Gemini model for final formatting before being returned to the Sheet.

This pattern is highly adaptable for various use cases, allowing developers to bring advanced Agentic AI into their own solutions.

Key Components for Developers

For developers looking to adapt this framework, the tutorial outlines the essential components:

  • A configured Google Cloud Project with the Vertex AI API enabled.
  • A deployed ADK Agent on the Vertex AI Agent Engine.
  • An Apps Script project using a service account for secure, server-to-server authentication with Google Cloud.

The provided Apps Script code (Code.gs and AiVertex.js) serves as a robust starting point, handling the API calls to both the reasoning agent and a Gemini model for final output formatting.

Ultimately, the fact-checking tool serves as an excellent proof of concept. The true value for the developer community, however, lies in the architectural blueprint it provides. This tutorial offers a clear model for integrating multi-step AI reasoning into Google Workspace add-ons, opening the door to a new class of intelligent applications.

Source: Fact-check statements with an ADK AI agent and Gemini model  |  Apps Script  |  Google for Developers

A More Versatile Way to Append Data in Google Sheets with the Sheets API Advanced Service


Learn how to use the Append method in Google Sheets Advanced Service in Google Apps Script and see how it differs from the SpreadsheetApp option

While most Google Apps Script developers are familiar with the standard appendRow() method, it does come with its limitations, namely being restricted to a single row and always appending after the very last row with content on a sheet. For those looking for more control and flexibility, community expert Scott Donald (aka Yagisanatode) has published an excellent tutorial on using the spreadsheets.values.append() method available through the Google Sheets API Advanced Service.

Scott’s guide provides a deep dive into the powerful features this method unlocks, offering a more versatile approach to handling data in your Google Sheets.

The Power of the Append Method

In his tutorial, Scott highlights several key advantages over the traditional SpreadsheetApp service:

  • Intelligently append to a table: The API automatically finds the first empty row after a data table within a specified range. This allows you to append new rows directly to the correct location in a single API call, without first needing to find the last row of content.
  • Choose Your Insert Option: The API allows you to decide whether you want to INSERT_ROWS, which pushes existing data down, or OVERWRITE the empty cells below your target table.
  • Control Value Input: You can specify USER_ENTERED to have Sheets parse the data as if it were typed in by a user (which processes formulas and formats dates), or RAW to insert the values exactly as they are in your array.
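Putting those options together, a call through the Advanced Service might look like this (the sheet name, range, and row values are example assumptions; the Sheets service must be enabled under Services in the editor):

```javascript
// Example values — adjust to your own sheet
const spreadsheetId = "YOUR_SPREADSHEET_ID";
const valueRange = {
  values: [["2024-06-01", "Widget", 3, 4.99]]  // one new row; multiple rows also work
};
const options = {
  valueInputOption: "USER_ENTERED",  // parse like typed input: dates, numbers, formulas
  insertDataOption: "INSERT_ROWS"    // insert below the table, pushing later content down
};

// In Apps Script (Sheets Advanced Service enabled):
// Sheets.Spreadsheets.Values.append(valueRange, spreadsheetId, "Sheet1!A1:D10", options);
// The API locates the first empty row after the table inside A1:D10 and writes there.
```

Swapping insertDataOption to OVERWRITE, or valueInputOption to RAW, gives you the alternative behaviours Scott describes.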

Scott’s tutorial provides a detailed breakdown of each parameter and includes a helpful video walkthrough and a starter sheet for you to get hands-on with the concepts. He also notes some current documentation errors and bugs, which makes this a valuable resource for anyone looking to implement this feature.

A big thank you to Scott for sharing this insightful guide with the community!

Source: Append – Google Sheets API Advanced Service for Apps Script – Yagisanatode

Automate creating or updating tables in Google Sheets with Apps Script🚀

A step-by-step guide to dynamically import, filter, and format your data.

How many times have you found yourself copying data from one sheet to another, applying filters, and then manually formatting everything as a table to make it more readable and functional? It is a sometimes tedious and error-prone process.

What if we could automate all of this with a single click? Welcome to the world of Google Apps Script!

In this article, we will analyze a script that not only imports and filters data between two spreadsheets but also uses the power of the Sheets API to dynamically create a formatted table. Ready? Let’s start! 🚀


What does the script do?

In summary, our script performs these steps in sequence:

  • Connects to a source spreadsheet (even one different from where the script is located).
  • Imports all the data from the specified sheet.
  • Filters the rows based on a specific condition (in our case, a numeric value in a column).
  • Clears the destination sheet and writes the filtered data to it.
  • Creates (or recreates) a formatted table that perfectly adapts to the size of the new data.

The Sample File

To follow this tutorial, you can create a copy of the sample spreadsheet:

➡️ Click here to create a copy of the sample file

The file contains the source sheet DataSet with the starting data. In your destination spreadsheet, you can, for example, use a sheet named Filtered where the filtered data will be inserted. Nothing prevents you from using this script to filter and create a table within the same file, but for this example you will need the linked file plus a second file into which you import its data.

In the script we will analyze, the filter is applied to the “Billing” column, selecting only rows with a value greater than 2000.


Preparation: Enable the Sheets API

Before starting, there is a fundamental step. Our script uses advanced methods that are not part of the standard SpreadsheetApp service, but of the advanced Sheets API service.

To enable it:

  • Open the Apps Script editor.
  • In the left menu, click on Services (+).
  • Find Google Sheets API in the list, select it, and click Add.

Perfect! Now your script has the “superpowers” to manipulate tables, conditional formatting, and much more.


Script Analysis

Here is the complete script that you will need to copy and paste into your file:

function importFilterCreateTable() {
    const idOrigin = "ID_FILE"; // Source File ID
    const nameOrigin = "DataSet"; // source sheet name
    const nameDest = "Filtered" // destination sheet name
    const indexCol = "Billing" // name of Column to filter
    const nameTable = "TBL_Filtered" // name of the table

    const ss = SpreadsheetApp.getActive();
    const sheetOrigin = SpreadsheetApp.openById(idOrigin).getSheetByName(nameOrigin);
    const sheetDest = ss.getSheetByName(nameDest);
    const destID = sheetDest.getSheetId();

    // Import & Filter the data
    const values = sheetOrigin.getDataRange().getValues();

    const col = values[0].indexOf(indexCol); // Find the column number to filter

    const filterValues = values.filter((rw, h) => h === 0 || rw[col] > 2000); //Filter by condition

    if (!filterValues.length) return;

    // Retrieve the table data via the Sheets API
    const ssObj = Sheets.Spreadsheets.get(ss.getId(), { includeGridData: false });
    const sheetObj = ssObj.sheets.find(s => s.properties.sheetId === destID);

    const requests = [];

    // Checks if the table exists, if so it deletes it
    if (sheetObj.tables) {
        const table = sheetObj.tables[0];
        requests.push({ deleteTable: { tableId: table.tableId } });
    }

    // Adds the new data
    sheetDest.getRange(1, 1, filterValues.length, filterValues[0].length).setValues(filterValues);

    // Adds a new table
    requests.push({
        addTable: {
            table: {
                name: nameTable,
                range: {
                    sheetId: destID,
                    startRowIndex: 0,
                    endRowIndex: filterValues.length,
                    startColumnIndex: 0,
                    endColumnIndex: filterValues[0].length
                }
            }
        }
    });

    if (requests.length) {
        Sheets.Spreadsheets.batchUpdate({ requests }, ss.getId());
    }
}

💪🏻 Let’s analyze the interesting part: the table.

The Creation of the Table

This is the most interesting and powerful part of the script. The standard SpreadsheetApp service does not have a direct method like sheet.createTable(). To do this, we need to speak directly with the Google Sheets API.

How does it work?

  • Sheets.Spreadsheets.get(ss.getId(), { includeGridData: false }); With this line, we ask the API to provide us with all the properties of our spreadsheet (like the sheets within it, their IDs, and of course, the existing tables), but without downloading the cell data (includeGridData: false), making the call much faster.
  • ssObj.sheets.find(s => s.properties.sheetId === destID); From the object we have received, we look for the specific sheet we are working on by comparing its ID (destID).
  • Preventive deletion: if (sheetObj.tables) Our script is designed to be run multiple times. For simplicity, we delete the old table before creating a new one. How? This block of code checks if a table already exists in the sheetObj sheet (sheetObj.tables). If so, it adds a deletion request (deleteTable) to an array of “requests” (requests).
  • Creation: requests.push({ addTable: { ... } }); We add a second request to our array: the creation of the new table (addTable). Here we define all its properties:
    • name: The unique name of the table.
    • range: The cell range it will occupy. It is crucial to note that the API uses “zero-based” indexes (the first row/column is 0) and the final index (endRowIndex) is excluded. This is why filterValues.length works perfectly to define the last row.

📌 In this example, the table will be placed in cell A1. If you want the table to appear in a different row, for example in row 3 of a column, you must set startRowIndex: 2 and endRowIndex: filterValues.length + 2. If you want it in a different column instead, for example in column D, you will have to put startColumnIndex: 3 and endColumnIndex: filterValues[0].length + 3.

  • Sheets.Spreadsheets.batchUpdate({ requests }, ss.getId()); This is the final call. Instead of making one API call to delete and another to create, we group them into a single batchUpdate. It is more efficient and ensures that the operations are performed in sequence.
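To make the zero-based, end-exclusive arithmetic concrete, here is a small hypothetical helper (not part of the script above) that converts familiar 1-based coordinates into the range object addTable and updateTable expect:

```javascript
// Convert a 1-based start row/column plus data dimensions into the
// zero-based, end-exclusive range object used by addTable/updateTable.
function toApiRange(sheetId, startRow, startCol, numRows, numCols) {
  return {
    sheetId: sheetId,
    startRowIndex: startRow - 1,           // first row is index 0
    endRowIndex: startRow - 1 + numRows,   // end index is excluded
    startColumnIndex: startCol - 1,
    endColumnIndex: startCol - 1 + numCols
  };
}

// A table starting in D3 spanning 5 rows and 4 columns:
const range = toApiRange(123, 3, 4, 5, 4);
// startRowIndex: 2, endRowIndex: 7, startColumnIndex: 3, endColumnIndex: 7
```

This matches the offsets in the note above: row 3 becomes startRowIndex 2, and column D becomes startColumnIndex 3.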

This “delete and recreate” approach is extremely robust and ensures that the table always has the exact dimensions of the filtered data. The previous data? It will be removed when we delete the table. In fact, it is only after possibly deleting the table that the new data will be inserted.


Updating an Existing Table

What if we just wanted to update the dimensions of an existing table instead of deleting and recreating it? In this case, the import and filter logic remains the same, but the table manipulation part changes.

This method becomes useful when your table already has a structure that you want to preserve (special formatting, data validation, drop-down menus, etc.) and you just want to update its data and dimensions.

Here is what the final part of the script would look like in this case:

    // Clear the old data rows (keep the table header in row 1).
    // Note: start the range at row 2, so the row count must be getMaxRows() - 1
    sheetDest.getRange(2, 1, sheetDest.getMaxRows() - 1, sheetDest.getMaxColumns()).clearContent();

    // Insert the new values, skipping the header row the table already has
    const dataRows = filterValues.slice(1);
    sheetDest.getRange(2, 1, dataRows.length, dataRows[0].length).setValues(dataRows);

    // Retrieve the table data via the Sheets API
    const obj = Sheets.Spreadsheets.get(ss.getId(), { includeGridData: false });
    const sheetObj = obj.sheets.find(s => s.properties.sheetId === destID)

    const tableID = sheetObj.tables[0].tableId

    const requests = [{
        updateTable: {
            table: {
                tableId: tableID,
                range: {
                    sheetId: destID,
                    startRowIndex: 0,
                    endRowIndex: sheetDest.getLastRow(),
                    startColumnIndex: 0,
                    endColumnIndex: sheetDest.getLastColumn()
                }
            },
            fields: "range"
        }
    }];

    Sheets.Spreadsheets.batchUpdate({ requests }, ss.getId());

What changes?

  • Instead of deleting and creating, we use an updateTable request.
  • We must specify the ID of the table to be modified (tableId).
  • The fields: "range" field is crucial: it tells the API to update only and exclusively the range property of the table, leaving everything else (like the name) unchanged.

Conclusion

Which method to choose? It always depends on the use you want to make of it, so it depends a lot on your needs. Remembering that the “delete and recreate” method of the first script is generally used if we always want to have a new and clean table, while the second case is useful when we have already structured our table.

Automating tasks in Google Sheets with Apps Script can save you hours of work. Learning to use the Sheets API opens up a world of possibilities that go far beyond simple data manipulation, allowing you to programmatically control almost every aspect of your spreadsheets.

Automating Personalised Gmail Drafts with the AI Formula and Apps Script

The AI formula in Google Sheets is a powerful and easy way to use Gemini AI directly inside of your Google Sheets. In this example, we use the AI formula to generate draft emails providing feedback on student exam scores. Then we use AI to write Apps Script that moves the draft emails from our Sheet into drafts in our Gmail, ready to send.

For many, the process of drafting personalised emails from a list of contacts in Google Sheets is a tedious, manual task. It involves switching between tabs, copying and pasting data, and carefully composing each message. But what if you could automate the entire workflow, generating perfectly tailored draft emails in Gmail with the click of a button?

In a recent video tutorial, Google Developer Expert Ben Collins demonstrates a powerful and efficient method to achieve exactly this. He showcases how to combine the intelligence of the new AI() formula in Google Sheets with the automation capabilities of Google Apps Script to turn structured data into ready-to-review email drafts.

What makes this solution so compelling is that it’s more than just a clever trick; it’s a glimpse into the future of work in Google Workspace. As Google continues to embed AI and end-to-end automations like Workspace Flows, the ability to effectively guide these systems becomes paramount. Ben’s solution provides a perfect, practical sandbox for honing the crucial new skill of prompt engineering and mastering the art of communicating with Large Language Models.

To see the full demonstration and get started on your own prompting journey, be sure to watch Ben’s complete video.

Source: Turn Sheets Data into Gmail Drafts with the AI Formula and Apps Script 

An AI Assistant to Triage Your Inbox and Draft Replies with Apps Script and the Gemini API

Image credit: Zack Akil

🤖 AI-powered Gmail assistant that automatically analyzes emails and drafts responses using Google’s Gemini AI

Google is rapidly integrating Gemini across the Workspace platform, bringing powerful AI features directly into the tools we use every day. While these native integrations offer incredible value, developer-led solutions provide a tantalising glimpse into the future of hyper-personalised automation, especially as we look toward the potential of the new Google Workspace Flows. This is where the real magic begins, allowing for custom agents that can handle complex, multi-step tasks.

Google Developer Advocate Zack Akil has created one such example solution that turns your Gmail inbox into a smart, automated assistant. His open-source project, the “AI-got-this-gmail-delegator,” is a Google Apps Script that uses the Gemini API to analyse incoming emails, categorise them, and even draft replies based on your own knowledge and writing style.

This project is a great example of how to combine the power of Google Workspace automation with the intelligence of modern AI. The script works by:

  • Analysing Incoming Emails: When a new email arrives, the script sends the content to the Gemini API for analysis.
  • Providing Context: The AI’s analysis is guided by a “knowledge base” that you provide in a Google Doc. This could include your role, common responses, or specific project details.
  • Intelligent Triage: Based on the email’s content and your knowledge base, the script decides whether the email needs a reply, requires manual intervention, or can be archived.
  • Drafting Replies: For emails that need a response, the script uses Gemini to generate a draft reply in your own writing style, ready for you to review and send.

The entire process is logged in a Google Sheet, giving you a complete overview of your AI assistant’s activity.
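The triage decision in the second and third bullets could be sketched roughly as follows. Note this is a minimal illustration, not Zack's actual code: it assumes the Gemini prompt asks for a JSON verdict like {"action": "REPLY", "reason": "..."}, and those field names are mine:

```javascript
// Sketch of the triage step: assume the Gemini prompt requests a JSON
// verdict such as {"action": "REPLY", "reason": "..."}. The schema is
// illustrative, not the project's actual one.

var TRIAGE_ACTIONS = ["REPLY", "MANUAL", "ARCHIVE"];

function parseTriageVerdict(modelText) {
  // Extract the first JSON object from the model output, tolerating any
  // surrounding prose or markdown fences the model may add.
  var start = modelText.indexOf("{");
  var end = modelText.lastIndexOf("}");
  var verdict = null;
  if (start !== -1 && end > start) {
    try { verdict = JSON.parse(modelText.slice(start, end + 1)); } catch (e) {}
  }
  // Anything unparseable or unexpected falls back to manual review.
  if (!verdict || TRIAGE_ACTIONS.indexOf(verdict.action) === -1) {
    return { action: "MANUAL", reason: "Could not parse model verdict" };
  }
  return verdict;
}
```

Defaulting to manual review on any parse failure is the safe design choice here: the assistant should never silently archive or reply to an email it did not understand.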

What makes this project so compelling is its practicality and customisability. By modifying the Google Doc knowledge base and the writing style prompts, you can create a personalised email assistant that understands the nuances of your work.

To explore the code and set up your own AI email delegator, check out the project on GitHub.

Source: GitHub – ZackAkil/AI-got-this-gmail-delegator 

Extending Chat Apps Across Workspace with Universal Actions

In this video we build a Chat app as a Google Workspace add-on with Apps Script and extend it to other Workspace applications (Calendar, Gmail, Drive, Docs, Sheets, and Slides).

With Google Chat apps now generally available as Google Workspace add-ons, the developer community has a unified model for creating integrated experiences across Google Workspace.

To see this new framework in action, Pierrick Voulet from the Google Developer Relations team has released a fantastic video tutorial, ‘Build a Chat app as a Google Workspace add-on with Apps Script’.

The ‘OOO Assistant’: A Practical Demonstration

The video centres on building an ‘Out-of-Office Assistant’ that can block out the user’s Calendar for the day and set a Gmail out-of-office auto-reply. The real magic is how this Chat-based logic extends into a full Workspace Add-on, available in the sidebars of Gmail, Calendar, Drive, Docs, Sheets and Slides. This is achieved with minimal code changes, primarily by configuring the appsscript.json manifest to include:

  • homepageTrigger to display a UI card in the add-on sidebar.
  • universalActions that reuse the exact same functions (blockDayOut() and setAutoReply()) originally built for the Chat quick commands.
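The manifest configuration described above could look roughly like this abridged appsscript.json sketch. The add-on name, logo URL, and labels are placeholders; the function names come from the video, and required fields such as oauthScopes are omitted for brevity:

```json
{
  "addOns": {
    "common": {
      "name": "OOO Assistant",
      "logoUrl": "https://example.com/logo.png",
      "homepageTrigger": { "runFunction": "onHomepage" },
      "universalActions": [
        { "label": "Block my day out", "runFunction": "blockDayOut" },
        { "label": "Set auto-reply", "runFunction": "setAutoReply" }
      ]
    },
    "gmail": {},
    "calendar": {},
    "drive": {},
    "docs": {},
    "sheets": {},
    "slides": {}
  }
}
```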

The video provides a complete overview of the process, and you can get hands-on with the shared code yourself. For more technical details, check out the official Extend Google Chat documentation.

Source: Build a Chat app as a Google Workspace add-on with Apps Script (YouTube)
Code: Build a Google Workspace add-on extending all UIs (GitHub)
Docs: Extend Google Chat (Documentation)

From Vague Vibes to Structured Data: “Vibe Scraping” with Google Apps Script and the Gemini API


Nine years ago, I wrote an article describing how to scrape data from the internet in about five minutes. It featured a Google Apps Script library that allowed you to specify what to download from a webpage by identifying the text surrounding your target information. This became one of my most-read articles, and the library itself saved me a significant amount of time.

With the advent of large language models like Gemini, this entire paradigm is shifting. Just as “vibe coding” describes a more intuitive approach to programming, I’d say we’re now entering an era of “vibe-scraping.”

Pulling data from the web into spreadsheets has long been a staple. But what if you could simply describe the information you want and have an AI fetch it for you?

Google Developer Expert Ivan Kutil introduces a modern approach he calls “vibe scraping.” In his latest post, he demonstrates a powerful method that combines Google Apps Script with the Gemini API to pull structured data from any webpage directly into a Google Sheet.

To understand the power of this technique, Ivan set out to solve an old problem. He had a Google Sheet with a list of movies, each with a URL pointing to its page on ČSFD (a popular Czech-Slovak film database). His goal was to automatically populate columns with details like genre and rating by extracting the data directly from those linked pages.

At the heart of his solution is a new, experimental Gemini API feature called the URL Context tool. As the documentation notes, this tool is surprisingly flexible: it “can then retrieve content from the URLs and use that content to inform and shape its response.”

The tool can be used in two main ways:

  1. URL Context “By Itself”: In this mode, the tool directly fetches and processes the content from the URL you provide. This is a straightforward way to analyze the live content of a single page.
  2. URL Context with Google Search Grounding: In this mode, the tool can read content from a URL and then use Google Search for a more in-depth understanding.

What makes Ivan’s script so elegant is how it combines this tool with the simplicity of Google Sheets. For each row, the script constructs a prompt for the Gemini API, asking it to analyse the content of the provided URL and find the specific information defined in the spreadsheet headers.
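As a rough sketch of that step (not Ivan's actual code), the request body for the Gemini generateContent endpoint might be assembled like this. The prompt wording is illustrative; the url_context tool entry follows the Gemini API documentation for the URL Context feature:

```javascript
// Sketch: build a Gemini generateContent request that enables the
// (experimental) URL Context tool. Prompt wording is illustrative.

function buildScrapePayload(url, headers) {
  return {
    contents: [{
      parts: [{
        text: "From the page at " + url + ", extract these fields as JSON: " +
              headers.join(", ")
      }]
    }],
    // Enabling the URL Context tool lets the model fetch and read
    // the page content itself.
    tools: [{ url_context: {} }]
  };
}

// In Apps Script the payload would be POSTed with UrlFetchApp, e.g.:
// UrlFetchApp.fetch(endpoint, {
//   method: "post",
//   contentType: "application/json",
//   payload: JSON.stringify(buildScrapePayload(url, ["Genre", "Rating"]))
// });
```

Because the field list comes straight from the spreadsheet headers, renaming a column is all it takes to change what gets extracted.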

The result is a system where modifying your data query is as easy as renaming a column. This turns a potentially complex scraping task into a simple spreadsheet edit, allowing anyone to create powerful, custom data importers.

To get the full details and the code to build this yourself, check out Ivan’s original post.

Source: Vibe Scraping with Google Apps Script and Gemini’s URL Context ~ kutil.org

Bring Your Data to Life: Smart Chip Read/Write Support Comes to the Sheets API

Last month, we introduced the ability to take basic actions to create and modify tables via the Google Sheets API. This week, we’re expanding the API functionality to include the ability to write and read file and people smart chips.

Following last month’s update for tables, it was recently announced that the Google Sheets API now allows developers to interact with file and people smart chips. This update gives developers direct access to manage these types of smart chips, which were previously only available through the user interface. For detailed documentation and code snippets on implementation, refer to the official Google Developers support page.

Here is a quick overview of what you can do:

  • Reading Smart Chips: You can retrieve data for all smart chips within a cell. The API response details the properties of the chipped sections of text and also identifies the plain-text sections of the cell. As noted in the documentation, to get a chip’s display text you must request the formattedValue field.
  • Adding Smart Chips: You can now insert smart chips for people and files into cells.
    • People Chips: These can be created to link to an individual using their email address. You can also specify the display format for the person’s name.
    • Rich Link Chips: These can be used to create a chip that links to a Google Drive file. Note that writing these chips requires your application to have at least one Drive OAuth scope.
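To illustrate, a batchUpdate request that writes a people chip into a cell might be built along these lines. Be aware that the field names here (chipRuns, personProperties, and so on) follow my reading of the announcement and should be verified against the Sheets API reference before use:

```javascript
// Sketch of a batchUpdate request body that writes a people chip into a
// cell. Field names are assumptions based on the announcement — check the
// official Sheets API reference for the exact shapes.

function buildPeopleChipRequest(sheetId, row, col, email) {
  return {
    requests: [{
      updateCells: {
        rows: [{
          values: [{
            chipRuns: [{
              chip: { personProperties: { email: email } }
            }]
          }]
        }],
        fields: "chipRuns",
        start: { sheetId: sheetId, rowIndex: row, columnIndex: col }
      }
    }]
  };
}

// Via the Apps Script Advanced Service this would be passed to something
// like Sheets.Spreadsheets.batchUpdate(body, spreadsheetId).
```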

The feature image above shows how a recruitment tracker could be built and read via the API. In the example, people chips are used for the ‘Applicant’ and ‘Interviewer’ columns, providing immediate access to contact information, while a file chip in the ‘Resume’ column links directly to the candidate’s document in Google Drive. A developer could automate this entire process: for instance, a script triggered when a new application is submitted could add a new row to the tracker, complete with the relevant person and file chips, streamlining the entire recruitment workflow.

For those using Google Apps Script, these new capabilities are accessible via the Google Sheets Advanced Service.

Source: Support for file and people smart chips now available with Google Sheets API

Automate Your Organisational Chart with Apps Script & Google Sheets

In this guide, I’ll show you how to build a data-driven organizational chart based on Google Sheets data, using Apps Script and the OrgChart JavaScript library.

Recently, I used Gemini Deep Research to “find the most popular or useful Google Apps Script tutorials?”. There was no mention of AppsScriptPulse (rude, but if you are curious, here is the report :), but it did introduce me to the work of Joseph Petty on the GreenFlux Blog.

One tutorial, in particular, stood out for its practical application for many organisations: “Building A Data-Driven Organisational Chart In Apps Script”.

Creating diagrams, particularly organisational charts, often requires laborious design work and manual revisions. Petty’s tutorial directly addresses this common frustration by demonstrating how Google Sheets, Apps Script, and the OrgChart JavaScript library can produce dynamic, data-driven organisational charts that automatically stay current.

The Core Idea: From Spreadsheet Data to Visual Chart

The beauty of this approach lies in its simplicity and efficiency. Most organisations already maintain employee data, including supervisor information, in spreadsheets or databases. Petty’s method cleverly taps into this existing data source.

The tutorial guides you through several key steps:

  • Foundation Setup: Deploy a basic Apps Script web app, creating an index.html for the frontend and Code.gs for server-side logic (including a doGet() function).
  • OrgChart Library Integration: Introduce the open-source OrgChart JavaScript library, which uses a nested JSON structure to define the visual hierarchy.
  • Data Transformation: Write an Apps Script function (getEmployeeData) to pull data from Google Sheets (employee IDs, names, titles, supervisor IDs) and convert it into the required nested JSON format for the chart.
  • Dynamic Connection: Pass the generated JSON data from Code.gs to the OrgChart library in index.html using google.script.run.withSuccessHandler() to render the chart.
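The data-transformation step — flat rows into the nested JSON that OrgChart renders — could look roughly like the sketch below. The column order (id, name, title, supervisor id) and the node field names are assumptions based on the tutorial's description, not Petty's exact code:

```javascript
// Sketch: convert flat employee rows (id, name, title, supervisorId)
// into the nested tree structure an org-chart library can render.
// Column order and node field names are illustrative assumptions.

function buildOrgTree(rows) {
  var nodes = {};
  // First pass: create a node for every employee.
  rows.forEach(function (r) {
    nodes[r[0]] = { name: r[1], title: r[2], children: [] };
  });
  var root = null;
  // Second pass: attach each node to its supervisor's children.
  rows.forEach(function (r) {
    var supervisorId = r[3];
    if (supervisorId && nodes[supervisorId]) {
      nodes[supervisorId].children.push(nodes[r[0]]);
    } else {
      root = nodes[r[0]]; // no supervisor => top of the chart
    }
  });
  return root;
}
```

In the Apps Script version, a function like the tutorial's getEmployeeData would read these rows with SpreadsheetApp and hand the resulting tree to the frontend via google.script.run.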

The Result? An Always Up-to-Date Org Chart

By following these steps, you can create an organisational chart that’s directly linked to your Google Sheet. When an employee changes roles, or a new team member joins, simply update the spreadsheet, and your web app will reflect these changes automatically. No more manual graphic updates!

For the complete code and detailed setup instructions, please follow the link to the original tutorial on GreenFlux Blog.

[Update 2025-05-16: A great tip from Pablo Felip Monferrer: “might be worth mentioning that the good old Sheets has a native organizational chart that can be easily published, too (hint: employee names in a third column can be displayed as tooltips)”.]

Image credit: Pablo Felip Monferrer

Source: Building A Data-Driven Organizational Chart In Apps Script