Posts

More Flexible Test Databases in Flask

I've been longing for an easier way to manage test data in Flask. Specifically, when running automated tests, I wanted an easier way to populate a database with some known values which would then be used in the tests themselves. This turned out to be trickier than I thought, but I learned a bunch along the way and I'll share that process in detail here.

Why I needed test data

I tend to focus on integration tests - I'm interested in how the application takes in requests and returns responses. Having known test data in the database lets me define expected results easily. I know what types of responses I should get from each route, and dynamically loading data from a JSON file lets me define those results quickly, over and over.

Up until this point, I would create database objects like normal, using a model constructor:

import unittest

from myapp.extensions import db  # assuming db lives in an extensions module
from myapp.models import Event, User


class MyTestClass(unittest.TestCase):
  def setUp(self):
    user = User(name="My name", email="myname@example.com")
    user2 = User(name="Another name", email="another@example.com")

    event = Event(title="Some event")

    db.session.add_all([user, user2, event])
    db.session.commit()

  # the rest of the tests


class AnotherTestClass(unittest.TestCase):
  def setUp(self):
    # do the same thing...
    pass

The problem with this is that it is extremely repetitive. Each test (or each TestCase class) has its own database declarations which have to be loaded when the test is run. That means I'm either typing each record for each test or copy/pasting items between tests. If my routes ever change, I have to update each test as a result, which is no fun.

Using libraries

I came across two libraries, but neither solved my problem, each for a different reason.

Flask Fixtures is a library which allows you to run unit tests based on JSON representations of your data. It takes in a list of JSON files and then populates an in-memory SQLite database. I tried this method, but the library hasn't been updated in several years and didn't play well with Flask's application factory pattern.

Factory Boy was also recommended, and while tempting, I needed consistent data in memory to run tests against. That said, I'll probably come back to Factory Boy for generating large data sets where I have more freedom in how I test.

I like the pattern of using JSON to populate a test database on the fly. I ended up writing my own, much simplified, version of Flask Fixtures.

JSON structure

I followed the pattern in Flask Fixtures because it provides a clear, extensible way of loading data into the application.

[
  {
    "table": "user",
    "records": [
      {
        "id": 1,
        "name": "Admin",
        "email": "admin@example.com",
        "usertype_id": 1,
        "location_id": 1
      }
    ]
  }
]

Each file can be expanded as necessary, adding new items or new files to expand the test database scope on the fly. These files live inside /test/fixtures in my project tree.

Dynamically loading test data

Instead of defining database records at the start of each test, I now define records in JSON files which can be loaded on demand within a test or set of tests. The biggest change in my app structure was to handle application context appropriately.

A new Loader class is instantiated with the current application instance, database, and a list of fixtures to load into SQLite. It only runs within the current context, so I can control when loading happens within individual tests, even loading data after the setUp function has run.

import json
import os
import unittest

from sqlalchemy import Table

from app import create_app  # assuming an application factory in app/__init__.py
from app.config import TestConfig  # adjust to wherever your config lives
from app.extensions import db

class Loader(object):
    """
    Reusable class for loading fixture data into test databases.
    Initialize with an in-context application and database engine.
    """

    def __init__(self, app, db, fixtures):
        self.app = app
        self.connection = db.engine.connect()
        self.fixtures = fixtures
        self.metadata = db.metadata

    def load(self):
        for filename in self.fixtures:
            filepath = os.path.join(self.app.config["FIXTURES_DIR"], filename)
            with open(filepath) as file_in:
                self.data = json.load(file_in)
                self.load_from_file()

    def load_from_file(self):
        # Insert every table entry in the file, not just the first one.
        for entry in self.data:
            table = Table(entry["table"], self.metadata)
            self.connection.execute(table.insert(), entry["records"])


class MyTest(unittest.TestCase):
    def create(self):
        self.app = create_app(TestConfig)

        # Build the database structure in the application context
        with self.app.app_context():
            db.init_app(self.app)
            db.create_all()
        return self.app

    def setUp(self):
        self.app = self.create()

        # Set up the application context manually to build the database
        # and the test client for requests. Keep a reference so it can
        # be popped again in tearDown.
        self.ctx = self.app.app_context()
        self.ctx.push()

        self.client = self.app.test_client()

        # Include any data to be loaded into the database
        fixtures = [
            "events.json",
            "users.json",
        ]

        # Now that we're in context, we can load the database.
        loader = Loader(self.app, db, fixtures)
        loader.load()

    def tearDown(self):
        db.session.remove()
        db.drop_all()
        self.ctx.pop()

Main takeaways

My biggest frustration was figuring out application context. This update to my test runner included moving to an application factory pattern, so I had to rethink how everything ran from the ground up. In the main application, context is handled by the create_app function and I didn't have to think about what context was active. In the tests, that has to be done manually with each instance. Moving the app context setup into setUp ensured only one context was used at a given time.

Being self-taught, I do my best to apply best-practice principles like "don't repeat yourself" (DRY). The repetition was especially noticeable as the number of tests increased, and I'm happy with this solution. There's still some boilerplate for each test case, and one of my goals is to wrap that up in a unittest.TestCase subclass so I can inherit the boilerplate rather than type it out. I'm still working on the best way to do that for my use case.
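As a sketch of that subclass idea (using the stdlib sqlite3 module in place of Flask-SQLAlchemy so the pattern stands on its own; all table, column, and class names here are hypothetical, not the app's actual code):

```python
import json
import sqlite3
import unittest


class FixtureTestCase(unittest.TestCase):
    """Holds the boilerplate; subclasses only declare their fixtures."""

    # Subclasses override this with JSON strings (or load files instead).
    fixtures = []

    def setUp(self):
        self.conn = sqlite3.connect(":memory:")
        self.conn.execute("CREATE TABLE user (id INTEGER, name TEXT, email TEXT)")
        for raw in self.fixtures:
            for entry in json.loads(raw):
                records = entry["records"]
                columns = ", ".join(records[0])
                marks = ", ".join("?" for _ in records[0])
                self.conn.executemany(
                    f"INSERT INTO {entry['table']} ({columns}) VALUES ({marks})",
                    [tuple(r.values()) for r in records],
                )

    def tearDown(self):
        self.conn.close()


class UserTests(FixtureTestCase):
    # Inherits all of the setup; only the data is declared here.
    fixtures = [
        '[{"table": "user", "records": ['
        '{"id": 1, "name": "Admin", "email": "admin@example.com"}]}]'
    ]

    def test_user_loaded(self):
        row = self.conn.execute("SELECT name FROM user WHERE id = 1").fetchone()
        self.assertEqual(row[0], "Admin")
```

The same shape works with Flask: the base class would push the app context, create the tables, and run the Loader, leaving subclasses to list only their fixture files.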

This is probably one of the more complex problems I've had to solve on my own. The application itself is just a layer to interact with database records, so the logic itself isn't too complex. Writing my own module to handle the automated work was new and I'm happy with the result. I'm hoping to be able to expand on it and eventually (maybe?) package it up into something I can import and use in some other projects. But that's another task for another day.

Dune Hikes

In September, it was my family's turn to battle COVID-19. We're not sure who brought it home first, but it hit me first. Hard.

Not as hard as a lot of other people, but I was flat for a couple days battling high fevers and severe muscle and joint aches. Luckily, I didn't have any of the breathing issues that have hurt so many people. Following me, my wife and kids all came down with positive tests, so we settled in for - what we hoped - would be a relatively short infection period.

Unfortunately, our tests lingered positive. We stayed away from family and friends. We weighed how to safely get groceries (we're pretty rural, so delivery isn't really viable), and how to pass the time. The hardest part was when we were all feeling better, but still testing positive.

We ended up finding ways to be outside together away from people. We took trips to local creeks and parks. We spent time working in the garden on the farm, preparing for winter. One of the best trips was to Warren Dunes State Park about 30 minutes north of here.

The sand stretches in every direction.

The kids hadn't ever been to the dunes, so this was a treat for us. In the parking area, there is a monster dune immediately across the road and they started running to the top. By the end, they looked like Everest hikers, stopping every couple of steps to catch their breath due to the actual height and an incredibly steep grade.

Warren Dunes is nearly 2,000 acres and has public camping access, so there are trails all over the dunes to different campsites. They weave down, in, and around natural trails through the scrubby dune grass so you move from full sun to shady and change elevation quickly (if you want to). There are also several stretches that cross the ridges of adjoining dunes, so you can move across the area with amazing views out across the lake.

We chose to stay on top instead of going down and then back up.

We've lived in small-child land for a very long time. This was the first time our three year old was almost as independent as his older sisters. He wanted to walk, slide down hills, and climb with the older kids. My wife and I were actually able to walk and talk together while the others explored.

I think this is the start of the next phase of family life - one where we can begin to set and break boundaries at the same time. It also makes me realize how many of the childhood memories I have were created by my parents - and we need to do the same. We're looking forward to protecting our family time so we can get out and explore more frequently.

No agenda other than showing up.

Dealing with Bad Questions

...not all questions are created equal and some questions inhibit learning.

Source: Yes, There is Such a Thing as a Bad Question — Teachers Going Gradeless

This is a great look at how our practice of teaching needs to change if we want students to think differently about school. The author writes from the perspective of doing this as a result of going gradeless, but the same habits of instruction can be used to make the same shift if gradeless isn't an option.

...I find that students approach me with a different question; “Can you show me how to do this?” Since my students are self-assessing their work regularly throughout the term, they always have a strong grasp of where they have shown understanding and where they are developing their understanding.

The same thing often happens in standards-based approaches to grading or even just focusing on feedback to drive your interaction with students. The critical component is to hone your own responses to questions to point students back to the main goal: learning.

Break's End

The end of summer break is always a little weird. I have a shorter vacation than normal because of the nature of my job, but going back to work at the end of July still feels...too soon.

Brian is smiling in the foreground while his family is crouched, selfie-style, behind him. They are on the beach with the sunset in the background.

I'm thankful for my breaks. It seems strange in the US to get such a large block of time away from work and I'm getting better at actually taking a break.

I originally titled this "Summer's End," but I'm reminding myself, like I reminded my kids last night, that going back to work doesn't mean summer is over. It just means our schedule changes a little bit.

I'm still thankful for that summer break.

Coming Back Around

This week, my wife pointed something out to me: our entire dinner was either something we grew or something we helped raise. Vegetables from our garden. Pork from working with my brother-in-law and his animals.

Time spent in the ground and with the ground is never wasted.

I was watching a farmer process chickens when learning how to do our own and he said something that has stuck with me.

Something had to die for you to live today.

He meant animals, but the same is true for our vegetables. We work hard to make sure we grow and raise food well so that we can appreciate the value it brings to us when it comes time to harvest. We're starting to understand that more. My kids might understand it better than I do.

We're early in our work to establish a sustainable space, and I'm looking forward to a time when more of our dinners are raised here, at home.

Cloud Power

Summer brings some pop-up storms at the end of the day when the humid air starts to cool. They often form over Lake Michigan to the west and slowly move east, reaching up until they flatten.

Looking west from the house, we get to watch these spectacular white towers slide across the landscape. Oftentimes, we hear the rain before we feel it.

Clouds tower in the west.

The late day storms are my favorite because the setting sun is still able to backlight the delicate edge of the cloud, reminding me that storms come and go.

Recent Happenings

We're in the throes of summer and there have been a number of events, ranging from new construction (because there is never enough stuff to build) to beekeeping and birthdays. Here are a few highlights.

This guy turned three.

The honeybees are going gangbusters.

Chores with the animals usually include a pit stop at the swing set to get some wiggles out.

We're building a barn.

I'm not posting to social media much. I post some to Pixelfed and, even less frequently, Instagram.

Adventures in Building an Interactive Apps Script Sidebar

Now that I've finished a rewrite of an application in HTMX, I wanted to see if HTMX could be used to enhance the use of Google Apps Script sidebar interfaces. I build these from time to time at work to help with spreadsheet interaction that goes beyond simple formulas. The idea is to allow for more dynamic interactions in the (very limited) sidebar available through Google Apps Script.

HTMX: This Won't Work

I started by adding HTMX directly and had no luck. The main issue is that HTMX uses XMLHttpRequest to fetch data (https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest), and that leads to CORS issues. When HTMX starts a request, a preflight check is made because the request includes the upload parameter, which is considered "not simple" by the browser. This preflight check has to pass for the request to proceed.

The preflight fails because of the redirect between the published Apps Script /exec endpoint and the googleusercontent URL where the actual code is served. HTMX also adds custom headers to the request, which likewise makes the request "not simple," so the preflight check fails with a CORS error.

So, I tried getting around the CORS issue by issuing async fetch calls, but that just adds a layer of complexity rather than solving a problem. It also confirms the fact that browser engineers are much smarter than me because it still didn't work.

All that said, fetching content directly from a sidebar won't work, let alone posting data to the backend.

Hyperscript to the Rescue

Hyperscript is a companion to HTMX which offers similar functionality, just in a different context. It's a lightweight scripting language with Python-ish syntax that you include as a "_" attribute on HTML elements. You can add different handlers and listen for events just like with HTMX, so clean interactions are still possible.

Using Hyperscript, you can take advantage of the google.script.run API to interact with your Apps Script code. This allows for anything from simple dynamic content replacement all the way up to accepting input from a user to query or update data in the sheet through the sidebar.

The Hyperscript cookbook has some examples, so let's look at how to implement it in an Apps Script context for some real-world application. All of the code to try it yourself is in this GitHub repo. You can also make a copy of this sheet to get your own version.

Getting started

There are some utility functions we need to get the project started. I'm greatly indebted to Bruce McPherson and his extensive writing on Apps Script project structure. In your Code.gs file, we'll create some global functions to allow us to work more effectively between the Apps Script code and the client.

// Code.gs

// Allow the client to access your Apps Script functions dynamically.
function exposeRun(namespace, method, argArray) {
  var func = (namespace ? this[namespace][method] : this[method]);
  if(argArray && argArray.length) {
    return func.apply(this, argArray)
  }
  else {
    return func();
  }
}

// Utility function to include other files in HTML templates
function include(filename) {
  return HtmlService.createHtmlOutputFromFile(filename).getContent();
}

// Trigger the menu to open the sidebar
function onOpen() {
  const ui = SpreadsheetApp.getUi()
  ui.createMenu('Menu').addItem('Run', 'showSidebar').addToUi()
}

// Display the sidebar
function showSidebar() {
  const html = HtmlService.createTemplateFromFile('template/_base.html').evaluate()
      .setTitle('The sidebar');
  SpreadsheetApp.getUi() // Or DocumentApp or SlidesApp or FormApp.
      .showSidebar(html);
}

I'll also use Bruce McPherson's Promise-based wrapper for working with Apps Script as a starting point. All of our requests will go through this method:

// static/Runner.js.html
<script>
  var Runner = (function(ns) {

    ns.run = function(namespace, method) {
      let runArgs = Array.prototype.slice.call(arguments).slice(2);

      if(arguments.length < 2) {
        throw new Error('Need at least a namespace and method.')
      }

      return new Promise(function(resolve, reject) {
        google.script.run.withFailureHandler(function(err) {
          reject(err)
        }).withSuccessHandler(function(result) {
          resolve(result)
        }).exposeRun(namespace, method, runArgs)
      })
    }
    return ns;
  })(Runner || {})
</script>

Templates

Now that the boilerplate is done, we need to start defining some worker classes and templates.

To keep things clean, I wrap each of my operations in an IIFE object which defines methods and the templates to return with each completion. This means there are more files to manage in the code editor, but each one encapsulates functionality cleanly and is easier to maintain.

Make the Apps Script handler:

// SimpleSwap.gs

var SimpleSwap = (() => {

  const htmlTemplate = () => {
    let html = `
    <p
      class="active"
      _="on click set my innerHTML to 'Clicked!'"
    >Click me!</p>`;
    return html;
  }

  return {
    htmlTemplate
  };
})();

And finally our base HTML template for the sidebar:

// template/_base.html
// This is the sidebar wrapper. Content will be inserted dynamically.
<!DOCTYPE html>
<html>
  <head>
    <base target="_top">
    <script src="https://unpkg.com/hyperscript.org@0.9.5"></script>

    <!-- Our promise-based runner for google.script.run requests -->
    <?!= include('static/Runner.js'); ?>
  </head>
  <body>
    <main>
      <div class="main-container">
        <div class="sample">
          <b>Dynamic insertion and interaction</b>
          <!-- Hyperscript to interact with the Apps Script code -->
          <button
            _="
              on click
              call Runner.run('SimpleSwap', 'htmlTemplate')
              then put the result into #target
              then call _hyperscript.processNode(#target)
            "
          >trigger</button>
          <!-- This receives the result of the request -->
          <div id="target"></div>
        </div>
        <!-- other divs... -->
      </div>
    </main>
  </body>
</html>

This template will:

  • Allow us to access the Runner middleware to marshal API calls.
  • Click a button to run a bit of code.
  • Put the result of the Apps Script code into the #target div.
  • Initialize any Hyperscript included in the template to allow for interactivity following the swap.

The key in this method is to think through what interaction you want your template to have and to include that in the hyperscript attribute.

A More Complex Example

Let's say you want to make an Apps Script Extension (formerly "Add-on") or container bound script which allows you to fetch data from an API and then selectively insert results into your sheet. You can do that with Hyperscript inside the Apps Script sidebar quite cleanly. We'll keep the same boilerplate code but define a couple of functions to build a quick sample. I'm going to use the handy Star Wars API as the data source.

Although you can use Hyperscript to get data directly (via the fetch command, https://hyperscript.org/commands/fetch/), templating the response in the script gets messy, especially if you want to have actions on the results of the fetch request. To clean this up, we'll take advantage of Apps Script's UrlFetchApp class (https://developers.google.com/apps-script/reference/url-fetch/url-fetch-app) and another HTML template fragment.

Start by adding a button to a sidebar:

Get Star Wars characters

Now, we'll create our IIFE:

// SWAPI.gs
var SWAPI = (() => {

  // We'll use these sheet params to interact with the spreadsheet.
  const ss = SpreadsheetApp.getActiveSpreadsheet();
  const sheet = ss.getSheetByName('Sheet1');

  const getPeople = () => {
    let request = UrlFetchApp.fetch('https://swapi.dev/api/people').getContentText();
    let json = JSON.parse(request);

    // Apps Script templates can evaluate data dynamically. Call the
    // template fragment and then define a parameter on the object you
    // want to access.
    // https://developers.google.com/apps-script/guides/html/templates#pushing_variables_to_templates
    let template = HtmlService.createTemplateFromFile('template/swapi-list')
    template.people = json.results;

    // Evaluate the template and then get the resulting HTML to return.
    let html = template.evaluate().getContent()

    return html
  }

  const saveName = (name) => {
    sheet.getRange(sheet.getLastRow() + 1, 1).setValue(name);
  }

  return {
    getPeople,
    saveName
  }

})();

And lastly, our new template fragment which handles looping over the results as part of the evaluation step.

<!-- template/swapi-list.html -->
<!-- Read the docs on Apps Script template evaluation if you
     haven't looked them over before. They can be quite helpful
     https://developers.google.com/apps-script/guides/html/templates
-->
<? for (var i=0; i<people.length; i++) { ?>
  <span class="active"
    _='
      on click call Runner.run("SWAPI", "saveName", "<?= people[i].name ?>")
      then remove me
    '
  >
    <?= people[i].name ?>
  </span>
<? } ?>

To help understand what's happening, we are:

  • Rendering a sidebar with a button to fetch results from a third party using UrlFetchApp.
  • Allowing the Apps Script templating engine to handle rendering results.
  • Collecting the resulting HTML string and sending it back to the client.
  • Swapping the HTML result into the DOM with Hyperscript and re-initializing the new elements.
  • Clicking a name to add it to the spreadsheet and remove the option from the page.

Here's the result:

In the template, we define Hyperscript actions on each element so they also become interactive in the sidebar. Clicking on a name calls the saveName function and adds the value to the next available row in the sheet before removing itself from the sidebar.

Is it worth it?

This seems like a ton of work that could be achieved with out-of-the-box Javascript. So, is it all worth it?

It depends.

If you have minor interactions here and there, it might not be worth adding the extra attributes or taking time to create template fragments. DOM interactions can be pretty simple if you're just fetching and displaying data.

The real power of Hyperscript comes in locality of behavior and in making interaction plain in the HTML and not burying those actions in script files and event listeners. In the advanced example, I think Hyperscript is worth the effort because it is easy to see exactly what interactions exist on which elements.

Apps Script is notorious for weird behavior just because of the platform. Adding Hyperscript as a tool to manage interaction and behaviors can help identify bugs sooner because you - the developer - have a better idea of which interaction causes which behavior in the application.

If you're on the fence, take some time just to play around with simple swaps like I showed in the first example. Once you have the hang of writing behaviors on elements rather than in event handlers, some of the benefits will start to emerge.

Moving from Svelte to HTMX

Last year, I built an event registration tool for our district, mainly to keep track of what professional development we were doing and try to get paper out of the workflow. I chose to use Flask and Svelte for this project and this year, I decided to move away from Svelte and rebuild the application using HTMX. In this post, I'll explain why I decided to make the change and highlight some situations where HTMX just makes more sense for a large app managed by a single person.

Making a Switch

I'm self-taught. I started playing with Internety things in the late 00s, starting with WordPress and straight HTML/CSS. I added Javascript slowly and have since managed to cobble together some kind of useful tools. Mostly for myself, but some to benefit others.

I had tried taking a dive into React, Meteor, and Vue, all with little or no success. The complexity of the frameworks and the abstraction needed to get stuff to show up on the page went way over my head, especially as a hobbyist. Adding to the complexity was the requirement to use build systems (I know Vue can be added directly with script tags, but that's not where I was introduced to it), and those felt untouchable.

Over the 2019-2020 school year, I got what I'll call a "working prototype" of an events registration system published using Firebase, some HTML, and a lot of vanilla Javascript. That pushed me closer to understanding build systems and when Svelte came along, its syntax finally felt familiar - just some template tags with scoped Javascript and CSS. Not too hard to handle.

I had a published version of the registration site by fall 2021 and it worked well for a first "real" project. Python did the heavy lifting in the background and I was able to have a nice interaction on the front with Svelte. So, why the move?

There are two main reasons:

  1. It's essentially an application to create and maintain records. I don't need elements to be running continuously (ie, a video player) while the user does things. Having a Javascript-built application was overkill for the purpose of the site. Rich Harris (creator of Svelte) elaborates on this in a podcast on when to use single-page applications over multiple-page structures.
  2. Maintainability of large Javascript-based systems is very difficult. Granted, I'm an amateur and this was my first project, so there are things I can certainly do better to take advantage of the systems, but I found myself reluctant to touch the app for updates because it would take me so long to untangle how everything worked together.

With those issues in mind, that's why I decided to move to HTMX.

How is HTMX Different?

I started following Carson Gross, the author of HTMX (formerly IntercoolerJS) and reading some of his essays, arguing for a return to hypermedia as the main driver of the web rather than Javascript. For my background, this made a lot of sense. Servers exist to serve content. The modern Javascript frameworks were created to allow for non-full page refreshes of content. HTMX blends those two goals, allowing the server to send content to the client and allowing for dynamic and strategic (even surgical) updates to the page the user is on.

This provides instant benefit in several ways:

  • The client no longer has to manage state. The server already knows the application state, so why not just send content that is stateful based on the user session (ie, logged in vs not logged in)? Using Flask sessions makes this very simple.
  • Templating engines can still be used for dynamic content creation. Each template contains the content it needs and doesn't have to re-wire itself with the other items on the page once it loads.
  • Specific elements can be updated as a result of a user action. The user interaction is clean and does not require any full-page refreshes, but without any of the messy Javascript workarounds to make that kind of interaction possible (looking at you, shadow DOM [whatever that means]).
  • Each request returns everything the user needs. You're not required to make more network calls to get the data necessary for the view or template (I think that's what "hydration" is).
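To illustrate the first point, here's a minimal sketch of server-held state with Flask sessions (the routes and markup are invented for this example, not the actual app's code): the server decides which HTML fragment the client gets based on the session, so the client never tracks logged-in state itself.

```python
from flask import Flask, session

app = Flask(__name__)
app.secret_key = "dev-only-secret"  # assumption: use a real secret in production


@app.route("/login", methods=["POST"])
def login():
    # A real app would check credentials; this just flips the session flag.
    session["logged_in"] = True
    return "<span id='status'>Welcome back</span>"


@app.route("/status")
def status():
    # The server already knows the state, so it renders the right fragment.
    if session.get("logged_in"):
        return "<span id='status'>Welcome back</span>"
    return "<button hx-post='/login' hx-target='#status'>Log in</button>"
```

An HTMX-enabled page would render that button, and clicking it would POST to /login and swap the returned span into place, with no client-side state to reconcile.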

With any Create/Read/Update/Delete (CRUD) tool, form rendering is a common task. I want to dive into the difference between doing this in Svelte and HTMX as an example of why I decided to make the move.

Form Rendering

In Svelte, I created a FormWrapper component which would dynamically handle input element rendering and form submission. This pattern was extremely helpful and I actually took some of the principles I learned and applied them to HTMX. The snarls came in determining what actually needed to be rendered.

A simple example is this view: when the user clicks on one of the action buttons, a form is rendered. Depending on the action, the form is different. In Svelte, you need to first render the form and then fetch the fields - two network calls to render. It works this way because Flask is serving JSON - agnostic structured data rather than data which carries all the information it needs in order to render.

Svelte form wrapper

So, a couple hundred lines of code to render a form. Reusability is good, but when it came to editing a form, it got gnarly really quickly. The main drawback was that for the form to even render, it required several more requests to the backend to get the fields necessary. Because the backend was just a JSON cannon, even the data it returned needed to be mapped and filtered into a usable state.

HTMX form rendering

With HTMX, that form is pre-templated and then sent from the server. Instead of a single endpoint returning all the form data, individual endpoints can send back whatever form is necessary as HTML. There is no need to request more information or process the data in the client to make it usable.

Because of Flask's templating engine and thanks to the extremely helpful Jinja partials extension from Michael Kennedy, I was able to mimic the dynamic action where a temporary sidebar is used to load and display the form.

The obvious tradeoff is that there are more files to maintain. But because each file is single-purpose, they're smaller and more focused, which means maintaining is much easier. It's also much more clear what is being returned by each operation rather than firing the JSON cannon and then untangling everything on the client.
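A hedged sketch of what one of those single-purpose endpoints might look like (the route, field names, and inline template string are made up for illustration; the real app keeps fragments in separate partial template files):

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# In a real app this fragment would live in its own partial template;
# it's inlined here to keep the sketch self-contained.
EVENT_FORM = """
<form hx-post="/events" hx-target="#sidebar">
  <input name="title" placeholder="Event title">
  <button type="submit">Save</button>
</form>
"""


@app.route("/events/new")
def new_event_form():
    # Return just the HTML fragment; HTMX swaps it into the sidebar,
    # so no second request or client-side data mapping is needed.
    return render_template_string(EVENT_FORM)
```

One GET, one ready-to-render fragment: the "two network calls plus mapping and filtering" dance from the Svelte version collapses into a single round trip.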

The Takeaway

I'm very happy with the decision to make a switch. In the cutover, I've made some incremental improvements that were on the list of "someday" changes, but had felt overwhelming to try and attempt because of all the side effect potential. I'm also not saying that Javascript frameworks like Svelte are a terrible idea - writing this app in Svelte to start helped me learn how to build resilient and flexible backends.

For me, the value in HTMX comes back to maintainability and developer experience. The language of the web is HTML. Svelte was a good entry point for the initial build because its syntax was very close to plain HTML. But in the end, it still relied on Javascript to work and instead of feeling flexible, it felt brittle.

HTMX's approach to adding functionality via HTML attributes is much closer to the surface of normal web structure and gives just as much flexibility for 99% of what I need to do at a much lower complexity. I just enjoy working with HTMX because I can spend less time trying to figure out where an emitted event goes to make an update and more time making the tool more useful for my colleagues. That's a win every time.

Comments

Alan Levine

Thanks for blogging this Brian although my understanding level is hovering at 16% 😉 But something I did pick up on is what I want to be implementing more on some work projects, being able to create filterable tables of data, where you can provide a place to dynamically pare down tables of stuff to ones of interest.

I like AwesomeTables but you have to pay per app. And I tried the Google Dynamic Tables thing but it looked like it was built for something else- I get a sense HTMX might be a way. And like you, I prefer this HTML oriented approach.

Cheers!

Brian Bennett

Oh boy, you’re in for a treat. HTMX has a partner tool called hyperscript which allows for some lite client scripting. It’s made by the same author and it’s pretty much plug and play with HTMX. I’ve got it filtering tables (alphabetical clicking on table headers) as well as an active search on rows in large tables, all right there in the HTML. I’d be happy to send a demo if you think it’d be helpful.

Sort a Google Sheet with Vertically Merged Ranges

This is a quick tip, but if you've ever tried to sort a sheet with vertically-merged rows, you've probably been disappointed. Here's an example:

If you wanted to sort your sheet by Column A, you're out of luck. You can't do that automatically because Sheets doesn't know how to move blocks of unmerged cells even though they're right next to the merged range. Your best bet in that case is to highlight all of the rows in the merged range and drag & drop in the correct order.

Even though the gray line while you're dragging makes you think you can drop a merged range into another, you'll get an error if you actually try to do that.

This won't work for super-large data sets, so try to alphabetize as you go.

You can, however, sort ranges within selections, not just the entire sheet. So I can sort the names within each team to make life easier.

Highlight your range and right click to bring up the menu. At the bottom, hover over "View more cell actions" and then select Sort Range.

You can sort this selection how you'd like to at least have your subset data in the right order.

This is why I typically don't use merged cells when creating large spreadsheets. It makes it harder to move data around the way I'd like to.

That Time I Deleted a Database

I've been hobby programming for several years, and that has spilled over into making applications for staff at school. One of those tools is a website to manage PD events, signups, and documentation. This week, I accidentally deleted that database.

I don't want to downplay how bad this was, but it also isn't on the scale of losing student or financial data. But, it was roughly eight months of events, signups, and more importantly, participation records. Several things went wrong, all of which I should have caught at some point:

  1. My database migrations (history of changes to structure) didn't match between my computer and the server.
  2. I didn't double check the data in the database dump I had made before re-importing it.
  3. I had an artificial deadline in my head and I didn't slow down in the process of the changes.

We're constantly trying to follow best practices when it comes to accessibility. The change I was introducing allows staff to request accommodations on their registration so the presenter knows to do one thing or another. In addition to our general presenter guidelines, this helps make sure everyone's needs are met in a non-obstructive way.

I learned some things...

First, instead of trying to force the migration history to reconcile, I should have slowed down and fixed the root issue. Those migration files are critical to making sure everything moves around in a way that can be reversed and repaired. Instead of reconciling the differences, I made some quick changes to which steps were related to one another, and that contributed to the problem in the first place.

Second, I assumed the database dump from last night was good enough. I didn't check to see that it actually had good data in it. As it turns out, the dump was blank - and that means that when I reloaded it back into the database, I effectively erased everything. Since that was my only backup, there was no chance to roll it back to a previous state. So, now we'll be doing nightly backups as well as dumps right before migrations to make sure we have at least one good copy. I'll also be checking the file directly to make sure it holds information, period.

Third, I wanted to get the update in place. It didn't take too long to make and the code itself wasn't too complex. I wanted to say it was done and be able to move on. Instead, I ended up giving myself about 10 more hours of work to piece information back together and then work out ways to safely integrate everything back together. Deadlines are always flexible, especially when you're setting them yourself.

I did have one stroke of luck in this whole ordeal: when a session is created, a Google calendar event is automatically created. When someone registers for that event, they are also automatically added to that event as a guest. Since this only affected the site database and not the Google Calendar events, I was able to use some quick-and-dirty Google Apps scripts to restore users, events, and their registrations. The only thing I couldn't restore was attendance data.

Scripts

The first task was to get all the events on the calendar within a date range. I like using the Calendar v3 API because it gives me access to more properties on each Event that we can throw into a spreadsheet.

Once I had this sheet, I was able to do things like repopulate all users, extracting their data from the attendees string and using the Admin SDK to look up their name and location:
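The actual script isn't shown here, but the attendee-extraction step might look roughly like this sketch. It assumes the sheet stored attendees as a comma-separated string of email addresses; the real format may differ:

```javascript
// Hedged sketch: split a Calendar event's attendees string into individual
// email addresses. Assumes a comma-separated list like
// "a@example.com, b@example.com" -- the real stored format may differ.
function parseAttendees(attendeesString) {
  if (!attendeesString) return [];
  return attendeesString
    .split(',')
    .map(email => email.trim())
    .filter(email => email.length > 0);
}

// In Apps Script, each address could then be passed to the Admin SDK
// advanced service to look up the user's name and location.
```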

In this case, I got lucky that most of what I needed was available in the Calendar. Knowing what's available in Apps Script made this a partial loss rather than a total loss.


Systems of Trust and web3 in Education

This post has been percolating in my mind for several months. It started as an opus that sat stale in my drafts because I didn't want to wrestle the mess it was into something readable. Then I pretty much forgot about it.

In that time between writing a fiery essay and not really caring to write at all, I finally settled on why web3 doesn't sit well with me. And it comes down to systems of trust.

Set the stage

For the sake of clarity, let's get on the same page. web3 is a buzzword right now which posits that the "next" version of the Internet will run on the blockchain. It grew out of the rise in popularity of Bitcoin and other cryptocurrencies, the NFT weirdness happening, and techno-futurists betting on being a part of the "next big thing."

The platform dictates the conversation, and since education and technology are intimately woven together (for better or worse), there are articles and accounts popping up, prompting teachers to start thinking about how to make sure they're ready for the blockchain.

How did we get here?

Understanding how this all works is important to forming an informed opinion. Again, this is a very brief description, but here's a rundown of some of the history.

"Blockchain" is a technology which is essentially a history of something done that cannot be changed. Once something is on the blockchain, it's there forever. The history is public and everyone contributing to that chain can verify the record. The idea was developed in the late 1980s and early 1990s, but didn't really come into play until the publication of the Bitcoin whitepaper in 2008. Since then, blockchain technology has been seen as the next big thing in everything from currency to supply chain routing to vegetable freshness. If you're interested in a technical rundown of blockchain, this article was immensely helpful to me.

Bitcoin's big selling point is that you can have secure financial transactions if everyone can see the entire record. In simple terms, a blockchain is a database to which you can only add records, and only if everyone else watching the chain agrees. Instead of relying on a bank to tell you who paid what to whom, a network of computers manages the consensus model for any update made.

Imagine you're at a soccer game. Instead of there being one scoreboard which displays the current state of the game, every person in the stadium has a scorecard where they keep the game score. Normally, when a goal is scored, the authority (referees) update the scoreboard and the game continues. In our imaginary blockchain soccer game, any time there was a goal, every person in the stadium would have to agree on the new score based on that goal.

The idea is the same in the digital space. When a change is made to the blockchain, every computer involved in that network adds to the consensus of that change.

Systems of trust

Every structure in society is based on trust. I trust my employer will give me a check every two weeks. I trust other drivers on the road will stay on their side of the road. Blockchain technology moves trust away from people and into technology. There are certain benefits to trusting in systems (traffic lights being consistent, for example), but there are also drawbacks, especially when it comes to web3 and education.

At its core, our time with students is based on trust. Parents trust us to make wise decisions for their children. School systems trust teachers to implement the curriculum with depth and rigor. Students trust us to watch out for their best interests. Trust is built into every interaction in the school day and relationships are the backbone for why most of us are teachers.

web3, on the other hand, asks us to trust in the distributed blockchain network. Trust is in the system rather than the person. For a blockchain to work, you need a lot of people (computers, really) to trust one another. In simple examples, it sounds like a great idea. That student actually turned in that assignment.

But when we get into the nuance of some of the proposed benefits of web3, you lose the relationship aspect of teaching and learning completely. Is this their best work? Is this actually their work? We see this already with surveillance tech making its way into classrooms across the country - systems which assume the student is an adversary before the first day of class.

With relational trust systems, there are avenues for disagreement. Dialogue and relational history come into play when we're judging student work. With web3, we're relying on a distributed network of computers to make a judgement call. If that is the wrong judgement (or a false judgement - theft and fraud still happen with Bitcoin), there is no avenue for recourse.

Permanent scars

Trusting in a system which cannot be revised may have some fringe benefits, but at the end of the day, we want our students to grow. We want students to be better citizens today than they were yesterday. With an educational web3/blockchain, any past mistakes are there forever as signposts of bad judgement. Rather than giving students an opportunity to describe lessons and what came of those situations, they're now an open record for anyone to interpret however they want. Systems of trust are better when people are involved, not when they're distributed.

Technology trends sweep through education just like they do through hedge fund and venture capital circles, albeit somewhat slower. This is an important enough topic to read and think through what implications could come as a result of being a part of the "next big thing." Because if we get this wrong, it's going to be pretty hard to go back.


Other reading

I spent a lot of time reading and chewing through resources to learn more about the underlying technology and some of the other problems that make blockchain a questionable solution to many of the problems it claims to solve. Here is a selection you may find helpful.


The featured image is Iterated Function System by Quasimondo found on Flickr. It is licensed under CC BY-NC.

Check for alt tags with a Bookmarklet

tl;dr

If you want a quick way to check if images on a webpage contain alt text (for screen readers), I made a bookmarklet that will blur images without that text.

Click and drag this guy up to your bookmarks bar. Click it on any website to blurify images that are inaccessible.

altChecker

Full story

Mike Nikels tweeted this last month and it's stuck in the back of my mind:

Still my favourite CSS trick in years. Thank you @estellevw!

I've been trying to be much better about including alt text in everything (thanks to Alan Levine barking about it all the time, too). I wrote a little script that runs in Tampermonkey, but that's a high barrier for entry. So is a Chrome extension.

So now it's a bookmarklet.

You can drag the link above into your bookmarks bar. Click it on any webpage to have any image without alt text blurred.
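For the curious, the core of a bookmarklet like this can be tiny. This is a hedged sketch rather than the published source (that's linked at the end of the post): a check for a missing or empty alt attribute, plus a blur applied to anything that fails it.

```javascript
// Hedged sketch of the bookmarklet's core check, not the actual source.
// An image is treated as inaccessible if its alt attribute is missing
// or is an empty/whitespace-only string.
function needsBlur(alt) {
  return alt === null || alt.trim() === '';
}

// In a browser, the bookmarklet would apply this to every image on the page:
// document.querySelectorAll('img').forEach(img => {
//   if (needsBlur(img.getAttribute('alt'))) img.style.filter = 'blur(8px)';
// });
```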

Here are some sites to try:

  • This blog. The first image below has alt text, the second one doesn't (just to prove it works).
  • YouTube.
  • Any news website.
  • Your favorite cooking blog.
A book on a table, opened in the middle with the pages fanning out. https://blog.ohheybrian.com/wp-content/uploads/2020/02/49092969911_983248feea_c.jpg

Alt text is how images are described for people who have vision impairments. By not providing the text, they immediately lose out on whatever you're trying to say.

It's also important to remember that alt text is different from the little text you see when you hover over an image. That comes from the title attribute and only displays on hover. Often, the text is the same. But sites which show a title don't always include alt text as well. YouTube in particular becomes a very different experience.

Anyways, more of a little helper to keep myself from slipping. Try it out. The source is here if you want to take a look or improve.

Comments

Alan Levine

Thanks for making this, it is really effective, Plus you are saving me the trouble of figuring out Tampermonkey.

Suggestions/Wishes:

it catches img where alt is empty but not one completely lacking an alt. I believe this might work to cover both (?):


         img[alt=""], img:not([alt]) {
         ....
         }
      

If I were to really dream, I’d like to toggle it on and off. Maybe an option click can undo the blur?

Because Everything is a Priority...

My job for the last six years has been to identify and provide professional development for staff. We set goals, looked at upcoming challenges, and worked toward meeting the needs on the horizon. In that, the team I was on worked hard to infuse practice into the skills we were teaching. It was a different paradigm and we made some huge progress in a short amount of time.

Then COVID-19 raged across the planet and threw us into a spin. Given all the unknowns, one thing was very clear: we had to train staff and build skill. Fast.

And we did.

Time was given by the district to focus on staff development. They realized that staff would need much more support in the new, online environment. My partner and I worked hours and hours, alongside the teaching staff, to provide helpful materials, workshops, templates, and resources. Teachers had time and space to learn and grow even though things were upside down.

Time and space.

Time and space were given for those shifts to take place. It was a deliberate choice to make room for those things to happen. And because we had time and space, we came out of our 18 months online in much better shape than we would have otherwise.

Now, we're back in school full time.

Coming back to school, we made the deliberate decision to remove that time and space. Our schedule lacks any time for teachers to work together. We have a rigid structure of required work (I don't want to dignify it as "professional development") that amounts to checkboxes on a form. Back to business as usual.

I feel like I've been shifted over into accounting. Time spent doing professional learning is finally paid (thanks ESSER) and instead of setting a high bar and making it relevant, we're reading prompts and watching videos. "Answering" questions and verifying that people have clicked submit. Head knowledge is provided, but there is no expectation or mechanism to work on practice. There is no support or system to make sure what we're taking time to do is being used.

Why?

Because those things don't require time and space.

Not at school, at least.

On the face, our core departments are doing slightly better at communicating, but we have different groups prioritizing different things all in the same timeline. We're lacking cohesive leadership and vision setting. And because everything is a priority, nothing gets our full attention.

Instead of diving in and exploring meaningful, powerful change and merging what we learned while remote with our excitement about actually being with students, we said, "nah."

So, what's there to be done?

On one hand, I can put my head down and do the work given, trusting that people up higher have a better vision for what's needed. I can also listen to the staff I've built relationships with, remembering that the time and space they get is sucked up by other things. Even if we had time and space, PD probably still wouldn't be at the forefront.

On the other hand, I could do more and think outside the box.

We are still in an emergency, without a doubt. But I'm tired of letting the emergency dictate what we do.

I still don't know what this looks like day to day, but it's more interesting than what I'm doing day to day now. Even though everything is a priority and we're lacking the traditional time and space, there is room for good work to be done.

And I'm planning on trying to do it.


Shatura narrow gauge railway, Misheronskiy by trolleway is licensed under CC BY

Using Google Apps Script as a Webhook

Google Apps Script projects can be published as web apps when you’re done with them. This is helpful if you’re building a tool with a simple UI (using HTMLService), but they can also be used as webhooks to do work within a user account.

Google provides the option of service accounts, so why use webhooks?

For my project, it was because I needed to send calendar invites to other people. Service accounts are generally the way to handle those kinds of jobs, but that would require full account delegation, which means the service account can act on behalf of any user in the domain. That wasn't an option in this case, so a webhook became the next best option.

What is a webhook?

In simple terms, it's an application that does something when it receives a standard HTTP request. Webhooks generally work on a subscription model: the webhook sits in the middle, listening for an HTTP call from one service and then emitting another action in response.

For this example, I have our events management app running in Python on a local server. When there’s an action requiring an update to the calendar, it hits the Apps Script webhook and the script does some work before returning a response to the events app.

The Code

In Google Apps Script, doPost and doGet are functions which respond to POST and GET requests respectively. Because I’m calling the script from the server with a payload for the calendar event, I used doPost to listen for events.

The simplest hook you can set up is a function which listens for a POST request from somewhere and returns a response:

```javascript
function doPost(e) {
  return ContentService.createTextOutput(JSON.stringify({ "message": "Yo." }));
}
```

When you deploy the script as a web app, make sure it is set to “Anyone” can access but that the script runs as you. And that’s where the magic is.

With a traditional service account, the script needs access to act as another user. There's no way to limit that permission to a single user from the admin console, which is why I couldn't take that approach with my project. In this case, the webhook is triggered by a request and then runs as the scoped user.

Using this method, we're able to achieve the same permissions but without giving it access to every other user in the domain.

A Quick Note on Deploying

Update 11/29/2021: Many thanks to Joseph for sharing how to update a deployment without generating a new URL. This section of the post remains because it is a good example of how not to manage deployments.

The pain you mention about deployments and having to change the exec URL in the other codebase can, I believe, be removed by the method described in this post. I have used it several times since I discovered it. It's very useful for situations where you've already embedded the exec URL in a Google site or elsewhere.

With the new Apps Script editor, deploys are a pain. You can't use the dev endpoint to test the hook (I don't know why), meaning there's no way to execute the HEAD version of the script in this instance. So, to test your changes, you need to deploy a new version and that gives a new URL for the endpoint. You'll have to update your API request in your other codebase...it's a mess.

If you can swing it, either build in the legacy editor (still using the V8 runtime) or use clasp to manage your deploys. If I could change one thing about this whole project, this would be it.

Security

Now, how to secure something like this? There’s nothing foolproof, obviously, but there are a few advantages to using Apps Script in this way:

It all depends on how you’re expecting the webhook to be used with that third party.

When a request comes in, it includes an event parameter which holds information for the task. Even though every request is a POST, I listen for different methods in the post body to determine what happens next.
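As a sketch of what the calling side sends (the deployment URL is a placeholder; the field names mirror the handler this post walks through):

```javascript
// Hedged sketch of a client request to a hook like this. Every call is an
// HTTP POST; the app-level "method" rides along in the JSON body.
const payload = {
  method: 'PUT',                 // the action the script should take
  token: 'someBigLongString',    // shared secret checked by the script
  userId: 'user@example.com'     // whatever else the task needs
};

const body = JSON.stringify(payload);

// From the calling codebase, the body goes to the web app's exec URL
// (DEPLOYMENT_ID is a placeholder):
// fetch('https://script.google.com/macros/s/DEPLOYMENT_ID/exec', {
//   method: 'POST',
//   body: body
// });
```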

Here’s the same application as above with more detail added:

```javascript
function doPost(e) {
  const accessKey = 'someBigLongString';
  let result;

  const params = JSON.parse(e.postData.contents);
  const method = params.method;
  const token = params.token;
  const userId = params.userId;
  // Add whatever other params you want

  if (token === accessKey) {
    switch (method) {
      case 'POST':
        result = {
          // ...
        };
        break;
      case 'PUT':
        // etc...
        break;
    }
  } else {
    result = {
      'status': 'Forbidden',
      'statusCode': 403,
      'message': 'You do not have access to this resource.'
    };
  }

  return ContentService.createTextOutput(JSON.stringify(result));
}
```

In Practice

The general structure for any webhook is the same:

You have the double benefit of hosting the script and user-scoped permissions for individual projects. In the future, it may be worth finding some kind of parsing library for handling incoming requests to cut down on boilerplate code for new projects. But if you’re looking for a way to interact with Google resources from the outside, this is one way that has worked well for me.

New PD Site: SuperAdmins

Starting off with this project, I knew it would need a strong admin interface. One of the problems I mentioned initially is that all of the PD troubleshooting and administration went through me. That became a big burden, especially when presenters needed something done.

The new platform includes a SuperAdmin area which can be used to manage events and users. This role can be given to anyone, so it’ll be nice to have some other team members who are able to make system-level changes.

Event Management

SuperAdmins have access to all events in the system. They can see registration details and update attendance. From time to time, a presenter may miss a person who was actually there, so the admin can go in and mark that person as having attended after the fact.

The SuperAdmin event controls are nearly the same as the Presenter controls with two major differences:

First, SuperAdmins can add any user as a presenter to an event. Presenters can only search through users who are already presenters in the platform. SuperAdmins override that and can add any registered user. When that user is made a presenter, their user role is also updated and they’ll be granted the Presenter permissions detailed in my last post.

Second, SuperAdmins can completely delete events. Presenters could set an event status as ‘inactive,’ which hides it from the UI, but doesn’t remove any of the data. The SuperAdmin can nuke the entire event along with registrations, attendance, and other associated data.

Because the deletion is completely irreversible, this has an extra confirmation step before actually performing the operation.

A popup asking the SuperAdmin if they are sure they want to delete all information and registrations for an event.

User Management

When a user registers for the site, their account is set by default to User. This limits what controls the general user has (again, detailed in another post) and lets people sign up without interaction from presenters or admins.

There are times when users need to graduate up a level. The User Management area allows admins to change user roles with a dropdown menu. This role change is immediate and on the next login (or page refresh), the user permissions will update.

A list of users with their name, location, email, and role. Each user has an Edit button available to the SuperAdmin.

SuperAdmins can make edits to users within the platform. Their login matches their Google account, but maybe they want their first name to be displayed differently. Or, more importantly, they didn't register for an event that they actually showed up for. If an edit is necessary, the SuperAdmin can do all of those things in the sidebar. This is the same sidebar mentioned in the post on presenters with different form fields thrown in there.

I decided not to allow admins to disenroll a user from an event because we want to be able to look at what was registered for vs what was actually attended. There isn't a view for this report yet, but we'll be able to do that in the future.

In the end...

This was a project of necessity to help us move forward as a cohesive team but it was also one of significant learning for me. This is my real first full-stack project from the database up to the frontend. The entire project is on Github, and while it isn't ready for general distribution yet, you can look at the design from the code level.

There are several features I've added since starting these posts (duplicating events, UI updates, mobile styles, accessibility styles, etc) that I won't be visiting specifically. I'm going to do one last post on technical aspects and explain some of my design and code choices if you're interested.

If this is something you'd like to consider using in your school or district, I'd be happy to set up a demo to walk through the platform as a whole. Get in touch if you'd like to do that.

New PD Site: Presenters

We’ve had a look at how users work in the new PD site, so let’s take a look at another major player: presenters.

The word “presenter” is really loose in this system. This can be a person who is leading a one-off workshop. It can also be someone who can answer questions about a program. Or it can be someone who is facilitating a multi-week learning extravaganza. At the end of the day, a presenter in this system is someone who is responsible for certifying participants have done a thing.

I mentioned in the last post that we want to move toward competencies and encourage the coaching aspect of skill development, so presenters play a crucial role in the professional learning system. I’m getting a little ahead, so let’s back up.

FREEDOMMMMMM

Presenters start with a little more freedom in the site. They have two additional menu items: Presenter Tools and Create. This is a smaller group of people who will make things happen, so they need more tools in their kit to do those things.

Home

The presenter is a more powerful user, but their home page is the same as a general teacher's. Notice that the home page now only loads future, active events. Their navigation menu has new options, though.

Presenters get a new dashboard option.

Create

By default, a Presenter can create a new event. An event can be a one time workshop or something spanning a longer period of time. The type of event is determined by the presenter, so the form helps with that. This also helps us categorize which types are more or less popular, which ones get more registrations, and which have higher rates of completion.

The Create Event form allows presenters to post new events for staff to register for.

Creating an event sets it to Active by default and people can begin registering. On first submission, the person completing the form is set as a presenter. This will need to change because we'll eventually have secretaries or assistants creating events but we don't want them listed as the main point of contact.

One point I'm particularly happy with is setting the event type and location fields. I took major inspiration from Jonnie Hallman, a developer who writes extensively about his design and build processes. His post on building inline forms helped me think through how to handle this part well.

Presenters can choose existing sites and locations or create new ones on the fly.

When the page loads, it grabs current event type options (In person, Google Meet, Webex, etc) and throws them in a dropdown menu. The same goes for locations. These include metadata that can be used later in the UI, but for now, it's just to help categorize our events.

The big question was how to handle a situation where the type or location didn't exist. Using an inline form, I was able to allow the presenter to create a new type or a location on the fly and dynamically update the menu.

After submitting the event, a simple modal confirms (or rejects) the submission.

A sidebar slides out from the right to keep users from leaving the creation flow.

To do

  • Better validation in the UI to make sure errors are caught early.

Presenter Tools

Here’s where the rubber starts to meet the road. Once a presenter has created (or been added to) an event, they are able to see more information and even change some of those details.

In the Presenter Tools, the user is given a list at the top of the page of each session where they are listed as a presenter. Clicking on an event title loads the registrations and enables editing tools. It's important to note that this view does not filter by date or active status because we want presenters to be able to make those changes.

https://blog.ohheybrian.com/wp-content/uploads/2021/07/2021-07-15_11-37-07.png

Edit Sessions

In the tools section, a presenter can open a sidebar to make small adjustments to the event. Things like the title, meeting location, description, etc. Date changes are also supported right now. Using the same inline form as before, a slider will pop out with a form they can edit. Current values for the event are pre-loaded into each form field.

To keep the sidebar from scrolling to the moon and back, different edits are split into different actions. The only edit not supported for presenters right now is the ability to edit who is presenting. There's no method for getting users who are already presenters, so that needs to be built out before those changes can be allowed.

https://blog.ohheybrian.com/wp-content/uploads/2021/07/2021-07-15_11-39-33.png

Another helpful tool for presenters is a clean method for adding resource links to the event. In the edit form, current links are shown as well as a simple form to add a new link. The link categorization isn't really used yet, but it will allow us to use that metadata later.

https://blog.ohheybrian.com/wp-content/uploads/2021/07/2021-07-15_11-43-06.png

To do:

  • Edit event presenters by only displaying users who already have presenters status somewhere else.
  • Remove links from the event.

Communication

Often, presenters will want to get in touch with registrants before or after an event. When a presenter clicks on an event in their list, they’re given a snapshot of the registrations at that moment along with a couple of tools.

Presenters can open an email to all registrants for quick communication from here. Emails are sent by whatever mail client is set up on their computer (Gmail, etc.) so the app can stay simpler; getting into sending automated emails is hairy.
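One common way to hand email off to the user's own mail client is to build a `mailto:` link with every registrant in the BCC field. Here's a minimal sketch of that approach; the `buildMailto` helper and its inputs are hypothetical, not the app's actual code:

```javascript
// Build a mailto: URL that opens the presenter's own mail client
// with every registrant BCC'd. Hypothetical helper for illustration.
function buildMailto(registrantEmails, subject) {
  return (
    "mailto:?bcc=" +
    registrantEmails.map(function (email) {
      return encodeURIComponent(email);
    }).join(",") +
    "&subject=" +
    encodeURIComponent(subject)
  );
}

// Example: a link a presenter could click from the registrations view.
var link = buildMailto(
  ["a@example.com", "b@example.com"],
  "Follow-up: Workshop resources"
);
// link starts with "mailto:?bcc=a%40example.com,b%40example.com&subject=..."
```

Putting addresses in `bcc` rather than `to` keeps registrants from seeing each other's email addresses.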

They’re also able to see the registrant status. Remember in the last post how one session was marked “Registered” and the other was marked “Attended?” This is where that happens.

https://blog.ohheybrian.com/wp-content/uploads/2021/07/2021-07-15_11-44-16.png

In reality, some of the training we’ll be facilitating is just that: one-time training. After an event, the presenter can come in here and mark an individual as having participated, or make those marks in bulk on the registrations list. This will flip the status for users and they’ll be able to get their documents.

At other times, we want to see growth and competency. So, a presenter may have a long-running event - weeks or months - and as participants show their skills, the presenter can come in and mark those people off. The asynchronous, intentional marking of completion will help presenters take action in working with their participants and signal to staff that we want to help them make substantive change in their practice.

This was a huge update to functionality, so I'm going to stop there. There will be at least one more post detailing the admin tooling. Lastly, I'll probably do a big writeup of the technology behind this system and give links to source code so you can dig in and take a look.

New PD Site: Staff User

This spring and summer, I've taken on a full-blown re-write of our PD registration and management system. Our current system is a collection of centralized Google Apps Script projects which create, share, and manage all event folders, documents, and records. It's worked well over the last two years but there's been a single point of contact for everything: me.

Now that we're getting more people involved in professional development, it's time to have a more robust, user-based system for creating and managing events as well as teacher signups and recordkeeping. This post is going to explore the first role: Staff Users. These are teachers or staff who are registering for and tracking participation in events.

Home

The home page shows logged-in users all district events. Their own state is shown on the course badge as either Attended or Registered. Clicking on a course shows specifics (presenters, location, etc.) in a sidebar. If a user registers for an event, their course badge updates dynamically, which prevents multiple registrations by the same person.

https://blog.ohheybrian.com/wp-content/uploads/2021/07/2021-07-05_14-47-29.png https://blog.ohheybrian.com/wp-content/uploads/2021/07/2021-07-05_14-28-46-1.png

To do

  • Google calendar invitation to events
  • Only show upcoming events
  • Disable registration on events with no remaining space

Schedule

This is essentially a filtered view of workshops or events the staff member has signed up for. Each event's status is shown and details are displayed in the sidebar when the event is clicked.

To do

  • Custom date filtering
  • Expand view to remove a click for details

Documents

We've had a digital sign-up tool in place for several years. The biggest improvement I'm excited about is the documentation processing. Any registration is put into a database which can be queried and filtered by a bunch of parameters. This allows me to build out a nice spot for teachers to find their documents on demand and print whatever they need rather than waiting on us to generate a PDF from a spreadsheet and send it off.

This page shows which events have their participation or completion confirmed by the facilitator. The reason this confirmation step is so important is that we need to move away from simply being trained and toward showing competency. So, a workshop might be part of a program, but it does not guarantee that the staff member has actually improved.

This is a big shift for us. In the past, we used a standard feedback form. But, given the variety of presenters working with us, we wanted to give people more freedom in how they collected feedback. Also, since we were generating all the feedback forms centrally, we found presenters were less likely to actually read the feedback because the questions may not have been relevant to their own goals. At worst, participants were filling out multiple forms at events - one for us, and one for the presenter. Taking the form out of the documentation flow simplifies things for everyone.

Without getting into the presenter interface yet, this view shows any confirmed event for the user. They are also given a couple of snapshots at the top: total registrations (how much am I signing up for?) and Professional Growth Points (PGPs) earned for completing requirements.

https://blog.ohheybrian.com/wp-content/uploads/2021/07/2021-07-05_14-39-09-1.png

From here, they can either print a summary of all activity on record or print individual documents as needed. All of these details are generated by the database. The record is also validated by the server and database rather than taking input directly. There's no more wondering when an event was or how many PGPs it was worth because it's all driven from a single source of truth.

https://blog.ohheybrian.com/wp-content/uploads/2021/07/2021-07-05_14-46-11.png
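Because everything is driven from the database, the two snapshot numbers at the top of the page can fall straight out of a query over the user's registration records. A minimal sketch of that rollup; the record shape and field names here are hypothetical, for illustration only:

```javascript
// Roll up a user's registration records into the two dashboard numbers.
// Hypothetical record shape: { status: "Attended" | "Registered", pgps: number }
function summarize(records) {
  return {
    // Every record counts toward "how much am I signing up for?"
    totalRegistrations: records.length,
    // Only events confirmed by the facilitator count toward earned PGPs.
    pgpsEarned: records
      .filter(function (r) { return r.status === "Attended"; })
      .reduce(function (sum, r) { return sum + r.pgps; }, 0),
  };
}
```

Since both numbers are computed from the same records that drive the printable documents, the snapshot and the paperwork can never disagree.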

That's a quick view of a portion of this site that's just about finished. But there's a lot happening in the background to make that work and to allow different people to manage it. In future posts, I'll detail Presenters and SuperAdmins and their roles on the site for creating and managing events. I'll also get a technical post published on the technology used to build this as well as on deployment.

Bees, bees, bees

Several years ago, my wife took a short beekeeping course with her mom. They tried a hive in their backyard a few times, but the colonies never survived. The hive sat unused for at least a year until last May when a swarm decided to move in.

A honeybee swarm congregating on an old beehive

I decided that I wanted to try my hand at keeping some bees. I couldn't start last year because we were in the middle of building the house. I spent last winter reading and watching videos as well as talking with my friend Phil who keeps bees at his place.

This spring, I set out several swarm traps - boxes that are ideal for swarms because they're cozy and smell like bees from old wax and some lemongrass oil as bait. Most forums say one in three traps is successful and I was hoping to catch at least one.

I caught three over the last month.

A swarm of bees on a branch of a tree.

I immediately fell in love with this work.

Watching these insects work is...overwhelming and humbling. The colony doesn't need me at all, but they don't mind the help. I'm allowed to peek in the box from time to time to make sure they're healthy. I can provide shelter and a safe place to build out a permanent home. In return, they'll pollinate the gardens nearby and share their extra honey.

A frame of bees working on brood comb.

As a new keeper, I'm learning their signals. I'm watching for behaviors that can tell me what they're doing and what I can do to help. This is a hobby of observation...and at the end of the day after dinner, sometimes I just go sit and watch and enjoy the hum.

Comments

John Sowash

So cool! Thanks for sharing your hobby!

Handling Paginated API Responses in Google Apps Script

One of my favorite things about Google Apps Script is that I can get information from an API and write it directly to a shared document or spreadsheet. Information is collected automatically and immediately available to a group of people working on that same doc.

It's not uncommon for these requests to come back in paginated form. Meaning, if your request matches 100 records, they'll come back in batches of n items instead of all at once. To demonstrate, I've made a video showing how I use a script to handle requests to Canvas.
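The core of handling pagination is a loop that keeps requesting pages until the API says there's nothing left. Here's a hedged sketch of that loop using a cursor-style response (the `has_more`/`next_cursor` shape some APIs use); `fetchAll` and `fetchPage` are hypothetical names, with `fetchPage` standing in for a real request such as a `UrlFetchApp.fetch` call in Apps Script:

```javascript
// Collect every page of a paginated API by following the cursor the
// server hands back. `fetchPage(cursor)` stands in for a real request
// and is assumed to return an object shaped like:
//   { results: [...], has_more: boolean, next_cursor: string }
// Adjust those field names to match the service you're calling —
// Canvas, for example, signals the next page differently (via a
// Link response header) than cursor-based APIs do.
function fetchAll(fetchPage) {
  var all = [];
  var cursor; // undefined on the first request
  var hasMore = true;
  while (hasMore) {
    var page = fetchPage(cursor);
    all.push.apply(all, page.results); // append this batch
    hasMore = page.has_more;           // stop when the API says so
    cursor = page.next_cursor;         // where the next batch starts
  }
  return all;
}
```

Inside a real `fetchPage` you'd build the URL (adding the cursor as a query parameter when it's set), make the request, and `JSON.parse` the response body before returning it.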

If you want to play with it, here's an example you can start from:

Comments

Philip

I am trying to do the same with Notion API in Google App Script, querying a database of more than 100 items.

In the request you provide a start_cursor, and the response includes a has_more boolean letting you know there is more data, as well as a next_cursor parameter you can pass as the start_cursor to the same endpoint to retrieve the next page of results.

I just started learning about APIs and don’t really know how to structure this particular example. So far I have code working without any pagination, so there's a hard limit of 100 items.

Do you have any pointers on how to do this or an example?