Posts
I attended the Michigan Science Teachers Association Conference on March 1st. It was the first all-science teaching conference I'd attended, which was very different from the kind of conference I generally go to.
Some of my big takeaways are:
- Lots of new resources on where to find relevant and rich data. My Environmental Science course uses a ton of data because we're constantly circling that question of how humans are interacting with and impacting the environment. I like to have my students analyze and make sense of data. One site in particular that stands out is Our World in Data, a free, searchable site of data on just about anything you could possibly want. We will be using it this week.
- At some point, the fun got taken out of science. Maybe it was the testing changes in the early-to-mid 2000s or maybe it was me just deciding that I was going to be "rigorous" and "serious about learning." I'm not really sure. Either way, there is a ton of fun to be had in school and it is okay to do the fun stuff. I got a couple of new, goofy songs to help students remember the intermolecular force properties of water (to the tune of the Mickey Mouse Club) that we will use in our next chapter on solutions.
- I'm a chump when it comes to labs. So many teachers have such better labs than I've been doing. We're going to turn pennies to gold next week for St. Patrick's Day (see #2).
What good science teachers do
The session that made me think the most was a research report out of Michigan State University that looked at classroom practices leading to higher student retention and application of science skills and content. Teachers and students from across the country participated (125,000 students over four years), so this is, like, a legit thing.
The most effective science teachers took time to help students go through a divergent/convergent thinking protocol over the course of an investigation. Students are guided through forming ideas, comparing, coming to consensus, investigating a phenomenon, and then using evidence to draw conclusions. The best teachers get students to diverge in their thinking by calling up background knowledge to engage with the scientific thinking process and then working as a community (like real scientists) to develop plans to test their thinking.
That's all well and good, but the real payoff comes in the convergence that happens later. Because students are engaged in the process, all of their work is consequential, meaning it can be used to draw conclusions (whether or not it is graded - that doesn't actually matter). The research found that when students are encouraged to use what they've done, they converge on ideas which describe the phenomena. They learn what to look for as they are learning what it represents in parallel.
I would really like to think that this is what I do in my classroom, but in all honesty, I think four days out of five I tend to focus on process and getting from Point A to Point B. We do a lot of consequential writing in our notebooks and students are able to use those materials on assessments, but it isn't built around longer-running, holistic units.
This is not to say that every unit must follow that process. It's more a point of reference for me moving forward - am I allowing time and space for divergent thinking to become convergent? What methods and tools do I use to help students find that consensus on phenomena? In what ways can I support students through that process? It's a very different way of thinking about science as a discipline in high school but it seems to do a better job of helping students build dual understandings (plural - that's important) of content as well as practice.
I'm re-working my next unit a little bit in response. We're starting solution chemistry this week and I'm going to try to do some pre-thinking activities to tease out what they know (or think they know) about solutions before we go through our investigations later this month.
This is a scheduled post!
One thing I miss from a CMS blog is the ability to schedule posts for publication. It turns out a lot of other people running Pelican (my blog engine of choice) have also faced this problem and shared their solutions:
- Scheduling Posts with Pelican written by Tiffany B. Brown of Webinsta mentions a bash script to look for "scheduled" posts in the frontmatter and then a second cron job to build the site.
- Timed Posts with Pelican written by mcman_s which includes a link to a bash script which will publish shared posts from a cron job.
- Scheduled Posts with Pelican by Benjamin Gordon. The oldest of the promising results I came across and the simplest implementation.
It would be great to be able to use the WITH_FUTURE_DATES setting in Pelican, but since this is built from a bare git repo, I don't actually have the entire content directory on my server anywhere. My build step creates it when a push is received, so method #3 won't work easily without a change to my workflow.
I decided to modify the script shared by mcman_s (and I'm assuming it's similar to what Tiffany Brown wrote about). Each day, a cron job creates a clone of the repo to get the most up-to-date source files. Then, it loops over the content directory and looks for posts which need to be published based on the date. When it finds one, the file is edited, committed, and pushed back to the repo to be built.
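The core of that cron job is a simple date check against each post's metadata. Here's a minimal Python sketch of the logic, assuming Pelican's default `Date:` and `Status:` frontmatter fields - the function name and the draft/published convention are my illustration, not the exact script I'm running:

```python
from datetime import date

def should_publish(frontmatter: str, today: date) -> bool:
    """Return True if a draft post has a publish date on or before today."""
    status, post_date = None, None
    for line in frontmatter.splitlines():
        key, _, value = line.partition(":")
        if key.strip().lower() == "status":
            status = value.strip().lower()
        elif key.strip().lower() == "date":
            # Pelican dates look like "2024-03-04 06:00"; keep only the day.
            post_date = date.fromisoformat(value.strip().split()[0])
    return status == "draft" and post_date is not None and post_date <= today

# The cron job would loop over content/*.md, flip Status for anything
# that passes this check, then commit and push to trigger the build.
```

Everything else - cloning, editing the file in place, committing - is plumbing around that one decision.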
So, this post was automatically published from a cron job. Now, when the mood hits me to fire out a couple of posts, I can do that and just set the publish date as some time in the future.
I read less in February, but the books were a little longer than January. We also didn't have a week-long break as part of my reading schedule.
The Boys in the Boat: Nine Americans and Their Epic Quest for Gold at the 1936 Berlin Olympics - Daniel James Brown
This is a biography/sports history book about the Washington rowing team which, according to all sources, was the best the world had ever seen. Brown takes an intimate look at the team members and coaches, pulling from stories from the rowers themselves and from family who kept records and journals from their time together. Some people compare this to Unbroken by Laura Hillenbrand; I felt it was more like Seabiscuit (also by Hillenbrand). It is definitely worth reading. I hear the movie is also very good.
Tomorrow, and Tomorrow, and Tomorrow - Gabrielle Zevin
This is one of those books that I heard about from several different outlets, so I decided to give it a try. My copy took several weeks of waiting on hold to actually come, but then I read it pretty quickly. This book has several ups and downs. Each character is tragic, losing something dear and having an unclear resolution of anything gained in the loss. Sam and Sadie are complex in faults and their differences led to a satisfying ending.
Update November 2024
The post below details how to build Helix from source. This doesn't really provide much of a benefit other than being able to install from a specific branch of the project or a specific build from an open PR.
I've updated the bottom of the post with how to install Helix from the pre-built binaries. This takes much less disk space because you don't have to install the entire Rust toolchain to build the editor. The binary runs at about 13MB, so it's much friendlier for the humble Chromebook.
I'm setting up another Chromebook and I realized I never documented the entire process of building Helix for a Chromebook. There are a couple extra steps needed to get everything wired up correctly.
The ChromeOS Linux container is Debian, but Helix doesn't have an apt package for an easy install, so the best options are to either download and install the binaries or build from source. The GitHub repo has pre-compiled binaries for many systems. aarch64 builds should run on an ARM Chromebook, but I haven't tried it myself. This post will walk through building from source.
Install from source
Chromebooks don't come with a C compiler toolchain, so you need to install the gcc package:
```text
sudo apt install build-essential
```
Now, you can install Rust:
```text
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```
Once the tooling is in place, make sure the /usr/src directory is writable. It's owned by root initially, so either reset the ownership or adjust the group who can write. This is where you can download the Helix git repository source files.
Clone the Helix repo into your directory of choice:
```text
git clone https://github.com/helix-editor/helix
cd helix
```
Once you have the source downloaded, you can use cargo to build Helix. Make sure you're in the helix directory you just downloaded:
```text
cargo install --path helix-term --locked
```
This will take a while, so grab a book. All of the dependencies are downloaded and then the compiler will build the Helix tooling and command line utilities. When you're finished, you'll have the hx command available in your terminal to launch the editor.
The last thing to do is set up a symlink to the runtime directory:
```text
ln -Ts $PWD/runtime ~/.config/helix/runtime
```
From there, you can set up your config.toml and languages.toml files in the config directory. If you're on a Chromebook, you'll need to override the truecolor check to use themes.
Install from pre-built binary
The Helix team releases pre-built binaries for each release. You can get those from the releases page on the repo. You'll need to make sure you download the binary for the correct architecture. Most likely you'll need the *-x86_64_linux.tar.gz file.
Once it's downloaded, you'll need to move it to your Linux container. If you're running the Linux development container, you'll need to make sure your Downloads folder is shared. You can also use the ChromeOS file manager to move the folder around.
Unzip the file and move the hx file into your Linux system somewhere. Mine lives at ~/.cargo/bin/hx, but it can just as easily be under /usr/bin or /usr/local/bin. It just needs to be accessible on your PATH variable. Once you've got your file stored somewhere, you'll need to make it executable with chmod +x hx.
Finally, you'll need to move the runtime/ directory. Copy it into a config directory. The installation documents suggest ~/.config/helix/runtime which is what I do. I also keep my languages.toml and config.toml files here.
Last month, I gave myself a diagnosis of tinnitus. My ears have a strong persistent ring that has started to work its way into my day to day experience. I think it's been there for a long time, but for some reason, it has intensified this year.
I've been reading and listening to a lot of materials to try and understand more. First, I've learned it is a symptom, not a disease. It's a non-existent sound produced by my brain for some reason. I think mine is related to persistent loud noise in my younger years, which is very common. This week, I experienced my first "spike" - a sudden increase because of some external factor - and that led to feeling horrible overall. A headache set in and I just ended up going to bed. It was my first true experience of the tiny screaming fairy.
There is no one cause or one fix for any of these symptoms. My body is sensitive to some things and I need to pay close attention to causes and effects. I'm paying attention to my diet, my moods, my sleep...pretty much anything I do day to day which can affect my perception of the ringing. And that's the difficult part - it's all perception. I'm fighting against the feeling that I should just be able to deal with it because it's all in my head and the reality that this is affecting me in new ways.
Success! I've been using Helix on my Chromebook for almost a year and I was finally able to get editor themes working correctly. Helix has a great selection of themes but I was never able to get them working on this little machine. I had to install from source, which was a little dicey given how small the hard drive is and how low-power mid-range Chromebooks are in general. The development container works well but the terminal emulator on Chrome isn't well documented and I kept running into an error any time I tried to set the theme:
```text
Unsupported theme: theme requires true color support.
```
Long story short, Helix does some kind of internal check for color support in the terminal and, for some reason, it would throw a false negative. I finally found a discussion thread in the repo from 2023 showing that you can set the editor section of the config file to override the auto check:
```toml
[editor]
true-color = true
```
Funnily enough, another ChromeOS user posted in the repo's discussions about ChromeOS specifically, with the same answer being given. I wish it hadn't taken me this long to figure out, but here we are.
It's still not my favorite for programming, but this little Chromebook can do a lot more than it seems on the surface. This is one more box checked off on my list and a mystery finally solved.
I've learned to focus on slowing things down rather than fighting to accelerate all the time. It comes from living away from places. It comes from taking my work as it comes rather than tracking productivity like I used to. I'm appreciating the time it takes to do things instead of constantly tweaking my rhythms to squeeze as much "value" out of every minute. Sometimes the value is in the time between starting and finishing a project.

When I was doing technical and organizational work, I fell into the productivity black hole. I was constantly trying to find systems that would make me just a little more productive - a few more tasks tracked and a few more jobs done. I did good work but I felt a compulsion to show just how much I got done. I logged and planned and at the end, filled pages of books with stuff I don't care about.
I don't have these notebooks anymore.

This kernel of popcorn took nine months to eat. It has the story of our family choosing to plant something new. The dirt of the garden under our nails when we planted the seeds. Water on our skin from setting up the sprinkler. Husks in our hands. The smell in the air.
All of that takes time. I have this written down in my new notebook.
I just finished watching the final season of The Expanse on Amazon Prime last night and I'm really hoping they continue the story. If you're not familiar with the series, it is a TV adaptation of a series of novels of the same name written by James S. A. Corey (the pen name of Daniel Abraham and Ty Franck). It's set a couple hundred years into the future and centers on the tensions between the people of Earth, Mars, and essentially an underclass living in the asteroid belt and outer planets who rely on the inner planets to survive. I've read and listened to the books a number of times because they're just that good as a return-to book when I need something familiar.
The TV series started on Syfy and was then dropped after three seasons. Amazon Prime picked it up and ran three more seasons before ending at the end of book 6, Babylon's Ashes. The TV series departs from the books in several areas, but more out of necessity than creative choice. For instance, the novels are very realistic about how big space is, so there are several parts of books where the crew is in transit for months. You can't do that on TV. There are some changes in characters and their roles, but that's more because the books feature dozens of people. Each book in the novel series introduces secondary characters that go along with the main crew. You can't keep introducing new faces in a TV series.
That said, the TV series does a great job of adapting the huge universe created by the books. Without giving much away, there is a large gap of time between books six and seven in the novels that would make an adaptation difficult - Screen Rant has a good breakdown of some of the challenges. The frustrating part is that season six left off with just enough loose threads that it could be started up again if the conditions were right.
If you've not seen the show, I would recommend reading the books first. I'm not a "the book is always better" kind of person but the depth of story in the books helps you appreciate how the show was made. The showrunners did a good job of telling a coherent, captivating, and human story in a very large, very complicated universe.
I took some time this morning to go through ooh.directory and add some personal blogs to my feed reader. Nothing in particular, just things that were updated in the last month and caught my attention while I combed through the lists page by page. There are two that have stood out immediately looking through the backlog:
- A Man and His Hoe. An anonymous man writing from Washington state about chickens and the weather. It's great to read someone who has similar perspectives and just to get day to day updates.
- My Granddad is Keeping Busy. Another daily site where Anne posts her grandfather's journal entries from 50 years ago. There are 40 (!) pages of posts so far. The first post was published in January 2023 so it appears this one will be around for a while.
For a long time, I'd only subscribed to blogs where people had A Message, and I'm looking forward to meandering again.
In my elective class (environmental science), I've moved toward using "concept checks" at the end of each unit in place of a traditional test. I have a Google Doc template with several prompts (usually 6-8) related to the concepts we've been working on in class. At the bottom of the test, students reflect on the learning objectives set at the start of the unit and then grade their own understanding.
I like this method because it asks students to verbalize what they know and apply concepts in more detail than we normally do in class. As I read, I leave comments in the doc, prompting with followup questions or asking them to provide more detail on claims they make in writing. After the feedback stage, I send it back and students are able to make revisions.
While it has worked well, I don't think I'm doing a good job of preparing students for the depth of response I would like to see. They're able to use their notes and resources to form their responses, but many times, it turns into a definition word salad and I don't see application or justification of ideas. My feedback step pushes them to justify more, but I would love to see that happen on the first attempt.
Ideally, I would be able to give a single grade at the end of the semester representing their growth as science consumers and communicators. I'm not 100% sure how to do that along with tracking progress across individual units of study. I don't know if that's important or if it is my own perception of what should be shown in the gradebook. I just know that grades are something that trouble me and I'm trying to find a way to play both sides of the line.
A goal this year is to read more books. I finished four books this month:
Pastoral Song: An Inheritance - James Rebanks
Rebanks is a regenerative farmer in England and this book is about his care for the fell farms in northern England. He writes like James Herriot in this memoir, lamenting the "advances" of modern farming as he works to keep hold of his family's traditional approaches.
Under Alien Skies: A Sightseer's Guide to the Universe - Phil Plait
Phil is an astronomer and writer of the Bad Astronomy newsletter/social media accounts. He explores various areas in our galaxy and explains how you would actually perceive some of these places if we could go there. The book is full of vivid imagery built from what we understand via observations, with some science-fiction narrative worked in.
Leave the World Behind - Rumaan Alam
This is a new suspense/disaster novel that I picked up because I watched the Netflix trailer. It's a slow-build suspense novel that has a ton of weird stuff happen to two unfortunate families stuck together. The premise was interesting but the characters felt a little flat to me.
The Shepherd's Life: Modern Dispatches from an Ancient Landscape - James Rebanks
This is more of a memoir than Pastoral Song was, as Rebanks works his way through a growing season as a shepherd. It's filled with lessons from his grandfather and father as he wrestles with what it means to make his own decisions as a traditional farmer in a modern landscape. It also reads very much like James Herriot.
I'll keep posting here but I'm also keeping track on LibraryThing if you're there and would like to connect.
January is at an end and I'm longing for more sunlight. The days are definitely longer, but they're still too short to really have appreciable time after school to be outside and decompress.
We tapped maple trees yesterday on the farm. Our two families were working together in the woods to take advantage of a coming warm streak. The sap will run this week with temperatures forecast in the 40's, but still cold at night. It's not as early as last year which is good, but it's still probably earlier than it should be overall. Climate change is pushing our seasons earlier which means we need to be ready to grow sooner. My body is happy to have some warmer weather but my mind knows that it comes at a cost. Maybe not one we can pay.
My bees are still alive. I took a moment this week to put my ear against the hive. I could hear their hum through the wood - no stethoscope needed. They know how to take care of themselves...I just try to give them a place they want to occupy.
There is still a lot of winter yet left. But one can hope.
We're wrapping up a case study on the Ogallala Aquifer this week and I'm happy with the kind of exploration students have been able to do. Original credit for the base materials go to my sister-in-law who graciously shared all of her environmental science materials with me once I found out I would be teaching it for the first time this year.
The Prelude
Students took some time to define the difference between groundwater and surface water. We did a large watershed mapping exercise (that will probably be another blog post) showing how water moves across different regions of the world downhill toward oceans. Then, we took some time to talk about freshwater uses before moving into discussing how areas without major rivers get freshwater resources.
The Aquifer
The Ogallala Aquifer is the largest freshwater deposit in the United States (175,000 square miles across eight states). Starting in the mid-1900s, farmers in the plains states began pumping water from the aquifer as an irrigation source. I showed this short documentary as a discussion starter for the main portion of the exploration.
Expand and Explore
Following the video, I gave students a Google Doc template with expansion and exploration questions. Because environmental science overlaps science with human experience, these explorations look at the science behind what is happening as well as social, economic, and political implications. Students need to build their own understanding through research as they work to apply ideas and concepts to the scientific base.
This is also a great time to introduce data manipulation skills. The US Geological Survey publishes open-access data sources on all kinds of projects, one of which is the High Plains Water-Level Monitoring Study. This is field data for well depths which tap into the Ogallala Aquifer across all eight states. I grabbed the full TSV for 2013 from the published data page and then worked with students to create a scatter plot showing average well depth for each state that year. They then worked on some data analysis using their graphs.
The document linked below has two graphs already created: one showing the average depth to the water surface for each state, and one showing the average water depth alongside the average well depth. I asked students to predict which state was most at risk of losing water first based on the water surface depth alone and, after some discussion, they realized that the aquifer might not be a uniform depth. They also discussed topography differences which would put the water surface deeper below ground in some places. We then generated the second graph to show the average volume of water in each state, which led to more thorough analysis.
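For anyone wanting to recreate the aggregation step with the USGS TSV, it's straightforward with the standard library. This is a sketch under assumptions: the column names (`state`, `well_depth_ft`) are placeholders, since the real export uses its own field names, and graphing the result would be a separate step:

```python
import csv
import io
from collections import defaultdict
from statistics import mean

def average_depth_by_state(tsv_text: str) -> dict[str, float]:
    """Group well records by state and average the depth column."""
    depths = defaultdict(list)
    reader = csv.DictReader(io.StringIO(tsv_text), delimiter="\t")
    for row in reader:
        depths[row["state"]].append(float(row["well_depth_ft"]))
    return {state: mean(values) for state, values in depths.items()}

# Tiny made-up sample standing in for the real 2013 data file.
sample = "state\twell_depth_ft\nKS\t100\nKS\t200\nNE\t50\n"
print(average_depth_by_state(sample))  # {'KS': 150.0, 'NE': 50.0}
```

Each per-state average then becomes one point on the scatter plot the students built.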
Bigger Perspective
Data on charts is good for interpretation skills but I wanted them to be able to conceptualize each row on that sheet as an actual place with an actual well. Using the same USGS data, I threw together a custom Google Map and colored each marker in a range according to the depth of water below the surface of the ground (deeper wells are red).

The Endgame
We happen to live in an area rife with water thanks to the Great Lakes, so we don't really think about water as a scarce commodity. We're finishing the unit with pollution and its effect on the Great Lakes' ecology and economy (sidebar: The Death and Life of the Great Lakes by Dan Egan is wonderful and will probably appear in this unit soon) along with a reflection on their own personal water use habits. Some of my students come from farms with radial irrigation equipment, so I'm curious to see what connections come from this study.
Links
If you'd like copies of anything, you can get them here:
A new semester means a fresh gradebook and more time to think about how I'm actually giving feedback and reporting progress. I've been a user of the four-point rubric for a long time and, of course, I'm rethinking that entire approach. There are some benefits to the four-point scale which are outlined particularly well in Robert Talbert's post on pointing everything toward feedback (my emphasis):
The biggest and most central problem is that marks are often used as an implicit form of feedback. But as feedback, number- or letter-based marks are not helpful. As I wrote in my previous post, the primary purpose of feedback is iteration. In order to be helpful, feedback should convey information back to the learner about their work that will be useful in crafting a next iteration of that work.
My standards rubric is essentially the EMRF model Robert discusses, but I've started wrestling with how to use a single-point rubric as a method of driving students forward. Single-point rubrics remove the "score" component and would allow me to indicate what a student needs to improve (they have not met the standard) as well as what they can do to expand (go beyond basic implementation or demonstration) their understanding.
The four-point rubric gives me some leeway in how to calculate that score because each item is entered numerically into the gradebook. At the end of each unit, I calculate a score out of four as the most recent mark plus the highest mark, divided by two. I like this because recent effort counts and students always benefit from their best performance, as opposed to a traditional average, where they're always penalized for their worst performance.
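In code, that calculation is tiny. A sketch of the formula as described (the function name is mine):

```python
def unit_score(marks: list[float]) -> float:
    """Average the most recent mark with the highest mark on a 4-point scale."""
    return (marks[-1] + max(marks)) / 2

# A student who grew from 2 to 4 earns a full 4.0, while one who slid
# from 4 to 2 still lands at 3.0 rather than being dragged to the mean.
print(unit_score([2, 3, 4]))  # 4.0
print(unit_score([4, 3, 2]))  # 3.0
```

A plain average would give both students 3.0; this formula is what lets growth and best work both count.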
I struggle with calculations because the last performance should be the most indicative of skill development, but with so many conflicting factors involved in assessment, looking at growth over time is important to me. A single-point rubric would make the grading faster and more specific to the student but would make my collection and analysis of progress more difficult. Alas.
I do have a site where students can track their progress over time and look at accumulated feedback. The idea is that they are checking that (for their digital assignments at least) to make sure they're making progress. On my end, I generate a simple sparkline to show trends. That isn't shown to students, though. Perhaps adding that small indicator would help me help them reflect on their own growth. Removing the point values entirely would allow them to focus on making their line move up rather than adding scores.
I'm slowly coming to the conclusion that I have a persistent case of tinnitus. Mine is a constant high-pitched ring in both ears that comes and goes in severity. Recently, it has been more noticeable and I feel like it is starting to affect the way I perceive my surroundings.
In my younger days, I was often at loud basement concerts or similar shows. Loud music was also in my headphones and, later, my car. This habit continued through college and into my younger married years. I didn't really become serious about protecting my ears until 2020 when we started building the house. Since then, I've tried to keep a pair of ear plugs on hand for every farm job or construction project.
I can't deny that it has gotten progressively more invasive. I particularly notice the ringing early in the morning and in the evening when I'm trying to settle into bed. If I'm busy or otherwise occupied, I don't notice the ring as much. My early reading suggests brain training can reduce the impact of tinnitus on day-to-day life and that, if I'm persistent in protecting myself, I can prevent further damage.
This is a lot of self-diagnosis, but much of the tinnitus diagnosis is based on self reporting. I need to schedule a time with my doctor for an actual audiology exam to see what frequencies I'm missing (if any). I'm also planning on writing about my experience more here.
I was perusing some blogs today and saw a post from Juha-Matti Santala in which he described his domain and others who inspired him to write his story. I've felt a little bit of a dry spell in writing, so here's mine.
I was at a conference in 2011 where I met some technology-focused teachers for the first time. Following my session, I was encouraged to buy my domain and start a website so I could "share my message." (In retrospect, that was a tainted way of thinking about how to use the Internet, but that's for a different post.)
Brian Bennett, unfortunately, is not an uncommon name. The first search result is Brian Bennett, the famous English drummer and pianist. The domain brianbennett.com is parked and for sale for an affordable $14,000 USD, so I had to come up with other options.
Since my initial focus was on marketing my message, I really wanted to stick with my full name if possible, so I grabbed brianbennett.org as the closest alternative. I threw WordPress on a BlueHost account and quickly built an okay website trying to market myself as an educational technology expert at the ripe age of 24.
Fast forward a couple years to 2014. I was working with a company and some friends started making the Family Guy "oh hey Brian" joke when I walked over to their building. It sort of caught on and I started noticing it a lot more. A quick domain search showed that it wasn't registered and wasn't too expensive. Since it was more of a gimmick domain, I threw a blank page up with a clip from Family Guy before finally taking time to learn how to build my own websites from scratch. The Wayback Machine has one of my earliest sites archived so you can still get a taste of what my early attempts at web development looked like. The blog link still pointed to my brianbennett.org domain, but that was not long for this world.
The Vibe
I think ohheybrian.com as a domain helped me capture a little more of my personality. When I started, I was solely thinking about how to be A Person of Knowledge on the Internet, which meant a name-based domain as well as a particular level of seriousness that I look back at now with a little bit of embarrassment. Since committing fully to the name, I've felt much more comfortable experimenting and letting my own personality come out a little more.
For a long time, my site was a very spartan landing page that had a little bit of snark about not wanting guest posts. It was a swing too far in the other direction, maybe as a pushback against the marketing tack I'd started with. I think my current design has captured the blend, keeping a very minimal design and aesthetic but not so bare that people can't learn a little about me.
Maybe it's maturity and maybe it's just not caring quite so much about what people think of me when they visit my site, but nearly 10 years after purchasing ohheybrian.com, I think this one is here to stay.
Well, I did a thing. I built a little service to add comments back into my blog. This was a little bit of a project because I had to build a backend process to handle the comment database as well as update my blog templates to fetch, display, and allow for comment submissions.
Receiving Comments in Flask
I love Flask. It's easy to get anything up and running quickly and runs great. David and the whole Pallets team do a great job maintaining and building the platform and I just enjoy working in it.
I threw a new endpoint on my main domain which will take in comments from my blog. Because they're on different domains, I had to handle CORS by allowing requests from my blog to the endpoint. Once a comment comes in, nefarious input is cleaned with nh3 and then stored in a SQLite database.
I want to moderate any submissions, so everything is marked "pending" by default. This meant I needed a little dashboard to both view and update the approved flag on the submission. I ended up making a small user model which allows me to log into an admin area on the site to manage the comments. There aren't any relationships on the model right now, so it's really just taking them one at a time.
Sending Comments from a Static Page
I took this as an opportunity to learn about HTML custom elements. I followed a few helpful articles and used the MDN docs to get a little element together which is now part of the article markup generated by Pelican. The component is a small JavaScript file which fetches any comments for the current article and throws them into the DOM.
The hard part about dynamically loading content onto a static page, especially content that is specific to that page, is having a unique identifier. Luckily, I've managed to keep post titles unique, so the page slug becomes the point of reference. Starting now, when a post loads, a request is sent to the Flask application with the page slug and any approved comments are sent back over the wire and loaded.
The component also includes a form to submit your thoughts. I use htmx, which handles the form submission through AJAX. But if you're browsing with JavaScript disabled, you can still submit the form because it falls back to the HTML action declaration in the markup.
They're Only Comments
There is no point to this other than it's something I wanted to do. I got to learn a bunch about custom elements and spend some time getting my little corner of the web a little more the way I want it.
What better way to start 2024 than with a blog post. I finished 2023 with 86 blog posts which is far more than I've written in several years. I was a few short of the 100 Days to Offload challenge, so this will start me off on the 2024 adventure.
There are several things I hope to do this year - some professional, some personal. I'll revisit this post at the end of the year to see how I do.
In no particular order, this year I hope to:
- Read 40 books. I used to read voraciously and I'm trying to rebuild the habit. Audiobooks count. I'm using LibraryThing to track my reading now (thanks to Alan Levine for the link to Jon Udell's post).
- ✓ Add comments to this blog. I've got some work toward that goal finished, but it's no good if it's not published. (Published Jan 2, 2024).
- Cut my phone use time down to ~1 hour per day. I'm going to use Digital Wellbeing to track and I will reflect periodically on my progress. The ultimate goal would be to switch back to a flip phone full time.
- Ride my bike 150 miles. I'm not a cyclist by any means but I started doing local rides again last summer and I'd like to make it a more regular part of my leisure time. I'm using OpenTracks to track my rides.
- Get two new courses approved for 2024-25. I'm hoping to teach a beginner computer science and advanced chemistry course next school year, but we have some work to do to build out the syllabus and course sequences.
These are just a few of my bigger goals - smaller ones will come and go, but these are the ones I'll come back to in 12 months.
I had a soft goal of learning how to do some sketching and water color painting this year. I filled the small book I got last Christmas so I thought this would be a good time to share some of my favorites.

My first real attempt at painting mid January. This is our coat rack and various other hangable things. It had tons of folds and overlapping textures. I might have been wiser to start with a simpler subject.

A work trip took me to Detroit. The first day had brilliant blue skies, so I took a stab at some architectural sketching. Unfortunately, the pen bled when I added some color. Oh well.

This might be my favorite. In April, we took a trip to Cumberland Island off the coast of Georgia. The live oak trees canopy a huge portion of the island and give it so much texture and character. We will definitely go back.

This shows that not trying for something in particular can lead to the best result. My wife and I were discussing our chicken coop and I managed this little sketch during our talk on the back of an envelope.

My brother-in-law's cow pasture is next to our house and during the summer I enjoy just watching them wander around. This pushed me to capture general shapes quickly while they ate their way past the house one evening.

This is a sketch from a photo of a winter cardinal that I did as practice before painting a small ornament for our tree. I like the character that came out.
This has definitely been a journey. I have many other failed and janky paintings in my book that will stay private to me and my family. I'm looking forward to learning more this coming year.
We have a trip to Kentucky coming up and for the first time in a while, we don't have a general-purpose iPad to use for some in-car entertainment. My wife and I used to say that we wouldn't rely on movies to get through trips because "we didn't do that growing up." But, people also used to take three weeks to ride across the country on horses and now we have airplanes.
I have a small collection of movies on a flash drive, so we went to the local library (I love our library) and grabbed an in-car DVD player from their wonderful Library of Things which also had an option for SD or USB playback. I eagerly grabbed my flash drive and was met with "encoding not readable."
Turns out that this particular device only accepts MPEG-1, MPEG-2, or DivX. H.264 encodings would not work because it's old-ish. I also learned that most DVDs are encoded in MPEG-2, so it makes sense that storage media would need to be in the same format.
It might be once or twice a year that I really need FFmpeg and it never fails to come through. It also seems that this is a common question and there are pages and pages of debate about what bitrate is best for which format - this turned into a three-hour deep dive and semi-live toot thread while I wrestled not only with encodings and container formats but also max resolution (this particular device expected a max of 720x576 pixels).
In the end I landed on this command:
```bash
ffmpeg -i file.mp4 -c:v libxvid -q:v 2 -q:a 2 -vf scale=720:-1 out.avi
```
This line does 4 things:
- Take the input mp4 and convert it to the DivX encoding
- Drop video and audio quality just a bit
- Scale the video to a max of 720 pixels wide while keeping the height in proportion
- Spit out a .avi file container the player can read
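With a whole folder of movies to convert, the same command could be wrapped in a small script. Here's a hedged Python sketch (assuming `ffmpeg` is on the PATH and the sources are `.mp4` files; the function names are mine):

```python
import subprocess
from pathlib import Path


def divx_command(src: Path) -> list[str]:
    """Build the ffmpeg invocation that converts one mp4 to DivX-in-AVI."""
    return [
        "ffmpeg", "-i", str(src),
        "-c:v", "libxvid",          # DivX-compatible video encoding
        "-q:v", "2", "-q:a", "2",   # drop video and audio quality just a bit
        "-vf", "scale=720:-1",      # max 720 px wide, height in proportion
        str(src.with_suffix(".avi")),  # .avi container the player can read
    ]


def convert_all(folder: str) -> None:
    """Convert every .mp4 in the folder, one at a time."""
    for src in Path(folder).glob("*.mp4"):
        subprocess.run(divx_command(src), check=True)
```

Running the conversions sequentially keeps a slow machine (like a Chromebook) from thrashing on parallel encodes.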
I'm running this all on a Chromebook, so encoding takes about as long as it takes to watch the movie. I'm pretty much keeping the computer open and running while we do other things around the house to get ready.
I am also acutely aware that I could run to the thrift store and get the same movies on DVD for $1 each and be set. But that's a much less interesting blog post.