Paul Graham, a venture capitalist, programmer, and essayist, calls this batching strategy “Maker’s Schedule/Manager’s Schedule.” If you’re trying to create something, the worst thing you can possibly do is to try to fit creative tasks in between administrative tasks—context switching will kill your productivity. The “Maker’s Schedule” consists of large blocks of uninterrupted time; the “Manager’s Schedule” is broken up into many small chunks for meetings.
-- The Personal MBA; Josh Kaufman
The SMS messages have been somewhat of a success at helping me retain the things I read. They’ve definitely brought my attention to highlights I’ve long forgotten, and sometimes have been uncanny in their timeliness. I’ve also found the recurring nature of seeing the same highlights has helped the information absorb deeper every time I see them. I say somewhat of a success because, since the messages arrive three times every day, I’ve gotten used to them and generally ignore them when they come through. I’m not so disciplined that I drop everything to read the message when it arrives at 10:35am, 1:35pm, or 10:35pm. I still read most of the messages, but in batches when I have both time and interest.
-- How I Export, Analyze, and Resurface My Kindle Highlights
At the heart of TCP congestion control is an algorithm called Additive Increase, Multiplicative Decrease, or AIMD. Before AIMD kicks in, a new connection will ramp up its transmission rate aggressively: if the first packet is received successfully it sends out two more, if both of those get through it sends out a batch of four, and so on. But as soon as any packet’s ACK does not come back to the sender, the AIMD algorithm takes over. Under AIMD, any fully received batch of packets causes the number of packets in flight not to double but merely to increase by 1, and dropped packets cause the transmission rate to cut back by half (hence the name Additive Increase, Multiplicative Decrease). Essentially, AIMD takes the form of someone saying, “A little more, a little more, a little more, whoa, too much, cut way back, okay a little more, a little more…”
-- Algorithms To Live By; Brian Christian, Tom Griffiths
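To make the excerpt above a little more concrete, here's a minimal TypeScript sketch of the slow start + AIMD idea it describes. The variable and function names are mine for illustration, not anything from a real TCP stack.

```typescript
// Minimal sketch of slow start + AIMD (illustrative only, not a real TCP stack).
// `windowSize` is how many packets we keep in flight at once.

let windowSize = 1;
let inSlowStart = true;

// Called once per batch of packets, with whether every ACK came back.
function onBatchResult(allAcked: boolean): void {
  if (!allAcked) {
    // Multiplicative Decrease: a dropped packet cuts the rate in half.
    windowSize = Math.max(1, Math.floor(windowSize / 2));
    inSlowStart = false; // from now on, grow cautiously
  } else if (inSlowStart) {
    // Slow start: keep doubling until the first loss.
    windowSize *= 2;
  } else {
    // Additive Increase: "a little more, a little more..."
    windowSize += 1;
  }
}
```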
For clients upserting larger amounts of data, you should insert data into an index in batches, over multiple upsert requests.
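Here's roughly what that looks like in practice: a TypeScript sketch of chunking records into batches, one request per chunk. The `VectorIndex` interface and the batch size of 100 are assumptions of mine, not any particular vendor's API.

```typescript
// Sketch of upserting in batches. `VectorIndex` and its `upsert` method are
// hypothetical stand-ins for whatever client the quoted docs refer to.
interface VectorIndex {
  upsert(records: { id: string; values: number[] }[]): Promise<void>;
}

async function batchUpsert(
  index: VectorIndex,
  records: { id: string; values: number[] }[],
  batchSize = 100 // assumed batch size; tune to the service's actual limits
): Promise<void> {
  for (let i = 0; i < records.length; i += batchSize) {
    // One request per chunk instead of one request per record.
    await index.upsert(records.slice(i, i + batchSize));
  }
}
```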
To batch well is to detect patterns well, and to use those patterns as leverage. Masters get so good at this process, they can see patterns in their mind before they happen in the real world. Ask any line cook about this phenomenon. Nothing happens in the kitchen without batching!
Here's a simple diagram of how grouping repetitive tasks (1, 2) into a single task saves time in the long run, in exchange for an upfront investment.
By batching the work for 1 and 2 together, we save steps on every item we add to our list, and those savings compound as n grows. That's a really big deal.
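In code form, the idea looks something like this toy TypeScript sketch. "Task 1" and "task 2" here are placeholders I made up for any fixed-cost setup work, not actual steps from the diagram.

```typescript
// Toy illustration: tasks 1 and 2 stand in for any repeated fixed-cost work
// (opening a connection, loading a file, warming a cache, etc.).

function processUnbatched(items: string[]): void {
  for (const item of items) {
    setUpTaskOne(); // repeated for every single item
    setUpTaskTwo(); // repeated for every single item
    handle(item);
  }
}

function processBatched(items: string[]): void {
  setUpTaskOne(); // paid once, up front
  setUpTaskTwo(); // paid once, up front
  for (const item of items) {
    handle(item); // only the per-item work scales with n
  }
}

// Placeholder implementations so the sketch runs as-is.
function setUpTaskOne(): void {}
function setUpTaskTwo(): void {}
function handle(item: string): void {
  console.log(item);
}
```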
Social media apps are pros at batching. They package content into little boxes and make it as simple as possible to keep watching/reading/scrolling through unique chunks (this is important, because each chunk provides enough novelty to keep going).
They've curated (personalized) a batch of content (the feed) for you directly, requiring no work from the end user beyond a minimal amount of cognitive engagement, and even that depends on whether autoplay is on.
When I automate tasks/create scripts, I look to social media as inspiration. If my work is at least as easy to use/understand as Twitter/TikTok/IG etc., then I'm doing something right.
There is no glory in intentional obfuscation. Find the patterns and ruthlessly automate them. The quicker you background the repetitive work, the quicker you can get to something new/cooler/more fun.
Funnily enough, this was my biggest takeaway from building Stenography, and the core reason I made Autopilot.