Workshops

Reading Time: 6 minutes
[Photo: Teaching a workshop at RubyConf: Couch Edition from my kitchen.]

Did you like what you learned from me at a conference or online? I offer workshops so you can bring that same experience to your team! My workshops, like my talks, are didactic, technical, and language/version agnostic. They also focus specifically on skill-building; audiences can apply the tools and frameworks we cover right away to their own projects.

I have two kinds: self-paced online workshops and live, interactive workshops.

Self-Paced Online Workshops

After I’ve playtested workshop material enough times, I adapt it into an online, self-paced version to make it more cost-accessible. I don’t have time to run my own course website, so I use Thinkific. You can see all my offerings here. So far I’ve published two workshops, and I’ve got a couple more in development.

Two notes:

  1. If you want to buy a course for your whole team, email me. Groups of 5+ get discount codes.
  2. If you live in a country with hyperinflation, email me. I’ll make you a code with a discount that puts the course in an accessible price range.

Current Live Workshops

Tackling Technical Debt: An Analytical Approach

Length: 2 Hours

Places Given:

  • RubyConf, 2021
  • Domain-Driven Design Europe, 2022

Brochure Pitch:

Getting out of tech debt can feel like a Sisyphean task. After weeks of work, the success case is for the app to work the same as it used to. Organizations often declare code bankruptcy and rewrite working systems from scratch. How do we end up here? And how do we alleviate, or even better, prevent such a situation?

In this workshop, you will learn how to measure tech debt and address the areas of highest need first. You’ll learn to identify high-leverage code changes and separate them from renovations. You’ll also learn about the skills tech teams can use to prevent and reduce tech debt.
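
To give a flavor of what “measuring” tech debt can look like, here is an illustrative sketch of one common approach, not necessarily the exact method the workshop teaches: cross-reference how often each file changes with how complex it is, and treat the highest scores as candidates for high-leverage changes. The file names and complexity scores below are made up.

    # Illustrative sketch: rank "hotspots" by combining change frequency (churn)
    # with a complexity estimate. Paths and scores are hypothetical.
    import subprocess
    from collections import Counter

    # Count how many commits touched each file over the last year.
    log = subprocess.run(
        ["git", "log", "--since=1 year ago", "--name-only", "--pretty=format:"],
        capture_output=True, text=True, check=True,
    ).stdout
    churn = Counter(line for line in log.splitlines() if line.strip())

    # Pretend complexity scores (e.g., from a linter or plain line counts).
    complexity = {"billing/invoice.py": 87, "users/signup.py": 12, "legacy/reports.py": 140}

    # Highest churn * complexity marks the areas of highest need.
    hotspots = sorted(((churn[f] * c, f) for f, c in complexity.items()), reverse=True)
    for score, path in hotspots:
        print(f"{score:6d}  {path}")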

Debrief and Video Available Here:

https://chelseatroy.com/2021/11/10/rubyconf-2021-workshop-tackling-technical-debt-an-analytical-approach/

Parrot Emergency! Analyzing Risk in Software Systems

Length: 2 Hours

Places Given:

  • RubyConf Couch Edition, 2020
  • Domain-Driven Design Europe, 2022

Brochure Pitch:

How do you prevent an app from breaking?

Do you do it with automated tests? Does that work? When doesn’t it work? What do you do when automated tests don’t work?

How about cases where automated tests might work, but you don’t have them? Suppose you inherit a system with 8% test coverage. What do you test first?

The goal of this workshop: communicate a strategy for determining when and how apps will break. Here’s the key, though: the strategy needs to be both accurate enough to be useful and simple enough to be used.

Here’s how it goes: we put you on a team with 4-6 other software engineers. Then, we show you the class diagram for a complex and critical software system: an emergency triage system for parrots. That’s right: every poorly parrot gets a little parrot harness that monitors their vital signs and helps vets determine who needs attention first.

Your job, with your team: make sure the system works. You won’t be writing actual code, but you will be:

  • Identifying all the risks in the system where stuff could go wrong
  • Assessing which of those risks you should prioritize for mitigation
  • Matching the right mitigation tactics to each of the risks, in priority order

And of course, just like a real software project, time is of the essence. Both because the workshop is only two hours long, and also because fictional parrot lives are on the line!
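
To give a sense of the kind of prioritization the teams practice, here is an illustrative sketch, not the workshop’s actual scoring scheme: score each risk by likelihood and impact, and mitigate the largest products first. The risks and scores below are invented for the parrot example.

    # Illustrative sketch of likelihood-times-impact risk ranking; the risks
    # and scores are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Risk:
        description: str
        likelihood: int  # 1 (rare) to 5 (near certain)
        impact: int      # 1 (annoying) to 5 (a parrot gets missed)

    risks = [
        Risk("Harness sends malformed vital-sign readings", 4, 3),
        Risk("Triage queue silently drops a patient", 2, 5),
        Risk("Vet dashboard renders slowly under load", 3, 2),
    ]

    # Highest likelihood * impact goes to the top of the mitigation list.
    for risk in sorted(risks, key=lambda r: r.likelihood * r.impact, reverse=True):
        print(f"{risk.likelihood * risk.impact:3d}  {risk.description}")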

Debrief Available Here:

https://chelseatroy.com/2020/11/30/rubyconf-workshop-analyzing-risk-in-a-software-system/

[Video: the RubyConf version of the workshop (recording paused during group activities).]

Advanced Debugging: Strategies and Tactics

Length: 90 Minutes

Brochure Pitch:

Debugging: we spend most of our programming time doing it.

But we neglect it as a skill. We say it’s important, but we don’t deliberately practice it. Instead, we focus on shiny new languages, frameworks, and features. Our debugging skills lag behind the development of our other skills. And since we still spend so much time debugging, programming stays more frustrating for us than it needs to be.

Here’s how we make debugging hard for ourselves: we try to apply our assumptions from feature development to debugging, where they’re not true.

Case in point: we work on features with the starting assumption that we understand our code. But when we’re trying to track down a bug, that assumption is no longer accurate. We’re drawn to practices that align with our inaccurate assumption (like trying to fix it on the fly) but end up adding frustration to the debugging process. As we gain experience, we learn to ignore these temptations and do something else (like slow down and print out some variables), even if we don’t fully understand why.

But if we instead approach debugging from the starting perspective that we do not understand our code, we’re drawn to practices that *work* for debugging, rather than having to learn from experience to do the opposite of what an inaccurate assumption tempts us to do.
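
As a tiny illustration of what “slow down and print out some variables” can look like (my generic example, not an excerpt from the workshop): test one assumption at a time until the code’s actual behavior diverges from your mental model. The function, data, and “bug” below are hypothetical.

    # Illustrative sketch: gather information instead of guessing.
    def average_order_value(orders):
        total = sum(order["amount"] for order in orders)
        return total / len(orders)

    orders = [{"amount": 10.0}, {"amount": 0.0}, {"amount": 35.0}]

    # The reported average (15.0) looks too low. Before changing anything,
    # test the assumption that every order has a nonzero amount:
    print([order["amount"] for order in orders])   # -> [10.0, 0.0, 35.0]

    # The unexpected 0.0 is the information we needed: the problem is upstream,
    # in whatever produced a zero-amount order, not in the averaging code.
    print(average_order_value(orders))             # -> 15.0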

In this workshop, you will learn:

  • Strategies to systematically track down a bug (and why our usual approach so often fails)
  • Tactics to gather information about your systems, so you can narrow down the causes of bugs faster
  • Practices that you can use to sharpen your debugging skills over time

You can expect some lecturing, complete with illustrations and anecdotes to help summarize the concepts and bring them to life. We will also practice our new skills by working through debugging exercises.

This session is targeted at engineers who work as individual contributors on code. I am confident that everyone from the junior level to the principal engineer can learn something from this workshop. If you don’t believe me, come see me beforehand and we’ll make a bet.